Playstation 5 specs leaked! (Project Aerial)

#52 Posted by Steppy_76 (2710 posts) -

@Shewgenja said:

@Steppy_76: Hold up. Weren't you guys touting a 2tflop difference on a jaguar-based console as the "True 4k console"? Sit the ENTIRE hell down with all that.

That's kind of the point: performance is relative. The X performed roughly 5x better than the original Xbox One with a difference of 4.7 TFLOPs. The Pro performed 2.3x better than the PS4 with a difference of 2.4 TFLOPs.

To get to 2.3x the 6 TFLOPs of the X requires 13.8 TFLOPs. Even if you add the TFLOPs of the CPU in the PS5 specs to the rumored 14 of the GPU, you are still looking at roughly the same gap between the X and the PS5 as between the PS4 Pro and the original PS4.

People are doubting the PS5 would be feasible at a decent cost with those specs, yet those specs only push it as far beyond the X as the PS4 Pro was beyond the original PS4. Do you still think next gen is going to be a dramatic increase over the previous gen?
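For anyone who wants to check those ratios, here is a quick sketch using the rough TFLOP figures quoted in the thread (the console numbers are the commonly cited ones; the PS5 figure is the rumour):

```python
# Quick sanity check of the relative-scaling argument above.
# TFLOP figures are the rough/rumoured ones quoted in the thread.
ps4, ps4_pro = 1.84, 4.2      # base PS4 vs PS4 Pro (~2.3x, ~2.4 TFLOP gap)
xb1, xb1x = 1.31, 6.0         # base Xbox One vs One X (~4.6x, ~4.7 TFLOP gap)
ps5_rumour = 14.2             # rumoured Navi GPU figure

print(round(ps4_pro / ps4, 2))      # ~2.28x
print(round(xb1x / xb1, 2))         # ~4.58x
print(2.3 * 6.0)                    # 13.8 -> TFLOPs needed to repeat the Pro's 2.3x jump, this time over the X
print(round(ps5_rumour / xb1x, 2))  # ~2.37x over the One X, roughly the Pro-vs-PS4 ratio again
```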

#53 Posted by NfamousLegend (365 posts) -

@Steppy_76 said:
@Shewgenja said:

@Steppy_76: Hold up. Weren't you guys touting a 2tflop difference on a jaguar-based console as the "True 4k console"? Sit the ENTIRE hell down with all that.

That's kind of the point: performance is relative. The X performed roughly 5x better than the original Xbox One with a difference of 4.7 TFLOPs. The Pro performed 2.3x better than the PS4 with a difference of 2.4 TFLOPs.

To get to 2.3x the 6 TFLOPs of the X requires 13.8 TFLOPs. Even if you add the TFLOPs of the CPU in the PS5 specs to the rumored 14 of the GPU, you are still looking at roughly the same gap between the X and the PS5 as between the PS4 Pro and the original PS4.

People are doubting the PS5 would be feasible at a decent cost with those specs, yet those specs only push it as far beyond the X as the PS4 Pro was beyond the original PS4. Do you still think next gen is going to be a dramatic increase over the previous gen?

Assuming these leaks are somehow correct, going from the PS4's 1.8 TFLOP older-architecture GPU to a 14 TFLOP Navi GPU is a very big increase. The biggest increase next gen will be in the CPU, though. Going from an awful Jaguar to a 16-thread Zen 2 chip is something like a 4x increase in capability alone.

#54 Posted by IvanGrozny (896 posts) -

Fake as hell, especially with 20GB of GDDR6. That amount alone would run around 400 bucks.

#55 Posted by Epak_ (10678 posts) -

@XVision84: It'll probably not be that much, but I'm defo ready for next gen.

#56 Posted by Kali-B1rd (2213 posts) -

fake and gay. (Ariola lol)

#57 Posted by DaVillain- (36951 posts) -

@PSP107 said:
@davillain- said:

Always wait for Sony official statements.

You never found any enjoyment of rumors/speculations?

Sometimes I do but this isn't one of them. Especially with next-gen consoles cause everyone just pulls BS out of nowhere.

#58 Posted by Son-Goku7523 (955 posts) -
@ivangrozny said:

Fake as hell especially with 20GB of GDDR6. That amount alone would be around 400 bucks.

Not that I'm saying this rumor is right or anything, but that statement isn't entirely correct. If you're buying as a consumer then yes, the statement would be true, but Sony and MS don't buy as consumers; they buy as manufacturers. Because they buy components in bulk, they get them way cheaper than we consumers do. It wouldn't cost Sony nearly as much for GDDR6 as it does your average Joe.

#59 Posted by djoffer (1377 posts) -

Looks fake as hell; however, if those are the specs and it launches with full BC at a reasonable price, I think I'll pick one up.

#60 Posted by ni6htmare01 (3785 posts) -

Will it be enough to do all games at 4K/60fps? Maybe it's time to upgrade my TV when it comes out!

#61 Posted by Howmakewood (5934 posts) -

@son-goku7523: which consumer is buying GDDR6? Not that I agree that 20GB of it would cost that much; HBM2 might.

#62 Posted by Son-Goku7523 (955 posts) -
@howmakewood said:

@son-goku7523: which consumer is buying GDDR6? Not that I agree that 20GB of it would cost that much; HBM2 might.

No one is; it's used in high-end graphics cards like the 2080 Ti, but no one buys it directly from the manufacturer. It's one of the components that hikes up the 2080 Ti's price. I'm just saying that the GDDR6 tax we can expect to pay when buying high-end GPUs wouldn't necessarily translate to console manufacturers, which buy in huge numbers, much higher than Nvidia currently does for its high-end cards.

#63 Posted by Random_Matt (4251 posts) -

Kinda more interested in this Google stuff, they released a teaser. I already know PS5 will be godly whatever the hardware.

#64 Edited by sakaiXx (5712 posts) -

@PAL360 said:

@sakaixx: Considering current gen games already look and run incredibly well, those specs would be more than enough.

@Livecommander said:

@sakaixx: do you know how many people would never buy the playstation 6 in that case ?

The specs will do just fine for a seven year cycle.

I used to wish for unrealistic things in future products as well. It's understandable lol

@techhog89 said:

Are you even capable of making a post that isn't stupid?

The leaked specs are already unreasonable considering the Xbox One X's specs and price; do I need to explain more?

#65 Posted by Techhog89 (3747 posts) -

@sakaixx: Sarcasm on the internet doesn't work well when you're known for making dumb statements and you throw in extra details for no reason (such as attempting to brag about a mid-range PC).

#66 Posted by Midnightshade29 (5997 posts) -

@R4gn4r0k said:

300 MHz Power Processor

Voodoo 2 graphics

12MB RAM

Floppy disk drive

Calling it now

Can't wait to play Quake 2 on that beast!!! It won't run Quake 3 or UT99 though, at least not the outside levels, which will chug at 5 frames per second.

#67 Posted by rmpumper (654 posts) -

@ni6htmare01: For current games, sure (after all, if these specs are correct - not likely - the PS5 will be more than 2x as powerful as the One X, and that already handles 4K30); for next gen, who knows.

#68 Posted by floppydics (89 posts) -

How is it possible to get a console with those specs at the usual RRP launch price? If true, that would be a super duper deal.

#69 Posted by I_P_Daily (12045 posts) -

@mandzilla said:

Will it run Ridge Racer though?

As a racing fan all I can say about Ridge Racer is

[embedded video]

#70 Posted by Blackhairedhero (3233 posts) -

@Steppy_76: The problem is you're only looking at the GPU. The rumored CPUs are about 3x the power of the old ones.

#71 Posted by tormentos (29198 posts) -

@Steppy_76 said:
@Shewgenja said:

@Steppy_76: Hold up. Weren't you guys touting a 2tflop difference on a jaguar-based console as the "True 4k console"? Sit the ENTIRE hell down with all that.

That's kind of the point: performance is relative. The X performed roughly 5x better than the original Xbox One with a difference of 4.7 TFLOPs. The Pro performed 2.3x better than the PS4 with a difference of 2.4 TFLOPs.

To get to 2.3x the 6 TFLOPs of the X requires 13.8 TFLOPs. Even if you add the TFLOPs of the CPU in the PS5 specs to the rumored 14 of the GPU, you are still looking at roughly the same gap between the X and the PS5 as between the PS4 Pro and the original PS4.

People are doubting the PS5 would be feasible at a decent cost with those specs, yet those specs only push it as far beyond the X as the PS4 Pro was beyond the original PS4. Do you still think next gen is going to be a dramatic increase over the previous gen?

The Xbox One X looks like a big jump because the Xbox One S was a shitty, weak platform.

#72 Edited by ronvalencia (28082 posts) -

@nfamouslegend said:

The leak comes via Beyond3D but has been reported by several sites. Playstation 5 is going by the working name Aerial and is scheduled for release in the first half of 2020 with a reveal in the next 3 months.

Can't say for sure if this is the development kit or what they are targeting for final hardware; either way it's an absolute beast.

CPU - 8c16t Zen2 @3.2ghz

GPU - 14.2tflop Navi graphics engine

20GB GDDR6 (880GB/s), 4GB DDR4 for the OS

2TB HDD

I guess we will find out soon enough, but this would be an absolute beast of a machine, easily capable of outdoing even the best current rigs with console optimization. Could be a dev kit though; even then, at a 15% reduction in capability it would be powerful.

Thoughts?

https://www.google.com/amp/s/www.altchar.com/games-news/590803/playstation-5-beefy-specs-leaked/amp

Also a video on Redgamingtech channel.

FALSE. Something is wrong with a 20GB chip layout.

GDDR6 config examples based on chip count and chip storage density:

256 bit GDDR6 = 8 GB** or 16 GB** config

352 bit GDDR6 = 11 GB** or 22 GB config (384 bit bus PCB with a single 32 bit memory controller disabled)

384 bit GDDR6 = 12 GB or 24 GB** or 48 GB** config

512 bit GDDR6 = 16 GB or 32 GB config

**Real GDDR6 configuration from TU102 and TU104.
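A minimal sketch of the chip-count arithmetic behind that table, assuming the standard GDDR6 layout of one chip per 32-bit channel and 1GB or 2GB chip densities (single rank; clamshell mode doubles these capacities):

```python
# Possible GDDR6 capacities for a given bus width, assuming one 32-bit channel
# per chip and 1GB or 2GB chip densities (single rank).
def gddr6_capacities_gb(bus_width_bits):
    chips = bus_width_bits // 32
    return [chips * density for density in (1, 2)]

for bus in (256, 320, 352, 384, 512):
    print(f"{bus}-bit -> {gddr6_capacities_gb(bus)} GB")
# 256 -> [8, 16], 320 -> [10, 20], 352 -> [11, 22], 384 -> [12, 24], 512 -> [16, 32]
# i.e. a flat 20GB would imply an unusual 320-bit bus (not in the table above)
# or mixed chip densities, which is why the rumoured layout looks odd.
```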

#73 Edited by ronvalencia (28082 posts) -

@howmakewood said:

@son-goku7523: which consumer is buying GDDR6? Not that I agree that 20GB of it would cost that much; HBM2 might.

The mid-range GTX 1660 Ti has 6GB of 192 bit GDDR6-12000 (288GB/s), hence its PCB design and cost structure are similar to the GTX 1060's 192 bit bus with GDDR5-8000/9000.

One can guess AMD's GTX 1660 Ti/RTX 2060 competitor with a NAVI GPU would be a 256 bit GDDR6 PCB version.

The PS4 follows the 256 bit bus Radeon 7850 mid-range design.

The PS4 Pro follows the 256 bit bus RX-470 mid-range design.

The Xbox One X disrupts AMD's mid-range 256 bit bus PCB design with a 384 bit bus PCB.

My speculation, based on GDDR6 config rules and the Xbox One X's PCB budget: for the X1X, MS used a GDDR5-6800 config, which uses chips at the RX-470's GDDR5-7000 price level.

GDDR6-12000 x 384 bit = 576 GB/s

GDDR6-13000 x 384 bit = 624 GB/s <---- near 2x X1X's 326 GB/s

GDDR6-14000 x 384 bit = 672 GB/s
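The bandwidth arithmetic being used here is simple; a quick sketch that reproduces the figures above:

```python
# Peak bandwidth = per-pin data rate (MT/s) * bus width (bits) / 8 bits-per-byte.
def peak_bandwidth_gbs(data_rate_mtps, bus_width_bits):
    return data_rate_mtps * bus_width_bits / 8 / 1000   # GB/s

for rate in (12000, 13000, 14000):
    print(f"GDDR6-{rate} x 384-bit = {peak_bandwidth_gbs(rate, 384):.0f} GB/s")  # 576 / 624 / 672

print(round(peak_bandwidth_gbs(6800, 384)))  # ~326 GB/s -- the X1X's GDDR5-6800 x 384-bit figure
```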

#74 Edited by Steppy_76 (2710 posts) -

@tormentos said:
@Steppy_76 said:
@Shewgenja said:

@Steppy_76: Hold up. Weren't you guys touting a 2tflop difference on a jaguar-based console as the "True 4k console"? Sit the ENTIRE hell down with all that.

That's kind of the point: performance is relative. The X performed roughly 5x better than the original Xbox One with a difference of 4.7 TFLOPs. The Pro performed 2.3x better than the PS4 with a difference of 2.4 TFLOPs.

To get to 2.3x the 6 TFLOPs of the X requires 13.8 TFLOPs. Even if you add the TFLOPs of the CPU in the PS5 specs to the rumored 14 of the GPU, you are still looking at roughly the same gap between the X and the PS5 as between the PS4 Pro and the original PS4.

People are doubting the PS5 would be feasible at a decent cost with those specs, yet those specs only push it as far beyond the X as the PS4 Pro was beyond the original PS4. Do you still think next gen is going to be a dramatic increase over the previous gen?

The Xbox One X looks like a big jump because the Xbox One S was a shitty, weak platform.

No, it looks like a big jump because if you are maintaining compatibility you can't do much with the CPU; the extra power is wasted since games are going to be written with the base system in mind. What you can do is jack up the GPU and crank up the resolution to take advantage of the extra fill rate without messing up the base system. I fully expect every refresh to do this: leave the CPU alone for the most part and boost the GPU. Then next gen is again time for a CPU boost.

#75 Posted by APiranhaAteMyVa (4123 posts) -

The only power anyone needs to know is SONY DOMINATION CONTINUATION

#76 Posted by Horgen (120622 posts) -

The CPU could happen. The GPU? No, not happening.

#77 Posted by BoxRekt (1813 posts) -

@Pedro said:
@tormentos said:

I think neither can deliver anything more than that. Releasing mid-gen machines is a blessing and a curse: for a decent next-gen jump of at least 5 times the power, the console would need to be around 30TF in the case of the Xbox One X, and around 22TF to be a big jump over the Pro, and I just don't see such a jump by 2020.

I view it more as a blessing. The never-ending cycle of pushing graphics is closing because of technical and economic limitations. Developers will be forced to attract gamers in new ways besides just graphics. The graphical jump "next gen" is going to be smaller than in any previous generation.

lol WRONG!

It's funny how far you miss the bar. You still don't realize you're only seeing, at best, 1.8 TF games that simply have resolution buffs for more powerful systems.

There are 0 games even developed to take advantage of even a 6TF system, much less a 10+TF one.

This is why exclusives matter. PC and Xbox fanboys are short-sighted because they rely on multiplats, or the exclusives they get are low-end shovelware.

When Sony shows off TRUE PS5 exclusives you guys are going to shit your pants and claim it's completely CGI, not unlike what many of the failed prophets did for games like God of War and Spider-Man. It's going to be funny to see you guys make fools of yourselves again, not understanding what designing a game for one specific piece of hardware can achieve.

#78 Posted by Pedro (34958 posts) -

@boxrekt said:

lol WRONG!

It's funny how far you miss the bar. You still don't realize you're only seeing, at best, 1.8 TF games that simply have resolution buffs for more powerful systems.

There are 0 games even developed to take advantage of even a 6TF system, much less a 10+TF one.

This is why exclusives matter. PC and Xbox fanboys are short-sighted because they rely on multiplats, or the exclusives they get are low-end shovelware.

When Sony shows off TRUE PS5 exclusives you guys are going to shit your pants and claim it's completely CGI, not unlike what many of the failed prophets did for games like God of War and Spider-Man. It's going to be funny to see you guys make fools of yourselves again, not understanding what designing a game for one specific piece of hardware can achieve.

Firstly, processing power is not a mysterious entity that needs studying in order to unlock its true potential. It's simply processing. That's it. Nothing more, nothing less.

Secondly, you can't state that there are 0 games developed that take advantage of 6TF. That is a factually and demonstrably FALSE statement. There are so many games on the market that fully utilize the 6TF of power, some more successful than others when using resolution and framerates as benchmarks.

Thirdly, exclusives don't magically unlock the "hidden potential" of hardware. What are you? A moron?

Finally, stop being a shill.

#79 Posted by Grey_Eyed_Elf (6460 posts) -

Misleading article.

The specifications "leaked" are from a forum user who leaked the Wii U devkit, and these specifications are for the PS5 devkit.

Link: https://twitter.com/LiquidTitan/status/1104257661661597698

[embedded video]

Also, it's not a 14.2 TFLOP GPU... it's listed in the leak as 12.6-14.2 TFLOPS.

Also can I mention that the Xbox One X devkits have 24GB RAM with a 6.6TFLOP GPU... Which makes sense since devkits need the extra GPU overhead and a LOT more memory.

So if this is a devkit leak then the PS5 will be:

  • 8c/16t CPU slower than 3.2GHz
  • 10-12GB GDDR6 with 4GB DDR4 for OS
  • Navi slower than 12.6TFLOPS

Which puts it in line with what anyone who knows anything about GCN has been saying about the coming consoles: you will be LUCKY to get 10-11 TFLOPS.
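To make that devkit-to-retail reasoning concrete, here is a rough, purely illustrative sketch that uses the Xbox One X numbers quoted in this post as the scaling reference:

```python
# Illustrative devkit -> retail scaling, using the Xbox One X figures quoted above
# (24GB devkit RAM / 6.6 TFLOPS vs 12GB / 6.0 TFLOPS retail) as the reference point.
x1x_ram_ratio = 12 / 24          # retail kept ~half the devkit RAM
x1x_tflops_ratio = 6.0 / 6.6     # ~0.91x, from disabling 4 of the 44 CUs

ps5_devkit_ram_gb = 20
ps5_devkit_tflops = (12.6, 14.2)

print(ps5_devkit_ram_gb * x1x_ram_ratio)                            # 10.0 GB GDDR6 in retail
print([round(t * x1x_tflops_ratio, 1) for t in ps5_devkit_tflops])  # [11.5, 12.9] TFLOPS
# The post expects a deeper cut (lower clocks and/or more CUs disabled), hence the 10-11 TFLOP guess.
```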

#80 Posted by Mandzilla (4110 posts) -

@i_p_daily: Haha.

#81 Posted by lundy86_4 (53386 posts) -

This would be a devkit, if it's even accurate. Devkits aren't provided in final-form cases and have excess cooling, etc.

#82 Posted by mazuiface (898 posts) -

@nfamouslegend said:

GPU - 14.2tflop Navi graphics engine

No way this is true unless the console will cost 500+

#83 Edited by BoxRekt (1813 posts) -

@Pedro said:

Firstly, processing power is not a mysterious entity that needs studying in order to unlock its true potential. It's simply processing. That's it. Nothing more, nothing less.

Secondly, you can't state that there are 0 games developed that take advantage of 6TF. That is a factually and demonstrably FALSE statement. There are so many games on the market that fully utilize the 6TF of power, some more successful than others when using resolution and framerates as benchmarks.

Thirdly, exclusives don't magically unlock the "hidden potential" of hardware. What are you? A moron?

Finally, stop being a shill.

No, you need to stop damage controlling the fact that you bought a useless $500 box and making up dumb arguments to make yourself feel better about your bonehead decision.

You need to realize that the UNDERLINED and the BOLD statements do not = the same thing!

When did I ever say that 6TF of power wasn't "utilized"? Hell, developers even "utilize" 1080 Tis with excess frames per second and resolution buffs.

That's NOT developing games specifically tailored to that hardware spec, dopey.

This is why you sound desperate and ignorant when you try to talk about generations while lumping your crappy mid-gen Xbox into the conversation. You're the one who needs to stop shilling. EVERY game released this generation is developed around the limitations and specs of the BASE PS4 and Xbox One systems; the X and Pro didn't change that.

The ONLY thing you're getting with more powerful hardware, NOW, is extra frames, resolution and textures for games built around 1.3 TF Xbox One and 1.8 TF PS4 systems.

THAT is why generations exist!

Again, there are ZERO games on the market now DEVELOPED to take advantage of systems with even 6TF of power. Next gen will change that.

#84 Posted by Pedro (34958 posts) -

@boxrekt: Blah blah! Didn't read because there is no counter to facts. :)

#85 Edited by BoxRekt (1813 posts) -

@Pedro said:

@boxrekt: Blah blah! Didn't read because there is no counter to facts. :)

I guess it still stings that you bought a $500 box that turned out to be worthless in the grand scheme of things. That's not anyone's fault but yours.

But I agree, there is no counter for facts lol.

Let me help you out though: there is only 1 game this gen that was developed above PS4/Xbox One specs, and that title is Star Citizen. The only problem is it never fully released and isn't going to release until 2020, when next gen starts! Then it will get ported to consoles and fall right in line with other "next gen" games LMFAO. The fact is most PC-exclusive games even fall below the BASE PS4 specification, and it's only when console ports arrive that higher-end PC hardware gets "utilized".

I.e. not developed for, unless you think pushing Tetris at 8K @ 500fps means it was developed for a 1070? lol

So yeah, there are zero games released now that were developed for 6TF systems or above, only upscaled base PS4 and Xbox One titles.

#86 Posted by ronvalencia (28082 posts) -

@Grey_Eyed_Elf said:

Misleading article.

The specifications "leaked" are from a forum user who leaked the Wii U devkit, and these specifications are for the PS5 devkit.

Link: https://twitter.com/LiquidTitan/status/1104257661661597698

Also, it's not a 14.2 TFLOP GPU... it's listed in the leak as 12.6-14.2 TFLOPS.

Also can I mention that the Xbox One X devkits have 24GB RAM with a 6.6TFLOP GPU... Which makes sense since devkits need the extra GPU overhead and a LOT more memory.

So if this is a devkit leak then the PS5 will be:

  • 8c/16t CPU slower than 3.2GHz
  • 10-12GB GDDR6 with 4GB DDR4 for OS
  • Navi slower than 12.6TFLOPS

Which puts it in line with what anyone who knows anything about GCN has been saying about the coming consoles: you will be LUCKY to get 10-11 TFLOPS.

The X1X dev kit's GPU is just the full 44 CU version, while the retail X1X has 40 CUs active and four CUs disabled.

#87 Posted by Gaming-Planet (19965 posts) -

The only exciting spec is the CPU.

#88 Posted by Grey_Eyed_Elf (6460 posts) -

@ronvalencia: I know. It's kinda the point: dev kits aren't restricted by yields, TDP and manufacturing costs... A mass-produced console sells more units than any single GPU in the PC consumer market, so the chips need to be stable and hit the same level of performance (clocks) within TDP and price targets. Those 4 CUs were cut for those reasons. So if this is a devkit leak, the retail unit will, as usual, have half the VRAM, lower clocks and maybe even some CUs disabled for yields... If it's a 12.6-14.2 TFLOP devkit, we could be looking at a 10-11 TFLOP chip in the retail unit or worse.

#89 Posted by emgesp (7832 posts) -
@Grey_Eyed_Elf said:

Misleading article.

The specifications "leaked" are from a forum user who leaked the Wii U devkit, and these specifications are for the PS5 devkit.

Link: https://twitter.com/LiquidTitan/status/1104257661661597698

Also, it's not a 14.2 TFLOP GPU... it's listed in the leak as 12.6-14.2 TFLOPS.

Also can I mention that the Xbox One X devkits have 24GB RAM with a 6.6TFLOP GPU... Which makes sense since devkits need the extra GPU overhead and a LOT more memory.

So if this is a devkit leak then the PS5 will be:

  • 8c/16t CPU slower than 3.2GHz
  • 10-12GB GDDR6 with 4GB DDR4 for OS
  • Navi slower than 12.6TFLOPS

Which puts it in line with what anyone who knows anything about GCN has been saying about the coming consoles: you will be LUCKY to get 10-11 TFLOPS.

This crap you keep spewing needs to end. 10 Tflops will be the minimum Sony releases with a possibility of 11 - 12 Teraflops max. There also will be a minimum of 16GBs total of GDDR6. There is nothing lucky about these specs. They are actually not that special for a 2020 console.

#90 Posted by Raining51 (1124 posts) -

Yes, that sounds promising.

Might be PS2 levels honestly.

#91 Posted by ronvalencia (28082 posts) -

@Grey_Eyed_Elf said:

@ronvalencia: I know. It's kinda the point: dev kits aren't restricted by yields, TDP and manufacturing costs... A mass-produced console sells more units than any single GPU in the PC consumer market, so the chips need to be stable and hit the same level of performance (clocks) within TDP and price targets. Those 4 CUs were cut for those reasons. So if this is a devkit leak, the retail unit will, as usual, have half the VRAM, lower clocks and maybe even some CUs disabled for yields... If it's a 12.6-14.2 TFLOP devkit, we could be looking at a 10-11 TFLOP chip in the retail unit or worse.

TFLOPS mean little when the ROPS are the primary graphics read/write units exposing the GPU's TFLOPS power to external memory storage, e.g. a multi-MB L2 cache (tile cache rendering) and the GDDR6 frame buffer.

I expect NAVI to pair 64 ROPS with a 256 bit memory bus design like Pascal GP104 and Turing TU104. Don't expect RTX 2080 Ti-level raster power.

#93 Edited by KillzoneSnake (2462 posts) -

Unless they wanna release that for $599, I call BS. 16GB and 10 TFLOPs sounds reasonable.

#94 Posted by tormentos (29198 posts) -

@Pedro said:

Firstly, processing power is not a mysterious entity that needs studying in order to unlock its true potential. It's simply processing. That's it. Nothing more, nothing less.

Secondly, you can't state that there are 0 games developed that take advantage of 6TF. That is a factually and demonstrably FALSE statement. There are so many games on the market that fully utilize the 6TF of power, some more successful than others when using resolution and framerates as benchmarks.

Thirdly, exclusives don't magically unlock the "hidden potential" of hardware. What are you? A moron?

Finally, stop being a shill.

A PS1 game in 4K is still a shitty-looking game.

Power is more than resolution and frames. This is why most Sony games look freaking great even on the base PS4: it's the injection of power into visual fidelity rather than just resolution or frames.

So using 6TF just to bump resolution to 4K isn't really doing much, outside of giving you a cleaner, sharper picture of what you were already seeing.

So yeah, I believe a game coded for a 10TF machine, injecting more power into visuals than resolution, will make for quite a spectacle.

@Grey_Eyed_Elf said:

@ronvalencia: I know. It's kinda the point: dev kits aren't restricted by yields, TDP and manufacturing costs... A mass-produced console sells more units than any single GPU in the PC consumer market, so the chips need to be stable and hit the same level of performance (clocks) within TDP and price targets. Those 4 CUs were cut for those reasons. So if this is a devkit leak, the retail unit will, as usual, have half the VRAM, lower clocks and maybe even some CUs disabled for yields... If it's a 12.6-14.2 TFLOP devkit, we could be looking at a 10-11 TFLOP chip in the retail unit or worse.

The only reason it has 4 CUs disabled is yield: if you choose a GPU with 44 CUs and a chip comes out of the factory with only 43 working CUs, that chip is useless, and this is critical for making as many consoles as possible in a short period of time.

The Xbox 360 suffered a massive yield problem when it launched in 2005 for that reason: its GPU was top of the line and there were no other models above it.

So choosing 40 CUs instead of 44 guarantees that any chip with 44, 43, 42, 41 or 40 working CUs can be used in Xbox One X production, and this increases yields.

This is how all companies work, including Nvidia and AMD; most of these GPUs are cut-down versions of the full top-of-the-line GPU, and yields mostly determine which model a chip becomes.
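A toy model of why accepting 40-of-44 working CUs changes the economics; the per-CU defect rate below is made up, purely to show the shape of the effect:

```python
# Toy yield model for the argument above: assume each of the 44 CUs works
# independently with probability p (the 0.98 here is illustrative only).
from math import comb

def usable_die_fraction(n_cu=44, needed=40, p=0.98):
    """Probability that at least `needed` of the n_cu CUs are functional."""
    return sum(comb(n_cu, k) * (p ** k) * ((1 - p) ** (n_cu - k))
               for k in range(needed, n_cu + 1))

print(round(usable_die_fraction(needed=44), 2))  # ~0.41 -- every CU must work ("43 working CUs is useless")
print(round(usable_die_fraction(needed=40), 2))  # ~1.00 -- any chip with 40+ good CUs can ship
```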

#95 Edited by ronvalencia (28082 posts) -

@tormentos said:
@Pedro said:

Firstly, processing power is not a mysterious entity that needs studying in order to unlock its true potential. It's simply processing. That's it. Nothing more, nothing less.

Secondly, you can't state that there are 0 games developed that take advantage of 6TF. That is a factually and demonstrably FALSE statement. There are so many games on the market that fully utilize the 6TF of power, some more successful than others when using resolution and framerates as benchmarks.

Thirdly, exclusives don't magically unlock the "hidden potential" of hardware. What are you? A moron?

Finally, stop being a shill.

A PS1 game in 4K is still a shitty-looking game.

Power is more than resolution and frames. This is why most Sony games look freaking great even on the base PS4: it's the injection of power into visual fidelity rather than just resolution or frames.

So using 6TF just to bump resolution to 4K isn't really doing much, outside of giving you a cleaner, sharper picture of what you were already seeing.

So yeah, I believe a game coded for a 10TF machine, injecting more power into visuals than resolution, will make for quite a spectacle.

@Grey_Eyed_Elf said:

@ronvalencia: I know. It's kinda the point: dev kits aren't restricted by yields, TDP and manufacturing costs... A mass-produced console sells more units than any single GPU in the PC consumer market, so the chips need to be stable and hit the same level of performance (clocks) within TDP and price targets. Those 4 CUs were cut for those reasons. So if this is a devkit leak, the retail unit will, as usual, have half the VRAM, lower clocks and maybe even some CUs disabled for yields... If it's a 12.6-14.2 TFLOP devkit, we could be looking at a 10-11 TFLOP chip in the retail unit or worse.

The only reason it has 4 CUs disabled is yield: if you choose a GPU with 44 CUs and a chip comes out of the factory with only 43 working CUs, that chip is useless, and this is critical for making as many consoles as possible in a short period of time.

The Xbox 360 suffered a massive yield problem when it launched in 2005 for that reason: its GPU was top of the line and there were no other models above it.

So choosing 40 CUs instead of 44 guarantees that any chip with 44, 43, 42, 41 or 40 working CUs can be used in Xbox One X production, and this increases yields.

This is how all companies work, including Nvidia and AMD; most of these GPUs are cut-down versions of the full top-of-the-line GPU, and yields mostly determine which model a chip becomes.

The RTX 2080 Ti is a salvaged TU102 chip with a defective CUDA core/ROPS block.

The Radeon VII is a salvaged Vega 20 chip with up to four defective CUs.

https://www.kitguru.net/components/graphic-cards/anton-shilov/new-tools-may-help-to-re-activate-disabled-stream-processors-of-amd-fiji-other-gpus/

Some Fury Pro cards' 8 disabled CUs can be re-activated, turning them into a Fury X, but it's a silicon lottery.

It's possible to reactivate the 4 disabled CUs on a VII.

#96 Posted by Grey_Eyed_Elf (6460 posts) -

@emgesp said:
@Grey_Eyed_Elf said:

Misleading article.

The specifications "leaked" are from a forum user who leaked the Wii U devkit, and these specifications are for the PS5 devkit.

Link: https://twitter.com/LiquidTitan/status/1104257661661597698

Also, it's not a 14.2 TFLOP GPU... it's listed in the leak as 12.6-14.2 TFLOPS.

Also can I mention that the Xbox One X devkits have 24GB RAM with a 6.6TFLOP GPU... Which makes sense since devkits need the extra GPU overhead and a LOT more memory.

So if this is a devkit leak then the PS5 will be:

  • 8c/16t CPU slower than 3.2GHz
  • 10-12GB GDDR6 with 4GB DDR4 for OS
  • Navi slower than 12.6TFLOPS

Which puts it in line with what anyone who knows anything about GCN has been saying about the coming consoles: you will be LUCKY to get 10-11 TFLOPS.

This crap you keep spewing needs to end. 10 Tflops will be the minimum Sony releases with a possibility of 11 - 12 Teraflops max. There also will be a minimum of 16GBs total of GDDR6. There is nothing lucky about these specs. They are actually not that special for a 2020 console.

I don't think you fully understand how TFLOPS are counted on GCN. The reason I say lucky is TDP... 10-11 TFLOPS within a 200W TDP on GCN would be miracle work. The Radeon VII is a 7nm GCN-based card and it alone sits at 300W for 13.8 TFLOPS. Also, the RAM specs I mentioned are based on the leak, which is for a dev kit; if you know that dev kits carry double the VRAM, then you know that when a dev kit is leaked you halve that VRAM, and that's what will be in the console.

"This crap" I'm spewing is based on well-known information about AMD's GPUs, but hey, it's all speculation regardless of the arrogance behind what I say.

#97 Posted by Grey_Eyed_Elf (6460 posts) -

@ronvalencia said:
@Grey_Eyed_Elf said:

@ronvalencia: I know. It's kinda the point: dev kits aren't restricted by yields, TDP and manufacturing costs... A mass-produced console sells more units than any single GPU in the PC consumer market, so the chips need to be stable and hit the same level of performance (clocks) within TDP and price targets. Those 4 CUs were cut for those reasons. So if this is a devkit leak, the retail unit will, as usual, have half the VRAM, lower clocks and maybe even some CUs disabled for yields... If it's a 12.6-14.2 TFLOP devkit, we could be looking at a 10-11 TFLOP chip in the retail unit or worse.

TFLOPS mean little when the ROPS are the primary graphics read/write units exposing the GPU's TFLOPS power to external memory storage, e.g. a multi-MB L2 cache (tile cache rendering) and the GDDR6 frame buffer.

I expect NAVI to pair 64 ROPS with a 256 bit memory bus design like Pascal GP104 and Turing TU104. Don't expect RTX 2080 Ti-level raster power.

With all the surrounding rumours on Navi, it doesn't look like they plan on using all the ROPS, at least not with the initial release.

#98 Posted by kuu2 (11155 posts) -

@phbz said:

PlayStation Areola will be a monster! Praise the Areola!

Love me some Areolas. Can't wait.

#99 Posted by Howmakewood (5934 posts) -
@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:

Misleading article.

The specifications "leaked" are from a forum user who leaked the Wii U devkit, and these specifications are for the PS5 devkit.

Link: https://twitter.com/LiquidTitan/status/1104257661661597698

Also, it's not a 14.2 TFLOP GPU... it's listed in the leak as 12.6-14.2 TFLOPS.

Also can I mention that the Xbox One X devkits have 24GB RAM with a 6.6TFLOP GPU... Which makes sense since devkits need the extra GPU overhead and a LOT more memory.

So if this is a devkit leak then the PS5 will be:

  • 8c/16t CPU slower than 3.2GHz
  • 10-12GB GDDR6 with 4GB DDR4 for OS
  • Navi slower than 12.6TFLOPS

Which puts it in line with what anyone who knows anything about GCN has been saying about the coming consoles: you will be LUCKY to get 10-11 TFLOPS.

This crap you keep spewing needs to end. 10 Tflops will be the minimum Sony releases with a possibility of 11 - 12 Teraflops max. There also will be a minimum of 16GBs total of GDDR6. There is nothing lucky about these specs. They are actually not that special for a 2020 console.

I don't think you fully understand how TFLOPS are counted on GCN. The reason I say lucky is TDP... 10-11 TFLOPS within a 200W TDP on GCN would be miracle work. The Radeon VII is a 7nm GCN-based card and it alone sits at 300W for 13.8 TFLOPS. Also, the RAM specs I mentioned are based on the leak, which is for a dev kit; if you know that dev kits carry double the VRAM, then you know that when a dev kit is leaked you halve that VRAM, and that's what will be in the console.

"This crap" I'm spewing is based on well-known information about AMD's GPUs, but hey, it's all speculation regardless of the arrogance behind what I say.

This while using more power-efficient memory; GDDR6 would hike the power draw up again.

#100 Posted by ronvalencia (28082 posts) -

@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:

Misleading article.

The specifications "leaked" are from a forum user who leaked the Wii U devkit, and these specifications are for the PS5 devkit.

Link: https://twitter.com/LiquidTitan/status/1104257661661597698

Also, it's not a 14.2 TFLOP GPU... it's listed in the leak as 12.6-14.2 TFLOPS.

Also can I mention that the Xbox One X devkits have 24GB RAM with a 6.6TFLOP GPU... Which makes sense since devkits need the extra GPU overhead and a LOT more memory.

So if this is a devkit leak then the PS5 will be:

  • 8c/16t CPU slower than 3.2GHz
  • 10-12GB GDDR6 with 4GB DDR4 for OS
  • Navi slower than 12.6TFLOPS

Which puts it in line with what anyone who knows anything about GCN has been saying about the coming consoles: you will be LUCKY to get 10-11 TFLOPS.

This crap you keep spewing needs to end. 10 Tflops will be the minimum Sony releases with a possibility of 11 - 12 Teraflops max. There also will be a minimum of 16GBs total of GDDR6. There is nothing lucky about these specs. They are actually not that special for a 2020 console.

I don't think you fully understand how TFLOPS are counted on GCN. The reason I say lucky is TDP... 10-11 TFLOPS within a 200W TDP on GCN would be miracle work. The Radeon VII is a 7nm GCN-based card and it alone sits at 300W for 13.8 TFLOPS. Also, the RAM specs I mentioned are based on the leak, which is for a dev kit; if you know that dev kits carry double the VRAM, then you know that when a dev kit is leaked you halve that VRAM, and that's what will be in the console.

"This crap" I'm spewing is based on well-known information about AMD's GPUs, but hey, it's all speculation regardless of the arrogance behind what I say.

The power curve is not a straight line.

Spec comparison with proposed 7 nm mid-range GPU

RX Vega 56 overclock edition

TFLOPS: 11 TFLOPS at 1536 Mhz overclock, 87.5 percent from Vega 64 AC

ROPS: 64 at 1536 Mhz overclock from Vega 64, hence rasterization power is identical.

L2 cache: 4 MB unified for both TMU and ROPS usage. Programmer can software tile render with Vega architecture from either TMU and ROPS paths.

Vega 56 overclock has similar performance to Vega 64

Basic design parameters for TSMC 7 nm and X1X Next/PS5 GPU,

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).

Condition 1: for every 0.5 percent clock speed increase, the 60 percent power consumption drop is reduced by 1 percent.

Condition 2: PS4/PS4 Pro/X1X GPU TDP target: about 115 to 130 watts

Condition 3: Vega 64 air-cooled at 295 watts, dropped by 60 percent, becomes 118 watts. Vega 64 air-cooled has 12.58 TFLOPS at 1536 MHz.

Let's target 130 watts using the Condition 1 relationship.

Proposal 1

TSMC 7nm applied: 118 watts at 1536 Mhz at full 64 CU enabled.

TFLOPS: 8 CUs disabled for increased yields; 56 CUs at 1536 MHz yields 11 TFLOPS. The 8 missing CUs give extra TDP headroom, reducing power to about 103 watts.

Based on Condition 1, applying a 10 percent clock speed increase with a 20 percent TDP increase yields 1689 MHz at 129.6 watts.

Resulting clock speed is 1689 Mhz, which yields ~12.11 TFLOPS

ROPS: 64 ROPS at 1689 Mhz

L2 cache: 8 MB, twice the X1X GPU's 2 MB L2 cache (TMU) + 2 MB render cache (ROPS). This is one of Turing's improvements, i.e. a doubled L2 cache size.

Side issues: for games, Vega 64 has some useless features, such as the high-bandwidth cache (which is slower/higher latency than the L2 cache) and 64-bit floating point processing.

The resulting GPU has 9 percent better rasterization performance than Vega 56 OC/64.

If AMD stays with a Vega-based circuit design, this is the estimated GPU power for an RX 670, while an RX 680 and RX 690 would have higher TDP limits and clock speeds.

Proposal 2 (I need to revise this)

Based on the specs from the graph above

TFLOPS: 8 missing CUs yields extra TDP headroom.

Resulting clock speed is 1720 Mhz, which yields ~12.33TFLOPS

ROPS: 64 ROPS at 1720 Mhz

GPU result has ~12 percent better rasterization performance over Vega 56 OC/64 at 1536 Mhz

Proposal 3

Based on the specs from the graph above

TFLOPS: 20 disabled CUs which reduces power consumption. Resulting clock speed is 1840 Mhz, which yields ~10.362TFLOPS

ROPS: 64 ROPS at 1840 Mhz

GPU at 1840 Mhz result has ~20 percent better rasterization performance over Vega 56 OC/64 at 1536 Mhz

Characteristics getting closer to RTX 2080.

Real life Vega 56 at 1710 Mhz

https://www.youtube.com/watch?v=w6gpxe0QoUs

Disclaimer: Speculation above needs to be slightly revised, but the principle is the same. Do your own power curve graph.
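For reference, the TFLOP numbers in the three proposals all come from the standard GCN formula; a quick sketch to reproduce them:

```python
# All the TFLOP figures in the proposals above follow the same GCN arithmetic:
# 64 shaders per CU * 2 FLOPs per shader per clock * clock speed.
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

print(round(gcn_tflops(56, 1536), 2))   # ~11.01 -- Vega 56 at a 1536 MHz overclock
print(round(gcn_tflops(64, 1536), 2))   # ~12.58 -- Vega 64 at 1536 MHz
print(round(gcn_tflops(56, 1689), 2))   # ~12.11 -- Proposal 1
print(round(gcn_tflops(56, 1720), 2))   # ~12.33 -- Proposal 2
print(round(gcn_tflops(44, 1840), 2))   # ~10.36 -- Proposal 3 (64 CUs with 20 disabled)
```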

For comparison

The RTX 2080 has the equivalent of 46 CUs at ~1928 MHz clock speed, with 64 ROPS and 11.0 TFLOPS.

The RTX 2080 has a 4 MB L2 cache, which is larger than the GTX 1080 Ti's 3 MB L2 cache. NVIDIA has superior delta color compression.