PS5 details from Mark Cerny: Backwards compatible, 8K, ray tracing support, SSD standard, and more

#551 Posted by rzxv04 (686 posts) -

@pc_rocks said:
@rzxv04 said:

If they do fake HDR and claimed otherwise, I hope they get sued.

There's no way to verify it because there's no HDR content on PS4. Probably the reason they got away with it.

Not sure what you mean by that.

I have an OG PS4 and it does "HDR" on the TVs I've mentioned, such as the X900F, one of the best price-to-performance TVs of 2018.

The darks were darker and the highlights were slightly better in HDR mode, BUT as you mentioned, it could be a "fake" form of HDR where the result basically equates to SDR with tweaked black levels and/or colors.

Someone with proper software and equipment could do a proper comparison on the same TV between the OG PS4 and other models/competing consoles.

Vincent comparing HDR on a PS4 Pro head-to-head-to-head across 3 Sony TVs:

https://www.youtube.com/watch?v=t0GzpKxakZM

Sadly, my impression of HDR and DV (even in movies) was quite subpar, especially with the terrible blooming. Not sure if I'd be frustrated by OLED's ABL; some say that's where HDR can really shine due to the high contrast, though that makes me wonder how much impact nits actually have, since OLEDs can't produce as high a peak brightness.

Do forgive me if you were saying that the PS4 doesn't have HDR/DV content for movies. No idea about that currently.

#552 Posted by PC_Rocks (2316 posts) -

@rzxv04 said:
@pc_rocks said:
@rzxv04 said:

If they do fake HDR and claimed otherwise, I hope they get sued.

There's no way to verify it because there's no HDR content on PS4. Probably the reason they got away with it.

Not sure what you mean by that.

I have an OG PS4 and it does "HDR" on the TVs I've mentioned, such as the X900F, one of the best price-to-performance TVs of 2018.

The darks were darker and the highlights were slightly better in HDR mode, BUT as you mentioned, it could be a "fake" form of HDR where the result basically equates to SDR with tweaked black levels and/or colors.

Someone with proper software and equipment could do a proper comparison on the same TV between the OG PS4 and other models/competing consoles.

Vincent comparing HDR on a PS4 Pro head-to-head-to-head across 3 Sony TVs:

https://www.youtube.com/watch?v=t0GzpKxakZM

Sadly, my impression of HDR and DV (even in movies) was quite subpar, especially with the terrible blooming. Not sure if I'd be frustrated by OLED's ABL; some say that's where HDR can really shine due to the high contrast, though that makes me wonder how much impact nits actually have, since OLEDs can't produce as high a peak brightness.

Do forgive me if you were saying that the PS4 doesn't have HDR/DV content for movies. No idea about that currently.

What content did you use for HDR? As per the link Ron shared, there's no HDR content available on PS4 to test with.

#553 Posted by rzxv04 (686 posts) -

@pc_rocks said:

What content did you use for HDR? As per the link Ron shared, there's no HDR content available on PS4 to test with.

Uncharted 4, HZD FWL, RDR2. RDR2 was said to have fake HDR; HDTVTEST's Vincent also mentions this. I only have an OG PS4, sadly, so I wouldn't be able to compare it against an Xbox One X or PS4 Pro even by eyeballing.

I thought it was about 10-bit vs. 8-bit/8-bit + FRC, where some consider the latter "fake". Some mention that most people cannot tell the difference between the two, and as I said earlier, even RTINGS mentioned that they're hard to distinguish and lumps 8-bit + FRC and 10-bit panels into the same 10-bit category (I can no longer find the specific thread for this).

Bit depth seems to mostly affect gradients, so it's only one factor. Nits, contrast, etc. also affect overall HDR perception.

https://www.avsforum.com/forum/166-lcd-flat-panel-displays/3056482-10-bit-panel-min-req-true-hdr.html

https://www.reddit.com/r/xboxone/comments/81fbcy/xbox_one_x_hdr_color_depth/
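For intuition on the bit-depth point, here's a quick sketch (plain arithmetic, not tied to any particular console or panel) of why 8-bit + FRC can be hard to distinguish from true 10-bit: the panel flickers between the two nearest 8-bit levels so that the time average lands on the in-between 10-bit value.

```python
# Rough sketch: an 8-bit panel shows 256 levels per channel; a 10-bit
# signal has 1024. FRC (frame rate control) alternates between the two
# nearest 8-bit levels over time so the eye averages out a value in between.

def levels(bits):
    return 2 ** bits

def frc_average(target_10bit, frames=4):
    """Temporally dither a 10-bit level onto an 8-bit panel.

    Each 10-bit step is 1/4 of an 8-bit step, so over 4 frames we show the
    upper 8-bit level on `remainder` frames and the lower level otherwise.
    """
    lower, remainder = divmod(target_10bit, 4)  # 10-bit -> 8-bit + fraction
    shown = [min(lower + 1, 255)] * remainder + [lower] * (frames - remainder)
    return sum(shown) / frames                  # perceived level, time-averaged

print(levels(8), levels(10))   # 256 1024
print(frc_average(513))        # 128.25 -> between 8-bit levels 128 and 129
```

This is why lab equipment (or a gradient test pattern) is needed to tell the two apart; eyeballing game content usually isn't enough.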

#554 Edited by ronvalencia (27867 posts) -

@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@04dcarraher said:

1. As one who fought the rabid Sony fanboys in 2013 who suggested the PS4 was as fast as or faster than a PC with a high-end GPU like the 7970: the GPU in the PS4 wasn't bad for the time frame it released in (on par with a 7850), but what holds it back is the CPU and the lack of memory for gaming beyond 900p-1080p. Mark Cerny is known to twist stuff, exaggerate, and use marketing-oriented terms to sell a product, such as the PS4 Pro's "secret sauce" (its FP16 processing) supposedly making the console an 8.2 TFLOP beast... which is false.

2. Even though Ron likes to ramble off on tangents... HDR is possible over HDMI 1.4; you cannot, however, send a 4K signal through it. For that you need HDMI 2.0+. The PS4 can output an HDR signal.

1. True, only the GPU was decent, but in reality it was still in the ballpark of a 570 from 2010, so overall Mark Cerny lied as always, just like he's lying and twisting about ray tracing for the PS5. He's clearly riding the hype from the RTX series by calling what they do for audio "ray tracing". No one calls that ray tracing; it's quite a common technique that even games like the original Thief used. Ray tracing in the traditional sense has always described its use for rendering in world space. Shooting a single ray for FoV, path-finding, etc. is not comparable to firing tens of thousands of rays for rendering.

2. My point is not about bandwidth; it's about passing 10-bit info. All the articles and research I've done showed me that HDMI 1.4 could not pass 10-bit. If it were only about bandwidth, then note that HDMI 1.3 and 1.4 have the same bandwidth.

1. TrueAudio Next reuses Radeon Rays 2.0's BVH ray tracing acceleration structures.

RTX Turing accelerates BVH-traversal ray tracing methods via fixed-function hardware units; the RT cores guide the shaders' color/pixel rendering.

Yes, Mark Cerny is lying, or at the very least deliberately misleading the public, as usual. Crytek used so-called screen-space ray tracing for RLR in Crysis 2, yet they didn't claim it was ray tracing. Every game in existence fires rays for path-finding, collision detection, line of sight, etc., yet no one calls that ray tracing. Rays have been used for spatial audio before, yet no one explicitly called it ray tracing. Epic and Crytek implemented SVOGI, but they didn't claim it was ray tracing. The reason is that ray tracing has traditionally described a huge number of rays being cast for rendering in world space. Funny that only Sony/GG tried to claim they were doing ray tracing with RLR in KZ:SF, two years after Crytek, again using it as a buzzword to hype their product without admitting it's not the same thing.

For NEON NOIR demo, Crytek claims Screen Space Reflections (SSR) are turned off.

https://www.youtube.com/watch?v=1nqhkDm2_Tw

All scenes are rendered in real-time in-editor on an AMD Vega 56 GPU. Reflections are achieved with the new experimental ray tracing feature in CRYENGINE 5 - no SSR.

https://docs.cryengine.com/pages/viewpage.action?pageId=25535599

Voxel-Based Global Illumination (SVOGI)

This GI solution is based on voxel ray tracing and provides the following effects:

...

  • First we prepare voxel representation of the scene geometry (at run-time, on CPU, asynchronously and incrementally).

Vega 56 is 10.5 TFLOPS compute with 4 MB of L2 cache, which is nearly 2X the GTX 1660 Ti's ~5.5 TFLOPS (~1800 MHz) compute (not counting its separate integer units) with 1.5 MB of L2 cache.

https://www.eurogamer.net/articles/digitalfoundry-2019-metro-exodus-tech-interview

In terms of the viability of RT on next generation consoles, the hardware doesn't have to be specifically RTX cores. Those cores aren't the only thing that matters when it comes to ray tracing. They are fixed function hardware that speed up the calculations specifically relating to the BVH intersection tests. Those calculations can be done in standard compute if the compute cores are numerous and fast enough (which we believe they will be on the next gen consoles). In fact, any GPU that is running DX12 will be able to "run" DXR since DXR is just an extension of DX12.

Other things that really affect how quickly you can do ray tracing are a really fast BVH generation algorithm, which will be handled by the core APIs; and really fast memory. The nasty thing that ray tracing does, as opposed to something like say SSAO, is randomly access memory. SSAO will grab a load of texel data from a local area in texture space, and because of the way those textures are stored there is a reasonably good chance that those texels will be quite close (or adjacent) in memory. Also, the SSAO for the next pixel over will work with pretty much the same set of samples. So, you have to load far less from memory because you can cache an awful lot of data.

Working on data that is in cache speeds things up a ridiculous amount. Unfortunately, rays don't really have this same level of coherence. They can randomly access just about any part of the set of geometry, and the ray for the next pixel could be grabbing data from an equally random location. So as much as specialised hardware to speed up the calculations of the ray intersections is important, fast compute cores and memory which let you get at your bounding volume data quickly are also a viable path to doing real-time RT.
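The coherence point can be illustrated with a toy model (illustrative numbers only; the 64-byte cache line and 4-byte texel are typical values, not from the interview): count how many distinct cache lines a batch of adjacent SSAO texel fetches touches versus a batch of randomly scattered ray fetches.

```python
import random

LINE = 64    # bytes per cache line (typical; illustrative)
TEXEL = 4    # bytes per texel (e.g. RGBA8)

def lines_touched(addresses):
    """Number of distinct cache lines covering the given byte addresses."""
    return len({addr // LINE for addr in addresses})

random.seed(0)
n = 1024
# SSAO-style access: adjacent texels, so many fetches share a cache line.
ssao = [i * TEXEL for i in range(n)]
# Ray-style access: effectively random addresses into a 1 GiB geometry pool.
rays = [random.randrange(0, 1 << 30) for _ in range(n)]

print(lines_touched(ssao))   # 64: sixteen texels fit in each line
print(lines_touched(rays))   # close to 1024: nearly every fetch is a miss
```

Same number of fetches, roughly 16x the memory traffic for the incoherent case, which is exactly why fast memory matters as much as intersection hardware.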

-----------------

Side issue, DirectX12 on XBO

Ben Archard: Actually, we've got a great perf boost on Xbox-family consoles on both GPU and CPU thanks to DX12.X API. I believe it is a common/public knowledge, but GPU microcode on Xbox directly consumes API as is, like SetPSO is just a few DWORDs in command buffer. As for PC - you know, all the new stuff and features accessible goes into DX12, and DX11 is kind of forgotten. As we are frequently on the bleeding edge - we have no choice!

#555 Edited by ronvalencia (27867 posts) -

@hofuldig said:
@Grey_Eyed_Elf said:

Even DF is sceptical about the 1.8 GHz leak, with their own top-end estimate being 12 TFLOPs. Also, the leak doesn't mention CU count, and the TFLOP figure relies entirely on that, so we could be looking at anything from 9-12 TFLOPs if the 1.8 GHz is legit.

Quite frankly, the PS4 CPU is clocked at 1.6 GHz, so even if the Zen 2 CPU in the PS5 were also clocked at 1.6 GHz, you're looking at double the performance right off the bat, clock for clock, and that's before any optimizations Sony would make to the CPU itself to fit it better into the PS5.

On AVX, at the same clock speed:

Zen v1 is 2X (FMUL) and 2X (FADD) over Jaguar.

A Jaguar core has one 128-bit FADD and one 128-bit FMUL.

Jaguar's 128-bit FADD and 128-bit FMUL fuse for a 128-bit FMAC operation, so each Jaguar core has a single 128-bit FMAC's capability.

A Zen v1 core has two 128-bit FADDs and two 128-bit FMULs.

Zen v1's 128-bit FADDs and 128-bit FMULs fuse for 128-bit FMAC operations, so each Zen v1 core has two 128-bit FMACs' capability.

----------

Zen v2 has full 256-bit AVX support, hence Zen v2 is 4X (FMUL) and 4X (FADD) over Jaguar.

A Zen v2 core has two 256-bit FADDs and two 256-bit FMULs.

https://fuse.wikichip.org/news/1815/amd-discloses-initial-zen-2-details/

Zen v2's 256-bit FADDs and 256-bit FMULs fuse for 256-bit FMAC operations. Each Zen v2 core has two 256-bit FMACs, matching Intel Haswell/Broadwell/Skylake S.

At the same clock speed, Zen v2 has 4X the 128-bit FMAC capability of Jaguar.

Reminder: Intel Skylake-X has two 512-bit FMACs.
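Putting those pipe widths into numbers (FP32, per core, per clock, counting an FMA as two FLOPs, a multiply plus an add):

```python
# Back-of-envelope FP32 throughput per core per clock from the pipe
# widths described above.

def flops_per_cycle(fma_pipes, pipe_bits):
    lanes = pipe_bits // 32          # FP32 lanes per pipe
    return fma_pipes * lanes * 2     # 2 FLOPs per fused multiply-add

jaguar = flops_per_cycle(1, 128)     # one fused 128-bit FMAC
zen1   = flops_per_cycle(2, 128)     # two 128-bit FMACs
zen2   = flops_per_cycle(2, 256)     # two 256-bit FMACs

print(jaguar, zen1, zen2)            # 8 16 32
print(zen2 // jaguar)                # 4x per clock, as the post says
```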

#556 Edited by ronvalencia (27867 posts) -

@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:

Red herring with 4K. PS4 is requesting 1080p bit rate with HDR10 NOT 4K HDR10

https://arstechnica.com/gaming/2016/10/ps4-hdr-no-games-media-useless/

The article again agrees with what I said: it has the bandwidth, but what about the hardware to pass 10-bit? HDMI 1.4 only supports 8-bit, and the article further agrees with me that there's no HDR content on the original PS4 to verify with, further proving my point. Sony lied through their teeth, while the Xbone S has HDR games.

Nobody claimed it's impossible to do HDR at 1080p; the question was, and still is, how do you pass 10 bits on 8-bit hardware without compression or simulation?

I connected an old 46-inch 1080p Sony Bravia HDTV to the HDMI 1.4 port of a Radeon R9-290X, and it registered as a 12-bit color display according to AMD's control panel. LOL

I can select 8-, 10- and 12-bit color within AMD's control panel.

The old Sony Bravia HDTV is missing the specific HDR10 color profile.

DisplayPort was not used.

------------

I plan to connect the HDMI 1.4 Radeon R9-290X to an HDMI 2.0a FreeSync + HDR + 4K LG 32-inch monitor.

That LG monitor has been verified to work with my laptop's HDMI 2.0a output with HDR10, FreeSync and 4K.

You're again talking about displays and GPUs. I never claimed anything about displays or GPU power. The point is: where is the proof that HDMI 1.4 can pass 10-bit info to the display without compression/simulation or dithering? And most specifically, where's the evidence to verify Sony's HDR claims, given there's no content? I won't respond again if you keep talking about PC GPUs and TVs.

PS4 follows Hawaii GCN (R9-290X) limits on HDMI 1.4.

Dithering is at flat panel level NOT at Hawaii GCN's HDMI 1.4 port.
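As a sanity check on the bandwidth side of this argument (spec-level numbers, not a claim about any specific console): HDMI 1.3/1.4 carries up to roughly 10.2 Gbit/s of TMDS video, and 1080p60 deep color sits comfortably under that.

```python
HDMI_14_MAX_GBPS = 10.2           # aggregate TMDS video rate, HDMI 1.3/1.4
PIXEL_CLOCK_1080P60 = 148.5e6     # Hz, standard CEA-861 timing (incl. blanking)

def video_gbps(pixel_clock_hz, bits_per_channel):
    """Raw video rate for RGB: 3 channels at the given bit depth."""
    return pixel_clock_hz * 3 * bits_per_channel / 1e9

for bpc in (8, 10, 12):
    rate = video_gbps(PIXEL_CLOCK_1080P60, bpc)
    fits = "fits" if rate < HDMI_14_MAX_GBPS else "exceeds"
    print(f"1080p60 {bpc}-bit/channel: {rate:.2f} Gbit/s ({fits} HDMI 1.4)")
```

Whether a given source actually outputs 10-bit is a separate question from link capacity, which is exactly the distinction being argued over here.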

#557 Posted by hofuldig (5126 posts) -
@ronvalencia said:

PS4 follows Hawaii GCN (R9-290X) limits on HDMI 1.4.

Dithering is at flat panel level NOT at Hawaii GCN's HDMI 1.4 port.

If you simply go to Wikipedia, you can see that the HDMI 1.4 spec supports up to 16-bit panels, so anyone saying that HDMI 1.4 CAN'T do 10-bit is dumber than hell, I tell ye what!

#558 Posted by Pedro (34581 posts) -

@hofuldig: 10bit per channel not 16 bit total.

#559 Posted by EG101 (2000 posts) -

@son-goku7523 said:

@boxrekt: Your argument assumes Sony views the PS5 as some sort of higher end PS4. What makes you think Sony is content with people holding on to their PS4s and waiting for PS5 to become affordable in 2 or 3 years with competition like MS and Google Stadia around? That makes no sense at all. They want to gain marketshare as quick as possible so that the PS5 can start a new install-base ASAP for next gen.

Marketing your console to gaming enthusiasts is the surest way to fail. You’ve said so yourself, the X has been successful among the hardcore but it hasn’t moved the needle for MS at all when you place it next to Sony’s overall sales which are bolstered by the uber successful OG PS4 and PS4 Slim. The PS5 isn’t supposed to be a replacement to the PS4 Pro or compete with the X for hardcore gamers, it’s meant to replace the aging regular PS4, it’s a new starting point for Sony. Marketing it as a premium console is exactly what Sony should avoid. They are supposed to market the PS5 as an affordable yet capable console like they did with the PS4. Trying to market it as a premium PS4 Pro or X replacement will cause the PS5 to fail. It will be appealing to people like us on SW but outside gaming enthusiast circles it will fail hard, and that’s what Sony should avoid at all costs. Crossgen game development doesn’t change the dynamics at all. A $400 console is still way more marketable than a $500 or $600 one.

So why not market a $300 console? I mean, according to you, cheaper is better, right?

People expect next gen to be a jump from last gen, and you are not getting that in a $400 box, which would be a slightly better XB1X.

At $400 you are probably looking at an 8 or 9 TF console with a Zen CPU and 16 gigs of RAM, and that could be with Sony taking a loss on each unit.

It makes much more sense to aim for $600: give us 24 gigs of RAM, 12-16 TFs with a Zen CPU, and an SSD, and not take a loss on each sale.

#560 Posted by hofuldig (5126 posts) -
@Pedro said:

@hofuldig: 10bit per channel not 16 bit total.

Something I just thought about: this argument is fucking stupid. Who gives 2 shits if the base PS4 supports HDR or not? It only plays games up to 1920x1080, and to use HDR you need a 4K TV anyway. This thread is about the PS5, not about some stupid shit feature nobody is using.

#561 Posted by PC_Rocks (2316 posts) -

@Pedro said:

@hofuldig: 10bit per channel not 16 bit total.

And he had the guts to say this: "so anyone saying that HDMI 1.4 CANT do 10-bit is dumber than hell i tell ye what!" ROFLMAO!

#562 Posted by hofuldig (5126 posts) -
@EG101 said:

It makes much more sense to aim for $600: give us 24 gigs of RAM, 12-16 TFs with a Zen CPU, and an SSD, and not take a loss on each sale.

It's going to be $399, and it's going to have either 12 or 16 GB of RAM plus Zen 2. Navi is supposed to be AMD's cheap mid-range GPU, and if you haven't followed the news, Navi was redesigned to work better with consoles. I expect that at most Navi will be no more than 20% faster than the PS4 Pro; the big leaps will be in the CPU, and if devs can do some "magic" like Naughty Dog did with the PS3, they could offload some GPU workloads to the CPU.

#563 Posted by PC_Rocks (2316 posts) -

@ronvalencia:

What does that have to do with what I said? Yeah, I'm referring to both of your posts. You know what, I'm done with it; I don't have the energy to keep going in circles.

#564 Posted by EG101 (2000 posts) -

@hofuldig said:

It's going to be $399, and it's going to have either 12 or 16 GB of RAM plus Zen 2. Navi is supposed to be AMD's cheap mid-range GPU, and if you haven't followed the news, Navi was redesigned to work better with consoles. I expect that at most Navi will be no more than 20% faster than the PS4 Pro; the big leaps will be in the CPU, and if devs can do some "magic" like Naughty Dog did with the PS3, they could offload some GPU workloads to the CPU.

Would you please provide a link to Sony's official $400 price announcement?

#565 Posted by hofuldig (5126 posts) -
@EG101 said:

Would you please provide a link to Sony's official $400 price announcement?

See the PS4 launch. Also, if you actually do some research, you can find out what these components cost an individual; as a large corporation, Sony would pay significantly less. Take the Zen 2 CPU, for example: AMD revealed their yields, and it only costs them around $40 to make an 8-core chip, so assuming Sony and AMD have a very good working relationship, it wouldn't be hard to think they'd be paying $60 for the CPU. Same with the RAM: if it's GDDR6, regular consumer pricing is something like $9/GB, and it's not a stretch to think Sony is probably paying around $5/GB. The OS is free since it's developed in-house, and a 250 W PSU probably costs around $30, considering you can get a 500 W unit for $40.
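Summing those figures (every number below is the poster's own guess from the paragraph above, not a confirmed price):

```python
# Speculative component costs only; not a real bill of materials.
parts = {
    "Zen 2 CPU (8 cores)":   60,
    "16 GB GDDR6 at ~$5/GB": 16 * 5,
    "250 W PSU":             30,
}
total = sum(parts.values())
print(total)   # 170: leaves ~$230 of a $399 price for GPU, SSD,
               # board, cooling, assembly and margin
```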

#566 Edited by ronvalencia (27867 posts) -

@pc_rocks said:

@ronvalencia:

What does that have to do with what I said? Yeah, I'm referring to both of your posts. You know what, I'm done with it; I don't have the energy to keep going in circles.

You're stupid. You wasted my time when I re-activated my old i7-4770K + R9-290X gaming PC (re-flashed the BIOS from R9-390X back down to R9-290X), updated it to Windows 10 build 1809, and connected it to an old 46-inch 1920x1080 Sony HDTV via its HDMI port.

The old 46-inch 1080p Sony HDTV does NOT support DisplayPort.

The old 46-inch 1080p Sony HDTV supports 8/10/12-bit color channels according to AMD's control panel, i.e. it's better than my old 28-inch 4K Samsung monitor, which only supports 8-bit color via HDMI (up to 4K 30 Hz), while its DisplayPort supports both 8- and 10-bit color at 4K 60 Hz.

#567 Posted by BoxRekt (1393 posts) -

@son-goku7523 said:

@boxrekt: Your argument assumes Sony views the PS5 as some sort of higher end PS4.

What the hell?

That's absolutely not what I said at all. I don't know how you came to that silly conclusion. You may need to reread what I wrote, because you're 100% off if that's what you think I said.

Cross-gen development while in TRANSITION to next-gen saturation =/= mid-gen continuation of the current-gen platform.

My point was that Sony doesn't need to launch at $400, because the PS5 is backward compatible and will play all PS4 games (and the PS4 will STILL be the market leader into next gen), so Sony can maintain market dominance: PS4 and PS5 will share 98% of games during the first year or so of the next generation.

Sony will be selling PS5s to enthusiasts, and reduced-price PS4 Pros and base PS4s to low-entry adopters who want the best games from this generation, until next-gen development takes over and the PS5 reaches a lower price point for the mass market.

#568 Posted by Grey_Eyed_Elf (6389 posts) -

@hofuldig said:

@Grey_Eyed_Elf:

@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:
@emgesp said:

PS4 Pro can hit 1800p 30fps in a lot of games with those 8 Jaguar cores and a 4.2 Tflop GPU, so I'm pretty sure PS5 will handle native 4K 30fps fine for most games.

Which is exactly what I said; you might want to re-read.

Games don't have universal performance see below:

Let's say the PS5 lands between a Vega 56 and a Vega 64 in performance... With the examples above, you can see that some games will struggle to give you 30 FPS and some will get close to 60 FPS.

Now, what people don't seem to understand is that games WILL get more demanding. Logically, if you are only just good enough for 4K/30 in SOME current games, then EXPECT SOME FUTURE games to struggle to maintain 30 FPS or to fall back on dynamic resolution... With Cerny talking about ray tracing, it's almost a guarantee that next generation, regardless of how many TFLOPs are involved, will have horribly performing games just like this generation, because developers usually push the hardware to breaking point.

The logic of "well, the current generation can do this with this much power" needs to die, because your demands for resolution are exactly why we have poor-performing games on consoles. PC gamers who buy GPUs like mine or a Vega 64 aren't playing at 4K because they are not stupid... You, on the other hand, are; that's why developers will keep selling consoles to you with these hype buzzwords, and you will feed off it and then spend the next 5 years telling yourself 30 FPS is okay in 2022!

Remember, a console will always outperform a PC with the exact same specs to some degree; that's just the nature of a console's fixed hardware. Now, I'm not saying it'll be drastically better than a similar PC, but it will have its advantages for sure. Anyway, even if they had to use checkerboard rendering or a similar technique, who cares; it can look extremely close to native 4K when done right. I'm not a resolution snob.

To some degree... Look at that benchmark again: the difference between the Vega 56 and 64 is 2TFLOPs, which buys barely an extra 3-10FPS at 4K. Just look at the RX 590, which is on par with a X1X, against the 12.5 TFLOP Vega 64: it's barely a third more performance.

4K is a different beast to the lower resolutions, where the CPU can add extra performance; 4K is almost entirely GPU dependent right now, and when next-generation games come out using ray tracing and pushing graphics in general forward, that 10-12TFLOP GCN card will fold.

You console gamers will be hit with a reality check very soon, not just when it comes to resolution but performance too.

How will console gamers be hit with a reality check when the only way to do real raytracing is to spend $600 on a GPU that can't even do more than 70FPS with raytracing on at 2560x1440?

The reality check is that 4K is GPU dependent, and an RX 580/590 versus a Vega 56/64 is not a drastic power jump for 4K gaming... A lot of console gamers are under the impression that 12TFLOPS means 4K/60 and that the CPU jump will mean no performance drops in games. The reality check will come in two forms: the CPU doesn't change much at 4K, and a lot of games will still have performance issues depending on how much the developer optimises and pushes the hardware; 12TFLOPs is still not enough for 4K/60 on the majority of AAA games.
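The diminishing-returns argument above can be sketched with a toy calculation. This is purely illustrative: the FPS and TFLOPS figures below are hypothetical placeholders, not real benchmark data.

```python
# Illustrative sketch of why TFLOPS alone don't predict 4K frame rates.
# Every number below is a hypothetical placeholder, not a measured benchmark.

def naive_fps_estimate(base_fps: float, base_tflops: float, target_tflops: float) -> float:
    """Assume FPS scales linearly with shader TFLOPS (it rarely does at 4K)."""
    return base_fps * (target_tflops / base_tflops)

base_fps, base_tflops = 30.0, 7.1         # an RX 590-class card (hypothetical result)
observed_fps, target_tflops = 38.0, 12.5  # a Vega 64-class card (hypothetical result)

predicted = naive_fps_estimate(base_fps, base_tflops, target_tflops)
print(f"linear model: {predicted:.1f} FPS vs 'observed': {observed_fps:.1f} FPS")
# The linear model predicts ~52.8 FPS, while the assumed real result is 38 FPS:
# ROPs, memory bandwidth, and engine behaviour don't scale with shader TFLOPS.
```

The gap between the linear prediction and the assumed result is the whole point of the post above: paper TFLOPS overstate real 4K gains.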

Avatar image for ronvalencia
#569 Edited by ronvalencia (27867 posts) -

@boxrekt said:
@son-goku7523 said:

@boxrekt: Your argument assumes Sony views the PS5 as some sort of higher end PS4.

What the hell?

That's absolutely not what I said at all. I don't know how you came to that silly conclusion. You may need to reread what I wrote, because you're 100% off if that's what you think I said.

Cross-gen development during the TRANSITION to next gen =/= a mid-gen continuation of the current-gen platform.

My point was that Sony doesn't need to launch at $400, because the PS5 is backward compatible and will play all PS4 games (and the PS4 will STILL be the market leader into next gen), so they can maintain market dominance: PS4 and PS5 will share 98% of games during the first year or so of the next generation.

Sony will be selling PS5s to enthusiasts, and reduced-price PS4 Pros and base PS4s to low-entry adopters who want the best games from this generation, until next-gen development takes over and the PS5 reaches a lower price point for the mass market.

Unlike the PS3-to-PS4 transition, the PS5 can leverage the PS4's market power.

An 8-core Zen 2 should be the real CELL replacement, i.e. ~2X FP32 GFLOPS (and ~4X FP16 GFLOPS via rapid packed math, which CELL lacks) for an 8-core Zen 2 at 1.6GHz versus CELL at 3.2GHz.

PS: SPU effective IPC is ~0.5, while Jaguar's effective IPC is 1.x to 1.8.
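The ~2X FP32 claim above can be sanity-checked with a peak-throughput calculation. This is a rough sketch: it ignores CELL's PPE and assumes full dual-pipe 256-bit FMA throughput on Zen 2.

```python
# Sanity check of the "~2X FP32" claim with peak-throughput math.
def peak_fp32_gflops(cores: int, fp32_flops_per_cycle: int, ghz: float) -> float:
    """Theoretical peak single-precision throughput in GFLOPS."""
    return cores * fp32_flops_per_cycle * ghz

# CELL: 8 SPEs, each a 4-wide FP32 FMA unit -> 4 lanes * 2 ops = 8 FLOPs/cycle
cell = peak_fp32_gflops(8, 8, 3.2)    # 204.8 GFLOPS (ignores the PPE)
# Zen 2: two 256-bit FMA pipes -> 2 * 8 lanes * 2 ops = 32 FP32 FLOPs/cycle
zen2 = peak_fp32_gflops(8, 32, 1.6)   # 409.6 GFLOPS
print(f"Zen 2 / CELL = {zen2 / cell:.1f}x")  # -> 2.0x
```

So even at a conservative 1.6GHz, eight Zen 2 cores roughly double CELL's peak FP32, which is where the ~2X figure comes from.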

Avatar image for ronvalencia
#570 Edited by ronvalencia (27867 posts) -

@hofuldig said:
@EG101 said:
@son-goku7523 said:

@boxrekt: Your argument assumes Sony views the PS5 as some sort of higher end PS4. What makes you think Sony is content with people holding on to their PS4s and waiting for PS5 to become affordable in 2 or 3 years with competition like MS and Google Stadia around? That makes no sense at all. They want to gain marketshare as quick as possible so that the PS5 can start a new install-base ASAP for next gen.

Marketing your console to gaming enthusiasts is the surest way to fail. You’ve said so yourself, the X has been successful among the hardcore but it hasn’t moved the needle for MS at all when you place it next to Sony’s overall sales which are bolstered by the uber successful OG PS4 and PS4 Slim. The PS5 isn’t supposed to be a replacement to the PS4 Pro or compete with the X for hardcore gamers, it’s meant to replace the aging regular PS4, it’s a new starting point for Sony. Marketing it as a premium console is exactly what Sony should avoid. They are supposed to market the PS5 as an affordable yet capable console like they did with the PS4. Trying to market it as a premium PS4 Pro or X replacement will cause the PS5 to fail. It will be appealing to people like us on SW but outside gaming enthusiast circles it will fail hard, and that’s what Sony should avoid at all costs. Crossgen game development doesn’t change the dynamics at all. A $400 console is still way more marketable than a $500 or $600 one.

So why not market a $300 console? I mean, according to you, cheaper is better, right?

People expect next gen to be a jump from last gen, and you are not getting that in a $400 box that would be a slightly better XB1X.

At $400 you are probably looking at an 8 or 9 TF console with a Zen CPU and 16 gigs of RAM, and that could be with Sony taking a loss on each unit.

It makes much more sense to aim for $600: give us 24 gigs of RAM, 12-16 TFs with a Zen CPU and an SSD, and don't take a loss on each sale.

It's going to be $399. It's going to have either 12 or 16GB of RAM and Zen 2. Navi is supposed to be AMD's cheap mid-range GPU, and if you haven't followed the news, Navi was redesigned to work better with consoles. I expect that at most Navi will be no more than 20% faster than the PS4 Pro; the big leaps will be in the CPU, and if devs can do some "magic" like Naughty Dog did with the PS3, they could offload some GPU workloads to the CPU.

The X1X GPU is already up to 2X the effectiveness of the PS4 Pro GPU with RDR2.

Built on 2nd-gen 16 nm tech, the X1X is already 42 percent faster in FP32 when compared to the PS4 Pro. Are you crazy?

The X1X GPU introduces a near-Vega ROP multi-MB cache design, but Vega 56 has 64 ROPS with a 4MB L2 cache while the X1X has 32 ROPS with a 2MB render cache.

RX Vega II ~= Vega 64 + ~32 percent

RX 3080 (256-bit GDDR6) ~= Vega 64 + ~15 percent, replaces RX 580/590

RX 3070 (256-bit GDDR6) ~= Vega 56 level <------- PS5's expectation is at this level. Replaces RX 570

RX 3060 (128-bit GDDR6) ~= RX 580 level, replaces RX 560

Avatar image for EG101
#571 Posted by EG101 (2000 posts) -

@hofuldig said:
@EG101 said:
@hofuldig said:
@EG101 said:
@son-goku7523 said:

@boxrekt: Your argument assumes Sony views the PS5 as some sort of higher end PS4. What makes you think Sony is content with people holding on to their PS4s and waiting for PS5 to become affordable in 2 or 3 years with competition like MS and Google Stadia around? That makes no sense at all. They want to gain marketshare as quick as possible so that the PS5 can start a new install-base ASAP for next gen.

Marketing your console to gaming enthusiasts is the surest way to fail. You’ve said so yourself, the X has been successful among the hardcore but it hasn’t moved the needle for MS at all when you place it next to Sony’s overall sales which are bolstered by the uber successful OG PS4 and PS4 Slim. The PS5 isn’t supposed to be a replacement to the PS4 Pro or compete with the X for hardcore gamers, it’s meant to replace the aging regular PS4, it’s a new starting point for Sony. Marketing it as a premium console is exactly what Sony should avoid. They are supposed to market the PS5 as an affordable yet capable console like they did with the PS4. Trying to market it as a premium PS4 Pro or X replacement will cause the PS5 to fail. It will be appealing to people like us on SW but outside gaming enthusiast circles it will fail hard, and that’s what Sony should avoid at all costs. Crossgen game development doesn’t change the dynamics at all. A $400 console is still way more marketable than a $500 or $600 one.

So why not market a $300 console? I mean, according to you, cheaper is better, right?

People expect next gen to be a jump from last gen, and you are not getting that in a $400 box that would be a slightly better XB1X.

At $400 you are probably looking at an 8 or 9 TF console with a Zen CPU and 16 gigs of RAM, and that could be with Sony taking a loss on each unit.

It makes much more sense to aim for $600: give us 24 gigs of RAM, 12-16 TFs with a Zen CPU and an SSD, and don't take a loss on each sale.

It's going to be $399. It's going to have either 12 or 16GB of RAM and Zen 2. Navi is supposed to be AMD's cheap mid-range GPU, and if you haven't followed the news, Navi was redesigned to work better with consoles. I expect that at most Navi will be no more than 20% faster than the PS4 Pro; the big leaps will be in the CPU, and if devs can do some "magic" like Naughty Dog did with the PS3, they could offload some GPU workloads to the CPU.

Would you please provide a link to Sony's official $400 price announcement.

See the "PS4" launch. Also, if you actually do some research you can find out what these components cost for an individual; as a large corporation, Sony would pay significantly less. Take the Zen 2 CPU: AMD revealed their yields, and it only costs them about $40 to make an 8-core chip, so assuming Sony and AMD have a very good working relationship, it wouldn't be hard to think they'd pay around $60 for the CPU. Same with the RAM: if it's GDDR6, regular consumer pricing is something like $9/GB, so it's not a hard stretch to think Sony is paying around $5/GB. The OS is free since it's developed in-house, and a 250W PSU probably costs around $30, considering you can get a 500W unit for $40.
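The component guesses above can be tallied into a quick back-of-envelope bill of materials. Every figure here is speculation lifted from the post, not a confirmed cost, and several major parts are deliberately missing.

```python
# Rough bill-of-materials sketch using the guessed figures in the post above.
# Every number is thread speculation, not a confirmed Sony cost.
bom_usd = {
    "8-core Zen 2 CPU": 60,
    "16 GB GDDR6 @ ~$5/GB": 16 * 5,
    "250 W PSU": 30,
}
parts_total = sum(bom_usd.values())
print(parts_total)  # 170
# That leaves headroom under a $399 target, but it omits the GPU die, SSD,
# cooling, motherboard, enclosure, assembly, shipping and retail margin --
# which is why back-of-envelope BOMs like this tend to understate real cost.
```

The point of the sketch is the caveat in the comment: the three guessed parts only sum to $170, so the argument really turns on the costs the post leaves out.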

So No Link.

Guess there is still a chance of a $500-$600 powerhouse PS5 being released instead of a PS4 Pro-er.

Avatar image for hofuldig
#572 Posted by hofuldig (5126 posts) -
@EG101 said:
@hofuldig said:
@EG101 said:
@hofuldig said:
@EG101 said:

So why not market a $300 console? I mean, according to you, cheaper is better, right?

People expect next gen to be a jump from last gen, and you are not getting that in a $400 box that would be a slightly better XB1X.

At $400 you are probably looking at an 8 or 9 TF console with a Zen CPU and 16 gigs of RAM, and that could be with Sony taking a loss on each unit.

It makes much more sense to aim for $600: give us 24 gigs of RAM, 12-16 TFs with a Zen CPU and an SSD, and don't take a loss on each sale.

It's going to be $399. It's going to have either 12 or 16GB of RAM and Zen 2. Navi is supposed to be AMD's cheap mid-range GPU, and if you haven't followed the news, Navi was redesigned to work better with consoles. I expect that at most Navi will be no more than 20% faster than the PS4 Pro; the big leaps will be in the CPU, and if devs can do some "magic" like Naughty Dog did with the PS3, they could offload some GPU workloads to the CPU.

Would you please provide a link to Sony's official $400 price announcement.

See the "PS4" launch. Also, if you actually do some research you can find out what these components cost for an individual; as a large corporation, Sony would pay significantly less. Take the Zen 2 CPU: AMD revealed their yields, and it only costs them about $40 to make an 8-core chip, so assuming Sony and AMD have a very good working relationship, it wouldn't be hard to think they'd pay around $60 for the CPU. Same with the RAM: if it's GDDR6, regular consumer pricing is something like $9/GB, so it's not a hard stretch to think Sony is paying around $5/GB. The OS is free since it's developed in-house, and a 250W PSU probably costs around $30, considering you can get a 500W unit for $40.

So No Link.

Guess there is still a chance of a $500-$600 powerhouse PS5 being released instead of a PS4 Pro-er.

Tell you what: the PS4 and PS4 Pro both launched at $399. The Pro is a pretty significant upgrade over the PS4, but Sony still chose to sell it at $399, so there is absolutely no reason to think Sony would sell a PS5 at any more than $399. It's very clear that $399 is Sony's price, and given the cost of the hardware, and the fact that console manufacturers always take a loss on their products for the first year or two, it's not even a hard stretch to think the PS5 will be $399. Still a better deal than building a whole new gaming PC.

Avatar image for ronvalencia
#573 Edited by ronvalencia (27867 posts) -

@Grey_Eyed_Elf said:
@hofuldig said:

@Grey_Eyed_Elf:

@Grey_Eyed_Elf said:
@emgesp said:

Remember, a console will always outperform a PC with the exact same specs to some degree. It's just the nature of a console's fixed hardware. Now, I'm not saying it'll be drastically better than a similar PC, but it will have its advantages for sure. Anyway, even if they have to use checkerboard rendering or a similar technique, who cares? It can look extremely close to native 4K when done right. I'm not a resolution snob.

To some degree... Look at that benchmark again: the difference between the Vega 56 and 64 is 2TFLOPs, which buys barely an extra 3-10FPS at 4K. Just look at the RX 590, which is on par with a X1X, against the 12.5 TFLOP Vega 64: it's barely a third more performance.

4K is a different beast to the lower resolutions, where the CPU can add extra performance; 4K is almost entirely GPU dependent right now, and when next-generation games come out using ray tracing and pushing graphics in general forward, that 10-12TFLOP GCN card will fold.

You console gamers will be hit with a reality check very soon, not just when it comes to resolution but performance too.

How will console gamers be hit with a reality check when the only way to do real raytracing is to spend $600 on a GPU that can't even do more than 70FPS with raytracing on at 2560x1440?

The reality check is that 4K is GPU dependent, and an RX 580/590 versus a Vega 56/64 is not a drastic power jump for 4K gaming... A lot of console gamers are under the impression that 12TFLOPS means 4K/60 and that the CPU jump will mean no performance drops in games. The reality check will come in two forms: the CPU doesn't change much at 4K, and a lot of games will still have performance issues depending on how much the developer optimises and pushes the hardware; 12TFLOPs is still not enough for 4K/60 on the majority of AAA games.

Reality check: a locked 4K/60Hz is not required with FreeSync over HDMI 2.0 and HDMI 2.1.

Rasterization more than doubles going from the RX 590's 32 ROPS with no multi-MB L2 cache to Vega 56's 64 ROPS backed by a 4MB L2 cache.

World War Z is an example of a game optimized for Vega GPUs:

1. Vulkan low-level API

2. Shader intrinsics (similar to the incoming DirectML meta-commands API access)

3. Rapid packed math (similar to the incoming DirectML API access)

4. Heavy async compute with higher TMU read/write path usage to work around the 64-ROPS limit. Works well when the 4MB L2 cache boundary is factored in, i.e. a software tiled cache render.

There's a reason Turing RTX 2080/2070 target a 4MB L2 cache, in common with Vega 56/64/VII.

5. PS4 games will not carry the NVIDIA GameWorks shader library with Nvidia's architecture assumptions.
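The ROP comparison in the post above can be made concrete with a peak pixel fill-rate estimate (ROPs × core clock). The clock figures below are approximate reference values, and peak fill rate is only one factor in real rasterization throughput.

```python
# Back-of-envelope check of the ROP claim: peak pixel fill rate is roughly
# ROPs * core clock. Clock figures are approximate reference values.
def peak_fill_gpixels(rops: int, clock_ghz: float) -> float:
    """Peak pixel fill rate in gigapixels per second."""
    return rops * clock_ghz

rx590  = peak_fill_gpixels(32, 1.545)  # ~49.4 GP/s, ROPs with no multi-MB L2
vega56 = peak_fill_gpixels(64, 1.471)  # ~94.1 GP/s, ROPs backed by a 4 MB L2
print(f"Vega 56 / RX 590 fill rate = {vega56 / rx590:.2f}x")  # ~1.90x
```

On this simple model the ratio comes out near 1.9x; the post's "more than doubles" additionally credits the L2 cache behind Vega's ROPs, which a raw ROPs-times-clock figure can't capture.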

Avatar image for Grey_Eyed_Elf
#574 Posted by Grey_Eyed_Elf (6389 posts) -

@ronvalencia said:
@Grey_Eyed_Elf said:
@hofuldig said:

@Grey_Eyed_Elf:

@Grey_Eyed_Elf said:
@emgesp said:

Remember, a console will always outperform a PC with the exact same specs to some degree. It's just the nature of a console's fixed hardware. Now, I'm not saying it'll be drastically better than a similar PC, but it will have its advantages for sure. Anyway, even if they have to use checkerboard rendering or a similar technique, who cares? It can look extremely close to native 4K when done right. I'm not a resolution snob.

To some degree... Look at that benchmark again: the difference between the Vega 56 and 64 is 2TFLOPs, which buys barely an extra 3-10FPS at 4K. Just look at the RX 590, which is on par with a X1X, against the 12.5 TFLOP Vega 64: it's barely a third more performance.

4K is a different beast to the lower resolutions, where the CPU can add extra performance; 4K is almost entirely GPU dependent right now, and when next-generation games come out using ray tracing and pushing graphics in general forward, that 10-12TFLOP GCN card will fold.

You console gamers will be hit with a reality check very soon, not just when it comes to resolution but performance too.

How will console gamers be hit with a reality check when the only way to do real raytracing is to spend $600 on a GPU that can't even do more than 70FPS with raytracing on at 2560x1440?

The reality check is that 4K is GPU dependent, and an RX 580/590 versus a Vega 56/64 is not a drastic power jump for 4K gaming... A lot of console gamers are under the impression that 12TFLOPS means 4K/60 and that the CPU jump will mean no performance drops in games. The reality check will come in two forms: the CPU doesn't change much at 4K, and a lot of games will still have performance issues depending on how much the developer optimises and pushes the hardware; 12TFLOPs is still not enough for 4K/60 on the majority of AAA games.

Reality check: a locked 4K/60Hz is not required with FreeSync over HDMI 2.0 and HDMI 2.1.

Rasterization more than doubles going from the RX 590's 32 ROPS with no multi-MB L2 cache to Vega 56's 64 ROPS backed by a 4MB L2 cache.

World War Z is an example of a game optimized for Vega GPUs:

1. Vulkan low-level API

2. Shader intrinsics (similar to the incoming DirectML meta-commands API access)

3. Rapid packed math (similar to the incoming DirectML API access)

4. Heavy async compute with higher TMU read/write path usage to work around the 64-ROPS limit. Works well when the 4MB L2 cache boundary is factored in, i.e. a software tiled cache render.

There's a reason Turing RTX 2080/2070 target a 4MB L2 cache, in common with Vega 56/64/VII.

5. PS4 games will not carry the NVIDIA GameWorks shader library with Nvidia's architecture assumptions.

Reality check: no one is upgrading their $1-2K 4K TV just to get FreeSync. That's the same as saying every RTX gamer has a 144Hz ultrawide G-Sync display... So for the 99% of people who buy a PS5, a locked framerate is required, because their TV won't support FreeSync. I just bought my TV 8 months ago and I don't plan on upgrading for another 3-4 years. Your logic is so flawed it's ridiculous sometimes.

All your gibberish is meaningless. The reality is that Navi is targeted at mid-range hardware and at best will perform around Vega 56-64 levels, which is not enough for current AAA games at 4K/60, let alone next-generation games that push graphics further and/or use a form of ray tracing.

Avatar image for son-goku7523
#575 Posted by Son-Goku7523 (955 posts) -
@hofuldig said:
@EG101 said:
@hofuldig said:
@EG101 said:
@hofuldig said:

It's going to be $399. It's going to have either 12 or 16GB of RAM and Zen 2. Navi is supposed to be AMD's cheap mid-range GPU, and if you haven't followed the news, Navi was redesigned to work better with consoles. I expect that at most Navi will be no more than 20% faster than the PS4 Pro; the big leaps will be in the CPU, and if devs can do some "magic" like Naughty Dog did with the PS3, they could offload some GPU workloads to the CPU.

Would you please provide a link to Sony's official $400 price announcement.

See the "PS4" launch. Also, if you actually do some research you can find out what these components cost for an individual; as a large corporation, Sony would pay significantly less. Take the Zen 2 CPU: AMD revealed their yields, and it only costs them about $40 to make an 8-core chip, so assuming Sony and AMD have a very good working relationship, it wouldn't be hard to think they'd pay around $60 for the CPU. Same with the RAM: if it's GDDR6, regular consumer pricing is something like $9/GB, so it's not a hard stretch to think Sony is paying around $5/GB. The OS is free since it's developed in-house, and a 250W PSU probably costs around $30, considering you can get a 500W unit for $40.

So No Link.

Guess there is still a chance of a $500-$600 powerhouse PS5 being released instead of a PS4 Pro-er.

Tell you what: the PS4 and PS4 Pro both launched at $399. The Pro is a pretty significant upgrade over the PS4, but Sony still chose to sell it at $399, so there is absolutely no reason to think Sony would sell a PS5 at any more than $399. It's very clear that $399 is Sony's price, and given the cost of the hardware, and the fact that console manufacturers always take a loss on their products for the first year or two, it's not even a hard stretch to think the PS5 will be $399. Still a better deal than building a whole new gaming PC.

I totally agree, and that's something people seem to forget. Sony is in it to win, and to win they need to attract hardcore as well as not-so-hardcore gamers, and the only way to do that is to take a hit and sell at $399. $499 will only attract hardcore gamers like us and forum dwellers; it's not a successful price point. No console has ever sold at that price and been successful. Fact is, the X flopped at that price, and even if MS had games for it, only hardcore lems would have bought it. $499 simply isn't an attractive price for a console at launch. People comparing it to phone prices are missing the point: phone prices are subsidized by cellphone companies, and phones are viewed as an essential everyday device by consumers, a luxury that consoles do not enjoy.

Avatar image for boxrekt
#576 Edited by BoxRekt (1393 posts) -

@son-goku7523:

I'm a Sony fan but it's going to be $499 bro.

Don't be dense or stupid; the X was $499 and is only 6TF, running random current-gen games at 4K 30fps, most of them upscaled from 1800p.

You're just setting yourself up for disappointment with your silly expectation.

You heard the specs of the PS5. Trust me, Sony will be taking a loss on the PS5, but they're taking that loss just to get to the $499 price point.

If you're not ready to upgrade to a NEXT GENERATION system then you're just going to have to settle for current gen titles that cross over until you can afford to make the jump.

Avatar image for Pedro
#577 Posted by Pedro (34581 posts) -

Oh no! Here comes the battle of the Alts.

Avatar image for son-goku7523
#578 Edited by Son-Goku7523 (955 posts) -
@boxrekt said:

@son-goku7523:

I'm a Sony fan but it's going to be $499 bro.

Don't be dense or stupid; the X was $499 and is only 6TF, running random current-gen games at 4K 30fps, most of them upscaled from 1800p.

You're just setting yourself up for disappointment with your silly expectation.

You heard the specs of the PS5. Trust me, Sony will be taking a loss on the PS5, but they're taking that loss just to get to the $499 price point.

If you're not ready to upgrade to a NEXT GENERATION system then you're just going to have to settle for current gen titles that cross over until you can afford to make the jump.

Don't be a dumbass. The X cost that much based on what MS was willing to lose to get it into people's homes, how many they planned to produce, and what AMD had available at the time. You can't use the X to predict what Sony will be releasing in 2020. I'm a PC gamer as well as a console gamer, and I've been gaming for over 30 years. I'm not saying the price will be $399 because I'm a fan of Sony; I'm saying it based on my informed knowledge of how tech and consoles are made and sold.

The PS5 is based on AMD's new APUs. They are more efficient, have much better manufacturing yields if rumors are to be believed, and Sony plans to manufacture a lot more of them than MS did for the X, so they will get a much better deal from AMD than MS was able to get. On top of that, Sony's gaming division is making a lot more money in 2019 from PSN and PS Store royalties than MS's gaming division was in 2017 (or Sony in 2006), and as a result they can eat the cost better. They can sell at a loss and still make their money back in spades from PSN and services. A $399 console gets more consoles into people's homes, and more subscriptions, than any $499 console ever could.
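The sell-at-a-loss argument above can be expressed as a toy break-even model. Both inputs are invented purely for illustration; neither is a known Sony figure.

```python
# Toy loss-leader model for the pricing argument above.
# Both inputs are invented for illustration, not known Sony figures.
hardware_loss_usd = 100          # assumed loss per console at a $399 price
services_margin_per_month = 5.0  # assumed PSN/store margin per active console

months_to_break_even = hardware_loss_usd / services_margin_per_month
print(months_to_break_even)  # 20.0 -- under two years of services revenue
```

Under these made-up numbers the hardware loss is recovered in well under a console generation, which is the shape of the argument being made: upfront hardware losses buy a larger installed base that pays back through services.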

Avatar image for boxrekt
#579 Posted by BoxRekt (1393 posts) -

@son-goku7523:

Wow, ok buddy. Don't say I didn't tell you Mr informed.

Avatar image for son-goku7523
#580 Edited by Son-Goku7523 (955 posts) -
@boxrekt said:

@son-goku7523:

Wow, ok buddy. Don't say I didn't tell you Mr informed.

I've been right for years now when it comes to prices and even release windows for consoles; I don't see myself being wrong anytime soon.

A lot of people base these things on what they want them to be instead of what they are likely to be. Many people want the PS5 to be a $499 beast, but they aren't thinking about how Sony could sell such a machine to the masses (not hardcore gamers and forum dwellers like us). I like to look at a company's recent decisions and its present to see where it might go next. Sony's recent moves with the Pro show a manufacturer that thinks $399 is the way to go after failing at $499 in the past, and I think that's where they will go. Sony doesn't like risk when it comes to selling PS hardware; they take risks with software, but ever since the PS3 flopped they've played it safe with hardware, and $399 is the safest price they can launch at.

Avatar image for i_p_daily
#581 Posted by I_P_Daily (11814 posts) -

@Pedro said:

Oh no! Here comes the battle of the cow Alts.

Fixed that for you :)

Oh and it's fun to watch lol.

Avatar image for Grey_Eyed_Elf
#582 Posted by Grey_Eyed_Elf (6389 posts) -

@i_p_daily said:
@Pedro said:

Oh no! Here comes the battle of the cow Alts.

Fixed that for you :)

Oh and it's fun to watch lol.

Yeah, I am beginning to feel more and more lately that this forum consists of 1 to 3 people with 3 or more alts each, just arguing with each other, and they all seem very trollish, dumb, and/or misinformed about all things hardware.

Avatar image for ronvalencia
#583 Edited by ronvalencia (27867 posts) -

@Grey_Eyed_Elf said:
@ronvalencia said:
@Grey_Eyed_Elf said:
@hofuldig said:

@Grey_Eyed_Elf:

How will console gamers be hit with a reality check when the only way to do real raytracing is to spend $600 on a GPU that can't even do more than 70FPS with raytracing on at 2560x1440?

The reality check is that 4K is GPU dependent, and an RX 580/590 versus a Vega 56/64 is not a drastic power jump for 4K gaming... A lot of console gamers are under the impression that 12TFLOPS means 4K/60 and that the CPU jump will mean no performance drops in games. The reality check will come in two forms: the CPU doesn't change much at 4K, and a lot of games will still have performance issues depending on how much the developer optimises and pushes the hardware; 12TFLOPs is still not enough for 4K/60 on the majority of AAA games.

Reality check: a locked 4K/60Hz is not required with FreeSync over HDMI 2.0 and HDMI 2.1.

Rasterization more than doubles going from the RX 590's 32 ROPS with no multi-MB L2 cache to Vega 56's 64 ROPS backed by a 4MB L2 cache.

World War Z is an example of a game optimized for Vega GPUs:

1. Vulkan low-level API

2. Shader intrinsics (similar to the incoming DirectML meta-commands API access)

3. Rapid packed math (similar to the incoming DirectML API access)

4. Heavy async compute with higher TMU read/write path usage to work around the 64-ROPS limit. Works well when the 4MB L2 cache boundary is factored in, i.e. a software tiled cache render.

There's a reason Turing RTX 2080/2070 target a 4MB L2 cache, in common with Vega 56/64/VII.

5. PS4 games will not carry the NVIDIA GameWorks shader library with Nvidia's architecture assumptions.

Reality check: no one is upgrading their $1-2K 4K TV just to get FreeSync. That's the same as saying every RTX gamer has a 144Hz ultrawide G-Sync display... So for the 99% of people who buy a PS5, a locked framerate is required, because their TV won't support FreeSync. I just bought my TV 8 months ago and I don't plan on upgrading for another 3-4 years. Your logic is so flawed it's ridiculous sometimes.

All your gibberish is meaningless. The reality is that Navi is targeted at mid-range hardware and at best will perform around Vega 56-64 levels, which is not enough for current AAA games at 4K/60, let alone next-generation games that push graphics further and/or use a form of ray tracing.

FreeSync panel sales, across both PC monitors and Ultra-HD TVs, are a larger market than the PC's niche 144Hz G-Sync panels. Hint: NVIDIA was forced to support FreeSync due to market pressure.

Forza Motorsport 7 on the PC is an example lacking NVIDIA's Gimpworks, and it runs 8K 60Hz on a Vega 56 at 1500MHz (~11 TFLOPS).

Sony's 1st-party games are the main strength of the PlayStation platform, and their GCN optimisations are similar to Forza Motorsport 7's. Gran Turismo Sport is already 8K on the PS5 dev kit, which is similar to the Vega 56 at 1500MHz Forza Motorsport 7 result.

The World War Z example points to Vega 56/64/VII GCN optimization methods for Sony's 1st-party games.

NVIDIA's DX11 driver is multi-threaded with async command-list submission, while AMD's DX11 driver is single-threaded.

Sony doesn't have a problem with 1st-party game content to sell their PlayStation platform.

PS: Pascal has async compute with a single hardware context (Hyper-Q) under the DX11 driver, which fails DX12's multi-context async compute requirements. DX12/Vulkan would require CPU emulation to maintain multiple async contexts on Pascal.

Avatar image for i_p_daily
#584 Posted by I_P_Daily (11814 posts) -

@Grey_Eyed_Elf said:
@i_p_daily said:
@Pedro said:

Oh no! Here comes the battle of the cow Alts.

Fixed that for you :)

Oh and it's fun to watch lol.

Yeah, I am beginning to feel more and more lately that this forum consists of 1 to 3 people with 3 or more alts each, just arguing with each other, and they all seem very trollish, dumb, and/or misinformed about all things hardware.

Well, you're in a conversation with ronbot; that shit would do my head in, as I don't even read what he says lol.

I don't care for the technical side of gaming, just the games, so maybe that's where you're going wrong.

As for the same people with many alts, well, I do agree, as a LOT if not all are cows who continue to be banned but come back time and time again.

Avatar image for boxrekt
#585 Edited by BoxRekt (1393 posts) -

@son-goku7523 said:
@boxrekt said:

@son-goku7523:

Wow, ok buddy. Don't say I didn't tell you Mr informed.

I've been right for years now when it comes to prices and even release periods for consoles, I don't see myself being wrong anytime soon.

A lot of people base these things on what they want things to be instead of what they are likely gonna be. Many people want the PS5 to be a $499 beast but

REALITY CHECK!

Sony ALREADY gave an outline of what the PS5 will be and what it will have inside!

It looks like you're the one basing things on what you want.

I'm going off what Sony promised. But like I said, don't say I didn't tell you when you end up disappointed by your unrealistic expectations.

Avatar image for Grey_Eyed_Elf
#586 Edited by Grey_Eyed_Elf (6389 posts) -

@ronvalencia said:
@Grey_Eyed_Elf said:

Reality check: no one is upgrading their $1-2K 4K TV just to get FreeSync. That's the same as saying every RTX gamer has a 144Hz ultrawide G-Sync display... So for the 99% of people who buy a PS5, a locked framerate is required, because their TV won't support FreeSync. I just bought my TV 8 months ago and I don't plan on upgrading for another 3-4 years. Your logic is so flawed it's ridiculous sometimes.

All your gibberish is meaningless. The reality is that Navi is targeted at mid-range hardware and at best will perform around Vega 56-64 levels, which is not enough for current AAA games at 4K/60, let alone next-generation games that push graphics further and/or use a form of ray tracing.

FreeSync panel sales, across both PC monitors and Ultra-HD TVs, are a larger market than the PC's niche 144Hz G-Sync panels. Hint: NVIDIA was forced to support FreeSync due to market pressure.

Forza Motorsport 7 on the PC is an example lacking NVIDIA's Gimpworks, and it runs 8K 60Hz on a Vega 56 at 1500MHz (~11 TFLOPS).

Sony's 1st-party games are the main strength of the PlayStation platform, and their GCN optimisations are similar to Forza Motorsport 7's. Gran Turismo Sport is already 8K on the PS5 dev kit, which is similar to the Vega 56 at 1500MHz Forza Motorsport 7 result.

The World War Z example points to Vega 56/64/VII GCN optimization methods for Sony's 1st-party games.

You are talking about easy-to-run games... A GTX 1080 Ti at 8K in Forza 7 runs at less than 60% usage; it's a rare game.

Not all engines/developers are the same, let alone the games, so pointing out easy-to-run games means nothing when the other 9/10 games released that year struggle to achieve a playable 30FPS at lower resolutions. Your entire performance argument is a weak straw-man attempt.

No one is saying there won't be easy-to-run games or beautifully optimised games, least of all me. My point is that the majority of games do NOT run well, and your implication that next generation ALL games will achieve this level of optimisation is just false... And even if they were optimised, that still wouldn't mean 60FPS, since most developers will happily spend the extra headroom on newer features to push graphics and target 30FPS. Since one of those new features is ray tracing, you had better believe 60FPS will not be the norm unless the developer intends it.

As for your FreeSync argument, stop it... 4K adoption is still low, let alone HDR/OLED/QLED, and now you act as if steady framerates won't matter because 0.05% of people on Earth have FreeSync TVs? Stop it.

Since your straw-man pulling is annoying, let me post 10 games that don't hit 60... which will illustrate my point that games/engines/developers are all different, and that depending on the developer's talent or graphical targets, 60FPS is not a guarantee regardless of how much power you have, especially on a console, where the developer will more than likely choose higher graphical quality over 60FPS, especially for AAA first-party games. Racing and fighting games are the only exceptions to this rule; otherwise Uncharted and Gears of War would run at 60FPS, but no, the developers chose to push graphics over framerate.

[Gallery: 10 screenshots of games that don't hit 60FPS]

Avatar image for ronvalencia
#587 Edited by ronvalencia (27867 posts) -

@Grey_Eyed_Elf said:
@ronvalencia said:
@Grey_Eyed_Elf said:

Reality check no one is upgrading their $1-2K 4K TV just to get Freesync. That's the same as saying that every RTX gamer has a 144Hz Ultrawide G-Sync display... So for the 99% of people who buy a PS5 It is required because their TV won't support freesync, I just bought my TV 8 months ago and I don't plan on upgrading for another 3-4 years. Your logic is so flawed its ridiculous some times.

All your gibberish is meaningless the reality is Navi is targeted for mid range hardware and at best will perform around Vega 56-64 levels which is not enough for current AAA games at 4K/60 let alone next generation games that push graphics further and or use a form of Ray Tracing.

FreeSync panels sales from both PC monitor and Ultra-HDTV segments are higher market niche PC's 144 hz. Hint: NVIDIA is forced to support FreeSync due to market pressure.

Forza Motosport 7 on the PC is an example lacking NVIDIA's Gimpworks and it's 8K 60 Hz on Vega 56 at 1500Mhz (~11 TFLOPS).

Sony's 1st party games are the main strength for Play-Station platform and their GCN optimisations are similar to Foza Motosport 7. Gran Turismo Sport is already 8K on PS5 dev kit similar to Vega 56 at 1500Mhz's Forza Motosport 7 results.

World War Z example points to Vega 56/64/VII GCN optimization methods for Sony's 1st party games.

You are talking about easy to run games... A GTX 1080 Ti at 8K on Forza 7 runs at less than 60% usage, its a rare game.

Not all engines/developers are the same let alone the game, so pointing out easy to run games means nothing when the other 9/10 games released that year struggle to achieve playable 30FPS at lower resolutions, your entire arguments based on performance are weak straw man attempts.

No one is saying that their won't be easy to run games or beautiful optimised games let alone me, my point is that the majority of games do NOT run well and you're implying that next generation ALL games will achieve this level of optimisation is just false... And even if they where optimised it would then still not mean 60FPS since most developers will easily opt for newer features to push graphics with that extra headroom and opt for a 30FPS target and since one of those new features are Ray Tracing you better believe that 60FPS will not be the norm unless its the developers intent.

As for your FreeSync stop it... 4K adoption is still low let alone HDR/OLED/QLED and now you act as if steady framerates won't matter because 0.05% of people have FreeSync TV's on earth? Stop it.

Arguing GTX 1080 Ti for Forza M7 at 8K is a red herring, since the GTX 1080 Ti typically has higher TFLOPS from its stealth overclock of 1700 to 1800MHz (depending on TDP headroom), and that's the GTX 1080 Ti FE model, not counting factory-OC editions. That's 12.1 TFLOPS potential for a GTX 1080 Ti FE at 1700MHz.

Again, PlayStation's main sales strength is its 1st party games.

FreeSync is beating G-Sync in flat panel sales, hence NVIDIA was forced to eat humble pie.

We know from the Tekken 7 developers that Unreal Engine 4 is not GCN friendly, i.e. its memory access patterns need to be re-coded. Unreal Engine 4 *is* a Gameworks game engine, with heavier pixel shader/ROPS path usage.

It's well known that AMD GPUs get smashed by heavy pixel shader/ROPS path usage, an area where the GTX 1080 Ti and RTX 2080 Ti have hardware superiority.

NVIDIA's single-context Hyper-Q (async command list submission) is well optimized for DX11 multi-threading with the pixel shader/ROPS path.

Console GCNs have two graphics command list processor units while the PC versions are stuck with a single unit. Supporting DX11 MT command lists needs hardware support, and it's lacking on AMD's PC GPUs. XBO's DX11.X MT support is a no-brainer when there are two graphics command list processor units.

When Vega 56 is overclocked to 1710MHz (12.2 TFLOPS) it beats Vega 64 at 1590MHz (13 TFLOPS); higher clock speed improves the ROPS read/write path.

A well-optimized game on an AMD GPU should land around its NVIDIA TFLOPS counterpart.

The expectation for Navi 3070/3080 is 64 ROPS over a 256-bit GDDR6 bus: double RX-580/RX-590's raster power and almost double the memory bandwidth.

Ray tracing is mostly integer (the search component) and floating-point compute/TMU read-write path, i.e. Vega 56 is almost 2X over the GTX 1660 Ti in this regard.

GTX 1660 Ti

ALU FP: 5.6 TFLOPS

L2 cache: 2MB

RX Vega 56

ALU FP: 10.5 TFLOPS, or 11 TFLOPS at 1500MHz

L2 cache: 4 MB <-------- RTX 2070 Turing like.

Ideally, I'd rather see async compute shader/TMU path support alongside async pixel shader/ROPS path support.
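All the TFLOPS figures being argued here come from the standard peak-rate arithmetic (2 FLOPs per FMA × shader count × clock); a quick sanity check in Python, using the public shader counts (3584 for both the GTX 1080 Ti and Vega 56, 4096 for Vega 64):

```python
# Peak FP32 TFLOPS = 2 FLOPs per FMA * shader units * clock in MHz / 1e6.
# Shader counts are the public specs; clocks are the ones quoted above.
def tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz / 1e6

print(round(tflops(3584, 1700), 1))  # GTX 1080 Ti FE at 1700 MHz -> 12.2
print(round(tflops(3584, 1500), 1))  # Vega 56 at 1500 MHz -> 10.8
print(round(tflops(3584, 1710), 1))  # Vega 56 OC at 1710 MHz -> 12.3
print(round(tflops(4096, 1590), 1))  # Vega 64 at 1590 MHz -> 13.0
```

(The 12.1 and 12.2 figures quoted above are the same numbers rounded slightly differently.)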

Avatar image for Grey_Eyed_Elf
#588 Posted by Grey_Eyed_Elf (6389 posts) -

@ronvalencia said:
@Grey_Eyed_Elf said:
@ronvalencia said:
@Grey_Eyed_Elf said:

Reality check no one is upgrading their $1-2K 4K TV just to get Freesync. That's the same as saying that every RTX gamer has a 144Hz Ultrawide G-Sync display... So for the 99% of people who buy a PS5 It is required because their TV won't support freesync, I just bought my TV 8 months ago and I don't plan on upgrading for another 3-4 years. Your logic is so flawed its ridiculous some times.

All your gibberish is meaningless the reality is Navi is targeted for mid range hardware and at best will perform around Vega 56-64 levels which is not enough for current AAA games at 4K/60 let alone next generation games that push graphics further and or use a form of Ray Tracing.

FreeSync panels sales from both PC monitor and Ultra-HDTV segments are higher market niche PC's 144 hz. Hint: NVIDIA is forced to support FreeSync due to market pressure.

Forza Motosport 7 on the PC is an example lacking NVIDIA's Gimpworks and it's 8K 60 Hz on Vega 56 at 1500Mhz (~11 TFLOPS).

Sony's 1st party games are the main strength for Play-Station platform and their GCN optimisations are similar to Foza Motosport 7. Gran Turismo Sport is already 8K on PS5 dev kit similar to Vega 56 at 1500Mhz's Forza Motosport 7 results.

World War Z example points to Vega 56/64/VII GCN optimization methods for Sony's 1st party games.

You are talking about easy to run games... A GTX 1080 Ti at 8K on Forza 7 runs at less than 60% usage, its a rare game.

Not all engines/developers are the same let alone the game, so pointing out easy to run games means nothing when the other 9/10 games released that year struggle to achieve playable 30FPS at lower resolutions, your entire arguments based on performance are weak straw man attempts.

No one is saying that their won't be easy to run games or beautiful optimised games let alone me, my point is that the majority of games do NOT run well and you're implying that next generation ALL games will achieve this level of optimisation is just false... And even if they where optimised it would then still not mean 60FPS since most developers will easily opt for newer features to push graphics with that extra headroom and opt for a 30FPS target and since one of those new features are Ray Tracing you better believe that 60FPS will not be the norm unless its the developers intent.

As for your FreeSync stop it... 4K adoption is still low let alone HDR/OLED/QLED and now you act as if steady framerates won't matter because 0.05% of people have FreeSync TV's on earth? Stop it.

Arguing GTX 1080 Ti for Forza M7 at 8K is a red herring since GTX 1080 Ti is typically has higher TFLOPS with it's stealth overclock between 1700 to 1800Mhz (depending on TDP headroom) and that's with GTX 1080 Ti FE model not including factory OC editions. 12.1 TFLOPS potential for GTX 1080 Ti FE at 1700Mhz.

Again, PlayStation's main sales strength is with it's 1st party games.

FreeSync is beating G-Sync in flat panel sales, hence NVIDIA was forced to eat humble pie.

We know from developers from Tekken 7 developers Unreal Engine 4 is not GCN friendly i.e. memory access pattern needs to be re-coded. Unreal Engine 4 *is* Gameworks game engine with higher pixel shader/ROPS path usage.

It's well known AMD GPUs gets smashed by heavy pixel shader/ROPS path usage when GTX 1080 Ti and RTX 2080 Ti has hardware superiority with this area.

NVIDIA's single context hyper-Q (async command list submission) is well optimized for DX11 multi-threading with pixel shader/ROPS path.

When Vega 56 is overclock to 1710Mhz (12.2 TFLOPS) and beat Vega 64 at 1590Mhz (13 TFLOPS), higher clock speed improves ROPS read/write path.

Well optimized AMD GPU game should land around NVIDIA's TFLOPS counterpart.

Ron... I can't do this with you anymore. You refuse to accept the reality that developers target 30FPS because they can push visuals further than when targeting 60FPS on consoles, and unless a developer needs 60FPS (say, for an FPS, racing game or fighter) there is little to no chance they would even consider whatever graphical cut is required to achieve it.

I am aware that 4K/60 is possible on a Vega 56; I had a GTX 1070 and played BF1 at 4K on a mix of medium/high settings at 60FPS... and it would be even more achievable in a game that is more AMD friendly. But that is not how development works, buddy. World War Z is not a groundbreaking AAA game with ray tracing, so while optimisation for certain architectures will give developers more to work with, all that will do is give them more headroom to push graphics further rather than spend it on a higher frame rate, because graphics sell.

Also, I will remind you that while Forza 7 was 60FPS on the X1X, Forza Horizon 4 was 30FPS, because the developers had to make that frame rate sacrifice to push an open world on that level of hardware.

4K/30 will be the target for almost every developer out there that isn't making an FPS, racing game or fighter.

You think any developer out there with the console graphics king crown would have that crown if they targeted 60FPS?... No, graphics sell!

  • Uncharted 5,
  • Gears of War 6,
  • Assassins Creed London,
  • FarCry 6,
  • Last of Us Part III,
  • GTA 6,
  • Elder Scrolls 6,
  • Red Dead Redemption III,
  • Starfield
  • Cyberpunk 2077
  • Horizon Zero Dawn 2

You think the developers of those games will target 60FPS on a PS5?... Or would they target 30FPS and push the graphical fidelity higher than what is possible with current consoles?... The answer is so obvious only a fool would argue otherwise.

Avatar image for ronvalencia
#589 Edited by ronvalencia (27867 posts) -

@Grey_Eyed_Elf said:
@ronvalencia said:

Arguing GTX 1080 Ti for Forza M7 at 8K is a red herring since GTX 1080 Ti is typically has higher TFLOPS with it's stealth overclock between 1700 to 1800Mhz (depending on TDP headroom) and that's with GTX 1080 Ti FE model not including factory OC editions. 12.1 TFLOPS potential for GTX 1080 Ti FE at 1700Mhz.

Again, PlayStation's main sales strength is with it's 1st party games.

FreeSync is beating G-Sync in flat panel sales, hence NVIDIA was forced to eat humble pie.

We know from developers from Tekken 7 developers Unreal Engine 4 is not GCN friendly i.e. memory access pattern needs to be re-coded. Unreal Engine 4 *is* Gameworks game engine with higher pixel shader/ROPS path usage.

It's well known AMD GPUs gets smashed by heavy pixel shader/ROPS path usage when GTX 1080 Ti and RTX 2080 Ti has hardware superiority with this area.

NVIDIA's single context hyper-Q (async command list submission) is well optimized for DX11 multi-threading with pixel shader/ROPS path.

When Vega 56 is overclock to 1710Mhz (12.2 TFLOPS) and beat Vega 64 at 1590Mhz (13 TFLOPS), higher clock speed improves ROPS read/write path.

Well optimized AMD GPU game should land around NVIDIA's TFLOPS counterpart.

Ron... I can't do this with you anymore, you refuse to accept the reality that developers target 30FPS because they can push visuals further than targeting 60FPS on consoles and unless the developer needs 60FPS say if its a FPS/Racing game or fighter there is little to no chance they would even consider what ever graphical drop is required in order to achieve it.

I am aware that 4K/60 is possible with RX 56, I had a GTX 1070 and played BF1 on a mixture of medium/high settings at 4K and got 60FPS... and that it would be even more so on a game that is more AMD friendly but that is not how development works buddy World War Z is not a ground breaking AAA game with Ray Tracing so while optimisation for certain architectures will give developers more to work with all that will do is give them more power headroom to push graphics further rather than use that power for a higher frame rate because graphics sell.

Also I will remind you that while Forza 7 was 60FPS on X1X... Forza Horizon 4 was 30FPS, because the developers had to make that frame rate sacrifice in order to push that open world on that level of hardware.

4K/30 will be the target for almost every developer out there that isn't making a FPS, Racing game or a fighter.

You think any developer out there with the console graphics king crown would have that crown if they targeted 60FPS?... No, graphics sell!

  • Uncharted 5,
  • Gears of War 6,
  • Assassins Creed London,
  • FarCry 6,
  • Last of Us Part III,
  • GTA 6,
  • Elder Scrolls 6,
  • Red Dead Redemption III,
  • Starfield
  • Cyberpunk 2077
  • Horizon Zero Dawn 2

You think the developers of those games will target 60FPS on a PS5?... Or would they target 30FPS and push the graphical fidelity higher than what is possible with current consoles?... The answer is so obvious only a fool would argue otherwise.

Your "I can't do this with you anymore, you refuse to accept the reality that developers target 30FPS" narrative on me is bullshit.

My argument is that Vega 56 at 1500MHz already delivers 8K 60Hz in FM7, whose workload is similar to GTS's; hence the expectation for PS5's GPU is Vega 56-like.

Your GTX 1070 doesn't have Turing's CUDA features. Hint: they're similar to Vega's NCU features.

Pascal will age worse than Vega 56 once DirectML (MS's rapid packed math API access**) kicks in on the Windows 10 platform.

The Variable Rate Shading feature saves shader resources.

https://hexus.net/tech/news/graphics/128588-microsoft-adds-variable-rate-shading-support-directx-12/

"As mentioned above there are rumours that AMD will have Tier 2 GPUs in the coming months"

From NVAPI access, Variable Rate Shading and rapid packed math are the two main features that separated the RTX 2080 from the GTX 1080 Ti in Wolfenstein II.

The RTX 2080's other advantages over the GTX 1080 Ti are its async compute hardware scheduler and 4MB L2 cache.

**Older GPUs without rapid packed math run the same workload in single-rate FP32 compatibility mode.

DirectML enables a DLSS-style pixel reconstruction alternative on the Windows platform.

Radeon Rays 3.0 has GPU shader accelerated Bounding Volume Hierarchy (BVH) and half-precision (FP16) computation support (rapid packed math).

AMD needs to combine a DirectML de-noise pass with Radeon Rays 3.0's BVH for DXR.
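For what it's worth, the BVH acceleration being discussed boils down to massively parallel ray vs. bounding-box tests. A minimal CPU-side sketch of the standard slab test (illustrative only; this is not the Radeon Rays API, which runs this on GPU shaders):

```python
# Slab test: does a ray hit an axis-aligned bounding box (a BVH node)?
# inv_dir is 1/direction per axis, precomputed once per ray; a huge
# stand-in value replaces division by zero components to avoid NaNs.
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))  # latest entry across the three slabs
        tmax = min(tmax, max(t1, t2))  # earliest exit
    return tmin <= tmax

# Ray from the origin along +x, against a unit box centred at (5, 0, 0):
print(ray_hits_aabb((0, 0, 0), (1.0, 1e30, 1e30), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))
```

A BVH traversal repeats this test down the tree, only descending into nodes the ray actually hits, which is why the "search component" is mostly integer/compare work.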

Avatar image for son-goku7523
#590 Edited by Son-Goku7523 (955 posts) -

@boxrekt: They haven't given us shit, bro; you're the one making up your own reality. A Ryzen CPU, Navi, and a proprietary SSD is about all we know. Just because Mark Cerny mentioned 8K, everyone is thinking the PS5 is going to be some 12-14 Tflop beast that Sony will sell for $500. We don't know anything about the PS5 based on what he gave us other than that it will have faster load speeds and backwards compatibility. "8K output" can mean anything from checkerboard/upscaled 8K to 8K video output. It doesn't mean the PS5 will be an expensive monster of a console. We still haven't been given the clockspeeds of the APU or what kind of memory it will use, and you have people already trying to benchmark the machine and price it based on the vague bullshit Mark Cerny said. I'm just basing what I'm saying on Sony's past actions. Their last 2 machines were priced at $400, and neither they nor anyone else has ever successfully sold a $500 machine at launch in high enough numbers. I could be wrong, but Sony's history is on my side for now.

Avatar image for emgesp
#591 Edited by emgesp (7830 posts) -
@ronvalencia said:

Unlike PS3 to PS4 transition, PS5 can leverage PS4's market power.

8 core Zen v2 should be the real CELL replacement i.e. ~2X GFLOPS FP32/4X GFLOPS FP16 power when 8 core Zen v2 at 1.6Ghz vs CELL at 3.2 Ghz (missing FP16 rapid pack math).

PS; SPU's effective IPC is 0.5 while Jaguar's effective IPC is 1.x to 1.8

No way is PS5's Zen CPU going to be clocked at 1.6GHz. Ron, I have a lot of respect for and appreciate your posts here, but you are lowballing PS5's CPU clockspeed by a lot.

Avatar image for emgesp
#592 Edited by emgesp (7830 posts) -

Also, no way in hell is Sony going to price the PS5 at $499.99, especially if Microsoft has that cheaper Lockhart SKU at the ready. I don't care what tech Sony has confirmed so far; $399.99 has worked too well for them this gen to just give it up. If Sony is only coming out with a single SKU at launch, then I guarantee it will be under $499.99. I am 99.9% confident in that, and anyone else with common sense should be too.

Mark Cerny also said "PS5 price will be appealing," and last time I checked, $500 has never been considered appealing for a true next-gen console. You can't compare mid-gen upgrades priced at $500 (i.e. the XB1X) to brand-new next-gen consoles. Sony will want to start next-gen strong out of the gate, and a $499.99-only SKU would prevent that.

Avatar image for ronvalencia
#593 Edited by ronvalencia (27867 posts) -

@emgesp said:
@ronvalencia said:

Unlike PS3 to PS4 transition, PS5 can leverage PS4's market power.

8 core Zen v2 should be the real CELL replacement i.e. ~2X GFLOPS FP32/4X GFLOPS FP16 power when 8 core Zen v2 at 1.6Ghz vs CELL at 3.2 Ghz (missing FP16 rapid pack math).

PS; SPU's effective IPC is 0.5 while Jaguar's effective IPC is 1.x to 1.8

No way is PS5's Zen CPU going to be clocked at 1.6Ghz. Ron, I have much respect and appreciate your posts here, but you are low balling PS5's CPU clockspeeds by a lot.

1.6GHz is the worst-case situation, i.e. running like the ultra-mobile version with all 8 cores active.

Avatar image for emgesp
#594 Posted by emgesp (7830 posts) -

@ronvalencia said:
@emgesp said:
@ronvalencia said:

Unlike PS3 to PS4 transition, PS5 can leverage PS4's market power.

8 core Zen v2 should be the real CELL replacement i.e. ~2X GFLOPS FP32/4X GFLOPS FP16 power when 8 core Zen v2 at 1.6Ghz vs CELL at 3.2 Ghz (missing FP16 rapid pack math).

PS; SPU's effective IPC is 0.5 while Jaguar's effective IPC is 1.x to 1.8

No way is PS5's Zen CPU going to be clocked at 1.6Ghz. Ron, I have much respect and appreciate your posts here, but you are low balling PS5's CPU clockspeeds by a lot.

1.6 Ghz is worst case situation or running like ultra mobile version with all 8 cores active.

What happens when you bump that up to 2.8 - 3.0Ghz?

Avatar image for ronvalencia
#595 Edited by ronvalencia (27867 posts) -

@emgesp said:
@ronvalencia said:
@emgesp said:
@ronvalencia said:

Unlike PS3 to PS4 transition, PS5 can leverage PS4's market power.

8 core Zen v2 should be the real CELL replacement i.e. ~2X GFLOPS FP32/4X GFLOPS FP16 power when 8 core Zen v2 at 1.6Ghz vs CELL at 3.2 Ghz (missing FP16 rapid pack math).

PS; SPU's effective IPC is 0.5 while Jaguar's effective IPC is 1.x to 1.8

No way is PS5's Zen CPU going to be clocked at 1.6Ghz. Ron, I have much respect and appreciate your posts here, but you are low balling PS5's CPU clockspeeds by a lot.

1.6 Ghz is worst case situation or running like ultra mobile version with all 8 cores active.

What happens when you bump that up to 2.8 - 3.0Ghz?

On Cinebench R15, a 14nm Ryzen ultra-mobile at 3GHz with four cores active consumes about 20 watts. I config my Ryzen mobile for ~1.6 to 1.8GHz with Vega 8 at ~1100MHz on a 25-watt budget. I use the RyzenAdj tool to change the TDP profile to 30 or 35 watts, hence less throttling, e.g. 2GHz+ CPU clocks with Vega 8 at 1100MHz.

With 7nm's improvements, Epyc 2 doubles the CPU core count within Epyc 1's TDP limits.

The 7nm Radeon VII is over-volted by AMD.

https://www.reddit.com/r/Amd/comments/ao43xl/radeon_vii_insanely_overvolted_undervolting/

An under-volted VII rivals the RTX 2080's performance per watt.

An 8-core Zen v2 at 1.6GHz has 4X the math and 2X the integer resources of PS4's 8 Jaguar cores at 1.6GHz.

An 8-core Zen v2 at 1.6GHz + RX 3070 is effectively a drop-in PS4 replacement with a similar TDP and BOM cost.
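The 2X and 4X ratios being argued here check out with back-of-the-envelope peak-rate math. A sketch using the widely cited FLOPs-per-cycle figures (Jaguar: two 128-bit FP pipes, Zen 2: two 256-bit FMA units, CELL SPU: one 128-bit FMA), generously counting all 8 SPUs as usable:

```python
# Peak FP32 GFLOPS = cores * FLOPs per cycle per core * clock in GHz.
def gflops(cores, flops_per_cycle, ghz):
    return cores * flops_per_cycle * ghz

jaguar = gflops(8, 8, 1.6)   # PS4's 8 Jaguar cores -> 102.4
zen2 = gflops(8, 32, 1.6)    # hypothetical 8-core Zen 2 at 1.6 GHz -> 409.6
cell = gflops(8, 8, 3.2)     # CELL's 8 SPUs at 3.2 GHz -> 204.8
print(zen2 / jaguar, zen2 / cell)  # -> 4.0 2.0
```

So even the pessimistic 1.6GHz figure quadruples PS4's peak CPU float throughput and roughly doubles CELL's, before counting FP16 rapid packed math.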

Avatar image for ronvalencia
#596 Edited by ronvalencia (27867 posts) -

@son-goku7523 said:

@boxrekt: They haven’t given us shit bro, you’re the one making up your own reality. Ryzen CPU, Navi, and proprietary SSD is about all we know. Just because Mark Cerny mentioned 8K everyone is thinking the PS5 is going to be some 12-14 Tflop beast that Sony will sell for $500. We don’t know anything about the PS5 based on what he gave us other than the fact it will have faster load speeds and backwards compatibility. 8K Outputs can mean anything from checkerboard/upscaled 8K to 8K video output. It doesn’t mean the PS5 will be an expensive monster of a console. We still haven’t been given the clockspeeds of the APU or what kind of memory it will use and you have people already trying to benchmark the machine and price it based on the vague bullshit Mark Cerny said. I’m just basing what I’m saying on Sony’s past actions. Their last 2 machines were priced at $400 and neither them nor anyone has ever successfully sold a $500 machine at launch in high enough numbers. I could be wrong but Sony’s history is on my side for now.

I prefer a higher-clocked Vega 56 over a lower-clocked Vega 64. One can't ignore the raster power needed to expose its TFLOPS into memory operations.

Avatar image for emgesp
#597 Edited by emgesp (7830 posts) -
@ronvalencia said:
@emgesp said:

What happens when you bump that up to 2.8 - 3.0Ghz?

On Cinebnech R15, 14 nm Ryzen ultra mobile at 3Ghz four cores active consumes about 20 watts. I config my Ryzen mobile for ~1.6 to 1.8 Ghz with ~1100 Mhz Vega 8 with 25 watt budget. I use RyzenAdj tool to change TDP profile to 30 or 35 watts, hence less throttling e.g. +2Ghz CPUs with Vega 8 at 1100 Mhz.

7nm improvements, Epyc 2 doubles CPU core count within Epyc 1's TDP limits.

7nm VII is over-voltage by AMD.

https://www.reddit.com/r/Amd/comments/ao43xl/radeon_vii_insanely_overvolted_undervolting/

VII with under-voltage rivals RTX 2080's performance per watts.

8 core Zen v2 at 1.6Ghz is 4X math and 2X integer resources over PS4's Jaguar 8 cores at 1.6Ghz.

8 core Zen v2 at 1.6 Ghz + RX 3070 is effectively a drop in PS4 replacement with similar TDP and BOM cost.

So you honestly don't think the Zen CPU in the PS5 is gonna be in the 2.8 - 3.0GHz range? I thought that was pretty much a given.

I get that Zen at the same clock speed is still an improvement over Jaguar, but I just don't buy Sony going under 2GHz with the Zen cores, though I do understand how you got to that conclusion. It's definitely a possibility, but I still have doubts.

Avatar image for rzxv04
#598 Posted by rzxv04 (686 posts) -

@emgesp said:
@ronvalencia said:
@emgesp said:

What happens when you bump that up to 2.8 - 3.0Ghz?

On Cinebnech R15, 14 nm Ryzen ultra mobile at 3Ghz four cores active consumes about 20 watts. I config my Ryzen mobile for ~1.6 to 1.8 Ghz with ~1100 Mhz Vega 8 with 25 watt budget. I use RyzenAdj tool to change TDP profile to 30 or 35 watts, hence less throttling e.g. +2Ghz CPUs with Vega 8 at 1100 Mhz.

7nm improvements, Epyc 2 doubles CPU core count within Epyc 1's TDP limits.

7nm VII is over-voltage by AMD.

https://www.reddit.com/r/Amd/comments/ao43xl/radeon_vii_insanely_overvolted_undervolting/

VII with under-voltage rivals RTX 2080's performance per watts.

8 core Zen v2 at 1.6Ghz is 4X math and 2X integer resources over PS4's Jaguar 8 cores at 1.6Ghz.

8 core Zen v2 at 1.6 Ghz + RX 3070 is effectively a drop in PS4 replacement with similar TDP and BOM cost.

So you honestly don't think the Zen CPU in the PS5 is gonna be in the 2.8 - 3.0Ghz range, I thought that was pretty much a given.

I get that Zen at the same clock speed is still an improvement over Jaguar, but I just don't buy Sony going under 2Ghz with the Zen cores, but I do understand how you got to that conclusion. Its definitely a possibility, but I still have doubts.

I may not know much about how the insides of a CPU work, but I really highly doubt it's gonna be lower than 2.4GHz (above the 1X's 2.3GHz), partly for marketing purposes. It'll win mindshare with most folks that look at specs but know little to nothing about IPC, etc.

There are other aspects Sony/MS and AMD can tweak to reach that 2.4GHz minimum. AFAIK the Xbox One X even has a more customized Jaguar.

Avatar image for ronvalencia
#599 Edited by ronvalencia (27867 posts) -

@emgesp said:
@ronvalencia said:
@emgesp said:

What happens when you bump that up to 2.8 - 3.0Ghz?

On Cinebnech R15, 14 nm Ryzen ultra mobile at 3Ghz four cores active consumes about 20 watts. I config my Ryzen mobile for ~1.6 to 1.8 Ghz with ~1100 Mhz Vega 8 with 25 watt budget. I use RyzenAdj tool to change TDP profile to 30 or 35 watts, hence less throttling e.g. +2Ghz CPUs with Vega 8 at 1100 Mhz.

7nm improvements, Epyc 2 doubles CPU core count within Epyc 1's TDP limits.

7nm VII is over-voltage by AMD.

https://www.reddit.com/r/Amd/comments/ao43xl/radeon_vii_insanely_overvolted_undervolting/

VII with under-voltage rivals RTX 2080's performance per watts.

8 core Zen v2 at 1.6Ghz is 4X math and 2X integer resources over PS4's Jaguar 8 cores at 1.6Ghz.

8 core Zen v2 at 1.6 Ghz + RX 3070 is effectively a drop in PS4 replacement with similar TDP and BOM cost.

So you honestly don't think the Zen CPU in the PS5 is gonna be in the 2.8 - 3.0Ghz range, I thought that was pretty much a given.

I get that Zen at the same clock speed is still an improvement over Jaguar, but I just don't buy Sony going under 2Ghz with the Zen cores, but I do understand how you got to that conclusion. Its definitely a possibility, but I still have doubts.

The PS4 Pro has a 2.1GHz 8-core Jaguar at 16nm.

The X1X has a 2.3GHz 8-core Jaguar (lower latency) at 16nm.

The PS5 is asking for a ???GHz 8-core Zen v2 at 7nm. Zen v2 has 2X the integer, 4X the load/store (dual 256-bit wide) and 4X the float (two AVX 256-bit FMAC) resources compared to Jaguar.

You're asking for more than a 2X-to-4X improvement from 16nm Jaguar to 7nm Zen v2.

If Sony finds a way to assimilate the Xbox One X's vapor chamber cooling solution within a $399 box, then the TDP limit is higher. The Xbox One X is under $399 atm.

Avatar image for hofuldig
#600 Edited by hofuldig (5126 posts) -
@rzxv04 said:
@emgesp said:
@ronvalencia said:
@emgesp said:

What happens when you bump that up to 2.8 - 3.0Ghz?

On Cinebnech R15, 14 nm Ryzen ultra mobile at 3Ghz four cores active consumes about 20 watts. I config my Ryzen mobile for ~1.6 to 1.8 Ghz with ~1100 Mhz Vega 8 with 25 watt budget. I use RyzenAdj tool to change TDP profile to 30 or 35 watts, hence less throttling e.g. +2Ghz CPUs with Vega 8 at 1100 Mhz.

7nm improvements, Epyc 2 doubles CPU core count within Epyc 1's TDP limits.

7nm VII is over-voltage by AMD.

https://www.reddit.com/r/Amd/comments/ao43xl/radeon_vii_insanely_overvolted_undervolting/

VII with under-voltage rivals RTX 2080's performance per watts.

8 core Zen v2 at 1.6Ghz is 4X math and 2X integer resources over PS4's Jaguar 8 cores at 1.6Ghz.

8 core Zen v2 at 1.6 Ghz + RX 3070 is effectively a drop in PS4 replacement with similar TDP and BOM cost.

So you honestly don't think the Zen CPU in the PS5 is gonna be in the 2.8 - 3.0Ghz range, I thought that was pretty much a given.

I get that Zen at the same clock speed is still an improvement over Jaguar, but I just don't buy Sony going under 2Ghz with the Zen cores, but I do understand how you got to that conclusion. Its definitely a possibility, but I still have doubts.

I may not know much about how the insides of a CPU work but I really highly doubt it's gonna be lower than 2.4 Ghz (above 1X's 2.3 Ghz), partly for marketing purposes. It'll be a mindshare for most folks that look at specs that know little to nothing about IPC, etc.

There are other aspects Sony/MS and AMD can tweak to reach that 2.4 Ghz minimum number. AFAIK the Xbox 1X even has a more customized jaguar.

According to the current leaks, it's possible the CPU will have power states and will clock up to 3.2GHz. It wouldn't be far-fetched to think Sony would do this, considering the PS3 and Xbox 360 both had 3.2GHz CPUs, and at 7nm it will sip power and won't even get particularly hot. And I still think the PS5 will be $399; I mean, look at the PS4 Pro: it's bigger, the CPU is higher clocked, it has a 1TB HDD, and they effectively doubled the GPU, all for $399.
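Whether a 3.2GHz clock really "sips power" depends heavily on voltage: dynamic CPU power scales roughly with frequency times voltage squared, so the last GHz is the expensive one. A rough sketch with made-up numbers (not leaked or measured PS5 figures):

```python
# Dynamic power roughly follows P ~ C * f * V^2, so raising the clock
# while also raising voltage costs far more than the clock ratio alone.
def scaled_power(p_watts, f0_ghz, f1_ghz, v0, v1):
    return p_watts * (f1_ghz / f0_ghz) * (v1 / v0) ** 2

# Illustrative: 20 W at 1.6 GHz / 0.8 V pushed to 3.0 GHz / 1.1 V.
print(round(scaled_power(20.0, 1.6, 3.0, 0.8, 1.1), 1))  # -> 70.9
```

That is why power-state designs clock up only when the workload demands it: the same silicon is several times cheaper to run at the base clock and voltage.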