PS5 details from Mark Cerny: Backwards compatible, 8k, raytracing support, SSD standard, and more

Avatar image for Grey_Eyed_Elf
#501 Posted by Grey_Eyed_Elf (6453 posts) -

@emgesp said:

Perhaps that SemiAccurate leak from this time last year wasn't that far off afterall. These are the specs that leaker claimed. The SSD is what threw everyone off and what made it seem fake, but now that SSD was confirmed perhaps these are the legit specs. 14.20 Tflops isn't technically impossible and the 32GBs could always just be what devkits have.

  • 100% backwards compatible with PS4 games
  • Fully compatible with PS VR and PlayStation Move offering a improved VR experience
  • CPU: AMD Zen 8 cores - single-chip custom processor
  • GPU: 14.20 TFLOPS, AMD Navi-based graphics engine
  • Memory: 32GB GDDR6 technology
  • Storage - 1TB SSD

That leak was for a dev kit... Those usually have double the RAM (or more) and more powerful GPUs.

Avatar image for Grey_Eyed_Elf
#502 Posted by Grey_Eyed_Elf (6453 posts) -

Even DF is sceptical about the 1.8GHz leak... With their own estimates at the top end being 12TFLOPs. Also the leak doesn't mention CU count and the TFLOP count relies entirely on that so we could be looking at anything 9-12TFLOPS if the 1.8GHz is legit.
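As a rough illustration of that point, here is a back-of-the-envelope sketch of how the TFLOP figure falls out of CU count and clock on a GCN/Navi-style GPU (the 64 shaders per CU is the usual GCN figure; the CU counts tried below are hypothetical, not leaked):

```python
# TFLOPS ~= CUs * shaders_per_CU * 2 FLOPs per cycle (FMA) * clock in GHz / 1000
def tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_cycle=2):
    return cus * shaders_per_cu * flops_per_cycle * clock_ghz / 1000.0

# 1.8 GHz is the rumored clock; the CU counts are guesses for illustration.
for cus in (40, 44, 48, 52):
    print(f"{cus} CUs @ 1.8 GHz -> {tflops(cus, 1.8):.1f} TFLOPS")
```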

[Embedded video]

Avatar image for hofuldig
#503 Edited by hofuldig (5126 posts) -
@Grey_Eyed_Elf said:

Even DF is sceptical about the 1.8GHz leak... With their own estimates at the top end being 12TFLOPs. Also the leak doesn't mention CU count and the TFLOP count relies entirely on that so we could be looking at anything 9-12TFLOPS if the 1.8GHz is legit.


Quite frankly the PS4 CPU is clocked at 1.6GHz so even if the zen2 cpu in the PS5 was also clocked at 1.6GHz your looking at double the performance right off the bat clock for clock, and thats before optimizations that sony would make to the cpu itself to fit it better into the PS5.

Avatar image for Grey_Eyed_Elf
#504 Posted by Grey_Eyed_Elf (6453 posts) -

@hofuldig said:
@Grey_Eyed_Elf said:

Even DF is sceptical about the 1.8GHz leak... With their own estimates at the top end being 12TFLOPs. Also the leak doesn't mention CU count and the TFLOP count relies entirely on that so we could be looking at anything 9-12TFLOPS if the 1.8GHz is legit.

Quite frankly the PS4 CPU is clocked at 1.6GHz so even if the zen2 cpu in the PS5 was also clocked at 1.6GHz your looking at double the performance right off the bat clock for clock, and thats before optimizations that sony would make to the cpu itself to fit it better into the PS5.

Its leaked to be a 1.6GHz base clock with a 3.2GHz max boost clock... So it will be drastically better during game especially when you take into account the substantially IPC boost.
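For a sense of scale, a crude cores-times-IPC-times-clock comparison; the roughly 2x IPC multiplier for Zen 2 over Jaguar is an illustrative assumption, not a measured figure:

```python
# Relative CPU throughput ~ cores * IPC * clock (ignores memory, SMT and real boost behaviour).
jaguar_ps4 = dict(cores=8, ipc=1.0, clock_ghz=1.6)   # PS4 baseline, IPC normalised to 1
zen2_rumor = dict(cores=8, ipc=2.0, clock_ghz=3.2)   # rumored boost clock, assumed ~2x IPC

def throughput(cpu):
    return cpu["cores"] * cpu["ipc"] * cpu["clock_ghz"]

print(f"~{throughput(zen2_rumor) / throughput(jaguar_ps4):.1f}x the PS4 CPU under these assumptions")
```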

The issue is the GPU it could be another generation 4K because not all games are easy to run at 4K/30 even with a Vega 64 with a 4.2GH 2700X there are games where you would have to drop to 1440 or 1800p on custom resolutions to keep a steady framerate and that is with current games God knows what future games will run like especially if they use Ray Tracing.

Avatar image for EG101
#505 Posted by EG101 (2010 posts) -

@Zero_epyon said:
@freedomfreak said:

These consoles can't even do 4K.

It'll probably do 8K Checkerboard, 4K native.

Makes sense.

Native 4K is more than enough pixels on a 70-inch screen anyway.

8K is definitely overkill.

Avatar image for EG101
#506 Edited by EG101 (2010 posts) -

@dalger21 said:
@Pedro said:

Can't wait to see the price

Current specs say it will be 499-599

If Sony stuffs everything it can in a $600 box I will be pleased.

It's going to be late 2020 almost 2021 during the launch of PS5. $400 is a pathetic budget for a High Tech Toy in 2021.

Avatar image for son-goku7523
#507 Posted by Son-Goku7523 (955 posts) -
@EG101 said:
@dalger21 said:
@Pedro said:

Can't wait to see the price

Current specs say it will be 499-599

If Sony stuffs everything it can in a $600 box I will be pleased.

It's going to be late 2020 almost 2021 during the launch of PS5. $400 is a pathetic budget for a High Tech Toy in 2021.

Sony will never sell a $600 PS5, last time they sold at that price it almost killed their entire company.

Avatar image for FireEmblem_Man
#508 Posted by FireEmblem_Man (19737 posts) -

It's obvious that the SoC will be based on AMD's Gonzalo APU. It will depend on which GPU they use. People forget that R&D needs to break even, so the max price for the PS5 has to be between $300 and $500.

Avatar image for tormentos
#509 Posted by tormentos (29192 posts) -

@FireEmblem_Man:

$300 is just off the books; there's no way they can make that happen without losing $150 or more per unit.

Sony can take a loss now like they used to, because they now make more money off PSN, but I don't think they will overdo it this time.

Avatar image for emgesp
#510 Posted by emgesp (7832 posts) -
@hofuldig said:
@Grey_Eyed_Elf said:

Even DF is sceptical about the 1.8GHz leak... With their own estimates at the top end being 12TFLOPs. Also the leak doesn't mention CU count and the TFLOP count relies entirely on that so we could be looking at anything 9-12TFLOPS if the 1.8GHz is legit.

Quite frankly the PS4 CPU is clocked at 1.6GHz so even if the zen2 cpu in the PS5 was also clocked at 1.6GHz your looking at double the performance right off the bat clock for clock, and thats before optimizations that sony would make to the cpu itself to fit it better into the PS5.

1.8GHz is for the GPU; the CPU will obviously be clocked much higher than that.

Avatar image for Zero_epyon
#511 Posted by Zero_epyon (13292 posts) -

[Embedded video]
Avatar image for emgesp
#512 Posted by emgesp (7832 posts) -
@Grey_Eyed_Elf said:
@hofuldig said:
@Grey_Eyed_Elf said:

Even DF is sceptical about the 1.8GHz leak... With their own estimates at the top end being 12TFLOPs. Also the leak doesn't mention CU count and the TFLOP count relies entirely on that so we could be looking at anything 9-12TFLOPS if the 1.8GHz is legit.

Quite frankly the PS4 CPU is clocked at 1.6GHz so even if the zen2 cpu in the PS5 was also clocked at 1.6GHz your looking at double the performance right off the bat clock for clock, and thats before optimizations that sony would make to the cpu itself to fit it better into the PS5.

Its leaked to be a 1.6GHz base clock with a 3.2GHz max boost clock... So it will be drastically better during game especially when you take into account the substantially IPC boost.

The issue is the GPU it could be another generation 4K because not all games are easy to run at 4K/30 even with a Vega 64 with a 4.2GH 2700X there are games where you would have to drop to 1440 or 1800p on custom resolutions to keep a steady framerate and that is with current games God knows what future games will run like especially if they use Ray Tracing.

PS4 Pro can hit 1800p 30fps in a lot of games with those 8 Jaguar cores and a 4.2 Tflop GPU, so I'm pretty sure PS5 will handle native 4K 30fps fine for most games.
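A quick pixel-scaling sketch of that argument, assuming performance scales linearly with pixel count (real games won't scale that cleanly):

```python
# If ~4.2 TFLOPS manages 1800p, how much 'Pro-equivalent' GPU would native 4K need at the same settings?
pixels_1800p = 3200 * 1800
pixels_4k = 3840 * 2160
pro_tflops = 4.2
print(f"~{pro_tflops * pixels_4k / pixels_1800p:.1f} TFLOPS for native 4K (linear-scaling assumption)")
```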

Avatar image for Grey_Eyed_Elf
#513 Posted by Grey_Eyed_Elf (6453 posts) -

@emgesp said:
@Grey_Eyed_Elf said:
@hofuldig said:
@Grey_Eyed_Elf said:

Even DF is sceptical about the 1.8GHz leak... With their own estimates at the top end being 12TFLOPs. Also the leak doesn't mention CU count and the TFLOP count relies entirely on that so we could be looking at anything 9-12TFLOPS if the 1.8GHz is legit.

Quite frankly the PS4 CPU is clocked at 1.6GHz so even if the zen2 cpu in the PS5 was also clocked at 1.6GHz your looking at double the performance right off the bat clock for clock, and thats before optimizations that sony would make to the cpu itself to fit it better into the PS5.

Its leaked to be a 1.6GHz base clock with a 3.2GHz max boost clock... So it will be drastically better during game especially when you take into account the substantially IPC boost.

The issue is the GPU it could be another generation 4K because not all games are easy to run at 4K/30 even with a Vega 64 with a 4.2GH 2700X there are games where you would have to drop to 1440 or 1800p on custom resolutions to keep a steady framerate and that is with current games God knows what future games will run like especially if they use Ray Tracing.

PS4 Pro can hit 1800p 30fps in a lot of games with those 8 Jaguar cores and a 4.2 Tflop GPU, so I'm pretty sure PS5 will handle native 4K 30fps fine for most games.

Which is exactly what I said, You might want to re-read.

Games don't have universal performance see below:

Lets say the PS5 is between the performance of a Vega 56 and 64... With the examples above you can see that some games will struggle to give you 30FPS and Some games will give you close to 60FPS.

Now what people don't seem to understand is that games WILL get more demanding so logically if you are just about good enough for 4K/30 on SOME current games then EXPECT FUTURE games SOME of them to struggle to maintain 30FPS or have dynamic resolution... With Cerny talking about Ray Tracing its almost a guarantee that next generation regardless of how many TFLOPs are involved will have horrible shitty performing games just like this generation because developers usually take the approach of pushing the hardware to breaking point.

The logic that "well current generation can do this with this power" needs to die because your demands for resolution is exactly why we have poor performing games on consoles. PC gamers who buy GPU's like mine or a Vega 64 aren't playing at 4K because they are not stupid... You on the other hand are its why developers will keep selling console to you with these hype buzz words and you will feed off it and then spend the next 5 years telling your self 30FPS is okay in 2022!

Avatar image for emgesp
#514 Posted by emgesp (7832 posts) -
@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:

Its leaked to be a 1.6GHz base clock with a 3.2GHz max boost clock... So it will be drastically better during game especially when you take into account the substantially IPC boost.

The issue is the GPU it could be another generation 4K because not all games are easy to run at 4K/30 even with a Vega 64 with a 4.2GH 2700X there are games where you would have to drop to 1440 or 1800p on custom resolutions to keep a steady framerate and that is with current games God knows what future games will run like especially if they use Ray Tracing.

PS4 Pro can hit 1800p 30fps in a lot of games with those 8 Jaguar cores and a 4.2 Tflop GPU, so I'm pretty sure PS5 will handle native 4K 30fps fine for most games.

Which is exactly what I said, You might want to re-read.

Games don't have universal performance see below:

Lets say the PS5 is between the performance of a Vega 56 and 64... With the examples above you can see that some games will struggle to give you 30FPS and Some games will give you close to 60FPS.

Now what people don't seem to understand is that games WILL get more demanding so logically if you are just about good enough for 4K/30 on SOME current games then EXPECT FUTURE games SOME of them to struggle to maintain 30FPS or have dynamic resolution... With Cerny talking about Ray Tracing its almost a guarantee that next generation regardless of how many TFLOPs are involved will have horrible shitty performing games just like this generation because developers usually take the approach of pushing the hardware to breaking point.

The logic that "well current generation can do this with this power" needs to die because your demands for resolution is exactly why we have poor performing games on consoles. PC gamers who buy GPU's like mine or a Vega 64 aren't playing at 4K because they are not stupid... You on the other hand are its why developers will keep selling console to you with these hype buzz words and you will feed off it and then spend the next 5 years telling your self 30FPS is okay in 2022!

Remember a console will always outperform a PC with exact same specs to some degree. Its just the nature of having locked hardware in a console. Now, I'm not saying it'll be drastically better than a similar PC, but it will have its advantages for sure. Anyways, even if they had to utilize checkerboard rendering or a similar technique who cares, it can look extremely close to native 4K when done right. I'm not a resolution snob.
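As a toy illustration of why checkerboard rendering is attractive: it shades roughly half the samples of a native 4K frame and reconstructs the rest (real implementations also rely on motion vectors and ID buffers, which this ignores):

```python
# Shaded samples per frame, native 4K vs checkerboard 4K (reconstruction cost not modelled).
native_4k = 3840 * 2160
checkerboard_4k = native_4k // 2
print(f"native: {native_4k:,} samples, checkerboard: {checkerboard_4k:,} (~{checkerboard_4k / native_4k:.0%} of the shading work)")
```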

Avatar image for Grey_Eyed_Elf
#515 Edited by Grey_Eyed_Elf (6453 posts) -

@emgesp said:
@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:

Its leaked to be a 1.6GHz base clock with a 3.2GHz max boost clock... So it will be drastically better during game especially when you take into account the substantially IPC boost.

The issue is the GPU it could be another generation 4K because not all games are easy to run at 4K/30 even with a Vega 64 with a 4.2GH 2700X there are games where you would have to drop to 1440 or 1800p on custom resolutions to keep a steady framerate and that is with current games God knows what future games will run like especially if they use Ray Tracing.

PS4 Pro can hit 1800p 30fps in a lot of games with those 8 Jaguar cores and a 4.2 Tflop GPU, so I'm pretty sure PS5 will handle native 4K 30fps fine for most games.

Which is exactly what I said, You might want to re-read.

Games don't have universal performance see below:

Lets say the PS5 is between the performance of a Vega 56 and 64... With the examples above you can see that some games will struggle to give you 30FPS and Some games will give you close to 60FPS.

Now what people don't seem to understand is that games WILL get more demanding so logically if you are just about good enough for 4K/30 on SOME current games then EXPECT FUTURE games SOME of them to struggle to maintain 30FPS or have dynamic resolution... With Cerny talking about Ray Tracing its almost a guarantee that next generation regardless of how many TFLOPs are involved will have horrible shitty performing games just like this generation because developers usually take the approach of pushing the hardware to breaking point.

The logic that "well current generation can do this with this power" needs to die because your demands for resolution is exactly why we have poor performing games on consoles. PC gamers who buy GPU's like mine or a Vega 64 aren't playing at 4K because they are not stupid... You on the other hand are its why developers will keep selling console to you with these hype buzz words and you will feed off it and then spend the next 5 years telling your self 30FPS is okay in 2022!

Remember a console will always outperform a PC with exact same specs to some degree. Its just the nature of having locked hardware in a console. Now, I'm not saying it'll be drastically better than a similar PC, but it will have its advantages for sure. Anyways, even if they had to utilize checkerboard rendering or a similar technique who cares, it can look extremely close to native 4K when done right. I'm not a resolution snob.

To some degree... Look at that benchmark again, the difference between the 56 and 64 vega cards is 2TFLOPs which is barely a extra 3-10FPS at 4K, just look at the RX 590 which is on par with a X1X and the 12.5 TFLOP Vega 64 its barely 1/3rd more in the performance.

4K is a different beast to the lower resolution's where the CPU can add extra performance, its almost entirely GPU dependant as of right now and when next generation games come out using ray tracing and pushing graphics in general forward that 10-12TFLOP GCN card will fold.

You console gamers will be hit with a reality check very soon, not just when it comes to resolution but also performance.

Avatar image for hofuldig
#516 Posted by hofuldig (5126 posts) -

@Grey_Eyed_Elf:

@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:
@emgesp said:
@Grey_Eyed_Elf said:

Its leaked to be a 1.6GHz base clock with a 3.2GHz max boost clock... So it will be drastically better during game especially when you take into account the substantially IPC boost.

The issue is the GPU it could be another generation 4K because not all games are easy to run at 4K/30 even with a Vega 64 with a 4.2GH 2700X there are games where you would have to drop to 1440 or 1800p on custom resolutions to keep a steady framerate and that is with current games God knows what future games will run like especially if they use Ray Tracing.

PS4 Pro can hit 1800p 30fps in a lot of games with those 8 Jaguar cores and a 4.2 Tflop GPU, so I'm pretty sure PS5 will handle native 4K 30fps fine for most games.

Which is exactly what I said, You might want to re-read.

Games don't have universal performance see below:

Lets say the PS5 is between the performance of a Vega 56 and 64... With the examples above you can see that some games will struggle to give you 30FPS and Some games will give you close to 60FPS.

Now what people don't seem to understand is that games WILL get more demanding so logically if you are just about good enough for 4K/30 on SOME current games then EXPECT FUTURE games SOME of them to struggle to maintain 30FPS or have dynamic resolution... With Cerny talking about Ray Tracing its almost a guarantee that next generation regardless of how many TFLOPs are involved will have horrible shitty performing games just like this generation because developers usually take the approach of pushing the hardware to breaking point.

The logic that "well current generation can do this with this power" needs to die because your demands for resolution is exactly why we have poor performing games on consoles. PC gamers who buy GPU's like mine or a Vega 64 aren't playing at 4K because they are not stupid... You on the other hand are its why developers will keep selling console to you with these hype buzz words and you will feed off it and then spend the next 5 years telling your self 30FPS is okay in 2022!

Remember a console will always outperform a PC with exact same specs to some degree. Its just the nature of having locked hardware in a console. Now, I'm not saying it'll be drastically better than a similar PC, but it will have its advantages for sure. Anyways, even if they had to utilize checkerboard rendering or a similar technique who cares, it can look extremely close to native 4K when done right. I'm not a resolution snob.

To some degree... Look at that benchmark again, the difference between the 56 and 64 vega cards is 2TFLOPs which is barely a extra 3-10FPS at 4K, just look at the RX 590 which is on par with a X1X and the 12.5 TFLOP Vega 64 its barely 1/3rd more in the performance.

4K is a different beast to the lower resolution's where the CPU can add extra performance, its almost entirely GPU dependant as of right now and when next generation games come out using ray tracing and pushing graphics in general forward that 10-12TFLOP GCN card will fold.

You console gamers will be hit with a reality check very soon, not just when it comes to resolution but also performance.

How will console gamers be hit with a reality check when the only way to do real ray tracing is to spend $600 on a GPU that can't even do more than 70 FPS with ray tracing on at 2560x1440?

Avatar image for Random_Matt
#517 Posted by Random_Matt (4247 posts) -

Some of us already knew the limitations of the tech; DF is right. Are there still brain-dead muppets thinking it is a beast? Cerny is so full of shit; all the extra RRP will come from the storage solution and other stuff.

Avatar image for ronvalencia
#518 Edited by ronvalencia (28067 posts) -

For "World War Z" game at 4K resolution.

@Shewgenja said:

https://www.gamespot.com/articles/ps5-first-details-specs-backwards-compatible-8k-ps/1100-6466281/

It will also have a disc drive and it seems that early releases will be cross gen. PSVR support out of the box. Seems like a seamless upgrade from the 4.

The above has Vulkan API

World War Z (4K):

  • RTX 2080 Ti (DX11) = 83 fps
  • Vega II (Vulkan + RPM) = 69 fps
  • RTX 2080 (DX11) = 65 fps
  • Vega 64 (Vulkan + RPM) = 64 fps
  • GTX 1080 Ti (DX11) = 59 fps
  • Vega 56 (Vulkan + RPM) = 56 fps

https://www.reddit.com/r/Amd/comments/beidjd/world_war_z_features_vulkan_api_shader_intrinsics/

World War Z features Vega Rapid Pack Math and GCN shader intrinsics.

Wolfenstein 2 Vulkan also uses Turing's Rapid Pack Math features.

DirectML unifies common API access for Rapid Pack Math and Meta-Commands (intrinsics)

Avatar image for ronvalencia
#519 Edited by ronvalencia (28067 posts) -

@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:

Read Sony's link again. Look at system requirements.

Optimize Your PS4™ and HDR Experience

For the best HDR experience, we highly recommend the following:

2K/4K HDR capable TV that supports the HDR10 format.

PS4™ is connected to the TV with a Premium HDMI cable.

Play or watch HDR enabled content, like games or streaming video service.

PS4™ has system software 4.0 or later installed.

R9-290X (renamed as R9-390X) supports HDR10 format when the display supports it. PS4's IGP is with Hawaii GCN generation.

PS4's IGP is effectively Hawaii GCN with half of CU and ROPS units count. PS4's GPU is effectively Hawaii Lite and it's slightly different from GCN 1.0 Pitcairn.

TV that supports HDR10. My question is how do you transmit 10 bit color from an HDMI chip that only supports 8 bit? PS4 has HDMI 1.4a not 1.4b or HDMI 2.0 or Display Port. Funny thing is the same question is posted on PS4 Reddit yet there's no answer.

HDMI 1.4b was released on October 11, 2011.

Hawaii GCN can generate digital data for HDMI 2.0 HDR10 at HDMI 1.4 bit rate.

Do you realize that 4:2:0 can be 10-bit or 8-bit, right?

My 32 inch LG monitor has FreeSync + HDR10 + HDMI 2.0a and DisplayPort. I can connect my old 4770K + MSI R9-290X Gaming X (similar vintage to PS4) to it.

I do realize 4:2:0 can be 8-bit or 10-bit; however, 4:2:0 is compressed and not representative of the true color space.

Lastly, I never questioned the GPU is incapable of HDR, I specifically said that PS4 didn't have HDMI 2.0 to pass 10bit HDR. Unless you can prove that HDMI 1.4 can do HDR all of that is irrelevant. I need evidence that HDMI 1.4 can do HDR not GPU, not PS4 hardware not monitors or TVs. Simple.

Here's your ownage:

How did Sony do HDR without the hardware for it? You can't change the hardware, only update the firmware. Sony is faking HDR on the original PS4.

Red herring with 4K. PS4 is requesting 1080p bit rate with HDR10 NOT 4K HDR10

https://arstechnica.com/gaming/2016/10/ps4-hdr-no-games-media-useless/
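A back-of-the-envelope bandwidth check of that claim, using the standard CEA pixel clocks (148.5 MHz for 1080p60, 594 MHz for 2160p60) and the usual ~8.16 Gbps of video data HDMI 1.4 carries after 8b/10b encoding; these figures are assumptions for illustration:

```python
# Raw video payload = pixel clock * bits per component * 3 components (RGB or YCbCr 4:4:4).
def video_gbps(pixel_clock_mhz, bits_per_component, components=3):
    return pixel_clock_mhz * 1e6 * bits_per_component * components / 1e9

hdmi14_video_limit = 8.16  # approx. Gbps of video data on HDMI 1.4
print(f"1080p60, 10-bit 4:4:4 -> {video_gbps(148.5, 10):.2f} Gbps (fits under {hdmi14_video_limit} Gbps)")
print(f"2160p60, 10-bit 4:4:4 -> {video_gbps(594.0, 10):.2f} Gbps (needs an HDMI 2.0-class link and/or chroma subsampling)")
```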

Avatar image for hofuldig
#520 Posted by hofuldig (5126 posts) -
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:

Read Sony's link again. Look at system requirements.

Optimize Your PS4™ and HDR Experience

For the best HDR experience, we highly recommend the following:

2K/4K HDR capable TV that supports the HDR10 format.

PS4™ is connected to the TV with a Premium HDMI cable.

Play or watch HDR enabled content, like games or streaming video service.

PS4™ has system software 4.0 or later installed.

R9-290X (renamed as R9-390X) supports HDR10 format when the display supports it. PS4's IGP is with Hawaii GCN generation.

PS4's IGP is effectively Hawaii GCN with half of CU and ROPS units count. PS4's GPU is effectively Hawaii Lite and it's slightly different from GCN 1.0 Pitcairn.

TV that supports HDR10. My question is how do you transmit 10 bit color from an HDMI chip that only supports 8 bit? PS4 has HDMI 1.4a not 1.4b or HDMI 2.0 or Display Port. Funny thing is the same question is posted on PS4 Reddit yet there's no answer.

HDMI 1.4b was released on October 11, 2011.

Hawaii GCN can generate digital data for HDMI 2.0 HDR10 at HDMI 1.4 bit rate.

Do you realize that 4:2:0 can be 10-bit or 8-bit, right?

My 32 inch LG monitor has FreeSync + HDR10 + HDMI 2.0a and DisplayPort. I can connect my old 4770K + MSI R9-290X Gaming X (similar vintage to PS4) to it.

I do realize 4:2:0 can be 8-bit or 10-bit; however, 4:2:0 is compressed and not representative of the true color space.

Lastly, I never questioned the GPU is incapable of HDR, I specifically said that PS4 didn't have HDMI 2.0 to pass 10bit HDR. Unless you can prove that HDMI 1.4 can do HDR all of that is irrelevant. I need evidence that HDMI 1.4 can do HDR not GPU, not PS4 hardware not monitors or TVs. Simple.

Here's your ownage:

How did Sony do HDR without the hardware for it? You can't change the hardware, only update the firmware. Sony is faking HDR on the original PS4.

Red herring with 4K. PS4 is requesting 1080p bit rate with HDR10 NOT 4K HDR10

Arguing over whether an old console that is about to be obsolete supports a certain function, fantastic.

Avatar image for rzxv04
#521 Posted by rzxv04 (686 posts) -

https://www.gamespot.com/forums/system-wars-314159282/poll-do-you-think-we-can-have-free-supplemental-ss-33436296/

Many on that thread said outright nope.

Some at least seem to show an open idea.

Makes me wonder how exactly they're gonna implement it.

NVMe x4, x2 or SATA, or something like NVMe x4 + HDD where the NVMe drive acts as a more sophisticated semi-caching layer.
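Purely to illustrate the "NVMe as a semi-cache in front of an HDD" idea, here is a toy LRU cache keyed by asset path; the names are made up and nothing here reflects Sony's actual storage design:

```python
from collections import OrderedDict

class NvmeCache:
    """Toy LRU cache standing in for a fast NVMe tier in front of a slow HDD."""
    def __init__(self, capacity_assets):
        self.capacity = capacity_assets
        self.cache = OrderedDict()              # asset path -> data, most recently used last

    def read(self, path, hdd_read):
        if path in self.cache:
            self.cache.move_to_end(path)        # hit: serve from the fast tier
            return self.cache[path]
        data = hdd_read(path)                   # miss: fall back to the slow tier
        self.cache[path] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)      # evict the least recently used asset
        return data

cache = NvmeCache(capacity_assets=2)
print(cache.read("level1/textures.pak", lambda p: b"..."))
```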

Avatar image for rzxv04
#522 Posted by rzxv04 (686 posts) -

Might give an idea to some:

Some HDR panels are really 8-bit + FRC, not native 10-bit. Even some 8-bit + FRC panels have smoother color transitions and less banding than some native 10-bit panels. It all depends.

I think you mean the HDR impact on brightness and contrast. Not many budget TVs offer that.

https://www.rtings.com/tv/tests/picture-quality/gradient

I remember threads on AVSForum where people usually avoided explaining and showing examples of 10-bit vs 8-bit + FRC. There were even links mentioning that RTINGS lumps them together because they couldn't distinguish them, and that there are other factors for judging the output/display.

Can't remember the specific threads since I was looking into getting a new TV in 2018, but after shopping around and testing sets at home, I wasn't happy.

On a side note, we've tried HDR/DV on an X900F, a Vizio P55-F1 and a cheap entry-level TCL.

Blooming is so bad on the Sony and the Vizio, and the blacks are just crushed in many dark scenes in DV.

The cheap TCL didn't have ugly ass blooming; its biggest problems were the weird colors and horrid viewing angles.

Sony's SDR upscaling/regular 4K was so damn good though, far better than HDR.

Either you need a proper hardware-based calibration to make DV/HDR shine, or it's just really overhyped.

Plus there are different kinds of HDR/DV and different choices about where you want to clip. There's no true standard for clipping, as even professionals like the people at RTINGS and Vincent Teoh from HDTV Test have mentioned. It's up to us which looks better.

Avatar image for Shewgenja
#523 Posted by Shewgenja (21456 posts) -

@ronvalencia: I guess this is the part where I make my position clear on the 8k stuff.

I think indies and a small smattering of budget titles may support it. All Sony is doing by mentioning 8k at all is securing a content pipeline as the generation unfolds. There will be a PS5 Pro at some point that will take a more substantive swing and a whiff at being a half-step 8k console but ensuring there will be content to support it is way more important than announcing that it is a standard for a launch console when the technology simply isn't there.

Also, for the record, I think the gaggle of people saying that Sony is stupid for bringing 8k into the mix are dumber than a box of rocks. 8k TVs are already on shelves as well as 8k TV monitors. NHK, a Japanese channel, already broadcasts in 8k. 8K will be a thing long before Gen9 is over. Sony is planning for that, and so is Microsoft. It's going to be funny as shit around here come E3 when the exact same posters who have so much to say on the subject fangirl out for 8k. Won't be long. Lol

Avatar image for ronvalencia
#524 Edited by ronvalencia (28067 posts) -

@Shewgenja said:

@ronvalencia: I guess this is the part where I make my position clear on the 8k stuff.

I think indies and a small smattering of budget titles may support it. All Sony is doing by mentioning 8k at all is securing a content pipeline as the generation unfolds. There will be a PS5 Pro at some point that will take a more substantive swing and a whiff at being a half-step 8k console but ensuring there will be content to support it is way more important than announcing that it is a standard for a launch console when the technology simply isn't there.

Also, for the record, I think the gaggle of people saying that Sony is stupid for bringing 8k into the mix are dumber than a box of rocks. 8k TVs are already on shelves as well as 8k TV monitors. NHK, a Japanese channel, already broadcasts in 8k. 8K will be a thing long before Gen9 is over. Sony is planning for that, and so is Microsoft. It's going to be funny as shit around here come E3 when the exact same posters who have so much to say on the subject fangirl out for 8k. Won't be long. Lol

At 4K and 8K resolutions, the major bottleneck is the 64-ROPS, 256-bit GDDR6 design, which AMD has yet to master.

The RTX 2080 is the first CUDA GPU with six decoupled geometry-raster engines paired with 64 ROPS and a 256-bit bus.

1. Geometry-raster engines handle the mass conversion of floating-point geometry into the integer pixel grid. Higher resolutions mean larger pixel-grid conversion targets. The RTX 2080 has GTX 1080 Ti-class geometry-raster engines with the GTX 1080's ROPS count and 4MB of L2 cache; the GTX 1080 Ti has a smaller 3MB L2 cache.

2. ROPS units are the traditional read/write units (containing Z-ROPS and color-ROPS) that feed the geometry-raster engines. Compute shaders use the TMU read/write path.

3. To exceed the 96-ROPS limit of a 384-bit design, Volta GV100 has 128 ROPS with quad-stack HBM2, but with the GTX 1080 Ti's six geometry-raster engines.

When TFLOPS scale higher, the raster hardware has to scale with it.

For sustained 4K and 8K, the major GPU innovation needed is raster-engine scaling beyond current GPU products.

Tricks like variable rate shading let shading resources be conserved.
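For context on what those ROP counts mean, a quick fill-rate sanity check; this counts a single full-screen pass only, while real frames run many passes with overdraw, blending and MSAA, which is where ROP and bandwidth limits actually bite (the 1.8 GHz clock is the rumored figure):

```python
# Peak ROP fill rate vs the per-pass pixel throughput a resolution/framerate target needs.
def fill_gpix_per_s(rops, clock_ghz):
    return rops * clock_ghz

def target_gpix_per_s(width, height, fps):
    return width * height * fps / 1e9

print(f"64 ROPS @ 1.8 GHz : {fill_gpix_per_s(64, 1.8):.1f} Gpixel/s peak")
print(f"4K60, one pass    : {target_gpix_per_s(3840, 2160, 60):.2f} Gpixel/s")
print(f"8K60, one pass    : {target_gpix_per_s(7680, 4320, 60):.2f} Gpixel/s")
```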

Anyway, 4K results with a Vega II OC with UV (undervolt):

[Embedded video]

Avatar image for EG101
#525 Posted by EG101 (2010 posts) -

@son-goku7523 said:
@EG101 said:
@dalger21 said:
@Pedro said:

Can't wait to see the price

Current specs say it will be 499-599

If Sony stuffs everything it can in a $600 box I will be pleased.

It's going to be late 2020 almost 2021 during the launch of PS5. $400 is a pathetic budget for a High Tech Toy in 2021.

Sony will never sell a $600 PS5, last time they sold at that price it almost killed their entire company.

First you have to understand why the $600 PS3 launch hurt Sony.

The most obvious reason was that Sony was selling an $800 box for $600 taking a $200 hit on each launch PS3.

2nd the PS3 launch was in 2006, the PS5 will be launching in 2020 14 years later and inflation is a real thing.

Sony should simply build a Slim PS4 Pro that can sell for $250 and give us a $600 PS5 Console at cost without taking a loss. If $200 is going to stop you from buying a console that will give you 6-8 years of current gen gaming then you shouldn't be gaming at all as you obviously should be using the money on other things. Gaming is a hobby and an expensive one.

Avatar image for pc_rocks
#526 Posted by PC_Rocks (2499 posts) -

@04dcarraher said:
@pc_rocks said:

1. It has shitty textures and shitty AA to no AA and yes, they have a shit tier hardware even in 2013. Anyway, all conjecture and not relevant to the point. Sony/Mark Cerny lied about the PS4 as usual. We are not discussing graphics here but the claims of (non) architect Mark Cerny - the liar.

2. No he didn't. He keeps babbling about GPU and bandwidth, those are irrelevant. Your conjecture and hoping it does because there are no articles doesn't prove my evidence. Where's the hardware to do it? There's no DF video even talking about HDR on PS4 yet they made and talked about it on Pro and XBone S. Sony as usual kept on lying.

1. As someone who fought the rabid Sony fanboys who in 2013 suggested the PS4 was as fast as or faster than a PC with high-end GPUs like the 7970: the GPU in the PS4 wasn't shit for the time frame it released in. It wasn't bad (on par with a 7850), but what holds it back is the CPU and the lack of memory for gaming beyond 900p-1080p. Mark Cerny is known to twist stuff, exaggerate and use marketing-oriented terms to sell a product, such as the PS4 Pro's "Secret Sauce" (with its FP16 processing), telling the public it can make the console an 8.2 TFLOP beast.... Which is false.

2. Even though Ron likes to ramble with tangents... HDR is possible with HDMI 1.4; you cannot, however, send a 4K signal through it. You need HDMI 2.0+. The PS4 can output an HDR signal.

1. True, only the GPU was decent, but in reality it was still in the ballpark of a 570 from 2010, so overall Mark Cerny lied as always, just like he's lying and twisting about ray tracing for the PS5. He's clearly riding the hype from the RTX series by claiming they are using ray tracing for audio. No one calls that ray tracing; it's quite a common technique that even games like the original Thief used. Ray tracing in the traditional sense has always described its use in rendering the world space. Shooting a single ray for FoV, pathfinding, etc. is not comparable to firing tens of thousands of rays for rendering.

2. My point is not about bandwidth, it's about passing 10-bit info. All the articles and research I have done showed me that HDMI 1.4 could not pass 10-bit. If it were only about bandwidth, then HDMI 1.3 and 1.4 have the same bandwidth.

Avatar image for pc_rocks
#527 Edited by PC_Rocks (2499 posts) -

@ronvalencia said:
@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:

Read Sony's link again. Look at system requirements.

Optimize Your PS4™ and HDR Experience

For the best HDR experience, we highly recommend the following:

2K/4K HDR capable TV that supports the HDR10 format.

PS4™ is connected to the TV with a Premium HDMI cable.

Play or watch HDR enabled content, like games or streaming video service.

PS4™ has system software 4.0 or later installed.

R9-290X (renamed as R9-390X) supports HDR10 format when the display supports it. PS4's IGP is with Hawaii GCN generation.

PS4's IGP is effectively Hawaii GCN with half of CU and ROPS units count. PS4's GPU is effectively Hawaii Lite and it's slightly different from GCN 1.0 Pitcairn.

TV that supports HDR10. My question is how do you transmit 10 bit color from an HDMI chip that only supports 8 bit? PS4 has HDMI 1.4a not 1.4b or HDMI 2.0 or Display Port. Funny thing is the same question is posted on PS4 Reddit yet there's no answer.

HDMI 1.4b was released on October 11, 2011.

Hawaii GCN can generate digital data for HDMI 2.0 HDR10 at HDMI 1.4 bit rate.

Do you realize that 4:2:0 can be 10-bit or 8-bit, right?

My 32 inch LG monitor has FreeSync + HDR10 + HDMI 2.0a and DisplayPort. I can connect my old 4770K + MSI R9-290X Gaming X (similar vintage to PS4) to it.

I do realize 4:2:0 can be 8-bit or 10-bit; however, 4:2:0 is compressed and not representative of the true color space.

Lastly, I never questioned the GPU is incapable of HDR, I specifically said that PS4 didn't have HDMI 2.0 to pass 10bit HDR. Unless you can prove that HDMI 1.4 can do HDR all of that is irrelevant. I need evidence that HDMI 1.4 can do HDR not GPU, not PS4 hardware not monitors or TVs. Simple.

Here's your ownage:

How did Sony do HDR without the hardware for it? You can't change the hardware, only update the firmware. Sony is faking HDR on the original PS4.

Red herring with 4K. PS4 is requesting 1080p bit rate with HDR10 NOT 4K HDR10

https://arstechnica.com/gaming/2016/10/ps4-hdr-no-games-media-useless/

The article again agrees with what I said: it has the bandwidth, but what about the hardware to pass 10-bit? HDMI 1.4 only supports 8-bit, and it further agrees with me that there's no HDR content on the original PS4 to verify. Further proving my point. Sony lied through their teeth while the Xbone S has HDR games.

Nobody claimed that it's impossible to do HDR at 1080p; the question was and still is how you process 10 bits on 8-bit hardware without compression or simulation.

Avatar image for pc_rocks
#528 Posted by PC_Rocks (2499 posts) -

@rzxv04 said:

Might give an idea to some:

Some HDR panels are really 8-bit + FRC, not native 10-bit. Even some 8-bit + FRC panels have smoother color transitions and less banding than some native 10-bit panels. It all depends.

I think you mean the HDR impact on brightness and contrast. Not many budget TVs offer that.

Thank you. That's what I was pointing to when I talked about 8 bit dithering, many manufacturers claim it as HDR yet it isn't true 10bit HDR. We are not talking about the implementation or panels, of course cheap 10bit panels will be inferior. The point is one is true the other is a way to simulate it.
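For anyone curious what 8-bit + FRC actually does, a minimal sketch of the temporal-dithering idea (illustrative only, not any vendor's algorithm):

```python
# Approximate one 10-bit level on an 8-bit panel by alternating the two nearest 8-bit levels over frames.
def frc_frames(level_10bit, num_frames=4):
    base, frac = divmod(level_10bit, 4)          # a 10-bit value is an 8-bit level * 4 + remainder
    return [min(base + (1 if i < frac else 0), 255) for i in range(num_frames)]

frames = frc_frames(513)                          # 10-bit 513 sits between 8-bit 128 and 129
print(frames, "-> averages back to", sum(frames) / len(frames) * 4)
```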

Avatar image for son-goku7523
#529 Posted by Son-Goku7523 (955 posts) -
@EG101 said:
@son-goku7523 said:
@EG101 said:
@dalger21 said:
@Pedro said:

Can't wait to see the price

Current specs say it will be 499-599

If Sony stuffs everything it can in a $600 box I will be pleased.

It's going to be late 2020 almost 2021 during the launch of PS5. $400 is a pathetic budget for a High Tech Toy in 2021.

Sony will never sell a $600 PS5, last time they sold at that price it almost killed their entire company.

First you have to understand why the $600 PS3 launch hurt Sony.

The most obvious reason was that Sony was selling an $800 box for $600 taking a $200 hit on each launch PS3.

2nd the PS3 launch was in 2006, the PS5 will be launching in 2020 14 years later and inflation is a real thing.

Sony should simply build a Slim PS4 Pro that can sell for $250 and give us a $600 PS5 Console at cost without taking a loss. If $200 is going to stop you from buying a console that will give you 6-8 years of current gen gaming then you shouldn't be gaming at all as you obviously should be using the money on other things. Gaming is a hobby and an expensive one.

I get what you're saying but $600 PS5 would flop bad. Other than hardcore cows no one would buy that shit. Sony doesn't want that kinda scenario. They want the PS5 to be as successful or even more successful than PS4. They have more competition going into next generation with Google Stadia and a vengeful MS, they can't afford to f*** it up with a $600 PS5.

Avatar image for boxrekt
#530 Edited by BoxRekt (1791 posts) -

@son-goku7523:

It's going to be $500.

Thanks to MS marketing of "premium console" and reasonable success of the X Sony knows they can get away with $500 at launch.

Sony will discount the Pro as well as base PS4 to continue PS4 sales and since Sony (and the industry) will be supporting cross-gen games for at least 1 year past the start of Next gen Sony will be able to sell PS5 for $500 and not worry about game sales.

Sony will not worry about PS5 sales because most of those PS4 adopters will likely upgrade to PS5 when they want 4k60 next gen gaming.

Avatar image for goldenelementxl
#531 Posted by GoldenElementXL (3248 posts) -

@boxrekt: What success has the X had, and why would Sony, the company with the product that has outsold the X every month it's been out, try to emulate it? What you're saying makes no sense.

Avatar image for zaryia
#532 Edited by Zaryia (9023 posts) -

He forgot to mention it would be 2nd or 3rd place again.

Avatar image for boxrekt
#533 Edited by BoxRekt (1791 posts) -

@goldenelementxl said:

@boxrekt: What success has the X had, and why would Sony, the company with the product that has outsold the X every month it's been out, try to emulate it? What you're saying makes no sense.

lol what a contradiction.

So the X didn't have any success now? LMAO well that's NEW confession from a proud X owner like yourself!

In fact most of the xbox fans on this board double dipped and not only bought the X but bragged about dropping $500 for the X's 6TF of POWER to get higher res multiplats.

Ok, I'm not gone bust your grapes over that slip up :P

First I said "reasonable" success. The X really didn't sell a boat load to new users but they sold quite well to xbox fans who were already invested in the xbox ecosystem!

THAT point is what Sony probably paid attention to.

3 things:

  1. All PS4 your games will be playable on PS5
  2. The industry will support cross-gen development for at least 1 year.
  3. Early adopters are willing to pay more for new hardware.

Sony is in a win win situation because of PS4's success and it's likely TLOU 2 to be a PS5 launch title 4k60fps.

Sony doesn't need to invest in a huge launch line-up because they can push PS5's ability to run PS4 games 4k 60 to enthusiasts and save a huge PS5 exclusives for holiday 2020 (if it releases March 2020) and give 1 or 2 big PS5 games a year while cross-gen development slowly phased out and PS5 price slowly drops.

What doesn't make sense to you? lol

Avatar image for son-goku7523
#534 Edited by Son-Goku7523 (955 posts) -

@boxrekt said:

@son-goku7523:

It's going to be $500.

Thanks to MS marketing of "premium console" and reasonable success of the X Sony knows they can get away with $500 at launch.

Sony will discount the Pro as well as base PS4 to continue PS4 sales and since Sony (and the industry) will be supporting cross-gen games for at least 1 year past the start of Next gen Sony will be able to sell PS5 for $500 and not worry about game sales.

Sony will not worry about PS5 sales because most of those PS4 adopters will likely upgrade to PS5 when they want 4k60 next gen gaming.

lol, c'mon bro, that's horse shit.

What success has the X had? You are aware that MS has lost every NPD since the X came out right? The very month the X launched Sony completely destroyed them at NPD. Things are even worse when you bring in worldwide numbers.

$500 console is a losing formula and MS hasn't been successful at marketing it, no one has. $400 is the sweet spot and it's why Sony was extremely careful top price the Pro at that price.

Here on SW it may appear the X is more successful but in the real world Sony has been eating MS' lunch with the Pro (there was a period the Pro was even in short supply in stores, that has never happened with the X). Sony has been killing MS in sales, the X hasn't changed MS' fortunes at all. Sony would have to be completely idiotic to leave their winning model to follow MS' losing formula next gen.

The X is only successful among hardcore lems, the general public completely ignores it. In order for Sony to make a successful PS5, they need a price that will not only attract hardcore cows but gamers on the fence about upgrading. A $500 sticker price isn't going to attract gamers on the fence.

Avatar image for boxrekt
#535 Posted by BoxRekt (1791 posts) -

@son-goku7523:

read my reply to @goldenelementxl

The problem is you're comparing the X to PS4 overall sales. That's not the way you should be looking at this.

You guys should look at the way the X sales to the xbox one fans.

Of course the PS5 will still sell better than any $500 mid-gen refresh because it's actually going to have next gen game that current gen systems can't play, note, in the future.

What you have to realize is cross-gen development is a reality and has been for the last 2 generations. You are guaranteed at LEAST 1 year cross gen support by Sony and probably longer by the industry. The fact that PS5 plays all PS4 games and, possibly at higher res and frame rate, with additional EXCLUSIVES give Sony the leeway to price PS5 a bit higher because not everyone needs to buy a PS5 a launch because ANY PS4 games that come out will be playable on PS5 when PS4 support stops.

Avatar image for goldenelementxl
#536 Posted by GoldenElementXL (3248 posts) -

@boxrekt said:
@goldenelementxl said:

@boxrekt: What success has the X had, and why would Sony, the company with the product that has outsold the X every month it's been out, try to emulate it? What you're saying makes no sense.

lol what a contradiction.

So the X didn't have any success now? LMAO well that's NEW confession from a proud X owner like yourself!

In fact most of the xbox fans on this board double dipped and not only bought the X but bragged about dropping $500 for the X's 6TF of POWER to get higher res multiplats.

Ok, I'm not gone bust your grapes over that slip up :P

First I said "reasonable" success. The X really didn't sell a boat load to new users but they sold quite well to xbox fans who were already invested in the xbox ecosystem!

THAT point is what Sony probably paid attention to.

3 things:

  1. All PS4 your games will be playable on PS5
  2. The industry will support cross-gen development for at least 1 year.
  3. Early adopters are willing to pay more for new hardware.

Sony is in a win win situation because of PS4's success and it's likely TLOU 2 to be a PS5 launch title 4k60fps.

Sony doesn't need to invest in a huge launch line-up because they can push PS5's ability to run PS4 games 4k 60 to enthusiasts and save a huge PS5 exclusives for holiday 2020 (if it releases March 2020) and give 1 or 2 big PS5 games a year while cross-gen development slowly phased out and PS5 price slowly drops.

What doesn't make sense to you? lol

Just because I own 3 X's doesn't mean the console is a success. It's a perfect product for me and my needs, but consumers as a whole haven't flocked to the product like they have with the competition. Hence the reason the Xbox One brand is consistently in 3rd or 4th place each month.

What doesn't make sense is why Sony would emulate anything Microsoft has done in order to be successful. That would be like the New England Patriots trying to emulate the Detroit Lions. Why would they do that? Sony should focus on what put them in 1st this gen. But don't worry Xbox fans, it seems like Sony is dropping the ball and may give you the lead again next gen. Just one PR disaster after another with them. If Xbox would have come up with a better name for the SAD Edition and priced it correctly, they would have all the momentum. But of course they messed that one up.

I've never seen companies fumble momentum like Sony and Microsoft.

Avatar image for goldenelementxl
#537 Posted by GoldenElementXL (3248 posts) -

@boxrekt said:

@son-goku7523:

read my reply to @goldenelementxl

The problem is you're comparing the X to PS4 overall sales. That's not the way you should be looking at this.

You guys should look at the way the X sales to the xbox one fans.

That's exactly the way we should be looking at it, as should Microsoft. Their install base is less than half of the PS4's. So why spend hundreds of millions on R&D, production, marketing etc on a product that only a fraction of your already small install-base will purchase? The X is a great console for gaming enthusiasts. But it didn't do them any favors with market share. I wouldn't be surprised if it was a big waste of money for them overall.

So why would Sony want to emulate that???

Avatar image for rzxv04
#538 Posted by rzxv04 (686 posts) -

@pc_rocks said:
@rzxv04 said:

Might give an idea to some:

Some HDR panels are really 8-bit + FRC, not native 10-bit. Even some 8-bit + FRC panels have smoother color transitions and less banding than some native 10-bit panels. It all depends.

I think you mean the HDR impact on brightness and contrast. Not many budget TVs offer that.

Thank you. That's what I was pointing to when I talked about 8 bit dithering, many manufacturers claim it as HDR yet it isn't true 10bit HDR. We are not talking about the implementation or panels, of course cheap 10bit panels will be inferior. The point is one is true the other is a way to simulate it.

Yeah it's actually weird. Sadly, I've seen some RTINGS and other avsforum members claim that one cannot tell 10 bit from 8 bit FRC. I can't remember the exact forum post/links sadly as this was about 5 months ago when I was trying to relearn new tv tech.

Anyway TV buying today kinda sucks. Too many issues coming from forms of panel lottery.

Btw, it's a long shot but HDTVtest in youtube has a way to properly check for HDR. It's would be interesting if they can check HDR for the base PS4.

https://www.youtube.com/watch?v=kyvqlqSlUiM

https://www.youtube.com/watch?v=fl7PCjVHslE

https://www.youtube.com/watch?v=8_rUfVBhM8U

https://www.youtube.com/watch?v=ZhSyxjnwLOA

If they do fake HDR and claimed otherwise, I hope they get sued.

Anyway, HDR/DV feels very underwhelming and I doubt many have proper calibration tools to calibrate for hdr/dv.. probably some are placebo or just black crush/color push setting you can do on a good sdr display to achieve a close to "good hdr" look.

I don't know. Just that looking for tv buying and experiencing it recently left a bad taste in my mouth.

Sorry, went too off topic.

Back to consoles specs :)

Avatar image for rzxv04
#539 Posted by rzxv04 (686 posts) -

@EG101 said:
@son-goku7523 said:
@EG101 said:
@dalger21 said:
@Pedro said:

Can't wait to see the price

Current specs say it will be 499-599

If Sony stuffs everything it can in a $600 box I will be pleased.

It's going to be late 2020 almost 2021 during the launch of PS5. $400 is a pathetic budget for a High Tech Toy in 2021.

Sony will never sell a $600 PS5, last time they sold at that price it almost killed their entire company.

First you have to understand why the $600 PS3 launch hurt Sony.

The most obvious reason was that Sony was selling an $800 box for $600 taking a $200 hit on each launch PS3.

2nd the PS3 launch was in 2006, the PS5 will be launching in 2020 14 years later and inflation is a real thing.

Sony should simply build a Slim PS4 Pro that can sell for $250 and give us a $600 PS5 Console at cost without taking a loss. If $200 is going to stop you from buying a console that will give you 6-8 years of current gen gaming then you shouldn't be gaming at all as you obviously should be using the money on other things. Gaming is a hobby and an expensive one.

I agree with this. $ 599 from 200X is not $ 599 in 2020/2021.

That being said, there's a stigma about that price. Who knows how the web will spin and react to that. Meme age.

Maybe they can pull a $ 549-589 price just for the looks of the number.

I still wouldn't be surprised if Sony takes a strategy with more $ loss/unit sold this time around. Somewhere between the original PS3 and the PS4.
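A rough inflation adjustment of the PS3 comparison; the CPI figures below are approximate US CPI-U annual averages, used here only as ballpark assumptions:

```python
cpi = {2006: 201.6, 2020: 258.8}   # approximate US CPI-U annual averages
print(f"$599 in 2006 is roughly ${599 * cpi[2020] / cpi[2006]:.0f} in 2020 dollars")
```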

Avatar image for PAL360
#540 Posted by PAL360 (29589 posts) -

I really hope devs don't waste resources with 8k. Something like 1800p60 with ray tracing would be perfect, in my opinion.

Avatar image for son-goku7523
#541 Edited by Son-Goku7523 (955 posts) -

@boxrekt: Your argument assumes Sony views the PS5 as some sort of higher end PS4. What makes you think Sony is content with people holding on to their PS4s and waiting for PS5 to become affordable in 2 or 3 years with competition like MS and Google Stadia around? That makes no sense at all. They want to gain marketshare as quick as possible so that the PS5 can start a new install-base ASAP for next gen.

Marketing your console to gaming enthusiasts is the surest way to fail. You’ve said so yourself, the X has been successful among the hardcore but it hasn’t moved the needle for MS at all when you place it next to Sony’s overall sales which are bolstered by the uber successful OG PS4 and PS4 Slim. The PS5 isn’t supposed to be a replacement to the PS4 Pro or compete with the X for hardcore gamers, it’s meant to replace the aging regular PS4, it’s a new starting point for Sony. Marketing it as a premium console is exactly what Sony should avoid. They are supposed to market the PS5 as an affordable yet capable console like they did with the PS4. Trying to market it as a premium PS4 Pro or X replacement will cause the PS5 to fail. It will be appealing to people like us on SW but outside gaming enthusiast circles it will fail hard, and that’s what Sony should avoid at all costs. Crossgen game development doesn’t change the dynamics at all. A $400 console is still way more marketable than a $500 or $600 one.

Avatar image for ronvalencia
#542 Edited by ronvalencia (28067 posts) -

@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:

HDMI 1.4b was released on October 11, 2011.

Hawaii GCN can generate digital data for HDMI 2.0 HDR10 at HDMI 1.4 bit rate.

Do you realize that 4:2:0 can be 10-bit or 8-bit, right?

My 32 inch LG monitor has FreeSync + HDR10 + HDMI 2.0a and DisplayPort. I can connect my old 4770K + MSI R9-290X Gaming X (similar vintage to PS4) to it.

I do realize 4:2:0 can be 8-bit or 10-bit; however, 4:2:0 is compressed and not representative of the true color space.

Lastly, I never questioned the GPU is incapable of HDR, I specifically said that PS4 didn't have HDMI 2.0 to pass 10bit HDR. Unless you can prove that HDMI 1.4 can do HDR all of that is irrelevant. I need evidence that HDMI 1.4 can do HDR not GPU, not PS4 hardware not monitors or TVs. Simple.

Here's your ownage:

How did Sony do HDR without the hardware for it? You can't change the hardware, only update the firmware. Sony is faking HDR on the original PS4.

Red herring with 4K. PS4 is requesting 1080p bit rate with HDR10 NOT 4K HDR10

https://arstechnica.com/gaming/2016/10/ps4-hdr-no-games-media-useless/

The article again agrees with what I said: it has the bandwidth, but what about the hardware to pass 10-bit? HDMI 1.4 only supports 8-bit, and it further agrees with me that there's no HDR content on the original PS4 to verify. Further proving my point. Sony lied through their teeth while the Xbone S has HDR games.

Nobody claimed that it's impossible to do HDR at 1080p; the question was and still is how you process 10 bits on 8-bit hardware without compression or simulation.

I connected an old 46-inch 1080p Sony Bravia HDTV to the HDMI 1.4 port of a Radeon R9-290X and it registered as a 12-bit color display according to AMD's control panel. LOL

I can select 8, 10 and 12bits color within AMD's control panel.

The old Sony Bravia HDTV is missing the specific HDR10 color profile.

DisplayPort not used.

------------

I plan to connect HDMI 1.4 Radeon HD R9-290X to HDMI 2.0a FreeSync +HDR +4K LG 32inch monitor.

HDMI 2.0a FreeSync +HDR +4K LG 32 inch monitor has been verified to work with my laptop's HDMI 2.0a with HDR10, FreeSync and 4K.

Avatar image for ronvalencia
#543 Edited by ronvalencia (28067 posts) -

@goldenelementxl said:
@boxrekt said:

@son-goku7523:

read my reply to @goldenelementxl

The problem is you're comparing the X to PS4 overall sales. That's not the way you should be looking at this.

You guys should look at the way the X sales to the xbox one fans.

That's exactly the way we should be looking at it, as should Microsoft. Their install base is less than half of the PS4's. So why spend hundreds of millions on R&D, production, marketing etc on a product that only a fraction of your already small install-base will purchase? The X is a great console for gaming enthusiasts. But it didn't do them any favors with market share. I wouldn't be surprised if it was a big waste of money for them overall.

So why would Sony want to emulate that???

Like GTX 1060's 192 bit GDDR5, 192 bit GDDR6 is reaching mainstream via GTX 1660 Ti's entry into the market. AMD equivalent to GTX 1060's 192 bit is like RX-470/RX-480's 256bit with AMD taking a lower profit margins when compared NVIDIA's.

GTX 1660 Ti includes 48 ROPS with three GPC (three mass geometry-raster conversion engines).

On AMD's side

RX 3080 NAVI (Vega 64 level) replace RX-580 with 256 bit bus

RX 3070 NAVI (Vega 56 level, 10 TFLOPS, 64 ROPS) replace RX-570 (5 TFLOPS, 32 ROPS) with 256 bit bus <------- PS4/PS4 Pro price segment

RX 3060 NAVI (RX-580 level) replace RX-560 with 128 bit bus

X1X has 384 bit bus PCB price segment with higher BOM cost over RX-580, hence unreleased PC SKU equivalent between RX-580 and RX Vega 56 segment.

RX 3080 NAVI targets RTX 2070

RX 3070 NAVI targets RTX 2060
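The bus-width talk translates into bandwidth roughly like this; the per-pin data rates are typical GDDR figures, assumed for illustration:

```python
# GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(f"192-bit GDDR6 @ 14 Gbps             -> {bandwidth_gbs(192, 14):.0f} GB/s")
print(f"256-bit GDDR6 @ 14 Gbps             -> {bandwidth_gbs(256, 14):.0f} GB/s")
print(f"384-bit GDDR5 @ 6.8 Gbps (X1X-like) -> {bandwidth_gbs(384, 6.8):.0f} GB/s")
```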

Avatar image for ronvalencia
#544 Edited by ronvalencia (28067 posts) -

@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

1. It has shitty textures and shitty AA to no AA and yes, they have a shit tier hardware even in 2013. Anyway, all conjecture and not relevant to the point. Sony/Mark Cerny lied about the PS4 as usual. We are not discussing graphics here but the claims of (non) architect Mark Cerny - the liar.

2. No he didn't. He keeps babbling about GPU and bandwidth, those are irrelevant. Your conjecture and hoping it does because there are no articles doesn't prove my evidence. Where's the hardware to do it? There's no DF video even talking about HDR on PS4 yet they made and talked about it on Pro and XBone S. Sony as usual kept on lying.

1. As someone who fought the rabid Sony fanboys who in 2013 suggested the PS4 was as fast as or faster than a PC with high-end GPUs like the 7970: the GPU in the PS4 wasn't shit for the time frame it released in. It wasn't bad (on par with a 7850), but what holds it back is the CPU and the lack of memory for gaming beyond 900p-1080p. Mark Cerny is known to twist stuff, exaggerate and use marketing-oriented terms to sell a product, such as the PS4 Pro's "Secret Sauce" (with its FP16 processing), telling the public it can make the console an 8.2 TFLOP beast.... Which is false.

2. Even though Ron likes to ramble off on tangents... HDR is possible with HDMI 1.4; you cannot, however, send a 4K signal through it. You need HDMI 2.0+ for that. The PS4 can output an HDR signal.

1. True, only the GPU was decent, but in reality it was still in the ballpark of a 570 from 2010, so overall Mark Cerny lied as always, just like he's lying and twisting about ray tracing for the PS5. He's clearly riding the hype from the RTX series by saying they are using ray tracing for audio. No one calls that ray tracing; it's quite a common technique that even games like the original Thief used. Ray tracing in the traditional sense has always described its use for rendering in world space. Shooting a single ray for FoV, pathfinding, etc. is not comparable to firing tens of thousands of rays for rendering.

2. My point is not about bandwidth; it's about passing 10-bit info. All the articles and research I have done showed me that HDMI 1.4 could not pass 10-bit. If it were only about bandwidth, then note that HDMI 1.3 and 1.4 have the same bandwidth.

1. TrueAudio Next reuses Radeon Rays 2.0's BVH ray-tracing acceleration structures.

RTX Turing accelerates BVH-search ray-tracing methods via fixed-function hardware units. The RT cores guide the shaders' color/pixel rendering.
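In case the jargon is unfamiliar: a "BVH search" is just a walk over a tree of bounding boxes, so that most triangles never need an explicit intersection test. A toy sketch of that traversal, purely as an illustration (my own simplification, not AMD's Radeon Rays or NVIDIA's RT-core implementation; it assumes nonzero ray direction components):

# Toy bounding-volume-hierarchy (BVH) traversal: the kind of tree walk that
# Radeon Rays runs in compute and Turing's RT cores run in fixed-function units.
from collections import namedtuple

AABB = namedtuple("AABB", "lo hi")                # box: (x, y, z) mins and maxes
Node = namedtuple("Node", "box left right tri")   # leaf nodes carry a triangle id

def hit_aabb(origin, inv_dir, box):
    """Slab test: does the ray intersect the axis-aligned box?"""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box.lo, box.hi):
        t0, t1 = (lo - o) * inv, (hi - o) * inv
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def traverse(root, origin, direction):
    """Walk the BVH and return the triangle ids whose bounds the ray touches."""
    inv_dir = tuple(1.0 / d for d in direction)
    stack, candidates = [root], []
    while stack:
        node = stack.pop()
        if not hit_aabb(origin, inv_dir, node.box):
            continue                              # prune this whole subtree
        if node.tri is not None:
            candidates.append(node.tri)           # leaf: hand the triangle to the shader
        else:
            stack += [node.left, node.right]
    return candidates

# Tiny two-leaf tree as a usage example
leaf_a = Node(AABB((0, 0, 0), (1, 1, 1)), None, None, "tri_a")
leaf_b = Node(AABB((5, 5, 5), (6, 6, 6)), None, None, "tri_b")
root = Node(AABB((0, 0, 0), (6, 6, 6)), leaf_a, leaf_b, None)
print(traverse(root, (-1, 0.5, 0.5), (1, 0.001, 0.001)))  # -> ['tri_a']

The hardware debate is simply about who accelerates that tree walk in fixed function; the shading that happens after a hit is still regular shader work.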

Mark Cerny is not lying. Mark Cerny is repeating AMD's GPUOpen information.

2. My MSI R9 290X Gaming X with an HDMI 1.4 port can pass 8/10/12-bit color to an old 46-inch 1080p Sony Bravia HDTV. I'm fvcking wasting my time supporting Sony's claims.

Sony is not lying.

Sony is repeating AMD's HDR marketing.

AMD has close relationships with the HDMI backers and has inserted its IP, e.g. FreeSync.

Samsung is Xbox One X's official 4K panel partner, e.g. for FreeSync.

Sony ships AMD gaming boxes and is a 4K/8K panel partner.

LG is a partner for AMD's IP, e.g. FreeSync 2.0.

Xbox One X has beta support for Dolby Vision (12-bit color HDR).

NVIDIA lost the adaptive sync war and it was forced to support AMD's FreeSync!

Avatar image for uitravioience
#545 Posted by UItravioIence (1198 posts) -

Yea but yall have to take into account that amd uses a modified Flux capacitor while nVidia uses a reverse tachyon accelerator.

Avatar image for ronvalencia
#546 Posted by ronvalencia (28067 posts) -

@uitravioience said:

Yea but yall have to take into account that amd uses a modified Flux capacitor while nVidia uses a reverse tachyon accelerator.

NVIDIA copied PowerVR's BVH search hardware.

http://cdn.imgtec.com/sdk-presentations/gdc2014_introductionToPowerVRRayTracing.pdf

PowerVR's ray-tracing tech is based on a BVH search engine accelerator similar to NVIDIA's RT cores.

NVIDIA's argument

Avatar image for pc_rocks
#547 Posted by PC_Rocks (2499 posts) -

@ronvalencia said:
@pc_rocks said:
@ronvalencia said:
@pc_rocks said:
@ronvalencia said:

HDMI 1.4b was released on October 11, 2011.

Hawaii GCN can generate the digital data for HDMI 2.0 HDR10 at HDMI 1.4 bit rates.

Do you realize that 4:2:0 can be 10-bit or 8-bit, right?

My 32-inch LG monitor has FreeSync + HDR10 + HDMI 2.0a and DisplayPort. I can connect my old 4770K + MSI R9 290X Gaming X (similar vintage to the PS4) to it.

I do realize 4:2:0 can be 8-bit or 10-bit; however, 4:2:0 is chroma-subsampled and not representative of the true color space.

Lastly, I never claimed the GPU is incapable of HDR; I specifically said that the PS4 didn't have HDMI 2.0 to pass 10-bit HDR. Unless you can prove that HDMI 1.4 can do HDR, all of that is irrelevant. I need evidence that HDMI 1.4 can do HDR, not evidence about the GPU, the PS4 hardware, or monitors and TVs. Simple.

Here's your ownage:

How did Sony do HDR without the hardware for it? You can't change the hardware by just updating the firmware. Sony is faking HDR on the original PS4.

Red herring with 4K. The PS4 is requesting a 1080p bit rate with HDR10, NOT 4K HDR10.

https://arstechnica.com/gaming/2016/10/ps4-hdr-no-games-media-useless/

The article again agreed with what I said: it has the bandwidth, but what about the hardware to pass 10-bit? HDMI 1.4 only supports 8-bit, and the article further agrees with me that there's no HDR content on the original PS4 to verify with, further proving my point. Sony lied through its teeth while the XBone S has HDR games.

Nobody claimed that it's impossible to do HDR at 1080p. The question was, and still is, how do you process 10 bits on 8-bit hardware without compression or simulation?

I connected an old 46-inch 1080p Sony Bravia HDTV to the HDMI 1.4 port of a Radeon R9 290X, and it registered as a 12-bit color display according to AMD's control panel. LOL

I can select 8-, 10- and 12-bit color within AMD's control panel.

The old Sony Bravia HDTV is missing the specific HDR10 color profile.

DisplayPort not used.

------------

I plan to connect the HDMI 1.4 Radeon R9 290X to an HDMI 2.0a LG 32-inch monitor with FreeSync, HDR and 4K.

That LG 32-inch monitor has already been verified to work with my laptop's HDMI 2.0a output with HDR10, FreeSync and 4K.

You're again talking about displays and GPUs. I never claimed anything related to the display or GPU power. The point is: where is the proof that HDMI 1.4 can pass 10-bit info to the display without compression, simulation or dithering? And more specifically, where's the evidence to verify Sony's HDR claims, given that there's no content? I won't respond again if you keep talking about PC GPUs and TVs.
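For context on the "simulation or dithering" option being argued about here, this is what dithering 10-bit values down to an 8-bit link looks like in its simplest form. It is purely illustrative and is not a claim about what the PS4, or any console, actually does internally:

# Illustration only: one way to push 10-bit picture data over an 8-bit path is
# to add sub-LSB noise before rounding (dithering), which is what "simulated"
# HDR usually refers to in these arguments.
import random

def dither_10bit_to_8bit(samples):
    """Quantize 0..1023 values to 0..255 with randomized (dithered) rounding."""
    out = []
    for v in samples:
        scaled = v / 4.0                                      # 10-bit -> 8-bit range
        out.append(min(255, int(scaled + random.random())))   # dithered round
    return out

fine_ramp = list(range(508, 516))       # eight adjacent 10-bit codes near mid-grey
print(fine_ramp)
print(dither_10bit_to_8bit(fine_ramp))  # values land on 127/128/129, mixed by the noise

Whether that counts as "real" 10-bit output is exactly the disagreement in this thread.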

Avatar image for pc_rocks
#548 Posted by PC_Rocks (2499 posts) -

@rzxv04 said:

If they do fake HDR and claim otherwise, I hope they get sued.

There's no way to verify it because there's no HDR content on the PS4. That's probably the reason they got away with it.

Avatar image for hofuldig
#549 Posted by hofuldig (5126 posts) -
@ronvalencia said:
@uitravioience said:

Yea but yall have to take into account that amd uses a modified Flux capacitor while nVidia uses a reverse tachyon accelerator.

NVIDIA copied PowerVR's BVH search hardware.

http://cdn.imgtec.com/sdk-presentations/gdc2014_introductionToPowerVRRayTracing.pdf

PowerVR's ray-tracing tech is based on a BVH search engine accelerator similar to NVIDIA's RT cores.

NVIDIA's argument

You know you replied to a joke post, right?

Avatar image for pc_rocks
#550 Posted by PC_Rocks (2499 posts) -

@ronvalencia said:
@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

1. It has shitty textures and shitty AA to no AA, and yes, it was shit-tier hardware even in 2013. Anyway, that's all conjecture and not relevant to the point. Sony/Mark Cerny lied about the PS4 as usual. We are not discussing graphics here but the claims of (non-)architect Mark Cerny, the liar.

2. No he didn't. He keeps babbling about the GPU and bandwidth; those are irrelevant. Your conjecture, and hoping it works because there are no articles, doesn't refute my evidence. Where's the hardware to do it? There's no DF video even talking about HDR on the PS4, yet they made one and talked about it for the Pro and XBone S. Sony, as usual, kept on lying.

1. I'm one of those who fought the rabid Sony fanboys in 2013 who suggested the PS4 was as fast as or faster than a PC with a high-end GPU like the 7970. The GPU in the PS4 wasn't shit for the time frame it released in. It wasn't bad (on par with a 7850), but what holds it back is the CPU and the lack of memory for gaming beyond 900p-1080p. Mark Cerny is known to twist things, exaggerate, and use marketing-oriented terms to sell a product, such as the PS4 Pro's "Secret Sauce" (with its FP16 processing) telling the public it can make the console an 8.2 TFLOP beast... which is false.

2. Even though Ron likes to ramble off on tangents... HDR is possible with HDMI 1.4; you cannot, however, send a 4K signal through it. You need HDMI 2.0+ for that. The PS4 can output an HDR signal.

1. True, only the GPU was decent, but in reality it was still in the ballpark of a 570 from 2010, so overall Mark Cerny lied as always, just like he's lying and twisting about ray tracing for the PS5. He's clearly riding the hype from the RTX series by saying they are using ray tracing for audio. No one calls that ray tracing; it's quite a common technique that even games like the original Thief used. Ray tracing in the traditional sense has always described its use for rendering in world space. Shooting a single ray for FoV, pathfinding, etc. is not comparable to firing tens of thousands of rays for rendering.

2. My point is not about bandwidth; it's about passing 10-bit info. All the articles and research I have done showed me that HDMI 1.4 could not pass 10-bit. If it were only about bandwidth, then note that HDMI 1.3 and 1.4 have the same bandwidth.

1. TrueAudio Next reuses Radeon Rays 2.0's BVH ray-tracing acceleration structures.

RTX Turing accelerates BVH-search ray-tracing methods via fixed-function hardware units. The RT cores guide the shaders' color/pixel rendering.

Yes, Mark Cerny is lying, or at the very least deliberately misleading the public as usual. Crytek used so-called ray tracing in screen space for RLR in Crysis 2, yet they didn't claim it was ray tracing. Every game in existence fires rays for pathfinding, collision detection, line of sight, etc., yet no one calls that ray tracing. Rays have also been used for spatial audio before, yet no one explicitly called it ray tracing. Epic and Crytek implemented SVOGI, but they didn't claim it as ray tracing. The reason is that ray tracing has traditionally been used to describe a huge number of rays being cast for rendering in world space. Funny that only Sony/GG tried to claim they were doing ray tracing with the RLR in KZ:SF two years after Crytek, again trying to use it as a buzzword to hype their product without saying it's not the same thing.
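To put rough numbers on the "single ray versus tens of thousands of rays" distinction being drawn here, a quick comparison (the resolution and sample count are arbitrary choices for illustration):

# One line-of-sight query is a single ray; even a minimal ray-traced rendering
# pass fires millions of rays per frame. Figures below are illustrative only.
width, height = 1920, 1080
rays_line_of_sight = 1                   # e.g. "can the AI see the player?"
rays_per_pixel = 1                       # one primary ray, before any bounces
rays_rendering_per_frame = width * height * rays_per_pixel

print(f"line-of-sight check : {rays_line_of_sight} ray")
print(f"1 ray/pixel @ 1080p : {rays_rendering_per_frame:,} rays per frame")
print(f"ratio               : {rays_rendering_per_frame // rays_line_of_sight:,}x")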