Seems multiplats will be the same


#101 Pedro
Member since 2002 • 46313 Posts

Initiating damage control.

Post 1.

Post 2.

Post 3.

Damage control complete. Disaster averted.


#102 PC_Rocks
Member since 2018 • 5000 Posts

@tormentos said:

@pc_rocks:

https://wccftech.com/sony-ps5-vs-xbox-series-x-analysis/

Several sites reported it like that, but I think they were rounding the number.

Not that you will gain anything from 10.28 to 10.30 TF.🤷‍♂️

It isn't from Sony or Cerny, but it is from your very own article; this is what it has to say:

Sustained GPU Clock Speed: 2.0 GHz [Estimated]
Sustained FP32 Performance: 9.2 TFLOPs [Estimated]
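
For what it's worth, both of those numbers fall straight out of the standard FP32 formula (2 ops per clock × shader count × frequency); a quick sketch, assuming RDNA 2's 64 shaders per CU and the PS5's 36 CUs:

```python
# Rough FP32 throughput estimate for an RDNA 2 GPU:
# TFLOPs = CUs * 64 shaders/CU * 2 ops/cycle * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))   # ~10.28 TF - PS5 at its 2.23 GHz cap
print(tflops(36, 2.00))   # ~9.22 TF  - the article's 2.0 GHz "sustained" estimate
print(tflops(52, 1.825))  # ~12.15 TF - Series X at its fixed clock
```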

#103 The6millionLies
Member since 2020 • 564 Posts

@tormentos: Dude, chill. I just asked you a question; why are you getting so defensive? 👀


#104 tormentos
Member since 2003 • 33699 Posts

@pc_rocks:

I didn't say it was from Cerny; in fact, I suggested they rounded the numbers as a possible explanation for the whole 10.3 TF thing, because like him I have seen that 10.3 figure elsewhere as well.

But again, going from 10.28 to 10.30 I don't see a single extra frame of performance to be gained.


#105 PC_Rocks
Member since 2018 • 5000 Posts

@tormentos said:

@pc_rocks:

I didn't say it was from Cerny; in fact, I suggested they rounded the numbers as a possible explanation for the whole 10.3 TF thing, because like him I have seen that 10.3 figure elsewhere as well.

But again, going from 10.28 to 10.30 I don't see a single extra frame of performance to be gained.

10.28 vs. 10.3 is not in question. I specifically asked where Sony/Cerny claimed that the PS5 is 10.3 TF (or, for that matter, 9.2), which is what the TC claimed.


#106  Edited By PAL360
Member since 2007 • 30060 Posts

Of course most multiplats will look and perform the same. We are talking about 2 systems that are many times more capable than what's available now, with a difference between them smaller than 20%.

Even if there was a huge hardware difference, most devs would not even know how to take advantage of the new hardware for the first 2 or 3 years.
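
For reference, the paper gap works out to roughly that figure; a back-of-the-envelope check using the announced CU counts and clocks (36 CUs at 2.23 GHz vs. 52 CUs at 1.825 GHz):

```python
# FP32 TFLOPs = CUs * 64 shaders/CU * 2 ops/cycle * clock (GHz) / 1000
ps5_tf = 36 * 64 * 2 * 2.23 / 1000    # ~10.28 TF
xsx_tf = 52 * 64 * 2 * 1.825 / 1000   # ~12.15 TF
print(f"{(xsx_tf / ps5_tf - 1) * 100:.1f}% more compute on paper")  # ~18.2%
```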


#107 WitIsWisdom
Member since 2007 • 6918 Posts

Very few if any devs are going to take the time to truly optimize games on the XSX over the PS5 especially since the games have to run on the XSS. Most multiplats will look and play almost identically with perhaps some better ray tracing or special effects. Later in the generation it may be evident in exclusive games but time will tell.


#108 BlackShirt20
Member since 2005 • 2493 Posts

@tormentos: Still waiting for you to explain how variable CPU and GPU speeds will manage to run at max clock speeds like XSX as you consistently claim every time you bring this up.


#109 regnaston
Member since 2008 • 1885 Posts

@WitIsWisdom said:

Very few if any devs are going to take the time to truly optimize games on the XSX over the PS5 especially since the games have to run on the XSS. Most multiplats will look and play almost identically with perhaps some better ray tracing or special effects. Later in the generation it may be evident in exclusive games but time will tell.

Just the 23 Developers within MS Game Studios I guess :D


#110  Edited By Zero_epyon
Member since 2004 • 15992 Posts

@BlackShirt20 said:

@tormentos: Still waiting for you to explain how variable CPU and GPU speeds will manage to run at max clock speeds like XSX as you consistently claim every time you bring this up.

The CPU and GPU have variable speeds because SmartShift adjusts power based on a power budget. The shift in power can happen multiple times within a single frame, depending on the frame's complexity, and can affect the frequency of the GPU or the CPU. From what I've learned about all this, the power budget for the chip is enough for both the CPU and GPU to run at 100% frequency at the same time.

The shift happens when one component needs more power (AVX CPU instructions, for example) while the GPU doesn't need as much. Instead of feeding the GPU power it would leave idle, that power is shifted to the CPU rather than being drawn on top of the budget. This keeps thermals and power consumption down.

Also, when the shift happens the CPU or GPU is undervolted, not throttled down. Undervolting AMD CPUs and GPUs doesn't translate into a linear decrease in frequency, so even when some power is shifted away, the result is literally a few MHz for a millisecond or two.

Hopefully that clears things up.
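
If it helps, here's a toy sketch of the budget idea (purely illustrative numbers and logic; the real SmartShift algorithm isn't public): the chip has one fixed power budget, and whatever one side doesn't need can be handed to the other.

```python
# Toy model of a shared CPU/GPU power budget (illustrative only; not Sony's or AMD's actual logic).
TOTAL_BUDGET_W = 200.0

def allocate(cpu_demand_w: float, gpu_demand_w: float) -> tuple:
    """Give each side what it asks for if the budget covers both;
    otherwise split the budget in proportion to demand."""
    total_demand = cpu_demand_w + gpu_demand_w
    if total_demand <= TOTAL_BUDGET_W:
        return cpu_demand_w, gpu_demand_w               # both run flat out
    scale = TOTAL_BUDGET_W / total_demand
    return cpu_demand_w * scale, gpu_demand_w * scale   # power shifts toward the hungrier side

print(allocate(60, 120))  # light frame: (60, 120) - both fully fed
print(allocate(90, 140))  # heavy AVX + heavy GPU load: (~78, ~122) - the budget gets rebalanced
```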


#111 briguyb13
Member since 2007 • 5491 Posts

The TC never confirmed his claim with sources, yet this thread is still open. That tells you a lot about how desperately SW needs posters and traffic.


#112 BlackShirt20
Member since 2005 • 2493 Posts

@Zero_epyon: Exactly. Per Cerny at The Road to PS5, the GPU can manage those clock speeds as long as there is CPU overhead. They will not manage max clocks together. Also, we now have reports that the PS5 needs to adjust fan speeds based on user feedback because the system runs hot.

Don’t be shocked if the clock speeds drop because the system can’t manage to contain the heat.


#113 Zero_epyon
Member since 2004 • 15992 Posts

@BlackShirt20 said:

@Zero_epyon: Exactly. Per Cerny at The Road to PS5, the GPU can manage those clock speeds as long as there is CPU overhead. They will not manage max clocks together. Also, we now have reports that the PS5 needs to adjust fan speeds based on user feedback because the system runs hot.

Don’t be shocked if the clock speeds drop because the system can’t manage to contain the heat.

I think you might have misunderstood my post a bit. The GPU and CPU can run at max at the same time. This is what Cerny called the "worst case" scenario when rendering a frame, because that's when a frame needs both the CPU and GPU running at max frequencies to maintain performance targets. This means that these frames draw the most power and can produce the most heat. But they don't anticipate this happening all the time, and that's when the shifting will happen.

As for your point about the fans, I think Sony is preparing for both scenarios. There might be a game that doesn't produce a lot of heat, and they can tune the fans to be quieter where they were probably spinning faster than needed.

There can also be a game where the system doesn't spin the fans up enough, and they'll have to adjust it. I think building this into their system is pretty damn awesome, because we won't be stuck with a jet engine if it thinks a game will be a heat maker when it isn't.

Last point: clock speeds are not tied to thermals. That's the point of SmartShift. Clock speeds are affected by power needs, not by how hot the chip is. The thermal management system in the PS5 just cares about chip temp and fan speed, not about throttling the CPU/GPU.


#114 GirlUSoCrazy
Member since 2015 • 10976 Posts

@Zero_epyon: This exactly


#115 BlackShirt20
Member since 2005 • 2493 Posts

@Zero_epyon: https://youtu.be/KfM_nTTxftE

5min mark on.

Won’t happen.

Also, the PS5's bottleneck is its bandwidth. Those high clocks are not helping but hurting the GPU, because it's not gonna put out the performance it would have if it had more CUs.


#116  Edited By Zero_epyon
Member since 2004 • 15992 Posts

@BlackShirt20 said:

@Zero_epyon: https://youtu.be/KfM_nTTxftE

5min mark on.

Won’t happen.

Also, the PS5's bottleneck is its bandwidth. Those high clocks are not helping but hurting the GPU, because it's not gonna put out the performance it would have if it had more CUs.

Richard made that claim based on comments he received from devs saying they used profiles that override SmartShift and shifted most of the power to the GPU so that the GPU clock stayed at 2.23 GHz. We learned shortly after that this is a feature only available to devs on dev kits. SmartShift will take unused power away from the GPU and distribute it to the CPU if it needs it.

EDIT: Also note that in the same video, Richard confirms from Cerny that the CPU and GPU will, most of the time, run at their typical clock speeds, and any reduction in speed will be minimal.

As for bandwidth bottleneck, where are you getting that from?


#117 Sagemode87
Member since 2013 • 2038 Posts

@Zero_epyon: out of his a$$, just like everything else Lems pull about PS5 specs.


#118 pyro1245
Member since 2003 • 7159 Posts

show us the frame rate graphs!


#119  Edited By lundy86_4
Member since 2003 • 57751 Posts

@tormentos said:

@lundy86_4:

So you continue to run damage control for the Xbox? It's kind of pathetic at this point.

Here we are on a site where virtually all lemmings have claimed there would be a difference in resolution between both platforms, where people outside the lemming circle have claimed that as well, and now all of a sudden the PS5 is said to run at the same resolution and frames, and somehow people run and hide behind settings? When most consoles have shown resolution or frame-rate disparities, rather than settings, to mark the difference?

I hope there are some disparities between both consoles, because it didn't take the PS4 or the Xbox One X much time to show their muscle over the competition; in fact it was instantaneous, and 4K vs 1800p, 720p vs 1080p, it was there.

Now I simply don't see the gap.

You spewed a lot of nonsense, and none of it addressed my post. TBH tormy, you calling out some assumed leaning doesn't do much for your rep. You spew verbal diarrhea on the regular.

I'll simply touch on the fact that I am supposedly running DC for the Xbox, when the TC didn't substantiate their claim... But you do you.


#120 lundy86_4
Member since 2003 • 57751 Posts

@lundy86_4 said:
@Sagemode87 said:

@lundy86_4: considering the PS4 and Xbox One had the same settings in games, with a bigger power difference, you can easily assume it'll be the same settings here with a MUCH smaller difference. Thanks.

So... No. Thus this thread is worthless. I'd like to thank you, mate.

Did you prove your OP yet?


#121 JoshRMeyer
Member since 2015 • 11339 Posts

@Zero_epyon: You explain things well. Some people read what they want to read though.


#122 Juub1990
Member since 2013 • 10626 Posts
@pc_rocks said:

10.28 vs. 10.3 is not in question. I specifically asked where Sony/Cerny claimed that the PS5 is 10.3 TF (or, for that matter, 9.2), which is what the TC claimed.

In The Road to PS5, Cerny said they had to cap the frequency to 2.23GHz (presumably, it could have gone higher) and that he expects the frequency to be most of the time at the cap or close to it. He showed 10.3 TFLOPs on the screen, claiming this was the max theoretical number.


#123 BlackShirt20
Member since 2005 • 2493 Posts

@Zero_epyon: It’s all in the video. 🤓


#124  Edited By Sagemode87
Member since 2013 • 2038 Posts

@lundy86_4: Where did you get the notion that the settings would be different? Resolution and framerate would change before settings. Have any proof to dispute that trend? Name a console game that announces 4K60 at High, Medium, etc. You can't. Games this gen didn't do it, and games next gen won't do it. Post something serious or be done. Thanks.


#125 lundy86_4
Member since 2003 • 57751 Posts

@Sagemode87 said:

@lundy86_4: Where did you get the notion that the settings would be different? Resolution and framerate would change before settings. Have any proof to dispute that trend?

You made the initial claim, thus you are obligated to provide the proof. Are you all there?

Prove your claim, or just f*ck off. Either is fine with me.


#126  Edited By Sagemode87
Member since 2013 • 2038 Posts

@lundy86_4: you sound delusional. No game ever announces "4K60, High settings". Not on console. Gtfo of my thread, bruh. You want me to prove something that's been a trend for years, lmao. Deal with the fact that MS overhyped their console.


#127 lundy86_4
Member since 2003 • 57751 Posts

@Sagemode87 said:

@lundy86_4: you sound dumb. No game ever announces "4K60, High settings" gtfo of my thread bruh. Deal with the fact that MS over hyped their console.

I sound dumb? I'm not the one who can't prove my own claim...


#128  Edited By Sagemode87
Member since 2013 • 2038 Posts

@lundy86_4: Not pulling up a link for common sense. The settings for Pro and X were the same and the only differences were framerate and resolution. With a much smaller difference here, you shouldn't expect high settings for one and medium settings for the other. Get some common sense. I'm done with you now.


#129  Edited By lundy86_4
Member since 2003 • 57751 Posts

@Sagemode87 said:

@lundy86_4: you sound delusional. No game ever announces "4K60, High settings". Not on console. Gtfo of my thread, bruh. You want me to prove something that's been a trend for years, lmao. Deal with the fact that MS overhyped their console.

I see you edited, so allow me to add:

Prove your point. Provide evidence. Do the bare minimum to back up your argument. Stop side-stepping the point of contention. Do the very least of what is required in a debate, which is providing your evidence.


#130 Sagemode87
Member since 2013 • 2038 Posts

@lundy86_4: I proved my claim. All those games run 4k60. What's next, you're going to ask the color temperature? Gtfo.


#131 lundy86_4
Member since 2003 • 57751 Posts

@Sagemode87 said:

@lundy86_4: Not pulling up a link for common sense. If the settings for Pro and X were the same and the only differences were framerate and resolution. With a much smaller difference here, you shouldn't expect high settings for one and medium settings for the other. Get some common sense. I'm done with you now.

Above^^


#132 Sagemode87
Member since 2013 • 2038 Posts

@lundy86_4: learn to read and click the links. Not my problem you can't figure it out. Sorry.


#133 lundy86_4
Member since 2003 • 57751 Posts

@Sagemode87 said:

@lundy86_4: I proved my claim. All those games run 4k60.

You said they run the same. There's no evidence of that. 4K60 with mixed High settings is not the same as 4K60 with mixed High-medium-Low. So, are they running the same or no? Where's your evidence?


#134 Zero_epyon
Member since 2004 • 15992 Posts

@BlackShirt20 said:

@Zero_epyon: It’s all in the video. 🤓

Yeah, in the video they managed to increase the RAM speed by 9% and only managed to squeeze out an extra 1% on average. The Series X has 10 GB of memory that's about 20% faster than the PS5's 16 GB of memory, thanks to a wider bus. So if the same memory scaling holds in the next-gen consoles, the PS5 would only perform about 2-3% better if it had the extra memory speed. Not really much of a bottleneck.
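
For reference, here's where the headline bandwidth figures come from; a quick sketch using the publicly quoted 14 Gbps GDDR6 and bus widths (the Series X split-pool behaviour is simplified to the two quoted numbers):

```python
# GDDR6 bandwidth (GB/s) = per-pin rate (Gbps) * bus width (bits) / 8
def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gbs(14, 256))  # 448.0 GB/s - PS5, uniform across all 16 GB
print(bandwidth_gbs(14, 320))  # 560.0 GB/s - Series X, the 10 GB "GPU optimal" pool
print(bandwidth_gbs(14, 192))  # 336.0 GB/s - Series X, the remaining 6 GB
```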


#135 Zero_epyon
Member since 2004 • 15992 Posts

@joshrmeyer said:

@Zero_epyon: You explain things well. Some people read what they want to read though.

Thanks. Unfortunately yes.


#136 BlackShirt20
Member since 2005 • 2493 Posts

@Zero_epyon: Almost, yes. See, the higher they clock the CUs, the less performance they get back. Made worse by the fact that they cannot increase the bandwidth.

XSX has such a performance advantage it's almost comical. Once developers start focusing on the new tech, XSX will establish itself as the premiere console of the generation. At least for the first 3-4 years, until the Series Elite launches.


#137 WitIsWisdom
Member since 2007 • 6918 Posts

@regnaston said:
@WitIsWisdom said:

Very few if any devs are going to take the time to truly optimize games on the XSX over the PS5 especially since the games have to run on the XSS. Most multiplats will look and play almost identically with perhaps some better ray tracing or special effects. Later in the generation it may be evident in exclusive games but time will tell.

Just the 23 Developers within MS Game Studios I guess :D

Yep, that's why I said exclusive games. Not all 23 studios will put out games that look better than anything else, but a few will probably do their damnedest. Hellblade 2, if they don't downgrade it, is looking incredible and easily better than anything else I have seen on console (and by no small margin; that game looks incredible). The question is, how long will it take until we see some of these games? 2-4 years? At that point the question is whether one or both will have mid-gen refreshes, which would kind of detract from the point.
I am interested to see how things play out; it just sucks that we can't know now instead of having to wait and see for ourselves.


#138  Edited By Zero_epyon
Member since 2004 • 15992 Posts

@BlackShirt20 said:

@Zero_epyon: Almost, yes. See, the higher they clock the CUs, the less performance they get back. Made worse by the fact that they cannot increase the bandwidth.

XSX has such a performance advantage it's almost comical. Once developers start focusing on the new tech, XSX will establish itself as the premiere console of the generation. At least for the first 3-4 years, until the Series Elite launches.

That might be true in some cases. But take a look at this:

https://www.pcgamesn.com/amd/radeon-rx-5700-unlock-overclock-undervolt

Essentially, when you remove the artificial limits of the RX 5700 and clock it higher, it can perform as well if not better than the 5700 XT at stock speeds with the extra CUs. More examples here:

[Embedded video]

AMD knows this. That's why AMD had the software limits in place so the 5700 and the 5700 XT didn't perform identically while both were on the market at different prices. Sony and MS most likely knew this as well but they obviously chose different strategies. But the above demonstrate that a higher clocked lower CU count GPU can rival a lower clocked higher CU count card of the same architecture.

DF did a quick test but I doubt they went as far as using the MorePowerTool. Instead, they downclocked both cards and did the comparisons there. Not really what's happening between the two consoles.

The memory bandwidth is a non issue really. ~2% difference in performance is negligible and couple that with the fact that there's only 10GB of memory that fast on the Series X while the other 6 are slower than the PS5. Split memory architectures are always less than ideal, but we'll see how devs work around that.

So don't expect the differences to be night and day. I predict that Xbox will win a few and PS5 will win a few, but in either case, the differences will be minor. Oh and don't forget that the tech in the Series X is the exact same tech in the PS5. Except one is clocked higher.
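
One way to see why raw TFLOPs don't tell the whole story: compute scales with CUs × clock, but fixed-function throughput like pixel fill rate scales with clock alone (per ROP). A rough sketch, assuming the commonly reported 64 ROPs on each console:

```python
# Compute scales with CU count * clock; pixel fill rate scales with ROPs * clock.
# 64 ROPs per console is an assumption based on commonly reported specs.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000   # FP32 TFLOPs

def fill_rate(rops, clock_ghz):
    return rops * clock_ghz                   # Gpixels/s

print(tflops(36, 2.23),  fill_rate(64, 2.23))   # PS5:      ~10.28 TF, ~142.7 Gpix/s
print(tflops(52, 1.825), fill_rate(64, 1.825))  # Series X: ~12.15 TF, ~116.8 Gpix/s
```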


#139 I_P_Daily
Member since 2015 • 20711 Posts

@SecretPolice said:

That's funny, I'm gonna steal this for future use LOL.


#140  Edited By BlackShirt20
Member since 2005 • 2493 Posts

@Zero_epyon: Yep. I have seen that video many times. It's all Torm ever posts during his long essays as he has a total meltdown.

Here is the problem with your argument and video that for some reason you are not understanding. Whenever people point this out to your friend, who also uses those videos, he always seems to vanish...

You ready? DF ran the test using a CPU that is near identical to what will be in the next-generation consoles. That is key.

So when I say it's all in the video, I mean it's all in the video. If you watch from 5 minutes on, you'll see that overclocking that GPU is going to get them less performance. They have a massive CU and memory bandwidth disadvantage. Now, will games look noticeably better on XSX? Maybe. Will games have better performance? Definitely.

The card inside the PS5 is a 2060 Super (or a 2070, take your pick).

The card inside the XSX is a 2080 Super (it's actually better, but let's just say 2080 Super).

Now go look at the performance between those cards and downplay the power difference.


#141 Bluestars
Member since 2019 • 2178 Posts

@tormentos:

It's a huge gap; I wanna see it.


#142 SecretPolice
Member since 2007 • 37442 Posts

:P


#143  Edited By PC_Rocks
Member since 2018 • 5000 Posts

@Zero_epyon said:
@BlackShirt20 said:

@tormentos: Still waiting for you to explain how variable CPU and GPU speeds will manage to run at max clock speeds like XSX as you consistently claim every time you bring this up.

The CPU and GPU have variable speeds because SmartShift adjusts power based on a power budget. The shift in power can happen multiple times within a single frame, depending on the frame's complexity, and can affect the frequency of the GPU or the CPU. From what I've learned about all this, the power budget for the chip is enough for both the CPU and GPU to run at 100% frequency at the same time.

The shift happens when one component needs more power (AVX CPU instructions, for example) while the GPU doesn't need as much. Instead of feeding the GPU power it would leave idle, that power is shifted to the CPU rather than being drawn on top of the budget. This keeps thermals and power consumption down.

Also, when the shift happens the CPU or GPU is undervolted, not throttled down. Undervolting AMD CPUs and GPUs doesn't translate into a linear decrease in frequency, so even when some power is shifted away, the result is literally a few MHz for a millisecond or two.

Hopefully that clears things up.

This is factually false, or else there wouldn't be a need for SmartShift. People saying that should know that SmartShift isn't new and has already been used in laptops. Why would you shift the power, and to where, if both can run at max frequency at the same time? Put it another way: why would a component need more power when it's already running at full power? What you're describing is an oxymoron.

Lastly, if none of the factual arguments sway you Sony people, we already have devs saying they are throttling the CPU to make the GPU run at full clocks. No matter how much mental gymnastics Cerny or the cows do, the fact remains that the CPU and GPU can't run at full clocks at the same time.

As for the "that's only available on dev kits" excuse that Cerny provided back then, why did he refuse to answer the question on base clocks when DF asked him point blank? He could have simply said that 2.23 and 3.5 GHz are the base clocks.


#144 PC_Rocks
Member since 2018 • 5000 Posts
@Juub1990 said:
@pc_rocks said:

10.28 vs. 10.3 is not in question. I specifically asked where Sony/Cerny claimed that the PS5 is 10.3 TF (or, for that matter, 9.2), which is what the TC claimed.

In The Road to PS5, Cerny said they had to cap the frequency to 2.23GHz (presumably, it could have gone higher) and that he expects the frequency to be most of the time at the cap or close to it. He showed 10.3 TFLOPs on the screen, claiming this was the max theoretical number.

Not my point. I asked where Sony/Cerny claimed the PS5 to be a 10.3 TFLOPs machine. They called it the peak performance, not the sustained or base performance. What's the number when the GPU or CPU downclocks due to power-budget distribution across workloads?


#145  Edited By PC_Rocks
Member since 2018 • 5000 Posts
@Zero_epyon said:
@BlackShirt20 said:

@Zero_epyon: Almost, yes. See, the higher they clock the CUs, the less performance they get back. Made worse by the fact that they cannot increase the bandwidth.

XSX has such a performance advantage it's almost comical. Once developers start focusing on the new tech, XSX will establish itself as the premiere console of the generation. At least for the first 3-4 years, until the Series Elite launches.

That might be true in some cases. But take a look at this:

https://www.pcgamesn.com/amd/radeon-rx-5700-unlock-overclock-undervolt

Essentially, when you remove the artificial limits of the RX 5700 and clock it higher, it can perform as well if not better than the 5700 XT at stock speeds with the extra CUs. More examples here:

[Embedded video]

AMD knows this. That's why AMD had the software limits in place so the 5700 and the 5700 XT didn't perform identically while both were on the market at different prices. Sony and MS most likely knew this as well but they obviously chose different strategies. But the above demonstrate that a higher clocked lower CU count GPU can rival a lower clocked higher CU count card of the same architecture.

DF did a quick test but I doubt they went as far as using the MorePowerTool. Instead, they downclocked both cards and did the comparisons there. Not really what's happening between the two consoles.

The memory bandwidth is a non issue really. ~2% difference in performance is negligible and couple that with the fact that there's only 10GB of memory that fast on the Series X while the other 6 are slower than the PS5. Split memory architectures are always less than ideal, but we'll see how devs work around that.

So don't expect the differences to be night and day. I predict that Xbox will win a few and PS5 will win a few, but in either case, the differences will be minor. Oh and don't forget that the tech in the Series X is the exact same tech in the PS5. Except one is clocked higher.

Again, factually false, and cherry-picking workloads that don't stress the CUs. If higher clocks were a substitute for more SMs/CUs, as Cerny claimed, then none of the top GPUs would have more cores and lower clocks, which is generally how it goes on PC. Apart from pixel rate, none of the graphics workloads scale well with clocks over cores. It's baffling that suddenly Cerny and cows know more about GPUs than Nvidia and AMD combined, with their decades of designing GPUs.

Lighting, RT, geometry/tessellation, and geometry culling all scale well with cores, especially RT, which is also significantly heavy on bandwidth. Another disadvantage of higher clocks over core count is latency/access time for memory. In other words, favouring higher clocks over more cores needs even higher memory bandwidth, which the PS5 already has less of.


#146 Zero_epyon
Member since 2004 • 15992 Posts

@pc_rocks said:

This is factually false, or else there wouldn't be a need for SmartShift. People saying that should know that SmartShift isn't new and has already been used in laptops. Why would you shift the power, and to where, if both can run at max frequency at the same time? Put it another way: why would a component need more power when it's already running at full power? What you're describing is an oxymoron.

Lastly, if none of the factual arguments sway you Sony people, we already have devs saying they are throttling the CPU to make the GPU run at full clocks. No matter how much mental gymnastics Cerny or the cows do, the fact remains that the CPU and GPU can't run at full clocks at the same time.

As for the "that's only available on dev kits" excuse that Cerny provided back then, why did he refuse to answer the question on base clocks when DF asked him point blank? He could have simply said that 2.23 and 3.5 GHz are the base clocks.

First, SmartShift is literally new; only one laptop currently has it, and there won't be more until next year.

Second, both the CPU and GPU can absolutely be boosted at the same time. I don't know where you guys are getting that they can't. How do I know? Well, because AMD says so:

https://www.amd.com/en/technologies/smartshift

A new interface within AMD Radeon Software Adrenalin 2020 Edition makes it easy to see how power is being shifted to the CPU and GPU.​

Unlike other implementations, AMD SmartShift can boost both components during the same workload.

The point of SmartShift is to effectively shift power between the CPU and GPU when necessary. If the CPU is working on power-intensive operations, instead of throttling the CPU to keep it cool, it will get the power that would have gone to the GPU if the GPU is idle. And vice versa. But for normal operations where both the GPU and CPU are being worked, they will be power-boosted at the same time.

Dev kits let you override the shifter, but we all know that in production SmartShift uses machine learning to auto-adjust. Perhaps devs are given that ability to run experiments. I don't know. I also don't know why Cerny didn't want to mention a base clock, but it's probably because it's a moot point since the clocks stay boosted until something comes along that demands too much from either component.


#147 Pedro
Member since 2002 • 46313 Posts

SmartShift is used to balance performance within a set power budget. It is only present because both components cannot be boosted to the max at the same time. At the moment, the effectiveness of the tech has not been validated, and the non-boosted performance of the PS5 has not been disclosed. Sony was legally obligated to state variable frequency and max frequency because the frequency is not always running at max, despite claims.


#148 Zero_epyon
Member since 2004 • 15992 Posts
@BlackShirt20 said:

@Zero_epyon: Yep. I have seen that video many times. It's all Torm ever posts during his long essays as he has a total meltdown.

Here is the problem with your argument and video that for some reason you are not understanding. Whenever people point this out to your friend, who also uses those videos, he always seems to vanish...

You ready? DF ran the test using a CPU that is near identical to what will be in the next-generation consoles. That is key.

So when I say it's all in the video, I mean it's all in the video. If you watch from 5 minutes on, you'll see that overclocking that GPU is going to get them less performance. They have a massive CU and memory bandwidth disadvantage. Now, will games look noticeably better on XSX? Maybe. Will games have better performance? Definitely.

The card inside the PS5 is a 2060 Super (or a 2070, take your pick).

The card inside the XSX is a 2080 Super (it's actually better, but let's just say 2080 Super).

Now go look at the performance between those cards and downplay the power difference.

Can you elaborate on why the CPU is key to the comparison? The CPU isn't a factor, as the scene rendered in the test is GPU-bound, as Richard points out. In the benchmarks in the video, the same CPU was used for all comparisons at varying resolutions.

You keep saying the same things despite the video showing the contrary. And now you're saying the Series X is a 2080 Super? Yeah, I can't really take this conversation seriously after that.


#149 Zero_epyon
Member since 2004 • 15992 Posts

@pc_rocks said:
@Zero_epyon said:

That might be true in some cases. But take a look at this:

https://www.pcgamesn.com/amd/radeon-rx-5700-unlock-overclock-undervolt

Essentially, when you remove the artificial limits of the RX 5700 and clock it higher, it can perform as well if not better than the 5700 XT at stock speeds with the extra CUs. More examples here:

AMD knows this. That's why AMD had the software limits in place so the 5700 and the 5700 XT didn't perform identically while both were on the market at different prices. Sony and MS most likely knew this as well but they obviously chose different strategies. But the above demonstrate that a higher clocked lower CU count GPU can rival a lower clocked higher CU count card of the same architecture.

DF did a quick test but I doubt they went as far as using the MorePowerTool. Instead, they downclocked both cards and did the comparisons there. Not really what's happening between the two consoles.

The memory bandwidth is a non issue really. ~2% difference in performance is negligible and couple that with the fact that there's only 10GB of memory that fast on the Series X while the other 6 are slower than the PS5. Split memory architectures are always less than ideal, but we'll see how devs work around that.

So don't expect the differences to be night and day. I predict that Xbox will win a few and PS5 will win a few, but in either case, the differences will be minor. Oh and don't forget that the tech in the Series X is the exact same tech in the PS5. Except one is clocked higher.

Again, factually false, and cherry-picking workloads that don't stress the CUs. If higher clocks were a substitute for more SMs/CUs, as Cerny claimed, then none of the top GPUs would have more cores and lower clocks, which is generally how it goes on PC. Apart from pixel rate, none of the graphics workloads scale well with clocks over cores. It's baffling that suddenly Cerny and cows know more about GPUs than Nvidia and AMD combined, with their decades of designing GPUs.

Lighting, RT, geometry/tessellation, and geometry culling all scale well with cores, especially RT, which is also significantly heavy on bandwidth. Another disadvantage of higher clocks over core count is latency/access time for memory. In other words, favouring higher clocks over more cores needs even higher memory bandwidth, which the PS5 already has less of.

[Embedded video]

Sure, if you didn't like the games in the first video and the ones tested here: https://www.pcgamesn.com/amd/radeon-rx-5700-unlock-overclock-undervolt then check out the one above, which uses games like RDR2, FH4, Tomb Raider, and The Witcher 3.

An overclocked, MorePowerTool-edited 5700 can perform as well as, or better than, a stock 5700 XT with its extra CUs. This has been common knowledge amongst PC enthusiasts since last year. Also, as Richard from DF states, overclocking memory on AMD cards doesn't always translate into a significant boost to performance. In his test, he overclocked the memory by 9% but got back only a 1% difference in performance.


#150 The6millionLies
Member since 2020 • 564 Posts

If games are designed around the DX API, the PS5 (Oberon) is going to have a hard time; Oberon doesn't support the DX API. Yeah, I know OpenGL is just as good, but considering some developers are just money grabbers, they won't put much effort into creating equal versions, and if the PS5 presents some difficulties, even less effort.