Seems multiplats will be the same

Avatar image for regnaston
regnaston

4681

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#151 regnaston
Member since 2008 • 4681 Posts

@Zero_epyon: However, the CU count difference between the 5700 and the 5700 XT is only 10%, while the difference in CU count between the PS5 and the XSX is just over 30%. If the core counts were closer on the PS5 vs the XSX, then this might be a useful video.
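For what it's worth, the percentages being argued here check out on a napkin. A quick sketch (the CU counts are the commonly cited figures, not official spec sheets: 36 for the RX 5700 and PS5, 40 for the 5700 XT, 52 for the XSX):

```python
# Rough sketch: percentage CU deficit of the smaller part relative to the larger.
# CU counts below are the commonly cited figures, used here only for illustration.
def cu_deficit(small_cu: int, large_cu: int) -> float:
    """Return how many percent fewer CUs the smaller part has."""
    return (large_cu - small_cu) / large_cu * 100

print(f"5700 vs 5700 XT: {cu_deficit(36, 40):.1f}% fewer CUs")  # 10.0%
print(f"PS5 vs XSX:      {cu_deficit(36, 52):.1f}% fewer CUs")  # ~30.8%
```

By this measure the PS5/XSX gap is roughly three times the 5700/5700 XT gap, which is the poster's point.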


#152 Zero_epyon
Member since 2004 • 20103 Posts

@regnaston said:

@Zero_epyon: However, the CU count difference between the 5700 and the 5700 XT is only 10%, while the difference in CU count between the PS5 and the XSX is just over 30%. If the core counts were closer on the PS5 vs the XSX, then this might be a useful video.

This is true. And I'm not saying that the difference isn't there. What I'm saying is that the difference won't be as significant as some are expecting. Some are expecting 2080 Super vs 2060 Super. They're setting themselves up for disappointment.


#153 regnaston
Member since 2008 • 4681 Posts

@Zero_epyon said:
@regnaston said:

@Zero_epyon: However, the CU count difference between the 5700 and the 5700 XT is only 10%, while the difference in CU count between the PS5 and the XSX is just over 30%. If the core counts were closer on the PS5 vs the XSX, then this might be a useful video.

This is true. And I'm not saying that the difference isn't there. What I'm saying is that the difference won't be as significant as some are expecting. Some are expecting 2080 Super vs 2060 Super. They're setting themselves up for disappointment.

true

I will wait until these systems are out in the wild and being tested by people like DF before I pass judgement. Even then, I think the differences will be minor and not cause for concern.

The PS5 may load some games a second or two faster and the XSX may have slightly better graphics in some multiplats, but I think most non-anal people will be able to enjoy games on whatever system they choose.


#154  Edited By tormentos
Member since 2003 • 33784 Posts

@regnaston:

Bullshit, and even more so if the rumors about RDNA2 turn out to be true and it reaches 2.4GHz. That would prove beyond doubt that part of RDNA2's performance comes not from CU count but from much higher clock speeds.

From 2.4GHz to 1.8GHz there is some 600MHz of difference.

Performance is not just CUs, it's also the speed of those CUs. Again, if you downclock the Series X to 1GHz it is not a 12TF machine any more, even with 52 CUs.

So yeah, the 5700 can beat the 5700 XT. But just so you know, the gap between the stock 5700 and the 5700 XT is like 22%, and the gap in performance between the two is minimal.
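The TFLOPs arithmetic behind this argument is straightforward. A sketch, assuming the standard RDNA figures of 64 shaders per CU and 2 FP32 ops per shader per clock (the console clocks used below are the officially quoted peaks):

```python
def tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64, ops_per_cycle: int = 2) -> float:
    """Theoretical FP32 TFLOPs = CUs * shaders per CU * ops per cycle * clock (GHz) / 1000."""
    return cus * shaders_per_cu * ops_per_cycle * clock_ghz / 1000

print(f"XSX (52 CU @ 1.825 GHz):  {tflops(52, 1.825):.2f} TF")  # ~12.15
print(f"PS5 (36 CU @ 2.23 GHz):   {tflops(36, 2.23):.2f} TF")   # ~10.28
print(f"XSX downclocked to 1 GHz: {tflops(52, 1.0):.2f} TF")    # ~6.66
```

This is exactly the poster's downclocking point: halve the clock and the same 52 CUs stop being a 12TF part.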


#155 Chutebox
Member since 2007 • 50556 Posts

@regnaston said:
@Zero_epyon said:
@regnaston said:

@Zero_epyon: However, the CU count difference between the 5700 and the 5700 XT is only 10%, while the difference in CU count between the PS5 and the XSX is just over 30%. If the core counts were closer on the PS5 vs the XSX, then this might be a useful video.

This is true. And I'm not saying that the difference isn't there. What I'm saying is that the difference won't be as significant as some are expecting. Some are expecting 2080 Super vs 2060 Super. They're setting themselves up for disappointment.

true

I will wait until these systems are out in the wild and being tested by people like DF before I pass judgement. Even then, I think the differences will be minor and not cause for concern.

The PS5 may load some games a second or two faster and the XSX may have slightly better graphics in some multiplats, but I think most non-anal people will be able to enjoy games on whatever system they choose.

Brah, SW is all about the anal.

Wait, shit!

Wait...


#156 regnaston
Member since 2008 • 4681 Posts

@Chutebox: the post above yours proves that, both in terms of the poster's mental makeup and his sexual desires


#157 Zero_epyon
Member since 2004 • 20103 Posts

@Chutebox said:
@regnaston said:
@Zero_epyon said:
@regnaston said:

@Zero_epyon: However, the CU count difference between the 5700 and the 5700 XT is only 10%, while the difference in CU count between the PS5 and the XSX is just over 30%. If the core counts were closer on the PS5 vs the XSX, then this might be a useful video.

This is true. And I'm not saying that the difference isn't there. What I'm saying is that the difference won't be as significant as some are expecting. Some are expecting 2080 Super vs 2060 Super. They're setting themselves up for disappointment.

true

I will wait until these systems are out in the wild and being tested by people like DF before I pass judgement. Even then, I think the differences will be minor and not cause for concern.

The PS5 may load some games a second or two faster and the XSX may have slightly better graphics in some multiplats, but I think most non-anal people will be able to enjoy games on whatever system they choose.

Brah, SW is all about the anal.

Wait, shit!

Wait...


#158 Sagemode87
Member since 2013 • 3416 Posts

Dirt 5, 4K 120 modes. Not seeing that power difference, Lems ... https://mobile.twitter.com/dirtgame/status/1318582859918180352?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1318582859918180352%7Ctwgr%5Eshare_3%2Ccontainerclick_1&ref_url=https%3A%2F%2Fs9e.github.io%2Fiframe%2F2%2Ftwitter.min.html1318582859918180352


#159 The6millionLies
Member since 2020 • 564 Posts

@Sagemode87: Haha lol you think Dirt is going to be native 4K on PS5


#160  Edited By Sagemode87
Member since 2013 • 3416 Posts

@the6millionlies: lmao you guys are funny. It's not even an impressive-looking game. The Lems' narrative is just one big troll job at this point. I'm sorry you guys bought Microsoft's hype.


#161  Edited By Pedro
Member since 2002 • 69448 Posts

@Sagemode87 said:

@the6millionlies:... I'm sorry you guys bought Microsoft's hype.

It reminds me of how your faction bought into the hype of the SSD. 😎


#162  Edited By Zero_epyon
Member since 2004 • 20103 Posts

@Sagemode87 said:

@the6millionlies: lmao you guys are funny. It's not even an impressive-looking game. The Lems' narrative is just one big troll job at this point. I'm sorry you guys bought Microsoft's hype.

According to DF, Dirt 5 has a dynamic scaler in play for both 4K/60 modes. The 120Hz mode rendered at 1440p, with drops into the 90s for them (1440p was due to their display's limitations).

So it doesn't look like either version is going to guarantee a locked 4K/60, and that seems like it's the devs' fault.
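For anyone unfamiliar with dynamic scalers: a controller of this kind just nudges the render resolution frame by frame to protect the frame-time budget. A hypothetical, simplified sketch (the thresholds, step size, and bounds are invented for illustration, not taken from Dirt 5):

```python
def adjust_resolution(scale: float, frame_ms: float, budget_ms: float = 16.7,
                      step: float = 0.05, lo: float = 0.6, hi: float = 1.0) -> float:
    """Shrink the render scale when a frame blows its budget, grow it when there's headroom."""
    if frame_ms > budget_ms:          # missed the 60fps target -> drop resolution
        scale -= step
    elif frame_ms < budget_ms * 0.9:  # comfortable headroom -> raise resolution
        scale += step
    return min(hi, max(lo, scale))    # clamp to the allowed resolution window

# e.g. a 20 ms frame at full 4K scale pulls the next frame down ~5% of resolution
print(adjust_resolution(1.0, 20.0))
```

The upshot is that neither console "guarantees" a resolution under a scaler like this; the resolution is whatever the frame-time budget allows.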


#163 dabear
Member since 2002 • 8853 Posts

@Sagemode87 said:

Valhalla runs the same on PS5 as it does XSex, same for Watch Dogs. Two open-world games. Also the same for COD. Where's this "HUGE" advantage, Lems? Seems you're running out of things to hype. The most powerful console won't even show its superiority at launch. The PS4 instantly had resolution differences at launch. Wonder what the excuses will be.

Lems said the same thing about the PS4 and the Xbox One (launch SKU).

Oh, how history repeats itself...


#164 robert_sparkes
Member since 2018 • 7231 Posts

The DF comparisons are definitely going to be interesting on the upcoming multiplats.


#165 Zero_epyon
Member since 2004 • 20103 Posts

Cherno is rendered nearly speechless

[Embedded video]

But not in a good way...


#166 Pedro
Member since 2002 • 69448 Posts

@Zero_epyon: Why should anyone care about this guy?


#167 Zero_epyon
Member since 2004 • 20103 Posts

@Pedro said:

@Zero_epyon: Why should anyone care about this guy?

He's been running an educational YouTube channel about software engineering, specifically game development, for the last 8 years. He also spent 4 years as a software engineer at EA, and he's currently building a game engine on his own while continuing his educational videos. He provides some good insight as an actual game developer.


#168 Sagemode87
Member since 2013 • 3416 Posts

@dabear: How? Games ran at clearly different resolutions. That's not the case here. What difference will you be happy with out of 16 to 18 percent? The desperation is insane.


#169 BlackShirt20
Member since 2005 • 2631 Posts

@Zero_epyon: https://youtu.be/4cmclCjS5jg

https://youtu.be/4cmclCjS5jg

You don’t have to take this conversation seriously. But it’s true. The GPU inside the Xbox Series X is about a 2080 Super.

The power differences will be noticeable and XSX will have a massive performance advantage.


#170 Gifford38
Member since 2020 • 7162 Posts

@hardwenzen said:
@Sagemode87 said:

@hardwenzen: I think it's time to admit there is no 30 percent difference, bud. We know that you're just trying to make the 16 percent difference bigger than it is. Still running with the 9.2TF narrative, huh? You're going to have to accept it's 10.3, bro. Why isn't Xbox showing its massive difference? You realize the PS4 was 50 percent stronger than the Xbox One, right? The difference was there, but it was far from night and day. The more the numbers go up, the more diminishing returns you get. Sorry, you'll have to find something else to hype. The PS5 also does some things better than Xbox.

I see you have a lot of trouble accepting that 30%. It will be a very difficult generation for you. I wouldn't want to be in your shoes, tbh.

The PS5 won't do anything better than the XSX, unless 1-2sec faster loading is what you consider an advantage worth paying attention to. Let's pretend 30% less powerful isn't much, but 1-2sec faster loading in a 5sec loading screen is such a big deal 😱

No, do the math, it's not 30%. The math does not add up to 30%, so how is it we need to accept it?

Fact is, fanboys made up the 30% because of the 9.2 teraflops figure. It's not 9.2, it's 10.3.

Fact is, the PS5 loads faster, who cares. 20% more powerful, who cares. Sony first party is going to create some great-looking games next gen, are they not?


#171 PC_Rocks
Member since 2018 • 8470 Posts

@Zero_epyon said:
@pc_rocks said:

This is factually false, or else there wouldn't be a need for SmartShift. People saying that should know that SmartShift isn't new and has already been used on laptops. Why would you shift power, and to where, if both can run at max frequency at the same time? Put it another way: why would a component need more power when it's already running at full power? What you're describing is an oxymoron.

Lastly, if none of the factual arguments sway you Sony people's minds, then we already have devs saying they are throttling the CPU to make the GPU run at full clocks. No matter how many mental gymnastics Cerny or cows do, the fact remains that the CPU and GPU can't run at full clocks at the same time.

As for the "these were only available on dev kits" excuse that Cerny provided back then: why did he refuse to answer the question on base clocks when DF asked him point blank? He could have simply said that 2.23 and 3.5GHz are base clocks.

First, SmartShift is literally new; only one laptop currently has it, and there won't be more until next year.

Second, both the CPU and GPU can absolutely be boosted at the same time. I don't know where you guys are getting that they can't. How do I know? Well, because AMD says so:

https://www.amd.com/en/technologies/smartshift

A new interface within AMD Radeon Software Adrenalin 2020 Edition makes it easy to see how power is being shifted to the CPU and GPU.

Unlike other implementations, AMD SmartShift can boost both components during the same workload.

The point of SmartShift is to shift power between the CPU and GPU when necessary. If the CPU is working on power-intensive operations, instead of throttling the CPU to keep it cool, it will get the power that would have gone to the GPU if the GPU is idle. And vice versa. But for normal operations where both the GPU and CPU are being worked, they will be power boosted at the same time.

Dev kits let you override the shifter, but we all know that in production SmartShift uses machine learning to auto-adjust. Perhaps devs are given that ability to run experiments, I don't know. I also don't know why Cerny didn't want to mention base clocks, but it's probably because it's a moot point, since the clocks stay boosted until something comes along that might demand too much from either component.

I already read that quote, and it's quoted out of context. Nowhere did AMD say that both components will run at their max clocks at the same time. They only said both can be boosted, meaning they can be boosted as long as they are below their threshold and max power. As soon as one hits the ceiling, they can no longer be boosted simultaneously, and only one will run at max clocks.

Whether it's moot or not: if the base clocks were always at the peak, then Cerny would have said so, not avoided answering. Machine learning or manual control is again irrelevant, because no matter how you choose to control it, you will still be running just one of them at max clock at a time, and it has been proven in practice.

So you, along with your fellow cows who keep babbling and misquoting it, can tell me: if both can run at their max simultaneously, then where will the power be routed to, and from what? If both can run at max simultaneously, then why is SmartShift needed? All other mental gymnastics is irrelevant; answer the simple question.
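The whole dispute reduces to whether the shared power budget covers both components' peak draw at once. A toy model (all wattages invented for illustration; this is not AMD's actual algorithm) shows both readings of the AMD quote:

```python
def allocate(total_w: float, cpu_want_w: float, gpu_want_w: float, gpu_priority: bool = True):
    """Toy SmartShift-style split of one shared power budget between CPU and GPU.

    If both requests fit the budget, both run at full power ("boosted at the
    same time"). If not, the prioritized component gets its full request and
    the other takes what's left -- the throttling scenario being argued about.
    """
    if cpu_want_w + gpu_want_w <= total_w:
        return cpu_want_w, gpu_want_w
    if gpu_priority:
        return max(0.0, total_w - gpu_want_w), gpu_want_w
    return cpu_want_w, max(0.0, total_w - cpu_want_w)

print(allocate(200, 60, 120))  # both fully boosted: budget exceeds the sum
print(allocate(160, 60, 120))  # CPU throttled so the GPU can hold its max clock
```

Both posters can be "right" depending on the one number neither of them knows: the total budget relative to the sum of peak draws.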


#172 Gifford38
Member since 2020 • 7162 Posts

@BlackShirt20 said:

@Zero_epyon: https://youtu.be/4cmclCjS5jg

https://youtu.be/4cmclCjS5jg

You don’t have to take this conversation seriously. But it’s true. The GPU inside the Xbox Series X is about a 2080 Super.

The power differences will be noticeable and XSX will have a massive performance advantage.

So you got the PS5 numbers too, right? We know for a fact what the PS5 will be doing?

It's all guesswork without testing each one. "It might be equivalent to this or that" is all made up without testing them, because we don't know the features they have added to the custom GPUs. Same with the Series X. There is no way to compare it to PC GPUs without the machine to test it.

The Series X and PS5 will do more ray tracing than a Super, because developers will code it into their games, just like how Gears 5 has more ray tracing in it than the PC version until the update comes.


#173  Edited By Pedro
Member since 2002 • 69448 Posts

@Zero_epyon: Sorry if I don't take his input too seriously. 😎


#174 PC_Rocks
Member since 2018 • 8470 Posts

@Zero_epyon said:
@pc_rocks said:

Again factually false, and cherry-picking workloads that don't stress CUs. If higher clocks were a substitute for more SMs/CUs as Cerny claimed, then the top GPUs wouldn't have more cores at lower clocks, which is generally the case on PC. Apart from pixel rate, none of the graphics workloads scale well with clocks over cores. It's baffling that suddenly Cerny and cows know more about GPUs than Nvidia and AMD combined, with their decades of designing GPUs.

Lighting, RT, geometry/tessellation, and geometry culling all scale well with cores, especially RT, which is also significantly heavy on bandwidth. Another disadvantage of higher clocks over core count is latency/access time for memory. In other words, higher clocks with fewer cores need even higher memory bandwidth, which the PS5 already has less of.

[Embedded video]

Sure, if you didn't like the games in the first video and the ones tested here: https://www.pcgamesn.com/amd/radeon-rx-5700-unlock-overclock-undervolt then check out the one above, which uses games like RDR2, FH4, Tomb Raider, and Witcher 3.

An overclocked, MorePowerTool-edited 5700 can perform as well as or better than a stock 5700 XT with its extra CUs. This has been common knowledge since last year amongst PC enthusiasts. Also, as Richard from DF states, overclocking memory on AMD cards doesn't always translate into a significant boost to performance. In his test, he overclocked the memory by 9% but got back a 1% difference in performance.

Again missing the point. Let me break it down:

1. 5700 vs XT isn't a good comparison for PS5 vs XSX. One has a difference of just 4 CUs, while the other has 16.

2. In the video you posted the games are running at 2K or 1080p (can't really tell), but definitely not at 4K and definitely not at max performance. Once you have enough headroom due to low resolution it won't show much difference. This can be confirmed from the lowest-FPS differential.

3. The better comparison would be the 7970XT with the 7870 OC or something like that. I forget the actual two models, but the OC version has higher TFLOPs with fewer CUs, and it didn't perform well in games.

4. Add RT and the difference will be higher. Also, I'm pretty sure Richard didn't test RT's effect on memory bandwidth. Regardless, at higher clocks with fewer cores, memory becomes a bottleneck.

5. Lastly, if Cerny is such a genius on why higher clocks are better, then do tell why Nvidia and AMD give their flagship GPUs more cores and faster memory rather than higher clocks. Suddenly Cerny knows more than the actual creators of GPUs.
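The memory-bandwidth point can be put in rough numbers using the publicly stated figures (448 GB/s for the PS5; 560 GB/s for the XSX's fast 10 GB pool; the theoretical peak TFLOPs of each). Bandwidth available per unit of compute:

```python
def gb_per_tflop(bandwidth_gbps: float, tflops: float) -> float:
    """Memory bandwidth available per theoretical TFLOP of compute."""
    return bandwidth_gbps / tflops

# Publicly stated bandwidths and theoretical peak TFLOPs; rough illustration only,
# ignoring the XSX's slower 336 GB/s pool and the PS5's variable clocks.
print(f"PS5:             {gb_per_tflop(448, 10.28):.1f} GB/s per TF")  # ~43.6
print(f"XSX (fast pool): {gb_per_tflop(560, 12.15):.1f} GB/s per TF")  # ~46.1
```

By this crude measure the gap in bandwidth per FLOP is only a few percent, so how much of a bottleneck it becomes depends on the workload, not just the headline numbers.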


#175 PC_Rocks
Member since 2018 • 8470 Posts

@Zero_epyon said:
@BlackShirt20 said:

@Zero_epyon: Yep. I have seen that video many times. It's all Torm ever posts during his long essays as he has a total meltdown.

Here is the problem with your argument and video that for some reason you are not understanding. Whenever people point this out to your friend, who also uses those videos, he always seems to vanish......

You ready? DF ran the test using a CPU that is near identical to what will be in the next-generation consoles. That is key.

So when I say it's all in the video, I mean it's all in the video. If you would watch from 5 mins on, you would see that OCing that GPU is going to get them less performance back. They have a massive CU and memory bandwidth disadvantage. Now, will games look noticeably better on XSX? Maybe. Will games have better performance? Definitely.

The card inside the PS5 is a 2060 Super (or 2070, take your pick).

The card inside the XSX is a 2080 Super (it's actually better, but let's just say 2080 Super).

Now go look at performance between those cards and downplay the power difference.

Can you elaborate on why the CPU is key to the comparison? The CPU isn't a factor, as the scene rendered in the test is GPU-bound, as Richard points out. In the benchmarks in the video, the same CPU was used for all comparisons at varying resolutions.

You keep saying the same things despite the contrary existing in the video. And now you're saying the Series X is a 2080 Super? Yeah, I can't really take this conversation seriously after that.

The card inside the XSX is not a 2080 Super. It won't even be a 2080. Add in RT and ML and it will be way less; actually, for ML and RT the XSX was performing in the ballpark of a 2060, if I remember correctly. You consolites need to stop overhyping your hardware and using Gears 5 as a benchmark for anything.


#176 tormentos
Member since 2003 • 33784 Posts

@Pedro:

What a shock, Pedro the fake developer doesn't take a real one seriously. 😂😂😂

Epic.


#177 tormentos
Member since 2003 • 33784 Posts

@regnaston:

That's funny coming from someone so invested in downplaying the PS5 that they're spreading false information.


#178 Zero_epyon
Member since 2004 • 20103 Posts

@BlackShirt20 said:

@Zero_epyon: https://youtu.be/4cmclCjS5jg

https://youtu.be/4cmclCjS5jg

You don’t have to take this conversation seriously. But it’s true. The GPU inside the Xbox Series X is about a 2080 Super.

The power differences will be noticeable and XSX will have a massive performance advantage.

The two links are to the same video, and they only discuss the hardware in it. You might be trying to refer to this?

https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs

However, even basic ports which barely use any of the Series X's new features are delivering impressive results. The Coalition's Mike Rayner and Colin Penty showed us a Series X conversion of Gears 5, produced in just two weeks. The developers worked with Epic Games in getting UE4 operating on Series X, then simply upped all of the internal quality presets to the equivalent of PC's ultra, adding improved contact shadows and UE4's brand-new (software-based) ray traced screen-space global illumination. On top of that, Gears 5's cutscenes - running at 30fps on Xbox One X - were upped to a flawless 60fps. We'll be covering more on this soon, but there was one startling takeaway - we were shown benchmark results that, on this two-week-old, unoptimised port, already deliver very, very similar performance to an RTX 2080.

And if that's what you're thinking of, then I say be careful with how you interpret it. For one, he says 2080 and not 2080 Super. Also, when the One X was revealed, Richard made a similar claim that it performed like a GTX 1070, based on a demo of a Forza title. DF later walked some of that back in later comparisons, where they claimed that the One X was more like an RX 580.

Then consider this:

[Embedded video]

This is Gears 5's benchmark feature running on a GTX 1080 Ti at 4K Ultra. The benchmark runner stresses that he's using Nvidia's ShadowPlay, which takes performance down between 5-10%. So the game should hit 60fps, with lows in the low-to-mid 50s. Also, consider that the benchmarking tool really taxes the hardware and doesn't represent typical gameplay performance. So it's absolutely possible to get 4K Ultra 60fps in Gears 5 with a 1080 Ti. That's why I'm always a bit hesitant to believe Richard when he makes points like this.


#179 Zero_epyon
Member since 2004 • 20103 Posts

@Pedro said:

@Zero_epyon: Sorry if I don't take his input too seriously. 😎

That's fine. I'll remember that the next time you give your opinion lol


#180 Zero_epyon
Member since 2004 • 20103 Posts
@pc_rocks said:
@Zero_epyon said:
@pc_rocks said:

Again factually false, and cherry-picking workloads that don't stress CUs. If higher clocks were a substitute for more SMs/CUs as Cerny claimed, then the top GPUs wouldn't have more cores at lower clocks, which is generally the case on PC. Apart from pixel rate, none of the graphics workloads scale well with clocks over cores. It's baffling that suddenly Cerny and cows know more about GPUs than Nvidia and AMD combined, with their decades of designing GPUs.

Lighting, RT, geometry/tessellation, and geometry culling all scale well with cores, especially RT, which is also significantly heavy on bandwidth. Another disadvantage of higher clocks over core count is latency/access time for memory. In other words, higher clocks with fewer cores need even higher memory bandwidth, which the PS5 already has less of.

[Embedded video]

Sure, if you didn't like the games in the first video and the ones tested here: https://www.pcgamesn.com/amd/radeon-rx-5700-unlock-overclock-undervolt then check out the one above, which uses games like RDR2, FH4, Tomb Raider, and Witcher 3.

An overclocked, MorePowerTool-edited 5700 can perform as well as or better than a stock 5700 XT with its extra CUs. This has been common knowledge since last year amongst PC enthusiasts. Also, as Richard from DF states, overclocking memory on AMD cards doesn't always translate into a significant boost to performance. In his test, he overclocked the memory by 9% but got back a 1% difference in performance.

Again missing the point. Let me break it down:

1. 5700 vs XT isn't a good comparison for PS5 vs XSX. One has just a difference of 4 CUs while the other has 16.

2. In the video you posted the games are running at 2K or 1080p, can't really tell but definitely not at 4K and definitely not at max performance. Once you have enough head room due to low resolution it won't show you much difference. This can be confirmed from the lowest FPS differential.

3. The better comparison would be 7970XT with 7870 OC or something like that. I forgot the actual two models but the OC version has higher TFLOPs but lower CUs and it didn't perform well in games.

4. Add RT and the difference will be higher. Also, pretty sure Richard didn't test the RT's effect on memory bandwidth. Regardless, at higher clocks with less cores memory becomes a bottleneck.

5. Lastly, if Cerny is such a genius on why higher clocks are better then do tell why Nvidia and AMD for their flagship GPUs have more cores and faster memory over clocks. Suddenly Cerny knows more about the actual creators of GPUs.

1. As I said to @regnaston earlier, I'm not saying that the clock nullifies the difference, but that the difference won't be as large as people expect.

2. You are correct that it's 1440p. I thought it was 4K. However, the first video I posted shows little difference between the 4K modes in those games, including games like Metro Exodus. If that doesn't convince you, then you're free to stay unconvinced.

3. That comparison isn't better though. The 7970 had an extra GB of RAM compared to the 7870 XT. The 5700 and 5700 XT are identical enough that the VBIOS for the 5700 XT can be installed on the 5700 and work well. I'm not sure the same could be done with the 7000-series cards. The 5700 XT and 5700 also share the same memory size, bandwidth, and bus width. I couldn't find any benchmarks, but I'd imagine the 1 GB memory difference would be a big factor in performance.

4. He couldn't. The 5700 doesn't have dedicated RT hardware like RDNA 2 is supposed to have. So the jury is still out on that one, but if RT turns out to benefit more from extra CUs, then it is what it is. Again, I'm not arguing that there will be no difference, just that it won't be as pronounced as some expect.

5. If these leaks are real, Navi 21 XT will have a boost clock of 2.4GHz, just a bit higher than the PS5's boost clock, along with what you said. I'm sure cost was a factor in going with those memory speeds, but I can't speak for Cerny. https://technosports.co.in/2020/10/19/the-leak-of-amd-navi-21-gpu-is-one-the-biggest-flex-we-have-seen-in-2020/#:~:text=The%20veteran%20tech%20leaker%20Rogaine,of%20up%20to%202.4GHz.


#181  Edited By Zero_epyon
Member since 2004 • 20103 Posts
@pc_rocks said:
@Zero_epyon said:
@BlackShirt20 said:

@Zero_epyon: Yep. I have seen that video many times. It's all Torm ever posts during his long essays as he has a total meltdown.

Here is the problem with your argument and video that for some reason you are not understanding. Whenever people point this out to your friend, who also uses those videos, he always seems to vanish......

You ready? DF ran the test using a CPU that is near identical to what will be in the next-generation consoles. That is key.

So when I say it's all in the video, I mean it's all in the video. If you would watch from 5 mins on, you would see that OCing that GPU is going to get them less performance back. They have a massive CU and memory bandwidth disadvantage. Now, will games look noticeably better on XSX? Maybe. Will games have better performance? Definitely.

The card inside the PS5 is a 2060 Super (or 2070, take your pick).

The card inside the XSX is a 2080 Super (it's actually better, but let's just say 2080 Super).

Now go look at performance between those cards and downplay the power difference.

Can you elaborate on why the CPU is key to the comparison? The CPU isn't a factor, as the scene rendered in the test is GPU-bound, as Richard points out. In the benchmarks in the video, the same CPU was used for all comparisons at varying resolutions.

You keep saying the same things despite the contrary existing in the video. And now you're saying the Series X is a 2080 Super? Yeah, I can't really take this conversation seriously after that.

The card inside the XSX is not a 2080 Super. It won't even be a 2080. Add in RT and ML and it will be way less; actually, for ML and RT the XSX was performing in the ballpark of a 2060, if I remember correctly. You consolites need to stop overhyping your hardware and using Gears 5 as a benchmark for anything.

Basically. Gears 5 can run near Series X levels on a 1080 Ti. Obviously it won't be able to do any ray tracing stuff, but it doesn't need to be a 2080 Super to do so.


#182 Zero_epyon
Member since 2004 • 20103 Posts

This is a fun thread. Really enjoying the conversation here lol.


#183  Edited By BlackShirt20
Member since 2005 • 2631 Posts

@Zero_epyon: https://www.techpowerup.com/gpu-specs/xbox-series-x-gpu.c3482

https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-super.c3439

Even Alex says “It’s much faster than a 2080.”

As far the video you and Torm love and hype. It perfectly demonstrates that the more you push clocks the less performance you get back.

In what world do you think the PS5 will win head to head in performance? Remember when RE3 remake came out and DF ran there analyst and found the PS4 Pro was out performing the Xbox One X in that game? Sony ponies made thread after thread about how power doesn’t matter. It’s not about TF. They only made a big deal about this because they knew PS5 was a lot weaker than XSX and they used that as an example that power doesn’t matter. That was until a week later when they patched RE3 and XBOX One X ran at higher resolution and more stable frames......Then you guys went silent.

The only time the PS5 will beat the XSX is if developers decide they don't want to fully optimize the game for the XSX. Just like they did for the RE3 remake, until fans called them on it.

Avatar image for Zero_epyon
Zero_epyon

20103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#184 Zero_epyon
Member since 2004 • 20103 Posts

@BlackShirt20 said:

@Zero_epyon: https://www.techpowerup.com/gpu-specs/xbox-series-x-gpu.c3482

https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-super.c3439

Even Alex says “It’s much faster than a 2080.”

As for the video you and Torm love and hype: it perfectly demonstrates that the more you push clocks, the less performance you get back.

In what world do you think the PS5 will win head to head in performance? Remember when the RE3 remake came out and DF ran their analysis and found the PS4 Pro was outperforming the Xbox One X in that game? Sony ponies made thread after thread about how power doesn't matter. It's not about TF. They only made a big deal about this because they knew the PS5 was a lot weaker than the XSX, and they used that as an example that power doesn't matter. That was until a week later, when they patched RE3 and the Xbox One X ran at a higher resolution with more stable frames... Then you guys went silent.

The only time the PS5 will beat the XSX is if developers decide they don't want to fully optimize the game for the XSX. Just like they did for the RE3 remake, until fans called them on it.

I don't know what those two links are supposed to demonstrate. You can't compare specs/metrics across different architectures, since they mean different things on each architecture.

As for your Alex quote: no, you misquoted it again.

Loading Video...

The video starts where they begin talking about the 2080 comparisons. They said it's on par with the 2080 because the heavy benchmark ran close to the 2080. But as you saw, and conveniently failed to address, the 1080 Ti can run that benchmark at 4K ultra close to 60 FPS.

I remember that thread, and funny enough, the reason the One X version of RE3 ran poorly against the Pro was that the One X ran the game at native 4K without a scaler, while the Pro ran at a lower resolution with a scaler, I believe. I think the argument wasn't that power isn't everything, but that the push for 4K wasn't worth it if framerates tanked. After the backlash, Capcom patched it to run with a scaler like the RE2 remake did, which fixed performance. So you have it totally backward (starting to see a trend lol).

The PS5 will perform on par with or better than the Series X when a game is running workloads that benefit more from higher frequencies than from extra CUs. When it doesn't, those higher clocks will mitigate some of the performance gap from the CU difference, though obviously not all of it. That's my take.
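For what it's worth, the raw compute gap being argued in this thread is easy to put numbers on. A minimal sketch using the publicly stated specs (PS5: 36 CUs at an "up to" 2.23 GHz variable clock; Series X: 52 CUs at a fixed 1.825 GHz; RDNA 2 CUs have 64 shader ALUs, each doing 2 FP32 ops per clock via FMA):

```python
# Theoretical peak FP32 throughput from CU count and clock.
# RDNA 2: 64 shader ALUs per CU, 2 FP32 ops per ALU per clock (FMA).
# The PS5 clock is an "up to" boost figure, so its number is a best case.

def tflops(cus: int, ghz: float) -> float:
    """Theoretical peak FP32 TFLOPS for an RDNA 2 GPU."""
    return cus * 64 * 2 * ghz / 1000.0

ps5 = tflops(36, 2.23)    # ~10.28 TFLOPS at peak clock
xsx = tflops(52, 1.825)   # ~12.15 TFLOPS at fixed clock
print(f"PS5 {ps5:.2f} TF vs XSX {xsx:.2f} TF -> {xsx / ps5 - 1:.0%} gap")
```

Note that the ~18% raw-compute gap is well short of the 44% CU-count gap, which is exactly the "higher clocks narrow the difference" point being made here. Peak TFLOPS is of course only a ceiling, not delivered game performance.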

Avatar image for BlackShirt20
BlackShirt20

2631

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#185  Edited By BlackShirt20
Member since 2005 • 2631 Posts

@Zero_epyon: But it won't always perform on par, right? I doubt it ever outperforms the XSX unless the developers don't really optimize the game, for the reasons explained. The XSX just has too many advantages. Some developers have said it's difficult to get the PS5 to match the XSX in performance. I know the RE8 developers said the PS5 is struggling to match the XSX as well; it's struggling to hit 1080p/60 fps while the XSX is pushing 4K/60 fps. So I don't know if I believe your claim.

I think the XSX is just too powerful for the PS5 to keep up. Like I said, and like DF said, it's a 2060 Super vs. a 2080 Super. It's not gonna happen, buddy. Sorry.

Avatar image for Juub1990
Juub1990

12620

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#186 Juub1990
Member since 2013 • 12620 Posts

@pc_rocks: He says he expects it to remain at or close to the peak. Take it for what it’s worth.

Avatar image for Zero_epyon
Zero_epyon

20103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#187  Edited By Zero_epyon
Member since 2004 • 20103 Posts

@BlackShirt20 said:

@Zero_epyon: But it won't always perform on par, right? I doubt it ever outperforms the XSX unless the developers don't really optimize the game, for the reasons explained. The XSX just has too many advantages. Some developers have said it's difficult to get the PS5 to match the XSX in performance. I know the RE8 developers said the PS5 is struggling to match the XSX as well; it's struggling to hit 1080p/60 fps while the XSX is pushing 4K/60 fps. So I don't know if I believe your claim.

I think the XSX is just too powerful for the PS5 to keep up. Like I said, and like DF said, it's a 2060 Super vs. a 2080 Super. It's not gonna happen, buddy. Sorry.

The fact that you believe RE8 can't do 1080p/60 on PS5 but somehow can do 4K/60 on Series X because of a few more compute units tells me you don't really understand the hardware.

Also, DF never said 2080 Super either, so...

Avatar image for the6millionlies
The6millionLies

564

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 10

#188 The6millionLies
Member since 2020 • 564 Posts

Like someone else said, if clock speeds are everything and CUs aren't important, why are AMD and NVIDIA putting more CUs into their GPUs? The 3090 is a beast with a ton of CUDA cores, clocked at around 1.7 GHz.
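The 3090 actually illustrates the point above: its performance comes from width, not exotic clocks. A quick sketch from its public specs (10,496 CUDA cores at a ~1.70 GHz boost; Ampere counts 2 FP32 ops per CUDA core per clock):

```python
# Peak FP32 throughput for the RTX 3090 from core count and boost clock.
# Ampere: 2 FP32 ops (FMA) per CUDA core per clock.

def cuda_tflops(cores: int, boost_ghz: float) -> float:
    """Theoretical peak FP32 TFLOPS for an NVIDIA GPU."""
    return cores * 2 * boost_ghz / 1000.0

rtx3090 = cuda_tflops(10_496, 1.695)  # ~35.6 TFLOPS peak FP32
print(f"RTX 3090: ~{rtx3090:.1f} TFLOPS peak FP32")
```

At a fairly ordinary ~1.7 GHz it still lands around 35.6 TFLOPS, because vendors scale performance primarily by adding cores rather than chasing clocks.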

Avatar image for BlackShirt20
BlackShirt20

2631

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#189 BlackShirt20
Member since 2005 • 2631 Posts

@Zero_epyon: I never said the PS5 could not do 1080p. I was quoting a development update on RE8 that said the PS5 is struggling to hit 1080p/60 fps while the XSX is running the game at 4K. So you need to relax.

You refuse to accept anything that doesn’t suit your video. The problem is, it’s just a video. Doesn’t mean anything.

https://youtu.be/oOt1lOMK5qY

Same thing. The more you push the card, the less you get back. Sometimes it can match performance, and once in a while, yes, the card can perform better by a couple of percentage points. But most of the time it does not. I don't know how you don't understand that.

More cores usually win. Secondly, we are not talking about a 4-CU difference; we are talking about a massive CU disadvantage for the PS5. Not only that, but we are talking about a decent chunk in terms of raw power as well. Then add the fact that the XSX has bandwidth advantages and CPU advantages. That you blindly look away and think to yourself... yes, the PS5 can overcome this because of a video I saw on YouTube, is kind of troubling.

The other shocking disadvantage for Sony is that it's gonna struggle to manage power consumption. The PS5 is energy hungry, while the XSX uses less energy than the X1X. The engineers for Xbox created a masterpiece of technology.

But yes. RE8 is struggling with PS5 development 🤓. At least according to them.
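The "push the clocks, get less back" claim being argued here does have a textbook basis: dynamic power scales roughly as P ∝ C·V²·f, and voltage has to rise with frequency past the efficiency sweet spot, so performance grows at best linearly with clocks while power grows roughly cubically. The numbers below are purely illustrative, not measured console figures:

```python
# Illustrative sketch of clock scaling vs. power draw.
# Assumes dynamic power P ~ C * V^2 * f, with voltage scaling roughly
# linearly with frequency once past the efficiency sweet spot.

def relative_power(freq_scale: float, volt_exponent: float = 1.0) -> float:
    """Power relative to baseline when the clock is scaled by freq_scale."""
    voltage_scale = freq_scale ** volt_exponent
    return voltage_scale ** 2 * freq_scale  # (V/V0)^2 * (f/f0)

for bump in (1.10, 1.20, 1.30):
    print(f"+{bump - 1:.0%} clock -> at most {bump:.2f}x perf "
          f"for ~{relative_power(bump):.2f}x power")
```

Under this model a +20% clock bump buys at most 1.20x performance for roughly 1.73x power, which is the trade-off both the DF video and this thread keep circling: widening the chip raises throughput at a lower power cost than raising clocks does.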

Avatar image for Sagemode87
Sagemode87

3416

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#190 Sagemode87
Member since 2013 • 3416 Posts

@BlackShirt20: You got owned up and down this thread but still won't admit fault. Pretty sad.

Avatar image for BlackShirt20
BlackShirt20

2631

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#191 BlackShirt20
Member since 2005 • 2631 Posts

@Sagemode87: LOL we will see very soon.

Sony can’t hide forever.

Avatar image for Sagemode87
Sagemode87

3416

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#192 Sagemode87
Member since 2013 • 3416 Posts

@BlackShirt20 said:

@Sagemode87: LOL we will see very soon.

Sony can’t hide forever.

Seriously dude

Avatar image for techhog89
Techhog89

5430

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#193  Edited By Techhog89
Member since 2015 • 5430 Posts

@hardwenzen said:

How do you know it runs the same? One could have textures on ultra while the other is on high. One could have shadows on ultra with 16x AF, while the other has medium shadows and a pathetic 2x AF.

30% difference in power is m a s s i v e. You can deny it, or accept it in your life. Your choice.

I love how you think that AF makes a huge performance difference.

And it's not 30%. The way you people treat the rumored minimum clock as the clock it runs at 100% of the time, and as fact when it's unconfirmed in the first place, is ridiculous.

Avatar image for Zero_epyon
Zero_epyon

20103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#194 Zero_epyon
Member since 2004 • 20103 Posts

@BlackShirt20 said:

@Zero_epyon: I never said the PS5 could not do 1080p. I was quoting a development update on RE8 that said the PS5 is struggling to hit 1080p/60 fps while the XSX is running the game at 4K. So you need to relax.

You refuse to accept anything that doesn't suit your video. The problem is, it's just a video. Doesn't mean anything.

https://youtu.be/oOt1lOMK5qY

Same thing. The more you push the card, the less you get back. Sometimes it can match performance, and once in a while, yes, the card can perform better by a couple of percentage points. But most of the time it does not. I don't know how you don't understand that.

More cores usually win. Secondly, we are not talking about a 4-CU difference; we are talking about a massive CU disadvantage for the PS5. Not only that, but we are talking about a decent chunk in terms of raw power as well. Then add the fact that the XSX has bandwidth advantages and CPU advantages. That you blindly look away and think to yourself... yes, the PS5 can overcome this because of a video I saw on YouTube, is kind of troubling.

The other shocking disadvantage for Sony is that it's gonna struggle to manage power consumption. The PS5 is energy hungry, while the XSX uses less energy than the X1X. The engineers for Xbox created a masterpiece of technology.

But yes. RE8 is struggling with PS5 development 🤓. At least according to them.

You're arguing in circles, dismissing video evidence that counters your claims, and ignoring the fact that you've misrepresented and misquoted DF multiple times. Now you're resorting to rumors from a guy who said he spread the RE8 rumor because he thought the press at the time was being one-sided against the Series X and he wanted to even it out.

You also seem to be ignoring the countless times I've said, to you and others here, that the higher CU count is an advantage that will produce some performance differences between the Series X and the PS5. But with the PS5's higher clocks, that gap will narrow, and the difference will end up not being so pronounced. You just want there to be a massive difference so badly that you've ignored the central point in all of this.

Interesting that you're making claims about the PS5 that haven't been tested yet. So tell me, how much power does the PS5 actually draw? In which games? What about any games that fully use the RT hardware in the CUs?

Don't even bother answering. I think we're done here because at this point I don't think you're interested in having a real conversation about this and you just want to cheerlead your preferred platform with buzzwords you heard on DF.

Good day...

Avatar image for regnaston
regnaston

4681

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#195 regnaston
Member since 2008 • 4681 Posts

@Zero_epyon said:
@BlackShirt20 said:

@Zero_epyon: I never said the PS5 could not do 1080p. I was quoting a development update on RE8 that said the PS5 is struggling to hit 1080p/60 fps while the XSX is running the game at 4K. So you need to relax.

You refuse to accept anything that doesn't suit your video. The problem is, it's just a video. Doesn't mean anything.

https://youtu.be/oOt1lOMK5qY

Same thing. The more you push the card, the less you get back. Sometimes it can match performance, and once in a while, yes, the card can perform better by a couple of percentage points. But most of the time it does not. I don't know how you don't understand that.

More cores usually win. Secondly, we are not talking about a 4-CU difference; we are talking about a massive CU disadvantage for the PS5. Not only that, but we are talking about a decent chunk in terms of raw power as well. Then add the fact that the XSX has bandwidth advantages and CPU advantages. That you blindly look away and think to yourself... yes, the PS5 can overcome this because of a video I saw on YouTube, is kind of troubling.

The other shocking disadvantage for Sony is that it's gonna struggle to manage power consumption. The PS5 is energy hungry, while the XSX uses less energy than the X1X. The engineers for Xbox created a masterpiece of technology.

But yes. RE8 is struggling with PS5 development 🤓. At least according to them.

You're arguing in circles, dismissing video evidence that counters your claims, and ignoring the fact that you've misrepresented and misquoted DF multiple times. Now you're resorting to rumors from a guy who said he spread the RE8 rumor because he thought the press at the time was being one-sided against the Series X and he wanted to even it out.

Can you show me where the guy retracted what he said? You and others seem to be saying or implying that he lied. In fact, he stands by what he said.

Avatar image for Pedro
Pedro

69448

Forum Posts

0

Wiki Points

0

Followers

Reviews: 72

User Lists: 0

#196 Pedro
Member since 2002 • 69448 Posts

@tormentos: Still suffering from that Pedro envy. I am sorry I make you feel like a dunce.😂🤣

Avatar image for Pedro
Pedro

69448

Forum Posts

0

Wiki Points

0

Followers

Reviews: 72

User Lists: 0

#197 Pedro
Member since 2002 • 69448 Posts
@Zero_epyon said:

That's fine. I'll remember that the next time you give your opinion lol

Now you know how much I value your opinion. 😉

Avatar image for Zero_epyon
Zero_epyon

20103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#198 Zero_epyon
Member since 2004 • 20103 Posts

@Pedro: enough to engage in a conversation apparently lol

Avatar image for Zero_epyon
Zero_epyon

20103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#199  Edited By Zero_epyon
Member since 2004 • 20103 Posts

@regnaston: I didn't say he retracted it. Him standing by it doesn't make it true. He also said the PS5 would cost $600 to make, around the same time a Bloomberg report said it cost $450. Rumors are just that: rumors. But they're being used here as fact.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#200 tormentos
Member since 2003 • 33784 Posts

@Pedro:

Nah, it's just funny showing how much of a corporate b!tch you are while you desperately try to hide it. 😂