Nvidia Ampere, AMD RDNA 2, and consoles: Why TFLOPS are bullsh-

#1 Techhog89
Member since 2015 • 5430 Posts

Disclaimer: TFLOPS aren't actually BS per se, but using them as a measurement of gaming performance doesn't really work well, even within the same architecture in some cases.

So, Nvidia just announced their new RTX 3000 GPUs, which show INSANE performance increases on paper. The number of TFLOPS has gone up more than anyone could have imagined, and even the RTX 2080 Ti and Xbox Series X look low-end now compared to the weakest of the new GPUs just looking at the numbers. Thus, I made a thread mocking people because of how this all looks, and another person mentioned how the 3070 is "twice as powerful" as the PS5. However, it's really not that simple. There's some magic going on here that you all need to be made aware of, and it will become very clear once RDNA 2 cards are reviewed that TFLOPS are almost completely meaningless to gamers.

So first of all, what even is a FLOP? Well, on its own that word doesn't mean much in this context. FLOPS is an acronym that stands for "FLoating point Operations Per Second." Basically, it's a number that describes how much raw computing power a processor has. You might think you'd run some kind of benchmark to measure this number, right? Not quite. In reality, it's typically calculated from the specs using a formula. There are different formulas for different types of processors and different precisions, but the one commonly used for single-precision (FP32) operations on a modern GPU is as follows:

GFLOPS = Shader count x Clockspeed in GHz x 2

Note: Nvidia calls shaders "CUDA cores" and AMD calls them "Stream Processors" or "SPs." The x2 is there because each shader can issue one fused multiply-add per clock, which counts as two floating-point operations.

Obviously that is a very simplified version of the concept, but it's enough for the point being made here. If you doubt that it's really just that, let's calculate the RTX 3070's TFLOPS:

5888 x 1.725 GHz x 2 = 20,313.6 GFLOPS, or 20.3 TFLOPS

Now, we know that the 3070 is faster than the 2080 Ti, so let's see where that falls.

4352 x 1.545 GHz x 2 = 13,447.68 GFLOPS, or 13.4 TFLOPS

Seems pretty clear cut then, huh? Thanks to a higher clock speed and many more shaders, the 3070 is a whopping 51% faster than the 2080 Ti!
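
If you want to poke at the numbers yourself, here's a quick sketch of that same formula in Python. The 3070 and 2080 Ti figures are the ones used above; the PS5 and Series X entries use their widely reported shader counts and maximum clocks, so treat those two as illustrative inputs rather than anything taken from this post:

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: shaders x clock x 2 FLOPs per clock."""
    return shaders * clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

gpus = {
    "RTX 3070": (5888, 1.725),
    "RTX 2080 Ti": (4352, 1.545),
    "PS5 (36 CUs x 64 SPs)": (2304, 2.23),   # widely reported specs
    "XSX (52 CUs x 64 SPs)": (3328, 1.825),  # widely reported specs
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: {fp32_tflops(shaders, clock):.1f} TFLOPS")
```

Run it and you get 20.3, 13.4, 10.3, and 12.1 TFLOPS respectively, which is exactly why the paper gap looks so dramatic.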

... But then why didn't Nvidia claim that it's 51% faster? And why would they charge so little for such a massive performance bump? Well, there are several factors. One that some of you will immediately think of is IPC. Maybe Ampere's IPC was reduced compared to Turing's in order to cram in more cores? Well, that's not it. In fact, the IPC is actually higher! How can I say that? Because the 3070 actually has fewer cores than the 2080 Ti. See, it's not really the shaders that are the true cores of a GPU; they're just part of the core. For Nvidia, the cores are the Streaming Multiprocessors, better known as SMs. The 2080 Ti has 68 SMs, while the 3070 has just 46.

So, how can the 3070 have more CUDA cores if those cores are part of the SM and the 2080 Ti has more SMs? It's pretty simple, actually: the 20 series has 64 cores per SM, while the 30 series has 128. And while the CUDA cores do contribute to performance, games lean on so many other parts of the SM that doubling the per-SM count doesn't affect game performance nearly as much as the raw number suggests.
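
To make that arithmetic concrete, here's the same peak-TFLOPS calculation done per SM instead of per shader, using the SM counts above (a small illustrative sketch, nothing more):

```python
def tflops_from_sms(sms: int, fp32_per_sm: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS, counting shaders as SMs x FP32 units per SM."""
    return sms * fp32_per_sm * clock_ghz * 2 / 1000

print(tflops_from_sms(68, 64, 1.545))   # RTX 2080 Ti (Turing): 68 SMs x 64  -> ~13.4 TFLOPS
print(tflops_from_sms(46, 128, 1.725))  # RTX 3070 (Ampere):    46 SMs x 128 -> ~20.3 TFLOPS
```

Same formula, same results; the doubling lives entirely in the per-SM shader count, not in the number of SMs.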

"So you're saying that Nvidia lied! The REAL performance is 10.15 TFLOPS! Weaker than PS5 confirmed!"

Yeah, no. 20.3 TFLOPS is still accurate. It's just that shader TFLOPS account for so little of the GPU that they're pretty much meaningless to us. Using them as an absolute comparison of gaming performance doesn't really get us anywhere. It's a poor metric, especially when comparing consoles that have custom hardware for certain things. In addition, while the extra shaders don't usually contribute much to gaming on their own, certain games can make heavy use of them and actually see a large boost from this change, and many GPU-accelerated programs may benefit greatly as well. Thus, we can't claim that the figure is fake or misleading, or that Nvidia pulled a trick here. It's just that the figure is pretty much arbitrary for gaming.

"Okay, fine. But I just noticed that you used boost clocks for your calculations! Did Nvidia do that too?"

Yes but-

"FALSE ADVERTISING! ONLY 17.7 TFLOPS THAT PERFORMS LIKE 8.8 TFLOPS CONFIRMED!"

Let me finish, please. Yes, but boost in 2020 is vastly different from what it was in 2013, and that applies to the PS5 as well. Under typical loads like gaming, modern CPUs and GPUs do not boost for a little while and then drop back to base (except in laptops). In fact, since 2014's Maxwell, Nvidia GPUs have often run a few hundred MHz above their rated boost speed in games without any overclocking! Basically, the GPUs have power and temperature limits and will boost higher as long as they are running cool and have headroom for more power. AMD and Intel have adopted similar technology in their CPUs and GPUs as well, though Nvidia's is more aggressive.
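
If it helps to picture it, here's a deliberately crude toy model of that kind of opportunistic boost. This is not Nvidia's or AMD's actual algorithm, and every number in it is made up; it just shows the idea that the sustained clock is whatever the power and thermal headroom allow, which is usually above the rated boost:

```python
def sustained_clock_mhz(rated_boost: float, power_headroom_w: float,
                        temp_headroom_c: float,
                        mhz_per_watt: float = 2.0,
                        mhz_per_degree: float = 5.0) -> float:
    """Toy model: the clock creeps above rated boost until either the power
    limit or the temperature limit is hit, whichever comes first."""
    bonus = min(power_headroom_w * mhz_per_watt, temp_headroom_c * mhz_per_degree)
    return rated_boost + max(bonus, 0.0)

# Plenty of cooling and power headroom -> the card settles above its rated boost.
print(sustained_clock_mhz(1725, power_headroom_w=40, temp_headroom_c=20))  # ~1805
```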

Enter the PS5 and its supposed "9.2 TFLOPS." The PS5 can't boost past its known maximum, of course, but the idea that it will typically run at base clocks is inaccurate and ultimately irrelevant. The PS5's boost limits are based on power and on load balancing between the CPU and GPU. Some of you are taking that to mean that only one can run at full clocks at a time and devs will be forced to find the right balance, but that's not how it works. Really, it's just a way to manage power usage.

That one dev who said they had to reduce the CPU to 3.0 GHz to get the full speed of the GPU likely would not have benefited much if the clocks had been fixed at full boost, because it was load balancing that allowed this to happen, not just power balancing. How do I know? Because that's how AMD SmartShift works. The CPU clock dropped because the game they're making pushes the GPU hard and doesn't push the CPU much. So, since the game is GPU-bottlenecked while the CPU has headroom to spare, the power is instead diverted to the GPU's Compute Units (CUs, AMD's equivalent to SMs, with 64 SPs each), which are actually being stressed. If the GPU is the bottleneck and is running at full clocks, running the CPU faster wouldn't make much of a difference; it would just burn more energy while sitting there not doing much. It's possible that this could result in more stutters compared to the XSX, but a good developer should be able to work around that.

So, basically, the PS5 will run at full speed when it needs to, just like how Nvidia GPUs run as fast as they can. I know this won't stop the trolls, but either way the difference between the PS5 and XSX won't be that big; it'll actually be a bit smaller than PS4 Pro vs One X. The PS5's dedicated hardware for things like audio and decompression makes this even more irrelevant, since that's even less for the CPU to worry about and makes the difference in CPU speeds more or less moot.
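
For anyone who wants the intuition in code form, here's a toy sketch of the power-sharing idea behind SmartShift. It is not AMD's actual algorithm and the wattage figures are invented; it only illustrates why a GPU-bound game can hold max GPU clocks while a lightly loaded CPU gives up power it wasn't using anyway:

```python
def split_power_budget(total_watts: float, cpu_load: float, gpu_load: float):
    """Toy model: divide a fixed SoC power budget in proportion to load."""
    total_load = max(cpu_load + gpu_load, 1e-9)
    return (round(total_watts * cpu_load / total_load, 1),
            round(total_watts * gpu_load / total_load, 1))

# GPU-bound scene: the GPU keeps most of the budget (and its clocks), while the
# mostly idle CPU backs off -- which costs little, since it wasn't the bottleneck.
print(split_power_budget(200, cpu_load=0.3, gpu_load=1.0))  # -> (46.2, 153.8)
```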

TL;DR Shut up about TFLOPS. Let the games do the talking. Also, boost clocks aren't opportunistic anymore and base clocks are just a worst case.

#2 onesiphorus
Member since 2014 • 5248 Posts

Do we need to have another NVIDIA-related topic in this forum?

#3 MooseWayne
Member since 2017 • 361 Posts

I'm dumb as shit. why isn't there a universal number for graphics power? has to be quantifiable.

#4 Techhog89
Member since 2015 • 5430 Posts

@moosewayne said:

I'm dumb as shit. why isn't there a universal number for graphics power? has to be quantifiable.

There isn't, for the reason I mentioned: there are too many variables. The different parts of the hardware, the architecture, cache, memory bandwidth, APIs, drivers on PC, software optimization... they all work together, and something is going to be the bottleneck. In the case of the PS5 and XSX, they're similar enough that TFLOPS can give a pretty good idea, but it's still far from perfect. The XSX is noticeably more powerful than the PS5 for sure, but not by the amount people think, and it'll still be up to Microsoft to take advantage of that power in a way that goes beyond 1440p vs 4K. I also didn't even mention the XSS, which has a weaker GPU than the One X on paper but will beat it in practice.
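
One way to see why no single number works: a frame is only as fast as its slowest stage. The sketch below is purely illustrative, with made-up units and workloads, but it shows how a GPU with more "TFLOPS" can still lose to one with more memory bandwidth on a bandwidth-heavy scene:

```python
def frame_cost(workload: dict, gpu: dict):
    """Cost of each stage in arbitrary units; the slowest stage sets the frame time."""
    costs = {
        "shading":  workload["shading"]  / gpu["shader_rate"],
        "memory":   workload["memory"]   / gpu["bandwidth"],
        "geometry": workload["geometry"] / gpu["geometry_rate"],
    }
    limiter = max(costs, key=costs.get)
    return round(costs[limiter], 2), limiter

scene = {"shading": 100, "memory": 5200, "geometry": 50}
gpu_a = {"shader_rate": 20, "bandwidth": 450, "geometry_rate": 6}  # more "TFLOPS"
gpu_b = {"shader_rate": 13, "bandwidth": 600, "geometry_rate": 6}  # more bandwidth

print(frame_cost(scene, gpu_a))  # (11.56, 'memory') -- bandwidth-bound, extra shading power wasted
print(frame_cost(scene, gpu_b))  # (8.67, 'memory')  -- faster frame despite fewer "TFLOPS"
```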

#5 sakaiXx
Member since 2013 • 15914 Posts

Cerny is right all along

#6 Fedor
Member since 2015 • 11612 Posts

@sakaixx:

#7 Juub1990
Member since 2013 • 12620 Posts

@moosewayne: There isn't, because graphics are made up of a myriad of things and a GPU might be more proficient at one thing than another. Take a very simple example: textures. A weaker card overall might deliver better textures simply from having more VRAM than a stronger card that doesn't have enough of it.

No need to even get into things like FP32 vs FP16, TMUs, triangles/s, ray casting, etc. Graphics can't be quantified by a single number.

It’s like those sports games where players are given a rating based on a number of different variables. Ratings which are often wrong.

#8  Edited By deactivated-60bf765068a74
Member since 2007 • 9558 Posts

Dude PS5 has the weakest flops we just need to accept it, we got game tho we can fight on that front.

#9 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

This being published?

#10 Fedor
Member since 2015 • 11612 Posts

@Random_Matt: Getting him a deal with Pendant Publishing as we speak.

#11 MooseWayne
Member since 2017 • 361 Posts

@Juub1990: Fair enough.

#13  Edited By KBFloYd
Member since 2009 • 22714 Posts

Yea, I don't even care about TFLOPS or CUDA cores. I wait till the card or system comes out, then look at the graphics/resolution/FPS, and that's what I care about.

Same with the Switch, I don't really care too much as long as it has a good art style.

#14 lundy86_4
Member since 2003 • 61481 Posts

Not really a number I take into account, unless it's poking fun in SW. I'm not tech-savvy enough anymore lol. I like to see real-world benches as opposed to cherry-picked and slightly biased manufacturer graphs.

#15 SolidGame_basic
Member since 2003 • 45101 Posts

@ProtossRushX said:

Dude PS5 has the weakest flops we just need to accept it, we got game tho we can fight on that front.

In Cerny I trust.

#16  Edited By Eoten
Member since 2020 • 8671 Posts

Also, one graphics card may have more RAM while another has faster RAM. Which one works better in a specific game will depend on whether that game's engine is built to benefit from faster RAM or from more RAM, which is why one GPU may work better than another in one game but worse in another.

It's basically "the bit wars" part 2, like back before Atari came out with the Jaguar and proved bits meant nothing.

___

But you also have to look at the capabilities of game engines, as well as what the human eye can detect in terms of pixel size and refresh rate. I have a monitor that goes up to 165Hz, and I honestly cannot detect any difference above 120Hz. How small a pixel you can detect depends on your distance from the screen. At 30 inches, a 24" monitor is going to look the same whether it's 1080p, 1440p, 4K, or 8K, because the pixels are already so small. Get up to 27" and you begin to see a difference between 1080p and the rest, and you need to go over 32 inches to see a difference between 1440p and the rest. I don't know about you, but I am not going to sit 2 1/2 feet from a 40 inch screen to take advantage of 4K and 8K.
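
For what it's worth, the pixel-size part of this can be roughed out with the common (and simplified) assumption that 20/20 vision resolves about 1 arcminute, i.e. roughly 60 pixels per degree. The sketch below is just that back-of-the-envelope calculation, not a claim about any particular person's eyesight; real-world perception varies with contrast, content, and the viewer:

```python
import math

def blend_distance_inches(diag_inches: float, horiz_res: int,
                          aspect=(16, 9), arcmin_per_pixel: float = 1.0) -> float:
    """Viewing distance beyond which one pixel subtends less than
    `arcmin_per_pixel` arcminutes (a rough stand-in for 20/20 acuity)."""
    w, h = aspect
    screen_width = diag_inches * w / math.hypot(w, h)  # physical width in inches
    pixel_pitch = screen_width / horiz_res             # width of one pixel
    return pixel_pitch / math.tan(math.radians(arcmin_per_pixel / 60))

for res in (1920, 2560, 3840):
    print(f'27" monitor, {res} px wide: pixels blend together beyond '
          f'~{blend_distance_inches(27, res):.0f} inches')
```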

At present, even the AMD Radeon VII can push today's AAA titles at maximum settings to 120fps at 1440p, so more power beyond that point is only going to aid in "future proofing." That said, graphical progress in games is slowing down considerably. Look back just 10 years at Mass Effect 2; it still looks great by today's standards. Go back 10 years before that and you're looking at something like Resident Evil 3 on PS1, which looks considerably worse.

So how many years will it be before that "future proofing" pays off? And at that point, what graphics cards will be available at that time? And how much will they cost?

So if you already have something like a VII or a 2080 Ti, there's really nothing to gain at this point with a 3080, and by the time there IS something to gain, a 3080 is going to be way cheaper, or something better will be out. Likewise, for those who own something like a 5700 XT, you could probably still hold off another 2 years and get the generation after the 5950XT. So the only people who would really benefit from upgrading at this point are those using cards like 1060s, 1660s, and maybe 1070s (1080s still seem good though), and on the AMD side those using RX 590s or below.

#17 IvanGrozny
Member since 2015 • 1845 Posts

Oh wow, this must be the most insane damage control over the PS5's 9.2TF I have ever seen to date. Count me in as impressed.

Did you come up with this 2-page drivel by yourself, OP, or did you use some sources?

OP, you know there are professionals who can prescribe you some meds for that? No trolling, just genuine concern.

#18 Jag85
Member since 2005 • 19543 Posts

Every generation has a simplistic, misleading benchmark used to compare performance. Back in the day, it was bits, then MHz, and then polygons. Now it's FLOPS.

A more accurate benchmark for comparing raw power is transistor count, i.e. the number of MOS transistors on a chip. That's been a fairly consistent measure of power across the generations.

#19 BassMan
Member since 2002 • 17806 Posts

@moosewayne said:

I'm dumb as shit. why isn't there a universal number for graphics power? has to be quantifiable.

That is why you go by benchmarks and not specs. That shows the true performance.

#20 firedrakes
Member since 2004 • 4365 Posts

@Juub1990 said:

@moosewayne: There isn't, because graphics are made up of a myriad of things and a GPU might be more proficient at one thing than another. Take a very simple example: textures. A weaker card overall might deliver better textures simply from having more VRAM than a stronger card that doesn't have enough of it.

No need to even get into things like FP32 vs FP16, TMUs, triangles/s, ray casting, etc. Graphics can't be quantified by a single number.

It’s like those sports games where players are given a rating based on a number of different variables. Ratings which are often wrong.

Best answer I've seen.

#21 deactivated-5f58f7bf5ecf3
Member since 2020 • 293 Posts

So...Cerny was right?

#23 Techhog89
Member since 2015 • 5430 Posts

@ivangrozny said:

Oh wow, this must be the most insane damage control over the PS5's 9.2TF I have ever seen to date. Count me in as impressed.

Did you come up with this 2-page drivel by yourself, OP, or did you use some sources?

OP, you know there are professionals who can prescribe you some meds for that? No trolling, just genuine concern.

How can you call something insane without reading it?

#24  Edited By DrSerigala
Member since 2018 • 208 Posts

Cow lost his mind. ROFL

Cannot accept the fact they will paying the weakest system on Next-Gen.

But muh SSD. Can your SSD cure the cancer yet? LMAO

#25 Techhog89
Member since 2015 • 5430 Posts

@drserigala said:

Cow lost his mind. ROFL

Cannot accept the fact they will paying the weakest system on Next-Gen.

But muh SSD. Can your SSD cure the cancer yet? LMAO

Nah, my 3080 will beat both consoles lol. Also Series S would be the weakest.

#26 Eoten
Member since 2020 • 8671 Posts

@techhog89 said:
@drserigala said:

Cow lost his mind. ROFL

Cannot accept the fact they will paying the weakest system on Next-Gen.

But muh SSD. Can your SSD cure the cancer yet? LMAO

Nah, my 3080 will beat both consoles lol. Also Series S would be the weakest.

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

#27  Edited By Fedor
Member since 2015 • 11612 Posts

@eoten said:
@techhog89 said:
@drserigala said:

Cow lost his mind. ROFL

Cannot accept the fact they will paying the weakest system on Next-Gen.

But muh SSD. Can your SSD cure the cancer yet? LMAO

Nah, my 3080 will beat both consoles lol. Also Series S would be the weakest.

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

Lol, the 3080 is always going to be far stronger than the PS5 and XSX... Also, you ever heard of a game called Cyberpunk 2077? There's going to be a very noticeable difference between the 3080 and the consoles in that game.

#28 Eoten
Member since 2020 • 8671 Posts

@fedor said:
@eoten said:
@techhog89 said:
@drserigala said:

Cow lost his mind. ROFL

Cannot accept the fact they will paying the weakest system on Next-Gen.

But muh SSD. Can your SSD cure the cancer yet? LMAO

Nah, my 3080 will beat both consoles lol. Also Series S would be the weakest.

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

Lol, the 3080 is always going to be far stronger than the PS5 and XSX... Also, you ever heard of a game called Cyberpunk 2077? There's going to be a very noticeable difference between the 3080 and the consoles in that game.

It doesn't matter. It's like trying to compare a car that can do 100mph to one that can do 180 when you're driving both at the speed limit. There are other bottlenecks to consider, and there is a limitation of how small of a pixel the human eye can individual detect, and how fast a screen can refresh and that difference be noticeable. Current top end GPUs already pass that mark so until gaming engines start adding more to them that require more of those features, which is going to take a couple years, then you're always going to be tied to the limit of the games you are playing. So no, you're not going to see a practical, noticeable improvement between a 2080ti and a 3080 for at least a couple more years. All you'll be able to compare until then are numbers on a spec sheet.

#29 Fedor
Member since 2015 • 11612 Posts

@eoten said:
@fedor said:
@eoten said:
@techhog89 said:
@drserigala said:

Cow lost his mind. ROFL

Cannot accept the fact they will paying the weakest system on Next-Gen.

But muh SSD. Can your SSD cure the cancer yet? LMAO

Nah, my 3080 will beat both consoles lol. Also Series S would be the weakest.

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

Lol, the 3080 is always going to be far stronger than the PS5 and XSX... Also, you ever heard of a game called Cyberpunk 2077? There's going to be a very noticeable difference between the 3080 and the consoles in that game.

It doesn't matter. It's like trying to compare a car that can do 100mph to one that can do 180 when you're driving both at the speed limit. There are other bottlenecks to consider, and there is a limitation of how small of a pixel the human eye can individual detect, and how fast a screen can refresh and that difference be noticeable. Current top end GPUs already pass that mark so until gaming engines start adding more to them that require more of those features, which is going to take a couple years, then you're always going to be tied to the limit of the games you are playing. So no, you're not going to see a practical, noticeable improvement between a 2080ti and a 3080 for at least a couple more years. All you'll be able to compare until then are numbers on a spec sheet.

Yes it does matter, and yes, the 3080 will ALWAYS HAVE FAR MORE PERFORMANCE THAN THE PS5 AND THE XSX. Thanks for the odd tangent though.

#30  Edited By Zaryia
Member since 2016 • 21607 Posts

TLDR: PC destroys next gen consoles before the new gen is even out and the gap will continue to grow very wide.

LMAO! I LOVE IT!

#31  Edited By Eoten
Member since 2020 • 8671 Posts

@fedor said:
@eoten said:
@fedor said:
@eoten said:

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

Lol, the 3080 is always going to be far stronger than the PS5 and XSX... Also, you ever heard of a game called Cyberpunk 2077? There's going to be a very noticeable difference between the 3080 and the consoles in that game.

It doesn't matter. It's like trying to compare a car that can do 100mph to one that can do 180 when you're driving both at the speed limit. There are other bottlenecks to consider, and there is a limitation of how small of a pixel the human eye can individual detect, and how fast a screen can refresh and that difference be noticeable. Current top end GPUs already pass that mark so until gaming engines start adding more to them that require more of those features, which is going to take a couple years, then you're always going to be tied to the limit of the games you are playing. So no, you're not going to see a practical, noticeable improvement between a 2080ti and a 3080 for at least a couple more years. All you'll be able to compare until then are numbers on a spec sheet.

Yes it does matter, and yes, the 3080 will ALWAYS HAVE FAR MORE PERFORMANCE THAN THE PS5 AND THE XSX. Thanks for the odd tangent though.

It sure as **** didn't amount to much for all those people who wasted $1,200 on a 2080TI, did it? By time gaming got to a point where they could notice some real improvements with their cards, the 3080 came riding in.

And dump on console players all you want, I've been strictly a PC player since 2013, so I've been guilty of this as well, but at least they can play anything from Fall Guys to Warzone and not have to deal with rampant cheating.

#32 Fedor
Member since 2015 • 11612 Posts

@eoten said:
@fedor said:
@eoten said:
@fedor said:
@eoten said:

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

Lol, the 3080 is always going to be far stronger than the PS5 and XSX... Also, you ever heard of a game called Cyberpunk 2077? There's going to be a very noticeable difference between the 3080 and the consoles in that game.

It doesn't matter. It's like trying to compare a car that can do 100mph to one that can do 180 when you're driving both at the speed limit. There are other bottlenecks to consider, and there is a limitation of how small of a pixel the human eye can individual detect, and how fast a screen can refresh and that difference be noticeable. Current top end GPUs already pass that mark so until gaming engines start adding more to them that require more of those features, which is going to take a couple years, then you're always going to be tied to the limit of the games you are playing. So no, you're not going to see a practical, noticeable improvement between a 2080ti and a 3080 for at least a couple more years. All you'll be able to compare until then are numbers on a spec sheet.

Yes it does matter, and yes, the 3080 will ALWAYS HAVE FAR MORE PERFORMANCE THAN THE PS5 AND THE XSX. Thanks for the odd tangent though.

It sure as **** didn't amount to much for all those people who wasted $1,200 on a 2080TI, did it? By time gaming got to a point where they could notice some real improvements with their cards, the 3080 came riding in.

And dump on console players all you want, I've been strictly a PC player since 2013, so I've been guilty of this as well, but at least they can play anything from Fall Guys to Warzone and not have to deal with rampant cheating.

It sure did pay off for them, they were the only people playing max settings/60fps/4k for the last 2 years. I also didn't dump on console players once so no clue what you're talking about.

#33 deactivated-611a8cd6e3c93
Member since 2013 • 421 Posts

Ah, TFLOPS... If it were that easy, AMD would not have been completely irrelevant in the high-end GPU market for years...😆


#34  Edited By Sancho_Panzer
Member since 2015 • 2524 Posts

It will be interesting to see how much frame stutter and erratic fan behaviour end up affecting 3rd party games on PS5. The idea of good developers being able to work around a system's quirks is fine as long as that's something as simple as toggling SmartShift off where needed, but it would be a tall order to ask multiplat studios to tailor their entire game design to PS5 limitations. I'll bet consistent moment-to-moment performance, free of jerks and hitches, is a bigger deal to most console owners than occasionally higher FPS.

SmartShift doesn't seem to have taken off on PC, and maybe there's a good reason. Could it be that Mark Cerny recognised what a potent secret sauce ingredient it could be and Sony has shelled out for exclusivity? :0

#35 rzxv04
Member since 2018 • 2578 Posts

@sancho_panzer said:

It will be interesting to see how much frame stutter and erratic fan behaviour ends up affecting 3rd party games on PS5. The idea of good developers being able to work around a system's quirks is fine as long as that's something as simple as toggling SmartShift off where needed, but it would be a tall order to ask multiplat studios to tailor their entire game design to PS5 limitations. I'll bet consistent moment-to-moment performance, free of jerks and hitches is a bigger deal to most console owners than occasionally higher FPS.

SmartShift doesn't seem to have taken off on PC, and maybe there's a good reason. Could it be that Mark Cerny recognised what a potent secret sauce ingredient it could be and Sony has shelled out for exclusivity? :0

I think the PS5's SmartShift is more baked into the development kits. I remember that the "Model SoC" makes things much more consistent for devs. I think someone mentioned that it could have an internal table and devs would be able to choose.

It's not exactly like tweaking a PC to its max, where you tune per CPU/GPU/RAM because each individual unit can reach different levels of OC. The PS5 has the "Model SoC," which would be the same for every PS5 unit.

I think you're right though. PS5 development might be a bit more of a hassle when it comes to squeezing out everything the hardware can offer because of the design, but I'll give them the benefit of the doubt until more GDC-like talks in 2021. Hopefully it's a more automated system than what it initially sounds like.

#36 gtx021
Member since 2013 • 515 Posts

The 2080 Ti is a total joke of a card, and sorry for its owners: next year, in 2021, this card will be running AAA games at 1080p/60fps.

Nvidia treats 2080 Ti owners like sh!t.

#37  Edited By Eoten
Member since 2020 • 8671 Posts

@fedor said:
@eoten said:
@fedor said:
@eoten said:

It doesn't matter. It's like trying to compare a car that can do 100mph to one that can do 180 when you're driving both at the speed limit. There are other bottlenecks to consider, and there is a limitation of how small of a pixel the human eye can individual detect, and how fast a screen can refresh and that difference be noticeable. Current top end GPUs already pass that mark so until gaming engines start adding more to them that require more of those features, which is going to take a couple years, then you're always going to be tied to the limit of the games you are playing. So no, you're not going to see a practical, noticeable improvement between a 2080ti and a 3080 for at least a couple more years. All you'll be able to compare until then are numbers on a spec sheet.

Yes it does matter, and yes, the 3080 will ALWAYS HAVE FAR MORE PERFORMANCE THAN THE PS5 AND THE XSX. Thanks for the odd tangent though.

It sure as **** didn't amount to much for all those people who wasted $1,200 on a 2080TI, did it? By time gaming got to a point where they could notice some real improvements with their cards, the 3080 came riding in.

And dump on console players all you want, I've been strictly a PC player since 2013, so I've been guilty of this as well, but at least they can play anything from Fall Guys to Warzone and not have to deal with rampant cheating.

It sure did pay off for them, they were the only people playing max settings/60fps/4k for the last 2 years. I also didn't dump on console players once so no clue what you're talking about.

4K gaming is a gimmick. Lmao @ 60fps.

#38  Edited By deactivated-611a8cd6e3c93
Member since 2013 • 421 Posts

@eoten said:
@fedor said:

It sure did pay off for them, they were the only people playing max settings/60fps/4k for the last 2 years. I also didn't dump on console players once so no clue what you're talking about.

4K gaming is a gimmick. Lmao @ 60fps.

...as if those console plebs weren't full of delusions, a random hunchback throws more into the mix...🙄

#39  Edited By Eoten
Member since 2020 • 8671 Posts

@WESTBLADE85 said:
@eoten said:
@fedor said:

It sure did pay off for them, they were the only people playing max settings/60fps/4k for the last 2 years. I also didn't dump on console players once so no clue what you're talking about.

4K gaming is a gimmick. Lmao @ 60fps.

...as if those console plebs weren't full of delusions, a random hunchback throws more into the mix...🙄

Except I am a PC gamer ;-). if 60fps and motion blur is what you're into, that's what you're more likely to get on a console.

#40  Edited By deactivated-611a8cd6e3c93
Member since 2013 • 421 Posts

@eoten said:
@WESTBLADE85 said:
@eoten said:

4K gaming is a gimmick. Lmao @ 60fps.

...as if those console plebs weren't full of delusions, a random hunchback throws more into the mix...🙄

Except I am a PC gamer ;-). if 60fps and motion blur is what you're into, that's what you're more likely to get on a console.

Hunchback = acting all boss, yet has a tiny screen and a smelly chair.
My 65" 4K/120Hz/HDR10+ TV says hi...

#41 PC_Rocks
Member since 2018 • 8470 Posts

That's a whole lot of words to say that the PS5 is the weakest of the bunch. Damn, cows have really lost their minds. I think MS should give Nvidia their place, because their reveals do more damage to cows than anything.

#42 Techhog89
Member since 2015 • 5430 Posts

@pc_rocks: Not a cow. Just setting people straight about Ampere TFLOPS and this 9.2 nonsense.

#43 slimdogmilionar
Member since 2014 • 1343 Posts

@techhog89: well written

#44 PC_Rocks
Member since 2018 • 8470 Posts

@techhog89 said:

@pc_rocks: Not a cow. Just setting people straight about Ampere TFLOPS and this 9.2 nonsense.

Cool to know you agree you wrote all this because lems keep rubbing 9.2 in your face.

#45 DragonfireXZ95
Member since 2005 • 26645 Posts

@eoten said:
@fedor said:
@eoten said:
@fedor said:
@eoten said:

Actually, it won't. Because by time game developers reach a point to where a 3080 will have a noticeable improvement on games, the next generation GPUs will be on the market, and people will be dumping 3080s for the next one for fractions of the cost.

Lol, the 3080 is always going to be far stronger than the PS5 and XSX... Also, you ever heard of a game called Cyberpunk 2077? There's going to be a very noticeable difference between the 3080 and the consoles in that game.

It doesn't matter. It's like trying to compare a car that can do 100mph to one that can do 180 when you're driving both at the speed limit. There are other bottlenecks to consider, and there is a limitation of how small of a pixel the human eye can individual detect, and how fast a screen can refresh and that difference be noticeable. Current top end GPUs already pass that mark so until gaming engines start adding more to them that require more of those features, which is going to take a couple years, then you're always going to be tied to the limit of the games you are playing. So no, you're not going to see a practical, noticeable improvement between a 2080ti and a 3080 for at least a couple more years. All you'll be able to compare until then are numbers on a spec sheet.

Yes it does matter, and yes, the 3080 will ALWAYS HAVE FAR MORE PERFORMANCE THAN THE PS5 AND THE XSX. Thanks for the odd tangent though.

It sure as **** didn't amount to much for all those people who wasted $1,200 on a 2080TI, did it? By time gaming got to a point where they could notice some real improvements with their cards, the 3080 came riding in.

And dump on console players all you want, I've been strictly a PC player since 2013, so I've been guilty of this as well, but at least they can play anything from Fall Guys to Warzone and not have to deal with rampant cheating.

That's not true. Xbox players are forced to play with crossplay on for Warzone.

#46 Techhog89
Member since 2015 • 5430 Posts

@pc_rocks said:
@techhog89 said:

@pc_rocks: Not a cow. Just setting people straight about Ampere TFLOPS and this 9.2 nonsense.

Cool to know you agree you wrote all this because lems keep rubbing 9.2 in your face.

I mean if I really cared about that I could have just unironically rubbed 30 in your face. :/ I just don't like inaccurate information flying around.