2013 GPU Availability vs 2020 GPU Availability


This topic is locked from further discussion.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#51 ronvalencia
Member since 2008 • 29612 Posts
@rdnav2 said:

No, he wasn't spot on.

He assumed Navi was just one GPU for the longest time, with GTX 1080-level performance.

He assumed Navi was just GCN all over again.

He assumed next-gen consoles would get old AMD GPU tech.

He assumed GPUs can't be downclocked to get better efficiency per watt.

He assumed AMD GPU efficiency would remain stagnant for years.

Mistake after mistake had him adamantly predicting 8-9 GCN teraflops.

In the end we got 12.2 RDNA teraflops, which is equivalent to 14-15 GCN teraflops.

He was massively wrong.
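As an aside, the RDNA-to-GCN conversion behind that last figure can be sketched using AMD's advertised ~1.25x performance-per-FLOP uplift of RDNA over GCN (the exact factor is an assumption; it is what places 12.2 RDNA teraflops near the quoted 14-15 GCN range):

```python
# Rough "GCN-equivalent teraflops" conversion, assuming AMD's advertised
# ~1.25x performance-per-FLOP uplift of RDNA over GCN.
RDNA_UPLIFT_VS_GCN = 1.25  # assumed factor

def gcn_equivalent_tflops(rdna_tflops: float) -> float:
    """Scale an RDNA TFLOPS figure into a rough GCN-equivalent figure."""
    return rdna_tflops * RDNA_UPLIFT_VS_GCN

print(round(gcn_equivalent_tflops(12.2), 2))  # 15.25, at the top of the quoted range
```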

This is system wars, not GameSpot forum poster wars.


#52  Edited By ronvalencia
Member since 2008 • 29612 Posts
@pc_rocks said:

They witnessed it under controlled settings. Did they get to dissect it in depth? Did they test the final build? Considering it as evidence is the same as when Ron goes on about the X1X being better than or equal to a 1070 based on Forza.

How could anyone claim the capabilities of anything when you haven't put it under stress at all? In terms of games, RDR 2 or Control would be a much better candidate.

I never said the X1X was better than a GTX 1070. DF didn't claim the X1X was better than a GTX 1070 either. Your narrative is false.

Games such as Far Cry 5 have shown the X1X beating the RX 580, while the X1X's Killer Instinct Season 3 shows GTX 980 Ti and GTX 1070-like results.

It's a good move by MS to show the Nvidia-biased, Unreal Engine 4-based Gears 5 for the XSX's reveal.


#53  Edited By KungfuKitten
Member since 2006 • 27389 Posts

I'm stoked about it. Finally a bit of a performance leap on consoles. Multiplat presentation may get a good push forward, and I can still play those games on my custom PC setup.

And with the new upscaling tech advancing, maybe we can all start enjoying games at an upscaled 4K at 60+ fps, or 2K at 100+ fps.

Also, the new server-side advances for gaming are allowing for crazy player counts, and machine learning is starting to dig into real-time manipulation of data in impressive ways... I think by the end of next gen we will see some crazy cool stuff. Plus, this step up in hardware power will enable VR with a better sense of presence.


#54  Edited By PC_Rocks
Member since 2018 • 8470 Posts

@rdnav2 said:
@pc_rocks said:

He was right on the spot, and so is 04dcarraher. Context is all that matters, which you're conveniently trying to overlook. What's the TDP of the PS5 and XSX compared to the Pro/X1X, and why is the XSX a tower design? It's a departure from the more conventional TDP budgets of consoles. Had the form factor remained the same, both predictions would be right on the spot.

A better question, though, is how the RT and ML capabilities of the XSX and PS5 compare to Turing. No, that RT intersection number isn't comparable at all. They purposefully used that metric to avoid a direct comparison with Turing, or else they would look bad.

No, he wasn't spot on.

He assumed Navi was just one GPU for the longest time, with GTX 1080-level performance.

He assumed Navi was just GCN all over again.

He assumed next-gen consoles would get old AMD GPU tech.

He assumed GPUs can't be downclocked to get better efficiency per watt.

He assumed AMD GPU efficiency would remain stagnant for years.

Mistake after mistake had him adamantly predicting 8-9 GCN teraflops.

In the end we got 12.2 RDNA teraflops, which is equivalent to 14-15 GCN teraflops.

He was massively wrong.

Regardless, the new consoles are not playing by the same rules as the previous consoles, and 04dcarraher specifically mentioned this bit in his quote about TDP.


#55 PC_Rocks
Member since 2018 • 8470 Posts
@Juub1990 said:

@pc_rocks said:

They witnessed it under controlled settings. Did they get to dissect it in depth? Did they test the final build? Considering it as evidence is the same as when Ron goes on about the X1X being better than or equal to a 1070 based on Forza.

How could anyone claim the capabilities of anything when you haven't put it under stress at all? In terms of games, RDR 2 or Control would be a much better candidate.

We know Forza is bullshit because, prior to the driver updates, AMD cards performed incredibly better than NVIDIA cards, to the point that a freakin' 5700 XT could beat a 2080 Ti. Gears of War doesn't do anything of the sort. Not to mention it was quickly scrambled together to have something to show: done in two weeks with very little optimization.

You dudes are in for a rude awakening once the performance reviews for the new consoles are out. I sure as heck don't expect them to beat top-tier GPUs, but suggesting a 2080 isn't a wash with the Series X is just completely biased bullshit.

Anyway, we'll see it when it happens. No need to argue over incomplete information.

Again, you're going by PR with no third-party benchmarks. We already saw that with AMD's Vega 64 beats-the-GTX 1080 slide.

Again, why would RDNA or the XSX being powerful mean any bad news or rude awakening for me? For once, PC won't be held back as much, but most importantly I would love to have a competent AMD GPU so Nvidia has to get their a$$es in check and stop with the price gouging.


#56  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Grey_Eyed_Elf said:

50% better performance per watt just means that it uses less power... It has nothing to do with TFLOP vs TFLOP; that is IPC.

DF said the XSX performed CLOSE to an RTX 2080... meaning it's at overclocked RTX 2070 Super performance. So all these fanboys freaking out and claiming the only GPU better than an XSX is a 2080 Ti are just ignoring the ONLY information we have.

Same with ray tracing: based on Minecraft alone, the XSX sits at RTX 2060-2070 levels, going by Nvidia's targets and the fact that the XSX couldn't lock 60 fps at 1080p in Minecraft when a Ti does it with ease while targeting 1440p.

PS5 = RTX 2060 Super

XSX = RTX 2070 Super

Ray tracing abilities on both are a tier lower than their rasterization performance relative to Nvidia...

PS5 = RTX 2060

XSX = RTX 2070

All based on information we know. No fanboy bulls*** and nothing pulled from thin air.

Wrong. 50% more performance per watt means one of two things: 50% more performance at the same power draw, or the same performance at about a third less power draw.
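To pin the arithmetic down: at equal performance, a +50% perf-per-watt uplift implies power scaled by 1/1.5, i.e. roughly a third less, not half. A minimal sketch (the 300 W baseline is purely illustrative):

```python
# Perf/W arithmetic: with a perf-per-watt gain g, equal performance needs
# old_power / (1 + g) watts, while equal power delivers (1 + g)x performance.
def power_at_same_perf(old_power_w: float, gain: float) -> float:
    return old_power_w / (1.0 + gain)

def perf_at_same_power(old_perf: float, gain: float) -> float:
    return old_perf * (1.0 + gain)

print(power_at_same_perf(300, 0.5))  # 200.0 W: ~33% less, not 150 W
print(perf_at_same_power(1.0, 0.5))  # 1.5: +50% performance at unchanged power
```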

You need to re-watch that Gears 5 video, a port with only two weeks' worth of work behind it. DF stated that the XSX delivered nearly identical performance to a PC with a Ryzen Threadripper 2950X and a GeForce RTX 2080 Ti graphics card: https://youtu.be/oNZibJazWTo?t=743

As for RT performance, we have no idea how a matured and optimized version would do.

As it stands now, with RT not involved, the PS5 is looking at the RTX 2070 Super performance range and the XSX is at RTX 2080 Super level. Once the tools mature we will see more performance out of them.

We are going to have to wait until these consoles come out, along with AMD's RDNA 2.0-based GPUs, to actually gauge their performance numbers. The problem is that we cannot compare these consoles with RDNA 1.0 right now, because we have no idea what RDNA 2.0 has improved over v1.0.


#57 Techhog89
Member since 2015 • 5430 Posts

Stop using FLOPS to compare completely different GPUs. It doesn't work.


#59 Techhog89
Member since 2015 • 5430 Posts

Actually, seeing what some of you are saying, you guys shouldn't be talking about tech at all.


#60  Edited By Juub1990
Member since 2013 • 12620 Posts
@04dcarraher said:

Wrong. 50% more performance per watt means one of two things: 50% more performance at the same power draw, or the same performance at about a third less power draw.

You need to re-watch that Gears 5 video, a port with only two weeks' worth of work behind it. DF stated that the XSX delivered nearly identical performance to a PC with a Ryzen Threadripper 2950X and a GeForce RTX 2080 Ti graphics card: https://youtu.be/oNZibJazWTo?t=743

As for RT performance, we have no idea how a matured and optimized version would do.

As it stands now, with RT not involved, the PS5 is looking at the RTX 2070 Super performance range and the XSX is at RTX 2080 Super level. Once the tools mature we will see more performance out of them.

We are going to have to wait until these consoles come out, along with AMD's RDNA 2.0-based GPUs, to actually gauge their performance numbers. The problem is that we cannot compare these consoles with RDNA 1.0 right now, because we have no idea what RDNA 2.0 has improved over v1.0.

It's almost as fast as a 2080, not a 2080 Ti. The video says they compared it to a 2080, not the Ti, my dude.

The video shows a 2080 Ti, but it says "You will have to use your imagination since we have no footage of this," and John keeps saying 2080, not 2080 Ti.


#61 04dcarraher
Member since 2004 • 23829 Posts

@Juub1990 said:

It's almost as fast as a 2080, not a 2080 Ti. The video says they compared it to a 2080, not the Ti my dude.

The video shows a 2080 Ti but it says "You will have to use your imagination since we have no footage of this." And John keeps saying 2080, not 2080 Ti.

He also stated, at the timestamp I linked:

"that PC, using the same settings as the XSX with fixed resolution scaling, produced nearly the same results"

Even though he may say RTX 2080, he also noted the showing came from only two weeks of work. He's eyeballing it, but the fact that he said it produced nearly the same results means it has to be better than a plain RTX 2080.


#62 Juub1990
Member since 2013 • 12620 Posts

@04dcarraher: I already said above that it produced nearly the same results as a Threadripper + 2080 in a quick port done in two weeks. I just wanted to correct the 2080 Ti comparison.


#63 MorningRisee
Member since 2020 • 1 Posts

Not even close, because the base PS4 easily beats all the cards you mention here except for the Titan. These cards haven't been able to play PS4 multiplatform games properly since 2017, and when they can run anything at all, it's on UGLY graphics only. Look it up.

The GTX 590 can't play roughly 75% of PS4 games since 2017, and the other 25% it can only play on UGLY graphics. Look it up.

Same thing with the GTX 690: it can't play most PS4 games, and when it can play anything, it's on UGLY graphics as well. Please do your research. The PS4 plays all games looking A+ beautiful.

The R7 260X? Just like the 750 Ti, these are mere 2 GB VRAM cards that have only played on UGLY graphics since 2017. Look it up. The PS4 never plays on ugly graphics like these cards do. The 750 Ti / R7 260X play Mortal Kombat 11 on PS3-level graphics at 40 fps, while the PS4 plays it looking A+ beautiful at 60 fps, native 1080p.

The PS4 is even beating a GTX 1050 Ti now. The PS4 is effectively a 5 GB VRAM machine to begin with, while the 1050 Ti has 4 GB. The GTX 1050 Ti plays Marvel's Avengers on inferior graphics compared to the PS4. The Crysis Remastered minimum requirement is a GTX 1050 Ti, and that is still a PS4 game.

The problem is you only compared these cards to the base PS4 in the first two years, when games were not yet demanding. The Witcher 3 is an easy game most 2 GB cards can play at 30+ fps at all-high settings.

When games started requiring at least 4 GB of VRAM to look pretty, all these 2 GB cards could only play on UGLY graphics, and that has been the case since 2017. I repeat: ugly graphics. 3 GB VRAM cards like the GTX 780 Ti have played on inferior graphics compared to the PS4 since 2018, too.

It's not about teraflops. It's all about the VRAM in the long run.

GTX 1050 Ti, a mere 2.1 tflops (but with 4 GB of VRAM) > GTX 780 Ti / GTX 690 / GTX 960 / 7970.

Look it up: these old GPUs can't play most modern games at PS4 graphics and performance.

The Cyberpunk/Valhalla minimum requirement is an R9 290X 4 GB, and those are still PS4 games.

The base 2013 PS4 actually beats every high-end GPU of 2013 except the R9 290X in the long run, starting at year 4 (2017).

Please look it up. The GTX 590 / GTX 690 can't play most modern demanding PS4 games since 2017. And when one can run something like COD 2019, it only plays on UGLY PS3-level graphics, while the base PS4 plays COD 2019 looking 98% identical to PC ultra settings at 1080p. There is virtually no difference; even Digital Foundry can't find any difference between the consoles and PC ultra settings (it only looks sharper at native 4K with an RTX 2080 Ti).

The GTX 590 / GTX 690 have higher teraflops than the PS4, but these cards use old GPU architectures built during the PS3 era and have inferior VRAM. That is why these cards are not playing modern PS4 games properly, or at all.

Please research first. The GTX 770 has played on UGLY graphics since 2017.

And guess what: history will repeat itself.

The RTX 2080 Ti / RTX 3070 / RTX 3080 will easily be beaten by the PS5/Series X in the long run, starting at year 4 and up.

2013 PS4 (1.8 tflops, 5 GB VRAM, new GPU architecture) > GTX 590 / GTX 690 (2.8 tflops, 2 GB VRAM, old GPU architectures).

2013 PS4 (1.8 tflops, 5 GB VRAM, new GPU architecture) > GTX 780 Ti (4.4 tflops, 3 GB VRAM).

It is not all about teraflops. The PS3 was nominally a 2-tflop machine vs the PS4's 1.8 tflops. It's all about the VRAM and the GPU architecture generation.

Also, the base PS4 doesn't do variable fps. Games lock it at 30 fps or 60 fps intentionally. Even if the PS4 could actually do 28-43 fps or 35-51 fps in a certain game, they would still lock it at 30 for consistency. As long as it can't reach 60 fps, they lock it at 30 fps.
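The lock-to-30-or-60 policy described there amounts to a tiny decision rule (a hypothetical helper for illustration, not actual console logic):

```python
def frame_cap(sustainable_fps: float) -> int:
    """Pick a fixed frame cap the way the post describes: lock to 60 only
    when the game can sustain it; otherwise fall back to a 30 fps lock,
    even if the hardware could average somewhere in between."""
    return 60 if sustainable_fps >= 60 else 30

print(frame_cap(43))  # 30: a 28-43 fps game still ships locked at 30
print(frame_cap(75))  # 60
```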

The PS5/Series X can dedicate 13.5-14 GB of their 16 GB of shared memory to VRAM by default. The Xbox One X has 12 GB of shared memory, with 9 GB exclusive to VRAM. Console shared memory is fully flexible, and almost all of it can be used as VRAM if needed.

PS5/Series X (13.5 GB VRAM, new GPU architecture) > RTX 2080 Ti (11 GB VRAM, 13.5 tflops, old GPU architecture).

You really think a 2080 Ti can still play PS5/Series X-level graphics/games for the next 9 years, like these next-gen consoles easily will? The PS5/Series X will play insanely more demanding graphics 5-9 years from now, easily.

PS5/Series X (13.5 GB VRAM) > RTX 3070 (8 GB VRAM) / RTX 3080 (10 GB VRAM) in the long run, starting at year 4.

When games start requiring more than 10 GB of VRAM even at 1440p to look pretty, these cards will only play on ugly graphics, or be completely unplayable in other games.

Please research how all these old cards you mention have performed in modern games since 2017 vs the PS4. The PS4 beats all of these 2013 cards and below, except the Titan / R9 290X.

History will always repeat itself. The PS5/Series X will easily beat these 2-year-old RTX 2000-series cards, just like the PS4 did. No 2-year-old high-end GPU beat the PS2. No 2-year-old high-end GPU beat the PS3, either.

Please research, on modern games, how these 2013 cards (except the Titan and R9 290X) and below play modern demanding PS4 games. The PS4/Xbox One is superior in the long run. Not even a contest.

And guess what: the 2006 PS3 also beats the most powerful GPU of 2007 and roughly 99% of the high-end GPUs of 2008. Those high-end GPUs can't even play the last ~20 most demanding PS3 games, like MGS5, Dragon Age: Inquisition, Thief, Destiny, Far Cry 4, Crysis 3, Black Flag, Shadow of Mordor, Black Ops 3, and so on.

Please research the long run, not just the first 2-3 years. Yes, the RTX 2080 will be able to play next-gen games in the first two years, no problem, but not in the long run.


#64  Edited By DaVillain  Moderator
Member since 2014 • 56062 Posts

@morningrisee: This thread is 5 months old and not worth a bump. If you're new here, don't bump old threads that have gone 2 weeks without anyone posting.

Locking for necro thread.