Is the Nvidia 700 series already obsolete?

#1  Edited By dangamit
Member since 2010 • 664 Posts

The top-tier GTX 780, which is supposed to outperform the R9 290, is being outbenched by the at-best-second-tier R9 280X. Although it was The Witcher 3 that started this shitstorm, the 700 series has been going downhill for more than half a year now. In fact, the 280X was already trading blows with the GTX 780 when Far Cry 4 was released.

Just look at these benchmarks

http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page4.html

http://www.techspot.com/review/917-far-cry-4-benchmarks/page4.html

http://www.techspot.com/review/991-gta-5-pc-benchmarks/page3.html

http://www.techspot.com/review/921-dragon-age-inquisition-benchmarks/page4.html

Just to put things into perspective, look at these older benchmarks and notice the difference between the 780/680 and the 280X

http://www.techspot.com/review/733-batman-arkham-origins-benchmarks/page2.html

http://www.techspot.com/review/615-far-cry-3-performance/page5.html

http://www.techspot.com/review/642-crysis-3-performance/page4.html

For the Crysis 3 benchmark, the 680 (not even the 780) blows the 7970 out of the water! The 7970 is basically the same card as the 280X. Nowadays, the 7970 shits on both the 680 AND the 780. Either AMD's guys are wizards, or Nvidia does not give a shit.

At lower resolutions and settings, the GTX 780 is the clear winner, but the higher you go in resolution and graphics quality, the worse the GTX 780 does. The GTX 780 is not even in the same league as the R9 290 anymore. Why should I keep my two GTX 780s when I can sell them, buy a couple of R9 290s for their better performance, and make $100 in the process?

Unless Nvidia explains what is happening and fixes the problem, I will never buy their products again, because either Nvidia is doing some shady shit behind the scenes to promote their Maxwell cards, or they simply build overpriced products that are clearly inferior to the competition and can't stand the test of time.

And to the fanboys who keep repeating the same crap about "teh maxwell tessellation technology", please remember that you might find yourselves in a similar position a year from now when Nvidia releases its next series of graphics cards.

#2  Edited By ShadowDeathX
Member since 2006 • 11698 Posts

I guess this is one of the pros of having "rehashed" 28nm cards, or cards with the same/similar architecture: longer life thanks to ongoing optimizations. When I bought my 7970s more than 3 years ago, I wasn't expecting to get full use out of them for this long, and now I can go even longer.

Maybe Nvidia only prioritizes Maxwell around game launches, and only optimizes for Kepler, and then Fermi, later?

You probably won't see this repeated, since AMD's new 14nm architecture will finally show up (this is why GCN was rehashed: GCN was the only architecture AMD designed for 28nm) and Pascal comes in 2016. Who really expected the GPU node to stay at 28nm for so long? Clearly not AMD.

AMD probably had a new card design for 20nm, but that never came to be.

#3  Edited By dangamit
Member since 2010 • 664 Posts

So I found this post in one of the threads on the Nvidia forums. I really hope this is true.

#4  Edited By 04dcarraher
Member since 2004 • 23829 Posts
@dangamit said:

The top-tier GTX 780, which is supposed to outperform the R9 290, is being outbenched by the at-best-second-tier R9 280X. Although it was The Witcher 3 that started this shitstorm, the 700 series has been going downhill for more than half a year now. In fact, the 280X was already trading blows with the GTX 780 when Far Cry 4 was released.

For the Crysis 3 benchmark, the 680 (not even the 780) blows the 7970 out of the water! The 7970 is basically the same card as the 280X. Nowadays, the 7970 shits on both the 680 AND the 780. Either AMD's guys are wizards, or Nvidia does not give a shit.

At lower resolutions and settings, the GTX 780 is the clear winner, but the higher you go in resolution and graphics quality, the worse the GTX 780 does. The GTX 780 is not even in the same league as the R9 290 anymore. Why should I keep my two GTX 780s when I can sell them, buy a couple of R9 290s for their better performance, and make $100 in the process?

Unless Nvidia explains what is happening and fixes the problem, I will never buy their products again, because either Nvidia is doing some shady shit behind the scenes to promote their Maxwell cards, or they simply build overpriced products that are clearly inferior to the competition and can't stand the test of time.

And to the fanboys who keep repeating the same crap about "teh maxwell tessellation technology", please remember that you might find yourselves in a similar position a year from now when Nvidia releases its next series of graphics cards.

You need to look at the whole picture and understand what is going on. Between the 7970 GHz/280X and the 780/290, the performance differences are small to begin with under normal gaming circumstances. You're talking about 20-30% differences between each tier.

The AMD GCN architecture is good with DirectCompute, and games that make use of that ability will see gains, while at the same time GCN does not handle tessellation too well. Kepler (i.e. the 600/700 series) does OK with tessellation but does not do well with DirectCompute. The Witcher 3 uses heavy tessellation and some DirectCompute with HairWorks. That is why we are seeing AMD GPUs perform on par with, or a bit better than, the smaller Keplers: the tessellation plus compute load is too much for them while rendering.

There is no shady business going on; it's just that the tech being put into some of these games exposes the limitations of the smaller/older Kepler architecture. In games that do not use real-time physics or tessellation-based effects, you see these GPUs on even ground. Also note that games running at 2560x1600, and newer games that need more VRAM even at 1080p, tend to saturate the 2GB and even 3GB buffers, causing performance to drop. So seeing the 280X perform better than the 680 and come close to the 780 is not a shock when it isn't VRAM-limited the way the 680 is; there is only around a 20% difference in general performance between the 280X and the 780.
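
To make the VRAM point concrete, here is a rough back-of-envelope sketch. Every number in it is an assumption picked for illustration (the asset pool size, buffer count, and bytes per pixel are hypothetical), not a measurement from any of the linked benchmarks:

    # Rough VRAM budget: assumed asset pool plus full-resolution render targets.
    # All figures below are illustrative assumptions, not measured values.

    def render_targets_mb(width, height, bytes_per_pixel=4, buffers=6):
        # Approximate size of a G-buffer-style set of render targets, in MB.
        return width * height * bytes_per_pixel * buffers / (1024 ** 2)

    ASSUMED_ASSET_POOL_MB = 2200  # hypothetical textures/geometry working set at "ultra" in a newer game

    for width, height in [(1920, 1080), (2560, 1600)]:
        needed = ASSUMED_ASSET_POOL_MB + render_targets_mb(width, height)
        for card_mb in (2048, 3072):
            verdict = "fits" if needed <= card_mb else "spills over"
            print(f"{width}x{height}: ~{needed:.0f} MB -> {verdict} on a {card_mb // 1024} GB card")

The exact numbers don't matter; the point is that once the working set creeps past the card's buffer, data starts paging over PCIe and frame rates drop off hard, which hits 2GB cards like the 680 before 3GB cards like the 280X or 780.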

Buying any of the current GCN GPUs right now is dumb, especially with the upcoming release of DX12. It's best to hold off and wait until there is a real reason to upgrade from dual 780s. The fact is that Nvidia's Maxwell architecture handles tessellation and DirectCompute much better than Kepler, so there is no "shady" stuff going on.

Side note: in The Witcher 3, HairWorks uses 8x AA, so lowering it helps quite a bit.
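
For anyone who wants to apply that side note, below is a minimal sketch of the tweak community guides describe: lowering the HairWorks MSAA level in The Witcher 3's rendering.ini. The key name (HairWorksAALevel), the value range, and the install path are assumptions taken from those guides, so verify them against your own install and back the file up first:

    # Hypothetical sketch: lower The Witcher 3's HairWorks MSAA from 8x to 4x.
    # Key name and path are assumptions from community guides; back up the file first.
    from pathlib import Path

    CONFIG = Path(r"C:\Games\The Witcher 3\bin\config\base\rendering.ini")  # assumed install path

    text = CONFIG.read_text()
    patched = text.replace("HairWorksAALevel=8", "HairWorksAALevel=4")
    if patched != text:
        CONFIG.write_text(patched)
        print("HairWorksAALevel lowered to 4")
    else:
        print("Setting not found; nothing changed")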

#5 Alucrd2009
Member since 2007 • 787 Posts

Red Team Wins :)))

#6 deactivated-579f651eab962
Member since 2003 • 5404 Posts

Moved to Hardware

#7 dangamit
Member since 2010 • 664 Posts

@04dcarraher: So in short, Nvidia's shortsightedness and/or incompetence is the reason their cards are falling behind. I really don't care about the engineering/design aspect of the problem. When I bought the 780 I was told it was going to be future proof. You're saying it's not. Had Nvidia told me that Kepler lacks the technology to stay relevant for a couple of years, I would have told 'em to shove it. No matter how you look at it, Nvidia done fucked up.

#8 GeryGo  Moderator
Member since 2006 • 12803 Posts

@dangamit Watch some more benchies:

All three benchies are taken from three different review sites, and the 780 is on the same level as the 290.

I've got a 280X and I know it's about 10% weaker than a 770; the 780 is a whole different tier.

#9 insane_metalist
Member since 2006 • 7797 Posts

Don't judge by one game. 290s just do better at higher res.

#10 deactivated-579f651eab962
Member since 2003 • 5404 Posts

@dangamit said:

@04dcarraher: So in short, Nvidia's shortsightedness and/or incompetence is the reason their cards are falling behind. I really don't care about the engineering/design aspect of the problem. When I bought the 780 I was told it was going to be future proof. You're saying it's not. Had Nvidia told me that Kepler lacks the technology to stay relevant for a couple of years, I would have told 'em to shove it. No matter how you look at it, Nvidia done fucked up.

780s were never future proof. My Titan Xs aren't future proof. Nothing is future proof.

#11  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@klunt_bumskrint said:
@dangamit said:

@04dcarraher: So in short, Nvidia's shortsightedness and/or incompetence is the reason their cards are falling behind. I really don't care about the engineering/design aspect of the problem. When I bought the 780 I was told it was going to be future proof. You're saying it's not. Had Nvidia told me that Kepler lacks the technology to stay relevant for a couple of years, I would have told 'em to shove it. No matter how you look at it, Nvidia done fucked up.

780s were never future proof. My Titan Xs aren't future proof. Nothing is future proof.

Yep,

No GPU is future proof. Blaming Nvidia's shortsightedness and/or incompetence puts this on them when it's really on you, TC. Who told you that a 780 would be future proof, and why did you believe them without question? AMD GPUs are not future proof either, and they have their own follies as well.

#12  Edited By ShadowDeathX
Member since 2006 • 11698 Posts

I found another recent benchmark... on an Nvidia title.

http://us.hardware.info/reviews/6029/5/evolve-tested-with-23-gpus-including-frametimes-testresults-ultra-hd-3840x2160

#13 04dcarraher
Member since 2004 • 23829 Posts

@ShadowDeathX:

weird

#14  Edited By ShadowDeathX
Member since 2006 • 11698 Posts

^ The one I posted is more recent. Titan X included.

AMD and Nvidia were given more time for driver updates, and Turtle Rock for patches.

Game is dead though.

#15  Edited By Coseniath
Member since 2004 • 3183 Posts
@ShadowDeathX said:

^ The one I posted is more recent. Titan X.

AMD and Nvidia were given more time for driver updates, and Turtle Rock for patches.

Game is dead though.

Techspot hasn't been using a Titan X lately. :P

#16  Edited By RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:
@ShadowDeathX said:

^ The one I posted is more recent. Titan X.

AMD and Nvidia were given more time for driver updates, and Turtle Rock for patches.

Game is dead though.

Techspot hasn't been using a Titan X lately. :P

I am a bit skeptical of a 980 performing 33% better than a 970 in this benchmark.

Usually the 980 is only around 15-20% better.

And other benchmarks are closer to that range.
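
For what it's worth, this is the arithmetic behind comparing two cards as a percentage; the FPS numbers below are hypothetical stand-ins, not figures from either review:

    # Percent uplift of one card over another, computed from average FPS.
    # The FPS values are hypothetical, chosen only to show the calculation.
    def uplift_percent(faster_fps, slower_fps):
        return (faster_fps - slower_fps) / slower_fps * 100

    print(f"{uplift_percent(40, 30):.0f}%")  # 40 vs 30 fps -> ~33%, the kind of gap being questioned
    print(f"{uplift_percent(46, 40):.0f}%")  # 46 vs 40 fps -> ~15%, the more typical 980-vs-970 gap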

#17 Coseniath
Member since 2004 • 3183 Posts

@RyviusARC: There is a big difference between those two benchmarks.

Techspot uses GameWorks, while Guru3D doesn't.

I suspect GameWorks is the reason for Techspot's larger gap.

#18 RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:

@RyviusARC: There is a big difference between those two benchmarks.

Techspot uses GameWorks, while Guru3D doesn't.

I suspect GameWorks is the reason for Techspot's larger gap.

So you are saying the GTX 980 dropped only 4 fps with GameWorks on, yet the 970 dropped 10?

#19  Edited By Coseniath
Member since 2004 • 3183 Posts
@RyviusARC said:

So you are saying the GTX 980 dropped only 4 fps with GameWorks on, yet the 970 dropped 10?

I am saying that we don't know how GameWorks works, so I (and we) have no idea what's happening. :P

#20  Edited By neatfeatguy
Member since 2005 • 4400 Posts

A video card is only obsolete when the owner decides it doesn't meet their standards for gaming. Just because you feel a GTX 780 (now two years old; the GeForce GTX 780 was released on May 23, 2013) doesn't provide top-notch performance doesn't mean it's obsolete.

I still run dual GTX 570s. I game at 1080p with most settings on high, and if I feel the need, I'll run some games at 5760x1080. Granted, at 5760x1080 games run at 30-60 fps, with some dips into the 20s, but I don't mind. I still enjoy the experience.

If I game at 1080p, things run great and most games I play hit 50-80 fps. Are the 570s obsolete? No, not to me.

#21 dangamit
Member since 2010 • 664 Posts

@04dcarraher: Well, Nvidia did admit that there is something wrong with their Kepler drivers and that they are currently working on a driver update. They even said Kepler GPUs will see performance improvements across the board, not just in The Witcher 3. This goes to show that it was indeed incompetence on their part. That, or they crippled performance on purpose but never expected gamers to catch them in the act, or the shitstorm that followed. I'm gonna give Nvidia the benefit of the doubt and say they simply screwed up. But what bothers me is that Nvidia fanboys are still coming up with silly excuses and meaningless arguments when Nvidia themselves have admitted that something is wrong.

#22 Coseniath
Member since 2004 • 3183 Posts
@dangamit said:

@04dcarraher: Well, Nvidia did admit that there is something wrong with their Kepler drivers and that they are currently working on a driver update. They even said Kepler GPUs will see performance improvements across the board, not just in The Witcher 3. This goes to show that it was indeed incompetence on their part. That, or they crippled performance on purpose but never expected gamers to catch them in the act, or the shitstorm that followed. I'm gonna give Nvidia the benefit of the doubt and say they simply screwed up. But what bothers me is that Nvidia fanboys are still coming up with silly excuses and meaningless arguments when Nvidia themselves have admitted that something is wrong.

I just came here to verify this. :P

Upcoming Kepler Fix Will Increase Performance In Several Games

A few days ago Nvidia reported it was working on driver updates to address Kepler issues with The Witcher 3. It appears, though, that Kepler owners will enjoy a performance boost in several games with the new driver.

Nvidia’s ManuelG noted that the upcoming drivers will offer a universal fix for Kepler GPUs that will boost performance in various PC games.

#23 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:
@dangamit said:

The top-tier GTX 780, which is supposed to outperform the R9 290, is being outbenched by the at-best-second-tier R9 280X. Although it was The Witcher 3 that started this shitstorm, the 700 series has been going downhill for more than half a year now. In fact, the 280X was already trading blows with the GTX 780 when Far Cry 4 was released.

For the Crysis 3 benchmark, the 680 (not even the 780) blows the 7970 out of the water! The 7970 is basically the same card as the 280X. Nowadays, the 7970 shits on both the 680 AND the 780. Either AMD's guys are wizards, or Nvidia does not give a shit.

At lower resolutions and settings, the GTX 780 is the clear winner, but the higher you go in resolution and graphics quality, the worse the GTX 780 does. The GTX 780 is not even in the same league as the R9 290 anymore. Why should I keep my two GTX 780s when I can sell them, buy a couple of R9 290s for their better performance, and make $100 in the process?

Unless Nvidia explains what is happening and fixes the problem, I will never buy their products again, because either Nvidia is doing some shady shit behind the scenes to promote their Maxwell cards, or they simply build overpriced products that are clearly inferior to the competition and can't stand the test of time.

And to the fanboys who keep repeating the same crap about "teh maxwell tessellation technology", please remember that you might find yourselves in a similar position a year from now when Nvidia releases its next series of graphics cards.

You need to look at the whole picture and understand what is going on. Between the 7970 GHz/280X and the 780/290, the performance differences are small to begin with under normal gaming circumstances. You're talking about 20-30% differences between each tier.

The AMD GCN architecture is good with DirectCompute, and games that make use of that ability will see gains, while at the same time GCN does not handle tessellation too well. Kepler (i.e. the 600/700 series) does OK with tessellation but does not do well with DirectCompute. The Witcher 3 uses heavy tessellation and some DirectCompute with HairWorks. That is why we are seeing AMD GPUs perform on par with, or a bit better than, the smaller Keplers: the tessellation plus compute load is too much for them while rendering.

There is no shady business going on; it's just that the tech being put into some of these games exposes the limitations of the smaller/older Kepler architecture. In games that do not use real-time physics or tessellation-based effects, you see these GPUs on even ground. Also note that games running at 2560x1600, and newer games that need more VRAM even at 1080p, tend to saturate the 2GB and even 3GB buffers, causing performance to drop. So seeing the 280X perform better than the 680 and come close to the 780 is not a shock when it isn't VRAM-limited the way the 680 is; there is only around a 20% difference in general performance between the 280X and the 780.

Buying any of the current GCN GPUs right now is dumb, especially with the upcoming release of DX12. It's best to hold off and wait until there is a real reason to upgrade from dual 780s. The fact is that Nvidia's Maxwell architecture handles tessellation and DirectCompute much better than Kepler, so there is no "shady" stuff going on.

Side note: in The Witcher 3, HairWorks uses 8x AA, so lowering it helps quite a bit.

That last point is really the crux of the matter. Many of these games see poor performance even on the most MODERN video cards because of poorly thought-out features that give little visual return for the massive performance hit. Ubersampling in The Witcher 2 was a great example of this. Shadow quality is another big one: on my 660M laptop, GeForce Experience would always set Hitman: Absolution's shadow quality to ultra (or whatever the highest setting was), which caused noticeable dips in frame rate. I switched it to high, really couldn't see a difference in the shadows (or the differences were extremely minor and trivial), and literally saw a 20 to 40% increase in fps with no dips. This shit has happened for years: features the devs didn't think through properly shouldn't even be framed as your hardware not being new enough, IMO. That's why I find the whole idea of "maxing out" games a dumb term, because many games ship poorly optimized, or at best experimental, features that can bring the most powerful machine to its knees through a complete lack of optimization.

It's like the recent NCIX video in which they attempted 12K resolution with quad Titans: something that should work on paper in a game like Battlefield Hardline, but wasn't possible without severe micro-stuttering.

#24 deactivated-579f651eab962
Member since 2003 • 5404 Posts

My 780s finally went to the second-hand gods; they will be missed. Still plenty of life left in a 700 series SLI setup.