AMD to offer deal for those who are returning their GTX 970

#1 Xtasy26
Member since 2008 • 5582 Posts

"Anyone returning their GTX970 and wanting a great deal on a Radeon with a full 4GB please let us know."

https://twitter.com/amd_roy/status/560462075193880576

E-mail: roy.taylor@amd.com

AMD R9 290X/290 with a "full 4GB" along with some other "deal" doesn't sound like a bad idea.

And oh yeah, straight from an AMD rep: :P

https://twitter.com/Thracks/status/560511204951855104

#2 GTR12
Member since 2006 • 13490 Posts

My 970 Strix has 4GB; word your advertisement better, AMD.

#3 04dcarraher
Member since 2004 • 23829 Posts

OMG, the ploy is to get the uneducated to buy their power-hungry, outclassed cards...

#4 topgunmv
Member since 2003 • 10880 Posts

I wonder if nvidia will be open to lawsuits for falsely advertising their number of ROPs and L2 cache size.

#5  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@topgunmv said:

I wonder if nvidia will be open to lawsuits for falsely advertising their number of ROPs and L2 cache size.

Doubtful... has AMD been sued for publishing false TDP numbers, or for the black-screen GCN issue that has plagued many? And how about AMD "lying" about Bulldozer's 2 billion transistors when it actually has only 1.2 billion? AMD had to go and fix that "mistake" by updating the data with reviewers, and it took them a while to do it. But the fact is it may have been a miscommunication at first, and no one bothered fixing the mistake because they knew the performance wasn't affected enough to worry about; it could have been swept under the rug and no one would have been the wiser.

#6 Xtasy26
Member since 2008 • 5582 Posts

@topgunmv said:

I wonder if nvidia will be open to lawsuits for falsely advertising their number of ROPs and L2 cache size.

In sue-happy America, you never know. They did get sued over the "defective die" packaging fiasco with their laptop GPUs, which I was a victim of, but I didn't get compensation because my gaming laptop's maker wasn't among the manufacturers included in the settlement :( (which I am still p***** off at nVidia about); it only included Dell, HP, and Acer. They lost in court and had to pay out compensation. The sad thing is I was thinking about jumping back to nVidia, as the GTX 970 is the best price/performance GPU of 2014 IMO. Not sure if I want to do that after this RAMgate fiasco; people who already bought the GTX 970 are returning it.

#7 insane_metalist
Member since 2006 • 7797 Posts

Yup, actual 4GB cards. A while back I was monitoring VRAM usage, and Advanced Warfare alone uses 3.8GB @ 1440p.
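
(For anyone who wants to log that kind of reading themselves, here is a rough sketch of one way to do it, assuming an NVIDIA card with nvidia-smi on the PATH; it just polls the driver's reported usage once a second. It's only an illustration, not the tool used for the number above.)

```python
# Rough sketch: poll nvidia-smi once a second and track peak VRAM usage.
# Assumes an NVIDIA GPU with nvidia-smi available on the PATH.
import subprocess
import time

peak_mib = 0
try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # First line = first GPU; values are reported in MiB.
        used_mib, total_mib = (int(v) for v in out.splitlines()[0].split(","))
        peak_mib = max(peak_mib, used_mib)
        print(f"VRAM: {used_mib} / {total_mib} MiB (peak {peak_mib} MiB)")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak VRAM usage: {peak_mib} MiB (~{peak_mib / 1024:.2f} GiB)")
```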

#8  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@insane_metalist said:

Yup, actual 4GB cards. A while back I was monitoring VRAM usage, and Advanced Warfare alone uses 3.8GB @ 1440p.

The 970 is as well; Farcry 4 with DSR at max settings and no AA used 4030MB.

#10  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Chatch09 said:

@04dcarraher said:

@insane_metalist said:

Yup, actual 4GB cards. A while back I was monitoring VRAM usage, and Advanced Warfare alone uses 3.8GB @ 1440p.

The 970 is as well; Farcry 4 with DSR at max settings and no AA used 4030MB.

Are you getting the reported 5-10% slow-down when it uses that much of the VRAM?

You really can't see the difference, because this issue only shows up in benchmarking.

You don't suddenly lose enough performance to notice; instead of 21 fps you get 20 fps, which is roughly what the 5% average loss from the memory segmentation works out to. When you push settings and resolution hard enough to use the full 4GB buffer, the 970 or 980 doesn't have enough grunt to give you good performance anyway. And this is where many are overlooking the fact that excessive GPU load is causing the issues.

Fact is, people are overreacting, and the GTX 970's performance fits right into the bracket of being 10-15% slower than the 980 at 4K.
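
(For context, here is the rough peak-bandwidth arithmetic behind that segmentation, using the figures Nvidia disclosed after the story broke; these are theoretical peaks, not benchmark results.)

```python
# Back-of-the-envelope peak bandwidth for the GTX 970's segmented memory,
# based on the disclosed specs: a 256-bit bus (8 x 32-bit channels) of
# 7.0 Gbps GDDR5, with 7 channels behind the 3.5GB pool and 1 behind the 0.5GB pool.
bus_width_bits = 256
effective_gbps = 7.0

total_gbs = bus_width_bits * effective_gbps / 8   # 224 GB/s across the whole card
fast_pool_gbs = total_gbs * 7 / 8                 # 3.5GB pool: ~196 GB/s
slow_pool_gbs = total_gbs * 1 / 8                 # 0.5GB pool: ~28 GB/s

print(f"total {total_gbs:.0f} GB/s, 3.5GB pool {fast_pool_gbs:.0f} GB/s, "
      f"0.5GB pool {slow_pool_gbs:.0f} GB/s")
```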

#11 JigglyWiggly_
Member since 2009 • 24625 Posts

Damn what a deal, trade in your defective gpu for one that lights itself on fire.

#13  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Chatch09 said:

@04dcarraher said:

@Chatch09 said:

Oh ok, but I would think that 5-10% would be more significant when running games well over 60 fps. Or is it that turning the settings down to achieve 96-144fps doesnt take as much VRAM to cause the problem? (if that makes sense)

The easiest way to explain this: running games below 4K without excessive settings, you're not going to see any real issue with the 970 or 980. At 4K with demanding games, even the 980 doesn't reach 60 fps averages, even with full 4GB memory utilization. The VRAM usage is not the issue; it's the GPU load from the settings and resolutions being used that people are overlooking.

Here is one test to see what the memory segmentation does when you use both pools.
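
(Roughly, that kind of test grabs VRAM chunk by chunk and measures how fast each chunk can be written; on a 970 the last ~0.5GB shows up as a sharp bandwidth drop. Below is a minimal sketch of the idea, assuming a CUDA-enabled PyTorch install; it is not the specific test being referred to, just an illustration.)

```python
# Minimal sketch of a segmented-memory probe: grab VRAM in 128 MiB chunks and
# time an on-GPU copy into each one. A much slower pool (e.g. the 970's last
# 0.5 GB) shows up as a bandwidth cliff near the end of the run.
# Assumes a CUDA-enabled PyTorch install; numbers are rough, not a real benchmark.
import time
import torch

chunk_elems = 128 * 1024 * 1024 // 4      # 128 MiB worth of float32 elements
src = torch.empty(chunk_elems, device="cuda")
chunks = []
try:
    while True:
        dst = torch.empty(chunk_elems, device="cuda")
        torch.cuda.synchronize()
        start = time.perf_counter()
        dst.copy_(src)                    # device-to-device copy into the new chunk
        torch.cuda.synchronize()
        elapsed = time.perf_counter() - start
        chunks.append(dst)
        gb_per_s = (chunk_elems * 4) / elapsed / 1e9
        print(f"{len(chunks) * 128:5d} MiB allocated: {gb_per_s:6.1f} GB/s")
except RuntimeError:
    # PyTorch raises a RuntimeError (out of memory) once the card is full.
    print(f"stopped after ~{len(chunks) * 128} MiB")
```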

#14  Edited By GeryGo  Moderator
Member since 2006 • 12803 Posts

Whether or not games use all that VRAM, at 4K or 1080p, it doesn't matter: when you're buying a high-end GPU you expect it to last for years, and when games become more demanding on VRAM you'll be sad to see how it performs against AMD GPUs or against other-tier Nvidia GPUs.

#16 bfa1509
Member since 2011 • 1058 Posts

@Xtasy26

You lose a lot of integrity by simply not telling us what the deal is.

#17 Xtasy26
Member since 2008 • 5582 Posts

According to PCWorld, AMD just announced a price drop on their R9 290X in light of nVidia's GTX 970 memory fiasco. You can get an R9 290X for as low as $280. That's very tempting.

http://www.pcworld.com/article/2877223/big-flagship-radeon-graphics-card-price-cuts-add-pressure-to-gtx-970-memory-furor.html

#18 Xtasy26
Member since 2008 • 5582 Posts

@bfa1509: Don't know; you have to e-mail roy.taylor@amd.com. It only applies to those who bought a GTX 970. I think it's a special type of thing done on a one-on-one basis. E-mail the guy to find out.

#19 DefconRave
Member since 2013 • 806 Posts

Nah, i'm good with my 970. Don't want to downgrade ;D

#20 Xtasy26
Member since 2008 • 5582 Posts

@JigglyWiggly_ said:

Damn what a deal, trade in your defective gpu for one that lights itself on fire.

Not true. Custom cooling from Asus, XFX, Sapphire, and others is very effective at dissipating heat, and at a great price to boot: $280-$300.

#21 Xtasy26
Member since 2008 • 5582 Posts

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

#22  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Xtasy26 said:

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

The 970 tends to outperform the 290X all the way up to 4K, and even then the 290X barely outpaces it. Believe it or not, at 4K the 970 uses that segmented 4GB buffer and is still on the heels of the 290X, i.e. within a few frames. The 290X, even with its 512-bit, 320GB/s bus, can't outright beat a 970.

What in the world are you talking about with future games that can be memory intensive? Unless a game requires more than 4GB of VRAM, the 290X isn't going to do much better than a 970. And at 4K the 970, 980, and 290X all lack the grunt to reach 60 fps averages. And if and when Nvidia releases a driver update to allocate VRAM better, by steering the typical Windows/driver cache to that 0.5GB segment, the 970 should see some gains; keep in mind that not all of the VRAM buffer is used solely by the game you're playing, as a reserve does exist.
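
(For reference, the raw numbers behind that bus comparison; this is simple peak-bandwidth arithmetic from the published specs, and it says nothing about how efficiently either card uses that bandwidth in games.)

```python
# Peak memory bandwidth from published specs: bus width (bits) times effective
# data rate (Gbps per pin), divided by 8 bits per byte. Theoretical peaks only.
def peak_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    return bus_width_bits * effective_gbps / 8

r9_290x = peak_gbs(512, 5.0)   # 320 GB/s
gtx_970 = peak_gbs(256, 7.0)   # 224 GB/s total (about 196 GB/s on the 3.5GB pool)

print(f"R9 290X: {r9_290x:.0f} GB/s vs GTX 970: {gtx_970:.0f} GB/s")
```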

#23  Edited By GiantAssPanda
Member since 2011 • 1885 Posts

Despite the controversy, at the end of the day, at least from my perspective, the GTX 970 is still the best bang-for-the-buck card out there, and I don't see the point in returning mine (2x MSI 970) to get AMD cards, especially considering how power-hungry and hot they are and the issues they have with CrossFire, particularly in DX9 games. Also, there seems to be some variation between cards regarding their max VRAM limit. One of my cards can only use around 3.55GB of VRAM before massive stuttering and frame-rate drops occur, while the other can get near 3.8GB of VRAM usage before hitting the limit, so I'm using the latter as the master GPU in my SLI configuration. Some say this is because some cards have a more severely cut L2 cache than others.

I could return my cards, get a refund, and go for a dual GTX 980 setup, but that would cost around 400€ more just so I could get 200-300MB more usable VRAM and some extra performance? No thanks. The law of diminishing returns would hit pretty hard at that point.

Screw Nvidia for lying to their customers, though. The fault lies with Nvidia, yet retailers and board manufacturers will also suffer from this.

#24 Xtasy26
Member since 2008 • 5582 Posts

Hello ladies and gents,

Straight from AMD's front page:

http://www.amd.com/en-us

If you use the hashtag #IWant4GB on Twitter between now and the end of tomorrow, you may have a chance to win an AMD R9 290X.

Good luck!

#25  Edited By Xtasy26
Member since 2008 • 5582 Posts

@04dcarraher said:

@Xtasy26 said:

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

970 tends to outperform 290x all the way upto 4k, and even then 290x barely out paces the 970. Believe it or not at 4k 970 uses that segmented 4gb buffer and still is on the heels of the 290x aka few frames. The 290x even with its 512bit 320GB/s cant out right beat a 970.

What in the world are you talking about future games that can be memory intensive? Unless a game requires more then 4gb of vram 290x isnt going to do much better then a 970. And at 4k both the 970 980 and 290x dont have enough grunt to reach 60 fps averages. If and when Nvidia releases a driver update to help allocate vram better. By loading the typical windows/driver cache that gets used to that .5gb part. To point out not all of the vram buffer is solely used for the game your playing a reserve does exist. So the 970 should see some gains.

Future games can certainly affect performance. If you go over to nVidia's GeForce forums, people are already having issues with certain games that exceed 3.5 GB. If they're having issues right now, what happens with future games? 4GB would be more "future proof," and that's the reason some people bought the card. If that weren't the case, people wouldn't be flooding the forums asking how they can return their GTX 970. The classic example I use is GTA IV, which I couldn't max out because of a memory limitation: GTA IV required at least 1.5 GB and my HD 4870 only had 1GB. It wasn't until I got a 2GB card that I could max out GTA IV. And a game doesn't necessarily need to require more than 4GB; it could be more than 3.5 GB, and at 4K it doesn't necessarily need to run at 60 FPS. It could be between 30 and 40 FPS.

And the guy from nVidia who posted about a driver update to fix the issue, which made waves across the internet, backpedaled and went back and altered his post.

http://imgur.com/HMqmIw7

Even nVidia's recent twitter post indicated that a driver update won't magically fix this, which isn't a surprise, as this is more related to the design of the GTX 970 as opposed to something that can readily be fixed by a driver update.

#26  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Xtasy26:

Again, do not ignore the excessive resolutions and settings being used by many of those people. Many are simply using settings no single GPU can handle at a good frame rate, i.e. 4K, massive amounts of AA, etc.

What you seem not to understand is that the processing power difference between the 970 and the 290 series, and the VRAM usage, will not make any real difference in future games below 4K. Both GPUs lack the processing power for 4K and/or massive amounts of AA. It does not matter if a future game can allocate 4GB of VRAM; when future games come they will be more demanding on processing power, not just VRAM usage, and by the time 4GB isn't enough for 1080p gaming, both the 970 and the 290X will be out of the picture.

#28 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:

@Xtasy26:

Again do not ignore excessive resolutions and settings being used by many of those people. Many are just using too much that no single gpu can handle with a good frame rate.ie 4k, massive amounts of AA etc.

What you seem not to understand that the processing power difference between 970 and 290 series and vram usage will not make any real big difference with future games below 4k. Both gpus dont have enough processing power to do 4k and or massive amounts of AA. It does not matter if a future game can allocate 4gb of vram, When future games come they will more demanding on processing power not just vram usage and by the time 4gb isnt enough for 1080p gaming both the 970 and 290x will be out of the picture.

4k resolution is literally used by a very small % of the actual gaming population to begin with.. And it makes me scratch my head that the same guys who spent a ton of money on getting a 4k display are the ones that got a 970 over a 980 SLI..

#29  Edited By deactivated-59d151f079814
Member since 2003 • 47239 Posts

@Xtasy26 said:

@04dcarraher said:

@Xtasy26 said:

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

970 tends to outperform 290x all the way upto 4k, and even then 290x barely out paces the 970. Believe it or not at 4k 970 uses that segmented 4gb buffer and still is on the heels of the 290x aka few frames. The 290x even with its 512bit 320GB/s cant out right beat a 970.

What in the world are you talking about future games that can be memory intensive? Unless a game requires more then 4gb of vram 290x isnt going to do much better then a 970. And at 4k both the 970 980 and 290x dont have enough grunt to reach 60 fps averages. If and when Nvidia releases a driver update to help allocate vram better. By loading the typical windows/driver cache that gets used to that .5gb part. To point out not all of the vram buffer is solely used for the game your playing a reserve does exist. So the 970 should see some gains.

Future games can certainly affect performance. People are having issues with certain games that exceed 3.5 GB if you go over at nVidia's GeForce forums. If one is having issues right now, what happens with future games? 4GB would be more "future proof" as that's the reason why some people brought it. If that wasn't the case, people wouldn't be flooding the forums asking how they can return than GTX 970. The classic example I use is GTA IV where I couldn't max it out because of memory limitation as GTA IV required at least 1.5 GB and my HD 4870 was only 1GB. It wasn't until I got 2GB Card that I could max out GTA IV. And games doesn't need to necessarily require more than 4GB it could be more than 3.5 GB and at 4K it doesn't necessarily need to run at 60 FPS. It could be between 30 - 40 FPS.

And the guy from nVidia who posted about driver update to fix the issue that made waves over the internet back pedaled and went back and altered his post.

http://imgur.com/HMqmIw7

Even nVidia's recent twitter post indicated that a driver update won't magically fix this, which isn't a surprise, as this is more related to the design of the GTX 970 as opposed to something that can readily be fixed by a driver update.

......... I don't get this hard-on for 4k displays.. They are literally used by less than 1% of the actual gaming population.. Personally I would take 1080p at 144 fps ANY DAY of the week over 4k resolution at 30 to 40 fps.. And it is funny that you mention GTA4, a notorious game with awful optimization.. Meanwhile the GTAV recommended specs are insanely low.. This isn't trying to defend the whole fiasco with Nvidia, it is ridiculous.. I am more concerned about the impact it may have down the road with 1080p gaming.. If you're going by 4k display gaming, as far as I am concerned you're an idiot for being outraged.. 4k display gaming is extremely demanding (not to mention 4k displays are extremely expensive).. Why the **** would you buy a single or even SLI 970s when 4k resolutions have been bringing many of these games to their knees on higher-end cards? I would sooner call you a stupid consumer than put any blame on the companies.. Seriously, even on older games like Farcry 3 you are seeing dips to 30 or fewer fps with the 290x and 970 in crossfire/sli at 4k resolution.. Using either card in SLI at 4k resolution imo is already not very attractive.. Unless you guys enjoy that "cinematic" 30fps we all make fun of.

#31  Edited By Elann2008
Member since 2007 • 33028 Posts
@bfa1509 said:

@Xtasy26

You lose a lot of integrity by simply not telling us what the deal is.

Hahaha. Made my day. =o)

@04dcarraher said:

@Xtasy26:

Again do not ignore excessive resolutions and settings being used by many of those people. Many are just using too much that no single gpu can handle with a good frame rate.ie 4k, massive amounts of AA etc.

What you seem not to understand that the processing power difference between 970 and 290 series and vram usage will not make any real big difference with future games below 4k. Both gpus dont have enough processing power to do 4k and or massive amounts of AA. It does not matter if a future game can allocate 4gb of vram, When future games come they will more demanding on processing power not just vram usage and by the time 4gb isnt enough for 1080p gaming both the 970 and 290x will be out of the picture.

Exactly.

#32 Xtasy26
Member since 2008 • 5582 Posts

@04dcarraher said:

@Xtasy26:

Again do not ignore excessive resolutions and settings being used by many of those people. Many are just using too much that no single gpu can handle with a good frame rate.ie 4k, massive amounts of AA etc.

What you seem not to understand that the processing power difference between 970 and 290 series and vram usage will not make any real big difference with future games below 4k. Both gpus dont have enough processing power to do 4k and or massive amounts of AA. It does not matter if a future game can allocate 4gb of vram, When future games come they will more demanding on processing power not just vram usage and by the time 4gb isnt enough for 1080p gaming both the 970 and 290x will be out of the picture.

Some people may want to use their GPUs for 3-4 years. If you go over to nVidia's GeForce forums, you'll see people who put down $330 for a GPU they expected to have a full 4GB. This is especially true for those who bought GTX 970 SLI setups; they're even more pi$$ed off, as they wanted their new GPUs to last for a while.

Heck, my HD 6970 is going on a little over 4 years now.

#33 Xtasy26
Member since 2008 • 5582 Posts

@sSubZerOo said:

@Xtasy26 said:

@04dcarraher said:

@Xtasy26 said:

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

970 tends to outperform 290x all the way upto 4k, and even then 290x barely out paces the 970. Believe it or not at 4k 970 uses that segmented 4gb buffer and still is on the heels of the 290x aka few frames. The 290x even with its 512bit 320GB/s cant out right beat a 970.

What in the world are you talking about future games that can be memory intensive? Unless a game requires more then 4gb of vram 290x isnt going to do much better then a 970. And at 4k both the 970 980 and 290x dont have enough grunt to reach 60 fps averages. If and when Nvidia releases a driver update to help allocate vram better. By loading the typical windows/driver cache that gets used to that .5gb part. To point out not all of the vram buffer is solely used for the game your playing a reserve does exist. So the 970 should see some gains.

Future games can certainly affect performance. People are having issues with certain games that exceed 3.5 GB if you go over at nVidia's GeForce forums. If one is having issues right now, what happens with future games? 4GB would be more "future proof" as that's the reason why some people brought it. If that wasn't the case, people wouldn't be flooding the forums asking how they can return than GTX 970. The classic example I use is GTA IV where I couldn't max it out because of memory limitation as GTA IV required at least 1.5 GB and my HD 4870 was only 1GB. It wasn't until I got 2GB Card that I could max out GTA IV. And games doesn't need to necessarily require more than 4GB it could be more than 3.5 GB and at 4K it doesn't necessarily need to run at 60 FPS. It could be between 30 - 40 FPS.

And the guy from nVidia who posted about driver update to fix the issue that made waves over the internet back pedaled and went back and altered his post.

http://imgur.com/HMqmIw7

Even nVidia's recent twitter post indicated that a driver update won't magically fix this, which isn't a surprise, as this is more related to the design of the GTX 970 as opposed to something that can readily be fixed by a driver update.

......... I don't get this hard on for 4k displays.. They literally make up less than 1% of the actual gaming population who uses them.. Personally I would take 1080p at 144 fps ANY DAY of the week over 4k resolution at 30 to 40 fps.. And it is funny that you mention GTA4 a notorious game that has awful optimization.. Meanwhile the GTAV recommends are insanely low.. This isn't trying to defend the whole fiasco with Nvidia, it is ridiculous.. I am more concerned on the impact it may have down the road with 1080p gaming.. If your going on by 4k display gaming, as far as I am concerned your an idiot for being outraged.. 4k display gaming is extremely demanding (not to mention displays of 4k are extremely expensive).. Why the **** would you buy a single or even sli 970s when 4k display resolutions have been bringing set ups on many of these games down on their knees on higher end cards? I would sooner call you a stupid consumer than put any blame on the companies.. Seriously even on older games like Farcry 3 you are seeing dips with the 290x and 970 in crossfire/sli to 30 or less fps at 4k resolution.. Using either card in SLI at 4k resolution imo is already not very attractive.. In less you guys enjoy that "cinematic" 30fps we all make fun of.

Your concern about 1080p gaming down the road would have been my concern too. And secondly, some people did buy GTX 970 SLI specifically for their 4K or 1440p setups; they don't sound too happy, and I wouldn't be either, given that it was nearly a $700 investment.

@Elann2008 said:

Hahaha. Made my day. =o)

Hey, don't blame me. I am just the messenger. ;)

#34 insane_metalist
Member since 2006 • 7797 Posts

SHOCKING interview with Nvidia engineer about the 970 fiasco

#35  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Xtasy26 said:

@04dcarraher said:

@Xtasy26:

Again do not ignore excessive resolutions and settings being used by many of those people. Many are just using too much that no single gpu can handle with a good frame rate.ie 4k, massive amounts of AA etc.

What you seem not to understand that the processing power difference between 970 and 290 series and vram usage will not make any real big difference with future games below 4k. Both gpus dont have enough processing power to do 4k and or massive amounts of AA. It does not matter if a future game can allocate 4gb of vram, When future games come they will more demanding on processing power not just vram usage and by the time 4gb isnt enough for 1080p gaming both the 970 and 290x will be out of the picture.

Some people may want to use their GPUs for 3 - 4 years. If you go over to nVidia's GeForce forums; people who were putting down $330 for a GPU they expected to be 4GB. This is especially true for those who brought SLI setups of GTX 970 who were even more pi$$ed off as they wanted their new GPUs to last for a while.

Heck my HD 6970 is going on for a little over 4 years now.

lol, again: 3.5GB vs 4GB with similar GPU processing power is not what you should be counting on for 3-4 years of use at or above 1440p, and it's not going to make any real difference in the end; saying it will is a bald-faced lie. A prime example is the GTX 680 vs the 7970, the same type of situation with an even bigger 1GB difference, and yet the 680 performs on par with the 7970 at 1440p with 4xAA in BF4 or even Dragon Age: Inquisition. Heck, even in Assassin's Creed Unity at 1080p the 7970 was a whole 4 fps faster than the 680, yet the 7970 still can't even manage a 30 fps average.

#36  Edited By RyviusARC
Member since 2011 • 5708 Posts

@04dcarraher said:

@Xtasy26 said:

@04dcarraher said:

@Xtasy26:

Again do not ignore excessive resolutions and settings being used by many of those people. Many are just using too much that no single gpu can handle with a good frame rate.ie 4k, massive amounts of AA etc.

What you seem not to understand that the processing power difference between 970 and 290 series and vram usage will not make any real big difference with future games below 4k. Both gpus dont have enough processing power to do 4k and or massive amounts of AA. It does not matter if a future game can allocate 4gb of vram, When future games come they will more demanding on processing power not just vram usage and by the time 4gb isnt enough for 1080p gaming both the 970 and 290x will be out of the picture.

Some people may want to use their GPUs for 3 - 4 years. If you go over to nVidia's GeForce forums; people who were putting down $330 for a GPU they expected to be 4GB. This is especially true for those who brought SLI setups of GTX 970 who were even more pi$$ed off as they wanted their new GPUs to last for a while.

Heck my HD 6970 is going on for a little over 4 years now.

lol again 3.5gb vs 4gb with similar gpu processing power is not what you should be aiming for for 3-4 years use at or above 1440p. And is not going to make any real difference in the end. and saying it will is a bold face lie. Prime examples is GTX 680 vs 7970 which same type of difference with even 1gb difference and yet 680 performs on par to 7970 at 1440p with 4xAA with BF4 or even Dragon Age Inquisition at 1440. Heck even with Assassin Creed Unity at 1080p 7970 was a whole 4 fps faster then 680 but yet 7970 still can not even make 30 fps average.

I want my 970 SLI setup to last at least 2 years for 1440p gaming, and it can, but I have my doubts about doing that at max settings.

Currently at 1440p, the only games that really caused a lot of stuttering were Skyrim with modded textures and Assassin's Creed Unity when going from 2xMSAA to 4xMSAA with maxed settings.

Shadow of Mordor could get the occasional stutter with ultra textures, but it didn't happen often enough to bug me.

I know consoles had way more performance issues, so in their current form the GTX 970s, even in SLI, are not doing horribly at 1440p max settings.

I will probably switch when Pascal comes out since it's bringing a lot of changes.

But I will only make the switch if the games demand it and I will definitely wait a while to make sure Nvidia is not lying about specs.

Now if I cannot max The Witcher 3 (ubersampling disabled) at 1440p then I will not be pleased but I will see when that time comes.

#38  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@RyviusARC said:

I want my 970 sli setup to last at least 2 years for 1440p gaming and it can but I have my doubts about doing that at max settings.

Currently at 1440p the only games the really caused a lot of stuttering was Skyrim with modded textures and Assassin's Creed Unity when going from 2xMSAA to 4xMSAA with maxed settings.

Shadow of Mordor could get the occasional stutter with ultra textures but it didn't happen often enough to bug me.

I know consoles had way more performance issues so in it's current form the GTX 970s even in SLI are not doing horrible with 1440p max settings.

I will probably switch when Pascal comes out since it's bringing a lot of changes.

But I will only make the switch if the games demand it and I will definitely wait a while to make sure Nvidia is not lying about specs.

Now if I cannot max The Witcher 3 (ubersampling disabled) at 1440p then I will not be pleased but I will see when that time comes.

Part of the problem is expecting to max out games with slight to moderate amounts of AA at 1440p or higher, and doing so in games that are known to be poorly coded should not be used as proof of an issue with the VRAM buffer. A prime example is Assassin's Creed: why are you expecting your SLI 970s to perform well when a single 970 can't even get 20 fps at 1440p with 4x AA, and even a 290X with a full "4GB" buffer and a 512-bit bus barely gets 25 fps? Shadow of Mordor's VRAM usage with the ultra texture pack can allocate all the way up to 6GB, so even if you had, say, 980 SLI with its full-bandwidth 4GB buffer, you would still see the stutter. I've played around with high and ultra textures (not the super-ultra texture DLC), and it's hard to tell much of a difference while playing. Even with DSR, without the ultra textures and just using high, the game caps out around 3GB.

The segmented memory pool in SLI 970s can cause issues, but it tends to be a game-by-game scenario. BF4 with SLI 970s, for example, does not have much of an issue at 4K until you start upping the scaling past 1.4x, which would be like running at 6K resolution. Farcry 4 runs fine with SLI 970s, only allocating about 3GB at 1440p ultra settings.

It was not right for Nvidia to screw up like that and hide the truth about the 970's differences, whether it was an honest mistake at first that then became a guilty omission.

But the best thing to do is not to blindly expect to max out games; some are well coded and show no issues, while others hog resources for no real reason. You're just going to have to look at the differences between high vs. ultra textures side by side and see if the gains are even worth it.

#39  Edited By topgunmv
Member since 2003 • 10880 Posts

@04dcarraher said:

@RyviusARC said:

I want my 970 sli setup to last at least 2 years for 1440p gaming and it can but I have my doubts about doing that at max settings.

Currently at 1440p the only games the really caused a lot of stuttering was Skyrim with modded textures and Assassin's Creed Unity when going from 2xMSAA to 4xMSAA with maxed settings.

Shadow of Mordor could get the occasional stutter with ultra textures but it didn't happen often enough to bug me.

I know consoles had way more performance issues so in it's current form the GTX 970s even in SLI are not doing horrible with 1440p max settings.

I will probably switch when Pascal comes out since it's bringing a lot of changes.

But I will only make the switch if the games demand it and I will definitely wait a while to make sure Nvidia is not lying about specs.

Now if I cannot max The Witcher 3 (ubersampling disabled) at 1440p then I will not be pleased but I will see when that time comes.

Part of the problem is expecting to max out games with slight to moderate amounts of AA at 1440p or higher, and doing so on games that are known to be poorly coded should not be used as proof of an issue with the vram buffer. Prime example is Assassins so why are you expecting your SLI 970 to perform well when a single 970 cant even get 20 fps at 1440 at 4x AA, even with a 290x with a full "4gb" buffer and 512 bit bus barely gets 25 fps. Shadow of Mordor vram usage with ultra textures pack can allocate all the way to 6gb, So even if you had say a 980 sli with its full memory bandwidth 4gb buffer you would still see the stutter. I've played around with high and ultra textures "not the super ultra texture DLC" , and it's hard to tell much of a difference while playing. Even with DSR without using the Ultra textures, just using high the game caps around 3gb.

The segmented memory pool in the SLI 970's can cause issues but it tends to be game by game scenario. BF4 with SLI 970's for example does not have much of an issue at 4k until you start upping the scaling past 1.4x which would be like running at 6k resolution. Farcry 4 runs fine with SLI 970's only allocating about 3gb at 1440p ultra settings.

It was not right for Nvidia to screw up like that and hid the truth about the 970's differences. whether or not it was at first a honest mistake and then became a guilty omission.

But best thing to do is not to blindly expecting to max out games, some are well coded and see no issues while others hog up resources for no real reason. Just going to have to look at the differences between high vs ultra textures side by side and see if the gains are even worth it.

There is no difference between high and ultra textures if you don't download the "hd" texture dlc.

#40  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@topgunmv said:

There is no difference between high and ultra textures if you don't download the "hd" texture dlc.

The textures aren't as compressed; very minor differences, but VRAM usage is a bit higher, using another 500-600MB.

#41 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@Xtasy26 said:

@sSubZerOo said:

@Xtasy26 said:

@04dcarraher said:

@Xtasy26 said:

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

970 tends to outperform 290x all the way upto 4k, and even then 290x barely out paces the 970. Believe it or not at 4k 970 uses that segmented 4gb buffer and still is on the heels of the 290x aka few frames. The 290x even with its 512bit 320GB/s cant out right beat a 970.

What in the world are you talking about future games that can be memory intensive? Unless a game requires more then 4gb of vram 290x isnt going to do much better then a 970. And at 4k both the 970 980 and 290x dont have enough grunt to reach 60 fps averages. If and when Nvidia releases a driver update to help allocate vram better. By loading the typical windows/driver cache that gets used to that .5gb part. To point out not all of the vram buffer is solely used for the game your playing a reserve does exist. So the 970 should see some gains.

Future games can certainly affect performance. People are having issues with certain games that exceed 3.5 GB if you go over at nVidia's GeForce forums. If one is having issues right now, what happens with future games? 4GB would be more "future proof" as that's the reason why some people brought it. If that wasn't the case, people wouldn't be flooding the forums asking how they can return than GTX 970. The classic example I use is GTA IV where I couldn't max it out because of memory limitation as GTA IV required at least 1.5 GB and my HD 4870 was only 1GB. It wasn't until I got 2GB Card that I could max out GTA IV. And games doesn't need to necessarily require more than 4GB it could be more than 3.5 GB and at 4K it doesn't necessarily need to run at 60 FPS. It could be between 30 - 40 FPS.

And the guy from nVidia who posted about driver update to fix the issue that made waves over the internet back pedaled and went back and altered his post.

http://imgur.com/HMqmIw7

Even nVidia's recent twitter post indicated that a driver update won't magically fix this, which isn't a surprise, as this is more related to the design of the GTX 970 as opposed to something that can readily be fixed by a driver update.

......... I don't get this hard on for 4k displays.. They literally make up less than 1% of the actual gaming population who uses them.. Personally I would take 1080p at 144 fps ANY DAY of the week over 4k resolution at 30 to 40 fps.. And it is funny that you mention GTA4 a notorious game that has awful optimization.. Meanwhile the GTAV recommends are insanely low.. This isn't trying to defend the whole fiasco with Nvidia, it is ridiculous.. I am more concerned on the impact it may have down the road with 1080p gaming.. If your going on by 4k display gaming, as far as I am concerned your an idiot for being outraged.. 4k display gaming is extremely demanding (not to mention displays of 4k are extremely expensive).. Why the **** would you buy a single or even sli 970s when 4k display resolutions have been bringing set ups on many of these games down on their knees on higher end cards? I would sooner call you a stupid consumer than put any blame on the companies.. Seriously even on older games like Farcry 3 you are seeing dips with the 290x and 970 in crossfire/sli to 30 or less fps at 4k resolution.. Using either card in SLI at 4k resolution imo is already not very attractive.. In less you guys enjoy that "cinematic" 30fps we all make fun of.

You being concerned about 1080P gaming down the road would also have been my concern down the road too. And secondly, some people did buy GTX 970 SLI specifically for their 4K or 1440P setup, they don't sound to happy as I wouldn't be given that it was nearly a $700 investment.

@Elann2008 said:

Hahaha. Made my day. =o)

Hey, don't blame me. I am just the messenger. ;)

And they are fools for doing it.. Look at any benchmark out there at 4k resolution.. It brings video card hardware down to its knees.. Both the 290x and 970 gtx in sli/crossfire dip below 30 fps in the more demanding games.. And average in the 30s.... You're already getting what I would consider subpar performance to begin with at that resolution.. As for 1440p.. Every benchmark I have seen has the 290x neck and neck with the 970.. So time will tell how big of a difference it is going to make.. Seeing as, if it were as earth-shattering as people are making it sound, reviewers should have been able to catch the memory fiasco from the beginning.. When in actuality they have benchmarked the most demanding games out there, including memory-hungry ones like Mordor.. For which Tom's Hardware has a good breakdown that puts the 290x around 5 to 8 fps higher than the 970.. With the 970 just under an average of 60fps.. And that is at stock speeds, so you're likely to close that gap even more with some overclocking on the 970..

#42  Edited By BassMan
Member since 2002 • 17808 Posts

NCIX Canada has an MSI 290x for $339.99 after a $100 mail-in rebate. http://www.ncix.com/detail/msi-radeon-r9-290x-gaming-df-92892-1509.htm

I still wouldn't buy it because it doesn't have G-Sync and there aren't any FreeSync monitors out yet. Also, the 300 series is just around the corner. Still, for a card that is on par with and sometimes better than a 970, it is a good deal.

#43  Edited By RyviusARC
Member since 2011 • 5708 Posts

@04dcarraher said:

@RyviusARC said:

I want my 970 sli setup to last at least 2 years for 1440p gaming and it can but I have my doubts about doing that at max settings.

Currently at 1440p the only games the really caused a lot of stuttering was Skyrim with modded textures and Assassin's Creed Unity when going from 2xMSAA to 4xMSAA with maxed settings.

Shadow of Mordor could get the occasional stutter with ultra textures but it didn't happen often enough to bug me.

I know consoles had way more performance issues so in it's current form the GTX 970s even in SLI are not doing horrible with 1440p max settings.

I will probably switch when Pascal comes out since it's bringing a lot of changes.

But I will only make the switch if the games demand it and I will definitely wait a while to make sure Nvidia is not lying about specs.

Now if I cannot max The Witcher 3 (ubersampling disabled) at 1440p then I will not be pleased but I will see when that time comes.

Part of the problem is expecting to max out games with slight to moderate amounts of AA at 1440p or higher, and doing so on games that are known to be poorly coded should not be used as proof of an issue with the vram buffer. Prime example is Assassins so why are you expecting your SLI 970 to perform well when a single 970 cant even get 20 fps at 1440 at 4x AA, even with a 290x with a full "4gb" buffer and 512 bit bus barely gets 25 fps. Shadow of Mordor vram usage with ultra textures pack can allocate all the way to 6gb, So even if you had say a 980 sli with its full memory bandwidth 4gb buffer you would still see the stutter. I've played around with high and ultra textures "not the super ultra texture DLC" , and it's hard to tell much of a difference while playing. Even with DSR without using the Ultra textures, just using high the game caps around 3gb.

The segmented memory pool in the SLI 970's can cause issues but it tends to be game by game scenario. BF4 with SLI 970's for example does not have much of an issue at 4k until you start upping the scaling past 1.4x which would be like running at 6k resolution. Farcry 4 runs fine with SLI 970's only allocating about 3gb at 1440p ultra settings.

It was not right for Nvidia to screw up like that and hid the truth about the 970's differences. whether or not it was at first a honest mistake and then became a guilty omission.

But best thing to do is not to blindly expecting to max out games, some are well coded and see no issues while others hog up resources for no real reason. Just going to have to look at the differences between high vs ultra textures side by side and see if the gains are even worth it.

Well here is the thing.

With Assassin's Creed Unity at 2xMSAA I am getting around 60fps with max settings and 1440p res.

With only SMAA or FXAA I get around 70-80fps.

When I bump it up to 4xMSAA it goes down to around 45fps and stutters a lot. I've seen people with true 4GB cards run it at 1440p max settings with 4xMSAA with no stuttering.

On Shadow of Mordor I've seen it not stutter on true 4GB cards in the same spots where mine will stutter, but like I said, it very rarely stutters since the game loads in large areas at once.

And there is a difference between high and ultra, but it's only on certain textures, especially the collar on the clothing and the roots on the walls.

Battlefield 4 will stutter at 4K res with 4xMSAA, as I have tried it. The thing is, though, BF4 loads most of its textures at once, so you only get the stutter a few times while going through a whole level.

I am a bit pissed at Nvidia for lying, and I don't see why it is weird to expect games to run fine maxed out at 1440p with 720 dollars' worth of video cards.

#44  Edited By Xtasy26
Member since 2008 • 5582 Posts

@04dcarraher said:

@Xtasy26 said:

@04dcarraher said:

@Xtasy26:

Again do not ignore excessive resolutions and settings being used by many of those people. Many are just using too much that no single gpu can handle with a good frame rate.ie 4k, massive amounts of AA etc.

What you seem not to understand that the processing power difference between 970 and 290 series and vram usage will not make any real big difference with future games below 4k. Both gpus dont have enough processing power to do 4k and or massive amounts of AA. It does not matter if a future game can allocate 4gb of vram, When future games come they will more demanding on processing power not just vram usage and by the time 4gb isnt enough for 1080p gaming both the 970 and 290x will be out of the picture.

Some people may want to use their GPUs for 3 - 4 years. If you go over to nVidia's GeForce forums; people who were putting down $330 for a GPU they expected to be 4GB. This is especially true for those who brought SLI setups of GTX 970 who were even more pi$$ed off as they wanted their new GPUs to last for a while.

Heck my HD 6970 is going on for a little over 4 years now.

lol again 3.5gb vs 4gb with similar gpu processing power is not what you should be aiming for for 3-4 years use at or above 1440p. And is not going to make any real difference in the end. and saying it will is a bold face lie. Prime examples is GTX 680 vs 7970 which same type of difference with even 1gb difference and yet 680 performs on par to 7970 at 1440p with 4xAA with BF4 or even Dragon Age Inquisition at 1440. Heck even with Assassin Creed Unity at 1080p 7970 was a whole 4 fps faster then 680 but yet 7970 still can not even make 30 fps average.

The problem is some people are having stuttering issues even in current games like Assassin's Creed Unity.

If it's having stuttering issues with certain games NOW, what about going forward, even at 1080p, in future games? And what about those who bought the 970 thinking they could upgrade to a 1440p monitor or even 4K in the future on what they thought was a full-blown 4GB card? They may not be able to do so; they may now have to sell their graphics card and get a "full 4GB card."

I mean, a quick look at the GeForce forums thread, which is now 330+ pages, shows stuttering problems to the point where it becomes really annoying, which I can imagine.

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/16/

"After playing Assassins creed unity for a couple of mins, in the most extreme areas of town full of people, fires, guards Vram maxed there with Ultra settings - FXAA it refuses to go any higher than 3558

.....

I was able to hit 4037 with MSAA 8x But there was stuttering...the FPS were starting to dip cause the memory being pushed that high tanks the FPS due to the performance issues at high Vram load."

That's just from page 16. There are many more examples.

I think the main problem is people feel they were lied to, as they thought they had gotten a full 4GB card. To give an analogy: I am looking to buy a car right now, and I am only looking at cars with V6 engines of at least 3.2L (I am a car enthusiast and only want high-end cars with V6 engines within certain specs); I don't even bat an eye at four-cylinder cars. Now, suppose I bought a car sold as a V6, only to find out later that at certain high speeds people experience performance issues (akin to the GTX 970 running at high resolutions or in games with high memory allocations), and the company came out and said that at certain speeds one of the cylinders operates at half frequency, so it's technically more like a "V5" engine. Or, in the GTX 970's case, when it runs games with high memory allocations or at high resolutions, it drops from 192GB/sec to 28GB/sec on the slow 512MB segment. As a consumer, I wouldn't be too happy. I would rather have bought a car without that issue, one that operates like a full-blown V6, or in this case a full-blown 4GB card like the R9 290X/GTX 980.

I especially feel bad for a guy who bought a GTX 970 in Brazil, where it cost him more than a GTX 980 would elsewhere; a GTX 970 can cost $750-$800+ there since Brazil is a low-volume market. He is upset because he can't return his graphics card. People like him may be using their graphics cards for several years, as high-end graphics cards are very expensive there, and it's not fair to him since he bought what he thought was a full-blown 4GB graphics card.

#45 Xtasy26
Member since 2008 • 5582 Posts

@sSubZerOo said:

@Xtasy26 said:

@sSubZerOo said:

@Xtasy26 said:

@04dcarraher said:

@Xtasy26 said:

@DefconRave said:

Nah, i'm good with my 970. Don't want to downgrade ;D

Not necessarily. Holds up pretty well in 4K. ;) And what about the future games that can be memory intensive.

970 tends to outperform 290x all the way upto 4k, and even then 290x barely out paces the 970. Believe it or not at 4k 970 uses that segmented 4gb buffer and still is on the heels of the 290x aka few frames. The 290x even with its 512bit 320GB/s cant out right beat a 970.

What in the world are you talking about future games that can be memory intensive? Unless a game requires more then 4gb of vram 290x isnt going to do much better then a 970. And at 4k both the 970 980 and 290x dont have enough grunt to reach 60 fps averages. If and when Nvidia releases a driver update to help allocate vram better. By loading the typical windows/driver cache that gets used to that .5gb part. To point out not all of the vram buffer is solely used for the game your playing a reserve does exist. So the 970 should see some gains.

Future games can certainly affect performance. People are having issues with certain games that exceed 3.5 GB if you go over at nVidia's GeForce forums. If one is having issues right now, what happens with future games? 4GB would be more "future proof" as that's the reason why some people brought it. If that wasn't the case, people wouldn't be flooding the forums asking how they can return than GTX 970. The classic example I use is GTA IV where I couldn't max it out because of memory limitation as GTA IV required at least 1.5 GB and my HD 4870 was only 1GB. It wasn't until I got 2GB Card that I could max out GTA IV. And games doesn't need to necessarily require more than 4GB it could be more than 3.5 GB and at 4K it doesn't necessarily need to run at 60 FPS. It could be between 30 - 40 FPS.

And the guy from nVidia who posted about driver update to fix the issue that made waves over the internet back pedaled and went back and altered his post.

http://imgur.com/HMqmIw7

Even nVidia's recent twitter post indicated that a driver update won't magically fix this, which isn't a surprise, as this is more related to the design of the GTX 970 as opposed to something that can readily be fixed by a driver update.

......... I don't get this hard on for 4k displays.. They literally make up less than 1% of the actual gaming population who uses them.. Personally I would take 1080p at 144 fps ANY DAY of the week over 4k resolution at 30 to 40 fps.. And it is funny that you mention GTA4 a notorious game that has awful optimization.. Meanwhile the GTAV recommends are insanely low.. This isn't trying to defend the whole fiasco with Nvidia, it is ridiculous.. I am more concerned on the impact it may have down the road with 1080p gaming.. If your going on by 4k display gaming, as far as I am concerned your an idiot for being outraged.. 4k display gaming is extremely demanding (not to mention displays of 4k are extremely expensive).. Why the **** would you buy a single or even sli 970s when 4k display resolutions have been bringing set ups on many of these games down on their knees on higher end cards? I would sooner call you a stupid consumer than put any blame on the companies.. Seriously even on older games like Farcry 3 you are seeing dips with the 290x and 970 in crossfire/sli to 30 or less fps at 4k resolution.. Using either card in SLI at 4k resolution imo is already not very attractive.. In less you guys enjoy that "cinematic" 30fps we all make fun of.

You being concerned about 1080P gaming down the road would also have been my concern down the road too. And secondly, some people did buy GTX 970 SLI specifically for their 4K or 1440P setup, they don't sound to happy as I wouldn't be given that it was nearly a $700 investment.

@Elann2008 said:

Hahaha. Made my day. =o)

Hey, don't blame me. I am just the messenger. ;)

And they are fools for doing it.. Look at any benchmark out there at 4k resolution.. It brings video card hardware down to it's knees.. Both the 290x and 970 gtx in sli/crossifre with any of the more demanding games dip below 30 fps.. And average in the 30s.... Your already getting what I would consider subpar performance to begin with at that resolution.. As for 1440p.. Every benchmark I have seen has the 290x neck and neck with the 970.. So time will tell how big of difference it is going to make.. Seeing as if it was earth shattering as people are making it sound, reviewers should have been able to catch the memory fiasco from the beginning.. When in actuality they have benchmarked the most demanding games out there, including memory hungry ones like Mordor.. In which Tomshardware has a good break down that has the 290x around 5 to 8 fps higher than the 970.. With the 970 just under a average of 60fps.. And that is at stock speeds, which your likely to close that gap even more with some overclocking from the 970..

You aren't taking into account the frame-rate drops or stuttering that result from it. While the average frame rate may look good, the stuttering can make the gameplay experience unpleasant, as many have pointed out.

Avatar image for 04dcarraher
04dcarraher

#46  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Xtasy26:

What you seem not to understand is that using the full 4 GB on a 970 is not the issue; it's the excessive resolutions and settings that require more GPU horsepower and more than 4 GB. Some games don't use the last segment at all and go directly to system memory. Older and/or poorly coded games cause issues as well.

Most of the people complaining about stuttering are running resolutions and/or settings that require more than 4 GB, including what Windows allocates. Here is an example of someone running 1440p with 2x AA: it uses more than the 3.5 GB segment and reaches past 4,000 MB, yet there is no stuttering with SLI 970s. Link

Using 8x MSAA to prove the point that it's the VRAM isn't even an accurate test, since that type of AA is an excessive GPU load in itself.

The 970 is a 4 GB card. SLI can aggravate the situation because when the last segment is being used, its bandwidth isn't high enough to let the GPUs communicate fluidly.

Yes, it was wrong of Nvidia not to come clean beforehand.

Many aren't taking into account other factors that could be causing their issues.
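For what it's worth, here is a toy Python model of the segmented layout. The bandwidth figures are assumptions only loosely based on published reviews of the card, and a real driver does not stream the whole buffer every frame, so treat it purely as an illustration of the shape of the problem: once an allocation spills past the fast segment, the slow segment's bandwidth dominates.

# Toy model of a segmented frame buffer (illustrative numbers only, loosely
# based on published 970 reviews: ~3.5 GB fast segment, ~0.5 GB slow segment).
# It only shows the shape of the problem, not how a real driver behaves.

FAST_GB, FAST_BW = 3.5, 196.0   # GB, GB/s (assumed)
SLOW_GB, SLOW_BW = 0.5, 28.0    # GB, GB/s (assumed)

def time_to_touch(alloc_gb):
    """Milliseconds to stream through alloc_gb of VRAM once, naively."""
    fast = min(alloc_gb, FAST_GB)
    slow = max(0.0, alloc_gb - FAST_GB)
    return (fast / FAST_BW + slow / SLOW_BW) * 1000.0

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB allocated -> {time_to_touch(gb):.1f} ms of raw memory traffic")

With these made-up figures, going from 3.5 GB to 4.0 GB roughly doubles the raw memory-traffic time, which is the shape of the cliff people are describing.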

Avatar image for 04dcarraher
04dcarraher

#47  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Xtasy26 said:

@sSubZerOo said:

And they are fools for doing it. Look at any benchmark out there at 4K resolution: it brings video card hardware to its knees. Both the 290X and the GTX 970 in CrossFire/SLI dip below 30 fps in the more demanding games and average in the 30s. You're already getting what I would consider subpar performance to begin with at that resolution. As for 1440p, every benchmark I have seen has the 290X neck and neck with the 970, so time will tell how big a difference this is going to make. If it were as earth-shattering as people are making it sound, reviewers should have been able to catch the memory fiasco from the beginning, when in actuality they have benchmarked the most demanding games out there, including memory-hungry ones like Mordor. Tom's Hardware has a good breakdown that puts the 290X around 5 to 8 fps higher than the 970, with the 970 just under an average of 60 fps, and that is at stock speeds; you're likely to close that gap even more with some overclocking on the 970.

You aren't taking into account the frame-rate drops or stuttering that result from it. While the average frame rate may look good, the stuttering can make the game-playing experience unpleasant, as many have pointed out.

You're not taking into account excessive GPU usage. VRAM amounts are only part of the equation. For example, Far Cry 4 at 4K with ultra settings uses more than 5.2 GB of VRAM, while at 4K with the very high preset it uses 3.5 GB; at 1440p ultra it's under 3 GB, on both the 290 and the 970. Or look at the new Dragon Age game: at 4K ultra it uses nearly 6 GB, and at 1440p it can go over 4 GB, yet the 970's performance versus the 290 is virtually the same. Hell, even at 4K the 290 is a whopping 2-3 fps faster in minimums and averages. Fact is, the 290 series and the 980/970 are in the same boat: by the time games require 4 GB, neither will be up to the task of running them at 60 fps, because GPU processing requirements go hand in hand with typical VRAM usage.
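A rough back-of-envelope sketch in Python of why those two things climb together. The per-sample byte count, the render-target count, and the flat 2.5 GB texture budget are all assumptions for illustration, not any particular game's numbers; the point is only that render-target memory scales with pixel count times MSAA samples, on top of a texture budget set by the quality settings.

# Back-of-envelope sketch (assumptions: 4-byte color + 4-byte depth per sample,
# three render targets, and a made-up 2.5 GB texture/asset budget that does not
# change with resolution). Not any real game's memory layout.

def render_target_gb(width, height, msaa_samples, targets=3):
    bytes_per_sample = 4 + 4            # color + depth (assumed)
    samples = width * height * msaa_samples
    return samples * bytes_per_sample * targets / 1024**3

TEXTURE_BUDGET_GB = 2.5                 # hypothetical, settings-dependent

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    for aa in (1, 4, 8):
        total = TEXTURE_BUDGET_GB + render_target_gb(w, h, aa)
        print(f"{name} {aa}xMSAA -> ~{total:.1f} GB")

With those assumptions, 1080p barely moves the total, while 4K with 8x MSAA adds well over a gigabyte of render-target memory on its own.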

Avatar image for Xtasy26
Xtasy26

#48 Xtasy26
Member since 2008 • 5582 Posts

@04dcarraher said:

@Xtasy26 said:

@sSubZerOo said:

And they are fools for doing it. Look at any benchmark out there at 4K resolution: it brings video card hardware to its knees. Both the 290X and the GTX 970 in CrossFire/SLI dip below 30 fps in the more demanding games and average in the 30s. You're already getting what I would consider subpar performance to begin with at that resolution. As for 1440p, every benchmark I have seen has the 290X neck and neck with the 970, so time will tell how big a difference this is going to make. If it were as earth-shattering as people are making it sound, reviewers should have been able to catch the memory fiasco from the beginning, when in actuality they have benchmarked the most demanding games out there, including memory-hungry ones like Mordor. Tom's Hardware has a good breakdown that puts the 290X around 5 to 8 fps higher than the 970, with the 970 just under an average of 60 fps, and that is at stock speeds; you're likely to close that gap even more with some overclocking on the 970.

You aren't taking into account the frame-rate drops or stuttering that result from it. While the average frame rate may look good, the stuttering can make the game-playing experience unpleasant, as many have pointed out.

You're not taking into account excessive GPU usage. VRAM amounts are only part of the equation. For example, Far Cry 4 at 4K with ultra settings uses more than 5.2 GB of VRAM, while at 4K with the very high preset it uses 3.5 GB; at 1440p ultra it's under 3 GB, on both the 290 and the 970. Or look at the new Dragon Age game: at 4K ultra it uses nearly 6 GB, and at 1440p it can go over 4 GB, yet the 970's performance versus the 290 is virtually the same. Hell, even at 4K the 290 is a whopping 2-3 fps faster in minimums and averages. Fact is, the 290 series and the 980/970 are in the same boat: by the time games require 4 GB, neither will be up to the task of running them at 60 fps, because GPU processing requirements go hand in hand with typical VRAM usage.

You are assuming that people will be trying to hit 60 FPS in future games. People who want to use their GPUs for several years may be satisfied with 30+ FPS, hence the complaints from GTX 970 owners who bought it to be future proof. Heck, even in certain games today the GTX 970 runs fine while memory usage is below 3.5 GB, but the moment it goes over 3.5 GB it turns into a PowerPoint slide show, as GTX 970 owners have reported. You are also ignoring the stuttering that GTX 970 owners are experiencing, which becomes annoying as hell and makes the game-playing experience unplayable. The example below is The Crew, which runs fine when memory usage is below 3.5 GB but turns into a PowerPoint slide show once it is pushed above 3.5 GB.

[Embedded video: The Crew stuttering once VRAM usage goes past 3.5 GB]

Avatar image for 04dcarraher
04dcarraher

#49  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Xtasy26 said:

You are assuming that people will be trying to hit 60 FPS in future games. People who want to use their GPUs for several years may be satisfied with 30+ FPS, hence the complaints from GTX 970 owners who bought it to be future proof. Heck, even in certain games today the GTX 970 runs fine while memory usage is below 3.5 GB, but the moment it goes over 3.5 GB it turns into a PowerPoint slide show, as GTX 970 owners have reported. You are also ignoring the stuttering that GTX 970 owners are experiencing, which becomes annoying as hell and makes the game-playing experience unplayable. The example below is The Crew, which runs fine when memory usage is below 3.5 GB but turns into a PowerPoint slide show once it is pushed above 3.5 GB.

lol, really? That example is so dumb, and you need to stop being such an AMD fanboy. You do not get that type of stutter with a single 970 at settings within reason, even with all 4 GB being used.

If you think that example on YouTube is proof, you clearly have no idea what you're posting: let's run two games at the same time so we can use all the VRAM, but totally ignore the RAM and hard-drive usage, plus the GPU and CPU processing power reserved for the other game. Here is a fact about The Crew: not even the 290 series with 4 GB can run the game at 4K with ultra settings and 4x AA at a solid 30 fps, and it still only allocates around 3.5 GB.
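If anyone actually wants to test whether the stutter lines up with the 3.5 GB boundary rather than with plain GPU overload, one way is to log per-frame VRAM usage and frame time with a monitoring tool and compare the two populations. A minimal Python sketch, assuming a CSV log with hypothetical column names "vram_mb" and "frametime_ms" (adjust to whatever your logger actually exports):

# Sketch for checking whether stutter lines up with the 3.5 GB boundary,
# given a per-frame log of VRAM usage and frame time. The CSV layout
# ("vram_mb", "frametime_ms") is hypothetical; change it to match the
# columns your monitoring tool actually writes.
import csv

def split_by_vram(path, boundary_mb=3584):
    below, above = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ft = float(row["frametime_ms"])
            (above if float(row["vram_mb"]) > boundary_mb else below).append(ft)
    return below, above

def report(label, samples):
    if not samples:
        print(f"{label}: no samples")
        return
    samples = sorted(samples)
    avg = sum(samples) / len(samples)
    p99 = samples[int(len(samples) * 0.99)]
    print(f"{label}: {len(samples)} frames, avg {avg:.1f} ms, 99th pct {p99:.1f} ms")

below, above = split_by_vram("frametimes.csv")   # your exported log
report("<= 3.5 GB", below)
report(">  3.5 GB", above)

If the above-3.5 GB frames show a much worse 99th percentile at otherwise identical settings, that points at the segment; if both populations look equally bad, it's more likely plain GPU load.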