damn... AMD owned Nvidia


This topic is locked from further discussion.


#1 blaznwiipspman1
Member since 2007 • 16542 Posts

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??


#2 04dcarraher
Member since 2004 • 23829 Posts

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

blaznwiipspman1

lol, you need to take those blinders off and look closer... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, 670s/680s that aren't even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolution data flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200, and you don't.


#3 blaznwiipspman1
Member since 2007 • 16542 Posts

[QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

04dcarraher

lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, the 670/680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolutions flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200.

The reason I'm saying the 79xx cards have more raw power is the games they do well in compared to the GeForce ones: for example Battlefield 3, Crysis 1 and 2, Shogun, Metro 2033, whereas Nvidia does better in games on the UE3 engine, aka Borderlands 2, Batman, etc. Also, the 79xx GHz Editions are still at stock... they can be overclocked another 100-200MHz pretty easily on air. I remember when the GTX 680 came out everyone was bashing the 7970... looks like the situation is reversed now.


#4 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts

[QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

04dcarraher

lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually match or maybe slightly outperform a GTX 680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually handle 1440+ resolutions.

Is performing better on higher resolutions a bad thing now?

I really doubt there are many people who buy a top-tier card like a 7970/680 and play at resolutions lower than 1080p. So the fact that these cards perform better than their Nvidia counterparts at higher resolutions is pretty relevant.


#5 ShadowDeathX
Member since 2006 • 11698 Posts
Switched over to AMD a few months ago, I'm loving it.

#6 04dcarraher
Member since 2004 • 23829 Posts

[QUOTE="04dcarraher"][QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

ferret-gamer

lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually match or maybe slightly outperform a GTX 680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually handle 1440+ resolutions.

Is performing better on higher resolutions a bad thing now?

I really doubt there are many people who buy a top-tier card like a 7970/680 and play at resolutions lower than 1080p. So the fact that these cards perform better than their Nvidia counterparts at higher resolutions is pretty relevant.

lol, you're not understanding the point: if the GPU were "more powerful" than its counterpart at really high resolutions, it should be able to show that at lower resolutions too. The real main factor in the 7950/7970's ability to perform better at resolutions beyond 1920x1200 is the wide memory bus and bandwidth, not it actually being "faster" per se.
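
To put rough numbers on the exchange above, here is a small illustrative sketch (the 32-bit colour-buffer assumption and the specific resolutions are mine, not figures stated in the thread). It just shows how quickly per-frame pixel data grows past 1920x1200, which is where the memory-bandwidth argument on both sides comes from.

[code]
# Illustrative arithmetic only (not from the thread): how per-frame pixel data
# grows with resolution. Assumes a plain 32-bit colour buffer, no MSAA/HDR.
BYTES_PER_PIXEL = 4

resolutions = {
    "1920x1200": (1920, 1200),
    "2560x1440": (2560, 1440),
    "2560x1600": (2560, 1600),
}

base_pixels = 1920 * 1200
for name, (w, h) in resolutions.items():
    pixels = w * h
    mb_per_frame = pixels * BYTES_PER_PIXEL / 1e6
    print(f"{name}: {pixels:>9,} px ({pixels / base_pixels:.2f}x of 1920x1200), "
          f"~{mb_per_frame:.1f} MB per colour buffer")
[/code]

At 2560x1600 there is roughly 1.78x the pixel data of 1920x1200 to move every frame, which is why a card with more memory bandwidth can pull ahead up there without being meaningfully faster at the lower resolution.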

#7 04dcarraher
Member since 2004 • 23829 Posts

[QUOTE="04dcarraher"]

[QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

blaznwiipspman1

lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, the 670/680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolutions flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200.

The reason I'm saying the 79xx cards have more raw power is the games they do well in compared to the GeForce ones: for example Battlefield 3, Crysis 1 and 2, Shogun, Metro 2033, whereas Nvidia does better in games on the UE3 engine, aka Borderlands 2, Batman, etc. Also, the 79xx GHz Editions are still at stock... they can be overclocked another 100-200MHz pretty easily on air. I remember when the GTX 680 came out everyone was bashing the 7970... looks like the situation is reversed now.

How is it reversed when you need revised GPUs that are basically factory overclocked to 1+ GHz, versus their counterparts at stock clocks, just to be on par or better?

#8 FelipeInside
Member since 2003 • 28548 Posts
Honestly, are people STILL doing the whole NVIDIA vs ATI thing? Grow up. It works like this: AMD brings out a new card that beats NVIDIA > NVIDIA brings out a new card that beats AMD > AMD brings out a new card that beats NVIDIA > NVIDIA brings out a new card that beats AMD, and so on and so on.

#9 Marfoo
Member since 2004 • 6002 Posts
Both are good companies. Just look at which cards perform best for the games you want to play, or that have the features you want (PhysX, beastly OpenCL performance, ZeroCore, etc.), and make the purchase that does your wallet good.

#10 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts
[QUOTE="ferret-gamer"]

[QUOTE="04dcarraher"] lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually match or maybe slightly outperform a GTX 680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually handle 1440+ resolutions. 04dcarraher

Is performing better on higher resolutions a bad thing now?

I really doubt there are many people who buy a top-tier card like a 7970/680 and play at resolutions lower than 1080p. So the fact that these cards perform better than their Nvidia counterparts at higher resolutions is pretty relevant.

lol, you're not understanding the point: if the GPU were "more powerful" than its counterpart at really high resolutions, it should be able to show that at lower resolutions too. The real main factor in the 7950/7970's ability to perform better at resolutions beyond 1920x1200 is the wide memory bus and bandwidth, not it actually being "faster" per se.

Performs better = more fps = faster. Why the card performs better isn't all that important, just that it does perform better. And again, we are talking about flagship cards in late 2012; people who buy these kinds of cards probably aren't going to be playing at lower resolutions. If you are looking to buy a flagship card for your 720p monitor, you probably have your priorities mixed up.

#11 JigglyWiggly_
Member since 2009 • 24625 Posts

Eh, AMD's performance gains are not all they're cracked up to be.

Just yesterday I put my HD 5850s back in CF after one of my 460s' fans died.

AMD drivers, too amazing.

[spoiler]

You can't even force scaling and override application settings. E.g., I play BF3 at 1280x960; with ATI there is no way to keep it cropped, BF3 ignores Catalyst. NVIDIA has an override feature.

When I first put my GPUs in, it said something along the lines of "Cannot control video card"; I had to reinstall about 5 times for it to work.
Of course I had the infamous 2nd-monitor flickering when I OC'd; I got around this by using my iGPU for my 2nd monitor.
Then of course MSI Afterburner couldn't see my 2nd ATI GPU properly, so overclocking was a mess.
And of course, even though I had 200fps in BF3, it felt all stuttery.
I tried Quake Live, and even though I had a constant 125fps and my refresh was at 120, it felt so stuttery. It was worse than playing at 60Hz.
I disabled CF and then it worked.

At this point I realized I'd rather just use one 460.

[/spoiler]


#12 ShadowDeathX
Member since 2006 • 11698 Posts
^ Why do you ALWAYS have problems with AMD? It hates you. :)

#13 ronvalencia
Member since 2008 • 29612 Posts

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

blaznwiipspman1
Higher resolutions have larger horizontal workloads, i.e. more pixels/values.

#14 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

04dcarraher

lol, you need to take those blinders off and look closer... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, 670s/680s that aren't even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolution data flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200, and you don't.

AMD FireGL W8000 indicates otherwise, i.e. a 7970-type chip at 900MHz with a 256-bit wide bus.

#15 msfan1289
Member since 2011 • 1044 Posts

Wait, aren't the HD x9xx cards supposed to compete against the GTX x70/x80s?


#16 blaznwiipspman1
Member since 2007 • 16542 Posts

Eh, AMD's performance gains are not all they're cracked up to be.

Just yesterday I put my HD 5850s back in CF after one of my 460s' fans died.

AMD drivers, too amazing.

[spoiler]

You can't even force scaling and override application settings. E.g., I play BF3 at 1280x960; with ATI there is no way to keep it cropped, BF3 ignores Catalyst. NVIDIA has an override feature.

When I first put my GPUs in, it said something along the lines of "Cannot control video card"; I had to reinstall about 5 times for it to work.
Of course I had the infamous 2nd-monitor flickering when I OC'd; I got around this by using my iGPU for my 2nd monitor.
Then of course MSI Afterburner couldn't see my 2nd ATI GPU properly, so overclocking was a mess.
And of course, even though I had 200fps in BF3, it felt all stuttery.
I tried Quake Live, and even though I had a constant 125fps and my refresh was at 120, it felt so stuttery. It was worse than playing at 60Hz.
I disabled CF and then it worked.

At this point I realized I'd rather just use one 460.

[/spoiler]

JigglyWiggly_

Or you could have just used one 5850, since it's faster than the 460. I don't know why people even complain about dual-card configurations; they always have problems. You knew the risks when doing it, so you have no right to complain about it. 99% of people do NOT run dual cards, precisely because of this. Hell, I've seen a GeForce 590, a dual-GPU card, blow up and catch fire on YouTube.


#17 topgunmv
Member since 2003 • 10880 Posts

[QUOTE="ferret-gamer"]

[QUOTE="04dcarraher"] lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually match or maybe slightly outperform a GTX 680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually handle 1440+ resolutions. 04dcarraher

Is performing better on higher resolutions a bad thing now?

I really doubt there are many people who buy a top-tier card like a 7970/680 and play at resolutions lower than 1080p. So the fact that these cards perform better than their Nvidia counterparts at higher resolutions is pretty relevant.

lol, you're not understanding the point: if the GPU were "more powerful" than its counterpart at really high resolutions, it should be able to show that at lower resolutions too. The real main factor in the 7950/7970's ability to perform better at resolutions beyond 1920x1200 is the wide memory bus and bandwidth, not it actually being "faster" per se.

Let's run all our benchmarks at 800x600, remove any doubt.

/Sarcasm.

Are you being for real here?


#18 JigglyWiggly_
Member since 2009 • 24625 Posts

[QUOTE="JigglyWiggly_"]

Eh, AMD's performance gains are not all they're cracked up to be.

Just yesterday I put my HD 5850s back in CF after one of my 460s' fans died.

AMD drivers, too amazing.

[spoiler]

You can't even force scaling and override application settings. E.g., I play BF3 at 1280x960; with ATI there is no way to keep it cropped, BF3 ignores Catalyst. NVIDIA has an override feature.

When I first put my GPUs in, it said something along the lines of "Cannot control video card"; I had to reinstall about 5 times for it to work.
Of course I had the infamous 2nd-monitor flickering when I OC'd; I got around this by using my iGPU for my 2nd monitor.
Then of course MSI Afterburner couldn't see my 2nd ATI GPU properly, so overclocking was a mess.
And of course, even though I had 200fps in BF3, it felt all stuttery.
I tried Quake Live, and even though I had a constant 125fps and my refresh was at 120, it felt so stuttery. It was worse than playing at 60Hz.
I disabled CF and then it worked.

At this point I realized I'd rather just use one 460.

[/spoiler]

blaznwiipspman1

Or you could have just used one 5850, since it's faster than the 460. I don't know why people even complain about dual-card configurations; they always have problems. You knew the risks when doing it, so you have no right to complain about it. 99% of people do NOT run dual cards, precisely because of this. Hell, I've seen a GeForce 590, a dual-GPU card, blow up and catch fire on YouTube.

Zubin has a 590, he has no real issues.

My GTX 460s in SLI ran flawlessly.

It's just that Gigabyte's 1GB GTX 460s all fail. (Check the reviews; they went from 5 stars initially to 2.5 now.)

So basically, avoid non-reference cards at all costs.

Also, if I just used one 5850, I couldn't play BF3 in 4:3.



EDIT: I lied about going back to my 460.

Why work when you can nag your rich parents? :3

670 tomorrow

ah ye


#19 jhcho2
Member since 2004 • 5103 Posts

Lol, AMD consolation thread. There are a few things the TC overlooked:

1. AMD/ATI cards mostly perform better at higher resolutions because their cards always have more memory. The GTX 680 has 2GB of RAM while the HD 7970 has 3GB. Two years ago, the HD 6870 had 2GB of RAM while the GTX 580 had only 1536MB. This has been the case since the HD 5970. The difference becomes even more apparent with dual-GPU or SLI/Crossfire setups. So yeah, performance at higher resolutions is a no-brainer.

2. The HD 7970 came out almost a year ago, about 10 months to be exact. Back then, it lost to the GTX 680 in most areas, even Crysis 2 and non-Unreal-engine games. The 680's lead over the 7970 in terms of framerate was small, but if this little edge is all AMD/ATI can come up with after 10 months, I'm not sure it's something worth getting excited about. 10 months is quite a long time. For all you know, Nvidia already has a trump card but is holding back on releasing it just to see what AMD can come up with.

3. And the whole assumption about transistors is quite pointless nowadays. Yes, more transistors GENERALLY give an indication of how powerful a card is. A card with 4 billion transistors will most likely be more powerful than a card with only 800 million transistors. However, the GTX 680 had 3.54 billion transistors compared to the 7970's 4.31 billion, and yet, 10 months ago, the GTX 680 beat the 7970 in many areas. This isn't the NES era, where the strength of hardware boiled down to bits. The 7970 is 512-bit while the 680 is only 384-bit. All these figures about transistors and bits are abstract and superficial in today's day and age. Performance nowadays is all about ARCHITECTURE. And expanding upwards isn't necessarily better than expanding, say... sideways.

4. The article posted is about AMD's Catalyst, which is a driver. It's software, not hardware. What AMD did was optimize their code for the cards. So what we can say for sure is that the 7970 isn't necessarily weaker than the 680, and vice versa. Maybe Nvidia's drivers for the 600-series cards are unoptimized as well. And maybe Nvidia just can't be bothered, because they may be focusing their resources on the next generation of cards.

Bottom line: don't compare two cards which came out 10 months ago based on a driver for one of them which came out 10 months later. There's no such thing as a perfect driver; the more work they put into driver optimization, the better it will be. That's definite. The bigger question that Nvidia and AMD would ponder is: would it be better to focus on their current cards, or future cards?


#20 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="04dcarraher"]

[QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

lol, you need to take those blinders off and look closer... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, 670s/680s that aren't even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolution data flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200, and you don't.

AMD GCN has more cache/local data storage/buffers per ALU, which consumes more transistors.

#21 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="topgunmv"]

[QUOTE="04dcarraher"][QUOTE="ferret-gamer"] Is performing better on higher resolutions a bad thing now?

I really doubt there are many people who buy a top-tier card like a 7970/680 and play at resolutions lower than 1080p. So the fact that these cards perform better than their Nvidia counterparts at higher resolutions is pretty relevant.

lol, you're not understanding the point: if the GPU were "more powerful" than its counterpart at really high resolutions, it should be able to show that at lower resolutions too. The real main factor in the 7950/7970's ability to perform better at resolutions beyond 1920x1200 is the wide memory bus and bandwidth, not it actually being "faster" per se.

Let's run all our benchmarks at 800x600, remove any doubt.

/Sarcasm.

Are you being for real here?

Running at a low resolution with a high-end GPU is just LOL material.

#22 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="blaznwiipspman1"]

[QUOTE="04dcarraher"] lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, the 670/680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolutions flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200.

04dcarraher

The reason I'm saying the 79xx cards have more raw power is the games they do well in compared to the GeForce ones: for example Battlefield 3, Crysis 1 and 2, Shogun, Metro 2033, whereas Nvidia does better in games on the UE3 engine, aka Borderlands 2, Batman, etc. Also, the 79xx GHz Editions are still at stock... they can be overclocked another 100-200MHz pretty easily on air. I remember when the GTX 680 came out everyone was bashing the 7970... looks like the situation is reversed now.

How is it reversed when you need revised GPUs that are basically factory overclocked to 1+ GHz, versus their counterparts at stock clocks, just to be on par or better?

The 7970 GE's 1GHz speed is no different from any CPU vendor setting its product's clock speed.


#23 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="blaznwiipspman1"]

[QUOTE="04dcarraher"]

[QUOTE="blaznwiipspman1"]

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/1.html

Basically, the 7970 is on par with or stronger than the GTX 680. Supposedly AMD was still figuring out how to optimize drivers for the new architecture, which is why there are a lot of performance increases coming this late. It makes sense, because the 7970 has more transistors than the GTX 680, meaning more raw power. AMD really pulled the wool over Nvidia's eyes with this one. On a side note, why didn't ionusx post this??

lol, you need to take those blinders off... it takes the GHz Editions at 1920x1200 to actually become on par with, or maybe slightly faster than, the 670/680, and that's not even overclocked. So I'm not sure how, in your mind, AMD owned Nvidia... the only reason the 7950 or 7970 actually surpasses the 670/680 beyond 1920x1200 isn't that they have more transistors, it's that they have the memory bandwidth to actually let 1440+ resolutions flow. If it were really about more raw processing power, you would see a major difference in performance at 1920x1200.

The reason I'm saying the 79xx cards have more raw power is the games they do well in compared to the GeForce ones: for example Battlefield 3, Crysis 1 and 2, Shogun, Metro 2033, whereas Nvidia does better in games on the UE3 engine, aka Borderlands 2, Batman, etc. Also, the 79xx GHz Editions are still at stock... they can be overclocked another 100-200MHz pretty easily on air. I remember when the GTX 680 came out everyone was bashing the 7970... looks like the situation is reversed now.

The 680 didn't win in heavy PC games, e.g. Metro 2033.

#24 ronvalencia
Member since 2008 • 29612 Posts

Lol, AMD consolation thread. There are a few things the TC overlooked:

1. AMD/ATI cards mostly perform better at higher resolutions because their cards always have more memory. The GTX 680 has 2GB of RAM while the HD 7970 has 3GB. Two years ago, the HD 6870 had 2GB of RAM while the GTX 580 had only 1536MB. This has been the case since the HD 5970. The difference becomes even more apparent with dual-GPU or SLI/Crossfire setups. So yeah, performance at higher resolutions is a no-brainer.

2. The HD 7970 came out almost a year ago, about 10 months to be exact. Back then, it lost to the GTX 680 in most areas, even Crysis 2 and non-Unreal-engine games. The 680's lead over the 7970 in terms of framerate was small, but if this little edge is all AMD/ATI can come up with after 10 months, I'm not sure it's something worth getting excited about. 10 months is quite a long time. For all you know, Nvidia already has a trump card but is holding back on releasing it just to see what AMD can come up with.

3. And the whole assumption about transistors is quite pointless nowadays. Yes, more transistors GENERALLY give an indication of how powerful a card is. A card with 4 billion transistors will most likely be more powerful than a card with only 800 million transistors. However, the GTX 680 had 3.54 billion transistors compared to the 7970's 4.31 billion, and yet, 10 months ago, the GTX 680 beat the 7970 in many areas. This isn't the NES era, where the strength of hardware boiled down to bits. The 7970 is 512-bit while the 680 is only 384-bit. All these figures about transistors and bits are abstract and superficial in today's day and age. Performance nowadays is all about ARCHITECTURE. And expanding upwards isn't necessarily better than expanding, say... sideways.

4. The article posted is about AMD's Catalyst, which is a driver. It's software, not hardware. What AMD did was optimize their code for the cards. So what we can say for sure is that the 7970 isn't necessarily weaker than the 680, and vice versa. Maybe Nvidia's drivers for the 600-series cards are unoptimized as well. And maybe Nvidia just can't be bothered, because they may be focusing their resources on the next generation of cards.

Bottom line: don't compare two cards which came out 10 months ago based on a driver for one of them which came out 10 months later. There's no such thing as a perfect driver; the more work they put into driver optimization, the better it will be. That's definite. The bigger question that Nvidia and AMD would ponder is: would it be better to focus on their current cards, or future cards?

jhcho2

For point 2.

http://techreport.com/review/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed/11

[Crysis 2 FPS chart from the linked Tech Report review]

Techreport uses a newer Catalyst driver (Catalyst 8.95.5-120224a) compared to techpowerup's (8.921.2 RC11): http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/6.html

Radeon HD 8xx0 GCN still uses CUs with 64-ALU clusters.

For point 3.

The 79x0 has additional support for practical double floats and a better cache/local data storage (LDS)/buffer to ALU count ratio.

For the driver-side just-in-time recompiler, there are large differences between the VLIW (i.e. AMD Evergreen/Cayman) and SIMD (i.e. AMD GCN) architectures.

Your "7970 is 512-bit while the 680 is only 384-bit" statement is not correct, i.e. the 7970 has a 384-bit wide bus while the 680 has a 256-bit wide bus.

For point 4,

The Radeon HD 8870 still sports the GCN architecture with some changes, i.e. it's basically a 7970 without the 384-bit bus and practical double-float support, i.e. a 7870 scaled up to the 7970's CU count. The driver improvements for 7xx0 GCN would be applicable to the next 8xx0 GCN.
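
As a rough sanity check on the bus-width correction above, the short sketch below converts bus width and per-pin data rate into peak memory bandwidth. The 5.5 Gbps and 6.0 Gbps effective data rates are assumed reference-card memory specs, not figures quoted anywhere in this thread.

[code]
# Peak bandwidth = bus width in bytes * effective per-pin data rate.
# Data rates (5.5 / 6.0 Gbps) are assumed reference specs, not from the thread.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

cards = {
    "HD 7970 (384-bit)": (384, 5.5),
    "GTX 680 (256-bit)": (256, 6.0),
}

for name, (bus_bits, rate) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(bus_bits, rate):.0f} GB/s")
# -> roughly 264 GB/s vs 192 GB/s: the 384-bit bus, not the transistor count,
#    is what buys the extra headroom at 2560x1440 and above.
[/code]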


#25 blaznwiipspman1
Member since 2007 • 16542 Posts

[QUOTE="04dcarraher"][QUOTE="blaznwiipspman1"]

The reason I'm saying the 79xx cards have more raw power is the games they do well in compared to the GeForce ones: for example Battlefield 3, Crysis 1 and 2, Shogun, Metro 2033, whereas Nvidia does better in games on the UE3 engine, aka Borderlands 2, Batman, etc. Also, the 79xx GHz Editions are still at stock... they can be overclocked another 100-200MHz pretty easily on air. I remember when the GTX 680 came out everyone was bashing the 7970... looks like the situation is reversed now.

ronvalencia

How is it reversed when you need revised GPUs that are basically factory overclocked to 1+ GHz, versus their counterparts at stock clocks, just to be on par or better?

The 7970 GE's 1GHz speed is no different from any CPU vendor setting its product's clock speed.

Yep, and that's what most people fail to realize.


#26 04dcarraher
Member since 2004 • 23829 Posts

:o

You two still ignore the point at hand: comparing one set of revised GPUs against another set of GPUs that haven't been modified or overclocked. Anywhere else that would be considered lopsided and biased... The review is just there to show the differences between the drivers, but it still also shows us that even with the new drivers the 7970 GHz is only 1-7% faster depending on resolution, while the vanilla 7970 is still 2-9% slower depending on resolution. And what about the Nvidia drivers? We have no idea which ones they are using: the newer 310 betas, the latest 306 set, or the older 304s.

Because the beta 310s can see as much as a 10% increase in fps in certain games vs the 306... that sounds a lot like what these newer drivers did for AMD.

Here is the point: if we are to correctly compare the performance of the AMD and Nvidia flagship cards, they both need to use the latest drivers and have clock rates at the same ratio. Not compare a revised, aka factory overclocked, GPU on the latest drivers against a standard GPU on an unknown (most likely older) set of drivers, and then claim AMD owned Nvidia or that the tables have been reversed. You have to look at all the angles.


#27 deactivated-5a9b3f32ef4e9
Member since 2009 • 7779 Posts

Lol you guys.


#28 Guovssohas
Member since 2010 • 330 Posts

:o

You two still ignore the point at hand: comparing one set of revised GPUs against another set of GPUs that haven't been modified or overclocked. Anywhere else that would be considered lopsided and biased... The review is just there to show the differences between the drivers, but it still also shows us that even with the new drivers the 7970 GHz is only 1-7% faster depending on resolution, while the vanilla 7970 is still 2-9% slower depending on resolution. And what about the Nvidia drivers? We have no idea which ones they are using: the newer 310 betas, the latest 306 set, or the older 304s.

Because the beta 310s can see as much as a 10% increase in fps in certain games vs the 306... that sounds a lot like what these newer drivers did for AMD.

Here is the point: if we are to correctly compare the performance of the AMD and Nvidia flagship cards, they both need to use the latest drivers and have clock rates at the same ratio. Not compare a revised, aka factory overclocked, GPU on the latest drivers against a standard GPU on an unknown (most likely older) set of drivers, and then claim AMD owned Nvidia or that the tables have been reversed. You have to look at all the angles.

04dcarraher
Then let's take a look at two highly OC'd top-of-the-line GPUs: http://translate.google.com/translate?hl=fi&sl=fi&tl=en&u=http%3A%2F%2Fmuropaketti.com%2Fartikkelit%2Fnaytonohjaimet%2Fasus-rog-matrix-hd-7970-platinum-vs-msi-n680gtx-lightning ASUS 7970 Matrix Platinum (1275MHz) vs MSI 680 Lightning (1346MHz). Go to page 6, then scroll down for the OC'd results. On page 5 there are the stock results, with the 7970 at 1100MHz and the 680 at 1176MHz.

#29 V4LENT1NE
Member since 2006 • 12901 Posts
This forum really needs to start banning fanboys; it's getting worse than System Wars in here...

#30 JohnF111
Member since 2010 • 14190 Posts
So once again both companies are on par, with one or two advantages in different areas. How is this AMD owning Nvidia exactly? One counter in one area? Oh wow :roll: Both are about equal; Nvidia uses less power while AMD is cheaper. Take your pick.

#31 superclocked
Member since 2009 • 5864 Posts

[QUOTE="JigglyWiggly_"]

Eh, AMD's performance gains are not all they're cracked up to be.

Just yesterday I put my HD 5850s back in CF after one of my 460s' fans died.

AMD drivers, too amazing.

[spoiler]

You can't even force scaling and override application settings. E.g., I play BF3 at 1280x960; with ATI there is no way to keep it cropped, BF3 ignores Catalyst. NVIDIA has an override feature.

When I first put my GPUs in, it said something along the lines of "Cannot control video card"; I had to reinstall about 5 times for it to work.
Of course I had the infamous 2nd-monitor flickering when I OC'd; I got around this by using my iGPU for my 2nd monitor.
Then of course MSI Afterburner couldn't see my 2nd ATI GPU properly, so overclocking was a mess.
And of course, even though I had 200fps in BF3, it felt all stuttery.
I tried Quake Live, and even though I had a constant 125fps and my refresh was at 120, it felt so stuttery. It was worse than playing at 60Hz.
I disabled CF and then it worked.

At this point I realized I'd rather just use one 460.

[/spoiler]

blaznwiipspman1

Or you could have just used one 5850, since it's faster than the 460. I don't know why people even complain about dual-card configurations; they always have problems. You knew the risks when doing it, so you have no right to complain about it. 99% of people do NOT run dual cards, precisely because of this. Hell, I've seen a GeForce 590, a dual-GPU card, blow up and catch fire on YouTube.

I've never had a single problem with SLI. For some reason you have to manually set new games to run in SLI using the Nvidia control panel (select alternate frame rendering), but SLI has always worked flawlessly. I was considering CF and a triple-monitor setup back when the 6950s could be unlocked to 6970s, but I read that the third monitor would have looked darker or some such, so I decided to get a 2GB 560 Ti instead. Hell, that was 2 years ago already. Damn, time flies...

#32 Addict187
Member since 2008 • 1128 Posts
Switched over to AMD a few months ago, I'm loving it. ShadowDeathX
I switched to Nvidia this time and I may never go back to AMD. Sorry to say this, but everything just works. Not so much with AMD, but I used them for the last 6 years and didn't know any better. I figured saving a few bucks was the way to go, and now I wish I had made the change long ago. All my Need for Speed games finally just work without problems. Everything just runs better, I find. I went from a 6870 to a GTX 670 and damn, I have been missing out here. I had a 4890 when they came out and had problems with a few games. 6870, same thing; 6870 CrossFire, don't get me started on that ordeal. GTX 670: all problems are gone.

#33 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ShadowDeathX"]Switched over to AMD a few months ago, I'm loving it.Addict187
I switched to Nvidia this time and I may never go back to AMD. Sorry to say this, but everything just works. Not so much with AMD, but I used them for the last 6 years and didn't know any better. I figured saving a few bucks was the way to go, and now I wish I had made the change long ago. All my Need for Speed games finally just work without problems. Everything just runs better, I find. I went from a 6870 to a GTX 670 and damn, I have been missing out here. I had a 4890 when they came out and had problems with a few games. 6870, same thing; 6870 CrossFire, don't get me started on that ordeal. GTX 670: all problems are gone.

GeForce GTX 680 has its own issues, e.g. http://www.youtube.com/watch?v=hMP0ETuyQzw&noredirect=1

PS: My last GeForce CUDA cards were a GeForce 8600M GT GDDR3 (dead, ASUS G1S laptop), a 9500M GS GDDR2 (ASUS G1SN**, which replaced the dead G1S), and a 9650M GT (ASUS N80vn, unstable/BSOD with Win7/Vista x64 but fine with Vista/Win7 32-bit; I may try Win8 x64 later; the motherboard has been replaced and ASUS gave up on the fix). The ASUS N80vn's Windows 64-bit issues caused me to purchase a Sony Vaio VGN-FW45 laptop with an AMD Radeon HD 4650M.

**The ASUS G1SN overheated and cracked (hairline fracture) the case around the heat sink.


#34 Addict187
Member since 2008 • 1128 Posts

[QUOTE="Addict187"][QUOTE="ShadowDeathX"]Switched over to AMD a few months ago, I'm loving it.ronvalencia

I switched to Nvidia this time and I may never go back to AMD. Sorry to say this, but everything just works. Not so much with AMD, but I used them for the last 6 years and didn't know any better. I figured saving a few bucks was the way to go, and now I wish I had made the change long ago. All my Need for Speed games finally just work without problems. Everything just runs better, I find. I went from a 6870 to a GTX 670 and damn, I have been missing out here. I had a 4890 when they came out and had problems with a few games. 6870, same thing; 6870 CrossFire, don't get me started on that ordeal. GTX 670: all problems are gone.

GeForce GTX 680 has its own issues, e.g. http://www.youtube.com/watch?v=hMP0ETuyQzw&noredirect=1

PS: My last GeForce CUDA cards were a GeForce 8600M GT GDDR3 (dead, ASUS G1S laptop), a 9500M GS GDDR2 (ASUS G1SN**, which replaced the dead G1S), and a 9650M GT (ASUS N80vn, unstable/BSOD with Win7/Vista x64 but fine with Vista/Win7 32-bit; I may try Win8 x64 later; the motherboard has been replaced and ASUS gave up on the fix). The ASUS N80vn's Windows 64-bit issues caused me to purchase a Sony Vaio VGN-FW45 laptop with an AMD Radeon HD 4650M.

**The ASUS G1SN overheated and cracked (hairline fracture) the case around the heat sink.

I had this same problem with AMD. It is a simple fix: you have to create a custom res slightly less than 1080p. This is a problem within the game itself. I have the game, so I know what I'm talking about. It is only a problem with the DX11 settings; set it back to DX9 and it will run fine regardless of AMD or Nvidia. Or make a custom res.

#35 blaznwiipspman1
Member since 2007 • 16542 Posts

[QUOTE="ronvalencia"]

[QUOTE="Addict187"] I switched to Nvidia this time and I may never go back to AMD. Sorry to say this, but everything just works. Not so much with AMD, but I used them for the last 6 years and didn't know any better. I figured saving a few bucks was the way to go, and now I wish I had made the change long ago. All my Need for Speed games finally just work without problems. Everything just runs better, I find. I went from a 6870 to a GTX 670 and damn, I have been missing out here. I had a 4890 when they came out and had problems with a few games. 6870, same thing; 6870 CrossFire, don't get me started on that ordeal. GTX 670: all problems are gone. Addict187

GeForce GTX 680 has its own issues, e.g. http://www.youtube.com/watch?v=hMP0ETuyQzw&noredirect=1

PS: My last GeForce CUDA cards were a GeForce 8600M GT GDDR3 (dead, ASUS G1S laptop), a 9500M GS GDDR2 (ASUS G1SN**, which replaced the dead G1S), and a 9650M GT (ASUS N80vn, unstable/BSOD with Win7/Vista x64 but fine with Vista/Win7 32-bit; I may try Win8 x64 later; the motherboard has been replaced and ASUS gave up on the fix). The ASUS N80vn's Windows 64-bit issues caused me to purchase a Sony Vaio VGN-FW45 laptop with an AMD Radeon HD 4650M.

**The ASUS G1SN overheated and cracked (hairline fracture) the case around the heat sink.

I had this same problem with AMD. It is a simple fix: you have to create a custom res slightly less than 1080p. This is a problem within the game itself. I have the game, so I know what I'm talking about. It is only a problem with the DX11 settings; set it back to DX9 and it will run fine regardless of AMD or Nvidia. Or make a custom res.

I haven't had any problems with NFS games or any game so far on my Radeon card, except for Oblivion, which freezes in one cave area. Can't blame AMD though, the game is 7 years old now. The GeForce 460 I used to have refused to play Metro 2033 until I patched the game, and I've seen plenty of faulty hardware issues with Nvidia cards, i.e. baking cards (GeForce 8800) and cards catching fire (GTX 590). My Radeon 9200 from way back in 2002 is still working, lol.


#36 04dcarraher
Member since 2004 • 23829 Posts

[QUOTE="Addict187"][QUOTE="ronvalencia"]

Geforce GTX 680 has it's own issues e.g. http://www.youtube.com/watch?v=hMP0ETuyQzw&noredirect=1

PS: My last Geforce CUDA was Geforce 8600M GT GDDR3 (dead, ASUS G1S laptop), 9500M GS GDDR2 (ASUS G1SN** replaced dead G1S), 9650M GT (ASUS N80vn, unstable/BSOD with Win7/Vista X64, but fine with Vista/Win7 32bit, may try Win8 X64 later, the motherboard has been replaced and ASUS gave up on the fix). ASUS N80vn's Windows 64bit issues caused me to purchased Sony Vaio VGN-FW45 laptop with AMD Radeon HD 4650M.

**ASUS G1SN overheats and cracked(hair line fracture) the case around the heat sink.

blaznwiipspman1

I had this same problem with AMD. It is a simple fix: you have to create a custom res slightly less than 1080p. This is a problem within the game itself. I have the game, so I know what I'm talking about. It is only a problem with the DX11 settings; set it back to DX9 and it will run fine regardless of AMD or Nvidia. Or make a custom res.

I haven't had any problems with NFS games or any game so far on my Radeon card, except for Oblivion, which freezes in one cave area. Can't blame AMD though, the game is 7 years old now. The GeForce 460 I used to have refused to play Metro 2033 until I patched the game, and I've seen plenty of faulty hardware issues with Nvidia cards, i.e. baking cards (GeForce 8800) and cards catching fire (GTX 590). My Radeon 9200 from way back in 2002 is still working, lol.

You, finding faults only with Nvidia GPUs; who could have guessed...

All GPUs can have issues, and both companies can have problems with specific games and models of cards. I never had any issues with NFS games, Metro, or Oblivion on my 8800s, 560, or 7750. I had two 8800s in SLI from late 2008 to 2011 and never had an issue (first GPU from 2007); my brother has also had an 8800GT since 2008 and it's still working to this day. You have to know what you're doing to maintain hardware, especially something like the 8800GTs or GTX 590s with their single-slot coolers or high TDP. If you had a GTX 590 and "it caught on fire", as you say, you had cooling problems or PSU issues. And even if the card was at fault, you could have gotten it RMA'ed and kept that superior graphics card.

Now, you claim your 9200 is still working; I still have a GeForce4 MX440, an FX 5200, and a 6600GT in working order to this day. The older the card, the lower the power draw and heat produced, allowing them to last longer.


#38 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="blaznwiipspman1"]

[QUOTE="Addict187"] I had this same problem with AMD. it is a simpel fix. You have to create a custom rez slitley less then 1080p. This is a problem within the game it self. I have the game so i know what im talking about. This is only a problem with DX11 settings. set it back to DX9 and it ill run fine regardless AMD or Nivid. Or make a custom rez04dcarraher

I haven't had any problems with NFS games or any game so far on my Radeon card, except for Oblivion, which freezes in one cave area. Can't blame AMD though, the game is 7 years old now. The GeForce 460 I used to have refused to play Metro 2033 until I patched the game, and I've seen plenty of faulty hardware issues with Nvidia cards, i.e. baking cards (GeForce 8800) and cards catching fire (GTX 590). My Radeon 9200 from way back in 2002 is still working, lol.

You, finding faults only with Nvidia GPUs; who could have guessed...

All GPUs can have issues, and both companies can have problems with specific games and models of cards. I never had any issues with NFS games, Metro, or Oblivion on my 8800s, 560, or 7750. I had two 8800s in SLI from late 2008 to 2011 and never had an issue (first GPU from 2007); my brother has also had an 8800GT since 2008 and it's still working to this day. You have to know what you're doing to maintain hardware, especially something like the 8800GTs or GTX 590s with their single-slot coolers or high TDP. If you had a GTX 590 and "it caught on fire", as you say, you had cooling problems or PSU issues. And even if the card was at fault, you could have gotten it RMA'ed and kept that superior graphics card.

Now, you claim your 9200 is still working; I still have a GeForce4 MX440, an FX 5200, and a 6600GT in working order to this day. The older the card, the lower the power draw and heat produced, allowing them to last longer.

One problem: laptop PCs' annual unit sales exceed desktop PCs' annual unit sales.

In terms of failure rate, NVIDIA's "bumpgate" rivals the Xbox 360's RROD.

I still have my working GeForce 4200 Ti AGP, FX 5950 Ultra AGP, and 8600 GT GDDR3 (desktop, DIY ViDock/eGPU with ExpressCard slot), but my Dell Inspiron 5150 work laptop with a mobile Pentium 4 3GHz + GeForce Go 5200 FX is dead.


#39 04dcarraher
Member since 2004 • 23829 Posts

[QUOTE="04dcarraher"]

[QUOTE="blaznwiipspman1"]

I haven't had any problems with NFS games or any game so far on my Radeon card, except for Oblivion, which freezes in one cave area. Can't blame AMD though, the game is 7 years old now. The GeForce 460 I used to have refused to play Metro 2033 until I patched the game, and I've seen plenty of faulty hardware issues with Nvidia cards, i.e. baking cards (GeForce 8800) and cards catching fire (GTX 590). My Radeon 9200 from way back in 2002 is still working, lol.

ronvalencia

You, finding faults only with Nvidia GPUs; who could have guessed...

All GPUs can have issues, and both companies can have problems with specific games and models of cards. I never had any issues with NFS games, Metro, or Oblivion on my 8800s, 560, or 7750. I had two 8800s in SLI from late 2008 to 2011 and never had an issue (first GPU from 2007); my brother has also had an 8800GT since 2008 and it's still working to this day. You have to know what you're doing to maintain hardware, especially something like the 8800GTs or GTX 590s with their single-slot coolers or high TDP. If you had a GTX 590 and "it caught on fire", as you say, you had cooling problems or PSU issues. And even if the card was at fault, you could have gotten it RMA'ed and kept that superior graphics card.

Now, you claim your 9200 is still working; I still have a GeForce4 MX440, an FX 5200, and a 6600GT in working order to this day. The older the card, the lower the power draw and heat produced, allowing them to last longer.

One problem: laptop PCs' annual unit sales exceed desktop PCs' annual unit sales.

In terms of failure rate, NVIDIA's "bumpgate" rivals the Xbox 360's RROD.

I still have my working GeForce 4200 Ti AGP, FX 5950 Ultra AGP, and 8600 GT GDDR3 (desktop, DIY ViDock/eGPU with ExpressCard slot), but my Dell Inspiron 5150 work laptop with a mobile Pentium 4 3GHz + GeForce Go 5200 FX is dead.

They just didn't get the concepts of reliability and testing. This became painfully clear with their flailing over the bumpgate fiasco. The talking point was that it was a new area of science, not understood by anyone. Having talked to five separate packaging engineers who all understood the problem and gave me the exact same reasoning for its occurrence, I originally thought they were just trying to make me believe their version of the truth, but I later found out that was not the case. This dawned on me when I noticed Nvidia was hiring a lot of thermal engineers in late 2008 to solve the problem retroactively. The disconnect was that the bumpgate failures were not due to a thermal problem per se. The chips did not cook to death. Nvidia's bumps were cracking due to repeated thermal stresses, leading to physical failure. Hiring thermal engineers to fix that is much akin to hiring more gas station attendants to improve a car's fuel economy; it was addressing the wrong problem. The fixes showed some creative but problematic engineering choices. By this point, things had gone from morbid humor to rather sad. Flailing is fun to watch for a bit, but it gets old, even if you call it dancing in a press release. The short story is that Nvidia was not testing properly, and the result was dead chips. That is the science the company didn't understand, and while its implementation may be complex, the concept most definitely is not.

#40 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="04dcarraher"]

You, finding faults only with Nvidia GPUs; who could have guessed...

All GPUs can have issues, and both companies can have problems with specific games and models of cards. I never had any issues with NFS games, Metro, or Oblivion on my 8800s, 560, or 7750. I had two 8800s in SLI from late 2008 to 2011 and never had an issue (first GPU from 2007); my brother has also had an 8800GT since 2008 and it's still working to this day. You have to know what you're doing to maintain hardware, especially something like the 8800GTs or GTX 590s with their single-slot coolers or high TDP. If you had a GTX 590 and "it caught on fire", as you say, you had cooling problems or PSU issues. And even if the card was at fault, you could have gotten it RMA'ed and kept that superior graphics card.

Now, you claim your 9200 is still working; I still have a GeForce4 MX440, an FX 5200, and a 6600GT in working order to this day. The older the card, the lower the power draw and heat produced, allowing them to last longer.

04dcarraher

One problem: laptop PCs' annual unit sales exceed desktop PCs' annual unit sales.

In terms of failure rate, NVIDIA's "bumpgate" rivals the Xbox 360's RROD.

I still have my working GeForce 4200 Ti AGP, FX 5950 Ultra AGP, and 8600 GT GDDR3 (desktop, DIY ViDock/eGPU with ExpressCard slot), but my Dell Inspiron 5150 work laptop with a mobile Pentium 4 3GHz + GeForce Go 5200 FX is dead.

They just didn't get the concepts of reliability and testing. This became painfully clear with their flailing over the bumpgate fiasco. The talking point was that it was a new area of science, not understood by anyone. Having talked to five separate packaging engineers who all understood the problem and gave me the exact same reasoning for its occurrence, I originally thought they were just trying to make me believe their version of the truth, but I later found out that was not the case. This dawned on me when I noticed Nvidia was hiring a lot of thermal engineers in late 2008 to solve the problem retroactively. The disconnect was that the bumpgate failures were not due to a thermal problem per se. The chips did not cook to death. Nvidia's bumps were cracking due to repeated thermal stresses, leading to physical failure. Hiring thermal engineers to fix that is much akin to hiring more gas station attendants to improve a car's fuel economy; it was addressing the wrong problem. The fixes showed some creative but problematic engineering choices. By this point, things had gone from morbid humor to rather sad. Flailing is fun to watch for a bit, but it gets old, even if you call it dancing in a press release. The short story is that Nvidia was not testing properly, and the result was dead chips. That is the science the company didn't understand, and while its implementation may be complex, the concept most definitely is not.

Yes, it was sad...

My other old work laptop, an ASUS W3J (14 inch) with a Mobility Radeon X1600 Pro, still works. The ASUS N80vn (14 inch) was the laptop that replaced the W3J.

The ASUS G1S/G1Sn and Sony VGN-FW-45 are my personal laptops.

My current Dell Studio XPS 1645 laptop is powerful enough (i.e. quad-core/8 threads) to run my dev work inside VMware Workstation along with my normal home workloads.

Anyway, next year's upgrade cycle might include a mid-range NV GeForce 8xx (~35 watts) or AMD Radeon 86x0M (~35 watts), but the key part would be Intel Haswell's release.

Avatar image for 04dcarraher
04dcarraher

23829

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#41 04dcarraher
Member since 2004 • 23829 Posts

[QUOTE="04dcarraher"][QUOTE="ronvalencia"]

One problem: laptop PCs' annual unit sales exceed desktop PCs' annual unit sales.

In terms of failure rate, NVIDIA "bumpgate" rivals the Xbox 360's RROD.

I still have my working GeForce4 Ti 4200 AGP, FX 5950 Ultra AGP, and 8600 GT GDDR3 (desktop, DIY ViDock/eGPU via ExpressCard slot), but my Dell Inspiron 5150 work laptop with a mobile Pentium 4 3GHz + GeForce FX Go 5200 is dead.

ronvalencia

They just didn't get the concepts of reliability and testing. This became painfully clear with their flailing over the bumpgate fiasco. The talking point was that it was a new area of science, not understood by anyone. I talked to five separate packaging engineers who all understood the problem and gave me the exact same reasoning for its occurrence. Originally I thought they were just trying to make me believe their version of the truth, but I later found out that was not the case. This dawned on me when I noticed Nvidia was hiring a lot of thermal engineers in late 2008 to solve the problem retroactively. The disconnect was that the bumpgate failures were not a thermal problem per se; the chips did not cook to death. Nvidia's bumps were cracking due to repeated thermal stresses, leading to physical failure. Hiring thermal engineers to fix that is much like hiring more gas station attendants to improve a car's fuel economy: it addresses the wrong problem. The fixes showed some creative but problematic engineering choices. By this point, things had gone from morbid humor to rather sad. Flailing is fun to watch for a bit, but it gets old, even if you call it dancing in a press release. The short story is that Nvidia was not testing properly, and the result was dead chips. That is the science the company didn't understand, and while its implementation may be complex, the concept most definitely is not.

My other old work laptop, an ASUS W3J (14 inch) with a Mobility Radeon X1600 Pro, still works. The ASUS N80vn (14 inch) was the laptop that replaced the W3J.

A lot of laptops with Nvidia GPUs that are 5 years old or more are still working.

Avatar image for godzillavskong
godzillavskong

7904

Forum Posts

0

Wiki Points

0

Followers

Reviews: 20

User Lists: 0

#42 godzillavskong
Member since 2007 • 7904 Posts
[QUOTE="FelipeInside"]Honesly, are people STILL doing the whole NVIDIA vs ATI thing? Grow up. It works like this: AMD brings out a new card that beats NVIDIA > NVIDIA brings out a new card that beats AMD >AMD brings out a new card that beats NVIDIA > NVIDIA brings out a new card that beats AMD and so on and so on.

I hear you. I usually buy AMD cards, but I've really been looking into possibly getting one of those, or maybe two of those 680s. I like the 7970s too. They seem too close in performance anyway, so I may just go with the cheaper of the bunch. I've been looking to replace my 6870s, but I'm not sure if I should just wait for the 8xxx series to come out. But I agree with you: both are great companies and they make very comparable GPUs.
Avatar image for jhcho2
jhcho2

5103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 2

#43 jhcho2
Member since 2004 • 5103 Posts

[QUOTE="blaznwiipspman1"]

[QUOTE="04dcarraher"] lol you need to get those blinders off...... it takes the ghz editions at 1920x1200 to actually to become on par or maybe slightly faster. those 670/680 and that's not even overclocked. So not sure why in your mind how AMD owned Nvidia... the only reason the 7950 or 7970 actually surpass the 670/680's beyond 1920x1200 isnt because that they have more transistors its because they have the memory bandwidth to actually allow 1440+ resolutions to flow. Because if it was because they had more raw processing power you should see a major difference in performance at 1920x1200.

ronvalencia

The reason im saying the 79xx cards has more raw power is because of the games it does well in compared to the geforce ones. For example, battlefield 3, crysis 1 and2, shogun, metro 2033, whereas nvidia does better in games with the UE3 engine aka borderlands 2, batman etc etc. Also the 79xx ghz editions are also at stock...they can be overclocked to 1-200mhz more pretty easily on air. I remember when the 680gtx came out everyone was bashing the 7970....looks like the situation is reversed now.

The 680 didn't win in heavy PC games, e.g. Metro 2033.

It's nothing to do with 'heavy' games, as you put it. Metro 2033 has always been biased in favour of ATI cards.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#44 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="blaznwiipspman1"]

The reason im saying the 79xx cards has more raw power is because of the games it does well in compared to the geforce ones. For example, battlefield 3, crysis 1 and2, shogun, metro 2033, whereas nvidia does better in games with the UE3 engine aka borderlands 2, batman etc etc. Also the 79xx ghz editions are also at stock...they can be overclocked to 1-200mhz more pretty easily on air. I remember when the 680gtx came out everyone was bashing the 7970....looks like the situation is reversed now.

jhcho2

The 680 didn't win in heavy PC games, e.g. Metro 2033.

It's nothing to do with 'heavy' games, as you put it. Metro 2033 has always been biased in favour of ATI cards.

Metro 2033 is an Nvidia "The Way It's Meant To Be Played" title, and the 7970 didn't exist when Metro 2033 was designed.

http://www.nzone.ro/object/nzone_twimtbp_gameslist_uk.html (search for Metro 2033)

The 7970 didn't win in lightweight titles such as Blizzard-created games and Call of Duty 4 in DX9, which skewed the averages.
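On the averages point, a quick hypothetical shows the effect: if an overall index mixes GPU-bound titles with CPU-bound ones where both cards hit roughly the same frame rate, the light titles dilute the gap. The numbers below are made up purely to illustrate the arithmetic, not taken from any review:

```python
# Hypothetical FPS numbers only: how folding CPU-bound "light" titles into an
# overall average compresses the apparent gap between two cards.

gpu_heavy = {"Metro 2033": (30, 36), "Battlefield 3": (55, 62)}   # (card A, card B) fps
cpu_light = {"CoD4 DX9": (200, 205), "StarCraft II": (180, 182)}  # both cards CPU-limited

def avg_advantage(results: dict) -> float:
    """Mean percentage advantage of card B over card A across the given games."""
    gains = [(b / a - 1.0) * 100.0 for a, b in results.values()]
    return sum(gains) / len(gains)

print(f"GPU-heavy games only: {avg_advantage(gpu_heavy):.1f}% ahead")                   # ~16.4%
print(f"All four games:       {avg_advantage({**gpu_heavy, **cpu_light}):.1f}% ahead")  # ~9.1%
```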

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#45 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="04dcarraher"] they just didnt get the concepts of reliability and testing. This became painfully clear with their flailing over the bumpgate fiasco. The talking point was that it was a new area of science, and not understood by anyone. Having talked to five separate packaging engineers who all understood the problem, and gave me the exact same reasoning for its occurrence, Originally I thought they were just trying to make me believe their version of the truth, but I later found out that was not the case. This dawned on me when I noticed Nvidia was hiring a lot of thermal engineers in late 2008 to solve the problem retroactively. The disconnect was that the bumpgate failures were not due to a thermal problem per se. The chips did not cook to death. Nvidias bumps were cracking due to repeated thermal stresses, leading to physical failure. Hiring thermal engineers to fix that is much akin to hiring more gas station attendants to improve a cars fuel economy, it was addressing the wrong problem. The fixes showed some creative but problematic engineering choices. By this point, things had gone from morbid humor to rather sad. Flailing is fun to watch for a bit, but it gets old, even if you call it dancing in a press release. The short story is that Nvidia was not testing properly, and the result was dead chips. That is the science that the company didnt understand, and while its implementation may be complex, the concept most definitely is not.04dcarraher

My other old work laptop, an ASUS W3J (14 inch) with a Mobility Radeon X1600 Pro, still works.

A lot of laptops with Nvidia GPUs that are 5 years old or more are still working.

Nvidia's bumpgate incident had above-average failure rates.
Avatar image for blaznwiipspman1
blaznwiipspman1

16542

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#46 blaznwiipspman1
Member since 2007 • 16542 Posts

[QUOTE="ronvalencia"][QUOTE="04dcarraher"] they just didnt get the concepts of reliability and testing. This became painfully clear with their flailing over the bumpgate fiasco. The talking point was that it was a new area of science, and not understood by anyone. Having talked to five separate packaging engineers who all understood the problem, and gave me the exact same reasoning for its occurrence, Originally I thought they were just trying to make me believe their version of the truth, but I later found out that was not the case. This dawned on me when I noticed Nvidia was hiring a lot of thermal engineers in late 2008 to solve the problem retroactively. The disconnect was that the bumpgate failures were not due to a thermal problem per se. The chips did not cook to death. Nvidias bumps were cracking due to repeated thermal stresses, leading to physical failure. Hiring thermal engineers to fix that is much akin to hiring more gas station attendants to improve a cars fuel economy, it was addressing the wrong problem. The fixes showed some creative but problematic engineering choices. By this point, things had gone from morbid humor to rather sad. Flailing is fun to watch for a bit, but it gets old, even if you call it dancing in a press release. The short story is that Nvidia was not testing properly, and the result was dead chips. That is the science that the company didnt understand, and while its implementation may be complex, the concept most definitely is not.04dcarraher

My other old work laptop, an ASUS W3J (14 inch) with a Mobility Radeon X1600 Pro, still works.

A lot of laptops with Nvidia GPUs that are 5 years old or more are still working.

Funnily enough, the only laptop that ever died on me had an Nvidia graphics card. I paid close to $1500 for the ASUS Z71V back in 2005; it had some of the best specs for a laptop in those days: a high-end Intel processor, a large HDD, WiFi, plus a GeForce graphics card capable of high settings in most games. After 3-4 years the laptop started artifacting, then it became stuck at the lowest resolution and overheated like crazy. The culprit was the graphics chip, the GeForce Go 6200. In my books, Nvidia just isn't known for quality hardware.

Avatar image for V4LENT1NE
V4LENT1NE

12901

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#47 V4LENT1NE
Member since 2006 • 12901 Posts

[QUOTE="04dcarraher"]

[QUOTE="ronvalencia"] My other old work laptop ASUS W3J (14 inch) laptop with mobility Radeon X1600 Pro still works. ASUS N80vn(14 inch) was the laptop to replace ASUS W3J.blaznwiipspman1

A lot of laptops with Nvidia GPUs that are 5 years old or more are still working.

Funnily enough, the only laptop that ever died on me had an Nvidia graphics card. I paid close to $1500 for the ASUS Z71V back in 2005; it had some of the best specs for a laptop in those days: a high-end Intel processor, a large HDD, WiFi, plus a GeForce graphics card capable of high settings in most games. After 3-4 years the laptop started artifacting, then it became stuck at the lowest resolution and overheated like crazy. The culprit was the graphics chip, the GeForce Go 6200. In my books, Nvidia just isn't known for quality hardware.

So because one Nvidia chip died on you that means they ALL must be bad, ha, haha, HAHAHAHAHA.
Avatar image for JohnF111
JohnF111

14190

Forum Posts

0

Wiki Points

0

Followers

Reviews: 12

User Lists: 0

#48 JohnF111
Member since 2010 • 14190 Posts
[QUOTE="blaznwiipspman1"]

[QUOTE="04dcarraher"] Alot of laptops are still working with nvidia gpu's that are 5 years old or more.

V4LENT1NE

Funnily enough, the only laptop that ever died on me had an Nvidia graphics card. I paid close to $1500 for the ASUS Z71V back in 2005; it had some of the best specs for a laptop in those days: a high-end Intel processor, a large HDD, WiFi, plus a GeForce graphics card capable of high settings in most games. After 3-4 years the laptop started artifacting, then it became stuck at the lowest resolution and overheated like crazy. The culprit was the graphics chip, the GeForce Go 6200. In my books, Nvidia just isn't known for quality hardware.

So because one Nvidia chip died on you that means they ALL must be bad, ha, haha, HAHAHAHAHA.

He's a fanboy, what did you expect? Of course he's going to take one bad chip and blow it out of proportion as if it applied to every nVidia chip ever made.
Avatar image for jhcho2
jhcho2

5103

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 2

#49 jhcho2
Member since 2004 • 5103 Posts

[QUOTE="jhcho2"]

[QUOTE="ronvalencia"] 680 didn't win with heavy PC games e.g. Metro 2033. ronvalencia

It's nothing to do with 'heavy' games, as you put it. Metro 2033 has always been biased in favour of ATI cards.

Metro 2033 is an Nvidia "The Way It's Meant To Be Played" title, and the 7970 didn't exist when Metro 2033 was designed.

http://www.nzone.ro/object/nzone_twimtbp_gameslist_uk.html (search for Metro 2033)

The 7970 didn't win in lightweight titles such as Blizzard-created games and Call of Duty 4 in DX9, which skewed the averages.

Are you new to benchmarks? Metro 2033 has always yielded unfavourable results for Nvidia cards. ATI cards that would lose to certain Nvidia cards in every other benchmark would suddenly win in Metro 2033. So far, I have identified two games I would never use as a fair comparison between Nvidia and ATI. One is Enemy Territory: Quake Wars. The other is Metro 2033.

Second, next time post benchmarks from other sites like Guru3D or Tom's Hardware as well. Their results seem to show something different about Crysis 2.

Avatar image for marcthpro
marcthpro

7927

Forum Posts

0

Wiki Points

0

Followers

Reviews: 24

User Lists: 0

#50 marcthpro
Member since 2003 • 7927 Posts

[QUOTE="V4LENT1NE"][QUOTE="blaznwiipspman1"]

Funnily enough, the only laptop that ever died on me had an Nvidia graphics card. I paid close to $1500 for the ASUS Z71V back in 2005; it had some of the best specs for a laptop in those days: a high-end Intel processor, a large HDD, WiFi, plus a GeForce graphics card capable of high settings in most games.

After 3-4 years the laptop started artifacting, then it became stuck at the lowest resolution and overheated like crazy. The culprit was the graphics chip, the GeForce Go 6200. In my books, Nvidia just isn't known for quality hardware.

JohnF111

So because one Nvidia chip died on you that means they ALL must be bad, ha, haha, HAHAHAHAHA.



He's a fanboy, what did you expect? Of course he's going to take one bad chip and blow it out of proportion as if it applied to every nVidia chip ever made.



LOL, that story again. As much as it sucks, I find it funny to hear it again; I was here when he first ranted about that laptop. If only I could find the post, it must be around 4 years old. Nvidia has been reliable for me so far, as far as I know :)

It's ATI that gave me headaches, with my Radeon HD 4870X2. Ever since the 10.4 driver series I have had a lot of issues with driver support. I did manage to fix the micro-stuttering in Battlefield: Bad Company 2, but it is the worst purchase I ever made, since it gave me countless CTDs and other issues in a few games. It did max most games out and the FPS was high, but the temps were high and so was the noise, and whenever I tried to ask for help I got troll replies here, so I checked other forums and found temporary solutions.

While that makes me look like I'm bashing ATI, ATI did deserve to be bashed back then. However, from what I can read, AMD's HD 6xxx and HD 7xxx are fairly reliable, though the Rage video game issue caused a huge flame war over there :P It was like a GameSpot System Wars thread, lol, and it got locked.