Xfire VS SLi (again) But Look....

This topic is locked from further discussion.


#1 xipotec
Member since 2005 • 493 Posts

Been seeing a lot about this battle of GPUs.

Where do you stand? I've heard a lot of people say they like Xfire better than SLI (mainly because of scaling issues).

But look at these benchmarks.

http://it-review.net/index.php?option=com_content&task=view&id=1483&Itemid=91&limit=1&limitstart=5

(look through all the tests.)

Anyone got a link with a similar comparison? I'm especially interested in a contradictory result. (Who isn't?)

Xipo


#2 Rhamsus
Member since 2007 • 1078 Posts

I'll bite.

ATI takes the victory here in most benchmarks, but the visual quality is still on the side of NVIDIA.

Only in Supreme Commander would I want a CF setup (due to the low FPS on both). I prefer picture quality over FPS most of the time; only in a case like this, where those few frames are the difference between playable and unplayable, would I go for the higher-FPS setup. I give it to NVIDIA given the minuscule FPS difference in most tests. That's my own personal preference; some will say different, I'm sure.


#3 yoyo462001
Member since 2005 • 7535 Posts
Xfire?? Xfire is a gaming network and communication program; you're mistaking it for Crossfire. The thing with SLI is that it's not efficient enough: in a dual setup you've lost about 50% of a card's power, really (well, that's the performance you see)...
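The rough arithmetic behind claims like the one above can be made explicit. A minimal sketch (my own illustration, not from any benchmark; the function and numbers are made up):

```python
# Illustrative only: quantifying multi-GPU scaling claims, not measured data.
def scaling_efficiency(single_fps, multi_fps, n_gpus=2):
    """Fraction of each extra card's theoretical speedup actually realized."""
    speedup = multi_fps / single_fps        # e.g. 1.5 means +50% FPS
    return (speedup - 1) / (n_gpus - 1)     # 1.0 would be perfect scaling

# If a second card only lifts 60 FPS to 90 FPS, half its power is "lost":
print(scaling_efficiency(60, 90))    # 0.5
print(scaling_efficiency(60, 120))   # 1.0 (ideal, never seen in practice)
```

So "you've lost about 50% of a card's power" corresponds to a dual-GPU setup delivering roughly +50% FPS over a single card.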

#4 42316
Member since 2006 • 1502 Posts
LOL... when I saw the title I was like WTF, Xfire is nothing like SLI, so how the hell can they be compared? Then I was like OHHHHHHHH!!! ... BTW, I ain't got a clue, but I reckon SLI, 'cos it sounds cool... lol, JK

#5 Indestructible2
Member since 2007 • 5935 Posts
This topic is pointless; only fools buy HD 2900XTs or 8800GTS 320/640s nowadays. Find me some CROSSfire vs SLI benchies with the HD 3850/70 and 8800GT.

#6 TheLiberal
Member since 2007 • 294 Posts

Xfire?? Xfire is a gaming network and communication program; you're mistaking it for Crossfire. The thing with SLI is that it's not efficient enough: in a dual setup you've lost about 50% of a card's power, really (well, that's the performance you see)...yoyo462001

It's obvious what he's talking about here. No one with an IQ of 100 or better would have trouble figuring it out. Plus, xfire isn't an uncommon abbreviation for crossfire.


#7 blackstar
Member since 2004 • 1252 Posts
Well, he was just trying to correct him.

But anyways, Crossfire tends to scale better than SLi (I'm looking at the bigger picture, not just going off one review).

#8 dayaccus007
Member since 2007 • 4349 Posts
I don't trust that site. I saw 8800 GT SLI vs. HD 3870 Crossfire, and the HD Crossfire was faster.

#9 codezer0
Member since 2004 • 15898 Posts
SLi > CrossFire. ATi is still playing catch-up with its multi-GPU implementation. The new 7-series ATi chipset may allow up to four GPUs in CrossFire, but no board I've seen on the market can (safely) accept four HD 3870s in one big CrossFire, short of equipping them all with aftermarket water blocks. In almost every case you'd either have to go with some 3850s (for the single-slot coolers) or scale back to two 3870s to safely have room for CrossFire and the remaining slots.

CrossFire may support one more rendering mode than SLi, but it only works in Direct3D. ATi has a long, deep-seated hatred for OpenGL: for an OpenGL game, CrossFire's default behavior is to either scale back or disable CF altogether, with no way to override this (even if the game engine supports it) like there is with SLi.

And single card vs. single card, ATi has nothing that can legitimately compete with the 8800GTX/Ultra - not even its highest-end 3870. If you want a lifetime warranty, forget it: the only manufacturer making lifetime-warranty Radeon cards (Visiontek) has an eternal waiting list for any cards from them, and no motherboard maker is willing to make a lifetime-warranty board with a DAAMiT chipset.

#10 dayaccus007
Member since 2007 • 4349 Posts

SLi > CrossFire. codezer0

I doubt it; Crossfire is 50% faster than a single card, while SLI is only 30% faster.


#11 TheLiberal
Member since 2007 • 294 Posts

SLi > CrossFire. ATi is still playing catch-up with their multi-GPU implementation... The only manufacturer making lifetime-warranty Radeon GPU's (Visiontek) has an eternal waiting list for any cards from them, and no motherboard makers willing to make a board with lifetime warranty with a DAAMiT chipset.codezer0

Big waiting list for Visiontek products you say? Nope.

I guess this was bound to happen sooner or later, but the fanboys have arrived to ruin the discussion. SHOO!


#12 codezer0
Member since 2004 • 15898 Posts

Big waiting list for Visiontek products you say? Nope.

I guess this was bound to happen sooner or later, but the fanboys have arrived to ruin the discussion. SHOO!

TheLiberal
The 3850 might be available, but fat chance of scoring a 3870, which does need a waiting list to get.

[QUOTE="codezer0"]SLi > CrossFire. dayaccus007

I doubt, Crossfire is 50% faster than single card while SLI is only 30%

Please pass whatever crack you're smoking so I can have some. SLi was only a ~30% improvement in the GeForce 6 generation; GeForce 7 and 8 successively gained MUCH more than that. And Crysis as-is on DX10 Very High saw a linear improvement from single card up to Tri-SLi, and that's before the so-called multi-GPU patch they're working on.

#13 TheLiberal
Member since 2007 • 294 Posts

[QUOTE="codezer0"]The only manufacturer making lifetime-warranty Radeon GPU's (Visiontek) has an eternal waiting list for any cards from themTheLiberal


[QUOTE="TheLiberal"]

Big waiting list for Visiontek products you say? Nope.

I guess this was bound to happen sooner or later, but the fanboys have arrived to ruin the discussion. SHOO!

codezer0

3850 might be available, but fat chance in scoring a 3870, which does need a waiting list to get.

Oh, I see, so now we've gone from "any cards" to just the 3870, eh? Come on, man, stick to your story. You're a fanboy; facts mean nothing to you. Just because I proved you wrong doesn't mean you can't parrot the same thing over and over again, lol.

Have you noticed that the 3870 is rather popular, and that many retailers are out of many different brands of it? Let's face it: you're biased against ATi for some reason (most likely no reason), and you made something up to make them sound bad without checking whether you were right, thinking no one would call you out for being a fanboy.

This is why GameSpot needs Wikipedia's [Citation Needed] tag.

I own both nVidia and ATI cards, and I'm going to let you in on a little secret: neither company is significantly better than the other in any respect, especially now with the 3800 series and the 8800 refresh parts. I know that's blasphemy to you, but in the real world it's the truth. Also, before you try saying "the GTX/Ultra is way faster": first, they're not way faster, and second, show me someone who buys the GTX or Ultra with the GT and GTS (G92) out, and I'll show you a man without a brain.


#14 codezer0
Member since 2004 • 15898 Posts
I'm biased against ATi because I went with them when everybody and their grandmother was saying they had the superior product, only to realize it was all a load of garbage. The card never performed at the level the reviews and benchmarks indicated, even when the test system had a slower CPU and less RAM than mine did. I was constantly fighting to get one game or another to work correctly, and their tech support was the single most infuriating thing to deal with this side of AOL and Dell.

Yes, ATi finally got me a trouble-free card in the X1600, but it took nearly three years of fighting with the product I initially bought from them, going through the motions over and over with support, and a Better Business Bureau complaint before I got any satisfaction. It took six months after the release of KotOR for PC before ATi even acknowledged there was a problem in their driver, and another five before they released a partial fix that didn't fully work. It was only after a year that there was a fix that consistently worked with KotOR 1, and then it was removed and broken again in the next month's Catalyst driver release.

And to date, ATi still cannot explain to me why their driver packages require all those different services just to open the CCC control panel. They cannot explain why, even after Windows has finished booting and everything is loaded, the whole system bogs down before the screen blacks out and things respond normally. They also haven't explained why I need two of the same executables for different services plus the CCC running at all times (consuming >100MB of system memory) just to be able to open the CCC at all.

To date, I've yet to call nVidia tech support for anything, other than to find out where I could buy notebooks equipped with a certain GPU of theirs (i.e. when the GeForce 7 Go series launched). I've never had a technical issue with their product, never had a game not work for more than one driver revision, and never had to run through the ritualistic voodoo of uninstalling, running driver cleaners, and erasing every little registry entry before installing the next version of a driver. The drivers always **** worked with nVidia. Yet every single driver release for ATi seemed to just cause more problems, make the card slower, and introduce new holes and bugs to contend with.

If I'm going to be accused of being a fanboy, it's because I'm a fanboy for products that **** work right from the word "go." It took almost three years of fighting with ATi's product before finally receiving one that would do that. Conversely, nVidia products worked right the first time and kept working.

#16 codezer0
Member since 2004 • 15898 Posts
It isn't a fabrication, **** It's first-hand experience with the company. If they want to screw me over, fine. But I have the right, privilege, and responsibility to share the extent to which they screwed me over, and you have no right to call my experiences a lie. If you don't want to believe it, that's fine. But that just equates to saying you believe the solar system revolves around the Earth, like the Christian Church tried to claim in spite of scientific evidence proving it wrong.

#17 Rhamsus
Member since 2007 • 1078 Posts

[QUOTE="codezer0"]I'm biased against ATi...TheLiberal

I'll just save everyone the time of reading a long whiny story, most of which is probably a fabrication, and leave the only part of your rant that matters.

That's all that needs to be said here. You're biased against them, which is why I called you out in the first place. The best thing is, you admit it. Like they always say, the first step to recovery is admitting you have a problem.

And your problem is?

...being an internet tough guy.

Now, I'm not a fanboy by any means: if ATi makes the superior card, I'm buying ATi; if NVIDIA does, you get the point. What it comes down to is that for quite a while now ATi has had the inferior product, support, and drivers. You cannot argue with that, regardless of cost, brand, or whatever other constraint you want to wrap around it: NVIDIA is superior.

Now that that's been said: LONG LIVE 3DFX!


#18 GamingMonkeyPC
Member since 2005 • 3576 Posts

In regards to links for more recent Crossfire and SLI comparisons, I recommend checking out Bit-Tech's review:

http://www.bit-tech.net/hardware/2007/11/30/rv670_amd_ati_radeon_hd_3870/1

Personally, I still feel it's best to stick with one powerful card over dual video cards. I do, however, like the Hybrid Crossfire idea... onboard + 3D accelerator; kind of like the good ol' 3dfx Voodoo 1 and 2 days!


#19 quietguy
Member since 2003 • 1218 Posts

[QUOTE="codezer0"]I'm biased against ATi...TheLiberal

I'll just save everyone the time of reading a long whiny story, most of which is probably a fabrication, and leave the only part of your rant that matters.

That's all that needs to be said here. You're biased against them, which is why I called you out in the first place. The best thing is, you admit it. Like they always say, the first step to recovery is admitting you have a problem.

Reported.

So anyway, I'm inclined to agree that ATI has been scraping against the pavement for the better part of 3-4 years, ever since nVidia began stepping up its product lines and performance ratios. If we went back and looked at last year's benchmarks, anyone could see that ATI's products were off by several FPS; a very minuscule difference, but that was because they weren't running under the ideal conditions that would really have stressed their hardware bandwidth. If we did those same comparisons again with this year's up-to-date technology, I'd hypothesize a different conclusion, because the newer specs would make the distinction clear.

I like the direction CrossFire is going, but it may well hurt gamers who want to go old school, by which I mean playing games made in the 80s/90s in their original state. The idea behind it was to run today's games, not yesterday's (unless you have graphics fixes), in an unparalleled, dedicated rendering process that is solely ATI proprietary. SLi was more like a "64-bit" upgrade to an existing architecture the industry had been using for over a decade; it may not be as fast as a dedicated Direct3D pipeline, but it manages to retain some sense of compatibility and durability.

But I think I'll stick with single cards; they're more heat-efficient and cost-effective.


#20 TheLiberal
Member since 2007 • 294 Posts

Reported.quietguy

lol

I forgot why I stopped posting here, but now I remember. The people here are about as smart as the reviews here are credible.

After reading that, I know a few websites I won't be visiting any more.


#21 karasill
Member since 2007 • 3155 Posts

SLi > CrossFire. ATi is still playing catch-up with their multi-GPU implementation. The new 7 series ATi chipset may be able to allow up to four GPU's in CrossFire, but no board I've seen on the market can (safely) accept four HD 3870's in one big CrossFire...codezer0
You can have tri-Crossfire. We all know you hate ATI with a passion, but honestly I doubt anyone is going to take you seriously because of that. Your objectivity is tarnished, and hence everything you say in regard to ATI will be negative, even if it's something positive to begin with (quad Crossfire, for instance).

So why don't you take a break from these boards? You seem to come here and argue all the time, like you thrive on conflict or something. If you want to argue, go to SW's; this is a hardware discussion board, not an "I hate ATI," "they screwed me over," "I'm an Nvidia fanboy" board.

Edit: Crossfire > SLI. There is a recent benchmark of the HD 3870 in Crossfire beating the 8800 GT in SLI by a good amount, even though a single 8800 GT is 10-15% faster than a single HD 3870. And Tri-SLI is limited to only the Ultra and GTX, both of which cost an arm and a leg.


#23 quietguy
Member since 2003 • 1218 Posts
[QUOTE="quietguy"]

Reported.TheLiberal

lol

I forgot why I stopped posting here, but now I remember. The people here are about as smart as the reviews here are credible.

After reading that, I know a few websites I won't be visiting any more.

Good for you ^^. That just means fewer trolls in here.


#24 sabbath2gamer
Member since 2007 • 2515 Posts
codezer0, you're saying you go with nVidia just because you had a card that was crap from ATI? Well, newsflash: every company will have a bad apple once in a while. You can't just hate them 'cuz you got one lousy card out of millions of others.

#25 codezer0
Member since 2004 • 15898 Posts
codezer0, you're saying you go with nVidia just because you had a card that was crap from ATI? Well, newsflash: every company will have a bad apple once in a while. You can't just hate them 'cuz you got one lousy card out of millions of others. sabbath2gamer
I hate them because rather than try to help me get "a good one," as you would say, they just kept dicking me around and making me waste my time with their drivers and their support, until I finally decided it was time to stop taking ATi's **** and filed with the BBB. If not for that, they would have simply let my warranty run out and I would have been out the cost of finding a replacement graphics card. :| It's like how cell companies nowadays lock you into a two-year contract just so they can screw you over for those two years, and in order to get a replacement phone that's worth a damn, you suddenly have to renew and extend your service contract. The way every new driver seemed to cause more problems than it solved with my old ATi card also has a lot to do with why I'm still angry with them.

The only way I would even consider buying an ATi product again is if it was from Visiontek, and that's because of the lifetime warranty. And that only covers the GPUs - last I've been able to find, there are no CrossFire-capable motherboards with a lifetime warranty (while eVGA's A* series motherboards all have one, for example).

There's also the fact that ATi has historically **** up OpenGL gaming, and hates it with a passion. The reality is that if you have an OpenGL app and their driver doesn't provide a pre-defined CrossFire profile for it, you either get the very slowest form of CrossFire acceleration or no CF acceleration at all - meaning that second card is basically sitting there doing nothing. SLi's drivers, on the other hand, will let you create a profile for any game or app, whether it uses OpenGL or Direct3D.

This blatant oversight by ATi says to me either that their driver development team is full of incompetents (as if requiring .NET for their CCC control panel wasn't indication enough), or that they have a deep-seated hatred for OpenGL and want to punish anyone who wants to run an OpenGL game or app. It's a wonder their FireGL workstation products work at all, because the people in the market for a FireGL or Quadro card are exactly the people working with apps that are OpenGL ONLY - very few modeling/rendering applications do anything with Direct3D, because OpenGL is so much faster at it. And unlike the situation between DX9.* and DX10, OpenGL can be handled in ONE build of a program and automatically scale back as far as OpenGL 1.* hardware if need be.
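The profile behavior described above (no CrossFire profile for an OpenGL game means the second card idles, while SLi lets users add their own profiles) boils down to a lookup with a conservative default. A toy sketch; the table entries, mode names, and function are invented for illustration, not actual driver code:

```python
# Toy model of per-application multi-GPU profiles; NOT real driver logic.
DRIVER_PROFILES = {
    ("game_a.exe", "direct3d"): "afr",       # hypothetical entries
    ("game_b.exe", "opengl"): "disabled",
}

def multi_gpu_mode(exe, api, user_profiles=None):
    # SLi-style behavior: a user-created profile overrides the driver's
    # defaults for any API, OpenGL included.
    key = (exe, api)
    if user_profiles and key in user_profiles:
        return user_profiles[key]
    # Conservative default: with no profile, the second GPU does nothing.
    return DRIVER_PROFILES.get(key, "disabled")

print(multi_gpu_mode("game_c.exe", "opengl"))  # disabled (no profile)
print(multi_gpu_mode("game_c.exe", "opengl",
                     user_profiles={("game_c.exe", "opengl"): "afr"}))  # afr
```

The complaint in the thread is essentially about that default branch: without a user-editable profile, an unknown OpenGL title falls through to "disabled."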

#26 xipotec
Member since 2005 • 493 Posts

OK, getting a bit off topic?

Anyone else with cold hard FACTS?

BTW, I have seen Xfire used as short for Crossfire before, but I forgot about the gaming network. Sorry 'bout the confusion.

Xipo


#27 Wesker776
Member since 2005 • 7004 Posts

SLi > CrossFire. ATi is still playing catch-up with their multi-GPU implementation. codezer0

What?

A lot has changed, codezer0, in the performance gains CrossFire brings. Since its launch with the X100 series, CrossFire has come a long way, to the point of having better performance scaling than SLi. This is mainly due to the high bandwidth between the two cards and the way CrossFire renders frames (Alternate Frame Rendering is the default rendering mode).

CrossFire does scale better than SLi in the majority of games.
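For anyone unfamiliar with the term, Alternate Frame Rendering just deals whole frames out to the GPUs round-robin, rather than splitting each frame between cards. A minimal sketch of the idea (my own illustration; the function name is made up):

```python
# Alternate Frame Rendering (AFR) in miniature: each GPU renders every
# n-th whole frame instead of part of every frame.
def afr_schedule(n_frames, n_gpus=2):
    """Map each frame number to the GPU that renders it."""
    return {frame: frame % n_gpus for frame in range(n_frames)}

print(afr_schedule(6))  # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
```

Because each card works on an independent frame, AFR scales well when the GPUs can stay fed, which is the scaling advantage being claimed here.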


#28 blackstar
Member since 2004 • 1252 Posts

[QUOTE="codezer0"]SLi > CrossFire. dayaccus007

I doubt, Crossfire is 50% faster than single card while SLI is only 30%

Please pass whatever crack you're smoking so I can have some. SLi was only a 30% improvement in the geForce 6 generation. geForce 7 and 8 successively gained MUCH more than that. And Crysis as-is on DX10 Very High saw a linear improvement from single-card up to Tri-SLi, and this is before the so-called multi-GPU patch they are working on.




Excuse me? A linear improvement with Crysis on Tri-SLi? I think you need to look at those graphs again and get those eyes checked out (just kidding though).

http://www.anandtech.com/video/showdoc.aspx?i=3183&p=3

Crysis benchmarks on other page.


EDIT.

Ahh, you said linear; I thought you meant exponential... never mind.

#29 bumsoil
Member since 2006 • 924 Posts

[QUOTE="codezer0"]SLi > CrossFire. codezer0

I doubt, Crossfire is 50% faster than single card while SLI is only 30%

Please pass whatever crack you're smoking so I can have some. SLi was only a 30% improvement in the geForce 6 generation. geForce 7 and 8 successively gained MUCH more than that. And Crysis as-is on DX10 Very High saw a linear improvement from single-card up to Tri-SLi, and this is before the so-called multi-GPU patch they are working on.

THANK YOU!!!! I was going to say the same.


#30 bumsoil
Member since 2006 • 924 Posts

[QUOTE="codezer0"]I hate them because rather than try and help me get "a good one," as you would say, they just kept dicking me around and making me waste time with their drivers, their support, until I finally decided it was time to stop taking ATi's **** and filed with the BBB...codezer0

You are great... I had the same problem with ATI. Also, Crossfire does NOT work with Quake Wars!


#31 saifiii
Member since 2006 • 274 Posts

What's the fuss with OpenGL? FireGL:

http://www.fudzilla.com/index.php?option=com_content&task=view&id=4679&Itemid=1


#32 Wesker776
Member since 2005 • 7004 Posts

Big performance gains for CrossFire compared to SLi are seen in the benchmark graphs (red indicates ATi's performance, green NVIDIA's):


"But teh atI hatez t3h openGL!!!! (Quake Wars is OpenGL for anyone who doesn't know)"

ATi has never "hated" OpenGL; they just haven't been very good at it in the past. But that's all changing now, especially with their commitment to the professional market with the FireGL graphics line. For example, take a sample review from Tom's Hardware comparing the FireGL V7600 and the Quadro FX 4600:

The transition to workstation graphics cards with support for shader model 4.0 goes hand in hand with a performance leap. Thus, ATI's and Nvidia's previous generation of cards are ready to be put on the shelf. You should only buy one of these models if you can find it at a steep discount, and if its performance is sufficient for your needs.

Currently, Nvidia's Quadro FX 4600 offers very balanced performance, both in synthetic and real-world benchmarks. This is due to the very mature ForceWare driver release 162.62. Still, the card does not take the lead across the board. Also, the street price of €1650 is too high, considering how well ATI's competing product performs.

ATI's FireGL V7600 is entering the market with very aggressive pricing, aiming to sell its card at $1000 (before taxes). Although the pricing for Europe had not been finalized at press time, that would translate to roughly €850 including tax (at the current exchange rate). And therein lies Nvidia's problem - the V7600 displayed very good performance potential in all synthetic Viewperf benchmarks in our tests. In Maya 6.5, Nvidia is left in the dust, while the competitors are tied for the lead in 3D Studio Max. The only benchmark that ATI has to concede to Nvidia is Solidworks, and in our opinion, that will change once ATI's drivers mature some more.

As a result of its very good value for the money, ATI's FireGL V7600 gets our editors choice recommendation. The R600 chip with shader model 4.0 brings a performance leap to the workstation arena. Meanwhile, the Nvidia's G80-based chips also have a lot to offer. However, Nvidia should reconsider its pricing strategy - only then will the company become competitive again.

I think it's time that you sobered up from your hatred of ATi. ATi has changed a lot, even before it was bought out by AMD. Most companies change for the better to survive; take Intel as a prime example. Is the aggressive and committed Intel that launched Core 2 Duo the same company that was forcing NetBurst onto consumers?

Avatar image for Mam00th
Mam00th

432

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#33 Mam00th
Member since 2005 • 432 Posts

WTF cross fire or sli,

Iruncrysisallmaxedwithmyvodoo3at60fps

#34 codezer0
Member since 2004 • 15898 Posts
:| Remember all the supposed "OpenGL performance enhancements" when Doom 3 came out, and how they were hyped as making Radeon cards 50% faster in Doom 3 and the like? Oh sure, it sounded all nice and great... but those improvements only applied to the X1k lineup, and at the time, I only had the 9800 Pro. So instead of a performance improvement, I actually saw even WORSE performance with the new drivers.

It was a constant battle even after the supposed fixes to get KotOR 1 and 2 to work properly. And when ATi finally released a driver with a properly working fix to keep the games from crashing, it was broken again the very next month because KotOR 2 wouldn't stop crashing.

Oh yeah, and the whole "ATi is the preferred graphics for Half-Life 2" bit? The sad thing is I bought into that **** with the 9800 Pro. So explain to me why my 9800 Pro, in nearly three **** years of drivers, updates, and patches, could NEVER run HL2 on High at any resolution, while review sites with test systems using even slower CPUs than my oldie were able to do it?

They hype up each driver release like it's somehow going to earn you a magical performance improvement ("OMG 70% INCREASE IN T3H B10SH0CK!11!" - remember that one? Which again only applied if you were running a pair of HD 2900XTs in CrossFire), but it's all a lot of hardcore bull****, especially on the OpenGL gaming side. Even when my 9800 Pro was their flagship card, those hyped-up performance improvements were a load of bull**** that ATi kept feeding people just to keep them on their cards.

And again, look at your graphs... SLi may not scale as readily/highly, but by the numbers, the SLi config still (usually) ends up faster.
#35 blackstar
Member since 2004 • 1252 Posts
I fail to understand how people can skip checking the CrossFire vs. SLi benchmarks and still say SLi is better.
#36 Rhamsus
Member since 2007 • 1078 Posts

WTF cross fire or sli,

Iruncrysisallmaxedwithmyvodoo3at60fps

Mam00th

only 60? im getting over 9000!

#37 RayvinAzn
Member since 2004 • 12552 Posts

So explain to me why my 9800 Pro, in nearly three **** years of drivers, updates, and patches, could NEVER run HL2 on High at any resolution, while review sites with test systems using even slower CPUs than my oldie were able to do it? codezer0

Bad card? Poor customer service is a shame, and it's worth bringing up. Still, Occam's razor clearly indicates that something about your machine was faulty - most likely the graphics card, possibly your motherboard, or even your power supply. If everyone else was doing great and you weren't, the problem was clearly on your end, not ATI's.

Customer service-wise, I feel your pain. I'll even admit that driver updates are more complicated with ATI than with Nvidia. But the bottom line is this: most of your ill will toward ATI is based on what was likely a bad component, not a bad card lineup.

#38 Wesker776
Member since 2005 • 7004 Posts

:| Remember all the supposed "OpenGL performance enhancements" when Doom 3 came out, and how they were hyping up how it would make Radeon cards 50% faster in Doom 3 and the like? Oh sure, it sounded all nice and great... but those improvements only applied to the X1k lineup, and at the time, I only had the 9800 Pro. So instead of a performance improvement, I actually saw even WORSE performance with the new drivers. It was a constant battle even after the supposed fixes to get Kotor 1 and 2 to work properly. And when ATi finally released a driver that had a properly working fix to keep the games from crashing, it was immediately broken again the very next month because Kotor 2 didn't work and wouldn't stop crashing. Oh yea, and the whole "ATi is preferred graphics for Half Life 2" bit? sad thing is I bought into that **** with the 9800 Pro. So explain to me why my 9800 Pro in nearly three **** years of drivers, updates, patches, etc. Could NEVER run HL2 on High at any resolution, while there were review sites using test systems with even slower CPU's than my oldie uses able to do it? They hype up each driver release like it's somehow going to earn you a magical performance improvement ("OMG 70% INCREASE IN T3H B10SH0CK!11!" remember that one? Which again only applied if you were doing a pair of HD 2900XT's in CrossFire), but it's all a lot of hardcore bull****, especially on the OpenGL gaming side. Even when my 9800 Pro was their flagship card, those hyped up performance improvements were a load of bull **** that ATi kept feeding just to try and keep people on their cards.codezer0

I can only say "get over it".

You're going on about a past in which you were extremely unlucky.

The vast majority of ATi users have nothing but good things to say about their cards and ATi's service. You're an outlier who's taking things way too far.

Also, what's wrong with a 70% CrossFire improvement in Bioshock? That release brought a lot more than just the performance improvement anyway. At least the driver team delivered some kind of huge performance gain.

All your problems and whinging can be countered with:
- Bad luck
- Companies change

You're also going drastically off topic. This topic is about CrossFire vs SLi, not codezer0's hatred for ATi.

And again, look at your graphs... Sli may not scale as readily/highly, but the issue is that by the numbers, the SLi config is still (usually) ending up faster.

:|

Ignorance much?

I think you're missing the 50% performance lead the CrossFire HD 3870 setup has over 8800 GT SLi in Call of Juarez DX10...
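For anyone unsure how these percentage figures are worked out, here's a minimal sketch (the FPS numbers below are made up for illustration, not taken from the benchmarks above):

```python
# Illustrative only - these FPS numbers are hypothetical, not from the review.
def lead_percent(a_fps, b_fps):
    """How much faster setup A is than setup B, as a percentage."""
    return (a_fps / b_fps - 1) * 100

def scaling_percent(dual_fps, single_fps):
    """Extra performance the second card adds over a single card."""
    return (dual_fps / single_fps - 1) * 100

crossfire_fps, sli_fps = 45.0, 30.0   # assumed dual-card averages
print(round(lead_percent(crossfire_fps, sli_fps)))   # a "50% lead"
print(round(scaling_percent(45.0, 25.0)))            # "80% scaling" from one card to two
```

So a "50% lead" just means one setup averages 1.5x the frame rate of the other, and scaling tells you how much of the second card's theoretical 100% actually shows up.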

#39 mura89
Member since 2007 • 50 Posts

This topic is pointless - only fools buy HD 2900XTs or 8800 GTS 320/640s nowadays. Find me some CrossFire vs SLI benchies with the HD 3850/70 and 8800 GT. Indestructible2

Too true. I'm going to buy myself an HD 3850; I think I'll be OK for all of my gaming for a year or two.