ATI Radeon HD 2000 Series Hands-On

Find out about ATI's DirectX 10 Radeon HD 2000 series video card lineup.

[UPDATE: Added new release information and benchmark results for the Radeon HD 2600 and HD 2400 series cards.]

AMD may have released its new DirectX 10-compatible ATI Radeon lineup months after the Windows Vista launch, but the delay hasn't hurt PC enthusiasts, because most of the highly anticipated DirectX 10 games, such as Crysis and BioShock, aren't going to arrive until the second half of the year. Yes, we've seen DirectX 10 versions of Lost Planet and Company of Heroes, but we can't expect modified DirectX 9 games to compete against new games developed for DirectX 10. Subtle shadowing changes and extra rocks on the ground aren't going to help Microsoft sell more copies of Vista.

Those holding off on DirectX 10 Windows Vista upgrades in anticipation of upcoming DirectX 10 games can choose from two brands of video cards. Nvidia's DX10-compatible GeForce 8 series has been available since the end of last year. AMD released the first member of its ATI Radeon HD 2000 GPU lineup, the ATI Radeon HD 2900 XT, this past May. The midrange Radeon HD 2600 and the entry-level Radeon HD 2400 cards arrive in early July. All the Radeon HD 2000 series cards feature full DirectX 10 compatibility, image quality enhancements, and hardware-accelerated high-definition video support.

                         Radeon HD 2900 XT   Radeon HD 2600 XT                Radeon HD 2600 Pro   Radeon HD 2400 XT   Radeon HD 2400 Pro
Price:                   $399                $149 (GDDR4), $119 (GDDR3)       $89-99               $75-85              $50-55
Stream Processing Units: 320                 120                              120                  40                  40
Clock Speed:             740MHz              800MHz                           600MHz               700MHz              525MHz
Memory:                  512MB (GDDR3)       256MB (GDDR4, GDDR3)             256MB (GDDR3, DDR2)  256MB (GDDR3)       128-256MB (DDR2)
Memory Interface:        512-bit             128-bit                          128-bit              64-bit              64-bit
Memory Speed:            825MHz              800MHz (GDDR3), 1100MHz (GDDR4)  400-500MHz           700-800MHz          400-500MHz
Transistors:             700 million         390 million                      390 million          180 million         180 million

The Radeon HD 2600 and Radeon HD 2400 cards will be available in the usual Pro and XT variants, but there will be only a single XT card at the HD 2900 level at launch. The XT cards generally have faster core and memory clock speeds than the Pro versions. The model numbers indicate relative GPU strength. For example, the Radeon HD 2900 has 320 stream processing units, while the Radeon HD 2600 and Radeon HD 2400 have 120 and 40 stream processing units, respectively. On-board memory will range from 128MB to 256MB for the 2400, 256MB for the 2600, and 512MB for the 2900. Board manufacturers will likely offer more memory options as the GPU line matures. Several manufacturers are also working on special Radeon HD 2600 XT "Gemini" cards that will have two 2600 XT GPUs on a single card. ATI hasn't announced a launch date for the dual-GPU "Gemini" board yet, but the company has stated that the suggested pricing will be in the $189-249 range.

The Radeon HD 2000 GPU is ATI's first unified shader architecture for the desktop, but it's actually a second-generation design: ATI's first unified shader part was the "Xenos" GPU built for the Xbox 360. Older, nonunified designs had separate hardware shaders dedicated to pixel or vertex processing. This made GPUs inefficient, because a game's pixel-to-vertex workload ratio shifts constantly and rarely matches the fixed pixel-to-vertex shader ratio on the hardware. The unified shader architecture gives the GPU more flexibility by allowing all the shaders to process pixel, vertex, and now, with DirectX 10, geometry work. Nvidia also switched from dedicated shaders to the unified approach in its current GeForce 8 GPUs.
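
To make the efficiency argument concrete, here is a minimal toy model in Python. The unit counts and per-frame workloads are invented for illustration, and this is not ATI's actual scheduler; the point is simply that a fixed split strands units whenever the workload mix shifts, while a unified pool does not.

```python
# Toy model: fixed pixel/vertex shader split vs. a unified shader pool.
# All numbers are hypothetical; real GPUs schedule far more finely.

def fixed_throughput(pixel_jobs, vertex_jobs, pixel_units, vertex_units):
    """Jobs completed per cycle when each unit handles only one job type."""
    return min(pixel_jobs, pixel_units) + min(vertex_jobs, vertex_units)

def unified_throughput(pixel_jobs, vertex_jobs, total_units):
    """Any unit can take either job type, so no unit idles while work waits."""
    return min(pixel_jobs + vertex_jobs, total_units)

# A vertex-heavy frame on a 32-unit GPU with a fixed 24/8 split:
print(fixed_throughput(10, 20, pixel_units=24, vertex_units=8))  # 18 jobs/cycle
print(unified_throughput(10, 20, total_units=32))                # 30 jobs/cycle
```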

ATI created a new Ruby "Whiteout" tech demo to show off what the Radeon HD 2900 XT can do. In the new demo, the Ruby model has 200,000 triangles, and the entire video averages more than 1 million triangles per frame. In comparison, the Ruby from the Radeon X1000 series demo, "The Assassin," only has 80,000 triangles, and the demo averages just over 500,000 triangles per frame. The new Ruby also has 128 facial animation targets compared to four for the older models, which allows for more realistic facial expressions.

The Radeon HD 2900 XT will also feature the return of the voucher. Each card will come with a code for Valve's Half-Life 2: The Black Box, which includes Half-Life 2: Episode Two, Portal, and Team Fortress 2. It'll be a great value if the games ship on time. Several years ago, ATI partnered with Valve to include vouchers for the original Half-Life 2 with high-end Radeon 9000 series cards, but some consumers ended up waiting years to cash in because of development delays.

ATI Radeon HD 2000 Series

The Radeon HD 2900 XT has a dual-slot design with a cooler that pulls air from within the case and exhausts it out the back of the card. Each card can draw up to 215 watts of power at standard clock speeds. AMD recommends having a 550W power supply unit for a single card and a 750W power supply for a dual-card CrossFire setup. But company representatives have indicated that these recommendations are on the cautious side, as they've certified power supply units as low as 400W for the Radeon HD 2900 XT.

The card has two power connectors, a 6-pin PCI-E connector and a new 8-pin PCI-E connector. Most new power supply units will have the 8-pin cable, but older units might only have 6-pin cables. That doesn't mean potential owners will need to buy a new power supply. You can plug a 6-pin cable into the 8-pin socket to get the card to run. However, you will need an 8-pin cable if you want to enable overclocking, because the 8-pin connector can supply twice as much power as a 6-pin connector.
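
As a back-of-envelope check on those connector requirements, here is a quick budget in Python, assuming the standard PCI Express delivery limits of 75W from the slot, 75W per 6-pin connector, and 150W per 8-pin connector:

```python
# Rough power budget for the HD 2900 XT under standard PCI-E limits.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
CARD_DRAW_W = 215  # AMD's stated maximum draw at stock clocks

# A 6-pin cable in the 8-pin socket only delivers 6-pin power.
stock_budget = SLOT_W + 2 * SIX_PIN_W                # 225W: covers 215W, barely
overclock_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300W with the 8-pin cable

print(stock_budget - CARD_DRAW_W)      # 10W of headroom at stock speeds
print(overclock_budget - CARD_DRAW_W)  # 85W of headroom for overclocking
```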

Power will not be a concern for the midrange and entry-level Radeon cards. The Radeon HD 2600 XT sports a single-slot design and no external power connectors. It's about as long as a Radeon HD 2900 XT, but it doesn't have the power draw of its older sibling. The Radeon HD 2600 Pro has a shorter board and a basic cooling unit. The Radeon HD 2400 XT is even more compact with its L-shaped board. Several card manufacturers are also designing fanless 2600 and 2400 cards with massive heatsinks for quiet operation. The Radeon HD 2900 XT, 2600 XT, and 2400 XT cards all have CrossFire connectors for dual-card support. We noticed that our Radeon HD 2600 Pro didn't have the CrossFire connectors, but ATI representatives told us that the Pro can also support CrossFire if the video card manufacturer chooses to add the connectors to the board.

(From left to right: ATI Radeon HD 2600 XT, ATI Radeon HD 2600 Pro, ATI Radeon HD 2400 XT.)

Image Quality Improvements

The Radeon HD 2000 series offers several image quality improvements over the previous Radeon X1000 line. There's new custom filter anti-aliasing (CFAA) support that allows for programmable filters, which can be updated with new video card drivers. New CFAA modes include "narrow tent" and "wide tent" filters that sample neighboring pixels in addition to sampling within the primary pixel to create smoother blends. The drivers also offer an additional "edge detect" filter that determines if a pixel is lined up on an edge and gives additional weight to samples along that same edge. The new CFAA modes are a mix and match of the various tent and edge detect filtering options.
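
To illustrate what a tent filter does, here is a minimal Python sketch of a resolve step. The linear-falloff weights and sample positions are invented for the example; ATI's actual kernels differ.

```python
# Sketch of a "tent" AA resolve: the pixel's final color is a weighted
# average that also pulls in samples from neighboring pixels, with weights
# falling off linearly with distance (the tent shape).

def tent_resolve(samples, center, radius):
    """samples: (x, y, color) tuples; radius: tent width in pixels."""
    total_weight = accum = 0.0
    for x, y, color in samples:
        dist = ((x - center[0]) ** 2 + (y - center[1]) ** 2) ** 0.5
        weight = max(0.0, 1.0 - dist / radius)  # linear falloff
        total_weight += weight
        accum += weight * color
    return accum / total_weight if total_weight else 0.0

# Two samples inside the pixel, two from neighbors; a "wide tent"
# (larger radius) would weight the neighboring samples more heavily.
samples = [(0.4, 0.5, 1.0), (0.6, 0.5, 1.0), (1.2, 0.5, 0.0), (-0.2, 0.5, 0.0)]
print(tent_resolve(samples, center=(0.5, 0.5), radius=1.0))  # 0.75
```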

The ATI Radeon HD 2000 also has a programmable hardware tessellation unit that can transform a basic polygon model into a highly detailed model by recursively subdividing existing triangles into a more accurate shape. The tessellation unit will let games use high-quality models with much lower bandwidth overhead. The Xbox 360's GPU has a similar tessellation unit, which Rare used in Viva Piñata. It's a forward-looking feature because it requires game developer support, but ATI's head of developer relations, Richard Huddy, has told GameSpot that he expects to see a handful of PC games with tessellation support as soon as this holiday season.
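
The core idea, recursive subdivision, is easy to sketch. The toy Python function below splits each triangle into four smaller ones per pass; real hardware tessellation also displaces the new vertices (for example, from a displacement map) to approach the true surface, which this sketch omits.

```python
# Recursive triangle subdivision: each pass splits one triangle into four
# by connecting edge midpoints, so detail grows as 4^levels.

def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2 for i in range(3))

def subdivide(tri, levels):
    if levels == 0:
        return [tri]
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    children = [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return [t for child in children for t in subdivide(child, levels - 1)]

coarse = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(len(subdivide(coarse, 3)))  # 64 triangles from 1, no extra bus traffic
```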

High-Definition Video

ATI has equipped the Radeon HD 2000 series with several new Avivo features to aid high-definition video playback. All GPUs, including the Radeon HD 2400 and the Radeon HD 2600, have a built-in unified video decoder chip designed to accelerate VC-1 and H.264 decoding for smooth HD-DVD and Blu-ray playback with minimal CPU utilization. Moving all the work to the Radeon GPU will let system owners watch HD content on systems that have less powerful CPUs.

All that decoding power won't be of use if the card can't actually output video because of copy protection restrictions. ATI has added all the necessary features to get that copy-protected, high-definition content onto any HDCP-compliant screen. ATI has embedded the necessary encryption keys into every chip to ensure HDCP video compliance right out of the box. Radeon HD 2000 cards will also be able to provide full-resolution content to dual-link HDCP displays. Nvidia's GeForce 8 GPUs are also HDCP-ready, but it's only enabled if the video card manufacturer chooses to add an encryption ROM to the board.

ATI will also offer a DVI-to-HDMI adapter to provide HDMI output support on the Radeon HD 2000 line. However, simply attaching an HDMI adapter to a regular video card wouldn't be enough, because it would carry only a video signal. ATI has integrated an audio controller directly into the new chip so the card can output both audio and video over HDMI.

The ATI Radeon HD 2900 XT is available in stores now, and the Radeon HD 2600 and Radeon HD 2400 cards should reach online retailers in early July.

The Radeon HD 2900 XT's $400 price places it in direct competition with the Nvidia GeForce 8800 GTS 640MB, not the GeForce 8800 GTX. However, Nvidia isn't going to let ATI beat up on its six-month-old GTS without a response. Team Nvidia is releasing a new overclocked GeForce 8800 GTS GPU to greet ATI's new Radeon. The EVGA "Superclocked" GeForce 8800 GTS 640MB we tested yielded around five to ten percent more frames per second than our reference GeForce 8800 GTS 640MB. Nvidia will slot the overclocked GTS into the $400 spot, and the regular GTS will settle into the $350 area. Our performance testing confirms that the ATI Radeon HD 2900 XT is comparable to the GeForce 8800 GTS: the cards traded wins across our tests, and frame rates were close in most games, except in S.T.A.L.K.E.R., where the GTS blew past the XT.

The Radeon HD 2600 XT with GDDR4 memory will likely sit somewhere between the $129 GeForce 8600 GT and the $175 GeForce 8600 GTS with its estimated $149 retail price. The Radeon HD 2600 XT CrossFire setup won all the midrange tests--as it should, since it'll cost around $300 to put together a matching pair of Radeon HD 2600 XTs. The single Radeon HD 2600 XT outperformed the GeForce 8600 GT and GTS cards in Company of Heroes, but fell behind both cards in S.T.A.L.K.E.R. and could only tie the less expensive GeForce 8600 GT in Oblivion.

The $99 Radeon HD 2600 Pro should be compared to the GeForce 8500 GT, which currently sells for $90-$110 online. Following the game preferences exhibited by the midrange cards, the entry-level Radeon HD 2400 XT has the advantage in Company of Heroes, the GeForce takes S.T.A.L.K.E.R., and the two cards draw even in Oblivion.

System Setup: Intel Core 2 X6800, Intel 975XBX2, EVGA nForce680i SLI, 2GB Corsair Dominator CM2X1024 Memory (1GB x 2), 160GB Seagate 7200.7 SATA Hard Disk Drive, Windows XP SP2, Windows Vista Ultimate. Graphics Cards: ATI Radeon HD 2900 XT 512MB, ATI Radeon X1950 XTX 512MB, Nvidia GeForce 8800 GTX 768MB, Nvidia GeForce 8800 GTS 640MB. Graphics Driver: ATI Catalyst 7.4, ATI Catalyst beta 8-37-4-070419a, Nvidia Forceware 158.22, Nvidia Forceware beta 158.43.

System Setup: Intel Core 2 X6800, Intel 975XBX2, 2GB Corsair Dominator CM2X1024 Memory (1GB x 2), 160GB Seagate 7200.7 SATA Hard Disk Drive, Windows XP SP2, Windows Vista. Graphics Cards: GeForce 8600 GT 256MB, GeForce 8500 GT 256MB, Radeon HD 2600 XT 256MB, Radeon HD 2600 Pro 256MB, Radeon HD 2400 XT 256MB. Graphics Driver: beta Catalyst 8.38.9.1, Forceware 158.22 (Windows XP), Forceware 158.24 (Windows Vista).

We used the same application test suite in our Windows Vista testing. We actually ran several Call of Juarez and Lost Planet tests, but we determined that DirectX 10 isn't suitable for benchmarking yet. The ATI video card driver we used was unable to run Lost Planet, even though previous drivers worked fine. Call of Juarez could run, but we took a pass on it due to the game's ATI launch event ties and the ongoing dispute between Nvidia and Techland. We also saw an uncomfortable amount of performance variation between the different drivers we tested for each card. When manufacturers ask you to use specific Vista drivers for each game, it probably means that the platform isn't reliable for performance comparisons. We'll revisit DirectX 10 performance after the high-profile games start shipping later this year.

The ATI Radeon HD 2900 XT looks to be a compelling value in the $400 DirectX 10 video card category. It's not a GTX killer, but it compares well to similarly priced GeForce 8800 GTS cards in 3D performance, and the advanced HDCP video support with HDMI output makes the Radeon HD 2900 XT--and the rest of the Radeon HD 2000 line, for that matter--extremely attractive for media applications. The Radeon HD 2600 XT lived up to its $149 price point by performing as well as its GeForce 8600 GTS and 8600 GT competition. The Radeon HD 2400 XT's CrossFire performance was somewhat disappointing, but anyone willing to spend $150-$175 on graphics should be getting a single-card solution anyway.

Discussion

KRZYHO100

I have this card, the ATI Radeon HD 2600 Pro, and GTA 4 runs for me. I recommend this card to everyone.

Waseemreddler

The situation looks grave; maybe a change of plans is needed. I played games back when the maximum requirement was a 500MHz CPU, 128MB of RAM, and an 8MB video card, but that age is gone and minds have changed... what the hell is this?

SeanWithBigBall

The new iMacs from Apple have these cards in them. The 20-inch 2.0GHz model has an ATI Radeon HD 2400 XT with 128MB of memory. The 20-inch 2.4GHz model, 24-inch 2.4GHz model, and the 24-inch 2.8GHz model have an ATI Radeon HD 2600 Pro with 256MB of memory. They all have Intel Core 2 Duos in them! So take that, you Mac haters who don't understand us.

cameron06

Just barely, though; really you'd want to get two of them just so you have a bit of headroom. Also, Crysis is a game which will depend more on the video card's memory; the 640MB version of the GTS will do much better than a 512MB 2900 XT. The GTX also runs a lot better with all the filters turned up, unlike the 2900 XT, which will begin to stutter as the resolution goes up, but the GTX just keeps pulling ahead as you turn the resolution up higher and higher because of how much memory it has.

DaDude253

Would Lost Planet on the PC be able to run with an ATI HD 2400 XT?

swedish-marine

VfighterX - the problem handling Crysis or any modern game isn't really the GPU; it's mostly the CPU that decides it - especially in World in Conflict. I've had a Sapphire HD 2900 XT for a while and I'm satisfied with it. Even though I still have an old AMD 64 3700+ CPU for a few more days, it's impressive how much you get for the price - not only does the card look fantastic, it comes with great bundled games, even if they aren't launched yet (distributed via Steam), and it performs way better than the 8800 GTS 640MB, which was the real target the HD 2900 XT was meant to take out - every resolution works better on the HD 2900 XT's faster bandwidth.

Yes, the card consumes a lot of power, but a new, more expensive, solid PSU is better than a good, more power-efficient card paired with a trashy, aged PSU, since just about every NEW PSU will handle this card like nothing. I'm going to pair these up at the start of next year with the new Phenom processors for AM2+ (AMD Athlon 64 X4) to really kick everything up - for now I've just ordered a CrossFire-ready AM2 mobo, ATI CrossFire Certified RAM by OCZ (CL4), and an AMD Athlon X2 6000+ as a platform to build on.

I can't see what people like about Nvidia that makes them dislike ATI - if it weren't for ATI, you would probably pay double the price for that 8800 GTS 640MB. The 8800 GTS I installed for a friend had no real quality at all - very plasticky, small, with stupid solutions for keeping the power connectors out of the way - and the card also cost more, performed worse, and came with two $5 games, while the HD 2900 XT is really heavy and big (but not too big, thanks to the smart design), has a fantastic sense of quality, and came with a bundled arsenal of $70+ in games.

I'd recommend the HD 2900 XT to everyone - the 8800 GTS 640MB is a bad deal, and the 8800 GTX is only a trophy for Nvidia which they never really want to sell - they've more or less not reduced the price since launch, and that's not a good sign. The card also has the best possible multi-GPU solution now that CrossFire has become far more scalable than Nvidia's SLI, and these cards really shine with their fantastic-looking coolers.

VfighterX

I wonder how fast these cards would run Crysis.

LouieV13

My HD 2900XT crossfire setup PWNS

ahmedkandil

And for the love of god, people, STOP SAYING THAT THE HD 2900 XT DOES NOT BEAT THE 8800 GTX!!!! The 2900 XT was RELEASED to compete with the 8800 GTS (and in about half the cases, it BEATS the superclocked 8800 GTS); the HD 2900 XTX is supposed to compete with the 8800 GTX (not sure what ATI has to answer the Ultra with).

ahmedkandil

screw the HD2900XT....HD2900XTX FTW!!!!

dtfann003

I like Nvidia over ATI. What I don't like is... why the hell would anyone pay over 170 bucks for a video card that in just a few weeks will end up at 100 bucks?

PSP360Gamr83

Benchmarks are crap on the 2900 XT for a card that sucks up 550W of power... crazy

bladekkk

The 2900 XT 1GB is out now on Newegg for just like 60 more than the 512, and I'm thinking, since I just got my 512 three days ago, I'll get a refund and place a new order. I'll do both at the same time and use one-day shipping, so when I send my card back the new one will be here the same day. These cards kick ass, especially for the price; the image quality is unbelievable. You need to see it in motion, not just some screenies or benchies. BTW, screw benchies - gameplay is where it's at.

CaribouLou5

What happened to the 1024MB Radeon HD 2900 XTX? It was supposed to wipe the floor with the GTX. Maybe ATi is saving that one for last...

KingSigy

These Radeon HDs sure look to be killers. I'll definitely opt for the HD 2900 XT; then I can keep it in my case for a good three years before updating.

tranquilo_

When I bought the Nvidia FX 5900 I wanted the best, and it ended up having its problems, so from then on I buy the best performance/price ratio there is (the middle of the range). I bought the 6600 GT and it is still a nice card (enough for me to play CSS, COD2, etc.); sure, it's getting dated. I will wait and see who has the upper hand in the performance/price ratio with DX10 games. For now I am opting for the 8600 GTS OC or the Radeon HD 2600 XT.

TrixteH

If you have the money and are an enthusiast and want the best there is, at the end of the day... why have a burger (HD 2900) when you can have a steak (Nvidia 8800 Ultra)? lol :D

foe666

I care! I need to get 100,000 fps in Cave Story and Ragnarok Online!

itzmec

who cares! there's nothing to play......

foe666

Screw Vista. XP FTW!! And I definitely won't decide on new cards until some DX10 games actually come out >_>

uberjannie

I agree with brono. I won't fork out 100 quid extra if it's only 5-10% better performance with that card. I'd rather just overclock the card that costs 100 quid less.

brono

Sorry to say this, but you must all be rich people. I admit that Nvidia cards are better than ATI, but ATI can compete with Nvidia at almost half the price, so ATI is the winner for all gamers who aren't rich like you. The first thing I look at in hardware is the price/performance relation, so if I find good hardware for the lowest price, for me that's the best choice, and that's why I prefer ATI. I do admit Nvidia is normally better, but only for rich people who have nothing to do but spend money for fun.

Lord-Shock

Sorry, but if ATI is trying to defeat NVIDIA with those video cards, then they need to get back to the laboratory and plan again.

SaliSB

Whether AMD made the right move or not by releasing a "midrange" card compared to Nvidia's stuff, I don't know. What I do know is that I would buy the HD 2900 XT.

mrdazy

No, it is actually true. Being a computer reseller/repairer/builder, I have access to information that is not "casually" available to the average reader (company newsletters/emails, reseller magazines, corporate standings reviews, etc.), and it is a well-known fact that "keeping in the game" is far more important to AMD/ATI right now (obviously) than making benchmark scores. Bad move? Perhaps in the minds of people only looking at quarterly market gains and synthetic benchmarks. If you have "been computing" for longer than a few years, then you know full well that the top product is never top for long. In the long run they (and we, as consumers) can only benefit from direct development communication and coordination between a GPU and CPU manufacturer. Only *mildly* off topic.... :-D http://www.crn.com/hardware/200001783 http://www.extremetech.com/article2/0,1697,2152458,00.asp Final jab: Think NATIVE multi-core! AMD has it!

whizzedOUTwoz

Midrange my ass; that was what they came up with, and they stamped a midrange price on it. I'm sorry, it just doesn't float with me. I guess we will see what ATI can come up with in a year's time, yet they need to catch up, as it seems they are at least one full cycle behind already. It's looking like the AMD/ATI merger was a bad move, as Intel and NVIDIA are dominating the market in both areas.

mrdazy

Their new card was INTENTIONALLY meant to be midrange; it's not what they "can" produce, it's what they DID produce. Financially they are still recovering from the merger, so this card is just to "keep them in the market" whilst they restructure the development departments, marketing, etc. Admittedly, ATI's cards have not always been the top mark, but well enough to keep in the running. But AMD's chip architecture development has always been far more progressive and forward-thinking. I have no doubt that AMD/ATI has a very bright future ahead worth watching.

TrixteH

I find it highly amusing that ATI released their flagship DX10 card some six months later than Nvidia and still can only match the power of Nvidia's midrange card... I would love to have seen the results if they had benchmarked HD 2900 CrossFire against 8800 Ultra SLI... ATI/AMD would get pwned for sure. I read this discussion board laughing with tears in my eyes at some of the comments made by ATI fanboys sticking by their cards; it's hilarious... but how long can you keep it up? lol. Intel and Nvidia are now without doubt the brands of choice for enthusiasts.

-spAce

There is no DX10 3D card available that I would buy. They all use too much power. Why in the **** do they draw so much power in idle mode? It doesn't look good on electricity bills and it isn't good for nature. The whole DX10 thing is a failure.

gamer100g

Are they getting better performance in Vista?

Shehzad15

Is this graphics card more powerful than the Xbox 360 and PS3?

whizzedOUTwoz

I just purchased an MSI 8800 Ultra, factory overclocked, and it's the fastest freaking card ever; I'm well chuffed with the performance, even in the handful of DX10 games, where it still beats the ATIs. And why would ATI release only a midrange card? I thought the idea was to beat, or at least get close to, your rival in benchmarks. To me it makes no sense; you're giving out the wrong impression, as NVIDIA will have even more sway with consumers. It just makes the gap look even bigger. And I'm sure ATI/AMD would be saying this is only our midrange card.

Poeticinsomniac

Why do so many people have a problem understanding the simple fact that the HD 2900 XT is ATI's MID-RANGE DX10 card? The HD 2900 has a price tag of $380 and comes with the Half-Life 2 Black Box voucher, which is something like $50-80 of software. The 8800 GTX is something like $500; the GTX Ultra is $700+. Also, the GTS and GTX cards have an additional 128-256MB of memory. The HD 2900 was meant to compete with the 640MB GTS card, yet in numerous tests it comes in close to the GTX or surpasses it in performance. There are almost no DX10 games available to accurately compare performance between the cards, but in terms of price, the HD 2900 is better, as well as a hell of a lot more innovative than the 8800. It isn't the slaughter of Nvidia most were expecting, but it's impressive considering it's only their midrange card and it competes with Nvidia's high-end hardware for half the price.

SP33doh saying that the 8800 is the best DX9 card out is stupid. Plain and simple, stupid. It would be pretty sad if the next-gen card couldn't outperform the previous generation of hardware. With the X1950 XT costing around $200 now, it is the best DX9 card available; mine can still run every DX9 game out on ultra settings in 1600x1200 and most in 2048x1536. It's not just the video card... the rest of your system has to have good hardware as well.

Why the hell should AMD/ATI rush hardware that isn't needed? With no DX10 games around there isn't any need there, and considering the quad cores are being optimized for 64-bit Vista (they're working with Microsoft to do so), why should they rush those when everyone running a 64-bit rig is still using XP or 32-bit Vista? There isn't going to be any more 32-bit software or operating system support come 2008, and since Nvidia and Intel are focusing on the 32-bit interface, I suspect they're going to be screwed once that happens.

So why must every fanboy crawl out of the woodwork to boast about the fact that they spend $1500-2000 every six months to get the latest and greatest video cards and CPUs to run software on a five-year-old operating system while running their monitor resolution at 1280x1024? Up the resolution, get a 64-bit OS to go with your 64-bit hardware, and then run some tests and see who fares best.

BTW... until Nvidia came out with the 8800 line, they were getting their teeth kicked in ever since ATI came out with the X850. Just like AMD was destroying Intel from the release of the 939 socket up until the Conroe came out. But Intel and Nvidia are just pumping out new shiny crap that no one needs, because they don't even use the software it was intended for, because they know once the software upgrade is made they'll once again be screwed... much like the people spending $1500 on GTX Ultras to play Warcraft or Half-Life 2 in 1280x1024 so they can say they get 300 FPS. Bravo.

bladekkk

I just bought a 2900 XT. I had an X1900 XT with a warranty that was up in four days, so I sent it back to the retailer and they refunded me the full price, which was 389, and I bought the 2900 XT for 377. It kicks butt big time. I have Neverwinter Nights 2 running at full settings with 4x AA, plus another 4x from the filter, and all other 3D settings at max, and I get great framerates with few drops. But the visual experience is where this card shines; I had no idea a game could look this damn good, like a movie, from lush scenery to great lighting effects. Even the ground looks terrific compared to what I had on my X1900 XT. My only regret is that I could not get the 1GB version, as they do not have it yet. I recommend the 2900 XT fully. No, I am not a fanboy either.

beckoflight

Good point, Sefveron... and another thing: ATI would have released a 1GB DDR3/DDR4 version, but the supplier wasn't up for the job... practically speaking, the 1GB Radeon does exist and is being tested as we speak (http://www.price.ro/forum/printthread.php?t=25395) - it's much longer than the XT and surely much more powerful than the GTX and XT. And when I said the 1950 XTX is the best DX9 card, I was referring to last generation's cards that only support DX9!!! And please, please, no more fanboyism... I just want to see the real DX10 tests... for now I'm more attracted to the CrossFire 2900 XT; it's much cheaper and better (maybe it's better than two GTXs in SLI, who knows; just wait and think clearly, not fanboyism)... I think now it all depends on drivers, and ATI's teams are working hard to make them perfect, so it's just a matter of time!

Sefveron

People keep going on about how the HD 2900 XT is getting its ass kicked by the 8800 GTX, but if they spent five minutes on research they would see that the 2900 XT is about the same price as the 640MB 8800 GTS and performs a fair amount better. Also, the 2900 XTX was delayed due to memory bandwidth problems, and if Nvidia fanboys want to poke fun at that, I recall that the 8800 GTXs had capacitor problems which were fixed just in time for launch.

mikley28

How about comparing the HD 2900 XT CrossFire to GeForce 8800 GTX SLI instead of GTS SLI?

MADCAT_Mishka

all this proves is that Crossfire is better than SLI...

bang1101

I'd rather they do a screenshot comparison than scoring. If you can run something at medium settings and have it look like it's running on high, that's what I'm interested in. So far that's where ATI has surpassed Nvidia.

sokunthy

Nvidia will always stay on top.

escapee88

Damn, AMD/ATI sucks right now. They delayed the release of their card by how many months? And it's still getting its ass kicked by the 8800 GTX?

Jmaster3265

Now the question is: which is better, Nvidia or Radeon?

B4mB00

They should separate the CrossFire and single-card setups. The way they have it set up makes it seem as if ATI wins every round when in reality they lose nearly every bench. GS should also do a price/performance comparison, because at least then ATI could save its skin.