Nvidia GeForce GTX 280 Hands-On
Find out about Nvidia's next-generation GeForce GTX 280 GPU and see how it performs!
The PC gaming industry is undergoing a transition period, but that isn't slowing down the pace of graphics innovation. The industry is shifting from retail to digital distribution, and the current generation of consoles has adopted many features previously available only on the PC, such as HD graphics and pervasive online multiplayer support. The current consoles are much stronger competition this time around, but the PC platform is up to the challenge. Industry heavyweights including Microsoft, Activision, Intel, Nvidia, and AMD have formed the PC Gaming Alliance to reinvigorate PC gaming by making it easier and more accessible to gamers. The PC still holds its greatest advantage: a steady stream of new CPUs and GPUs that keeps the platform's computational power several generations ahead of the consoles. Nvidia has extended that graphics lead one step further with today's release of the first GeForce 200 series GPU, the GeForce GTX 280.
The GeForce GTX 280 takes over from the GeForce 9800 GTX as Nvidia's latest and most powerful DirectX 10 GPU. The new GeForce GTX 280 actually compares best with the GeForce 9800 GX2, which pairs two GeForce 9800 GTX processors on a single card. The GeForce 9 series might be the most short-lived GeForce generation we've ever seen, but its brief life span makes sense when you consider that the GeForce 9800 GTX GPU was basically a die shrink of the GeForce 8800 GTX with some minor memory-interface revisions. Shrinking the die is an efficiency play that moves an existing chip design to a newer manufacturing process and results in smaller chips. The die shrink reduces costs because more chips fit onto each silicon wafer, and it can often increase GPU performance because smaller chips can achieve higher operating speeds.
The GeForce GTX 280 has 240 stream processors, almost double the number on the GeForce 9800 GTX, and Nvidia has optimized the new chip architecture to squeeze even more performance out of each processor. Nvidia bumped the onboard memory up to 1GB for a single GPU and widened the memory interface to 512-bit to improve performance at high resolutions with antialiasing enabled. The performance doesn't come cheap--the GeForce GTX 280 carries a top-of-the-line $649 MSRP. However, the GTX 280 is only half of the GeForce GTX 200 launch. Nvidia plans to release the slightly less powerful, but much more affordable GeForce GTX 260 next week on June 26, 2008. The $399 GeForce GTX 260 will have 192 stream processors, 896MB of memory, and a 448-bit memory interface.
The GTX 280 continues Nvidia's commitment to evolving the video card into a more consumer-friendly product. The entire dual-slot card is encased in a glossy, molded shell, and the new design hides the card's SLI connector and audio port beneath rubberized covers. It's really only a matter of time before a designer extends the shell to hide the PCI Express connector, the last exposed portion of the card's PCB.
|GPU|GeForce GTX 280|GeForce GTX 260|GeForce 9800 GX2|GeForce 9800 GTX|
|Stream processors|240|192|256 (2 x 128)|128|
|Memory|1GB|896MB|1GB (2 x 512MB)|512MB|
|Memory interface|512-bit|448-bit|2 x 256-bit|256-bit|
Nvidia added more open space around the 8-pin and 6-pin power connectors to accommodate larger power plugs. The GeForce 9800 cards had less clearance around the power connectors, which forced users to either find a power adapter or snap off the extra plastic to get power cables to seat properly. Nvidia recommends using a 550W power supply with at least 40A on the 12V rail for a single GeForce GTX 280. Cards will work in 2-way and 3-way SLI, provided you have an SLI-enabled motherboard with the appropriate number of PCI Express slots.
The card has two dual-link, HDCP-compliant DVI-I outputs and a 7-pin analog connector that can output S-Video as well as composite and component with the appropriate cable dongle. HDMI output comes via a DVI-to-HDMI adapter, but you'll need to connect an S/PDIF audio feed to the top of the card if you want sound integrated into the HDMI output. As with all recent Nvidia GPUs, the GeForce GTX 200 series has PureVideo support that provides full decode acceleration for all popular HD file formats.
The GTX 200 GPUs also have smarter power-management features that automatically throttle the chip's power depending on how much 3D performance the system needs. According to Nvidia, the GeForce GTX 280 consumes only 25-35W when running in desktop mode or while playing a Blu-ray movie, but it can ramp up to full power, approximately 236W, when it's time to fire up Call of Duty 4. The GTX 200 GPUs also support Nvidia's HybridPower feature, which can switch all graphics work over to the motherboard graphics chip for low-intensity, nongaming applications, provided you have a motherboard with an nForce 780a or 790i chipset.

System Setup: Intel Core 2 Extreme QX9650, EVGA 780 SLI motherboard, 2GB Corsair DDR2 (1GBx2), Seagate 7200.11 750GB hard disk drive, Windows Vista 32-bit SP1. Graphics cards: GeForce GTX 280 1GB, GeForce 9800 GX2 1GB (512MBx2), GeForce 9800 GTX 512MB. Graphics drivers: Nvidia ForceWare beta 177.26, Nvidia ForceWare 175.16.
The performance tests show that the GeForce GTX 280's 240 stream processors are very capable of taking on 256 processors from the previous generation. The GeForce GTX 280 holds only a slight lead over its competition in Call of Duty 4 and Team Fortress 2, but the newest GeForce shows what it can do in our most challenging tests: Crysis with high-quality settings and the new 3DMark Vantage. Crysis has been the most graphically demanding game in our benchmark suite since its release late last year. Most cards struggle to maintain playable frame rates at higher resolutions and the best image-quality levels, but the GeForce GTX 280 handles high-quality 1600x1200 without a problem and actually makes antialiasing a viable option. The card does particularly well in the 3DMark Vantage Extreme test, which sets the resolution to 1920x1200 and raises all shaders to "extreme" levels.
The video card isn't just about gaming anymore, either. Nvidia is currently working on expanding the video card's usefulness outside of graphics applications. The process started in the last generation with the launch of the CUDA (Compute Unified Device Architecture) initiative with the GeForce 8 series. CUDA opens up the GeForce GPU's processing power to non-graphics applications such as video transcoders, image manipulation programs, or any other work that can benefit from parallel processing. CUDA-enabled consumer applications are still rare, but there are a few promising programs on the horizon.
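To give a sense of how CUDA exposes those stream processors to ordinary software, here is a minimal, hypothetical sketch of the kind of data-parallel kernel an image-manipulation program might run. The kernel name, buffer sizes, and brightness value are all illustrative, not taken from any shipping application; the only CUDA APIs used are the standard runtime calls.

```cuda
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Illustrative kernel: brighten an 8-bit grayscale image on the GPU.
// Each GPU thread handles one pixel, so the work spreads across the
// card's stream processors in parallel -- the same data-parallel
// pattern used by GPU transcoders and image editors.
__global__ void brighten(unsigned char *pixels, int n, int amount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;   // clamp to the 8-bit range
    }
}

int main(void)
{
    const int n = 1 << 20;               // one megapixel, for illustration
    unsigned char *host = (unsigned char *)malloc(n);
    for (int i = 0; i < n; i++)
        host[i] = (unsigned char)(i & 0xFF);

    unsigned char *dev;
    cudaMalloc(&dev, n);
    cudaMemcpy(dev, host, n, cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover every pixel.
    brighten<<<(n + 255) / 256, 256>>>(dev, n, 40);

    cudaMemcpy(host, dev, n, cudaMemcpyDeviceToHost);
    printf("first pixel after brighten: %d\n", host[0]);

    cudaFree(dev);
    free(host);
    return 0;
}
```

The appeal for applications like transcoding is that the per-pixel (or per-block) work is independent, so a GPU with hundreds of stream processors can chew through it far faster than a handful of CPU cores.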
Software start-up Elemental Technologies is developing video-transcoding software that uses the GPU to transcode video 10 to 20 times faster than a CPU alone, and users participating in the Folding@Home program will soon be able to use their CUDA-enabled GPUs to start racking up points at an incredible rate. If these two programs are a sign of things to come, it's not difficult to imagine a time when everyone will need to take non-gaming software support into account when buying a new video card.
The GeForce GTX 280 is your only choice if you're looking for the most powerful single-GPU card available today. The GeForce GTX 280 matches up well against the dual-GPU GeForce 9800 GX2 in current games, but it seems that many of the 280's engine improvements won't become apparent until games start adopting 3DMark Vantage-level graphics workloads. The card also has plenty of non-gaming upside with its pending CUDA applications, but that also applies to all CUDA-enabled GeForce 8 and 9 series cards. The GeForce GTX 280's primary downside is its hefty $649 MSRP--pretty steep considering that you can get a GX2 for just under $500. For what it's worth, the GTX 280's advanced power options will make the card more affordable to operate when you're not gaming.