The Alienware X51 and Nvidia GTX 680: A Match Made in Heaven?
Can Nvidia's new power-efficient GTX 680 find a home in the tiny Alienware X51? We grabbed a screwdriver, some scissors, and several cups of coffee to find out.
When Nvidia announced the GTX 680 and its accompanying Kepler architecture last month, it made a big fuss about power efficiency. Instead of raw clock speeds, Kepler focuses on performance per watt. That got us thinking: just how power efficient is Kepler? How small a power supply could we get away with, given the GTX 680's relatively modest 195-watt TDP?
Well, as luck would have it, our chums over at CNET UK just took delivery of a shiny new Alienware X51 desktop. As gaming machines go, it's tiny, being not much bigger than an original Xbox 360, and it has just a 330-watt PSU. Being the enterprising types we are, we just couldn't resist seeing if we could elevate the X51 to near god-like levels of GPU power by slotting a GTX 680 inside.
Disclaimer: The X51 is most certainly not designed to house a GTX 680, let alone power one. We just can't help tinkering.
Getting into the X51 is a tricky business. It's not locked down or anything, but the case certainly isn't a tool-free design, requiring the removal of screws to get the side panels off. Inside, we were presented with another challenge: the GPU was held in place by a mean-looking steel bracket, the bottom of which housed a PCI-E riser to allow the GPU to mount sideways. This assembly was not easy to take apart.
After removing several screws, removing a plastic mounting bracket, and a lot of wiggling, the GPU cage was freed. That allowed us to get a good look inside the case and to see if the card would fit. Our X51 was fitted with an Nvidia GTX 555 GPU, a card that's significantly smaller than the GTX 680.
Undeterred, we gingerly slotted the 680 into its new steel home and slid it into the chassis. It was an incredibly tight squeeze, with the rear of the card pushing right up against the front intake fan.
Still, it did fit! But the biggest problem actually came from something rather mundane. The GTX 680 requires two six-pin power connectors, which the X51 provides. However, the power sockets on the 680 are recessed thanks to Nvidia's new space-saving design, meaning the X51's odd right-angled connectors wouldn't fit.
It's not like we could easily replace the cable either, because it's a single lead coming from the motherboard that splits into two. We thought it was all over, but thanks to some hocus-pocus and a bit of determination, we were able to bend back part of the power connector, allowing it to fit the 680's recessed sockets.
With the card in place we put the cover back on and hoped we hadn't destroyed anything in the process. With a quick press of the power switch, the X51 sprang into life, and we were greeted with the Windows 7 desktop.
So far, so good, but the real test was seeing how it benchmarked. Did it work? Well, yes and no. We managed to run a few games, including Batman: Arkham City, Dirt 3, and Battlefield 3, all of which saw a significant (sometimes huge) boost in frame rates over the stock GTX 555.
Sadly, it didn't work well for long. The system was very unstable, and most games killed the system after around half an hour. We tried benchmarking in 3DMark Vantage too, but it simply wouldn't finish before the system caved in.
Our hopes of having a small, nonthreatening, uber-powerful PC tucked under the TV were sadly dashed. No, the GTX 680 and X51 were never meant to be together, but if Nvidia follows its usual product cycle, there will be smaller, less-power-hungry Kepler cards to follow. And those will be a shoo-in for a system like this.