There was a time, back in the late nineties and early noughties, when console graphics were king. While the PC industry and the likes of the now-defunct Silicon Graphics poured most of their efforts into making specialist chips for specialist 3D workstations, it was Sony and Nintendo that led the charge on 3D graphics for the consumer. By the time the PlayStation 2 launched in early 2000, console graphics had far surpassed anything available on the humble PC, much to the chagrin of dedicated PC players.
Fast-forward a few years, and huge investments in R&D, tighter fabrication processes, and freedom from the strict power constraints of the console--not to mention a much-improved DirectX from Microsoft--saw the PC steadily claw back the performance crown. Even the last bastion of console overkill, the PlayStation 3 and its Cell processor, was backed up by what was essentially a standard 7000-series Nvidia graphics card, with performance to match.
Nowadays, it's tough to imagine even a company as large as Sony putting as much time and money into developing something as esoteric as the Cell processor, which is probably why both Sony and Microsoft have eschewed such tech in favour of off-the-shelf (or thereabouts) chips from a company that makes them for a living. That certainly has its benefits (it's simpler for developers, for one), but the new consoles aren't challenging the best the PC has to offer in the same way that the PS3 and Xbox 360 did.
Where those consoles ushered in the HD generation, such a sea change isn't in the cards this time around, at least in terms of whizzy visuals. Instead, the PS4 and Xbox One peak at a nice but hardly cutting-edge 1080p at 60fps for games. That's the standard for now, but if big tech trade shows like CES and IFA are anything to go by, it certainly won't be for long.
"…they aren't challenging the best the PC has to offer in the same way that the PS3 and Xbox 360 did."
4K, or Ultra HD as it's otherwise known, has been around for a while in cinemas and video production houses, but it's only in the last year or so that the technology has switched from completely absurd pricing to something that's a little more accessible for the average Joe. A decent 60Hz 4K TV or monitor still runs you a couple of thousand dollars, but as the technology matures over the course of the next few years, prices will fall. And if you're willing to take a punt on a cheap Chinese import, a 50-inch 4K TV can be yours for less than $1,000 right now.
So why would you want 4K? Think of it like the first time you saw the iPhone 4's retina display: the crispness of the text, the pin-sharp pictures, and the horrible realisation that from that point on, anything less would look like garbage. It's a wonderful thing to see with your own eyes. And, unlike the 3D technology heavily pushed to get us all to buy new TVs, it's unlikely to be a fad. After all, it's an easier upsell. 4K is more--more pixels, more sharpness, more definition--and it's easily demoed in stores. If there's one thing people love, it's more.
Like in the transition from SD to HD, there's not a whole lot of native 4K content out there at the moment, but if you're a well-heeled PC gamer who doesn't mind a little bit of fiddling, you can get in on the action right now. Monitors like ASUS' PQ321 31-inch display are slowly hitting the market, and while it's hardly cheap at $3,499, it's far better than the $20,000 such monitors once cost.
Rendering the 8 million pixels of a 4K set in real time is a tough challenge, even for the most powerful of PCs. And it's made all the more difficult by how 4K monitors currently work. Rather than a single panel, they are actually two 1920 x 2160 panels stitched together, so clever software from the likes of Nvidia and AMD is needed to prevent any noticeable vertical tearing or artefacts between the two halves. There were some problems in the very early days of 4K, but the latest drivers from both companies seem to have ironed out most of the issues.
Indeed, we didn't spot any of those issues as we gawked in amazement at Metro: Last Light and Battlefield 3 running in 4K. The benefits of cramming so many pixels into a display are open to debate (and be sure to watch Reality Check for an insight into that), but when you're sitting just a few feet away, as the typical PC user does, everything looks crisp and clear, and the 31-inch size of the monitor does wonders for sucking you right into the action.
"You need a very powerful PC to drive the display."
But it's worth reiterating that you need a very powerful PC to drive the display. Our test rig--despite sporting an Intel i7, Samsung Evo SSD, 16GB of RAM, and a killer graphics card in the form of Nvidia's Titan--struggled during some of the more demanding games. The opening of Crysis slowed to a crawl, forcing us to knock down from ultra to mere high settings, while busier sections of BioShock Infinite suffered from some mild chugging.
There are also some games, such as Skyrim, that don't have particularly high-resolution textures, meaning they look a little worse in 4K, thanks to the blurring effect of stretching those textures out. More games are poised to adopt higher-resolution textures, though, and it wouldn't be a surprise to see next year's PC games adopt 4K assets as standard over the 2K ones currently used. But the fact is, most games work fine at decent frame rates and look spectacular. There's not even a need for antialiasing: the pixels are packed so tightly that edges appear smooth without any colour blending. And it all works on technology, albeit high-end technology, that you can buy right now.
The beauty of the PC is that today's high-end tech is tomorrow's mid-range. In as little as a year's time, playing 4K games on the PC is going to be much cheaper, and they'll perform even better too. That leaves the next-gen consoles in something of a predicament. If a $3,000 PC equipped with the most powerful GPU around can only just cope with 4K, what chance does the PS4 or Xbox One have? Deeper access to hardware and a lighter OS gets you only so far.
Raw power and eye-popping visuals certainly aren't a requisite for making a great game. Indeed, as the late, great godfather of Nintendo, Hiroshi Yamauchi, once said, "We cannot guarantee interesting video games through the use of better technology." But as living rooms across the land begin to move to 4K, the next-gen consoles won't be able to deliver that content. Sure, Sony has confirmed that the PS4 will support 4K films and photos, and Microsoft has said that the Xbox One will too. They may even support lighter 2D games in 4K further down the line.
But the big blockbuster AAA experiences in 4K? That will be the domain of the PC. It's a platform that's easily adapted for new technologies, and one that's only going to get more powerful as the years roll by. It's even got its sights set on the living room with the likes of Valve's Steam Machines and SteamOS. Honestly, once you've seen what games look like in 4K, those next-gen consoles are far less attractive. You're simply not going to want anything less.