triple monitor setups are b*tchin'..... quad seems too much. However, I can't wait until we have six-monitor support with a chair that has a protracting keyboard that turns based on the movement of the mouse.
Quicker, quieter, and more power-efficient, the GTX 680 is the new "world's fastest" GPU.
At GDC 2011, Epic Games (maker of Gears of War) unveiled Samaritan, an eye-popping technical demo that showed what was possible with Unreal Engine 3 and a seriously hardcore PC. It was--and still is--an impressive demo, showcasing smoothly tessellated facial features, point light reflections, and judicious use of movie-style bokeh. The demo was so impressive that Epic decided to show it again at this year's GDC, with vice president Mark Rein reiterating that Samaritan is its vision for the next generation--a "screw you" to the naysayers predicting that graphical prowess will play second fiddle to features and functionality.
The problem was, Samaritan didn't exactly run on your average gaming PC, requiring three Nvidia GTX 580 GPUs at a cost of thousands of dollars, as well as a power supply that brought Greenpeace members out in a cold sweat. And this left us wondering: if it took so much power to run the demo, what chance would the next generation of console and PC gamers have to experience it?
The answer, it turns out, was also unveiled at this year's GDC in the form of Nvidia's brand-new GTX 680 GPU, a single one of which easily powers the Samaritan demo. Aside from a few optimisations from Epic, most of that comes down to the new 28nm Kepler architecture the 680 is based on. It features a new Streaming Multiprocessor (SMX) design, GPU Boost, new FXAA and TXAA anti-aliasing technologies, Adaptive VSync, support for up to four monitors (including 3D Vision Surround), and--most interestingly of all--much reduced power consumption, with an increased focus on performance per watt.
Multiple acronyms aside, just what do you get if you plunk down your hard-earned cash for a GTX 680? Quite a bit, as it goes: 1536 CUDA Cores, 128 Texture Units, 32 ROP Units, 1GHz Base Clock, 2GB GDDR5 RAM @ 6GHz, 128.8 GigaTexels/sec Filtering Rate, 28nm Fabrication, 2x Dual Link DVI, 1x HDMI, 1x DisplayPort, 2x 6-pin Power Connectors, and a 195 Watt TDP.
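Those headline numbers hang together arithmetically. A quick sketch, assuming Nvidia's published 1006 MHz base clock (the article rounds it to 1GHz) and the 680's 256-bit memory bus, neither of which appears in the spec list above:

```python
# Back-of-the-envelope check of the GTX 680's quoted spec numbers.
# Assumed (not in the article's list): 1006 MHz base clock, 256-bit bus.

texture_units = 128
base_clock_ghz = 1.006  # GHz

# Bilinear filtering rate: one texel per texture unit per clock.
fill_rate_gtexels = texture_units * base_clock_ghz
print(f"Texture fill rate: {fill_rate_gtexels:.1f} GTexels/s")  # 128.8

memory_rate_gbps = 6.0   # effective GDDR5 data rate per pin ("6GHz")
bus_width_bits = 256
bandwidth_gbs = memory_rate_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")  # 192
```

The 128.8 GigaTexels/sec figure in the spec sheet is just texture units times clock, which is why the "1GHz" rounding slightly hides it.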
Based on specs alone, the 680 is a more powerful beast than its predecessor, but it's also more practical. The four display outputs mean you can drive four monitors (at up to 4K 3840x2160 resolution!) at once from a single card. That goes for 3D Vision Surround too, meaning you don't have to splurge on an SLI setup if you're into giving yourself a headache. Also notable is the card's reduced power consumption: its TDP is 195 Watts, compared to 250 Watts for the GTX 580, meaning you only need two six-pin connectors and can run it from a much smaller power supply.
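The connector arithmetic checks out against the PCI Express power-delivery limits (75 W from the slot, 75 W per 6-pin connector, 150 W per 8-pin connector, per the PCIe spec; the TDPs are the figures quoted above):

```python
# Why two 6-pin connectors suffice for the GTX 680 but not the GTX 580.
# Power limits are from the PCI Express spec; TDPs are from the article.

SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

gtx680_budget = SLOT_W + 2 * SIX_PIN_W            # 225 W available
gtx580_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W available

print(gtx680_budget >= 195)  # True: a 195 W TDP fits in 225 W
print(gtx580_budget >= 250)  # True: a 250 W TDP needs the extra 8-pin
```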
"Clearly, the 680 is a more powerful beast than its predecessor, but it's also more practical too."
The reduced TDP also results in reduced heat, meaning the card is easier to keep cool. The reference card has an all-new cooling setup that's much kinder on your ears--a godsend for anyone who ever had to endure the jet-engine sounds of the GTX 400 series. It features special acoustic dampening material around the fan, a triple heat pipe design, and a redesigned fin stack shaped for better airflow. Of course, this being a reference card, you can expect manufacturers to come up with their own crazy cooling solutions once they start shipping 680s.
Another neat feature of the 680 is a new hardware-based H.264 video encoder called NVENC. If you've ever tried to encode H.264 video, you'll know that it's a time-consuming process. While previous GTX cards sped up encoding using the GPU's CUDA cores, doing so increased power consumption. So, in keeping with Nvidia's new power-saving attitude, the NVENC encoder consumes much less power while also being four times as fast.
"If you're anything like us, then nothing gets you more excited than realistic cloth animations and individually animated strands of hair."
That means 1080p videos encode up to eight times faster than real time, depending on your chosen quality setting, so a 16-minute-long 1080p, 30fps video takes approximately 2 minutes to complete. Unfortunately, software developers need to incorporate NVENC support into their software, so at launch you're limited to CyberLink's MediaEspresso. Support for CyberLink PowerDirector and ArcSoft MediaConverter is promised for a later date.
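The arithmetic behind that claim is simple enough to sketch: "eight times faster than real time" just means a clip encodes in one-eighth of its duration.

```python
# Sanity-checking the NVENC throughput claim from the article.

def encode_minutes(clip_minutes, speedup=8):
    """Minutes needed to encode a clip at the given real-time multiple."""
    return clip_minutes / speedup

# The article's example: a 16-minute 1080p/30fps clip at 8x real time.
print(encode_minutes(16))  # 2.0 minutes
```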
One other notable improvement to the GTX 680 comes in the form of improved PhysX performance. And, if you're anything like us, then nothing gets you more excited than realistic cloth animations and individually animated strands of hair. To demonstrate the 680's improved performance, Nvidia has put together a tech demo featuring a very hairy ape in a wind tunnel. Each strand of its fur is individually animated, with PhysX processing each movement in real time.
Nvidia also put together a demo called Fracture, which features three destructible pillars. Instead of scripted animations, it uses PhysX to calculate the destruction of an object in real time. Depending on what force the pillar is struck with and taking into account the environment and any previous damage, it falls apart in an amazingly realistic way. The obvious application for this tech is in action games, where gunfire could accurately damage buildings.
The improvements to PhysX aren't just part of a tech demo either. The PC version of Gearbox Software's upcoming Borderlands 2 is set to support many PhysX enhancements. These include water that reacts accurately to a player's movements, rippling and splashing around the environment as you walk through it. Borderlands 2 also makes use of PhysX to render destruction. For example, fire a rocket launcher into the ground, and huge chunks of earth and gravel fly into the air. The resulting debris settles on the floor, where you can kick it around by walking through it.
So those are the top-line improvements to Kepler, but there's plenty more tech to get stuck into on the next page, where we take a more in-depth look at Nvidia's latest architecture. Or, if you just want to get straight to some benchmarks, head over to page three.
My 6870 has been running everything new for almost 2 years now with no problem, @ minimum 1000p, with all effects always all the way up. Why blow the extra $500+ for any "stronger" setup?
Sounds awesome, but we need more games developed for the pc that can take advantage of the power. I've had a GTX580 for a year now and it hasn't even broken a sweat. What a waste of money. Unless you're playing on multiple displays, what's the point. I should have stuck with my 5770.
oh dear lord Jesus I hope that card is implemented into the next consoles, and that destruction video, that's the kind of destruction I want in the next Battlefield game
@bluebird08 That's assuming there will be new consoles in the near future. The way things look, doesn't look like there will be a new one within a few years. But I digress here, I am a PC gamer, so that doesn't bother me too much :P.
I will get one when I feel the need for it. So far I can't see anything that will need that kind of power for gaming. I will wait in the shadows till then lol.....................
bboy you are an idiot who should stick with consoles & CoD. Good review, looking forward to next-gen graphics
@BrassBullet You are right about the consoles needing hardware like this in the next Xbox and PS, but it's already been stated (though not official) that the next Xbox's GPU will be a 6780 equivalent, so I very, very much doubt a 680 will go into the next-gen consoles. If the next Xbox does have a 6780 and Sony sticks with Nvidia, I reckon it will be like a 650 or whatever they call it. We shall wait and see. Oh, and I don't think it was mentioned in this article, but I have read in other articles on the 680 that its codename indicates it may not actually be the highest-end single GPU Nvidia releases. Hell, they may even be going back to the old 680 Ultra naming scheme, which would be awesome lol. They have plenty of headroom to increase it because of the temps and power usage, so Nvidia could really spice things up and release some super-high-end single GPU that leaves ATI's 7970 in the dust. Btw, I own two 6970s in Crossfire, so I'm not an ATI fanboy, but I'm liking what both companies are doing, although prices are too expensive in my opinion
As a high-end PC GPU I'm underwhelmed by the 680, but I think it might just be the perfect fit for the new consoles. First, it's very small, about half the size of the 580, which will make it cheap to mass-produce for a low-margin item like a game console. Second, they stripped out a lot of the GPGPU performance. GPGPU is the sole reason enterprises buy Nvidia products, so the 680 is clearly not meant to win that market like the last few flagships. Lastly, and largely because of the last two reasons, it is the most power-efficient high-end GPU. The space and cooling constraints of a console make this a huge selling point. Especially because AMD has chosen to add more features unnecessary for a console, I wouldn't be surprised in the least if the 680 is exactly the GPU that shows up in the Nextbox or PS4.
The GTX 680 is an excellent chip in every way, but I feel it is being a little overhyped. The Radeon HD 7970 has more memory (3GB vs the 680's 2GB) and actually performs on par with, and often beats, the GTX 680 at ultra-high resolutions, such as 2560x1600 or multiple-monitor setups such as 5760x1080, which is where such cards will realistically be used. AMD aren't in trouble; they just need to price the 7970 about $20 cheaper than the GTX 680 to remain competitive. Plus, aside from gaming, the 7970 is proven to have much more compute power (read any professional review, such as Tom's Hardware's, for details).
Good luck getting your hands on one from newegg anytime soon. http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=100006662&isNodeId=1&Description=gtx+680&x=0&y=0
It's amazing really how, in just the 2 years since the 4xx series came out, the technology difference between the two is insane. Kepler is clearly the next gen over Fermi, but around 2014 it's expected that the next-gen Maxwell will outperform Kepler by up to 10 times. Also, with Nvidia putting out a roadmap for mobile (Tegra) performance that will soon match PC performance, also with next-gen architecture, Nvidia, the way I see it, is pushing humanity's hardware technology forward really fast.
So basically AMD's high end 7970 got its butt kicked by the not really high end gtx 680, AMD is in trouble now!
"The maker of my PC called me up and offered if I wanted the new 680 GTX that I could have it at no extra charge" Become Hulk RARRWWWRR~* Me.. Envy.. At.. Gruug~! Hulk Smash~
@NeoEnigma I agree completely. I was just stating that in terms of power, if anyone was confused by the article. The prime choice for high-end GPU would be the GTX 680, no doubt. I prefer a single GPU setup over sli/crossfire any day of the week. One of the main points of the GTX 680 is less power consumption. So in terms of performance/watt, it's hands down the best. For $500 USD, it's way better than the GTX 580 when it launched.
@Elann2008 - but with so many games having broken or no SLI support at all, I'd sooner sell my current 580 and buy a 680 rather than get a 2nd 580 and SLI it. I Just feel like it's the more stable and reliable option. My 580 will be on sale soon... :)
Just ordered a new PC with a 580 GTX. The maker of my PC called me up and offered if I wanted the new 680 GTX that I could have it at no extra charge. I said "OF COURSE!".
I can run anything with my AMD budget PC at 1680x1050. (Phenom II 955BE and OCd 5850) If I ever do decide to play at 1920x1080 I'll definitely be getting one of these. By then the price will have dropped as well! Win-win.
I just ordered one myself. Can't wait for it, to test it out. The benchmarks look promising. And consider that most people play at 1080p these days. It's the sweet spot for every gamer :)
If the guy had so much energy and strength then why was he using a torch to cut those chains? LOL....
@johnnybrac Maybe he was using it to toughen the chains. But if he was trying to cut them, you're right LMAO
I am currently running 2 560 ti cards, mixed w/ my 8 gb ram and my amd phenom 6 core, atm There is no need for me to upgrade, But when new games start to run slow, or dx12 comes out, Then it is Time ! ( imo )
@ll0stryker0ll This "review" gets so many things wrong, it's a complete joke. Sounds like it was copied out of an Nvidia press junket. 3 benchmarks, and only two of them games? At 1080p? For a high-end graphics card? Might as well just draw some lines with crayon. This card is good enough to stand on its own merits, but Nvidia still feels the need to play games with the press. It's disappointing, really.
@markiewicz "You are wrong because the CPU clocks are provided by NVIDIA and come with a warranty. The card is designed to work within these frequencies at an exact power consumption and heat output treshold." One would hope so. The problem is that Nvidia will do whatever it can to maintain the perception of a performance advantage. So it wouldn't surprise me if these cards started failing prematurely because they were clocked too high. "Difference is that Nvidia guarantees every single card you buy will hit the quoted clockspeed, including Boost." Yeah, but Nvidia's guarantee isn't exactly rock solid. See bumpgate. I mean, I wouldn't worry too much about the 680, but Nvidia in general has burned quite a few bridges in the past, which is the only reason I'm still hesitant to buy a sweet-looking card such as this one.
My ATI Radeon 5770 is doing awesome work for me till now. But still, this one is much more powerful than mine, & it's awesome
My 6950 2gb hasn't broken a sweat yet using my i7 2600k....well, I take that back, Metro 2033 was the only game that made it suffer. Especially when those damn grenades explode....
this couldn't have been announced at a better time. Now I can wait for the 580 to drop so I can double up :)
B-boy I don't know what they told you on your planet Mars, but the truth is everything was introduced on PC first here on Earth!!
@B-boy You're joking, right? Please tell me you are joking and you are not dumb enough to believe what you wrote. If you're being serious, then I suggest doing some major research before posting such nonsense in the future. Not trying to insult you here, but I have to agree with ihsiep and say that was ignorant indeed.
naryanrobinson $500 is the retail price for end consumers and will never mean their costs are around $450 or something!! This is a technology sale, and one of the best in the market, and that is why it costs so much to end consumers, as the majority of companies are not even capable of producing such items!! Giants in tech like Nvidia always throw up a lot of costs in research and development!! For a product to be successful they need to cover those costs!! What I believe is that Nvidia and ATI will be dying to get their cards used in one of those Microsoft and Sony machines!! The larger the secured sale they have, the lower the costs can be!! So all those who say Microsoft or Sony may use these cards in their next-gen consoles might be right!! Clearly you are not related to the costing, finance, or even accounts field!! :-p
hmmmm I think I'll wait for this to come out so that the 580 comes down a bit more then nab another 580 to SLI. Not worth me upgrading just yet - I only upgraded from a 768mb 280 to a 3gb 580 about a month ago and I'm happy as Larry. The FPS gain to price ratio is a no brainer for me.
This article had a quotation under it that said "Death of the desktop"... Great, always open with a joke.
The 680 is looking and performing good. But since I just got a GTX 560 Ti, a true upgrade would be skipping the 600 series and waiting for the 700 series.