GameSpot may receive revenue from affiliate and advertising partnerships for sharing this content and from purchases through links.

Nvidia GTX 780 Ti Review: A Powerful GPU With A Price To Match

The GTX 780 Ti just about pushes Nvidia back to the top of the GPU performance pile, but its price is far from competitive.


For the vast majority of PC players, 1080p is the benchmark for performance, and by far the most popular resolution in use for gaming (at least according to the latest Steam Hardware Survey). Games look great at 1080p, monitors are cheap and plentiful, and you don't need to spend a fortune on an insanely powerful GPU to drive them. But if you're running multiple monitors, high resolutions like 1600p or 4K, or if you're simply after some bragging rights, then the likes of a GTX 650 Ti or Radeon 7850 just aren't going to cut it.

Enter the GTX 780 Ti, the latest GPU from Nvidia based on the GK110 chip. That's the same full-fat Kepler chip used in the GTX Titan and GTX 780, both of which are already excellent performers at high resolutions. The trouble is, they aren't the best performers anymore. AMD's latest R9 290X and R9 290 have benchmarked extremely well, not only taking the performance crown from their rival, but also seriously undercutting it on price. Nvidia's latest round of price cuts evens the playing field somewhat, but there's nothing quite like the prestige of having "the world's fastest graphics card".


The 780 Ti, then, has a big job ahead of it. At an RRP of $699 (£559 in the UK), it's still around $100 more expensive than the 290X, so it isn't going to be winning any awards for value. In terms of performance, though, it's very impressive. The 780 Ti is the first GPU to make use of the entire GK110 chip, that is, the full 2880 single-precision CUDA cores, 240 texture units, and 48 ROP units. Memory comes in the form of 3GB of extremely fast 7Gbps GDDR5 for 336GB/s of bandwidth, while the base clock speed gets a bump to 875MHz, and the boost clock speed to 928MHz. It does lack compute-oriented features like Hyper-Q and high-end double-precision (64-bit) performance, but on paper at least, the GTX 780 Ti is the most powerful gaming card Nvidia's released.

GTX 780 Ti GPU Specs
2880 CUDA Cores
875 Base Clock (MHz)
928 Boost Clock (MHz)
210 GigaTexels/sec Texture Filtering Rate
240 Texture Units
48 ROP Units

GTX 780 Ti Memory Specs
7.0 Gbps Memory Clock
3072 MB Standard Memory Config
GDDR5 Memory Interface
384-bit Memory Interface Width
336 GB/s Memory Bandwidth

GTX 780 Ti Software Support
OpenGL 4.3
PCI Express 3.0
GPU Boost 2.0, 3D Vision, CUDA, DirectX 11, PhysX, TXAA, Adaptive VSync, FXAA, 3D Vision Surround, SLI-ready

GTX 780 Ti Display Support
Four displays for Multi-Monitor
4096x2160 Maximum Digital Resolution
2048x1536 Maximum VGA Resolution
Two Dual-Link DVI, One HDMI, One DisplayPort

GTX 780 Ti Dimensions
10.5 inches Length
4.3 inches Height
Dual-slot Width

GTX 780 Ti Power Specs
250 W TDP
600 W Recommended Power Supply
One 8-pin and One 6-pin Power Connectors
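The headline spec numbers hang together arithmetically: memory bandwidth is the per-pin data rate multiplied by the bus width, and the texture fill rate is the texture unit count multiplied by the clock. A quick sketch of the math (the function names are ours, the input values come straight from the table above):

```python
# Back-of-the-envelope check of the 780 Ti spec-sheet numbers.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Bandwidth (GB/s) = per-pin data rate (Gbps) x bus width, in bytes."""
    return data_rate_gbps * bus_width_bits / 8

def texture_fill_rate_gt(texture_units: int, clock_mhz: float) -> float:
    """Fill rate (GigaTexels/sec) = texture units x clock in GHz."""
    return texture_units * clock_mhz / 1000

bw = memory_bandwidth_gbs(7.0, 384)    # 7 Gbps GDDR5 on a 384-bit bus
fill = texture_fill_rate_gt(240, 875)  # 240 texture units at the base clock

print(f"{bw:.0f} GB/s")    # matches the quoted 336 GB/s
print(f"{fill:.0f} GT/s")  # matches the quoted 210 GigaTexels/sec
```

The same arithmetic also explains why the 290X, with slower 5Gbps memory on a wider 512-bit bus, lands at a similar 320GB/s.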

Software

Like all of Nvidia's GPUs, the 780 Ti comes bundled with GeForce Experience (GFE), an application that automatically optimizes the graphics settings of your games based on your hardware. GFE automatically updates your drivers and scans your games library for supported titles, aiming for settings that achieve 40 to 60 frames per second. Since its release earlier in the year, GFE's performance has improved by leaps and bounds, with many more games supported and better optimal settings chosen. Naturally, you'll be able to eke out more performance by diving in and editing things manually, but if you're happy to let GFE do the job for you, the results are impressive.


Also part of the 780 Ti software package is ShadowPlay, a gameplay capture system that leverages the H.264 encoder built into Kepler (600, 700 series) GPUs. It automatically records the last 20 minutes of gameplay at up to 1080p60 at 50Mbps in automatic mode, but you can record as much footage as your hard drive allows in manual mode. ShadowPlay is also due to support direct streaming to Twitch.tv, although that feature isn't in the current beta. The advantage of using ShadowPlay over something like Fraps is its light CPU and memory usage: in our testing we found it affected the frame rate far less than Fraps did, in many cases with a hit of just a few frames per second. The software is still in beta, though, so we experienced a few capturing hiccups and crashes, but hopefully those issues will be ironed out before its full release.
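Those bitrate figures translate directly into disk usage, which is worth knowing before leaving manual mode running. A rough sketch of the arithmetic (our own helper, not part of ShadowPlay; real file sizes will vary with encoder overhead):

```python
# Estimate recording size from a constant video bitrate.

def recording_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Size (GB) = bitrate (bits/s) x duration (s) / 8 bits per byte."""
    return bitrate_mbps * 1e6 * minutes * 60 / 8 / 1e9

print(f"{recording_size_gb(50, 20):.1f} GB")  # the full 20-minute buffer: 7.5 GB
print(f"{recording_size_gb(50, 60):.1f} GB")  # one hour in manual mode: 22.5 GB
```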

There's also a great games bundle attached to the 780 Ti, with copies of Assassin's Creed IV: Black Flag, Batman: Arkham Origins, and Splinter Cell: Blacklist coming with every card. That's a sweet deal considering they're such recent games, and hey, if you've already got them there's always the joy of gifting or selling on eBay.

Performance

Our trusty Ivy Bridge PC backed the GTX 780 Ti, although this time we overclocked the CPU to 4.2GHz for a little extra oomph. A 1080p monitor would have been a waste for such a card, so we went with Asus' PQ321Q 4K monitor to really test its pixel-pushing power. With the exception of Crysis 3, all games were run at maximum settings, and where possible we used FXAA for a performance boost. Call Of Duty: Ghosts was run at a lower resolution of 2560x1600, due to a current lack of 4K support.

Motherboard: Asus P8Z68-V
Processor: Intel Core i5 3570K @ 4.2GHz
RAM: 16GB 1600MHz DDR3 Corsair Vengeance
Hard Drive: Corsair Force GT / Samsung Spinpoint F3 1TB
Power Supply: Corsair HX850
Display: Asus PQ321Q @ 3840x2160 / Dell 3007WFP-HC @ 2560x1600

Battlefield 4 (2x MSAA @ 3840x2160)

GTX 780 Ti: 32 average / 24 minimum / 44 maximum FPS
GTX Titan: 29 average / 21 minimum / 40 maximum FPS
GTX 780: 26 average / 13 minimum / 36 maximum FPS

Crysis 3 (High Settings, FXAA @ 3840x2160)

GTX 780 Ti: 30 average / 24 minimum / 44 maximum FPS
GTX Titan: 27 average / 22 minimum / 34 maximum FPS
GTX 780: 25 average / 20 minimum / 37 maximum FPS

Call Of Duty: Ghosts (HBAO+, FXAA @ 2560x1600)

GTX 780 Ti: 75 average / 28 minimum / 107 maximum FPS
GTX Titan: 76 average / 47 minimum / 104 maximum FPS
GTX 780: 54 average / 37 minimum / 83 maximum FPS

Bioshock Infinite (Ultra @ 3840x2160)

GTX 780 Ti: 50 average / 34 minimum / 67 maximum FPS
GTX Titan: 40 average / 33 minimum / 61 maximum FPS
GTX 780: 30 average / 25 minimum / 61 maximum FPS

Tomb Raider (Ultra @ 3840x2160)

GTX 780 Ti: 30 average / 23 minimum / 43 maximum FPS
GTX Titan: 29 average / 21 minimum / 39 maximum FPS
GTX 780: 28 average / 16 minimum / 37 maximum FPS

Metro: Last Light (Ultra @ 3840x2160)

GTX 780 Ti: 33 average / 27 minimum / 49 maximum FPS
GTX Titan: 29 average / 25 minimum / 37 maximum FPS
GTX 780: 25 average / 20 minimum / 40 maximum FPS
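To put a single number on the spread, here's a quick sketch averaging the 780 Ti's percentage gain over the other two cards across the six average-FPS results above (our own back-of-the-envelope calculation, not a formal methodology):

```python
# Average FPS from the six benchmarks, in article order:
# Battlefield 4, Crysis 3, CoD: Ghosts, Bioshock Infinite, Tomb Raider, Metro: LL
avg_fps = {
    "780 Ti": [32, 30, 75, 50, 30, 33],
    "Titan":  [29, 27, 76, 40, 29, 29],
    "780":    [26, 25, 54, 30, 28, 25],
}

def mean_uplift(card: str, baseline: str) -> float:
    """Mean percentage FPS gain of `card` over `baseline` across all games."""
    gains = [a / b - 1 for a, b in zip(avg_fps[card], avg_fps[baseline])]
    return 100 * sum(gains) / len(gains)

print(f"vs GTX 780:   +{mean_uplift('780 Ti', '780'):.0f}%")    # roughly +31%
print(f"vs GTX Titan: +{mean_uplift('780 Ti', 'Titan'):.0f}%")  # roughly +10%
```

On these numbers the 780 Ti averages about a 31% lead over the 780 and about 10% over the Titan, with Ghosts the one outlier where the Titan edges ahead.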

A Pricey Performer

As expected with such killer specs, the GTX 780 Ti screams through the likes of Battlefield 4 and Call Of Duty: Ghosts, even at 4K, easily beating the GTX 780 and even the $1000 Titan. It's an impressive showing for a card based on an architecture that's now well over a year and a half old, and represents the peak of Kepler's rendering abilities. While we unfortunately didn't have an AMD R9 290X on hand to make a direct comparison, judging by the benchmarks out there, the 780 Ti is a comparable card and once again places Nvidia within striking distance of, if not back at, the top of GPU performance.

Such performance comes at a price, though. At over $100 more than the R9 290X and nearly $300 more than the similarly performing R9 290, the 780 Ti is an expensive choice. It's also $100 more expensive than the GTX 780, a GPU that's hardly a slouch when it comes to high-resolution performance. Yes, the 780 Ti is far more power-efficient than AMD's latest, and yes, it's a very quiet card in operation; we also experienced none of the power-throttling issues that are currently plaguing the R9 290.

Whether that's worth the extra cash, though, is debatable. No doubt about it, the GTX 780 Ti is a brilliant GPU backed by some brilliant software, but you can do a lot with that $100 saving (or even $300 if you plump for the R9 290). AMD's aggressive pricing has taken the shine off the GTX 780 Ti, but if you're all in for team green and have the high-res setup to do it justice, it's the absolute best you can get from Nvidia, and one of the best GPUs (a lot) of money can buy.

Join the conversation
There are 232 comments about this story

TzarStefan: Buying x2 <3
MAD_AI: Considering the reviews the 290X got concerning its temperatures at peak, over 95 Celsius, I think the extra $100 is justified if Nvidia is offering better heatsinks and cooling.
plm3d_basic: The GTX 780 Ti is at least $150 more than the 290X and $300 more than the 290, which performs almost as well. All you need is to either wait for aftermarket coolers or add one yourself; the performance difference will be negligible but the price difference won't be. Also, with lots of games such as Battlefield 4, other Frostbite 3 engine games, and Star Citizen set to use Mantle, that performance gap will disappear fairly quickly, but the price gap won't.
unreal101: Once the custom-cooled, upclocked aftermarket versions of the 290 start showing up at $450-475, it's gonna destroy Nvidia in sales.
sammoth: It's not a PR stunt. It's simple: AMD have always had driver issues; it just isn't as bad as it used to be. This is a well-known fact. Just because a few people didn't have issues doesn't mean others didn't.
plm3d_basic: Well I call it as it is. First, I've owned an ATI 9800, X850 XT, 3850, 4850, 5870, and currently a 7970. I have had only one bad driver in those 10-plus years that caused a BSOD, and I usually upgrade every driver. This misleading stigma about bad drivers is completely false. The people complaining are most likely paid Nvidia PR hacks on the internet.
unreal101: Time will tell, I suppose.

Truth be told, unless AMD becomes undeniably superior to its competition, I'll probably keep basing my rigs on Nvidia and Intel. There's just a bit more quality to their products.
McGregor: I run AMD cards. So far I haven't had any issues with drivers. I'm actually hearing from my friends that use Nvidia that they wish their drivers were updated as often as AMD's. Then again, I always hear that both sides need better driver updates.
Anigmar: I'll be happy if that happens; in the meantime we can just disagree and respect each other.
unreal101: Something tells me that AMD driver support is going to become less and less of an issue with time, now that:

1) Games are being made with X1 and PS4 in mind, and are being optimized for AMD hardware to begin with, and

2) AMD's monetary influx from consoles will hopefully lead to better support from them.
Anigmar: Thanks for keeping this civilized.
plm3d_basic: Bullshit.
Anigmar: Agreed. We want a gaming rig, not a barbecue.
divyeshk1: Agree or not, the R9 290X is much better...
plm3d_basic: Only with aftermarket coolers.
berserker66666: Good job Nvidia. Way to suck us dry only to release a better GPU a few months later.
unreal101: That's kinda how PC gaming always is. Your rig can only be top-of-the-line for a few months at best, then something pricier and shinier comes along.
ScreamDream: Not a technical leap, but nice if you want bragging rights.
kingcrimson24: wow!
diabolik_023: no thank you!
Anigmar: The best part of this new release is that most cards got a good price reduction (the 770 is now about 310 euros). My GTX 670 runs all my games fine, and since my monitor's max resolution is 1680x1050 I can play basically everything maxed out with some AA. Still, if I had bigger needs I wouldn't go for it yet. More Ti models are coming and the GTX 800 series is a few months away. Anyway, if I had to go for a brand I'd go Nvidia; ATI is always a headache.
McGregor: This is my thought. My monitor(s) and TV only run 1080p. Why buy a card capable of this when my existing setup runs everything at 1080p with an average of 50 fps right now? Maybe when Titanfall hits I'll upgrade my video card, but not until then. By then hopefully the other cards will have dropped $50-$100. I'll probably end up with a 7970 at that point.
Anigmar: I agree. I'd like a 1080p monitor but my 22" Samsung works fine, and I'm not sure if going from 1680 to 1920 is worth all the trouble of researching what brand and model I'd like. Any advice?
CptYoutube: So in other words, wait 3 months, then pay $400, just like all Nvidia cards :D That's what I did with the GTX 680!!!
IJONOI: Time to trade in the 690 soon, methinks.
Tremblay343: If you have a 690 I wouldn't trade that in for a long time. It still outperforms pretty much everything. All the buzz is about single-GPU cards, but if you put in a grand for a 690 there is literally no reason to buy another card until at least the 800 series.
mikeyvp87: Nvidia has such great driver support. From my little experience with their cards, it's amazing how quickly they update their drivers for new games. That being said, on power for price you go with the R9 290X. I'd live with the driver support and just know my card can power through most hiccups.
ivanchiz: Maybe you should look at the resolution before talking.
Amaregas: You are getting 60-80 FPS at 3840x2160?