GameSpot may receive revenue from affiliate and advertising partnerships for sharing this content and from purchases through links.

Nvidia GTX 780 Ti Review: A Powerful GPU With A Price To Match

The GTX 780 Ti just about pushes Nvidia back to the top of the GPU performance pile, but its price is far from competitive.


For the vast majority of PC players, 1080p is the benchmark for performance, and by far the most popular resolution in use for gaming (at least according to the latest Steam Hardware Survey). Games look great at 1080p, monitors are cheap and plentiful, and you don't need to spend a fortune on an insanely powerful GPU to drive them. But if you're running multiple monitors, high resolutions like 1600p or 4K, or if you're simply after some bragging rights, then the likes of a GTX 650 Ti or Radeon 7850 just aren't going to cut it.

Enter the GTX 780 Ti, the latest GPU from Nvidia based on the GK110 chip. That's the same full-fat Kepler chip used in the GTX Titan and GTX 780, both of which are already excellent performers at high resolutions. The trouble is, they aren't the best performers anymore. AMD's latest R9 290X and R9 290 have benchmarked extremely well, not only taking the performance crown from their rival, but also seriously undercutting it on price. Nvidia's latest round of price cuts evens the playing field somewhat, but there's nothing quite like the prestige of having "the world's fastest graphics card".


The 780 Ti, then, has a big job ahead of it. At an RRP of $699 (£559 in the UK), it's still around $100 more expensive than the 290X, so it isn't going to be winning any awards for value. In terms of performance, though, it's very impressive. The 780 Ti is the first GPU to make use of the entire GK110 chip, that is, the full 2880 single-precision CUDA cores, 240 texture units, and 48 ROP units. Memory comes in the form of 3GB of extremely fast 7Gbps GDDR5 for 336GB/s of bandwidth, while the base clock speed gets a bump to 875MHz, and the boost clock speed to 928MHz. It does lack compute-oriented features like Hyper-Q and full-speed double-precision (FP64) performance, but on paper at least, the GTX 780 Ti is the most powerful gaming card Nvidia has released.
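That bandwidth figure isn't marketing magic; it falls straight out of the memory clock and bus width. A quick sanity check (figures taken from the spec sheet below):

```python
# GDDR5 effective data rate and bus width from Nvidia's spec sheet
data_rate_gbps = 7.0      # gigabits per second per pin
bus_width_bits = 384      # memory interface width

# bandwidth = bus width x bits per second per pin, converted to bytes
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(bandwidth_gb_s)  # 336.0 GB/s, matching the quoted figure
```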

GTX 780 Ti GPU Specs
CUDA Cores: 2880
Base Clock: 875 MHz
Boost Clock: 928 MHz
Texture Filtering Rate: 210 GigaTexels/sec
Texture Units: 240
ROP Units: 48

GTX 780 Ti Memory Specs
Memory Clock: 7.0 Gbps
Standard Memory Config: 3072 MB
Memory Interface: GDDR5
Memory Interface Width: 384-bit
Memory Bandwidth: 336 GB/s
GTX 780 Ti Software Support
OpenGL 4.3
PCI Express 3.0
GPU Boost 2.0, 3D Vision, CUDA, DirectX 11, PhysX, TXAA, Adaptive VSync, FXAA, 3D Vision Surround, SLI-ready

GTX 780 Ti Display Support
Multi-Monitor: up to four displays
Maximum Digital Resolution: 4096x2160
Maximum VGA Resolution: 2048x1536
Connectors: two dual-link DVI, one HDMI, one DisplayPort
GTX 780 Ti Dimensions
Length: 10.5 inches
Height: 4.3 inches
Width: dual-slot

GTX 780 Ti Power Specs
TDP: 250 W
Recommended Power Supply: 600 W
Power Connectors: one 8-pin, one 6-pin
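Likewise, the texture filtering rate is simply texture units multiplied by clock speed. Using Nvidia's official 875MHz base clock:

```python
# texture fill rate = texture units x base clock
texture_units = 240
base_clock_mhz = 875  # Nvidia's official base clock for the 780 Ti

fill_rate_gtexels = texture_units * base_clock_mhz / 1000
print(fill_rate_gtexels)  # 210.0 GigaTexels/sec, as quoted
```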

Software

Like all of Nvidia's GPUs, the 780 Ti comes bundled with GeForce Experience (GFE), an application that automatically optimizes the graphics settings of your games based upon your hardware. GFE automatically updates your drivers and scans your games library for supported games, aiming to target settings that achieve 40 to 60 frames per second. Since its release earlier in the year, GFE's performance has improved by leaps and bounds, with many more supported games and optimal settings chosen. Naturally, you'll be able to eke out more performance by diving in and editing things manually, but if you're happy to let GFE do the job for you, the results are impressive.
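The idea behind that optimization pass can be sketched in a few lines: walk presets from best-looking to fastest and keep the first one that holds the target frame rate. This is purely illustrative; the preset names, function names, and toy numbers are our own, not Nvidia's actual implementation.

```python
# Hypothetical sketch of a GFE-style auto-optimizer. All names here are
# illustrative assumptions, not Nvidia's real API.
PRESETS = ["ultra", "high", "medium", "low"]  # best-looking first

def pick_preset(benchmark, target_fps=40):
    """Walk from best-looking to fastest; keep the first preset that holds
    the target frame rate (GFE aims for roughly 40-60fps)."""
    for preset in PRESETS:
        if benchmark(preset) >= target_fps:
            return preset
    return "low"

# toy measurements standing in for a real benchmark pass
fake_results = {"ultra": 22, "high": 45, "medium": 72, "low": 110}
print(pick_preset(fake_results.get))  # high
```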


Also part of the 780 Ti software package is ShadowPlay, a gameplay capture system that leverages the H.264 encoder built into Kepler (600, 700 series) GPUs. It automatically records the last 20 minutes of gameplay at up to 1080p60 at 50Mbps in automatic mode, but you can record as much footage as your hard drive allows in manual mode. ShadowPlay's also due to support direct streaming to Twitch.tv, although that feature isn't in the current beta. The advantage of using ShadowPlay over something like Fraps is CPU and memory usage. In our testing we found it affected the frame rate far less than Fraps did, in many cases with a hit of just a few frames per second. The software is still in beta, though, so we experienced a few capturing hiccups and crashes, but hopefully those issues will be ironed out before its full release.
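That "last 20 minutes" behaviour is a classic ring buffer: encoded chunks are appended to a fixed-size buffer, so the oldest footage falls off automatically. A minimal sketch of the concept (chunk sizes and names are our assumptions, not Nvidia's actual design):

```python
# Illustrative ShadowPlay-style "last N minutes" capture buffer.
from collections import deque

CHUNK_SECONDS = 2
BUFFER_MINUTES = 20
MAX_CHUNKS = BUFFER_MINUTES * 60 // CHUNK_SECONDS

shadow_buffer = deque(maxlen=MAX_CHUNKS)  # oldest chunks drop off the front

def on_encoded_chunk(chunk):
    shadow_buffer.append(chunk)

def save_replay():
    """Flush whatever is currently buffered (up to 20 minutes) to disk."""
    return b"".join(shadow_buffer)

# simulate 30 minutes of 2-second chunks; only the last 20 minutes survive
for i in range(30 * 60 // CHUNK_SECONDS):
    on_encoded_chunk(i.to_bytes(2, "big"))
print(len(shadow_buffer))  # 600 chunks == 20 minutes
```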

There's also a great games bundle attached to the 780 Ti, with copies of Assassin's Creed IV: Black Flag, Batman: Arkham Origins, and Splinter Cell: Blacklist coming with every card. That's a sweet deal considering they're such recent games, and hey, if you've already got them there's always the joy of gifting or selling on eBay.

Performance

Our trusty Ivy Bridge PC backed the GTX 780 Ti, although this time we overclocked the CPU to 4.2GHz for a little extra oomph. A 1080p monitor would have been a waste for such a card, so we went with Asus' PQ321Q 4K monitor to really test its pixel-pushing power. With the exception of Crysis 3, all games were run at maximum settings, and where possible we used FXAA for a performance boost. Call Of Duty: Ghosts was run at a lower resolution of 2560x1600, due to its current lack of 4K support.

Motherboard: Asus P8Z68-V
Processor: Intel Core i5 3570K @ 4.2GHz
RAM: 16GB 1600MHz DDR3 Corsair Vengeance
Hard Drives: Corsair Force GT / Samsung Spinpoint F3 1TB
Power Supply: Corsair HX850
Displays: Asus PQ321Q @ 3840x2160 / Dell 3007WFP-HC @ 2560x1600
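The average/minimum/maximum FPS figures below are the standard way of summarizing a benchmark run. Computed from per-frame times, the math looks like this (a generic sketch with made-up frame times, not GameSpot's actual tooling):

```python
# frame times in milliseconds, as logged by a capture tool such as Fraps
frame_times_ms = [28.1, 30.5, 33.2, 41.7, 25.0, 29.4]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# average fps is frames divided by total elapsed time, not a mean of fps values
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)   # worst moment, set by the longest frame
max_fps = max(fps_per_frame)   # best moment, set by the shortest frame
print(round(avg_fps), round(min_fps), round(max_fps))  # 32 24 40
```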

Battlefield 4 (2x MSAA @ 3840x2160)

GTX 780 Ti: 32 average / 24 minimum / 44 maximum FPS
GTX Titan: 29 average / 21 minimum / 40 maximum FPS
GTX 780: 26 average / 13 minimum / 36 maximum FPS

Crysis 3 (High Settings, FXAA @ 3840x2160)

GTX 780 Ti: 30 average / 24 minimum / 44 maximum FPS
GTX Titan: 27 average / 22 minimum / 34 maximum FPS
GTX 780: 25 average / 20 minimum / 37 maximum FPS

Call Of Duty: Ghosts (HBAO+, FXAA @ 2560x1600)

GTX 780 Ti: 75 average / 28 minimum / 107 maximum FPS
GTX Titan: 76 average / 47 minimum / 104 maximum FPS
GTX 780: 54 average / 37 minimum / 83 maximum FPS

Bioshock Infinite (Ultra @ 3840x2160)

GTX 780 Ti: 50 average / 34 minimum / 67 maximum FPS
GTX Titan: 40 average / 33 minimum / 61 maximum FPS
GTX 780: 30 average / 25 minimum / 61 maximum FPS

Tomb Raider (Ultra @ 3840x2160)

GTX 780 Ti: 30 average / 23 minimum / 43 maximum FPS
GTX Titan: 29 average / 21 minimum / 39 maximum FPS
GTX 780: 28 average / 16 minimum / 37 maximum FPS

Metro: Last Light (Ultra @ 3840x2160)

GTX 780 Ti: 33 average / 27 minimum / 49 maximum FPS
GTX Titan: 29 average / 25 minimum / 37 maximum FPS
GTX 780: 25 average / 20 minimum / 40 maximum FPS

A Pricey Performer

As expected with such killer specs, the GTX 780 Ti screams through the likes of Battlefield 4 and Call Of Duty: Ghosts, even at 4K, easily beating the GTX 780 and even the $1000 Titan. It's an impressive showing for a card based on an architecture that's now well over a year and a half old, and represents the peak of Kepler's rendering abilities. While we unfortunately didn't have an AMD R9 290X on hand to make a direct comparison, judging by the benchmarks out there, the 780 Ti is a comparable card and once again places Nvidia within striking distance of, if not back at, the top of GPU performance.

Such performance comes at a price, though. At over $100 more than the R9 290X and nearly $300 more than the similarly performing R9 290, the 780 Ti is an expensive choice. It's also $100 more expensive than the GTX 780, a GPU that's hardly a slouch when it comes to high-resolution performance. Yes, the 780 Ti is far more power-efficient than AMD's latest, and yes, it's a very quiet card in operation too, and we experienced none of the clock throttling issues that are currently plaguing the R9 290.
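Frames-per-dollar makes the value argument concrete. Prices here are the figures stated or implied in this review ($699 for the 780 Ti, $1000 for the Titan; the $599 GTX 780 figure is inferred from the "$100 more expensive" comparison, not quoted directly), paired with the Battlefield 4 4K averages above:

```python
# fps per $100 using this review's Battlefield 4 4K averages
# (the GTX 780 price is an inference from the text, not a quoted RRP)
cards = {
    "GTX 780 Ti": {"price": 699, "avg_fps": 32},
    "GTX Titan":  {"price": 1000, "avg_fps": 29},
    "GTX 780":    {"price": 599, "avg_fps": 26},
}

for name, spec in cards.items():
    value = 100 * spec["avg_fps"] / spec["price"]
    print(f"{name}: {value:.1f} fps per $100")
```

On these numbers the Titan delivers by far the fewest frames per dollar, which is exactly why the 780 Ti makes it redundant for gaming.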

Whether that's worth the extra cash, though, is debatable. No doubt about it, the GTX 780 Ti is a brilliant GPU backed by some brilliant software, but you can do a lot with that $100 saving (or even $300 if you plump for the R9 290). AMD's aggressive pricing has taken the shine off the GTX 780 Ti, but if you're all in for team green and have the high-res setup to do it justice, it's the absolute best you can get from Nvidia, and one of the best GPUs (a lot) of money can buy.

Got a news tip or want to contact us directly? Email news@gamespot.com

Join the conversation
There are 232 comments about this story

edblevins:

Great article.

I've enjoyed the Nvidia cards I've owned in the past few years. I'm a budget gamer, however, and I don't go for the expensive cards; I tend to stick with in the $300 - $400 range. I know I would get more bang for my buck with AMD but I find it hard to willingly move from such dependability.

meatz666:

<< LINK REMOVED >> AMD is good, especially for processors, but I prefer Nvidia for graphics too.

edblevins:

<< LINK REMOVED >> Yea, I honestly can't say anything bad about AMD; they're solid cards and well priced.

Morphine_OD:

680GTX still rips every game out there to sharp 1080p shreds. And there I was thinking that "next-gen" software would give me a valid reason to upgrade.

meatz666:

<< LINK REMOVED >> Even my 660 GTX. Who plays at 3840x2160? This is 3 times the resolution of Xbone.

nodbgp:

After going Nvidia for such a long time, my last one being a 660 which is still awesome, but I doubt will be able to handle The Witcher 3 at its best, I think it`s finally time to jump ship and go AMD in 2014, the R9 290 price/benefit is just way too good.

AK1015:

<< LINK REMOVED >> I honestly think you'll be surprised. Just wait and see.

numbes:

It is not wasted on a 1080p monitor if you are a competitive gamer; 120Hz or 144Hz is what you should have, and more fps with no drops below 60 is what you are after.

About the 290X "not only taking the performance crown from their rival, but also seriously undercutting it in terms of price":

This is debatable. It really annoyed me how they overhyped it. At stock it beat the 780/Titan in some AMD-optimised games, but it runs really loud and really hot. I wanted them to compete, but any gamer with these sorts of cards is going to overclock a bit, and then it loses across the board.

4K is still unplayable, so why bother with those benches?

DeViLzzzWIN:

Console wars lolol

Eh the only thing people should do is take those consoles they pre-ordered and resell them for as much as they can get then spend it on master race ... PC .. get some awesome hardware !

Wahab_MinSeo:

I wonder why Call of Duty: Ghosts is only at 2560x1600, not 4K (3840x2160 or 4096x2160)?

nodbgp:

<< LINK REMOVED >> For the simple fact that it doesn`t support it, it`s written in the article.

Trickymaster:

<< LINK REMOVED >> What are you talking about? Of course. It's 4K. There's no reason to limit it. :P

push88:

I'm looking into building a gaming desktop pc. I'm not interested in a multi monitor setup, but I do want my games to run smooth on very high settings. The graphics card I am looking at is an Nvidia GeForce GTX 690 4GB GDDR5. Can anyone tell me if this will suit my needs? Also, the total seemed a bit pricy for me at around $3300(for the entire computer). I could really use some helpful advice. Thanks.

meatz666:

<< LINK REMOVED >> Buy an Alienware Aurora, if you're dishing out 3k.

mick_holland:

<< LINK REMOVED >> i got a gtx 680 16gb ram and an intel core i7 and theres not a game i cant run at ultra everything and even with battlefield 4 still gettin 80-100 fps. just single monitor set up mind you.


Morphine_OD:

<< LINK REMOVED >> same here - good ol gtx680/2600k setup

numbes:

<< LINK REMOVED >><< LINK REMOVED >> 770 doesn't make sense to me in price everywhere i look it is not competitive price to performance hxxp://pcpartpicker.com/user/numbers/saved/2Lvb then get g-sync when that comes for the monitor

skidmarkmike11:

<< LINK REMOVED >> I would look at possible getting a 670 or 770 in sli (two cards), that would save you a good 300-400 bucks. That would run games great at 1080p and you could max out pretty much everything without much trouble.

push88:

<< LINK REMOVED >><< LINK REMOVED >> I'll look into that, thank you.

Gen007:

<< LINK REMOVED >> well what resolution do you plan on playing at? because a 690 is gonna be overkill for most people right now unless you're trying to play at 4K, and the price reflects that. Also the 6 series is outdated now. I would look to get a 7 series card from Nvidia now, especially if you're looking to pay that much. After seeing this story the 780 Ti seems to be Nvidia's best single card right now + it beats out the 690 in most cases + it's cheaper. Also if you really want to save more money, go AMD quite frankly. Their new R9 cards are better than their 7 series equivalents but cheaper.

push88:

@Gen007 @push88 Really?! 690 is overkill? I'll need to look into something else. I'll be playing graphic heavy games in 1080p like BioShock Infinite, Battlefield 4, Metro: Last Light, etc and I want "Advanced PhysX". Lol I was really hoping to stay at around $2300.

cfisher2833:

<< LINK REMOVED >> You don't want a 690 anyways. A 690 is simply two 680 GPUs in SLI on the same PCB; if you're new to building PCs, you don't want to deal with the hassles of SLI (ie multiple GPUs). Go with something like the 780 or the 780ti if you have the money. Some may tell you to go AMD, but if you like PhysX, you'll need to stay with Nvidia (I personally do like PhysX as well--dem Arkham Origins snow footprints!!).

Gen007:

<< LINK REMOVED >> Yeah a 690 will pretty much run anything out now and in the near future pretty well at 1080p, so yeah it would be overkill imo, or rather a waste of money. Yeah I'd look into a 770 or 780; those are much cheaper and still plenty powerful. Also like skidmark mentioned, SLI is always an option. You could SLI two 770s and get better performance than a 780, and it will still be cheaper than getting a 690.

Gen007:

Great card it seems, but Nvidia is gonna have to come off their premium pricing if they aren't offering a premium product. I have to say AMD has them beat here, with the R9 cards being faster and much cheaper. I must say AMD is clearly gunning for Nvidia's head, and Nvidia doesn't seem to be taking the threat seriously. I mean, in December BF4 is supposed to get an update to support the Mantle API on AMD cards, and if it does what it claims to, then it could end up changing everything and putting Nvidia in a really tough spot. AMD controls all of the next-gen consoles, so Mantle could end up being a super ace card for AMD.

quickshooterMk2:

wow, Last Light is such a demanding game


starjay009:

Another article for PC fanboys to scream, yell, and pillow fight among themselves. What a bore ! A GPU over 1000$ for an extra 5-7 fps. This is daylight robbery as is always the case with PC gaming. As a PC gamer myself, I don't read too much into these articles. I own a GTX 570 which maxes out pretty much every game currently available at 1080p. I can live with some compromises in visual quality from time to time. But I don't want to spend my hard earned money on such pricey hardware especially when I won't be noticing much difference between now and then. This is why console gaming is better. Good looking games with a small price tag. PC gaming will still be better BUT at the cost of such madness.

shadow580:

<< LINK REMOVED >> This article isn't for you. Move along now.

Mayan_Ozolins:

Well, this GPU's performance is increased over the default 780 but lower than the 790, and roughly equal to the Titan give or take a couple hundred CUDA cores. If you want an uber gaming PC, then this is the GPU for you.

DARKSPACE:

Good. Now PC owners can fight among themselves for while on who's system is better. LOL.

redskins26rocs:

Since when does GS review GPUs, it is not like a whole AMD series just recently released yet all we get is a review of the GTX 780 ti

euphoric666:

i feel outdated with my GTX680 already =(

Morphine_OD:

<< LINK REMOVED >> why would you feel outdated? it's pretty obvious that there is no game 680 doesn't run on ultra everything 1080p

DeusGladiorum:

@euphoric666 Yea I bought a $400 GTX 770 over the summer. Now looking at benchmarks on Tom's Hardware, it's dwarfed by all the other high end cards save for the R9 280X. It makes me feel like my system is weak, even though it's ridiculously capable for 1080p, but even so the price has dropped to $330 which would have been great savings.

jack00:

<< LINK REMOVED >> Same here :( I bought it about 2 months after its release for 500$. I feel scammed. Anywho, i'm not wasting another 500$ on a new gpu for at least 2 years.

Halloll:

<< LINK REMOVED >><< LINK REMOVED >> I bought it 2 months ago, F***!

Usagism:

Hey guys, I'm still new to PC gaming, so excuse my stupidity, but when they write 3840 x 2160, is that the resolution they are playing it in? If it is, why would anyone need to play a game at that high of a resolution? Or is it just a way to test how well the card can perform?

DeusGladiorum:

<< LINK REMOVED >> Because 3840x2160 looks gorgeous, plain and simple. There's always been a steady increase in resolution over the years, and the standard has always been increasing. Obviously 1920x1080 was not always the standard. Back in the mid 90s, and somewhat into the late 90s and early 00s, 800x600 was a good standard bread-and-butter resolution, and most gamers felt most comfortable playing at that. In the mid 90s if their rig was fairly powerful, or in the late 90s and early 00s when it was a lot more commonplace, 1024x768 was a great resolution. Back then, 1600x1200 was considered an insane resolution for gaming, reserved for only some of the most powerful rigs, and it's essentially analogous to 3840x2160 right now. 3840x2160 is 4 times as many pixels as 1920x1080, just as 1600x1200 is 4 times as many pixels as 800x600, and many people were complaining about 1600x1200 then as well. 3840x2160 is the future, and it's going to look fantastic. The point of testing at that resolution is to see how future-proof the card is.

Mantan911:

<< LINK REMOVED >><< LINK REMOVED >> 4K is still quite low fps even on the best of GPUs (including this). Nowadays refresh rates also differ between screens.

deactivated-60b838d2a137f:

@usagism It's a silly question to ask why play at such a high resolution... Why don't we all play in 480p, why don't we all watch movies on VHS, hell why don't we all go back to greyscale TV's? Because all those things suck... and a higher resolution makes a wonderful difference in quality. OF course you don't need it, but if you got a GPU this powerful you'd be wasting it on lowly ol' 1080p

DeadorRock:

<< LINK REMOVED >> Yes, that's the resolution they are playing in, and it'll be the next high definition standard resolution in the near future (UHD). Graphic cards powerful as 780Ti and R9 290X have to be tested in that resolution because they are kinda overkill for 1080p gaming.

buccomatic:

this is a great article and write up. thx so much, mark. thoroughly enjoyed.

frylock1987:

and I'll give this card about 6 months to a year to drastically drop down in price and finally swoop in and make my purchase.

aeterna789:

@frylock1987 Great advice. This is exactly what happened to the current 700 series, which saw significant price drops these past few weeks.

Alucard1475:

Looks like the one card to pick up for next year's Witcher 3, Star Citizen, etc., and by that time it should be quite cheaper.

slayermyshorts:

Why can't they unlock the 15th SMX on the Titan?
