The Future of Gaming Tech Looks Brighter Than Ever

As our current stable of consoles begins to fade into obsolescence, the PC market has already defined what we can expect from the future of console gaming and development.

The moment Watch Dogs and Star Wars 1313 entered our field of view at E3, we were teased with a glimpse into the future of gaming tech: demos exhibiting real-time elements of the sort we usually see only in prerendered cutscenes and film. It was clear that the fidelity onscreen was a step above what we're used to. LucasArts has already said that Star Wars 1313 is intended for the next generation of hardware, and that much is obvious once the game is seen in action. Watch Dogs was confirmed for release on the Xbox 360 and PlayStation 3, but those versions will not reflect the same tech exhibited onstage, likely sacrificing lighting, texture resolution, and particle effects in the transition.

Both of these titles ran on high-end PCs at E3, with Star Wars 1313 (and Watch Dogs, presumably) using Nvidia's latest GPU (the Kepler-based GTX 680), a card that's opening doors for major advances in graphics and computation.

But if you try to pinpoint what made Watch Dogs and Star Wars 1313 look so impressive, a quick visual analysis reveals that lighting and particles are the key factors behind the perceived fidelity on display. Truthfully, a properly lit environment with 750K polygons looks more authentic than a poorly illuminated environment at twice the polygon count, especially in motion. These elements are only now becoming a reality in the realm of real-time rendering, and they're changing the balance of power in the gaming arms race. Where polygons were once king, they've been usurped by techniques that, in essence, require a different type of artistic prowess. This new direction is going to have a major impact on our expectations for AAA titles.

Delivering these techniques requires advanced hardware, but a new breed of software is also necessary to facilitate their implementation. Leading the charge is Epic's workhorse, Unreal Engine 4. The real-time demo Epic showed at E3 was nothing short of a revelation. After running a scripted cutscene, the Epic developer driving the demonstration began to alter the scene in real time, upending our perception of what can and cannot be done on the fly. Light bounced off every surface and, depending on the shaders in use, appropriately illuminated the surrounding objects. This technique, aptly named global illumination, is nothing new to film, but computing it in real time is a first for gaming. These developments will clearly need to dominate the coming generation of consoles if Sony and Microsoft hope to keep up with the PC market.
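
To ground the idea, here's a minimal sketch of what a single bounce of global illumination means in code: instead of shading a point with direct light alone, you also average in whatever light nearby surfaces reflect toward it. This is a generic Monte Carlo illustration in Python, not Epic's implementation, and the scene.trace and scene.direct_light calls are hypothetical stand-ins for a renderer's ray query and local lighting routines.

```python
import math
import random

# Single-bounce "global illumination" sketch (hypothetical scene API).
# Direct lighting alone leaves surfaces in shadow pitch black; averaging
# the light that nearby surfaces reflect toward a point produces the
# bounced illumination shown in Epic's real-time demo.

def sample_hemisphere(normal):
    """Pick a random direction on the hemisphere around a surface normal."""
    while True:
        d = [random.uniform(-1.0, 1.0) for _ in range(3)]
        if 1e-9 < sum(x * x for x in d) <= 1.0:
            break
    if sum(a * b for a, b in zip(d, normal)) < 0.0:
        d = [-x for x in d]  # flip the direction into the upper hemisphere
    length = math.sqrt(sum(x * x for x in d))
    return [x / length for x in d]

def shade(point, normal, scene, samples=64):
    """Direct light plus one Monte Carlo-estimated indirect bounce."""
    color = list(scene.direct_light(point, normal))  # classic local lighting
    for _ in range(samples):
        direction = sample_hemisphere(normal)
        hit = scene.trace(point, direction)  # what does this ray see?
        if hit is None:
            continue  # ray escaped the scene; no bounced light
        bounced = scene.direct_light(hit.point, hit.normal)
        cosine = sum(a * b for a, b in zip(direction, normal))
        color = [c + b * cosine / samples for c, b in zip(color, bounced)]
    return color
```

Doing that sampling per pixel, per frame, is what kept global illumination confined to offline film rendering for so long; the achievement in Unreal Engine 4 is approximating the bounce cheaply enough that the scene updates the moment an artist moves a light.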

Clearly, the results speak for themselves, but beyond Unreal Engine 4, the hardware powering the demo deserves mention. The backbone of the machine was Nvidia's GTX 680, which packs significant performance into a compact, efficient package. To illustrate its prowess, we should examine another demo from Epic that evolved between its debut at GDC '11 and its rebirth at GDC '12: the real-time Samaritan demo, built within Unreal Engine 3.

The point is that the Samaritan demo, now running on the same hardware as the E3 Elemental demo (a single GTX 680), debuted in 2011 on three GTX 580s. In broad terms, the GTX 680 is capable of performing as well as a top-of-the-line, triple-SLI rig from 2011. This drives down cost, obviously, and you can even cram the GTX 680 into a diminutive Shuttle PC, as Epic did for its presentations. This is good news for those of us who shudder at the thought of a full tower beside the desk.

Manufacturing processes continue to shrink year over year, and that shrinkage is a significant factor in the wealth of performance available in the GTX 680, but Nvidia has done its fair share of legwork in the meantime. Regarding the Samaritan demo, the 2012 variant boasted a 76 percent decrease in VRAM usage over the prior iteration--not a figure to be taken lightly. This is due in no small part to Nvidia's pioneering of FXAA (Fast Approximate Anti-Aliasing), an antialiasing technique that taxes shader processing cores rather than RAM. Utilizing FXAA initially required developers to hard-code the functionality into their software, but thanks to a recent driver update, you can now enable FXAA through the Nvidia control panel, boosting performance in hundreds of PC titles. This also means Microsoft and Sony could adopt FXAA in the future, given the right hardware to back it up. And since it's a post-processing technique, it can even improve 3D "classics" through backward compatibility.

There is a trade-off when using FXAA, but it's one most people will likely accept. Essentially, once the environment and characters are rendered, jaggies and all, FXAA takes that output and works its magic. The previous king of antialiasing, MSAA (Multi-Sample Anti-Aliasing), has to account for every polygon and edge prior to rendering the image, sacrificing frames per second along the way. MSAA is technically the more accurate technique of the two, but the minor inaccuracies in FXAA's edge handling are overshadowed by the decrease in resource usage and the increase in frame rates. Overall, FXAA is hyper-efficient relative to other methods, and in most cases produces better results. Here is an image illustrating the differences between MSAA and FXAA, albeit in an isolated sample.
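
To make the distinction concrete, here is a toy post-process pass in the spirit of FXAA, stripped down to a luma-based edge blend in Python with NumPy. It is emphatically not Nvidia's shipping algorithm (real FXAA does smarter directional filtering along detected edges), and the function name and threshold are invented for illustration, but it shows the key property: the pass only ever touches the finished frame.

```python
import numpy as np

# Toy post-process antialiasing in the spirit of FXAA (a hypothetical
# simplification, not Nvidia's shipping shader). Like FXAA, it operates
# on the finished frame only: no extra geometry samples, no extra VRAM.

def post_process_aa(image, threshold=0.1):
    """image: H x W x 3 float array in [0, 1]. Returns a smoothed copy."""
    h, w, _ = image.shape

    # Perceptual luminance, used to find high-contrast (jagged) pixels.
    luma = image @ np.array([0.299, 0.587, 0.114])
    pl = np.pad(luma, 1, mode="edge")
    neighbors = np.stack([pl[:-2, 1:-1], pl[2:, 1:-1],
                          pl[1:-1, :-2], pl[1:-1, 2:]])
    edge = neighbors.max(axis=0) - neighbors.min(axis=0) > threshold

    # 3x3 box blur, blended in only where an edge was detected.
    pi = np.pad(image, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(pi[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    out = image.copy()
    out[edge] = 0.5 * image[edge] + 0.5 * blurred[edge]
    return out
```

Because the pass never sees the scene's geometry, its cost scales with resolution alone--which is exactly why a driver update can switch it on for games that never planned for it.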

With all of these factors in mind, it's easy to see how the next leap in console tech will be defined by more than advances in polygon counts and resolution. Some industry professionals bemoan the arrival of a new generation, and there's a lot of truth to the notion that function should precede form in the realm of gaming. For some (including the hardware manufacturers), though, the market needs occasional stimulation--breadcrumbs that lead to a brighter, globally illuminated future. For better or worse, we've created an economy of performance, where games are valued primarily for their ability to impress their worth upon us in as little time as possible. Hopefully, these powers will be put to good use and will help developers flex their creativity rather than dictate it.


Discussion

itchyflop

wank!! i thought star wars was on ps3. So will watch dogs look that good on consoles or "high end" £1000 pcs only. and i hope the next gen software looks better than that. 

toderascu23

what? no lightsaber for star wars? they can put the best graphics in the world, if they don't add a lightsaber then it's not star wars.

vishakhenzo2005

what the hell? i want good gameplay more than graphics. Can people please spread that. I don't want in screen prompts(press x to evade) and developers telling me its FXAA3 technology.!

RockySquirrel

Peter doesn't like full towers?

 

TRAITOR!!!!

Skilnes

I wasn't really impressed by the new star wars, it's stuff i've already seen in uncharted.. watch dogs was impressive, but because of the gameplay

Straperry

Looks like graphics from Korea. The advancement of it.

offspring94

One step forward for graphics, one giant leap backwards for the art form.  Getting graphics like these means even bigger development budgets and less tolerance of artistic risks; we can look forward to never seeing another game like the original Deus Ex again.

Colekern

Switching to Nvidia soon... hopefully.

Jermone123

"Regarding the Samaritan demo, the 2012 variant boasted a 76 percent decrease in VRAM usage over the prior iteration--not a figure to be taken lightly."

 

Ha, I think he is talking about the VRAM from the tri-sli 2011 rig? Come on man, VRAM isn't used on the additional cards while using SLI. Only one card's VRAM is used. Which, makes that point he made totally mute...   : )

jimy1475

gameplay gameplay gameplay

stylechyld

I knew within time a game like this was going to be made.. a man that control his environment with his phone.. well good idea since the world now revolves on people glued to there cell phone..the real challenge for this game is going to be getting  people off there cell phone to play a game of a man holding a cell phone..lol

Aristowi

Too bad Sony went for an AMD chip for the PS4. Or maybe it's too good? Who knows, tech moves so fast.

JoKeR_421

damn, i didnt know ppls opinion really bothered a lot of ppl lol. jeez, no one can talk these days without someone clinging ur throat lol.

JoKeR_421

 @frostyballoon actually donkey kong is much better than RDR, i enjoyed it more. RDR got boring, repetitive fast. but thats my opinion, and its just as valid as u thinkin RDR is better lol

 

JoKeR_421

@Basik3108

 but u didnt. not my fault u didnt explain right. and ur right about one thing, yea graphics should be complimented cuz some really do impress, but why care about graphics when the game is trash. dont take what i say out of proportion and assume im attacking u in any way. by all means u have ur opinion and u have the right to say what u think or like....but dont come out and say we thought this and that, when u clearly type something that u should of thought or read through before posting. anyways. my point is, i wish devs focus more gameplay and story, more than they do on graphics. i have enjoyed god knows how many games, wish poor graphics and got poor reviews cuz of it.

 

thats another thing, half of these reviews cap on a game so much cuz of graphics. yea i know we are in 2012 and graphics should be all nice and dandy. but ill still stick to my comment, games should be about enjoying them and good gameplay. david jaffe came out and said it lovely. that 'devs forgot how to make good games' cuz thye focus on graphics, and nothing new or bring something fresh to the table for gamers to enjoy

Fandango_Letho

My PC could run these types of games without a problem. It's a bit odd to hear about how Star Wars 1313 looks supreme when... a title like this could be released this summer for PC's if they had worked on that platform.

dream0fchaos

i cant wait to see the games that will utilize these engines

TheRealToonLink

I don't think the next generation of consoles should happen yet mostly because at the moment, most developers are making games at a loss because they are too expensive to make but don't sell enough to pay back the development costs. I mean, besides the most hardcore, who's going to fork out $300-$500 for a new system when first parties are still pushing their hardware and introducing new features for it. (case in point, SmartGlass and WonderBook)

 

I just feel that if developers create and push new hardware, we are going to end up with a second video game crash. 

Annihilator-X-

"Overall, FXAA is hyper-efficient relative to other methods, and in most cases, produces better results."

You wrote one sentence earlier MSAA produce more accurate results, just more taxing, and then you wrote FXAA produces better results in most cases, this makes no sense.

trollkind

When the future of consoles really is buying a game, putting the disc in, installing for an hour, downloading patches for an hour (games get bigger and bigger) and installing them, then creating an account with the publisher, then one with the developers own online service, finally playing but being annoyed with constant reminders of new shop content, "social" updates and server issues and flat out not being able to play the game in multiplayer after a year, then I might as well go back to PC gaming.

 

I became devoted to console gaming because it was worryfree instant pure gaming fun, now everything I started to dislike in PC gaming followed us over. -_-

LoserMike

The PS3 and 360 already have a few games that use FXAA. Post-process AA is nothing new. Almost all Sony's first-party games since God of War 3 use MLAA which uses the SPUs in the CELL  to perform anti-aliasing.

rattyocaster

I think that some people are selling short what the PS4/Nextbox can really do. I dare you to load up an original console release for either the PS3 or 360, then load up a recent game (Uncharted then Uncharted 3 is a good example) and see the difference. The current generation of consoles are pushing 6 years old and in that time developers have been managing to wring more detailed graphics out of them, which is a tall order considering both consoles have essentially 512MB of RAM (in the 360 it's shared, in the PS3 it's 256MB for system and then 256MB graphics memory) and GPUs based on the Radeon X1000 series or GeForce 7000 series. Considering I have a PC with a Radeon 6870 crossfire setup, most multiplatform releases are better for me on PC, but there are games which are PC only or heavily optimised (Battlefield 3 and Dirt Showdown come to mind) which look and run much better than their console counterparts; equally, Need for Speed The Run runs and looks better on console, even though the tech is much worse. Personally I can't wait to see what the next consoles can really do, you know, in about 6 years time...

sepir

Seeing all this just makes me a bit depressed (as a PC user), as near top of the line PC's that can run one new release game with great graphics perfectly fine can then struggle with an inferior graphic game. Well, not struggle; the game's frame rate struggles whilst the PC is barely using any of its resources. All well and good showing us this, but if the developers don't code it properly it really isn't going to matter.

 

Or maybe I need to learn more, but I'm not the only one that gets these problems.

Drakillion

Good graphics are nice, but for the time being you'd have to shell out a pretty penny to achieve those great looking visuals. Agni's Philosophy used two interlaced (I think) GTX 680s, which are the highest end cards on the market.

jchristenberry

I dont think for a second that gameplay has taken a back seat to graphics. Otherwise these successful and good looking franchises wouldnt have made it for as long as they have, such as Battlefield, Call of Duty, Mass Effect, etc... There is certainly a vocal demand for better graphics, but it makes sense. The gaming industry has strived for years (with no help from nintendo) to be taken seriously and not just as "something for the kids". It's only practical that game developers would push to make their games look that much better/realistic. Its a means to be taken more seriously. Take Heavy Rain for example... no one can call that a "kids game" simply due to the graphics alone. Then you take into account the gameplay and context of the story, and it makes me all the more eager to see what they come up with next!

Yojimbo25

I saw Square's next gen engine not impressed. Unreal 4 still not impressed. Honestly the jump from PS3/360 to PS4/720 won't have as much impact as the jump from PS2/Xbox was for me. But in a age where graphics matter, but gameplay takes a back burner my comment will be invalid because its all about the graphics now..

brgreen

 @toderascu23 There are a ton of Star Wars games out there with no light sabers. i.e - shadows of the empire for N64 was mainly a shooter, and you have all the flight simulator games etc.

skinntech

 @offspring94

  I wouldnt say that. epic usually releases their engines for free. I've been using udk for about a year now along with many other indie developers. The reality is that the new unreal engine makes it even easier to develop for by streamlining the dev process. It's actually going to make it easier to create big budget looking games with less staff.

hopscotch_bandi

 @Jermone123

 It kills me when people post and try and sound all smart and technical and then end with something as stupid as "that point he made totally mute"  It's moot moot moot moot, not mute. 

CieloPerso

 @Annihilator-X- I think he meant better results overall in regards to a combination of both performance (speed, fluidity, &c.) & visual quality (AA, sharpness, &c.).

cr8ive

 @sepir That's why we should always push for moddable games :)

as long as the developers don't water down the future gameplay to simple swipes and taps.. or gestures...as in kinect

dutchgamer83

 @sepir Problem is that most PC games are made for the console first. The consoles can't run games as well as the pc (for all the fanboys, this is not bashing this is just a fact, the PC is generations ahead of the current consoles). So they restrict certain things, once ported to the pc they don't optimize it for the pc so it causes conflicts.

And often its just a easy port to the pc without coding it properly so we pc gamers with a high end pc still get frame drops and other things that shouldn't happen cause you can run other much better looking and more heavy for your pc games like a charm.

 

Devs need to make a separated pc version, and some do like battlefield and hitman, but often the pc version is just a xbox port with a few extra options like setting your resolution and AA and such. When they have a separated version you often notice the difference compared to a port.

benandmax

 @jchristenberry I don't quite understand this: 'The gaming industry has strived for years (with no help from nintendo) to be taken seriously and not just as "something for the kids".' No help from Nintendo? They are the only video game company that has been able to get adults and even seniors playing games. Microsoft and Sony have not been able to do this. If you are talking about "mature" games, 90% (just a guess) of the people playing those are between 5 and 19. If you are going to be taken seriously, you need to appeal to that broad audience and Nintendo is the only one who has done that.

NightFox313

 @Yojimbo25 I'm with you there. Next-gen is only going to be better graphics and an even bigger focus on online gameplay and video games that are primarily DLC-based. This new generation of gaming tech just doesn't interest me like it did back then. There's just not much "new" this time around.

 

Sure, graphics make a game look neat to look at but what use is it if the gameplay sucks? And for me, it's online gaming that needs to take the "back burner", because there's just way too much focus on the online that there's no way to have fun side-by-side with your friends anymore. I'm not saying that there shouldn't be any online interaction between gamers anymore, but developers need to tone down the focus on the online multiplayer. I hope this isn't asking too much.

agallardo13

 @dutchgamer83 do you know how insane the cost will be if they had to separately make a pc version of a game? i understand what you are saying but on a business perspective it will mean the end of your profits

02050muh

 @dutchgamer83 that is few years back, but nowadays, when development of a game become more cheaper, they start develop games separately. although not entirely separate, coz cost r high these days n they NEED to make it multiplatform to gain profit. n i don't think developers are even ready for new gen yet. just look at 360 n ps3 launch titles. they r mostly suck,dont they?

xXl_z3r0_lXx

 @NightFox313 I wish devs would make games with local multiplayer too. Some of the best memories I have are of me and some friends playing games NEXT TO each other, in the same room, chatting, eating junk food, etc.

xXl_z3r0_lXx

 @NightFox313 -And I'll admit that the 30 hour tutorial was bad, but once you got out to the overworld it really wasn't so bad.

xXl_z3r0_lXx

 @dutchgamer83 I'll give you the 30 hour tutorial thing, but I always found that it was all in how you set up your paradigms and used them together along with the well timed paradigm shifts. Of course, I always used Army of One to make paradigm shifts less destructive to my own party.

NightFox313

 @dutchgamer83  I used to play computer games with my brother side-by-side on the keyboard all the time (Sonic Robo Blast 2 FTW). The graphics were sprite sheet-based, but it was acceptable because the game was super fun. I first played video games on the PC as well, and then I moved on to consoles shortly afterwards.

 

I have about two CoD games (CoD: Classic on the PS3 and CoD3 on the PS2) and I found them a tad enjoyable at first, but they quickly got repetitive and eventually boring. I don't even play them anymore. I've seen people play Minecraft and I think the graphics are charming as well.

 

Final Fantasy XIII was a great-looking game, but I agree with you about the gameplay. My friend let me play it once on the PS3 and I could say it got really boring fast. It was super linear, and basically the game just has such a repetitive pace that I just tended to run away from the enemies when they tried to force me to tap X over and over again in the turn-based segments.

 

Totally agree with you there.

trollkind

 @dutchgamer83  "I enjoyed the movies but the game was nothing more than press x" At least for me it was just "Press x" during the 30 hour tutorial, then it was frantic and well timed paradigm shifting unless you really outleveled your opponent.

xXl_z3r0_lXx

 @dutchgamer83  @NightFox313 Thoroughly agreed, with both of you. Although I don't agree on the FFXIII thing, I kinda like the gameplay and I only used auto battle when I absolutely needed the extra time for summons and paradigm shifts, otherwise I used abilities. Oh and the main section of Archylte steppe was pretty large. On a side note though, Minecraft was actually very graphically taxing and reliant on the GPU, even though everything looks the way it does.

dutchgamer83

 @xXl_z3r0_lXx And bring back splitscreen even on the pc damnit. Need for speed was always a blast with a friend on the pc. Both on the keyboard. It really is annoying they move everything to the internet. When i have a friend over we want to play a game together. He isn't gonna bring his pc with him, nor his xbox or ps3.

 

@NightFox313 I totally agree. I am a pc gamer first and graphics hardly do it for me (guess that's what you get when you are generations ahead but hardly get to see the power your machine can handle).

Sure i like beautiful graphics but i get more excited about an indie game with poor graphics that does something really special or has a great immersion than i get excited for a new Crysis game. Crysis and CoD looked nice but they got rather boring quickly for me seeing i only played them for the singleplayer (2012 omg you still want singleplayer game? O.o...yes..yes i do!).

Great example is Minecraft, the graphics suuuuuuuuck..but that is the biggest charm of the game.

 

The new game engine of square enix is okay looking i suppose. But as long as its just a CGI animation (i know they claim it was a game, but did anyone get to control a character and walk around in their level? No, so till then i say its a CGI animation) i don't believe a damn thing about it. First let them make a real game with the engine that still looks like that, then we shall talk again. And FF13 looked amazing too but it came with a huge cost...small level design (mostly narrow corridors) that isn't as demanding as a big map to freely explore. And the gameplay wasn't really interesting. I enjoyed the movies but the game was nothing more than press x, press x, press some more x (when playing on the ps3).