Hard facts on the Nintendo GPU emerge. So is it more powerful than PS3 and Xbox 360?


This topic is locked from further discussion.

#1 Posted by SecretPolice (22725 posts) -

Whelp, there ya go but who really thought it wasn't at least a little more powerful than 360 / PS3 ?

So, perhaps before the MS Sony consoles arrive, the U just may nab itself a graphics queen crown. :P

Discuss if you wish.

The Story.

---

http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Wii U graphics power finally revealed

Hard facts on the Nintendo GPU emerge. So is it more powerful than PS3 and Xbox 360?

Despite its general release two months ago, Nintendo's Wii U console has remained something of a technological mystery. We quickly gained a good idea of the make-up of the IBM tri-core CPU, but the system's apparent strengths are in its graphics hardware, and in that regard we had little or no idea of the composition of the Radeon core. Indeed, it's safe to say that we knew much more about the graphics processors in the next-generation Xbox and PlayStation. Until now.

Detailed polysilicon die photography of the Wii U's GPU has now been released, showing the hardware make-up at the nano level and resolving most of the outstanding mysteries. However, the story of how these photos came into existence is a fascinating tale in itself: community forum NeoGAF noticed that Chipworks was selling Wii U reverse-engineering photography on its website, with shots of the principal silicon being offered at $200 a pop. Seeking to draw a line under the controversy surrounding the Nintendo hardware, a collection was started to buy the photos.

There was just one problem. The shots were simply higher quality versions of what had already been revealed on sites like Anandtech - good for getting an idea of the amount of silicon used and the make-up of the overall design, but without the ultra-magnification required to provide answers, and therefore no further use in unearthing the secrets of the Wii U hardware. At this point, Chipworks itself became aware of the community money-raising effort, and decided to help out by providing the required shot - for free. It's a remarkably generous gesture bearing in mind that the cost of carrying out this work is, as Chipworks' Rob Williamson told us, "non-trivial".

"Sourcing the images required to nail down the make-up of the Wii U GPU was the result of a remarkable community effort, plus the generosity of reverse engineering specialists, Chipworks."


The NeoGAF community started a collection to buy Chipworks' reverse-engineering photos of the Wii U processors, but alas the existing photography wouldn't have told us anything new. On the left we see the markings on the heatspreader while on the right we see the metal shield removed and the GPU and CPU revealed.

So, what does the new shot below actually tell us? Well, first of all, let's be clear about how we draw our conclusions. Graphics cores work principally by spreading work in parallel over a vast array of processors. On the die shot, this manifests as the same mini-blocks of transistors "copied and pasted" next to one another. We know that the Wii U hardware is based on AMD's RV770 line of processors - essentially the Radeon HD 4xxx cards - so we have some point of comparison with existing photography of equivalent AMD hardware.

Chipworks' shot is still being analysed, but the core fundamentals are now seemingly beyond doubt. The Wii U GPU core features 320 stream processors married up with 16 texture mapping units and featuring 8 ROPs. After the Wii U's initial reveal at E3 2011, our take on the hardware was more reserved than most. "We reckon it probably has more in common with the Radeon HD 4650/4670 as opposed to anything more exotic," we said at the time. "The 320 stream processors on those chips would have more than enough power to support 360 and PS3 level visuals, especially in a closed-box system."

It was ballpark speculation at the time based on what we had eyeballed at the event, but the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented, so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. 1080p resolution is around 2.25x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.
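The "1.5 times the raw shader power" claim and the resolution concern above both fall out of simple arithmetic. A quick sketch (assumptions: 2 FLOPs per stream processor per clock for a multiply-add, and the commonly cited 240 GFLOPS peak for Xenos, which is not a figure from the article itself):

```python
# Back-of-the-envelope numbers behind the article's claims (assumed figures).

def shader_gflops(stream_processors, clock_mhz, flops_per_cycle=2):
    """Theoretical peak: each ALU does a multiply-add (2 FLOPs) per cycle."""
    return stream_processors * flops_per_cycle * clock_mhz / 1000

wii_u = shader_gflops(320, 550)   # die-shot figures: 320 SPs at 550MHz
xenos = 240.0                     # commonly cited Xbox 360 Xenos peak

print(f"Wii U GPU: {wii_u:.0f} GFLOPS")    # 352 GFLOPS
print(f"vs Xenos:  {wii_u / xenos:.2f}x")  # ~1.47x, i.e. "about 1.5 times"

# Pixel-count ratio of 1080p to 720p, relevant to the eight-ROP argument:
print(f"1080p/720p: {(1920 * 1080) / (1280 * 720):.2f}x")  # 2.25x
```

The same eight ROPs that comfortably fill a 720p framebuffer would have 2.25 times as many pixels to write per frame at 1080p, which is the basis for DF's doubt about complex 1080p titles.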

All of which may lead some to wonder quite why many of the Wii U ports disappoint - especially Black Ops 2, which appears to have been derived from the Xbox 360 version, running more slowly even at the same 880x720 sub-hd resolution. The answer comes from a mixture of known and unknown variables.

"Plenty of questions remain about the Wii U hardware, but this die-shot tells us everything we need to know about the core make-up of the Radeon graphics processor."


Chipworks' polysilicon die-shot of the Wii U graphics core, with some tentative annotations added by us. In the cyan area we have the 32MB of eDRAM - fast memory contained within the GPU itself. Above that we have two more areas of embedded memory - this is unconfirmed but we believe it's part of the Wii back-compat hardware. On the right we get to the juicy stuff. In red we see the 320 stream processors while in yellow we have the 16 texture mapping units. The chip itself is fabricated at 40nm.

The obvious suspect would be the Wii U's 1.2GHz CPU, a tri-core piece of hardware re-architected from the Wii's Broadway chip, in turn a tweaked, overclocked version of the GameCube's Gekko processor. In many of our Wii U Face-Offs we've seen substantial performance dips on CPU-specific tasks. However, there are still plenty of unknowns to factor in too - specifically the bandwidth levels from the main RAM and the exact nature of the GPU's interface to its 32MB of onboard eDRAM. While the general capabilities of the Wii U hardware are now beyond doubt, discussion will continue about how the principal processing elements and the memory are interfaced together, and Nintendo's platform-exclusive titles should give us some indication of what this core is capable of when developers are targeting it directly.

However, while we now have our most important answers, the die-shot also throws up a few more mysteries too - specifically, what is the nature of the second and third banks of RAM up on the top-left, and bearing in mind how little of the chip is taken up by the ALUs and TMUs, what else is taking up the rest of the space? Here we can only speculate, but away from other essential GPU elements such as the ROPs and the command processor, we'd put good money on the Wii U equivalent to the Wii's ARM 'Starlet' security core being a part of this hardware, along with an audio DSP. We wouldn't be surprised at all if there's a hardware video encoder in there too for compressing the framebuffer for transmission to the GamePad LCD display. The additional banks of memory could well be there for Wii compatibility, and could account for the 1MB texture and 2MB framebuffer. Indeed, the entire Wii GPU could be on there, to ensure full backwards compatibility.
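For a sense of scale on the memory figures in the paragraphs above, here is some rough buffer arithmetic (assumptions: 4 bytes per pixel for both color and depth; real formats, antialiasing and tiling change these numbers):

```python
# Rough framebuffer-size arithmetic (assumed: 4 bytes/pixel color,
# 4 bytes/pixel depth; actual formats and layouts vary).

def buffer_mb(width, height, bytes_per_pixel=4):
    # Size of one buffer in mebibytes.
    return width * height * bytes_per_pixel / (1024 ** 2)

# A Wii-era 640x480 color buffer, in the ballpark of the ~2MB
# framebuffer memory mentioned above once depth is added:
print(f"640x480 color: {buffer_mb(640, 480):.2f} MB")   # ~1.17 MB

# What the Wii U's 32MB of eDRAM has to hold at HD resolutions:
p720 = buffer_mb(1280, 720) * 2    # color + depth
p1080 = buffer_mb(1920, 1080) * 2
print(f"720p color+depth:  {p720:.1f} MB")   # ~7.0 MB
print(f"1080p color+depth: {p1080:.1f} MB")  # ~15.8 MB
```

Even a double-buffered 1080p target would fit in 32MB on these assumptions, so the eDRAM capacity itself is not the 1080p bottleneck; the ROP count and bandwidth questions raised above are.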

While there's still room for plenty of debate about the Wii U hardware, the core fundamentals are now in place and effectively we have something approaching a full spec. It took an extraordinary effort to get this far and you may be wondering quite why it took a reverse engineering specialist using ultra-magnification photography to get this information, when we already know the equivalent data for Durango and Orbis. The answer is fairly straightforward - leaks tend to derive from development kit and SDK documentation and, as we understand it, this crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.

#2 Posted by AmnesiaHaze (5683 posts) -

almost like a pc, more powerful, yet ps3/360 get all the good games :)

#3 Posted by clyde46 (46999 posts) -
Isn't there already a thread on this?
#4 Posted by GD1551 (9155 posts) -

lmao I can't believe those guys paid for a picture of the Wii U's GPU WOW!

#5 Posted by ZombieKiller7 (6255 posts) -

I don't think people buy Wiis for the horsepower.

#6 Posted by LegatoSkyheart (25993 posts) -

Isn't there already a thread on this?clyde46

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

#7 Posted by OctaBech (276 posts) -

crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware

Seriously Ninty. :( This is why 3rd party devs avoid you like the plague.
#8 Posted by SecretPolice (22725 posts) -

[QUOTE="clyde46"]Isn't there already a thread on this?LegatoSkyheart

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

After posting this, I noticed a similar thread, came back to delete but was too late as someone had already posted here - oh well.

#9 Posted by TilxWLOC (1052 posts) -

[QUOTE="LegatoSkyheart"]

[QUOTE="clyde46"]

Isn't there already a thread on this?

SecretPolice

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

After posting this, I noticed a similar thread, came back to delete but was too late as someone had already posted here - oh well.

If we are talking about the same thread, this one is more interesting and informative anyway. I still can't believe I read the whole thing, not understanding anything they said technically.

#10 Posted by Zeviander (9503 posts) -
 .
#11 Posted by SecretPolice (22725 posts) -

[QUOTE="SecretPolice"]

[QUOTE="LegatoSkyheart"]

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

TilxWLOC

After posting this, I noticed a similar thread, came back to delete but was too late as someone had already posted here - oh well.

If we are talking about the same thread, this one is more interesting and informative anyway. I still can't believe I read the whole thing, not understanding anything they said technically.

Yeah, I'm a bit biased being the TC and all but I agree :P

Oh, and the last part made me chuckle. :)

#12 Posted by nintendoboy16 (27492 posts) -
Isn't there already a thread on this?clyde46
Yep, there is.
#13 Posted by StormyJoe (5875 posts) -

So, it's marginally faster than the 360/PS3. Congrats - the first "next gen" console is barely more powerful than the previous gen's consoles. Nintendo for the win! :roll:

#14 Posted by KungfuKitten (21286 posts) -

If you make technology super power efficient, does that increase its stability? I'm trying to figure out what went through their minds. They've focussed on power consumption before, and nobody really paid attention to it. Is it just a result of making the card extremely limited in its functioning or so?

#15 Posted by LegatoSkyheart (25993 posts) -

 .Zeviander

>GPU produces less heat, which explains why the system always seems to feel cool, so Longer Life expected on the console.

>GPU is only slightly more powerful than the PS3 or 360, Architecture is still too vague to prove it, but Perfect Backwards Compatibility is achieved. 

Pretty sure this is all you need to know.

#16 Posted by Zeviander (9503 posts) -
[QUOTE="LegatoSkyheart"]>GPU produces less heat, which explains why the system always seems to feel cool, so Longer Life expected on the console. >GPU is only slightly more powerful than the PS3 or 360, Architecture is still too vague to prove it, but Perfect Backwards Compatibility is achieved. Pretty sure this is all you need to know.

Thx bro.
#17 Posted by AmazonTreeBoa (16745 posts) -
I couldn't care less what the paper stats/specs are. I only care about the end result (the games), and so far I haven't seen anything to lead me to believe the Wii U is more powerful.
#18 Posted by Heil68 (46182 posts) -
By the time a game could look better than PS3/360, God Station 4 will be out.
#19 Posted by LegatoSkyheart (25993 posts) -

I couldn't care less what the paper stats/specs are. I only care about the end result (the games), and so far I haven't seen anything to lead me to believe the Wii U is more powerful.AmazonTreeBoa

From what I'm seeing, it's not; it's what current gen is now, and that's why Iwata is getting hounded by investors.

#20 Posted by campzor (34932 posts) -
Welcome to 2006 sheep
#21 Posted by LegatoSkyheart (25993 posts) -

Welcome to 2006 sheep campzor

>2005

Xbox 360 was 2005

#22 Posted by ScreamDream (3953 posts) -

The CPU at 1.2GHz is the eye opener, but it seems to be put together well from the specs and can look better than ps3/360. I have a feeling programming to squeeze that speed out might be difficult, though.

#23 Posted by nintendoboy16 (27492 posts) -

[QUOTE="campzor"]Welcome to 2006 sheep LegatoSkyheart

>2005

Xbox 360 was 2005

Buh... Buh... Sony says next gen doesn't start until they say so.
#24 Posted by g0ddyX (3914 posts) -

Barely better than a 6 year old PS3/360.
What a lame achievement.
I guess Wii U is going to miss out on stellar next-gen engines and titles as they opted for the easier option.

#25 Posted by Martin_G_N (1734 posts) -

It's what I expected after seeing the final specs for the CPU and RAM. The WiiU has a better GPU, but it is slowed down by low RAM speeds and a slow 1.2GHz CPU, which is probably why the WiiU struggles with multiplats. Had the CPU been running at 3.2GHz as it was supposed to, and the memory bandwidth been closer to the X360 and PS3, then porting games to it might not have been so difficult.

#26 Posted by Wasdie (50380 posts) -

It being more powerful than the PS3/360 shouldn't surprise anybody. GPU tech has gotten really good and cheap recently. The PS3 uses one of the last GPUs before the unified shader architecture took over and the 360 uses one of the first models of the unified shader GPUs. Basically they have been extremely out of date for years.

The CPU speed doesn't matter as much as some people say, but it's still a factor. RAM bandwidth is another factor. I've heard bad things about the RAM bandwidth. 

The next Playstation and Xbox will run circles around this thing. Even if their rumored 1.6 ghz processors are true, they still will have far more power than the WiiU will have. If the rumors that they will run an x86 based processor, then the WiiU is going to get a tiny fraction of multiplats, if any at all. 

Nintendo has really kind of screwed themselves with the hardware this time around. They'll need to really ramp up the 1st party support as they won't be getting tons of 3rd party support once the other consoles are out.

#27 Posted by Heirren (18138 posts) -
In other words; 720 and PS4 will share games, while Wiiu and Ouya will share games.
#28 Posted by adamosmaki (9747 posts) -
In other words, the WiiU is gonna be the Wii of this generation. What a shocker. The GPU is around the level of a 4650, which means about 50% faster than what is in the PS3 or 360, but this is 2013, so the WiiU once again is way underpowered.
#29 Posted by super600 (30865 posts) -

DF seems to have ignored the WTH parts of the GPU that may make it stronger than it looks.

#30 Posted by Heirren (18138 posts) -

DF seems to have ignored the WTH parts of the GPU that may make it stronger than it looks.

super600
I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.
#31 Posted by loosingENDS (11881 posts) -

So, a garbage GPU which is a bit more powerful than a 2005 GPU, which means it is like a 2006 GPU

Add the slow RAM and the OS that takes up all the RAM LOL and the WiiU is a disaster as a next gen system

Will be a nice console though, graphics are very good even today

But will not run anything next gen

Next gen is not about going back to 2006

#32 Posted by super600 (30865 posts) -

[QUOTE="super600"]

DF seems to have ignored the WTH parts of the GPU that may make it stronger than it looks.

Heirren

I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

That's right, but I don't think it's going to be easy to determine how powerful the console may actually be.

#33 Posted by loosingENDS (11881 posts) -

[QUOTE="Heirren"][QUOTE="super600"]

DF seems to have ignored the WTH parts of the GPU that may make it stronger than it looks.

super600

I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

That's right, but I don't think it's going to be easy to determine how powerful the console may actually be.

That is easy, all multiplatforms so far look and perform better on xbox 360

It is marginally better than xbox 360, and with the slow RAM and CPU the whole thing is about as strong as xbox 360 (still have some doubts though)

With huge optimization maybe it goes a little bit above xbox 360/PS3 graphics

But that is not what next gen is about

#35 Posted by Heirren (18138 posts) -

[QUOTE="Heirren"][QUOTE="super600"]

DF seems to have ignored the WTH parts of the GPU that may make it stronger than it looks.

super600

I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

That's right, but I don't think it's going to be easy to determine how powerful the console may actually be.

Power is relative to what is done with it. The real question is whether or not it will be viable for 3rd parties to invest the time/money in a game catered solely to the wiiu. Does it really make sense to do so when they could spend that time in a ps4/720 multiplatform game and double up with the dlc - something Nintendo doesn't fully support, anyways? As a wiiu owner, I'm inclined to say it does not make sense for them to do so. This first year for the wiiu was extremely important to gather 3rd parties, but it isn't looking too good for future support at the moment. As long as Nintendo releases some stellar games I'll still be happy with the purchase, but it is unfortunate it will likely be nothing more than that. I wouldn't be surprised if there are some internal changes at Nintendo in the coming years.
#36 Posted by delta3074 (18627 posts) -
Good op SP, nice that somebody confirms what anyone with half a brain realised anyway, maybe now we can end this ridiculous 'wii-u is not a next gen console' rubbish that's been going around lately:)
#37 Posted by SaltyMeatballs (25158 posts) -

We know it's not a lot more powerful, and once again this is the assumption we make. Do we really need to look into it any deeper?

What's important is the end product, not theoretical numbers; what Wii U will do vs PS4/720 will do.

#38 Posted by loosingENDS (11881 posts) -

Good op SP, nice that somebody confirms what anyone with half a brain realised anyway, maybe now we can end this ridiculous 'wii-u is not a next gen console' rubbish that's been going around lately:)delta3074

It is a next gen console, with last gen graphics

It can't even handle multiplatforms as well as the xbox 360, so the real world performance so far is a joke

#39 Posted by delta3074 (18627 posts) -

[QUOTE="delta3074"]Good op SP, nice that somebody confirms what anyone with half a brain realised anyway, maybe now we can end this ridiculous 'wii-u is not a next gen console' rubbish that's been going around lately:)loosingENDS

It is a next gen console, with last gen graphics

It can't even handle multiplatforms as well as the xbox 360, so the real world performance so far is a joke

Just as well generations are actually decided by succession and not power then:)
#40 Posted by King_Dodongo (3351 posts) -

It being more powerful than the PS3/360 shouldn't surprise anybody. GPU tech has gotten really good and cheap recently. The PS3 uses one of the last GPUs before the unified shader architecture took over and the 360 uses one of the first models of the unified shader GPUs. Basically they have been extremely out of date for years.

The CPU speed doesn't matter as much as some people say, but it's still a factor. RAM bandwidth is another factor. I've heard bad things about the RAM bandwidth. 

The next Playstation and Xbox will run circles around this thing. Even if their rumored 1.6 ghz processors are true, they still will have far more power than the WiiU will have. If the rumors that they will run an x86 based processor, then the WiiU is going to get a tiny fraction of multiplats, if any at all. 

Nintendo has really kind of screwed themselves with the hardware this time around. They'll need to really ramp up the 1st party support as they won't be getting tons of 3rd party support once the other consoles are out.

Wasdie
The blame is on the others IMO. The development costs are already high; who knows what the PS4/720 would mean to third parties. Last gen many studios closed already because of the HD standard.
#41 Posted by silversix_ (15315 posts) -
I can't believe we're comparing a next gen system to a 7 year old console... Do you think ps4/720 will be compared to 360/ps3 or high end pc's? lol wiiu.
#42 Posted by SecretPolice (22725 posts) -

We know it's not a lot more powerful, and once again this is the assumption we make. Do we really need to look into it any deeper?

What's important is the end product, not theoretical numbers; what Wii U will do vs PS4/720 will do.

SaltyMeatballs

Hmm, perhaps it's like some sigs around here, some are only worth a glance whilst others demand a deeper look :shock:

Sorry, I just couldn't resist. :P

#43 Posted by StrongDeadlift (5184 posts) -

Lol, the Wii U GPU is in the range of an HD 4650. It's actually LOWER than the 4650 (its core is based on the R700 series of cards, aka 4xxx, so these are directly comparable).

Wii U:

Core: (320:16:8), clock = 550MHz

Radeon HD 4650:

Core: (320:32:8), clock = 650MHz

 

 

They really made the Wii U GPU not only from a line of cards that will be 5 years old in 4 months, but weaker than one of the LOWEST END cards from that line tho :|

God damn nintendo. It's like they purposely went out of their way to make the system as weak as possibru. :lol:
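The texture-rate gap implied by the post above can be put in numbers. A sketch using the post's own HD 4650 figures (reference boards are often listed at 600MHz, so treat the 650MHz clock as approximate):

```python
# Texel-fillrate comparison sketched from the specs quoted above.
# Wii U figures come from the die-shot analysis; the HD 4650 figures
# are the post's own numbers and are approximate.

def gtexels_per_sec(tmus, clock_mhz):
    # One texel fetch per TMU per clock, in gigatexels/second.
    return tmus * clock_mhz / 1000

wii_u = gtexels_per_sec(16, 550)    # 8.8 GT/s
hd4650 = gtexels_per_sec(32, 650)   # 20.8 GT/s by the post's numbers

print(f"Wii U:   {wii_u:.1f} GT/s")
print(f"HD 4650: {hd4650:.1f} GT/s")
```

By this rough measure the console's texture throughput is well under half the desktop card's, which is the "weaker than the lowest-end card" point in concrete terms.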

 

#44 Posted by Wasdie (50380 posts) -

The blame is on the others IMO. The development costs are already high; who knows what the PS4/720 would mean to third parties. Last gen many studios closed already because of the HD standard.King_Dodongo

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.

 

#45 Posted by Gue1 (10724 posts) -

Sorry! I wrote this comment on the wrong thread like a boss. :cool:

#46 Posted by SaltyMeatballs (25158 posts) -

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.Wasdie

 

You can't blame this on "unnecessary overhead", the biggest change was the transition to HD (more realistic graphics):

I'm currently transcribing net income data for several video game companies and I noticed that most of them list their development costs too (the companies name those divisions slightly differently, but it's the same).

While it's nothing new that game companies invest more money nowadays, I found it interesting how steep the curve is.

It's not adjusted for inflation, because it wouldn't change the graph (I tested it with one company).

Note: All those companies have lots of other expenses; I'll just focus on this part anyway.
sources: Annual reports / SEC - filings
Read: FY3 2003 = Fiscal year ended in March 2003 = April 2002 - March 2003 (FY10 = October, etc.)

Development cost charts (images not reproduced) for: Electronic Arts, Activision (Blizzard), Take-Two, Ubisoft, and THQ (Notice: FY3 2003 is just one quarter).

 Captain Smoker

http://www.neogaf.com/forum/showpost.php?p=47255603&postcount=1

#47 Posted by Heirren (18138 posts) -

[QUOTE="King_Dodongo"] The blame is on the others IMO. The development costs are already high; who knows what the PS4/720 would mean to third parties. Last gen many studios closed already because of the HD standard.Wasdie

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.

 

I think Nintendo's comment about the difference between the wiiu and 720/ps4 not being as large as wii/ps360 was them convincing themselves that as long as the output image is 720/1080p, gamers would not see much of a difference.
#48 Posted by SaltyMeatballs (25158 posts) -

[QUOTE="Wasdie"]

[QUOTE="King_Dodongo"] The blame is on the others IMO. The development costs are already high; who knows what the PS4/720 would mean to third parties. Last gen many studios closed already because of the HD standard.Heirren

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.

 

I think Nintendo's comment about the difference between the wiiu and 720/ps4 not being as large as wii/ps360 was them convincing themselves that as long as the output image is 720/1080p, gamers would not see much of a difference.

Maybe they have a point. I'm being serious: I was surprised when I saw people say the UE4 tech demo didn't look impressive (not Samaritan, the fire demon demo). It just shows how much trickery the developers achieved on 360/PS3.

#49 Posted by AcidThunder (2332 posts) -

It's as if Nintendo has already given up. It isn't even trying anymore. Seriously, this is the best they could come up with?
Nintendo might have taken the 7th gen but they are surely not gonna take the 8th. 

#50 Posted by AM-Gamer (4440 posts) -

We already knew this. It's slightly more powerful than the PS3/360, far weaker than the next PS and Xbox.