Hard facts on the Nintendo GPU emerge. So is it more powerful than PS3 and Xbox 360?


This topic is locked from further discussion.


#1 SecretPolice
Member since 2007 • 44061 Posts

Whelp, there ya go, but who really thought it wasn't at least a little more powerful than 360/PS3?

So, perhaps before the MS Sony consoles arrive, the U just may nab itself a graphics queen crown. :P

Discuss if you wish.

The Story.

---

http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Wii U graphics power finally revealed

Hard facts on the Nintendo GPU emerge. So is it more powerful than PS3 and Xbox 360?

Despite its general release two months ago, Nintendo's Wii U console has remained something of a technological mystery. We quickly gained a good idea of the make-up of the IBM tri-core CPU, but the system's apparent strengths lie in its graphics hardware, and in that regard we had little or no idea of the composition of the Radeon core. Indeed, it's safe to say that we knew much more about the graphics processors in the next-generation Xbox and PlayStation. Until now.

Detailed polysilicon die photography of the Wii U's GPU has now been released, showing the hardware make-up at the nano level and resolving most of the outstanding mysteries. However, the story of how these photos came into existence is a fascinating tale in itself: community forum NeoGAF noticed that Chipworks was selling Wii U reverse-engineering photography on its website, with shots of the principal silicon being offered at $200 a pop. Seeking to draw a line under the controversy surrounding the Nintendo hardware, a collection was started to buy the photos.

There was just one problem. The shots were simply higher quality versions of what had already been revealed on sites like Anandtech - good for getting an idea of the amount of silicon used and the make-up of the overall design, but without the ultra-magnification required to provide answers, and therefore of no further use in unearthing the secrets of the Wii U hardware. At this point, Chipworks itself became aware of the community money-raising effort, and decided to help out by providing the required shot - for free. It's a remarkably generous gesture bearing in mind that the cost of carrying out this work is, as Chipworks' Rob Williamson told us, "non-trivial".

"Sourcing the images required to nail down the make-up of the Wii U GPU was the result of a remarkable community effort, plus the generosity of reverse engineering specialists, Chipworks."


The NeoGAF community started a collection to buy Chipworks' reverse-engineering photos of the Wii U processors, but alas the existing photography wouldn't have told us anything new. On the left we see the markings on the heatspreader, while on the right we see the metal shield removed and the GPU and CPU revealed.

So, what does the new shot below actually tell us? Well, first of all, let's be clear about how we draw our conclusions. Graphics cores work principally by spreading work in parallel over a vast array of processors. On the die shot, this manifests as the same mini-blocks of transistors "copied and pasted" next to one another. We know that the Wii U hardware is based on AMD's RV770 line of processors - essentially the Radeon HD 4xxx cards - so we have some point of comparison with existing photography of equivalent AMD hardware.
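
To make the "copied and pasted" idea concrete, here's a tiny, purely illustrative Python sketch (nothing Wii U-specific, and the numbers are made up): a GPU earns its throughput by running the same small program independently over a huge array of pixels, which is why the die is dominated by repeated, identical blocks.

# Purely illustrative: one tiny "shader" applied element by element to a pixel
# array, conceptually all in parallel - the hardware equivalent is many copies
# of the same simple processing block sitting side by side on the die.
pixels = [0.1, 0.4, 0.7, 0.9] * 4        # pretend framebuffer values

def shade(p):                            # one small per-pixel program
    return min(1.0, p * 1.2)

shaded = [shade(p) for p in pixels]      # same work spread across many elements
print(shaded)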

Chipworks' shot is still being analysed, but the core fundamentals are now seemingly beyond doubt. The Wii U GPU core features 320 stream processors married up with 16 texture mapping units and 8 ROPs. After the Wii U's initial reveal at E3 2011, our take on the hardware was more reserved than most. "We reckon it probably has more in common with the Radeon HD 4650/4670 as opposed to anything more exotic," we said at the time. "The 320 stream processors on those chips would have more than enough power to support 360 and PS3 level visuals, especially in a closed-box system."

It was ballpark speculation at the time based on what we had eyeballed at the event, but the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. 1080p resolution is around 2.5x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.
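
As a rough sanity check on the "1.5 times the raw shader power" and resolution remarks above, the arithmetic works out as in the short sketch below. It assumes the usual two FLOPs (one multiply-add) per stream processor per clock, and the Xenos figures used for comparison (240 shader ALUs at 500MHz) are the commonly quoted Xbox 360 numbers rather than anything read off this die-shot.

# Theoretical shader throughput, assuming 2 FLOPs per stream processor per clock.
wii_u_gflops = 320 * 2 * 0.550    # 352 GFLOPS at 550MHz
xenos_gflops = 240 * 2 * 0.500    # 240 GFLOPS at 500MHz (commonly quoted 360 figure)
print(round(wii_u_gflops / xenos_gflops, 2))    # 1.47 - i.e. roughly 1.5x

# Pixel counts behind the 720p vs 1080p point.
print((1920 * 1080) / (1280 * 720))             # 2.25x as many pixels per frame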

All of which may lead some to wonder quite why many of the Wii U ports disappoint - especially Black Ops 2, which appears to have been derived from the Xbox 360 version, running more slowly even at the same 880x720 sub-hd resolution. The answer comes from a mixture of known and unknown variables.

"Plenty of questions remain about the Wii U hardware, but this die-shot tells us everything we need to know about the core make-up of the Radeon graphics processor."


Chipworks' polysilicon die-shot of the Wii U graphics core, with some tentative annotations added by us. In the cyan area we have the 32MB of eDRAM - fast memory contained within the GPU itself. Above that we have two more areas of embedded memory - this is unconfirmed but we believe it's part of the Wii back-compat hardware. On the right we get to the juicy stuff. In red we see the 320 stream processors while in yellow we have the 16 texture mapping units. The chip itself is fabricated at 40nm.

The obvious suspect would be the Wii U's 1.2GHz CPU, a tri-core piece of hardware re-architected from the Wii's Broadway chip, in turn a tweaked, overclocked version of the GameCube's Gekko processor. In many of our Wii U Face-Offs we've seen substantial performance dips on CPU-specific tasks. However, there are still plenty of unknowns to factor in too - specifically the bandwidth levels from the main RAM and the exact nature of the GPU's interface to its 32MB of onboard eDRAM. While the general capabilities of the Wii U hardware are now beyond doubt, discussion will continue about how the principal processing elements and the memory are interfaced together, and Nintendo's platform-exclusive titles should give us some indication of what this core is capable of when developers are targeting it directly.
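
For a sense of scale on that 32MB of eDRAM, here's a rough render-target sizing sketch. The assumptions (a plain 32-bit colour buffer plus a 32-bit depth/stencil buffer, no MSAA) are ours for illustration and not figures from the article or the die-shot.

# Rough framebuffer sizing under the stated assumptions.
bytes_per_pixel = 4 + 4                              # 32-bit colour + 32-bit depth/stencil
fb_720p = 1280 * 720 * bytes_per_pixel / 2**20       # ~7.0 MB
fb_1080p = 1920 * 1080 * bytes_per_pixel / 2**20     # ~15.8 MB
print(round(fb_720p, 1), round(fb_1080p, 1))         # both would fit within 32MB of eDRAM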

However, while we now have our most important answers, the die-shot also throws up a few more mysteries - specifically, what is the nature of the second and third banks of RAM up on the top-left, and bearing in mind how little of the chip is taken up by the ALUs and TMUs, what else is taking up the rest of the space? Here we can only speculate, but away from other essential GPU elements such as the ROPs and the command processor, we'd put good money on the Wii U equivalent to the Wii's ARM 'Starlet' security core being a part of this hardware, along with an audio DSP. We wouldn't be surprised at all if there's a hardware video encoder in there too for compressing the framebuffer for transmission to the GamePad LCD display. The additional banks of memory could well be there for Wii compatibility, and could account for the Wii's 1MB texture memory and 2MB framebuffer. Indeed, the entire Wii GPU could be on there, to ensure full backwards compatibility.

While there's still room for plenty of debate about the Wii U hardware, the core fundamentals are now in place and effectively we have something approaching a full spec. It took an extraordinary effort to get this far and you may be wondering quite why it took a reverse engineering specialist using ultra-magnification photography to get this information, when we already know the equivalent data for Durango and Orbis. The answer is fairly straightforward - leaks tend to derive from development kit and SDK documentation and, as we understand it, this crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.


#2 AmnesiaHaze
Member since 2008 • 5685 Posts

almost like a pc, more powerful, yet ps3/360 get all the good games :)


#3 clyde46
Member since 2005 • 49061 Posts
Isn't there already a thread on this?

#4 GD1551
Member since 2011 • 9645 Posts

lmao I can't believe those guys paid for a picture of the Wii U's GPU WOW!


#5 ZombieKiller7
Member since 2011 • 6463 Posts

I don't think people buy Wiis for the horsepower.


#6 LegatoSkyheart
Member since 2009 • 29733 Posts

Isnt there already a thread on this?clyde46

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.


#7 OctaBech
Member since 2008 • 276 Posts

crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware

Seriously, Ninty. :( This is why 3rd party devs avoid you like the plague.

#8 SecretPolice
Member since 2007 • 44061 Posts

[QUOTE="clyde46"]Isnt there already a thread on this?LegatoSkyheart

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

After posting this, I noticed a similar thread, came back to delete but was too late as someone had already ported here - oh well,


#9 TilxWLOC
Member since 2011 • 1164 Posts

[QUOTE="LegatoSkyheart"]

[QUOTE="clyde46"]

Isnt there already a thread on this?

SecretPolice

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

After posting this, I noticed a similar thread, came back to delete but was too late as someone had already ported here - oh well,

If we are talking about the same thread, this one is more interesting and informative anyway. I still can't believe I read the whole thing, not understanding anything they said technically.


#10 Zeviander
Member since 2011 • 9503 Posts
 .

#11 SecretPolice
Member since 2007 • 44061 Posts

[QUOTE="SecretPolice"]

[QUOTE="LegatoSkyheart"]

I think so, the thread was all about how "Innovative" the hardware was because it generates less heat than most and the GPU was modified just so we could have Backwards compatibility.

or something like that.

TilxWLOC

After posting this, I noticed a similar thread, came back to delete but was too late as someone had already ported here - oh well,

If we are talking about the same thread, this one is more interesting and informative anyway. I still can't believe I read the whole thing, not understanding anything they said technically.

Yeah, I'm a bit biased being the TC and all but I agree :P

Oh, and the last part made me chuckle. :)


#12 nintendoboy16
Member since 2007 • 41532 Posts
Isnt there already a thread on this?clyde46
Yep, there is.

#13 StormyJoe
Member since 2011 • 7806 Posts

So, it's marginally faster than the 360/PS3. Congrats - the first "next gen" console is barely more powerful than the previous gen's consoles. Nintendo for the win! :roll:


#14 KungfuKitten
Member since 2006 • 27389 Posts

If you make technology super power efficient, does that increase its stability? I'm trying to figure out what went through their minds. They've focussed on power consumption before, and nobody really paid attention to it. Is it just a result of making the card extremely limited in its functioning or so?


#15 LegatoSkyheart
Member since 2009 • 29733 Posts

 .Zeviander

>GPU produces less heat, which explains why the system always seems to feel cool, so Longer Life expected on the console.

>GPU is only slightly more powerful than the PS3 or 360, Architecture is still too vague to prove it, but Perfect Backwards Compatibility is achieved. 

Pretty sure this is all you need to know.


#16 Zeviander
Member since 2011 • 9503 Posts
[QUOTE="LegatoSkyheart"]>GPU produces less heat, which explains why the system always seems to feel cool, so Longer Life expected on the console. >GPU is only slightly more powerful than the PS3 or 360, Architecture is still too vague to prove it, but Perfect Backwards Compatibility is achieved. Pretty sure this is all you need to know.

Thx bro.

#17 AmazonTreeBoa
Member since 2011 • 16745 Posts
I couldn't care less what the paper stats/specs are. I only care about the end result (the games), and so far I haven't seen anything to lead me to believe the Wii U is more powerful.

#18 Heil68
Member since 2004 • 60713 Posts
By the time a game could look better than PS3/360, God Station 4 will be out.

#19 LegatoSkyheart
Member since 2009 • 29733 Posts

I couldn't care less what the paper stats/specs are. I only care about the end result (the games), and so far I haven't seen anything to lead me to believe the Wii U is more powerful.AmazonTreeBoa

From what I'm seeing, it's not; it's what current gen is now, and that's why Iwata is getting hounded by investors.


#20 campzor
Member since 2004 • 34932 Posts
Welcome to 2006 sheep

#21 LegatoSkyheart
Member since 2009 • 29733 Posts

Welcome to 2006 sheep campzor

>2005

Xbox 360 was 2005


#22 ScreamDream
Member since 2006 • 3953 Posts

The CPU at 1.2GHz is the eye-opener, but it seems to be put together well from the specs and can look better than ps3/360. I have a feeling programming to squeeze that speed out might be difficult, though.


#23 nintendoboy16
Member since 2007 • 41532 Posts

[QUOTE="campzor"]Welcome to 2006 sheep LegatoSkyheart

>2005

Xbox 360 was 2005

Buh... Buh... Sony says next gen doesn't start until they say so.

#24 g0ddyX
Member since 2005 • 3914 Posts

Barely better than a 6 year old PS3/360.
What a lame achievement.
I guess the Wii U is going to miss out on stellar next-gen engines and titles as they opted for the easier option.


#25 Martin_G_N
Member since 2006 • 2124 Posts

It's what I expected after seeing the final specs for the CPU and RAM. The WiiU has a better GPU, but it is slowed down by low RAM speeds and a slow 1.2GHz CPU, which is probably why the WiiU struggles with multiplats. Had the CPU been running at 3.2GHz as it was supposed to, and the memory bandwidth been closer to the X360 and PS3, then porting games to it might not have been so difficult.


#26 Wasdie  Moderator
Member since 2003 • 53622 Posts

It being more powerful than the PS3/360 shouldn't surprise anybody. GPU tech has gotten really good and cheap recently. The PS3 uses one of the last GPUs before the unified shader architecture took over and the 360 uses one of the first models of the unified shader GPUs. Basically they have been extremely out of date for years.

The CPU speed doesn't matter as much as some people say, but it's still a factor. RAM bandwidth is another factor. I've heard bad things about the RAM bandwidth.

The next Playstation and Xbox will run circles around this thing. Even if their rumored 1.6GHz processors are true, they will still have far more power than the WiiU. If the rumors that they will run an x86-based processor are true, then the WiiU is going to get a tiny fraction of multiplats, if any at all.

Nintendo has really kind of screwed themselves with the hardware this time around. They'll need to really ramp up the 1st party support as they won't be getting tons of 3rd party support once the other consoles are out.


#27 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts
In other words: 720 and PS4 will share games, while the Wii U and Ouya will share games.

#28 adamosmaki
Member since 2007 • 10718 Posts
In other words, the WiiU is gonna be the Wii of this generation. What a shocker. The GPU is around the level of a 4650, which means about 50% faster than what is in the PS3 or 360, but this is 2013, so the WiiU is once again way underpowered.

#29 super600  Moderator
Member since 2007 • 33103 Posts

DF seems to have ignored the WTH parts of the GPU that may make it stronger than it looks.


#30 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

DF seems to have ignore the WTH parts of the GPU that may make it look stronger then it looks.

super600
I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

#31 loosingENDS
Member since 2011 • 11793 Posts

So, a garbage GPU which is a bit more powerful than a 2005 GPU, which means it is like a 2006 GPU

Add the slow RAM and an OS that takes up all the RAM LOL and the WiiU is a disaster as a next gen system

Will be a nice console though, graphics are very good even today

But will not run anything next gen

Next gen is not about going back to 2006


#32 super600  Moderator
Member since 2007 • 33103 Posts

[QUOTE="super600"]

DF seems to have ignore the WTH parts of the GPU that may make it look stronger then it looks.

Heirren

I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

That's right, but I don't think it's going to be easy to determine how powerful the console may actually be.


#33 loosingENDS
Member since 2011 • 11793 Posts

[QUOTE="Heirren"][QUOTE="super600"]

DF seems to have ignore the WTH parts of the GPU that may make it look stronger then it looks.

super600

I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

That's right, but I don't think it's going ot be easy to determine how powerful the console may actually be.

That is easy, all multiplatforms so far look and perform better on xbox 360

It is marginally better than the xbox 360 one, and with the slow RAM and CPU the whole thing is about as strong as xbox 360 (still have some doubts though)

With huge optimization maybe it goes a little bit above xbox 360/PS3 graphics

But that is not what next gen is about


#35 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

[QUOTE="Heirren"][QUOTE="super600"]

DF seems to have ignore the WTH parts of the GPU that may make it look stronger then it looks.

super600

I think the gist of the article is that wiiu will simply pale in comparison to the other two consoles.

That's right, but I don't think it's going ot be easy to determine how powerful the console may actually be.

Power is relative to what is done with it. The real question is whether or not it will be viable for 3rd parties to invest the time/money in a game catered solely to the wiiu. Does it really make sense to do so when they could spend that time on a ps4/720 multiplatform game and double up with the dlc--something Nintendo doesn't fully support, anyways? As a wiiu owner, I'm inclined to say it does not make sense for them to do so. This first year for the wiiu was extremely important to gather 3rd parties, but it isn't looking too good for future support at the moment. As long as Nintendo releases some stellar games I'll still be happy with the purchase, but it is unfortunate it will likely be nothing more than that. I wouldn't be surprised if there are some internal changes at Nintendo in the coming years.

#36 delta3074
Member since 2007 • 20003 Posts
Good OP SP, nice that somebody confirms what anyone with half a brain realised anyway. Maybe now we can end this ridiculous 'wii-u is not a next gen console' rubbish that's been going around lately :)

#37 SaltyMeatballs
Member since 2009 • 25165 Posts

We know it's not a lot more powerful, and once again this is the assumption we make. Do we really need to look into it any deeper?

What's important is the end product, not theoretical numbers; what Wii U will do vs PS4/720 will do.


#38 loosingENDS
Member since 2011 • 11793 Posts

Good op SP, nice that somebody confirms what anyone with half a brain realised anyway, maybe now we can end this rediculous 'wii-u is not a next gen console' rubbish thats been going around lately:)delta3074

It is a next gen console, with last gen graphics

It can't even handle multiplatforms as well as the xbox360, so the real world performance so far is a joke


#39 delta3074
Member since 2007 • 20003 Posts

[QUOTE="delta3074"]Good op SP, nice that somebody confirms what anyone with half a brain realised anyway, maybe now we can end this rediculous 'wii-u is not a next gen console' rubbish thats been going around lately:)loosingENDS

It is a next gen console, with last gen graphics

It cant even handle multipltforms as good as xbox360, so the real world performance so far is a joke

Just as well generations are actually decided by succession and not power then:)

#40 King_Dodongo
Member since 2006 • 3759 Posts

It being more powerful than the PS3/360 shouldn't suprise anybody. GPU tech has gotten really good and cheap recently. The PS3 uses one of the last GPUs before the unified shader architecture took over and the 360 uses one of the first models of the unified shader GPUs. Bascially they have been extremely out of date for years. 

The CPU speed doesn't matter as much as some people say, but it's still a factor. RAM bandwidth is another factor. I've heard bad things about the RAM bandwidth. 

The next Playstation and Xbox will run circles around this thing. Even if their rumored 1.6 ghz processors are true, they still will have far more power than the WiiU will have. If the rumors that they will run an x86 based processor, then the WiiU is going to get a tiny fraction of multiplats, if any at all. 

Nintendo has really kind of screwed themselves with the hardware this time around. They'll need to really ramp up the 1st party support as they won't be getting tons of 3rd party support once the other consoles are out.

Wasdie
The bad is on the others IMO. The development costs are already high, who knows what the PS4/720 would mean to third parties. Last gen many studios closed already cause of the HD standard.

#41 silversix_
Member since 2010 • 26347 Posts
I can't believe we're comparing a next gen system to a 7-year-old console... Do you think the ps4/720 will be compared to the 360/ps3 or to high end pc's? lol wiiu.

#42 SecretPolice
Member since 2007 • 44061 Posts

We know it's not a lot more powerful, and once again this is the assumption we make. Do we really need to look into it any deeper?

What's important is the end product, not theoretical numbers; what Wii U will do vs PS4/720 will do.

SaltyMeatballs

Hmm, perhaps it's like some sigs around here, some are only worth a glance whilst others demand a deeper look :shock:

Sorry, I just couldn't resist. :P


#43 StrongDeadlift
Member since 2010 • 6073 Posts

Lol, the Wii U GPU is in the range of a HD4650. It's actually LOWER than the 4650 (its core is based off of the R700 series of cards, aka 4xxx, so these are directly comparable)

Wii U:

Core = (320:16:8), clock = 550mhz

Radeon HD 4650:

Core: (320:32:8), clock = 650mhz

 

 

 They really made the Wii U GPU not only from a line of cards that will be 5 years old in 4 months, but weaker than one of the LOWEST END cards from that line of cards tho :|
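
Putting rough numbers on that texture-unit gap, here's a quick sketch using the unit counts and clocks as quoted in this post, and the usual simplification of one bilinear texel per TMU per clock (an assumption for illustration, not a measured figure):

# Peak texel rates from the figures quoted above.
wii_u_texels = 16 * 0.550     # 8.8 Gtexels/s (16 TMUs at 550MHz)
hd4650_texels = 32 * 0.650    # 20.8 Gtexels/s (32 TMUs at the 650MHz quoted above)
print(wii_u_texels, hd4650_texels)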

God damn nintendo. It's like they purposely went out of their way to make the system as weak as possibru. :lol:

 


#44 Wasdie  Moderator
Member since 2003 • 53622 Posts

 The bad is on the others IMO. The development costs are already high, who knows what the PS4/720 would mean to third parties. Last gen many studios closed already cause of the HD standard.King_Dodongo

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.

 


#45 Gue1
Member since 2004 • 12171 Posts

Sorry! I wrote this comment on the wrong thread like a boss. :cool:


#46 SaltyMeatballs
Member since 2009 • 25165 Posts

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.Wasdie

 

You can't blame this on "unnecessary overhead"; the biggest change was the transition to HD (more realistic graphics):

I'm currently transcribing net income data for several video game companies and I noticed that most of them list their development costs too (the companies name those divisions slightly differently, but it's the same thing).

While it's nothing new that game companies invest more money nowadays, I found it interesting how steep the curve is.

It's not adjusted for inflation, because it wouldn't change the graph (I tested it with one company).

Note: All those companies have lots of other expenses; I'll just focus on this part anyway.
Sources: Annual reports / SEC filings
Read: FY3 2003 = Fiscal year ended in March 2003 = April 2002 - March 2003 (FY10 = October, etc.)

[Charts omitted: development cost figures over time for Electronic Arts, Activision (Blizzard), Take-Two, Ubisoft and THQ. Notice: FY3 2003 is just one quarter.]

Captain Smoker

http://www.neogaf.com/forum/showpost.php?p=47255603&postcount=1


#47 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

[QUOTE="King_Dodongo"] The bad is on the others IMO. The development costs are already high, who knows what the PS4/720 would mean to third parties. Last gen many studios closed already cause of the HD standard.Wasdie

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.

 

I think Nintendo's comment about the difference between the wiiu and 720/ps4 not being as large as wii/ps360 was them convincing themselves that, as long as the output image is 720/1080p, gamers would not see much of a difference.

#48 SaltyMeatballs
Member since 2009 • 25165 Posts

[QUOTE="Wasdie"]

[QUOTE="King_Dodongo"] The bad is on the others IMO. The development costs are already high, who knows what the PS4/720 would mean to third parties. Last gen many studios closed already cause of the HD standard.Heirren

"HD standard" had nothing to do with it. 

PC devs have been thriving despite the push for better graphics even in lower budget games. Some of the best looking games on the PC are done by smaller studios operating with much smaller budgets than the big guys.

Publishers and licensing fees from the big game console manufacturers add tons and tons of unnecessary overhead where studios cannot cope. The actual development costs aren't the biggest issue anymore. It's keeping up with publisher's demands. 

Microsoft, Sony, and Nintendo add tons of unnecessary overhead that hurts game development. Big publishers like EA, Ubisoft, Activision, and all of the rest are doing more damage to game studios than the "HD standard".

Games like ArmA 2, The Witcher 2, Hard Reset, and Trine 2, all prove that you can have amazing looking games with a smaller budget. Hell, Crysis was a game that cost under 10 million to make.

 

I think nintendos comment about the difference between wiiu and 720/ps4 not being as large as wii/ps360 was them convincing themselves that as long as the output image is 720/1080p, gamers would not see much of a difference.

Maybe they have a point. I'm being serious; I was surprised when I saw people say the UE4 tech demo didn't look impressive (not Samaritan, the fire demon demo). Just shows how much trickery the developers achieved on 360/PS3.


#49 AcidThunder
Member since 2010 • 2332 Posts

It's as if Nintendo has already given up. It isn't even trying anymore. Seriously, this is the best they could come up with?
Nintendo might have taken the 7th gen but they are surely not gonna take the 8th. 


#50 AM-Gamer
Member since 2012 • 8116 Posts

We already knew this. It's slightly more powerful than the PS3/360, and far weaker than the next PS and Xbox.