Xbox One = 7790 confirmed by Xbox One architect.


#1 Posted by tormentos (16754 posts) -

 

 

"Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."
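A quick sanity check of the trade-off Goossen describes, assuming GCN's standard 64 shaders per CU at 2 FLOPs per cycle, and the publicly reported 800MHz base and 853MHz upgraded clocks (the clock figures are assumptions from public specs, not from this quote):

# Rough GCN ALU math: 12 CUs at the raised clock vs. 14 CUs at the base clock.
# 64 shaders per CU and 2 FLOPs per cycle (FMA) are standard GCN figures;
# the 800MHz/853MHz clocks are assumed from public Xbox One reporting.
def gflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz

print(gflops(12, 0.853))  # ~1310 GFLOPS with the 6.6% clock upgrade
print(gflops(14, 0.800))  # ~1434 GFLOPS with 14 CUs at the base clock
# Raw ALU favors 14 CUs; the claim above is that measured launch titles did not scale that way.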

 

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

 

I just destroyed Ronvalencia's argument. Like I claimed many times, the Xbox One is a 7790 with 2 CUs disabled for yields...:lol:

 

Total bullsh**, so MS is telling people that 12 CUs are better than 14 on GCN..:lol:

 

"The same discussion with ESRAM as well - the 204GB/s number that was presented at Hot Chips is taking known limitations of the logic around the ESRAM into account. You can't sustain writes for absolutely every single cycle. The writes are known to insert a bubble [a dead cycle] occasionally... one out of every eight cycles is a bubble, so that's how you get the combined 204GB/s as the raw peak that we can really achieve over the ESRAM. And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM."
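The arithmetic behind that 204GB/s figure works out if you take the widely reported 109GB/s per direction for the ESRAM, with reads sustained every cycle and writes losing one cycle in eight; the per-direction figure is an assumption from other public coverage, not from this quote. A minimal sketch:

# ESRAM peak per the "bubble" description above: reads run every cycle,
# writes lose one cycle out of eight. The 1024-bit path at 853MHz
# (about 109GB/s per direction) is assumed from public Xbox One specs.
one_way = 0.853e9 * 128 / 1e9         # 128 bytes/cycle -> ~109.2 GB/s
combined = one_way + one_way * 7 / 8  # full-rate reads + de-rated writes
print(combined)                       # ~204.7 GB/s, matching the quoted peak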

 

 

Yeah, 140GB/s to 150GB/s, not 204GB/s. It can't be sustained, so the practical figure is 140 to 150GB/s.

 

68GB/s + 140GB/s = 208GB/s

68GB/s + 150GB/s = 218GB/s

PS4:

176GB/s + 40GB/s from the CPU and Onion = 216GB/s
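A sketch of the arithmetic being argued here, nothing more: it adds the pools together exactly as the post does, which only holds if the DDR3 and ESRAM buses really are saturated in parallel, and the PS4's extra 40GB/s is the post's own claim, not a confirmed spec (both points are disputed later in the thread):

# The post's additive bandwidth model: total = sum of the parallel buses.
# Only valid if every bus can be kept busy at the same time, which is
# exactly what later replies in this thread dispute.
xb1_ddr3 = 68                  # GB/s, DDR3 pool
for esram in (140, 150):       # GB/s, measured ESRAM range quoted above
    print("XB1:", xb1_ddr3 + esram, "GB/s")   # 208 and 218

ps4_gddr5 = 176                # GB/s, GDDR5 peak
ps4_extra = 40                 # GB/s, the post's claimed CPU/Onion addition (disputed)
print("PS4:", ps4_gddr5 + ps4_extra, "GB/s")  # 216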

 

Also it's very funny how Digital Foundry is doing MS PR...:lol:

And it's always Leadbetter..

 

Goosen also believes that leaked Sony documents on VGLeaks bear out Microsoft's argument:

"Sony was actually agreeing with us. They said that their system was balanced for 14 CUs. They used that term: balance. Balance is so important in terms of your actual efficient design.

 

The MS architect is riding on something that was proven to be false; there never was 14+4...:lol:

 

 

"We think we have very good balance, very good performance, we have a product which can handle things other than just raw ALU [GPU compute power]."

 

This is pure bullsh** at its best, MS trying to act as if the GCN inside the Xbox were from Nvidia..:lol:

 

"On the SoC, there are many parallel engines - some of those are more like CPU cores or DSP cores. How we count to fifteen: [we have] eight inside the audio block, four move engines, one video encode, one video decode and one video compositor/resizer," says Nick Baker.

 

MS's 15 different processors:

8 are audio related.

4 move engines.

1 video encode.

1 video decode.

And 1 video compositor/resizer.

 

:lol::lol::lol:

This keeps getting better and better. So the 15 processors, as predicted, are sh** that is also inside the PS4, all except the move engines, which we know exist to move data, not to process sh**..

MS is counting even the batteries in the Xbox One controller to claim better specs..:lol:

 

 

#2 Posted by Suppaman100 (3708 posts) -
PS4 much stronger than Xbone confirmed. TLHBO
#3 Posted by -Snooze- (7304 posts) -

The 7790 is a poor, poor card. I don't understand why Sony and MS went in with such poor GPUs. I understand the CPU being weak as shit, but why the GPU?

#4 Posted by GuNsbl4ziN (285 posts) -
Who cares about consoles when you can have a PC?
#5 Posted by marklarmer (3883 posts) -

lol will you be able to sleep now?

#6 Posted by Davekeeh (4019 posts) -

Well done Tormentos, the tech king of SW

#7 Posted by Suppaman100 (3708 posts) -
Who cares about consoles when you can have a PC?GuNsbl4ziN
Because there's nothing good to play on PC now.
#8 Posted by ramonnl (769 posts) -

The 7790 is a poor, poor card. I don't understand why Sony and MS went in with such poor GPUs. I understand the CPU being weak as shit, but why the GPU?

-Snooze-

Because otherwise the PS4 would cost 600-700 euros again; that didn't work out so well with the PS3.

#9 Posted by Stinger78 (5826 posts) -
Ok. It's nice to know. At least if I ever get an Xbox One, I'll be sure to keep that in mind....oh wait, just like any other generation, all that matters is whether the games are entertaining.
#10 Posted by Stinger78 (5826 posts) -
[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

Ahh.. Guess that Humble Origin Bundle, Humble Indie 9 Bundle, and "X" collection I got were all fake, as well as all the other games I already own on Steam and Origin.
#11 Posted by GuNsbl4ziN (285 posts) -
[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

Yeah, I agree. There is almost too much good to play.
#12 Posted by campzor (34932 posts) -
Lems getting the subpar console yet again.
#13 Posted by tormentos (16754 posts) -

Ok. It's nice to know. At least if I ever get an Xbox One, I'll be sure to keep that in mind....oh wait, just like any other generation, all that matters is whether the games are entertaining. Stinger78

 

 

Well, that is right; the highest-rated games this past gen are from the Wii. So yeah, the point of my thread is to prove how silly some posters here are, who refuse to see what MS did and what it is doing now. The Xbox One will have some great games, I am sure, but my point wasn't about that.

#14 Posted by EG101 (857 posts) -

Yeah, 140GB/s to 150GB/s, not 204GB/s. It can't be sustained, so the practical figure is 140 to 150GB/s.

68GB/s + 140GB/s = 208GB/s

68GB/s + 150GB/s = 218GB/s

PS4:

176GB/s + 40GB/s from the CPU and Onion = 216GB/s
tormentos

This is definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed the PEAK, which is 176GB/s. There is only one RAM pool inside the PS4, and that has a peak of 176GB/s, but peak is rarely maintained. The reason you can add bandwidth inside the XB1 is because there are two pools of RAM with separate buses that can BOTH run in parallel, something the PS4 does NOT have.

#15 Posted by wiiutroll (135 posts) -

[QUOTE="tormentos"]

 

 

Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."EG101

 

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

 

I just destroyed Ronvalencia argument,like i claim many times the xbox one was a 7790 with 2CU disable for yields...:lol:

 

Total bullsh** so MS is telling people that 12 CU are better than 14 on GCN..:lol:

 

The same discussion with ESRAM as well - the 204GB/s number that was presented at Hot Chips is taking known limitations of the logic around the ESRAM into account. You can't sustain writes for absolutely every single cycle. The writes is known to insert a bubble [a dead cycle] occasionally... one out of every eight cycles is a bubble so that's how you get the combined 204GB/s as the raw peak that we can really achieve over the ESRAM. And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM

 

 

Yeah 140Gb/s 150GB/s not 204Gb/s it can't be sustained,so practical 140 to 150GB/s.

 

68GB/s + 140 = 208GB/s

68GB/s + 150 = 218GB/s

PS4

176GB/s + 40GB/s from CPU and Onion = 216GB/s

 

Also is ery funny how Digital Foundry is doing MS PR...:lol:

And is always leadbetter..

 

Goosen also believes that leaked Sony documents on VGLeaks bear out Microsoft's argument:

"Sony was actually agreeing with us. They said that their system was balanced for 14 CUs. They used that term: balance. Balance is so important in terms of your actual efficient design.

 

MS Architect riding in something that was prove to be false there never was 14+4...:lol:

 

We think we have very good balance, very good performance, we have a product which can handle things other than just raw ALU [GPU compute power].

 

This is pure bullsh** at its best MS trying to act as if the GCN inside the xbox was from Nvidia..:lol:

 

This definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed the PEAK which is 176GB/S. There is only 1 RAM Pool inside the PS4 and that has a PEAK of 176GB/S but peak is rarely maintained. The reason you can add BW inside the XB1 is because there are 2 pools of ram with seperate buses that can BOTH run in Parallel something the PS4 Does NOT have.

that's what i was thinking

#16 Posted by SKaREO (3161 posts) -
[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

Exactly this. PC gaming is dead now, get over it.
#17 Posted by RoslindaleOne (7565 posts) -
Looked like Ron owned you big time. Let it go.
#18 Posted by Elitro (578 posts) -

[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

 

You're right... Dragon Age, Dark Souls 2 and Witcher 3 are coming next year.

#19 Posted by silversix_ (13899 posts) -
Stuck for 10 years with a 7790 :lol: Good luck to you Lems rofl
#20 Posted by tormentos (16754 posts) -

 

This is definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed the PEAK, which is 176GB/s. There is only one RAM pool inside the PS4, and that has a peak of 176GB/s, but peak is rarely maintained. The reason you can add bandwidth inside the XB1 is because there are two pools of RAM with separate buses that can BOTH run in parallel, something the PS4 does NOT have.

EG101

Per the Oddworld developer, the PS4 was already getting 172GB/s out of the 176GB/s..:lol:

No, the 176GB/s is a direct link between the GPU and the memory pool; there is another link from the CPU to memory that is 20GB/s, and another one that is 20GB/s from the CPU to the GPU...:lol:

Get your facts right.

#21 Posted by metal_zombie (2286 posts) -

[QUOTE="-Snooze-"]

The 7790 is a poor, poor card. I don't understand why Sony and MS went in with such poor GPUs. I understand the CPU being weak as shit, but why the GPU?

ramonnl

Because otherwise the PS4 would cost 600-700 euros again; that didn't work out so well with the PS3.

I wouldn't have minded paying more for better consoles, but I suppose it makes sense to launch less risky hardware in today's economy.
#22 Posted by clone01 (24373 posts) -

lol will you be able to sleep now?

marklarmer
Nope.
#23 Posted by tormentos (16754 posts) -

That's what I was thinking.

wiiutroll

 

And you are wrong as well.

 

[image]

#24 Posted by metal_zombie (2286 posts) -
Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_
It's not that bad.... on PC, yeah, having a shitty card is terrible, but on consoles it will be fine.
#25 Posted by silversix_ (13899 posts) -

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflmetal_zombie
It's not that bad.... on PC, yeah, having a shitty card is terrible, but on consoles it will be fine.

It's here for TEN YEARS lol, this isn't just bad, it's atrocious

#26 Posted by tormentos (16754 posts) -

Looked like Ron owned you big time. Let it go.RoslindaleOne

 

Yeah, he sure did, especially on the part where he desperately tried to imply that the Xbox One GPU wasn't Bonaire but Pitcairn, or like one of his last arguments, a cross between a 7790 and a 7850..:lol:

#27 Posted by Zophar87 (4345 posts) -

I called this months ago. lol

#28 Posted by TheKingIAm (985 posts) -
I thought we knew the X1 was weak as hell.
#29 Posted by ronvalencia (15109 posts) -

Yeah, 140GB/s to 150GB/s, not 204GB/s. It can't be sustained, so the practical figure is 140 to 150GB/s.

68GB/s + 140GB/s = 208GB/s

68GB/s + 150GB/s = 218GB/s

PS4:

176GB/s + 40GB/s from the CPU and Onion = 216GB/s
tormentos

Your memory bandwidth assumptions don't add up.

http://www.edge-online.com/news/gaijin-games-on-why-war-thunder-isnt-coming-to-xbox-one/

How much more powerful?

AY: It depends what you're doing. GPU, like 40 per cent more powerful. DDR5 is basically 50 per cent more powerful than DDR3, but the memory write [performance] is bigger on Xbox One so it depends on what you're doing.

How is that going to translate to on-screen results for the kinds of games you want to make? So to optimise War Thunder on both consoles you could hypothetically make a better, prettier version on PS4?

AY: Yep.

KY: Probably yes. But again, that's not a very big deal.

---------------

7790 doesn't have the following memory controllers.

"You can think of the ESRAM and the DDR3 as making up eight total memory controllers, so there are four external memory controllers (which are 64-bit) which go to the DDR3 and then there are four internal memory controllers that are 256-bit that go to the ESRAM. These are all connected via a crossbar and so in fact it will be true that you can go directly, simultaneously to DRAM and ESRAM," he explains

---------------------------

The 7790's memory controllers and its L2 cache bottleneck..

[image: Radeon HD 7790 memory controller slide]

#30 Posted by metal_zombie (2286 posts) -

[QUOTE="metal_zombie"][QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_

It's not that bad.... on PC, yeah, having a shitty card is terrible, but on consoles it will be fine.

It's here for TEN YEARS lol, this isn't just bad, it's atrocious

It will run games at a decent frame rate; console players won't care that it can't put out good visuals compared to the PC.
#31 Posted by ZoomZoom2490 (3934 posts) -

The worst thing about the X1 is the fact that memory is controlled by the same old Xbox 360 128-bit memory bus.

Don't let MS fool you into thinking that the console has a 256-bit memory width because of that crap ESRAM that's also used in the Wii U.

#32 Posted by moistsandwich (0 posts) -

[QUOTE="metal_zombie"][QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_

It's not that bad.... on PC, yeah, having a shitty card is terrible, but on consoles it will be fine.

It's here for TEN YEARS lol, this isn't just bad, it's atrocious

6-8 years.... there has never been a 10-year gap between console launches within any of the big 3. But I get your point.

#33 Posted by savagetwinkie (7981 posts) -

[QUOTE="wiiutroll"]

That's what I was thinking.

tormentos

And you are wrong as well.

[image]

This doesn't actually refute anything they just said... 176GB/s is the peak of the GDDR5, and it has one memory type.

#34 Posted by WilliamRLBaker (28338 posts) -

Awww, el tormo trying to understand hardware and trying to explain it but getting it wrong.

El tormo's limit of knowledge = derp derp... one number is bigger than another number.

#35 Posted by metal_zombie (2286 posts) -

[QUOTE="RoslindaleOne"]Looked like Ron owned you big time. Let it go.tormentos

 

Yeah, he sure did, especially on the part where he desperately tried to imply that the Xbox One GPU wasn't Bonaire but Pitcairn, or like one of his last arguments, a cross between a 7790 and a 7850..:lol:

[image]

#36 Posted by clone01 (24373 posts) -

[QUOTE="silversix_"]

[QUOTE="metal_zombie"] It's not that bad.... on pc yeah having a shitty card is terrible but on consoles it will be fine.moistsandwich

Its here for TEN YEARS lol this isn't just bad, its atrocious

6-8 years.... there has never been a 10 year gap between console launches within any of the big 3. But I get your point.

I find it truly silly that fanboys debate about console power. Its a console. You want performance, get a PC. And this is coming from essentially a console-only gamer.
#37 Posted by wiiutroll (135 posts) -

[QUOTE="tormentos"]

[QUOTE="wiiutroll"]

That's what I was thinking.

savagetwinkie

 

And you are wrong as well.

 

[image]

This doesn't actually refute anything they just said... 176GB/s is the peak of the GDDR5, and it has one memory type.

lol

#38 Posted by savagetwinkie (7981 posts) -
[QUOTE="moistsandwich"]

[QUOTE="silversix_"]Its here for TEN YEARS lol this isn't just bad, its atrocious

clone01

6-8 years.... there has never been a 10-year gap between console launches within any of the big 3. But I get your point.

I find it truly silly that fanboys debate about console power. It's a console. You want performance, get a PC. And this is coming from essentially a console-only gamer.

A lot of people don't like PC, though: too many options, more things that can go wrong, and its online isn't a unified experience.
#39 Posted by LA-Nighthawk (5 posts) -
Graphics look fine to me. Looking at the launch games and knowing how much better games will look a few years down the line, I'm totally okay with that. Enjoy obsessing over meaningless numbers for the next few years though.
#40 Posted by EG101 (857 posts) -

[QUOTE="EG101"]

This definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed the PEAK which is 176GB/S. There is only 1 RAM Pool inside the PS4 and that has a PEAK of 176GB/S but peak is rarely maintained. The reason you can add BW inside the XB1 is because there are 2 pools of ram with seperate buses that can BOTH run in Parallel something the PS4 Does NOT have.

tormentos

By odworld developer the PS4 was getting 172GB/s already out of the 176GB/s..:lol:

No the 176GB/s is a direct link between the GPU and memory pool,there is another link from CPU to memory that is 20GB,and another one that is 20GB/s from CPU to GPU...:lol:

Get your fact right.

You get your facts straight.

That GDDR5 will NEVER have 200+ GB/S bandwidth NEVER. You tried to make it seem like the PS4 Ram had that much BW. Doesn't matter how many highways go to that RAM there is a maximum bandwidth of 176 GB/S coming from that RAM. You can NOT add Band Width on the PS4 the way you CAN on the XB1.

#41 Posted by Evo_nine (1598 posts) -

Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_
errr....cows are in the same boat re: stuck with a crappy GPU

I'm betting this generation won't last 10 years, more like 5. Well....I HOPE.

#42 Posted by Lumpy311 (673 posts) -

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflEvo_nine

errr....cows are in the same boat re: stuck with a crappy GPU

 

I'm betting this generation won't last 10 years, more like 5. Well....I HOPE.

I hope it does last 10 years so PC gaming can grow even more.

#43 Posted by clone01 (24373 posts) -
[QUOTE="clone01"][QUOTE="moistsandwich"]

6-8 years.... there has never been a 10-year gap between console launches within any of the big 3. But I get your point.

savagetwinkie
I find it truly silly that fanboys debate about console power. It's a console. You want performance, get a PC. And this is coming from essentially a console-only gamer.

A lot of people don't like PC, though: too many options, more things that can go wrong, and its online isn't a unified experience.

Certainly. That's why I like consoles. Although the line gets blurrier every day.
#44 Posted by kuu2 (6885 posts) -

Another Tormentos thread where he pretends to be a hardware architect.

 

And can someone explain to me why Sony fans continue to obsess over the X1's hardware specs?

#45 Posted by SKaREO (3161 posts) -

[QUOTE="Evo_nine"]

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflLumpy311

errr....cows are in the same boat re: stuck with a crappy GPU

 

I'm betting this generation won't last 10 years, more like 5. Well....I HOPE.

I hope it does last 10 years so PC gaming can grow even more.

Mobile gaming is growing exponentially faster than PC gaming ever was. PC gaming is a niche and it will always be a niche. Thinking that a PC has any influence in games is a joke these days. Developers don't have much intention of making their games available on the PC; take Rockstar Games for example. I don't buy a $2500 system to brag about it. I buy a system to play games, and if there aren't any good games on the PC then what's the point of owning one?
#46 Posted by Lumpy311 (673 posts) -

Another Tormentos thread where he pretends to be a hardware architect.

 

And can someone explain to me why Sony fans continue to obsess over the X1's hardware specs?

kuu2

That's how factions on System Wars work.

#47 Posted by ronvalencia (15109 posts) -

The 7790 is a poor, poor card. I don't understand why Sony and MS went in with such poor GPUs. I understand the CPU being weak as shit, but why the GPU?

-Snooze-
The 7790 has higher FLOPS than the 7850, but the 7790 has a gimped L2 cache. Unlike the X1, the 7790 can NOT do memory writes @ 150GB/s (assuming this claim is true). There's a reason why I used a prototype 7850 with 12 CUs for the X1.
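The FLOPS claim checks out against AMD's reference specs, assuming the stock 896 shaders at 1000MHz for the 7790 and 1024 shaders at 860MHz for the 7850; those figures are assumptions from public spec sheets, not from the post. A quick sketch:

# FLOPS comparison behind the 7790-vs-7850 claim. Reference specs
# (896 SPs @ 1.0GHz vs. 1024 SPs @ 0.86GHz) are assumed from AMD's
# public data, not from the post itself.
def gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz  # 2 FLOPs/cycle (FMA)

print(gflops(896, 1.00))   # 7790: ~1792 GFLOPS
print(gflops(1024, 0.86))  # 7850: ~1761 GFLOPS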
#48 Posted by ronvalencia (15109 posts) -

The worst thing about the X1 is the fact that memory is controlled by the same old Xbox 360 128-bit memory bus.

Don't let MS fool you into thinking that the console has a 256-bit memory width because of that crap ESRAM that's also used in the Wii U.

ZoomZoom2490
Btw, GDDR3 is based on DDR2.
#49 Posted by Evo_nine (1598 posts) -

[QUOTE="Evo_nine"]

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflLumpy311

errr....cows are in the same boat re: stuck with a crappy GPU

I'm betting this generation won't last 10 years, more like 5. Well....I HOPE.

I hope it does last 10 years so PC gaming can grow even more.

I haven't built a gaming PC for years, but I'll definitely be thinking about it if this console gen looks to last longer than 5 years. Also, I really want to play Star Citizen.

Anyways, PC gaming always benefits from improving console tech.

http://techreport.com/news/25378/pc-gaming-will-benefit-from-next-gen-consoles-says-amd

#50 Posted by Basinboy (10983 posts) -

And now in addition to everyone being a critic, everyone's an engineer.

Just gimme a damn controller and let me play already.