Official ALL NextGen rumors and speculation (PS4, 720, Steambox)

This topic is locked from further discussion.


#201 spiderluck
Member since 2012 • 2405 Posts

[QUOTE="M8ingSeezun"]

Don't really care at the moment.

Sony is beginning to disappoint me, especially the rumor surrounding them ditching the DualShock in favor of an LCD screen-based controller like the Wii U's.

Rumors are rumors, yeah, but just look at the Move and the Wiimote, both so damn similar.

OhSnapitz

Hopefully the rumor about it having a touchscreen controller is false. It isn't necessary.

And lol at some of these users saying they are OK with a touchscreen controller after dogging Nintendo about theirs. Typical.

Kingpin0114

It wouldn't be an atrocity if they designed the controller modestly..

[image: controller mockup]

..that would not intrude on my gaming experience.

Just invert the d-pad and stick locations and that would be a fantastic next-box controller.

#202 tormentos
Member since 2003 • 33784 Posts
[QUOTE="ronvalencia"] Perhaps 3 GCN GPUs with 384 stream processors (6 CUs) each, LOL.. a total of 18 CUs....

No Ron, what is being rumored is 3 SoCs, not 1 but 3 SoCs, which would mean 3 CPUs and 3 GPUs. Not only that, 2 of those SoCs are said to carry an 8970 GPU, which has a TDP of 250 watts, and the other an 8850. Come on Ron, you know that is impossible for a console, you just know it.

#203 tormentos
Member since 2003 • 33784 Posts
[QUOTE="WilliamRLBaker"] *shakes head* I never knew someone was silly enough to debate with ronvalencia.

Ron is just as big a fanboy as you are; he just happens to be more of an ATI fanboy than an MS one. Although he has seen many xbots make stupid claims and he doesn't correct them, as soon as he sees a PS3 fan talking about CPUs he is quick to join in and pull out 30 charts and 20 links. I just quoted him on Durango having 3 GPUs, which he knows is impossible unless it has 3 extremely low-cost and underpowered ones, and he was actually trying to validate that, lol, when he himself talks a ton of crap about console low TDP vs PC. He is just as big a fanboy as you are; he just cheers for AMD (ATI more).

#204 tormentos
Member since 2003 • 33784 Posts
Radeon HD 7850 already includes 20 CUs with 16 working CUs, i.e. 4 failed due to minor yield issues. If AMD improves the yields, an 18-CU-enabled 7850+ is possible.

3 Radeon HD 8790Ms with 6 CUs each would have a total of 18 CUs and ~105 watts.

"Wafer starts" and risks would be paid for by Sony or MS, not by AMD.

ronvalencia
3 billion in losses the first year. Don't you get it? These SoCs cost money, they are not free; there are other costs related to consoles, not just the CPU and GPU..

#205 o0squishy0o
Member since 2007 • 2802 Posts
[QUOTE="OhSnapitz"]

[QUOTE="M8ingSeezun"] [QUOTE="Kingpin0114"]

Hopefully the rumor about it having a touchscreen controller is false. It isn't necessary.

And lol at some of these users saying they are OK with a touchscreen controller after dogging Nintendo about theirs. Typical.

spiderluck

It wouldn't be an atrocity if they designed the controller modestly..

[image: controller mockup]

..that would not intrude on my gaming experience.

Just invert the d-pad and stick locations and that would be a fantastic next-box controller.

What's the point in that screen? I mean, what purpose does it serve other than possibly taking the gamer's attention away from the screen? Zero point. It's useless.

#206 tormentos
Member since 2003 • 33784 Posts

Don't really care at the moment.

Sony is beginning to disappoint me, especially the rumor surrounding them ditching the DualShock in favor of an LCD screen-based controller like the Wii U's.

Rumors are rumors, yeah, but just look at the Move and the Wiimote, both so damn similar.

M8ingSeezun

The rumors also point at the DS3 being compatible, so you have nothing to worry about.

#207 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="ronvalencia"]

[QUOTE="MK-Professor"]

both CPU and GPU look outdated even by today's standards. I mean, a real HD7970 is almost double the speed of an HD7970M

MK-Professor

Desktop HD7970 uses the AMD Tahiti XT GPU, while HD7970M uses a downclocked/gimped AMD Pitcairn XT (aka 7870).

----

Just to mess everybody up, AMD releases/allows the 7870 model number with an AMD "Tahiti" GPU, i.e. 256-bit VRAM + AMD "Tahiti" LE GPU, e.g. the PowerColor Radeon HD 7870 MYST Edition.

You have two GPU designs referring to the same "7870" model number. PowerColor's Radeon HD 7870 MYST Edition (AMD "Tahiti" LE) is usually faster than a reference "7870 GHz Edition" (AMD Pitcairn XT).

the HD7970 performs around 60% faster in games than the HD7870, and the HD7970M is slower than the HD7870, so there are big differences here.

Also, a real HD7970 has 4096 GFLOPS while the HD7970M has 2176 GFLOPS

Also note that single-precision performance does not really gauge real performance for the heavy shader workloads done with shader model 5 and tessellation; double precision does. The 7970M has a 1:16 ratio of SP vs DP performance, while desktop models of the 7900 series have a 1:4 ratio. The 7870 does about 25 GFLOPS more than the 7970M in DP (160 vs 135), and about 400 GFLOPS more in SP (2500 vs 2100). Also, the GHz Edition 7970 does up to 4.3 TFLOPS in SP, and in DP it does 1075 GFLOPS.
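To make the SP/DP arithmetic above concrete, here is a back-of-the-envelope sketch (Python; the shader counts and clocks are the commonly cited reference figures for these cards and are my assumption, since the post only quotes the resulting GFLOPS):

```python
# GFLOPS estimates behind the figures quoted above.
# SP GFLOPS = shaders * 2 ops/clock (FMA) * clock in GHz
# DP GFLOPS = SP GFLOPS / ratio (1:4 for desktop Tahiti, 1:16 for Pitcairn parts)

def gflops(shaders, clock_ghz, dp_ratio):
    sp = shaders * 2 * clock_ghz
    return sp, sp / dp_ratio

cards = [
    ("HD 7970 GHz Edition (Tahiti XT)", 2048, 1.05, 4),   # ~4300 SP / ~1075 DP
    ("HD 7870 (Pitcairn XT)",           1280, 1.00, 16),  # ~2560 SP / ~160 DP
    ("HD 7970M (mobile Pitcairn)",      1280, 0.85, 16),  # ~2176 SP / ~136 DP
]
for name, shaders, clock, ratio in cards:
    sp, dp = gflops(shaders, clock, ratio)
    print(f"{name}: {sp:.0f} GFLOPS SP, {dp:.0f} GFLOPS DP")
```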

#208 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="arto1223"]

My answer was, "Doesn't matter, PC blows them all away, lol"

Even though the "Steam Box" will be a PC, it will not have the crazy powerful hardware that will be available.

AM-Gamer

Call me when I can get the next game from ND, the next God of War, Sucker Punch, and a list of other great 3rd-party exclusives that will never see the light of day. As long as a console can master visuals in 1080p with top assets, the PC's advantages will be meaningless.

:P

[image: Outlast screenshot]

prepare to be disappointed with your next-gen consoles; enjoy that 1080p we've been enjoying for years while we move on to 1440p and beyond.


#209 El_Garbanzo
Member since 2012 • 296 Posts

[QUOTE="spiderluck"][QUOTE="OhSnapitz"] It wouldn't be an atrocity if they design the controller modestly..

SNrBs.jpg

..that would not intrude on my gaming experience.

o0squishy0o

Just invert the d-pad and stick location and that would be a fantastic next-box contoller

Whats the point in that screen? I mean what purpose does it give than possibly taking away the gamers attention from the screen. Zero point. Its useless.

Nothing but a waste of battery life. No point ever to use a little doodoo controller screen when you got a big tv to do it. Only a nintendo fanboy can think it's a good idea, but then again these children watch pokemon and other dumb cartoons so it's expected that theyll never understand.


#210 Shielder7
Member since 2006 • 5191 Posts

[QUOTE="Shielder7"] 2. 3GB for games is kinda low? Most PC games require at least 2.5GB of RAM and that number is only ever increasing, but consoles are more efficient when it comes to utilizing hardware, just look what the PS 4 and 360 have done with only 512MB of RAM.

Kinthalis

This is a bit of a misunderstood subject. Yes, PCs need more RAM because they are usually dealing with a multi-tasking, full-fledged operating system. Consoles - not so much.

That fact, however, has nothing to do with how efficiently a PC uses RAM when it comes to gaming. Assuming you could force a PC to only use 512MB of RAM for a particular game, it would perform fairly close to a console. It's just that the game engine would have to pull off the same tricks it does on the console (and admittedly, it might not be as efficient at pulling off some of these tricks). It's why console games have issues like constant texture and object pop-in. Developers have figured out that you need to load objects and textures as dynamically as possible in order to squeeze stuff into that small buffer.

They could certainly do the same on the PC (and sometimes they do), but they don't have to.
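To illustrate the dynamic-loading trick described above, here is a minimal sketch of a fixed-budget asset cache with least-recently-used eviction (Python; the class, asset names, and sizes are hypothetical, purely for illustration):

```python
# Stream assets in and out of a small fixed memory budget, evicting the
# least-recently-used ones; evicted assets "pop in" again when re-requested.
from collections import OrderedDict

class AssetCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.assets = OrderedDict()  # name -> size_mb, oldest entry first

    def request(self, name, size_mb):
        if name in self.assets:      # already resident: mark as recently used
            self.assets.move_to_end(name)
            return
        # Evict LRU assets until the new one fits in the budget.
        while self.assets and self.used + size_mb > self.budget:
            _, evicted_mb = self.assets.popitem(last=False)
            self.used -= evicted_mb
        self.assets[name] = size_mb
        self.used += size_mb

cache = AssetCache(budget_mb=512)    # console-sized budget
cache.request("city_block_07", 180)
cache.request("hero_textures", 120)
cache.request("city_block_08", 300)  # forces eviction of city_block_07
```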

That clears it up, I guess. Thanks.

There is one other point: the PS3 is supposed to be using GDDR5 RAM while the 360 is supposed to use DDR3 RAM, just more of it. So which is better for gaming: 3GB of GDDR5 or 5-6GB of DDR3?


#211 burgeg
Member since 2005 • 3599 Posts

[QUOTE="spiderluck"][QUOTE="OhSnapitz"] It wouldn't be an atrocity if they design the controller modestly..

SNrBs.jpg

..that would not intrude on my gaming experience.

o0squishy0o

Just invert the d-pad and stick location and that would be a fantastic next-box contoller

Whats the point in that screen? I mean what purpose does it give than possibly taking away the gamers attention from the screen. Zero point. Its useless.

Seeing/accessing your inventory without having to take you out of gameplay. Seeing your map without having to pause the game. Marking a waypoint without having to pause the game etc. It has the same uses the second screen has on the DS. And when did you ever get distracted by the bottom screen on the DS? A screen on a controller has plenty of uses and can improve gaming. The only reason I would be against a screen on a controller is because I'd be afraid it was make the controller too expensive and make the system itself more expensive. If it's done right and can still keep the cost of the system down, I'm all for a screen on a controller.


#212 tormentos
Member since 2003 • 33784 Posts
[QUOTE="NoodleFighter"] prepare to be disappointed with your next-gen consoles; enjoy that 1080p we've been enjoying for years while we move on to 1440p and beyond.

Enjoy selling your car to keep your hardware, and now your display, up to date, while we play the same games with less detail at a fraction of what PC users will pay. Now argue how a mid-range card will do it, when you are now arguing against it..:lol:

#213 tormentos
Member since 2003 • 33784 Posts
That clears it up, I guess. Thanks.

There is one other point: the PS3 is supposed to be using GDDR5 RAM while the 360 is supposed to use DDR3 RAM, just more of it. So which is better for gaming: 3GB of GDDR5 or 5-6GB of DDR3?

Shielder7

The PS4 and 720, you mean. GDDR5 is 3 times faster or more than DDR3.. It's speed vs size.. The 720 has more space, but it takes way more time to fill that space; the PS4 has less space, but by the time the 720 fills its 6GB the PS4 would have filled those 3GB 5 times, probably...
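As an illustration of that speed-vs-size claim, fill time is just capacity divided by bandwidth; a quick sketch (Python; the bandwidth figures are hypothetical placeholders in the range rumored at the time, not confirmed specs):

```python
# Time to fill a memory pool = capacity / bandwidth.
# Assumed (hypothetical) figures: ~176 GB/s for a GDDR5 pool,
# ~68 GB/s for a DDR3 pool; neither number is confirmed by the rumors.

gddr5_gb, gddr5_bw = 3, 176.0  # GB, GB/s
ddr3_gb,  ddr3_bw  = 6, 68.0   # GB, GB/s

t_gddr5 = gddr5_gb / gddr5_bw  # ~0.017 s to fill 3 GB
t_ddr3  = ddr3_gb / ddr3_bw    # ~0.088 s to fill 6 GB

print(f"GDDR5 pool refills {t_ddr3 / t_gddr5:.1f}x per DDR3 fill")  # ~5.2x
```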

#214 04dcarraher
Member since 2004 • 23832 Posts
[QUOTE="NoodleFighter"]prepare to be disappointed with your next gen consoles, enjoy that 1080p we've been enjoying for years while we move onto 1440p and beyond.tormentos
Enjoy selling your car to keep your hardware up to date and for now display,while we play the same game with less detail at a fraction of what PC user will pay,now argue how a mid range card will do it,when you are now arguing against it..:lol:

Now now, you were doing fine, but this post is full of fail.

#215 04dcarraher
Member since 2004 • 23832 Posts
[QUOTE="Shielder7"]That clears it up I guess thanks.

There is one other point the PS3 is susposed to be using ddr5 RAM while the 360 is susposed to use ddr3 RAM just more of it. So which is better for gaming 3GB of ddr5 or 5-6 GB of ddr3?

tormentos
The PS4 and 720 you mean. GDDR5 is 3 times faster or more than DDR3.. Is speed vs size.. The 720 has more space but it takes way more time to fill that space,the PS4 has less space but by the time the 720 fill its 6GB the PS4 would have fill those 3 GB 5 times probably...

All depends on the memory bus, GDDR3 on 256bit or 384bit or 512bit bus can move more data then 128 or 192bit or even 256bit GDDR5. For example GDDR3 at 1080mhz on a 384 bit can move 103 GB/s while 256bit GDDR5 at 4000mhz moves 128 GB/s
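For reference, the arithmetic behind those figures is bus width times effective transfer rate; a minimal sketch (Python, using the example numbers from the post, and assuming the 1080MHz GDDR3 clock is double-pumped while the 4000MHz GDDR5 figure is already the effective rate):

```python
# Peak bandwidth (GB/s) = effective rate (MT/s) * bus width (bits) / 8 / 1000

def bandwidth_gbs(effective_mts, bus_bits):
    return effective_mts * bus_bits / 8 / 1000

print(bandwidth_gbs(2160, 384))  # GDDR3, 1080 MHz double-pumped -> ~103.7 GB/s
print(bandwidth_gbs(4000, 256))  # GDDR5 at 4000 MT/s effective  -> 128.0 GB/s
```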

#216 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Shielder7"]That clears it up I guess thanks.

There is one other point the PS3 is susposed to be using ddr5 RAM while the 360 is susposed to use ddr3 RAM just more of it. So which is better for gaming 3GB of ddr5 or 5-6 GB of ddr3?

tormentos

The PS4 and 720 you mean. GDDR5 is 3 times faster or more than DDR3.. Is speed vs size.. The 720 has more space but it takes way more time to fill that space,the PS4 has less space but by the time the 720 fill its 6GB the PS4 would have fill those 3 GB 5 times probably...

Again, DDR3 is not 3 times as slow as GDDR5, it's more like 30.

Also, eDRAM. 720 will have lots.


#217 arto1223
Member since 2005 • 4412 Posts

[QUOTE="arto1223"]

[QUOTE="AM-Gamer"]

Call me when I can get the next game from ND, the next God of War, Sucker Punch, and a list of other great 3rd-party exclusives that will never see the light of day. As long as a console can master visuals in 1080p with top assets, the PC's advantages will be meaningless.

AM-Gamer

Ha! The PC has so many more advantages than just graphics. Framerate is one of the biggest. I now have a 120Hz BenQ monitor and everything looks just amazing at 120Hz. Then there are the controls (mouse/keyboard, any possible gamepad, Novint Falcon, and whatever else you can think of). Then there are the genres (RTS, MOBA or whatever they are called, and MMO are almost non-existent on console, and genres like racing sims, FPS, RPG, and puzzle have more quantity and quality on PC. So we don't have many JRPGs... who cares but the small few out there?). Then there are all the new Free2Play games. Then there are the great dedicated servers. Then there are the mods. Then there are the games with built-in level editors (yeah, consoles have one or two of those I guess). Then there are all the other great aspects like an open platform, the digital push, Steam, customization/choices, and so much more.

Have fun missing out on all that. Oh, and PC games will still look better than all those PS3/4 exclusives. I keep my PS3 around for its exclusives and I'll do the same for the PS4. Just don't try to pass them off as better than what the PC has to offer.

Cool story bro, then play on the PC. I personally like the exclusives better on the PS3 than on the PC, plain and simple; they're better to me than what the PC offers. I like action-adventure and fighting, and the PC is horrid in both those genres. In the meantime, if I'm getting a steady 60fps I'm happy; anything above that is an absolute waste of optimization. Hermits can justify their higher cost all they want; I'll enjoy superior optimization and high-end performance for those first couple of years for about a third of the cost.

I do play on the PC. I have a PC, PS3, and Wii. You said you prefer the PS3 exclusives... okay, cool. I have my PS3 for that specific reason, the exclusives. So why not get a PC for its exclusives as well as for all the multiplatform games? Whenever I get into a debate with a console gamer, it almost always comes down to them admitting to settling for less. I just don't get it. If you have the money, build a nice rig. You can have both systems like I do.


#218 Boddicker
Member since 2012 • 4458 Posts

Meh.

I'll wait until official spec announcements are made before I get in on the mudslinging.


#219 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="tormentos"][QUOTE="Shielder7"]That clears it up I guess thanks.

There is one other point the PS3 is susposed to be using ddr5 RAM while the 360 is susposed to use ddr3 RAM just more of it. So which is better for gaming 3GB of ddr5 or 5-6 GB of ddr3?

04dcarraher
The PS4 and 720 you mean. GDDR5 is 3 times faster or more than DDR3.. Is speed vs size.. The 720 has more space but it takes way more time to fill that space,the PS4 has less space but by the time the 720 fill its 6GB the PS4 would have fill those 3 GB 5 times probably...

All depends on the memory bus, GDDR3 on 256bit or 384bit or 512bit bus can move more data then 128 or 192bit or even 256bit GDDR5. For example GDDR3 at 1080mhz on a 384 bit can move 103 GB/s while 256bit GDDR5 at 4000mhz moves 128 GB/s

The cost for PCB with 384 bit would greater than a PCB with 256bit.

#220 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="MK-Professor"]

[QUOTE="ronvalencia"]

Desktop HD7970 uses the AMD Tahiti XT GPU, while HD7970M uses a downclocked/gimped AMD Pitcairn XT (aka 7870).

----

Just to mess everybody up, AMD releases/allows the 7870 model number with an AMD "Tahiti" GPU, i.e. 256-bit VRAM + AMD "Tahiti" LE GPU, e.g. the PowerColor Radeon HD 7870 MYST Edition.

You have two GPU designs referring to the same "7870" model number. PowerColor's Radeon HD 7870 MYST Edition (AMD "Tahiti" LE) is usually faster than a reference "7870 GHz Edition" (AMD Pitcairn XT).

04dcarraher

the HD7970 performs around 60% faster in games than the HD7870, and the HD7970M is slower than the HD7870, so there are big differences here.

Also, a real HD7970 has 4096 GFLOPS while the HD7970M has 2176 GFLOPS

Also note that single-precision performance does not really gauge real performance for the heavy shader workloads done with shader model 5 and tessellation; double precision does. The 7970M has a 1:16 ratio of SP vs DP performance, while desktop models of the 7900 series have a 1:4 ratio. The 7870 does about 25 GFLOPS more than the 7970M in DP (160 vs 135), and about 400 GFLOPS more in SP (2500 vs 2100). Also, the GHz Edition 7970 does up to 4.3 TFLOPS in SP, and in DP it does 1075 GFLOPS.

The cheapest AMD Tahiti-based cards would be the 7870 XT's "Tahiti LE" variants, which still beat the reference "7870 GHz Edition", and both have the same 256-bit PCB plus two 6-pin PCI-E power connectors.

"Tahiti LE" still has the 1:4 ratio for SP and DP.

If the console TDP budget is 200 watts, AMD Tahiti LE + an 8-core AMD Jaguar is still possible. From my POV, this is the maximum config for a PS4 candidate, and it's very heavy on the GPGPU (C++-capable) side.

AMD Tahiti LE + 8-core AMD Jaguar's total chip size is still smaller than the 1st-generation PS3's 488 mm^2 total chip size budget.

----

AMD Tahiti LE = 352 mm^2 with 24 working CUs out of 32 CUs, i.e. 8 non-working CUs.

AMD Jaguar core: 3.1 mm^2 + 2.1 mm^2 L2 cache = 5.2 mm^2 x 8 cores = 41.6 mm^2 (28nm process tech)

Total chip size: 393.6 mm^2.
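A quick sanity check of the die-area arithmetic above (Python; all figures are the ones quoted in the post):

```python
# Die-area budget check for a Tahiti LE + 8-core Jaguar console.

tahiti_le_mm2 = 352.0  # 24 of 32 CUs enabled
jaguar_core_mm2 = 3.1  # per core, 28nm
jaguar_l2_mm2 = 2.1    # L2 slice per core
cores = 8

jaguar_mm2 = (jaguar_core_mm2 + jaguar_l2_mm2) * cores  # 5.2 * 8 = 41.6 mm^2
total_mm2 = tahiti_le_mm2 + jaguar_mm2                  # 393.6 mm^2

print(total_mm2, total_mm2 < 488)  # under the 1st-gen PS3's 488 mm^2 budget
```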

1st-gen PS3's power consumption can be around 209 watts (e.g. FF13). Sony could go for a 3rd-tier AMD flagship GPU and claim a Radeon HD 7900/8900-type AMD Tahiti GPU, and the PS4 doesn't have the premium Blu-ray and Cell costs.

Xbox 360 had a lower TDP limit than PS3, and it equalised due to a more modern GPU design. MS wouldn't have a modern GPU design advantage over PS3 for the next console wars.


#221 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="04dcarraher"][QUOTE="MK-Professor"]

the HD7970 performs around 60% faster in games than the HD7870, and the HD7970M is slower than the HD7870, so there are big differences here.

Also, a real HD7970 has 4096 GFLOPS while the HD7970M has 2176 GFLOPS

ronvalencia

Also note that single-precision performance does not really gauge real performance for the heavy shader workloads done with shader model 5 and tessellation; double precision does. The 7970M has a 1:16 ratio of SP vs DP performance, while desktop models of the 7900 series have a 1:4 ratio. The 7870 does about 25 GFLOPS more than the 7970M in DP (160 vs 135), and about 400 GFLOPS more in SP (2500 vs 2100). Also, the GHz Edition 7970 does up to 4.3 TFLOPS in SP, and in DP it does 1075 GFLOPS.

The cheapest AMD Tahiti-based cards would be the 7870 XT's "Tahiti LE" variants, which still beat the reference "7870 GHz Edition", and both have the same 256-bit PCB plus two 6-pin PCI-E power connectors.

"Tahiti LE" still has the 1:4 ratio for SP and DP.

If the console TDP budget is 200 watts, AMD Tahiti LE + an 8-core AMD Jaguar is still possible. From my POV, this is the maximum config for a PS4 candidate, and it's very heavy on the GPGPU (C++-capable) side.

AMD Tahiti LE + 8-core AMD Jaguar's total chip size is still smaller than the 1st-generation PS3's 488 mm^2 total chip size budget.

----

AMD Tahiti LE = 352 mm^2 with 24 working CUs out of 32 CUs, i.e. 8 non-working CUs.

AMD Jaguar core: 3.1 mm^2 + 2.1 mm^2 L2 cache = 5.2 mm^2 x 8 cores = 41.6 mm^2 (28nm process tech)

Total chip size: 393.6 mm^2.

1st-gen PS3's power consumption can be around 209 watts (e.g. FF13). Sony could go for a 3rd-tier AMD flagship GPU and claim a Radeon HD 7900/8900-type AMD Tahiti GPU, and the PS4 doesn't have the premium Blu-ray and Cell costs.

According to new rumors: "It is interesting to note that both next-gen consoles employ laptop CPUs and GPUs instead of their more power-consuming counterparts. Of note also is their reliance on Jaguar-based CPUs that are characterized by their excellent performance per watt. All things considered, it is likely that next-gen consoles will draw no more than 150W, compared to first-released Microsoft and Sony consoles that consumed up to or beyond 200W."

#222 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]Radeon HD 7850 already includes 20 CUs with 16 working CUs i.e. 4 failed due to minor yield issues. If AMD improves the minor yield issues, 18 CU enabled 7850+ is possible.

3 Radeon HD 8790M with 6 CUs would have a total of 18 CUs and ~105 watts.

"Wafer starts" and risks would be paid by Sony or MS not by AMD.

tormentos

3 billion in losses the first year. Don't you get it? These SoCs cost money, they are not free; there are other costs related to consoles, not just the CPU and GPU..

An SoC's cost is dependent on the SoC's chip size.

The AMD Jaguar SoC's non-compute blocks are cheap, i.e. southbridge IP with just enough netbook/tablet functions. There's very little point in having 3 SoCs, since you would have 3 southbridge IPs; you only need one SoC with southbridge functions.

The AMD Jaguar SoC combines an AMD APU and an AMD SB.

Note that I only stated 3 Radeon HD 8790Ms, not 3 SoCs.

The AMD Jaguar SoC is divided into several IP blocks, i.e. CPU, GPU, NB/MCH, and SB.

The AMD Jaguar CU (quad-core per CU) can be scaled, i.e. two AMD Jaguar CUs for an 8-core config. AMD is designing Jaguar like a Radeon HD's CU scaling model.

[image]

Notice that AMD GCN's four CUs are connected via the L1 cache.

[image: AMD GCN compute unit diagram]

AMD Jaguar's GPU block can be modified via "copy-and-paste" engineering, i.e. select a GCN ASIC. The GCN ASIC contains memory controllers and PCI-E glue logic. AMD GCN's PCI-E glue logic can connect to AMD Jaguar's NB/PCI-E glue logic.

AMD Jaguar's SB block has enough functions for game console operations, i.e. you only need one SB. The SB's PCI-E glue logic can connect to AMD Jaguar's NB/PCI-E glue logic.

Both AMD Jaguar and AMD GCN follow the x86-64 address format.

AMD's "semi-custom" APU development effort should be similar to the Sony Vita's quad ARM Cortex-A9 + PowerVR 543MP4 "copy and paste" effort.


#223 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="04dcarraher"] Also note that single precision performance does not really gauge real performance for the heavy shader workloads done with shader model 5 and tessellation Double precision does. The 7970m has a 1:16 ratio of performance SP vs DP, while desktop models of the 7900 series have a 1:4 ratio. 7870 does about 25 GFLOPS more then 7970m in DP (135 vs 160). And about 400 GFLOPS more in SP (2100 vs 2500). And also the ghz 7970 does upto 4.3 TFLOP in SP and DP it does 1075. 04dcarraher

The cheapest AMD Tahiti-based cards would be the 7870 XT's "Tahiti LE" variants, which still beat the reference "7870 GHz Edition", and both have the same 256-bit PCB plus two 6-pin PCI-E power connectors.

"Tahiti LE" still has the 1:4 ratio for SP and DP.

If the console TDP budget is 200 watts, AMD Tahiti LE + an 8-core AMD Jaguar is still possible. From my POV, this is the maximum config for a PS4 candidate, and it's very heavy on the GPGPU (C++-capable) side.

AMD Tahiti LE + 8-core AMD Jaguar's total chip size is still smaller than the 1st-generation PS3's 488 mm^2 total chip size budget.

----

AMD Tahiti LE = 352 mm^2 with 24 working CUs out of 32 CUs, i.e. 8 non-working CUs.

AMD Jaguar core: 3.1 mm^2 + 2.1 mm^2 L2 cache = 5.2 mm^2 x 8 cores = 41.6 mm^2 (28nm process tech)

Total chip size: 393.6 mm^2.

1st-gen PS3's power consumption can be around 209 watts (e.g. FF13). Sony could go for a 3rd-tier AMD flagship GPU and claim a Radeon HD 7900/8900-type AMD Tahiti GPU, and the PS4 doesn't have the premium Blu-ray and Cell costs.

According to new rumors: "It is interesting to note that both next-gen consoles employ laptop CPUs and GPUs instead of their more power-consuming counterparts. Of note also is their reliance on Jaguar-based CPUs that are characterized by their excellent performance per watt. All things considered, it is likely that next-gen consoles will draw no more than 150W, compared to first-released Microsoft and Sony consoles that consumed up to or beyond 200W."

The problem with the 8000M in AMD's "four corners" slide is the "cloud corner", i.e. the cloud is unlikely to use mobile 8000M parts.

Anyway, if we scale the 8790M's 6 CUs @ 35 watts by 3X, we pretty much have the desktop 7850, i.e. 105 watts with 18 CUs. The 8790M's silicon reflects mainstream quality (amp resistance/signal noise), i.e. it replaces the mainstream 6670M/7670M.

I based my TDP limits on https://twitter.com/aegies/status/288485398785699840

"the wii u is 50. durango and orbis are ~170-200".

The statement above is almost like a re-run of the Xbox 360 and PS3, with one big difference, i.e. MS doesn't have the unified shader architecture advantage.

Btw, if we scale the 8790M's 6 CUs @ 35 watts by 6X, we get 210 watts with 36 CUs. That matches AMD Tenerife's 36 CU count, or any other AMD Tahiti replacement rumor.
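The CU/TDP scaling used in this post is straight proportionality from the 8790M building block; a minimal sketch (Python; only the 6 CUs @ 35 W starting point comes from the post, the linear-scaling assumption is mine):

```python
# Linear power scaling from the 8790M building block (6 CUs @ ~35 W).

BASE_CUS, BASE_WATTS = 6, 35.0

def scaled(factor):
    return BASE_CUS * factor, BASE_WATTS * factor

print(scaled(3))  # (18, 105.0) -> roughly a desktop 7850-class config
print(scaled(6))  # (36, 210.0) -> matches the rumored 36-CU "Tenerife" tier
```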

The PC still has the flagship status.


#224 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="WilliamRLBaker"] *shakes head* I never knew someone was silly enough to debate with ronvalencia.tormentos
Ron is just as a big fanboy as you are,he happen to be more of an ATI fanboy than an MS,alto he has see many xbots do stupid claims and he don't correct them,but as soon as he see a PS3 one talking about CPU he is quick to join and pull 30 charts and 20 links. I just quote him on Durango having 3 GPU which he knows is impossible unless it has 3 extremely low cost and under power ones,and he was actually trying validate that,lol when hi him self talk a ton of crap about console low TDP vs PC. He is just as big as a fanboy as you are,he just cheers for AMD(ATI more).

My views on the PS3 haven't changed since Nov 2006's GeForce 8800.

For example

[image: FP comparison chart]

[image: decoupled shader architecture chart]

Switching to NVIDIA GeForce CUDA wouldn't change my view on the PS3.

http://www.guru3d.com/news_story/nvidia_era_of_fully_custom_console_hardware_is_over.html

"You cannot make a game console such as the PlayStation 2 anymore. When it emerged, the PS2 had 100 times higher performance than the most powerful PC. I wonder whether it is possible to make something 100 times more powerful than a GeForce GTX 680? If possible, NVIDIA will make it," said Jen-Hsun Huang.


#225 tormentos
Member since 2003 • 33784 Posts

[QUOTE="tormentos"][QUOTE="ronvalencia"]Radeon HD 7850 already includes 20 CUs with 16 working CUs i.e. 4 failed due to minor yield issues. If AMD improves the minor yield issues, 18 CU enabled 7850+ is possible.

3 Radeon HD 8790M with 6 CUs would have a total of 18 CUs and ~105 watts.

"Wafer starts" and risks would be paid by Sony or MS not by AMD.

ronvalencia

3 billion in losses the first year. Don't you get it? These SoCs cost money, they are not free; there are other costs related to consoles, not just the CPU and GPU..

An SoC's cost is dependent on the SoC's chip size.

The AMD Jaguar SoC's non-compute blocks are cheap, i.e. southbridge IP with just enough netbook/tablet functions. There's very little point in having 3 SoCs, since you would have 3 southbridge IPs; you only need one SoC with southbridge functions.

The AMD Jaguar SoC combines an AMD APU and an AMD SB.

Note that I only stated 3 Radeon HD 8790Ms, not 3 SoCs.

The AMD Jaguar SoC is divided into several IP blocks, i.e. CPU, GPU, NB/MCH, and SB.

The AMD Jaguar CU (quad-core per CU) can be scaled, i.e. two AMD Jaguar CUs for an 8-core config. AMD is designing Jaguar like a Radeon HD's CU scaling model.

[image]

Notice that AMD GCN's four CUs are connected via the L1 cache.

[image: AMD GCN compute unit diagram]

AMD Jaguar's GPU block can be modified via "copy-and-paste" engineering, i.e. select a GCN ASIC. The GCN ASIC contains memory controllers and PCI-E glue logic. AMD GCN's PCI-E glue logic can connect to AMD Jaguar's NB/PCI-E glue logic.

AMD Jaguar's SB block has enough functions for game console operations, i.e. you only need one SB. The SB's PCI-E glue logic can connect to AMD Jaguar's NB/PCI-E glue logic.

Both AMD Jaguar and AMD GCN follow the x86-64 address format.

AMD's "semi-custom" APU development effort should be similar to the Sony Vita's quad ARM Cortex-A9 + PowerVR 543MP4 "copy and paste" effort.

But that is the problem, Ron: what is being stated is 3 SoCs, not 3 GPUs, which would be out of the question as well.

#226 tormentos
Member since 2003 • 33784 Posts
[QUOTE="04dcarraher"] Now now, you were doing fine, but this post is full of fail.

How is it full of fail? He is making fun of the PS4's 1080p because he would game at a much higher resolution while the PS4 will be stuck at 1080p. The problem is that the PS4 is rumored to have a 7870 GPU or something close; even the 7850, which is lower spec, can run Battlefield 3 on Ultra with FXAA on high at 2560x1600 and keep a respectable 32 frames per second.. Second, I told him that because most PC-vs-console arguments here revolve around not having to use a high-end card to outdo the PS3 and 360, that a mid-range PC will do; but now the PS4 will be more or less mid-range, even higher, because it has things to help the GPU which are not found on PC, because PCs rely on brute force.. So that means that to outdo the PS4 you will need something more than mid-range, probably high-end, and that is not cheap in any way.

#227 04dcarraher
Member since 2004 • 23832 Posts
[QUOTE="tormentos"][QUOTE="04dcarraher"] Now now, you were doing fine, but this post is full of fail.

How is full of fail.? He is making fun of the PS4 1080p because he would game on much higher resolution,while the PS4 will be stuck with 1080p,the problem is that the PS4 is rumored to have a 7870 GPU or close GPU,even the 7850 which is lower in spec can run Battlefield 3 on Ultra FXAA on high at 2560x1600 and keep a respectable 32 frames per second.. Second i told him that because most PC argument vs console here revolve around you not having to use a high end card to outdo the PS3 and 360 that a mid range PC will do,but now the PS4 will be more or less mid range,even higher because it has things to help the GPU which on PC are not found,because they rely on brute force.. So that mean that to outdo the PS4 you will need something more than mid range,probably high end and that is not cheap in any way.

I must have missed the full argument :P:

#228 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="04dcarraher"] Now now, you were doing fine, but this post is full of fail.tormentos
How is full of fail.? He is making fun of the PS4 1080p because he would game on much higher resolution,while the PS4 will be stuck with 1080p,the problem is that the PS4 is rumored to have a 7870 GPU or close GPU,even the 7850 which is lower in spec can run Battlefield 3 on Ultra FXAA on high at 2560x1600 and keep a respectable 32 frames per second.. Second i told him that because most PC argument vs console here revolve around you not having to use a high end card to outdo the PS3 and 360 that a mid range PC will do,but now the PS4 will be more or less mid range,even higher because it has things to help the GPU which on PC are not found,because they rely on brute force.. So that mean that to outdo the PS4 you will need something more than mid range,probably high end and that is not cheap in any way.

For Sony, we don't know if the final box will be using GCN from Tahiti or Pitcairn.

From the 7870 XT (185W(1), Tahiti LE), Tahiti can scale down and give the 7870 GHz Edition (175W, Pitcairn XT) some internal competition.

With the same PCB and similar memory bandwidth,

1. Tahiti LE beats Pitcairn XT on raytracing GPGPU, i.e. it is stronger with complex shaders.

2. Tahiti LE beats Pitcairn XT on Battlefield 3.

Reference

1. http://www.cpu-world.com/news_2012/2012113001_AMD_Release_Tahiti_LE_GPU.html