Official ALL NextGen rumors and speculation (PS4, 720, Steambox)

This topic is locked from further discussion.

#101 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Chozofication"]

Lol 4.2 teraflops and ray tracing.

clyde46
And that 8900 series GPU....

The 8970 OEM (a renamed 7970 GHz Edition, aka Tahiti XT2) delivers about 4.3 TFLOPs from a >200 watt chip. You are looking at a Silverstone Mini-ITX SG07/SG08/SG09/SG10 class box with a matching power supply.
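For reference, that ~4.3 TFLOPs figure is just the standard theoretical single-precision throughput formula: shader count × 2 FLOPs per clock (fused multiply-add) × clock speed. A quick sketch using the published Tahiti XT2 numbers (2048 stream processors at a 1.05 GHz boost clock):

```python
def peak_tflops(shaders: int, clock_ghz: float, flops_per_clock: int = 2) -> float:
    """Theoretical peak single-precision TFLOPs: shaders * FLOPs/clock * GHz / 1000."""
    return shaders * flops_per_clock * clock_ghz / 1000.0

# 7970 GHz Edition / "8970 OEM" (Tahiti XT2): 2048 shaders, 1.05 GHz boost clock
print(peak_tflops(2048, 1.05))  # ~4.3 TFLOPs
```

Real-world throughput is lower, of course; this is just where the marketing TFLOPs numbers come from.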
#102 clyde46
Member since 2005 • 49061 Posts
[QUOTE="clyde46"][QUOTE="Chozofication"]

Lol 4.2 teraflops and ray tracing.

ronvalencia
And that 8900 series GPU....

8970 OEM (rename 7970 Ghz Edition aka Tahiti XT2) has about 4.3 TFLOPs and >200 watts chip. You are looking at Silverstone's Mini-ITX SG07/SG08/SG09/SG10 level box with a matching power supply.

So it's not going to be a regular GPU then?
#103 cryemocry
Member since 2013 • 590 Posts

The 8000 series is just a rebranded 7000 series card, as I'm sure some of you know.

#104 cryemocry
Member since 2013 • 590 Posts

The console is gonna have a 128-bit memory bus; 256-bit is too expensive LOL

#105 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="clyde46"][QUOTE="Chozofication"]

Lol 4.2 teraflops and ray tracing.

04dcarraher

And that 8900 series GPU....

No 8850 and a 8950 duct taped together! :P

The 8950 (a renamed 7950 Boost Edition) would need a case similar to Silverstone's Mini-ITX SG07/SG08/SG09/SG10 cases, with a matching power supply.

Microsoft could sell a low-end microwave oven without the magnetron ray-gun.

#106 cryemocry
Member since 2013 • 590 Posts

7950 is still too expensive for a console.

#108 Tessellation
Member since 2009 • 9297 Posts
lol, PS4 is going to do fake 4K resolution? Wasn't it supposed to be 50% more powerful than the next Xbox? :cool:
#109 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="clyde46"][QUOTE="ronvalencia"][QUOTE="clyde46"] And that 8900 series GPU....

8970 OEM (rename 7970 Ghz Edition aka Tahiti XT2) has about 4.3 TFLOPs and >200 watts chip. You are looking at Silverstone's Mini-ITX SG07/SG08/SG09/SG10 level box with a matching power supply.

So its not going to be a regular GPU then?

The guts of the GPU would still be based on 28nm-fabbed GCN.
#110 cryemocry
Member since 2013 • 590 Posts

It's not gonna be an off-the-shelf GPU; neither the Xbox's nor the 360's was.

The original Xbox used a modified GeForce 3 that had about the same number of vertex pipelines as the GeForce 4.

#111 04dcarraher
Member since 2004 • 23829 Posts
[QUOTE="ronvalencia"][QUOTE="clyde46"][QUOTE="ronvalencia"] 8970 OEM (rename 7970 Ghz Edition aka Tahiti XT2) has about 4.3 TFLOPs and >200 watts chip. You are looking at Silverstone's Mini-ITX SG07/SG08/SG09/SG10 level box with a matching power supply.

So its not going to be a regular GPU then?

The guts of the GPU would be still based on 28nm fab'ed GCN.

Which means at best you will see a 10% power-usage difference if they rearranged the transistors, but it seems they are just renamed 7000 series cards.
#112 clyde46
Member since 2005 • 49061 Posts
[QUOTE="ronvalencia"][QUOTE="clyde46"][QUOTE="ronvalencia"] 8970 OEM (rename 7970 Ghz Edition aka Tahiti XT2) has about 4.3 TFLOPs and >200 watts chip. You are looking at Silverstone's Mini-ITX SG07/SG08/SG09/SG10 level box with a matching power supply.

So its not going to be a regular GPU then?

The guts of the GPU would be still based on 28nm fab'ed GCN.

Umm.. in regular speak?
#113 MonsieurX
Member since 2008 • 39858 Posts
It's even funnier when people don't quote you, just random posts that no one seems to care about :lol:
#114 cryemocry
Member since 2013 • 590 Posts

Who cares about the hardware these days when their software support is atrocious?

#115 Mystery_Writer
Member since 2004 • 8351 Posts

On a side note, I was wondering what the fastest card from AMD is and noticed they have the HD 7990 (which I had never heard of before now, but that thing is a beast).

#116 cryemocry
Member since 2013 • 590 Posts

The most powerful ATI card (f*ck calling 'em AMD) is that dual-7970 GPU BS.

#117 MonsieurX
Member since 2008 • 39858 Posts

most powerful ATI card f*ck calling em amd is the dual 7970 gpu bs

cryemocry
u jelly?
#118 04dcarraher
Member since 2004 • 23829 Posts
[QUOTE="clyde46"][QUOTE="ronvalencia"][QUOTE="clyde46"] So its not going to be a regular GPU then?

The guts of the GPU would be still based on 28nm fab'ed GCN.

Umm.. in regular speak?

Meaning the TDP (the maximum amount of power the cooling system in a computer is required to dissipate) would be virtually the same as the 7000 series'. So an 8970 will have the same TDP as a 7970 GHz, which is over 200 watts at full load.
#119 Mystery_Writer
Member since 2004 • 8351 Posts

most powerful ATI card f*ck calling em amd is the dual 7970 gpu bs

cryemocry

Yeah, it's a 2-GPU card. I've had the older HD 6990 for the past two years and was wondering why the 7990 didn't come out in March last year (similar to the 6990's release date), so I assumed they weren't releasing one for the 7000 series, until I saw it just now. (I'm checking the benchmarks now.)

P.S. The 6990 is really a good video card; it runs almost everything on very high settings, even today's games.

#120 04dcarraher
Member since 2004 • 23829 Posts
[QUOTE="cryemocry"]

most powerful ATI card f*ck calling em amd is the dual 7970 gpu bs

MonsieurX
u jelly?

Nah, he just doesn't understand: with great power comes a greater amount of power draw and cooling required.
#121 04dcarraher
Member since 2004 • 23829 Posts
[QUOTE="cryemocry"]

most powerful ATI card f*ck calling em amd is the dual 7970 gpu bs

Mystery_Writer
ya, I have the HD 6990 for the past two years and was wondering why the 7990 didn't come out on March last year (similar to the 6990 release date), so I assumed they aren't releasing one for the 7000 series until I saw that now. (I'm checking the benchmarks now). P.S. the 6990 is really a very powerful card, it runs almost everything on high details even today's games.

And two 7970s would need 400 W or more and 2x the cooling of a single 7970. Your 6990 is two downclocked 6970s that need 375 W. Your GPU alone uses more than 2x the power draw of the original 360.
#122 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

I remember when people thought Sony was going to use a Cell 2.0; I actually thought they would too. I guess Sony finally saw it as a waste of money, but it's looking like the overall rendering setup this time will be similar.

#123 cryemocry
Member since 2013 • 590 Posts

What is there to really look forward to on the 720 besides Halo 5? Or Forza, what, 6?

Shame they didn't bring back original Xbox franchises like Blinx and **** this generation.

They actually gave Blinx more attention last gen than they have Banjo-Kazooie.

#124 cryemocry
Member since 2013 • 590 Posts

They made two Blinx games last gen and only one Banjo game this gen.

Blinx is Xbox's real mascot lol.

#125 MonsieurX
Member since 2008 • 39858 Posts
Why don't you go whine in your secret diary?
#126 Mystery_Writer
Member since 2004 • 8351 Posts

[QUOTE="Mystery_Writer"][QUOTE="cryemocry"]

most powerful ATI card f*ck calling em amd is the dual 7970 gpu bs

04dcarraher

ya, I have the HD 6990 for the past two years and was wondering why the 7990 didn't come out on March last year (similar to the 6990 release date), so I assumed they aren't releasing one for the 7000 series until I saw that now. (I'm checking the benchmarks now). P.S. the 6990 is really a very powerful card, it runs almost everything on high details even today's games.

And two 7970's would need 400w or more and 2x the cooling of a single 7970. Your 6990 is two down clocked 6970's that needs 375w. Your gpu alone uses more then 2x the power draw of the original 360.

Yeah, it's also very noisy. But it's been doing its job for the past 2 years; thankfully I'm happy with it.

I went with a 6-core AMD X6 1100T + 16 GB alongside that card back in the day (I don't remember if it was the same day, or whether I bought it after the HD 6990).

Anyway, I'm happy with my AMD setup so far. Surprisingly, it still runs all the new games at very high settings.

#127 TheXFiles88
Member since 2008 • 1040 Posts

Lol, even to this day, these rumors still have no clue who is really behind the CPU design for the next Xbox. It is not from AMD; IBM is the real contributor. You fools... :lol:

#128 04dcarraher
Member since 2004 • 23829 Posts

I remember when people thought Sony was going to use a Cell 2.0, I actually thought they would too. I guess Sony finally saw it as a waste of money but it's looking like the overall rendering setup this time will be similar.

Chozofication
I knew Sony would abandon the Cell, since the Cell was originally designed to be both the CPU and GPU, and after tests the Cell was subpar compared to what was out, so they contacted Nvidia for a stopgap measure using the G70 (7800).
#129 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

I remember when people thought Sony was going to use a Cell 2.0, I actually thought they would too. I guess Sony finally saw it as a waste of money but it's looking like the overall rendering setup this time will be similar.

04dcarraher

I knew the Sony would abandon the Cell since the Cell was originally designed to be the cpu and gpu and after tests the Cell was sub par to what was out , so they contacted Nvidia for a stop gap measure using the G70 (7800).

Yeah. They were gonna use 2 Cells and no GPU. If only they had contacted ATI instead...

Cell was pretty incredible by itself, but whoever thought it could do graphics on its own wasn't thinking straight.

#130 AdobeArtist  Moderator
Member since 2006 • 25184 Posts

cryemocry, can you stop spamming this thread already?

#131 tormentos
Member since 2003 • 33784 Posts
Lol, you are such a hypocrite. You just wanted to beleive in whatever looks good for $ony but always rebuff any advantages that the Next Xbox might have. Also, you still don't understand the RAM types to being of.TheXFiles88
Look who's talking. First, it has been rumored to be 32MB; that is what I read, and I have never read about 100MB of eDRAM. Second, it would have sounded better if Sony had an 8000 series card rather than a 7870 (AKA 7970M); sadly that is another rumor as well. Third, I do understand the RAM: the eDRAM is being used to compensate for the slow DDR3 speed, and the fact is GDDR5 is almost 3 times faster. And lastly, I actually even said that when all is said and done they will probably perform more or less the same. Now stop being such a cry baby.
#132 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="TheXFiles88"]Lol, you are such a hypocrite. You just wanted to beleive in whatever looks good for $ony but always rebuff any advantages that the Next Xbox might have. Also, you still don't understand the RAM types to being of.tormentos
3rd. I do understand the ram how eDRAM is been use to compensate for the slow DDR3 speed,fact is GDDR5 is almost 3 times faster. And lastly i actually even say that when all is say and done they will probably perform more or less the same,now stop been such a cry baby.

I doubt you've seen anything saying it'll have 32 MB; that would be asinine. (Well, there are crazy rumors, but I haven't seen that one.)

The way eDRAM works is: in addition to storing the framebuffer, the eDRAM helps with the graphics tasks that require such fast RAM, and the DDR3 can handle the rest.

Also, DDR3-1600 only has 12.8 GB/s of bandwidth; I don't know where you got 68 from. But the DDR3 in the 720 would have to be at least 1866 or more.
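The 12.8 GB/s figure is the plain DDR bandwidth formula for a single 64-bit channel: effective transfer rate × bus width in bytes. A rough sketch (the wider-bus configurations below are the rumored next-gen setups, not confirmed specs):

```python
def bandwidth_gbs(transfers_mts: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: mega-transfers/s * bytes per transfer."""
    return transfers_mts * (bus_bits / 8) / 1000.0

print(bandwidth_gbs(1600, 64))   # DDR3-1600, one 64-bit channel: 12.8 GB/s
print(bandwidth_gbs(2133, 256))  # rumored DDR3-2133 on a 256-bit bus: ~68.3 GB/s
print(bandwidth_gbs(5500, 256))  # GDDR5 at 5.5 GT/s on a 256-bit bus: 176 GB/s
```

The second line is presumably where the "68" rumor figure comes from: the same DDR3, just on a much wider bus.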

#133 TheXFiles88
Member since 2008 • 1040 Posts

[QUOTE="TheXFiles88"]Lol, you are such a hypocrite. You just wanted to beleive in whatever looks good for $ony but always rebuff any advantages that the Next Xbox might have. Also, you still don't understand the RAM types to being of.tormentos
Looks who's talking.. First it has been rumored to be 32MB that is what i read i have never read about 100MB of eDRAM. Second it sounded better if sony had an 8000 than 7870 AKA 7970M sadly that is another rumor as well. 3rd. I do understand the ram how eDRAM is been use to compensate for the slow DDR3 speed,fact is GDDR5 is almost 3 times faster. And lastly i actually even say that when all is say and done they will probably perform more or less the same,now stop been such a cry baby.

I am a cry baby? Lol. Look, you believe in "RUMORS" that best suit you, but rebuff a "REAL" patent such as the "Scalable Console Patent" M$ filed in 2010, which clearly described a multi-core CPU & GPU?

"Microsoft applies for scalable console patent"

A patent application filed by Microsoft back in 2010 has just been discovered and points to the possibility that the next-gen Xbox will support a scalable hardware structure. The documents were first spotted by a member of the Beyond3D forum and dissected by Eurogamer. Based on the findings, it looks like Microsoft may be looking to create a hardware foundation that could then be customized to create models with varying performance capabilities.

http://www.gamerevolution.com/news/microsoft-patent-points-to-plans-for-scalable-console-13939


#134 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="clyde46"] So its not going to be a regular GPU then?04dcarraher
The guts of the GPU would be still based on 28nm fab'ed GCN.

Which means at best you will see 10% power usage difference if they rearranged the transistors , but it seems they are just renamed 7000 series.

From GloFo's fabs, AMD could get another ~10% to 15% by applying "resonant clock mesh" tech to GCN, but a desktop Radeon HD 79x0/89x0 would still be off limits for a ~200 watt box.

#135 cryemocry
Member since 2013 • 590 Posts
Why don't you go whine in your secret diary?MonsieurX
I'll write a blog about how you are the worst troll on here lol.
#136 cryemocry
Member since 2013 • 590 Posts

That knows very little about his hobby.

#137 MonsieurX
Member since 2008 • 39858 Posts
[QUOTE="MonsieurX"]Why don't you go whine in your secret diary?cryemocry
I'll write a blog about how you are the worst troll on here lol.

Can't wait to read it.
#138 cryemocry
Member since 2013 • 590 Posts

Even if the GPU is based off the 7900 series, lol, it's gonna be 128-bit. The PS3's GPU was based off the GeForce 7900 series too, but it was 128-bit.

On consoles, the developers can keep optimizing over time on the same software and hardware.

#139 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="04dcarraher"][QUOTE="Chozofication"]

I remember when people thought Sony was going to use a Cell 2.0, I actually thought they would too. I guess Sony finally saw it as a waste of money but it's looking like the overall rendering setup this time will be similar.

Chozofication

I knew the Sony would abandon the Cell since the Cell was originally designed to be the cpu and gpu and after tests the Cell was sub par to what was out , so they contacted Nvidia for a stop gap measure using the G70 (7800).

Yeah. They were gonna use 2 cell's and no gpu. If only they contacted Ati instead...

Cell was pretty incredible by itself but whoever thought it could do graphics by itself wasn't thinking straight.

A PS3 with a Radeon X1900/128-bit GDDR3 + CELL/72-bit XDR would have had two processors capable of running Folding@home.

On PS3, the Radeon X1900 wouldn't be restricted by MS DX9c, e.g. ATI's Close-To-Metal GPGPU compute, 3Dc+ (recycled for DX10), Havok-type GPGPU physics (AMD tech demo), MSAA with FP HDR.

#140 tormentos
Member since 2003 • 33784 Posts
I am a cry baby? Lol. Look, you beleive in "RUMORS" which are best suited for you but rebuffed "REAL"TheXFiles88
Dude, do you know how many patents not only Sony but also MS hold that they will never use? Just because they patent something doesn't mean it will appear in a product. In fact, that patent is why some blind fanboys are saying the 720 will have 3 GPUs, all from the 8000 series: 2 of them top-of-the-line 8970s, the other an 8850. There is no way in hell MS will pack that in, unless the Xbox 720 is the size of a huge tower with huge fans and costs $1,500 minimum. The 8970 has a TDP of 250 watts; 1 alone is already too hot for the 720, let alone 2 of them plus another 8850. Add to this the RAM and CPU and we are looking at, what, a 700 watt console? Please dude, stop being a cry baby; that's just a lame patent MS filed to cash in on anyone who does something like that.
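Just to put numbers on why the three-GPU rumor is absurd: summing rough desktop board powers already blows past any console power budget (the TDP figures here are approximate, and the parts list is the rumor's, not anything confirmed):

```python
# Rough board-power tally for the rumored 3-GPU setup (watts, approximate figures)
parts = {
    "8970 #1": 250,          # Tahiti-class flagship
    "8970 #2": 250,
    "8850": 130,             # mid-range card, estimated
    "CPU": 95,
    "RAM/board/storage": 50,
}
print(sum(parts.values()))   # 775 W, before PSU losses
```

Even with generous rounding, that is several times the ~200 W envelope consoles actually ship with.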
#141 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

[QUOTE="04dcarraher"] I knew the Sony would abandon the Cell since the Cell was originally designed to be the cpu and gpu and after tests the Cell was sub par to what was out , so they contacted Nvidia for a stop gap measure using the G70 (7800). ronvalencia

Yeah. They were gonna use 2 cell's and no gpu. If only they contacted Ati instead...

Cell was pretty incredible by itself but whoever thought it could do graphics by itself wasn't thinking straight.

PS3 with Radeon X1900/128bit GDDR3 + CELL/72bit XDR would have two processors capable for Fold@Home.

On PS3, Radeon X1900 wouldn't be restricted by MS DX9c e.g. ATI's Close-To-Metal GpGPU compute, 3DC+ (recycled for DX10), Havok type GpGPU physics (AMD tech demo), MSAA with FP HDR.

Why do you think Sony went with Nvidia? The only thing I can think of is that they just weren't thinking. I don't think a Xenos would've cost more.

#142 04dcarraher
Member since 2004 • 23829 Posts

[QUOTE="ronvalencia"]

[QUOTE="Chozofication"]

Yeah. They were gonna use 2 cell's and no gpu. If only they contacted Ati instead...

Cell was pretty incredible by itself but whoever thought it could do graphics by itself wasn't thinking straight.

Chozofication

PS3 with Radeon X1900/128bit GDDR3 + CELL/72bit XDR would have two processors capable for Fold@Home.

On PS3, Radeon X1900 wouldn't be restricted by MS DX9c e.g. ATI's Close-To-Metal GpGPU compute, 3DC+ (recycled for DX10), Havok type GpGPU physics (AMD tech demo), MSAA with FP HDR.

Why do you think Sony went with Nvidia? Only thing I can think of is that they just weren't thinking. I don't think it would've cost more for a Xenos.

Because using the G70 chipset was a stopgap measure.
#143 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

[QUOTE="ronvalencia"] PS3 with Radeon X1900/128bit GDDR3 + CELL/72bit XDR would have two processors capable for Fold@Home.

On PS3, Radeon X1900 wouldn't be restricted by MS DX9c e.g. ATI's Close-To-Metal GpGPU compute, 3DC+ (recycled for DX10), Havok type GpGPU physics (AMD tech demo), MSAA with FP HDR.

04dcarraher

Why do you think Sony went with Nvidia? Only thing I can think of is that they just weren't thinking. I don't think it would've cost more for a Xenos.

Because it was a stop gap measure to use the G70 chipset.

I don't follow.

#144 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="04dcarraher"][QUOTE="Chozofication"]

Yeah. They were gonna use 2 cell's and no gpu. If only they contacted Ati instead...

Cell was pretty incredible by itself but whoever thought it could do graphics by itself wasn't thinking straight.

Chozofication

PS3 with Radeon X1900/128bit GDDR3 + CELL/72bit XDR would have two processors capable for Fold@Home.

On PS3, Radeon X1900 wouldn't be restricted by MS DX9c e.g. ATI's Close-To-Metal GpGPU compute, 3DC+ (recycled for DX10), Havok type GpGPU physics (AMD tech demo), MSAA with FP HDR.

Why do you think Sony went with Nvidia? Only thing I can think of is that they just weren't thinking. I don't think it would've cost more for a Xenos.

Sony was looking to patch the CELL's missing GPU fixed-function units, and the G70 was the closest match.

#145 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

[QUOTE="04dcarraher"] PS3 with Radeon X1900/128bit GDDR3 + CELL/72bit XDR would have two processors capable for Fold@Home.

On PS3, Radeon X1900 wouldn't be restricted by MS DX9c e.g. ATI's Close-To-Metal GpGPU compute, 3DC+ (recycled for DX10), Havok type GpGPU physics (AMD tech demo), MSAA with FP HDR.

ronvalencia

Why do you think Sony went with Nvidia? Only thing I can think of is that they just weren't thinking. I don't think it would've cost more for a Xenos.

Sony was looking to patch CELL's missing GPU fix function units and G70 would be the closest match.

Huh. Does that mean a X1900 or Xenos actually would've been worse for the PS3?

#146 TheXFiles88
Member since 2008 • 1040 Posts

[QUOTE="TheXFiles88"]I am a cry baby? Lol. Look, you beleive in "RUMORS" which are best suited for you but rebuffed "REAL"tormentos
Dude you know how many patents not only sony also MS had that they will never use,just because they patent something doesn't mean it will appear in a product.. In fact that patent is why some blind fanboys are saying the 720 will have 3 GPU,the 3 GPU in question all from the 8000 series 2 of them been top of the line 8970,the other 8850,there is no way in hell that MS will pack that unless the xbox 720 is the size of a huge tower with huge fans,and cost $1,500 dollars minimum.. The 8970 has a TDP of 250 Watts 1 alone is already to hot for the 720,let alone 2 of them + another 8850,ad to this ram,CPU we are looking at what a 700 watts console.? Please dude stop been a cry baby that just a lame patent MS did,to cash on any one who do something like that.

Lol, you still don't really understand the most basic, fundamental architecture of video game consoles, dude. You just cannot put retail PC products in a video game console. It just doesn't work that way.

These are custom-built CPUs & GPUs in which IBM/AMD/M$ would remove any functions that are not applicable to games. Also, M$ has full control of the costs, since they have these chips manufactured by a 3rd party such as Flextronics...

Also, the CPU & GPU may have multi-core structures, not multiple physical CPUs & GPUs, just like the Xbox 360's Xenon design. If that pic doesn't satisfy you, how about this one: same multi-core CPU & GPU layout. So who is the cry baby now???


#147 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Chozofication"]

Why do you think Sony went with Nvidia? Only thing I can think of is that they just weren't thinking. I don't think it would've cost more for a Xenos.

Chozofication

Sony was looking to patch CELL's missing GPU fix function units and G70 would be the closest match.

Huh. Does that mean a X1900 or Xenos actually would've been worse for the PS3?

No. Sony didn't factor in the G70's lack of forward-looking design. The Radeon X1900 goes beyond DX9c's SM3.0, falling just short of DX10's SM4.0.
#148 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="tormentos"][QUOTE="TheXFiles88"]I am a cry baby? Lol. Look, you beleive in "RUMORS" which are best suited for you but rebuffed "REAL"TheXFiles88

Dude you know how many patents not only sony also MS had that they will never use,just because they patent something doesn't mean it will appear in a product.. In fact that patent is why some blind fanboys are saying the 720 will have 3 GPU,the 3 GPU in question all from the 8000 series 2 of them been top of the line 8970,the other 8850,there is no way in hell that MS will pack that unless the xbox 720 is the size of a huge tower with huge fans,and cost $1,500 dollars minimum.. The 8970 has a TDP of 250 Watts 1 alone is already to hot for the 720,let alone 2 of them + another 8850,ad to this ram,CPU we are looking at what a 700 watts console.? Please dude stop been a cry baby that just a lame patent MS did,to cash on any one who do something like that.

Lol, you still don't really understand the most basic fundamental architecture in video game consoles, dude. You just can not put the retail PC products in a video game console. It just doesn't work that way.

These are the custom built CPU & GPU in which IBM/AMD/M$ would remove any functions which are not applicable for games. Also, M$ has full control of the costs since they manuflacture these chips from a 3rd party such as Flexronics...

Also, the CPU & GPU may have multi-core structures, no multiple phycial CPU's & GPU's, just like the Xbox's Xenon design. If that pic doesn't satisfy you, how about this one. Same muiti-core CPU & GPU layout. So who is the cry baby now???


6 to 8 ARM/x86 cores would follow AMD's plan to include quad-core ARM CPUs with its future x86 APUs.

#149 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

[QUOTE="ronvalencia"] Sony was looking to patch CELL's missing GPU fix function units and G70 would be the closest match.ronvalencia

Huh. Does that mean a X1900 or Xenos actually would've been worse for the PS3?

No. Sony didn't factor in the lack of forward design with G70. Radeon X1900 goes beyond DX9c's SM3.0 and fails DX10's SM4.0.

I think I get it.

RSX was better for the type of rendering going on when Sony chose Nvidia, but Xenos was made with future rendering in mind.

I just thought that Xenos was always better period, regardless of workload.

#150 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Chozofication"]

Huh. Does that mean a X1900 or Xenos actually would've been worse for the PS3?

Chozofication

No. Sony didn't factor in the lack of forward design with G70. Radeon X1900 goes beyond DX9c's SM3.0 and fails DX10's SM4.0.

I think I get it.

RSX was better for the type of rendering going on when Sony chose Nvidia, but Xenos was made with future rendering in mind.

I just thought that Xenos was always better period, regardless of workload.

Xenos has its own issues, i.e. the NEC-provided "SMART" eDRAM part. The CELL's SPUs have enough power to patch the RSX's aging design.

In general, the Xbox 360 and PS3 are roughly similar.

PS: The SMART eDRAM includes NEC's GPU fixed-function backend units, i.e. it's not 100 percent ATI.