The return of 'secret sauce' but this time it's 'special sauce'.

#51  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@pc_rocks said:

Irrelevant and without substance. PS3 was supposed to perform with just teh cell and it couldn't, hence the reason Nvidia was brought on board to save face. Without the GPU it would have been an even bigger disaster. The GPU saved teh cell, not the other way around.

Sorry, I believe DF and Tiago Sousa - the principal rendering engineer - over what a butthurt cow says. Crysis 2 is technically superior to whatever Sony put out on PS3. Funny how Tiago specifically busted all the myths about it in the interview with DF, and coincidentally Sony/ND/GG stopped hyping the power of teh cell and the "only possible on PS3" bullsh*t after that interview. Egg on Sony's face right there.

Teh cell was a complete and utter disaster and didn't come anywhere close to even one of its promises. No amount of DC will change that, as Sony themselves abandoned it the first chance they got. Quite funny that Sony's first parties hyped the power of the PS4, its architecture, ease of use etc. by directly sh*tting on the PS3, when just a couple of years earlier they were saying teh cell had limitless power.

I don't need to drop any act. I am what I am. Not every cow hater is a lem; that's just you cows' default defense mechanism, because you can't even comprehend that all reasonable people are fed up with your extreme cult around Sony.

1 - Bullshit. Systems are not complete until they're finished. The RSX was announced in 2004, so they were already working on it, and development of the PS3 began in 2001.

But again, show me a CPU from 2006 that was even close to Cell in what it can do GPU-wise. I DARE you.

In those days dual cores were not even a thing and were ultra expensive, and they still would not hold a candle to Cell in GPU tasks. Cell is a CPU, not a GPU, and it runs at 3.2GHz, a speed no GPU ran at in 2005, when GPU clocks were 500MHz or a little more.

2 - Crysis 2? Hahahaha. You mean that game with shitty animation, a horrible frame rate and sub-720p resolution?

Dude, effects, animation, level of detail, attention to detail and even frame rate are better in Uncharted 3 than in Crysis 2, and if we bring in TLOU, which is a step above Uncharted 3, it's an even bigger gap, while delivering true 720p, not sub-720p like Crysis 2.

But I remember this kind of argument here from certain lemmings who wanted to pretend they were hermits while they just hated the PS3, so now I am sure that you are a lol-worthy alt.

What did Cell promise? Please LINK me to what Cell or Sony promised with Cell.

Yeah, yet the Jaguar inside the PS4 is weaker than Cell, a 2001 CPU design. Hahahaa.

Oops, what's that? The PS3 with 5 SPEs topping the Jaguar in the PS4 and pissing on the Xbox 360 CPU. How can that be, if Cell is shitty according to you?

Why do you think the PS4 can't emulate the PS3? Because Sony doesn't have an emulator or is bad at making one? Hahahaa. The PS4's Jaguar simply can't.

NO, lemming, I have been here 16 years, so yeah, I know how to recognize an alt, a lemming and a TRUE hermit, of which there are in fact many here. Posters like you who pretend to be PC lovers but only attack Sony are lemmings... lol

Once again, the one who made the secret sauce claims was MS, and Digital Foundry by the way, when it was running PR articles for MS, especially Richard Leadbetter.

But what I see here is a butthurt lemming trying to compare something AMD said to what was said about MS by its own PR team. You failed miserably.

Reminder: AMD GCN's CU is fully IEEE-754-2008 FP32 compliant like Intel SSE, which is better than PowerPC's Altivec and the SPU instruction set.

AMD APU =

Jaguar CPU acting in the multiple-PPE command role,

GCN CU (local data store per CU) acting in the SPE (local data store per SPU) math co-processor role,

GCN raster/texture hardware acting in the RSX's raster/texture hardware role.

Both the GCN and the x86-64 CPU can share x86-64 pointers, with GCN acting like an x86-64 lite node.
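That pointer sharing is what OpenCL 2.0 calls shared virtual memory (SVM), which GCN APUs expose. A minimal sketch of the idea, assuming an OpenCL 2.0 capable platform and driver; error checking is elided and the kernel is illustrative only:

```cpp
// CPU and GPU dereference the SAME x86-64 pointer value: no cl_mem buffer
// objects and no explicit copies. Link against an OpenCL 2.0 runtime.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, nullptr);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, nullptr, nullptr);

    // One shared allocation, valid on both the CPU and the GCN CUs.
    const size_t n = 64;
    int* data = (int*)clSVMAlloc(ctx, CL_MEM_READ_WRITE, n * sizeof(int), 0);

    // CPU writes through the pointer (coarse-grained SVM needs map/unmap).
    clEnqueueSVMMap(q, CL_TRUE, CL_MAP_WRITE, data, n * sizeof(int), 0, nullptr, nullptr);
    for (size_t i = 0; i < n; ++i) data[i] = (int)i;
    clEnqueueSVMUnmap(q, data, 0, nullptr, nullptr);

    // GPU kernel consumes the raw pointer directly.
    const char* src = "kernel void inc(global int* p) { p[get_global_id(0)] += 1; }";
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, "-cl-std=CL2.0", nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "inc", nullptr);
    clSetKernelArgSVMPointer(k, 0, data);
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);

    // CPU reads the GPU's result back through the same pointer.
    clEnqueueSVMMap(q, CL_TRUE, CL_MAP_READ, data, n * sizeof(int), 0, nullptr, nullptr);
    printf("data[5] = %d\n", data[5]); // prints 6
    clEnqueueSVMUnmap(q, data, 0, nullptr, nullptr);
    clSVMFree(ctx, data);
}
```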

Unlike the IBM solution, both GCN CU and SSE follow the full IEEE-754 FP32 handling standard (hint: mixing between SSE and GCN CU yields the same compute FP results).

Unlike IBM's SIMD solution, Intel SSE2/AVX has FP64 packed math with full IEEE-754 FP64 handling.
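A tiny hedged check of the CPU half of that claim: on an IEEE-754-compliant SIMD unit, packed FP32 math is bit-identical to scalar FP32 math. The test values are made up; compile with e.g. g++ -O2 (no -ffast-math):

```cpp
#include <immintrin.h>
#include <cstdio>
#include <cstring>

int main() {
    float a[4] = {1.5f, 2.25f, -3.125f, 0.1f};
    float b[4] = {4.0f, -0.5f, 2.0f, 10.0f};
    float vec[4], scalar[4];

    // Packed IEEE-754 FP32 multiply on the SSE unit.
    _mm_storeu_ps(vec, _mm_mul_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));

    // Scalar IEEE-754 FP32 multiply.
    for (int i = 0; i < 4; ++i) scalar[i] = a[i] * b[i];

    // Bit-identical on compliant hardware, unlike SPU single precision
    // (round-toward-zero, no denormals).
    printf("bit-identical: %s\n",
           memcmp(vec, scalar, sizeof vec) == 0 ? "yes" : "no");
}
```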

Sony has ported CELL's SPU middleware libraries to AMD's GCN CU. The main difference is the horizontal compute size, which is very wide on a GCN CU.

NAVI's CU has a wavebreak feature to support different wave grid sizes, perhaps to catch up to NVIDIA's variable 16- or 32-wide warp size in CUDA. Alternatively, NAVI CU's wavebreak could be related to variable rate shading.

https://www.reddit.com/r/Amd/comments/9du2w4/navi_update_gfx10_in_gpu_open/

Gfx10ForceWaveBreakSize - "Forces the size of a wave break; over-rides client-specified value."

Options:

Gfx10ForceWaveBreakSizeNone: "No wave breaks by region."

Gfx10ForceWaveBreakSize8x8: "8x8 region size."

Gfx10ForceWaveBreakSize16x16: "16x16 region size."

Gfx10ForceWaveBreakSize32x32: "32x32 region size."

Gfx10ForceWaveBreakSizeClient: "Use client specified value."

Gfx10ForceWaveBreakSizeAuto: "Let PAL decide."
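For illustration only, a plain C++ mirror of the options quoted above; the enum names come from the linked gpuopen settings dump, while the helper mapping each option to a pixel count is purely hypothetical:

```cpp
#include <cstdint>

// Names mirror the quoted PAL setting Gfx10ForceWaveBreakSize.
enum class Gfx10ForceWaveBreakSize {
    None,      // "No wave breaks by region."
    Size8x8,   // "8x8 region size."
    Size16x16, // "16x16 region size."
    Size32x32, // "32x32 region size."
    Client,    // "Use client specified value."
    Auto       // "Let PAL decide."
};

// Pixels covered per break region; note 8x8 = 64, one full GCN wavefront.
// Returns 0 where no fixed region applies (None/Client/Auto).
constexpr uint32_t regionPixels(Gfx10ForceWaveBreakSize s) {
    switch (s) {
        case Gfx10ForceWaveBreakSize::Size8x8:   return 8 * 8;
        case Gfx10ForceWaveBreakSize::Size16x16: return 16 * 16;
        case Gfx10ForceWaveBreakSize::Size32x32: return 32 * 32;
        default:                                 return 0;
    }
}
```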

#52 Shewgenja
Member since 2009 • 21456 Posts

@Pedro: I mean... yeah? Have you seen the way Xbox has managed first party this gen? Only console to not have a AAAAE. Can't say it's an accident.


#53 deactivated-646b8e74d84bb
Member since 2003 • 30 Posts

@pc_rocks: SIT down, cows, people are talking now. That's some mighty high horse you managed to hop on. Did some company you affiliate with (not Sony) provide you with a trampoline, or did the "reasonable" people do you a solid? Do we have representatives now?


#54 Pedro
Member since 2002 • 69083 Posts

@Shewgenja: If you say so buddy.


#55 tormentos
Member since 2003 • 33784 Posts

@Pedro said:

The cell flopped harder than the Xbox One ever will. No-one cares or gives a shit about the overrated, under-performing cell tech except for folks like you. :)

Let's see: Sony delivered 87 million PS3s, and the Xbox One is not even halfway there, even though it started out cheaper than the PS3.

The Cell gave birth to what could be considered the best looking games of last gen, while the Xbox One could not even keep up with the PS4, and MS was forced to release a stronger model after DX12, the magic cloud and a speed bump all failed to do anything.

But but but I am not a lemming.

@Pedro said:

@tormentos: It's funny how you think you are proving him wrong, but you are actually doubling down on proving him right. Yikes!

Oh how cute, group therapy.

@Pedro said:

The technology failed and was abandoned. That made it a flop. Sony couldn't abandon it for the PS3 once it had started using the shit tech, thus the reason for the PS4's change in technology. If the cell wasn't a flop it would have been in the PS4, but it isn't, and it will NEVER make a return. It's time to accept the facts. :)

That is like saying Nvidia sucked as well, since Sony abandoned them too and so did MS, yet Nvidia has had the best performing GPUs for years.

The only reason Sony abandoned Cell is that it was expensive, and in fact more was achieved with Cell than with the shitty Xenon in the 360, to the point where Sony had Cell-based servers and Cell was used by the army and for medical purposes. So you see, it was expensive for a console, but for other purposes it was cheap, which is why it was used in medicine and by the army.

It's the same reason why MS and Sony both also abandoned Nvidia, and why MS abandoned Intel as well.

It wasn't good business for them, as they need profits, and hardware that is too expensive cuts hard into those profits.

Also, Cell could not be made into an APU, which is why Sony changed its tech and moved on. It's all about profits and cutting costs: if you have a dedicated GPU and a dedicated CPU you need dedicated cooling for both, which is simply more expensive.

It has nothing to do with Cell flopping or not; it was a decision based on cost.

@ronvalencia said:

CELL's SPU is not a GPU, since it's missing a geometry-raster engine (mass floating-point to integer conversion), texture units (texture data read/write + filtering) and ROPs (z-buffer with color raster read/write and blending effects).

The SPU doesn't have a blitter (a basic GPU function: mass data movement without using ALU resources).

CELL's SPU instruction set is based on PowerPC's Altivec (a software 128-register model).

IBM's SPU has gather instructions at the local data storage level, while Intel's AVX2 has gather instructions at the register level. A GPU has gather/scatter instructions with each texture unit.

In an apples-to-apples comparison, Battlefield 3 used deferred compute tile rendering on the PS3's SPUs and on the Xbox 360's GPU ALU compute hardware, with the Xbox 360's results being superior. Hint: a software tile compute render loop has lower latency within the compute unit.

NVIDIA built RT cores (ray-tracing BVH search and intersection tests) and Tensor cores (ray-tracing denoise) next to the CUDA cores to reduce latency.

Where did I claim Cell's SPEs were a GPU? Cell's SPEs run at 3.2GHz, far, far from 2005 and 2006 GPU speeds; this is something I myself have told you 10 times already. You should stop quoting me just because I am defending Cell, without actually reading.

The rest is just the irrelevant crap you always spew here.

The end result is what matters: Cell was able to aid the RSX in a way the Xenon could not help the Xenos, simply because it lacked the power.


#56  Edited By Pedro
Member since 2002 • 69083 Posts

@tormentos: All of that doesn't change the fact that the cell flopped, and NO ONE is using the tech because it flopped. Now it's time for you to deal with it. :)


#57 tormentos
Member since 2003 • 33784 Posts

@Pedro:

Doesn't change the fact that calling something a flop doesn't quite make it one outside your fragile and biased mind.


#58 Zaryia
Member since 2016 • 21607 Posts

da cell


#59  Edited By PC_Rocks
Member since 2018 • 8459 Posts

@tormentos said:

1 - Bullshit. Systems are not complete until they're finished. The RSX was announced in 2004, so they were already working on it, and development of the PS3 began in 2001. [...]

Another whole lot of garbage without any substance. Teh cell was a complete and utter disaster and didn't come anywhere near a single one of its promises. It was so shit that Sony had to bundle in a last-gen, outdated GPU alongside it rather than another crap cell. If anything, the GPU saved teh cell.

Any CPU from that era does better than teh cell with much less development cost. It was shit at branching, it was shit at any general-purpose computing, and its vector units, i.e. the SPEs, were later found to be so shit that Sony had to bundle in a last-gen, outdated shader model GPU rather than put in another cell, when that was the original plan.

Nobody cares what a butthurt cow like you has to say about Crysis 2 on 360. It was superior to anything that Sony put out last gen on teh cell.

That graph is out of context. Why not link the complete presentation? Because I know Ubisoft is only talking about one facet, not comparing overall power across all scenarios or even CPU performance. Precisely the reason why Sony abandoned teh cell the first chance they got. You think your deflection and out-of-context cherry-picked sources will work on me? Still waiting for dat 4D, dual 1080p and 120 FPS.

Like I said about your default defense mechanism: nobody cares what you think I am. Sheep, lems, hermits - everyone is sick of you cows, and whatever new factions form around Google, Apple etc. will be as well. The entire SW across countless threads is living proof of that. And I have already seen you call every hermit here that goes after cows a lem. That's your only pathetic defense, because you can't defend your overlord Sony on its own merits.

Anyway, stop derailing the thread. This is about teh special sauce. If you have anything to add to it, fine, because I don't have time for your verbal diarrhea without facts.


#60 Zero_epyon
Member since 2004 • 20095 Posts
@Ant_17 said:

@davillain-: Same. Don't care if the PS5 is a Switch even.

Nah, I draw the line right there lol. I would hate for the PS5 to be a Switch copycat.


#61 Ant_17
Member since 2005 • 13634 Posts

@Zero_epyon: The difference is, Sony can make tech.

If they tried, they could make a small PS5 that connects to a TV. The issue will be the price.


#62 Pedro
Member since 2002 • 69083 Posts

@tormentos: Bias? Lol. I game primarily on the PS3 and have owned 4 of them. So keep your bias bullshit to yourself. You can't handle the fact that the cell flopped, thus the reason the tech was abandoned. Stop being a sad excuse for a tool.


#63  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

Let's see: Sony delivered 87 million PS3s, and the Xbox One is not even halfway there, even though it started out cheaper than the PS3. [...]

The end result is what matters: Cell was able to aid the RSX in a way the Xenon could not help the Xenos, simply because it lacked the power.

In simple terms, to convert an SPU into a GCN CU:

AVX on Zen v2: a 256-bit SIMD unit processes eight 32-bit elements per operand via an FMA3 operation. Zen v2 has two 256-bit SIMD units per CPU core.

1. [d][d][d][d][d][d][d][d]x[d][d][d][d][d][d][d][d]+[d][d][d][d][d][d][d][d],

2. [d][d][d][d][d][d][d][d]x[d][d][d][d][d][d][d][d]+[d][d][d][d][d][d][d][d],

3. [d][d][d][d][d][d][d][d]x[d][d][d][d][d][d][d][d]+[d][d][d][d][d][d][d][d],

4. [d][d][d][d][d][d][d][d]x[d][d][d][d][d][d][d][d]+[d][d][d][d][d][d][d][d],

...

n. [d][d][d][d][d][d][d][d]x[d][d][d][d][d][d][d][d]+[d][d][d][d][d][d][d][d],

SPU: a 128-bit SIMD unit processes four 32-bit elements per operand via a fused multiply-add operation.

1. [d][d][d][d]x[d][d][d][d]+[d][d][d][d],

2. [d][d][d][d]x[d][d][d][d]+[d][d][d][d]

3. [d][d][d][d]x[d][d][d][d]+[d][d][d][d]

4. [d][d][d][d]x[d][d][d][d]+[d][d][d][d]

...

n. [d][d][d][d]x[d][d][d][d]+[d][d][d][d]

GCN CU: super-wide MIMD, not including a separate scalar unit.

1. [i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d],

64 32-bit elements per wavefront compute issue. NAVI has a "wavebreak" feature with different grid matrix sizes (perhaps related to this area). NVIDIA CUDA's MIMD (warp) is 16 or 32 elements wide.

2. [i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d]

...

n. [i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d][i][d][d][d][i][d][d][d][d][i][d][d][d][d][i][d][d][d][d][d]

A NAVI CU has 3 separate scalar units.

GCN's CU, with its ultra-wide MIMD, is closer to Tensor matrix math unit behavior than CUDA's 16- or 32-wide warp.

A GpGPU is like a super-wide SPU with raster graphics hardware attached.

A Tensor 4x4x4 cube has 64 data elements, which is the same data element count as a GCN CU wavefront payload issue. AMD adds AI instructions into the existing CU design.
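A hedged x86 sketch of the width progression drawn above, using FMA3 intrinsics as stand-ins: a 128-bit op carries four FP32 lanes (the SPU's width), a 256-bit op carries eight (one Zen v2 SIMD unit), while a real GCN wavefront would issue 64. Compile with e.g. g++ -O2 -mfma:

```cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(32) float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    alignas(32) float c[8] = {.5f, .5f, .5f, .5f, .5f, .5f, .5f, .5f};
    alignas(32) float r[8];

    // SPU-style width: one fused multiply-add over 4 x FP32,
    // i.e. [d][d][d][d] x [d][d][d][d] + [d][d][d][d].
    __m128 r4 = _mm_fmadd_ps(_mm_load_ps(a), _mm_load_ps(b), _mm_load_ps(c));
    (void)r4; // shown only for the width contrast

    // Zen v2 AVX2 width: one fused multiply-add over 8 x FP32 per SIMD unit.
    __m256 r8 = _mm256_fmadd_ps(_mm256_load_ps(a), _mm256_load_ps(b),
                                _mm256_load_ps(c));

    _mm256_store_ps(r, r8);
    for (int i = 0; i < 8; ++i) printf("%g ", r[i]); // a[i]*b[i] + 0.5
    printf("\n");
}
```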


#64 clone01
Member since 2003 • 29822 Posts

@ProtossRushX said:

Damn, secret technology for the PS5 is epic... this means it could maybe beat PC gaming with secret new technology.

Trolling, or just stupid?


#65 tormentos
Member since 2003 • 33784 Posts

@pc_rocks said:

Another whole lot of garbage without any substance. Teh cell was a complete and utter disaster and didn't come anywhere near a single one of its promises. [...]

LINK me to Cell's so-called promises. At this point you are basically grasping and talking shit without any links backing your argument, so please link me to the so-called promises made with Cell.

In 2006 the RSX wasn't a last-gen GPU, buffoon. That's how little you know: the PS3's GPU is basically a 7600GTX, which is a cut-down 7800GTX, both from the same generation of cards.

Basically all CPUs suck at branch prediction. You are using Major Nelson-grade arguments which were nothing more than a total downplay of Cell; basically you are using MS PR that claimed Cell was shit vs the Xbox 360 CPU.

Now this ^^ is reality. This is an actual test; see how compute on the PS3 kicked the living crap out of the 360 CPU, and how just 5 SPEs were enough to beat that pathetic CPU, in its entirety, by 3 times.

This is a test done by Ubisoft. But but but general-purpose computing... news flash: the PS3 was running GAMES, not Windows, so why in hell would it need to be great at general-purpose computing?

No it wasn't. It was a shitty-performing game which could not even reach 720p on either of the 2 platforms, while having less visual fidelity, shitty animations and less attention to detail compared to games like Uncharted 3 and TLOU.

See that paragraph in bold? Yeah, that is the usual garbage used by butthurt lemmings to downplay the PS3. If I've seen it once, I've seen it 100 times here.

1 - 4D does not, I repeat, does not have anything to do with the visuals of the game itself, as in 2D vs 3D vs 4D.

In fact 4D is considered the next step in experience, and it already exists; it is implemented in theaters around the world. It has more to do with the experience.

In other words, did you play The Eye of Judgment on PS3? It's more about the experience.

Sony didn't promise the PS3 would come with 2 HDMI ports.

And they never said at all that all games would run at 120FPS, and in fact I am sure the PS3 could run some of the FPS games of those days at 120FPS, since they were not as demanding as games are now.

As you can see, running at 120FPS, even at resolutions over 720p, which wasn't the norm back then, was possible on the 7800GTX, which was actually weaker than the Xenos inside the Xbox 360.

Quote me. I know who is a hermit here and who isn't. Go ahead.

Stop making an ass of yourself. If you are going to cry about secret sauce, complain about MS and its failed DX12 and infinite cloud power that never changed anything or materialized, instead of trying to argue where there is none.

@Pedro said:

@tormentos: Bias? Lol. I game primarily on the PS3 and have owned 4 of them. [...] You can't handle the fact that the cell flopped. [...]

Yeah, and I owned a 360, and I owned 10 of them. I guess this board holds whatever crap you spew.

No, actually the Xbox One flopped and you still defend that shit, so yeah, that says more about you than my argument defending Cell.


#67  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

In 2006 the RSX wasn't a last-gen GPU, buffoon. That's how little you know: the PS3's GPU is basically a 7600GTX, which is a cut-down 7800GTX, both from the same generation of cards. [...]

Stop making an ass of yourself. If you are going to cry about secret sauce, complain about MS and its failed DX12 and infinite cloud power that never changed anything or materialized, instead of trying to argue where there is none.

1. The Xbox 360's IBM PPE solution is gimped by its 1.6 GHz L2 cache, which is shared by 3 PPE CPU cores running at 3.2 GHz. IBM's PPE is bullsh*t high-clock-speed marketing with very low effective IPC.

The SPE runs its ALUs and local data storage at the full 3.2 GHz clock speed. 256KB of local data storage (LDS) is allocated to each SPU.

MS set the system TDP bias towards the GPU.

The Xenos GPU has tessellation hardware coupled with vertex shaders and shader stream-out, which is similar to "GPU - SO" (stream out).

GPU GS = geometry shader; its role has been reduced by DX11's tessellation hardware. DX10 GPUs don't have tessellation hardware.

2. RSX's G70 design is last gen, since the 8800 GTX (1st-gen CUDA) was released a few weeks earlier.

The RSX has problems with full-speed FP32 compute and shader branching.


#68  Edited By PC_Rocks
Member since 2018 • 8459 Posts

@tormentos said:

LINK me to Cell's so-called promises. At this point you are basically grasping and talking shit without any links backing your argument, so please link me to the so-called promises made with Cell. [...]

I already linked the 3 promises. Where did it accomplish them? Don't worry, link me to those and I'll still have dozens more ridiculous claims about teh cell from Sony and all its first parties. Teh cell was a complete and utter disaster and it didn't deliver jack sh*t. If anything the GPU saved teh cell, not the other way around.

RSX was a last-gen GPU; it didn't have unified shaders, and Sony opted for it rather than putting in another cell as they had previously hoped, because the cell was complete crap at the job. Your overlords chose a cut-down, last-gen GPU over teh cell when they realized what a stinker and money pit it was.

Just branching? Teh cell was also crap at time sharing, threading, any task that relies on multiple instruction streams. The only thing it was somewhat decent at was executing a single instruction over multiple data sets in parallel, kinda like GPU compute, but even that was nowhere near the ballpark of what a GPU can do. And in that scenario the PPU has to fork out and merge back the job results from the SPEs, negating much of the advantage, and it sucked at the parallel workloads that traditional CPUs do. So it tried to combine what CPUs and GPUs do, but failed miserably at both of those tasks, just like Larrabee. No amount of DC will change that. Your overlords themselves abandoned it the first chance they got, when the plan was to go for teh cell 2 with 2 PPUs and 14-16 SPUs, but both traditional CPUs and GPUs demolished all of Sony's hopes, because they excel far better than teh cell at those tasks while being cheaper and much more developer friendly. Sony was planning to incorporate teh cell into multimedia devices, consoles, data centers, TVs, computers, everything, and look how that panned out. No, those army projects with cell don't count, and I'm sure you're definitely going to fall back on them for an excuse. That was before the advent of the modern GPU, when they thought it would be good for those number-crunching tasks, as before that only CPUs were used to solve those problems. Even then they abandoned it in favor of GPU-based supercomputers coupled with traditional CPUs, putting the final nail in teh cell's coffin.

Again, as I predicted, you didn't link the complete presentation and just tried to cherry-pick a part of it out of context. Funny that you are now trying to show what Ubisoft said, while I was the one that pointed it out, as I know the actual presentation which you tried to hide. They were talking about compute-like performance, not CPU power; you failed again. Teh cell was a complete and utter disaster. Your out-of-context links don't work on me, as I know exactly everything about it. You can't bullsh*t your way past me by throwing random graphs. Loving the DC that it wasn't trying to run Windows, hence general performance doesn't matter, when you and Sony tried to pass it off as teh supercomputer in the same breath for those army projects. Loved the double standards and the contradiction within the same argument; it just shows you have no leg to stand on. Hell, I would go as far as to say that teh cell was one of the biggest architecture and design blunders in the history of computing, and its own IP owners' abandonment of it is the living proof of that.

That's the reason all people are sick and tired of you cows: the cult is easy to see. You're defending a failed project even though your overlords admitted it and moved on by pulling all their investment out of it. It's not just lems, it's your cult around Sony. Still waiting for where it did 4D, 120FPS and dual 1080p. Show those first and I'll still put out dozens of other outrageous claims.

Yeah, those extremely linear games with tunnels and extremely static worlds with everything baked, no physics, no shadows, no realtime lighting, can go toe to toe with Crysis 2? 720p is chump change compared to what Crysis 2 did. Nobody cares about the narrative that you cows have tried to build here. I would take the words of the actual engineers and DF over a butthurt cow's. Crysis 2/3 on 360 is still technically superior to whatever Sony put out on teh cell, just like the majority of the multiplats.

Again, nobody, and when I say nobody I literally mean it, it's not just a figure of speech, nobody cares what you think about people. Every single thread and post of yours in SW is proof of that. All laugh at you and your hypocrisy. You call everyone who criticizes your overlord Sony and points out the hypocrisy of cows a lemming. You have called me, Zaria, NoodleFighter, howmakewood, ghosts, dragonfire, goldenelement, mansieur, kalibird lems, when many of them only own a PS4 as a console, not an Xbone, and some of them don't own any consoles for that matter. All of them are hermits. Hell, I have seen you call lundy, ragnarok and even one mod lemmings, when they don't even go after Sony/PS4/cows as much as we do. As I said, it's just your excuse to get out of criticism. I'm sure you only consider cows that have or had a PC - hermits like quackknight, randommatt etc. - or people like davillain that are fans of the PS4, true hermits, to make yourself feel better. Sorry tormentos, all the factions are sick and tired of cows' bullsh*t, be they lems, hermits or sheep. All the threads in SW are proof of that.

If anything you're a bigger lemming than the people I listed along with me. They are all consistent in their arguments, while you, a hypocrite, switch positions in a heartbeat. I can easily pull up a ton of your threads defending Xbone/MS against PC alongside us hermits, when your only way of defending Sony was to defend MS/Xbone, while in other threads you were hating on MS/Xbone with the same arguments we hermits put up. You're a hypocrite of the highest order. You hide behind Xbone/Switch when it fits your agenda to defend Sony, but in other instances you have gone as far as hiding behind mobiles to attack the Switch. You're a fraud!

Anyway, stop derailing the thread. This is about teh special sauce. If you have anything to add to it, fine, because I don't have time for your verbal diarrhea without facts.


#69  Edited By ronvalencia
Member since 2008 • 29612 Posts

@pc_rocks:

@tormentos

http://www.geautomation.com/news/ge-fanuc-intelligent-platforms-to-revolutionize-military-and-aerospace-embedded-computing-with-nvidia-s-cuda-technology/n2645

GE has military applications using NVIDIA CUDA technology... NOT just "teh CELL".

https://coloradoengineering.com/cei-teams-u-s-navy-radar-system-lockheed-martin-joint-strike-fighter/

NVIDIA CUDA GpGPU technology is embedded in the incoming F-35 Block 4 avionics upgrade.

LAKEHURST, N.J., July 7, 2017 – The Naval Air Warfare Center Aircraft Division in Lakehurst, N.J. awarded a $3 million contract to Colorado Engineering Inc. (CEI) last week to design, fabricate and test graphics processing units (GPUs) paired with a supporting processor infrastructure to upgrade radar aboard the Lockheed Martin F-35 Lightning II joint strike fighter Block 4.

The planned Block 4 upgrades to the F-35 – which have an ETA of 2023 – are expected to include adding a wide-area high-resolution synthetic aperture radar (SAR) mode to the Northrop Grumman APG-81 active electronically scanned array (AESA) radar. The so-called “Big SAR” capability on future versions of the F-35 should be able to produce high-definition radar imagery that covers a large area on the ground in one image to enhance the F-35’s reconnaissance and targeting capabilities.

General-purpose graphics processing (GPGPU) is becoming the cornerstone of digital signal processing in aerospace and defense applications like radar and sonar signal processing, image processing, hyperspectral sensor imaging, signals intelligence, electronic warfare, and persistent surveillance.

The U.S. Navy chose CEI for its radar signal processing expertise. CEI is also experienced in complex circuit board design for military, industrial, and commercial applications, and in the OpenCL software language often used to program GPGPU chips. CEI has established strategic industry partnerships with GPGPU designers NVIDIA and AMD, as well as high-performance general-purpose processor manufacturers Intel Corp. and Texas Instruments.

Intel AVX v2's 256-bit and AVX-512's 512-bit wide SIMDs, with gather instructions at the register level, are closer to a GpGPU's wide horizontal compute than IBM SPU's 128-bit wide SIMD.

AVX-512 also has scatter instructions. https://software.intel.com/en-us/articles/intel-advisor-map-gatherscatter
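As a concrete illustration of register-level gather, a minimal AVX2 sketch with made-up data (requires an AVX2-capable CPU; compile with e.g. g++ -O2 -mavx2):

```cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    float table[16];
    for (int i = 0; i < 16; ++i) table[i] = (float)i;

    // Eight arbitrary indices held in one vector register.
    __m256i idx = _mm256_setr_epi32(15, 3, 7, 0, 12, 9, 1, 4);

    // One instruction: result[i] = table[idx[i]]; scale = sizeof(float).
    __m256 gathered = _mm256_i32gather_ps(table, idx, 4);

    float out[8];
    _mm256_storeu_ps(out, gathered);
    for (int i = 0; i < 8; ++i) printf("%g ", out[i]); // 15 3 7 0 12 9 1 4
    printf("\n");
}
```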

https://www.militaryaerospace.com/articles/2017/07/radar-signal-processing-embedded-computing-gpgpu.html

LAKEHURST, N.J. – U.S. Navy airborne sensors experts are asking radar signal processing experts at Colorado Engineering Inc. in Colorado Springs, Colo., to design advanced embedded computing to upgrade the radar system aboard the Lockheed Martin F-35 Lightning II joint strike fighter.

Officials of the Naval Air Warfare Center Aircraft Division in Lakehurst, N.J., announced a $3 million contract to Colorado Engineering last week to design, fabricate and test graphics processing units (GPUs) paired with a supporting processor infrastructure that are compatible with the F-35's Block 4 radar.

The planned Block 4 upgrades to the F-35 are expected to include adding a wide-area high-resolution synthetic aperture radar (SAR) mode to the Northrop Grumman APG-81 active electronically scanned array (AESA) radar. The F-35 Block 4 upgrades will be broken into two increments; Block 4A in 2021 and Block 4B in 2023.

The so-called "Big SAR" capability on future versions of the F-35 should be able to produce high-definition radar imagery that covers a large area on the ground in one image to enhance the F-35's reconnaissance and targeting capabilities.

The planned F-35 Block 4 upgrades also will include an upgraded electro-optical targeting system, as well as new weapons such as long-range air-to-air missiles and the Small Diameter Bomb.


The GPGPU began life in the last decade as a graphics-processing engine aimed at high-end computer gaming, yet in recent years has emerged as a powerful massively parallel processor. It lends itself to complex floating-point processing, and is easy enough to program to appeal to a broad range of military embedded systems.

The primary designers of GPGPU chips in the U.S. today are NVIDIA Corp. in Santa Clara, Calif., and Advanced Micro Devices Inc. (AMD) in Sunnyvale, Calif.

Colorado Research has expertise in complex circuit board design for military, industrial, and commercial applications, as well as in the OpenCL software language often used to program GPGPU chips.

Among its partners, Colorado Research counts GPGPU designers NVIDIA and AMD, as well as high-performance general-purpose processor manufacturers Intel Corp. and Texas Instruments.

An Intel Larrabee core contained a single 512-bit vector processing unit, and the design scaled to 48 cores.

Larrabee's successors:

  • AMD Epyc 2 (Zen 2) with 64 CPU cores, each CPU core having two 256-bit wide AVX v2 units.
  • Intel Haswell/Broadwell/Skylake S/Kaby Lake/Coffee Lake cores have two 256-bit wide AVX v2 units. Scales to 10 CPU cores with the Comet Lake SKU.
  • Intel Skylake X/Cascade Lake cores contain two 512-bit wide AVX units. Scales to 48 CPU cores.

With Zen v2, AMD has created its own "Larrabee" with superior general IPC.

AMD GCN's x86-64 lite behavior (e.g. pointers, virtual memory, memory page faults, interrupts, IEEE-754 FP conformance with SSE/AVX) is nearly "Larrabee". This also applies to Intel's OpenCL 2.x compliant IGPs (1 TFLOPS FP32 range).

Intel Larrabee was run over by Intel's Xeon Core i series core wars against AMD.

The Larrabee project was recycled into the Xeon Phi products, with an upgraded CPU core derived from Silvermont/Airmont (Intel Atom) and AVX-512 support.


#70 tormentos
Member since 2003 • 33784 Posts

@pc_rocks said:

I already linked the 3 promises. Where did it accomplish them? [...] Anyway, stop derailing the thread. This is about teh special sauce.

Link me to Cell's promises. I want to see Sony state: "With Cell we promise _____________________."

Back up your claims.

The 7900GTX spanked the Xbox 360 and didn't have unified shaders, ass. Just because a feature is missing doesn't mean something is last gen; you were FACTUALLY wrong.

Cell is not a CPU created by Sony itself; it is IBM tech modified with Sony engineers, no different from what Sony is doing with AMD now.

Sony dropped the RSX as well, so I guess that means Nvidia sucks ass. Oh wait, Nvidia currently controls the market and has had the strongest GPUs for years, so how come Sony dropped them?

The only reason Cell, like Nvidia and Intel before it, was dropped is simple: $$$$$.

Nvidia and Intel are more expensive than AMD, just like Cell was too expensive and difficult to shrink, which is needed for cost reduction later in the generation.

There was nothing quite like Cell in 2006, CPU-wise or even close. Hell, a dual-core CPU in 2006 was more than $300, and if you wanted the top-of-the-line one with 2 cores it was close to $700.

And those CPUs would not come close to Cell at all in GPU tasks.

There were no plans for a Cell 2 with 16 SPEs; that is just bullshit spread on forums such as this one. By 2009 IBM had already pulled the plug. What you don't see is that Cell was revolutionary; in fact it was the grandfather of the modern APU.

Cell was a hybrid CPU that could handle both kinds of task.

In an interview with Heise.de, IBM's VP of Deep Computing, David Turek, confirmed that the Cell processor has reached the end of the line. Turek then put a more positive spin on the news by stating the obvious truth that heterogeneous multiprocessors, of which Cell was the first mass-market example, are here to stay, so insofar as IBM continues to produce such chips, Cell's basic concepts and ideas will live on in the company's product line.

https://arstechnica.com/gadgets/2009/11/end-of-the-line-for-ibms-cell/

Once again you end up owned by your own lies. Just because you read some bullshit on forums doesn't mean it's true; you are basically taking stupidity used on forums such as this one and trying to picture it as some kind of FACT.

Again, it is obvious that you are a shitty lemming alt, and that you are not a hermit or anything close to one.

This thread is about the stupidity of a lemming who wants to imply the PS5 will have some kind of secret sauce, using AMD's words and not Sony's, in a pathetic attempt to create a notion that doesn't exist; and when it did exist it was coined by MS and by Digital Foundry running PR, which lemmings such as yourself, on your other account, defended.

Want to talk about secret sauce? Talk about how much of a letdown DX12 was on Xbox, and how MS fooled morons into believing for 4 years that a magical cloud would increase their console's power and that DX12 would double the Xbox One's power.

:)

@ronvalencia said:

@pc_rocks:

@tormentos

http://www.geautomation.com/news/ge-fanuc-intelligent-platforms-to-revolutionize-military-and-aerospace-embedded-computing-with-nvidia-s-cuda-technology/n2645

GE's military applications with NVIDIA CUDA technology... NOT just "teh CELL".

https://coloradoengineering.com/cei-teams-u-s-navy-radar-system-lockheed-martin-joint-strike-fighter/

NVIDIA CUDA GpGPU technology embedded into incoming F-35 Block 4 avionics upgrade.

Intel AVX v2's 256-bit and AVX-512's 512-bit wide SIMDs, with gather instructions at the register level, are closer to a GPGPU's wide horizontal compute than IBM SPU's 128-bit wide SIMD.

AVX-512 also has scatter instructions. https://software.intel.com/en-us/articles/intel-advisor-map-gatherscatter
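A minimal C sketch of the register-level gather being described, assuming a compiler with AVX2 support (the function and variable names are illustrative, not from any source here):

#include <immintrin.h>

/* Gather 8 floats from arbitrary table indices in one AVX2 instruction.
   A 128-bit SIMD unit without gather (the SPU-style design discussed above)
   would need separate scalar loads plus shuffles to build the same vector. */
__m256 gather8(const float *table, const int *idx)
{
    __m256i vidx = _mm256_loadu_si256((const __m256i *)idx);
    return _mm256_i32gather_ps(table, vidx, sizeof(float));
}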

https://www.militaryaerospace.com/articles/2017/07/radar-signal-processing-embedded-computing-gpgpu.html

An Intel Larrabee core contained a single 512-bit vector processing unit, and the design scaled to 48 cores.

Larrabee's successors

  • AMD Epyc 2 (Zen 2) with 64 CPU cores, each of which has two 256-bit wide AVX v2 units (see the peak-FLOPS arithmetic sketched after this list).
  • Intel Haswell/Broadwell/Skylake S/Kaby Lake/Coffee Lake cores have two 256-bit wide AVX v2 units; scales to 10 CPU cores with the Comet Lake SKU.
  • Intel Skylake X/Cascade Lake cores contain two 512-bit wide AVX units; scales to 48 CPU cores.
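To make the list concrete, back-of-the-envelope peak-FLOPS arithmetic in C. The core/pipe/width figures follow the Zen 2 entry above; the clock speed is an assumed illustrative value, not an official spec:

#include <stdio.h>

int main(void)
{
    /* Peak FP32 FLOPS = cores x FMA pipes/core x lanes/pipe x 2 (FMA = mul+add) x clock */
    double cores = 64.0;           /* 64-core Zen 2 example from the list above */
    double pipes = 2.0;            /* two 256-bit AVX v2 units per core */
    double lanes = 256.0 / 32.0;   /* 8 FP32 lanes per 256-bit unit */
    double clock_ghz = 2.5;        /* assumed sustained AVX clock */
    printf("~%.0f GFLOPS FP32 peak\n", cores * pipes * lanes * 2.0 * clock_ghz);
    return 0;
}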

With Zen 2, AMD creates its own "Larrabee" with superior general IPC.

AMD GCN's x86-64-lite behavior (e.g. pointers, virtual memory, memory page faults, interrupts, IEEE-754 FP conformance with SSE/AVX) is nearly "Larrabee". This also applies to Intel's OpenCL 2.x-compliant IGPs (1 TFLOPS FP32 range).

Intel Larrabee was run over by Intel's Xeon/Core i series core wars against AMD.

The Larrabee project was recycled into the Xeon Phi products, with the CPU core upgraded from Silvermont/Airmont (Intel Atom) and support for AVX-512.

What is your point here? To prove that after Sony did it, others did it too?

Because that is about the only thing I can get from this, and it's great to see that you keep comparing Cell to GPUs and even to the Xeon Phi, which was a super-expensive line of compute CPUs as well.

Wow that Cell really was great.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#71 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

1. Xbox 360's IBM PPE solution is gimped by a 1.6 GHz L2 cache shared by 3 PPE CPU cores running at 3.2 GHz. IBM PPE is bullsh*t high-clock-speed marketing with very low effective IPC.

The SPE's ALUs and local data storage (LDS) run at a 3.2 GHz clock; 256 KB of LDS is allocated to each SPU.

MS set system TDP bias towards the GPU.

The Xenos GPU has tessellation hardware coupled with vertex shaders, and shader stream-out, which is similar to "GPU - SO" (stream out).

GPU GS = geometry shader; its role has been reduced by DX11's tessellation hardware. DX10 GPUs don't have tessellation hardware.

2. RSX's G70 design is last gen, since the 8800 GTX (1st-gen CUDA) was released weeks earlier.

RSX has problems with full speed FP32 compute and shader branch.

Well, I guess that makes the Xbox One X last gen; you know, since the PS4 Pro has Vega-like FP16 and other features as well, while the Xbox One X is just Polaris and doesn't have them.

Now think about what your spin will be for this one.

Just because a GPU is missing 1 feature doesn't mean it is last gen; that is just a feature.

But hey, that means Cell was generations ahead of other GPUs and CPUs, since it integrated hardware for both tasks.

The 7xxx line of GPUs was current gen back then; it wasn't last gen, it simply had features missing, and still the 7900 GTX kicked the Xbox 360's GPU without a problem even without unified shaders.

But thanks, because applying your argument, Cell was next gen: there was nothing quite like it, and no CPU or GPU of the time had what Cell had in one package.

Avatar image for Pedro
Pedro

69083

Forum Posts

0

Wiki Points

0

Followers

Reviews: 72

User Lists: 0

#72 Pedro
Member since 2002 • 69083 Posts

@tormentos: This is not about the Xbox One, this is about the fact that the Cell flopped. This is about you trying to rewrite history and reality. Switching the conversation to the Xbox One doesn't change the facts of the Cell's demise. So the sequence of words you are looking for is "I was wrong."

Avatar image for 04dcarraher
04dcarraher

23824

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#73  Edited By 04dcarraher
Member since 2004 • 23824 Posts
@tormentos said:
@pc_rocks said:

"The Cell was an over engineered dead end piece of tech that did not deliver what was originally promised"

Link me to Cell's promises; I want to see Sony state "With Cell we promise _____________________". [...]

I know both IBM and Sony touted the Cell as a "supercomputer on a chip" back in 2005.

The 7900 GTX was only able to outright beat the 360's GPU in early games that didn't use a lot of vertex workload. RSX, aka GeForce 7, was based on older fixed pixel and vertex processors; there was no point in using obsolete hardware... Once games started to use the full feature set of the Shader Model 3 standard, the Xbox 360 started to pull its weight with its flexible unified shader architecture. The only things the 7900 GTX could outpace the 360's GPU in were pixel and texture rate.

The Cell was a partnership "collaboration" between IBM and Sony. IBM created the Power Architecture for the Cell, but the Cell processor design for the PS3 was heavily influenced by Sony... Sony had its own fabs produce the Cell for its own use rather than being supplied directly by IBM... Sony and AMD's partnership is not even close to what IBM and Sony did for the PS3.

Sony didn't want to go with Intel or Nvidia because of the cost and time needed to integrate the two, and AMD offered a better solution with its own APU.

The Cell was the first attempt at a general-purpose CPU with the ability to do parallel workloads. The original plan was for the Cell to do it all for the PS3, but they quickly found out, long before release, that the PPE and 7 SPEs of the Cell weren't enough to keep up with a system with a dedicated GPU, hence the need for RSX. The dual cores paired with a GeForce 8800 from 2006 decimated the Cell... even though they did cost more...

There were talks of releasing a newer, upgraded Cell processor for data-center-type work, but not for Sony's use. But yeah, IBM pulled its focus from the Cell in 2009.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#74  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

[...]

Well, I guess that makes the Xbox One X last gen; you know, since the PS4 Pro has Vega-like FP16 and other features as well, while the Xbox One X is just Polaris and doesn't have them.

Now think about what your spin will be for this one.

Just because a GPU is missing 1 feature doesn't mean it is last gen; that is just a feature.

But hey, that means Cell was generations ahead of other GPUs and CPUs, since it integrated hardware for both tasks.

The 7xxx line of GPUs was current gen back then; it wasn't last gen, it simply had features missing, and still the 7900 GTX kicked the Xbox 360's GPU without a problem even without unified shaders.

But thanks, because applying your argument, Cell was next gen: there was nothing quite like it, and no CPU or GPU of the time had what Cell had in one package.

PS4 Pro GPU's ROPS don't have X1X GPU's near-Vega ROPS coupled with multiple MB of very-high-speed cache. PS4 Pro's GPU doesn't have the full Vega IP. PS4 Pro's ROPS are old-school GCN.

The main reason RSX's design aged is its very poor GPGPU compute. Battlefield 3's software deferred tile rendering on PS3 used the SPUs instead of RSX.

On the PC, Battlefield 3 requires DX10 GPU. https://www.origin.com/aus/en-us/store/battlefield/battlefield-3

On the Xbox 360, Battlefield 3 used Xenos GPU's ALU mode for software deferred tile render.

Xbox 360's GPU kicked 7800 GTX/7900 GTX (zero frame rate) on Battlefield 3.

Xbox 360's GPU kicked the 7800 GTX on Crysis 2 (deferred render on shaders). The Xbox 360 GPU's 8-ROPS disadvantage was bypassed with the ALUs' stream-out write feature. Does this workaround method sound familiar on GCN?


Crysis 2 is a slideshow on the 7800 GTX (G70). The PC doesn't have a CELL to patch G70's shader feature problems, hence a G80/G84 needed to be purchased.

Near Xbox 360 results with X1950's Crysis 2


X1950 has 36 full-speed FP32 pixel shaders and 16 ROPS (read/write units) at 575 MHz.

With DX10 on PC, it's game over for DX9c GPUs e.g. DX10's expanded register set fuks up DX9c GPUs.

Xbox 360's GPU already has a DirectX 12_1 ROV-like feature (e.g. transparency blending order) in its ROPS, since this feature was desired for efficient Xbox 360 GPU emulation on PC, e.g. Halo 3. DirectX 12_0 GPUs can brute-force re-order transparency blending with compute shaders.
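A small C sketch of why blending order matters in the first place, which is the point of ROV-style ordering; the alpha values are arbitrary illustrative numbers:

#include <stdio.h>

/* The "over" blend: out = src * alpha + dst * (1 - alpha).
   Compositing two half-transparent layers in different orders gives
   different results, so transparency blending must be ordered. */
static float over(float src, float alpha, float dst)
{
    return src * alpha + dst * (1.0f - alpha);
}

int main(void)
{
    float bg = 0.0f;
    float a_then_b = over(1.0f, 0.5f, over(0.5f, 0.5f, bg));
    float b_then_a = over(0.5f, 0.5f, over(1.0f, 0.5f, bg));
    printf("%.3f vs %.3f\n", a_then_b, b_then_a);  /* 0.625 vs 0.500 */
    return 0;
}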

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#75  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:
@tormentos said:

[...]

I know both IBM and Sony touted the Cell as a "supercomputer on a chip" back in 2005. [...]

The Cell was a partnership "collaboration" between IBM and Sony. [...]

The SPE's vector instruction set is based on PowerPC's AltiVec/VMX.

Xbox 360 has a compute-shader-like feature which doesn't exist in PC DX9c's Shader Model 3. Both the Xbox 360 and Xbox One GPUs have microcode features.

The Xbox 360 GPU has an 8-concurrent-context feature. Current GCN has this with its async compute and multiple-context features. PC GPUs prior to GCN didn't have multiple-context capability; Nvidia Maxwell GPUs fake multiple-context handling via the CPU, i.e. CPU-driven async compute with multiple contexts.

Each of the Xbox 360 GPU's 8 concurrent contexts can be individually driven by a CPU thread, and the Xbox 360's CPU has 6 threads.

The Xbox 360 GPU's 8 concurrent contexts present the programmer with 8 virtual stream processors on top of the GPU's 48 stream processors.

Each GCN version 1.1 ACE unit has 8 contexts with 8 queues.

PS4 and Hawaii GCN version 1.1 went overboard with 8 ACE units.

Xbox 360's GPU was a parent design for the successor GCN. The Xbox One GPU can easily emulate the Xbox 360 GPU.
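As a rough conceptual analogy for the multi-context front end described above, a sketch in plain C with pthreads (not any GPU API; every name here is hypothetical): each "context" is an independent stream that its own CPU thread can feed.

#include <pthread.h>
#include <stdio.h>

#define NUM_CONTEXTS 8  /* mirrors the 8 concurrent contexts / 8 ACE queues above */

/* Each "context" drains its own independent command stream, the way each
   Xbox 360 GPU context could be driven by its own CPU thread. */
static void *context_worker(void *arg)
{
    long id = (long)arg;
    printf("context %ld: draining its own command stream\n", id);
    return NULL;
}

int main(void)
{
    pthread_t ctx[NUM_CONTEXTS];
    for (long i = 0; i < NUM_CONTEXTS; i++)
        pthread_create(&ctx[i], NULL, context_worker, (void *)i);
    for (int i = 0; i < NUM_CONTEXTS; i++)
        pthread_join(ctx[i], NULL);
    return 0;
}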

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#76  Edited By tormentos
Member since 2003 • 33784 Posts

@Pedro said:

@tormentos: This is not about the Xbox One, this is about the fact that the Cell flopped. This is about you trying to rewrite history and reality. Switching the conversation to the Xbox One doesn't change the facts of the Cell's demise. So the sequence of words you are looking for is "I was wrong."

No, in fact this is not even about the PS3, ass; this thread is about the PS5, but the conversation evolved to the PS3 thanks to PS haters.

So again it's lol-worthy that you can follow an argument to a certain point but no further.

But again, that says more about you than it does about me.

Don't worry: if you people rewrote history to show the Xbox 360 as some kind of success when it ended 3rd and had the worst recorded console failure rate, I guess I am OK rewriting it as well.

@04dcarraher said:

I know both IBM and Sony touted the Cell as a "supercomputer on a chip" back in 2005. [...]

The Cell was a partnership "collaboration" between IBM and Sony. [...]

As a matter of fact, Cell was a collaboration between Sony, IBM and Toshiba, but it was IBM tech modified with the help of Sony and Toshiba, and obviously manufactured by Sony in its foundries; IBM was the one with the PPC cores, not Sony.

In fact it is the same with the Xbox 360's CPU, Xenon: the only difference is that the modifications on Xenon were close to none, but it also used PPC tech.

Unlike in those days, Sony doesn't manufacture those processors anymore, nor does it own the foundry for them, but the way it works NOW is very similar: AMD puts up the tech, Sony makes some customizations, and neither Sony nor AMD manufactures them; TSMC does, and Sony pays AMD a royalty for each chip manufactured. This is the same way MS does it as well.

But yeah, there was no intention of a Cell 2 for consoles; it was too expensive and too difficult to cost-reduce, which is vital for consoles.

@ronvalencia said:

PS4 Pro GPU's ROPS don't have X1X GPU's near-Vega ROPS coupled with multiple MB of very-high-speed cache. [...] Xbox 360's GPU kicked the 7800 GTX on Crysis 2 (deferred render on shaders). [...]

First things first: I said 7900 GTX, not 7800; I wonder how you didn't see that. In fact, I myself have claimed multiple times that the 360's GPU was superior to the 7800 GTX. Oh, and in fact Crysis 2 on Xbox 360 is not even full 720p, and the frame rate is all over the place with drops to the low 20s.

The Xbox One X has near-Vega CRAP; it has nothing from Vega, and nothing MS confirmed for the GPU was actually from Vega, while Sony did name 2 features from Vega.

Again: the Pro has FP16 and the XBO X doesn't, so that means the PS4 Pro is next gen and the Xbox One X isn't. You said it. It's arguments like this that make me laugh at you, and how you change your act depending on how the Xbox is doing, just to defend the Xbox.

So does 1 extra feature make a GPU next gen, YES or NO?

Think wisely before you own yourself, and keep your argument consistent.

Avatar image for Pedro
Pedro

69083

Forum Posts

0

Wiki Points

0

Followers

Reviews: 72

User Lists: 0

#77 Pedro
Member since 2002 • 69083 Posts

@tormentos: Silly person. Side-tracking because you can't handle the facts. The fact you decided to argue against was the Cell failing, aka flopping. It did, and you are unable to handle it, to the point that you side-track to the Xbox One, which literally has nothing to do with the Cell's failure. Then to name-calling. Still doesn't change the fact that the Cell flopped. Now you are trying to side-track again. All you have to do is simply accept that the Cell was a failure and flopped megalithically hard. Learn to accept facts, don't fight them. :)

Avatar image for deactivated-5cd08b1605da1
deactivated-5cd08b1605da1

9317

Forum Posts

0

Wiki Points

0

Followers

Reviews: 27

User Lists: 0

#78 deactivated-5cd08b1605da1
Member since 2012 • 9317 Posts

@Bread_or_Decide said:

Every multiplat was worse on PS3.

Hmm, no they weren't. Some exceptions existed. In fact, some later-gen games, like GTA V for example, looked better on PS3.

https://www.eurogamer.net/articles/digitalfoundry-grand-theft-auto-5-face-off

Since all else is identical across the board, the PS3 version is recommended on the grounds of image quality if you have the option.

Probably because developers already understood the "hard to develop for" PS3 architecture better later in the gen. I'm not tech savvy by any means, but from what I understand the problem with the Cell was its "hard to develop for" architecture rather than its actual capabilities. Hell, I remember reading about the US military buying thousands of PS3s for the Cell processor alone, so something special must have been there...

Avatar image for hrt_rulz01
hrt_rulz01

22367

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#79 hrt_rulz01
Member since 2006 • 22367 Posts

@cainetao11 said:

Lmao this forum.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#80  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
[...]

First things first: I said 7900 GTX, not 7800; I wonder how you didn't see that. In fact, I myself have claimed multiple times that the 360's GPU was superior to the 7800 GTX. Oh, and in fact Crysis 2 on Xbox 360 is not even full 720p, and the frame rate is all over the place with drops to the low 20s.

The Xbox One X has near-Vega CRAP; it has nothing from Vega, and nothing MS confirmed for the GPU was actually from Vega, while Sony did name 2 features from Vega.

Again: the Pro has FP16 and the XBO X doesn't, so that means the PS4 Pro is next gen and the Xbox One X isn't. You said it. It's arguments like this that make me laugh at you, and how you change your act depending on how the Xbox is doing, just to defend the Xbox.

So does 1 extra feature make a GPU next gen, YES or NO?

Think wisely before you own yourself, and keep your argument consistent.

7900 GTX is similar to 7800 GTX.


A slideshow (it's lower than the 24 FPS cinematic frame rate, LOL) on a 7900 GT with low detail settings! Deferred rendering hammered G71. PC's Crysis 1 is not a deferred renderer, i.e. it's a forward renderer.

Xbox One X's (GCN 40/44 CU) ROPS have a 2 MB render cache, and the TMUs have a 2 MB L2 cache (as per Polaris 10/20 IP). Total cache outside the CUs' L1 and LDS is about 4 MB. That's near Vega 56/64's 4 MB L2 cache for its ROPS and TMUs.

Vega 56/64's ROPS and TMU has 4 MB L2 cache shared.

Polaris IP baseline doesn't have 2 MB render cache for it's ROPS.

Xbox One X's GPU didn't pool the 2 MB render cache together with the 2 MB L2 cache to make it Vega 56/64's 4 MB L2 design, hence MS can't claim X1X's cache setup to be Vega 56/64, but it's more than the Polaris 10/20 baseline design.

X1X beat the PS4 Pro, delivering 2X the pixel count in RDR2 (resolution scales by GCN TFLOPS power, up to X1X's large IPC gain).

https://www.eurogamer.net/articles/digitalfoundry-crysis2-face-off

The reason here is fairly straightforward: while Crysis 2 runs at 1152x720 resolution on the Microsoft platform, PS3 operates at a base resolution of 1024x720

1280x720 = 921600

1152x720 = 829440; Xbox 360 is 90 percent of the 1280x720 pixel count.

1024x720 = 737280
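The percentages check out; a trivial C snippet to reproduce the arithmetic:

#include <stdio.h>

int main(void)
{
    int full = 1280 * 720;  /* 921600 */
    int x360 = 1152 * 720;  /* 829440 */
    int ps3  = 1024 * 720;  /* 737280 */
    printf("360: %.0f%%, PS3: %.0f%% of full 720p\n",
           100.0 * x360 / full, 100.0 * ps3 / full);  /* 90% and 80% */
    return 0;
}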

Your " frame rate is all over the place with drops to the low 20's" is bullshit since Xbox 360's frame rate majority is not low 20s.

Without Xbox 360's stream-out feature, 8 ROPS would be a limitation relative to G70/G71's higher ROPS count. RSX has 8 ROPS like the GeForce 7600, hence the SPUs are needed to exceed 8 ROPS' worth of memory write operations.

Avatar image for Grey_Eyed_Elf
Grey_Eyed_Elf

7970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#81 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

AMD has been struggling with GCN for years now, plus they lost a lot of key staff from the GPU department who went to work for Intel... I have very little faith in Navi, and even less in a TDP-limited console.

Avatar image for deactivated-6092a2d005fba
deactivated-6092a2d005fba

22663

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#82 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@babyjoker1221 said:
@tormentos said:

@Pedro:

Doesn't change the fact that calling something a flop doesn't quite make it one outside your fragile and biased mind.

You know what really flopped hard? Your education. Whoever taught you to read and write should be shot.

Also, your thumbs are a flop as well. Evolution attempted to thwart you from posting nonsense on the internet, but here you are.... Fighting it every step of the way.

Now that shit is funny to me lol

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#83  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Grey_Eyed_Elf said:

AMD has been struggling with GCN for years now, plus they lost a lot of key staff from the GPU department who went to work for Intel... I have very little faith in Navi, and even less in a TDP-limited console.

Define these key staff, e.g.:

Raja Koduri

Koduri joined S3 Graphics in 1996.

He became the director of advanced technology development at ATI Technologies in 2001.[2]

Following Advanced Micro Devices's 2006 acquisition of ATI, he served as chief technology officer for graphics at AMD until 2009.

He then went to Apple Inc., where he worked with graphics hardware, which allowed Apple to transition to high-resolution Retina displays for its Mac computers.[3]

He returned to AMD in 2013 as a vice president in Visual Computing, which includes both GPU hardware and software, unlike his pre-2009 role at AMD which only concerned GPU hardware.

Under Koduri, GCN didn't increase its quad-raster-engine/64-ROPS count and focused on "TFLOPS".

2007, R600 VLIW5: the R600 engineers were fired and moved to Intel. Heavy focus on CU FLOPS, gimped ROPS MSAA hardware. Another design team handled the SIMD-based Xenos GPU. VLIW5 is a fuking Itanium-type design concept for GPUs. NVIDIA quickly dumped the VLIW-based GeForce FX.

2008, R700 VLIW5: Radeon HD 4870 was decent.

2009, Evergreen VLIW5, e.g. Radeon HD 5870. Koduri jumped ship to Apple.

2010, Northern Islands VLIW4, e.g. Radeon HD 6970. Introduced dual geometry/raster engines with 32 ROPS.

Dec 2011, Southern Islands: SIMD-based GPUs returned with the 7970, dumping the VLIW GPU designs.

...

2013, R9 290X was released with the quad-raster-engine/64-ROPS design, and the same year Koduri rejoined AMD. Without Koduri, Radeon had improved its raster unit count.

....

Dec 2017, Koduri left AMD for Intel. Still stuck at a quad-raster-engine/64-ROPS design after almost 5 years!!! Raster IPC got worse with Fury X and Vega 64.

RTG needs purging just like the Bulldozer debacle, i.e. mass executive-management-level firings.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#84 tormentos
Member since 2003 • 33784 Posts

@Pedro said:

@tormentos: Silly person. Side tracking because you can't handle the facts. The fact that you decided to argue against was the Cell failing aka flopping. It did and you are unable to handle it. To the point that you side track to the Xbox One which literally has nothing to do with the Cell's failure. Then to name calling. Still doesn't change the fact that the Cell flopped. Now you are trying to side track again. All you have to do is simply accept that the Cell was a failure and flopped megalithically hard. Learn to accept facts, don't fight it. :)

There is no side-tracking. The troll thread, by the way, was about the PS5's supposed secret sauce, a failed attempt by a lemming hiding behind PC to prove Sony was doing the same shit MS did, which is not true. From then on the PS3 was brought into the thread, and you followed along without problems, but when I brought the Xbox One into the equation to make a point, you simply could not follow anymore.

Again, it says more about you than it does about me, lemming. :)

@ronvalencia said:

7900 GTX is similar to 7800 GTX. [...]

It's also funny that out of all the examples out there, Crysis 2 was picked. Crysis has been known for years to be a pretty messed-up game; basically what Crytek did was throw in a bunch of effects without clear optimization. Even on low settings, running Crysis was a problem, and not because it looked great on low; it was simply an unoptimized game. Even on consoles it still was, while great-looking games like Gears and Uncharted had much better frame rates and were also true 720p, not 90% of it, by the way.

The PS3's GPU was total shit; it could not hold a candle to the Xbox 360's at all, and in fact even its bigger brother, the 7800, could not beat the 360. But somehow the PS3 did. How?

Yeah, that is what people ignore and want to pretend didn't happen. The truth is that if the PS3 had had a Xenon-like CPU, the Xbox 360 would have walked all over it, because there would have been no way to help that wimpy GPU.

In fact, I posted an even cleaner example: Oblivion on Xbox 360, which wasn't even 720p at launch, while on PC, with higher-than-360 settings, it could go to 1600x1200 and still have smooth frame rates.

But like always, you chose to hide behind the worst example, a game that doesn't represent the majority of games on PC at all, because Crysis doesn't represent how the great majority of games ran on PC.

@i_p_daily said:

Now that shit is funny to me lol

Oh look group therapy..Hahahaha

Show me on this anatomically correct doll where Sony touched you all.

Hahahahaa

Avatar image for Pedro
Pedro

69083

Forum Posts

0

Wiki Points

0

Followers

Reviews: 72

User Lists: 0

#85 Pedro
Member since 2002 • 69083 Posts

@tormentos: Still unwilling to accept the failure of the Cell, I see. Unfortunately for you, it doesn't change its actual demise.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#86  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
[...]

It's also funny that out of all the examples out there, Crysis 2 was picked. Crysis has been known for years to be a pretty messed-up game; basically what Crytek did was throw in a bunch of effects without clear optimization. Even on low settings, running Crysis was a problem, and not because it looked great on low; it was simply an unoptimized game. Even on consoles it still was, while great-looking games like Gears and Uncharted had much better frame rates and were also true 720p, not 90% of it, by the way.

The PS3's GPU was total shit; it could not hold a candle to the Xbox 360's at all, and in fact even its bigger brother, the 7800, could not beat the 360. But somehow the PS3 did. How?

Yeah, that is what people ignore and want to pretend didn't happen. The truth is that if the PS3 had had a Xenon-like CPU, the Xbox 360 would have walked all over it, because there would have been no way to help that wimpy GPU.

In fact, I posted an even cleaner example: Oblivion on Xbox 360, which wasn't even 720p at launch, while on PC, with higher-than-360 settings, it could go to 1600x1200 and still have smooth frame rates.

But like always, you chose to hide behind the worst example, a game that doesn't represent the majority of games on PC at all, because Crysis doesn't represent how the great majority of games ran on PC.

Your facts? Who are you?

http://www.gamasutra.com/view/feature/3904/processing_the_truth_an_interview_.php?page=3

"I'm going to have to answer with an 'it depends,'" laughs Shippy, after a pause. "Again, they're completely different models. So in the PS3, you've got this Cell chip which has massive parallel processing power, the PowerPC core, multiple SPU cores it's got a GPU that is, in the model here, processing more in the Cell chip and less in the GPU. So that's one processing paradigm -- a heterogeneous paradigm."

"With the Xbox 360, you've got more of a traditional multi-core system, and you've got three PowerPC cores, each of them having dual threads -- so you've got six threads running there, at least in the CPU. Six threads in Xbox 360, and eight or nine threads in the PS3 -- but then you've got to factor in the GPU," Shippy explains. "The GPU is highly sophisticated in the Xbox 360."

He concludes: "At the end of the day, when you put them all together, depending on the software, I think they're pretty equal, even though they're completely different processing models."

Xbox 360's 10 MB eDRAM can impose hard resolution limits, and its ROPS are locked inside this eDRAM. XBO improves this situation with the ability to write to both DRAM and eSRAM, but XBO's GPU wouldn't beat the larger-scale GPU inside the PS4.

Xbox 360's GPU has design concepts that lead directly into GCN, e.g. Xenos has 8 concurrent contexts, 64 threads, 48 scalar + 48 SIMD4 stream processors at 500 MHz, a compute stream-out feature (DX10-like) to bypass the 8-ROPS limit, hardware tessellation (leading to DX11 tessellation), a DirectX 12_1-like ROPS blending re-order feature (returned with Vega's ROPS), and GPU/host CPU pointer compatibility.
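For reference, the peak-FLOPS arithmetic implied by that shader layout, assuming the commonly cited vec4+scalar co-issue with a multiply-add (2 FLOPs) per lane:

#include <stdio.h>

int main(void)
{
    /* 48 ALUs x (4 vector + 1 scalar) lanes x 2 FLOPs (MADD) x 500 MHz */
    double gflops = 48.0 * (4.0 + 1.0) * 2.0 * 500e6 / 1e9;
    printf("~%.0f GFLOPS FP32 peak\n", gflops);  /* ~240 */
    return 0;
}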

CELL's 6 SPUs (scalar + SIMD4) at 3.2 GHz available for games can fake the above compute functions, with RSX handling texture and raster functions. Each SPU can handle its own context.

1) Two PPU/VMX units: there are three PPU/VMX units on the 360 and just one on the PS3, so any load on the 360's extra two PPU/VMX units must be moved to SPU.

2) Vertex culling: you can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to SPU, otherwise the PS3 won't keep pace with the 360 (see the sketch after this list).

3) Vertex texture sampling: you can texture sample in vertex shaders on the 360 just fine, but it's unusably slow on PS3. Most multiplatform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it, then you will have no choice but to move all such functionality to SPU.

4) Shader patching: changing variables in shader programs is cake on the 360. Not so on the PS3, because they are embedded into the shader programs, so you have to use SPUs to patch your shader programs.

5) Branching: you never want a lot of branching in general, but when you really need it, the 360 handles it fine; the PS3 does not. If you are stuck needing branching in shaders, then you will want to move all such functionality to SPU.

6) Shader inputs: you can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to SPU to minimize the number of inputs needed by the shader programs.

7) MSAA alternatives: MSAA runs full speed on the 360 GPU, needing just CPU tiling calculations. MSAA on the PS3 GPU is very slow. You will want to move MSAA to SPU as soon as you can.

8) Post-processing: the 360 is a unified architecture, meaning post-process steps can often be slotted into GPU idle time. This is not as easily doable on PS3, so you will want to move as much post-processing to SPU as possible.

9) Load balancing: the 360 GPU load-balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load, then you don't care. Not so on PS3, where such load shifts will cause frame drops. You will want to shift as much load as possible to SPU to minimize your peak load on the GPU.

10) Half floats: you can use full floats just fine on the 360 GPU. On the PS3 GPU they cause performance slowdowns. If you really need to use shaders with many full floats, then you will want to move such functionality over to the SPUs.

11) Shader array indexing: you can index into arrays in shaders on the 360 GPU no problem. You can't do that on PS3. If you absolutely need this functionality, then you will have to either rework your shaders or move it all to SPU.

Etc, etc, etc...
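A hedged C sketch of the recurring pattern behind most items in this list: DMA a batch into the SPU's 256 KB local store, process it, DMA the results back. It assumes the Cell SDK's spu_mfcio.h MFC interface; the buffer size, function name and processing step are hypothetical.

#include <spu_mfcio.h>

#define BATCH_BYTES (16 * 1024)

/* 128-byte-aligned staging buffer in the SPU's 256 KB local store. */
static char batch[BATCH_BYTES] __attribute__((aligned(128)));

void process_batch(unsigned long long ea, unsigned int bytes, unsigned int tag)
{
    mfc_get(batch, ea, bytes, tag, 0, 0);   /* DMA the batch in from main memory */
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();              /* block until the transfer completes */

    /* ... cull / patch / transform the batch with SPU SIMD here ... */

    mfc_put(batch, ea, bytes, tag, 0, 0);   /* DMA the results back out */
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();
}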

Too many design flaws with RSX/G7x, which were hidden by NVIDIA's "The way it's meant to be played", and the price was paid in poor fine-wine characteristics.

RSX's FP32 units need to be flattened.

RSX: ~400 GFLOPS = (24 pixel pipes x 27 FLOPs x 550 MHz = 356.4 GFLOPS) + (8 vertex pipes x 10 FLOPs x 550 MHz = 44 GFLOPS). But those 27 FLOPs are not presented as a flat structure, as they are in generalized unified-shader GPUs.

Pixel Shader pipeline for NVIDIA G7X/RSX

The bottleneck is the input fragment data, i.e. FP32 Shader Unit 2 is dependent on FP32 Shader Unit 1.

RSX's pixel shader FP units are NOT generalized like AMD GCN's CU stream processors, i.e. it's not an apples-to-apples comparison.

Xbox 360's GPU has 48 unified shader pipelines being exposed to the programmer as a flat layout.

The ideal situation would have been to combine the 6-SPU and RSX silicon budgets into a G80-like beast with Xenos's 8-concurrent-context front end, which is what Sony executed for the PS4.

Both Xbox 360 (182 mm2) and Xbox One (160 mm2) followed a similar GPU chip area size, while the PS4's GPU is 212 mm2. Microsoft retaliated with a ~284 mm2 GPU in the Xbox One X and ditched the 32 MB eSRAM design (XBO's design was influenced by Xbox 360's chip area proportions).

In game-console terms, Xbox One X's GPU area is the largest to date.

With the PS4's PCB budget, Sony combined the PS3 PCB budget's 64-bit XDR and 128-bit GDDR3 into 256-bit GDDR5. Microsoft retaliated with a 384-bit GDDR5 PCB budget in the X1X.

The PS4 Pro has a 256-bit-bus GDDR5 + 1 GB DDR3 PCB budget. Sony is pushing PCB design within a set budget.
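Those bus widths translate directly into peak bandwidth; a quick C calculation using the commonly cited GDDR5 data rates for these parts:

#include <stdio.h>

int main(void)
{
    /* Peak bandwidth = (bus width / 8) bytes per beat x data rate in Gbps per pin */
    double gb_256 = (256 / 8) * 5.5;  /* 256-bit GDDR5 @ 5.5 Gbps -> 176 GB/s  */
    double gb_384 = (384 / 8) * 6.8;  /* 384-bit GDDR5 @ 6.8 Gbps -> ~326 GB/s */
    printf("256-bit: %.0f GB/s, 384-bit: %.1f GB/s\n", gb_256, gb_384);
    return 0;
}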

Xbox Anaconda follows X1X's design concepts with updated 7nm process tech and GDDR6 parts.


Hawaii Pro GCN's "Fine wine" crushed GTX 780 with recent games.

Avatar image for pc_rocks
PC_Rocks

8459

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#87  Edited By PC_Rocks
Member since 2018 • 8459 Posts

@tormentos said:
@pc_rocks said:

I already linked the 3 promises. Where did it accomplished that? Don't worry link me to those and I'll still have dozens more of the ridiculous claims about teh cell by Sony and all its first party. Teh cell was ac omplete and utter disaster and it didn't deliver jack sh*t. If anything the GPU saved teh cell not the other way around.

RSX was a last gen GPU it didn't have unified shaders and Sony opted for that rather than put in another cell as they previously hoped because it was complete crap at it. Your overlords chose a cut down last gen GPY over teh cell when the realized the stinker and the money pit it was.

Just branching? Teh cell was also crap at time sharing, threading, any task that relies on multiple instruction streams. The only thing it was somewhat decent was execute a single instruction with multiple data sets in parallel, kinda like a GPU compute but that was again nowhere near the ballpark figure a GPU can do. Even in that scenario PPU has to fork and merge the jobs result from SPE's negating much of the advantage and sucked at parallel workloads that traditional CPUs do. So it tried to combine what CPU and GPUs do but failed miserably at both of those tasks just like Larrabee. No amount of DC will change that. Your overlords themselves abandon it the first chance they got when the plan was to go for teh cell 2 with 2 PPUs and 14-16 SPU but the both the traditional CPUs and GPUs just demolished all hopes of Sony because they excel much better than teh cell at those tasks while being cheaper and much more developer friendly. Sony was planning to incorporate teh cell in multimedia devices, consoles, data centers, TVs, computers everything and look how that panned out. No those army projects with cell doesn't count as I'm sure you're definitely going to fall back on for a an excuse. It was before the advent of modern GPU and they thought it would a good for those number crunching tasks as before that only CPUs were used to solve these problems. Even then they abandon it in favor of GPU based supercomputers coupled with traditional CPUs putting final nail in teh cell's coffin.

Again as I predicted, you didn't link the complete presentation and just tried to cherry pick the part out of context. Funny you are now tring to show that Ubisoft say that while I was the one that pointed it out as I know the actual presentation which you tried to hide.They were talking about the compute like performance not the CPU power, you failed again. Teh cell was a complete and utter disaster. Your out of context link don't work on me as I know exactly everything about it. You can't bullsh*t your way out of me by throwing random graphs. Loving the DC that it wasn't trying to run Windows hence the general performance doesn't matter you and Sony tried to pass it off as teh supercomputer in the same breath for those army projects. Loved that double standards and contradiction for the same argument just shows you have no leg to stand on. Hell I was go as far as to say that teh cell was one of the biggest architecture and design blunders in the history of computing architectures and their own IP owners abandonment is the living proof of that.

That's the reason all people are sick and tired of you cows because the cult is easier to see. You're defending a failed project even though your overlords admitted it and moved on by pulling all their investments out of it. It's not just lems but your cult around Sony. Still waiting for where did it do 4D, 120FPS and dual 1080p. Show those first I'll still put out dozens of other outrageous claims.

Yeah that extreme linear games with tunnels, extremely static worlds with everything baked, no physics, no shadows, no realtime lighting can go toe to toe with Crysis 2? 720p is a chump change compared to what Crysis 2 did. Nobody cares about a narrative that you cows have tried to build here. I would take the worlds of the actual engineers and DF over a butthurt cow. Crysis 2/3 on 360 is still techniucally superior to whatever Sony put out on teh cell just like the majority of the multiplats.

Again nobody, and when I say nobody I literally mean it, it's not just a figure of speech. Nobody cares who you think about people. Every single thread and post of yours in Sw is a proof of that. All laughs at you and your hypocrisy. You call everyone lemming that criticizes your overlord SOny and points out the hypocrisy of cows. You have called me, Zaria, NoodleFighter, howmakewood, ghosts, dragonfire, goldenelement, mansieur, kalibird - lems when many of them only owns PS4 as a console not Xbone and some of them don't own any consoles for that matter. All of them are hermits. Hell I have seen you call lundy, ragnarok and even one mod lemming when they don't even after go Sony/PS4/cows as much as we do. As I said, it's just your excuse to get out of the criticism. I'm sure you just consider cows that have or had a PC - hermits like quackknight, randommatt etc. or people like davillain that are a fan of PS4 true hermits to make you feel better. Sorry tormentos, all the factions are sick and tired of cows bullsh*t, be it lems, hermits or sheep. All the threads in SW are the proof of that.

If anything you're a bigger lemming than the people I listed along with me. They are all consistent in their arguments while you - a hypocrite switches position in a heart beat. I can easily pull ton of your threads defending Xbone/MS against PC with us hermits when your only way of defending Sony was to defend the MS/Xbone when you in other threads you were hating the MS/Xbone with the same arguments we hermits put up. You're a hypocrite of a highest order. You hide behind Xbone/Switch when it fits your agenda to defend but Sony but in other instances have gone as far as hiding behind mobiles to attack Switch. You're a fraud!

Anyway, stop derailing the thread. This is about teh special sauce. If you have anything to add to it fine because I don't have time for your verbal diarrhea without facts.

Link me to Cell promises i want to see Sony state ""With Cell we Promise _____________________""

Back up your claims.

The 7900GTX spanked the xbox 360 and didn't have unified shaders ass,just because a feature is missing doesn't mean something is last gen,you were FACTUALLY wrong.

Cell is not a CPU created by sony it self is IBM tech modify with sony engineer,nothing different from what sony is doing with AMD.

Sony dropped the RSX as well i guess that means Nvidia sucks ass.Oh wait Nvidia currently controls the market and has the strongest GPU for years how come sony drops them?

The only reason Cell as well as Nvidia and Intel before it as well were drop is simple $$$$$.

Nvidia and Intel are more expensive than AMD,just like Cell was to expensive ans difficult to shrink which is need it for cost reducing latter on the generation.

There was nothing quite like Cell in 2006 CPU wise or close,hell a dual core CPU in 2006 was more than $300 and if you wanted the top of the line one with 2 cores was close to $700.

And those CPU in GPU task would not be close to Cell at all.

There was no plans for Cell 2 of 16 spe that is just bullshit spread on forums such as this one,on 2009 already IBM pulled the plug,what you don't see now is that Cell was revolutionary in fact it was the grand father of modern APU right now.

Cell was a hybrid CPU that could handle both task.

n an interview with Heise.de, IBM's VP of Deep Computing, David Turek, confirmed that the Cell processor has reached the end of the line. Turek then put a more positive spin on the news by stating the obvious truth that heterogeneous multiprocessors, of which Cell was the first mass-market example of, are here to stay, so insofar as IBM continues to produce such chips, Cell's basic concepts and ideas will live on in the company's product line.

https://arstechnica.com/gadgets/2009/11/end-of-the-line-for-ibms-cell/

Once again you end up owned by your own lies. Just because you read some bullshit on forums doesn't mean it is true; basically you are arguing with stupidity recycled from forums such as this one and trying to present it as some kind of FACT.

Again, it is obvious that you are a shitty lemming alt, and that you are not a hermit or anywhere close to one.

This thread is about the stupidity of a lemming who wants to imply the PS5 will have some kind of secret sauce using AMD's words, not Sony's, in a pathetic attempt to create a notion that doesn't exist - and when it did exist, it was coined by MS and by Digital Foundry running PR, which lemmings such as yourself, from your other account, defended.

Want to talk about secret sauce? Talk about how much of a letdown DX12 was on Xbox, and how MS fooled morons into believing for 4 years that a magical cloud would increase the power of its console and that DX12 would double the Xbox One's power.

:)

If I need to link those Sony claims, then that means you have already lost the debate and are now just dragging your feet. Hint: you just acknowledged those claims in your previous post. Show me where teh cell accomplished those outrageous claims.

Second, teh cell was an utter and complete disaster, and if it weren't for the last-gen GPU that Sony put in the PS3 at the last moment, after teh cell turned out to be a stinker, it would have been an even bigger disaster. If anything, the GPU saved teh cell.

LOL no, Sony didn't abandon the RSX like they abandoned teh cell. Keep grasping at straws and keep throwing sh*t at the wall in hopes some of it sticks. None of your schtick works on me. Sony and MS both approached Nvidia (and also Intel, as common sense tells you). Nvidia weren't bleeding money like AMD, so they didn't compromise on price, and they said as much publicly. They chose to focus their attention on PC and data center GPUs. So it was Nvidia who abandoned Sony, not the other way around. Another egg on your face, while Sony completely abandoned teh cell and sold off all their investments in it. Where are the other products Sony hoped to use cell in? Where's teh cell 2 with 2 PPUs and 16 SPUs?

LOL at the desperation of posting a PR statement from a company that produced a failed product in a joint collaboration. What did you expect them to say - that they wasted billions on a crap architecture that no one uses, which got its a$$ handed to it by modern GPUs working in tandem with CPUs? If he's right, where's its successor? Which product did IBM/Sony base on teh cell's design? Even your own linked quote mentions that he put a positive 'spin' on it. LOL at the out-of-context cherry-picked quote again. Intel also said they learned valuable lessons from Larrabee and would use them in other products, just not in the consumer space. Now tell me that wasn't a failure.

You're a fraud, always have been, always will be, because you're desperately and knowingly lying to protect your overlord Sony. Not that you have much technical know-how to begin with. Nobody cares what a fraud and hypocrite like you thinks. Keep trying to end your posts with a smiley; it won't hide the fact that you're foaming at the mouth because none of your schtick is working on me. Here, have a reminder again:

Again, nobody - and when I say nobody I literally mean it, it's not just a figure of speech - nobody cares what you think about people. Every single thread and post of yours in SW is proof of that. Everyone laughs at you and your hypocrisy. You call everyone who criticizes your overlord Sony and points out the hypocrisy of cows a lemming. You have called me, Zaria, NoodleFighter, howmakewood, ghosts, dragonfire, goldenelement, mansieur, kalibird - lems, when many of them only own a PS4 as a console, not an Xbone, and some of them don't own any consoles at all. All of them are hermits. Hell, I have seen you call lundy, ragnarok and even a mod lemmings when they don't even go after Sony/PS4/cows as much as we do. As I said, it's just your excuse to dodge the criticism. I'm sure you consider cows that have or had a PC, like quackknight, randommatt etc., or people like davillain who are fans of the PS4, 'true hermits' to make yourself feel better. Sorry tormentos, all the factions are sick and tired of cows' bullsh*t, be it lems, hermits or sheep. All the threads in SW are proof of that.

If anything, you're a bigger lemming than the people I listed along with me. They are all consistent in their arguments, while you - a hypocrite - switch positions in a heartbeat. I can easily pull up a ton of your threads where you defended Xbone/MS against PC alongside us hermits, because your only way of defending Sony was to defend MS/Xbone, while in other threads you were hating on MS/Xbone with the same arguments we hermits put up. You're a hypocrite of the highest order. You hide behind Xbone/Switch when it fits your agenda of defending Sony, but in other instances you have gone as far as hiding behind mobiles to attack the Switch. You're a fraud!

Anyway, stop derailing the thread. This is about teh special sauce. If you have anything to add to it, fine, because I don't have time for your verbal diarrhea without facts.

Avatar image for pc_rocks
PC_Rocks

8459

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#88 PC_Rocks
Member since 2018 • 8459 Posts

@Vatusus said:
@Bread_or_Decide said:

Every multiplat was worse on PS3.

Hum, no they weren't. Some exceptions existed. In fact, some later-gen games like GTA V, for example, looked better on PS3.

https://www.eurogamer.net/articles/digitalfoundry-grand-theft-auto-5-face-off

Since all else is identical across the board, the PS3 version is recommended on the grounds of image quality if you have the option.

Probably because developers already understood the "hard to develop for" PS3 architecture better later in the gen. I'm not tech-savvy by any means, but from what I understand the problem with the Cell was its "hard to develop for" architecture rather than its actual capabilities. Hell, I remember reading about the US military buying thousands of PS3s for the Cell processor alone, so something special must have been there...

That is precisely why you shouldn't talk about its capabilities. The military bought Cell for its vector processors, the 'SPUs', for their specialized tasks. The Cell was fairly good at 'number crunching' in specific workloads, but it was still crap at all kinds of general-purpose workloads and carried a higher development cost. The key thing to note is that the military purchases, as well as the Roadrunner that IBM built, happened before unified shaders were implemented and before AMD/Nvidia released GPUs with compute and demonstrated their potential; at that time most supercomputers and clusters were built with CPUs only. Even the military and the research institutes have since abandoned the Cell, and pretty much all supercomputers now use GPUs for the number crunching and traditional CPUs for coordinating the simulations. IBM/Sony bet on the wrong horse with Cell and paid dearly for it, because the problem it was created to solve can be handled much more efficiently by GPUs, with far less development cost. In essence, Cell tried to do both the CPU's and the GPU's jobs but failed to do either effectively.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#89  Edited By ronvalencia
Member since 2008 • 29612 Posts

@pc_rocks said:
@Vatusus said:
@Bread_or_Decide said:

Every multiplat was worse on PS3.

Hum, no they weren't. Some exceptions existed. In fact, some later-gen games like GTA V, for example, looked better on PS3.

https://www.eurogamer.net/articles/digitalfoundry-grand-theft-auto-5-face-off

Since all else is identical across the board, the PS3 version is recommended on the grounds of image quality if you have the option.

Probably because developers already understood the "hard to develop for" PS3 architecture better later in the gen. I'm not tech-savvy by any means, but from what I understand the problem with the Cell was its "hard to develop for" architecture rather than its actual capabilities. Hell, I remember reading about the US military buying thousands of PS3s for the Cell processor alone, so something special must have been there...

That is precisely why you shouldn't talk about its capabilities. The military bought Cell for its vector processors, the 'SPUs', for their specialized tasks. The Cell was fairly good at 'number crunching' in specific workloads, but it was still crap at all kinds of general-purpose workloads and carried a higher development cost. The key thing to note is that the military purchases, as well as the Roadrunner that IBM built, happened before unified shaders were implemented and before AMD/Nvidia released GPUs with compute and demonstrated their potential; at that time most supercomputers and clusters were built with CPUs only. Even the military and the research institutes have since abandoned the Cell, and pretty much all supercomputers now use GPUs for the number crunching and traditional CPUs for coordinating the simulations. IBM/Sony bet on the wrong horse with Cell and paid dearly for it, because the problem it was created to solve can be handled much more efficiently by GPUs, with far less development cost. In essence, Cell tried to do both the CPU's and the GPU's jobs but failed to do either effectively.

Vector math comparison

IBM SPU has 128 bit FMA3 vector unit.

AMD Bulldozer has, as a custom extension, dual 128-bit FMA4 vector units, later "abandoned" by Ryzen's AVX2 FMA3 support (Zen v1 has dual 128-bit FMA3 multiply-add units and dual 128-bit FADD units). https://en.wikichip.org/wiki/amd/microarchitectures/zen

Intel Haswell has dual 256-bit FMA3 vector units. This design carries through to Coffee Lake and the incoming Comet Lake. The future Ice Lake has 512-bit AVX.

Intel Skylake X has dual 512 bit FMA3 vector units.

AMD Zen v2 has dual 256 bit FMA3 vector units.

FMA3 = fused multiply-add operation with 3 operands

FMA4 = fused multiply-add operation with 4 operands.

Zen v1 has undocumented support for dual 128 bit FMA4 vector math.

https://www.agner.org/optimize/blog/read.php?i=838

FMA4 is not officially supported on Ryzen, but I found that the FMA4 instructions actually work correctly on Ryzen, even though the CPUID instruction says that FMA4 is not supported.

Unlike modern x86 CPUs(1), the SPU doesn't support packed FP16 math. The SPU supports vector INT16 and FP32(2).

Reference

1. https://software.intel.com/en-us/articles/performance-benefits-of-half-precision-floats

2. http://www.netlib.org/utk/people/JackDongarra/WEB-PAGES/SPRING-2008/Lect02-cell.pdf Slide 11.

Intel and AMD have their own "CELL" with industry-leading IPC in their current CPU designs, and they are currently engaged in a core-count and clock-speed war (the 5 GHz war).
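To make the FMA3/FMA4 distinction above concrete, here is a minimal C sketch of the operand-count difference. The intrinsics are the standard x86 ones (_mm_fmadd_ps for FMA3, _mm_macc_ps for AMD's FMA4); the compile flags shown are illustrative, and FMA4 code only runs on FMA4-capable AMD chips (Bulldozer family).

/* Build (GCC/Clang): cc -mfma -mfma4 -c fma_demo.c */
#include <immintrin.h>   /* FMA3 intrinsics */
#include <x86intrin.h>   /* FMA4 intrinsics (AMD) */

/* FMA3: three operands at the ISA level -- the destination register must
   overwrite one of the three sources (vfmadd132/213/231ps encodings). */
__m128 fma3_madd(__m128 a, __m128 b, __m128 c) {
    return _mm_fmadd_ps(a, b, c);   /* a*b + c */
}

/* FMA4: four operands -- a separate, non-destructive destination
   (vfmaddps dst, a, b, c), the encoding Bulldozer shipped with. */
__m128 fma4_madd(__m128 a, __m128 b, __m128 c) {
    return _mm_macc_ps(a, b, c);    /* a*b + c, all sources preserved */
}

Functionally both compute the same fused a*b + c; the difference is purely in register allocation, which is why Ryzen could drop the official FMA4 encoding without losing capability.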

Avatar image for blackhairedhero
Blackhairedhero

3231

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#90 Blackhairedhero
Member since 2018 • 3231 Posts

@pc_rocks: Where did Sony touch you?

Avatar image for cainetao11
cainetao11

38026

Forum Posts

0

Wiki Points

0

Followers

Reviews: 77

User Lists: 1

#91 cainetao11
Member since 2006 • 38026 Posts

Holy shit with this thread.

Avatar image for dimebag667
dimebag667

3014

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#92 dimebag667
Member since 2003 • 3014 Posts

@cainetao11: this is when SW just gets silly

Avatar image for deactivated-6092a2d005fba
deactivated-6092a2d005fba

22663

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#93 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@tormentos: This is how you do it tormy...

You know what they say about guys with small thumbs, don't you tormy? Well, they have small thumbnails lol

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#94 tormentos
Member since 2003 • 33784 Posts
@pc_rocks said:

If I need to link those Sony claims, then that means you have already lost the debate and are now just dragging your feet. Hint: you just acknowledged those claims in your previous post. Show me where teh cell accomplished those outrageous claims.

Second, teh cell was an utter and complete disaster, and if it weren't for the last-gen GPU that Sony put in the PS3 at the last moment, after teh cell turned out to be a stinker, it would have been an even bigger disaster. If anything, the GPU saved teh cell.

LOL no, Sony didn't abandon the RSX like they abandoned teh cell.

This is the only crap I will address from your post, as the rest is repeated bullshit.

1- You need a link because YOU CLAIMED SOME PROMISES WERE MADE FOR CELL.

So yeah, back them up. You need to post a link, because if you don't, then you are the one who has already lost the debate, ass.

Cell was so good that it helped a shitty cut-down 7600GT beat the Xenos, which is actually stronger than the top-of-the-line GPU in that series, the 7800GTX.

What Cell did is the equivalent of a 7850 beating a 7970 just because the CPU side was able to help the GPU side. This in particular is the part you don't understand: Cell is a damn CPU, not a standalone GPU, so regardless of the bullshit and butthurt arguments, without Cell the PS3 would have ended up even worse.

Yeah, they abandoned Nvidia EXACTLY like they abandoned Cell, and it's pretty obvious the PS4 doesn't have an Nvidia GPU or a Cell, so yeah, Sony abandoned both. I guess that means Nvidia sucks; hell, now it's even more confirmed, as MS abandoned Nvidia as well, so there.

Oh, that also means Intel sucks, as MS abandoned Intel too.

It's not my argument, remember, it's YOURS. I am just being fair and applying it to all cases, rather than only where it serves me best, like you do, lemming.

@Pedro said:

@tormentos: Still unwilling to accept the failure of the Cell, I see. Unfortunately for you, it doesn't change its actual demise.

You are not trying hard enough. :)

@ronvalencia said:

Your facts? Who are you?

http://www.gamasutra.com/view/feature/3904/processing_the_truth_an_interview_.php?page=3

"I'm going to have to answer with an 'it depends,'" laughs Shippy, after a pause. "Again, they're completely different models. So in the PS3, you've got this Cell chip which has massive parallel processing power, the PowerPC core, multiple SPU cores it's got a GPU that is, in the model here, processing more in the Cell chip and less in the GPU. So that's one processing paradigm -- a heterogeneous paradigm."

"With the Xbox 360, you've got more of a traditional multi-core system, and you've got three PowerPC cores, each of them having dual threads -- so you've got six threads running there, at least in the CPU. Six threads in Xbox 360, and eight or nine threads in the PS3 -- but then you've got to factor in the GPU," Shippy explains. "The GPU is highly sophisticated in the Xbox 360."

He concludes: "At the end of the day, when you put them all together, depending on the software, I think they're pretty equal, even though they're completely different processing models."


The bottleneck is the input fragment data, i.e. FP32 Shader Unit 2 is dependent on FP32 Shader Unit 1.

RSX's pixel shader FP units are NOT generalised like AMD's GCN CU stream processors, i.e. it is not an apples-to-apples comparison.

Xbox 360's GPU has 48 unified shader pipelines, exposed to the programmer as a flat layout.

The ideal situation would have been to combine the 6 SPUs' and RSX's silicon budget into a G80-like beast with Xenos's 8-concurrent-context front end, which is what Sony executed for the PS4.

Both Xbox 360 (182 mm2) and Xbox One (160 mm2) followed a similar GPU chip area size, while the PS4's GPU is 212 mm2. Microsoft retaliated with a ~284 mm2 GPU via Xbox One X and ditched the 32 MB eSRAM design (XBO's design is influenced by Xbox 360's chip area proportions).

In terms of game consoles, Xbox One X's GPU area is the largest to date.

With the PS4 PCB budget, Sony combined the PS3 PCB's 64-bit XDR and 128-bit GDDR3 budgets into a single 256-bit GDDR5 bus. Microsoft retaliated with a 384-bit GDDR5 PCB budget via the X1X.

The PS4 Pro has a 256-bit GDDR5 bus + 1 GB DDR3 PCB budget. Sony is pushing PCB design within a set budget.

Xbox Anaconda follows X1X's design concepts with updated 7nm process tech and GDDR6 parts.

Hawaii Pro GCN's "Fine wine" crushed GTX 780 with recent games.

Now what the fu** are you actually arguing here?

Tell me what you are really arguing, because if you are arguing that the Xbox 360 had a more advanced GPU than the PS3, I guess you are about 14 years late to that argument, man.

You can cry about how shitty the RSX was vs the Xenos; that only makes the Xbox 360 being topped by the PS3 graphically look even worse.

Here we have 2 consoles: one boasts a super powerful, cutting-edge GPU, with some functions not even on PC yet, paired with an OK CPU; the other has a pretty lol-worthy GPU, downplayed as super weak and full of holes and pitfalls, but on the CPU side a chip that is quite powerful and can AID the GPU by offloading work that would otherwise have been handled by the GPU itself, which was quite weak.

The end result is simply outstanding: the console with the weaker GPU is able to keep up with and even beat the one with the stronger GPU, even though said console was also super hard to code for.

That is a damn miracle if you ask me.

The more people downplay the RSX, the better Cell looks. As simple as that.
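As a rough sanity check on the bus widths being thrown around above: peak bandwidth is just bus width times per-pin data rate. The per-pin rates below (3.2 Gbps for PS3's XDR, 5.5 Gbps for PS4's GDDR5, 6.8 Gbps for X1X's GDDR5) are the commonly cited retail figures, not numbers taken from this thread.

#include <stdio.h>

/* Peak bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (Gbps) */
static double peak_gbs(int bus_bits, double gbps_per_pin) {
    return bus_bits / 8.0 * gbps_per_pin;
}

int main(void) {
    printf("PS3 XDR (64-bit @ 3.2 Gbps):    %.1f GB/s\n", peak_gbs(64, 3.2));   /* ~25.6 */
    printf("PS4 GDDR5 (256-bit @ 5.5 Gbps): %.1f GB/s\n", peak_gbs(256, 5.5));  /* ~176  */
    printf("X1X GDDR5 (384-bit @ 6.8 Gbps): %.1f GB/s\n", peak_gbs(384, 6.8));  /* ~326  */
    return 0;
}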

Avatar image for deactivated-60bf765068a74
deactivated-60bf765068a74

9558

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#95 deactivated-60bf765068a74
Member since 2007 • 9558 Posts

Remember when PC gamers flipped out over GDDR5 RAM?

They couldn't handle it; it was stronger RAM than their RAM, dedicated to the GPU.

PC gamers, look out bro, the PS5 is coming and it's gonna have new RAM that beats your RAM again.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#96 tormentos
Member since 2003 • 33784 Posts

@pc_rocks said:

That is precisely why you shouldn't talk about its capabilities. Military bought cell for its vector processors 'SPUs' for their specialized tasks. The cell was fairly good at 'number crunching' tasks in specific workloads but it was still crap at all kinds of general purpose workload as well as have a higher development cost. The key to note here is the military bought those as well as the Roadrunner that IBM made was before the unified shader was implemented and AMD/Nvidia released GPUs with compute by demonstrating its potential and most supercomputers or clusters were only made with CPUs. Even the military and all the research institutes have abandon the cell and pretty much all supercomputers now use GPUs for number crunching and traditional CPUs for coordinating these simulations. IBM/Sony betted on the wrong horse with cell and paid dearly for it as the thing it was created to solve can be done much efficiently with GPUs with far less development cost. In essence cell tried to do both the CPU and GPU's tasks but failed to do either effectively.

I don't think you should be telling anyone what they should or shouldn't discuss; you know shit about this, and the only thing you have done here is spew the same bullshit you read on this forum before.

Like Cell 2, lol.

The military bought Cell because it was a cheap and powerful Linux PC that could be used in parallel to increase performance, while costing a fraction, or even less, of what the army would have paid for servers or other types of PC.

One of the virtues of Cell was that you could actually link many of them together and use them in tandem.

@i_p_daily said:

@tormentos: This is how you do it tormy...

You know what they say about guys with small thumbs, don't you tormy? Well, they have small thumbnails lol

That we really know how to get it on?

Yes, I know.

Avatar image for xhedon
xHedon

293

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#97 xHedon
Member since 2018 • 293 Posts

We are still waiting on the Cerny Special Sauce for the PS4 Pro that doubles the teraflops he claimed for "extra performance", lol.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#98  Edited By tormentos
Member since 2003 • 33784 Posts

@xhedon said:

We are still waiting on the Cerny Special Sauce for the PS4 Pro that doubles the teraflops he claimed for "extra performance", lol.

Actually, even I call bullshit on that.

While it is true that FP16 doubles performance, obviously it only does so on the tasks where it can be used; since FP16 can't be used for all tasks, you will never get double performance overall.

But factually, FP16 can double performance; it is PROVEN.

Which brings me to a certain lemming here who acts like the all-knowing hardware master and claimed the Xbox One X would do 12TF because he believed FP16 would be a feature on the XBO X.
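For concreteness, here is the arithmetic behind the "double performance" claim, using the commonly cited PS4 Pro GPU shape (36 CUs, 64 lanes per CU, 911 MHz); these figures are assumptions for illustration, not quotes from anyone in this thread.

#include <stdio.h>

int main(void) {
    /* Commonly cited PS4 Pro GPU specs (illustrative assumptions). */
    double cus = 36, lanes = 64, ghz = 0.911;
    double flops_per_fma = 2.0;  /* a fused multiply-add counts as 2 FLOPs */

    double fp32_tflops = cus * lanes * flops_per_fma * ghz / 1000.0;
    /* Rapid packed math: each 32-bit lane executes two FP16 ops per clock,
       so peak doubles -- but only for work that tolerates half precision. */
    double fp16_tflops = fp32_tflops * 2.0;

    printf("FP32 peak: %.1f TFLOPS\n", fp32_tflops);  /* ~4.2 */
    printf("FP16 peak: %.1f TFLOPS\n", fp16_tflops);  /* ~8.4 */
    return 0;
}

This is exactly why the doubling is real but workload-dependent: the hardware peak doubles, while anything that needs FP32 precision sees no gain.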

Avatar image for xhedon
xHedon

293

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#99 xHedon
Member since 2018 • 293 Posts
@tormentos said:
@xhedon said:

We are still waiting on the Cerny Special Sauce for the PS4 Pro that doubles the teraflops he claimed for "extra performance", lol.

Actually, even I call bullshit on that.

Calling a spade a spade for sure.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#100  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@pc_rocks said:

If I need to link those Sony claims, then that means you have already lost the debate and are now just dragging your feet. Hint: you just acknowledged those claims in your previous post. Show me where teh cell accomplished those outrageous claims.

Second, teh cell was an utter and complete disaster, and if it weren't for the last-gen GPU that Sony put in the PS3 at the last moment, after teh cell turned out to be a stinker, it would have been an even bigger disaster. If anything, the GPU saved teh cell.

LOL no, Sony didn't abandon the RSX like they abandoned teh cell.

This is the only crap I will address from your post, as the rest is repeated bullshit.

1- You need a link because YOU CLAIMED SOME PROMISES WERE MADE FOR CELL.

So yeah, back them up. You need to post a link, because if you don't, then you are the one who has already lost the debate, ass.

Cell was so good that it helped a shitty cut-down 7600GT beat the Xenos, which is actually stronger than the top-of-the-line GPU in that series, the 7800GTX.

What Cell did is the equivalent of a 7850 beating a 7970 just because the CPU side was able to help the GPU side. This in particular is the part you don't understand: Cell is a damn CPU, not a standalone GPU, so regardless of the bullshit and butthurt arguments, without Cell the PS3 would have ended up even worse.

Yeah, they abandoned Nvidia EXACTLY like they abandoned Cell, and it's pretty obvious the PS4 doesn't have an Nvidia GPU or a Cell, so yeah, Sony abandoned both. I guess that means Nvidia sucks; hell, now it's even more confirmed, as MS abandoned Nvidia as well, so there.

Oh, that also means Intel sucks, as MS abandoned Intel too.

It's not my argument, remember, it's YOURS. I am just being fair and applying it to all cases, rather than only where it serves me best, like you do, lemming.

@Pedro said:

@tormentos: Still unwilling to accept the failure of the Cell, I see. Unfortunately for you, it doesn't change its actual demise.

You are not trying hard enough. :)

@ronvalencia said:

Your facts? Who are you?

http://www.gamasutra.com/view/feature/3904/processing_the_truth_an_interview_.php?page=3

"I'm going to have to answer with an 'it depends,'" laughs Shippy, after a pause. "Again, they're completely different models. So in the PS3, you've got this Cell chip which has massive parallel processing power, the PowerPC core, multiple SPU cores it's got a GPU that is, in the model here, processing more in the Cell chip and less in the GPU. So that's one processing paradigm -- a heterogeneous paradigm."

"With the Xbox 360, you've got more of a traditional multi-core system, and you've got three PowerPC cores, each of them having dual threads -- so you've got six threads running there, at least in the CPU. Six threads in Xbox 360, and eight or nine threads in the PS3 -- but then you've got to factor in the GPU," Shippy explains. "The GPU is highly sophisticated in the Xbox 360."

He concludes: "At the end of the day, when you put them all together, depending on the software, I think they're pretty equal, even though they're completely different processing models."

The bottleneck is the input fragment data, i.e. FP32 Shader Unit 2 is dependent on FP32 Shader Unit 1.

RSX's pixel shader FP units are NOT generalised like AMD's GCN CU stream processors, i.e. it is not an apples-to-apples comparison.

Xbox 360's GPU has 48 unified shader pipelines, exposed to the programmer as a flat layout.

The ideal situation would have been to combine the 6 SPUs' and RSX's silicon budget into a G80-like beast with Xenos's 8-concurrent-context front end, which is what Sony executed for the PS4.

Both Xbox 360 (182 mm2) and Xbox One (160 mm2) followed a similar GPU chip area size, while the PS4's GPU is 212 mm2. Microsoft retaliated with a ~284 mm2 GPU via Xbox One X and ditched the 32 MB eSRAM design (XBO's design is influenced by Xbox 360's chip area proportions).

In terms of game consoles, Xbox One X's GPU area is the largest to date.

With the PS4 PCB budget, Sony combined the PS3 PCB's 64-bit XDR and 128-bit GDDR3 budgets into a single 256-bit GDDR5 bus. Microsoft retaliated with a 384-bit GDDR5 PCB budget via the X1X.

The PS4 Pro has a 256-bit GDDR5 bus + 1 GB DDR3 PCB budget. Sony is pushing PCB design within a set budget.

Xbox Anaconda follows X1X's design concepts with updated 7nm process tech and GDDR6 parts.

Hawaii Pro GCN's "Fine wine" crushed GTX 780 with recent games.

Now what the fu** are you actually arguing here?

Tell me what you are really arguing, because if you are arguing that the Xbox 360 had a more advanced GPU than the PS3, I guess you are about 14 years late to that argument, man.

You can cry about how shitty the RSX was vs the Xenos; that only makes the Xbox 360 being topped by the PS3 graphically look even worse.

Here we have 2 consoles: one boasts a super powerful, cutting-edge GPU, with some functions not even on PC yet, paired with an OK CPU; the other has a pretty lol-worthy GPU, downplayed as super weak and full of holes and pitfalls, but on the CPU side a chip that is quite powerful and can AID the GPU by offloading work that would otherwise have been handled by the GPU itself, which was quite weak.

The end result is simply outstanding: the console with the weaker GPU is able to keep up with and even beat the one with the stronger GPU, even though said console was also super hard to code for.

That is a damn miracle if you ask me.

The more people downplay the RSX, the better Cell looks. As simple as that.

You argued that the PS3 was more powerful than the Xbox 360 when IBM's lead designer for CELL and the PPE contradicted your claim. Again, who are you? You are nobody.

On an apples-to-apples comparison without subjective artwork, like Battlefield 3 with its software tiled deferred rendering method, the Xbox 360 was shown to be superior to the PS3, and you can't argue CELL wasn't being used.

Deferred shading compute effectiveness: CELL vs the GeForce 7800 GTX GPU

Sony (SCEA)'s study paper on "Deferred Pixel Shading on the Playstation 3" and its comparative performance against the GeForce 7800 GTX can be found at http://research.scea.com/ps3_deferred_shading.pdf

Quote

D. Comparison to GeForce 7800 GTX GPU

We implemented the same algorithm on a high end state of the art GPU, the NVIDIA GeForce 7800 GTX running in a Linux workstation. This GPU has 24 fragment shader pipelines running at 430 Mhz and processes 24 fragments in parallel. By comparison the 5 SPEs that we used process 20 pixels in parallel in quad-SIMD form.

The GeForce required 11.1 ms to complete the shading operation. In comparison the Cell/B.E. required 11.65 ms including the DMA waiting time

In Sony's own words, 5 SPEs (with DMA) are roughly equal to a GeForce 7800 GTX.

That's 24 pixel shader pipes from the G70 plus a 24-pipe pixel shading equivalent from the 5 SPEs, yielding an effective 48 pixel pipelines, with the RSX handling the raster and texture work.

A lower-latency, 48-unified-shader Xenos would do the same job.
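A quick back-of-the-envelope on the paper's figures quoted above; nothing here beyond the numbers already cited.

#include <stdio.h>

int main(void) {
    /* Figures from the SCEA deferred shading paper quoted above. */
    double geforce_ms = 11.1;    /* 24 fragment pipes @ 430 MHz */
    double cell_ms    = 11.65;   /* 5 SPEs, including DMA wait  */

    int spes = 5, quad_simd = 4; /* each SPE shades a quad of 4 pixels */
    printf("Pixels in flight on Cell: %d\n", spes * quad_simd);  /* 20 */
    printf("7800 GTX advantage: %.1f%%\n",
           (cell_ms / geforce_ms - 1.0) * 100.0);                /* ~5% */
    return 0;
}

So on this one workload the dedicated GPU was only about 5% faster than five SPEs, which is what the "roughly equal" conclusion rests on.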

For PS4 project,

Sony wanted modifications to Kepler, and Nvidia rejected them based on the money Sony offered versus the impact on Nvidia's PC GPU development while chasing higher-margin market segments, e.g. RTX Turing R&D took 10 years to combine the following IP: Tensor cores, RT (ray-tracing hardware accelerator) cores, DX12 async compute, the HyperQ graphics command processor, rapid packed math, raster scaling with six GPCs and 96/128 ROPS, leading-edge memory compression, and immediate-mode tile cache rendering.

AMD's Xenos, with its 8-concurrent-context compute foundation, was already near GCN and closer to Sony's design goals. On Nvidia's side, multiple compute context support via an interleaving method without penalty only arrived with Pascal, and hardware async compute only with Volta.

Nvidia prioritized raster power, e.g. scaling beyond four geometry-raster engines/64 ROPS, leading-edge memory compression and immediate-mode tile cache rendering, ahead of async compute hardware in the Maxwell R&D, hence delivering power-efficient GPU designs. Maxwell also corrected the CUDA core count vs register storage ratio (for compute shaders) after the mistakes made with Kepler.

The GTX 980 Ti (Maxwell v2) was the foundation for the Pascal GTX 1080 Ti, with similar IPC at the same clock speed. The Pascal architecture in turn set up the Volta and then Turing designs, i.e. Nvidia scaled TFLOPS on top of a solid geometry-raster and ROPS foundation.

Meanwhile, AMD's raster setup is stuck at four geometry-raster units and 64 ROPS.

It was MS that pushed AMD towards Maxwell-style micro-tile cache rendering after the XBO debacle.

----

On your CELL supercomputer subject:

Expected World’s Fastest Supercomputer Powered by AMD EPYC CPUs and Radeon Instinct GPUs


The deal is said to be worth a total of $600 million and will be commissioned in early 2022. AMD beats IBM.


The AMD Frontier (Epyc + Vega II) supercomputer: 1.5 exaFLOPS for $600 million. Cray acts as systems integrator.

The Intel Aurora (Cascade Lake + Xe GPU) supercomputer: 1 exaFLOPS for $500 million. Cray acts as systems integrator. https://www.youtube.com/watch?v=DBDPFC8cIMo

The IBM/Nvidia Summit (Power9 + V100 Volta) supercomputer: 200 petaFLOPS for $325 million.

The AMD/Nvidia "Perlmutter" supercomputer (Epyc 2 + Volta Next): 100 petaFLOPS for $146 million. Cray acts as systems integrator. https://www.hpcwire.com/2018/10/30/cray-unveils-shasta-lands-nersc-9-contract/