Inside dev source's revelation on PS5 vs XSX performance difference will make SW tech experts have a meltdown

This topic is locked from further discussion.


#101 DrLostRib
Member since 2017 • 5736 Posts

@Zero_epyon said:

@boxrekt: Gears 5 cutscenes go from dynamic 4K/30 to native 4K/60. Gameplay sticks to native 4K/60 with no drops. That's on top of ultra settings plus increased shadow quality not included in the PC version. Loading times are cut in half without the use of the new I/O APIs. That was all done in under two weeks.

shut up you damn xbot!


#102 Zero_epyon
Member since 2004 • 14069 Posts

@drlostrib said:
@Zero_epyon said:

@boxrekt: Gears 5 cutscenes go from dynamic 4K/30 to native 4K/60. Gameplay sticks to native 4K/60 with no drops. That's on top of ultra settings plus increased shadow quality not included in the PC version. Loading times are cut in half without the use of the new I/O APIs. That was all done in under two weeks.

shut up you damn xbot!

Is that what I sound like!?!?!? Ewww..


#103 masonshoemocker
Member since 2003 • 630 Posts

I've been gaming on a PC for some time and strictly use SSDs: one SSD dedicated to the Windows OS and another SSD to run Adobe apps and games from. I've gone through cheap, mid-grade, and expensive drives. I can barely tell a f'n difference between any of them in real-world use. If anything, maybe, just maybe, 3-5 seconds in loading. I have found that I was paying more for reliability instead of performance.

I really don't see how Sony is going to make a difference with an SSD. You guys are ridiculous.


#104  Edited By PAL360
Member since 2007 • 29842 Posts

I didn't fall for the 'Cell tech' crap 15 years ago, so I certainly would not fall for it now. I just couldn't care less about brands. But isn't the PS5 supposed to be a console designed for devs, according to their demands too?


#105 masonshoemocker
Member since 2003 • 630 Posts

@PAL360 said:

I didn't fall for the 'Cell tech' crap 15 years ago, so I certainly would not fall for it now. I just couldn't care less about brands. But isn't the PS5 supposed to be a console designed for devs, according to their demands?

Well both consoles are x86. They were already designed for any developer.


#106 R4gn4r0k
Member since 2004 • 33759 Posts

Best time for System Wars in a long, long time


#107  Edited By PAL360
Member since 2007 • 29842 Posts

@masonshoemocker: I mean the whole hardware...the sum of its parts.

Series X is more powerful, fact, but I find it hard to believe the PS5 will have some crazily dated hardware, since, again, it was designed with devs in mind.


#108 ThatDBFan
Member since 2019 • 570 Posts

@boxrekt: I love how you're the one who made a thread about Xbox fans having a meltdown over the hardware, yet you're the one having a meltdown. The Xbox SX specs really are eating at you, aren't they?

I would imagine Xbox fans have nothing to worry about really, as most credible sources have said (as if we even needed this to be said) that the Series X is overall more powerful.


#109 phbz
Member since 2009 • 5694 Posts

Please, nobody show Chris Grannell's Twitter to the TC or we might lose him.


#110  Edited By ThatDBFan
Member since 2019 • 570 Posts

@boxrekt: What even is this source xD? This is some nobody on YT with a video that looks like it was edited by a 12-year-old. You are really desperate for anything to shed some light on the PS5, aren't you? I bet you searched "PS5 specs better than Xbox SX" and spent hours going through all the Xbox SX spec videos to find this.


#111 navyguy21
Member since 2003 • 15461 Posts


#112 ellos
Member since 2015 • 2262 Posts

@navyguy21 said:


#113 ronvalencia
Member since 2008 • 29612 Posts

@navyguy21 said:

Sony has a pattern of poor transparency.


#114  Edited By ronvalencia
Member since 2008 • 29612 Posts
@PAL360 said:

I didn't fall for the 'Cell tech' crap 15 years ago, so I certainly would not fall for it now. I just couldn't care less about brands. But isn't the PS5 supposed to be a console designed for devs, according to their demands too?

STI CELL was crap, i.e. a half-assed Atom-like CPU with a half-assed fake GPU aka the SPUs (it's a DSP, not a GPU). CELL was a master of none.

In 2006,

Fat out-of-order X86-64 CPU = industry-leading command logic and branch performance.

GeForce 8800 GPU = industry-leading data-parallel processing with industry-leading fixed-function raster hardware performance.

AMD's PS4/X1X is industry-leading for $$$ (bang per buck).

For the PS4 project, IBM was offering the in-order PowerPC A2 (the then-new embedded in-order PowerPC), which is the successor to the PPE, and it was wrecked by AMD Jaguar (a light out-of-order X86-64 with Core 2-like performance). It was like a 2006 rematch between a refined PPE and a Core 2 clone.


#116  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Grey_Eyed_Elf said:

You do realise that mesh shading isn't exclusive to the XSX or AMD alone?... Why do I bring this up, you ask?... Well, you just shot yourself in the foot; it's purely a technique, it will be used on the PS5 also, and the more power you have the better it will run.

The XSX has a better GPU than the PS5 and, according to you system warriors, is almost as powerful as a 2080 Ti... Yet in the mesh shading demo it's taking more than 2x the time to render the dragon with pass-through compared to a Ti.

Good luck next generation guys.

You people are too gullible; developers are human beings who have WILL and have in the past said whatever favours the console maker they are working with to build hype. Nothing new.

I reviewed the original video from https://youtu.be/0sJ_g-aWriQ?t=1482


At the 24:42 time marker, Martin Fuller claims:

XSX renders at 4K (8,294,400 pixels)

RTX 2080 Ti renders at 1440p (3,686,400 pixels)

XSX is rendering a "lot more triangles".

This is not an apples-to-apples comparison. Review the original source to remove the 3rd-party BS.
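
For what it's worth, here's a quick sanity check on those pixel counts; plain arithmetic on my part, not something from the talk:

    #include <cstdio>

    int main() {
        const long long px_4k    = 3840LL * 2160;   // 8,294,400 pixels
        const long long px_1440p = 2560LL * 1440;   // 3,686,400 pixels
        std::printf("4K vs 1440p pixel ratio: %.2fx\n",
                    double(px_4k) / double(px_1440p));  // prints ~2.25x
        return 0;
    }

So the 4K run is pushing roughly 2.25x the pixels of the 1440p run before the extra triangles are even considered.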

I may review all claims with 3rd-party embellishment against the original source.

Try again.


#117  Edited By tormentos
Member since 2003 • 30974 Posts

@navyguy21 said:

Yes, and he is jumping the gun like some Xbox fanboys: just because Sony doesn't mention VRS he actually hangs on to that. VRS is a function of RDNA2, and the PS5 uses RDNA2 as stated by AMD and Sony.

This reminds me of when lemmings here claimed the PS5 didn't have hardware ray tracing because, when Sony claimed ray tracing, they didn't say "hardware" like MS did; many sites did the same shit, even though Sony released its specs first and claimed ray tracing.

And don't forget this is a guy from DF, which hasn't been the most neutral place this gen, especially when they put out a barrage of articles pushing secret sauce on Xbox and going hand in hand with MS.

I have no doubt that if you have more resources you can do more, but he is basing part of that argument on the PS5 supposedly having fewer features because Sony didn't confirm them.

@ronvalencia said:
@navyguy21 said:

Sony has a pattern of poor transparency.

Remember, VRS is a function of RDNA2 and the PS5 is RDNA2, confirmed by AMD and Sony.

Remember, this happened when Sony didn't say HBRA and MS did; in fact you were one of those who put it in doubt and started saying Sony needed to clarify, when it was clear.

@ronvalencia said:

STI CELL was crap, i.e. a half-assed Atom-like CPU with a half-assed fake GPU aka the SPUs (it's a DSP, not a GPU). CELL was a master of none.

In 2006,

Fat out-of-order X86-64 CPU = industry-leading command logic and branch performance.

GeForce 8800 GPU = industry-leading data-parallel processing with industry-leading fixed-function raster hardware performance.

AMD's PS4/X1X is industry-leading for $$$ (bang per buck).

For the PS4 project, IBM was offering the in-order PowerPC A2 (the then-new embedded in-order PowerPC), which is the successor to the PPE, and it was wrecked by AMD Jaguar (a light out-of-order X86-64 with Core 2-like performance). It was like a 2006 rematch between a refined PPE and a Core 2 clone.

Find me one CPU in 2006 that was doing what Cell was doing.

Find me one from Intel or AMD that could run GPU tasks.

None were even close. GPUs don't run at 3.2GHz; the SPEs did. In fact, the speed of a PC GPU circa 2006 was around 600MHz or a little more.

The 8800 is a GPU, not a CPU; the only reason you always hide behind it is that it beat Cell as a GPU when Cell is a CPU. Cry all you want: GPUs in 2006 didn't run at 3.2GHz, and in fact 14 years later they still don't run that fast.

You have a pathetic hate for the PS3; maybe you were too poor and could not afford it in 2006.🤷‍♂️


#118 pmanden
Member since 2016 • 895 Posts

@masonshoemocker: Sony aren't going to make a difference with their SSD. They have to make a difference with their 1st-party games, which they are actually rather good at. Still, the power of the PS5 seems a little embarrassing compared to the Xbox.


#119 Pedro
Member since 2002 • 39258 Posts

@tormentos: You need to let it go, man. The Cell was a shit CPU then and now. Not even Sony could have relied on that shit. They abandoned that shitty overhyped processor and you should too.


#120  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

@ronvalencia said:

Sony has a pattern of poor transparency.

1. Remember, VRS is a function of RDNA2 and the PS5 is RDNA2, confirmed by AMD and Sony.

Remember, this happened when Sony didn't say HBRA and MS did; in fact you were one of those who put it in doubt and started saying Sony needed to clarify, when it was clear.

@ronvalencia said:

STI CELL was crap, i.e. a half-assed Atom-like CPU with a half-assed fake GPU aka the SPUs (it's a DSP, not a GPU). CELL was a master of none.

In 2006,

Fat out-of-order X86-64 CPU = industry-leading command logic and branch performance.

GeForce 8800 GPU = industry-leading data-parallel processing with industry-leading fixed-function raster hardware performance.

AMD's PS4/X1X is industry-leading for $$$ (bang per buck).

For the PS4 project, IBM was offering the in-order PowerPC A2 (the then-new embedded in-order PowerPC), which is the successor to the PPE, and it was wrecked by AMD Jaguar (a light out-of-order X86-64 with Core 2-like performance). It was like a 2006 rematch between a refined PPE and a Core 2 clone.

2. Find me one CPU in 2006 that was doing what Cell was doing.

Find me one from Intel or AMD that could run GPU tasks.

None were even close. GPUs don't run at 3.2GHz; the SPEs did. In fact, the speed of a PC GPU circa 2006 was around 600MHz or a little more.

The 8800 is a GPU, not a CPU; the only reason you always hide behind it is that it beat Cell as a GPU when Cell is a CPU. Cry all you want: GPUs in 2006 didn't run at 3.2GHz, and in fact 14 years later they still don't run that fast.

You have a pathetic hate for the PS3; maybe you were too poor and could not afford it in 2006.🤷‍♂️

1. Hardware by itself is useless. RDNA 2 acts like RDNA 1 when using RDNA 1 APIs and software, hence the GitHub PS5 leak's NAVI 10 Lite information.

A higher GPU clock speed is self-evident with RDNA 2 and doesn't need extra software to exploit it.

----

2. The STI PPE CPU: nothing special.

The STI SPU is a DSP; NVIDIA's CUDA GPUs are superior.

For CUDA's warp32, think of a future AVX 1024-bit which has 32 operations. CUDA's warp16 is like AVX 512-bit, which has 16 operations.

The difference between a CUDA warp and AVX is that a CUDA warp is MIMD (multiple instructions, multiple data) while AVX is SIMD (single instruction with multiple data).

The STI SPU uses a 128-bit SIMD4 format, i.e. a single instruction with four data elements. The SPU instruction set is based on PowerPC VMX with a local data store.

For processing 32 data elements:

With a CUDA warp32, I can place multiple scalar operations with 32 data elements in a single payload.

With SPU SIMD4, I would need 8 separate 128-bit payloads.
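
To make that 32-element example concrete, here's a rough C++ sketch of my own, with 128-bit SSE intrinsics standing in for the SPU's SIMD4 and a plain 32-wide loop standing in for a warp; the function names are made up for illustration:

    #include <immintrin.h>  // 128-bit SSE intrinsics, used here as a stand-in for SPU SIMD4

    // SIMD4 style: 32 floats scaled 4 lanes at a time = 8 separate 128-bit operations.
    void scale_simd4(const float* in, float* out, float k) {
        __m128 kk = _mm_set1_ps(k);
        for (int i = 0; i < 32; i += 4) {               // 8 iterations = 8 payloads
            __m128 v = _mm_loadu_ps(in + i);            // load 4 elements
            _mm_storeu_ps(out + i, _mm_mul_ps(v, kk));  // multiply and store 4 elements
        }
    }

    // Warp32 style, conceptually: one 32-wide pass where each "lane" handles one element.
    void scale_warp32_like(const float* in, float* out, float k) {
        for (int lane = 0; lane < 32; ++lane)           // a warp would run these in lockstep
            out[lane] = in[lane] * k;
    }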

What separates a GPU from a DSP is rasterization hardware with a Z-buffer accelerated structure.

The STI SPU is not a GPU; it's a DSP.

What separates RTX from a plain GPU is the RT branch test (a set of triangles + ray intersection test) and the BVH accelerated structure.

The STI SPU is not an RTX; it's a DSP.

The "DSP" terminology assigned to the SPU is IBM's own claim; IBM is not pretending the SPU is a GPU.

Unlike Intel SSE/AVX, the STI SPU's FP32 SIMD is not compliant with the full IEEE-754 standard.

The SPU SIMD and PPE VMX IEEE-754 handling and results differ, which gimped the fusion processing model. LOL

True fusion is delivered by an AMD APU with GCN, since the CPU and GPU have full IEEE 754-2008 handling and results, 64-bit pointer exchange and X86-64 memory paging.

You can't exchange pointers between the PPE and SPU.


#121 MclarenMaster18
Member since 2014 • 2825 Posts

2013: Sony fans said 'PS4 is much better than XB1 because of stronger hardware.'

2017: Sony fans said 'Who cares if PS4 Pro is weaker than XB1X? PS4 Pro 1st parties look better than what XB1X has.'

2020: Sony fans say 'Who cares if PS5 is weaker than XB Series X? PS5 is going to have a superior library of games plus a faster SSD!'

So now they've changed their minds from hardware specs to storage speed?


#122  Edited By ronvalencia
Member since 2008 • 29612 Posts

@mclarenmaster18 said:

2013: Sony fans said 'PS4 is much better than XB1 because of stronger hardware.'

2017: Sony fans said 'Who cares if PS4 Pro is weaker than XB1X? PS4 Pro 1st parties look better than what XB1X has.'

2020: Sony fans say 'Who cares if PS5 is weaker than XB Series X? PS5 is going to have a superior library of games plus a faster SSD!'

So now they've changed their minds from hardware specs to storage speed?

2006: The PS3 has the Blu-ray storage advantage over the X360, and the PS3 has "teh CELL". Meanwhile, the Xbox 360 has a near-DX10 unified shader GPU instead, a design that has continuously evolved to this day.

Xbox 360's GPU tessellation hardware... today's RDNA 2 mesh shader hardware.


#123  Edited By MclarenMaster18
Member since 2014 • 2825 Posts

@ronvalencia: Oh, I also see that the Xbox 360 is actually stronger than the PS3.

If I remember correctly, isn't that why the PS3 was more expensive than the 360? Because of the Blu-ray drive?

Meanwhile, the 360 had better online than the PS3 because of faster servers.

Edit: During the days of the PS3, PSN received a huge amount of complaints about hackers, slower servers, etc.


#124 Gifford38
Member since 2020 • 685 Posts

@Pedro said:

@tormentos: You need to let it go, man. The Cell was a shit CPU then and now. Not even Sony could have relied on that shit. They abandoned that shitty overhyped processor and you should too.

The Cell was not made for gaming, but the chip was awesome at crunching numbers. It was great for cancer research; the numbers came back so much faster than the PCs of that time. It took 2 days for the Cell chip to give the results rather than, I think it was, three months or three weeks (don't remember). The Cell made a huge leap in cancer treatments possible, but for gaming it did suck. That is why Sony is using an SPU-like chip called Tempest, much like the Cell, for their 3D audio.


#125  Edited By Gifford38
Member since 2020 • 685 Posts

@mclarenmaster18 said:

2013: Sony fans said 'PS4 is much better than XB1 because of stronger hardware.'

2017: Sony fans said 'Who cares if PS4 Pro is weaker than XB1X? PS4 Pro 1st parties look better than what XB1X has.'

2020: Sony fans say 'Who cares if PS5 is weaker than XB Series X? PS5 is going to have a superior library of games plus a faster SSD!'

So now they've changed their minds from hardware specs to storage speed?

When are you Xbox people going to realize that the SSD is not just for storage and load speeds? The PS5 uses it to take load off the GPU and CPU as needed.


#126 tormentos
Member since 2003 • 30974 Posts

@ronvalencia said:

1. Hardware by itself is useless. RDNA 2 acts like RDNA 1 when using RDNA 1 APIs and software, hence the GitHub PS5 leak's NAVI 10 Lite information.

A higher GPU clock speed is self-evident with RDNA 2 and doesn't need extra software to exploit it.

----

2. The STI PPE CPU: nothing special.

The STI SPU is a DSP; NVIDIA's CUDA GPUs are superior.

For CUDA's warp32, think of a future AVX 1024-bit which has 32 operations. CUDA's warp16 is like AVX 512-bit, which has 16 operations.

The difference between a CUDA warp and AVX is that a CUDA warp is MIMD (multiple instructions, multiple data) while AVX is SIMD (single instruction with multiple data).

The STI SPU uses a 128-bit SIMD4 format, i.e. a single instruction with four data elements. The SPU instruction set is based on PowerPC VMX with a local data store.

For processing 32 data elements:

With a CUDA warp32, I can place multiple scalar operations with 32 data elements in a single payload.

With SPU SIMD4, I would need 8 separate 128-bit payloads.

What separates a GPU from a DSP is rasterization hardware with a Z-buffer accelerated structure.

The STI SPU is not a GPU; it's a DSP.

What separates RTX from a plain GPU is the RT branch test (a set of triangles + ray intersection test) and the BVH accelerated structure.

The STI SPU is not an RTX; it's a DSP.

The "DSP" terminology assigned to the SPU is IBM's own claim; IBM is not pretending the SPU is a GPU.

Unlike Intel SSE/AVX, the STI SPU's FP32 SIMD is not compliant with the full IEEE-754 standard.

The SPU SIMD and PPE VMX IEEE-754 handling and results differ, which gimped the fusion processing model. LOL

True fusion is delivered by an AMD APU with GCN, since the CPU and GPU have full IEEE 754-2008 handling and results, 64-bit pointer exchange and X86-64 memory paging.

You can't exchange pointers between the PPE and SPU.

Name me one CPU from AMD or Intel in 2006 that was even close to Cell at running GPU tasks.

Stop stalling.

@Pedro said:

@tormentos: You need to let it go, man. The Cell was a shit CPU then and now. Not even Sony could have relied on that shit. They abandoned that shitty overhyped processor and you should too.

Bah. When you show me how the PS3 could even begin to challenge the Xbox 360's superior GPU without Cell, then you will have a point.

Without Cell, the PS3 would have simply fallen behind the 360 in every aspect.

They abandoned it because it was CHEAPER to put in a Jaguar with the GPU included; for cost reduction an APU is 10 times better than using separate chips. It was all about savings.

@mclarenmaster18 said:

2013: Sony fans said 'PS4 is much better than XB1 because of stronger hardware.'

2017: Sony fans said 'Who cares if PS4 Pro is weaker than XB1X? PS4 Pro 1st parties look better than what XB1X has.'

2020: Sony fans say 'Who cares if PS5 is weaker than XB Series X? PS5 is going to have a superior library of games plus a faster SSD!'

So now they've changed their minds from hardware specs to storage speed?

2001: Xbox fans claim the Xbox is superior because of superior hardware.

2006: Xbox fans claim the Xbox 360 is more powerful and hype superior multiplats, sometimes hyping differences as small as 720p vs 640p, a missing patch of grass, or 5 or 6 FPS more.

2013: Xbox fans claim graphics don't matter, while at the same time denying a 40% gap in power that resulted in most games being 900p vs 1080p (with lesser effects and frames at times, and in extreme cases 720p vs 1080p), then proceed to claim no one can notice the difference between 1080p and 900p, or 720p vs 1080p; suddenly fewer frames, lower resolution and less detail are not noticeable.

All that while using every opportunity to spread MS lies about DX12, the cloud, tiled resources, and "we invented DX" arguments as something that would change the outcome.

2017: Xbox fans care about graphics again, even though the extra graphics cost $500, $100 more than the Pro, when Sony in 2013 delivered 40% more power for $100 less.

2020: Xbox fans take a 17% GPU gap and make it seem like the PS5 will be two generations behind; since that argument wasn't sticking, they now switch to "44% more CUs" to make the gap seem bigger than it is and circumvent the 17% figure.

The story you told didn't start in 2013; it started in 2001.


#127 tormentos
Member since 2003 • 30974 Posts
@mclarenmaster18 said:

@ronvalencia: Oh, I also see that the Xbox 360 is actually stronger than the PS3.

If I remember correctly, isn't that why the PS3 was more expensive than the 360? Because of the Blu-ray drive?

Meanwhile, the 360 had better online than the PS3 because of faster servers.

Edit: During the days of the PS3, PSN received a huge amount of complaints about hackers, slower servers, etc.

Xenos >> RSX.

Cell >>> Xenon.

Cell + RSX > Xenon + Xenos.

Which is why Sony exclusives looked better.

Faster servers?

LOL, the PS3 had games on actual dedicated servers for FREE while MS was doing P2P; the only thing the 360 had over the PS3 was cross-game voice chat.

Bullshit. The first real outage of PSN happened in 2011, and Live got hacked too; so much so that even Major Nelson's own account was stolen... lol.

The PS3 was more expensive, but it was also superior hardware all around; the only thing the 360 had going for it was its GPU.


#128  Edited By 04dcarraher
Member since 2004 • 23427 Posts

@gifford38 said:

When are you Xbox people going to realize that the SSD is not just for storage and load speeds? The PS5 uses it to take load off the GPU and CPU as needed.

An SSD does not process anything for the CPU or GPU, so no, it won't offload work from either. The point of the ultra-fast SSD in the PS5 is to remove as much of the I/O bottleneck as possible, so that storage can feed the system's pool of memory on the fly. This will allow smoother transitions in the VRAM buffer for texture streaming on the GPU. But the console will still be limited by its shared 16GB pool at any given time.

There is a major flaw in the PS5 "no loading screens" claim. When you have a game with a lot of AI routines and random events, say a complex open-world RPG, it still takes time for the CPU to map out everything. We will still see games taking several seconds to half a minute to load, depending on the complexity of the CPU workload.

So the argument between the XSX's 2.4GB/s SSD and the PS5's 5.5GB/s SSD means virtually nothing, since at best you may only see a 2-second difference in loading times, and on average it will be virtually the same, under a second of difference.
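
As a rough back-of-the-envelope illustration of that point; the asset size is my own assumption, and this ignores decompression and any CPU-side work:

    #include <cstdio>

    int main() {
        const double game_gb  = 10.0;   // assume ~10 GB of assets read for a level
        const double xsx_gbps = 2.4;    // XSX raw sequential throughput (GB/s)
        const double ps5_gbps = 5.5;    // PS5 raw sequential throughput (GB/s)

        double t_xsx = game_gb / xsx_gbps;   // ~4.2 s of pure read time
        double t_ps5 = game_gb / ps5_gbps;   // ~1.8 s of pure read time

        std::printf("XSX read: %.1f s, PS5 read: %.1f s, delta: %.1f s\n",
                    t_xsx, t_ps5, t_xsx - t_ps5);
        // Any CPU-side setup (AI, entity spawning, shader warm-up) adds the same
        // time to both consoles, shrinking the relative difference further.
        return 0;
    }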


#129 Pedro
Member since 2002 • 39258 Posts

@tormentos said:
@Pedro said:

@tormentos: You need to let it go, man. The Cell was a shit CPU then and now. Not even Sony could have relied on that shit. They abandoned that shitty overhyped processor and you should too.

Bah. When you show me how the PS3 could even begin to challenge the Xbox 360's superior GPU without Cell, then you will have a point.

Without Cell, the PS3 would have simply fallen behind the 360 in every aspect.

They abandoned it because it was CHEAPER to put in a Jaguar with the GPU included; for cost reduction an APU is 10 times better than using separate chips. It was all about savings.

I am starting to wonder if you ever gamed on the PS3? The Cell failed because it was not a good CPU for gaming and had extremely limited application. It failed and was abandoned. Deal with it.


#130  Edited By ellos
Member since 2015 • 2262 Posts

@04dcarraher said:
@gifford38 said:

When are you Xbox people going to realize that the SSD is not just for storage and load speeds? The PS5 uses it to take load off the GPU and CPU as needed.

An SSD does not process anything for the CPU or GPU, so no, it won't offload work from either. The point of the ultra-fast SSD in the PS5 is to remove as much of the I/O bottleneck as possible, so that storage can feed the system's pool of memory on the fly. This will allow smoother transitions in the VRAM buffer for texture streaming on the GPU. But the console will still be limited by its shared 16GB pool at any given time.

There is a major flaw in the PS5 "no loading screens" claim. When you have a game with a lot of AI routines and random events, say a complex open-world RPG, it still takes time for the CPU to map out everything. We will still see games taking several seconds to half a minute to load, depending on the complexity of the CPU workload.

So the argument between the XSX's 2.4GB/s SSD and the PS5's 5.5GB/s SSD means virtually nothing, since at best you may only see a 2-second difference in loading times, and on average it will be virtually the same, under a second of difference.

I think folks kind of don't understand that, so let me describe the fantasy. Even with current memory speeds and space, it's still not enough: games still ask the GPU to render more than what needs to be seen, and in some cases they have to do it to free up memory; the scene is more than what is needed. Again, basically, the memory space is not enough. The fantasy is that the SSD will help the system be even more efficient at reducing the rendering load. If it is fast enough, it can act as a storage extension of the memory; well, it kind of is, but what I mean is fast enough that it almost instantaneously gives the memory what it needs, like a near-literal extension. I hope I make sense, and I know it's a fantasy lol. Again, the key point is that memory space is still not enough and the speed is not enough to help; forget the bottleneck, they still don't have enough to try a crazy system like Cerny's turnaround time, and games, especially open-world games, still render more than needed.

I'm of course theorizing, but I hope at least I make sense lol.


#131 tormentos
Member since 2003 • 30974 Posts

@Pedro said:

I am starting to wonder if you ever gamed on the PS3? The Cell failed because it was not a good CPU for gaming and had extremely limited application. It failed and was abandoned. Deal with it.

Yes i gamed on PS3 from freaking day 1.

Yeah Cell failed...😊


#132 Pedro
Member since 2002 • 39258 Posts

@tormentos said:
@Pedro said:

I am starting to wonder if you ever gamed on the PS3? The Cell failed because it was not a good CPU for gaming and had extremely limited application. It failed and was abandoned. Deal with it.

Yes i gamed on PS3 from freaking day 1.

Yeah Cell failed...😊

It did. You showing benchmarks doesn't change the FACT that it failed. Like I said, time for you to move on from the Cell's failure like Sony.


#133 tormentos
Member since 2003 • 30974 Posts

@04dcarraher said:

An SSD does not process anything for the CPU or GPU, so no, it won't offload work from either. The point of the ultra-fast SSD in the PS5 is to remove as much of the I/O bottleneck as possible, so that storage can feed the system's pool of memory on the fly. This will allow smoother transitions in the VRAM buffer for texture streaming on the GPU. But the console will still be limited by its shared 16GB pool at any given time.

There is a major flaw in the PS5 "no loading screens" claim. When you have a game with a lot of AI routines and random events, say a complex open-world RPG, it still takes time for the CPU to map out everything. We will still see games taking several seconds to half a minute to load, depending on the complexity of the CPU workload.

So the argument between the XSX's 2.4GB/s SSD and the PS5's 5.5GB/s SSD means virtually nothing, since at best you may only see a 2-second difference in loading times, and on average it will be virtually the same, under a second of difference.

While that is true, compressing and decompressing data takes CPU resources, and the PS5 does have hardware in place that will save CPU time.

In fact, the decompression block they have is equivalent to nine Zen 2 cores:

The bottom line? 5.5GBs of bandwidth translates into an effective eight or nine gigabytes per second fed into the system. "By the way, in terms of performance, that custom decompressor equates to nine of our Zen 2 cores, that's what it would take to decompress the Kraken stream with a conventional CPU," Cerny reveals.

That's a pretty extensive hardware modification.

If left to the system CPU, that would basically eat it all up.

All of this is delivered to developers without them needing to do anything. Even the decompression is taken care of by the custom silicon. "You just indicate what data you'd like to read from your original, uncompressed file, and where you'd like to put it, and the whole process of loading it happens invisibly to you and at very high speed," Cerny explains.

Now, I don't see that improving power or any shit like that, but it should make for considerably faster loading on the PS5.
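
A quick sanity check on those numbers, using nothing but the figures quoted above:

    #include <cstdio>

    int main() {
        const double raw_gbps     = 5.5;   // PS5 raw SSD throughput (GB/s)
        const double effective_lo = 8.0;   // quoted effective throughput range (GB/s)
        const double effective_hi = 9.0;

        // Implied average compression ratio of the Kraken-packed stream.
        std::printf("implied compression ratio: %.2fx to %.2fx\n",
                    effective_lo / raw_gbps,   // ~1.45x
                    effective_hi / raw_gbps);  // ~1.64x

        // Time to fill the full 16 GB of RAM at the effective rate.
        std::printf("fill 16 GB: %.1f s to %.1f s\n",
                    16.0 / effective_hi, 16.0 / effective_lo);  // ~1.8-2.0 s
        return 0;
    }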


#134 I_P_Daily
Member since 2015 • 16204 Posts

Tormy's back with his special brand of damage control, and a new cow alt has appeared lol.


#135 Pedro
Member since 2002 • 39258 Posts

@tormentos said:

The bottom line? 5.5GBs of bandwidth translates into an effective eight or nine gigabytes per second fed into the system. "By the way, in terms of performance, that custom decompressor equates to nine of our Zen 2 cores, that's what it would take to decompress the Kraken stream with a conventional CPU," Cerny reveals.

That's a pretty extensive hardware modification.

If left to the system CPU, that would basically eat it all up.

What that tells me is that Kraken decompression is less efficient than other solutions (BCPack) if it requires 9 Zen cores.


#136  Edited By 04dcarraher
Member since 2004 • 23427 Posts

@tormentos said:

While that is true, compressing and decompressing data takes CPU resources, and the PS5 does have hardware in place that will save CPU time.

In fact, the decompression block they have is equivalent to nine Zen 2 cores:

The bottom line? 5.5GBs of bandwidth translates into an effective eight or nine gigabytes per second fed into the system. "By the way, in terms of performance, that custom decompressor equates to nine of our Zen 2 cores, that's what it would take to decompress the Kraken stream with a conventional CPU," Cerny reveals.

That's a pretty extensive hardware modification.

If left to the system CPU, that would basically eat it all up.

All of this is delivered to developers without them needing to do anything. Even the decompression is taken care of by the custom silicon. "You just indicate what data you'd like to read from your original, uncompressed file, and where you'd like to put it, and the whole process of loading it happens invisibly to you and at very high speed," Cerny explains.

Now, I don't see that improving power or any shit like that, but it should make for considerably faster loading on the PS5.

Their custom decompression silicon will help free up CPU resources, sure. Kraken is a general-purpose decompression system. Now, I've read that some devs are saying the XSX's BCPack texture compression is actually better than the PS5's when it comes to texture data; however, the extra bandwidth should overcome Kraken's deficiency.

But the point still stands: the PS5's "no loading screens" claim is BS with regard to actually having shorter load times.


#137 Pedro
Member since 2002 • 39258 Posts

@04dcarraher said:

Their custom decompression hardware silicon will help with freeing up cpu resources however. Kraken is a generalized purposed system for decompression. Now Ive read that some devs are saying XsX's BCPack texture compression is actually better than PS5's when it comes to texture data. However the extra bandwidth should overcome Kraken's deficiency.

But the point still stands PS5's "no loading screens" is BS, in regarding to actually having shorter load times.

You are correct. Faster data transfer solves one portion of the loading equation. The other is processing the loaded data. Accomplishing no loading is very possible and is game dependent; it's up to the developer to make the seamless experience. I am surprised that folks have forgotten about games like the original Jak and Daxter running off a disc.


#138 tormentos
Member since 2003 • 30974 Posts

@04dcarraher:

We will see. As far as I see it, some people are attacking Sony because its SSD is faster than what you even have on PC.

I don't see why it should not be as Sony claims; in fact, from what I have read, developers seem more excited about the SSD than the actual power or ray tracing.


#139  Edited By tormentos
Member since 2003 • 30974 Posts

@Pedro said:

You are correct. Faster data transfer solves one portion of the loading equation. The other is processing the loaded data. Accomplishing no loading is very possible and is game dependent; it's up to the developer to make the seamless experience. I am surprised that folks have forgotten about games like the original Jak and Daxter running off a disc.

Don't you think the PS5 should process it fine, considering it has free CPU time to deal with it?

Sony is doubling down on solid-state storage in providing a truly transformative next generation experience. Every couple of years, Mark Cerny travels the world, meeting dozens of developers and publishers and the integration of the SSD was the number one next-gen request. Sony's actual implementation is something else, with performance rated at two orders of magnitude faster than PlayStation 4. 2GB of data can be loaded in one quarter of a second, meaning that in theory, the entirety of PS5's 16GB can be filled in just two seconds. "As game creators, we go from trying to distract the player from how long fast travel is taking - like those Spider-Man subway rides - to being so blindingly fast that we might even have to slow that transition down," says Cerny.

Delivering two orders of magnitude improvement in performance required a lot of custom hardware to seamlessly marry the SSD to the main processor. A custom flash controller marries up to the SSD modules via a 12-channel interface, delivering the required 5.5GB/s of performance with a total of 825GB of storage. This may sound like a strange choice for storage size when considering that consumer SSDs offer 512GB, 1TB or more of capacity, but Sony's solution is proprietary, 825GB is the most optimal match for the 12-channel interface and there are other advantages too. In short, Sony had more freedom to adapt its design: "We can look at the available NAND flash parts and construct something with optimal price performance. Someone constructing an M.2 drive presumably does not have that freedom, it would be difficult to market and sell if it were not one of those standard sizes," Mark Cerny says.

The controller itself hooks up to the main processor via a four-lane PCI Express 4.0 interconnect, and contains a number of bespoke hardware blocks designed to eliminate SSD bottlenecks. The system has six priority levels, meaning that developers can literally prioritise the delivery of data according to the game's needs.

The controller supports hardware decompression for the industry-standard ZLIB, but also the new Kraken format from RAD Game Tools, which offers an additional 10 per cent of compression efficiency. The bottom line? 5.5GBs of bandwidth translates into an effective eight or nine gigabytes per second fed into the system. "By the way, in terms of performance, that custom decompressor equates to nine of our Zen 2 cores, that's what it would take to decompress the Kraken stream with a conventional CPU," Cerny reveals.

A dedicated DMA controller (equivalent to one or two Zen 2 cores in performance terms) directs data to where it needs to be, while two dedicated, custom processors handle I/O and memory mapping. On top of that, coherency engines operate as housekeepers of sorts.

"Coherency comes up in a lot of places, probably the biggest coherency issue is stale data in the GPU caches," explains Cerny in his presentation. "Flushing all the GPU caches whenever the SSD is read is an unattractive option - it could really hurt the GPU performance -

so we've implemented a gentler way of doing things, where the coherency engines inform the GPU of the overwritten address ranges and custom scrubbers in several dozen GPU caches do pinpoint evictions of just those address ranges."

All of this is delivered to developers without them needing to do anything. Even the decompression is taken care of by the custom silicon. "You just indicate what data you'd like to read from your original, uncompressed file, and where you'd like to put it, and the whole process of loading it happens invisibly to you and at very high speed," Cerny explains.

Man, just read that: they took more than one step to eliminate bottlenecks. It's not just "ultra-fast SSD and move on"; they really made some customizations to help that data flow. In fact, you can see how Cerny talks about how stale data was a coherency problem in the GPU caches, and how flushing all the GPU caches whenever the SSD is read wasn't an option because it would hurt GPU performance.

And they came up with a custom solution.

I think people are being unfair about Sony's SSD.
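
None of the above is a public API, but the "six priority levels" part is conceptually just a prioritised read queue that the I/O hardware services most-urgent-first. A toy C++ sketch of that idea, with every name made up by me:

    #include <cstdint>
    #include <cstdio>
    #include <queue>
    #include <string>
    #include <vector>

    // Toy model of a prioritised read queue: lower number = more urgent.
    struct ReadRequest {
        std::string asset;     // what to load
        uint64_t    offset;    // where it lives on the drive
        uint32_t    size;      // how many bytes
        int         priority;  // 0..5, e.g. 0 = needed this frame, 5 = speculative prefetch
    };

    struct ByPriority {
        bool operator()(const ReadRequest& a, const ReadRequest& b) const {
            return a.priority > b.priority;  // min-heap on priority
        }
    };

    int main() {
        std::priority_queue<ReadRequest, std::vector<ReadRequest>, ByPriority> io;
        io.push({"hero_textures", 0x1000, 8u << 20, 0});  // needed right now
        io.push({"distant_lod",   0x9000, 2u << 20, 4});  // nice to have
        io.push({"audio_bank",    0x5000, 1u << 20, 2});

        while (!io.empty()) {                 // service the most urgent read first
            const ReadRequest r = io.top();
            std::printf("reading %s (priority %d)\n", r.asset.c_str(), r.priority);
            io.pop();                         // decompression would happen in hardware, off the CPU
        }
        return 0;
    }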


#140 Pedro
Member since 2002 • 39258 Posts

@tormentos said:
@Pedro said:

You are correct. Faster data transfer solves one portion of the loading equation. The other is processing the loaded data. Accomplishing no loading is very possible and is game dependent; it's up to the developer to make the seamless experience. I am surprised that folks have forgotten about games like the original Jak and Daxter running off a disc.

Don't you think the PS5 should process it fine, considering it has free CPU time to deal with it?

Sony is doubling down on solid-state storage in providing a truly transformative next generation experience. Every couple of years, Mark Cerny travels the world, meeting dozens of developers and publishers and the integration of the SSD was the number one next-gen request. Sony's actual implementation is something else, with performance rated at two orders of magnitude faster than PlayStation 4. 2GB of data can be loaded in one quarter of a second, meaning that in theory, the entirety of PS5's 16GB can be filled in just two seconds. "As game creators, we go from trying to distract the player from how long fast travel is taking - like those Spider-Man subway rides - to being so blindingly fast that we might even have to slow that transition down," says Cerny.

Delivering two orders of magnitude improvement in performance required a lot of custom hardware to seamlessly marry the SSD to the main processor. A custom flash controller marries up to the SSD modules via a 12-channel interface, delivering the required 5.5GB/s of performance with a total of 825GB of storage. This may sound like a strange choice for storage size when considering that consumer SSDs offer 512GB, 1TB or more of capacity, but Sony's solution is proprietary, 825GB is the most optimal match for the 12-channel interface and there are other advantages too. In short, Sony had more freedom to adapt its design: "We can look at the available NAND flash parts and construct something with optimal price performance. Someone constructing an M.2 drive presumably does not have that freedom, it would be difficult to market and sell if it were not one of those standard sizes," Mark Cerny says.

The controller itself hooks up to the main processor via a four-lane PCI Express 4.0 interconnect, and contains a number of bespoke hardware blocks designed to eliminate SSD bottlenecks. The system has six priority levels, meaning that developers can literally prioritise the delivery of data according to the game's needs.

The controller supports hardware decompression for the industry-standard ZLIB, but also the new Kraken format from RAD Game Tools, which offers an additional 10 per cent of compression efficiency. The bottom line? 5.5GBs of bandwidth translates into an effective eight or nine gigabytes per second fed into the system. "By the way, in terms of performance, that custom decompressor equates to nine of our Zen 2 cores, that's what it would take to decompress the Kraken stream with a conventional CPU," Cerny reveals.

A dedicated DMA controller (equivalent to one or two Zen 2 cores in performance terms) directs data to where it needs to be, while two dedicated, custom processors handle I/O and memory mapping. On top of that, coherency engines operate as housekeepers of sorts.

"Coherency comes up in a lot of places, probably the biggest coherency issue is stale data in the GPU caches," explains Cerny in his presentation. "Flushing all the GPU caches whenever the SSD is read is an unattractive option - it could really hurt the GPU performance -

so we've implemented a gentler way of doing things, where the coherency engines inform the GPU of the overwritten address ranges and custom scrubbers in several dozen GPU caches do pinpoint evictions of just those address ranges."

All of this is delivered to developers without them needing to do anything. Even the decompression is taken care of by the custom silicon. "You just indicate what data you'd like to read from your original, uncompressed file, and where you'd like to put it, and the whole process of loading it happens invisibly to you and at very high speed," Cerny explains.

Man, just read that: they took more than one step to eliminate bottlenecks. It's not just "ultra-fast SSD and move on"; they really made some customizations to help that data flow. In fact, you can see how Cerny talks about how stale data was a coherency problem in the GPU caches, and how flushing all the GPU caches whenever the SSD is read wasn't an option because it would hurt GPU performance.

And they came up with a custom solution.

I think people are being unfair about Sony's SSD.

Step one. Actually read my comment.

Step two. Allow Sony to leave your mind.

Step three. Re-read my comment.

Surprise. It had nothing to do with Sony or the PS5 but just how games work.


#141  Edited By 04dcarraher
Member since 2004 • 23427 Posts

Sony is just using PR to sell their product, because it's their only real hardware selling point over MS's XSX.

Here is the thing: the SSD in the PS5 is only 500-550MB/s faster than a PCI-E 4.0 NVMe drive. It's not raw bandwidth that matters most for game loading times but the storage's access times, or IOPS. The difference between a good SATA SSD (550MB/s) and an M.2 NVMe at up to 5GB/s is very minor in practice, even though there is as much as 4,450MB/s of sequential difference between them. What the PS5 SSD, in its environment, is trying to do is create a secondary caching system to stream data more fluidly into the main system RAM. When it comes to game loading times, the PS5 isn't going to outclass any decent SSD.
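
To illustrate why per-access latency can matter as much as the headline bandwidth, here's a crude model; all the numbers are my own guesses, not measurements:

    #include <cstdio>

    int main() {
        // Hypothetical level load: many small scattered reads plus one bulk chunk.
        const int    small_reads   = 20000;  // headers, metadata, tiny assets
        const double access_ms     = 0.05;   // ~50 us per random access (typical NVMe ballpark)
        const double bulk_gb       = 4.0;    // large sequential asset data
        const double seq_slow_gbps = 2.4;    // slower drive
        const double seq_fast_gbps = 5.5;    // faster drive

        double latency_s = small_reads * access_ms / 1000.0;  // identical on both drives
        double slow_s = latency_s + bulk_gb / seq_slow_gbps;
        double fast_s = latency_s + bulk_gb / seq_fast_gbps;

        std::printf("slow drive: %.2f s, fast drive: %.2f s\n", slow_s, fast_s);
        // The fixed per-access cost doesn't scale with sequential bandwidth, so
        // doubling the headline GB/s shrinks the gap less than the spec sheet suggests.
        return 0;
    }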


#142 Pedro
Member since 2002 • 39258 Posts

@04dcarraher: I decided to check out loading of existing games with varying SSD performance and the results are as expected. It all depends on the game.

(embedded video)

The drives being used are

  • SSD M.2 - Samsung 970 EVO 500 GB (3,400MB/s Seq. Read)
  • SSD SATA3 - Samsung 860 Evo-Series (550MB/s Seq. Read)
  • 500GB HDD - Western Digital Blue 7200rpm 1TB (~150MB/s Seq. Read)


#143 BoxRekt
Member since 2019 • 2425 Posts

@Pedro said:
@tormentos said:
@Pedro said:

I am starting to wonder if you ever gamed on the PS3? The Cell failed because it was not a good CPU for gaming and had extremely limited application. It failed and was abandoned. Deal with it.

Yes i gamed on PS3 from freaking day 1.

Yeah Cell failed...😊

It did. You showing benchmarks doesn't change the FACT that it failed. Like I said, time for you to move on from the Cell's failure like Sony.

So...NOW...numbers don't matter?

lol oh boy what a flip flop from your previous arguments when numbers meant everything.


#144 Pedro
Member since 2002 • 39258 Posts

@boxrekt said:

So...NOW...numbers don't matter?

lol oh boy what a flip flop from your previous arguments when numbers meant everything.

And the fact remains that the Cell failed. Benchmarks don't change the fact that it failed. Let me put it in a way your simple mind can understand: the One X is faster than the PS4 and PS4 Pro, yet it FAILED to surpass the PS4 in sales. Does that help?


#145  Edited By ellos
Member since 2015 • 2262 Posts

@Pedro said:

@04dcarraher: I decided to check out loading of existing games with varying SSD performance and the results are as expected. It all depends on the game.

The drives being used are

  • SSD M.2 - Samsung 970 EVO 500 GB (3,400MB/s Seq. Read)
  • SSD SATA3 - Samsung 860 Evo-Series (550MB/s Seq. Read)
  • 500GB HDD - Western Digital Blue 7200rpm 1TB (~150MB/s Seq. Read)

To be fair, are we doing the usual "it's not going to amount to anything because we already see it in all sorts of PC environments; my PC shows the way and it's not much", etc.? My question is whether it will make a bigger difference because: 1) it's their environment, so they can tailor for the best-case scenario and reduce the "depends on the game" part; 2) their SSD is faster and on PCIe 4.0. Will things be different in a PS5 environment?


#146  Edited By Uruz7laevatein
Member since 2009 • 133 Posts

@ellos:

There have been similar uses of fast non-flash-based NVRAM (ReRAM/3D XPoint/flash?) in the enterprise (workstations/HPC/datacenters) for streaming and handling large data sets (computed/pre-computed, the latter being more exploited in the console) directly on assets that can't fit in (or would be wasted in) the GPU's main memory at a given time, while providing very high IOPS/responsiveness. It's just that the feature is now finally disseminated into the mass-market console/consumer environment for the first time, which is rather exciting.

https://www.youtube.com/watch?v=KvdGTCFEqhg&feature=emb_title

Though it'd be nice if the consoles featured 10GbE Ethernet via RJ45/SFP+ for cloud streaming/distributed computing.


#147 ronvalencia
Member since 2008 • 29612 Posts

@Pedro:

(embedded video)

Sabrent 1TB Rocket NVMe 4.0 Gen4 PCIe M.2 (+5000 MB/s seq read)

https://www.tweaktown.com/reviews/9288/sabrent-rocket-nvme-4-1tb-pcie-gen4-2-ssd-review/index2.html

---

Sabrent 1TB Rocket NVMe 3.0 Gen3 PCIe M.2 (3400 MB/s read and 3000 MB/s write) https://www.sabrent.com/product/SB-ROCKET-1TB/1tb-rocket-nvme-pcie-m-2-2280-internal-ssd-high-performance-solid-state-drive/#description

----

SAMSUNG 860 EVO 1TB SATA III


#148 Pedro
Member since 2002 • 39258 Posts

@ellos said:
@Pedro said:

@04dcarraher: I decided to check out loading of existing games with varying SSD performance and the results are as expected. It all depends on the game.

The drives being used are

  • SSD M.2 - Samsung 970 EVO 500 GB (3,400MB/s Seq. Read)
  • SSD SATA3 - Samsung 860 Evo-Series (550MB/s Seq. Read)
  • 500GB HDD - Western Digital Blue 7200rpm 1TB (~150MB/s Seq. Read)

To be fair, are we doing the usual "it's not going to amount to anything because we already see it in all sorts of PC environments; my PC shows the way and it's not much", etc.? My question is whether it will make a bigger difference because: 1) it's their environment, so they can tailor for the best-case scenario and reduce the "depends on the game" part; 2) their SSD is faster and on PCIe 4.0. Will things be different in a PS5 environment?

In order for games to remove loading, the developer has to design the game with that goal in mind. This has always been the case, independent of SSDs. SSDs allow for a smaller buffer between assets that are loaded and assets that need to be loaded; slower hard drives need a larger buffer. The buffer I am referring to is in-game buffers such as elevators, long winding corridors, etc. More exotic solutions can be utilized that allow for dynamic streaming because of the SSD and the ability to load data directly into memory without the CPU. To avoid hitching from processing CPU-dependent assets, asset activation can be staggered. Well, at least that is how I would approach it.
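
As a rough sketch of that last idea, staggering asset activation so CPU-side setup is spread across frames instead of spiking when a burst of streamed data arrives; the names and the per-frame budget are mine, purely illustrative:

    #include <cstdio>
    #include <deque>
    #include <string>

    // Assets whose data has already been streamed in but not yet "activated"
    // (spawned, registered with physics, etc.) wait in a queue.
    struct PendingAsset {
        std::string name;
        double activation_cost_ms;  // estimated CPU cost to bring it live
    };

    class StreamingWorld {
        std::deque<PendingAsset> pending_;
    public:
        void on_asset_streamed(PendingAsset a) { pending_.push_back(a); }

        // Called once per frame: activate assets only until the frame budget is
        // spent, so a burst of streamed data never causes a single-frame hitch.
        void tick(double budget_ms) {
            double spent = 0.0;
            while (!pending_.empty() &&
                   spent + pending_.front().activation_cost_ms <= budget_ms) {
                spent += pending_.front().activation_cost_ms;
                std::printf("activating %s\n", pending_.front().name.c_str());
                pending_.pop_front();
            }
        }
    };

    int main() {
        StreamingWorld world;
        world.on_asset_streamed({"corridor_props", 1.2});
        world.on_asset_streamed({"npc_crowd",      1.8});
        world.on_asset_streamed({"skybox_hi",      0.8});
        for (int frame = 0; frame < 3; ++frame)  // 2 ms of activation budget per frame
            world.tick(2.0);
        return 0;
    }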


#149  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said:
@mclarenmaster18 said:

@ronvalencia: Oh, I also see that the Xbox 360 is actually stronger than the PS3.

If I remember correctly, isn't that why the PS3 was more expensive than the 360? Because of the Blu-ray drive?

Meanwhile, the 360 had better online than the PS3 because of faster servers.

Edit: During the days of the PS3, PSN received a huge amount of complaints about hackers, slower servers, etc.

Xenos >> RSX.

Cell >>> Xenon.

Cell + RSX > Xenon + Xenos.

Which is why Sony exclusives looked better.

Faster servers?

LOL, the PS3 had games on actual dedicated servers for FREE while MS was doing P2P; the only thing the 360 had over the PS3 was cross-game voice chat.

Bullshit. The first real outage of PSN happened in 2011, and Live got hacked too; so much so that even Major Nelson's own account was stolen... lol.

The PS3 was more expensive, but it was also superior hardware all around; the only thing the 360 had going for it was its GPU.

The argument "Which is why sony exclusives looked better" is subjective i.e. artwork style differences

Xenos has a tessellation unit that doesn't exist for RSX. For PS3, this function needs to be emulated on SPUs**. Tessellation hardware feature arrives with PC DX11 GPUs. DX10 GPUs have geometry shaders that reduce the available shader resources.

Xenos has greater vertex shaders resources when compared to RSX. For PS3, this function needs to be emulated on SPUs**.

Xenos has MSAA with FP HDR that doesn't exist for RSX. For PS3, this function needs to be emulated on SPUs**.

Xenos has full speed FP32 compute while it's gimped on RSX. For PS3, this function needs to be emulated on SPUs**.

Xenos has a reasonable shader branch capability while it's gimped on RSX. For PS3, this function needs to be emulated on SPUs**.

Xenos has ROPS with ROV (Rasterizer Order Views) like feature that doesn't exist for RSX. For PS3, this function needs to be emulated on SPUs**. Xbox 360 game programmer reveals functionality when contributing for Xbox 360 emulator for PC. Xenos ROPS functionality was directly remapped to DirectX 12_1's ROV feature.

**PS3 incurs cumulative latency penalties when ping-pong between SPEs and RSX.

There's a reason for RT cores are placed inside SM or CU designs.

Xbox 360's PPE X3 remapped to PS3's PPE + two SPUs, hence leaving four SPUs to patch RSX GPU.

Against your "Cell+RSX > Xenon+ Xenos" argument from IBM itself.

https://www.gamasutra.com/view/feature/132297/processing_the_truth_an_interview_.php

But can Shippy's insight on both console's processors finally answer the age-old debate about which console is actually more powerful?

"I'm going to have to answer with an 'it depends,'" laughs Shippy, after a pause. "Again, they're completely different models. So in the PS3, you've got this Cell chip which has massive parallel processing power, the PowerPC core, multiple SPU cores… it's got a GPU that is, in the model here, processing more in the Cell chip and less in the GPU. So that's one processing paradigm -- a heterogeneous paradigm."

"With the Xbox 360, you've got more of a traditional multi-core system, and you've got three PowerPC cores, each of them having dual threads -- so you've got six threads running there, at least in the CPU. Six threads in Xbox 360, and eight or nine threads in the PS3 -- but then you've got to factor in the GPU," Shippy explains. "The GPU is highly sophisticated in the Xbox 360."

He concludes: "At the end of the day, when you put them all together, depending on the software, I think they're pretty equal, even though they're completely different processing models."

Every modern unified shader GPU takes its design template from ATI's Xenos, i.e. AMD owns the unified shader GPU patent.

The Xenos GPU is specifically designed for triangle and pixel graphics processing.

Continuing from Xenos' hardware tessellation, the next evolution is mesh shading. Both Turing RTX and RDNA 2 GPUs have been loaded with additional accelerated hardware functions.


#150  Edited By ellos
Member since 2015 • 2262 Posts

@Pedro said:
@ellos said:

To be fair, are we doing the usual "it's not going to amount to anything because we already see it in all sorts of PC environments; my PC shows the way and it's not much", etc.? My question is whether it will make a bigger difference because: 1) it's their environment, so they can tailor for the best-case scenario and reduce the "depends on the game" part; 2) their SSD is faster and on PCIe 4.0. Will things be different in a PS5 environment?

In order for games to remove loading, the developer has to design the game with that goal in mind. This has always been the case, independent of SSDs. SSDs allow for a smaller buffer between assets that are loaded and assets that need to be loaded; slower hard drives need a larger buffer. The buffer I am referring to is in-game buffers such as elevators, long winding corridors, etc. More exotic solutions can be utilized that allow for dynamic streaming because of the SSD and the ability to load data directly into memory without the CPU. To avoid hitching from processing CPU-dependent assets, asset activation can be staggered. Well, at least that is how I would approach it.

Yeah, technically there is always going to be loading: initial loading, and whatever depends on developer design. The question is whether you have enough speed for those buffers to be very, very small, to the point where, I don't know, you can be loading enough stuff while the player is looking the other way. Erm, those long, long Uncharted climbing sequences could have more gameplay mechanics, lol. Maybe one day the dream is that loading will just be part of the installation.