Scarlett more advanced than PS5


#201  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@scatteh316 said:

In a sense it would be affected by RT... but the overall raw performance of the die wouldn't be.

Say Microsoft starts with a 40 CU Navi GPU and has to sacrifice 10 CUs for RT... that's a 25% reduction in raw GPU performance, but with this set-up running RT won't affect the utilisation of the remaining 30 CUs... though turning RT off won't give any raw performance back. As long as the RT cores are fast enough to make up for the reduction in CUs, it could actually work in their favour.

What's interesting is how much compute performance is going to be required for RT via shaders to match what will be possible via dedicated hardware.

If, for example, a next-generation COD game ships with RT enabled via hardware acceleration on Scarlett, it will still have 30 CUs' worth of GPU performance to handle the rest of the graphics.

Now... what if this same COD game ships on PS5? There could be several scenarios:

1. They ditch RT completely and have inferior lighting/shadows/reflections (but still not ugly looking), using all 40 CUs to push the visuals in other areas.

2. They dedicate CUs to doing RT via shaders... but what if this requires 15 CUs to achieve parity with Scarlett's dedicated hardware? That leaves 25 CUs for the rest of the graphics... less than Scarlett!

If Scarlett's, and crucially AMD's, implementation of hardware-accelerated RT is robust and performs well, Scarlett may very well be the faster console in the real world.

Or has Sony or Cerny made custom tweaks to the core logic to enable better RT performance on PS5?

It's all very interesting and it will be good to see how it all unfolds.
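These scenarios boil down to simple arithmetic over a hypothetical CU budget. A quick sketch, where every number is the post's own hypothetical (40 CUs, a 10-CU trade for RT hardware, 15 CUs for shader parity), not a confirmed spec:

```python
# Hypothetical CU budgets from the scenarios above, not confirmed specs.
TOTAL_CUS = 40          # baseline Navi die in the example
RT_HW_COST = 10         # CUs' worth of area traded for dedicated RT logic
RT_SHADER_COST = 15     # CUs assumed needed to match that hardware via shaders

# Scarlett-style: dedicated RT hardware, shader budget fixed either way.
scarlett_shading = TOTAL_CUS - RT_HW_COST    # 30 CUs, whether RT is on or off

# PS5-style scenario 1: ditch RT, spend everything on shading.
ps5_no_rt = TOTAL_CUS                        # 40 CUs

# PS5-style scenario 2: shader-based RT at parity with the dedicated hardware.
ps5_shader_rt = TOTAL_CUS - RT_SHADER_COST   # 25 CUs left for everything else

print(scarlett_shading, ps5_no_rt, ps5_shader_rt)   # 30 40 25
```

Which console comes out ahead then depends entirely on whether the RT hardware's effective throughput exceeds the 15 CUs of shader work it replaces.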

That would still be problematic.

And we can see scenarios where even hardware-based RT is too much of a hit to be implemented, even at the low preset.

This is on an RTX 2080 Ti; neither the PS5 nor Scarlett will get something like this GPU, and you can see how nasty the hit to performance is even on the low preset.

Hardware-based RT doesn't mean there is no cost to performance, only that the cost is lower. As we can see, we are not even close to the point where RT can be implemented without taking a huge hit. If Sony managed to get stronger hardware in exchange for shader-based RT, I would not even use RT if I were Sony. This is an RTX card from the top GPU maker in the industry; imagine how Scarlett or the PS5 would run this. The level of sacrifice that would be needed is simply too much.

For now I believe RT should be off the table rather than trying to force it as the next best thing while performance goes to hell and beyond.

Your Battlefield V RT benchmarks seem to be old.

https://www.guru3d.com/news-story/december-4-battlefield-v-raytracing-dxr-performance-patch-released-(benchmarks).html

The RTX 2080 Ti at 4K with RT on ultra reaches 41 fps, and ~70 fps at 1440p.

RT consumes memory bandwidth, which is shared with the shader cores.

The RTX 2070 at 4K with DXR on low lands at 36 fps.

Turing's rapid packed math and variable rate shading features weren't used.


#202  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tdkmillsy said:

Most 720p/1080p games on Xbox One/PS4 were early on and very likely down to development-kit issues. The standard difference between them (in most games that had a difference) was 900p vs 1080p; it might not show on a 1080p TV at normal gaming distances, but it's a difference and an advantage to the PS4.

Microsoft played down the difference in exactly the same way Sony has played down the lack of full 4K when comparing the X to the Pro.

They both do it, but fanboyism aside, what else would you expect them to do? It's a competitive market; they are all going to play up the advantages and play down the disadvantages.

It will continue in the next gen and all that follow it, but if it didn't we wouldn't have System Wars :)

XBO's ESRAM tiling needs programmer intervention (it's skill-based), and the split DDR3/ESRAM rendering path arrived later, but neither would overcome CU/ALU-bound issues.


#203  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:
@Pedro said:
@scatteh316 said:

I know RT improves things... but as per my comment, "I'm more than happy with the quality of reflections and lighting we currently have."

At least with RT done via shaders there's the option to just turn it off and use the power for everything else... an option not available with hardware acceleration.

I would have liked to have seen a voxel approach used next generation as opposed to RT, as it's less resource-heavy and would still give a nice improvement over current techniques... it also would have been the perfect stop-gap on the way to RT.

But it will ultimately depend on how good the consoles are at RT... if they have the performance to run it well then maybe... just maybe... it'll be OK.

You would think that if RT is hardware accelerated, the core performance would be unaffected by its implementation, but that is not the case. I reckon that Nvidia's current accelerated solution (which may also be AMD's) is too limited, relying on other components of the GPU, hence the performance hit. The ideal situation is true hardware acceleration that runs parallel to the existing render pipeline.

In a sense it would be affected by RT... but the overall raw performance of the die wouldn't be.

Say Microsoft starts with a 40 CU Navi GPU and has to sacrifice 10 CUs for RT... that's a 25% reduction in raw GPU performance, but with this set-up running RT won't affect the utilisation of the remaining 30 CUs... though turning RT off won't give any raw performance back. As long as the RT cores are fast enough to make up for the reduction in CUs, it could actually work in their favour.

What's interesting is how much compute performance is going to be required for RT via shaders to match what will be possible via dedicated hardware.

If, for example, a next-generation COD game ships with RT enabled via hardware acceleration on Scarlett, it will still have 30 CUs' worth of GPU performance to handle the rest of the graphics.

Now... what if this same COD game ships on PS5? There could be several scenarios:

1. They ditch RT completely and have inferior lighting/shadows/reflections (but still not ugly looking), using all 40 CUs to push the visuals in other areas.

2. They dedicate CUs to doing RT via shaders... but what if this requires 15 CUs to achieve parity with Scarlett's dedicated hardware? That leaves 25 CUs for the rest of the graphics... less than Scarlett!

If Scarlett's, and crucially AMD's, implementation of hardware-accelerated RT is robust and performs well, Scarlett may very well be the faster console in the real world.

Or has Sony or Cerny made custom tweaks to the core logic to enable better RT performance on PS5?

It's all very interesting and it will be good to see how it all unfolds.

The Scarlett APU has been shown to be 380 to 400 mm².

An 8-core Zen 2 chiplet plus the 5700 XT comes to 321 mm².

TSMC 7nm+ has a 20 percent density increase compared to the fake 7nm.

TSMC's 1st-gen 7nm is the BS 7nm.
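Taking the post's figures at face value (the die sizes and the 20% density claim are this post's numbers, not official specs), the implied area headroom is easy to sketch:

```python
# Figures as claimed in the post above, not official specs.
scarlett_apu_mm2 = (380, 400)      # claimed Scarlett APU die-size range
zen2_chiplet_plus_navi10 = 321     # claimed 8-core Zen 2 chiplet + 5700 XT total
density_gain_7nm_plus = 0.20       # claimed density gain of TSMC 7nm+ over 7nm

# Headroom over a straight Zen 2 + Navi 10 combination on the same node:
low, high = (s - zen2_chiplet_plus_navi10 for s in scarlett_apu_mm2)
print(low, high)                   # 59 to 79 mm^2 of extra silicon

# Crude simplification: on 7nm+, the same logic would take ~20% less area,
# leaving even more of the 380-400 mm^2 budget for extra CUs or RT logic.
effective_at_n7plus = zen2_chiplet_plus_navi10 * (1 - density_gain_7nm_plus)
print(round(effective_at_n7plus))  # ~257 mm^2
```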


#204  Edited By scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@scatteh316 said:
@Pedro said:
@scatteh316 said:

I know RT improves things... but as per my comment, "I'm more than happy with the quality of reflections and lighting we currently have."

At least with RT done via shaders there's the option to just turn it off and use the power for everything else... an option not available with hardware acceleration.

I would have liked to have seen a voxel approach used next generation as opposed to RT, as it's less resource-heavy and would still give a nice improvement over current techniques... it also would have been the perfect stop-gap on the way to RT.

But it will ultimately depend on how good the consoles are at RT... if they have the performance to run it well then maybe... just maybe... it'll be OK.

You would think that if RT is hardware accelerated, the core performance would be unaffected by its implementation, but that is not the case. I reckon that Nvidia's current accelerated solution (which may also be AMD's) is too limited, relying on other components of the GPU, hence the performance hit. The ideal situation is true hardware acceleration that runs parallel to the existing render pipeline.

In a sense it would be affected by RT... but the overall raw performance of the die wouldn't be.

Say Microsoft starts with a 40 CU Navi GPU and has to sacrifice 10 CUs for RT... that's a 25% reduction in raw GPU performance, but with this set-up running RT won't affect the utilisation of the remaining 30 CUs... though turning RT off won't give any raw performance back. As long as the RT cores are fast enough to make up for the reduction in CUs, it could actually work in their favour.

What's interesting is how much compute performance is going to be required for RT via shaders to match what will be possible via dedicated hardware.

If, for example, a next-generation COD game ships with RT enabled via hardware acceleration on Scarlett, it will still have 30 CUs' worth of GPU performance to handle the rest of the graphics.

Now... what if this same COD game ships on PS5? There could be several scenarios:

1. They ditch RT completely and have inferior lighting/shadows/reflections (but still not ugly looking), using all 40 CUs to push the visuals in other areas.

2. They dedicate CUs to doing RT via shaders... but what if this requires 15 CUs to achieve parity with Scarlett's dedicated hardware? That leaves 25 CUs for the rest of the graphics... less than Scarlett!

If Scarlett's, and crucially AMD's, implementation of hardware-accelerated RT is robust and performs well, Scarlett may very well be the faster console in the real world.

Or has Sony or Cerny made custom tweaks to the core logic to enable better RT performance on PS5?

It's all very interesting and it will be good to see how it all unfolds.

The Scarlett APU has been shown to be 380 to 400 mm².

An 8-core Zen 2 chiplet plus the 5700 XT comes to 321 mm².

TSMC 7nm+ has a 20 percent density increase compared to the fake 7nm.

TSMC's 1st-gen 7nm is the BS 7nm.

Will you go away... I was using 40 CUs just as an example to show CU count and how it could be affected by using dedicated RT logic... I was not claiming they'll have 40 CUs and use the full-fat die.

So bug off!


#205 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Your Battlefield V RT benchmarks seem to be old.

https://www.guru3d.com/news-story/december-4-battlefield-v-raytracing-dxr-performance-patch-released-(benchmarks).html

The RTX 2080 Ti at 4K with RT on ultra reaches 41 fps, and ~70 fps at 1440p.

RT consumes memory bandwidth, which is shared with the shader cores.

The RTX 2070 at 4K with DXR on low lands at 36 fps.

Turing's rapid packed math and variable rate shading features weren't used.

So the RTX 2080 Ti does 83 FPS with RT off and 41 FPS with it on, roughly halving the frame rate (the frame time about doubles), and you somehow think the hit is not big?

Even with newer drivers the hit is too big. Remember, this is an RTX 2080 Ti; neither Scarlett nor the PS5 will get something like this.
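For what it's worth, the 83 → 41 fps drop quoted above is easier to reason about in frame times; a quick check of the arithmetic (the two fps figures are the ones from this exchange):

```python
fps_rt_off, fps_rt_on = 83.0, 41.0   # RTX 2080 Ti figures quoted above

drop = 1 - fps_rt_on / fps_rt_off    # fraction of frame rate lost
ms_off = 1000.0 / fps_rt_off         # ~12.0 ms per frame without RT
ms_on = 1000.0 / fps_rt_on           # ~24.4 ms per frame with RT
rt_cost_ms = ms_on - ms_off          # extra time RT adds to every frame

print(f"{drop:.0%} fps drop")             # ~51% drop (not "more than 100%")
print(f"{rt_cost_ms:.1f} ms added by RT") # ~12.3 ms of extra frame time
```

A frame rate can never fall by more than 100%; what roughly doubles here is the frame time, which is the same thing as the ~51% fps drop.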


#206 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

I'm loving this debate based on absolutely nothing. Please continue.


#207 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Your percentage argument hides the real pixel-count difference.

I have a 7970 1 GHz and a factory-900 MHz 7950 (OC'd to 950 MHz), which I could CrossFire prior to my R9-290X purchase.

I haven't bought an XBO or a PS4.

@tormentos said:

You trying to downplay the gap using the damn 7970 is a freaking joke, so I assume the gap between the PS4 Pro and the XBO X is also small, since you know Vega has GPUs way over Scorpio's TFLOP count.

This ^^ is a red herring; what you are doing is basically creating a distraction from the fact that MS would not even admit a gap between both consoles, but when they had the lead they promoted it to hell and beyond.

X1X RDR2 delivered 2X the pixels of the PS4 Pro's results. Your pure-TFLOPS argument is flawed.

---

Both Vega 56/64 and X1X have quad geometry-raster units, hence there's a bottleneck with Vega 56/64. Vega 64 didn't double its geometry power in accordance with its TFLOPS scaling.

NAVI 10's geometry engine (8 prim shaders in, 4 prim shaders out) will raster better than Polaris 10/Vega's (4 prim shaders in, 4 prim shaders out).

GCN takes in 4 triangles and then rasters what isn't culled, and you will generally lose half just to backface culling. So it's really rasterizing ~2 triangles per clock on average. Some engines resorted to manually culling all triangles via compute shaders so that they could actually rasterize 4 per clock, but this workaround consumes shader resources. GCN's geometry workaround is a debacle on the level of the Radeon 2900 XT running MSAA on shaders.

NAVI 10's 8 triangles transformed/culled and 4 rasterized will actually allow the rasterizers to get close to full throughput.

Again, your pure-TFLOPS argument is flawed.

I'm not stopping.
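The per-clock geometry claims above reduce to simple min() arithmetic. A sketch, using the widths and the ~50% backface-cull rate exactly as characterized in that post (a simplification, not a hardware model):

```python
# Simplified per-clock geometry throughput, per the characterization above.
def rasterized_per_clock(tris_in: int, raster_out: int, cull_rate: float) -> float:
    """Triangles actually rasterized per clock after culling."""
    survivors = tris_in * (1 - cull_rate)        # triangles that pass culling
    return min(survivors, raster_out)            # capped by rasterizer width

# GCN: 4 in, 4 out, ~half lost to backface -> only ~2 rasterized per clock.
gcn = rasterized_per_clock(4, 4, 0.5)
# Navi 10: 8 in, 4 out, same ~50% cull -> the 4 rasterizers stay fully fed.
navi = rasterized_per_clock(8, 4, 0.5)
print(gcn, navi)   # 2.0 4.0
```

The point of the wider front end is exactly this: after typical culling, enough triangles survive to saturate the same four rasterizers.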

No it is not. Vega's top-of-the-line GPU has a higher TFLOP count than Scorpio, so my argument is not flawed unless you prove the contrary, which you can't.

@tdkmillsy said:

Most 720p/1080p games on Xbox One/PS4 were early on and very likely down to development-kit issues. The standard difference between them (in most games that had a difference) was 900p vs 1080p; it might not show on a 1080p TV at normal gaming distances, but it's a difference and an advantage to the PS4.

Microsoft played down the difference in exactly the same way Sony has played down the lack of full 4K when comparing the X to the Pro.

They both do it, but fanboyism aside, what else would you expect them to do? It's a competitive market; they are all going to play up the advantages and play down the disadvantages.

It will continue in the next gen and all that follow it, but if it didn't we wouldn't have System Wars :)

Not even close. Quote Sony claiming MS doesn't have an advantage with Scorpio/Xbox One X over the Pro the way MS did.

"PS4 is not more powerful than Xbox One"

The Xbox One and the PlayStation 4 are apparently closer rivals in power than the raw stats would lead you to believe. According to Albert Penello of Microsoft who posted some interesting remarks earlier today on NeoGAF, “There’s no way that we’re giving up a 30% advantage to Sony.”

People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. We CREATED DirectX, the standard API’s that everyone programs against. So while people laude Sony for their HW skills, do you really think we don’t know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way we’re giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really huge performance difference – it would be obvious.

https://attackofthefanboy.com/news/ps4-powerful-xbox/

Hell, they would not even get the number right, because the gap was more than 40%, and still they claimed that MS would not give Sony a 30% advantage.

So yeah, there is an incredible difference between what MS does and what Sony does. MS flat-out lied about the PS4 not being stronger than the Xbox One, only to brag about a 45% stronger Xbox One X as if the thing was two generations better.


#208 scatteh316
Member since 2004 • 10273 Posts

@tormentos said:
@ronvalencia said:

Your Battlefield V RT benchmarks seem to be old.

https://www.guru3d.com/news-story/december-4-battlefield-v-raytracing-dxr-performance-patch-released-(benchmarks).html

The RTX 2080 Ti at 4K with RT on ultra reaches 41 fps, and ~70 fps at 1440p.

RT consumes memory bandwidth, which is shared with the shader cores.

The RTX 2070 at 4K with DXR on low lands at 36 fps.

Turing's rapid packed math and variable rate shading features weren't used.

Remember, this is an RTX 2080 Ti; neither Scarlett nor the PS5 will get something like this.

Remember, we do not know the performance impact of running RT on AMD hardware, so stop using the performance hit on Nvidia, you mug.


#209  Edited By tormentos
Member since 2003 • 33784 Posts

@scatteh316 said:

Remember, we do not know the performance impact of running RT on AMD hardware, so stop using the performance hit on Nvidia, you mug.

But we do know Nvidia >>>> AMD when it comes to power and performance.

So I doubt it will change with Navi; otherwise AMD would now be announcing a GPU that beats the RTX 2080 Ti instead of one that beats the RTX 2070.

Facts are facts, and RT is not free; even in hardware-based form it has a nasty impact on performance.

But hey, YOU assume Sony has shader-based RT based on Sony not confirming it, which is a total joke if you ask me, so remember: we don't know if it's hardware-based or not.


#210 scatteh316
Member since 2004 • 10273 Posts

@tormentos said:
@scatteh316 said:

Remember, we do not know the performance impact of running RT on AMD hardware, so stop using the performance hit on Nvidia, you mug.

But we do know Nvidia >>>> AMD when it comes to power and performance.

So I doubt it will change with Navi; otherwise AMD would now be announcing a GPU that beats the RTX 2080 Ti instead of one that beats the RTX 2070.

Facts are facts, and RT is not free; even in hardware-based form it has a nasty impact on performance.

But hey, YOU assume Sony has shader-based RT based on Sony not confirming it, which is a total joke if you ask me, so remember: we don't know if it's hardware-based or not.

I never said RT is free, so that sentence is idiotic and has no place in the discussion...

And unlike you, I'm using AMD's presentation and common sense (something you lack) as a basis... AMD is using shader-based RT for now, and until Sony confirms they're using hardware-based RT, it's accurate and reasonable to assume the PS5 is also shader-based.

You, on the other hand, have no logic in comparing performance drops on Nvidia hardware to AMD, which is one of the stupidest things ever.


#211 tormentos
Member since 2003 • 33784 Posts

@scatteh316 said:

I never said RT is free, so that sentence is idiotic and has no place in the discussion...

And unlike you, I'm using AMD's presentation and common sense (something you lack) as a basis... AMD is using shader-based RT for now, and until Sony confirms they're using hardware-based RT, it's accurate and reasonable to assume the PS5 is also shader-based.

You, on the other hand, have no logic in comparing performance drops on Nvidia hardware to AMD, which is one of the stupidest things ever.

No, it is not idiotic, because you are fighting something for which we have PROOF that it sucks performance-wise on the very best GPU out there.

No, it is not reasonable.

Just because MS went into more detail doesn't mean Sony doesn't have the feature. In fact, MS also went into more detail on how its SSD works, something Sony didn't do either. What's more, we have Matt from ResetEra, who has inside info, stating both have hardware-based RT. I look at hardware-based RT the way I used to look at PRT/Tiled Resources in MS's camp, so there is a BIG chance both MS and Sony have GPUs from AMD's second line of Navi GPUs, which do come with hardware-based RT.

Yes, I have logic, because regardless of the differences between GPUs, a performance hit will always strike both lines, period.


#212 scatteh316
Member since 2004 • 10273 Posts

@tormentos said:
@scatteh316 said:

I never said RT is free, so that sentence is idiotic and has no place in the discussion...

And unlike you, I'm using AMD's presentation and common sense (something you lack) as a basis... AMD is using shader-based RT for now, and until Sony confirms they're using hardware-based RT, it's accurate and reasonable to assume the PS5 is also shader-based.

You, on the other hand, have no logic in comparing performance drops on Nvidia hardware to AMD, which is one of the stupidest things ever.

No, it is not idiotic, because you are fighting something for which we have PROOF that it sucks performance-wise on the very best GPU out there.

No, it is not reasonable.

Just because MS went into more detail doesn't mean Sony doesn't have the feature. In fact, MS also went into more detail on how its SSD works, something Sony didn't do either. What's more, we have Matt from ResetEra, who has inside info, stating both have hardware-based RT. I look at hardware-based RT the way I used to look at PRT/Tiled Resources in MS's camp, so there is a BIG chance both MS and Sony have GPUs from AMD's second line of Navi GPUs, which do come with hardware-based RT.

Yes, I have logic, because regardless of the differences between GPUs, a performance hit will always strike both lines, period.

1. It is idiocy... you might as well use the performance drop on a GeForce 8800 Ultra to show the performance hit.

2. Yes it is.

3. More unrelated and irrelevant drivel.

4. Stupid logic... especially trying to use a completely unrelated GPU as a basis to predict the performance hit of a brand-new architecture.


#213 Pedro
Member since 2002 • 69083 Posts

@phbz said:

I'm loving this debate based on absolutely nothing. Please continue.

Isn't it fun? :P


#214 tdkmillsy
Member since 2003 • 5819 Posts

@tormentos said:
@ronvalencia said:

Your percentage argument hides the real pixel-count difference.

I have a 7970 1 GHz and a factory-900 MHz 7950 (OC'd to 950 MHz), which I could CrossFire prior to my R9-290X purchase.

I haven't bought an XBO or a PS4.

@tormentos said:

You trying to downplay the gap using the damn 7970 is a freaking joke, so I assume the gap between the PS4 Pro and the XBO X is also small, since you know Vega has GPUs way over Scorpio's TFLOP count.

This ^^ is a red herring; what you are doing is basically creating a distraction from the fact that MS would not even admit a gap between both consoles, but when they had the lead they promoted it to hell and beyond.

X1X RDR2 delivered 2X the pixels of the PS4 Pro's results. Your pure-TFLOPS argument is flawed.

---

Both Vega 56/64 and X1X have quad geometry-raster units, hence there's a bottleneck with Vega 56/64. Vega 64 didn't double its geometry power in accordance with its TFLOPS scaling.

NAVI 10's geometry engine (8 prim shaders in, 4 prim shaders out) will raster better than Polaris 10/Vega's (4 prim shaders in, 4 prim shaders out).

GCN takes in 4 triangles and then rasters what isn't culled, and you will generally lose half just to backface culling. So it's really rasterizing ~2 triangles per clock on average. Some engines resorted to manually culling all triangles via compute shaders so that they could actually rasterize 4 per clock, but this workaround consumes shader resources. GCN's geometry workaround is a debacle on the level of the Radeon 2900 XT running MSAA on shaders.

NAVI 10's 8 triangles transformed/culled and 4 rasterized will actually allow the rasterizers to get close to full throughput.

Again, your pure-TFLOPS argument is flawed.

I'm not stopping.

No it is not. Vega's top-of-the-line GPU has a higher TFLOP count than Scorpio, so my argument is not flawed unless you prove the contrary, which you can't.

@tdkmillsy said:

Most 720p/1080p games on Xbox One/PS4 were early on and very likely down to development-kit issues. The standard difference between them (in most games that had a difference) was 900p vs 1080p; it might not show on a 1080p TV at normal gaming distances, but it's a difference and an advantage to the PS4.

Microsoft played down the difference in exactly the same way Sony has played down the lack of full 4K when comparing the X to the Pro.

They both do it, but fanboyism aside, what else would you expect them to do? It's a competitive market; they are all going to play up the advantages and play down the disadvantages.

It will continue in the next gen and all that follow it, but if it didn't we wouldn't have System Wars :)

Not even close. Quote Sony claiming MS doesn't have an advantage with Scorpio/Xbox One X over the Pro the way MS did.

"PS4 is not more powerful than Xbox One"

The Xbox One and the PlayStation 4 are apparently closer rivals in power than the raw stats would lead you to believe. According to Albert Penello of Microsoft who posted some interesting remarks earlier today on NeoGAF, “There’s no way that we’re giving up a 30% advantage to Sony.”

People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. We CREATED DirectX, the standard API’s that everyone programs against. So while people laude Sony for their HW skills, do you really think we don’t know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way we’re giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really huge performance difference – it would be obvious.

https://attackofthefanboy.com/news/ps4-powerful-xbox/

Hell, they would not even get the number right, because the gap was more than 40%, and still they claimed that MS would not give Sony a 30% advantage.

So yeah, there is an incredible difference between what MS does and what Sony does. MS flat-out lied about the PS4 not being stronger than the Xbox One, only to brag about a 45% stronger Xbox One X as if the thing was two generations better.

Oh come on man, get your head out of your arse. All companies play up their advantages and play down their disadvantages.

https://www.theverge.com/2017/6/5/15720206/sony-ceo-shawn-layden-interview-ps4-pro

“We see 4K as being the next HD,” says Shawn Layden, president and CEO of Sony Interactive Entertainment America, “and PlayStation 4 Pro is our answer to that opportunity.”

That's clearly stating the PS4 Pro was their answer to 4K, in direct damage limitation against the X.

They all bloody do it.


#215  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:
@ronvalencia said:
@scatteh316 said:
@Pedro said:

You would think that if RT is hardware accelerated, the core performance would be unaffected by its implementation, but that is not the case. I reckon that Nvidia's current accelerated solution (which may also be AMD's) is too limited, relying on other components of the GPU, hence the performance hit. The ideal situation is true hardware acceleration that runs parallel to the existing render pipeline.

In a sense it would be affected by RT... but the overall raw performance of the die wouldn't be.

Say Microsoft starts with a 40 CU Navi GPU and has to sacrifice 10 CUs for RT... that's a 25% reduction in raw GPU performance, but with this set-up running RT won't affect the utilisation of the remaining 30 CUs... though turning RT off won't give any raw performance back. As long as the RT cores are fast enough to make up for the reduction in CUs, it could actually work in their favour.

What's interesting is how much compute performance is going to be required for RT via shaders to match what will be possible via dedicated hardware.

If, for example, a next-generation COD game ships with RT enabled via hardware acceleration on Scarlett, it will still have 30 CUs' worth of GPU performance to handle the rest of the graphics.

Now... what if this same COD game ships on PS5? There could be several scenarios:

1. They ditch RT completely and have inferior lighting/shadows/reflections (but still not ugly looking), using all 40 CUs to push the visuals in other areas.

2. They dedicate CUs to doing RT via shaders... but what if this requires 15 CUs to achieve parity with Scarlett's dedicated hardware? That leaves 25 CUs for the rest of the graphics... less than Scarlett!

If Scarlett's, and crucially AMD's, implementation of hardware-accelerated RT is robust and performs well, Scarlett may very well be the faster console in the real world.

Or has Sony or Cerny made custom tweaks to the core logic to enable better RT performance on PS5?

It's all very interesting and it will be good to see how it all unfolds.

The Scarlett APU has been shown to be 380 to 400 mm².

An 8-core Zen 2 chiplet plus the 5700 XT comes to 321 mm².

TSMC 7nm+ has a 20 percent density increase compared to the fake 7nm.

TSMC's 1st-gen 7nm is the BS 7nm.

Will you go away... I was using 40 CUs just as an example to show CU count and how it could be affected by using dedicated RT logic... I was not claiming they'll have 40 CUs and use the full-fat die.

So bug off!

The Scarlett APU size has been revealed, stupid bug.


#216  Edited By scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@scatteh316 said:
@ronvalencia said:
@scatteh316 said:

In a sense it would be affected by RT... but the overall raw performance of the die wouldn't be.

Say Microsoft starts with a 40 CU Navi GPU and has to sacrifice 10 CUs for RT... that's a 25% reduction in raw GPU performance, but with this set-up running RT won't affect the utilisation of the remaining 30 CUs... though turning RT off won't give any raw performance back. As long as the RT cores are fast enough to make up for the reduction in CUs, it could actually work in their favour.

What's interesting is how much compute performance is going to be required for RT via shaders to match what will be possible via dedicated hardware.

If, for example, a next-generation COD game ships with RT enabled via hardware acceleration on Scarlett, it will still have 30 CUs' worth of GPU performance to handle the rest of the graphics.

Now... what if this same COD game ships on PS5? There could be several scenarios:

1. They ditch RT completely and have inferior lighting/shadows/reflections (but still not ugly looking), using all 40 CUs to push the visuals in other areas.

2. They dedicate CUs to doing RT via shaders... but what if this requires 15 CUs to achieve parity with Scarlett's dedicated hardware? That leaves 25 CUs for the rest of the graphics... less than Scarlett!

If Scarlett's, and crucially AMD's, implementation of hardware-accelerated RT is robust and performs well, Scarlett may very well be the faster console in the real world.

Or has Sony or Cerny made custom tweaks to the core logic to enable better RT performance on PS5?

It's all very interesting and it will be good to see how it all unfolds.

Scarlett's APU has been shown to be 380 to 400 mm².

An 8-core Zen 2 chiplet plus the 5700 XT comes to 321 mm².

TSMC's 7nm+ has a 20 percent density increase compared to first-generation 7nm.

TSMC's 1st-gen 7nm is the BS 7nm.

Will you go away... I was using 40 CUs just as an example of how the CU count could be affected by dedicated RT logic. I was not claiming they'll have 40 CUs and use the full-fat die.

So bug off!

Scarlett's APU size has been revealed, stupid bug.

Which has NOTHING to do with what I was talking about, you clown, so stop quoting me and replying with random shit that has nothing to do with what I'm talking about.

Jesus.......where's the block button when you need one.


#217  Edited By BoxRekt
Member since 2019 • 2425 Posts
@ronvalencia said:

Scarlett's APU size has been revealed, stupid bug.

I'd like to read about it...LINK?


#218  Edited By Pedro
Member since 2002 • 69083 Posts

@scatteh316 said:

Which has NOTHING to do with what I was talking about, you clown, so stop quoting me and replying with random shit that has nothing to do with what I'm talking about.

Jesus.......where's the block button when you need one.

AMD launched the 5700 XT AE which is over 10 TFLOPS in performance.


#219  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

Your Battlefield V RT benchmarks seem to be old.

https://www.guru3d.com/news-story/december-4-battlefield-v-raytracing-dxr-performance-patch-released-(benchmarks).html

The RTX 2080 Ti at 4K RT ultra reaches 41 fps, and ~70 fps at 1440p.

RT consumes memory bandwidth, which is shared with the shader cores.

The RTX 2070 at 4K with DXR low lands on 36 fps.

Turing's rapid packed math and variable rate shading features weren't used.

So the RTX 2080 Ti does 83 FPS with RT off and 41 FPS with it on, cutting the frame rate by more than half, and you somehow think the hit is not big?

Even with newer drivers the hit is too big. Remember, this is an RTX 2080 Ti; neither Scarlett nor the PS5 will get something like this.
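For the record, the quoted figures work out to roughly a halving of the frame rate; a quick sanity check (the fps numbers are the ones cited in this exchange, nothing here is re-benchmarked):

```python
# Sanity check on the DXR hit quoted above: RTX 2080 Ti in Battlefield V,
# 83 fps with RT off vs 41 fps with RT on (figures as cited, not re-measured).
fps_off, fps_on = 83.0, 41.0

drop = (fps_off - fps_on) / fps_off   # fraction of the frame rate lost
slowdown = fps_off / fps_on           # RT-off is this many times faster

print(f"drop: {drop:.1%}, slowdown: {slowdown:.2f}x")
# -> drop: 50.6%, slowdown: 2.02x  (a halving, not a ">100%" cut)
```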

The (bounding volume hierarchy ray-tracing) DXR hit is worse on the GTX 1080 Ti. Expensive rays, as per EA DICE's ray-tracing hierarchy definition, are very expensive. Complex or bloated shader programs also carry a significant performance hit.

DXR performs a hierarchy tree search, which can be memory intensive.

RT cores tell the pixel shader to shade the pixel's colour, and the variable rate shading feature can further reduce the shader workload with minimal geometry resolution compromise.

Hierarchy ray-tracing can be sped up by projecting fewer rays. NVIDIA's GameWorks examples are usually biased toward selling the highest SKU, e.g. HairWorks bloat vs AMD's lightweight TressFX.

Battlefield V is not using Turing's other new features such as rapid packed math and variable rate shading.

The X1X has a variable rate shading-like feature. https://www.eurogamer.net/articles/digitalfoundry-2017-the-scorpio-engine-in-depth

Andrew Goossen tells us that the GPU supports extensions that allow depth and ID buffers to be efficiently rendered at full native resolution, while colour buffers can be rendered at half resolution with full pixel shader efficiency. Based on conversations last year with Mark Cerny, there is some commonality in approach here with some of the aspects of PlayStation 4 Pro's design, but we can expect some variation in customisations - despite both working with AMD, we're reliably informed that neither Sony or Microsoft are at all aware of each other's designs before they are publicly unveiled.

The depth buffer deals with the geometry render buffer. https://developer.nvidia.com/vrworks/graphics/variablerateshading has similar functions which keep geometry resolution native while the colour shading resolution is lower, hence it's a harder pixel count along geometry edges. Vega IP doesn't have this hardware feature.

Xbox Scarlett would have Scorpio's custom GPU additions.

Sony's PS4 Pro GPU has a different approach to conserving shader resources, e.g. checkerboard resolve hardware, which doesn't exist in the PC Vega IP.
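The saving from the half-resolution colour buffer described above is easy to quantify; a rough sketch, assuming a 4K native target purely for illustration:

```python
# Shading cost of a half-resolution-per-axis colour buffer while the
# depth/ID buffers stay at native resolution, per the Scorpio engine notes.
native_w, native_h = 3840, 2160           # assumed native 4K target
native_samples = native_w * native_h      # depth/ID samples (full res)

colour_samples = (native_w // 2) * (native_h // 2)  # colour shaded at half res

saving = 1 - colour_samples / native_samples
print(f"colour samples: {colour_samples:,} ({saving:.0%} of shading saved)")
# -> colour samples: 2,073,600 (75% of shading saved)
```

Halving resolution per axis quarters the shaded sample count, which is why the technique is attractive when geometry edges are preserved separately.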


#220  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

Your percentage argument hides the real pixel count difference.

I have a 7970 at 1 GHz and a factory 900 MHz 7950 (OC'd to 950 MHz), which I could CrossFire prior to my R9-290X purchase.

I haven't bought an XBO or PS4.

@tormentos said:

You trying to downplay the gap using the damn 7970 is a freaking joke, so I assume the gap between the PS4 Pro and the XBO X is also small, since, you know, Vega has GPUs way over Scorpio's TFLOP count.

This ^^ is a red herring; what you are doing is basically creating a distraction from the fact that MS would not even admit a gap between both consoles, but when they had the lead they promoted it to hell and beyond.

X1X's RDR2 delivered 2X the pixels of the PS4 Pro. Your pure-TFLOPS argument is flawed.

---

Both Vega 56/64 and the X1X have quad geometry-raster units, hence there's a bottleneck with Vega 56/64. Vega 64 didn't double its geometry power in line with its TFLOPS scaling.

NAVI 10's geometry engine (8 prim shaders in, 4 prim shaders out) will raster better than Polaris 10/Vega's (4 prim shaders in, 4 prim shaders out).

GCN takes in 4 triangles and then rasters what isn't culled, and you will generally lose half just to backface culling, so it's really rasterizing ~2 triangles per clock on average. Some engines resorted to manually culling all triangles via compute shaders so that they could actually rasterize 4 per clock, but this workaround consumes shader resources. GCN's geometry workaround is a debacle on the level of the Radeon 2900 XT running MSAA on shaders.

NAVI 10's 8 triangles transformed/culled and 4 rasterized will actually allow the rasterizers to get close to full throughput.

Again, your pure TFLOPS argument is flawed.

I'm not stopping.

No it is not. Vega's top-of-the-line GPU has a higher TFLOP count than Scorpio, so my argument is not flawed unless you prove the contrary, which you can't.

GCN's TFLOPS don't scale linearly above the R9-390X (5.9 TFLOPS), i.e. there's an IPC drop-off. Let's post more Battlefield benchmarks with the R9-390X. AMD can't keep adding CUs to Hawaii's quad shader engine layout.

GCN TFLOPS scale better with higher clock speed and fewer CUs, e.g. Vega 56 at 1710 MHz (~12 TFLOPS) beating a Strix Vega 64 OC at 1590 MHz (~13 TFLOPS).

I have advocated an RX-5700 XT-like idea, which is Hawaii's 44-CU GCN scaled with Vega's IP improvements, e.g. higher clock speed than Polaris and the large Vegas, Polaris DCC, better culling, ROPS with 4MB of L2 cache, etc. Goal: improve hardware raster with a reduced CU count for extra TDP headroom.

It's harder to populate data with very wide GPUs. NAVI has double-width texture filters, double the branch units per CU and double wave32 (like CUDA v7, Volta).

The GTX 1080 Ti is like Vega at 56 CUs in shader count, with higher clock speed, 6 shader engines and 88 ROPS.

The Titan XP is like Vega at 60 CUs in shader count, with higher clock speed, 6 shader engines and 96 ROPS.

The GTX 1080 is like Vega at 40 CUs in shader count, with higher clock speed, 4 shader engines and 64 ROPS. The GTX 1080 usually competes against the RTX 2070. Does the GTX 1080 sound familiar to the RX 5700 XT? The GTX Turing baseline IP has ACE units, RPM, variable rate shading and improved cache features.

12nm Volta CUDA v7 acted like RDNA v1 for NVIDIA.

Turing's CUDA hardware version is 7.5, and it's the second 12 nm CUDA v7 generation.
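All the TFLOPS figures traded in this thread follow from one formula: CUs × 64 lanes × 2 FLOPs per clock (FMA) × clock speed. A quick check against the clocks quoted above (clock figures as cited in-thread, not independently verified):

```python
def fp32_tflops(cus: int, clock_mhz: float) -> float:
    """Peak FP32 throughput for a GCN/RDNA-style GPU:
    64 shader lanes per CU, 2 FLOPs per lane per clock (FMA)."""
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(round(fp32_tflops(56, 1710), 1))  # Vega 56 OC          -> 12.3
print(round(fp32_tflops(64, 1590), 1))  # Strix Vega 64 OC    -> 13.0
print(round(fp32_tflops(40, 1980), 1))  # RX 5700 XT AE boost -> 10.1
```

The last line also matches the "over 10 TFLOPS" claim for the 5700 XT AE earlier in the thread.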


#221 BoxRekt
Member since 2019 • 2425 Posts
@boxrekt said:
@ronvalencia said:

Scarlett's APU size has been revealed, stupid bug.

I'd like to read about it...LINK?

I see you've posted twice since I wrote this.

Where's the link for this argument you keep referring to about Scarlett's APU size?


#222 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

Oh come on man, get your head out of your arse. All companies play up there advantages and play down their disadvantages.

https://www.theverge.com/2017/6/5/15720206/sony-ceo-shawn-layden-interview-ps4-pro

“We see 4K as being the next HD,” says Shawn Layden, president and CEO of Sony Interactive Entertainment America, “and PlayStation 4 Pro is our answer to that opportunity.”

Clearly stating PS4 Pro was their answer to 4k, in a direct damage limitation to the X.

They all bloody do it.

WTF?

No no really WTF?

I quoted Albert Penello directly denying an advantage on Sony's side, and you bring me a vague quote from Sony that doesn't mean at all what you think it means.

Tell me, how does that quote prove that Sony denied the advantage MS had?

In fact, what they claimed there is that the PS4 Pro was the answer they had for 4K; the PS4 Pro was already out, and we already knew about checkerboarding and all that stuff.

You are so blinded by your MS love it's not even funny.

@scatteh316 said:

1. It is idiocy..... you might as well use the performance drop from a GTX8800 Ultra to show the performance drop.

2. Yes it is

3. More unrelated and irrelevant dribble

4. Stupid logic..... Especially are you trying to use a completely unrelated GPU as a basis to try and predict the performance hit of a brand new architecture.

1- Nice spin; the 8800 GTX is not a current-gen GPU, dude. The RTX 2080 Ti is, and it's basically the best one.

2- No it is not, at all.

3- By this point you are basically out of ammo, since I also brought up how MS gave more details about their SSD implementation.

4- Bullshit, considering that ray tracing has an impact whether in software or hardware, but OK, I'll go ahead and bookmark this thread for future reference.


#223  Edited By ronvalencia
Member since 2008 • 29612 Posts

@boxrekt said:
@ronvalencia said:

Scarlett's APU size has been revealed, stupid bug.

I'd like to read about it...LINK?

I recycled Tormentos' own resetera link, in which one of the posters estimated Scarlett's APU size using the GDDR6 chip size as a reference.


#225  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:
@ronvalencia said:
@scatteh316 said:
@ronvalencia said:

Scarlett's APU has been shown to be 380 to 400 mm².

An 8-core Zen 2 chiplet plus the 5700 XT comes to 321 mm².

TSMC's 7nm+ has a 20 percent density increase compared to first-generation 7nm.

TSMC's 1st-gen 7nm is the BS 7nm.

Will you go away... I was using 40 CUs just as an example of how the CU count could be affected by dedicated RT logic. I was not claiming they'll have 40 CUs and use the full-fat die.

So bug off!

Scarlett's APU size has been revealed, stupid bug.

Which has NOTHING to do with what I was talking about, you clown, so stop quoting me and replying with random shit that has nothing to do with what I'm talking about.

Jesus....... where's the block button when you need one.

Your argument is not based on the historical patterns of MS's X1X and Sony's PS4 Pro, both ASIC designs with larger GCN CU scaling than AMD's own RX-480/RX-580 36-CU ASIC.

MS's X1X vs RX-580 basic parameters

  • X1X GPU's base clock speed is 93 percent from RX-580's base clock speed.
  • X1X GPU's base clock speed is 85 percent from RX-580's boost clock speed.
  • X1X GPU has base clock speed 6 TFLOPS FP32 with lower power consumption when compared to RX-580's boost mode 6 TFLOPS 185 watts.
  • X1X GPU has 12 GDDR5-6800 chips to be powered vs RX-580's eight GDDR5-8000 chips. X1X has the superiority at 4K over RX-580.
  • The entire X1X box has about 185 watts.
  • MS paid for 44 CU GCN.

Sony's PS4 Pro vs RX-480 basic parameters

  • PS4Pro GPU's base clock speed is 81 percent from RX-480's base clock speed at 1120 Mhz
  • PS4Pro GPU's base clock speed is 72 percent from RX-480's boost clock speed at 1266 MHz
  • PS4Pro GPU has base clock speed 4.2 TFLOPS FP32 with lower power consumption when compared to RX-480's boost mode 5.8 TFLOPS with ~150 watts.
  • Sony paid for 40 CU GCN.

MS has rivalled AMD PC desktop's RX-580's basic 6 TFLOPS numbers while X1X GPU consumes less power with 4 additional GDDR5 chips + ROPS 2MB render cache + variable shading rate features when compared to AMD's RX-580 6 TFLOPS 185 watts configuration.

The fuckin' NAVI 10 at 251 mm² already has a Polaris 10-like (232 mm²) chip size.

Somebody on the resetera forum estimated the Scarlett APU size to be in the 380 to 400 mm² range from the GDDR6 memory chip's size.

An 8-core Zen 2 chiplet has a 70 mm² size.

Go ahead, fill in the blanks.

It's you who needs to be blocked! Tormentos doesn't have a monopoly on weaponizing resetera links.

Deal with it.


#226  Edited By tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

I recycled Tormentos' own resetera link, in which one of the posters estimated Scarlett's APU size using the GDDR6 chip size as a reference.

You didn't recycle anything, and Matt confirmed both have HB RT and you refuse to admit it, but somehow you want to use a chip size reference from the same site. Talk about being a two-faced hypocrite. Congratulations, Ron.

@ronvalencia said:

Your argument is not based on historical patterns from MS's X1X and Sony's PS4 Pro with both ASIC designs having larger GCN CU scaling over AMD's own RX-480/RX-580 36 CU ASIC.

MS's X1X vs RX-580 basic parameters

  • X1X GPU's base clock speed is 93 percent from RX-580's base clock speed.
  • X1X GPU's base clock speed is 85 percent from RX-580's boost clock speed.
  • X1X GPU has base clock speed 6 TFLOPS FP32 with lower power consumption when compared to RX-580's boost mode 6 TFLOPS 185 watts.
  • X1X GPU has 12 GDDR5-6800 chips to be powered vs RX-580's eight GDDR5-8000 chips. X1X has the superiority at 4K over RX-580.
  • The entire X1X box has about 185 watts.
  • MS paid for 44 CU GCN.

Sony's PS4 Pro vs RX-480 basic parameters

  • PS4Pro GPU's base clock speed is 81 percent from RX-480's base clock speed at 1120 Mhz
  • PS4Pro GPU's base clock speed is 72 percent from RX-480's boost clock speed at 1266 MHz
  • PS4Pro GPU has base clock speed 4.2 TFLOPS FP32 with lower power consumption when compared to RX-480's boost mode 5.8 TFLOPS with ~150 watts.
  • Sony paid for 40 CU GCN.

MS has rivalled AMD PC desktop's RX-580's basic 6 TFLOPS numbers while X1X GPU consumes less power with 4 additional GDDR5 chips + ROPS 2MB render cache + variable shading rate features when compared to AMD's RX-580 6 TFLOPS 185 watts configuration.

The fukin NAVI 10's 251 mm2 already has Polaris 10's 232 mm2 like chip size already.

Somebody in resetara forum estimated Scarlet APU size to be 380 to 400 mm2 range from GDDR6 memory chip's size.

8 core Zen v2 chiplet has 70 mm2 size.

Go ahead fill in the blanks.

It's you who need to be blocked! Tormented hasn't have a monopoly on weaponizing resetera link.

Deal with it.

This is what I hate about your shitty argument: you talk as if freaking MS were better at engineering GPUs than AMD is. That's how stupid and blind you sound every time.

Let me tell you this: MS rivaled shit.

They got a bigger GPU than the RX 580 and DOWNCLOCKED IT so it would draw fewer watts. There is no freaking science behind it; higher clock, more watts consumed; lower clock, fewer watts consumed. There is no surprise to this, nothing groundbreaking at all.

Even with a vapor chamber and a ton of other bullshit they could not match the clock speed of the RX 580, and with 4 CUs fewer the RX 580 can still beat the Xbox One X by considerable margins in some games, like Forza Horizon 4, which is a game designed from the ground up on Xbox: it can hit 60 FPS almost all the time on ultra at 1440p while the Xbox One X has to drop all the way to 1080p.

And that's when both GPUs have about the same FLOP power, and the X has more RAM and higher bandwidth, by your own arguments.

I hit you in the face with Matt's claim about HB RT, which you refuse to admit, but now you want to use a chip size estimation from resetera, in a totally hypocritical way.


#228  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

I recycled Tormentos' own resetera link, in which one of the posters estimated Scarlett's APU size using the GDDR6 chip size as a reference.

You didn't recycle anything, and Matt confirmed both have HB RT and you refuse to admit it, but somehow you want to use a chip size reference from the same site. Talk about being a two-faced hypocrite. Congratulations, Ron.

@ronvalencia said:

Your argument is not based on historical patterns from MS's X1X and Sony's PS4 Pro with both ASIC designs having larger GCN CU scaling over AMD's own RX-480/RX-580 36 CU ASIC.

MS's X1X vs RX-580 basic parameters

  • X1X GPU's base clock speed is 93 percent from RX-580's base clock speed.
  • X1X GPU's base clock speed is 85 percent from RX-580's boost clock speed.
  • X1X GPU has base clock speed 6 TFLOPS FP32 with lower power consumption when compared to RX-580's boost mode 6 TFLOPS 185 watts.
  • X1X GPU has 12 GDDR5-6800 chips to be powered vs RX-580's eight GDDR5-8000 chips. X1X has the superiority at 4K over RX-580.
  • The entire X1X box has about 185 watts.
  • MS paid for 44 CU GCN.

Sony's PS4 Pro vs RX-480 basic parameters

  • PS4Pro GPU's base clock speed is 81 percent from RX-480's base clock speed at 1120 Mhz
  • PS4Pro GPU's base clock speed is 72 percent from RX-480's boost clock speed at 1266 MHz
  • PS4Pro GPU has base clock speed 4.2 TFLOPS FP32 with lower power consumption when compared to RX-480's boost mode 5.8 TFLOPS with ~150 watts.
  • Sony paid for 40 CU GCN.

MS has rivalled AMD PC desktop's RX-580's basic 6 TFLOPS numbers while X1X GPU consumes less power with 4 additional GDDR5 chips + ROPS 2MB render cache + variable shading rate features when compared to AMD's RX-580 6 TFLOPS 185 watts configuration.

The fukin NAVI 10's 251 mm2 already has Polaris 10's 232 mm2 like chip size already.

Somebody in resetara forum estimated Scarlet APU size to be 380 to 400 mm2 range from GDDR6 memory chip's size.

8 core Zen v2 chiplet has 70 mm2 size.

Go ahead fill in the blanks.

It's you who need to be blocked! Tormented hasn't have a monopoly on weaponizing resetera link.

Deal with it.

This is what I hate about your shitty argument: you talk as if freaking MS were better at engineering GPUs than AMD is. That's how stupid and blind you sound every time.

Let me tell you this: MS rivaled shit.

1. They got a bigger GPU than the RX 580 and DOWNCLOCKED IT so it would draw fewer watts. There is no freaking science behind it; higher clock, more watts consumed; lower clock, fewer watts consumed. There is no surprise to this, nothing groundbreaking at all.

Even with a vapor chamber and a ton of other bullshit they could not match the clock speed of the RX 580, and with 4 CUs fewer the RX 580 can still beat the Xbox One X by considerable margins in some games, like Forza Horizon 4, which is a game designed from the ground up on Xbox: it can hit 60 FPS almost all the time on ultra at 1440p while the Xbox One X has to drop all the way to 1080p.

And that's when both GPUs have about the same FLOP power, and the X has more RAM and higher bandwidth, by your own arguments.

I hit you in the face with Matt's claim about HB RT, which you refuse to admit, but now you want to use a chip size estimation from resetera, in a totally hypocritical way.

1. Polaris 30 RX-590's (225 watts TDP / 36 CU) scaled to 64 CU = 400 watts. Meanwhile Vega 64 has 295 watts TDP.

RX-590's 1545 Mhz has similar boost clock speed to Vega 64's 1536 Mhz.

AMD's perf/watts road map is real.

https://www.tomshardware.com/reviews/amd-radeon-rx-580-review,5020-6.html

RX-580's actual typical gaming power consumption is 209 watts, hence it overshot AMD's 185-watt TDP target.

X1X typical gaming power consumption for the entire machine is about 172 watts (https://www.anandtech.com/show/11992/the-xbox-one-x-review/6), hence the GPU is somewhere around 145 watts, with the CPU at 20 watts + the 2.5-inch HDD at 2.5 watts + the cut-down southbridge at ~2.5 watts.

https://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html

The RX 480's gaming power consumption is 164 watts, which overshot AMD's 150-watt TDP target. The PS4 Pro GPU was reduced to 911 MHz to budget for the CPU + 2.5-inch HDD + optical drive + controller.

  • AMD's PC Polaris GPUs don't have X1X's auto-undervolting VRM design
  • RX-580 doesn't have silicon maturity like Vega
  • X1X GPU's perf/watt is NOT PC Polaris's. AMD/MS didn't sit back doing nothing between RX-480's June 2016 release and X1X's first engineering silicon in December 2016.

X1X's perf/watt against the PS4 is about 3.2, which is about Vega's perf/watt.

PS4 Pro's perf/watt against the PS4 is about 2.3, which is about 1st-gen Polaris's perf/watt.

Try again.
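Both power arguments above are plain arithmetic; sketched here with the wattages as quoted (the per-component splits are the post's estimates, not measurements):

```python
# Linear TDP scaling: RX-590's 36 CUs at 225 W stretched to a 64-CU part.
rx590_tdp_w, rx590_cus = 225, 36
scaled_64cu_w = rx590_tdp_w / rx590_cus * 64
print(round(scaled_64cu_w))          # -> 400 (W), vs Vega 64's 295 W TDP

# X1X GPU budget by subtraction from the ~172 W whole-box gaming draw.
box_w, cpu_w, hdd_w, southbridge_w = 172, 20, 2.5, 2.5
gpu_w = box_w - cpu_w - hdd_w - southbridge_w
print(gpu_w)                         # -> 147.0 (W) left for GPU + memory
```

Linear scaling ignores the voltage/frequency curve, so the 400 W figure is an upper-bound illustration rather than a measured number.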

-------------

1. Matt didn't claim hardware accelerated ray-tracing. Matt is not Sony.

2. Sony is known to claim cheap rays as ray-tracing. RX-480 Polaris already has ray-casting (cheap rays) DSP instructions.

3. Sony claims ray-tracing support


#229  Edited By ronvalencia
Member since 2008 • 29612 Posts

@EG101:

HC-14 GDDR6 comes from Samsung.

It's GDDR6-14000.

320-bit GDDR6-14000???? LOL

From https://forum.beyond3d.com/threads/next-generation-hardware-speculation-with-a-technical-spin-post-e3-2019.61245/page-2

About twice the memory bandwidth of Xbox Scorpio.

https://forum.beyond3d.com/threads/next-generation-hardware-speculation-with-a-technical-spin-post-e3-2019.61245/page-8

Alright, broke out the measuring tape and found out some things about Navi and the Anaconda in terms of sizes on 7nm.

5700:

GDDR6 PHY controller: 4.5 mm² x 8

Dual CU: 3.37 mm² x 20

4-ROP cluster: 0.55 mm² x 16

L1 + L2 + ACE + geometry processor + empty buffer space + etc: 139 mm²

Now Anaconda:

A rougher estimate using the 12x14 mm GDDR6 chips next to the SOC:

370-390 mm².

It's a bit bigger than the 1X SOC for sure.

If we use the figure of 380 mm²:

75 mm² for the CPU

45 mm² for 10 GDDR6 controllers

8.8 mm² for ROPs

140 mm² for buses, caches, ACEs, geometry processors, shape, etc. I might be overestimating this part as the 5700 seems to have lots of "empty" areas.

We have ~110 mm² left for CUs + RT hardware. There is enough there for ~30 dual CUs and RT extensions.

Conclusion:

The Anaconda SOC is around the minimum size you need to fit the maximum Navi GPU and Zen 2 cores.

I expect Anaconda to have a minimum of 48 CUs if the secret sauce is extra heavy, or 60 CUs if the sauce is light.

https://forum.beyond3d.com/threads/next-generation-hardware-speculation-with-a-technical-spin-post-e3-2019.61245/page-9

>24 mm × >16 mm = >384 mm².

Scarlett is Xbox Anaconda, the follow-on from Xbox Scorpio.

XBox Scarlett might end up using RDNA2 with AMD's own RTRT, since backward compatibility there works mostly on a DX12/OS level, like a PC.
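The Beyond3D estimate quoted above is just an area budget; the arithmetic reproduces like this (all mm² figures are that poster's tape measurements, and the 380 mm² Anaconda die is an assumption):

```python
# Navi 10 (RX 5700) area accounting from the quoted Beyond3D post (mm^2).
gddr6_phy = 4.5 * 8        # 8 GDDR6 PHY/controller blocks
dual_cus  = 3.37 * 20      # 20 dual-CU blocks (40 CUs)
rops      = 0.55 * 16      # 16 four-ROP clusters
misc      = 139            # L1/L2, ACEs, geometry, buses, empty space
print(round(gddr6_phy + dual_cus + rops + misc, 1))  # ~251, matches Navi 10

# Anaconda: area left for CUs + RT at the assumed 380 mm^2 die size.
die, cpu, mem_ctl, rop_area, misc2 = 380, 75, 4.5 * 10, 8.8, 140
cu_budget = die - cpu - mem_ctl - rop_area - misc2
print(round(cu_budget, 1))                 # ~111 mm^2 for CUs + RT hardware
print(int(cu_budget // 3.37) * 2)          # 64 CUs max with no RT hardware
```

The ~111 mm² budget is what forces the trade-off the poster describes: roughly 30 dual CUs plus RT extensions, or up to ~64 CUs with none.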


#230 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

1. Polaris 30 RX-590's (225 watts TDP / 36 CU) scaled to 64 CU = 400 watts. Meanwhile Vega 64 has 295 watts TDP.

RX-590's 1545 Mhz has similar boost clock speed to Vega 64's 1536 Mhz.

AMD's perf/watts road map is real.

https://www.tomshardware.com/reviews/amd-radeon-rx-580-review,5020-6.html

RX-580's actual typical gaming power consumption is 209 watts, hence it overshot AMD's 185 watts TDP target.

X1X typical gaming power consumption for the entire machine is about 172 watts. https://www.anandtech.com/show/11992/the-xbox-one-x-review/6 hence the GPU is somewhere 145 watts where CPU has 20 watts + 2.5 inch HDD has 2.5 watts + cut-down South Bridge has ~2.5 watts.

https://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html

RX 480's gaming has 164 watts power consumption which overshot AMD's 150 watts TDP target. PS4 Pro GPU was reduce to 911 Mhz to budget for CPU+2.5 HDD+Optical drive+controller.

  • AMD's PC Polaris GPUs doesn't have X1X's auto-under voltage VRm design
  • RX-580 doesn't have silicon maturity like Vega
  • X1X GPU's perf/watt is NOT PC Polaris's. AMD/MS didn't sit back doing nothing between RX-480's June 2016 release and X1X's first engineering silicon in December 2016.

X1X's perf/watt against PS4 is about 3.2 which is about Vega's perf/watt

PS4 Pro's perf/watt against PS4 is about 2.3 which is about 1st gen Polaris's perf/watt

Try again.

-------------

1. Matt didn't claim hardware accelerated ray-tracing. Matt is not Sony.

2. Sony is known to claim cheap rays as ray-tracing. RX-480 Polaris already has ray-casting (cheap rays) DSP instructions.

3. Sony claims ray-tracing support

1- Nice spin; I said 580, not 590.

2- You are a two-faced hypocrite: on one side you downplay Matt's claims about HB RT because you claim Matt is not Sony, but on the other you use Beyond3D and resetera estimates not coming from MS. You are a JOKE, a total buffoon who can't admit to being wrong, period.

3- They are also known for adding extra features to their GPUs; when Sony claimed cheap rays it was on previous hardware, not on this one, ass.

4- Yes, and they also claim an SSD, but it was MS who claimed they use the SSD as virtual RAM, while Sony didn't claim that.

The difference here is that MS went into more detail than Sony has. Again, you fu**ing better hope Sony doesn't have HB RT, because every time you post here I will quote you, every day for a month, pasting the same information just to have fun.

Sony claimed ray tracing; that is in no way an admission that it is software based. An omission is not an affirmation.

Contrary to what MS did, when they claimed the most powerful console THEY created, and not the most powerful console ever like they did with Scorpio.

Yet here you are breaking your back trying to prove MS's SoC is bigger so that you can imply Scarlett is more powerful, based on assumptions and estimations from people who are not MS.


#231 ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

1. Polaris 30 RX-590's (225 watts TDP / 36 CU) scaled to 64 CU = 400 watts. Meanwhile Vega 64 has 295 watts TDP.

RX-590's 1545 Mhz has similar boost clock speed to Vega 64's 1536 Mhz.

AMD's perf/watts road map is real.

https://www.tomshardware.com/reviews/amd-radeon-rx-580-review,5020-6.html

RX-580's actual typical gaming power consumption is 209 watts, hence it overshot AMD's 185 watts TDP target.

X1X typical gaming power consumption for the entire machine is about 172 watts. https://www.anandtech.com/show/11992/the-xbox-one-x-review/6 hence the GPU is somewhere 145 watts where CPU has 20 watts + 2.5 inch HDD has 2.5 watts + cut-down South Bridge has ~2.5 watts.

https://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html

RX 480's gaming has 164 watts power consumption which overshot AMD's 150 watts TDP target. PS4 Pro GPU was reduce to 911 Mhz to budget for CPU+2.5 HDD+Optical drive+controller.

  • AMD's PC Polaris GPUs doesn't have X1X's auto-under voltage VRm design
  • RX-580 doesn't have silicon maturity like Vega
  • X1X GPU's perf/watt is NOT PC Polaris's. AMD/MS didn't sit back doing nothing between RX-480's June 2016 release and X1X's first engineering silicon in December 2016.

X1X's perf/watt against PS4 is about 3.2 which is about Vega's perf/watt

PS4 Pro's perf/watt against PS4 is about 2.3 which is about 1st gen Polaris's perf/watt

Try again.

-------------

1. Matt didn't claim hardware accelerated ray-tracing. Matt is not Sony.

2. Sony is known to claim cheap rays as ray-tracing. RX-480 Polaris already has ray-casting (cheap rays) DSP instructions.

3. Sony claims ray-tracing support

1- Nice spin; I said 580, not 590.

2- You are a two-faced hypocrite: on one side you downplay Matt's claims about HB RT because you claim Matt is not Sony, but on the other you use Beyond3D and resetera estimates not coming from MS. You are a JOKE, a total buffoon who can't admit to being wrong, period.

3- They are also known for adding extra features to their GPUs; when Sony claimed cheap rays it was on previous hardware, not on this one, ass.

4- Yes, and they also claim an SSD, but it was MS who claimed they use the SSD as virtual RAM, while Sony didn't claim that.

The difference here is that MS went into more detail than Sony has. Again, you fu**ing better hope Sony doesn't have HB RT, because every time you post here I will quote you, every day for a month, pasting the same information just to have fun.

5. Sony claimed ray tracing; that is in no way an admission that it is software based. An omission is not an affirmation.

6. Contrary to what MS did, when they claimed the most powerful console THEY created, and not the most powerful console ever like they did with Scorpio.

7. Yet here you are breaking your back trying to prove MS's SoC is bigger so that you can imply Scarlett is more powerful, based on assumptions and estimations from people who are not MS.

1. Too bad for you, I normalized Polaris's boost clock speed against Vega 64's for an apples-to-apples comparison and scaled Polaris to 64 CUs.

2. Matt is not Sony. Matt didn't claim hardware-accelerated ray-tracing. Sony is known to play loose with cheap rays as ray-tracing, hence additional confirmation is required from Sony, which should state hardware-accelerated ray-tracing instead of blabbering about custom 3D audio hardware. PS5's hardware-accelerated ray-tracing remains unconfirmed.

I denote my speculation with an "IF" statement. YOU CAN'T READ. YOU DON'T KNOW A SIMPLE IF STATEMENT. An IF statement has a simple true-or-false condition.

3. Sony is known to play loose with cheap rays as ray-tracing on PS4 and PS3.

4. My argument is not about the SSD. Using the SSD as virtual memory is like a PC's virtual memory swap file running on a very fast SSD, which is a nothing-burger.

5. PS5's hardware-accelerated ray-tracing remains unconfirmed by Sony.

6. Phil Spencer has specifically addressed this argument, i.e. they (MS) don't know the PS5 and they don't have a PS5 dev kit, hence they wouldn't claim "the most powerful games console". Phil also stated Sony sources their hardware from AMD.

Both MS and Sony pay for their own silicon with near-zero risk for AMD.

7. The PS5 being stronger than Scarlett based on an early dev kit has been debunked.

YOU ARE WRONG.


#232 Martin_G_N
Member since 2006 • 2124 Posts

I doubt having hardware-based ray tracing is important for consoles as long as they have enough power and bandwidth; they can do it in software. And I'm sure first-party titles on PS5 will be some of the best-looking games and have RT. MS could lock CUs for RT only and call it hardware-based, but that's a stupid idea if devs want to use the available power for physics or other stuff.


#233 tdkmillsy
Member since 2003 • 5819 Posts

@tormentos said:
@tdkmillsy said:

Oh come on man, get your head out of your arse. All companies play up their advantages and play down their disadvantages.

https://www.theverge.com/2017/6/5/15720206/sony-ceo-shawn-layden-interview-ps4-pro

“We see 4K as being the next HD,” says Shawn Layden, president and CEO of Sony Interactive Entertainment America, “and PlayStation 4 Pro is our answer to that opportunity.”

Clearly stating the PS4 Pro was their answer to 4K, as direct damage limitation against the X.

They all bloody do it.

WTF?

No, no, really, WTF?

I quoted Albert Panello directly denying an advantage on Sony's side, and you bring me a vague quote from Sony that doesn't mean at all what you think it means.

Tell me, how does that quote prove that Sony denied the advantage MS had?

In fact, what they claimed there is that the PS4 Pro was their answer to 4K; the PS4 Pro was already out, and we already knew about checkerboarding and all that.

You are so blinded by your MS love it's not even funny.

@scatteh316 said:

1. It is idiocy..... you might as well use the performance drop of a GeForce 8800 Ultra to show the performance drop.

2. Yes it is.

3. More unrelated and irrelevant drivel.

4. Stupid logic..... especially as you are trying to use a completely unrelated GPU as a basis to predict the performance hit of a brand-new architecture.

1-Nice spin; the 8800 GTX is not a current-gen GPU, dude. The RTX 2080 Ti is, and it's basically the best one.

2-No, it is not at all.

3-By this point you are basically out of ammo, since I also brought up how MS gave more details about their SSD implementation.

4-Bullshit, considering that ray tracing has an impact whether in software or hardware; but OK, I'll go ahead and bookmark this thread for future reference.

You still didn't answer the question.

You didn't answer many, but go answer this one.

With what we know now:

Would you bet your life savings and house on Sony releasing a more powerful console?


#234 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

@tdkmillsy: Are you really trying to reason with Tormy?


#235 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

You still didn't answer the question.

You didn't answer many, but go answer this one.

With what we know now:

Would you bet your life savings and house on Sony releasing a more powerful console?

First of all, why would I bet anything on who has the strongest hardware?

Second, why are you deviating from the point here? I proved how Albert Panello would not even admit the PS4's advantage. I asked you to show me Sony doing the same, and you posted some shitty-ass quote which in no way shows Sony doing anything even close to that.

Basically you were grasping, nothing more.

MS was fast to hype an advantage that wasn't even confirmed, because Sony never confirmed the specs; but this time MS doesn't dare claim to have the world's most powerful console, and you hide behind MS playing it safe, when they didn't in 2016.

When MS has an advantage they exploit it; when they don't, it's damage-control time.

@phbz said:

@tdkmillsy: Are you really trying to reason with Tormy?

Actually, reasoning with a lemming is much more work than reasoning with me, but you can't see that, for obvious reasons.

@ronvalencia said:

1. Too bad for you: I normalized Polaris's boost clock against Vega 64's for an apples-to-apples comparison and scaled Polaris to 64 CUs.

2. Matt is not Sony. Matt didn't claim hardware-accelerated ray-tracing. Sony is known to play loose by passing cheap rays off as ray-tracing, hence additional confirmation is required from Sony, which should state "hardware-accelerated ray-tracing" instead of blabbering about custom 3D-audio hardware. PS5's hardware-accelerated ray-tracing remains unconfirmed.

I denote my speculation with an "IF" statement. YOU CAN'T READ. YOU DON'T KNOW A SIMPLE IF STATEMENT. An IF statement has a simple condition that could turn out either true or false.

3. Sony is known to play loose by passing cheap rays off as ray-tracing on PS4 and PS3.

4. My argument is not about the SSD. Using the SSD as virtual memory is like a PC's virtual-memory swap file running on a very fast SSD, which is a nothing-burger.

5. PS5's hardware-accelerated ray-tracing remains unconfirmed by Sony.

6. Phil Spencer has specifically addressed this argument: they (MS) don't know the PS5 and don't have a PS5 dev kit, hence they wouldn't claim "the most powerful games console". Phil also stated Sony sourced their hardware from AMD.

Both MS and Sony pay for their own silicon, with near-zero risk for AMD.

7. PS5 being stronger than Scarlett based on an early dev kit has been debunked.

YOU ARE WRONG.

1-I said 580, not 590; stop spinning.

2-MS didn't confirm chip size; Beyond3D is not MS. See how easy it is to own you? The fun part is that while you claim Matt is not Sony, you are quoting Panello, who is a liar, is MS-biased, was already exposed, doesn't even work for MS any more, and, since he never worked for Sony, can't speak from experience about Sony either.

Oh great, you hypocrite: why didn't MS confirm 8 Zen cores like Sony did, instead of making bullshit claims about 4X Xbox One X performance for Scarlett?

So does MS have 8 cores or 4? You of all people, who rode those extra 150 MHz on a shitty Jaguar for the Xbox One, and later for Scorpio as well, should be pulling your hair out. If this turns out to be true and MS ends up with 4 cores / 8 threads while Sony ends up with 8 cores / 16 threads, it would be epic, and I already saw you doing some damage control about it.

NO, you insert an "if" and then talk as if it were FACT, which is the problem.

3-See, you're doing it again, trying to imply Sony has cheap rays by pointing at PS4 and PS3 effects that were meant to resemble ray tracing; again you are treating this as FACT.

4-But I make it a point because MS went into more detail while Sony didn't; that doesn't mean Sony's version is not efficient or won't be the same as Scarlett's.

5-Yes, and so does MS having the world's most powerful console, and so do the 8 cores, which they have not confirmed; neither have they confirmed chip size, yet you can't stop using Beyond3D info as some kind of fact, you two-faced hypocrite.

6-MS didn't know the PS4 Pro's true specs when they claimed the most powerful console in 2016. A leak doesn't equal a confirmation; that is what you are trying to imply here, right? It's also what you use to downplay Matt, so please don't hide behind the Pro's leaked specs; those were not confirmed by Sony until after Scorpio's specs were already announced.

7-Albert Panello didn't debunk shit, buffoon. He doesn't work for MS anymore and never worked for Sony, so he can't know whether Sony's dev kits are final or not; and the Xbox 360 is not Scarlett.

As I already proved, developers are given a target spec they can shoot for regardless of whether they have final kits.


#236  Edited By Blackrose62090
Member since 2004 • 7806 Posts

Microsoft wouldn't know how to make a good console if Will Smith himself came out of the genie bottle and granted them wishes. I look forward to playing Killzone 7 on my PS5, I've already pre-ordered my season pass.


#237 Pedro
Member since 2002 • 69083 Posts
@tormentos said:
@phbz said:

@tdkmillsy: Are you really trying to reason with Tormy?

Actually, reasoning with a lemming is much more work than reasoning with me, but you can't see that, for obvious reasons.

Not only is that factually false, you are a severe case of Sony ass-kissing, as you yourself have confirmed.

"Yes proven and certified By sony's it self. :)" Tormentos


#238  Edited By tdkmillsy
Member since 2003 • 5819 Posts

@tormentos said:
@tdkmillsy said:

You still didn't answer the question.

You didn't answer many, but go answer this one.

With what we know now:

Would you bet your life savings and house on Sony releasing a more powerful console?

First of all, why would I bet anything on who has the strongest hardware?

Second, why are you deviating from the point here? I proved how Albert Panello would not even admit the PS4's advantage. I asked you to show me Sony doing the same, and you posted some shitty-ass quote which in no way shows Sony doing anything even close to that.

Basically you were grasping, nothing more.

MS was fast to hype an advantage that wasn't even confirmed, because Sony never confirmed the specs; but this time MS doesn't dare claim to have the world's most powerful console, and you hide behind MS playing it safe, when they didn't in 2016.

When MS has an advantage they exploit it; when they don't, it's damage-control time.


Failed again to answer the question; or was that a no? Because you know there is nothing conclusive to take from anything so far.

A major failure at all levels: happy to talk the talk, but when it comes down to it you don't even believe what you're saying yourself.

Why would I provide anything? I said they all play up their advantages, which they do.

Nobody is going to acknowledge a competitor's advantage. You're just being daft.


#239 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

Failed again to answer the question.

A major failure at all levels: happy to talk the talk, but when it comes down to it you don't even believe what you're saying yourself.

Why would I provide anything? I said they all play up their advantages, which they do.

Nobody is going to acknowledge a competitor's advantage. You're just being daft.

Well, you can hide behind your questions all you want; the PS5 being more powerful is not confirmed by any means, and I can admit that.

Now, MS's wording at E3 was pretty telling, and sites picked up quite fast on what MS's language was and how it changed.

No, I already showed how Sony played up their advantage versus how MS played theirs, and how MS also played their lack of advantage by simply denying it; not to mention the constant references to magic APIs like DX12 doubling the Xbox One's power, and the cloud increasing performance as well, alongside DF and its secret-sauce articles. Those weren't fun times.

Well, at least you can't quote Sony stating the Xbox One X doesn't have a 45% performance gap over the Pro, like I can.


#240 lundy86_4
Member since 2003 • 61423 Posts

@Blackrose62090 said:

Microsoft wouldn't know how to make a good console if Will Smith himself came out of the genie bottle and granted them wishes. I look forward to playing Killzone 7 on my PS5, I've already pre-ordered my season pass.

The X is an incredibly competent console. Far more competent than Sony's offerings (and I own a Pro.)


#241  Edited By BoxRekt
Member since 2019 • 2425 Posts
@Martin_G_N said:

I doubt hardware-based ray tracing is important for consoles as long as they have enough power and bandwidth; they can do it in software. And I'm sure first-party titles on PS5 will be some of the best-looking games and will have RT. MS could lock CUs to RT only and call it hardware-based, but that's a stupid idea if devs want to use the available power for physics or other things.


#242  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

1-I said 580, not 590; stop spinning.

2-MS didn't confirm chip size; Beyond3D is not MS. See how easy it is to own you? The fun part is that while you claim Matt is not Sony, you are quoting Panello, who is a liar, is MS-biased, was already exposed, doesn't even work for MS any more, and, since he never worked for Sony, can't speak from experience about Sony either.

Oh great, you hypocrite: why didn't MS confirm 8 Zen cores like Sony did, instead of making bullshit claims about 4X Xbox One X performance for Scarlett?

So does MS have 8 cores or 4? You of all people, who rode those extra 150 MHz on a shitty Jaguar for the Xbox One, and later for Scorpio as well, should be pulling your hair out. If this turns out to be true and MS ends up with 4 cores / 8 threads while Sony ends up with 8 cores / 16 threads, it would be epic, and I already saw you doing some damage control about it.

NO, you insert an "if" and then talk as if it were FACT, which is the problem.

3-See, you're doing it again, trying to imply Sony has cheap rays by pointing at PS4 and PS3 effects that were meant to resemble ray tracing; again you are treating this as FACT.

4-But I make it a point because MS went into more detail while Sony didn't; that doesn't mean Sony's version is not efficient or won't be the same as Scarlett's.

5-Yes, and so does MS having the world's most powerful console, and so do the 8 cores, which they have not confirmed; neither have they confirmed chip size, yet you can't stop using Beyond3D info as some kind of fact, you two-faced hypocrite.

6-MS didn't know the PS4 Pro's true specs when they claimed the most powerful console in 2016. A leak doesn't equal a confirmation; that is what you are trying to imply here, right? It's also what you use to downplay Matt, so please don't hide behind the Pro's leaked specs; those were not confirmed by Sony until after Scorpio's specs were already announced.

7-Albert Panello didn't debunk shit, buffoon. He doesn't work for MS anymore and never worked for Sony, so he can't know whether Sony's dev kits are final or not; and the Xbox 360 is not Scarlett.

As I already proved, developers are given a target spec they can shoot for regardless of whether they have final kits.

1. You didn't normalize the clock speed between Vega 64 and Polaris, hence your argument is debunked.

The RX 580 has exceeded AMD's 185-watt TDP with a gaming workload of 209 watts, while the X1X's entire machine draws 175 watts.

Vega 64: 12.6 TFLOPS / 295 watts TDP ≈ 43 GFLOPS per watt

RX 580: 6.1 TFLOPS / 185 watts TDP ≈ 33 GFLOPS per watt

RX 580: 6.1 TFLOPS / 209 watts gaming ≈ 29 GFLOPS per watt

X1X GPU: 6 TFLOPS / ~145 watts gaming ≈ 41 GFLOPS per watt

Notice that the X1X GPU's GFLOPS-per-watt number is similar to Vega 64's.
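Those per-watt ratios are easy to recompute; the TFLOPS and wattage inputs below are the figures cited in this thread (approximate, rounding aside), not official AMD measurements:

```python
# GFLOPS per watt = (TFLOPS * 1000) / watts, using the numbers cited above.
gpus = {
    "Vega 64 (295 W TDP)":     (12.6, 295),
    "RX 580 (185 W TDP)":      (6.1, 185),
    "RX 580 (209 W gaming)":   (6.1, 209),
    "X1X GPU (~145 W gaming)": (6.0, 145),
}
for name, (tflops, watts) in gpus.items():
    print(f"{name}: {tflops * 1000 / watts:.1f} GFLOPS/W")
```

The X1X GPU (≈41 GFLOPS/W) does land near Vega 64 (≈43) and well above the RX 580 under gaming load (≈29).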

With 2016-era 14 nm/16 nm GCN silicon maturity, Sony configured the PS4 Pro below the PC's RX 480 (5.8 TFLOPS) and RX 470 (4.9 TFLOPS).

With 2017-era 14 nm/16 nm GCN silicon maturity, MS configured the X1X to rival the PC's RX 580 (6.1 TFLOPS) while consuming less power than the PC's RX 580.

Your argument is debunked.

2. MS officially showed the APU's size relative to a GDDR6 memory module at the E3 2019 reveal, and it's up to posters with geometry skills to estimate the APU size from the well-known dimensions of a Samsung GDDR6 module. This is no different from Project Scorpio's E3 2017 reveal.

Your Matt-from-ResetEra-mod argument is not based on an official Sony reveal.

Matt from ResetEra has NOT claimed "hardware-accelerated" ray-tracing.

Current facts: Sony claims ray-tracing support but has NOT confirmed "hardware-accelerated" ray-tracing. There's nothing more to it than that.

3. Sony's Killzone Shadow Fall uses "cheap rays", since they are not hierarchy rays.

Sony's GG themselves called it "2.5D ray-tracing", a hybrid between screen space and a tiny amount of 3D.

https://www.eurogamer.net/articles/digitalfoundry-the-making-of-killzone-shadow-fall

"So what we do is find a reflection for every pixel on screen, we find a reflection vector that goes into the screen and then basically start stepping every second pixel until we find something that's a hit. It's a 2.5D ray-trace... We can compute a rough approximation of where the vector would go and we can find pixels on-screen that represent that surface. This is all integrated into our lighting model."

GG's method is not bounding-volume-hierarchy ray-tracing. Note that Crytek's recent RTRT demo specifically mentioned "disabled screen space reflection (SSR)" to show that it is octree-voxel-based real-time ray-tracing tech.
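The "stepping every second pixel" idea GG describes boils down to marching a reflection ray through the depth buffer in screen space. A toy 1D version (all buffers and names here are made up purely for illustration) shows why it never needs scene geometry or a BVH:

```python
# Toy screen-space reflection march: step along the reflection direction
# in pixel space and stop at the first pixel whose stored depth says
# "there is a surface here". No scene geometry or BVH is ever touched.

def ss_reflect(depth, color, start_px, step_px, ray_depth, ray_ddepth,
               stride=2, eps=0.05):
    """March every `stride` pixels; return reflected color, or None on a miss."""
    px, d = start_px, ray_depth
    while 0 <= px < len(depth):
        if depth[px] <= d + eps:   # the ray went behind stored geometry: hit
            return color[px]
        px += step_px * stride
        d += ray_ddepth * stride
    return None                    # ray left the screen: no reflection

# Fake 1D depth/color buffers: a "wall" occupies pixels 6..9 at depth 1.0.
depth = [9.0] * 6 + [1.0] * 4
color = ["sky"] * 6 + ["wall"] * 4
print(ss_reflect(depth, color, start_px=0, step_px=1,
                 ray_depth=0.0, ray_ddepth=0.2))  # expected: 'wall'
```

The limitation is also visible in the sketch: the march can only ever return something already on screen, which is exactly why this is "2.5D" rather than true 3D ray-tracing.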

4. That's your argument, not mine. At this time, Sony hasn't officially confirmed "hardware-accelerated" ray-tracing.

5. MS officially showed the APU's size relative to a GDDR6 memory module at the E3 2019 reveal, and it's up to posters with geometry skills to estimate the APU size. This is no different from Project Scorpio's E3 2017 reveal.

Your Matt-from-ResetEra-mod argument is not based on an official Sony reveal. Current facts: Sony claims ray-tracing support, but not "hardware-accelerated" ray-tracing.

Fuk off with your misapplied hypocrite argument.

6. MS knew that launching on 2016-maturity Polaris silicon would result in an inferior box. More time = extra time for AMD to resolve silicon-related issues. MS had its own "PS4 Pro"-like box for 2016, which was cancelled.

7. It debunks your argument that early dev kits equal final Scarlett dev kits. The PowerMac-with-Radeon-X800-era Xbox 360 dev kits had more GPU power than the original Xbox's GPU.

Try again.


#243  Edited By Xabiss
Member since 2012 • 4749 Posts

All you fanboys arguing over rumors are ridiculous! Just stop it already!

No one knows a fucking thing until Microsoft or Sony releases their final stats.


#244  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Martin_G_N said:

I doubt hardware-based ray tracing is important for consoles as long as they have enough power and bandwidth; they can do it in software. And I'm sure first-party titles on PS5 will be some of the best-looking games and will have RT. MS could lock CUs to RT only and call it hardware-based, but that's a stupid idea if devs want to use the available power for physics or other things.

Doing hierarchy-based ray-tracing (hierarchical search-tree traversal plus intersection tests) eats many shader clock cycles, which is why RTX GPUs have RT processors separate from the CUDA shader cores.

Both Imagination (PowerVR) and NVIDIA have demonstrated hardware acceleration of bounding-volume-hierarchy (BVH) search-tree traversal and intersection testing.
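To see why, here is a minimal BVH traversal sketch (hypothetical axis-aligned-box nodes, not any vendor's actual data structure): every ray repeats a branchy walk of ray/box tests before any triangle work, and that per-ray loop is exactly what dedicated RT units offload from the shader cores:

```python
# Minimal bounding-volume-hierarchy traversal sketch (illustrative only).
# Each ray walks the tree, doing a ray/box "slab" test per visited node.

class Node:
    def __init__(self, lo, hi, children=(), tri=None):
        self.lo, self.hi = lo, hi   # AABB corners (x, y, z)
        self.children = children    # inner node: child nodes
        self.tri = tri              # leaf: triangle payload

def hits_box(origin, inv_dir, lo, hi):
    """Ray/AABB slab test; inv_dir holds 1/direction per axis."""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, inv_dir, lo, hi):
        t0, t1 = (l - o) * d, (h - o) * d
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def traverse(node, origin, inv_dir, out_leaves):
    """Collect leaf triangles whose boxes the ray enters."""
    if not hits_box(origin, inv_dir, node.lo, node.hi):
        return
    if node.tri is not None:
        out_leaves.append(node.tri)
    for child in node.children:
        traverse(child, origin, inv_dir, out_leaves)

# Tiny two-leaf tree: a ray along +x at y = z = 0.5 reaches only leaf "A".
leaf_a = Node((0, 0, 0), (1, 1, 1), tri="A")
leaf_b = Node((0, 5, 0), (1, 6, 1), tri="B")
root = Node((0, 0, 0), (1, 6, 1), children=(leaf_a, leaf_b))
hits = []
traverse(root, (-1.0, 0.5, 0.5), (1.0, 1e9, 1e9), hits)
print(hits)  # expected: ['A']
```

On a GPU this loop runs per ray, per bounce, with incoherent branching, which is what makes shader-based BVH traversal so expensive relative to fixed-function traversal units.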

The Polaris RX 480 already has a DSP instruction set for ray-cast TrueAudio Next, which is one of the "cheap ray" methods.

The SSD hype indicates heavy use of pre-baked art assets.


#245 Pedro
Member since 2002 • 69083 Posts

@Xabiss said:

All you fanboys arguing over rumors are ridiculous! Just stop it already!

No one knows a fucking thing until Microsoft or Sony releases their final stats.

But the rumors says..... WAAAAH
But the rumors says..... WAAAAH


#246 Raining51
Member since 2016 • 1162 Posts

Whatever, that's honestly all I got.


#247 tdkmillsy
Member since 2003 • 5819 Posts

@tormentos said:
@tdkmillsy said:

Failed again to answer the question.

A major failure at all levels: happy to talk the talk, but when it comes down to it you don't even believe what you're saying yourself.

Why would I provide anything? I said they all play up their advantages, which they do.

Nobody is going to acknowledge a competitor's advantage. You're just being daft.

Well, you can hide behind your questions all you want; the PS5 being more powerful is not confirmed by any means, and I can admit that.

Now, MS's wording at E3 was pretty telling, and sites picked up quite fast on what MS's language was and how it changed.

No, I already showed how Sony played up their advantage versus how MS played theirs, and how MS also played their lack of advantage by simply denying it; not to mention the constant references to magic APIs like DX12 doubling the Xbox One's power, and the cloud increasing performance as well, alongside DF and its secret-sauce articles. Those weren't fun times.

Well, at least you can't quote Sony stating the Xbox One X doesn't have a 45% performance gap over the Pro, like I can.

People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. We CREATED DirectX, the standard API’s that everyone programs against. So while people laude Sony for their HW skills, do you really think we don’t know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way we’re giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really huge performance difference – it would be obvious.

He doesn't say there isn't a performance gap at all. He even states, "I'm not dismissing raw performance." He points out they will do what they can in software to make up the difference, and that both systems will have great-looking games. Some games are at parity across both systems, so the massive 45% difference doesn't pan out everywhere. He is playing down the disadvantage by saying the architecture is different. It's PR talk trying to make the gap seem smaller and damage-control it.

Sony did a similar thing with the PS4 Pro:

https://mspoweruser.com/sonys-jim-ryan-believes-developers-will-hold-back-xbox-one-x-versions-due-playstation-4-pro/

Sony's Jim Ryan said that developers usually aim for the "lowest common denominator" when designing games, so he wasn't worried that Xbox One X games would look vastly superior to their PlayStation 4 Pro counterparts. Ryan gave the example of the PlayStation 3 and how its power wasn't utilized by developers. (Its power wasn't utilized because the PS3 was notoriously difficult to program for, not because developers didn't want to. Remember that infamous Bayonetta scandal, where it couldn't even run properly on PlayStation 3?)

He's damage-controlling by trying to say developers won't use all the power in the X.

It's all damage control; they all do it.

You made such a big deal about them not saying something and ignored the rational reasoning behind it. Now, when pushed, you deny that it's anything substantial.

The difference here is I don't deny Microsoft doing it; I'm simply saying they all do it, which you will see if you open your eyes for a moment.


#248  Edited By tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. We CREATED DirectX, the standard API’s that everyone programs against. So while people laude Sony for their HW skills, do you really think we don’t know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way we’re giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really huge performance difference – it would be obvious.

He doesn't say there isn't a performance gap at all. He even states, "I'm not dismissing raw performance." He points out they will do what they can in software to make up the difference, and that both systems will have great-looking games. Some games are at parity across both systems, so the massive 45% difference doesn't pan out everywhere. He is playing down the disadvantage by saying the architecture is different. It's PR talk trying to make the gap seem smaller and damage-control it.

Sony did a similar thing with the PS4 Pro:

https://mspoweruser.com/sonys-jim-ryan-believes-developers-will-hold-back-xbox-one-x-versions-due-playstation-4-pro/

Sony's Jim Ryan said that developers usually aim for the "lowest common denominator" when designing games, so he wasn't worried that Xbox One X games would look vastly superior to their PlayStation 4 Pro counterparts. Ryan gave the example of the PlayStation 3 and how its power wasn't utilized by developers. (Its power wasn't utilized because the PS3 was notoriously difficult to program for, not because developers didn't want to. Remember that infamous Bayonetta scandal, where it couldn't even run properly on PlayStation 3?)

He's damage-controlling by trying to say developers won't use all the power in the X.

It's all damage control; they all do it.

You made such a big deal about them not saying something and ignored the rational reasoning behind it. Now, when pushed, you deny that it's anything substantial.

The difference here is I don't deny Microsoft doing it; I'm simply saying they all do it, which you will see if you open your eyes for a moment.

Yeah, play stupid: how did MS creating DX work out for them on the Xbox One? Did it stop the PS4 from walking all over the Xbox One? You do know the gap between the PS4 and Xbox One is not 30%?

It's 40%+. Not only would MS not recognize the true gap, they downplayed it and claimed they would not give Sony or anyone a 30% advantage. That's lol-worthy considering the effort MS put into DX12's false claims of doubling Xbox One performance, and the cloud making the Xbox One four times as powerful.

Yes, at one point he says "I'm not dismissing it," and in the very next sentence claims they would not give Sony or anyone a 30% advantage, DENYING that there would be a gap.

In fact, what Jim Ryan did there is not the same thing MS did. He never claimed there wasn't a gap; he pointed out that developers tend to code to the lowest common denominator, and gave the PS3 as an example, which is a very valid one. The PS3 suffered from shitty ports all gen, and it wasn't because it was weaker.

In fact, that was proven to the teeth when one of the Call of Duty games came out on PS3 and, in one of the modes (I think it was Search and Destroy), it said "no party chat available": in that mode you were blocked from using party chat, since it was prone to cheating, because once you were dead you weren't supposed to be able to tell your team where the enemy was. Activision's developers were so lazy they didn't even remove that message when making the PS3 version, which obviously never got party chat at all.

That shitty porting was on PS3.

But he doesn't say there is no gap, or "we will not give MS a 45% advantage", or claim the use of magic APIs to make up for the gap.

The more you try, the more desperate you look.


#249 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

1. You didn't normalize the clock speed between Vega 64 and Polaris, hence your argument is debunked.

The RX-580 has exceeded AMD's 185-watt TDP with a gaming workload (209 watts), while the X1X's entire machine draws 175 watts.

Vega 64's 12.6 TFLOPS / 295 watts TDP = 42 GFLOPS per watt

RX-580's 6.1 TFLOPS / 185 watts TDP= 32 GFLOPS per watt

RX-580's 6.1 TFLOPS / 209 watts gaming = 29 GFLOPS per watt

X1X GPU's 6 TFLOPS / ~145 watts gaming = 41 GFLOPS per watt

Notice X1X's GPU's GFLOPS per watt number is similar to Vega 64's

With year 2016 era 14nm/16 nm GCN silicon maturity, Sony has configured PS4 Pro to be less than PC's RX-480 (5.8 TFLOPS) and RX-470 (4.9 TFLOPS).

With year 2017 era 14nm/16 nm GCN silicon maturity, MS has configured X1X to rival PC's RX-580 (6.1 TFLOPS) while consuming less power than PC's RX 580.

Your argument is debunked.
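The perf-per-watt figures above are just TFLOPS × 1000 divided by watts. A quick sketch of the same arithmetic, purely illustrative, using the wattages quoted in the post (the ~145 W X1X GPU figure is the post's estimate, not an official number):

```python
# Perf-per-watt check using the figures quoted above.
# TDP/gaming-draw values are the approximate numbers from the post.

def gflops_per_watt(tflops, watts):
    """Convert TFLOPS to GFLOPS and divide by power draw."""
    return tflops * 1000 / watts

gpus = {
    "Vega 64 (295 W TDP)":     (12.6, 295),
    "RX-580 (185 W TDP)":      (6.1, 185),
    "RX-580 (209 W gaming)":   (6.1, 209),
    "X1X GPU (~145 W gaming)": (6.0, 145),
}

for name, (tflops, watts) in gpus.items():
    # int() truncates, matching how the post rounds its numbers
    print(f"{name}: {int(gflops_per_watt(tflops, watts))} GFLOPS/W")
```

Running this reproduces the 42 / 32 / 29 / 41 GFLOPS-per-watt figures in the post; whether those wattages are fair comparisons is the part being argued.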

2. MS has officially shown the APU's size relative to a GDDR6 memory module at the E3 2019 reveal, and it's up to posters with geometry skills to estimate the APU size from the well-known dimensions of Samsung's GDDR6 modules. This is no different from Project Scorpio's E3 2017 reveal.

Your MATT-from-resetera mod argument is not based on an official Sony reveal.

MATT from resetera has NOT claimed "hardware accelerated" ray-tracing.

Current facts: Sony claims ray-tracing support but has NOT confirmed "hardware accelerated" ray-tracing. There's nothing more to it.

3. Sony's Killzone Shadow Fall has "cheap rays," since they're not hierarchical rays.

Sony's GG also called it "2.5D ray-tracing," which is a hybrid between screen space and a tiny amount of 3D.

https://www.eurogamer.net/articles/digitalfoundry-the-making-of-killzone-shadow-fall

"So what we do is find a reflection for every pixel on screen, we find a reflection vector that goes into the screen and then basically start stepping every second pixel until we find something that's a hit. It's a 2.5D ray-trace... We can compute a rough approximation of where the vector would go and we can find pixels on-screen that represent that surface. This is all integrated into our lighting model."

GG's method is not Bounding Volume Hierarchy ray-tracing. Note that Crytek's recent RTRT demo specifically mentioned "disabled screen space reflection (SSR)" to show that it's octree-voxel-based real-time ray-tracing tech.
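For reference, the "find a reflection vector that goes into the screen and start stepping every second pixel" technique GG describes is classic screen-space ray marching. Here is a toy sketch of that idea, not Guerrilla's actual code: the depth buffer layout, step size, and thickness threshold below are invented for illustration.

```python
# Toy screen-space ("2.5D") ray march in the spirit of GG's description.
# Not Guerrilla's code: depth[y][x] holding view-space depth per pixel,
# the step size, and the hit threshold are all invented for illustration.

def screen_space_trace(depth, start, direction, max_steps=64, thickness=0.1):
    """March a reflection ray across the depth buffer, stepping every
    second pixel, until the ray's depth passes just behind the depth
    stored at that pixel (a 'hit' whose colour can be reused)."""
    h, w = len(depth), len(depth[0])
    x, y, z = start          # pixel coords + view-space depth of the ray
    dx, dy, dz = direction   # per-step increments (screen x, y, depth)
    for _ in range(max_steps):
        x += 2 * dx          # "start stepping every second pixel"
        y += 2 * dy
        z += 2 * dz
        ix, iy = int(x), int(y)
        if not (0 <= ix < w and 0 <= iy < h):
            return None      # ray left the screen: no reflection data
        scene_z = depth[iy][ix]
        if scene_z <= z <= scene_z + thickness:
            return ix, iy    # hit: shade with the pixel already on screen
    return None              # no hit within the step budget
```

Because the march can only sample pixels already on screen, anything off-screen or occluded simply cannot appear in the reflection, which is exactly why Crytek disabled SSR to demonstrate that its voxel tracer wasn't relying on this trick.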

4. That's your argument, not mine. At this time, Sony hasn't officially confirmed "hardware accelerated" ray-tracing.

5. MS has officially shown the APU's size relative to a GDDR6 memory module at the E3 2019 reveal, and it's up to posters with geometry skills to estimate the APU size. This is no different from Project Scorpio's E3 2017 reveal.

Your MATT-from-resetera mod argument is not based on an official Sony reveal. Current facts: Sony claims ray-tracing support but not "hardware accelerated" ray-tracing.

Fuck off with your misapplied hypocrisy argument.

6. MS knew launching at 2016-era Polaris silicon maturity would result in an inferior box. More time = extra time for AMD to resolve silicon-related issues. MS had its own "PS4 Pro"-like box planned for 2016, which was cancelled.

7. It debunks your argument that the early dev kits are the final Scarlett dev kits. The PowerMac-with-Radeon-X800-era Xbox 360 dev kits had more GPU power than the original Xbox's GPU.

Try again.

1. RX-580, and I never said 590. The argument still stands: the RX-580 beats the Xbox One X by a considerable margin in some games, regardless of having fewer CUs. Again, if the PS4 Pro had more bandwidth, better cooling, and more RAM, even with 36 CUs it could still match the Xbox One X, because the RX-580 does. And the Pro's GPU is a Polaris/Vega mix, and you know it; FP16 at 6 TF by clock speed would have helped the PS4 Pro even more.

2. NO, MS has not confirmed the size of its chip. What we have are sites like Beyond3D guessing the size from screens, but MS hasn't revealed the real size, period. So stop giving credibility to whatever serves you best, fool.

3. Which I never argued; it is you arguing that. Just because it has those doesn't mean it is software-based. Again, you are acting like it is a FACT that it is shader-based and not hardware-based.

4. Which means nothing, as you can't assume it doesn't have it; Sony hasn't said.

5. NO, they have not, hypocrite. Link me to MS stating "our SoC is this many mm²." Go ahead, I'll wait.

Beyond3D is extrapolating the number and you use it as FACT, period, while you refuse to use Matt's inside info as fact, making you a two-faced hypocrite.

6. GTFO. MS got something stronger because they waited a year; if they had released in 2016 they would have ended up equal to or worse than Sony.

7. It debunks nothing. Albert doesn't work for MS anymore, and he never worked for Sony; he doesn't know shit.

And in fact, rumors point to Sony having sent out dev kits last year, which is where the whole 13 TF figure comes from: apparently the dev kit uses a 13 TF Vega GPU, because what they will actually use will probably be the Navi equivalent. I don't expect these machines to come with 10 or 12 TF GPUs, but rather 9 TF, hiding behind the improvements of Navi over Vega.

Example: 9 TF on Navi = 12 to 13 TF on Vega.


#250 tdkmillsy
Member since 2003 • 5819 Posts

@tormentos said:
@tdkmillsy said:

People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. We CREATED DirectX, the standard API’s that everyone programs against. So while people laude Sony for their HW skills, do you really think we don’t know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way we’re giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really huge performance difference – it would be obvious.

He doesn't say there isn't a performance gap at all. He even states, "I’m not dismissing raw performance." He points out that they will do what they can with software to make up the difference, and that both systems will have great-looking games. Some games are at parity across both systems, so the massive 45% difference doesn't pan out for everything. He is playing down the disadvantage by saying the architecture is different. It's PR talk trying to make the gap seem smaller and damage-control it.

Sony did a similar thing with the PS4 Pro:

https://mspoweruser.com/sonys-jim-ryan-believes-developers-will-hold-back-xbox-one-x-versions-due-playstation-4-pro/

Sony’s Jim Ryan said that developers usually aim for the “lowest common denominator” when designing games so he wasn’t worried that Xbox One X games would look vastly superior to their PlayStation 4 Pro counterparts. Ryan gave the example of the PlayStation 3 and how its power wasn’t utilized by developers. Its power wasn’t utilized by developers because it was notoriously difficult to program for, not because they didn’t want to. Remember that infamous Bayonetta scandal where it couldn’t even run on PlayStation 3 properly?

He's damage-controlling by trying to say developers won't use all the power in the X.

It's all damage control; they all do it.

You made such a big deal about them not saying something and ignored the rational reasoning behind it. Now, when pushed, you deny that it's anything substantial.

The difference here is I don't deny Microsoft doing it. I'm simply saying they all do it, which you will see if you open your eyes for a moment.

Yeah, play stupid. How did MS creating DX work out for them on the Xbox One? Did that stop the PS4 from walking all over the Xbox One? You do know the gap between the PS4 and the Xbox One is not 30%?

It's 40%+. Not only did MS not recognize the true gap, they downplayed it and claimed they would not give Sony or anyone a 30% advantage. That's lol-worthy considering the effort MS put into DX12, the false claims of double performance on the Xbox One, and the cloud making the Xbox One 4 times as powerful.

Yes, at one point he says "I'm not dismissing it," and then in the very next sentence he claims they would not give Sony or anyone a 30% advantage, DENYING that there would be a gap.

In fact, what Jim Ryan did there is not the same as what MS did. He never claimed that there wasn't a gap; he pointed out the fact that developers tend to code to the lowest common denominator, and put up the PS3 as an example, which is a very valid one. The PS3 suffered from shitty ports all gen, and it wasn't because it was weaker.

In fact, that was proven to the teeth when one of the Call of Duty games came out on PS3 and it said on one of the modes (I think it was Search and Destroy) "NO party chat available," because in that mode you were blocked from using party chat, as it was prone to cheating: once you were dead, you weren't supposed to know where the enemy was so you couldn't tell your team. Activision's developers were so lazy they didn't even remove the party chat notice when making the PS3 version, which obviously never got party chat.

That shitty porting was on PS3.

But he doesn't say there is no gap, or "we will not give MS a 45% advantage," or claim the use of a magic API to make up for the gap.

The more you try, the more desperate you look.

Desperate to do what, exactly?

What point are you trying to prove? Can Sony do no wrong with you? Is this just an act for System Wars? I hope so, because that's as far as it should go, but I really get the feeling it's important to you.

You've lost the plot, my friend, or maybe you can't comprehend what he is saying, but I'll give you the benefit of the doubt if English isn't your primary language. It's PR talk designed not to deny anything but to minimise the damage: a combination of acknowledging the gap without being direct, distracting with something else, and pointing out they will still have good games to play. He downplayed it exactly as I said, and exactly the same as Jim downplaying the Pro and X difference. There were plenty of websites that went on to prove Jim totally wrong using games across the PS4 and Xbox One, and the Pro and X, as examples, but I guess that doesn't matter to you.