Sorry tormentos, PS5 lost the GPU wars


#401 Uruz7laevatein
Member since 2009 • 113 Posts

@ronvalencia:

1. So are you saying the PC master race is set up with a 10 GB GPU RAM pool and 6 GB of system RAM....

2. So does increasing CUs also increase ROPs/geometry/DCC/etc. and the other parts of classic GPU hardware, lol.....

3. The Ryzen 7 3700X has a base clock of 3.7 GHz and boosts up to 4.3 GHz with SMT at higher clocks, without the need to disable SMT, lol. The XSX CPU runs at a max of 3.6 GHz with SMT, or 3.8 GHz with SMT disabled (the only reasons for doing so are older games and keeping power consumption down). AVX2 instructions are known to reduce CPU clocks by saturating the entire CPU pipeline, with a hefty increase in power consumption and temps.


#402 tormentos
Member since 2003 • 30040 Posts

@ronvalencia said:

Stop your cow dung.

https://www.eurogamer.net/articles/ps5-specs-features-ssd-ray-tracing-cpu-gpu-6300

Sony has a pattern with gimping memory bandwidth like in PS4 Pro.

PS4 has 176 GB/s memory bandwidth, with the CPU on two 10 GB/s links, hence the GPU's memory bandwidth is on par with the PC's HD 7850. This is NOT the case for PS4 Pro, i.e. the recycled RX 470's 224 GB/s with an additional near-mobile-class CPU memory bandwidth consumer, which is then repeated for PS5, i.e. the recycled RX 5700/5700 XT's 448 GB/s with an additional desktop-PC-class CPU memory bandwidth consumer.

I don't need to mention XSX since PS5's GPU went below RX 5700/RX-5700 XT's bandwidth allocation!

Sony talks about the large-scale gaming world while not allocating additional bandwidth for the PC CPU level memory bandwidth consumer. LOL

PS5 = PS4 Pro concept with upgrades, e.g. a 7nm PC CPU (Zen 2 8C/16T, 3.5 GHz variable), an RDNA 2 GPU (36 CU, 2230 MHz variable), and 256-bit GDDR6-14000.

Deal with it

You are a complete and total moron; you simply can't see past your moronic arguments and your total MS bias.

Sony doesn't gimp memory, you walking pancake. Sony had a TARGET BUDGET; the freaking PS4 Pro was $400, not $500 a year later, so Sony targeted within its price budget, not because they gimped bandwidth or because they simply can't see a bottleneck.

And again, I think Sony targeting $400 vs. MS targeting $500 is not gimping. Sony has a set budget and so does MS; MS went higher and will pay more for it, Sony went lower and will pay less.

You act as if MS outsmarted Sony, when in reality Sony had a target budget and MS had a target budget, and it's pathetic seeing you act like a total fool over a damn 17% GPU gap, you who spent the first 4 years of this gen damage-controlling MS's shortcomings and downplaying the gaps the PS4 had on the Xbox.

You actually rode the DX12 bullshit claims and tiled resources as well; worse, you even claimed every Xbox One sold = one Xbox 360 sold, which is completely moronic and proves without doubt that you will say whatever shit comes to mind to defend MS, including claiming that a fan inside the damn Xbox is not a cooling fan and even freaking bringing up the vacuum of space.

That's how stupid you are.
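The bandwidth figures being argued over can be sanity-checked with simple arithmetic: peak bandwidth is bus width in bytes times the per-pin data rate. A small Python sketch (the helper name is my own, not from any post):

```python
# Peak theoretical memory bandwidth: bus width (bits) / 8 * data rate (Gbps/pin).
# Illustrative helper; figures below match the specs quoted in the thread.

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a GDDR memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

# PS5 / RX 5700 XT: 256-bit GDDR6-14000 (14 Gbps per pin)
print(mem_bandwidth_gbs(256, 14.0))  # 448.0
# PS4: 256-bit GDDR5-5500 (5.5 Gbps per pin)
print(mem_bandwidth_gbs(256, 5.5))   # 176.0
```

Both 448 GB/s and 176 GB/s match the numbers the posters are quoting.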


#403 I_P_Daily
Member since 2015 • 15199 Posts

@tormentos said:
@i_p_daily said:

If I cared about power, then why would I join this site in the era when the PS4 had more power than the XB1?

I'm using it to make fun of you and other cows because we all know you cows lose your shit over the smallest things.

You joined this site before your join date; go fool other people. We know your account arrived exactly when several lemmings vanished from this forum; you went into hiding from your regular account into this one..😂

You are here arguing about taking advantage of hardware or not.

Your stupidity has no bounds. But for the lols, tell me which user I am an alt of again?

Crickets...


#404  Edited By ronvalencia
Member since 2008 • 28988 Posts

@tormentos said:
@ronvalencia said:

Stop your cow dung.

https://www.eurogamer.net/articles/ps5-specs-features-ssd-ray-tracing-cpu-gpu-6300

Sony has a pattern with gimping memory bandwidth like in PS4 Pro.

PS4 has 176 GB/s memory bandwidth, with the CPU on two 10 GB/s links, hence the GPU's memory bandwidth is on par with the PC's HD 7850. This is NOT the case for PS4 Pro, i.e. the recycled RX 470's 224 GB/s with an additional near-mobile-class CPU memory bandwidth consumer, which is then repeated for PS5, i.e. the recycled RX 5700/5700 XT's 448 GB/s with an additional desktop-PC-class CPU memory bandwidth consumer.

I don't need to mention XSX since PS5's GPU went below RX 5700/RX-5700 XT's bandwidth allocation!

Sony talks about the large-scale gaming world while not allocating additional bandwidth for the PC CPU level memory bandwidth consumer. LOL

PS5 = PS4 Pro concept with upgrades, e.g. a 7nm PC CPU (Zen 2 8C/16T, 3.5 GHz variable), an RDNA 2 GPU (36 CU, 2230 MHz variable), and 256-bit GDDR6-14000.

Deal with it

1. You are a complete and total moron; you simply can't see past your moronic arguments and your total MS bias.

2. Sony doesn't gimp memory, you walking pancake. Sony had a TARGET BUDGET; the freaking PS4 Pro was $400, not $500 a year later, so Sony targeted within its price budget, not because they gimped bandwidth or because they simply can't see a bottleneck.

And again, I think Sony targeting $400 vs. MS targeting $500 is not gimping. Sony has a set budget and so does MS; MS went higher and will pay more for it, Sony went lower and will pay less.

You act as if MS outsmarted Sony, when in reality Sony had a target budget and MS had a target budget, and it's pathetic seeing you act like a total fool over a damn 17% GPU gap, you who spent the first 4 years of this gen damage-controlling MS's shortcomings and downplaying the gaps the PS4 had on the Xbox.

You actually rode the DX12 bullshit claims and tiled resources as well; worse, you even claimed every Xbox One sold = one Xbox 360 sold, which is completely moronic and proves without doubt that you will say whatever shit comes to mind to defend MS, including claiming that a fan inside the damn Xbox is not a cooling fan and even freaking bringing up the vacuum of space.

That's how stupid you are.

1. I don't need MS when discussing hardware.

2. FYI, For PS4 Pro, Sony paid for GDDR5-8000 chips.

https://www.ifixit.com/Teardown/PlayStation+4+Pro+Teardown/72946

Sony used Samsung K4G80325FB GDDR5 chips

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

From https://www.samsung.com/semiconductor/dram/gddr5/

Samsung K4G80325FB is 8.0 Gbps rated which is GDDR5-8000.

PS4 Pro has 256 GB/s memory bandwidth potential. Sony's mindset is conservative.

There's nothing wrong with PS4 Pro's parts.

You are ignorant. Damn yourself.
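Assuming the standard configuration of eight 32-bit GDDR5 chips, the 256 GB/s "potential" figure follows directly. The shipped PS4 Pro is widely reported to run the chips at 6.8 Gbps rather than their rated 8 Gbps (that shipping rate is my added assumption, not from the post):

```python
# PS4 Pro: eight 32-bit K4G80325FB GDDR5 chips form a 256-bit bus.
chips, bits_per_chip = 8, 32
bus_width = chips * bits_per_chip  # 256 bits

def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak bandwidth in GB/s: bus width (bits) / 8 * per-pin data rate (Gbps).
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(bus_width, 8.0))  # 256.0 GB/s potential at the rated 8 Gbps
print(mem_bandwidth_gbs(bus_width, 6.8))  # ~217.6 GB/s at the reported shipping 6.8 Gbps
```

The gap between the two prints is the "conservative mindset" being argued about: the silicon is rated for more than the console uses.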


#405  Edited By ronvalencia
Member since 2008 • 28988 Posts

@Uruz7laevatein said:

@ronvalencia:

1. So are you saying the PC master race is set up with a 10 GB GPU RAM pool and 6 GB of system RAM....

2. So does increasing CUs also increase ROPs/geometry/DCC/etc. and the other parts of classic GPU hardware, lol.....

3. The Ryzen 7 3700X has a base clock of 3.7 GHz and boosts up to 4.3 GHz with SMT at higher clocks, without the need to disable SMT, lol. The XSX CPU runs at a max of 3.6 GHz with SMT, or 3.8 GHz with SMT disabled (the only reasons for doing so are older games and keeping power consumption down). AVX2 instructions are known to reduce CPU clocks by saturating the entire CPU pipeline, with a hefty increase in power consumption and temps.

1. I'm referring to the CPU-to-GPU memory bandwidth ratio.

Zen 2 has AVX2 gather instructions, which are mass data-fetch instructions, a relic from Intel's CPU-as-IGP-wannabe R&D.

2. Microsoft did NOT reveal the XSX GPU's non-CU details, but 12 TFLOPS RDNA 2 scales like an RTX 2080, going by the two-week-old raw Gears 5 port benchmark at PC Ultra settings.

Hardware above the L2 cache, apart from geometry and the RBs, sits within the CUs, hence parts of DCC scale horizontally.

L2 cache could scale with additional memory controllers. XSX GPU's TFLOPS, TMUs, TFUs and memory bandwidth are all scaled up by about 25 percent.

For the RBs, a 64 ROPS version at 1825 MHz has the following bandwidth usage against 560 GB/s external memory bandwidth (BW):

RGBA8: 1825 MHz * 64 * 4 bytes = 467.2 GB/s (ROPS bound)

RGBA16F: 1825 MHz * 64 * 8 bytes = 934.4 GB/s (BW bound)

RGBA32F: 1825 MHz * 64 * 16 bytes = 1868.8 GB/s (BW bound)

The BW-bound cases need DCC and cached-render tricks.

3. The AMD.com website shows the Ryzen 7 3700X's base clock as 3.6 GHz.
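The RB arithmetic above can be reproduced directly: fill bandwidth is clock × ROPs × bytes per pixel, classified against the 560 GB/s external pool. A sketch following the post's own "ROPS bound"/"BW bound" labels:

```python
# Fill-rate bandwidth demand: clock (GHz) * ROPs * bytes per pixel,
# compared against the external memory bandwidth it has to fit into.
EXTERNAL_BW_GBS = 560.0  # XSX GPU-optimal memory pool

def fill_bw_gbs(clock_ghz: float, rops: int, bytes_per_pixel: int) -> float:
    return clock_ghz * rops * bytes_per_pixel

for fmt, bpp in [("RGBA8", 4), ("RGBA16F", 8), ("RGBA32F", 16)]:
    bw = fill_bw_gbs(1.825, 64, bpp)
    bound = "ROPS bound" if bw <= EXTERNAL_BW_GBS else "BW bound"
    print(f"{fmt}: {bw:.1f} GB/s ({bound})")
```

This prints the same three lines as the post: 467.2 GB/s (ROPS bound), 934.4 GB/s (BW bound), 1868.8 GB/s (BW bound).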


#406 tormentos
Member since 2003 • 30040 Posts

@ronvalencia said:

1. I don't need MS when discussing hardware.

2. FYI, For PS4 Pro, Sony paid for GDDR5-8000 chips.

https://www.ifixit.com/Teardown/PlayStation+4+Pro+Teardown/72946

Sony used Samsung K4G80325FB GDDR5 chips

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

From https://www.samsung.com/semiconductor/dram/gddr5/

Samsung K4G80325FB is 8.0 Gbps rated which is GDDR5-8000.

PS4 Pro has 256 GB/s memory bandwidth potential. Sony's mindset is conservative.

There's nothing wrong with PS4 Pro's parts.

You are ignorant. Damn yourself.

Sony's mindset is not conservative, else they would not have a damn GPU clocked at 2.23 GHz, which no other console comes close to, and no AMD GPU on PC ships from the factory clocked as high.

Sony had a budget and they stuck to it; sure, it could mean slightly worse performance, but it could also mean $100 less, which would be far more important vs. the Xbox, considering how badly the Xbox sold at $500.

@i_p_daily said:

Your stupidity has no bounds. But for the lols, tell me which user I am an alt of again?

Crickets...

It's not my fault that you joined this site posting as if you had 10 years here, a mistake most sorry lemmings with alts always make.


#407  Edited By ronvalencia
Member since 2008 • 28988 Posts

@tormentos said:
@ronvalencia said:

1. I don't need MS when discussing hardware.

2. FYI, For PS4 Pro, Sony paid for GDDR5-8000 chips.

https://www.ifixit.com/Teardown/PlayStation+4+Pro+Teardown/72946

Sony used Samsung K4G80325FB GDDR5 chips

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

From https://www.samsung.com/semiconductor/dram/gddr5/

Samsung K4G80325FB is 8.0 Gbps rated which is GDDR5-8000.

PS4 Pro has 256 GB/s memory bandwidth potential. Sony's mindset is conservative.

There's nothing wrong with PS4 Pro's parts.

You are ignorant. Damn yourself.

Sony's mindset is not conservative, else they would not have a damn GPU clocked at 2.23 GHz, which no other console comes close to, and no AMD GPU on PC ships from the factory clocked as high.

Sony had a budget and they stuck to it; sure, it could mean slightly worse performance, but it could also mean $100 less, which would be far more important vs. the Xbox, considering how badly the Xbox sold at $500.

If AMD's 50% perf/watt PR claim is true, a 2.23 GHz boost mode would be a good target for RDNA 2's NAVI 10 replacement, i.e. RX 6700 and RX 6700 XT. Sony put a cap on the GPU at its 2.23 GHz boost mode.

RDNA 2 is fabricated on second-generation 7nm, N7P or "7nm Enhanced".

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/33.html

MSI RX 5700 XT Gaming X has a 1.99 GHz average and 2.07 GHz max boost with 270 watts average gaming power consumption. This is a factory SKU, i.e. MSI's.

Applying AMD's 50% perf/watt claim at the same performance turns 270 watts into 180 watts at ~2 GHz average.

Applying a 40% perf/watt claim turns 270 watts into ~193 watts at ~2 GHz average.

Sony could spend the remaining TDP headroom on reaching 2.23 GHz without major problems. I don't support the arguments that PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp.

Sony has stated that CPU AVX workloads are the major factor in the throttling.

You can't compare RDNA v1 on 7nm against RDNA v2 on N7P / 7nm Enhanced when PC's RDNA 2 based SKUs are not yet revealed.

I'm positive about Sony revealing RDNA 2 boost clocks, since it benefits the PC's RX 6700 XT, which is expected to scale to higher clock speeds, especially from another MSI Gaming X-like SKU.

RDNA 2's 40 CU part needs to be around 150 watts for the 80 CU "Big Navi" version to be feasible!
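One way to read a "+X% perf/watt" claim is at iso-performance: the same workload then draws P / (1 + X) watts, which is not the same thing as halving the power. A quick sketch (this iso-performance interpretation is mine, not AMD's wording):

```python
# Iso-performance power scaling under a claimed perf/watt improvement:
# same performance, power divided by (1 + fractional gain).

def scaled_power(power_w: float, perfwatt_gain: float) -> float:
    return power_w / (1.0 + perfwatt_gain)

BASE_W = 270.0  # MSI RX 5700 XT Gaming X average gaming power at ~2 GHz
print(round(scaled_power(BASE_W, 0.50)))  # 180 W under a +50% claim
print(round(scaled_power(BASE_W, 0.40)))  # 193 W under a +40% claim
```

Either reading leaves headroom in a console power budget for a clock bump toward 2.23 GHz, which is the substance of the argument above.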


#408 KillzoneSnake
Member since 2012 • 2757 Posts

@tormentos said:

If you didn't die during the 720p Xbox days, PS5 fans will be OK with 4K.

You fanboys are trying too hard.

Oh, and using the both-consoles moniker doesn't exclude you from the blind fanboy list.

Some of you don't read.

The GPU speed is 2.23 GHz almost all the time; it drops when the CPU requires more power. I don't think that will be the case much, but even with that, it has been said that the machine should not drop below 10.1 TF.

The gap will probably not hit 20%; I don't see how that would = the PS5 getting smoked.

Some of you are clueless.

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure most of the time it will run at 9 TF then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety imo is more important. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.
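The TF numbers being thrown around all come from the standard FP32 formula: CUs × 64 lanes × 2 ops per FMA × clock. A sketch of what each claim implies for the PS5's variable clock:

```python
# FP32 throughput for an RDNA-style GPU:
# TFLOPS = CUs * 64 lanes * 2 ops (FMA) * clock (GHz) / 1000

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(36, 2.230), 2))  # 10.28 -- PS5 at its 2.23 GHz cap
print(round(tflops(36, 2.192), 2))  # 10.1  -- only a ~1.7% clock drop
print(round(tflops(52, 1.825), 2))  # 12.15 -- XSX at its fixed clock
```

By the same formula, sustaining only 9 TF would require the PS5 clock to fall to about 1.95 GHz, a drop of over 12%, far more than the small dips either side's figures describe.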


#409 ronvalencia
Member since 2008 • 28988 Posts

@KillzoneSnake said:
@tormentos said:

If you didn't die during the 720p Xbox days, PS5 fans will be OK with 4K.

You fanboys are trying too hard.

Oh, and using the both-consoles moniker doesn't exclude you from the blind fanboy list.

Some of you don't read.

The GPU speed is 2.23 GHz almost all the time; it drops when the CPU requires more power. I don't think that will be the case much, but even with that, it has been said that the machine should not drop below 10.1 TF.

The gap will probably not hit 20%; I don't see how that would = the PS5 getting smoked.

Some of you are clueless.

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure most of the time it will run at 9 TF then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety imo is more important. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

FYI, Mark Cerny states heavy AVX CPU workloads can cause clock speeds to drop. LOL.


#410  Edited By phbz
Member since 2009 • 5258 Posts

Shame that the PS5 will not be much more powerful than a Sega Genesis.


#411 tormentos
Member since 2003 • 30040 Posts

@KillzoneSnake said:

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure most of the time it will run at 9 TF then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety imo is more important. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

Actually, that is not how it works; if you haven't read it yet, you don't want to, and that much is obvious from your trolling post.

@ronvalencia said:

FYI, Mark Cerny states heavy AVX CPU workloads can cause clock speeds to drop. LOL.

A 10% drop in the worst-case scenario would result in a small frequency drop; in fact, it has been claimed that the GPU will probably not drop below 10.1 TF.

But you know this, and now you are just as sad a troll as he is. lol

@ronvalencia said:

If AMD's 50% perf/watt PR claim is true, a 2.23 GHz boost mode would be a good target for RDNA 2's NAVI 10 replacement, i.e. RX 6700 and RX 6700 XT. Sony put a cap on the GPU at its 2.23 GHz boost mode.

RDNA 2 is fabricated on second-generation 7nm, N7P or "7nm Enhanced".

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/33.html

MSI RX 5700 XT Gaming X has a 1.99 GHz average and 2.07 GHz max boost with 270 watts average gaming power consumption. This is a factory SKU, i.e. MSI's.

Applying AMD's 50% perf/watt claim at the same performance turns 270 watts into 180 watts at ~2 GHz average.

Applying a 40% perf/watt claim turns 270 watts into ~193 watts at ~2 GHz average.

Sony could spend the remaining TDP headroom on reaching 2.23 GHz without major problems. I don't support the arguments that PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp.

Sony has stated that CPU AVX workloads are the major factor in the throttling.

You can't compare RDNA v1 on 7nm against RDNA v2 on N7P / 7nm Enhanced when PC's RDNA 2 based SKUs are not yet revealed.

I'm happy for Sony to reveal RDNA 2 boost clocks, since it benefits the PC's RX 6700 XT, which is expected to scale to higher clock speeds, especially from another MSI Gaming X-like SKU.

RDNA 2's 40 CU part needs to be around 150 watts for the 80 CU "Big Navi" version to be feasible!

Nothing you say there addresses my point; you are the master of saying tons of bullshit without really saying anything.

The PS4 Pro came in 2016 at $400; the X came in 2017, a year later, which means even cheaper hardware, yet it came at $100 more: $500.

The budgets for these two machines were different. Sony simply could not achieve an Xbox One X setup in 2016 at $400; hell, not even MS could, which is why they released in 2017 and for $100 more.

Budget is the keyword here; you just don't get it because you are too busy kissing MS's ass.


#412  Edited By ronvalencia
Member since 2008 • 28988 Posts

@tormentos said:
@KillzoneSnake said:

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure most of the time it will run at 9 TF then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety imo is more important. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

Actually, that is not how it works; if you haven't read it yet, you don't want to, and that much is obvious from your trolling post.

@ronvalencia said:

FYI, Mark Cerny states heavy AVX CPU workloads can cause clock speeds to drop. LOL.

A 10% drop in the worst-case scenario would result in a small frequency drop; in fact, it has been claimed that the GPU will probably not drop below 10.1 TF.

But you know this, and now you are just as sad a troll as he is. lol

@ronvalencia said:

If AMD's 50% perf/watt PR claim is true, a 2.23 GHz boost mode would be a good target for RDNA 2's NAVI 10 replacement, i.e. RX 6700 and RX 6700 XT. Sony put a cap on the GPU at its 2.23 GHz boost mode.

RDNA 2 is fabricated on second-generation 7nm, N7P or "7nm Enhanced".

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/33.html

MSI RX 5700 XT Gaming X has a 1.99 GHz average and 2.07 GHz max boost with 270 watts average gaming power consumption. This is a factory SKU, i.e. MSI's.

Applying AMD's 50% perf/watt claim at the same performance turns 270 watts into 180 watts at ~2 GHz average.

Applying a 40% perf/watt claim turns 270 watts into ~193 watts at ~2 GHz average.

Sony could spend the remaining TDP headroom on reaching 2.23 GHz without major problems. I don't support the arguments that PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp.

Sony has stated that CPU AVX workloads are the major factor in the throttling.

You can't compare RDNA v1 on 7nm against RDNA v2 on N7P / 7nm Enhanced when PC's RDNA 2 based SKUs are not yet revealed.

I'm happy for Sony to reveal RDNA 2 boost clocks, since it benefits the PC's RX 6700 XT, which is expected to scale to higher clock speeds, especially from another MSI Gaming X-like SKU.

RDNA 2's 40 CU part needs to be around 150 watts for the 80 CU "Big Navi" version to be feasible!

Nothing you say there addresses my point; you are the master of saying tons of bullshit without really saying anything.

The PS4 Pro came in 2016 at $400; the X came in 2017, a year later, which means even cheaper hardware, yet it came at $100 more: $500.

The budgets for these two machines were different. Sony simply could not achieve an Xbox One X setup in 2016 at $400; hell, not even MS could, which is why they released in 2017 and for $100 more.

Budget is the keyword here; you just don't get it because you are too busy kissing MS's ass.

I have addressed your points, and I have countered that Sony already paid for eight GDDR5-8000 chips.

For X1X, MS was waiting for Vega 64's perf/watt improvements, i.e. half of a 12.66 TFLOPS, 295 watt part yields a 6 TFLOPS version close to 150 watts. Both Vega 64 and X1X were released in 2017.

X1X GPU's 32 ROPS were the first in GCN with a 2 MB render cache, which is half of Vega 64's 64 ROPS with 4 MB L2 cache.

X1X has GDDR5-7000 rated chips, which were available in 2016.

AMD's 384-bit bus PCB design has existed since December 2011 with the Radeon HD 7970's retail release. An individual GDDR5-7000 chip is cheaper than a GDDR5-8000 chip.

X1X GPU's 44 CU, quad shader engine layout already existed with the 2013 Hawaii 44 CU GCN.

X1X has additional costs for the Ultra HD Blu-ray drive, four extra GDDR5-7000 chips, flash storage (for the OS swap file), the vapor-chamber cooler, and a slightly more complex 384-bit bus PCB.

The main delaying factors for X1X were AMD's perf/watt silicon maturity and MS's customization of GCN's rasterization.

X1X's price has been down to $399 since November 2018. https://www.theverge.com/2018/11/10/18083294/xbox-one-x-black-friday-deal-microsoft

I don't need MS when discussing mostly PC hardware.

Again, I don't support the arguments that PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp. It's your problem to defend Sony.
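The X1X memory claims check out the same way: twelve 32-bit GDDR5-7000 chips form a 384-bit bus, and the console is widely reported to clock them at 6.8 Gbps (that shipping rate is my added assumption, not stated in the post):

```python
# X1X: twelve 32-bit GDDR5 chips form a 384-bit bus.
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak bandwidth in GB/s: bus width (bits) / 8 * per-pin data rate (Gbps).
    return bus_width_bits / 8 * data_rate_gbps

bus_width = 12 * 32  # 384 bits
print(mem_bandwidth_gbs(bus_width, 7.0))  # 336.0 GB/s at the rated 7 Gbps
print(mem_bandwidth_gbs(bus_width, 6.8))  # ~326.4 GB/s at the reported shipping 6.8 Gbps
```

As with the PS4 Pro, the rated chips leave a little headroom above what the console actually runs them at.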