Sorry tormentos, PS5 lost the GPU wars

#401 Uruz7laevatein
Member since 2009 • 160 Posts

@ronvalencia:

1. So are you saying the PC master race setup is a GPU with 10 GB of RAM plus 6 GB of system RAM....

2. So does increasing CUs also increase ROPs/geometry/DCC/etc. and the other fixed-function parts of the GPU, lol.....

3. The Ryzen 7 3700X has a base clock of 3.7 GHz and boosts up to 4.3 GHz with SMT on, without the need to disable SMT, lol. The XSX CPU runs at a max of 3.6 GHz with SMT, or 3.8 GHz with SMT disabled (the only reasons for doing that are older games and keeping power consumption down). AVX2 instructions are known to reduce CPU clocks by saturating the entire CPU pipeline, with a hefty increase in power consumption and temperatures.

#402 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Stop your cow dung.

https://www.eurogamer.net/articles/ps5-specs-features-ssd-ray-tracing-cpu-gpu-6300

Sony has a pattern of gimping memory bandwidth, as in the PS4 Pro.

The PS4 has 176 GB/s of memory bandwidth, with the CPU on two 10 GB/s links, hence the GPU's memory bandwidth is on par with the PC's HD 7850. This is NOT the case for the PS4 Pro, i.e. a recycled RX 470's 224 GB/s with an additional near-mobile-class CPU bandwidth consumer, which is then repeated for the PS5, i.e. a recycled RX 5700/5700 XT's 448 GB/s with an additional desktop-PC-class CPU bandwidth consumer.

I don't even need to mention the XSX, since the PS5's GPU went below the RX 5700/RX 5700 XT's bandwidth allocation!

Sony talks about large-scale game worlds while not allocating additional bandwidth for that desktop-PC-class CPU bandwidth consumer. LOL

PS5 = the PS4 Pro concept with upgrades, e.g. a 7nm PC CPU (Zen 2 8C/16T, 3.5 GHz variable), an RDNA 2 GPU (36 CU, 2230 MHz variable), and 256-bit GDDR6-14000.

Deal with it

You are a complete and total moron; you simply can't see past your moronic arguments and your total MS bias.

Sony doesn't gimp memory, you walking pancake. Sony had a TARGET BUDGET: the freaking PS4 Pro was $400, not $500 a year later, so Sony targeted within its price budget, not because they gimped bandwidth or because they simply can't see a bottleneck.

And again, Sony targeting $400 vs MS targeting $500 is not gimping. Sony has a set budget and so does MS; MS went higher and will pay more for it, Sony went lower and will pay less.

You act as if MS outsmarted Sony, when in reality Sony had a target budget and MS had a target budget, and it's pathetic seeing you act like a total fool over a damn 17% GPU gap, you who spent the first 4 years of this gen damage controlling MS's shortcomings and downplaying the gaps the PS4 had over the Xbox One.

You actually rode the DX12 bullshit claims and tiled resources too. Worse, you even claimed every Xbox One sold = one Xbox 360 sold, which is completely moronic and proves beyond doubt that you will say whatever shit comes to mind to defend MS, including claiming that a fan inside the damn Xbox is not a cooling fan and even freaking bringing up the vacuum of space.

That's how stupid you are.
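For what it's worth, the bandwidth-sharing arithmetic in the quoted post is easy to make concrete. A minimal sketch: the 176 GB/s pool and two 10 GB/s CPU links are the quoted poster's PS4 figures, while the ~50 GB/s PS5 CPU reservation is a purely hypothetical stand-in for "desktop PC CPU level" traffic, not an official number:

```python
# Bandwidth left for the GPU after the CPU takes its cut of a unified pool.
# PS4 figures are from the quoted post; the PS5 CPU figure is hypothetical.
def gpu_share_gbs(total_gbs: float, cpu_gbs: float) -> float:
    return total_gbs - cpu_gbs

print(gpu_share_gbs(176, 2 * 10))  # PS4: 156 GB/s vs HD 7850's stock 153.6 GB/s
print(gpu_share_gbs(448, 50))      # PS5: 398 GB/s vs RX 5700 XT's full 448 GB/s
```

Whether ~50 GB/s is a realistic CPU reservation is exactly what the two posters are arguing about.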

#403 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@tormentos said:
@i_p_daily said:

If I cared about power, then why would I join this site in the era when the PS4 held the power advantage over the XB1?

I'm using it to make fun of you and other cows because we all know you cows lose your shit over the smallest things.

You joined this site before your join date. Go fool other people; we know your account arrived exactly when several lemmings vanished from this forum. You went into hiding from your regular account into this one.. 😂

You are here arguing about whether hardware gets taken advantage of or not.

Your stupidity has no bounds. But for the lols, tell me again: which user am I an alt of?

Crickets...

#404  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

Stop your cow dung.

https://www.eurogamer.net/articles/ps5-specs-features-ssd-ray-tracing-cpu-gpu-6300

Sony has a pattern of gimping memory bandwidth, as in the PS4 Pro.

The PS4 has 176 GB/s of memory bandwidth, with the CPU on two 10 GB/s links, hence the GPU's memory bandwidth is on par with the PC's HD 7850. This is NOT the case for the PS4 Pro, i.e. a recycled RX 470's 224 GB/s with an additional near-mobile-class CPU bandwidth consumer, which is then repeated for the PS5, i.e. a recycled RX 5700/5700 XT's 448 GB/s with an additional desktop-PC-class CPU bandwidth consumer.

I don't even need to mention the XSX, since the PS5's GPU went below the RX 5700/RX 5700 XT's bandwidth allocation!

Sony talks about large-scale game worlds while not allocating additional bandwidth for that desktop-PC-class CPU bandwidth consumer. LOL

PS5 = the PS4 Pro concept with upgrades, e.g. a 7nm PC CPU (Zen 2 8C/16T, 3.5 GHz variable), an RDNA 2 GPU (36 CU, 2230 MHz variable), and 256-bit GDDR6-14000.

Deal with it

1. You are a complete and total moron; you simply can't see past your moronic arguments and your total MS bias.

2. Sony doesn't gimp memory, you walking pancake. Sony had a TARGET BUDGET: the freaking PS4 Pro was $400, not $500 a year later, so Sony targeted within its price budget, not because they gimped bandwidth or because they simply can't see a bottleneck.

And again, Sony targeting $400 vs MS targeting $500 is not gimping. Sony has a set budget and so does MS; MS went higher and will pay more for it, Sony went lower and will pay less.

You act as if MS outsmarted Sony, when in reality Sony had a target budget and MS had a target budget, and it's pathetic seeing you act like a total fool over a damn 17% GPU gap, you who spent the first 4 years of this gen damage controlling MS's shortcomings and downplaying the gaps the PS4 had over the Xbox One.

You actually rode the DX12 bullshit claims and tiled resources too. Worse, you even claimed every Xbox One sold = one Xbox 360 sold, which is completely moronic and proves beyond doubt that you will say whatever shit comes to mind to defend MS, including claiming that a fan inside the damn Xbox is not a cooling fan and even freaking bringing up the vacuum of space.

That's how stupid you are.

1. I don't need MS when discussing hardware.

2. FYI, for the PS4 Pro, Sony paid for GDDR5-8000 chips.

https://www.ifixit.com/Teardown/PlayStation+4+Pro+Teardown/72946

Sony used Samsung K4G80325FB GDDR5 chips.

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

From https://www.samsung.com/semiconductor/dram/gddr5/

The Samsung K4G80325FB is rated at 8.0 Gbps, which is GDDR5-8000.

The PS4 Pro has 256 GB/s of memory bandwidth potential (8 Gbps per pin × 256-bit bus ÷ 8 bits per byte). Sony's mindset is conservative.

There's nothing wrong with PS4 Pro's parts.

You are ignorant. Damn yourself.

#405  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Uruz7laevatein said:

@ronvalencia:

1. So are you saying the PC master race setup is a GPU with 10 GB of RAM plus 6 GB of system RAM....

2. So does increasing CUs also increase ROPs/geometry/DCC/etc. and the other fixed-function parts of the GPU, lol.....

3. The Ryzen 7 3700X has a base clock of 3.7 GHz and boosts up to 4.3 GHz with SMT on, without the need to disable SMT, lol. The XSX CPU runs at a max of 3.6 GHz with SMT, or 3.8 GHz with SMT disabled (the only reasons for doing that are older games and keeping power consumption down). AVX2 instructions are known to reduce CPU clocks by saturating the entire CPU pipeline, with a hefty increase in power consumption and temperatures.

1. I'm referring to the CPU-to-GPU memory bandwidth ratio.

Zen 2 has AVX2 gather instructions, which are mass data-fetch instructions, a relic from Intel's wannabe-IGP CPU R&D.

2. Microsoft did NOT reveal the XSX GPU's non-CU details, but 12 TFLOPS of RDNA 2 scaled like an RTX 2080 in the two-week raw Gears 5 port's benchmark at PC Ultra settings.

The hardware above the L2 cache, other than the geometry engines and the RBs, sits inside the CUs, hence parts of DCC scale horizontally with CU count.

The L2 cache could scale with additional memory controllers. The XSX GPU's TFLOPS, TMUs, TFUs and memory bandwidth are all scaled up by about 25 percent.

For the RBs, take the 64 ROPS version at 1825 MHz and compare its bandwidth usage against the 560 GB/s of external memory bandwidth (BW):

RGBA8: 1825 MHz * 64 * 4 bytes = 467.2 GB/s (ROPS bound)

RGBA16F: 1825 MHz * 64 * 8 bytes = 934.4 GB/s (BW bound)

RGBA32F: 1825 MHz * 64 * 16 bytes = 1868.8 GB/s (BW bound)

BW bound needs DCC and cache render tricks.
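The fill-rate arithmetic above can be reproduced with a short sketch (the helper name is made up; the 560 GB/s figure is the XSX's GPU-optimal memory pool from the post):

```python
# Peak bandwidth the ROPs can demand at a given clock, vs what the memory
# system can actually deliver. Reproduces the three cases in the post.
def rop_bandwidth_gbs(clock_mhz: int, rops: int, bytes_per_pixel: int) -> float:
    return clock_mhz * 1e6 * rops * bytes_per_pixel / 1e9

EXTERNAL_BW_GBS = 560  # XSX's GPU-optimal pool

for fmt, bpp in (("RGBA8", 4), ("RGBA16F", 8), ("RGBA32F", 16)):
    demand = rop_bandwidth_gbs(1825, 64, bpp)
    bound = "ROPS bound" if demand <= EXTERNAL_BW_GBS else "BW bound"
    print(f"{fmt}: {demand:.1f} GB/s ({bound})")
```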

3. The AMD.com website shows the Ryzen 7 3700X's base clock as 3.6 GHz, not 3.7 GHz.

#406 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

1. I don't need MS when discussing hardware.

2. FYI, for the PS4 Pro, Sony paid for GDDR5-8000 chips.

https://www.ifixit.com/Teardown/PlayStation+4+Pro+Teardown/72946

Sony used Samsung K4G80325FB GDDR5 chips.

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

From https://www.samsung.com/semiconductor/dram/gddr5/

The Samsung K4G80325FB is rated at 8.0 Gbps, which is GDDR5-8000.

The PS4 Pro has 256 GB/s of memory bandwidth potential (8 Gbps per pin × 256-bit bus ÷ 8 bits per byte). Sony's mindset is conservative.

There's nothing wrong with PS4 Pro's parts.

You are ignorant. Damn yourself.

Sony's mindset is not conservative, or else they would not have a damn GPU clocked at 2.23 GHz, which no other console comes close to and no AMD GPU on PC ships from the factory clocked as high.

Sony had a budget and they stuck to it. Sure, it could mean slightly worse performance, but it could also mean $100 less, which would be far more important vs the Xbox, considering how badly the Xbox sold at $500.

@i_p_daily said:

Your stupidity has no bounds. But for the lols, tell me again: which user am I an alt of?

Crickets...

It's not my fault that you joined this site posting as if you had been here 10 years, a mistake most sorry lemmings with alts make.

#407  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

1. I don't need MS when discussing hardware.

2. FYI, for the PS4 Pro, Sony paid for GDDR5-8000 chips.

https://www.ifixit.com/Teardown/PlayStation+4+Pro+Teardown/72946

Sony used Samsung K4G80325FB GDDR5 chips.

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

From https://www.samsung.com/semiconductor/dram/gddr5/

The Samsung K4G80325FB is rated at 8.0 Gbps, which is GDDR5-8000.

The PS4 Pro has 256 GB/s of memory bandwidth potential (8 Gbps per pin × 256-bit bus ÷ 8 bits per byte). Sony's mindset is conservative.

There's nothing wrong with PS4 Pro's parts.

You are ignorant. Damn yourself.

Sony's mindset is not conservative, or else they would not have a damn GPU clocked at 2.23 GHz, which no other console comes close to and no AMD GPU on PC ships from the factory clocked as high.

Sony had a budget and they stuck to it. Sure, it could mean slightly worse performance, but it could also mean $100 less, which would be far more important vs the Xbox, considering how badly the Xbox sold at $500.

If AMD's 50% perf/watt PR claim is true, a 2.23 GHz boost mode would be a good target for RDNA 2's Navi 10 replacement, i.e. the RX 6700 and RX 6700 XT. Sony put a cap on the GPU at the 2.23 GHz boost mode.

RDNA 2 is fabricated on second-generation 7nm N7P or 7nm Enhanced.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/33.html

The MSI RX 5700 XT Gaming X runs a 1.99 GHz average and 2.07 GHz max boost at 270 watts average gaming power consumption. This is from a factory, i.e. MSI.

Applying AMD's 50% perf/watt claim turns 270 watts into 135 watts at ~2 GHz average.

Applying a 40% perf/watt claim turns 270 watts into 162 watts at ~2 GHz average.
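A quick sketch of that arithmetic. Note the post reads an X% perf/watt gain as an X% power cut at fixed performance; taken strictly as performance-per-watt, iso-performance power would instead be 270 / 1.5 ≈ 180 W. Both readings, under those assumptions:

```python
# Two readings of AMD's "+50% perf/watt" claim applied to the MSI card's
# ~270 W average gaming draw (figure from the post).
BASE_POWER_W = 270

def power_straight_cut(gain: float) -> float:
    return BASE_POWER_W * (1 - gain)   # the post's reading: X% less power

def power_iso_performance(gain: float) -> float:
    return BASE_POWER_W / (1 + gain)   # strict perf-per-watt reading

for gain in (0.50, 0.40):
    print(f"+{gain:.0%}: cut {power_straight_cut(gain):.0f} W, "
          f"iso-perf {power_iso_performance(gain):.0f} W")
# +50%: cut 135 W, iso-perf 180 W
# +40%: cut 162 W, iso-perf 193 W
```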

Sony could spend the remaining TDP headroom on 2.23 GHz without major problems. I don't support the arguments that the PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp.

Sony has stated that heavy CPU AVX workloads are the major factor in any throttling.

You can't compare RDNA v1 on 7nm against RDNA v2 on N7P/7nm Enhanced when the PC's RDNA 2 based SKUs haven't been revealed yet.

I'm positive about Sony revealing RDNA 2 boost clocks, since it benefits the PC's RX 6700 XT, which is expected to scale to even higher clock speeds, especially in another MSI Gaming X-like SKU.

A 40 CU RDNA 2 needs to be around 150 watts for the 80 CU "Big Navi" version to be viable!

#408 KillzoneSnake
Member since 2012 • 2761 Posts

@tormentos said:

If you didn't die during the 720p Xbox days, PS5 fans will be OK with 4K.

You fanboys are trying too hard.

Oh, and using the both-consoles moniker doesn't exclude you from the blind fanboy list.

Some of you don't read.

The GPU speed is 2.23 GHz almost all the time; it drops when the CPU requires more power. I don't think that will happen much, but even so, it has been said that the machine should not drop below 10.1 TF.

The gap will probably not hit 20%; I don't see how that would = the PS5 getting smoked.

Some of you are clueless.

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure it will run at 9 TF most of the time then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety are more important, imo. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

#409 ronvalencia
Member since 2008 • 29612 Posts

@KillzoneSnake said:
@tormentos said:

If you didn't die during the 720p Xbox days, PS5 fans will be OK with 4K.

You fanboys are trying too hard.

Oh, and using the both-consoles moniker doesn't exclude you from the blind fanboy list.

Some of you don't read.

The GPU speed is 2.23 GHz almost all the time; it drops when the CPU requires more power. I don't think that will happen much, but even so, it has been said that the machine should not drop below 10.1 TF.

The gap will probably not hit 20%; I don't see how that would = the PS5 getting smoked.

Some of you are clueless.

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure it will run at 9 TF most of the time then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety are more important, imo. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

FYI, Mark Cerny stated that heavy AVX CPU workloads can cause clock speeds to drop. LOL.

#410  Edited By deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

Shame that the PS5 will not be much more powerful than a Sega Genesis.

#411 tormentos
Member since 2003 • 33784 Posts

@KillzoneSnake said:

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure it will run at 9 TF most of the time then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety are more important, imo. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

Actually, that is not how it works. If you haven't read it yet, it's because you don't want to; that much is obvious from your trolling post.

@ronvalencia said:

FYI, Mark Cerny stated that heavy AVX CPU workloads can cause clock speeds to drop. LOL.

A 10% power drop in the worst-case scenario would result in only a small frequency drop; in fact, it has been claimed that the GPU will probably not drop below 10.1 TF.

But you know this, and now you are just as sad a troll as he is. lol
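The clock-to-TFLOPS relationship behind those numbers is easy to check. A sketch using the standard RDNA FP32 formula (CUs × 64 shaders × 2 FLOPs per clock); the 2% and 10% drops are illustrative:

```python
# FP32 throughput of the PS5's 36 CU RDNA 2 GPU at a given clock.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))         # ~10.28 TF at the full 2.23 GHz cap
print(tflops(36, 2.23 * 0.98))  # ~10.07 TF after a "couple percent" drop
print(tflops(36, 2.23 * 0.90))  # ~9.25 TF only if the clock fell a full 10%
```

So the 9.2 TF figure argued about in this thread requires a sustained 10% clock drop, not a 10% power drop.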

@ronvalencia said:

If AMD's 50% perf/watt PR claim is true, a 2.23 GHz boost mode would be a good target for RDNA 2's Navi 10 replacement, i.e. the RX 6700 and RX 6700 XT. Sony put a cap on the GPU at the 2.23 GHz boost mode.

RDNA 2 is fabricated on second-generation 7nm N7P or 7nm Enhanced.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/33.html

The MSI RX 5700 XT Gaming X runs a 1.99 GHz average and 2.07 GHz max boost at 270 watts average gaming power consumption. This is from a factory, i.e. MSI.

Applying AMD's 50% perf/watt claim turns 270 watts into 135 watts at ~2 GHz average.

Applying a 40% perf/watt claim turns 270 watts into 162 watts at ~2 GHz average.

Sony could spend the remaining TDP headroom on 2.23 GHz without major problems. I don't support the arguments that the PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp.

Sony has stated that heavy CPU AVX workloads are the major factor in any throttling.

You can't compare RDNA v1 on 7nm against RDNA v2 on N7P/7nm Enhanced when the PC's RDNA 2 based SKUs haven't been revealed yet.

I'm happy for Sony to reveal RDNA 2 boost clocks, since it benefits the PC's RX 6700 XT, which is expected to scale to even higher clock speeds, especially in another MSI Gaming X-like SKU.

A 40 CU RDNA 2 needs to be around 150 watts for the 80 CU "Big Navi" version to be viable!

Nothing you say there addresses my point; you are the master of writing tons of bullshit without really saying anything.

The PS4 Pro came in 2016 at $400; the X came in 2017, a year later, which means even cheaper hardware, yet it came at $100 more: $500.

The budgets for these two machines were different. Sony simply could not achieve an Xbox One X setup in 2016 at $400; hell, not even MS could, which is why they released in 2017 and for $100 MORE.

Budget is the keyword here. You just don't get it because you are too busy kissing MS's ass.

#412  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@KillzoneSnake said:

Yeah, so taking power away from the CPU for the GPU to go full overclock, lol. Nice one. For sure it will run at 9 TF most of the time then.

Personally I don't care; I think it's still a very powerful system. Hell, I'm happy with the PS4, lol. BC news and more AAA game variety are more important, imo. It's going to be a sad day for PS fanboys when Digital Foundry benchmarks both systems, lol.

Actually, that is not how it works. If you haven't read it yet, it's because you don't want to; that much is obvious from your trolling post.

@ronvalencia said:

FYI, Mark Cerny stated that heavy AVX CPU workloads can cause clock speeds to drop. LOL.

A 10% power drop in the worst-case scenario would result in only a small frequency drop; in fact, it has been claimed that the GPU will probably not drop below 10.1 TF.

But you know this, and now you are just as sad a troll as he is. lol

@ronvalencia said:

If AMD's 50% perf/watt PR claim is true, a 2.23 GHz boost mode would be a good target for RDNA 2's Navi 10 replacement, i.e. the RX 6700 and RX 6700 XT. Sony put a cap on the GPU at the 2.23 GHz boost mode.

RDNA 2 is fabricated on second-generation 7nm N7P or 7nm Enhanced.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/33.html

The MSI RX 5700 XT Gaming X runs a 1.99 GHz average and 2.07 GHz max boost at 270 watts average gaming power consumption. This is from a factory, i.e. MSI.

Applying AMD's 50% perf/watt claim turns 270 watts into 135 watts at ~2 GHz average.

Applying a 40% perf/watt claim turns 270 watts into 162 watts at ~2 GHz average.

Sony could spend the remaining TDP headroom on 2.23 GHz without major problems. I don't support the arguments that the PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp.

Sony has stated that heavy CPU AVX workloads are the major factor in any throttling.

You can't compare RDNA v1 on 7nm against RDNA v2 on N7P/7nm Enhanced when the PC's RDNA 2 based SKUs haven't been revealed yet.

I'm happy for Sony to reveal RDNA 2 boost clocks, since it benefits the PC's RX 6700 XT, which is expected to scale to even higher clock speeds, especially in another MSI Gaming X-like SKU.

A 40 CU RDNA 2 needs to be around 150 watts for the 80 CU "Big Navi" version to be viable!

Nothing you say there addresses my point; you are the master of writing tons of bullshit without really saying anything.

The PS4 Pro came in 2016 at $400; the X came in 2017, a year later, which means even cheaper hardware, yet it came at $100 more: $500.

The budgets for these two machines were different. Sony simply could not achieve an Xbox One X setup in 2016 at $400; hell, not even MS could, which is why they released in 2017 and for $100 MORE.

Budget is the keyword here. You just don't get it because you are too busy kissing MS's ass.

I have addressed your points, and I have countered that Sony already paid for eight GDDR5-8000 chips.

For X1X, MS was waiting for Vega 64's perf/watt improvements, i.e. half of 12.66 TFLOPS at 295 watts yields a 6 TFLOPS version close to 150 watts. Both Vega 64 and X1X were released in 2017.

The X1X GPU's 32 ROPS were the first in GCN with a 2MB render cache, which is half of Vega 64's 64 ROPS with a 4MB L2 cache.

X1X has GDDR5-7000 rated chips, which were available in 2016.

AMD's 384-bit bus PCB design has existed since December 2011 with the Radeon HD 7970's retail release. An individual GDDR5-7000 chip is cheaper than a GDDR5-8000 chip.

The X1X GPU's 44 CU, quad shader engine layout already existed with 2013's 44 CU Hawaii GCN.

X1X carries additional costs for the Ultra HD Blu-ray drive, four extra GDDR5-7000 chips, flash storage (for the OS swap file), the vapor-chamber cooler and a slightly more complex 384-bit bus PCB.

The main delaying factors for X1X were AMD's perf/watt silicon maturity and MS's customization of GCN's rasterization.

X1X's price has been down to $399 since November 2018. https://www.theverge.com/2018/11/10/18083294/xbox-one-x-black-friday-deal-microsoft

I don't need MS when discussing mostly PC hardware.

Again, I don't support the arguments that the PS5's RDNA 2 GPU averages 9.2 TFLOPS, i.e. I'm not an extremist for the other camp. It's your problem to defend Sony.

#413  Edited By Gatygun
Member since 2010 • 2709 Posts

@Uruz7laevatein said:
@Gatygun said:
@Uruz7laevatein said:

@Gatygun: LOL, so what you are saying is an 8-core Zen 2 is now a bottleneck, or whatever that means, when a console has it? Holy phook, that's news to me (I mean, do people even know what a bottleneck is?). The worst-case scenario in gaming for me was my Ryzen 7 3700X (before I swapped it out for a Ryzen 9 3950X) hitting 20% usage at worst on unoptimized PC games (where overhead is much higher). The only time a CPU gets pegged hard is when the FPS is uncapped beyond 60 FPS (when the GPU is not the bottleneck), or when running video encoding, compilation software, etc.

Look, if you don't know how the technology works, please don't respond.

Every CPU is bottlenecked and every CPU will always be bottlenecked, because they have fixed performance. Go play Anno 1800 on your 3700X and your 3950X and compare them: your fps stays exactly the same at the same clocks.

Now clock that 3950X below the 3700X and watch your performance decrease. Why? Bottleneck. The game relies on high per-core performance, and in any multi-core game the first two cores do most of the work (the first core, actually). Total CPU usage doesn't mean much as a result: if you double the cores when only 4 are being used, you get half the usage, but the same bottleneck applies, because frequency matters a lot.

The PS5's CPU can sit at 55% usage and be bottlenecked without effort while the other 45% does nothing. This is why people on the PC platform spend tons of money to get the highest CPU clocks they can.

Also, why do you think they clocked those cores so high to start with? Yeah, that's why.

So before you start quoting me and laughing, actually know your material.

Uh huh huh

Oh noes every CPU is a bottleneck, what in the world will I ever do. Did you know the water is wet and fire burns? Oh noes my 3950X stays at same fps at 10% CPU usage despite being so buttery smooth, oh my bottleneck.

Translation: Oh noes game only uses a few threads and the rest of CPU is idling. Dayum you idiot engineers at AMD/Intel, for designing such poor CPUs that don't run at 10Ghz, I Gatygun with my armchair leggo building expertise exceeds yours.

Translation: Oh noes AMD/Sony/MS you gaiz doesn't have a clue on designing hardware, I Gatygun who assembled leggos is more knowledgeable than you at hardware and compilers engineering cuz I overlock muh 9900K for muh epeens.

Oh noes how dare I quote you, I had a good laugh at such grandeur of delusions.

You should probably use fewer drugs.

Nothing you say makes sense.
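Setting the insults aside, the single-thread argument quoted above is essentially Amdahl's law. A minimal sketch, where the 60% serial fraction is an invented illustration rather than a measured figure:

```python
# Amdahl's law: overall speedup from adding cores when part of each frame
# is serial (main-thread) work that cannot be parallelized.
def speedup(cores: int, serial_fraction: float) -> float:
    return 1 / (serial_fraction + (1 - serial_fraction) / cores)

SERIAL = 0.60  # hypothetical main-thread share of frame time
print(speedup(8, SERIAL))   # ~1.54x over one core
print(speedup(16, SERIAL))  # ~1.60x: doubling cores barely helps; clocks do
```

This is why a 3950X can show low overall usage yet deliver the same fps as a 3700X at equal clocks, as claimed above.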

#414 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

I've heard the PS5 GPU is almost RDNA 2. How could the XSX compete against such power?

#415 Pedro
Member since 2002 • 69470 Posts

I don't believe some people understand the concept of a bottleneck, or else the line "every CPU is bottlenecked" wouldn't exist.

#416 Gifford38
Member since 2020 • 7165 Posts

@Pedro said:

I don't believe some people understand the concept of a bottleneck, or else the line "every CPU is bottlenecked" wouldn't exist.

Yes, that is why Sony put in some tech to get rid of bottlenecks. Mark, in the presentation, talked about bottlenecks and the things added to the system to get rid of them. lol, there's a lot of "bottlenecks" in that sentence. Now I need a bottle to get rid of the headache from reading half of the comments on this forum. In the end, EVERYONE LISTEN: BOTH SYSTEMS ARE A BIG JUMP FROM LAST GEN. BOTH SYSTEMS ARE GOING TO WOW US, EXCEPT FOR FANBOYS. I only go with Sony because I've been happy with every Sony system since the PS1. Sony I trust to give me the games I want.

#417 robert_sparkes
Member since 2018 • 7233 Posts

I agree both consoles are a huge upgrade over this generation; the tech was already outdated when the current systems hit the market.

#418 ronvalencia
Member since 2008 • 29612 Posts

@phbz said:

I've heard the PS5 GPU is almost RDNA 2. How could the XSX compete against such power?

The XSX includes RDNA 2 with 52 CU scaling.

#419  Edited By Tessellation
Member since 2009 • 9297 Posts

@madrocketeer: It doesn't matter how high it's clocked; more CUs will always be better, even if the card runs at a lower clock speed.

#420 tormentos
Member since 2003 • 33784 Posts

@Tessellation:

Bullshit, which is why a Vega 56 can beat a Vega 64.

Clocking higher has other benefits as well.

In this case the Xbox has more power because it has 16 more CUs, albeit much lower clocked, which is why the gap is only 18%. With PS4 vs Xbox One the difference was just 6 more CUs, but the gap was 40%. If the Series X were clocked at 2.23 GHz the gap would be much, much wider, but with so many CUs that would cause heat problems.
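The 18% and 40% figures follow directly from the CU counts and clocks. A sketch using the commonly cited console specs (CUs × 64 shaders × 2 FLOPs per clock):

```python
# Compute-throughput gaps implied by CU count and clock.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

xsx, ps5 = tflops(52, 1.825), tflops(36, 2.23)   # 12.15 TF vs 10.28 TF
ps4, xb1 = tflops(18, 0.800), tflops(12, 0.853)  # 1.84 TF vs 1.31 TF
print(f"XSX vs PS5: {xsx / ps5 - 1:.0%}")  # ~18%
print(f"PS4 vs XB1: {ps4 / xb1 - 1:.0%}")  # ~41%, the "40%" in the post
```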

#421  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

@Tessellation:

Bullshit, which is why a Vega 56 can beat a Vega 64.

Clocking higher has other benefits as well.

In this case the Xbox has more power because it has 16 more CUs, albeit much lower clocked, which is why the gap is only 18%. With PS4 vs Xbox One the difference was just 6 more CUs, but the gap was 40%. If the Series X were clocked at 2.23 GHz the gap would be much, much wider, but with so many CUs that would cause heat problems.

Too bad for you: the XSX scaled to RTX 2080 level in Gears 5's built-in benchmark, and that's not an AMD-friendly game like Battlefield V or the Forza series.