So, how much difference is there between PS5 and Sex really?


#1 npiet1
Member since 2018 • 3576 Posts

I'm trying to actually find out how much difference there will be between the two. I tried reading the forums and other places, but tbh it's a sh!t show when it comes to comparing the two.

So how much do CUs matter? Or is it clock speed, or a combo?

Speed seems to be an important factor when comparing older GPUs with similar CU counts. Compare the 5700 and the XT version: there's a 10fps gain for only 4 extra CUs and a ~100MHz increase, i.e. a ~1.5 TFLOP increase for 10fps (@1440p). That doesn't seem like a whole lot, and it's almost the same gap as between the Xbox and the PS5.

Also, I've heard flops don't necessarily matter either.
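
For reference, peak FP32 TFLOPS on these parts is just CUs × 64 shaders × 2 ops per clock × clock speed. A quick sketch of that arithmetic; the CU counts and clocks below are the commonly reported spec-sheet figures, so treat them as assumptions rather than thread data:

```python
# Peak FP32 TFLOPS for an RDNA-style GPU: CUs * 64 shaders * 2 ops/clock * GHz.
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000.0

# Commonly reported figures (assumed, not from this thread).
parts = {
    "RX 5700 (36 CU @ 1.725 GHz boost)": (36, 1.725),
    "RX 5700 XT (40 CU @ 1.905 GHz boost)": (40, 1.905),
    "PS5 (36 CU @ 2.23 GHz, variable)": (36, 2.23),
    "XSX (52 CU @ 1.825 GHz, fixed)": (52, 1.825),
}

for name, (cus, ghz) in parts.items():
    print(f"{name}: {tflops(cus, ghz):.2f} TFLOPS")

gap = tflops(52, 1.825) / tflops(36, 2.23) - 1
print(f"XSX over PS5: {gap:.1%}")  # ~18.2%
```

By those numbers the XT sits roughly 1.8 TFLOPS above the 5700, and the XSX roughly 1.9 TFLOPS (~18%) above the PS5, which is the parallel being drawn here.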

The CPU is a 0.3GHz difference between the two. In my experience overclocking is not a huge gain, but it's still noticeable nonetheless.

Then Cerny said they removed the bottlenecks as much as they could. Does this mean the motherboard clock speeds are increased as well, which could improve the system as a whole?

Then there's programming; won't that make a huge difference?

Or is this all just "wait till release, then compare and judge pure performance"?

*Still pretty amateur at this level, so non-trolling answers are appreciated.


#2 lamprey263
Member since 2006 • 44574 Posts

Maybe in the end developers won't design games around the SeX's performance edge or the PS5's SSD throughput advantage, and performance will be close, just to keep parity and not alienate gamers on either side of the market, with only exclusives left to demonstrate the advantages of each.

That might mean that as far as commercial performance goes, smaller factors make a bigger difference along the way: pricing, availability, marketing, game line-up, messaging, other perks that might entice gamers like Game Pass and PS Now, and the environment of gaming media and social networks. Any unintended complications on one console might also help the other, as has been the case the last two generations.

But yeah, trying to draw solid conclusions at this point would be premature. We'll need to wait and see. MS has two big SeX presentations planned in the coming weeks, and Sony will have to shed more light on their system in response. We might have a better idea of what each offers then, but even so, we won't know until these systems are in consumers' hands.


#3  Edited By Pedro
Member since 2002 • 69553 Posts

Most likely not much of anything. At the end of the day, gamers are not going to be gaming with the systems side by side, so whether the difference is larger or smaller than anticipated, it doesn't really matter. If you bought a PS5 and it's slower than the Xbox Series X, you would not know or experience the difference, just like if you bought an Xbox Series X you would not know or experience the performance advantage. These differences mainly matter to hardcore fanboys or people who are going to purchase both. For everyone else, game on. :)


#4 shellcase86
Member since 2012 • 6849 Posts

On paper, the Series X sounds like the heavier hitter. The PS5 is maybe better optimized/more efficient?

I'm a PS fan and am OK with that. The amazing games their studios have put out on PS4 (and I'm on a base launch model) have been astounding. I think the games will be incredible regardless of the tech in the box. I actually find current graphics very good; I just struggle with loading, and it seems like that will no longer be something to contend with.


#5 DrLostRib
Member since 2017 • 5931 Posts

This is what happens when they only teach abstinence in health class


#6 R-Gamer
Member since 2019 • 2221 Posts

It's the smallest gap we've had in years. At the end of the day, it will be the games that show which system is worth it.


#7 hardwenzen
Member since 2005 • 38931 Posts

Thirty percent. At least.

Huge gap between the two. For example, a game could be a locked 60fps/4K on the sexbox but only a dynamic 900p/42fps on the PS5. It's huge.

#8 onesiphorus
Member since 2014 • 5255 Posts

Sex?


#9 rmpumper
Member since 2016 • 2141 Posts

This much (_|_)


#10 deactivated-618bc23e9b1c9
Member since 2007 • 7339 Posts

25% difference.


#11 Gifford38
Member since 2020 • 7173 Posts

In the end it doesn't matter, because it's the devs who create these games. Look at the Switch with Zelda: not a graphical showcase, yet it still sells tons and is fun to play. Power is nothing, because in the end they're all just boxes to play games on. It's the devs who should get all the praise, because without them we would have nothing to play.


#12 Gifford38
Member since 2020 • 7173 Posts
@hardwenzen said:

Thirty percent. At least.

Huge gap between the two. For example, a game could be a locked 60fps/4K on the sexbox but only a dynamic 900p/42fps on the PS5. It's huge.

That will not happen, and it's not huge. Look at the Pro and the One X: one is 4K checkerboard and one is native 4K, there's a 2 teraflop difference between them, but both run at the same fps. That's not a huge difference. Now, PS2 to PS5, that's a huge difference.

They're both going to have 4K/60fps. They're both a huge jump from last gen, not just the Series X. You guys act like they just added an SSD to the PS4 and called it a PS5. They're both going to bust out great graphics.


#13 R4gn4r0k
Member since 2004 • 46348 Posts

@rmpumper said:

This much (_|_)

A butt?

But how wide is that butt?


#14 sealionact
Member since 2014 • 9817 Posts

You'll see a difference in 3rd party games... no matter how small. Both should target 4K 60fps, with the XSX hitting that target more often.

That's if the PS5 delivers as promised... we don't really know enough about how the power shifting between CPU and GPU affects games, nor about the effect of having fewer CUs and lower memory bandwidth.

The PS5's SSD will load games around 5 seconds faster, but it's 175GB smaller...

#15 osan0
Member since 2004 • 17822 Posts

Not much.

The XSX has some advantages (GPU, CPU, peak memory bandwidth), and the PS5 has some benefits (all RAM at the same speed, so more flexibility without a penalty, plus higher SSD bandwidth). Overall the XSX is the better package in terms of gaming ability, but it's not any sort of game-changing difference.

Any game the XSX can run, the PS5 can too, and every graphical trick the XSX can pull off, the PS5 can pull off (they have the same CPU and GPU architecture, after all). We will probably just see a trend where the PS5 version runs at a slightly lower res, and maybe the number of rays cast for the ray tracing is a little lower, and that's about it really.


#16 masonshoemocker
Member since 2003 • 740 Posts

Honestly, I care about high FPS now. I think that's where you'll see the biggest difference between the two: which of them outputs and maintains the highest framerate.


#17  Edited By Nonstop-Madness
Member since 2008 • 12304 Posts

It's a smaller performance gap than PS4 vs Xbox One, or PS4 Pro vs Xbox One X.


#18 xantufrog  Moderator
Member since 2013 • 17875 Posts

PS5 is actually a synonym for sex

~Mark Horndog Cerny

#19 osan0
Member since 2004 • 17822 Posts

@masonshoemocker: I wouldn't get your hopes up on high FPS for most next-gen games. We may see more of it during the cross-gen period, but as devs start stretching to fill the processing capacity of the new consoles, they will drop the FPS to 30 to meet their other targets.

Most devs will prefer 4K (or as close as possible) at 30FPS, with the highest visual settings they can muster and the most accurate, detailed simulations under the hood hammering the CPU. With stuff like DLSS (do the new consoles support something like that?) and temporal reconstruction/chequerboard rendering (and whatever new tricks they come up with next gen), they will also be happy to drop the native res below 4K and upscale, rather than chase native 4K, to keep a stable-ish 30FPS.

The only way we will ever see, say, 60FPS as standard is if MS/Sony/Nintendo demand that all games be 60FPS on their platforms, and that is very unlikely to happen. No matter how powerful any console is, developers will find ways to fill it with work.
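
That res-drop-and-upscale behaviour is typically driven by a simple feedback loop on GPU frame time. A minimal sketch of one such heuristic; the 30fps budget, scale bounds, and fake timings are illustration values, not from any real engine:

```python
# Toy dynamic-resolution controller: nudge the render scale each frame so the
# GPU stays near the frame budget, then let the upscaler fill out to 4K.
TARGET_MS = 1000.0 / 30.0          # 33.3 ms budget for a 30 fps target (assumed)
MIN_SCALE, MAX_SCALE = 0.6, 1.0    # clamp between 60% and 100% of 3840x2160

def next_scale(scale: float, gpu_ms: float) -> float:
    over = gpu_ms / TARGET_MS      # >1.0 means the GPU blew the budget
    scale /= over ** 0.5           # pixel count (cost) scales with scale^2
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for gpu_ms in [30.0, 36.5, 38.0, 34.0, 31.5]:   # fake per-frame GPU timings
    scale = next_scale(scale, gpu_ms)
    print(f"{gpu_ms:4.1f} ms -> render {int(3840*scale)}x{int(2160*scale)}, upscale to 4K")
```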


#20 PC_Rocks
Member since 2018 • 8477 Posts

In terms of concrete numbers, we don't know; not enough information is out there. In terms of actual, perceptible difference in games: almost nothing, or negligible.


#22 tormentos
Member since 2003 • 33784 Posts

It depends. On spec: 18%; in some people's minds: 50% more, with 50% more raytracing and 200% more bandwidth.

In actual in-game performance: probably minimal. In some fanboys' minds: 500%.


#23  Edited By iambatman7986
Member since 2013 • 4576 Posts

My main concern for the PS5 is the variable clocks. Listening to Digital Foundry, the CPU and GPU can't both be clocked that high at the same time: for CPU-intensive games the GPU will suffer, and vice versa. This came out of their talk with Mark Cerny. Developers also can't lock the clocks like they can on the dev kits. Both consoles are a great step up in technology from last gen, though.


#24 Gifford38
Member since 2020 • 7173 Posts

@osan0 said:

@masonshoemocker: I wouldn't get your hopes up on high FPS for most next-gen games. We may see more of it during the cross-gen period, but as devs start stretching to fill the processing capacity of the new consoles, they will drop the FPS to 30 to meet their other targets.

Most devs will prefer 4K (or as close as possible) at 30FPS, with the highest visual settings they can muster and the most accurate, detailed simulations under the hood hammering the CPU. With stuff like DLSS (do the new consoles support something like that?) and temporal reconstruction/chequerboard rendering (and whatever new tricks they come up with next gen), they will also be happy to drop the native res below 4K and upscale, rather than chase native 4K, to keep a stable-ish 30FPS.

The only way we will ever see, say, 60FPS as standard is if MS/Sony/Nintendo demand that all games be 60FPS on their platforms, and that is very unlikely to happen. No matter how powerful any console is, developers will find ways to fill it with work.

NO NO NO. I think the devs all wanted 60 but couldn't hit it because of limitations. Now we don't have those limitations, so I'm sure every game this next gen will be 60fps. 60fps will be the standard this gen and moving forward.


#25 Howmakewood
Member since 2015 • 7705 Posts

@gifford38 said:
@osan0 said:

@masonshoemocker: I wouldn't get your hopes up on high FPS for most next-gen games. We may see more of it during the cross-gen period, but as devs start stretching to fill the processing capacity of the new consoles, they will drop the FPS to 30 to meet their other targets.

Most devs will prefer 4K (or as close as possible) at 30FPS, with the highest visual settings they can muster and the most accurate, detailed simulations under the hood hammering the CPU. With stuff like DLSS (do the new consoles support something like that?) and temporal reconstruction/chequerboard rendering (and whatever new tricks they come up with next gen), they will also be happy to drop the native res below 4K and upscale, rather than chase native 4K, to keep a stable-ish 30FPS.

The only way we will ever see, say, 60FPS as standard is if MS/Sony/Nintendo demand that all games be 60FPS on their platforms, and that is very unlikely to happen. No matter how powerful any console is, developers will find ways to fill it with work.

NO NO NO. I think the devs all wanted 60 but couldn't hit it because of limitations. Now we don't have those limitations, so I'm sure every game this next gen will be 60fps. 60fps will be the standard this gen and moving forward.

It won't; they will chase higher visuals rather than double the framerate.


#26 lamprey263
Member since 2006 • 44574 Posts

There are also instances I have noticed with the XB1X where devs are too ambitious and problems arise from trying to maximize use of the hardware; by contrast, working within a weaker system's limitations to achieve similar results ends up being more efficient. I ran across an interesting article about games rendering at lower resolution and polishing the image through post-processing effects, checkerboard rendering, temporal supersampling and such; that achieves better performance by limiting overhead and leaving resources to enhance visual fidelity more effectively.


#27  Edited By Sushiglutton
Member since 2009 • 9853 Posts

I think the smart move is to wait a couple of months after release to know for sure. It probably won't mean too much in terms of enjoyment of the games.


#28 Bluestars
Member since 2019 • 2789 Posts

Buy both

Get all the power

Get all the games

The end

# realgamer


#29 tormentos
Member since 2003 • 33784 Posts

@bluestars said:

Buy both

Get all the power

Get all the games

The end

# realgamer

Sure, kinglemming, you've surely convinced us that you're an unbiased gamer. 🙄

@iambatman7986 said:

My main concern for the PS5 is the variable clocks. Listening to Digital Foundry, the CPU and GPU can't both be clocked that high at the same time: for CPU-intensive games the GPU will suffer, and vice versa. This came out of their talk with Mark Cerny. Developers also can't lock the clocks like they can on the dev kits. Both consoles are a great step up in technology from last gen, though.

I don't think there will be CPU-intensive games on these machines: one, because at 4K the CPU overhead is minimal, and two, because no matter the PR, these machines are not aiming for 120fps in all their games.

And considering the Jaguar is still in use today, I don't see how a 3.5GHz, 8-core/16-thread CPU would be a bottleneck.

"There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Cerny also said this in the second DF article.


#30 deactivated-5f2b4872031c2
Member since 2018 • 2683 Posts

Saw the thread title, thought I'd accidentally wandered into the wrong part of the internet.


#31 sovkhan
Member since 2015 • 1591 Posts

Nothing to the naked eye; we'll be waiting on DF for the verdict.

But even then we won't see huge differences.

All in all, both are potent machines, with a little more power to the XSX.

Not the big deal some make it out to be.


#32 Nonstop-Madness
Member since 2008 • 12304 Posts

@tormentos said:
@iambatman7986 said:

My main concern for the PS5 is the variable clocks. Listening to Digital Foundry, the CPU and GPU can't both be clocked that high at the same time: for CPU-intensive games the GPU will suffer, and vice versa. This came out of their talk with Mark Cerny. Developers also can't lock the clocks like they can on the dev kits. Both consoles are a great step up in technology from last gen, though.

I don't think there will be CPU-intensive games on these machines: one, because at 4K the CPU overhead is minimal, and two, because no matter the PR, these machines are not aiming for 120fps in all their games.

And considering the Jaguar is still in use today, I don't see how a 3.5GHz, 8-core/16-thread CPU would be a bottleneck.

"There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Cerny also said this in the second DF article.

Right, the CPU and GPU can both run at those frequencies as long as the workload doesn't push the system past its power-consumption or thermal limits. E.g. Cerny says a 10% frequency reduction cuts power consumption by 27%. That means the CPU could be capped at 90% of its peak for some period of time (some number of frames), freeing up roughly that 27% of the power budget for the GPU and letting it crank out very high usage (say 100%) for a more extended period without the system hitting its limits.

I think the general misunderstanding is that games run at very high usage (90%+) on the CPU and GPU all the time. That's simply not true... you're asking for trouble if your game is running at 90%+ on the CPU *and* GPU at all times. It implies your system is under the same load whether a character is walking down a hallway or has just blown up 15 explosive barrels on screen, and that every in-game action creates equally high workloads for both the CPU and GPU. That isn't true.

Yes, you want a high performance ceiling on the CPU and GPU for the frames that require peak performance, but games are pretty much *never* running very high (90%+) CPU *and* GPU usage at the exact same time... and functioning well. lol
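
That 10%-clock/27%-power figure is consistent with the usual first-order model where dynamic power scales with frequency times voltage squared, and voltage tracks frequency near the top of the curve, i.e. roughly power ∝ f³. A back-of-envelope check under that (assumed) cubic model:

```python
# First-order DVFS model: P ~ f * V^2, with V roughly proportional to f near
# the top of the curve, so P ~ f^3 (a common approximation, assumed here).
def relative_power(freq_ratio: float) -> float:
    return freq_ratio ** 3

saving = 1.0 - relative_power(0.90)   # run the clock 10% lower
print(f"10% lower clock -> ~{saving:.0%} less power")  # ~27%, matching Cerny's figure
```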


#33 Jag85
Member since 2005 • 19566 Posts

How much difference is there between PS4 and sex?

PlayStation 4 Is the Most Popular Pornhub Console


#34 deactivated-60bf765068a74
Member since 2007 • 9558 Posts

PS5 > Sex


#35 deactivated-63d1ad7651984
Member since 2017 • 10057 Posts
[video]


#36 ronvalencia
Member since 2008 • 29612 Posts

@Nonstop-Madness said:
@tormentos said:
@iambatman7986 said:

My main concern for the PS5 is the variable clocks. Listening to Digital Foundry, the CPU and GPU can't both be clocked that high at the same time: for CPU-intensive games the GPU will suffer, and vice versa. This came out of their talk with Mark Cerny. Developers also can't lock the clocks like they can on the dev kits. Both consoles are a great step up in technology from last gen, though.

I don't think there will be CPU-intensive games on these machines: one, because at 4K the CPU overhead is minimal, and two, because no matter the PR, these machines are not aiming for 120fps in all their games.

And considering the Jaguar is still in use today, I don't see how a 3.5GHz, 8-core/16-thread CPU would be a bottleneck.

"There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Cerny also said this in the second DF article.

Right, the CPU and GPU can both run at those frequencies as long as the workload doesn't push the system past its power-consumption or thermal limits. E.g. Cerny says a 10% frequency reduction cuts power consumption by 27%. That means the CPU could be capped at 90% of its peak for some period of time (some number of frames), freeing up roughly that 27% of the power budget for the GPU and letting it crank out very high usage (say 100%) for a more extended period without the system hitting its limits.

I think the general misunderstanding is that games run at very high usage (90%+) on the CPU and GPU all the time. That's simply not true... you're asking for trouble if your game is running at 90%+ on the CPU *and* GPU at all times. It implies your system is under the same load whether a character is walking down a hallway or has just blown up 15 explosive barrels on screen, and that every in-game action creates equally high workloads for both the CPU and GPU. That isn't true.

Yes, you want a high performance ceiling on the CPU and GPU for the frames that require peak performance, but games are pretty much *never* running very high (90%+) CPU *and* GPU usage at the exact same time... and functioning well. lol

Your argument is effectively current-gen gameplay with better graphics, instead of Ashes of the Singularity-level NPC counts.


#37 Xplode_games
Member since 2011 • 2540 Posts

@npiet1 said:

I'm trying to actually find out how much difference there will be between the two. I tried reading the forums and other places, but tbh it's a sh!t show when it comes to comparing the two.

So how much do CUs matter? Or is it clock speed, or a combo?

Speed seems to be an important factor when comparing older GPUs with similar CU counts. Compare the 5700 and the XT version: there's a 10fps gain for only 4 extra CUs and a ~100MHz increase, i.e. a ~1.5 TFLOP increase for 10fps (@1440p). That doesn't seem like a whole lot, and it's almost the same gap as between the Xbox and the PS5.

Also, I've heard flops don't necessarily matter either.

The CPU is a 0.3GHz difference between the two. In my experience overclocking is not a huge gain, but it's still noticeable nonetheless.

Then Cerny said they removed the bottlenecks as much as they could. Does this mean the motherboard clock speeds are increased as well, which could improve the system as a whole?

Then there's programming; won't that make a huge difference?

Or is this all just "wait till release, then compare and judge pure performance"?

*Still pretty amateur at this level, so non-trolling answers are appreciated.

The XSX is going to be a lot more powerful; why do you think Sony overclocked the PS5 at the last minute? Does the difference matter? Yes, but it depends. Even the PS5 is miles better than the X1X; it's not even close. This is a new gen, no joke. The PS5 may be gimped compared to the XSX, but it's going to look next-gen compared to the last-gen consoles running the same games. The PS5 is not a bad console, it's just a lot worse than the XSX. Just like an RTX 2060 is not a bad card, but a 2080 is a lot better. Again, the 2080 being a lot better does not mean the 2060 is not an awesome card and a great buy that consumers will love.


#38 DrLostRib
Member since 2017 • 5931 Posts

@Xplode_games said:
@npiet1 said:

I'm trying to actually find out how much difference there will be between the two. I tried reading the forums and other places, but tbh it's a sh!t show when it comes to comparing the two.

So how much do CUs matter? Or is it clock speed, or a combo?

Speed seems to be an important factor when comparing older GPUs with similar CU counts. Compare the 5700 and the XT version: there's a 10fps gain for only 4 extra CUs and a ~100MHz increase, i.e. a ~1.5 TFLOP increase for 10fps (@1440p). That doesn't seem like a whole lot, and it's almost the same gap as between the Xbox and the PS5.

Also, I've heard flops don't necessarily matter either.

The CPU is a 0.3GHz difference between the two. In my experience overclocking is not a huge gain, but it's still noticeable nonetheless.

Then Cerny said they removed the bottlenecks as much as they could. Does this mean the motherboard clock speeds are increased as well, which could improve the system as a whole?

Then there's programming; won't that make a huge difference?

Or is this all just "wait till release, then compare and judge pure performance"?

*Still pretty amateur at this level, so non-trolling answers are appreciated.

The XSX is going to be a lot more powerful; why do you think Sony overclocked the PS5 at the last minute? Does the difference matter? Yes, but it depends. Even the PS5 is miles better than the X1X; it's not even close. This is a new gen, no joke. The PS5 may be gimped compared to the XSX, but it's going to look next-gen compared to the last-gen consoles running the same games. The PS5 is not a bad console, it's just a lot worse than the XSX. Just like an RTX 2060 is not a bad card, but a 2080 is a lot better. Again, the 2080 being a lot better does not mean the 2060 is not an awesome card and a great buy that consumers will love.

how do you quantify "a lot better"?


#39 Fairmonkey
Member since 2011 • 2312 Posts

Probably pretty similar for multiplat devs. It doesn't really matter that much; look how good Uncharted 4 looked on a standard PS4. What matters are the games, and Halo/Gears/Forza just doesn't cut it anymore for most people.


#40  Edited By ronvalencia
Member since 2008 • 29612 Posts

@npiet1 said:

I'm trying to actually find out how much difference there will be between the two. I tried reading the forums and other places, but tbh it's a sh!t show when it comes to comparing the two.

So how much do CUs matter? Or is it clock speed, or a combo?

Speed seems to be an important factor when comparing older GPUs with similar CU counts. Compare the 5700 and the XT version: there's a 10fps gain for only 4 extra CUs and a ~100MHz increase, i.e. a ~1.5 TFLOP increase for 10fps (@1440p). That doesn't seem like a whole lot, and it's almost the same gap as between the Xbox and the PS5.

Also, I've heard flops don't necessarily matter either.

The CPU is a 0.3GHz difference between the two. In my experience overclocking is not a huge gain, but it's still noticeable nonetheless.

Then Cerny said they removed the bottlenecks as much as they could. Does this mean the motherboard clock speeds are increased as well, which could improve the system as a whole?

Then there's programming; won't that make a huge difference?

Or is this all just "wait till release, then compare and judge pure performance"?

*Still pretty amateur at this level, so non-trolling answers are appreciated.

Both the RX 5700 and the RX 5700 XT have the same 448 GB/s memory bandwidth, with a 24.67% TFLOPS difference. Due to cost considerations, AMD didn't scale memory bandwidth with the TFLOPS increase. Finance beats engineering.

Expect PC's RX 6700 XT RDNA 2 refresh to sport faster memory modules, e.g. 256-bit GDDR6-15500 with 496 GB/s.

From a PS5 baseline, the XSX backs its 18.2% TFLOPS advantage with up to a 25% memory bandwidth advantage.

From an RX 5700 XT baseline, the XSX backs its 25.75% TFLOPS advantage with up to a 25% memory bandwidth advantage.

From https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs

"but there was one startling takeaway - we were shown benchmark results that, on this two-week-old, unoptimised port, already deliver very, very similar performance to an RTX 2080."

Scale the RX 5700 XT's 9.66 TFLOPS** up to the XSX's 12.147 TFLOPS and it lands at RTX 2080-class level, i.e. 50 fps.

**Based on the real-world average clock speed for the RX 5700 XT from https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/34.html

12.147 / 9.66 = 1.25745, or 25.745%; hence the XSX GPU has ~25.75% extra TFLOPS over the RX 5700 XT.

Gears 5's ranking follows: apply 1.25 to the RX 5700 XT's 122% and it lands at 152.5%, which is between the RTX 2080 and the RTX 2080 Super.

The XSX GPU solution is ~25% superior to the RX 5700 XT in TFLOPS, texture mappers, texture filtering, on-chip SRAM storage, and memory bandwidth.
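
The estimate above is plain linear scaling by TFLOPS ratio; a sketch of that arithmetic (this assumes fps scales 1:1 with TFLOPS, which the bandwidth caveat elsewhere in the thread argues against):

```python
# Linear TFLOPS scaling as used in this post (assumes fps tracks TFLOPS 1:1
# and ignores memory-bandwidth limits).
xt_real_tflops = 9.66    # RX 5700 XT at TechPowerUp's real-world average clock
xsx_tflops = 12.147      # XSX peak

ratio = xsx_tflops / xt_real_tflops
print(f"scaling factor: {ratio:.5f}")       # ~1.25745

# Gears 5 relative-performance chart: 5700 XT listed at 122%.
print(f"XSX estimate: {122 * ratio:.1f}%")  # ~153.4% (the post rounds to 1.25 -> 152.5%)
```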


#41 npiet1
Member since 2018 • 3576 Posts

@hardwenzen said:

Thirty percent. At least.

Huge gap between the two. For example, a game could be a locked 60fps/4K on the sexbox but only a dynamic 900p/42fps on the PS5. It's huge.

Where did you get 30%?

As my example shows, the difference between the 5700 and the XT is 1.5 TFLOPS, so how would the difference be that big?


#42  Edited By ronvalencia
Member since 2008 • 29612 Posts

@npiet1 said:
@hardwenzen said:

Thirty percent. At least.

Huge gap between the two. For example, a game could be a locked 60fps/4K on the sexbox but only a dynamic 900p/42fps on the PS5. It's huge.

Where did you get 30%?

As my example shows, the difference between the 5700 and the XT is 1.5 TFLOPS, so how would the difference be that big?

The difference is like an RTX 2080 Super vs an RTX 2070 overclocked to 2.23 GHz.

The RTX 2080 Super FE has 11.78 TFLOPS with 496 GB/s memory bandwidth; MSI's RTX 2080 Super Gaming X has 12.1 TFLOPS.

An RTX 2070 at a 2.23 GHz OC has 10.275 TFLOPS with 448 GB/s memory bandwidth.
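
The 2070 figure is the same peak-FLOPS arithmetic; notably, a 2070's 2304 CUDA cores match the PS5's 36 CUs × 64 shaders, which is presumably why that card at 2.23 GHz is used as the stand-in (core count from the public spec sheet; the PS5 parallel is my inference):

```python
# Peak FP32 TFLOPS = shader count * 2 ops/clock * clock (GHz) / 1000.
def tflops(shaders: int, ghz: float) -> float:
    return shaders * 2 * ghz / 1000.0

# RTX 2070: 2304 CUDA cores. PS5: 36 CUs * 64 = 2304 shaders.
# Same shader count, so at the same 2.23 GHz they land on the same figure.
print(f"RTX 2070 @ 2.23 GHz: {tflops(2304, 2.23):.3f} TFLOPS")    # ~10.276
print(f"PS5 (36 CU @ 2.23 GHz): {tflops(36 * 64, 2.23):.3f} TFLOPS")  # same
```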


#43  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said:
@bluestars said:

Buy both

Get all the power

Get all the games

The end

# realgamer

Sure, kinglemming, you've surely convinced us that you're an unbiased gamer. 🙄

@iambatman7986 said:

My main concern for the PS5 is the variable clocks. Listening to Digital Foundry, the CPU and GPU can't both be clocked that high at the same time: for CPU-intensive games the GPU will suffer, and vice versa. This came out of their talk with Mark Cerny. Developers also can't lock the clocks like they can on the dev kits. Both consoles are a great step up in technology from last gen, though.

I don't think there will be CPU-intensive games on these machines: one, because at 4K the CPU overhead is minimal, and two, because no matter the PR, these machines are not aiming for 120fps in all their games.

And considering the Jaguar is still in use today, I don't see how a 3.5GHz, 8-core/16-thread CPU would be a bottleneck.

"There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Cerny also said this in the second DF article.

Mark Cerny warns against 256-bit AVX workloads in his presentation.

Using 128-bit SSE4 reduces SIMD math co-processor energy usage.


#44 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@ProtossRushX said:

PS5 > Sex

I bet you're right handed lol.


#45 npiet1
Member since 2018 • 3576 Posts

@ronvalencia said:
@npiet1 said:
@hardwenzen said:

Thirty percent. At least.

Huge gap between the two. For example, a game could be a locked 60fps/4K on the sexbox but only a dynamic 900p/42fps on the PS5. It's huge.

Where did you get 30%?

As my example shows, the difference between the 5700 and the XT is 1.5 TFLOPS, so how would the difference be that big?

The difference is like an RTX 2080 Super vs an RTX 2070 overclocked to 2.23 GHz.

The RTX 2080 Super FE has 11.78 TFLOPS with 496 GB/s memory bandwidth; MSI's RTX 2080 Super Gaming X has 12.1 TFLOPS.

An RTX 2070 at a 2.23 GHz OC has 10.275 TFLOPS with 448 GB/s memory bandwidth.

So at the same resolution you will only see a difference of 10-15fps? That's not a huge difference. Devs could tweak the visuals ever so slightly lower on the PS5 and you wouldn't see a difference at all.


#46 Ibacai
Member since 2006 • 14459 Posts

@npiet1: OK, the debate is silly and these numbers are all fluff at the moment... but 10-15 frames is a huge difference. That's a clearly noticeable difference you can see with the naked eye, especially when the frames dip.


#47 npiet1
Member since 2018 • 3576 Posts

@Ibacai said:

@npiet1: OK, the debate is silly and these numbers are all fluff at the moment... but 10-15 frames is a huge difference. That's a clearly noticeable difference you can see with the naked eye, especially when the frames dip.

It's a huge difference if it's 45 vs 60, but not when it's 60 vs 75. It really depends on the initial frame target, I guess, but then a small dip in visuals would keep the FPS similar.


#48 R-Gamer
Member since 2019 • 2221 Posts

@npiet1: If the game runs at 60fps on the XSX, it may run at 52fps on the PS5. You'll get nowhere near a 15fps difference unless the game runs at well over 60fps on both consoles.


#49 npiet1
Member since 2018 • 3576 Posts

@r-gamer said:

@npiet1: If the game runs at 60fps on the XSX, it may run at 52fps on the PS5. You'll get nowhere near a 15fps difference unless the game runs at well over 60fps on both consoles.

So it's pretty safe to say that the difference between the two is barely going to be noticed.


#50  Edited By ronvalencia
Member since 2008 • 29612 Posts

@npiet1 said:
@ronvalencia said:

The difference is like an RTX 2080 Super vs an RTX 2070 overclocked to 2.23 GHz.

The RTX 2080 Super FE has 11.78 TFLOPS with 496 GB/s memory bandwidth; MSI's RTX 2080 Super Gaming X has 12.1 TFLOPS.

An RTX 2070 at a 2.23 GHz OC has 10.275 TFLOPS with 448 GB/s memory bandwidth.

So at the same resolution you will only see a difference of 10-15fps? That's not a huge difference. Devs could tweak the visuals ever so slightly lower on the PS5 and you wouldn't see a difference at all.

Against the PS5, it's more than an 18.21% difference, because Sony added extra GPU TFLOPS on the same 448 GB/s memory bandwidth as the RX 5700 and RX 5700 XT.

The stock RX 5700 XT and factory-overclocked RX 5700 XTs with the same 448 GB/s memory bandwidth are already showing diminishing returns relative to the RX 5700's 7.7 TFLOPS.

https://www.techpowerup.com/review/asrock-radeon-rx-5700-xt-taichi-oc-plus/33.html

This RX 5700 XT Taichi OC averages 1996 MHz, hence 10.219 TFLOPS.

https://www.techpowerup.com/review/asrock-radeon-rx-5700-xt-taichi-oc-plus/27.html

For the XSX estimate, applying 1.25 to the RX 5700 XT's 96% lands at 120%, which is RTX 2080 level.

The difference could reach 20% once the 448 GB/s memory bandwidth bottleneck is factored in.

At 120 fps at 4K and an 18.21% difference:

XSX: 120 fps

PS5: 101.5 fps

At 60 fps at 4K and an 18.21% difference:

XSX: 60 fps

PS5: 50.7 fps

At 60 fps at 4K and a 20% difference due to the 448 GB/s memory bandwidth bottleneck:

XSX: 60 fps

PS5: 48 fps

The difference could reach 21% if the PS5's GPU clock speed is reduced by 1%.

AMD needs to spec up the RX 5700 XT with GDDR6-15500 or GDDR6-16000 rated modules.

https://videocardz.com/newz/msi-outs-geforce-rtx-2080-ti-gaming-z-with-16-gbps-gddr6-memory

GDDR6-16000 is already shipping, pushing GDDR6-15500 into a lower tier.

4K render targets will need the highest memory bandwidth for the money.
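
The fps tables above come from treating the percentage gap as a straight scaling factor on frame rate; a sketch (note the post applies 18.21% as an XSX advantage to divide by, and 20% as a PS5 deficit to multiply by):

```python
# Reproduce the fps tables above. 18.21% is applied as an XSX advantage
# (divide), 20% as a PS5 deficit (multiply) -- both conventions appear above.
def ps5_from_advantage(xsx_fps: float, advantage: float) -> float:
    return xsx_fps / (1.0 + advantage)

def ps5_from_deficit(xsx_fps: float, deficit: float) -> float:
    return xsx_fps * (1.0 - deficit)

print(f"{ps5_from_advantage(120, 0.1821):.1f}")  # ~101.5 fps vs XSX 120
print(f"{ps5_from_advantage(60, 0.1821):.1f}")   # ~50.8 fps (the post rounds to 50.7)
print(f"{ps5_from_deficit(60, 0.20):.1f}")       # 48.0 fps vs XSX 60
```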