The PS4's weakness shows again in the latest Witcher patch

This topic is locked from further discussion.


#101  Edited By commander
Member since 2010 • 16217 Posts

@Shewgenja said:

Whatever the PS4 is "losing" in cycles in terms of system performance from it's slightly slower processor piece is more than being made up for in terms of memory performance with a unified memory addressing architecture. If you are bottlenecking in terms of compute performance, it is because you have not optimized your engine for the CUs.

Seeing as how this is a multiplatform game from a small developer, these things come as no surprise. What also doesn't come as a surprise is that, once again, an Xbox fanpuppet is making a big deal about something no one is actually giving a flying crap about.

The memory does nothing for the PS4 besides increasing GPU performance; GDDR5's latency is actually worse for CPU calculations.

It's like @04dcarraher says: the longer we're into this gen, the more prevalent the CPU advantage will become. The X1 has already optimized by dropping the resources reserved for Kinect and unlocking that 7th core. DX12 won't make the X1 run games at 1080p, but the framerate difference will grow, making the powerpoint station 4 do what it was born to do.

and that is making full HD slideshows lmao


#102  Edited By Shewgenja
Member since 2009 • 21456 Posts

@commander said:
@Shewgenja said:

Whatever the PS4 is "losing" in cycles in terms of system performance from it's slightly slower processor piece is more than being made up for in terms of memory performance with a unified memory addressing architecture. If you are bottlenecking in terms of compute performance, it is because you have not optimized your engine for the CUs.

Seeing as how this is a multiplatform game from a small developer, these things come as no surprise. What also doesn't come as a surprise is that, once again, an Xbox fanpuppet is making a big deal about something no one is actually giving a flying crap about.

The memory does nothing for the ps4 besides increasing gpu performance, the latency in gddr5 is actually worse for cpu calculations.

It's like @04dcarraher says the longer we're in this gen the more prevalent the cpu advantage will become. X1 has already optimized by dropping the reserved resources fro the kinect and unlocking that 7th core. Dx 12 won't make the x1 run games at 1080p, but the framerate difference will become higher making the powerpoint station 4 do what it was born to do.

and that is making full hd slideshows lmao

the latency in gddr5 is actually worse for cpu calculations.

...

the latency in gddr5 is actually worse for cpu calculations.

..

the latency in gddr5 is actually worse for cpu calculations.

LOL, you're still pimping this weak, tired shit? GDDR5 is essentially full duplex in read/write thanks to its asynchronous clock timing, whereas previous JEDEC RAM specifications don't do this. In real-world performance, comparing GDDR5's CAS latency to DDR3's CAS is apples and oranges.

Most multiplatform games aren't even scratching the surface of what they can do with the PS4, due to having to do this silly thing called meeting a deadline while supporting multiple platforms. Your frankly limp-dick approach of blanketing every little patch and nuance of a technical deficiency with the device, just so you can eke out a kudos for the Xbox, is, well, rather pathetic.


#103 04dcarraher
Member since 2004 • 23829 Posts

@commander said:
@Shewgenja said:

Whatever the PS4 is "losing" in cycles in terms of system performance from it's slightly slower processor piece is more than being made up for in terms of memory performance with a unified memory addressing architecture. If you are bottlenecking in terms of compute performance, it is because you have not optimized your engine for the CUs.

Seeing as how this is a multiplatform game from a small developer, these things come as no surprise. What also doesn't come as a surprise is that, once again, an Xbox fanpuppet is making a big deal about something no one is actually giving a flying crap about.

The memory does nothing for the ps4 besides increasing gpu performance, the latency in gddr5 is actually worse for cpu calculations.

It's like @04dcarraher says the longer we're in this gen the more prevalent the cpu advantage will become. X1 has already optimized by dropping the reserved resources fro the kinect and unlocking that 7th core. Dx 12 won't make the x1 run games at 1080p, but the framerate difference will become higher making the powerpoint station 4 do what it was born to do.

and that is making full hd slideshows lmao

GDDR5 isn't really any worse for the PS4's CPU, since its memory controllers actually buffer the greater latency, and it's only when doing a bunch of small tasks that GDDR5 would be worse. Overall you're not going to see any major difference between DDR3 and GDDR5 for their CPU work.
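
To make the "apples and oranges" point concrete, here is a minimal sketch (not from the thread, and using made-up timings rather than either console's real memory parameters) of converting CAS latency from clock cycles into nanoseconds, which is what the CPU actually experiences:

```python
# Rough sketch: converting CAS latency from clock cycles to nanoseconds.
# The timings below are illustrative placeholders, not the PS4's or X1's
# real DDR3/GDDR5 parameters.

def cas_latency_ns(cas_cycles: float, command_clock_mhz: float) -> float:
    """Absolute CAS latency = cycles / command clock frequency."""
    return cas_cycles / command_clock_mhz * 1000.0  # MHz -> ns

# Hypothetical example: DDR3 with a low clock and low CL vs GDDR5 with a
# high clock and high CL can land in the same ballpark once expressed in time.
ddr3_ns = cas_latency_ns(cas_cycles=11, command_clock_mhz=800)    # ~13.8 ns
gddr5_ns = cas_latency_ns(cas_cycles=15, command_clock_mhz=1375)  # ~10.9 ns

print(f"DDR3 (hypothetical):  {ddr3_ns:.1f} ns")
print(f"GDDR5 (hypothetical): {gddr5_ns:.1f} ns")
```

A higher-clocked part with a higher CAS cycle count is not automatically slower in absolute time, which is why comparing the raw cycle counts across DDR3 and GDDR5 says little on its own.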


#104  Edited By commander
Member since 2010 • 16217 Posts

@Shewgenja said:
@commander said:

The memory does nothing for the ps4 besides increasing gpu performance, the latency in gddr5 is actually worse for cpu calculations.

It's like @04dcarraher says the longer we're in this gen the more prevalent the cpu advantage will become. X1 has already optimized by dropping the reserved resources fro the kinect and unlocking that 7th core. Dx 12 won't make the x1 run games at 1080p, but the framerate difference will become higher making the powerpoint station 4 do what it was born to do.

and that is making full hd slideshows lmao

the latency in gddr5 is actually worse for cpu calculations.

...

the latency in gddr5 is actually worse for cpu calculations.

..

the latency in gddr5 is actually worse for cpu calculations.

LOL, you're still pimping this weak tired shit? GDDR5 is essentially full duplex in read/write whereas previous JEDEC RAM specifications do not do this. In real-world performance, comparing GDDR5's CAS latency to DDR3s CAS is apples and oranges.

Still doesn't matter. Normally high-speed memory makes up for latency with bandwidth, but that bandwidth is useless since the CPU is so weak. GDDR5 is great for the GPU of course, but does nothing for the CPU.

It might be that the optimizations nullified the latency difference, I can give you that, but mentioning the GDDR5 as a memory advantage because it's shared is BS. The CPU in the PS4 can never utilize that bandwidth.


#105 Shewgenja
Member since 2009 • 21456 Posts

@commander said:

Still doesn't matter, normally high speed memory makes up for latency with bandwith but that bandwith is useless since the cpu is so weak. Gddr5 is great for the gpu of course but does nothing for the cpu.

I might be that the optimizatons nullified the latency difference i can give you that, but mentioning the gddr 5 as a memory advantage because it's shared is bs. The cpu in the ps4 can never utilize that bandwith.

Unless you were expecting Sony to time travel to Cyberdyne headquarters after the nuclear war, did you honestly expect a CPU that could take full advantage of 5.5GHz RAM? Intel isn't even coming close to that with their memory controllers. Meanwhile, you are spewing this nonsense AGAIN and AGAIN AND AGAIN and it is disinformation, you fucking shill. GDDR5 accomplishes a read and a write on the same clock. Therefore, its CAS latency is a small piece of the puzzle that is typically a much bigger deal for earlier JEDEC standards.

You are comparing apples to oranges, and then when called out on it, you compare it to magical fairytale technology that doesn't exist yet in regards to the PS4's ability to handle that memory. It's a balanced design, especially if the developers can be arsed to let the CPU and GPU address memory in tandem to cut down on the overall number of processes actually needed to create a result. The CUs can do a lot more of the work if people have the time, man-hours, and budget. Then again, when you have a vocally angry fanbase on the derpbox making a shitstorm over every FPS and resolution difference on their precious machine, that's where you devote your time and money.

Meanwhile, the rest of us have moved on. You can have your 2fps or whatever it is that strokes you so hard on a game that is months old and 100% likely to get another patch that corrects whatever it is in the code on PS4 that spawned this shit thread.


#106  Edited By commander
Member since 2010 • 16217 Posts

@Shewgenja said:
@commander said:

Still doesn't matter, normally high speed memory makes up for latency with bandwith but that bandwith is useless since the cpu is so weak. Gddr5 is great for the gpu of course but does nothing for the cpu.

I might be that the optimizatons nullified the latency difference i can give you that, but mentioning the gddr 5 as a memory advantage because it's shared is bs. The cpu in the ps4 can never utilize that bandwith.

Unless you were expecting Sony to time travel to Cyberdine headquarters after the nuclear war, did you honestly expect a CPU that could take full advantage of 5.5GHz RAM? Intel isn't even coming close to that with their memory controllers. Meanwhile, you are spewing this nonsense AGAIN and AGAIN AND AGAIN and it is disinformation you fucking schill. GDDR5 accomplishes read and write for the same clock. Therefore, it's CAS latency is a small piece of the puzzle that is typically a much bigger deal for earlier JEDEC standards.

You are comparing apples to oranges, and then when called out on it, you compare it to magical fairytale technology that doesn't exist yet in regards to the PS4s ability to handle that memory. It's a balanced design, especially if the developers can be arsed to let the CPU and GPU address memory in tandem to cut down on the overall number of processes actually needed to create a result. The CUs can do a lot more of the work if people have the time, man hours, and budget. Then again, when you have a vocally angry fanbase on the derpbox making a shitstorm over every FPS and resolution difference on their precious machine, that's where you devote your time and money.

Meanwhile, the rest of us have moved on. You can have your 2fps or whatever it is that strokes you so hard on a game that is months-old and 100% likely to get another patch that corrects whatever it is in the code on PS4 that spawned this shit thread.

I never brought up GDDR5 as an argument for CPU performance, you did. Now you're mad that I pointed out the latency and the fact that the PS4 CPU can never make use of that bandwidth.

I told you it might be possible Sony was able to rectify or at least diminish the GDDR5 latency with the PS4's memory controller (although that's highly unlikely since Intel and AMD cannot do it yet), but even then that doesn't change the fact that the PS4 CPU is too weak to make use of the GDDR5 bandwidth, and because of that your argument is totally debunked.

And the difference in framerates is a whole lot more than 2 fps, it's mostly double that, so if the PS4 is running at 26 fps and the X1 at 30 fps, then that's a 15 percent difference.

It's also the difference between smooth gameplay and lag. I can understand that you call it nonsense because you're a rabid cow, but you will just have to suck it up. The PS4 is not the powerhouse fans and Sony make it out to be.


#107  Edited By Shewgenja
Member since 2009 • 21456 Posts

@commander said:

I never brought up gddr5 as an argument for the cpu performance, you did. Now you're mad that i pointed out the latency and the fact that the ps4 cpu can never make use of that bandwith.

I told you it might be possible sony was able to rectify or at least diminish the gddr5 latency with the ps4's memory controller (allthough that's highly unlikely since intel and amd cannot do it yet) but even then that doesn't change the fact that the ps4 cpu is too weak to make use of the gdrr5 bandwith and because of that your argument is totally debunked.

And the difference in framerates is a whole lot more than 2 fps, it's mostly double that so if the ps4 is running at 26 fps and the X1 at 30 fps, then that's a 15 percent difference.

It's also the difference between smooth gameplay and lag, I can understand that you call it nonsense because you're a rabid cow but you will just have to suck it up. The ps4 is not the powerhouse fans and sony make it out to be.

Time will certainly tell. In one corner, you have DX12 magic sauce, and in the other corner you have a GPU designed to offload heavy computation from its barely slower CPU. Look, I'll be honest here. The winner is going to be the one developers put more effort into. If the time and effort are exactly the same, though? Hardware will win every time, and that advantage leans strictly into the realm of the PS4.

On top of that, there are very real limitations to what the XBone can do. For deferred rendering, it has a complete Achilles' heel due to the ESRAM simply not being large enough for a full 1080p framebuffer.
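
For a rough sense of that framebuffer argument, here is a back-of-the-envelope sketch using one assumed deferred-rendering G-buffer layout (real engines vary widely in target count and formats):

```python
# Back-of-the-envelope sketch of the ESRAM argument above, using an assumed
# G-buffer layout (real deferred renderers vary in target count and formats).

WIDTH, HEIGHT = 1920, 1080
ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB

# Hypothetical layout: three RGBA8 colour targets (albedo, normals, material
# parameters) plus a 32-bit depth/stencil buffer = 4 bytes each per pixel.
bytes_per_pixel = 4 * 4

gbuffer_bytes = WIDTH * HEIGHT * bytes_per_pixel
print(f"G-buffer at 1080p: {gbuffer_bytes / 2**20:.1f} MB, "
      f"ESRAM: {ESRAM_BYTES / 2**20:.0f} MB")
# ~31.6 MB before any HDR lighting buffer or MSAA, which is why engines either
# drop resolution or split render targets between ESRAM and DDR3.
```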


#108 babyjoker1221
Member since 2015 • 1313 Posts

@Shewgenja: You're getting utterly destroyed here bud. Perhaps you should just move along to a thread where people won't call you out on your bs.


#109 ermacness
Member since 2005 • 10613 Posts

@commander said:
@tormentos said:
@commander said:
@MonsieurX said:

But the 720p/900p games

But slideshows...

Rather than improve PS4 and Xbox One's frame-rates as expected, patch 1.07 actually shows a regression for each. Microsoft's platform now appears to use a double-buffer method of v-sync to match PS4, causing it to lock at 20fps when the going gets tough in the swamps.

Did you even read.?

Hahahahaaaaaaaaaaaaa.....

Both drops into the 20 while the PS4 has 40% higher resolution which never changes,since this game was falsely claim to have dynamic resolution of 1080p but in reality is only 1080p on the menu never did DF recorded a 1080p moment gameplay wise..hahaha

Maybe you should learn how to read.

.

It's a set-up that matches PlayStation 4, which also locks at 20fps for this section of gameplay (swamps)

Our analysis shows Microsoft's console retains its advantage in frame-rate over the PS4

Compared side-by-side, the gap has narrowed between PS4 and Xbox One, but only due to the latter's apparent degradation in performance, forcing the two to run at a matching 20fps in challenging areas. Otherwise, PS4's performance profile on patch 1.07 doesn't give us much to celebrate - it still stutters in places where Xbox One runs at a perfectly smooth 30fps, and in cut-scenes, Sony's console produces the lower readings overall.

The matter of the fact is that sony cannot hide it's inferiority anymore. It can produce 1080p but it cannot produce acceptable framerates. It has been shown in numerous games. When we're talking current gen games, the cpu cannot keep up. I said it a year ago and i say it again, the xbox one is a balanced system, the ps4 isn't.

Enjoy your powerpoint station 4

What's even funnier is that you generalize the power of the PS4 off of one (and I repeat ......ONE) game, when 90-95% of multiplats this gen all look and run better on the PS4. Get at least 30% of multiplats to run better (because we both know that getting them to look better on the X1 is a massive chore) before we start calling the PS4 the "Powerpoint Station 4".


#110  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@ermacness said:
@commander said:
@tormentos said:
@commander said:
@MonsieurX said:

But the 720p/900p games

But slideshows...

Rather than improve PS4 and Xbox One's frame-rates as expected, patch 1.07 actually shows a regression for each. Microsoft's platform now appears to use a double-buffer method of v-sync to match PS4, causing it to lock at 20fps when the going gets tough in the swamps.

Did you even read.?

Hahahahaaaaaaaaaaaaa.....

Both drops into the 20 while the PS4 has 40% higher resolution which never changes,since this game was falsely claim to have dynamic resolution of 1080p but in reality is only 1080p on the menu never did DF recorded a 1080p moment gameplay wise..hahaha

Maybe you should learn how to read.

.

It's a set-up that matches PlayStation 4, which also locks at 20fps for this section of gameplay (swamps)

Our analysis shows Microsoft's console retains its advantage in frame-rate over the PS4

Compared side-by-side, the gap has narrowed between PS4 and Xbox One, but only due to the latter's apparent degradation in performance, forcing the two to run at a matching 20fps in challenging areas. Otherwise, PS4's performance profile on patch 1.07 doesn't give us much to celebrate - it still stutters in places where Xbox One runs at a perfectly smooth 30fps, and in cut-scenes, Sony's console produces the lower readings overall.

The matter of the fact is that sony cannot hide it's inferiority anymore. It can produce 1080p but it cannot produce acceptable framerates. It has been shown in numerous games. When we're talking current gen games, the cpu cannot keep up. I said it a year ago and i say it again, the xbox one is a balanced system, the ps4 isn't.

Enjoy your powerpoint station 4

What's even funnier is that you generalize the power of the ps4 off of one (and I repeat ......ONE) game when 90-95% of multiplats this gen all look, and run better on the ps4. Get at least 30% of multiplats to run better (because we both know that getting them to look better on the x1 is a massive chore) before we start calling the ps4 the "Powerpoint Station 4"

Not to mention, all this thread proves is that CD Projekt Red is absolute shit at console optimization. lol.

But it's par for the course with a commander thread, dude's a loony lem.


#111  Edited By commander
Member since 2010 • 16217 Posts

@Shewgenja said:
@commander said:

I never brought up gddr5 as an argument for the cpu performance, you did. Now you're mad that i pointed out the latency and the fact that the ps4 cpu can never make use of that bandwith.

I told you it might be possible sony was able to rectify or at least diminish the gddr5 latency with the ps4's memory controller (allthough that's highly unlikely since intel and amd cannot do it yet) but even then that doesn't change the fact that the ps4 cpu is too weak to make use of the gdrr5 bandwith and because of that your argument is totally debunked.

And the difference in framerates is a whole lot more than 2 fps, it's mostly double that so if the ps4 is running at 26 fps and the X1 at 30 fps, then that's a 15 percent difference.

It's also the difference between smooth gameplay and lag, I can understand that you call it nonsense because you're a rabid cow but you will just have to suck it up. The ps4 is not the powerhouse fans and sony make it out to be.

Time will certainly tell. In one corner, you have DX12 magic sauce and in the other corner you have a GPU designed to offload heavy computation from it's barely slower CPU. Look, I'll be honest here. The winner is going to be the one developers put more effort into. If the time and effort are the exact same, though? Hardware will win every time and that advantage strictly leans into the realm of the PS4.

On top of that, there are very real limitations to what the XBone can do. For differed rendering, it has a complete achilles heel due to the ESRAM simply not being large enough for a full 1080p framebuffer.

Well, you've certainly got a point there that the PS4 has a much stronger GPU and that the Xbox One is not good at 1080p, but for some current-gen games the PS4 doesn't seem to be very good either. Sometimes I think Sony pushes devs to output at 1080p, and then you see the X1 getting better frames. A lot of devs realized the X1 cannot cope with 1080p, and that was its saving grace. It's a shame for a number of games; GTA V would have been a much better game on the X1 if they had released it at 900p instead of 1080p.

However, as you can see, the PS4 struggles with 1080p as well, not as much as the X1, but it won't be as easy as just lowering the resolution; devs need to get creative to transfer that power. You might be right that compute offloading solves a lot for the PS4, but DX12 can solve a lot for the X1 too. Don't forget, DirectX is technology made by Microsoft.

Time will indeed tell; a lot of games will release over the following 6 months to a year, and it will certainly give us a lot of SW material.


#112  Edited By RoboCopISJesus
Member since 2004 • 2225 Posts

Why is PS4 so weak?


#113  Edited By commander
Member since 2010 • 16217 Posts

@ReadingRainbow4 said:
@ermacness said:
@commander said:

Maybe you should learn how to read.

.

It's a set-up that matches PlayStation 4, which also locks at 20fps for this section of gameplay (swamps)

Our analysis shows Microsoft's console retains its advantage in frame-rate over the PS4

Compared side-by-side, the gap has narrowed between PS4 and Xbox One, but only due to the latter's apparent degradation in performance, forcing the two to run at a matching 20fps in challenging areas. Otherwise, PS4's performance profile on patch 1.07 doesn't give us much to celebrate - it still stutters in places where Xbox One runs at a perfectly smooth 30fps, and in cut-scenes, Sony's console produces the lower readings overall.

The matter of the fact is that sony cannot hide it's inferiority anymore. It can produce 1080p but it cannot produce acceptable framerates. It has been shown in numerous games. When we're talking current gen games, the cpu cannot keep up. I said it a year ago and i say it again, the xbox one is a balanced system, the ps4 isn't.

Enjoy your powerpoint station 4

What's even funnier is that you generalize the power of the ps4 off of one (and I repeat ......ONE) game when 90-95% of multiplats this gen all look, and run better on the ps4. Get at least 30% of multiplats to run better (because we both know that getting them to look better on the x1 is a massive chore) before we start calling the ps4 the "Powerpoint Station 4"

Not to mention all this thread proves is that CD Projekt red is absolute shit at console optimization. lol.

But it's par for the course with a commander thread, dudes a loony lem.

Horseshit, other devs have had problems with CPU power on the PS4 as well.

Rockstar, Ubisoft... they must be noobs when it comes to console optimization. After all, they're only among the longest-running and biggest names in this business.


#114 ermacness
Member since 2005 • 10613 Posts

@commander said:
@ReadingRainbow4 said:
@ermacness said:
@commander said:

Maybe you should learn how to read.

.

It's a set-up that matches PlayStation 4, which also locks at 20fps for this section of gameplay (swamps)

Our analysis shows Microsoft's console retains its advantage in frame-rate over the PS4

Compared side-by-side, the gap has narrowed between PS4 and Xbox One, but only due to the latter's apparent degradation in performance, forcing the two to run at a matching 20fps in challenging areas. Otherwise, PS4's performance profile on patch 1.07 doesn't give us much to celebrate - it still stutters in places where Xbox One runs at a perfectly smooth 30fps, and in cut-scenes, Sony's console produces the lower readings overall.

The matter of the fact is that sony cannot hide it's inferiority anymore. It can produce 1080p but it cannot produce acceptable framerates. It has been shown in numerous games. When we're talking current gen games, the cpu cannot keep up. I said it a year ago and i say it again, the xbox one is a balanced system, the ps4 isn't.

Enjoy your powerpoint station 4

What's even funnier is that you generalize the power of the ps4 off of one (and I repeat ......ONE) game when 90-95% of multiplats this gen all look, and run better on the ps4. Get at least 30% of multiplats to run better (because we both know that getting them to look better on the x1 is a massive chore) before we start calling the ps4 the "Powerpoint Station 4"

Not to mention all this thread proves is that CD Projekt red is absolute shit at console optimization. lol.

But it's par for the course with a commander thread, dudes a loony lem.

Horseshit, other devs have had problems with cpu power on the ps4 as well.

rockstar, ubisoft, they must be noobs when it comes to console optimization. After all they're only the longest and biggest in this business.

Just like they did with the X1. If it were the other way around, those games would not only run but also look better on the X1.

Thanks for playing ; )


#115  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Shewgenja said:
@commander said:

Still doesn't matter, normally high speed memory makes up for latency with bandwith but that bandwith is useless since the cpu is so weak. Gddr5 is great for the gpu of course but does nothing for the cpu.

I might be that the optimizatons nullified the latency difference i can give you that, but mentioning the gddr 5 as a memory advantage because it's shared is bs. The cpu in the ps4 can never utilize that bandwith.

Unless you were expecting Sony to time travel to Cyberdine headquarters after the nuclear war, did you honestly expect a CPU that could take full advantage of 5.5GHz RAM? Intel isn't even coming close to that with their memory controllers. Meanwhile, you are spewing this nonsense AGAIN and AGAIN AND AGAIN and it is disinformation you fucking schill. GDDR5 accomplishes read and write for the same clock. Therefore, it's CAS latency is a small piece of the puzzle that is typically a much bigger deal for earlier JEDEC standards.

You are comparing apples to oranges, and then when called out on it, you compare it to magical fairytale technology that doesn't exist yet in regards to the PS4s ability to handle that memory. It's a balanced design, especially if the developers can be arsed to let the CPU and GPU address memory in tandem to cut down on the overall number of processes actually needed to create a result. The CUs can do a lot more of the work if people have the time, man hours, and budget. Then again, when you have a vocally angry fanbase on the derpbox making a shitstorm over every FPS and resolution difference on their precious machine, that's where you devote your time and money.

Meanwhile, the rest of us have moved on. You can have your 2fps or whatever it is that strokes you so hard on a game that is months-old and 100% likely to get another patch that corrects whatever it is in the code on PS4 that spawned this shit thread.

Intel's memory controllers have superior access latency compared to AMD's DDRx and GDDR5 memory controllers.

That's one of the low-hanging fruits behind why Intel CPUs beat AMD CPUs.


#116  Edited By BobRossPerm
Member since 2015 • 2886 Posts

So what exactly is so much better about the Xbone's CPU?


#117 BobRossPerm
Member since 2015 • 2886 Posts
@ten_pints said:

"but the X1 get's an even bigger performance hit." ???

And swiftly he ignores the **** out of you :)


#118 soulitane
Member since 2010 • 15091 Posts

@bobrossperm said:

So what exactly is so much better about the Xbones CPU?

It's clocked ever so slightly higher than the PS4's, thus negating all other technical shortcomings of the XO.


#119 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

I wonder if PS4 users wouldn't be happier if their game ran at a lower resolution but with good FPS. And I wonder if Sony, for marketing reasons, has some clause that forces the game to run at 1080p. Honestly, I see no other reason. Why insist on a resolution that the hardware evidently can't handle?

I don't think a game has to run at 60fps to be good, but constantly dropping to around 20fps is just ridiculous, especially considering that at a lower resolution the game would be much better for players in general. To me, W3 was one of the best RPGs I've played (on PC), and I think it's very sad that PS4 users have the game kind of ruined because of some very vocal fanboys and, worse, because Sony chooses to feed their resolution hysteria.

Just look at the Wii people. The Super Mario Galaxy series and Metroid Prime, as examples, are still superb games despite running at lower resolutions. IMO still better than any game released for X1 or PS4.


#120  Edited By commander
Member since 2010 • 16217 Posts

@phbz Very wise words, my friend.

@bobrossperm said:
@ten_pints said:

"but the X1 get's an even bigger performance hit." ???

And swiftly he ignores the **** out of you :)

I ignored him because he didn't understand the context of the sentence; apparently you don't either.


#121 ronvalencia
Member since 2008 • 29612 Posts

@Shewgenja said:
@commander said:

I never brought up gddr5 as an argument for the cpu performance, you did. Now you're mad that i pointed out the latency and the fact that the ps4 cpu can never make use of that bandwith.

I told you it might be possible sony was able to rectify or at least diminish the gddr5 latency with the ps4's memory controller (allthough that's highly unlikely since intel and amd cannot do it yet) but even then that doesn't change the fact that the ps4 cpu is too weak to make use of the gdrr5 bandwith and because of that your argument is totally debunked.

And the difference in framerates is a whole lot more than 2 fps, it's mostly double that so if the ps4 is running at 26 fps and the X1 at 30 fps, then that's a 15 percent difference.

It's also the difference between smooth gameplay and lag, I can understand that you call it nonsense because you're a rabid cow but you will just have to suck it up. The ps4 is not the powerhouse fans and sony make it out to be.

Time will certainly tell. In one corner, you have DX12 magic sauce and in the other corner you have a GPU designed to offload heavy computation from it's barely slower CPU. Look, I'll be honest here. The winner is going to be the one developers put more effort into. If the time and effort are the exact same, though? Hardware will win every time and that advantage strictly leans into the realm of the PS4.

On top of that, there are very real limitations to what the XBone can do. For differed rendering, it has a complete achilles heel due to the ESRAM simply not being large enough for a full 1080p framebuffer.

XBO has a new split-rendering feature, with the 32MB ESRAM handling most of the framebuffer operations while DDR3 handles the residual framebuffer operations. This was shown at GDC 2015, and you are being too stupid to notice.


#122 Heil68
Member since 2004 • 60712 Posts

Add to it that the PS4 is 50% more powerful than the X1 and you know the X1 is in for some real trouble this gen.

Oh my.


#123 04dcarraher
Member since 2004 • 23829 Posts

@Heil68 said:

Add to it that the PS4 is 50% more powerful than the X1 and you know the X1 is in for some real trouble this gen.

Oh my.

You know that the PS4 GPU is not 50% faster... since its GPU processing power is only 29% more than the X1, and its texture rate is again only 29% faster. Now, the pixel rate is 47% more. Overall the PS4 GPU is between 35-40% faster... Then CPU-wise, right now the X1 has an advantage with its 150MHz upclock across 6+1 (80%) cores, which adds around 20% more CPU power over the PS4.
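
For what it's worth, here is a sketch of the arithmetic that lands in the neighbourhood of that ~20% figure, under the assumptions stated in the post (6 cores plus 80% of a 7th for the X1 at 1.75GHz, 6 cores for the PS4 at 1.6GHz); whether those reservations hold for a given SDK version is a separate question:

```python
# Quick arithmetic behind the "~20% more CPU" claim, under the assumptions in
# the post above: X1 at 1.75 GHz with 6 cores plus 80% of a 7th, PS4 at
# 1.6 GHz with 6 cores available to games. These reservations are assumptions,
# not confirmed figures.

x1_core_ghz = (6 + 0.8) * 1.75   # 11.9 aggregate core-GHz
ps4_core_ghz = 6 * 1.6           # 9.6 aggregate core-GHz

advantage = x1_core_ghz / ps4_core_ghz - 1
print(f"X1 aggregate: {x1_core_ghz:.1f}, PS4 aggregate: {ps4_core_ghz:.1f}, "
      f"advantage: {advantage:.0%}")   # ~24%
```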


#124 commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@Heil68 said:

Add to it that the PS4 is 50% more powerful than the X1 and you know the X1 is in for some real trouble this gen.

Oh my.

you know that PS4 gpu is not 50% faster...... since the gpu processing power is only 29% more than X1, and its texture rate is again only 29% faster. now the pixel rate is 47% more. Over all the PS4 gpu is between 35-40% faster..... Then cpu wise right now X1 has an advantage with is 150mhz upclock across 6+1(80%) cores. Which adds around 20% more cpu power over PS4.

Yeah, these fanboys, lol. They simply don't realize that some games just play smoother on the X1, albeit at 900p resolution, but I always lower settings on the PC to get smoother framerates.

I also have an X1.


#125  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:
@Heil68 said:

Add to it that the PS4 is 50% more powerful than the X1 and you know the X1 is in for some real trouble this gen.

Oh my.

you know that PS4 gpu is not 50% faster...... since the gpu processing power is only 29% more than X1, and its texture rate is again only 29% faster. now the pixel rate is 47% more. Over all the PS4 gpu is between 35-40% faster..... Then cpu wise right now X1 has an advantage with is 150mhz upclock across 6+1(80%) cores. Which adds around 20% more cpu power over PS4.

The XBO can work around its ROPS limit via the TMUs, hence it's memory-bandwidth bound. This workaround pairs well with async compute shaders.


#126  Edited By QuadKnight
Member since 2015 • 12916 Posts

I find it funny when lems make graphics threads like this. OP is rejoicing for one unoptimized game that happens to "run better" on Xbone when 99% of the games out there look and run better on PS4. Dat lem hypocrisy.


#127 Guy_Brohski
Member since 2013 • 2221 Posts

I blame this all on Nintendo. When the pitifully underpowered Wii was released in 2006 and sold like hotcakes, it told Sony and MS that this gimping hardware thing was okay. Note to console makers, no it's not okay to release pitifully underpowered hardware, you bunch of boobs.


#128 asylumni
Member since 2003 • 3304 Posts

@04dcarraher: I'm curious, how did you arrive at the 29% numbers for gpu power and texture rate and 47% for pixel fill rate?


#129 ReadingRainbow4
Member since 2012 • 18733 Posts

@quadknight said:

I find it funny when lems make graphics threads like this. OP is rejoicing for one unoptimized game that happens to "run better" on Xbone when 99% of the games out there look and run better on PS4. Dat lem hypocrisy.

The best part is this update lowered framerates across both versions, and this jackass is using it as a way to champion his favorite piece of plastic. But remember guys, CD Projekt Red is fantastic at console optimization, hur hur hur.


#131  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@asylumni said:

@04dcarraher: I'm curious, how did you arrive at the 29% numbers for gpu power and texture rate and 47% for pixel fill rate?

Look at the GPU specs' performance numbers.

1.31 TFLOPS is 71% of the compute power of 1.84 TFLOPS, which means there is a 29% difference there. Then look at texture fill rates and pixel fill rates: 40.9 GTex/s vs 57.6 GTex/s, and again 40.9 is 71% of 57.6, meaning a 29% difference. Then for pixels, 13.6 GPix/s vs 25.6 GPix/s: 13.6 is 53% of 25.6, meaning a 47% difference in performance in that area.


#132 asylumni
Member since 2003 • 3304 Posts

@04dcarraher: Ah, the raw numbers look right, you just used bad math. What that actually calculated was how much less the Xbox One can do than the PS4 in each case. If you want to know how much more the PS4 does compared to the Xbox One, you divide the PS4 number by the Xbox number. This gives 140%, 140% & 188%; therefore 40% more teraflops, 40% more GTex/s and 88% more GPix/s, respective to your order. To put it another way, before the overclock, the PS4 was rated at twice the pixel fill rate of the Xbox One. That's 100% more than the Xbox One (or 200% of it). The math would be 25.6/12.8 = 200%, or 100% more.
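
The disagreement in this exchange really comes down to which console is taken as the 100% baseline; a minimal sketch using the TFLOPS figures quoted in the thread makes the two phrasings explicit:

```python
# The two ways of expressing the same gap: the answer depends on which console
# you take as the 100% baseline.

ps4_tflops, x1_tflops = 1.84, 1.31

more_than_x1 = ps4_tflops / x1_tflops - 1    # PS4 relative to an X1 baseline
less_than_ps4 = 1 - x1_tflops / ps4_tflops   # X1 relative to a PS4 baseline

print(f"PS4 has {more_than_x1:.0%} more compute than X1")   # ~40%
print(f"X1 has {less_than_ps4:.0%} less compute than PS4")  # ~29%
```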


#133  Edited By commander
Member since 2010 • 16217 Posts

@asylumni said:

@04dcarraher: Ah, the raw numbers look right, you just used bad math. What that actually calculated was how much less the Xbox One could do each than the PS4. If you want to know how much more the PS4 does compared to the Xbox One, you divide the PS4 number by the Xbox number. This gives 140%, 140% & 188%, therefore 40% Teraflops more, 40% GTex/s more and 88% GPix/s more, respective to your order. To put it another way, before the overclock, the PS4 was rated at twice the pixel fill rate of the Xbox One. That's 100% more than the Xbox One (or 200%). The math would be 25.6/12.8=200% or, 100% more.

You're right about that; unfortunately this comparison doesn't consider the ESRAM. Of course, even with the ESRAM the PS4's GPU will be significantly stronger, but not 40 percent, far from it, if the devs use the ESRAM SDK, that is, and not all of them do, just like not all of them use GPGPU tools on the PS4.

I'm curious what DX12 will do; the PS4 GPU won't be fast enough to use texture tiling as in DX12 and will simply load all textures into memory like it does now, but DX12 will be more optimized to make use of that ESRAM speed.


#134  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@asylumni:

To compare the hardware performance? Not really. While yes, the math you use is correct, going from the lower number and taking the difference needed to reach the higher number does not reflect the real performance.

If the PS4 had a true 88% more pixel rate, etc., you wouldn't see the PS4 only being one resolution step above the X1, i.e. 720p vs 900p or 900p vs 1080p, or even at the same resolution locked to the same settings.

Just like, say, the GTX 960 vs GTX 970: an 86% pixel rate difference, along with 51% more compute and texel power, yet only around an average 58% performance difference between them at 1080p. Even at 1440p there is only around an average 62% performance difference.

This is why you should take the differences from the GPU numbers between them as they are, because they more realistically reflect the performance differences.

The PS4 GPU is overall around 40% stronger.


#135 lordlors
Member since 2004 • 6128 Posts

Console hardware just sucks. Plain and simple. :)


#136 asylumni
Member since 2003 • 3304 Posts

@04dcarraher said:

@asylumni:

To compare the hardware performance not really . While yes that math you use is correct going from lowest set number then taking the difference needed to reach the higher set number. does not reflect the real performance.

If PS4 had a true 88% more pixel rate etc you wouldn't see the PS4 only being one resolution step above the X1. ie 720p vs 900p or 900p vs 1080p. Or even at same resolution locked at same set standards.

Just like say GTX 960 vs GTX 970 having a 86% more pixel rate difference along with 51% more compute and texel power, with 1 yet only sees an average of 58% performance between them at 1080p. Even at 1440p there is only around an average 62% performance difference.

This is why you should take the difference from gpu numbers between them as they are because they more realistically reflect the performance differences.

PS4 gpu is overall around 40% stronger

The problem with bad math is, even if you get close to the number you want, it's still bad math. You can't just do random operations, stop when you get the number you want, and call it good. In this instance, the pixel fill rate is a bit misleading. There is no way the PS4 will ever get anywhere close to its rated capacity for even half a second, since the most the PS4 will probably ever be asked to output is less than 249 million pixels/second (1080p x 60fps x 2 for stereoscopic 3D). So clearly, pixel fill rate isn't the most accurate comparison tool and needs context. Since these are two GPUs based on the same architecture and measured the same way, texture fill rate or even GFLOPS is a much more accurate method of comparison. This is borne out by the 40% PS4 advantage matching up very closely to the 40% single-step resolution difference you cited. This is not the 29% nor the 47% advantage your bad math came up with. Your GTX 960 vs GTX 970 example also shows the compute and texture differences matching much closer to the real-world difference (the 970 also has twice the memory bandwidth to help feed the GPU).
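
The arithmetic behind that 249 million figure, as a quick sketch (deliberately ignoring overdraw and off-screen passes, which is part of why the rated number needs context):

```python
# The "rated fill rate vs pixels actually displayed" arithmetic from the post.
# Overdraw and off-screen render passes are ignored here, which is exactly why
# the rated number needs context.

width, height, fps = 1920, 1080, 60
stereo = 2  # stereoscopic 3D doubles the output

displayed_pixels_per_s = width * height * fps * stereo
rated_fill_rate = 25.6e9  # PS4's quoted 25.6 GPix/s

print(f"Displayed: {displayed_pixels_per_s / 1e6:.0f} MPix/s "
      f"vs rated: {rated_fill_rate / 1e9:.1f} GPix/s")
# ~249 MPix/s displayed vs 25.6 GPix/s rated capacity
```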


#137 04dcarraher
Member since 2004 • 23829 Posts

@asylumni:

That's not the case. It's not bad math; the gap between the performance numbers is correct and it reflects what we are seeing.


#138 asylumni
Member since 2003 • 3304 Posts

@commander said:
@asylumni said:

@04dcarraher: Ah, the raw numbers look right, you just used bad math. What that actually calculated was how much less the Xbox One could do each than the PS4. If you want to know how much more the PS4 does compared to the Xbox One, you divide the PS4 number by the Xbox number. This gives 140%, 140% & 188%, therefore 40% Teraflops more, 40% GTex/s more and 88% GPix/s more, respective to your order. To put it another way, before the overclock, the PS4 was rated at twice the pixel fill rate of the Xbox One. That's 100% more than the Xbox One (or 200%). The math would be 25.6/12.8=200% or, 100% more.

You're right about that, unfortunately this comparison doesn't consider the esram. Of course even with the esram the ps4's gpu will be significantly stronger, but not 40 percent, far from that, if the devs use the esram sdk that is and not all of them do that, just like not all of them use gpgpu tools on the ps4.

I'm curious what dx12 will do, the ps4 gpu won't be fast enough to use texture tiling in dx12 and will simply load all textures into the memory like it does now but dx 12 will be more optimized to make use of that esram speed.

Well, no, this comparison is just among the processing abilities of the respective GPUs and doesn't account for the benefit of the ESRAM, nor the detriment of the DDR3 RAM (solely considering the GPU side). But what are you referring to as texture tiling? Is it PRT or megatextures, which the PS3 was fast enough to use? Or is it tiled rendering, which, with a single high-bandwidth memory pool, you wouldn't want to use on the PS4? Overall, I wouldn't look for big performance gains (in fps or resolution) from DX12 on the Xbox One, but more consistency in frame times with less stuttering. MS has a long history of hyping up the new DX with misleading benchmarks, and while the improvements really are there and they are really good, it's not the holy grail they advertise it to be.


#139  Edited By ronvalencia
Member since 2008 • 29612 Posts

@asylumni: For GCN, raw numbers mean very little when they are bound by memory bandwidth and other design issues.

The gap between XBO and PS4 changes depending on the bottlenecks in the workload.

In classic raster workloads, the TMU reads texture data and the ROP writes the result to memory. Compute shaders change this, with the TMUs doing the memory writes, hence a workaround for XBO's 16-ROPS issue.

On the pixel difference, IF they have similar frame rates:

1920x1080 = 2073600 pixels

1600x900 = 1440000 pixels

2073600 / 1440000 = 1.44, i.e. PS4's GPU pushes 44 percent more pixels than XBO.

1440000 / 2073600 = 0.69, i.e. XBO pushes 31 percent fewer pixels than PS4.

TFLOPS difference:

1.84 / 1.31 = 1.40, i.e. PS4's GPU has 40 percent more FLOPS than XBO.

1.31 / 1.84 = 0.71, i.e. XBO has 29 percent fewer FLOPS than PS4.

The gap would narrow if XBO had higher frame rates at 1600x900 than PS4 has at 1920x1080.

If both XBO and PS4 aimed for 1920x1080 and a target close to 60 fps:

PS4 = 60 fps

XBO = 42 fps, i.e. 71 percent of 60 fps.

If both XBO and PS4 aimed for 1920x1080 and a target close to 30 fps:

PS4 = 42 fps, a 40 percent higher frame rate than XBO

XBO = 30 fps
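
The same scaling as a runnable sketch (rounded slightly differently from the post), with the obvious caveat that real games are not purely FLOPS-bound:

```python
# Naive linear scaling from the post: frame rate assumed to scale directly
# with FLOPS at a fixed resolution. A big "if" - memory bandwidth, ROPs and
# CPU limits all bend this in practice.

ps4_tflops, x1_tflops = 1.84, 1.31
ratio = x1_tflops / ps4_tflops  # ~0.71

for ps4_target in (60, 42):
    x1_fps = ps4_target * ratio
    print(f"PS4 {ps4_target} fps -> X1 ~{x1_fps:.0f} fps")
# PS4 60 fps -> X1 ~43 fps; PS4 42 fps -> X1 ~30 fps
```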


#140 asylumni
Member since 2003 • 3304 Posts

@04dcarraher said:

@asylumni:

That's not the case, its not bad math, the gap between the performance numbers is correct and it reflects what we are seeing.

It is bad math, here's what you said...

@04dcarraher said:

you know that PS4 gpu is not 50% faster...... since the gpu processing power is only 29% more than X1, and its texture rate is again only 29% faster. now the pixel rate is 47% more. Over all the PS4 gpu is between 35-40% faster..... Then cpu wise right now X1 has an advantage with is 150mhz upclock across 6+1(80%) cores. Which adds around 20% more cpu power over PS4.

We'll just do the GPU processing power, the rest are similar.

You claim the PS4 is 29% more than the Xbox One.

The Xbox One is rated at 1.31 TFLOPS and the PS4 is rated at 1.84 TFLOPS.

29% of the Xbox One performance is ~.380 TFLOPS.

Therefore, 29% (.380TFLOPS) more than the Xbox One (1.31 TFLOPS) is 1.69 TFLOPS, not 1.84 TFLOPS (PS4).

Bad math.


#141 asylumni
Member since 2003 • 3304 Posts

@ronvalencia said:

@asylumni: For GCN, raw numbers means very little nothing when they bound by memory bandwidth and other design issues.

Very little nothing? As opposed to a whole lot of nothing? :P

I would think they were pretty significant when you're talking about the same architecture and the same developer. Sure, each variable introduces more inconsistency, but we're not talking Naughty Dog vs 343. This whole thing is more about multiplatform development, where it's not too unreasonable that each studio trying to get the most out of the hardware would be similarly skilled with regard to each platform and able to draw similar efficiency from both GPUs, in time. Given one single developer, I think the raw numbers are fairly reliable.


#142  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@asylumni said:
@04dcarraher said:

@asylumni:

That's not the case, its not bad math, the gap between the performance numbers is correct and it reflects what we are seeing.

It is bad math, here's what you said...

@04dcarraher said:

you know that PS4 gpu is not 50% faster...... since the gpu processing power is only 29% more than X1, and its texture rate is again only 29% faster. now the pixel rate is 47% more. Over all the PS4 gpu is between 35-40% faster..... Then cpu wise right now X1 has an advantage with is 150mhz upclock across 6+1(80%) cores. Which adds around 20% more cpu power over PS4.

We'll just do the GPU processing power, the rest are similar.

You claim the PS4 is 29% more than the Xbox One.

The Xbox One is rated at 1.31 TFLOPS and the PS4 is rated at 1.84 TFLOPS.

29% of the Xbox One performance is ~.380 TFLOPS.

Therefore, 29% (.380TFLOPS) more than the Xbox One (1.31 TFLOPS) is 1.69 TFLOPS, not 1.84 TFLOPS (PS4).

Bad math.

You're not understanding.

1.31 is 71% of PS4's 1.84.

The X1 has 71% of PS4's compute power; 100 - 71 = a 29 difference.

13.6 is 53% of PS4's 25.6.

The X1 has 53% of the pixel fill rate of PS4, a 47% gap; 100 - 53 = a 47 difference.

You don't take 29% of X1's number to get the difference.

That's how you compare one GPU's power against another.

For example, the GTX 970 has 90% of the performance of Fury, a 10% difference; or take the 270X, which has 49% of the performance of Fury.


#143  Edited By ellos
Member since 2015 • 2532 Posts

@asylumni said:
@commander said:
@asylumni said:

@04dcarraher: Ah, the raw numbers look right, you just used bad math. What that actually calculated was how much less the Xbox One could do each than the PS4. If you want to know how much more the PS4 does compared to the Xbox One, you divide the PS4 number by the Xbox number. This gives 140%, 140% & 188%, therefore 40% Teraflops more, 40% GTex/s more and 88% GPix/s more, respective to your order. To put it another way, before the overclock, the PS4 was rated at twice the pixel fill rate of the Xbox One. That's 100% more than the Xbox One (or 200%). The math would be 25.6/12.8=200% or, 100% more.

You're right about that, unfortunately this comparison doesn't consider the esram. Of course even with the esram the ps4's gpu will be significantly stronger, but not 40 percent, far from that, if the devs use the esram sdk that is and not all of them do that, just like not all of them use gpgpu tools on the ps4.

I'm curious what dx12 will do, the ps4 gpu won't be fast enough to use texture tiling in dx12 and will simply load all textures into the memory like it does now but dx 12 will be more optimized to make use of that esram speed.

Well, no, this comparison is just among the processing abilities of the respective GPU's and doesn't account for the the benefit of the EDRAM, nor the detriment of the DDR3 RAM (solely considering the GPU side). But what are you referring to as texture tiling? Is it PRT or Megatextures that the PS3 was fast enough to use? Or is it tiled rendering that with a single, high bandwidth memory pool you wouldn't want to use on the PS4? Overall, I wouldn't look to big performance gains (in fps or resolution) from DX12 on the Xbox One but more consistency in frame times with less stuttering. MS has a long history of hyping up the new DX with misleading benchmarks, and while the improvements really are there and they are really good, it's not the holy grail they advertise it to be.

I'm going to bet that he is talking about the old tiled resources and PRT discussion. The way he puts things together doesn't make sense sometimes. I would love for MS to hold a DX12 Xbox One workshop where they talk about the benefits it will bring, with some real-time demonstration. Heck, not just Microsoft; it would be a dream to have Sony talk about their SDK, or to get some sort of leak like the Xbox One SDK updates leak.


#144  Edited By ronvalencia
Member since 2008 • 29612 Posts

@asylumni:

For example, the ROPS color fill rate difference between XBO and PS4 does not result in a similar effective performance gap. This is why some raw numbers are nearly pointless.


#145 Scipio8
Member since 2013 • 937 Posts

It's too bad Sony doesn't care about gamers, only about marketing: as long as they can hit 1080p, never mind if it runs at 10 FPS.


#146 asylumni
Member since 2003 • 3304 Posts

@04dcarraher said:
@asylumni said:
@04dcarraher said:

@asylumni:

That's not the case, its not bad math, the gap between the performance numbers is correct and it reflects what we are seeing.

It is bad math, here's what you said...

@04dcarraher said:

you know that PS4 gpu is not 50% faster...... since the gpu processing power is only 29% more than X1, and its texture rate is again only 29% faster. now the pixel rate is 47% more. Over all the PS4 gpu is between 35-40% faster..... Then cpu wise right now X1 has an advantage with is 150mhz upclock across 6+1(80%) cores. Which adds around 20% more cpu power over PS4.

We'll just do the GPU processing power, the rest are similar.

You claim the PS4 is 29% more than the Xbox One.

The Xbox One is rated at 1.31 TFLOPS and the PS4 is rated at 1.84 TFLOPS.

29% of the Xbox One performance is ~.380 TFLOPS.

Therefore, 29% (.380TFLOPS) more than the Xbox One (1.31 TFLOPS) is 1.69 TFLOPS, not 1.84 TFLOPS (PS4).

Bad math.

Your not understanding

1.31 is 71% of PS4's 1.84

X1 has 71% of PS4's compute power 100-71=29 difference

13.6 is 53% of PS4's 25.6

X1 has 53% of the pixel fillrate of PS4 a 47% gap. 100-53=47 difference

You dont take 29% by X1's number to get the difference.

When your comparing gpu power vs another

And that would be fine if you were saying how much less power the Xbox One has than the PS4. The math works that way:

The Xbox One (1.31 TFLOPS) has 29% less than the PS4 (1.84 - 0.53): 1.84 TFLOPS - 0.53 TFLOPS = 1.31 TFLOPS.

Comparing the Xbox One to the PS4 (i.e., "the Xbox One has 29% less GPU power than the PS4"), the PS4 is your baseline 100%; but comparing the PS4 to the Xbox One (i.e., "the PS4 has 40% more GPU power than the Xbox One"), the Xbox One is your baseline 100%. You did this correctly with the CPU comparison, one way (the Xbox has roughly 20% more CPU power than the PS4) and the other (the PS4, at roughly 83.5%, has roughly 16.5% less CPU power than the Xbox One). You can go either way, but you can't do the math one way and write out the results the other. Setting the baseline at the higher number always gives a smaller % difference.


#147 commander
Member since 2010 • 16217 Posts

@ellos said:
@asylumni said:
@commander said:
@asylumni said:

@04dcarraher: Ah, the raw numbers look right, you just used bad math. What that actually calculated was how much less the Xbox One could do each than the PS4. If you want to know how much more the PS4 does compared to the Xbox One, you divide the PS4 number by the Xbox number. This gives 140%, 140% & 188%, therefore 40% Teraflops more, 40% GTex/s more and 88% GPix/s more, respective to your order. To put it another way, before the overclock, the PS4 was rated at twice the pixel fill rate of the Xbox One. That's 100% more than the Xbox One (or 200%). The math would be 25.6/12.8=200% or, 100% more.

You're right about that; unfortunately this comparison doesn't consider the ESRAM. Of course, even with the ESRAM the PS4's GPU will be significantly stronger, but not by 40 percent, far from it, if the devs use the ESRAM SDK, that is, and not all of them do, just like not all of them use GPGPU tools on the PS4.

I'm curious what DX12 will do. The PS4 GPU won't be fast enough to use texture tiling in DX12 and will simply load all textures into memory like it does now, but DX12 will be more optimized to make use of that ESRAM speed.

Well, no, this comparison is just among the processing abilities of the respective GPUs and doesn't account for the benefit of the EDRAM, nor the detriment of the DDR3 RAM (solely considering the GPU side). But what are you referring to as texture tiling? Is it PRT or Megatextures, which the PS3 was fast enough to use? Or is it tiled rendering, which, with a single high-bandwidth memory pool, you wouldn't want to use on the PS4? Overall, I wouldn't look for big performance gains (in fps or resolution) from DX12 on the Xbox One but more consistency in frame times with less stuttering. MS has a long history of hyping up the new DX with misleading benchmarks, and while the improvements really are there and they are really good, it's not the holy grail they advertise it to be.

I'm going to bet that he is talking about the old Tiled Resources and PRT discussion. The way he puts things together doesn't make sense sometimes. I would love for MS to hold a DX12 Xbox One workshop where they talk about the benefits it will bring, with some real-time demonstration. Heck, not just Microsoft; it would be a dream to have Sony talk about their SDK, or some sort of leak like the Xbox One SDK updates leak.

Texture tiling is not some old discussion; it is already implemented in the DX 11.2 SDK to some extent. Not all developers make good use of it, but they will definitely have more tools available to use it with DX 12.

Texture tiling is basically removing textures from video memory when you're not looking at them. Since you can't see them anyway, it doesn't matter that they are not there. It does indeed have similarities to PRT.

ESRAM is basically the perfect RAM for this kind of tech, since it's the fastest RAM available.
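A toy sketch of that residency idea, purely for illustration; this is not the actual DX 11.2 tiled-resources API, and the names (ResidencyManager, load_tile) are made up. The only real figure is the 64 KB tile size that D3D tiled resources work in.

```python
# Toy illustration of the tiled-texture / PRT idea: keep only the tiles the camera
# can currently see resident in a small pool of fast memory, and stream the rest
# from main memory on demand. NOT the D3D 11.2 API; all names here are invented.

TILE_SIZE = 64 * 1024  # D3D tiled resources work in 64 KB tiles

class ResidencyManager:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes   # e.g. a slice of a small fast-memory pool
        self.resident = {}           # tile_id -> tile data

    def update(self, visible_tiles, load_tile):
        # Evict tiles that are no longer visible this frame...
        for tile_id in list(self.resident):
            if tile_id not in visible_tiles:
                del self.resident[tile_id]
        # ...and stream in newly visible tiles from main memory, within budget.
        for tile_id in visible_tiles:
            if tile_id not in self.resident and (len(self.resident) + 1) * TILE_SIZE <= self.budget:
                self.resident[tile_id] = load_tile(tile_id)

# Usage sketch: a 32 MB budget and a dummy loader standing in for a main-memory read.
mgr = ResidencyManager(budget_bytes=32 * 1024 * 1024)
mgr.update(visible_tiles={3, 7, 42}, load_tile=lambda tid: bytes(TILE_SIZE))
print(len(mgr.resident), "tiles resident")  # 3
```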

Avatar image for asylumni
asylumni

3304

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#148  Edited By asylumni
Member since 2003 • 3304 Posts

@ronvalencia said:

@asylumni: For GCN, raw numbers mean very little when they're bound by memory bandwidth and other design issues.

The gap between XBO and PS4 changes depending on bottlenecks in the workload process.

In classic raster workloads, the TMUs read texture data and the ROPs write the result to memory. Compute shaders change this with TMU memory writes, hence a workaround for XBO's 16-ROP issue.

On the pixel power difference, IF they have similar frame rates:

1920x1080p = 2073600

1600x900p = 1440000

2073600 / 1440000 = 1.44, i.e. PS4's GPU pushes 44 percent more pixels than XBO.

1440000 / 2073600 = 0.69, i.e. XBO pushes 31 percent fewer pixels than PS4.

TFLOPS power difference

1.84 / 1.31 = 1.40, i.e. PS4's GPU has 40 percent more FLOPS than XBO.

1.31 / 1.84 = 0.71, i.e. XBO has 29 percent fewer FLOPS than PS4.

The gap would narrow if XBO had higher frame rates at 1600x900 than PS4 has at 1920x1080.

If both XBO and PS4 aimed for 1920x1080p and close to 60 fps target,

PS4 = 60 fps

XBO = 42 fps, i.e. 71 percent of 60 fps.

If both XBO and PS4 aimed for 1920x1080p and close to 30 fps target,

PS4 = 42 fps, 40 percent higher frame rates than XBO

XBO = 30 fps.

@ronvalencia said:

@asylumni:

For example, the ROPS color fill rate difference between XBO and PS4 does not result in a similar effective performance gap. This is why some raw numbers are nearly pointless.

Ah, you added quite a bit since my reply. But that was my main issue: that he was doing the math for, say, the Xbox One having 29% fewer FLOPS than the PS4, but claiming the PS4 had 29% more.

I also mentioned later how the pixel output figure was a bit misleading, and here's more of why. At 1080p, it would take the PS4 64,800 clock ticks (2,073,600 pixels across 32 ROPs at an 800 MHz GPU clock, or about 0.000081 seconds) to draw the frame, though these ticks are not necessarily consecutive. At 60 FPS, it's putting out a new frame every 0.0167 seconds. Therefore, the writing makes up a mere 0.486% of the frame time, clearly much less of a factor than the fetching and rendering of the frame. Of course, the rendering and drawing happen at the same time, but this is just to show the impact each has on frame time. Even before the overclock, that would put the Xbox One at less than half a percent of frame time difference with half the ROPs. 50% more texture units would have a much greater impact and more closely align with the real-world differences. There are, of course, bandwidth and CPU differences that come into play as well, but with the GPU-heavy programming paradigm both use and the EDRAM helping offset the lower DDR3 bandwidth, I believe the GPU is still the major determining factor.
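A minimal re-check of that fill-rate arithmetic, assuming the figures the post relies on (32 ROPs and an 800 MHz GPU clock for the PS4, and a 60 fps frame time):

```python
# Reproducing the fill-rate figures from the post above.
gpu_clock_hz = 800_000_000      # assumed PS4 GPU clock
rops         = 32               # assumed PS4 ROP count
pixels_1080p = 1920 * 1080      # 2,073,600 pixels

clocks_to_write_frame = pixels_1080p / rops                    # 64,800 clock ticks
write_time_s          = clocks_to_write_frame / gpu_clock_hz   # ~0.000081 s
frame_time_60fps_s    = 1 / 60                                 # ~0.0167 s

share = write_time_s / frame_time_60fps_s * 100
print(f"ROP writes: ~{share:.3f}% of a 60 fps frame")          # ~0.486%
print(f"With half the ROPs: ~{2 * share:.3f}%")                # ~0.972%
```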

Avatar image for asylumni
asylumni

3304

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#149 asylumni
Member since 2003 • 3304 Posts

@commander said:
@ellos said:

I'm going to bet that he is talking about the old Tiled Resources and PRT discussion. The way he puts things together doesn't make sense sometimes. I would love for MS to hold a DX12 Xbox One workshop where they talk about the benefits it will bring, with some real-time demonstration. Heck, not just Microsoft; it would be a dream to have Sony talk about their SDK, or some sort of leak like the Xbox One SDK updates leak.

Texture tiling is not some old discussion; it is already implemented in the DX 11.2 SDK to some extent. Not all developers make good use of it, but they will definitely have more tools available to use it with DX 12.

Texture tiling is basically removing textures from video memory when you're not looking at them. Since you can't see them anyway, it doesn't matter that they are not there. It does indeed have similarities to PRT.

ESRAM is basically the perfect RAM for this kind of tech, since it's the fastest RAM available.

I disagree. The EDRAM does have the highest bandwidth, but it's to the GPU. Therefore, the best use would be for the high-bandwidth functions like MSAA and such. Using it for PRT, you would have to pull from the DDR3 RAM, at which point you would be better off just feeding the GPU rather than moving the data to the EDRAM and then back to the GPU. Sure, the Xbox One has a fixed-function chip that would save the GPU from having to move the data, but it doesn't eliminate the bandwidth usage. And if you leave the bulk of the textures in the EDRAM, then you're using the DDR3 connection for the more bandwidth-heavy functions, resulting in less efficiency.
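A rough sketch of the bandwidth accounting being described here; the per-frame byte figure is arbitrary and only the bookkeeping matters.

```python
# Staging textures through the small fast pool does not remove the main-memory read.
texture_bytes = 8 * 1024 * 1024          # hypothetical 8 MB of texture data per frame

# Path 1: the GPU samples the texture straight out of DDR3.
ddr3_traffic_direct = texture_bytes      # one DDR3 read

# Path 2: copy it into the fast pool first, then sample it from there.
ddr3_traffic_staged = texture_bytes      # the same DDR3 read still happens
fast_pool_traffic   = 2 * texture_bytes  # one write into the pool + one read back out

print(ddr3_traffic_direct == ddr3_traffic_staged)  # True: DDR3 usage isn't eliminated
```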

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#150  Edited By commander
Member since 2010 • 16217 Posts

@asylumni said:
@commander said:
@ellos said:

I'm going to bet that he is talking about the old Tiled Resources and PRT discussion. The way he puts things together doesn't make sense sometimes. I would love for MS to hold a DX12 Xbox One workshop where they talk about the benefits it will bring, with some real-time demonstration. Heck, not just Microsoft; it would be a dream to have Sony talk about their SDK, or some sort of leak like the Xbox One SDK updates leak.

Texture tiling is not some old discussion; it is already implemented in the DX 11.2 SDK to some extent. Not all developers make good use of it, but they will definitely have more tools available to use it with DX 12.

Texture tiling is basically removing textures from video memory when you're not looking at them. Since you can't see them anyway, it doesn't matter that they are not there. It does indeed have similarities to PRT.

ESRAM is basically the perfect RAM for this kind of tech, since it's the fastest RAM available.

I disagree. The EDRAM does have the highest bandwidth, but it's to the GPU. Therefore, the best use would be for the high-bandwidth functions like MSAA and such. Using it for PRT, you would have to pull from the DDR3 RAM, at which point you would be better off just feeding the GPU rather than moving the data to the EDRAM and then back to the GPU. Sure, the Xbox One has a fixed-function chip that would save the GPU from having to move the data, but it doesn't eliminate the bandwidth usage. And if you leave the bulk of the textures in the EDRAM, then you're using the DDR3 connection for the more bandwidth-heavy functions, resulting in less efficiency.

The ESRAM doesn't have to switch the textures back and forth to the system's shared memory; it has its own memory. It doesn't have to be a texture either, it could be a shader as well. The bandwidth to the DDR3 is more than enough to do the initial load of a texture/shader or whatever. DDR3 is actually better for this task because it has lower latency than GDDR5, for instance; since the ESRAM is only 32 MB, GDDR5's higher bandwidth would never make up for DDR3's lower latency when feeding data to the ESRAM.

However, I know the ESRAM will never make up for the gap between the X1 GPU and the PS4 GPU, but it definitely makes the gap smaller.

Btw, why are you referring to the ESRAM as EDRAM? EDRAM was used in the Xbox 360; the Xbox One uses ESRAM.