Why the ps4 isn't stronger than the X1 (updated)

This topic is locked from further discussion.


#251 ReadingRainbow4
Member since 2012 • 18733 Posts

@evildead6789 said:

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

This thread makes lems look so bad man, lol.

It's nice in a way, you can quickly reference this thread for future use if any of these dumb asses deny they ever bought into the secret sauce.

I would say it makes the cows look bad, since they bought into that 'playstation is stronger' again lol

Although this time they didn't have to pay through the nose like with the PS3, now they're stuck with a gimped CPU, lmao

The ps4 is stronger, it has superior hardware in it. No secret sauce was needed.

Of course it has

Glad to see reason has finally reached you.


#252 commander
Member since 2010 • 16217 Posts

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

This thread makes lems look so bad man, lol.

It's nice in a way, you can quickly reference this thread for future use if any of these dumb asses deny they ever bought into the secret sauce.

I would say it makes the cows look bad, since they bought into that 'playstation is stronger' again lol

Although this time they didn't have to pay through the nose like with the PS3, now they're stuck with a gimped CPU, lmao

The ps4 is stronger, it has superior hardware in it. No secret sauce was needed.

Of course it has

Glad to see reason has finally reached you.

The concept of satire hasn't reached you apparently, here let me help you

http://en.wikipedia.org/wiki/Satire


#253 ReadingRainbow4
Member since 2012 • 18733 Posts

@evildead6789 said:

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

This thread makes lems look so bad man, lol.

It's nice in a way, you can quickly reference this thread for future use if any of these dumb asses deny they ever bought into the secret sauce.

I would say it makes the cows look bad, since they bought into that 'playstation is stronger' again lol

Although this time they didn't have to pay through the nose like with the PS3, now they're stuck with a gimped CPU, lmao

The ps4 is stronger, it has superior hardware in it. No secret sauce was needed.

Of course it has

Glad to see reason has finally reached you.

The concept of satire hasn't reached you apparently, here let me help you

http://en.wikipedia.org/wiki/Satire

What is sarcasm?

I'll leave you to your fanboy drivel now :)


#254 btk2k2
Member since 2003 • 440 Posts

@Tighaman: They are both GCN 1.1. I do not need to look at a diagram to calculate bandwidth; you just do the math. Effective memory clock (in Gbps per pin) * bus width (in bits) / 8 = bandwidth in GB/s, so 5.5 * 256 / 8 = 176 GB/s. Sure, ESRAM has no contention, but it is tiny, so even though it is fast it cannot do everything, and the GPU will be hitting the slow main RAM a lot.
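As a quick sanity check of that formula, here is a minimal Python sketch (the per-pin data rates and bus widths used below are the commonly cited public figures, not anything confirmed in this thread):

def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    # peak theoretical bandwidth: data rate per pin (Gbps) * number of pins / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(5.5, 256))    # PS4 GDDR5: 5.5 * 256 / 8 = 176.0 GB/s
print(bandwidth_gbs(2.133, 256))  # X1 DDR3-2133 (assumed figure): ~68.3 GB/s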

SHAPE was confirmed as being for Kinect and voice recognition. There is no advantage in the sound systems; they are both equal for game purposes.

One GPU used as two, or two used as one, is the same damn thing, and there is no way that is how the GPU is designed. It is a single 12 CU design. So you can use shaders to do back-end work? I think not, as that is a different stage in the pipeline.

You know next to nothing about this and you just spout the shit Mister X Media dribbles out. If you want to learn, look at the GCN deep dives done at AnandTech; that is a good start. If you just want to believe fantasy-land bollocks then carry on.


#255 commander
Member since 2010 • 16217 Posts

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

@evildead6789 said:

@ReadingRainbow4 said:

This thread makes lems look so bad man, lol.

It's nice in a way, you can quickly reference this thread for future use if any of these dumb asses deny they ever bought into the secret sauce.

I would say it makes the cows look bad, since they bought into that 'playstation is stronger' again lol

Although this time they didn't have to pay through the nose like with the PS3, now they're stuck with a gimped CPU, lmao

The ps4 is stronger, it has superior hardware in it. No secret sauce was needed.

Of course it has

Glad to see reason has finally reached you.

The concept of satire hasn't reached you apparently, here let me help you

http://en.wikipedia.org/wiki/Satire

What is sarcasm?

I'll leave you to your fanboy drivel now :)

Sarcasm is a part of satire...

I suppose you're a ps4 owner?


#256  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@btk2k2: @RossRichard: @tormentos:

lol, processing, texture, and pixel rates do not add up the way you guys are thinking. In basic math, yes, 1.3 vs 1.8 is a 40% difference, 25.6 GP/s is 88% more than 13.6 GP/s, etc. However, the correct way to show the performance difference is to take what one can do as a fraction of the other. 1.3 is 71% of 1.8, hence the roughly 30% faster performance; 13.6 GPix/s is 53% of 25.6 GPix/s, which is the reason for the 47% faster performance; and for the texture rate, 40 vs 57: 57 is about 40% more than 40, but 40 is 70% of 57, hence the 30% faster performance.

A prime example is the 7850 vs 7870: 1.76 vs 2.56 TFLOPS. You would say it's 45% faster, and by texture rate, 80 vs 55 would be 45%, and by pixel rate, 32 vs 27.5, 17%. But 1.76 is 68% of 2.56, hence a 32% performance difference; 55 is 62% of 80, hence 38% faster performance; and 27.5 is 85% of 32, so a 15% performance difference. End result: these GPUs only show an average difference of 15-20% in performance. We can even take the 7790 vs 7850: 1.79 vs 1.76 TFLOP, texture rates of 56 vs 55, and a pixel-rate gap of 72% (16 vs 27.5), yet in reality the 7790 is only about 25% slower on average at the same settings and resolutions. Then we can take it further with the 7770 and 7850: 1.28 vs 1.76 TFLOP, nearly a 50% difference; 40 vs 55 texture rate, a 36% difference; and 16 vs 27.5 pixel rate, again a 72% difference. That only yields an average difference of about 45%.

The PS4 GPU is not 50% faster than the X1; it's roughly 35-40% faster overall at best.
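For what it's worth, much of the disagreement above is about which number is used as the baseline. A minimal Python sketch of both ways of quoting the same gap (the 1.31 and 1.84 TFLOPS figures are the commonly cited ones, used here purely for illustration):

def gap_vs_smaller(small, big):
    # how much faster the bigger figure is, relative to the smaller one
    return (big - small) / small * 100

def gap_vs_bigger(small, big):
    # how much lower the smaller figure is, relative to the bigger one
    return (big - small) / big * 100

x1_tf, ps4_tf = 1.31, 1.84
print(round(gap_vs_smaller(x1_tf, ps4_tf)))  # ~40 -> "PS4 is 40% faster"
print(round(gap_vs_bigger(x1_tf, ps4_tf)))   # ~29 -> "X1 is 29% slower"

Both numbers describe the same hardware gap; neither says anything about how much of it shows up in real games.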


#257 cainetao11
Member since 2006 • 38036 Posts

@evildead6789:

I LOVE THIS!!!! By this logic, the Dallas Cowboys are NOT better than the NY Giants.


#258 btk2k2
Member since 2003 • 440 Posts

@04dcarraher said:

@btk2k2: @RossRichard: @tormentos:

lol, processing, texture, and pixel rates do not add up the way you guys are thinking. In basic math, yes, 1.3 vs 1.8 is a 40% difference, 25.6 GP/s is 88% more than 13.6 GP/s, etc. However, the correct way to show the performance difference is to take what one can do as a fraction of the other. 1.3 is 71% of 1.8, hence the roughly 30% faster performance; 13.6 GPix/s is 53% of 25.6 GPix/s, which is the reason for the 47% faster performance; and for the texture rate, 40 vs 57: 57 is about 40% more than 40, but 40 is 70% of 57, hence the 30% faster performance.

A prime example is the 7850 vs 7870: 1.76 vs 2.56 TFLOPS. You would say it's 45% faster, and by texture rate, 80 vs 55 would be 45%, and by pixel rate, 32 vs 27.5, 17%. But 1.76 is 68% of 2.56, hence a 32% performance difference; 55 is 62% of 80, hence 38% faster performance; and 27.5 is 85% of 32, so a 15% performance difference. End result: these GPUs only show an average difference of 15-20% in performance. We can even take the 7790 vs 7850: 1.79 vs 1.76 TFLOP, texture rates of 56 vs 55, and a pixel-rate gap of 72% (16 vs 27.5), yet in reality the 7790 is only about 25% slower on average at the same settings and resolutions. Then we can take it further with the 7770 and 7850: 1.28 vs 1.76 TFLOP, nearly a 50% difference; 40 vs 55 texture rate, a 36% difference; and 16 vs 27.5 pixel rate, again a 72% difference. That only yields an average difference of about 45%.

The PS4 GPU is not 50% faster than the X1; it's roughly 35-40% faster overall at best.

Wrong.


#259 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

I refuse to believe what is happening in this thread is real life. The PS4 is stronger than the Xbox One. The GPU difference outweighs the different CPU clock speeds. We are talking about a 0.15 GHz difference, for crying out loud.

If I increased my CPU clock to 4.35 GHz but replaced my Titan Blacks with 780s, my benchmark scores would drop and I would lose frames in games. And the performance difference between a 7790 and a 7850 is bigger than between a Titan Black and a 780.


#260  Edited By commander
Member since 2010 • 16217 Posts

@cainetao11 said:

@evildead6789:

I LOVE THIS!!!! By this logic, the Dallas Cowboys are NOT better than the NY Giants.

BUT OF COURSE

American football compares very easily to computer hardware science, especially CPU bottlenecking

(a whole lot of sarcasm) and this...


#261 CrownKingArthur
Member since 2013 • 5262 Posts

John Cleese. His quote about Palmerston North is fucking hilarious. (for me, at least)

He looks a little esurient there.


#262  Edited By commander
Member since 2010 • 16217 Posts

@GoldenElementXL said:

I refuse to believe what is happening in this thread is real life. The PS4 is stronger than the Xbox One. The GPU difference outweighs the different CPU clock speeds. We are talking about a 0.15 GHz difference, for crying out loud.

If I increased my CPU clock to 4.35 GHz but replaced my Titan Blacks with 780s, my benchmark scores would drop and I would lose frames in games. And the performance difference between a 7790 and a 7850 is bigger than between a Titan Black and a 780.

The GPU doesn't outweigh the different CPU clock speeds, since it's not a CPU. AC Unity is a perfect example of this: Ubisoft used GPGPU and they still weren't able to run the game at better quality on the PS4.

Why? Because the CPU is bottlenecking the AI calculations.

How can you compare your system with the PS4 and X1? That CPU won't bottleneck for a very long time, not in computer games anyway, even at stock clocks, lol. Your system is pretty balanced; well, it's not hard to be balanced if you buy the best of everything.

The PS4 CPU is comparable to an i3 530. Now try pairing an HD 7850 with that. The Xbox One is more like an i3 550 paired with an HD 7790. I'm sure you had PCs in the past and probably ran into bottlenecks. I don't understand how you don't understand this. To mimic the CPU speed of the PS4 you would probably have to disable all cores but one, and still it wouldn't be the same, since the X1 and PS4 have about 6 cores dedicated to gaming and 2 cores for the OS. So to mimic the PS4 and X1 you have to downclock the CPU to low speeds but leave all cores enabled. Then use one Titan and downclock it as well.

For a rough estimate, you downclock all your cores (but leave them all enabled) to run at 1050 MHz, use one Titan, downclock it to match something similar to the HD 7850, and do the benchmark to mimic a PS4. Then clock your CPU at 1150 MHz (but leave all cores enabled), downclock the Titan even more to match the power of an HD 7790 (if you can downclock it that low), and do another benchmark to mimic the Xbox One. Still, it wouldn't be ideal, but with a lot of research you could probably mimic the bottleneck. You would see that in a lot of cases your frames would be higher with the 1150 MHz clock, but you wouldn't be able to achieve 1080p, since the GPU bottlenecks as well then. You would be able to achieve 1080p with the 1050 MHz clock, but your framerates would be lower. Decreasing the resolution wouldn't increase the framerates, since the CPU would be bottlenecking and you'd have room to spare on your Titan that's clocked to HD 7850 performance.
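To make the bottleneck argument above concrete, here is a toy Python model (purely illustrative; the frame-rate limits are made-up numbers, not measurements from any game or console):

def achievable_fps(cpu_fps_limit, gpu_fps_limit):
    # whichever component runs out of per-frame budget first caps the frame rate
    return min(cpu_fps_limit, gpu_fps_limit)

# Hypothetical CPU-heavy scene: both CPUs cap out around 30 fps, so the stronger GPU barely matters.
print(achievable_fps(cpu_fps_limit=28, gpu_fps_limit=55))  # "PS4-like": 28
print(achievable_fps(cpu_fps_limit=30, gpu_fps_limit=40))  # "X1-like": 30

# Hypothetical GPU-heavy scene: the CPUs have headroom, so the GPU gap reappears.
print(achievable_fps(cpu_fps_limit=60, gpu_fps_limit=55))  # "PS4-like": 55
print(achievable_fps(cpu_fps_limit=60, gpu_fps_limit=40))  # "X1-like": 40

This is only a sketch of the argument being made in the post above, not a claim about what either console actually does.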


#263  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@btk2k2 said:

@04dcarraher said:

@btk2k2: @RossRichard: @tormentos:

lol, processing, texture, and pixel rates do not add up the way you guys are thinking. In basic math, yes, 1.3 vs 1.8 is a 40% difference, 25.6 GP/s is 88% more than 13.6 GP/s, etc. However, the correct way to show the performance difference is to take what one can do as a fraction of the other. 1.3 is 71% of 1.8, hence the roughly 30% faster performance; 13.6 GPix/s is 53% of 25.6 GPix/s, which is the reason for the 47% faster performance; and for the texture rate, 40 vs 57: 57 is about 40% more than 40, but 40 is 70% of 57, hence the 30% faster performance.

A prime example is the 7850 vs 7870: 1.76 vs 2.56 TFLOPS. You would say it's 45% faster, and by texture rate, 80 vs 55 would be 45%, and by pixel rate, 32 vs 27.5, 17%. But 1.76 is 68% of 2.56, hence a 32% performance difference; 55 is 62% of 80, hence 38% faster performance; and 27.5 is 85% of 32, so a 15% performance difference. End result: these GPUs only show an average difference of 15-20% in performance. We can even take the 7790 vs 7850: 1.79 vs 1.76 TFLOP, texture rates of 56 vs 55, and a pixel-rate gap of 72% (16 vs 27.5), yet in reality the 7790 is only about 25% slower on average at the same settings and resolutions. Then we can take it further with the 7770 and 7850: 1.28 vs 1.76 TFLOP, nearly a 50% difference; 40 vs 55 texture rate, a 36% difference; and 16 vs 27.5 pixel rate, again a 72% difference. That only yields an average difference of about 45%.

The PS4 GPU is not 50% faster than the X1; it's roughly 35-40% faster overall at best.

Wrong.

lol, you wish,

A PS4 GPU that is less than 50% faster in all areas? That means it's not 50% faster.


#264 Tighaman
Member since 2006 • 1038 Posts

@btk2k2:

Haaaa, this is where you think this is from, lol. Why don't you stop reading that guy and assuming everything is from his site, because it's not even close, and you don't know what GCN the Xbox GPU is, because there's not one like it on the market; you all just see 1.3 TF and go by that. We can agree to disagree about the PS4's bandwidth when they already showed the diagram.

SHAPE is confirmed to be JUST FOR KINECT? PLEASE SHOW ME WHERE MS SAID THIS. And don't bring me the quote from the TESTER on Beyond3D, because like I said he was just a tester.

You haven't told me anything that involves DEEP DIVING, and if you know anything, you don't always have to look to the past to see where the industry is going. Pascal and Pirate Islands will be a WHOLE NEW ARCH, and MORE CUs and ROPs are not going to matter.


#265 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@evildead6789: You can't really use AC Unity in this argument. What other game runs at the same resolution on both consoles and the same framerate? 99 times out of 100 the PS4 is running at a higher resolution, a higher framerate, or both. The one thing I don't care for is when the PS4 runs at a higher resolution but a lower framerate on average. At that point they should just lower the resolution for the sake of smooth gameplay.

AC Unity is in fact very CPU heavy. And yes, NPCs are rough on CPUs, which is why the framerate is oftentimes better on Xbox One during gameplay. But during cutscenes the PS4 takes the lead again. GTA V is a game where each console is running at the same resolution at nearly identical frame rates. But the Xbox One version has effects dialed back or completely absent.


#266  Edited By commander
Member since 2010 • 16217 Posts

@GoldenElementXL said:

@evildead6789: You can't really use AC Unity in this argument. What other game runs at the same resolution on both consoles and the same framerate? 99 times out of 100 the PS4 is running at a higher resolution, a higher framerate, or both. The one thing I don't care for is when the PS4 runs at a higher resolution but a lower framerate on average. At that point they should just lower the resolution for the sake of smooth gameplay.

AC Unity is in fact very CPU heavy. And yes, NPCs are rough on CPUs, which is why the framerate is oftentimes better on Xbox One during gameplay. But during cutscenes the PS4 takes the lead again. GTA V is a game where each console is running at the same resolution at nearly identical frame rates. But the Xbox One version has effects dialed back or completely absent.

Well, AC Unity is a taste of things to come; it's one of the few true next-gen games. They didn't take the old consoles into consideration and went full throttle on the next-gen consoles. Ubisoft was pretty clear: there is a CPU bottleneck. I know in cutscenes the PS4 probably takes the lead again, but that's exactly my point: the PS4 isn't stronger overall than the X1, it's on par; some advantages go to the X1, some go to the PS4. It's a fact that the PS4 is stronger on the GPU side, but it's a fact that the X1 is stronger on the CPU side. Since the CPU is bottlenecking, the 10 percent extra performance on the One pretty much makes up for the 30-35 percent extra performance that the PS4 has on the GPU side.

The differences will be bigger when games are more GPU heavy, but in CPU-heavy games the differences will be bigger as well. Unity is not the only game that shows this; Far Cry and COD show it as well. The resolution is higher on the PS4, yes, but it's not 1080p vs 720p, it's more like 1080p vs 900p, and the framerates are smoother on the X1. Lowering the resolution wouldn't increase the framerates on the PS4 since the CPU is bottlenecking, so they might as well run it at a higher resolution. Of course GPGPU tools can help, but devs have to use them, and there won't be room to spare to keep that 1080p. Unity is the perfect example of it.

As for GTA V, things may have been dialed back on the X1, but not when driving at high speed through cities; the field of view is even higher on the X1.

My only point with this thread is that people shouldn't be deceived: the whole 'PS4 is stronger' thing is exaggerated; it's marginally better at best. It could be a lot better in GPU-heavy games, but what games won't ask for CPU power? Games cry for innovation all the time, and last-gen games in higher resolution are not what everyone is waiting for. In CPU-heavy games the PS4 will even struggle, as you can see when driving at high speed in GTA V, and that's a last-gen remaster. Of course GPGPU tools will be the saving grace for the PS4, just like the ESRAM tools are the saving grace for the X1, but in the end the systems are very similar in performance.

People should decide based on which controller, exclusives and features they want. Performance is not something they should take into consideration, and that's exactly my point.


#267 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

Exactly why I refuse to be involved in his threads and responses. How can you debate with someone who makes stuff up to suit them.

No, I wasn't making stuff up, and Golden jumped the gun like he always does; I mixed up the name of the company with the name of the engine itself.

So yeah, Phyxs in the end = PHYSICS; it is a physics engine, which also runs on PS4, so my post was correct..

@GoldenElementXL said:

Tiny mistake? The guy made something up and called another poster uninformed about it in the same sentence.

And both Evildead and Tormentos are working some incredible mental gymnastics. That mess is not something I want to get involved in. But Tormentos name calling someone while not knowing what he is talking about himself is something that should be called out.

And P.S. The PS4 is obviously stronger than the Xbox One. I don't know why this is even being debated.

You are an idiot, and the fact that you jumped on me because you actually believe I don't know what Phyxs was is a joke. I do, and you can find it here; I have debated it to death, especially in topics about physics and games like Uncharted.

@Tighaman said:

@tormentos:

Those quotes and that diagram are from the IEEE journal article by John Sell, the maker of the XBOX ONE, and he's explaining it to the IEEE. You mad? Look at your face, I see it now, it's hilarious. You still don't get it, do you? Of course you don't see anything with DUAL GRAPHIC CONTEXT, BECAUSE IT'S NOT ONE, lol. THIS IS TRUE HSA. Fast cache, microprocessors, DSPs, GRAPHIC CONTEXT SWITCHING.

First of all, the links show Xbox Brazil; supposedly the excuse for not linking the person directly is that it is a pay site. You don't need to pay to view John Sell's profile on LinkedIn.

Second, DX12 will not unlock a hidden GPU inside the Xbox. DX12 is already on the fu**ing console, and I have proved that more than once with MS's own freaking words. What DX12 will supposedly do is lower CPU overhead on PC, which on the Xbox One has been the case since day 1.

The Xbox One isn't true HSA, buffoon; the CPU can't see data while it is in ESRAM, totally killing it.

Graphics context switching is on every GCN part, idiot, including the PS4..hahahaa

Basically you are talking about shit that's inside the PS4; again, the screen I posted shows the Xbox One SoC: one core, period, 14 CU..lol

And this thread really deserves a bookmark..hahahaa


#268  Edited By 04dcarraher
Member since 2004 • 23829 Posts
@evildead6789 said:

@GoldenElementXL said:

@evildead6789: You can't really use AC Unity in this argument. What other game runs at the same resolution on both consoles and the same framerate? 99 times out of 100 the PS4 is running at a higher resolution, a higher framerate, or both. The one thing I don't care for is when the PS4 runs at a higher resolution but a lower framerate on average. At that point they should just lower the resolution for the sake of smooth gameplay.

AC Unity is in fact very CPU heavy. And yes, NPCs are rough on CPUs, which is why the framerate is oftentimes better on Xbox One during gameplay. But during cutscenes the PS4 takes the lead again. GTA V is a game where each console is running at the same resolution at nearly identical frame rates. But the Xbox One version has effects dialed back or completely absent.

Well, AC Unity is a taste of things to come; it's one of the few true next-gen games. They didn't take the old consoles into consideration and went full throttle on the next-gen consoles. Ubisoft was pretty clear: there is a CPU bottleneck. I know in cutscenes the PS4 probably takes the lead again, but that's exactly my point: the PS4 isn't stronger overall than the X1, it's on par; some advantages go to the X1, some go to the PS4. It's a fact that the PS4 is stronger on the GPU side, but it's a fact that the X1 is stronger on the CPU side. Since the CPU is bottlenecking, the 10 percent extra performance on the One pretty much makes up for the 30-35 percent extra performance that the PS4 has on the GPU side.

The differences will be bigger when games are more GPU heavy, but in CPU-heavy games the differences will be bigger as well.

As for GTA V, things may have been dialed back on the X1, but not when driving at high speed through cities; the field of view is even higher on the X1.

My only point with this thread is that people shouldn't be deceived: the whole 'PS4 is stronger' thing is exaggerated; it's marginally better at best. It could be a lot better in GPU-heavy games, but what games won't ask for CPU power? Games cry for innovation all the time, and last-gen games in higher resolution are not what everyone is waiting for. In CPU-heavy games the PS4 will even struggle, as you can see when driving at high speed in GTA V, and that's a last-gen remaster.

People should decide based on which controller, exclusives and features they want. Performance is not something they should take into consideration.

God, I hope not, that Unity is a taste of what's to come, because it was a POS of a coded game. Fact is we do not have a real reason why the PS4 saw up to a 5 fps difference in that game, even in areas where the number of NPCs was limited and out of sight. The problem is we have no idea whether it's because MS's coding tools are easier and more efficient than Sony's, and on top of that the game was rushed out with a host of bugs and issues, which leads one to assume that the PS4 version didn't get the time it needed to be optimized as well as the X1's. So again, using Unity as an example is dumb because of too many variables that can't be figured out.

The X1 vs PS4 CPU, even with the so-called bottlenecks, will not yield more than a frame or two difference in averages between the GPUs. However, we will still see the majority of games run better and at higher detail on the PS4 because of the 30%+ performance gap between them. And don't ignore the parity many developers are enforcing, which does not make the X1 on par with the PS4.

It's no different from comparing an FX 6350 with a 7770 vs an FX 6300 with a 7790; fact is, the one with the stronger GPU will make up for and surpass any possible CPU bottleneck.


#269 cainetao11
Member since 2006 • 38036 Posts

@evildead6789 said:

@cainetao11 said:

@evildead6789:

I LOVE THIS!!!! By this logic, the Dallas Cowboys are NOT better than the NY Giants.

BUT OF COURSE

American football compares very easily to computer hardware science, especially CPU bottlenecking

(a whole lot of sarcasm) and this...

Well............I am so stupid that I have no idea that John Cleese is mocking ME!!!!


#270 Krelian-co
Member since 2006 • 13274 Posts

@evildead6789 said:

@GoldenElementXL said:

@evildead6789: You can't really use AC Unity in this argument. What other game runs at the same resolution on both consoles and the same framerate? 99 times out of 100 the PS4 is running at a higher resolution, a higher framerate, or both. The one thing I don't care for is when the PS4 runs at a higher resolution but a lower framerate on average. At that point they should just lower the resolution for the sake of smooth gameplay.

AC Unity is in fact very CPU heavy. And yes, NPCs are rough on CPUs, which is why the framerate is oftentimes better on Xbox One during gameplay. But during cutscenes the PS4 takes the lead again. GTA V is a game where each console is running at the same resolution at nearly identical frame rates. But the Xbox One version has effects dialed back or completely absent.

Well, AC Unity is a taste of things to come; it's one of the few true next-gen games. They didn't take the old consoles into consideration and went full throttle on the next-gen consoles. Ubisoft was pretty clear: there is a CPU bottleneck. I know in cutscenes the PS4 probably takes the lead again, but that's exactly my point: the PS4 isn't stronger overall than the X1, it's on par; some advantages go to the X1, some go to the PS4. It's a fact that the PS4 is stronger on the GPU side, but it's a fact that the X1 is stronger on the CPU side. Since the CPU is bottlenecking, the 10 percent extra performance on the One pretty much makes up for the 30-35 percent extra performance that the PS4 has on the GPU side.

The differences will be bigger when games are more GPU heavy, but in CPU-heavy games the differences will be bigger as well. Unity is not the only game that shows this; Far Cry and COD show it as well. The resolution is higher on the PS4, yes, but it's not 1080p vs 720p, it's more like 1080p vs 900p, and the framerates are smoother on the X1. Lowering the resolution wouldn't increase the framerates on the PS4 since the CPU is bottlenecking, so they might as well run it at a higher resolution. Of course GPGPU tools can help, but devs have to use them, and there won't be room to spare to keep that 1080p. Unity is the perfect example of it.

As for GTA V, things may have been dialed back on the X1, but not when driving at high speed through cities; the field of view is even higher on the X1.

My only point with this thread is that people shouldn't be deceived: the whole 'PS4 is stronger' thing is exaggerated; it's marginally better at best. It could be a lot better in GPU-heavy games, but what games won't ask for CPU power? Games cry for innovation all the time, and last-gen games in higher resolution are not what everyone is waiting for. In CPU-heavy games the PS4 will even struggle, as you can see when driving at high speed in GTA V, and that's a last-gen remaster. Of course GPGPU tools will be the saving grace for the PS4, just like the ESRAM tools are the saving grace for the X1, but in the end the systems are very similar in performance.

People should decide based on which controller, exclusives and features they want. Performance is not something they should take into consideration, and that's exactly my point.

Ladies and gentlemen, a slow clap for the fanboy here. I have never seen such a combination of ignorance, bullshit and stupidity in one post; you, sir, have outdone System Wars.


#271 tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@btk2k2:

You people don't read, you just look at the surface.

PS4 has 8 ACEs; each ACE can do 8 queues, each = 64 queues.

Xbox One has 2 compute processors and 2 graphics processors that can do 16 queues each = 64 queues.

PS4 has 18 CUs with 130 GB/s of bandwidth total for the system; if the CPU needs some, it goes lower, to only 110 GB/s just for the GPU.

Xbox One has 12 CUs with 150 GB/s for the GPU and 50 GB/s for the CPU, with no need to fight over bandwidth.

It takes 225 GB/s of bandwidth to totally saturate 12 CUs for graphics; that means the PS4 is wasting ALUs trying to use 18 CUs just for graphics, hence 14+4: it will never have enough bandwidth to use 18 CUs just for graphics without async compute.

PS4 has a 1.8 TF GPU that uses its resources for CPU offloading, in-game music, video compressing/decompressing, and color compressing/decompressing.

Xbox One has a GPU split into TWO GPUs, so I don't know the total TFLOPS, but I do know that it has different processors for CPU offloading, SHAPE, swizzled textures, and compressing and decompressing, all with their own cache and bandwidth.

You all on this forum are not telling me anything I don't already know. The reason they are using DX12 is that you cannot target one GPU at a time or use the processors individually. I'm not telling you anything from a blog or these forums; this is stuff I have been hearing from the makers of the consoles.

Hahahaaaaaaaaaaaaaaaaaaaaaaaa...

No dude, it can't do 16 commands; I see you pulled that from deep, deep down your ass..hahaha

The PS4 has 18 CUs with 176 GB/s of bandwidth, not 130. Bandwidth means shit when your GPU is weak; the 7870 has more bandwidth than the 660 Ti, yet which one wins in performance? lol

Hell, the 660 Ti has 100 GB/s less than the stock 7950 and is able to beat it in several games..

The Xbox One has 140 to 150 GB/s in the best-case scenario, and the CPU on the Xbox One has only 30 GB/s, you fool. Stop inventing crap: the Xbox One GPU also uses the same DDR3 bandwidth the CPU uses, which is 68 GB/s, 30 for the CPU and the rest accessible by the GPU. You are completely clueless about this topic, as I have pointed out many times.

Oh, and you will run out of GPU power before saturating 220 GB/s of bandwidth with such a shitty GPU.

That part about the PS4 not being able to fully use async compute is a total joke.

The PS4 has separate hardware for video compression and decompression, and it has AMD TrueAudio, which means it doesn't need to touch the CPU for audio either.

If I had to guess who you are, I would say you are Ronvalencia's alt; you surely post in his style, with his same problems with English, and you pull the same debunked crap over and over again, until the game he cheered for, Sniper Elite 3, arrived and was up to 30 FPS faster on PS4 with better textures too.

That last bold part is a total joke; you are a sad boy. Guess what, I learned today that Sony has a dual GPU too; it has 36 CUs total, and they are just waiting to deliver an update to kick compute into gear. That way they can use the whole 8 ACEs and 64 commands on the 36 CUs, 18 for compute and 18 for graphics. The PS4 also has a second CPU, which we all know about, and that one will also be active when the compute update kicks into gear. You will see... just wait..

@evildead6789 said:

1) Junctions don't show anything more on screen on the PS4 (grass doesn't grow on junctions); the field of view is even higher on the Xbox One.

In one shootout it goes to 24 fps, with a big amount of explosions; yeah, the PS4 is stronger there because of its stronger GPU. It has a bit more grass and foliage in dense foliage areas, also because of its stronger GPU.

But when the PS4 has to calculate a lot of AI, like driving at high speed through junctions and cities, then it drops a lot more than the X1 because of the weaker CPU on the PS4.

The X1 has a more consistent framerate overall, it's as simple as that. My point isn't that the X1 is stronger than the PS4; my point is that the PS4 isn't stronger than the X1. Maybe on paper, yes, but even then, bottlenecks have to be avoided because they pull the performance down, just like you see here with the PS4.

2) You're always forgetting the ESRAM deliberately; experts say the Xbox One is comparable to the performance of an HD 7790 because of the ESRAM. Without the ESRAM it's comparable to a 7770. You can't measure the performance of the ESRAM with TFLOPS. I think the latest games show that the ESRAM is working.

3) You mean in games at launch; everyone knows the Xbox One launch was gimped for numerous reasons, software mistakes and OS configuration mistakes, mistakes that are now gone.

4) The i3 4440 and i3 4130 (the ones that have a 200 MHz difference) are not exactly the same type of CPU, hence why I didn't compare them, because the X1 and PS4 have the exact same type of CPU running at different clock speeds. But the i3-4130 isn't all that different; it has a weaker internal GPU and a bit less cache. Even with a 200 MHz difference, it's still only a difference of 6 percent, which is still not the 10 percent difference between the X1 and PS4's CPUs. The performance difference between those two CPUs (i3-4440 and i3-4130) is 3 fps out of 50 frames (not 2, liar); that's 6 percent. So again it shows that with a CPU bottleneck the performance difference between the two CPUs translates into real-world performance.

When you're comparing small percentages like this, the hardware tested has to be as similar as possible. That's why I took those two examples, because they're exactly the same CPUs only running at different clock speeds. If I could, I would have used CPUs that have the same performance difference as the X1 and PS4 CPUs, but bottleneck benchmarks are scarce, so I worked with the benchmarks available.

Don't try to spin the numbers; well, you're going to do it anyway since you're so hopelessly in love with your PS4, but people do notice your lies and fantasies lol.

1- The PS4 has more grass and fewer slowdowns, confirmed by DF...

2- No it's not; when it has a weaker GPU, fewer CUs = fewer CUs, period. The 7850 has the same bandwidth as the 7870 yet it performs worse. Why is that? Oh yeah, less processing power, fewer compute units, lower clock speed, irrespective of having the same bandwidth, so having more will not make the 7850 top the 7870..

3- Launch? So was Shadow of Mordor a launch game? It is 900p on Xbox One for a damn reason: 1080p is too much for the shit GPU in the Xbox One, so a sacrifice is made, period; the same with DA.

4- Exactly, the 4340 is faster than the 4130 by 200 MHz, yet the difference is 2 frames. The 4340 is stronger, faster and with faster memory reads too, yet it just does 2 frames more; imagine the Jaguars in both the PS4 and Xbox One, which are the same but with a minor speed boost.

I didn't say 4440, stop spinning; it's 4340 vs 4130, and yes, the difference is 2 frames, which is nothing.

Save your damage control, troll; you lost, the difference is total shit and it shows..hahaha

@Tighaman said:

@btk2k2:

But that's where things get messed up: this is not GCN arch, it just has AMD tech inside, there is a difference. You can look at the PS4's OWN DIAGRAM and it SHOWS it never reaches 176 GB/s; it only shows 140 GB/s at most, and the more bandwidth the CPU needs, the less the GPU has, which is not how it is on the Xbox. ESRAM doesn't have contention, so you will get 150+ most times.

SHAPE is for the whole system and fully programmable for in-game audio. What do you think TrueAudio is? One digital signal processor; the XBOX has 4.

This is not SLI or Crossfire with two GPUs together; this is a DESIGN TO HAVE ONE GPU BE USED AS TWO and being able to target EACH PART OF THE GPU INDEPENDENTLY, because of the command processors; they are used sort of like ACEs, and you don't need many ROPs when you can use the shader cores for fillrate.

No, that is what you fanboys want to believe. The PS4 has 176 GB/s, not 140, and what Sony showed was a pitfall on a do-and-don't presentation for developers; in fact there are developers who were already using 172 GB/s of the 176, and I quoted it to you and you ignored it. By this point you are just an in-denial fanboy who believes any Mister X Media kind of crap from anyone, and you don't even post links to info because your own links will betray you.

The Xbox One GPU is GCN like the PS4, as stated by MS already, period.

Digital Foundry: When you look at the specs of the GPU, it looks very much like Microsoft chose the AMD Bonaire design and Sony chose Pitcairn - and obviously one has got many more compute units than the other. Let's talk a little bit about the GPU - what AMD family is it based on: Southern Islands, Sea Islands, Volcanic Islands?

Andrew Goossen: Just like our friends we're based on the Sea Islands family.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

Yes, it is GCN like the PS4; this is a real XBO engineer on a real site confirming it.. Have a nice day, liar..


#272  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

@btk2k2: @RossRichard: @tormentos:

lol, processing, texture, and pixel rates do not add up the way you guys are thinking. In basic math, yes, 1.3 vs 1.8 is a 40% difference, 25.6 GP/s is 88% more than 13.6 GP/s, etc. However, the correct way to show the performance difference is to take what one can do as a fraction of the other. 1.3 is 71% of 1.8, hence the roughly 30% faster performance; 13.6 GPix/s is 53% of 25.6 GPix/s, which is the reason for the 47% faster performance; and for the texture rate, 40 vs 57: 57 is about 40% more than 40, but 40 is 70% of 57, hence the 30% faster performance.

A prime example is the 7850 vs 7870: 1.76 vs 2.56 TFLOPS. You would say it's 45% faster, and by texture rate, 80 vs 55 would be 45%, and by pixel rate, 32 vs 27.5, 17%. But 1.76 is 68% of 2.56, hence a 32% performance difference; 55 is 62% of 80, hence 38% faster performance; and 27.5 is 85% of 32, so a 15% performance difference. End result: these GPUs only show an average difference of 15-20% in performance. We can even take the 7790 vs 7850: 1.79 vs 1.76 TFLOP, texture rates of 56 vs 55, and a pixel-rate gap of 72% (16 vs 27.5), yet in reality the 7790 is only about 25% slower on average at the same settings and resolutions. Then we can take it further with the 7770 and 7850: 1.28 vs 1.76 TFLOP, nearly a 50% difference; 40 vs 55 texture rate, a 36% difference; and 16 vs 27.5 pixel rate, again a 72% difference. That only yields an average difference of about 45%.

The PS4 GPU is not 50% faster than the X1; it's roughly 35-40% faster overall at best.

OK, up to 30 FPS faster, 30 vs 60, what does that amount to?

1080p vs 720p, what does that amount to?......

Because those two gaps have been recorded in several games, most recently PES 2015.

@evildead6789 said:

The GPU doesn't outweigh the different CPU clock speeds, since it's not a CPU. AC Unity is a perfect example of this: Ubisoft used GPGPU and they still weren't able to run the game at better quality on the PS4.

Why? Because the CPU is bottlenecking the AI calculations.

How can you compare your system with the PS4 and X1? That CPU won't bottleneck for a very long time, not in computer games anyway, even at stock clocks, lol. Your system is pretty balanced; well, it's not hard to be balanced if you buy the best of everything.

The PS4 CPU is comparable to an i3 530. Now try pairing an HD 7850 with that. The Xbox One is more like an i3 550 paired with an HD 7790. I'm sure you had PCs in the past and probably ran into bottlenecks. I don't understand how you don't understand this. To mimic the CPU speed of the PS4 you would probably have to disable all cores but one, and still it wouldn't be the same, since the X1 and PS4 have about 6 cores dedicated to gaming and 2 cores for the OS. So to mimic the PS4 and X1 you have to downclock the CPU to low speeds but leave all cores enabled. Then use one Titan and downclock it as well.

For a rough estimate, you downclock all your cores (but leave them all enabled) to run at 1050 MHz, use one Titan, downclock it to match something similar to the HD 7850, and do the benchmark to mimic a PS4. Then clock your CPU at 1150 MHz (but leave all cores enabled), downclock the Titan even more to match the power of an HD 7790 (if you can downclock it that low), and do another benchmark to mimic the Xbox One. Still, it wouldn't be ideal, but with a lot of research you could probably mimic the bottleneck. You would see that in a lot of cases your frames would be higher with the 1150 MHz clock, but you wouldn't be able to achieve 1080p, since the GPU bottlenecks as well then. You would be able to achieve 1080p with the 1050 MHz clock, but your framerates would be lower. Decreasing the resolution wouldn't increase the framerates, since the CPU would be bottlenecking and you'd have room to spare on your Titan that's clocked to HD 7850 performance.

[Two embedded videos]

Call me when the Xbox One pulls a lead like this in any game...

The GPU outweighs it, period. Look at COD for example: the Xbox One has to drop resolution to be able to keep up, period.

Now they fu** up the PS4 version on purpose to avoid arguments..hahaha

Ubisoft Locks Assassin's Creed Unity's PS4 Resolution to Avoid Debates

Ubisoft Locks Assassin's Creed Unity's PS4 Resolution to Avoid Debates

Ubisoft Locks Assassin's Creed Unity's PS4 Resolution to Avoid Debates

Ubisoft Locks Assassin's Creed Unity's PS4 Resolution to Avoid Debates

http://www.pushsquare.com/news/2014/10/ubisoft_locks_assassins_creed_unitys_ps4_resolution_to_avoid_debates

@04dcarraher said:

lol, you wish,

A PS4 GPU that is less than 50% faster in all areas? That means it's not 50% faster.

When the fu**ing 780 GTX can do 1440p and the R290X can't, call me, so that I may point out which is the graphics king. 720p vs 1080p is a freaking bigger difference than 50%: 1080p has more than 2 fu**ing million pixels, 720p doesn't even have a million. And enough of the shit you pulled when the argument was 720p PS3 vs 1080p PC...
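Setting the insults aside, the pixel counts being thrown around are easy to check. A minimal Python sketch using the standard 16:9 resolutions:

resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)                            # {'720p': 921600, '900p': 1440000, '1080p': 2073600}
print(pixels["1080p"] / pixels["720p"])  # 2.25, i.e. 1080p pushes 125% more pixels than 720p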


#273 Tighaman
Member since 2006 • 1038 Posts

@tormentos: Man, you are not talking to someone going off he-say she-say from a forum. I got those words from JOHN SELL; he said the 4 graphics and compute processors do 16 queues each, 4x16=64. Don't hate the messenger.

That's the reason for the 14+4 method: because you will never have enough bandwidth to fully saturate 18 CUs, and the ALUs that are left over use async compute to fill in the holes.

Based on is not the same as is; there is not one APU, GPU, or GCN like the XBOX ONE, it's one of a kind, JUST as JOHN SELL SAID.


#274  Edited By tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@tormentos: Man, you are not talking to someone going off he-say she-say from a forum. I got those words from JOHN SELL; he said the 4 graphics and compute processors do 16 queues each, 4x16=64. Don't hate the messenger.

That's the reason for the 14+4 method: because you will never have enough bandwidth to fully saturate 18 CUs, and the ALUs that are left over use async compute to fill in the holes.

Based on is not the same as is; there is not one APU, GPU, or GCN like the XBOX ONE, it's one of a kind, JUST as JOHN SELL SAID.

No, you got them from some page, and no link actually exists of him saying that. Worse, I did post info from an Xbox One engineer who did state the Xbox One is GCN; hell, you have used that article several times yourself..hahhaa

No, GCN does 8 commands per ACE, period; the Xbox One is not even close to 64..hahahaa

The 7850 and 7870 both have the same bandwidth, yet the 7850 is weaker. If your argument were true, 156 GB/s would not be enough for the 7870's 20 CUs to outdo the 7850; the PS4 has the same bandwidth as a 7870..lol

Also, 14+4 is still 18 CUs running over the same bandwidth, idiot, or what, do you think compute will run on air?

It will use the same 176 GB/s of bandwidth the graphics will, so basically 14+4 or 18 outright, it is the same..

Now you change the argument to 'based on'; the GPU is GCN, it is 14 CUs with 2 disabled, period, and I will quote this thread at you for years to come when the PS4 keeps beating the Xbox One year after year and your silly theories fail to materialize.

Again, like I already told you, the PS4 also has a hidden GPU and has 36 CUs total; prove me wrong....lol


#275  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

Call me when the Xbox One pulls a lead like this in any game...

The GPU outweighs it, period. Look at COD for example: the Xbox One has to drop resolution to be able to keep up, period.

Now they fu** up the PS4 version on purpose to avoid arguments..hahaha

Ubisoft Locks Assassin's Creed Unity's PS4 Resolution to Avoid Debates

http://www.pushsquare.com/news/2014/10/ubisoft_locks_assassins_creed_unitys_ps4_resolution_to_avoid_debates

Tomb Raider and Sniper Elite 3 were at the beginning of this gen. At the beginning of this gen MS made a lot of mistakes with their OS and software configuration, and the ESRAM tools only became available later as well. Those games are also cross-gen games, so they are less stressful for the system, and so they didn't run into bottlenecks. Those are the only reasons the PS4 got such a lead; sadly, that was in the past.

COD dropped in resolution because it's GPU bound at higher resolutions on the X1; it also has smoother framerates on the X1, something the PS4 can't solve by decreasing the resolution a bit. I'd rather have a tiny bit less resolution (which isn't even noticeable) than a lagfest lol.

Your link is from a site that only reviews PlayStation games.

How desperate can you be... They only said that it was to avoid debates because they didn't want to contradict themselves, since they were so happy they could make Watch Dogs much better on the PS4. Sadly, that was in the past, and the past is the past.

Here let me put something in big words

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

http://www.inquisitr.com/1612173/xbox-one-vs-ps4-assassins-creed-unity-cpu-bound-and-slower-on-the-playstation-4/

I know this must be a real nightmare for you, but I have to say, I told you so...


#276  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:


@evildead6789 said:

1) Junctions don't show anything more on screen on the PS4 (grass doesn't grow on junctions); the field of view is even higher on the Xbox One.

In one shootout it goes to 24 fps, with a big amount of explosions; yeah, the PS4 is stronger there because of its stronger GPU. It has a bit more grass and foliage in dense foliage areas, also because of its stronger GPU.

But when the PS4 has to calculate a lot of AI, like driving at high speed through junctions and cities, then it drops a lot more than the X1 because of the weaker CPU on the PS4.

The X1 has a more consistent framerate overall, it's as simple as that. My point isn't that the X1 is stronger than the PS4; my point is that the PS4 isn't stronger than the X1. Maybe on paper, yes, but even then, bottlenecks have to be avoided because they pull the performance down, just like you see here with the PS4.

2) You're always forgetting the ESRAM deliberately; experts say the Xbox One is comparable to the performance of an HD 7790 because of the ESRAM. Without the ESRAM it's comparable to a 7770. You can't measure the performance of the ESRAM with TFLOPS. I think the latest games show that the ESRAM is working.

3) You mean in games at launch; everyone knows the Xbox One launch was gimped for numerous reasons, software mistakes and OS configuration mistakes, mistakes that are now gone.

4) The i3 4440 and i3 4130 (the ones that have a 200 MHz difference) are not exactly the same type of CPU, hence why I didn't compare them, because the X1 and PS4 have the exact same type of CPU running at different clock speeds. But the i3-4130 isn't all that different; it has a weaker internal GPU and a bit less cache. Even with a 200 MHz difference, it's still only a difference of 6 percent, which is still not the 10 percent difference between the X1 and PS4's CPUs. The performance difference between those two CPUs (i3-4440 and i3-4130) is 3 fps out of 50 frames (not 2, liar); that's 6 percent. So again it shows that with a CPU bottleneck the performance difference between the two CPUs translates into real-world performance.

When you're comparing small percentages like this, the hardware tested has to be as similar as possible. That's why I took those two examples, because they're exactly the same CPUs only running at different clock speeds. If I could, I would have used CPUs that have the same performance difference as the X1 and PS4 CPUs, but bottleneck benchmarks are scarce, so I worked with the benchmarks available.

Don't try to spin the numbers; well, you're going to do it anyway since you're so hopelessly in love with your PS4, but people do notice your lies and fantasies lol.

1- The PS4 has more grass and fewer slowdowns, confirmed by DF...

2- No it's not; when it has a weaker GPU, fewer CUs = fewer CUs, period. The 7850 has the same bandwidth as the 7870 yet it performs worse. Why is that? Oh yeah, less processing power, fewer compute units, lower clock speed, irrespective of having the same bandwidth, so having more will not make the 7850 top the 7870..

3- Launch? So was Shadow of Mordor a launch game? It is 900p on Xbox One for a damn reason: 1080p is too much for the shit GPU in the Xbox One, so a sacrifice is made, period; the same with DA.

4- Exactly, the 4340 is faster than the 4130 by 200 MHz, yet the difference is 2 frames. The 4340 is stronger, faster and with faster memory reads too, yet it just does 2 frames more; imagine the Jaguars in both the PS4 and Xbox One, which are the same but with a minor speed boost.

I didn't say 4440, stop spinning; it's 4340 vs 4130, and yes, the difference is 2 frames, which is nothing.

Save your damage control, troll; you lost, the difference is total shit and it shows..hahaha


1. That was before they were driving/flying at high speed...

2. No, not period. Like I said, the 7870 XT is a weaker GPU than the 7950 yet it performs the same, because it's clocked a lot higher. Here the Xbox One GPU outmatches an HD 7770 because of the ESRAM, end of story.

3. Shadow of Mordor started development a lot sooner than recently released games, when the ESRAM tools weren't available yet. It will be the last argument that you'll ever be able to use, that and PES lol.

4. You're comparing the numbers for minimum framerates, smarty pants.... Average framerates are what matters.

Ok, you didn't spin the numbers, you just don't understand them, sorry...


#277  Edited By Martin_G_N
Member since 2006 • 2124 Posts

We also know that devs aren't utilizing GPU compute on the PS4, which the PS4 is designed to be able to do. I'm not saying everything can be pushed over to the GPU, but animations, A.I., and other increasingly compute-heavy tasks can.

The GPU difference is the biggest difference, and that is why the PS4 has better performance. And please, don't go by a Ubisoft game; they can't even get a stable framerate on a high-end rig. Bad code will always be bad code, no matter what hardware it's on.


#278  Edited By commander
Member since 2010 • 16217 Posts

@Martin_G_N said:

We also know that devs aren't utilizing GPU compute on the PS4, which the PS4 is designed to be able to do. I'm not saying everything can be pushed over to the GPU, but animations, A.I., and other increasingly compute-heavy tasks can.

The GPU difference is the biggest difference, and that is why the PS4 has better performance. And please, don't go by a Ubisoft game; they can't even get a stable framerate on a high-end rig. Bad code will always be bad code, no matter what hardware it's on.

It's already being used on Unity.

Yet Unity still runs better on the Xbox One.

Simple: a bottleneck determines performance. You can have a workaround with GPGPU, but still, it's not really the best way. You could say Ubi's code is not that good, but Unity does look gorgeous.


#279 Martin_G_N
Member since 2006 • 2124 Posts

@evildead6789 said:

@Martin_G_N said:

We also know that devs aren't utilizing GPU compute on the PS4, which the PS4 is designed to be able to do. I'm not saying everything can be pushed over to the GPU, but animations, A.I., and other increasingly compute-heavy tasks can.

The GPU difference is the biggest difference, and that is why the PS4 has better performance. And please, don't go by a Ubisoft game; they can't even get a stable framerate on a high-end rig. Bad code will always be bad code, no matter what hardware it's on.

It's already being used on Unity.

Yet Unity still runs better on the Xbox One.

Simple: a bottleneck determines performance. You can have a workaround with GPGPU, but still, it's not really the best way.

I really doubt they utilized GPU compute properly; not even first-party devs have done that, and the game is poorly optimized across all platforms. And it's terrible that they couldn't achieve a stable 30 fps on either console. Cut down on NPCs, problem solved.

If the CPU really becomes a problem for devs, Sony can overclock it, no problem.

Avatar image for CrownKingArthur
CrownKingArthur

5262

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#280 CrownKingArthur
Member since 2013 • 5262 Posts

#wow

Avatar image for 04dcarraher
04dcarraher

23829

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#281  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos

OK, up to 30 FPS faster, 30 vs 60, what does that amount to?

1080p vs 720p, what does that amount to......

Because those two gaps are recorded in several games, most recently PES 2015.

Call me when the Xbox One pulls a lead like this in any game...

When the fu**ing 780 GTX can do 1440p and the R9 290X can't, call me, so that I may point out which is the graphics king. 720p vs 1080p is a freaking bigger difference than 50%: 1080p has more than 2 fu**ing million pixels, 720p doesn't even have a million. And enough of the shit you pulled when the argument was 720p PS3 vs 1080p PC...

All you need to do is take off the fanboy goggles and look at the modern games that actually push the systems with updated dev kits and coding. Many early games like COD Ghosts or even MGS 5: Ground Zeroes don't qualify.

Look at BF4, the BF Hardline beta, Dragon Age, Far Cry 4, Lords of the Fallen, Shadow of Mordor, etc.; all those games only have an average 30% resolution difference, 900p vs 1080p or somewhere in between.

Fact is that the PS4 GPU is not 50% faster, but at best 40%.
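
For reference, the pixel arithmetic being argued above works out like this (a rough sketch in plain Python; it simply assumes the standard 1280x720, 1600x900 and 1920x1080 resolutions):

p720  = 1280 * 720     # 921,600 pixels, under a million
p900  = 1600 * 900     # 1,440,000 pixels
p1080 = 1920 * 1080    # 2,073,600 pixels, over two million

print(p1080 / p720)    # 2.25 -> 1080p pushes 125% more pixels than 720p
print(p1080 / p900)    # 1.44 -> 1080p pushes 44% more pixels than 900p,
                       #         i.e. 900p is roughly 30% fewer pixels than 1080p

So the old 720p-vs-1080p comparisons really are a more-than-2x pixel gap, while the recent 900p-vs-1080p cases sit in the 30-44% range depending on which direction you measure from.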

Avatar image for 04dcarraher
04dcarraher

23829

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#282 04dcarraher
Member since 2004 • 23829 Posts
@Martin_G_N said:

@evildead6789 said:

@Martin_G_N said:

We also know that devs aren't utilizing GPU compute on the PS4, which the PS4 is designed to be able to do. I'm not saying everything can be pushed over to the GPU, but animations, A.I., and other increasingly compute-heavy tasks can be.

The GPU difference is the biggest difference, and that is why the PS4 has better performance. And please, don't go by a Ubisoft game; they can't even get a stable framerate on a high-end rig. Bad code will always be bad code, no matter what hardware it's on.

It's already used in Unity.

Yet Unity still runs better on the Xbox One.

Simple: a bottleneck determines performance. You can have a workaround with GPGPU, but it's still not really the best way.

I really doubt they utilized GPU compute properly; not even first party devs have done that, and the game is poorly optimized across all platforms. It's terrible that they couldn't achieve a stable 30 fps on either console. Cut down on the NPCs, problem solved.

If the CPU really becomes a problem for devs, Sony can overclock it, no problem.

Actually, no, Sony cannot just overclock it; that would introduce problems with games without framerate caps, and it would make everything use more power and create more heat. The current cooling system and PSU are designed around specific limits, and going over them will cause hardware issues.

Avatar image for misterpmedia
misterpmedia

6209

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#283 misterpmedia
Member since 2013 • 6209 Posts

Has anyone posted anything to do with Mister X yet? I feel like this topic could use it ;).

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#284 commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@Martin_G_N said:

@evildead6789 said:

@Martin_G_N said:

We also know that devs aren't utilizing GPU compute on the PS4, which the PS4 is designed to be able to do. I'm not saying everything can be pushed over to the GPU, but animations, A.I., and other increasingly compute-heavy tasks can be.

The GPU difference is the biggest difference, and that is why the PS4 has better performance. And please, don't go by a Ubisoft game; they can't even get a stable framerate on a high-end rig. Bad code will always be bad code, no matter what hardware it's on.

It's already used in Unity.

Yet Unity still runs better on the Xbox One.

Simple: a bottleneck determines performance. You can have a workaround with GPGPU, but it's still not really the best way.

I really doubt they utilized GPU compute properly; not even first party devs have done that, and the game is poorly optimized across all platforms. It's terrible that they couldn't achieve a stable 30 fps on either console. Cut down on the NPCs, problem solved.

If the CPU really becomes a problem for devs, Sony can overclock it, no problem.

Actually, no, Sony cannot just overclock it; that would introduce problems with games without framerate caps, and it would make everything use more power and create more heat. The current cooling system and PSU are designed around specific limits, and going over them will cause hardware issues.

Not only that, this isn't just like a PC where you can change clock speeds. The system and games are not made for it, and the system would simply crash.

Avatar image for Martin_G_N
Martin_G_N

2124

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#285  Edited By Martin_G_N
Member since 2006 • 2124 Posts

@evildead6789: Are you telling me that in 2014, when a lot of CPUs can change their clock speed depending on load, that wouldn't work with a console? The PSP was upclocked back in the day. And there is definitely headroom for it; it isn't running that hot. I'm not talking increases of several GHz, just a small upclock like the X1's.

Avatar image for vpacalypse
vpacalypse

589

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#286 vpacalypse
Member since 2005 • 589 Posts

Someone please explain GPGPU to this guy, and the near irrelevance of the CPU boost from 1.6 GHz up to 1.75 GHz. All that does is bring slight framerate benefits, and the CPU handles AI. In the end, the GPU matters a lot more.

AC Unity would run well across the board... if they hadn't rushed it.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#287 tormentos
Member since 2003 • 33784 Posts

@evildead6789 said:

Tomb Raider and Sniper Elite 3 were from the beginning of this gen. In the beginning of this gen MS made a lot of mistakes with their OS and software configuration, and the ESRAM tools only became available later as well. Those games are also cross-gen games, so they are less stressful for the system and didn't run into bottlenecks. Those are the only reasons the PS4 got such a lead; sadly, that was in the past.

COD dropped in resolution because it was GPU bound at higher resolution on the X1, and it also has smoother framerates on the X1, something the PS4 can't solve by decreasing the resolution a bit. I'd rather have a tiny bit less resolution (which isn't even noticeable) than a lagfest lol.

Your link is from a site that only reviews PlayStation games.

How desperate can you be... They only said that it was to avoid debates, because they didn't want to contradict themselves, since they were so happy they could make Watch Dogs much better on the PS4. Sadly, that was in the past, and the past is the past.

Here let me put something in big words

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

Xbox One Vs. PS4: ‘Assassin’s Creed: Unity’ CPU Bound And Slower On The PlayStation 4

http://www.inquisitr.com/1612173/xbox-one-vs-ps4-assassins-creed-unity-cpu-bound-and-slower-on-the-playstation-4/

I know this must be a real nightmare for you, but I have to say, I told you so...

Yeah, the difference is that mine comes from Ubisoft itself..hahaha Not from a third party.

Sniper Elite 3 is basically a July game, idiot, and it uses the latest Xbox One update; in fact, Rebellion paraded for months how the game was coming along fine and how they weren't worried any more because of the update..

They [Microsoft] are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.”

http://www.developer-tech.com/news/2014/feb/17/xbox-one-1080p-comfortably-sdk-update-says-rebellion/

It was the poster child of the Kinect reservation and the confirmation that the Xbox One could run 1080p. The Xbox One version had to be locked at 30 for v-sync; unlocked, it had horrible tearing. The game wasn't held back on PS4 like ACU was, so the advantage was there and pretty easy to see: up to 30 FPS faster, 20 FPS faster on average..hahaha

SE3 used the latest Xbox One SDK when it launched, along with the Kinect reservation...hahaha Nice try...

Smoother in single player by a small margin; online, both versions are basically locked at 60 FPS, but the Xbox One version of COD is 1360x1080 while the PS4 version remains solid..lol

But but but, SE3 is a game from the beginning of the gen, it didn't use the tools lol. Owned again...

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#288 tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

Actually, no, Sony cannot just overclock it; that would introduce problems with games without framerate caps, and it would make everything use more power and create more heat. The current cooling system and PSU are designed around specific limits, and going over them will cause hardware issues.

Actually, it is the Xbox One that gets hotter, even though it has a bigger case, a bigger fan and no internal PSU; the PS4 isn't running even close to what would be considered dangerously hot.

I already posted a comparison of both done by DF; the PS4 was actually cooler while gaming even though it draws 5 more watts than the Xbox One.

@evildead6789 said:

Not only that, this isn't just like a PC where you can change clock speeds. The system and games are not made for it, and the system would simply crash.

Nope...

@evildead6789 said:

It's already used in Unity.

Yet Unity still runs better on the Xbox One.

Simple: a bottleneck determines performance. You can have a workaround with GPGPU, but it's still not really the best way. You could say Ubi's code is not that good, but Unity does look gorgeous.

No it is not; in fact, if it were, it would have beaten the Xbox One silly.

Funny enough, it was the Xbox One that held the PS4 back at 900p, because the game can't run 1080p at even close to 30 FPS. BF was 900p, and ACU was doing more; the PS4's framerate would have stayed the same in Unity at 1080p, while on Xbox One it would have gone to hell, as it would have become GPU bound.

The thing here is simple: there is a barrage of games superior on PS4 and just one on Xbox One, and that one has a controversy around it because the developer held back one machine to avoid arguments.

Hahahahaa

Don't worry, you are not going anywhere and neither am I, so you will see more inferior games come your way on Xbox One, and we will see if you hold to the same excuse of it being an early-gen game..hahaha

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#289  Edited By commander
Member since 2010 • 16217 Posts

@vpacalypse said:

Someone please explain GPGPU to this guy, and the near irrelevance of the CPU boost from 1.6 GHz up to 1.75 GHz. All that does is bring slight framerate benefits, and the CPU handles AI. In the end, the GPU matters a lot more.

AC Unity would run well across the board... if they hadn't rushed it.

but of course it would

Avatar image for 04dcarraher
04dcarraher

23829

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#290  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Martin_G_N said:

Are you telling me that in 2014, when a lot of CPUs can change their clock speed depending on load, that wouldn't work with a console? The PSP was upclocked back in the day. And there is definitely headroom for it; it isn't running that hot. I'm not talking increases of several GHz, just a small upclock like the X1's.

Here is the problem: all cores are being fully utilized, which means you would have to overclock all of them to actually see a difference. Also, the Jaguar architecture's performance per clock is only around 15% better than their Bobcat series. You would need to get those Jaguars to 2 GHz or better to actually see a noticeable difference. The 150 MHz upclock is nearly pointless.
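
To put rough numbers on that upclock (a minimal sketch in Python, assuming the widely reported 1.6 GHz PS4 and 1.75 GHz post-upclock X1 CPU clocks):

ps4_cpu, x1_cpu = 1.60, 1.75    # GHz
print(x1_cpu / ps4_cpu - 1)     # 0.09375 -> ~9.4% higher clock on paper, before scaling losses
print(2.00 / ps4_cpu - 1)       # 0.25    -> the ~25% jump a 2 GHz part would represent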

Avatar image for Krelian-co
Krelian-co

13274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#291 Krelian-co
Member since 2006 • 13274 Posts

@evildead6789 said:

@vpacalypse said:

Someone please explain GPGPU to this guy, and the near irrelevance of the CPU boost from 1.6 GHz up to 1.75 GHz. All that does is bring slight framerate benefits, and the CPU handles AI. In the end, the GPU matters a lot more.

AC Unity would run well across the board... if they hadn't rushed it.

but of course it would

That picture fits your delusion perfectly; even your fellow lems are telling you the PS4 is stronger, there is no debate. But keep being delusional and in denial.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#292 commander
Member since 2010 • 16217 Posts

@04dcarraher said:

@Martin_G_N said:

Are you telling me that in 2014, when a lot of CPUs can change their clock speed depending on load, that wouldn't work with a console? The PSP was upclocked back in the day. And there is definitely headroom for it; it isn't running that hot. I'm not talking increases of several GHz, just a small upclock like the X1's.

Here is the problem: all cores are being fully utilized, which means you would have to overclock all of them to actually see a difference. Also, the Jaguar architecture's performance per clock is only around 15% better than their Bobcat series. You would need to get those Jaguars to 2 GHz or better to actually see a noticeable difference. The 150 MHz upclock is nearly pointless.

lies

Avatar image for hatecalledlove
hatecalledlove

1383

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#293  Edited By hatecalledlove
Member since 2004 • 1383 Posts

@evildead6789:

Except that the toolset has been around for a while; launch games used the ESRAM toolset. Crytek even came out and said it was a bottleneck for the system. On top of that, Shadow of Mordor uses the extra 10% that getting rid of Kinect gave developers.

It uses the full power of the Xbox One and all the tools, and it was on the border of being gimped because of a lack of power. That's direct from the developers themselves.

As for GTA5, it's been noticeably downgraded in a lot of sections in the Xbox One version, and even then the system does show performance hits (just not as severe) in the same areas as the PS4 when driving. It also shows issues in other parts of town that the PS4 has no problems with. The Xbox One version shows slowdown when explosions are on screen as well. On top of that, the PS4 version has more effects and more foliage throughout the game.

http://www.eurogamer.net/articles/digitalfoundry-2014-grand-theft-auto-5-performance-analysis

The issues you're talking about aren't clear cut as a CPU bottleneck. There are other variables in play when comparing the two systems. Not that the PS4 CPU isn't a bottleneck or going to be one in the future, because it will be. Same with the Xbox One GPU: it is creating issues and will continue to create them. Luckily the PS4 CPU doesn't have to handle the background operations on the system, and MS put in the ESRAM to help with the GPU problems on the Xbox One.

Both systems aren't that powerful, to be honest; on the plus side, this generation should only last 5 years because of the low power of the two systems.

Avatar image for Krelian-co
Krelian-co

13274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#294 Krelian-co
Member since 2006 • 13274 Posts

@evildead6789 said:

@04dcarraher said:

@Martin_G_N said:

Are you telling me that in 2014, when a lot of CPUs can change their clock speed depending on load, that wouldn't work with a console? The PSP was upclocked back in the day. And there is definitely headroom for it; it isn't running that hot. I'm not talking increases of several GHz, just a small upclock like the X1's.

Here is the problem: all cores are being fully utilized, which means you would have to overclock all of them to actually see a difference. Also, the Jaguar architecture's performance per clock is only around 15% better than their Bobcat series. You would need to get those Jaguars to 2 GHz or better to actually see a noticeable difference. The 150 MHz upclock is nearly pointless.

lies

Compelling argument. Together with the hall-of-fame delusion in your opening post in the thread, we can safely assume you are either deranged, delusional, or 10 years old.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#295  Edited By commander
Member since 2010 • 16217 Posts

@Krelian-co said:

@evildead6789 said:

@04dcarraher said:

@Martin_G_N said:

Are you telling me that in 2014, when a lot of CPUs can change their clock speed depending on load, that wouldn't work with a console? The PSP was upclocked back in the day. And there is definitely headroom for it; it isn't running that hot. I'm not talking increases of several GHz, just a small upclock like the X1's.

Here is the problem: all cores are being fully utilized, which means you would have to overclock all of them to actually see a difference. Also, the Jaguar architecture's performance per clock is only around 15% better than their Bobcat series. You would need to get those Jaguars to 2 GHz or better to actually see a noticeable difference. The 150 MHz upclock is nearly pointless.

lies

Compelling argument. Together with the hall-of-fame delusion in your opening post in the thread, we can safely assume you are either deranged, delusional, or 10 years old.

Avatar image for Krelian-co
Krelian-co

13274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#296 Krelian-co
Member since 2006 • 13274 Posts

@evildead6789 said:

@Krelian-co said:

@evildead6789 said:

@04dcarraher said:

@Martin_G_N said:

Are you telling me that in 2014, when a lot of CPUs can change their clock speed depending on load, that wouldn't work with a console? The PSP was upclocked back in the day. And there is definitely headroom for it; it isn't running that hot. I'm not talking increases of several GHz, just a small upclock like the X1's.

Here is the problem: all cores are being fully utilized, which means you would have to overclock all of them to actually see a difference. Also, the Jaguar architecture's performance per clock is only around 15% better than their Bobcat series. You would need to get those Jaguars to 2 GHz or better to actually see a noticeable difference. The 150 MHz upclock is nearly pointless.

lies

Compelling argument. Together with the hall-of-fame delusion in your opening post in the thread, we can safely assume you are either deranged, delusional, or 10 years old.

Go look up irony in your nearest dictionary.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#297  Edited By ronvalencia
Member since 2008 • 29612 Posts
@clr84651 said:

PS4's RAM > X1's

PS4's GPU > X1's

PS4's CPU = X1's CPU

You enjoy your X1. It's OK if it doesn't have as high of hardware as PS4

RAM: PS4 > X1. In general, PS4's solution is superior.

GPU programmable: PS4 > X1, i.e. 1.84 TF with 8 ACE vs 1.31 TF with 2 ACE. PS4's 8 ACE maximise GPGPU hardware usage.

GPU raster: PS4 > X1, i.e. PS4's 32 ROPS is limited by memory bandwidth to an effective count of about 24 ROPS, vs X1's 17 ROPS effective (16 ROPS at 853 MHz ~= 17 ROPS at 800 MHz). X1's dual graphics command units maximise the GPU's graphics-related hardware usage and reduce the CPU's GPU graphics context overheads/management issues, but this doesn't change the CU and ROPS limits. Multi-threaded CPU-to-GPU submission can maximise the GPU's graphics-related hardware usage, i.e. AMD Mantle enables this feature on AMD's PC GCN side. PS4 has Mantle-like APIs, hence DirectX 12 is a yawn.

GPU tessellation: X1 > PS4, i.e. X1's minor edge of 1.7 billion triangles per second vs PS4's 1.6 billion triangles per second.

CPU: X1 > PS4, i.e. X1's minor 853 MHz vs 800 MHz.

**On AMD GCN ISA, the CU scalar processor can take over some of the CPU-related GPU management workloads (Ref 1), but this feature is not covered by DirectX or OpenGL.

Ref 1 http://timothylottes.blogspot.com.au/2013/08/notes-on-amd-gcn-isa.html as an example

DX and GL are years behind in API design compared to what is possible on GCN. For instance there is no need for the CPU to do any binding for a traditional material system with unique shaders/textures/samplers/buffers associated with geometry. Going to the metal on GCN, it would be trivial to pass a 32-bit index from the vertex shader to the pixel shader, then use the 32-bit index and S_BUFFER_LOAD_DWORDX16 to get constants, samplers, textures, buffers, and shaders associated with the material. Do a S_SETPC to branch to the proper shader.

This kind of system would use a set of uber-shaders grouped by material shader register usage. So all objects sharing a given uber-shader can get drawn in say one draw call (draw as many views as you have GPU perf for). No need for traditional vertex attributes either, again just one S_SETPC branch in the vertex shader and manually fetch what is needed from buffers or textures...

There is more than one way to minimise CPU overhead.

AMD Mantle-driven Radeon HD 7750 CrossFire with two graphics command units doesn't beat a Mantle-driven Radeon HD 7970 GHz Edition with one large graphics command unit.

The Radeon R9 290's and R9 285's graphics command unit (GCU) has to drive four tessellation units, hence the GCU would be scaled up accordingly.

Both the X1's and PS4's GCN GPUs have two tessellation units.
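
For anyone who wants to see roughly where those throughput figures come from, here is a back-of-envelope sketch (my own illustration in Python, assuming the commonly cited 18 CU / 32 ROP PS4 GPU at 800 MHz, the 12 CU / 16 ROP X1 GPU at 853 MHz, ~176 GB/s of GDDR5 bandwidth, and roughly 8 bytes of memory traffic per blended pixel; treat the exact values as approximations, not a spec sheet):

# shader throughput: CUs * 64 lanes * 2 ops (FMA) * clock in GHz = GFLOPS
ps4_gflops = 18 * 64 * 2 * 0.800    # ~1843 GFLOPS, i.e. the 1.84 TF figure
x1_gflops  = 12 * 64 * 2 * 0.853    # ~1310 GFLOPS, i.e. the 1.31 TF figure

# pixel fill rate: ROPs * clock in GHz = Gpixels/s
x1_fill  = 16 * 0.853               # ~13.6 Gpixel/s, i.e. roughly "17 ROPS-worth at 800 MHz"
ps4_fill = 32 * 0.800               # ~25.6 Gpixel/s on paper

# PS4's fill rate is capped by memory bandwidth rather than ROP count
ps4_bw_cap = 176 / 8                # ~22 Gpixel/s if each pixel costs ~8 bytes of traffic,
                                    # which is why the 32 ROPS behave like an effective mid-20s count

# primitive setup: both GPUs handle 2 triangles per clock
x1_tris, ps4_tris = 2 * 0.853, 2 * 0.800    # ~1.7 vs ~1.6 billion triangles/s

print(ps4_gflops, x1_gflops, x1_fill, ps4_fill, ps4_bw_cap, x1_tris, ps4_tris)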

Avatar image for killzowned24
killzowned24

7345

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#298 killzowned24
Member since 2007 • 7345 Posts

Wow, are there really still lems that believe this BS, even though the Xbone gets on average three worse-running games EVERY single month!!?

Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#299 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@misterpmedia said:

Has anyone posted anything to do with Mister X yet? I feel like this topic could use it ;).

Insider: Tomb raider is superior to uncharted 4. Graphically I find ryse to look a bit better in certain places. Uncharted 4 textures look flat. When you see tomb raiders lighting system your going to be shocked it's a generation above.

Avatar image for btk2k2
btk2k2

440

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#300 btk2k2
Member since 2003 • 440 Posts

@ronvalencia said:
@clr84651 said:

PS4's RAM > X1's

PS4's GPU > X1's

PS4's CPU = X1's CPU

You enjoy your X1. It's OK if it doesn't have as high of hardware as PS4

RAM: PS4 > X1. In general, PS4's solution is superior.

GPU programmable: PS4 > X1, i.e. 1.84 TF with 8 ACE vs 1.31 TF with 2 ACE. PS4's 8 ACE maximise GPGPU hardware usage.

GPU raster: PS4 > X1, i.e. PS4's 32 ROPS is limited by memory bandwidth to an effective count of about 24 ROPS, vs X1's 17 ROPS effective (16 ROPS at 853 MHz ~= 17 ROPS at 800 MHz). X1's dual graphics command units maximise the GPU's graphics-related hardware usage and reduce the CPU's GPU graphics context overheads/management issues, but this doesn't change the CU and ROPS limits. Multi-threaded CPU-to-GPU submission can maximise the GPU's graphics-related hardware usage, i.e. AMD Mantle enables this feature on AMD's PC GCN side. PS4 has Mantle-like APIs, hence DirectX 12 is a yawn.

GPU tessellation: X1 > PS4, i.e. X1's minor edge of 1.7 billion triangles per second vs PS4's 1.6 billion triangles per second.

CPU: X1 > PS4, i.e. X1's minor 853 MHz vs 800 MHz.

**On AMD GCN ISA, the CU scalar processor can take over some of the CPU-related GPU management workloads (Ref 1), but this feature is not covered by DirectX or OpenGL.

Ref 1 http://timothylottes.blogspot.com.au/2013/08/notes-on-amd-gcn-isa.html as an example

DX and GL are years behind in API design compared to what is possible on GCN. For instance there is no need for the CPU to do any binding for a traditional material system with unique shaders/textures/samplers/buffers associated with geometry. Going to the metal on GCN, it would be trivial to pass a 32-bit index from the vertex shader to the pixel shader, then use the 32-bit index and S_BUFFER_LOAD_DWORDX16 to get constants, samplers, textures, buffers, and shaders associated with the material. Do a S_SETPC to branch to the proper shader.

This kind of system would use a set of uber-shaders grouped by material shader register usage. So all objects sharing a given uber-shader can get drawn in say one draw call (draw as many views as you have GPU perf for). No need for traditional vertex attributes either, again just one S_SETPC branch in the vertex shader and manually fetch what is needed from buffers or textures...

There is more than one way to minimise CPU overhead.

AMD Mantle-driven Radeon HD 7750 CrossFire with two graphics command units doesn't beat a Mantle-driven Radeon HD 7970 GHz Edition with one large graphics command unit.

The Radeon R9 290's and R9 285's graphics command unit (GCU) has to drive four tessellation units, hence the GCU would be scaled up accordingly.

Both the X1's and PS4's GCN GPUs have two tessellation units.

For once, Ron, I cannot disagree with anything you have said above. A minor correction on the CPU clock speed, though: it is 1.75 GHz vs 1.6 GHz, but as you said it is a minor bump. I could maybe argue that the Xbox One's ROPS will only be the effective equivalent of 17 of the PS4's if they are pulling data from ESRAM; if they need to grab data from system RAM then they are also bandwidth limited, but again this is a very minor thing.

In all, to think the Xbox One is the PS4's equal in terms of horsepower is delusional. Yes, it has a minor CPU clock speed advantage, which in a like-for-like situation will only yield the tiniest of advantages in certain scenarios. The issue is that at 1080p the GPUs in both consoles are too weak to really put the burden on the CPU, so in all but the most edge cases games will be GPU bound, not CPU bound. Further, in those edge cases that are CPU bound you very rarely get 100% clock speed scaling (unless you are StarCraft 2), so the 9.375% bump in clock speed would not yield a 9.375% performance advantage.

This shows that the Athlon 5350 with a 7970 can only outperform the 5150 with a 7970 by around 10%, despite having a 28.125% clock speed advantage. The 5350 is a 4-core Jaguar APU running at 2.05 GHz, the 5150 is a 4-core Jaguar APU running at 1.6 GHz, and the 7970 is a much faster GPU than what is in either the PS4 or the Xbox One, so what conclusions can we draw?

1) At best, a 28.125% clock speed increase sees a 21% increase in average framerate. This occurred in Company of Heroes, which is an RTS, and those are known for being very CPU intensive, just like 4X strategy games. On average the advantage is around 10.8%.

2) The 7970 is a much more powerful GPU than the PS4's, and it would have been interesting to see whether these tests were just as CPU limited with an R7 265, which is much closer to the PS4 in terms of performance metrics.

3) The Jaguar CPUs are miles slower than even a low-end i3. This means that maximising the consoles' use of GPU compute is an absolute must on both pieces of hardware.

4) This means that, on average, in part-CPU, part-GPU-bound scenarios, which are the most likely to occur, a 1% clock speed advantage results in a 0.384% FPS increase.

Based on the above, you would expect the Xbox One to outperform the PS4 by around 3.6% in part-CPU, part-GPU-limited scenarios. How that averages out over an entire game, or an entire level, I do not know; it will depend on CPU usage throughout, but I doubt any game, or any level, is CPU limited all of the time. It may be in specific parts, but in general it will be GPU limited with the occasional part being CPU limited. This also assumes the API overhead and optimisation are the same for both platforms, which is unlikely to be the case. The gap might be a bit bigger in specific areas where the Xbox One API has less overhead than the PS4 API, or where the Xbox One received more optimisation than the PS4, and smaller in the reverse scenario.

Ultimately, a 9.375% clock speed advantage does not mean much and will not lead to any meaningful advantage on the Xbox One.
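
The arithmetic behind those percentages, spelled out (a quick Python sketch using the Athlon 5350/5150 clocks, the FPS gains quoted above, and the 1.6 vs 1.75 GHz console CPU clocks):

clock_adv_test = 2.05 / 1.60 - 1        # 28.125% clock advantage in the Athlon comparison
avg_gain, best_gain = 0.108, 0.21       # ~10.8% average and ~21% best-case FPS gains observed

scaling = avg_gain / clock_adv_test     # ~0.384% of FPS per 1% of extra clock, on average
console_clock_adv = 1.75 / 1.60 - 1     # 9.375% X1 CPU clock advantage

print(round(scaling, 3))                         # 0.384
print(round(best_gain / clock_adv_test, 3))      # ~0.747 in the most CPU-bound case
print(round(console_clock_adv * scaling, 3))     # ~0.036 -> the ~3.6% expected X1 edge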