DX12 will not solve XBone's 1080p problems

#251 Posted by sts106mat (20394 posts) -

@GrenadeLauncher said:

@sts106mat said:

Sniper Elite? Lol the game looks like shit, it's not made by a developer known for their graphical prowess. Its also cross gen. Before you start, i played it on PS4.

I reckon Batman Arkham Knight will be the multiplat to compare since its not being developed for last gen consoles.

The point is, Simon, Sniper Elite 3 was also "at parity." I remember lemmings crowing about that. Result? Worst on the Bone.

no the point is, nobody cares about it

#252 Posted by GrenadeLauncher (6248 posts) -

@sts106mat said:

no the point is, nobody cares about it

Simon moving the goalposts.

"L-look! Parity game! Xbox won!!!"

Game turns out like shit on the Bone.

"Nobody cares about it!"

Try harder, Simon.

#254 Posted by sts106mat (20394 posts) -

@GrenadeLauncher said:

@sts106mat said:

no the point is, nobody cares about it

Simon moving the goalposts.

"L-look! Parity game! Xbox won!!!"

Game turns out like shit on the Bone.

"Nobody cares about it!"

Try harder, Simon.

wow you are a true fanboy.

I played the game on my PS4, it is shit. nobody cares about your claims of "parity"

in fact.....why do you care?

#255 Posted by Caseytappy (2173 posts) -

So, Xbone lower resolution and both downgraded console versions suffering from frame drops.

What else is new?

#256 Edited by tormentos (19927 posts) -

@FastRobby said:

Hahahaha, keep telling yourself that, this is too funny, I'm out because you're getting ridiculous, and I'm getting embarrassed in your place

Keep that spinning going..lol

@evildead6789 said:

the PS3's number of tflops has nothing to do with the discussion of memory bandwidth. The numbers were given out by Sony, and PS3s were bought to build supercomputer clusters; a cluster of 8 PS3s outperformed 100 Intel Xeon CPUs (source: wiki).

Whether it was programmable for games or not doesn't matter, because that's exactly my point: tflops are not an absolute measurement of gaming performance.

Your comment about the bandwidth is again hilarious, simply because you don't understand the subject lol. The 660 Ti is better than the 7850 because it has a higher clock speed, more shader cores, and double the texture mapping units; the smaller bus is countered by the higher memory speed (6000 MHz on the 660 Ti vs 4800 on the 7850).

The 7850's memory is still about 7 percent faster though, but the texture fill rate is twice as fast on the 660 Ti, and that's why the 660 Ti is faster. If it had a 256-bit bus it would have higher memory bandwidth, and the gap would be even bigger.

Is it really that difficult? I mean, it's just basic mathematics: multiply and compare lol. Anyway, no point in discussing this with you because you're not listening. It's perfectly understandable though; after you made such a fool of yourself, I wouldn't want to know it either.

but I wouldn't keep coming back and making it worse lol

Oh yes it does, because you are a troll who claims the PS3 has more flops than the PS4..hahaha

The GPU inside the PS3 is basically a higher-clocked 7600 GT; it has more or less the same bandwidth and ROPs, unlike the 7800 GTX, which has more. The 7600 GT wasn't 1.8 TF, it was more like 190 Gflops. Sony played the same game MS played when they called the Xbox 360 a 1 TF machine, which was also a joke; it wasn't.

Tflops are not a measurement when you talk AMD vs Nvidia; different architectures give different performance per flop. The RSX has more flops than the Xenos, yet the Xenos is more powerful.

Now, the flops here do matter, because these two GPUs are from the same company and the same series of cards; they are basically brothers, one just scaled down from the other.

GCN is a line: each GPU with more flops performs better than the one below it.

7970 > 7950 > 7870 > 7850 > 7770 > 7750: each and every GPU has more flops than the one below it, and that is how performance scales. The 7950 doesn't beat the 7970, period.

There is one exception to this rule: Bonaire, the 7790, which is the Xbox One GPU. Bonaire is 14 CU and 1.79 TF, actually about 30 Gflops higher than the 7850's 1.76 TF, but even so the 7850 beats it. The only GCN GPU with more flops that gets beaten by one with fewer is the 7790, which doesn't make the argument any better for you: the 7790 has a hair higher flop count because it runs at 1027 MHz compared to 860 MHz on the 7850, but since the 7850 has 2 extra CUs it beats it quite easily. Sound familiar?
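As an aside, the flop numbers being traded here all come from one formula: stream processors × 2 ops per clock (a fused multiply-add) × clock speed. A minimal sketch in Python; the 1000 MHz figure for the 7790 is an assumption (its reference clock), used here because it is what reproduces the 1.79 TF number quoted above:

```python
# Peak single-precision compute for a GCN GPU, in TFLOPS:
# 64 stream processors per CU, 2 FLOPs per clock (fused multiply-add).
def gcn_tflops(compute_units: int, clock_mhz: int) -> float:
    stream_processors = compute_units * 64
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

print(round(gcn_tflops(14, 1000), 2))  # 7790 (Bonaire): 1.79
print(round(gcn_tflops(16, 860), 2))   # 7850 (Pitcairn): 1.76
```

The ~30 Gflop gap between those two results is exactly the "hair higher flop count" described above, which the 7850's two extra CUs still beat in practice.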

The Xbox One has 53 MHz more GPU clock than the PS4, but the PS4 has 6 more CUs, which is why it performs better. Performance comes from the stream processors inside the CUs, not from the bus or the ESRAM, which is basically just memory; the Xbox One could have 300 GB/s and it would change nothing with that weak GPU. Nothing the Xbox One has will change that. It has a GPU on the level of the 7770 at 1.28 TF, and this is GCN, it is a line, and the only break in the pattern is Bonaire, so argument-wise you are even more screwed.

Thank you for stealth-admitting, in that bold part, the reason why the PS4 will always perform better.

So the PS4 has more shader power, double the ROPs, the same bus, and higher unified bandwidth as well.

The smaller bus isn't countered by anything, you brain-dead fanboy. The 660 Ti has a higher memory clock but a smaller bus, which is why it has less bandwidth; bus width plus speed is what determines the final bandwidth. A 192-bit bus at 6000 MHz effective = 144 GB/s. The 7850 has a 256-bit bus and a lower 4800 MHz memory clock, but since the bus is wider, the final bandwidth is 153.6 GB/s. So effectively the 7850 has a wider bus and more bandwidth than the 660 Ti, but it lacks the POWER, and that is the real point. Hell, it is even like that in the 7870 vs the 660 Ti, and even more so in the 7950 vs the 660 Ti.
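The arithmetic behind those two bandwidth figures is just bus width times effective memory data rate; a quick sketch, using the effective GDDR5 rates cited in the post:

```python
# Memory bandwidth in GB/s:
# (bus width in bits / 8) bytes per transfer x effective data rate.
def bandwidth_gbs(bus_width_bits: int, effective_mhz: int) -> float:
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(192, 6000))  # GTX 660 Ti: 144.0 GB/s
print(bandwidth_gbs(256, 4800))  # HD 7850: ~153.6 GB/s
```

Which is the whole point being argued: the 660 Ti's faster 6000 MHz memory on a 192-bit bus still comes out below the 7850's slower 4800 MHz memory on a 256-bit bus.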

Did you know the 660 Ti can beat the 7950 in many games? Even though the 7950 has almost 100 GB/s more bandwidth and a wider 384-bit bus versus the 7850's 256-bit one?

http://www.anandtech.com/bench/product/855?vs=860

My point is clearly made: bus width means sh** if you don't have the power to exploit that bus, which the Xbox One doesn't have. The 7850 has a 256-bit bus and so does the 7870; the 7790 and 7770 have 128-bit buses for a reason: less power, smaller bus. And the Xbox One is officially 1.28 TF, which is what the 7770 is, not the full 7790. So you have basically lost the argument here, not to mention that you claimed the Xbox One had a full 7790, which it doesn't, and which I PROVED.

Xbox One = 12 CU, 768 SP, 853 MHz, 1.28 TF

7790 = 14 CU, 896 SP, 1027 MHz, 1.79 TF

So yeah, exactly: the 660 Ti is faster because it has more power, regardless of the 7850's bus or bandwidth. You just confirmed my point. The PS4 has more shader units, more ROPs, and is stronger, period; the Xbox One having a 256-bit bus means nothing, the PS4 will always be ahead and always perform better.

@GrenadeLauncher said:

Joey displays his ignorance again. H2A, Oddworld New 'n' Tasty and Ratchet and Clank PS4 are remakes. TLOU and Halo 1/3/4 are remasters. This is a straight port, with an upscaling filter slapped on and pan and scan for the widescreen. It's so lazy it hurts.

Sniper Elite 3 had "parity" as well. We all know what happened there. It takes more than paper performance to make the Shitbone look good.

Sniper Elite 3 is the biggest example of how developers play Xbox fans for fools. The Sniper Elite developers spent months making statements about how the new devkit fixed all the problems and how they were no longer worried about the game not being 1080p; they called this game 1080p 60 FPS on Xbox One, and the truth is they had to run it without v-sync to keep up with the PS4, and it was still 5 to 10 FPS behind with lower quality effects as well.

Complaints arose about the horrible screen tearing the Xbox One version had; a patch was made and the game was officially locked to 30 FPS, instantly turning what in many parts was a 5 FPS gap (45 vs 40) into a 15 FPS gap. Worse, the PS4 many times hit 60 FPS and was always closer to 60 FPS than the Xbox One version ever was. It was a total disaster. Sniper Elite 3 showed once again that the Xbox One suffers greatly at 1080p, and just like in Tomb Raider the gap was huge; only lemmings deny this.

Their poster-child game of months failed. I can't even remember how many times Ronvalencia quoted Rebellion on ESRAM and the Xbox One hitting 1080p..lol

@sts106mat said:

Sniper Elite? Lol the game looks like shit, it's not made by a developer known for their graphical prowess. Its also cross gen. Before you start, i played it on PS4.

I reckon Batman Arkham Knight will be the multiplat to compare since its not being developed for last gen consoles.

Come on now, Sniper Elite looking like sh** in no way changes his argument: the Xbox One can't handle it the way the PS4 does, and it falls behind by as much as 30 FPS while having lower quality effects, Tomb Raider style.

So the Xbox One can't even handle a freaking cross-gen game, and you actually have faith that it will handle Batman, which by the way is crippled on AMD hardware thanks to Nvidia's efforts and money hats?

#257 Posted by tormentos (19927 posts) -

@GrenadeLauncher said:

Simon moving the goalposts.

"L-look! Parity game! Xbox won!!!"

Game turns out like shit on the Bone.

"Nobody cares about it!"

Try harder, Simon.

Spot on man spot on...lol

@FastRobby said:

Like shit, what an exaggeration... A lot of games must have looked like shit on your PS3 then, compared to the X360. But then it didn't matter, it only matters when you are winning... Children

Up to 30 FPS less with lower quality effects? Come on man, be freaking real..hahaha

You bought a 7770 for $500...lol

#258 Edited by StormyJoe (6439 posts) -

@GrenadeLauncher said:

@StormyJoe said:

It's not a port, it's a remake.

You are like the frog that slowly boils alive because the temp is raised slowly. I have given three examples of upcoming games that have parity on resolution and frame rate; yet you still doubt.

Joey displays his ignorance again. H2A, Oddworld New 'n' Tasty and Ratchet and Clank PS4 are remakes. TLOU and Halo 1/3/4 are remasters. This is a straight port, with an upscaling filter slapped on and pan and scan for the widescreen. It's so lazy it hurts.

Sniper Elite 3 had "parity" as well. We all know what happened there. It takes more than paper performance to make the Shitbone look good.

I have not read that (about RE). But, since I don't care about the series, I will concede the point.

Regardless, I gave examples of several games, of which the info is confirmed.

#259 Posted by StormyJoe (6439 posts) -

@sts106mat said:

@GrenadeLauncher said:

@StormyJoe said:

It's not a port, it's a remake.

You are like the frog that slowly boils alive because the temp is raised slowly. I have given three examples of upcoming games that have parity on resolution and frame rate; yet you still doubt.

Joey displays his ignorance again. H2A, Oddworld New 'n' Tasty and Ratchet and Clank PS4 are remakes. TLOU and Halo 1/3/4 are remasters. This is a straight port, with an upscaling filter slapped on and pan and scan for the widescreen. It's so lazy it hurts.

Sniper Elite 3 had "parity" as well. We all know what happened there. It takes more than paper performance to make the Shitbone look good.

Sniper Elite? Lol the game looks like shit, it's not made by a developer known for their graphical prowess. Its also cross gen. Before you start, i played it on PS4.

I reckon Batman Arkham Knight will be the multiplat to compare since its not being developed for last gen consoles.

Cows point to Sniper Elite like it is some highfalutin, AAA series,

#260 Posted by sts106mat (20394 posts) -

@StormyJoe said:

@sts106mat said:

@GrenadeLauncher said:

@StormyJoe said:

It's not a port, it's a remake.

You are like the frog that slowly boils alive because the temp is raised slowly. I have given three examples of upcoming games that have parity on resolution and frame rate; yet you still doubt.

Joey displays his ignorance again. H2A, Oddworld New 'n' Tasty and Ratchet and Clank PS4 are remakes. TLOU and Halo 1/3/4 are remasters. This is a straight port, with an upscaling filter slapped on and pan and scan for the widescreen. It's so lazy it hurts.

Sniper Elite 3 had "parity" as well. We all know what happened there. It takes more than paper performance to make the Shitbone look good.

Sniper Elite? Lol the game looks like shit, it's not made by a developer known for their graphical prowess. Its also cross gen. Before you start, i played it on PS4.

I reckon Batman Arkham Knight will be the multiplat to compare since its not being developed for last gen consoles.

Cows point to Sniper Elite like it is some highfalutin, AAA series,

of course.

they ignore games like thief and outlast where the XB1 was performing better

#261 Edited by tormentos (19927 posts) -

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

And there go the goalposts again; the quality of the game wasn't in question here, what was is the Xbox One's sh** performance on it.

@sts106mat said:

of course.

they ignore games like thief and outlast where the XB1 was performing better

Thief, the game that is 900p on Xbox One and 1080p on PS4, which lacks parallax mapping on Xbox One, has shadow pop-in problems on Xbox One, and runs slower than the PS4 version as well?

How the fu** does Thief perform better on Xbox One? How?

1080p vs 900p already shows a performance gap in favor of the PS4. The PS4 version lacks anisotropic filtering, which is countered by the Xbox One's lack of parallax mapping, which is what gives textures their sense of depth. The PS4 has streaming problems with some textures, the Xbox One version has streaming problems with shadows, and the PS4 runs 2 to 3 frames faster during drops.

There is not a single way to claim the Xbox One runs better.

[embedded video]

And Outlast dipped more during gameplay on Xbox One than on PS4. Watch the whole video and not just the first minute of cutscene; once the gameplay kicks in it is mostly the Xbox One that is behind, and that game is horribly optimized: it drops to 29 during cutscenes with nothing going on on screen.

#262 Edited by GrenadeLauncher (6248 posts) -

Keep deflecting, Joey. Then again it's better than using ignorance as a defence ("durrr i dunt no nuffin about rezzyden evul")

@sts106mat said:

wow you are a true fanboy.

I played the game on my PS4, it is shit. nobody cares about your claims of "parity"

in fact.....why do you care?

Simon moving the goalposts again. SE3 is a perfect example of your "parity."

@sts106mat said:

of course.

they ignore games like thief and outlast where the XB1 was performing better

Simon grasping at straws again. Thief was shit on both consoles (and the PS4 still had a better frame rate) and Outlast was better on the PS4, even though the Xbone version had four months extra optimisation time.

#263 Posted by Shewgenja (10620 posts) -

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

It has one of the highest IQs of this gen so far. I think that's why it gets pointed to.

#264 Posted by GrenadeLauncher (6248 posts) -

@Shewgenja said:

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

It has one of the highest IQs of this gen so far. I think that's why it gets pointed to.

Bu bu but Sniper Elite 3 d-doesn't count because it isn't made by a p-powerhouse AAA d-developer.

#265 Posted by B4X (5660 posts) -

@Shewgenja said:

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

It has one of the highest IQs of this gen so far. I think that's why it gets pointed to.

Also got patched.

The King Graphics Machine needs no help :P

#266 Edited by GrenadeLauncher (6248 posts) -

@b4x said:

@Shewgenja said:

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

It has one of the highest IQs of this gen so far. I think that's why it gets pointed to.

Also got patched.

The King Graphics Machine needs no help :P

And it still had issues with the patch.

#267 Edited by B4X (5660 posts) -

@GrenadeLauncher said:

@b4x said:

@Shewgenja said:

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

It has one of the highest IQs of this gen so far. I think that's why it gets pointed to.

Also got patched.

The King Graphics Machine needs no help :P

And it still had issues with the patch.

Maybe next year. Quantum Break. May say No.

#268 Posted by tormentos (19927 posts) -

@b4x said:

Also got patched.

The King Graphics Machine needs no help :P

On Xbox One it was, and it locked to 30 FPS; the PS4 version can do 60 FPS, you know..lol

@b4x said:

Maybe next year. Quantum Break. May say No.

A developer making an Xbox-only game says its game will make you speechless..

Blackace as always falls for it..lol

#270 Edited by B4X (5660 posts) -

@tormentos said:

@b4x said:

Also got patched.

The King Graphics Machine needs no help :P

On Xbox One it was, and it locked to 30 FPS; the PS4 version can do 60 FPS, you know..lol

@b4x said:

Maybe next year. Quantum Break. May say No.

A developer making an Xbox-only game says its game will make you speechless..

Blackace as always falls for it..lol

You scared yet? 6 days till your ass hurts; damage control is at an all-time high. :P

Will be glorious. Can you stop it?

#271 Posted by GrenadeLauncher (6248 posts) -

Hey blackace, I heard the impending release of GTA5 will feature a great online mode that works day one and has loads of heists, get hyped.

#272 Posted by sts106mat (20394 posts) -

@GrenadeLauncher said:

Keep deflecting, Joey. Then again it's better than using ignorance as a defence ("durrr i dunt no nuffin about rezzyden evul")

@sts106mat said:

wow you are a true fanboy.

I played the game on my PS4, it is shit. nobody cares about your claims of "parity"

in fact.....why do you care?

Simon moving the goalposts again. SE3 is a perfect example of your "parity."

@sts106mat said:

of course.

they ignore games like thief and outlast where the XB1 was performing better

Simon grasping at straws again. Thief was shit on both consoles (and the PS4 still had a better frame rate) and Outlast was better on the PS4, even though the Xbone version had four months extra optimisation time.

I haven't once claimed anything about "parity". Stop being a baby.

#273 Edited by commander (8621 posts) -

@tormentos said:

@evildead6789 said:

the PS3's number of tflops has nothing to do with the discussion of memory bandwidth. [snip]

Oh yes it does, because you are a troll who claims the PS3 has more flops than the PS4..hahaha

[snip: full post #256 above]

So yeah, exactly: the 660 Ti is faster because it has more power, regardless of the 7850's bus or bandwidth. You just confirmed my point. The PS4 has more shader units, more ROPs, and is stronger, period; the Xbox One having a 256-bit bus means nothing, the PS4 will always be ahead and always perform better.


The PS3 has more tflops than the PS4. Those numbers were confirmed by various scientists, and you haven't given me a single source that says otherwise. That's why I said that tflops don't always give the right picture. You're comparing the PS3's RSX with a desktop GPU, but that is completely wrong, because the PS3 is basically a CPU mixed with a GPU; these are not completely separate parts like on a PC.

So while tflops might be very good for measuring desktop graphics cards, for consoles they don't always give the right picture, because consoles are different chips working together. A desktop has this too, but everything is a lot more separate. Still, your comparison of the 7790 and 7850 proves my point: the tflops on the 7790 are higher than on the 7850, but it's clear as day that the 7850 is the stronger card.

Your comparison between the 7950 and the 660 Ti is wrong because those benchmarks were done with the HD 7950 at stock clocks, and that was basically an underclocked card. It was also before the Never Settle driver updates. A 7950 easily matches a 660 Ti in performance. The 7950 will also be a lot better at higher resolutions because of its higher bandwidth and memory, but this is beside the point, because that card is simply not in the same league as the consoles; it is much better.

Here's a link to an HD 7770 that is overclocked 31 percent on the memory and 6 percent on the core clock. The overclock showed a 10 percent increase in performance. The memory bandwidth on this card is already higher at stock clocks than the Xbox One GPU's: it has 73 GB/s and the X1 GPU has 68 GB/s. The overclocked card has a memory bandwidth of 92 GB/s.

So that card gets a 10 percent increase in performance mainly from memory overclocking, but the Xbox One's GPU would not get any benefit from that? And the X1's GPU is a bit stronger than this card. The Xbox One also has shared memory, so memory bandwidth in its case would show up more in game performance than the memory bandwidth of a desktop GPU would, since desktop memory isn't shared.

Another site also reports that the HD 7770's biggest handicap is its memory bandwidth.

And again, the 256-bit bus is needed to communicate with the ESRAM, since those are 256-bit chips as well. I never said that the PS4 wasn't stronger than the Xbox One, but you make it out to be such a big difference, and if the ESRAM is used that difference becomes very small, as games like Wolfenstein prove.

If the devs don't use that ESRAM then that's sad for X1 users, but still, it's just a difference of 900p vs 1080p, because the 720p vs 1080p gap in launch games was down to the Kinect reservation. And I really doubt devs won't use that ESRAM in the future, because DX12 will become very common and the ESRAM is perfect for DX12 tools.

#274 Edited by sts106mat (20394 posts) -

@tormentos said:

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

And there go the goalposts again; the quality of the game wasn't in question here, what was is the Xbox One's sh** performance on it.

[snip]

don't know, don't care. Enjoy arguing about dropped frames and resolution. I can buy games on either machine....you are preaching to the choir, eltormo

#275 Posted by clone01 (25178 posts) -

This is 6 pages of fanboyish sadness. Seriously, do some of you even game, or do you just masturbate to Digital Foundry and spec sheets?

#276 Edited by SoftwareGeek (539 posts) -

@slimdogmilionar:

I guess you are talking about the sales figures.

http://www.polygon.com/2014/4/17/5626130/xbox-one-sales-5-million

http://en.wikipedia.org/wiki/List_of_million-selling_game_consoles

#277 Posted by cainetao11 (19031 posts) -

The only problem my Xbox One has is it's not September yet. All I play on it is Titanfall at this point, and that's like once a week. Come on Destiny, then FH2, then Sunset Overdrive, a little bit of DA Inquisition in my life, maybe this is why I have no wife.........

#278 Posted by StormyJoe (6439 posts) -

@tormentos said:

@StormyJoe said:

Cows point to Sniper Elite like it is some highfalutin, AAA series,

And there go the goalposts again; the quality of the game wasn't in question here, what was is the Xbox One's sh** performance on it.

[snip]

Oh wait... I have to look at DF to be able to see a difference?

LOL

#279 Posted by SoftwareGeek (539 posts) -

@xboxiphoneps3:

you're missing the point.

#280 Posted by GrenadeLauncher (6248 posts) -

"Outlast is better on Xbone, hahaha cows!"

"No it isn't, here's indisputable evidence of that"

"N-Nobody cares about it"

Simon does it again!

#281 Posted by clone01 (25178 posts) -

@cainetao11 said:

The only problem my Xbox One has is it's not September yet. All I play on it is Titanfall at this point, and that's like once a week. Come on Destiny, then FH2, then Sunset Overdrive, a little bit of DA Inquisition in my life, maybe this is why I have no wife.........

I bought an Xbox One and haven't played it in months. My 360 gets waaaay more gaming time. But yeah, Destiny and Sunset Overdrive look cool. I'll probably pick up a PS4 at some point, as I enjoy the Infamous series (not to mention The Order and Bloodborne have me excited).

#282 Posted by GrenadeLauncher (6248 posts) -

@StormyJoe said:

Oh wait... I have to look at DF to be able to see a difference?

LOL

Joey missing the bigger picture: the closest actual parity game so far has been an indie game with four months extra optimisation time on the Bone.

#283 Posted by Gue1 (11144 posts) -

the only purpose of the xbone is to force multiplat devs to gimp their games.

#284 Edited by cainetao11 (19031 posts) -

@clone01: if I still had my 360 I'd play the games I have on the HDD, but I gave it to my nephew. But I still play my PS3 more, and PC as well.

#285 Posted by Shewgenja (10620 posts) -

@GrenadeLauncher said:

"Outlast is better on Xbone, hahaha cows!"

"No it isn't, here's indisputable evidence of that"

"N-Nobody cares about it"

Simon does it again!

Hush you, leave your logic out of this!

#286 Edited by SoftwareGeek (539 posts) -

@tormentos:

So that Outlast video is your proof that the PS4 is superior? Pretty much looks like a toss-up to me. You need to do much better to prove your point; too close to call. When the X1 starts offloading computations to the cloud, its frame rate will be consistently higher. The PS4 right now doesn't have that ability. Follow the link, grasshopper. You can start about 1:00 into the video.

https://www.youtube.com/watch?v=QxHdUDhOMyw

#287 Posted by GrenadeLauncher (6248 posts) -

@softwaregeek said:

@tormentos:

When the X1 starts offloading computations to the cloud, its frame rate will be consistently higher.

Aaaaaaaahahahahahaahahahaha, lemmings still clinging to that canard.

#289 Posted by B4X (5660 posts) -

@GrenadeLauncher said:

@softwaregeek said:

@tormentos:

When the X1 starts offloading computations to the cloud, its frame rate will be consistently higher.

Aaaaaaaahahahahahaahahahaha, lemmings still clinging to that canard.

The cloud is fake!! ~GrenadeLauncher

#290 Posted by SoftwareGeek (539 posts) -

@GrenadeLauncher:

You can deny it all you want, but that ability gives the x1 an advantage. It's the next evolution for gaming. That's okay though because you still have a decent system with the ps4.

#291 Posted by SoftwareGeek (539 posts) -

@b4x said:

@GrenadeLauncher said:

@softwaregeek said:

@tormentos:

When the X1 starts offloading computations to the cloud, its frame rate will be consistently higher.

Aaaaaaaahahahahahaahahahaha, lemmings still clinging to that canard.

The cloud is fake!! ~GrenadeLauncher

just like the internet....fake.

#292 Posted by B4X (5660 posts) -

@Stringerboy said:

So how is Bungie managing to get destiny to 1080p?

They're a competent developer. :P

#293 Posted by ccagracing (819 posts) -

@softwaregeek said:

@GrenadeLauncher:

You can deny it all you want, but that ability gives the x1 an advantage. It's the next evolution for gaming. That's okay though because you still have a decent system with the ps4.

It will give an advantage if all users have the XB1 connected to the internet, but there is a catch: developers have to code games to the lowest common denominator, and there are two of them on XB1. If the game is multiplat, they most likely will not implement this cloud feature set, as PC and PS4 don't have this toolset. The second point is: what if the cloud (MS servers) goes down? How would I play my game if it controls game-critical content? In an ideal 100% connected world the cloud could really make a difference, but I believe we won't see much of it this gen other than to control AI, as per Titanfall, or maybe something like skybox rendering.
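The split described above (keep game-critical, frame-bound work local; only ship latency-tolerant jobs like AI planning or skybox work off to a server) can be sketched roughly like this. All task names, deadlines and the round-trip figure are made up for illustration, not taken from any actual engine:

```python
# Toy illustration of local-vs-cloud task scheduling: per-frame work
# that must finish within the frame budget runs locally; only jobs
# that can absorb a server round trip, and that the game can survive
# losing if the connection drops, get queued for a remote server.

FRAME_BUDGET_MS = 16.7  # 60 fps frame time

tasks = [
    ("render_gbuffer",    {"deadline_ms": 16.7,   "game_critical": True}),
    ("player_input",      {"deadline_ms": 16.7,   "game_critical": True}),
    ("squad_ai_planning", {"deadline_ms": 500.0,  "game_critical": False}),
    ("skybox_bake",       {"deadline_ms": 2000.0, "game_critical": False}),
]

def schedule(tasks, round_trip_ms=100.0):
    """Offload only non-critical tasks whose deadline exceeds the
    server round-trip time; everything else stays on the console."""
    local, remote = [], []
    for name, t in tasks:
        if t["game_critical"] or t["deadline_ms"] < round_trip_ms:
            local.append(name)
        else:
            remote.append(name)
    return local, remote

local, remote = schedule(tasks)
print(local, remote)
```

Note the failure mode this models: if the servers vanish, everything in `remote` has to degrade gracefully or fall back to a cheaper local version, which is exactly why game-critical work can't live there.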

#294 Posted by Tighaman (1038 posts) -

ESRAM is not the problem per se, but it is a problem for deferred rendering and time-constrained devs, because most devs don't tightly pack buffers and render targets, and they also don't like to split g-buffers, since that is time-consuming and costly. With the X1 you need to split the buffers and move the data that needs to be fast into the ESRAM and the rest into DRAM. GDDR5 is dying; ESRAM, HBM or HBC on the GPU is going to be the future. That's why the guy from Valve says tiled resources and megatextures, or MESH (what Lionheart is designing), will win in the long run.
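A rough sketch of the budgeting problem described above: at 1080p a handful of full-screen g-buffer targets already brushes against the Xbox One's 32 MB of ESRAM, so something has to decide which targets stay in fast memory and which spill to DRAM. The target list, byte sizes and priorities here are illustrative guesses, and the greedy placement is a toy, not how any real engine packs ESRAM:

```python
# Rough sketch of fitting deferred-rendering g-buffer targets into the
# Xbox One's 32 MB of ESRAM, spilling the rest to DRAM. The layout,
# formats and priorities below are illustrative, not from any engine.

ESRAM_BYTES = 32 * 1024 * 1024
W, H = 1920, 1080

# (name, bytes per pixel, bandwidth priority: lower = hotter)
targets = [
    ("albedo",    4, 1),
    ("normals",   4, 0),
    ("depth",     4, 0),
    ("specular",  4, 2),
    ("hdr_light", 8, 1),
]

def place_targets(targets, budget=ESRAM_BYTES, w=W, h=H):
    """Greedily place the most bandwidth-hungry render targets in
    ESRAM until the budget runs out; the rest go to DRAM."""
    esram, dram, used = [], [], 0
    for name, bpp, _prio in sorted(targets, key=lambda t: t[2]):
        size = w * h * bpp  # bytes for one full-screen target
        if used + size <= budget:
            esram.append(name)
            used += size
        else:
            dram.append(name)
    return esram, dram, used

esram, dram, used = place_targets(targets)
print(esram, dram, used / 2**20)
```

Even this toy shows the pain point: a single 64-bit 1080p target is ~16 MB, half the budget, which is why devs end up splitting buffers or tiling instead of just dropping everything into fast memory.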

#295 Posted by GrenadeLauncher (6248 posts) -

@softwaregeek said:

@GrenadeLauncher:

You can deny it all you want, but that ability gives the x1 an advantage. It's the next evolution for gaming. That's okay though because you still have a decent system with the ps4.

Just like PS Now, I can't wait to deal with input lag on current gen asset sizes, DRM by stealth and getting kicked off if my internet shits the bed.

its da fucha u guyz. eggbux won

@Stringerboy said:

So how is Bungie managing to get destiny to 1080p?

Cross gen. Bungie aimed at a target so all versions are at near-parity and they got there.

#296 Posted by SoftwareGeek (539 posts) -

@ccagracing:

"It will give an advantage if all users have the XB1 connected to the internet, but there is a catch: developers have to code games to the lowest common denominator, and there are two of them on XB1."

Agreed. Everything you said there is dead on. Most likely the exclusives will take advantage of the cloud and games developed for multiple platforms will all run about the same on each console. I have another post on this thread about that if you care to hunt it down. There's also a YouTube video that demonstrates how the cloud improves frame rate. Also, Microsoft PC games will likely take advantage of the cloud feature as well.

"What if the cloud (MS servers) goes down? How would I play my game if it controls game-critical content?"

Good point. If it goes down you may not be able to play it. However, MS's cloud does a lot more than run computations for games. It hosts web services, enterprise services, websites, mobile services, financial transactions and so on. I actually looked into it at one time to deploy a website powered by Java/Spring. With all the contracts and agreements MS has with corporations and individuals, the chances of the cloud going down are slim, because downtime would cost MS a LOT of money. It may go down for maintenance every once in a while, but not often. The chance of it going down is probably lower than the chance of you losing power at home. Here's a link if you want more info about Azure: https://azure.microsoft.com/en-us/

"In an ideal 100% connected world the cloud could really make a difference, but I believe we won't see much of it this gen other than to control AI, as per Titanfall, or maybe something like skybox rendering."

Time will tell. But as I said, they already have a demo showing the power of the cloud which improves frame rate. You can watch it here: https://www.youtube.com/watch?v=QxHdUDhOMyw

#297 Posted by GrenadeLauncher (6248 posts) -

Oh God, the marketers are here.

Did they ever give the specs of the local hardware da clawd was operating against in that tech demo?

#298 Posted by StormyJoe (6439 posts) -

@GrenadeLauncher said:

@StormyJoe said:

Oh wait... I have to look at DF to be able to see a difference?

LOL

Joey missing the bigger picture: the closest actual parity game so far has been an indie game with four months extra optimisation time on the Bone.

You are missing the bigger picture: having to use DF to spot the difference means the differences are minor. If you tune your car's engine and then have to hook it up to a dynamometer to see if anything changed, you didn't do much.

#299 Edited by ccagracing (819 posts) -

@GrenadeLauncher said:

Oh God, the marketers are here.

Did they ever give the specs of the local hardware da clawd was operating against in that tech demo?

I'm not marketing anything; SoftwareGeek took my comments out of context. I don't think that if a game uses the cloud it will suddenly gain 10 fps or anything like it. I think on an exclusive title the cloud could be used to offload non-game-critical, less time-sensitive tasks, with this being ignored on multiplats, as PS4 and PC don't make use of any cloud servers, so it won't be implemented on those titles at any level. My hope for the cloud is that it can be used for better AI, which could free up CPU cycles, but I don't feel it will play a big part this gen. I would rather the developers focused on innovation: fresh ideas, new concepts for gameplay, fully incorporating multiplayer into the game, rather than an extra 10 fps and slightly sharper textures here and there, which will make no difference to most games' playability.

#300 Edited by SoftwareGeek (539 posts) -

@ccagracing:

"SoftwareGeek took my comments out of context. I don't think that if a game uses the cloud it will suddenly gain 10 fps or anything like it."

Did you watch the video? It clearly shows a 28+ fps boost. I'm merely pointing out that the cloud is real, it's there, it's already working. Those who think it's a marketing gimmick are free to believe that, but their beliefs are misplaced. Crackdown will make extensive use of it, and if you've watched the trailer you know that is one sweet-looking game.

Capcom is remaking the original Resident Evil 1. That's one of my all-time faves.