DX12 will not solve the XBone's 1080p problems

#351 Posted by clyde46 (44801 posts) -

@clone01 said:

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

I'll leave this for the lulz

Wow.. That's seriously unglued.

He quit SW in a big huff but he was found out posting under an alt.

#352 Edited by SoftwareGeek (196 posts) -

@Heil68: "SONY crushes Xbone and declared the world's most powerful video game console. A truly modern engineering marvel."

just smothering that bait in BBQ sauce aren't ya? lol

#353 Posted by Shewgenja (8440 posts) -

@clyde46 said:

He quit SW in a big huff but he was found out posting under an alt.

This place is for fun. Sure, I get a rise out of pissing off the Lems here, but I don't personally dislike anyone. Hell, I get a rise out of it when people personally dislike me and make all kinds of assumptions. That's pretty much the SW game. You guys are fun to talk to. There's also some legit smart people in here.

#354 Posted by clyde46 (44801 posts) -

@Shewgenja said:

This place is for fun. Sure, I get a rise out of pissing off the Lems here, but I don't personally dislike anyone. Hell, I get a rise out of it when people personally dislike me and make all kinds of assumptions. That's pretty much the SW game. You guys are fun to talk to. There's also some legit smart people in here.

Aye. Some users on here are total nutcases but he was something else. For some reason, he couldn't tell the difference between Crysis on the PS3 vs the PC but he could point out all the differences between the PS3 and the 360 versions.

#355 Posted by SoftwareGeek (196 posts) -

@GrenadeLauncher:

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

#356 Posted by Shewgenja (8440 posts) -

@clyde46 said:

Aye. Some users on here are total nutcases but he was something else. For some reason, he couldn't tell the difference between Crysis on the PS3 vs the PC but he could point out all the differences between the PS3 and the 360 versions.

That level of fanboyism is just dangerously bad. I'm kind of glad this place was meltdown central for a while because on the whole, the better posters have stuck around.

#357 Edited by clyde46 (44801 posts) -

@Shewgenja said:

That level of fanboyism is just dangerously bad. I'm kind of glad this place was meltdown central for a while because on the whole, the better posters have stuck around.

That level of fanboy nuttery was the best! Back then trolls had to stay and fight their corner. No yelling "butthurt" then running off. Trolling back then really was an art form. People like ShadowMoses used to create threads that ran on and on for pages, going in circles, but it was good as they would always come back for more. Back and forth it went, time after time. There were many like him but none were as memorable.

#358 Edited by clone01 (24492 posts) -

@clyde46 said:

@Shewgenja said:

@clyde46 said:

Aye. Some users on here are total nutcases but he was something else. For some reason, he couldn't tell the difference between Crysis on the PS3 vs the PC but he could point out all the differences between the PS3 and the 360 versions.

That level of fanboyism is just dangerously bad. I'm kind of glad this place was meltdown central for a while because on the whole, the better posters have stuck around.

That level of fanboy nuttery was the best! Back then trolls had to stay and fight their corner. No yelling "butthurt" then running off. Trolling back then really was an art form. People like ShadowMoses used to create threads that ran on and on for pages, going in circles, but it was good as they would always come back for more. Back and forth it went, time after time. There were many like him but none were as memorable.

Agreed...

#359 Posted by tormentos (17013 posts) -

So what, like they can't import video into a game or whatever. The cloud will do something, enough people say. Like your word is good for anything; you don't know how to count or multiply, you don't even understand what context means.

So it is 100% confirmed you know sh** about this..hahahahahahahaaaaaaaaaaaaaaaa

PS Now is basically Netflix but with lower latency, so you can control what is happening in the game. You are not streaming assets of any kind; in fact it is just a video of the game being rendered on Sony's servers. It's like I play on my PC and stream a video online, the only difference being that you can control it.

This requires something like 5 to 10Mbps of bandwidth. The cloud MS is advertising is about streaming power directly to the console, or offloading processes that would otherwise have to be run by the unit. That is totally different.

Sony just sucks at making consoles. Last gen the CPU was complete overkill; this gen the memory bandwidth is overkill.

They just got lucky that Microsoft reserved 10 percent of the available power for Kinect, and because the ESRAM tools weren't used properly.

A game like Wolfenstein shows how it is: virtually no difference, and when the cloud starts, who knows what the X1 will be capable of.

So much bullsh** from someone who is confirmed and reconfirmed to know sh** about what he's talking about..lol

Yeah, I guess that is why The Witcher 3 hasn't been confirmed to be 1080p on Xbox One, because Sony got lucky..hahaha

Wolfenstein drops to almost 720p on Xbox One to be able to keep up with the PS4..lol

Why do you leave out all the rest of the article? You just cut out what you want to hear, which is childish, and a way of lying. In that same article it clearly states:

Crackdown indicates that while Microsoft's messaging behind the cloud has shifted in the short term, it's sticking to its long term vision for how it could benefit games.

Digital Foundry's Richard Leadbetter welcomed Microsoft's more approachable messaging, but said important questions remain.

"The name might have changed, but this is still very much the Azure 'Thunderhead' service, just with a name-change to make it more appealing to the core gamer," Leadbetter said. "While the cloud obviously offers dedicated server functionality, Azure offers the potential for much more and it doesn't look as though Microsoft has changed its plans there.

"We've only seen a hint of what's possible so far beyond multiplayer gaming with Drivatars and Titanfall's grunt AI. The Crackdown prototype is a great example of what the cloud should excel at - offloading complex calculations away from the host console, where the additional 100ms or so latency to and from the datacentre won't unduly impact gameplay.

"The cloud doesn't address graphics bottlenecks, but here it demonstrates how much of a strain simulating destruction of a complex scene can have on the CPU - an area where both PS4 and Xbox One lag behind even mid-range PC processors.

"The question is really how much CPU time Microsoft is willing to dedicate to each game instance. I suspect that the Crackdown prototype uses an order of magnitude more CPU power than, say, the grunt AI in Titanfall. It'll be an interesting stress test of the Azure infrastructure to see if it can hold its own in a game likely to break the million-sales barrier in short order."

Just freaking read...

Also, don't talk about what the cloud is able to do:

The cloud serves for a lot of things; rendering graphics over the internet is sadly not one of them. It can be used for AI, some physics (as long as the results are not needed in the same frame), baked lighting and stuff like that which doesn't need constant refreshing.

Completely not true. Not even in the slightest. Do you even know basic AI? You have never given any credentials; a lot of other people have. You've got software engineers telling you that you are wrong.

Leadbetter is the one who has been defending MS since the whole debacle started, and who was forced to eat his words when the whole 720p vs 1080p fiasco happened at launch..lol

Let me bold the best parts for you, which are a clear giveaway that this isn't just a name change. No matter what, the CLOUD will sound 100 times more appealing to clueless morons like yourself than "dedicated servers", an old term that has been used for decades already and which can imply anything.

I read it; now read those bold parts, as they are KEY to understanding that no matter what the cloud does, MS will be behind.

The first one: latency, which the freaking internet is full of, and which is HORRIBLE for CPU processes. That is the reason why not all CPU tasks can be offloaded to the cloud, only things that don't need constant refreshing. CPU bandwidth on a PC is typically 68GB/s, which is what DDR3 offers; now compare how many fu**ing times 100Mbps (just for the pleasure of going well over the average home internet connection) fits into 68GB...

Some processes don't need constant refreshing; AI is one of them, which is why the cloud can handle it. Physics (as long as the results are not needed in the same frame), baked lighting: processes like these don't need constant refreshing, so they can be achieved over the network. Which, by the way, means that a game using it needs a constant internet connection of a certain speed to keep things good; if your connection fails, yeah, you will not play Crackdown, PERIOD.
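For what it's worth, that gap can be sanity-checked with a quick sketch. The 68GB/s DDR3 figure is the post's own; the 100Mbps line and decimal units are assumptions for illustration:

```python
# Back-of-envelope check: local DDR3 bandwidth vs. a fast home internet link.
# Decimal units (1 GB = 1e9 bytes) assumed for simplicity.
ddr3_bytes_per_s = 68e9                          # 68 GB/s, typical dual-channel DDR3
internet_bits_per_s = 100e6                      # a generous 100 Mbps connection
internet_bytes_per_s = internet_bits_per_s / 8   # 12.5 MB/s

ratio = ddr3_bytes_per_s / internet_bytes_per_s
print(f"Local memory moves {ratio:,.0f}x more data per second")  # 5,440x
```

Even a connection well above 2014's home averages moves thousands of times less data per second than local memory.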

The second bold part: "The cloud doesn't address graphics bottlenecks."

That alone should have ended the argument, but since you people are hard-headed you keep fighting for the same sh**. Since the PS4 has more GRAPHICAL power than the Xbox One, there is no fu**ing way the cloud can help the Xbox One catch the PS4. No parity, no equality, no nothing; the cloud will not make up for the Xbox One's lack of GPU power. That should have ended the argument right there.

Yeah, maybe they could have some good physics, but at the cost of the games always needing an online connection. And by the way, once MS kills the servers for those games in 8 years, you lose your game forever, since it was using an outside source to run CPU processes that will no longer be there, and the Xbox One CPU will not be up to the task of running them, because MS tied up the resources in other processes.

The Xbox One will always trail the PS4, and the cloud can't do anything about that, because it can't make up for graphics bottlenecks; online doesn't have the speed GPUs require for that, silly lemming.

#360 Posted by Douevenlift_bro (5032 posts) -

TLHBO

#361 Posted by GrenadeLauncher (3999 posts) -

@GrenadeLauncher:

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

Oh man, past proof-of-concept, Xbox well and truly won.

You're going to have to do better than this, marketer.

Wow, I bought an upgrade? That's a relief. Better than buying a gimped cable box with vague promises. Promises that you could say are...cloudy.

#362 Posted by tormentos (17013 posts) -

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

Yeah, it shows Titanfall looking better than Killzone, right..? hahahaaaaaaaa

#363 Edited by evildead6789 (7419 posts) -

@tormentos said:

@evildead6789 said:

So what, like they can't import video into a game or whatever. The cloud will do something, enough people say. Like your word is good for anything; you don't know how to count or multiply, you don't even understand what context means.

So it is 100% confirmed you know sh** about this..hahahahahahahaaaaaaaaaaaaaaaa

PS Now is basically Netflix but with lower latency, so you can control what is happening in the game. You are not streaming assets of any kind; in fact it is just a video of the game being rendered on Sony's servers. It's like I play on my PC and stream a video online, the only difference being that you can control it.

This requires something like 5 to 10Mbps of bandwidth. The cloud MS is advertising is about streaming power directly to the console, or offloading processes that would otherwise have to be run by the unit. That is totally different.

@evildead6789 said:

Sony just sucks at making consoles. Last gen the CPU was complete overkill; this gen the memory bandwidth is overkill.

They just got lucky that Microsoft reserved 10 percent of the available power for Kinect, and because the ESRAM tools weren't used properly.

A game like Wolfenstein shows how it is: virtually no difference, and when the cloud starts, who knows what the X1 will be capable of.

So much bullsh** from someone who is confirmed and reconfirmed to know sh** about what he's talking about..lol

Yeah, I guess that is why The Witcher 3 hasn't been confirmed to be 1080p on Xbox One, because Sony got lucky..hahaha

Wolfenstein drops to almost 720p on Xbox One to be able to keep up with the PS4..lol


Dear lord what a difference, let's go buy a ps4 and play with fanboys like tormentos

From the reviewer

"While smoke and some effects look slightly different, the consoles are very close, so maybe we can put this discussion to bed"

link

#364 Edited by SoftwareGeek (196 posts) -

@softwaregeek said:

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

Yeah, it shows Titanfall looking better than Killzone, right..? hahahaaaaaaaa

That's all you got? weak sauce

#365 Posted by SoftwareGeek (196 posts) -

@softwaregeek said:

@GrenadeLauncher:

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

Oh man, past proof-of-concept, Xbox well and truly won.

You're going to have to do better than this, marketer.

Wow, I bought an upgrade? That's a relief. Better than buying a gimped cable box with vague promises. Promises that you could say are...cloudy.

You bought a hardware upgrade and no innovation. hahahahahaha!!! Gawd! Diablo 3 = parity. So you bought a system that isn't going to run games better than an X1 and has no cloud powah! hahahaha! Nah man, really, I'm just razzing ya. You've still got a decent system... I wouldn't feel too bad if I were you.

#366 Posted by tormentos (17013 posts) -

Dear lord what a difference, let's go buy a ps4 and play with fanboys like tormentos

From the reviewer

"While smoke and some effects look slightly different, the consoles are very close, so maybe we can put this discussion to bed"

link

The New Order, we finally found out. In our initial performance analysis, we went in search of the first cross-platform 1080p60 first-person shooter and while the game mostly delivered, the discovery of a dynamic resolution suggested that, once again, PlayStation 4 had managed to trump its Microsoft rival.

Metrics in the area of 1760x1080 are found on PS4, while on the Xbox One this can drop to an extreme of 960x1080 in some scenes. This is usually identifiable by an increase in the amount of jaggies on screen, along with a slightly fuzzier appearance to the already gritty aesthetic that Machine Games employs throughout the game.

Meanwhile, shadow quality on the PS4 is a match for the PC game with the effect at its second highest setting, although on the Xbox One these elements appear to operate using lower PCF (percentage closer filtering) sample counts which results in a more jagged shadows.

http://www.eurogamer.net/articles/digitalfoundry-2014-wolfenstein-the-new-order-face-off

The PS4 version doesn't even go close to 900p; the Xbox One version drops almost to 720p, you sad lemming. It's quite visible, with jaggies and a fuzzier image...hahaha

The difference is there...lol

That's all you got? weak sauce

Well, something is better than nothing. WHAT DO YOU HAVE to prove your totally failed argument? Oh yeah, promises..lol

#367 Posted by SoftwareGeek (196 posts) -

@evildead6789 said:

From the reviewer

"While smoke and some effects look slightly different, the consoles are very close, so maybe we can put this discussion to bed"

link


Dat lighting effect at :40 that's more noticeable on the x1 than the ps4. Another lighting effect at 3:55 that looks better on the x1. The rest looks the same. Not enough difference to even worry about.

#368 Posted by FastRobby (1202 posts) -

@FastRobby said:

Why do you leave out all the rest of the article? You just cut out what you want to hear, which is childish, and a way of lying. In that same article it clearly states:

Crackdown indicates that while Microsoft's messaging behind the cloud has shifted in the short term, it's sticking to its long term vision for how it could benefit games.

Digital Foundry's Richard Leadbetter welcomed Microsoft's more approachable messaging, but said important questions remain.

"The name might have changed, but this is still very much the Azure 'Thunderhead' service, just with a name-change to make it more appealing to the core gamer," Leadbetter said. "While the cloud obviously offers dedicated server functionality, Azure offers the potential for much more and it doesn't look as though Microsoft has changed its plans there.

"We've only seen a hint of what's possible so far beyond multiplayer gaming with Drivatars and Titanfall's grunt AI. The Crackdown prototype is a great example of what the cloud should excel at - offloading complex calculations away from the host console, where the additional 100ms or so latency to and from the datacentre won't unduly impact gameplay.

"The cloud doesn't address graphics bottlenecks, but here it demonstrates how much of a strain simulating destruction of a complex scene can have on the CPU - an area where both PS4 and Xbox One lag behind even mid-range PC processors.

"The question is really how much CPU time Microsoft is willing to dedicate to each game instance. I suspect that the Crackdown prototype uses an order of magnitude more CPU power than, say, the grunt AI in Titanfall. It'll be an interesting stress test of the Azure infrastructure to see if it can hold its own in a game likely to break the million-sales barrier in short order."

Just freaking read...

Also, don't talk about what the cloud is able to do:

The cloud serves for a lot of things; rendering graphics over the internet is sadly not one of them. It can be used for AI, some physics (as long as the results are not needed in the same frame), baked lighting and stuff like that which doesn't need constant refreshing.

Completely not true. Not even in the slightest. Do you even know basic AI? You have never given any credentials; a lot of other people have. You've got software engineers telling you that you are wrong.

Leadbetter is the one who has been defending MS since the whole debacle started, and who was forced to eat his words when the whole 720p vs 1080p fiasco happened at launch..lol

Let me bold the best parts for you, which are a clear giveaway that this isn't just a name change. No matter what, the CLOUD will sound 100 times more appealing to clueless morons like yourself than "dedicated servers", an old term that has been used for decades already and which can imply anything.

I read it; now read those bold parts, as they are KEY to understanding that no matter what the cloud does, MS will be behind.

The first one: latency, which the freaking internet is full of, and which is HORRIBLE for CPU processes. That is the reason why not all CPU tasks can be offloaded to the cloud, only things that don't need constant refreshing. CPU bandwidth on a PC is typically 68GB/s, which is what DDR3 offers; now compare how many fu**ing times 100Mbps (just for the pleasure of going well over the average home internet connection) fits into 68GB...

Some processes don't need constant refreshing; AI is one of them, which is why the cloud can handle it. Physics (as long as the results are not needed in the same frame), baked lighting: processes like these don't need constant refreshing, so they can be achieved over the network. Which, by the way, means that a game using it needs a constant internet connection of a certain speed to keep things good; if your connection fails, yeah, you will not play Crackdown, PERIOD.

The second bold part: "The cloud doesn't address graphics bottlenecks."

That alone should have ended the argument, but since you people are hard-headed you keep fighting for the same sh**. Since the PS4 has more GRAPHICAL power than the Xbox One, there is no fu**ing way the cloud can help the Xbox One catch the PS4. No parity, no equality, no nothing; the cloud will not make up for the Xbox One's lack of GPU power. That should have ended the argument right there.

Yeah, maybe they could have some good physics, but at the cost of the games always needing an online connection. And by the way, once MS kills the servers for those games in 8 years, you lose your game forever, since it was using an outside source to run CPU processes that will no longer be there, and the Xbox One CPU will not be up to the task of running them, because MS tied up the resources in other processes.

The Xbox One will always trail the PS4, and the cloud can't do anything about that, because it can't make up for graphics bottlenecks; online doesn't have the speed GPUs require for that, silly lemming.

First part: well, you keep saying it doesn't need constant refreshing. As I've already mentioned, that is not true; prove it or otherwise stop saying it. Why do you think Microsoft went for an always-on device? Not just for DRM, but also so almost every owner of the Xbox One would have access to Microsoft's cloud, all the time. As you keep mentioning bandwidth and latency, you clearly don't understand what they want to do. These calculations arrive faster than PS Now video/audio, so who do you think will have the most problems with lag?

Second part: no one said anything about graphics bottlenecks. As you could have seen at GDC, it frees the CPU of calculations. The PS4 will have the GPU do the work of the CPU; in the end the PS4 will stay behind, because it can't keep up CPU-wise. And that's the part you keep missing, apparently. The Xbox One will offload most of the CPU work to the cloud; the PS4 can't do this. The Xbox One will have more dynamic, more lively worlds to play in; when the PS4 wants this, it has to offload work to the GPU, and that will lower its graphical power.

#369 Edited by evildead6789 (7419 posts) -

@tormentos said:

@evildead6789 said:

Dear lord what a difference, let's go buy a ps4 and play with fanboys like tormentos

From the reviewer

"While smoke and some effects look slightly different, the consoles are very close, so maybe we can put this discussion to bed"

link

The New Order, we finally found out. In our initial performance analysis, we went in search of the first cross-platform 1080p60 first-person shooter and while the game mostly delivered, the discovery of a dynamic resolution suggested that, once again, PlayStation 4 had managed to trump its Microsoft rival.

Metrics in the area of 1760x1080 are found on PS4, while on the Xbox One this can drop to an extreme of 960x1080 in some scenes. This is usually identifiable by an increase in the amount of jaggies on screen, along with a slightly fuzzier appearance to the already gritty aesthetic that Machine Games employs throughout the game.

Meanwhile, shadow quality on the PS4 is a match for the PC game with the effect at its second highest setting, although on the Xbox One these elements appear to operate using lower PCF (percentage closer filtering) sample counts which results in a more jagged shadows.

http://www.eurogamer.net/articles/digitalfoundry-2014-wolfenstein-the-new-order-face-off

The PS4 version doesn't even go close to 900p; the Xbox One version drops almost to 720p, you sad lemming. It's quite visible, with jaggies and a fuzzier image...hahaha

The difference is there...lol

@softwaregeek said:

That's all you got? weak sauce

Well, something is better than nothing. WHAT DO YOU HAVE to prove your totally failed argument? Oh yeah, promises..lol

you're taking everything out of context just to be a fanboy troll. That same article mentions

"Outside of the more aggressive dynamic framebuffer on the Xbox One, there's little to separate it from the PS4 game. Shadow quality is slightly better on the PS4, but the artwork, effects and lighting are all basically identical. The PS4 holds up the closest in delivering a native 1080p experience at 60fps, so benefits from slightly more consistent image quality. As such, once again it's the PS4 release that is our preferred console choice. However, the differences between the two consoles are minor and Wolfenstein is really a worthwhile purchase no matter which one of these systems you own."

This is also the only article you could find that's so lenient towards the PS4, and even then they're talking about minor differences. I think the video speaks for itself.

Hardly a reason to buy a PS4 in my book; if you want superior image quality, buy a PC.

#370 Posted by FastRobby (1202 posts) -

@tormentos said:

lies

you're taking everything out of context just to be a fanboy troll. That same article mentions

He does that ALL the time, that's classic childish, fanboy behavior.

#371 Edited by Daious (1109 posts) -

We are beating a dead horse. The Xbox One has been weaker since launch and will always be weaker. Deal with it, guys. Stop claiming ownage over nothing we don't already know. DX12 will be great on PC and it will help the Xbox One get more games with easy ports. It is still great news.

People like forzagearsface/@Tighaman just need to drop it and accept the good and the bad news. Consoles are dated hardware. There are no secrets. The consoles have been opened up. There is no magic sauce. Drop the crazy BS, SW. Focus on games. Games are where it's at. You guys are just sounding crazier than the hype around the Cell processor.

#372 Posted by Shewgenja (8440 posts) -

@evildead6789 said:

@tormentos said:

lies

you're taking everything out of context just to be a fanboy troll. That same article mentions

He does that ALL the time, that's classic childish, fanboy behavior.

And pointing fingers on the 8th page of a thread where the DX12 magic sauce gets debunked isn't?

#373 Posted by blackace (20238 posts) -
#374 Posted by hoosier7 (3780 posts) -
#376 Edited by FastRobby (1202 posts) -

@Shewgenja said:

@FastRobby said:

@evildead6789 said:

@tormentos said:

lies

you're taking everything out of context just to be a fanboy troll. That same article mentions

He does that ALL the time, that's classic childish, fanboy behavior.

And pointing fingers on the 8th page of a thread where the DX12 magic sauce gets debunked isn't?

It hasn't been debunked... The Witcher 3 developers, who aren't using DX12, are saying one thing; then you've got the Battlefield Hardline developers saying the other... You've got a known PS fanboy trying to debunk it with lies over and over again, and then you've got industry experts from Intel, AMD and Nvidia saying it's amazing. Well well, who should I believe: the one fanboy on the GS forums, or the vice president of AMD? Hmm, so hard to choose...

@daious It's not just low-level; at the moment the Xbox One's CPU isn't being used properly, you can't completely multitask and divide all tasks over all cores.

#377 Posted by Daious (1109 posts) -

@hoosier7:

Consoles have had low-level API access since day one. Users like blackace need to accept this.

This topic gets brought up every other day. We have known since day one that the Xbox One is weaker. It's time to move on to another topic and just play games.

#378 Edited by evildead6789 (7419 posts) -

@Shewgenja said:

@FastRobby said:

@evildead6789 said:

@tormentos said:

lies

you're taking everything out of context just to be a fanboy troll. That same article mentions

He does that ALL the time, that's classic childish, fanboy behavior.

And pointing fingers on the 8th page of a thread where the DX12 magic sauce gets debunked isn't?

It hasn't been debunked... The Witcher 3 developers, who aren't using DX12, are saying one thing; then you've got the Battlefield Hardline developers saying the other... You've got a known PS fanboy trying to debunk it with lies over and over again, and then you've got industry experts from Intel, AMD and Nvidia saying it's amazing. Well well, who should I believe: the one fanboy on the GS forums, or the vice president of AMD? Hmm, so hard to choose...

@daious It's not just low-level; at the moment the Xbox One's CPU isn't being used properly, you can't completely multitask and divide all tasks over all cores.

Yeah, it's because the Witcher devs are too lazy to implement the ESRAM and DX12 like they should, not because this is true for every game.

#379 Posted by tormentos (17013 posts) -

First part: well, you keep saying it doesn't need constant refreshing. As I've already mentioned, that is not true; prove it or otherwise stop saying it.

Why do you think Microsoft went for an always-on device? Not just for DRM, but also so almost every owner of the Xbox One would have access to Microsoft's cloud, all the time. As you keep mentioning bandwidth and latency, you clearly don't understand what they want to do. These calculations arrive faster than PS Now video/audio, so who do you think will have the most problems with lag?

Second part: no one said anything about graphics bottlenecks. As you could have seen at GDC, it frees the CPU of calculations. The PS4 will have the GPU do the work of the CPU; in the end the PS4 will stay behind, because it can't keep up CPU-wise.

And that's the part you keep missing, apparently. The Xbox One will offload most of the CPU work to the cloud; the PS4 can't do this. The Xbox One will have more dynamic, more lively worlds to play in; when the PS4 wants this, it has to offload work to the GPU, and that will lower its graphical power.

Dude, anything that needs constant refreshing has to be rendered in fractions of a second; it has to be refreshed for every single frame, and a typical console game runs at 30 frames per second.

That means the result has to be there for every frame change, which is why GPUs use such high bandwidth.

So if you want to load 100 MB of fresh data for each of 30 frames, your connection would need to deliver 100 MB thirty times over in one second. Your internet connection will not DO IT.
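The arithmetic behind that claim can be sanity-checked with a quick sketch. The 100 MB per frame and 50 Mbit/s figures are the post's own illustrative assumptions, not measured values:

```python
# Back-of-envelope check: streaming 100 MB of fresh data per frame at 30 fps.
# All figures are illustrative assumptions from the post above.
frame_data_mb = 100                          # MB refreshed each frame (assumed)
fps = 30

required_mb_per_s = frame_data_mb * fps      # 3000 MB/s of payload
required_mbit_per_s = required_mb_per_s * 8  # 24000 Mbit/s on the wire

connection_mbit_per_s = 50                   # a fast home connection (assumed)
shortfall = required_mbit_per_s / connection_mbit_per_s

print(f"Needed: {required_mbit_per_s} Mbit/s, have: {connection_mbit_per_s} Mbit/s")
print(f"That is {shortfall:.0f}x what the connection can deliver")
```

Even ignoring latency entirely, the raw throughput gap is a factor of several hundred.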

Answer me this: why the hell do you think the ESRAM is in the Xbox One?

Simple question: why do you think MS is using ESRAM in the Xbox One?

It's because 68 GB/s from the DDR3 memory pool isn't fast enough to handle the load between the CPU and GPU, so the ESRAM is basically a middle man that speeds up the data so the GPU doesn't starve. Once you understand this process, you'll understand why I say things that need constant refreshing can't be done over a network; even a one-second delay is enough to make the game freeze.

MS's online requirement was there to verify that your copy of the game was legal, which is why it was a 24-hour check-in and not a constant one.

The calculations arrive only as fast as your connection can deliver them, and even a 50 Mb/s connection is total bullsh** compared to what a CPU can consume in a single second, which is up to 68 GB/s; and that's the CPU, the GPU needs even more.

If you have a 50 Mb/s connection that can download 1 GB of data in about 3 minutes, it would take your connection 528 minutes to stream from a server what the PS4 passes over its memory bus in 1 second.

Your connection would take almost nine hours to stream from the cloud what the PS4 can process in one freaking second, man. It's impossible, which is why only things that don't need constant refresh can be done this way. Streaming even 30 MB over a fast connection can take several seconds, and I'm assuming a 50 Mb/s connection, which is far from the standard.
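Those numbers line up. Assuming the PS4's commonly quoted 176 GB/s GDDR5 peak and a link that manages 1 GB every 3 minutes (both taken as given from the post), the time works out as:

```python
# One second of a 176 GB/s memory bus, delivered over a link that manages
# 1 GB every 3 minutes (~50 Mbit/s). Figures are the post's assumptions.
bus_gb_per_s = 176          # PS4 GDDR5 peak bandwidth (commonly quoted)
minutes_per_gb = 3          # download rate of the connection (assumed)

minutes = bus_gb_per_s * minutes_per_gb    # 528 minutes
hours = minutes / 60                       # roughly 8.8 hours
print(f"{minutes} minutes, or about {hours:.1f} hours")
```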

PS Now is a video stream, and a video stream doesn't need a 50 Mb/s connection; around 10 Mb/s is fine for 720p, which is why PS Now can be delivered at all. PS Now works like OnLive and Gaikai: the servers run the game on a farm and stream a video of it being rendered. The compute-offloading cloud MS talks about is not the same thing in any way.

The notion that the PS4's CPU will fall behind is a sad one. What MS can offload to the cloud won't be enough to do anything, and the PS4's CPU isn't a bottleneck for its GPU in any way, so you're basically assuming things that will not happen.

The Xbox can't offload most of its CPU work to the cloud, sad lemming; your connection CAN'T HANDLE IT.

It's like putting a dump-truck motor on top of a freaking bicycle. That's how sad your connection is compared to what a CPU passes over DDR3 on a PC or even on the Xbox One; even the PS4's CPU, which uses something like 20 to 30 GB/s, is endlessly more than what your connection can handle. Now, before you say any more sh**, take this challenge:

Download 1 GB on your connection and time it; when it finishes, take a picture of the window showing it was done, which will tell you how long it took.

When you're done, take that time and multiply it by 145.

The total it gives you is how long your connection would take to download what the Xbox One processes through ESRAM in one freaking second. Maybe that way you can begin to understand why the cloud can do very little even for the CPU: what can be offloaded is small, and anything offloaded must be pretty small so that it arrives as fast as possible and doesn't compete with the assets you need for online play.
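That challenge is just one multiplication. The 145 figure is the post's own number for gigabytes moved through the ESRAM path per second and is taken as given here:

```python
# Scale a measured 1 GB download time by the post's 145 GB/s ESRAM figure.
def cloud_vs_esram_minutes(minutes_per_gb, esram_gb_per_s=145):
    """Minutes the connection needs to move one second of ESRAM traffic."""
    return minutes_per_gb * esram_gb_per_s

# Example: a connection that downloads 1 GB in 3 minutes.
print(cloud_vs_esram_minutes(3))   # 435 minutes, over 7 hours
```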

You're taking everything out of context just to be a fanboy troll. That same article mentions:

"Outside of the more aggressive dynamic framebuffer on the Xbox One, there's little to separate it from the PS4 game. Shadow quality is slightly better on the PS4, but the artwork, effects and lighting are all basically identical. The PS4 holds up the closest in delivering a native 1080p experience at 60fps, so benefits from slightly more consistent image quality. As such, once again it's the PS4 release that is our preferred console choice. However, the differences between the two consoles are minor and Wolfenstein is really a worthwhile purchase no matter which one of these systems you own."

This is also the only article you could find that's this lenient towards the PS4, and even then they're talking about minor differences. I think the video speaks for itself.

Hardly a reason to buy a ps4 in my book, if you want superior image quality, buy a pc.

No, Digital Foundry, as always, is trying to be neutral...

""PlayStation 4 had managed to trump its Microsoft rival.""

Same article..lol

He does that ALL the time, that's classic childish, fanboy behavior.

Oh please, don't talk about childish behavior in a thread where you're defending, broken-backed and blind, something that has failed to be proven: the cloud.

Titanfall uses the cloud to offload CPU processing; it looks average and runs at 792p with frame drops into the mid-30s. The cloud is proven to do nothing for graphics.

Now I have Titanfall as proof that it failed to deliver. What do you have to prove the cloud works? And I hope it's not that damn building demo done by MS, because that is nothing and could easily be as fake as the Kinect presentations they faked many times during E3.

@blackace said:

I think there will be more than just DX12 that will make the difference. This developer hasn't seen what's coming in the next 3-8 months. They'll be singing a different tune by this time next year.

His game doesn't come out for another 6 to 8 months, what the fu**, man. And get this: DX12 has been on the Xbox One since launch, and both the Witcher developer and Phil Spencer himself have already stated that what DX12 will do is make games easier to develop for the Xbox One; it will not bring the performance boost you were led to believe was coming.

You know why MS is now toning down the talk about the cloud and DX12? Because they know the time for hype is over. The PS4 is leading and few people fell for the promises; they know it won't deliver, so better to drop it now so that people don't remember it next year.

It hasn't been debunked... Witcher 3's developers, who aren't using DX12, are saying one thing, while Battlefield Hardline's developers are saying another... You've got a known PS fanboy trying to debunk it with lies over and over again, and then you've got industry experts from Intel, AMD, and Nvidia saying it's amazing. Well well, who should I believe: one fanboy on the GS forum, or the vice president of AMD? Hmm, so hard to choose...

@daious It's not just low-level; at the moment the Xbox One's CPU isn't being used properly, because you can't fully multitask and divide all tasks across all cores.

And you have Phil Spencer telling it to your face..

Xbox One chief warns gamers not to expect dramatic improvements from DirectX 12

On Monday, Xbox head Phil Spencer appeared to dump cold water on the idea that DX12 would make a major difference for the console, writing: “It will help developers on XBOX One. It’s not going to be a massive change but will unlock more capability for devs.”

http://www.extremetech.com/gaming/184768-head-of-xbox-warns-gamers-not-to-expect-dramatic-improvements-from-dx12

Maybe you will listen to Phil Spencer...

You know why it will not be a big change?

Because DX12 works by lowering CPU overhead, which on consoles isn't a freaking problem to begin with. Have you heard about low-level access and optimization on consoles? Yeah, that's what DX12 brings to the PC, which is what Mantle did before MS, who are late as always. DX12 will not do much for the Xbox One because it's made to solve a freaking PC problem that doesn't exist on consoles.
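The draw-call-overhead argument can be made concrete with a toy model. Every per-call cost below is invented purely for illustration; real figures vary widely by driver and hardware:

```python
# Toy model of CPU time spent issuing draw calls per frame under a
# high-overhead API vs a thin low-overhead one. Costs are made up.
draws_per_frame = 5000
cost_high_us = 20    # microseconds of CPU per draw call, thick API (assumed)
cost_low_us = 2      # microseconds per call, low-overhead API (assumed)

high_ms = draws_per_frame * cost_high_us / 1000   # 100 ms -> CPU-bound
low_ms = draws_per_frame * cost_low_us / 1000     # 10 ms -> fits 30 fps budget
print(f"Thick API: {high_ms} ms/frame; thin API: {low_ms} ms/frame")
```

On consoles the API is already thin, which is the point being made here: the big win from this kind of change exists mainly on PC.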

Oh, and anything the Xbox One can do through its API, the PS4 can do as well, every bit of it, so if the Xbox One improves via some technique, so will the PS4. Sony has a team dedicated to exactly that, the ICE team, and they are great at what they do; just look at TLOU on PS3.

Yeah, it's because the Witcher devs are too lazy to implement the ESRAM and DX12 like they should that this is true for every game.

All games use ESRAM on the Xbox One, you silly fanboy.. lol

#380 Edited by FastRobby (1202 posts) -
@tormentos said:
@FastRobby said:

First part: well, you keep saying it doesn't need constant refreshing. As I've already mentioned, that is not true; prove it, or otherwise stop saying it.

Why do you think Microsoft went for an always-on device? Not just for DRM, but also so that almost every Xbox One owner would have access to Microsoft's cloud all the time. Since you keep bringing up bandwidth and latency, you clearly don't understand what they want to do. These calculations arrive faster than PS Now's video/audio, so who do you think will have the bigger problem with lag?

Second part: no one said anything about graphical bottlenecks. As you could have seen at GDC, it frees the CPU of calculations. The PS4 will have the GPU do the CPU's work, so in the end the PS4 will fall behind, because it can't keep up CPU-wise.

And that's the part you keep missing, apparently. The Xbox One will offload most of its CPU work to the cloud; the PS4 can't do this. The Xbox One will have more dynamic, more lively worlds to play in, and when the PS4 wants this, it has to offload the work to the GPU, which lowers its graphical power.

Dude, anything that needs constant refreshing has to be rendered in fractions of a second; it has to be refreshed for every single frame, and a typical console game runs at 30 frames per second.

That means the result has to be there for every frame change, which is why GPUs use such high bandwidth.

So if you want to load 100 MB of fresh data for each of 30 frames, your connection would need to deliver 100 MB thirty times over in one second. Your internet connection will not DO IT.

Answer me this: why the hell do you think the ESRAM is in the Xbox One?

Simple question: why do you think MS is using ESRAM in the Xbox One?

It's because 68 GB/s from the DDR3 memory pool isn't fast enough to handle the load between the CPU and GPU, so the ESRAM is basically a middle man that speeds up the data so the GPU doesn't starve. Once you understand this process, you'll understand why I say things that need constant refreshing can't be done over a network; even a one-second delay is enough to make the game freeze.

MS's online requirement was there to verify that your copy of the game was legal, which is why it was a 24-hour check-in and not a constant one.

The calculations arrive only as fast as your connection can deliver them, and even a 50 Mb/s connection is total bullsh** compared to what a CPU can consume in a single second, which is up to 68 GB/s; and that's the CPU, the GPU needs even more.

If you have a 50 Mb/s connection that can download 1 GB of data in about 3 minutes, it would take your connection 528 minutes to stream from a server what the PS4 passes over its memory bus in 1 second.

Your connection would take almost nine hours to stream from the cloud what the PS4 can process in one freaking second, man. It's impossible, which is why only things that don't need constant refresh can be done this way. Streaming even 30 MB over a fast connection can take several seconds, and I'm assuming a 50 Mb/s connection, which is far from the standard.

PS Now is a video stream, and a video stream doesn't need a 50 Mb/s connection; around 10 Mb/s is fine for 720p, which is why PS Now can be delivered at all. PS Now works like OnLive and Gaikai: the servers run the game on a farm and stream a video of it being rendered. The compute-offloading cloud MS talks about is not the same thing in any way.

The notion that the PS4's CPU will fall behind is a sad one. What MS can offload to the cloud won't be enough to do anything, and the PS4's CPU isn't a bottleneck for its GPU in any way, so you're basically assuming things that will not happen.

The Xbox can't offload most of its CPU work to the cloud, sad lemming; your connection CAN'T HANDLE IT.

It's like putting a dump-truck motor on top of a freaking bicycle. That's how sad your connection is compared to what a CPU passes over DDR3 on a PC or even on the Xbox One; even the PS4's CPU, which uses something like 20 to 30 GB/s, is endlessly more than what your connection can handle. Now, before you say any more sh**, take this challenge:

Download 1 GB on your connection and time it; when it finishes, take a picture of the window showing it was done, which will tell you how long it took.

When you're done, take that time and multiply it by 145.

The total it gives you is how long your connection would take to download what the Xbox One processes through ESRAM in one freaking second. Maybe that way you can begin to understand why the cloud can do very little even for the CPU: what can be offloaded is small, and anything offloaded must be pretty small so that it arrives as fast as possible and doesn't compete with the assets you need for online play.

Why do you think it's 100 MB of data? Like I said, this has NOTHING to do with graphics, so I can completely ignore that first part. Do you even know how big the data the CPU has to handle is?

The last part is just you not believing in it. If what I'm saying is true, the PS4 will fall behind, because my connection can handle that ;)

@tormentos said:
@FastRobby said:

He does that ALL the time, that's classic childish, fanboy behavior.

Oh please, don't talk about childish behavior in a thread where you're defending, broken-backed and blind, something that has failed to be proven: the cloud.

Titanfall uses the cloud to offload CPU processing; it looks average and runs at 792p with frame drops into the mid-30s. The cloud is proven to do nothing for graphics.

Now I have Titanfall as proof that it failed to deliver. What do you have to prove the cloud works? And I hope it's not that damn building demo done by MS, because that is nothing and could easily be as fake as the Kinect presentations they faked many times during E3.

Why do you use Titanfall as an example? :s Titanfall was out BEFORE they showed the tech at GDC, BEFORE they were using this tech for the first time in Crackdown... I really don't understand why you think this is being used in Titanfall; that is a ridiculous way of thinking.

@FastRobby said:

It hasn't been debunked... Witcher 3's developers, who aren't using DX12, are saying one thing, while Battlefield Hardline's developers are saying another... You've got a known PS fanboy trying to debunk it with lies over and over again, and then you've got industry experts from Intel, AMD, and Nvidia saying it's amazing. Well well, who should I believe: one fanboy on the GS forum, or the vice president of AMD? Hmm, so hard to choose...

@daious It's not just low-level; at the moment the Xbox One's CPU isn't being used properly, because you can't fully multitask and divide all tasks across all cores.

And you have Phil Spencer telling it to your face..

Xbox One chief warns gamers not to expect dramatic improvements from DirectX 12

On Monday, Xbox head Phil Spencer appeared to dump cold water on the idea that DX12 would make a major difference for the console, writing: “It will help developers on XBOX One. It’s not going to be a massive change but will unlock more capability for devs.”

http://www.extremetech.com/gaming/184768-head-of-xbox-warns-gamers-not-to-expect-dramatic-improvements-from-dx12

Maybe you will listen to Phil Spencer...

You know why it will not be a big change?

Because DX12 works by lowering CPU overhead, which on consoles isn't a freaking problem to begin with. Have you heard about low-level access and optimization on consoles? Yeah, that's what DX12 brings to the PC, which is what Mantle did before MS, who are late as always. DX12 will not do much for the Xbox One because it's made to solve a freaking PC problem that doesn't exist on consoles.

Oh, and anything the Xbox One can do through its API, the PS4 can do as well, every bit of it, so if the Xbox One improves via some technique, so will the PS4. Sony has a team dedicated to exactly that, the ICE team, and they are great at what they do; just look at TLOU on PS3.

It's getting sad that you only cut out what you want to hear from an article. Didn't I clearly state that it will improve CPU multi-threading support? Well, looky here, what does the article say?

We know that the D3D API is weak in two respects — the number of draw calls it can make and multi-threading support.

So maybe learn to read, then respond to what I'm actually saying, not to what you think you're hearing. If that task is too difficult for you, I will not respond to you. That's why I stopped reading after a couple of lines; I don't have a clue what else you're saying, because it all looks like lies when you don't listen and only say what you want to hear. Again, childish.
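The multi-threading point above, splitting per-frame CPU work across all cores instead of serializing it on one thread, can be sketched like this (the workload is a stand-in, not real engine code):

```python
# Sketch: divide per-frame CPU work over 4 workers instead of one thread.
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(chunk):
    # stand-in for per-object work (animation, culling, command recording)
    return sum(x * x for x in chunk)

objects = list(range(8000))
chunks = [objects[i::4] for i in range(4)]    # split work across 4 cores

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(simulate_chunk, chunks))

total = sum(partials)
assert total == sum(x * x for x in objects)   # same result, work divided
```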

#381 Posted by GrenadeLauncher (3999 posts) -

@GrenadeLauncher said:

@softwaregeek said:

@GrenadeLauncher:

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

Oh man, past proof-of-concept, Xbox well and truly won.

You're going to have to do better than this, marketer.

Wow, I bought an upgrade? That's a relief. Better than buying a gimped cable box with vague promises. Promises that you could say are...cloudy.

You bought a hardware upgrade and no innovation. hahahahahaha!!! gawd! Diablo 3 = parity. So you bought a system that isn't going to run games better than an x1 and has no cloud powah! hahahaha! Nah man, really I'm just razzing ya. You still got a decent system....I wouldn't feel too bad If I were you.

Cool. The last time we got "innovashun" we had Waggle, Move and Kinect. I'll have a games console please.

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

#382 Posted by slimdogmilionar (467 posts) -

@tormentos said:

@evildead6789 said:

From the reviewer

"While smoke and some effects look slightly different, the consoles are very close, so maybe we can put this discussion to bed"

link


Dat lighting effect at 0:40 is more noticeable on the X1 than on the PS4. Another lighting effect at 3:55 looks better on the X1. The rest looks the same; not enough difference to even worry about.

If you look at the guy's face at 0:00, the Xbox One version appears to have more detail and looks more defined, though I think the PS4 version looks a little sharper. This is not the first time in this game that the Xbox One has had the more defined image. I think Sony is pushing 1080p as a buzzword and making devs hit 1080p at all costs. DF only seems to care about resolution, where higher resolution = winner, but looking at the face you can clearly see that the XB1 has more detail than the PS4.

#383 Posted by Shewgenja (8440 posts) -

Lems still bumping this thread? They must have REALLY wanted that secret sauce.. Aww

#384 Posted by FastRobby (1202 posts) -

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

How bad their SDK was.

#385 Edited by Shewgenja (8440 posts) -

@GrenadeLauncher said:

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

How bad their SDK was.

Or how much better Epic and Sony have relations this generation.

:o

#386 Posted by FastRobby (1202 posts) -

@FastRobby said:

@GrenadeLauncher said:

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

How bad their SDK was.

Or how much better Epic and Sony have relations this generation.

:o

I don't see the connection between Epic and Sony, what did I miss?

#387 Posted by GrenadeLauncher (3999 posts) -

@GrenadeLauncher said:

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

How bad their SDK was.

How bad their hardware is full stop, SlowRobby.

I sure hope there isn't screen tearing.

#388 Posted by FastRobby (1202 posts) -

@FastRobby said:

@GrenadeLauncher said:

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

How bad their SDK was.

How bad their hardware is full stop, SlowRobby.

I sure hope there isn't screen tearing.

Have fun playing on a low-end PC

#389 Posted by tormentos (17013 posts) -

Why do you think it's 100 MB of data? Like I said, this has NOTHING to do with graphics, so I can completely ignore that first part. Do you even know how big the data the CPU has to handle is?

The last part is just you not believing in it. If what I'm saying is true, the PS4 will fall behind, because my connection can handle that ;)

Why do you use Titanfall as an example? :s Titanfall was out BEFORE they showed the tech at GDC, BEFORE they were using this tech for the first time in Crackdown... I really don't understand why you think this is being used in Titanfall; that is a ridiculous way of thinking.

It's getting sad that you only cut out what you want to hear from an article. Didn't I clearly state that it will improve CPU multi-threading support? Well, looky here, what does the article say?

We know that the D3D API is weak in two respects — the number of draw calls it can make and multi-threading support.

So maybe learn to read, then respond to what I'm actually saying, not to what you think you're hearing. If that task is too difficult for you, I will not respond to you. That's why I stopped reading after a couple of lines; I don't have a clue what else you're saying, because it all looks like lies when you don't listen and only say what you want to hear. Again, childish.

Dude, what the fu**, learn a thing or two about PCs before you talk. Why the fu** do you think PCs use DDR3 with a 68-gigabyte-per-second transfer rate? Why?

Gigabytes, not MB.

Your internet connection is complete and utter sh** compared to what a CPU can use over DDR3; even CPUs that use 20 to 30 GB/s are still orders of magnitude beyond what your sorry-ass connection can handle.

What makes you think that what a CPU processes takes 1 MB and not 100 or more? Where did you pull that sad notion from? On PC they use bandwidth in the gigabytes per second.

Why do I use Titanfall, you ask?

http://gimmegimmegames.com/2013/06/respawn-titanfall-couldnt-be-made-without-xbox-cloud-xbox-one-version-better-than-pc-version/

Titanfall is the first Xbox One game to offload AI to the cloud, aside from Drivatars; it uses the cloud to offload a CPU process, whether you like to admit it or not. You're just mad because the game looks average, runs at 792p, and drops into the mid-30s. lol

A few months ago Titanfall was the poster child of cloud power.. lol

You are an idiot; that is related to the PC, not the Xbox One. In fact, the PC is getting the Xbox One version of D3D... hahaha

In fact, I made a thread about it a few months ago, when they clearly stated that they were looking to bring the Xbox One D3D gains to the PC... hahahaaaaaaaaa

One of those gains is lower CPU overhead, which is what will speed up some games on PC, not on Xbox One, and the demo was made with Forza on PC, not on Xbox One. lol

We’re also working with our ISV and IHV partners on future efforts, including bringing the lightweight runtime and tooling capabilities of the Xbox One Direct3D implementation to Windows, and identifying the next generation of advanced 3D graphics technologies.

With Xbox One we have also made significant enhancements to the implementation of Direct3D 11, especially in the area of runtime overhead. The result is a very streamlined, “close to metal” level of runtime performance. In conjunction with the third generation PIX performance tool for Xbox One, developers can use Direct3D 11 to unlock the full performance potential of the console.

We implemented it on Xbox 360 and had a whole lot of ideas on how to make that more efficient [and with] a cleaner API, so we took that opportunity with Xbox One and with our customised command processor we've created extensions on top of D3D which fit very nicely into the D3D model and this is something that we'd like to integrate back into mainline 3D on the PC too - this small, very low-level, very efficient object-orientated submission of your draw [and state] commands.

http://www.gamespot.com/forums/system-wars-314159282/dx12-xbox-tool-done-on-pc-31228472/?page=1

From MS itself, you butthurt fanboy...

Hahahaaaa

D3D in DX12 = the Xbox One version brought to PC, which is why it was demoed on PC and showed gains there, while on Xbox One it would show nothing because it was already there.. hahaha

Class dismissed..

#390 Posted by tormentos (17013 posts) -

@GrenadeLauncher said:

How bad their hardware is full stop, SlowRobby.

I sure hope there isn't screen tearing.

Have fun playing on a low-end PC

The secret-sauce lover now tags the PS4 as a low-end PC, while he defends the weaker, crappier Xbox One.. lol

That moving goal post...

#391 Posted by Shewgenja (8440 posts) -

I don't see the connection between Epic and Sony, what did I miss?

Yeah, actually, that was my bad. Apparently, Blizzard used an in-house game engine for DIII and I was under the impression it was Unreal.

#392 Posted by GrenadeLauncher (3999 posts) -

Have fun playing on a low-end PC

But I don't have an Xbone.

#393 Edited by FastRobby (1202 posts) -

@FastRobby said:

Have fun playing on a low-end PC

But I don't have an Xbone.

That's because the PS4 is just a PC, and Xbox One is custom made hardware.

#394 Edited by GrenadeLauncher (3999 posts) -

@FastRobby said:

@GrenadeLauncher said:

@FastRobby said:

Have fun playing on a low-end PC

But I don't have an Xbone.

That's because the PS4 is just a PC, and Xbox One is custom made hardware.

Hahahahhaaha

The Xbone is as custom made as the PS4. Actually less so because the PS4 has an audio chip with TrueAudio and the Bone doesn't.

#395 Posted by FastRobby (1202 posts) -

@FastRobby said:

@GrenadeLauncher said:

@FastRobby said:

Have fun playing on a low-end PC

But I don't have an Xbone.

That's because the PS4 is just a PC, and Xbox One is custom made hardware.

Hahahahhaaha

The Xbone is as custom made as the PS4.

Does your CPU also have 5 billion transistors? Because the Xbox One's does; it's a custom-made CPU, not the same as the Jaguar in the PS4.

#396 Edited by SoftwareGeek (196 posts) -

@softwaregeek said:

@GrenadeLauncher said:

@softwaregeek said:

@GrenadeLauncher:

"Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff."

You just can't handle the fact that it's already been through the proof-of-concept stages and that it's going to work and that gives the X1 a magnificent advantage over your beloved ps4. Let's face it, it's an online world and you can't handle the fact that you screwed the pooch and bought a hardware upgrade. hahaha. gawd.

Oh man, past proof-of-concept, Xbox well and truly won.

You're going to have to do better than this, marketer.

Wow, I bought an upgrade? That's a relief. Better than buying a gimped cable box with vague promises. Promises that you could say are...cloudy.

You bought a hardware upgrade and no innovation. hahahahahaha!!! gawd! Diablo 3 = parity. So you bought a system that isn't going to run games better than an x1 and has no cloud powah! hahahaha! Nah man, really I'm just razzing ya. You still got a decent system....I wouldn't feel too bad If I were you.

Cool. The last time we got "innovashun" we had Waggle, Move and Kinect. I'll have a games console please.

Hahaha, well done on getting Diablo 3 to 1080p. Looking at it, the fact it was sub-1080p to begin with just accentuates how bad the hardware is.

No, it means the Xbox hardware is a bit more complicated to fine-tune. This will get easier over time. X1 owners will enjoy 1080p games and the POWAH of the cloud! hahahahaha! And you PS4 owners will enjoy your 2nd-rate online experience.

#397 Posted by GrenadeLauncher (3999 posts) -

Does your CPU also have 5 billion transistors? Because the Xbox One's does; it's a custom-made CPU, not the same as the Jaguar in the PS4.

Hahahaha, "5 billion transistors." I haven't heard that dumb shit since the reveal, and it was mocked as soon as they said it too. Try harder, SlowRobby.

No, it means the Xbox hardware is a bit more complicated to fine-tune. This will get easier over time. X1 owners will enjoy 1080p games and the POWAH of the cloud! hahahahaha! And you PS4 owners will enjoy your 2nd-rate online experience.

Fantastic parody, mate. You should think of doing it professionally.

#398 Posted by SoftwareGeek (196 posts) -

@FastRobby said:

Does your CPU also have 5 billion transistors? Because the Xbox One's does; it's a custom-made CPU, not the same as the Jaguar in the PS4.

Hahahaha, "5 billion transistors." I haven't heard that dumb shit since the reveal, and it was mocked as soon as they said it too. Try harder, SlowRobby.

@softwaregeek said:

No, it means the Xbox hardware is a bit more complicated to fine-tune. This will get easier over time. X1 owners will enjoy 1080p games and the POWAH of the cloud! hahahahaha! And you PS4 owners will enjoy your 2nd-rate online experience.

Fantastic parody, mate. You should think of doing it professionally.

You mean fantastic parity? hahahaha. That's exactly what we've been telling PS4 fanboys: your online experience is still 2nd-rate. You need to make some more moo-ving posts.

#399 Edited by GrenadeLauncher (3999 posts) -

You mean fantastic parity? hahahaha. That's exactly what we've been telling PS4 fanboys: your online experience is still 2nd-rate. You need to make some more moo-ving posts.

#400 Posted by Shewgenja (8440 posts) -

Does your CPU also have 5 billion transistors? Because the Xbox One's does; it's a custom-made CPU, not the same as the Jaguar in the PS4.

Hey. Transistors are on a shit ton of things on a motherboard. What planet are you from?

While we're at it, point to a PC using hUMA right now. XBone is a custom made running kick to your fanboyism.