X1 will end up being the better 1080p system

This topic is locked from further discussion.

#151 Posted by Shielder7 (5152 posts) -

I remember when games being good was what mattered, but apparently anything not running in 1080p and 60 FPS is suddenly no longer worth playing.

Perhaps we're all just a little bored with our systems until another wave of releases blows through.

#152 Posted by misterpmedia (3364 posts) -

@me3x12 said:

@misterpmedia said:

@me3x12 said:

@misterpmedia said:

@me3x12 said:

@misterpmedia said:

@me3x12 said:

This is a very viable scenario. Here's the deal: the drivers are way late for the X1, new SDK kits are going out really soon, and they're already said to be really fast, which makes achieving 1080p on X1 really easy.

Apparently these new kits also give developers the tools they need to leverage the ESRAM in ways they had no clue about.

Also, MS will be dropping DX11.2 on X1 soon, and when this happens you can store up to 6GB of textures on the ESRAM alone. That's massive and will also make it really easy to achieve 1080p.

This is all true, don't try to downplay it. When this happens, and it's going to happen really soon, the reality is that the X1 might in FACT flip the table and be the more capable box, running full 1080p at 60fps.

There is a lot of AM CRY coming; gonna need lots of tissues soon.

http://www.digitalspy.com/gaming/news/a550445/rebellion-xbox-one-1080p-issues-can-be-fixed.html

http://misterxmedia.livejournal.com/153756.html

http://gamingbolt.com/why-xbox-ones-esram-feels-limited-despite-potential-to-store-6gb-of-tiled-textures-using-dx-11-2

Hahahaha, referencing Misterxmedia makes this whole thread, topic, and post nonsensical. Way to fail, TC; even I thought you hadn't sunk to the bottom-most rung of Lemdom.

The bold, the PS4 can do too btw. :). Not to mention the PS4 is already tooled to harness DX11.2+ and beyond.

It's so sad how your knowledge, and that of the rest of the Cows, is just DOO DOO, PURE POOP!!! The PS4 will NEVER EVER EVER!!! do DX of any kind, not DX11.1, not DX11.2, NOTHING!!! Man, you are shit stupid, I swear. DX is MS-owned; in no way, shape, or form can DX be done on the PS4. The PS4 is OpenGL.

http://www.youtube.com/watch?v=wKjxFJfcrcA

LOL there are equivalents you do realise? Hold on let me find this picture to totally own your lem ass, oh yes here it is, but a simple google search away :D. Your comment rendered pointless in 3....2.....1....:

Back to your basement goblin!

But bu bu bu it don't run DX. Let me ask you YES OR NO does PS4 run DX or Open GL? Back under your bridge troll your attempt to twist the truth has FAILED!!!

lmfao you can't get out of it lem, my point was they can do the same thing and the PS4 can actually do more, and that proves it. Can't believe it took you that long to come up with that lame ass reply. Picture proof puts any reply you make in the lembin.

Everyone on here knows you got OWNED!!! Answer the question: does PS4 run DX or OpenGL? Just because it could doesn't mean it will; MS owns it.

lmfao the picture evidence proving my point has made you lose your mind. You should make another thread giving us a run down of your mental condition. PS4 has the equivalent of the DX11.2 and '+' feature set. Am I going to have to bold and underline it for you? Maybe getting owned has caused you to experience an artificial lobotomy.

#153 Edited by Opus_Rea-333 (961 posts) -

If this "console games need to be 1080p 60FPS" shit/trend made up by Sony marketing continues, I'm quitting gaming.

Developers will switch their focus to that instead of to an actual game with substance.

If the console is weak, don't push for 1080p/60, just 1080p/30 or 720p/60.

#154 Edited by jun_aka_pekto (15938 posts) -

@leandrro said:

"you can store up to 6GB of textures on the ESRAM alone" missinformation service up and running

Sounds like "magic beans" to me. Store up to 6Gb inside 32mb of physical memory? That's one heck of a good compression ratio. Even if it was possible, it'd require some kind of super CPU we haven't yet seen.

#155 Posted by killzowned24 (7301 posts) -
#156 Posted by GrenadeLauncher (4205 posts) -

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

#157 Edited by ReadingRainbow4 (13526 posts) -

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

lmao.

#158 Edited by Ripsaw1994 (96 posts) -

Calling these 1080p systems is kinda pathetic.

#159 Posted by ttboy (272 posts) -

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

#160 Posted by GrenadeLauncher (4205 posts) -

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

#161 Posted by zeeshanhaider (2439 posts) -

Nope. The so-called outdated 'next gen' consoles are called 900pStation and 720pBox for a reason, and I assure you the road will only go downwards from here.

#162 Edited by edwardecl (2142 posts) -

"you can store up to 6GB of textures on the ESRAM alone", I stopped reading your post when I got to that point...

Please tell me how you store 6GB of textures on 32MB of ESRAM? Even if you are talking about the 6GB being compressed losslessly (which is retarded to even mention) it does not compute, unless you are being a complete troll with the "up to" statement. The PS4 can store up to 40GB of textures on its GDDR5 too, in that case.

#163 Posted by Dire_Weasel (16000 posts) -

@ttboy said:

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

Wow, another victim. Yes, misterxmedia has a "level of intelligence" that is beyond most of us; it's so far below us it's hard to make out at all.

Seriously, "misterxmedia" is either a world-class troll, a crypto-Sony supporter, or certifiably insane.

In any case, nothing he says is marred by even a tiny speck of truth. Stacked GPU? 128 MB of ESRAM? He's doing the fans of the Xbox One a huge disservice, and the true believers aren't making it any better.

#164 Posted by Alucard_Prime (2766 posts) -

Damn, this thread is entertaining... Xbox1 graphics don't bother me; they are a nice step up from the 360, but more is always better. It will be interesting to see what happens when those new kits come out.

#165 Posted by misterpmedia (3364 posts) -
#166 Posted by StormyJoe (4956 posts) -

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I am no longer a developer, but I used to be - I even have my MCSD in .Net 3.0.

If you think you cannot improve performance of software (yes, games are software) by improving APIs, then you are truly an ignorant &%#, who needs to shut his trap.

#167 Edited by misterpmedia (3364 posts) -

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

If you knew the history of misterx's blog you wouldn't be saying that. Mistercteam pieces together literally whatever he wants in order to produce a narrative that fits the blog's agenda. Mister X and Cteam have both been banned from all the well-known tech sites like SemiAccurate, Beyond3D, and AnandTech.

The blog has also been consistently wrong since its inception.

#168 Posted by Krelian-co (10477 posts) -

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I am no longer a developer, but I used to be - I even have my MCSD in .Net 3.0.

If you think you cannot improve performance of software (yes, games are software) by improving APIs, then you are truly an ignorant &%#, who needs to shut his trap.

Yeah, keep the dream alive, because the people who develop for Sony, which is easier to develop for, won't get better at it either, right? The gap will stay: as the Xbone gets "better" thanks to "magical optimization" (secret sauce included), so will the PS4.

#169 Edited by ttboy (272 posts) -

@GrenadeLauncher:

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I would agree if he were the only one on that site. It appears there are at least 2 devs who comment on there; "Hi Def Ninja" appears to be one of them, and he cautions against raising expectations of a 4TF machine. What I like is that not everyone on there agrees, and they will debate his assertions. Mistercteam is either a hardware dev or someone who knows a hell of a lot about hardware architecture, both present and future. He's the one who actually does the research to back up the insider's posts.

Also, the insider predicted a 50% performance boost via driver update in November last year. One dev has confirmed a major speed increase, so we shall see.

Yes, Microsoft hires the top 10% of Waterloo graduates. This is a school similar to MIT in the US with respect to Comp Eng. I think the MS engineers would not build a sub-1080p box, and I do think they're better than Sony's engineers, but that's just my opinion. Who knows, maybe they were forced to build a weak system due to costs, but I wouldn't bet that's the case.

#170 Posted by StormyJoe (4956 posts) -

@StormyJoe said:

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I am no longer a developer, but I used to be - I even have my MCSD in .Net 3.0.

If you think you cannot improve performance of software (yes, games are software) by improving APIs, then you are truly an ignorant &%#, who needs to shut his trap.

Yeah, keep the dream alive, because the people who develop for Sony, which is easier to develop for, won't get better at it either, right? The gap will stay: as the Xbone gets "better" thanks to "magical optimization" (secret sauce included), so will the PS4.

"Blah blah blah, moo moo moo. Sony is the greatest moo blah moo" <-- that's all I see when I read your inane posts.

#171 Edited by misterpmedia (3364 posts) -

@ttboy said:

@GrenadeLauncher:

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I would agree if he were the only one on that site. It appears there are at least 2 devs who comment on there; "Hi Def Ninja" appears to be one of them, and he cautions against raising expectations of a 4TF machine. What I like is that not everyone on there agrees, and they will debate his assertions. Mistercteam is either a hardware dev or someone who knows a hell of a lot about hardware architecture, both present and future. He's the one who actually does the research to back up the insider's posts.

Also, the insider predicted a 50% performance boost via driver update in November last year. One dev has confirmed a major speed increase, so we shall see.

Yes, Microsoft hires the top 10% of Waterloo graduates. This is a school similar to MIT in the US with respect to Comp Eng. I think the MS engineers would not build a sub-1080p box, and I do think they're better than Sony's engineers, but that's just my opinion. Who knows, maybe they were forced to build a weak system due to costs, but I wouldn't bet that's the case.

Also this:

#172 Posted by TruthBToldShow (328 posts) -

How many times have lems told us this before?

#173 Posted by GrenadeLauncher (4205 posts) -

@ttboy said:

@GrenadeLauncher:

Also, the insider predicted a 50% performance boost via driver update in November last year. One dev has confirmed a major speed increase, so we shall see.

That'll be the 8% power boost from jettisoning Kinect which still doesn't bring it to paper specs.

#174 Edited by Krelian-co (10477 posts) -

@StormyJoe said:

@Krelian-co said:

@StormyJoe said:

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I am no longer a developer, but I used to be - I even have my MCSD in .Net 3.0.

If you think you cannot improve performance of software (yes, games are software) by improving APIs, then you are truly an ignorant &%#, who needs to shut his trap.

Yeah, keep the dream alive, because the people who develop for Sony, which is easier to develop for, won't get better at it either, right? The gap will stay: as the Xbone gets "better" thanks to "magical optimization" (secret sauce included), so will the PS4.

"Blah blah blah, moo moo moo. Sony is the greatest moo blah moo" <-- that's all I see when I read your inane posts.

Nice comeback. I hope you learn something; I'm getting tired of correcting your ignorance and destroying your arguments.

#175 Edited by ronvalencia (15109 posts) -
@me3x12 said:

@bforrester420 said:

Still clinging to secret sauce theories...sad...

This is not secret sauce, this is all FACT. It's true that DX11.2 is coming to Xbox One, it's true Tiled Resources is coming to Xbox One, and it's a FACT you can store up to 6GB of textures on the 32MB ESRAM alone (in FACT it's actually 47MB of ESRAM). It's also a FACT that new SDK dev kits are going out basically as we speak, and a developer has already said it's going to make things way faster and make 1080p easy. What's so hard to swallow here?

MS owns DX and they will use Tiled Resources with DX11.2. In theory Tiled Resources could also be attempted on PS4, but they have no plans of doing so, because it's been said it's nowhere near as efficient as it is on the proprietary DX11.2; they would have to create separate apps to try to achieve it, and it would be a huge system bog-down.

PS4's graphics API is similar to AMD Mantle i.e. it's efficient. Atm, AMD Mantle and PS4's graphics APIs are proprietary APIs.

Xbox One uses DirectX 11.X which is a superset of PC's DirectX11.2.

#176 Posted by StormyJoe (4956 posts) -

@StormyJoe said:

@Krelian-co said:

@StormyJoe said:

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

Literally posted this just when Thief was confirmed 900p Xbone, 1080p PS4.

Also good job citing SW's favourite Russian moron.

All fun and games aside,

Don't make fun of disabled people, asshole.

Yeah, you're right. It's insulting for the disabled to be compared to such a monumental prick like misterxmedia.

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

The only supporting evidence mistershitmedia has is the voice in his head - sorry, his "insider" and those MS Paint altered images pointing out stuff we already knew. At this point he's been so wrong so often with his forever extending NDA release that you need to be a dumbass lem to take him seriously.

"I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out." Albert Penello said something similar too.

I am no longer a developer, but I used to be - I even have my MCSD in .Net 3.0.

If you think you cannot improve performance of software (yes, games are software) by improving APIs, then you are truly an ignorant &%#, who needs to shut his trap.

Yeah, keep the dream alive, because the people who develop for Sony, which is easier to develop for, won't get better at it either, right? The gap will stay: as the Xbone gets "better" thanks to "magical optimization" (secret sauce included), so will the PS4.

"Blah blah blah, moo moo moo. Sony is the greatest moo blah moo" <-- that's all I see when I read your inane posts.

Nice comeback. I hope you learn something; I'm getting tired of correcting your ignorance and destroying your arguments.

Destroy my arguments? Laughable. The only chance you ever have to "destroy" any of my points is if a source I quote happens to turn out to be false. That's it, that's your only hope.

Not to be rude, but I am simply smarter than you are. Sorry if that stings, but the truth often does with people such as yourself. I only reply to your threads as a break from things that actually require me to think - like the probability algorithm for market penetration I just finished for the BI dashboard my company's execs use (thank you, finite math!).

#177 Posted by misterpmedia (3364 posts) -

@me3x12 said:

@bforrester420 said:

Still clinging to secret sauce theories...sad...

This is not secret sauce, this is all FACT. It's true that DX11.2 is coming to Xbox One, it's true Tiled Resources is coming to Xbox One, and it's a FACT you can store up to 6GB of textures on the 32MB ESRAM alone (in FACT it's actually 47MB of ESRAM). It's also a FACT that new SDK dev kits are going out basically as we speak, and a developer has already said it's going to make things way faster and make 1080p easy. What's so hard to swallow here?

MS owns DX and they will use Tiled Resources with DX11.2. In theory Tiled Resources could also be attempted on PS4, but they have no plans of doing so, because it's been said it's nowhere near as efficient as it is on the proprietary DX11.2; they would have to create separate apps to try to achieve it, and it would be a huge system bog-down.

PS4's graphics API is similar to AMD Mantle i.e. it's efficient. Atm, AMD Mantle and PS4's graphics APIs are proprietary APIs.

Xbox One uses DirectX 11.X which is a superset of PC's DirectX11.2.

Is that technically better or worse?

#178 Edited by ronvalencia (15109 posts) -

@edwardecl said:

"you can store up to 6GB of textures on the ESRAM alone", I stopped reading your post when I got to that point...

Please tell me how you store 6GB of textures on 32MB of ESRAM? Even if you are talking about the 6GB being compressed losslessly (which is retarded to even mention) it does not compute, unless you are being a complete troll with the "up to" statement. The PS4 can store up to 40GB of textures on its GDDR5 too, in that case.

AMD PRT/Tiled Resource Tier 3 enables texture tiles to be streamed into ESRAM from a slower storage device (e.g. DDR3 memory) in a just-in-time manner for consumption by the GPU's TMUs. The streamed texture tiles are selected based on the player's viewport.

X1's TMU texture fetch potential changes from 68 GB/s (7770-like) to ~150 GB/s (from ESRAM).
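To put numbers on that: a 6 GB texture set is roughly 98,000 64 KB tiles, while 32 MB of ESRAM only holds 512 of them, so the whole scheme lives or dies on keeping just the viewport-visible tiles resident. Below is a minimal sketch of the same idea through the public PC-side D3D11.2 tiled-resources API; the Xbox One's Direct3D 11.X variant isn't public, so the 32 MB tile pool standing in for ESRAM and the texture sizes are illustrative assumptions only.

```cpp
// Illustration of D3D11.2 tiled resources: a big "virtual" texture whose 64 KB
// tiles are committed on demand to a small pool of physical memory (conceptually
// the ESRAM; here just a default-heap buffer). Error handling omitted.
#include <d3d11_2.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

static const UINT kTileBytes = 64 * 1024;   // every D3D tile is 64 KB

void StreamOneTile(ID3D11Device2* device, ID3D11DeviceContext2* ctx)
{
    // 1) A 32 MB tile pool = 512 physical tiles (stand-in for ESRAM).
    D3D11_BUFFER_DESC poolDesc = {};
    poolDesc.ByteWidth = 32 * 1024 * 1024;
    poolDesc.Usage = D3D11_USAGE_DEFAULT;
    poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
    ComPtr<ID3D11Buffer> tilePool;
    device->CreateBuffer(&poolDesc, nullptr, &tilePool);

    // 2) A large tiled texture: only tiles mapped into the pool consume memory.
    D3D11_TEXTURE2D_DESC texDesc = {};
    texDesc.Width = 16384;
    texDesc.Height = 16384;               // far larger than the pool can hold at once
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format = DXGI_FORMAT_BC1_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.Usage = D3D11_USAGE_DEFAULT;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    texDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
    ComPtr<ID3D11Texture2D> virtualTex;
    device->CreateTexture2D(&texDesc, nullptr, &virtualTex);

    // 3) Map one tile that the current viewport needs onto the first pool slot.
    D3D11_TILED_RESOURCE_COORDINATE coord = {};   // tile (0,0,0), subresource 0
    D3D11_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;
    UINT rangeFlag  = 0;   // 0 = map normally (not NULL, not skip)
    UINT poolOffset = 0;   // first 64 KB slot in the pool
    UINT tileCount  = 1;
    ctx->UpdateTileMappings(virtualTex.Get(), 1, &coord, &region,
                            tilePool.Get(), 1, &rangeFlag,
                            &poolOffset, &tileCount, 0);

    // 4) Stream that tile's texel data in from slower memory (DDR3/disk).
    std::vector<unsigned char> tileData(kTileBytes);   // placeholder contents
    ctx->UpdateTiles(virtualTex.Get(), &coord, &region, tileData.data(), 0);
}
```

The point is that UpdateTileMappings only changes which 64 KB pages are physically backed, while UpdateTiles (or a GPU copy) streams the texel data in; nothing ever gets "compressed" into the 32 MB.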

#179 Posted by ronvalencia (15109 posts) -

@ronvalencia said:
@me3x12 said:

@bforrester420 said:

Still clinging to secret sauce theories...sad...

This is not secret sauce this is all FACT. It's true that DX11.2 is coming to Xbox One and it's true Tiled Resources is coming to Xbox One and it's a FACT you can store up to 6gb of textures on the 32mb ESRAM alone in FACT it's actually 47MB of ESRAM and it's also a FACT that new SDK dev kits are going out basically as we speak and a developer has already said it's going to make things way fast and make 1080p easy. What's so hard to swallow here?

MS owns DX and they will use Tiled Resources with DX11.2 and also in theory in Tiled Resources could be attempted on PS4 but they have no plans of doing so because it's been said it's nowhere near as efficient as it is on the proprietary DX11.2 and they would have to create separate apps to try to achieve it and it would be a huge system bog

PS4's graphics API is similar to AMD Mantle i.e. it's efficient. Atm, AMD Mantle and PS4's graphics APIs are proprietary APIs.

Xbox One uses DirectX 11.X which is a superset of PC's DirectX11.2.

Is that technically better or worse?

It's technically better at the API level, but that doesn't factor in better drivers. It's interesting that X1's new driver follows the AMD Mantle driver release.

#180 Posted by kinectthedots (1629 posts) -

@tyloss said:

@gamemediator said:

LOL my post got deleted. fuck this forum, its so full of ignorant Sony fanboys. Gamespot can't even review a game without being biased. Gametrailers.com is where its at, people actually talk intelligently and DISCUSS FACTS. go fuck yourself gamespot and X1 haters.

So much tears and rage.

He is a lem, that is all there ever is.

#181 Posted by tormentos (17293 posts) -

The ESRAM is really something truly special, and not just there to try to help out the slow DDR3. It will compensate for the lack of CU units on the GPU and will crush the PS4.

In reality what MS have done is nothing new. The PS2 had a unified 32MB of RDRAM @3.2GB/s, and 4MB of DRAM @48GB/s. The difference with the X1 is that the speed advantage compared to the competition isn't there, since the GDDR5 RAM in the PS4 is just as fast and there is more of it. Had the ESRAM been around 500GB/s or more, there would probably have been an advantage.

My god..

ESRAM is just memory; it is not a flop enhancer or a flop-producing part, and it will not help the Xbox One close that gap.

Yeah, and having 4MB of DRAM embedded on the GPU didn't stop the original Xbox from being stronger, because it had a stronger GPU; basically you just proved that the Xbox One is screwed.

It could have been 1,000GB/s and it would mean little; the Xbox One GPU doesn't have anywhere near the power to saturate that bandwidth. The problem with giving a 7770 500GB/s is that it will still be a 7770.

#182 Posted by kinectthedots (1629 posts) -

@me3x12 said:

@misterpmedia said:

@me3x12 said:

This is a very viable scenario. Here's the deal: the drivers are way late for the X1, new SDK kits are going out really soon, and they're already said to be really fast, which makes achieving 1080p on X1 really easy.

Apparently these new kits also give developers the tools they need to leverage the ESRAM in ways they had no clue about.

Also, MS will be dropping DX11.2 on X1 soon, and when this happens you can store up to 6GB of textures on the ESRAM alone. That's massive and will also make it really easy to achieve 1080p.

This is all true, don't try to downplay it. When this happens, and it's going to happen really soon, the reality is that the X1 might in FACT flip the table and be the more capable box, running full 1080p at 60fps.

There is a lot of AM CRY coming; gonna need lots of tissues soon.

http://www.digitalspy.com/gaming/news/a550445/rebellion-xbox-one-1080p-issues-can-be-fixed.html

http://misterxmedia.livejournal.com/153756.html

http://gamingbolt.com/why-xbox-ones-esram-feels-limited-despite-potential-to-store-6gb-of-tiled-textures-using-dx-11-2

Hahahaha, referencing Misterxmedia makes this whole thread, topic, and post nonsensical. Way to fail, TC; even I thought you hadn't sunk to the bottom-most rung of Lemdom.

The bold, the PS4 can do too btw. :). Not to mention the PS4 is already tooled to harness DX11.2+ and beyond.

It's so sad how your knowledge, and that of the rest of the Cows, is just DOO DOO, PURE POOP!!! The PS4 will NEVER EVER EVER!!! do DX of any kind, not DX11.1, not DX11.2, NOTHING!!! Man, you are shit stupid, I swear. DX is MS-owned; in no way, shape, or form can DX be done on the PS4. The PS4 is OpenGL.

http://www.youtube.com/watch?v=wKjxFJfcrcA

LOL there are equivalents you do realise? Hold on let me find this picture to totally own your lem ass, oh yes here it is, but a simple google search away :D. Your comment rendered pointless in 3....2.....1....:

Back to your basement goblin!

Severe ownage.

Me3x12, "PS4 will NEVER EVER EVER!!! do DX of any kind not DX11.1 DX11.2 NOTHING!!! Man you are shit stupid I swear. DX is MS owned in no way shape or form can DX be done on PS4. PS4 is Open GL"

The proper oder of events that need to follow this post*

Everyone laugh at TC

lock thread

ban TC

#183 Posted by ronvalencia (15109 posts) -

@Martin_G_N said:

The ESRAM is really something truly special, and not just there to try to help out the slow DDR3. It will compensate for the lack of CU units on the GPU and will crush the PS4.

In reality what MS have done is nothing new. The PS2 had a unified 32MB of RDRAM @3.2GB/s, and 4MB of DRAM @48GB/s. The difference with the X1 is that the speed advantage compared to the competition isn't there, since the GDDR5 RAM in the PS4 is just as fast and there is more of it. Had the ESRAM been around 500GB/s or more, there would probably have been an advantage.

My god..

ESRAM is just memory; it is not a flop enhancer or a flop-producing part, and it will not help the Xbox One close that gap.

Yeah, and having 4MB of DRAM embedded on the GPU didn't stop the original Xbox from being stronger, because it had a stronger GPU; basically you just proved that the Xbox One is screwed.

It could have been 1,000GB/s and it would mean little; the Xbox One GPU doesn't have anywhere near the power to saturate that bandwidth. The problem with giving a 7770 500GB/s is that it will still be a 7770.

1. 7770 with 500 GB/s bandwidth doesn't exist.

2. The X1 has two rasterizer units while the 7770 has one. The two rasterizer units are capable of supporting ~150 GB/s-level VRAM.

#184 Posted by misterpmedia (3364 posts) -

@misterpmedia said:

@ronvalencia said:
@me3x12 said:

@bforrester420 said:

Still clinging to secret sauce theories...sad...

This is not secret sauce, this is all FACT. It's true that DX11.2 is coming to Xbox One, it's true Tiled Resources is coming to Xbox One, and it's a FACT you can store up to 6GB of textures on the 32MB ESRAM alone (in FACT it's actually 47MB of ESRAM). It's also a FACT that new SDK dev kits are going out basically as we speak, and a developer has already said it's going to make things way faster and make 1080p easy. What's so hard to swallow here?

MS owns DX and they will use Tiled Resources with DX11.2. In theory Tiled Resources could also be attempted on PS4, but they have no plans of doing so, because it's been said it's nowhere near as efficient as it is on the proprietary DX11.2; they would have to create separate apps to try to achieve it, and it would be a huge system bog-down.

PS4's graphics API is similar to AMD Mantle i.e. it's efficient. Atm, AMD Mantle and PS4's graphics APIs are proprietary APIs.

Xbox One uses DirectX 11.X which is a superset of PC's DirectX11.2.

Is that technically better or worse?

It's technically better at the API level, but that doesn't factor in better drivers. It's interesting that X1's new driver follows the AMD Mantle driver release.

There's this thing, though: I saw one of those really awesome devs talk about how coding for the Xbox sits on top of all this other crap whereas the PS4 doesn't. I think it was Corrinne Yu.

#185 Edited by ronvalencia (15109 posts) -

@misterpmedia said:

@me3x12 said:

@misterpmedia said:

@me3x12 said:

This is a very viable scenario. Here's the deal: the drivers are way late for the X1, new SDK kits are going out really soon, and they're already said to be really fast, which makes achieving 1080p on X1 really easy.

Apparently these new kits also give developers the tools they need to leverage the ESRAM in ways they had no clue about.

Also, MS will be dropping DX11.2 on X1 soon, and when this happens you can store up to 6GB of textures on the ESRAM alone. That's massive and will also make it really easy to achieve 1080p.

This is all true, don't try to downplay it. When this happens, and it's going to happen really soon, the reality is that the X1 might in FACT flip the table and be the more capable box, running full 1080p at 60fps.

There is a lot of AM CRY coming; gonna need lots of tissues soon.

http://www.digitalspy.com/gaming/news/a550445/rebellion-xbox-one-1080p-issues-can-be-fixed.html

http://misterxmedia.livejournal.com/153756.html

http://gamingbolt.com/why-xbox-ones-esram-feels-limited-despite-potential-to-store-6gb-of-tiled-textures-using-dx-11-2

Hahahaha, referencing Misterxmedia makes this whole thread, topic, and post nonsensical. Way to fail, TC; even I thought you hadn't sunk to the bottom-most rung of Lemdom.

The bold, the PS4 can do too btw. :). Not to mention the PS4 is already tooled to harness DX11.2+ and beyond.

It's so sad how your knowledge, and that of the rest of the Cows, is just DOO DOO, PURE POOP!!! The PS4 will NEVER EVER EVER!!! do DX of any kind, not DX11.1, not DX11.2, NOTHING!!! Man, you are shit stupid, I swear. DX is MS-owned; in no way, shape, or form can DX be done on the PS4. The PS4 is OpenGL.

http://www.youtube.com/watch?v=wKjxFJfcrcA

LOL there are equivalents you do realise? Hold on let me find this picture to totally own your lem ass, oh yes here it is, but a simple google search away :D. Your comment rendered pointless in 3....2.....1....:

Back to your basement goblin!

Severe ownage.

Me3x12, "PS4 will NEVER EVER EVER!!! do DX of any kind not DX11.1 DX11.2 NOTHING!!! Man you are shit stupid I swear. DX is MS owned in no way shape or form can DX be done on PS4. PS4 is Open GL"

The proper oder of events that need to follow this post*

Everyone laugh at TC

lock thread

ban TC

http://blogs.windows.com/windows/b/appbuilder/archive/2013/10/14/raising-the-bar-with-direct3d.aspx

"The Xbox One graphics API is “Direct3D 11.x” and the Xbox One hardware provides a superset of Direct3D 11.2 functionality."

http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"In addition to asynchronous compute queues, the Xbox One hardware supports two concurrent render pipes,"

Xbox One's DirectX 11.X supports AMD GCN's asynchronous compute queues.

#186 Edited by ronvalencia (15109 posts) -

@misterpmedia said:

@ronvalencia said:

@misterpmedia said:

@ronvalencia said:
@me3x12 said:

@bforrester420 said:

Still clinging to secret sauce theories...sad...

This is not secret sauce, this is all FACT. It's true that DX11.2 is coming to Xbox One, it's true Tiled Resources is coming to Xbox One, and it's a FACT you can store up to 6GB of textures on the 32MB ESRAM alone (in FACT it's actually 47MB of ESRAM). It's also a FACT that new SDK dev kits are going out basically as we speak, and a developer has already said it's going to make things way faster and make 1080p easy. What's so hard to swallow here?

MS owns DX and they will use Tiled Resources with DX11.2. In theory Tiled Resources could also be attempted on PS4, but they have no plans of doing so, because it's been said it's nowhere near as efficient as it is on the proprietary DX11.2; they would have to create separate apps to try to achieve it, and it would be a huge system bog-down.

PS4's graphics API is similar to AMD Mantle i.e. it's efficient. Atm, AMD Mantle and PS4's graphics APIs are proprietary APIs.

Xbox One uses DirectX 11.X which is a superset of PC's DirectX11.2.

Is that technically better or worse?

It's technically better at the API level, but that doesn't factor in better drivers. It's interesting that X1's new driver follows the AMD Mantle driver release.

There's this thing, though: I saw one of those really awesome devs talk about how coding for the Xbox sits on top of all this other crap whereas the PS4 doesn't. I think it was Corrinne Yu.

Corrinne Yu was a member of the Direct3D advisory board, LOL... http://www.gamespot.com/articles/naughty-dog-hires-halo-4-programmer-corrinne-yu/1100-6416367/

http://www.reddit.com/r/Games/comments/1rezu8/corrinne_yu_on_the_difference_between_ps4_and/

"She's comparing Windows 8, not Xbox One. Thread title is wrong"

#187 Edited by misterpmedia (3364 posts) -

@ronvalencia said:

@misterpmedia said:

@ronvalencia said:

@misterpmedia said:

@ronvalencia said:
@me3x12 said:

@bforrester420 said:

Still clinging to secret sauce theories...sad...

This is not secret sauce, this is all FACT. It's true that DX11.2 is coming to Xbox One, it's true Tiled Resources is coming to Xbox One, and it's a FACT you can store up to 6GB of textures on the 32MB ESRAM alone (in FACT it's actually 47MB of ESRAM). It's also a FACT that new SDK dev kits are going out basically as we speak, and a developer has already said it's going to make things way faster and make 1080p easy. What's so hard to swallow here?

MS owns DX and they will use Tiled Resources with DX11.2. In theory Tiled Resources could also be attempted on PS4, but they have no plans of doing so, because it's been said it's nowhere near as efficient as it is on the proprietary DX11.2; they would have to create separate apps to try to achieve it, and it would be a huge system bog-down.

PS4's graphics API is similar to AMD Mantle i.e. it's efficient. Atm, AMD Mantle and PS4's graphics APIs are proprietary APIs.

Xbox One uses DirectX 11.X which is a superset of PC's DirectX11.2.

Is that technically better or worse?

It's technically better at the API level, but that doesn't factor in better drivers. It's interesting that X1's new driver follows the AMD Mantle driver release.

There's this thing, though: I saw one of those really awesome devs talk about how coding for the Xbox sits on top of all this other crap whereas the PS4 doesn't. I think it was Corrinne Yu.

Corrinne Yu was a member of the Direct3D advisory board, LOL... http://www.gamespot.com/articles/naughty-dog-hires-halo-4-programmer-corrinne-yu/1100-6416367/

http://www.reddit.com/r/Games/comments/1rezu8/corrinne_yu_on_the_difference_between_ps4_and/

"She's comparing Windows 8, not Xbox One. Thread title is wrong"

Edit: never mind, I didn't follow that last link.

"In addition to asynchronous compute queues, the Xbox One hardware supports two concurrent render pipes,"

What does this mean for the Xbox One?

#188 Posted by edwardecl (2142 posts) -

The people saying the Xbox will catch up performance-wise with software...

I've seen this argument from Microsoft quite a few times: they say they are better because they made DirectX and they have the better programmers. If that's the case, where is it? Their entire argument is that they might be able to outpace Sony's developers with optimisations. They would be lucky just to keep up; software optimisations are not going to buy you anything like 30 or 40 percent. Sure, the exclusives could use little tricks to make it appear better than it is (what I mean by this is achieving graphical fidelity at the expense of gameplay and exploration by almost hand-optimising the scenes), but the real test is with true open-world games, or games that are written for both systems.

As for DirectX itself... well, the argument here is pointless. Both companies and their dev teams can make this stuff better over time; however, I very much doubt there will be much improvement unless either system was not really finished at the time of release. I suppose with the Xbox you can kind of believe that, but there is no reason why one should be better than the other on features, as they are very similar apart from one being more powerful.

To me, Microsoft just sound arrogant with their boasts of superior programmers.

#189 Edited by Gue1 (9570 posts) -

So much talk about tiled resources, yet no one mentions Sony's main advantage. PlayStation has never had high-level interfaces getting in the way of programming to the metal, thanks to the libGCN library, which is why the PS2 was so weak yet had games like Shadow of the Colossus, FFXII and God of War. libGCN is the reason devs could push the PS3 even further than the X360, even though the former had a weaker GPU and a worse RAM set-up. And now that the PS4 is more powerful than the Xbone component by component, there is nothing that can stop it. In fact, Timothy Lottes and Corrinne Yu agree with this.

Corrinne Yu created the lighting of Halo 4, and M$ considered it so good that they patented it. So we have a genius saying that programming for PlayStation is way better than Windows and DirectX because there is no high-level stuff getting in the way.

@Corrinne:

2013-11-24 17:01:16 UTC

PS4 graphics programming does not sit on top of heavy layers of WinOS or GPU drivers I'm excite about as pedal to metal coder

Why she was a big deal at 343: Wikipedia

343 Industries was established in 2007 to oversee the Halo franchise following Bungie's separation from Microsoft. Yu programmed lighting, facial animation, and developed new technology for the 2012 video game Halo 4. While coding on Halo team, Yu researched new lighting techniques, and invented new dynamic radiosity algorithms. Microsoft applied a software patent for Yu's Halo lighting work.

More stuff:

Space Experience:

Yu programmed on the Space Shuttle program at Rockwell International California. She designed and conducted accelerator experiments at LINAC in California and the accelerator at Brookhaven National Laboratory.

Nuclear Physics:

Her nuclear physics research won her a national award from the U.S. Department of Energy.

GDC Award:

In 2009, Corrinne Yu won Best in Engineering internationally at GDC

Other places she has worked:

Gearbox:

Yu worked as Director of Technology of Gearbox Software, creator of Brothers in Arms and Borderlands.

Direct 3D:

Yu was a founding member of Microsoft's Direct 3D Advisory Board. She participated in CUDA and GPU simulation at NVidia.

Ion Storm:

While at Ion she was responsible for the Quake 2 code base and any games based on that engine.

BTW, she now works for Naughty Dog and her husband is senior art director at 343i.

#190 Posted by tormentos (17293 posts) -

Nope. The so-called outdated 'next gen' consoles are called 900pStation and 720pBox for a reason, and I assure you the road will only go downwards from here.

The PS4 has only 1 game out now that is 900p, and that's the very demanding BF4.

All other games are 1080p. On the Xbox One there are several 720p games and several 900p games, as well as several 1080p games.

Damn, this thread is entertaining... Xbox1 graphics don't bother me; they are a nice step up from the 360, but more is always better. It will be interesting to see what happens when those new kits come out.

A driver update will not change much...

@ttboy said:

What's interesting is that misterxmedia at least tries to back up what he is saying with links and whitepaper specs, while most people in this thread just post pics of some guy in a vain attempt at humor. I don't think there is one person on this board who understands hardware architecture as well as some of the contributors on that site you insult. Mistercteam just posted an in-depth breakdown of what we know about the embedded RAM in the Xbox One, and he was kind enough to add supporting links. This is a level of intelligence that is beyond most of you. I may not agree with all of the conclusions on there, but at least they try to make a logical argument with source materials cited.

I work in Software Development and yes you can say any feature of Direct X can be copied on OpenGL. However this is not a real argument. Everything can be emulated in software but at what cost. I've worked with many Microsoft engineers and none of them are stupid. Lets just see how this turns out. I'm intrigued to see if efficiency can beat brute force.

For the dude who finds it hard to believe about DX12 and tiled resources : http://www.youtube.com/watch?v=-I5TEpAnuEg

MisterXmedia is a troll, and people who believe him are just empowering his trolling even more. He was already proven wrong about the stacked APU crap, the dGPU and all that sh**. Why the fu** would MS need to cut the 10% GPU reservation and risk hardware failures by overclocking the CPU and GPU if they had a stacked APU or a dGPU inside?

Most people who claim this, including misterxmedia himself, have no real knowledge of hardware. The Xbox One has a TDP limit; it's impossible to fit that APU plus a dGPU, or even dual APUs. These units already have power limits imposed by some states, and MS wants to keep wattage and heat low as well.

And that bold part really shouldn't be a real argument. OpenGL has less overhead and tends to support new things faster than DX does. In fact, PRT (tiled resources) has been supported in hardware since it arrived with the 7000 series cards in 2011, yet MS didn't implement it in DX until 2013. Claiming that tiled resources were an exclusive feature of DX was laughable when OpenGL had supported them since the hardware appeared...

OpenGL doesn't have to emulate DX; it does the same thing through its own extensions and with less overhead (see the sketch after the quotes below).

Currently MegaTexture does this entirely in software using existing OpenGL 3.2 APIs, but AMD believes that more next-generation game engines will use this type of texturing technology.

The second aspect of PRT is managing the tiles. In essence PRT reduces local video memory to a very large cache, where tiles are mapped/pinned as necessary and then evicted as per the cache rules, and elsewhere the hardware handles page/tile translation should a tile not already be in the cache. Large tomes have been written on caching methods, and this aspect is of particular interest to AMD because what they learn about caching here they can apply to graphical workloads (i.e. professional) and not just gaming.

This should prove once and for all that PRT doesn't need 2 different memory setups, and that it works on PS4, possibly even better than on Xbox One. Not only that, MegaTexture, which is what Carmack did with Rage, was also done with OpenGL, so basically MS was years late to PRT/tiled resources.

Wrapping things up, for the time being while Southern Islands will bring hardware support for PRT software support will remain limited. As D3D is not normally extensible it’s really only possible to easily access the feature from other APIs (e.g. OpenGL), which when it comes to games is going to greatly limit the adoption of the technology.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6
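
To make that concrete, here is a rough sketch of how PRT is exposed on the OpenGL side through the ARB_sparse_texture extension. This is only an illustration under assumptions (a GL 4.x context with the extension present and a loader like GLEW already initialised; width, height and tileData are placeholder names), not anyone's actual engine code:

```cpp
// Rough sketch: a partially resident texture via ARB_sparse_texture.
// Only virtual address space is reserved up front; physical video memory
// is committed one page (tile) at a time by the application.
#include <GL/glew.h>

GLuint makeSparseTexture(GLsizei width, GLsizei height, const void* tileData)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // Mark the texture as sparse BEFORE allocating storage.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, width, height);

    // Ask the driver for the hardware page (tile) size for this format;
    // real code would size the texture in whole pages.
    GLint pageX = 0, pageY = 0;
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageX);
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageY);

    // Commit physical memory for just the top-left page, then upload it.
    glTexPageCommitmentARB(GL_TEXTURE_2D, 0, 0, 0, 0, pageX, pageY, 1, GL_TRUE);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, pageX, pageY,
                    GL_RGBA, GL_UNSIGNED_BYTE, tileData);

    // Pages can be evicted later the same way by passing GL_FALSE.
    return tex;
}
```

The application decides which pages get physical memory and which get thrown out, which is exactly the "video memory as a very large cache" behaviour the AnandTech article describes; MegaTexture does that same management purely in software.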

@ttboy said:

@GrenadeLauncher:

I would agree if he were the only one on that site. It appears there are at least 2 devs who comment on there. "Hi Def Ninja" appears to be one of them, and he cautions against raising expectations to a 4TF machine. What I like is that not everyone on there agrees, and they will debate his assertions. Mistercteam is either a hardware dev or someone who knows a hell of a lot about hardware architecture, both present and future. He's the one who actually does the research to back up the insider's posts.

Also, the insider predicted a 50% performance boost via driver update back in November last year. One dev has confirmed a major speed increase, so we shall see.

Yes, Microsoft hires the top 10% of Waterloo graduates, and that's a school similar to MIT in the US with respect to computer engineering. I don't think MS engineers would build a sub-1080p box, and I do think they're better than Sony's engineers, but that's just my opinion. Who knows, maybe they were forced to build a weak system due to costs, but I wouldn't bet that's the case.

You mean 2 other trolls?

4TF...lol

So Tomb Raider, after the 50% boost, will run at 1440p 60fps on the Xbox One?

Yeah, that explains why the Xbox 360 had a 50% fail rate, and why the 360 was huge yet still didn't have an internal PSU. Oh wait, the Xbox One looks like an '80s VCR, is way bigger than the PS4, is weaker, and once again has a huge power brick.

Meanwhile the PS4 is smaller, stronger, and has an internal PSU. Yeah, that really shows MS engineers are better.

Dude, they used cheap-ass DDR3 because they wanted to Wii out people with Kinect. MS has the money for 8GB of GDDR5 and a stronger GPU; they decided against it, period, because in the 2 generations they have been in, the stronger console didn't win. So they took a gamble, went after the casuals and TV watchers, and got punished for it by the market, period.

But this is not about being better engineers or not. MS had a vision, and it was irrelevant to what people wanted; core Nintendo fans would have loved a stronger Wii, but that is what Nintendo had planned, and MS did the same kind of thing. There is nothing MS can do now to change the outcome of this race. They believed Sony was in a weak spot and could not afford anything strong; in fact, if MS had gone with GDDR5 instead of DDR3, ESRAM would not be in the picture and the Xbox One would probably have gotten at minimum a 7790, which is 1.79TF at 1GHz.

"Blah blah blah, moo moo moo. Sony is the greatest moo blah moo" <-- that's all I see when I read your inane posts.

The problem is that you are being the blind one here. You told him that games can improve as the API improves, and I agree with you 100%; the problem is that this also applies to the PS4, and there is a point where further refinement will yield no more results, period.

AMD PRT/Tiled Resources Tier 3 enables texture tiles to be streamed into ESRAM from a slower storage device (e.g. DDR3 memory) in a just-in-time manner for the GPU's TMUs to consume. The streamed texture tiles are chosen based on the player's viewport.

X1's TMU texture fetch potential changes from 68 GB/s (7770-like) to ~150 GB/s (from ESRAM).

Tier 3 exists because PRT doesn't naturally work across 2 different memory setups, so basically MS had to add support for it. PRT uses the video memory as a cache rather than as plain memory; it's in the AnandTech analysis I just quoted, but let me quote it again.

The second aspect of PRT is managing the tiles. In essence PRT reduces local video memory to a very large cache, where tiles are mapped/pinned as necessary and then evicted as per the cache rules, and elsewhere the hardware handles page/tile translation should a tile not already be in the cache. Large tomes have been written on caching methods, and this aspect is of particular interest to AMD because what they learn about caching here they can apply to graphical workloads (i.e. professional) and not just gaming.

This should end all your theories about PRT not working on PS4. Like I claimed, PRT doesn't need 2 different memory setups; that requirement is a side effect of MS's console having ESRAM. PRT uses your video card's own memory as a cache for those textures.

Partially, because not all the tiles are loaded; resident, because part of them are already there. Where were those textures before they hit your GPU? On the HDD or the disc?
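
And for the DX11.2 side of the same idea, this is roughly what the tiled-resources API looks like. Again, only a hedged sketch (it assumes device and context2 are an already-created ID3D11Device2 and ID3D11DeviceContext2, error handling is skipped, and the sizes were picked just to make the point): a huge texture is only a virtual reservation, while a small tile pool, here sized like the 32MB of ESRAM, physically backs whichever tiles you choose to map.

```cpp
// Minimal sketch of DX11.2 tiled resources: big virtual texture, small physical pool.
// Assumes 'device' (ID3D11Device2*) and 'context2' (ID3D11DeviceContext2*) already exist.
#include <d3d11_2.h>

void demoTiledResource(ID3D11Device2* device, ID3D11DeviceContext2* context2)
{
    // Physical backing: a tile pool of 512 tiles x 64 KB = 32 MB.
    D3D11_BUFFER_DESC poolDesc = {};
    poolDesc.ByteWidth = 512 * 64 * 1024;
    poolDesc.Usage     = D3D11_USAGE_DEFAULT;
    poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
    ID3D11Buffer* tilePool = nullptr;
    device->CreateBuffer(&poolDesc, nullptr, &tilePool);

    // Virtual resource: a 16K x 16K BC1 texture (~128 MB at mip 0) created
    // with MISC_TILED, so no physical memory is committed up front.
    D3D11_TEXTURE2D_DESC texDesc = {};
    texDesc.Width  = 16384;
    texDesc.Height = 16384;
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format = DXGI_FORMAT_BC1_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.Usage = D3D11_USAGE_DEFAULT;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    texDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
    ID3D11Texture2D* tiledTex = nullptr;
    device->CreateTexture2D(&texDesc, nullptr, &tiledTex);

    // Map a single tile of the texture onto the first 64 KB of the pool.
    // A real engine would remap tiles every frame based on what the camera
    // can actually see, treating the pool like a cache.
    D3D11_TILED_RESOURCE_COORDINATE coord = {};  // tile (0,0,0), subresource 0
    D3D11_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;
    UINT rangeFlags     = 0;  // 0 = map to the given pool offset
    UINT poolStartTile  = 0;
    UINT rangeTileCount = 1;
    context2->UpdateTileMappings(tiledTex, 1, &coord, &region,
                                 tilePool, 1, &rangeFlags,
                                 &poolStartTile, &rangeTileCount, 0);
}
```

Note that the texture reserves about 128MB of address space while only 64KB of real memory is committed; scale the idea up and you get the huge virtual-texture numbers people throw around, but the tiles still have to be streamed in from somewhere, which is why bandwidth and GPU power still matter.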

#191 Posted by jhcho2 (4380 posts) -

@me3x12 said:

This is a vary viable scenario. Here's the deal the drivers are way late for the X1 new SDK kits are going out really soon and already being said to be really fast and it makes achieving 1080p on X1 really easy with the new kits.

Apparently the situation is these new kits also give developers the tools they need to leverage the ESRAM in ways they had no clue.

Also MS will be dropping DX11.2 soon on X1 and when this happens you can store up to 6GB of textures on the ESRAM alone that's massive and will also make it really easy to achieve 1080p.

This is all true don't try to downplay it. When this happens and it's going to really soon the reality is that in FACT the X1 might flip the table and be the more capable box running full 1080p at 60fps because running 60fps

There is a lot of AM CRY coming gonna need lots of tissues soon.

http://www.digitalspy.com/gaming/news/a550445/rebellion-xbox-one-1080p-issues-can-be-fixed.html

http://misterxmedia.livejournal.com/153756.html

http://gamingbolt.com/why-xbox-ones-esram-feels-limited-despite-potential-to-store-6gb-of-tiled-textures-using-dx-11-2

Lol? DirectX 11.2? DirectX is more of a bane than a boon. It's better to let developers write their own code.

#192 Edited by StormyJoe (4956 posts) -

@tormentos: I never said the PS4 wouldn't improve. All games for all consoles get better as the generation progresses. My point is that I still believe the issues with the XB1 not hitting 1080p/60fps when the PS4 can is because of factors that can be corrected. Freeing up Kinect resources, optimizing the APIs, and developer familiarity will close the 3rd party gap. This, IMO, will mean that PS4 exclusives will be somewhat better than XB1 exclusives in the graphics department, but multiplats will go back to parity - not too dissimilar to the 360/PS3 scenario at the end of last gen.

#193 Posted by misterpmedia (3364 posts) -
#194 Edited by -CC- (1890 posts) -

me3x12 just pulled off a 4-page troll. He does not believe his own BS, but he is having a good time laughing at the people who are buying it. When every post caps words like "OWNED" and "FACTS", he is just pushing buttons and people are reacting. He knows his facts are all BS, but he wins this thread because he got what he wanted out of it.

#195 Posted by btk2k2 (364 posts) -

@tormentos: I never said the PS4 wouldn't improve. All games for all consoles get better as the generation progresses. My point is that I still believe the issues with the XB1 not hitting 1080p/60fps when the PS4 can is because of factors that can be corrected. Freeing up Kinect resources, optimizing the APIs, and developer familiarity will close the 3rd party gap. This, IMO, will mean that PS4 exclusives will be somewhat better than XB1 exclusives in the graphics department, but multiplats will go back to parity - not too dissimilar to the 360/PS3 scenario at the end of last gen.

You cannot fix hardware, and having only 16 ROPs and only 32MB of ESRAM means that hitting 1080p on the Xbox One requires compromises elsewhere in the game.

There is no way that the majority of multiplats will go back to parity; the gulf in performance between the consoles is too wide. The best I can see happening is effects parity but at 900p instead of 1080p, and I only think that will happen if games do not take advantage of GPGPU compute. If they do (for improved lighting effects, better physics, better game simulation, etc.), then I can see the gap staying at 720p vs 1080p so that the Xbox One has enough shader resources left to handle the compute workload. Remember, if a game requires 4 CUs of compute, then the relative difference in rendering performance swings even further in favour of the PS4: in that case it would be 8 CUs vs 14 CUs (75%), which is a larger gap than 12 CUs vs 18 CUs (50%). The arithmetic is sketched below for anyone who wants to check it.
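
Purely illustrative, using only the CU counts quoted above (nothing here is from any dev kit or official spec):

```cpp
// Illustrative arithmetic only: reserving CUs for GPGPU compute widens the
// relative rendering gap between the two consoles.
#include <cstdio>

int main()
{
    const double xb1Cus = 12.0, ps4Cus = 18.0;  // total shader CUs per console
    const double computeCus = 4.0;              // CUs assumed reserved for compute

    double gapAllRendering = (ps4Cus / xb1Cus - 1.0) * 100.0;                 // 18 vs 12
    double gapAfterCompute = ((ps4Cus - computeCus) / (xb1Cus - computeCus)
                              - 1.0) * 100.0;                                 // 14 vs 8

    std::printf("Gap with all CUs rendering: %.0f%%\n", gapAllRendering);     // 50%
    std::printf("Gap after reserving 4 CUs:  %.0f%%\n", gapAfterCompute);     // 75%
    return 0;
}
```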

#196 Posted by me3x12 (1765 posts) -

It's amazing how stupid people are. Take this and think on it: your hardware is only as good as the software that runs it. Think about that for a second. OK, yes, we know MS did not have their software tools and drivers anywhere near where they wanted them at launch. OK, here we are 4 months later and we get word that MS has new SDK kits going out to devs. One credible developer has already said that, from what he has seen, it's super fast and should have the Xbox One running 1080p in games comfortably. But apparently Cows have no knowledge of any of this, with some even spewing that the PS4 runs DX11.2. LMFAO at that one, because we all know the PS4 runs OpenGL, but anyway.

Again, your hardware is only as powerful as the software that runs it. MS has a decades-long track record of producing some of the best software tools in the world, and they have OWNED Sony in this, always, and it will be no different now. Yes, they are late to the party with the tools, but I promise you the next few dev kits that go out to developers are going to have the Xbox One running like a finely oiled machine and doing 1080p easily.

#197 Posted by zekere (2509 posts) -

Teh desperation...

#198 Posted by edwardecl (2142 posts) -

@me3x12: We will be sure to quote you on that one when it turns out to be only a minor improvement.

#199 Posted by AzatiS (7345 posts) -

@me3x12: What the heck are you talking about?!

If your car engine (the GPU) can give you a max of 100mph, you WON'T get to 200mph just because you use better fuel!! In order to achieve 1080p you need POWER, not patching! With patching alone you won't achieve it, and even if you do, you'll need to bring graphical quality down (shadows, textures, frames per second and more)...

Stop dreaming and face reality. The X1's GPU is bad, that's all.

#200 Posted by lostrib (34969 posts) -

@me3x12 said:

It's amazing how stupid people are. Take this and think on it: your hardware is only as good as the software that runs it. Think about that for a second. OK, yes, we know MS did not have their software tools and drivers anywhere near where they wanted them at launch. OK, here we are 4 months later and we get word that MS has new SDK kits going out to devs. One credible developer has already said that, from what he has seen, it's super fast and should have the Xbox One running 1080p in games comfortably. But apparently Cows have no knowledge of any of this, with some even spewing that the PS4 runs DX11.2. LMFAO at that one, because we all know the PS4 runs OpenGL, but anyway.

Again, your hardware is only as powerful as the software that runs it. MS has a decades-long track record of producing some of the best software tools in the world, and they have OWNED Sony in this, always, and it will be no different now. Yes, they are late to the party with the tools, but I promise you the next few dev kits that go out to developers are going to have the Xbox One running like a finely oiled machine and doing 1080p easily.

dat damage control