X1 will end up being the better 1080p system

This topic is locked from further discussion.

#351 Posted by MonsieurX (30405 posts) -

@me3x12 said:

@sts106mat said:

@tormentos said:

@acp_45 said:

@tormentos: LOL... "PRT are made to work on a single pool of Ram not 2"

Why do you think they chose ESRAM ?

Ok dude... AMD does have an extension for Tiled Resources...

remember I said "Although there are other things in play on PS4 side, I don’t want to go into it now."

I said that because I knew someone like you would take me on like that.... but it didn’t help because you probably didn’t read that part......did you ?

This Tiled Resources extension is on the bug list with tons of other API extensions, and it's the least important of them.... a bug can take anywhere from a day to a month to fix, and could even take longer.... Then they finally get to AMD_sparse_texture, I think it's called...... But by then you are going to have to wait at least an extra few months for it to make its way into the low-level shading language...

The only thing this extension has been used for is demo builds.

AMD has very few resources assigned to OpenGL development. The bug tracker and developer forums are half-abandoned.

AND WHEN DID I SAY PS4 WAS NOT STRONGER........... !!!!! ?

They chose ESRAM because they chose DDR3: they wanted 8GB of memory, and by the time the decision was made, 8GB of GDDR5 didn't have the yields and was too expensive. Specs are locked down almost 2 years before a console releases, sometimes even more; hell, Cell development started in 2001 and was ready by 2005.

The second aspect of PRT is managing the tiles. In essence PRT reduces local video memory to a very large cache, where tiles are mapped/pinned as necessary and then evicted as per the cache rules, and elsewhere the hardware handles page/tile translation should a tile not already be in the cache. Large tomes have been written on caching methods, and this aspect is of particular interest to AMD because what they learn about caching here they can apply to graphical workloads (i.e. professional) and not just gaming.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6
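The caching behaviour described in that passage can be sketched in a few lines of Python (a toy model with made-up tile IDs and a made-up capacity, not AMD's actual hardware logic):

```python
from collections import OrderedDict

class TilePool:
    """Toy model of PRT: local video memory acts as an LRU cache of texture tiles."""
    def __init__(self, capacity_tiles):
        self.capacity = capacity_tiles
        self.resident = OrderedDict()          # tile_id -> texel data

    def sample(self, tile_id):
        """Fetch a tile, mapping it in (and evicting another) on a miss."""
        if tile_id in self.resident:           # hit: page table already maps this tile
            self.resident.move_to_end(tile_id)
            return self.resident[tile_id]
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)  # evict the least-recently-used tile
        self.resident[tile_id] = f"texels:{tile_id}"  # stand-in for an upload from system RAM
        return self.resident[tile_id]

pool = TilePool(capacity_tiles=2)
pool.sample("A")
pool.sample("B")
pool.sample("A")                 # touch A so B becomes least recently used
pool.sample("C")                 # pool full: B gets evicted
print(sorted(pool.resident))     # ['A', 'C']
```

The eviction policy is the interesting part; real drivers track residency per 64KB page rather than per named tile, but the cache shape is the same.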

PRT, or Tiled Resources, is a GCN feature that predates the Xbox One: the 7970 launched in December 2011, and OpenGL supported the feature before MS did. PRT is also known as MegaTexture (Rage), which is what John Carmack used on Rage; he did it in OpenGL even without hardware support, and MegaTexture worked on both PS3 and Xbox 360, even though it was software-based.

The feature also works on PS4 because it is GCN. The notion of Tiled Resources not working on PS4 is as silly as the whole Xbox One scaler argument; that scaler is not better than the PS4's, in fact it is the same one all GCN chips use, but most of you don't even know that. And what about the Data Move Engines, also known as DMA engines, which are also on PS4?

By the way, textures alone will not make the Xbox One work better; Tiled Resources mean little. The Xbox One lacks power because it has fewer CUs and fewer stream processors, and half the ROPs of the PS4.

@sts106mat said:

@tormentos haha no. When you've played Ryse, then you can talk to me about it. Funny how most people I know on here are enjoying it.

Why do you ignore the FACT that Heil68 has killed way more barbarians in Ryse than anyone else on my friends list? Funny that... Sony champion, Cow extreme, yet he spent literally hours playing Ryse... a so-called terrible game (he himself described it as fun).

Keep the tears coming. RYSE >>>>> PS4 graphics (for now).

OH, I played a demo, I don't need more of it. Just like Knack was sh** too when I played it; at least I have the guts to admit it sucked.

Oh please, Ryse on PS4 would run better and look better, so bag your silly ass argument. The Order just pisses all over it, and the only thing it has on Killzone is character models; everywhere else Killzone beats it all the way, from more open spaces to effects, resolution and performance too.

Lol, what store runs a demo with a game featuring dismemberments and blood everywhere? LIAR LIAR PANTS ON FIRE

Yeah, I also played Knack at a store; it felt like a PS2 game, to be honest.

Knack is horrible, to be honest; the PS4 probably had the weakest launch lineup ever for a new system.

Both were terrible

#352 Edited by StormyJoe (5475 posts) -

@tormentos said:

@StormyJoe said:

@tormentos: If developers and publishers were so interested in "Not hurting MS's feelings", then COD: Ghosts would have been 720P/60FPS for both the PS4 and XB1.

As for "closing the gap", I suppose I can forgive your ignorance. I used to be a software developer (I moved on to DBA/management). I have my MCSD. And I can tell you first hand how improved SDKs and drivers can greatly improve the performance of a piece of software. You can deny it all you want, but it doesn't change the fact.

I am going to believe my own work experience and what other developers say "on the record" before some forum troll any day.

That is the problem: MS can dictate things on its console, not on the PS4. They tried that last gen with the PS3, forcing developers to have parity on disc, and would refuse any game that had extra content on Blu-ray; the same with DL games. This gen, apparently Sony bit them first, and I am sure there is a policy in the PS4 developer contract that games must take advantage of the PS4 hardware.

Sabotaging Ghosts on PS4 is a joke. Even if a developer likes MS and wants to be polite, this is business; one thing is saying "yeah, MS will catch up," and another is sabotaging your own game on purpose to please MS. In fact, that was one of the biggest worries many fans had, because MS already did that in the PS3 gen, when they tried to protect their inferior DVD format.

You can be MS's top programmer for all I care; you are a buffoon in my books. Denying the hardware difference between these two consoles is silly, period.

SDKs improve..... 100000000000% right you are. Now prove to me how only the Xbox One will improve its SDK and Sony will not, because that is the only way your argument can even begin to make sense. I already told you, and you ignored it, that SDKs DOOOOOOOOOO improve; they do. Sadly for your silly argument, the PS4 SDK will improve too; in fact, after 2 or 3 years the PS4 will start pushing heavy compute, which the Xbox One is not even modified to do.

Once again, prove to me that those developers are right. Show me the evidence that backs them up; where are the games that are equal on Xbox One and PS4 that are not racing games or sports games? ..lol

So now the gap between the PS4 and Xb1 is 1000000000000%?

Typical troll. I could give you irrefutable evidence and you would just ignore it because it ruins your little fantasy world. You said it yourself: you are right and everyone actually "in" the industry is wrong.

Pathetic.

#353 Posted by btk2k2 (401 posts) -

@StormyJoe said:

So now the gap between the PS4 and Xb1 is 1000000000000%?

Typical troll. I could give you irrefutable evidence and you would just ignore it because it ruins your little fantasy world. You said it yourself: you are right and everyone actually "in" the industry is wrong.

Pathetic.

Nice strawman.

He just said that the SDKs will improve but that applies to both consoles and not just the Xbox One.

When I was making my 40-50% predictions (later revised to 35-45% after the GPU clock bump), I was assuming that the ESRAM would be used in a way that gave it rough parity with the GDDR5 in the PS4. I did that not because I thought it was a realistic scenario, but because I did not want to overstate the PS4 advantage. As it happens, my 35-45% estimate has so far proven to be quite a cautious one, as the differences are actually greater than that in several instances.
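For reference, the raw shader-throughput gap being estimated here can be worked out from the commonly cited GPU figures (18 CUs at 800 MHz for the PS4, 12 CUs at 853 MHz for the Xbox One; treat these as reported numbers, not official confirmations):

```python
# Peak FLOPS = stream processors (64 per GCN CU) x 2 ops/cycle (FMA) x clock in GHz
ps4_gflops = 18 * 64 * 2 * 0.800   # 18 CUs at 800 MHz -> 1843.2 GFLOPS
xb1_gflops = 12 * 64 * 2 * 0.853   # 12 CUs at 853 MHz -> ~1310.2 GFLOPS

advantage = (ps4_gflops / xb1_gflops - 1) * 100
print(f"PS4 raw compute advantage: {advantage:.0f}%")  # ~41%, inside the 35-45% band
```

Raw FLOPS ignore ROPs, bandwidth, and fill rate, so this is a floor on the comparison, not the whole story.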

#354 Posted by StormyJoe (5475 posts) -

@btk2k2 said:


Nice strawman.

He just said that the SDKs will improve but that applies to both consoles and not just the Xbox One.

When I was making my 40-50% predictions (later revised to 35-45% after the GPU clock bump), I was assuming that the ESRAM would be used in a way that gave it rough parity with the GDDR5 in the PS4. I did that not because I thought it was a realistic scenario, but because I did not want to overstate the PS4 advantage. As it happens, my 35-45% estimate has so far proven to be quite a cautious one, as the differences are actually greater than that in several instances.

No, that's not what he said, and I did not misinterpret his responses. Tormentos is convinced that there is absolutely nothing that will ever create parity between the PS4 and XB1, regardless of what every industry expert says to the contrary.

#355 Edited by xboxiphoneps3 (2376 posts) -

@me3x12 said:

This is a very viable scenario. Here's the deal: the drivers for the X1 are way late, new SDK kits are going out really soon, and they're already being said to be really fast, making 1080p on X1 really easy to achieve.

Apparently these new kits also give developers the tools they need to leverage the ESRAM in ways they had no clue about.

Also, MS will be dropping DX11.2 on X1 soon, and when that happens you can store up to 6GB of tiled textures using the ESRAM alone. That's massive and will also make it really easy to achieve 1080p.

This is all true, don't try to downplay it. When this happens, and it's going to happen really soon, the reality is that the X1 might in FACT flip the table and be the more capable box, running full 1080p at 60fps.

There is a lot of crying coming; gonna need lots of tissues soon.

http://www.digitalspy.com/gaming/news/a550445/rebellion-xbox-one-1080p-issues-can-be-fixed.html

http://misterxmedia.livejournal.com/153756.html

http://gamingbolt.com/why-xbox-ones-esram-feels-limited-despite-potential-to-store-6gb-of-tiled-textures-using-dx-11-2

Sorry bro, but the Xbox One is weaker than the PS4 and will never be able to output graphics as good as the PS4's. Tiled Resources isn't exclusive to the Xbox One; it's called AMD PRT. So wow, it really sucks that the Xbox One only has 32 MB of "super fast" RAM; more than half the Xbox One's system bandwidth is locked up in a measly 32 MB of memory.... Devs are already complaining that the 32 MB of eSRAM is low, and it really does suck that the Xbox One runs games worse than the PS4 even at lower resolutions. That's what you get for sticking a 7770 in there; it's just simply not very strong.

You can store even more textures on the PS4 than on the Xbox One due to having more available, faster RAM. So wow, PS4 games are going to look phenomenal with monster texture usage and sizes from tiled resources. That is incredible, wow....
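For context on the "6GB of tiled textures" claim in the quoted post: tiled resources separate a texture's virtual size from its resident footprint, so only the visible 64KB tiles need physical backing. Rough arithmetic (the texture size and visibility fraction below are illustrative assumptions):

```python
TILE = 64 * 1024                        # D3D tiled resources use 64KB tiles
ESRAM = 32 * 1024 * 1024                # the Xbox One's fast on-die pool

pool_tiles = ESRAM // TILE              # only 512 tiles even if the whole pool held textures
print(pool_tiles)

# A 16384 x 16384 BC3-compressed texture (~1 byte/texel) is 256MB of *virtual* space...
virtual_bytes = 16384 * 16384
print(virtual_bytes // (1024 * 1024), "MB virtual")

# ...but if only ~5% of its tiles are actually visible in a given frame:
resident = int(0.05 * virtual_bytes / TILE)
print(resident, "tiles ->", resident * TILE // (1024 * 1024), "MB resident")
```

So the 6GB figure describes addressable virtual texture space, not storage; and as a developer later in the thread points out, the frame buffer competes for that same 32MB.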

#356 Posted by dalger21 (1165 posts) -

I actually feel pretty dirty from having read this entire thread. Now I know why I don't bother to come to SW. @Wasdie had some of the better arguments here, but it was all drowned out by the absolute ridiculousness of juvenile banter. At the end of the day, what difference will it really make? I have a PS4 and bought it because I wanted to. I don't understand the notion that one system HAS to be better than the other one. But then again, I guess you have to have a place for all the fanboys of the different systems to duke it out. Also, I guess I'm old, because cows and lems are the new terms used nowadays. Craziness.... carry on with it, kiddos.

#357 Edited by xboxiphoneps3 (2376 posts) -

@StormyJoe said:

@tormentos said:

@StormyJoe said:

@tormentos: For the cheap seats: No developer agrees with you. Go find some links... otherwise, you are regulated to "dummy" status.

I couldn't give a fu** about what a developer says...

BF4: 720p and 10 FPS slower on Xbox One; 900p and 10 FPS faster on PS4.

Ghosts: 720p on Xbox One; 1080p on PS4.

AC4: 900p on Xbox One with horrible AA problems and worse image quality; 1080p on PS4.

Tomb Raider: up to 30 FPS slower on Xbox One, lower quality textures, alpha-based effects at half the resolution of the PS4 version, lower quality depth of field, reduced levels of anisotropic filtering, 900p cut scenes.

This ^^^ is by far the biggest gap yet, and it shows all the elements I once told you would happen, from frame rate differences to lower quality effects; not just a few frames, but 20 FPS slower on average and up to 30 FPS in many parts. Did I mention that Tomb Raider is a damn DX game? It should have been way easier to port to the Xbox One than to the PS4.

How much do you think a damn 8% will increase performance? It's incredible how you believe that an 8% unlock will get the Xbox One to perform to a point where it won't matter, yet somehow you think that a 40% difference between the PS4 and Xbox One is small. You don't know sh** about hardware; you made some sh** ass prediction that until now hasn't held.

Oh, did I forget MGS? 720p 60 FPS on Xbox One, 1080p 60 FPS on PS4, and the PS4 version has real weather simulation.

Hold tight to what some polite developers are saying; there is a gap in power between the PS4 and Xbox One that is impossible to close no matter how much MS optimizes.
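For what it's worth, the resolution figures being argued over translate into raw pixel counts like this:

```python
def pixels(width, height):
    return width * height

# Ghosts: 1080p on PS4 vs 720p on Xbox One
ratio_1080_720 = pixels(1920, 1080) / pixels(1280, 720)
# AC4: 1080p on PS4 vs 900p on Xbox One
ratio_1080_900 = pixels(1920, 1080) / pixels(1600, 900)

print(ratio_1080_720)   # 2.25x the pixels per frame
print(ratio_1080_900)   # 1.44x the pixels per frame
```

Pixel count is only one axis of image quality, but it is the one these resolution comparisons are actually measuring.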

@StormyJoe said:

@tormentos: You keep posting the same stupid things. Is that because you can't find a single article where a game developer agrees with you? I can find a whole lot of articles that agree with me.

Owning you is getting so easy.

All the proof I need is in the games. Prove me wrong and show me how the Xbox One runs demanding games even close to the PS4..lol

http://www.anandtech.com/bench/product/778?vs=777

But but, magic drivers and an 8% GPU unlock will bring parity...lol

Yeah, like that ever happened with the 7770 and 7850...lol

HA HA HA HA HA!!! So, what you are saying is that everyone should listen to you, a half-truth-speaking, over-the-top, super moo-cow-troll, instead of people who are in the industry and actually developing game software.

WOW! Thanks for reminding me why I normally ignore you. LOL!!!

Wow, and I thought lemmings couldn't get any more delusional.... It's funny, though, because Tormentos is completely right: why does just about every multiplat run better on the PS4, with more effects and higher resolution and FPS? GIVE UP LEMMINGS, THE XBOX ONE IS WEAKER; MORE THAN HALF ITS SYSTEM BANDWIDTH IS LOCKED UP IN A MEASLY 32 MB OF RAM, 32 MEGABYTES

Lemmings have nothing to say about just about every multiplat being noticeably superior on the PS4, with higher FPS, higher resolution, more effects and more stable framerates. What a shame; Microsoft has been outclassed by Sony in the hardware department. The PS4 runs cooler than the Xbox One, is slimmer and smaller, and doesn't have a power brick like the Xbox One. These are facts no one can deny.

#358 Edited by Krelian-co (11155 posts) -

@StormyJoe said:


No, that's not what he said, and I did not misinterpret his responses. Tormentos is convinced that there is absolutely nothing that will ever create parity between the PS4 and XB1, regardless of what every industry expert says to the contrary.

Another dumb post by stormyclown.

a. If SDKs improve the Xbone, they will improve the PS4 too, but I guess as a lem you like to ignore that fact. Oh right, you are "not" a lem anymore, hahahah.

b. Every industry expert says there will be parity between PS4 and Xbone? Hahaha, lol, making up BS as usual.

#359 Posted by me3x12 (1765 posts) -

@MonsieurX said:


Both were terrible

Not really; the Xbox One had a strong lineup with Forza 5, Ryse, Killer Instinct and Dead Rising 3. That might be the best launch ever for a game system.

#360 Posted by MonsieurX (30405 posts) -

@me3x12 said:


Not really; the Xbox One had a strong lineup with Forza 5, Ryse, Killer Instinct and Dead Rising 3. That might be the best launch ever for a game system.

You call that a strong line up?

LOL

#361 Posted by TheShensolidus (224 posts) -

Wow, tons of wheel spinning in this thread. For a lot of you, this is going to be an endless cycle of rationalization and debate that you'll be having all gen long. We have games that have been released, that are being released, and that will be releasing in both the short term and the long term that point to this power difference, yet you still hang onto some hope that it just isn't true, or that there is some misunderstood logic behind it all, when the reality is that the simplest answer is the one that is true.

Firstly - yes, we did get a new XDK (about 2 weeks ago, in fact). Some of you are implying these are new dev kits - wrong; it's a software update. We're currently on the March update with the "new & improved" party/friends system. As has been said before, the major difference this XDK has given us is that it freed up 8% of the originally 10% Kinect-reserved GPU cycles (i.e. where 10 cycles out of 100 were taken, now only 2 are), meaning our draw calls have theoretically been improved (we're still experimenting with it). What I will say it is helping us with across the board is FPS stability, which has been an issue many of you have noted in released games.
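A quick sanity check on that reservation change (assuming the 10% and 2% figures quoted above): freeing 8 percentage points of reserved GPU time buys roughly 9% more cycles per frame, not 8% more performance outright.

```python
before = 1.0 - 0.10     # 90% of GPU time available with the 10% Kinect reservation
after = 1.0 - 0.02      # 98% available after the XDK update
gain = (after / before - 1) * 100
print(f"{gain:.1f}% more GPU time per frame")  # 8.9%
```

That extra GPU time only translates into frame rate if the game was GPU-bound in the first place, which is consistent with the "FPS stability" observation above.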

Don't go thinking it's going to make past games run better, or that future development on the platform is going to be "omg, all da powah!" in terms of resources or potential fidelity, because at the end of the day our biggest limiting factor is still absolutely the hardware, and nothing will EVER change that. Yes, drivers and software will become more streamlined and certain tricks to maximize the hardware will be understood, but even if we changed the development environment to the most streamlined available (something which it certainly IS NOT right now), we would still be held by the reins of the physical hardware.

Secondly - I have heard this sentiment from other developers over the last few months, and it's something we don't quite understand. What, exactly, do we need to understand about eSRAM? I know many a programmer who has taken it as an insult to see comments saying "devs don't understand how to utilize it", like it's way over our heads. Newsflash: it's a 32MB memory pool. We've been working with memory constraints since the dawn of technology; there is nothing magical about it because it has a different name or because of its bus speed. Yes, it does suck that we have such a limited amount of "the fast RAM", and yes, it is causing us headaches. While people will throw around DirectX 11.2 (ugh) and Tiled Resources as some foresighted use case for eSRAM, the fact is that developers are always going to use those 32MB of fast memory for the frame buffer; we pretty much have to, for a number of reasons.

Having to utilize the "fast" memory in such limited quantity is why we see things like longer load times, longer install times, fewer effects and worse aliasing in multiplatform titles on the system. If we had 128MB of eSRAM, many of our problems would be greatly alleviated in the short term. However, there would STILL be a power difference between the two platforms; you just would have started seeing the pull-away in year 2 or 3 instead of year 1.
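Back-of-the-envelope numbers behind the frame-buffer point (the render-target formats below are illustrative choices; real engines vary):

```python
W, H = 1920, 1080
MiB = 1024 * 1024

rgba8_color = W * H * 4      # 32-bit LDR color target
depth = W * H * 4            # 32-bit packed depth/stencil
fp16_color = W * H * 8       # 64-bit HDR color target

basic = (rgba8_color + depth) / MiB
hdr = (fp16_color + depth) / MiB
print(f"RGBA8 color + depth: {basic:.1f} MiB")  # 15.8 MiB
print(f"FP16 color + depth:  {hdr:.1f} MiB")    # 23.7 MiB, before any G-buffer or MSAA
```

Even a plain 1080p HDR setup takes roughly three quarters of the 32MB, which is why deferred renderers with fat G-buffers end up dropping resolution or spilling targets into slower DDR3.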

And I'll leave off with this - what does it REALLY matter what the system can do? No matter what, it's going to do the one principal thing that you purchased it for, and that is play great games. Okay, fine, multiplats will run better across the board on the opposing machine, but that doesn't mean you can't enjoy the games. PS3 owners went an entire generation getting their teeth kicked in with shoddy ports, and it's still continuing, but they still had amazing gaming experiences. And that is why you purchased this console: to play great games. Don't take up the banner of supporting a console just because you want it to be the strongest so you can win debates. No, take up the banner of playing the console you prefer because you prefer it, and who the hell cares what other people think. You're your own person with your own opinion; never feel you have to justify your purchase to random internet debaters. As long as you are getting the experience you want from the console you purchased, who the hell should have anything to say contrary to your experience and opinion?

#362 Edited by me3x12 (1765 posts) -

@theshensolidus said:

Wow, tons of wheel spinning in this thread. For a lot of you, this is going to be an endless level of rationalization and debate you'll be having all gen long. We have both games that have been released, that are being released, and that will be releasing both in the short-term and far-term that points to this power difference, yet you still hang onto some hope that it just isn't true or there is some misunderstood logic behind it all, when the reality is the simplest answer is the one that is definitely true.

Firstly - yes, we did get a new XDK (about 2 weeks ago in-fact). Some of you are implying these are new dev kits - wrong; its a software update. We're currently on the March update with the "new & improved" party/friends system. As has been said before, the major difference this XDK has given us is it freed up 8% of the originally 10% Kinect reserved GPU cycles (i.e. 10 processes out of a 100 were taken, it would now only be 2), meaning our drawcalls have theoretically been improved (we're still experimenting with it). What I will say this is helping us with across the board is FPS stability, which has been an issue many of you have noted in games released.

Don't go thinking it's going to make past games run better or future games development on the platform is going to be "omg, all da powah!" in terms of resources or potential fidelity, because at the end of the day our biggest limiting factor is still absolutely hardware, and nothing will EVER change that. Yes, drivers and software will become more streamlined and certain tricks to maximize the hardware will be understood, but even if we changed the development environment to the most streamlined available (something which it certainly IS NOT right now), we are still going to be held by the reigns of physical hardware.

Secondly - I have heard this sentiment from other developers over the last few months, and its something we don't quite understand. What, exactly, do we need to understand about eSRAM? I know many a programmer that have taken it as an insult to see comments saying "Devs don't understand how to utilize it", like it's way over our heads. Newsflash- its a 32mb memory pool. We've been working with memory constraints since the dawn of technology, there is nothing magical about it because it has a different name or because of its bus speed. Yes, it does suck that we have such a limited amount of 'the fast RAM', and yes it is causing us headaches. While people will throw around DirectX 11.2 (ugh) and Tiled Resources as some foresighted use case for eSRAM, the fact is that developers are always going to use those 32mbs of fast memory for our frame buffer, we pretty much have to for a number of reasons.

Having to utilize the 'fast' memory in such limited quantity is why we see things like longer load times, longer install times, fewer effects, and worse aliasing in multiplatform titles on the system. If we had 128 MB of eSRAM, many of our problems would be greatly alleviated in the short term. However, there would STILL be a power difference between the two platforms; you just would have started seeing the pull away in year 2 or 3 instead of year 1.

And I'll leave off with this - what does it REALLY matter what the system can do? No matter what, it's going to do the one principal thing you purchased it for, and that is play great games. Okay, fine, multiplats will run better across the board on an opposing machine, but that doesn't mean you can't enjoy the games. PS3 owners went an entire generation getting their teeth kicked in with shitty ports, and it's still continuing, but they still had amazing gaming experiences. And that is why you purchased this console: to play great games. Don't take up the banner of supporting a console just because you want it to be the strongest so you can win debates. No, take up the banner of playing the gaming console you prefer because you prefer it, and who the hell cares what other people think. You're your own person with your own opinion; never feel you have to justify your purchase or support to random internet debaters. As long as you are getting the experience you want from the console you purchased, who the hell should have anything to say contrary to your experience and opinion.

Name yourself scoundrel?

#363 Posted by StormyJoe (5475 posts) -

@xboxiphoneps3 said:

@StormyJoe said:

@tormentos said:

@StormyJoe said:

@tormentos: For the cheap seats: No developer agrees with you. Go find some links... otherwise, you are relegated to "dummy" status.

I could give a fu** about what a developer says...

BF4: 720p and 10 FPS slower on Xbox One; 900p and 10 FPS faster on PS4.

Ghosts: 720p on Xbox One, 1080p on PS4.

AC4: 900p on Xbox One with horrible AA problems and worse image quality; 1080p on PS4.

Tomb Raider: up to 30 FPS slower on Xbox One, lower quality textures, alpha-based effects at half the resolution of the PS4 version, lower quality depth of field, reduced levels of anisotropic filtering, 900p cut scenes.

This ^^^ is by far the biggest gap yet, and shows all the elements I once told you would happen, from frame rate differences to lower quality effects - and not just a few frames, but 20 FPS slower on average and up to 30 FPS slower in many parts. Did I mention that Tomb Raider is a damn DX game? It should have been easier to port to the Xbox One than to the PS4.

How much do you think a damn 8% will increase performance? It's incredible that you believe an 8% unlock will get the Xbox One to perform to a point where it won't matter, but somehow you think that a 40% difference between the PS4 and Xbox One is small. You don't know sh** about hardware; you made some sh** ass prediction that until now hasn't held.

Oh, did I forget MGS: 720p 60 FPS on Xbox One, 1080p 60 FPS on PS4, and the PS4 version has real weather simulation?

Hold tight to what some polite developers are saying: there is a gap in power between the PS4 and Xbox One that is impossible to close no matter how much MS optimizes.

@StormyJoe said:

@tormentos: You keep posting the same stupid things. Is that because you can't find a single article where a game developer agrees with you? I can find a whole lot of articles that agree with me.

Owning you is getting so easy.

All the proof I need is in the games; prove me wrong and show me how the Xbox One runs demanding games even close to the PS4..lol

http://www.anandtech.com/bench/product/778?vs=777

But but magic drivers and 8% GPU will bring parity...lol

Yeah, like that ever happened with the 7770 and 7850...lol

HA HA HA HA HA!!! So, what you are saying, is that everyone should listen to you, a half-truth speaking, over the top, super moo-cow-troll instead of people who are in the industry and actually developing game software.

WOW! Thanks for reminding me why I normally ignore you. LOL!!!

wow, and I thought lemmings couldn't get any more delusional.... it's funny though, because Tormentos is completely right: why does just about every multiplat run better on the PS4, with more effects and higher resolution and FPS? GIVE UP LEMMINGS, THE XBOX ONE IS WEAKER. MORE THAN HALF ITS SYSTEM BANDWIDTH IS LOCKED UP IN A MEASLY 32 MB OF RAM, 32 MEGABYTES

lemmings have nothing to say about just about every multiplat being noticeably superior on the PS4, with higher FPS, higher resolution, more effects, and more stable framerates. What a shame; Microsoft has been outclassed by Sony in the hardware department. It runs cooler than the Xbox One, is slimmer and smaller, and doesn't have a power brick like the Xbox One; these are facts no one can deny

Oh look, another person who completely ignored everything I (and every interviewed developer) have said, and is running around yelling "The PS4 is more powerful, the PS4 is more powerful".

Cows gonna cow...

#364 Edited by StormyJoe (5475 posts) -

@Krelian-co said:

@StormyJoe said:

@btk2k2 said:

@StormyJoe said:

So now the gap between the PS4 and Xb1 is 1000000000000%?

Typical troll. I could give you irrefutable evidence and you would just ignore it because it ruins your little fantasy world. You said it yourself: you are right and everyone actually "in" the industry is wrong.

Pathetic.

Nice strawman.

He just said that the SDKs will improve but that applies to both consoles and not just the Xbox One.

When I was making my 40-50% predictions (later revised to 35-45% after the GPU clock bump), I assumed the ESRAM would be used in a way that gave it rough parity with the GDDR5 in the PS4. I did that not because I thought it was a realistic scenario, but because I did not want to overstate the PS4 advantage. As it happens, my 35-45% estimate has so far proven to be quite a cautious one, as the differences have actually been greater than that in several instances.
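
The raw-throughput basis for that 40-50% range can be reproduced from the public GCN specs (18 CUs at 800 MHz for the PS4 versus 12 CUs at 853 MHz for the Xbox One after its clock bump); a quick sketch:

```python
# GCN peak shader throughput: 64 ALUs per CU, 2 FP32 ops (FMA) per clock.
def peak_gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

ps4 = peak_gflops(18, 800)   # 1843.2 GFLOPS
xb1 = peak_gflops(12, 853)   # 1310.2 GFLOPS
print(ps4 / xb1 - 1.0)       # ~0.41 -> roughly a 41% raw shader advantage
```

Note this ignores ROP count, bandwidth, and real-world utilization, which is exactly why estimates of the practical gap vary.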

No, that's not what he said, and I did not misinterpret his responses. Tormentos is convinced that there is absolutely nothing that will ever create parity between the PS4 and XB1, regardless of what every industry expert says to the contrary.

another dumb post by stormyclown.

a. if sdks improve xbone, they will improve ps4 too, but i guess as a lem you like to ignore that fact, oh right you are "not" a lem anymore, hahahah.

b. every industry expert says there will be parity between ps4 and xbone? hahaha lol making bs as usual.

a) The XB1 has its own version of DirectX, not shared with the PS4

b) go read.

Why do you keep posting? Glutton for punishment?

#365 Edited by ronvalencia (15129 posts) -
@btk2k2 said:
@ronvalencia said:

http://www.dualshockers.com/2014/02/06/the-order-1886-will-have-4x-aa-undecided-between-1920x800-and-1920x1080-lack-of-multiplayer-expained/

"To be clear, x800 with 4xMSAA needs more bandwidth than x1080 would, so 1080 no MS would be cheaper."

-------

The choice is 1920x800 with 4x MSAA or 1920x1080 with no MSAA. The limit of 1920x800 is due to performance issues.

That seems like jumping to a conclusion. Nowhere does it say that 1080p with 4x MSAA is not possible, and it also states that 1920x800 with 4x MSAA is more bandwidth intensive than 1080p with no MSAA, so if they were at the edge of the performance envelope with 1920x800 4x MSAA then it is easily possible that 1080p with 2x MSAA would also work just fine for their frame rate target.

Andrea stated "so 1080 no MS would be cheaper", i.e. no MSAA at 1080p. You are adding 2x MSAA, while Andrea's post clearly states no MSAA at 1080p.
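
Andrea's bandwidth claim checks out with simple sample arithmetic: MSAA stores several color/depth samples per pixel, and total sample count is a rough proxy for framebuffer bandwidth (ignoring compression). A quick sketch:

```python
# Samples written per frame, as a crude bandwidth proxy.
samples_800p_4x = 1920 * 800 * 4    # 4x MSAA at 1920x800
samples_1080p_0x = 1920 * 1080 * 1  # no MSAA at 1920x1080
print(samples_800p_4x / samples_1080p_0x)  # ~2.96x the samples
```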

@tormentos

PRT doesn't use DDR3 to store those textures; Anandtech was very clear, it uses your own video memory as the cache. I just destroyed your argument, period. Sony even lists PRT among the things the PS4 does. You are being a blind MS fanboy, period, and it's time you really drop it; Sony confirmed it and Anandtech confirmed it too, in an article from 2011 on GCN. Hell, DX didn't even support PRT back then..lol

The textures are held on the disc or HDD and streamed as needed; having 2 memory pools is irrelevant, because PRT is made for GCN, and on PC DDR3 has far slower speed and data transfer than GDDR5. Once again, for all intents and purposes the GPU will see 2 memory banks, one it can access and one it can't; even if the system partition were used for those textures, they would still load faster than they would across 2 different memory setups.

I already proved what I needed to: PRT works on PS4, and MS had to build support for it on Xbox One because of the ESRAM, which no GCN has. I told you that like 20 times: no GCN has ESRAM, and PRT works on a single memory pool, your video memory.

You had better read the basics of caching in computing: http://en.wikipedia.org/wiki/Cache_(computing)

It's clear you don't know the purpose of having the cache in the first place.

To minimize latency, apply the "cache" concepts to GCN's PRT as follows:

1. "Cache memory" is replaced by the GPU's faster memory pool, which runs faster than DDR3.

2. The CPU is replaced by the GPU.

http://www.britannica.com/EBchecked/topic/87789/cache-memory

cache memory, also called Cache, a supplementary memory system that temporarily stores frequently used instructions and data for quicker processing by the central processor of a computer. The cache augments, and is an extension of, a computer’s main memory

--------------

AMD PRT turns the faster VRAM into a "cache" against slower memory types, e.g. DDR3 in the X1's and gaming PCs' situation. The PS4 doesn't have a larger/slower memory pool to cache against.

Against your assertion that "the GPU will see 2 memory banks, one it can access and one it can't", read about AMD PRT at http://www.tinysg.de/techGuides/tg4_prt.html

The Southern Islands graphic chips of Radeon 7000 series boards and FirePro W-series Workstation cards support a new OpenGL extension called AMD_sparse_texture that allows to download only parts of a texture image (tiles), rather than providing the entire image all at once. The driver will commit memory only for the downloaded tiles, and the application may decide to unload tiles again if they are no longer needed. This way, the graphics memory can serve as a texture tile cache, indexing parts of vast textures in main memory or even on disk.
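
The tile-cache behaviour that extension describes can be sketched as a small bounded cache with least-recently-used eviction. This is an illustrative model only: the `TileCache` class and `load_tile` callback are hypothetical, and real drivers track residency through hardware page tables rather than Python dictionaries:

```python
from collections import OrderedDict

class TileCache:
    """Bounded cache of texture tiles with LRU eviction."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.tiles = OrderedDict()          # tile_id -> tile data

    def fetch(self, tile_id, load_tile):
        if tile_id in self.tiles:           # resident: mark most recently used
            self.tiles.move_to_end(tile_id)
            return self.tiles[tile_id]
        if len(self.tiles) >= self.capacity:
            self.tiles.popitem(last=False)  # evict least recently used tile
        data = load_tile(tile_id)           # stream from main memory or disk
        self.tiles[tile_id] = data
        return data

cache = TileCache(capacity=2)
cache.fetch("A", str.lower)
cache.fetch("B", str.lower)
cache.fetch("A", str.lower)   # hit: "B" becomes least recently used
cache.fetch("C", str.lower)   # miss: evicts "B"
print(list(cache.tiles))      # ['A', 'C']
```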

AMD's GCN driver from the Radeon HD 7970 era allows the GPU to access main memory. This vendor-specific feature was replicated in PC DirectX 11.2 and Xbox One's DirectX 11.X superset extensions, i.e. the Xbox One doesn't need Mantle.

You are wrong in your claim that the GPU is unable to access main memory, i.e. the driver will provide the access.

The OpenGL vs DirectX argument over PRT's support release date is just a red herring, and you should know PS4's primary graphics API is not OpenGL.

From http://www.officialplaystationmagazine.co.uk/2013/10/31/ps4-uses-a-5400-rpm-sata-ii-hard-drive-users-can-install-anything-less-than-9-5mm-larger-than-160gb/

PS4's standard disk is a slow 5400 RPM laptop type, LOL. It has massive latency if one uses this type of HDD for texture cache storage. LOL. Current PC workstations have faster storage devices than a slow 5400 RPM laptop HDD.

The fundamental lesson of AMD PRT and Tiled Resources is to keep the GPU's TMUs operating at peak efficiency within the constraints of its fastest memory storage.

I didn't state that PS4 doesn't have PRT functionality, but the gain from PRT for PCs and X1s is larger than for PS4's version. For X1, PRT is part of the requirement for max efficiency, while for the PC it is an option when most runtime game textures fit within 2 GB of VRAM in 1080p scenarios. For PS4, PRT is nearly pointless, since it has 5 to 6 GB of VRAM, which can easily keep its TMUs running at max efficiency within the constraints of its fastest memory storage.

----------------------------------

At 1920x1080, any flagship PC GPU ASIC from 2012 with flagship PC video memory bandwidth can apply 4x MSAA. PC performance scales with the user's budget.

Note that MSAA is not a silver bullet when it comes to non-edge geometry, hence why you have a combo solution.

#366 Edited by tyloss (829 posts) -

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles' power 100% efficiently.

We've already seen this in numerous multiplats.

#367 Edited by StormyJoe (5475 posts) -

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles' power 100% efficiently.

We've already seen this in numerous multiplats.

Read the developer interviews.

#368 Posted by delta3074 (18406 posts) -

@tormentos said:

@delta3074 said:

At the same time, GTA V won on PS3, while Assassin's Creed: Black Flag, Call of Duty: Ghosts, Batman: Arkham Origins, and FIFA 14 won on Xbox 360. Compared to GTA V and Battlefield 4, that's 4 games versus 2, so even at the end of the gen the majority of multiplats performed better on the 360.

The position is obviously reversed this generation, but you have to concede that last gen the 360 was the console to own for multiplats and had the most high-rated, high-quality games.

The PS4 this gen will be what the Xbox 360 was last gen: the best console to own for multiplats.

And flops are not the be-all and end-all of determining a console's performance. The PS3 may have had more flops, but the 360 had more usable RAM and an extra 10 MB of eDRAM with a massive 256 GB/s of bandwidth, not to mention a unified shader architecture which Cell could not emulate.

Obviously the PS4 is the more powerful of the consoles this generation, but don't kid yourself: the 360 was an equal match for the PS3 in both hardware and games.

You can have all the flops in the world but without the RAM it doesn't mean anything.

And it's funny, because GTA 5 is much bigger in scale than any of those games in which the 360 won, and BF4 is also bigger in scale than any of the shooters you mention.

Now compare that to the Xbox One, where no multiplatform game performs better on Xbox One; hell, only sports and racing games perform close, and that's because they are barely demanding.

It is reversed this gen; the only difference is that this gen MS will not catch up, because the hardware disparity is too big. The PS3 required too many resources for developers to actually make those versions even on par with the 360, let alone superior; the Xbox One benefits from not being as hard to develop for, but then again it is not powerful enough to keep up the way the 360 did.

The Xbox 360 had 18 MB more usable RAM after the OS footprint was reduced to 50 MB; the PS3 also benefited from not having to constantly decompress data, because Blu-ray was big enough to hold uncompressed textures, which tended to give the PS3 the edge when it comes to textures even though it had 18 MB less RAM.

The Xbox 360's eDRAM was basically irrelevant, because the data only traveled at that speed inside the eDRAM itself; the link feeding the eDRAM was far slower than 256 GB/s, so a bottleneck was created. MS also miscalculated when implementing eDRAM, because they believed 10 MB was enough for 720p with 4xAA, which wasn't the case; to achieve that, developers had to tile, which was a problem and required more time and resources, which is the reason many Xbox 360 games don't even have 2xAA.
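
The 10 MB arithmetic behind that tiling point is straightforward; a quick sketch assuming 32-bit color and 32-bit depth at 4x MSAA (the format choice is illustrative):

```python
# Xbox 360 eDRAM budget check: 720p back buffer with 4x MSAA.
width, height = 1280, 720
bytes_per_sample = 4
msaa = 4
color = width * height * bytes_per_sample * msaa
depth = width * height * bytes_per_sample * msaa
total_mb = (color + depth) / (1024 * 1024)
print(total_mb)  # 28.125 MB -- nearly 3x the 10 MB of eDRAM, hence tiling
```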

It's not about flops alone; the PS3 was just a few flops over the Xbox 360, and that was due to Cell being ~200 GFLOPS. The Xbox 360 had unified shaders, while the PS3 had a CPU that could run GPU tasks and post-processing that on the Xbox 360 were basically handled by its GPU, which the 360 could not emulate either. Remember how Cell worked like an Ageia physics card? The Xbox 360's CPU could not do that, so for those physics to run well they would have to run on the GPU.

The Xbox 360 just had a few more MB of RAM than the PS3, nothing big; it clearly wasn't enough to stop the PS3 from matching and surpassing the 360, as the games have already proved.

I agree to some degree, especially on the physics part, but you have still not factored in the unified shader architecture, which allowed shader operations at half the cost and could not be emulated on the PS3, plus the fact that Xenos was capable of some DX10 routines, not to mention the PS3's lack of a tessellator API and its CPU's complete lack of branch predictors.

The problem is we could trade all day and we would just end up with a list of things that one console has and the other doesn't. Bottom line is that Carmack was right: both consoles are so close in power it's not even worth mentioning.

Also, interesting what you said about the 360 having to decompress data, because the decompression is done in real time by dedicated hardware; decompression speeds also had a lot to do with whether you had a HDD or not.

#369 Posted by ReadingRainbow4 (14396 posts) -

Lol plants vs zombies.

#370 Posted by Dreams-Visions (26569 posts) -

@theshensolidus said:

Wow, tons of wheel spinning in this thread. For a lot of you, this is going to be an endless cycle of rationalization and debate you'll be having all gen long. We have games already released, being released, and releasing in both the short term and the long term that point to this power difference, yet you still hang onto some hope that it just isn't true or that there is some misunderstood logic behind it all, when the reality is that the simplest answer is the one that is definitely true.

Firstly - yes, we did get a new XDK (about 2 weeks ago, in fact). Some of you are implying these are new dev kits - wrong; it's a software update. We're currently on the March update with the "new & improved" party/friends system. As has been said before, the major difference this XDK has given us is that it freed up 8% of the originally 10% Kinect-reserved GPU cycles (i.e. where 10 cycles out of 100 were taken, now only 2 are), meaning our draw calls have theoretically been improved (we're still experimenting with it). What I will say is that it's helping us across the board with FPS stability, which has been an issue many of you have noted in released games.

Don't go thinking it's going to make past games run better, or that future game development on the platform is going to be "omg, all da powah!" in terms of resources or potential fidelity, because at the end of the day our biggest limiting factor is still absolutely the hardware, and nothing will EVER change that. Yes, drivers and software will become more streamlined and certain tricks to maximize the hardware will be understood, but even if we changed the development environment to the most streamlined available (something it certainly IS NOT right now), we would still be held by the reins of the physical hardware.

Secondly - I have heard this sentiment from other developers over the last few months, and it's something we don't quite understand. What, exactly, do we need to understand about eSRAM? I know many a programmer who has taken it as an insult to see comments saying "devs don't understand how to utilize it", like it's way over our heads. Newsflash: it's a 32 MB memory pool. We've been working with memory constraints since the dawn of technology; there is nothing magical about it because it has a different name or because of its bus speed. Yes, it does suck that we have such a limited amount of 'the fast RAM', and yes, it is causing us headaches. While people will throw around DirectX 11.2 (ugh) and Tiled Resources as some foresighted use case for eSRAM, the fact is that developers are always going to use those 32 MB of fast memory for the frame buffer; we pretty much have to, for a number of reasons.

Having to utilize the 'fast' memory in such limited quantity is why we see things like longer load times, longer install times, fewer effects, and worse aliasing in multiplatform titles on the system. If we had 128 MB of eSRAM, many of our problems would be greatly alleviated in the short term. However, there would STILL be a power difference between the two platforms; you just would have started seeing the pull away in year 2 or 3 instead of year 1.

And I'll leave off with this - what does it REALLY matter what the system can do? No matter what, it's going to do the one principal thing you purchased it for, and that is play great games. Okay, fine, multiplats will run better across the board on an opposing machine, but that doesn't mean you can't enjoy the games. PS3 owners went an entire generation getting their teeth kicked in with shitty ports, and it's still continuing, but they still had amazing gaming experiences. And that is why you purchased this console: to play great games. Don't take up the banner of supporting a console just because you want it to be the strongest so you can win debates. No, take up the banner of playing the gaming console you prefer because you prefer it, and who the hell cares what other people think. You're your own person with your own opinion; never feel you have to justify your purchase or support to random internet debaters. As long as you are getting the experience you want from the console you purchased, who the hell should have anything to say contrary to your experience and opinion.

Good post. And yet its conclusion is somewhat nearsighted. Part of the experience you want from a purchased console is affected by framerate, resolution, and effects. Lower-than-native resolutions introduce blur, artifacting, and problems recognizing distant objects. Lower framerates need no explanation. Reduced effects take away from the ambiance and look the developers hoped to give their audience. These are objective truths and problems that affect the One more than the PS4... and by no small measure. And it's only year 1.

As one who owns all 3 nextgen systems and a gaming PC, my One is now basically a Wii: I'll buy exclusives on it. Games I can't get anywhere else. Otherwise, everything is going to be bought on the PS4 or PC. Because as you said...it's about the experience...and some parts of the experience are measurable and objective.

#371 Posted by delta3074 (18406 posts) -

@sts106mat said:

@tormentos said:

@acp_45 said:

@tormentos: LOL... "PRT are made to work on a single pool of Ram not 2"

Why do you think they chose ESRAM ?

Ok dude... AMD does have an extension for Tiled Recources...

remember I said "Although there are other things in play on PS4 side, I don’t want to go into it now."

I said that because I knew someone like you would take me on like that.... but it didn’t help because you probably didn’t read that part......did you ?

This Extension Tiled Recources firstly are on the bug list with tons of other API extensions it being the least important.... and an a bug could be fixed in a day up until a month and could even take longer.... Then they finally get to AMD_Sparse_Texture I think it’s called...... But by then you are going to have to wait for at least an extra few months to make it’s way to the low-level shading language...

The only thing this extension has been used for are at demo builds.

AMD have very little resources assigned to opengl development. The bug-tracker and developer forums are half-abandoned.

AND WHEN DID I SAY PS4 WAS NOT STRONGER........... !!!!! ?

They chose ESRAM because they chose DDR3, because they wanted 8GB of memory, and by the time the decision was made, 8GB of GDDR5 didn't have the yields and was too expensive. Specs are locked down almost 2 years before a console releases, sometimes even more; hell, Cell development started in 2001 and was ready by 2005.

The second aspect of PRT is managing the tiles. In essence PRT reduces local video memory to a very large cache, where tiles are mapped/pinned as necessary and then evicted as per the cache rules, and elsewhere the hardware handles page/tile translation should a tile not already be in the cache. Large tomes have been written on caching methods, and this aspect is of particular interest to AMD because what they learn about caching here they can apply to graphical workloads (i.e. professional) and not just gaming.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

PRT, a.k.a. Tiled Resources, is a GCN feature from before the Xbox One was even born; the 7970 hit in December 2011, and OpenGL supported the feature when MS didn't even support it. PRT is also known as megatextures, what John Carmack used in Rage; he did it through OpenGL even without hardware support, and megatextures worked on both PS3 and Xbox 360, even though it was software based.

The feature also works on PS4, because it is GCN; the notion of Tiled Resources not working on PS4 is as silly as the whole "better scaler" claim for the Xbox One. It is not better than the PS4's; in fact it is the same one all GCN parts use, but most of you don't even know that. And what about the Data Move Engines, a.k.a. DMA, which are also on PS4..

By the way, textures alone will not make the Xbox One work better; Tiled Resources mean little. The Xbox One lacks power because it has fewer CUs and fewer stream processors, and half the ROPs of the PS4.

@sts106mat said:

@tormentos haha no. when you've played Ryse, then you can talk to me about it. funny how most people i know on here are enjoying it.

Why do you ignore the FACT that Heil68 has killed way more barbarians in Ryse than anyone else on my friends list? funny that...Sony champion Cow extreme, yet he spent literally hours playing ryse....a so called terrible game (he himself described it as fun).

keep the tears coming RYSE >>>>> PS4 graphics (for now).

OH, I played a demo, I don't need more of it; just like Knack was sh** too when I played it. At least I have the guts to admit it sucked.

Oh please, Ryse on PS4 would run better and look better; bag your silly ass argument. The Order just pisses over it, and the only thing it has on Killzone is character models; the rest, Killzone beats it all the way, from more open spaces to effects, resolution, and performance too.

Lol, what store runs a demo with a game featuring dismemberments and blood everywhere? LIAR LIAR PANTS ON FIRE

yeah, i also played Knack at a store, it felt like a PS2 game to be honest.

i have to agree with Mat, no way they would demo a game like ryse in a computer shop, thats just too far fetched considering the level of violence in the game, you would have to go to one seriously uncivilised country for that to be the case,lol

#372 Edited by xboxiphoneps3 (2376 posts) -
@StormyJoe said:

@xboxiphoneps3 said:

@StormyJoe said:

@tormentos said:

@StormyJoe said:

@tormentos: For the cheap seats: No developer agrees with you. Go find some links... otherwise, you are relegated to "dummy" status.

I could give a fu** about what a developer says...

BF4: 720p and 10 FPS slower on Xbox One; 900p and 10 FPS faster on PS4.

Ghosts: 720p on Xbox One, 1080p on PS4.

AC4: 900p on Xbox One with horrible AA problems and worse image quality; 1080p on PS4.

Tomb Raider: up to 30 FPS slower on Xbox One, lower quality textures, alpha-based effects at half the resolution of the PS4 version, lower quality depth of field, reduced levels of anisotropic filtering, 900p cut scenes.

This ^^^ is by far the biggest gap yet, and shows all the elements I once told you would happen, from frame rate differences to lower quality effects - and not just a few frames, but 20 FPS slower on average and up to 30 FPS slower in many parts. Did I mention that Tomb Raider is a damn DX game? It should have been easier to port to the Xbox One than to the PS4.

How much do you think a damn 8% will increase performance? It's incredible that you believe an 8% unlock will get the Xbox One to perform to a point where it won't matter, but somehow you think that a 40% difference between the PS4 and Xbox One is small. You don't know sh** about hardware; you made some sh** ass prediction that until now hasn't held.

Oh, did I forget MGS: 720p 60 FPS on Xbox One, 1080p 60 FPS on PS4, and the PS4 version has real weather simulation?

Hold tight to what some polite developers are saying: there is a gap in power between the PS4 and Xbox One that is impossible to close no matter how much MS optimizes.

@StormyJoe said:

@tormentos: You keep posting the same stupid things. Is that because you can't find a single article where a game developer agrees with you? I can find a whole lot of articles that agree with me.

Owning you is getting so easy.

All the proof I need is in the games; prove me wrong and show me how the Xbox One runs demanding games even close to the PS4..lol

http://www.anandtech.com/bench/product/778?vs=777

But but magic drivers and 8% GPU will bring parity...lol

Yeah, like that ever happened with the 7770 and 7850...lol

HA HA HA HA HA!!! So, what you are saying, is that everyone should listen to you, a half-truth speaking, over the top, super moo-cow-troll instead of people who are in the industry and actually developing game software.

WOW! Thanks for reminding me why I normally ignore you. LOL!!!

wow, and I thought lemmings couldn't get any more delusional.... it's funny though, because Tormentos is completely right: why does just about every multiplat run better on the PS4, with more effects and higher resolution and FPS? GIVE UP LEMMINGS, THE XBOX ONE IS WEAKER. MORE THAN HALF ITS SYSTEM BANDWIDTH IS LOCKED UP IN A MEASLY 32 MB OF RAM, 32 MEGABYTES

lemmings have nothing to say about just about every multiplat being noticeably superior on the PS4, with higher FPS, higher resolution, more effects, and more stable framerates. What a shame; Microsoft has been outclassed by Sony in the hardware department. It runs cooler than the Xbox One, is slimmer and smaller, and doesn't have a power brick like the Xbox One; these are facts no one can deny

Oh look, another person who completely ignored everything I (and every interviewed developer) have said, and is running around yelling "The PS4 is more powerful, the PS4 is more powerful".

Cows gonna cow...

Dude, you make no sense. You said tormentos is trolling and speaking over-the-top half-truths, and then you say "bu bu the developers are actually developing the games". Okay, you are beyond right: these developers ARE making the games for both consoles, and what is the end result? The PS4 is running the multiplats better than the Xbox One. The games speak for themselves; stop being delusional. The PS4 is running a bunch of multiplats better than the Xbox One: higher resolution, higher FPS, some extra effects, and more stable FPS on the PS4 over the Xbox One. Delusional lemmings are gonna be delusional. Are people really still claiming the Xbox One has secret sauce and is still more powerful? That is long dead; the Xbox One is not as strong as the PS4. I'm waiting for true next-gen games to finally come out starting this year.

Why does the PS4 run multiplats better than the Xbox One? LOL, I'll sit here all day while you try to answer this and babble and then go off topic.

#373 Posted by delta3074 (18406 posts) -
@tormentos said:

@StormyJoe said:

@tormentos: If developers and publishers were so interested in "Not hurting MS's feelings", then COD: Ghosts would have been 720P/60FPS for both the PS4 and XB1.

As for "closing the gap", I suppose I can forgive your ignorance. I used to be a software developer (moved on to DBA/management). I have my MCSD. And I can tell you first hand how improved SDKs and drivers can greatly improve the performance of a piece of software. You can deny it all you want, but it doesn't change that it's a fact.

I am going to believe my own work experience and what other developers say "on the record" before some forum troll any day.

That is a problem: MS can dictate things on its own console, not on the PS4. They tried that last gen with the PS3, forcing developers to have parity on disc, and would refuse any game that had extra content on Blu-ray, the same with DL games. This gen apparently Sony bit them first, and I am sure that there is a policy in the PS4 contract with developers that games must take advantage of

SDKs improve..... 100000000000% right you are. Now prove to me how only the Xbox One will improve its SDK and Sony will not, because that is the only way your argument can even begin to make sense. I already told you, and you ignored it, that SDKs DOOOOOOOOOO improve, they do; sadly for your silly argument, the PS4 SDK will improve too. In fact, after 2 or 3 years the PS4 will start pushing heavy compute, which the Xbox One is not even modified to do.

Good rebuttal dude, don't think he can come back from that. Obviously Sony are going to improve their software for the PS4; even a drop in OS overhead can improve hardware performance. Vintage, mate.

I don't think the XB1 will close the gap in power with the PS4, but at the same time I don't believe the difference is as big as 50%, because 50% faster does not equate to 50% more powerful in my book. I reckon it's more around the 30-35% mark.

#374 Edited by Exiled_Paladin (52 posts) -

LOOK AT ALL THE PEASANTS ARGUING OVER THEIR INFERIOR HARDWARE.

-SITS BACK AND WATCHES THE SHIT STORM-

JUST MASTER RACE THINGS.

#375 Edited by ronvalencia (15129 posts) -

@tyloss:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Bolcato stated that, “It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

“It means you have to do it in chunks or using tricks, tiling it and so on...

...

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.

“Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up. But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

http://www.developer-tech.com/news/2014/feb/17/xbox-one-1080p-comfortably-sdk-update-says-rebellion/

“They [Microsoft] are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.”

The studio is currently in the process of developing a sequel to the much-loved and graphically-intensive -- ‘Sniper Elite’ series. This has pitted the company in the perfect position to comment on the state of both systems’ development in an unbiased manner.

Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One.

Sniper Elite series is not a racing game.

------------------
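For context on the "too small to output 1080p" claim quoted above, the arithmetic is simple: the X1's ESRAM is 32 MB, and a deferred renderer's full set of 1080p render targets can exceed that. A rough back-of-the-envelope sketch (the five-target layout here is a hypothetical example, not any specific engine's):

```python
# Rough render-target size estimate at 1920x1080.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4  # e.g. RGBA8 or another packed 32-bit format

def targets_mb(count):
    """Total size in MiB of `count` full-screen 32-bit surfaces."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * count / (1024 * 1024)

ESRAM_MB = 32
# Hypothetical deferred setup: 4 G-buffer targets + depth = 5 surfaces.
gbuffer = targets_mb(5)
print(f"1080p, five 32-bit targets: {gbuffer:.1f} MB vs {ESRAM_MB} MB ESRAM")
```

Since ~39.6 MB does not fit in 32 MB, the options are exactly what Bolcato describes: render in chunks, tile, or drop below 1080p.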

A prototype 7850 with 12 CUs + dual rasterizers + 153.6 GB/s memory bandwidth shows the limit when the memory bandwidth + dual rasterizer units are the same as the retail 7850's with 16 CUs + dual rasterizers.

Based on the 7950's Vantage pixel fill rate scores (240 GB/s with 32 ROPs), 16 ROPs at 800 MHz can consume about 120 GB/s.

----------------

Vantage's color fill rate test "draws frames by filling the screen multiple times. The color and alpha of each corner of the screen is animated with the interpolated color written directly to the target using alpha blending".

3DMark Vantage pixel fill rate (from AnandTech and TechReport):

Radeon HD 7850 (32 ROPs at 860 MHz) = 8 Gpixel/s @ 153.6 GB/s

Radeon HD 7870 (32 ROPs at 1 GHz) = 8 Gpixel/s @ 153.6 GB/s (7.9 Gpixel/s from TechReport)

Radeon HD 7950-800 (32 ROPs at 800 MHz) = 12.1 Gpixel/s @ 240 GB/s

Radeon HD 7970-925 (32 ROPs at 925 MHz) = 13.2 Gpixel/s @ 260 GB/s (from TechReport)

----

The increase from 153.6 GB/s to 240 GB/s is 1.56X.

The increase from 8 Gpixel/s to 12.1 Gpixel/s is 1.51X.

----

The increase from 153.6 GB/s to 260 GB/s is 1.69X.

The increase from 8 Gpixel/s to 13.2 Gpixel/s is 1.65X.

----

There's a near-linear relationship between increased memory bandwidth and pixel fill rate.

Let's assume the 7950-800 represents the top 32-ROP capability. For 16 ROPs, dividing the 7950-800's 240 GB/s by 2 yields ~120 GB/s write.

Let's assume the 7970-925 represents the top 32-ROP capability. For 16 ROPs, dividing the 7970-925's 260 GB/s by 2 yields ~130 GB/s write.

Microsoft's 16-ROP math includes both read and write, which will saturate the X1's ESRAM bandwidth, i.e. "eight bytes write, four bytes read".
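The near-linear scaling claimed above is easy to sanity-check with a quick script; the figures are the Vantage numbers quoted in the post, nothing more:

```python
# Vantage pixel fill rate (Gpixel/s) vs. memory bandwidth (GB/s),
# figures as quoted above for the HD 7850 / 7870 / 7950-800 / 7970-925.
cards = {
    "HD 7850":     (8.0, 153.6),
    "HD 7870":     (8.0, 153.6),
    "HD 7950-800": (12.1, 240.0),
    "HD 7970-925": (13.2, 260.0),
}

base_fill, base_bw = cards["HD 7850"]
for name, (fill, bw) in cards.items():
    # If fill rate is bandwidth-bound, these two ratios should track closely.
    print(f"{name}: bandwidth x{bw / base_bw:.2f}, fill rate x{fill / base_fill:.2f}")

# Halving a 32-ROP part's consumable bandwidth approximates a 16-ROP setup:
print(f"16 ROPs at 7950-800 rates: ~{240.0 / 2:.0f} GB/s write")
```

The bandwidth and fill-rate multipliers land within a few percent of each other (1.56x vs 1.51x, 1.69x vs 1.65x), which is the whole basis of the argument.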

#376 Posted by blamix99 (2036 posts) -

@sts106mat said:

@blamix99 said:

@sukraj said:

Ryse takes a huge dump on flop Order 1999

really dude? Ryse is just graphics, even the top 5 game sites gave below 7.

gamespot - 40

ign - 6.8

game informer - 60

eurogamer - 50

gamerevolution - 60

It's much better than just graphics. Trying to nail the executions to maximise your return is tricky; it's surprising how many times I mess them up. The story, characters and voice acting are all also superb. The atmosphere in some of the levels is to die for.

Also, reviewers must have played on recruit level, which makes the game less difficult than a walk in the park. On legendary the game is a good challenge.

The game receives a lot of very positive feedback on twitter etc if you look what actual gamers think.

it's a crappy game bro, i saw my brother in law playing it all the time just when the X1 launched.. when he finished the game he said it's a very boring game with beautiful graphics.

#377 Edited by Krelian-co (11155 posts) -

@blamix99 said:

@sts106mat said:

@blamix99 said:

@sukraj said:

Ryse takes a huge dump on flop Order 1999

really dude? Ryse is just graphics, even the top 5 game sites gave below 7.

gamespot - 40

ign - 6.8

game informer - 60

eurogamer - 50

gamerevolution - 60

It's much better than just graphics. Trying to nail the executions to maximise your return is tricky; it's surprising how many times I mess them up. The story, characters and voice acting are all also superb. The atmosphere in some of the levels is to die for.

Also, reviewers must have played on recruit level, which makes the game less difficult than a walk in the park. On legendary the game is a good challenge.

The game receives a lot of very positive feedback on twitter etc if you look what actual gamers think.

it's a crappy game bro, i saw my brother in law playing it all the time just when the X1 launched.. when he finished the game he said it's a very boring game with beautiful graphics.

we all know that. "Magically," every hardcore lem disconnected from reality on these forums is saying it's a great game, despite every normal person and reviewer agreeing it's crap.

#378 Posted by tyloss (829 posts) -

@ronvalencia said:

@tyloss:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Bolcato stated that, “It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

“It means you have to do it in chunks or using tricks, tiling it and so on...

...

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.

“Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up. But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

http://www.developer-tech.com/news/2014/feb/17/xbox-one-1080p-comfortably-sdk-update-says-rebellion/

“They [Microsoft] are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.”

The studio is currently in the process of developing a sequel to the much-loved and graphically-intensive -- ‘Sniper Elite’ series. This has pitted the company in the perfect position to comment on the state of both systems’ development in an unbiased manner.

Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One.

Sniper Elite series is not a racing game.

------------------

A prototype 7850 with 12 CUs + dual rasterizers + 153.6 GB/s memory bandwidth shows the limit when the memory bandwidth + dual rasterizer units are the same as the retail 7850's with 16 CUs + dual rasterizers.

Based on the 7950's Vantage pixel fill rate scores (240 GB/s with 32 ROPs), 16 ROPs at 800 MHz can consume about 120 GB/s.

----------------

Vantage's color fill rate test "draws frames by filling the screen multiple times. The color and alpha of each corner of the screen is animated with the interpolated color written directly to the target using alpha blending".

3DMark Vantage pixel fill rate (from AnandTech and TechReport):

Radeon HD 7850 (32 ROPs at 860 MHz) = 8 Gpixel/s @ 153.6 GB/s

Radeon HD 7870 (32 ROPs at 1 GHz) = 8 Gpixel/s @ 153.6 GB/s (7.9 Gpixel/s from TechReport)

Radeon HD 7950-800 (32 ROPs at 800 MHz) = 12.1 Gpixel/s @ 240 GB/s

Radeon HD 7970-925 (32 ROPs at 925 MHz) = 13.2 Gpixel/s @ 260 GB/s (from TechReport)

----

The increase from 153.6 GB/s to 240 GB/s is 1.56X.

The increase from 8 Gpixel/s to 12.1 Gpixel/s is 1.51X.

----

The increase from 153.6 GB/s to 260 GB/s is 1.69X.

The increase from 8 Gpixel/s to 13.2 Gpixel/s is 1.65X.

----

There's a near-linear relationship between increased memory bandwidth and pixel fill rate.

Let's assume the 7950-800 represents the top 32-ROP capability. For 16 ROPs, dividing the 7950-800's 240 GB/s by 2 yields ~120 GB/s write.

Let's assume the 7970-925 represents the top 32-ROP capability. For 16 ROPs, dividing the 7970-925's 260 GB/s by 2 yields ~130 GB/s write.

Microsoft's 16-ROP math includes both read and write, which will saturate the X1's ESRAM bandwidth, i.e. "eight bytes write, four bytes read".

Nothing said by the developer indicates both consoles will have complete parity.

Please provide sufficient evidence instead of posting useless walls of text.

#379 Edited by tyloss (829 posts) -

@StormyJoe said:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

Read the developer interviews.

So you're talking out of your ass? I read them (Ron quoted them above) and nothing explicitly stated the Xbox and PlayStation will output 100% equally down the line.

Actually, some developers have stated quite the opposite. Link

Show me a quote where they say complete parity between the two consoles will happen on multiplats down the road.

#380 Posted by Krelian-co (11155 posts) -

@tyloss said:

@StormyJoe said:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

Read the developer interviews.

So you're talking out of your ass? I read them (Ron quoted them above) and nothing explicitly stated the Xbox and PlayStation will output 100% equally down the line.

Actually, some developers have stated quite the opposite. Link

Show me a quote where they say complete parity between the two consoles will happen on multiplats down the road.

the funny thing he's arguing EVERYONE is saying that, hahahaha. But hey, he never makes bs!

#381 Edited by tyloss (829 posts) -

@Krelian-co said:

@tyloss said:

@StormyJoe said:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

Read the developer interviews.

So you're talking out of your ass? I read them (Ron quoted them above) and nothing explicitly stated the Xbox and PlayStation will output 100% equally down the line.

Actually, some developers have stated quite the opposite. Link

Show me a quote where they say complete parity between the two consoles will happen on multiplats down the road.

the funny thing he's arguing EVERYONE is saying that, hahahaha. But hey, he never makes bs!

Haha, I'll wait. If he can't, though, that'd mean he's a joke and a shitty debater on top of that.

#382 Posted by the_bi99man (11047 posts) -

@Wasdie said:

@me3x12 said:

Listen this is the same case for The Order 1886 they are clearly making compromises and felt the game will look better at 800p with added effects. This is my exact point for Ryse. So Cows need to decide what argument they want to make they can't have it both ways.

The Order 1886 developers have decided to go with a different aspect ratio. This could either be an artistic choice or because of a technical limitation. I'm not sure. I would lean more towards a technical limitation that they are cleverly masking with an artistic design decision.

Developers always choose what they believe makes the game look better. If running at 1080p would result in a lower framerate then they'll pick a lower resolution or reduce rendering effects to hit a stable framerate. Its up to the developer to choose what will look the best.

This doesn't mean the developer is willingly downgrading the rendering or resolution. They are being forced to do it. All of these games would look better at 1080p and 60fps but none of the platforms are powerful enough to handle them OR the game engines have not yet been tuned properly for the hardware. Could be a number of reasons.

Fact remains that devs are making compromises on all platforms right now. Please don't lump me in with Sony fanboys when it comes to this. I'm not playing ignorant to the Playstation 4's shortcomings.

Wasdie, you're trying way too hard to be patient and reasonable with a blatant fanboy who has absolutely no desire to have a real conversation. He's a troll. Literally every post. It's the sole purpose of his existence. Just ignore him. Or, better yet, use your mod powers and ban him just for being an idiot. I would have banned him months ago, if I were a mod. And everybody would be happier.

#383 Edited by hoosier7 (3843 posts) -

http://gamingbolt.com/xbox-ones-esram-is-cumbersome-bottlenecking-memory-developer

It's like this was leaked just for TC lol

I don’t get this obsession with MS fanboys wanting the more powerful machine. Its not happening. Who cares. There are xbox fans who think esram is this mystical thing when all it is is cumbersome, bottlenecking memory. But its in there so it HAS to be special. When I hear people say devs need to just utilize esram better, its like saying they need to learn how to do the most basic thing again. Esram isn’t rocket science, it isn’t even complex. Its just a pain in the a**. That’s it,

#384 Posted by stuff238 (685 posts) -

Prove it. Oh wait, you can't! In reality, right now in time, the PS4 is winning the graphics console race. We have proof. It is fact.

You have nothing but words. Let me know when an xbone multiplat is better than the PS4 version, because I see nothing yet to prove you right.

#385 Posted by tormentos (18294 posts) -

@StormyJoe said:

So now the gap between the PS4 and Xb1 is 1000000000000%?

Typical troll. I could give you irrefutable evidence and you would just ignore it because it ruins your little fantasy world. You said it yourself: you are right and everyone actually "in" the industry is wrong.

Pathetic.

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa...

Final proof that you are a buffoon. Did you read what I said? Please re-read the part where I said 100000000000%, read it again slowly:

"SDK improve..... 100000000000% right you are,now prove to me how only the xbox one will improve its SDK and sony will not because that is the only way your argument can even begin to make sense"

I said that you were 100000000000% right that SDKs do improve... ahahahaaaa.

Nowhere did I say the difference was that number.

You have no irrefutable evidence because it doesn't exist. The PS4 versions of multiplatform games run better than the Xbox One's; there is no evidence to the contrary.

Once again, and don't ignore my point like you just conveniently did: I want evidence that the PS4 SDK will never improve at all, because I already told you that the Xbox One SDK will improve, but you refuse to admit the PS4's will. Double standard FTW...

#386 Posted by speedfog (3221 posts) -

TC is awesome.

#387 Posted by tormentos (18294 posts) -
@StormyJoe said:

No, that's not what he said, and I did not misinterpret his responses. Tormentos is convinced that there is absolutely nothing that will every create a parity between the PS4 and XB1, regardless of what every industry expert says to the contrary.

Yeah, that is what I said, you blind lemming: I said SDKs do improve. What the fu**, man, stop arguing things I didn't say.

SDKs will improve. Now you're in denial, severely butthurt lemming: if the Xbox One SDK improves, so will the PS4's.

The problem with your god-awful theory is that you want to give the impression the Xbox One SDK will improve so greatly that it makes the difference irrelevant. That theory is completely made up and wrong, because the PS4's SDK will improve too, just like the PS3's did, so let me break it down for you in the most dummy-proof way I can.

The Xbox One is a racing car; right now it is doing 120 MPH. The PS4, by comparison, is doing 150 MPH, and not only that, it also corners better than the Xbox One car.

So the Xbox team does an optimization and fine-tuning pass on its car, and now it is doing 148 MPH, which is very close to the PS4's 150 MPH. But there is a problem: the PS4 team also did customization and fine-tuning on the PS4 car, and now it is doing 180 MPH.

You see, what applies to one must apply to the other. Both are GCN with basically the same hardware; the Xbox One just has less power, so no matter what is done to it, it will always have less power.

By the way, not every industry expert is saying that, so please prove it, because 2 or 3 quotes from developers who want to be polite don't equal every industry expert. In fact, that last one is the only one I have read actually claiming the Xbox One will catch up to the PS4 and dismissing the hardware difference as "on paper". This isn't last gen, when the hardware was different; this gen the Xbox One has a 7770-equivalent GPU with an ESRAM bottleneck, while the PS4 is pretty straightforward and has a GPU like an overclocked 7850.

@StormyJoe said:

Oh look, another person who completely ignored everything I (and ever interviewed developer) have said, and is running around yelling "The PS4 is more powerful, the PS4 is more powerful".

Cows gonna cow...

Prove it. Prove the PS4 isn't more powerful, I dare you, with actual facts, not your biased opinion or a quote from a polite developer. I want proof, because developers can claim what they want; fact is what the games are showing now...

You already missed with your predictions...

But but, 4 to 5 frames... lol

@StormyJoe said:

a) The XB1 has it's own version of Direct X, not shared with the PS4

b) go read.

Why do you keep posting? Glutton for punishment?

a) The PS4 also has its own set of tools; trying to use DX as a shield is stupid. OpenGL is better, and on PS4 Sony has tools that are lower level compared to MS's DX. Oh, did I mention that Sony will release a to-the-metal SDK later on?

b) You claimed it, you back it up.

Just because MS uses DX doesn't mean the PS4 will not improve. What kind of lame-ass excuse is that? The PS3 didn't use DX either and its SDK improved radically, so hiding behind an overhead-heavy DX is not helping your case.

@ronvalencia said:

Andrea stated "so 1080 no MS would be cheaper" i.e. no MSAA at 1080p. You are adding 2X MSAA and Andrea's post clearly states no MSAA at 1080p

@tormentos

PRT doesn't use DDR3 to store those textures. Anandtech was very clear: it uses your own video memory as a cache. I just destroyed your argument, period. Sony even lists PRT among the things the PS4 does. You are being a blind MS fanboy, period, and it's time you really drop it. Sony confirmed it and Anandtech confirmed it too, in an article from 2011 on GCN; hell, DX didn't even support PRT back then.. lol

The textures are held on the disc or HDD and streamed as needed. Having 2 memory pools is irrelevant, because PRT is made for GCN, and on PC, DDR3 has far slower speed and data transfer than GDDR5. Once again, for all intents and purposes, the GPU will see 2 memory banks, one it can access and one it can't. Even if the system partition is used for those textures, they will still load faster than they would in 2 different memory setups.

I already proved what I needed to: PRT works on PS4, and MS had to build support for it on Xbox One because of ESRAM, which no GCN has. I told you that like 20 times: no GCN has ESRAM, and PRT works on a single memory, your video memory.

You had better read the basics of caching in computing: http://en.wikipedia.org/wiki/Cache_(computing)

It's clear you don't know the purpose of having a cache in the first place.

To minimize latency when applying "cache" concepts to GCN's PRT:

1. "Cache memory" is replaced by the GPU's faster memory pool, which runs faster than DDR3.

2. The CPU is replaced by the GPU.

http://www.britannica.com/EBchecked/topic/87789/cache-memory

cache memory, also called Cache, a supplementary memory system that temporarily stores frequently used instructions and data for quicker processing by the central processor of a computer. The cache augments, and is an extension of, a computer’s main memory

--------------

AMD PRT makes the faster VRAM into a "cache" against slower memory types e.g. DDR3 for X1's and gaming PC's situation. For PS4, it doesn't have larger/slower memory pool to cache against.

Against your "GPU will see 2 memory banks one it can access one it can't" assertion, read AMD PRT from http://www.tinysg.de/techGuides/tg4_prt.html

The Southern Islands graphic chips of Radeon 7000 series boards and FirePro W-series Workstation cards support a new OpenGL extension called AMD_sparse_texture that allows to download only parts of a texture image (tiles), rather than providing the entire image all at once. The driver will commit memory only for the downloaded tiles, and the application may decide to unload tiles again if they are no longer needed. This way, the graphics memory can serve as a texture tile cache, indexing parts of vast textures in main memory or even on disk.

AMD's GCN driver from Radeon HD 7970 era allows the GPU to access main memory. This vendor specific feature was replicated for PC's DirectX 11.2 and Xbox One's DirectX11.X superset extensions i.e. Xbox One doesn't need Mantle.

You are wrong on the claim that the GPU is unable to access main memory, i.e. the driver will provide the access.

The issue of OpenGL vs DirectX on PRT's support release date is just a red herring, and you should know PS4's primary graphics API is not OpenGL.

From http://www.officialplaystationmagazine.co.uk/2013/10/31/ps4-uses-a-5400-rpm-sata-ii-hard-drive-users-can-install-anything-less-than-9-5mm-larger-than-160gb/

PS4's standard disk is a slow 5400 RPM laptop type, LOL. It has massive latency if one uses this type of HDD for texture cache storage. Current PC workstations have faster storage devices than a slow 5400 RPM laptop HDD.

The fundamental lesson of AMD PRT and Tiled Resources is to keep the GPU's TMUs operating at peak efficiency within the constraint of the fastest memory storage.

I didn't state the PS4 doesn't have PRT functionality, but the gain from PRT for PCs and X1s is larger than for the PS4's version. For the X1, PRT is part of the requirement for max efficiency, while on PC it is an option when most runtime game textures fit within 2 GB VRAM in 1080p scenarios. For the PS4, PRT is nearly pointless, since it has 5 to 6 GB of VRAM, which can easily keep its TMUs running at max efficiency within the constraint of its fastest memory storage.

----------------------------------

At 1920x1080p, any flagship PC GPU ASICs with flagship PC video memory bandwidth from the year 2012 can apply 4X MSAA. The PC's performance scales with the user's budget.

Note that MSAA is not a silver bullet when comes to non-edge geometry, hence why you have a combo solution.

He didn't say it was impossible; he said 1080p would be cheaper without it. You are reading it out of context.

And all the rest of the crap you just quoted there doesn't disprove what I posted:

"The driver will commit memory only for the downloaded tiles, and the application may decide to unload tiles again if they are no longer needed. This way, the graphics memory can serve as a texture tile cache, indexing parts of vast textures in main memory or even on disk."

You just backed up my point and my link.. lol

PRT works on the PS4, period, until proven otherwise, which you haven't done. It doesn't need 2 memory pools, and GCN has only 1.
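The "VRAM as a texture tile cache" idea both sides keep arguing about can be sketched as a simple residency map with LRU eviction. This is a toy model only; the class and names are illustrative, not AMD's actual API (the real driver commits 64 KB tiles per AMD_sparse_texture):

```python
from collections import OrderedDict

TILE_BYTES = 64 * 1024  # GCN PRT tiles are 64 KB

class TileCache:
    """Toy model of PRT residency: fast memory holds only the touched tiles;
    misses fetch from the larger, slower store (main RAM or disk)."""
    def __init__(self, budget_tiles):
        self.budget = budget_tiles
        self.resident = OrderedDict()  # tile id -> tile data, LRU ordered
        self.misses = 0

    def sample(self, tile_id, backing_store):
        if tile_id in self.resident:          # hit: tile already committed
            self.resident.move_to_end(tile_id)
        else:                                 # miss: commit, evicting LRU
            self.misses += 1
            if len(self.resident) >= self.budget:
                self.resident.popitem(last=False)
            self.resident[tile_id] = backing_store[tile_id]
        return self.resident[tile_id]

backing = {i: b"x" * TILE_BYTES for i in range(64)}  # "huge" texture elsewhere
cache = TileCache(budget_tiles=8)
for tid in [0, 1, 2, 0, 1, 9, 0]:                    # one frame's tile touches
    cache.sample(tid, backing)
print(f"resident: {len(cache.resident)} tiles, misses: {cache.misses}")
```

Whether the second, slower pool behind the cache matters on a unified-memory machine is exactly the point being disputed above.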

#388 Edited by ronvalencia (15129 posts) -
@tormentos said:
@StormyJoe said:

No, that's not what he said, and I did not misinterpret his responses. Tormentos is convinced that there is absolutely nothing that will every create a parity between the PS4 and XB1, regardless of what every industry expert says to the contrary.

Yeah, that is what I said, you blind lemming: I said SDKs do improve. What the fu**, man, stop arguing things I didn't say.

SDKs will improve. Now you're in denial, severely butthurt lemming: if the Xbox One SDK improves, so will the PS4's.

The problem with your god-awful theory is that you want to give the impression the Xbox One SDK will improve so greatly that it makes the difference irrelevant. That theory is completely made up and wrong, because the PS4's SDK will improve too, just like the PS3's did, so let me break it down for you in the most dummy-proof way I can.

The Xbox One is a racing car; right now it is doing 120 MPH. The PS4, by comparison, is doing 150 MPH, and not only that, it also corners better than the Xbox One car.

So the Xbox team does an optimization and fine-tuning pass on its car, and now it is doing 148 MPH, which is very close to the PS4's 150 MPH. But there is a problem: the PS4 team also did customization and fine-tuning on the PS4 car, and now it is doing 180 MPH.

You see, what applies to one must apply to the other. Both are GCN with basically the same hardware; the Xbox One just has less power, so no matter what is done to it, it will always have less power.

By the way, not every industry expert is saying that, so please prove it, because 2 or 3 quotes from developers who want to be polite don't equal every industry expert. In fact, that last one is the only one I have read actually claiming the Xbox One will catch up to the PS4 and dismissing the hardware difference as "on paper". This isn't last gen, when the hardware was different; this gen the Xbox One has a 7770-equivalent GPU with an ESRAM bottleneck, while the PS4 is pretty straightforward and has a GPU like an overclocked 7850.

@StormyJoe said:

Oh look, another person who completely ignored everything I (and ever interviewed developer) have said, and is running around yelling "The PS4 is more powerful, the PS4 is more powerful".

Cows gonna cow...

Prove it. Prove the PS4 isn't more powerful, I dare you, with actual facts, not your biased opinion or a quote from a polite developer. I want proof, because developers can claim what they want; fact is what the games are showing now...

You already missed with your predictions...

But but, 4 to 5 frames... lol

@StormyJoe said:

a) The XB1 has it's own version of Direct X, not shared with the PS4

b) go read.

Why do you keep posting? Glutton for punishment?

a) The PS4 also has its own set of tools; trying to use DX as a shield is stupid. OpenGL is better, and on PS4 Sony has tools that are lower level compared to MS's DX. Oh, did I mention that Sony will release a to-the-metal SDK later on?

b) You claimed it, you back it up.

Just because MS uses DX doesn't mean the PS4 will not improve. What kind of lame-ass excuse is that? The PS3 didn't use DX either and its SDK improved radically, so hiding behind an overhead-heavy DX is not helping your case.

@ronvalencia said:

Andrea stated "so 1080 no MS would be cheaper" i.e. no MSAA at 1080p. You are adding 2X MSAA and Andrea's post clearly states no MSAA at 1080p

@tormentos

PRT doesn't use DDR3 to store those textures. Anandtech was very clear: it uses your own video memory as a cache. I just destroyed your argument, period. Sony even lists PRT among the things the PS4 does. You are being a blind MS fanboy, period, and it's time you really drop it. Sony confirmed it and Anandtech confirmed it too, in an article from 2011 on GCN; hell, DX didn't even support PRT back then.. lol

The textures are held on the disc or HDD and streamed as needed. Having 2 memory pools is irrelevant, because PRT is made for GCN, and on PC, DDR3 has far slower speed and data transfer than GDDR5. Once again, for all intents and purposes, the GPU will see 2 memory banks, one it can access and one it can't. Even if the system partition is used for those textures, they will still load faster than they would in 2 different memory setups.

I already proved what I needed to: PRT works on PS4, and MS had to build support for it on Xbox One because of ESRAM, which no GCN has. I told you that like 20 times: no GCN has ESRAM, and PRT works on a single memory, your video memory.

You had better read the basics of caching in computing: http://en.wikipedia.org/wiki/Cache_(computing)

It's clear you don't know the purpose of having a cache in the first place.

To minimize latency when applying "cache" concepts to GCN's PRT:

1. "Cache memory" is replaced by the GPU's faster memory pool, which runs faster than DDR3.

2. The CPU is replaced by the GPU.

http://www.britannica.com/EBchecked/topic/87789/cache-memory

cache memory, also called Cache, a supplementary memory system that temporarily stores frequently used instructions and data for quicker processing by the central processor of a computer. The cache augments, and is an extension of, a computer’s main memory

--------------

AMD PRT makes the faster VRAM into a "cache" against slower memory types e.g. DDR3 for X1's and gaming PC's situation. For PS4, it doesn't have larger/slower memory pool to cache against.

Against your "GPU will see 2 memory banks one it can access one it can't" assertion, read AMD PRT from http://www.tinysg.de/techGuides/tg4_prt.html

The Southern Islands graphic chips of Radeon 7000 series boards and FirePro W-series Workstation cards support a new OpenGL extension called AMD_sparse_texture that allows to download only parts of a texture image (tiles), rather than providing the entire image all at once. The driver will commit memory only for the downloaded tiles, and the application may decide to unload tiles again if they are no longer needed. This way, the graphics memory can serve as a texture tile cache, indexing parts of vast textures in main memory or even on disk.

AMD's GCN driver from the Radeon HD 7970 era allows the GPU to access main memory. This vendor-specific feature was replicated in PC DirectX 11.2 and Xbox One's DirectX 11.X superset extensions, i.e. Xbox One doesn't need Mantle.

You are wrong on the claim about the GPU's inability to access main memory, i.e. the driver provides that access.

The OpenGL vs DirectX debate over PRT's support release date is just a red herring, and you should know PS4's primary graphics API is not OpenGL.

From http://www.officialplaystationmagazine.co.uk/2013/10/31/ps4-uses-a-5400-rpm-sata-ii-hard-drive-users-can-install-anything-less-than-9-5mm-larger-than-160gb/

PS4's standard disk is a slow 5400 RPM laptop type LOL. It has massive latency if one uses this type of HDD for texture cache storage. LOL. Current PC workstations have faster storage devices than the slow 5400 RPM laptop HDD.

The fundamental lesson of AMD PRT and Tiled Resources is to keep the GPU's TMUs operating at peak efficiency within the constraint of its fastest memory storage.

I didn't state PS4 doesn't have PRT functionality, but the gain from PRT for PCs and X1 is larger than for PS4's version. For X1, PRT is part of the requirement for max efficiency, while on PC it is an option when most runtime game textures fit within 2 GB VRAM for 1080p scenarios. For PS4, PRT is nearly pointless since it has 5 to 6 GB of VRAM, which can easily keep its TMUs running at max efficiency within the constraint of its fastest memory storage.

----------------------------------

At 1920x1080, any flagship PC GPU ASIC with flagship PC video memory bandwidth from the year 2012 can apply 4X MSAA. The PC's performance scales with the user's budget.

Note that MSAA is not a silver bullet when it comes to non-edge geometry, hence the combo solutions.
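A back-of-envelope sketch of the memory cost behind that claim, assuming an uncompressed RGBA8 color buffer (real GPUs use framebuffer compression, so treat these as upper bounds):

```python
# Raw color-buffer size at 1920x1080, with and without 4x MSAA,
# assuming 4 bytes per sample (RGBA8) and no compression.
width, height, bytes_per_sample, samples = 1920, 1080, 4, 4

buf_mb = width * height * bytes_per_sample / 2**20   # single-sample buffer
msaa4x_mb = buf_mb * samples                         # 4 samples per pixel

print(f"1080p color buffer:          {buf_mb:.1f} MB")
print(f"1080p color buffer, 4x MSAA: {msaa4x_mb:.1f} MB")
```

A single 1080p target is about 7.9 MB, and storing 4 samples per pixel roughly quadruples that, which is why 4X MSAA is a bandwidth and capacity question rather than a compute one.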

He didn't say it was impossible, he said it would be cheaper at 1080p without it; you are reading it out of context.

And all the rest of the crap you just quoted there doesn't disprove what I posted.

The driver will commit memory only for the downloaded tiles, and the application may decide to unload tiles again if they are no longer needed. This way, the graphics memory can serve as a texture tile cache, indexing parts of vast textures in main memory or even on disk.

You just backed up my point and my link..lol

PRT works on PS4, period, until proven otherwise, which you haven't done; it doesn't need 2 memory pools, and GCN has only 1.

I didn't say PS4 doesn't have PRT functions LOL.

There's a difference between "the level of gain/benefits" vs zero functionality. To keep TMUs busy, AMD PRT/Tiled Resources is a requirement for X1, i.e. the 12 CU equipped prototype 7850 with 153.6 GB/s memory bandwidth can do more than the 7770's 72 GB/s memory bandwidth allows.

It would be interesting to see if Rebellion can pull off 1920x1080/near 60 fps on both PS4 and X1, i.e. Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One

In terms of "the level of gain/benefits", there's a difference between DDR3 memory vs a very slow 5400 RPM laptop hard disk.

Again, your "GPU will see 2 memory banks one it can access one it can't" assertion is wrong i.e.

1. AMD's driver enables AMD PRT features to work with main system memory, and that's two memory pools, i.e. GDDR5 and DDR3 for gaming PCs. Again, you'd better review the purpose of having a cache. A cache has no purpose if there's no slower memory pool to cache against.

Note that HDDs can be used as a very slow memory pool, i.e. didn't you know virtual memory can read/write memory "pages" on the HDD? My laptop's 512 GB Samsung 840 SSD is faster than early-90s main memory and exceeds the old 32-bit PCI standard's 133 MB/s, i.e. >200 MB/s.
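To put rough numbers on why the backing pool matters, here is an illustrative sketch. The bandwidth figures are ballpark assumptions, not measurements, and 64 KB is the usual PRT tile granularity:

```python
# Illustrative transfer time for one 64 KB PRT tile from different pools.
# Bandwidths are ballpark assumptions; HDD seek latency (~10 ms) would
# dominate in practice and is ignored here.
TILE_BYTES = 64 * 1024

pools_mb_per_s = {
    "5400 RPM laptop HDD": 100,
    "SATA SSD (Samsung 840)": 500,
    "DDR3 (X1 main pool)": 68_000,
    "GDDR5 (PS4)": 176_000,
}

# microseconds to move one tile at the pool's sequential bandwidth
tile_us = {name: TILE_BYTES / (mb_s * 1_000_000) * 1e6
           for name, mb_s in pools_mb_per_s.items()}

for name, us in tile_us.items():
    print(f"{name:24s} {us:8.2f} us/tile")
```

Even ignoring seeks, the HDD is orders of magnitude slower per tile than any RAM pool, which is the whole argument about HDD-backed texture caches.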

2. Modern GPUs don't operate without driver-level software. LOL.

3. DirectX 11.X is a superset of PC's DirectX 11.2, i.e. it has GCN extensions, e.g. AMD GCN on X1 can see both the DDR3 and ESRAM memory pools. X1 doesn't need Mantle since its DirectX has its own AMD GCN extensions. Again, your claim that GCN is unable to see main memory is wrong. That's stupidity on your part.

4. PCs with AMD GCN have Mantle. Again, your claim that GCN is unable to see main memory is wrong.

5. PCs with DirectX 11.2 gain AMD PRT/Tiled Resources functions with just-in-time texture streaming+caching from slower memory pools to the fast GDDR5 memory pool.

-------

As for The Order, Andrea's post clearly states no MSAA at 1080p. The statement is pretty simple and you're adding fan-fiction.

#389 Posted by Krelian-co (11155 posts) -

@tyloss said:

@Krelian-co said:

@tyloss said:

@StormyJoe said:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

Read the developer interviews.

So you're talking out of your ass? I read them (Ron quoted them above) and nothing explicitly stated the Xbox and PlayStation will output 100% equally down the line.

Actually, some developers have stated quite the opposite.Link

Show me a quote where they say complete parity between the two consoles will happen on multiplats down the road.

the funny thing is he's arguing EVERYONE is saying that, hahahaha. But hey, he never makes bs!

Haha, I'll wait. If he can't, though, that'd mean he's a joke and a shitty debater on top of that.

as expected, he makes bullsh1t and straight out lies hoping no one will notice, gets disproved he looks like the clown he is and disappears, rinse and repeat, stormyclown style. Lem desperation continuation.

#390 Posted by zeeshanhaider (2595 posts) -

@tormentos said:

@zeeshanhaider said:

Talking out of your ass again....HAHAHAHAHA

Have no data to back it up....HAHAHAHAHA

Ryse > Crapzone: Shadow Fail, and it only has less than 6.7% fewer pixels than The Order, on a console 50% more powerful....HAHAHAHAHA

Crytek still shits on all Sony's First Party and creating pains in the asses of Sony drones forever.....HAHAHAHAHAHA

900pStation and 720pBox already getting games where developers had to cut corners...HAHAHAHAHAHA

Not even a single game that could outdo what a PC game did 3 years back....HAHAHAHAHA

Tablet CPU and GPU barely matching a 570 from 2010 could only do so much....HAHAHAHAHAHA

Peasants are getting ripped off again with $400 and $500 and getting their asses whooped by the PC....HAHAHAHAHA

Oh and Sony is about to dies, Microsoft is going to shutdown the XBox brand, Nintendo will only be producing handhelds in a near future.....HAHAHAHAHA

http://store.steampowered.com/hwsurvey

Hahahahaaa.................

Most use Intel HD 4000, followed by Intel HD 3000.....

Ryse is a joke, and what Killzone does, the Xbox One would do at 720p and with lower visuals..haha

Oh wait, The Order looks better, has better-looking characters, doesn't have PRE-RENDERED cut scenes, and is higher resolution, with 4XMSAA...lol The Order on Xbox One would run at sub-720p..lol

Oh wait, you already admitted that Ryse would run faster at 1080p on PS4; boy, you owned yourself..

Well, if you consider that the PS4 has an early-2012 mid-range GPU, it's not a surprise. Once again, build me a cheaper, stronger PC than the PS4 for $399; hell, most people buy Walmart PCs for $500 and they don't come close to the PS4..lol

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa............

The PS4 has sold more in 3 months than most top-of-the-line GPUs have on PC...

But but Sony is dying...ahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

Can't win the argument, so you resort to inventing sh**...lol


Didn't even expand the list to view more details. Most PCs actually have better specs than the 900pStation and sport 1920x1080 resolution, and that's only STEAM....HAHAHAHAHA

Bringing up price? That means you already lost the argument.....HAHAHAHAHA

As a matter of fact, you can build a PC for $400 that's better in spec than the 900pStation. Search GameSpot.....HAHAHAHAHA

Again talking out of your ass about the Crapzone: Shadow Fail with nothing from the industry to back it up....HAHAHAHAAHAHA

A 50% more powerful console running a game close to 900p. Crytek again proved Sony first party don't know jack about good coding and how to make engines.....HAHAHAHAHAHA

900pStation has a tablet CPU and a GPU barely matching a midrange GPU 570 from 2010.....HAHAHAHAHAHA

I wouldn't be surprised if the 900pStation becomes the 720pStation and the 720pBox becomes a Sub-HD box in five years; hell, I think we'll even get tablets more powerful than the 'out-dated/Next-Gen' consololes.....HAHAHAHAAHA

You don't own a 900pStation; that speaks volumes about it not being worth it. Selfownage right there.....HAHAHAHAAHAHA

PC market continues to grow while consololes parent companies either are dying or thinking about killing the brand.....HAHAHAHAHAHA

The 900pStation will struggle all its life to reach the level of a 2011 PC game....HAHAHAHAHA

Consolites is a species that's on the verge of extinction....HAHAHAHAHAHAHA

#391 Edited by tormentos (18294 posts) -
@ronvalencia said:

@tyloss:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Bolcato stated that, “It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

“It means you have to do it in chunks or using tricks, tiling it and so on...

...

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.

“Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up. But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

http://www.developer-tech.com/news/2014/feb/17/xbox-one-1080p-comfortably-sdk-update-says-rebellion/

“They [Microsoft] are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.”

The studio is currently in the process of developing a sequel to the much-loved and graphically-intensive -- ‘Sniper Elite’ series. This has pitted the company in the perfect position to comment on the state of both systems’ development in an unbiased manner.

Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One.

Sniper Elite series is not a racing game.

------------------

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

Hahhahaaaaaaaaaaaaaaaaaaaaaaa... That is the excuse you give for the Xbox One being behind, and now you side with a developer who says the Xbox One will catch up, when you know it is impossible for a 12 CU GPU to catch up with an 18 CU GPU...You are owning yourself hard..lol

#392 Edited by tormentos (18294 posts) -

@ronvalencia said:

I didn't say PS4 doesn't have PRT functions LOL.

There's a difference between "the level of gain/benefits" vs zero functions. To keep TMUs busy, AMD PRT/Tiled resource is a requirement for X1 i.e.12 CU equipped prototype 7850 with 153.6 GB/s memory bandwidth can do more than 7770's 72 GB/s memory bandwidth.

It would be interesting to if Rebellion can pull-off 1920x1080p/near 60 fps on both PS4 and X1 i.e. Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One

In terms of "the level of gain/benefits", there's a difference between DDR3 memory vs a very slow 5400 RPM laptop hard disk.

Again, your "GPU will see 2 memory banks one it can access one it can't" assertion is wrong i.e.

1. AMD's driver enables AMD PRT features to work with main system memory and that's two memory pools i.e. GDDR5 and DDR3 for gaming PCs. Again, you better review the purpose of having a cache. A cache doesn't exist if there's nothing to cache against a slower memory pool.

Note that HDDs can be use a very slow memory pool i.e. didn't you know virtual memory functions can read/write memory "pages" on the HDDs? My laptops's 512 GB Samsung 840 SSD is faster than early 90s main memory and exceeds the old 32bit PCI standard's 133 MB/s i.e. >200 MB/s

2. Modern GPUs doesn't operate without a driver level software. LOL.

3. DirectX11.X is a superset of PC's DirectX11.2 i.e. it has GCN extensions e.g. AMD GCN on X1 can see both DDR3 and ESRAM memory pools. X1 doesn't need Mantle since it's DirectX has it's own AMD GCN extensions. Again, you claims on GCN's unable to see main memory is wrong. Your stupidity on your part.

4. PCs with AMD GCN has Mantle. Again, you claims on GCN's unable to see main memory is wrong.

5. PCs with DirectX11.2 gains AMD PRT/Tiled resource functions with just-in-time texture streaming+caching from slower memory pools to fast GDDR5 memory pool.

-------

As for 'The Order, Andrea's post clearly states no MSAA at 1080p. The statement is pretty simple and you're adding fan-fiction.

The Xbox One doesn't have a prototype 7850; it has a Bonaire 7790 with 2 CUs disabled and 16 ROPs. This has been confirmed to hell and beyond, and you refuse to admit it.

The Xbox One doesn't have that bandwidth either. The 7850 has 153 GB/s to a single pool of 1 GB (other models have 2 GB and 4 GB), while the Xbox One has 32 MB of ESRAM, which can achieve 140 GB/s to 150 GB/s in the very best case scenario.

PRT will work on PS4 just like it will on Xbox One, and you don't need 2 memory pools; it is confirmed, period. Unless you have data from AMD stating that you need 2 memory pools or ESRAM, you lose the argument.

Mantle is a to-the-metal API that works more or less at the same level as the PS4's low-level API. DX doesn't have that kind of low-level design; it is said that X1's DX is lower level than the PC version of DX, but not like the PS4's in any regard.

No, the textures are loaded from the HDD or the disc and partially stored in video memory. Not a single document from AMD states that PRT tiles are loaded from system RAM; in fact, Anandtech, like I posted, says they can come from the HDD..lol

Let's wait 1 year and see what your excuse will be when I quote you, using what that dude said about catching up. I just want to see you backpedal again and claim that a 7850 with 12 CUs will not pass a retail 7850 again..lol

#393 Edited by me3x12 (1765 posts) -

http://www.youtube.com/watch?feature=player_embedded&v=kCH9D83vHiA

#394 Edited by tyloss (829 posts) -

@Krelian-co said:

@tyloss said:

@Krelian-co said:

@tyloss said:

@StormyJoe said:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

Read the developer interviews.

So you're talking out of your ass? I read them (Ron quoted them above) and nothing explicitly stated the Xbox and PlayStation will output 100% equally down the line.

Actually, some developers have stated quite the opposite.Link

Show me a quote where they say complete parity between the two consoles will happen on multiplats down the road.

the funny thing he's arguing EVERYONE is saying that, hahahaha. But hey, he never makes bs!

Haha I'll wait if he can't though that'd mean He's a joke and a shitty debater on top of that.

as expected, he makes bullsh1t and straight out lies hoping no one will notice, gets disproved he looks like the clown he is and disappears, rinse and repeat, stormyclown style. Lem desperation continuation.

StormyJoe If you're reading this (which you probably will) just know you're a fucking clown.

#395 Posted by Caseytappy (2146 posts) -

Poor Lems trying to bend the universe to avoid facing the ugly truth, you can run from reality but it won't go away .

Pixels the size of your fist !

#396 Posted by Phazevariance (10980 posts) -

@Caseytappy said:

Poor Lems trying to bend the universe to avoid facing the ugly truth, you can run from reality but it won't go away .

Pixels the size of your fist !

Sounds a lot like cows last generation desperately claiming to be in second place while they landed in 3rd all generation.

#397 Edited by ronvalencia (15129 posts) -
@tormentos said:

@ronvalencia said:

I didn't say PS4 doesn't have PRT functions LOL.

There's a difference between "the level of gain/benefits" vs zero functions. To keep TMUs busy, AMD PRT/Tiled resource is a requirement for X1 i.e.12 CU equipped prototype 7850 with 153.6 GB/s memory bandwidth can do more than 7770's 72 GB/s memory bandwidth.

It would be interesting to if Rebellion can pull-off 1920x1080p/near 60 fps on both PS4 and X1 i.e. Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One

In terms of "the level of gain/benefits", there's a difference between DDR3 memory vs a very slow 5400 RPM laptop hard disk.

Again, your "GPU will see 2 memory banks one it can access one it can't" assertion is wrong i.e.

1. AMD's driver enables AMD PRT features to work with main system memory and that's two memory pools i.e. GDDR5 and DDR3 for gaming PCs. Again, you better review the purpose of having a cache. A cache doesn't exist if there's nothing to cache against a slower memory pool.

Note that HDDs can be use a very slow memory pool i.e. didn't you know virtual memory functions can read/write memory "pages" on the HDDs? My laptops's 512 GB Samsung 840 SSD is faster than early 90s main memory and exceeds the old 32bit PCI standard's 133 MB/s i.e. >200 MB/s

2. Modern GPUs doesn't operate without a driver level software. LOL.

3. DirectX11.X is a superset of PC's DirectX11.2 i.e. it has GCN extensions e.g. AMD GCN on X1 can see both DDR3 and ESRAM memory pools. X1 doesn't need Mantle since it's DirectX has it's own AMD GCN extensions. Again, you claims on GCN's unable to see main memory is wrong. Your stupidity on your part.

4. PCs with AMD GCN has Mantle. Again, you claims on GCN's unable to see main memory is wrong.

5. PCs with DirectX11.2 gains AMD PRT/Tiled resource functions with just-in-time texture streaming+caching from slower memory pools to fast GDDR5 memory pool.

-------

As for 'The Order, Andrea's post clearly states no MSAA at 1080p. The statement is pretty simple and you're adding fan-fiction.

The xbox one doesn't have a prototype 7850 it has a Bonaire 7790 with 2 CU disable an 16 ROP,this has been confirmed to hell and beyond and you refuse to admit it.

The xbox one doesn't have that bandwidth either,the 7850 has 153GB/s single pool of 1GB,other models have 2GB,and 4GB,the xbox one has 32 MB of ESRAM which can achieve 140Gb/s to 150GB/s in the very best case scenario.

PRT will work on PS4 just like it will on xbox one,and you don't need 2 memory pools it is confirmed period,unless you have data from AMD stating that you need 2 memory pools or ESRAM you loss the argument.

Mantle is a to metal API that work more or less on the same level of the PS4 low level api,DX doesn't have that king of low level design,it is say the is lower level than the PC version of DX,but not like the PS4 in any regard.

No the textures are loaded from the HDD or the disc and partially store in video memory,not a single document from AMD state that PRT are loaded from system ram,in fact Anandtech like i posted say it can be from the HDD..lol

Lets wait 1 year and see what your excuse will be when i quote you,using what that dude say about catching up,i just want you see you backpedal again and claim that a 7850 with 12 CU will no pass a 7850 retail again..lol

Your codename comparisons are inconsequential when we are discussing GCN's common functional building blocks.

I have already shown you that the prototype 7850's 32 ROPs wouldn't be at full usage at 153.6 GB/s memory bandwidth.

Based on Vantage's pixel fill rate benchmark scores for the 7950 at 800MHz, X1's 16 ROPs at 853MHz would consume around 127 GB/s, which is close enough to 153.6 GB/s.

If the 7950 at 800MHz is the top 32-ROP sample at 240 GB/s memory bandwidth, then a 16-ROP version would consume around 120 GB/s.

For X1's 853MHz 16 ROPs, it's (853MHz/800MHz) x 120 GB/s ≈ 127 GB/s. Microsoft's ROPs calculation involves memory reads and writes, which saturate X1's ~150 GB/s ESRAM bandwidth. 127 GB/s exceeds the 7790's 96 GB/s memory bandwidth.
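The clock-scaling arithmetic above can be checked directly; the 120 GB/s baseline is the post's assumed figure, not a measurement:

```python
# Scale the assumed 16-ROP bandwidth demand (120 GB/s at 800 MHz)
# linearly with clock speed to estimate the demand at X1's 853 MHz.
base_clock_mhz = 800
base_demand_gbs = 120.0          # assumed 16-ROP demand at 800 MHz

x1_clock_mhz = 853
x1_demand_gbs = (x1_clock_mhz / base_clock_mhz) * base_demand_gbs

print(f"Estimated X1 16-ROP demand: {x1_demand_gbs:.1f} GB/s")
```

The result lands just under 128 GB/s, comfortably above the 7790's 96 GB/s and close to the quoted ESRAM figures.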

16 ROPs is a lesser problem than X1's other GPU issues, i.e. the 32 MB of ESRAM being too small without tiling tricks.
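The "32 MB is too small" point is simple arithmetic. A sketch assuming uncompressed 4-bytes-per-pixel render targets (a simplification; real engines compress, tile, and split targets between ESRAM and DDR3):

```python
# How many uncompressed 4-bytes-per-pixel render targets fit in 32 MB
# of ESRAM at common resolutions.
ESRAM_MB = 32

def target_mb(w, h, bytes_per_pixel=4):
    """Size of one render target in MB."""
    return w * h * bytes_per_pixel / 2**20

fit_1080p = int(ESRAM_MB // target_mb(1920, 1080))  # ~7.9 MB each
fit_900p = int(ESRAM_MB // target_mb(1600, 900))    # ~5.5 MB each

print(f"1080p targets in ESRAM: {fit_1080p}")
print(f"900p  targets in ESRAM: {fit_900p}")
```

Only about four full 1080p targets fit, so a deferred G-buffer plus depth at 1080p overflows ESRAM without tiling, while dropping to 900p buys an extra target.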

The prototype 7850 with 12 CUs and 153.6 GB/s memory bandwidth shows what a 12 CU GCN can do without being gimped by 68-to-104 GB/s memory bandwidth. This establishes its place as being lower than the retail 7850.

Of course X1 doesn't have 2 GB GDDR5 (i.e. the said prototype 7850 with 12 CUs at 860MHz has 2 GB GDDR5 at 153 GB/s), but you didn't factor in Rebellion's specific criteria for X1 being "comparable" to PS4. In other words, don't expect non-tiling game engines on X1 to come close to the prototype 7850's results.

Your discussion around "catching up" or "comparable" is just word play.

The full quote segment

“Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up. But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

You missed the following

"But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it".

PS4 can go further than X1. You're NOT reading the entire statement. LOL.

-------------------

Your AMD PRT streaming-from-HDD point is inconsequential for current PC-style game engines, i.e. 2 GB GDDR5 is good enough for the R9-270, which beats PS4's multiplatform gaming results.

#398 Edited by ronvalencia (15129 posts) -

@tormentos said:
@ronvalencia said:

@tyloss:

@tyloss said:

@StormyJoe: The Xbox One will never achieve parity with any PS4 multiplat that utilizes both consoles power 100% efficiently.

We've already seen this in numerous multiplats.

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Bolcato stated that, “It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

“It means you have to do it in chunks or using tricks, tiling it and so on...

...

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.

“Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up. But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

http://www.developer-tech.com/news/2014/feb/17/xbox-one-1080p-comfortably-sdk-update-says-rebellion/

“They [Microsoft] are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.”

The studio is currently in the process of developing a sequel to the much-loved and graphically-intensive -- ‘Sniper Elite’ series. This has pitted the company in the perfect position to comment on the state of both systems’ development in an unbiased manner.

Sniper Elite 3 Aiming For Full 60FPS And 1080p On PS4 And Xbox One.

Sniper Elite series is not a racing game.

------------------

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

@ronvalencia said:

You missed my statement on the prototype-7850 with 12 CUs will never exceed the retail 7850 with 16 CUs and PS4 has 18 CUs.

Hahhahaaaaaaaaaaaaaaaaaaaaaaa... That is the excuse you give for the xbox one been behind,and now you side with a developer which say the xbox one will catch up when you know it is impossible for a 12 CU GPU to catch up with an 18CU GPU...You are owning your self hard..lol

You missed

"But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time YOU CAN GO A LITTLE BIT FURTHER WITH IT.

LOL. Hahahahahahahahahahahahahahahah.

#399 Posted by no-scope-AK47 (2806 posts) -

How can the xflop, which has more trouble hitting 1080p than the PS4, be the better 1080p console??

#400 Posted by iwasgood2u (819 posts) -

How come the xboner is far behind in sales? They both launched at nearly the same time. If you were smart... you would buy a PS4, not some low-end, expensive piece of junk that's still sitting on store shelves collecting dust. I feel bad for all the hardcore lemmings. They are praying that the new SDK will save them from oblivion. Pretty clear this really is their last hope, and if it fails then they will disappear from System Wars, or continue telling everyone how good it is to own an xboner even if their father, Bill Gates, were to say it's weaker than his precious 360.