ESram on the xboxone explained


#51 Edited by evildead6789 (6825 posts) -

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self owned by lemming king phil spencer.

they still press on with this science fiction, sure 1080p is viable on anything even the 3ds,

but the games will suffer for it "draw distance, texture quality, model density, shader quality etc",

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group where they apply Preparation H to each other's anuses,

oh wait they do that every day with these "esram,dx11.2, dx12, cloud, Kinect" threads

poor @kuu2 @blackace @FreedomFreeLife @FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The X1 is weak sauce, very weak, but so is the PS4.

Apart from that, the actual devs didn't have the right tools for ESRAM at the time the Xbox One released. At launch, 10 percent of the GPU was reserved for Kinect and for switching between Skype and movies, which has since been cut to 2 percent. Those are the two main reasons the Xbox One couldn't output at 1080p.

I don't even have an Xbox One or a PS4. I still have an X360 and a three-year-old mid-range PC (that smokes the PS4). I would maybe consider buying an Xbox One for exclusives or fighting games if they released it without Kinect,

but I would never buy a crappy PlayStation lol. The PS3 was so much stronger too and it showed... way too late lol.

#52 Edited by evildead6789 (6825 posts) -

@scatteh316 said:

Lmao... The PS2's EDRAM was not designed to be used as a cache, it was designed to be used as pure VRAM storage.

PC graphics cards don't use ESRAM or EDRAM because it's a stupid fucking idea.

EDRAM/ESRAM on PC = less die space for ROPs, TMUs and ALUs...

You can upgrade PCs; you can't upgrade a console.

#53 Posted by navyguy21 (12686 posts) -

@navyguy21:

and of course the ps4 tools will go downhill. That's how it works.

Umm... I didn't speak on that because this thread is about ESRAM...

#54 Posted by bloodlust_101 (2675 posts) -

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

What is this 3-year-old mid-range PC made up of, if you don't mind me asking?

And again - 8% more GPU power will not make it output 1080p from 720p. You constantly say the PS4 GPU only gives ~5 frames more than the Xbone (which has been continuously proven wrong). The PS4 is 40% stronger (maximum theoretical), so if 40% is equal to ~5 frames per second, this upgrade will give you a massive 1 frame more on that same game! Huzzah! (Using your PC GPU thread to claim the difference is insignificant.)
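For reference, the scaling argument above is easy to sanity-check. The sketch below uses the numbers assumed in the post (a ~40% theoretical GPU gap worth ~5 fps, scaled linearly), not measured data:

```python
# Rough sketch of the scaling argument above (assumed numbers, not measurements):
# if a ~40% GPU advantage is worth ~5 fps on a given game, reclaiming ~8% of
# GPU time scales to roughly 1 fps on that same game.
gpu_advantage = 0.40          # PS4's assumed maximum theoretical GPU advantage
fps_gain_at_advantage = 5.0   # assumed frame-rate gap quoted in the post
reclaimed_fraction = 0.08     # GPU time freed by dropping the 10% -> 2% reservation

fps_per_percent = fps_gain_at_advantage / (gpu_advantage * 100)
estimated_gain = fps_per_percent * (reclaimed_fraction * 100)
print(f"~{estimated_gain:.1f} fps")   # ~1.0 fps
```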

#55 Posted by evildead6789 (6825 posts) -

This claim of GDDR5 being inferior to a gimped RAM setup is ridiculous. Forza 5's devs acknowledged that outputting 1080p@60 took too much of the ESRAM. The frame buffer takes up anywhere from 18 MB (720p@30) to 32 MB (1080p@60).

You would have been right if there were 64 MB of ESRAM; even 42 MB would have done wonders. But as it is now, 32 MB is gimping the system.

Forza had the problem that it had to work with a devkit that didn't make use of the ESRAM correctly,

and they had to work with 10 percent of the GPU reserved for multimedia. This has now been reduced to 2 percent.
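A rough sketch of the frame-buffer arithmetic behind those 18 MB / 32 MB figures, assuming plain uncompressed 32-bit render targets (the quoted numbers presumably bundle several buffers):

```python
# Quick render-target size check (uncompressed, 4 bytes per pixel assumed).
# A single 1080p colour target is ~8 MB; add a depth buffer and a few
# G-buffer targets for deferred rendering and 32 MB of ESRAM fills up fast.
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(f"720p  colour: {target_mb(1280, 720):.1f} MB")    # ~3.5 MB
print(f"1080p colour: {target_mb(1920, 1080):.1f} MB")   # ~7.9 MB
# e.g. an assumed 1080p deferred setup: depth + 3 G-buffer targets + colour
gbuffer_1080p = 5 * target_mb(1920, 1080)
print(f"1080p 5-target G-buffer: {gbuffer_1080p:.1f} MB vs 32 MB of ESRAM")  # ~39.6 MB
```

A single 1080p colour target alone is already a quarter of the ESRAM, which is why deferred renderers end up splitting targets between ESRAM and DDR3.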

#56 Posted by killatwill15 (845 posts) -

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self owned by lemming king phil spencer.

they still press on with this science fiction, sure 1080p is viable on anything even the 3ds,

but the games will suffer for it "draw distance, texture quality, model density, shader quality etc",

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group were they apply preparation h to each others anus's,

oh wait they do that every day with these "esram,dx11.2, dx12, cloud, Kinect" threads

poor @kuu2@blackace@FreedomFreeLife@FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

like I said, even gamecube is capable of outputting 1080p,

but the game will have to suffer for the resolution boost.

the GPU reserve or ESRAM is no excuse for why the XB1 couldn't display 1080p.

it can, but it would look ugly as fuck,

Crytek said so before switching to 900p for Ryse:

1080p was possible, but they wanted Ryse to look better,

the console isn't capable of both, other ports have proved that,

it is just weak hardware, period.

and if you have a pc more powerful than a ps4,

stick with that,

why waste money on an upscaled Wii U (X1)?

because the only exclusives that stayed on Xbox were Halo, Forza and Gears,

the rest will be PC ports one day.

#57 Edited by evildead6789 (6825 posts) -

@bloodlust_101 said:

@evildead6789 said:

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

What is this 3 year old mid ranged pc made up of if you don't mind me asking?

And again - 8% more gpu power will not make it output 1080p from 720p. You constantly say the PS4 GPU only has ~5 frames more than the XBone (Which has been continuously proven wrong). The PS4 is 40% stronger (Maximum theoretical), so if 40% is equal to ~5 frames per second, this upgrade will give you a massive 1 frame more on that same game! Huzzah! (Using your pc gpu thread to claim the difference is insignificant)

It's in my sig: i5-2500, HD 7870 XT (the HD 7870 XT isn't three years old, but you could have bought something similar 3 years ago). My i5 is double the horsepower of Sony's CPU, btw, and it cost me $190... three years ago. The HD 7870 XT cost me $210. Something similar now costs $180 and it's way stronger than the PS4's GPU.

Thief had an output of 900p, and that was still with 10 percent of the GPU reserved... People blow the PS4's extra power out of proportion; even I did when I saw the hardware specs. But if the ESRAM is used well, there will hardly be a difference. Not to mention, MS overclocked the X1's CPU and GPU.

Having said that, I hate both console manufacturers at this time; the fact that they released such weak consoles is a disgrace to any gaming enthusiast. I hope Nintendo releases a new console, a strong one (like in the rumours).

#58 Posted by misterpmedia (3362 posts) -

IIRC, the max ESRAM bandwidth numbers which take it well above the GDDR5 figure were a theoretical peak. I think the technical fellow from Microsoft said they wouldn't be getting anywhere close to that in real-world performance in their tech breakdown on Eurogamer.

#59 Posted by evildead6789 (6825 posts) -

IIRC, the max ESRAM bandwidth numbers which take it well above the GDDR5 one was a theoretical peak. I think the technical fellow from microsoft said they wouldn't be getting at all close to that on real world performance in their tech break down on Eurogamer.

Doesn't matter, it's still on-chip RAM, and that's what makes the big difference. The GDDR5 in Sony's PS4 is standalone RAM; even worse, it's not dedicated but shared.
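For context, these are the bandwidth figures that were widely quoted around launch, ballpark peak numbers rather than sustained rates; the ~204 GB/s combined read+write ESRAM peak and the 140-150 GB/s "realistic" figure came from Microsoft's own Eurogamer breakdown mentioned above:

```python
# Widely reported peak bandwidth figures (approximate, not sustained rates):
# - Xbox One DDR3-2133 on a 256-bit bus
# - Xbox One ESRAM, one direction (128 bytes per clock assumed); the combined
#   read+write peak was quoted at ~204 GB/s, with ~140-150 GB/s called realistic
# - PS4 GDDR5-5500 on a 256-bit bus
ddr3  = 2133e6 * 256 / 8 / 1e9       # ~68 GB/s
esram = 853e6 * 128 / 1e9            # ~109 GB/s per direction
gddr5 = 5500e6 * 256 / 8 / 1e9       # ~176 GB/s
print(f"DDR3 ~{ddr3:.0f} GB/s, ESRAM ~{esram:.0f} GB/s per direction, GDDR5 ~{gddr5:.0f} GB/s")
```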

#60 Edited by evildead6789 (6825 posts) -

@killatwill15 said:

@evildead6789 said:

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self owned by lemming king phil spencer.

they still press on with this science fiction, sure 1080p is viable on anything even the 3ds,

but the games will suffer for it "draw distance, texture quality, model density, shader quality etc",

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group were they apply preparation h to each others anus's,

oh wait they do that every day with these "esram,dx11.2, dx12, cloud, Kinect" threads

poor @kuu2@blackace@FreedomFreeLife@FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

like I said, even gamecube is capable of outputting 1080p,

but the game will have to suffer for the resolution boost.

the gpu reserve or esram is no excuse to blame that xb1 couldn't display 1080.

it can, but it would look ugly as fuck,

crytek said so before switching to 900p for ryse,

1080 was possible, but they wanted ryse to look better,

the console isn't capable of both, other ports have proved that,

it is just weak hardware, period.

and if you have a pc more powerful than a ps4,

stick with that,

why waste money on an upscaled wii u(x1).

because the only exclusives that stayed on xbox was halo forza gears,

the rest will be pc ports one day.

Crytek didn't have the right ESRAM tools and they had the GPU reservation. The extra power in the PS4 is blown out of proportion. You can see it in the difference in frame rates in the charts. Even the PS4 struggles at 1080p. The PS4 may have the edge, but they're both weak systems, and the X1 was released with bad dev tools and a GPU reservation of 10 percent. Don't forget they overclocked the system too.

The biggest problem the X1 had was a bad launch because of management. That and the mandatory Kinect.

#61 Posted by arkephonic (6208 posts) -

The funniest part about all this is that the reason the Xbox One is so weak is because of Kinect needing to be in every SKU. They had to make concessions to the hardware specs to keep it affordable and under the poorly received $600 ps3. The problem with this is that Kinect sucks for video games and serves no purpose. You know what does serve a purpose in gaming? Fast ram, powerful video cards and nice processors. In a perfect world, sure, Kinect could be in every SKU so that every xb1 owner had one and devs could feel better about spending resources on Kinect implementation in their games, but that whole idea is far fetched. Due to the fact that people are scared shitless to price a console at $600 after the ps3 reception, there's just no way they could have released a powerful console with Kinect included. Kinect sucks for video games, period. The fact that Microsoft is hell bent on making Kinect an integral part of video games is their biggest downfall. Kinect sucks and won't ever be a good controller. Ever. Microsoft deserves to fail for even dabbling with this Kinect bullshit. I would own an Xbox one if Kinect didn't exist.

#62 Posted by evildead6789 (6825 posts) -

The funniest part about all this is that the reason the Xbox One is so weak is because of Kinect needing to be in every SKU. They had to make concessions to the hardware specs to keep it affordable and under the poorly received $600 ps3. The problem with this is that Kinect sucks for video games and serves no purpose. You know what does serve a purpose in gaming? Fast ram, powerful video cards and nice processors. In a perfect world, sure, Kinect could be in every SKU so that every xb1 owner had one and devs could feel better about spending resources on Kinect implementation in their games, but that whole idea is far fetched. Due to the fact that people are scared shitless to price a console at $600 after the ps3 reception, there's just no way they could have released a powerful console with Kinect included. Kinect sucks for video games, period. The fact that Microsoft is hell bent on making Kinect an integral part of video games is their biggest downfall. Kinect sucks and won't ever be a good controller. Ever. Microsoft deserves to fail for even dabbling with this Kinect bullshit. I would own an Xbox one if Kinect didn't exist.

It will be good for virtual reality, though, but I think the Xbox One is too weak for that, or is it not? I mean, if they can store 6 GB of textures in the 32 MB of RAM, they still have 8 GB of RAM left to store more textures. Well, some is reserved for other stuff, but I'm sure you catch my drift.

As a couch system, yeah, it remains to be seen if it's good or not; at this time the PS4 is better, but it stays a weak system.

#63 Posted by slimdogmilionar (286 posts) -

I'm no engineer, but I do think the Xbox is not being used to its full potential. The first reason I believe this is that we keep hearing devs say over and over that they are not using ESRAM right now. Another reason is that for everything the Xbox can do, it seems to have dedicated hardware for each task, so they can run in parallel with each other.

#64 Edited by Oemenia (10216 posts) -

@Martin_G_N said:

@evildead6789:

Yeah right, there are several reasons why Nvidia and AMD haven't used cache on GPUs. One is being able to use GDDR5, and another is that the EDRAM takes up a lot of space on the die, which is why the X1's GPU is so small. The only reason they ended up with EDRAM is because they wanted 8 GB of main memory, and at the time all they could use was the slow DDR3. The idea of having fast cache on the GPU is nothing new; the PS2 had that.

Yes, and look what the PS2 did with its very weak specs.

I don't know the reason why they didn't use cache. You could be right that it uses too much space on the die, but that's kinda weird; it isn't too much space for CPUs and they are much smaller. I think it is because PCs use different cards, different chips. Having something like EDRAM on one chip and not on another could cause problems with compatibility. With a PC you can also buy a better video card with more RAM, so this kind of optimization would not only give the devs more work, but also the people that make the drivers.

The X1 doesn't have this problem. It's only one platform.

The X1 could have used GDDR5 as well, but at the time of R&D it was too expensive. Sony took a chance there, guessing the price would drop, and they guessed right. GDDR5 may be faster than DDR3, but calling DDR3 slow is a bit over the top. DDR3 cards are slower cards, but mostly the card will go one tier down; for example, the 7790 with DDR3 has the power of the HD 7770 with GDDR5.

I do agree that MS didn't intentionally want to gimp the system with slow RAM; they thought they could only have a lot of it by going with DDR3 and so had to compensate with the ESRAM.

On the other hand, they had a perfectly fine road-map in place that used DDR4, but it was scrapped at the last minute. If anything, the PS4 is more powerful by chance; we were all assuming Sony would go for a leaner system after the losses of the PS3.

#65 Posted by killatwill15 (845 posts) -

@killatwill15 said:

@evildead6789 said:

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self owned by lemming king phil spencer.

they still press on with this science fiction, sure 1080p is viable on anything even the 3ds,

but the games will suffer for it "draw distance, texture quality, model density, shader quality etc",

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group were they apply preparation h to each others anus's,

oh wait they do that every day with these "esram,dx11.2, dx12, cloud, Kinect" threads

poor @kuu2@blackace@FreedomFreeLife@FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

like I said, even gamecube is capable of outputting 1080p,

but the game will have to suffer for the resolution boost.

the gpu reserve or esram is no excuse to blame that xb1 couldn't display 1080.

it can, but it would look ugly as fuck,

crytek said so before switching to 900p for ryse,

1080 was possible, but they wanted ryse to look better,

the console isn't capable of both, other ports have proved that,

it is just weak hardware, period.

and if you have a pc more powerful than a ps4,

stick with that,

why waste money on an upscaled wii u(x1).

because the only exclusives that stayed on xbox was halo forza gears,

the rest will be pc ports one day.

crytek didn't have the right esram tools and they had the gpu lock. The extra power in the ps4 is blown outta proportion. You can see it in the differrence in frame rates in the charts. Even the ps4 struggles at 1080p. The ps4 may have the edge but they're both weak systems and the x1 was released with bad devtools and a gpu lock of 10 percent. Don't forget they overclocked the system too.

The biggest problem x1 has was a bad launch because of management. That and the mandatory kinect.

it wasn't locked at 10 percent,

10 percent was locked away,

the PS4 and Xbone may be weak by today's comparisons,

but there is no denying that the xbone is the weakest of the 2,

and the discrepancy will only be more noticeable down the road as multiplats get even more ambitious and optimised

#66 Posted by stereointegrity (10675 posts) -

@misterpmedia said:

IIRC, the max ESRAM bandwidth numbers which take it well above the GDDR5 one was a theoretical peak. I think the technical fellow from microsoft said they wouldn't be getting at all close to that on real world performance in their tech break down on Eurogamer.

Doesn't matter , it's still on chip ram, and that's what makes the big difference. The gddr5 in sony ps4 is standalone ram, even worse it's not dedicated but shared.

It's more cache than it is RAM; it has been shown that way...

and yes, the RAM in the PS4 is standalone, but HSA was made for shared memory, so both CPU and GPU can share the same data without having to pass it between each other.
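A toy illustration of that zero-copy point: with a unified pool the GPU can read a buffer the CPU just wrote, while a split-memory design pays an explicit transfer first. The bus speed and buffer size below are made-up illustrative numbers, not console specs:

```python
# Toy illustration of why devs like unified (shared) memory: with a single
# address space the GPU can read a buffer the CPU just wrote, instead of
# copying it across a bus first. Assumed numbers, purely illustrative.
asset_mb = 256
bus_gb_per_s = 16          # assumed CPU->GPU copy bandwidth on a split-memory design
copy_ms = asset_mb / 1024 / bus_gb_per_s * 1000
print(f"split memory: ~{copy_ms:.1f} ms to copy a {asset_mb} MB buffer")
print("unified memory: CPU and GPU reference the same buffer, copy cost ~0 ms")
```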

#67 Posted by silversix_ (13546 posts) -

So good that it has to run outdated-looking games at 720p. Just let it go already.

#68 Posted by Solid_Max13 (3476 posts) -

Didn't @tormentos debunk half of this with legit sources and all? ESRAM is powerful, but they used too little of it and devs prefer GDDR5; ESRAM is a pain to dev for.

#69 Posted by scatteh316 (4732 posts) -

I already posted a comment from a developer showing the ESRAM is too piss-poor small to be of any real benefit to the Xbone.

It needed to have 64 MB of it for it to be classed as a good advantage.

The PS4 has a memory sub-system that is just as quick and multiple factors bigger.

#70 Edited by killatwill15 (845 posts) -

@Solid_Max13 said:

Didn't @tormentos debunk half of this with legit sources and all? eSRAM is powerful but they used to little and devs prefer GDDR5, eSRAM is a pain to dev for.

yeah he did,

but you know lemmings like to bring back stuff from the dead,

explains all the zombie games the xbone has.

or could it be that lemmings are braindead?

#71 Edited by scatteh316 (4732 posts) -

@killatwill15 said:

@evildead6789 said:

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self owned by lemming king phil spencer.

they still press on with this science fiction, sure 1080p is viable on anything even the 3ds,

but the games will suffer for it "draw distance, texture quality, model density, shader quality etc",

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group were they apply preparation h to each others anus's,

oh wait they do that every day with these "esram,dx11.2, dx12, cloud, Kinect" threads

poor @kuu2@blackace@FreedomFreeLife@FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

like I said, even gamecube is capable of outputting 1080p,

but the game will have to suffer for the resolution boost.

the gpu reserve or esram is no excuse to blame that xb1 couldn't display 1080.

it can, but it would look ugly as fuck,

crytek said so before switching to 900p for ryse,

1080 was possible, but they wanted ryse to look better,

the console isn't capable of both, other ports have proved that,

it is just weak hardware, period.

and if you have a pc more powerful than a ps4,

stick with that,

why waste money on an upscaled wii u(x1).

because the only exclusives that stayed on xbox was halo forza gears,

the rest will be pc ports one day.

crytek didn't have the right esram tools and they had the gpu lock. The extra power in the ps4 is blown outta proportion. You can see it in the differrence in frame rates in the charts. Even the ps4 struggles at 1080p. The ps4 may have the edge but they're both weak systems and the x1 was released with bad devtools and a gpu lock of 10 percent. Don't forget they overclocked the system too.

The biggest problem x1 has was a bad launch because of management. That and the mandatory kinect.

PS4 has a 50% texture mapping advantage, a 50% shader performance advantage and TWICE the fill rate...

ESRAM will NOT make up for any of that; the only way to reduce that performance lead is to increase the clock speeds, which Microsoft have already done.

But as the Xbone's OS consumes more resources anyway, it's hardly dented the PS4's performance lead.

That fill rate advantage is why the PS4 is hitting 1080p on pretty much everything, and software, API or hardware hacks will not give the Xbone double the fill rate all of a sudden.
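Those ratios can be checked against the commonly cited specs (12 CUs @ 853 MHz and 16 ROPs for the Xbox One, 18 CUs @ 800 MHz and 32 ROPs for the PS4); note the 50% figure is the raw CU count, which the Xbox One's clock bump narrows to roughly 40% in FLOPS:

```python
# Back-of-envelope check of the quoted gaps, using the commonly cited specs.
def tflops(cus, mhz):
    return cus * 64 * 2 * mhz * 1e6 / 1e12   # 64 shaders/CU, 2 FLOPs per shader per clock

def gpix_per_s(rops, mhz):
    return rops * mhz * 1e6 / 1e9             # one pixel per ROP per clock

xb1_flops, ps4_flops = tflops(12, 853), tflops(18, 800)   # ~1.31 vs ~1.84 TFLOPS
xb1_fill, ps4_fill = gpix_per_s(16, 853), gpix_per_s(32, 800)   # ~13.6 vs 25.6 Gpix/s
print(f"shader: {xb1_flops:.2f} vs {ps4_flops:.2f} TFLOPS (PS4 ~{ps4_flops / xb1_flops - 1:.0%} ahead)")
print(f"fill:   {xb1_fill:.1f} vs {ps4_fill:.1f} Gpix/s (PS4 ~{ps4_fill / xb1_fill:.1f}x)")
```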

#72 Edited by ronvalencia (15109 posts) -

@Martin_G_N:

@Martin_G_N said:

The speed of the ESRAM and the GDDR5 is pretty similar, and there is a lot more of the GDDR5 RAM in the PS4, which is better. I think having a fast cache is important, but it would have to be a lot faster than the GDDR5 RAM to make a difference, which it isn't.

And it is still funny to hear about how the PS4 can't support some feature because it doesn't have DirectX 11-something or 12. It can utilize the same features as any DirectX version through OpenGL, if it has the power for it, of course. Remember tessellation, which was a DirectX-only feature when it arrived. It could also be done on the PS3.

Tessellation can be done on DX9 GPUs (i.e. vertex shader ping-pong), but it's not as efficient compared to dedicated hardware and expanded register storage.

If you use tessellation on the SPUs, you will reduce the available compute power for other tasks.

Note why Intel Larrabee was dead on arrival, i.e. a pure stream processor solution is not the answer for pure GPU performance.

ATI learnt their lesson with R600's shader-powered MSAA, i.e. RV670 includes the full MSAA hardware.

#73 Posted by scatteh316 (4732 posts) -

@Martin_G_N:

@Martin_G_N said:

The speed difference between the ESRAM and GDDR5 is pretty similar, and there is alot more of the GDDR5 RAM in the PS4 which is better. I think having a fast cache is important, but it would have to be alot faster than the GDDR5 RAM to make a difference, which it isn't.

And it is still funny to hear about how the PS4 can't support some feature because it doesn't have Direct X 11.something or 12. It can utilize the same features as any Direct X version can through Open GL, if it has the power for it of course. Remember Tesselation, which was a Direct X only feature when it arrived. It could also be done on the PS3.

ATI learnt their lesson with R600's shader powered MSAA i.e. RV670 includes the full MSAA hardware.

And yet, funnily enough, over the last couple of years shader-based post-process AA has become the new trend...

#74 Posted by misterpmedia (3362 posts) -

@misterpmedia said:

IIRC, the max ESRAM bandwidth numbers which take it well above the GDDR5 one was a theoretical peak. I think the technical fellow from microsoft said they wouldn't be getting at all close to that on real world performance in their tech break down on Eurogamer.

Doesn't matter , it's still on chip ram, and that's what makes the big difference. The gddr5 in sony ps4 is standalone ram, even worse it's not dedicated but shared.

Not getting how this is bad lol. It's what developers wanted.

#75 Edited by delta3074 (17595 posts) -

@tyloss said:

@navyguy21: yeah, no.

YES, he's actually right. PS3 development really kicked off with the EDGE tools; 3rd-party titles looked like rubbish on the PS3 until Sony introduced the EDGE tools and lowered the OS overhead on the PS3.

Clearly you have no idea how optimisation works, so you should really shut your mouth and keep your asinine comments to yourself.

#76 Edited by urbansys (231 posts) -

Clouds... ESram... DirectX 11.2... Overclocks... Stacked GPUs... Tiled Resources... Reduced gpu overheads... DirectX 12... Reduced cpu overheads...

Give up. The desperation is becoming too much - you lemmings are ruining yourselves... The PS4 is stronger.

You clearly did not read the link... Then again, the attention span of a lot of you here is no more than a dog's with a squirrel.

#77 Posted by MonsieurX (28649 posts) -

@lglz1337 said:

@MonsieurX: you know the story about half a brain? I guess you do; if not, go to another thread.

So you can't explain it, I'm almost surprised.

#78 Edited by FoxbatAlpha (5884 posts) -

A very interesting breakdown of how ESRAM is beneficial. Thanks for posting.

I just would like to address the GPU in the Xbox One. Microsoft never stated that it is a 7790. Everyone thinks it is a 7790 because DF matched up specs in this range and the 7790 was close. Not really the case, though.

This is from an article about next-gen GPUs and DX12 hardware. A 7790 isn't next gen. Whatever is in the Xbox One is next gen.

http://wccftech.com/microsoft-teased-directx-12-features-reserved-generation-gpus-unveil/

DirectX 12 will indeed make lower-level abstraction available (but not mandatory—there will be backward-compatibility with DX11) on existing hardware. However, Tamasi explained that DirectX 12 will introduce a set of new features in addition to the lower-level abstraction, and those features will require new hardware. In his words, Microsoft “only teased” at some of those additions this week, and a “whole bunch more” are coming. (via TechReport) Read more: http://wccftech.com/microsoft-teased-directx-12-features-reserved-generation-gpus-unveil/#ixzz2wsayh5mo

#79 Edited by ronvalencia (15109 posts) -

@scatteh316 said:

@ronvalencia said:

@Martin_G_N:

@Martin_G_N said:

The speed difference between the ESRAM and GDDR5 is pretty similar, and there is alot more of the GDDR5 RAM in the PS4 which is better. I think having a fast cache is important, but it would have to be alot faster than the GDDR5 RAM to make a difference, which it isn't.

And it is still funny to hear about how the PS4 can't support some feature because it doesn't have Direct X 11.something or 12. It can utilize the same features as any Direct X version can through Open GL, if it has the power for it of course. Remember Tesselation, which was a Direct X only feature when it arrived. It could also be done on the PS3.

ATI learnt their lesson with R600's shader powered MSAA i.e. RV670 includes the full MSAA hardware.

And yet funnily enough over the last couple of years shader based post process AA has become the new trend....

It doesn't help ATI's position when its benchmark scores are less than the competition's. It's clear you do not understand the competitive nature of the PC GPU marketplace.

Also, MSAA doesn't equal screen-space-based AA, i.e. MLAA, FXAA, etc.

#80 Edited by ronvalencia (15109 posts) -
@FoxbatAlpha said:

A very interest breakdown of how EsRam is beneficial. Thanks for posting.

I just would like to address the GPU in the Xbox One. Microsoft never stated that it is a 7790. Everyone thinks it is a 7790 because DF matched up specs in this range and the 7790 was close. Not really the case though.

This is from a article claiming about next gen GPU's and DX12 hardware. A 7790 isn't next gen. Whatever is in the Xbox One is next gen.

http://wccftech.com/microsoft-teased-directx-12-features-reserved-generation-gpus-unveil/

DirectX 12 will indeed make lower-level abstraction available (but not mandatory—there will be backward-compatibility with DX11) on existing hardware. However, Tamasi explained that DirectX 12 will introduce a set of new features in addition to the lower-level abstraction, and those features will require new hardware. In his words, Microsoft “only teased” at some of those additions this week, and a “whole bunch more” are coming. via TechReportRead more: http://wccftech.com/microsoft-teased-directx-12-features-reserved-generation-gpus-unveil/#ixzz2wsayh5mo

The article was from Techreport. http://techreport.com/news/26210/directx-12-will-also-add-new-features-for-next-gen-gpus

"However, Tamasi explained that DirectX 12 will introduce a set of new features in addition to the lower-level abstraction, and those features will require new hardware."

Mr Tamasi is from NVIDIA and doesn't have authority over non-NVIDIA hardware.

Techreport's "new blend modes" and something called "conservative rasterization" refer to DirectX 12's new rendering modes.

1. "Programmable blend and efficient OIT with pixel ordered UAV".

2. "Better collision and culling with Conservative Rasterization".

If you compare point 1 with http://software.intel.com/en-us/blogs/2013/07/18/order-independent-transparency-approximation-with-pixel-synchronization

The API is based on Intel's Pixel Sync. The main feature of this Intel API is the pixel shader wait function. This avoids the "linked list" requirement, since the pipeline handles the pixel shader read/manage/write ordering.

DirectX 11's linked-list-based OIT, from http://www.docstoc.com/docs/106125562/Order-Independent-Transparency-Using-DirectX-11-Linked-Lists

OpenGL 4.0+'s linked-list-based OIT, from http://blog.icare3d.org/2010/07/opengl-40-abuffer-v20-linked-lists-of.html
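A minimal CPU-side sketch of the linked-list OIT idea those links describe: fragments are collected per pixel in arbitrary order, then sorted by depth and blended back-to-front. On the GPU the append goes through a UAV with atomic counters, and pixel-ordered UAVs / Pixel Sync let the hardware enforce the ordering inside the pixel shader instead:

```python
# Conceptual CPU-side simulation of per-pixel linked-list OIT (not GPU code).
from collections import defaultdict

per_pixel_fragments = defaultdict(list)   # pixel -> [(depth, rgb, alpha), ...]

def emit_fragment(pixel, depth, rgb, alpha):
    # On the GPU this is the unordered "append to the per-pixel list" step.
    per_pixel_fragments[pixel].append((depth, rgb, alpha))

def resolve(pixel, background=(0.0, 0.0, 0.0)):
    # Sort far-to-near, then blend each transparent fragment over the result.
    colour = background
    for depth, rgb, alpha in sorted(per_pixel_fragments[pixel], reverse=True):
        colour = tuple(alpha * s + (1 - alpha) * d for s, d in zip(rgb, colour))
    return colour

emit_fragment((0, 0), depth=0.8, rgb=(1.0, 0.0, 0.0), alpha=0.5)   # far red glass
emit_fragment((0, 0), depth=0.2, rgb=(0.0, 0.0, 1.0), alpha=0.5)   # near blue glass
print(resolve((0, 0)))   # (0.25, 0.0, 0.5): blue over red over black
```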

Again, NVIDIA's Mr Tamasi DOES NOT have any authority for non-NVIDIA hardware. Intel Haswell says Hi and says F**k you on non-Intel GPU competitors.

PS; My Intel Core i7-4770K "Haswell" HD 4600 IGP has the above Intel feature. LOL.

AMD GCN has out-of-order wavefront (i.e. shader instruction bundle) processing features. With out-of-order wavefront processing, if the data doesn't exist in a CU's SRAM storage areas, it will cause a stall/wait state for the said wavefront and execute another non-dependent wavefront.

AMD GCN's "programmable scheduler" is a feature that could duplicate the shader wait/stall state.

AMD's ACE unit is similar to CELL's SPU, and there's a chance for a workaround, e.g. an API wrapper that runs an ACE-unit GPGPU program for the "Conservative Rasterization" workload. Unlike the old DirectX 11.2, low-latency APIs can enable proper real-time GPGPU compute results to be returned to the CPU within the 16 ms frame render.

Read http://www.slideshare.net/zlatan4177/gpgpu-algorithms-in-games (AMD's HSA compute, real-time GPGPU results returned in games). AMD talks about improved culling resolution via HSA compute. PC's DirectX 11.2 is too slow for proper real-time GPGPU compute, and that's why you don't see programmable GPUs patching APIs.

The Xbox One has 2 ACE units, each with 8 queues, i.e. a total of 16 queues. This is where the PS4's 8 ACE units have an advantage, i.e. more room for creating API workarounds.

From http://www.amd.com/us/Documents/GCN_Architecture_whitepaper.pdf More information on GCN's shader program kernel scheduling features.

While ACEs ordinarily operate in an independent fashion, they can synchronize and communicate using cache, memory or the 64KB Global Data Share. This means that an ACE can actually form a task graph, where individual tasks have dependencies on one another. So in practice, a task in one ACE could depend on tasks on another ACE or part of the graphics pipeline. The ACEs can switch between tasks queue, by stopping a task and selecting the next task from a different queue. For instance, if the currently running task graph is waiting for input from the graphics pipeline due to a dependency, the ACE could switch to a different task queue that is ready to be scheduled. The ACE will flush any workgroups associated with the old task, and then issue workgroups from the new task to the shader array.

AMD ACEs can sync results. The PS4 and R9-290X/290 have 64 queues from 8 ACE units.

---------------

The main difference is AMD used the word "FULL" for their compatibility with "DirectX 12".

Read http://www.amd.com/us/press-releases/Pages/amd-demonstrates-2014mar20.aspx

"Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture".

I'll bet AMD HSA compute will be covering PC API politics, i.e. CELL-style, with AMD GCN's stream processor magnitude.

AMD also mentions "software rasterization" on HSA GPGPUs, from http://www.slideshare.net/zlatan4177/gpgpu-algorithms-in-games This should sound familiar to PS3 CELL fanboys. This is why AMD can claim "Full DirectX 12 compatibility", i.e. ~5.6 TFLOPS for "software rasterization" on the R9-290X, i.e. AMD's Larrabee-style fallback mode; and unlike Larrabee, AMD GCN still has its heavy rasterization hardware.

---------------

#81 Edited by tormentos (16408 posts) -

To anyone familiar with software programming, the problem MS had with the XB1 was its rushed design and last-minute changes. Devs didn't have the drivers to take advantage of the changes, or even a chance to play around with ESRAM to see its capabilities.

I'd wager that they aren't using ESRAM for reading and writing at the same time, which it is capable of. They probably aren't using tiled resources either.

There is no reason a last-gen game like Tomb Raider wouldn't run at 1080p other than outdated drivers.

PC gamers know that having outdated drivers for new games can effectively cut your framerate in half.

MS should've gone for the kill against Sony, but instead the execs in charge didn't understand gaming and tried to cut corners.

It's good that the people in charge now, like Phil Harrison, really understand gaming and are making a ton of changes to make Xbox better. They have a long way to go since Phil Spencer and Ballmer set the place on fire and left.

ESRAM is just fast memory; it doesn't have any capabilities other than helping the Xbox One with its DDR3 bandwidth problem. The Xbox One could have 1,000 GB/s of bandwidth and it wouldn't change the fact that it has a weak GPU. It could have 8 GB of GDDR5 like the PS4 and the PS4 would still outperform it, because the PS4 has a stronger GPU from the same line.

And they are using ESRAM for reading and writing, dude; it's already been stated by MS itself, and they're getting 140 GB/s to 150 GB/s, which is quite a bit more than what the full 7790 with 1.79 TFLOPS has.

Tomb Raider is 1080p on Xbox One; it just has to give up frames and effects quality to do so, because the ESRAM is too small. In fact, Tomb Raider, while being 1080p on Xbox One, shows one of the biggest gaps in performance: up to 100% faster frame rates on PS4, plus higher-resolution effects on PS4 and even better quality textures, all while having 1080p cut scenes as well, which the Xbox One has to downgrade to 900p.

The Xbox One can achieve 1080p; it just has to sacrifice something.

Developers have already said what the problem was, and it wasn't just drivers: ESRAM is too small to fit everything in it. Hell, Killzone was using a 50 MB frame buffer; on Xbox One that is a problem, since it only has 32 MB. A developer on Beyond3D also gave an example of how his game had 50 MB for several things, plus 64 MB for shadow maps and 64 MB for cascade maps; we are talking about 178 MB of render targets, and he didn't even include MSAA. So in order to fit all that on Xbox One they have to cut corners: frames, resolution, texture quality or effects.

MS didn't try to cut corners; they knew what they were doing and they had a complete plan in line: a Kinect-built-in, TV-oriented box that required internet. That was the plan, and they didn't need a 7970 to pull that off, so they designed the unit around what they needed. The backlash MS faced was double: once for trying to push the draconian DRM, and again because they lied about it and spun it.

@Tighaman said:

No one is going to listen; they're gonna say it's too small, they're gonna say you got that from MisterXMedia, or they're gonna bring you a quote from some grapevine story that's been stepped on sooooo many times but carried on as fact lol. They keep on talking about a 7770 or 7790 with ESRAM like you can just slap it on the GPU and presto, magic.

Developers say it is too small, period; you can't fit everything there, and that is the reason why it has to make some sacrifices. But even with GDDR5 it would still be outperformed, because a 12 CU GCN will never beat an 18 CU GCN, period.

@evildead6789:

Yeah right, there is several reasons why Nvidia and AMD hasn't used cache on GPU's. One is being able to use GDDR5 and another is because the EDRAM takes up a lot of space on the board. Which is why the X1's GPU is so small. The only reason they ended up with EDRAM is because they wanted 8 GB of main memory, and at the time all they could use was the slow DDR3. The idea of having fast cache on the GPU is nothing new, the ps2 had that.

Because they don't need crappy ESRAM stealing die space; there are cards with more than 300 GB/s of bandwidth and they don't use ESRAM.

MS ended up with DDR3 because it was cheap and cost-effective, and since they already used EDRAM on the Xbox 360, they went with ESRAM. They could have used 8 GB of GDDR5; nothing stopped MS, they had the money to do so, and the Xbox One had more or less the same time in the making as the PS4. So yeah, the whole excuse that DDR3 was the only thing they could use is a joke; GPUs on PC had been using GDDR5 for years. MS just wanted to keep the cost as low as possible to include Kinect while making a profit, which is what they were doing at launch.

I wouldn't say secret sauce.

It's more that devs now have the tools to actually use the hardware design Microsoft finally settled on.

Also to be considered: if devs are simply porting engines/games from last gen (which happens at launch), that makes this worse, since they have to account for yet another barrier they have to code for. No need for that with the PS4.

So I would say devs are finally getting a set of instructions to build the playset rather than just trying to figure it out based on the old model.

Hardware in consoles is fixed, so there is never any secret sauce unless the platform holder holds back information, which would anger devs and would leak out anyway.

The PS3 didn't have any secret sauce either; devs finally learned to work with it.

Same goes for XB1, the tools will catch up to the architecture

The PS3 had a CPU that could offload tasks from the GPU; it helped the GPU by taking load off its back. Developers didn't know how to use it and were not using it at all; in fact, the first PS3 games used only the PPE and not a single SPE. ESRAM can't offload tasks from the GPU, it is just memory. The Xbox One, even partially using ESRAM, is giving the GPU more than enough bandwidth; GPUs like the Xbox One's on PC don't have 150 GB/s of bandwidth, the 7770 has 72 GB/s and the full 7790, which is stronger than the Xbox One's, has 96 GB/s.

#82 Edited by evildead6789 (6825 posts) -

@tormentos said:

ESRAM is just fast memory it doesn't have any capabilities other than helping the xbox one with its DDR3 bandwidth problem,the xbox one could have 1,000GB bandwidth is doesn't change the fact that it has a weak GPU,it could have 8 GB of GDDR5 like the PS4 and the PS4 still would out perform it,because it has a stronger GPU from the same line.

And they are using ESRAM for reading and writing dude,already stated by MS it self and getting 140Gb/s to 150GB/s which is quite more than what the full 7790 with 1.79TF has.

Tombd Raider is 1080p on xbox one,it just have to give up frames,and effects quality to do so,because ESRAM is to small,in fact Tomb Raider while been 1080p on xbox one show one of the biggest gaps in performance,up to 100% faster frames on PS4,but higher resolutions effects on PS4 and even better quality textures all while having 1080p cut scenes as well which the xbox one has to downgrade to 900p.

The xbox one can achieve 1080p just have to sacrifice something.

Developers already say what was the problem it wasn't just drivers,ESRAM is to small to fit everything in it,hell killzone was using 50MB of frame buffer on xbox one that is a problem it only has 32mb,a developer in beyond3d also gave an example of how hes game had 50MB for several things,+ 64 mb for shadow maps, and 64 MB for cascade maps,we are talking about 178MB of frame buffer,and he didn't even include MSAA,so in other to fit all that on xbox one they have to cut corners,frames,resolution,texture quality or effects.

MS didn't try to cut corners they knew what they where doing and they had a complete plan in line,Kinect build in TV oriented box,that require internet that was the plant and they didn't need a 7970 to pull that off,so they design the unit around what they need,the backlash MS faced was double once for trying to push the Draconian DRM and 2 because they lie about it and spin it.

Developers say is to small period you can't fit everything there and is the reason why it has to do some sacrifices,but even with GDDR5 it would still be outperform because a 12 CU GCN with never beat a 18 U GCN period.

Because they don't need crappy ESRAM stealing die space,there are cards with more than 300GB/s bandwidth and they don't use ESRAM.

MS ended with DDR3 because it was cheap,and cost effective and since they already use EDRAM on xbox 360,they went with ESRAM,the could have use 8GB of GDDR5 nothing stopped MS,they had the money to do so,and the xbox one has more or less time in the making than the PS4,so yeah the whole excuse of DDR3 is the only thing they could use is a joke GPU on PC had been using GDDR5 for years now,MS just wanted to keep the cost as low as possible to include Kinect while making a profit which is what they were doing at launch.

The PS3 had a CPU that could offload task from the CPU,it helped the GPU by taking load of its back,developers didn't know how to use it,and were not using it at all,in fact first PS3 games used only the PPE and not a single SPE,ESRAM can't offload task from the GPU is just memory,the xbox one even partially using ESRAM is giving the GPU more than enough bandwidth GPU like the xbox one on PC don't have 150GB/s bandwidth the 7770 has 72Gb/s and the full 7790 which is stronger than the xbox one has 96GB/s.

ESRAM is more than fast memory; the fact that it is on-chip RAM makes the communication between GPU and RAM a lot faster. It's similar to cache on CPUs. If I followed your reasoning, Xeon and Phenom CPUs wouldn't be faster than their Pentium and Athlon counterparts that have less cache.

Tomb Raider was made, like all games developed before January 21, with 10 percent of the GPU reserved for Kinect and multimedia; this has been reduced to 2 percent since January 2014. Devs also didn't have the right dev tools at the time to work with the ESRAM. Future games will have more power available. How much remains to be seen.

Maybe the Xbox One has to sacrifice something at 1080p, but this is all with the GPU reservation and the wrong ESRAM tools. I'm not saying they will achieve 1080p in the future, but the difference with the PS4 will be less.

Not everything has to fit in ESRAM; it's not like they have to choose between ESRAM and DDR3 all the time. Again, devs didn't have the right tools.

ESRAM isn't used on GPU chips because it isn't standardized and the technology to use it is very new. If it is successful, it could very well be that all cards start to use it; it all depends on the tools that are made for it. It isn't really necessary at this time because GPU cards can still manage with reasonable amounts of RAM. It's still easier for a dev to just load the texture as a whole into memory, especially when the tech is pretty new and the right tools aren't readily available. As for die space, on-chip RAM doesn't necessarily have to be ESRAM.

ESRAM is one of the reasons the Xbox One has a higher price, that and the Kinect. Since it's supported in DirectX 11.2, I doubt they did this as a last-minute counter to Sony's GDDR5. They would have been very lucky, then, to stumble on tech that can store 6 GB of textures in 32 MB of RAM. If MS had known the price of GDDR5 would drop, they could have used dedicated GDDR5 for the GPU, or they could have kept the setup they have now. Sony got lucky with the price drop of GDDR5; otherwise they would never have been able to sell the PS4 for $400, and that's the main reason the PS4 sells so much better. If they both had the $500 price tag, the PS4 wouldn't sell so easily.

The PS4 can show its power from the get-go because it uses standard PC hardware. There won't be much room for optimization in the future; for the Xbox One there will be. How much and how fast it will go remains to be seen. But since they push Kinect on the consumer, and they already took out a patent on VR devices for the Xbox One a year before Sony did, MS could have more tricks up its sleeve than people think.

As of now the PS4 is by far the better system for currently released games, but you can be sure stuff will be a lot different in two years' time. Having said all that, the systems are very weak compared to current PC tech. The so-called strong PS4 is somewhat of a joke. The difference between the PS4 and Xbox One is wildly exaggerated, and the Xbox One simply had a bad release, due to Sony's luck with GDDR5 prices and Microsoft's bad management. Once devs release products that are made with the right ESRAM tools and without the 10 percent GPU reservation, the differences will get very small.

Maybe people will notice then that the PS4 is no more than a 5-year-old PC. The game Thief is a nice example of this: it runs at 1080p and 30 fps but with less detail than the PC version. The PC version only needs a mid-range PC to do this, and this gen has just begun. If Nintendo takes a chance here they could very well give Sony and MS the ass-whooping they deserve, because releasing weak consoles like this is an insult to any gaming enthusiast.

I could forgive MS (or even Sony) if they innovate in other ways; Kinect can play a big part in VR. Sony is making a VR device too, but they aren't the only ones.

#83 Posted by evildead6789 (6825 posts) -

@evildead6789 said:

@killatwill15 said:

@evildead6789 said:

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self owned by lemming king phil spencer.

they still press on with this science fiction, sure 1080p is viable on anything even the 3ds,

but the games will suffer for it "draw distance, texture quality, model density, shader quality etc",

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group were they apply preparation h to each others anus's,

oh wait they do that every day with these "esram,dx11.2, dx12, cloud, Kinect" threads

poor @kuu2@blackace@FreedomFreeLife@FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The x1 is weak sauce, very weak but so is the ps4.

Apart from that . The actual devs didn't have the right tools for esram at the time the xboxone released. At the release of the xboxone 10 percent of the gpu was locked, for the kinect and to switch between skype and movies, which they changed to 2 percent and those are the two main reasons the xboxone couldn't output at 1080p

I don't even have an xboxone, or a ps4. I stil have an x360 and a three year old mid range pc (that smokes the ps4). I would maybe consider buying an xboxone for exclusives or fighting games if they released it without a kinect

but I would never buy a crappy playstation lol. The ps3 was so much stronger too and it showed.... way to late lol

like I said, even gamecube is capable of outputting 1080p,

but the game will have to suffer for the resolution boost.

the gpu reserve or esram is no excuse to blame that xb1 couldn't display 1080.

it can, but it would look ugly as fuck,

crytek said so before switching to 900p for ryse,

1080 was possible, but they wanted ryse to look better,

the console isn't capable of both, other ports have proved that,

it is just weak hardware, period.

and if you have a pc more powerful than a ps4,

stick with that,

why waste money on an upscaled wii u(x1).

because the only exclusives that stayed on xbox was halo forza gears,

the rest will be pc ports one day.

crytek didn't have the right esram tools and they had the gpu lock. The extra power in the ps4 is blown outta proportion. You can see it in the differrence in frame rates in the charts. Even the ps4 struggles at 1080p. The ps4 may have the edge but they're both weak systems and the x1 was released with bad devtools and a gpu lock of 10 percent. Don't forget they overclocked the system too.

The biggest problem x1 has was a bad launch because of management. That and the mandatory kinect.

PS4 has a 50% texture mapping advantage, a 50% shader performance advantage and TWICE the fill rate....

ESRAM will NOT make up for any of that, the only way to reduce that performance lead is to increase the clock speeds, which Microsoft have already done.

But as Xbones OS consumes more resources anyway it's hardly dented PS4's performance lead.

That fill rate advantage is why PS4 is hitting 1080p on pretty much everything and sftware, API or hardware hacks will not give Xbone double the fill rate all of a sudden.

How do you know ESRAM will not make up for that, if you can store 6 GB of textures in 32 MB of embedded RAM? Textures take up the most memory by far. You're not an expert on DirectX technology, otherwise you wouldn't be making these assumptions.

ESRAM hasn't been used with this technology yet, but it will be. How much and by whom remains to be seen.
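For what the "6 GB of textures in 32 MB" line is actually about: tiled resources let a huge texture be merely mapped, with physical memory committed only for the 64 KB tiles that are actually sampled. A rough sketch with illustrative numbers (not Xbox One measurements):

```python
# Hedged sketch of the tiled-resources / partially resident textures idea:
# a large texture is only *mapped*, and just the 64 KB tiles actually sampled
# (visible regions, appropriate mips) get physical memory committed.
TILE_BYTES = 64 * 1024

def full_size_mb(width, height, bytes_per_texel=1):    # ~1 byte/texel for a BC-compressed format
    return width * height * bytes_per_texel / (1024 * 1024)

def resident_mb(visible_tiles):
    return visible_tiles * TILE_BYTES / (1024 * 1024)

print(f"full 16k x 16k texture: {full_size_mb(16384, 16384):.0f} MB")        # 256 MB mapped
print(f"resident if only 200 tiles are needed: {resident_mb(200):.1f} MB")   # 12.5 MB committed
```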

#84 Posted by Shewgenja (7924 posts) -

How do you know esram will not make up for that, if you can store 6 gb of textures in 32 mb of embedded ram. Textures take up the most memory by far. You're not an expert on dirextx technology otherwise you wouldn't be making these assumptions.

Esram hasn't been used with this technology yet but it will be. How much and by whom remains to be seen

How does ESRAM hold 6 GB's worth of texture information in a deferred rendering engine?

#85 Posted by Solid_Max13 (3476 posts) -

@Shewgenja: because MisterX said so and he's the lemming jester

#86 Posted by lhughey (4221 posts) -

@evildead6789:

Either it's all rubbish or MS are idiots for letting it drag on this long. Do you really think they would let the Xbone struggle this far behind for this long if the secret sauce was real?

It hasn't been long. Only one generation of games has been released. It has been said time and time again that the Xbone API was late getting into developers' hands, so any game that has been released or is close to release probably won't use the methods that allow for easier ESRAM usage. In reality, it's very similar to the fact that it took "lazy devs" a long time to learn the Cell. Going with unorthodox hardware/APIs is always risky when it requires people to change the way they do things.

And why is it still referred to as "secret sauce"? Lol. It's not a secret, as it was published before the Xbone was even released.

#87 Edited by Shewgenja (7924 posts) -

I just like how this magical 6GBs is going to fold itself like origami into a 32MB cache without having to make calls to any other memory somewhere... Like an object being held in DDR3.. Oh teh magical efficiencies!

In a practical world, you can either sweat bullets trying to figure out how to maximize your combined write between two vastly different speeds of cache or you can simply send your shadow maps to it and call it a day. Either way, you're not getting a system much more powerful than a laptop from a year or two ago. The APU doesn't even utilize any form of hUMA to send along moving job data from the CPU to the GPU and vice versa, so you end up still having to send your read/write to duplicated assets as far as I can tell. Granted, I've only had time to play with that other devkit.

As far as I can tell, the XBone has some critical deficiencies when it comes to certain types of engine technology. Hopefully the rubber will meet the road when it comes to the new DX12 integrated environment. Currently, the XBone is a goddamn mess, and the ESRAM that people spit half-truths about doesn't come without its own drawbacks. Using ignorance and hyuck-hyuck comebacks to defend it from the PS4 isn't going to carry weight in the long term.

#88 Posted by evildead6789 (6825 posts) -

I just like how this magical 6 GB is going to fold itself like origami into a 32 MB cache without having to make calls to any other memory somewhere... like an object being held in DDR3... Oh, teh magical efficiencies!

In a practical world, you can either sweat bullets trying to figure out how to maximize your combined writes between two vastly different speeds of cache, or you can simply send your shadow maps to it and call it a day. Either way, you're not getting a system much more powerful than a laptop from a year or two ago. The APU doesn't even utilize any form of hUMA to pass moving job data from the CPU to the GPU and vice versa, so you still end up sending reads/writes to duplicated assets, as far as I can tell. Granted, I've only had time to play with that other devkit.

As far as I can tell, the XBone has some critical deficiencies when it comes to certain types of engine technology. Hopefully, the rubber will meet the road when it comes to the new DX12 integrated environment. Currently, the XBone is a goddamn mess, and the ESRAM that people spit half-truths about doesn't come without its own drawbacks. Using ignorance and hyuck-hyuck comebacks to defend it from the PS4 isn't going to carry weight in the long term.

You're forgetting that the Xbox One is more than ESRAM. It still has the same basic setup as the PS4, only with a 7790-class GPU and DDR3 RAM.

I also don't think you understand how on-chip RAM works. If you'd ever had a Xeon CPU, you'd know what I mean: it's way, way faster. Look at the difference between Athlon II and Phenom II CPUs; they're the same CPUs, only the Athlons don't have L3 cache. Even a Phenom II triple core outclasses an Athlon II X4 when they run at the same speed, and all the Phenom II has extra is 6 MB of cache. On-chip RAM means much faster communication between the chip and the memory.

These are GPUs, but it's the same principle. If they can use that speed when rendering, it will be an advantage. DirectX 11.2 shows that they can use it.
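
To give a rough feel for that cache analogy, here's a toy model (the hit rates are made up, the bandwidth figures are the commonly quoted spec numbers, and it assumes accesses are serialized rather than overlapped):

```python
# Toy model: effective bandwidth when a fraction of GPU memory traffic is
# served by the on-chip pool and the rest falls through to DDR3.
# Hit rates below are hypothetical; bandwidths are the usual quoted peaks.

ESRAM_BW = 109.0   # GB/s, per-direction figure commonly quoted for Xbox One ESRAM
DDR3_BW = 68.0     # GB/s, Xbox One main memory

def effective_bandwidth(esram_hit_rate):
    """Time-weighted blend: seconds to move 1 GB split across the two pools."""
    time_per_gb = esram_hit_rate / ESRAM_BW + (1 - esram_hit_rate) / DDR3_BW
    return 1.0 / time_per_gb

for hit in (0.0, 0.25, 0.50, 0.75):
    print(f"{hit:.0%} of traffic hitting ESRAM -> ~{effective_bandwidth(hit):.0f} GB/s effective")
```

The point of the sketch is just that the benefit scales with how much of the working set you can actually keep in the small pool, exactly like a CPU cache.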

@evildead6789 said:

@misterpmedia said:

IIRC, the max ESRAM bandwidth numbers which take it well above the GDDR5 one were a theoretical peak. I think the technical fellow from Microsoft said they wouldn't be getting anywhere near that in real-world performance, in their tech breakdown on Eurogamer.

Doesn't matter, it's still on-chip RAM, and that's what makes the big difference. The GDDR5 in Sony's PS4 is standalone RAM; even worse, it's not dedicated but shared.

Not getting how this is bad lol, it's what developers wanted.

The ESRAM isn't shared. Even if they use the texture tiling process with GDDR5 like the ESRAM does, it will have the handicap of being shared. It already has the handicap of not being on-chip RAM, and the sharing makes it even worse. It does have the advantage of being accessible by the CPU and GPU at the same time, but dedicated memory would have the advantage when swapping textures on the fly.

Still, even if it were dedicated, you would never have the advantages ESRAM has; what GDDR5 will be able to do with tiled textures will only apply to textures that are very far away, so the player won't see the texture swapping.

#89 Edited by tormentos (16408 posts) -

@evildead6789 said:

ESRAM is more than fast memory; the fact that it is on-chip RAM makes the communication between the GPU and the RAM a lot faster. It's similar to cache on CPUs. If I followed your reasoning, Xeon and Phenom CPUs wouldn't be faster than their Pentium and Athlon counterparts that have less cache.

Tomb Raider was made, like all games developed before January 21, with a 10 percent GPU reservation for the Kinect and multimedia; this has been reduced to 2 percent since January 2014. Devs also didn't have the right dev tools at the time to work with the ESRAM. Future games will have more power available. How much remains to be seen.

Maybe the Xbox One has to sacrifice something at 1080p, but that's all with the GPU reservation and the wrong ESRAM tools. I'm not saying they will achieve 1080p in the future, but the difference with the PS4 will be smaller.

Not everything has to fit in ESRAM; it's not like they have to choose between ESRAM and DDR3 all the time. Again, devs didn't have the right tools.

ESRAM isn't used on GPU chips because it isn't standardized and the technology to use it is very new. If it's successful, it could very well be that all cards start to use it; it all depends on the tools that are made for it. It isn't really necessary at this time because GPU cards can still manage with reasonable amounts of RAM, and it's still easier for a dev to just load a texture into memory as a whole, especially when the tech is new and the right tools aren't readily available. As for die space, on-chip RAM doesn't necessarily have to be ESRAM.

ESRAM is one of the reasons the Xbox One has a higher price, that and the Kinect. Since it's supported in DirectX 11.2, I doubt they did this as a last-minute counter to Sony's GDDR5; they would have been very lucky to stumble on tech that can address 6 GB of textures with 32 MB of RAM. If MS had known the price of GDDR5 would drop, they could have used dedicated GDDR5 for the GPU, or they could have kept the setup they have now. Sony got lucky with the GDDR5 price drop, otherwise they would never have been able to sell the PS4 for $400, and that's the main reason the PS4 sells so much better. If both had a $500 price tag, the PS4 wouldn't sell so easily.

The PS4 can show its power from the get-go because it uses standard PC hardware. There won't be much room for optimization in the future; for the Xbox One there will be. How much and how fast remains to be seen. But since they push the Kinect on the consumer, and they took out a patent on VR devices for the Xbox One a year before Sony did, MS could have more tricks up its sleeve than people think.

As of now the PS4 is by far the better system for the games released so far, but you can be sure things will look a lot different in two years' time. Having said all that, both systems are very weak compared to current PC tech; the so-called strong PS4 is somewhat of a joke. The difference between the PS4 and Xbox One is wildly exaggerated, and the Xbox One simply had a bad release, due to Sony's luck with GDDR5 prices and Microsoft's bad management. Once devs release products made with the right ESRAM tools and without the 10 percent GPU reservation, the differences will get very small.

Maybe people will notice then that the PS4 is no more than a five-year-old PC. The game Thief is a nice example of this: it runs at 1080p and 30 fps, but with less detail than the PC version, and the PC version only needs a mid-range PC to do that, and this gen has just begun. If Nintendo takes a chance here they could very well give Sony and MS the ass-whooping they deserve, because releasing weak consoles like this is an insult to any gaming enthusiast.

I could forgive MS (or even Sony) if they innovate in other ways; the Kinect can play a big part in VR. Sony is making a VR device, but they aren't the only ones.

Yeah, the communication between the GPU and the ESRAM is very fast because they are on the same die, and the main memory is not on the APU die, so that connection is way slower. But you can't fit everything you would put in 5 GB of GDDR5 into 32 MB of ESRAM; worse, the bandwidth is 140 to 150 GB/s in the best-case scenario, and on top of that it would mean little since the GPU is gimped.

Second, the 10% reservation means little. The PS4 has a 40% advantage and people have been downplaying it, so how come when the difference is 40% it doesn't matter, but when the topic is the miserable 10% Xbox One reservation, all of a sudden 10% will bring parity or close to it? It's a lame-ass argument and it always will be; 8% will yield 3 frames more at best.

Oh, they do have to choose: since the fast pool is just 32 MB, any framebuffer bigger than that will cause problems. I even posted examples already; it is just too small, and that is where the sacrifices come in.

What the hell, embedded RAM is not new, man, and it's not on PC because PC doesn't need it. The R9 290 has 320 GB/s, way, way faster than the Xbox One, so yeah, they don't have to waste space on on-die ESRAM when they can just have that bandwidth without it.

DX is totally irrelevant; having support for it on Xbox One means nothing. ESRAM is just memory, man, not a GPU or a FLOP generator. It's just memory, period, nothing more, nothing less.

Storing 6 GB of textures using PRT means nothing; the PS4 can do it as well. Sadly for you, a game is not composed only of textures, and every effect you use adds to the ESRAM load.

The Xbox One uses standard PC hardware, stop spinning. The only thing not PC-related is the ESRAM, which is there because MS chose DDR3; the GPU is from PC, so is the CPU, and so is DDR3. The whole "there is no room for improvement on PS4" line is a joke, a charade created by fanboys to act like the Xbox One will catch the PS4. No, it will not happen, period, and the PS4 will improve over time just like any console does. Xbox One games have been underperforming badly because the GPU is comparable in power to a 7770; it could have 1 TB/s of bandwidth and it would change nothing, because the GPU is weak.

Yeah, the Xbox One will magically catch the PS4 in two years thanks to the magic ESRAM that never stops giving... hahaha.

Bad release my ass. Watch Dogs comes in May and it has already been leaked to run at a lower resolution on Xbox One. MGS5 is again 720p on Xbox One and 1080p on PS4; that is by no means a small gap, that is a 100% difference in resolution, actually more than the 50% PS4 fans had been claiming, and even worse, it has no dynamic weather simulation like the PS4 version either.
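
The 10%-to-2% reservation point is easy to sanity-check with idealized arithmetic (this assumes a purely GPU-bound game, so treat the numbers as upper bounds):

```python
# Idealized estimate: if the OS/Kinect reservation drops from 10% to 2% of GPU
# time, a purely GPU-bound game gets 98/90 of its old frame rate.

def fps_after_reservation_change(fps_before, old_reserved=0.10, new_reserved=0.02):
    return fps_before * (1 - new_reserved) / (1 - old_reserved)

for fps in (30, 45, 60):
    print(f"{fps} fps -> {fps_after_reservation_change(fps):.1f} fps")
# 30 -> 32.7, 45 -> 49.0, 60 -> 65.3: a few extra frames, nowhere near a 720p-to-1080p jump
```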

#90 Edited by scatteh316 (4732 posts) -

@evildead6789:

Because, you fool, ESRAM is not a ROP, or a TMU, or an ALU... it's a slab of memory... so it can't make up for them, you complete idiot. And seeing as you're a tech god and keep banging on about these 6 GB worth of tiled textures, would you mind telling me where the frame buffers would go while the tiled textures are stored in the ESRAM?

@ronvalencia:

You didn't understand my sentence and I know PERFECTLY how competitive the PC market is.

#91 Posted by evildead6789 (6825 posts) -

@evildead6789:

Because, you fool, ESRAM is not a ROP, or a TMU, or an ALU... it's a slab of memory... so it can't make up for them, you complete idiot. And seeing as you're a tech god and keep banging on about these 6 GB worth of tiled textures, would you mind telling me where the frame buffers would go while the tiled textures are stored in the ESRAM?

@ronvalencia:

You didn't understand my sentence and I know PERFECTLY how competitive the PC market is.

It doesn't matter that it isn't a ROP, TMU, or ALU. On-chip RAM for a GPU combined with the tech in DX 11.2 is new technology. I think Microsoft knows their tech better than you do. If you had read the link, you would know why 6 GB of textures can fit in 32 MB of on-chip GPU RAM.

And again, the ESRAM and the standalone DDR3 RAM can work together; it's not one or the other. This is not like a standard GPU that you put in your desktop.

#92 Edited by Shewgenja (7924 posts) -

No offense, dude, but what you're saying sounds more like regurgitation from propaganda than having basis in knowledge. For instance..

176 GB/s is NOT dramatically slower than 102... LOLz! In order to "tap into" any secret sauce for combined writes to be rasterized by the GPU, you would essentially have to get the ESRAM to match pace, write for write, with the 68 GB/s of the DDR3. That only gives you an effective matched bandwidth of 136 GB/s... and that is the best-case scenario, provided MS's toolchains even make that an attainable mark.

You guys really need to give this up. The PS4 is like a super-cruise-capable jet fighter compared to the F-16 that is the XBone. Do both have their place in wartime ops? Yes. Let's not pretend they are in the same class, though. All 8 gigabytes of the PS4's RAM run faster than the XBone's theoretical peak, and that is simply a reality of the situation.

If you won't believe me (and rightfully so, since I'm just a random yob on an internet forum), then listen to MS themselves.
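
To spell out the combined-bandwidth arithmetic (spec-sheet peaks only, not sustained throughput):

```python
# Theoretical peak bandwidths the "matched write" argument is based on.
# These are the commonly quoted spec figures, not measured sustained numbers.

PS4_GDDR5 = 176.0          # GB/s, one unified pool
XB1_DDR3 = 68.0            # GB/s, main memory
XB1_ESRAM_ONE_WAY = 109.0  # GB/s, read OR write

matched_combined = XB1_DDR3 + XB1_DDR3              # ESRAM held to DDR3's pace: 136 GB/s
optimistic_combined = XB1_DDR3 + XB1_ESRAM_ONE_WAY  # ESRAM flat out: 177 GB/s, but only on 32 MB

print(f"PS4 unified GDDR5       : {PS4_GDDR5:.0f} GB/s across all 8 GB")
print(f"XB1 matched combined    : {matched_combined:.0f} GB/s")
print(f"XB1 optimistic combined : {optimistic_combined:.0f} GB/s (fast part is only 32 MB)")
```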

@Shewgenja said:

I just like how this magical 6 GB is going to fold itself like origami into a 32 MB cache without having to make calls to any other memory somewhere... like an object being held in DDR3... Oh, teh magical efficiencies!

In a practical world, you can either sweat bullets trying to figure out how to maximize your combined writes between two vastly different speeds of cache, or you can simply send your shadow maps to it and call it a day. Either way, you're not getting a system much more powerful than a laptop from a year or two ago. The APU doesn't even utilize any form of hUMA to pass moving job data from the CPU to the GPU and vice versa, so you still end up sending reads/writes to duplicated assets, as far as I can tell. Granted, I've only had time to play with that other devkit.

As far as I can tell, the XBone has some critical deficiencies when it comes to certain types of engine technology. Hopefully, the rubber will meet the road when it comes to the new DX12 integrated environment. Currently, the XBone is a goddamn mess, and the ESRAM that people spit half-truths about doesn't come without its own drawbacks. Using ignorance and hyuck-hyuck comebacks to defend it from the PS4 isn't going to carry weight in the long term.

You're forgetting that the Xbox One is more than ESRAM. It still has the same basic setup as the PS4, only with a 7790-class GPU and DDR3 RAM.

I also don't think you understand how on-chip RAM works. If you'd ever had a Xeon CPU, you'd know what I mean: it's way, way faster. Look at the difference between Athlon II and Phenom II CPUs; they're the same CPUs, only the Athlons don't have L3 cache. Even a Phenom II triple core outclasses an Athlon II X4 when they run at the same speed, and all the Phenom II has extra is 6 MB of cache. On-chip RAM means much faster communication between the chip and the memory.

These are GPUs, but it's the same principle. If they can use that speed when rendering, it will be an advantage. DirectX 11.2 shows that they can use it.

#93 Posted by scatteh316 (4732 posts) -

@scatteh316 said:

@evildead6789:

Because, you fool, ESRAM is not a ROP, or a TMU, or an ALU... it's a slab of memory... so it can't make up for them, you complete idiot. And seeing as you're a tech god and keep banging on about these 6 GB worth of tiled textures, would you mind telling me where the frame buffers would go while the tiled textures are stored in the ESRAM?

@ronvalencia:

You didn't understand my sentence and I know PERFECTLY how competitive the PC market is.

It doesn't matter that it isn't a ROP, TMU, or ALU. On-chip RAM for a GPU combined with the tech in DX 11.2 is new technology. I think Microsoft knows their tech better than you do. If you had read the link, you would know why 6 GB of textures can fit in 32 MB of on-chip GPU RAM.

And again, the ESRAM and the standalone DDR3 RAM can work together; it's not one or the other. This is not like a standard GPU that you put in your desktop.

So you completely dodged my question... awesome... Microsoft do know tech; you clearly don't. The ESRAM does not have the ability to generate, shade, or apply texture maps.

It can only store the pixels and textures... so it will not add fill rate, ALU, or TMU performance.

And you still have not told me where the frame buffer is going to go if you're using the ESRAM for tiled texture resources... Speaking of which, where did I say you can't fit 6 GB of textures into ESRAM? That's right... I didn't.

But answer my question... Where are you going to put the frame buffers if you're using the ESRAM for tiled textures?
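
For reference, here's why the framebuffer question matters: a 1080p deferred G-buffer alone already blows past 32 MB. The render-target layout below is a hypothetical but typical one, not taken from any specific engine:

```python
# Rough size of a 1080p deferred-rendering G-buffer (hypothetical layout).

WIDTH, HEIGHT = 1920, 1080
pixels = WIDTH * HEIGHT

bytes_per_pixel = {
    "albedo (RGBA8)": 4,
    "normals (RGBA16F)": 8,
    "material params (RGBA8)": 4,
    "depth/stencil (D24S8)": 4,
    "HDR light accumulation (RGBA16F)": 8,
}

total_mb = sum(bpp * pixels for bpp in bytes_per_pixel.values()) / 1024**2
print(f"G-buffer total: {total_mb:.1f} MB vs 32 MB of ESRAM")  # ~55 MB
```

So if the tiled textures live in the ESRAM, a full-fat G-buffer can't also live there; something has to spill to DDR3 or shrink.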

#94 Edited by tormentos (16408 posts) -

It doesn't matter that it isn't a ROP, TMU, or ALU. On-chip RAM for a GPU combined with the tech in DX 11.2 is new technology. I think Microsoft knows their tech better than you do. If you had read the link, you would know why 6 GB of textures can fit in 32 MB of on-chip GPU RAM.

And again, the ESRAM and the standalone DDR3 RAM can work together; it's not one or the other. This is not like a standard GPU that you put in your desktop.

Le secret sauce par deux...lol

Yeah, and Sony also knows their tech better than you, and Sony modified the PS4 for heavy compute, something MS didn't do.

PRT also works on PS4, man; stop with the whole 6 GB of textures crap. It's irrelevant, the PS4 can do it too, and games aren't composed of just textures.

#95 Posted by highking_kallor (445 posts) -

@lhughey:

If you say so dude. You’re really just setting yourself up for disappointment.

#96 Posted by lhughey (4221 posts) -

@lhughey:

If you say so dude. You’re really just setting yourself up for disappointment.

The proof will be in the pudding. Let's just sit back and see what happens. I won't be that disappointed either way. I really enjoy playing the Bone and will have a PS4 later in the year. And if I cared about having the best graphics, I'd be playing on my PC.

#97 Edited by evildead6789 (6825 posts) -

@tormentos said:

@evildead6789 said:

ESRAM is more than fast memory; the fact that it is on-chip RAM makes the communication between the GPU and the RAM a lot faster. It's similar to cache on CPUs. If I followed your reasoning, Xeon and Phenom CPUs wouldn't be faster than their Pentium and Athlon counterparts that have less cache.

Tomb Raider was made, like all games developed before January 21, with a 10 percent GPU reservation for the Kinect and multimedia; this has been reduced to 2 percent since January 2014. Devs also didn't have the right dev tools at the time to work with the ESRAM. Future games will have more power available. How much remains to be seen.

Maybe the Xbox One has to sacrifice something at 1080p, but that's all with the GPU reservation and the wrong ESRAM tools. I'm not saying they will achieve 1080p in the future, but the difference with the PS4 will be smaller.

Not everything has to fit in ESRAM; it's not like they have to choose between ESRAM and DDR3 all the time. Again, devs didn't have the right tools.

ESRAM isn't used on GPU chips because it isn't standardized and the technology to use it is very new. If it's successful, it could very well be that all cards start to use it; it all depends on the tools that are made for it. It isn't really necessary at this time because GPU cards can still manage with reasonable amounts of RAM, and it's still easier for a dev to just load a texture into memory as a whole, especially when the tech is new and the right tools aren't readily available. As for die space, on-chip RAM doesn't necessarily have to be ESRAM.

ESRAM is one of the reasons the Xbox One has a higher price, that and the Kinect. Since it's supported in DirectX 11.2, I doubt they did this as a last-minute counter to Sony's GDDR5; they would have been very lucky to stumble on tech that can address 6 GB of textures with 32 MB of RAM. If MS had known the price of GDDR5 would drop, they could have used dedicated GDDR5 for the GPU, or they could have kept the setup they have now. Sony got lucky with the GDDR5 price drop, otherwise they would never have been able to sell the PS4 for $400, and that's the main reason the PS4 sells so much better. If both had a $500 price tag, the PS4 wouldn't sell so easily.

The PS4 can show its power from the get-go because it uses standard PC hardware. There won't be much room for optimization in the future; for the Xbox One there will be. How much and how fast remains to be seen. But since they push the Kinect on the consumer, and they took out a patent on VR devices for the Xbox One a year before Sony did, MS could have more tricks up its sleeve than people think.

As of now the PS4 is by far the better system for the games released so far, but you can be sure things will look a lot different in two years' time. Having said all that, both systems are very weak compared to current PC tech; the so-called strong PS4 is somewhat of a joke. The difference between the PS4 and Xbox One is wildly exaggerated, and the Xbox One simply had a bad release, due to Sony's luck with GDDR5 prices and Microsoft's bad management. Once devs release products made with the right ESRAM tools and without the 10 percent GPU reservation, the differences will get very small.

Maybe people will notice then that the PS4 is no more than a five-year-old PC. The game Thief is a nice example of this: it runs at 1080p and 30 fps, but with less detail than the PC version, and the PC version only needs a mid-range PC to do that, and this gen has just begun. If Nintendo takes a chance here they could very well give Sony and MS the ass-whooping they deserve, because releasing weak consoles like this is an insult to any gaming enthusiast.

I could forgive MS (or even Sony) if they innovate in other ways; the Kinect can play a big part in VR. Sony is making a VR device, but they aren't the only ones.

Yeah, the communication between the GPU and the ESRAM is very fast because they are on the same die, and the main memory is not on the APU die, so that connection is way slower. But you can't fit everything you would put in 5 GB of GDDR5 into 32 MB of ESRAM; worse, the bandwidth is 140 to 150 GB/s in the best-case scenario, and on top of that it would mean little since the GPU is gimped.

Second, the 10% reservation means little. The PS4 has a 40% advantage and people have been downplaying it, so how come when the difference is 40% it doesn't matter, but when the topic is the miserable 10% Xbox One reservation, all of a sudden 10% will bring parity or close to it? It's a lame-ass argument and it always will be; 8% will yield 3 frames more at best.

Oh, they do have to choose: since the fast pool is just 32 MB, any framebuffer bigger than that will cause problems. I even posted examples already; it is just too small, and that is where the sacrifices come in.

What the hell, embedded RAM is not new, man, and it's not on PC because PC doesn't need it. The R9 290 has 320 GB/s, way, way faster than the Xbox One, so yeah, they don't have to waste space on on-die ESRAM when they can just have that bandwidth without it.

DX is totally irrelevant; having support for it on Xbox One means nothing. ESRAM is just memory, man, not a GPU or a FLOP generator. It's just memory, period, nothing more, nothing less.

Storing 6 GB of textures using PRT means nothing; the PS4 can do it as well. Sadly for you, a game is not composed only of textures, and every effect you use adds to the ESRAM load.

The Xbox One uses standard PC hardware, stop spinning. The only thing not PC-related is the ESRAM, which is there because MS chose DDR3; the GPU is from PC, so is the CPU, and so is DDR3. The whole "there is no room for improvement on PS4" line is a joke, a charade created by fanboys to act like the Xbox One will catch the PS4. No, it will not happen, period, and the PS4 will improve over time just like any console does. Xbox One games have been underperforming badly because the GPU is comparable in power to a 7770; it could have 1 TB/s of bandwidth and it would change nothing, because the GPU is weak.

Yeah, the Xbox One will magically catch the PS4 in two years thanks to the magic ESRAM that never stops giving... hahaha.

Bad release my ass. Watch Dogs comes in May and it has already been leaked to run at a lower resolution on Xbox One. MGS5 is again 720p on Xbox One and 1080p on PS4; that is by no means a small gap, that is a 100% difference in resolution, actually more than the 50% PS4 fans had been claiming, and even worse, it has no dynamic weather simulation like the PS4 version either.

Of course the extra 8 percent of power won't close the gap between the PS4 and Xbox One, but it's one handicap less for the X1. It will improve performance on the X1 any way you put it; 3 frames will make a difference. The difference between the 7790 and the HD 7850 is 5 frames on average. Of course the 7790 in a PC uses GDDR5, but it doesn't have 32 MB of ESRAM, and it only has 1 GB of GDDR5, while the Xbox One has 8 GB of DDR3 in total for the whole system.

How can you say DirectX is irrelevant? It's the API used to make games; it has always been the leading API for game development, and it still is today. It's also Microsoft technology, and the features in DX 11.2 are exclusive to the Xbox One. The PS4 will have similar features in its own customization of DirectX 11.1, but whether it will work as well as on the Xbox One isn't certain; it could even be better. I can't predict the future.

But the fact is that the texture tiling technique doesn't work as well within the same RAM, because that RAM has to serve other functions too, as well as still storing textures as a whole. The ESRAM can work independently from the DDR3 RAM; Sony's GDDR5 can't. Microsoft is also the one that creates DirectX. Sony may use it and modify it to their liking, but DirectX 11.2 is Xbox One tech only, and I don't think Sony has the same expertise when it comes to making gaming APIs. Even in networking and hardware, Sony is not in the same league as MS; history proved it.

Textures make up the biggest part of VRAM. Fast system RAM can only do so much; the speed of the CPU and GPU plays a part here too. The GDDR5 system RAM in the PS4 doesn't give that much of an advantage when you look at the performance of the CPU and GPU. Even top-end PCs use DDR3 as system RAM, and the difference in performance between DDR3 running at 1333 vs 2133 MHz is next to nothing in benchmarks.

As for fanboys, I have an X360 and a PC, not an Xbox One, not a PS4. I had a PS3 once for a motion game, The Fight, which was very good, but it was the only thing I used the PS3 for. I won't buy an Xbox One or PS4 at this time; there's no reason for it, since the systems don't have any games I like that aren't released on the PC. What I'm saying here are just unbiased, plain facts. When you look at the hardware and software, the Xbox One will come closer to the PS4. Thief already proved it when it ran at 900p.

The only reason I'm explaining ESRAM is because I want people to realize how weak the PS4 actually is. The Xbox One may have a gimped GPU and CPU, but so does the PS4. If you see two ugly people alone in a room, the least ugly one may suddenly look pretty, but everything changes when a really good-looking person comes into the room. And that's what's happening here: people only see Xbox and PlayStation, but old PCs and mid-range Steam machines kick their asses, and they do have the multiplats, they do have exclusives, and they do have controller support.

Even better, a PC is a multipurpose system. As for Watch Dogs, it's a Ubisoft title, and the last time Ubisoft impressed me was with Far Cry and Prince of Persia: The Warrior Within. Far Cry 2 & 3 weren't bad, and they may be good VR games later when I do my workout with a Virtuix Omni. Even then, if Watch Dogs is good, my PC will run it in better quality, my three-year-old PC lol.

#98 Posted by magicalclick (22218 posts) -

Thanks for the interesting info. So basically, ESRAM pairs well with DDR3 because tiled texturing means much more frequent, smaller transfers. DDR3 has lower bandwidth, but in a streaming tiled situation it becomes more suitable. Certainly it's hard to know how it stacks up against GDDR5, but it should be interesting to see games utilizing this.
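
A toy sketch of that streaming idea, with the small fast pool acting as a tile cache in front of the big slow pool (entirely hypothetical code, not any console API):

```python
from collections import OrderedDict

# Toy model: a small fast pool (think 32 MB of ESRAM) caching 64 KB texture
# tiles whose full set lives in a big slow pool (think 8 GB of DDR3).

TILE_BYTES = 64 * 1024
FAST_POOL_TILES = (32 * 1024**2) // TILE_BYTES   # 512 tiles

class TileCache:
    def __init__(self, capacity=FAST_POOL_TILES):
        self.capacity = capacity
        self.resident = OrderedDict()            # tile_id -> None, ordered by recency

    def request(self, tile_id):
        """Return True on a fast-pool hit, False when the tile had to stream in."""
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)   # mark as most recently used
            return True
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)    # evict the least recently used tile
        self.resident[tile_id] = None            # "copy" the tile up from the slow pool
        return False

cache = TileCache()
frame_tiles = range(600)                         # one frame touching 600 distinct tiles
misses = sum(not cache.request(t) for t in frame_tiles)
print(f"tiles streamed from the slow pool this frame: {misses}")
```

The smaller and more frequent those transfers are, the less the raw bandwidth of the slow pool matters and the more the locality of the fast pool does, which is the trade-off being described above.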

#99 Edited by evildead6789 (6825 posts) -

@evildead6789 said:

It doesn't matter that it isn't a ROP, TMU, or ALU. On-chip RAM for a GPU combined with the tech in DX 11.2 is new technology. I think Microsoft knows their tech better than you do. If you had read the link, you would know why 6 GB of textures can fit in 32 MB of on-chip GPU RAM.

And again, the ESRAM and the standalone DDR3 RAM can work together; it's not one or the other. This is not like a standard GPU that you put in your desktop.

Le secret sauce par deux...lol

Yeah, and Sony also knows their tech better than you, and Sony modified the PS4 for heavy compute, something MS didn't do.

PRT also works on PS4, man; stop with the whole 6 GB of textures crap. It's irrelevant, the PS4 can do it too, and games aren't composed of just textures.

Like I said in the other post, the Xbox One can do the texture tiling in the ESRAM and keep the DDR3 for storing textures as a whole, or for other VRAM functions.

Sony modified Microsoft's DX 11.1 to work with PRT, while Microsoft created DX 11.2 exclusively for the Xbox One. Sony uses Microsoft technology in their system. If Sony were so good at making gaming software APIs, why didn't they make their own API from the ground up?

Because they can't; Microsoft has always been superior when it comes to know-how. Releasing the Xbox One with a Kinect may have been a bad business decision, but I'd really like to see what happens when VR hits the market.

Apart from that, both are weak-ass systems. My PC, which runs on Microsoft Windows, still smokes both of them, and it's a mid-range PC lol.

#100 Edited by casharmy (6813 posts) -

The date of this is March 24, 2014

Ubisoft animation director praises Sony's PlayStation 4, saying it allowed the developer to "build a new type of game."

The PlayStation 4 affords developers almost an unlimited memory budget, allowing them to create richly detailed and immersive worlds, according to Watch Dogs animation director Colin Graham.

In an interview on the PlayStation Blog, the Ubisoft developer said it's always a pain point for his animation team to run out of memory, but this issue doesn't come up on PS4, he explained.

"PS4 has really given us the platform to build a new type of game," Graham said. "It gave us a chance to dream. From an animation point of view, we're always running out of memory, especially when you start developing towards the end of a console generation, so PS4 allowed things like reduced animation compression and more variety in civilians."

"From my point of view, it's a bit like working with an unlimited budget because we can't fill the memory budget on PS4," he added. "It's a really nice piece of hardware."

Watch Dogs lead gameplay designer Danny Belanger also offered a round of praise for the technical prowess of the PS4.

"The powerful technology helped us create a better simulation," Belanger said. "The water effects, the amount of people in the city, the quality of the lighting… it all brings us a step forward."

http://www.gamespot.com/articles/watch-dogs-dev-says-ps4-so-powerful-that-it-gave-us-a-chance-to-dream/1100-6418490/

Dat GDDR5 ram advantage, what were people saying about ESram again?