scatteh316's forum posts

#1 scatteh316
Member since 2004 • 10273 Posts

@evildead6789 said:

@scatteh316 said:

@evildead6789:

Because, you fool, ESRAM is not a ROP, a TMU or an ALU....It's a slab of memory.... so it can't make up for them, you complete idiot... and seeing as you're a tech god and keep banging on about these 6GB worth of a title's textures, would you mind telling me where the frame buffers would go while the tiled textures are stored in the ESRAM?

@ronvalencia:

You didn't understand my sentence and I know PERFECTLY how competitive the PC market is.

It doesn't matter that it isn't a ROP, TMU or ALU. On-chip RAM for a GPU combined with the tech of DX 11.2 is new technology. I think Microsoft knows their tech better than you do. If you had read the link you would know why the 6GB of textures can fit in 32MB of on-chip GPU RAM.

And again, the ESRAM and the standalone DDR3 RAM can work together; it's not one or the other. This is not like a standard GPU that you put in your desktop.

So you completely dodged my question... awesome.... Microsoft do know tech... you clearly don't...... ESRAM does not have the ability to generate, shade or apply texture maps.

It can only store the pixels and textures... so it will not add fill rate, ALU or TMU performance.

And you still have not told me where the frame buffer is going to go if you're using the ESRAM for tiled texture resources... Speaking of which, where did I say you can't fit 6GB of textures into ESRAM... That's right... I didn't.....

But answer my question... Where are you going to put the frame buffers if you're using the ESRAM for tiled textures?
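
To put rough numbers on that question, here's a quick back-of-the-envelope sketch (the buffer formats and render-target list are illustrative assumptions for a generic deferred renderer, not the layout of any actual Xbox One title):

```python
# Rough render-target budget at 1920x1080 versus a 32 MB ESRAM.
# Buffer formats are illustrative assumptions for a generic deferred
# renderer, not the layout of any actual Xbox One title.
WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024

def target_mb(bytes_per_pixel):
    """Size of one full-screen render target, in megabytes."""
    return WIDTH * HEIGHT * bytes_per_pixel / MB

buffers = {
    "back buffer (RGBA8)":       target_mb(4),
    "depth/stencil (D24S8)":     target_mb(4),
    "G-buffer normals (RGBA8)":  target_mb(4),
    "G-buffer albedo (RGBA8)":   target_mb(4),
    "HDR light accum (RGBA16F)": target_mb(8),
}

for name, size in buffers.items():
    print(f"{name:26s} {size:5.1f} MB")
print(f"{'total':26s} {sum(buffers.values()):5.1f} MB vs 32 MB of ESRAM")
```

Even this simplified set of targets overshoots 32 MB on its own, before a single tiled texture page goes in; whatever doesn't fit has to sit in the far slower DDR3.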

#2  Edited By scatteh316
Member since 2004 • 10273 Posts

@evildead6789:

Because, you fool, ESRAM is not a ROP, a TMU or an ALU....It's a slab of memory.... so it can't make up for them, you complete idiot... and seeing as you're a tech god and keep banging on about these 6GB worth of a title's textures, would you mind telling me where the frame buffers would go while the tiled textures are stored in the ESRAM?

@ronvalencia:

You didn't understand my sentence and I know PERFECTLY how competitive the PC market is.

#3 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:

@Martin_G_N:

@Martin_G_N said:

The speeds of the ESRAM and GDDR5 are pretty similar, and there is a lot more of the GDDR5 RAM in the PS4, which is better. I think having a fast cache is important, but it would have to be a lot faster than the GDDR5 RAM to make a difference, which it isn't.

And it is still funny to hear about how the PS4 can't support some feature because it doesn't have DirectX 11.something or 12. It can utilize the same features as any DirectX version can through OpenGL, if it has the power for it of course. Remember Tessellation, which was a DirectX-only feature when it arrived. It could also be done on the PS3.

ATI learnt their lesson with R600's shader-powered MSAA, i.e. RV670 includes the full MSAA hardware.

And yet, funnily enough, over the last couple of years shader-based post-process AA has become the new trend....

#4  Edited By scatteh316
Member since 2004 • 10273 Posts

@evildead6789 said:

@killatwill15 said:

@evildead6789 said:

@killatwill15 said:

@Chutebox said:

Wait, are we back to the X1 having secret sauce again!?

yeah even though they have been refuted by actual developers, and self-owned by lemming king phil spencer.

they still press on with this science fiction. sure, 1080p is viable on anything, even the 3ds,

but the games will suffer for it (draw distance, texture quality, model density, shader quality etc),

I don't know why lemmings don't stop, they are only hurting themselves.

they should have a lemming support group where they apply preparation h to each other's anuses,

oh wait they do that every day with these "esram, dx11.2, dx12, cloud, Kinect" threads

poor @kuu2 @blackace @FreedomFreeLife @FoxbatAlpha,

even if Microsoft straight up admitted the xbone was weak sauce, they wouldn't believe it.

The X1 is weak sauce, very weak, but so is the PS4.

Apart from that, the actual devs didn't have the right tools for ESRAM at the time the Xbox One released. At launch, 10 percent of the GPU was reserved for the Kinect and for switching between Skype and movies, which they later cut to 2 percent, and those are the two main reasons the Xbox One couldn't output at 1080p.

I don't even have an Xbox One, or a PS4. I still have an X360 and a three-year-old mid-range PC (that smokes the PS4). I would maybe consider buying an Xbox One for exclusives or fighting games if they released it without a Kinect,

but I would never buy a crappy PlayStation lol. The PS3 was so much stronger too and it showed.... way too late lol

like I said, even the gamecube is capable of outputting 1080p,

but the game will have to suffer for the resolution boost.

the gpu reserve or esram is no excuse for why the xb1 couldn't display 1080.

it can, but it would look ugly as ****,

crytek said so before switching to 900p for ryse,

1080 was possible, but they wanted ryse to look better,

the console isn't capable of both, other ports have proved that,

it is just weak hardware, period.

and if you have a pc more powerful than a ps4,

stick with that,

why waste money on an upscaled wii u (x1).

because the only exclusives that stayed on xbox were halo, forza and gears,

the rest will be pc ports one day.

Crytek didn't have the right ESRAM tools and they had the GPU lock. The extra power in the PS4 is blown outta proportion. You can see it in the difference in frame rates in the charts. Even the PS4 struggles at 1080p. The PS4 may have the edge, but they're both weak systems, and the X1 was released with bad dev tools and a GPU lock of 10 percent. Don't forget they overclocked the system too.

The biggest problem the X1 had was a bad launch because of management. That and the mandatory Kinect.

PS4 has a 50% texture mapping advantage, a 50% shader performance advantage and TWICE the fill rate....

ESRAM will NOT make up for any of that, the only way to reduce that performance lead is to increase the clock speeds, which Microsoft have already done.

But as the Xbone's OS consumes more resources anyway, it's hardly dented the PS4's performance lead.

That fill rate advantage is why the PS4 is hitting 1080p on pretty much everything, and software, API or hardware hacks will not give the Xbone double the fill rate all of a sudden.
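
For anyone who wants the arithmetic behind that, here's a quick sketch using the widely reported GPU specs (peak theoretical fill rate only, not measured game performance):

```python
# Peak pixel fill rate = ROP count x core clock (theoretical peak,
# not measured performance). Specs are the widely reported figures.
ps4_rops, ps4_ghz = 32, 0.800   # PS4 GPU: 32 ROPs at 800 MHz
xb1_rops, xb1_ghz = 16, 0.853   # Xbox One GPU: 16 ROPs at 853 MHz

ps4_fill = ps4_rops * ps4_ghz   # 25.6 Gpixels/s
xb1_fill = xb1_rops * xb1_ghz   # ~13.6 Gpixels/s

print(f"PS4: {ps4_fill:.1f} Gpixels/s")
print(f"XB1: {xb1_fill:.1f} Gpixels/s")
print(f"PS4 advantage: {ps4_fill / xb1_fill:.2f}x")  # ~1.88x
```

The 53 MHz overclock only trims the gap from an even 2x to roughly 1.88x, which is exactly the point: a clock bump can't paper over a halved ROP count.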

#5 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@scatteh316 said:
@ronvalencia said:

1. TMUs are memory bound. You again missed my Tomb Raider 2013 example with an R9-290 at 69 GB/s.

2. The Radeon HD 7950's superior 32-ROP result says hi. Any fill rate is bound by memory bandwidth. Adding another 16 ROPS at 853MHz on X1 would be nearly pointless.

If the Radeon HD 7950 at 800MHz is the top 32-ROP ~= 240 GB/s example, a 16-ROP version would consume 120 GB/s. Normalizing 120 GB/s at 800MHz to 853MHz yields ~128 GB/s, which is far higher than the 7770/7790/R7-260X memory bandwidth limits.

Rebellion already explained why most X1 games are less than 1920x1080, i.e. ESRAM is too small for 1920x1080 and it needs "tiling tricks". Rebellion > you.

3. Stop posting crap posts. The prototype 7850 shows what its 12 CUs (768 stream processors and 48 TMUs) are capable of with higher memory bandwidth. The prototype 7850's 32 ROPS will be bound by memory bandwidth.

My POV is backed by Rebellion's POV.

Rebellion > you.

4. Consoles don't have the PC DirectX sysmem1 (CPU) to sysmem2 (Direct3D) to VRAM memory copy issues. Secondly, Xbox One's CPU (an FCL-type block) can communicate directly with the GPU (via the host MMU).

5. An example of how an R9-290 was reduced to near-X1 Tomb Raider 2013 results by gimping memory bandwidth to 69 GB/s. Btw, my Mini-ITX box is limited by 16X PCI-E version 2 (an 8 GB/s-8 GB/s link) and dual-channel DDR3-1333 memory, i.e. it's an Intel H67 chipset limitation. Your point 4 is a joke, since 60 GB/s VRAM (i.e. with 8 GB/s removed for 16X PCI-E version 2) didn't significantly change the frame rate result.

For the near-X1 Tomb Raider 2013 result, my R9-290 has 160 TMUs (from 40 CUs x 4) and it was gimped to 69 GB/s memory bandwidth. Again, TMUs are memory bound. The same problem applies to my R9-290's 64 ROPS, which are also gimped by the 69 GB/s memory bandwidth.

6. You can't use Xbox 360's tiling example. Rebellion > you.

7. For Xbox One, it's additional memory bandwidth for render targets.

8. id's MegaTexture doesn't work in Xbox 360's EDRAM, while AMD PRT/tiled resources work with ESRAM. id's Rage does its mega-texturing from GDDR3 and uses the EDRAM just for render targets.

9. Refer to point 8. You don't know the difference between Xbox 360 and Xbox One. Xbox One's GPU can happily do FP16 HDR with filtering, i.e. there's no need for Xbox 360's double frame buffer HDR workaround.

The existence of my 7950's 32 ROPS kills your PS4's 32-ROP paper spec...

Fact: Rebellion > you.

Just stop... you can't compare your stupid experiments with PC hardware to the way console GPUs get utilized, get a god damn clue......

And PS4's 32 ROPS >> the HD 7950's ROPS in the real world, as the PS4 will always have much, much higher utilization.

Bull$hit. The 7950 has better performance results than the PS4. How's your 1600x900 Battlefield 4?

The 7950 has AMD Mantle, i.e. the consoles don't have API advantages over the PC.

7950+Mantle ~= PS4 scaled to 28-CU GCN with 240 GB/s memory bandwidth...

You are an idiot....... The 7950 might have better performance for now; we'll see how a 7950 stacks up in 12 months' time.

And my Battlefield 4 is just fine at 2560x1440, all Ultra settings, with 8xAA.

Mantle is a good leap forward for PC, but it's nowhere near as low-level and optimized as a console API, so don't compare the two.

7950+Mantle only shows a few percent of an improvement on the GPU side of things; most of Mantle's performance benefit comes from the CPU, but my 5GHz 3770K doesn't struggle anyway.

#6 scatteh316
Member since 2004 • 10273 Posts

I already posted a comment from a developer showing that the ESRAM is too piss-poor small to be of any real benefit to the Xbone.

It needed 64MB of it for it to be classed as a good advantage.

The PS4 has a memory subsystem that is just as quick and many times bigger.
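
For reference, a quick sketch of the headline numbers behind that claim (peak figures as widely reported; sustained ESRAM throughput in practice is workload-dependent):

```python
# Peak bandwidth and capacity, per the widely reported specs. The
# ESRAM line uses the ~109 GB/s minimum figure; Microsoft quotes up
# to ~204 GB/s for combined read+write, but sustained throughput is
# workload-dependent.
pools = [
    # (name,         GB/s,  capacity in MB)
    ("PS4 GDDR5",   176.0,  8192),
    ("XB1 DDR3",     68.3,  8192),
    ("XB1 ESRAM",   109.0,    32),
]

for name, bandwidth, capacity in pools:
    print(f"{name:10s} {bandwidth:6.1f} GB/s over {capacity:5d} MB")

# The PS4's single pool is in the same bandwidth ballpark as the
# ESRAM while being 8192 / 32 = 256 times larger.
```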

#7 scatteh316
Member since 2004 • 10273 Posts

Lmao... The PS2's EDRAM was not designed to be used as a cache; it was designed to be used as pure VRAM storage.

PC graphics cards don't use ESRAM or EDRAM because it's a stupid fucking idea.

EDRAM/ESRAM on PC = less die space for ROPs, TMUs and ALUs.....

#8 scatteh316
Member since 2004 • 10273 Posts

So if you're sticking 6GB worth of a title's textures in ESRAM, where are you going to put the frame buffer? :|

#9 scatteh316
Member since 2004 • 10273 Posts

What do you reckon?

The PS4 and Xbone using x86-style CPUs and normal GPUs (especially in the PS4) should mean that porting to PC is soooo much easier this time around.

It will make it easier for developers to target as many people as possible.

This could potentially reduce the number of 'console' exclusives this time around.

Good thing or bad?

#10  Edited By scatteh316
Member since 2004 • 10273 Posts

Waiting for Lemmings to blame the fact the game doesn't use DX12...lmao