PS4 and Xbox One GDC 2014 Presentation : ROPS

#1 Edited by scatteh316 (4848 posts) -

Simply put, Xbox One is severely hampered by ROP performance and will struggle to deliver native 1080p games due to fill rate and bandwidth constraints.

PS4 is also somewhat bandwidth starved, but it has a big advantage over Xbox One when it comes to ROP performance, making 1080p easily achievable on PS4 compared to Xbox.

http://www.bilder-upload.eu/upload/179726-1396172332.jpg

http://www.bilder-upload.eu/upload/a14170-1396172466.jpg

Long story short, Xbox One might improve over the coming months, but native 1080p will NOT be commonplace on Xbox due to bandwidth and ROP limitations.

Link

Edit: Chrome won't let me add the images properly :/

#2 Posted by Salt_The_Fries (8364 posts) -

Apparently Bone has only 3.5x the fill rate of the 360...

#3 Edited by R3FURBISHED (10330 posts) -
#4 Posted by scatteh316 (4848 posts) -

My response hasn't changed since I first saw this

I don't know what that nonsense means or what in the world a ROP is.

Basically, ROPs generate the final pixels; the more fill rate you have, the more pixels you can render.

720p is less than 1 million pixels, whereas 1080p is just over 2 million, so 1080p requires roughly 2.25x the fill rate.
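A quick back-of-the-envelope sketch of that pixel math (plain Python, nothing console-specific; the 60 fps figure is just an illustrative target, and real frames need far more fill than one opaque pass):

```python
# Rough sketch of the resolution-vs-fill-rate arithmetic from the post above.
# A ROP (render output unit) writes finished pixels; the fill rate a frame
# needs scales with pixels per frame times the target frame rate.

def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)     # 921,600 pixels
p1080 = pixels(1920, 1080)   # 2,073,600 pixels

print(p720)                  # 921600  (just under 1 million)
print(p1080)                 # 2073600 (just over 2 million)
print(p1080 / p720)          # 2.25 -- 1080p needs 2.25x the fill rate of 720p

# At 60 fps, assuming a single full-screen opaque pass per frame (a big
# simplification -- real frames have overdraw, transparency, and post passes):
print(round(p1080 * 60 / 1e9, 3))   # 0.124 GPixel/s for that one pass
```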

#5 Posted by getyeryayasout (7173 posts) -

Da-Doom ROP-ROP-ROP Da-Doom ROP-ROP

#6 Edited by ronvalencia (15109 posts) -
@scatteh316 said:

Simply put, Xbox One is severely hampered by ROP performance and will struggle to deliver native 1080p games due to fill rate and bandwidth constraints.

PS4 is also somewhat bandwidth starved, but it has a big advantage over Xbox One when it comes to ROP performance, making 1080p easily achievable on PS4 compared to Xbox.

http://www.bilder-upload.eu/upload/179726-1396172332.jpg

http://www.bilder-upload.eu/upload/a14170-1396172466.jpg

Long story short, Xbox One might improve over the coming months, but native 1080p will NOT be commonplace on Xbox due to bandwidth and ROP limitations.

Link

Edit: Chrome won't let me add the images properly :/

Are you claiming real games don't have texture and stencil/Z-ROP bandwidth costs? Pure color-ROP vs bandwidth math is nice for benchmarks, but it doesn't reflect real-life workloads.

Microsoft calculates its ROP bandwidth with reads and writes, not just pure writes.

From http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"For example, consider a typical game scenario where the render target is 32bpp [bits per pixel] and blending is disabled, and the depth/stencil surface is 32bpp with Z [depth] enabled. That amounts to 12 bytes of bandwidth needed per pixel drawn (eight bytes write, four bytes read). At our peak fill-rate of 13.65GPixels/s that adds up to 164GB/s of real bandwidth that is needed which pretty much saturates our ESRAM bandwidth. In this case, even if we had doubled the number of ROPs, the effective fill-rate would not have changed because we would be bottlenecked on bandwidth"
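The arithmetic in that quote checks out; here is a minimal sketch (the 8-byte-write / 4-byte-read split is exactly as stated in the quote, and the 13.65 GPixel/s figure is Microsoft's quoted peak):

```python
# Sanity-checking the bandwidth arithmetic in the Microsoft quote above.
color_write = 4   # 32bpp render target, blending off: 4-byte write per pixel
depth_write = 4   # 32bpp depth/stencil with Z on: 4-byte write per pixel
depth_read  = 4   # Z test: 4-byte read per pixel

bytes_per_pixel = color_write + depth_write + depth_read   # 12 bytes total
peak_fill = 13.65e9   # Xbox One peak fill rate in pixels/s (16 ROPs x 853 MHz)

bandwidth = peak_fill * bytes_per_pixel / 1e9   # GB/s
print(bytes_per_pixel)       # 12
print(round(bandwidth, 1))   # 163.8 -- ~164 GB/s, saturating the ESRAM
```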

AMD's ROP units contain both stencil/Z-ROPs and color ROPs. There are more stencil/Z-ROPs than color ROPs, i.e. AMD designed its GPUs with a stencil/Z-ROP bias.

---------------------

I can gimp my R9-290 down to 69 GB/s memory bandwidth and it still delivers 1920x1080p, e.g. Tomb Raider 2013 at 35.5 fps on Ultimate settings (with TressFX). I can increase my fps with shadow resolution set to normal and texture quality set to high.

-------------

R9-290 with 109 GB/s memory bandwidth: 47.5 fps average result.

-------------

Note that ROPs are not the only memory bandwidth consumers, e.g. TMUs also read from memory. The memory bandwidth has to be shared with the other consumers.

#7 Posted by _Matt_ (8867 posts) -

Da-Doom ROP-ROP-ROP Da-Doom ROP-ROP

I could not have put it any better myself.

#8 Edited by Suppaman100 (3771 posts) -

TLHBO.

"Bu..bu...bu...DirectX12 will surely give Xbone a power boost" LOL

Have fun with your subpar gaming experience lemmings

#9 Posted by Bishop1310 (986 posts) -

lol...system wars

#10 Posted by Solid_Max13 (3527 posts) -

If this is true then MS should apologize to everyone and refund them in some way

#11 Posted by scatteh316 (4848 posts) -

@ronvalencia: No one gives a flying cart about your 290 and your uncontrolled rubbish results.

Same shit in a different thread.

If you bothered to read the presentation you'll see these figures are from a well respected developer giving REAL WORLD figures and experiences with BOTH machines.

These are not some made up bullshit numbers from Sony or Microsoft.

#12 Posted by blamix (192 posts) -
#13 Posted by lglz1337 (3132 posts) -

in short XBone is a waste

#14 Posted by NFJSupreme (5150 posts) -

So nothing new was learned. Just re-explaining what we already know.

#15 Posted by SecretPolice (21483 posts) -

The Bore must be even weaker since The One sports RYSE, the current graphics king so with Sony going broke, best unload that thing before it's worth nothin at all :o umm, just sayin. :P

#16 Posted by Solid_Max13 (3527 posts) -

@SecretPolice: was that even a sentence? wtf are you even saying?

#17 Edited by ronvalencia (15109 posts) -
@scatteh316 said:

@ronvalencia: No one gives a flying cart about your 290 and your uncontrolled rubbish results.

Same shit in a different thread.

If you bothered to read the presentation you'll see these figures are from a well respected developer giving REAL WORLD figures and experiences with BOTH machines.

These are not some made up bullshit numbers from Sony or Microsoft.

Same $hit in different thread.

I have posted similar ROP numbers (easy to calculate from the hardware specs) in the past, and your pure color-ROP vs memory bandwidth comparison doesn't factor in Z-ROP bandwidth consumption. If you read the presentation, there's a method to bypass the ROPs, i.e. via UAVs (introduced in DX11.0 and expanded in DX11.1 Feature Level 11_1).

If Z-ROPs were nearly useless, AMD wouldn't have quadrupled the Z-ROP count over color ROPs.

From http://www.techpowerup.com/gpudb/296/radeon-hd-7970.html

The 7970 has 128 Z-ROPs, which is 4x its 32 color ROPs. AMD invested more in Z-ROPs than in color ROPs.

Microsoft's "BS" numbers actually match your posted numbers, e.g. 109 GB/s write + 54 GB/s read = 163 GB/s total.

If memory read operations were useless, AMD/NVIDIA/Intel would have designed their memory controllers with only the write direction.

-------------

I have already shown you a real game app with 1920x1080p rendering and 69 GB/s memory bandwidth.

-------------

http://en.wikipedia.org/wiki/Avalanche_Studios

Note: Avalanche Studios is known to be an NVIDIA "The Way It's Meant to be Played" developer on the PC.

VS

http://en.wikipedia.org/wiki/Rebellion_Developments#List_of_video_games_developed

Note: Rebellion is known to be an AMD Gaming Evolved (Mantle) developer on the PC.

#18 Posted by SecretPolice (21483 posts) -

Too deep for you with shallow minds, move along little doggy. :P

#19 Posted by getyeryayasout (7173 posts) -

Too deep for you with shallow minds, move along little doggy. :P

Did you get sent up river? Where ya been?

#20 Edited by SecretPolice (21483 posts) -

@getyer.. 10th attempt, my posts don't show up, my servers hate the new GS format and so do I. :(

#21 Edited by getyeryayasout (7173 posts) -
#22 Posted by tormentos (17096 posts) -

@scatteh316 said:

Simply put, Xbox One is severely hampered by ROP performance and will struggle to deliver native 1080p games due to fill rate and bandwidth constraints.

PS4 is also somewhat bandwidth starved, but it has a big advantage over Xbox One when it comes to ROP performance, making 1080p easily achievable on PS4 compared to Xbox.

http://www.bilder-upload.eu/upload/179726-1396172332.jpg

http://www.bilder-upload.eu/upload/a14170-1396172466.jpg

Long story short, Xbox One might improve over the coming months, but native 1080p will NOT be commonplace on Xbox due to bandwidth and ROP limitations.

Link

Edit: Chrome won't let me add the images properly :/

hahahahaaaaaaaaaaaaaa............

I know a certain person who will put up a great fight against this screen..lol

@ronvalencia

Oh oh... ROP Bound...

#23 Edited by misterpmedia (3363 posts) -

So nothing new was learned. Just re-explaining what we already know.

Pretty much this. Microsoft's defense was: 'look at the games'. We have more than enough evidence by now to know the Xbone struggles with full HD.

#24 Edited by ronvalencia (15109 posts) -
@tormentos said:

@scatteh316 said:

Simply put, Xbox One is severely hampered by ROP performance and will struggle to deliver native 1080p games due to fill rate and bandwidth constraints.

PS4 is also somewhat bandwidth starved, but it has a big advantage over Xbox One when it comes to ROP performance, making 1080p easily achievable on PS4 compared to Xbox.

http://www.bilder-upload.eu/upload/179726-1396172332.jpg

http://www.bilder-upload.eu/upload/a14170-1396172466.jpg

Long story short, Xbox One might improve over the coming months, but native 1080p will NOT be commonplace on Xbox due to bandwidth and ROP limitations.

Link

Edit: Chrome won't let me add the images properly :/

hahahahaaaaaaaaaaaaaa............

I know a certain person who will put up a great fight against this screen..lol

@ronvalencia

Oh oh... ROP Bound...

Nothing specific on 1920x1080p. All it states is that color-ROP throughput is less than memory bandwidth. You haven't factored in Z-ROPs, whose count is 4x the color-ROP count.

Unlike TC's post, Rebellion's statement was specific for 1920x1080p native output.

The R9-290's 64 ROPs at 947MHz would still be "ROP bound" with RGBA8 (integer) against 320 GB/s memory bandwidth, e.g. the math: 947MHz x 64 x 4 bytes = 242 GB/s.
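That fill-rate-vs-bandwidth comparison, spelled out (a sketch of the same simple peak math, before any of the compression factors discussed next):

```python
# Peak color-fill bandwidth demand of the R9-290's ROPs vs its memory bus.
clock_hz = 947e6      # R9-290 core clock
rops = 64             # color ROP count
bytes_per_pixel = 4   # RGBA8: one 4-byte color write per pixel, no Z counted

rop_demand = clock_hz * rops * bytes_per_pixel / 1e9   # GB/s
print(round(rop_demand, 1))    # 242.4 GB/s

memory_bw = 320.0              # GB/s (R9-290, 512-bit GDDR5)
print(rop_demand < memory_bw)  # True -- the ROPs saturate before memory does,
                               # i.e. the card is "ROP bound" on pure color fill
```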

The big flaw with their math is that it ignores hardware color compression/decompression. There's more to it than the simple math, i.e. the GPU has to fit compute, TMU, Z-ROP, color-ROP, etc. workloads into the finite memory bandwidth.

JIT decompression and compression hardware will throw off any simple memory bandwidth vs ROPs math. From http://www.bit-tech.net/hardware/graphics/2007/05/16/r600_ati_radeon_hd_2900_xt/8

If you look, AMD's ROP unit has multiple read and write ports. TC's post focused on the blend section, i.e. the color ROPs. AMD increased the ratio of Z-ROPs (depth/stencil ROPs) to color ROPs (blend ROPs) from 2:1 to 4:1.

-------------

Again, I can gimp my R9-290 down to 69 GB/s memory bandwidth and it still delivers 1920x1080p, e.g. Tomb Raider 2013 at 35.5 fps on Ultimate settings (with TressFX). I can increase my fps with shadow resolution set to normal and texture quality set to high.

#25 Posted by Scipio8 (615 posts) -
#26 Edited by Shewgenja (8461 posts) -

@getyeryayasout said:

Da-Doom ROP-ROP-ROP Da-Doom ROP-ROP

Mmm-ROP, ba duba ROP

ba du ROP, ba duba ROP

shooby doobuh doom ba, mmm-ROP

#27 Posted by getyeryayasout (7173 posts) -

@getyeryayasout said:

Da-Doom ROP-ROP-ROP Da-Doom ROP-ROP

Mmm-ROP, ba duba ROP

ba du ROP, ba duba ROP

shooby doobuh doom ba, mmm-ROP

MmmROP, ROP-ROP, MmmROP Dubi-dop-do-dop

MmmROP, ROP-ROP, MmmROP Whoooa Yeah Yeah!

#28 Posted by Krelian-co (10368 posts) -

the xbots have been xboned again. Well, enjoy the more cinematic 720p another gen lol.

Also in b4 ron and his bs test and rops, oh wait.

#29 Posted by ronvalencia (15109 posts) -

the xbots have been xboned again. Well, enjoy the more cinematic 720p another gen lol.

Also in b4 ron and his bs test and rops, oh wait.

The BS is your post.

#30 Edited by Solid_Max13 (3527 posts) -

@ronvalencia: you clearly understand tech, but the issue is what many devs have stated about how they utilize it. In this case 1080p is not viable right now on the Xbone and won't be for some time, until they can utilize the hardware properly. Tomb Raider was a last-gen game and isn't hard to get to 1080p, but the newer games are where you'll see the drop

#31 Posted by Chutebox (36598 posts) -

@SecretPolice: Welcome back! That being said, you're drunk if you think ryse looks better than infamous!

#32 Posted by stationplay_4 (443 posts) -

man if only they would have called the xbox one the xbox 720. it would just be so perfect.

#33 Posted by SecretPolice (21483 posts) -

# Chute... thanks but my posts almost never show up, we'll see with this one. As for me being drunk, eh, many times that's likely a safe bet after work.. tehe j/k a bit but anyhoo, thus far RYSE and Mel Brooks says... it's good to be the king. :P

#34 Posted by iambatman7986 (394 posts) -

But I'm still having more fun on my One than PS4 right now. No amount of Rops is changing that at the moment. Just saying.

#35 Edited by lbjkurono23 (12544 posts) -

Sure looks like the xbox needs a rops injection.

#36 Edited by tormentos (17096 posts) -

@ronvalencia said:

Nothing specific on 1920x1080p. All it states is that color-ROP throughput is less than memory bandwidth. You haven't factored in Z-ROPs, whose count is 4x the color-ROP count.

Unlike TC's post, Rebellion's statement was specific for 1920x1080p native output.

The R9-290's 64 ROPs at 947MHz would still be "ROP bound" with RGBA8 (integer) against 320 GB/s memory bandwidth, e.g. the math: 947MHz x 64 x 4 bytes = 242 GB/s.

The big flaw with their math is that it ignores hardware color compression/decompression. There's more to it than the simple math, i.e. the GPU has to fit compute, TMU, Z-ROP, color-ROP, etc. workloads into the finite memory bandwidth.

JIT decompression and compression hardware will throw off any simple memory bandwidth vs ROPs math. From http://www.bit-tech.net/hardware/graphics/2007/05/16/r600_ati_radeon_hd_2900_xt/8

If you look, AMD's ROP unit has multiple read and write ports. TC's post focused on the blend section, i.e. the color ROPs. AMD increased the ratio of Z-ROPs (depth/stencil ROPs) to color ROPs (blend ROPs) from 2:1 to 4:1.

-------------

Again, I can gimp my R9-290 down to 69 GB/s memory bandwidth and it still delivers 1920x1080p, e.g. Tomb Raider 2013 at 35.5 fps on Ultimate settings (with TressFX). I can increase my fps with shadow resolution set to normal and texture quality set to high.

ROP bound, period. A developer says it, move on..lol

I love how you want to give credit only to arguments that serve you best, even selectively quoting sources. So the part you like is true, like ROPs not being a problem as DF claims, but then you go completely into denial on things like Bonaire being inside the Xbox One, which DF also claims.

You can't have it both ways: either DF is good for both claims or it isn't for any..

@Solid_Max13 said:

@ronvalencia: you clearly understand tech, but the issue is what many devs have stated about how they utilize it. In this case 1080p is not viable right now on the Xbone and won't be for some time, until they can utilize the hardware properly. Tomb Raider was a last-gen game and isn't hard to get to 1080p, but the newer games are where you'll see the drop

The developer he quotes, Rebellion, says it: ESRAM is too small and they have to do tricks and use tiling. Tricks mean changing things: lowering assets, turning off effects or doing them at half resolution like in Tomb Raider's case, and/or decreasing frame rate or reducing resolution..

My argument is backed up by game performance; so far his arguments are backed up by nothing. I have been quoting benchmarks from Anandtech here for months, telling Ronvalencia what the difference would be, and he refuses to admit it. He claims the 7770 didn't represent the Xbox One's cache, ESRAM, bandwidth and bus, so in his eyes all that would change the world. In reality it didn't, and the Xbox One performed even worse than a 7770, which can actually hit 1080p in quite a lot of games at certain quality settings.

Hell, the gap I used to post here between the 7850 and the 7770 was even smaller than the gap between the Xbox One and PS4. Games like Tomb Raider running at the same quality on the 7770 don't run as much as 30 FPS slower while having lower-resolution effects, 900p cut scenes and worse textures, like the Xbox One does vs the PS4. The real-life gap is what talks.

#37 Edited by scatteh316 (4848 posts) -

@ronvalencia: You need a clue. These consoles are not using 290's, they're not running PC drivers, they're not using PC operating systems, they're not using the higher-level code the PC is....

So get that into your thick skull before you post any more.....but....but...but..... I can gimp my 290 in situations that are completely irrelevant to the consoles.

PS4 > Xbone... That is pure FACT....ACCEPT IT..

#38 Posted by Solid_Max13 (3527 posts) -

@tormentos: Would it be wrong to say I admire you! lol

@scatteh316 said:

@ronvalencia: You need a clue. These consoles are not using 290's, they're not running PC drivers, they're not using PC operating systems, they're not using the higher-level code the PC is....

So get that into your thick skull before you post any more.....but....but...but..... I can gimp my 290 in situations that are completely irrelevant to the consoles.

PS4 > Xbone... That is pure FACT....ACCEPT IT..

He won't; he's going to come back with Rebellion > you, and ROPs, and his 290, and so on and so forth. But tormentos' post and yours made it completely clear what devs have found and what the X1 is capable of right now.

#39 Posted by FoxbatAlpha (6145 posts) -

Lol, no. From my man MisterC.

1. GPU is still considered the best in Trogput acclerator

this time we dont need ROPS, we dont need FIXED funtion

as those function already handled by OLS DX11.1 GPU

we stripped down all of it and we got pure Beefed UP GCN2.0 CU

no fixed funtion

2. CPU is still the best in Latency optimized accelerator

3. SRAM/ESRAM is much much faster than even edram or GDDR5

4. there is other acclerator too, that bes to put the closest

path to esrm

5. Like i put previously, Stakced is must for

a. Wattage

b. shortest path for data movement

c. gain momentum in 20/22/16nm

#40 Posted by edwardecl (2112 posts) -

Xrops180...

#41 Posted by navyguy21 (12746 posts) -

TC, you really shouldn't post about something that you have no knowledge of.

Nowhere in that document did it say what you said, and the article doesn't even say that the ESRAM is capable of reading and writing at the same time. Not to mention that ESRAM and DDR3 can be used simultaneously. Not sure the document was even speculating on graphics capabilities anyway, since ROPs don't tell the whole story.

And it's funny to see so many run with it as if they know anything.

The only one with any knowledge so far is ronvalencia.

#42 Posted by NFJSupreme (5150 posts) -

As much as I love tech talk, as much as I like seeing ron talk over you guys' heads, and as much as I like seeing tormentos go to blows with him, can we give it all a rest now? At the end of the day the PS4 is still stronger than the Xbone. It could be 50% stronger. It could be only 25% stronger. At the end of the day it's still stronger. Hopefully the Xbone will compete better with proper drivers, dev tools, cloud rain, secret sauce, magic pixie dust, whatever, but even then it will still be weaker than the PS4.

#43 Posted by scatteh316 (4848 posts) -

@navyguy21: If you know your hardware, it's quite easy to see the implications.

ESRAM can read and write at the same time but has to miss one cycle in every eight, which is lost performance. The ROPs were built into the daughter die on Xbox 360, making the EDRAM ideal for use as a frame buffer. It's not the same situation now on Xbox One.

When it comes to running at native 1080p, it's clear to everyone that understands the rendering pipeline that Xbox One is always going to struggle.

And ronvalencia doesn't know anything; he just pastes the same old crap about his Rebellion developers and his silly 290 experiments.

#44 Edited by Sollet (7342 posts) -

@navyguy21 said:

TC, you really shouldn't post about something that you have no knowledge of.

Nowhere in that document did it say what you said, and the article doesn't even say that the ESRAM is capable of reading and writing at the same time. Not to mention that ESRAM and DDR3 can be used simultaneously. Not sure the document was even speculating on graphics capabilities anyway, since ROPs don't tell the whole story.

And it's funny to see so many run with it as if they know anything.

The only one with any knowledge so far is ronvalencia.

Which has been proven wrong x amount of times lol.

#45 Posted by I_can_haz (6551 posts) -

XBone 720p is a weak POS confirmed.

"b...bu....bu....but teh dirextX 12 boost!!!1!" LOL TLHBO

#46 Posted by remiks00 (1706 posts) -

I have no idea what the hell any of that means :\. @ronvalencia can you explain what the hell you are saying in layman's terms please. I'm trying to learn.

#47 Edited by ronvalencia (15109 posts) -

@scatteh316 said:

@navyguy21: If you know your hardware, it's quite easy to see the implications.

ESRAM can read and write at the same time but has to miss one cycle in every eight, which is lost performance. The ROPs were built into the daughter die on Xbox 360, making the EDRAM ideal for use as a frame buffer. It's not the same situation now on Xbox One.

When it comes to running at native 1080p, it's clear to everyone that understands the rendering pipeline that Xbox One is always going to struggle.

And ronvalencia doesn't know anything; he just pastes the same old crap about his Rebellion developers and his silly 290 experiments.

It's you who doesn't know anything. Unlike your source, Rebellion has specifically addressed 1920x1080p. Rebellion developers > you.

You're not factoring in that Xbox 360 is limited to 8 color ROPs at 500MHz (4 Gigapixels/s) and the connection between the GPU and the EDRAM daughter die is limited to 32 GB/s.

Unlike Xbox 360, Xbox One's ROPs can write to both DDR3 and ESRAM, and it has 16 ROPs at a higher 853MHz.

From the Hot Chips diagram, Xbox One's peak BW is 204 GB/s for its ESRAM.
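For what it's worth, those peak fill-rate figures can be sketched like this (simple peak math only; it ignores the read/write bandwidth accounting discussed earlier in the thread):

```python
# Peak color fill rate: ROP count x clock, in GPixel/s.
x360_fill = 8 * 500e6 / 1e9    # Xbox 360: 8 color ROPs x 500 MHz
xone_fill = 16 * 853e6 / 1e9   # Xbox One: 16 ROPs x 853 MHz

print(x360_fill)                        # 4.0 GPixel/s
print(round(xone_fill, 2))              # 13.65 GPixel/s
print(round(xone_fill / x360_fill, 2))  # 3.41 -- ~3.4x the 360's peak fill
```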

#48 Edited by btk2k2 (363 posts) -

@ronvalencia said:

It's you who doesn't know anything. Unlike your source, Rebellion has specifically addressed 1920x1080p. Rebellion developers > you.

You're not factoring in that Xbox 360 is limited to 8 color ROPs at 500MHz (4 Gigapixels/s) and the connection between the GPU and the EDRAM daughter die is limited to 32 GB/s.

Unlike Xbox 360, Xbox One's ROPs can write to both DDR3 and ESRAM, and it has 16 ROPs at a higher 853MHz.

From the Hot Chips diagram, Xbox One's peak BW is 204 GB/s for its ESRAM.

There are 3 issues with the Xbox One that make it difficult to achieve 1080p, and they are all linked together:

1) Low-bandwidth main memory pool.

2) Low-capacity high-bandwidth memory pool.

3) 16 ROPs.

In some cases the developer can fit the render target into the ESRAM and make use of its high bandwidth; in that scenario the limiting factor for 1080p is the number of ROPs, as you are ROP bound.

In other cases the developer cannot fit the render target into ESRAM, so they have to use the DDR3; in that scenario the limiting factor for 1080p is bandwidth, as you become bandwidth bound.
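A rough sizing sketch of why "does it fit" is the question at 1080p (the four-target G-buffer layout is a hypothetical but common deferred-rendering example, not something from the presentation):

```python
# Render-target sizes vs the Xbox One's 32 MB ESRAM budget.
def target_mib(width, height, bytes_per_pixel):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = target_mib(1920, 1080, 4)   # 32bpp color buffer
depth = target_mib(1920, 1080, 4)   # 32bpp depth/stencil buffer
print(round(color + depth, 1))      # 15.8 MiB -- a simple forward setup fits

# A deferred renderer with four 32bpp G-buffer targets plus depth
# (a hypothetical layout) is a different story:
gbuffer = 4 * color + depth
print(round(gbuffer, 1))            # 39.6 MiB -- over the 32 MB ESRAM budget
```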

Your benchmark is flawed for all the reasons scatteh316 stated, and for at least two additional ones: first, you are using a game that is not really pushing the envelope in terms of fill-rate requirements; second, you are using a built-in game benchmark, and it is widely known that both AMD and Nvidia optimise the hell out of in-game benchmarks so that their hardware looks better in reviews.

There will be ways to get 1080p on the Xbox One, as is self-evident from the fact that some games already work at 1080p. The problem is that it takes a fair few sacrifices in other aspects of image quality to achieve it, because of the three limitations above: if your render target is too big you are bandwidth starved, and if it is the right size you are ROP starved. The latter is the least bad, but it is not ideal.

If you need to resort to tiling tricks so that some of the render target is in ESRAM and some is in DDR3, managing that so the low-bandwidth RAM does not cause problems is not going to be easy. Unless MS can create tools to automate this, I do not see multiplatform developers going through the effort when they can just use 900p or 720p and be done with it.

#49 Posted by SecretPolice (21483 posts) -

Phenom E3 for MS inbound whilst the cows & Sony bloviate about supposed superior specs., yeah, sounds a lot like last gen, da powah of teh PS3 will be unleashed any day now .... since 2006. :P

#50 Edited by -RocBoys9489- (6196 posts) -

wtf is going on; buy a PC ya twats