DirectX 12 Boosts Xbox One CPU Performance by 50%, GPU by 20% (leak)


#101  Edited By AzatiS
Member since 2004 • 14969 Posts

@RR360DD said:

Don't care.

As long as Microsoft Games Studios keep providing the good stuff, I'll be happy.

You're the only lem I really respect.

Why do I say that?

Because back when everyone was attacking the X1 for not being able to do 1080p, or for lacking in performance in general, you were saying you didn't care. And now you're saying the very same thing about DX12 supposedly working wonders and filling the "power" gap the X1 was missing.

You're legit, I like you.


#102 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

@tormentos:

You have used R7-250X and R7-265 which are PC SKU examples and you didn't limit your generalizations.

If you are going to use PC SKU examples to reflect XBO vs PS4 comparisons,

XBO ~= Pitcairn ES, i.e. 12 CU (768 stream processors) at 860 MHz (1.32 TFLOPS), 153.6 GB/s, dual tessellation units.

PS4 ~= R7-265, i.e. 16 CU at 900 MHz (1.89 TFLOPS), 179 GB/s, dual tessellation units.

http://www.tomshardware.com/reviews/768-shader-pitcairn-review,3196.html
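The TFLOPS figures in the SKU list above follow from the standard GCN peak-throughput formula (CUs × 64 shader lanes × 2 FLOPs per cycle × clock). A quick sketch to sanity-check them, using the well-known console configurations:

```python
def gcn_tflops(cus: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPS for an AMD GCN part:
    each CU has 64 shader lanes, each capable of one fused
    multiply-add (2 FLOPs) per cycle."""
    lanes = cus * 64
    return lanes * 2 * clock_mhz * 1e6 / 1e12

print(round(gcn_tflops(12, 860), 2))   # XBO-like Pitcairn ES: 1.32
print(round(gcn_tflops(18, 800), 2))   # PS4 (18 CU at 800 MHz): 1.84
```

This is only the peak-rate arithmetic; sustained throughput depends on the memory system and the workload.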

XBO's split render target (frame buffer writes to DDR3 at 68 GB/s + ESRAM at 104 GB/s) + the 48-TMU workaround for 16 ROPs + dual tessellation units ~= Pitcairn ES's 153.6 GB/s memory bandwidth. Avalanche Studios (Mad Max) devised the 16-ROP workaround with 48 TMUs, i.e. for the memory-bandwidth-bound case.

XBO with just 68 GB/s DDR3 memory usage ~= 7770/R7-250X, or a downclocked R7-260 at 853 MHz.

PS4 ~= R7-265. It's simple.
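The "DDR3 + ESRAM" sum above only materialises when write traffic is actually split across both pools; a toy model (not the real XBO memory system, just the arithmetic) shows how the effective figure depends on the split:

```python
def combined_bw(ddr3_gbs: float, esram_gbs: float, esram_fraction: float) -> float:
    """Effective bandwidth when a fraction of render-target traffic is
    steered to ESRAM and the rest to DDR3. Total throughput is capped
    by whichever pool hits its limit first."""
    esram_cap = esram_gbs / esram_fraction if esram_fraction > 0 else float("inf")
    ddr3_cap = ddr3_gbs / (1 - esram_fraction) if esram_fraction < 1 else float("inf")
    return min(esram_cap, ddr3_cap)

ideal = 104 / (104 + 68)  # steer ~60% of traffic to ESRAM
print(round(combined_bw(68, 104, ideal)))  # 172 GB/s combined peak
print(round(combined_bw(68, 104, 0.9)))    # 116 GB/s when skewed to ESRAM
```

The point the posters argue over falls out of this: the combined peak is only reached when the traffic split matches the capacity ratio of the two pools, which is why bandwidth-hungry targets cannot simply live in DDR3.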

See, this is why I consider you a blind hypocrite: everything must work the way you think it works. So an R7-250X doesn't represent the Xbox One GPU because it has 2 fewer CUs, but how the fu** does Pitcairn represent the Xbox One when Pitcairn has 32 ROPs and 153 GB/s, which the Xbox One doesn't have?

I use the R7-250X because, regardless of having 2 fewer CUs, it has about the same FLOP count as the Xbox One, because it is a refreshed 7770.

The Xbox One GPU is Bonaire (7790) with 2 fewer CUs and a lower clock speed; it is not Pitcairn in any way. So stop with the whole "this doesn't represent that" argument. It is hypocritical to try to pass the Xbox One GPU off as Pitcairn: the Xbox One GPU is Bonaire 7790, 14 CUs with 2 disabled for redundancy, and the PS4 is Pitcairn 7870 with 2 CUs disabled. The fact that you still hold to that crap today says how stupid and silly your arguments still are.

Prove to me that Bonaire has 32 ROPs and 153 GB/s and you have a point. The Xbox One GPU is 14 CUs, not 12, with 2 disabled. Your arguments are pure bullshit.

Hell, you are using arguments from 2013, before the consoles were out, when you were claiming the difference would be quality only, at the same resolution.. hahahahaaha you were wrong... hahaha

No, it is not a workaround. A split render target only works when the data that resides in DDR3 doesn't need fast bandwidth, which is why Forza 5 uses it: the sky was placed in DDR3 because it didn't change and was static, so it didn't require fast bandwidth. Anything that needs fast bandwidth can't be handled by a split render target.

There is no downclocked R7-260, and by the way the R7-260 has 16 ROPs, unlike the Pitcairn example you picked, which has 32, and it has 96 GB/s, not 153 GB/s like the Pitcairn model.

By the way, stop spinning. My point is simple: you compare two equal GPUs using different CPUs. In what way does an i7 vs. an A10 represent the PS4 vs. Xbox One situation? Oh yeah, in no way; you are just too butthurt and blind to see it.. hahahaa

@GrenadeLauncher said:

LOL I wonder how long it would be before this turned up here.

It's fake. The word frequently wouldn't be misspelt in a technical document.

@tymeservesfate owned in his own thread. @blackace is looking pretty stupid too.

It is fake. I already posted the original and the link to MS's own page, which is where this was taken from..hahahaaha


#103 misterpmedia
Member since 2013 • 6209 Posts

@blackace said:
@misterpmedia said:

Lems still clinging to the secret sauce? We're knocking on the door of 2016 and it's still not revealed? Anyway, this thread already delivered, with Blacklem going full, all-in lem with his bright green leminess.

Cows still hoping, praying, and wishing that DX12/Win10 is imaginary and will never see the light of day.... just like the Cloud Tech (or wait a minute... lol!!). Poor Cows. It's only going to get worse for you trolls.

lmfao, you need to re-read the first comment you made in this thread and realise how stupid this one looks. Owning yourself in your own pool of irony. Exactly like a blacklem would. You trolls know no bounds. I'll wait until 2018, when you're still sounding off the same old blacklem troll garble ;)


#104 tormentos
Member since 2003 • 33784 Posts

@misterpmedia said:
@blackace said:

Cows still hoping, praying, and wishing that DX12/Win10 is imaginary and will never see the light of day.... just like the Cloud Tech (or wait a minute... lol!!). Poor Cows. It's only going to get worse for you trolls.

lmfao, you need to re-read the first comment you made in this thread and realise how stupid this one looks. Owning yourself in your own pool of irony. Exactly like a blacklem would. You trolls know no bounds. I'll wait until 2018, when you're still sounding off the same old blacklem troll garble ;)

The poor lem predicted that all games by 2015 would be 1080p and that 720p and 900p would be over. There are only 4 months left in the year and his dreams are getting crushed..hahahahaa

Blackace is without doubt one of the most butthurt lemmings, which makes it easy to identify his alter egos, as they tend to have the same meltdowns, make the same delusional claims, and even use the same fake "manticore, 20+ years gaming" crap..hahaha

Blackace,B4X,NyaDC...lol


#105 FoxbatAlpha
Member since 2009 • 10669 Posts

So while Sony shouts things from the rooftops to get cows and fanboys to cream themselves, Microsoft keeps a lid on things. Phil trolled you all with his comments on DX12 and now BAM!!!!!!!!!!! Shot to the nuts.

Thank you Phil for owning these fools.


#106 tormentos
Member since 2003 • 33784 Posts

@FoxbatAlpha said:

So while Sony shouts things from the rooftops to get cows and fanboys to cream themselves, Microsoft keeps a lid on things. Phil trolled you all with his comments on DX12 and now BAM!!!!!!!!!!! Shot to the nuts.

Thank you Phil for owning these fools.

It's fake...........................hahahahaaaaaaaaaaaaaa


#107 StormyJoe
Member since 2011 • 7806 Posts

@ButDuuude said:
@StormyJoe said:
@Cheleman said:

STILL considerably weaker than a ps4...

LOL

The PS4 is only 30% more powerful. Hardly an earth-shattering difference.

lol love it how you guys keep bringing it down :)

"We" are not bringing it down, people like "you" keep trying to make the number higher.


#108 StormyJoe
Member since 2011 • 7806 Posts

@jaz_0 said:

Similar to Nvidia, AMD won’t support some features of DirectX 12 as well. AMD’s Robert Hallock said that there is not a single graphics card with “full support” for DirectX 12 in the market right now.

He wrote:

I think gamers are learning an important lesson: there’s no such thing as “full support” for DX12 on the market today.

There have been many attempts to distract people from this truth through campaigns that deliberately conflate feature levels, individual untiered features and the definition of “support.” This has been confusing, and caused so much unnecessary heartache and rumor-mongering.

He went on to say that every GPU comes with its unique architecture, and none have full support for DirectX 12 right now.

---

For those ignorant Boners who think DirectX 12 will add new features to the Xbone's current hardware.

Calling someone ignorant while posting that electronic tripe - LOL, the irony!

He did not say what those features were, or even if any developers are asking for them. So, people like you go off saying "DX12 won't add anything to XB1!", which is a conclusion completely unsupported by that quote.


#109 StormyJoe
Member since 2011 • 7806 Posts

@miahz1986 said:

It's funny how Microsoft believes software and cloud computing are the future and make a huge difference. Heck, they must even think elephants can fly.

Moving on... If that were the case, then any Tom, Dick or Harry could have a low-range graphics card in a PC and get a CPU/GPU boost LOL

Cloud computing: I've hardly seen any games using this feature, and it doesn't make a huge difference anyway. Imagine the servers go down or anything like that then cloud computing is MUTE. Same goes for digital games LOL and yes, Microsoft thinks the future is digital (good luck with that).

"Imagine the servers go down or anything like that then cloud computing is MUTE" <-- quite possibly the funniest ignorant fanboy comment I have ever heard.

First off, cloud computing is a version of distributed computing. What's great about cloud computing is that there is no single defined server; it's a collection of servers. In MS's case, hundreds of thousands of computers. Saying "the servers go down" is like saying "the Internet went down".

Secondly, it's "moot", not "mute".

Lastly, the issue with cloud computing is bandwidth, not the theory.
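The "no single defined server" point is the essence of replication: a job only fails if every replica is down at once. A toy failover sketch (hostnames invented for illustration; real cloud schedulers are far more involved):

```python
import random

# Hypothetical server pool; these names are made up for illustration.
POOL = [f"compute-{i:03d}.example.net" for i in range(8)]

def run_job(job: str, pool: list, failed: set) -> str:
    """Try replicas in random order until one that is up accepts
    the job. The job fails only if *every* replica is down."""
    for host in random.sample(pool, len(pool)):
        if host not in failed:
            return f"{job} ran on {host}"
    raise RuntimeError("all replicas down")

# Even with most of the pool offline, the job still lands somewhere.
failed = set(POOL[:6])
print(run_job("physics-tick", POOL, failed))
```

A single outage degrading capacity is very different from the whole pool disappearing, which is the distinction being argued here.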


#110 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:
@ronvalencia said:

Using Tomorrows Children as an example http://fumufumu.q-games.com/archives/TheTechnologyOfTomorrowsChildrenFinal.pdf

They went from 33 ms to 27 ms by using async compute, which is around a 20% improvement for this game. This game only targets the GCN arch (PS4).

The above example shows already low CPU overhead APIs (PS4) extracting additional effective performance from Async compute usage. The example is also applicable for XBO.
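The "around 20%" in the quote sits between two ways of reading 33 ms → 27 ms; the arithmetic:

```python
frame_before_ms = 33.0
frame_after_ms = 27.0

# Time saved relative to the original frame: (33 - 27) / 33 ≈ 18%
time_saved = (frame_before_ms - frame_after_ms) / frame_before_ms

# Throughput (frames per second) gain: 33 / 27 ≈ 1.22x, i.e. ≈ 22%
throughput_gain = frame_before_ms / frame_after_ms - 1

print(f"{time_saved:.1%} time saved, {throughput_gain:.1%} more throughput")
```

Both readings round to "about 20%", which matches the quote.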

Keep the hopes alive. The Xbox One API will never be as slim as the PS4's, and the Xbox One will always be behind, all gen long..hahahahaa

Gosh, you were soooo close to having a rational point, then you went back to being a fanboy. LOL. Loser.


#112 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

"Imagine the servers go down or anything like that then cloud computing is MUTE" <-- quite possibly the funniest ignorant fanboy comment I have ever heard.

Total bullshit. XBL has been down, and that includes Azure games which use the cloud, like Titanfall. Remember the whole fiasco that plagued PSN and Live last holiday?

Live is a centralised service, which means more than one feature can be affected by server problems on their end.

@StormyJoe said:

Gosh, you were soooo close to having a rational point, then you went back to being a fanboy. LOL. Loser.

Oh really? Prove it. For freaking once, prove something you claim.

The PS4 API is made to work only on the PS4: no backward compatibility or legacy with any other hardware whatsoever. That means the API takes advantage of nothing but the specific PS4 hardware. DX12 has legacy attached to it; all DX versions do, because they are made with more than one set of hardware in mind. In fact, DX12 is supposed to help Xbox and PC games get ported between platforms more easily; how in hell do you think that happens?

So while DX12 has to support many hardware configurations, including the Xbox One, the PS4 API is slim and streamlined, made to take the best possible advantage of the unit. So true is this that the PS4 beat the PC and the Xbox One to async shaders; hell, it beat the Xbox One by 2 full years. MS's API will never be as clean as Sony's.

Anyway, I don't even know why this thread is open. The TC's article is fake, as was already proven.


#113  Edited By StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:

Total bullshit. XBL has been down, and that includes Azure games which use the cloud, like Titanfall... Live is a centralised service, which means more than one feature can be affected by server problems on their end.

The PS4 API is made to work only on the PS4: no backward compatibility or legacy with any other hardware whatsoever... MS's API will never be as clean as Sony's.

Tormentos, you don't know shit about cloud computing, so keep your piehole shut. Azure has gone down once - early last year - and that was because of a bad software update that affected 2/3rds of their servers. Azure and XBL are not the same thing.

Backwards compatibility is not indicative of poor performance in an API, stupid. Ever hear of overloaded methods? Of course not - you don't know anything about software development.
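The overloading point is that an API can add new entry points for new callers while the old call path stays untouched, so backward compatibility doesn't imply overhead. A minimal sketch in Python (which approximates overloading with `functools.singledispatch`; the `submit` function here is invented for illustration):

```python
from functools import singledispatch

@singledispatch
def submit(work):
    """Original entry point: accepts a plain command string."""
    return f"legacy path: {work}"

@submit.register
def _(work: list):
    """Newer overload: accepts a batch of commands, added without
    touching (or slowing) the original string path."""
    return [submit(w) for w in work]

print(submit("draw"))             # legacy callers are unchanged
print(submit(["draw", "flip"]))   # new callers get the batch form
```

Dispatch picks the implementation by argument type, so old callers never pay for the new surface area; that is the sense in which a compatible API need not be a slow one.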


#114  Edited By Zero_epyon
Member since 2004 • 20135 Posts

lol @blackace falling for fake DX12 news again!


#115 GrenadeLauncher
Member since 2004 • 6843 Posts

StormyJoke meltdown.

Your thoughts on this being fake @StormyJoe ?


#117 GrenadeLauncher
Member since 2004 • 6843 Posts

@sts106mat said:

How is he having a meltdown?

proving that people don't know things = having a meltdown now?

His five-post rant showed it. Poor StormyJoke, still fighting a lost cause.


#119  Edited By StormyJoe
Member since 2011 • 7806 Posts

@sts106mat said:
@GrenadeLauncher said:

StormyJoke meltdown.

Your thoughts on this being fake @StormyJoe ?

How is he having a meltdown?

proving that people don't know things = having a meltdown now?

Maybe you have to be down to @GrenadeLauncher's level to think so. IDK.

@tormentos owns himself so often, and when he's not, makes such fantastical comments bordering on paranoid delusion, that I have little patience for him.

GrenadeLauncher is also a worthless fanboy who just resorts to asinine name-calling (at least Tormentos can form complete sentences that are more than 5 words long). I might take offense, or care, if he weren't such a tool and his name-calling weren't so stupid and juvenile.


#121 GrenadeLauncher
Member since 2004 • 6843 Posts

Poor StormyJoke. Bitter for two years and counting.

Go play some videogames and calm down. Oh wait, I forgot, you don't have any.


#122 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

Tormentos, you don't know shit about cloud computing, so keep your piehole shut. Azure has gone down once - early last year - and that was because of a bad software update that affected 2/3rds of their servers. Azure and XBL are not the same thing.

Backwards compatibility is not indicative of poor performance in an API, stupid. Ever hear of overloaded methods? Of course not - you don't know anything about software development.

It's funny how I never know anything about any topic, yet it is you who constantly gets owned and proven wrong.

Live was down; it had problems. That means those problems would affect any game that depends on the servers. And it's not limited to that: connection problems on your end are a problem too.

I am not talking about backward compatibility for games, you fool, like playing 360 games on the Xbox One. I am talking about a backward-compatible API for more than one console; in other words, using one API to code for the PS4 and the PS3. On the Xbox One that is a problem, because DX is made for different hardware, not just one machine.

This is the reason why DX will never be as slim as Sony's tools, just like the Xbox 360 APIs weren't as slim as the PS3's either.

@sts106mat said:

How is he having a meltdown?

proving that people don't know things = having a meltdown now?

The problem is he didn't prove shit; he just had one of his usual meltdowns.

@sts106mat said:

i dont see him having a rant, i just see him telling some clueless fools about facts.

I see him melting down quite badly, but you are a lemming, so I don't expect you to see it, just like you didn't see how crap Ryse was..lol

@StormyJoe said:

Maybe you have to be down to @GrenadeLauncher's level to think so. IDK.

@tormentos owns himself so often, and when he's not, makes such fantastical comments bordering on paranoid delusion, that I have little patience for him.

GrenadeLauncher is also a worthless fanboy who just resorts to asinine name-calling (at least Tormentos can form complete sentences that are more than 5 words long). I might take offense, or care, if he weren't such a tool and his name-calling weren't so stupid and juvenile.

The only delusional fanboy getting owned here is you. I don't claim parity between 2 GPUs that have a considerable gap, and I don't make false claims like "by the end of 2014/Q1 2015 all games will be 1080p."

So considering that you are consistently wrong in your crybaby arguments, where you fake-own people while getting your ass handed to you, I must say that if there is someone who owns himself, it is you.

@sts106mat said:

100% agreed, tormentos is funny, i enjoy laughing at his posts, he has been around a long time and always the same way, we wouldn't want to change him.

As for Grenadelauncher, I have read more intelligent scrawls of graffiti on the walls of public toilets than i have seen in any of his posts.

I bring that spark that makes forums like this feel alive; imagine how boring this place would be with just lemmings or cows alone..ahahaha..

I am a little embarrassed I didn't see the post about inb4uall's death when I am always here. It makes me wonder; I think I spend too much time fighting in these kinds of threads..


#123 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:

It's funny how I never know anything about any topic, yet it is you who constantly gets owned and proven wrong... This is the reason why DX will never be as slim as Sony's tools, just like the Xbox 360 APIs weren't as slim as the PS3's either.

I don't make false claims like "by the end of 2014/Q1 2015 all games will be 1080p."... if there is someone who owns himself, it is you.

Really? Because I believe you still have to go back two years to my frame rate estimate to find something I was wrong on.

Again, method overloading. Look it up.

There is nothing more to prove - you are the one who is (yet again) wrong. @GrenadeLauncher is your only ally. LOL!!!!

Dismissive posts are hardly enough to get me upset.


#124 StormyJoe
Member since 2011 • 7806 Posts

@GrenadeLauncher said:

Poor StormyJoke. Bitter for two years and counting.

Go play some videogames and calm down. Oh wait, I forgot, you don't have any.

See what I mean, @sts106mat? Not only does he have the mental capacity of a mandrill, but he is also a coward who won't link my name when he talks trash.


#125  Edited By Phazevariance
Member since 2003 • 12356 Posts

It should probably be mentioned that this (if true) would only apply to new games built for DX12. None of the current games on the console are DX12 games, so it's not as if you'll update and the games you already have will run any better.


#126 GrenadeLauncher
Member since 2004 • 6843 Posts

@StormyJoe said:

See what I mean, @sts106mat? Not only does he have the mental capacity of a mandrill, but he is also a coward who won't link my name when he talks trash.

You already know I'm talking about you, StormyJoke. You're not a special buttercup.


#127 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

Really? Because I believe you still have to go back two years to my frame rate estimate to find something I was wrong on.

Again, method overloading. Look it up.

There is nothing more to prove - you are the one who is (yet again) wrong. @GrenadeLauncher is your only ally. LOL!!!!

Dismissive posts are hardly enough to get me upset.

No, I just go back a few months, to when you claimed that by the end of 2014/Q1 2015 all games would be 1080p...lol

You should try to understand what people are actually arguing with you about, so you don't look so dumb; I am not the one claiming parity, you are. lol

You can't prove anything, which is a totally different thing, and I am sure more than grenadelauncher agrees with me, but I don't need backup or a pat on the back to know I am right.

@Phazevariance said:

Probably should be mentioned that this (if true) would only apply to new games made for DX12. All current games on the console are not DX12 built games so it's not like you will update and games you have will run any better.

The article is fake; I already posted a link with the original information on MS's page. It's not even a leak; it's a troll site which just trolled lemmings again...hahahaha


#128 Draign
Member since 2013 • 1824 Posts

If it doesn't work then no games will benefit, and it'll be business as usual. It seems like y'all XB1 haters care about this fiasco more than XB1 owners do. Such fear of what's not understood.


#129 foxhound_fox
Member since 2005 • 98532 Posts

I would really laugh if all this time it was Microsoft with the "hidden powah!".


#130  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

@tormentos said:

See, this is why I consider you a blind hypocrite... how does Pitcairn represent the Xbox One when Pitcairn has 32 ROPs and 153 GB/s, which the Xbox One doesn't have?... The Xbox One GPU is Bonaire 7790, 14 CUs with 2 disabled for redundancy; the PS4 is Pitcairn 7870 with 2 CUs disabled... Prove to me that Bonaire has 32 ROPs and 153 GB/s and you have a point.

The reasons why I use Pitcairn ES for the XBO:

1. Dual tessellation units, hence its geometry engine power is identical to the XBO's. The R7-250X has half the XBO's geometry engine power. My reasoning removes any bottlenecks from this point. Your reasoning is just biased, blind fanboyism.

2. The internal bus in the XBO resembles Pitcairn's instead of Cape Verde's or Bonaire's, i.e. in a single cycle a data write operation faces a 256-bit interface instead of the 128-bit version from Cape Verde or Bonaire. Furthermore, there are four 256-bit I/O interfaces for the ESRAM, i.e. the data write width per cycle is 1024 bits. Writing 256 bits of data against a 128-bit interface would take two cycles. Your reasoning is just biased, blind fanboyism. You failed to notice the fine details and rely on basic marketing code names.
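The cycle claim in point 2 is just ceiling division of write width by bus width; a one-liner to make it concrete:

```python
import math

def write_cycles(data_bits: int, bus_bits: int) -> int:
    """Whole cycles needed to push one write of `data_bits`
    through a bus `bus_bits` wide."""
    return math.ceil(data_bits / bus_bits)

print(write_cycles(256, 128))  # 128-bit (Cape Verde/Bonaire-style) bus: 2
print(write_cycles(256, 256))  # 256-bit (Pitcairn-style) bus: 1
```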

3. With larger data types, 16 ROPS already saturates XBO's memory bandwidth. This was shown with Avalanche Studios GDC 2014 presentation. Your reason is just bias blind fan boy.

4. With lesser data types, Avalanche Studios has shown a re-purposed 48 TMU as thier ROPS write operation hence saturates XBO's memory bandwidth. PS4's 32 ROPS MSAA X4 are memory bound e.g. The Order's 1920x800p was gimped by memory bandwidth which wouldn't be a problem with 7950's 32 ROPS and it's 240 GB/s memory bandwidth. Your reason is just bias blind fan boy. Your scope is limited by XBO vs PS4 when 7950 shows it's 32 ROPS are not limited by PS4's lower memory bandwidth.

5. GDC 2015's split target render presentation which combines DDR3 and ESRAM reassembles Pitcairn in terms of data write width instead of Cape Verde or Bonaire. Your reason is just bias blind fan boy.

6. ALL AMD GCNs are built from the same CU building blocks. Different SKUs changes internal bus and I/O interfaces. Your reason is just bias blind fan boy. You failed to notice fine details and you rely on basic marketing code names.

The evidence for your lack of technical knowledge when you facked-up basic PS4's memory bandwidth math.


#132 Bishop1310
Member since 2007 • 1274 Posts

@tormentos said:
@StormyJoe said:

Really? Because I believe you still have to go back two years to my frame rate estimate to find something I was wrong on.

Again, method overloading. Look it up.

There is nothing more to prove - you are the one who is (yet again) wrong. @GrenadeLauncher is your only ally. LOL!!!!

Dismissive posts are hardly enough to get me upset.

No, I just go back a few months to when you claimed that by the end of 2014 or Q1 2015 all games would be 1080p... lol

You should start by understanding what people are actually arguing with you about, so you don't look so dumb; I am not the one claiming parity, you are. lol

You can't prove anything, which is totally different, and I am sure more than GrenadeLauncher agrees with me, but I don't need backup or a pat on the back to know I am right.

@Phazevariance said:

Probably should be mentioned that this (if true) would only apply to new games made for DX12. All current games on the console are not DX12 built games so it's not like you will update and games you have will run any better.

The article is fake; I already posted a link to the original information on MS's page. It is not even a leak; it is a troll site which just trolled lemmings again... hahaha

Seriously, you're calling someone dumb when you talk like this?.... lol WTF.

Natural selection failed when it let you slip through the cracks torfaggo.


#133  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

PC end users can downclock their R7-260 from the control panel or from the MSI Afterburner tool. AMD has not blocked end users' ability to change GPU clock speeds, and this is part of being a gaming PC owner.

I use the MSI Afterburner tool to underclock and overclock my Samsung ATIV Book 8 laptop's 8870M to an 850MHz GPU clock and the memory to 1250MHz (GDDR5-5000MHz), hence exceeding the R9-M370X configuration.


#134 The-A-Baum
Member since 2015 • 1370 Posts

Oh god, I will punch a baby in the face if he brings up Project Cars again, so help me god!


#135 Zero_epyon
Member since 2004 • 20135 Posts

Holy crap! Is the article not fake? If it is, then let's call it what it is and move on. Every thread goes the same way: no proof is given, only more speculation. Wait for official benchmarks already...


#136 tormentos
Member since 2003 • 33784 Posts

@draign said:

If it doesn't work then no games will benefit, and it'll be business as usual. It seems like y'all XB1 haters care about this fiasco more than XB1 owners do. Such fear of what's not understood.

NO, it is you people that care and keep making threads about things that will not materialize; for example, this one was already proven to be fake.

@ronvalencia said:

@tormentos:

The reasons why I use Pitcairn ES for the XBO:

1. Dual tessellation units, hence its geometry engine power is identical to the XBO's. The R7-250X has half of the XBO's geometry engine power. My reasoning removes any bottlenecks from this point. Your reason is just biased, blind fanboyism.

2. The internal bus in the XBO resembles Pitcairn's instead of Cape Verde's or Bonaire's, i.e. in a single cycle, a data write operation faces a 256-bit interface instead of the 128-bit version from Cape Verde or Bonaire. Furthermore, there are four 256-bit I/O interfaces for ESRAM, i.e. the data write width per cycle is 1024 bits. Writing 256 bits of data through a 128-bit interface would take two cycles. Your reason is just biased, blind fanboyism. You failed to notice the fine details and relied on basic marketing code names.

3. With larger data types, 16 ROPs already saturate the XBO's memory bandwidth. This was shown in Avalanche Studios' GDC 2014 presentation. Your reason is just biased, blind fanboyism.

4. With smaller data types, Avalanche Studios has shown 48 TMUs re-purposed for ROP write operations, which likewise saturates the XBO's memory bandwidth. The PS4's 32 ROPs with MSAA x4 are memory bound, e.g. The Order's 1920x800 was gimped by memory bandwidth, which wouldn't be a problem with the 7950's 32 ROPs and its 240 GB/s memory bandwidth. Your reason is just biased, blind fanboyism. Your scope is limited to XBO vs PS4, when the 7950 shows its 32 ROPs are not limited by the PS4's lower memory bandwidth.

5. GDC 2015's split render target presentation, which combines DDR3 and ESRAM, resembles Pitcairn in terms of data write width instead of Cape Verde or Bonaire. Your reason is just biased, blind fanboyism.

6. ALL AMD GCNs are built from the same CU building blocks. Different SKUs change the internal bus and I/O interfaces. Your reason is just biased, blind fanboyism. You failed to notice the fine details and relied on basic marketing code names.

The evidence for your lack of technical knowledge: you messed up basic PS4 memory bandwidth math.

1. Yeah, I guess that means I should use Tonga in my next comparison, since it has 8 ACEs like the PS4.

This ^^ is what you just did. I am only doing what you did, so don't argue against me about it; it is your argument.

2. It is Bonaire with a 256-bit bus and 16 ROPs, just like the PS4 is Pitcairn with Tonga's ACE count. It means nothing; these GPUs were slightly modified, and that doesn't mean the PS4's GPU will have the performance of Tonga's high-end configuration.

Details mean shit; actual performance is what matters, and the 7770 is most of the time on par with or above the Xbox One in performance, regardless of the bigger bus and extra bandwidth. The problem is that a PC GPU with that FLOP count doesn't have 150GB/s; it has 72GB/s, which is basically half. Even if AMD gave it a 256-bit bus and 150GB/s, the 7770 still would not come close to a 7850. All those two features do is let the GPU run more freely; the problem is the Xbox One has a wimpy GPU which can't run much.

So it could have 400GB/s and the end result would be the same, and I ALREADY PROVED IT, since you used that argument years ago and were wrong. Should I quote you using your deranged arguments about how the Xbox One wasn't a 7770 because it had more bandwidth and a bigger bus, and then quote the benchmarks where the 7770 beat the Xbox One silly?

[Embedded video]

Exhibit A.

Details mean shit; the end result is what speaks here: a 7770 running Ryse at higher resolution and frame rates than the Xbox One.

And this is a game made first for the Xbox One and then for PC, by a very capable company.

3. I think you are about to get owned again...

PS4 GPU Can Handle 64bit Path, Xbox One Must Render 32bit To Avoid Bandwidth Issues.

On the XB1, if we are rendering to ESRAM, 64bit just about hits the crossover point between ROP and bandwidth-bound.

But even if we render to the relatively slow DDR3 memory, we will still be ROP-bound if the render-target is a typical 32bit texture.”

Read more at http://gamingbolt.com/ps4-gpu-can-handle-64bit-path-xbox-one-must-render-32bit-to-avoid-bandwidth-issues#HLpAbSOigmq4Akv5.99
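Taken at face value, Avalanche's crossover claim checks out arithmetically. A quick sketch (the 16-ROP count, ~853MHz clock, and ~109GB/s ESRAM write figure are the publicly reported XBO numbers, assumed here for illustration):

```python
# Rough check of Avalanche's claim that a 64-bit render target "just about
# hits the crossover point" between ROP-bound and bandwidth-bound on XBO.
rops = 16
gpu_clock_hz = 853e6              # XBO GPU clock
esram_write_bw = 109e9            # bytes/s, approx. one-direction ESRAM bandwidth

fill_rate = rops * gpu_clock_hz   # pixels/s the ROPs can emit (~13.65 GPixel/s)
for bits in (32, 64):
    bw_needed = fill_rate * bits / 8   # bytes/s of writes at this pixel size
    bound = "bandwidth" if bw_needed >= esram_write_bw else "ROP"
    print(f"{bits}-bit target: {bw_needed / 1e9:.1f} GB/s of writes -> {bound}-bound")
```

At 32 bits per pixel the ROPs need only ~55GB/s, below even the DDR3 peak (hence "still ROP-bound"), while at 64 bits they need ~109GB/s, right at the ESRAM limit, which is exactly the crossover the quote describes.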

Avalanche Studios...lol

The Xbox One can become ROP bound.

This was already proven: the Xbox One can become ROP bound.

4. Proven: the Xbox One can become ROP bound, and the same developer stated it. Funny how you always want to go against the current.

5. That works only in some scenarios. Forza 5, for example, uses split render targets: the sky in Forza 5 is dead, so they placed it in DDR3 because it worked within DDR3's bandwidth. The Xbox One GPU doesn't have 68GB/s + 109GB/s; it has 38GB/s + 109GB/s, because the CPU and system use 30GB/s from the main DDR3 pool, so the GPU can't use all 68GB/s, only a portion, and anything placed there must not require fast bandwidth. Split render targets will probably not bring Frostbite 3 up from 720p, because it has too many effects that require fast bandwidth.

No, it doesn't resemble Pitcairn, which has a single 153GB/s pool of bandwidth that lets the GPU use any effect requiring high bandwidth, where the only limit would be the power to run it. On the Xbox One the bandwidth has limited use, because with split render targets you can't use both pools for processes that require fast bandwidth, and it requires more work besides.

6. Yes, they are, but there are specific models already defined: the Xbox One is Bonaire (7790), which was the only 14-CU GCN GPU at the time, and the PS4 is Pitcairn (7870) with 2 CUs disabled.

So you are arguing against something already established: a 7790 with 2 CUs disabled and a lower clock speed. It retains the 7790's 16 ROPs, while a 256-bit bus was added in place of the 128-bit one.

Evidence of my lack of knowledge?

@ronvalencia said:

XBO's split render target (frame buffer write DDR3 68 GB/s + ESRAM 104 GB/s) + 48 TMU workaround for 16 ROPS. + dual tessellation units ~= Pitcairn ES's 153.6 GB/s memory bandwidth.Avalanche Studios (Mad Max) has devised the 16 ROPS workaround with 48 TMUs i.e. memory bandwidth bound.

Evidence of your lack of knowledge.

The Xbox One's split render target method can't use 68GB/s + 104GB/s of ESRAM. The Xbox One's CPU and system have 30GB/s reserved from the main DDR3 pool, which means the GPU can't get the full 68GB/s; the maximum is 68GB/s, so if the CPU takes 30GB/s, that leaves you with 38GB/s. I'll wait while you claim it was a mistake or ignore it...
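For what it's worth, the arithmetic being argued here is easy to lay out. A quick sketch (the 30GB/s CPU/system reservation is tormentos' figure, and 104GB/s is the ESRAM number used in the quoted claim; both are assumptions from this thread, not official totals):

```python
# The split-render-target bandwidth tally under dispute.
ddr3_peak = 68.0          # GB/s, XBO DDR3 theoretical peak
cpu_system_share = 30.0   # GB/s reserved for CPU/system (tormentos' figure)
esram_bw = 104.0          # GB/s, the ESRAM figure used in the quoted claim
pitcairn_es_bw = 153.6    # GB/s, the Pitcairn ES reference point

gpu_ddr3 = ddr3_peak - cpu_system_share   # 38 GB/s left for the GPU
split_total = gpu_ddr3 + esram_bw         # 142 GB/s combined
shortfall = pitcairn_es_bw - split_total  # how far that lands from Pitcairn ES

print(f"GPU-visible split total: {split_total:.0f} GB/s "
      f"({shortfall:.1f} GB/s short of Pitcairn ES)")
```

On these numbers the split total comes to 142 GB/s, roughly 11-12 GB/s short of the Pitcairn ES figure, which is the gap both sides are arguing about.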

Yeah, 30GB/s goes to the system, and you just falsely claimed it is 68GB/s + 104GB/s... lol

Talking about messing up, I guess the next time you quote me with that, you are getting quoted with the same memory error as well.

@Bishop1310 said:

Seriously, you're calling someone dumb when you talk like this?.... lol WTF.

Natural selection failed when it let you slip through the cracks torfaggo.

I see someone is anal about the thread being debunked... hahahaha

@ronvalencia said:

@tormentos:

PC end users can downclock their R7-260 from the control panel or from the MSI Afterburner tool. AMD has not blocked end users' ability to change GPU clock speeds, and this is part of being a gaming PC owner.

I use the MSI Afterburner tool to underclock and overclock my Samsung ATIV Book 8 laptop's 8870M to an 850MHz GPU clock and the memory to 1250MHz (GDDR5-5000MHz), hence exceeding the R9-M370X configuration.

I didn't say you can't downclock it; I said it doesn't come downclocked, which is true.

I have an R270 and Afterburner as well, dude, I know. You are arguing against something I didn't imply.

@the-a-baum said:

Oh god, I will punch a baby in the face if he brings up Project Cars again, so help me god!

Nah, I did better: I debunked the thread as fake, my dear troll lemming account... hahaha

@Zero_epyon said:

Holy crap! Is the article not fake? If it is, then let's call it what it is and move on. Every thread goes the same way: no proof is given, only more speculation. Wait for official benchmarks already...

It is fake, but lemmings pretend they don't see it; then fools get enraged and claim I am wrong and don't know anything... hahaha


#137 waahahah
Member since 2014 • 2462 Posts

@tormentos said:

Oh really? Prove it. For once, prove something you claim.

The PS4 API is made to work only on the PS4, with no backward compatibility or legacy support for any other hardware whatsoever; this means the API takes advantage of the specific PS4 hardware and nothing more. DX12 has legacy attached to it, as all DX versions do, because it is made with more than one piece of hardware in mind; in fact, DX12 is supposed to help Xbox and PC games be ported more easily between platforms. How in hell do you think that happens?

So while DX12 has to support many hardware configurations, including the Xbox One, the PS4 API is slim, streamlined, and made to take the best possible advantage of the unit. So true is this that the PS4 beat the PC and Xbox One to async shaders; hell, it beat the Xbox One by 2 full years. MS's API will never be as clean as Sony's.

Anyway, I don't even know why this thread is open; the TC's article is fake, and that was already proven.

I know I'll regret this, but tormentos, I don't think you have any idea what an API is for...

The Xbox's APIs have the same interface as the desktop APIs, but they also have Xbox-specific extensions, because the console does have customized hardware that devs need access to. MS doesn't implement the DirectX drivers on PC; they just publish a specification for hardware manufacturers to implement. With the Xbox they probably write the implementation themselves. It's an abstraction layer, and how slim it is depends on the back-end implementation, which in this case would likely be a highly optimized version of DirectX for the console.
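The abstraction-layer point can be sketched in a few lines: the same call surface sits on top of back ends of very different thickness. The class names below are purely illustrative, not actual Direct3D or console API types:

```python
# Illustrative only: one API surface, two hypothetical back ends.
# A console back end can be thinner because it targets exactly one GPU.
from abc import ABC, abstractmethod

class GraphicsAPI(ABC):
    """The interface games code against; identical on every platform."""
    @abstractmethod
    def draw(self, vertex_count: int) -> str: ...

class DesktopBackend(GraphicsAPI):
    def draw(self, vertex_count: int) -> str:
        # Desktop drivers must validate state for arbitrary vendor hardware.
        return f"validate -> translate -> submit {vertex_count} vertices"

class ConsoleBackend(GraphicsAPI):
    def draw(self, vertex_count: int) -> str:
        # Fixed hardware: validation can be compiled out, commands prebuilt.
        return f"submit {vertex_count} vertices"

def render(api: GraphicsAPI) -> str:
    # The game's call site is identical; only the cost per call differs.
    return api.draw(3)

print(render(DesktopBackend()))
print(render(ConsoleBackend()))
```

The game-facing interface stays stable, which is the point being made: how "slim" the API is depends entirely on the back end behind it.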


#138 tormentos
Member since 2003 • 33784 Posts

@waahahah said:

I know I'll regret this, but tormentos, I don't think you have any idea what an API is for...

The Xbox's APIs have the same interface as the desktop APIs, but they also have Xbox-specific extensions, because the console does have customized hardware that devs need access to. MS doesn't implement the DirectX drivers on PC; they just publish a specification for hardware manufacturers to implement. With the Xbox they probably write the implementation themselves. It's an abstraction layer, and how slim it is depends on the back-end implementation, which in this case would likely be a highly optimized version of DirectX for the console.

Wasn't it established already? API is for learning how to turn petroleum into gasoline, I think... That is what I read when I searched for it, look...

http://www.americanpetroleuminstitute.com/

API stands for American Petroleum Institute.

Now, the Xbox One runs on 81 octane and the PS4 uses nothing but 93 octane, so no matter where you are driving to, the PS4 will always last longer on the same amount of fuel... See...


#139 waahahah
Member since 2014 • 2462 Posts

@tormentos said:
@waahahah said:

I know I'll regret this, but tormentos, I don't think you have any idea what an API is for...

The Xbox's APIs have the same interface as the desktop APIs, but they also have Xbox-specific extensions, because the console does have customized hardware that devs need access to. MS doesn't implement the DirectX drivers on PC; they just publish a specification for hardware manufacturers to implement. With the Xbox they probably write the implementation themselves. It's an abstraction layer, and how slim it is depends on the back-end implementation, which in this case would likely be a highly optimized version of DirectX for the console.

Wasn't it established already? API is for learning how to turn petroleum into gasoline, I think... That is what I read when I searched for it, look...

http://www.americanpetroleuminstitute.com/

API stands for American Petroleum Institute.

Now, the Xbox One runs on 81 octane and the PS4 uses nothing but 93 octane, so no matter where you are driving to, the PS4 will always last longer on the same amount of fuel... See...

Well, you confirmed it: an API, when we're talking about software, has nothing to do with petroleum, and your argument is complete nonsense.


#140 tormentos
Member since 2003 • 33784 Posts

@waahahah said:
@tormentos said:
@waahahah said:

I know I'll regret this, but tormentos, I don't think you have any idea what an API is for...

The Xbox's APIs have the same interface as the desktop APIs, but they also have Xbox-specific extensions, because the console does have customized hardware that devs need access to. MS doesn't implement the DirectX drivers on PC; they just publish a specification for hardware manufacturers to implement. With the Xbox they probably write the implementation themselves. It's an abstraction layer, and how slim it is depends on the back-end implementation, which in this case would likely be a highly optimized version of DirectX for the console.

Wasn't it established already? API is for learning how to turn petroleum into gasoline, I think... That is what I read when I searched for it, look...

http://www.americanpetroleuminstitute.com/

API stands for American Petroleum Institute.

Now, the Xbox One runs on 81 octane and the PS4 uses nothing but 93 octane, so no matter where you are driving to, the PS4 will always last longer on the same amount of fuel... See...

Well, you confirmed it: an API, when we're talking about software, has nothing to do with petroleum, and your argument is complete nonsense.


#141 waahahah
Member since 2014 • 2462 Posts

@tormentos said:
@waahahah said:
@tormentos said:
@waahahah said:

I know I'll regret this, but tormentos, I don't think you have any idea what an API is for...

The Xbox's APIs have the same interface as the desktop APIs, but they also have Xbox-specific extensions, because the console does have customized hardware that devs need access to. MS doesn't implement the DirectX drivers on PC; they just publish a specification for hardware manufacturers to implement. With the Xbox they probably write the implementation themselves. It's an abstraction layer, and how slim it is depends on the back-end implementation, which in this case would likely be a highly optimized version of DirectX for the console.

Wasn't it established already? API is for learning how to turn petroleum into gasoline, I think... That is what I read when I searched for it, look...

http://www.americanpetroleuminstitute.com/

API stands for American Petroleum Institute.

Now, the Xbox One runs on 81 octane and the PS4 uses nothing but 93 octane, so no matter where you are driving to, the PS4 will always last longer on the same amount of fuel... See...

Well, you confirmed it: an API, when we're talking about software, has nothing to do with petroleum, and your argument is complete nonsense.

I know the point; you're completely ignoring my response, which is what confirms how stupid you are. If you can't argue against something, you dismiss it outright with some insane stretch of logic.


#142 GrenadeLauncher
Member since 2004 • 6843 Posts

I love that lemmings fell for this obviously faked shit.


#143  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

- Yeah, I guess that means I should use Tonga in my next comparison, since it has 8 ACEs like the PS4.

The PS4 doesn't have:

1. 28 CUs; Tonga has 32 CUs, but those are reserved for Apple iMacs and high-end mobile GPUs.

2. Quad tessellation units <---- this indicates a newer GCN design. The PS4 is limited to dual tessellation units.

3. ROPs delta compression/decompression. The R9-285's 3DMark Vantage Color Fill score beats the R9-280X's.

This ^^ is what you just did. I am only doing what you did, so don't argue against me about it; it is your argument.

Bullshit. You haven't matched the active CU count and tessellation units.

2. It is Bonaire with a 256-bit bus and 16 ROPs, just like the PS4 is Pitcairn with Tonga's ACE count. It means nothing; these GPUs were slightly modified, and that doesn't mean the PS4's GPU will have the performance of Tonga's high-end configuration.

Flawed argument setup, since you haven't factored in:

1. Tonga has 28 CUs.

2. Quad tessellation units <---- this indicates a newer GCN design. The PS4 is limited to Bonaire/Pitcairn's dual tessellation units.

3. ROPs delta compression/decompression. The R9-285's 3DMark Vantage Color Fill score beats the R9-280X's.

The Xbox One can become ROP bound.

Both Microsoft and I have argued that the XBO can be memory bandwidth bound with higher data types.

Again, http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"For example, consider a typical game scenario where the render target is 32bpp [bits per pixel] and blending is disabled, and the depth/stencil surface is 32bpp with Z [depth] enabled. That amounts to 12 bytes of bandwidth needed per pixel drawn (eight bytes write, four bytes read). At our peak fill-rate of 13.65GPixels/s that adds up to 164GB/s of real bandwidth that is needed which pretty much saturates our ESRAM bandwidth. In this case, even if we had doubled the number of ROPs, the effective fill-rate would not have changed because we would be bottlenecked on bandwidth. In other words, we balanced our ROPs to our bandwidth for our target scenarios."
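The arithmetic in that quote can be replayed directly; the 13.65 GPixel/s figure is the XBO's 16 ROPs times the 853MHz GPU clock, and the 12 bytes come from the quote's 8 bytes written plus 4 read per pixel:

```python
# Replaying Microsoft's ESRAM saturation math from the quote above.
rops = 16
gpu_clock_hz = 853e6                  # XBO GPU clock
peak_fill = rops * gpu_clock_hz       # pixels/s the ROPs can emit
bytes_per_pixel = 8 + 4               # 8 bytes written + 4 read (32bpp colour + 32bpp Z)
required_bw_gbs = peak_fill * bytes_per_pixel / 1e9

print(f"peak fill: {peak_fill / 1e9:.2f} GPixel/s")   # ~13.65
print(f"needed:    {required_bw_gbs:.0f} GB/s")       # ~164, saturating ESRAM
```

The result lands on the quote's 164GB/s, which is why doubling the ROPs alone would not have raised the effective fill rate in that scenario.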

Avalanche Studios' presentation backs Microsoft's argument for higher data types.

For lower data types, Avalanche Studios' presentation has a 48-TMU workaround for the 16 ROPs (effectively 17 ROPs at 800MHz).

4. Proven: the Xbox One can become ROP bound, and the same developer stated it. Funny how you always want to go against the current.

Wrong. Both Microsoft and I have argued that the XBO can be memory bandwidth bound with higher data types. I have posted the http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers link multiple times.

I have stated that arguing over 32 ROPs alone is useless without factoring in memory bandwidth. I have always pointed you to the 7950/7970's 32 ROPs as a counterargument against the PS4's 32 ROPs.

5. That works only in some scenarios. Forza 5, for example, uses split render targets: the sky in Forza 5 is dead, so they placed it in DDR3 because it worked within DDR3's bandwidth. The Xbox One GPU doesn't have 68GB/s + 109GB/s; it has 38GB/s + 109GB/s, because the CPU and system use 30GB/s from the main DDR3 pool, so the GPU can't use all 68GB/s, only a portion, and anything placed there must not require fast bandwidth. Split render targets will probably not bring Frostbite 3 up from 720p, because it has too many effects that require fast bandwidth.

Don't be a hypocrite, i.e. Sony has posted pure GPU memory usage benchmarks. The memory handling on one console would be similar to the other console, just with different bandwidth values.

The only time the GPU can use the entire memory bandwidth is when the CPU communicates with the GPU via the Onion links, i.e. the CPU doesn't access the DDR3/GDDR5 memory.

For the GPU's DDR3 access, the theoretical peak is 68 GB/s, with a real-world result of 54 GB/s, i.e. about 80 percent. Have you forgotten 54 GB/s against 68 GB/s?

GDC 2015's example is not just rendering the sky segment. The GDC 2015 split render targets presentation introduces the new ESRAM APIs that replaced the old, harder-to-use ESRAM APIs.

Talking about messing up, I guess the next time you quote me with that, you are getting quoted with the same memory error as well.

The only person who messed up is you.

Have you forgotten the real-world DDR3 result of 54 GB/s against 68 GB/s?

Have you forgotten there's a link between the GPU and CPU?

Details mean shit; actual performance is what matters, and the 7770 is most of the time on par with or above the Xbox One in performance, regardless of the bigger bus and extra bandwidth. The problem is that a PC GPU with that FLOP count doesn't have 150GB/s; it has 72GB/s, which is basically half. Even if AMD gave it a 256-bit bus and 150GB/s, the 7770 still would not come close to a 7850. All those two features do is let the GPU run more freely; the problem is the Xbox One has a wimpy GPU which can't run much.

Spot on, you say? There are several multi-platform games where both consoles failed to reach the PC 7770's results.


#144 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

The PS4 doesn't have:

1. 28 CUs; Tonga has 32 CUs, but those are reserved for Apple iMacs and high-end mobile GPUs.

2. Quad tessellation units <---- this indicates a newer GCN design. The PS4 is limited to dual tessellation units.

3. ROPs delta compression/decompression. The R9-285's 3DMark Vantage Color Fill score beats the R9-280X's.

This ^^ is what you just did. I am only doing what you did, so don't argue against me about it; it is your argument.

Bullshit. You haven't matched the active CU count and tessellation units.

2. It is Bonaire with a 256-bit bus and 16 ROPs, just like the PS4 is Pitcairn with Tonga's ACE count. It means nothing; these GPUs were slightly modified, and that doesn't mean the PS4's GPU will have the performance of Tonga's high-end configuration.

Flawed argument setup, since you haven't factored in:

1. Tonga has 28 CUs.

2. Quad tessellation units <---- this indicates a newer GCN design. The PS4 is limited to Bonaire/Pitcairn's dual tessellation units.

3. ROPs delta compression/decompression. The R9-285's 3DMark Vantage Color Fill score beats the R9-280X's.

The Xbox One can become ROP bound.

Both Microsoft and I have argued that the XBO can be memory bandwidth bound with higher data types.

Again, http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"For example, consider a typical game scenario where the render target is 32bpp [bits per pixel] and blending is disabled, and the depth/stencil surface is 32bpp with Z [depth] enabled. That amounts to 12 bytes of bandwidth needed per pixel drawn (eight bytes write, four bytes read). At our peak fill-rate of 13.65GPixels/s that adds up to 164GB/s of real bandwidth that is needed which pretty much saturates our ESRAM bandwidth. In this case, even if we had doubled the number of ROPs, the effective fill-rate would not have changed because we would be bottlenecked on bandwidth. In other words, we balanced our ROPs to our bandwidth for our target scenarios."

Avalanche Studios' presentation backs Microsoft's argument for higher data types.

For lower data types, Avalanche Studios' presentation has a 48-TMU workaround for the 16 ROPs (effectively 17 ROPs at 800MHz).

4. Proven: the Xbox One can become ROP bound, and the same developer stated it. Funny how you always want to go against the current.

Wrong. Both Microsoft and I have argued that the XBO can be memory bandwidth bound with higher data types. I have posted the http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers link multiple times.

I have stated that arguing over 32 ROPs alone is useless without factoring in memory bandwidth. I have always pointed you to the 7950/7970's 32 ROPs as a counterargument against the PS4's 32 ROPs.

5. That works only in some scenarios. Forza 5, for example, uses split render targets: the sky in Forza 5 is dead, so they placed it in DDR3 because it worked within DDR3's bandwidth. The Xbox One GPU doesn't have 68GB/s + 109GB/s; it has 38GB/s + 109GB/s, because the CPU and system use 30GB/s from the main DDR3 pool, so the GPU can't use all 68GB/s, only a portion, and anything placed there must not require fast bandwidth. Split render targets will probably not bring Frostbite 3 up from 720p, because it has too many effects that require fast bandwidth.

Don't be a hypocrite, i.e. Sony has posted pure GPU memory usage benchmarks. The memory handling on one console would be similar to the other console, just with different bandwidth values.

The only time the GPU can use the entire memory bandwidth is when the CPU communicates with the GPU via the Onion links, i.e. the CPU doesn't access the DDR3/GDDR5 memory.

For the GPU's DDR3 access, the theoretical peak is 68 GB/s, with a real-world result of 54 GB/s, i.e. about 80 percent. Have you forgotten 54 GB/s against 68 GB/s?

GDC 2015's example is not just rendering the sky segment. The GDC 2015 split render targets presentation introduces the new ESRAM APIs that replaced the old, harder-to-use ESRAM APIs.

Talking about messing up, I guess the next time you quote me with that, you are getting quoted with the same memory error as well.

The only person who messed up is you.

Have you forgotten the real-world DDR3 result of 54 GB/s against 68 GB/s?

Have you forgotten there's a link between the GPU and CPU?

Details mean shit; actual performance is what matters, and the 7770 is most of the time on par with or above the Xbox One in performance, regardless of the bigger bus and extra bandwidth. The problem is that a PC GPU with that FLOP count doesn't have 150GB/s; it has 72GB/s, which is basically half. Even if AMD gave it a 256-bit bus and 150GB/s, the 7770 still would not come close to a 7850. All those two features do is let the GPU run more freely; the problem is the Xbox One has a wimpy GPU which can't run much.

Spot on, you say? There are several multi-platform games where both consoles failed to reach the PC 7770's results.

1. Oh really? The Xbox One doesn't have the ROPs or the bandwidth of Pitcairn, and you certainly didn't match them either...

So again, it is your own argument owning you.

2. Flawed my ass, you walking salami. I used your own bullshit argument against you: the Xbox One doesn't have 153GB/s of bandwidth, nor 32 ROPs, so it is not PITCAIRN, PERIOD, just like the PS4 doesn't have 28 CUs or 240GB/s either.

3. What MS argued there was damage control: on one side they argue that they would be bandwidth bound at 164GB/s, but elsewhere they claimed 204GB/s of bandwidth. There is nothing balanced about the Xbox One; it is pure bullshit and damage control. Fact is, with 16 ROPs the Xbox One becomes ROP bound even sooner than it would with 32 ROPs, period, and more so at 1080p.

4-

Avalanche >>>>>>>>>>>>>>>>>>>>>>>>>>>>> You..

ROP bound in 2 out of 3 scenarios, and in one of those, both memory and ROP bound.

5. Hypocrite? You are the one inventing shit and adding things that are not there. Quote Sony saying the PS4's usable bandwidth is 140GB/s, or you're full of shit. The only thing you have is an old graph from before the console was even released. Funny how the Xbox One can improve everything, but somehow Sony can't, right?

Prove it, because even that chart you use doesn't say the maximum is 140GB/s; in reality, all it does is illustrate how the CPU can consume bandwidth disproportionately in some ways.

Hahahaha... Nice way to damage control my argument. You claimed 68GB/s + 104GB/s for split rendering, fool; that is impossible, because the system uses 30GB/s from the main RAM pool. In other words, it is 104GB/s + 38GB/s, for a total of 142GB/s, which is lower than Pitcairn by about 11GB/s. The argument wasn't about how much of the 68GB/s you can actually get, which you claim is 54GB/s; it is about you claiming 68GB/s + 104GB/s, which is impossible, as that would leave both the CPU and the system without any bandwidth, fool.

The only person who messed up is you.

Have you forgotten real world DDR3's 54 GB/s against 68 GB/s?

Have you forgotten there's a link between GPU and CPU?

No, it is you, because it was you who claimed 68GB/s + 104GB/s, not me, you walking salami.

@ronvalencia said:

XBO's split render target (frame buffer write DDR3 68 GB/s + ESRAM 104 GB/s) + 48 TMU workaround for 16 ROPS. + dual tessellation units ~= Pitcairn ES's 153.6 GB/s memory bandwidth. Avalanche Studios (Mad Max) has devised the 16 ROPS workaround with 48 TMUs i.e. memory bandwidth bound.

You claimed this ^^^^^ and it is false. You claimed 68 GB/s, not 54 GB/s; it was you who claimed that false shit. The truth is that a split render target can operate at 142 GB/s peak, because the CPU uses 30 GB/s of the main pool of RAM. You messed up your math, and now you are claiming that 54 GB/s is what you can actually get from the 68 GB/s. The problem is I never argued that, so you are deviating from the argument again to downplay the fact that you were wrong. In fact, it being 54 GB/s works even worse against you and your Pitcairn argument.

Problem is, Ryse wasn't a multiplatform game, you fool. Ryse was an Xbox One exclusive that was ported to PC... hahahaa

So basically you have no argument; the 7770 beats the Xbox One in Ryse, period.

@waahahah said:

I know the point; you're completely ignoring my response, which is what confirms how stupid you are. If you can't argue against something with some insane stretch of logic, you outright dismiss it.

Nah, I just trolled you, dude. I know what an API is, what it does, and how legacy overhead can ruin its performance; we have experienced it for years in consoles vs PC.

Avatar image for joel_c17
joel_c17

3206

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#145 joel_c17
Member since 2005 • 3206 Posts

Is directX the new Cloudz?

Avatar image for nyadc
NyaDC

8006

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 5

#146  Edited By NyaDC
Member since 2014 • 8006 Posts

Avatar image for Chutebox
Chutebox

50604

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#147  Edited By Chutebox
Member since 2007 • 50604 Posts

Good thing I'm not gullible, or I'd believe this only to be let down.

Avatar image for leandrro
leandrro

1644

Forum Posts

0

Wiki Points

0

Followers

Reviews: -2

User Lists: 0

#149 leandrro
Member since 2007 • 1644 Posts

one and a half potato then

Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#150 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@GrenadeLauncher said:

I love that lemmings fell for this obviously faked shit.

I love that cows fell for this obviously faked shit.