Devs React to DX 12 Doubling Xbox One GPU Speed


#451 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Note that AMD's GL_AMD_sparse_texture(1) vendor specific extension was displaced by GL_ARB_sparse_texture(2).

GL_ARB_sparse_texture is part of OpenGL 4.4 and AMD doesn't support OpenGL 4.4(3) at this time and AMD's GL_AMD_sparse_texture extension is dead.

References

1. https://www.opengl.org/registry/specs/AMD/sparse_texture.txt

2. http://www.opengl.org/registry/specs/ARB/sparse_texture.txt (it has NVIDIA's input).

3. http://www.khronos.org/news/press/khronos-releases-opengl-4.4-specification

Without ARB's support, it's a dead end API.

----------------

For textures that fit within the 6 GB memory, AMD PRT has a larger performance gain on X1 (1) than on PS4 (2).

1. X1's TMU fetch source ranges from 68 GB/s** DDR3 up to 204 GB/s** ESRAM. The Hot Chips presentation stated a peak 204 GB/s** bandwidth. "Tiling tricks" need to be applied to textures (via AMD PRT) and render targets.

2. PS4's TMU fetch source remains at 176 GB/s**.

**theoretical peak values.

For greater than 6 GB textures, AMD PRT can be applied on HDD or Blu-ray sources.

In terms of AMD PRT functionality, the AMD Kaveri APU's single-speed memory design is similar to PS4's single-speed memory design; the only large difference is memory speed, e.g. Kaveri's 128-bit DDR3-2xx0 MHz vs PS4's 256-bit GDDR5-5500 MHz.

PS4's PRT path is just HDD/Blu-ray -> GDDR5.

X1's PRT path is HDD/Blu-ray -> DDR3 -> ESRAM.

Kaveri's PRT path is just HDD/SSD -> DDR3.

A gaming PC with a dGPU has the PRT path HDD/SSD -> DDR3 -> GDDR5.
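Those memory-speed comparisons translate into theoretical peaks via the standard bus-width x transfer-rate arithmetic. A quick sketch (the DDR3-2133 transfer rate is an assumed common speed, since the exact Kaveri figure is left elided above):

```python
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int) -> float:
    """Theoretical peak = bytes per transfer x million transfers per second."""
    return bus_width_bits / 8 * transfer_rate_mts / 1000

kaveri_ddr3 = peak_bandwidth_gbs(128, 2133)  # ~34.1 GB/s (128-bit DDR3, assumed 2133 MT/s)
x1_ddr3 = peak_bandwidth_gbs(256, 2133)      # ~68.3 GB/s, matching the 68 GB/s figure above
ps4_gddr5 = peak_bandwidth_gbs(256, 5500)    # 176.0 GB/s
```

These are the same theoretical peaks marked ** in the list above; sustained throughput is always lower.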

So basically your argument is that the OpenGL extension is dead... hahaha

Industry Support

“AMD has a long tradition of supporting open industry standards, and congratulates the Khronos Group on the announcement of the OpenGL 4.4 specification for state-of-the-art graphics processing,” said Matt Skynner, corporate vice president and general manager, Graphics Business Unit, AMD. “Maintaining and enhancing OpenGL as a strong and viable graphics API is very important to AMD in support of our APUs and GPUs. We’re proud to continue support for the OpenGL development community.”

From your own link, selective reader...

1.

The same discussion with ESRAM as well - the 204GB/s number that was presented at Hot Chips is taking known limitations of the logic around the ESRAM into account. You can't sustain writes for absolutely every single cycle. The writes is known to insert a bubble [a dead cycle] occasionally... One out of every eight cycles is a bubble, so that's how you get the combined 204GB/s as the raw peak that we can really achieve over the ESRAM. And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM. That's real code running. That's not some diagnostic or some simulation case or something like that.

Digital Foundry: So 140-150GB/s is a realistic target and you can integrate DDR3 bandwidth simultaneously?

Nick Baker: Yes. That's been measured.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

Don't mix Rebellion's claims with ESRAM speed; they never claimed that. ESRAM can't do 204 GB/s, as stated by MS two months after the Hot Chips conference; it can do 140 to 150 GB/s, which is slower than the PS4's 176 GB/s bandwidth. Even if you take away 20 GB/s for the CPU, the PS4 is still faster.
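The 204 GB/s figure in the interview can be reproduced with the arithmetic Baker describes (the 128 bytes-per-cycle-per-direction layout is an assumption chosen to be consistent with the quoted numbers):

```python
esram_clock_hz = 853e6         # ESRAM runs at the X1 GPU clock of 853 MHz
bytes_per_cycle_one_way = 128  # assumed 1024-bit path in each direction

one_way_gbs = esram_clock_hz * bytes_per_cycle_one_way / 1e9  # ~109.2 GB/s

# Per the interview: one out of every eight write cycles is a bubble, so the
# combined read+write peak is one full direction plus 7/8 of the other.
combined_peak_gbs = one_way_gbs * (1 + 7 / 8)                 # ~204.7 GB/s

# Measured real-code throughput quoted by Microsoft:
measured_gbs = (140 + 150) / 2                                # ~145 GB/s
efficiency = measured_gbs / combined_peak_gbs                 # ~71% of peak
```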

2. Which doesn't matter, again, for all intents and purposes: GCN doesn't need ESRAM to use PRT, and I already quoted that from AMD themselves. The textures need to be taken from the HDD and partially loaded into the GPU RAM; system RAM has nothing to do with it. I quoted it and posted it to you again, and you still ignore it, because you only read what serves you best and ignore the rest, just like you quoted Activision on COD Ghosts being 1080p and you were wrong.

Oh, and loading textures directly to the DDR3 will affect performance. I don't think DDR3 will be used other than as a final destination after the data passes through ESRAM. Turn 10 already gave an example of this: the sky, which is static and does nothing, can be placed in the main memory bank, but the cars and other stuff that demand speed are placed in the ESRAM.

PRT works on PS4, period, and no, GCN doesn't need ESRAM to make it work. PRT is a way to save memory, which for now the Xbox One and PS4 both have to spare.
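The memory saving PRT offers can be sketched roughly; GCN's PRT hardware maps sparse textures in 64 KB tiles, and the texture size and resident-tile count below are made-up illustrative numbers:

```python
TILE_BYTES = 64 * 1024  # GCN PRT commits sparse-texture memory in 64 KB tiles

def full_texture_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Memory needed if the whole texture were resident."""
    return width * height * bytes_per_pixel

def resident_bytes(n_tiles: int) -> int:
    """Memory actually committed when only n_tiles tiles are resident."""
    return n_tiles * TILE_BYTES

full = full_texture_bytes(16384, 16384)  # a 16k x 16k RGBA8 texture: 1 GiB
# Suppose only 2,000 tiles are actually sampled this frame (illustrative):
resident = resident_bytes(2000)          # 125 MiB committed instead of 1 GiB
```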

@freddie2222 said:

@tormentos:

updated conversation from Brad Wardell:

Brad Wardell - "One way to look at the XBox One with DirectX 11 is it has 8 cores but only 1 of them does dx work. With dx12, all 8 do."

Some guy - "Xbox One already does that. All those DX12 improvements are about Windows, not Xbox."

Brad Wardell - "I'm sorry but you're totally wrong on that front. If you disagree, better tell MS. I'm just the messenger."

Some guy - "in all materials published they talk about PC. They only mention Xbox as already having close to zero overhead."

Brad Wardell - "XBO is very low on overhead. Overhead is not the problem. It is that the XBO isn't splitting DX tasks across the 8 cores."

Multi-core issue:

Some guy - "I thought the Xbox One had 6 cores available for gaming and 2 cores reserved for the OS, does DX12 change this?"

Brad Wardell - "not that I'm aware of. #cores for gaming is different than cores that interact with the GPU."

Did you watch that video demo they showed, where they used 3DMark to demonstrate the workload on the CPU? One core was the main thread and pulled the biggest load; then they switched to DX12 and the workload was split evenly across all cores. This might be a pretty big deal. Infamous had a CPU bottleneck, right? It might be that the PS4 also currently doesn't split the load evenly across the cores, but what do I know, I'm not a dev.

That is a joke; DX doesn't use just one core of a multicore CPU. In fact, the example given in the screenshots showed all cores working under DX11; what they actually did was improve timing, which shows pretty clearly in the screens MS presented.

Not only that, the Xbox One doesn't have 8 cores to use; it has 6, so you already have two wrong statements there. All the cores of console CPUs have been used for years, and on PC as well; it took more time, but we all know games demand multicore CPUs, and DX12 isn't here yet, so yeah, those cores are being used.

The load was already split, but most of the work was done by the first core. If you look at what they did, they lowered the first core's time greatly while increasing the secondary cores'; in fact, in the screens shown you can compare how they got to cutting the timing in half. But all the cores were already in use, and that demo was for PC, not the Xbox One, which uses only 6 cores for games, confirmed already, not 8, so he was wrong.
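The split being argued about here boils down to who gets to record rendering commands. A toy sketch of the idea, with plain Python threads standing in for hypothetical render workers (real DX12 records multiple command lists, one per thread, and submits them together):

```python
from concurrent.futures import ThreadPoolExecutor

def build_command_list(chunk):
    """Stand-in for recording draw commands for one slice of the scene."""
    return [f"draw({obj})" for obj in chunk]

objects = [f"mesh_{i}" for i in range(8000)]

# DX11-style: a single thread records every command.
serial = build_command_list(objects)

# DX12-style: four workers each record their own command list in parallel.
chunks = [objects[i::4] for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = [cmd for lst in pool.map(build_command_list, chunks) for cmd in lst]
```

Either way the same commands get recorded; the difference is that the second form spreads the recording cost across cores instead of serializing it on one.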

But here is a nice take on this..

Writer’s take: It’s relevant to mention that after the presentation some alleged that this would lead to multiplying the performance of the Xbox One’s GPU or of the Xbox One in general by two. That’s a very premature (and not very likely) theory.

What is seeing a 2x performance boost is the efficiency of the CPU, but it won’t necessarily scale 1:1 with the GPU. First of all, no matter how fast you can feed commands to the GPU, you can’t push it past its hardware limits, secondly, the demo showcased below (the results of which were used to make those allegations) has been created with a graphics benchmark and not with a game, meaning that it doesn’t involve AI, advanced physics and all those computations that are normally loaded on the CPU and that already use the secondary threads.

Thirdly, the demo ran on PC, and part of the advantage of D3D and DirectX 12 is to bring PC coding closer to console-like efficiency, which is already partly present on the Xbox One.

http://www.dualshockers.com/2014/04/07/directx-12-unveiled-by-microsoft-promising-large-boost-impressive-tech-demo-shows-performance-gain/
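The writer's point that a 2x CPU gain won't scale 1:1 with the GPU can be shown with a toy frame-time model (illustrative numbers only):

```python
def frame_ms(cpu_submit_ms: float, gpu_render_ms: float) -> float:
    # Toy model: the frame is paced by whichever side is the bottleneck.
    return max(cpu_submit_ms, gpu_render_ms)

# CPU-bound: halving submission cost really does double the frame rate.
cpu_bound_before = frame_ms(20.0, 10.0)  # 20 ms per frame
cpu_bound_after = frame_ms(10.0, 10.0)   # 10 ms per frame

# GPU-bound: the same 2x CPU gain changes nothing.
gpu_bound_before = frame_ms(8.0, 16.6)   # 16.6 ms per frame
gpu_bound_after = frame_ms(4.0, 16.6)    # still 16.6 ms per frame
```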

All you have to do is use some common sense. Much like Nvidia and AMD tech demos, which never show AI and are made purely to show off the GPU, it is the same here: that demo had no AI, advanced physics, or anything else, so under the conditions the demo was made, they gained 2x CPU performance. But when actual game code is running, things will be different, because the CPU is in charge of more than just simple rendering code.

This is a fact with all Nvidia and AMD demos: you see impressive demos surpass by a truckload the games that actually run on the GPU.

Another point is that this demo was done on PC, and Mantle as well as DX12 are APIs brought to PC for the sole purpose of emulating console APIs, so the Xbox One is already using part of the optimization you saw in that PC demo, and the improvement on Xbox One will not be the same. Many here don't actually understand what they are arguing. For example, one of the dudes arguing this with me claimed that Mantle may come to PS4, but that if it doesn't, the Xbox One will have an advantage because it has DX12. That is totally false, because the PS4 doesn't need Mantle: Mantle is a PC API made to mimic the PS4 API, libGNM.

@FastRobby said:

Tormentos keeps thinking he knows more than Brad Wardell, haha, silly boy

Oh, I don't know more than him; I just know that he is damage controlling for MS. The Xbox One doesn't have 8 cores for games, it has 6, and all cores on console CPUs were in use before DX11 even existed... lol

Hell, all the SPEs on Cell were being used before DX11 existed as well... lol

Multicore use on consoles is older than on PC, so claiming that only one core does the job is a joke. Not even on PC is it like that, and MS's own example proves it: they compared timing between both code paths, and all the cores were running and being used under both DX11 and DX12.

It's not my fault that you don't know what the fu** you're talking about. Console APIs are what Mantle and DX12 are trying to emulate; part of DX12 has already been on the Xbox One, working since day 1, so yeah, don't expect 2x performance. That is a joke to trick suckers like you into buying the Xbox One... lol


#452 btk2k2
Member since 2003 • 440 Posts

@FastRobby said:

Tormentos keeps thinking he knows more than Brad Wardell, haha, silly boy

Let's be honest here. Brad Wardell is a PC developer who specialises in AI. Now I am 100% sure he knows what he is talking about with regard to the enhancements DX12 will bring, as Stardock worked on the Mantle demo for Star Swarm, but his "Bottom line: DirectX 12 looks to double perf over DirectX 11 on the same hardware. The game changing effect of this cannot be understated." comment is about PC DX11 -> PC DX12.

The Xbox One's DX11 already has CPU enhancements over the PC version, although DX12 will add more to it. Sure, it will have an effect on the Xbox One, but it will not bring it to parity with the PS4; the hardware gap is too large to overcome.


#454 tormentos
Member since 2003 • 33784 Posts

@btk2k2 said:

From what I have read the PS4 API is very close to Mantle. I have also read that DX12 is very close to Mantle so it seems like both the PS4 and Xbox One API are going to be based on the same core implementation. That would further suggest that any advances AMD make with Mantle between now and DX12's release will be incorporated to DX12 and likely to the PS4 API as well.

Mantle is very good at improving minimum frame rates in sections that are CPU bound, this is obvious in larger BF4 maps. Given that we see the PS4 consistently score higher minimums than Xbox One, even in cases where the PS4 is pushing more intense graphical settings (BF4 again) it would suggest that the advances AMD made with Mantle have found their way into the low level PS4 API and are on their way to the Xbox API in the form of DX12.

Given that it would seem logical to conclude that the Xbox One will close the gap somewhat with PS4, I see 900p vs 1080p with all else being equal becoming the norm as that would be a good balance for the GPU performance differences and the amount of ESRAM the Xbox One has. The current status of 900p vs 1080p + more graphics or 720p vs 1080p shows a larger discrepancy than the GPU performance differences would indicate, even when taking into account the ESRAM.

How could I have missed it? You bring an incredibly good point here, like always...

These people are talking about these gains based on CPU-bound scenarios, which isn't the problem here; how many developers have complained about the Xbox One CPU?

In fact, quite the contrary: the complaints were mostly about ESRAM or the 10% reservation.

Activision asked to drop the 10% reservation so they could get a higher resolution on Ghosts; obviously it would not have been 1080p, but higher than the 720p it got.

So the Xbox One wasn't CPU bound in those situations, it was GPU bound, which DX12 will help little with, since all it deals with is CPU overhead. So any game that is GPU bound, like Ryse or even Titanfall, will get nothing from DX12, or minimal gains at best. And I say Titanfall because it uses AI from the cloud, which is already helping the CPU, and it still failed to be even 900p, when clearly that game could have been 1080p on PS4.

And yes, Mantle is an API to emulate libGNM; basically, it is meant to bring console gains to PC.
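For scale, the resolution and GPU gaps discussed in this exchange work out numerically like this (a rough sketch; the TFLOPS values are the commonly cited theoretical peaks for each console's GPU):

```python
def pixels(width: int, height: int) -> int:
    return width * height

res_ratio_900p = pixels(1920, 1080) / pixels(1600, 900)  # 1080p has 1.44x the pixels of 900p
res_ratio_720p = pixels(1920, 1080) / pixels(1280, 720)  # 1080p has 2.25x the pixels of 720p
flops_ratio = 1.84 / 1.31                                # ~1.40x (PS4 vs X1 peak TFLOPS)
```

So a 900p-vs-1080p split roughly mirrors the raw compute gap, while 720p-vs-1080p overshoots it, which is the point btk2k2 makes above.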


#455 tormentos
Member since 2003 • 33784 Posts

@FastRobby said:

Haha, you still believe you know it better. He is a developer, he knows what he is saying, he has way more credibility than you; everything you are saying is just lies. I never bought the Xbox One for superior graphics; I would've bought a PC then. I did it for the entertainment and the best exclusives, because in the end gameplay is always the same on both platforms for multiplatform games. So if I really wanted graphics I would've joined the PC master race. Keep dreaming the PS4 even comes close to a PC; you're a joke.

I don't care; believe what you want. When the Xbox One fails to keep parity with the PS4, it is you who will be made fun of, just like we have done until now with all those secret sauce believers who now barely post... lol

Entertainment? Hahaha

Best exclusives... lol. Halo, Gears, Forza all gen long again? It already started last year with Forza at launch; this year Halo, either 2 or 5, with the Forza spin-off Horizon; next year Gears with Forza 6... lol

I feel sad for you... lol

Hahahaa

You don't care about graphics, yet here you are arguing non-stop about the Xbox One's so-called magic sauce for increasing graphics... hahahaha

You are such a fake... lol


#456 tdkmillsy
Member since 2003 • 5876 Posts

@tormentos said:

@FastRobby said:

Haha, you still believe you know it better. He is a developer, he knows what he is saying, he has way more credibility than you; everything you are saying is just lies. I never bought the Xbox One for superior graphics; I would've bought a PC then. I did it for the entertainment and the best exclusives, because in the end gameplay is always the same on both platforms for multiplatform games. So if I really wanted graphics I would've joined the PC master race. Keep dreaming the PS4 even comes close to a PC; you're a joke.

I don't care; believe what you want. When the Xbox One fails to keep parity with the PS4, it is you who will be made fun of, just like we have done until now with all those secret sauce believers who now barely post... lol

Entertainment? Hahaha

Best exclusives... lol. Halo, Gears, Forza all gen long again? It already started last year with Forza at launch; this year Halo, either 2 or 5, with the Forza spin-off Horizon; next year Gears with Forza 6... lol

I feel sad for you... lol

Hahahaa

You don't care about graphics, yet here you are arguing non-stop about the Xbox One's so-called magic sauce for increasing graphics... hahahaha

You are such a fake... lol

Cough Infamous, Cough Killzone, Cough Uncharted

All consoles release tried and trusted games; your argument is null and void.

Xbox One has plenty of other games in its locker.


#457  Edited By rrjim1
Member since 2005 • 1983 Posts

@btk2k2:

In order for most programmers to change to Mantle they would need to see a double-digit improvement, and Mantle cannot do this. This is where DX12 really shines: it does improve performance into the double digits.


#458  Edited By tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

Cough Infamous, Cough Killzone, Cough Uncharted

All consoles release true and trusted games, your argument is null and void.

Xbox One has plenty of other games in its locker.

You fell a little short on your cough: The Order, Driveclub... lol

Oh please, Forza is not the COD of racing games... lol

When did the last Killzone hit?

Early 2011.

Last Uncharted?

Late 2011.

Infamous, mid-2011...

lol

So almost 3 years' absence for Infamous and Killzone, while Uncharted would arrive around the 4th birthday of Uncharted 3... lol

Forza 4 2011.

Forza Horizon 2012

Forza 5 2013

Forza Horizon 2 2014

Forza 6 2015..lol

Hahahaha COD like...


#459  Edited By StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:
@StormyJoe said:

Moon is made of blue cheese...

Microsoft does not have to give its APIs to Sony, nor does Sony have to give their APIs to Microsoft. You literally have no idea how software works. You are really making yourself look like an idiot by debating things you know nothing about.

And, for the love of God, STOP F**KING SAYING I THINK THE XB1 WILL DOUBLE IN SPEED - I HAVE NEVER SAID THAT. Clear enough? Do you need it in a larger font?

Also, you are again saying things that I have never said, and then trying to counter them. I never said Sony's APIs wouldn't also increase in performance. WTF is the matter with you???!!!!

Finally, being a .Net developer is 100% relevant. You just don't know what you are talking about enough to understand that.

You are a buffoon and you will always be.

MS and Sony don't have to give their APIs to one another to be able to use features of one API in another.

PRT wasn't supported by DX in 2011; now it is, in DX 11.2, where you know it as Tiled Resources.

So MS got a hold of OpenGL? You know sh** about what you're talking about. MS doesn't need to get OpenGL to be able to support something OpenGL supports; I just proved that very easily. Likewise, they don't need to get Sony's API, or the other way around. And while DX11.2 is relatively new, OpenGL has supported PRT since 2011.

I have a point with links; you don't.

APIs get new features all the time; mostly, like with PRT, OpenGL gets them first. In this case it took MS almost 3 years after OpenGL to bring Tiled Resources.

Dude, WTF. I argued with you for pages on end to get you to admit that Sony's API would also improve, and then after you did, you tried to minimize Sony's improvements based on the fact that developers haven't complained, that Sony hasn't said anything, or that Sony's API was good enough, so it would not improve like DX on Xbox One.

A few months ago Tiled Resources was the secret weapon you lemmings were using. Tiled Resources is PRT, which is also supported on PS4; I argued this to hell and beyond and you people would not admit it works on PS4... lol

The new secret sauce now is DX12... lol

No it's not. You are not making a game, nor are you a damn game developer; quit talking sh**.

Are you goddamned brain damaged? You have no idea what you are talking about AT ALL. Do you even know how software is written? What tools get used?

You couldn't be more out of your league here if you tried.


#461 StormyJoe
Member since 2011 • 7806 Posts

@daveg1 said:

What a load of bollox. Any kid here thinking DX12 is going to make any difference is going to be very let down.

Since when does more technically demanding graphics use less power? Ha ha ha. The X1 will struggle even more with DX12; believe me, it will be very watered down compared to what you'll see on PC.

This is just more damage control from MS, cos the PS4 is better..

Spoken with no truer ignorance. Let the people who know about software development post, ok?


#462 StormyJoe
Member since 2011 • 7806 Posts

@lglz1337 said:

@FastRobby: tormentos is eating lemmings alive, you mean

Are you out of your mind? The guy doesn't even know the first thing about programming, and he is trying to debate software developers. He is absolutely clueless.

How you can think that is even remotely true just boggles my mind.


#463 lglz1337
Member since 2013 • 4959 Posts

@StormyJoe: and who is the so-called dev in this place? lol


#464 StormyJoe
Member since 2011 • 7806 Posts

@lglz1337 said:

@StormyJoe: and who is the so-called dev in this place? lol

I have been a developer for quite some time (10+years). I have an MCSD. I think I qualify as a software developer.


#465  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@lglz1337 said:

@StormyJoe: and who is the so-called dev in this place? lol

There are devs, but how many devs in here have experience with both platforms as well as DX12? I think the only one speaking about it is Brad Wardell.

I think you only have a few choices:

* Brad is completely wrong in both CPU and GPU cases.

* Brad is completely correct in both CPU and GPU cases.

* Brad is only correct for CPU bound cases and it will not help GPU.


#466 btk2k2
Member since 2003 • 440 Posts

@rrjim1 said:

@btk2k2:

In order for most programmers to change to Mantle they would need to see a double-digit improvement, and Mantle cannot do this. This is where DX12 really shines: it does improve performance into the double digits.

Mantle improvements depend on how CPU bound the scenario is. An i7-4770K + 290X in BF4 is unlikely to see a huge increase, but something like an FX-6xxx + 290X will see a larger one.

@ttboy said:

@lglz1337 said:

@StormyJoe: and who is the so-called dev in this place? lol

There are devs, but how many devs in here have experience with both platforms as well as DX12? I think the only one speaking about it is Brad Wardell.

I think you only have a few choices:

* Brad is completely wrong in both CPU and GPU cases.

* Brad is completely correct in both CPU and GPU cases.

* Brad is only correct for CPU bound cases and it will not help GPU.

I can't think of a console project Brad has worked on, but you can point one out to me if I am incorrect. Brad is an AI dev, a very, very good AI dev, but an AI dev nonetheless.

Mantle aims to help both CPU and GPU performance although they see CPU gains as the low hanging fruit so have targeted them first. DX12 will likely improve both somewhat with CPU being a much larger factor.

All Brad has said about DX12 and Xbox One is that it should make it easier to achieve a higher resolution and you should have more objects on screen at the same time. Nothing specific and nothing ground breaking.


#468 darkangel115
Member since 2013 • 4562 Posts

@StormyJoe said:

@tormentos said:
@StormyJoe said:

Moon is made of blue cheese...

Microsoft does not have to give its APIs to Sony, nor does Sony have to give their APIs to Microsoft. You literally have no idea how software works. You are really making yourself look like an idiot by debating things you know nothing about.

And, for the love of God, STOP F**KING SAYING I THINK THE XB1 WILL DOUBLE IN SPEED - I HAVE NEVER SAID THAT. Clear enough? Do you need it in a larger font?

Also, you are again saying things that I have never said, and then trying to counter them. I never said Sony's APIs wouldn't also increase in performance. WTF is the matter with you???!!!!

Finally, being a .Net developer is 100% relevant. You just don't know what you are talking about enough to understand that.

You are a buffoon and you will always be.

MS and Sony don't have to give their APIs to one another to be able to use features of one API in another.

PRT wasn't supported by DX in 2011; now it is, in DX 11.2, where you know it as Tiled Resources.

So MS got a hold of OpenGL? You know sh** about what you're talking about. MS doesn't need to get OpenGL to be able to support something OpenGL supports; I just proved that very easily. Likewise, they don't need to get Sony's API, or the other way around. And while DX11.2 is relatively new, OpenGL has supported PRT since 2011.

I have a point with links; you don't.

APIs get new features all the time; mostly, like with PRT, OpenGL gets them first. In this case it took MS almost 3 years after OpenGL to bring Tiled Resources.

Dude, WTF. I argued with you for pages on end to get you to admit that Sony's API would also improve, and then after you did, you tried to minimize Sony's improvements based on the fact that developers haven't complained, that Sony hasn't said anything, or that Sony's API was good enough, so it would not improve like DX on Xbox One.

A few months ago Tiled Resources was the secret weapon you lemmings were using. Tiled Resources is PRT, which is also supported on PS4; I argued this to hell and beyond and you people would not admit it works on PS4... lol

The new secret sauce now is DX12... lol

No it's not. You are not making a game, nor are you a damn game developer; quit talking sh**.

Are you goddamned brain damaged? You have no idea what you are talking about AT ALL. Do you even know how software is written? What tools get used?

You couldn't be more out of your league here if you tried.

I told you, dude, don't waste your time. He doesn't know anything. He doesn't even own a PS4 or an Xbox One, but he likes to troll everyone and talk up Sony by spewing nonsense he doesn't understand and copy/pasting to try to prove a point he isn't smart enough to prove with his own words.


#469 StormyJoe
Member since 2011 • 7806 Posts

@darkangel115 said:

@StormyJoe said:

@tormentos said:
@StormyJoe said:

Moon is made of blue cheese...

Microsoft does not have to give its APIs to Sony, nor does Sony have to give their APIs to Microsoft. You literally have no idea how software works. You are really making yourself look like an idiot by debating things you know nothing about.

And, for the love of God, STOP F**KING SAYING I THINK THE XB1 WILL DOUBLE IN SPEED - I HAVE NEVER SAID THAT. Clear enough? Do you need it in a larger font?

Also, you are again saying things that I have never said, and then trying to counter them. I never said Sony's APIs wouldn't also increase in performance. WTF is the matter with you???!!!!

Finally, being a .Net developer is 100% relevant. You just don't know what you are talking about enough to understand that.

You are a buffoon and you will always be.

MS and Sony don't have to give their APIs to one another to be able to use features of one API in another.

PRT wasn't supported by DX in 2011; now it is, in DX 11.2, where you know it as Tiled Resources.

So MS got a hold of OpenGL? You know sh** about what you're talking about. MS doesn't need to get OpenGL to be able to support something OpenGL supports; I just proved that very easily. Likewise, they don't need to get Sony's API, or the other way around. And while DX11.2 is relatively new, OpenGL has supported PRT since 2011.

I have a point with links; you don't.

APIs get new features all the time; mostly, like with PRT, OpenGL gets them first. In this case it took MS almost 3 years after OpenGL to bring Tiled Resources.

Dude, WTF. I argued with you for pages on end to get you to admit that Sony's API would also improve, and then after you did, you tried to minimize Sony's improvements based on the fact that developers haven't complained, that Sony hasn't said anything, or that Sony's API was good enough, so it would not improve like DX on Xbox One.

A few months ago Tiled Resources was the secret weapon you lemmings were using. Tiled Resources is PRT, which is also supported on PS4; I argued this to hell and beyond and you people would not admit it works on PS4... lol

The new secret sauce now is DX12... lol

No it's not. You are not making a game, nor are you a damn game developer; quit talking sh**.

Are you goddamned brain damaged? You have no idea what you are talking about AT ALL. Do you even know how software is written? What tools get used?

You couldn't be more out of your league here if you tried.

I told you, dude, don't waste your time. He doesn't know anything. He doesn't even own a PS4 or an Xbox One, but he likes to troll everyone and talk up Sony by spewing nonsense he doesn't understand and copy/pasting to try to prove a point he isn't smart enough to prove with his own words.

I totally believe you. Someone who knows absolutely nothing about software development is trying to debate me about software development.

Unbelievable....


#470 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@btk2k2 said:

@rrjim1 said:

@btk2k2:

In order for most programmers to change to Mantle they would need to see a double-digit improvement, and Mantle cannot do this. This is where DX12 really shines: it does improve performance into the double digits.

Mantle improvements depend on how CPU bound the scenario is. An i7-4770K + 290X in BF4 is unlikely to see a huge increase, but something like an FX-6xxx + 290X will see a larger one.

@ttboy said:

@lglz1337 said:

@StormyJoe: and who is the so-called dev in this place? lol

There are devs, but how many devs in here have experience with both platforms as well as DX12? I think the only one speaking about it is Brad Wardell.

I think you only have a few choices:

* Brad is completely wrong in both CPU and GPU cases.

* Brad is completely correct in both CPU and GPU cases.

* Brad is only correct for CPU bound cases and it will not help GPU.

I can't think of a console project Brad has worked on, but you can point one out to me if I am incorrect. Brad is an AI dev, a very, very good AI dev, but an AI dev nonetheless.

Mantle aims to help both CPU and GPU performance although they see CPU gains as the low hanging fruit so have targeted them first. DX12 will likely improve both somewhat with CPU being a much larger factor.

All Brad has said about DX12 and Xbox One is that it should make it easier to achieve a higher resolution and you should have more objects on screen at the same time. Nothing specific and nothing ground breaking.

I would have to research his CV which may take some time. A quick glance shows that he works with guys from Oxide who have a ton of DX experience. One of which "led the technical development of HLSL for D3D10".

Brad has been very specific about the gains that he's seen. He has even goes so far as to stand by his assertion of “it effectively gives every Xbox One owner a new GPU that is twice as fast as the old one,” .

Devs of his caliber are usually very selective in what they say. Hell, we have hour-long debates on naming methods at my company. I don't think he would put his rep on the line if he and his colleagues were dead wrong. But who knows, maybe I'm wrong. We'll see in a few months!

Avatar image for darkangel115
darkangel115

4562

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#472 darkangel115
Member since 2013 • 4562 Posts

@StormyJoe said:

@darkangel115 said:

@StormyJoe said:

@tormentos said:
@StormyJoe said:

Moon is made of blue cheese...

Microsoft does not have to give its APIs to Sony, nor does Sony have to give their APIs to Microsoft. You literally have no idea how software works. You are really making yourself look like an idiot by debating things you know nothing about.

And, for the love of God, STOP F**KING SAYING I THINK THE XB1 WILL DOUBLE IN SPEED - I HAVE NEVER SAID THAT. Clear enough? Do you need it in a larger font?

Also, you are again saying things that I have never said, and then trying to counter them. I never said Sony's APIs wouldn't also increase in performance. WTF is the matter with you???!!!!

Finally, being a .Net developer is 100% relevant. You just don't know what you are talking about enough to understand that.

You are a buffoon and always will be.

MS and Sony don't have to give APIs to one another to be able to use features of one API on another.

PRT wasn't supported by DX in 2011; now it is in DX 11.2, where you also know it as Tiled Resources.

MS got a hold of OpenGL? You know sh** about what you're talking about. MS doesn't need to get OpenGL to be able to support something OpenGL supports; I just proved that very easily. Likewise, they don't need to get Sony's API, or the other way around. And while DX11.2 is relatively new, OpenGL has had support for PRT since 2011.

I have a point with links you don't.

APIs get new features all the time. Like PRT, OpenGL mostly gets them first; in this case it took MS almost 3 years to bring Tiled Resources after OpenGL did.

Dude, WTF, I argued with you for pages on end to get you to admit that Sony's API would also improve, and then after you did, you try to minimize Sony's improvements based on the fact that developers haven't complained, that Sony hasn't said anything, or that Sony's API was good enough, so it would not improve like DX on Xbox One.

A few months ago Tiled Resources was the secret weapon you lemmings were using. Tiled Resources is PRT, which is also supported on PS4; I argued this to hell and beyond and you people would not admit it works on PS4..lol

The new secret sauce now is DX12..lol

No it's not; you are not making a game, nor are you a damn game developer, so quit talking sh**.

Are you goddamned brain damaged? You have no idea what you are talking about AT ALL. Do you even know how software is written? What tools get used?

You couldn't be more out of your league here if you tried.

I told you, dude, don't waste your time. He doesn't know anything. He doesn't even own a PS4 or Xbox One, but he likes to troll everyone and talk up Sony by spewing nonsense he doesn't understand, copy/pasting to try to prove a point he isn't smart enough to prove with his own words.

I totally believe you. Someone who knows absolutely nothing about software development is trying to debate me about software development.

Unbelievable....

He's not trying to debate you. He literally acts like a bot, just posting random copy-and-paste crap from other websites. All he is trying to do is get a reaction. He must have a sad life. I stopped replying to him ages ago.

Avatar image for deactivated-5f19d4c9d7318
deactivated-5f19d4c9d7318

4166

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#473  Edited By deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@rocky_denace said:

It's a TRAVISTY that this thread is still going and not locked yet the amount of fakeboy worthless know nothing blind fanboys is pathetic to see. It's incredible that zit faced basement dwellers waiting for mommy's ham sandwich can still be in this thread proclaiming they know more then developers and MS, AMD, Intel, and Nvidia whom have all said and confirmed that DX12 is a big leap in tech and performance for PC and Xbox One. Yet somehow we are to believe these worthless know nothings in this thread that they know more then these corporate giants and real developers who create this tech? We are better off doing this in this thread see pic below.

I agree about the fanboys, but you do realise you're in a thread about the devs' reactions?

Which developers have been talking about DX12? All I've seen is those from MS, AMD or Intel, and speculation. Just as with the devs quoted in the OP, it's good to be skeptical, as we've not heard anything from anyone who isn't trying to sell GPUs, CPUs or an OS.

Imo there's no doubt we'll see boosts for the PC due to the work DX12 does with CPU cores, just as we're seeing with Mantle. A lot of that work is done on console regardless of DX12, so I'm going to remain skeptical. There'll certainly be some major time savers for devs though.

Look at PlanetSide 2: multi-core support came because of the work they'd done with the PS4.

Avatar image for lglz1337
lglz1337

4959

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#474 lglz1337
Member since 2013 • 4959 Posts

@StormyJoe: cool i'm obama better believe

Avatar image for lostrib
lostrib

49999

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#475 lostrib
Member since 2009 • 49999 Posts

@rocky_denace said:

It's a TRAVISTY that this thread is still going and not locked yet the amount of fakeboy worthless know nothing blind fanboys is pathetic to see. It's incredible that zit faced basement dwellers waiting for mommy's ham sandwich can still be in this thread proclaiming they know more then developers and MS, AMD, Intel, and Nvidia whom have all said and confirmed that DX12 is a big leap in tech and performance for PC and Xbox One. Yet somehow we are to believe these worthless know nothings in this thread that they know more then these corporate giants and real developers who create this tech? We are better off doing this in this thread see pic below.

Stop crying for shit to get locked, just flag it and move on

Avatar image for tdkmillsy
tdkmillsy

5876

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#476 tdkmillsy
Member since 2003 • 5876 Posts


@tormentos said:

@tdkmillsy said:

Cough Infamous, Cough Killzone, Cough Uncharted

All consoles release true and trusted games, your argument is null and void.

Xbox One has plenty of other games in its locker.

You fell a little too short on your cough: The Order, Driveclub...lol

Oh please Forza is not the COD of racing games..lol

When did the last Killzone hit?

Early 2011.

The last Uncharted?

Late 2011.

Infamous? Mid 2011.

lol

So almost 3 years of absence for Infamous and Killzone, while the next Uncharted would only be here by Uncharted 3's 4th birthday..lol

Forza 4 2011.

Forza Horizon 2012

Forza 5 2013

Forza Horizon 2 2014

Forza 6 2015..lol

Hahahaha COD like...

OK, seeing as you claim Forza and Forza Horizon are the same game, let's look at Gran Turismo.

Gran Turismo has released a game in some form nearly every year from 1997 to 2013 and most likely every year going forward.

Metal Gear had a good run before going multiplat

Killzone has released 4 games across the series; Gears of War released 4.

Halo has 8 releases from 2001, but only 6 were true to the series, with the others being alternate games and re-releases. So that's one every 2-3 years approximately. How's that different from Killzone in 2011 and Killzone Shadow Fall in 2013?

I'll say it slowly so you can understand: ALL CONSOLES RELEASE GAMES FROM WELL-SELLING SERIES.

Sony do it

Microsoft do it

Nintendo do it

Avatar image for ReadingRainbow4
ReadingRainbow4

18733

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#477  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@rocky_denace said:

It's a TRAVISTY that this thread is still going and not locked yet the amount of fakeboy worthless know nothing blind fanboys is pathetic to see. It's incredible that zit faced basement dwellers waiting for mommy's ham sandwich can still be in this thread proclaiming they know more then developers and MS, AMD, Intel, and Nvidia whom have all said and confirmed that DX12 is a big leap in tech and performance for PC and Xbox One. Yet somehow we are to believe these worthless know nothings in this thread that they know more then these corporate giants and real developers who create this tech? We are better off doing this in this thread see pic below.

You really have to work on your English bro.

Avatar image for misterpmedia
misterpmedia

6209

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#478 misterpmedia
Member since 2013 • 6209 Posts

@rocky_denace said:

It's a TRAVISTY that this thread is still going and not locked yet the amount of fakeboy worthless know nothing blind fanboys is pathetic to see. It's incredible that zit faced basement dwellers waiting for mommy's ham sandwich can still be in this thread proclaiming they know more then developers and MS, AMD, Intel, and Nvidia whom have all said and confirmed that DX12 is a big leap in tech and performance for PC and Xbox One. Yet somehow we are to believe these worthless know nothings in this thread that they know more then these corporate giants and real developers who create this tech? We are better off doing this in this thread see pic below.

Avatar image for GravityX
GravityX

865

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#479 GravityX
Member since 2013 • 865 Posts

Reset:

Ok.

So DirectX 12 will improve CPU performance, taking away a bottleneck and allowing more of the GPU to be utilized.

One dev says it's going to give a 100% GPU increase.

So the increase in the GPU is because of the increase in CPU performance, due to removal of the CPU bottleneck.

However, the increased CPU usage will produce much more heat.

The large heat sink and large fan seem to have been planned for this future increase in heat.

So let's think about the GPU pushing 4 shopping carts; however, the door (CPU) only lets 1 shopping cart through at a time.

Think of DirectX 12 as giving the CPU 4 side-by-side doors, in essence improving what the GPU can do.

Conclusion: DirectX 12 might help after all.
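The shopping-cart analogy maps onto parallel command-list recording. Here is a minimal sketch of the idea (the draw counts, four-way split, and abstract cost units are all hypothetical; this models recording cost rather than timing a real driver):

```python
# One "door": a single thread records every draw call, so the whole cost
# lands on one core. Four "doors": each core records a quarter of the
# draws in parallel, and wall-clock cost is roughly the slowest chunk.
from concurrent.futures import ThreadPoolExecutor

DRAWS = 8000  # hypothetical draw calls per frame
COST = 1      # abstract cost units to record one draw call

def record(draw_count):
    return draw_count * COST

def one_door():
    return record(DRAWS)  # everything funnels through one core

def four_doors():
    with ThreadPoolExecutor(max_workers=4) as pool:
        per_core = pool.map(record, [DRAWS // 4] * 4)
    return max(per_core)  # chunks run in parallel; cost ~ slowest chunk

print(one_door())    # 8000 units on a single core
print(four_doors())  # 2000 units of wall-clock cost across four cores
```

Note this only helps the submission side; the GPU still has to execute the same 8000 draws either way.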

Avatar image for scatteh316
scatteh316

10273

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#480  Edited By scatteh316
Member since 2004 • 10273 Posts

But what people in here are too stupid to understand is that the Xbone is a CLOSED BOX system.... it's already running very optimized code....

And as for unlocking more GPU performance? Complete and utter rubbish.... if you're GPU bound then you're GPU bound, and having more CPU power will not make a difference.

It's the same on PC: if your graphics card is maxed out, changing the CPU to a faster one won't do anything.

DirectX 12 will not give the massive boost people are expecting; even other developers are laughing at that statement....

Avatar image for btk2k2
btk2k2

440

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#481  Edited By btk2k2
Member since 2003 • 440 Posts

@ttboy said:

I would have to research his CV which may take some time. A quick glance shows that he works with guys from Oxide who have a ton of DX experience. One of which "led the technical development of HLSL for D3D10".

Brad has been very specific about the gains that he's seen. He has even gone so far as to stand by his assertion that “it effectively gives every Xbox One owner a new GPU that is twice as fast as the old one.”

Devs of his caliber are usually very selective in what they say. Hell, we have hour-long debates on naming methods at my company. I don't think he would put his rep on the line if he and his colleagues were dead wrong. But who knows, maybe I'm wrong. We'll see in a few months!

Brad Wardell owns Stardock, a company that makes most of its money through Windows applications such as Start8 and WindowBlinds. That money allows them to make more niche games such as Gal Civ, Elemental and Fallen Enchantress. They also helped with and published Sins of a Solar Empire. Brad does the AI in these games, but as owner is probably aware of other aspects as well. Stardock have teamed up with Oxide to create a new engine for RTS and 4X TBS games that enables much more to be going on on screen at the same time, which for space-based games just adds to the options a developer has. Now whether Brad himself has done any work in this team-up, or just keeps tabs on it, I have no idea.

In RTS and 4X TBS games the CPU is very important because of all the objects that need to be kept track of. When you play Civ V the game needs to know where everything is, even if the player cannot see it, so that it can do the correct maths in conflict resolution or roll the RNG for what bonus you get from some ruins. Doing that and having good AI is a challenge, so the less overhead a game of this kind has, the more CPU resources can go to handling object tracking and AI, allowing a greater number of objects on screen and also allowing more computationally expensive AI algorithms.

In Gal Civ 2 there is an option to enable the AI to use more CPU resources; this was put in as CPUs became more powerful, and it really helps the AI perform at a higher level without resorting to cheats like other 4X games do. The AI in Gal Civ is known as some of, if not the absolute, best in the genre. Brad is amazing at AI work, one of the best in the business.

I am not saying he is wrong about DX12 for Xbox One. What I am saying is that if it is true, then those same techniques can be used in the PS4 API, as the core uarch is the same and DX12 is based on Mantle (according to the guys at DICE). The low-level PS4 API is also likely to be Mantle-like, as it has been stated both are very, very similar (DICE again), and all 3 use HLSL. I already think the PS4 API has more Mantle-like features than the Xbox DX11 API, as it can consistently achieve better minimum frame rates in games where some frame drops are CPU related, such as 64-player maps in BF4. Based on that, I think DX12 will close the gap a little as Xbox One takes advantage of the same features, but there will still be a 40-50% gap in performance simply because the PS4 has a better GPU than the Xbox One, and no amount of API optimisation is going to close that gap.

900p vs 1080p, with all else being equal, is what I predict the majority of titles will achieve; this is partly due to the GPU in the Xbox One and partly due to the ESRAM.
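That 900p vs 1080p prediction lines up with simple pixel arithmetic; a quick check:

```python
# 1080p pushes 44% more pixels per frame than 900p, which sits right in
# the rough 40-50% GPU performance gap described above.
px_1080p = 1920 * 1080  # 2,073,600 pixels
px_900p = 1600 * 900    # 1,440,000 pixels
print(px_1080p / px_900p)  # 1.44
```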

Avatar image for Guy_Brohski
Guy_Brohski

2221

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#482  Edited By Guy_Brohski
Member since 2013 • 2221 Posts

@ni6htmare01:WOW! I was thinking about upgrading my Video Card but I guess with DirectX 12 I no longer need to??? LOL LOL LOl~~~ Yeah right!!

LOL LOL LOl~~~ Yeah, because why would anyone actually want to increase the performance of their overpriced GPU, right?

Avatar image for trasherhead
trasherhead

3058

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 0

#483 trasherhead
Member since 2005 • 3058 Posts

Why is this thread still going? It is a damn API; it allows devs to use hardware features.

Both have the same hardware with the same features, meaning both have the same ability to do the same stuff.

Tiled Resources is a hardware feature; it is not DX12 specific, and MS said as much in their presentation of it: OpenGL also has it, and it is NOT specific to the ESRAM.
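For what the feature buys you, here is a conceptual sketch, not real D3D or GL code: a sparse/tiled texture only commits memory for the tiles actually touched, which is what PRT/Tiled Resources does in hardware via page mappings. The `SparseTexture` class, tile size, and texel coordinates below are all invented for illustration.

```python
TILE = 128  # tile edge in texels (illustrative)

class SparseTexture:
    """Toy model of a partially resident texture: tiles commit on demand."""

    def __init__(self, width, height):
        self.size = (width, height)
        self.resident = {}  # (tile_x, tile_y) -> fake tile payload

    def touch(self, x, y):
        """Sample texel (x, y): commit its tile if not already resident."""
        key = (x // TILE, y // TILE)
        if key not in self.resident:
            self.resident[key] = bytearray(TILE * TILE)  # 1 byte per texel
        return key

    def resident_bytes(self):
        return len(self.resident) * TILE * TILE

# A 16K x 16K texture would be 256 MB fully resident (at 1 byte/texel),
# but touching texels in only three distinct tiles commits just 48 KB.
tex = SparseTexture(16384, 16384)
for x, y in [(0, 0), (300, 40), (4000, 4000), (4001, 4001)]:
    tex.touch(x, y)
print(tex.resident_bytes())  # 49152 bytes: 3 tiles, not 256 MB
```

Since residency is driven by what the shader samples, both consoles' GCN GPUs can do this regardless of which API exposes it, which is the post's point.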

Now lock this.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#484 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@btk2k2:

I think his point is that most games are CPU bound, not GPU bound, otherwise his comment wouldn't make sense. He obviously feels fairly confident about it.

PS4 can emulate any feature or use Mantle, but at what cost? As you use more CPU cores, heat rises, and then you have to be concerned about the thermal envelope... this is where case design etc. rears its head.

Sure, the PS4 GPU is stronger on paper, but is it efficiently used? This is Microsoft's bet: efficiency over brute force... we'll see whose approach is better.

Avatar image for Bishop1310
Bishop1310

1274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#485  Edited By Bishop1310
Member since 2007 • 1274 Posts

@tormentos said:

That is a joke. DX doesn't use just 1 core of a multicore CPU; in fact the examples given using screens showed how all cores were working on DX11. What they did was improve timing, which showed pretty easily on the screens MS showed.

Not only that, the Xbox One doesn't have 8 cores to use, it has 6, so you already have 2 wrong statements there. All cores on the CPU have been used on consoles for years, and on PC as well; it took more time, but as all know, games demand multicore CPUs, and DX12 isn't here yet, so yeah, those cores are being used.

The load was already split, but most of the work was done by the first core. If you look at what they did, they lowered the first core's time greatly while increasing the secondary cores'; in fact in the screens shown you can compare how they got to cutting the timing in half. But all cores were already in use, and that demo was for PC, not Xbox One, which uses only 6 cores for games (confirmed already, not 8), so he was wrong.

But here is a nice take on this..

This right here proves that everyone in this thread is right and you're pretty lost with this stuff.

It's very well known that one of the HUGE defects in DX up until now is that the load and instruction count placed on the first core of a multi-core setup far exceeds the instructions passed on to the other cores. The mundane tasks typically handled by those cores rarely relieve the first core of handling the initial thread of big instructions.

You've been outclassed here big time, my friend; your butthurt damage control is just funny to watch now.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#486 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

More interesting answers from Brad:

  1. cozomel ‏@cozomel74 21 hrs

    @draginol So what do you know about the PS4's API and how it compares to DX12? And is PS4's API also not fully utilizing the cores?


  2. Brad Wardell ‏@draginol 21 hrs

    @cozomel74 I'm afraid not. The PS4 hardware is substantially better overall though.


  3. cozomel ‏@cozomel74 21 hrs

    @draginol Let me pick your brain a little if you dont mind, but whats more important to performance. The driver or the API?


  4. Brad Wardell ‏@draginol 20 hrs

    @cozomel74 Really depends. There's so many factors that affect perf. Depends how bad either is. :) In my experience, though, the driver.


  5. ExploderMouse ‏@ExploderMouse 11 hrs

    @draginol Can you verify that the X1's GPU will double its performance in any way? I'm a firm believer in this, but be nice to put it to bed


  6. Brad Wardell ‏@draginol 11 hrs

    @ExploderMouse there is no way to prove until it comes out and devs use it.

Avatar image for deactivated-5c79c3cfce222
deactivated-5c79c3cfce222

4715

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#487  Edited By deactivated-5c79c3cfce222
Member since 2009 • 4715 Posts

Boost, sure. A 100% overall performance gain would be some sort of miracle.

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#488 StormyJoe
Member since 2011 • 7806 Posts

@lglz1337 said:

@StormyJoe: cool i'm obama better believe

What?

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#489 StormyJoe
Member since 2011 • 7806 Posts

@scatteh316 said:

But what people in here are too stupid to understand is that the Xbone is a CLOSED BOX system.... it's already running very optimized code....

And as for unlocking more GPU performance? Complete and utter rubbish.... if you're GPU bound then you're GPU bound, and having more CPU power will not make a difference.

It's the same on PC: if your graphics card is maxed out, changing the CPU to a faster one won't do anything.

DirectX 12 will not give the massive boost people are expecting; even other developers are laughing at that statement....

100% untrue. Seriously... if you don't know anything about the topic at hand, why post?

Avatar image for btk2k2
btk2k2

440

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#490 btk2k2
Member since 2003 • 440 Posts
@ttboy said:

@btk2k2:

I think his point is that most games are CPU bound, not GPU bound, otherwise his comment wouldn't make sense. He obviously feels fairly confident about it.

PS4 can emulate any feature or use Mantle, but at what cost? As you use more CPU cores, heat rises, and then you have to be concerned about the thermal envelope... this is where case design etc. rears its head.

Sure, the PS4 GPU is stronger on paper, but is it efficiently used? This is Microsoft's bet: efficiency over brute force... we'll see whose approach is better.

RTS and 4X games are CPU bound; why do you think some games have unit limits? Have you tried playing Sup Com with lots of units? It slows to a crawl, and it is not because of the GFX load. FPS games tend to be GPU bound, but in large multiplayer maps the CPU load goes up considerably. It is very much a case-by-case basis, and Brad has said as much; as a dev who works in the 4X and RTS genres he may have been speaking from that context.
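The unit-limit point follows from how simulation cost scales. A rough sketch (the per-unit cost and frame budget are invented numbers): simulation touches every unit in the world each tick, so CPU time grows linearly with army size until it blows the frame budget, which is why caps exist.

```python
# CPU simulation cost scales with every unit in the world, even the ones
# off screen; past some army size the CPU alone exceeds the frame budget.
FRAME_BUDGET_MS = 16.7  # ~60 fps frame budget
SIM_US_PER_UNIT = 20.0  # invented cost to update one unit's AI/state

def sim_ms(total_units):
    return total_units * SIM_US_PER_UNIT / 1000.0

def cpu_bound(total_units):
    return sim_ms(total_units) > FRAME_BUDGET_MS

print(sim_ms(500))      # 10.0 ms: fits comfortably
print(cpu_bound(500))   # False
print(sim_ms(1000))     # 20.0 ms: over budget, hence unit caps
print(cpu_bound(1000))  # True
```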

None of the parts will exceed the TDP, so as long as the cooler is specified to dissipate that amount of heat, plus some overhead, it is not a concern at all. Gaming laptops have a higher TDP than the PS4 and their cooling solutions are quite often weaker, but they make up for it by using high-RPM fans.

They are going to be about as efficient as each other, with the nod going to the PS4 because devs do not have to deal with ESRAM. The drivers will be the same, as they are both GCN based, and the APIs are going to be very similar, as both will be Mantle-like.

@ttboy said:

More interesting answers from Brad:

  1. cozomel ‏@cozomel74 21 hrs

    @draginol So what do you know about the PS4's API and how it compares to DX12? And is PS4's API also not fully utilizing the cores?

  2. Brad Wardell ‏@draginol 21 hrs

    @cozomel74 I'm afraid not. The PS4 hardware is substantially better overall though.

  3. cozomel ‏@cozomel74 21 hrs

    @draginol Let me pick your brain a little if you dont mind, but whats more important to performance. The driver or the API?

  4. Brad Wardell ‏@draginol 20 hrs

    @cozomel74 Really depends. There's so many factors that affect perf. Depends how bad either is. :) In my experience, though, the driver.

  5. ExploderMouse ‏@ExploderMouse 11 hrs

    @draginol Can you verify that the X1's GPU will double its performance in any way? I'm a firm believer in this, but be nice to put it to bed

  6. Brad Wardell ‏@draginol 11 hrs

    @ExploderMouse there is no way to prove until it comes out and devs use it.

Interesting yes. His answer in bullet point 2 is a bit vague though. Does he mean he does not know about the PS4 API and how it compares to DX12 or is he saying the PS4 does not utilize the cores? He does quite clearly state that the PS4 is substantially better though.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#491  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

Pretty good lol.

Avatar image for blackace
blackace

23576

Forum Posts

0

Wiki Points

0

Followers

Reviews: 4

User Lists: 0

#492 blackace
Member since 2002 • 23576 Posts

If you guys haven't figured it out by now, Tormentos is just a biased trolling fool who thinks he knows everything. Just do what I do and ignore his BS comments. The guy doesn't even own a PS4 or XB1, yet he thinks he knows everything about both. LOL!! What a joke. I'm surprised this thread hasn't been locked yet. It's funny how cows will believe thuway and cboat rumors as FACT, but when comments come from developers, AMD, Nvidia & Intel about anything XB1, it's all false. lmao!! Hilarious. Bunch of fools.

Avatar image for misterpmedia
misterpmedia

6209

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#493  Edited By misterpmedia
Member since 2013 • 6209 Posts

@tormentos You're getting popular over at the loony bin ;)

No wonder this topic has so many views; all the pleebs from the olive garden have been dropping by.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#494 tormentos
Member since 2003 • 33784 Posts

@GravityX said:

Reset:

Ok.

So DirectX12 will improve CPU. Taking away a bottleneck and allow more of the GPU to be utilized.

One dev says its going to 100% GPU increase.

So the increase in the GPU is because of the increase in CPU performance, due to removal of the CPU bottleneck.

However the increase CPU usage will produce much more heat.

Large heat sink and large fan seemed to have been planned for this future increase in heat.

So lets think about GPU pushing 4 shopping carts however the door (CPU) only allows 1 shopping cart at a time.

Think of DirectX12 as giving the (CPU) 4 side by side doors, so in essence improving what the GPU can do.

Conclusion DirectX12 might help after all.

Xbox One games weren't CPU bound that I know of; they were GPU bound, because the GPU is weak sauce, plus it had a reservation, plus the ESRAM is too small for 1080p with certain quality. The CPU was the least of the concerns; in fact Activision asked MS to drop the 10% reservation, they didn't ask for lower CPU overhead, because the Xbox One already has lower CPU overhead than PC. That's because console APIs have been more streamlined than PC ones from the firing shot, period.

And that is one of the biggest reasons why the whole 2X claim was called into question. Also, getting 2x performance on code that doesn't have advanced physics, AI, or anything else a normal CPU handles isn't the best indication of how things will work.

A 100% GPU increase is total bullsh** and shows even more how many lies are being spread; there is no fu**ing way a GPU will increase performance 100%, which is double, from just a damn API that works on CPU overhead.

The fun thing about this is how you claimed for months that the Xbox One CPU was faster than the PS4's because it had a 100MHz boost in speed, and now all of a sudden you people want to act like the Xbox One CPU was bottlenecked, lol, and you want to pretend that the PC version of DX is the exact same DX on Xbox, so any gains on PC will equal the same gains on Xbox One..lol

The large fan and case were planned for that: the Xbox One APU has ESRAM inside it, which produces more heat than just a CPU and GPU alone as on the PS4. Right now, without DX12 and since launch, the Xbox One produces more heat than the PS4 while gaming, even though:

The PS4 has an internal PSU, which produces more heat inside the console than a power brick would.

The PS4 has a smaller case that leaves less room for it to breathe.

The PS4 while gaming consumes 6 watts more than the Xbox One, which should produce more heat.

The case is big on the Xbox because MS sucks at making hardware, the fan is huge because MS suffered the RROD last gen due to overheating, and the Xbox One APU generates more heat because it has ESRAM inside.

And yes, you people still don't know what you're talking about, and I am bookmarking this great thread for later reference..lol

Avatar image for scatteh316
scatteh316

10273

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#495 scatteh316
Member since 2004 • 10273 Posts

@StormyJoe said:

@scatteh316 said:

But what people in here are too stupid to understand is that the Xbone is a CLOSED BOX system.... it's already running very optimized code....

And as for unlocking more GPU performance? Complete and utter rubbish.... if you're GPU bound then you're GPU bound, and having more CPU power will not make a difference.

It's the same on PC: if your graphics card is maxed out, changing the CPU to a faster one won't do anything.

DirectX 12 will not give the massive boost people are expecting; even other developers are laughing at that statement....

100% untrue. Seriously... if you don't know anything about the topic at hand, why post?

I know more than you, boy..

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#496 StormyJoe
Member since 2011 • 7806 Posts

@scatteh316 said:

@StormyJoe said:

@scatteh316 said:

But what people in here are too stupid to understand is that the Xbone is a CLOSED BOX system.... it's already running very optimized code....

And as for unlocking more GPU performance? Complete and utter rubbish.... if you're GPU bound then you're GPU bound, and having more CPU power will not make a difference.

It's the same on PC: if your graphics card is maxed out, changing the CPU to a faster one won't do anything.

DirectX 12 will not give the massive boost people are expecting; even other developers are laughing at that statement....

100% untrue. Seriously... if you don't know anything about the topic at hand, why post?

I know more than you, boy..

First off, not a boy. Secondly, I seriously doubt it. I've been a software developer for over 10 years, and have held a current MCSD certification for every version of .Net.

How about you?

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#497 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

@scatteh316 said:

But what people in here are too stupid to understand is that the Xbone is a CLOSED BOX system.... it's already running very optimized code....

And as for unlocking more GPU performance? Complete and utter rubbish.... if you're GPU bound then you're GPU bound, and having more CPU power will not make a difference.

It's the same on PC: if your graphics card is maxed out, changing the CPU to a faster one won't do anything.

DirectX 12 will not give the massive boost people are expecting; even other developers are laughing at that statement....

100% untrue. Seriously... if you don't know anything about the topic at hand, why post?

No it's not; just because you have a damn i7 doesn't mean a 7770 will all of a sudden be a 7970.

The funny thing is you accuse him of knowing nothing, but you don't give any examples at all to counter his arguments.

If your game is GPU bound, you could have the most efficient CPU in the world with the most efficient API in the world and it would change nothing. The Xbox One has a 7770-class GPU; any game that becomes GPU bound will stay that way, and DX12 will do sh** to change that. Mostly all games are GPU bound on Xbox One as opposed to CPU bound; if the Xbox One had a stronger GPU it would be doing better now regardless of its CPU.

Then there is the ESRAM, which isn't great for 1080p with certain effects. Dude, the Xbox One GPU is equivalent in power to a 7770; it may have dual render pipes like the 7790, but performance-wise it is a 1.28TF GPU. You can't get that far on PC with that, even using Mantle, and you can go further with a 7850 or a stronger GPU like the PS4's, which is the case here.

@blackace said:

If you guys haven't figured it out by now, Tormentos is just a biased trolling fool who thinks he knows everything. Just do what I do and ignore his BS comments. The guy doesn't even own a PS4 or XB1, yet he thinks he knows everything about both. LOL!! What a joke. I'm surprised this thread hasn't been locked yet. It's funny how cows will believe thuway and cboat rumors as FACT, but when comments come from developers, AMD, Nvidia & Intel about anything XB1, it's all false. lmao!! Hilarious. Bunch of fools.

I don't need to own a PS4 or Xbox One to talk about them. You are a joker who claims to own all consoles, yet you didn't even know that on PS3 you could share PSN games..lol

We are talking about hardware here and DX12. I don't need to own either to talk about it; or what, does owning a PS4 or Xbox One earn you a PhD in DX, APIs and hardware?

My god what a silly fanboy you really are.

Nvidia, AMD and Intel have been MS partners for endless years; all four have common interests in each other's business, and all will lie just to get a sucker like you to buy. DX12 is not for Windows 7, so upgrade with a new PC or buy yourself new Windows 8 software; if you buy a new machine, any of those three names can benefit from it.

And even an Activision developer doubts this sh**...lol

But then again, Activision are game developers; Intel, AMD and Nvidia aren't.

Funny thing, just like Respawn was gloating about how they went to the Xbox One because of the cloud, and now it turns out that Sony didn't even want the game on PS4 but wanted it on Vita..hahahaaaaaa

See people will say anything when their partners ask them to..lol

@misterpmedia said:

@tormentos You're getting popular over at the loony bin ;)

No wonder this topic has so many views; all the pleebs from the olive garden have been dropping by.

Hahahahaa holy sh***.......I really hit a nerve with that guy..hahahaha

He wants mister mediaC clown to argue with me..lol

My god, some people are a joke and will take any sh** a company tells them,hahaha it's epic..

http://www.gamespot.com/profile/blog/the-art-of-trolling-fanboys/26056956/

He should read my new blog; that would surely make him even madder..hahaha

@StormyJoe said:

First off, not a boy. Secondly, I seriously doubt it. I've been a software developer for over 10 years, and have had a current MCSD certification for every version of .Net.

How about you?

You are a software developer for 10 years and a fanboy probably for 20...

Considering how you want to put holes in one API vs another.

Avatar image for lbjkurono23
lbjkurono23

12544

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#498 lbjkurono23
Member since 2007 • 12544 Posts

@misterpmedia said:

@tormentos You're getting popular over at the loony bin ;)

Now wonder this topic has so many views, all the pleebs from the olive garden have been dropping by.

is that a cult?

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#499 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:

You are a software developer for 10 years and a fanboy probably for 20...

Considering how you want to put holes in one API vs another.

@tormentos, seriously... just keep quiet about stuff you know absolutely nothing about. Improving API performance improves software performance; it is a basic law of software development. Mathematicians do not have to prove to anyone that 1 + 1 = 2, but that doesn't make it untrue. Saying that improved APIs cannot or do not affect software performance is so patently false it is literally tantamount to saying the moon is made of blue cheese.

100% increase? No - not unless the previous APIs were written like total unoptimized crap. But you can realistically get a 20-30% increase. I see it all the time.
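The kind of API gain being claimed here can be sketched with a toy model of CPU-side submission cost (entirely illustrative numbers of my own, not measurements): a thinner API lowers the per-draw-call overhead, which shrinks the CPU side of the frame, but whether frame rate improves still depends on whether the CPU was the bottleneck in the first place.

```python
# Toy model of CPU-side submission cost (illustrative numbers only):
# per-frame CPU cost = fixed game work + per-draw-call API overhead.
def cpu_frame_ms(draw_calls, per_call_us, fixed_ms=5.0):
    return fixed_ms + draw_calls * per_call_us / 1000.0

old = cpu_frame_ms(5000, per_call_us=2.0)  # heavier API: 15.0 ms
new = cpu_frame_ms(5000, per_call_us=1.0)  # thinner API: 10.0 ms
assert old == 15.0 and new == 10.0
# A ~33% CPU-side saving here; the overall frame only speeds up
# if the CPU, not the GPU, was the limiting stage.
```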

Avatar image for ActicEdge
ActicEdge

24492

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#500 ActicEdge
Member since 2008 • 24492 Posts

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know computer engineering, software engineering, and hardware design: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.