Devs React to DX 12 Doubling Xbox One GPU Speed

#501  Edited By StormyJoe
Member since 2011 • 7806 Posts

@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

Really...

Improving API performance improves software performance. That is a fact, not an opinion.

#502  Edited By ActicEdge
Member since 2008 • 24492 Posts

@StormyJoe said:

@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

Really...

Improving API performance improves software performance. That is a fact, not an opinion.

Uh, duh, but scrolling through this thread you're not getting simple statements like that; the thread wouldn't be 500 posts otherwise. It's also obvious that the GPU speed in the X1 will not double because of it. No one believes that. Aside from those few facts, what else is there to discuss? Development tools improve, efficiency goes up, you get more for less. I think that's about it, unless I'm missing something? If I am, it's beyond my scope, because software/hardware engineering is not my field. It's just funny watching people pull numbers and logic out of their ass to reach conclusions founded on nothing.

#503 Phazevariance
Member since 2003 • 12356 Posts

@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

Not entirely true. Windows Vista uses double the RAM of Windows 7 on the same hardware because it duplicates images for the GUI in both system RAM and GPU RAM. That was fixed with a software update called Windows 7. :P GPU speed could increase (maybe not double) if the code running on it is crap and then gets sorted out in an update.

In the end, though, that just says the current code is crap.

#504 tormentos
Member since 2003 • 33784 Posts

@Phazevariance said:

@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

Not entirely true. Windows Vista uses double the RAM of Windows 7 on the same hardware because it duplicates images for the GUI in both system RAM and GPU RAM. That was fixed with a software update called Windows 7. :P GPU speed could increase (maybe not double) if the code running on it is crap and then gets sorted out in an update.

In the end, though, that just says the current code is crap.

But Windows 7 doesn't use double the RAM of Windows 8 on the same hardware. Vista was the most god-awful OS MS has ever made; Windows 7 is better, and the gap between Windows 8 and 7 is nothing like the gap between 7 and Vista.

Not only that, the Xbox One isn't Vista-based or even Windows 7-based; if anything it's closest to Windows 8, which is already very good. On top of that, console APIs are more streamlined from the opening bell than PC ones, which is why DX on Xbox One already uses features that will be in DX12 but aren't on PC yet.

So yeah, software improves hardware utilization, but the gains aren't eternal; there's a limit to what you can get by driving that hardware at 99% efficiency, not to say 100%.

Problem is, this also applies to the PS4, whose API will improve too. For example, surface tiling is now 10 to 100 times faster on the CPU than it was for the first couple of months, so with the next SDK update PS4 performance should improve.

This happens to all hardware.

#505  Edited By Old_Gooseberry
Member since 2002 • 3958 Posts

If the Xbox One does end up being twice as fast as the PS4, then I'd think about getting one someday... but it's not likely. If it's DX12, it's probably just a feature of it, like rendering shadows or water effects faster or something. There's no proof... but a lot of the time, if I can pick between DX9 or DX11, 11 is much faster (if the same effects are used on both), so it would make sense that DX12 would be even faster with new hardware.

Does the Xbox One even support DX12? Anyone know?

#506  Edited By MK-Professor
Member since 2009 • 4214 Posts

DX12 will only help eliminate the CPU bottleneck, just like Mantle, so no performance improvement on the GPU side for PC. As for the Xbox One, DX12 will make no difference.
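Whether an API change helps at all comes down to which side of the frame is the bottleneck. A toy model makes the point (all numbers here are hypothetical): CPU and GPU work overlap, so the slower side sets the frame time, and cutting API overhead only shortens CPU-bound frames.

```python
def frame_ms(cpu_ms, gpu_ms):
    # CPU submission and GPU rendering overlap; the slower side
    # determines how long the frame takes.
    return max(cpu_ms, gpu_ms)

# Hypothetical CPU-bound scene: 10 ms of game logic plus 10 ms of API
# overhead vs 12 ms of GPU work. Halving the API overhead helps.
assert frame_ms(10 + 10, 12) == 20
assert frame_ms(10 + 5, 12) == 15

# Hypothetical GPU-bound scene: 30 ms of GPU work. The same API
# saving buys nothing, which is the point about PC GPUs above.
assert frame_ms(10 + 10, 30) == 30
assert frame_ms(10 + 5, 30) == 30
```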

#507 scatteh316
Member since 2004 • 10273 Posts

@StormyJoe said:

@scatteh316 said:

@StormyJoe said:

@scatteh316 said:

But what people in here are too stupid to understand is that the Xbone is a CLOSED BOX system... it's already running very optimized code...

And as for unlocking more GPU performance? Complete and utter rubbish... If you're GPU bound then you're GPU bound, and more CPU power will not make a difference.

It's the same on PC: if your graphics card is maxed out, changing the CPU to a faster one won't do anything.

DirectX 12 will not give this massive boost people are expecting; even other developers are laughing at that statement...

100% untrue. Seriously... if you don't know anything about the topic at hand, why post?

I know more than you, boy...

First off, not a boy. Secondly, I seriously doubt it. I've been a software developer for over 10 years, and have held a current MCSD certification for every version of .NET.

How about you?

So no experience with the latest DirectX or OpenGL API's then?

Thanks for clearing that up.

#508  Edited By Tighaman
Member since 2006 • 1038 Posts

Again, you guys kill me. This is not 1999; CPUs now do things that were usually done on a GPU, and vice versa, so if it helps the CPU it helps the GPU. The PS4's CPU was the bottleneck in Infamous, even though they offloaded some of the CPU load to the GPU. Just think: if they didn't have to do that, how would the game look, when it's already a great-looking game?

#509 ActicEdge
Member since 2008 • 24492 Posts

@Phazevariance said:

@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

Not entirely true. Windows Vista uses double the RAM of Windows 7 on the same hardware because it duplicates images for the GUI in both system RAM and GPU RAM. That was fixed with a software update called Windows 7. :P GPU speed could increase (maybe not double) if the code running on it is crap and then gets sorted out in an update.

In the end, though, that just says the current code is crap.

This is fair, though I think it's a pretty massive jump for us to compare DX11 -> DX12 vs MS doing what they label as an upgrade to a new OS :P

#512 DirkXXVI
Member since 2008 • 498 Posts

Nintendrones = Skynet

You have been warned. The end is nigh!

#513  Edited By MK-Professor
Member since 2009 • 4214 Posts

@rocky_denace said:
@MK-Professor said:

DX12 will only help eliminate the CPU bottleneck, just like Mantle, so no performance improvement on the GPU side for PC. As for the Xbox One, DX12 will make no difference.

First off, I can't believe this thread is still going. Such ass clowns in here with no experience with any of this, yet acting like they are rocket scientists.

Listen, bra, first off the CPU in the X1 is not a bottleneck; in fact it's faster than the CPU in the PS4 because it's clocked higher. And yes, DX12 will help the CPU for sure, but what you clearly don't understand is that DX12 will make GPUs run much more efficiently and allow GPUs to use their CUs more efficiently. This has already been confirmed by MS, AMD, Intel, Nvidia and a respected developer who is currently working with the DX12 API. So please stop with your wishful butthurt thinking, because you are flat out wrong and have no experience with this whatsoever.

U Mad Bra

First of all, you didn't even read my post: I never said it is a bottleneck for the Xbox One; I was talking about PC. Also, the performance advantages DX12 will bring on the GPU side will be next to nothing. Funny how you're telling me that I have no experience here, when I am probably the only guy here who has experience programming for DX9, DX11 and HLSL.

#515 lostrib
Member since 2009 • 49999 Posts

@rocky_denace said:

@MK-Professor said:

@rocky_denace said:
@MK-Professor said:

DX12 will only help eliminate the CPU bottleneck, just like Mantle, so no performance improvement on the GPU side for PC. As for the Xbox One, DX12 will make no difference.

First off, I can't believe this thread is still going. Such ass clowns in here with no experience with any of this, yet acting like they are rocket scientists.

Listen, bra, first off the CPU in the X1 is not a bottleneck; in fact it's faster than the CPU in the PS4 because it's clocked higher. And yes, DX12 will help the CPU for sure, but what you clearly don't understand is that DX12 will make GPUs run much more efficiently and allow GPUs to use their CUs more efficiently. This has already been confirmed by MS, AMD, Intel, Nvidia and a respected developer who is currently working with the DX12 API. So please stop with your wishful butthurt thinking, because you are flat out wrong and have no experience with this whatsoever.

U Mad Bra

First of all, you didn't even read my post: I never said it is a bottleneck for the Xbox One; I was talking about PC. Also, the performance advantages DX12 will bring on the GPU side will be next to nothing. Funny how you're telling me that I have no experience here, when I am probably the only guy here who has experience programming for DX9, DX11 and HLSL.

Have you worked with the new DX12 API? No, you apparently haven't, so you cannot speak to any of this, especially when a respected developer who is currently working with it has and says otherwise. Also, AMD has said it's better than Mantle, Intel is saying it's the biggest leap they have seen in years, and Nvidia has praised it too. So don't come back until you have touched DX12; until then you don't know squat about it, and it doesn't matter if you have worked with DX9/10/11, because a respected developer and the corporate graphics giants are saying otherwise and calling it a huge leap over anything we have seen before.

THREAD!!!

lol someone's back from suspension, and still acting like a fool

#518 lostrib
Member since 2009 • 49999 Posts

@rocky_denace said:

@lostrib said:

@rocky_denace said:

@MK-Professor said:

@rocky_denace said:
@MK-Professor said:

DX12 will only help eliminate the CPU bottleneck, just like Mantle, so no performance improvement on the GPU side for PC. As for the Xbox One, DX12 will make no difference.

First off, I can't believe this thread is still going. Such ass clowns in here with no experience with any of this, yet acting like they are rocket scientists.

Listen, bra, first off the CPU in the X1 is not a bottleneck; in fact it's faster than the CPU in the PS4 because it's clocked higher. And yes, DX12 will help the CPU for sure, but what you clearly don't understand is that DX12 will make GPUs run much more efficiently and allow GPUs to use their CUs more efficiently. This has already been confirmed by MS, AMD, Intel, Nvidia and a respected developer who is currently working with the DX12 API. So please stop with your wishful butthurt thinking, because you are flat out wrong and have no experience with this whatsoever.

U Mad Bra

First of all, you didn't even read my post: I never said it is a bottleneck for the Xbox One; I was talking about PC. Also, the performance advantages DX12 will bring on the GPU side will be next to nothing. Funny how you're telling me that I have no experience here, when I am probably the only guy here who has experience programming for DX9, DX11 and HLSL.

Have you worked with the new DX12 API? No, you apparently haven't, so you cannot speak to any of this, especially when a respected developer who is currently working with it has and says otherwise. Also, AMD has said it's better than Mantle, Intel is saying it's the biggest leap they have seen in years, and Nvidia has praised it too. So don't come back until you have touched DX12; until then you don't know squat about it, and it doesn't matter if you have worked with DX9/10/11, because a respected developer and the corporate graphics giants are saying otherwise and calling it a huge leap over anything we have seen before.

THREAD!!!

lol someone's back from suspension, and still acting like a fool

I'm the only one in this thread talking any logical sense, because apparently everyone else thinks they know more than the graphics industry leaders and a respected developer currently working with it. Just a bunch of butthurt PS fanboys, scared and hoping it's not true.

...and that's where I stopped reading

#519  Edited By ronvalencia
Member since 2008 • 29612 Posts
@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

The context for the 2X claim is Star Swarm's requirements, and it may not be applicable to other titles.

For the Xbox One, its Direct3D/driver stack wasn't multi-threading friendly across multiple CPU cores. This is similar to AMD's Direct3D 11.0 implementation on the PC. The blame lies with AMD; they only just got their multi-threaded render capabilities with Mantle-era drivers. Better late than never.

Direct3D 11.0's multi-threaded rendering at the driver level is optional, hence AMD ignored it; it doesn't fall under DirectX's Feature Levels. AMD didn't ignore Direct3D's Feature Levels, e.g. 11_0, 11_1.

On the PC, using an Intel CPU with the industry's best IPC (instructions per cycle) design minimises AMD's MRT driver issue. AMD's GPU marketing department uses high-end Intel CPUs for their GPUs, e.g. AMD's latest FirePro W9100 (a recycled R9-290X with full-rate 64-bit FP math enabled) workstation reference build uses Intel's latest Xeon CPUs.

I program C++ for a living.

@StormyJoe said:
@ActicEdge said:

If all it took was software to double your GPU speed, there would be far less hardware growth than there currently is. This is obviously a silly quote. That said, all the people here claiming to know Computer Engineering, Software Engineering, and Hardware Coding: shut up. Quoting a bunch of PowerPoints you don't understand and a bunch of shit out of context does not make you seem smart or knowledgeable.

Really...

Improving API performance improves software performance. That is a fact, not an opinion.

Large improvements for AMD CPU users, which is important for IPC-weak**/small-core CPU designs, e.g. AMD Jaguar.

**Relative to Intel Core series CPUs, but not against ARM CPUs.

@MK-Professor said:
@rocky_denace said:
@MK-Professor said:

DX12 will only help eliminate the CPU bottleneck, just like Mantle, so no performance improvement on the GPU side for PC. As for the Xbox One, DX12 will make no difference.

First off, I can't believe this thread is still going. Such ass clowns in here with no experience with any of this, yet acting like they are rocket scientists.

Listen, bra, first off the CPU in the X1 is not a bottleneck; in fact it's faster than the CPU in the PS4 because it's clocked higher. And yes, DX12 will help the CPU for sure, but what you clearly don't understand is that DX12 will make GPUs run much more efficiently and allow GPUs to use their CUs more efficiently. This has already been confirmed by MS, AMD, Intel, Nvidia and a respected developer who is currently working with the DX12 API. So please stop with your wishful butthurt thinking, because you are flat out wrong and have no experience with this whatsoever.

U Mad Bra

First of all, you didn't even read my post: I never said it is a bottleneck for the Xbox One; I was talking about PC. Also, the performance advantages DX12 will bring on the GPU side will be next to nothing. Funny how you're telling me that I have no experience here, when I am probably the only guy here who has experience programming for DX9, DX11 and HLSL.


Note that the CPU drives the GPU.
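The multi-threading point can be illustrated with a small stand-in sketch. Nothing here is real Direct3D; `record` is a hypothetical placeholder for building a command list. A D3D11-style renderer funnels all recording through one immediate context on one thread, while a D3D12-style renderer records independent command lists on worker threads and submits them in order:

```python
from concurrent.futures import ThreadPoolExecutor

def record(draws):
    # Stand-in for recording a command list: one command per draw call.
    return [("draw", d) for d in draws]

draws = list(range(1000))

# D3D11 style: a single thread records every draw call.
single = record(draws)

# D3D12 style: four workers each record their own command list in
# parallel, then the lists are submitted to the queue in order.
chunks = [draws[i * 250:(i + 1) * 250] for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    lists = list(pool.map(record, chunks))
multi = [cmd for cl in lists for cmd in cl]

# The GPU sees identical work either way; only the CPU-side cost of
# building it is spread across cores.
assert multi == single
```

In a real engine the recording itself is the expensive CPU work being parallelised; this sketch only shows the structure (Python's GIL means it won't actually run faster here).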

#520  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

@ronvalencia said:

Note that AMD's GL_AMD_sparse_texture(1) vendor-specific extension was displaced by GL_ARB_sparse_texture(2).

GL_ARB_sparse_texture is part of OpenGL 4.4; AMD doesn't support OpenGL 4.4(3) at this time, and AMD's GL_AMD_sparse_texture extension is dead.

References

1. https://www.opengl.org/registry/specs/AMD/sparse_texture.txt

2. http://www.opengl.org/registry/specs/ARB/sparse_texture.txt it has NVIDIA's input.

3. http://www.khronos.org/news/press/khronos-releases-opengl-4.4-specification

Without ARB's support, it's a dead end API.

----------------

For textures that fit within the 6 GB of memory, AMD PRT has more performance gain on X1(1) than on PS4(2).

1. X1's TMU fetch source starts from 68 GB/s** to 204 GB/s** ESRAM. Hotchip.org stated peak 204 GB/s** BW (Back Write). "Tiling tricks" needs to be applied for textures (via AMD PRT) and render targets.

2. PS4's TMU fetch source remains at 176 GB/s**.

**theoretical peak values.

For greater than 6 GB textures, AMD PRT can be applied on HDD or Blu-ray sources.

In terms of AMD PRT functionality, AMD Kaveri APU's single speed memory design is similar to PS4's single speed memory design and the only large difference is with memory speed e.g. Kaveri's 128bit DDR3-2xx0 Mhz vs PS4's 256bit GDDR5-5500 Mhz.

PS4's PRT functions just with HDD/Blu-Ray -> GDDR5.

X1's PRT functions with HDD/Blu-Ray -> DDR3 -> ESRAM.

Kaveri's PRT functions just with HDD/SSD -> DDR3.

A gaming PC with a dGPU's PRT functions with HDD/SSD -> DDR3 -> GDDR5.

So basically your argument is that the OpenGL extension is dead. hahahaaaaaaaaaaaaaaaaa

Industry Support

“AMD has a long tradition of supporting open industry standards, and congratulates the Khronos Group on the announcement of the OpenGL 4.4 specification for state-of-the-art graphics processing,” said Matt Skynner, corporate vice president and general manager, Graphics Business Unit, AMD. “Maintaining and enhancing OpenGL as a strong and viable graphics API is very important to AMD in support of our APUs and GPUs. We’re proud to continue support for the OpenGL development community.”

From your own link, selective reader...

1.

The same discussion with ESRAM as well - the 204GB/s number that was presented at Hot Chips is taking known limitations of the logic around the ESRAM into account. You can't sustain writes for absolutely every single cycle. The writes is known to insert a bubble [a dead cycle] occasionally... One out of every eight cycles is a bubble, so that's how you get the combined 204GB/s as the raw peak that we can really achieve over the ESRAM. And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM. That's real code running. That's not some diagnostic or some simulation case or something like that.

Digital Foundry: So 140-150GB/s is a realistic target and you can integrate DDR3 bandwidth simultaneously?

Nick Baker: Yes. That's been measured.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

Don't mix Rebellion's claims with ESRAM speed; they never claimed that. The ESRAM can't do 204GB/s, as stated by MS two months after the Hot Chips conference; it can do 140 to 150GB/s, which is slower than the PS4's 176GB/s bandwidth, and even if you take away 20GB/s for the CPU, the PS4 is still faster.

2. Which doesn't matter, again, for all intents and purposes. GCN doesn't need ESRAM to use PRT, and I already quoted that from AMD themselves: the textures are taken from the HDD and partially loaded into the GPU RAM; system RAM has nothing to do with it. I quoted it and posted it to you again, and you still ignored it, because you only read what serves you best and ignore the rest, just like when you quoted Activision on COD Ghosts being 1080p and you were wrong.

Oh, and loading textures directly to the DDR3 will affect performance; I don't think DDR3 will be used as anything other than the final destination after the data passes through ESRAM. Turn 10 gave an example of this already: the sky, which is static and does nothing, can be placed in the main memory bank, but the cars and other stuff that demand speed are placed in the ESRAM.

PRT works on PS4, period, and no, GCN doesn't need ESRAM to make it work. PRT is a way to save memory, which for now both the Xbox One and PS4 have to spare.
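The memory-saving idea behind PRT/tiled resources can be sketched in a few lines. This is a toy model, not any vendor's API: a huge virtual texture only commits physical memory for the tiles something actually touches.

```python
class SparseTexture:
    # Toy partially-resident texture: 64 KB tiles allocated on first touch.
    TILE_BYTES = 64 * 1024

    def __init__(self, tiles_w, tiles_h):
        self.size = (tiles_w, tiles_h)
        self.resident = {}  # (tx, ty) -> backing storage

    def touch(self, tx, ty):
        # Real hardware signals non-resident accesses to the shader;
        # here we just page the tile in on demand.
        if (tx, ty) not in self.resident:
            self.resident[(tx, ty)] = bytearray(self.TILE_BYTES)
        return self.resident[(tx, ty)]

    def committed_bytes(self):
        return len(self.resident) * self.TILE_BYTES

# A 256x256-tile texture would need 4 GB fully resident, but a frame
# that samples only 40 tiles commits 40 * 64 KB = 2.5 MB.
tex = SparseTexture(256, 256)
for t in range(40):
    tex.touch(t % 256, t // 256)
assert tex.committed_bytes() == 40 * 64 * 1024
```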

@freddie2222 said:

@tormentos:

updated conversation from Brad Wardell:

Brad Wardell - "One way to look at the XBox One with DirectX 11 is it has 8 cores but only 1 of them does dx work. With dx12, all 8 do."

Some guy - "Xboxone alredy does that. All those dx 12 improvements are about windows not xbox."

Brad Wardell - "I'm sorry but you're totally wrong on that front. If you disagree, better tell MS. I'm just the messenger."

Some guy - "in all materials published they talk about PC. They only mention XBOX as already hawing close to zero owerhead."

Brad Wardell - "XBO is very low on overhead. Overhead is not the problem. It is that the XBO isn't splitting DX tasks across the 8 cores."

Multi-core issue:

Some guy - "I thought the Xbox One had 6 cores available for gaming and 2 cores reserved for the OS, does DX12 change this?"

Brad Wardell - "not that I'm aware of. #cores for gaming is different than cores that interact with the GPU."

Did you watch that video demo they showed, where they used 3DMark to demonstrate the workload on the CPU? One core was the main thread and pulled the biggest load, then they switched to DX12 and the workload was split evenly across all cores. This might be a pretty big deal. Infamous had a CPU bottleneck, right? It might be that the PS4 also currently doesn't split the load evenly across the cores, but what do I know. I'm not a dev.

That is a joke; DX doesn't use just one core of a multicore CPU. In fact, the example given using screens showed how all the cores were already working on DX11; what they actually did was improve the timing, which showed pretty clearly on the screens MS presented.

Not only that, the Xbox One doesn't have 8 cores to use; it has 6, so you already have two wrong statements there. All the cores of a CPU have been used on consoles for years, and on PC as well; it took more time, but all new games demand multicore CPUs, and DX12 isn't even here yet, so yeah, those cores are already being used.

The load was already split, but most of the work was done by the first core. If you watch what they do, they lower the first core's time greatly while increasing the secondary cores'; in fact, on the screens shown you can see how they got to cutting the timing in half. But all the cores were already in use, and that demo was for PC, not Xbox One, which uses only 6 cores for games (confirmed already), not 8, so he was wrong.

But here is a nice take on this..

Have you calculated the effective data transfer with every 8th cycle being nuked for data transfer?

Rebellion made a statement on memory bandwidth for both PS4 and X1, i.e. PS4's GDDR5 memory is almost as fast as X1's ESRAM.

You're wrong on Direct3D, AMD GPUs and CPU multi-threading.
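The "one bubble in eight cycles" question can actually be worked through. On one plausible reading of the public figures (a 1024-bit read path plus a 1024-bit write path at the 853 MHz GPU clock, with writes losing one cycle in eight; these assumptions are pieced together from the Hot Chips and Digital Foundry numbers, not an official breakdown), the arithmetic lands close to both quoted peaks:

```python
clock_hz = 853e6   # Xbox One GPU/ESRAM clock
path_bytes = 128   # 1024-bit path = 128 bytes per cycle, each direction

read_gbs = clock_hz * path_bytes / 1e9            # reads usable every cycle
write_gbs = clock_hz * path_bytes * 7 / 8 / 1e9   # writes lose 1 cycle in 8
peak_gbs = read_gbs + write_gbs

# ~109.2 GB/s read-only and ~204.7 GB/s combined peak, close to the
# quoted 109/204 GB/s figures; the measured 140-150 GB/s is then real
# code failing to keep both directions busy every cycle.
assert abs(read_gbs - 109.2) < 0.1
assert abs(peak_gbs - 204.7) < 0.1
```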

#521 Spitfire-Six
Member since 2014 • 1378 Posts

And just like that, everyone got served by someone who knows programming. Side note: I'm starting in CS, and I've been told it's better to start with C# instead of VB and C++. Truth?

#523 lostrib
Member since 2009 • 49999 Posts

@rocky_denace said:

when can we expect your next meltdown? I'd like to clear my schedule ahead of time

#524 daveg1
Member since 2005 • 20405 Posts

I'm a dev and have worked on some of this and last gen's biggest titles...

And I can tell you Stormy Joe is talking out of his butt... just like I just did above.

#525 lglz1337
Member since 2013 • 4959 Posts

@rocky_denace: calm down, you take this place way too seriously. You're like a firework exploding in every thread.

#526 HalcyonScarlet
Member since 2011 • 13664 Posts

@daveg1 said:

im a dev and have worked on some of this and last gens biggest titles...

and i can tell you stormy joe is talkin with his butt... just like i just did above.

Always sceptical when someone says this. Why would you be wasting your time in SWs? Aren't there better forums for devs than this cow/lem mud-slinging hole?

If it was the older SWs, when you actually had decent threads (cos all the shit ones got locked), I could understand. That was before the mods essentially said '**** it, do what you want'. You actually have to work hard to make a thread bad enough to get locked now.

#527 MK-Professor
Member since 2009 • 4214 Posts

@rocky_denace said:

@MK-Professor said:

@rocky_denace said:
@MK-Professor said:

DX12 will only help eliminate the CPU bottleneck, just like Mantle, so no performance improvement on the GPU side for PC. As for the Xbox One, DX12 will make no difference.

First off, I can't believe this thread is still going. Such ass clowns in here with no experience with any of this, yet acting like they are rocket scientists.

Listen, bra, first off the CPU in the X1 is not a bottleneck; in fact it's faster than the CPU in the PS4 because it's clocked higher. And yes, DX12 will help the CPU for sure, but what you clearly don't understand is that DX12 will make GPUs run much more efficiently and allow GPUs to use their CUs more efficiently. This has already been confirmed by MS, AMD, Intel, Nvidia and a respected developer who is currently working with the DX12 API. So please stop with your wishful butthurt thinking, because you are flat out wrong and have no experience with this whatsoever.

U Mad Bra

First of all, you didn't even read my post: I never said it is a bottleneck for the Xbox One; I was talking about PC. Also, the performance advantages DX12 will bring on the GPU side will be next to nothing. Funny how you're telling me that I have no experience here, when I am probably the only guy here who has experience programming for DX9, DX11 and HLSL.

Have you worked with the new DX12 API? No, you apparently haven't, so you cannot speak to any of this, especially when a respected developer who is currently working with it has and says otherwise. Also, AMD has said it's better than Mantle, Intel is saying it's the biggest leap they have seen in years, and Nvidia has praised it too. So don't come back until you have touched DX12; until then you don't know squat about it, and it doesn't matter if you have worked with DX9/10/11, because a respected developer and the corporate graphics giants are saying otherwise and calling it a huge leap over anything we have seen before.

The only apparent downside is that it apparently pushes GPUs harder than ever before and they will run hot, but it's clearly obvious MS was prepared for this with the X1, because it has a huge fan and a huge heat sink; they knew what was up and what was coming.

THREAD!!!

companies do lie from time to time you know...

I am not going to discuss this any further. DX12 will bring the PC significantly better draw-call performance, so less CPU load, and pretty much no performance advantage on the GPU side. On the Xbox One, DX12 will not improve draw-call performance, because that was already very good with the existing API, and there are no real performance advantages on the GPU side either.

Now you can say/believe whatever you want, but time will tell that I am right.

#529 Tighaman
Member since 2006 • 1038 Posts

Man, you guys kill me. The guy who made Dynasty Warriors 8 on the PS4 says specifically that the 14+4 setup is very easy to use, so it's 1.3TF vs 1.4TF for graphics, but GDDR5 wants to go faster than the CPU can deliver. API changes will help the PS4 too, so DX12 will be great for the X1.

#530 tormentos
Member since 2003 • 33784 Posts

@rocky_denace said:

Just more FALSE nonsense, and again, just like the others, you have never worked with DX12 or its API dev tools, so once again you can go to the barn of not knowing SHIT!!!

And you are also clearly wrong: the current and launch dev tools on X1 were not good. This has already been confirmed by a respected developer, Rebellion, working on Sniper Elite III, who says he has worked with the new SDK kits coming soon for X1, that they are great and a huge improvement over the current dev tools, and that once the new SDK kits arrive the X1 should be hitting 1080p easily.

It's so amazing to me that these fanboys still spout false nonsense even in the face of facts, and even when respected developers are currently working with it. Just blind PS fanboys still saying no out of butthurt and fear.

Have you ever worked with DX?

Do you even freaking know the difference between DX11, Mantle and DX12?

Aren't you the same moron who was saying the Xbox One had an advantage because Mantle wasn't coming to PS4? Completely ignoring that Mantle is an API made to emulate console APIs, which DX12 also is?

Oh please, DX on Xbox One already has features of DX12... hahaha

There is no new SDK; the SDK just gets updated, and I am sure it already has been (in fact the last update already rolled out). Some DX12 features, like bundles, have been in use since launch, on launch games... lol

You don't get it, do you? DX12 is MS's glorified Mantle... hahaha

Much like the worthless cloud...

@Tighaman said:

Man, the guy who made Dynasty Warriors 8 on the PS4 says specifically that it is a 14+4 setup, so it's 1.3 TF vs 1.4 TF for graphics, but GDDR5 wants to go faster than the CPU can deliver. API changes will help the PS4 too, so DX12 will be great for the X1.

Link? Because from what I read, the PS4 has 30 CUs, 12 of them disabled, to be re-enabled once compute kicks into gear...

The Xbox One isn't even 1.3, it's 1.28 TF, with a 2% reservation still. Oh, and with what will the Xbox One make up the 400+ GFLOPS in compute?

I have told you like 10 times: it's 1.84 vs 1.28 TF. No matter what, those 4 CUs will not vanish and do nothing, dude. Stop grasping, you look desperate.

1.44 + 0.4 vs 1.28; no way around that...
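For anyone following the TFLOPS back-and-forth: the figures come from a standard formula for AMD GCN GPUs (CUs × 64 shader lanes × 2 FMA ops per cycle × clock). A quick sketch with the publicly known console clocks; the 14-CU line is only there to test the contested 14+4 reading:

```python
def peak_tflops(compute_units, clock_ghz, lanes=64, ops_per_cycle=2):
    """Peak single-precision TFLOPS of a GCN GPU."""
    return compute_units * lanes * ops_per_cycle * clock_ghz / 1000.0

ps4 = peak_tflops(18, 0.800)              # ~1.84 TFLOPS (18 CUs @ 800 MHz)
xb1 = peak_tflops(12, 0.853)              # ~1.31 TFLOPS (12 CUs @ 853 MHz)
ps4_graphics_14 = peak_tflops(14, 0.800)  # ~1.43 TFLOPS if only 14 CUs rendered

print(ps4, xb1, ps4_graphics_14)
```

So the oft-quoted 1.84 vs 1.31 is just CU count times clock, and even under the 14+4 reading, 14 CUs still out-rate the Xbox One's 12.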

#531 Edited By GravityX
Member since 2013 • 865 Posts

Uh oh, Sony thinks DX12 is on to something.

http://www.dualshockers.com/2014/04/18/sony-working-on-improving-multithreading-and-parallel-processing-for-ps4-and-ps3/

Sony Working on Improving Multithreading and Parallel Processing for PS4 and PS3

#532 Edited By PS4hasNOgames
Member since 2014 • 2620 Posts

@GravityX said:

Uh oh, Sony thinks DX12 is on to something.

http://www.dualshockers.com/2014/04/18/sony-working-on-improving-multithreading-and-parallel-processing-for-ps4-and-ps3/

Sony Working on Improving Multithreading and Parallel Processing for PS4 and PS3

Just imagine how good games will look by then... this gen has me excited. But actual gameplay hasn't changed much since the PS2 days, am I wrong?

#533 Tighaman
Member since 2006 • 1038 Posts

@tormentos: giantbomb.com, the dev from Dynasty Warriors 8 talking about the use of the CUs. He said that if they had had a faster clock and could use the whole 18 instead, they could have done more. And if you look at the Hot Chips diagram, you see that the two extra graphics command processors and two extra compute command processors are there for direct compute. And Sony is doing game sound on the GPU too. You have to include all of that; things are closer than you all make them out to be. When new engines come out, if the X1 is still struggling, we can come back to this subject, but until we stop cross-genning games you will never get the full story.

#534 tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@tormentos: giantbomb.com, the dev from Dynasty Warriors 8 talking about the use of the CUs. He said that if they had had a faster clock and could use the whole 18 instead, they could have done more. And if you look at the Hot Chips diagram, you see that the two extra graphics command processors and two extra compute command processors are there for direct compute.

And Sony is doing game sound on the GPU too. You have to include all of that; things are closer than you all make them out to be. When new engines come out, if the X1 is still struggling, we can come back to this subject, but until we stop cross-genning games you will never get the full story.

LINK....

No, Sony is not doing sound on the GPU, dude, stop inventing crap. Ray tracing for audio is not the same as simple audio, and it would require resources from outside the sound block, which is mainly for Kinect voice recognition in the first place.

On PS4, audio is handled by AMD TrueAudio hardware... lol

@GravityX said:

Uh oh, Sony thinks DX12 is on to something.

http://www.dualshockers.com/2014/04/18/sony-working-on-improving-multithreading-and-parallel-processing-for-ps4-and-ps3/

Sony Working on Improving Multithreading and Parallel Processing for PS4 and PS3

Sony is always working on improving its tools, and the article seems a little misleading, since the job listing isn't asking specifically for what the article claims. It requires those fields, but that doesn't mean it's for that purpose, although it is well known that Sony will keep improving its SDK.

I have said this many times already: anything the Xbox One can do via API, the PS4 can do as well. So can I now claim double performance for the PS4 too, like you have been doing?

#535 LJS9502_basic
Member since 2003 • 178845 Posts

You guys just keep eating up the PR......hilarious.

#538 tormentos
Member since 2003 • 33784 Posts

@FastRobby said:

Are you a credible developer? As long as we don't see developers saying this is huge and will have a serious impact, you can't say anything. But then again, in your world you actually think you know more than the developers...

Are you? Some developers are even making fun of this... hahaha

And one from Activision actually asked why people go on record to say things like that. You know why he said that, right? Because once you go on record, you had better deliver, or it comes back to haunt you, which has been the case for MS since before launch and which has lost them the market.

All the crap about ESRAM, hyping Tiled Resources as exclusive, lol, 15 processors, the whole "balance" crap...

All of it has bitten them back. They claimed that by launch the games would speak for themselves, and they did: either MS was completely unaware of how games performed for third parties (which I doubt) or they pretty much lied to people to get them to bite and buy an Xbox One.

@rocky_denace said:

OWNED!!! Go home, kids, mommy has dinner ready for you. Hurry up, because daddy wants to tap mommy again and make more stupid blind fanboy kids.

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games

Oh, that has been posted here like 10 times... hahaha

There are already games that run the same on both: NFS, NBA, Tomb Raider, all 1080p. He wasn't saying anything new; he was being polite while at the same time telling you that, yeah, ESRAM is too small. It is too small now and it will be too small in 10 years; as games get more demanding it will make an even bigger dent in the Xbox One.

Oh, and most games, I am sure, will be made with the PS4 in mind, since it is the more straightforward hardware, and then scaled down to Xbox One. Hell, even those made on PC will scale to PS4 without much problem; on Xbox One the ESRAM is a problem for that.

You are not a developer, you haven't worked with DX either, and you are just butthurt that your console is getting kicked. Keep the dream of the secret sauce alive... lol

#539 Edited By slimdogmilionar
Member since 2014 • 1343 Posts

@tormentos said:

Have you ever worked with DX?

Do you even freaking know the difference between DX11, Mantle and DX12?

Aren't you the same moron who was saying the Xbox One had an advantage because Mantle wasn't coming to PS4? Completely ignoring that Mantle is an API made to emulate console APIs, which DX12 also is?

Oh please, DX on Xbox One already has features of DX12... hahaha

There is no new SDK; the SDK just gets updated, and I am sure it already has been (in fact the last update already rolled out). Some DX12 features, like bundles, have been in use since launch, on launch games... lol

You don't get it, do you? DX12 is MS's glorified Mantle... hahaha

Much like the worthless cloud...

@Tighaman said:

Man, the guy who made Dynasty Warriors 8 on the PS4 says specifically that it is a 14+4 setup, so it's 1.3 TF vs 1.4 TF for graphics, but GDDR5 wants to go faster than the CPU can deliver. API changes will help the PS4 too, so DX12 will be great for the X1.

Link? Because from what I read, the PS4 has 30 CUs, 12 of them disabled, to be re-enabled once compute kicks into gear...

The Xbox One isn't even 1.3, it's 1.28 TF, with a 2% reservation still. Oh, and with what will the Xbox One make up the 400+ GFLOPS in compute?

I have told you like 10 times: it's 1.84 vs 1.28 TF. No matter what, those 4 CUs will not vanish and do nothing, dude. Stop grasping, you look desperate.

1.44 + 0.4 vs 1.28; no way around that...

Don't you mean 32 ROPs and 18 CUs balanced at 14+4? I know people will have their own opinions, but my question is: why is the PS4 not crushing the Xbox in graphics, even with devs not being able to fully use ESRAM? Most of what I hear is that the PS4 sacrifices gameplay for resolution, seeing as how most games on Xbox play better according to some people. So what if AMD, Nvidia, and M$ all move towards these new APIs? Where does that leave PS4? It leaves them using outdated development tech.

M$'s cloud is worthless? Really? Sony's cloud is less complicated, with only 8,000 servers, and they have yet to crank it up. M$ has a complicated cloud infrastructure like Google's (which Sony doesn't understand), which nobody believes in despite the fact that it's working now, and not just on Xbox One: businesses are actually using M$'s and Google's cloud compute servers. Anything Google does makes money, and M$ followed suit, and it's paying off enough that they now have commercials advertising the cloud for businesses. Yep, the "gimmick" is right in everyone's face again, sort of like when M$ got flamed for introducing broadband gaming to the industry. Too bad that didn't work out for them like everyone said. Oh wait, broadband gaming is the standard for online gaming now, just like the Xbox 360's unified architecture became the standard for gaming consoles after Microsoft did it. Besides, AMD has already said DX12 is better and that Mantle will work hand in hand with it. Not to mention the Xbox is built like a PC, with a discrete GPU and CPU that have separate RAM pools.

Sony would not be hiring someone to do this now if M$ hadn't made DX exclusive, or if they didn't know these major companies were going to shift to these new APIs.

http://www.dsogaming.com/news/amd-on-why-mantle-dx12-is-the-result-of-common-goals/

"Microsoft has been working with GPU hardware manufacturers to find new ways the GPU can be enabled to do the best rendering techniques with great quality and performance, and DX12 is the fruit of that collaboration. DX12 aims to fully exploit the GPUs supporting it. In addition, DX12 was the result of common goals, as everyone participating in this program wanted to resolve performance issues that could not be resolved otherwise."

So DX12 was not done by M$ alone; AMD and Nvidia also had a hand in it. Imagine that: M$ bringing competing companies together to create something better.

I'm hoping to be able to play games at higher FPS on my outdated CrossFire setup when DX12 releases.
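Setting the marketing aside, the technical dividing line for cloud compute in this exchange is latency: only work whose deadline exceeds a network round trip can leave the console. A minimal sketch; the 80 ms round trip is an assumed, illustrative figure:

```python
ROUND_TRIP_MS = 80.0  # assumed internet round-trip latency, for illustration

def can_offload(deadline_ms, round_trip_ms=ROUND_TRIP_MS):
    """A task can run on a remote server only if its result
    can arrive back before its deadline."""
    return deadline_ms > round_trip_ms

frame_budget = 1000.0 / 30.0      # 33.3 ms: per-frame rendering work
ai_replan = 5000.0                # seconds-scale AI planning or baked lighting

print(can_offload(frame_budget))  # False - must stay on the console
print(can_offload(ai_replan))     # True - a candidate for the cloud
```

This is why the claims that hold up (dedicated servers, baked lighting, slow AI) are all latency-tolerant, while per-frame rendering boosts are not.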

#541 Edited By scatteh316
Member since 2004 • 10273 Posts

@rocky_denace: You tool. All consoles go through tweaks and optimisations over their life span, so this is not ownage at all.

Just the natural evolution of APIs and game engines, which always happens.

#543 tormentos
Member since 2003 • 33784 Posts

@slimdogmilionar said:

Don't you mean 32 ROPs and 18 CUs balanced at 14+4? I know people will have their own opinions, but my question is: why is the PS4 not crushing the Xbox in graphics, even with devs not being able to fully use ESRAM? Most of what I hear is that the PS4 sacrifices gameplay for resolution, seeing as how most games on Xbox play better according to some people. So what if AMD, Nvidia, and M$ all move towards these new APIs? Where does that leave PS4? It leaves them using outdated development tech.

M$'s cloud is worthless? Really? Sony's cloud is less complicated, with only 8,000 servers, and they have yet to crank it up.

M$ has a complicated cloud infrastructure like Google's (which Sony doesn't understand), which nobody believes in despite the fact that it's working now, and not just on Xbox One: businesses are actually using M$'s and Google's cloud compute servers. Anything Google does makes money, and M$ followed suit, and it's paying off enough that they now have commercials advertising the cloud for businesses.

Yep, the "gimmick" is right in everyone's face again, sort of like when M$ got flamed for introducing broadband gaming to the industry. Too bad that didn't work out for them like everyone said. Oh wait, broadband gaming is the standard for online gaming now, just like the Xbox 360's unified architecture became the standard for gaming consoles after Microsoft did it. Besides, AMD has already said DX12 is better and that Mantle will work hand in hand with it. Not to mention the Xbox is built like a PC, with a discrete GPU and CPU that have separate RAM pools.

Sony would not be hiring someone to do this now if M$ hadn't made DX exclusive, or if they didn't know these major companies were going to shift to these new APIs.

http://www.dsogaming.com/news/amd-on-why-mantle-dx12-is-the-result-of-common-goals/

"Microsoft has been working with GPU hardware manufacturers to find new ways the GPU can be enabled to do the best rendering techniques with great quality and performance, and DX12 is the fruit of that collaboration. DX12 aims to fully exploit the GPUs supporting it. In addition, DX12 was the result of common goals, as everyone participating in this program wanted to resolve performance issues that could not be resolved otherwise."

So DX12 was not done by M$ alone; AMD and Nvidia also had a hand in it. Imagine that: M$ bringing competing companies together to create something better.

I'm hoping to be able to play games at higher FPS on my outdated CrossFire setup when DX12 releases.

First of all, the PS4 is crushing the Xbox One...

The difference between the PS4 and Xbox One should not be more than 20 FPS in the worst case at the same resolution, mostly 15-17 FPS; the PS4 is kicking the living ass out of the Xbox One.

Do you know the power it takes for one GPU to double another resolution-wise? Yeah, I am sure you have no idea, just like the kind of power it takes for one GPU to double another in frames...

Let's see.

Tomb Raider is 1080p on both Xbox One and PS4, but there the similarity stops: the PS4 version runs at 60 FPS in many instances, the Xbox One version at just 30 FPS. That's a 100% difference; the PS4 version is 20 FPS faster on average.

Look at which GPUs the Xbox One and PS4 are closest to in frames: the difference between them should be like 15 frames, a little higher than 7770 vs 7850, yet the difference is as big as 7950 vs 7770. Why, and how is that possible?

That bolded part is completely stupid and has nothing to do with this argument. Ryse is a sh** game and looks good; one isn't tied to the other. Infamous looks better, is more open, runs at 1080p, doesn't drop to 17 frames like Ryse, and is a better game too.

The PS4 is not 14+4, so you are an alt account of the other joke posters here who insist on the 14+4 crap.

The PS4 is 18 CUs, the Xbox One is 12; nothing can change that. Even if it were true, using 4 CUs for compute is something the Xbox One would need to match, and the sad part is that if 4 of the Xbox One's CUs were used for compute to level the playing field, the Xbox One would end up with 8 CUs for rendering. Maybe that is why Ghosts is 720p on Xbox One and 1080p on PS4... lol. Same with MGS5: 1080p 60 FPS on PS4 with extra effects, 720p on Xbox One, and that game came out a month ago...

Sony's cloud isn't there to magically boost performance like MS wanted people to believe, only to own themselves when developers said it was mostly for dedicated servers, and maybe baked lighting or AI, things that don't need to be constantly refreshed. Mind you, the majority of Xbox One games don't use the cloud for AI, because that would mean anyone who bought those games must be online.

MS introduced what? Hahaha... broadband gaming... to the console market?

hahaha...

The PS2 did that before the Xbox, you buffoon... hahaha

Since before launch, Sony stated that the PS2 would connect to the internet to play games online via an add-on, which came in 56k and broadband versions. It was released before Xbox Live was, dude; in fact, the first game to run on the broadband network adapter was released before the adapter itself: THPS3. You know that game came out in 2001, before the Xbox was even out, right?

"AMD has released two new videos in which the red team reveals the reasons why both Mantle and DirectX 12 are really important to PC gaming. As AMD noted, both Mantle and DirectX 12 will benefit existing customers, and will offer better performance for all games supporting them."

From your own link, selective reader...

If you don't get the drift... lol. LibGNM is like Mantle... lol

By the way, most of those performance issues are not found on PS4, because those issues were created by DX in the first place... hahaha

"PS4's library and Mantle are similar

The good news is that the Mantle API will be similar to the PS4 library. This was revealed at AMD's APU13 presentation, and in addition they'll be pushing Frostbite's design based on these key pieces of technology. The rep speaking also mentioned that the Mantle API and the PS4's library are going to be much closer together than Mantle and Microsoft's DirectX 11. DX11, as we know, has many issues, and a lot of the problems revolving around PC performance are currently directly linked to DX11. The similarity between Mantle and the PS4 isn't the only one for AMD; there was news recently that AMD's TrueAudio technology has more than a little in common with the PS4's Audio DSP (article).

AMD's Guennadi Riguer was speaking during the developer summit and said that the current APIs are only capable of getting games developers 'so far'."

http://www.redgamingtech.com/amd-mantle-works-on-nvidia-gpus-100000-draw-calls-similar-to-ps4-api-library/

But but but DX12... Sony is copying... hahaha

The common goal between DX12 and Mantle is being low-level and getting rid of PC bottlenecks created by earlier DX versions in the first place. DX was never an ace vs the PS3; it will not be now that the hardware is friendlier... hahaha

But like I already told you, keep the dream alive... lol

#545 Edited By lostrib
Member since 2009 • 49999 Posts

I see the meltdown is going strong in this thread

#546 tormentos
Member since 2003 • 33784 Posts

@rocky_denace said:

LOL, love how these cows try to slip in things that aren't true.

First off, TR on PS4 is not locked at 60fps; in FACT it bounces around all over, anywhere from the mid 30s to the 50s, and in FACT some have suggested that a bouncing frame rate like that is not a likeable experience, and that the X1 version, which stays closer to 30, can be the better experience. This was also the case with Infamous, which bounces its frame rate around on that weak-sauce CPU in the PS4. The Infamous 30fps patch is an option.

It averages 50 FPS, just like the PC averages, which also dip; in many instances it is a sustained 60 FPS. Oh, and did I mention that the Xbox One has lower-quality effects as well?

"The bottom line is that the differences between 40-50fps on the PS4 are far less of an issue than, say, the 24-30fps drops incurred by the Xbox One."

"For the most part the main graphical bells and whistles are lavished equally across both consoles, although intriguingly there are a few areas that do see Xbox One cutbacks. As demonstrated in our head-to-head video below (and in our vast Tomb Raider comparison gallery), alpha-based effects in certain areas give the appearance of rendering at half resolution - though other examples do look much cleaner. We also see a lower-quality depth of field in cut-scenes, and reduced levels of anisotropic filtering on artwork during gameplay. Curiously, there are also a few lower-resolution textures in places on Xbox One."

http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-edition-next-gen-face-off

The difference isn't only frame-wise, it is also effects-wise... lol
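The Digital Foundry quote above is easier to see in frame times, since perceived stutter tracks milliseconds per frame rather than raw FPS:

```python
def frame_time_ms(fps):
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

# A 60 -> 50 fps dip adds ~3.3 ms per frame...
ps4_dip = frame_time_ms(50) - frame_time_ms(60)
# ...while a 30 -> 24 fps dip adds ~8.3 ms per frame, which is why the
# smaller-looking FPS delta is described as the bigger issue.
xb1_dip = frame_time_ms(24) - frame_time_ms(30)

print(ps4_dip, xb1_dip)
```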

#547 Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

Lems have been udderly decimated.

#548 Tighaman
Member since 2006 • 1038 Posts

@tormentos: damn, Torm, you sound mad. We have docs leaked from Sony that say 14+4 CUs is the balance, and now we have a real dev on a real game saying it is a 14+4 CU setup; why can't you just accept that? I've been saying from the get-go that old DX9/DX10 engines will not fully use a DX11.2 architecture. New engines won't be about brute force; they will be about putting the right data in the right place: data flow.

The PS3 architecture was ahead of its time and MS knew it, but they had a stronghold on the industry, so the devs went with MS... So Sony thought this gen was going to stay brute-force, but the X1 and DX12 are about data flow, the supercomputer-style PS3 architecture.

#549 Martin_G_N
Member since 2006 • 2124 Posts

@Tighaman:

The X1 is not close to being similar to the PS3's architecture. The PS4 and the X1 are basically the same, but with different memory setups and GPUs. The X1 has the smaller GPU, since it uses ESRAM to help its slower memory bandwidth, and that takes up space on the die.

The PS3 had a powerful CPU that could help the GPU. Now and in the future it is the GPU that will do most of the work, even helping the CPU, and this is how the PS4 and X1 are designed.