Mantle Performance Preview: LOOKS AWESOME!

This topic is locked from further discussion.

#1  Edited By ShadowDeathX
Member since 2006 • 11698 Posts

Video: http://www.youtube.com/watch?v=QIWyf8Hyjbg

Mantle heavily reduces the CPU’s workload

R9-290X + FX 8350 Base Build:

As the speaker discloses after the demo (at 37:18), the CPU is hardly being used at all. The developer showed the audience a before-and-after frame analysis of how Mantle affects the system's workload compared to DirectX. In the frame capture taken from the game running on DirectX, a sizable chunk of CPU time is consumed by the driver thread. In the second capture (the one using Mantle), that thread is simply gone.

Not only did switching to Mantle free up a lot of CPU time, but Mantle itself needed only a fraction of the CPU time that DirectX's driver thread required. This gives developers two options if they use Mantle: they can spend the reclaimed CPU time improving their game, or they can simply lower the CPU requirement. The viability of lower CPU requirements was verified further (at 39:55) when the speaker disclosed that even underclocked to 2GHz, the FX-8350 was still waiting for the GPU to finish any given frame. Typically, we see things the other way around.

http://wccftech.com/amd-mantle-demo-game-gpubound-cpu-cut-2ghz/
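
To make the driver-thread point concrete, here's a toy C++ sketch of the two submission models. To be clear, every type and number below is invented for illustration; this is not Mantle's or Direct3D's real API, just a model of where per-draw CPU time goes: a path that pays a validation cost on every draw, versus a Mantle-style path that records a command buffer once and submits it for almost nothing.

```cpp
// Toy model only: invented types, not the real Mantle or Direct3D API.
#include <chrono>
#include <cstdio>
#include <vector>

struct DrawCall { int mesh_id; int material_id; };

// DirectX-style path: the driver burns CPU time validating state on
// every single draw call (the "driver thread" work in the frame analysis).
void submit_with_driver_validation(const std::vector<DrawCall>& draws) {
    for (const DrawCall& d : draws) {
        volatile int busywork = 0;  // stand-in for validation/hazard tracking
        for (int i = 0; i < 5000; ++i) busywork = busywork + (i ^ d.mesh_id);
    }
}

// Mantle-style path: the app records a command buffer up front, so
// submission is little more than handing the GPU a pointer.
struct CommandBuffer { std::vector<DrawCall> recorded; };

CommandBuffer record(const std::vector<DrawCall>& draws) {
    return CommandBuffer{draws};  // validation cost paid once, at record time
}

void submit_prebuilt(const CommandBuffer& cb) {
    (void)cb;  // nothing per-draw left for the CPU to do
}

int main() {
    const std::vector<DrawCall> frame(10000, DrawCall{1, 2});

    auto t0 = std::chrono::steady_clock::now();
    submit_with_driver_validation(frame);
    auto t1 = std::chrono::steady_clock::now();
    submit_prebuilt(record(frame));
    auto t2 = std::chrono::steady_clock::now();

    auto us = [](auto a, auto b) {
        return static_cast<long long>(
            std::chrono::duration_cast<std::chrono::microseconds>(b - a).count());
    };
    std::printf("per-draw validation: %lld us\n", us(t0, t1));
    std::printf("prebuilt submission: %lld us\n", us(t1, t2));
}
```

A gap like that between the two paths is what would let an underclocked FX-8350 sit waiting on the GPU instead of the other way around.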

#2 Xtasy26
Member since 2008 • 5582 Posts

Great! Can't wait to see it in BF4.

#4 Gaming-Planet
Member since 2008 • 21064 Posts

@farrell2k said:

Developers now have a decision to make: keep writing everything for DirectX, or write Mantle for AMD cards and DirectX for Nvidia cards. I predict that they will just stick with DirectX rather than spending more time and money on two separate code bases for PC.

Maybe have Nvidia license Mantle? Unless this only works with the GCN architecture.

#6 GeryGo  Moderator
Member since 2006 • 12803 Posts

Does this mean the end of CPU-hog games such as Metro?

#7  Edited By Elann2008
Member since 2007 • 33028 Posts

@farrell2k said:

Developers now have a decision to make: keep writing everything for DirectX, or write Mantle for AMD cards and DirectX for Nvidia cards. I predict that they will just stick with DirectX rather than spending more time and money on two separate code bases for PC.

Goodbye DirectX!

#8 Geminon
Member since 2012 • 1177 Posts

All talk until we see actual real-world performance.

#9 JangoWuzHere
Member since 2007 • 19032 Posts

I don't understand why AMD would not keep this exclusive. Nvidia has no interest in making G-Sync available for AMD cards.

#10 Jebus213
Member since 2010 • 10056 Posts
@JangoWuzHere said:

I don't understand why AMD would not keep this exclusive. Nvidia has no interest in making G-Sync available for AMD cards.

G-Sync isn't the same thing.

#11  Edited By adamosmaki
Member since 2007 • 10718 Posts

So I might keep that Phenom II of mine a couple more years.

#12 kraken2109
Member since 2009 • 13271 Posts

Interesting

#13 JangoWuzHere
Member since 2007 • 19032 Posts

@Jebus213 said:
@JangoWuzHere said:

I don't understand why AMD would not keep this exclusive. Nvidia has no interest in making G-Sync available for AMD cards.

G-Sync isn't the same thing.

I know that.

#14  Edited By Jebus213
Member since 2010 • 10056 Posts
@JangoWuzHere said:

@Jebus213 said:
@JangoWuzHere said:

I don't understand why AMD would not keep this exclusive. Nvidia has no interest in making G-Sync available for AMD cards.

G-Sync isn't the same thing.

I know that.

Why not make it open source?

Devs aren't going to alienate their user base like that.

#15 HavocV3
Member since 2009 • 8068 Posts

@JangoWuzHere said:

I don't understand why AMD would not keep this exclusive. Nvidia has no interest in making G-Sync available for AMD cards.

Because AMD isn't interested in holding the industry back with proprietary BS.

That or it's because they know they can't get away with it like Nvidia can.

Either way, it's best if it's open. If it's as good as this thread leads us to believe, then you'd certainly want it to see more widespread use. That's better than it ending up like some of the gimmicky crap that only appears in a couple of games every year. *cough* TressFX, PhysX *cough*

#16 aihyah
Member since 2005 • 25 Posts

Coding close to the metal sounds good until you realize it's nonsense; it's backwards. When we first got 3D hardware, that's what we did. Remember 3dfx Glide? It was a fragmented market, and it was an awful mess. Higher-level APIs enabled universal development, and PC hardware evolution just picked up the slack; it still works that way. Coding close to the metal assumes nothing changes, and that's just not how things are on the PC. Will a few titles feature it? Sure, just as happened with Glide, but in the long run it won't matter.

#17 adamosmaki
Member since 2007 • 10718 Posts

@HavocV3 said:

@JangoWuzHere said:

I don't understand why AMD would not keep this exclusive. Nvidia has no interest in making G-Sync available for AMD cards.

Because AMD isn't interested in holding the industry back with proprietary BS.

That or it's because they know they can't get away with it like Nvidia can.

Either way, it's best if it's open. If it's as good as this thread leads us to believe, then you'd certainly want it to see more widespread use. That's better than it ending up like some of the gimmicky crap that only appears in a couple of games every year. *cough* TressFX, PhysX *cough*

On top of that, with GPU computing making the CPU even less important in games, any Intel advantage on that front will be lost, and that is a good thing for AMD's CPU department. I believe that is also part of the reason for not keeping this exclusive.

#18 Gaming-Planet
Member since 2008 • 21064 Posts

@farrell2k said:

@Gaming-Planet said:

@farrell2k said:

Developers now have a decision to make: keep writing everything for DirectX, or write Mantle for AMD cards and DirectX for Nvidia cards. I predict that they will just stick with DirectX rather than spending more time and money on two separate code bases for PC.

Maybe have Nvidia license Mantle? Unless this only works with the GCN architecture.

It's an API meant to work with AMD hardware. I am sure if it takes off, Nvidia will work on their own equivalent.

Apparently it could work on Nvidia cards if Nvidia chooses to adopt it: http://wccftech.com/amd-mantle-api-require-gcn-work-nvidia-graphic-cards/ So it's not a proprietary API. This is good for all gamers!

#19 blaznwiipspman1
Member since 2007 • 16539 Posts

Indeed, Nvidia is the only one jumping on proprietary BS. BTW, Mantle doesn't need to be adopted by the PC industry; if it takes off on consoles, which all feature AMD GPUs and AMD CPUs, it will transition over to the PC side very easily. So there are no separate costs in developing for the Mantle API and DirectX. Smart move by AMD.

#20 ronvalencia
Member since 2008 • 29612 Posts

@farrell2k said:

Developers now have a decision to make: keep writing everything for DirectX, or write Mantle for AMD cards and DirectX for Nvidia cards. I predict that they will just stick with DirectX rather than spending more time and money on two separate code bases for PC.

If they have a PS4 build, the dev can get that build and recycle it for Mantle.

#21 ronvalencia
Member since 2008 • 29612 Posts

@Gaming-Planet said:

@farrell2k said:

Developers now have a decision to make: keep writing everything for DirectX, or write Mantle for AMD cards and DirectX for Nvidia cards. I predict that they will just stick with DirectX rather than spending more time and money on two separate code bases for PC.

Maybe have Nvidia license Mantle? Unless this only works with the GCN architecture.

As mentioned in the lecture, Mantle was designed with enough of an abstraction layer to cover AMD's future GPU architectures, which means it could also work with NVIDIA's modern GPUs.

#22 Masculus
Member since 2009 • 2878 Posts

My body is ready.

#23  Edited By GarGx1
Member since 2011 • 10934 Posts

This is one reason I'm not upgrading my GPU until later next year.

I want to see where developers go with Mantle thrown into the mix. At the moment I'm still planning on going Maxwell, but that can change if AMD comes up with something competitive and Mantle proves to be a winner.

#24 nicecall
Member since 2013 • 528 Posts

I don't know much about Mantle... but if this actually works as well as it seems, you'd practically never have to upgrade your CPU again and could just do GPU upgrades? AMD is gonna clean house if they keep this only for their video cards... The only reasons I've been sticking with Nvidia are better drivers, PhysX, and G-Sync (who knows if that will pay off). But I guess if it works on Nvidia too, it's good both ways.

This could even hurt Intel... it makes AMD CPUs just fine for gaming if the GPU does all the work.

Upgrading my video card will wait for sure now...

#25 silversix_
Member since 2010 • 26347 Posts

Wow, all the years my 2500K @ 4.2 will last me now. Will this be available to the 800 series from Nvidia? If not, I really don't see the point of getting an Nvidia card from now on (if Mantle is actually used in more than 1/10 games).

#26 wis3boi
Member since 2005 • 32507 Posts

@silversix_ said:

Wow, all the years my 2500K @ 4.2 will last me now. Will this be available to the 800 series from Nvidia? If not, I really don't see the point of getting an Nvidia card from now on (if Mantle is actually used in more than 1/10 games).

No Nvidia support unless Nvidia decides to go ahead and take advantage of AMD's offer to use it (Mantle can work on both; Nvidia just needs to say yes). So for now it's only AMD 7xxx and 2xx.

#27  Edited By silversix_
Member since 2010 • 26347 Posts

@wis3boi said:

@silversix_ said:

Wow, all the years my 2500K @ 4.2 will last me now. Will this be available to the 800 series from Nvidia? If not, I really don't see the point of getting an Nvidia card from now on (if Mantle is actually used in more than 1/10 games).

No Nvidia support unless Nvidia decides to go ahead and take advantage of AMD's offer to use it (Mantle can work on both; Nvidia just needs to say yes). So for now it's only AMD 7xxx and 2xx.

Stupid-ass Nvidia... I'll miss PhysX in BL3, but if the performance boost with Mantle is as big as claimed, there's no point in getting the already overpriced Nvidia GPUs.

#28 MlauTheDaft
Member since 2011 • 5189 Posts

I smell a little PR, tbh... These kinds of numbers are usually inflated, of course, but I'm more curious about them making it available to Nvidia.

Something tells me it's somewhat more complicated, or less beneficial, for Nvidia to implement than AMD would like us to think.

Besides, doesn't Nvidia already have something similar in NVAPI? Maybe that's why they're not falling over themselves to adopt Mantle.

#29 Gammit10
Member since 2004 • 2397 Posts

@MlauTheDaft said:

I smell a little PR tbh.... These kinds of numbers are usually inflated of course, but I'm more curious about them making it available to Nvidia.

Something tells me, it's somewhat more complicated/less beneficial for Nvidia to implement, than AMD would like us to think.

Besides, does'nt Nvidia already have something similar in NVAPI? Maybe that's why they're not falling over themselves to adopt Mantle.

Agreed on all counts.

#30  Edited By ronvalencia
Member since 2008 • 29612 Posts

@MlauTheDaft said:

I smell a little PR tbh.... These kinds of numbers are usually inflated of course, but I'm more curious about them making it available to Nvidia.

Something tells me, it's somewhat more complicated/less beneficial for Nvidia to implement, than AMD would like us to think.

Besides, does'nt Nvidia already have something similar in NVAPI? Maybe that's why they're not falling over themselves to adopt Mantle.

NVAPI still relies on Direct3D.

#31 Marfoo
Member since 2004 • 6002 Posts

I'm skeptical about Mantle not requiring GCN to run. The benefits of a low-level API stem from the fact that its commands and functions are hardware-specific and optimized.

That said, the part where it acts as the layer between driver and card might work with Nvidia cards as well. If I understood the presentation correctly, Direct3D does a lot of redundant checks and handles threading poorly, whereas Mantle removes those checks and assigns threads much more intelligently, reducing the time it takes to actually get data to the GPU to draw a frame.

It's an interesting new tool, and if it takes off I can see Direct3D and OpenGL trying to follow suit with similar optimizations. Either way, it's good for the industry.
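
To make the threading half of that concrete, here's a small hypothetical C++ sketch (invented types again, not the real Mantle or Direct3D interfaces) of what smarter thread assignment buys you: each worker thread records its own command buffer with no shared driver lock, and the final submit is just a cheap, ordered handoff.

```cpp
// Hypothetical sketch: parallel command-buffer recording (Mantle-style)
// versus one immediate context. All types invented for illustration.
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

struct Command { int draw_id; };
struct CommandBuffer { std::vector<Command> cmds; };

// Each worker records its slice of the frame independently; no shared
// driver lock means the recording work scales across CPU cores.
CommandBuffer record_slice(int first, int count) {
    CommandBuffer cb;
    cb.cmds.reserve(count);
    for (int i = 0; i < count; ++i)
        cb.cmds.push_back(Command{first + i});
    return cb;
}

int main() {
    const int kDraws = 100000;
    const int kThreads = 4;
    std::vector<CommandBuffer> buffers(kThreads);
    std::vector<std::thread> workers;

    // Record in parallel: each thread owns exactly one command buffer.
    for (int t = 0; t < kThreads; ++t)
        workers.emplace_back([&buffers, t, kDraws, kThreads] {
            buffers[t] = record_slice(t * (kDraws / kThreads), kDraws / kThreads);
        });
    for (auto& w : workers) w.join();

    // Submission is a cheap, ordered handoff of the prebuilt buffers;
    // this is the step a single-threaded immediate context serializes.
    std::size_t total = 0;
    for (const auto& cb : buffers) total += cb.cmds.size();
    std::printf("submitted %zu prerecorded draws from %d threads\n",
                total, kThreads);
}
```

A classic immediate context would force all of that recording through one thread, which is exactly the driver-thread bottleneck the presentation showed disappearing.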