What's up with these DirectX12, OpenGL, Mantle APIs?

#1 Edited by koreform (30 posts) -

Okay, so I consider myself an enthusiast, but there is something I've never understood: what are DirectX 12, OpenGL, and this newfangled Mantle API? From what little I gather, it's kind of like a buffer system between the game program and the hardware of the computer, and it lets the high-level code of the game interact with the low-level language the GPU and CPU need to manipulate the hardware.
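To show how I picture it, here's a toy sketch in Python. Everything in it is made up (the class, the command names, the numbers) — it's just an illustration of the "translation layer" idea, not how a real driver works:

```python
# Toy model of a graphics API as a translation layer: the game speaks
# in friendly high-level calls, and the "driver" turns each one into
# the low-level commands a hypothetical GPU actually understands.

class ToyDriver:
    """Stands in for a graphics API like OpenGL or Direct3D."""

    def __init__(self):
        self.command_queue = []  # low-level commands headed for the GPU

    def draw_triangles(self, vertex_count):
        # One high-level call from the game expands into several
        # low-level steps before the actual draw happens.
        self.command_queue.append(("VALIDATE_STATE",))
        self.command_queue.append(("BIND_BUFFERS",))
        self.command_queue.append(("DRAW", vertex_count))

# "Game code" only ever talks to the API layer, never to the GPU directly:
driver = ToyDriver()
driver.draw_triangles(vertex_count=3)
print(len(driver.command_queue))  # one API call became 3 GPU commands
```

The point is just that the game never touches the hardware itself — the API decides how much bookkeeping happens per call, which is where the "overhead" people talk about comes from.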

What I really want to know, though, is what the difference is between the three I mentioned. I am also curious whether AMD Mantle and DirectX 12 will indeed make PC gaming more efficient, like consoles. Also, I always thought the reason consoles were more efficient (not better) was that they had simpler, lower-level APIs that got rid of a lot of clutter - but it turns out the XBONE uses DirectX too, so I am confused. What does the PS4 use, then?

I get frustrated because I spent $800 on two GTX 680s that should supposedly run laps around the GPUs inside next-gen consoles, but they barely outperform them in terms of actual gameplay graphics(?). (Edit: I played AC IV: Black Flag on the PS4 and on my computer, and I noticed that the PS4 version ran smoother than the PC version. When I said gameplay, I meant that the overall gaming experience, not just the graphics, was better on the console because the game seemingly ran more efficiently on it. But English isn't my first language, and I will concede if I was utterly wrong to use the word the way I did.) I would never trade my PC for a console (if I had to pick), but I feel there could be some improvement. I mean, the entire PS4 console costs $400. Maybe Mantle and DirectX 12 can solve this problem?

Note: I don't know if this question has already been asked or discussed in a different thread; the search function is broken (thank you, GameSpot).

#2 Edited by FelipeInside (26325 posts) -

@koreform said:

I get frustrated because I spent $800 on two GTX 680s that should supposedly run laps around the GPUs inside next-gen consoles, but they barely outperform them in terms of actual gameplay.

#3 Posted by Horgen (111134 posts) -

@FelipeInside said:

@koreform said:

I get frustrated because I spent $800 on two GTX 680s that should supposedly run laps around the GPUs inside next-gen consoles, but they barely outperform them in terms of actual gameplay.

TC... Gameplay=/=graphics...

DX12, Mantle, and OpenGL are ways to talk with the hardware in the GPU, I believe. They have next to nothing to do with the gameplay; that's up to the game devs.

#4 Posted by PredatorRules (8854 posts) -

@koreform said:

Okay, so I consider myself an enthusiast, but there is something I've never understood: what are DirectX 12, OpenGL, and this newfangled Mantle API? From what little I gather, it's kind of like a buffer system between the game program and the hardware of the computer, and it lets the high-level code of the game interact with the low-level language the GPU and CPU need to manipulate the hardware.

What I really want to know, though, is what the difference is between the three I mentioned. I am also curious whether AMD Mantle and DirectX 12 will indeed make PC gaming more efficient, like consoles. Also, I always thought the reason consoles were more efficient (not better) was that they had simpler, lower-level APIs that got rid of a lot of clutter - but it turns out the XBONE uses DirectX too, so I am confused. What does the PS4 use, then?

I get frustrated because I spent $800 on two GTX 680s that should supposedly run laps around the GPUs inside next-gen consoles, but they barely outperform them in terms of actual gameplay. I would never trade my PC for a console (if I had to pick), but I feel there could be some improvement. I mean, the entire PS4 console costs $400. Maybe Mantle and DirectX 12 can solve this problem?

Note: I don't know if this question has already been asked or discussed in a different thread; the search function is broken (thank you, GameSpot).

If you hate that you've spent more than double the cost of a brand-new console on GPUs alone, you should tell game devs to stop making lazy-ass ports for PC and to focus their main effort on PC instead of the opposite.

You can't really blame them either, because if I had a gaming company I'd go where there's more money flowing.

#5 Posted by kraken2109 (13229 posts) -

The most recent AC games have been really badly optimised, try something else.

#6 Posted by wis3boi (32002 posts) -

stop using half assed ports as benchmarks

#7 Posted by Jawad2007 (435 posts) -

Bro, wait and see The Witcher 3 - it will run like crap on consoles and it will shine on PC. Most developers focus on consoles because games sell like 4x more there than on PC, and making a game for every platform needs a lot of time.

Our world sucks now; it wants money, nothing more, nothing less. I work 16 hours every day for like $1000/month... quantity over quality :( Sad world, tbh...

#8 Posted by koreform (30 posts) -

Alright, AC might not be the best example. Other games I have trouble running above 30 FPS at all points are Crysis 3 and The Witcher 2. And you don't have to tell me that developers care more about consoles than PC; I know. But that wasn't my point - can somebody actually answer my original question about the APIs? I didn't post this to compare PCs with consoles; I was wondering if anyone had insight to share about DirectX 12, OpenGL and Mantle.

#9 Posted by RaZoR500 (350 posts) -

@koreform: You can't run The Witcher 2 above 30 fps with two GTX 680s? I can get 60 fps most of the time with a single GTX 680. What resolution are you playing at, and what are your specs? I think something is wrong with your PC, or you are doing something terribly wrong.

#10 Posted by JigglyWiggly_ (23855 posts) -

I get over 100 fps in The Witcher 2 at 1080p on high/max (no ubersampling) on my GTX 670s in SLI.

#11 Edited by FelipeInside (26325 posts) -

@koreform said:

Alright, AC might not be the best example. Other games I have trouble running above 30 FPS at all points are Crysis 3 and The Witcher 2. And you don't have to tell me that developers care more about consoles than PC; I know. But that wasn't my point - can somebody actually answer my original question about the APIs? I didn't post this to compare PCs with consoles; I was wondering if anyone had insight to share about DirectX 12, OpenGL and Mantle.

Something's wrong there, dude. Have you tried disabling SLI?

I get much more than 30 FPS in The Witcher 2 and Crysis 3, and I run just a single 680.

#12 Posted by koreform (30 posts) -

So nothing on the APIs? No big deal, I guess. I run all my games at 1920x1080, and they do run over 30 FPS at most points in the game - I specifically said they don't hold the FPS at "all points". So basically I get drops below 30 FPS during graphics-intense scenes. I do run The Witcher 2 with ubersampling, and while I've read reviews saying my setup should hold the game steadily over 60 FPS, I don't get the same results. And @FelipeInside, I would be surprised if a single GTX 680 could really run the game at a steady 30 FPS with all DX11 features and MSAA 8x turned on; most benchmarks online seem to struggle to keep 30 FPS. My game tanked to 20 FPS, which was one of the reasons I got a second card in the first place. And you guys could be right, there could be something I am doing wrong - but as far as I know, my SLI is working and I still get moments when games drop below 30 FPS.

You are probably going to tell me that console games aren't run the way I run mine, and that's true. But that wasn't the point of this discussion - I didn't want to start another pointless PC/console comparison; I wanted to know if newer APIs like DirectX 12 and Mantle could make PC games run more efficiently. (The only reason I mentioned AC was that I was trying to defend why I used the word "gameplay" to describe my frustration with the occasionally low FPS.) Yes, there is the problem of poorly ported games, and there probably always will be, but nevertheless many developers have said that consoles run games better on average because of their low-level APIs.

I was simply trying to start a mature conversation about an interesting topic, but I guess making fun of me is more fun for the crowds. I don't really mind it, but I am a bit disappointed at the lack of interest in the technical stuff here at GameSpot. Maybe you guys could point me to a different forum or website better suited for such discussions? (I use HardOCP, but people over there seem to care less about the games than the hardware...)

#13 Edited by MlauTheDaft (4518 posts) -

Something is wrong with your setup. I get better results with a single GTX 770, and that's a rebranded 680 with increased memory speed, if I recall correctly.

Edit:

I'm not even sure what you're actually asking about in regards to the different APIs. Here's a link to a recent Ars Technica article about OpenGL 4.5 for you.

#14 Posted by Horgen (111134 posts) -

@koreform said:

So nothing on the APIs? No big deal, I guess. I run all my games at 1920x1080, and they do run over 30 FPS at most points in the game - I specifically said they don't hold the FPS at "all points". So basically I get drops below 30 FPS during graphics-intense scenes. I do run The Witcher 2 with ubersampling, and while I've read reviews saying my setup should hold the game steadily over 60 FPS, I don't get the same results. And @FelipeInside, I would be surprised if a single GTX 680 could really run the game at a steady 30 FPS with all DX11 features and MSAA 8x turned on; most benchmarks online seem to struggle to keep 30 FPS. My game tanked to 20 FPS, which was one of the reasons I got a second card in the first place. And you guys could be right, there could be something I am doing wrong - but as far as I know, my SLI is working and I still get moments when games drop below 30 FPS.

You are probably going to tell me that console games aren't run the way I run mine, and that's true. But that wasn't the point of this discussion - I didn't want to start another pointless PC/console comparison; I wanted to know if newer APIs like DirectX 12 and Mantle could make PC games run more efficiently. (The only reason I mentioned AC was that I was trying to defend why I used the word "gameplay" to describe my frustration with the occasionally low FPS.) Yes, there is the problem of poorly ported games, and there probably always will be, but nevertheless many developers have said that consoles run games better on average because of their low-level APIs.

I was simply trying to start a mature conversation about an interesting topic, but I guess making fun of me is more fun for the crowds. I don't really mind it, but I am a bit disappointed at the lack of interest in the technical stuff here at GameSpot. Maybe you guys could point me to a different forum or website better suited for such discussions? (I use HardOCP, but people over there seem to care less about the games than the hardware...)

Ah.. Now I think I understand a bit more.

DX12 is supposed to be more efficient by lowering the CPU overhead, or something like that, meaning a weaker CPU will perform better. A good CPU won't benefit as much, since it already has enough power for the game, though a slight increase in minimum FPS might still happen. Mantle is supposed to do that for an AMD CPU/GPU combo... but I think it still only benefits a setup with an R9 290X and a weak AMD quad-core CPU.
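To put rough numbers on the "lowering the overhead" idea, here's a toy model in Python. All the costs are invented units of CPU work, not real measurements - it's only meant to show why a thinner API helps weak CPUs most:

```python
# Toy model of per-draw-call CPU overhead. A frame submits thousands of
# draw calls; the API/driver adds a fixed cost to each one. A lower-level
# API (DX12/Mantle-style) shrinks that fixed cost.

OVERHEAD_OLD_API = 10   # made-up CPU work the driver adds per draw call
OVERHEAD_NEW_API = 2    # same, with a thinner API layer
WORK_PER_DRAW = 5       # made-up CPU work the game itself needs per draw

def cpu_cost(draw_calls, api_overhead):
    """Total CPU work to submit one frame of `draw_calls` draws."""
    return draw_calls * (WORK_PER_DRAW + api_overhead)

frame_draws = 2000
old_cost = cpu_cost(frame_draws, OVERHEAD_OLD_API)  # 2000 * 15 = 30000
new_cost = cpu_cost(frame_draws, OVERHEAD_NEW_API)  # 2000 *  7 = 14000
print(old_cost, new_cost)  # same frame, less than half the CPU work
```

A fast CPU absorbs either cost within its frame budget, which is why the gain mostly shows up on weaker CPUs or in very draw-call-heavy scenes - the GPU itself isn't doing anything different.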

#15 Posted by Kh1ndjal (2554 posts) -

@horgen said:

@koreform said:

So nothing on the APIs? No big deal, I guess. I run all my games at 1920x1080, and they do run over 30 FPS at most points in the game - I specifically said they don't hold the FPS at "all points". So basically I get drops below 30 FPS during graphics-intense scenes. I do run The Witcher 2 with ubersampling, and while I've read reviews saying my setup should hold the game steadily over 60 FPS, I don't get the same results. And @FelipeInside, I would be surprised if a single GTX 680 could really run the game at a steady 30 FPS with all DX11 features and MSAA 8x turned on; most benchmarks online seem to struggle to keep 30 FPS. My game tanked to 20 FPS, which was one of the reasons I got a second card in the first place. And you guys could be right, there could be something I am doing wrong - but as far as I know, my SLI is working and I still get moments when games drop below 30 FPS.

You are probably going to tell me that console games aren't run the way I run mine, and that's true. But that wasn't the point of this discussion - I didn't want to start another pointless PC/console comparison; I wanted to know if newer APIs like DirectX 12 and Mantle could make PC games run more efficiently. (The only reason I mentioned AC was that I was trying to defend why I used the word "gameplay" to describe my frustration with the occasionally low FPS.) Yes, there is the problem of poorly ported games, and there probably always will be, but nevertheless many developers have said that consoles run games better on average because of their low-level APIs.

I was simply trying to start a mature conversation about an interesting topic, but I guess making fun of me is more fun for the crowds. I don't really mind it, but I am a bit disappointed at the lack of interest in the technical stuff here at GameSpot. Maybe you guys could point me to a different forum or website better suited for such discussions? (I use HardOCP, but people over there seem to care less about the games than the hardware...)

Ah.. Now I think I understand a bit more.

DX12 is supposed to be more efficient by lowering the CPU overhead, or something like that, meaning a weaker CPU will perform better. A good CPU won't benefit as much, since it already has enough power for the game, though a slight increase in minimum FPS might still happen. Mantle is supposed to do that for an AMD CPU/GPU combo... but I think it still only benefits a setup with an R9 290X and a weak AMD quad-core CPU.

I like how you've used the word "supposed", because that's exactly the word I would use until we see some real-world results.

#16 Edited by koreform (30 posts) -

For those who think there is something wrong with my system because it doesn't run Crysis 3 maxed out at 60+ fps, check this out. You will notice it at MSAA 8x. Also, if you search for Witcher 2 ubersampling on Google, you can find many others who tried and failed. The GTX 680 won't come anywhere close to 60 fps. My system is fine; it runs most games at 100+ fps, and the games I mentioned are the few I can't keep above 60 fps the whole time.

As for the original question, I guess you guys are right. There's no point in discussing things that aren't really out yet. I just wanted to know if my system would get a boost in the future without having to upgrade the hardware. Thanks for the input.

#17 Edited by FelipeInside (26325 posts) -

@koreform said:

For those who think there is something wrong with my system because it doesn't run Crysis 3 maxed out at 60+ fps, check this out. You will notice it at MSAA 8x. Also, if you search for Witcher 2 ubersampling on Google, you can find many others who tried and failed. The GTX 680 won't come anywhere close to 60 fps. My system is fine; it runs most games at 100+ fps, and the games I mentioned are the few I can't keep above 60 fps the whole time.

As for the original question, I guess you guys are right. There's no point in discussing things that aren't really out yet. I just wanted to know if my system would get a boost in the future without having to upgrade the hardware. Thanks for the input.

Then turn down MSAA and turn off ubersampling, for Christ's sake, and play the game at higher framerates. Putting MSAA at max doesn't make the game look any better unless you're playing at ridiculously high resolutions.

You're on PC now, not consoles. It's all about finding that sweet spot between good graphics and good performance. There isn't one PC game I have started without going into the graphics options and changing a few things first.

No one is making fun of you, unless you start saying things like 2 680s can't perform as well as a console.

As for the initial points: yes, there are different APIs that supposedly offer more performance (like Mantle), but they're not mainstream yet, and I've even heard some of them cause more problems. Just enjoy what's available now and don't worry about the future. If Mantle does become mainstream, by then you will probably have upgraded your PC anyway.

#18 Posted by koreform (30 posts) -

I was never complaining, you understand. I love my setup and I like pushing it to see its limits. I was just trying to make a point to the people who said there is something wrong with my system.

As for the APIs, if you haven't heard, DirectX 12 is going to be available to GeForce cards starting with the Fermi architecture, meaning it will support the 400/500/600/700 series that are out now. It will also support current AMD GCN cards. Since Mantle is also promising to deliver support for Nvidia cards, I am hoping it does the same as DX12. That's why I was excited about my current system getting a boost. The only thing I will probably have to upgrade is Windows.

I don't want this to turn into an argument, so I am going to stop here. Thanks for the input guys.