DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon

#1 Coseniath
Member since 2004 • 3183 Posts

Now this is news I never thought I would read:

From Tom's Hardware:

Exclusive: DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon

We have early information about some of the details regarding DirectX 12, and what follows will surprise you.

A source with knowledge of the matter gave us some early information about an "unspoken API," which we strongly infer is DirectX 12.

We first heard of DirectX 12 in 2013, and it finally appears to be around the corner. It's expected to launch in tandem with the upcoming Windows 10 operating system.

The new API will work much differently from older APIs, and it's common knowledge by now that it will be "closer to the hardware" than older APIs, similar to AMD's Mantle. This will bring massive improvements in framerates and latency, but that's not all that DirectX 12 has up its sleeve.

Explicit Asynchronous Multi-GPU Capabilities

One of the big things that we will be seeing is DirectX 12's Explicit Asynchronous Multi-GPU capabilities. What this means is that the API combines all the different graphics resources in a system and puts them all into one "bucket." It is then left to the game developer to divide the workload up however they see fit, letting different hardware take care of different tasks.
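As a rough illustration of what "explicit" means here (a hypothetical Python sketch of the concept, not actual DirectX 12 API code; the adapter names and task list are made up):

```python
# Hypothetical sketch of explicit multi-adapter work assignment:
# the API exposes every GPU in one pool ("bucket"), and the
# developer -- not the driver -- decides which GPU does what.
adapters = [
    {"name": "GeForce", "vram_gb": 4},
    {"name": "Radeon", "vram_gb": 4},
]

tasks = ["geometry", "shadows", "lighting", "post-processing"]

# A simple round-robin split chosen by the developer.
assignment = {task: adapters[i % len(adapters)]["name"]
              for i, task in enumerate(tasks)}

print(assignment)
```

The point is only that the division is the developer's decision; a real engine would weigh each task against each adapter's capabilities.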

Part of this new feature set that aids multi-GPU configurations is that the frame buffers (GPU memory) won't necessarily need to be mirrored anymore. In older APIs, in order to benefit from multiple GPUs, you'd have the two work together, each one rendering an alternate frame (AFR). This required both to have all of the texture and geometry data in their frame buffers, meaning that despite having two cards with 4 GB of memory, you'd still only have a 4 GB frame buffer.
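The "4 + 4 = 4" arithmetic can be made concrete with idealized numbers (a toy illustration; real workloads still duplicate some data, so the sum is an upper bound):

```python
def usable_vram_afr(cards_gb):
    """AFR mirrors all texture/geometry data on every card,
    so the usable pool is only as big as the smallest card."""
    return min(cards_gb)

def usable_vram_ideal_split(cards_gb):
    """With explicit per-GPU resource placement, the pools can
    (ideally) add up; in practice some data is still duplicated."""
    return sum(cards_gb)

print(usable_vram_afr([4, 4]))          # 4  -> the "4 + 4 = 4" case
print(usable_vram_ideal_split([4, 4]))  # 8
```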

DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed.
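A minimal sketch of the scanline division described above (illustrative Python only; it does not reflect how DX12 actually assigns screen regions):

```python
def split_frame(height_px, num_gpus):
    """Divide one frame's scanlines as evenly as possible
    among GPUs (Split Frame Rendering)."""
    base, extra = divmod(height_px, num_gpus)
    portions, start = [], 0
    for gpu in range(num_gpus):
        rows = base + (1 if gpu < extra else 0)
        portions.append((start, start + rows))
        start += rows
    return portions

# Two GPUs at 1080p: each renders 540 lines.
print(split_frame(1080, 2))  # [(0, 540), (540, 1080)]
```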

Our source suggested that this technology will significantly reduce latency, and the explanation is simple. With AFR, a number of frames need to be in queue in order to deliver a smooth experience, but what this means is that the image on screen will always be about 4-5 frames behind the user's input actions.

This might deliver a very high framerate, but the latency will still make the game feel much less responsive. With SFR, however, the queue depth is always just one, or arguably even less, as each GPU is working on a different part of the screen. As the queue depth goes down, the framerate should also go up due to freed-up resources.
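The latency claim is simple arithmetic: input-to-screen lag scales with queue depth times frame time. A toy calculation, assuming a steady 60 fps and the queue depths mentioned above:

```python
def input_latency_ms(queue_depth_frames, fps):
    """Approximate input-to-photon lag contributed by the frame queue."""
    frame_time_ms = 1000.0 / fps
    return queue_depth_frames * frame_time_ms

# At 60 fps: a 4-frame AFR queue vs. an SFR queue depth of 1.
print(round(input_latency_ms(4, 60), 1))  # 66.7 ms
print(round(input_latency_ms(1, 60), 1))  # 16.7 ms
```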

The source said that by binding the multiple GPUs together, DirectX 12 treats the entire graphics subsystem as a single, more powerful graphics card. Thus, users get the robustness of running a single GPU, but with multiple graphics cards.

It should be noted that although the new Civilization: Beyond Earth title runs on Mantle, it has an SFR option and works in a similar way because AMD's Mantle API supports SFR. Mind you, Split Frame Rendering is not a new trick by any means. Many industrial film, photography, and 3D modelling applications use it, and back in the 90s some game engines also supported it.

Of course, chances are you won't be able to use all of the options described above at the same time. Split frame rendering, for example, will still likely require some of the textures and geometry data to be in multiple frame buffers, and there may be other sacrifices that have to be made.

Build A Multi-GPU System With Both AMD And Nvidia Cards

We were also told that DirectX 12 will support all of this across multiple GPU architectures, simultaneously. What this means is that Nvidia GeForce GPUs will be able to work in tandem with AMD Radeon GPUs to render the same game – the same frame, even.

This is especially interesting as it allows you to leverage the technology benefits of both of these hardware platforms if you wish to do so. If you like Nvidia's GeForce Experience software and 3D Vision, but you want to use AMD's TrueAudio and FreeSync, chances are you'll be able to do that when DirectX 12 comes around. What will likely happen is that one card will operate as the master card, while the other will be used for additional power.

What we're seeing here is that DirectX 12 is capable of aggregating graphics resources, be that compute or memory, in the most efficient way possible. Don't forget, however, that this isn't only beneficial for systems with multiple discrete desktop GPUs. Laptops with dual-graphics solutions, or systems running an APU and a GPU will be able to benefit too. DirectX 12's aggregation will allow GPUs to work together that today would be completely mismatched, possibly making technologies like SLI and CrossFire obsolete in the future.

There is a catch, however. Lots of the optimization work for the spreading of workloads is left to the developers – the game studios. The same went for older APIs, though, and DirectX 12 is intended to be much friendlier. Advanced uses may be a bit tricky, but according to the source, implementing SFR should be a relatively simple and painless process for most developers.

Queueing frames has been a difficult point for various studios, such that on some games SLI or CrossFire configurations don't even work. The aggregation together with SFR should solve that issue.

That's as far as we can reach into the cookie jar for now, but we expect to see and learn more at GDC.

=======================

Titan II cross-SLI with a 390X, anyone?

#2 xantufrog  Moderator
Member since 2013 • 17875 Posts

That's... Wild. Something tells me performance will be variable with a cross-brand configuration, but it kind of captures the essence of customizability on PC

#3  Edited By BassMan
Member since 2002 • 17808 Posts

It looks promising. Hopefully game engines adopt these techniques quickly.

#4 MonsieurX
Member since 2008 • 39858 Posts

I remember the promising Hydra chip

#5 Mozuckint
Member since 2012 • 831 Posts

Not sure about differing GPUs, but AMD's Robert Hallock tweeted a few weeks ago outright claiming that Mantle can combine video memory (4 + 4 equaling 8 and such), and that APIs following in those steps (presumably DX12 and glNext) will carry similar functionality.

#6 KHAndAnime
Member since 2009 • 17565 Posts

@MonsieurX said:

I remember the promising Hydra chip

Aw, me too, I was going to bring that up :P

I'll be really surprised if this actually happens though. Theoretical possibilities aren't necessarily always realized.

#7  Edited By Byshop  Moderator
Member since 2002 • 20504 Posts

The ability to use the VRAM on both cards is worth the price of admission alone.

-Byshop

#8 Catalli  Moderator
Member since 2014 • 3453 Posts

I don't like that green background in the image. I smell shenanigans...

#9 Ben-Buja
Member since 2011 • 2809 Posts

@Byshop said:

The ability to use the VRAM on both cards is worth the price of admission alone.

-Byshop

Yep, it will make my 970s a lot more future proof. DX12 sounds awesome, it will change PC gaming a lot, more than DX10 and DX11 did for sure

#10  Edited By Coseniath
Member since 2004 • 3183 Posts

LucidLogix is nowhere near Microsoft's level in any comparison.

I could point out countless facts that separate the two techs (Hydra Engine and DX12), but I will mention the two things that matter most.

First, Microsoft's influence. If there is one company with the most influence in the PC industry, it's Microsoft. When Microsoft says to devs and PC part manufacturers "jump", they say "how high?".

Second, and very important too, is that they are going to make things easy for developers. So no more excuses behind laziness....

#11  Edited By deactivated-59d151f079814
Member since 2003 • 47239 Posts

@Byshop said:

The ability to use the VRAM on both cards is worth the price of admission alone.

-Byshop

Pretty much this.. It makes SLI /Crossfire a viable option now for mid range systems.. Before this it was so much better just to upgrade with a single card.. Due to better efficiency in power, performance, not to mention no SLI related problems with certain games and older games.. But I will wait til it's actual release.. I remember the amount of hype that was behind DX10 in similar claims with massive performance increases..

#12 Byshop  Moderator
Member since 2002 • 20504 Posts

@sSubZerOo said:

Pretty much this.. It makes SLI /Crossfire a viable option now for mid range systems.. Before this it was so much better just to upgrade with a single card.. Due to better efficiency in power, performance, not to mention no SLI related problems with certain games and older games.. But I will wait til it's actual release.. I remember the amount of hype that was behind DX10 in similiar claims with massive performance increases..

Now I'm just thinking how awesome it would have been if, each time I got a new dual-GPU model to replace my old one, I had instead just -added- the new card to my system (e.g. a 6990 + 7990, or a 7990 + dual 290Xs).

-Byshop

#13  Edited By BassMan
Member since 2002 • 17808 Posts

I still don't see how mixing AMD and Nvidia would work. Nvidia has too much proprietary tech that the AMD cards would not be able to use. I guess you would have to disable those settings in games when mixing cards, or program the game to target the Nvidia card while offloading basic processing onto the AMD card. I just wish more developers would make better use of PhysX and all the other cool stuff. I hate it when they develop a game for the lowest common denominator.

#14  Edited By horgen  Moderator
Member since 2006 • 127503 Posts

@Byshop said:

@sSubZerOo said:

Pretty much this.. It makes SLI /Crossfire a viable option now for mid range systems.. Before this it was so much better just to upgrade with a single card.. Due to better efficiency in power, performance, not to mention no SLI related problems with certain games and older games.. But I will wait til it's actual release.. I remember the amount of hype that was behind DX10 in similiar claims with massive performance increases..

Now I'm just thinking how awesome it would have been if each time I had gotten a new model dual GPU to replace my old one, I had instead just -added- the new card to my system (i.e. a 6990 + 7990 or 7990 + dual 290xs.

-Byshop

And had a heap of PSU upgrades.. I mean add-ons as well. Suddenly the Corsair AX1500 isn't enough :P

#16 digitm64
Member since 2013 • 470 Posts

One reason to look forward to the rumored nVidia 8GB VRAM cards. Will get 2x and have 16GB VRAM... I will challenge a game to bring that down on its knees. Bring it on!!!

#18 Coseniath
Member since 2004 • 3183 Posts
@horgen said:

@Byshop said:

@sSubZerOo said:

Pretty much this.. It makes SLI /Crossfire a viable option now for mid range systems.. Before this it was so much better just to upgrade with a single card.. Due to better efficiency in power, performance, not to mention no SLI related problems with certain games and older games.. But I will wait til it's actual release.. I remember the amount of hype that was behind DX10 in similiar claims with massive performance increases..

Now I'm just thinking how awesome it would have been if each time I had gotten a new model dual GPU to replace my old one, I had instead just -added- the new card to my system (i.e. a 6990 + 7990 or 7990 + dual 290xs.

-Byshop

And had a heap of PSU upgrades.. I mean add-ons as well. Suddenly the Corsair AX1500 isn't enough :P

SuperFlower Leadex Platinum ’8 Pack Edition’ 2000W Review

Super Flower LEADEX 2000 Watt @ 2500 Watt Overload - 200 A on 12V

Problem solved! :D

#19 horgen  Moderator
Member since 2006 • 127503 Posts

@Coseniath: who needs a 2kW PSU? Not even 2 of the 295Xs from AMD pull that much, they stop at 1300 or so if you play at 4K

#20 Coseniath
Member since 2004 • 3183 Posts

@horgen: Dunno. Someone with 2 R9 295X and a... fridge? :P

#22  Edited By Lach0121
Member since 2007 • 11783 Posts

Am I misunderstanding something, or rather could someone clarify for me...

Does your VRAM stack now with DX12 and multi-GPU setups? Before, 2x 1GB cards would only leave you with 1GB in SLI/Crossfire. So would that same configuration now leave you with 2GB of VRAM?

Also the AMD/Nvidia multi gpu stuff is really nice. (but I see it being a minefield at first with drivers)

#23 insane_metalist
Member since 2006 • 7797 Posts

That's weird... hmm... a 390 + 2x 290s sounds interesting :p

#24 ShadowDeathX
Member since 2006 • 11698 Posts

Civilization: Beyond Earth already stacks video card memory and uses Split Frame Rendering if you are using Mantle.

Anyways, the multiple vendor thing won't happen. Nvidia will just add a block into the drivers, like they usually do.

#25 ShadowDeathX
Member since 2006 • 11698 Posts

@Lach0121 said:

Am I misunderstanding something, or rather could someone clarify for me...

Does your VRAM stack now with DX12, and multi-gpu setups. Like before 2x 1gb cards would only leave you with 1gb in sli/xfire. So would that same configuration now leave you with 2gb of VRAM?

Also the AMD/Nvidia multi gpu stuff is really nice. (but I see it being a minefield at first with drivers)

The game has to support Split Frame Rendering.

In traditional Crossfire/SLI, one card would be in charge of rendering one frame and the next card would render the next frame.

With Split Frame Rendering, each frame is divided up and each card renders a portion of the frame. For example in 1080p and with a 2 Way GPU setup, one GPU would render 540 lines and the other GPU would render the other 540 lines. Since each GPU is assigned a different workload, they use their own memory to load up the image.

#26 Lach0121
Member since 2007 • 11783 Posts

@ShadowDeathX:

I see, thanks for explaining it, quite clearly I might add.

#27 Code135
Member since 2005 • 892 Posts

Sounds good, but until I see this operational it still won't get me buying Win10... It's one thing to see it on paper, another to see it operational and "fault-proof".

#28 deactivated-57d307c5efcda
Member since 2009 • 1302 Posts

This is very promising, but I highly doubt mixing Nvidia and AMD cards will be part of it. What is good though is the VRAM stacking, or using different cards in the same family. Windows 10 is free to all 7 and 8 users, so there is no reason not to upgrade. The only people who have to buy it are Vista users and, shudder, people who are STILL using XP.

#29 Lach0121
Member since 2007 • 11783 Posts

@ryangcnx-2: No, there will still possibly be reasons not to upgrade.

I don't use my PC just for gaming. I use it for music production too, so I have to worry about hardware/driver compatibility and software compatibility with a new OS. I don't think I will have a problem with most of it, just a couple of pieces that I am a little worried might not make it to Windows 10, at least not right away.

Since we have two rigs, we will upgrade one, test the hardware/drivers out on it, and if it works, then I will upgrade the OS on my rig as well.

#30 silversix_
Member since 2010 • 26347 Posts

impressive stuff

#31 elessarGObonzo
Member since 2008 • 2677 Posts

knew there was a reason to keep my 770. PhysX and TressFX?

this will be a revolutionary step, backward? wasn't this possible like 15 years ago?

Nvidia will probably eventually buy AMD and just incorporate into their mobile class division.

#32 BassMan
Member since 2002 • 17808 Posts

@Lach0121 said:

@ryangcnx-2: No there will still possibly be reasons not to upgrade.

I don't use my pc just for gaming. I use it for music production too. So I have to worry about hardware/driver compatibility, and software compatibility with a new OS. Although I don't think I will have a problem with most of it, just a couple pieces that I am a little worried might not make it to windows 10, at least not right away.

Although we have two rigs, we will upgrade one, test the hardware/drivers out on it, and if it works, then I will upgrade my OS on my rig as well.

Just use a multi-boot setup. I have Win XP, Win 7, and Win 8.1 all running on the same system.

#33 Lach0121
Member since 2007 • 11783 Posts

@BassMan: Well that kinda negates the free upgrade, doesn't it?

I mean once you upgrade an OS, it's the new OS. Didn't MS state that once you upgrade to 10 you can't go back to the 8.1 you upgraded from?

#34  Edited By BassMan
Member since 2002 • 17808 Posts

@Lach0121:

I guess if you are forced to upgrade online, then it would convert your current OS. However, if you have the old OS on disc, you can still install it afterwards. Also, Microsoft upgrade discs usually have the full OS and just require you to enter the key of your older OS. If you can get one of those for Win 10, then you can do a fresh install.

#35 Lach0121
Member since 2007 • 11783 Posts

@BassMan: I do have the disc for 8.1 actually.

I would prefer to run only one OS. However, if testing the new OS on rig #2 yields negative results with my equipment, then I will have to resort to your recommendation. It's something that will become clearer once Windows 10 gets closer to release.

#36 Old_Gooseberry
Member since 2002 • 3958 Posts

this sounds so crazy it just might work

#39 Coseniath
Member since 2004 • 3183 Posts
@Chatch09 said:

@Coseniath: Any word on how extra system RAM will come into play with DX12? I have 2x8GB of 2400Mhz now but want another kit for completeness (I know its not necessary, but my OCD wants every slot on my MOBO filled lol!)

Dunno, but with the 16GB RAM you currently have, I can safely say that you shouldn't have a problem running games at max for like 3 years or even more....

Games at the moment barely pass 8GB; some might reach 10GB. But 16GB will be enough for a long time. (Going from 4GB to 8GB took like 5 to 7 years...)

#40 horgen  Moderator
Member since 2006 • 127503 Posts

@Coseniath said:
@Chatch09 said:

@Coseniath: Any word on how extra system RAM will come into play with DX12? I have 2x8GB of 2400Mhz now but want another kit for completeness (I know its not necessary, but my OCD wants every slot on my MOBO filled lol!)

Dunno, but with 16GB RAM you currently have, I can safely say that you shouldn't have a problem running games at max for like 3 years or even more....

Games at the moment barely pass 8GB. some might reach 10GB. But 16GB will be enough for a long time. (from 4GB to 8GB, it took like 5 to 7 years...)

Going from 32bit to 64bit might have had something to do with that?!?

#41 Coseniath
Member since 2004 • 3183 Posts
@horgen said:

@Coseniath said:
@Chatch09 said:

@Coseniath: Any word on how extra system RAM will come into play with DX12? I have 2x8GB of 2400Mhz now but want another kit for completeness (I know its not necessary, but my OCD wants every slot on my MOBO filled lol!)

Dunno, but with 16GB RAM you currently have, I can safely say that you shouldn't have a problem running games at max for like 3 years or even more....

Games at the moment barely pass 8GB. some might reach 10GB. But 16GB will be enough for a long time. (from 4GB to 8GB, it took like 5 to 7 years...)

Going from 32bit to 64bit might have had something to do with that?!?

Actually, each helped the other. Devs needed more people on a 64-bit OS before they could push past 4GB of total RAM, and people needed software that used more than 4GB before they needed 64-bit :).

#42 GeryGo  Moderator
Member since 2006 • 12803 Posts

The real question: will it allow me to use a mid-tier GPU with a combo of lower-tier ones, just to give me the slight boost I need to get into high-end GPU territory?

#43 SaintSatan
Member since 2003 • 1986 Posts

I have a boner because I have a Dual GTX 780m 4GB SLI laptop. I want this released now. So end of the year?