Mixed GPU nightmare.

#1 Posted by o0squishy0o (2802 posts) -

Hi everyone!

So basically I've recently bought a pair of 2080 Tis. I do a lot of GPU rendering, but I also play the occasional game.

However, I've kept one of my old 980s in as the display card for when I render as I work, and I've noticed that when I play games (switching the monitor cables over to a 2080 Ti, obviously) the 2080 Tis seem to run pretty poorly.

The only time I've got them running well was after I uninstalled the 980's drivers (the card itself stayed in the system) and did a fresh reinstall; after that the cards ran really well.

However, I've now run into the exact same problem again, which I can only describe as the cards throttling on something. Temps are low, and so is the reported power usage.

I'd have thought I should be able to run games on a 2080 Ti and have it completely ignore the fact that there's a 980 still kicking around in there, but that just doesn't seem to be the case.

I can't really keep pulling the 980 out and reinstalling the drivers every time I want to play a game. So I was wondering if there is anything I should be doing?

I've tried disabling the card and reinstalling the latest drivers, and I've tried telling the awful NVIDIA Experience thing not to use the 980 for things like CUDA on the apps/games in question.

tl;dr: my old 980 seems to be making my 2080 Ti awful at running games. The card never hits 100% usage and bounces around 50-80%, and even lower if I drop to 1080p.

Any suggestions etc would be awesome! cheers :)
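
For anyone who wants to see the numbers themselves, polling nvidia-smi is the easiest way to watch which card a game actually lands on. A rough sketch (assuming `nvidia-smi` is on the PATH; the parser matches its standard CSV output):

```python
import csv
import io
import subprocess

def parse_gpu_stats(csv_text):
    """Parse `nvidia-smi --query-gpu=... --format=csv,noheader,nounits` output."""
    rows = []
    for rec in csv.reader(io.StringIO(csv_text)):
        name, util, power, temp = [field.strip() for field in rec]
        rows.append({"name": name, "util_pct": int(util),
                     "power_w": float(power), "temp_c": int(temp)})
    return rows

def query_gpus():
    # Ask the driver for per-GPU utilisation, power draw and temperature.
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=name,utilization.gpu,power.draw,temperature.gpu",
         "--format=csv,noheader,nounits"], text=True)
    return parse_gpu_stats(out)
```

Run `query_gpus()` in a loop while the game is running; if the 2080 Ti sits at 50-80% with low power draw while temps are fine, it isn't thermal throttling.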

#2 Posted by schu (9909 posts) -

@o0squishy0o: What kind of power supply can run dual 2080 tis and a 980?

#3 Edited by GTR12 (13254 posts) -

@schu said:

@o0squishy0o: What kind of power supply can run dual 2080 tis and a 980?

This ^^

PSU is at its limit.

#4 Edited by PredatorRules (12281 posts) -
@schu said:

@o0squishy0o: What kind of power supply can run dual 2080 tis and a 980?

A 1500 Watt one, of course :D

As for a solution to your problem @o0squishy0o: you can disable PCIe slots via the motherboard BIOS rather than pulling out the 980 every time you want to game.
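
If the board doesn't expose per-slot switches, Windows 10 (build 2004 or newer) can also disable just the 980 from an elevated prompt with `pnputil`. A rough sketch that only builds the command by default - the instance ID below is a made-up placeholder; list your real one with `pnputil /enum-devices /class Display`:

```python
import subprocess

# Placeholder instance ID - find the real one with: pnputil /enum-devices /class Display
GTX980_ID = r"PCI\VEN_10DE&DEV_13C0"

def pnputil_cmd(action, instance_id):
    """Build a pnputil command line; needs Windows 10 2004+ and an admin prompt."""
    assert action in ("disable-device", "enable-device")
    return ["pnputil", "/" + action, instance_id]

def toggle_gpu(enable, instance_id=GTX980_ID, dry_run=True):
    """Disable the 980 before gaming, re-enable it afterwards."""
    cmd = pnputil_cmd("enable-device" if enable else "disable-device", instance_id)
    if not dry_run:
        subprocess.run(cmd, check=True)  # actually runs pnputil (Windows only)
    return cmd
```

Same effect as the BIOS toggle, but without a reboot.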

#5 Posted by o0squishy0o (2802 posts) -

Thanks for the replies so far, but I doubt the PSU is the issue, as the cards run fine when rendering, which loads BOTH 2080 Tis while the 980 drives the monitor. The issue seems to be purely gaming while the 980 is present in the system. A friend of mine said that since the system sees there's a 980 in there, the computer will work within the limits of that card? Apparently a guy had a similar issue before and had to remove the old card and do a clean install (which I sort of did before and it worked, but now it's like the 2080 Ti isn't even trying).

So yeah, I don't think it's the PSU, since rendering uses both cards at full power and it's totally fine, getting 100% performance. Something on the gaming side appears to be the issue.

#6 Posted by o0squishy0o (2802 posts) -

@PredatorRules:

Oh, sorry, I missed the part of your post about disabling the PCIe slot. I'll give it a go. It seems more software/driver related, though, since the problem was fixed at one point while I still had the 980 installed.

#7 Posted by Horgen (118723 posts) -

Does running the monitor from one of the 2080 Tis lower its rendering performance that much?

#8 Posted by o0squishy0o (2802 posts) -

@horgen: To be honest, I could probably get away with removing the 980. I just liked the idea of the 980 being the Windows card while the 2080 Tis render away. I rarely work like that, though, and typically only use one of them for testing. It's just a pain that something driver-wise seems to be causing this.

#9 Posted by Horgen (118723 posts) -

@o0squishy0o: I do understand that, but given the amount of power these new cards have, simply running a monitor should be easy for them and shouldn't affect rendering performance much.

#10 Posted by o0squishy0o (2802 posts) -

@horgen: I know I know.... just can't help but want to get the absolute max from them! Thanks for your replies and everyone else who has contributed, I appreciate it!

#11 Posted by ronvalencia (26498 posts) -

@o0squishy0o said:

So basically I've recently bought a pair of 2080ti's. I do a lot of GPU rendering however I do play the occasional game. [rest of post #1 snipped]

Have you tried an Intel IGP + NVIDIA dGPU setup? I'm assuming you have a normal desktop Intel Core chip rather than a Skylake-X/Xeon type motherboard platform.

#12 Posted by WESTBLADE (329 posts) -

@o0squishy0o said:

@horgen: I know I know.... just can't help but want to get the absolute max from them! Thanks for your replies and everyone else who has contributed, I appreciate it!

You seem to have a lot of money to burn (two 2080 Ti's?😶), but no brain to back it up (you don't even know that you can disable PCI slots on mobo???🙄), so just buy some cheap ass pre-built PC and throw your ''almighty'' 980 in...👍

#13 Posted by o0squishy0o (2802 posts) -

@WESTBLADE: Would you like me to buy you a 2080ti and make you feel a little better?

In all seriousness, the reason for keeping the 980 was purely so I could have the 2080 Tis doing the rendering while I still work in Windows etc. without suffering viewport performance degradation. Turns out that simply doesn't work for me, and to solve the issue I had to totally uninstall everything and just run the two cards instead.

Oh, the "almighty" 980s are being sold. One is gone, but if you'd like to write me a nice letter I might consider a discounted rate on the other one for you ;)

#14 Posted by WESTBLADE (329 posts) -
@o0squishy0o said:

@WESTBLADE: Would you like me to buy you a 2080ti and make you feel a little better? [rest of post #13 snipped]

Yawn, stop the bullshitting.
For someone who does a lot of ''GPU rendering'', it's apparently fine to just leave a single puny 980 doing the job so he can ''occasionally'' game... Hmm... the CPU must still have a lot of time to feed a third GPU, so... crypto mining? 😆

#15 Posted by o0squishy0o (2802 posts) -

@WESTBLADE:

With all due respect, I think you might be a little confused about my situation; whether that's my fault or not doesn't really matter.

Basically, while working in the program (3ds Max) I can be rendering a scene (on the 2080 Tis with the render engine of choice) and still maintain good viewport performance within the program itself. That means I can keep working with limited downtime from restarting renders, while also getting better response times. In the most basic terms I can think of: the 2080 Tis act like a separate machine rendering the image, while the 980 is the normal Windows/display device.
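
Incidentally, the usual way to keep a CUDA-based render engine off a particular card is the `CUDA_VISIBLE_DEVICES` environment variable. A rough sketch - the device indices 0 and 1 for the 2080 Tis are an assumption (check yours with `nvidia-smi -L`), and `render.exe scene.max` is a stand-in for the engine's real command line:

```python
import os
import subprocess

def cuda_env(gpu_indices, base=None):
    """Copy an environment and restrict CUDA to the given device indices."""
    env = dict(os.environ if base is None else base)
    env["CUDA_VISIBLE_DEVICES"] = ",".join(str(i) for i in gpu_indices)
    return env

def launch_on_gpus(cmd, gpu_indices):
    """Start a process that can only see the chosen CUDA devices."""
    return subprocess.Popen(cmd, env=cuda_env(gpu_indices))

# e.g. pin rendering to the two 2080 Tis (indices assumed to be 0 and 1):
# launch_on_gpus(["render.exe", "scene.max"], [0, 1])
```

The renderer then can't touch the 980 at all, however the engine is configured internally.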

With regards to the gaming side: because I'm not yet in a position to justify a second dedicated workstation/gaming machine, I use the 2080 Tis to play games. However, I believe it must be driver related that, with the 980 still in the machine, even though it wasn't being used, I got awful in-game performance.

I hope that makes sense? You probably don't care but I still felt like I should try and explain! :)

#16 Posted by WESTBLADE (329 posts) -

@o0squishy0o:
Ah, I really apologise then... if you'd only said '3ds Max' from the very beginning, I would have immediately known where you were going with 'heavy GPU rendering' while not gaming, and I wouldn't have been so ''toxic''... 😅

Honestly, I suspected a bit of a clueless ''miner'' (man, I hate that scum; the prices for new GPUs are now so ridiculous thanks to them) who makes like 3 dollars a day (you know, the electricity bills and all, it makes no sense), or someone using the cards for some lightning-fast bootleg 4K Blu-ray encoding, eh... 😋

WOW then! If you need a pair of 2080 Tis, you've got to be working with a ''few billion'' polys (I remember it being more of a RAM hog back then, though I haven't touched Max in years) and ''whatever realtime shaders'' for materials/lighting you're running in your viewports (I mainly did low-to-mid-poly mechanical game models back then and only used the 'Xoliul Shader' to fool around). The speed of GDDR6 must also be great for fast texture storing and reading, etc... Makes sense now.

TO ANSWER YOUR QUESTION:
Back when I upgraded from a GTX 460 to a GTX 570, I felt kinda sorry for my 460, and since I had a 2-way SLI motherboard, I wanted to use the 570 as the main card and dedicate the 460 to PhysX.
Well, my 570 was surprisingly underperforming, because the NVIDIA drivers/OS/games were confused by the stupid thing I'd done. You can only use the exact same model of card in SLI.
I didn't have them ''bridged'' (that 570+460), because that's just a silly idea to begin with... :). AND that was back when NVIDIA still fully supported SLI (though I'm not 100% sure). If you want to save yourself a lot of headache from a gaming perspective - one powerful card is the way to go...