Highest possible temperature for a GPU. Need info.

This topic is locked from further discussion.

#1 Posted by funkyzoom (1426 posts) -

I recently installed a Radeon HD 6950 GPU. I have a feeling it's running a bit hot, so I adjusted the fan speed so that it increases 1% for every 1°C increase in temperature. The temperature hovers between 40°C and 50°C when idle. I've tested just one game since installing the GPU, Sleeping Dogs with everything maxed out. When I started playing, the temperature gradually increased, then stabilized around 80°C (it didn't increase any further), and due to my fan settings, the fan runs at 80% speed at that point. So I just wanted to know: are my temperatures normal, or is my GPU overheating? Also, what GPU temperature is cause for alarm under full load? I've checked various forums, and people say different things. Some say a GPU is safe until it reaches around 90°C, whereas others say the GPU gets damaged anywhere beyond 80°C and that 70°C is ideal for full load. Please help me. Thank you.
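A fan curve like the one described (fan speed tracking temperature 1:1, so 80°C means 80% fan) can be sketched as follows; the 20% floor and 100% ceiling are assumptions for the sketch, not values taken from the real driver profile:

```python
def fan_speed_percent(temp_c, min_speed=20, max_speed=100):
    """Linear fan curve: fan speed (%) rises 1% per 1°C, so 80°C -> 80%.

    min_speed and max_speed are assumed clamps, not read from the
    actual Catalyst/driver profile.
    """
    return max(min_speed, min(max_speed, temp_c))

print(fan_speed_percent(80))  # 80, matching the 80°C / 80% fan behaviour
print(fan_speed_percent(45))  # 45, roughly the idle range
```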

#2 Posted by ferret-gamer (17310 posts) -

Your temps are perfectly fine. GPUs only start getting into the danger zone around 100°C or above, and they will throttle themselves before they overheat. Generally you want to keep it under 90°C just to be safe.

#3 Posted by 04dcarraher (19263 posts) -
Anything under 85°C is fine; anything past 85°C, it's time to look into better cooling and/or figure out the reason for the temps. The type of cooler used on the GPU also determines how well, and how much, heat a graphics card can handle. If you have a heatsink that also contacts and cools the memory and other components, heat transferred from the GPU can saturate the heatsink, causing the memory and other components on the graphics card to become too hot, and you can run into issues that way too. So even though a GPU may be below the 100°C threshold, it can still be unstable.
#4 Posted by PernicioEnigma (5281 posts) -
Try to get it lower. The people above are right in saying 85°C is an okay temperature, but getting it lower will help increase the lifespan of your GPU.
#5 Posted by funkyzoom (1426 posts) -

[QUOTE="ferret-gamer"]
Your temps are perfectly fine. GPUs only start getting into the danger zone around 100°C or above, and they will throttle themselves before they overheat. Generally you want to keep it under 90°C just to be safe.
[/QUOTE]

Thanks for the info. I'm relieved. I just wanted to know, how much does Sleeping Dogs stress the GPU when maxed out? I mean, I've only tried this one game. Given the max temperature I'm seeing with it maxed out (80°C), will other games push the temperature much higher? And with my current temperatures, do you think I'll be able to run almost all current games maxed without going beyond 95°C? I'm asking because I'm a bit scared to test it myself, since the GPU might get fried, and I'm broke right now and can't afford a new one.

#6 Posted by funkyzoom (1426 posts) -

[QUOTE="PernicioEnigma"]
Try to get it lower. The people above are right in saying 85°C is an okay temperature, but getting it lower will help increase the lifespan of your GPU.
[/QUOTE]

How exactly can I get it lower? I'm certainly not in a position to spend even a penny right now on upgrades to my cabinet or cooling solution. All I can do is perform tweaks like altering the fan speed.

#7 Posted by V4LENT1NE (12895 posts) -
80°C is nothing. Cards like the older ATI 4000 series I used to have idled in my case at around 79°C and went up to 100°C at full load, and I never really had a problem with them. I heard the real threshold for a GPU is something like 120°C.
#8 Posted by topgunmv (10184 posts) -

[QUOTE="V4LENT1NE"]
80°C is nothing. Cards like the older ATI 4000 series I used to have idled in my case at around 79°C and went up to 100°C at full load, and I never really had a problem with them. I heard the real threshold for a GPU is something like 120°C.
[/QUOTE]

I think whoever told you that is a crazy person. That sounds too high even for gaming laptops.

#9 Posted by ferret-gamer (17310 posts) -

[QUOTE="funkyzoom"]
[QUOTE="ferret-gamer"]
Your temps are perfectly fine. GPUs only start getting into the danger zone around 100°C or above, and they will throttle themselves before they overheat. Generally you want to keep it under 90°C just to be safe.
[/QUOTE]
Thanks for the info. I'm relieved. I just wanted to know, how much does Sleeping Dogs stress the GPU when maxed out? I mean, I've only tried this one game. Given the max temperature I'm seeing with it maxed out (80°C), will other games push the temperature much higher? And with my current temperatures, do you think I'll be able to run almost all current games maxed without going beyond 95°C? I'm asking because I'm a bit scared to test it myself, since the GPU might get fried, and I'm broke right now and can't afford a new one.
[/QUOTE]

I have a 6950 as well; Sleeping Dogs can easily get full GPU utilization on the card if you up the anti-aliasing. Most 6950s will reach high 70s to mid 80s on their stock coolers when running intensive games. As long as the stock cooler isn't full of dust and is working, it's designed to keep the stock card well within operating temperatures even when fully stressed.

My 6950 has an aftermarket cooler on it right now, so it never gets higher than 55°C, but when I was running the stock cooler I was getting pretty much the same temperatures as you.

If you want to be sure, run a program called FurMark. It's a stress-test program that will get your card hotter than pretty much anything else. If your temps are good there, then you can be pretty sure that games won't even get close to overheating your card.
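If you do run FurMark, it helps to log the temperature while it runs rather than eyeball it. A minimal Python sketch, assuming a hypothetical `read_temp` callable wired to whatever utility reports your card's temperature:

```python
import time

def watch_temps(read_temp, limit_c=95, polls=600, interval_s=1.0):
    """Poll a temperature source while a stress test (e.g. FurMark) runs.

    read_temp is any callable returning the current GPU temp in °C --
    hypothetical here; hook it up to whatever tool reports your card's temp.
    Returns the peak temperature seen, stopping early if limit_c is hit.
    """
    peak = 0
    for _ in range(polls):
        t = read_temp()
        peak = max(peak, t)
        if t >= limit_c:
            print(f"Hit {t}°C -- stop the stress test!")
            break
        time.sleep(interval_s)
    return peak

# Demo with a fake sensor that climbs and then levels off, like the 6950 did.
readings = iter([70, 78, 84, 88, 90, 90, 90])
print(watch_temps(lambda: next(readings), polls=7, interval_s=0))  # 90
```

Stopping the test as soon as the limit is reached is the point: you learn the card's worst-case temperature without leaving it pinned there.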

#10 Posted by V4LENT1NE (12895 posts) -

[QUOTE="topgunmv"]
[QUOTE="V4LENT1NE"]
80°C is nothing. Cards like the older ATI 4000 series I used to have idled in my case at around 79°C and went up to 100°C at full load, and I never really had a problem with them. I heard the real threshold for a GPU is something like 120°C.
[/QUOTE]
I think whoever told you that is a crazy person. That sounds too high even for gaming laptops.
[/QUOTE]

GPUs can handle way more heat than a lot of electronics, especially older GPUs, where it was the norm to see quite high temps. Like I said, I had my old CrossFire 4870s and they idled at 79°C every day for two years without a fault.
#11 Posted by funkyzoom (1426 posts) -

[QUOTE="ferret-gamer"]
I have a 6950 as well; Sleeping Dogs can easily get full GPU utilization on the card if you up the anti-aliasing. Most 6950s will reach high 70s to mid 80s on their stock coolers when running intensive games. As long as the stock cooler isn't full of dust and is working, it's designed to keep the stock card well within operating temperatures even when fully stressed.

My 6950 has an aftermarket cooler on it right now, so it never gets higher than 55°C, but when I was running the stock cooler I was getting pretty much the same temperatures as you.

If you want to be sure, run a program called FurMark. It's a stress-test program that will get your card hotter than pretty much anything else. If your temps are good there, then you can be pretty sure that games won't even get close to overheating your card.
[/QUOTE]

Thanks again. I ran FurMark just yesterday. With 8x AA enabled at 1080p, the temperature reached around 90°C after running the benchmark for a few minutes. Actually, the temperature kept climbing, so I don't know how much higher it would have gone if I'd let FurMark run for a couple more minutes. Is this fine?

And one more thing I wanted to know. I previously used an HD 5670, and I know the HD 6950 is a significant upgrade. Even so, in Sleeping Dogs I don't notice any significant increase in graphics quality compared to when I ran it on the HD 5670. On that card I had the settings on high but AA turned off; now I've maxed every available setting. I do notice a slight increase in performance. I was expecting a radical increase in graphics quality with this upgrade when maxed out, but nothing is noticeable. Is this normal, or do I have to change some settings in the AMD Catalyst Control Center? I haven't overclocked my card. What I'm using is the Sapphire Radeon HD 6950 2GB Dirt 3 edition. Please advise. Thank you.

#12 Posted by 04dcarraher (19263 posts) -
[QUOTE="V4LENT1NE"]
[QUOTE="topgunmv"]
[QUOTE="V4LENT1NE"]
80°C is nothing. Cards like the older ATI 4000 series I used to have idled in my case at around 79°C and went up to 100°C at full load, and I never really had a problem with them. I heard the real threshold for a GPU is something like 120°C.
[/QUOTE]
I think whoever told you that is a crazy person. That sounds too high even for gaming laptops.
[/QUOTE]
GPUs can handle way more heat than a lot of electronics, especially older GPUs, where it was the norm to see quite high temps. Like I said, I had my old CrossFire 4870s and they idled at 79°C every day for two years without a fault.
[/QUOTE]

That does not matter. Getting GPUs near or beyond 100°C, you're playing Russian roulette: anything from a shortened lifespan to outright damage to unstable behaviour.
#13 Posted by 04dcarraher (19263 posts) -

[QUOTE="funkyzoom"]
[QUOTE="ferret-gamer"]
I have a 6950 as well; Sleeping Dogs can easily get full GPU utilization on the card if you up the anti-aliasing. Most 6950s will reach high 70s to mid 80s on their stock coolers when running intensive games. As long as the stock cooler isn't full of dust and is working, it's designed to keep the stock card well within operating temperatures even when fully stressed.

My 6950 has an aftermarket cooler on it right now, so it never gets higher than 55°C, but when I was running the stock cooler I was getting pretty much the same temperatures as you.

If you want to be sure, run a program called FurMark. It's a stress-test program that will get your card hotter than pretty much anything else. If your temps are good there, then you can be pretty sure that games won't even get close to overheating your card.
[/QUOTE]
Thanks again. I ran FurMark just yesterday. With 8x AA enabled at 1080p, the temperature reached around 90°C after running the benchmark for a few minutes. Actually, the temperature kept climbing, so I don't know how much higher it would have gone if I'd let FurMark run for a couple more minutes. Is this fine?

And one more thing I wanted to know. I previously used an HD 5670, and I know the HD 6950 is a significant upgrade. Even so, in Sleeping Dogs I don't notice any significant increase in graphics quality compared to when I ran it on the HD 5670. On that card I had the settings on high but AA turned off; now I've maxed every available setting. I do notice a slight increase in performance. I was expecting a radical increase in graphics quality with this upgrade when maxed out, but nothing is noticeable. Is this normal, or do I have to change some settings in the AMD Catalyst Control Center? I haven't overclocked my card. What I'm using is the Sapphire Radeon HD 6950 2GB Dirt 3 edition. Please advise. Thank you.
[/QUOTE]

Why would you expect better graphics quality when the game is running nearly the same settings as before, just with a GPU change? Usually the differences between high and ultra settings are minor; that's why you're not seeing any real difference. You need to run games that the 5670 couldn't handle beyond medium settings to see real graphical and performance improvements.

#14 Posted by funkyzoom (1426 posts) -

[QUOTE="04dcarraher"]
Why would you expect better graphics quality when the game is running nearly the same settings as before, just with a GPU change? Usually the differences between high and ultra settings are minor; that's why you're not seeing any real difference. You need to run games that the 5670 couldn't handle beyond medium settings to see real graphical and performance improvements.
[/QUOTE]

Ok, I got it. But with my 5670 I had AA completely turned off, and now I've set it to extreme. Since AA is such a resource-heavy feature, shouldn't I also see some improvement in graphics quality?

#15 Posted by ferret-gamer (17310 posts) -

[QUOTE="V4LENT1NE"]
[QUOTE="topgunmv"]
[QUOTE="V4LENT1NE"]
80°C is nothing. Cards like the older ATI 4000 series I used to have idled in my case at around 79°C and went up to 100°C at full load, and I never really had a problem with them. I heard the real threshold for a GPU is something like 120°C.
[/QUOTE]
I think whoever told you that is a crazy person. That sounds too high even for gaming laptops.
[/QUOTE]
GPUs can handle way more heat than a lot of electronics, especially older GPUs, where it was the norm to see quite high temps. Like I said, I had my old CrossFire 4870s and they idled at 79°C every day for two years without a fault.
[/QUOTE]

The ATI 4800 series might idle really hot, but idling at 79°C is still pretty high. Most cards don't idle anywhere near that; heck, most newer cards only reach those temps under full load. Radeon 4800 cards might have a higher threshold than most, because those cards run really, really hot sometimes, but most graphics cards will start having problems above 100°C, let alone 120°C.

I've had a couple of cards with temperature issues before. My GTX 280 would artifact at around 103°C and throttle at 105°C. My 6950, with its bad cooler, could handle even less: it would crash outright at 100°C.

#16 Posted by ferret-gamer (17310 posts) -

[QUOTE="funkyzoom"]
[QUOTE="04dcarraher"]
Why would you expect better graphics quality when the game is running nearly the same settings as before, just with a GPU change? Usually the differences between high and ultra settings are minor; that's why you're not seeing any real difference. You need to run games that the 5670 couldn't handle beyond medium settings to see real graphical and performance improvements.
[/QUOTE]
Ok, I got it. But with my 5670 I had AA completely turned off, and now I've set it to extreme. Since AA is such a resource-heavy feature, shouldn't I also see some improvement in graphics quality?
[/QUOTE]

Anti-aliasing just cleans up jaggies in the image, basically making edges smoother, and it has diminishing returns after a while since you can only smooth an edge so much. The method Sleeping Dogs uses on its higher settings is a very good AA method but also extraordinarily costly, so the performance/quality ratio isn't great. Probably the reason you didn't notice much of a difference is that you can't turn anti-aliasing off fully in Sleeping Dogs: even on the lowest AA setting it still uses FXAA, and that alone is usually good enough AA for most people.
#17 Posted by V4LENT1NE (12895 posts) -

[QUOTE="ferret-gamer"]
[QUOTE="V4LENT1NE"]
[QUOTE="topgunmv"]
I think whoever told you that is a crazy person. That sounds too high even for gaming laptops.
[/QUOTE]
GPUs can handle way more heat than a lot of electronics, especially older GPUs, where it was the norm to see quite high temps. Like I said, I had my old CrossFire 4870s and they idled at 79°C every day for two years without a fault.
[/QUOTE]
The ATI 4800 series might idle really hot, but idling at 79°C is still pretty high. Most cards don't idle anywhere near that; heck, most newer cards only reach those temps under full load. Radeon 4800 cards might have a higher threshold than most, because those cards run really, really hot sometimes, but most graphics cards will start having problems above 100°C, let alone 120°C.

I've had a couple of cards with temperature issues before. My GTX 280 would artifact at around 103°C and throttle at 105°C. My 6950, with its bad cooler, could handle even less: it would crash outright at 100°C.
[/QUOTE]

I know it was high, I'm not denying that. My GTX 580 now idles at around 50°C and doesn't go over 70°C at full load. What I'm saying is that cards, especially older ones, can handle a lot more than people make out.
#18 Posted by V4LENT1NE (12895 posts) -
[QUOTE="04dcarraher"]
[QUOTE="V4LENT1NE"]
[QUOTE="topgunmv"]
I think whoever told you that is a crazy person. That sounds too high even for gaming laptops.
[/QUOTE]
GPUs can handle way more heat than a lot of electronics, especially older GPUs, where it was the norm to see quite high temps. Like I said, I had my old CrossFire 4870s and they idled at 79°C every day for two years without a fault.
[/QUOTE]
That does not matter. Getting GPUs near or beyond 100°C, you're playing Russian roulette: anything from a shortened lifespan to outright damage to unstable behaviour.
[/QUOTE]

Never said it wouldn't cause damage, just saying mine were fine.
#19 Posted by kraken2109 (13007 posts) -

Most GPUs have a max of about 105°C, but as long as you're under 90°C it should be fine.

#20 Posted by Postmortem123 (7643 posts) -

80°C at 80% fan speed seems pretty bad, though.

#21 Posted by JohnF111 (14051 posts) -
I have an HD 5770 and it can reach over 100°C and run perfectly fine, but like others say, keep it well under that when you can. 80°C won't damage most GPUs, but it certainly won't help them either. 80°C is a good threshold to have in mind, but there's no set limit, and everyone has different ideas about the upper temp limits; I think that's what's confusing you. Some say Nvidia cards like the GTX 460 can run over 100°C and be considered fine, whereas others say 70°C is too hot, but a GPU is designed to run hotter than a CPU. All you can do is look at others' experiences and decide for yourself what good temps are. 80°C is a reasonable limit across the board and suits just about every graphics card in general.
#22 Posted by GamerwillzPS (8530 posts) -

I'd get worried if it goes above 90°C. Below that is fine, though I'd consider better cooling from 85°C up.

I know it varies on all cards, but I'd say the maximum acceptable temperature would be 90°C.