I bought a 4K monitor a week ago. I already had a GTX 1080 Ti, so I thought, hey, maybe it's finally time to get a 4K display since my card should handle 60 fps at 4K in most games. As it turns out, I was wrong.
For games that are 2-3 years old, that's mostly true: I can max out most of them without AA and hover in the 60-70 fps range. But newer games don't handle 4K nearly as well. For example, in Fallout 4 I sometimes drop to around 40 fps during heavy fights, which is unacceptable. GeForce Experience tells me to turn down TONS of settings: lighting quality to low, god rays off, shadow quality low, virtually zero draw distance for everything except NPCs, etc., and at that point I don't think playing at 4K is worth it. So my current rule is: if I can't stay above 55 fps with no AA or AF, I simply drop down to 1440p, max out everything, and still enjoy buttery smooth 60 fps.
Yes, I know 4K looks absolutely gorgeous, but I don't think it's worth dropping so many settings that it totally breaks immersion for me. Sunlight without god rays looks awful to me. What do you prefer: lower settings or a lower resolution to improve performance?