3840x2160 just seems to be a very tough hurdle for GPU designers to get over. Even with next-gen consoles I suspect it'll be 4K with a big fat * for many titles, since devs will be pushing other tech and putting more demand on the GPU.
Even with the mighty 2080 Ti, Nvidia seem to be recommending DLSS or some other combination of a lower res and temporal reconstruction, especially if you want 4K*, a solid framerate and high-end RTX effects. Cheating, basically.
AMD seem focused on cheating too, pairing image sharpening with a lower res instead of pushing for "honest" 4K.
So on next-gen consoles (especially with early talk of the GPU side possibly being a bit disappointing in terms of the leap from a PS4 Pro or X1X), I suspect devs will lie, cheat, steal and pillage every single drop of performance they can for new tech, and if that means it's 4K** instead of "honest" 4K, then so be it. Not necessarily a bad thing in and of itself, but some of the cheats used make me wince (I'm not a fan of temporal reconstruction, for example; in every game I've seen use it, I preferred to turn it off and just use a lower res). Sony and MS may even talk 8K, but I think that will be more that the HDMI port can support 8K, and maybe they can do 8K video playback or something.
It just seems to be a very tough wall for gaming to push through.
I only got on the 4K bandwagon this year. My old 1080p TV died, so I got a new 4K monitor. For reference, it's a 43" 4K monitor and I sit around 5 feet away from it while playing games.
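For the curious, here's a rough back-of-the-envelope of what that size and distance works out to (a Python sketch, assuming a 16:9 panel and the usual ~60 pixels-per-degree figure for 20/20 vision; the numbers are approximations, not gospel):

import math

# Pixels-per-degree for my setup: 43" 16:9 panel viewed from
# ~5 feet (60 inches). Back-of-the-envelope only.
DIAGONAL_IN = 43.0
DISTANCE_IN = 60.0  # ~5 feet

def pixels_per_degree(horizontal_pixels: float) -> float:
    width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)  # panel width from 16:9 diagonal
    ppi = horizontal_pixels / width_in               # pixel density
    pixel_in = 1.0 / ppi                             # size of one pixel
    deg_per_pixel = math.degrees(math.atan(pixel_in / DISTANCE_IN))
    return 1.0 / deg_per_pixel

for name, px in [("4K", 3840), ("1440p", 2560), ("1080p", 1920)]:
    print(f"{name}: ~{pixels_per_degree(px):.0f} pixels per degree")

# Prints roughly: 4K ~107, 1440p ~71, 1080p ~54. With 20/20 vision
# topping out around 60 PPD, both 4K and 1440p sit past what my eyes
# can resolve, while 1080p dips just under.

Which lines up pretty well with what I actually see in games: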
In the few games I have tried:
Shadow of Mordor: I can barely notice any difference between 4K and 1440p. 1080p does look fuzzier, but in the middle of playing it wouldn't bother me. I do use the high-res texture pack.
Rise of the Tomb Raider: Initially I didn't notice much difference going from 1080p to 4K, but as I moved around I began to notice a lot more small details in the scenery. The only thing is, I tried this on my old RX 580 4GB and only tried 4K for a short period, since the game was unplayable due to a very low framerate (though higher than I expected: around 20-22 while walking around the valley hub area). It's not an earth-shattering game changer... just a cleaner image, with smaller details becoming more apparent.
Shadow of the Tomb Raider (playing this on a Vega 64 at the highest settings preset): I have only played this at 4K so far, and man does it look perty. Lots of small details really stand out. I must play with the resolution a bit more; I only get around 30 FPS, so I wonder whether switching the res down a notch or two would make much of a difference to the visuals.
Pillars of Eternity 2: This one actually surprised me. It was the clearest difference to me when jumping from 1080p to 4K (it also plays just fine at 4K on an RX 580 4GB... just a side note). The backgrounds looked the same, but the little details on the character models really became clearer at 4K. On chainmail armour, all the little rings are more apparent, for example.
Doom 2016: I actually really struggled to notice any difference here. Maybe I was also using some sort of AA that was muddying things, but it looked a little sharper... that's about it. I don't think I would be missing out playing it at 1080p on a 4K screen. Still looked great, of course... it's Doom 2016, after all.
But yeah... is it worth it? Should devs prioritise pushing "honest" 4K, or instead focus on 4K* and deliver better visuals in other areas (ray tracing, physics simulation, etc.) and/or better performance? So far I would say: cheat away, devs. Honest 4K is generally not worth the trade-off.
But some of those cheats... they can be a cure that is worse than the disease. Hopefully that's something that improves next gen.