It all just seems very odd to me. Shouldn't the next-gen "upgrades" (the PS4 Pro and XB1X, that is) focus on improving stuff like texture rendering, supersampling, dynamic lighting, and environmental particles over how high the output resolution is? Rendering the scene itself at 4K would seem like it should take precedence over simply outputting the image at 4K.
For example, though it is far more demanding on my system (a 1060 6GB with an i7-7700HQ), playing Battlefield 4 at 200% resolution scaling (which is another term for supersampling: the game renders internally at essentially 4K and then downsamples to your display's resolution, giving the environment a much, MUCH crisper and more detailed look) looks FAR better than simply rendering the game's image at 4K.
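For anyone curious what resolution scaling actually does under the hood, here's a minimal sketch. It assumes the simplest possible downsampling (a box filter that averages each 2x2 block of the internally rendered frame); real engines may use fancier filters, and the function name here is just illustrative:

```python
import numpy as np

def supersample_downscale(frame, factor=2):
    """Average each factor x factor block of a frame rendered at
    factor x the target resolution (a simple box filter). This is
    the core idea of supersampling: render high, average down, which
    smooths edges and captures sub-pixel detail."""
    h, w = frame.shape[:2]
    blocks = frame.reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# A 1080p target rendered internally at 200% scale (3840x2160, i.e. "4K"):
internal = np.random.rand(2160, 3840, 3)            # stand-in for the rendered frame
output = supersample_downscale(internal, factor=2)  # final 1920x1080 frame
```

The point is that every output pixel is built from four rendered samples instead of one, which is why 200% scaling on a 1080p screen can look cleaner than a native 4K image shown 1:1.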
What's the point of playing at 4K if the graphics, in reality, look exactly the same, just clearer? "Wow, we get a clear image of our mediocre graphics." I'm not saying graphics aren't improving, but they have really stagnated over the last two years in the console world, and improving them doesn't seem to be much of an emphasis in development.