@Alienware_fan said:
@killatwill15 said:
@the_bi99man said:
Absolutely not. Literally impossible. Why? Because massive resolution is one of the biggest reasons CGI movies can look as good as they do. That's the difference between pre-rendering (CGI movies) and real-time rendering (video games). CGI movies are pre-rendered at massive resolutions. Like, beyond 4K. Beyond 8K. Rendered on farms of computers, each more powerful than any gaming system. And even with those render farms, it might take hours to render a five-minute scene. That's where all the detail comes from in the first place. Then, video that's already been rendered is put on a disc, to be displayed at whatever resolution your player/TV allows. 1080p or 720p, or even less, if you're using a DVD rather than a Blu-ray.
Video games will have to be rendered at higher resolutions as well, in order to achieve that level of detail. And the whole point of real-time rendering, which allows interactivity (obviously necessary for a video game), is that rendering and displaying are done at the same time, and the rendering is done by the home device (be it a PC or a console), right there while you're playing. And if your PC or console is powerful enough to do that rendering at the resolution necessary for such detail, there's no reason to display it at a lower resolution.
So basically, what I'm getting at is that, eventually, graphical fidelity in video games will reach the point that CGI movies have already reached, but increasing resolution is one of the many things that will have to happen to get there. Resolution isn't a separate thing from "graphics", as the statement "720p is perfectly fine as long as the graphics are good" implies. There are many contributing factors to the picture fidelity that many gamers refer to with the blanket label of "graphics", and resolution is one of the most important ones.
Whoa, slow it down.
I'm pretty sure he's just jerking your chain, bro...
unless he's truly that stupid.
Okay, smart guy: tell me why Crysis 3, even at 1024x768, looks better than most games at 1600p?
Because there are other aspects of graphics besides resolution: things like texture resolution, particle effects, shading and lighting, animations, etc. But that doesn't mean resolution doesn't matter. They all do. It's a balancing act. I thought I explained that pretty clearly in the post quoted above.
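To make the "balancing act" concrete, here's a toy sketch with entirely invented numbers: a real-time renderer has to fit every one of those aspects into a single frame's time budget, so spending more on one aspect (like resolution) squeezes what's left for the others. The pass names and costs below are hypothetical, purely for illustration.

```python
# Illustrative only: hypothetical per-frame costs in milliseconds.
# A pre-rendered CGI frame has no such cap; it can take hours per frame.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

pass_costs = {
    "geometry":  4.0,
    "shadows":   3.0,
    "lighting":  4.5,
    "particles": 2.0,
    "post_fx":   2.5,
}

total = sum(pass_costs.values())
print(f"Frame cost: {total:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms budget")

# Quadrupling the pixel count (e.g. 1080p -> 4K) roughly quadruples
# pixel-bound work; assume (hypothetically) only these passes scale:
pixel_bound = {"lighting", "particles", "post_fx"}
scaled = sum(cost * (4 if name in pixel_bound else 1)
             for name, cost in pass_costs.items())
print(f"At 4x the pixels: {scaled:.1f} ms")  # now well over budget
```

With these made-up numbers the frame fits at 16.0 ms, but quadrupling the pixels pushes it to 43.0 ms, which is why resolution can't just be cranked up independently of everything else.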
As another example of why higher resolutions are needed, look at texture resolution in modern games. Some aren't too hot, but most graphically demanding games of recent years have high enough texture resolution (that is, enough detail in the textures themselves) that they visibly lose detail when the final output resolution of the full picture is anything less than 1080p.
Even Dark Souls, a game that certainly isn't winning any awards for technically stunning graphics, has high enough texture resolution that you don't see full detail when the output res is 720p. The PC version confirms this. It got a lot of shit for being a straight port; that is to say, the developer did absolutely nothing to improve the game for the PC version over the console versions. Which means, among other things, that the textures in the PC version are identical to the ones used on consoles.

And yet, when you use DSfix to raise the resolution to 1080p (from the 720p-ish res the consoles used, which is also the default on PC), you not only see fewer jaggies around edges, you actually see more detail in the textures as well. The textures are still exactly the same, but increasing the resolution of the final output image increases the level of detail visible in them. It also means they could have used lower-res textures in the console versions; the output resolution was bottlenecking the texture detail anyway, and it would have been literally impossible to tell if the textures themselves were slightly lower-res.
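The bottleneck described above can be sketched in a few lines of pure Python. This is a deliberately simplified 1-D, nearest-neighbour model (the texture width of 2048 texels is an invented figure, not Dark Souls' actual asset size): however detailed the texture is, the screen can only ever show as many distinct texels as it has pixels covering that surface.

```python
def visible_texels(texture_width, output_pixels):
    """Nearest-neighbour sampling: count how many distinct texels
    of a texture actually reach the screen at a given output width."""
    sampled = {int(px * texture_width / output_pixels)
               for px in range(output_pixels)}
    return len(sampled)

TEXTURE = 2048  # hypothetical texels across one surface

print(visible_texels(TEXTURE, 720))   # 720: only 720 distinct texels shown
print(visible_texels(TEXTURE, 1080))  # 1080: 1080 texels, visibly more detail
print(visible_texels(TEXTURE, 4096))  # past 2048, the texture is the limit
```

At 720 output pixels most of the texture's detail is discarded; raising the output to 1080 reveals more of the very same texture, and past the texture's own resolution the gains stop, which is exactly the "could have shipped lower-res textures" point above. (A real GPU uses mipmapping and filtering rather than nearest-neighbour, but the counting argument is the same.)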
And that's not uncommon at all. Even Path of Exile, another game that isn't making a run at a graphics-king title, has textures that keep revealing more detail up to 1440p/1600p.
Bottom line is, again, there are many factors contributing to the image quality of a video game, and resolution is one of them. They can all be improved individually, but there's a limit to how effective improving one aspect can be without improving the others alongside it. Generally, as graphics steadily improve, resolution is the factor that gets left behind in between occasional leaps. That's most likely because increasing resolution comes with the added cost and hassle of upgrading display equipment (monitors and TVs), not just rendering hardware and software. So resolution is always playing leapfrog, because nobody wants to buy a new monitor every time they buy a new video card. But that doesn't mean it can be left behind forever. It has to increase periodically, to give the other graphical aspects more headroom to improve. That's why resolution standards have increased as much as they have in the last 20 years, and why they'll continue to increase, albeit at a more staggered pace than things like effects and lighting.