Saren_Dredd's comments


I don't want to come off as pompous or privileged; I'm not. But my passion is gaming, and as such I've always saved up to get the absolute best in hardware. I game on my TV with my PC through Steam, so you know the method. My TV is capable of a 240Hz refresh rate, 3D, and Ultra HD (it's pretty much 4K, but it's not; when I tested them side by side, the 4K set didn't have any noticeable advantage over the one I chose -- and yes, I brought my laptop into the store, because this is exactly what I was going to use it for). My question is twofold:

Since there was no visual or performance difference between my TV (which is NOT 4K, but held up just as well as a brand-new Sony 4K set -- my TV is also made by Sony) and the "higher resolution" TV, does that mean my eye simply couldn't detect the difference, and thus any improvement the "better" TV offered was effectively invisible and therefore superfluous?
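To put rough numbers on that first question, here's a back-of-the-envelope sketch in Python. The 55-inch screen size is just my assumption for illustration, and the one-arcminute figure is the usual estimate for 20/20 visual acuity:

```python
import math

def max_resolvable_distance(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    """Distance (metres) beyond which individual pixels of a 16:9 display
    fall below the eye's angular resolution (~1 arcminute for 20/20 vision)."""
    # Horizontal width of a 16:9 panel from its diagonal, converted to metres
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    pixel_pitch_m = width_m / horizontal_px
    acuity_rad = math.radians(acuity_arcmin / 60)
    # Distance at which one pixel subtends exactly the acuity angle
    return pixel_pitch_m / math.tan(acuity_rad)

# Hypothetical 55-inch panel -- the size is only an assumption for illustration
for label, px in [("1080p", 1920), ("4K", 3840)]:
    d = max_resolvable_distance(55, px)
    print(f"{label}: pixels blend together beyond ~{d:.1f} m")
# 1080p: ~2.2 m, 4K: ~1.1 m
```

In other words, from a normal couch distance on a screen around that size, 1080p pixels are already below what the eye can resolve, so the extra 4K pixels have no way of showing up as a visible difference.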


Also, are there ANY games that take advantage of and actually USE all of the hardware we put all this money into obtaining? Will there ever be? (<<<-- main question)

I've played Crysis 1-3 and use them as my graphics testers, and I've maxed out Skyrim with Ultra HD texture mods, SMIM, etc. Neither of them, I feel, truly takes advantage of everything I have, and I do NOT have the best GPU. I have an ASUS Republic of Gamers laptop with an ATI Radeon HD 6800-series card (I upgraded from the HD 5870). The 5870 did the same job as the 6xxx even though they are vastly different in terms of rendering, and I haven't upgraded since, because no game out there even challenges what I have. This is 2011-2012 technology I'm talking about, and much better has come out since.

My real question is: why upgrade when what I have now does the trick? I don't understand why companies tout specs and performance enhancements like they're so important when the games don't take advantage of them. I've heard the argument about the eye not detecting anything over a certain FPS; I used to both build AND sell the TVs and GPUs/CPUs that made these enhancements possible, and I heard those arguments every day. I know how they work. What I don't know is why they're made when nothing seems able to take advantage of them. I'm not talking about "next gen" (which, let's face it, is really just a mid-powered PC); I'm talking about everything as a whole.

I'm sorry this has run on so long, but this is pretty heady stuff to talk about. I've heard there is no difference between 120Hz and 240Hz, and having reviewed both, there doesn't seem to be a difference at all. Even from 60 to 120, the difference is negligible. The problem is NOT the hardware; the problem is the sources feeding the hardware. Since our sources don't match the hardware (movies are shot at 24fps, meaning the 60-240fps output is rendered by the machine and is artificial), what is the point of all the advances? If we don't change the source, how is the machine (source: the camera/video recorder, machine: your TV) ever truly pushed to its limits and used?
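On the 24fps point specifically, here's a minimal sketch of the arithmetic (Python again; the 24/60/120/240 figures are just the standard film and panel rates):

```python
from fractions import Fraction

def cadence(source_fps, panel_hz):
    """How many panel refreshes each source frame occupies on a fixed-rate display.
    A non-integer result means uneven pulldown (e.g. 3:2) or interpolated frames."""
    return Fraction(panel_hz, source_fps)

for hz in (60, 120, 240):
    r = cadence(24, hz)
    note = ("even repeat -- no new picture information"
            if r.denominator == 1
            else "uneven pulldown or interpolated frames")
    print(f"24fps film on a {hz}Hz panel: each frame spans {float(r):g} refreshes ({note})")
```

So a 120Hz or 240Hz panel showing a film is mostly just repeating each source frame; any "extra" frames beyond that come from the TV's motion interpolation, not from the source, which is exactly why the source has to change before the hardware gets genuinely tested.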