It's easy to market tech products on a single number that stands in for performance; just as 1080p was once the big label to put on TVs, 4K is the popular buzz term today. You could draw a similar comparison to how PCs are sold: GHz is a simple metric to convey a computer's performance, along with core count and memory. Higher is better, though GHz by itself can be misleading, as many consumers are unaware of IPC or other factors like L2 and L3 cache.
Although frame rate should also be an easy marketing number, resolution is more closely tied to graphical fidelity, which is immediately obvious even at a glance, making it the more desirable trait to advertise. And as console developers push graphics as far as limited hardware allows, 60 fps is often unattainable, so there's less reason to market it.
All the other graphical settings (textures, level of detail, shadows, lighting, ambient occlusion, anti-aliasing, particles, vegetation, water effects, hair effects, screen space reflections, and so on) use basic descriptive terms like Low, Med, High, and Ultra (or sometimes just On or Off). While these provide a scale of quality, they're vague to the uninitiated as to how it all rates. They're far more generalized and less precise about how a game actually looks, and many people don't understand what some of these settings even do, such as occlusion or particles. A descriptive scale of Med, High, Ultra doesn't intuitively translate to a numeric one when there's no definitive criteria for the differences between tiers, which is only compounded by the lack of any universal measure. It's a sliding standard: each game has its own scale, and it shifts between generations.
Basically, it's all "loose and fuzzy" as a means of defining visual performance, so marketing always falls back to the easy-to-gauge numbers that are easy to sell.