So many threads, so many heated arguments, because most of you think you're true experts, yet you keep proving how ignorant you are when it comes to graphics hardware.
That's why last gen Sony execs could claim the PS3 was so powerful it could run games at 120fps, and everyone ate it up. That claim made about as much sense as the 4D claim. Yet you guys swallowed it, because you're clueless.
This gen it's the same thing. Now it's 1080p and 60fps. The PS4 and X1 can render every game at 1080p/60, but they don't, because developers want the games to actually look good. You got that? Devs SKIP 1080p/60 PRECISELY TO MAKE THE GAMES LOOK GOOD!
I know some of you are blown away by that, and some of you half get it but still think, "OK, that's true, but only because next-gen hardware is so crappy." Not true.
Even if the PS4/X1 had a 3.0GHz 8-core CPU and a Radeon R9 280X GPU, the games wouldn't be 1080p/60. The standard of graphics would be higher, and the devs, in order to MAKE THE GAMES LOOK THE BEST POSSIBLE, would cut the fps and/or the res.
I repeat: THE STANDARD OF GRAPHICS WOULD BE HIGHER, but to get the most out of the FIXED hardware the devs would cut the average fps and/or resolution.
Just to push my point to the extreme: do you guys realize that devs could make PS4/X1 games at 4K and 120fps IF THEY CUT DOWN THE DETAIL AND EFFECTS DRAMATICALLY? Or they could crank the detail and graphical effects super high but run the game at 480p and 5fps. Both cases would look like garbage.
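If you want to see why that tradeoff is forced, here's a back-of-envelope sketch in Python. The budget number and the "ops per pixel" model are made up purely for illustration (real GPUs are limited by bandwidth, geometry, CPU time and more, not one magic number), but the arithmetic is the point: on fixed hardware, resolution × framerate × per-pixel work has to fit inside a fixed budget.

```python
# Back-of-envelope: on fixed hardware, total throughput is (roughly) fixed,
# so per-pixel work (detail/effects) must shrink as resolution x fps grows.
# BUDGET_OPS_PER_SEC is a made-up round number, NOT a real PS4/X1 spec.
BUDGET_OPS_PER_SEC = 1e12

RESOLUTIONS = {
    "480p":  (854, 480),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

def per_pixel_budget(res_name: str, fps: int) -> float:
    """Shader ops available per pixel per frame at a given res/fps."""
    w, h = RESOLUTIONS[res_name]
    pixels_per_second = w * h * fps
    return BUDGET_OPS_PER_SEC / pixels_per_second

for res, fps in [("4K", 120), ("1080p", 60), ("480p", 5)]:
    print(f"{res}@{fps}fps -> {per_pixel_budget(res, fps):,.0f} ops per pixel")

# Prints roughly:
#   4K@120fps   ->   1,005 ops per pixel  (almost no budget left for effects)
#   1080p@60fps ->   8,038 ops per pixel  (the usual balance)
#   480p@5fps   -> 487,900 ops per pixel  (tons of effects, unplayable slideshow)
```

Double the budget (a console twice as powerful) and every number doubles, but the tradeoff itself never goes away, which is exactly the point about the R9 280X above.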
They optimize the game to look the best on the fixed hardware. They crank effects up as high as they can while maintaining a good res and playable fps. It's a balance they need to strike to make the game look the best. And if the PS4/X1 were twice as powerful, the same would be true.
Why are so many guys here confused by the difference between fixed hardware and an upgradeable PC?