When did framerate become the de facto thing to argue about?

#51 Posted by clone01 (24930 posts) -

Technology advances. I mean, I remember Final Fantasy VII on PS1 being the ultimate graphics experience for me. Framerates are like that too; gamers will expect more and more. That being said, whatever the number is... is it fun to play? Does it look good? Well, there you go.

#52 Edited by GrenadeLauncher (5493 posts) -

Because lemmings got this turd at 60FPS and they think it vindicates the Xbone or something.

#53 Edited by magicalclick (23010 posts) -

Mainly because the PS3 ran multiplats consistently slower than the Xbox 360. So it started last gen.

#54 Posted by Wild_man_22 (650 posts) -

When marketers turned it into a selling point for their consoles. I think game performance is important, but this whole "1080p 60fps" thing has turned into a buzz phrase for the most part. Now almost every game has it labeled on its box or in a press release, whether it's true or not.

It's great that people are more interested now in polished, well-made experiences. But it seems like a lot of people use it more as a selling point or bragging rights.

#55 Posted by Roler42 (782 posts) -

I used to play PC games at the lowest of low settings and still get 15-20 fps. I didn't care, because I knew I was limited by my hardware. Once I upgraded? I can't go back to anything that low again. It's not like people didn't care; even "back in the day" people knew the limitations of their hardware and why they were getting such low framerates.

"Having fun" is the lazy way of thinking and that's precisely why towards the start of this gen we started to get games with 15 fps/30 fov, it's mediocre

#56 Posted by darkangel115 (1842 posts) -

Honestly, I think it's only big because of cows. Hermits always did (and still do) have bragging rights over console power, but this is the first time in 20+ years that Sony has what is considered the more powerful system (for the moment). If you look back, the PS1 was overpowered by the N64, the PS2 was overpowered by the Xbox (the largest gap in power we've ever seen), and the PS3 was overpowered by the 360. Now the PS4 is off to an early start as the "more powerful system," so cows are rejoicing because they aren't used to it? That's the best guess I can think of. And still, while being underpowered, the PS brand has sold over 250 million consoles despite being weaker than its competition. I think that alone is further proof that power means nothing and it's all fanboy BS.

#57 Edited by I_can_haz (6551 posts) -

LOL @ lems suddenly not caring about frame rates and resolution when that's all they talked about last gen.

#58 Posted by blackace (21073 posts) -

@bldgirsh said:

When the ~~baby boomers~~ cows of next-gen came.

I fixed it for you.

#59 Posted by scatteh316 (4979 posts) -

@darkangel115 said:

Honestly, I think it's only big because of cows. Hermits always did (and still do) have bragging rights over console power, but this is the first time in 20+ years that Sony has what is considered the more powerful system (for the moment). If you look back, the PS1 was overpowered by the N64, the PS2 was overpowered by the Xbox (the largest gap in power we've ever seen), and the PS3 was overpowered by the 360. Now the PS4 is off to an early start as the "more powerful system," so cows are rejoicing because they aren't used to it? That's the best guess I can think of. And still, while being underpowered, the PS brand has sold over 250 million consoles despite being weaker than its competition. I think that alone is further proof that power means nothing and it's all fanboy BS.

I would actually say the gap between the PS4 and Xbone is quite a bit bigger than the gap between the PS2 and the original Xbox.

#60 Posted by darkangel115 (1842 posts) -

@scatteh316 said:

@darkangel115 said:

Honestly, I think it's only big because of cows. Hermits always did (and still do) have bragging rights over console power, but this is the first time in 20+ years that Sony has what is considered the more powerful system (for the moment). If you look back, the PS1 was overpowered by the N64, the PS2 was overpowered by the Xbox (the largest gap in power we've ever seen), and the PS3 was overpowered by the 360. Now the PS4 is off to an early start as the "more powerful system," so cows are rejoicing because they aren't used to it? That's the best guess I can think of. And still, while being underpowered, the PS brand has sold over 250 million consoles despite being weaker than its competition. I think that alone is further proof that power means nothing and it's all fanboy BS.

I would actually say the gap between the PS4 and Xbone is quite a bit bigger than the gap between the PS2 and the original Xbox.

Then I would actually say you are highly inaccurate.

#61 Posted by ActicEdge (24492 posts) -

Framerate has always been a big deal to me. It doesn't have to be 60 FPS, though I would personally want that, but it can't run like a stuttering piece-of-shit mess like tons of games on the PS3/360.

#62 Posted by BldgIrsh (2779 posts) -

@ActicEdge: Then that raises the question of whether a game billed as 60 FPS actually stays locked at 60 FPS without dipping below ~55.

#63 Posted by ActicEdge (24492 posts) -

@bldgirsh said:

@ActicEdge: Then that raises the question of whether a game billed as 60 FPS actually stays locked at 60 FPS without dipping below ~55.

Most games that claim to run at 60 FPS don't stay 100% locked at 60; they generally float between 55-60 FPS. That's fine, IMO. The difference between 55 and 60 is way less noticeable than between 25 and 30.
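Worth noting why that asymmetry holds: perceived smoothness tracks frame time, not the fps number, and the same 5 fps swing costs far more milliseconds per frame at the bottom of the scale. A quick back-of-the-envelope sketch in Python (illustrative values only):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# The same 5 fps swing, at the top and bottom of the scale:
drop_high = frame_time_ms(55) - frame_time_ms(60)  # ~1.5 ms longer per frame
drop_low = frame_time_ms(25) - frame_time_ms(30)   # ~6.7 ms longer per frame

print(f"60 -> 55 fps: +{drop_high:.1f} ms per frame")
print(f"30 -> 25 fps: +{drop_low:.1f} ms per frame")
```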

#64 Posted by Wasdie (50245 posts) -

Since polygon counts, texture resolution, lighting, shading, and other graphical effects have hit a point where the Xbox One/PS4/PC are all relatively comparable, and any changes or additions to fidelity are much more subtle.

Resolution and framerate are a good measure of the difference between the consoles. It's pretty clear that if one console is rendering the scene with the same amount of fidelity as the other console but doing it 30 more times per second, that console is the more powerful of the two.

Also, resolution is starting to come into play. All of the polygons, high-resolution textures, lighting, and shading in the world don't mean squat if you don't have enough pixels to display the info. On the PC we're starting to see gamers turn to downsampling to increase picture quality. Even if you're downsampling from something like 4k to 1080p, you'll have a far more detailed image than if you just rendered at 1080p, due to the original render having more pixels for each effect to apply to.
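To make the downsampling argument concrete: rendering at a higher resolution and scaling down means each displayed pixel is an average of several rendered samples, which is exactly what anti-aliasing tries to approximate. A minimal sketch of the arithmetic (plain Python; the resolutions are just examples):

```python
# Rendered samples that feed each displayed pixel when downsampling.
def samples_per_display_pixel(render_res, display_res):
    rw, rh = render_res
    dw, dh = display_res
    return (rw * rh) / (dw * dh)

# Downsampling 4k to 1080p: every output pixel averages 4 rendered samples.
print(samples_per_display_pixel((3840, 2160), (1920, 1080)))  # 4.0
# Downsampling 8k to 1080p: 16 samples per output pixel.
print(samples_per_display_pixel((7680, 4320), (1920, 1080)))  # 16.0
```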

#65 Posted by GreySeal9 (24859 posts) -

The only reason the TC is making this thread is because people were calling out Borderlands on the Vita for having a poor framerate and he couldn't handle it.

#66 Edited by Grey_Eyed_Elf (3912 posts) -

@Wasdie said:

Since polygon counts, texture resolution, lighting, shading, and other graphical effects have hit a point where the Xbox One/PS4/PC are all relatively comparable, and any changes or additions to fidelity are much more subtle.

Resolution and framerate are a good measure of the difference between the consoles. It's pretty clear that if one console is rendering the scene with the same amount of fidelity as the other console but doing it 30 more times per second, that console is the more powerful of the two.

Also, resolution is starting to come into play. All of the polygons, high-resolution textures, lighting, and shading in the world don't mean squat if you don't have enough pixels to display the info. On the PC we're starting to see gamers turn to downsampling to increase picture quality. Even if you're downsampling from something like 4k to 1080p, you'll have a far more detailed image than if you just rendered at 1080p, due to the original render having more pixels for each effect to apply to.

Downsampling is wonderful.

I found that playing Battlefield on high settings at 2560x1080 with 150% resolution scale and 2x MSAA looks better than just running the game at 2560x1080 on ultra... I don't know why; it just looks more crisp.

Same with DayZ... but I wouldn't recommend running DayZ with downsampling, since it will kill your GPU.

#67 Posted by Wasdie (50245 posts) -

@Grey_Eyed_Elf said:

@Wasdie said:

Since polygon counts, texture resolution, lighting, shading, and other graphical effects have hit a point where the Xbox One/PS4/PC are all relatively comparable, and any changes or additions to fidelity are much more subtle.

Resolution and framerate are a good measure of the difference between the consoles. It's pretty clear that if one console is rendering the scene with the same amount of fidelity as the other console but doing it 30 more times per second, that console is the more powerful of the two.

Also, resolution is starting to come into play. All of the polygons, high-resolution textures, lighting, and shading in the world don't mean squat if you don't have enough pixels to display the info. On the PC we're starting to see gamers turn to downsampling to increase picture quality. Even if you're downsampling from something like 4k to 1080p, you'll have a far more detailed image than if you just rendered at 1080p, due to the original render having more pixels for each effect to apply to.

Downsampling is wonderful.

I found that playing Battlefield on high settings at 2560x1080 with 150% resolution scale and 2x MSAA looks better than just running the game at 2560x1080 on ultra... I don't know why; it just looks more crisp.

Same with DayZ... but I wouldn't recommend running DayZ with downsampling, since it will kill your GPU.

You should run at 200% and 0 MSAA. MSAA is a supersampling method, so at 150% you're already supersampling. Generally, when supersampling, you don't want to apply any AA.

AA is a temporary thing. As GPUs become more and more powerful and can handle higher and higher resolutions, AA will become a thing of the past. People downsample 4k, 6k, or 8k to 1080p (they do on NeoGAF) and the results are absolutely amazing. It's CGI quality in real time. Granted, when rendering at 8k in real time you only get a handful of frames per second, but the fact that it can be done is the first step.
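For anyone weighing the 150% and 200% settings from this exchange: the in-game resolution scale multiplies each axis, so the pixel cost, and roughly the GPU cost, grows with its square. A rough sketch (plain Python; 2560x1080 is just the resolution mentioned above):

```python
# Pixels rendered at a given resolution scale. The slider multiplies
# each axis, so total pixel count grows with the square of the scale.
def scaled_pixels(width: int, height: int, scale: float) -> int:
    return int(width * scale) * int(height * scale)

native = scaled_pixels(2560, 1080, 1.0)
for scale in (1.0, 1.5, 2.0):
    pixels = scaled_pixels(2560, 1080, scale)
    print(f"{scale:.0%} scale: {pixels} pixels ({pixels / native:.2f}x native)")
```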

#68 Edited by Grey_Eyed_Elf (3912 posts) -

@Wasdie said:

@Grey_Eyed_Elf said:

@Wasdie said:

Since polygon counts, texture resolution, lighting, shading, and other graphical effects have hit a point where the Xbox One/PS4/PC are all relatively comparable, and any changes or additions to fidelity are much more subtle.

Resolution and framerate are a good measure of the difference between the consoles. It's pretty clear that if one console is rendering the scene with the same amount of fidelity as the other console but doing it 30 more times per second, that console is the more powerful of the two.

Also, resolution is starting to come into play. All of the polygons, high-resolution textures, lighting, and shading in the world don't mean squat if you don't have enough pixels to display the info. On the PC we're starting to see gamers turn to downsampling to increase picture quality. Even if you're downsampling from something like 4k to 1080p, you'll have a far more detailed image than if you just rendered at 1080p, due to the original render having more pixels for each effect to apply to.

Downsampling is wonderful.

I found that playing Battlefield on high settings at 2560x1080 with 150% resolution scale and 2x MSAA looks better than just running the game at 2560x1080 on ultra... I don't know why; it just looks more crisp.

Same with DayZ... but I wouldn't recommend running DayZ with downsampling, since it will kill your GPU.

You should run at 200% and 0 MSAA. MSAA is a supersampling method, so at 150% you're already supersampling. Generally, when supersampling, you don't want to apply any AA.

AA is a temporary thing. As GPUs become more and more powerful and can handle higher and higher resolutions, AA will become a thing of the past. People downsample 4k, 6k, or 8k to 1080p (they do on NeoGAF) and the results are absolutely amazing. It's CGI quality in real time. Granted, when rendering at 8k in real time you only get a handful of frames per second, but the fact that it can be done is the first step.

I tried 200% with no AA, and I had to drop the settings to medium in order to salvage a playable framerate. Also, with the MSAA... I find that it looks better at 150% with 2x MSAA than it does with just 150% resolution on its own. Strange.

I would love to play Borderlands 2 with 200% downsampling. That would look stunning.

#69 Posted by DeusEx_Squid (15 posts) -

People love to brag about how much better games are on their system. Games have made huge graphical leaps since "the good old days". Imagine playing The Witcher 3 or Batman: Arkham Knight at 15-20 fps; everybody would go completely mental.

#70 Edited by PAL360 (27005 posts) -

I remember how bad Daytona USA looked and played on Saturn compared to the 60fps arcade version! Is framerate the most important aspect for me to enjoy a game? Not at all. Does high framerate make the experience better? Hell yeah!

#71 Edited by delta3074 (18498 posts) -

@bldgirsh said:

@I_can_haz: Morrowind came out in 2002... he was talking about back in the day... when NO ONE cared about FPS.

I care about framerates and performance more than resolution.

@santoron said:

@kingtito said:

@santoron said:

Agreed, except... this is SW. This is where people go to fight about game systems. When frame rates are different, the guy with the higher rate is going to claim a victory.

It's silly, but it's also why people come here instead of General Games Discussion.

What, you're not going to cry in this thread?

Oh look. I've got my own Troll.

And I will pet him and I will call him George...

Post of the year, man. Hands down, pure vintage. I laughed myself stupid.

#72 Edited by ReadingRainbow4 (14677 posts) -

How the hell did you play Morrowind at 18 fps?

It would have been so damn choppy, ew. And that's the best Elder Scrolls :(

#73 Posted by ConanTheStoner (5936 posts) -

Eh.. I always cared about framerate in a way. Not that you'll see me arguing about it on SW though.

Of course, back in the day a shoddy frame rate was the norm, even in some of the biggest games. Times change, though, and once you experience something good you don't really want to deal with a shit experience. Once a game goes sub-30 fps, I really don't want to play it.

I've never been too picky with graphics, but framerate can really make or break an experience sometimes.

#74 Posted by ReadingRainbow4 (14677 posts) -

Really, I only care about frame rate to the point where it gives me a smooth experience.

Can't wait to play GTA 5 on PC; those 20 fps drops were fucking bullshit on the consoles.

#75 Posted by Old_Gooseberry (3795 posts) -

I've been aware of my fps in games ever since the late '90s; it's not a new thing. It's important to get the most out of your hardware and games. It only really became important when games went full 3D and you needed a good framerate to sync with your monitor and provide smooth gameplay. You had to play Morrowind at 20 fps? No wonder you're all angry and stuff.

As for 60 fps, it's an important number for keeping in sync with your monitor, assuming it's 60 Hz. If you drop down a lot, you see lag or screen tearing/blurring; mostly graphical lag and sluggishness, though. I forget when Quake 3 came out, but that game was all about the fps: the more, the better. I can't remember how much fps I had in it, but it was easily 100+, and probably on a crappy Voodoo 3 card as well. FPS was key even back then; shooters were pure speed, and you had to have fps.
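The sync point deserves spelling out: with vsync on a 60 Hz monitor, a frame that misses the ~16.7 ms refresh deadline is held for a whole extra refresh, so the effective framerate snaps down to 60, 30, 20, and so on rather than degrading smoothly. A minimal sketch of that quantization (plain Python, hypothetical frame times):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def effective_fps(render_ms: float) -> float:
    # With vsync, a frame stays on screen for a whole number of refreshes.
    refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
    return REFRESH_HZ / refreshes

print(effective_fps(15.0))  # makes the deadline    -> 60.0
print(effective_fps(17.0))  # barely misses it      -> 30.0
print(effective_fps(35.0))  # misses two refreshes  -> 20.0
```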

#76 Posted by PsychoLemons (2123 posts) -

I thought the standard was 60 fps and above.

#77 Edited by Midnightshade29 (5361 posts) -

@Old_Gooseberry said:

I've been aware of my fps in games ever since the late '90s; it's not a new thing. It's important to get the most out of your hardware and games. It only really became important when games went full 3D and you needed a good framerate to sync with your monitor and provide smooth gameplay. You had to play Morrowind at 20 fps? No wonder you're all angry and stuff.

As for 60 fps, it's an important number for keeping in sync with your monitor, assuming it's 60 Hz. If you drop down a lot, you see lag or screen tearing/blurring; mostly graphical lag and sluggishness, though. I forget when Quake 3 came out, but that game was all about the fps: the more, the better. I can't remember how much fps I had in it, but it was easily 100+, and probably on a crappy Voodoo 3 card as well. FPS was key even back then; shooters were pure speed, and you had to have fps.

Haha... yeah... I had to play Morrowind at 20 fps because I was still sporting a Voodoo 5 5500 at the time. I then got a GeForce 3 and was enthralled by the newness of pixel shaders, with shiny water for the first time. :) I did get higher fps, but not by much, as I pumped everything to max... at 1024x768. That was great back then... lol.

I do agree that with multiplayer in Quake 3 and UT99, fps was important, and it's still important in competitive games... but when people whine about a 10 fps dip from 60 in a single-player game, I think it's not the big deal people make it out to be.

I've been benchmarking for years, ever since 3DMark99 was out, and I used to love seeing the new effects and how fast my graphics card could run. But when playing a single-player game, I will sacrifice frame rate for visuals as long as the game is still playable and smooth.

#78 Edited by DJ_Headshot (6281 posts) -

I don't doubt you were able to enjoy a game at 18-20 fps, but the difference between 20 fps and 60 fps is enormous, let alone higher framerates. That's what I remember from playing Half-Life 2 on my PC's integrated graphics at the time: I could max it out, minus AA, at 1280x1024, but it would run at a constant 20 fps. I bought a new graphics card and PSU to upgrade that PC; graphically it was the same game, just with AA smoothing out all the jaggies, but holy shit, the massive difference in smoothness was what really stuck out. Going from 20 fps to 75 fps was truly mind-blowing at the time and greatly improved the experience; it totally made the upgrade worthwhile. And I say 75 fps since my monitor at the time had a 75 Hz refresh rate, so it was even smoother than a normal 60 Hz monitor, making the difference even bigger in comparison, but even on a 60 Hz screen the difference is still huge.

#79 Posted by CrownKingArthur (5070 posts) -

@DJ_Headshot: Yeah, I've had 75 Hz before. I found it much better than 60 Hz.

And I read your entire story.

#80 Posted by wis3boi (31582 posts) -

There's no room for error at 30. If it drops at all, it looks like ass.

#81 Posted by OhSnapitz (18464 posts) -

Gamers (some apparently, not all) are paying $400+ for machines that are a drastic upgrade from their predecessors. That being the case, the 360/PS3 were able to "lock" a game at 30 FPS, so it's only a natural reaction when games of this gen are struggling to do so (in some cases).

Things like that should be pointed out. That's not to say gamers will simply stop playing on said systems, but there's nothing wrong with challenging a dev to get better performance out of a title.

#82 Edited by Gue1 (10398 posts) -

@I_can_haz said:

LOL @ lems suddenly not caring about frame rates and resolution when that's all they talked about last gen.

Lems the flip-floppers.

A 1-frame and 3-pixel advantage for the Xbox was seen as something huge. The PS4 running games at twice the framerate and resolution of the Xbone? That doesn't matter. lol

#83 Edited by Old_Gooseberry (3795 posts) -

@GrenadeLauncher said:

Because lemmings got this turd at 60FPS and they think it vindicates the Xbone or something.

What game is that from? It's not from this gen, is it? I've been playing NFS Most Wanted 2005 on my PC lately, and it looks much better than whatever game that is you linked.

Here are some screenshots I took of NFS Most Wanted 2005 below; the trees look better than in that game, and it's almost 10 years old. And there are actually shadows visible.

#84 Posted by GrenadeLauncher (5493 posts) -

@Old_Gooseberry said:

@GrenadeLauncher said:

Because lemmings got this turd at 60FPS and they think it vindicates the Xbone or something.

What game is that from? It's not from this gen, is it? I've been playing NFS Most Wanted 2005 on my PC lately, and it looks much better than whatever game that is you linked.

Here are some screenshots I took of NFS Most Wanted 2005 below; the trees look better than in that game, and it's almost 10 years old. And there are actually shadows visible.

That, my friend, is Forza 5, only possible using the power of Xbox One.

#85 Posted by Old_Gooseberry (3795 posts) -

@GrenadeLauncher said:

@Old_Gooseberry said:

@GrenadeLauncher said:

Because lemmings got this turd at 60FPS and they think it vindicates the Xbone or something.

What game is that from? It's not from this gen, is it? I've been playing NFS Most Wanted 2005 on my PC lately, and it looks much better than whatever game that is you linked.

Here are some screenshots I took of NFS Most Wanted 2005 below; the trees look better than in that game, and it's almost 10 years old. And there are actually shadows visible.

That, my friend, is Forza 5, only possible using the power of Xbox One.

Umm... lol, wtf? Simple trees should at least look as good as in a 2005 game. Thanks for letting me know what game it was.

I looked up the screenshots for Forza 5 just to be sure you weren't kidding, and they are very inconsistent. How did it get a 9 out of 10 for a next-gen game, I wonder? The cars look excellent, but some of the environments look like smudgy/blurry mid-to-early-2000s graphics.

#86 Posted by harry_james_pot (11024 posts) -

People can argue about whether 30 fps is playable or not... but 20? That's choppy as hell; how can anyone play like that?

#87 Posted by SecretPolice (22439 posts) -

@Old_Gooseberry:

Ummm, try playing that awesome Driver and you'll see why it's AAAE. Besides, all that matters these dayzzz is it's 1080P at 60 fps... at least that's all I hear here in SW.... amrite?

#88 Posted by parkurtommo (27366 posts) -

This is a relative issue. People who are used to gaming at 60 fps will find 20 fps completely unplayable. I myself was a console gamer until about 2 or 3 years ago, and I had been getting 60 fps in most games until recently, because of the lead-up to next gen. So I'm completely fine with 30 fps in games that don't require significant input: basically cinematic games, if you will. In shooters and action-adventure games I'm usually OK with 30-40 fps, which is what I've been playing Watch Dogs at. However, in games that require dexterity, anything less than 60 fps is unacceptable for me.

But generally speaking, I don't give a crap about sub-60 fps; sub-30 fps is noticeable, though, and I think this goes for most people.

#89 Posted by GrenadeLauncher (5493 posts) -

@Old_Gooseberry said:

Umm... lol, wtf? Simple trees should at least look as good as in a 2005 game. Thanks for letting me know what game it was.

I looked up the screenshots for Forza 5 just to be sure you weren't kidding, and they are very inconsistent. How did it get a 9 out of 10 for a next-gen game, I wonder? The cars look excellent, but some of the environments look like smudgy/blurry mid-to-early-2000s graphics.

That's what they had to do to get it to 1080p/60fps.

#90 Posted by DJ_Headshot (6281 posts) -

@CrownKingArthur said:

@DJ_Headshot: Yeah, I've had 75 Hz before. I found it much better than 60 Hz.

And I read your entire story.

It's a 25% increase over 60 Hz, so definitely better. I overclocked my monitor to 71.4 Hz, just shy of a 20% increase in refresh rate over 60 Hz, and I can notice the difference in smoothness in games. I can only imagine how big the difference would be with a 120 Hz or 144 Hz monitor!

#91 Posted by ShutUpFanboys (55 posts) -

GrenadeLauncher, why don't you post some screenshots of Forza 5 other than that ONE shot you always use in every thread?

You're a goddamn broken record, using that same shot endlessly.