When did framerate become the de facto thing to argue about?

#1 Posted by Midnightshade29 (5535 posts) -

Seriously. I remember playing games on PC years ago with an outdated card... playing something like Morrowind at like 18-20fps... and guess what, I didn't care an ounce. It was completely playable. Did it make the game less enjoyable? Hell no.

I used to have a 3dfx Voodoo 3 back in the day and was glad just to have 3D acceleration... Did I expect 60fps out of every game? No. Why is it something that's expected today?

I usually would pump the graphics up to max and have a lowish frame rate... and it was fine.

Where did this obsession with having a locked 30 or 60fps come from? I don't remember it being discussed years ago. As long as the game is playable, it shouldn't matter. There have been a number of games that reviewers bashed for low fps, yet I found them fine.

Shouldn't we be concerned with whether the game is fun or not, not whether it hits 30 or 60 all the time? I couldn't care less as long as it's playable. And a few fps dips don't make a game unplayable, though of course the internet seems to think so...

#2 Posted by BldgIrsh (2998 posts) -

When the baby boomers of next-gen came.

#3 Posted by santoron (7988 posts) -

Agreed, except... this is SW. This is where people go to fight about game systems. When frame rates are different, the guy with the higher rate is going to claim victory.

It's silly, but it's also why people come here instead of General Games Discussion.

#4 Edited by I_can_haz (6511 posts) -

@Midnightshade29 said:

Seriously. I remember playing games on PC years ago with an outdated card... playing something like Morrowind at like 18-20fps... and guess what, I didn't care an ounce. It was completely playable. Did it make the game less enjoyable? Hell no.

I used to have a 3dfx Voodoo 3 back in the day and was glad just to have 3D acceleration... Did I expect 60fps out of every game? No. Why is it something that's expected today?

I usually would pump the graphics up to max and have a lowish frame rate... and it was fine.

Where did this obsession with having a locked 30 or 60fps come from? I don't remember it being discussed years ago. As long as the game is playable, it shouldn't matter. There have been a number of games that reviewers bashed for low fps, yet I found them fine.

Shouldn't we be concerned with whether the game is fun or not, not whether it hits 30 or 60 all the time? I couldn't care less as long as it's playable. And a few fps dips don't make a game unplayable, though of course the internet seems to think so...

Not to sound like a douche, but games that drop to 18-20fps become unplayable for me unless it's only for an extremely short period of time.

I do agree with you that it shouldn't matter whether a game is 30 or 60fps if it's fun, though, and for me it doesn't. If I can get 60fps I always go for it, but if for whatever reason a dev can't reach that standard, I don't get mad unless it's a fighting game or a driving simulator.

#5 Posted by Pray_to_me (3059 posts) -

On PC people expect 60+ fps on PS360/iPad ports.

#6 Posted by BldgIrsh (2998 posts) -

@I_can_haz: Morrowind came out in 2002... he was talking about back in the day, when NO ONE cared about FPS.

#7 Posted by TheRealBigRich (756 posts) -

It had to be when this gen came, because I remember the PS3/360 days and never even heard about fps or resolution. I heard that some games played better on 360, and that was my primary gaming machine, but mainly because it had the games I liked, the controller was more comfortable to me, and the online experience was better most of the time (mainly after the PSN hack).

#8 Edited by I_can_haz (6511 posts) -

@bldgirsh said:

@I_can_haz: Morrowind came out in 2002... he was talking about back in the day, when NO ONE cared about FPS.

Well, that may be his personal view, but I've always cared about fps. I remember tweaking settings to make sure I had above 60fps in Quake back in the day. My eyes are extremely sensitive to frame drops and screen tearing.

#9 Posted by CrownKingArthur (5262 posts) -
@bldgirsh said:

When the baby boomers of next-gen came.

bravo sir, that amused me.

on topic: when i was at high school in 'the 3rd form' (so around 1998), there was another turd former who caught the same bus as me - and he was a 120fps enthusiast.

so, we have had framerate enthusiasts for at least 16 years. that same individual was quite the hermit about all aspects of graphics tbh; quite impressive nobody ever beat him up.

#10 Posted by sam890 (1108 posts) -

@Pray_to_me said:

On PC people expect 60+ fps on PS360/iPad ports.

We shouldn't expect it?

#11 Edited by GunSmith1_basic (9885 posts) -

Framerate is important because devs that ignore it are trying to boost how their game looks in trailers and screenshots at the expense of the experience. Framerate is integral to gameplay: it makes the challenge easier to see and respond to.

It is appropriate that you bring up the PC experience of the past. PC gaming is diverse in every way. There were many games that chose to emphasize visuals at the expense of gameplay (and the reverse is also true).

#12 Posted by Cloud_imperium (5847 posts) -

Since the reveal of "teh powerful cell processor"

#13 Posted by kingtito (5242 posts) -

@santoron said:

Agreed, except... this is SW. This is where people go to fight about game systems. When frame rates are different, the guy with the higher rate is going to claim victory.

It's silly, but it's also why people come here instead of General Games Discussion.

What, you're not going to cry in this thread?

#14 Edited by lamprey263 (25398 posts) -

Fanboyism is why: when one feature gives one faction a reason to claim superiority over another, they're going to exploit it. Simple as that.

#15 Posted by Gaming-Planet (14360 posts) -

Because companies like Microsoft and Sony keep using it to sell their consoles.

#16 Edited by f50p90 (3767 posts) -

When multiplats became 90% of the industry and the little nuances became the only thing to argue about. When you were playing Morrowind, you could simply lol at the consolites who couldn't play it. Now they have your Oblivions and Skyrims, so you can only lol at the graphics/performance.

#17 Edited by santoron (7988 posts) -

@kingtito said:

@santoron said:

Agreed, except... this is SW. This is where people go to fight about game systems. When frame rates are different, the guy with the higher rate is going to claim victory.

It's silly, but it's also why people come here instead of General Games Discussion.

What, you're not going to cry in this thread?

Oh look. I've got my own Troll.

And I will pet him and I will call him George...

#18 Edited by D4RKL1NG (271 posts) -

Resolution/FPS is a fad that will eventually disappear as the gen goes on. The next weapon used to fight the good fight in SW will be which console has the best female representation in games. Brace yourself, the great shitstorm of feminazis is coming.

#19 Edited by vashkey (33768 posts) -

1. Because a handful of games hit 60fps on PS4 but not on Xbox One, so now it matters.

2. With new hardware, many people expected 60fps to become the standard. The fact that both platforms still don't always hit it frustrates some people.

I think if you want games to really push graphics or scope this gen, then you're probably going to get 30fps. If you want 60, then the games are probably not going to look a whole lot bigger or more action-packed than some of the better games of last gen. Eh, I could be wrong though.

Personally, it's not a huge deal to me. I really think competitive games should strive for 60fps, as should spectacle fighters like Bayonetta. Just about anything else is acceptable at 30, but 60 is preferable.

I do think the sudden unacceptability of 30fps among the users on this board is ridiculous. Among the console-heavy players, I bet almost anyone here's favorite console game of the previous generation ran at 30fps, and they didn't even notice or think to complain about it back then. CoD and Platinum's games were some of the only ones that consistently aimed for 60.

#20 Edited by mems_1224 (48412 posts) -

When Sony stopped having the worst version of every multiplat with the PS4.

#21 Edited by scatteh316 (5021 posts) -

30fps and lower is very, very sluggish to me and my eyes. At minimum I could cope with 50fps, but ideally 60fps+.

And it's hard to explain what a good frame rate means, and how much it affects playability and input response, to a bunch of people who've never experienced high frame rates, i.e. console players.

You can't really miss 60fps+ if you've never really had it.
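
To put rough numbers on that input-response point: frame time is just the reciprocal of frame rate, so one frame of delay at 30fps lasts twice as long as one at 60fps. A minimal Python sketch (the framerates listed are illustrative, not taken from any particular game):

    def frame_time_ms(fps: float) -> float:
        """Milliseconds each frame stays on screen at a given frame rate."""
        return 1000.0 / fps

    # One frame of input-to-display delay at each rate:
    for fps in (20, 30, 50, 60, 120):
        print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
    # 60 fps -> 16.7 ms, 30 fps -> 33.3 ms: a one-frame delay at 30 fps
    # is twice as long, which is the input-response gap described above.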

#22 Posted by AK_the_Twilight (285 posts) -

@scatteh316 said:

30fps and lower is very, very sluggish to me and my eyes. At minimum I could cope with 50fps, but ideally 60fps+.

And it's hard to explain what a good frame rate means, and how much it affects playability and input response, to a bunch of people who've never experienced high frame rates, i.e. console players.

You can't really miss 60fps+ if you've never really had it.

This. Unless you've seen 60fps consistently, it doesn't really matter. The frame rate issue isn't too important, but I'd rather have people yearn for a better frame rate than stupid resolution issues.

While the frame rate difference between 30 and 60 is absolutely noticeable in comparison, resolution is like this generation's "blast processing": a term used by console zealots and marketing executives to say their console is better without knowing anything about how it works or the difference it makes.

#23 Posted by SolidTy (45767 posts) -

I remember when the original TimeSplitters and the original Devil May Cry (2001) ran at 60fps and PR touted them as such. I remember when the Ratchet & Clank games were 60fps...

Resolution became a big deal for consoles around 2004-2005, but framerate poked its head in around 1999 on consoles. Before that, it was more of a PC thing.

#24 Posted by GamersJustGame (323 posts) -

@Midnightshade29 said:

Seriously. I remember playing games on PC years ago with an outdated card... playing something like Morrowind at like 18-20fps... and guess what, I didn't care an ounce. It was completely playable. Did it make the game less enjoyable? Hell no.

I used to have a 3dfx Voodoo 3 back in the day and was glad just to have 3D acceleration... Did I expect 60fps out of every game? No. Why is it something that's expected today?

I usually would pump the graphics up to max and have a lowish frame rate... and it was fine.

Where did this obsession with having a locked 30 or 60fps come from? I don't remember it being discussed years ago. As long as the game is playable, it shouldn't matter. There have been a number of games that reviewers bashed for low fps, yet I found them fine.

Shouldn't we be concerned with whether the game is fun or not, not whether it hits 30 or 60 all the time? I couldn't care less as long as it's playable. And a few fps dips don't make a game unplayable, though of course the internet seems to think so...

Let me tell you the truth.

Gamers don't give a crap about resolution and frame rate. 90% of them can't even tell the difference. If you're playing on a PC monitor, yeah, you will see a difference. But console gamers on a TV can't tell. It's so minimal and such a non-issue that it's borderline hilarious how everyone throws a fuss about it.

#25 Edited by lglz1337 (4117 posts) -

since uprezzed x360 pc spreadsheet 4k, 60fps v-sync 8xMSAA with jaggies gaming

#26 Edited by Evo_nine (2093 posts) -

Once you go 60fps you can't go back... so smooth and glorious... why settle for anything less?

Sucks that the PS4 is stuck at 30fps for all of this gen.

#27 Posted by bezza2011 (2716 posts) -

Probably because there isn't much more to talk about and fight over, lol. Strange how quickly people are becoming experts in all of this frames-per-second rubbish. A game is a game; it has been that way for a while now. But for the time being, what else are we going to have wars over? No one has any decent exclusives to really brag about, because nothing is even nearly here, lol. Wait till 2015, when we actually have some exclusives.

#28 Posted by HalcyonScarlet (5136 posts) -

Standards. Back in the day people were happy to have 4 wheels and an engine. Now car fans have performance standards.

There is nothing wrong with having performance standards; I'll drop graphics settings to get there. I think it's worse that these days graphics are treated as more important than performance. But current-gen consoles have to do it that way, because otherwise their games would look only marginally better than last-gen consoles' at above 30fps.

#29 Posted by PS4hasNOgames (2304 posts) -

When hermits stopped getting all the big name games and they had to justify their shitty rigs with shitty games like Crysis and Metro.

#30 Edited by jun_aka_pekto (17329 posts) -

Not sure. Probably somewhere during the transition from CRT to LCD monitors. I know when I was still playing Battlefield 2 on a CRT monitor, most players played fine with sub-60fps, whether at a non-interlaced scan rate of 85Hz or 120Hz. Then all the stuff we hear now started appearing around the 2009-2010 timeframe.

I have a simplified view of gaming. I still think 30fps minimums are good enough, although more is always welcome. I regard framerates as nothing more than an indicator of how much reserve GPU headroom I have... and even that can be subjective. I can have framerates hover in the low 30s all day, but if a game maintains that even through the action, I'm fine with it.

This is a sample clip from Watch Dogs (watching the clouds). The framerates when recording (@1080p) are the same as when I'm not recording. I think the FX-8350 has a lot to do with it. I know if I tried this with my Phenom II quad-core, my framerates would be single digits. My framerates stay pretty much the same when there's action going on. I chose this spot because it's one of the few spots where my framerates drop this low.

Even with sub-60fps I don't see lagging or stuttering.

[embedded video: Watch Dogs sample clip]
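
A minimal sketch of that "framerate as reserve GPU headroom" idea, in Python; the 30fps floor and the sample framerates are purely illustrative assumptions, not measurements from the clip:

    def headroom_pct(current_fps: float, target_fps: float = 30.0) -> float:
        """Share of the frame-time budget still unused, as a percentage."""
        budget_ms = 1000.0 / target_fps   # e.g. 33.3 ms per frame for a 30 fps floor
        actual_ms = 1000.0 / current_fps  # time each frame actually takes to render
        return (budget_ms - actual_ms) / budget_ms * 100.0

    print(f"{headroom_pct(45):.0f}% headroom at 45 fps")  # ~33%: plenty of reserve
    print(f"{headroom_pct(32):.0f}% headroom at 32 fps")  # ~6%: action may eat it up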

#31 Posted by TheEroica (14411 posts) -

When two companies make their consoles 99.9% the same and you mix that with a game drought on each console, you gotta argue frame rates... Fanboys are always gonna fanboy, no matter what.

#32 Edited by Jebus213 (8920 posts) -

When console gamers discovered 60FPS.

#33 Posted by stizzal13 (609 posts) -

It became a thing to argue about when the calendar said 2014 and devs were still releasing games that were sub-60fps.

#34 Posted by airshocker (30915 posts) -

18-20 FPS isn't playable for me; perhaps you have low standards. If I were to play BF4 at 20 FPS, I'd get my ass handed to me. 60 is pretty much the standard when I game on PC, but as long as it doesn't dip below 30, I'll be okay.

#35 Posted by blangenakker (2580 posts) -

Well, it seemed to become more apparent when the new consoles came out. People were expecting better from them.

#36 Posted by Heirren (18799 posts) -

The framerate issue has ALWAYS been there. ALWAYS.

#37 Posted by FoxbatAlpha (8853 posts) -
[embedded video]

#38 Posted by Kinthalis (5340 posts) -

The standard for PC has been 1080p/60 FPS for YEARS now.

New consoles hit and they are STILL struggling to hit that in most games. That angers console peasants who want their crappy plastic game boxes to be superior, but are realizing that, in fact, they are not.

#39 Posted by TheShadowLord07 (22187 posts) -

as gamers got older their standards increased. that's all I got really.

#40 Posted by foxhound_fox (90601 posts) -

I played Oblivion at ~15fps when it first came out and loved it.

Then I upgraded and never looked back.

#41 Posted by DEadliNE-Zero0 (4843 posts) -

Because most games are multiplats, so the only way for console fans to figure out which is better is to argue resolution and frame rates.

#42 Edited by edwardecl (2239 posts) -

I tolerated it back in the day, but with LCD screens vsync became a necessity, and 15fps is not acceptable, so 30fps or 60fps is required.
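
For what it's worth, the 30-or-60 requirement falls out of how vsync works: with double buffering (assumed here; triple buffering behaves differently), each frame is held on screen for a whole number of refresh intervals, so a 60Hz panel can only show 60, 30, 20, 15... fps. A rough Python sketch with illustrative numbers:

    import math

    def vsynced_fps(render_fps: float, refresh_hz: int = 60) -> float:
        """Effective frame rate once each frame is held for whole refresh intervals."""
        refresh_ms = 1000.0 / refresh_hz
        render_ms = 1000.0 / render_fps
        intervals = math.ceil(render_ms / refresh_ms)  # refreshes each frame occupies
        return refresh_hz / intervals

    for fps in (75, 59, 45, 25, 15):
        print(f"renders at {fps} fps -> displays at {vsynced_fps(fps):.0f} fps")
    # Rendering at 59 fps collapses to 30 on screen; 25 collapses to 20.
    # That cliff is why a game either holds 60 or settles for a locked 30.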

#43 Posted by donalbane (16376 posts) -

Whenever two rival platforms have different performance on multiplatform games, people are going to talk about those differences, especially here. I think it's interesting that developers are lowering resolutions instead of effects, which they could be cutting instead. I guess cutting effects would let people post screenshot comparisons that made the differences more salient to the media and forums like this one, so they just dropped resolution instead.

#44 Posted by Heil68 (46825 posts) -

Since MS tried to change the industry in 2013 and promptly got knocked the fuck out, causing them to shitcan their whole vision for the Xbone and finally drop the biggest joke in the industry, Kinect (lol).

#45 Edited by Gue1 (11249 posts) -

The moment I played Mega Man X4 on an emulator, I understood how important high frame rate was for games. And for both third- and first-person shooters, high frame rate is a game changer.

#46 Posted by SecretPolice (23347 posts) -

Since The One was the only console this gen to launch with a game running 1080p at 60fps. :P

#47 Posted by musicalmac (23541 posts) -

@Midnightshade29 said:

I used to have a 3dfx voodoo 3 back in the day and was glad just to have 3d accelleration...

The golden age. I still remember how great the day was when I installed my (12MB) Voodoo 2. Great memories...

The framerate thing is an issue brought on in part by the entitlement age. Kids these days and their expectations. It also gives people something to argue about, considering how underwhelming both consoles are. It's too bad they weren't more interesting; just more of the same with a few extra triangles.

#48 Posted by WilliamRLBaker (28483 posts) -

When the sheens finally got it.

Historically, the most powerful system rarely wins; it's the games that win it for a system, and nearly every generation has proven this. But last generation the sheens had consistently worse versions of games, unless you count exclusives, where Sony often had to use Chinese coding farms to get the base work done and left the refinement to the developer later on... oh, and the best-looking games only looked good in their in-game cinematics, where everything processing-wise was turned off; the actual gameplay was obviously less stellar-looking.

Now that the sheens have system power, they fall back on their baseline: graphics, graphics, frames per second, that's all that matters, and most Lemmings are laughing their asses off at the hypocrisy. There is a reason video game system makers basically tricked the majority of gamers with the bit wars: they actually convinced loads of people that bit and byte monikers meant something.

#49 Posted by kraken2109 (13211 posts) -

Because it's 2014 and standards are higher.

#50 Edited by 2Chalupas (5283 posts) -

All I know is I don't remember having tons of games with screen tearing/judder on the PS2/GameCube. There were games with slowdown, yes, but it seemed to be handled better.

It seems like with the advent of multiplats last gen, it suddenly became more acceptable to just try to port PC games, and frequently the 360 or PS3 wasn't equipped to handle them even at 30fps: all kinds of weird judder, screen tearing, bad AA, etc. Screen tearing was the bane of last gen on 360 games. I think it's talked about more now because people expect a new gen to mean not having to deal with those kinds of technical/performance issues anymore.

A 100% stable 30fps is almost always going to be fine for me, though. 60fps is just the gold standard.