Let's face it: a framerate below 60 FPS is a horrible tradeoff for 4K

#1 deactivated-6079d224de716
Member since 2009 • 2567 Posts

As long as the resolution is 1440p or more (or at least 1080p), framerate is what matters the most. 30 FPS should be left forgotten. Not only do the games look like shit, they also play like shit with all the input lag that a low framerate causes. 60 FPS is a must. 144 FPS is better, of course, but console players can only dream about that.
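
For what it's worth, the input-lag half of that claim is easy to put rough numbers on. Here is a minimal back-of-the-envelope sketch in Python; it only counts the time between rendered frames, not the full input-to-display chain, so treat it as an illustration rather than a latency model:

# Frame time at common framerates. Real input lag adds controller polling,
# game-loop buffering and display latency on top of this baseline.
for fps in (30, 48, 60, 144):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {frame_time_ms:5.1f} ms per frame")

Going from 30 to 60 FPS roughly halves that baseline (33.3 ms down to 16.7 ms per frame), which is a big part of why the two feel so different in motion.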

#2  Edited By BassMan  Online
Member since 2002 • 17803 Posts

This one is wise. A promotion is in order.

#3 Pedro  Online
Member since 2002 • 69415 Posts

Millions of gamers that make up the majority of gaming don't have a problem gaming at 30fps.

#4 deactivated-6079d224de716
Member since 2009 • 2567 Posts

@Pedro said:

Millions of gamers that make up the majority of gaming don't have a problem gaming at 30fps.

They had to adapt. Not like they could choose, lol. MS and Sony could give them 60 fps but they opted to go 4k because 4k is a gimmick that's easier to explain and sell to the casual crowd. In case of Sony, they also did it to sell more 4k TVs.

#5 UssjTrunks
Member since 2005 • 11299 Posts

Once you experience 120+ FPS, even 60 is unplayable.

#6 Pedro  Online
Member since 2002 • 69415 Posts

@Orchid87 said:
@Pedro said:

Millions of gamers that make up the majority of gaming don't have a problem gaming at 30fps.

They had to adapt. Not like they could choose, lol. MS and Sony could give them 60 fps but they opted to go 4k because 4k is a gimmick that's easier to explain and sell to the casual crowd. In case of Sony, they also did it to sell more 4k TVs.

No, they opted for 4K because gamers and the media put the emphasis on resolution. This emphasis on resolution originated from the "hardcore gamers" and not from the casual crowd, who still don't give a damn.

#7  Edited By ArchoNils2
Member since 2005 • 10534 Posts

Depends on the player, really. If you actually want to play a game, you are right. But most just want to jump from one cutscene to the next in an average-at-best story.

#8  Edited By deactivated-6079d224de716
Member since 2009 • 2567 Posts

@Pedro: lol, no hardcore gamer would choose 4k over 60 or more fps. Smooth visuals and less input lag really matter.

@ArchoNils2: so true

#9  Edited By Loodko_Koopus
Member since 2018 • 69 Posts

I used to be one of those people who thought 30fps was fine a few years ago, and I must say framerate has become THE most important thing to me as far as games go. As far as resolution goes, I honestly think whether you need 2k or 4k depends on how big your screen is and how far away you're sitting from it. Also, the jump from 1080p to 1440p is really not that big; the difference is much smaller than from non-HD to 1080p. It's better, for sure, but not so much better that it justifies a lower FPS.

#10 npiet1
Member since 2018 • 3576 Posts

@Orchid87 said:
@Pedro said:

Millions of gamers that make up the majority of gaming don't have a problem gaming at 30fps.

They had to adapt. Not like they could choose, lol. MS and Sony could give them 60 fps but they opted to go 4k because 4k is a gimmick that's easier to explain and sell to the casual crowd. In case of Sony, they also did it to sell more 4k TVs.

Or it could be because, unless you have a high-end TV, they're still only 60Hz and don't support 60fps.

#11 nomadic8280
Member since 2017 • 476 Posts

@UssjTrunks said:

Once you experience 120+ FPS, even 60 is unplayable.

Shit, once you experience 2,400+ FPS, even 1,200 is unplayable. Whatever man. If you suck, you suck. You're just watching yourself die at a higher framerate.

#12 nomadic8280
Member since 2017 • 476 Posts

I'm going to now play Destiny 2 in 4K @ 30fps, and to spite TC I'm going to have fun. Take that!

#13 deactivated-5fd4737f5f083
Member since 2018 • 937 Posts

It's kinda like "you don't know what you have until it's gone". After gaming almost exclusively at 60+ for a few years and then buying a new-gen console, it took quite a lot of adjusting to get used to the experience at half the frame rate. For me it doesn't kill the experience, and I won't stop playing a game because it sits at 30 instead of 60 or more, but if I have the option to play the same game at 30 on a larger TV vs 60 on a smaller monitor, it's a very easy win for the PC. Different strokes for different folks though.

#14 Litchie
Member since 2003 • 34596 Posts

Very true. But you're not a casual gamer, as opposed to most console gamers. They can't even see the difference between 30 and 60 fps, let alone understand that resolution isn't the most important thing about graphics.

#15 UssjTrunks
Member since 2005 • 11299 Posts

@nomadic8280 said:
@UssjTrunks said:

Once you experience 120+ FPS, even 60 is unplayable.

Shit, once you experience 2,400+ FPS, even 1,200 is unplayable. Whatever man. If you suck, you suck. You're just watching yourself die at a higher framerate.

There is a reason why there is no e-sports scene on consoles.

#16 UssjTrunks
Member since 2005 • 11299 Posts

@Pedro said:

Millions of gamers that make up the majority of gaming don't have a problem gaming at 30fps.

Most console gamers are casuals who have never been exposed to higher framerates. The difference is astounding.

#17 npiet1
Member since 2018 • 3576 Posts

I would rather have 4k @ 30fps than 1080p @ 60fps just because there is a huge quality difference; it's a lot more noticeable.

#18 deactivated-6079d224de716
Member since 2009 • 2567 Posts

@npiet1 said:

I would rather have 4k @ 30fps than 1080p @ 60fps just because there is a huge quality difference; it's a lot more noticeable.

If you play "cinematic experiences" that are 80% cutscenes, then probably yes. For actual games, 60 fps and 30 fps are like night and day.

#19 ShadowDeathX
Member since 2006 • 11698 Posts

That is a given. Now tell Sony and Microsoft to put in better CPUs so developers can achieve higher and more stable frame rates.

#20 npiet1
Member since 2018 • 3576 Posts

@Orchid87: I just can't notice it with games; I can see it in videos but not in games.

#21  Edited By BassMan  Online
Member since 2002 • 17803 Posts

@ShadowDeathX said:

That is a given. Now tell Sony and Microsoft to put in better CPUs so developers can achieve higher and more stable frame rates.

I am also OK with them putting in better CPUs and continuing to target 30fps on consoles so the scope of games can evolve and PC can continue to reap the benefits. Got to raise the lowest common denominator. hehe :)

#22 nomadic8280
Member since 2017 • 476 Posts

@Litchie said:

Very true. But you're not a casual gamer, as opposed to most console gamers. They can't even see the difference between 30 and 60 fps, let alone understand that resolution isn't the most important thing about graphics.

And, good for them. Think of the money they save by not giving a shit. Perhaps they should be looking down their noses at "serious gamers".

#23 ronvalencia
Member since 2008 • 29612 Posts

@Orchid87 said:

As long as the resolution is 1440p or more (or at least 1080p), framerate is what matters the most. 30 FPS should be left forgotten. Not only do the games look like shit, they also play like shit with all the input lag that a low framerate causes. 60 FPS is a must. 144 FPS is better, of course, but console players can only dream about that.

A 60 fps target is not a big problem when FreeSync can keep V-sync locked anywhere between 30 Hz and 60 Hz; e.g. 48 Hz is good enough.
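
To make that concrete, here is a minimal sketch in Python of how a variable-refresh display tracks the game's framerate instead of forcing it to a fixed 30 or 60 Hz. The 30-60 Hz window is an assumed example value; the real range depends on the panel:

# Hypothetical FreeSync window for illustration; real panels vary.
VRR_MIN_HZ, VRR_MAX_HZ = 30, 60

def panel_refresh(fps: float) -> float:
    # Inside the window the panel refreshes exactly when a frame is ready;
    # outside it the rate is clamped and judder/tearing can come back.
    return min(max(fps, VRR_MIN_HZ), VRR_MAX_HZ)

for fps in (28, 45, 48, 60):
    in_window = VRR_MIN_HZ <= fps <= VRR_MAX_HZ
    print(f"{fps} fps -> panel at {panel_refresh(fps)} Hz "
          f"({'inside VRR window' if in_window else 'outside the window'})")

So a game hovering around 45-48 fps can still look consistent on a FreeSync display, even though it never hits a locked 60.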

#24  Edited By BassMan  Online
Member since 2002 • 17803 Posts

@ronvalencia said:
@Orchid87 said:

As long as the resolution is 1440p or more (or at least 1080p), framerate is what matters the most. 30 FPS should be left forgotten. Not only do the games look like shit, they also play like shit with all the input lag that a low framerate causes. 60 FPS is a must. 144 FPS is better, of course, but console players can only dream about that.

A 60 fps target is not a big problem when FreeSync can keep V-sync locked anywhere between 30 Hz and 60 Hz; e.g. 48 Hz is good enough.

60 is the minimum target people should strive for. VRR is nice to compensate for occasional dips, but 48fps does not suffice. Noticeably better than 30fps though.

#25 Howmakewood
Member since 2015 • 7702 Posts

God of War base ps4: 1080p 30fps and on the Pro in performance mode 1080p ~45fps, amazing.

#26 deactivated-5c18005f903a1
Member since 2016 • 4626 Posts

Maybe. But 90% of the people playing games on console don't give a shit about stuff like that.

#27 Litchie
Member since 2003 • 34596 Posts
@nomadic8280 said:
@Litchie said:

Very true. But you're not a casual gamer, as opposed to most console gamers. They can't even see the difference between 30 and 60 fps, let alone understand that resolution isn't the most important thing about graphics.

And, good for them. Think of the money they save by not giving a shit. Perhaps they should be looking down their noses at "serious gamers".

Yeah, they save money by not knowing or caring about things. But looking down their noses at people who know more, care more and want more out of the hobby? No.

PC gamers shouldn't look down on console gamers either, at least not those who understand they don't have the option of the same quality as a PC. A console gamer who tells people their console is better than PCs? Yeah, I have no problem with PC gamers looking down on those.

#28  Edited By ronvalencia
Member since 2008 • 29612 Posts

@ShadowDeathX said:

That is a given. Now tell Sony and Microsoft to put in better CPUs so developers can achieve higher and more stable frame rates.

During the 28 nm era, and based on MS/Sony price guidance:

IBM was offering the PowerPC A2 to replace the PPE. The SPE was EOL (End of Life), FPU/SIMD was optional, and the PowerPC 750 was the alternative. MS/Sony would have needed to pay for a 128-bit SIMD upgrade.

AMD was offering Jaguar, with 128-bit FPU/SIMD built in, and could offer 2-for-1 "Fusion" deals with the GPU.

ARM was offering the Cortex A15. FPU/SIMD is built in, but it's 64-bit hardware.

Intel was offering Atom Bay Trail, with 128-bit FPU/SIMD built in.

AMD, Intel and IBM weren't willing to offer big-chip CPUs at low-cost embedded CPU prices. There weren't many choices in the embedded CPU market before the 28 nm era.

After the desktop Ryzen was released, AMD offered embedded Ryzen parts based on the 14 nm Raven Ridge mobile design sometime in 2018, while desktop Ryzen moved on to the 12 nm second-generation design.

The AMD Ryzen Embedded V1000 SoC family replaced the Jaguar embedded SoC family. On absolute performance, ARM isn't offering any competition.

IBM... where's an embedded PowerPC 970 (G5)? Why the garbage PPC 4xx, 7xx and A2s?

Where's the high-performance embedded CPU competition?

During the 1990s, you had healthy high-performance CPU competition from PowerPC (3DO), DEC Alpha, DEC StrongARM, Sun SPARC, HP's PA-RISC (to be used in Commodore's Amiga CD64, which supported Windows NT 3.x/OpenGL), MIPS32/MIPS64 (Sony, Nintendo) and SuperH.

The x86 CPU vendors ran away with the wider-SIMD wars.

#29 sailor232
Member since 2003 • 6880 Posts

I agree. If I can get away with 60fps 4K gaming I will, but I'll drop to 1440p whenever 60 isn't stable. Most people don't notice a difference though; most people don't know anything about tech and just game casually.

#30 deactivated-5fd4737f5f083
Member since 2018 • 937 Posts

I can't buy the "people can't tell the difference" thing.

#31 SOedipus
Member since 2006 • 14801 Posts

Nice to see that some people here have standards.

#32 ronvalencia
Member since 2008 • 29612 Posts

@howmakewood said:

God of War base ps4: 1080p 30fps and on the Pro in performance mode 1080p ~45fps, amazing.

Sony should support FreeSync for 45 Hz.

#33  Edited By scatteh316
Member since 2004 • 10273 Posts
@Orchid87 said:

As long as the resolution is 1440p or more (or at least 1080p), framerate is what matters the most. 30 FPS should be left forgotten. Not only do the games look like shit, they also play like shit with all the input lag that a low framerate causes. 60 FPS is a must. 144 FPS is better, of course, but console players can only dream about that.

So can PC gamers.....

#34  Edited By Kusimeka
Member since 2007 • 419 Posts

For me, if I start the game at 30FPS I'm usually ok with it. However, when I'm gaming at 144 FPS and lower it to 60FPS it feels horrendous; I don't even want to know what 30FPS would feel like after gaming at 144 for several hours.

I feel like console games are also designed in a way that gets more out of the FPS they are using, so a 30FPS console game will feel better than a PC game running at 30FPS, but once again, that may just be me and what I'm used to.

#35 Calvincfb
Member since 2018 • 0 Posts

I don't care about 60fps, and neither do the developers.

#36 deactivated-5c1d0901c2aec
Member since 2016 • 6762 Posts

I am hesitant about upgrading to a 4K monitor because I prefer a high frame-rate when I am playing on PC. That being said, my monitor isn't great and it could do with an upgrade, but I won't be investing in 4K.

I don't have any 4K capable gaming consoles, and I don't intend to buy one so no need for a 4K TV. I'm also more forgiving of frame-rates on a TV. :)

#37 MonsieurX
Member since 2008 • 39858 Posts

@calvincfb said:

I don't care about 60fps, and neither do the developers.

It's nice to see you're a representative of the developers

#38 SecretPolice
Member since 2007 • 44049 Posts

Let's face it, Mighty X1X 4K MonsterBox MasterRace makes you...

Cry like a lil girl. lolol :P

#39 deactivated-6079d224de716
Member since 2009 • 2567 Posts

@jumpaction said:

I am hesitant about upgrading to a 4K monitor because I prefer a high frame-rate when I am playing on PC. That being said, my monitor isn't great and it could do with an upgrade, but I won't be investing in 4K.

I don't have any 4K capable gaming consoles, and I don't intend to buy one so no need for a 4K TV. I'm also more forgiving of frame-rates on a TV. :)

Better invest in a G-Sync/FreeSync monitor.

#40 Calvincfb
Member since 2018 • 0 Posts

@MonsieurX: if they did care about fps they'd push 60fps. They don't; they'd rather push graphics, physics and other stuff.

It's a trade-off, and it's clear they'd rather spend the available power on a stable 30fps with better graphics, AI, physics and post-processing effects than on a higher framerate.

It's not that hard to notice, you know?

#41 deactivated-5c1d0901c2aec
Member since 2016 • 6762 Posts

@Orchid87: Anything is an improvement over my 60hz, 1080p monitor, to be honest. :P

But I am shopping around and always keeping an eye out for the right one for me.

#42  Edited By raugutcon
Member since 2014 • 5576 Posts

I’m one of those millions of casual gamers that don’t give a damn about FPS, 30 FPS ain’t like if the game shows like a slideshow.

#43  Edited By Calvincfb
Member since 2018 • 0 Posts

@raugutcon: brace yourself, PC elitists are going to be butthurt and call you a soccer mom and try to look down on you because of that.

#44  Edited By deactivated-5c1d0901c2aec
Member since 2016 • 6762 Posts

@calvincfb: For me the difference comes from:

1. Expectations

2. Hardware

My TV has a higher refresh rate than the equivalent on my PC monitor, so even though the game itself runs at 30fps on my Nintendo Switch, the TV is able to make it look smoother. Some people hate that when applied to movies and say that it makes the film look like a soap opera, but I quite like it. It does, however, result in input lag, but for most of the games I play it's not noticeable enough to be a detriment. The other element to this is that I know the limitations of my Xbox One and Nintendo Switch. I know that they won't be capable of running numerous games at 60fps, so I don't expect that performance from them. I can, at least, hope for a steady enough frame-rate that it's never a detriment to gameplay.

Playing on PC is different. My PC is doing all the work in ensuring that the play experience I have is smooth, and I know my PC is capable of running games at 60fps+ at 1080p. That's what I have come to expect from my system, and so when a poor port means that this is not achieved, I am disappointed. Mostly though, I have had smooth experiences. :)

#45 raugutcon
Member since 2014 • 5576 Posts

@calvincfb: as if I care about them.

#46  Edited By Calvincfb
Member since 2018 • 0 Posts

@jumpaction: any hardware is capable of 60fps; it all depends on where the developers want to put the power of the hardware.

If we were still getting PS3-style games, we'd have 1080p/60fps on consoles, but that's not what devs want. They want eye candy because that sells more, hence games on consoles being 30fps.

I'm fine with 30fps. Always have been, always will be.

#47 deactivated-5c1d0901c2aec
Member since 2016 • 6762 Posts

@calvincfb: I know that Far Cry 5 would not run on my Nintendo Switch at 60fps without serious compromises being made. After all, it's a systemic game with many different AI actors and physics systems working at once. That is why I expect, and am okay with, a game like Breath of the Wild running the way it does on Nintendo Switch.

But on PC, I know my system can run Far Cry 5 at or above 60fps regardless of the systems in place or the visual fidelity of the game. It's not a standard I personally expect from my Nintendo Switch, but it is one I do expect from my PC.

#48  Edited By Calvincfb
Member since 2018 • 0 Posts

@jumpaction: the only reason PC can achieve what it can is that games are made with consoles in mind first, so everything is built with compromises so they can hit 1080p/30fps; the extra power on PC is then used for better graphics and performance on top of what was made for consoles.

If consoles were out of the picture, devs would be free to pull another Crysis and watch the world cry out in uproar and despair.

But they don't want that anymore; nobody does.

#49  Edited By pelvist
Member since 2010 • 9001 Posts
@netracing said:

I can't buy the "people can't tell the difference" thing.

I read somewhere that the number of frames a person can perceive depends on what they're used to watching/playing, as the brain needs time to get used to processing the information faster, and this is why some people can't tell the difference: they haven't experienced higher framerates for a prolonged time. If a person only plays games at 30fps then I can believe it when they say they can't tell the difference. You can see a difference when it's demonstrated side by side in a .gif, but that's about it; they can't feel the difference while playing. This made sense to me because I find it really hard to go back to playing games on consoles nowadays, yet I played them all the time when I was a lad and never had an issue.

However, certain games that have been hugely popular since last gen, especially among console gamers, have forced 60fps in their multiplayer modes, and I find it hard to believe that console gamers haven't had enough time with at least one of those games to tell the difference when going back to 30fps. It just seems like an excuse for most of their games still running below 60fps in 2018.

I've been spoiled by PC gaming to the point where I find it hard to enjoy gaming on consoles because of how sluggish they feel to play. It took a couple of weeks of casual, bite-size gaming sessions to get into BOTW, though I'm glad I did, because it was a really, really good game that I would buy again if it were available (legally) at 60+ fps on PC.

#50 Ten_Pints
Member since 2014 • 4072 Posts

There is no trade-off for the consoles in most cases; graphics power is not the bottleneck.