Why do people continue to talk about 30 vs 60? I thought next gen will have VRR?

#1 Sushiglutton
Member since 2009 • 7744 Posts

Tech question here. As I understand it next gen will support HDMI 2.1 and that in turn means variable refresh rate (VRR). So the binary choice between 30 and 60 will be a thing of the past. The framerate can vary freely within some range without issues like tearing etc.

Or am I wrong as usual lol?
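
For reference, the frame times involved are just 1000/fps; a quick sketch in Python:

```python
# Frame time for a given framerate is just 1000 / fps (milliseconds).
for fps in (30, 55, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 55 fps -> 18.2 ms per frame
# 60 fps -> 16.7 ms per frame
```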

#2 Bluestars
Member since 2019 • 1071 Posts

You're asking in the wrong place, SW doesn't know its arse from its elbow.

HaH

Check out the experts for information.

#3 Juub1990
Member since 2013 • 9922 Posts

Because 60 and above is good. Anything below it, even with VRR, is not ideal.

#4 deactivated-5efed3ebc2180
Member since 2006 • 923 Posts

@Sushiglutton said:

Tech question here. As I understand it next gen will support HDMI 2.1 and that in turn means variable refresh rate (VRR). So the binary choice between 30 and 60 will be a thing of the past. The framerate can vary freely within some range without issues like tearing etc.

Or am I wrong as usual lol?

Not sure if trolling or not, but VRR is the umbrella term for tech like AMD's FreeSync and NVIDIA's G-Sync. Basically "free vertical synchronization": it doesn't introduce latency, doesn't stutter when the framerate falls below its target, and isn't taxing on hardware. It's the commonly used V-Sync solution without its shortcomings.

TL;DR: it's just "no screen tearing, just better".
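
To make that concrete, here's a minimal toy model (a sketch of my own, assuming double-buffered V-Sync on a fixed 60 Hz display): a finished frame has to wait for the next refresh tick, so anything over ~16.7 ms gets held on screen for a full 33.3 ms, while VRR just scans the frame out when it's ready.

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz display: a refresh tick every ~16.7 ms

def vsync_hold_ms(render_ms):
    # Double-buffered V-Sync: a frame becomes visible only on a refresh
    # tick, so its time on screen is a whole multiple of the tick.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render in (15.0, 16.0, 18.2, 25.0):
    print(f"rendered in {render:.1f} ms -> "
          f"V-Sync holds it {vsync_hold_ms(render):.1f} ms, "
          f"VRR shows it for {render:.1f} ms")
```

Miss the 16.7 ms budget by even a millisecond and V-Sync doubles your frame's hold time, which is exactly the cliff VRR removes.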

#5 deactivated-5efed3ebc2180
Member since 2006 • 923 Posts

@Juub1990: Umm, if you switch to 2160p/30Hz, why not...

#6 phbz
Member since 2009 • 5859 Posts

VRR is dependent on having a compatible TV. I don't know exactly what percentage of people have HDMI 2.1; I imagine it's extremely low.

#7  Edited By deactivated-5efed3ebc2180
Member since 2006 • 923 Posts

@phbz said:

VRR is dependent on having a compatible TV. I don't know exactly what percentage of people have HDMI 2.1; I imagine it's extremely low.

I've got one; kinda useless as of now since I have an NVIDIA GPU... XD

Though it might be useful IF the PS5 turns out to be worthy... :}

#8 Sushiglutton
Member since 2009 • 7744 Posts

@Juub1990 said:

Because 60 and above is good. Anything below it, even with VRR, is not ideal.

As I understand it (could be wrong), before it was this either-or thing. If you couldn't hit a stable 60, it led to issues like tearing and frame pacing, so you might as well do 30.

But with VRR you could do 55. Is it reasonable to think that 55 is that much worse than 60?
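
For what it's worth, here's a toy model of that 55 fps case (my own sketch; it assumes the game renders at a perfectly steady 55 fps and that, without VRR, each frame becomes visible at the next 60 Hz refresh tick):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz ticks, ~16.7 ms apart
FRAME_MS = 1000 / 55    # steady 55 fps game, ~18.2 ms per frame

# Frame i finishes at i * FRAME_MS and (synced to a fixed display)
# becomes visible at the next refresh tick after that.
appear = [math.ceil(i * FRAME_MS / REFRESH_MS) * REFRESH_MS
          for i in range(1, 13)]
gaps = [round(b - a, 1) for a, b in zip(appear, appear[1:])]
print(gaps)
# [16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 33.3]
# i.e. a visible ~33 ms hitch every 11th frame. With VRR every frame
# is simply shown for a uniform ~18.2 ms instead.
```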

@phbz said:

VRR is dependent on having a compatible TV. I don't know exactly what percentage of people have HDMI 2.1; I imagine it's extremely low.

I guess that's part of it. But it will become standard soon, I suppose. Those who, like me, upgrade their TV for next gen will have it.

#9 Grey_Eyed_Elf
Member since 2011 • 7224 Posts

VRR?... Just like HDR and Dolby Atmos, it's a tech spec that MANY people don't actually have supported by their TVs.

Go ahead and ask people here to post their systems; the majority of console gamers who aren't also PC gamers don't have TVs that support HDR, let alone ones that support HDMI 2.1!

#10 xantufrog  Moderator
Member since 2013 • 13100 Posts

VRR is a bandage for unstable framerates, not a replacement for good ones.

#11 Zero_epyon
Member since 2004 • 14425 Posts

Though the console supports HDMI 2.1 and VRR, the majority of televisions out there don't. So most console owners will be locked into those fixed frame rates.

#12  Edited By Sushiglutton
Member since 2009 • 7744 Posts

@xantufrog said:

VRR is a bandage for unstable framerates, not a replacement for good ones.

Isn't it though? Let's say one game runs at a rock-solid 60 and another fluctuates in the 55-60 interval. Would you be able to tell the difference with VRR? Would a normal person (like me lol)?

@Zero_epyon said:

Though the console supports HDMI 2.1 and VRR, the majority of televisions out there don't. So most console owners will be locked into those fixed frame rates.

Ah, so I guess we will continue to hear 30 vs 60 for a while then.

#13 Telekill
Member since 2003 • 9258 Posts

The only time I give a crap about fps is with VR. I keep getting motion sick. Hoping the next gen of VR will fix that.

#14  Edited By Sushiglutton
Member since 2009 • 7744 Posts

@Telekill said:

The only time I give a crap about fps is with VR. I keep getting motion sick. Hoping the next gen of VR will fix that.

I'm sure they will. But resolution (the screen-door effect) is also an issue, so I'm not sure what will be prioritized.

#15 organic_machine
Member since 2004 • 10056 Posts

Frame pacing is more of an issue for me. I would rather play a 30-45 FPS game with immaculate frame pacing than a game at 60 FPS with stutters all over the place.

VRR is amazing, but it doesn't really impact frame rates. It just reduces screen tearing without the penalty of using V-Sync.
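
To illustrate the frame-pacing point, a minimal sketch with invented frame-time traces (the numbers are made up purely for illustration):

```python
from statistics import mean, pstdev

# Invented frame-time traces (ms): a locked 30 fps vs a nominally
# ~60 fps average with stutter spikes. Purely illustrative numbers.
locked_30 = [33.3] * 8
stuttery_60 = [10, 11, 35, 10, 11, 36, 10, 11]

for name, trace in (("locked 30 fps", locked_30),
                    ("stuttery ~60 fps", stuttery_60)):
    print(f"{name}: avg {mean(trace):.1f} ms/frame, "
          f"pacing jitter (std dev) {pstdev(trace):.1f} ms")
# locked 30 fps: avg 33.3 ms/frame, pacing jitter (std dev) 0.0 ms
# stuttery ~60 fps: avg 16.8 ms/frame, pacing jitter (std dev) 10.8 ms
```

The second trace averages a higher framerate, but the jitter is what you actually feel, and VRR on its own doesn't remove it.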

#16  Edited By xantufrog  Moderator
Member since 2013 • 13100 Posts

@Sushiglutton said:
@xantufrog said:

VRR is a bandage for unstable framerates, not a replacement for good ones.

Isn't it though? Let's say one game runs at a rock-solid 60 and another fluctuates in the 55-60 interval. Would you be able to tell the difference with VRR? Would a normal person (like me lol)?

@Zero_epyon said:

Though the console supports HDMI 2.1 and VRR, the majority of televisions out there don't. So most console owners will be locked into those fixed frame rates.

Ah, so I guess we will continue to hear 30 vs 60 for a while then.

Short answer is: probably not, you couldn't perceive the difference. But that doesn't counter my statement... VRR has ZERO purpose except to convert a 55-60 FPS scenario into one that feels like a solid 60.

Again, it's a bandage to make something shitty feel less shitty. It certainly wouldn't make a 28-40 FPS game (average/target 30ish) feel like a 60 FPS game, if that's what people are trying to drive at...

#17 Zero_epyon
Member since 2004 • 14425 Posts

@xantufrog said:
@Sushiglutton said:
@xantufrog said:

VRR is a bandage for unstable framerates, not a replacement for good ones.

Isn't it though? Let's say one game runs at a rock-solid 60 and another fluctuates in the 55-60 interval. Would you be able to tell the difference with VRR? Would a normal person (like me lol)?

@Zero_epyon said:

Though the console supports HDMI 2.1 and VRR, the majority of televisions out there don't. So most console owners will be locked into those fixed frame rates.

Ah, so I guess we will continue to hear 30 vs 60 for a while then.

Short answer is: probably not, you couldn't perceive the difference. But that doesn't counter my statement... VRR has ZERO purpose except to convert a 55-60 FPS scenario into one that feels like a solid 60.

Again, it's a bandage to make something shitty feel less shitty. It certainly wouldn't make a 28-40 FPS game (average/target 30ish) feel like a 60 FPS game, if that's what people are trying to drive at...

I have a FreeSync monitor, and some titles where I get 4K/45-60 feel like 60. There's some stutter and I know when it's happening. Anything under 45 and it might as well be 30 FPS. So yeah, that's exactly what it's for: to try to make 50-60 feel like 60.
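
That "anything under 45" cutoff matches how VRR windows work: a FreeSync display only syncs within its supported range, and below the floor it falls back to fixed-refresh behavior. A toy model (the 45-60 Hz window is an assumption for this particular monitor, and this ignores low framerate compensation, which doubles frames on displays that support it):

```python
import math

VRR_MIN, VRR_MAX = 45, 60  # assumed FreeSync window for this monitor

def onscreen_ms(fps):
    frame_ms = 1000 / fps
    if VRR_MIN <= fps <= VRR_MAX:
        return frame_ms  # inside the window: every frame shown as-is
    # below the floor: back to fixed-refresh quantization at 60 Hz
    tick = 1000 / VRR_MAX
    return math.ceil(frame_ms / tick) * tick

for fps in (58, 46, 44, 35):
    print(f"{fps} fps -> each frame held ~{onscreen_ms(fps):.1f} ms")
# 58 fps -> ~17.2 ms and 46 fps -> ~21.7 ms (smooth), but
# 44 fps -> ~33.3 ms and 35 fps -> ~33.3 ms (i.e. it feels like 30)
```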