Next-gen (consoles) just became even more obsolete!

This topic is locked from further discussion.


#1 loco145
Member since 2006 • 12226 Posts

NVIDIA head Jen-Hsun Huang revealed G-Sync this morning, a module for gaming monitors that helps alleviate screen tearing and skipping issues. A variety of display companies are already on board, including ASUS, BenQ, Philips and ViewSonic. Huang said the module kills stutter entirely, pushes down lag and kills tearing. The monitors with G-Sync look the same as a normal display, as the module is built into the rear (as seen above). We're told by NVIDIA's Ujesh Desai that the module won't make new monitors much more expensive, and the module works with GPUs that have Kepler architecture (so the GTX 660 and up). G-Sync monitors will be available starting in Q1 2014.

Source

Consolites have been pwned!


#2 GTSaiyanjin2
Member since 2005 • 6018 Posts

Sounds great and all, but I've never had a problem with screen tearing; maybe it's something I don't notice when playing games. I do have a 120Hz monitor, so maybe that helps. But the G-Sync monitors still sound interesting; if they have any 120-144Hz models I may be in for one.


#4  Edited By loco145
Member since 2006 • 12226 Posts

@Seiki_sands said:

Consoles can be hooked to monitors.

PCs can be hooked to TVs.

Isn't that a point hermits are very fond of making when a console gamer mentions comfort and ease as a plus for consoles?

Consoles can't be G-Synced. Pwnit!


#5  Edited By Seiki_sands
Member since 2003 • 1973 Posts

@loco145 said:

@Seiki_sands said:

Consoles can be hooked to monitors.

PCs can be hooked to TVs.

Isn't that a point hermits are very fond of making when a console gamer mentions comfort and ease as a plus for consoles?

Consoles can't be G-Synced. Pwnit!

Yep. Realized that as soon as I posted, tried to delete, but ran into "issues" with the site.

Though that said, much like any other news regarding PC hardware, it is mitigated by the fact that it won't be used by most PC gamers, and is largely unavailable to a sizable segment of PC gamers (those with AMD cards, specifically).


#6 Gue1
Member since 2004 • 12171 Posts

If you get screen tearing in your PC games, then that means you have a crap rig. This G-Sync is just a gimmick to force idiot herms into buying new monitors that support the feature.

#DealWithIt ;)


#7  Edited By gameofthering
Member since 2004 • 11286 Posts

The only game I really noticed screen tearing in was Portal 2. I had no choice but to turn on V-sync :(


#8  Edited By slipknot0129
Member since 2008 • 5832 Posts

Seems like it would solve all the screen tearing issues and stuff. Make things smoother too. I'll buy one when my monitor goes out.


#9  Edited By Kinthalis
Member since 2002 • 5503 Posts

It's more than just V-sync becoming unnecessary; this would change the way timings on a monitor work. They would be fully driven by the GPU.

That means smoother motion than even on 120Hz monitors. And of course even lower latency.

But yeah, this pretty much renders consoles obsolete for me. One of the biggest updates to display technology since HD, and consoles are left behind... yet again.


#10  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

I watched that demo, and half the time could literally not see the difference unless I really, really tried.

It just came off as a massive PR cock-tease. I hate Vsync as much as the next guy, hence I don't use it in anything that requires precision. If this can change that, then great.

But it still requires you to purchase a G-Sync monitor, something I wish wasn't the case.


#11 clyde46
Member since 2005 • 49061 Posts

@Gue1 said:

If you get screen tearing in your PC games, then that means you have a crap rig. This G-Sync is just a gimmick to force idiot herms into buying new monitors that support the feature.

#DealWithIt ;)

Wrong.


#12  Edited By donalbane
Member since 2003 • 16383 Posts

Maybe PC gamers should try turning on V-sync before buying a new monitor. Just my 2 cents here.


#13 ReadingRainbow4
Member since 2012 • 18733 Posts

@clyde46 said:

@Gue1 said:

If you get screen tearing in your PC games, then that means you have a crap rig. This G-Sync is just a gimmick to force idiot herms into buying new monitors that support the feature.

#DealWithIt ;)

Wrong.

Very.


#14  Edited By psymon100
Member since 2012 • 6835 Posts

Eh.

Great ... but at the same time, I'm not sure I need this. Plus I'm using AMD GPUs all around at the moment.


#15 Peredith
Member since 2011 • 2289 Posts

Hermits in full-scale damage control


#16  Edited By Kinthalis
Member since 2002 • 5503 Posts

@Peredith said:

Hermits in full-scale damage control

???

Damage control for what? Console gamers are the ones running into rude awakening after rude awakening with these "next gen" consoles, barely able to run at 1080p.


#17 tormentos
Member since 2003 • 33784 Posts

@loco145 said:

NVIDIA head Jen-Hsun Huang revealed G-Sync this morning, a module for gaming monitors that helps alleviate screen tearing and skipping issues. A variety of display companies are already on board, including ASUS, BenQ, Philips and ViewSonic. Huang said the module kills stutter entirely, pushes down lag and kills tearing. The monitors with G-Sync look the same as a normal display, as the module is built into the rear (as seen above). We're told by NVIDIA's Ujesh Desai that the module won't make new monitors much more expensive, and the module works with GPUs that have Kepler architecture (so the GTX 660 and up). G-Sync monitors will be available starting in Q1 2014.

Source

Consolites have been pwned!

So this involves throwing your monitor away and buying a new one?


#18  Edited By clyde46
Member since 2005 • 49061 Posts

@donalbane said:

Maybe PC gamers should try turning on V-sync before buying a new monitor. Just my 2 cents here.

Hardware >>>> Software.


#19  Edited By Kinthalis
Member since 2002 • 5503 Posts

@donalbane said:

Maybe PC gamers should try turning on V-sync before buying a new monitor. Just my 2 cents here.

Lol. I think you missed the point.

Alright, the issue is that monitors and TVs currently run at a predetermined refresh rate. Display adapters need to sync their frame rates to this rate or frame tearing occurs.

V-sync solves the frame tearing issue, though you can essentially only run it at that refresh rate or 1/2 that refresh rate. AND it introduces stutter and latency.

Enter G-Sync. By doing what should have been done the moment we switched away from CRT technology in the first place, and making the monitor refresh the display according to the display adapter, you don't have to worry about latency, about stuttering, OR about screen tearing.

What's more, because the screen is only refreshed when the GPU is ready, regardless of the frame rate, this creates a type of persistence, for lack of a better term, for each frame.

The end result is that motion is incredibly smooth, even smoother in appearance than a 120Hz monitor, and smoother even at lower framerates (that's key!). Motion blur is also eliminated; moving objects would appear as they do in the real world. Here's an example:

As you can see, text inside the moving object appears crisp and clear vs. blurry on the other panel.

This is probably the most amazing leap in display tech since HD. And the tech is here. So it'll come to everyone, eventually. That's the really exciting part.
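If you want a rough toy model of that v-sync quantization (all numbers made up for illustration, nothing from NVIDIA's actual implementation), here's a sketch: a GPU finishing a frame every 22 ms on a fixed 60Hz panel gets its frames shown in an alternating ~16.7/33.3 ms rhythm, while a GPU-driven refresh shows every frame at a steady 22 ms.

```python
# Toy model of frame presentation: fixed-interval v-sync vs. a
# GPU-driven refresh.  All numbers are invented for illustration.
REFRESH = 1000 / 60                                # 60Hz scan interval, in ms
render_times = [i * 22.0 for i in range(1, 11)]    # GPU finishes a frame every 22 ms (~45 fps)

def vsync_display(finish_times, interval):
    """With v-sync, each frame waits for the next fixed refresh tick after it's done."""
    return [(int(t // interval) + 1) * interval for t in finish_times]

def gsync_display(finish_times):
    """With a GPU-driven refresh, the panel scans out the moment a frame is done."""
    return list(finish_times)

vs = vsync_display(render_times, REFRESH)
gs = gsync_display(render_times)

# Frame-to-frame gaps on screen: v-sync alternates between 16.7 ms and
# 33.3 ms (that's the judder), the GPU-driven refresh holds a steady 22 ms.
vs_gaps = [round(b - a, 1) for a, b in zip(vs, vs[1:])]
gs_gaps = [round(b - a, 1) for a, b in zip(gs, gs[1:])]
print("v-sync gaps:", vs_gaps)
print("g-sync gaps:", gs_gaps)
```

Same average frame rate in both cases; only the pacing differs, which is why 45 fps on a variable-refresh panel looks smoother than 45 fps with v-sync.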


#20 ronvalencia
Member since 2008 • 29612 Posts

@loco145 said:

NVIDIA head Jen-Hsun Huang revealed G-Sync this morning, a module for gaming monitors that helps alleviate screen tearing and skipping issues. A variety of display companies are already on board, including ASUS, BenQ, Philips and ViewSonic. Huang said the module kills stutter entirely, pushes down lag and kills tearing. The monitors with G-Sync look the same as a normal display, as the module is built into the rear (as seen above). We're told by NVIDIA's Ujesh Desai that the module won't make new monitors much more expensive, and the module works with GPUs that have Kepler architecture (so the GTX 660 and up). G-Sync monitors will be available starting in Q1 2014.

Source

Consolites have been pwned!

The problem with the PC ecosystem is the gap between the hardware and user-land software.


#21  Edited By Peredith
Member since 2011 • 2289 Posts

@Kinthalis said:

@Peredith said:

Hermits in full-scale damage control

???

Damage control for what? Console gamers are the ones running into rude awakening after rude awakening with these "next gen" consoles, barely able to run at 1080p.

Hermits get "G Sync", console gamers get

Lawl


#22 Kinthalis
Member since 2002 • 5503 Posts

@Peredith said:

@Kinthalis said:

@Peredith said:

Hermits in full-scale damage control

???

Damage control for what? Console gamers are the ones running into rude awakening after rude awakening with these "next gen" consoles, barely able to run at 1080p.

Hermits get "G Sync", console gamers get

Lawl

Boring game?


#23  Edited By Peredith
Member since 2011 • 2289 Posts

Nah, CGI-quality graphics. Enjoy your G-Sync on your indie games though.


#24  Edited By HaloinventedFPS
Member since 2010 • 4738 Posts

Ha, and Ryse doesn't even have v-sync.

I wonder how many next-gen games will skip out on v-sync.


#25  Edited By ronvalencia
Member since 2008 • 29612 Posts

@HaloinventedFPS said:

Ha, and Ryse doesn't even have v-sync.

I wonder how many next-gen games will skip out on v-sync.

It depends on the final game release.


#26  Edited By ronvalencia
Member since 2008 • 29612 Posts

@loco145 said:

NVIDIA head Jen-Hsun Huang revealed G-Sync this morning, a module for gaming monitors that helps alleviate screen tearing and skipping issues. A variety of display companies are already on board, including ASUS, BenQ, Philips and ViewSonic. Huang said the module kills stutter entirely, pushes down lag and kills tearing. The monitors with G-Sync look the same as a normal display, as the module is built into the rear (as seen above). We're told by NVIDIA's Ujesh Desai that the module won't make new monitors much more expensive, and the module works with GPUs that have Kepler architecture (so the GTX 660 and up). G-Sync monitors will be available starting in Q1 2014.

Source

Consolites have been pwned!

PSR (Panel Self-Refresh) has similar features.

http://www.anandtech.com/show/7208/understanding-panel-self-refresh

http://www.anandtech.com/show/4803/lgs-shuriken-self-refreshing-panel-aims-to-improve-notebook-battery-life-by-up-to-an-hour

http://liliputing.com/2012/04/intel-future-could-use-less-power-panel-self-refresh-tech.html


#27 Sushiglutton
Member since 2009 • 9853 Posts

"In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU."

It seems obvious that the latter is much better; I wonder what makes it so difficult to achieve?


#28  Edited By lowe0
Member since 2004 • 13692 Posts

@Sushiglutton said:

"In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU."

It seems obvious that the latter is much better, I wonder what makes it so difficult to achieve?

It's a holdover from the CRT days, when the refresh rate was tied to the physical process of guiding electrons to the right spot on the screen over and over. Now that it's just an arbitrary timer kept for legacy purposes, it doesn't have to tick at a fixed rate anymore.
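To make that concrete, here's a toy scanout model (invented numbers, not any real driver's behavior) showing where the tear lands when a buffer swap arrives mid-scan:

```python
# Toy scanout model: a 60Hz panel "draws" 1080 lines top to bottom over
# each ~16.7 ms refresh, just like a CRT beam used to.  A buffer swap
# that lands mid-scan splits the picture at whatever line is being drawn
# at that instant: that's the tear line.
LINES = 1080
REFRESH_MS = 1000 / 60

def tear_line(swap_time_ms):
    """Scanline on screen when the swap lands, or None if the swap hits
    the boundary between refreshes (i.e. classic v-sync timing)."""
    phase = swap_time_ms % REFRESH_MS              # position within the current scan
    line = int(phase / REFRESH_MS * LINES)
    return None if line == 0 else line

print(tear_line(8.33))    # swap halfway through the scan: tear near mid-screen
print(tear_line(16.67))   # swap essentially at the refresh boundary
```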