Xbox One already has an answer to Nvidia G-Sync

This topic is locked from further discussion.


#1 Far_RockNYC
Member since 2012 • 1244 Posts

http://gearnuke.com/xbox-one-already-has-an-answer-to-nvidia-g-sync/

Mentioned briefly during a recent Digital Foundry interview with the Xbox One architects, the console's scaler will allow developers to employ a dynamic framebuffer in games without additional software intervention, since the scaler parameters can be altered on a frame-by-frame basis.

Theoretically, this implies a locked frame rate of either 30 or 60 frames per second without the need for v-sync, with resolution varying instead of frame rate. In turn, this would eliminate input lag, stutter, and screen tearing, much like Nvidia's goal with G-Sync. While G-Sync essentially uses a dynamic refresh rate kept in sync with the GPU's frame output, the Xbox One's scaler can maintain a constant frame rate by dynamically scaling back resolution whenever the GPU load exceeds what the target frame rate allows.

Multiple industry sources have indicated that developers are keen on making use of this hardware feature. However, given the unfinished and evolving state of early Xbox One development kits and considering that development on launch titles was already well underway, we won’t be seeing its utilization any time soon.
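
For reference, a minimal sketch of what frame-by-frame resolution scaling along these lines could look like on the engine side. Every name here is made up -- the actual Xbox One scaler interface isn't public -- but the idea is simply to size the next frame from how the last one tracked its time budget, and let the hardware scaler upsample to 1080p:

// Minimal sketch of frame-by-frame dynamic resolution scaling.
// Hypothetical names throughout; the real Xbox One scaler API is not public.
#include <algorithm>
#include <cstdio>

struct RenderTarget { int width; int height; };

// Choose the next frame's render resolution from how close the last frame
// came to its time budget (33.3 ms for 30 fps, 16.7 ms for 60 fps).
RenderTarget ChooseRenderResolution(double lastGpuFrameMs, double budgetMs)
{
    // load > 1.0 means the last frame blew the budget, so render smaller.
    double load = lastGpuFrameMs / budgetMs;

    // Clamp between roughly 720p and full 1080p; the hardware scaler would
    // then upsample whatever we render to the fixed 1080p output every frame.
    double scale = std::clamp(1.0 / load, 0.667, 1.0);

    return { static_cast<int>(1920 * scale), static_cast<int>(1080 * scale) };
}

int main()
{
    // A 40 ms frame against a 33.3 ms budget drops the next frame's resolution.
    RenderTarget rt = ChooseRenderResolution(40.0, 33.3);
    std::printf("next frame renders at %dx%d\n", rt.width, rt.height);
}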


#2  Edited By Kinthalis
Member since 2002 • 5503 Posts

That makes no sense.

How can the GPU tell how long it will take to render a specific frame, without first rendering that frame? That's not possible. So all this is doing is lowering the resolution in RESPONSE to a lower frame rate, which is too late. The update cycle on the monitor has already passed by.

This might make hitting 30 FPS a bit easier for developers, with less volatility in frame rate, especially at sub-30 FPS, but you still need to sync to the monitor. You're still going to get perceptible animation stutter and input lag, especially at lower frame rates.

Not only that, but on top of it all (all the same issues that have always existed) you end up with even LOWER resolutions! Ha!

So now they can basically outright lie to you. They can say: this game runs at 1080p. But that's only 50% of the time; the other 50% of the time the scaler is telling the GPU to render at 720p because otherwise the frame rate will drop below 30.
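
To put numbers on the point Kinthalis is making, here's a toy example with invented frame costs: the resolution choice is made before a frame renders, so the frame where the load spikes still misses its 33.3 ms window, and only the frame after it gets the cheaper resolution.

// Toy illustration of purely reactive scaling against a 30 fps (33.3 ms) budget.
// gpuCostAt1080p[] is an invented load spike; real engines are far messier.
#include <algorithm>
#include <cstdio>

int main()
{
    const double budgetMs = 33.3;
    const double gpuCostAt1080p[] = { 30.0, 30.0, 45.0, 45.0, 30.0 };
    double scale = 1.0;   // fraction of the full 1080p rendering cost (1.0 = native)

    for (int frame = 0; frame < 5; ++frame)
    {
        double cost = gpuCostAt1080p[frame] * scale;   // cheaper at lower resolution
        std::printf("frame %d: scale %.2f, %.1f ms %s\n", frame, scale, cost,
                    cost > budgetMs ? "-- MISSED the vsync window" : "-- on time");

        // The scaler can only react for the *next* frame, after the miss has
        // already happened; aim 10% under budget as a safety margin.
        scale = std::min(1.0, 0.9 * budgetMs / gpuCostAt1080p[frame]);
    }
}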


#3 barrybarryk
Member since 2012 • 488 Posts
@Far_RockNYC said:

Theoretically, this implies a locked frame rate of either 30 or 60 frames per second without the need for v-sync, with resolution varying instead of frame rate. In turn, this would eliminate input lag, stutter, and screen tearing

Yeah, that's not what V-sync or G-Sync do.

It's about making sure the display is ready for the frame before sending it, not locking it to 30 or 60 fps and just hoping the timings match up.
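
A toy comparison of the two behaviors, assuming a 60 Hz panel, a single frame that finishes at 21 ms, and no driver queueing at all: with plain v-sync the frame waits for the next scanout, while a variable-refresh display in the G-Sync style refreshes when the frame is ready.

// Toy comparison: fixed 60 Hz refresh + v-sync vs. variable refresh.
// Assumes one frame finishing at t = 21 ms and no buffering or queueing.
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs  = 1000.0 / 60.0;  // one 60 Hz scanout interval
    const double frameReady = 21.0;           // GPU finishes the frame at 21 ms

    // V-sync on a fixed-refresh panel: hold the frame until the next vblank.
    double vsyncShownAt = std::ceil(frameReady / refreshMs) * refreshMs;  // ~33.3 ms

    // Variable refresh: the panel starts its refresh when the frame arrives.
    double vrrShownAt = frameReady;                                       // 21 ms

    std::printf("v-sync displays it at %.1f ms, variable refresh at %.1f ms\n",
                vsyncShownAt, vrrShownAt);
}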


#4  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

I've seen that in practice before and the end result is ugly as hell. During graphics-intensive scenes the image quality noticeably drops. Aliasing gets worse, the screen gets muddy, and the crisp details go away. This is not something you want.

Basically any competent dev isn't going to ruin their image quality and litter their game with inconsistent graphics quality. They'll do a better job of utilizing their available resources and keeping the framerate steady throughout even if that means a slight reduction in overall quality. A good art direction can go a long way in these situations.

This would only make porting and crunch time easier. Instead of having to re-engineer entire parts of their graphics engine or do large-scale optimization work on all of their assets, they could just use this as a quick fix, saving themselves a lot of time at the cost of image quality.

This isn't something to brag about.


#5  Edited By Kinthalis
Member since 2002 • 5503 Posts

Lol, so I went ahead and read the actual interview. NO MENTION of G-Sync. The dynamic scaling isn't even discussed in the context of v-sync.

So this is basically just some random console gaming website that doesn't understand the hardware OR the technology, making shit up based on a tidbit of an interview they clearly didn't comprehend.

Lol.

TC, please change title.


#6 cfisher2833
Member since 2011 • 2150 Posts

Yeeeaaaahhh....not really the same thing at all....at all. Gsync syncs the monitor's refresh rate to the GPU, whereas this simply scales down the resolution the game is rendering at when there's a particularly intense moment. With Gsync you can run at sub-60fps and still get an incredibly smooth experience, whereas this just downgrades the image quality on the fly. The two technologies aren't even in the same ballpark. Sony and Microsoft don't do shit for computer graphics technology--it's all Nvidia and AMD.


#7 Gue1
Member since 2004 • 12171 Posts

Triple buffering will do fine for consoles. Much better than MS's solution of dropping resolution.


#8  Edited By Kinthalis
Member since 2002 • 5503 Posts

@Gue1 said:

Triple buffering will do fine for consoles. Much better than MS's solution of dropping resolution.

The dynamic scaler has nothing to do with v-sync.

Also, triple-buffered v-sync adds complexity to the renderer, has overhead associated with it, takes up more RAM, and increases input latency. Since it allows for greater frame rate variability, you still have the issue of animation stutter unless the console can consistently hold frame rates above the target. And if it can, then double-buffered v-sync would be a better solution.
Also, trippled buffered Vsync adds complexity to the renderer, has overhead associated with it, takes up more RAM, and increases input latency. Since it allows for greater frame rate variability, you still have the issue of animation stutter too unless the console can consistently hold frame rates above the target one. And if that's true, then double buffered V-sync would be a better solution.