NVIDIA G-SYNC Display Technology Explored


Back in September at a press event held in Montreal, NVIDIA CEO Jen-Hsun Huang announced what he called “one of the most important works NVIDIA has done for computer graphics.” The technology was called G-SYNC. G-SYNC is an end-to-end graphics and display architecture that starts with a Kepler-based GPU and ends with a G-SYNC module inside a monitor. The G-SYNC module is a piece of hardware that replaces the scaler inside a display/gaming monitor and essentially does away with the fixed vertical refresh rates of current displays.

To put it simply, what G-SYNC does is keep a display and the output from a Kepler-based GPU in sync, regardless of frame rates or whether or not V-Sync is enabled. Instead of the monitor controlling the timing and refreshing at, say, every 60Hz, with G-SYNC the timing control is transferred to the GPU. The GPU scans a frame out and the monitor then draws it. The monitor doesn't update until that frame is fully drawn, and as soon as the next frame is ready, the monitor refreshes again as quickly as it can, in lockstep with the GPU.
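For readers who think in code, the toy timing model below (a rough sketch with made-up frame times, not NVIDIA's implementation) captures the basic difference: with a fixed 60Hz refresh, a finished frame has to wait for the next refresh tick before it appears, while with a G-SYNC-style variable refresh the display updates the moment each frame is ready.

import math

REFRESH_MS = 1000.0 / 60.0                       # ~16.7 ms per fixed refresh tick
frame_times_ms = [14.0, 20.0, 17.5, 12.0, 22.0]  # hypothetical per-frame render times

def fixed_refresh_display_times(frame_times):
    """Fixed 60Hz display: each finished frame waits for the next refresh tick."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft                                   # frame finishes rendering here
        shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return shown

def variable_refresh_display_times(frame_times):
    """G-SYNC-style display: refreshes the moment each frame is ready."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(t)                           # no waiting for a fixed tick
    return shown

# Simplified model: assumes the GPU renders back-to-back; real V-Sync can stall the GPU.
for ft, fixed, vrr in zip(frame_times_ms,
                          fixed_refresh_display_times(frame_times_ms),
                          variable_refresh_display_times(frame_times_ms)):
    print(f"render {ft:4.1f} ms | fixed refresh shows it at {fixed:6.1f} ms | "
          f"variable refresh shows it at {vrr:6.1f} ms")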


NVIDIA G-SYNC Display Technology Demo

During the original announcement, we recorded some video of G-SYNC in action, as you can see above. We must point out that this live demo doesn't really do the technology justice, because the camera can't pick up all of the issues being displayed on-screen and its recording frame rate is out of sync with the monitors. To see it live is to truly appreciate G-SYNC, but the tech is explained in plain English in the video, which some of you may find helpful. If you were there live, what you would have seen was that the animation on the G-SYNC enabled screen was as smooth as silk and devoid of any synchronization-related anomalies or stutters, whereas the traditional screen showed significant stutter and tearing. The difference live was like night and day.


Asus, BenQ, Philips and ViewSonic have signed up to offer G-SYNC enabled monitors.

Just recently, we were given the opportunity to experience G-SYNC on our own and experiment with the technology in-house. In this piece, we're going to talk about some of our experiences with G-SYNC, but before we do, we want to dive a little deeper into the technology and explain what really makes G-SYNC tick. To do so, we must first talk about what G-SYNC is designed to fix.

As it stands today, gamers can typically choose to play their games with V-Sync (vertical sync) enabled or disabled. V-Sync is an ancient technology that essentially allows the output from a video source to synchronize properly with a display at a given fixed frequency, the most common of which is 60Hz. That may sound all well and good, but if the graphics output is coming at a rate above or below the vertical refresh rate of the screen, a number of issues are introduced. Disabling V-Sync may seem like the simple answer, but that causes a whole new set of problems.


V-Sync Enabled - Stutters and Lag

The diagram above illustrates what happens between a GPU and a display when V-Sync is enabled. In the image, the panel is refreshing every 16ms (roughly 60Hz), but the GPU is rendering frames at varying intervals. Frame 1 renders in less than 16ms, so a bit of lag is introduced while it waits for the next refresh before the screen updates. Frame 2 takes longer than 16ms to render, so the previous frame is held on-screen for an extra refresh, which causes stutter in the animation as well as input lag. And so on. V-Sync would be an ideal solution if frames were rendered and output at a steady 60Hz as well, but that's not how today's games work. It's common for today's games to exhibit significant variations in frame time, and it's rare that the GPU and display are actually in sync.
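To put some hypothetical numbers to the diagram (the frame times below are invented for illustration, and the model ignores GPU stalls and buffering details), the snippet shows how any frame that misses a 16.7ms refresh boundary both waits before it appears and forces the previous frame to be held for an extra refresh:

import math

REFRESH_MS = 1000.0 / 60.0                       # fixed ~16.7 ms refresh interval
frame_times_ms = [14.0, 21.0, 16.0, 25.0, 15.0]  # hypothetical render times

t = 0.0            # time at which the current frame finishes rendering
prev_shown = 0.0   # refresh tick at which the previous frame appeared
for i, ft in enumerate(frame_times_ms, start=1):
    t += ft
    shown = math.ceil(t / REFRESH_MS) * REFRESH_MS   # next refresh boundary
    wait = shown - t                                 # added lag before the frame is seen
    held = round((shown - prev_shown) / REFRESH_MS)  # refreshes the old image stayed up
    print(f"frame {i}: renders in {ft:4.1f} ms, waits {wait:4.1f} ms, "
          f"previous frame held for {held} refresh(es)")
    prev_shown = shown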


V-Sync Disabled - Tearing

Disabling V-Sync does away with that input lag, but introduces tearing on-screen. The diagram above illustrates what happens between the GPU and display when V-Sync is disabled. The GPU essentially pumps out frames as fast as it can, regardless of whether or not the display can keep up. The result is that parts of adjacent frames are drawn on-screen during a single refresh, and since the scene's elements are usually in different positions from one frame to the next, visible tear lines are introduced.


On-Screen Tearing with V-Sync Disabled

As you can see, this scene exhibits tears at the very top and just above the gun, because three adjacent frames were output to the display faster than the display could draw them. This is what happens in virtually every game when playing with V-Sync disabled. If frame rates are high enough, the tearing may be tough to see, but rest assured, it's there.
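As a rough sketch of why the tear lines land where they do (hypothetical numbers, and a deliberately simplified view of scanout), the snippet below estimates the scanline at which a buffer swap becomes visible when it arrives partway through a refresh:

REFRESH_MS = 1000.0 / 60.0   # one full top-to-bottom scanout takes ~16.7 ms
LINES = 1080                 # vertical resolution being scanned out

def tear_rows(flip_times_ms):
    """Estimate the scanline where each mid-scan buffer swap becomes visible."""
    rows = []
    for t in flip_times_ms:
        progress = (t % REFRESH_MS) / REFRESH_MS   # fraction of the scan completed
        rows.append(int(progress * LINES))
    return rows

# Three buffer swaps landing within a single refresh, as in the screenshot above:
print(tear_rows([2.0, 6.5, 11.0]))   # three tear lines, roughly rows 130, 420 and 710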


Tomb Raider Frame Rates Over Time: Red Line = V-Sync On, Green Line = V-Sync Off


Crysis 3 Frame Rates Over Time: Red Line = V-Sync On, Green Line = V-Sync Off

The two graphs above show how frame rates are affected when enabling or disabling V-Sync in a couple of games, Tomb Raider and Crysis 3, running on a GeForce GTX 760. With V-Sync enabled (red line) and the games configured for high image quality settings to target the 40-60 FPS range, it is not uncommon to see frame rates locked at 30 FPS for stretches of time (half the monitor's refresh rate), which means many frames are duplicated and lag is introduced. Or, as is the case with Tomb Raider, you see wild, constant fluctuations between 30 and 60 FPS. With V-Sync disabled (green line), each frame is output at a completely different rate, which is never actually in sync with the display.
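That 30 FPS lock falls out of simple arithmetic. Assuming plain double-buffered V-Sync on a 60Hz panel (a simplified model; triple buffering and driver frame queues change the details), any frame that misses the 16.7ms refresh boundary waits for the next one, so the effective frame rate snaps down to 60, 30, 20 or 15 FPS:

import math

REFRESH_MS = 1000.0 / 60.0   # 60Hz panel

def effective_fps_with_vsync(render_ms):
    """Effective frame rate when every frame must wait for a refresh boundary."""
    intervals = math.ceil(render_ms / REFRESH_MS)   # refresh intervals the frame occupies
    return 1000.0 / (intervals * REFRESH_MS)

for render_ms in (15.0, 17.0, 25.0, 34.0):
    print(f"{render_ms:4.1f} ms per frame -> "
          f"{effective_fps_with_vsync(render_ms):4.1f} FPS with V-Sync")
# 15 ms stays at 60 FPS, but 17 ms and 25 ms both drop to 30 FPS, and 34 ms to 20 FPS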


Comments
basroil one year ago

G-Sync will finally kill HDMI and DVI on desktops, since only DP can transfer using packets rather than timed signals. It might also make GPUs cheaper and faster, since you don't need to worry about GPU side timing as much, and finally have true 24p video without a 1k+ TV. Looking forward to it, and if they can make generic versions for dell monitors it's even better.

Dave_HH one year ago

It might be a while before HDMI and DVI are killed off but it's good to see the advancement and this is pretty significant technology for sure.

semitope one year ago

the technology will work over HDMI. Not sure why it would kill DVI and HDMI. Sounds like most of the processing goes on in the monitor and it simply waits for signals from the GPU. The GPU is only kinda in control in that it is a bottleneck.

Is there actually special hardware in the GPU? It sounds like any GPU, even AMD, could end up working with this. The problem is nvidia and their obsession with proprietary but monitor manufacturers could implement their own technology that would work with any GPU and market it themselves, right? They can finally start labelling their products "Smart" monitors

basroil one year ago

"It might be a while before HDMI and DVI are killed off"

Never said it would be overnight, but DP was designed to replace DVI and still nobody uses it, so a killer application might be the push it needs to get rid of DVI by 2016-18.

"the technology will work over HDMI. Not sure why it would kill DVI and HDMI"

It will NOT work with DVI and HDMI, at least not in a way that would allow them to still use the trademarked names and call it DVI/HDMI. While the cables are perfectly fine, DVI is only timing based with fixed clocks according to specifications, and HDMI is a superset of DVI. DP gets around the timings issue by being programmable packet based communication, so theoretically you can set your link clock to the maximum and just send the right timing data along with the image data. DVI and HDMI could theoretically allow for such use, but it would likely require dropping TMDS which I doubt they would want (since that's a separate technology and they can double charge you)

thomas.gershman 9 months ago

I don't agree that it will kill HDMI/DVI, but it will certainly make an impact in the next few years.

nfs3freak 8 months ago

Agreed. It can't kill off HDMI/DVI. VGA is still around and it will continue to be. DP will increase as more monitors have it and support it.

I just can't wait for ASUS ROG 27inch GSync...I just wish it wouldn't be 800 bucks.

illus1ons 8 months ago

Does this work with any kind of LCD monitor?

RBloch one year ago

Too bad it's an exclusive for Asus till the end of 2014. Big mistake.

Dave_HH one year ago

No, not true. It is only Asus that's offering the upgradable monitor through the end of this year. Not next year.

MikeCammisa one year ago

Too bad I'm broke

BigKihd one year ago

I would like to see this thing in action, plus I don't know when I would be able to get it or afford one; I just bought a 27in 2560x1440 IPS screen.

thunderdan602 one year ago

Another gadget to send me further into the poor house. Yes!
