NVIDIA G-Sync Refresher
With adaptive refresh rate technologies like G-SYNC, instead of the monitor controlling the timing and refreshing at, say, 60Hz, timing control is transferred to the GPU. The GPU scans a frame out to the monitor, where it's drawn on the screen, and the monitor doesn't update again until that frame is done drawing. Then, as soon as the next frame is ready, the monitor updates again as quickly as possible, in lockstep with the output from the GPU.
During NVIDIA's original unveiling of G-SYNC, we recorded some video of the technology in action--we have it posted above. We must point out that this live demo doesn’t really do the technology justice, because the camera can’t pick up all of the issues visible on-screen, and the camera's recording frame rate is out of sync with the monitors as well.
You have to see variable refresh rate technologies like G-SYNC live to truly appreciate them, but the tech is explained in plain English in the video, which some of you may find helpful. If you were there live, what you would have seen was that the animation on the G-SYNC-enabled screen was silky smooth and devoid of any synchronization-related anomalies or stutters, whereas the traditional screen showed significant stutter and tearing. The difference during the live demo was like night and day.
As it stands today, gamers can typically choose to play their games with V-Sync (vertical sync) enabled or disabled. V-Sync is an ancient technology that essentially allows the output from a video source to synchronize properly with a display at a given frequency--the most common being 60Hz. That may sound well and good, but if the graphics output arrives at a rate above or below the vertical refresh rate of the screen, a number of issues are introduced. Disabling V-Sync may seem like the simple answer, but that causes a whole new set of problems.
The diagram above illustrates what happens between a GPU and a display when V-Sync is enabled. In the image, the panel is refreshing every 16ms (or roughly 60Hz), but the GPU is rendering frames at different intervals. Frame 1 renders faster than 16ms, so a bit of lag is introduced while it waits for the next refresh. Frame 2 takes longer than 16ms, so the previous frame is repeated on-screen for a second refresh, which causes stutter in the animation and adds input lag. And so on. V-Sync would be an ideal solution if frames were also rendered and output at a steady 60Hz, but that's not how today's games and GPUs work. It's common for today's games to exhibit significant variations in frame time, and it's rare that the GPU and display are actually in sync for any meaningful length of time.
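The V-Sync timing described above can be sketched in a few lines of code. This is a simplified model, not how any driver is actually implemented, and the frame render times are hypothetical:

```python
# Sketch of V-Sync timing on a fixed 60Hz display (hypothetical frame times).
# A finished frame must wait for the next refresh tick to be shown; if
# rendering overruns a tick, the previous frame simply stays on-screen.
import math

REFRESH = 1000 / 60  # ~16.67 ms between refreshes

def vsync_schedule(render_times_ms):
    """Return (display_time_ms, wait_ms) for each frame."""
    results = []
    t = 0.0  # time at which the GPU finishes each frame
    for rt in render_times_ms:
        t += rt
        # the frame can only appear at the next refresh boundary at/after t
        display_at = math.ceil(t / REFRESH) * REFRESH
        results.append((round(display_at, 2), round(display_at - t, 2)))
    return results

# Frame 1 renders fast (10ms) and waits ~6.67ms for the tick at 16.67ms;
# frame 2 (25ms) misses a tick, so frame 1 stays up for two full refreshes.
print(vsync_schedule([10, 25]))  # → [(16.67, 6.67), (50.0, 15.0)]
```

The wait time in the first tuple is the lag the article mentions for fast frames; the 33ms gap between the two display times is the duplicated frame that reads as stutter.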
Disabling V-Sync does away with the input lag, but introduces tearing on-screen. The diagram above illustrates what happens between the GPU and display when V-Sync is disabled. The GPU essentially pumps out frames as fast as it can, regardless of whether or not the display can keep up. As a result, parts of adjacent frames are displayed on-screen simultaneously, and since the scene's components are usually in different positions from one frame to the next, visible tear lines are introduced.
As you can see, this scene exhibits tears at the very top and just above the gun, because three adjacent frames were output to the display faster than the display could draw them. This is what happens in virtually every game when playing with V-Sync disabled. If frame rates are high enough, the tearing may be tough to see, but rest assured, it's there.
The two graphs above show how frame rates are affected when enabling / disabling V-Sync in a couple of games on a GeForce GTX 760, namely Tomb Raider and Crysis 3. With V-Sync enabled (red line), and the games configured with high image quality settings to target the 40-60 FPS range, it is not uncommon to see frame rates locked at 30 FPS for a time (half the monitor's refresh rate), which means many frames are duplicated and lag is introduced. Or, as is the case with Tomb Raider, you see wild, constant fluctuations between 30 and 60 FPS. With V-Sync disabled (green line), each frame is output at a completely different rate, which is never actually in sync with the display.
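The 30 FPS lock in those graphs isn't a coincidence: under V-Sync, sustained frame times snap to whole multiples of the refresh interval, so the only achievable rates on a 60Hz panel are 60, 30, 20, 15 FPS, and so on. A quick sketch of that arithmetic (frame times are hypothetical):

```python
# Sketch: why V-Sync "locks" frame rates to divisors of the refresh rate.
# A frame that misses a refresh waits for the next one, so each frame
# occupies a whole number of refresh intervals (hypothetical frame times).
import math

def effective_fps(frame_time_ms, refresh_hz=60):
    interval = 1000 / refresh_hz          # ~16.67 ms at 60Hz
    intervals = math.ceil(frame_time_ms / interval)
    return refresh_hz / intervals

# 16ms fits in one interval (60 FPS); 18ms needs two (30 FPS);
# 35ms needs three (20 FPS).
print([effective_fps(t) for t in (16, 18, 35)])  # → [60.0, 30.0, 20.0]
```

A game hovering just above or below the 16.67ms budget flips between one and two intervals per frame, which is exactly the wild 30/60 FPS swing the Tomb Raider graph shows.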
Now that we’ve explained what happens when V-Sync is enabled or disabled, understanding what NVIDIA’s G-SYNC technology does should be simple. The image below illustrates how frames are output to a G-SYNC-equipped screen like the ASUS ROG SWIFT PG279Q, when G-SYNC is enabled in NVIDIA’s drivers and the DisplayPort input is used.
With G-SYNC, when a frame is complete, it is scanned out to the screen. Frame 1 finishes in X amount of time, and it’s sent to the display. Frame 2 takes a bit longer, but when it is done, it is sent out to the screen, and so on. G-SYNC removes the fixed refresh rate limitation of most of today’s desktop displays and transfers timing control to the GPU. The screen’s refresh rate is dynamically adjusted to stay in sync with the GPU, regardless of the frame rate, though G-SYNC is limited to refresh rates between 30 and 144Hz with GeForce cards from the GTX 650 Ti on up. With a GeForce GTX 960 or newer, G-SYNC will work all the way up to this panel's 165Hz peak.
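The adaptive behavior just described amounts to the refresh interval tracking the frame time, bounded by the panel's variable-refresh window. A minimal sketch, assuming the 30-144Hz range the article cites and made-up frame times:

```python
# Sketch of G-SYNC-style adaptive refresh (hypothetical frame times).
# The display refreshes the moment a frame is ready, but the panel's
# variable-refresh window is bounded -- here 30-144Hz, per the article.
MIN_INTERVAL = 1000 / 144  # ~6.94 ms: can't refresh faster than 144Hz
MAX_INTERVAL = 1000 / 30   # ~33.33 ms: can't hold off longer than 30Hz

def gsync_interval(frame_time_ms):
    """Time between refreshes for a given frame time, clamped to the panel's window."""
    return min(max(frame_time_ms, MIN_INTERVAL), MAX_INTERVAL)

# A 12ms frame is shown the instant it completes (~83Hz); a 5ms frame is
# capped at the 144Hz ceiling; a 40ms frame falls below the 30Hz floor.
for t in (12, 5, 40):
    print(round(1000 / gsync_interval(t), 1), "Hz")
```

Inside the window there is no waiting for a tick and no repeated frame, which is why the duplication and tearing from the earlier diagrams simply don't occur.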