How AMD's FreeSync Panel Replay Tech Aims To Boost Gaming Laptop Battery Life
An AMD-modified version of VESA's Panel Self Refresh could make the tech applicable in more situations, saving more battery. Before we tell you about FreeSync Panel Replay, let's talk about where it came from and what it's replacing. FreeSync is of course AMD's variable-refresh-rate technology. It's essentially an implementation of the VESA Adaptive Sync standard, which is extremely common on laptop displays, particularly gaming laptops.
These displays generally also already support an Embedded DisplayPort feature called Panel Self Refresh (PSR). This lets the screen refresh itself without the involvement of the GPU when there's nothing new to show. This saves power, as parts of the GPU—particularly the memory interface—can briefly power down.
Panel Self Refresh is already common on laptops, but it can't be used at the same time as Variable Refresh Rate (VRR) modes like FreeSync. It also has other drawbacks that make it undesirable while gaming, such as a latency hit when switching from self-refresh back to the live feed.
Well, AMD is apparently working on a technology, known as FreeSync Panel Replay, that resolves these faults. It's fundamentally similar to PSR—the panel refreshes itself without the GPU scanning out a new frame—but different in that the GPU and panel remain synchronized. That means you avoid the latency hit when switching between self-refresh and live video, and you can also maintain VRR operation during and after self-refresh.
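To make the distinction concrete, here's a toy model of the behavioral difference described above. Everything in it is our own illustrative assumption—the class, field names, and the 5 ms resync figure are hypothetical, not AMD's implementation or any real driver API—but it captures the two claims: classic PSR drops VRR and pays a resync penalty on exit, while Panel Replay stays synchronized throughout.

```python
# Hypothetical sketch of classic PSR vs. FreeSync Panel Replay behavior.
# All names and numbers are illustrative assumptions, not a real driver API.
from dataclasses import dataclass

RESYNC_LATENCY_MS = 5.0  # assumed cost of re-syncing after classic PSR exits


@dataclass
class Display:
    panel_replay: bool           # True = Panel Replay, False = classic PSR
    self_refreshing: bool = False
    vrr_active: bool = True

    def enter_self_refresh(self) -> None:
        # Classic PSR must drop VRR; Panel Replay keeps the panel synchronized.
        self.self_refreshing = True
        if not self.panel_replay:
            self.vrr_active = False

    def present_frame(self) -> float:
        """Return the extra latency (ms) incurred presenting a new frame."""
        latency = 0.0
        if self.self_refreshing and not self.panel_replay:
            latency = RESYNC_LATENCY_MS  # timing hit when leaving self-refresh
            self.vrr_active = True       # VRR only resumes after the resync
        self.self_refreshing = False
        return latency


psr = Display(panel_replay=False)
psr.enter_self_refresh()
print(psr.vrr_active, psr.present_frame())        # → False 5.0

replay = Display(panel_replay=True)
replay.enter_self_refresh()
print(replay.vrr_active, replay.present_frame())  # → True 0.0
```

In this model, the Panel Replay display never leaves its synchronized state, so waking it for a new frame is free and VRR stays active the whole time—which is the whole pitch of the feature.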
The cover notes for the driver patch.
The news of this feature comes not from an AMD press release or other announcement, but from a patch to AMD's graphics drivers for Linux. The cover note for the thousand-odd-line patch (reproduced above) says that Replay "can be activated in more UI scenarios" compared to PSR, which makes sense to us.
There's no telling how much power this feature will actually save. We wouldn't expect it to be huge—it's not as if the whole GPU is turning off—but it could be significant on systems with big displays and powerful GPUs. We won't know for sure until we get one in hand to test, of course.