
FreeSync vs. G-Sync

If you’ve ever experienced screen tearing in a PC game, you know how annoying it can be: an otherwise correctly rendered frame ruined by ugly horizontal tear lines and stutter. You can turn on V-Sync, but it caps your frame rate and can add input lag, dragging down performance.

Nvidia and AMD have both stepped up to solve the issue while preserving frame rates, and both have turned to adaptive refresh technology for the solution. That often leads to a very obvious recommendation: If you have an Nvidia GPU, use G-Sync. If you have an AMD GPU, use FreeSync.

But if you have a choice of monitor or graphics card, you may be wondering exactly what the differences are and which syncing technology is best for your setup. Let’s break it down to reveal which is the better option for you.

Performance

G-Sync and FreeSync are both designed to smooth out gameplay, reduce input lag, and prevent screen tearing. They take different paths to those goals: Nvidia keeps its approach close to the vest, while AMD’s is openly shared. Nvidia’s G-Sync works through a proprietary module built into the monitor itself, while FreeSync uses the graphics card to manage the monitor’s refresh rate via the Adaptive-Sync protocol built into the DisplayPort standard. The result is a difference in performance.
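
The underlying idea both share is the same: instead of refreshing on a fixed schedule, the display refreshes each time the GPU finishes a frame, as long as the frame rate stays inside the panel’s supported range. Below is a minimal, purely illustrative Python sketch of that behavior; the panel limits, function name, and out-of-range handling are our own simplified assumptions, not part of either vendor’s specification.

    # Illustrative sketch of variable refresh rate (VRR) behavior.
    # Panel limits and the clamping rule are hypothetical simplifications.

    PANEL_MIN_HZ = 48    # slowest refresh this example panel supports
    PANEL_MAX_HZ = 144   # fastest refresh this example panel supports

    def effective_refresh_hz(frame_rate_hz):
        """Return the refresh rate the panel runs at for a given frame rate."""
        if PANEL_MIN_HZ <= frame_rate_hz <= PANEL_MAX_HZ:
            # Inside the VRR window the panel simply follows the GPU, so each
            # frame is scanned out exactly once and tearing can't occur.
            return frame_rate_hz
        # Outside the window the refresh rate is clamped to the panel's limits,
        # which is where flicker or stutter can creep back in.
        return min(max(frame_rate_hz, PANEL_MIN_HZ), PANEL_MAX_HZ)

    for fps in (30, 60, 90, 144, 200):
        print(f"{fps:>3} fps -> panel refreshes at {effective_refresh_hz(fps)} Hz")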

Users note that enabling FreeSync reduces tearing and stuttering, but some monitors exhibit another problem: ghosting. As objects move on the screen, they leave shadowy traces of their previous position. It’s an artifact that some people don’t notice at all, but it annoys others.

Plenty of fingers get pointed at possible causes, but the physical reason is power management. If the pixels don’t receive enough power, the image shows gaps; apply too much, and you’ll see ghosting. Balancing adaptive refresh technology against proper power distribution is hard.


Both FreeSync and G-Sync also suffer when the frame rate doesn’t stay within the monitor’s supported refresh range. G-Sync can exhibit flickering at very low frame rates, and while the technology usually compensates to fix it, there are exceptions. FreeSync, meanwhile, suffers from stuttering if the frame rate drops below a monitor’s stated minimum refresh rate. Some FreeSync monitors have an extremely narrow adaptive refresh range, and if your video card can’t deliver frames within that range, problems arise.

Most reviewers who’ve compared the two side by side seem to prefer the quality of G-Sync, which doesn’t show stutter issues at low frame rates and is thus smoother in real-world situations. It’s also worth noting that ongoing updates to both technologies (and to GPUs) are slowly ironing out these problems.

Selection

One of the first differences you’ll hear people talk about with adaptive refresh technology, besides the general rivalry between AMD and Nvidia, is the difference between a closed and an open standard. While G-Sync is proprietary Nvidia technology and requires the company’s permission and cooperation to use, FreeSync is free for any developer or manufacturer to use. Thus, there are more monitors available with FreeSync support.

In most cases, you can’t mix and match the two technologies. Monitors will work regardless of the graphics card’s brand, and some offer both FreeSync and G-Sync support, but G-Sync itself is only available on Nvidia graphics cards. FreeSync works on all AMD cards and on some Nvidia cards, too. There’s a catch, though: it’s only guaranteed to work correctly on FreeSync monitors that are certified Nvidia G-Sync Compatible. Those monitors have undergone rigorous testing and are approved by Nvidia to ensure that adaptive sync runs smoothly across its card range. Here’s a current list of certified monitors.

If you go the Nvidia route, the module inside the monitor handles the heavy lifting of adjusting the refresh rate. G-Sync monitors tend to be more expensive than their FreeSync counterparts, although there are now more affordable G-Sync monitors available, like the Acer Predator XB241H.

Most recent-generation Nvidia graphics cards support G-Sync. Blur Busters maintains a good list of compatible Nvidia GPUs you can consult to see whether your current card supports it. Nvidia, meanwhile, publishes specific requirements for G-Sync-rated desktops and laptops if you want a more thorough check of your system.


You won’t end up paying much extra for a monitor with FreeSync, since there’s no licensing premium for the manufacturer to include it, unlike G-Sync. FreeSync monitors in the mid-hundreds frequently come with a 1440p panel and a 144Hz refresh rate (where their G-Sync counterparts might not), and monitors without those features can run as low as $160.

Premium versions

G-Sync and FreeSync aren’t just features; they’re also certifications that monitor manufacturers have to meet. While the basic specifications allow for frame syncing, more stringent premium tiers of both G-Sync and FreeSync exist, too. If a monitor meets these more demanding standards, users can feel confident they’re getting a higher-quality display.

AMD’s premium options include:

  • FreeSync Premium: Premium requires monitors to support a native refresh rate of at least 120Hz at 1080p resolution. It also adds low frame rate compensation (LFC), which duplicates frames when the frame rate drops below the monitor’s minimum refresh rate to keep the experience smooth (see the sketch after this list).
  • FreeSync Premium Pro: Previously known as FreeSync 2 HDR, this premium version of FreeSync is specifically designed for HDR content; monitors that support it must guarantee at least 400 nits of HDR brightness, along with all the benefits of FreeSync Premium.
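
To make the low frame rate compensation idea concrete, here is a short illustrative sketch of the frame-repeating logic; the numbers and names are hypothetical, and this is not AMD’s actual driver code. When the GPU falls below the panel’s minimum refresh rate, each frame is shown multiple times so the effective refresh rate lands back inside the supported window.

    # Illustrative sketch of low frame rate compensation (LFC).
    # Panel limits and function name are hypothetical, not AMD's driver logic.

    PANEL_MIN_HZ = 48
    PANEL_MAX_HZ = 144

    def lfc_schedule(frame_rate_hz):
        """Return (repeat_count, effective_refresh_hz) for a given frame rate."""
        repeats = 1
        # Repeat each rendered frame until the panel is refreshing at or
        # above its minimum rate, avoiding the stutter of dropping below it.
        while frame_rate_hz * repeats < PANEL_MIN_HZ:
            repeats += 1
        return repeats, min(frame_rate_hz * repeats, PANEL_MAX_HZ)

    for fps in (20, 35, 47, 60):
        repeats, refresh = lfc_schedule(fps)
        print(f"{fps} fps: show each frame {repeats}x -> panel runs at {refresh} Hz")

For example, at 35 fps the sketch shows each frame twice, so the panel keeps refreshing at 70Hz instead of stalling below its 48Hz floor.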

Nvidia’s G-Sync options are tiered, with G-Sync Compatible at the bottom, offering basic adaptive sync functionality on monitors that weren’t designed with G-Sync in mind (some FreeSync monitors meet its minimum requirements). G-Sync proper is the next step up, with the most capable monitors earning G-Sync Ultimate status:

  • G-Sync Ultimate: Ultimate is similar to FreeSync Premium Pro, a more advanced tier for the more powerful GPUs and for monitors designed for HDR support and low latency. It used to demand a minimum brightness of 1,000 nits, but that requirement was recently relaxed to VESA DisplayHDR 400 compatibility, or around 400 nits.

Conclusion

Both Nvidia’s G-Sync and AMD’s FreeSync offer excellent features that can significantly improve your gaming experience, but which is better? The feature list for G-Sync monitors, particularly those rated G-Sync Ultimate, is longer, but not so much that you should definitively pick a G-Sync monitor over a FreeSync one. Indeed, if you already have a decent graphics card, buying a monitor to match your GPU makes the most sense.

Discounting the cost of any other components, expect to spend at least a few hundred dollars on a G-Sync monitor (our budget pick can be found for around $330). For a compatible G-Sync graphics card, prices vary considerably due to current shortages. Mid-range options like the RTX 3060 are releasing shortly and will offer fantastic performance for around $400, but they will be in short supply. Other new-generation cards are undoubtedly hard to find and cost upwards of $500 even when available.

FreeSync monitors and FreeSync-supporting GPUs are generally less expensive. For example, the AMD Radeon RX 590 graphics card costs around $200. However, the more powerful graphics cards are quite hard to find in the early months of 2021. Waiting a few months to buy a new RX 6000 card at a reasonable price would be better than paying over MSRP right now.
