FreeSync Vs G-Sync – What Are They? And Which Is Best For You In 2023?
If you’re looking to buy a monitor with a high refresh rate, like 144 Hz or 240 Hz, then you must have noticed how most of these monitors come with one of two features: FreeSync or G-Sync. But what are these technologies all about? How are they different from one another? And most importantly, is one better than the other? If these are the kinds of questions that have been bugging you, then you’ve come to the right place. In this article, we’ll be discussing FreeSync vs G-Sync and answering all of these questions and more.
What Are FreeSync And G-Sync?
FreeSync and G-Sync are what are known as adaptive sync technologies. They serve as alternatives to the VSync option that you can find in the graphics menus of many video games. Now, if you aren’t sure what VSync does, here’s a quick rundown.
It all has to do with the monitor. A monitor’s refresh rate, which is measured in hertz (Hz), determines how many frames the monitor can display each second.
In theory, this acts as a hard cap on your FPS, or frames per second. Even though your GPU may be able to render 80 FPS in a certain game, a 60 Hz monitor can only flip through 60 of those frames every second. This imbalance, however, is usually not as harmless as we’ve just made it sound.
A Short Description Of What VSync Does
In this example, the GPU keeps pushing all 80 frames to the monitor even though the monitor physically can’t handle that. The result of this dysfunctional relationship is what we call screen tearing, where the screen shows pieces of two different frames at once. Nobody wants that, and that is where VSync comes in.
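If you like to see things in numbers, here’s a tiny Python sketch of that same 80 FPS on 60 Hz example. It’s a toy model, not how a real GPU or display driver actually works: it just counts how many of the monitor’s redraws contain a mid-scan frame change, which is the basic ingredient of tearing.

```python
# Toy model: an 80 FPS game on a 60 Hz monitor. Counts how many redraws
# contain a mid-scan frame change (the basic ingredient of screen tearing).

def count_torn_refreshes(fps, refresh_hz, seconds=1.0):
    frame_interval = 1.0 / fps          # a new frame every ~12.5 ms at 80 FPS
    refresh_interval = 1.0 / refresh_hz # a redraw every ~16.7 ms at 60 Hz
    torn = 0
    for i in range(int(seconds * refresh_hz)):
        start = i * refresh_interval
        end = start + refresh_interval
        # Which frames did the GPU finish during this single redraw?
        first_frame = int(start / frame_interval)
        last_frame = int(end / frame_interval)
        if last_frame > first_frame:
            # The frame changed while the screen was still being drawn, so the
            # top and bottom of the screen end up showing different frames.
            torn += 1
    return torn

print(count_torn_refreshes(80, 60))  # ~60: practically every redraw tears
```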
The purpose of VSync is to impose a hard cap on your FPS so that you don’t have to put up with screen tearing while you game. And this works perfectly well on 60 Hz monitors: if your GPU can manage a stable 60 FPS, then that’s what you get. If not, it’ll reduce the cap to 30 FPS.
Unfortunately, the situation isn’t that simple with higher frame rates and refresh rates. Common issues with VSync include stuttering and input lag, neither of which is desirable if you’ve already gone out of your way to buy a 144 Hz or a 240 Hz monitor.
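To give you a rough idea of why the stutter happens, here’s a little Python sketch of how classic double-buffered VSync behaves. It’s a simplification (real drivers add tricks like triple buffering), but the basic step-down behaviour is the point: whenever the GPU can’t keep up, the displayed frame rate snaps down to the refresh rate divided by 2, then 3, and so on.

```python
# Toy model of classic double-buffered VSync: if the GPU can't hold the full
# refresh rate, the displayed frame rate drops to refresh / 2, refresh / 3, ...

def vsync_frame_rate(gpu_fps, refresh_hz):
    divisor = 1
    while refresh_hz / divisor > gpu_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_frame_rate(80, 60))    # 60.0 -> fine, the GPU keeps up
print(vsync_frame_rate(55, 60))    # 30.0 -> just missing 60 FPS halves your frame rate
print(vsync_frame_rate(100, 144))  # 72.0 -> a 100 FPS game locked to 72 on a 144 Hz panel
```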
The Key Difference Between FreeSync, G-Sync, And VSync
So Nvidia and AMD both decided to make their own adaptive sync technologies: Nvidia made G-Sync and AMD made FreeSync. Now, there are certainly some key differences between these two technologies, and we’ll get to them in a bit. But in essence, they accomplish the same thing. Whereas VSync throttles your GPU so that it stays in line with the monitor’s refresh rate, adaptive sync adapts the refresh rate to the FPS.
Let’s say, for example, that your GPU can run a certain game at a steady 70 to 90 FPS. VSync would cap that to a fixed 60 FPS, but adaptive sync technologies make sure that the monitor refreshes at 70 to 90 Hz as the FPS fluctuates.
So you can think of it kind of like a dance, with the GPU and monitor as partners. When they’re dancing to VSync, the monitor takes the lead. But when the band starts playing adaptive sync, the GPU takes the lead, which makes for better performance.
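Here’s the same idea as a quick Python sketch, assuming an imaginary adaptive sync panel with a 48 to 144 Hz window: instead of forcing the GPU down to a fixed number, the monitor simply follows whatever frame rate the GPU is currently producing, as long as it stays inside the panel’s supported range.

```python
# Minimal sketch of the adaptive sync idea, assuming an imaginary panel with a
# 48-144 Hz window: the refresh rate follows the GPU's frame rate, clamped to
# whatever range the panel supports.

def adaptive_refresh(gpu_fps, range_min=48, range_max=144):
    return max(range_min, min(gpu_fps, range_max))

# The fluctuating 70-90 FPS example from above:
for fps in (70, 78, 85, 90):
    print(fps, "FPS ->", adaptive_refresh(fps), "Hz")
```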
What Makes Them Unique? FreeSync Vs G-Sync
Now that we know what these two adaptive sync technologies have in common, let’s take a look at what makes each of them unique. For starters, both FreeSync and G-Sync are limited by your choice of GPU. As a rule, FreeSync monitors can only work if you’re using an AMD GPU, and G-Sync is only viable when paired with an Nvidia GPU.
Certain G-Sync Compatible FreeSync monitors have been released recently, so there are exceptions to this rule. But they are still very much just that: a handful of exceptions.
Next up, in order for either of these technologies to work, monitors have to use scaler modules. Needless to say, AMD and Nvidia have different approaches to how these scaler modules are implemented.
OEM Requirements
Nvidia requires OEMs to use its proprietary scaler modules when making G-Sync monitors. This is why G-Sync monitors end up being much more expensive than their FreeSync counterparts: OEMs have to both license the technology and buy the expensive scaler modules directly from Nvidia.
AMD, by contrast, takes a more open approach. OEMs are free to use whichever scaler modules they want, and they don’t even have to pay AMD for a license to implement FreeSync in their monitors. This is why FreeSync monitors are so much more affordable. Most notably, FreeSync monitors only support adaptive sync within a specific frame rate range. What this means is that some FreeSync monitors will only work in a frame rate range between, say, 40 and 75 FPS, while others may work in the 30 to 144 FPS range, and so on.
This is a direct result of AMD’s decision not to impose strict quality control the way Nvidia does. So when buying a FreeSync monitor, always make sure to check the frame rate range specified by the manufacturer.
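If you want a simple way to think about that check, here’s a small Python sketch (the 40 to 75 and 30 to 144 numbers are just the example ranges from above): compare the frame rates you actually hit in your games against the range printed on the monitor’s spec sheet.

```python
# Quick sanity check: do the frame rates you actually hit in your games stay
# inside a FreeSync monitor's advertised range? (A sketch, using the example
# ranges mentioned above.)

def fps_fits_range(lowest_fps, highest_fps, range_min, range_max):
    return range_min <= lowest_fps and highest_fps <= range_max

print(fps_fits_range(70, 90, 40, 75))   # False: 90 FPS overshoots a 40-75 range
print(fps_fits_range(70, 90, 30, 144))  # True: a 30-144 range covers it
```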
On the other hand, because Nvidia forces OEMs to use its proprietary scaler modules, G-Sync monitors have no frame rate restrictions whatsoever. In fact, this strict quality control ensures other side benefits as well, such as motion blur reduction, the elimination of ghosting, and easier monitor overclocking.
FreeSync Vs G-Sync: Conclusion
So, which of these adaptive sync technologies is better? Well, G-Sync is definitely the superior technology from an objective standpoint. But the objective standpoint is not always the one you’ll be using, and it doesn’t mean squat if you’re working on a budget, in which case FreeSync is the better choice. So the most important thing to keep in mind is that, in either case, you get what you pay for. If you have the money to buy a G-Sync monitor, you’ll appreciate it for all of its extra benefits. But if not, a FreeSync monitor will still save you a lot of headaches, just as long as you get one that caters to your desired frame rate range.