Why your next monitor should have AMD FreeSync
When Nvidia introduced G-Sync, its frame-smoothing monitor technology, 18 months ago, we were blown away by how much smoother it made games feel. Even if your graphics card could only output 40fps, you still got a smooth, immersive gaming experience.
However, there was, and still is, a problem with G-Sync: it requires a dedicated Nvidia controller to be built into the monitor, which limits uptake of the technology and makes the monitors expensive. You also, of course, need an Nvidia graphics card. And that's where AMD's FreeSync comes in.
Developed as a direct response to G-Sync, FreeSync is an alternative way of getting to the same result: having a monitor that responds to the framerate of the graphics card, rather than being fixed at, say, 60Hz or 75Hz.
28” 60Hz 4K display with FreeSync from Samsung
What makes it so attractive is that, unlike G-Sync, FreeSync doesn't require any extra hardware from the monitor manufacturers. As a result, AMD has been able to announce 11 new monitors that use FreeSync even though the technology has only just launched here at CES 2015. Meanwhile, there are still relatively few G-Sync monitors despite it having been around for a while, as the G-Sync module requires a license fee and is otherwise quite limited in what it can do.
FreeSync in fact uses Adaptive-Sync, a standard feature of DisplayPort 1.2a (yes, these technologies only work over DisplayPort), and as such it's theoretically possible for many existing monitors to support it simply through a firmware update to the monitor's controller.
For those unfamiliar, the advantage of these technologies is that they eliminate (within certain boundaries) two of the most distracting graphical problems when gaming on a PC.
The first is tearing, which occurs when the monitor doesn't wait for a complete new frame from the graphics card before updating its display, so parts of two or more frames appear in a single screen refresh, creating a stepped or torn look to the image.
Tearing can be fixed by turning on V-Sync, which forces the monitor to wait for a full new frame before updating its image. The problem is, if the graphics card takes longer than one refresh period – i.e. 16.7ms for a 60Hz monitor – to deliver a new frame, the monitor shows the same image twice. This results in an effect called stuttering.
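To put some rough numbers on that, here's a quick back-of-the-envelope sketch (our own illustration, not anything from Nvidia or AMD) of what V-Sync does to the effective framerate on a 60Hz monitor. Any frame that misses a refresh has to wait for the next one, so framerates snap down to 30fps, 20fps and so on:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # one refresh every ~16.7ms

def vsync_effective_fps(render_ms: float) -> float:
    """Effective framerate when every new frame must land on a refresh boundary."""
    refreshes_waited = math.ceil(render_ms / REFRESH_MS)  # whole refreshes consumed per frame
    return REFRESH_HZ / refreshes_waited

for render_ms in (10, 17, 25, 34):
    print(f"{render_ms}ms per frame -> {vsync_effective_fps(render_ms):.0f}fps with V-Sync")
# 10ms -> 60fps, 17ms -> 30fps, 25ms -> 30fps, 34ms -> 20fps
```

Note the cliff edge: a frame that takes just 17ms instead of 16.7ms halves the framerate from 60fps to 30fps, and that sudden jump is exactly the stutter described above.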
G-Sync and FreeSync aim to solve both problems by tying the refresh of the monitor to the graphics card, so if you can only get around 40fps in a game the monitor will essentially just run at 40Hz.
There are some limitations, though. For a start, the effect only works within a certain framerate range: below around 30fps the framerate is simply so low that the experience is still poor, while at the other end it only works up to the maximum refresh rate of your monitor, so if you feel limited by a 60Hz panel, G-Sync and FreeSync won't change that (the sketch below illustrates this). Also, despite the fact that AMD's solution theoretically requires little in the way of new hardware, it's unlikely monitor manufacturers will release new firmware for existing monitors, so you'll have to buy a new one.
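Here's the same sort of sketch for adaptive sync. The 30Hz floor and 144Hz ceiling are illustrative assumptions – real ranges vary from panel to panel, and what happens below the floor depends on the implementation – but it shows why the benefit only applies inside the supported range:

```python
# Illustrative sketch of adaptive sync: within the panel's supported range,
# the monitor's refresh simply tracks the game's framerate. The 30Hz floor
# and 144Hz ceiling are assumed example values, not a spec from AMD.

RANGE_MIN_HZ = 30.0   # assumed bottom of the variable refresh range
RANGE_MAX_HZ = 144.0  # assumed maximum refresh rate of the panel

def adaptive_refresh_hz(game_fps: float) -> float:
    """Refresh rate the monitor would run at for a given game framerate."""
    if game_fps >= RANGE_MAX_HZ:
        return RANGE_MAX_HZ  # capped: a faster GPU gains nothing here
    if game_fps < RANGE_MIN_HZ:
        return RANGE_MIN_HZ  # out of range: the smoothing benefit is lost
    return game_fps          # in range: refresh follows the framerate

for fps in (25, 40, 90, 200):
    print(f"{fps}fps -> monitor refreshes at {adaptive_refresh_hz(fps):.0f}Hz")
```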
27” 144Hz QHD display from BenQ
You'll also likely have to buy a new graphics card, as outputting the adaptive-sync signal is only supported on the latest AMD graphics cards and APUs, with Intel and Nvidia yet to get on board.
However, because FreeSync is an open standard that any monitor controller maker can use, there's a good chance it'll soon become standard, whichever graphics card or monitor you buy.
The full range of FreeSync monitors will be arriving over the next few months and already includes 4K, ultra-wide and budget models from the likes of BenQ, LG and Samsung.