The Nvidia GeForce GTX 1060 is the graphics card that PC builders on a budget have been waiting for. Unlike Nvidia’s top-end GTX 1080 and GTX 1070 cards, however, the GTX 1060 faces competition from AMD in the form of the Radeon RX 480.
With competition comes aggressive pricing, and the GTX 1060 is far closer to the RX 480 in price than many expected. It’s more expensive and more powerful – but is it worth it? As it turns out, if you have the extra cash then, yes, it is.
Update: Since my original review of the GTX 1060, Nvidia has quietly launched a 3GB, slightly less powerful version of the card. The 3GB GTX 1060 still uses the same GP106 GPU as the 6GB version, but with slightly slower clock speeds and fewer CUDA cores. I have started testing two models that are physically identical (but with different numbers of CUDA cores and different amounts of memory), and so far the results are largely as expected.
The 3GB model appears to be between 2% and 10% slower than the 6GB version in most games I tested, although Rise of the Tomb Raider was an outlier, with a 1440p benchmark that was nearly 20% slower. Trusted won't be releasing full results until I've conducted further analysis, so stay tuned for my full comparison.
My original review continues below.
The GPU is based on a new Nvidia chip, the GP106. The headline-grabbing GTX 1070 and GTX 1080 cards both used the GP104.
So as far as chip design is concerned, instead of taking the form of a “binned” GP104 – a chip that wasn’t good enough to be a 1080 or 1070 – the GTX 1060 is a completely different product.
However, plenty carries over. The GTX 1060 is powered by the same Pascal architecture as its more expensive siblings, and therefore benefits from the 16-nanometre manufacturing process that lets Nvidia cram more transistors onto a given piece of silicon – without increasing power consumption and heat to the same degree.
The GTX 1060 has 1,280 CUDA cores performing the bulk of the graphics legwork, which is more than the 1,024 found on the previous-generation GeForce GTX 960. Intriguingly, the GTX 1060 has a higher boost clock speed than its bigger brother, the GTX 1070, topping out at 1.7GHz.
With a different chip design and fewer CUDA cores, that higher clock speed won’t translate into better performance than the GTX 1070’s, but the GTX 1060 should still be able to play games at Full HD and 1440p – and that’s before we get to overclocking.
Away from the graphics chip itself, the GTX 1060 has 6GB of GDDR5 memory. This is 2GB less than the AMD Radeon RX 480, but still more than enough for modern games at Full HD and 1440p resolutions. The memory runs at 8Gbits/sec, which is standard for GDDR5 these days.
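For context, that quoted 8Gbits/sec is the per-pin data rate; total memory bandwidth depends on the width of the memory bus. Here's a quick sketch of the arithmetic, assuming the 192-bit bus Nvidia lists on its spec sheet for the 6GB card (the bus width isn't mentioned in this review):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s.

    data_rate_gbps: effective per-pin data rate in Gbits/sec (8 for this card)
    bus_width_bits: memory bus width in bits (192 per Nvidia's GTX 1060 spec)
    Divide by 8 to convert bits to bytes.
    """
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(8, 192))  # 192.0 GB/s peak bandwidth
```

That 192GB/s figure is why a narrower bus with fast GDDR5 can still keep up at Full HD and 1440p.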
Nvidia quotes the card as consuming 120W of power – significantly less than the RX 480’s quoted 150W – so the GTX 1060 should fit nicely into builds where power consumption and heat are limiting factors, such as home-theatre PCs and games console replacements.
The GTX 1060 is Nvidia’s cheapest VR-ready graphics card, which could prove to be a big selling point for those who are considering buying a headset in the next year or so.
I was sent the Founders Edition of the GTX 1060, which is a more expensive, Nvidia-built version of the card. In all likelihood this won’t be the version you end up buying, since Nvidia’s partners will have their own cheaper, custom versions available in short order.
Still, it’s a great-looking card with die-cast aluminium highlights and a black plastic shroud. It isn’t the metal-fest of the 1070 or 1080, but it looks exciting nonetheless.
It’s just 9.8 inches long, with just under a third of that length taken up by an extended plastic shroud that contains some of the cooling hardware and the fan. Expect third-party manufacturers to produce extremely compact versions of the 1060, with cards suitable for ultra-small desktops and living-room PCs.
You get three DisplayPort connectors, an HDMI 2.0b port and a DVI port. The DisplayPort 1.4 connectors are ready for HDR gaming, too.