Nvidia: You’ll die less if you play Fortnite and PUBG at 144fps
Graphics card manufacturer Nvidia claims that if you’ve got a top-of-the-line GPU, your kill/death ratio in battle royale titles like Fortnite and PUBG will be much higher – and it’s released some figures to back it up.
While it seems obvious that a more powerful graphics processor paired with a high refresh rate monitor would let you run games far more smoothly – and with regards to Fortnite, we’ve been here before – cynics are gonna cynic, and will be quick to point out that of course a company that sells graphics cards is going to tell you to buy the best one you can afford.
That said, it’s worth taking a peek at Nvidia’s working to see how it arrived at its high-fps-equals-higher-K/D result.
Taking anonymised data from recorded GeForce Experience Highlights sessions, Nvidia created median player profiles based on each generation of its GPUs going all the way back to the GeForce GTX 600 Series cards as a baseline.
Related: Best graphics cards
Nvidia said that players using newer RTX 20 Series graphics cards enjoyed a 53% higher K/D ratio compared to those using GTX 600 cards. It’s worth noting that when it released the new GTX 1660 Ti graphics card, Nvidia revealed that around two thirds of gamers currently use GTX 900 series cards.
Based on the below data, folks with Nvidia GPUs are around 29% better at Fortnite than people using the baseline GPU – but they’re also around 24% worse off than someone playing on a rig built around a new 20 Series card.
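For what it’s worth, you can sanity-check how “percent above baseline” figures translate into a head-to-head gap with some quick arithmetic of our own (these are the article’s headline numbers plugged into our own sums, not Nvidia’s working):

```python
# Median K/D figures normalised to the GTX 600 Series baseline.
baseline = 1.00      # GTX 600 Series (baseline)
avg_nvidia = 1.29    # average Nvidia GPU owner: 29% above baseline
rtx_20 = 1.53        # RTX 20 Series owner: 53% above baseline

# How far behind an RTX 20 Series player the average Nvidia player sits,
# expressed relative to the RTX 20 Series figure.
gap = 1 - avg_nvidia / rtx_20
print(f"{gap:.1%}")  # 15.7%
```

Note that composing the two headline percentages this way gives a gap of roughly 16%, not the 24% quoted – which suggests the per-card percentages in Nvidia’s chart aren’t all measured against the same reference point.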
You should note that the entry for the 20 Series doesn’t include the newer, and less-powerful RTX 2060 card, so that 53% jump isn’t entirely representative.
Nvidia also looked at the number of hours players sunk into PUBG and Fortnite to see if grinding accounted for anything.
Interestingly, Big Green’s own data shows that hours trump hardware: someone playing for 40 hours a week on a less powerful card should still be able to frag a player who only has the time to squeeze in five hours a week, even if the latter is rocking a more capable GPU.
Gaps in K/D performance appear to widen over time, so if you’re able to put in the hours and invest in a beefier system, you’re golden. Nvidia says this is due to players becoming better attuned to the benefits afforded by higher frame rates, as well as, well, practicing.
Rounding this out is data comparing the average battle royale player using 10 Series and the higher-end 20 Series cards across 60Hz, 144Hz, and 240Hz monitors.
The sample of players Nvidia looked at here were running games at 1080p, which makes sense, as lower resolutions mean you can play games at higher frame rates. Dialling down texture settings also helps in this regard, but it’s unclear from Nvidia’s post what other graphics settings were turned on or off here.
There’s no mention of either ray tracing or DLSS – key features of Nvidia’s latest cards – in Nvidia’s post. Given that you’re going to see frame rate dips if you turn ray tracing on without DLSS enabled, and no battle royale games support RT as of yet, this isn’t a shocker.
Related: Best battle royale games
Nvidia found that – surprise – using a monitor with a higher refresh rate yields tangible in-game benefits. That’s common sense – having a GPU that can push a game beyond 60 frames per second is pointless if the display you’re using can’t refresh quickly enough to show you those additional frames. If anything, it’ll make it worse, as you’ll encounter screen tearing.
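As a back-of-the-envelope illustration of that point (our toy model, not Nvidia’s data, and assuming vsync is off), a panel can only present as many complete frames per second as its refresh rate; anything the GPU renders beyond that can only surface as partial, torn frames:

```python
def displayed_frames(gpu_fps: float, refresh_hz: float) -> float:
    """Complete frames per second the panel can actually show."""
    return min(gpu_fps, refresh_hz)

def surplus_frames(gpu_fps: float, refresh_hz: float) -> float:
    """Frames rendered beyond the refresh rate; with vsync off these
    only ever appear as partial (torn) frames."""
    return max(gpu_fps - refresh_hz, 0.0)

# A GPU pushing 144fps into a 60Hz panel: 60 frames shown, 84 surplus.
print(displayed_frames(144, 60))  # 60
print(surplus_frames(144, 60))    # 84
# The same GPU on a 144Hz panel uses every frame it renders.
print(surplus_frames(144, 144))   # 0.0
```

This is also why the article’s advice pairs a 144fps target with a 144Hz monitor: the extra frames are only worth rendering if the display can actually show them.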
What’s interesting is that it looks like 144Hz is the sweet spot for current titles, mainly down to the fact that only the highest-end GPUs can consistently run games at 240fps.
So unless you have the deepest pockets, Nvidia recommends: “if you play Battle Royales and want to perform at your best, you should optimise your system for 144+ FPS and pair it with a 144Hz monitor.”