Technology advances relentlessly, yet substandard games still go on sale.
Let’s face it: we all make compromises, practically every day of our lives. But should we accept compromises in the videogames we play? Especially when we’re lucky these days to pay less than £40 for them?
Recently, compromise has been uppermost in the minds of gamers. Gearbox Software CEO Randy Pitchford, in a fit of honesty, contended that “every game ships with compromises”, while the PC version of Batman: Arkham Knight – a game rightly lavished with praise on the consoles – proved so buggy that it was removed from sale. And that came after the £170 Batmobile Edition of the game was pulled before it even went on sale – due, developer Rocksteady said, to “unforeseen circumstances that greatly compromised the quality” of the transformable Batmobile statue.
That word again — and, even for those with the shortest memories, there are plenty of examples of games that were far more compromised than Batman: Arkham Knight. Assassin’s Creed fans are still getting over the buggy state of the most recent iteration, Assassin’s Creed Unity, which suffered from more or less daily patches when it launched. It took Sony’s DriveClub the best part of six months to get to grips with the server issues which rendered it more or less unplayable online – and it’s a predominantly online game. Even the mighty Halo: The Master Chief Collection launched with its online matchmaking broken – and the one thing you expect Microsoft to get right is the server side.
Are games becoming more compromised? If so, why? Is the situation likely to improve, and if it doesn’t, where should we draw the line?
Jon Hare – nowadays senior game consultant at Tower Studios, but, as one of the founders of Sensible Software, the studio behind Sensible Soccer and Cannon Fodder, a man who has been making videogames since the 1980s – says: “I think games are definitely much more compromised now, and there are quite a few reasons why.”
Bad – but not unexpected – news. However, Hare’s explanations of why that troubled state of affairs has come about are not as straightforward as you might expect.
Take the most obvious one, which any developer or publisher would no doubt wheel out to explain the bugginess of its latest opus: games are so much more complex these days, and more complexity is bound to breed more bugs. Hare disputes that, pointing out that, thanks to the likes of cross-platform games development engine Unity, “We’ve never had so much good middleware, ever.”
Hare does concede that overambition can pose problems for developers:
“As a designer, I can go: ‘I want the game to do blah, blah and blah’, without fully comprehending the amount of technical complexity. You would then need a technical lead to say to you that it can be done, but you’d need four years and 85 really shit-hot programmers. So, a bit of assessment about what is doable is also key. Often, compromises are made because people say yes to things they can’t really do, just to get the gig.”
But we all want the games we play to be ambitious, and surely, if a game attempts something new, we’ll give it a bit of extra leeway? Fallout 3, for example, was far from bug-free when it came out, yet was showered with plaudits. And Pitchford could have had his own game Borderlands 2 in mind when he contended that all games are compromised – again, it suffered from bugs for a while, but still received plenty of critical acclaim.
Modern-day complexity is an excuse for, rather than a cause of, compromise.
The scourge of patching
Perhaps the most vexing aspect of being a modern gamer – and something that has tangibly worsened since the latest generation of consoles arrived – is the ubiquity of patches.
Day-one patches are the worst of all: there’s no point in queuing up to be the first person to buy a game these days, since almost inevitably, when you get it home, you won’t be able to play it for hours while that dreaded day-one patch downloads.
“When patching arrived in the 90s, it came out of PC games, because they were harder to code. And it also came from American coders: they were the ones who were sloppy and wanted to patch,” explains Hare. “You wouldn’t have seen many patches by Japanese or European coders before then. We just made sure the game was finished before we put it in the box. Then there became a lazy acceptance that you could patch and it was OK. To a certain extent, that’s pervasive today.”
Hare isn’t merely flying the flag for jingoistic reasons: he believes the most important factor that breeds compromise in modern games is simple: bad programming.
“I’ve found it very frustrating in the past 15 years, dealing with people who just aren’t good enough. Assembling a team which is strong in all the talents, for all sizes of studio, is a challenge at the moment.”
And he adds that the free-to-play mentality, in which it’s now seen as fine to put out a half-formed game and effectively finish it off with updates, isn’t helping.
Plus, he adds, free-to-play games are almost inherently compromised: “You’re catering for 97 per cent of customers who don’t pay and 3 per cent who do, and the weird thing is that your game has to be balanced around the 3 per cent who do pay. So 97 per cent of your customers aren’t getting a game which is perfectly tuned for them.”
Rushing breeds compromise
Hare also points the finger at one aspect of the games industry that has changed irrevocably. In the 80s and 90s, developers were paid flat fees by publishers – “At Sensible Software, the deals we got ranged from £5,000 to £1 million” – to make their games. Because those fees didn’t hinge on hitting a particular release date, developers would wait until their games were properly finished before releasing them – usually a while after they were originally touted for release.
But we now live in a world of risk-averse publishers desperate to milk everything they can from franchises that they know have been successful for years, which means cranking out a game each year, if possible. And that rat-race led to one of the highest-profile games to be released in a thoroughly compromised state in recent years: Assassin’s Creed Unity was horrifically buggy when it hit the shops last Christmas, with those who bought it having to endure almost daily patches.
Assassin’s Creed Unity was created in extreme conditions – Ubisoft commissioned a whole raft of developers to make different bits of it, so it’s probably a miracle that it wasn’t even buggier. But Hare acknowledges that, in the 80s and 90s, “the marketing spend wasn’t so great.”
“The other problem for publishers nowadays is that when they start the marketing machine and commit a lot of money to it, if the game, as it turns out, isn’t quite ready, there are too many things hanging on it to say: ‘Wait, we need another two or three months’.”
And that, surely, is the most worrying reason why compromise is becoming too obvious an aspect of modern videogames.
Hare agrees that the balance between marketing departments and those making the game has shifted too far towards the former, and in the 21st century, games developers face way more pressure than ever before. Especially when it comes to franchises.
“What is the percentage of triple-A budgets being spent on new IP as opposed to existing ones? 5 per cent?”
The way in which games publishers have retreated into the comfort zone of yearly franchises, it appears, can lead to compromises in games. All eyes, for example, will be on Assassin’s Creed Syndicate when it arrives – one more buggy mess, surely, could bury Ubisoft’s lucrative franchise.
Online gaming and technical woes
Another conspicuous driver of compromise in games is the move towards online gameplay, with the highest-profile example being Sony’s DriveClub. The game is playable (although pretty drab) offline, but is focused on online gameplay. Yet when it launched, its online side simply didn’t work due to a broken server infrastructure. Sony and developer Evolution Studios took months to fix it, leaving many punters very aggrieved.
Halo: The Master Chief Collection also launched with online problems (although 343 Industries sorted them out fairly quickly). But Hare doesn’t believe that the move from standalone games towards ones that have at least an online component is breeding more compromise into the development process.
Instead, he prefers to lay the blame for such howlers at the door of something more fundamental:
“We struggle with no international body forcing all the technical companies to conform to standards across the world.”
So how can we avoid compromise?
Based on what Hare says about the realities of modern development, plus a healthy dose of common sense, there are various measures we can take to avoid compromised games.
The most obvious one would have to be avoiding the temptation to buy games on the day of release. The sense of one-upmanship you get when you’re the first to play a new game swiftly palls when it doesn’t work properly. And if you’re still wavering, research whether that game is saddled with a day-one patch – and if it is, avoid it like the plague, at least until it has been on sale for a while. That might even save you some money: the longer a game has been out, the less it will cost.
That applies doubly to games which are predominantly online: it can be impossible for reviewers to ascertain whether multiplayer games will continue to work once their servers are properly populated. DriveClub, for example, worked fine online before launch, when just a handful of people were able to play it.
And be suspicious of yearly franchises – as we have seen, they are a hotbed of compromise. That’s precisely why Activision, for example, now splits the development of Call of Duty games between three, rather than two, developers, so that each gets a sensible three years’ development time. Having an idea about the conditions in which a game was developed provides a great means of avoiding the most egregiously compromised ones.
Approach “ports” of games with trepidation, especially PC ones: if publishers are concentrating on getting games developed for the consoles, PC versions can be an afterthought, farmed out to less competent developers, as was the case with Batman: Arkham Knight.
Always question whether you’re getting value for money: after all, games publishers are constantly dreaming up new ways of parting us from our money above and beyond the basic games, so now, more than ever, they owe it to their consumers to get things right. And if you are afflicted by a substandard game, don’t just whinge about it on Twitter: lodge a complaint with its publisher, as you would with any other consumer product.
In other words, be sceptical rather than slavish. Who knows: if you apply such a policy to life in general, you might find yourself having to compromise less often.