DeepMind is close to mastering StarCraft II
Not content with being the best in the world at the ancient Chinese game of Go, Google’s DeepMind AI has picked up something a little more recent: 2010’s StarCraft II.
In a new blog post, DeepMind’s researchers have given an update on how AlphaStar – the name given to DeepMind’s StarCraft-playing AI – is getting on. The long and short of it: very well indeed. It can play at Grandmaster level as any of the game’s three races – Terran, Protoss or Zerg – and is better than 99.8% of human players on Blizzard’s Battle.net.
More importantly, it has reached this level despite being hampered by the same limitations that human players face. It sees the game through the same camera view, has limited knowledge of the map, and its actions per minute are capped at what a human’s reflexes could plausibly achieve.
As with Go, much of this skill is self-taught, with agents refining their play against one another through reinforcement learning. The idea is that this general-purpose learning could help in other areas of AI and robotics, from self-driving cars to object recognition systems.
“The history of progress in artificial intelligence has been marked by milestone achievements in games,” said DeepMind’s David Silver who worked on AlphaStar. “Ever since computers cracked Go, chess and poker, StarCraft has emerged by consensus as the next grand challenge.
“The game’s complexity is much greater than chess, because players control hundreds of units; more complex than Go, because there are 10^26 possible choices for every move; and players have less information about their opponents than in poker.”
Unlike with Go, where DeepMind beat the world’s human champions, AlphaStar isn’t quite the best in the world. There’s still that elusive 0.2% of human players who would be expected to beat it, but given its rate of progress, it’s likely only a matter of time before it’s unbeatable by humankind.