
Having Conquered 'Go,' Google's DeepMind AI Takes on 'StarCraft II'

Google's DeepMind partners with Blizzard to create an AI API that'll be available next year.
Image: DeepMind

When artificial intelligence designers want to pit their creations against humans in games, they tend to stick to games with reputations as brain teasers spanning millennia. Back in the '90s, Garry Kasparov faced off against IBM's Deep Blue at chess (complete with accusations of cheating aimed at the machine), and just over half a year ago, Google's AlphaGo AI took on Lee Sedol at the ancient Chinese game of Go. Both machines were victorious.

But now DeepMind, the Google subsidiary behind AlphaGo, has set its sights on something more modern: the real-time strategy video game and esports darling StarCraft II. At BlizzCon in California yesterday, the company announced that it's working with StarCraft maker Blizzard Entertainment to create an API for artificial intelligence research that'll be available to researchers and hobbyists alike sometime next year. StarCraft II already has built-in AI opponents, of course, but they don't play like humans the way DeepMind's agent is supposed to: the built-in AI has access to information human players don't and can issue orders to units outside a human's field of view. Even at its most benign, it's technically still cheating.

That's intimidating, sure, but StarCraft II's smashing success proves it's hardly insurmountable. Humans have even been known to beat it when it's augmented with artificial intelligence systems apart from DeepMind's, as when a Russian player going by the name of Djem5 managed to beat three AI bots designed by the University of Alberta last December.

DeepMind wants something better: an AI that uses the same information and the same field of view as human players, but one that simply plays better than us, in the style of AlphaGo or Deep Blue. It's a bigger challenge than it may sound, since the whole board is always visible to the AI in chess or Go, but in StarCraft II it has to work within our limitations.
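To make that constraint concrete, here's a minimal, purely hypothetical sketch (in Python, not Blizzard's or DeepMind's actual API) of what a human-like view restriction might look like: the agent only "sees" units that are inside its current camera window and on ground it has already scouted through the fog of war, even though the game engine knows where everything is.

```python
# Hypothetical sketch of a human-like observation filter. Names and structures
# are invented for illustration, not taken from any real StarCraft II API.
from dataclasses import dataclass

@dataclass
class Unit:
    x: int
    y: int
    owner: str  # "player" or "enemy"

def visible_units(units, camera_origin, camera_size, fog_revealed):
    """Return only units a human could see: inside the camera window and,
    for enemy units, on tiles already revealed through the fog of war."""
    cx, cy = camera_origin
    w, h = camera_size
    out = []
    for u in units:
        in_camera = cx <= u.x < cx + w and cy <= u.y < cy + h
        revealed = (u.x, u.y) in fog_revealed
        if in_camera and (u.owner == "player" or revealed):
            out.append(u)
    return out

# An enemy unit outside the camera (or still hidden under fog) is invisible to
# the agent, even though the engine tracks its exact position.
units = [Unit(3, 4, "player"), Unit(40, 52, "enemy"), Unit(5, 6, "enemy")]
fog = {(5, 6)}  # tiles the player has scouted
print(visible_units(units, camera_origin=(0, 0), camera_size=(20, 12), fog_revealed=fog))
```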

"Computers are capable of extremely fast control, but that doesn't necessarily demonstrate intelligence, so agents must interact with the game within limits of human dexterity in terms of 'Actions Per Minute,'" said DeepMind researcher Oriol Vinyals in a blog post yesterday.

Vinyals said the firm is particularly interested in StarCraft II because its fast-paced demands, from juggling resources to maneuvering troops, "provide a useful bridge to the messiness of the real world." In time, given enough data from its own efforts and from other researchers using its API, the lessons learned "could ultimately transfer to real-world tasks."

A Kasparov-style showdown isn't likely anytime soon. Vinyals cautions in the post that "we're still a long way from being able to challenge a professional human player at StarCraft II," but then again, many commentators were sure AlphaGo wasn't quite ready to take on Sedol.