I get very tired of clickbaity journalism hyping up minor advances in AI, making news stories out of nothing or out of developments that are only really newsworthy to those in the industry. You know the type: “Facebook AI had to be shut down”, “Google creates self-learning AI”.
I demystify a lot of these when I’m asked about them – technology should be accessible and understandable, and I deplore the tendency to chase article hits with over-egged, misleading headlines. What amused me over the weekend was that an AI not beating a human became a news story, one in which the AI was “trounced”.
About this time last year, DeepMind announced that they were working with Blizzard Entertainment to release StarCraft II as an AI research environment. Artificial environments have been used in robotics research to train control functions for a while, so having a much more complex environment for decision making was quite exciting.
Two of my favourite things combined 🙂 DeepMind and Blizzard to release StarCraft II as an #AI research environment https://t.co/yGdeltfog5
— Dr Janet Bastiman (@Yssybyl) November 5, 2016
The original StarCraft has been used for AI bots for several years. A year on from the SC2 announcement we’re still waiting on good SC2 AI, but there have been several StarCraft tournaments pitting AI against other AI and, more recently, AI against top players. Right now, AI players aren’t quite ready to beat the best human players, but this still seems to make the mainstream news. Are we all supposed to breathe a collective sigh of relief that maybe the robots aren’t going to take over just yet?
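The environment itself, at least, is easy to get running. As a rough sketch of what “research environment” means in practice: DeepMind’s Python interface, pysc2, lets you drop a baseline agent into a game against the built-in scripted bot with very little code. The example below assumes pysc2 and a local StarCraft II install with the ladder maps, and the SC2Env constructor arguments have shifted between pysc2 releases, so treat it as illustrative rather than canonical.

```python
# Minimal sketch: run pysc2's built-in random agent against the easy
# scripted bot on the Simple64 map. Assumes pysc2 is installed along
# with a local StarCraft II install and its ladder maps; constructor
# arguments differ between pysc2 releases, so check your version.
from absl import app
from pysc2.agents import random_agent
from pysc2.env import run_loop, sc2_env
from pysc2.lib import features


def main(unused_argv):
    agent = random_agent.RandomAgent()
    with sc2_env.SC2Env(
            map_name="Simple64",
            players=[sc2_env.Agent(sc2_env.Race.terran),
                     sc2_env.Bot(sc2_env.Race.zerg, sc2_env.Difficulty.easy)],
            agent_interface_format=features.AgentInterfaceFormat(
                feature_dimensions=features.Dimensions(screen=84, minimap=64)),
            step_mul=8,           # act every 8 game steps (a few actions per second)
            visualize=True) as env:
        # Play one episode, feeding observations to the agent and applying
        # whichever (random) action it returns at each step.
        run_loop.run_loop([agent], env, max_episodes=1)


if __name__ == "__main__":
    app.run(main)
```

A random agent won’t win anything, of course – the gap between “can interact with the game” and “can beat a professional” is exactly the point of the rest of this post.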
Playing modern games is actually a difficult task. There’s a reason that research started with the arcade games of the 1980s – the simple two-dimensional controls of PacMan are orders of magnitude away from complex games like StarCraft. Even with games like Go and Chess, where AI has had success over the best humans, you have a known-state problem: either you can see what your opponent is doing as they are doing it (the ghosts in PacMan), or you have a full view of the state of the game and can predict likely next moves (Chess, Go). In comparison, StarCraft is far more like poker – you know your own status and uncover the common areas, but you don’t know your opponent’s status until you interact. While AI has cracked Texas hold ’em, the dimensionality of games such as StarCraft is immense, requiring movement, selection, prediction, speed and decisive action.
The criticism that Song Byung-gu had of the AIs was not that they played predictably – he described some of their responses as “stunning” – but that they were too cautious. I know from experience that attacking early, while your opponent is still gathering resources, can determine the outcome. My own style of play is far too cautious1.
The article mentioned that the games were short – “beating one bot in 10 minutes”. The average professional StarCraft game is in the region of 11–13 minutes, although this varies by map and by the skill level of the players. The shortest game I found came in at 1:28. It’s not surprising that these AI vs human games were in the region of 4 to 10 minutes.
Overall, I think the “Trounced” headline is probably unfair in this situation. The AIs played better than most humans, and I don’t think it will be too long until we have one that can beat Song Byung-gu. What they did show was the ability to master a very complex game and be responsive against one of the best human players in the world. Give it a few months and I’m sure the tables will turn.
But how long will it be until we have a “gaming” AI that can transfer skills between games?
- For all games – I tend to prefer PvE to PvP for this reason ↩