Here’s How Google is Using Video Games to Teach Self-Driving Cars
Much like biological brains, artificially intelligent neural networks need training before they can perform complex tasks.
So, what’s an easy, accessible, and intuitive way to teach these machines? Video games, of course! Specifically, Blizzard Entertainment’s StarCraft II — a science-fiction-themed real-time strategy game.
Why StarCraft II?
There’s a massive number of games to choose from — including actual driving simulators. So what makes StarCraft II special? The game is known for its easy-to-play, difficult-to-master gameplay. It tasks players with managing dozens of individual units, each with unique skills, while also balancing their resources and fending off enemies. While humans have a relatively easy time picking up StarCraft II’s gameplay, this complicated mix of tasks can pose challenges to AI.
In other words, StarCraft II hits the AI’s problem spots. Google’s DeepMind system is learning to handle these challenges through an algorithm called population-based training. This approach speeds up learning by running many versions of the network at once, keeping the best performers, and continually refining them.
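To make that idea concrete, here’s a minimal sketch of population-based training in Python. Everything here is illustrative: the `evaluate` function is a hypothetical stand-in for training a model and scoring it, and the single `lr` hyperparameter is an assumption — DeepMind’s real training loop is far more elaborate.

```python
import random

def evaluate(hyperparams):
    # Hypothetical stand-in for "train the model, then measure performance."
    # Here, performance peaks when the learning rate is near 0.01.
    return -(hyperparams["lr"] - 0.01) ** 2

def pbt(pop_size=10, steps=50):
    # Start a population of workers, each with random hyperparameters.
    population = [{"lr": random.uniform(0.0, 0.1)} for _ in range(pop_size)]
    for _ in range(steps):
        ranked = sorted(population, key=evaluate, reverse=True)
        top = ranked[: pop_size // 4]
        bottom = ranked[-(pop_size // 4):]
        for worker in bottom:
            # Exploit: poor performers copy a top performer's settings...
            # Explore: ...then perturb them to keep searching.
            worker["lr"] = random.choice(top)["lr"] * random.uniform(0.8, 1.2)
    return max(population, key=evaluate)

best = pbt()  # the surviving hyperparameters drift toward the optimum
```

The key property, and the reason it suits a game like StarCraft II, is that evaluation and refinement happen in one continuous loop instead of restarting training from scratch for each hyperparameter guess.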
DeepMind is also getting smarter thanks to player-designed technology. Over the years, players have developed ways to make StarCraft II’s built-in AI smarter and more challenging, and those lessons are being carried over to the neural network’s training.
Just keep in mind that it’s all a work in progress. “We need to constantly retrain the net and rewrite our code. And when you retrain, you may need to tweak your parameters.”
More training, you say? Time to fire up StarCraft II!