Elon Musk’s OpenAI bots can kick your butt in Dota 2
Come August, a matchup of epic proportions will go down between humans and bots playing the popular team battle game Dota 2. And the machine players, built by OpenAI, the research nonprofit cofounded by Elon Musk, may very well prevail.
In a victory hailed by Microsoft’s Bill Gates, a team of five AI bots beat their human counterparts in a practice tourney Monday. “That’s a big deal, because their victory required teamwork and collaboration – a huge milestone in advancing artificial intelligence,” Gates tweeted.
#AI bots just beat humans at the video game Dota 2. That’s a big deal, because their victory required teamwork and collaboration – a huge milestone in advancing artificial intelligence. https://t.co/UqIUhh9xFc
— Bill Gates (@BillGates) June 26, 2018
OpenAI plans to pit the OpenAI Five bots against professional players in Vancouver this August at The International, a tournament hosted by Valve, the game’s developer.
Unlike board games such as chess and Go, which play out on a relatively simple board in full view of both players, games like Dota 2 capture more of the complexity of the physical world, making them an interesting challenge for AI systems.
The game features teams of five characters, known as heroes, attempting to conquer each other’s starting bases while moving across a complex world of magic, monsters, and weaponry.
“The hope is that systems which solve complex video games will be highly general, with applications outside of games,” the OpenAI researchers write.
The game features huge volumes of ever-changing data points and challenges. A match lasting just 45 minutes can effectively include tens of thousands of moves for a single player, while a chess game is often limited to around 40 moves and a Go game to about 150, according to the researchers.
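To get a rough sense of that gap, here is a back-of-the-envelope sketch in Python. The 45-minute game length comes from the article; the engine tick rate and how often a bot acts are illustrative assumptions, not reported figures.

# Rough, illustrative arithmetic for the move-count comparison.
# The tick rate and action interval below are assumptions for this
# sketch, not figures quoted in the article.

GAME_MINUTES = 45        # typical match length cited above
TICKS_PER_SECOND = 30    # assumed engine tick rate
TICKS_PER_ACTION = 4     # assume the bot acts every few ticks

ticks = GAME_MINUTES * 60 * TICKS_PER_SECOND
actions_per_player = ticks // TICKS_PER_ACTION

print(f"Dota 2: ~{actions_per_player:,} actions per player per game")  # ~20,250
print("Chess:  ~40 moves per player per game")
print("Go:     ~150 moves per player per game")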
The bots don’t watch the game pixel by pixel but instead get access to the same information a human player would absorb visually. They learn largely by playing against themselves and can cram 180 years of play into a single day. Powering the AI is a system built with 256 GPUs and 128,000 CPU cores.
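As a loose illustration of what “playing against themselves” means, the toy sketch below has a single policy repeatedly face a frozen copy of itself in a made-up three-action game and nudge its behavior toward whatever wins. The game, the policy, and the update rule are invented stand-ins for this sketch, not OpenAI’s actual system.

import copy
import random

# Minimal self-play sketch: the current policy plays a frozen earlier
# copy of itself and reinforces actions that win. Everything here is an
# illustrative stand-in, not OpenAI's training code.

ACTIONS = ["push", "farm", "defend"]
# In this toy game, each action beats exactly one other.
BEATS = {"push": "farm", "farm": "defend", "defend": "push"}

class ToyPolicy:
    def __init__(self):
        self.weights = {a: 1.0 for a in ACTIONS}  # action preferences

    def act(self):
        # Sample an action in proportion to its current weight.
        total = sum(self.weights.values())
        r = random.uniform(0, total)
        for action, w in self.weights.items():
            r -= w
            if r <= 0:
                return action
        return ACTIONS[-1]

    def update(self, action, won):
        # Crude stand-in for a reinforcement-learning update.
        self.weights[action] *= 1.05 if won else 0.95

policy = ToyPolicy()
opponent = copy.deepcopy(policy)       # frozen past version of itself

for game in range(50_000):             # many cheap simulated games
    mine, theirs = policy.act(), opponent.act()
    policy.update(mine, won=(BEATS[mine] == theirs))
    if game % 1_000 == 0:              # periodically refresh the opponent
        opponent = copy.deepcopy(policy)

print({a: round(w, 2) for a, w in policy.weights.items()})

Scaled up enormously, with a vastly richer game and learning algorithm, this play-yourself-and-improve loop is the basic setup the researchers describe.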
Rapid Learning
When they first started, the bots randomly roamed the game world, but they quickly learned techniques like patrolling paths between opposing teams’ bases and harvesting resources. Later, they developed more complex strategies, including many familiar to the best of human players.
By the beginning of June, the bots could soundly beat the best team of human players at OpenAI as well as a team of Valve employees, and had won two out of three matches against both a highly ranked amateur team and a semiprofessional team.
Before the August tournament, a benchmark competition against a top group of players will be held before a live audience in San Francisco and broadcast on Twitch on July 28.
Fast Company, Read Full Story