NVIDIA reports that an AI bot developed by students at MIT and New York University beat professional gamers at the Genesis 4 Super Smash Bros. tournament last month. The bot taught itself how to play in two weeks.
The AI was trained with CUDA, Tesla K20 and TITAN X GPUs, and the TensorFlow deep learning framework. Its creator, Vlad Firoiu, used reinforcement learning: the bot played against itself repeatedly, learning which techniques worked best. Firoiu coauthored the paper with William F. Whitney.
From the story:
The bot almost learns to make its own flow chart. Based on its past playing experiences, it learns through thousands of games of trial and error that certain combinations of moves are more effective. However, its preferred move combinations are strange, almost inhuman, to pros who watch. Also, the typical human has a response time of about 200 milliseconds, about six times slower than the bot.
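The core idea described above, an agent improving by playing against itself over thousands of games of trial and error, can be illustrated with a minimal sketch. This is not the paper's setup (which used deep networks and TensorFlow); it is a hypothetical toy example using tabular Q-learning with self-play on the game of Nim, where one shared value table learns for both sides, so every game is played against an improving copy of the agent itself.

```python
import random
from collections import defaultdict

random.seed(0)  # reproducibility of this toy run

ALPHA, EPSILON, GAMES = 0.1, 0.2, 30000  # learning rate, exploration rate, self-play games

def legal_moves(pile):
    # In this Nim variant a player removes 1-3 objects; taking the last one wins.
    return [m for m in (1, 2, 3) if m <= pile]

def train():
    # Q[(pile, move)] = estimated value of `move` for the player about to act.
    Q = defaultdict(float)
    for _ in range(GAMES):
        pile, history = 10, []
        while pile > 0:
            moves = legal_moves(pile)
            if random.random() < EPSILON:
                move = random.choice(moves)              # explore a random move
            else:
                move = max(moves, key=lambda m: Q[(pile, m)])  # exploit what worked before
            history.append((pile, move))
            pile -= move
        # The player who took the last object wins (+1); the loser gets -1.
        # Walk the game backwards, flipping the sign each ply, since the two
        # "players" are the same agent seen from alternating perspectives.
        reward = 1.0
        for state_action in reversed(history):
            Q[state_action] += ALPHA * (reward - Q[state_action])
            reward = -reward
    return Q

def best_move(Q, pile):
    return max(legal_moves(pile), key=lambda m: Q[(pile, m)])

Q = train()
# Optimal Nim play leaves the opponent a multiple of 4.
print(best_move(Q, 10), best_move(Q, 5))
```

After training, the greedy policy recovers the known optimal strategy for this pile size (leave the opponent a multiple of 4), which it was never told; like the bot in the story, it discovered the effective move combinations purely from the outcomes of games against itself.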