Learning and Modeling Player Behavior in Games
Speaker: Ben Weber (UC Santa Cruz)
Abstract:
Video games provide an excellent environment for artificial intelligence (AI)
research because they exhibit many properties of real-world domains. One of the key
challenges is developing AI systems capable of intelligently interacting with
human participants. Building game AI that engages with players raises the
following questions: what gameplay behavior can be learned from players, and
what player behavior can be predicted based on previous gameplay interactions? I will present two projects that have explored these research questions. The
first project, EISBot, extracts examples from game replays to learn how to
play StarCraft from demonstration. The second project, Madden NFL Mining,
builds regression models of player retention from data on millions
of players. The outcomes of these projects are techniques for building game
AI that learns from demonstration, and approaches for modeling player behavior
in games. The broader impacts of this work are methods for integrating learning
into game AI and for incorporating player feedback into the game design process.
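To give a flavor of the learning-from-demonstration idea behind EISBot, the following is a minimal sketch in Python. The replay examples, feature names, and action labels are hypothetical stand-ins (this is not the actual EISBot pipeline or a real StarCraft replay parser); it simply shows how (state, action) pairs extracted from replays could train a classifier that imitates demonstrated behavior.

    # Minimal learning-from-demonstration sketch (illustrative only).
    # Replay format, features, and actions below are hypothetical stand-ins.
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical (game_state, action) examples extracted from replays:
    # each state is [minerals, gas, supply_used, supply_cap, enemy_units_seen]
    demonstrations = [
        ([50, 0, 9, 10, 0], "build_supply_depot"),
        ([150, 0, 10, 18, 0], "build_barracks"),
        ([60, 0, 12, 18, 3], "train_marine"),
        ([400, 100, 20, 26, 8], "expand"),
    ]

    X = [state for state, _action in demonstrations]
    y = [action for _state, action in demonstrations]

    # Train a simple classifier to imitate the demonstrated action choices.
    policy = DecisionTreeClassifier().fit(X, y)

    # Query the learned policy for a new game state.
    print(policy.predict([[80, 0, 11, 18, 1]]))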
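Similarly, a retention model in the spirit of the Madden NFL Mining project can be sketched as a regression over per-player features. The features, synthetic data, and target definition here are assumptions for illustration, not the project's actual variables or pipeline.

    # Minimal player-retention regression sketch (illustrative only).
    # Features and synthetic data are hypothetical assumptions.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n_players = 1_000  # stand-in for the millions of real player records

    # Hypothetical per-player features:
    # games in first week, win rate, average session length (minutes)
    X = np.column_stack([
        rng.poisson(5, n_players),
        rng.uniform(0, 1, n_players),
        rng.normal(30, 10, n_players),
    ])
    # Hypothetical target: number of weeks the player keeps playing
    y = 1.5 * X[:, 0] + 4.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 1, n_players)

    model = LinearRegression().fit(X, y)
    print("learned coefficients:", model.coef_)

    # Predict retention for a new player: 3 games, 40% wins, 25-minute sessions
    print("predicted retention (weeks):", model.predict([[3, 0.4, 25.0]]))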
Bio:
Ben Weber is a Ph.D. candidate working with Michael Mateas and Arnav Jhala in the Expressive Intelligence Studio at the University of California, Santa Cruz. His dissertation project, EISBot, incorporates reactive planning, case-based reasoning, and machine learning to play the real-time strategy game StarCraft. To promote research in game AI, Ben organized the first AIIDE StarCraft AI Competition, which attracted participants from all over the world. Ben previously worked at Electronic Arts as a technical analyst on Madden NFL 12.