Automatic Tutorial Generation

We often think of AI as something that plays games (AlphaGo), creates content (procedural content generation), identifies faces (Facebook's photo tagging), auto-completes search queries (Google), or drives cars (any autonomous vehicle company), among a multitude of other things. AI is all around us, constantly studying patterns and applying that knowledge to benefit us in some direct way.

We often ask: what can this AI learn? But rarely do we ask: what can we learn from this AI?

So how could an AI teach? What would it teach? Would it talk to us? Would it show us things? First of all, it would probably have to understand the thing it is teaching. How would we ensure that it truly understands the subject material? There's a lot to unpack here. People will undoubtedly have different opinions about how an AI can teach us, just as people have different opinions about how human teachers should teach in schools.

To avoid becoming overwhelmed by the possibilities of research in this space, we have to frame the problem with some boundaries. The first constraint is to keep the “subject material” within the realm of video games. We already have AI that can play games, but we don't necessarily have AI that understands how the English language works. So rather than try to solve the natural language problem, let's stick to what we know AI can do. We know that AI can play some games, and the educational part of any video game is undoubtedly the tutorial.
Several projects in the lab have explored this scope.
