Tyler Streeter

evolving humanoid motor control


Fall 2003 - Spring 2004


In short: optimizing neural networks with a genetic algorithm to control physically-simulated humanoid creatures.

This article was the original inspiration. It started as a project for an artificial intelligence class taught by Dr. Dimitris Margaritis and eventually became the motivation for my graduate research. The goal was to create motor control systems for virtual humans using artificial neural networks and genetic algorithms. I wanted to take the current physically simulated "ragdolls" in video games and bring them to life, allowing them to learn motor skills entirely on their own.

Basically, a physically simulated humanoid is controlled by an artificial neural network that senses joint angles and controls muscle forces. A genetic algorithm optimizes the neural network weights to improve performance on a given motor control task (standing, jumping, or walking). The most fascinating aspect of this approach is that the system learns an appropriate control algorithm with minimal help from the programmer. The main design decision is the fitness function (e.g., for jumping, individuals that jump higher stay in the population and spread their genes).
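The weight-optimization loop described above can be sketched in a few lines. This is a minimal, hypothetical example, not the project's actual code: the network sizes, mutation scheme (truncation selection plus Gaussian mutation), and the toy fitness function standing in for the physics simulation are all assumptions for illustration.

```python
import random
import math

random.seed(0)

N_IN, N_OUT = 4, 3                 # hypothetical: 4 joint-angle sensors, 3 muscles
N_WEIGHTS = N_IN * N_OUT           # single-layer network, one weight per connection

def make_individual():
    """A candidate controller is just a flat vector of connection weights."""
    return [random.uniform(-1.0, 1.0) for _ in range(N_WEIGHTS)]

def control(weights, joint_angles):
    """Map sensed joint angles to muscle forces through a one-layer tanh network."""
    forces = []
    for j in range(N_OUT):
        s = sum(weights[j * N_IN + i] * joint_angles[i] for i in range(N_IN))
        forces.append(math.tanh(s))
    return forces

def fitness(weights):
    """Toy stand-in for the physics simulation: reward strong coordinated
    muscle extension at a fixed test pose (a crude proxy for jump height)."""
    pose = [0.5, -0.3, 0.8, 0.1]
    return sum(control(weights, pose))

def evolve(pop_size=30, generations=50, mut_sigma=0.2):
    """Truncation selection: the fitter half survives and spawns mutated copies."""
    pop = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = [[w + random.gauss(0.0, mut_sigma) for w in parent]
                    for parent in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

In the real system the call to `fitness` would run a full physics simulation of the jump and measure the height reached, which is why each evolutionary run took all night rather than a few seconds.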

I would usually start a simulation in the evening, go to sleep, and check on the progress in the morning. I was often surprised at the types of results I would find. Sometimes a suboptimal solution would dominate the population; sometimes individuals would exploit an instability in the physics simulation and cheat the system; sometimes the results would turn out better than I'd hoped.

During spring 2004 I spent time developing tools (a wxWidgets/SDL setup for evolving behaviors and visualizing the results), experimenting with Ken Stanley's NEAT algorithm (modified for leaky integrator neurons with evolvable time constant parameters), and improving results on the walking task. One of the videos below shows the walking results. The neural networks used for walking have no sensory inputs. They are simple central pattern generators that output oscillatory signals in the absence of any sensory input.
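A central pattern generator of this kind can be illustrated with a classic two-neuron mutual-inhibition oscillator built from leaky integrator neurons with adaptation (a Matsuoka-style half-center). This sketch is not the project's NEAT-evolved network; the parameter values and Euler integration are assumptions chosen so the pair settles into sustained alternation with only a constant drive, i.e., no sensory input at all.

```python
def cpg_step(state, dt, tau_r=0.1, tau_a=1.0, beta=2.5, w=2.5, drive=1.0):
    """One Euler step of two mutually inhibiting leaky integrator neurons.
    state = (x1, x2, f1, f2): membrane potentials and adaptation variables.
    tau_r / tau_a are the membrane and adaptation time constants."""
    x1, x2, f1, f2 = state
    y1, y2 = max(x1, 0.0), max(x2, 0.0)        # rectified firing rates
    dx1 = (-x1 - beta * f1 - w * y2 + drive) / tau_r
    dx2 = (-x2 - beta * f2 - w * y1 + drive) / tau_r
    df1 = (-f1 + y1) / tau_a                   # slow self-adaptation
    df2 = (-f2 + y2) / tau_a
    return (x1 + dt * dx1, x2 + dt * dx2, f1 + dt * df1, f2 + dt * df2)

def run_cpg(steps=4000, dt=0.005):
    """Integrate the oscillator; return the net output y1 - y2 over time."""
    state = (0.1, 0.0, 0.0, 0.0)               # slight asymmetry breaks the tie
    outputs = []
    for _ in range(steps):
        state = cpg_step(state, dt)
        outputs.append(max(state[0], 0.0) - max(state[1], 0.0))
    return outputs

signal = run_cpg()
```

The two neurons take turns firing: each one's slow adaptation variable eventually weakens its own output enough for the inhibited partner to take over, so the net output swings rhythmically above and below zero. In a walking controller, that alternating signal would drive opposing muscle groups.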




Virtual human learning to stand
Virtual biped learning to walk
Screenshot of the evolution tool GUI