It's fun to watch them get this level of capture/creation/animation pretty much in real time on the show floor with a couple of cameras. Modern CPU/GPU systems have enough processing power to do per-pixel tracking and surface/texture extraction at interactive rates.
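
For the curious, dense ("per-pixel") optical flow is the textbook building block for this kind of tracking, and a stock webcam plus OpenCV really will run it at interactive rates on current hardware. Here's a minimal sketch (Farneback flow is just my pick for illustration; I have no idea what the show-floor demo actually used):

    # Minimal sketch of per-pixel (dense) tracking on commodity hardware,
    # using OpenCV's Farneback dense optical flow. An illustration of the
    # general technique, not the demo's actual pipeline.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)          # any webcam will do
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # One 2-D motion vector per pixel: the "per-pixel tracking" part.
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, gray, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

        # Visualize flow magnitude; a real capture pipeline would instead
        # feed the vectors into mesh deformation / surface reconstruction.
        mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        cv2.imshow("flow magnitude", np.uint8(np.clip(mag * 16, 0, 255)))
        if cv2.waitKey(1) == 27:       # Esc to quit
            break
        prev_gray = gray

    cap.release()
    cv2.destroyAllWindows()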

It seemed to work best with folks who have less-than-perfect skin textures, or when using UV illumination (under which pretty much everyone's skin texture looks imperfect). That makes sense: a per-pixel tracker needs local contrast to lock onto, and smooth, blemish-free skin is nearly featureless.
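
A toy demonstration of why texture helps: corner detectors find essentially nothing on a flat patch but plenty on a speckled one. This uses synthetic data and Shi-Tomasi corners purely as a stand-in for whatever detector the demo used:

    # Count "good features to track" on a flat patch vs. a noisy
    # (textured) one. Synthetic data; just demonstrates the idea.
    import cv2
    import numpy as np

    rng = np.random.default_rng(0)
    smooth = np.full((256, 256), 128, np.uint8)              # "perfect" skin
    speckle = rng.integers(-40, 40, smooth.shape)            # blemishes / UV speckle
    textured = np.clip(smooth + speckle, 0, 255).astype(np.uint8)

    for name, img in [("smooth", smooth), ("textured", textured)]:
        pts = cv2.goodFeaturesToTrack(img, maxCorners=500,
                                      qualityLevel=0.01, minDistance=5)
        n = 0 if pts is None else len(pts)
        print(f"{name}: {n} trackable features")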

My wife was very upset after seeing the Electronic Theater pieces at SIGGRAPH this year. Now she can't pretend that what she sees in the media is a true representation of reality.