Wednesday, May 9, 2007

Artificial Intelligence :: Expressive Emotion

By Alyssa Lees

This thesis explores alternatives and tools for making motion capture (MOCAP) more expressive. In it, Lees examines expressive movement, deformable rigs, motion data, and the computer's ability to adjust and fill in the gaps of recorded movement. Her introduction not only states her claim but also describes the industry and outlines what to expect chapter by chapter throughout the thesis. A snippet of her abstract is below:

"The key aspect is the creation of a deformable skeleton representation of the human body using a unique machine learning approach. The deformable skeleton is modeled by replicating the actual movements of the human spine. The second step relies on exploiting the subtle aspects of motion, such as hand movement, to create an emotional effect visually. Both of these approaches involve exaggerating the movements in the same vein as the traditional 2-D animation technique of 'squash and stretch'. Finally, a novel technique for the application of style on a baseline motion capture sequence is developed. All of these approaches are rooted in machine learning techniques. Linear discriminant analysis was initially applied to a single phrase of motion demonstrating various style characteristics in Laban notation. A variety of methods, including nonlinear PCA and LLE, were used to learn the underlying manifold of spine movements. Nonlinear dynamic models were learned in an attempt to describe motion segments versus single phrases. In addition, the dissertation focuses on the variety of obstacles in learning with motion data. These include the correct parameterization of angles, applying statistical analysis to quaternions, and appropriate distance measures between postures."
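The abstract's closing point about distance measures between postures is worth unpacking. A common approach (not necessarily the one Lees uses) represents each joint's orientation as a unit quaternion and compares postures via the geodesic distance on the rotation sphere; the sketch below is a minimal illustration of that idea, with all function names being my own:

```python
import math

def quat_dot(q1, q2):
    # Inner product of two quaternions given as (w, x, y, z) tuples.
    return sum(a * b for a, b in zip(q1, q2))

def quat_distance(q1, q2):
    # Geodesic distance between two unit quaternions.
    # The absolute value handles the double cover: q and -q
    # represent the same rotation, so their distance should be 0.
    d = min(1.0, abs(quat_dot(q1, q2)))
    return 2.0 * math.acos(d)

def posture_distance(p1, p2):
    # A simple posture metric: sum of per-joint geodesic distances,
    # where each posture is a list of unit quaternions, one per joint.
    return sum(quat_distance(a, b) for a, b in zip(p1, p2))
```

For example, the distance between the identity rotation and a 90° rotation about one axis comes out to π/2, and a quaternion and its negation are correctly treated as the same posture. This is one reason naive Euclidean statistics on quaternion components fail, and why the abstract flags quaternion statistics as an obstacle in its own right.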

Lees, Alyssa. "Expressive Emotion." New York University, 2006.
http://wwwlib.umi.com/dissertations/fullcit/3234154
