emBodied Digital Creativity (BDC)
Supported by the National Science Foundation

We are building a system that supports both the expression and the analysis of body movement and gesture. The system maps user actions with tangible interfaces onto characters in a real-time virtual environment. Through this body-movement system, we will record a matrix of character animations generated by different users. We plan to analyze this matrix to find behavioral clusters and to suggest groupings and focal points that can lead to a method of categorization. Using this categorization of traits, we can then complement a particular user's gestures with other participants' movement strategies. Our expectation is that, after seeing other participants' motor biases, users will be able to imagine a wider range of expressive and creative possibilities.
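As a minimal sketch of what such a clustering analysis might look like (the feature matrix, feature names, and the choice of k-means are illustrative assumptions, not the project's committed pipeline):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature matrix: one row per recorded animation, one
# column per movement descriptor (e.g., mean joint velocity, gesture
# amplitude, tempo). Real rows would be derived from the recorded
# character animations; random data stands in here.
rng = np.random.default_rng(0)
animation_features = rng.normal(size=(200, 6))

# Cluster the animations to look for behavioral groupings across users.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(animation_features)

# Cluster centroids can serve as candidate "focal points" for a
# movement-trait categorization.
print(labels[:10])
print(kmeans.cluster_centers_.shape)  # (4, 6)
```

Any standard clustering method could stand in for k-means; the point is only that the animation matrix yields groupings that can seed a categorization of movement traits.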

This project's working hypotheses are: 1) one way to expand the solution or expression space, and to enhance creativity in scientific and artistic domains, is to broaden a person's body memories; and 2) video game characters that encode our body movements through tangible user interfaces can be used to broaden the body memories of the player.


The evaluative protocols are as follows: we will use two domains (artistic and scientific) to test the hypothesis that novel movements in personalized virtual characters lead to improvements in players' imagined movements.

A central component of movement-based internal models in science and engineering is the mental rotation of objects and other entities: atoms, molecules, DNA, proteins, fluids, gases, gears, motors, air currents, planetary systems, and so on. We will evaluate changes in mental rotation ability based on engagement with the system.
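For concreteness, mental rotation stimuli of the classic Shepard-Metzler kind can be generated by applying a rotation matrix to the vertices of a block figure; the sketch below is illustrative, with a made-up shape and axis of rotation rather than the project's actual stimuli:

```python
import numpy as np

def rotation_matrix_z(theta):
    """3x3 rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Illustrative block-figure vertices (a Shepard-Metzler-like stimulus).
shape = np.array([[0, 0, 0],
                  [1, 0, 0],
                  [2, 0, 0],
                  [2, 1, 0],
                  [2, 1, 1]], dtype=float)

# A mental rotation trial asks whether a rotated view matches the
# original shape; here we generate the rotated comparison stimulus.
rotated = shape @ rotation_matrix_z(np.pi / 3).T
print(rotated.round(3))
```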

Our evaluative experiment, which tests participants' mental rotation ability before and after they use the system, will be designed by our collaborators at the University of Calgary and run at Georgia Tech in consultation with them. The Calgary group has prior experience running mental rotation experiments, and the stimuli and designs developed for those experiments will be adapted for ours.
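A pre/post design of this kind is often analyzed with a paired comparison. The sketch below is a minimal illustration using a paired t-test on invented scores; the actual analysis will follow the Calgary-designed protocol, so both the scores and the choice of test are assumptions:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post mental rotation scores for the same
# participants (paired design); real scores would come from the
# adapted Calgary test battery.
pre = np.array([14, 18, 11, 20, 16, 13, 17, 15], dtype=float)
post = np.array([16, 19, 14, 22, 17, 15, 19, 16], dtype=float)

# Paired t-test: did scores change after engagement with the system?
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}, mean gain = {np.mean(post - pre):.2f}")
```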


Our second experiment evaluates how novel movements generated by virtual characters contribute to players' artistic creativity. Having access to novel body states should allow players to better predict others' internal states, refining their ability both to experience and to generate novel artistic expressions.

We are currently testing the hypothesis that character attributions arise from participants simulating both the character's traits and the gestural movement that generates a particular drawing. Participants then link these two parameters: the sense of movement implied by imagined character traits and the sense of movement implied by the drawings. Insights about movement and character-trait attribution gained from this experiment will inform the design of a follow-up experiment testing how access to novel movements changes character-trait attribution. This experiment will also be designed by our collaborators at the University of Calgary and run at Georgia Tech in consultation with them.

Full NSF Proposal available here.