The Facial Expression project seeks to automatically record and analyze human facial expressions and to synthesize corresponding facial animation. Analysis and synthesis of facial expression are central to the goal of responsive and empathetic human-computer interfaces. On the analysis side, the computer can respond to subtle sentiments reflected on the user's face; on the synthesis side, it can present a comfortable and familiar visage to the end user. While computer analysis of speech is the subject of extensive research, non-speech facial gestures have received less attention. Natural communication in virtual settings will require developing a computational facility with such facial gestures.