The need for accurate hand motion tracking has been growing in HCI and in applications that require an understanding of user motion. Hand gestures can serve as a convenient user interface for computer applications in which the state and actions of the user are inferred automatically from a set of video cameras. The objective is to extend current mouse-and-keyboard interaction techniques so that the user can behave naturally in the immersive environment while the system perceives and responds appropriately to user actions. HCI-related applications remain the most frequent; however, we have focused our effort on the accurate detection and tracking of un-instrumented hands for assessing user performance in accomplishing a specific task.
Multimodal communication relies on speech, haptics, and visual sensing of the user in order to characterize the user's emotional and/or physical state. The proposed research focuses on augmenting the interaction capabilities of a multimodal system by providing automatic tracking of hand motion and recognition of the corresponding hand gestures. The objective here is to augment or replace the mouse-and-keyboard paradigm with functionality that relies on natural hand motion to drive a specific application.