The broad goals of this project are to develop new techniques for human interaction with stereo displays, particularly autostereoscopic (AS) displays. Autostereoscopic displays produce a 3D visual sensation for one or more observers without the use of glasses, goggles, helmets, or head tracking. We are developing new techniques for interactive input and manipulation of three-dimensional data using a motion tracking system combined with an autostereoscopic display. Users interact with the system by means of video cameras that track a light source or a user's hand motions in space. Our principles are applicable to both single-user and multi-user AS displays ranging in size from laptops to large flat-panel systems.
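As a rough illustration of the camera-based tracking described above, the sketch below locates a bright light source in two camera views and triangulates its 3D position with OpenCV. The projection matrices, camera indices, and brightness threshold are illustrative assumptions, not details of the actual system; a real setup would use its own stereo calibration.

```python
import numpy as np
import cv2


def brightest_point(frame_bgr, threshold=200):
    """Return the (x, y) pixel of the brightest spot, or None if too dim."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (11, 11), 0)   # suppress single-pixel noise
    _, max_val, _, max_loc = cv2.minMaxLoc(gray)
    return max_loc if max_val >= threshold else None


def triangulate(p_left, p_right, P_left, P_right):
    """Triangulate one 3D point from matched pixels in two calibrated views.

    P_left / P_right are 3x4 projection matrices from a prior stereo
    calibration (assumed available, e.g. from cv2.stereoCalibrate).
    """
    pts_l = np.array([[p_left[0]], [p_left[1]]], dtype=np.float64)
    pts_r = np.array([[p_right[0]], [p_right[1]]], dtype=np.float64)
    X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # homogeneous 4x1
    return (X_h[:3] / X_h[3]).ravel()  # 3D position in the calibration frame


if __name__ == "__main__":
    # Hypothetical projection matrices used only for this example.
    P_left = np.hstack([np.eye(3), np.zeros((3, 1))])
    P_right = np.hstack([np.eye(3), np.array([[-60.0], [0.0], [0.0]])])

    cap_l, cap_r = cv2.VideoCapture(0), cv2.VideoCapture(1)
    ok_l, frame_l = cap_l.read()
    ok_r, frame_r = cap_r.read()
    if ok_l and ok_r:
        pl, pr = brightest_point(frame_l), brightest_point(frame_r)
        if pl and pr:
            print("tracked point:", triangulate(pl, pr, P_left, P_right))
    cap_l.release()
    cap_r.release()
```

In practice the same triangulation step applies whether the tracked feature is a handheld light source or a fingertip detected by a hand-tracking stage; only the 2D detection differs.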