Research

Immersidata Management

Overview

The main objective of the AIMS project is to address the challenges involved in managing the multidimensional sensor data streams generated within immersive environments. We call this data type immersidata: the data acquired from a user's interactions with an immersive environment. Managing immersidata becomes crucial as immersive applications grow in number and become more common. Due to its specific characteristics, managing immersidata requires database expertise combined with signal processing and continuous mathematics.

NSF Report
Poster


Video Interaction Enhancement (VIEW-AIMS)

The main objective of this project is to address the challenges involved in managing the multidimensional sensor data streams generated within immersive environments. We call this data type immersidata: the data acquired from a user's interactions with an immersive environment. As a sub-project of AIMS, VIEW-AIMS emphasizes the challenges of user interaction in immersive environments. To facilitate natural interaction, users can issue commands through hand gestures. The main challenge is to extract a meaningful atomic motion from the stream in real time and then recognize it by comparing it against a known library of motions.
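
The sketch below illustrates this isolate-then-match structure on a stream of 3-D hand positions. It is a minimal illustration, not the VIEW-AIMS implementation: the speed threshold, the fixed-length resampling, and the two-gesture library are all assumptions, and a production recognizer would likely use a more robust matcher such as dynamic time warping.

    import numpy as np

    # Hypothetical template library: gesture name -> reference trajectory
    # (32 x 3 positions). The gestures and shapes are invented examples.
    LIBRARY = {
        "play":  np.linspace([0, 0, 0], [1, 0, 0], 32),  # sweep right
        "pause": np.linspace([0, 0, 0], [0, 1, 0], 32),  # sweep up
    }

    def isolate_motion(stream, speed_thresh=0.05):
        """Pattern isolation: keep the contiguous run of samples whose
        speed exceeds a threshold, i.e. one atomic motion."""
        stream = np.asarray(stream, dtype=float)
        speed = np.linalg.norm(np.diff(stream, axis=0), axis=1)
        active = np.where(speed > speed_thresh)[0]
        if active.size == 0:
            return None
        return stream[active[0]:active[-1] + 2]

    def resample(traj, n=32):
        """Normalize a trajectory to n samples for template comparison."""
        idx = np.linspace(0, len(traj) - 1, n)
        return np.stack([np.interp(idx, np.arange(len(traj)), traj[:, d])
                         for d in range(traj.shape[1])], axis=1)

    def recognize(motion):
        """Pattern recognition: nearest template by mean point distance,
        after translating both trajectories to a common origin."""
        m = resample(motion)
        m = m - m[0]
        dists = {name: np.linalg.norm(m - (t - t[0]), axis=1).mean()
                 for name, t in LIBRARY.items()}
        return min(dists, key=dists.get)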

With VIEW-AIMS, we demonstrate our pattern-isolation and pattern-recognition techniques. The demonstration provides a prototype video browser that users operate with hand gestures. VIEW-AIMS gathers continuous immersidata streams from two gloves and two trackers. After extracting a meaningful command from these streams, the application manipulates the media player according to the recognized command.
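
Given a recognizer like the one sketched above, wiring its output to the media player can be as simple as a command-to-action table. The player interface and command names below are invented for illustration, not the actual VIEW-AIMS player API.

    # Hypothetical glue between the recognizer and a media player; the
    # command set and player API below are invented, not VIEW-AIMS's own.
    class MediaPlayer:
        def play(self):
            print("player: play")

        def pause(self):
            print("player: pause")

    def dispatch(command, player):
        actions = {"play": player.play, "pause": player.pause}
        action = actions.get(command)
        if action is not None:
            action()

    # Usage: isolate a motion, recognize it, then drive the player.
    # dispatch(recognize(isolate_motion(samples)), MediaPlayer())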

Video [quicktime] | [winmedia]

Tech Summary
More Info

Video
VIEW-AIMS Video 1
[quicktime]
[winmedia]

VIEW-AIMS Video 2
[quicktime]
[winmedia]


AIDA (Adaptive Immersive Data Analyzer)

AIDA is an adaptive application for querying and analyzing the data generated by an immersive environment. By focusing on off-line queries over immersive data sets, it supports a wide range of complex queries, from knowledge discovery to spatio-temporal queries. AIDA's database schema and query set are designed to apply across immersive environments in general. The demonstration provides a prototype suitable for the design and development of domain-specific applications for querying and analyzing users' interactions with the corresponding immersive environment. AIDA implements a set of application-specific queries for the Immersive Classroom (IC) application.
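
As a rough illustration of the kind of off-line spatio-temporal query this enables, the sketch below builds a toy immersidata schema in SQLite and asks which users had a hand inside a given spatial box during the first minute of a session. The table and column names are invented; they are not AIDA's actual schema.

    import sqlite3

    # Invented schema for illustration only; not AIDA's actual schema.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE session (
            id      INTEGER PRIMARY KEY,
            user    TEXT,
            started REAL
        );
        CREATE TABLE hand_sample (
            session_id INTEGER REFERENCES session(id),
            t REAL,                  -- seconds from session start
            x REAL, y REAL, z REAL   -- tracker position
        );
    """)

    # Off-line spatio-temporal query: which users had a hand inside the
    # box [x0,x1] x [y0,y1] x [z0,z1] during the first minute of a session?
    rows = conn.execute("""
        SELECT DISTINCT s.user
        FROM session s JOIN hand_sample h ON h.session_id = s.id
        WHERE h.t < 60
          AND h.x BETWEEN ? AND ?
          AND h.y BETWEEN ? AND ?
          AND h.z BETWEEN ? AND ?
    """, (0.0, 1.0, 0.5, 1.5, 0.0, 1.0)).fetchall()
    print(rows)  # [] on this empty database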

Tech Summary


Haptic Data Analysis for American Sign Language Recognition

An interesting extension to immersive environments is the ability to record sessions (persistent storage). Augmenting this recording with enough semantics makes it possible to query elements of the immersion and customize the results toward user preferences. Hence, a user would be able to experience the immersion as if he or she had been present in the physical environment. For the past five years we have been addressing the challenges involved in managing the data generated within immersive environments. An immersive environment is an augmented or virtual reality experience that connects a person with other people, objects, places, and databases. Together with many other researchers, we have addressed the management of familiar data types such as image, video, audio, and text. However, we identified a set of less familiar data types, collectively termed immersidata, that are specific to immersive environments. Immersidata is defined as a representation of the information acquired from a user's interactions within an immersive environment. Our MIE system supports the acquisition, storage, querying, and analysis of immersidata.
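
As a toy illustration of haptic data analysis for sign recognition, the sketch below classifies a glove's flex-sensor reading by nearest neighbor against a few labeled exemplars. The sensor layout, calibration values, and sign labels are all invented for illustration and do not reflect our actual recognizer.

    import numpy as np

    # Toy exemplars: one flex-sensor vector per sign, in the order
    # [index, middle, ring, pinky, thumb]; 1.0 = fully bent. The values
    # and signs are invented, not real glove calibration data.
    EXEMPLARS = {
        "A": np.array([0.9, 0.9, 0.9, 0.9, 0.2]),  # fist, thumb alongside
        "B": np.array([0.1, 0.1, 0.1, 0.1, 0.8]),  # flat hand, thumb tucked
        "L": np.array([0.1, 0.9, 0.9, 0.9, 0.1]),  # index and thumb out
    }

    def classify_sign(reading):
        """Nearest-neighbor classification over stored haptic exemplars."""
        reading = np.asarray(reading, dtype=float)
        return min(EXEMPLARS,
                   key=lambda s: np.linalg.norm(EXEMPLARS[s] - reading))

    print(classify_sign([0.85, 0.95, 0.90, 0.88, 0.25]))  # -> "A"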

IMSC's 2001 NSF Report
Laboratory
More Info