Presenter(s)

Vamsi Charan Adari

Files

Download Project (254 KB)

Description

This project addresses the problem of interaction between the user and eye-wear devices. In particular, our framework recognizes audio instructions, hand gestures, and human gaze, and translates them into commands. First, the user's audio input is recognized and converted to text by a speech-to-text recognition system. Second, hand gestures are recognized from the movements of fingertips across multiple frames. Third, the human gaze is computed as the average of the two distances captured from both eyes. To demonstrate the effectiveness of this new mode of interaction, we developed two applications. The first projects a 3D model explorer that can be expanded, rotated, or reset through these commands. The second projects a solar system that shows how the commands can be used to interact with multiple virtual objects. This advancement in eye-wear device interaction will improve the usability of eye-wear devices with virtual objects going forward.
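The gaze-averaging and fingertip-movement steps described above can be sketched in a few lines. The following minimal Python example is illustrative only and not taken from the project files: it assumes 2D screen-space coordinates, and the function names, gesture labels, and the pixel threshold are all hypothetical.

import numpy as np

def estimate_gaze(left_eye, right_eye):
    # Combine the per-eye measurements by averaging, as the abstract
    # describes: the gaze is the average of the values captured from
    # both eyes. Inputs are assumed to be 2D points (illustrative).
    return (np.asarray(left_eye, dtype=float) +
            np.asarray(right_eye, dtype=float)) / 2.0

def classify_swipe(fingertip_track, min_dx=40.0):
    # Classify a horizontal swipe from fingertip positions tracked
    # over multiple frames. Mapping rightward/leftward swipes to the
    # "expand"/"reset" commands, and the 40-pixel threshold, are
    # assumptions made for this sketch.
    start = np.asarray(fingertip_track[0], dtype=float)
    end = np.asarray(fingertip_track[-1], dtype=float)
    dx = end[0] - start[0]
    if abs(dx) < min_dx:
        return "none"
    return "expand" if dx > 0 else "reset"

# Example usage:
print(estimate_gaze((100, 120), (110, 124)))            # -> [105. 122.]
print(classify_swipe([(50, 200), (60, 198), (120, 199)]))  # -> "expand"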

Publication Date

4-24-2019

Project Designation

Independent Research

Primary Advisor

Van Tam Nguyen

Primary Advisor's Department

Computer Science

Keywords

Stander Symposium project

Human-eyewear device interaction
