The purpose of this project is to address the problem of interaction between the user and eye-wear devices. In particular, our framework recognizes audio instructions, hand gestures, and human gaze and translates them into commands. First, the user's audio input is recognized and converted to text by a speech-to-text recognition system. Second, hand gestures are recognized from the movement of fingertips across multiple frames. Third, the human gaze is computed as the average of the two distances captured from both eyes. We developed two applications to demonstrate the effectiveness of this new mode of interaction. The first application projects a 3D model explorer that can be manipulated through the commands we programmed to expand, rotate, or reset the model. The second application projects a solar system, demonstrating that these commands can interact with multiple virtual objects. This advancement in eye-wear device interaction will improve the usability of eye-wear devices with virtual objects moving forward.
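The gaze step above can be sketched in a few lines: the final gaze point is taken as the average of the per-eye estimates. This is a minimal illustrative sketch only; the function name and the (x, y) data layout are assumptions, not the authors' actual implementation.

```python
def average_gaze(left_eye, right_eye):
    """Combine per-eye gaze estimates (x, y) into a single gaze point
    by averaging them component-wise, as described in the abstract."""
    return tuple((l + r) / 2.0 for l, r in zip(left_eye, right_eye))

# Example: two hypothetical per-eye estimates on a 2D display plane.
gaze = average_gaze((100.0, 200.0), (110.0, 210.0))
print(gaze)  # midpoint of the two per-eye estimates
```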
Van Tam Nguyen
Stander Symposium poster
"Human-eyewear device interaction" (2019). Stander Symposium Posters. 1575.