Real Time Hand Gesture Recognition for 3D World

Presenter(s)

Bhrugvish Timirbhai Vakil

Description

Gestures have been essential to human communication since the dawn of civilization. Our project focuses on detecting human hand gestures so that a virtual-environment system can interpret them easily, enabling communication in a virtual reality world through hand gestures. Other applications of this work include body language interpretation and sign language interpretation. We address the highly challenging problem of real-time 3D hand tracking based on a monocular RGB-only sequence. Our tracking method combines a convolutional neural network with a kinematic 3D hand model, so that it generalizes well to unseen data, is robust to occlusions and varying camera viewpoints, and produces anatomically plausible as well as temporally smooth hand motions. This project lets users interact with a virtual world through hand gestures governed by predefined rules. It can serve as a method to communicate with virtual lenses and interact with them, and it can also be applied in virtual settings such as education and medicine: for example, a doctor preparing for an intricate surgery could build a model of the surgical subject in a virtual world and interact with it using hand gestures.
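To make the two-stage idea above concrete, the sketch below shows the general shape of such a pipeline: a CNN stage proposes 2D keypoints for each RGB frame, and a kinematic hand model is then fit to those detections with a temporal-smoothness prior. This is a minimal illustration under stated assumptions, not the project's actual implementation; the stub detector, the toy kinematic model, and the smoothness weight are all hypothetical placeholders.

import numpy as np
from scipy.optimize import least_squares

NUM_JOINTS = 21  # a common hand-keypoint convention: wrist + 4 joints per finger

def cnn_predict_keypoints(rgb_frame):
    # Placeholder for the CNN stage: a trained network would regress 2D
    # keypoints (in pixels) from the RGB frame. This stub returns random
    # detections so the sketch runs end to end.
    h, w, _ = rgb_frame.shape
    return np.random.rand(NUM_JOINTS, 2) * [w, h]

def model_keypoints(params, focal=600.0, center=(320.0, 240.0)):
    # Toy stand-in for a kinematic 3D hand model: map pose parameters
    # (global translation + per-joint angles) to projected 2D keypoints.
    # A real model chains per-joint rotations along bones of fixed length,
    # which is what keeps the recovered pose anatomically plausible.
    trans, angles = params[:3], params[3:]
    pts3d = np.stack([trans + [np.cos(a), np.sin(a), 0.1 * i]
                      for i, a in enumerate(angles)])
    z = np.clip(pts3d[:, 2], 1e-3, None)               # keep points in front of the camera
    return pts3d[:, :2] * focal / z[:, None] + center  # pinhole projection

def fit_hand(detected_2d, prev_params):
    # Fit the kinematic model to the CNN detections. The second residual
    # term pulls the solution toward the previous frame's pose, one simple
    # way to obtain temporally smooth motion.
    def residuals(params):
        reproj = (model_keypoints(params) - detected_2d).ravel()
        smooth = 0.1 * (params - prev_params)  # temporal-smoothness prior
        return np.concatenate([reproj, smooth])
    return least_squares(residuals, prev_params).x

params = np.zeros(3 + NUM_JOINTS)
params[2] = 5.0  # start the hand some distance in front of the camera
for _ in range(3):  # stand-in for a loop over video frames
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # dummy RGB frame
    params = fit_hand(cnn_predict_keypoints(frame), params)
    print("estimated wrist translation:", params[:3])

A real system would replace the stub detector with the trained network and the toy model with a full articulated hand skeleton, but the overall structure, per-frame detection followed by model fitting with a temporal prior, matches the pipeline described above.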

Publication Date

4-22-2020

Project Designation

Graduate Research

Primary Advisor

Ju Shen

Primary Advisor's Department

Computer Science

Keywords

Stander Symposium project, College of Arts and Sciences

United Nations Sustainable Development Goals

Quality Education; Industry, Innovation, and Infrastructure
