Gesture-Controlled Smartphone


Anton Krivosheyev
Richard Romanowski
Michelle Greenfarb

Professor Rose
[Image: Gesture controlled smartphone.png]


The progression of technology is fast-moving, and there is no sign of this pace slowing in the near future. The cell phone is an innovation that puts everything a user needs at their fingertips. This project demonstrates wireless, contact-free manipulation of an Android device displayed on an external, “virtual” screen. With a simple download of an Android application and a wearable hardware unit, a user has all the advantages of their cell phone without the need to touch it.


With today’s cell phones, reading an important email no longer has to wait until the end of the day, and an update telling friends what you had for breakfast can be published online instantly. It is hard for most people to imagine life without their Android phone, but tactile interaction with the device is not always convenient. With this project, a user can take advantage of all the features of an Android device without physically holding it. At a time when a cell phone becomes obsolete in just a few months, this project lets users extend the utility of their current Android device with a simple application download and a headset compatible with most Android devices.



As the image above shows, the design of the system is built around a popular serial communication protocol linking its modular devices. The hardware components are coordinated entirely by software written within an Android application, and the design can be separated into two sections: communication into the phone and communication out of the phone. Everything from basic image processing to an intuitive user interface is contained within the application, which is written in Java.
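To make the "communication into and out of the phone" idea concrete, the sketch below shows one plausible way the Java application could frame messages for a byte-oriented serial link. The write-up does not specify the actual protocol, so the frame layout (start byte, length, payload, XOR checksum) and the class and method names here are purely illustrative assumptions, not the team's implementation.

```java
import java.util.Arrays;

/**
 * Hypothetical frame codec for a byte-oriented serial link.
 * Frame layout (an assumption, not the project's actual protocol):
 *   [START][length][payload bytes...][checksum]
 * where checksum is the XOR of the length byte and every payload byte.
 */
public class GestureFrame {
    static final byte START = (byte) 0xAA; // illustrative start-of-frame marker

    /** Wraps a payload in a frame with a length prefix and XOR checksum. */
    public static byte[] encode(byte[] payload) {
        byte[] frame = new byte[payload.length + 3];
        frame[0] = START;
        frame[1] = (byte) payload.length;
        byte checksum = frame[1];
        for (int i = 0; i < payload.length; i++) {
            frame[2 + i] = payload[i];
            checksum ^= payload[i];
        }
        frame[frame.length - 1] = checksum;
        return frame;
    }

    /** Returns the payload, or null if framing or checksum is invalid. */
    public static byte[] decode(byte[] frame) {
        if (frame == null || frame.length < 3 || frame[0] != START) return null;
        int len = frame[1] & 0xFF;
        if (frame.length != len + 3) return null;
        byte checksum = frame[1];
        for (int i = 0; i < len; i++) checksum ^= frame[2 + i];
        if (checksum != frame[frame.length - 1]) return null;
        return Arrays.copyOfRange(frame, 2, 2 + len);
    }
}
```

A scheme like this lets the receiving side resynchronize on the start byte and discard corrupted frames, which matters on a noisy wearable serial link; the same codec could serve both directions of the phone's communication.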


The design process proved essential to the success of this project. The work was broken into subprojects; once a subproject was understood on its own, it could be implemented within the larger system. With this understanding, the current prototype leaves many possibilities for expansion. At present, the system demonstrates that the phone can be manipulated externally and that information from the phone can be displayed on a wearable projector. The next steps are improvements to speed and to the user interface. Once the prototype runs more efficiently, the optical components for a head-up display (HUD) can be added, and the system can be extended indefinitely from there.