A conceptual Google Translate feature that introduces an American Sign Language translator into the app. Using motion-tracking technology, the algorithm tracks the hand motions of a person communicating in ASL and translates them into words in real time, displayed below on the screen. All you have to do is aim your phone camera, and you can instantly translate and communicate stress-free while learning about a different form of communication.
In this day and age, there is no real device or technology out there to help translate or communicate in American Sign Language, which is the pain point I identified. There are (terrible) apps that users can download to learn ASL, but much of the time the illustrations are blurry, and because they are static images, it is hard to understand signs that involve motion. How can I make this transition toward universal communication easier and more accessible for users?
The solution I came up with uses motion-tracking technology to ease the communication barrier between ASL users and non-signers. The algorithm tracks the hand motions of the person signing and translates them into words in real time, displayed below on the screen. All you have to do is aim your phone camera, and you can instantly translate and communicate stress-free while learning about a different form of communication.
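To make the idea concrete, here is a minimal, hypothetical sketch of the recognition pipeline. In a real build, hand landmarks would come from a tracking library such as MediaPipe; the landmark vectors, the tiny gesture "database," and the threshold below are made-up stand-ins for illustration only.

```python
import math

# Hypothetical reference poses: each gesture is a flattened list of
# (x, y) fingertip coordinates, normalized relative to the wrist.
REFERENCE_POSES = {
    "HELLO":     [0.1, 0.9, 0.3, 1.0, 0.5, 1.0, 0.7, 0.9, 0.9, 0.8],
    "THANK YOU": [0.2, 0.7, 0.4, 0.8, 0.5, 0.8, 0.6, 0.8, 0.8, 0.7],
}

def distance(a, b):
    """Euclidean distance between two flattened landmark vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(pose, threshold=0.3):
    """Return the closest reference gesture, or None if nothing is near."""
    best_word, best_dist = None, float("inf")
    for word, ref in REFERENCE_POSES.items():
        d = distance(pose, ref)
        if d < best_dist:
            best_word, best_dist = word, d
    return best_word if best_dist <= threshold else None

def translate(frames):
    """Turn a stream of per-frame poses into caption text, collapsing repeats."""
    words = []
    for pose in frames:
        word = classify(pose)
        if word and (not words or words[-1] != word):
            words.append(word)
    return " ".join(words)
```

A production system would replace the nearest-neighbor lookup with a model trained on real ASL data (and handle grammar, not just word-by-word output), but the flow is the same: camera frames in, landmark vectors per frame, recognized words appended to the caption.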
Throughout my process and discovery work, I went through different research activities, such as creating a competitor analysis, which you can view HERE to read my research on other competitor products. I also conducted empathy mapping, user personas, user journeys, and user flows in order to move into the design process for lo-fi to hi-fi designs.
This Ballet Rotoscope video I came across really inspired me to move forward with a project like this, thinking about motion tracking and how we can make accessibility experiences better.
Below is a little pamphlet I bought from a deaf man selling them years ago. It was the spark for my interest in a world I was unfamiliar with, and since I would look back at it from time to time, I have incorporated it into my process as a piece of inspiration. 👇
This is the start of the planning stage, using the MoSCoW method to determine what must be included in the app, as well as what should, could, and won't be included, to help prioritize our design process.
We then started with some rough sketches of what the interface would look like, such as the projects page, the calendar page, and the A.I. voice command screen.
We went from mid-fi to hi-fi renders of the different app screens identified in the initial user flow, and applied the same direction to the website page.