Google Translate
ASL Feature

UX/UI update with a more modern style, plus an ASL translator feature to incorporate into the Google Translate app.

Overview


A conceptual Google Translate feature that introduces an American Sign Language translator into the app. Using motion-tracking technology, the algorithm tracks the hand motions of a person communicating in ASL and translates them in real time into words displayed at the bottom of the screen. All you have to do is aim your phone camera, and you can instantly translate and communicate stress-free while learning about different forms of communication.

Tools - Sketch, Figma, Photoshop, Illustrator, After Effects, Premiere Pro

My Role - UX/UI Design, Product Design, Design Thinking

Type - UX/UI Design, Product Design, Accessibility

Duration - January - April 2020

Understanding the Target Audience

Problem ⛔️

In this day and age, there is no real device or technology to help translate or communicate in American Sign Language, a pain point I identified. There are apps users can download to learn ASL, but the drawings are often blurry, and because they are static images they are hard to follow once the signs are in motion. How can I make this transition toward universal communication easier and more accessible for users?

Solution ✅

My solution is to use motion-tracking technology to ease the communication barrier between ASL users and non-signers. The algorithm tracks the hand motions of the person signing and translates them in real time into words displayed at the bottom of the screen, so anyone can aim their phone camera at a signer and instantly understand, while also learning about another form of communication.
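To make the concept concrete, here is a minimal sketch of how such a pipeline might classify one frame of tracked hand landmarks: normalize the points for position and size, then match them against stored sign templates by nearest-neighbor distance. This is purely illustrative; a real ASL translator would use a dedicated hand tracker (e.g. MediaPipe Hands) and a trained model over motion sequences, and every sign, landmark set, and function name here is a made-up assumption.

```python
# Toy sketch of the idea above: per-frame hand landmarks are normalized,
# then matched to the closest stored sign template. All data is invented.
import math

def normalize(landmarks):
    """Shift landmarks so the wrist (first point) is the origin, then
    scale so the farthest point sits at distance 1 (position/size invariant)."""
    wx, wy = landmarks[0]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def distance(a, b):
    """Mean point-to-point Euclidean distance between two landmark sets."""
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(a, b)) / len(a)

def classify(frame, templates):
    """Return the sign whose template landmarks are closest to this frame."""
    norm = normalize(frame)
    return min(templates,
               key=lambda sign: distance(norm, normalize(templates[sign])))

# Two hypothetical "signs", each as 5 (x, y) landmarks.
templates = {
    "HELLO": [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)],        # flat hand
    "YES":   [(0, 0), (1, 1), (1, 0), (1, -1), (0, -1)],      # curled hand
}
frame = [(0, 0), (2.1, 0.1), (4.0, 0.0), (6.2, -0.1), (8.0, 0.2)]
print(classify(frame, templates))  # prints HELLO
```

The normalization step is what lets the same template match a hand anywhere in the camera frame at any distance; a production system would extend this idea across time to handle signs that are defined by motion rather than a single pose.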

Process Work 👇

Throughout my process and discovery work, I went through different research steps, such as a competitor analysis (which you can view HERE) covering other competitors' products. I also conducted empathy mapping, user personas, user journeys, and user flows before moving into the design process, from lo-fi to hi-fi designs.

Inspiration


This Ballet Rotoscope video I came across really inspired me to move forward with a project like this, thinking about motion tracking and how we can make accessibility experiences better.

Below is a little pamphlet I bought from a deaf man selling them years ago. It sparked my interest in a world I was unfamiliar with, and I included it in my inspiration process because I would look at it from time to time. 👇

User personas

Interaction Map




Low Fidelity Sketches

This is the initial planning stage, using the MoSCoW method to determine what the app must, should, could, and won't include, which helped prioritize our design process.

We then started with rough sketches of what the interface would look like, such as the projects page, the calendar page, and the A.I. voice command.

Motion tracking visualization

...




High Fidelity Wireframes

I went from mid-fi to hi-fi renders of the different app screens identified in the initial user flow, and approached the website page the same way.

Prototype Motion Tracking


After my initial discovery and early design phases, I moved into exploring the motion graphics/tracking aspect of the project. These are my After Effects files testing the motion tracking and syncing the translation text.

Promotional Website Video


This is a prototype play-through video of the promotional website showcasing the new feature addition.

Final Product Demonstration

Design System / UI Kit

I chose an overall colour palette that is simple, clean, and professional, given that this is a "potential" university-approved application. An organized, clean platform also eases the stress of a busy student.

Design Rationale

For my final project, I knew I wanted to do something along the lines of accessibility and explore it further. One area I was interested in researching was ASL and how to improve the experience for people trying to translate and understand one another across this communication barrier, especially users who are not familiar with the language. I first thought of making this a standalone project, but after some research and thinking I decided to make it a new feature for the Google Translate app.

Using motion-tracking technology, the algorithm tracks the hand motions of the person signing in ASL and translates them into on-screen text in real time, so translation is as simple as aiming your phone camera. Since I was also updating the app's user interface and experience, I gave the overall look a fresher, more modern feel compared to the old UX/UI.

Throughout this process, I became more aware of and open to how different types of users think and process information. ASL, and sign language in general, is a whole different form of communication; before diving into this project I had to get familiar with it myself and understand these different experiences and thought processes, which was enlightening and eye-opening for me as a designer.