American Sign Language (ASL) is the predominant sign language of deaf people in the US. Yet very few people actually know ASL. This app lets everyone learn ASL in an easy and fun way!
Learn ASL really easily!
A ReactJS app that lets you learn American Sign Language with the power of Machine Learning.
The frontend of the app is handled by ReactJS and the backend by TensorFlow.js. The app has a chatbot interface that lets users type, ask questions, and play the game. The chatbot gives the user a sign to make, and the Machine Learning model recognizes and classifies whether the user made the sign correctly. The user has a points score that shows the number of correct signs out of the total number of signs they tried.
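The score-keeping described above can be sketched in plain JavaScript. This is a minimal illustration, not the app's actual code; the `createScoreTracker` name and its methods are assumptions made for this example.

```javascript
// Hypothetical sketch of the points score: each round, the chatbot picks
// a target sign and the classifier's predicted label is compared to it.
function createScoreTracker() {
  let correct = 0;
  let total = 0;
  return {
    // Record one attempt: the sign the chatbot asked for vs. the
    // label the ML model predicted for the user's webcam frame.
    record(expectedSign, predictedSign) {
      total += 1;
      if (expectedSign === predictedSign) correct += 1;
    },
    // "the number of correct signs ... from the total number of signs
    // they tried"
    summary() {
      return `${correct}/${total}`;
    },
  };
}

const score = createScoreTracker();
score.record("Peace", "Peace");    // correct sign
score.record("Okay", "Thumbs Up"); // incorrect sign
console.log(score.summary());      // prints "1/2"
```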
Currently, the model recognizes only 5 signs: 2 letters, namely A and Y, and 3 words, namely Peace, Okay, and Thumbs Up.
Learning ASL can be hard for a lot of people; the flexibility of the language alone can be a hurdle for many. The main purpose of this app is to let users learn ASL through practice: instead of an actual human telling the user whether they are right or wrong, a bot processes the information, which is more efficient and much easier for the users themselves. Since the app uses TensorFlow.js, it runs entirely in the browser and is therefore portable across devices.
Click here to check out the live version.
Using the app is pretty straightforward; the steps involved are given below:

1. Click the Guess button.
2. Once the popup opens, click the Check Answer button and wait.
3. You will now see the live video feed. Make any sign from the 5 mentioned below:
   i. Peace
   ii. Okay
   iii. A
   iv. Y
   v. Thumbs Up
4. On the console, you can see whether the sign you made was correct or not.
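The Check Answer step above can be sketched as follows. The classifier is mocked out here: the real app runs a TensorFlow.js model over the webcam feed, and the `mockClassify` and `checkAnswer` names are assumptions made for illustration.

```javascript
// The 5 signs the model currently recognizes.
const SIGNS = ["Peace", "Okay", "A", "Y", "Thumbs Up"];

// Stand-in for the model's prediction over one video frame. In the real
// app this would be a TensorFlow.js classification of the webcam image.
function mockClassify(frame) {
  return frame.label; // pretend the model reads the sign from the frame
}

// Compare the sign the chatbot asked for against the predicted one and
// report the result on the console (step 4 of the instructions).
function checkAnswer(targetSign, frame) {
  const predicted = mockClassify(frame);
  const isCorrect = predicted === targetSign;
  console.log(
    isCorrect
      ? `Correct! You signed ${predicted}.`
      : `Not quite: the model saw ${predicted}.`
  );
  return isCorrect;
}

checkAnswer("Peace", { label: "Peace" }); // logs a success message
checkAnswer("Y", { label: "Okay" });      // logs a failure message
```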
View the screencast:
The resources used in the app are mentioned below:

* IBM Watson
* TensorFlow.js
* p5.js
* create-react-app
* React components: Circular progress bar, Sidebar, Popup
Unfortunately, using the app isn't as straightforward as the instructions suggest. The space could've been used a lot better than having to open the console. Promising, but confusing to use.
What an ambitious idea! I wasn't able to get the app fully working, I think (I had multiple camera feeds, and no console), but it's a fascinating start.
I really like the concept! I couldn't get it to work well however. It was picking up too much of what was behind me. I may need to only stand in front of solid-colored walls.
I bet this could become a great translation app! Just point at the person speaking ASL, get a written translation!
I love the idea but there's no way to adjust the lighting so I could never get the app to work to test it out. The popups don't seem to do anything either.
Pretty amazing. Loved how this was done in such a short span of time, and it's also pretty useful.
Kudos to the team on building this. The website could use some design, but that is not a negative on the team since the hack does illustrate the utility of the hack.