
These college students built an app that can ‘read’ American Sign Language

At Drexel's fledgling hackathon, the winning team made a real-time system that translates hand movements into text messages. A team member explains how.

The winning team behind "Sign Me" at Dragon Hacks. (Courtesy photo)
This is a guest post by Drexel student Nigel Coelho.
I always wondered what it’d be like to have an app that could translate American Sign Language in real time.

Earlier this month, I got the chance to make that idea a reality at Drexel’s Dragon Hacks hackathon. We built a setup in which a camera read a person’s American Sign Language hand movements, translated them into English and texted the translated words to a phone in real time. Our team, made up of students from the University of Maryland, the University of Pittsburgh, Penn State and Drexel, thought it could help people who don’t understand ASL communicate with people with hearing loss. The judges agreed: our hack won first place.
Here’s a look at our process over the weekend.

The Inspiration

Our initial idea was a hardware hack that used computer vision-based eye tracking to detect whether truck drivers were falling asleep at the wheel. A connected Arduino microcontroller would sound a buzzer if the app saw the driver’s eyelids closing.
Once we started playing around with the Microsoft Kinect motion sensor and saw its capabilities, however, sign language recognition seemed like a much better application. It also looked more feasible to finish over a single weekend. On top of that, one of our team members, the University of Pittsburgh’s Solomon Astley, had a friend in high school who was deaf; he learned to sign in order to communicate with his friend and came into the hackathon already knowing some ASL phrases.
Some of our group members were also inspired by the fact that over 5 percent of the world’s population, or about 360 million people, suffer from debilitating hearing loss. This includes one-third of the population over 65 years of age. Helping the Deaf community was something we could get behind.

The Technology

Our team ultimately combined open source American Sign Language recognition software with Twilio’s API to text out what was being signed in front of the Kinect’s camera. Twilio lets developers send text messages to mobile phones from web apps deployed in the cloud.
Essentially, the Kinect could be mounted on a hat: it would capture what the person opposite the wearer was signing and feed that into the app, which translated it into English. The app then pushed the text to a Node.js web app deployed on IBM Bluemix that integrated with Twilio’s API, and a text message with what was signed landed on the wearer’s phone.
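To give a flavor of the last leg of that pipeline, here is a minimal Node.js sketch of how a recognized phrase could be turned into an SMS via Twilio. The function names (`buildSmsPayload`, `sendIfConfigured`) and phone numbers are illustrative assumptions, not code from our actual hack; the `messages.create` call is Twilio’s real Node.js helper library API, and the send is skipped unless credentials are configured.

```javascript
// Hypothetical sketch: package a recognized ASL phrase as an SMS payload
// and (optionally) send it through Twilio. Names here are illustrative.
function buildSmsPayload(phrase, toNumber, fromNumber) {
  return {
    to: toNumber,       // the wearer's phone
    from: fromNumber,   // a Twilio-provisioned number
    body: `Signed: ${phrase}`,
  };
}

// Only attempt a real send when Twilio credentials are present in the
// environment; otherwise skip the network call entirely.
function sendIfConfigured(payload) {
  const sid = process.env.TWILIO_ACCOUNT_SID;
  const token = process.env.TWILIO_AUTH_TOKEN;
  if (!sid || !token) return null; // no credentials: do nothing
  const twilio = require('twilio'); // real library: npm install twilio
  return twilio(sid, token).messages.create(payload); // returns a Promise
}

const payload = buildSmsPayload('nice to meet you', '+15551230000', '+15559870000');
console.log(payload.body); // "Signed: nice to meet you"
sendIfConfigured(payload);
```

In the real setup, a handler in the Bluemix-hosted web app would call something like `sendIfConfigured` each time the recognition software emitted a translated phrase.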
Towards the end, most of the team found themselves learning several phrases in sign language for the demo.
During the demo, a few people who knew ASL came up and tried signing into the camera. We’d set up the app to recognize phrases but not individual letters, which made us a little nervous when people began fingerspelling words.

The Win

Initially we had aimed to win the best social impact hack award. That prize ended up going to an app that gamified arthritis prevention, and at that point we didn’t think we’d win anything. When our team was announced as the winning hack, we didn’t believe it at first. Most of our team members were first-time hackers who came into the weekend hoping to learn something and found themselves winning the entire hackathon instead. The last few moments of finishing our app were fraught with worry, but the win left us brimming with joy.
The winning team consisted of Solomon Astley and Alexis Aquiatan from the University of Pittsburgh, Shawn Wali from the University of Maryland, Yahya Saad from Penn State and myself, Nigel Coelho, from Drexel.
Going forward, we’re going to see whether we can integrate the Myo armband as a second input to confirm the accuracy of certain signs. We think it could really help people communicate better.
