
PennApps Spring 2013, in Philadelphia, Pennsylvania

A social tool for learning sign language! Using the new Leap Motion, our team implemented a rudimentary machine learning algorithm to track a user's hand gestures and identify American Sign Language signs.
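The project's actual classifier isn't published here, so the following is only a minimal sketch of what a "rudimentary" recognizer over Leap Motion hand data could look like: fingertip positions are flattened into a feature vector and matched against pre-recorded sign templates by nearest neighbour. The extract_features and classify helpers, and the template dictionary, are hypothetical names for illustration, not SocialSign's code; the hand object is assumed to expose palm_position and per-finger tip_position attributes as the Leap SDK does.

```python
import math

def extract_features(hand):
    """Flatten fingertip positions (relative to the palm) into one feature vector.

    Assumes `hand` exposes `palm_position` and `fingers`, each finger with a
    `tip_position` carrying x/y/z attributes (Leap-SDK-style objects).
    """
    features = []
    for finger in hand.fingers:
        features.extend([
            finger.tip_position.x - hand.palm_position.x,
            finger.tip_position.y - hand.palm_position.y,
            finger.tip_position.z - hand.palm_position.z,
        ])
    return features

def classify(features, templates):
    """Nearest-neighbour match against pre-recorded sign templates.

    `templates` maps a label (e.g. the ASL letter "A") to a reference
    feature vector captured during a short training pass.
    """
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(templates, key=lambda label: distance(features, templates[label]))
```

A nearest-neighbour template match like this is about the simplest approach that still counts as machine learning, which fits the "rudimentary" description; a real system would need per-user calibration and many templates per sign.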

SocialSign visualizes hand gestures and broadcasts them in both textual and visual form to the other signers in the signing room.

In standard chat-room fashion, the interface supports written communication, but with enhanced learning in mind. It's all about learning a new way to communicate. A sketch of the room broadcast follows below.
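To make the "broadcast to the signing room" idea concrete, here is a minimal in-memory sketch: each recognized sign is relayed to everyone in the room as both the recognized text and the raw gesture data used to re-render the hand. The SigningRoom class and its method names are illustrative assumptions, not SocialSign's actual networking layer.

```python
class SigningRoom:
    """Relays each recognized sign to every connected signer."""

    def __init__(self):
        self.signers = []  # callables that deliver a message dict to one client

    def join(self, send_callback):
        """Register a signer; `send_callback` receives every broadcast message."""
        self.signers.append(send_callback)

    def broadcast_sign(self, sender, label, gesture_features):
        """Send one recognized sign to everyone in the room."""
        message = {
            "from": sender,
            "text": label,                # textual form, e.g. "B"
            "gesture": gesture_features,  # visual form: data to redraw the hand
        }
        for deliver in self.signers:
            deliver(message)

# Usage sketch:
#   room = SigningRoom()
#   room.join(lambda msg: print(msg))
#   room.broadcast_sign("alice", "B", [0.1, 0.2, 0.3])
```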

SocialSign -- handing communication to you.


