'Signglasses' May Aid in Deaf Literacy, Researchers Find (BYU)
Brigham Young University

A team of Brigham Young University students working under Professor Mike Jones has launched the "Signglasses" project, which aims to develop and improve sign language narration for the deaf community. The project uses several types of glasses, including Google Glass.

"My favorite part of the project is conducting experiments with deaf children in the planetarium. They get to try on the glasses and watch a movie with an interpreter on the screen of the glasses. They're always thrilled and intrigued with what they've experienced. It makes me feel like what we are doing is worthwhile." said Tyler Foulger, via a press release. Foulger, is the leader of the project and a born deaf.

Kei Ikeda and David Hampton, two of Professor Jones' computer science students, joined the project as it began receiving funding from the National Science Foundation and the Sorenson Impact Foundation to extend its scope.

The university has a group of students fluent in sign language, which is a huge advantage for the researchers; that connection has opened doors for the student team. Watching deaf university students succeed and do cool things is really rewarding, Jones explained.

The team tested the system during a field trip to the Jean Massieu School, a school for deaf children. One test revealed that the signer should be displayed in the center of the lens, so that deaf participants look straight through the signer as they watch the planetarium show. This surprised the researchers, who had expected participants to prefer the video displayed at the top of the lens, where Google Glass normally places its display.

"One idea is when you're reading a book and come across a word that you don't understand, you point at it, push a button to take a picture, some software figures out what word you're pointing at and then sends the word to a dictionary and the dictionary sends a video definition back," said Jones.

The researchers plan to extend the study with colleagues at Georgia Tech to analyze whether the glasses can be used as a literacy tool.

The project has allowed her to use her ASL knowledge and to communicate with the deaf community in ways she never thought possible, explained Amber Hatch, a hearing student whom the project has motivated to pursue a career in psychology serving deaf people. The project is amazing, and everyone is excited to see where it heads in the next year, she added.

Jones will publish the full results of the research at the Interaction Design and Children conference in June.