Brigham Young Researchers Develop Google Glass System for Deaf Students

Brigham Young University (BYU) researchers have developed a system that projects sign language interpreters onto the lenses of Google Glass and similar smart glasses.

The "Signglasses" project was developed to improve the planetarium experience for deaf students. Typically, when deaf students visit the planetarium, they can't see the sign language interpreter and the overhead projections at the same time because the lights have to be on to see the interpreter and off to see the projection. With Signglasses, deaf students can watch the planetarium projection at the same time as they watch the interpreter projected onto their glasses.

The research team has field tested the system with students from Jean Massieu School for the Deaf. The researchers were surprised to discover that students preferred the interpreter to be projected in the center of one lens, so they could look straight through the signer when focusing on the planetarium show. The team had assumed students would prefer to see the projection at the top of the lens, as Google Glass normally does.

The Signglasses project is led by Michael Jones, assistant professor of computer science at BYU, and several of the student researchers working with him are deaf. "Having a group of students who are fluent in sign language here at the university has been huge," said Jones in a prepared statement. "We got connected into that community of fluent sign language students and that opened a lot of doors for us."

The team is also working with researchers at Georgia Tech to explore the potential of Signglasses as a literacy tool. With the technology, when deaf students encounter new words in books, they could push a button, and a video dictionary would project a definition of the word in sign language.

The full results of the Signglasses research project will be published in June at the Interaction Design and Children conference.

Further information about the project can be viewed in a YouTube video.

About the Author

Leila Meyer is a technology writer based in British Columbia. She can be reached at [email protected].
