Current visual assistive technology doesn’t allow for the rapid and seamless transmission of critical data like whether a door is open or a car is traveling in the wrong direction. Tan, a professor in the Department of Mechanical, Aerospace, and Biomedical Engineering and associate head of Integrated Programs and Activities, is working to eliminate those limitations.
“Mobility is about three things: direction, range, and access,” Tan explains. Guide Glass aims to address all three at once. The goal is to improve mobility with one device, without making it overly cumbersome.
Guide Glass combines the direction and range assistance of a guide dog with a walking stick’s ability to warn of obstructions.
The prototype technology resembles sunglasses with a GoPro camera attached to one side. The camera and sensors serve as the device's eyes and brain, capturing and analyzing the wearer's surroundings. The sensors feed that data to a microprocessing unit on the device, where proprietary software evaluates it and relays the results, in real time, to an earpiece.
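The article doesn't describe Tan's proprietary software, but the sense-analyze-speak pipeline it outlines can be sketched in simplified form. The code below is a hypothetical illustration, not the actual Guide Glass implementation: the `Detection`, `prioritize`, and `to_speech` names are invented for this sketch, and a real system would take input from camera and sensor hardware rather than hand-built objects.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object the sensors have picked up in the wearer's surroundings."""
    label: str          # e.g. "car", "pothole", "pedestrian"
    distance_m: float   # estimated range from the wearer, in meters
    moving_toward: bool # True if the object is approaching the wearer

def prioritize(detections):
    """Order detections so the most urgent hazard is announced first:
    anything moving toward the wearer, then the nearest static obstacles."""
    return sorted(detections, key=lambda d: (not d.moving_toward, d.distance_m))

def to_speech(detections, max_alerts=2):
    """Convert the top-priority detections into short spoken phrases
    that an earpiece could relay to the wearer in real time."""
    phrases = []
    for d in prioritize(detections)[:max_alerts]:
        verb = "approaching" if d.moving_toward else "ahead"
        phrases.append(f"{d.label} {verb}, {d.distance_m:.0f} meters")
    return phrases
```

For example, a scene with a nearby pothole and an oncoming car would announce the car first, since an approaching hazard outranks a closer static one:

```python
scene = [Detection("pothole", 3.0, False), Detection("car", 20.0, True)]
to_speech(scene)  # ["car approaching, 20 meters", "pothole ahead, 3 meters"]
```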
For instance, if you’re crossing a street, Guide Glass doesn’t just tell you when the light changes. It also warns you if a car has run the red light and alerts you to potholes and other pedestrians in the crosswalk.
Improvements in GPS will eventually allow further refinement, with even more precise calculations and directions.
The potential for Guide Glass goes beyond the visually impaired. Since it doesn’t rely on light to make measurements, the technology works for anyone whose vision might not otherwise be 100 percent, such as firefighters in a smoky building or police officers entering a darkened crime scene.
With the groundbreaking work of Tan and his colleagues, anyone can see that Volunteers make a difference.