Like all new mothers, Kathy Beitz wanted to take a good, long look at her baby after he was born. Normally, that would have been impossible because Beitz is legally blind. But after she donned a pair of vision-enhancing electronic glasses, she was able to see her newborn in detail. In a YouTube video that went viral in January, Beitz described the experience as “overwhelming” — in a positive way. “I got to fall in love with [my baby],” she explained.
Beitz’s story shows how wearable technology is changing the lives of people with disabilities. Wearable assistive technology is not a new idea; people have been wearing hearing aids for decades. But advances in sensors, cameras and algorithms are enabling more capable and useful wearables. Among the latest inventions: glasses that can identify objects and describe them out loud, and clothing that translates spatial data into vibrations.
Though some of these innovations are still in the prototype stage, experts say they show promise. “Wearables have the potential to bring a change in the way disabled people interact with their environment,” says Venkat Rao, creator of the Assistive Technology blog. Dana Marlowe, principal partner of the accessibility consulting firm Accessibility Partners, also thinks wearable technology will benefit people with disabilities, especially those who are “on the go, travel and want instant data.”
Some of the most remarkable developments in wearable assistive devices relate to vision for the blind. eSight, the company that made the glasses Beitz wore to meet her baby, says its technology “trigger[s] an increased reaction” from the cells in users’ eyes. eSight glasses work by capturing video through a high-definition camera, feeding the video to a portable processing unit and displaying the modified video in real time in front of users’ eyes.
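The capture-process-display loop eSight describes can be sketched in miniature. The contrast stretch below is a stand-in enhancement chosen for illustration (eSight's actual processing is proprietary), and the flat list of pixel values standing in for a "frame" is a deliberate simplification:

```python
# Illustrative capture -> enhance -> display loop in the spirit of eSight's
# pipeline. The enhancement is a simple linear contrast stretch, used here
# only as a stand-in for the company's proprietary processing.

def contrast_stretch(frame, low=0, high=255):
    """Linearly remap pixel values so the darkest maps to `low`, brightest to `high`."""
    lo, hi = min(frame), max(frame)
    if hi == lo:
        return [low] * len(frame)  # flat frame: nothing to stretch
    scale = (high - low) / (hi - lo)
    return [round(low + (p - lo) * scale) for p in frame]

def process_stream(frames):
    """Simulate the real-time loop: each captured frame is enhanced, then 'displayed'."""
    for frame in frames:
        yield contrast_stretch(frame)

# A dim, low-contrast 'frame' (flattened pixel values) is expanded to full range:
enhanced = next(process_stream([[100, 110, 120, 130]]))
print(enhanced)  # [0, 85, 170, 255]
```

The point of the sketch is the architecture, not the math: the camera, processor and display run as a continuous pipeline, so any per-frame enhancement the processor applies reaches the wearer with minimal delay.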
A device called the OrCam helps visually impaired people in a different way — by reading text aloud to them. First, users attach the OrCam’s tiny “smart camera” and earpiece to their eyeglass frames. When users want to read something, such as a newspaper article or a product label, they point their finger at the item. Their OrCam then “speaks” to them via the earpiece, which uses bone conduction to carry sound through the bones of the skull to the inner ear.
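The point-and-read flow can be sketched as a tiny pipeline. Everything here is mocked for illustration: the "image" is a mapping from coordinates to the text printed there, and the earpiece is a print call, since OrCam's real finger detection and text recognition are far more involved and not publicly documented:

```python
# Hypothetical sketch of OrCam's point-to-read flow. Finger detection and
# OCR are mocked; the names and data structures are invented for illustration.

def text_at_fingertip(image, fingertip):
    """Recognize the text in the region the user is pointing at (mocked OCR)."""
    return image.get(fingertip, "")

def speak_via_earpiece(text, out=print):
    """Send recognized text to the bone-conduction earpiece (mocked as print)."""
    if text:
        out(f"[earpiece] {text}")

# 'Image' mocked as a mapping from fingertip coordinates to nearby printed text.
label_image = {(3, 5): "Peanut butter, 500 g"}
speak_via_earpiece(text_at_fingertip(label_image, (3, 5)))
# prints: [earpiece] Peanut butter, 500 g
```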
Microsoft is also leveraging bone conduction to help visually impaired people “see.” The software company’s “3D Soundscape Technology” pairs a bone-conduction headset with a smartphone and indoor and outdoor wireless beacons. When blind people traverse zones that have been outfitted with Bluetooth and Wi-Fi beacons, they receive audio cues through their headsets. The cues notify users about everything from upcoming obstacles to passing buses and local points of interest.
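A minimal sketch of beacon-triggered cues, assuming each beacon broadcasts a message and a trigger radius; the beacon names, positions and ranges below are invented, and real systems estimate distance from signal strength rather than known coordinates:

```python
import math

# Hypothetical beacon-triggered audio cues in the spirit of Microsoft's
# 3D Soundscape demo. All beacon data here is invented for illustration.

BEACONS = [
    {"message": "bus stop",        "pos": (0.0, 20.0), "range_m": 8.0},
    {"message": "stairway ahead",  "pos": (3.0, 4.0),  "range_m": 5.0},
    {"message": "cafe entrance",   "pos": (40.0, 2.0), "range_m": 6.0},
]

def nearby_cues(user_pos, beacons=BEACONS):
    """Return the messages of every beacon whose trigger zone contains the user."""
    cues = []
    for b in beacons:
        if math.dist(user_pos, b["pos"]) <= b["range_m"]:
            cues.append(b["message"])
    return cues

print(nearby_cues((1.0, 3.0)))  # ['stairway ahead']
```

Each cue would then be rendered as audio through the bone-conduction headset, leaving the wearer's ears free to hear ambient sound.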
The visually impaired aren’t the only beneficiaries of wearable assistive technology. Academic researchers have spent more than a decade developing gloves that convert sign language into writing and/or speech to ease communication between deaf and hearing people. Now researchers are also putting sensors into clothing. A Baylor College of Medicine neuroscientist recently raised more than $47,000 on Kickstarter to create a vest that will communicate sounds to deaf users via vibrations.
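The vest's core idea, routing different sound frequencies to different vibration motors on the torso, can be sketched as a band-to-motor lookup. The band edges and motor count below are invented; the actual vest uses many more motors and a richer encoding:

```python
# Hypothetical frequency-band -> motor-group mapping in the spirit of the
# sound-to-vibration vest. Band edges and motor count are invented.

BAND_EDGES_HZ = [0, 300, 1000, 3000, 8000]  # 4 bands -> 4 motor groups

def motor_for_frequency(freq_hz):
    """Map a dominant frequency to the index of the motor group that buzzes."""
    for i in range(len(BAND_EDGES_HZ) - 1):
        if BAND_EDGES_HZ[i] <= freq_hz < BAND_EDGES_HZ[i + 1]:
            return i
    return len(BAND_EDGES_HZ) - 2  # clamp very high frequencies to the last group

print(motor_for_frequency(440))  # 1  (a 440 Hz tone lands in the 300-1000 Hz band)
```

With enough motors and training, wearers can learn to interpret these spatial vibration patterns as information about the sounds around them.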
The chipmaker Intel is also integrating assistive sensors into clothing. In January, Intel unveiled a camera and sensor system that can be attached to a jacket. This “RealSense” technology detects approaching objects and people and vibrates to convey the information to visually impaired users.
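A minimal sketch of the distance-to-vibration idea, assuming a simple linear ramp: the closer the detected obstacle, the stronger the buzz. The maximum range and intensity scale here are invented, not Intel's actual parameters:

```python
# Hypothetical obstacle-distance -> vibration-strength mapping in the spirit
# of the RealSense jacket demo. Range and intensity scale are invented.

def vibration_level(distance_m, max_range_m=3.0, levels=10):
    """Closer obstacles produce stronger vibration; beyond range, none."""
    if distance_m >= max_range_m:
        return 0
    if distance_m <= 0:
        return levels
    # Linear ramp: intensity grows as the obstacle approaches.
    return round(levels * (1 - distance_m / max_range_m))

for d in (3.5, 2.0, 0.5):
    print(d, vibration_level(d))  # 3.5 -> 0, 2.0 -> 3, 0.5 -> 8
```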
Intel and Microsoft’s wearable assistive devices have yet to be commercialized, but experts are optimistic about the impact big companies will have on the industry. “Intel and Microsoft’s technologies have great potential to help people with visual and/or hearing impairment not just because they are making exciting products, but also because they have funds to keep researching and developing and improving,” says Rao. He is similarly positive about Google’s head-mounted display, Google Glass. Though disabled users have criticized Google Glass for not being hearing aid-compatible and not supporting sign language input, both Rao and Marlowe think the device (which is currently being redesigned) will be able to assist a broad range of disabled people. For example, Google Glass could provide reminders to users with Parkinson’s disease, teach autistic children how to react to social cues and enable people with cerebral palsy to record and send videos, compose messages and browse the Web just by nodding their heads.
Of course, many people who would like to use Google Glass can’t afford the $1,500 gadget — or the $1,000-plus vibrating VEST, $3,500 OrCam or $15,000 eSight. “Cost is a huge challenge [in the adoption of wearable assistive devices] because people with disabilities are highly underemployed [and those who are employed] aren’t paid nearly as much as workers without disabilities,” points out Marlowe. Given the expense, some people have turned to crowdfunding to buy wearables. Beitz’s sister, Yvonne Felix, is leading one such campaign at #MakeBlindnessHistory.
Wearable assistive devices must also be easy to use to gain traction. “Someone who is not tech-savvy and who struggles with gadgets in general may be apprehensive about getting into wearables,” says Rao. Marlowe says developers should ask disabled people for input so they can make their wearables as user-friendly as possible. “Accessibility needs to be done up front, with the feedback of [disabled] users…in the developmental phase, and not retrofitted at the end,” she adds.
Eventually, close cooperation between developers and disabled users could lead to devices that “harmonize…users of all abilities,” says Marlowe. For example, rather than create standalone devices for disabled people, Verizon partners Sesame Enable and Visus Technology are developing assistive software that can be loaded onto smartphones. Sesame Enable, a 2014 Verizon Powerful Answers Education Award winner, makes gesture-based software that enables paralyzed people to navigate touchscreen devices by moving their heads. Visus is building a mobile application suite called VelaSense that will identify text, currency, barcodes and familiar faces to give visually impaired users real-time information about their surroundings. “More ‘regular’ technology should have accessibility features built-in,” Marlowe says. “I would like to see less assistive technology and more accessible technology.”