When I walked into an Arlington, VA coffee shop to meet Candice Jordan, I felt the usual anxious vigilance I get when I'm looking for a person I've never met. People are usually great at producing obvious body-language signals that indicate they're also waiting for a stranger, but Candice wouldn't be looking for me: She'd be listening. She did display a few great cues, thankfully. A patient, doe-eyed Labrador retriever named Austria rested by her side, and a clutch of electronic gadgets was spread before her on the table.
Candice also had a Google Glass headset perched on the bridge of her nose. It was this that I'd really come to talk to her about. After we exchanged pleasantries, she gave me a quick rundown on her recent life—one in which smart assistive technology is playing an increasingly important role. AI-enabled eyesight services, smart hearing aids, and other intuitive, connected technologies are changing the game for people with disabilities.

Vision Quest

Candice lost her sight entirely in college in 1998, at the age of 21, waking up blind one morning after months of declining vision because of worsening, inoperable cataracts. She worked with her university to complete her degree in psychology and then obtained a master's degree in rehabilitation counseling; she's been working for the District of Columbia government's Rehabilitation Services Administration since 2007. 
So why Google Glass? Candice uses the headset with Aira, a new service she subscribes to: It connects her with a human agent who uses the video feed from the headset or a phone's camera to describe her environment and help her navigate through it. The agent also has access to a dashboard of data about her preferences, multiple maps, and information about her physical location. Aira can tell her as much or as little as she wants to know about her surroundings.
Suman Kanuganti, CEO and founder of Aira, said his concept arose from a time he was on a phone-camera video call with a visually impaired friend. He asked his friend to hold his phone camera up, facing outward from his head, and then proceeded to describe what he saw in the friend's kitchen to him. On subsequent calls, they performed the exercise outdoors using a Google Glass headset Kanuganti had acquired.
"I was walking with him as I sat in San Diego, and I realized, I can pull up maps and other information for him while he's moving," Kanuganti said. "He said, Suman, what we're doing is for fun, but there are millions of blind people for whom a service like this would be life-changing."
Candice Jordan navigates around an outdoor mall with the help of Aira (photo: Michelle Z. Donahue)
Candice handed me her Google Glass and phone and told me to have at it. I really wasn't sure where to start, but the agent she connected with that day, Patrick, took the lead.
He described the store, telling me where I could find the ordering counter and a shelf of mugs and providing some details about what was on the walls and who was immediately nearby. We then made our way to the exit (the door swung outward, Patrick noted) and out into a bright courtyard ringed by shops, where Patrick told us it was 49 degrees and sunny.
As Austria nosed into the vestibule of a Thai restaurant, Patrick mentioned we could also opt for sushi, grilled chicken, Lebanese, shoes, or discount designer clothes. Candice asked what other stores were around; when he mentioned a housewares outlet, she asked him to direct us there so she could look for a stovetop griddle.
Armed again with her headset and phone, Candice followed the ensuing left-right-straight-ahead directions, avoiding obstacles with Austria's help and alerts from Patrick. She stepped gingerly down into a curb cut when Patrick told her it was there; as we waited at a crosswalk, he had her scan left and right so he could look for oncoming vehicles. All clear.
In the store, she switched to using her phone's camera when the Wi-Fi connection fizzled, causing Patrick's video feed to freeze. The highlight of our day, Candice said, was the moment when a store clerk stopped by and asked if she needed any help.
"I love being able to say, 'No, I've got it, thanks!' when people ask me that now," Candice said. "Before, any time I needed anything in a store, I'd have to find customer service, wait for them to bring someone to help me, then have them go through my list. And because you need help, so often you have to be nice, and kind of market yourself, and educate them. Well, if it's Saturday at 7 a.m. and it's the only time I have to go to the grocery store, who wants to do all that? Now I don't need to."

Tech Enablement

According to a 2010 report from the U.S. Census Bureau, more than 56 million people, or nearly 20 percent of the nation's population, are living with a physical or cognitive impairment of some kind. Aira is just one example of an emerging segment of smart technology being designed specifically with this population in mind. The ability to connect to myriad streams of data, whether through new hardware or through software and apps on existing devices, is playing heavily into how these products and applications are being developed, with the goal of helping people lead more independent, inclusive, and fulfilling lives.
The array of available solutions is dizzying. Lechal, which started as a navigation aid for the visually impaired, has developed GPS-connected shoes with haptic feedback: They buzz to help you navigate as you walk. New Jersey-based Oticon makes a set of smart hearing aids that can be programmed to prompt other devices in your home to perform a cascade of tasks according to your proximity or the time of day—automatically closing the garage door, locking the house, and turning the thermostat down when you leave for work, for example.
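Oticon's actual integrations run through its own app and partner platforms, but the trigger-and-cascade idea is easy to picture. The snippet below is a minimal sketch, assuming an invented `on_hearing_aid_event` handler and made-up action names; it is not Oticon's API.

```python
from datetime import time

MORNING = time(6, 0)  # assume any departure after 6 a.m. is "leaving for work"

def on_hearing_aid_event(trigger: str, now: time) -> list[str]:
    """Map a hearing-aid trigger (say, the wearer walking out of range of the
    home hub) to a cascade of smart-home actions, keyed on the time of day."""
    if trigger == "leaving_home" and now >= MORNING:
        # The cascade described in the article: garage, locks, thermostat.
        return ["close_garage_door", "lock_front_door", "set_thermostat:62F"]
    if trigger == "arriving_home":
        return ["unlock_front_door", "set_thermostat:70F", "turn_on:entry_lights"]
    return []

if __name__ == "__main__":
    print(on_hearing_aid_event("leaving_home", time(8, 15)))
    # ['close_garage_door', 'lock_front_door', 'set_thermostat:62F']
```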
In Europe, SpeechCode created a system to produce highly detailed QR codes that can be included on packaging, signage, or any other printed material. When a user scans one with the companion app (which helps locate and center the code), the text encoded in it is read back as audio, available in 40 different languages. And Dimple, a programmable stick-on button for Android devices, uses near-field communication (NFC) to launch apps, change phone settings, and even control other smart-home appliances at a touch.
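SpeechCode's code format and reader app are proprietary, so the sketch below only illustrates the general pattern: a scanned payload carries the printed text in several languages, the app picks the user's preferred one, and hands it to a text-to-speech engine (stubbed out here with a print). The JSON payload layout and the `speak` stub are assumptions for the example, not SpeechCode's actual format.

```python
import json

def speak(text: str, lang: str) -> None:
    """Stand-in for a real text-to-speech engine on the phone."""
    print(f"[TTS:{lang}] {text}")

def read_scanned_code(payload: str, preferred_lang: str = "en") -> None:
    """Decode a scanned code whose payload holds the printed text keyed by
    language, then read out the user's preferred version (with a fallback)."""
    texts = json.loads(payload)  # e.g. {"en": "...", "de": "...", "es": "..."}
    lang = preferred_lang if preferred_lang in texts else next(iter(texts))
    speak(texts[lang], lang)

if __name__ == "__main__":
    sample = json.dumps({
        "en": "Allergy notice: contains peanuts. Best before March 2018.",
        "de": "Allergiehinweis: enthaelt Erdnuesse. Mindestens haltbar bis Maerz 2018.",
    })
    read_scanned_code(sample, preferred_lang="de")
```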
Myriad other devices exist to help individuals adapt to their particular disability. As an illustration, a single adaptive-tech loan program at Easter Seals of Massachusetts' Assistive Technology Regional Center holds 1,200 devices for people to borrow and test out. High-tech options include eye-gaze devices (which help users access a computer or communication aid by controlling a mouse pointer with their eyes), text-to-speech machines, and smartwatch-like wristbands that relay mobile phone messages. The overall concept is to enable people with disabilities to automate aspects of their lives that are otherwise cumbersome, as well as to make information more easily accessible.
Easter Seals of Massachusetts' Assistive Technology Regional Center
"This idea of having connected devices in your home, a smart home, really is a boon to people with all kinds of disabilities," said Henry Claypool, executive vice president of the American Association of People with Disabilities, and director of policy for University of California San Francisco's Community Living Policy Center. "Greater independence, a better quality of life, and integration and inclusion—those are hallmarks of the Americans with Disabilities Act. Connected devices have tremendous potential to enable people to live as part of a community, instead of having to move to a more restricted environment where everything is brought to them."
It's a robust topic of academic research and development, as well. At the Rochester Institute of Technology, Professor Matt Huenerfauth is developing tools such as an American Sign Language (ASL) trainer that uses a Microsoft Xbox Kinect camera. The system uses animations of common ASL gestures to "spellcheck" a learner's signs: the user copies the animation's movements, and because the program can "see" those movements through the Kinect camera, the software can flag errors in the signing and help the user correct them. Huenerfauth is also investigating how speech-recognition technology could be used to produce captions automatically for one-on-one or small-group meetings between deaf and hearing participants.
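Huenerfauth's description doesn't specify the matching algorithm, so the sketch below stands in with a common technique for comparing motion traces: dynamic time warping (DTW) over a single tracked joint, flagging an attempt that drifts too far from the reference animation. The single-joint simplification and the threshold are assumptions for illustration, not the RIT system's actual code.

```python
import math

def dtw_distance(ref, attempt):
    """Dynamic-time-warping distance between two joint trajectories, so that
    signing faster or slower than the reference isn't penalized by itself."""
    n, m = len(ref), len(attempt)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(ref[i - 1], attempt[j - 1])  # distance between 3D points
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m] / max(n, m)  # rough length normalization

def check_sign(ref, attempt, threshold=0.15):
    """Flag an attempt whose normalized distance from the reference is too large."""
    score = dtw_distance(ref, attempt)
    return "looks good" if score <= threshold else f"check this sign (score {score:.2f})"

if __name__ == "__main__":
    reference = [(0.0, 0.0, 0.5), (0.1, 0.2, 0.5), (0.2, 0.4, 0.5)]  # wrist positions
    learner   = [(0.0, 0.0, 0.5), (0.1, 0.1, 0.5), (0.3, 0.2, 0.5)]
    print(check_sign(reference, learner))
```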
And at the Georgia Tech Institute for People and Technology, Executive Director Beth Mynatt recently spoke of research underway that uses sensing of the brain's motor cortex to recognize the formation of individual words and phrases and translate them into machine-generated speech or text. This idea, too, emerged out of work with ASL. While researching how to read and translate brain signals, the team realized that the signal generated by a person physically signing an ASL letter or word was the same as when he or she thought about signing the letter.
But as promising and as useful as recent innovations have been, they need to be more reliable, easier to use, and there for the long haul.

"It's tough to get individuals with disabilities to be the primary or initial adopters of some of these technologies, because if it fails, there are real consequences," said Eric Oddleifson, assistant vice president of Assistive Technology and Employment Services at Easter Seals of Massachusetts. "Many times, people will opt for something they know will work rather than try something new that may not work in the long term."

Informing Design, Connecting Solutions

One obstacle to adoption is the piecemeal nature of current solutions. There are lots of gadgets and apps out there that can communicate with your phone to relay information, or keep track of personal preferences, or automate your home. So maybe you have haptic shoes, if-this-then-that hearing aids, a smart thermostat, an Amazon Echo, and a dozen Wi-Fi-enabled LED light bulbs. And each one of them has its own app. At what point does managing all these solutions become more of an obstacle than the problem they're meant to solve? Intelligent platforms that can integrate a mix of products and user interfaces into a single, easily accessible ecosystem are still largely lacking.
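In the simplest terms, such a platform is a common interface with an adapter wrapped around each vendor's protocol. The sketch below is a generic illustration of that pattern using made-up device classes; it is not any particular company's code.

```python
from typing import Protocol

class SmartDevice(Protocol):
    """The single interface a user-facing app (voice, switch, eye gaze) sees."""
    name: str
    def describe(self) -> str: ...
    def set_state(self, state: str) -> None: ...

# Adapters hide each vendor's protocol (Wi-Fi, Zigbee, Bluetooth, ...) behind
# the common interface. The classes and values here are invented placeholders.
class WifiBulb:
    name = "living room light"
    def __init__(self) -> None:
        self._on = False
    def describe(self) -> str:
        return f"{self.name} is {'on' if self._on else 'off'}"
    def set_state(self, state: str) -> None:
        self._on = state == "on"

class ZigbeeThermostat:
    name = "thermostat"
    def __init__(self) -> None:
        self._target = 70
    def describe(self) -> str:
        return f"{self.name} set to {self._target}F"
    def set_state(self, state: str) -> None:
        self._target = int(state)

def run_command(devices: list, name: str, state: str) -> str:
    """One entry point, so a single app (or spoken command) drives everything."""
    for device in devices:
        if device.name == name:
            device.set_state(state)
            return device.describe()
    return f"no device called {name}"

if __name__ == "__main__":
    home = [WifiBulb(), ZigbeeThermostat()]
    print(run_command(home, "living room light", "on"))
    print(run_command(home, "thermostat", "68"))
```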
Scott Moody is CEO of K4Connect, which has developed K4Community, a smart-device ecosystem platform for people living with disabilities that works with nearly any connected device on the market, across a variety of communications protocols.
"Products are often designed for one demographic—say, millennials—and then an often-feeble attempt is made to adapt them," he told me. "Each device and application is developed to solve a specific issue, but it can get to the point where one needs to have tens of apps or devices just to move around their living room. All these applications and devices need to work together—not just your home-automation products but your health, wellness, content, and communication devices, as well."
To Moody's point about product design, often the needs of people with disabilities aren't considered up front at all, even if the technology could, at its core, solve a key need.
KR Liu (photo: courtesy KR Liu)
"Generally, a couple of big companies have been doing better about being pioneers in trying to make their products more accessible," said KR Liu, the head of sales and marketing strategy for Doppler Labs. "It's just in the last few years that the tech industry has started to think about how it can be more inclusive, not only within companies but in design for consumers."
Liu suffers from severe hearing loss herself; she needs to use high-powered hearing aids instead of her company's sound-enhancing Here One earbuds. The wireless earbuds can use GPS and location information to automatically shift volume and filter settings, depending on whether the user is indoors, outdoors, at a concert, or in a library. Though the product was initially conceived as a "music curation" tool for customizing live-music events, Liu's presence on the team from very early in the design process helped shape the earbuds into a product that could address multiple needs.
"I was involved in helping them navigate what it would take to have our technology reach a consumer like myself," Liu said. When the Here One earbuds were unveiled, the company received thousands of inquiries about whether the product could be used as assistive hearing devices.
"There are consumers who need a little help in loud restaurants or an open office but don't need a $5,000 hearing aid," she said, some of whom may also have been drawn by the idea of having a listening booster without the stigma of a full-blown hearing aid.
Here One earbuds (photo courtesy Doppler Labs)
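Doppler Labs hasn't published how Here One decides when to change its settings, so the snippet below only sketches the idea described above: classify the listening context from venue or ambient data, then apply a stored volume-and-filter profile. The contexts, preset values, and classification rule are all assumptions for the example, not the company's tuning.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AudioProfile:
    volume: float           # 0.0-1.0 output gain
    noise_reduction: float  # 0.0-1.0 suppression strength
    speech_boost: bool      # emphasize the speech band

# Illustrative presets only; not Doppler Labs' actual parameters.
PROFILES = {
    "restaurant": AudioProfile(volume=0.7, noise_reduction=0.8, speech_boost=True),
    "concert":    AudioProfile(volume=0.5, noise_reduction=0.3, speech_boost=False),
    "library":    AudioProfile(volume=0.9, noise_reduction=0.1, speech_boost=True),
    "outdoors":   AudioProfile(volume=0.8, noise_reduction=0.6, speech_boost=False),
}

def classify_context(venue_type: Optional[str], ambient_db: float) -> str:
    """Rough stand-in for the venue/location lookup a companion app would do."""
    if venue_type in PROFILES:
        return venue_type
    return "restaurant" if ambient_db > 70 else "library"

def profile_for(venue_type: Optional[str], ambient_db: float) -> AudioProfile:
    return PROFILES[classify_context(venue_type, ambient_db)]

if __name__ == "__main__":
    print(profile_for("concert", ambient_db=95))
    print(profile_for(None, ambient_db=45))  # no venue match, quiet: library preset
```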
"You'll get better, focused products by having someone with a disability involved in the design process," Oddleifson said. "We have many clients who are involved with Harvard and MIT in creating new kinds of assistive technology, and their involvement is a crucial step. Wouldn't it be nice if the big companies all had a person on their team, maybe with a disability, who could inform some of those design decisions?"
That idea was a key tenet in the development of a Braille smartwatch, and eventually a Braille tablet, by the South Korean company Dot. One of Dot's front-office workers is a blind individual who is active in the local vision-impaired community. He also served as Dot's first line of testing for prototype tweaks, said Alex Lee, a company representative.
"We go straight to him when there's something new," Lee said. "We say, 'What do you think about this function? How would you change this?'" He has also brought friends and others from his mostly online community into Dot's offices for beta testing and periodic chats with the engineers.
Backed by a successful crowdfunding campaign and with the first units shipped in April, the low-profile Dot watch features four Braille characters, which are driven by magnetically controlled pins. Connected via Bluetooth to a user's phone, the watch face can scroll through text messages, e-mails, and other short missives. And, of course, it tells time—but without the need for Siri to shout it out in a quiet room.
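Dot hasn't released its firmware, but the watch's basic behavior (pushing a message out four cells at a time) is easy to model. In the sketch below each cell is a set of raised dots numbered 1 through 6; the letter-to-dot table covers only the demo message and is meant purely as an illustration, not production code.

```python
# Each braille cell is a set of raised dots (numbered 1-6). The table covers
# just the letters in the demo message, following standard braille patterns.
LETTER_TO_DOTS = {
    "a": {1}, "c": {1, 4}, "e": {1, 5}, "l": {1, 2, 3}, "m": {1, 3, 4}, " ": set(),
}

CELLS_PER_FRAME = 4  # the Dot watch shows four braille characters at a time

def frames(message: str):
    """Yield the message as successive four-cell frames, the way a wearer
    scrolls through a text on the watch face."""
    cells = [LETTER_TO_DOTS.get(ch, set()) for ch in message.lower()]
    for i in range(0, len(cells), CELLS_PER_FRAME):
        yield cells[i:i + CELLS_PER_FRAME]

if __name__ == "__main__":
    for frame in frames("call me"):
        # A real watch would drive its magnetically controlled pins; we print.
        print([sorted(cell) for cell in frame])
    # [[1, 4], [1], [1, 2, 3], [1, 2, 3]]
    # [[], [1, 3, 4], [1, 5]]
```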
The Dot watch next to a phone displaying the Dot app.
The watch's appearance as an accessory was also important, Lee said, because the company felt adaptive devices shouldn't reflexively be clunky. Aira's Kanuganti agreed, saying his company was developing a headset in which style doesn't take a backseat. He noted, though, that the Aira service is hardware-agnostic, intended to work with whatever device the user chooses.
"Think about how much these phone companies worry about the design of their handsets, but how often it's actually just sitting in your pocket," Kanuganti said. "But glasses—they're on your face, and they'd better be cool." Think Tom Ford frames but packing a quality camera, antennas, GPS, and proximity and altimeter sensors.

Taking a Walk

Eventually, Aira may be able to do even more for Candice without requiring her to talk to an agent at all. "Aira" is a portmanteau of AI (artificial intelligence) and the name of the ancient Egyptian sun god Ra, and the company eventually intends to use artificial intelligence to generate and communicate information to its subscribers based on their most common habits, routes, and routines. The goal is to draw upon image-recognition technology, information from previous conversations with agents, street and satellite maps, GPS, and physical location to augment and occasionally replace what a human agent relays to the Aira user.
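Aira hasn't detailed how that automation will work, so the pipeline below is only a conceptual sketch of the idea Kanuganti describes: gather whatever automated observations are available (an image-recognition guess, a map lookup keyed to GPS), relay the most confident one, and hand off to a human agent whenever confidence is low. Every function, value, and threshold here is a placeholder, not Aira's architecture.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    description: str
    confidence: float  # 0.0-1.0, how sure the automated pipeline is

# Placeholder signal sources; a real system would call vision models, map
# services, and a store of the user's routines and past agent conversations.
def recognize_scene(frame_id: str) -> Observation:
    return Observation("crosswalk ahead, pedestrian signal is red", 0.62)

def lookup_route_step(lat: float, lon: float) -> Observation:
    return Observation("turn left in 30 feet toward the Metro entrance", 0.90)

def assist(frame_id: str, lat: float, lon: float, min_confidence: float = 0.75) -> str:
    """Relay the most confident automated observation, or hand off to a human."""
    observations = [recognize_scene(frame_id), lookup_route_step(lat, lon)]
    best = max(observations, key=lambda o: o.confidence)
    if best.confidence >= min_confidence:
        return f"auto: {best.description}"
    return "handing off to a live agent"  # low confidence: a person takes over

if __name__ == "__main__":
    print(assist("frame-0042", 38.8895, -77.0353))
```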
"We're looking to leverage existing systems to do the work," Kanuganti said. "You categorize individual tasks, then push your training system to automate those things."
For now, though, Candice is still chatting with Patrick and the other agents. And beyond what she describes as the "absolute freedom" that Aira has afforded her over the past six months, she wasn't expecting it to restore one aspect of making her way through the world: just enjoying a walk, with no particular purpose or destination.
When she met Kanuganti last fall on the National Mall in Washington to try Aira, the agent she connected with guided her from a subway entrance to one of the Smithsonian museums. And instead of just receiving directions, she asked what the agent could see: orange and red leaves falling from the trees. A trash can to her left. A person walking toward her with a stroller and a baby dressed in pink.
"For the very first time, in I don't know how long, I felt like I was talking a walk in the city that I live in and love," Candice said. "It was a walk. Not just the task of getting safely from point A to point B. I wanted to scream with happiness."