Nearly 70 percent of Americans take prescription drugs, according to the Mayo Clinic.
Reading the tiny print on medication bottles—where dose amounts and critical instructions are found—can be tricky. For those with low or no vision, it’s next to impossible.
But a recent collaboration between the AT&T Foundry for Connected Health (located in the Texas Medical Center Innovation Institute) and La Jolla, California-based Aira, which makes smart glasses for people who are blind or have low vision, offers a solution.
Aira’s remote assistive technology connects smart glasses users—dubbed “explorers”—with a network of certified agents via an augmented reality dashboard. The agents serve as visual interpreters, helping users accomplish a wide range of activities, such as walking down the street, navigating an airport or even reading a bedtime story.
About a year ago, members of the AT&T Foundry for Connected Health team partnered with Aira to provide network connectivity for Aira's smart glasses while developing an artificial intelligence (AI) and machine learning system to read the labels on medication bottles.
The result is "Hey Chloe," a voice-activated recognition feature that debuted in March 2018 on Aira's new AI platform and identifies both prescription and over-the-counter medications.
Nadia Morris, former director of the AT&T Foundry, explained the process: “First, [the computer] has to determine if it is a medication bottle or not,” she said. “It’s similar to the TxTag, where a photo is taken. Their systems are trained to know what a license plate looks like.”
The TxTag system, which allows drivers to pre-pay tolls, works off an AI system that recognizes license plates. Morris’s team applied the same process to medication bottles; team members brought in their own bottles and trained the computer to read them. The team even set up a secure system for other AT&T colleagues to donate images of their bottles to help train the computer.
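The crowdsourced approach Morris describes, collecting varied, imperfect photos and teaching a model to tell bottles from everything else, can be sketched as a toy classifier. This is a minimal illustration, not AT&T's actual system: the hand-picked feature vectors and nearest-neighbor logic are stand-ins for a real trained vision model.

```python
from math import dist

# Toy stand-in features: (aspect_ratio, orange_pixel_fraction, text_density).
# These labeled examples mimic the crowdsourced employee photos; a real
# system would learn features from the images themselves.
training_set = [
    ((2.1, 0.70, 0.40), "bottle"),
    ((1.9, 0.65, 0.35), "bottle"),
    ((2.3, 0.60, 0.45), "bottle"),
    ((1.0, 0.05, 0.02), "not_bottle"),  # e.g. a coffee mug
    ((0.7, 0.10, 0.60), "not_bottle"),  # e.g. a book cover
]

def classify(features):
    """Label an image's features by its nearest training example."""
    nearest = min(training_set, key=lambda ex: dist(ex[0], features))
    return nearest[1]

print(classify((2.0, 0.68, 0.38)))  # close to the bottle examples
```

The more varied the training set, as with the "good cross section" of employees Morris mentions, the better such a classifier generalizes to photos it has never seen.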
“We crowdsourced it,” Morris said. “A lot of employees run the spectrum of age, gender and ethnic background, so it was a good cross section.”
Although major pharmacies have offered “talking” pill bottles for several years—typically, a health professional records instructions on a device that attaches to the bottle—“Hey Chloe” accesses instructions in a different way.
"Hey Chloe" users activate the AI assistant by asking, "Hey Chloe, what medication is this?" The assistant scans the area around the user and finds the bottle of prescription medication. The glasses read the label and turn that information into an audio file that is played into the user's ear, Morris said. The system also works for over-the-counter medications.
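That description implies a simple pipeline: listen for the wake phrase, capture the label, run text recognition, then speak the result. The sketch below uses stub functions in place of the real camera, OCR engine, and text-to-speech; the function names and the sample label are illustrative, not Aira's actual API.

```python
def hears_wake_phrase(utterance):
    """Stub wake-word check for the 'Hey Chloe' trigger."""
    return utterance.lower().startswith("hey chloe")

def capture_label_text():
    """Stub for camera capture plus OCR; a real system would scan the
    glasses' field of view and extract the label text."""
    return "Amoxicillin 500 mg. Take one capsule three times daily."

def speak(text):
    """Stub text-to-speech; returns what would be played into the ear."""
    return "[audio] " + text

def handle(utterance):
    if not hears_wake_phrase(utterance):
        return None  # ignore anything not addressed to Chloe
    return speak(capture_label_text())

print(handle("Hey Chloe, what medication is this?"))
```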
The AT&T Foundry team learned a few things during this project. One challenge with machine learning is providing a varied data set from which the computer can learn. In addition, because most pill bottles are cylinders, the user often must rotate the bottles for the glasses to read the prescriptions correctly.
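One standard way to cope with the orientation problem is data augmentation: generating rotated copies of each training image so the model learns to recognize a label at any angle. A minimal sketch, using a tiny pixel grid in place of real photos:

```python
def rotate90(image):
    """Rotate a 2-D pixel grid (list of rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

def augment(image):
    """Return the image in all four 90-degree orientations for training."""
    views = [image]
    for _ in range(3):
        views.append(rotate90(views[-1]))
    return views

label = [[1, 2],
         [3, 4]]
for view in augment(label):
    print(view)
```

Real pipelines rotate by arbitrary angles and vary lighting and blur as well, but the principle is the same: the model should never assume the label is facing it squarely.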
Team members also discovered that while it seems like there is a CVS or Walgreens on every corner, a large number of prescriptions come from independent pharmacies, so they had to train the computer to recognize different types of labels, Morris said.
During the project, Morris was often asked why the team didn't simply use photos from the Internet, where such images are readily available. Those pictures are typically perfect, she said, with the label always facing the right way. People who are visually impaired might not always pick up a bottle with the label facing them.
Aira’s next-generation wearables, Horizon smart glasses, come with “Hey Chloe” and became available in May 2018. The glasses are already paired with an Aira-dedicated smartphone, powered by AT&T, for those who don’t own a smartphone.
There is a lot of synergy between the work Aira is doing to connect blind users and human agents, and what AT&T is accomplishing to power that connectivity, explained Greg Stilson, director of product management at Aira.
“AT&T has been a huge partner with us,” said Stilson, who is blind. “It stemmed from the need to have a partner who provided data. Imagine a constant video feed with a blind user connected to agents managing all of that data, with the ‘explorer’ using up that data on their own smartphone plan.”
More than 35 percent of the interactions between agents and users involve some level of reading, he added, which is why “Hey Chloe” provides such an advantage.
The artificial intelligence platform also helps users locate pill bottles that have been misplaced. Users scan an area with the glasses and ask the AI agent to locate the bottle of medication. In turn, the glasses recognize the medicine label among other items and direct the user to the bottle.
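The locate feature amounts to filtering recognized objects for a medicine label and reporting where it sits in the frame. A toy sketch, with hypothetical detection tuples standing in for a real detector's output:

```python
def locate_medication(detections, frame_width=640):
    """Given (label, x_center) detections, find the medicine bottle
    and describe where it sits relative to the camera frame."""
    for label, x in detections:
        if label == "medication_bottle":
            if x < frame_width / 3:
                return "on your left"
            if x > 2 * frame_width / 3:
                return "on your right"
            return "straight ahead"
    return None  # not in view; keep scanning

scene = [("keys", 100), ("mug", 300), ("medication_bottle", 520)]
print(locate_medication(scene))  # bottle sits in the right third of the frame
```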
AT&T is helping Aira add new interactive abilities to “Hey Chloe” so the AI program will be able to recognize other items and even tell the user to move closer to an object if it is blurry, Stilson said.
The overall goal is for people to have freedom and interaction. “It’s like having a sighted person in your pocket,” he said.
And “Hey Chloe” is just the beginning.
“We have this beautiful AI and human interaction,” Stilson said. “The pill bottle is one thing, but we are moving toward being able to read any text out there. Imagine going through an airport, one of the most challenging environments, where you have to go from Point A to Point B, passing restaurants and restrooms. Soon, all you will have to say is ‘Chloe, read this.’ Text reading is opening up the world of print, and we are very excited about it.”