Nearly 70 percent of Americans take prescription drugs, according to the Mayo Clinic.
Reading the tiny print on medication bottles—where dose amounts and critical instructions are found—can be tricky. For those with low or no vision, it’s next to impossible.
But a recent collaboration between the AT&T Foundry for Connected Health (located in the Texas Medical Center Innovation Institute) and La Jolla, California-based Aira, which makes smart glasses for people who are blind or have low vision, offers a solution.
Aira’s remote assistive technology connects smart glasses users—dubbed “explorers”—with a network of certified agents via an augmented reality dashboard. The agents serve as visual interpreters, helping users accomplish a wide range of activities, such as walking down the street, navigating an airport or even reading a bedtime story.
About a year ago, while developing an artificial intelligence (AI) and machine learning system to read the labels on medication bottles, the AT&T Foundry for Connected Health team partnered with Aira to provide network connectivity for Aira’s smart glasses.
The result is “Hey Chloe,” a voice-activated recognition solution that debuted in March 2018. Aira’s new AI platform identifies both prescription and over-the-counter medications.
Nadia Morris, former director of the AT&T Foundry, explained the process: “First, [the computer] has to determine if it is a medication bottle or not,” she said. “It’s similar to the TxTag, where a photo is taken. Their systems are trained to know what a license plate looks like.”
The TxTag system, which allows drivers to pre-pay tolls, works off an AI system that recognizes license plates. Morris’s team applied the same process to medication bottles; team members brought in their own bottles and trained the computer to read them. The team even set up a secure system for other AT&T colleagues to donate images of their bottles to help train the computer.
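Morris’s description amounts to a two-stage pipeline: first decide whether an image shows a medication bottle at all, and only then attempt to read the label. A minimal sketch of that gating logic, with stub functions standing in for the trained detector and OCR (all names and logic here are hypothetical, not AT&T’s actual code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Frame:
    """A single camera frame; `content` stands in for pixel data."""
    content: str

def looks_like_bottle(frame: Frame) -> bool:
    # Stub for the trained detector; the real system was trained on
    # crowdsourced photos of AT&T employees' own medication bottles.
    return "bottle" in frame.content

def read_label(frame: Frame) -> Optional[str]:
    """Stage 2: attempt OCR only if stage 1 says this is a bottle."""
    if not looks_like_bottle(frame):
        return None
    # Stub OCR: the real system extracts the drug name and instructions.
    return frame.content.split("bottle:", 1)[1].strip()

print(read_label(Frame("bottle: Amoxicillin 500 mg, take twice daily")))
print(read_label(Frame("coffee mug on a desk")))
```

The gate matters in practice: running OCR on every frame would waste compute and surface spurious text, so the cheap "is this a bottle?" check runs first.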
“We crowdsourced it,” Morris said. “A lot of employees run the spectrum of age, gender and ethnic background, so it was a good cross section.”
Although major pharmacies have offered “talking” pill bottles for several years—typically, a health professional records instructions on a device that attaches to the bottle—“Hey Chloe” accesses instructions in a different way.
“Hey Chloe” users can activate the AI assistant by asking, “Hey Chloe, what medication is this?” The assistant scans the field around the user and finds the bottle of prescription medication. The glasses then read the label and convert that information into an audio file played into the user’s ear, Morris said. The system also works for over-the-counter medication.
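The flow just described (wake phrase, then scan, then read, then speak) can be sketched roughly like this; every function and data shape here is an illustrative stub, not Aira’s API:

```python
def speak(text):
    # Stand-in for converting label text to audio in the user's ear.
    return f"[audio] {text}"

def handle_utterance(utterance, frames):
    """Sketch of the 'Hey Chloe' flow: the wake phrase triggers a scan
    of nearby frames, the first medication label found is read, and the
    text is handed to a text-to-speech step.

    `frames` is a hypothetical list of dicts, each carrying the `label`
    text that detection + OCR would have produced for that frame.
    """
    if not utterance.lower().startswith("hey chloe"):
        return None  # not addressed to the assistant
    for frame in frames:
        label = frame.get("label")
        if label:
            return speak(label)
    return speak("No medication bottle found.")

# Example: the wake phrase plus a frame containing a readable label.
frames = [{"label": ""}, {"label": "Ibuprofen 200 mg, take with food"}]
print(handle_utterance("Hey Chloe, what medication is this?", frames))
```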
The AT&T Foundry team learned a few things during this project. One challenge with machine learning is providing a varied data set from which the computer can learn. In addition, because most pill bottles are cylinders, the user often must rotate the bottles for the glasses to read the prescriptions correctly.
Team members also discovered that while it seems like there is a CVS or Walgreens on every corner, a large number of prescriptions come from independent pharmacies, so they had to train the computer to recognize different types of labels, Morris said.
During the project, Morris was often asked why the team didn’t use photos from the Internet, where such images are readily available. She said those pictures are typically perfect, with the label always facing the right way, whereas people who are visually impaired might not always pick up a bottle with the label facing them.
Aira’s next-generation wearables, Horizon smart glasses, come with “Hey Chloe” and became available in May 2018. The glasses are already paired with an Aira-dedicated smartphone, powered by AT&T, for those who don’t own a smartphone.
There is a lot of synergy between the work Aira is doing to connect blind users and human agents, and what AT&T is accomplishing to power that connectivity, explained Greg Stilson, director of product management at Aira.
“AT&T has been a huge partner with us,” said Stilson, who is blind. “It stemmed from the need to have a partner who provided data. Imagine a constant video feed with a blind user connected to agents managing all of that data, with the ‘explorer’ using up that data on their own smartphone plan.”
More than 35 percent of the interactions between agents and users involve some level of reading, he added, which is why “Hey Chloe” provides such an advantage.
The artificial intelligence platform also helps users locate pill bottles that have been misplaced. Users scan an area with the glasses and ask the AI agent to locate the bottle of medication. In turn, the glasses recognize the medicine label among other items and direct the user to the bottle.
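One plausible shape for that locate-and-direct behavior, sketched with hypothetical item detections and relative bearings rather than anything from Aira’s real system:

```python
def locate_bottle(scene):
    """Scan detected items and direct the user toward the one
    recognized as a medicine bottle.

    `scene` is a hypothetical list of (item_name, bearing_degrees)
    pairs, where bearing is relative to where the glasses point:
    negative means left of center, positive means right.
    """
    for name, bearing in scene:
        if "medication" in name or "pill bottle" in name:
            if bearing < -10:
                return f"{name}: turn left about {abs(bearing)} degrees"
            if bearing > 10:
                return f"{name}: turn right about {bearing} degrees"
            return f"{name}: straight ahead"
    return "No medication bottle in view; try scanning another area."

# Example scene: the bottle sits well to the user's right.
print(locate_bottle([("coffee mug", -5), ("pill bottle", 30)]))
```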
AT&T is helping Aira add new interactive abilities to “Hey Chloe” so the AI program will be able to recognize other items and even tell the user to move closer to an object if it is blurry, Stilson said.
The overall goal is for people to have freedom and interaction. “It’s like having a sighted person in your pocket,” he said.
And “Hey Chloe” is just the beginning.
“We have this beautiful AI and human interaction,” Stilson said. “The pill bottle is one thing, but we are moving toward being able to read any text out there. Imagine going through an airport, one of the most challenging environments, where you have to go from Point A to Point B, passing restaurants and restrooms. Soon, all you will have to say is ‘Chloe, read this.’ Text reading is opening up the world of print, and we are very excited about it.”