Nearly 70 percent of Americans take prescription drugs, according to the Mayo Clinic.
Reading the tiny print on medication bottles—where dose amounts and critical instructions are found—can be tricky. For those with low or no vision, it’s next to impossible.
But a recent collaboration between the AT&T Foundry for Connected Health (located in the Texas Medical Center Innovation Institute) and La Jolla, California-based Aira, which makes smart glasses for people who are blind or have low vision, offers a solution.
Aira’s remote assistive technology connects smart glasses users—dubbed “explorers”—with a network of certified agents via an augmented reality dashboard. The agents serve as visual interpreters, helping users accomplish a wide range of activities, such as walking down the street, navigating an airport or even reading a bedtime story.
About a year ago, while developing an artificial intelligence (AI) and machine learning system to read the labels on medication bottles, members of the AT&T Foundry for Connected Health team partnered with Aira to provide network connectivity for Aira's smart glasses.
The result is "Hey Chloe," a voice-activated recognition feature that debuted in March 2018 on Aira's new AI platform and identifies both prescription and over-the-counter medications.
Nadia Morris, former director of the AT&T Foundry, explained the process: “First, [the computer] has to determine if it is a medication bottle or not,” she said. “It’s similar to the TxTag, where a photo is taken. Their systems are trained to know what a license plate looks like.”
The TxTag system, which allows drivers to pre-pay tolls, works off an AI system that recognizes license plates. Morris’s team applied the same process to medication bottles; team members brought in their own bottles and trained the computer to read them. The team even set up a secure system for other AT&T colleagues to donate images of their bottles to help train the computer.
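Morris's comparison to TxTag suggests a two-stage pipeline: first decide whether the frame contains a medication bottle at all, then read the label. A minimal Python sketch of that control flow, with the trained models replaced by hypothetical stubs (the real AT&T Foundry models are not public):

```python
from typing import Optional

def looks_like_med_bottle(image_bytes: bytes) -> bool:
    """Stage 1: binary classifier trained on crowdsourced bottle photos.
    Stubbed here with a trivial placeholder check standing in for a real model."""
    return image_bytes.startswith(b"BOTTLE:")

def read_label(image_bytes: bytes) -> str:
    """Stage 2: extract the label text (stubbed; a real system would run OCR)."""
    return image_bytes[len(b"BOTTLE:"):].decode("utf-8", errors="replace")

def process_frame(image_bytes: bytes) -> Optional[str]:
    """Run the two-stage pipeline on one camera frame.
    Returns the label text, or None if no bottle is detected."""
    if not looks_like_med_bottle(image_bytes):
        return None
    return read_label(image_bytes)

print(process_frame(b"BOTTLE:Ibuprofen 200 mg - take 1 tablet every 6 hours"))
print(process_frame(b"a photo of a cat"))  # → None
```

The gating step matters: running OCR only on frames the classifier accepts keeps the system from reading arbitrary text in the scene as if it were a prescription.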
“We crowdsourced it,” Morris said. “A lot of employees run the spectrum of age, gender and ethnic background, so it was a good cross section.”
Although major pharmacies have offered “talking” pill bottles for several years—typically, a health professional records instructions on a device that attaches to the bottle—“Hey Chloe” accesses instructions in a different way.
"Hey Chloe" users can activate the AI assistant by asking, "Hey Chloe, what medication is this?" The AI assistant will scan the field around the user and find the bottle of prescription medication. The glasses will read the label and turn that information into audio that is played into the user's ear, Morris said. The system also works for over-the-counter medication.
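The label-to-audio step amounts to extracting the label's text and rendering it as speech. A hedged sketch of the text side only, assuming a common "drug name and strength, then directions" label layout (the layout and wording here are illustrative assumptions, not Aira's actual format):

```python
import re

def label_to_speech_text(label: str) -> str:
    """Turn raw OCR'd label text into a sentence suitable for text-to-speech.
    Assumes the first line holds 'NAME STRENGTH' and later lines hold directions."""
    lines = [ln.strip() for ln in label.splitlines() if ln.strip()]
    if not lines:
        return "No label text detected."
    name_line, directions = lines[0], " ".join(lines[1:])
    m = re.match(r"(?P<name>.+?)\s+(?P<strength>\d+\s*(mg|mcg|ml))", name_line, re.I)
    if m:
        spoken = f"This is {m['name']}, {m['strength']}."
    else:
        spoken = f"This is {name_line}."
    if directions:
        spoken += f" Directions: {directions}"
    return spoken

print(label_to_speech_text("Ibuprofen 200 mg\nTake 1 tablet every 6 hours"))
```

A production system would hand the resulting string to a text-to-speech engine streaming into the glasses' earpiece.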
The AT&T Foundry team learned a few things during this project. One challenge with machine learning is providing a varied data set from which the computer can learn. In addition, because most pill bottles are cylinders, the user often must rotate the bottles for the glasses to read the prescriptions correctly.
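Because the label wraps around a cylinder, any single frame sees only part of the text. One plausible way to handle the rotation problem (an illustrative approach, not necessarily what the team built) is to stitch successive OCR reads together by their overlap as the user turns the bottle:

```python
def stitch(reads: list) -> str:
    """Merge successive partial OCR reads of a wrapped label by finding the
    longest prefix of each new read that matches the end of the text so far."""
    merged = ""
    for read in reads:
        if not merged:
            merged = read
            continue
        # Search overlaps from longest to shortest so repeated words merge correctly.
        overlap = 0
        for k in range(min(len(merged), len(read)), 0, -1):
            if merged.endswith(read[:k]):
                overlap = k
                break
        merged += read[overlap:]
    return merged

views = ["Take 1 tablet", "1 tablet every 6", "every 6 hours"]
print(stitch(views))  # → Take 1 tablet every 6 hours
```

This only works when consecutive views genuinely overlap, which is why guidance to rotate the bottle slowly matters in practice.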
Team members also discovered that while it seems like there is a CVS or Walgreens on every corner, a large number of prescriptions come from independent pharmacies, so they had to train the computer to recognize different types of labels, Morris said.
During the project, Morris was often asked why the team didn’t use photos from the Internet, where such images are readily available.
She said it was because those pictures are typically perfect, with the label always facing the right way.
People who are visually impaired might not always pick up the bottle with the label facing them, she said.
Aira’s next-generation wearables, Horizon smart glasses, come with “Hey Chloe” and became available in May 2018. The glasses are already paired with an Aira-dedicated smartphone, powered by AT&T, for those who don’t own a smartphone.
There is a lot of synergy between the work Aira is doing to connect blind users and human agents, and what AT&T is accomplishing to power that connectivity, explained Greg Stilson, director of product management at Aira.
“AT&T has been a huge partner with us,” said Stilson, who is blind. “It stemmed from the need to have a partner who provided data. Imagine a constant video feed with a blind user connected to agents managing all of that data, with the ‘explorer’ using up that data on their own smartphone plan.”
More than 35 percent of the interactions between agents and users involve some level of reading, he added, which is why “Hey Chloe” provides such an advantage.
The artificial intelligence platform also helps users locate pill bottles that have been misplaced. Users scan an area with the glasses and ask the AI agent to locate the bottle of medication. In turn, the glasses recognize the medicine label among other items and direct the user to the bottle.
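Directing a user toward a recognized bottle can be reduced to comparing the detection's position with the center of the camera frame. A minimal sketch, assuming the recognizer returns a bounding box in pixel coordinates (the thresholds and phrasing are invented for illustration):

```python
def direction_hint(frame_w: int, frame_h: int, box: tuple) -> str:
    """Given the frame size and a detected bottle's bounding box
    (x, y, width, height), suggest which way the user should turn."""
    x, y, w, h = box
    cx, cy = x + w / 2, y + h / 2  # center of the detection
    hints = []
    if cx < frame_w * 0.4:
        hints.append("turn left")
    elif cx > frame_w * 0.6:
        hints.append("turn right")
    if cy < frame_h * 0.4:
        hints.append("look up")
    elif cy > frame_h * 0.6:
        hints.append("look down")
    return " and ".join(hints) if hints else "the bottle is straight ahead"

# A 640x480 frame with the bottle detected near the right edge:
print(direction_hint(640, 480, (520, 200, 60, 120)))  # → turn right
```

The dead zone in the middle 20 percent of the frame keeps the assistant from issuing jittery corrections when the bottle is already roughly centered.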
AT&T is helping Aira add new interactive abilities to “Hey Chloe” so the AI program will be able to recognize other items and even tell the user to move closer to an object if it is blurry, Stilson said.
The overall goal is for people to have freedom and interaction. “It’s like having a sighted person in your pocket,” he said.
And “Hey Chloe” is just the beginning.
“We have this beautiful AI and human interaction,” Stilson said. “The pill bottle is one thing, but we are moving toward being able to read any text out there. Imagine going through an airport, one of the most challenging environments, where you have to go from Point A to Point B, passing restaurants and restrooms. Soon, all you will have to say is ‘Chloe, read this.’ Text reading is opening up the world of print, and we are very excited about it.”