Researchers at Dartmouth-Hitchcock and the University of Massachusetts Boston have received a four-year, $1.2 million grant from the National Institute on Aging (NIA) to study whether voice and language patterns captured by voice assistants, such as Alexa, can be used to identify people in the early stages of dementia or cognitive impairment. The researchers will collect long-term speech samples from individuals in their homes to see whether those data can support new speech-analysis methods for the early detection of Alzheimer’s disease.
Past research has shown that speech patterns change in the early stages of dementia. The new study explores the hypothesis that changes in the speech patterns of people using voice assistant systems may reveal declines in memory and function over time. The grant’s goal is to develop a potentially low-cost and practical home-based assessment method that uses voice assistants for early detection of cognitive decline. Identifying memory and other cognitive problems early can lead to interventions and support systems that improve patients’ everyday function and quality of life before symptoms become severe.
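As a rough illustration of the kind of signal such speech-analysis methods might look for, the sketch below computes two simple linguistic features often discussed in the speech-and-dementia literature: vocabulary richness (type-token ratio) and filler-word rate, a crude proxy for hesitation. The feature choices and function names are assumptions made for illustration, not the study’s published methodology.

```python
import re

def tokenize(transcript: str) -> list[str]:
    """Lowercase a transcript and split it into word tokens."""
    return re.findall(r"[a-z']+", transcript.lower())

def linguistic_features(transcript: str) -> dict[str, float]:
    """Compute two illustrative features: type-token ratio
    (vocabulary richness) and filler-word rate (hesitation proxy)."""
    tokens = tokenize(transcript)
    if not tokens:
        return {"type_token_ratio": 0.0, "filler_rate": 0.0}
    fillers = {"um", "uh", "er", "hmm"}  # hypothetical filler set
    return {
        "type_token_ratio": len(set(tokens)) / len(tokens),
        "filler_rate": sum(t in fillers for t in tokens) / len(tokens),
    }

if __name__ == "__main__":
    sample = "Um, I went to the, uh, the store and I, um, bought some bread."
    print(linguistic_features(sample))
```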
The study will evaluate participants in the laboratory for 18 months and in their homes for 28 months to determine whether voice assistant systems can measure and predict an individual’s cognitive decline over time. A voice assistant such as Alexa could be trained to notice changes in speech patterns, helping caregivers monitor the cognitive health of the people who use it.
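One minimal way such longitudinal monitoring could work is to track a speech feature across repeated sessions, fit a trend, and flag sustained decline. The sketch below, building on the features above, does this with a simple linear regression; the feature series and the decline threshold are invented for illustration and do not come from the study.

```python
from statistics import linear_regression

def flag_decline(monthly_values: list[float],
                 slope_threshold: float = -0.005) -> bool:
    """Fit a per-month linear trend to a feature series and return
    True if the slope indicates a sustained downward drift.
    The threshold is illustrative only."""
    months = list(range(len(monthly_values)))
    slope, _intercept = linear_regression(months, monthly_values)
    return slope < slope_threshold

if __name__ == "__main__":
    # Twelve months of a hypothetical type-token-ratio measurement.
    readings = [0.62, 0.61, 0.60, 0.60, 0.58, 0.57,
                0.56, 0.55, 0.53, 0.52, 0.51, 0.50]
    print(flag_decline(readings))  # True: the trend is clearly downward
```

A real system would need far more than a slope test, such as controls for topic, mood, and recording conditions, but the basic idea of comparing a person against their own baseline over time is the same.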
Challenges include getting people to feel comfortable using such a system and enabling the voice assistant to handle different languages and unclear speech. The voice assistants would not replace clinical evaluations; rather, they would help detect problems earlier.