UT Southwestern Medical Center researchers found that AI-powered voice analysis can help diagnose Alzheimer’s and cognitive impairment in early stages, potentially providing an efficient screening tool for primary care providers if confirmed by larger studies.

O’Donnell Brain Institute researcher says findings may lead to a simple screening test for early detection of cognitive impairment.

New technologies that can capture subtle changes in a patient’s voice may help physicians diagnose cognitive impairment and Alzheimer’s disease before symptoms begin to show, according to a UT Southwestern Medical Center researcher who led a study published in the Alzheimer’s Association publication Diagnosis, Assessment & Disease Monitoring.

“Our focus was on identifying subtle language and audio changes that are present in the very early stages of Alzheimer’s disease but not easily recognizable by family members or an individual’s primary care physician,” said Ihab Hajjar, M.D., Professor of Neurology at UT Southwestern’s Peter O’Donnell Jr. Brain Institute.

Researchers used advanced machine learning and natural language processing (NLP) tools to assess speech patterns in 206 people – 114 who met the criteria for mild cognitive impairment and 92 who were cognitively unimpaired. The team then mapped those findings to commonly used biomarkers to determine how effectively the voice measures captured impairment.

Study participants, who were enrolled in a research program at Emory University in Atlanta, were given several standard cognitive assessments before being asked to record a spontaneous 1- to 2-minute description of artwork.

“The recorded descriptions of the picture provided us with an approximation of conversational abilities that we could study via artificial intelligence to determine speech motor control, idea density, grammatical complexity, and other speech features,” Dr. Hajjar said.
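To make these features concrete, the following is a minimal sketch of how measures such as idea density and grammatical complexity can be approximated from a transcribed picture description. It is purely illustrative and not the study's actual pipeline: the use of spaCy, the specific part-of-speech and dependency proxies, and the sample transcript are all assumptions introduced here for demonstration.

```python
# Illustrative sketch only: deriving rough proxies for idea density and
# grammatical complexity from a transcript, using spaCy for POS/dependency tags.
import spacy

nlp = spacy.load("en_core_web_sm")  # small English model, assumed installed

# POS categories often counted as "propositions" in idea-density estimates
PROPOSITION_POS = {"VERB", "ADJ", "ADV", "ADP", "CCONJ", "SCONJ"}

def speech_features(transcript: str) -> dict:
    doc = nlp(transcript)
    words = [t for t in doc if t.is_alpha]
    sentences = list(doc.sents)
    propositions = [t for t in words if t.pos_ in PROPOSITION_POS]
    # subordinate-clause dependencies as a crude complexity proxy
    subordinate = [t for t in doc if t.dep_ in {"advcl", "ccomp", "acl", "relcl"}]
    return {
        "word_count": len(words),
        # idea density: propositions per 10 words (a common approximation)
        "idea_density": 10 * len(propositions) / max(len(words), 1),
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "subordination_rate": len(subordinate) / max(len(sentences), 1),
    }

if __name__ == "__main__":
    sample = ("The boy is reaching for the cookie jar while the stool tips over, "
              "and the woman, who is washing dishes, does not notice the water overflowing.")
    print(speech_features(sample))
```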

The research team compared the participants’ speech analytics with their cerebrospinal fluid samples and MRI scans to determine how accurately the digital voice biomarkers detected both mild cognitive impairment and Alzheimer’s disease status and progression.
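The sketch below shows one simple way such a validation could be framed: voice-derived features are tested for how well they separate biomarker-defined impaired and unimpaired groups using a cross-validated classifier. The synthetic data, the four stand-in features, and the logistic-regression model are assumptions for illustration; they are not the study's actual data or methods.

```python
# Illustrative sketch only: cross-validated evaluation of voice features against
# biomarker-defined group labels, using synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in data: 206 participants x 4 voice features (e.g., idea density,
# speech rate, pause ratio, subordination rate); 1 = impaired, 0 = unimpaired.
X = rng.normal(size=(206, 4))
y = np.array([1] * 114 + [0] * 92)
X[y == 1] += 0.6  # inject a modest group difference so the demo is non-trivial

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated ROC AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```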

“Prior to the development of machine learning and NLP, the detailed study of speech patterns in patients was extremely labor intensive and often not successful because the changes in the early stages are frequently undetectable to the human ear,” Dr. Hajjar said.
