For most people, an Alzheimer’s diagnosis arrives late — too late for the treatments that exist to do their best work. The symptoms have already been mistaken for normal aging for years. The window for intervention has quietly closed. A new wave of AI research out of Massachusetts suggests that window doesn’t have to close nearly as early as it does.

Researchers at Worcester Polytechnic Institute have developed a machine learning model that analyzes structural changes in standard MRI brain scans and detects Alzheimer’s disease with nearly 93% accuracy — distinguishing healthy brains from those showing early cognitive impairment or Alzheimer’s changes at a precision the human eye simply can’t match. The study, published in March 2026 in the journal Neuroscience, analyzed 815 brain scans across 95 distinct brain regions to identify patterns of volume loss that signal the disease before symptoms become obvious.

The 90% Problem

The urgency behind this research comes down to a single staggering number. An estimated 90% of people in the earliest phase of Alzheimer’s — a stage called mild cognitive impairment — go undiagnosed in the United States, according to multiple studies. That’s not a small gap. That’s the vast majority of people who could benefit from early intervention cycling through the healthcare system without anyone connecting the dots.

The gap matters more now than it ever has, because for the first time, it’s clinically consequential. Two drugs — Leqembi and Kisunla — received FDA approval in 2023 and 2024, and both have been shown to modestly slow Alzheimer’s progression. But they come with a strict condition: they only work at the earliest symptomatic stages of the disease. Start too late, and the drugs can’t do what they’re designed to do. About one-third of people with mild cognitive impairment due to Alzheimer’s will develop dementia within five years, according to the Alzheimer’s Association — which makes early detection not a nice-to-have but the central clinical challenge.

What the AI Actually Sees


The WPI model works by measuring brain volume across 95 regions in standard MRI scans and identifying subtle patterns of shrinkage that don’t register on a typical clinical read. “Machine learning can see differences and changes in the brain that the human eye can’t,” said Benjamin Nephew, the assistant research professor who led the study. The regions that emerged as the strongest predictors were the hippocampus, the amygdala, and the entorhinal cortex — the same structures that Alzheimer’s tends to attack first, responsible for memory, emotional processing, and spatial navigation respectively.
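For readers curious what this kind of analysis looks like in practice, the general approach can be sketched in miniature: represent each scan as a vector of per-region volumes, then train a classifier to separate groups and inspect which regions drive the prediction. Everything below is illustrative (synthetic data, an off-the-shelf random forest, made-up region indices standing in for the hippocampus and its neighbors), not the WPI team's actual pipeline.

```python
# Illustrative sketch of volume-based classification. All data here is
# synthetic; only the counts (815 scans, 95 regions) mirror the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_subjects, n_regions = 815, 95

# Simulated regional volumes: every subject starts from the same distribution.
X = rng.normal(loc=100.0, scale=5.0, size=(n_subjects, n_regions))
y = rng.integers(0, 2, size=n_subjects)  # 0 = healthy, 1 = impaired (synthetic labels)

# Impose mild shrinkage in a few regions for the "impaired" group --
# a stand-in for atrophy in structures like the hippocampus.
atrophy_regions = [0, 1, 2]
X[np.ix_(y == 1, atrophy_regions)] -= 8.0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
accuracy = accuracy_score(y_test, clf.predict(X_test))

# The classifier's feature importances point back at the regions that
# actually changed -- the same logic that surfaced the hippocampus,
# amygdala, and entorhinal cortex as top predictors in the study.
top_regions = np.argsort(clf.feature_importances_)[::-1][:3]
print(f"test accuracy: {accuracy:.2f}, top regions: {sorted(top_regions)}")
```

On this synthetic data the shrunken regions dominate the importance ranking, which is the core idea: the model is not reading the scan the way a radiologist does, it is quantifying small, consistent volume differences across many regions at once.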

The study also found meaningful sex-related differences in how brain volume loss unfolds, suggesting that the disease’s structural footprint may develop differently depending on biological factors — a finding that could eventually inform more personalized screening approaches.

Several Massachusetts Institutions, One Shared Goal

WPI isn’t alone. Across the state, multiple research teams are attacking the early detection problem from different angles at the same time. At Mass General Brigham, researchers are using AI to mine electronic medical records — scanning patient notes for the kinds of subtle cognitive signals that clinicians often don’t flag in real time. At Boston University, a separate team has built a model that predicts the presence of amyloid beta and tau proteins — the sticky hallmarks of Alzheimer’s — using common, less expensive tests like brain scans and memory assessments, rather than the costly PET imaging typically required. And beyond Massachusetts, UCLA researchers are developing a parallel tool focused specifically on reducing diagnostic disparities in underrepresented communities, where Alzheimer’s is both more prevalent and more frequently missed.

“The biggest opportunity to improve Alzheimer’s care isn’t in a new drug — it’s in noticing the earliest signs sooner,” said Dr. Lidia Moura, director of population health in the neurology department at Mass General Brigham.

What Earlier Diagnosis Actually Changes

Kevin Terwilliger, a 62-year-old Alzheimer’s advocate who writes about living with the disease at alzblog.net, spent more than five years after a 2018 ministroke trying to get a clinician to take his symptoms seriously enough to administer a neurocognitive test. When one finally did, he couldn’t recall any of five words read to him minutes earlier. He was subsequently diagnosed, began 18 months of Leqembi infusions, and reports feeling sharper than he did before treatment. He’s clear about what earlier detection would have meant: earlier treatment, and more time at his best.

His experience is not unusual. It is, in fact, the norm — which is precisely why the AI work happening across Massachusetts carries weight beyond a journal publication. An estimated 7 million Americans 65 and older are living with Alzheimer’s today. The AI tools being developed right now won’t cure the disease. But catching it in a standard MRI scan — before patients or their doctors know to look — changes what’s possible for every one of those 7 million people who will follow.
