Summary: New research shows that deep learning can use EEG signals to distinguish Alzheimer’s disease from frontotemporal dementia with high accuracy. By analyzing both the timing and frequency of brain activity, the model uncovered distinct patterns: broader impairment across multiple regions in Alzheimer’s and more localized frontal and temporal changes in frontotemporal dementia.
The system also estimated disease severity, giving doctors faster information than traditional tools. These findings suggest that affordable EEG technology, combined with advanced AI, can streamline diagnosis and personalize care for people experiencing cognitive decline.
Key facts
EEG biomarkers: Slow delta waves in the frontal and central areas signaled disease in both conditions.
Distinct patterns: Alzheimer’s showed a generalized alteration, while frontotemporal dementia remained more localized.
High accuracy: A two-stage deep learning system achieved 84% accuracy in separating the two disorders.
Source: FAU
Dementia is a group of disorders that gradually impair memory, thinking, and daily functioning. Alzheimer’s disease (AD), the most common form of dementia, will affect about 7.2 million Americans aged 65 and older in 2025.
Frontotemporal dementia (FTD), although rarer, is the second most common cause of early-onset dementia and usually affects people between the ages of 40 and 60.
Although both diseases damage the brain, they do so in different ways. AD primarily affects memory and spatial awareness, while FTD strikes the regions responsible for behavior, personality, and language.
Because their symptoms can overlap, misdiagnosis is common. Distinguishing between them is not only a scientific challenge but a clinical necessity, as an accurate diagnosis can profoundly affect treatment, care, and quality of life.
MRI and PET scans are effective in diagnosing AD, but they are expensive, time-consuming, and require specialized equipment. Electroencephalography (EEG) offers a portable, non-invasive, and affordable alternative, measuring brain activity across multiple frequency bands with scalp sensors.
However, EEG signals are often noisy and vary between individuals, making analysis difficult. Even when machine learning is applied to EEG data, results are inconsistent, and differentiating AD from FTD remains a challenge.
To address this problem, researchers at Florida Atlantic University’s College of Engineering and Computer Science have created a deep learning model that detects and evaluates AD and FTD. The model increases the accuracy and interpretability of EEG analysis by examining the frequency- and time-based patterns of brain activity associated with each disease.
The results of the study, published in the journal Biomedical Signal Processing and Control, found that slow delta brain waves were an important biomarker for both AD and FTD, primarily in the frontal and central regions of the brain.
In AD, brain activity was altered more broadly, extending to additional brain regions and frequency bands such as beta, indicating more extensive brain damage. These differences help explain why AD is often easier to detect than FTD.
The model achieved more than 90% accuracy in distinguishing people with dementia (AD or FTD) from cognitively normal participants. It also predicted disease severity, with relative errors of less than 35% for AD and about 15.5% for FTD.
Because AD and FTD share similar symptoms and brain activity, it was difficult to tell them apart. Using feature selection, the researchers increased the model’s specificity (how well it identified people without the disease) from 26% to 65%.
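To make the feature-selection step concrete, the sketch below shows a generic filter-based approach (scikit-learn’s SelectKBest with an ANOVA F-test) applied to made-up spectral features. The scoring function, the number of features kept, and the data are assumptions for illustration, not the procedure reported in the paper.

```python
# Illustrative filter-based feature selection for separating two classes.
# The feature matrix, labels, scoring function, and k are all assumptions.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 95))      # 100 recordings x 95 spectral features (fake)
y = rng.integers(0, 2, size=100)    # 0 = AD, 1 = FTD (fake labels)

# Keep the 20 features whose means differ most between the two groups.
selector = SelectKBest(score_func=f_classif, k=20)
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)              # (100, 20)
```

Discarding weakly discriminative features in this way is one common route to improving specificity when two classes overlap heavily.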
Its two-stage design (first detecting healthy individuals and then separating AD from FTD) achieved an accuracy of 84%, ranking among the best EEG-based methods so far.
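As a rough illustration of that two-stage decision flow, the sketch below first screens for dementia and then separates AD from FTD. The placeholder threshold classifiers and summary features are hypothetical stand-ins for the authors’ trained deep networks; only the ordering of the two stages follows the study.

```python
# Minimal sketch of a two-stage classification pipeline.
# Both stage classifiers are hypothetical placeholders, not the paper's models.
import numpy as np

def stage_one_is_dementia(features: np.ndarray) -> bool:
    """Hypothetical stage 1: separate cognitively normal (CN) from dementia."""
    # Placeholder threshold on a made-up summary feature.
    return features.mean() > 0.5

def stage_two_label(features: np.ndarray) -> str:
    """Hypothetical stage 2: separate AD from FTD among dementia cases."""
    # Placeholder rule: broader alteration across channels -> AD.
    return "AD" if features.std() > 0.2 else "FTD"

def classify(features: np.ndarray) -> str:
    """Two-stage pipeline: rule out CN first, then distinguish AD from FTD."""
    if not stage_one_is_dementia(features):
        return "CN"
    return stage_two_label(features)

# Example on a fake feature vector for one EEG recording.
print(classify(np.random.rand(64)))
```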
The model combines convolutional neural networks with an attention-based LSTM to detect both the type and severity of dementia from EEG data. Grad-CAM shows which brain signals influenced the model, helping doctors understand its decisions.
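For readers who want a concrete picture of such an architecture, here is a minimal PyTorch sketch of a CNN feeding an attention-based LSTM for EEG classification. The channel count, layer sizes, and form of attention are illustrative assumptions rather than the published model, and Grad-CAM would be applied to the convolutional layers to visualize which inputs drive each prediction.

```python
# Minimal CNN + attention-LSTM sketch for EEG classification (assumed shapes:
# batch x 19 channels x time samples). Sizes are illustrative assumptions.
import torch
import torch.nn as nn

class CnnAttnLstm(nn.Module):
    def __init__(self, n_channels=19, n_classes=2, hidden=64):
        super().__init__()
        # 1-D convolutions extract local temporal/spectral patterns.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # LSTM models how those patterns evolve over time.
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        # Attention weights the time steps that matter most for the decision.
        self.attn = nn.Linear(hidden, 1)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                                  # x: (batch, chans, time)
        feats = self.cnn(x).transpose(1, 2)                # (batch, time', 64)
        seq, _ = self.lstm(feats)                          # (batch, time', hidden)
        weights = torch.softmax(self.attn(seq), dim=1)     # (batch, time', 1)
        context = (weights * seq).sum(dim=1)               # attention-pooled summary
        return self.head(context)                          # class logits

# Example forward pass on fake EEG: 8 recordings, 19 channels, 2000 samples.
logits = CnnAttnLstm()(torch.randn(8, 19, 2000))
print(logits.shape)  # torch.Size([8, 2])
```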
This approach offers new insight into how brain activity evolves and what regions and frequencies drive diagnosis, something that traditional tools rarely capture.
“What makes our study novel is how we use deep learning to extract both spatial and temporal information from EEG signals,” said Tuan Vo, first author and doctoral student in FAU’s Department of Electrical and Computer Engineering.
“By doing this, we can detect subtle brain wave patterns related to Alzheimer’s and frontotemporal dementia that would otherwise go unnoticed. Our model not only identifies the disease, but also estimates its severity, offering a more complete picture of each patient’s condition.”
The findings also revealed that AD tends to be more severe, affecting a broader range of brain areas and leading to lower cognitive scores, while the effects of FTD are more localized to the frontal and temporal lobes.
These insights align with previous neuroimaging studies, but add new depth by showing how these patterns appear in EEG data, an inexpensive and non-invasive diagnostic tool.
“Our findings show that Alzheimer’s disease alters brain activity more broadly, especially in the frontal, parietal and temporal regions, while frontotemporal dementia mainly affects the frontal and central areas,” said Hanqi Zhuang, Ph.D., co-author, associate dean and professor in the Department of Electrical and Computer Engineering at FAU.
“This difference explains why Alzheimer’s is often easier to detect. However, our work also shows that careful feature selection can significantly improve how well we distinguish FTD from Alzheimer’s.”
Overall, the study shows that deep learning can streamline dementia diagnosis by combining detection and severity assessment into a single system, reducing lengthy evaluations and giving doctors real-time tools to track disease progression.
“This work demonstrates how the combination of engineering, artificial intelligence and neuroscience can transform the way we address major health challenges,” said Stella Batalama, Ph.D., dean of the College of Engineering and Computer Science.
“With millions affected by Alzheimer’s and frontotemporal dementia, advances like this open the door to earlier detection, more personalized care and interventions that can truly improve lives.”
Co-authors of the study are Ali K. Ibrahim, Ph.D., teaching assistant professor; and Chiron Bang, a doctoral student, both in the FAU Department of Electrical and Computer Engineering.
Key questions answered:
Q: Why are Alzheimer’s disease and frontotemporal dementia so often confused?
A: Their symptoms and EEG signatures often overlap, leading to misdiagnosis without specialized imaging.
Q: What does the deep learning model do differently?
A: It analyzes spatial and temporal features simultaneously, revealing subtle differences in brain waves that standard methods miss.
Q: Can the model do more than detect dementia?
A: Yes, it estimates the severity levels of both conditions, helping doctors track progression more effectively.
Editorial notes:
This article was edited by a Neuroscience News editor. The original journal article was reviewed in full. Additional context was added by our staff.
About this AI and neurotechnology research news
Author: Gisele Galoustian
Source: FAU
Contact: Gisele Galoustian – FAU
Image: Image is credited to Neuroscience News.
Original research: Open access.
“Extraction and Interpretation of EEG Features for Diagnosing and Predicting the Severity of Alzheimer’s Disease and Frontotemporal Dementia Using Deep Learning” by Tuan Vo et al. Biomedical Signal Processing and Control
Abstract
Extraction and Interpretation of EEG Features for Diagnosing and Predicting the Severity of Alzheimer’s Disease and Frontotemporal Dementia Using Deep Learning
Alzheimer’s disease (AD) is the most common form of dementia, characterized by progressive cognitive decline and memory loss. Frontotemporal dementia (FTD), the second most common form of dementia, affects the frontal and temporal lobes and causes changes in personality, behavior, and language.
Due to overlapping symptoms, FTD is often misdiagnosed as AD. Although electroencephalography (EEG) is portable, noninvasive, and cost-effective, its diagnostic potential for AD and FTD is limited by the similarities between the two diseases.
To address this, we present an EEG-based feature extraction method to identify and predict the severity of AD and FTD using deep learning. Key findings include increased delta band activities in the frontal and central regions as biomarkers.
By extracting temporal and spectral features from EEG signals, our model combines a convolutional neural network with an attention-based long short-term memory (aLSTM) network, achieving over 90% accuracy in distinguishing AD and FTD from cognitively normal (CN) individuals.
It also predicts severity with relative errors of less than 35% for AD and approximately 15.5% for FTD. Differentiating FTD from AD remains challenging due to shared characteristics.
However, the application of a feature selection procedure improves the specificity for separating AD from FTD, increasing it from 26% to 65%. On this basis, we developed a two-stage approach to classify AD, CN, and FTD simultaneously. In this approach, CN is first identified, followed by differentiation of FTD from AD.
This method achieves an overall accuracy of 84% in the classification of AD, CN, and FTD.