People Overlook Hidden Racial Biases in AI Emotion Recognition

October 19, 2025

Summary: A new study reveals that most people do not recognize racial biases built into artificial intelligence systems, even when those biases are visible in the training data. The research shows that AI trained on imbalanced data sets (for example, mostly happy white faces and mostly sad Black faces) learns to associate race with emotion, producing biased performance.

Participants rarely noticed these biases unless they belonged to the negatively portrayed group. The findings highlight the need to improve public awareness, AI literacy, and transparency in how algorithms are trained and evaluated.

Key facts:

  • Hidden bias: AI trained on racially imbalanced data misclassified emotions, often depicting white faces as happier than Black ones.
  • Human blindness: Most users did not notice the bias in the AI data sets, trusting that the AI was neutral even when it was not.
  • Group sensitivity: Only participants from negatively portrayed racial groups were more likely to detect the bias.

Source: Penn State

When recognizing faces and emotions, artificial intelligence (AI) can be biased, such as classifying white people as happier than people of other racial backgrounds.

This happens because the data used to train the AI contained a disproportionate number of happy white faces, leading it to correlate race with emotional expression.
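To make the mechanism concrete, here is a minimal, hypothetical sketch (not the study's actual system or data): a toy classifier is trained on synthetic examples in which a group attribute is confounded with the emotion label. It learns to use group membership as a shortcut, so on a balanced test set it still predicts "happy" far more often for one group than the other.

```python
# Illustrative sketch only: synthetic data, not the study's stimuli or model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(n, p_happy_group0, p_happy_group1):
    """Synthetic 'faces': a group indicator plus a weak genuine emotion cue."""
    group = rng.integers(0, 2, size=n)
    p_happy = np.where(group == 0, p_happy_group0, p_happy_group1)
    happy = (rng.random(n) < p_happy).astype(int)
    smile = happy + rng.normal(0, 1.5, size=n)      # noisy real signal
    X = np.column_stack([group, smile])
    return X, happy, group

# Confounded training set: group 0 is mostly happy, group 1 is mostly sad.
X_tr, y_tr, _ = make_data(5000, p_happy_group0=0.9, p_happy_group1=0.1)
# Balanced test set: emotion is independent of group.
X_te, y_te, g_te = make_data(5000, p_happy_group0=0.5, p_happy_group1=0.5)

clf = LogisticRegression().fit(X_tr, y_tr)
pred = clf.predict(X_te)

for g in (0, 1):
    mask = g_te == g
    print(f"group {g}: predicted-happy rate = {pred[mask].mean():.2f} "
          f"(true rate = {y_te[mask].mean():.2f})")
print("learned weights [group, smile]:", clf.coef_[0])
```

Even though emotion is independent of group in the test set, the large weight on the group feature keeps pushing one group's predictions toward "happy" and the other's toward "sad," which is the kind of learned shortcut the researchers describe.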

In a recent study, published in Media Psychology, researchers asked users to evaluate such biased training data, but most users did not notice the bias unless they were in the negatively portrayed group.

The study was designed to examine whether laypeople understand that non-representative data used to train AI systems can lead to biased performance.

The academics, who have been studying this topic for five years, said AI systems need to be trained to “work for everyone” and produce results that are diverse and representative of all groups, not just one majority group.

According to the researchers, that includes understanding what the AI is learning from unintended correlations in its training data, the data sets fed into the system to teach it how it is expected to perform in the future.

“In the case of this study, the AI appears to have learned that race is an important criterion for determining whether a face is happy or sad, even though we never intended for it to learn that,” said co-author S. Shyam Sundar, Evan Pugh University Professor and director of the Center for Socially Responsible Artificial Intelligence at Penn State.

The question is whether humans can recognize this bias in training data. According to the researchers, most participants in their experiments only began to notice the bias when the AI showed biased performance, such as misclassifying the emotions of Black individuals while doing a good job of classifying emotions expressed by white individuals.
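As a rough illustration of what "biased performance" means here, the sketch below (hypothetical, not the researchers' measure) computes accuracy separately for each racial group and reports the gap between them; a large gap is the kind of disparity participants tended to notice.

```python
# Hypothetical helper, for illustration only: per-group accuracy and the gap
# between the best- and worst-served groups as a simple bias indicator.
from collections import defaultdict

def per_group_accuracy(y_true, y_pred, groups):
    correct, total = defaultdict(int), defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        total[g] += 1
        correct[g] += int(t == p)
    return {g: correct[g] / total[g] for g in total}

# Toy example: the classifier does well on white faces but poorly on Black faces.
y_true = ["happy", "sad", "happy", "sad", "happy", "sad"]
y_pred = ["happy", "sad", "happy", "happy", "happy", "happy"]
groups = ["white", "white", "white", "Black", "Black", "Black"]

acc = per_group_accuracy(y_true, y_pred, groups)
print(acc)                                        # {'white': 1.0, 'Black': 0.33...}
print("accuracy gap:", max(acc.values()) - min(acc.values()))
```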

Black participants were more likely to suspect there was a problem, especially when the training data overrepresented their own group in the negative-emotion (sad) category.

“In one of the experiment scenarios, which featured racially biased AI performance, the system failed to accurately classify facial expressions in images of minority groups,” said lead author Cheng “Chris” Chen, assistant professor of technology and emerging media at Oregon State University, who earned her doctorate in mass communications at Penn State’s Donald P. Bellisario College of Communications.

“That’s what we mean by biased performance in an AI system: the system favors the dominant group in its classifications.”

Chen, Sundar and co-author Eunchae Jang, a doctoral student in mass communications at Bellisario College, created 12 versions of a prototype artificial intelligence system designed to detect users’ facial expressions.

Using 769 participants in three experiments, the researchers tested how users could detect bias in different scenarios. The first two experiments included participants of diverse racial backgrounds, with white participants making up the majority of the sample. In the third experiment, the researchers intentionally recruited an equal number of white and black participants.

The images used in the studies were of Black and white individuals. The first experiment showed participants a biased representation of race across the rating categories: happy and sad images were unequally distributed across racial groups, with the happy faces mostly white and the sad faces mostly Black.

The second experiment showed bias stemming from inadequate representation of certain racial groups in the training data: for example, participants saw only images of white subjects in both the happy and sad categories.

In the third experiment, the researchers presented the stimuli from the first two experiments along with their counterexamples, resulting in five conditions: happy Black/sad white; happy white/sad Black; all white; all Black; and no racial confound, meaning emotion was not systematically paired with race.
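For readers who want the design at a glance, the following sketch lays out those five conditions as a simple lookup table; the labels are paraphrased from the description above, and the actual stimulus sets used in the study are not reproduced here.

```python
# Paraphrased reconstruction of the Experiment 3 conditions (illustrative only).
CONDITIONS = {
    "happy_black_sad_white": {"happy faces": "Black only", "sad faces": "white only"},
    "happy_white_sad_black": {"happy faces": "white only", "sad faces": "Black only"},
    "all_white":             {"happy faces": "white only", "sad faces": "white only"},
    "all_black":             {"happy faces": "Black only", "sad faces": "Black only"},
    "no_confound":           {"happy faces": "both groups", "sad faces": "both groups"},
}

for name, cells in CONDITIONS.items():
    print(f"{name:22}  happy: {cells['happy faces']:10}  sad: {cells['sad faces']}")
```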

For each experiment, the researchers asked participants whether they perceived the AI system to treat all racial groups equally. The researchers found that in all three scenarios, most participants indicated that they did not notice any bias. In the final experiment, Black participants were more likely to identify racial bias than their white counterparts, and often only when it involved unhappy images of Black people.

“We were surprised that people didn’t recognize that race and emotion were confounded, that one race was more likely than others to represent a given emotion in the training data, even when it was right in front of them,” Sundar said. “To me, that’s the most important finding of the study.”

Sundar added that the research was more about human psychology than technology. He said people often “trust AI to be neutral, even when it’s not.”

Chen said that people’s inability to detect the racial confound in training data leads them to rely on the AI’s performance when evaluating the system.

“Performance bias is very, very persuasive,” Chen said. “When people see racially biased performance in an AI system, they ignore the characteristics of the training data and form their perceptions based on the biased outcome.”

Plans for future research include developing and testing better ways to communicate AI’s inherent biases to users, developers, and policymakers. The researchers said they hope to continue studying how people perceive and understand algorithmic bias by focusing on improving media and artificial intelligence literacy.

Key questions answered:

Q: What was the main finding of the study on AI and racial bias?

A: Most people were unable to detect racial bias in AI systems trained on biased data, highlighting how subtle and easily overlooked algorithmic bias can be.

Q: Why does this bias occur in AI emotion recognition?

A: AI models often learn unintended correlations from training data; for example, they associate race with emotion because of imbalanced examples, such as mostly happy white faces and mostly sad Black faces.

Q: Why is AI’s racial emotional recognition bias important to society?

A: It shows that people trust AI too easily and often overlook racial bias unless it directly affects them, underscoring the need for public education and better detection of bias in AI systems.

About this AI research news

Author: Francisco Tutella
Source: Penn State
Contact: Francisco Tutella – Penn State
Image: The image is credited to Neuroscience News.

Original Research: Closed access.
“Racial bias in AI training data: Do laymen notice it?” by S. Shyam Sundar et al. Media Psychology

Abstract

Racial bias in AI training data: do laymen notice it?

Given that the nature of training data is the primary cause of algorithmic bias, do laypeople realize that systematic misrepresentation and underrepresentation of certain races in training data can affect AI performance in a way that privileges some races over others?

To answer this question, we conducted three online between-subjects experiments (N = 769 in total) with a prototype of an artificial intelligence system that recognizes emotions from facial expressions.

Our results show that, in general, the representativeness of training data is not an effective signal for communicating algorithmic bias. Instead, users rely on AI performance bias to perceive racial bias in AI algorithms. Additionally, the race of the users matters.

Black participants perceive the system to be more biased when all facial images used to represent unhappy emotions in the training data are those of Black individuals.

This finding highlights an important human cognitive limitation that must be taken into account when communicating algorithmic bias arising from biases in training data.
