Assessment of human emotional reactions to visual stimuli “deep-dreamed” by artificial neural networks
2024

How Artificial Neural Networks Affect Human Emotions

Sample size: 150 | Evidence: moderate

Author Information

Author(s): Agnieszka Marczak-Czajka, Timothy Redgrave, Mahsa Mitcheff, Michael Villano, Adam Czajka

Primary Institution: University of Notre Dame

Hypothesis

Visual stimuli synthesized by artificial neural networks can evoke specific emotional reactions in human viewers.

Conclusion

The study shows that images synthesized to maximize the activations of selected CNN layers can amplify or attenuate viewers' emotional reactions, as measured by arousal and valence.

Supporting Evidence

  • Synthesized images that maximized activations of some CNN layers led to significantly higher or lower arousal and valence levels compared to average subject reactions.
  • Multiple linear regression analysis found that hue, feature congestion, and sub-band entropy were significant predictors of arousal.
  • No statistically significant dependencies were found between the global visual features of the images and the measured valence.
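The regression analysis described above can be sketched as an ordinary least squares fit of arousal on the three global image features. The data, feature values, and coefficients below are synthetic placeholders for illustration only; they do not reproduce the paper's actual measurements or model.

```python
import numpy as np

# Hypothetical sketch: multiple linear regression predicting arousal
# from three global image features (hue, feature congestion, sub-band
# entropy). All values are synthetic, not taken from the study.
rng = np.random.default_rng(0)
n = 150  # same order as the study's sample size

# Synthetic per-image feature values (arbitrary units)
hue = rng.uniform(0, 1, n)
congestion = rng.uniform(0, 1, n)
entropy = rng.uniform(0, 1, n)

# Synthetic arousal ratings built from assumed coefficients plus noise
arousal = 2.0 + 0.8 * hue + 1.5 * congestion - 0.6 * entropy \
    + rng.normal(0, 0.05, n)

# Design matrix with an intercept column; ordinary least squares fit
X = np.column_stack([np.ones(n), hue, congestion, entropy])
beta, *_ = np.linalg.lstsq(X, arousal, rcond=None)
# beta recovers roughly [2.0, 0.8, 1.5, -0.6] up to noise
print(beta)
```

In the paper's analysis, each feature's coefficient would additionally be tested for significance (the p < 0.001 reported below); the sketch shows only the fitting step.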

Takeaway

This study found that computer-generated images can evoke different emotional reactions in viewers, with the strength and direction of those reactions depending on how the images are synthesized.

Methodology

Participants rated their emotional responses to images generated by a convolutional neural network using Self-Assessment Manikin (SAM) figures.

Potential Biases

Reliance on self-reported ratings may bias the measured emotional responses.

Limitations

The study relied on self-reported data and had a limited sample size, which may affect the generalizability of the findings.

Participant Demographics

150 participants (98 female) with a mean age of 26.16 years.

Statistical Information

P-Value

p < 0.001

Statistical Significance

The reported effects were statistically significant (p < 0.001).

Digital Object Identifier (DOI)

10.3389/fpsyg.2024.1509392
