Causal Inference in Multisensory Perception
2007


Sample size: 19

Evidence level: high

Author Information

Author(s): Körding Konrad P., Beierholm Ulrik, Ma Wei Ji, Quartz Steven, Tenenbaum Joshua B., Shams Ladan

Primary Institution: California Institute of Technology

Hypothesis

Can the brain efficiently infer the causes underlying sensory events through multisensory cue combination?

Conclusion

Humans efficiently infer both the causal structure and the locations of sensory stimuli, consistent with Bayesian causal inference over multisensory cues.

Supporting Evidence

  • The model accurately predicts how humans combine auditory and visual cues.
  • Subjects' auditory and visual location estimates influence each other yet remain distinct, indicating partial rather than complete cue fusion.
  • The causal inference model explains the observed patterns of partial combination in multisensory perception.

Takeaway

When we see and hear something at the same time, our brain tries to figure out if they come from the same source or different ones, which helps us understand what's happening around us.

Methodology

The study used a dual-report paradigm where participants reported the perceived locations of visual and auditory stimuli presented simultaneously with varying spatial disparities.
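The core of this analysis is a Bayesian causal inference model: given noisy visual and auditory measurements, the observer computes the posterior probability that both arose from a single source. A minimal sketch of that computation is below, using Gaussian likelihoods and a zero-mean Gaussian spatial prior; the closed-form marginals follow the model's structure, but the noise and prior parameter values here are illustrative placeholders, not the fitted values from the study.

```python
import math

def posterior_common_cause(x_v, x_a, sigma_v=2.0, sigma_a=10.0,
                           sigma_p=15.0, p_common=0.5):
    """Posterior probability that visual (x_v) and auditory (x_a)
    measurements share a single common cause.

    sigma_v, sigma_a: sensory noise std. devs.; sigma_p: std. dev. of the
    zero-mean Gaussian spatial prior; p_common: prior on a common cause.
    All parameter values are illustrative, not fitted to data.
    """
    var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2

    # Likelihood of both measurements under one common source,
    # with the source location integrated out analytically.
    var_c = var_v * var_a + var_v * var_p + var_a * var_p
    quad_c = ((x_v - x_a)**2 * var_p
              + x_v**2 * var_a + x_a**2 * var_v) / var_c
    like_common = math.exp(-quad_c / 2) / (2 * math.pi * math.sqrt(var_c))

    # Likelihood under two independent sources: each measurement is
    # marginally Gaussian with the prior variance added to its noise.
    like_indep = (math.exp(-x_v**2 / (2 * (var_v + var_p)))
                  / math.sqrt(2 * math.pi * (var_v + var_p))
                  * math.exp(-x_a**2 / (2 * (var_a + var_p)))
                  / math.sqrt(2 * math.pi * (var_a + var_p)))

    num = like_common * p_common
    return num / (num + like_indep * (1 - p_common))
```

With these placeholder parameters, coincident cues yield a higher common-cause posterior than widely separated ones, which is the qualitative pattern the dual-report paradigm probes by varying spatial disparity.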

Potential Biases

Participant responses may be biased by the experimental setup (e.g., the dual-report procedure) and by the artificial nature of the sensory stimuli.

Limitations

The study's findings may not generalize to all types of multisensory integration scenarios.

Participant Demographics

Nineteen undergraduate students, ten male, from the California Institute of Technology.

Statistical Information

Statistical significance: p < 0.0001

Digital Object Identifier (DOI)

10.1371/journal.pone.0000943
