Integration of audiovisual spatial signals is not consistent with maximum likelihood estimation.

Cortex: A Journal Devoted to the Study of the Nervous System and Behavior | 2019

Multisensory perception is regarded as one of the most prominent examples where human behaviour conforms to the computational principles of maximum likelihood estimation (MLE). In particular, observers are thought to integrate auditory and visual spatial cues, weighted in proportion to their relative sensory reliabilities, into the most reliable and unbiased percept consistent with MLE. Yet, evidence to date has been inconsistent. The current pre-registered, large-scale (N = 36) replication study investigated the extent to which human behaviour for audiovisual localization is in line with maximum likelihood estimation. The acquired psychophysics data show that while observers were able to reduce their multisensory variance relative to the unisensory variances in accordance with MLE, they weighted the visual signals significantly more strongly than predicted by MLE. Simulations show that this dissociation can be explained by a greater sensitivity of standard estimation procedures to detect deviations from MLE predictions for sensory weights than for audiovisual variances. Our results therefore suggest that observers did not integrate audiovisual spatial signals weighted exactly in proportion to their relative reliabilities for localization. These small deviations from the predictions of maximum likelihood estimation may be explained by observers' uncertainty about the world's causal structure, as accounted for by Bayesian causal inference.
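The MLE cue-combination predictions tested in the abstract can be sketched numerically. This is a minimal illustration, not the authors' code: the weights are inverse-variance (reliability) weights, and the fused variance falls below both unisensory variances. The standard deviations and cue positions below are hypothetical placeholders.

```python
import numpy as np

# Hypothetical unisensory localization noise (deg); illustration only.
sigma_v = 2.0   # visual SD
sigma_a = 8.0   # auditory SD

# MLE weights are proportional to reliability (inverse variance).
w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)
w_a = 1 - w_v

# Fused estimate for spatially discrepant cues (hypothetical positions, deg).
s_v, s_a = -3.0, 3.0
s_av = w_v * s_v + w_a * s_a

# MLE-predicted multisensory variance: below both unisensory variances.
var_av = (sigma_v**2 * sigma_a**2) / (sigma_v**2 + sigma_a**2)
```

The study's dissociation is that observers matched the `var_av` prediction reasonably well while their empirical visual weight exceeded `w_v`.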

PubMed ID: 31082680

Research resources used in this publication

None found

Antibodies used in this publication

None found

Associated grants

None

Publication data is provided by the National Library of Medicine® and PubMed®. Data is retrieved from PubMed® on a weekly schedule. For terms and conditions, see the National Library of Medicine Terms and Conditions.

This is a list of tools and resources that we have found mentioned in this publication.


MATLAB (tool)

RRID:SCR_001622

Multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks. Allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java, Fortran and Python. Used to explore and visualize ideas and collaborate across disciplines including signal and image processing, communications, control systems, and computational finance.


G*Power (tool)

RRID:SCR_013726

Data analytics software to compute statistical power analyses for many commonly used statistical tests in social and behavioral research. It can also be used to compute effect sizes and to graphically display the results of power analyses.
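An a-priori power analysis of the kind G*Power performs can be sketched with the noncentral t distribution. This is a hedged illustration, not G*Power's implementation; the effect size is an assumed placeholder, and `n=36` simply mirrors the study's sample size.

```python
from scipy import stats

def paired_ttest_power(effect_size, n, alpha=0.05):
    """Power of a two-sided paired/one-sample t-test via the noncentral t."""
    df = n - 1
    nc = effect_size * n**0.5                   # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)     # two-sided critical value
    # Probability of rejecting H0 under H1 (both tails).
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

# Hypothetical medium effect (Cohen's d = 0.5) with N = 36.
power = paired_ttest_power(effect_size=0.5, n=36)
```

G*Power can also invert this relationship, solving for the sample size needed to reach a target power such as 0.80.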


Psychophysics Toolbox (tool)

RRID:SCR_002881

A free set of Matlab and GNU/Octave functions for vision research. It makes it easy to synthesize and show accurately controlled visual and auditory stimuli and interact with the observer.


Palamedes Toolbox (tool)

RRID:SCR_006521

Matlab routines for analyzing psychophysical data:
* Psychometric function fitting
* Multi-condition model fitting
* Adaptive procedures
* Signal detection measures
* Maximum likelihood difference scaling
* Model comparisons
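Psychometric function fitting of the kind Palamedes provides in Matlab can be sketched in Python. This is a hedged illustration, not Palamedes code: a cumulative Gaussian is fit to proportion-"right" responses, and the data below are made up for the example.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: P('right') as a function of stimulus position."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Hypothetical localization data: stimulus positions (deg) and response rates.
positions = np.array([-10, -5, -2, 0, 2, 5, 10], dtype=float)
p_right = np.array([0.02, 0.10, 0.30, 0.55, 0.75, 0.93, 0.99])

# Least-squares fit; mu is the point of subjective equality, sigma the slope.
(mu_hat, sigma_hat), _ = curve_fit(psychometric, positions, p_right,
                                   p0=[0.0, 3.0])
```

In an audiovisual localization study, the fitted `mu` and `sigma` per condition are what feed the MLE weight and variance predictions.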


SR Research EyeLink Eye Trackers (tool)

RRID:SCR_009602

THIS RESOURCE IS NO LONGER AVAILABLE, documented on February 1st, 2022. Instrument supplier providing eye tracking capabilities for behavioral labs as well as for MRI, MEG, and EEG research environments.
