Searching across hundreds of databases


This service searches only literature that cites research resources. Please be aware that the set of searchable documents is limited to those containing RRIDs (Research Resource Identifiers) and does not include all open-access literature.


Search results: page 1, showing papers 1-9 of 9

Eye Movement-Related Confounds in Neural Decoding of Visual Working Memory Representations.

  • Pim Mostert et al.
  • eNeuro
  • 2018

A relatively new analysis technique, known as neural decoding or multivariate pattern analysis (MVPA), has become increasingly popular for cognitive neuroimaging studies over recent years. These techniques promise to uncover the representational contents of neural signals, as well as the underlying code and the dynamic profile thereof. One field in which these techniques have led to novel insights in particular is that of visual working memory (VWM). In the present study, we subjected human volunteers to a combined VWM/imagery task while recording their neural signals using magnetoencephalography (MEG). We applied multivariate decoding analyses to uncover the temporal profile underlying the neural representations of the memorized item. Analysis of gaze position, however, revealed that our results were contaminated by systematic eye movements, suggesting that the MEG decoding results from our originally planned analyses were confounded. In addition to the eye movement analyses, we also present the original analyses to highlight how these might have readily led to invalid conclusions. Finally, we demonstrate a potential remedy, whereby we train the decoders on a functional localizer that was specifically designed to target bottom-up sensory signals and as such avoids eye movements. We conclude by arguing for more awareness of the potentially pervasive and ubiquitous effects of eye movement-related confounds.


Cortical activity during naturalistic music listening reflects short-range predictions based on long-term experience.

  • Pius Kern et al.
  • eLife
  • 2022

Expectations shape our experience of music. However, the internal model upon which listeners form melodic expectations is still debated. Do expectations stem from Gestalt-like principles or statistical learning? If the latter, does long-term experience play an important role, or are short-term regularities sufficient? And finally, what length of context informs contextual expectations? To answer these questions, we presented human listeners with diverse naturalistic compositions from Western classical music, while recording neural activity using MEG. We quantified note-level melodic surprise and uncertainty using various computational models of music, including a state-of-the-art transformer neural network. A time-resolved regression analysis revealed that neural activity over fronto-temporal sensors tracked melodic surprise particularly around 200 ms and 300-500 ms after note onset. This neural surprise response was dissociated from sensory-acoustic and adaptation effects. Neural surprise was best predicted by computational models that incorporated long-term statistical learning, rather than by simple, Gestalt-like principles. Yet, intriguingly, the surprise reflected primarily short-range musical contexts of less than ten notes. We present a full replication of our novel MEG results in an openly available EEG dataset. Together, these results elucidate the internal model that shapes melodic predictions during naturalistic music listening.


Neural correlates of observing joint actions with shared intentions.

  • Terry Eskenazi et al.
  • Cortex
  • 2015

Studies on the neural bases of action perception have largely focused on the perception of individual actions. Little is known about perception of joint actions, where two or more individuals coordinate their actions based on a shared intention. In this fMRI study we asked whether observing situations where two individuals act on a shared intention elicits a different neural response than observing situations where individuals act on their independent parallel intentions. We compared the neural response to perceptually identical yet intentionally ambiguous actions observed in varying contexts. A dialog between two individuals conveyed either a shared intention or two independent parallel intentions. The dialogs were followed by an identical video clip where the two individuals performed certain actions. In one task condition, participants tracked the intentions of the actors; in the other, they monitored moving colored dots placed on the same videos. We found that in the intention task versus the color task, observing joint actions based on shared intentions activated the temporal poles, precuneus, and the ventral striatum compared to observing interactions based on parallel intentions. Precuneus and the temporal poles are thought to support mental state reasoning, the latter with a more specific role in retrieving memories associated with social scripts. Activation in the ventral striatum, an area involved in reward processing, likely indicates a hedonic response to observed shared intentional relations, similar to those experienced when personally sharing mental states with others.


McGurk illusion recalibrates subsequent auditory perception.

  • Claudia S Lüttke et al.
  • Scientific Reports
  • 2016

Visual information can alter auditory perception. This is clearly illustrated by the well-known McGurk illusion, where an auditory /aba/ and a visual /aga/ are merged into the percept 'ada'. It is less clear, however, whether such a change in perception may recalibrate subsequent perception. Here we asked whether the altered auditory perception due to the McGurk illusion affects subsequent auditory perception, i.e. whether this process of fusion may cause a recalibration of the auditory boundaries between phonemes. Participants categorized auditory and audiovisual speech stimuli as /aba/, /ada/ or /aga/ while activity patterns in their auditory cortices were recorded using fMRI. Interestingly, following a McGurk illusion, an auditory /aba/ was more often misperceived as 'ada'. Furthermore, we observed a neural counterpart of this recalibration in the early auditory cortex. When the auditory input /aba/ was perceived as 'ada', activity patterns bore stronger resemblance to activity patterns elicited by /ada/ sounds than when they were correctly perceived as /aba/. Our results suggest that upon experiencing the McGurk illusion, the brain shifts the neural representation of an /aba/ sound towards /ada/, culminating in a recalibration in perception of subsequent auditory input.


Rapid recalibration of speech perception after experiencing the McGurk illusion.

  • Claudia S Lüttke et al.
  • Royal Society Open Science
  • 2018

The human brain can quickly adapt to changes in the environment. One example is phonetic recalibration: a speech sound is interpreted differently depending on the visual speech and this interpretation persists in the absence of visual information. Here, we examined the mechanisms of phonetic recalibration. Participants categorized the auditory syllables /aba/ and /ada/, which were sometimes preceded by the so-called McGurk stimuli (in which an /aba/ sound, due to visual /aga/ input, is often perceived as 'ada'). We found that only one trial of exposure to the McGurk illusion was sufficient to induce a recalibration effect, i.e. an auditory /aba/ stimulus was subsequently more often perceived as 'ada'. Furthermore, phonetic recalibration took place only when auditory and visual inputs were integrated to 'ada' (McGurk illusion). Moreover, this recalibration depended on the sensory similarity between the preceding and current auditory stimulus. Finally, signal detection theoretical analysis showed that McGurk-induced phonetic recalibration resulted in both a criterion shift towards /ada/ and a reduced sensitivity to distinguish between /aba/ and /ada/ sounds. The current study shows that phonetic recalibration is dependent on the perceptual integration of audiovisual information and leads to a perceptual shift in phoneme categorization.


Magnetoencephalography recordings reveal the neural mechanisms of auditory contributions to improved visual detection.

  • Alexis Pérez-Bellido et al.
  • Communications Biology
  • 2023

Sounds enhance the detection of visual stimuli while concurrently biasing an observer's decisions. To investigate the neural mechanisms that underlie such multisensory interactions, we decoded time-resolved Signal Detection Theory sensitivity and criterion parameters from magneto-encephalographic recordings of participants who performed a visual detection task. We found that sounds improved visual detection sensitivity by enhancing the accumulation and maintenance of perceptual evidence over time. Meanwhile, criterion decoding analyses revealed that sounds induced brain activity patterns that resembled the patterns evoked by an actual visual stimulus. These two complementary mechanisms of audiovisual interplay differed in terms of their automaticity: Whereas the sound-induced enhancement in visual sensitivity depended on participants being actively engaged in a detection task, we found that sounds activated the visual cortex irrespective of task demands, potentially inducing visual illusory percepts. These results challenge the classical assumption that sound-induced increases in false alarms exclusively correspond to decision-level biases.


Updating Contextual Sensory Expectations for Adaptive Behavior.

  • Ambra Ferrari et al.
  • The Journal of Neuroscience
  • 2022

The brain has the extraordinary capacity to construct predictive models of the environment by internalizing statistical regularities in the sensory inputs. The resulting sensory expectations shape how we perceive and react to the world; at the neural level, this relates to decreased neural responses to expected compared with unexpected stimuli ("expectation suppression"). Crucially, expectations may need revision as context changes. However, existing research has often neglected this issue. Further, it is unclear whether contextual revisions apply selectively to expectations relevant to the task at hand, hence serving adaptive behavior. The present fMRI study examined how contextual visual expectations spread throughout the cortical hierarchy as we update our beliefs. We created a volatile environment: two alternating contexts contained different sequences of object images, thereby producing context-dependent expectations that needed revision when the context changed. Human participants of both sexes attended a training session before scanning to learn the contextual sequences. The fMRI experiment then tested for the emergence of contextual expectation suppression in two separate tasks, with task-relevant and task-irrelevant expectations, respectively. Effects of contextual expectation emerged progressively across the cortical hierarchy as participants attuned themselves to the context: expectation suppression appeared first in the insula, inferior frontal gyrus, and posterior parietal cortex, followed by the ventral visual stream, up to early visual cortex. This applied selectively to task-relevant expectations. Together, the present results suggest that an insular and frontoparietal executive control network may guide the flexible deployment of contextual sensory expectations for adaptive behavior in our complex and dynamic world.

SIGNIFICANCE STATEMENT: The world is structured by statistical regularities, which we use to predict the future. This is often accompanied by suppressed neural responses to expected compared with unexpected events ("expectation suppression"). Crucially, the world is also highly volatile and context-dependent: expected events may become unexpected when the context changes, thus raising the crucial need for belief updating. However, this issue has generally been neglected. By setting up a volatile environment, we show that expectation suppression emerges first in executive control regions, followed by relevant sensory areas, only when observers use their expectations to optimize behavior. This provides surprising yet clear evidence on how the brain controls the updating of sensory expectations for adaptive behavior in our ever-changing world.


Statistical learning attenuates visual activity only for attended stimuli.

  • David Richter et al.
  • eLife
  • 2019

Perception and behavior can be guided by predictions, which are often based on learned statistical regularities. Neural responses to expected stimuli are frequently found to be attenuated after statistical learning. However, whether this sensory attenuation following statistical learning occurs automatically or depends on attention remains unknown. In the present fMRI study, we exposed human volunteers to sequentially presented object stimuli, in which the first object predicted the identity of the second object. We observed a reliable attenuation of neural activity for expected compared to unexpected stimuli in the ventral visual stream. Crucially, this sensory attenuation was only apparent when stimuli were attended, and vanished when attention was directed away from the predictable objects. These results put important constraints on neurocomputational theories that cast perception as a process of probabilistic integration of prior knowledge and sensory information.


Suppressed Sensory Response to Predictable Object Stimuli throughout the Ventral Visual Stream.

  • David Richter et al.
  • The Journal of Neuroscience
  • 2018

Prediction plays a crucial role in perception, as prominently suggested by predictive coding theories. However, the exact form and mechanism of predictive modulations of sensory processing remain unclear, with some studies reporting a downregulation of the sensory response for predictable input whereas others observed an enhanced response. In a similar vein, downregulation of the sensory response for predictable input has been linked to either sharpening or dampening of the sensory representation, which are opposite in nature. In the present study, we set out to investigate the neural consequences of perceptual expectation of object stimuli throughout the visual hierarchy, using fMRI in human volunteers. Participants of both sexes were exposed to pairs of sequentially presented object images in a statistical learning paradigm, in which the first object predicted the identity of the second object. Image transitions were not task relevant; thus, all learning of statistical regularities was incidental. We found strong suppression of neural responses to expected compared with unexpected stimuli throughout the ventral visual stream, including primary visual cortex, lateral occipital complex, and anterior ventral visual areas. Expectation suppression in lateral occipital complex scaled positively with image preference and voxel selectivity, lending support to the dampening account of expectation suppression in object perception.

SIGNIFICANCE STATEMENT: It has been suggested that the brain fundamentally relies on predictions and constructs models of the world to make sense of sensory information. Previous research on the neural basis of prediction has documented suppressed neural responses to expected compared with unexpected stimuli. In the present study, we demonstrate robust expectation suppression throughout the entire ventral visual stream, and underlying this suppression a dampening of the sensory representation in object-selective visual cortex, but not in primary visual cortex. Together, our results provide novel evidence in support of theories conceptualizing perception as an active inference process, which selectively dampens cortical representations of predictable objects. This dampening may support our ability to automatically filter out irrelevant, predictable objects.


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

    You are currently on the Community Resources tab, looking through categories and sources that FDI Lab - SciCrunch.org has compiled. You can navigate through those categories from here, or switch to a different tab to run your search against. Each tab gives a different perspective on the data.

  3. Logging in and Registering

    If you have an account on FDI Lab - SciCrunch.org, you can log in from here to get additional features such as Collections, Saved Searches, and Resource management.

  4. Searching

    This is the search term being executed; you can type in anything you want to search for. Some tips to help with searching (example queries follow this tutorial list):

    1. Use quotes around phrases you want to match exactly
    2. Combine terms with AND and OR to change how we search between words
    3. Add "-" to a term to exclude results containing that term (e.g., Cerebellum -CA1)
    4. Add "+" to a term to require that it appear in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

    You can save any searches you perform for quick access later.

  6. Query Expansion

    We recognized your search term and included synonyms and inferred terms alongside your term to help find the data you are looking for.

  7. Collections

    If you are logged into FDI Lab - SciCrunch.org you can add data records to your collections to create custom spreadsheets across multiple sources of data.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

    If you have any further questions, please check out our FAQs page to ask questions and see our tutorials. Click this button to view this tutorial again.

Publications Per Year (interactive chart omitted; axes: Year, Count)