Searching across hundreds of databases

This service searches only literature that cites resources. Note that the set of searchable documents is limited to papers containing RRIDs and does not include all open-access literature.
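
Since indexing here hinges on a paper actually containing an RRID, it can be useful to check a manuscript for them before assuming it will be findable. The Python sketch below scans text for RRID-like tokens; the regular expression is an assumption based on the common "RRID:PREFIX_identifier" form (e.g., RRID:AB_2138153, RRID:SCR_002823) and is not the exact pattern this service uses.

```python
import re

# Assumed pattern for the common "RRID:PREFIX_identifier" form
# (e.g., RRID:AB_2138153 or RRID:SCR_002823); not the service's own matcher.
RRID_PATTERN = re.compile(r"RRID:\s*([A-Za-z]+[_:][A-Za-z0-9._-]+)")

def find_rrids(text: str) -> list[str]:
    """Return every RRID-like identifier found in the given text."""
    return RRID_PATTERN.findall(text)

sample = ("Antibodies were validated (RRID:AB_2138153) and data were "
          "analyzed in FSL (RRID:SCR_002823).")
print(find_rrids(sample))  # ['AB_2138153', 'SCR_002823']
```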

Page 1: showing papers 1–4 of 4.

Precise discrimination of object position in the human pulvinar.

  • Jason Fischer et al.
  • Human Brain Mapping
  • 2009

Very little is known about the human pulvinar; suggestions for its function include relaying input from cortical areas, allocating visual attention, supporting feature binding, and other integrative processes. The diversity of hypotheses about pulvinar function highlights our lack of understanding of its basic role. A conspicuously missing piece of information is whether the human pulvinar encodes visual information topographically. The answer to this question is crucial, as it dramatically constrains the sorts of computational and cognitive processes that the pulvinar might carry out. Here we used fMRI to test for position-sensitive encoding in the human pulvinar. Subjects passively viewed flickering Gabor stimuli, and as the spatial separation between Gabors increased, the correlation between patterns of activity across voxels within the right pulvinar decreased significantly. The results demonstrate the existence of precise topographic coding in the human pulvinar lateralized to the right hemisphere, and provide a means of functionally localizing this topographic region.


Serial dependence revealed in history-dependent perceptual templates.

  • Yuki Murai et al.
  • Current Biology
  • 2021

In any given perceptual task, the visual system selectively weighs or filters incoming information. The particular set of weights or filters form a kind of template, which reveals the regions or types of information that are particularly useful for a given perceptual decision.1,2 Unfortunately, sensory input is noisy and ever changing. To compensate for these fluctuations, the visual system could adopt a strategy of biasing the templates such that they reflect a temporal smoothing of input, which would be a form of serial dependence.3-5 Here, we demonstrate that perceptual templates are, in fact, altered by serial dependence. Using a simple orientation detection task and classification-image technique, we found that perceptual templates are systematically biased toward previously seen, task-irrelevant orientations. The results of an orientation discrimination task suggest that this shift in perceptual template derives from a change in the perceptual appearance of orientation. Our study reveals how serial dependence biases internal templates of orientation and suggests that the sensitivity of classification-image techniques in general could be improved by taking into account history-dependent fluctuations in templates.


Stimulus-Specific Individual Differences in Holistic Perception of Mooney Faces.

  • Teresa Canas-Bajo et al.
  • Frontiers in Psychology
  • 2020

Humans perceive faces holistically rather than as a set of separate features. Previous work demonstrates that some individuals are better at this holistic type of processing than others. Here, we show that there are unique individual differences in holistic processing of specific Mooney faces. We operationalized the increased difficulty of recognizing a face when inverted compared to upright as a measure of the degree to which individual Mooney faces were processed holistically by individual observers. Our results show that Mooney faces vary considerably in the extent to which they tap into holistic processing; some Mooney faces require holistic processing more than others. Importantly, there is little between-subject agreement about which faces are processed holistically; specific faces that are processed holistically by one observer are not by other observers. Essentially, what counts as holistic for one person is unique to that particular observer. Interestingly, we found that the per-face, per-observer differences in face discrimination only occurred for harder Mooney faces that required relatively more holistic processing. These findings suggest that holistic processing of hard Mooney faces depends on a particular observer's experience whereas processing of easier, cartoon-like Mooney faces can proceed universally for everyone. Future work using Mooney faces in perception research should take these stimulus-specific individual differences into account to best isolate holistic processing.


Serial dependence in visual perception.

  • Jason Fischer et al.
  • Nature Neuroscience
  • 2014

Visual input often arrives in a noisy and discontinuous stream, owing to head and eye movements, occlusion, lighting changes, and many other factors. Yet the physical world is generally stable; objects and physical characteristics rarely change spontaneously. How then does the human visual system capitalize on continuity in the physical environment over time? We found that visual perception in humans is serially dependent, using both prior and present input to inform perception at the present moment. Using an orientation judgment task, we found that, even when visual input changed randomly over time, perceived orientation was strongly and systematically biased toward recently seen stimuli. Furthermore, the strength of this bias was modulated by attention and tuned to the spatial and temporal proximity of successive stimuli. These results reveal a serial dependence in perception characterized by a spatiotemporally tuned, orientation-selective operator-which we call a continuity field-that may promote visual stability over time.


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

You are currently on the Community Resources tab, looking through categories and sources that FDI Lab - SciCrunch.org has compiled. From here you can navigate through those categories or switch to a different tab to run your search against. Each tab gives a different perspective on the data.

  3. Logging in and Registering

If you have an account on FDI Lab - SciCrunch.org, you can log in from here to access additional features such as Collections, Saved Searches, and Resource management.

  4. Searching

This is the search term being executed; you can type in anything you want to search for. Some tips to help with searching (a query-building sketch follows this walkthrough):

    1. Use quotes around phrases you want to match exactly
    2. You can manually combine terms with AND and OR to change how the words are searched
    3. You can add "-" to a term to exclude results containing it (e.g., Cerebellum -CA1)
    4. You can add "+" to a term to require that it appear in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

From here you can save any searches you perform for quick access later.

  6. Query Expansion

We recognized your search term and included synonyms and inferred terms alongside it to help find the data you are looking for.

  7. Collections

If you are logged in to FDI Lab - SciCrunch.org, you can add data records to your collections to create custom spreadsheets across multiple sources of data.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

If you have any further questions, please check out our FAQs page to ask questions and see our tutorials. Click this button to view this tutorial again.
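
The search-syntax tips in step 4 can also be applied when assembling queries outside the search box. The Python sketch below builds a query string using those conventions (quotes for exact phrases, "+" to require a term, "-" to exclude one); the function name and structure are hypothetical, and the output is simply what you would paste into the search field.

```python
def build_query(phrases=(), required=(), excluded=(), keywords=()):
    """Assemble a search-box query string following the tips in step 4."""
    parts = [f'"{p}"' for p in phrases]    # tip 1: quotes for exact-phrase matches
    parts += list(keywords)                # plain keywords; combine with AND/OR yourself (tip 2)
    parts += [f"+{t}" for t in required]   # tip 4: "+" requires the term in the data
    parts += [f"-{t}" for t in excluded]   # tip 3: "-" excludes results containing the term
    return " ".join(parts)

# Example: the exact phrase "serial dependence", requiring "orientation",
# and excluding "CA1".
print(build_query(phrases=["serial dependence"],
                  required=["orientation"],
                  excluded=["CA1"]))
# Output: "serial dependence" +orientation -CA1
```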

Publications Per Year (chart of publication count by year)