This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.
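For context, an RRID is a Research Resource Identifier cited in the text of a paper, written as "RRID:" followed by a registry prefix and an accession (for example, for an antibody or a software tool). The sketch below is a minimal, illustrative way to spot RRID-style identifiers in text; the regular expression and the sample accession numbers are assumptions about the general format, not this service's actual matching rules.

import re

# Minimal, illustrative matcher for RRID-style identifiers
# ("RRID:<registry prefix><separator><accession>"). This is an assumption
# about the general format, not the service's actual matching logic.
RRID_PATTERN = re.compile(r"RRID:\s*([A-Za-z]+(?:_[A-Za-z]+)*[_:][A-Za-z0-9_-]+)")

def find_rrids(text):
    # Return all RRID-like identifiers found in a block of text.
    return ["RRID:" + match for match in RRID_PATTERN.findall(text)]

# Hypothetical accession numbers, used only to show the citation style.
sample = ("Antibodies (RRID:AB_123456) and software tools (RRID:SCR_000001) "
          "were used throughout the methods.")
print(find_rrids(sample))  # ['RRID:AB_123456', 'RRID:SCR_000001']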

Page 1: showing papers 1–4 of 4

Enhanced odor discrimination and impaired olfactory memory by spatially controlled switch of AMPA receptors.

  • Derya R Shimshek‎ et al.
  • PLoS biology‎
  • 2005‎

Genetic perturbations of alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionate receptors (AMPARs) are widely used to dissect molecular mechanisms of sensory coding, learning, and memory. In this study, we investigated the role of Ca2+-permeable AMPARs in olfactory behavior. AMPAR modification was obtained by depletion of the GluR-B subunit or expression of unedited GluR-B(Q), both leading to increased Ca2+ permeability of AMPARs. Mice with this functional AMPAR switch, specifically in forebrain, showed enhanced olfactory discrimination and more rapid learning in a go/no-go operant conditioning task. Olfactory memory, however, was dramatically impaired. GluR-B depletion in forebrain was ectopically variable ("mosaic") among individuals and strongly correlated with decreased olfactory memory in hippocampus and cortex. Accordingly, memory was rescued by transgenic GluR-B expression restricted to piriform cortex and hippocampus, while enhanced odor discrimination was independent of both GluR-B variability and transgenic GluR-B expression. Thus, correlated differences in behavior and levels of GluR-B expression allowed a mechanistic and spatial dissection of olfactory learning, discrimination, and memory capabilities.


AutonoMouse: High throughput operant conditioning reveals progressive impairment with graded olfactory bulb lesions.

  • Andrew Erskine‎ et al.
  • PloS one‎
  • 2019‎

Operant conditioning is a crucial tool in neuroscience research for probing brain function. While molecular, anatomical and even physiological techniques have seen radical increases in throughput, efficiency, and reproducibility in recent years, behavioural tools have somewhat lagged behind. Here we present a fully automated, high-throughput system for self-initiated conditioning of up to 25 group-housed, radio-frequency identification (RFID) tagged mice over periods of several months and >10^6 trials. We validate this "AutonoMouse" system in a series of olfactory behavioural tasks and show that acquired data is comparable to previous semi-manual approaches. Furthermore, we use AutonoMouse to systematically probe the impact of graded olfactory bulb lesions on olfactory behaviour, demonstrating that while odour discrimination in general is robust to even the most extensive disruptions, small olfactory bulb lesions already impair odour detection. Discrimination learning of similar mixtures as well as learning speed are in turn reliably impacted by medium lesion sizes. The modular nature and open-source design of AutonoMouse should allow for similar robust and systematic assessments across neuroscience research areas.


High-Throughput Automated Olfactory Phenotyping of Group-Housed Mice.

  • Janine K Reinert‎ et al.
  • Frontiers in behavioral neuroscience‎
  • 2019‎

Behavioral phenotyping of mice is often compromised by manual interventions of the experimenter and limited throughput. Here, we describe a fully automated behavior setup that allows for quantitative analysis of mouse olfaction with minimized experimenter involvement. Mice are group-housed and tagged with unique RFID chips. They can freely initiate trials and are automatically trained on a go/no-go task, learning to distinguish a rewarded from an unrewarded odor. Further, odor discrimination tasks and detailed training aspects can be set for each animal individually for automated execution without direct experimenter intervention. The procedure described here, from initial RFID implantation to discrimination of complex odor mixtures at high accuracy, can be completed within <2 months with cohorts of up to 25 male mice. Apart from the presentation of monomolecular odors, the setup can generate arbitrary mixtures and dilutions from any set of odors to create complex stimuli, enabling demanding behavioral analyses at high-throughput.


Perceptual judgements and chronic imaging of altered odour maps indicate comprehensive stimulus template matching in olfaction.

  • Edward F Bracey‎ et al.
  • Nature communications‎
  • 2013‎

Lesion experiments suggest that odour input to the olfactory bulb contains significant redundant signal such that rodents can discern odours using minimal stimulus-related information. Here we investigate the dependence of odour-quality perception on the integrity of glomerular activity by comparing odour-evoked activity maps before and after epithelial lesions. Lesions prevent mice from recognizing previously experienced odours and differentially delay discrimination learning of unrecognized and novel odour pairs. Poor recognition results not from mice experiencing an altered concentration of an odour but from perception of apparent novel qualities. Consistent with this, relative intensity of glomerular activity following lesions is altered compared with maps recorded in shams and by varying odour concentration. Together, these data show that odour recognition relies on comprehensively matching input patterns to a previously generated stimulus template. When encountering novel odours, access to all glomerular activity ensures rapid generation of new templates to perform accurate perceptual judgements.


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

    You are currently on the Community Resources tab, looking through categories and sources that FDI Lab - SciCrunch.org has compiled. You can navigate through those categories from here or switch to a different tab to search through. Each tab gives a different perspective on the data.

  3. Logging in and Registering

    If you have an account on FDI Lab - SciCrunch.org, you can log in from here to get additional features such as Collections, Saved Searches, and Resource management.

  4. Searching

    Here is the search term being executed; you can type in anything you want to search for. Some tips to help with searching (a query-building sketch follows this list):

    1. Use quotes around phrases you want to match exactly
    2. Combine terms with AND and OR to change how the search treats multiple words
    3. Add "-" to a term to exclude results containing it (e.g. Cerebellum -CA1)
    4. Add "+" to a term to require that it appears in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

    You can save any search you perform here for quick access later.

  6. Query Expansion

    We recognized your search term and included synonyms and inferred terms alongside your term to help you find the data you are looking for.

  7. Collections

    If you are logged into FDI Lab - SciCrunch.org you can add data records to your collections to create custom spreadsheets across multiple sources of data.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

    If you have any further questions please check out our FAQs Page to ask questions and see our tutorials. Click this button to view this tutorial again.
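
As a rough illustration of the query syntax described under "Searching" above, the sketch below assembles a query string using quoted phrases, AND/OR, and the "+"/"-" prefixes. The helper name and the way the pieces are joined are assumptions made for illustration; this is not the site's actual query parser.

# Illustrative only: builds a query string using the operators described in
# the Searching tips (quotes, AND/OR, "-" to exclude, "+" to require).
# The helper name and joining rules are assumptions, not SciCrunch's parser.
def build_query(phrases=(), required=(), excluded=(), any_of=()):
    parts = []
    parts += ['"%s"' % p for p in phrases]             # exact-phrase match
    parts += ["+%s" % t for t in required]             # term must be present
    parts += ["-%s" % t for t in excluded]             # term must be absent
    if any_of:
        parts.append("(" + " OR ".join(any_of) + ")")  # match any of these
    return " AND ".join(parts)

# Example: papers mentioning "olfactory bulb", requiring RFID,
# excluding CA1, and matching either mouse or mice.
print(build_query(phrases=["olfactory bulb"], required=["RFID"],
                  excluded=["CA1"], any_of=["mouse", "mice"]))
# -> "olfactory bulb" AND +RFID AND -CA1 AND (mouse OR mice)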

Publications Per Year (chart of publication count by year)