Searching across hundreds of databases

This service searches only literature that cites research resources. Note that the searchable corpus is limited to documents containing RRIDs and does not include all open-access literature.

Page 1: showing papers 1–9 of 9.

A Circuit for Integration of Head- and Visual-Motion Signals in Layer 6 of Mouse Primary Visual Cortex.

  • Mateo Vélez-Fort et al.
  • Neuron
  • 2018

To interpret visual-motion events, the underlying computation must involve internal reference to the motion status of the observer's head. We show here that layer 6 (L6) principal neurons in mouse primary visual cortex (V1) receive a diffuse, vestibular-mediated synaptic input that signals the angular velocity of horizontal rotation. Behavioral and theoretical experiments indicate that these inputs, distributed over a network of 100 L6 neurons, provide both a reliable estimate and, therefore, physiological separation of head-velocity signals. During head rotation in the presence of visual stimuli, L6 neurons exhibit postsynaptic responses that approximate the arithmetic sum of the vestibular and visual-motion response. Functional input mapping reveals that these internal motion signals arrive into L6 via a direct projection from the retrosplenial cortex. We therefore propose that visual-motion processing in V1 L6 is multisensory and contextually dependent on the motion status of the animal's head.
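
As a quick illustration of the additive-integration claim above, here is a minimal Python sketch using entirely synthetic response traces; the Gaussian shapes and noise level are hypothetical stand-ins, not the paper's intracellular data:

```python
import numpy as np

# Synthetic unimodal and combined responses for one hypothetical L6 neuron;
# the paper's measurements are real postsynaptic recordings, not simulations.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 200)                      # time, seconds
r_vestibular = np.exp(-((t - 0.8) ** 2) / 0.05)     # head rotation alone
r_visual = 0.6 * np.exp(-((t - 1.0) ** 2) / 0.08)   # visual motion alone
r_combined = r_vestibular + r_visual + rng.normal(0.0, 0.05, t.size)

# The abstract's claim: the multimodal response approximates the
# arithmetic sum of the two unimodal responses.
residual = r_combined - (r_vestibular + r_visual)
print(f"RMS deviation from linear sum: {np.sqrt(np.mean(residual ** 2)):.3f}")
```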


Innate heuristics and fast learning support escape route selection in mice.

  • Federico Claudi et al.
  • Current Biology
  • 2022

When faced with imminent danger, animals must rapidly take defensive actions to reach safety. Mice can react to threatening stimuli in ∼250 milliseconds [1] and, in simple environments, use spatial memory to quickly escape to shelter [2, 3]. Natural habitats, however, often offer multiple routes to safety that animals must identify and choose from [4]. This is challenging because although rodents can learn to navigate complex mazes [5, 6], learning the value of different routes through trial and error during escape could be deadly. Here, we investigated how mice learn to choose between different escape routes. Using environments with paths to shelter of varying length and geometry, we find that mice prefer options that minimize path distance and angle relative to the shelter. This strategy is already present during the first threat encounter and after only ∼10 minutes of exploration in a novel environment, indicating that route selection does not require experience of escaping. Instead, an innate heuristic assigns survival value to each path after rapidly learning the spatial environment. This route selection process is flexible and allows quick adaptation to arenas with dynamic geometries. Computational modeling shows that model-based reinforcement learning agents replicate the observed behavior in environments where the shelter location is rewarding during exploration. These results show that mice combine fast spatial learning with innate heuristics to choose escape routes with the highest survival value. The results further suggest that integrating prior knowledge acquired through evolution with knowledge learned from experience supports adaptation to changing environments and minimizes the need for trial and error when the errors are costly.
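
The "innate heuristic" can be pictured as a scoring rule over candidate routes. Below is a hedged Python sketch in which survival value is a weighted cost of path distance and shelter angle; the weights and the `route_score` function are hypothetical illustrations, not the paper's fitted model:

```python
def route_score(path_length_cm: float, angle_to_shelter_deg: float,
                w_dist: float = 1.0, w_angle: float = 0.5) -> float:
    """Lower score = higher survival value. Weights are hypothetical."""
    return w_dist * path_length_cm + w_angle * abs(angle_to_shelter_deg)

# Two candidate escape routes in a hypothetical arena.
routes = {
    "left":  {"path_length_cm": 80.0, "angle_to_shelter_deg": 15.0},
    "right": {"path_length_cm": 95.0, "angle_to_shelter_deg": 40.0},
}
chosen = min(routes, key=lambda r: route_score(**routes[r]))
print(f"Escape via the {chosen} route")  # -> left: shorter and better aligned
```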


A system for tracking whisker kinematics and whisker shape in three dimensions.

  • Rasmus S Petersen et al.
  • PLoS Computational Biology
  • 2020

Quantification of behaviour is essential for biology. Since the whisker system is a popular model, it is important to have methods for measuring whisker movements from behaving animals. Here, we developed a high-speed imaging system that measures whisker movements simultaneously from two vantage points. We developed a whisker tracker algorithm that automatically reconstructs 3D whisker information directly from the 'stereo' video data. The tracker is controlled via a Graphical User Interface that also allows user-friendly curation. The algorithm tracks whiskers, by fitting a 3D Bezier curve to the basal section of each target whisker. By using prior knowledge of natural whisker motion and natural whisker shape to constrain the fits and by minimising the number of fitted parameters, the algorithm is able to track multiple whiskers in parallel with low error rate. We used the output of the tracker to produce a 3D description of each tracked whisker, including its 3D orientation and 3D shape, as well as bending-related mechanical force. In conclusion, we present a non-invasive, automatic system to track whiskers in 3D from high-speed video, creating the opportunity for comprehensive 3D analysis of sensorimotor behaviour and its neural basis.
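
To make the fitting step concrete, here is a minimal Python sketch of a least-squares quadratic Bézier fit to ordered 3D points. It uses a simple chord-length parameterisation and omits the priors on whisker shape and motion that the published tracker relies on:

```python
import numpy as np

def fit_quadratic_bezier_3d(points: np.ndarray) -> np.ndarray:
    """Least-squares fit of a quadratic Bezier curve to ordered 3D points.

    points: (N, 3) array sampled along the whisker's basal section.
    Returns (3, 3) control points. Chord-length parameterisation is a
    simplification of the published algorithm.
    """
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(points, axis=0), axis=1))]
    t = (d / d[-1])[:, None]                                # parameter in [0, 1]
    # Bernstein basis for a quadratic Bezier: (1-t)^2, 2t(1-t), t^2
    B = np.hstack([(1 - t) ** 2, 2 * t * (1 - t), t ** 2])  # (N, 3)
    ctrl, *_ = np.linalg.lstsq(B, points, rcond=None)       # (3, 3)
    return ctrl

# Synthetic "whisker": a gently curved 3D arc.
s = np.linspace(0.0, 1.0, 50)
whisker = np.c_[20 * s, 5 * s ** 2, 2 * s]
print(fit_quadratic_bezier_3d(whisker).round(2))
```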


Prediction of Choice from Competing Mechanosensory and Choice-Memory Cues during Active Tactile Decision Making.

  • Dario Campagner et al.
  • The Journal of Neuroscience
  • 2019

Perceptual decision making is an active process where animals move their sense organs to extract task-relevant information. To investigate how the brain translates sensory input into decisions during active sensation, we developed a mouse active touch task where the mechanosensory input can be precisely measured and that challenges animals to use multiple mechanosensory cues. Male mice were trained to localize a pole using a single whisker and to report their decision by selecting one of three choices. Using high-speed imaging and machine vision, we estimated whisker-object mechanical forces at millisecond resolution. Mice solved the task by a sensory-motor strategy where both the strength and direction of whisker bending were informative cues to pole location. We found competing influences of immediate sensory input and choice memory on mouse choice. On correct trials, choice could be predicted from the direction and strength of whisker bending, but not from previous choice. In contrast, on error trials, choice could be predicted from previous choice but not from whisker bending. This study shows that animal choices during active tactile decision making can be predicted from mechanosensory and choice-memory signals, and provides a new task well suited for the future study of the neural basis of active perceptual decisions.

SIGNIFICANCE STATEMENT: Due to the difficulty of measuring the sensory input to moving sense organs, active perceptual decision making remains poorly understood. The whisker system provides a way forward since it is now possible to measure the mechanical forces due to whisker-object contact during behavior. Here we train mice in a novel behavioral task that challenges them to use rich mechanosensory cues but can be performed using one whisker and enables task-relevant mechanical forces to be precisely estimated. This approach enables rigorous study of how sensory cues translate into action during active, perceptual decision making. Our findings provide new insight into active touch and how sensory/internal signals interact to determine behavioral choices.
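
As a sketch of the prediction analysis the abstract describes, the following Python example fits a logistic regression to synthetic trials with whisker-bending and previous-choice features. The feature names and effect sizes are hypothetical, and the choice is binary here for simplicity, whereas the task offered three choices:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic trials: whisker-bending strength/direction and previous choice.
rng = np.random.default_rng(1)
n = 300
bend_strength = rng.normal(0, 1, n)    # e.g. peak bending magnitude
bend_direction = rng.normal(0, 1, n)   # signed bending direction
prev_choice = rng.integers(0, 2, n)    # choice on the previous trial

# Simulate choices driven mainly by the sensory cues ("correct-trial" regime).
p = 1 / (1 + np.exp(-(1.5 * bend_strength + 1.0 * bend_direction)))
choice = (rng.random(n) < p).astype(int)

X = np.c_[bend_strength, bend_direction, prev_choice]
model = LogisticRegression().fit(X, choice)
print(dict(zip(["strength", "direction", "prev_choice"], model.coef_[0].round(2))))
```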


Control of fear extinction by hypothalamic melanin-concentrating hormone-expressing neurons.

  • Cristina Concetti et al.
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2020

Learning to fear danger is essential for survival. However, overactive, relapsing fear behavior in the absence of danger is a hallmark of disabling anxiety disorders that affect millions of people. Its suppression is thus of great interest, but the necessary brain components remain incompletely identified. We studied fear suppression through a procedure in which, after acquiring fear of aversive events (fear learning), subjects were exposed to fear-eliciting cues without aversive events (safety learning), leading to suppression of fear behavior (fear extinction). Here we show that inappropriate, learning-resistant fear behavior results from disruption of brain components not previously implicated in this disorder: hypothalamic melanin-concentrating hormone-expressing neurons (MNs). Using real-time recordings of MNs across fear learning and extinction, we provide evidence that fear-inducing aversive events elevate MN activity. We find that optogenetic disruption of this MN activity profoundly impairs safety learning, abnormally slowing down fear extinction and exacerbating fear relapse. Importantly, we demonstrate that the MN disruption impairs neither fear learning nor related sensory responses, indicating that MNs differentially control safety and fear learning. Thus, we identify a neural substrate for inhibition of excessive fear behavior.


The peripheral olfactory code in Drosophila larvae contains temporal information and is robust over multiple timescales.

  • Micheline Grillet et al.
  • Proceedings of the Royal Society B: Biological Sciences
  • 2016

We studied the electrophysiological activity of two classes of Drosophila melanogaster larval olfactory sensory neurons (OSNs), Or24a and Or74a, in response to 1 s stimulation with butanol, octanol, 2-heptanone, and propyl acetate. Each odour/OSN combination produced unique responses in terms of spike count and temporal profile. We used a classifier algorithm to explore the information content of OSN activity, and showed that as well as spike count, the activity of these OSNs included temporal information that enabled the classifier to accurately identify odours. The responses of OSNs during continuous odour exposure (5 and 20 min) showed that both types of neuron continued to respond, with no complete adaptation, and with no change to their ability to encode temporal information. Finally, we exposed larvae to octanol for 3 days and found only minor quantitative changes in OSN response to odours, indicating that the larval peripheral code is robust when faced with long-term exposure to odours, such as would be found in a natural context.
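
A minimal way to see why temporal structure adds decodable information is to compare a classifier given only spike counts against one given the binned temporal profile. The Python sketch below uses synthetic Poisson responses and a k-nearest-neighbours classifier; it is not the paper's classifier algorithm:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic OSN responses: 4 odours, spikes binned over a 1 s stimulus.
# The temporal profiles are invented; the point is the feature comparison.
rng = np.random.default_rng(2)
n_trials, n_bins, odours = 40, 20, 4
profiles = rng.gamma(2.0, 1.0, (odours, n_bins))   # odour-specific PSTH shapes
X_time, y = [], []
for o in range(odours):
    X_time.append(rng.poisson(profiles[o], (n_trials, n_bins)))
    y += [o] * n_trials
X_time = np.vstack(X_time)
y = np.array(y)
X_count = X_time.sum(axis=1, keepdims=True)        # spike count only

for name, X in [("count only", X_count), ("count + timing", X_time)]:
    acc = cross_val_score(KNeighborsClassifier(5), X, y, cv=5).mean()
    print(f"{name}: {acc:.2f} decoding accuracy")
```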


Perceptual judgements and chronic imaging of altered odour maps indicate comprehensive stimulus template matching in olfaction.

  • Edward F Bracey et al.
  • Nature Communications
  • 2013

Lesion experiments suggest that odour input to the olfactory bulb contains significant redundant signal such that rodents can discern odours using minimal stimulus-related information. Here we investigate the dependence of odour-quality perception on the integrity of glomerular activity by comparing odour-evoked activity maps before and after epithelial lesions. Lesions prevent mice from recognizing previously experienced odours and differentially delay discrimination learning of unrecognized and novel odour pairs. Poor recognition results not from mice experiencing an altered concentration of an odour but from perception of apparent novel qualities. Consistent with this, relative intensity of glomerular activity following lesions is altered compared with maps recorded in shams and by varying odour concentration. Together, these data show that odour recognition relies on comprehensively matching input patterns to a previously generated stimulus template. When encountering novel odours, access to all glomerular activity ensures rapid generation of new templates to perform accurate perceptual judgements.
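
Template matching of this kind can be sketched as correlating an incoming glomerular activity map against stored odour templates, with sub-threshold matches treated as novel. Everything below (the threshold value, map sizes, and the `match_to_templates` helper) is a hypothetical illustration, not the paper's analysis:

```python
import numpy as np

def match_to_templates(activity_map: np.ndarray,
                       templates: dict[str, np.ndarray],
                       threshold: float = 0.7) -> str:
    """Match a glomerular activity map to stored odour templates by correlation.

    One reading of 'comprehensive template matching': recognition requires
    the whole input pattern to correlate with a stored template; otherwise
    the odour is perceived as novel and a new template would be formed.
    """
    best_name, best_r = "novel odour", -1.0
    for name, tpl in templates.items():
        r = np.corrcoef(activity_map.ravel(), tpl.ravel())[0, 1]
        if r > best_r:
            best_name, best_r = name, r
    return best_name if best_r >= threshold else "novel odour"

rng = np.random.default_rng(3)
templates = {"odour A": rng.random((8, 8)), "odour B": rng.random((8, 8))}
lesioned = templates["odour A"].copy()
lesioned[:4] = 0  # lesion silences part of the glomerular map
print(match_to_templates(templates["odour A"], templates))  # -> odour A
print(match_to_templates(lesioned, templates))  # -> novel (below threshold)
```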


Role of spontaneous and sensory orexin network dynamics in rapid locomotion initiation.

  • Mahesh M Karnani et al.
  • Progress in Neurobiology
  • 2020

Appropriate motor control is critical for normal life, and requires hypothalamic hypocretin/orexin neurons (HONs). HONs are slowly regulated by nutrients, but also display rapid (subsecond) activity fluctuations in vivo. The necessity of these activity bursts for sensorimotor control and their roles in specific phases of movement are unknown. Here we show that temporally-restricted optosilencing of spontaneous or sensory-evoked HON bursts disrupts locomotion initiation, but does not affect ongoing locomotion. Conversely, HON optostimulation initiates locomotion with subsecond delays in a frequency-dependent manner. Using 2-photon volumetric imaging of activity of >300 HONs during sensory stimulation and self-initiated locomotion, we identify several locomotion-related HON subtypes, which distinctly predict the probability of imminent locomotion initiation, display distinct sensory responses, and are differentially modulated by food deprivation. By causally linking HON bursts to locomotion initiation, these findings reveal the sensorimotor importance of rapid spontaneous and evoked fluctuations in HON ensemble activity.
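
A simple way to picture the "imminent locomotion" analysis is to compare HON population activity in the second before each locomotion onset against matched baseline windows. The Python sketch below is entirely synthetic; the frame rate, burst size, and onset times are invented:

```python
import numpy as np

# Synthetic HON population trace with a burst ~1 s before each onset.
rng = np.random.default_rng(5)
fs, n_sec = 10, 600                      # 10 Hz frames, 10 min session
t = np.arange(n_sec * fs)
hon = rng.normal(0, 1, t.size)           # population-mean activity
onsets = rng.choice(t[50:-50], 20, replace=False)
for o in onsets:
    hon[o - fs:o] += 2.0                  # pre-onset burst (hypothetical)

pre = np.array([hon[o - fs:o].mean() for o in onsets])
baseline_idx = rng.choice(t[50:-50], 20, replace=False)
base = np.array([hon[i - fs:i].mean() for i in baseline_idx])
print(f"pre-onset HON activity: {pre.mean():.2f}, baseline: {base.mean():.2f}")
```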


Multisensory coding of angular head velocity in the retrosplenial cortex.

  • Sepiedeh Keshavarzi et al.
  • Neuron
  • 2022

To successfully navigate the environment, animals depend on their ability to continuously track their heading direction and speed. Neurons that encode angular head velocity (AHV) are fundamental to this process, yet the contribution of various motion signals to AHV coding in the cortex remains elusive. By performing chronic single-unit recordings in the retrosplenial cortex (RSP) of the mouse and tracking the activity of individual AHV cells between freely moving and head-restrained conditions, we find that vestibular inputs dominate AHV signaling. Moreover, the addition of visual inputs onto these neurons increases the gain and signal-to-noise ratio of their tuning during active exploration. Psychophysical experiments and neural decoding further reveal that vestibular-visual integration increases the perceptual accuracy of angular self-motion and the fidelity of its representation by RSP ensembles. We conclude that while cortical AHV coding requires vestibular input, where possible, it also uses vision to optimize heading estimation during navigation.
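
The decoding comparison can be sketched with a linear readout of angular head velocity from a synthetic population, where adding vision is modelled as higher tuning gain and lower noise. The gain and noise values below are assumptions for illustration, not measured quantities:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic RSP population linearly tuned to angular head velocity (AHV).
rng = np.random.default_rng(4)
n_samples, n_cells = 500, 40
ahv = rng.uniform(-200, 200, n_samples)   # deg/s
slopes = rng.normal(0, 0.02, n_cells)     # per-cell AHV tuning (hypothetical)

def population(gain: float, noise: float) -> np.ndarray:
    """Population activity under a given tuning gain and noise level."""
    return gain * np.outer(ahv, slopes) + rng.normal(0, noise, (n_samples, n_cells))

for label, X in [("vestibular only", population(1.0, 1.0)),
                 ("vestibular + visual", population(1.5, 0.8))]:
    r2 = cross_val_score(Ridge(alpha=1.0), X, ahv, cv=5).mean()
    print(f"{label}: decoding R^2 = {r2:.2f}")
```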


[Publications Per Year — chart of paper count (Count) by publication year (Year)]