Searching across hundreds of databases



This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.


Showing papers 1-3 of 3.

Agent-based representations of objects and actions in the monkey pre-supplementary motor area.

  • Alessandro Livi et al.
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2019

Information about objects around us is essential for planning actions and for predicting those of others. Here, we studied pre-supplementary motor area F6 neurons with a task in which monkeys viewed and grasped (or refrained from grasping) objects, and then observed a human doing the same task. We found "action-related neurons" encoding selectively monkey's own action [self-type (ST)], another agent's action [other-type (OT)], or both [self- and other-type (SOT)]. Interestingly, we found "object-related neurons" exhibiting the same type of selectivity before action onset: Indeed, distinct sets of neurons discharged when visually presented objects were targeted by the monkey's own action (ST), another agent's action (OT), or both (SOT). Notably, object-related neurons appear to signal self and other's intention to grasp and the most likely grip type that will be performed, whereas action-related neurons encode a general goal attainment signal devoid of any specificity for the observed grip type. Time-resolved cross-modal population decoding revealed that F6 neurons first integrate information about object and context to generate an agent-shared signal specifying whether and how the object will be grasped, which progressively turns into a broader agent-based goal attainment signal during action unfolding. Importantly, shared representation of objects critically depends upon their location in the observer's peripersonal space, suggesting an "object-mirroring" mechanism through which observers could accurately predict others' impending action by recruiting the same motor representation they would activate if they were to act upon the same object in the same context.


Stable readout of observed actions from format-dependent activity of monkey's anterior intraparietal neurons.

  • Marco Lanzilotto et al.
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2020

Humans accurately identify observed actions despite large dynamic changes in their retinal images and a variety of visual presentation formats. A large network of brain regions in primates participates in the processing of others' actions, with the anterior intraparietal area (AIP) playing a major role in routing information about observed manipulative actions (OMAs) to the other nodes of the network. This study investigated whether the AIP also contributes to invariant coding of OMAs across different visual formats. We recorded AIP neuronal activity from two macaques while they observed videos portraying seven manipulative actions (drag, drop, grasp, push, roll, rotate, squeeze) in four visual formats. Each format resulted from the combination of two actor's body postures (standing, sitting) and two viewpoints (lateral, frontal). Out of 297 recorded units, 38% were OMA-selective in at least one format. Robust population code for viewpoint and actor's body posture emerged shortly after stimulus presentation, followed by OMA selectivity. Although we found no fully invariant OMA-selective neuron, we discovered a population code that allowed us to classify action exemplars irrespective of the visual format. This code depends on a multiplicative mixing of signals about OMA identity and visual format, particularly evidenced by a set of units maintaining a relatively stable OMA selectivity across formats despite considerable rescaling of their firing rate depending on the visual specificities of each format. These findings suggest that the AIP integrates format-dependent information and the visual features of others' actions, leading to a stable readout of observed manipulative action identity.


Local and system mechanisms for action execution and observation in parietal and premotor cortices.

  • Carolina G Ferroni et al.
  • Current Biology: CB
  • 2021

The action observation network (AON) includes a system of brain areas largely shared with action execution in both human and nonhuman primates. Yet temporal and tuning specificities of distinct areas and of physiologically identified neuronal classes in the encoding of self and others' action remain unknown. We recorded the activity of 355 single units from three crucial nodes of the AON, the anterior intraparietal area (AIP), and premotor areas F5 and F6, while monkeys performed a Go/No-Go grasping task and observed an experimenter performing it. At the system level, during task execution, F6 displays a prevalence of suppressed neurons and signals whether an action has to be performed, whereas AIP and F5 share a prevalence of facilitated neurons and remarkable target selectivity; during task observation, F5 stands out for its unique prevalence of facilitated neurons and its stronger and earlier modulation than AIP and F6. By applying unsupervised clustering of spike waveforms, we found distinct cell classes unevenly distributed across areas, with different firing properties and carrying specific visuomotor signals. Broadly spiking neurons exhibited a balanced amount of facilitated and suppressed activity during action execution and observation, whereas narrower spiking neurons showed more mutually facilitated responses during the execution of one's own and others' action, particularly in areas AIP and F5. Our findings elucidate the time course of activity and firing properties of neurons in the AON during one's own and others' action, from the system level of anatomically distinct areas to the local level of physiologically distinct cell classes.


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

    You are currently on the Community Resources tab, looking through the categories and sources that FDI Lab - SciCrunch.org has compiled. You can navigate through those categories from here, or switch to a different tab to run your search against it. Each tab gives a different perspective on the data.

  3. Logging in and Registering

    If you have an account on FDI Lab - SciCrunch.org, you can log in here to access additional features such as Collections, Saved Searches, and Resource management.

  4. Searching

    This is the search term being executed; you can type in anything you want to search for. Some tips to help with searching:

    1. Use quotes around phrases you want to match exactly
    2. You can manually AND and OR terms to change how words are combined in the search
    3. You can add "-" to a term to exclude results containing it (e.g., Cerebellum -CA1)
    4. You can add "+" to a term to require that it appear in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

    You can save any search you perform here for quick access later.

  6. Query Expansion

    We recognized your search term and included synonyms and inferred terms alongside it to help find the data you are looking for.

  7. Collections

    If you are logged in to FDI Lab - SciCrunch.org, you can add data records to your collections to create custom spreadsheets spanning multiple data sources.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

    If you have any further questions please check out our FAQs Page to ask questions and see our tutorials. Click this button to view this tutorial again.
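The search-syntax tips in the "Searching" section above (quoted phrases, manual AND/OR, "-" to exclude, "+" to require) can be sketched as a small query builder. This is a minimal illustration only; `build_query` is a hypothetical helper, not part of SciCrunch, and it simply composes a query string using the documented operators.

```python
def build_query(phrases=(), required=(), excluded=(), any_of=()):
    """Compose a search string from the documented operators."""
    parts = [f'"{p}"' for p in phrases]       # quotes force exact phrase match
    parts += [f'+{t}' for t in required]      # "+" requires the term in the data
    parts += [f'-{t}' for t in excluded]      # "-" excludes results with the term
    if any_of:
        parts.append(" OR ".join(any_of))     # manual OR between alternatives
    return " ".join(parts)

query = build_query(phrases=["motor cortex"],
                    excluded=["CA1"],
                    any_of=["monkey", "macaque"])
print(query)  # "motor cortex" -CA1 monkey OR macaque
```

A query like this would, for example, match papers mentioning the exact phrase "motor cortex" in either monkeys or macaques while filtering out results that mention CA1.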

Publications Per Year

[Chart: number of publications per year]