Searching across hundreds of databases

This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.
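
For illustration, the Python sketch below shows one way RRID-style citations could be spotted in a paper's text. The pattern is only an approximation of the common "RRID:" prefix format, and the identifier in the example is made up; this is not the service's actual matching logic.

    import re

    # Approximate pattern for RRID-style identifiers: the "RRID:" prefix,
    # an authority code, and an accession. Illustrative assumption only.
    RRID_PATTERN = re.compile(r"RRID:\s?[A-Za-z]+[_:][A-Za-z0-9:_-]+")

    text = "Antibodies were validated (hypothetical example: RRID:AB_0000001)."
    print(RRID_PATTERN.findall(text))  # -> ['RRID:AB_0000001']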

Page 1: showing papers 1–3 of 3.

Observation of others' actions during limb immobilization prevents the subsequent decay of motor performance.

  • Doriana De Marco et al.
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2021

There is rich clinical evidence that observing normally executed actions promotes the recovery of the corresponding action execution in patients with motor deficits. In this study, we assessed the ability of action observation to prevent the decay of healthy individuals' motor abilities following upper-limb immobilization. To this end, upper-limb kinematics was recorded in healthy participants while they performed three reach-to-grasp movements before immobilization and the same movements after 16 h of immobilization. The participants were subdivided into two groups; the experimental group observed, during the immobilization, the same reach-to-grasp movements they had performed before immobilization, whereas the control group observed natural scenarios. After bandage removal, motor impairment in performing reach-to-grasp movements was milder in the experimental group. These findings support the hypothesis that action observation, via the mirror mechanism, plays a protective role against the decline of motor performance induced by limb nonuse. From this perspective, action observation therapy is a promising tool for anticipating rehabilitation onset in clinical conditions involving limb nonuse, thus reducing the burden of further rehabilitation.


Agent-based representations of objects and actions in the monkey pre-supplementary motor area.

  • Alessandro Livi et al.
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2019

Information about objects around us is essential for planning actions and for predicting those of others. Here, we studied pre-supplementary motor area F6 neurons with a task in which monkeys viewed and grasped (or refrained from grasping) objects, and then observed a human doing the same task. We found "action-related neurons" encoding selectively monkey's own action [self-type (ST)], another agent's action [other-type (OT)], or both [self- and other-type (SOT)]. Interestingly, we found "object-related neurons" exhibiting the same type of selectivity before action onset: Indeed, distinct sets of neurons discharged when visually presented objects were targeted by the monkey's own action (ST), another agent's action (OT), or both (SOT). Notably, object-related neurons appear to signal self and other's intention to grasp and the most likely grip type that will be performed, whereas action-related neurons encode a general goal attainment signal devoid of any specificity for the observed grip type. Time-resolved cross-modal population decoding revealed that F6 neurons first integrate information about object and context to generate an agent-shared signal specifying whether and how the object will be grasped, which progressively turns into a broader agent-based goal attainment signal during action unfolding. Importantly, shared representation of objects critically depends upon their location in the observer's peripersonal space, suggesting an "object-mirroring" mechanism through which observers could accurately predict others' impending action by recruiting the same motor representation they would activate if they were to act upon the same object in the same context.


Architectural experience influences the processing of others' body expressions.

  • Paolo Presti et al.
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2023

The interplay between space and cognition is a crucial issue in Neuroscience leading to the development of multiple research fields. However, the relationship between architectural space and the movement of the inhabitants and their interactions has been too often neglected, failing to provide a unifying view of architecture's capacity to modulate social cognition broadly. We bridge this gap by requesting participants to judge avatars' emotional expression (high vs. low arousal) at the end of their promenade inside high- or low-arousing architectures. Stimuli were presented in virtual reality to ensure a dynamic, naturalistic experience. High-density electroencephalography (EEG) was recorded to assess the neural responses to the avatar's presentation. Observing highly aroused avatars increased Late Positive Potentials (LPP), in line with previous evidence. Strikingly, 250 ms before the occurrence of the LPP, P200 amplitude increased due to the experience of low-arousing architectures, reflecting an early greater attention during the processing of body expressions. In addition, participants stared longer at the avatar's head and judged the observed posture as more arousing. Source localization highlighted a contribution of the dorsal premotor cortex to both P200 and LPP. In conclusion, the immersive and dynamic architectural experience modulates human social cognition. In addition, the motor system plays a role in processing architecture and body expressions suggesting that the space and social cognition interplay is rooted in overlapping neural substrates. This study demonstrates that the manipulation of mere architectural space is sufficient to influence human social cognition.


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

    You are currently on the Community Resources tab, looking through the categories and sources that FDI Lab - SciCrunch.org has compiled. You can navigate through those categories from here, or switch to a different tab to run your search against. Each tab gives a different perspective on the data.

  3. Logging in and Registering

    If you have an account on FDI Lab - SciCrunch.org, you can log in from here to access additional features such as Collections, Saved Searches, and Resource management.

  4. Searching

    This is the search term being executed; you can type in anything you want to search for. Some tips to help with searching (an example combining these operators appears after this list):

    1. Use quotes around phrases you want to match exactly
    2. You can manually add AND and OR between terms to change how we combine them
    3. You can add "-" to a term to make sure no results return with that term in them (ex. Cerebellum -CA1)
    4. You can add "+" to a term to require that it appears in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

    From here you can save any search you perform for quick access later.

  6. Query Expansion

    We recognized your search term and included synonyms and inferred terms alongside your term to help find the data you are looking for.

  7. Collections

    If you are logged into FDI Lab - SciCrunch.org, you can add data records to your collections to create custom spreadsheets spanning multiple sources of data.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

    If you have any further questions, please check out our FAQs Page to ask questions and see our tutorials. Click this button to view this tutorial again.
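
As referenced in the Searching tips above, the Python sketch below shows how the quoted-phrase, AND/OR, "-", and "+" operators can be combined into a single query string. The URL is a placeholder for illustration only, not a documented SciCrunch endpoint.

    from urllib.parse import urlencode

    # Combine the operators from the Searching tips: an exact phrase,
    # a required term ("+"), and an excluded term ("-").
    query = '"action observation" AND +RRID -review'

    # Placeholder URL; substitute the actual search page you are using.
    print("https://example.org/search?" + urlencode({"q": query}))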

Publications Per Year

[Chart: publication count by year]