Searching across hundreds of databases

This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.

Page 1: showing papers 1-20 of 29.

Embodying the camera: An EEG study on the effect of camera movements on film spectators' sensorimotor cortex activation.

  • Katrin Heimann‎ et al.
  • PloS one‎
  • 2019‎

One key feature of film consists in its power to bodily engage the viewer. Previous research has suggested lens and camera movements to be among the most effective stylistic devices involved in such engagement. In an EEG experiment we assessed the role of such movements in modulating specific spectators' neural and experiential responses, likely reflecting such engagement. We produced short video clips of an empty room with a still, a zooming and a moving camera (steadicam) that might simulate the movement of an observer in different ways. We found an event-related desynchronization of the beta components of the rolandic mu rhythm that was stronger for the clips produced with the steadicam than for those produced with a still or zooming camera. No equivalent modulation in the attention-related occipital areas was found, thus confirming the sensorimotor nature of spectators' neural responses to the film clips. The present study provides the first empirical evidence that filmic means such as camera movements alone can modulate spectators' bodily engagement with film.
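For readers unfamiliar with the measure, event-related desynchronization (ERD) is conventionally expressed as the percentage decrease in band power relative to a pre-stimulus baseline. The sketch below illustrates that computation for a beta band assumed here to be 14-30 Hz; the sampling rate, epoch layout and condition names are placeholders for illustration, not the authors' pipeline.

    # Minimal sketch: beta-band event-related desynchronization (ERD) for one EEG channel.
    # `epochs` is assumed to have shape (n_trials, n_samples); `times` gives the epoch
    # time axis in seconds. All parameters are illustrative, not the study's settings.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def beta_erd(epochs, times, fs, band=(14.0, 30.0), baseline=(-1.0, 0.0)):
        """Percent power change from baseline; negative values indicate desynchronization."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
        filtered = filtfilt(b, a, epochs, axis=-1)                      # beta-band filter
        power = (np.abs(hilbert(filtered, axis=-1)) ** 2).mean(axis=0)  # trial-averaged power
        base = power[(times >= baseline[0]) & (times < baseline[1])].mean()
        return 100.0 * (power - base) / base                            # ERD/ERS time course

    # Hypothetical usage, e.g. comparing conditions over sensorimotor channels:
    # erd_steadicam = beta_erd(steadicam_epochs, times, fs=500.0)
    # erd_still     = beta_erd(still_epochs, times, fs=500.0)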


Pain Mirrors: Neural Correlates of Observing Self or Others' Facial Expressions of Pain.

  • Francesca Benuzzi‎ et al.
  • Frontiers in psychology‎
  • 2018‎

Facial expressions of pain are able to elicit empathy and adaptive behavioral responses in the observer. An influential theory posits that empathy relies on an affective mirror mechanism, according to which emotion recognition relies upon the internal simulation of motor and interoceptive states triggered by emotional stimuli. We tested this hypothesis comparing representations of self or others' expressions of pain in nineteen young healthy female volunteers by means of functional magnetic resonance imaging (fMRI). We hypothesized that one's own facial expressions are more likely to elicit the internal simulation of emotions, being more strictly related to self. Video-clips of the facial expressions of each volunteer receiving either painful or non-painful mechanical stimulations to their right hand dorsum were recorded and used as stimuli in a 2 × 2 (Self/Other; Pain/No-Pain) within-subject design. During each trial, a 2 s video clip was presented, displaying either the subject's own neutral or painful facial expressions (Self No-Pain, SNP; Self Pain, SP), or the expressions of other unfamiliar volunteers (Others' No-Pain, ONP; Others' Pain, OP), displaying a comparable emotional intensity. Participants were asked to indicate whether each video displayed a pain expression. fMRI signals were higher while viewing Pain than No-Pain stimuli in a large bilateral array of cortical areas including middle and superior temporal, supramarginal, superior mesial and inferior frontal (IFG) gyri, anterior insula (AI), anterior cingulate (ACC), and anterior mid-cingulate (aMCC) cortex, as well as right fusiform gyrus. Bilateral activations were also detected in thalamus and basal ganglia. The Self vs. Other contrast showed signal changes in ACC and aMCC, IFG, AI, and parietal cortex. A significant interaction between Self and Pain [(SP vs. SNP) >(OP vs. ONP)] was found in a pre-defined region of aMCC known to be also active during noxious stimulation. These findings demonstrate that the observation of one's own and others' facial expressions share a largely common neural network, but self-related stimuli induce generally higher activations. In line with our hypothesis, selectively greater activity for self pain-related stimuli was found in aMCC, a medial-wall region critical for pain perception and recognition.
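The key statistic here, the Self x Pain interaction [(SP vs. SNP) > (OP vs. ONP)], is a difference of differences between the four condition estimates. A purely illustrative computation (the values and names below are made up, not the study's data) is:

    # Illustrative arithmetic for the Self x Pain interaction in a 2 x 2 design.
    # beta_* stand for condition-wise response estimates (e.g., GLM betas) for one
    # region; the numbers are arbitrary placeholders.
    beta_SP, beta_SNP = 1.8, 0.6   # Self Pain, Self No-Pain (hypothetical)
    beta_OP, beta_ONP = 1.1, 0.4   # Others' Pain, Others' No-Pain (hypothetical)

    interaction = (beta_SP - beta_SNP) - (beta_OP - beta_ONP)
    print(interaction)             # positive: the pain effect is larger for Self than for Other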


Audio-visuomotor processing in the musician's brain: an ERP study on professional violinists and clarinetists.

  • Alice Mado Proverbio‎ et al.
  • Scientific reports‎
  • 2014‎

The temporal dynamics of brain activation during visual and auditory perception of congruent vs. incongruent musical video clips were investigated in 12 musicians from the Milan Conservatory of Music and 12 controls. 368 videos of a clarinetist and a violinist playing the same score with their instruments were presented. The sounds were similar in pitch, intensity, rhythm and duration. To produce an audiovisual discrepancy, in half of the trials, the visual information was incongruent with the soundtrack in pitch. ERPs were recorded from 128 sites. An N400-like negative deflection in response to the incongruent audiovisual information was elicited only in musicians, and only for their own instruments. SwLORETA applied to the N400 response identified the areas mediating multimodal motor processing: the prefrontal cortex, the right superior and middle temporal gyrus, the premotor cortex, the inferior frontal and inferior parietal areas, the EBA, somatosensory cortex, cerebellum and SMA. The data indicate the existence of audiomotor mirror neurons responding to incongruent visual and auditory information, thus suggesting that they may encode multimodal representations of musical gestures and sounds. These systems may underlie the ability to learn how to play a musical instrument.


Effects of emotional contexts on cerebello-thalamo-cortical activity during action observation.

  • Viridiana Mazzola‎ et al.
  • PloS one‎
  • 2013‎

Several studies have investigated the neural and functional mechanisms underlying action observation in contexts with objects. However, actions seen in everyday life are often embedded in emotional contexts. The neural systems integrating emotion cues in action observation are still poorly understood. Previous findings suggest that the processing of both action and emotion information recruits motor control areas within the cerebello-thalamo-cortical pathways. It is therefore hard to determine whether social emotional contexts influence action processing via a direct modulation of motor representations coding for the observed action or via the affective state and implicit motor preparedness elicited in observers in response to emotional contexts. Here we designed a novel fMRI task to identify neural networks engaged by the affective appraisal of a grasping action seen in two different emotional contexts, while keeping the action kinematics constant. Results confirmed that observing the same acts of grasping but in different emotional contexts modulated activity in the supplementary motor area, ventrolateral thalamus, and anterior cerebellum. Moreover, changes in functional connectivity between the left supplementary motor area and the parahippocampus in different emotional contexts suggested a direct neural pathway through which emotional contexts may drive the neural motor system. Taken together, these findings shed new light on the malleability of the motor system as a function of emotional contexts.


Comprehending body language and mimics: an ERP and neuroimaging study on Italian actors and viewers.

  • Alice Mado Proverbio‎ et al.
  • PloS one‎
  • 2014‎

In this study, the neural mechanism subserving the ability to understand people's emotional and mental states by observing their body language (facial expression, body posture and mimics) was investigated in healthy volunteers. ERPs were recorded in 30 Italian university students while they evaluated 280 pictures of highly ecological displays of emotional body language that were acted out by 8 male and female Italian actors. Pictures were briefly flashed and preceded by short verbal descriptions (e.g., "What a bore!") that were incongruent half of the time (e.g., a picture of a very attentive and concentrated person shown after the verbal description in the previous example). ERP data and source reconstruction indicated that the first recognition of incongruent body language occurred 300 ms post-stimulus. swLORETA performed on the N400 identified the strongest generators of this effect in the right rectal gyrus (BA11) of the ventromedial orbitofrontal cortex, the bilateral uncus (limbic system) and the cingulate cortex, the cortical areas devoted to face and body processing (STS, FFA, EBA) and the premotor cortex (BA6), which is involved in action understanding. These results indicate that face and body mimics undergo prioritized processing that is mostly represented in the affective brain and is rapidly compared with verbal information. This process is likely able to regulate social interactions by providing on-line information about the sincerity and trustfulness of others.


Binding action and emotion in social understanding.

  • Francesca Ferri‎ et al.
  • PloS one‎
  • 2013‎

In social life actions are tightly linked with emotions. The integration of affective- and action-related information has to be considered as a fundamental component of appropriate social understanding. The present functional magnetic resonance imaging study aimed at investigating whether an emotion (Happiness, Anger or Neutral) dynamically expressed by an observed agent modulates brain activity underlying the perception of his grasping action. As control stimuli, participants observed the same agent either only expressing an emotion or only performing a grasping action. Our results showed that the observation of an action embedded in an emotional context (agent's facial expression), compared with the observation of the same action embedded in a neutral context, elicits higher neural response at the level of motor frontal cortices, temporal and occipital cortices, bilaterally. Particularly, the dynamic facial expression of anger modulates the re-enactment of a motor representation of the observed action. This is supported by the evidence that observing actions embedded in the context of anger, but not happiness, compared with a neutral context, elicits stronger activity in the bilateral pre-central gyrus and inferior frontal gyrus, besides the pre-supplementary motor area, a region playing a central role in motor control. Angry faces not only seem to modulate the simulation of actions, but may also trigger motor reaction. These findings suggest that emotions exert a modulatory role on action observation in different cortical areas involved in action processing.


Processing of hand-related verbs specifically affects the planning and execution of arm reaching movements.

  • Giovanni Mirabella‎ et al.
  • PloS one‎
  • 2012‎

Even though a growing body of research has shown that the processing of action language affects the planning and execution of motor acts, several aspects of this interaction are still hotly debated. The directionality (i.e. does understanding action-related language induce a facilitation or an interference with the corresponding action?), the time course, and the nature of the interaction (i.e. under what conditions does the phenomenon occur?) are largely unclear. To further explore this topic we exploited a go/no-go paradigm in which healthy participants were required to perform arm reaching movements toward a target when verbs expressing either hand or foot actions were shown, and to refrain from moving when abstract verbs were presented. We found that reaction times (RT) and percentages of errors increased when the verb involved the same effector used to give the response. This interference occurred very early, when the interval between verb presentation and the delivery of the go signal was 50 ms, and could be elicited until this delay was about 600 ms. In addition, RTs were faster when subjects used the right arm than when they used the left arm, suggesting that action-verb understanding is left-lateralized. Furthermore, when the color of the printed verb and not its meaning was the cue for movement execution the differences between RTs and error percentages between verb categories disappeared, unequivocally indicating that the phenomenon occurs only when the semantic content of a verb has to be retrieved. These results are compatible with the theory of embodied language, which hypothesizes that comprehending verbal descriptions of actions relies on an internal simulation of the sensory-motor experience of the action, and provide a new and detailed view of the interplay between action language and motor acts.


Brain response to a humanoid robot in areas implicated in the perception of human emotional gestures.

  • Thierry Chaminade‎ et al.
  • PloS one‎
  • 2010‎

The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet might utilize different neural processes than those used for reading the emotions in human agents.


Audience spontaneous entrainment during the collective enjoyment of live performances: physiological and behavioral measurements.

  • Martina Ardizzi‎ et al.
  • Scientific reports‎
  • 2020‎

Cardiac synchrony is a crucial component of shared experiences, considered an objective measure of the emotional processes accompanying empathic interactions. No study has investigated whether cardiac synchrony among people engaged in collective situations is linked to the individual emotional evaluation of the shared experience. We investigated theatrical live performances as collective experiences evoking strong emotional engagement in the audience. Cross Recurrence Quantification Analysis was applied to obtain the cardiac synchrony of twelve quartets of spectators attending two live acting performances. This physiological measure was then correlated with spectators' emotional intensity ratings. Results showed an expected increment in synchrony among people belonging to the same quartet during both performance attendance and rest periods. Furthermore, participants' cardiac synchrony was found to be correlated with the audience's convergence in the explicit emotional evaluation of the performances they attended. These findings demonstrate that the mere co-presence of other people sharing a common experience is enough for cardiac synchrony to occur spontaneously and that it increases as a function of a shared and coherent explicit emotional experience.
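Cross Recurrence Quantification Analysis indexes how often two time series visit the same region of a shared phase space. One common summary, the cross-recurrence rate, can be sketched as below for two inter-beat-interval series; the embedding dimension, delay and radius are placeholder choices, not the parameters used in the study.

    # Minimal sketch: cross-recurrence rate between two cardiac inter-beat-interval (IBI)
    # series as one index of cardiac synchrony. Embedding parameters and the radius are
    # illustrative assumptions.
    import numpy as np

    def embed(x, dim=3, tau=1):
        """Time-delay embedding of a 1-D series into shape (n_vectors, dim)."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    def cross_recurrence_rate(x, y, dim=3, tau=1, radius=0.5):
        """Fraction of embedded point pairs closer than `radius` (series are z-scored first)."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        X, Y = embed(x, dim, tau), embed(y, dim, tau)
        dists = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)  # cross-distance matrix
        return float((dists < radius).mean())                           # recurrence rate in [0, 1]

    # Hypothetical usage with two spectators' IBI series (in seconds):
    # rec = cross_recurrence_rate(ibi_spectator_a, ibi_spectator_b)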


ERP modulation during observation of abstract paintings by Franz Kline.

  • Beatrice Sbriscia-Fioretti‎ et al.
  • PloS one‎
  • 2013‎

The aim of this study was to test the involvement of sensorimotor cortical circuits during the beholding of the static consequences of hand gestures devoid of any meaning. In order to verify this hypothesis we performed an EEG experiment presenting participants with images of abstract works of art with marked traces of brushstrokes. The EEG data were analyzed by using Event Related Potentials (ERPs). We aimed to demonstrate a direct involvement of sensorimotor cortical circuits during the beholding of these selected works of abstract art. The stimuli consisted of three different abstract black and white paintings by Franz Kline. Results verified our experimental hypothesis, showing the activation of premotor and motor cortical areas during stimuli observation. In addition, the observation of abstract works of art elicited the activation of reward-related orbitofrontal areas and cognitive categorization-related prefrontal areas. The cortical sensorimotor activation is a fundamental neurophysiological demonstration of the direct involvement of the cortical motor system in the perception of static meaningless images belonging to abstract art. These results support the role of embodied simulation of the artist's gestures in the perception of works of art.


Modulation of arm reaching movements during processing of arm/hand-related action verbs with and without emotional connotation.

  • Silvia Spadacenta‎ et al.
  • PloS one‎
  • 2014‎

The theory of embodied language states that language comprehension relies on an internal reenactment of the sensorimotor experience associated with the processed word or sentence. Most evidence in support of this hypothesis has been collected using linguistic material without any emotional connotation. For instance, it has been shown that processing of arm-related verbs, but not of leg-related verbs, affects the planning and execution of reaching movements; however, at present it is unknown whether this effect is further modulated by verbs evoking an emotional experience. Showing such a modulation might shed light on a much debated issue, i.e. the way in which the emotional meaning of a word is processed. To this end, we assessed whether processing arm/hand-related verbs describing actions with negative connotations (e.g. to stab) affects reaching movements differently from arm/hand-related verbs describing actions with a neutral connotation (e.g. to comb). We exploited a go/no-go paradigm in which healthy participants were required to perform arm-reaching movements toward a target when verbs expressing emotional hand actions, neutral hand actions or foot actions were shown, and to refrain from moving when no-effector-related verbs were presented. Reaction times and percentages of errors increased when the verb involved the same effector as used to give the response. However, we also found that the size of this interference decreased when the arm/hand-related verbs had a negative emotional connotation. Crucially, we show that such modulation only occurred when the verb semantics had to be retrieved. These results suggest that the comprehension of negatively valenced verbs might require the simultaneous reenactment of the neural circuitry associated with the processing of the emotion evoked by their meaning and of the neural circuitry associated with their motor features.


When early experiences build a wall to others' emotions: an electrophysiological and autonomic study.

  • Martina Ardizzi‎ et al.
  • PloS one‎
  • 2013‎

Facial expression of emotions is a powerful vehicle for communicating information about others' emotional states and it normally induces facial mimicry in the observers. The aim of this study was to investigate whether early aversive experiences could interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of "street-boys" and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system promoting social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions only among street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after the visualization of facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions.


Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust.

  • Bruno Wicker‎ et al.
  • Neuron‎
  • 2003‎

What neural mechanism underlies the capacity to understand the emotions of others? Does this mechanism involve brain areas normally involved in experiencing the same emotion? We performed an fMRI study in which participants inhaled odorants producing a strong feeling of disgust. The same participants observed video clips showing the emotional facial expression of disgust. Observing such faces and feeling disgust activated the same sites in the anterior insula and to a lesser extent in the anterior cingulate cortex. Thus, as observing hand actions activates the observer's motor representation of that action, observing an emotion activates the neural representation of that emotion. This finding provides a unifying mechanism for understanding the behaviors of others.


Zoomed out: digital media use and depersonalization experiences during the COVID-19 lockdown.

  • Anna Ciaunica‎ et al.
  • Scientific reports‎
  • 2022‎

Depersonalisation is a common dissociative experience characterised by distressing feelings of being detached or 'estranged' from one's self and body and/or the world. The COVID-19 pandemic forced millions of people to socially distance themselves from others and to change their lifestyle habits. We conducted an online study of 622 participants worldwide to investigate the relationship between digital media-based activities, distal social interactions and people's sense of self during the lockdown as contrasted with before the pandemic. We found that increased use of digital media-based activities and online social e-meetings correlated with higher feelings of depersonalisation. We also found that participants reporting higher experiences of depersonalisation also reported enhanced vividness of negative emotions (as opposed to positive emotions). Finally, participants who reported that lockdown influenced their life to a greater extent had higher occurrences of depersonalisation experiences. Our findings may help to address key questions regarding well-being during a lockdown in the general population. Our study points to potential risks related to overly sedentary and hyper-digitalised lifestyle habits that may induce feelings of living in one's 'head' (mind), disconnected from one's body, self and the world.


Autonomic vulnerability to biased perception of social inclusion in borderline personality disorder.

  • Maria Lidia Gerra‎ et al.
  • Borderline personality disorder and emotion dysregulation‎
  • 2021‎

Individuals with Borderline Personality Disorder (BPD) feel rejected even when socially included. The pathophysiological mechanisms of this rejection bias are still unknown. Using the Cyberball paradigm, we investigated whether patients with BPD display altered physiological responses to social inclusion and ostracism, as assessed by changes in Respiratory Sinus Arrhythmia (RSA).


Less Empathic and More Reactive: The Different Impact of Childhood Maltreatment on Facial Mimicry and Vagal Regulation.

  • Martina Ardizzi‎ et al.
  • PloS one‎
  • 2016‎

Facial mimicry and vagal regulation represent two crucial physiological responses to others' facial expressions of emotions. Facial mimicry, defined as the automatic, rapid and congruent electromyographic activation to others' facial expressions, is implicated in empathy, emotional reciprocity and emotion recognition. Vagal regulation, quantified by the computation of Respiratory Sinus Arrhythmia (RSA), exemplifies the autonomic adaptation to contingent social cues. Although it has been demonstrated that childhood maltreatment induces alterations in the processing of the facial expression of emotions, both at an explicit and implicit level, the effects of maltreatment on children's facial mimicry and vagal regulation in response to facial expressions of emotions remain unknown. The purpose of the present study was to fill this gap, involving 24 street-children (maltreated group) and 20 age-matched controls (control group). We recorded their spontaneous facial electromyographic activations of the corrugator and zygomaticus muscles and RSA responses during the visualization of the facial expressions of anger, fear, joy and sadness. Results demonstrated a different impact of childhood maltreatment on facial mimicry and vagal regulation. Maltreated children did not show the typical positive-negative modulation of corrugator mimicry. Furthermore, when only negative facial expressions were considered, maltreated children demonstrated lower corrugator mimicry than controls. With respect to vagal regulation, whereas maltreated children manifested the expected and functional inverse correlation between RSA value at rest and RSA response to angry facial expressions, controls did not. These results describe an early and divergent functional adaptation of the two investigated physiological mechanisms to a hostile environment. On the one hand, maltreatment leads to the suppression of the spontaneous facial mimicry that normally contributes to the empathic understanding of others' emotions. On the other hand, maltreatment forces the precocious development of the functional synchronization between vagal regulation and threatening social cues, facilitating the recruitment of fight-or-flight defensive behavioral strategies.
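RSA is commonly quantified from the heart-period series as the variability falling within the respiratory frequency band, often reported on a natural-log scale. The sketch below is a simplified estimate of that kind; the band edges, resampling rate and spectral method are illustrative assumptions and not necessarily the procedure used in this study.

    # Simplified sketch of an RSA estimate: natural log of inter-beat-interval (IBI)
    # power in an assumed respiratory band (0.12-0.40 Hz). All parameter choices are
    # illustrative, not the study's exact pipeline.
    import numpy as np
    from scipy.interpolate import interp1d
    from scipy.signal import welch

    def rsa_estimate(r_peak_times_s, fs_resample=4.0, band=(0.12, 0.40)):
        """r_peak_times_s: 1-D array of R-peak times in seconds."""
        ibi = np.diff(r_peak_times_s)                     # inter-beat intervals (s)
        ibi_times = r_peak_times_s[1:]
        # Resample the unevenly spaced IBI series onto a regular time grid.
        grid = np.arange(ibi_times[0], ibi_times[-1], 1.0 / fs_resample)
        ibi_even = interp1d(ibi_times, ibi, kind="cubic")(grid)
        # Spectral power of the IBI series, integrated over the respiratory band.
        freqs, psd = welch(ibi_even - ibi_even.mean(), fs=fs_resample,
                           nperseg=min(256, len(ibi_even)))
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        return float(np.log(np.trapz(psd[in_band], freqs[in_band])))   # ln of band variance

    # Hypothetical usage, e.g. comparing rest and stimulus periods:
    # rsa_rest = rsa_estimate(r_peaks_rest); rsa_task = rsa_estimate(r_peaks_task)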


Frontal Functional Connectivity of Electrocorticographic Delta and Theta Rhythms during Action Execution Versus Action Observation in Humans.

  • Claudio Babiloni‎ et al.
  • Frontiers in behavioral neuroscience‎
  • 2017‎

We have previously shown that in seven drug-resistant epilepsy patients, both reaching-grasping of objects and the mere observation of those actions desynchronized subdural electrocorticographic (ECoG) alpha (8-13 Hz) and beta (14-30 Hz) rhythms as a sign of cortical activation in primary somatosensory-motor, lateral premotor and ventral prefrontal areas (Babiloni et al., 2016a). Furthermore, that desynchronization was greater during action execution than during its observation. In the present exploratory study, we reanalyzed those ECoG data to evaluate the proof-of-concept that lagged linear connectivity (LLC) between primary somatosensory-motor, lateral premotor and ventral prefrontal areas would be enhanced during action execution compared to mere observation, due to a greater flow of visual and somatomotor information. Results showed that the delta-theta (<8 Hz) LLC between lateral premotor and ventral prefrontal areas was higher during action execution than during action observation. Furthermore, the phase of these delta-theta rhythms entrained the local event-related connectivity of alpha and beta rhythms. We speculate that a multi-oscillatory functional network exists between high-order frontal motor areas, one that is more engaged during the actual reaching-grasping of objects than during its mere observation. Future studies in a larger population should cross-validate these preliminary results.
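Lagged connectivity measures discount the zero-lag (instantaneous) component of the cross-spectrum, which is the part most inflated by volume conduction. One common formulation of lagged coherence is sketched below as an illustration of the idea; it is not the exact LLC implementation used in the study, and the delta-theta band limits and sampling rate are assumptions.

    # Sketch: a lagged (non-zero-lag) coherence estimate between two signals, averaged
    # over an assumed delta-theta band (<8 Hz). Illustrative only; not the study's
    # exact LLC computation.
    import numpy as np
    from scipy.signal import csd, welch

    def lagged_coherence(x, y, fs, fmin=1.0, fmax=8.0, nperseg=1024):
        f, sxy = csd(x, y, fs=fs, nperseg=nperseg)     # complex cross-spectrum
        _, sxx = welch(x, fs=fs, nperseg=nperseg)      # auto-spectra
        _, syy = welch(y, fs=fs, nperseg=nperseg)
        lagged = np.imag(sxy) ** 2 / (sxx * syy - np.real(sxy) ** 2)
        band = (f >= fmin) & (f <= fmax)
        return float(lagged[band].mean())

    # Hypothetical usage with two ECoG channels sampled at 1000 Hz:
    # llc = lagged_coherence(premotor_signal, prefrontal_signal, fs=1000.0)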


Sharing the filmic experience - The physiology of socio-emotional processes in the cinema.

  • Laura Kaltwasser‎ et al.
  • PloS one‎
  • 2019‎

As we identify with characters on screen, we simulate their emotions and thoughts. This is accompanied by physiological changes such as galvanic skin response (GSR), an indicator of emotional arousal, and respiratory sinus arrhythmia (RSA), which reflects vagal activity. We investigated whether the presence of a cinema audience affects these psychophysiological processes. The study was conducted in a real cinema in Berlin. Participants came twice to watch previously rated emotional film scenes eliciting amusement, anger, tenderness or fear. On one visit they watched the scenes alone, on the other in a group. We tested whether the vagal modulation in response to the mere presence of others influences explicit (reported) and implicit markers (RSA, heart rate (HR) and GSR) of emotional processes as a function of solitary or collective enjoyment of movie scenes. On the physiological level, we found a mediating effect of vagal flexibility in response to the mere presence of others. Individuals showing a high baseline difference (alone vs. social) prior to the presentation of the film maintained higher RSA in the alone compared to the social condition. The opposite pattern emerged for low baseline difference individuals. Emotional arousal as reflected in GSR was significantly more pronounced during scenes eliciting anger, independent of the social condition. On the behavioural level, we found evidence for emotion-specific effects on reported empathy, emotional intensity and Theory of Mind. Furthermore, people who decrease their RSA in response to others' company are those who felt more empathically engaged with the characters. Our data speak in favour of a specific role of vagal regulation in response to the mere presence of others in terms of explicit empathic engagement with characters during shared filmic experience.


The consequences of COVID-19 on social interactions: an online study on face covering.

  • Marta Calbi‎ et al.
  • Scientific reports‎
  • 2021‎

The COVID-19 pandemic has dramatically changed the nature of our social interactions. In order to understand how protective equipment and distancing measures influence the ability to comprehend others' emotions and, thus, to effectively interact with others, we carried out an online study across the Italian population during the first pandemic peak. Participants were shown static facial expressions (Angry, Happy and Neutral) covered by a sanitary mask or by a scarf. They were asked to evaluate the expressed emotions as well as to assess the degree to which they would adopt physical and social distancing measures for each stimulus. Results demonstrate that, despite the covering of the lower face, participants correctly recognized the facial expressions of emotions, with a polarizing effect on emotional valence ratings found in females. Noticeably, while females' ratings for physical and social distancing were driven by the emotional content of the stimuli, males were influenced by the "covered" condition. The results also show the impact of the pandemic on the anxiety and fear experienced by participants. Taken together, our results offer novel insights into the impact of the COVID-19 pandemic on social interactions, providing a deeper understanding of the way people react to different kinds of protective face covering.


How context influences the interpretation of facial expressions: a source localization high-density EEG study on the "Kuleshov effect".

  • Marta Calbi‎ et al.
  • Scientific reports‎
  • 2019‎

Few studies have explored the specificities of contextual modulations of the processing of facial expressions at a neuronal level. This study fills this gap by employing an original paradigm, based on a version of the filmic "Kuleshov effect". High-density EEG was recorded while participants watched film sequences consisting of three shots: the close-up of a target person's neutral face (Face_1), the scene that the target person was looking at (happy, fearful, or neutral), and another close-up of the same target person's neutral face (Face_2). The participants' task was to rate both valence and arousal, and subsequently to categorize the target person's emotional state. The results indicate that despite a significant behavioural 'context' effect, the electrophysiological indexes still indicate that the face is evaluated as neutral. Specifically, Face_2 elicited a high amplitude N170 when preceded by neutral contexts, and a high amplitude Late Positive Potential (LPP) when preceded by emotional contexts, thus showing sensitivity to the evaluative congruence (N170) and incongruence (LPP) between context and Face_2. The LPP activity was mainly underpinned by brain regions involved in facial expressions and emotion recognition processing. Our results shed new light on temporal and neural correlates of context-sensitivity in the interpretation of facial expressions.
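ERP component effects like those reported for the N170 and LPP are typically quantified as the mean amplitude of the trial-averaged waveform within a component-specific time window. The sketch below shows that basic step; the electrode selection and window boundaries (roughly 150-200 ms for the N170, 400-800 ms for the LPP) are conventional assumptions, not the study's exact parameters.

    # Sketch: mean ERP amplitude within a component time window, per condition.
    # `epochs` has shape (n_trials, n_samples) for one electrode (or an electrode
    # cluster average); `times` is the epoch time axis in seconds. Windows are assumed.
    import numpy as np

    def mean_amplitude(epochs, times, window):
        erp = epochs.mean(axis=0)                          # trial-averaged waveform
        mask = (times >= window[0]) & (times <= window[1])
        return float(erp[mask].mean())                     # mean amplitude in the window

    # Hypothetical usage for Face_2 epochs in the two context conditions:
    # n170_neutral  = mean_amplitude(face2_neutral_ctx, times, (0.15, 0.20))
    # lpp_emotional = mean_amplitude(face2_emotional_ctx, times, (0.40, 0.80))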


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

    You are currently on the Community Resources tab, looking through the categories and sources that FDI Lab - SciCrunch.org has compiled. You can navigate through those categories from here or switch to a different tab to run your search against a different set of data. Each tab gives a different perspective on the data.

  3. Logging in and Registering

    If you have an account on FDI Lab - SciCrunch.org, you can log in from here to access additional features such as Collections, Saved Searches, and Resource management.

  4. Searching

    This is the search term being executed; you can type in anything you want to search for. Some tips to help with searching (example queries are given after this tutorial):

    1. Use quotes around phrases you want to match exactly
    2. You can manually add AND and OR between terms to change how words are combined
    3. You can add "-" to a term to exclude results containing that term (e.g., Cerebellum -CA1)
    4. You can add "+" to a term to require that it appear in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

    You can save any searches you perform here for quick access later.

  6. Query Expansion

    We recognized your search term and included synonyms and inferred terms alongside it to help find the data you are looking for.

  7. Collections

    If you are logged into FDI Lab - SciCrunch.org you can add data records to your collections to create custom spreadsheets across multiple sources of data.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

    If you have any further questions please check out our FAQs Page to ask questions and see our tutorials. Click this button to view this tutorial again.
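To make the search tips in the Searching step concrete, here are a few illustrative queries; the terms themselves are arbitrary examples, not recommendations:

    "mirror neuron" EEG        match the exact phrase plus another term
    empathy AND mimicry        require both terms
    empathy OR mimicry         match either term
    Cerebellum -CA1            exclude results containing CA1
    +RRID vagal                require RRID to appear in the data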

Publications Per Year (interactive chart of publication counts by year)