Emotions and temperature are closely related through embodied processes, and people seem to associate temperature concepts with emotions. While this relationship is often evidenced by everyday language (e.g., cold and warm feelings), what has been missing to date is a systematic study that holistically analyzes how and why people associate specific temperatures with emotions. The present research investigated the associations between temperature concepts and emotion adjectives at both explicit and implicit levels. In Experiment 1, we evaluated explicit associations between twelve pairs of emotion adjectives derived from the circumplex model of affect and five temperature concepts ranging from 0°C to 40°C, based on responses from 403 native speakers of four different languages (English, Spanish, Japanese, Chinese). The results of Experiment 1 revealed that, across languages, the temperatures were associated with different regions of the circumplex model. 0°C and 10°C were associated with negative-valenced, low-arousal emotions, while 20°C was associated with positive-valenced, low-to-medium-arousal emotions. Moreover, 30°C was associated with positive-valenced, high-arousal emotions, and 40°C was associated with high-arousal and either positive- or negative-valenced emotions. In Experiment 2 (N = 102), we explored whether these temperature-emotion associations were also present at the implicit level by conducting Implicit Association Tests (IATs) with temperature words (cold and hot) and opposing pairs of emotional adjectives for each dimension of valence (Unhappy/Dissatisfied vs. Happy/Satisfied) and arousal (Passive/Quiet vs. Active/Alert) with native English speakers. The results of Experiment 2 revealed that participants held implicit associations between the word hot and positive-valenced, high-arousal emotions, and between the word cold and negative-valenced, low-arousal emotions.
These findings provide evidence for the existence of temperature-emotion associations at both explicit and implicit levels across languages.
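IAT results of the kind reported above are conventionally summarized with a D measure: the difference in mean response latencies between incompatible and compatible pairing blocks, scaled by their pooled standard deviation. A minimal sketch, using entirely hypothetical reaction times (none of these values come from the study):

```python
from statistics import mean, stdev

# Toy reaction times in ms (illustrative only):
# "compatible" blocks pair hot with positive/high-arousal words,
# "incompatible" blocks pair hot with negative/low-arousal words.
compatible = [612, 588, 640, 575, 603, 630, 595, 618]
incompatible = [702, 688, 715, 670, 699, 724, 681, 707]

def iat_d(compat, incompat):
    """Greenwald-style D measure: mean latency difference over pooled SD."""
    pooled_sd = stdev(compat + incompat)
    return (mean(incompat) - mean(compat)) / pooled_sd

d = iat_d(compatible, incompatible)
# d > 0 indicates faster responses in the compatible pairing,
# i.e., an implicit hot-positive association in this toy example.
```

A positive D reflects the facilitation effect the abstract describes: responses are faster when the block's category pairing matches the implicit association.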
Despite decades of research establishing the causes and consequences of emotions in the laboratory, we know surprisingly little about emotions in everyday life. We developed a smartphone application that monitored the real-time emotions of an exceptionally large (N = 11,000+) and heterogeneous sample of participants. People's everyday lives appear profoundly emotional: participants experienced at least one emotion 90% of the time. The most frequent emotion was joy, followed by love and anxiety. People experienced positive emotions 2.5 times more often than negative emotions, but also experienced positive and negative emotions simultaneously relatively frequently. We also characterized the interconnections between people's emotions using network analysis. This novel approach to emotion research suggests that specific emotions can fall into the following categories: 1) connector emotions (e.g., joy), which stimulate same-valence emotions while inhibiting opposite-valence emotions; 2) provincial emotions (e.g., gratitude), which stimulate same-valence emotions only; or 3) distal emotions (e.g., embarrassment), which have little interaction with other emotions and are typically experienced in isolation. Providing both basic foundations and novel tools for the study of emotions in everyday life, these findings demonstrate that emotions are ubiquitous in life and can exist together and distinctly, which has important implications for both emotional interventions and theory.
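The connector/provincial/distal distinction can be illustrated with a toy co-occurrence analysis: compute a signed association (phi coefficient) between every pair of emotions across momentary reports, then label each emotion by whether it has strong positive edges, strong negative edges, or neither. A minimal sketch on fabricated data (the samples, emotion set, and 0.3 threshold are illustrative assumptions, not the study's actual method or values):

```python
from itertools import combinations

# Hypothetical momentary reports (1 = emotion felt, 0 = not felt).
samples = [
    {"joy": 1, "love": 1, "anxiety": 0, "embarrassment": 0},
    {"joy": 1, "love": 0, "anxiety": 0, "embarrassment": 0},
    {"joy": 0, "love": 0, "anxiety": 1, "embarrassment": 0},
    {"joy": 1, "love": 1, "anxiety": 0, "embarrassment": 0},
    {"joy": 0, "love": 0, "anxiety": 0, "embarrassment": 1},
    {"joy": 0, "love": 0, "anxiety": 1, "embarrassment": 0},
]

def phi(a, b, data):
    """Phi coefficient (binary correlation) between two emotions."""
    n = len(data)
    pa = sum(d[a] for d in data) / n
    pb = sum(d[b] for d in data) / n
    pab = sum(d[a] and d[b] for d in data) / n
    denom = (pa * (1 - pa) * pb * (1 - pb)) ** 0.5
    return (pab - pa * pb) / denom if denom else 0.0

emotions = list(samples[0])
edges = {pair: phi(pair[0], pair[1], samples)
         for pair in combinations(emotions, 2)}

def role(e, edges, thresh=0.3):
    """Label an emotion by the sign pattern of its strong network edges."""
    pos = any(w > thresh for p, w in edges.items() if e in p)
    neg = any(w < -thresh for p, w in edges.items() if e in p)
    if pos and neg:
        return "connector"   # stimulates some emotions, inhibits others
    if pos:
        return "provincial"  # positive links only
    return "distal"          # no strong positive links; largely isolated
```

In this fabricated network, joy co-occurs with love but excludes anxiety (connector-like), while embarrassment rarely co-occurs with anything (distal-like).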
Emotions are states of vigilant readiness that guide human and animal behaviour during survival-salient situations. Categorical models of emotions posit neurally and physiologically distinct basic human emotions (anger, fear, disgust, happiness, sadness and surprise) that govern different survival functions. Opioid receptors are expressed abundantly in the mammalian emotion circuit, and the opioid system modulates a variety of functions related to arousal and motivation. Yet, its specific contribution to different basic emotions has remained poorly understood. Here, we review how the endogenous opioid system, and particularly the μ receptor, contributes to emotional processing in humans. Activation of the endogenous opioid system is consistently associated with both pleasant and unpleasant emotions. In general, exogenous opioid agonists facilitate approach-oriented emotions (anger, pleasure) and inhibit avoidance-oriented emotions (fear, sadness). Opioids also modulate social bonding and affiliative behaviour, and prolonged opioid abuse may render both social bonding and emotion recognition circuits dysfunctional. However, there is no clear evidence that the opioid system is able to affect the emotions associated with surprise and disgust. Taken together, the opioid system contributes to a wide array of positive and negative emotions through its general ability to modulate the approach versus avoidance motivation associated with specific emotions. Because of the protective effects of opioid system-mediated prosociality and positive mood, the opioid system may constitute an important factor contributing to psychological and psychosomatic resilience.
The ability to experience others' emotional states is a key component of social interactions. Uniquely among sensorimotor regions, the somatosensory cortex (SCx) plays an especially important role in human emotion understanding. While distinct emotions are experienced in specific parts of the body, it remains unknown whether the SCx exhibits somatotopic activations to different emotional expressions. In the current study, we investigated whether the affective response triggered by observing others' emotional facial expressions leads to differential activations in SCx. Participants performed a visual facial emotion discrimination task while we measured changes in SCx topographic EEG activity by tactually stimulating two body parts representative of the upper and lower limbs, the finger and the toe, respectively. The results showed an emotion-specific response in the finger SCx when observing angry as opposed to sad emotional expressions, after controlling for carry-over effects of visual evoked activity. This dissociation to observed emotions was not present in toe somatosensory responses. Our results suggest that somatotopic activations of the SCx to discrete emotions might play a crucial role in understanding others' emotions.
In the current study we show that non-verbal food-evoked emotion scores significantly improve food choice prediction over liking scores alone. Previous research has shown that liking measures correlate with choice. However, liking is not a strong predictor of food choice in real-life environments. Therefore, the focus of recent studies has shifted towards emotion-profiling methods that can successfully discriminate between products that are equally liked. However, it is unclear how well scores from emotion-profiling methods predict actual food choice and/or consumption. To test this, we decomposed emotion scores into valence and arousal scores using Principal Component Analysis (PCA) and applied Multinomial Logit Models (MLM) to estimate food choice using liking, valence, and arousal as possible predictors. For this analysis, we used an existing data set comprising liking and food-evoked emotion scores from 123 participants, who rated 7 unlabeled breakfast drinks. Liking scores were measured using a 100-mm visual analogue scale, while food-evoked emotions were measured using 2 existing emotion-profiling methods: a verbal and a non-verbal method (EsSense Profile and PrEmo, respectively). After 7 days, participants were asked to choose 1 breakfast drink from the experiment to consume during breakfast in a simulated restaurant environment. Cross-validation showed that we were able to correctly predict individual food choice (1 out of 7 products) for over 50% of the participants. This number increased to nearly 80% when looking at the top 2 candidates. Model comparisons showed that evoked emotions better predict food choice than perceived liking alone. However, the strongest predictive strength was achieved by the combination of evoked emotions and liking. Furthermore, we showed that non-verbal food-evoked emotion scores more accurately predict food choice than verbal food-evoked emotion scores.
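The pipeline described in this abstract (decompose emotion-profile ratings into valence- and arousal-like components with PCA, then combine them with liking in a multinomial logit choice model) can be sketched as follows. Everything below is fabricated for illustration: the random ratings, the 7-product setup, and the hand-set coefficients stand in for quantities the study estimated from its own data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical emotion-profile ratings: 40 participants x 6 emotion terms
# (stand-ins for EsSense/PrEmo items; values are random, not real data).
ratings = rng.random((40, 6))

# PCA via SVD of the centered matrix; the first two components play the
# role of valence- and arousal-like scores in the paper's approach.
X = ratings - ratings.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:2].T  # shape (40, 2): component scores per participant

def choice_probs(liking, valence, arousal, beta):
    """Multinomial-logit choice probabilities over products.

    Each product's utility is a linear combination of its liking,
    valence, and arousal scores; a softmax maps utilities to
    choice probabilities.
    """
    u = beta[0] * liking + beta[1] * valence + beta[2] * arousal
    e = np.exp(u - u.max())  # subtract max for numerical stability
    return e / e.sum()

# Toy per-product predictors for 7 breakfast drinks, with hand-picked
# (not estimated) coefficients.
liking = rng.random(7)
valence = rng.random(7)
arousal = rng.random(7)
probs = choice_probs(liking, valence, arousal, np.array([2.0, 1.0, 0.5]))
```

Predicting a participant's choice then amounts to taking the product with the highest probability (`probs.argmax()`); in the study, coefficients were fitted and evaluated with cross-validation rather than set by hand.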
Despite the increasing interest in sleep- and dream-related processes of emotion regulation, their reflection in wake and dream emotional experience remains unclear. Here, we aimed to assess dream emotions and their relationships with wake emotions through the modified Differential Emotions Scale (Fredrickson, 2003), which includes a broad array of both positive and negative emotions. The scale was first validated on 212 healthy Italian participants, in two versions: a WAKE-2wks form, assessing the frequency of 22 emotions over the past 2 weeks, and a WAKE-24hr form, assessing their intensity over the past 24 h. Fifty volunteers from the wider sample completed the WAKE-24hr mDES for several days until a dream was recalled, and dream emotions were self-reported using the same scale. A bifactorial structure was confirmed for both mDES forms, which also showed good validity and reliability. Though Positive and Negative Affect (the average intensity of positive and negative items, PA and NA, respectively) were balanced in dreams, specific negative emotions prevailed; rmANOVA showed a different pattern (prevalence of PA and positive emotions) in wake (both WAKE-2wks and WAKE-24hr), with a decrease of PA and an increase of NA in the dream compared to the previous wake. No significant regression model emerged between waking and dream affect, and exploratory analyses revealed a stable proportion of PA and NA (with prevailing PA) over the 3 days preceding the dream. Our findings highlight a discontinuity between wake and dream affect and suggest that positive and negative emotions experienced during wake may follow distinct sleep-related regulation pathways.
Kinesthesia, the perception of our own body movements, relies on the integration of proprioceptive information arising mostly from muscle spindles, which are sensory receptors in skeletal muscles. We recently showed that emotions alter the proprioceptive messages from such muscle afferents, making them more sensitive to muscle lengthening when participants were listening to sad music. Here, we investigated whether these emotion-related changes in proprioceptive feedback affect the perception of limb movements. Kinesthetic acuity was tested in 20 healthy, young adults by imposing ramp-and-hold movements that consisted of either plantar flexion or dorsiflexion movements of the ankle at 0.04°/s, or no movement. These were imposed during four emotional conditions (listening to neutral, sad, or happy music, or no music). The participants were asked to relax and focus on the music (or nothing), then to shift their focus to the direction of an incoming movement; once it had finished, they were asked to report its direction. Muscle activity, heart rate, and electrodermal activity were recorded during each trial, and after each music condition the participants rated the emotion felt on a visual analog scale. The ratings of the emotional content of the music were corroborated by changes in the physiological measures. Kinesthetic acuity was also affected by emotional state: it was greater during the sad condition than during the no-music or neutral conditions. We conclude that emotion can shape our perception of movement; here, feeling sadness significantly increased kinesthetic acuity, which may be functionally relevant for the preparation of appropriate behavioral responses.
The sense of olfaction has been considered of minor importance in human communication. In recent years, evidence has emerged that humans might be influenced by unconscious messages sent through chemosignals in body odors. Data concerning the ability of humans to recognize fear, possibly related to the evolutionary role of this emotion in fight-or-flight reactions, are well known.
Compared with the well-documented neurophysiological findings on negative emotions, much less is known about positive emotions. In the present study, we explored the EEG correlates of ten different positive emotions (joy, gratitude, serenity, interest, hope, pride, amusement, inspiration, awe, and love). A group of 20 participants were invited to watch 30 short film clips with their EEGs simultaneously recorded. Distinct topographical patterns for different positive emotions were found for the correlation coefficients between the subjective ratings on the ten positive emotions per film clip and the corresponding EEG spectral powers in different frequency bands. Based on the similarities of the participants' ratings on the ten positive emotions, these emotions were further grouped into three representative clusters: 'encouragement' (awe, gratitude, hope, inspiration, pride), 'playfulness' (amusement, joy, interest), and 'harmony' (love, serenity). Using the EEG spectral powers as features, both the binary classification of higher versus lower ratings on these positive emotions and the binary classification between the three positive emotion clusters achieved accuracies of approximately 80% and above. To our knowledge, our study provides the first evidence on the EEG correlates of different positive emotions.
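The spectral-power features used for classification in studies like this are typically band powers per channel and frequency band (theta, alpha, beta, etc.). A minimal sketch of extracting such features with an FFT on a synthetic one-channel signal (the 250 Hz sampling rate, the 10 Hz test tone, and the band edges are assumptions for illustration, not the study's parameters):

```python
import numpy as np

fs = 250                      # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)

# Toy signal: a 10 Hz (alpha-band) oscillation plus noise, standing in
# for one EEG channel recorded during a film clip.
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

def band_power(x, fs, lo, hi):
    """Mean spectral power of x within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
features = {name: band_power(signal, fs, lo, hi)
            for name, (lo, hi) in bands.items()}
```

Stacking such features across channels and bands yields the feature vector a classifier would consume; here the synthetic alpha oscillation dominates the alpha-band power, as expected.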
Human body movements can convey a variety of emotions and even create advantages in some special life situations. However, how emotion is encoded in body movements has remained unclear. One reason is the lack of a public human body kinematic dataset covering the expression of various emotions. Therefore, we aimed to produce a comprehensive dataset to assist in recognizing cues from all parts of the body that indicate six basic emotions (happiness, sadness, anger, fear, disgust, surprise) and neutral expression. The present dataset was created using a portable wireless motion capture system. Twenty-two semi-professional actors (half male) completed performances according to standardized guidance and preferred daily events. A total of 1402 recordings at 125 Hz were collected, consisting of the position and rotation data of 72 anatomical nodes. To our knowledge, this is currently the largest emotional kinematic dataset of the human body. We hope this dataset will contribute to multiple fields of research and practice, including social neuroscience, psychiatry, computer vision, and biometric and information forensics.
Studies demonstrating the memory-facilitating effect of emotions have typically focused on the affective dimensions of arousal and valence. Little is known, however, about the extent to which stimulus-driven basic emotions have distinct effects on memory. In the present paper we sought to examine the modulatory effect of disgust, fear, and sadness on intentional remembering and forgetting using the widely used item-method directed forgetting (DF) paradigm. Eighteen women underwent fMRI scanning during an encoding phase in which they were asked either to remember (R) or to forget (F) pictures. In the test phase all previously used stimuli were re-presented together with the same number of new pictures, and participants had to categorize them as old or new, irrespective of the F/R instruction. On the behavioral level we found a typical DF effect, i.e., higher recognition rates for to-be-remembered (TBR) items than for to-be-forgotten (TBF) ones, for both neutral and emotional categories. Emotional stimuli had higher recognition rates than neutral ones; among emotional stimuli, those eliciting disgust produced the highest recognition but also induced more false alarms. Accordingly, when false-alarm-corrected recognition was examined, the DF effect was equally strong irrespective of emotion. Additionally, even though subjects rated disgusting pictures as more arousing and negative than the other picture categories, logistic regression at the item level showed that the effect of disgust on recognition memory was stronger than the effect of arousal or valence. On the neural level, ROI analyses (with valence and arousal covariates) revealed that correctly recognized disgusting stimuli evoked the highest activity in the left amygdala compared to all other categories. This structure was also more activated for remembered vs. forgotten stimuli, but only in the case of disgust- or fear-eliciting pictures.
Our findings, despite several limitations, suggest that disgust has a special salience in memory relative to other negative emotions, one that cannot be attributed to differences in arousal or valence. The current results thereby support the suggestion that a purely dimensional model of emotional influences on cognition might not be adequate to account for the observed effects.
The behavioral differentiation of positive emotions has recently been studied in terms of their discrete adaptive functions or appraisal profiles. Some preliminary neurophysiological evidence has been found with electroencephalography or autonomic nervous system measurements such as heart rate and skin conductance. However, the brain's hemodynamic responses to different positive emotions remain largely unknown. In the present study, the functional near-infrared spectroscopy (fNIRS) technique was employed. With this tool, we report for the first time recognizable discrete positive emotions using fNIRS signals. Thirteen participants watched 30 emotional video clips to elicit 10 typical kinds of positive emotions (joy, gratitude, serenity, interest, hope, pride, amusement, inspiration, awe, and love), and their frontal neural activities were simultaneously recorded with a 24-channel fNIRS system. The multidimensional scaling analysis of participants' subjective ratings on these 10 positive emotions revealed three distinct clusters, which could be interpreted as "playfulness" for amusement, joy, interest, "encouragement" for awe, gratitude, hope, inspiration, pride, and "harmony" for love, serenity. Hemodynamic responses to these three positive emotion clusters showed distinct patterns, and HbO-based individual-level binary classifications between them achieved an average accuracy of 73.79 ± 11.49% (77.56 ± 7.39% for encouragement vs. harmony, 73.29 ± 11.87% for playfulness vs. harmony, and 70.51 ± 13.96% for encouragement vs. playfulness). Benefiting from fNIRS's high portability, low running cost, and relative robustness against motion and electrical artifacts, our findings provide support for implementing a more fine-grained emotion recognition system with subdivided positive emotion categories.
Here we present a dataset with a description of portrayed emotions in the movie "Forrest Gump". A total of 12 observers independently annotated emotional episodes regarding their temporal location and duration. The nature of an emotion was characterized with basic attributes, such as arousal and valence, as well as explicit emotion category labels. In addition, annotations include a record of the perceptual evidence for the presence of an emotion. Two variants of the movie were annotated separately: 1) an audio-movie version of Forrest Gump that has been used as a stimulus for the acquisition of a large public functional brain imaging dataset, and 2) the original audio-visual movie. We present reliability and consistency estimates that suggest that both stimuli can be used to study visual and auditory emotion cue processing in real-life-like situations. Raw annotations from all observers are publicly released in full in order to maximize their utility for a wide range of applications and possible future extensions. In addition, aggregate time series of inter-observer agreement with respect to particular attributes of portrayed emotions are provided to facilitate adoption of these data.
The body is closely tied to the processing of social and emotional information. The purpose of this study was to determine whether a relationship exists between emotions and social attitudes conveyed through gestures. Thus, we tested the effect of pro-social (i.e., happy face) and anti-social (i.e., angry face) emotional primes on the ability to detect socially relevant hand postures (i.e., pictures depicting an open/closed hand). In particular, participants were required to establish, as quickly as possible, whether the test stimulus (i.e., a hand posture) was the same as or different from the reference stimulus (i.e., a hand posture) previously displayed on the computer screen. Results show that facial primes, displayed between the reference and the test stimuli, influence the recognition of hand postures according to the social attitude implicitly related to the stimulus. We found that perception of the pro-social (i.e., happy face) prime resulted in slower RTs in detecting the open hand posture as compared to the closed hand posture. Vice versa, perception of the anti-social (i.e., angry face) prime resulted in slower RTs in detecting the closed hand posture compared to the open hand posture. These results suggest that the social attitude implicitly conveyed by the displayed stimuli might represent the conceptual link between emotions and gestures.
We investigated how technologically mediating two different components of emotion (communicative expression and physiological state) to group members affects physiological linkage and self-reported feelings in a small group during video viewing. Across conditions, the availability of a second-screen text chat (communicative expression) and a visualization of group-level heart rates and their dyadic linkage (physiological state) were varied. Within this four-person group, two participants formed a physically co-located dyad and the other two were individually situated in two separate rooms. We found that text chat always increased heart rate synchrony, but the HR visualization did so only for non-co-located dyads. We also found that physiological linkage was strongly connected to self-reported social presence. The results encourage further exploration of the possibilities of sharing group members' physiological components of emotion by technological means to enhance mediated communication and strengthen social presence.
People can speak, and this provides opportunities to analyze human emotions using perceived experiences communicated via language, as well as through measurement and imaging techniques that are also applicable to other higher animal species. Here I compare four qualitative methodological approaches to test if, and how, thrill depends on fear. I use eight high-risk, high-skill, real-life outdoor adventure recreation activities to provide the test circumstances. I present data from: >4000 person-days of participant observation; interviews with 40 expert practitioners; retrospective autoethnography of 50 critical incidents over 4 decades; and experimental autoethnography of 60 events. Results from different methods are congruent, but different approaches yield different insights. The principal findings are as follows. Individuals differ in their fear and thrill responses. The same individual may have different responses on different occasions. Fear boosts performance, but panic causes paralysis. Anxiety or apprehension prior to a risky action or event differs from fear experienced during the event itself. The intensity of pre-event fear generally increases with the immediacy of risk to life, and time to contemplate that risk. Fear must be faced, assessed and overcome in order to act. Thrill can occur either during or after a high-risk event. Thrill can occur without fear, and fear without thrill. Below a lower threshold of perceived risk, thrill can occur without fear. Between a lower and upper threshold, thrill increases with fear. Beyond the upper threshold, thrill vanishes but fear remains. Thus there is a sawtooth relation between fear and thrill. Perceived danger generates intense focus and awareness. Fear and other emotions can disappear during intense concentration and focus. Under high risk, the usual emotional sequence is fear before the action or event, then focus during the action or event, then thrill, relief, or triumph afterward.
The emotionless state persists only during the most intense concentration. For events long enough to differentiate time within the events, fear and thrill can arise and fade in different fine-scale sequences.
Affectively salient stimuli are capable of capturing attentional resources, which allows the brain to change the current course of action in order to respond to potentially advantageous or threatening stimuli. Here, we investigated the behavioral and cerebral impact of peripherally presented affective stimuli on the subsequent processing of foveal information. To this end, we recorded whole-head magnetoencephalograms from 12 participants while they made speeded responses to the direction of left- or right-oriented arrows presented foveally at fixation. Each arrow was preceded by a peripherally presented pair of pictures, one emotional (unpleasant or pleasant) and one neutral. Paired pictures were presented at 12° of eccentricity to the left and right of a central fixation cross. We observed that participants responded more quickly when the orientation of the arrow was congruent with the location of the previously presented emotional scene. Results show that non-predictive emotional information in peripheral vision interferes with subsequent responses to foveally presented targets. Importantly, this behavioral effect was correlated with an early (∼135 msec) increase of left fronto-central activity for the emotionally congruent combination, whose cerebral sources were notably located in the left orbitofrontal cortex. We therefore suggest that the prior spatial distribution of emotional salience, like physical salience, grabs attentional resources and modifies performance in the center of the visual field. Thus, these data shed light on the neurobehavioral correlates of the emotional coding of visual space.
Until recently, research in animal welfare science has mainly focused on negative experiences like pain and suffering, often neglecting the importance of assessing and promoting positive experiences. In rodents, specific facial expressions have been found to occur in situations thought to induce negatively valenced emotional states (e.g., pain, aggression and fear), but none have yet been identified for positive states. Thus, this study aimed to investigate whether facial expressions indicative of a positive emotional state are exhibited in rats. Adolescent male Lister Hooded rats (Rattus norvegicus, N = 15) were individually subjected to a Positive and a mildly aversive Contrast Treatment over two consecutive days in order to induce contrasting emotional states and to detect differences in facial expression. The Positive Treatment consisted of playful manual tickling administered by the experimenter, while the Contrast Treatment consisted of exposure to a novel test room with intermittent bursts of white noise. The number of positive ultrasonic vocalisations was greater in the Positive Treatment than in the Contrast Treatment, indicating the experience of differentially valenced states in the two treatments. The main findings were that Ear Colour became significantly pinker and Ear Angle was wider (ears more relaxed) in the Positive Treatment compared to the Contrast Treatment. All other quantitative and qualitative measures of facial expression, which included Eyeball height to width Ratio, Eyebrow height to width Ratio, Eyebrow Angle, visibility of the Nictitating Membrane, and the established Rat Grimace Scale, did not show differences between treatments. This study contributes to the exploration of positive emotional states, and thus good welfare, in rats, as it identified the first facial indicators of positive emotions following a positive heterospecific play treatment.
Furthermore, it provides improvements to the photography technique and image analysis for the detection of fine differences in facial expression, and also adds to the refinement of the tickling procedure.
We investigated the cortical representation of emotional prosody in normal-hearing listeners using functional near-infrared spectroscopy (fNIRS) and behavioural assessments. Consistent with previous reports, listeners relied most heavily on F0 cues when recognizing emotion cues; performance was relatively poor, and highly variable between listeners, when only intensity and speech-rate cues were available. Using fNIRS to image cortical activity to speech utterances containing natural and reduced prosodic cues, we found right superior temporal gyrus (STG) to be most sensitive to emotional prosody, but no emotion-specific cortical activations, suggesting that, while fNIRS might be suited to investigating cortical mechanisms supporting speech processing, it is less suited to investigating cortical haemodynamic responses to individual vocal emotions. Manipulating emotional speech to render F0 cues less informative, we found the amplitude of the haemodynamic response in right STG to be significantly correlated with listeners' abilities to recognise vocal emotions with uninformative F0 cues. Specifically, listeners more able to assign emotions to speech with degraded F0 cues showed lower haemodynamic responses to these degraded signals. This suggests a potential objective measure of behavioural sensitivity to vocal emotions that might benefit neurodiverse populations less sensitive to emotional prosody, or hearing-impaired listeners, many of whom rely on listening technologies such as hearing aids and cochlear implants, neither of which restores, and both of which often further degrade, the F0 cues essential to parsing the emotional prosody conveyed in speech.
Healthy cities continuously attempt to improve residents' health. Health is affected by psychological factors, such as happiness and emotions. Therefore, this study investigates the effects of healthy city program performance on individuals' emotions, as well as the correlation between healthy city program performance and emotions, using the personal happiness index as a parameter. We conducted a questionnaire survey of residents in areas implementing healthy city projects. A total of 596 responses were obtained. We used structural equation modeling to analyze the structural relationships among these influences. Results showed that healthy city program performance had significant positive effects on emotion. This observation shows that healthy city programs decrease local residents' negative emotions, such as stress and depression. Therefore, healthy city programs stabilize residents' emotions by increasing health friendliness. To improve the performance of healthy city programs, it is necessary to mitigate health risk factors and positively affect individuals' emotions.