Searching across hundreds of databases

This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.

Page 1: showing papers 1–15 of 15.

What makes music memorable? Relationships between acoustic musical features and music-evoked emotions and memories in older adults.

  • Ilja Salakka et al.
  • PloS one
  • 2021

Music has a unique capacity to evoke both strong emotions and vivid autobiographical memories. Previous music information retrieval (MIR) studies have shown that the emotional experience of music is influenced by a combination of musical features, including tonal, rhythmic, and loudness features. Here, our aim was to explore the relationship between music-evoked emotions and music-evoked memories and how musical features (derived with MIR) can predict them both.
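
The study's exact pipeline is not given in the abstract, but the general idea of predicting ratings from computationally extracted musical features can be sketched as follows. This is a minimal illustration using librosa in place of a dedicated MIR toolbox; the file names and rating values are hypothetical placeholders, not the study's data.

    # Sketch: summarize each excerpt with a few MIR features, then
    # regress mean listener ratings on them. Paths/ratings are made up.
    import numpy as np
    import librosa
    from sklearn.linear_model import LinearRegression

    def mir_features(path):
        y, sr = librosa.load(path, sr=None)
        return [
            librosa.feature.spectral_centroid(y=y, sr=sr).mean(),  # brightness
            librosa.feature.spectral_flatness(y=y).mean(),         # noisiness
            librosa.feature.rms(y=y).mean(),                       # loudness proxy
        ]

    excerpts = ["excerpt01.wav", "excerpt02.wav", "excerpt03.wav"]  # hypothetical
    valence = np.array([4.2, 3.1, 5.0])          # hypothetical mean ratings

    X = np.array([mir_features(p) for p in excerpts])
    model = LinearRegression().fit(X, valence)
    print("in-sample R^2:", model.score(X, valence))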


Decoding Individual differences and musical preference via music-induced movement.

  • Yudhik Agrawal et al.
  • Scientific reports
  • 2022

Movement is a universal response to music, with dance often taking place in social settings. Although previous work has suggested that socially relevant information, such as personality and gender, is encoded in dance movement, the generalizability of that work is limited. The current study aims to decode dancers' gender, personality traits, and music preference from music-induced movements. We propose a method that predicts such individual differences from free dance movements, and demonstrate its robustness by using two datasets collected with different musical stimuli. In addition, we introduce a novel measure to explore the relative importance of different joints in predicting individual differences. Results demonstrated near-perfect classification of gender, and notably high prediction of personality and music preferences. Furthermore, the learned models generalized across datasets, highlighting the importance of certain joints in intrinsic movement patterns specific to individual differences. The results further support theories of embodied music cognition and the role of bodily movement in musical experiences by demonstrating the influence of gender, personality, and music preferences on embodied responses to heard music.
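
As a rough illustration of this style of decoding (not the authors' model), the sketch below classifies a binary label from movement descriptors with a random forest, reporting cross-validated accuracy and feature importances as a crude analogue of the joint-importance measure. All arrays are synthetic stand-ins.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 40))        # dancers x movement descriptors (synthetic)
    y = rng.integers(0, 2, size=60)      # e.g. gender labels (synthetic)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    clf.fit(X, y)                        # importances ~ "which joints matter"
    print("top features:", np.argsort(clf.feature_importances_)[::-1][:5])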


Naturalistic music and dance: Cortical phase synchrony in musicians and dancers.

  • Hanna Poikonen et al.
  • PloS one
  • 2018

Expertise in music has been investigated for decades, and the results have been applied not only in composition, performance, and music education, but also in understanding brain plasticity in a larger context. Several studies have revealed strong connections between auditory and motor processes in music listening, performance, and imagination. Recently, as a logical next step from music to movement, the cognitive and affective neurosciences have turned towards expertise in dance. To understand the versatile and overlapping processes evoked by artistic stimuli such as music and dance, it is necessary to study them with continuous naturalistic stimuli. Thus, we presented long excerpts from the contemporary dance piece Carmen, with and without music, to professional dancers, musicians, and laymen in an EEG laboratory. We were interested in the cortical phase synchrony within each participant group over several frequency bands during uni- and multimodal processing. Dancers showed strengthened theta and gamma synchrony during music relative to silence and silent dance, whereas the presence of music systematically decreased alpha and beta synchrony in musicians. Laymen were the only group with significant results related to dance. Future studies are required to determine whether these results reflect some other factor (such as familiarity with the stimuli) or reveal a new perspective on dance observation and expertise.
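
A standard way to quantify the phase synchrony the abstract refers to is the phase-locking value (PLV) between band-filtered channel pairs. The sketch below shows that computation on synthetic signals; the band limits and sampling rate are arbitrary choices, not the study's settings.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def plv(x, y):
        # Phase-locking value: magnitude of the mean phase-difference vector.
        dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
        return np.abs(np.mean(np.exp(1j * dphi)))

    fs = 250.0                                          # hypothetical sampling rate (Hz)
    b, a = butter(4, [4, 8], btype="bandpass", fs=fs)   # theta band

    rng = np.random.default_rng(0)
    eeg1, eeg2 = rng.normal(size=(2, 5000))             # two synthetic channels
    print("theta PLV:", plv(filtfilt(b, a, eeg1), filtfilt(b, a, eeg2)))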


From Vivaldi to Beatles and back: predicting lateralized brain responses to music.

  • Vinoo Alluri et al.
  • NeuroImage
  • 2013

We aimed to predict the temporal evolution of brain activity in naturalistic music listening conditions using a combination of neuroimaging and acoustic feature extraction. Participants were scanned using functional Magnetic Resonance Imaging (fMRI) while listening to two musical medleys, including pieces from various genres with and without lyrics. Regression models were built to predict voxel-wise brain activations and were then tested in a cross-validation setting to evaluate the robustness of the resulting models across stimuli. To further assess the generalizability of the models, we extended the cross-validation procedure by including another dataset, which comprised continuous fMRI responses of musically trained participants to an Argentinean tango. Individual models for the two musical medleys revealed that activations in several areas of the brain belonging to the auditory, limbic, and motor regions could be predicted. Notably, activations in the medial orbitofrontal region and the anterior cingulate cortex, relevant for self-referential appraisal and aesthetic judgments, could be predicted successfully. Cross-validation across musical stimuli and participant pools helped identify a region of the right superior temporal gyrus, encompassing the planum polare and Heschl's gyrus, as the core structure processing complex acoustic features of musical pieces from various genres, with or without lyrics. Models based on purely instrumental music were able to predict activation in the bilateral auditory cortices, parietal, somatosensory, and left-hemispheric primary and supplementary motor areas. The presence of lyrics, on the other hand, weakened the prediction of activations in the left superior temporal gyrus. Our results suggest spontaneous emotion-related processing during naturalistic listening to music and provide supportive evidence for the hemispheric specialization for categorical sounds with realistic stimuli. We thus introduce a powerful means to predict brain responses to music, speech, or soundscapes across a large variety of contexts.
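
The encoding logic (predict voxel activity from acoustic features, then validate across stimuli) can be sketched as below. Ridge regression stands in for the paper's regression models, and all data are synthetic placeholders for the two medleys.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    T, F, V = 300, 5, 1000                       # time points, features, voxels
    feat_a, feat_b = rng.normal(size=(2, T, F))  # acoustic features, two medleys
    bold_a, bold_b = rng.normal(size=(2, T, V))  # fMRI responses, two medleys

    # Fit on one medley, predict the other (cross-stimulus validation).
    pred = Ridge(alpha=1.0).fit(feat_a, bold_a).predict(feat_b)

    # Voxel-wise accuracy = correlation between predicted and observed series.
    pz = (pred - pred.mean(0)) / pred.std(0)
    oz = (bold_b - bold_b.mean(0)) / bold_b.std(0)
    print("best predicted voxel r:", (pz * oz).mean(0).max())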


Action in Perception: Prominent Visuo-Motor Functional Symmetry in Musicians during Music Listening.

  • Iballa Burunat et al.
  • PloS one
  • 2015

Musical training leads to sensory and motor neuroplastic changes in the human brain. Motivated by findings on enlarged corpus callosum in musicians and asymmetric somatomotor representation in string players, we investigated the relationship between musical training, callosal anatomy, and interhemispheric functional symmetry during music listening. Functional symmetry was increased in musicians compared to nonmusicians, and in keyboardists compared to string players. This increased functional symmetry was prominent in visual and motor brain networks. Callosal size did not significantly differ between groups except for the posterior callosum in musicians compared to nonmusicians. We conclude that the distinctive postural and kinematic symmetry in instrument playing cross-modally shapes information processing in sensory-motor cortical areas during music listening. This cross-modal plasticity suggests that motor training affects music perception.


Dynamics of brain activity underlying working memory for music in a naturalistic condition.

  • Iballa Burunat et al.
  • Cortex; a journal devoted to the study of the nervous system and behavior
  • 2014

We aimed to determine the functional neuroanatomy of working memory (WM) recognition of musical motifs that occurs while listening to music, adopting a non-standard procedure. Western tonal music provides naturally occurring repetition and variation of motifs. These serve as WM triggers, thus allowing us to study the phenomenon of motif tracking within real music. Using a modern tango as the stimulus, we ran a behavioural test to identify the stimulus motifs and build a time-course regressor of WM neural responses. This regressor was then correlated with the participants' (musicians') functional magnetic resonance imaging (fMRI) signal obtained during a continuous listening condition. To fine-tune the identification of WM processes in the brain, the variance accounted for by the sensory processing of a set of the stimulus' acoustic features was pruned from participants' neurovascular responses to music. Motivic repetitions activated prefrontal and motor cortical areas, basal ganglia, medial temporal lobe (MTL) structures, and the cerebellum. The findings suggest that WM processing of motifs while listening to music emerges from the integration of neural activity distributed over cognitive, motor and limbic subsystems. The recruitment of the hippocampus stands as a novel finding in auditory WM. Effective connectivity and agglomerative hierarchical clustering analyses indicate that hippocampal connectivity is modulated by motif repetitions, showing strong connections with WM-relevant areas (dorsolateral prefrontal cortex, dlPFC; supplementary motor area, SMA; and cerebellum). This supports the role of the hippocampus in the encoding of musical motifs in WM, and may evidence long-term memory (LTM) formation, enabled by the use of a realistic listening condition.
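
The "pruning" step, removing variance explained by acoustic features before correlating the residual with the WM regressor, can be sketched with ordinary least squares. Everything below is synthetic; in the study the regressors came from a behavioural test and MIR feature extraction.

    import numpy as np

    rng = np.random.default_rng(0)
    T = 400
    acoustic = rng.normal(size=(T, 6))   # acoustic feature time courses (synthetic)
    wm_reg = rng.normal(size=T)          # WM (motif repetition) regressor (synthetic)
    voxel = rng.normal(size=T)           # one voxel's fMRI time series (synthetic)

    # Regress the acoustic features out of the voxel signal...
    X = np.column_stack([np.ones(T), acoustic])
    beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
    residual = voxel - X @ beta

    # ...then correlate what remains with the WM regressor.
    print("pruned correlation:", np.corrcoef(residual, wm_reg)[0, 1])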


Influences of rhythm- and timbre-related musical features on characteristics of music-induced movement.

  • Birgitta Burger et al.
  • Frontiers in psychology
  • 2013

Music makes us move. Several factors can affect the characteristics of such movements, including individual factors or musical features. For this study, we investigated the effect of rhythm- and timbre-related musical features, as well as tempo, on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music and instructed to move along with the music. Optical motion capture was used to record participants' movements. Subsequently, eight movement features and four rhythm- and timbre-related musical features were computationally extracted from the data, while the tempo was assessed in a perceptual experiment. A subsequent correlational analysis revealed that, for instance, clear pulses seemed to be embodied with the whole body, i.e., by using various movement types of different body parts, whereas spectral flux and percussiveness were found to be more distinctly related to certain body parts, such as head and hand movement. A series of ANOVAs, with the stimuli divided into three tempo-based groups of five stimuli each, revealed no significant differences between the groups, suggesting that the tempo of our stimulus set failed to have an effect on the movement features. In general, the results can be linked to the framework of embodied music cognition, as they show that body movements are used to reflect, imitate, and predict musical characteristics.
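
The two analyses named in the abstract, feature-feature correlation and a tempo-group ANOVA, are easy to sketch. The numbers below are synthetic, and the equal three-way split is a stand-in for the study's tempo-based grouping.

    import numpy as np
    from scipy.stats import pearsonr, f_oneway

    rng = np.random.default_rng(0)
    pulse_clarity = rng.normal(size=30)               # one value per stimulus (synthetic)
    head_speed = 0.5 * pulse_clarity + rng.normal(scale=0.8, size=30)

    r, p = pearsonr(pulse_clarity, head_speed)
    print(f"pulse clarity vs. head movement: r={r:.2f}, p={p:.3f}")

    # Compare a movement feature across slow/medium/fast stimulus groups.
    slow, medium, fast = np.array_split(head_speed, 3)
    F, p = f_oneway(slow, medium, fast)
    print(f"tempo-group ANOVA: F={F:.2f}, p={p:.3f}")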


Exploring Frequency-Dependent Brain Networks from Ongoing EEG Using Spatial ICA During Music Listening.

  • Yongjie Zhu et al.
  • Brain topography
  • 2020

Recently, exploring brain activity based on functional networks during naturalistic stimuli, especially music and video, has become an attractive challenge because of the low signal-to-noise ratio in collected brain data. Although most efforts to explore the listening brain have been made through functional magnetic resonance imaging (fMRI) or sensor-level electro- or magnetoencephalography (EEG/MEG) techniques, little is known about how neural rhythms are involved in brain network activity under naturalistic stimuli. This study exploited cortical oscillations through analysis of ongoing EEG and musical features during free music listening. We used a data-driven method that combined music information retrieval with spatial Fourier Independent Component Analysis (spatial Fourier-ICA) to probe the interplay between the spatial profiles and the spectral patterns of the brain networks emerging from music listening. Correlation analysis was performed between the time courses of brain networks extracted from the EEG data and musical feature time series extracted from the music stimuli to derive the musical-feature-related oscillatory patterns in the listening brain. We found that the brain networks involved in musical feature processing were frequency-dependent. Musical feature time series, especially fluctuation centroid and key, were associated with increased beta activation in the bilateral superior temporal gyrus. Increased alpha oscillation in the bilateral occipital cortex emerged during music listening, consistent with the hypothesis of alpha functional suppression in task-irrelevant regions. We also observed increased delta-beta oscillatory activity in the prefrontal cortex associated with musical feature processing. Beyond these findings, the proposed method seems valuable for characterizing the large-scale frequency-dependent brain activity engaged in musical feature processing.
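
As a crude stand-in for spatial Fourier-ICA (not the authors' exact algorithm), one can take short-time Fourier amplitudes of each channel and run spatial ICA over them, so that each component couples a spatial map with a spectral-temporal course. All data below are synthetic.

    import numpy as np
    from scipy.signal import stft
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n_ch, fs = 32, 250
    eeg = rng.normal(size=(n_ch, 60 * fs))     # one minute of synthetic EEG

    f, t, Z = stft(eeg, fs=fs, nperseg=fs)     # per-channel spectrogram
    amp = np.abs(Z).reshape(n_ch, -1)          # channels x (freqs * windows)

    ica = FastICA(n_components=5, random_state=0, max_iter=500)
    spatial_maps = ica.fit_transform(amp)      # channels x components
    spectra = ica.mixing_                      # (freqs * windows) x components
    print(spatial_maps.shape, spectra.shape)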


Hunting for the beat in the body: on period and phase locking in music-induced movement.

  • Birgitta Burger et al.
  • Frontiers in human neuroscience
  • 2014

Music has the capacity to induce movement in humans. Such responses during music listening are usually spontaneous and range from tapping to full-body dancing. However, it is still unclear how humans embody musical structures to facilitate entrainment. This paper describes two experiments, one dealing with period locking to different metrical levels in full-body movement and its relationship to beat- and rhythm-related musical characteristics, and the other dealing with phase locking in the more constrained condition of sideways swaying motions. In Experiment 1, we expected music with clear and strong beat structures to facilitate more period-locked movement; Experiment 2 was expected to yield a common phase relationship between participants' swaying movements and the musical beat. In both experiments optical motion capture was used to record participants' movements. In Experiment 1, a window-based period-locking probability index related to four metrical levels was established, based on acceleration data in three dimensions. Subsequent correlations between this index and musical characteristics of the stimuli revealed pulse clarity to be related to periodic movement at the tactus level, and low-frequency flux to mediolateral and anteroposterior movement at both tactus and bar levels. At faster tempi, higher metrical levels became more apparent in participants' movement. Experiment 2 showed that about half of the participants exhibited a stable phase relationship between movement and beat, with superior-inferior movement most often synchronized to the tactus level, whereas mediolateral movement was rather synchronized to the bar level. However, the relationship between movement phase and beat locations was not consistent between participants, as the beat locations occurred at different phase angles of their movements. The results imply that entrainment to music is a complex phenomenon, involving the whole body and occurring at different metrical levels.
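
Phase locking of a sway trajectory to the beat can be quantified with circular statistics: take the instantaneous phase of the movement at each beat onset and compute the mean resultant length. The sketch below uses a synthetic sway signal and beat grid, not the study's motion-capture data or its exact index.

    import numpy as np
    from scipy.signal import hilbert

    fs = 120.0                              # hypothetical mocap frame rate
    t = np.arange(0, 30, 1 / fs)
    sway = np.sin(2 * np.pi * 1.0 * t)      # synthetic sway, locked to 60 BPM
    beat_times = np.arange(0, 30, 1.0)      # synthetic beat onsets

    phase = np.angle(hilbert(sway - sway.mean()))
    phases_at_beats = phase[(beat_times * fs).astype(int)]

    # Mean resultant length: 1 = perfect phase locking, 0 = uniform phases.
    R = np.abs(np.mean(np.exp(1j * phases_at_beats)))
    print("phase-locking strength:", round(R, 2))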


Capturing the musical brain with Lasso: Dynamic decoding of musical features from fMRI data.

  • Petri Toiviainen et al.
  • NeuroImage
  • 2014

We investigated neural correlates of musical feature processing with a decoding approach. To this end, we used a method that combines computational extraction of musical features with regularized multiple regression (LASSO). Optimal model parameters were determined by maximizing the decoding accuracy using a leave-one-out cross-validation scheme. The method was applied to functional magnetic resonance imaging (fMRI) data that were collected using a naturalistic paradigm, in which participants' brain responses were recorded while they were continuously listening to pieces of real music. The dependent variables comprised musical feature time series that were computationally extracted from the stimulus. We expected timbral features to obtain a higher prediction accuracy than rhythmic and tonal ones. Moreover, we expected the areas significantly contributing to the decoding models to be consistent with areas of significant activation observed in previous research using a naturalistic paradigm with fMRI. Of the six musical features considered, five could be significantly predicted for the majority of participants. The areas significantly contributing to the optimal decoding models agreed to a great extent with results obtained in previous studies. In particular, areas in the superior temporal gyrus, Heschl's gyrus, Rolandic operculum, and cerebellum contributed to the decoding of timbral features. For the decoding of the rhythmic feature, we found the bilateral superior temporal gyrus, right Heschl's gyrus, and hippocampus to contribute most. The tonal feature, however, could not be significantly predicted, suggesting a higher inter-participant variability in its neural processing. A subsequent classification experiment revealed that segments of the stimulus could be classified from the fMRI data with significant accuracy. The present findings provide compelling evidence for the involvement of the auditory cortex, the cerebellum and the hippocampus in the processing of musical features during continuous listening to music.
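
The decoding direction, predicting a musical feature time series from voxel activity with LASSO and cross-validated regularization, can be sketched as follows. Five-fold CV is used here for brevity in place of the paper's leave-one-out scheme, and all data are synthetic.

    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(0)
    T, V = 200, 500                        # time points, voxels (synthetic)
    bold = rng.normal(size=(T, V))
    w = np.zeros(V); w[:10] = 1.0          # only 10 voxels truly informative
    feature = bold @ w + rng.normal(scale=0.5, size=T)   # e.g. brightness

    model = LassoCV(cv=5).fit(bold, feature)   # alpha picked by CV
    print("selected alpha:", model.alpha_)
    print("voxels with nonzero weight:", int(np.sum(model.coef_ != 0)))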


Fractionating auditory priors: A neural dissociation between active and passive experience of musical sounds.

  • Marina Kliuchko et al.
  • PloS one
  • 2019

Learning, attention and action play a crucial role in determining how stimulus predictions are formed, stored, and updated. Years-long experience with the specific repertoires of sounds of one or more musical styles is what characterizes professional musicians. Here we contrasted active experience with sounds, namely long-lasting motor practice, theoretical study and engaged listening to the acoustic features characterizing a musical style of choice, in professional musicians with the mainly passive experience of sounds in laypersons. We hypothesized that long-term active experience of sounds would influence the neural predictions of stylistic features in professional musicians in a distinct way from the mainly passive experience of sounds in laypersons. Participants with different musical backgrounds were recruited: professional jazz and classical musicians, amateur musicians, and non-musicians. They were presented with a musical multi-feature paradigm eliciting mismatch negativity (MMN), a prediction-error signal to changes in six sound features, during only 12 minutes of electroencephalography (EEG) and magnetoencephalography (MEG) recordings. We observed generally larger MMN amplitudes, indicative of stronger automatic neural signals to violated priors, in jazz musicians (but not in classical musicians) as compared to non-musicians and amateurs. The specific MMN enhancements were found for spectral features (timbre, pitch, slide) and sound intensity. In participants who were not musicians, a higher preference for jazz music was associated with reduced MMN to pitch slide (a feature common in the jazz music style). Our results suggest that long-lasting, active experience of a musical style is associated with accurate neural priors for the sound features of the preferred style, in contrast to passive listening.
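
The MMN itself is simply a difference wave: the average response to deviant sounds minus the average response to standards. A minimal sketch on synthetic epochs (all shapes and values are made up, not the study's recordings):

    import numpy as np

    rng = np.random.default_rng(0)
    n_std, n_dev, n_samp = 500, 100, 300     # epochs x samples (synthetic)
    standards = rng.normal(size=(n_std, n_samp))
    deviants = rng.normal(size=(n_dev, n_samp)) - 0.3  # small added negativity

    mmn = deviants.mean(axis=0) - standards.mean(axis=0)
    peak = mmn.min()                          # MMN is a negative deflection
    print("MMN peak amplitude (a.u.):", round(peak, 2))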


Identifying musical pieces from fMRI data using encoding and decoding models.

  • Sebastian Hoefle et al.
  • Scientific reports
  • 2018

Encoding models can reveal and decode neural representations in the visual and semantic domains. However, a thorough understanding of how distributed information in the auditory cortices and the temporal evolution of music contribute to model performance is still lacking in the musical domain. We measured fMRI responses during naturalistic music listening and constructed a two-stage approach that first mapped musical features in the auditory cortices and then decoded novel musical pieces. We then probed the influence of stimulus duration (number of time points) and spatial extent (number of voxels) on decoding accuracy. Our approach revealed a linear increase in accuracy with duration and a point of optimal model performance for the spatial extent. We further showed that Shannon entropy is a driving factor, boosting accuracy up to 95% for music with the highest information content. These findings provide key insights for future decoding and reconstruction algorithms and open new avenues for possible clinical applications.
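
The Shannon entropy the authors point to can be estimated from a signal's amplitude histogram. The bin count below is an arbitrary choice, and the paper's exact information measure may differ.

    import numpy as np

    def shannon_entropy(signal, bins=32):
        # Entropy (bits) of the signal's amplitude distribution.
        counts, _ = np.histogram(signal, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    rng = np.random.default_rng(0)
    tone = np.sin(np.linspace(0, 100, 10_000))   # low information content
    noise = rng.normal(size=10_000)              # high information content
    print("tone: ", round(shannon_entropy(tone), 2), "bits")
    print("noise:", round(shannon_entropy(noise), 2), "bits")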


Large-scale brain networks emerge from dynamic processing of musical timbre, key and rhythm.

  • Vinoo Alluri et al.
  • NeuroImage
  • 2012

We investigated the neural underpinnings of timbral, tonal, and rhythmic features of a naturalistic musical stimulus. Participants were scanned with functional Magnetic Resonance Imaging (fMRI) while listening to a stimulus with a rich musical structure, a modern tango. We correlated temporal evolutions of timbral, tonal, and rhythmic features of the stimulus, extracted using acoustic feature extraction procedures, with the fMRI time series. Results corroborate those obtained with controlled stimuli in previous studies and highlight additional areas recruited during musical feature processing. While timbral feature processing was associated with activations in cognitive areas of the cerebellum, and sensory and default mode network cerebrocortical areas, musical pulse and tonality processing recruited cortical and subcortical cognitive, motor and emotion-related circuits. In sum, by combining neuroimaging, acoustic feature extraction and behavioral methods, we revealed the large-scale cognitive, motor and limbic brain circuitry dedicated to acoustic feature processing during listening to a naturalistic stimulus. In addition to these novel findings, our study has practical relevance as it provides a powerful means to localize neural processing of individual acoustical features, be it those of music, speech, or soundscapes, in ecological settings.
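
The core step, correlating computationally extracted feature time courses with fMRI time series, usually involves first convolving the feature with a canonical haemodynamic response. The sketch below uses the common double-gamma HRF approximation on synthetic data; the TR and amplitudes are made up.

    import numpy as np
    from scipy.stats import gamma

    TR = 2.0                                        # hypothetical repetition time (s)
    t = np.arange(0, 30, TR)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6    # canonical double-gamma HRF

    rng = np.random.default_rng(0)
    feature = rng.normal(size=240)                  # e.g. brightness per TR (synthetic)
    predictor = np.convolve(feature, hrf)[: feature.size]

    voxel = 0.8 * predictor + rng.normal(size=240)  # synthetic voxel signal
    print("feature-voxel r:", round(np.corrcoef(predictor, voxel)[0, 1], 2))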


Early auditory processing in musicians and dancers during a contemporary dance piece.

  • Hanna Poikonen et al.
  • Scientific reports
  • 2016

The neural responses to simple tones and short sound sequences have been studied extensively. However, in reality the sounds surrounding us are spectrally and temporally complex, dynamic and overlapping. Thus, research using natural sounds is crucial for understanding the operation of the brain in its natural environment. Music is an excellent example of natural stimulation which, in addition to sensory responses, elicits vast cognitive and emotional processes in the brain. Here we show that the preattentive P50 response evoked by rapid increases in timbral brightness during continuous music is enhanced in dancers when compared to musicians and laymen. In dance, fast changes in brightness are often emphasized with a significant change in movement. In addition, the auditory N100 and P200 responses are suppressed and sped up in dancers, musicians and laymen when the music is accompanied by a dance choreography. These results were obtained with a novel event-related potential (ERP) method for natural music. They suggest that we can begin studying the brain with long pieces of natural music using the ERP method of electroencephalography (EEG), as has already been done with functional magnetic resonance imaging (fMRI), with these two brain imaging methods complementing each other.
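
The paper's ERP method for continuous music is not spelled out in the abstract; a generic sketch of its core idea, epoching and averaging EEG around rapid feature increases, is shown below. The threshold, sampling rate, and signals are all made up.

    import numpy as np

    fs = 500                                    # hypothetical sampling rate (Hz)
    rng = np.random.default_rng(0)
    eeg = rng.normal(size=fs * 300)             # 5 min of one-channel EEG (synthetic)
    brightness = rng.normal(size=fs * 300)      # spectral-centroid course (synthetic)

    # Events: samples where brightness jumps sharply (arbitrary threshold).
    jumps = np.flatnonzero(np.diff(brightness) > 3.0)
    jumps = jumps[(jumps > fs // 2) & (jumps < eeg.size - fs)]

    # Epoch -100..500 ms around each event; averaging yields the ERP.
    pre, post = int(0.1 * fs), int(0.5 * fs)
    erp = np.stack([eeg[j - pre : j + post] for j in jumps]).mean(axis=0)
    print(f"{jumps.size} epochs, ERP of {erp.size} samples")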


Group analysis of ongoing EEG data based on fast double-coupled nonnegative tensor decomposition.

  • Xiulin Wang et al.
  • Journal of neuroscience methods
  • 2020

Ongoing EEG data are recorded as mixtures of stimulus-elicited EEG, spontaneous EEG, and noise, which require advanced signal processing techniques for separation and analysis. Existing methods cannot simultaneously consider common and individual characteristics among/within subjects when extracting stimulus-elicited brain activities from ongoing EEG elicited by 512-s-long modern tango music.
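
The underlying tool here is nonnegative tensor decomposition. A minimal sketch with tensorly's plain nonnegative PARAFAC is shown below; the paper's fast double-coupled variant adds common/individual coupling across subjects, which this sketch does not implement, and the tensor is synthetic.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import non_negative_parafac

    rng = np.random.default_rng(0)
    # Hypothetical tensor: subjects x channels x time-frequency points.
    tensor = tl.tensor(rng.random((10, 32, 200)))

    # Rank-5 nonnegative CP model: each component couples a subject
    # loading, a spatial map, and a temporal-spectral course.
    weights, factors = non_negative_parafac(tensor, rank=5, n_iter_max=200)
    for name, fac in zip(["subjects", "channels", "time"], factors):
        print(name, fac.shape)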


  1. SciCrunch.org Resources

    Welcome to the FDI Lab - SciCrunch.org Resources search. From here you can search through a compilation of resources used by FDI Lab - SciCrunch.org and see how data is organized within our community.

  2. Navigation

    You are currently on the Community Resources tab, looking through categories and sources that FDI Lab - SciCrunch.org has compiled. You can navigate through those categories from here or switch to a different tab to run your search against. Each tab gives a different perspective on the data.

  3. Logging in and Registering

    If you have an account on FDI Lab - SciCrunch.org, you can log in from here to get additional features, such as Collections, Saved Searches, and managing Resources.

  4. Searching

    Here is the search term being executed; you can type in anything you want to search for. Some tips to help with searching (see the example queries after this list):

    1. Use quotes around phrases you want to match exactly
    2. You can manually AND and OR terms to change how we search between words
    3. You can add "-" to terms to make sure no results return with that term in them (e.g., Cerebellum -CA1)
    4. You can add "+" to terms to require that they be in the data
    5. Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

    You can save any searches you perform from here for quick access later.

  6. Query Expansion

    We recognized your search term and included synonyms and inferred terms alongside it to help you find the data you are looking for.

  7. Collections

    If you are logged into FDI Lab - SciCrunch.org, you can add data records to your collections to create custom spreadsheets across multiple sources of data.

  8. Facets

    Here are the facets that you can filter your papers by.

  9. Options

    From here we'll present any options for the literature, such as exporting your current results.

  10. Further Questions

    If you have any further questions, please check out our FAQs page to ask questions and see our tutorials. Click this button to view this tutorial again.
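
For example, the following hypothetical queries combine the search tips from step 4:

    "working memory" music     (exact phrase plus a loose keyword)
    music AND fMRI             (both terms must match)
    music OR dance             (either term may match)
    +music -dance              (require "music", exclude "dance")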

Publications Per Year

[Chart: number of publications per year, with axes Year and Count.]