Searching across hundreds of databases



This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.


Page 1: showing papers 1-20 of 205.

Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories.

  • Scott C Neu et al.
  • Frontiers in neuroinformatics
  • 2012

Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead.


Neuroimaging, nutrition, and iron-related genes.

  • Neda Jahanshad et al.
  • Cellular and molecular life sciences : CMLS
  • 2013

Several dietary factors and their genetic modifiers play a role in neurological disease and affect the human brain. The structural and functional integrity of the living brain can be assessed using neuroimaging, enabling large-scale epidemiological studies to identify factors that help or harm the brain. Iron is one nutritional factor that comes entirely from our diet, and its storage and transport in the body are under strong genetic control. In this review, we discuss how neuroimaging can help to identify associations between brain integrity, genetic variations, and dietary factors such as iron. We also review iron's essential role in cognition, and we note some challenges and confounds involved in interpreting links between diet and brain health. Finally, we outline some recent discoveries regarding the genetics of iron and its effects on the brain, suggesting the promise of neuroimaging in revealing how dietary factors affect the brain.


The LONI Debabeler: a mediator for neuroimaging software.

  • Scott C Neu et al.
  • NeuroImage
  • 2005

Brain image analysis often involves processing neuroimaging data with different software packages. Using different software packages together requires exchanging files between them; the output files of one package are used as input files to the next package in the processing sequence. File exchanges become problematic when different packages use different file formats or different conventions within the same file format. Although comprehensive medical image file formats have been developed, no one format exists that satisfies the needs of analyses that involve multiple processing algorithms. The LONI Debabeler acts as a mediator between neuroimaging software packages by automatically using an appropriate file translation to convert files between each pair of linked packages. These translations are built and edited using the Debabeler graphical interface and compensate for package-dependent variations that result in intrapackage incompatibilities. The Debabeler gives neuroimaging processing environments a configurable automaton for file translation and provides users a flexible application for developing robust solutions to translation problems.


Is it time to re-prioritize neuroimaging databases and digital repositories?

  • John Darrell Van Horn et al.
  • NeuroImage
  • 2009

The development of in vivo brain imaging has led to the collection of large quantities of digital information. In any individual research article, several tens of gigabytes' worth of data may be represented, collected across normal and patient samples. With the ease of collecting such data, there is increased desire for brain imaging datasets to be openly shared through sophisticated databases. However, very often the raw and pre-processed versions of these data are not available to researchers outside of the team that collected them. A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Though early sociological and technical concerns have been addressed, they have not been ameliorated altogether for many in the field. In this article, we review the progress made in neuroimaging databases, their role in data sharing, data management, potential for the construction of brain atlases, recording data provenance, and value for re-analysis, new publication, and training. We feature the LONI IDA as an example of an archive being used as a source for brain atlas workflow construction, list several instances of other successful uses of image databases, and comment on archive sustainability. Finally, we suggest that, given these developments, now is the time for the neuroimaging community to re-prioritize large-scale databases as a valuable component of brain imaging science.


Fast and accurate modelling of longitudinal and repeated measures neuroimaging data.

  • Bryan Guillaume et al.
  • NeuroImage
  • 2014

Despite the growing importance of longitudinal data in neuroimaging, the standard analysis methods make restrictive or unrealistic assumptions (e.g., assumption of Compound Symmetry--the state of all equal variances and equal correlations--or spatially homogeneous longitudinal correlations). While some new methods have been proposed to more accurately account for such data, these methods are based on iterative algorithms that are slow and failure-prone. In this article, we propose the use of the Sandwich Estimator method which first estimates the parameters of interest with a simple Ordinary Least Square model and second estimates variances/covariances with the "so-called" Sandwich Estimator (SwE) which accounts for the within-subject correlation existing in longitudinal data. Here, we introduce the SwE method in its classic form, and we review and propose several adjustments to improve its behaviour, specifically in small samples. We use intensive Monte Carlo simulations to compare all considered adjustments and isolate the best combination for neuroimaging data. We also compare the SwE method to other popular methods and demonstrate its strengths and weaknesses. Finally, we analyse a highly unbalanced longitudinal dataset from the Alzheimer's Disease Neuroimaging Initiative and demonstrate the flexibility of the SwE method to fit within- and between-subject effects in a single model. Software implementing this SwE method has been made freely available at http://warwick.ac.uk/tenichols/SwE.
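The two-step procedure this abstract describes (ordinary least squares for the parameters, then a sandwich estimator for the variances) can be sketched on synthetic longitudinal data. This is a minimal illustration of the classic SwE idea using NumPy, not the authors' SwE toolbox; all data and dimensions here are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic longitudinal data: 20 subjects, 3 visits each, with a
# shared subject-level intercept inducing within-subject correlation.
n_sub, n_vis = 20, 3
time = np.tile(np.arange(n_vis), n_sub).astype(float)
subj = np.repeat(np.arange(n_sub), n_vis)
y = (1.0 + 0.5 * time
     + np.repeat(rng.normal(0, 1, n_sub), n_vis)  # subject effect
     + rng.normal(0, 0.5, n_sub * n_vis))         # visit noise

# Step 1: ordinary least squares for the parameters of interest.
X = np.column_stack([np.ones_like(time), time])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Step 2: classic sandwich estimator. The "bread" is (X'X)^-1; the
# "meat" sums outer products of each subject's residual block, which
# is what absorbs the within-subject correlation.
bread = np.linalg.inv(X.T @ X)
meat = np.zeros((2, 2))
for i in range(n_sub):
    Xi, ei = X[subj == i], resid[subj == i]
    meat += Xi.T @ np.outer(ei, ei) @ Xi
cov_swe = bread @ meat @ bread
se = np.sqrt(np.diag(cov_swe))
print("slope estimate:", beta[1], "robust SE:", se[1])
```

The small-sample adjustments the paper studies would modify the residuals entering the "meat" term; this sketch shows only the unadjusted form.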


Multi-source feature learning for joint analysis of incomplete multiple heterogeneous neuroimaging data.

  • Lei Yuan et al.
  • NeuroImage
  • 2012

Analysis of incomplete data is a big challenge when integrating large-scale brain imaging datasets from different imaging modalities. In the Alzheimer's Disease Neuroimaging Initiative (ADNI), for example, over half of the subjects lack cerebrospinal fluid (CSF) measurements; an independent half of the subjects do not have fluorodeoxyglucose positron emission tomography (FDG-PET) scans; many lack proteomics measurements. Traditionally, subjects with missing measures are discarded, resulting in a severe loss of available information. In this paper, we address this problem by proposing an incomplete Multi-Source Feature (iMSF) learning method where all the samples (with at least one available data source) can be used. To illustrate the proposed approach, we classify patients from the ADNI study into groups with Alzheimer's disease (AD), mild cognitive impairment (MCI) and normal controls, based on the multi-modality data. At baseline, ADNI's 780 participants (172 AD, 397 MCI, 211 NC) have at least one of four data types: magnetic resonance imaging (MRI), FDG-PET, CSF and proteomics. These data are used to test our algorithm. Depending on the problem being solved, we divide our samples according to the availability of data sources, and we learn shared sets of features with state-of-the-art sparse learning methods. To build a practical and robust system, we construct a classifier ensemble by combining our method with four other methods for missing value estimation. Comprehensive experiments with various parameters show that our proposed iMSF method and the ensemble model yield stable and promising results.


Structural neuroimaging biomarkers for obsessive-compulsive disorder in the ENIGMA-OCD consortium: medication matters.

  • Willem B Bruin et al.
  • Translational psychiatry
  • 2020

No diagnostic biomarkers are available for obsessive-compulsive disorder (OCD). Here, we aimed to identify magnetic resonance imaging (MRI) biomarkers for OCD, using 46 data sets with 2304 OCD patients and 2068 healthy controls from the ENIGMA consortium. We performed machine learning analysis of regional measures of cortical thickness, surface area and subcortical volume and tested classification performance using cross-validation. Classification performance for OCD vs. controls using the complete sample with different classifiers and cross-validation strategies was poor. When models were validated on data from other sites, model performance did not exceed chance-level. In contrast, fair classification performance was achieved when patients were grouped according to their medication status. These results indicate that medication use is associated with substantial differences in brain anatomy that are widely distributed, and indicate that clinical heterogeneity contributes to the poor performance of structural MRI as a disease marker.


Support vector machine classification of major depressive disorder using diffusion-weighted neuroimaging and graph theory.

  • Matthew D Sacchet et al.
  • Frontiers in psychiatry
  • 2015

Recently, there has been considerable interest in understanding brain networks in major depressive disorder (MDD). Neural pathways can be tracked in the living brain using diffusion-weighted imaging (DWI); graph theory can then be used to study properties of the resulting fiber networks. To date, global abnormalities have not been reported in tractography-based graph metrics in MDD, so we used a machine learning approach based on "support vector machines" to differentiate depressed from healthy individuals based on multiple brain network properties. We also assessed how important specific graph metrics were for this differentiation. Finally, we conducted a local graph analysis to identify abnormal connectivity at specific nodes of the network. We were able to classify depression using whole-brain graph metrics. Small-worldness was the most useful graph metric for classification. The right pars orbitalis, right inferior parietal cortex, and left rostral anterior cingulate all showed abnormal network connectivity in MDD. This is the first use of structural global graph metrics to classify depressed individuals. These findings highlight the importance of future research to understand network properties in depression across imaging modalities, improve classification results, and relate network alterations to psychiatric symptoms, medication, and comorbidities.
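The classify-then-validate design described above (whole-brain graph metrics as features, a support vector machine as the classifier, cross-validation to estimate accuracy) can be sketched with scikit-learn. The data below are synthetic stand-ins: in the actual study the features would be graph metrics (e.g. small-worldness) computed from DWI tractography networks, and the group shift used here is an arbitrary assumption for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic per-subject feature vectors standing in for whole-brain
# graph metrics; patients get a small mean shift on every metric.
n_per_group, n_metrics = 50, 5
controls = rng.normal(0.0, 1.0, (n_per_group, n_metrics))
patients = rng.normal(0.4, 1.0, (n_per_group, n_metrics))
X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

# Linear SVM with feature scaling, evaluated by 5-fold
# cross-validation to estimate out-of-sample accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```

Feature importance in such a model can then be read from the linear SVM's coefficients, which is how a metric like small-worldness can be ranked as most useful.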


ENIGMA-anxiety working group: Rationale for and organization of large-scale neuroimaging studies of anxiety disorders.

  • Janna Marie Bas-Hoogendam et al.
  • Human brain mapping
  • 2022

Anxiety disorders are highly prevalent and disabling but seem particularly tractable to investigation with translational neuroscience methodologies. Neuroimaging has informed our understanding of the neurobiology of anxiety disorders, but research has been limited by small sample sizes and low statistical power, as well as heterogenous imaging methodology. The ENIGMA-Anxiety Working Group has brought together researchers from around the world, in a harmonized and coordinated effort to address these challenges and generate more robust and reproducible findings. This paper elaborates on the concepts and methods informing the work of the working group to date, and describes the initial approach of the four subgroups studying generalized anxiety disorder, panic disorder, social anxiety disorder, and specific phobia. At present, the ENIGMA-Anxiety database contains information about more than 100 unique samples, from 16 countries and 59 institutes. Future directions include examining additional imaging modalities, integrating imaging and genetic data, and collaborating with other ENIGMA working groups. The ENIGMA consortium creates synergy at the intersection of global mental health and clinical neuroscience, and the ENIGMA-Anxiety Working Group extends the promise of this approach to neuroimaging research on anxiety disorders.


Neuroimaging-AI Endophenotypes of Brain Diseases in the General Population: Towards a Dimensional System of Vulnerability.

  • Junhao Wen et al.
  • medRxiv : the preprint server for health sciences
  • 2023

Disease heterogeneity poses a significant challenge for precision diagnostics in both clinical and sub-clinical stages. Recent work leveraging artificial intelligence (AI) has offered promise to dissect this heterogeneity by identifying complex intermediate phenotypes - herein called dimensional neuroimaging endophenotypes (DNEs) - which subtype various neurologic and neuropsychiatric diseases. We investigate the presence of nine such DNEs derived from independent yet harmonized studies on Alzheimer's disease (AD1-2), autism spectrum disorder (ASD1-3), late-life depression (LLD1-2), and schizophrenia (SCZ1-2), in the general population of 39,178 participants in the UK Biobank study. Phenome-wide associations revealed prominent associations between the nine DNEs and phenotypes related to the brain and other human organ systems. This phenotypic landscape aligns with the SNP-phenotype genome-wide associations, revealing 31 genomic loci associated with the nine DNEs (Bonferroni-corrected P-value < 5×10⁻⁸/9). The DNEs exhibited significant genetic correlations, colocalization, and causal relationships with multiple human organ systems and chronic diseases. A causal effect (odds ratio = 1.25 [1.11, 1.40], P-value = 8.72×10⁻⁴) was established from AD2, characterized by focal medial temporal lobe atrophy, to AD. The nine DNEs and their polygenic risk scores significantly improved the prediction accuracy for 14 systemic disease categories and mortality. These findings underscore the potential of the nine DNEs to identify individuals at a high risk of developing the four brain diseases during preclinical stages for precision diagnostics. All results are publicly available at: http://labs.loni.usc.edu/medicine/.


The ENIGMA Stroke Recovery Working Group: Big data neuroimaging to study brain-behavior relationships after stroke.

  • Sook-Lei Liew et al.
  • Human brain mapping
  • 2022

The goal of the Enhancing Neuroimaging Genetics through Meta-Analysis (ENIGMA) Stroke Recovery working group is to understand brain and behavior relationships using well-powered meta- and mega-analytic approaches. ENIGMA Stroke Recovery has data from over 2,100 stroke patients collected across 39 research studies and 10 countries around the world, comprising the largest multisite retrospective stroke data collaboration to date. This article outlines the efforts taken by the ENIGMA Stroke Recovery working group to develop neuroinformatics protocols and methods to manage multisite stroke brain magnetic resonance imaging, behavioral and demographics data. Specifically, the processes for scalable data intake and preprocessing, multisite data harmonization, and large-scale stroke lesion analysis are described, and challenges unique to this type of big data collaboration in stroke research are discussed. Finally, future directions and limitations, as well as recommendations for improved data harmonization through prospective data collection and data management, are provided.


The Enhancing NeuroImaging Genetics through Meta-Analysis Consortium: 10 Years of Global Collaborations in Human Brain Mapping.

  • Paul M Thompson et al.
  • Human brain mapping
  • 2022

This Special Issue of Human Brain Mapping is dedicated to a 10-year anniversary of the Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium. It reports updates from a broad range of international neuroimaging projects that pool data from around the world to answer fundamental questions in neuroscience. Since ENIGMA was formed in December 2009, the initiative grew into a worldwide effort with over 2,000 participating scientists from 45 countries, and over 50 working groups leading large-scale studies of human brain disorders. Over the last decade, many lessons were learned on how best to pool brain data from diverse sources. Working groups were created to develop methods to analyze worldwide data from anatomical and diffusion magnetic resonance imaging (MRI), resting state and task-based functional MRI, electroencephalography (EEG), magnetoencephalography (MEG), and magnetic resonance spectroscopy (MRS). The quest to understand genetic effects on human brain development and disease also led to analyses of brain scans on an unprecedented scale. Genetic roadmaps of the human cortex were created by researchers worldwide who collaborated to perform statistically well-powered analyses of common and rare genetic variants on brain measures and rates of brain development and aging. Here, we summarize the 31 papers in this Special Issue, covering: (a) technical approaches to harmonize analysis of different types of brain imaging data, (b) reviews of the last decade of work by several of ENIGMA's clinical and technical working groups, and (c) new empirical papers reporting large-scale international brain mapping analyses in patients with substance use disorders, schizophrenia, bipolar disorders, major depression, posttraumatic stress disorder, obsessive compulsive disorder, epilepsy, and stroke.


Association analysis of rare variants near the APOE region with CSF and neuroimaging biomarkers of Alzheimer's disease.

  • Kwangsik Nho et al.
  • BMC medical genomics
  • 2017

The APOE ε4 allele is the most significant common genetic risk factor for late-onset Alzheimer's disease (LOAD). The region surrounding APOE on chromosome 19 has also shown consistent association with LOAD. However, no common variants in the region remain significant after adjusting for APOE genotype. We report a rare variant association analysis of genes in the vicinity of APOE with cerebrospinal fluid (CSF) and neuroimaging biomarkers of LOAD.


Higher homocysteine associated with thinner cortical gray matter in 803 participants from the Alzheimer's Disease Neuroimaging Initiative.

  • Sarah K Madsen et al.
  • Neurobiology of aging
  • 2015

A significant portion of our risk for dementia in old age is associated with lifestyle factors (diet, exercise, and cardiovascular health) that are modifiable, at least in principle. One such risk factor, high homocysteine levels in the blood, is known to increase risk for Alzheimer's disease and vascular disorders. Here, we set out to understand how homocysteine levels relate to 3D surface-based maps of cortical gray matter distribution (thickness, volume, and surface area) computed from brain magnetic resonance imaging in 803 elderly subjects from the Alzheimer's Disease Neuroimaging Initiative data set. Individuals with higher plasma levels of homocysteine had lower gray matter thickness in bilateral frontal, parietal, occipital, and right temporal regions and lower gray matter volumes in left frontal, parietal, temporal, and occipital regions, after controlling for diagnosis, age, and sex and after correcting for multiple comparisons. No significant within-group associations were found in cognitively healthy people, patients with mild cognitive impairment, or patients with Alzheimer's disease. These regional differences in gray matter structure may be useful biomarkers to assess the effectiveness of interventions, such as vitamin B supplements, that aim to prevent homocysteine-related brain atrophy by normalizing homocysteine levels.


Spatial patterns of neuroimaging biomarker change in individuals from families with autosomal dominant Alzheimer's disease: a longitudinal study.

  • Brian A Gordon et al.
  • The Lancet. Neurology
  • 2018

Models of Alzheimer's disease propose a sequence of amyloid β (Aβ) accumulation, hypometabolism, and structural decline that precedes the onset of clinical dementia. These pathological features evolve both temporally and spatially in the brain. In this study, we aimed to characterise where in the brain and when in the course of the disease neuroimaging biomarkers become abnormal.


Neuroimaging-based classification of PTSD using data-driven computational approaches: A multisite big data study from the ENIGMA-PGC PTSD consortium.

  • Xi Zhu et al.
  • NeuroImage
  • 2023

Recent advances in data-driven computational approaches have been helpful in devising tools to objectively diagnose psychiatric disorders. However, current machine learning studies are limited by small homogeneous samples, differing methodologies, and differing imaging collection protocols, which restrict the ability to directly compare and generalize their results. Here we aimed to classify individuals with PTSD versus controls and to assess generalizability using large heterogeneous brain datasets from the ENIGMA-PGC PTSD Working Group.


The LONI QC System: A Semi-Automated, Web-Based and Freely-Available Environment for the Comprehensive Quality Control of Neuroimaging Data.

  • Hosung Kim et al.
  • Frontiers in neuroinformatics
  • 2019

Quantifying, controlling, and monitoring image quality is an essential prerequisite for ensuring the validity and reproducibility of many types of neuroimaging data analyses. Implementation of quality control (QC) procedures is key to ensuring that neuroimaging data are of high quality and valid for subsequent analyses. We introduce the QC system of the Laboratory of Neuro Imaging (LONI): a web-based system featuring a workflow for the assessment of brain imaging data of various modalities and contrasts. The design allows users to anonymously upload imaging data to the LONI-QC system. It then computes an exhaustive set of QC metrics which aids users in performing a standardized QC by generating a range of scalar and vector statistics. These procedures are performed in parallel using a large compute cluster. Finally, the system offers an automated QC procedure for structural MRI, which can flag each QC metric as being 'good' or 'bad.' Validation using various sets of data acquired from a single scanner and from multiple sites demonstrated the reproducibility of our QC metrics, and the sensitivity and specificity of the proposed Auto QC to 'bad' quality images in comparison to visual inspection. To the best of our knowledge, LONI-QC is the first online QC system that uniquely supports the variety of functionality where we compute numerous QC metrics and perform visual/automated image QC of multi-contrast and multi-modal brain imaging data. The LONI-QC system has been used to assess the quality of large neuroimaging datasets acquired as part of various multi-site studies such as the Transforming Research and Clinical Knowledge in Traumatic Brain Injury (TRACK-TBI) Study and the Alzheimer's Disease Neuroimaging Initiative (ADNI). LONI-QC's functionality is freely available to users worldwide, and its adoption by imaging researchers is likely to contribute substantially to upholding high standards of brain image data quality and to implementing these standards across the neuroimaging community.


Effects of traumatic brain injury and posttraumatic stress disorder on Alzheimer's disease in veterans, using the Alzheimer's Disease Neuroimaging Initiative.

  • Michael W Weiner et al.
  • Alzheimer's & dementia : the journal of the Alzheimer's Association
  • 2014

Both traumatic brain injury (TBI) and posttraumatic stress disorder (PTSD) are common problems resulting from military service, and both have been associated with increased risk of cognitive decline and dementia resulting from Alzheimer's disease (AD) or other causes. This study aims to use imaging techniques and biomarker analysis to determine whether TBI and/or PTSD resulting from combat or other traumas increase the risk for AD and decrease cognitive reserve in Veteran subjects, after accounting for age. Using military and Department of Veterans Affairs records, 65 Vietnam War veterans with a history of moderate or severe TBI with or without PTSD, 65 with ongoing PTSD without TBI, and 65 control subjects are being enrolled in this study at 19 sites. The study aims to select subject groups that are comparable in age, gender, ethnicity, and education. Subjects with mild cognitive impairment (MCI) or dementia are being excluded. However, a new study, just beginning and similar in size, will study subjects with TBI, subjects with PTSD, and control subjects with MCI. Baseline measurements of cognition, function, blood, and cerebrospinal fluid biomarkers; magnetic resonance images (structural, diffusion tensor, and resting-state blood-oxygen-level-dependent (BOLD) functional magnetic resonance imaging); and amyloid positron emission tomographic (PET) images with florbetapir are being obtained. One-year follow-up measurements will be collected for most of the baseline procedures, with the exception of the lumbar puncture, the PET imaging, and apolipoprotein E genotyping. To date, 19 subjects with TBI only, 46 with PTSD only, and 15 with TBI and PTSD have been recruited and referred to 13 clinics to undergo the study protocol. It is expected that cohorts will be fully recruited by October 2014. This study is a first step toward the design and statistical powering of an AD prevention trial using at-risk veterans as subjects, and provides the basis for a larger, more comprehensive study of dementia risk factors in veterans.


Increasing participant diversity in AD research: Plans for digital screening, blood testing, and a community-engaged approach in the Alzheimer's Disease Neuroimaging Initiative 4.

  • Michael W Weiner et al.
  • Alzheimer's & dementia : the journal of the Alzheimer's Association
  • 2023

The Alzheimer's Disease Neuroimaging Initiative (ADNI) aims to validate biomarkers for Alzheimer's disease (AD) clinical trials. To improve generalizability, ADNI4 aims to enroll 50-60% of its new participants from underrepresented populations (URPs) using new biofluid and digital technologies. ADNI4 has received funding from the National Institute on Aging beginning September 2022.


Heritability of the network architecture of intrinsic brain functional connectivity.

  • Benjamin Sinclair et al.
  • NeuroImage
  • 2015

The brain's functional network exhibits many features facilitating functional specialization, integration, and robustness to attack. Using graph theory to characterize brain networks, studies demonstrate their small-world, modular, and "rich-club" properties, with deviations reported in many common neuropathological conditions. Here we estimate the heritability of five widely used graph theoretical metrics (mean clustering coefficient (γ), modularity (Q), rich-club coefficient (ϕnorm), global efficiency (λ), small-worldness (σ)) over a range of connection densities (k=5-25%) in a large cohort of twins (N=592, 84 MZ and 89 DZ twin pairs, 246 single twins, age 23 ± 2.5). We also considered the effects of global signal regression (GSR). We found that the graph metrics were moderately influenced by genetic factors h(2) (γ=47-59%, Q=38-59%, ϕnorm=0-29%, λ=52-64%, σ=51-59%) at lower connection densities (≤ 15%), and when global signal regression was implemented, heritability estimates decreased substantially h(2) (γ=0-26%, Q=0-28%, ϕnorm=0%, λ=23-30%, σ=0-27%). Distinct network features were phenotypically correlated (|r|=0.15-0.81), and γ, Q, and λ were found to be influenced by overlapping genetic factors. Our findings suggest that these metrics may be potential endophenotypes for psychiatric disease and suitable for genetic association studies, but that genetic effects must be interpreted with respect to methodological choices.
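The pipeline behind metrics like those above (threshold a connectivity matrix to a fixed connection density, then compute graph metrics on the resulting network) can be sketched with NetworkX. The connectivity matrix here is random toy data, not real fMRI correlations, and the 15% density is chosen only because it falls in the range the abstract examines.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)

# Toy "functional connectivity" matrix for 30 regions; a real analysis
# would use correlations between regional BOLD time series.
n = 30
corr = np.abs(rng.normal(size=(n, n)))
corr = (corr + corr.T) / 2      # symmetrize
np.fill_diagonal(corr, 0)

# Threshold to a fixed connection density (k = 15%), keeping only the
# strongest edges, as in density-based network construction.
density = 0.15
n_edges = int(density * n * (n - 1) / 2)
iu = np.triu_indices(n, 1)
strongest = np.argsort(corr[iu])[::-1][:n_edges]
G = nx.Graph()
G.add_nodes_from(range(n))
G.add_edges_from(zip(iu[0][strongest], iu[1][strongest]))

# Two of the metrics whose heritability the study estimates:
# mean clustering coefficient and global efficiency.
clustering = nx.average_clustering(G)
efficiency = nx.global_efficiency(G)
print("clustering:", clustering, "efficiency:", efficiency)
```

In a twin design, such per-subject metrics would then be passed to a variance-components (ACE) model to partition genetic and environmental influences; that step is outside this sketch.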



Publications Per Year (interactive chart of paper counts by year)