Searching across hundreds of databases

This service exclusively searches for literature that cites resources. Please be aware that the total number of searchable documents is limited to those containing RRIDs and does not include all open-access literature.

Page 1: showing papers 1–20 of 4,212.

Entropy from state probabilities: hydration entropy of cations.

  • Roland G Huber‎ et al.
  • The journal of physical chemistry. B‎
  • 2013‎

Entropy is an important energetic quantity determining the progression of chemical processes. We propose a new approach to obtain hydration entropy directly from probability density functions in state space. We demonstrate the validity of our approach for a series of cations in aqueous solution. Extensive validation of simulation results was performed. Our approach does not make prior assumptions about the shape of the potential energy landscape and is capable of calculating accurate hydration entropy values. Sampling times in the low nanosecond range are sufficient for the investigated ionic systems. Although the presented strategy is at the moment limited to systems for which a scalar order parameter can be derived, this is not a principal limitation of the method. The strategy presented is applicable to any chemical system where sufficient sampling of conformational space is accessible, for example, by computer simulations.
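As a rough illustration of the idea behind this abstract (not the authors' exact method), hydration entropy from state probabilities reduces, in the discrete case, to the Gibbs/Shannon formula S = -k_B Σ p_i ln p_i over a histogram of a scalar order parameter. A minimal sketch, assuming the order-parameter samples are already available:

```python
import numpy as np

def state_entropy(samples, bins=50):
    """Gibbs/Shannon entropy (in units of k_B) of a scalar order
    parameter, estimated from a histogram of sampled states."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()  # drop empty bins, normalize
    return -np.sum(p * np.log(p))

# Example: four equally likely states give entropy ln(4) ≈ 1.386 k_B.
rng = np.random.default_rng(0)
s = state_entropy(rng.uniform(0.0, 1.0, 100_000), bins=4)
```

The histogram bin count plays the role of the state-space discretization; the paper's point is that no assumption about the shape of the underlying energy landscape enters this estimate.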


Disordered enthalpy-entropy descriptor for high-entropy ceramics discovery.

  • Simon Divilov‎ et al.
  • Nature‎
  • 2024‎

The need for improved functionalities in extreme environments is fuelling interest in high-entropy ceramics [1-3]. Except for the computational discovery of high-entropy carbides, performed with the entropy-forming-ability descriptor [4], most innovation has been slowly driven by experimental means [1-3]. Hence, advancement in the field needs more theoretical contributions. Here we introduce the disordered enthalpy-entropy descriptor (DEED), which captures the balance between entropy gains and enthalpy costs, allowing the correct classification of functional synthesizability of multicomponent ceramics, regardless of chemistry and structure. To make our calculations possible, we have developed a convolutional algorithm that drastically reduces computational resources. Moreover, DEED guides the experimental discovery of new single-phase high-entropy carbonitrides and borides. This work, integrated into the AFLOW computational ecosystem, provides an array of potential new candidates, ripe for experimental discoveries.


Entropy-stabilized single-atom Pd catalysts via high-entropy fluorite oxide supports.

  • Haidi Xu‎ et al.
  • Nature communications‎
  • 2020‎

Single-atom catalysts (SACs) have attracted considerable attention in the catalysis community. However, fabricating intrinsically stable SACs on traditional supports (N-doped carbon, metal oxides, etc.) remains a formidable challenge, especially under high-temperature conditions. Here, we report a novel entropy-driven strategy to stabilize single Pd atoms on a high-entropy fluorite oxide, (CeZrHfTiLa)Ox (HEFO), as the support, by a combination of mechanical milling with calcination at 900 °C. Characterization results reveal that single Pd atoms are incorporated into the HEFO (Pd1@HEFO) sublattice by forming stable Pd-O-M bonds (M = Ce/Zr/La). Compared to traditional support-stabilized catalysts such as Pd@CeO2, Pd1@HEFO affords improved reducibility of lattice oxygen and stable Pd-O-M species, thus exhibiting not only higher low-temperature CO oxidation activity but also outstanding resistance to thermal and hydrothermal degradation. This work therefore exemplifies the superiority of high-entropy materials for the preparation of SACs.


RNA Thermodynamic Structural Entropy.

  • Juan Antonio Garcia-Martin‎ et al.
  • PloS one‎
  • 2015‎

Conformational entropy for atomic-level, three-dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element; we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy; and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http://bioinformatics.bc.edu/clotelab/RNAentropy, including source code and ancillary programs.


Brain entropy mapping using fMRI.

  • Ze Wang‎ et al.
  • PloS one‎
  • 2014‎

Entropy is an important trait for life as well as the human brain. Characterizing brain entropy (BEN) may provide an informative tool to assess brain states and brain functions. Yet little is known about the distribution and regional organization of BEN in the normal brain. The purpose of this study was to examine whole-brain entropy patterns using a large cohort of normal subjects. A series of experiments were first performed to validate an approximate entropy measure regarding its sensitivity, specificity, and reliability using synthetic data and fMRI data. Resting state fMRI data from a large cohort of normal subjects (n = 1049) from multiple sites were then used to derive a 3-dimensional BEN map, showing a sharp low-high entropy contrast between the neocortex and the rest of the brain. The spatial heterogeneity of resting BEN was further studied using a data-driven clustering method, and the entire brain was found to be organized into 7 hierarchical regional BEN networks that are consistent with known structural and functional brain parcellations. These findings suggest BEN mapping as a physiologically and functionally meaningful measure for studying brain functions.


Entropy, Fluctuations, and Disordered Proteins.

  • Eshel Faraggi‎ et al.
  • Entropy (Basel, Switzerland)‎
  • 2019‎

Entropy should directly reflect the extent of disorder in proteins. By clustering structurally related proteins and studying the multiple-sequence-alignment of the sequences of these clusters, we were able to link between sequence, structure, and disorder information. We introduced several parameters as measures of fluctuations at a given MSA site and used these as representative of the sequence and structure entropy at that site. In general, we found a tendency for negative correlations between disorder and structure, and significant positive correlations between disorder and the fluctuations in the system. We also found evidence for residue-type conservation for those residues proximate to potentially disordered sites. Mutations at the disorder site itself appear to be allowed. In addition, we found positive correlation for disorder and accessible surface area, validating that disordered residues occur in exposed regions of proteins. Finally, we also found that fluctuations in the dihedral angles at the original mutated residue and disorder are positively correlated while dihedral angle fluctuations in spatially proximal residues are negatively correlated with disorder. Our results seem to indicate permissible variability in the disordered site, but greater rigidity in the parts of the protein with which the disordered site interacts. This is another indication that disordered residues are involved in protein function.


Cardiorespiratory Coupling Analysis Based on Entropy and Cross-Entropy in Distinguishing Different Depression Stages.

  • Lulu Zhao‎ et al.
  • Frontiers in physiology‎
  • 2019‎

This study used entropy- and cross entropy-based methods to explore the cardiorespiratory coupling of depressive patients, and thus to assess the values of those entropy methods for identifying depression patients with different disease severities.


Entropy removal of medical diagnostics.

  • Shuhan He‎ et al.
  • Scientific reports‎
  • 2024‎

Shannon entropy is a core concept in machine learning and information theory, particularly in decision tree modeling. To date, no studies have extensively and quantitatively applied Shannon entropy in a systematic way to quantify the entropy of clinical situations using diagnostic variables (true and false positives and negatives). Decision tree representations of medical decision-making tools can be generated using diagnostic variables found in literature, and entropy removal can be calculated for these tools. This concept of clinical entropy removal has significant potential for further use to bring forth healthcare innovation, such as quantifying the impact of clinical guidelines and value of care, and applications to Emergency Medicine scenarios where diagnostic accuracy in a limited time window is paramount. This analysis was done for 623 diagnostic tools and provided unique insights into their utility. For studies that provided detailed data on medical decision-making algorithms, bootstrapped datasets were generated from source data to perform comprehensive machine learning analysis on these algorithms and their constituent steps, which revealed a novel and thorough evaluation of medical diagnostic algorithms.
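As a hedged illustration of "entropy removal" (a plausible reading of the abstract, not necessarily the authors' exact formula): a test's removed entropy can be framed as pre-test uncertainty about the disease minus expected post-test uncertainty, computed from true/false positive and negative counts. This is the mutual information between test result and disease state:

```python
import math

def h(p):
    """Binary Shannon entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def entropy_removed(tp, fp, tn, fn):
    """Entropy removed by a diagnostic test: pre-test uncertainty about
    disease minus expected post-test uncertainty (assumes all four
    margins are nonzero)."""
    n = tp + fp + tn + fn
    prevalence = (tp + fn) / n
    p_pos = (tp + fp) / n        # P(test positive)
    ppv = tp / (tp + fp)         # P(disease | positive test)
    fomr = fn / (tn + fn)        # P(disease | negative test)
    post = p_pos * h(ppv) + (1 - p_pos) * h(fomr)
    return h(prevalence) - post

# A perfect test at 50% prevalence removes the full 1 bit of uncertainty;
# an uninformative test (identical columns) removes none.
```

A perfectly accurate tool removes all diagnostic entropy, while a test whose result is independent of disease removes zero, which is why this quantity can rank tools by informational utility.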


On the use of approximate entropy and sample entropy with centre of pressure time-series.

  • Luis Montesinos‎ et al.
  • Journal of neuroengineering and rehabilitation‎
  • 2018‎

Approximate entropy (ApEn) and sample entropy (SampEn) have been previously used to quantify the regularity in centre of pressure (COP) time-series in different experimental groups and/or conditions. ApEn and SampEn are very sensitive to their input parameters: m (subseries length), r (tolerance) and N (data length). Yet, the effects of changing those parameters have been scarcely investigated in the analysis of COP time-series. This study aimed to investigate the effects of changing parameters m, r and N on ApEn and SampEn values in COP time-series, as well as the ability of these entropy measures to discriminate between groups.


Low Computational Cost for Sample Entropy.

  • George Manis‎ et al.
  • Entropy (Basel, Switzerland)‎
  • 2018‎

Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regularity/complexity of a time series. On the other hand, it is a computationally expensive method which may require a large amount of time when used in long series or with a large number of signals. The computationally intensive part is the similarity check between points in m-dimensional space. In this paper, we propose new algorithms or extend already proposed ones, aiming to compute Sample Entropy quickly. All algorithms return exactly the same value for Sample Entropy, and no approximation techniques are used. We compare and evaluate them using cardiac inter-beat (RR) time series. We investigate three algorithms. The first one is an extension of the kd-trees algorithm, customized for Sample Entropy. The second one is an extension of an algorithm initially proposed for Approximate Entropy, again customized for Sample Entropy, but also improved to present even faster results. The last one is a completely new algorithm, presenting the fastest execution times for specific values of m, r, time series length, and signal characteristics. These algorithms are compared with the straightforward implementation, directly resulting from the definition of Sample Entropy, in order to give a clear image of the speedups achieved. All algorithms assume the classical approach to the metric, in which the maximum norm is used. The key idea of the two last suggested algorithms is to avoid unnecessary comparisons by detecting them early. We use the term unnecessary to refer to those comparisons for which we know a priori that they will fail at the similarity check. The number of avoided comparisons is proved to be very large, resulting in an analogous large reduction of execution time, making them the fastest algorithms available today for the computation of Sample Entropy.
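The "straightforward implementation, directly resulting from the definition" that the paper benchmarks against looks roughly like the sketch below: count template pairs that match within tolerance r under the maximum (Chebyshev) norm at lengths m and m+1, then take -ln of their ratio. Template-count conventions differ slightly between implementations, so treat this as an assumption-laden baseline, not the authors' code:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample Entropy from the definition: -ln(A/B), where B counts
    template pairs matching for m points and A for m + 1 points, under
    the maximum norm; self-matches are excluded. r is a fraction of SD."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (no self-match);
            # this pairwise loop is the O(n^2) cost the paper attacks.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

A regular signal (e.g. a sine wave) yields a much lower value than white noise of the same length, which is the regularity/complexity contrast the measure is used for; the paper's kd-tree and early-rejection algorithms return the same value while skipping most of the pairwise distance checks.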


Entropy analysis of human death uncertainty.

  • J A Tenreiro Machado‎ et al.
  • Nonlinear dynamics‎
  • 2021‎

Uncertainty about the time of death is part of one's life, and plays an important role in demographic and actuarial sciences. Entropy is a measure useful for characterizing complex systems. This paper analyses death uncertainty through the concept of entropy. For that purpose, the Shannon and the cumulative residual entropies are adopted. The first may be interpreted as an average information. The second was proposed more recently and is related to reliability measures such as the mean residual lifetime. Data collected from the Human Mortality Database and describing the evolution of 40 countries during several decades are studied using entropy measures. The emerging country and inter-country entropy patterns are used to characterize the dynamics of mortality. The locus of the two entropies gives a deeper insight into the dynamical evolution of the human mortality data series.
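The cumulative residual entropy mentioned above is defined on the survival function rather than the density: CRE = -∫ S(x) ln S(x) dx, where S(x) = P(X > x). A minimal empirical sketch (my own discretization, not the paper's pipeline), which for an exponential lifetime distribution recovers the known result that CRE equals the mean residual lifetime at birth, i.e. the mean:

```python
import numpy as np

def cumulative_residual_entropy(samples, grid=1000):
    """Empirical cumulative residual entropy:
    CRE = -∫ S(x) ln S(x) dx, with S the empirical survival function,
    integrated on a uniform grid over the observed range."""
    samples = np.sort(np.asarray(samples, dtype=float))
    xs = np.linspace(samples[0], samples[-1], grid)
    # Empirical survival function S(x) = P(X > x)
    surv = 1.0 - np.searchsorted(samples, xs, side="right") / len(samples)
    mask = surv > 0
    integrand = -surv[mask] * np.log(surv[mask])
    return float(np.sum(integrand) * (xs[1] - xs[0]))
```

Unlike Shannon entropy of the density, this quantity has the units of the variable itself (years, for lifetimes), which is what ties it to reliability measures such as the mean residual lifetime.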


Entropy predicts sensitivity of pseudorandom seeds.

  • Benjamin Dominik Maier‎ et al.
  • Genome research‎
  • 2023‎

Seed design is important for sequence similarity search applications such as read mapping and average nucleotide identity (ANI) estimation. Although k-mers and spaced k-mers are likely the most well-known and used seeds, sensitivity suffers at high error rates, particularly when indels are present. Recently, we developed a pseudorandom seeding construct, strobemers, which was empirically shown to have high sensitivity also at high indel rates. However, the study lacked a deeper understanding of why. In this study, we propose a model to estimate the entropy of a seed and find that seeds with high entropy, according to our model, in most cases have high match sensitivity. Our discovered seed randomness-sensitivity relationship explains why some seeds perform better than others, and the relationship provides a framework for designing even more sensitive seeds. We also present three new strobemer seed constructs: mixedstrobes, altstrobes, and multistrobes. We use both simulated and biological data to show that our new seed constructs improve sequence-matching sensitivity compared to other strobemers. We show that the three new seed constructs are useful for read mapping and ANI estimation. For read mapping, we implement strobemers in minimap2 and observe 30% faster alignment time and 0.2% higher accuracy than using k-mers when mapping reads at high error rates. As for ANI estimation, we find that higher entropy seeds have a higher rank correlation between estimated and true ANI.


Entropy of balance--some recent results.

  • Frank G Borg‎ et al.
  • Journal of neuroengineering and rehabilitation‎
  • 2010‎

Entropy when applied to biological signals is expected to reflect the state of the biological system. However the physiological interpretation of the entropy is not always straightforward. When should high entropy be interpreted as a healthy sign, and when as marker of deteriorating health? We address this question for the particular case of human standing balance and the Center of Pressure data.


Task-induced changes in brain entropy.

  • Aldo Camargo‎ et al.
  • medRxiv : the preprint server for health sciences‎
  • 2023‎

Entropy indicates irregularity of a dynamic system with higher entropy indicating higher irregularity and more transit states. In the human brain, regional entropy has been increasingly assessed using resting state fMRI. Response of regional entropy to task has been scarcely studied. The purpose of this study is to characterize task-induced regional brain entropy (BEN) alterations using the large Human Connectome Project (HCP) data. To control the potential modulation by the block-design, BEN of task-fMRI was calculated from the fMRI images acquired during the task conditions only and then compared to BEN of rsfMRI. Compared to resting state, task-performance unanimously induced BEN reduction in the peripheral cortical area including both the task activated regions and task non-specific regions such as the task negative area and BEN increase in the centric part of the sensorimotor and perception networks. Task control condition showed large residual task effects. After controlling the task non-specific effects using the control BEN vs task BEN comparison, regional BEN showed task specific effects in target regions.


Amplitude- and Fluctuation-Based Dispersion Entropy.

  • Hamed Azami‎ et al.
  • Entropy (Basel, Switzerland)‎
  • 2018‎

Dispersion entropy (DispEn) is a recently introduced entropy metric to quantify the uncertainty of time series. It is fast and, so far, it has demonstrated very good performance in the characterisation of time series. It includes a mapping step, but the effect of different mappings has not been studied yet. Here, we investigate the effect of linear and nonlinear mapping approaches in DispEn. We also inspect the sensitivity of different parameters of DispEn to noise. Moreover, we develop fluctuation-based DispEn (FDispEn) as a measure to deal with only the fluctuations of time series. Furthermore, the original and fluctuation-based forbidden dispersion patterns are introduced to discriminate deterministic from stochastic time series. Finally, we compare the performance of DispEn, FDispEn, permutation entropy, sample entropy, and Lempel-Ziv complexity on two physiological datasets. The results show that DispEn is the most consistent technique to distinguish various dynamics of the biomedical signals. Due to their advantages over existing entropy methods, DispEn and FDispEn are expected to be broadly used for the characterization of a wide variety of real-world time series. The MATLAB codes used in this paper are freely available at http://dx.doi.org/10.7488/ds/2326.
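The mapping step the abstract refers to can be made concrete with a small sketch using the linear mapping variant (one of the mappings the paper compares; the commonly used alternative maps through a normal CDF). Values are binned into c classes, length-m dispersion patterns are counted, and Shannon entropy of the pattern distribution is taken, here normalized by ln(c^m):

```python
import numpy as np
from collections import Counter

def dispersion_entropy(x, c=6, m=2, d=1):
    """Dispersion entropy with a linear mapping: values are assigned to
    c integer classes, length-m patterns (embedding delay d) are counted,
    and the normalized Shannon entropy of the pattern distribution is
    returned (0 = fully regular, ~1 = all patterns equally likely)."""
    x = np.asarray(x, dtype=float)
    # Linear mapping of values onto integer classes 1..c
    z = np.floor((x - x.min()) / (np.ptp(x) + 1e-12) * c).astype(int)
    z = np.clip(z + 1, 1, c)
    # Count the dispersion patterns
    n = len(z) - (m - 1) * d
    patterns = Counter(tuple(z[i + j * d] for j in range(m)) for i in range(n))
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)) / np.log(c ** m))
```

Because only c^m patterns exist (versus m! ordinal patterns of permutation entropy, with amplitude information discarded), DispEn stays fast while remaining sensitive to both amplitude and fluctuation structure, which is what the FDispEn variant then isolates.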


Lithium containing layered high entropy oxide structures.

  • Junbo Wang‎ et al.
  • Scientific reports‎
  • 2020‎

Layered Delafossite-type Lix(M1M2M3M4M5…Mn)O2 materials, a new class of high-entropy oxides, were synthesized by nebulized spray pyrolysis and subsequent high-temperature annealing. Various metal species (M = Ni, Co, Mn, Al, Fe, Zn, Cr, Ti, Zr, Cu) could be incorporated into this structure type, and in most cases, single-phase oxides were obtained. Delafossite structures are well known and the related materials are used in different fields of application, especially in electrochemical energy storage (e.g., LiNixCoyMnzO2 [NCM]). The transfer of the high-entropy concept to this type of materials and the successful structural replication enabled the preparation of novel compounds with unprecedented properties. Here, we report on the characterization of a series of Delafossite-type high-entropy oxides by means of TEM, SEM, XPS, ICP-OES, Mössbauer spectroscopy, XRD including Rietveld refinement analysis, SAED and STEM mapping, and discuss the role of entropy stabilization. Our experimental data indicate the formation of uniform solid-solution structures with some Li/M mixing.


Entropy and order in urban street networks.

  • Agust Gudmundsson‎ et al.
  • Scientific reports‎
  • 2013‎

Many complex networks erase parts of their geometry as they develop, so that their evolution is difficult to quantify and trace. Here we introduce entropy measures for quantifying the complexity of street orientations and length variations within planar networks and apply them to the street networks of 41 British cities, whose geometric evolution over centuries can be explored. The results show that the street networks of the old central parts of the cities have lower orientation/length entropies - the streets are more tightly ordered and form denser networks - than the outer and more recent parts. Entropy and street length increase, because of spreading, with distance from the network centre. Tracing the 400-year evolution of one network indicates growth through densification (streets are added within the existing network) and expansion (streets are added at the margin of the network) and a gradual increase in entropy over time.


High entropy liquid electrolytes for lithium batteries.

  • Qidi Wang‎ et al.
  • Nature communications‎
  • 2023‎

High-entropy alloys/compounds have large configurational entropy by introducing multiple components, showing improved functional properties that exceed those of conventional materials. However, how increasing entropy impacts the thermodynamic/kinetic properties of liquids remains ambiguous. Here we show this strategy in liquid electrolytes for rechargeable lithium batteries, demonstrating the substantial impact of raising the entropy of electrolytes by introducing multiple salts. Unlike all liquid electrolytes so far reported, the participation of several anionic groups in this electrolyte induces a larger diversity in solvation structures, unexpectedly decreasing solvation strengths between lithium ions and solvents/anions, facilitating lithium-ion diffusivity and the formation of stable interphase passivation layers. In comparison to the single-salt electrolytes, a low-concentration dimethyl ether electrolyte with four salts shows an enhanced cycling stability and rate capability. These findings, rationalized by the fundamental relationship between entropy-dominated solvation structures and ion transport, bring forward high-entropy electrolytes as a composition-rich and unexplored space for lithium batteries and beyond.


Entropy-based model for miRNA isoform analysis.

  • Shengqin Wang‎ et al.
  • PloS one‎
  • 2015‎

MiRNAs have been widely studied due to their important post-transcriptional regulatory roles in gene expression. Many reports have demonstrated evidence of miRNA isoform products (isomiRs) in high-throughput small RNA sequencing data. However, the biological function of these molecules is still not well investigated. Here, we developed a Shannon entropy-based model to estimate isomiR expression profiles of high-throughput small RNA sequencing data extracted from the miRBase webserver. By using the Kolmogorov-Smirnov statistical test (KS test), we demonstrated that the 5p and 3p miRNAs present more variants than the single-arm miRNAs. We also found that the isomiR variants, except the 3' isomiR variant, are strongly correlated with the Minimum Free Energy (MFE) of the pre-miRNA, suggesting that intrinsic features of the pre-miRNA are among the important factors in miRNA regulation. The functional enrichment analysis showed that the miRNAs with high variation, particularly 5' end variation, are enriched in a set of critical functions, supporting the notion that these molecules are not randomly produced. Our results provide a probabilistic framework for miRNA isoform analysis, and give functional insights into pre-miRNA processing.


Baroreflex Coupling Assessed by Cross-Compression Entropy.

  • Andy Schumann‎ et al.
  • Frontiers in physiology‎
  • 2017‎

Estimating interactions between physiological systems is an important challenge in modern biomedical research. Here, we explore a new concept for quantifying information common in two time series by cross-compressibility. Cross-compression entropy (CCE) exploits the ZIP data compression algorithm extended to bivariate data analysis. First, time series are transformed into symbol vectors. Symbols of the target time series are coded by the symbols of the source series. Uncoupled and linearly coupled surrogates were derived from cardiovascular recordings of 36 healthy controls obtained during rest to demonstrate suitability of this method for assessing physiological coupling. CCE at rest was compared to that of isometric handgrip exercise. Finally, spontaneous baroreflex interaction assessed by CCEBRS was compared between 21 patients suffering from acute schizophrenia and 21 matched controls. The CCEBRS of original time series was significantly higher than in uncoupled surrogates in 89% of the subjects and higher than in linearly coupled surrogates in 47% of the subjects. Handgrip exercise led to sympathetic activation and vagal inhibition accompanied by reduced baroreflex sensitivity. CCEBRS decreased from 0.553 ± 0.030 at rest to 0.514 ± 0.035 during exercise (p < 0.001). In acute schizophrenia, heart rate, and blood pressure were elevated. Heart rate variability indicated a change of sympathovagal balance. The CCEBRS of patients with schizophrenia was reduced compared to healthy controls (0.546 ± 0.042 vs. 0.507 ± 0.046, p < 0.01) and revealed a decrease of blood pressure influence on heart rate in patients with schizophrenia. Our results indicate that CCE is suitable for the investigation of linear and non-linear coupling in cardiovascular time series. CCE can quantify causal interactions in short, noisy and non-stationary physiological time series.
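The compression idea behind CCE can be illustrated with a simplified sketch (my own construction using Python's zlib DEFLATE, not the authors' exact coding scheme): symbolize both series, then measure how much knowing the source's symbols shortens the compressed description of the target, a standard compression-based proxy for shared information:

```python
import zlib
import numpy as np

def symbolize(x, levels=4):
    """Map a series to a byte string of rank-based (quantile) symbols."""
    ranks = np.argsort(np.argsort(x))
    return bytes((ranks * levels // len(x)).astype(np.uint8))

def compression_coupling(source, target, levels=4):
    """Compression-based proxy for shared information between two series:
    fraction by which the source's symbols shorten the compressed target
    (0 = source does not help; higher = stronger coupling)."""
    s, t = symbolize(source, levels), symbolize(target, levels)
    c_t = len(zlib.compress(t, 9))
    c_s = len(zlib.compress(s, 9))
    c_st = len(zlib.compress(s + t, 9))
    # Conditional-complexity estimate: C(t|s) ≈ C(s+t) - C(s)
    return max(0.0, (c_t - (c_st - c_s)) / c_t)
```

As in the paper's surrogate tests, a genuinely coupled pair (here, a series with itself) scores far above an uncoupled pair, and the measure needs no model of the coupling, which is what makes compression-based estimators attractive for short, noisy, non-stationary physiological data.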


Publications Per Year (interactive chart; per-year counts not captured)