The theory of Shannon entropy was applied to the Choi-Williams time-frequency distribution (CWD) of time series in order to extract entropy information in both the time and frequency domains. In this way, four novel indexes were defined: (1) partial instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function at each time instant taken independently; (2) partial spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of each frequency value taken independently; (3) complete instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function of the entire CWD; (4) complete spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of the entire CWD. These indexes were tested on synthetic time series with different behaviors (periodic, chaotic, and random).
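As a concrete illustration of these definitions, here is a minimal sketch (an assumed implementation, not the authors' code) that computes the four indexes from a precomputed, non-negative time-frequency matrix, e.g. the magnitude of a CWD:

import numpy as np

def _plogp(p):
    # Shannon summand with the convention 0 * log2(0) = 0
    return np.where(p > 0, p * np.log2(np.where(p > 0, p, 1.0)), 0.0)

def entropy_indexes(tfd):
    # tfd: non-negative array of shape (n_freq, n_time)
    # partial indexes: each time slice / frequency slice normalized independently
    p_t = tfd / tfd.sum(axis=0, keepdims=True)  # pmf over frequency at each instant
    p_f = tfd / tfd.sum(axis=1, keepdims=True)  # pmf over time at each frequency
    partial_instantaneous = -_plogp(p_t).sum(axis=0)  # one value per time instant
    partial_spectral = -_plogp(p_f).sum(axis=1)       # one value per frequency
    # complete indexes: a single pmf over the entire distribution
    P = tfd / tfd.sum()
    complete_instantaneous = -_plogp(P).sum(axis=0)
    complete_spectral = -_plogp(P).sum(axis=1)
    return (partial_instantaneous, partial_spectral,
            complete_instantaneous, complete_spectral)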
We review the relative entropy method in the context of hyperbolic and diffusive relaxation limits of entropy solutions for various hyperbolic models. The main example consists of the convergence from multidimensional compressible Euler equations with friction to the porous medium equation \cite{LT12}. With small modifications, the arguments used in that case can be adapted to the study of the diffusive limit from the Euler-Poisson system with friction to the Keller-Segel system \cite{LT13}. In addition, the $p$--system with friction and the system of viscoelasticity with memory are then reviewed, again in the case of diffusive limits \cite{LT12}. Finally, the method of relative entropy is described for the multidimensional stress relaxation model converging to elastodynamics \cite[Section 3.2]{LT06}, one of the first examples of application of the method to hyperbolic relaxation limits.
Alexis Vasseur speaking at the BIRS workshop "Model reduction in continuum thermodynamics: Modeling, analysis and computation" on Wednesday, September 19, 2012, on the topic: Relative entropy method applied to model reduction in fluid mechanics.
ARTICLE SYNOPSIS... Cycle Analysis: A comparison of the Fourier and Maximum Entropy methods, by John F. Ehlers. The motivation for reducing price history to a mathematical expression is clear. If we can describe the prices mathematically, we have the means to extend the equation ...
The problem of analyzing data from a multisphere neutron spectrometer to infer the energy spectrum of the incident neutrons is discussed. The main features of the code MAXED, a computer program developed to apply the maximum entropy principle to the deconvolution (unfolding) of multisphere neutron spectrometer data, are described, and the use of the code is illustrated with an example. A user's guide for the code MAXED is included in an appendix. The code is available from the authors upon request.
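In outline, the maximum entropy unfolding problem can be sketched as follows (a schematic toy version under assumed names, not the MAXED code itself, which also accounts for measurement uncertainties):

import numpy as np
from scipy.optimize import minimize

def maxent_unfold(R, m, f0):
    # R: (n_spheres, n_bins) detector response matrix
    # m: measured sphere readings; f0: prior (default) spectrum
    # maximize entropy relative to f0 (i.e., minimize the cross-entropy),
    # subject to reproducing the readings m = R f
    def neg_entropy(f):
        f = np.clip(f, 1e-12, None)
        return np.sum(f * np.log(f / f0) - f + f0)
    constraints = {"type": "eq", "fun": lambda f: R @ f - m}
    result = minimize(neg_entropy, x0=f0, method="SLSQP",
                      bounds=[(1e-12, None)] * len(f0), constraints=constraints)
    return result.x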
The aim of this book is to provide an overview of current work addressing the topics of research that explore the geometric structures of information and entropy. The papers in this book include extended versions of a selection of the papers published in the Proceedings of the 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2014), Amboise, France, 21-26 September 2014. Chapter 1 of the book is a historical review of the origins of thermodynamics and information theory. Chapter 2 discusses the mathematical and physical foundations of geometric structures related to information and entropy. Lastly, Chapter 3 is dedicated to applications with numerical schemes for geometric structures of information and entropy. ...
Depending on the nature of the specimen to be reconstructed, the commonly used reconstruction techniques can create a problem, since in order to obtain credible reconstructions, filters obtained in an ad hoc manner have to be applied. These filters, which effectively exclude high-resolution data (e.g. noise), also remove non-noise data, thereby leading to artifacts in the reconstruction. The generation of these artifacts can be overcome if the object lends itself to some sort of symmetry or if many copies of the object being reconstructed have the same structure. However, chromatin fibres do not lend themselves to this type of analysis, and hence, in order to obtain reconstructions that are free of artifacts, all the information present in the sample has to be preserved. Thus an alternative method, namely the maximum entropy method, which does not involve the use of filters, was developed. The maximum entropy method is used extensively in astronomy and medical tomography and ...
Employs a multi-stage algorithm that makes use of spatial contextual information in a hierarchical clustering procedure for unsupervised image segmentation.
Electric power is a basic industry in the national economy of China. It strongly influences China's economic growth owing to surplus and shortage ...
In the continuation of [6], we study reversible reaction-diffusion systems via entropy methods (based on the free energy functional). We show for a particular model problem with two reacting species in which one of the diffusion constants vanishes that the solutions decay exponentially with explicit rate and constant towards the unique constant equilibrium state.
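Schematically (a generic form of the entropy method, not the paper's exact functional), the argument controls a relative free energy with respect to the equilibrium $u_\infty$ and closes a Gronwall inequality:

\[
E[u](t) = \int_\Omega \Big( u \ln\frac{u}{u_\infty} - u + u_\infty \Big)\,dx,
\qquad
\frac{d}{dt} E[u](t) \le -\lambda\, E[u](t)
\;\Longrightarrow\;
E[u](t) \le E[u](0)\, e^{-\lambda t}.
\]

The delicate point in such degenerate settings is typically that, with one diffusion constant vanishing, the dissipation does not directly control both species, so extracting an explicit rate $\lambda$ requires extra work.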
Here's a paper for the proceedings of a workshop on Information and Entropy in Biological Systems this spring: • John Baez and Blake Pollard, Relative entropy in biological systems, Entropy 18 (2016), 46. We'd love any comments or questions you might have. I'm not happy with the title. In the paper…
The study compares permutation-based and coarse-grained entropy approaches for the assessment of complexity of short heart period (HP) variability recordings. Shannon permutation entropy (SPE) and conditional permutation entropy (CPE) are computed as examples of permutation-based entropies, while the k-nearest neighbor conditional entropy (KNNCE) is calculated as an example of coarse-grained conditional entropy. SPE, CPE and KNNCE were applied to ad-hoc simulated autoregressive processes corrupted by increasing amounts of broad band noise and to real HP variability series recorded after complete vagal blockade obtained via administration of a high dose of atropine (AT) in nine healthy volunteers and during orthostatic challenge induced by 90° head-up tilt (T90) in 15 healthy individuals. Over the simulated series the performances of SPE and CPE degraded more rapidly with the amplitude of the superimposed broad band noise than those of KNNCE. Over real data KNNCE identified the expected decrease ...
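For reference, Shannon permutation entropy can be sketched in a few lines (an assumed implementation for illustration; the study's exact settings for pattern length and lag may differ):

import numpy as np

def permutation_entropy(x, m=3, delay=1):
    # frequency of each ordinal pattern of length m in the series x
    n = len(x) - (m - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i : i + m * delay : delay]
        pattern = tuple(np.argsort(window))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values())) / n
    return -np.sum(p * np.log2(p))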
The support vector machine is used as a data mining technique to extract informative hydrologic data on the basis of a strong relationship between error tolerance and the number of support vectors. Hydrologic data of flash flood events in the Lan-Yang River basin in Taiwan were used for the case study. Various percentages (from 50% to 10%) of hydrologic data, including those for flood stage and rainfall data, were mined and used as informative data to characterize a flood hydrograph. Information on these mined hydrologic data sets was quantified using entropy indices, namely marginal entropy, joint entropy, transinformation, and conditional entropy. Analytical results obtained using the entropy indices proved that the mined informative data could be hydrologically interpreted and have a meaningful explanation based on information entropy. Estimates of marginal and joint entropies showed that, in view of flood forecasting, the flood stage was a more informative variable than rainfall. In addition,
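The four indices have compact definitions; a brief sketch (a hypothetical helper for two discretized series, not the study's code):

import numpy as np

def entropy_indices(x, y):
    # x, y: equal-length sequences of discrete bin labels
    def H(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    Hx, Hy = H(x), H(y)                          # marginal entropies
    Hxy = H([f"{a}|{b}" for a, b in zip(x, y)])  # joint entropy H(X, Y)
    T = Hx + Hy - Hxy                            # transinformation I(X; Y)
    Hx_given_y = Hxy - Hy                        # conditional entropy H(X | Y)
    return Hx, Hy, Hxy, T, Hx_given_y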
Conformational entropy is a potentially important thermodynamic parameter contributing to protein function. Quantitative measures of conformational entropy are necessary for an understanding of its role but have been difficult to obtain experimentally. We have recently introduced an empirical calibration method that utilizes changes in conformational dynamics as a proxy for changes in conformational entropy. This approach raises several questions with regard to the microscopic origins of the measured conformational entropy as well as its general applicability. One of the goals of this work was to probe the microscopic origins of the link between conformational dynamics and conformational entropy. Using MD simulations, we find that the motions of methyl-bearing side chains are sufficiently coupled to those of other side chains that they serve as excellent reporters of the overall side-chain conformational entropy. We also propose a modified weighting scheme to project the change in NMR-measured methyl ...
Southern African Journal of Anaesthesia & Analgesia, March 2006, 21. Spectral entropy and haemodynamic response to surgery during sevoflurane anaesthesia. Introduction: Apart from somatic responses, surgery also evokes autonomic responses, including haemodynamic responses. Spectral entropy has been validated as a means to monitor the hypnotic state during sevoflurane anaesthesia. Aim: To investigate the relationship between spectral entropy, heart rate, and blood pressure during sevoflurane anaesthesia. FJ Smith, E Dannhauser. Patients and methods: The sample consisted of 43 patients scheduled for elective abdominal surgery. Patients were premedicated with oral midazolam. Induction of anaesthesia was achieved with alfentanil 15 µg/kg, ...
The two entropy expressions, log B and −B log B (where B is the local brightness of the object or its spatial spectral power), used in maximum entropy (ME) image restoration, are derived as limiting cases of a general entropy formula. The brightness B is represented by the n photons emitted from a small unit area of the object and imaged in the receiver. These n photons can be distributed over z degrees of freedom in q(n,z) different ways, calculated by Bose-Einstein statistics. The entropy to be maximized is interpreted, as in the original definition of entropy by Boltzmann and Planck, as log q(n,z). This entropy expression reduces to log B and −B log B in the limits of ...
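The limiting argument can be reconstructed compactly (standard Stirling manipulations in the abstract's notation, with B = n/z):

\[
q(n,z) = \binom{n+z-1}{n}, \qquad
\log q(n,z) \approx (n+z)\log(n+z) - n\log n - z\log z
= z\big[(1+B)\log(1+B) - B\log B\big].
\]

For $B \gg 1$ the bracket behaves as $\log B$, and for $B \ll 1$ as $-B\log B$, recovering the two entropy expressions per degree of freedom.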
Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, the sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of the sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
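As background, the baseline sample entropy that the method builds on can be sketched as follows (a simplified reference implementation, with template length m and tolerance r):

import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    def matches(length):
        # count template pairs within tolerance r (Chebyshev distance)
        t = np.array([x[i : i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1 :] - t[i]), axis=1)
            count += np.sum(d <= r)
        return count
    B = matches(m)       # similar templates of length m
    A = matches(m + 1)   # pairs that remain similar at length m + 1
    return -np.log(A / B)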
So the reason John is surprised to see his cards while Dave is not surprised is that John's hand is in a very rare macrostate, while Dave's hand is in a very common macrostate. We expect to see high entropy hands like Dave's junk (with many microstates) and not low entropy hands like John's straight flush. High entropy hands are more common than low entropy hands, simply because there are more of them. If we buy a new deck of cards, we can immediately deal two straight flushes right off the top of the deck. (The cards in a new deck are all in order.) The entropy of the cards is low. When we shuffle the deck, the entropy increases. We start dealing lousy, high entropy, hands. As you learn more about entropy, you'll see that increasing entropy is the way of the world. In the meantime, next time you play poker, take the time to savor those lousy hands. Each one is just as rare as a great one ...
Abstract: We consider words as a network of interacting letters, and approximate the probability distribution of states taken on by this network. Despite the intuition that the rules of English spelling are highly combinatorial (and arbitrary), we find that maximum entropy models consistent with pairwise correlations among letters provide a surprisingly good approximation to the full statistics of four letter words, capturing ~92% of the multi-information among letters and even "discovering" real words that were not represented in the data from which the pairwise correlations were estimated. The maximum entropy model defines an energy landscape on the space of possible words, and local minima in this landscape account for nearly two-thirds of words used in written English ...
We study the problem of "privacy amplification": key agreement between two parties who both know a weak secret w, such as a password. (Such a setting is ubiquitous on the internet, where passwords are the most commonly used security device.) We assume that the key agreement protocol is taking place in the presence of an active computationally unbounded adversary Eve. The adversary may have partial knowledge about w, so we assume only that w has some entropy from Eve's point of view. Thus, the goal of the protocol is to convert this non-uniform secret w into a uniformly distributed string R that is fully secret from Eve. R may then be used as a key for running symmetric cryptographic protocols (such as encryption, authentication, etc.). Because we make no computational assumptions, the entropy in R can come only from w. Thus such a protocol must minimize the entropy loss during its execution, so that R is as long as possible. The best previous results have entropy loss of Θ(κ²), ...
# calculate the information gain for a dataset split
from math import log2

# calculate the entropy for the two-class split in the dataset
def entropy(class0, class1):
    return -(class0 * log2(class0) + class1 * log2(class1))

# split of the main dataset: 13 class-0 and 7 class-1 examples out of 20
class0 = 13 / 20
class1 = 7 / 20
# calculate entropy before the change
s_entropy = entropy(class0, class1)
print('Dataset Entropy: %.3f bits' % s_entropy)

# split 1 (split via value1): 8 examples, 7 class-0 and 1 class-1
s1_class0 = 7 / 8
s1_class1 = 1 / 8
# calculate the entropy of the first group
s1_entropy = entropy(s1_class0, s1_class1)
print('Group1 Entropy: %.3f bits' % s1_entropy)

# split 2 (split via value2): 12 examples, 6 class-0 and 6 class-1
s2_class0 = 6 / 12
s2_class1 = 6 / 12
# calculate the entropy of the second group
s2_entropy = entropy(s2_class0, s2_class1)
print('Group2 Entropy: %.3f bits' % s2_entropy)

# information gain: entropy before minus the size-weighted entropy after
gain = s_entropy - (8 / 20 * s1_entropy + 12 / 20 * s2_entropy)
print('Information Gain: %.3f bits' % gain)
Jain, R., Radhakrishnan, J., Sen, P. (2009). A property of quantum relative entropy with an application to privacy in quantum communication. Journal of the ACM 56 (6). ScholarBank@NUS Repository. https://doi.org/10.1145/ ...
Shop at Noble Knight Games for Retro Phaze by Relative Entropy Games - part of our Full Inventory collection. New, used, and Out-of-Print.
These processes are accompanied by an increase in randomness and hence an increase in entropy; i.e., for these processes, the entropy change is positive. Consider the following processes: 1) Cooling down of a cup of tea. 2) Reaction taking place between a piece of marble or sodium hydroxide and hydrochloric acid in an open vessel. These processes involve exchange of matter and energy with the surroundings; hence they are not isolated systems. For these processes, we have to consider the total entropy change of the system and the surroundings: ΔStotal = ΔSsystem + ΔSsurrounding. For the process to be spontaneous, ΔStotal must be positive: ΔStotal = ΔSsystem + ΔSsurrounding > 0. The randomness, and hence the entropy, keeps on increasing until an equilibrium is reached. The entropy of the system at equilibrium is maximum and there is no further change in entropy, i.e. ΔS = 0. If ΔStotal is negative, the ...
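A short worked illustration of this criterion (numbers added here for concreteness): when heat q = 100 J leaves tea at Th = 350 K and enters room air at Tc = 300 K,

\[
\Delta S_{\text{total}} = \frac{q}{T_c} - \frac{q}{T_h}
= \frac{100}{300} - \frac{100}{350} \approx +0.048\ \text{J/K} > 0,
\]

so the cooling is spontaneous, consistent with the criterion above.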
This issue starts with a basic introduction to the far-from-equilibrium thermodynamics of the Earth system by Kleidon (2010). This paper reviews the basics of thermodynamics to demonstrate that thermodynamics provides the means of describing practically all Earth system processes in purely thermodynamic terms. Entropy production is not just defined for heat fluxes and temperature gradients, but rather for a very broad range of conjugated variables, demonstrating that MEP has potentially wide-ranging applications within Earth and environmental systems. This introduction is followed by a critical outside view by Volk & Pauluis (2010) on the contributions of this special issue. Rather than discussing every contribution of this issue in detail, they focus on a few key questions that summarize the challenges of applying MEP and that should guide future developments. Using the example of dry versus moist convection, they show that systems can produce the same amount of entropy but by very different ...
Entropy rate estimates how much information is present in the action potential signal, without considering how much of the action potential timing represents encoded mechanical signal. In contrast, maximum entropy rate is the amount of information that could be encoded in an action potential train at a given firing rate (Rieke et al., 1997). A perfectly regular action potential train would have a very low entropy rate, regardless of firing rate or maximum entropy rate. The data compression method of estimating entropy has the advantage of being independent of any assumptions about the mechanism of information coding. However, it often gives higher values than the maximum entropy rate. Possible reasons for this include a failure to achieve the maximal possible compression and the use of Stirling's approximation in the derivation of the maximum entropy (French et al., 2003). Entropy rate and maximum entropy rate had similar absolute values to the coherence-based information capacity, suggesting that ...
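A minimal illustration of the compression approach (an assumed toy implementation, not the cited studies' method):

import zlib
import numpy as np

def compression_entropy_rate(spikes):
    # spikes: binned action potential train as an array of 0/1 samples
    raw = np.packbits(np.asarray(spikes, dtype=np.uint8)).tobytes()
    compressed = zlib.compress(raw, level=9)
    # estimated bits per sample; imperfect compression biases this upward,
    # consistent with the overestimation discussed in the text
    return 8 * len(compressed) / len(spikes)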
Entropy is the thermodynamic quantity that represents the unavailability of a system's thermal energy for conversion into mechanical work; it is often described as the degree of disorder in the system. Entropy changes in a system are driven by heat flow, and their sign is determined by the direction of that flow. For an exothermic reaction at constant temperature, heat flows out of the system into the surroundings, so the entropy of the surroundings increases. When a system and its surroundings interact through an irreversible process, the total entropy increases; the entropy of the system itself may decrease, but only if the entropy of the surroundings increases by more. For an irreversible process the total entropy change is therefore greater than zero, while for a reversible process it is zero.
In the maximum entropy method, among all the solutions compatible with the underdetermined equation system, the solution with the smallest information content, or the maximum entropy, is selected. As these results seem to indicate, the maximum entropy reconstruction is distinguished favorably from the ART reconstruction by a smaller amount of artifacts due to angular limitation. However, the great computational expense has thus far prevented a widespread use of this method. (..., 1974; FRANK et al., 1986b); the lateral views of the 40 S eukaryotic ribosomal subunit (BOUBLIK and HELLMAN 1978; FRANK et al. 1981b, 1982); and the top view of glutamine synthetase (VALENTINE et al. 1968; FRANK et al. 1978; see Figs. 4 and 5). The azimuthal angles of the particles can be determined from an additional micrograph of the same specimen area at 0°, by employing rotational correlation techniques originally developed for the ...
The role of side-chain entropy (SCE) in protein folding has long been speculated about but is still not fully understood. Utilizing a newly developed Monte Carlo method, we conducted a systematic investigation of how the SCE relates to the size of the protein and how it differs among a protein's X-ray, NMR, and decoy structures. We estimated the SCE for a set of 675 nonhomologous proteins, and observed that there is a significant SCE for both exposed and buried residues for all these proteins; the contribution of buried residues approaches ∼40% of the overall SCE. Furthermore, the SCE can be quite different for structures with similar compactness or even similar conformations. As a striking example, we found that proteins' X-ray structures appear to pack more ...
[1] J.-P. Eckmann, S. Oliffson Kamphorst & D. Ruelle, Recurrence plots of dynamical systems, Europhysics Letters, 4, 973-977, 1987. [2] L. L. Trulla, A. Giuliani, J. P. Zbilut & C. L. Webber Jr., Recurrence quantification analysis of the logistic equation with transients, Physics Letters A, 223 (4), 255-260, 1996. [3] This latter detail was missing in the definition provided in C. Letellier, Estimating the Shannon entropy: recurrence plots versus symbolic dynamics, Physical Review Letters, 96, 254102, 2006.
[2] M. M. Kaur, P. Nath: On some characterizations of the Shannon entropy using extreme symmetry and block symmetry. Information and Control 53 (1982), 9-20. MR 0715518, Zbl 0511.94012.
Entropy is a measurement of uncertainty. It is a way to assign a score of uncertainty to a stochastic variable. Entropy is a key component of information theory, a branch of mathematics designed to quantify information. It first came of age in 1948 with the publication of Claude Shannon's paper "A Mathematical Theory of Communication." Before we can adequately define entropy, we need to first define information. Information, for our purposes, is what you get when the uncertainty about something is diminished or erased. Take a fair coin. Before we flip the coin, we are uncertain about what will happen when we flip it. After we flip the coin, our uncertainty about that coin flip is effectively 0, since we know the outcome (heads or tails). We've gained information. But how do we measure the uncertainty about the coin flip before it happens? How do we quantify that uncertainty? The answer is entropy. Consider an information source X that can produce, with equal probability, 6 symbols (also known ...
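Completing that example with Shannon's formula (calculation added for illustration): for six equally likely symbols,

\[
H(X) = -\sum_{i=1}^{6} \frac{1}{6} \log_2 \frac{1}{6} = \log_2 6 \approx 2.585\ \text{bits}.
\]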
Many research problems are extremely complex, making interdisciplinary knowledge a necessity; consequently, cooperative work in mixed teams is a common and increasing research practice. In this paper, we evaluated information-theoretic network measures on publication networks. For the experiments described in this paper we used the network of excellence from RWTH Aachen University, described in [1]. These measures can be understood as graph complexity measures, which evaluate structural complexity based on the corresponding concept. We find it challenging to generalize such results across different measures, as every measure captures structural information differently and hence leads to a different entropy value. This calls for exploring the structural interpretation of a graph measure [2], which has been a challenging problem.
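As one concrete example of such a measure (an illustrative choice; the paper evaluates several different measures), the entropy of a graph's degree distribution:

import numpy as np

def degree_entropy(adjacency):
    # Shannon entropy of the degree distribution of an undirected graph
    degrees = adjacency.sum(axis=0)
    p = degrees / degrees.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# usage: a 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(degree_entropy(A))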
In British Journal of Anaesthesia (2006), 97(6), 842-847. Background. The spectral entropy of the electroencephalogram has been proposed to monitor the depth of anaesthesia. State Entropy (SE) reflects the level of hypnosis. Response Entropy (RE), computed from electroencephalogram and facial muscle activity, reflects the response to nociceptive stimulation. We evaluated the effect of rocuronium on Bispectral Index™ (BIS) and entropy responses to laryngoscopy. Methods. A total of 25 patients were anaesthetized with propofol using a target-controlled infusion. At steady state, they randomly received 0.6 mg/kg rocuronium (R) or saline (S). After 3 min, a 20 s laryngoscopy was applied. BIS, RE and SE were recorded continuously and averaged over 1 min during baseline, at ...
Abstract: Using recent measurements of the supermassive black hole (SMBH) mass function, we find that SMBHs are the largest contributor to the entropy of the observable universe, contributing at least an order of magnitude more entropy than previously estimated. The total entropy of the observable universe is correspondingly higher, and is S_obs = 3.1 (+3.0, -1.7) x 10^104 k. We calculate the entropy of the current cosmic event horizon to be S_CEH = 2.6 (+/- 0.3) x 10^122 k, dwarfing the entropy of its interior, S_CEH,int = 1.2 (+1.1, -0.7) x 10^103 k. We make the first tentative estimate of the entropy of weakly interacting massive particle dark matter within the observable universe, S_dm = 10^87 - 10^89 k. We highlight several caveats pertaining to these estimates and make recommendations for future work ...
Project overview. IS2 is a project of the Rhône-Alpes research unit of INRIA (the French National Institute for Research in Computer Science and Automation). The project is concerned with statistical modelling. Emphasis is placed on incomplete data models. Its main areas of application are biomedical statistics and failure time models. In 1996, IS2 developed activities in statistical modelling through a maximum entropy principle, generalized linear models with random effects, stochastic algorithms, Bayesian statistical analysis of industrial failure times, competing risk models, tails of distributions, and medical applications (diagnosis of haemorrhagic infarct, models for the increase of thromboses, and analysis of hospital length of stay via a mixture of exponential distributions). Keywords: statistical modeling, incomplete data, heteroscedasticity, generalized linear models, maximum entropy method, stochastic algorithms, diagnosis aid, survival analysis, rare events.
Article: Effect of a discrete heat source location on entropy generation in mixed convective cooling of a nanofluid inside the ventilated cavity. In this paper, the effect of localised heat sources on entropy generation owing to mixed convection flow ...
S = −k_B Σ_n P_n ln P_n, which in the perfect disorder limit (all P_n = 1/W) leads to Boltzmann's formula, while in the opposite limit (one configuration with probability 1) the entropy vanishes. This formulation is called the Gibbs entropy formula and is analogous to that of Shannon's information entropy. The mathematical field of combinatorics, and in particular the mathematics of combinations and permutations, is highly important in the calculation of configurational entropy: this field offers formalized approaches for calculating the number of ways of choosing or arranging discrete objects, in this case atoms or molecules. However, it is important to note that the positions of molecules are not, strictly speaking, discrete above the quantum level. Thus a variety of approximations may be used in discretizing a system to allow for a purely combinatorial approach. Alternatively, integral methods may be used in some cases to work directly with continuous position functions. A second approach ...
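A small numerical illustration of the combinatorial route (hypothetical numbers): the configurational entropy of n indistinguishable molecules on z lattice sites is S = k_B ln W with W = C(z, n):

from math import comb, log

k_B = 1.380649e-23            # Boltzmann's constant, J/K
z, n = 100, 30                # hypothetical lattice sites and molecules
W = comb(z, n)                # number of distinct arrangements
S = k_B * log(W)
print(f"W = {float(W):.3e}, S = {S:.3e} J/K")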
Download The OpenNLP Maximum Entropy Package for free. Maximum entropy is a powerful method for constructing statistical models of classification tasks, such as part of speech tagging in Natural Language Processing. Several example applications using maxent can be found in the OpenNLP Tools Library.
A precise, quantitative definition of entropy was proposed by the Austrian physicist Ludwig Boltzmann in the late 19th century. According to this definition, entropy is related to probability: if a system has several states available to it, the one that can be achieved in the greatest number of ways (has the largest number of microstates) is the one most likely to occur. The state with the greatest probability has the highest entropy. S = k_B ln Ω, where k_B is Boltzmann's constant (R/N_A) and Ω is the number of microstates corresponding to a given state (including both position and energy). Note: this definition of entropy is not useful in a practical sense for the typical samples used by chemists, because those samples contain so many components (for example, 1 mole of gas contains 6.022 × 10^23 individual particles). How is entropy associated with chemical processes? ΔS = q_rev / T = ΔH / T (at constant temperature T and pressure P).
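A standard worked example of the last relation (textbook numbers, added for illustration): for the melting of ice at its normal melting point,

\[
\Delta S_{\text{fus}} = \frac{\Delta H_{\text{fus}}}{T}
= \frac{6010\ \text{J mol}^{-1}}{273\ \text{K}}
\approx 22\ \text{J mol}^{-1}\,\text{K}^{-1}.
\]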
Entropy is nowhere more important than in the second law of thermodynamics, according to which the total entropy of any system cannot decrease except by increasing the entropy of some other system. Hence, in a system isolated from its surroundings, the entropy cannot decrease. It follows that heat cannot flow from a colder to a hotter body without the application of work to the system. Secondly, it is impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires flow of heat from a hotter to a cooler reservoir. As a result, there is no possibility of a perpetual motion machine. Finally, a reduction in the entropy increase of a specified process, such as a chemical reaction, means that it is energetically more efficient. I am a college sophomore with a dual major in Physics and Mathematics @ University of California, Santa Barbara.
Furthermore, entropy is not directly related to the capacity to do useful work. A system with high entropy can easily have more capacity to do useful work than a system with much lower entropy. For example, a system consisting of a tank of steam and a block of ice has much more entropy than a system consisting of just a block of ice, but the former can do a lot more work than the latter. What matters is the change in entropy, and that depends on how the energy in the system is distributed and in what form it is ...
Rényi entropy, together with the other members of the one-parameter family of generalized entropies, can in turn be seen as a special case of the Sharma-Mittal entropy, a bi-parametric generalized entropy model. This NISTIR focuses on using Rényi and Tsallis entropy and divergence measures to investigate similarities and differences among probability distributions of interest. The report introduces extensions of the standard uniformity identification and measurement techniques proposed in NIST SP 800-22 and SP 800-90 (A, B, & C ...
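For reference, the standard definitions behind that hierarchy (not quoted from the NISTIR):

\[
H^{\text{R}}_{\alpha}(p) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha},
\qquad
H^{\text{T}}_{\alpha}(p) = \frac{1}{\alpha-1} \Big( 1 - \sum_i p_i^{\alpha} \Big),
\]
\[
H^{\text{SM}}_{\alpha,\beta}(p) = \frac{1}{1-\beta} \bigg[ \Big( \sum_i p_i^{\alpha} \Big)^{\frac{1-\beta}{1-\alpha}} - 1 \bigg],
\]

which recovers the Rényi entropy as β → 1 and the Tsallis entropy as β → α; all three reduce to the Shannon entropy as α → 1.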
Disclosed is a system and a method for computer-supported analysis of arrhythmic potentials in ECG signals, particularly those of late potentials. Interference discrimination advantages of frequency domain analysis are combined with temporal localization advantages of time domain analysis to determine the accurate location of arrhythmic potentials. Several small signal segments are selected in an ECG waveform. A determination is made of parameters corresponding to extended signals which closely match fluctuations of each respective small signal, allowing more information to be discerned about the small signals than is possible with more conventional techniques. A comparison is made with respect to extended signals rather than small signals. Two autoregressive models are used: the maximum entropy method and adaptive filter determination. Area integrals of the frequency characteristics of small signal segments are recorded successively with respect to the frequency range of the arrhythmic potentials analyzed.
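The maximum entropy (autoregressive) spectral step can be sketched as follows (a generic Yule-Walker implementation for illustration, not the patent's algorithm):

import numpy as np

def yule_walker_ar(x, order):
    # fit an AR model; its power spectrum is the maximum entropy spectrum
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1 : order + 1])     # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1 : order + 1])  # innovation variance
    return a, sigma2

def me_spectrum(a, sigma2, freqs, fs=1.0):
    # P(f) = sigma^2 / |1 - sum_k a_k exp(-2*pi*i*f*k/fs)|^2
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs / fs, k)) @ a) ** 2
    return sigma2 / denom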
In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the ...
The increasing number of metagenomic and genomic sequences has dramatically improved our understanding of microbial diversity, yet our ability to infer metabolic capabilities in such datasets remains challenging. We describe the Multigenomic Entropy Based Score pipeline (MEBS), a software platform designed to evaluate, compare, and infer complex metabolic pathways in large omic datasets, including entire biogeochemical cycles. MEBS is open source and available through https://github.com/eead-csic-compbio/metagenome_Pfam_score. To demonstrate its use we modeled the sulfur cycle by exhaustively curating the molecular and ecological elements involved (compounds, genes, metabolic pathways and microbial taxa). This information was reduced to a collection of 112 characteristic Pfam protein domains and a list of completely sequenced sulfur genomes. Using the mathematical framework of relative entropy (H), we quantitatively measured the enrichment of these domains among sulfur genomes. The entropy of each ...
This paper presents a real-time method for spoken language identification based on the entropy of the posterior probabilities of language-specific phoneme recognisers. Entropy-based discriminant functions computed on short speech segments are used to compare the model fit to a specific set of observations, and language identification is performed as a model selection task. The experiments, performed on a closed set of four Germanic languages on the SpeechDat telephone speech recordings, give 95% accuracy of the method for 10-second-long speech utterances and 99% accuracy for 20-second-long utterances.
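In outline, the decision rule can be sketched as follows (names and shapes are illustrative assumptions, not the paper's code):

import numpy as np

def mean_frame_entropy(posteriors):
    # posteriors: (n_frames, n_phonemes) matrix; each row sums to 1
    p = np.clip(posteriors, 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log2(p), axis=1)))

def identify_language(posteriors_by_language):
    # language-specific recognisers yield posteriors for the same segment;
    # select the model whose posteriors are most concentrated (lowest entropy)
    scores = {lang: mean_frame_entropy(P)
              for lang, P in posteriors_by_language.items()}
    return min(scores, key=scores.get)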
The coil to globule transition is a fundamental phenomenon in the physics of macromolecules because of the multiplicity of arrangements of their conformation. Such conformational freedom is the main source of entropy in the molecule and the main obstacle to the transition towards the compact state, since a system tends to the state of maximum entropy. This phenomenon is captured by very simple models, such as the ensemble of Interacting Self-Avoiding Walks on the lattice. This model shows that the coil to globule transition belongs to the universality class of continuous transitions called the Θ point. Starting from a critical inspection of the definition of the interacting walks model, we introduce a refinement aiming to represent more precisely the entropy arising from the local fluctuations of the molecule around its equilibrium conformations; this contribution is absent in the standard model, which includes only the entropy generated by the multiplicity of the global conformations. Through ...