**Markov Chains**: A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
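The memorylessness in this definition can be sketched in a few lines: sampling the next state uses only the current state, never the path that led to it. The two-state "weather" chain and its transition probabilities below are illustrative assumptions, not part of the definition.

```python
import random

# Hypothetical two-state chain; rows are the conditional distributions
# P(next state | current state).
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state; only `state` matters, not the earlier history."""
    u = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if u < cumulative:
            return nxt
    return nxt  # guard against floating-point shortfall

rng = random.Random(0)
state = "sunny"
path = [state]
for _ in range(10):
    state = step(state, rng)
    path.append(state)
print(path)
```

Because `step` receives nothing but the current state, conditioning on any additional past history cannot change the distribution of the next state — which is exactly the Markov property.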

**Monte Carlo Method**: In statistics, a technique for numerically approximating the solution of a mathematical problem by studying the distribution of some random variable, often generated by a computer. The name alludes to the randomness characteristic of the games of chance played at the gambling casinos in Monte Carlo. (From Random House Unabridged Dictionary, 2d ed, 1993)
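A minimal sketch of the idea: approximate a deterministic quantity (here the area of the unit quarter-circle, and hence pi) by studying the distribution of a computer-generated random variable. The sample size is an arbitrary illustrative choice.

```python
import random

# Fraction of uniform random points in the unit square that land inside
# the quarter-circle x^2 + y^2 <= 1 estimates pi/4.
rng = random.Random(42)
n = 100_000
hits = sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
pi_estimate = 4 * hits / n
print(pi_estimate)  # close to 3.14159; the error shrinks roughly like 1/sqrt(n)
```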

**Bayes Theorem**: A theorem in probability theory named for Thomas Bayes (1702-1761). In epidemiology, it is used to obtain the probability of disease in a group of people with some characteristic on the basis of the overall rate of that disease and of the likelihood of that characteristic in healthy and diseased individuals. The most familiar application is in clinical decision analysis where it is used for estimating the probability of a particular diagnosis given the appearance of some symptoms or test result.
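The clinical use described above reduces to one formula: the posterior probability of disease given a positive test combines the overall disease rate with the test's behavior in diseased and healthy individuals. The prevalence, sensitivity, and specificity below are made-up illustrative numbers.

```python
# Bayes' theorem for a diagnostic test (all numbers hypothetical).
prevalence = 0.01        # P(disease)
sensitivity = 0.95       # P(positive | disease)
specificity = 0.90       # P(negative | no disease)

# Total probability of a positive result, then the posterior.
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_pos = sensitivity * prevalence / p_pos
print(round(p_disease_given_pos, 3))  # → 0.088
```

Even a fairly accurate test yields a low posterior probability when the disease is rare — the classic base-rate effect that Bayes' theorem makes explicit.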

**Algorithms**: A procedure consisting of a sequence of algebraic formulas and/or logical steps used to calculate a result or carry out a given task.

**Models, Genetic**: Theoretical representations that simulate the behavior or activity of genetic processes or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.

**Models, Statistical**: Statistical formulations or analyses which, when applied to data and found to fit the data, are then used to verify the assumptions and parameters used in the analysis. Examples of statistical models are the linear model, binomial model, polynomial model, two-parameter model, etc.

**Computer Simulation**: Computer-based representation of physical systems and phenomena such as chemical processes.

**Likelihood Functions**: Functions constructed from a statistical model and a set of observed data which give the probability of that data for various values of the unknown model parameters. Those parameter values that maximize the probability are the maximum likelihood estimates of the parameters.
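To make the definition concrete: for a binomial model with k successes in n trials, the (log-)likelihood is a function of the unknown parameter p, and the value maximizing it is the maximum likelihood estimate. The data (k = 7, n = 10) and the grid search are illustrative; the closed-form answer is k/n.

```python
import math

k, n = 7, 10  # illustrative observed data

def log_likelihood(p):
    """Log-probability of the observed data as a function of the parameter p."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

# Crude grid search over candidate parameter values.
grid = [i / 1000 for i in range(1, 1000)]
mle = max(grid, key=log_likelihood)
print(mle)  # 0.7, matching the closed-form estimate k/n
```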

**Stochastic Processes**: Processes that incorporate some element of randomness, used particularly to refer to a time series of random variables.

**Phylogeny**: The relationships of groups of organisms as reflected by their genetic makeup.

**Software**: Sequential operating programs and data which instruct the functioning of a digital computer.

**Genealogy and Heraldry**

**Probability**: The study of chance processes or the relative frequency characterizing a chance process.

**Models, Biological**: Theoretical representations that simulate the behavior or activity of biological processes or diseases. For disease models in living animals, DISEASE MODELS, ANIMAL is available. Biological models include the use of mathematical equations, computers, and other electronic equipment.

**Evolution, Molecular**: The process of cumulative change at the level of DNA; RNA; and PROTEINS, over successive generations.

**Computational Biology**: A field of biology concerned with the development of techniques for the collection and manipulation of biological data, and the use of such data to make biological discoveries or predictions. This field encompasses all computational methods and theories for solving biological problems including manipulation of models and datasets.

**Chromosome Mapping**: Any method used for determining the location of and relative distances between genes on a chromosome.

**Sequence Analysis, DNA**: A multistage process that includes cloning, physical mapping, subcloning, determination of the DNA SEQUENCE, and information analysis.

**Sequence Alignment**: The arrangement of two or more amino acid or base sequences from an organism or organisms in such a way as to align areas of the sequences sharing common properties. The degree of relatedness or homology between the sequences is predicted computationally or statistically based on weights assigned to the elements aligned between the sequences. This in turn can serve as a potential indicator of the genetic relatedness between the organisms.
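The "weights assigned to the elements aligned" can be sketched with the standard Needleman-Wunsch dynamic program for global alignment; the match/mismatch/gap weights below are illustrative assumptions, not values from the definition.

```python
def align_score(a, b, match=1, mismatch=-1, gap=-1):
    """Best global alignment score of sequences a and b under the given weights."""
    m, n = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i * gap
    for j in range(1, n + 1):
        dp[0][j] = j * gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,  # aligned pair
                           dp[i - 1][j] + gap,      # gap in b
                           dp[i][j - 1] + gap)      # gap in a
    return dp[m][n]

print(align_score("GATTACA", "GCATGCU"))  # → 0
```

Higher scores between two sequences indicate greater similarity under the chosen weights, which is what makes the alignment score usable as an indicator of relatedness.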

**Data Interpretation, Statistical**: Application of statistical procedures to analyze specific observed or assumed facts from a particular study.

**Models, Theoretical**: Theoretical representations that simulate the behavior or activity of systems, processes, or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.

**Pattern Recognition, Automated**: In INFORMATION RETRIEVAL, machine-sensing or identification of visible patterns (shapes, forms, and configurations). (Harrod's Librarians' Glossary, 7th ed)

**Biometry**: The use of statistical and mathematical methods to analyze biological observations and phenomena.

**Biostatistics**: The application of STATISTICS to biological systems and organisms involving the retrieval or collection, analysis, reduction, and interpretation of qualitative and quantitative data.

**Genetics, Population**: The discipline studying genetic composition of populations and effects of factors such as GENETIC SELECTION, population size, MUTATION, migration, and GENETIC DRIFT on the frequencies of various GENOTYPES and PHENOTYPES using a variety of GENETIC TECHNIQUES.

**Polymerase Chain Reaction**: In vitro method for producing large amounts of specific DNA or RNA fragments of defined length and sequence from small amounts of short oligonucleotide flanking sequences (primers). The essential steps include thermal denaturation of the double-stranded target molecules, annealing of the primers to their complementary sequences, and extension of the annealed primers by enzymatic synthesis with DNA polymerase. The reaction is efficient, specific, and extremely sensitive. Uses for the reaction include disease diagnosis, detection of difficult-to-isolate pathogens, mutation analysis, genetic testing, DNA sequencing, and analyzing evolutionary relationships.
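The amplification described above is geometric: if a fraction e of template molecules is duplicated in each cycle, copy number grows as N0·(1 + e)^n. The starting copy number and per-cycle efficiency below are illustrative assumptions.

```python
# Expected copy number after n PCR cycles (idealized, ignoring plateau effects).
n0 = 10            # starting template copies (assumed)
efficiency = 0.9   # fraction of molecules duplicated per cycle (assumed)
cycles = 30

copies = n0 * (1 + efficiency) ** cycles
print(f"{copies:.3g}")  # on the order of 10^9 copies from only 10 templates
```

This geometric growth is why the reaction is described as extremely sensitive: a handful of template molecules becomes a detectable quantity within a few dozen cycles.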

**Quantitative Trait, Heritable**: A characteristic showing quantitative inheritance such as SKIN PIGMENTATION in humans. (From A Dictionary of Genetics, 4th ed)

**Quantitative Trait Loci**: Genetic loci associated with a QUANTITATIVE TRAIT.

**Molecular Sequence Data**: Descriptions of specific amino acid, carbohydrate, or nucleotide sequences which have appeared in the published literature and/or are deposited in and maintained by databanks such as GENBANK, European Molecular Biology Laboratory (EMBL), National Biomedical Research Foundation (NBRF), or other sequence repositories.

**Genetic Markers**: A phenotypically recognizable genetic trait which can be used to identify a genetic locus, a linkage group, or a recombination event.

**Quality-Adjusted Life Years**: A measurement index derived from a modification of standard life-table procedures and designed to take account of the quality as well as the duration of survival. This index can be used in assessing the outcome of health care procedures or services. (BIOETHICS Thesaurus, 1994)

**Cost-Benefit Analysis**: A method of comparing the cost of a program with its expected benefits in dollars (or other currency). The benefit-to-cost ratio is a measure of total return expected per unit of money spent. This analysis generally excludes consideration of factors that are not measured ultimately in economic terms. Cost effectiveness compares alternative ways to achieve a specific set of results.
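The benefit-to-cost ratio mentioned above can be computed from discounted cash flows; the program costs, monetized benefits, and discount rate below are made-up for illustration.

```python
discount_rate = 0.03                 # assumed annual discount rate
costs = [100_000, 20_000, 20_000]    # hypothetical per-year program costs
benefits = [0, 80_000, 90_000]       # hypothetical per-year monetized benefits

def present_value(flows):
    """Discount each year's flow back to year 0."""
    return sum(f / (1 + discount_rate) ** t for t, f in enumerate(flows))

bcr = present_value(benefits) / present_value(costs)
print(round(bcr, 2))  # → 1.18
```

A ratio above 1 indicates that discounted benefits exceed discounted costs — the "total return expected per unit of money spent" in the definition.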

**Population Dynamics**: The pattern of any process, or the interrelationship of phenomena, which affects growth or change within a population.

**Sequence Analysis, Protein**: A process that includes the determination of AMINO ACID SEQUENCE of a protein (or peptide, oligopeptide or peptide fragment) and the information analysis of the sequence.

**Base Sequence**: The sequence of PURINES and PYRIMIDINES in nucleic acids and polynucleotides. It is also called nucleotide sequence.

**Genetic Linkage**: The co-inheritance of two or more non-allelic GENES due to their being located more or less closely on the same CHROMOSOME.

**Classification**: The systematic arrangement of entities in any field into categories or classes based on common characteristics such as properties, morphology, subject matter, etc.

**Population Density**: Number of individuals in a population relative to space.

**Reproducibility of Results**: The statistical reproducibility of measurements (often in a clinical context), including the testing of instrumentation or techniques to obtain reproducible results. The concept includes reproducibility of physiological measurements, which may be used to develop rules to assess probability or prognosis, or response to a stimulus; reproducibility of occurrence of a condition; and reproducibility of experimental results.

**Multifactorial Inheritance**: A phenotypic outcome (physical characteristic or disease predisposition) that is determined by more than one gene. Polygenic refers to those determined by many genes, while oligogenic refers to those determined by a few genes.

**Probability Learning**: Usually refers to the use of mathematical models in the prediction of learning to perform tasks based on the theory of probability applied to responses; it may also refer to the frequency of occurrence of the responses observed in the particular study.

**Artificial Intelligence**: Theory and development of COMPUTER SYSTEMS which perform tasks that normally require human intelligence. Such tasks may include speech recognition, LEARNING; VISUAL PERCEPTION; MATHEMATICAL COMPUTING; reasoning, PROBLEM SOLVING, DECISION-MAKING, and translation of language.

**Cluster Analysis**: A set of statistical methods used to group variables or observations into strongly inter-related subgroups. In epidemiology, it may be used to analyze a closely grouped series of events or cases of disease or other health-related phenomenon with well-defined distribution patterns in relation to time or place or both.

**Normal Distribution**: Continuous frequency distribution of infinite range. Its properties are as follows: 1, continuous, symmetrical distribution with both tails extending to infinity; 2, arithmetic mean, mode, and median identical; and 3, shape completely determined by the mean and standard deviation.
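Property 2 above (mean, median, and mode coincide) can be checked empirically on a simulated sample; the parameters mu and sigma are illustrative.

```python
import random

mu, sigma = 5.0, 2.0
rng = random.Random(1)
xs = sorted(rng.gauss(mu, sigma) for _ in range(100_001))

mean = sum(xs) / len(xs)
median = xs[len(xs) // 2]  # middle element of the sorted sample
print(round(mean, 2), round(median, 2))  # both close to mu = 5.0
```

With a symmetric distribution the sample mean and median converge to the same value, and by property 3 nothing beyond mu and sigma is needed to describe the distribution.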

**Pedigree**: The record of descent or ancestry, particularly of a particular condition or trait, indicating individual family members, their relationships, and their status with respect to the trait or condition.

**Genetic Variation**: Genotypic differences observed among individuals in a population.

**Genotype**: The genetic constitution of the individual, comprising the ALLELES present at each GENETIC LOCUS.

**Alleles**: Variant forms of the same gene, occupying the same locus on homologous CHROMOSOMES, and governing the variants in production of the same gene product.

**Video Games**: A form of interactive entertainment in which the player controls electronically generated images that appear on a video display screen. This includes video games played in the home on special machines or home computers, and those played in arcades.

**Game Theory**: Theoretical construct used in applied mathematics to analyze certain situations in which there is an interplay between parties that may have similar, opposed, or mixed interests. In a typical game, decision-making "players," who each have their own goals, try to gain advantage over the other parties by anticipating each other's decisions; the game is finally resolved as a consequence of the players' decisions.
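The "resolution" of such a game can be sketched by checking each outcome for a Nash equilibrium — a cell from which no player gains by unilaterally switching. The payoff matrix below is the standard Prisoner's Dilemma with illustrative numbers (0 = cooperate, 1 = defect).

```python
from itertools import product

# (row player's payoff, column player's payoff) for each strategy pair.
payoff = {
    (0, 0): (3, 3),
    (0, 1): (0, 5),
    (1, 0): (5, 0),
    (1, 1): (1, 1),
}

def is_nash(r, c):
    """Neither player can improve their payoff by switching alone."""
    best_r = all(payoff[(r, c)][0] >= payoff[(alt, c)][0] for alt in (0, 1))
    best_c = all(payoff[(r, c)][1] >= payoff[(r, alt)][1] for alt in (0, 1))
    return best_r and best_c

equilibria = [cell for cell in product((0, 1), repeat=2) if is_nash(*cell)]
print(equilibria)  # → [(1, 1)]
```

Mutual defection is the only equilibrium even though mutual cooperation pays both players more — the interplay of opposed and mixed interests the definition describes.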

**Games, Experimental**: Games designed to provide information on hypotheses, policies, procedures, or strategies.

**Congresses as Topic**: Conferences, conventions or formal meetings usually attended by delegates representing a special field of interest.

**Post and Core Technique**: Use of a metal casting, usually with a post in the pulp or root canal, designed to support and retain an artificial crown.

**Consensus Development Conferences as Topic**: Presentations of summary statements representing the majority agreement of physicians, scientists, and other professionals convening for the purpose of reaching a consensus--often with findings and recommendations--on a subject of interest. The Conference, consisting of participants representing the scientific and lay viewpoints, is a significant means of evaluating current medical thought and reflects the latest advances in research for the respective field being addressed.

**DNA Fingerprinting**: A technique for identifying individuals of a species that is based on the uniqueness of their DNA sequence. Uniqueness is determined by identifying which combination of allelic variations occur in the individual at a statistically relevant number of different loci. In forensic studies, RESTRICTION FRAGMENT LENGTH POLYMORPHISM of multiple, highly polymorphic VNTR LOCI or MICROSATELLITE REPEAT loci are analyzed. The number of loci used for the profile depends on the ALLELE FREQUENCY in the population.

## Genome-wide bioinformatic and molecular analysis of introns in Saccharomyces cerevisiae. (1/3175)

Introns have typically been discovered in an ad hoc fashion: introns are found as a gene is characterized for other reasons. As complete eukaryotic genome sequences become available, better methods for predicting RNA processing signals in raw sequence will be necessary in order to discover genes and predict their expression. Here we present a catalog of 228 yeast introns, arrived at through a combination of bioinformatic and molecular analysis. Introns annotated in the Saccharomyces Genome Database (SGD) were evaluated, questionable introns were removed after failing a test for splicing in vivo, and known introns absent from the SGD annotation were added. A novel branchpoint sequence, AAUUAAC, was identified within an annotated intron that lacks a six-of-seven match to the highly conserved branchpoint consensus UACUAAC. Analysis of the database corroborates many conclusions about pre-mRNA substrate requirements for splicing derived from experimental studies, but indicates that splicing in yeast may not be as rigidly determined by splice-site conservation as had previously been thought. Using this database and a molecular technique that directly displays the lariat intron products of spliced transcripts (intron display), we suggest that the current set of 228 introns is still not complete, and that additional intron-containing genes remain to be discovered in yeast. The database can be accessed at http://www.cse.ucsc.edu/research/compbio/yeast_introns.html.

## Economic consequences of the progression of rheumatoid arthritis in Sweden. (2/3175)

OBJECTIVE: To develop a simulation model for analysis of the cost-effectiveness of treatments that affect the progression of rheumatoid arthritis (RA). METHODS: The Markov model was developed on the basis of a Swedish cohort of 116 patients with early RA who were followed up for 5 years. The majority of patients had American College of Rheumatology (ACR) functional class II disease, and Markov states indicating disease severity were defined based on Health Assessment Questionnaire (HAQ) scores. Costs were calculated from data on resource utilization and patients' work capacity. Utilities (preference weights for health states) were assessed using the EQ-5D (EuroQol) questionnaire. Hypothetical treatment interventions were simulated to illustrate the model. RESULTS: The cohort distribution among the 6 Markov states clearly showed the progression of the disease over 5 years of followup. Costs increased with increasing severity of the Markov states, and total costs over 5 years were higher for patients who were in more severe Markov states at diagnosis. Utilities correlated well with the Markov states, and the EQ-5D was able to discriminate between patients with different HAQ scores within ACR functional class II. CONCLUSION: The Markov model was able to assess disease progression and costs in RA. The model can therefore be a useful tool in calculating the cost-effectiveness of different interventions aimed at changing the progression of the disease.

## Multipoint oligogenic analysis of age-at-onset data with applications to Alzheimer disease pedigrees. (3/3175)

It is usually difficult to localize genes that cause diseases with late ages at onset. These diseases frequently exhibit complex modes of inheritance, and only recent generations are available to be genotyped and phenotyped. In this situation, multipoint analysis using traditional exact linkage analysis methods, with many markers and full pedigree information, is a computationally intractable problem. Fortunately, Monte Carlo Markov chain sampling provides a tool to address this issue. By treating age at onset as a right-censored quantitative trait, we expand the methods used by Heath (1997) and illustrate them using an Alzheimer disease (AD) data set. This approach estimates the number, sizes, allele frequencies, and positions of quantitative trait loci (QTLs). In this simultaneous multipoint linkage and segregation analysis method, the QTLs are assumed to be diallelic and to interact additively. In the AD data set, we were able to localize correctly, quickly, and accurately two known genes, despite the existence of substantial genetic heterogeneity, thus demonstrating the great promise of these methods for the dissection of late-onset oligogenic diseases.

## Machine learning approaches for the prediction of signal peptides and other protein sorting signals. (4/3175)

Prediction of protein sorting signals from the sequence of amino acids has great importance in the field of proteomics today. Recently, the growth of protein databases, combined with machine learning approaches such as neural networks and hidden Markov models, has made it possible to achieve a level of reliability where practical use in, for example, automatic database annotation is feasible. In this review, we concentrate on the present status and future perspectives of SignalP, our neural network-based method for prediction of the most well-known sorting signal: the secretory signal peptide. We discuss the problems associated with the use of SignalP on genomic sequences, showing that signal peptide prediction will improve further if integrated with predictions of start codons and transmembrane helices. As a step towards this goal, a hidden Markov model version of SignalP has been developed, making it possible to discriminate between cleaved signal peptides and uncleaved signal anchors. Furthermore, we show how SignalP can be used to characterize putative signal peptides from an archaeon, Methanococcus jannaschii. Finally, we briefly review a few methods for predicting other protein sorting signals and discuss the future of protein sorting prediction in general.

## Genome-wide linkage analyses of systolic blood pressure using highly discordant siblings. (5/3175)

BACKGROUND: Elevated blood pressure is a risk factor for cardiovascular, cerebrovascular, and renal diseases. Complex mechanisms of blood pressure regulation pose a challenge to identifying genetic factors that influence interindividual blood pressure variation in the population at large. METHODS AND RESULTS: We performed a genome-wide linkage analysis of systolic blood pressure in humans using an efficient, highly discordant, full-sibling design. We identified 4 regions of the human genome that show statistically significant linkage to genes that influence interindividual systolic blood pressure variation (2p22.1 to 2p21, 5q33.3 to 5q34, 6q23.1 to 6q24.1, and 15q25.1 to 15q26.1). These regions contain a number of candidate genes that are involved in physiological mechanisms of blood pressure regulation. CONCLUSIONS: These results provide both novel information about genome regions in humans that influence interindividual blood pressure variation and a basis for identifying the contributing genes. Identification of the functional mutations in these genes may uncover novel mechanisms for blood pressure regulation and suggest new therapies and prevention strategies.

## FORESST: fold recognition from secondary structure predictions of proteins. (6/3175)

MOTIVATION: A method for recognizing the three-dimensional fold from the protein amino acid sequence based on a combination of hidden Markov models (HMMs) and secondary structure prediction was recently developed for proteins in the Mainly-Alpha structural class. Here, this methodology is extended to Mainly-Beta and Alpha-Beta class proteins. Compared to other fold recognition methods based on HMMs, this approach is novel in that only secondary structure information is used. Each HMM is trained from known secondary structure sequences of proteins having a similar fold. Secondary structure prediction is performed for the amino acid sequence of a query protein. The predicted fold of a query protein is the fold described by the model fitting the predicted sequence the best. RESULTS: After model cross-validation, the success rate on 44 test proteins covering the three structural classes was found to be 59%. On seven fold predictions performed prior to the publication of experimental structure, the success rate was 71%. In conclusion, this approach manages to capture important information about the fold of a protein embedded in the length and arrangement of the predicted helices, strands and coils along the polypeptide chain. When a more extensive library of HMMs representing the universe of known structural families is available (work in progress), the program will allow rapid screening of genomic databases and sequence annotation when fold similarity is not detectable from the amino acid sequence. AVAILABILITY: FORESST web server at http://absalpha.dcrt.nih.gov:8008/ for the library of HMMs of structural families used in this paper. FORESST web server at http://www.tigr.org/ for a more extensive library of HMMs (work in progress). CONTACT: [email protected]; [email protected]; [email protected]

## Age estimates of two common mutations causing factor XI deficiency: recent genetic drift is not necessary for elevated disease incidence among Ashkenazi Jews. (7/3175)

The type II and type III mutations at the FXI locus, which cause coagulation factor XI deficiency, have high frequencies in Jewish populations. The type III mutation is largely restricted to Ashkenazi Jews, but the type II mutation is observed at high frequency in both Ashkenazi and Iraqi Jews, suggesting the possibility that the mutation appeared before the separation of these communities. Here we report estimates of the ages of the type II and type III mutations, based on the observed distribution of allelic variants at a flanking microsatellite marker (D4S171). The results are consistent with a recent origin for the type III mutation but suggest that the type II mutation appeared >120 generations ago. This finding demonstrates that the high frequency of the type II mutation among Jews is independent of the demographic upheavals among Ashkenazi Jews in the 16th and 17th centuries.

## Does over-the-counter nicotine replacement therapy improve smokers' life expectancy? (8/3175)

OBJECTIVE: To determine the public health benefits of making nicotine replacement therapy available without prescription, in terms of number of quitters and life expectancy. DESIGN: A decision-analytic model was developed to compare the policy of over-the-counter (OTC) availability of nicotine replacement therapy with that of prescription (Rx) availability for the adult smoking population in the United States. MAIN OUTCOME MEASURES: Long-term (six-month) quit rates, life expectancy, and smoking attributable mortality (SAM) rates. RESULTS: OTC availability of nicotine replacement therapy would result in 91,151 additional successful quitters over a six-month period, and a cumulative total of approximately 1.7 million additional quitters over 25 years. All-cause SAM would decrease by 348 deaths per year and 2940 deaths per year at six months and five years, respectively. Relative to Rx nicotine replacement therapy availability, OTC availability would result in an average gain in life expectancy across the entire adult smoking population of 0.196 years per smoker. In sensitivity analyses, the benefits of OTC availability were evident across a wide range of changes in baseline parameters. CONCLUSIONS: Compared with Rx availability of nicotine replacement therapy, OTC availability would result in more successful quitters, fewer smoking-attributable deaths, and increased life expectancy for current smokers.

###### "Markov Chain Monte Carlo With Application to Image Denoising" by Jakub Michel

###### A Motion Estimation algorithm based on Markov Chain Model - IEEE Conference Publication

###### Markov chain Monte Carlo : stochastic simulation for Bayesian inference

###### talks.cam : Approximations for Markov chain models

###### Linear Algebra, Markov Chains, and Queueing Models

###### Hierarchical Multiple Markov Chain Model for Unsupervised Texture Segmentation

###### A comparison of strategies for Markov chain Monte Carlo computation in quantitative genetics | Genetics Selection Evolution |...

###### Sciencepaper Online (中国科技论文在线)

###### Hidden Markov models, Markov chains in random environments, and systems theory | Math

###### DROPS - Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases

###### Markov chain Monte Carlo « Jared Lander

###### Frontiers | On the Use of Markov Models in Pharmacoeconomics: Pros and Cons and Implications for Policy Makers | Public Health

###### Clayton, D. (1996) Generalized Linear Mixed Models. In Gilks, W., et al., Eds., Markov Chain Monte Carlo in Practice, Chapman &...

###### Embedded System of DNA Exon Predictor using Hidden Markov Model

###### Video library: Eugene A. Feinberg, Average-cost Markov Decision Processes with weakly continuous

###### Chromosome Classification Using Continuous Hidden Markov Models | Sciweavers

###### A Markov chain model for studying suicide dynamics: an illustration of the Rose theorem | BMC Public Health | Full Text

###### Markov models and applications - ppt download

###### Comment on "On the Metropolis-Hastings Acceptance Probability to Add or Drop a Quantitative Trait Locus in Markov Chain Monte...

###### On Input Design for System Identification : Input Design Using Markov Chains

###### Contextual Markov Decision Processes using Generalized Linear Models | DeepAI

###### Markov Decision Processes: Discrete Stochastic Dynamic Programming | Ebook | Ellibs Ebookstore

###### PyVideo.org · Title To Be Determined; A tale of graphs and Markov chains

###### Semantic Indexing of Soccer Audio-Visual Sequences: A Multimodal Approach Based on Controlled Markov Chains

###### Understanding Molecular Kinetics with Markov State Models | Biomedical Computation Review

###### Read An Introduction To Markov State Models And Their Application To Long Timescale Molecular Simulation by Gregory R. Bowman ;...

###### A Markov Chain Monte Carlo Approach for Joint Inference of Population Structure and Inbreeding Rates From Multilocus Genotype...

###### Markov Chain Transition Probabilities Help.

###### The Dynamics of Repeat Migration: A Markov Chain Analysis

###### Leicester Research Archive: Entropy: The Markov ordering approach

###### Capturing Human Sequence-Learning Abilities in Configuration Design Tasks through Markov Chains - Human Systems Design Lab

###### METHOD FOR CREATING A MARKOV PROCESS THAT GENERATES SEQUENCES - Patent application

###### Binding site discovery from nucleic acid sequences by discriminative learning of hidden Markov models

###### Free The Markov Chain Algorithm Download

###### Markov Chains, part I - PDF

###### Linear Models and Markov Chain MBA Assignment Help, Online Business Assignment Writing Service and Homework Help

###### Hybrid Model-Based Classification of the Action for Brain-Compute...: Ingenta Connect

###### Contextual Image Segmentation based on AdaBoost and Markov Random Fields ...

###### Maximizing Entropy over Markov Processes

###### Quantifying the natural history of breast cancer | [email protected]

###### Difference between revisions of "Past Probability Seminars Spring 2013" - UW-Math Wiki

###### Difference between revisions of "Probability Seminar" - UW-Math Wiki

###### EmissionParam-methods: A parameter class for computing Emission probabilities in VanillaICE: A Hidden Markov Model for high...

###### Exponential convergence of adaptive importance sampling algorithms for Markov chains - Netherlands Society for Statistics and...

###### Development of a Novel Markov Chain Model for the Prediction of Head and Neck Squamous Cell Carcinoma Dissemination

###### Population Dynamics through Hierarchically Embedded Markov chains | School of Social Sciences | UCI Social Sciences

###### [1601.05078] Understanding Past Population Dynamics: Bayesian Coalescent-Based Modeling with Covariates

###### Browse - Oxford Scholarship

###### Markov Chain Monte Carlo Method without Detailed Balance - Condensed Matter > Statistical Mechanics - pdf...

###### AMS261: Probability Theory with Markov Chains | Course Web Pages

###### First Passage Probability Estimation of Wind Turbines by Markov Chain Monte Carlo - Danish National Research Database-Den...

###### Analysis of Markov Chain Models of Adaptive Processes :: Special Interest (grey literature, ephemera, reports from the history...

###### The Price is Right : Project valuation for Project Portfolio Management using Markov Chain Monte Carlo Simulation

###### Bayesian estimation of genomic copy number with single nucleotide polymorphism genotyping arrays | BMC Research Notes | Full...

###### Markov chain Monte Carlo. A review article for section 10 (probability theory). - Lancaster EPrints

###### Optimizing Prescription of Chinese Herbal Medicine for Unstable Angina Based on Partially Observable Markov Decision Process

###### Reversible jump MCMC for volumetric calibration. - Lancaster EPrints

###### AES E-Library » Restoration of Nonlinearly Distorted Audio Using Markov Chain Monte Carlo Methods

###### Rao-Blackwellization of Particle Markov Chain Monte Carlo Methods Using Forward Filtering Backward Sampling - IEEE Journals &...

###### Similar papers for Wavelet-Based Texture Analysis and Synthesis Using Hidden Markov Models - Semantic Scholar

###### Markov Chains, Renewal, Branching and Coalescent Processes : Four Topics in Probability Theory

###### A MOMENT-MATCHING METHOD FOR APPROXIMATING VECTOR AUTOREGRESSIVE PROCESSES BY FINITE-STATE MARKOV CHAINS - Gospodinov - 2013 -...

###### Smooth On-Line Learning Algorithms for Hidden Markov Models | MIT CogNet

###### Quantitative analysis of pulmonary emphysema using isotropic Gaussian Markov random fields - ePrints Soton

###### A Model to Predict Outcome of The Diabetic Foot? A Nine State Solution. - DF Blog

###### "A Markov Decision Model to Evaluate Outsourcing in Reverse Logistics" by Marco A. Serrato and Sarah M. Ryan

###### Quantile forecasting for credit risk management using possibly misspecified hidden Markov models

###### [1907.11899] Deep learning-based prediction of kinetic parameters from myocardial perfusion MRI

###### Testing Intuitions about Markov Chain Monte Carlo: Do I have a bug? | This Number Crunching Life

###### Bayesian inference in phylogeny - Wikipedia

###### Discrete Mathematics Problem on Markov Chains - Transience and Recurrence: Another Knight On The Tiles - Mark Hennings |...

###### { Xn, n > 0 } is a Markov Chain, such that Xn is an element of {1, 2}, P[Xn+1 = 2 | Xn = 1] = a, which is a probability...

###### Fully Bayesian logistic regression with hyper-LASSO priors for high-dimensional feature selection

###### evolving all we are: June 2013

###### Particle Gibbs Split-Merge Sampling for Bayesian Inference in Mixture Models

###### Precise acquisition of global navigation satellite system signals in the presence of multipath and influence on tracking...

###### A stochastic approach to quantifying the blur with uncertainty estimation for high-energy X-ray imaging systems (Journal...

###### A statistical approach to estimating the strength of cell-cell interactions under the differential adhesion hypothesis |...

###### Get PDF - Decision analysis with Markov processes supports early surgery for large-angle infantile esotropia

###### Variational Bayes inference of spatial mixture models for segmentation. - Wellcome Centre for Integrative Neuroimaging

###### Newton, Michael A. | Biostatistics & Medical Informatics

###### mimsy.io

###### Publications - Professor David Hand

###### "Analysis of Models for Longitudinal and Clustered Binary Data" by Weiming Yang

###### Optimal point process filtering and estimation of the coalescent process. - Medawar

###### Markov property and chemical oscillators | Physics Forums - The Fusion of Science and Community

###### One-sided versus two-sided points of view - Netherlands Society for Statistics and Operations Research

###### Bayesian model comparison

###### Bayesian inference on multivariate asymmetric jump-diffusion models | Korea Science

###### Stationary distribution and stochastic Hopf bifurcation for a predator-prey system with noises

###### Bayesian Modeling and Computation for Networks: 2008-2010 | Pacific Institute for the Mathematical Sciences - PIMS

###### Related articles

###### Location matters for pro-environmental behavior: a spatial Markov Chains approach to proximity effects in differentiated waste...

###### Fork-join queue

###### James R. Norris

###### Math.NET Numerics

###### Pseudorandomness

###### Keith Martin Ball

###### Cromwell's rule

###### Greek letters used in mathematics, science, and engineering

###### Ising model - Wikipedia, the free encyclopedia

###### Spatial analysis

###### Bayes estimator

###### Stochastic geometry

###### Stan (software)

###### Markov Chains and Invariant Probabilities | Onesimo Hernandez-Lerma | Springer

###### Markov Chain == Dynamic Bayesian Network? - Artificial Intelligence - GameDev.net

###### Monotone dependence in graphical models for multivariate Markov chains

###### Markov Chain Monte Carlo - Sampling Methods | Coursera

###### Stochastic dynamics: Markov chains and random transformations

###### Markov Chains - Gibbs Fields, Monte Carlo Simulation, and Queues | Pierre Bremaud | Springer

###### 9 Best markov chain monte carlo jobs (Hiring Now!) | SimplyHired

###### Reversible-Jump Markov Chain Monte Carlo for Quantitative Trait Loci Mapping | Genetics

###### Infinite-State Verification: From Transition Systems to Markov Chains - IEEE Conference Publication

###### Ask HN: Best place to start learning about Markov Chains? | Hacker News

###### Deterioration Prediction of Urban Bridges on Network Level Using Markov-Chain Model

###### A Motion Estimation algorithm based on Markov Chain Model - IEEE Conference Publication

###### Markov Chain Monte Carlo in Practice - 1st Edition - W.R. Gilks - S.

###### Seminar: The Role of Kemeny's Constant in Properties of Markov Chains - The University of Nottingham

###### An Expectation Maximization Algorithm to Model Failure Times by Continuous-Time Markov Chains

###### Equivalence of Linear Boltzmann Chains and Hidden Markov Models | MIT CogNet

###### Estimation of Admixture Proportions: A Likelihood-Based Approach Using Markov Chain Monte Carlo | Genetics

Topics: MCMC, Algorithms, Processes, Recurrence, Bayesian, Definition of a Markov chain, Probabilities, Probability theory, Estimation, Simple Markov chain, Homogeneous Markov chain, Strong Markov property, Numerical solution, Stationary, Ergodic, Transition matrix, Model, Continuous, Aperiodic, Andrey Markov, HMMs, Matrices, Models, Methods, Approximation framework, Spaces, Approximations, Martingales, Simulation, Directed graph, Theory, Entropy, Dynamics, Time, Reversible, Graphs, Distributions, Couplings, Mathematically, Approach, Intuitions

###### MCMC

- Most commonly used among these is the class of Markov Chain Monte Carlo (MCMC) algorithms, which includes the simple Gibbs sampling algorithm, as well as a family of methods known as Metropolis-Hastings. (coursera.org)
- OVER the past decade there has been a significant increase in the application of Markov chain Monte Carlo (MCMC) methods to modeling data. (genetics.org)
- Markov Chain Monte Carlo in Practice introduces MCMC methods and their applications, providing some theoretical background as well. (routledge.com)
- A Markov chain Monte Carlo (MCMC) simulation is a method of estimating an unknown probability distribution for the outcome of a complex process (a posterior distribution). (cdc.gov)
- One of the major concerns for Markov Chain Monte Carlo (MCMC) algorithms is that they can take a long time to converge to the desired stationary distribution. (rice.edu)
- Our framework, which we call the method of "shepherding distributions", relies on the introduction of an auxiliary distribution called a shepherding distribution (SD) that uses several MCMC chains running in parallel. (rice.edu)
- The Markov chain Monte Carlo (MCMC) method is a general simulation method for sampling from posterior distributions and computing posterior quantities of interest. (sas.com)
- Markov chain Monte Carlo (MCMC) is a statistical innovation that allows researchers to fit far more complex models to data than is feasible using conventional methods. (usgs.gov)
- The EB approach usually relies on the penalized quasi-likelihood (PQL), while the FB approach, which has increasingly become more popular in the recent past, usually uses Markov chain Monte Carlo (McMC) techniques. (scirp.org)
- Here we have investigated Markov chain Monte Carlo (MCMC) algorithms as a method for optimizing the multi-dimensional coefficient space. (spie.org)
- Markov chain Monte Carlo (MCMC) methods provide consistent approximations of integrals as the number of iterations goes to infinity. (bu.edu)
- In this paper we define a class of MCMC algorithms, the generalized self regenerative chains (GSR), generalizing the SR chain of Sahu and Zhigljavski (2001), which contains rejection sampling as a special case. (uio.no)
- Markov chain Monte Carlo (MCMC) sampling, Metropolis-Hastings (MH) algorithm (Metropolis et al.). (auckland.ac.nz)
- In order to account for the uncertainty of the weight and to improve the universal applicability of the CM, the authors apply Markov chain Monte Carlo based on the adaptive Metropolis algorithm (AM-MCMC) to estimate the weight of a single model in the CM, obtaining the probability distribution of each weight and the joint probability density of all the weights. (iwaponline.com)
- mcmc - Stat 5102 Notes: Markov Chain Monte Carlo and Bayesian Inference. (coursehero.com)
- We will use Markov chain Monte Carlo (MCMC). (coursehero.com)
- In particular, Markov chain Monte Carlo (MCMC) methods have become increasingly popular as they allow for a rigorous analysis of parameter and prediction uncertainties without the need for assuming parameter identifiability or removing non-identifiable parameters. (biomedcentral.com)
- A broad spectrum of MCMC algorithms have been proposed, including single- and multi-chain approaches. (biomedcentral.com)
- The comparison of MCMC algorithms, initialization and adaptation schemes revealed that overall multi-chain algorithms perform better than single-chain algorithms. (biomedcentral.com)
- Furthermore, our results confirm the need to address exploration quality of MCMC chains before applying the commonly used quality measure of effective sample size to prevent false analysis conclusions. (biomedcentral.com)
- For one project I've been working on recently, I'm using a Markov Chain Monte Carlo (MCMC) method known as slice sampling . (smellthedata.com)
- Now, debugging MCMC algorithms is somewhat troublesome, due to their random nature and the fact that chains just sometimes mix slowly , but there are some good ways to be pretty sure that you get things right. (smellthedata.com)
- Markov chain Monte Carlo (MCMC) techniques can provide estimates of the posterior density of orders while accounting naturally for missing data, data errors and unknown parameters. (semanticscholar.org)
- Here we provide methodologies to determine the minimum sample size needed to detect dependence in 2 x 2 x 2 tables based on Fisher's exact test evaluated exactly or by Markov chain Monte Carlo (MCMC), assuming only the case total L and the control total N are known. (isharonline.org)
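The snippets above repeatedly invoke the Metropolis-Hastings family. As a concrete illustration, here is a minimal random-walk Metropolis sketch in Python; the standard-normal target and the proposal scale are arbitrary choices for the example, not drawn from any of the cited papers:

```python
import math
import random

def metropolis(log_target, x0, steps, scale=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ Normal(x, scale), accept with
    probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, scale)
        # Symmetric proposal, so the Hastings ratio reduces to the target ratio.
        if math.log(1.0 - rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, known only up to a constant (log density -x^2 / 2).
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The empirical mean and variance of the chain should approach 0 and 1; the convergence and mixing concerns quoted above are exactly about how quickly that happens.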

###### Algorithms

- We also consider generalizations of the Metropolis - Hastings independent chains or Metropolized independent sampling, and for some of these algorithms we are able to give the convergence rates and establish a lower bound for the asymptotic efficiency. (uio.no)
- Ching W, Ng MK (2006) Markov chains: models, algorithms and applications. (springerprofessional.de)
- We present algorithms for coupling and training hidden Markov models (HMMs) to model interacting processes, and demonstrate their superiority to conventional HMMs in a vision task classifying two-handed actions. (psu.edu)

###### Processes

- Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo , which are used for simulating sampling from complex probability distributions, and have found application in Bayesian statistics and artificial intelligence . (wikipedia.org)
- Note that there is no definitive agreement in the literature on the use of some of the terms that signify special cases of Markov processes. (wikipedia.org)
- Publisher Description (unedited publisher data) Markov chains are central to the understanding of random processes. (worldcat.org)
- In the reviewer's opinion, this is an elegant and most welcome addition to the rich literature of Markov processes. (springer.com)
- We show that a deeper insight into the relations among marginal processes of a multivariate Markov chain can be gained by testing hypotheses of Granger noncausality, contemporaneous independence and monotone dependence. (repec.org)
- The author treats the classic topics of Markov chain theory, both in discrete time and continuous time, as well as the connected topics such as finite Gibbs fields, nonhomogeneous Markov chains, discrete- time regenerative processes, Monte Carlo simulation, simulated annealing, and queuing theory. (springer.com)
- The study of dynamical phenomena in finite populations often requires the consideration of population Markov processes of significant mathematical and computational complexity, which rapidly becomes prohibitive with increasing population size or increasing number of individual configuration states. (uci.edu)
- This talk will discuss a framework that allows one to define a hierarchy of approximations to the stationary distribution of general systems amenable to be described as discrete Markov processes with time invariant transition probabilities and (possibly) a large number of states. (uci.edu)
- For Markov processes on continuous state spaces please use (markov-process) instead. (stackexchange.com)
- Ross ( 1997 ) and Karlin and Taylor ( 1975 ) give a non-measure-theoretic treatment of stochastic processes, including Markov chains. (sas.com)
- The presented framework is part of an exciting recent stream of literature on numerical option pricing, and offers a new perspective that combines the theory of diffusion processes, Markov chains, and Fourier techniques. (springer.com)
- A general framework for pricing Asian options under Markov processes. (springer.com)
- A general framework for time-changed Markov processes and applications. (springer.com)

###### Recurrence

- Motivated by multivariate random recurrence equations we prove a new analogue of the Key Renewal Theorem for functionals of a Markov chain with compact state space in the spirit of Kesten. (uni-muenchen.de)
- Is there a way to analytically compute the recurrence time of a finite Markov process? (mathoverflow.net)
- Let Xn be an irreducible aperiodic recurrent Markov chain with countable state space I and with the mean recurrence times having second moments. (uzh.ch)
- In this work we study the recurrence problem for quantum Markov chains, which are quantum versions of classical Markov chains introduced by S. Gudder and described in terms of completely positive maps. (arxiv.org)
- A notion of monitored recurrence for quantum Markov chains is examined in association with Schur functions, which codify information on the first return to some given state or subspace. (arxiv.org)
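For the finite, irreducible case raised in the mathoverflow question above, mean recurrence times do have an analytic answer: by Kac's formula, the expected return time to state i is 1/πᵢ, where π is the stationary distribution. A pure-Python sketch (the two-state matrix is a made-up example):

```python
def stationary(P, iters=1_000):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Made-up two-state example; any irreducible finite chain works the same way.
P = [[0.7, 0.3],
     [0.6, 0.4]]
pi = stationary(P)                   # -> (2/3, 1/3)
# Kac's formula: the mean return time to state i is 1 / pi_i.
mean_return = [1.0 / p for p in pi]  # -> (1.5, 3.0)
```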

###### Bayesian

- Markov Chain == Dynamic Bayesian Network? (gamedev.net)
- Using the Bayesian approach and the Markov chain Monte Carlo method, an empirical distribution corresponding to the predictive density of the expert estimates can be constructed. (igi-global.com)
- The Markov chain method has been quite successful in modern Bayesian computing. (sas.com)
- Stat 5102 Notes: Markov Chain Monte Carlo and Bayesian Inference Charles J. Geyer April 6, 2009 1 The Problem This is an example of an application of Bayes rule that requires some form of computer analysis. (coursehero.com)

###### Definition of a Markov chain

- A Markov chain is a type of Markov process that has either a discrete state space or a discrete index set (often representing time), but the precise definition of a Markov chain varies. (wikipedia.org)
- Since the probabilities depend only on the current position (value of x) and not on any prior positions, this biased random walk satisfies the definition of a Markov chain. (wikipedia.org)
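The biased random walk described above can be simulated in a few lines; the step probability of 0.7 below is an arbitrary illustrative choice:

```python
import random

def biased_walk(p_up=0.7, steps=10_000, seed=42):
    """Biased random walk on the integers: the next position depends only on
    the current one, so the walk satisfies the Markov property."""
    rng = random.Random(seed)
    x = 0
    path = [x]
    for _ in range(steps):
        x += 1 if rng.random() < p_up else -1
        path.append(x)
    return path

path = biased_walk()
# Empirical mean step; the law of large numbers drives it toward 2 * 0.7 - 1 = 0.4.
drift = path[-1] / (len(path) - 1)
```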

###### Probabilities

- Boltzmann chains model sequences of states by defining state-state transition energies instead of probabilities. (mit.edu)
- Markov Chain Transition Probabilities Help. (mathhelpforum.com)
- Markov chains primarily have to have valid probabilities and then need to satisfy 1st order conditional dependence. (physicsforums.com)
- The entropy rate of an ergodic homogeneous Markov chain taking only two values is an explicit function of its transition probabilities. (ebscohost.com)
- One of the main issues of Markov Chain is the estimation procedure of the transition probabilities. (morebooks.de)
- As a corollary we obtain a central limit theorem for Markov chains associated with iterated function systems with contractive maps and place-dependent Dini-continuous probabilities. (diva-portal.org)
- ∑ⱼ pₜ(i, j) = 1 for all i ∈ F. To define a probability space associated with these transition probabilities we need to set a distribution function π : F → R for the initial state of the chain. (docplayer.net)
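The requirement quoted above — valid probabilities, with each row of the transition matrix summing to one — is easy to check mechanically; a small sketch:

```python
def is_stochastic(P, tol=1e-9):
    """A valid transition matrix has nonnegative entries and rows summing to 1."""
    return all(
        all(p >= 0.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

assert is_stochastic([[0.5, 0.5], [0.1, 0.9]])
assert not is_stochastic([[0.5, 0.6], [0.1, 0.9]])  # first row sums to 1.1
```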

###### Probability theory

- This textbook, aimed at advanced undergraduate or MSc students with some background in basic probability theory, focuses on Markov chains and quickly develops a coherent and rigorous theory whilst showing also how actually to apply it. (worldcat.org)
- In probability theory, a telescoping Markov chain (TMC) is a vector-valued stochastic process that satisfies a Markov property and admits a hierarchical format through a network of transition matrices with cascading dependence. (wikipedia.org)

###### Estimation

- Combination Forecasts Based on Markov Chain Monte Carlo Estimation of the Mode. (igi-global.com)
- A Monte Carlo Estimation of the Entropy for Markov Chains. (ebscohost.com)
- Applying these procedures (the estimation procedure and the test procedures) to the data, we find that the diabetes mellitus data follow a second-order, time-homogeneous Markov chain. (morebooks.de)
- In this research thesis, we implement Markov Chain Monte Carlo techniques and polynomial-chaos expansion based techniques for states and parameters estimation in hidden Markov models (HMM). (tamu.edu)

###### Simple Markov chain

- In this paper we present a technique for authorship attribution based on a simple Markov chain of letters (i.e., just letter bigrams are used). (msu.ru)
- This gives rise to a first-order, or simple, Markov chain model. (msu.ru)
- Dragoon is a simple Markov-chain based post generator for App.net. (coding-aloud.nz)
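A letter-bigram ("simple") Markov chain of the kind described above can be trained and sampled in a few lines; the training string and generator below are toy stand-ins, not the msu.ru or Dragoon implementations:

```python
import random
from collections import defaultdict

def train_bigram_chain(text):
    """First-order letter chain: record which characters follow each character."""
    followers = defaultdict(list)
    for a, b in zip(text, text[1:]):
        followers[a].append(b)
    return followers

def generate(chain, start, length, seed=0):
    """Walk the chain: each next letter depends only on the current one."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length and chain.get(out[-1]):
        out.append(rng.choice(chain[out[-1]]))
    return "".join(out)

chain = train_bigram_chain("abracadabra")
sample = generate(chain, "a", 20)  # every adjacent pair occurs in the training text
```

For authorship attribution, one would compare the likelihood of a disputed text under bigram chains trained on each candidate author's corpus.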

###### Homogeneous Markov chain

- Suppose that (Xn)n≥0 is a first-order homogeneous Markov chain on a discrete state space S and with probability transition matrix p. (scribd.com)

###### Strong Markov property

- 1.4 Strong Markov property. (worldcat.org)
- The strong Markov property states that this intuition is actually correct. (scribd.com)
- An important property of these Markov chains is the strong Markov property. (docplayer.net)
- The strong Markov property is a random version of (1.7). (docplayer.net)
- Proposition 1.1 (Strong Markov Property). (docplayer.net)

###### Numerical solution

- In this book, the first to offer a systematic and detailed treatment of the numerical solution of Markov chains, William Stewart provides scientists on many levels with the power finally to put these techniques to use in the real world. (boomerangbooks.com.au)
- Buy Introduction to the Numerical Solution of Markov Chains by William J. Stewart from Australia's Online Independent Bookstore, Boomerang Books. (boomerangbooks.com.au)
- We'd like to know what you think about it - write a review about Introduction to the Numerical Solution of Markov Chains book by William J. Stewart and you'll earn 50c in Boomerang Bucks loyalty dollars (you must be a Boomerang Books Account Holder - it's free to sign up and there are great benefits! (boomerangbooks.com.au)

###### Stationary

- When mutation rate is positive, the Markov chain modeling an evolutionary algorithm is irreducible and, therefore, has a unique stationary distribution, yet, rather little is known about the stationary distribution. (dagstuhl.de)
- Even if you insist on a deterministic initial state, pretty much any 2-state chain will give a counterexample - take a function whose variance is not maximised by the stationary distribution, and consider approaching stationarity from one extreme or the other. (mathoverflow.net)
- We show that for the generalizations of the SR and independent chains the expected values of these weights characterize the stationary distribution. (uio.no)
- An obvious question to ask then is does a Markov chain have a stationary distribution, and if so is it unique? (docplayer.net)
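The existence/uniqueness question in the last snippet can be probed numerically: for an irreducible, aperiodic finite chain, iterating π ↦ πP from any starting distribution converges to the same fixed point, the unique stationary distribution. A sketch with a made-up 3-state matrix:

```python
def step(pi, P):
    """One application of the chain: (pi P)_j = sum_i pi_i * P[i][j]."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

def iterate(pi, P, iters=5_000):
    for _ in range(iters):
        pi = step(pi, P)
    return pi

# Irreducible, aperiodic 3-state chain: the limit is the same from any start.
P = [[0.5, 0.25, 0.25],
     [0.2, 0.6,  0.2],
     [0.3, 0.3,  0.4]]
a = iterate([1.0, 0.0, 0.0], P)
b = iterate([0.0, 0.0, 1.0], P)
# a == b (up to rounding), and step(a, P) == a: a is the stationary distribution.
```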

###### Ergodic

- The main objective is to give a systematic, self-contained presentation on some key issues about the ergodic behavior of that class of Markov chains. (springer.com)
- In this talk, I will show that this gap can be resolved in the general setting of weakly ergodic signals with nondegenerate observations by exploiting a surprising connection with the theory of Markov chains in random environments. (princeton.edu)

###### Transition matrix

- Create the Markov chain that is characterized by the transition matrix P . (mathworks.com)
- The Markov transition matrix between the states in two consecutive periods is parameterized and estimated using a logit specification and a large panel data with 14 waves. (repec.org)
- Suppose that (Un )n≥0 is a Markov chain deﬁned in a state space S and with a probability transition matrix p. z) + p(y. (scribd.com)
- Then the question is related to the spectral gap of the transition matrix $P:\Omega\times\Omega\to[0,1]$ of your chain. (mathoverflow.net)
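The spectral-gap remark above has a closed form in the two-state case: for P = [[1−a, a], [b, 1−b]] the eigenvalues are 1 and 1−a−b, and the distance to stationarity shrinks by a factor of |1−a−b| per step. A numerical check (a and b are arbitrary choices):

```python
def two_state_spectral_gap(a, b):
    """For P = [[1-a, a], [b, 1-b]] the eigenvalues are 1 and 1-a-b, so the
    spectral gap is 1 - |1-a-b|; it controls the geometric rate of mixing."""
    return 1.0 - abs(1.0 - a - b)

a, b = 0.3, 0.6
lam2 = 1.0 - a - b               # second eigenvalue, here 0.1
pi = [b / (a + b), a / (a + b)]  # stationary distribution
dist = lambda q: abs(q[0] - pi[0]) + abs(q[1] - pi[1])

p = [1.0, 0.0]
d0 = dist(p)
for _ in range(10):              # each step shrinks the distance by |lam2|
    p = [p[0] * (1 - a) + p[1] * b, p[0] * a + p[1] * (1 - b)]
ratio = dist(p) / d0             # approximately lam2 ** 10
```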

###### Model

- A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. (wikipedia.org)
- You can then model the probability that you'll end up at any one given place after n steps as a markov chain. (ycombinator.com)
- Therefore a Markov-chain model capable of considering maintenance factors is proposed in this study. (hindawi.com)
- The Markov-chain model proposed can predict not only the distribution of the percentage of different condition rating (CR) grades on network level in any year but also the deterioration tendency of single bridge with any state. (hindawi.com)
- Among many approaches treating TTF with nonexponential distributions, the extended Markov-model [ 11 ] is recommendable. (hindawi.com)
- In the extended Markov-model, an operation state is divided into substates with different levels of failure rates, which result in a nonconstant failure rate of the operation state. (hindawi.com)
- In this note I demonstrate that under the simple condition that the state sequence has a mandatory end state, the probability distribution assigned by a strictly linear Boltzmann chain is identical to that assigned by a hidden Markov model. (mit.edu)
- For a project I am using a Markov Chain model with 17 states. (mathhelpforum.com)
- Mission-Critical Group Decision-Making: Solving the Problem of Decision Preference Change in Group Decision-Making Using Markov Chain Model. (igi-global.com)
- This article intends to address this neglected group decision-making research issue in the literature by proposing a new approach based on the Markov chain model. (igi-global.com)
- There are many problems that can be modeled using both Markov chain and Hidden Markov model (HMM). (stackexchange.com)
- A finite Markov chain is used to model the input of the system. (diva-portal.org)
- This allows input amplitude constraints to be included directly in the input model, by properly choosing the state space of the Markov chain. (diva-portal.org)
- Our approach is to model the disease process via a latent continuous time Markov chain, enabling greater flexibility yet retaining tractability. (washington.edu)
- Theoretical aspects of the model are examined and a simulation algorithm is developed through which the stochastic properties of summaries of the extremal behaviour of the chain are evaluated. (lancs.ac.uk)
- Everingham and Rydell's Markov chain model of cocaine demand is modified and updated in light of recent data. (ebscohost.com)
- We analyze the dynamics of nosocomial infections in intensive care units ( ICUs) by using a Markov chain model. (ebscohost.com)
- A cornerstone of applied probability, Markov chains can be used to help model how plants grow, chemicals react, and atoms diffuse-and applications are increasingly being found in such areas as engineering, computer science, economics, and education. (boomerangbooks.com.au)
- The proposed model is based on a Markov process that represents the projects in the firm. (diva-portal.org)
- This study initially shows that it is possible to model the project portfolio as a Markov process. (diva-portal.org)
- A Markov chain illness and death model is proposed to determine suicide dynamic in a population and examine its effectiveness for reducing the number of suicides by modifying certain parameters of the model. (biomedcentral.com)
- Assuming a population with replacement, the suicide risk of the population was estimated by determining the final state of the Markov model. (biomedcentral.com)
- Some empirical results demonstrating the effectiveness of suicide prevention efforts by modifying some parameters of the Markov model will be provided. (biomedcentral.com)
- Explanation of the Matlab functions in the stocHHastic package The attached Matlab code implements the stochastic Hodgkin-Huxley model with ion-channel gating modeled as Markov chains. (yale.edu)
- We provide both the full Markov chain model as well as its stochastic-shielding approximation (folder HH). (yale.edu)
- We also suggest application of symmetric circulants to model very special isotropic Markov chains. (scirp.org)
- Our goal is to estimate the probability density function (PDF) of the states and parameters given noisy observations of the output of the hidden Markov model. (tamu.edu)
- Here I propose a multistate Markov chain transition model with extensions that account for longitudinal and within-cluster correlations. (bu.edu)

###### Continuous

- A continuous-time process is called a continuous-time Markov chain (CTMC). (wikipedia.org)
- Notice that the general state space continuous-time Markov chain is general to such a degree that it has no designated term. (wikipedia.org)
- Both discrete-time and continuous-time chains are studied. (worldcat.org)
- 2. Continuous-time Markov chains I. 2.1 Q-matrices and their exponentials. (worldcat.org)
- 3. Continuous-time Markov chains II. (worldcat.org)
- So, the state of the system was a continuous time, discrete state space Markov process subordinated to a Poisson process. (ycombinator.com)
- In this paper, an expectation maximization algorithm is proposed to construct a suitable continuous-time Markov chain which models the failure time data by the first time reaching the absorbing state. (hindawi.com)
- This method assumes that the structure of the system is modelled with a continuous-time Markov chain (CTMC). (hindawi.com)
- The Markov chain is constructed by targeting the conditional moments of the underlying continuous process. (wiley.com)
- This book is a survey of work on passage times in stable Markov chains with a discrete state space and a continuous time. (booktopia.com.au)
- A typical approach is to assume a standard continuous time Markov chain for the disease process, due to its computational tractability. (washington.edu)
- In this talk, I will present a discrete counterpart to this result: given a reversible Markov kernel on a finite set, there exists a Riemannian metric on the space of probability densities, for which the law of the continuous time Markov chain evolves as the gradient flow of the entropy. (newton.ac.uk)
- He provides extensive background to both discrete-time and continuous-time Markov chains and examines many different numerical computing methods-direct, single- and multi-vector iterative, and projection methods. (boomerangbooks.com.au)
- In this chapter, we present recent developments in using the tools of continuous-time Markov chains for the valuation of European and path-dependent financial derivatives. (springer.com)
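A CTMC like those discussed above can be simulated directly from its generator matrix Q: hold in state i for an Exp(−Q[i][i])-distributed time, then jump in proportion to the off-diagonal rates. A sketch with an invented two-state generator:

```python
import random

def simulate_ctmc(Q, state, t_end, seed=0):
    """Simulate a CTMC from generator Q: hold in state i for an Exp(-Q[i][i])
    time, then jump to j != i with probability Q[i][j] / (-Q[i][i])."""
    rng = random.Random(seed)
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state][state]
        if rate <= 0.0:          # absorbing state
            break
        t += rng.expovariate(rate)
        if t >= t_end:
            break
        r, acc = rng.random() * rate, 0.0
        for j, q in enumerate(Q[state]):
            if j != state:
                acc += q
                if r < acc:
                    state = j
                    break
        path.append((t, state))
    return path

# Invented two-state generator: leave state 0 at rate 1, state 1 at rate 2.
Q = [[-1.0, 1.0], [2.0, -2.0]]
T = 10_000.0
path = simulate_ctmc(Q, 0, T)
# Time-weighted occupancy of state 0; the stationary value is 2/3.
occ0 = sum(t2 - t1 for (t1, s), (t2, _) in zip(path, path[1:] + [(T, None)])
           if s == 0) / T
```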

###### Aperiodic

- I can give you the answer for a certain class of Markov Chains, i.e. for reversible, irreducible and aperiodic Markov Chains on a finite state space. (mathoverflow.net)

###### Andrey Markov

- It is named after the Russian mathematician Andrey Markov . (wikipedia.org)
- In a Markov chain (named for Russian mathematician Andrey Markov [ Figure ]), the probability of the next computed estimated outcome depends only on the current estimate and not on prior estimates. (cdc.gov)

###### HMMs

- The main purpose of this work is to investigate the performance of hidden Markov (chain) models (HMMs) in comparison to hidden Markov random field (HMRF) models when predicting CT images of head. (diva-portal.org)
- Hidden Markov models (HMMs) are a powerful probabilistic tool for modeling sequential data, and have been applied with success to many text-related tasks, such as part-of-speech tagging, text segmentation and information extraction. (psu.edu)

###### Matrices

- Second, we report two new applications of these matrices to isotropic Markov chain models and electrical impedance tomography on a homogeneous disk with equidistant electrodes. (scirp.org)

###### Models

- For instance, I found tons of verbose material on Hidden Markov Models, but I still havent a freaking clue on what the damn thing is, because not a single time did I ever see a reference to introductory material. (gamedev.net)
- Monotone dependence in graphical models for multivariate Markov chains ," Metrika: International Journal for Theoretical and Applied Statistics , Springer, vol. 76(7), pages 873-885, October. (repec.org)
- Graphical models for multivariate Markov chains ," Journal of Multivariate Analysis , Elsevier, vol. 107(C), pages 90-103. (repec.org)
- Markov chain Monte Carlo methods) to calibrate micro-simulation models. (simplyhired.com)
- The discrete probability models are represented by Markov process, which is based on the concept of probabilistic cumulative damage [ 8 ] and now commonly used in performance prediction of infrastructure facilities [ 9 ]. (hindawi.com)
- Methods of supplementary variables [ 14 ] and the device of stages [ 15 ] are two classical approaches of extended Markov-models. (hindawi.com)
- Several authors have studied the relationship between hidden Markov models and "Boltzmann chains" with a linear or "time-sliced" architecture. (mit.edu)
- An essential ingredient of the statistical inference theory for hidden Markov models is the nonlinear filter. (princeton.edu)
- Reversible jump Markov chain Monte Carlo methods are used to implement a sampling scheme in which the Markov chain can jump between parameter subspaces corresponding to models with different numbers of quantitative-trait loci (QTL's). (nih.gov)
- Nicolis J.S., Protonotarios E.N., Voulodemou I. (1978) Controlled Markov Chain Models for Biological Hierarchies. (springer.com)
- Models for the extremes of Markov chains. (lancs.ac.uk)
- In this paper, we focus on Markov chains, deriving a class of models for their joint tail which allows the degree of clustering of extremes to decrease at high levels, overcoming a key Limitation in current methodologies. (lancs.ac.uk)
- Markov Decision Process (MDP) models have been widely used in decision making under uncertainty. (edu.sa)
- This has usually been done with regression models, but Markov chain methods have also been applied. (wikipedia.org)

###### Methods

- used in practice is the class of Markov Chain Monte Carlo methods. (coursera.org)
- Each of these studies applied Markov chain Monte Carlo methods to produce more accurate and inclusive results. (routledge.com)
- Different physical methods of shuffling correspond to different chains. (berkeley.edu)
- Most popular methods, such as Markov chain Monte Carlo sampling, perform poorly on strongly multi-modal probability distributions, rarely jumping between modes or settling on just one mode without finding others. (arxiv.org)
- We present the results of a thorough benchmarking of state-of-the-art single- and multi-chain sampling methods, including Adaptive Metropolis, Delayed Rejection Adaptive Metropolis, Metropolis adjusted Langevin algorithm, Parallel Tempering and Parallel Hierarchical Sampling. (biomedcentral.com)

###### Approximation framework

- We use the concept of markov chains and introduce the notion of a Markov rough approximation framework (MRAF), wherein a probability distribution function is obtained corresponding to a set of rough approximations. (springerprofessional.de)

###### Spaces

- However, many applications of Markov chains employ finite or countably infinite state spaces, which have a more straightforward statistical analysis. (wikipedia.org)
- A distinguishing feature of the book is the emphasis on the role of expected occupation measures to study the long-run behavior of Markov chains on uncountable spaces. (springer.com)
- Thus (1.14) states the chain allows no communication between the subsets A₀ and A₁ of F. Hence we may reduce the original chain to independent chains on the reduced state spaces A₀, A₁. (docplayer.net)

###### Approximations

- Efficient simulation of stochastic differential equations based on Markov Chain approximations with applications. (springer.com)

###### Martingales

- A distinguishing feature is an introduction to more advanced topics such as martingales and potentials in the established context of Markov chains. (worldcat.org)

###### Simulation

- In other words, a Markov chain is able to improve its approximation to the true distribution at each step in the simulation. (sas.com)
- Addendum: Here is a simulation of 100,000 steps of the chain using R software, where state 0 = Sun and state 1 = Rain. (stackexchange.com)
- The simulation returns the total current, sodium current, potassium current, a time-track vector in seconds, a sodium matrix and a potassium matrix giving the number of channels in each Markov state, the total numbers of sodium and potassium channels, and the time it took the simulation to run. (yale.edu)
- The determinant of the Fisher information matrix is $\operatorname{trigamma}(\alpha)\cdot\alpha/\lambda^{2} - (1/\lambda)(1/\lambda) = (\alpha\,\operatorname{trigamma}(\alpha)-1)/\lambda^{2}$, and the Jeffreys prior is $g(\alpha,\lambda) = \sqrt{(\alpha\,\operatorname{trigamma}(\alpha)-1)/\lambda^{2}}$. The "Monte Carlo method" refers to the theory and practice of learning about probability distributions by simulation rather than calculus. (coursehero.com)
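The stackexchange snippet above describes a 100,000-step R simulation of a Sun/Rain chain but does not give its transition matrix, so the matrix below is an assumption for illustration. A Python equivalent:

```python
import random

def simulate(P, start, n_steps, seed=1):
    """Simulate a finite Markov chain and return the fraction of
    time spent in each state (the occupation frequencies)."""
    rng = random.Random(seed)
    state = start
    counts = [0] * len(P)
    for _ in range(n_steps):
        # Draw the next state from the row of P for the current state.
        state = rng.choices(range(len(P)), weights=P[state])[0]
        counts[state] += 1
    return [c / n_steps for c in counts]

# Assumed transition matrix; state 0 = Sun, state 1 = Rain.
P = [[0.8, 0.2],   # Sun -> Sun with prob 0.8
     [0.4, 0.6]]   # Rain -> Sun with prob 0.4
freqs = simulate(P, start=0, n_steps=100_000)
# The occupation frequencies approach the stationary distribution (2/3, 1/3).
```

This is the sense in which "a Markov chain improves its approximation to the true distribution at each step": the empirical occupation frequencies converge to the stationary distribution.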

###### Directed graph

- Plot a directed graph of the Markov chain and identify classes using node color and markers. (mathworks.com)
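The class identification that the MathWorks toolbox performs on the chain's directed graph can be sketched without any library: two states communicate when each is reachable from the other, computed here with a Warshall transitive closure over a small assumed transition matrix.

```python
def communicating_classes(P):
    """Partition states into communicating classes:
    x ~ y iff x reaches y and y reaches x in the transition graph."""
    n = len(P)
    reach = [[i == j or P[i][j] > 0 for j in range(n)] for i in range(n)]
    for k in range(n):                      # Warshall transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    classes, seen = [], set()
    for i in range(n):
        if i not in seen:
            cls = [j for j in range(n) if reach[i][j] and reach[j][i]]
            seen.update(cls)
            classes.append(cls)
    return classes

# States 0 and 1 communicate; state 2 is absorbing and forms its own class.
P = [[0.5, 0.5, 0.0],
     [0.3, 0.5, 0.2],
     [0.0, 0.0, 1.0]]
# communicating_classes(P) -> [[0, 1], [2]]
```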

###### Theory

- In this book, the author begins with the elementary theory of Markov chains and very progressively brings the reader to the more advanced topics. (springer.com)
- General state-space Markov chain theory has seen several developments that have made it both more accessible and more powerful to the general statistician. (routledge.com)
- We develop the theory of cyclic Markov chains and apply it to the El Nino-Southern Oscillation (ENSO) predictability problem. (knmi.nl)
- But the knight is moving as random walk on a finite graph (rather than just some more general Markov chain), and elementary theory reduces the problem to counting the number of edges of the graph, giving the answer of 168 moves. (berkeley.edu)
- General Theory of Markov Chains We have already discussed the standard random walk on the integers Z. A Markov Chain can be viewed as a generalization of this. (docplayer.net)
- The so called Markov property or no memory property (1.4) characterizes a Markov chain, and can be an alternative starting point for the theory of Markov chains. (docplayer.net)
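The knight answer quoted above follows from a standard fact: for random walk on a graph, the stationary probability of a vertex is proportional to its degree, so the expected return time to $v$ is $2|E|/\deg(v)$. A sketch that counts the edges of the knight-move graph on an 8x8 board:

```python
def knight_degrees():
    """Degree of each square in the knight-move graph on an 8x8 board."""
    moves = [(1, 2), (2, 1), (-1, 2), (2, -1),
             (1, -2), (-2, 1), (-1, -2), (-2, -1)]
    return {(r, c): sum(0 <= r + dr < 8 and 0 <= c + dc < 8
                        for dr, dc in moves)
            for r in range(8) for c in range(8)}

deg = knight_degrees()
edges = sum(deg.values()) // 2   # each edge is counted from both endpoints
# A corner square has degree 2 and the graph has 168 edges, so the
# expected return time to a corner is 2 * 168 / 2 = 168 moves.
```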

###### Entropy

- Asymptotic study of an estimator of the entropy rate of a two-state Markov chain for one long trajectory. (ebscohost.com)
- We introduce an estimate of the entropy $\mathbb{E}_{p^t}(\log p^t)$ of the marginal density $p^t$ of a (possibly inhomogeneous) Markov chain at time $t \geq 1$. (ebscohost.com)
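The abstracts above concern estimating the entropy rate of a two-state chain from one long trajectory. As a sketch of the idea, and not of the estimator studied in those papers, here is the plug-in approach: estimate the transition matrix from transition counts, then compute $-\sum_i \hat\pi_i \sum_j \hat P_{ij} \log \hat P_{ij}$. The generating chain below is an assumption.

```python
import math
import random

def entropy_rate_plugin(path, n_states):
    """Plug-in entropy-rate estimate (in nats) from one trajectory:
    count transitions, then return -sum_i pi_i sum_j P_ij log P_ij,
    with pi_i estimated by the fraction of transitions leaving i."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(path, path[1:]):
        counts[a][b] += 1
    total = len(path) - 1
    h = 0.0
    for i in range(n_states):
        row = sum(counts[i])
        for j in range(n_states):
            if counts[i][j]:
                p = counts[i][j] / row
                h -= (row / total) * p * math.log(p)
    return h

# One long trajectory from an assumed two-state chain.
rng = random.Random(0)
P = [[0.9, 0.1], [0.5, 0.5]]
path, s = [], 0
for _ in range(200_000):
    s = rng.choices([0, 1], weights=P[s])[0]
    path.append(s)
# The estimate is close to the true rate
# 5/6 * H(0.9, 0.1) + 1/6 * H(0.5, 0.5) ≈ 0.386 nats.
```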

###### Dynamics

- The Dynamics of Repeat Migration: A Markov Chain Analysis ," International Migration Review , Wiley Blackwell, vol. 46(2), pages 362-388, June. (repec.org)
- In particular, we establish some elementary contradistinctions between Markov chain (MC) and RDS descriptions of a stochastic dynamics. (aimsciences.org)
- Here we further suggest that the RDS description could be a more refined description of stochastic dynamics than a Markov process. (aimsciences.org)

###### Time

- While the time parameter is usually discrete, the state space of a Markov chain does not have any generally agreed-on restrictions: the term may refer to a process on an arbitrary state space. (wikipedia.org)
- 1. Discrete-time Markov chains. (worldcat.org)
- This book concerns discrete-time homogeneous Markov chains that admit an invariant probability measure. (springer.com)
- For a Markov chain with state space $S$, $T_x$ can be interpreted as the time of the first visit to $x$ (after $n = 0$). (scribd.com)
- $N_x$ can be interpreted as the last time a Markov chain visits state $x$ between times $n = 1$ and $n = \infty$ inclusive. (scribd.com)
- Let $T$ be the first time (after $n = 0$) that the chain visits state $x$ or $y$. (scribd.com)
- $N(x)$ is a discrete random variable and $E_y(N(x)) = \sum_{k=0}^{\infty} k \cdot P(N(x) = k) + \infty \cdot P(N(x) = \infty)$; at time $T_x$ the chain was located at $x$. Let $x, y \in S$ be distinct states. (scribd.com)
- Interpretations include the expected number of links that a surfer on the World Wide Web located on a random page needs to follow before reaching a desired location, as well as the expected time to mixing in a Markov chain. (nottingham.ac.uk)
- for the step size of the proposal chain, and how does the relaxation time scale? (berkeley.edu)
- with initial distribution $\pi(\cdot)$, and let $\tau$ be a stopping time for this chain. (docplayer.net)
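Expected first-visit times like the $T_x$ above satisfy the linear system $h(x) = 1 + \sum_z P(x,z)\,h(z)$ with $h(\text{target}) = 0$, which can be solved by fixed-point iteration. A sketch on a hypothetical three-state walk:

```python
def hitting_times(P, target, n_iter=500):
    """Expected number of steps to first reach `target`, via fixed-point
    iteration on h(x) = 1 + sum_z P(x, z) h(z), with h(target) = 0."""
    n = len(P)
    h = [0.0] * n
    for _ in range(n_iter):
        h = [0.0 if x == target else
             1.0 + sum(P[x][z] * h[z] for z in range(n))
             for x in range(n)]
    return h

# Walk on {0, 1, 2}: 0 always steps to 1; 1 steps to 0 or 2 with prob 1/2.
P = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.0, 1.0]]
# hitting_times(P, target=2) converges to [4, 3, 0].
```

Solving by hand confirms the fixed point: $h(1) = 1 + \tfrac12 h(0)$ and $h(0) = 1 + h(1)$ give $h(1) = 3$, $h(0) = 4$.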

###### Reversible

- we shall formalize different interpretations as different mixing times , and relations between mixing times are discussed in Chapter 4 for reversible chains and in Chapter 8 (xxx section to be written) for general chains. (berkeley.edu)
- A chain satisfying a detailed balance relation $\pi(x) P(x,y) =\pi(y) P(y,x)$ is reversible . (mathoverflow.net)
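The detailed balance relation is easy to verify numerically on a finite chain. A sketch using a hypothetical three-state birth-death chain, which is reversible by construction, together with its stationary distribution:

```python
def satisfies_detailed_balance(pi, P, tol=1e-12):
    """Check pi(x) P(x, y) == pi(y) P(y, x) for all state pairs."""
    n = len(P)
    return all(abs(pi[x] * P[x][y] - pi[y] * P[y][x]) < tol
               for x in range(n) for y in range(n))

# A 3-state birth-death chain and its stationary distribution.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]
# satisfies_detailed_balance(pi, P) -> True
```

Detailed balance is stronger than stationarity: summing the relation over $x$ recovers $\pi P = \pi$, but a stationary $\pi$ need not satisfy it.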

###### Graphs

- Markov Properties for Acyclic Directed Mixed Graphs ," Scandinavian Journal of Statistics , Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 30(1), pages 145-157. (repec.org)
- Alternative Markov Properties for Chain Graphs ," Scandinavian Journal of Statistics , Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 28(1), pages 33-85. (repec.org)

###### Distributions

- Markov Chain Monte Carlo to sample from probability distributions is a good start - https://arxiv.org/abs/1206.1901 if you are into sampling. (ycombinator.com)
- Markov chain Monte Carlo simulations allow researchers to approximate posterior distributions that cannot be directly calculated. (cdc.gov)
- We consider various scenarios where shepherding distributions can be used, including the case where several machines or CPU cores work on the same data in parallel (the so-called transition parallel application of the framework) and the case where a large data set itself can be partitioned across several machines or CPU cores and various chains work on subsets of the data (the so-called data parallel application of the framework). (rice.edu)

###### Couplings

- We propose to remove this bias by using couplings of Markov chains together with a telescopic sum argument of Glynn & Rhee (2014). (bu.edu)
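The coupling construction behind such estimators runs two copies of the chain until they meet, after which they agree forever. The sketch below only illustrates coalescence under common random numbers on an assumed two-state chain; it is not the Glynn & Rhee telescoping estimator itself.

```python
import random

def next_state(P, s, u):
    """Inverse-CDF transition: map a uniform draw u to the next state from s."""
    acc = 0.0
    for j, p in enumerate(P[s]):
        acc += p
        if u < acc:
            return j
    return len(P) - 1

def meeting_time(P, x0, y0, seed=0):
    """Advance two copies of the chain with shared uniforms until they meet."""
    rng = random.Random(seed)
    x, y, t = x0, y0, 0
    while x != y:
        u = rng.random()           # common random number drives both chains
        x, y = next_state(P, x, u), next_state(P, y, u)
        t += 1
    return t

# Assumed two-state chain; started from the two different states.
P = [[0.9, 0.1],
     [0.5, 0.5]]
t = meeting_time(P, 0, 1)
```

Here the pair coalesces whenever the shared uniform falls below 0.5 or at/above 0.9, so the meeting time is geometric and finite with probability one.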

###### Mathematically

- Can anyone please explain mathematically, why HMM should be preferred over Markov chain? (stackexchange.com)

###### Approach

- This cyclostationary Markov-chain approach captures the spring barrier in ENSO predictability and gives insight into the dependence of ENSO predictability on the climatic state. (knmi.nl)

###### Intuitions

- Testing Intuitions about Markov Chain Monte Carlo: Do I have a bug? (smellthedata.com)