• regression
  • We've done some work on the logistic regression ("3-parameter Rasch") model, and it might be helpful to see some references to other approaches. (andrewgelman.com)
  • Data analysis typically relies on a standard set of statistical models, especially Generalized Linear Models (GLMs) that form the foundations of regression and the analysis of variance. (springer.com)
  • Our method involves multiply imputing the missing items and questions by adding to existing methods of imputation designed for single surveys a hierarchical regression model that allows covariates at the individual and survey levels. (harvard.edu)
  • This model, which provides a tool for comparative politics research analogous to that which regression analysis provides in the American two-party context, can be used to explain or predict how geographic distributions of electoral results depend upon economic conditions, neighborhood ethnic compositions, campaign spending, and other features of the election campaign or aggregate areas. (harvard.edu)
  • These techniques include, among others: non-parametric regression, which refers to modeling where the structure of the relationship between variables is treated non-parametrically, but where nevertheless there may be parametric assumptions about the distribution of model residuals. (wikipedia.org)
  • They are different from statistical models (for example linear regression) whose aim is to empirically estimate the relationships between variables. (wikipedia.org)
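In the spirit of the regression entries above, here is a minimal sketch of fitting a logistic regression by gradient ascent on the log-likelihood. The data, variable names, and learning rate are illustrative assumptions, not taken from any of the cited sources.

```python
import numpy as np

# Minimal logistic regression fit by gradient ascent (illustrative sketch;
# the data and hyperparameters below are invented).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
true_beta = np.array([-0.5, 1.2])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))

beta = np.zeros(2)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n  # gradient of the mean log-likelihood

print(beta)  # should land near true_beta
```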
  • theoretical
  • My experience is that anything too data-driven in this field tends to run into trouble within political science because, while it is one thing to toss more elaborate statistical setups at the roll call data, they tend to lack the clear theoretical underpinnings of the Euclidean spatial voting model. (andrewgelman.com)
  • They provide a way of formalizing available information and making theoretical assumptions, enabling the evaluation of the assumptions by empirical evidence, and applying what is learned to make more complete model-based inferences and predictions. (springer.com)
  • The directional and proximity models offer dramatically different theories for how voters make decisions and fundamentally divergent views of the supposed microfoundations on which vast bodies of literature in theoretical rational choice and empirical political behavior have been built. (harvard.edu)
  • We demonstrate here that the empirical tests in the large and growing body of literature on this subject amount to theoretical debates about which statistical assumption is right. (harvard.edu)
  • Sornette's group has contributed significantly to the theoretical development and study of the properties of the now standard Epidemic Type Aftershock Sequence (ETAS) model. (wikipedia.org)
  • ANOVA
  • Analysis of variance (ANOVA): A mathematical process for separating the variability of a group of observations into assignable causes and setting up various significance tests. (wikipedia.org)
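To make the ANOVA decomposition concrete, the sketch below partitions total variability into between-group ("assignable") and within-group sums of squares and forms the F statistic, checking it against scipy.stats.f_oneway. The groups are synthetic.

```python
import numpy as np
from scipy import stats

# One-way ANOVA by hand: separate variability into between-group and
# within-group components, then form the F statistic.
rng = np.random.default_rng(1)
groups = [rng.normal(mu, 1.0, size=20) for mu in (0.0, 0.3, 1.0)]

all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_between, df_within = len(groups) - 1, len(all_obs) - len(groups)
F = (ss_between / df_between) / (ss_within / df_within)

print(F, stats.f_oneway(*groups).statistic)  # the two should agree
```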
  • Bayes
  • Bridge sampling is an impoverished method that only gives Bayes factors for overlapping models. (ed.ac.uk)
  • In the model, this circuit computes probabilities that considered alternatives are correct, according to Bayes' theorem. (ed.ac.uk)
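A worked instance of the Bayes'-theorem computation described above: given priors over a set of considered alternatives and the likelihood of the observed evidence under each, the posterior probability that each alternative is correct. The numbers are invented.

```python
import numpy as np

# Bayes' theorem over a set of considered alternatives: posterior
# probability that each alternative is correct given the evidence.
prior = np.array([0.5, 0.3, 0.2])        # P(alternative), illustrative
likelihood = np.array([0.1, 0.4, 0.7])   # P(evidence | alternative), illustrative
posterior = prior * likelihood
posterior /= posterior.sum()             # normalize by P(evidence)
print(posterior)
```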
  • methods
  • As non-parametric methods make fewer assumptions, their applicability is much wider than the corresponding parametric methods. (wikipedia.org)
  • Under Race to the Top and other programs advocating for better methods of evaluating teacher performance, districts have looked to value-added modeling as a supplement to observing teachers in classrooms. (wikipedia.org)
  • Louisiana legislator Frank A. Hoffmann introduced a bill to authorize the use of value-added modeling techniques in the state's public schools as a means to reward strong teachers and to identify successful pedagogical methods, as well as providing a means to provide additional professional development for those teachers identified as weaker than others. (wikipedia.org)
  • Here it is convenient to follow the terminology used by the Cochrane Collaboration, and use "meta-analysis" to refer to statistical methods of combining evidence, leaving other aspects of 'research synthesis' or 'evidence synthesis', such as combining information from qualitative studies, for the more general context of systematic reviews. (wikipedia.org)
  • Structured sparsity regularization is a class of methods, and an area of research in statistical learning theory, that extend and generalize sparsity regularization learning methods. (wikipedia.org)
  • Both sparsity and structured sparsity regularization methods seek to exploit the assumption that the output variable Y (i.e., the response, or dependent variable) to be learned can be described by a reduced number of variables in the input space X (i.e., the domain, space of features, or explanatory variables). (wikipedia.org)
  • Structured sparsity regularization methods generalize and extend sparsity regularization methods by allowing for optimal selection over structures like groups or networks of input variables in X. Common motivations for the use of structured sparsity methods are model interpretability, high-dimensional learning (where the dimensionality of X may be higher than the number of observations n), and reduction of computational complexity. (wikipedia.org)
  • Moreover, structured sparsity methods make it possible to incorporate prior assumptions on the structure of the input variables, such as overlapping groups, non-overlapping groups, and acyclic graphs. (wikipedia.org)
  • Examples of uses of structured sparsity methods include face recognition, magnetic resonance image (MRI) processing, socio-linguistic analysis in natural language processing, and analysis of genetic expression in breast cancer. (wikipedia.org)
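As a concrete instance of the structured-sparsity idea in the entries above, here is a minimal sketch of the group-lasso proximal (block soft-thresholding) step, which zeroes out whole non-overlapping groups of coefficients at once. The coefficient vector, grouping, and penalty strength are illustrative assumptions.

```python
import numpy as np

# Group-lasso proximal step: block soft-thresholding that removes
# entire groups of coefficients whose joint norm is small.
def group_soft_threshold(w, groups, lam):
    out = w.copy()
    for idx in groups:
        norm = np.linalg.norm(w[idx])
        out[idx] = 0.0 if norm <= lam else (1 - lam / norm) * w[idx]
    return out

w = np.array([0.9, -0.2, 0.05, -0.04, 1.5, 0.8])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(group_soft_threshold(w, groups, lam=0.3))  # middle group collapses to zero
```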
  • This can lead to a stack of substates, where traditional problem-solving methods, such as planning or hierarchical task decomposition, naturally arise. (wikipedia.org)
  • These models use the genetic information already obtained through methods such as phylogenetics to determine the route that evolution has taken and when evolutionary events occurred. (wikipedia.org)
  • estimates
  • Conceptually, a meta-analysis uses a statistical approach to combine the results from multiple studies in an effort to increase power (over individual studies), improve estimates of the size of the effect and/or to resolve uncertainty when reports disagree. (wikipedia.org)
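A minimal fixed-effect (inverse-variance) pooling sketch of the meta-analytic combination described above; the per-study effects and standard errors are invented. Note that the pooled standard error is smaller than any single study's, which is the power gain the entry mentions.

```python
import numpy as np

# Fixed-effect meta-analysis: combine per-study effect estimates with
# inverse-variance weights to get a pooled estimate with smaller variance.
effects = np.array([0.30, 0.12, 0.25, 0.40])  # illustrative study effects
se = np.array([0.15, 0.10, 0.20, 0.12])       # illustrative standard errors

w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(pooled, pooled_se)  # pooled SE is smaller than any single study's
```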
  • dependence
  • Mutual information is a measure of the inherent dependence expressed in the joint distribution of X and Y relative to the joint distribution of X and Y under the assumption of independence (that is, the product of their marginal distributions). (wikipedia.org)
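A small worked example of that definition: mutual information computed directly from a discrete joint distribution by comparing it to the product of its marginals (the joint under independence). The distribution is illustrative.

```python
import numpy as np

# Mutual information of a discrete joint distribution p(x, y), comparing
# the joint against the product of its marginals (the independent case).
p = np.array([[0.30, 0.10],
              [0.10, 0.50]])         # illustrative joint distribution
px = p.sum(axis=1, keepdims=True)    # marginal of X
py = p.sum(axis=0, keepdims=True)    # marginal of Y
mi = np.sum(p * np.log(p / (px * py)))
print(mi)  # 0 iff X and Y are independent
```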
  • computational
  • During this talk I will present computational models describing the decision-making process in the cortico-basal ganglia circuit. (ed.ac.uk)
  • The hierarchical form of analysis and organization helps in the understanding of multiparameter problems and also plays an important role in developing computational strategies. (wikipedia.org)
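As a concrete sketch of the kind of speed-accuracy computation these decision-making models describe, below is a generic sequential probability ratio test (SPRT): evidence accumulates until a threshold is crossed, and raising the threshold trades speed for accuracy. This is an assumption-laden toy, not an implementation of any of the circuit models mentioned.

```python
import numpy as np

# Sequential probability ratio test (SPRT): accumulate log-likelihood
# evidence until a threshold is crossed; the threshold sets the
# speed-accuracy trade-off. All parameters are illustrative.
rng = np.random.default_rng(2)
mu0, mu1, sigma = 0.0, 0.5, 1.0   # two hypotheses about the mean
threshold = 3.0                   # higher threshold -> slower, more accurate

llr, steps = 0.0, 0
while abs(llr) < threshold:
    x = rng.normal(mu1, sigma)    # samples truly come from hypothesis 1
    llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2  # Gaussian LLR step
    steps += 1
print("decide H1" if llr > 0 else "decide H0", "after", steps, "samples")
```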
  • observation
  • The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. (springer.com)
  • In a hierarchical model, observations are grouped into clusters, and the distribution of an observation is determined not only by common structure among all clusters but also by the specific structure of the cluster where this observation belongs. (wikipedia.org)
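A minimal simulation of the hierarchical structure just described: every observation shares a global mean, but also inherits its own cluster's random effect. All parameters are illustrative.

```python
import numpy as np

# Hierarchical structure: each observation's distribution depends on a
# shared global mean plus its own cluster's random effect.
rng = np.random.default_rng(3)
global_mean = 10.0
cluster_effects = rng.normal(0.0, 2.0, size=5)   # one effect per cluster

data = {c: global_mean + cluster_effects[c] + rng.normal(0.0, 1.0, size=30)
        for c in range(5)}
for c, y in data.items():
    print(c, round(y.mean(), 2))  # cluster means scatter around 10
```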
  • analysis
  • If you don't know about it, Doug Rivers has a very nice paper on identification for multidimensional item-response models (with roll call analysis as a special case). (andrewgelman.com)
  • The AHM also differs from the RSM with respect to the identification of the cognitive attributes and the logic underlying the diagnostic inferences made from the statistical analysis. (wikipedia.org)
  • Principled test design encompasses three broad stages: cognitive model development, test development, and psychometric analysis. (wikipedia.org)
  • Psychometric analysis comprises the third stage in the test design process. (wikipedia.org)
  • The approach utilizes an analysis of variance model to achieve normalization and estimate differential expression of genes across multiple conditions. (pnas.org)
  • Value-added modeling (also known as value-added analysis and value-added assessment) is a method of teacher evaluation that measures the teacher's contribution in a given year by comparing the current test scores of their students to the scores of those same students in previous school years, as well as to the scores of other students in the same grade. (wikipedia.org)
  • A meta-analysis is a statistical analysis that combines the results of multiple scientific studies. (wikipedia.org)
  • The statistical theory surrounding meta-analysis was greatly advanced by the work of Nambury S. Raju, Larry V. Hedges, Harris Cooper, Ingram Olkin, John E. Hunter, Jacob Cohen, Thomas C. Chalmers, Robert Rosenthal, Frank L. Schmidt, and Douglas G. Bonett. (wikipedia.org)
  • A meta-analysis is a statistical overview of the results from one or more systematic reviews. (wikipedia.org)
  • explicitly
  • I will explain why many existing machine learning tasks are really causal reasoning in disguise, why an increasing number of machine learning tasks will explicitly require causal models, and why researchers and practitioners who understand causal reasoning will succeed where others fail. (microsoft.com)
  • approach
  • In contrast, the AHM uses an a priori approach to identifying the attributes and specifying their interrelationships in a cognitive model. (wikipedia.org)
  • In this article, we offer an approach, built on the technique of statistical simulation, to extract the currently overlooked information from any statistical method and to interpret and present it in a reader-friendly manner. (harvard.edu)
  • This approach captures the modeling situation where variables can be selected as long as they belong at least to one group with positive coefficients. (wikipedia.org)
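A minimal sketch of the statistical-simulation approach in the harvard.edu entry above: draw many parameter vectors from the estimated sampling distribution of the coefficients, push each through the model, and report a reader-friendly quantity of interest (here a predicted probability with a 95% band). The point estimates and covariance matrix are invented.

```python
import numpy as np

# Simulation-based interpretation: draw parameter vectors from the
# estimated sampling distribution and summarize a quantity of interest
# instead of reporting raw coefficients. All numbers are illustrative.
rng = np.random.default_rng(4)
beta_hat = np.array([-1.0, 0.8])                   # intercept, slope
cov_hat = np.array([[0.04, -0.01], [-0.01, 0.02]])

draws = rng.multivariate_normal(beta_hat, cov_hat, size=10_000)
x = np.array([1.0, 1.5])                           # scenario of interest
probs = 1 / (1 + np.exp(-draws @ x))               # predicted probabilities
print(np.percentile(probs, [2.5, 50, 97.5]))       # estimate with 95% band
```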
  • classification
  • The RSM uses statistical pattern classification, in which examinees' observed response patterns are matched to pre-determined response patterns that each correspond to a particular cognitive or knowledge state. (wikipedia.org)
  • Notwithstanding these distinctions, the statistical literature now commonly applies the label "non-parametric" to test procedures that we have just termed "distribution-free", thereby losing a useful classification. (wikipedia.org)
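To illustrate the pattern-matching classification the RSM entry describes, here is a toy nearest-pattern classifier using Hamming distance; the pre-determined patterns and the knowledge states they stand for are invented.

```python
import numpy as np

# Match an observed binary response pattern to the closest
# pre-determined pattern (by Hamming distance); each pattern stands
# for a hypothetical knowledge state.
states = {
    "state A": np.array([1, 1, 0, 0, 0]),
    "state B": np.array([1, 1, 1, 1, 0]),
    "state C": np.array([1, 0, 1, 0, 1]),
}
observed = np.array([1, 1, 1, 1, 0])

best = min(states, key=lambda s: np.sum(states[s] != observed))
print(best)  # the state whose expected pattern is closest to the data
```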
  • mathematical
  • The architecture system design of the present invention allows for information gathering independent of the mathematical models used and takes into account security settings in the network hosts. (google.com)
  • In mathematical modeling, deterministic simulations contain no random variables and no degree of randomness, and consist mostly of equations, for example difference equations. (wikipedia.org)
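A tiny deterministic simulation in the sense of the entry above: a difference equation with no random variables, so every run reproduces exactly the same trajectory. The logistic-growth parameters are illustrative.

```python
# A deterministic simulation: a difference equation with no random
# variables, so repeated runs always produce the same trajectory.
r, K = 0.3, 1000.0     # growth rate and carrying capacity (illustrative)
n = 50.0               # initial population
trajectory = []
for _ in range(40):
    n = n + r * n * (1 - n / K)   # discrete logistic growth step
    trajectory.append(n)
print(round(trajectory[-1], 1))   # approaches K, identically on every run
```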
  • different
  • Unfortunately, these assumptions are also crucial since changing them leads to different conclusions about voter processes. (harvard.edu)
  • Hierarchical modeling is used when information is available on several different levels of observational units. (wikipedia.org)
  • So a random effect component, different for different clusters, is introduced into the model. (wikipedia.org)
  • There are different techniques to fit a hierarchical generalized linear model. (wikipedia.org)
  • Since its beginnings in 1983 as John Laird's thesis, it has been widely used by AI researchers to create intelligent agents and cognitive models of different aspects of human behavior. (wikipedia.org)
  • As a process, bootstrapping can be divided into different domains, according to whether it involves semantic bootstrapping, syntactic bootstrapping, prosodic bootstrapping, or pragmatic bootstrapping. (wikipedia.org)
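One common way to fit the hierarchical (random-intercept) models discussed above is as a linear mixed model; the sketch below uses statsmodels' MixedLM on simulated data, assuming statsmodels and pandas are available. Variable names are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Random-intercept model: a cluster-specific random effect on top of
# shared fixed effects, fit as a linear mixed model.
rng = np.random.default_rng(5)
n_clusters, per_cluster = 8, 25
g = np.repeat(np.arange(n_clusters), per_cluster)
u = rng.normal(0, 1.5, n_clusters)[g]          # cluster random intercepts
x = rng.normal(size=g.size)
y = 2.0 + 0.7 * x + u + rng.normal(size=g.size)

df = pd.DataFrame({"y": y, "x": x, "g": g})
fit = smf.mixedlm("y ~ x", df, groups=df["g"]).fit()
print(fit.params)   # fixed effects should land near (2.0, 0.7)
```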
  • usually
  • Researchers use statistical processes on a student's past test scores to predict the student's future test scores, on the assumption that students usually score approximately as well each year as they have in past years. (wikipedia.org)
  • Deterministic simulation models are usually designed to capture some underlying mechanism or natural process. (wikipedia.org)
  • Crucially, the precision (inverse variance) of high-order derivatives falls to zero fairly quickly, which means it is only necessary to model relatively low-order generalized motion (usually between two and eight) for any given or parameterized autocorrelation function. (wikipedia.org)
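A deliberately simplified sketch of the value-added prediction logic above: fit a past-score-to-current-score rule on a reference population, then read a class's mean residual as its "value added". All numbers are simulated, and real value-added models are far more elaborate.

```python
import numpy as np

# Value-added-style sketch: predict current scores from past scores,
# then treat a class's mean residual as its (simulated) value added.
rng = np.random.default_rng(6)
# Reference population used to fit the score-to-score prediction rule.
past_ref = rng.normal(70, 10, 2000)
curr_ref = 5 + 0.95 * past_ref + rng.normal(0, 5, 2000)
slope, intercept = np.polyfit(past_ref, curr_ref, 1)

# One class whose scores run 3 points above prediction (hypothetical effect).
past_cls = rng.normal(70, 10, 30)
curr_cls = 5 + 0.95 * past_cls + 3.0 + rng.normal(0, 5, 30)
value_added = (curr_cls - (intercept + slope * past_cls)).mean()
print(value_added)   # near the simulated 3-point effect
```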
  • make
  • Hierarchical feedback control policies, on the other hand, offer the promise of being able to handle realistically complex manufacturing systems in a tractable fashion to make their management more efficient. (wikipedia.org)
  • results
  • We show state-of-the-art results on language modelling and text compression. (ed.ac.uk)
  • In the first part I will review models describing how the speed and accuracy of decisions are controlled in the cortico-basal-ganglia circuit, and present results of a recent experiment attempting to distinguish between these models. (ed.ac.uk)
  • Social Scientists rarely take full advantage of the information available in their statistical results. (harvard.edu)
  • Deployment: Predictive model deployment provides the option to deploy the analytical results into the everyday decision-making process, producing results, reports, and output by automating decisions based on the model. (wikipedia.org)
  • Model Monitoring: Models are managed and monitored, with their performance reviewed to ensure they provide the expected results. (wikipedia.org)
  • estimate
  • Design: A set of experimental runs which allows you to fit a particular model and estimate your desired effects. (wikipedia.org)
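A worked instance of that definition: the run matrix of a 2x2 factorial design, with main effects and the interaction estimated by least squares. The responses are invented numbers.

```python
import numpy as np

# A 2x2 factorial design: the run matrix lets you estimate main effects
# and the interaction by least squares.
A = np.array([-1, 1, -1, 1])             # factor A levels
B = np.array([-1, -1, 1, 1])             # factor B levels
X = np.column_stack([np.ones(4), A, B, A * B])
y = np.array([10.0, 14.0, 11.0, 19.0])   # one observed response per run

effects, *_ = np.linalg.lstsq(X, y, rcond=None)
print(effects)  # intercept, A effect, B effect, AB interaction
```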
  • standard
  • First, accepting their entire statistical model, and correcting only an algebraic error (a mistake in how they computed their standard errors), we showed that their hypothesized relationship holds up in fewer than half the tests they reported. (harvard.edu)
  • With Taylor ED's open architecture, software users can access standard libraries of atoms to build models. (wikipedia.org)
  • statistics
  • Simulation of the system model yields the timing and resource usage statistics needed for performance evaluation, without the necessity of emulating the system. (wikipedia.org)
  • In statistics, hierarchical generalized linear models (HGLM) extend generalized linear models by relaxing the assumption that error components are independent. (wikipedia.org)
  • The group is active in the modelling of earthquakes, landslides, and other natural hazards, combining concepts and tools from statistical physics, statistics, tectonics, seismology and more. (wikipedia.org)
  • This over-simplified assumption has recently been relaxed by coupling the statistics of ETAS to genuine mechanical information. (wikipedia.org)
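A sketch of the temporal ETAS-style conditional intensity behind the entries above: a constant background rate plus Omori-law decaying contributions from past events. The parameter values and event times are illustrative, and the magnitude-dependent productivity term of the full ETAS model is omitted for brevity.

```python
import numpy as np

# ETAS-style temporal conditional intensity: background rate plus
# Omori-law decaying aftershock contributions from past events.
mu, K, c, p = 0.1, 0.05, 0.01, 1.1     # illustrative Omori parameters
past_events = np.array([1.0, 2.5, 2.6, 4.0])

def intensity(t):
    earlier = past_events[past_events < t]
    return mu + np.sum(K / (t - earlier + c) ** p)

print(intensity(5.0))
```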
  • test
  • In the second part of the talk, I will present a model assuming that the cortico-basal-ganglia circuit performs a statistically optimal test that maximizes the speed of decisions for any required accuracy. (ed.ac.uk)
  • Naively, these associations can be found by use of a simple statistical test. (microsoft.com)
  • In this way, value-added modeling attempts to isolate the teacher's contributions from factors outside the teacher's control that are known to strongly affect student test performance, including the student's general intelligence, poverty, and parental involvement. (wikipedia.org)
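A minimal example of the "simple statistical test" caveat above: x and y below are driven by a common factor z, so a Pearson correlation test finds a strong, highly significant association even though neither variable causes the other. The simulation is illustrative.

```python
import numpy as np
from scipy import stats

# A simple association test on confounded data: x and y share a common
# driver z, so the association is real but a causal reading would be wrong.
rng = np.random.default_rng(7)
z = rng.normal(size=1000)
x = z + rng.normal(scale=0.5, size=1000)
y = z + rng.normal(scale=0.5, size=1000)

r, pvalue = stats.pearsonr(x, y)
print(r, pvalue)   # strong, highly significant association
```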