Algorithms: A procedure consisting of a sequence of algebraic formulas and/or logical steps to calculate or determine a given task.

Models, Statistical: Statistical formulations or analyses which, when applied to data and found to fit the data, are then used to verify the assumptions and parameters used in the analysis. Examples of statistical models are the linear model, binomial model, polynomial model, two-parameter model, etc.

Computer Simulation: Computer-based representation of physical systems and phenomena such as chemical processes.

Data Interpretation, Statistical: Application of statistical procedures to analyze specific observed or assumed facts from a particular study.

Likelihood Functions: Functions constructed from a statistical model and a set of observed data which give the probability of that data for various values of the unknown model parameters. Those parameter values that maximize the probability are the maximum likelihood estimates of the parameters.

Monte Carlo Method: In statistics, a technique for numerically approximating the solution of a mathematical problem by studying the distribution of some random variable, often generated by a computer. The name alludes to the randomness characteristic of the games of chance played at the gambling casinos in Monte Carlo. (From Random House Unabridged Dictionary, 2d ed, 1993)

Reproducibility of Results: The statistical reproducibility of measurements (often in a clinical context), including the testing of instrumentation or techniques to obtain reproducible results. The concept includes reproducibility of physiological measurements, which may be used to develop rules to assess probability or prognosis, or response to a stimulus; reproducibility of occurrence of a condition; and reproducibility of experimental results.

Bayes Theorem: A theorem in probability theory named for Thomas Bayes (1702-1761).
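The Monte Carlo Method defined above can be illustrated with a short sketch in Python: approximating pi by sampling random points in the unit square and counting those that land inside the quarter circle. The function name, sample count, and seed are illustrative choices, not part of the definition.

```python
import random

# Monte Carlo estimate of pi: sample points uniformly in the unit square
# and count the fraction landing inside the quarter circle of radius 1.
# The area ratio (pi/4) is approximated by the hit fraction.
def estimate_pi(n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.14159
```

The slow convergence (error shrinking roughly as one over the square root of the sample count) is characteristic of Monte Carlo approximation.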
In epidemiology, it is used to obtain the probability of disease in a group of people with some characteristic on the basis of the overall rate of that disease and of the likelihood of that characteristic in healthy and diseased individuals. The most familiar application is in clinical decision analysis where it is used for estimating the probability of a particular diagnosis given the appearance of some symptoms or test result.

Least-Squares Analysis: A principle of estimation in which the estimates of a set of parameters in a statistical model are those quantities minimizing the sum of squared differences between the observed values of a dependent variable and the values predicted by the model.

Image Interpretation, Computer-Assisted: Methods developed to aid in the interpretation of ultrasound, radiographic images, etc., for diagnosis of disease.

Phantoms, Imaging: Devices or objects in various imaging techniques used to visualize or enhance visualization by simulating conditions encountered in the procedure. Phantoms are used very often in procedures employing or measuring x-irradiation or radioactive material to evaluate performance. Phantoms often have properties similar to human tissue. Water demonstrates absorbing properties similar to normal tissue, hence water-filled phantoms are used to map radiation levels. Phantoms are used also as teaching aids to simulate real conditions with x-ray or ultrasonic machines. (From Iturralde, Dictionary and Handbook of Nuclear Medicine and Clinical Imaging, 1990)

Nonlinear Dynamics: The study of systems which respond disproportionately (nonlinearly) to initial conditions or perturbing stimuli. Nonlinear systems may exhibit "chaos" which is classically characterized as sensitive dependence on initial conditions. Chaotic systems, while distinguished from more ordered periodic systems, are not random.
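The clinical use of Bayes Theorem described above can be sketched as follows. The prevalence, sensitivity, and specificity values are hypothetical, chosen only to make the arithmetic concrete.

```python
# Bayes' theorem for a diagnostic test: posterior probability of disease
# given a positive result, from prevalence, sensitivity, and specificity.
def post_prob_positive(prevalence: float, sensitivity: float,
                       specificity: float) -> float:
    true_pos = sensitivity * prevalence              # P(T+ and diseased)
    false_pos = (1.0 - specificity) * (1.0 - prevalence)  # P(T+ and healthy)
    return true_pos / (true_pos + false_pos)

# Hypothetical numbers: 1% prevalence, 90% sensitive, 95% specific.
print(round(post_prob_positive(0.01, 0.90, 0.95), 3))  # 0.154
```

Even a fairly accurate test yields a modest posterior probability when the disease is rare, which is the point the entry's clinical-decision example turns on.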
When their behavior over time is appropriately displayed (in "phase space"), constraints are evident which are described by "strange attractors". Phase space representations of chaotic systems, or strange attractors, usually reveal fractal (FRACTALS) self-similarity across time scales. Natural, including biological, systems often display nonlinear dynamics and chaos.

Models, Genetic: Theoretical representations that simulate the behavior or activity of genetic processes or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.

Models, Biological: Theoretical representations that simulate the behavior or activity of biological processes or diseases. For disease models in living animals, DISEASE MODELS, ANIMAL is available. Biological models include the use of mathematical equations, computers, and other electronic equipment.

Sample Size: The number of units (persons, animals, patients, specified circumstances, etc.) in a population to be studied. The sample size should be big enough to have a high likelihood of detecting a true difference between two groups. (From Wassertheil-Smoller, Biostatistics and Epidemiology, 1990, p95)

Signal Processing, Computer-Assisted: Computer-assisted processing of electric, ultrasonic, or electronic signals to interpret function and activity.

Probability: The study of chance processes or the relative frequency characterizing a chance process.

Models, Theoretical: Theoretical representations that simulate the behavior or activity of systems, processes, or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.

Stochastic Processes: Processes that incorporate some element of randomness, used particularly to refer to a time series of random variables.

Fourier Analysis: Analysis based on the mathematical function first formulated by Jean-Baptiste-Joseph Fourier in 1807.
The function, known as the Fourier transform, describes the sinusoidal components of any fluctuating pattern in the physical world in terms of its amplitude and its phase. It has broad applications in biomedicine, e.g., analysis of the x-ray crystallography data pivotal in identifying the double helical nature of DNA and in analysis of other molecules, including viruses, and the modified back-projection algorithm universally used in computerized tomography imaging, etc. (From Segen, The Dictionary of Modern Medicine, 1992)

Bias (Epidemiology): Any deviation of results or inferences from the truth, or processes leading to such deviation. Bias can result from several sources: one-sided or systematic variations in measurement from the true value (systematic error); flaws in study design; deviation of inferences, interpretations, or analyses based on flawed data or data collection; etc. There is no sense of prejudice or subjectivity implied in the assessment of bias under these conditions.

Image Processing, Computer-Assisted: A technique of inputting two-dimensional images into a computer and then enhancing or analyzing the imagery into a form that is more useful to the human observer.

Pattern Recognition, Automated: In INFORMATION RETRIEVAL, machine-sensing or identification of visible patterns (shapes, forms, and configurations). (Harrod's Librarians' Glossary, 7th ed)

Sensitivity and Specificity: Binary classification measures to assess test results. Sensitivity or recall rate is the proportion of true positives. Specificity is the probability of correctly determining the absence of a condition. (From Last, Dictionary of Epidemiology, 2d ed)

Software: Sequential operating programs and data which instruct the functioning of a digital computer.

Regression Analysis: Procedures for finding the mathematical function which best describes the relationship between a dependent variable and one or more independent variables.
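The Sensitivity and Specificity measures defined above reduce to simple ratios over a two-by-two table of test results. The counts below are hypothetical, for illustration only.

```python
# Sensitivity and specificity from a 2x2 confusion table:
#   sensitivity = TP / (TP + FN)  -- proportion of diseased correctly flagged
#   specificity = TN / (TN + FP)  -- proportion of healthy correctly cleared
def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

# Hypothetical counts: 90 true positives, 10 false negatives,
# 85 true negatives, 15 false positives.
print(sensitivity(90, 10), specificity(85, 15))  # 0.9 0.85
```

Note that neither quantity depends on disease prevalence; combining them with prevalence is exactly what the Bayes Theorem and Predictive Value of Tests entries describe.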
In linear regression (see LINEAR MODELS) the relationship is constrained to be a straight line and LEAST-SQUARES ANALYSIS is used to determine the best fit. In logistic regression (see LOGISTIC MODELS) the dependent variable is qualitative rather than continuously variable and LIKELIHOOD FUNCTIONS are used to find the best relationship. In multiple regression, the dependent variable is considered to depend on more than a single independent variable.

Linear Models: Statistical models in which the value of a parameter for a given value of a factor is assumed to be equal to a + bx, where a and b are constants. The models predict a linear regression.

Computational Biology: A field of biology concerned with the development of techniques for the collection and manipulation of biological data, and the use of such data to make biological discoveries or predictions. This field encompasses all computational methods and theories for solving biological problems including manipulation of models and datasets.

Time Factors: Elements of limited time intervals, contributing to particular results or situations.

Magnetic Resonance Imaging: Non-invasive method of demonstrating internal anatomy based on the principle that atomic nuclei in a strong magnetic field absorb pulses of radiofrequency energy and emit them as radiowaves which can be reconstructed into computerized images. The concept includes proton spin tomographic techniques.

Gene Expression Profiling: The determination of the pattern of genes expressed at the level of GENETIC TRANSCRIPTION, under specific circumstances or in a specific cell.

Brain: The part of CENTRAL NERVOUS SYSTEM that is contained within the skull (CRANIUM). Arising from the NEURAL TUBE, the embryonic brain is comprised of three major parts including PROSENCEPHALON (the forebrain); MESENCEPHALON (the midbrain); and RHOMBENCEPHALON (the hindbrain).
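The linear model a + bx and the least-squares principle described in the entries above combine into a closed-form fit: the slope and intercept that minimize the sum of squared residuals. The toy data below are chosen so the answer is exact.

```python
# Ordinary least squares for y = a + b*x, using the closed-form
# estimates: b = cov(x, y) / var(x), a = mean(y) - b * mean(x).
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Toy data lying exactly on y = 1 + 2x.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```

With noisy data the same formulas return the line minimizing the squared vertical distances, which is precisely the Least-Squares Analysis principle defined earlier.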
The developed brain consists of CEREBRUM; CEREBELLUM; and other structures in the BRAIN STEM.

United States

Biometry: The use of statistical and mathematical methods to analyze biological observations and phenomena.

Image Enhancement: Improvement of the quality of a picture by various techniques, including computer processing, digital filtering, echocardiographic techniques, light and ultrastructural MICROSCOPY, fluorescence spectrometry and microscopy, scintigraphy, and in vitro image processing at the molecular level.

Age Determination by Teeth: A means of identifying the age of an animal or human through tooth examination.

Biostatistics: The application of STATISTICS to biological systems and organisms involving the retrieval or collection, analysis, reduction, and interpretation of qualitative and quantitative data.

Normal Distribution: Continuous frequency distribution of infinite range. Its properties are as follows: (1) continuous, symmetrical distribution with both tails extending to infinity; (2) arithmetic mean, mode, and median identical; and (3) shape completely determined by the mean and standard deviation.

Markov Chains: A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.

Statistics as Topic: The science and art of collecting, summarizing, and analyzing data that are subject to random variation. The term is also applied to the data themselves and to the summarization of the data.

Statistical Distributions: The complete summaries of the frequencies of the values or categories of a measurement made on a group of items, a population, or other collection of data.
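The Markov Chains entry above can be sketched with a two-state simulation: the next state depends only on the current state, via a transition matrix. The states, transition probabilities, and seed here are illustrative, not from the text.

```python
import random

# Two-state Markov chain. Each step consults only the current state's row
# of the transition matrix P, never the earlier history.
P = {"A": {"A": 0.9, "B": 0.1},
     "B": {"A": 0.5, "B": 0.5}}

def fraction_in_a(steps: int, seed: int = 0) -> float:
    rng = random.Random(seed)  # fixed seed for reproducibility
    state, visits_a = "A", 0
    for _ in range(steps):
        state = "A" if rng.random() < P[state]["A"] else "B"
        visits_a += state == "A"
    return visits_a / steps

# The long-run fraction of time in A approaches the stationary
# probability, here 5/6 (solve pi = pi * P by hand).
print(fraction_in_a(100_000))
```

The "memoryless" behavior in the code (only `state` is consulted, never the path taken to reach it) is exactly the defining property in the entry.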
The distribution tells either how many or what proportion of the group was found to have each value (or each range of values) out of all the possible values that the quantitative measure can have.

Mathematics: The deductive study of shape, quantity, and dependence. (From McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)

Fetal Weight: The weight of the FETUS in utero. It is usually estimated by various formulas based on measurements made during PRENATAL ULTRASONOGRAPHY.

Methods: A series of steps taken in order to conduct research.

Uncertainty: The condition in which reasonable knowledge regarding risks, benefits, or the future is not available.

Breeding: The production of offspring by selective mating or HYBRIDIZATION, GENETIC in animals or plants.

Imaging, Three-Dimensional: The process of generating three-dimensional images by electronic, photographic, or other methods. For example, three-dimensional images can be generated by assembling multiple tomographic images with the aid of a computer, while photographic 3-D images (HOLOGRAPHY) can be made by exposing film to the interference pattern created when two laser light sources shine on an object.

Calibration: Determination, by measurement or comparison with a standard, of the correct value of each scale reading on a meter or other measuring instrument; or determination of the settings of a control device that correspond to particular values of voltage, current, frequency or other output.

Artifacts: Any visible result of a procedure which is caused by the procedure itself and not by the entity being analyzed.
Common examples include histological structures introduced by tissue processing, radiographic images of structures that are not naturally present in living tissue, and products of chemical reactions that occur during analysis.

Signal-To-Noise Ratio: The comparison of the quantity of meaningful data to the irrelevant or incorrect data.

Genetics, Population: The discipline studying genetic composition of populations and effects of factors such as GENETIC SELECTION, population size, MUTATION, migration, and GENETIC DRIFT on the frequencies of various GENOTYPES and PHENOTYPES using a variety of GENETIC TECHNIQUES.

Genetic Variation: Genotypic differences observed among individuals in a population.

Predictive Value of Tests: In screening and diagnostic tests, the probability that a person with a positive test is a true positive (i.e., has the disease), is referred to as the predictive value of a positive test; whereas, the predictive value of a negative test is the probability that the person with a negative test does not have the disease. Predictive value is related to the sensitivity and specificity of the test.

Phylogeny: The relationships of groups of organisms as reflected by their genetic makeup.

Binomial Distribution: The probability distribution associated with two mutually exclusive outcomes; used to model cumulative incidence rates and prevalence rates.
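The Binomial Distribution entry above has a closed-form probability mass function; a minimal sketch, with illustrative numbers chosen to mimic a small incidence calculation:

```python
from math import comb

# Binomial pmf: probability of exactly k successes in n independent
# trials, each with success probability p.
def binom_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Hypothetical example: probability of exactly 2 cases among 10 subjects
# when each subject's risk is 0.1.
print(round(binom_pmf(2, 10, 0.1), 4))  # 0.1937
```

With n = 1 the formula collapses to the Bernoulli distribution, the special case noted in the entry.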
The Bernoulli distribution is a special case of binomial distribution.

Numerical Analysis, Computer-Assisted: Computer-assisted study of methods for obtaining useful quantitative solutions to problems that have been expressed mathematically.

Mathematical Computing: Computer-assisted interpretation and analysis of various mathematical functions related to a particular problem.

Research Design: A plan for collecting and utilizing data so that desired information can be obtained with sufficient precision or so that an hypothesis can be tested properly.

Risk Assessment: The qualitative or quantitative estimation of the likelihood of adverse effects that may result from exposure to specified health hazards or from the absence of beneficial influences. (Last, Dictionary of Epidemiology, 1988)

Gene Frequency: The proportion of one particular ALLELE in the total of all ALLELES for one genetic locus in a breeding POPULATION.

Models, Neurological: Theoretical representations that simulate the behavior or activity of the neurological system, processes or phenomena; includes the use of mathematical equations, computers, and other electronic equipment.

Reference Values: The range or frequency distribution of a measurement in a population (of organisms, organs or things) that has not been selected for the presence of disease or abnormality.

Evolution, Molecular: The process of cumulative change at the level of DNA; RNA; and PROTEINS, over successive generations.

Artificial Intelligence: Theory and development of COMPUTER SYSTEMS which perform tasks that normally require human intelligence. Such tasks may include speech recognition, LEARNING; VISUAL PERCEPTION; MATHEMATICAL COMPUTING; reasoning, PROBLEM SOLVING, DECISION-MAKING, and translation of language.

Observer Variation: The failure by the observer to measure or identify a phenomenon accurately, which results in an error.
Sources for this may be due to the observer's missing an abnormality, or to faulty technique resulting in incorrect test measurement, or to misinterpretation of the data. Two varieties are inter-observer variation (the amount observers vary from one another when reporting on the same material) and intra-observer variation (the amount one observer varies between observations when reporting more than once on the same material).

Elasticity Imaging Techniques: Non-invasive imaging methods based on the mechanical response of an object to a vibrational or impulsive force. It is used for determining the viscoelastic properties of tissue, and thereby differentiating soft from hard inclusions in tissue such as microcalcifications, and some cancer lesions. Most techniques use ultrasound to create the images - eliciting the response with an ultrasonic radiation force and/or recording displacements of the tissue by Doppler ultrasonography.

Models, Cardiovascular: Theoretical representations that simulate the behavior or activity of the cardiovascular system, processes, or phenomena; includes the use of mathematical equations, computers and other electronic equipment.

ROC Curve: A graphic means for assessing the ability of a screening test to discriminate between healthy and diseased persons; may also be used in other studies, e.g., distinguishing responses to a faint stimulus from responses to no stimulus.

Analysis of Variance: A statistical technique that isolates and assesses the contributions of categorical independent variables to variation in the mean of a continuous dependent variable.

Radiometry: The measurement of radiation by photography, as in x-ray film and film badge, by Geiger-Mueller tube, and by SCINTILLATION COUNTING.

Population Density: Number of individuals in a population relative to space.

Kinetics: The rate dynamics in chemical or physical systems.

Sequence Analysis, DNA: A multistage process that includes cloning, physical mapping, subcloning, determination of the DNA
SEQUENCE, and information analysis.

Forensic Dentistry: The application of dental knowledge to questions of law.

Blood Volume Determination: Method for determining the circulating blood volume by introducing a known quantity of foreign substance into the blood and determining its concentration some minutes later when thorough mixing has occurred. From these two values the blood volume can be calculated by dividing the quantity of injected material by its concentration in the blood at the time of uniform mixing. Generally expressed as cubic centimeters or liters per kilogram of body weight.

Body Water: Fluids composed mainly of water found within the body.

Environmental Monitoring: The monitoring of the level of toxins, chemical pollutants, microbial contaminants, or other harmful substances in the environment (soil, air, and water), workplace, or in the bodies of people and animals present in that environment.

Radiation Dosage: The amount of radiation energy that is deposited in a unit mass of material, such as tissues of plants or animal. In RADIOTHERAPY, radiation dosage is expressed in gray units (Gy). In RADIOLOGIC HEALTH, the dosage is expressed by the product of absorbed dose (Gy) and quality factor (a function of linear energy transfer), and is called radiation dose equivalent in sievert units (Sv).

Mathematical Concepts: Numeric or quantitative entities, descriptions, properties, relationships, operations, and events.

Selection Bias: The introduction of error due to systematic differences in the characteristics between those selected and those not selected for a given study.
In sampling bias, error is the result of failure to ensure that all members of the reference population have a known chance of selection in the sample.

Pregnancy: The status during which female mammals carry their developing young (EMBRYOS or FETUSES) in utero before birth, beginning from FERTILIZATION to BIRTH.

Pharmacokinetics: Dynamic and kinetic mechanisms of exogenous chemical and DRUG LIBERATION; ABSORPTION; BIOLOGICAL TRANSPORT; TISSUE DISTRIBUTION; BIOTRANSFORMATION; elimination; and DRUG TOXICITY as a function of dosage, and rate of METABOLISM. LADMER, ADME and ADMET are abbreviations for liberation, absorption, distribution, metabolism, elimination, and toxicology.

Genetic Markers: A phenotypically recognizable genetic trait which can be used to identify a genetic locus, a linkage group, or a recombination event.

Ultrasonography: The visualization of deep structures of the body by recording the reflections or echoes of ultrasonic pulses directed into the tissues. Use of ultrasound for imaging or diagnostic purposes employs frequencies ranging from 1.6 to 10 megahertz.

Genotype: The genetic constitution of the individual, comprising the ALLELES present at each GENETIC LOCUS.

Quantitative Trait, Heritable: A characteristic showing quantitative inheritance such as SKIN PIGMENTATION in humans. (From A Dictionary of Genetics, 4th ed)

False Positive Reactions: Positive test results in subjects who do not possess the attribute for which the test is conducted. The labeling of healthy persons as diseased when screening in the detection of disease. (Last, A Dictionary of Epidemiology, 2d ed)

Transducers: Any device or element which converts an input signal into an output signal of a different form. Examples include the microphone, phonographic pickup, loudspeaker, barometer, photoelectric cell, automobile horn, doorbell, and underwater sound transducer.
(McGraw Hill Dictionary of Scientific and Technical Terms, 4th ed)

Epidemiologic Methods: Research techniques that focus on study designs and data gathering methods in human and animal populations.

Risk Factors: An aspect of personal behavior or lifestyle, environmental exposure, or inborn or inherited characteristic, which, on the basis of epidemiologic evidence, is known to be associated with a health-related condition considered important to prevent.

Confidence Intervals: A range of values for a variable of interest, e.g., a rate, constructed so that this range has a specified probability of including the true value of the variable.

Motion: Physical motion, i.e., a change in position of a body or subject as a result of an external force. It is distinguished from MOVEMENT, a process resulting from biological activity.

Indicator Dilution Techniques: Methods for assessing flow through a system by injection of a known quantity of an indicator, such as a dye, radionuclide, or chilled liquid, into the system and monitoring its concentration over time at a specific point in the system. (From Dorland, 28th ed)

Statistics, Nonparametric: A class of statistical methods applicable to a large set of probability distributions used to test for correlation, location, independence, etc. In most nonparametric statistical tests, the original scores or observations are replaced by another variable containing less information. An important class of nonparametric tests employs the ordinal properties of the data. Another class of tests uses information about whether an observation is above or below some fixed value such as the median, and a third class is based on the frequency of the occurrence of runs in the data.
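The Confidence Intervals entry above can be made concrete with the familiar normal-approximation interval for a mean, mean plus or minus 1.96 standard errors for 95% coverage. The data are hypothetical; for small samples a t critical value would replace 1.96.

```python
from math import sqrt

# 95% confidence interval for a mean, normal approximation:
#   mean +/- 1.96 * s / sqrt(n), with s the sample standard deviation.
def ci95(data):
    n = len(data)
    mean = sum(data) / n
    s = sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    half = 1.96 * s / sqrt(n)
    return mean - half, mean + half

# Hypothetical measurements.
lo, hi = ci95([4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0])
print(round(lo, 2), round(hi, 2))  # 4.86 5.14
```

The "specified probability" in the definition refers to the construction procedure: intervals built this way cover the true value in about 95% of repeated samples.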
(From McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed, p1284; Corsini, Concise Encyclopedia of Psychology, 1987, p764-5)

Oligonucleotide Array Sequence Analysis: Hybridization of a nucleic acid sample to a very large set of OLIGONUCLEOTIDE PROBES, which have been attached individually in columns and rows to a solid support, to determine a BASE SEQUENCE, or to detect variations in a gene sequence, GENE EXPRESSION, or for GENE MAPPING.

Infant, Newborn: An infant during the first month after birth.

Diagnosis, Computer-Assisted: Application of computer programs designed to assist the physician in solving a diagnostic problem.

Subtraction Technique: Combination or superimposition of two images for demonstrating differences between them (e.g., radiograph with contrast vs. one without, radionuclide images using different radionuclides, radiograph vs. radionuclide image) and in the preparation of audiovisual materials (e.g., offsetting identical images, coloring of vessels in angiograms).

Age Factors: Age as a constituent element or influence contributing to the production of a result. It may be applicable to the cause or the effect of a circumstance. It is used with human or animal concepts but should be differentiated from AGING, a physiological process, and TIME FACTORS which refers only to the passage of time.

Cattle: Domesticated bovine animals of the genus Bos, usually kept on a farm or ranch and used for the production of meat or dairy products or for heavy labor.

Neural Networks (Computer): A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer.
These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.

Polymorphism, Single Nucleotide: A single nucleotide variation in a genetic sequence that occurs at appreciable frequency in the population.

Weights and Measures: Measuring and weighing systems and processes.

Anthropometry: The technique that deals with the measurement of the size, weight, and proportions of the human or other primate body.

Age Determination by Skeleton: Establishment of the age of an individual by examination of their skeletal structure.

Prospective Studies: Observation of a population for a sufficient number of persons over a sufficient number of years to generate incidence or mortality rates subsequent to the selection of the study group.

Haplotypes: The genetic constitution of individuals with respect to one member of a pair of allelic genes, or sets of genes that are closely linked and tend to be inherited together such as those of the MAJOR HISTOCOMPATIBILITY COMPLEX.

Epidemiologic Research Design: The form and structure of analytic studies in epidemiologic and clinical research.

Ultrasonics: A subfield of acoustics dealing in the radio frequency range higher than acoustic SOUND waves (approximately above 20 kilohertz). Ultrasonic radiation is used therapeutically (DIATHERMY and ULTRASONIC THERAPY) to generate HEAT and to selectively destroy tissues. It is also used in diagnostics, for example, ULTRASONOGRAPHY; ECHOENCEPHALOGRAPHY; and ECHOCARDIOGRAPHY, to visually display echoes received from irradiated tissues.

Chromosome Mapping: Any method used for determining the location of and relative distances between genes on a chromosome.

Body Weight: The mass or quantity of heaviness of an individual.
It is expressed by units of pounds or kilograms.

Time Perception: The ability to estimate periods of time lapsed or duration of time.

Alleles: Variant forms of the same gene, occupying the same locus on homologous CHROMOSOMES, and governing the variants in production of the same gene product.

Databases, Factual: Extensive collections, reputedly complete, of facts and data garnered from material of a specialized subject area and made available for analysis and application. The collection can be automated by various contemporary methods for retrieval. The concept should be differentiated from DATABASES, BIBLIOGRAPHIC which is restricted to collections of bibliographic references.

Food Analysis: Measurement and evaluation of the components of substances to be taken as FOOD.

Reference Standards: A basis of value established for the measure of quantity, weight, extent or quality, e.g. weight standards, standard solutions, methods, techniques, and procedures used in diagnosis and therapy.

Equipment Design: Methods of creating machines and devices.

Case-Control Studies: Studies which start with the identification of persons with a disease of interest and a control (comparison, referent) group without the disease. The relationship of an attribute to the disease is examined by comparing diseased and non-diseased persons with regard to the frequency or levels of the attribute in each group.

Incidence: The number of new cases of a given disease during a given period in a specified population. It also is used for the rate at which new events occur in a defined population. It is differentiated from PREVALENCE, which refers to all cases, new or old, in the population at a given time.

Ultrasonography, Prenatal: The visualization of tissues during pregnancy through recording of the echoes of ultrasonic waves directed into the body.
The procedure may be applied with reference to the mother or the fetus and with reference to organs or the detection of maternal or fetal disease.

Evaluation Studies as Topic: Studies determining the effectiveness or value of processes, personnel, and equipment, or the material on conducting such studies. For drugs and devices, CLINICAL TRIALS AS TOPIC; DRUG EVALUATION; and DRUG EVALUATION, PRECLINICAL are available.

Inbreeding: The mating of plants or non-human animals which are closely related genetically.

Prevalence: The total number of cases of a given disease in a specified population at a designated time. It is differentiated from INCIDENCE, which refers to the number of new cases in the population at a given time.

Selection, Genetic: Differential and non-random reproduction of different genotypes, operating to alter the gene frequencies within a population.

Pedigree: The record of descent or ancestry, particularly of a particular condition or trait, indicating individual family members, their relationships, and their status with respect to the trait or condition.

Body Burden: The total amount of a chemical, metal or radioactive substance present at any time after absorption in the body of man or animal.

Computers

Size Perception: The sensory interpretation of the dimensions of objects.

Swine: Any of various animals that constitute the family Suidae and comprise stout-bodied, short-legged omnivorous mammals with thick skin, usually covered with coarse bristles, a rather long mobile snout, and small tail. Included are the genera Babyrousa, Phacochoerus (wart hogs), and Sus, the latter containing the domestic pig (see SUS SCROFA).

Electric Impedance: The resistance to the flow of either alternating or direct electrical current.

Radiographic Image Enhancement: Improvement in the quality of an x-ray image by use of an intensifying screen, tube, or filter and by optimum exposure techniques.
Digital processing methods are often employed.

Glomerular Filtration Rate: The volume of water filtered out of plasma through glomerular capillary walls into Bowman's capsules per unit of time. It is considered to be equivalent to INULIN clearance.

Population: The total number of individuals inhabiting a particular region or area.

Wavelet Analysis: Signal and data processing method that uses decomposition of wavelets to approximate, estimate, or compress signals with finite time and frequency domains. It represents a signal or data in terms of a fast decaying wavelet series from the original prototype wavelet, called the mother wavelet. This mathematical algorithm has been adopted widely in biomedical disciplines for data and signal processing in noise removal and audio/image compression (e.g., EEG and MRI).

Gestational Age: The age of the conceptus, beginning from the time of FERTILIZATION. In clinical obstetrics, the gestational age is often estimated as the time from the last day of the last MENSTRUATION which is about 2 weeks before OVULATION and fertilization.

Biological Markers: Measurable and quantifiable biological parameters (e.g., specific enzyme concentration, specific hormone concentration, specific gene phenotype distribution in a population, presence of biological substances) which serve as indices for health- and physiology-related assessments, such as disease risk, psychiatric disorders, environmental exposure and its effects, disease diagnosis, metabolic processes, substance abuse, pregnancy, cell line development, epidemiologic studies, etc.

Acceleration: An increase in the rate of speed.

Tomography, X-Ray Computed: Tomography using x-ray transmission and a computer algorithm to reconstruct the image.

Cluster Analysis: A set of statistical methods used to group variables or observations into strongly inter-related subgroups.
In epidemiology, it may be used to analyze a closely grouped series of events or cases of disease or other health-related phenomenon with well-defined distribution patterns in relation to time or place or both.
Creatinine
Colorimetry: Any technique by which an unknown color is evaluated in terms of standard colors. The technique may be visual, photoelectric, or indirect by means of spectrophotometry. It is used in chemistry and physics. (McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed)
Body Composition: The relative amounts of various components in the body, such as percentage of body fat.
Linkage Disequilibrium: Nonrandom association of linked genes. This is the tendency of the alleles of two separate but already linked loci to be found together more frequently than would be expected by chance alone.
Causality: The relating of causes to the effects they produce. Causes are termed necessary when they must always precede an effect and sufficient when they initiate or produce an effect. Any of several factors may be associated with the potential disease causation or outcome, including predisposing factors, enabling factors, precipitating factors, reinforcing factors, and risk factors.
Biomechanical Phenomena: The properties, processes, and behavior of biological systems under the action of mechanical forces.
Movement: The act, process, or result of passing from one place or position to another. It differs from LOCOMOTION in that locomotion is restricted to the passing of the whole body from one place to another, while movement encompasses both locomotion and a change in the position of the whole body or any of its parts. Movement may be used with reference to humans, vertebrate and invertebrate animals, and microorganisms. Differentiate also from MOTOR ACTIVITY, movement associated with behavior.
Cohort Studies: Studies in which subsets of a defined population are identified.
These groups may or may not be exposed to factors hypothesized to influence the probability of the occurrence of a particular disease or other outcome. Cohorts are defined populations which, as a whole, are followed in an attempt to determine distinguishing subgroup characteristics.
Systems Biology: Comprehensive, methodical analysis of complex biological systems by monitoring responses to perturbations of biological processes. Large-scale, computerized collection and analysis of the data are used to develop and test models of biological systems.
Principal Component Analysis: Mathematical procedure that transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called principal components.
Occupational Exposure: The exposure to potentially harmful chemical, physical, or biological agents that occurs as a result of one's occupation.
Radiographic Image Interpretation, Computer-Assisted: Computer systems or networks designed to provide radiographic interpretive information.
Technetium Tc 99m Pentetate: A technetium imaging agent used in renal scintigraphy, computed tomography, lung ventilation imaging, gastrointestinal scintigraphy, and many other procedures which employ radionuclide imaging agents.
Retrospective Studies: Studies used to test etiologic hypotheses in which inferences about an exposure to putative causal factors are derived from data relating to characteristics of persons under study or to events or experiences in their past. The essential feature is that some of the persons under study have the disease or outcome of interest and their characteristics are compared with those of unaffected persons.
Thermoluminescent Dosimetry: The use of a device composed of thermoluminescent material for measuring exposure to IONIZING RADIATION. The thermoluminescent material emits light when heated.
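The Principal Component Analysis entry above is easy to make concrete in the two-variable case, where the principal-component variances are simply the eigenvalues of the 2x2 covariance matrix; the data below are simulated for illustration:

```python
import math, random

def principal_components_2d(xs, ys):
    """Closed-form PCA for two variables: eigenvalues of the
    2x2 covariance matrix give the variance along each
    uncorrelated principal component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]], largest first.
    mean_v = (sxx + syy) / 2
    half_gap = math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    return mean_v + half_gap, mean_v - half_gap

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(500)]
ys = [x + random.gauss(0, 0.3) for x in xs]  # strongly correlated pair
l1, l2 = principal_components_2d(xs, ys)
# Most of the total variance is captured by the first component.
assert l1 / (l1 + l2) > 0.9
```

The variance ratio l1/(l1+l2) is the usual "proportion of variance explained" that justifies keeping fewer components than original variables.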
The amount of light emitted is proportional to the amount of ionizing radiation to which the material has been exposed.
Blood Flow Velocity: A value equal to the total volume flow divided by the cross-sectional area of the vascular bed.
Area Under Curve: A statistical means of summarizing information from a series of measurements on one individual. It is frequently used in clinical pharmacology, where the AUC from serum levels can be interpreted as the total uptake of whatever has been administered. As a plot of the concentration of a drug against time, after a single dose of medicine, producing a standard shape curve, it is a means of comparing the bioavailability of the same drug made by different companies. (From Winslade, Dictionary of Clinical Research, 1992)
Classification: The systematic arrangement of entities in any field into categories or classes based on common characteristics such as properties, morphology, subject matter, etc.
Deuterium Oxide: The isotopic compound of hydrogen of mass 2 (deuterium) with oxygen. (From Grant & Hackh's Chemical Dictionary, 5th ed) It is used to study mechanisms and rates of chemical or nuclear reactions, as well as biological processes.
Water: A clear, odorless, tasteless liquid that is essential for most animal and plant life and is an excellent solvent for many substances. The chemical formula is hydrogen oxide (H2O). (McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed)
Quantitative Trait Loci: Genetic loci associated with a QUANTITATIVE TRAIT.
Risk: The probability that an event will occur. It encompasses a variety of measures of the probability of a generally unfavorable outcome.
Cystatin C: An extracellular cystatin subtype that is abundantly expressed in bodily fluids.
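The Area Under Curve entry above is in practice computed from sampled serum levels with the trapezoidal rule; a small sketch with hypothetical concentration-time values:

```python
def auc_trapezoid(times, concentrations):
    """Area under the concentration-time curve by the
    trapezoidal rule, as used in clinical pharmacokinetics."""
    total = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, concentrations),
                                  zip(times[1:], concentrations[1:])):
        total += (t1 - t0) * (c0 + c1) / 2
    return total

# Hypothetical serum levels (mg/L) sampled over 8 hours.
t = [0, 1, 2, 4, 8]
c = [0.0, 4.0, 3.0, 1.5, 0.5]
print(auc_trapezoid(t, c))  # → 14.0  (mg·h/L)
```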
It may play a role in the inhibition of interstitial CYSTEINE PROTEASES.
Quality Control: A system for verifying and maintaining a desired level of quality in a product or process by careful planning, use of proper equipment, continued inspection, and corrective action as required. (Random House Unabridged Dictionary, 2d ed)
Forecasting: The prediction or projection of the nature of future problems or existing conditions based upon the extrapolation or interpretation of existing scientific data or by the application of scientific methodology.
Sampling Studies: Studies in which a number of subjects are selected from all subjects in a defined population. Conclusions based on sample results may be attributed only to the population sampled.
Basic Reproduction Number: The expected number of new cases of an infection caused by an infected individual, in a population consisting of susceptible contacts only.
Poisson Distribution: A distribution function used to describe the occurrence of rare events or to describe the sampling distribution of isolated counts in a continuum of time or space.
Radioisotope Dilution Technique: Method for assessing flow through a system by injection of a known quantity of radionuclide into the system and monitoring its concentration over time at a specific point in the system. (From Dorland, 28th ed)
Positron-Emission Tomography: An imaging technique using compounds labelled with short-lived positron-emitting radionuclides (such as carbon-11, nitrogen-13, oxygen-15 and fluorine-18) to measure cell metabolism. It has been useful in study of soft tissues such as CANCER; CARDIOVASCULAR SYSTEM; and brain.
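The Poisson Distribution entry above can be checked numerically from its probability mass function P(K=k) = λ^k e^(−λ)/k!; a short sketch:

```python
import math

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson distribution with rate lam,
    the rare-event count model described in the entry."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam = 2.5
probs = [poisson_pmf(k, lam) for k in range(50)]
assert abs(sum(probs) - 1.0) < 1e-12           # probabilities sum to 1
mean = sum(k * p for k, p in enumerate(probs))
assert abs(mean - lam) < 1e-9                  # the mean equals the rate
```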
SINGLE-PHOTON EMISSION-COMPUTED TOMOGRAPHY is closely related to positron emission tomography, but uses isotopes with longer half-lives and resolution is lower.
Radiation Monitoring: The observation, either continuously or at intervals, of the levels of radiation in a given area, generally for the purpose of assuring that they have not exceeded prescribed amounts or, in case of radiation already present in the area, assuring that the levels have returned to those meeting acceptable safety standards.
Proteins: Linear POLYPEPTIDES that are synthesized on RIBOSOMES and may be further modified, crosslinked, cleaved, or assembled into complex proteins with several subunits. The specific sequence of AMINO ACIDS determines the shape the polypeptide will take during PROTEIN FOLDING, and the function of the protein.
Remote Sensing Technology: Observation and acquisition of physical data from a distance by viewing and making measurements from a distance or receiving transmitted data from observations made at a distant location.
Hemoglobinometry: Measurement of hemoglobin concentration in blood.
Models, Chemical: Theoretical representations that simulate the behavior or activity of chemical processes or phenomena; includes the use of mathematical equations, computers, and other electronic equipment.
Genealogy and Heraldry
Forensic Anthropology: Scientific study of human skeletal remains with the express purpose of identification. This includes establishing individual identity, trauma analysis, facial reconstruction, photographic superimposition, determination of time interval since death, and crime-scene recovery. Forensic anthropologists do not certify cause of death but provide data to assist in determination of probable cause. This is a branch of the field of physical anthropology and qualified individuals are certified by the American Board of Forensic Anthropology. (From Am J Forensic Med Pathol 1992 Jun;13(2):146)
Neoplasms: New abnormal growth of tissue.
Malignant neoplasms show a greater degree of anaplasia and have the properties of invasion and metastasis, compared to benign neoplasms.
Radiopharmaceuticals: Compounds that are used in medicine as sources of radiation for radiotherapy and for diagnostic purposes. They have numerous uses in research and industry. (Martindale, The Extra Pharmacopoeia, 30th ed, p1161)
Sequence Alignment: The arrangement of two or more amino acid or base sequences from an organism or organisms in such a way as to align areas of the sequences sharing common properties. The degree of relatedness or homology between the sequences is predicted computationally or statistically based on weights assigned to the elements aligned between the sequences. This in turn can serve as a potential indicator of the genetic relatedness between the organisms.
Japan
Microsatellite Repeats: A variety of simple repeat sequences that are distributed throughout the GENOME. They are characterized by a short repeat unit of 2-8 basepairs that is repeated up to 100 times. They are also known as short tandem repeats (STRs).
Longitudinal Studies: Studies in which variables relating to an individual or group of individuals are assessed over a period of time.
Birth Weight: The mass or quantity of heaviness of an individual at BIRTH.
It is expressed by units of pounds or kilograms.
Feasibility Studies: Studies to determine the advantages or disadvantages, practicability, or capability of accomplishing a projected plan, study, or project.
Radar: A system using beamed and reflected radio signals to and from an object in such a way that range, bearing, and other characteristics of the object may be determined.
Brain Mapping: Imaging techniques used to colocalize sites of brain functions or physiological activity with brain structures.
Genome: The genetic complement of an organism, including all of its GENES, as represented in its DNA, or in some cases, its RNA.
Metabolic Clearance Rate: Volume of biological fluid completely cleared of drug metabolites as measured in unit time. Elimination occurs as a result of metabolic processes in the kidney, liver, saliva, sweat, intestine, heart, brain, or other site.
Models, Anatomic: Three-dimensional representation to show anatomic structures. Models may be used in place of intact animals or organisms for teaching, practice, and study.
Automatic Data Processing: Data processing largely performed by automatic means.
Environmental Exposure: The exposure to potentially harmful chemical, physical, or biological agents in the environment or to environmental factors that may include ionizing radiation, pathogenic organisms, or toxic chemicals.
Wireless Technology: Techniques using energy such as radio frequency, infrared light, laser light, visible light, or acoustic energy to transfer information without the use of wires, over both short and long distances.
Elastic Modulus: Numerical expression indicating the measure of stiffness in a material. It is defined by the ratio of stress in a unit area of substance to the resulting deformation (strain).
This allows the behavior of a material under load (such as bone) to be calculated.
Distance Perception: The act of knowing or the recognition of a distance by recollective thought, or by means of a sensory process which is under the influence of set and of prior experience.
Genetic Linkage: The co-inheritance of two or more non-allelic GENES due to their being located more or less closely on the same CHROMOSOME.
Tomography, Emission-Computed: Tomography using radioactive emissions from injected RADIONUCLIDES and computer ALGORITHMS to reconstruct an image.
Biochemical Phenomena: The chemical processes, enzymatic activities, and pathways of living things and related temporal, dimensional, qualitative, and quantitative concepts.
Sex Factors: Maleness or femaleness as a constituent element or influence contributing to the production of a result. It may be applicable to the cause or effect of a circumstance. It is used with human or animal concepts but should be differentiated from SEX CHARACTERISTICS, anatomical or physiological manifestations of sex, and from SEX DISTRIBUTION, the number of males and females in given circumstances.
Geography: The science dealing with the earth and its life, especially the description of land, sea, and air and the distribution of plant and animal life, including humanity and human industries with reference to the mutual relations of these elements. (From Webster, 3d ed)
Bankruptcy: The state of legal insolvency with assets taken over by judicial process so that they may be distributed among creditors.
Kazakhstan
... the least-squares method may be used to fit a generalized linear model. The least-squares method is usually credited to Carl ... and define a method of estimation that minimizes the error of estimation. Gauss showed that the arithmetic mean is indeed the best ... In regression analysis the researcher specifies an empirical model. For example, a very common model is the straight line model ... He then turned the problem around by asking what form the density should have and what method of estimation should be used to ...
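The excerpt's straight-line regression model has a closed-form least-squares solution; a minimal sketch (the data are invented, and noiseless so the fit is exact):

```python
def fit_line(xs, ys):
    """Least-squares estimates for the straight-line model
    y = a + b*x: minimize the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Noiseless data generated from y = 1 + 2x is recovered exactly.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0 + 2.0 * x for x in xs]
a, b = fit_line(xs, ys)
print(round(a, 6), round(b, 6))  # → 1.0 2.0
```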
Generalized Method of Moments. Maximum likelihood estimation. Simultaneous equation systems, large econometric models. ARIMA ( ... SAS has routines for automated State Space estimation. RATS can be programmed to estimate State Space models, or regression ... Similarly, SAS has an entire routine for estimating and forecasting with Unobserved Components Models. In RATS, estimation of ... All these methods can be used in order to forecast, as well as to conduct data analysis. In addition, RATS can handle cross- ...
... models and estimation procedures. American Journal of Human Genetics 19:233-257. Edwards, A. W. F. 1969. Statistical methods in ... He is best known for his pioneering work, with L.L. Cavalli-Sforza, on quantitative methods of phylogenetic analysis, and for ... The origin and early development of the method of minimum evolution for the reconstruction of phylogenetic trees. Systematic ...
Cross-validation is a statistical method for validating a predictive model. Subsets of the data are held out for use as ... Good, P. (2006) Resampling Methods. 3rd Ed. Birkhauser. Wolter, K.M. (2007). Introduction to Variance Estimation. 2nd Edition. ... Quenouille invented this method with the intention of reducing the bias of the sample estimate. Tukey extended this method by ... Verbyla, D.; Litvaitis, J. (1989). "Resampling methods for evaluating classification accuracy of wildlife habitat models". ...
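The excerpt credits Quenouille with inventing the jackknife to reduce bias. A sketch of that leave-one-out recombination rule; applied to the biased plug-in variance it reproduces the unbiased sample variance exactly, which is the classic textbook check:

```python
import statistics

def jackknife(estimator, data):
    """Quenouille's jackknife: recombine leave-one-out
    estimates to reduce the bias of a plug-in estimator."""
    n = len(data)
    theta_full = estimator(data)
    loo = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    return n * theta_full - (n - 1) * sum(loo) / n

def plug_in_variance(data):
    m = sum(data) / len(data)
    return sum((x - m) ** 2 for x in data) / len(data)  # biased: divides by n

data = [2.1, 3.4, 1.9, 5.0, 4.2, 2.8]
corrected = jackknife(plug_in_variance, data)
# The jackknife removes the O(1/n) bias, here exactly
# reproducing the unbiased (n-1) sample variance.
assert abs(corrected - statistics.variance(data)) < 1e-9
```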
Dargahi-Noubary, G. R. (1989). "On tail estimation: An improved method". Mathematical Geology. 21 (8): 829-842. doi:10.1007/ ... Modeling Distributions and Lorenz Curves. New York: Springer. ISBN 9780387727967. Arnold, B. C.; Laguna, L. (1977). On ... It is often used to model the tails of another distribution. It is specified by three parameters: location μ ... Davison, A. C. (1984-09-30). "Modelling Excesses over High Thresholds, with an Application". In de Oliveira, J. Tiago. ...
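The three-parameter generalized Pareto distribution mentioned in the excerpt is easy to sample by inverting its CDF; a sketch with arbitrary scale and shape values (location 0), checked against the known theoretical mean:

```python
import random

def sample_gpd(sigma, xi, mu=0.0):
    """Draw from a generalized Pareto distribution (location mu,
    scale sigma, shape xi != 0) by inverting its CDF
    F(x) = 1 - (1 + xi*(x - mu)/sigma)**(-1/xi)."""
    u = random.random()
    return mu + sigma * ((1 - u) ** (-xi) - 1) / xi

random.seed(1)
sigma, xi = 1.0, 0.2
draws = [sample_gpd(sigma, xi) for _ in range(50000)]
mean = sum(draws) / len(draws)
# For xi < 1 the theoretical mean is sigma / (1 - xi) = 1.25.
assert abs(mean - 1.25) < 0.05
```

The heavier the tail (larger xi), the slower this Monte Carlo mean converges, which is exactly why tail estimation is delicate.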
Dubnov S, Assayag G, Lartillot O, Bejerano G (2003). "Using machine-learning methods for musical style modeling". Computer. 36 ... Galves A, Garivier A, Gassiat E (2012). "Joint estimation of intersecting context tree models". Scandinavian Journal of ... "variable order Markov models" (VOM), "probabilistic suffix trees" and "context tree models". The name "stochastic chains with ... These models were introduced in the information theory literature by Jorma Rissanen in 1983, as a universal tool to data ...
... estimation methods and scale effects.". In Jakeman, A.J.; Beck, M.B.; McAleer, M. Modelling Change in Environmental Systems. ... Methods of computing this index differ primarily in the way the upslope contributing area is calculated. The topographic ... The topographic wetness index (TWI) was developed by Beven and Kirkby within the runoff model TOPMODEL. The topographic wetness ... Beven, K.J.; Kirkby, M. J. (1979). "A physically based, variable contributing area model of basin hydrology". Hydrological ...
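The topographic wetness index described above is the single formula TWI = ln(a / tan β); a tiny sketch with hypothetical terrain values:

```python
import math

def topographic_wetness_index(spec_catchment_area, slope_rad):
    """TWI = ln(a / tan(beta)), with a the upslope contributing
    area per unit contour length and beta the local slope."""
    return math.log(spec_catchment_area / math.tan(slope_rad))

# Hypothetical cell: 500 m2/m drains through it on a 5-degree slope.
twi = topographic_wetness_index(500.0, math.radians(5.0))
# Flat, high-accumulation cells score higher (wetter).
assert twi > topographic_wetness_index(500.0, math.radians(10.0))
```

Differences between TWI implementations come almost entirely from how the contributing area `a` is routed over the grid, as the excerpt notes, not from this formula.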
ISBN 978-1-60086-722-4. Singla, Puneet; Junkins, John L. (2009). Multi-resolution Methods for Modeling and Control of Dynamical ... ISBN 978-1-58488-769-0. Crassidis, John L.; Junkins, John L. (2011). Optimal Estimation of Dynamic Systems, Second Edition. New ... An Introduction to Optimal Estimation of Dynamical Systems. Leyden, The Netherlands: Sijthoff-Noordhoff. ISBN 90-286-0067-1. ...
Deterministic methods in stochastic optimal control (1st ed.). Davis, Mark (1993). Markov models and optimization (1st ed.). ... Davis, Mark (1977). Linear Estimation and Stochastic Control (1st ed.). Davis, Mark; Vinter, Richard B. (1985). Stochastic ... in particular in credit risk models, pricing in incomplete markets and stochastic volatility. Davis obtained his PhD in 1971 at ... leading a front-office group providing pricing models and risk analysis for fixed income, equity and credit-related products. ...
"Least Absolute Deviation Estimation in Structural Equation Modeling". Sociological Methods & Research. 36 (2): 227-265. doi: ... The algorithms for IRLS, Wesolowsky's Method, and Li's Method can be found in Appendix A of among other methods. Checking all ... A Simplex method is a method for solving a problem in linear programming. The most popular algorithm is the Barrodale-Roberts ... The following is an enumeration of some least absolute deviations solving methods. Simplex-based methods (such as the Barrodale ...
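Among the solvers the excerpt lists, IRLS is the easiest to sketch: repeatedly refit a weighted least-squares line with weights 1/|residual| until it settles near the L1 solution. A toy version for a straight line, with an invented gross outlier to show the robustness:

```python
def weighted_line_fit(xs, ys, ws):
    """Closed-form weighted least squares for y = a + b*x."""
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    sxx = sum(w * (x - mx) ** 2 for w, x in zip(ws, xs))
    sxy = sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def lad_line_fit(xs, ys, iters=50, eps=1e-8):
    """Least absolute deviations by IRLS: reweight each point
    by 1/|residual| (floored at eps) and refit."""
    ws = [1.0] * len(xs)
    a = b = 0.0
    for _ in range(iters):
        a, b = weighted_line_fit(xs, ys, ws)
        ws = [1.0 / max(abs(y - a - b * x), eps) for x, y in zip(xs, ys)]
    return a, b

xs = [float(i) for i in range(10)]
ys = [2.0 * x for x in xs]
ys[5] = 100.0  # gross outlier
a, b = lad_line_fit(xs, ys)
# The L1 fit essentially ignores the outlier; OLS would not.
assert abs(b - 2.0) < 0.01 and abs(a) < 0.05
```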
... the estimation methods for model (2) cannot be used in this case and alternative estimation methods for model (3) are available ... Various estimation methods can be applied to model (6). Adding multiple functional covariates, model (6) can also be extended ... Various estimation methods for model (4) are available. When X and Y are concurrently ... Functional linear models (FLMs) are an extension of linear models (LMs). A linear model with scalar response Y ∈ R ...
Ecological Modelling, 38: 277-298. Worton BJ. (1989) Kernel methods for estimating the utilization distribution in home-range ... 1986) Density estimation for statistics and data analysis. London: Chapman and Hall. 176 p. Worton BJ. (1987). A review of ... The LoCoH method has a number of advantages over parametric kernel methods. In particular: As more data are added, the ... In this sense, LoCoH methods are a generalization of the home range estimator method based on constructing the minimum convex ...
"ARCH MODELS: PROPERTIES, ESTIMATION AND TESTING". Journal of Economic Surveys. 7: 305-366. doi:10.1111/j.1467-6419.1993.tb00170 ... Theory and Method. 19: 3619-3644. Bera, Anil K.; Byron, R.P. (1990). "Linearised Estimation of Nonlinear Simultaneous Equation ... Bera, Anil K. and Higgins, M.L. (1993). 'ARCH Models: Properties, Estimation and Testing'. Journal of Economic Surveys, 7, pp. ... Bera, Anil K. and Higgins, M.L. (1994). 'ARCH Models: Properties, Estimation and Testing'. Survey in Econometrics, pp. 215-272. ...
Population Estimation Methods (Editor and contributor), Statistics Canada, Ottawa. Coale, A. J. and P. Demeny. 1966. Regional ... A Three Parameter Model for Birth Projection, Population Studies, vol. XXVII (3) 1973. Romaniuk, A. 1968. Infertility in ... During his tenure at the Canada Federal Bureau of Statistics, he endeavoured to develop methods for population estimation. His ... Romaniuk A. 1975. (Jointly with Gnanasekaran ...
Tarantola, Albert (2005). Inverse Problem Theory and Methods for Model Parameter Estimation. SIAM. ISBN 978-0-89871-572-9. ... that during the years 1985-2000 developed modern methods for the interpretation of waveform data. Apart from doing scientific ...
ISBN 0-89871-572-5. Tarantola, Albert (1987). Inverse Problem Theory and Methods for Model Parameter Estimation. Springer- ... Data assimilation methods allow the models to incorporate later observations to improve the initial conditions. Data ... Fluid dynamic models are governed by a set of partial differential equations. For these equations to make good predictions, ... Geophysical inverse theory is concerned with analyzing geophysical data to get model parameters. It is concerned with the ...
Norman, K. L. (1973). A method of maximum likelihood estimation for information integration models. (CHIP No. 35). La Jolla, ... Adding models: R = reaction/overt behavior; F, G = contributing factors; R1 = F1 + G1 (Condition 1); R2 = F2 + G2 (Condition 2) ... Example: If F1 is negative, then G1 must be greater than G2. Two advantages of adding models: (1) Participants are not required ... "Cognitive algebra" refers to the class of functions that are used to model the integration process. They may be adding, ...
... Theory and Methods for Model Parameter Estimation (PDF). SIAM. doi:10.1137/1.9780898717921.fm - via epubs.siam. ... The new method learns from its own previously generated realizations of the shale and produces models that match the existing ... Model parameters → Data The transformation from data to model parameters (or vice versa) is a result of the interaction of a ... of data to the model parameters. The objective of an inverse problem is to find the best model parameters m ...
The AF method was correct 45.91% of the time with a tendency toward under-estimation. Both the OC and AF methods generated ... Choose the best model from a series of models that differ in complexity. Researchers use goodness-of-fit measures to fit models ... Both of these methods have out-performed the K1 method in simulation. In the Ruscio and Roche study (2012), the OC method was ... Used to test the null hypothesis that a model has perfect model fit. It should be applied to models with an increasing number ...
Estimation of the model yields results that can be used to predict this employment probability for each individual. In the ... He suggests a two-stage estimation method to correct the bias. The correction uses a control function idea and is easy to ... ISBN 0-12-398750-4. Newey, Whitney; Powell, J.; Walker, James R. (1990). "Semiparametric Estimation of Selection Models: Some ... ISBN 3-11-014936-2. Cameron, A. Colin; Trivedi, Pravin K. (2005). "Sample Selection Models". Microeconometrics: Methods and ...
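The two-stage correction described in the excerpt (Heckman's) adds a control function, the inverse Mills ratio evaluated at the first-stage probit index, as an extra regressor in the second stage. A sketch of that ingredient alone (the full procedure also needs a probit first stage, omitted here):

```python
import math

def inverse_mills_ratio(z):
    """lambda(z) = phi(z) / Phi(z): the control-function term
    added as an extra regressor in the second-stage equation."""
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return pdf / cdf

# At z = 0 the ratio is phi(0)/Phi(0) = 0.3989.../0.5.
print(round(inverse_mills_ratio(0.0), 4))  # → 0.7979
# It shrinks as selection becomes less binding (large index z).
assert inverse_mills_ratio(3.0) < inverse_mills_ratio(0.0)
```

A significant coefficient on this added regressor is the usual evidence that sample selection would have biased the uncorrected regression.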
Mein, R. G., E. M. Laurenson and T. A. McMahon (1974). Simple nonlinear model for flood estimation. Journal of the Hydraulics ... Cunge J. A (1969). On the subject of a flood propagation computational method (Muskingum method). Journal of Hydraulic Research ... The hydraulic models (e.g. dynamic and diffusion wave models) require the gathering of a lot of data related to river geometry ... Barati R (2011). Parameter estimation of nonlinear Muskingum models using Nelder-Mead Simplex algorithm. Journal of Hydrologic ...
... sample decomposition approach to multivariate estimation methods." Applied Stochastic Models and Data Analysis 2.3: 99-119. ... "OLS estimation in a model where a microvariable is explained by aggregates and contemporaneous disturbances are equicorrelated ... His research interests centered on econometric methods and their applications, especially nonparametric and robust methods in ... Heij, C., De Boer, P., Franses, P. H., Kloek, T., & Van Dijk, H. K.. Econometric methods with applications in business and ...
VanderWeele, T.J. (2009). "Marginal structural models for the estimation of direct and indirect effects". Epidemiology. 20 (1 ... Psychol Methods. 7 (1): 83-104. doi:10.1037/1082-989x.7.1.83. PMID 11928892. "Testing of Mediation Models in SPSS and SAS". ... 1) Confounding: Another model that is often tested is one in which competing variables in the model are alternative potential ... "SPSS and SAS procedures for estimating indirect effects in simple mediation models". Behavior Research Methods, Instruments, ...
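The indirect effect these mediation excerpts discuss is commonly estimated as the product of the two path coefficients, a·b. A deterministic toy sketch; note that in a real analysis b is the coefficient of the mediator controlling for the exposure, which coincides with the simple slope here only because the toy chain has no direct path:

```python
def ols_slope(xs, ys):
    """Simple-regression slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Deterministic toy chain X -> M -> Y with M = 2X and Y = 3M.
xs = [0.0, 1.0, 2.0, 3.0]
ms = [2.0 * x for x in xs]
ys = [3.0 * m for m in ms]
a = ols_slope(xs, ms)   # path X -> M
b = ols_slope(ms, ys)   # path M -> Y
print(a * b)            # indirect effect a*b → 6.0
```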
The estimation method includes both calibration and objective video quality estimations. There are four major areas where this ... The model predicts the video quality as it is perceived by subjects (viewers). The prediction model uses psycho-visual and ... VQEG then proposed the VQuad-HD model to ITU-T to form a video quality model standard - known as ITU-T J.341 - which was ... measurement method can be used. The full reference measurement method can be used when the unimpaired reference video signal is ...
This can be done with simultaneous-equation methods of estimation in econometrics. Such methods allow solving for the model- ... are needed to perform such an estimation. An alternative to "structural estimation" is reduced-form estimation, which regresses ... The model of supply and demand also applies to various specialty markets. The model is commonly applied to wages, in the market ... The supply-and-demand model is a partial equilibrium model of economic equilibrium, where the clearance on the market of some ...
... of the scanner physics can be incorporated into the likelihood model than those used by analytical reconstruction methods, ... May 2008). "Estimation of the radiation dose from CT in cystic fibrosis". Chest. 133 (5): 1289-91, author reply 1290-1. doi: ... or via Bayes penalty methods[43][44] or via I.J. Good's roughness method[45][46] may yield superior performance to expectation- ... Alternative methods of scanning include x-ray computed tomography (CT), magnetic resonance imaging (MRI) and functional ...
The simulation estimation methods we discuss here make it possible to estimate LDV models that are computationally intractable ... using classical estimation methods. We first review the ways in which LDV models arise, describing the differences and ... Finally, estimation methods that rely on simulation are described. We review three general approaches that combine estimation ... Following the LDV models, we describe various simulation methods for evaluating such integrals. Naturally, censoring and ...
Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting… ... Generalized Linear Models. Maximum Likelihood Estimation. Panel Data. Model Estimation Using Simulation. Bibliography. Index. ...
We also develop a consistent information criterion for the determination of the number of factors to be included in the model. ... In this paper we propose a new parametric methodology for estimating factors from large datasets based on state space models ... The estimation of dynamic factor models for large sets of variables has attracted considerable attention recently, due to the ... C51 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Model Construction and Estimation. *E52 - ...
Estimation of poverty rate for small areas by model calibration and "hybrid" calibration methods [Risto Lehtonen, University of Helsinki; Ari Veijanen, Statistics Finland]
G-estimation of accelerated failure-time models was developed to hand ... Most methods proposed to address this bias do not consider that it may be caused by time-varying confounders affected by prior ... METHODS: We compare results from Cox models and g-estimation in 38,747 autoworkers exposed to straight metalworking fluids. ... A comparison of standard methods with g-estimation of accelerated failure-time models to address the healthy-worker survivor ...
... optimization method for the problem of parameter estimation in thermodynamic models. The method is based on an adaptation of ... PARAMETER ESTIMATION PROBLEM. The method of least squares determines the model parameters by solution of an optimization ... The model is represented by the function ycalc = ycalc (q, x). The method of least squares presents the following structure ( ... utilization of the SA method can give better results when compared to Powell's method. Each run using Powell's method achieved ...
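A minimal version of the approach the excerpt describes, simulated annealing applied to a least-squares parameter estimation problem, for an invented one-parameter model y = q·x:

```python
import math, random

def sse(q, xs, ys):
    """Sum of squared errors for the model y = q * x."""
    return sum((y - q * x) ** 2 for x, y in zip(xs, ys))

def anneal(xs, ys, q0=-10.0, t0=5.0, cooling=0.995, steps=4000):
    """Minimal simulated annealing for least-squares estimation:
    random walk over q, always accept improvements, accept
    uphill moves with probability exp(-delta / T) as T cools."""
    random.seed(42)
    q, best, t = q0, q0, t0
    e = e_best = sse(q, xs, ys)
    for _ in range(steps):
        cand = q + random.gauss(0, 0.5)
        e_cand = sse(cand, xs, ys)
        if e_cand < e or random.random() < math.exp(-(e_cand - e) / t):
            q, e = cand, e_cand
            if e < e_best:
                best, e_best = q, e
        t *= cooling
    return best

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0 * x for x in xs]   # true parameter q = 3
q_hat = anneal(xs, ys)
assert abs(q_hat - 3.0) < 0.05
```

On a convex problem like this one, gradient-free local search would do as well; annealing earns its keep when the SSE surface has multiple local minima, which is the situation the excerpt's thermodynamic models present.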
Citation Information: Statistics & Decisions International mathematical journal for stochastic methods and models, Volume 28, ... Parameter estimation by contrast minimization for noisy observations of a diffusion process ... Mean and covariance matrix adaptive estimation for a weakly stationary process. Application in stochastic optimization by ...
... to reduce their magnitude and their contribution to the rating estimation as successive factors are added. When an appropriate ... Methods and apparatus for modeling relationships at multiple scales in ratings estimation ... 6. The method of claim 1, wherein iteratively pre-computing the matrix P and the matrix Q is performed off-line. 7. The method ...
Estimation methods for non-homogeneous regression models: Minimum continuous ranked probability score vs. maximum likelihood. Non-homogeneous regression models are widely used to statistically post-process numerical ensemble weather prediction models. Such regression models are capable of forecasting full probability distributions and correct for ensemble errors in the mean ...
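Minimum-CRPS estimation as described above fits the forecast distribution's parameters by minimizing the closed-form CRPS rather than maximizing likelihood; for a Gaussian forecast that closed form is short enough to sketch:

```python
import math

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a Gaussian forecast N(mu, sigma^2)
    against an observation y, the scoring rule minimized in
    non-homogeneous Gaussian regression (EMOS)."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# A perfectly centered forecast still pays for its spread:
print(round(crps_normal(0.0, 1.0, 0.0), 4))  # → 0.2337
# Sharper forecasts at the right location score better (lower).
assert crps_normal(0.0, 0.5, 0.0) < crps_normal(0.0, 1.0, 0.0)
```

Summing this score over a training set and minimizing over the regression coefficients that drive mu and sigma is exactly the "minimum CRPS vs. maximum likelihood" comparison the title refers to.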
Citation: Barley, M. H. and McFiggans, G.: The critical assessment of vapour pressure estimation methods for use in modelling the formation of atmospheric organic aerosol ... was found to be very sensitive to the variation in vapour pressure values typically present when comparing estimation methods. ... Abstract. A selection of models for estimating vapour pressures has been tested against experimental data for a set of ...
Small area estimation for spatially correlated populations - a comparison of direct and indirect model-based methods. Linear mixed models underpin many small area estimation (SAE) methods. In this paper we investigate SAE based on linear models ...
N.L. Kamiji, A. Ishihara, S. Usui; A Model-Based Ionic Conductance Estimation Method for Retinal Neurons. Invest. Ophthalmol. ... Additionally, this method would be useful to analyze detailed neural mechanisms using the tuned model for closed-loop ... Furthermore, since the time for estimation is relatively short (<1 s), this method can be accomplished on-line and in real ...
Structural correlation method for model reduction and practical estimation of patient-specific parameters illustrated on heart ... The use of these methods makes the model suitable for estimation of parameters from individuals, allowing it to be adopted for ...
Xiang, L., Yau, K. K., & Lee, A. H. (2012). The robust estimation method for a finite mixture of Poisson mixed-effect models. ... approach within the MHD estimation to extend the robust estimation method for a finite mixture of Poisson mixed-effect models. ... Parameters in such a model are often estimated through the residual maximum likelihood estimation approach. However, the method ...
... is a famous and widely used sensitivity and uncertainty analysis method. It provides a new ... Variance-based sensitivity analysis of a wind risk model - model behavior and lessons for forest modelling. Environ Model Softw ... Zhao R (1983) Watershed hydrological model - Xinanjiang model and Northern Shaanxi model. Water Resources and Electric Power ... Computer aided numerical methods for hydrological model calibration: an overview and recent development. Arch Comput Methods ...
2002) Epistasis: What it means, what it doesn't mean, and statistical methods to detect it in humans. Hum Mol Genet 11(20):2463 ... We note that the standard model given by Eq. 3 is nested in the model given by Eq. 5, which in turn is nested in the model ... In these cases, the standard model, which is nested in our more general model, provided an adequate model for heritability. ... Model 3 is known as a linear mixed model with a random effect having the covariance matrix K_causal = (1/M) X X^T ...
Epidemiological burden of postmenopausal osteoporosis in the UK from 2010 to 2021: estimations from a disease model. This article describes the adaptation of a model estimating the burden of postmenopausal osteoporosis (PMO) to the UK. The PMO disease model, first developed for Sweden, was adapted to the UK. Methods: For each year of the study, the incident cohort (women experiencing a first osteoporotic fracture) was identified ... Due to demographic changes, the burden of ...
... results from measurements and modeling in India for estimation of the global burden of disease. Balakrishnan K, Ghosh S, ... METHODS: We monitored 24-hr household concentrations of PM2.5 in 617 rural households from 4 states in India on a cross-sectional ... Such models can also serve to inform public health intervention efforts. Thus, we developed a model to estimate national ... Results from validation studies: scatter plot of modeled vs. measured kitchen-area PM2.5 concentrations obtained from ...
Standard factor models imply a linear relationship between expected returns on assets and their factor exposures. We provide ... Linear Factor Models and the Estimation of Expected Returns. Netspar Discussion Paper No. 03/2016-020. Sarisoy, Cisil and de Goeij, Peter and Werker, Bas J.M., Linear Factor Models and the Estimation of Expected Returns (March 3, ...
We test the model with a collection of unaccompanied tonal melodies to evaluate it as a feature extractor for chord estimation ... We also find that, like other existing features for chord estimation, the performance of the model can be improved by using ...
Details of their estimation are presented in Methods and SI Appendix, section 3. The regional dataset EPICA Dome C (EDC) Ice ... Given a model's finite resolution, accurately modeling local and regional variability is more difficult than modeling global ... Methods. Spectral Estimation. Because paleoclimate data are often unevenly sampled in the time domain, a common strategy for ... refs. 15 and 18-21 and Table 1) as well as improved spectral methods (ref. 26 and Methods). Notably, the latest Past Global Changes ...
... generating a plurality of motion estimation models using a value of the motion vector, comparing estimation errors of the plurality of motion estimation models with one another, selecting one motion estimation model which has a smaller estimation error according to the comparing of the estimation errors, and performing sub-pixel motion estimation using the selected motion estimation model.
The goal of this paper is to compare several widely used Bayesian model selection methods in practical model selection problems ... 2.4 Reference model approach. Section 2.2 reviewed methods for utility estimation that are based on sample reuse without any ... Assuming the true data-generating model belongs to the set of candidate models, the MAP model can be shown to be the ... The right plot shows a biased utility estimation method that either under- or overestimates the ability of most of the models.
Read chapter 11, Risk Assessment Models and Methods: This book is the seventh in a series of titles from the National Research ... Estimation via Mathematical Models for Risk. Model-based estimation provides a feasible alternative to direct estimation. ... The most common method of fitting risk models to data (i.e., estimating the unknown parameters in the model) is the method of ... Model Parameter Estimation. Models describe the mathematical form of a risk function, but the parameters in the model must be ...
... and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert ( ... Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created ... In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of ... Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection ...
  • The linear mixed model (LMM) is now routinely used to estimate heritability. (pnas.org)
  • Recent progress in GBD methodologies that use integrated-exposure-response (IER) curves for combustion particles required the development of models to quantitatively estimate average HAP levels experienced by large populations. (nih.gov)
  • Thus, we developed a model to estimate national household concentrations of PM2.5 from solid cookfuel use in India, together with estimates for 29 states. (nih.gov)
  • Extrapolation of the household results by state to all solid cookfuel-using households in India, covered by NFHS 2005, resulted in modeled estimates of 450 μg/m3 (95% CI: 318–640) and 113 μg/m3 (95% CI: 102–127) for national average 24-hr PM2.5 concentrations in the kitchen and living areas, respectively. (nih.gov)
  • The results show that the optimization of a utility estimate such as the cross-validation (CV) score is liable to finding overfitted models due to relatively high variance in the utility estimates when the data is scarce. (springer.com)
  • This paper attempts to test the effectiveness of log-linear models (LLM) to estimate the suspended (S-LLM), dissolved (D-LLM), and total suspended (T-LLM) load into a Mediterranean semiarid karst stream (the Argos River basin, in southeast Spain). (iwaponline.com)
  • This means that even for a very large data collection, the maximum likelihood estimation method does not allow ... Turing's estimate PT for a probability of a word (m-gram) which occurred in the sample r times is PT = r*/N, where r* is the Turing-modified count. We call a procedure of replacing a count r with a modified count r* "discounting", and a ratio r*/r a discount coefficient d_r. (psu.edu)
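The discounting idea above can be sketched in a few lines. This is a minimal illustration of the standard Good-Turing adjusted count r* = (r + 1) · n_{r+1} / n_r (where n_r is the number of distinct items seen exactly r times), not the IBM recognizer's implementation; the toy corpus and function name are invented:

```python
from collections import Counter

def good_turing_discounts(counts):
    """Turing-adjusted counts r* = (r + 1) * n_{r+1} / n_r and discount
    coefficients d_r = r*/r, for each observed count r with n_{r+1} > 0.

    `counts` maps each item (e.g. an m-gram) to its sample frequency r.
    Returns {r: (r_star, d_r)}.
    """
    # n_r = number of distinct items that occurred exactly r times
    freq_of_freq = Counter(counts.values())
    out = {}
    for r, n_r in freq_of_freq.items():
        n_r_plus_1 = freq_of_freq.get(r + 1, 0)
        if n_r_plus_1 > 0:
            r_star = (r + 1) * n_r_plus_1 / n_r
            out[r] = (r_star, r_star / r)
    return out

# Toy corpus: the=3, cat=2, and four words seen once -> n_1=4, n_2=1, n_3=1
sample = Counter("the cat sat on the mat the cat ran".split())
discounts = good_turing_discounts(sample)
# r=1: r* = 2 * n_2 / n_1 = 0.5, so singleton counts are discounted heavily
```

Note that mass removed from low counts (d_r < 1) is what gets redistributed to unseen m-grams in a full back-off scheme.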
  • To this end, we have recently developed a novel technique to continuously (i.e., automatically) estimate EF by model-based analysis of an aortic pressure waveform. (biomedsearch.com)
  • We estimate the model and use it to asses the relative contribution of the different factors for overall wage inequality. (nber.org)
  • A classification modeling is used effectively to estimate the probabilities of attributes in predicting the yield, which in turn influences biofuel production and its entire supply chain. (alliedacademies.org)
  • Thresholding methods to estimate copula density. (degruyter.com)
  • Critical needs to estimate spread rates include the availability of surveys to characterize the spatial distribution of an invading species and the application of analytical methods to. (usda.gov)
  • The Generalized Dynamic Factor Model: One-Sided Estimation and Forecasting," Computing in Economics and Finance 2003 143, Society for Computational Economics. (repec.org)
  • Simulated annealing (SA) is a global stochastic optimization method that originated in the computational reproduction of the thermal process of annealing, where a material is heated and cooled slowly in order to reach a minimum energy state. (scielo.br)
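The annealing analogy above (accept downhill moves always, uphill moves with probability exp(-Δ/T), and cool T slowly) can be sketched as follows; the test function, cooling schedule, and step size are arbitrary illustrative choices, not taken from the cited work:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Minimise a 1-D function f by simulated annealing.

    Downhill candidate moves are always accepted; uphill moves are
    accepted with probability exp(-delta / T); the temperature T is
    lowered geometrically, mimicking slow thermal annealing.
    """
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # cooling schedule
    return best_x, best_f

# A wiggly function with many local minima; global minimum near x = 1.73.
bumpy = lambda x: (x - 2.0) ** 2 + 0.3 * math.sin(10 * x)
x_min, f_min = simulated_annealing(bumpy, x0=-5.0)
```

The early high-temperature phase lets the search escape the local minima that a pure greedy descent would get trapped in.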
  • Thus this novel technique, which utilizes a simple experimental protocol in conjunction with computational models, could give additional insights into the mechanism and role of conductance changes induced by neuromodulators and other mechanisms. (arvojournals.org)
  • In this research, we focused on the computational efficiency issue of the GLUE method. (springerprofessional.de)
  • Application of the parallel GLUE for the Xinanjiang hydrological model parameter sensitivity analysis proved its much better computational efficiency than the traditional serial computing technology, and the correctness was also verified. (springerprofessional.de)
  • As a first step, virtually all modeling procedures in use in computational mechanics involve a process of discretization in which continuum models of nature are transformed into discrete or digital forms manageable by digital computers. (nap.edu)
  • It is clear that in this necessary first step of the computational process, the discretization, an error is always made, because a discrete model cannot possibly capture all of the information embodied in continuum models of gaseous, fluid, or solid materials. (nap.edu)
  • We first review the ways in which LDV models arise, describing the differences and similarities in censored and truncated data generating processes. (repec.org)
  • 4. The method of claim further comprising determining an optimal number of factors f so as to capture the most prominent features of the ratings data while leaving out relatively insignificant features. (patentgenius.com)
  • 7. The method of claim 6, wherein iteratively pre-computing the matrix P and the matrix Q is performed upon receipt of new rating data. (patentgenius.com)
  • A selection of models for estimating vapour pressures has been tested against experimental data for a set of compounds selected for their particular relevance to the formation of atmospheric aerosol by gas-liquid partitioning. (atmos-chem-phys.net)
  • This method is illustrated in detail on a model predicting baroreflex regulation of heart rate and applied to analysis of data from a rat and healthy humans. (ruc.dk)
  • Across the simulated and Ugandan data, narrow-sense heritability estimates were lower using the more general model. (pnas.org)
  • When analyzing clustered count data derived from several latent subpopulations, the finite mixture of the Poisson mixed-effect model is an immediate strategy to model the underlying heterogeneity. (ntu.edu.sg)
  • We used the model-based strategy to identify genes suited to normalize quantitative RT-PCR data from colon cancer and bladder cancer. (aacrjournals.org)
  • The increase in the number of large data sets and the complexity of current probabilistic sequence evolution models necessitates fast and reliable phylogeny reconstruction methods. (psu.edu)
  • Using the latest simulations and data syntheses, we find agreement for spectra derived from observations and models on timescales ranging from interannual to multimillennial. (pnas.org)
  • To demonstrate the effectiveness of the proposed method, we analyzed trajectory data of worms, fruit flies, rats, and bats in the laboratories, and penguins and flying seabirds in the wild, which were recorded with various methods and span a wide range of spatiotemporal scales-from mm to 1,000 km in space and from sub-seconds to days in time. (frontiersin.org)
  • Thus, our method provides a versatile and unbiased way to extract behavioral features from simple trajectory data to understand brain function. (frontiersin.org)
  • Evaluation of the association between exposure and disease occurrence is aided by the use of statistical models, and the types of models commonly used in radiation epidemiology are described below, as are the methods for fitting the models to data. (nap.edu)
  • The method, called the COPY method due to the nature in which the data set is manipulated, simply approximates the maximum likelihood estimation by making the estimated probabilities slightly less than one at the offending points. (cdc.gov)
  • 2003), the data were only manipulated when the log binomial model failed to converge. (cdc.gov)
  • If Blizzard and Hosmer can get the COPY method to converge in Stata, their diagnostics can be used for maximum likelihood based estimation on any data set without having to switch from Stata to SAS. (cdc.gov)
  • For models built from high-dimensional data, e.g. arising from microarray technology, often survival time is the response of interest. (biomedcentral.com)
  • In our framework, the original nonnegative multiplicative update equations of NMF appear as an expectation-maximisation (EM) algorithm for maximum likelihood estimation of a conditionally Poisson model via data augmentation . (hindawi.com)
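For concreteness, the classical multiplicative updates for NMF under the generalised Kullback-Leibler divergence (the conditionally Poisson likelihood case referred to above) can be sketched as follows; the initialisation, iteration count, and toy matrix are arbitrary, and this is the textbook update, not the paper's augmented model:

```python
import numpy as np

def nmf_kl(V, rank, iters=500, seed=0):
    """Multiplicative updates for NMF under the generalised KL divergence.

    Minimising D(V || WH) is equivalent to maximum likelihood for
    V[i, j] ~ Poisson((W @ H)[i, j]); each multiplicative update can be
    read as one EM step for that model.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    for _ in range(iters):
        WH = W @ H
        H *= (W.T @ (V / WH)) / W.sum(axis=0)[:, None]
        WH = W @ H
        W *= ((V / WH) @ H.T) / H.sum(axis=1)[None, :]
    return W, H

# An exactly rank-2 nonnegative matrix: the factorisation should be near-exact.
rng = np.random.default_rng(1)
V = rng.random((8, 2)) @ rng.random((2, 6))
W, H = nmf_kl(V, rank=2)
err = np.abs(V - W @ H).max()
```

Because the updates are purely multiplicative, W and H stay nonnegative throughout, which is what makes the EM interpretation possible.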
  • In this case study we demonstrate that if mass transfer is eliminated, a pseudo zero-order model can be fitted to the experimental data with a high degree of correlation. (intechopen.com)
  • Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. (biomedsearch.com)
  • CONCLUSIONS: Our study shows the agreement between the NTCP parameter modeling based on SEF and QoL data, which gave a NPV of 100% with each dataset, and the QUANTEC guidelines, thus validating the cut-off values of 20 and 25 Gy. (biomedsearch.com)
  • While the method has been developed for and successfully implemented in the IBM Real Time Speech Recognizers, its generality makes it applicable in other areas where the problem of estimating probabilities from sparse data arises. (psu.edu)
  • Uncertainty estimates and sensitivity analysis are used to investigate how the frequency at which data is sampled affects the estimation process and how the accuracy and uncertainty of estimates improves as data is collected over the course of an outbreak. (aimsciences.org)
  • We assess the informativeness of individual data points in a given time series to determine when more frequent sampling (if possible) would prove to be most beneficial to the estimation process. (aimsciences.org)
  • We present DESeq2, a method for differential analysis of count data, using shrinkage estimation for dispersions and fold changes to improve stability and interpretability of estimates. (nih.gov)
  • When time series of mortality data are available, the model can be used to describe and analyse levels and dynamics of morbidity. (nih.gov)
  • The technique is illustrated using the basic stochastic growth model, considering quarterly data on the Venezuelan economy from the first quarter of 1984 through the third quarter of 2004. (csuca.org)
  • This volume presents an eclectic mix of original research articles in areas covering the analysis of ordered data, stochastic modeling and biostatistics. (springer.com)
  • The volume is intended for all researchers with an interest in order statistics, distribution theory, analysis of censored data, stochastic modeling, time series analysis, and statistical methods for the health sciences, including statistical genetics. (springer.com)
  • A method and device in a communication system including a receiver having a plurality of receiving antennas for receiving a plurality of information bursts transmitted by at least one transmitting user device where the information bursts contain a number of data symbols and a pilot symbol sequence of. (google.com)
  • Its application to a model based on Bangladesh data is discussed. (umn.edu)
  • Spatial statistics is a rapidly developing field involving the quantitative analysis of such spatial data and spatio-temporal data, and the statistical modelling of related variability and uncertainty. (elsevier.com)
  • This module is concerned with building and applying statistical models for data. (ncl.ac.uk)
  • The module provides a comprehensive introduction to the issues involved in using statistics to model real data, and to draw relevant conclusions. (ncl.ac.uk)
  • The proposed apparatus, systems and methods use one OFDM preamble, thereby avoiding the need to buffer the whole data packet before data demodulation and enabling online and/or real-time operation. (freepatentsonline.com)
  • We illustrate the advantages of our method through several well-designed real data-based analytical experiments. (biomedcentral.com)
  • A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. (routledge.com)
  • Analysis of Mixed Data: Methods & Applications traces important developments, collates basic results, presents terminology and methodologies, and gives an overview of statistical research applications. (routledge.com)
  • This book has a wide coverage on recent methodological developments for mixed data analysis involving GLMM, copula models, Bayesian methods, and latent variable modeling as well as on the applications of mixed data analysis in various areas, including biology, epidemiology, econometrics, health policy, and social science. (routledge.com)
  • It gives an excellent overview of mixed data analysis both in terms of methods and applications. (routledge.com)
  • The estimation of spatial autocorrelation in spatially- and temporally-referenced data is fundamental to understanding an organism's population biology. (usda.gov)
  • I used four sets of census field data, and developed an idealized space-time dynamic system, to study the behavior of spatial autocorrelation estimates when a practical method of sampling is employed. (usda.gov)
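The cited study's own estimators are not reproduced here, but the most common estimate of spatial autocorrelation, Moran's I, can be sketched as a minimal example; the transect values and contiguity weights below are invented for illustration:

```python
import numpy as np

def morans_i(x, w):
    """Moran's I, a standard estimate of spatial autocorrelation.

    x: values at n locations; w: n x n spatial weight matrix (w[i, i] = 0).
    I = (n / sum(w)) * sum_ij w_ij (x_i - mean)(x_j - mean)
                       / sum_i (x_i - mean)^2
    """
    x = np.asarray(x, dtype=float)
    z = x - x.mean()          # deviations from the mean
    n = x.size
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# 1-D transect of 6 sites with rook-style contiguity (adjacent sites only).
n = 6
w = np.zeros((n, n))
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1.0

smooth = [1, 2, 3, 4, 5, 6]       # neighbours similar: positive I
alternating = [1, 9, 1, 9, 1, 9]  # neighbours dissimilar: negative I
```

Positive values indicate that nearby locations carry similar values; values near -1 indicate checkerboard-like alternation.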
  • Skew-Gaussian mixture model and Gaussian mixture model were proposed to investigate CT image estimation from MR images by partitioning the data into two major tissue types. (diva-portal.org)
  • The performance of the proposed models was evaluated using the leave-one-out cross-validation method on real data. (diva-portal.org)
  • In this work, a constrained stochastic space search (CSSS) method for parameter estimation is proposed and used to optimize the goal function for the difference between measured and estimated gene expression time series data. (wur.nl)
  • Supported by a large number of examples, Linear Model Methodology provides a strong foundation in the theory of linear models and explores the latest developments in data analysis. (crcpress.com)
  • These studies stem from the fact that sampling the high-dimensional model space is highly nontrivial, and from the fact that both the Bayes factors and the marginal probabilities are well known to be sensitive to the prior choices (e.g. (springer.com)
  • This section ends with a description of the use of fitted models for estimating probabilities of causation and certain measures of lifetime detriment associated with exposure to ionizing radiation. (nap.edu)
  • Blizzard and Hosmer did not use the method because they still had convergence problems in Stata even though no estimated probabilities were exactly one. (cdc.gov)
  • Moreover, classifying observations based on their predicted probabilities is a type of binary classification model. (wikipedia.org)
  • For each year of the study, the 'incident cohort' (women experiencing a first osteoporotic fracture) was identified and run through a Markov model using 1-year cycles until 2020. (springer.com)
  • Inference in the resulting models can be carried out easily using variational (structured mean field) or Markov Chain Monte Carlo (Gibbs sampler). (hindawi.com)
  • There has been considerable and still growing interest in prior models and, in particular, in discrete Markov random field methods. (barnesandnoble.com)
  • RESEARCH DESIGN AND METHODS We used Markov modeling framework to generate yearly forecasts of the number of individuals in each of three states (diabetes, no diabetes, and death). (diabetesjournals.org)
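A Markov cohort forecast of this kind takes only a few lines: each year, the head-count row vector is multiplied by a transition matrix. The three-state transition probabilities below are invented placeholders for illustration, not the study's estimates:

```python
import numpy as np

# Hypothetical yearly transition probabilities between the three states
# (no diabetes, diabetes, death) -- illustrative numbers only.
P = np.array([
    [0.96, 0.03, 0.01],   # no diabetes -> ...
    [0.00, 0.97, 0.03],   # diabetes    -> ... (no remission assumed)
    [0.00, 0.00, 1.00],   # death is absorbing
])

def forecast(start, years):
    """Yearly head-counts per state: row vector `start` times P, iterated."""
    counts = [np.asarray(start, dtype=float)]
    for _ in range(years):
        counts.append(counts[-1] @ P)
    return np.vstack(counts)  # shape (years + 1, 3)

traj = forecast([1000.0, 100.0, 0.0], years=10)
```

Each row of P sums to 1, so total head-count is conserved across years, and the absorbing death state can only grow.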
  • The proposed parameter tuning method provided a consistent numerical measure of the response to the drug applied. (arvojournals.org)
  • The NPML estimation not only avoids the problem of numerical integration in deriving the MHD estimating equations, but also enhances the robustness characteristic because of its resistance to possible misspecification of the parametric distribution for the random effects. (ntu.edu.sg)
  • In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. (psu.edu)
  • S. Franz and H-G. Roos, The capriciousness of numerical methods for singular perturbations , SIAM Rev. , 53 (2011), 157. (aimsciences.org)
  • Michael Wulkow founded the company Computing in Technology (CiT) in 1992 and since then has been involved in projects, research and numerous publications on the modeling and numerical treatment of processes in technical chemistry, biology, particle technology and pharmacokinetics. (wiley-vch.de)
  • Large Sample Properties of Generalized Method of Moments Estimators," Econometrica, Econometric Society, vol. 50(4), pages 1029-1054, July. (repec.org)
  • Minimum Hellinger distance estimation for k-component Poisson mixture with random effects. (ntu.edu.sg)
  • Barros and Hirakata (2003) first proposed the use of the Poisson model with a robust variance estimator, and they also had convergence problems with the COPY method in Stata. (cdc.gov)
  • It has been shown that this model is robust to complex demographic scenarios for neutral genetic differentiation. (genetics.org)
  • The aim of this chapter is to present proposed kinetic and density functional theory (DFT) models for the selective oxidation of glycerol to various hydroxy-acids over an acidified Au/γ-Al 2 O 3 catalyst. (intechopen.com)
  • The first chapter also introduces some important notions: serial dependence, stationarity, testing for whiteness of spectrum, parametric and non-parametric modeling, and forecasting. (maa.org)
  • Similarly, observability and controllability (important concepts from control theory) get two pages in the chapter on state space models. (maa.org)
  • This is a well-written book with broad coverage on various topics … Each chapter provides carefully selected examples and cases to show the application of presented methods and pointed out possible future research directions. (routledge.com)
  • The final chapter introduces generalized linear models, which represent an extension of classical linear models. (crcpress.com)
  • Multiperiod Probit Models and Orthogonality Condition Estimation," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 24(1), pages 21-35, February. (repec.org)
  • In both senses, this book is written for people who wish to fit statistical models and understand them. (routledge.com)
  • In this course we consider the issues involved when we wish to construct realistic and useful statistical models for problems which can arise in a range of fields: medicine, finance, social research and environmental issues being some of the main areas. (ncl.ac.uk)
  • As far as I know, this is the first work in this area, which provides an excellent overview about statistical models, estimation, and applications. (routledge.com)
  • CONCLUSIONS: G-estimation may provide a better control for the healthy-worker survivor effect than standard methods. (cdc.gov)
  • A comparison of standard methods with g-estimation of accelerated failure-time models to address the healthy-worker survivor effect: application in a cohort of autoworkers exposed to metalworking fluids. (cdc.gov)
  • G-estimation of accelerated failure-time models was developed to handle this issue but has never been applied to account for the healthy-worker survivor effect. (cdc.gov)
  • Generalized Linear Models. (routledge.com)
  • In this paper we investigate SAE based on linear models with spatially correlated small area effects where the neighbourhood structure is described by a contiguity matrix. (soton.ac.uk)
  • Standard factor models imply a linear relationship between expected returns on assets and their factor exposures. (ssrn.com)
  • LLM are more flexible and interpretable with respect to the estimation of sediment load, and they have two advantages over linear models: (1) predictions are always positive and (2) the residuals are often more homoscedastic. (iwaponline.com)
  • We propose a novel spatial functional linear model (SFLM), that incorporates a spatial autoregressive parameter and a spatial weight matrix into FLM to accommodate spatial dependencies among individuals. (archives-ouvertes.fr)
  • The subject of a posteriori error estimation is in the early stages of development, although significant progress has been made in recent years for linear elliptic problems. (nap.edu)
  • The course provides a detailed treatment of linear and non-linear characterization and modelling of amplifiers/transmitters from device to system level perspective. (ucalgary.ca)
  • Given the importance of linear models in statistical theory and experimental research, a good understanding of their fundamental principles and theory is essential. (crcpress.com)
  • After presenting the historical evolution of certain methods and techniques used in linear models, the book reviews vector spaces and linear transformations and discusses the basic concepts and results of matrix algebra that are relevant to the study of linear models. (crcpress.com)
  • Although mainly focused on classical linear models, the next several chapters also explore recent techniques for solving well-known problems that pertain to the distribution and independence of quadratic forms, the analysis of estimable linear functions and contrasts, and the general treatment of balanced random and mixed-effects models. (crcpress.com)
  • The author then covers more contemporary topics in linear models, including the adequacy of Satterthwaite's approximation, unbalanced fixed- and mixed-effects models, heteroscedastic linear models, response surface models with random effects, and linear multiresponse models. (crcpress.com)
  • Reflecting advances made in the last thirty years, this book offers a rigorous development of the theory underlying linear models. (crcpress.com)
  • In supercritical fluid extraction processes, the thermodynamic model is a fundamental aspect. (scielo.br)
  • Green Biomass Pretreatment for Biofuels Production reviews a range of pretreatment methods such as ammonium fiber explosion, steam explosion, dilute acid hydrolysis, alkali hydrolysis, and supercritical carbon dioxide explosion focusing on their final sugar yields from hemicellulose, glucose yields from cellulose, as well as on their feasibilities in bioenergy production processes at various scales. (google.com)
  • This paper reviews suitable methods for durability testing as well as basic modeling approaches which allow for the transfer of laboratory results to the longtime behavior of interface materials during a vehicle's lifetime. (mdpi.com)
  • The review of Vehtari and Ojanen ( 2012 ) being qualitative, our contribution is to compare many of the different methods quantitatively in practical model selection problems, discuss the differences, and give recommendations about the preferred approaches. (springer.com)
  • In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). (mdpi.com)
  • Although the introduction suggests that "some basic knowledge of calculus" is required for understanding most methods described here, that substantially understates the necessary prerequisites. (maa.org)
  • Factors affecting reservoir firm yield, as determined by application of the Massachusetts Department of Environmental Protection's Firm Yield Estimator (FYE) model, were evaluated, modified, and tested on 46 streamflow-dominated reservoirs representing 15 Massachusetts drinking-water supplies. (usgs.gov)
  • The firm yield of a system is sensitive to how the water is transferred from one reservoir to another, the capacity of the connection between the reservoirs, and how seasonal variations in demand are represented in the FYE model. (usgs.gov)
  • This research aims at the successful convention of the tree based classifier CART in order to predict the yield estimation of Jatropha seed by applying it on the agricultural dataset. (alliedacademies.org)
  • If the mathematical models of mechanics were perfect representations of mechanical events (which, of course, they are not), their utility in simulating events would be solely dependent on the discretization process used in computations and the errors it produces. (nap.edu)