Calibration: Determination, by measurement or comparison with a standard, of the correct value of each scale reading on a meter or other measuring instrument; or determination of the settings of a control device that correspond to particular values of voltage, current, frequency or other output.Reproducibility of Results: The statistical reproducibility of measurements (often in a clinical context), including the testing of instrumentation or techniques to obtain reproducible results. The concept includes reproducibility of physiological measurements, which may be used to develop rules to assess probability or prognosis, or response to a stimulus; reproducibility of occurrence of a condition; and reproducibility of experimental results.Sensitivity and Specificity: Binary classification measures to assess test results. Sensitivity or recall rate is the proportion of true positives. Specificity is the probability of correctly determining the absence of a condition. (From Last, Dictionary of Epidemiology, 2d ed)Reference Standards: A basis of value established for the measure of quantity, weight, extent or quality, e.g. weight standards, standard solutions, methods, techniques, and procedures used in diagnosis and therapy.Algorithms: A procedure consisting of a sequence of algebraic formulas and/or logical steps to calculate or determine a given task.Quality Control: A system for verifying and maintaining a desired level of quality in a product or process by careful planning, use of proper equipment, continued inspection, and corrective action as required. (Random House Unabridged Dictionary, 2d ed)Equipment Design: Methods of creating machines and devices.Phantoms, Imaging: Devices or objects in various imaging techniques used to visualize or enhance visualization by simulating conditions encountered in the procedure. Phantoms are used very often in procedures employing or measuring x-irradiation or radioactive material to evaluate performance. Phantoms often have properties similar to human tissue. Water demonstrates absorbing properties similar to normal tissue, hence water-filled phantoms are used to map radiation levels. Phantoms are used also as teaching aids to simulate real conditions with x-ray or ultrasonic machines. (From Iturralde, Dictionary and Handbook of Nuclear Medicine and Clinical Imaging, 1990)ROC Curve: A graphic means for assessing the ability of a screening test to discriminate between healthy and diseased persons; may also be used in other studies, e.g., distinguishing stimuli responses as to a faint stimuli or nonstimuli.Models, Statistical: Statistical formulations or analyses which, when applied to data and found to fit the data, are then used to verify the assumptions and parameters used in the analysis. Examples of statistical models are the linear model, binomial model, polynomial model, two-parameter model, etc.Equipment Failure Analysis: The evaluation of incidents involving the loss of function of a device. These evaluations are used for a variety of purposes such as to determine the failure rates, the causes of failures, costs of failures, and the reliability and maintainability of devices.Predictive Value of Tests: In screening and diagnostic tests, the probability that a person with a positive test is a true positive (i.e., has the disease), is referred to as the predictive value of a positive test; whereas, the predictive value of a negative test is the probability that the person with a negative test does not have the disease. 
Predictive value is related to the sensitivity and specificity of the test.Limit of Detection: Concentration or quantity that is derived from the smallest measure that can be detected with reasonable certainty for a given analytical procedure.Chromatography, High Pressure Liquid: Liquid chromatographic techniques which feature high inlet pressures, high sensitivity, and high speed.Computer Simulation: Computer-based representation of physical systems and phenomena such as chemical processes.Fossils: Remains, impressions, or traces of animals or plants of past geological times which have been preserved in the earth's crust.Autoanalysis: Method of analyzing chemicals using automation.Transducers: Any device or element which converts an input signal into an output signal of a different form. Examples include the microphone, phonographic pickup, loudspeaker, barometer, photoelectric cell, automobile horn, doorbell, and underwater sound transducer. (McGraw Hill Dictionary of Scientific and Technical Terms, 4th ed)Models, Theoretical: Theoretical representations that simulate the behavior or activity of systems, processes, or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.Image Processing, Computer-Assisted: A technique of inputting two-dimensional images into a computer and then enhancing or analyzing the imagery into a form that is more useful to the human observer.Flow Injection Analysis: The analysis of a chemical substance by inserting a sample into a carrier stream of reagent using a sample injection valve that propels the sample downstream where mixing occurs in a coiled tube, then passes into a flow-through detector and a recorder or other data handling device.Software: Sequential operating programs and data which instruct the functioning of a digital computer.Chemistry Techniques, Analytical: Methodologies used for the isolation, identification, detection, and quantitation of chemical substances.Indicators and Reagents: Substances used for the detection, identification, analysis, etc. of chemical, biological, or pathologic processes or conditions. Indicators are substances that change in physical appearance, e.g., color, at or approaching the endpoint of a chemical titration, e.g., on the passage between acidity and alkalinity. Reagents are substances used for the detection or determination of another substance by chemical or microscopical means, especially analysis. Types of reagents are precipitants, solvents, oxidizers, reducers, fluxes, and colorimetric reagents. (From Grant & Hackh's Chemical Dictionary, 5th ed, p301, p499)Least-Squares Analysis: A principle of estimation in which the estimates of a set of parameters in a statistical model are those quantities minimizing the sum of squared differences between the observed values of a dependent variable and the values predicted by the model.Spectroscopy, Near-Infrared: A noninvasive technique that uses the differential absorption properties of hemoglobin and myoglobin to evaluate tissue oxygenation and indirectly can measure regional hemodynamics and blood flow. Near-infrared light (NIR) can propagate through tissues and at particular wavelengths is differentially absorbed by oxygenated vs. deoxygenated forms of hemoglobin and myoglobin. Illumination of intact tissue with NIR allows qualitative assessment of changes in the tissue concentration of these molecules. 
The analysis is also used to determine body composition.Weights and Measures: Measuring and weighing systems and processes.Radiometry: The measurement of radiation by photography, as in x-ray film and film badge, by Geiger-Mueller tube, and by SCINTILLATION COUNTING.Artifacts: Any visible result of a procedure which is caused by the procedure itself and not by the entity being analyzed. Common examples include histological structures introduced by tissue processing, radiographic images of structures that are not naturally present in living tissue, and products of chemical reactions that occur during analysis.Bayes Theorem: A theorem in probability theory named for Thomas Bayes (1702-1761). In epidemiology, it is used to obtain the probability of disease in a group of people with some characteristic on the basis of the overall rate of that disease and of the likelihood of that characteristic in healthy and diseased individuals. The most familiar application is in clinical decision analysis where it is used for estimating the probability of a particular diagnosis given the appearance of some symptoms or test result.Time Factors: Elements of limited time intervals, contributing to particular results or situations.Imaging, Three-Dimensional: The process of generating three-dimensional images by electronic, photographic, or other methods. For example, three-dimensional images can be generated by assembling multiple tomographic images with the aid of a computer, while photographic 3-D images (HOLOGRAPHY) can be made by exposing film to the interference pattern created when two laser light sources shine on an object.Observer Variation: The failure by the observer to measure or identify a phenomenon accurately, which results in an error. Sources for this may be due to the observer's missing an abnormality, or to faulty technique resulting in incorrect test measurement, or to misinterpretation of the data. Two varieties are inter-observer variation (the amount observers vary from one another when reporting on the same material) and intra-observer variation (the amount one observer varies between observations when reporting more than once on the same material).Radiographic Image Enhancement: Improvement in the quality of an x-ray image by use of an intensifying screen, tube, or filter and by optimum exposure techniques. Digital processing methods are often employed.Nomograms: Graphical representation of a statistical model containing scales for calculating the prognostic weight of a value for each individual variable. Nomograms are instruments that can be used to predict outcomes using specific clinical parameters. They use ALGORITHMS that incorporate several variables to calculate the predicted probability that a patient will achieve a particular clinical endpoint.Chromatography, Liquid: Chromatographic techniques in which the mobile phase is a liquid.Image Interpretation, Computer-Assisted: Methods developed to aid in the interpretation of ultrasound, radiographic images, etc., for diagnosis of disease.Monitoring, Ambulatory: The use of electronic equipment to observe or record physiologic processes while the patient undergoes normal daily activities.Tandem Mass Spectrometry: A mass spectrometry technique using two (MS/MS) or more mass analyzers. With two in tandem, the precursor ions are mass-selected by a first mass analyzer, and focused into a collision region where they are then fragmented into product ions which are then characterized by a second mass analyzer. 
A variety of techniques are used to separate the compounds, ionize them, and introduce them to the first mass analyzer. For example, for in GC-MS/MS, GAS CHROMATOGRAPHY-MASS SPECTROMETRY is involved in separating relatively small compounds by GAS CHROMATOGRAPHY prior to injecting them into an ionization chamber for the mass selection.Biosensing Techniques: Any of a variety of procedures which use biomolecular probes to measure the presence or concentration of biological molecules, biological structures, microorganisms, etc., by translating a biochemical interaction at the probe surface into a quantifiable physical signal.Gas Chromatography-Mass Spectrometry: A microanalytical technique combining mass spectrometry and gas chromatography for the qualitative as well as quantitative determinations of compounds.Linear Models: Statistical models in which the value of a parameter for a given value of a factor is assumed to be equal to a + bx, where a and b are constants. The models predict a linear regression.Perceptual Distortion: Lack of correspondence between the way a stimulus is commonly perceived and the way an individual perceives it under given conditions.Data Interpretation, Statistical: Application of statistical procedures to analyze specific observed or assumed facts from a particular study.Immunoassay: A technique using antibodies for identifying or quantifying a substance. Usually the substance being studied serves as antigen both in antibody production and in measurement of antibody by the test substance.Electrodes: Electric conductors through which electric currents enter or leave a medium, whether it be an electrolytic solution, solid, molten mass, gas, or vacuum.Image Enhancement: Improvement of the quality of a picture by various techniques, including computer processing, digital filtering, echocardiographic techniques, light and ultrastructural MICROSCOPY, fluorescence spectrometry and microscopy, scintigraphy, and in vitro image processing at the molecular level.Evaluation Studies as Topic: Studies determining the effectiveness or value of processes, personnel, and equipment, or the material on conducting such studies. For drugs and devices, CLINICAL TRIALS AS TOPIC; DRUG EVALUATION; and DRUG EVALUATION, PRECLINICAL are available.Models, Biological: Theoretical representations that simulate the behavior or activity of biological processes or diseases. For disease models in living animals, DISEASE MODELS, ANIMAL is available. Biological models include the use of mathematical equations, computers, and other electronic equipment.Spectrometry, Mass, Electrospray Ionization: A mass spectrometry technique used for analysis of nonvolatile compounds such as proteins and macromolecules. The technique involves preparing electrically charged droplets from analyte molecules dissolved in solvent. The electrically charged droplets enter a vacuum chamber where the solvent is evaporated. Evaporation of solvent reduces the droplet size, thereby increasing the coulombic repulsion within the droplet. As the charged droplets get smaller, the excess charge within them causes them to disintegrate and release analyte molecules. The volatilized analyte molecules are then analyzed by mass spectrometry.Regression Analysis: Procedures for finding the mathematical function which best describes the relationship between a dependent variable and one or more independent variables. 
In linear regression (see LINEAR MODELS) the relationship is constrained to be a straight line and LEAST-SQUARES ANALYSIS is used to determine the best fit. In logistic regression (see LOGISTIC MODELS) the dependent variable is qualitative rather than continuously variable and LIKELIHOOD FUNCTIONS are used to find the best relationship. In multiple regression, the dependent variable is considered to depend on more than a single independent variable.Automation: Controlled operation of an apparatus, process, or system by mechanical or electronic devices that take the place of human organs of observation, effort, and decision. (From Webster's Collegiate Dictionary, 1993)Discriminant Analysis: A statistical analytic technique used with discrete dependent variables, concerned with separating sets of observed values and allocating new values. It is sometimes used instead of regression analysis.Optical Processes: Behavior of LIGHT and its interactions with itself and materials.Mass Spectrometry: An analytical method used in determining the identity of a chemical based on its mass using mass analyzers/mass spectrometers.Tomography, X-Ray Computed: Tomography using x-ray transmission and a computer algorithm to reconstruct the image.Magnetic Resonance Imaging: Non-invasive method of demonstrating internal anatomy based on the principle that atomic nuclei in a strong magnetic field absorb pulses of radiofrequency energy and emit them as radiowaves which can be reconstructed into computerized images. The concept includes proton spin tomographic techniques.Reference Values: The range or frequency distribution of a measurement in a population (of organisms, organs or things) that has not been selected for the presence of disease or abnormality.Point-of-Care Systems: Laboratory and other services provided to patients at the bedside. These include diagnostic and laboratory testing using automated information entry.Pattern Recognition, Automated: In INFORMATION RETRIEVAL, machine-sensing or identification of visible patterns (shapes, forms, and configurations). (Harrod's Librarians' Glossary, 7th ed)Prospective Studies: Observation of a population for a sufficient number of persons over a sufficient number of years to generate incidence or mortality rates subsequent to the selection of the study group.Radiographic Image Interpretation, Computer-Assisted: Computer systems or networks designed to provide radiographic interpretive information.Solid Phase Extraction: An extraction method that separates analytes using a solid phase and a liquid phase. It is used for preparative sample cleanup before analysis by CHROMATOGRAPHY and other analytical methods.Fiducial Markers: Materials used as reference points for imaging studies.Signal Processing, Computer-Assisted: Computer-assisted processing of electric, ultrasonic, or electronic signals to interpret function and activity.Monte Carlo Method: In statistics, a technique for numerically approximating the solution of a mathematical problem by studying the distribution of some random variable, often generated by a computer. The name alludes to the randomness characteristic of the games of chance played at the gambling casinos in Monte Carlo. 
(From Random House Unabridged Dictionary, 2d ed, 1993)Drug Stability: The chemical and physical integrity of a pharmaceutical product.Spectrophotometry, Ultraviolet: Determination of the spectra of ultraviolet absorption by specific molecules in gases or liquids, for example Cl2, SO2, NO2, CS2, ozone, mercury vapor, and various unsaturated compounds. (McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed)Diagnosis, Computer-Assisted: Application of computer programs designed to assist the physician in solving a diagnostic problem.Pharmaceutical Preparations: Drugs intended for human or veterinary use, presented in their finished dosage form. Included here are materials used in the preparation and/or formulation of the finished dosage form.Reagent Kits, Diagnostic: Commercially prepared reagent sets, with accessory devices, containing all of the major components and literature necessary to perform one or more designated diagnostic tests or procedures. They may be for laboratory or personal use.Bias (Epidemiology): Any deviation of results or inferences from the truth, or processes leading to such deviation. Bias can result from several sources: one-sided or systematic variations in measurement from the true value (systematic error); flaws in study design; deviation of inferences, interpretations, or analyses based on flawed data or data collection; etc. There is no sense of prejudice or subjectivity implied in the assessment of bias under these conditions.Thermometers: Measuring instruments for determining the temperature of matter. Most thermometers used in the field of medicine are designed for measuring body temperature or for use in the clinical laboratory. (From UMDNS, 1999)Acceleration: An increase in the rate of speed.Freeze Drying: Method of tissue preparation in which the tissue specimen is frozen and then dehydrated at low temperature in a high vacuum. This method is also used for dehydrating pharmaceutical and food products.Support Vector Machines: Learning algorithms which are a set of related supervised computer learning methods that analyze data and recognize patterns, and used for classification and regression analysis.Blood Glucose Self-Monitoring: Self evaluation of whole blood glucose levels outside the clinical laboratory. A digital or battery-operated reflectance meter may be used. It has wide application in controlling unstable insulin-dependent diabetes.Photogrammetry: Making measurements by the use of stereoscopic photographs.Neural Networks (Computer): A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.Likelihood Functions: Functions constructed from a statistical model and a set of observed data which give the probability of that data for various values of the unknown model parameters. 
Those parameter values that maximize the probability are the maximum likelihood estimates of the parameters.Uncertainty: The condition in which reasonable knowledge regarding risks, benefits, or the future is not available.Chemistry, Clinical: The specialty of ANALYTIC CHEMISTRY applied to assays of physiologically important substances found in blood, urine, tissues, and other biological fluids for the purpose of aiding the physician in making a diagnosis or following therapy.Monitoring, Physiologic: The continuous measurement of physiological processes, blood pressure, heart rate, renal output, reflexes, respiration, etc., in a patient or experimental animal; includes pharmacologic monitoring, the measurement of administered drugs or their metabolites in the blood, tissues, or urine.Tomography Scanners, X-Ray Computed: X-ray image-detecting devices that make a focused image of body structures lying in a predetermined plane from which more complex images are computed.Brain-Computer Interfaces: Instrumentation consisting of hardware and software that communicates with the BRAIN. The hardware component of the interface records brain signals, while the software component analyzes the signals and converts them into a command that controls a device or sends a feedback signal to the brain.Optical Fibers: Thin strands of transparent material, usually glass, that are used for transmitting light waves over long distances.Risk Assessment: The qualitative or quantitative estimation of the likelihood of adverse effects that may result from exposure to specified health hazards or from the absence of beneficial influences. (Last, Dictionary of Epidemiology, 1988)Nephelometry and Turbidimetry: Chemical analysis based on the phenomenon whereby light, passing through a medium with dispersed particles of a different refractive index from that of the medium, is attenuated in intensity by scattering. In turbidimetry, the intensity of light transmitted through the medium, the unscattered light, is measured. In nephelometry, the intensity of the scattered light is measured, usually, but not necessarily, at right angles to the incident light beam.Computational Biology: A field of biology concerned with the development of techniques for the collection and manipulation of biological data, and the use of such data to make biological discoveries or predictions. This field encompasses all computational methods and theories for solving biological problems including manipulation of models and datasets.Blood Chemical Analysis: An examination of chemicals in the blood.Prothrombin Time: Clotting time of PLASMA recalcified in the presence of excess TISSUE THROMBOPLASTIN. Factors measured are FIBRINOGEN; PROTHROMBIN; FACTOR V; FACTOR VII; and FACTOR X. It is used for monitoring anticoagulant therapy with COUMARINS.Data Display: The visual display of data in a man-machine system. 
An example is when data is called from the computer and transmitted to a CATHODE RAY TUBE DISPLAY or LIQUID CRYSTAL display.Cone-Beam Computed Tomography: Computed tomography modalities which use a cone or pyramid-shaped beam of radiation.Microchemistry: The development and use of techniques and equipment to study or perform chemical reactions, with small quantities of materials, frequently less than a milligram or a milliliter.Photography: Method of making images on a sensitized surface by exposure to light or other radiant energy.Indicator Dilution Techniques: Methods for assessing flow through a system by injection of a known quantity of an indicator, such as a dye, radionuclide, or chilled liquid, into the system and monitoring its concentration over time at a specific point in the system. (From Dorland, 28th ed)Clinical Chemistry Tests: Laboratory tests demonstrating the presence of physiologically significant substances in the blood, urine, tissue, and body fluids with application to the diagnosis or therapy of disease.Film Dosimetry: Use of a device (film badge) for measuring exposure of individuals to radiation. It is usually made of metal, plastic, or paper and loaded with one or more pieces of x-ray film.Phylogeny: The relationships of groups of organisms as reflected by their genetic makeup.Area Under Curve: A statistical means of summarizing information from a series of measurements on one individual. It is frequently used in clinical pharmacology where the AUC from serum levels can be interpreted as the total uptake of whatever has been administered. As a plot of the concentration of a drug against time, after a single dose of medicine, producing a standard shape curve, it is a means of comparing the bioavailability of the same drug made by different companies. (From Winslade, Dictionary of Clinical Research, 1992)Secobarbital: A barbiturate that is used as a sedative. Secobarbital is reported to have no anti-anxiety activity.Diagnostic Errors: Incorrect diagnoses after clinical examination or technical diagnostic procedures.False Positive Reactions: Positive test results in subjects who do not possess the attribute for which the test is conducted. The labeling of healthy persons as diseased when screening in the detection of disease. (Last, A Dictionary of Epidemiology, 2d ed)Ear Canal: The narrow passage way that conducts the sound collected by the EAR AURICLE to the TYMPANIC MEMBRANE.Models, Genetic: Theoretical representations that simulate the behavior or activity of genetic processes or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.Optics and Photonics: A specialized field of physics and engineering involved in studying the behavior and properties of light and the technology of analyzing, generating, transmitting, and manipulating ELECTROMAGNETIC RADIATION in the visible, infrared, and ultraviolet range.Fluorescent Dyes: Agents that emit light after excitation by light. The wave length of the emitted light is usually longer than that of the incident light. Fluorochromes are substances that cause fluorescence in other substances, i.e., dyes used to mark or label other compounds with fluorescent tags.Acoustics: The branch of physics that deals with sound and sound waves. In medicine it is often applied in procedures in speech and hearing studies. With regard to the environment, it refers to the characteristics of a room, auditorium, theatre, building, etc. that determines the audibility or fidelity of sounds in it. 
(From Random House Unabridged Dictionary, 2d ed)Germanium: A rare metal element with a blue-gray appearance and atomic symbol Ge, atomic number 32, and atomic weight 72.63.Surgery, Computer-Assisted: Surgical procedures conducted with the aid of computers. This is most frequently used in orthopedic and laparoscopic surgery for implant placement and instrument guidance. Image-guided surgery interactively combines prior CT scans or MRI images with real-time video.Reagent Strips: Narrow pieces of material impregnated or covered with a substance used to produce a chemical reaction. The strips are used in detecting, measuring, producing, etc., other substances. (From Dorland, 28th ed)Optical Devices: Products or parts of products used to detect, manipulate, or analyze light, such as LENSES, refractors, mirrors, filters, prisms, and OPTICAL FIBERS.Laboratories: Facilities equipped to carry out investigative procedures.Fiber Optic Technology: The technology of transmitting light over long distances through strands of glass or other transparent material.Radioactivity: The spontaneous transformation of a nuclide into one or more different nuclides, accompanied by either the emission of particles from the nucleus, nuclear capture or ejection of orbital electrons, or fission. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)Infrared Rays: That portion of the electromagnetic spectrum usually sensed as heat. Infrared wavelengths are longer than those of visible light, extending into the microwave frequencies. They are used therapeutically as heat, and also to warm food in restaurants.Tablets: Solid dosage forms, of varying weight, size, and shape, which may be molded or compressed, and which contain a medicinal substance in pure or diluted form. (Dorland, 28th ed)Electrical Equipment and Supplies: Apparatus and instruments that generate and operate with ELECTRICITY, and their electrical components.Retrospective Studies: Studies used to test etiologic hypotheses in which inferences about an exposure to putative causal factors are derived from data relating to characteristics of persons under study or to events or experiences in their past. The essential feature is that some of the persons under study have the disease or outcome of interest and their characteristics are compared with those of unaffected persons.Radiation Dosage: The amount of radiation energy that is deposited in a unit mass of material, such as tissues of plants or animal. In RADIOTHERAPY, radiation dosage is expressed in gray units (Gy). 
In RADIOLOGIC HEALTH, the dosage is expressed by the product of absorbed dose (Gy) and quality factor (a function of linear energy transfer), and is called radiation dose equivalent in sievert units (Sv).Photometry: Measurement of the various properties of light.Computer-Aided Design: The use of computers for designing and/or manufacturing of anything, including drugs, surgical procedures, orthotics, and prosthetics.Chromatography, Reverse-Phase: A chromatography technique in which the stationary phase is composed of a non-polar substance with a polar mobile phase, in contrast to normal-phase chromatography in which the stationary phase is a polar substance with a non-polar mobile phase.Spectrophotometry, Atomic: Spectrophotometric techniques by which the absorption or emmision spectra of radiation from atoms are produced and analyzed.Magnetics: The study of MAGNETIC PHENOMENA.Evolution, Molecular: The process of cumulative change at the level of DNA; RNA; and PROTEINS, over successive generations.Electrochemistry: The study of chemical changes resulting from electrical action and electrical activity resulting from chemical changes.Radiotherapy, Computer-Assisted: Computer systems or programs used in accurate computations for providing radiation dosage treatment to patients.Normal Distribution: Continuous frequency distribution of infinite range. Its properties are as follows: 1, continuous, symmetrical distribution with both tails extending to infinity; 2, arithmetic mean, mode, and median identical; and 3, shape completely determined by the mean and standard deviation.Amplifiers, Electronic: Electronic devices that increase the magnitude of a signal's power level or current.False Negative Reactions: Negative test results in subjects who possess the attribute for which the test is conducted. The labeling of diseased persons as healthy when screening in the detection of disease. (Last, A Dictionary of Epidemiology, 2d ed)Water: A clear, odorless, tasteless liquid that is essential for most animal and plant life and is an excellent solvent for many substances. The chemical formula is hydrogen oxide (H2O). (McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed)Ultrasonography: The visualization of deep structures of the body by recording the reflections or echoes of ultrasonic pulses directed into the tissues. Use of ultrasound for imaging or diagnostic purposes employs frequencies ranging from 1.6 to 10 megahertz.Feasibility Studies: Studies to determine the advantages or disadvantages, practicability, or capability of accomplishing a projected plan, study, or project.Chromatography, Gas: Fractionation of a vaporized sample as a consequence of partition between a mobile gaseous phase and a stationary phase held in a column. Two types are gas-solid chromatography, where the fixed phase is a solid, and gas-liquid, in which the stationary phase is a nonvolatile liquid supported on an inert solid matrix.Fourier Analysis: Analysis based on the mathematical function first formulated by Jean-Baptiste-Joseph Fourier in 1807. The function, known as the Fourier transform, describes the sinusoidal pattern of any fluctuating pattern in the physical world in terms of its amplitude and its phase. It has broad applications in biomedicine, e.g., analysis of the x-ray crystallography data pivotal in identifying the double helical nature of DNA and in analysis of other molecules, including viruses, and the modified back-projection algorithm universally used in computerized tomography imaging, etc. 
(From Segen, The Dictionary of Modern Medicine, 1992)Environmental Monitoring: The monitoring of the level of toxins, chemical pollutants, microbial contaminants, or other harmful substances in the environment (soil, air, and water), workplace, or in the bodies of people and animals present in that environment.Hydrogen-Ion Concentration: The normality of a solution with respect to HYDROGEN ions; H+. It is related to acidity measurements in most cases by pH = log10[1/(H+)], where (H+) is the hydrogen ion concentration in gram equivalents per liter of solution. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)Fluorescence: The property of emitting radiation while being irradiated. The radiation emitted is usually of longer wavelength than that incident or absorbed, e.g., a substance can be irradiated with invisible radiation and emit visible light. X-ray fluorescence is used in diagnosis.Prognosis: A prediction of the probable outcome of a disease based on an individual's condition and the usual course of the disease as seen in similar situations.Scintillation Counting: Detection and counting of scintillations produced in a fluorescent material by ionizing radiation.Computer Terminals: Input/output devices designed to receive data in an environment associated with the job to be performed, and capable of transmitting entries to, and obtaining output from, the system of which it is a part. (Computer Dictionary, 4th ed.)Proteins: Linear POLYPEPTIDES that are synthesized on RIBOSOMES and may be further modified, crosslinked, cleaved, or assembled into complex proteins with several subunits. The specific sequence of AMINO ACIDS determines the shape the polypeptide will take, during PROTEIN FOLDING, and the function of the protein.Spectrum Analysis: The measurement of the amplitude of the components of a complex waveform throughout the frequency range of the waveform. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)Scattering, Radiation: The diversion of RADIATION (thermal, electromagnetic, or nuclear) from its original path as a result of interactions or collisions with atoms, molecules, or larger particles in the atmosphere or other media. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)Sequence Analysis, Protein: A process that includes the determination of AMINO ACID SEQUENCE of a protein (or peptide, oligopeptide or peptide fragment) and the information analysis of the sequence.
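The definitions of sensitivity and specificity, predictive value of tests, and Bayes theorem above fit together in a short numerical sketch. The counts and prevalences below are invented purely for illustration; they are not taken from any study cited on this page.

```python
# Illustrative sketch: sensitivity, specificity, predictive values, and Bayes' theorem.
# All numbers are hypothetical.

def test_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)          # true-positive rate (recall)
    specificity = tn / (tn + fp)          # true-negative rate
    ppv = tp / (tp + fp)                  # predictive value of a positive test
    npv = tn / (tn + fn)                  # predictive value of a negative test
    return sensitivity, specificity, ppv, npv

def ppv_from_bayes(sensitivity, specificity, prevalence):
    """Post-test probability of disease given a positive result (Bayes' theorem)."""
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_pos

se, sp, ppv, npv = test_metrics(tp=90, fp=40, fn=10, tn=860)
print(f"sensitivity={se:.2f} specificity={sp:.2f} PPV={ppv:.2f} NPV={npv:.2f}")

# The same PPV follows from Bayes' theorem using the study prevalence (100/1000 here):
print(f"PPV via Bayes: {ppv_from_bayes(se, sp, prevalence=0.10):.2f}")

# At a lower prevalence the same test has a much lower positive predictive value:
print(f"PPV at 1% prevalence: {ppv_from_bayes(se, sp, prevalence=0.01):.2f}")
```

The last two lines show why predictive value depends on prevalence even though sensitivity and specificity do not.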
These instruments were complementary and were used in academic and analytical settings through the 1950s, 1960s, and 1970s. ... Clarke, F. J. J. (June 5, 1972). "High Accuracy Spectrophotometry at the National Physical Laboratory (Teddington, Middlesex, ... wavelength calibration reproducibility and wide range of scan speeds of its prism-grating double-monochromator. Cary, Henry H ... Sommer, L. (1989). Analytical absorption spectrophotometry in the visible and ultraviolet : the principles. Amsterdam: Elsevier ...
This approach was first put forward in the 1970s and developed in 2002. Many analysts do not employ analytical equations for ... When calibration plots are markedly nonlinear, one can bypass the empirical polynomial fitting and employ the ratio of two ... M.J.T. Milton; J.A. Wang (2002). "High Accuracy Method for Isotope Dilution Mass Spectrometry with Application to the ... Analytical application of the radiotracer method is a forerunner of isotope dilution. This method was developed in the early 20th ...
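The isotope dilution idea mentioned above can be shown with a simple two-isotope mass balance. This is a hedged sketch only: the abundances, spike amount, and measured ratio are invented numbers, not data from the cited work.

```python
# Hedged sketch of a simple two-isotope isotope-dilution calculation.
# Abundances, spike amount, and the measured ratio below are hypothetical.

def isotope_dilution_moles(n_spike, a1_sample, a2_sample, a1_spike, a2_spike, r_mix):
    """
    Moles of analyte element in the sample from an isotope-dilution experiment.

    a1_*, a2_* : fractional abundances of isotopes 1 and 2 in sample and spike
    n_spike    : moles of element added with the enriched spike
    r_mix      : measured isotope-1/isotope-2 ratio of the blend
    """
    # Mass balance on both isotopes, solved for n_sample:
    #   r_mix = (n_sample*a1_sample + n_spike*a1_spike) / (n_sample*a2_sample + n_spike*a2_spike)
    return n_spike * (a1_spike - r_mix * a2_spike) / (r_mix * a2_sample - a1_sample)

# Natural-abundance sample vs. a spike enriched in isotope 2 (hypothetical values):
n_x = isotope_dilution_moles(
    n_spike=1.0e-6,
    a1_sample=0.90, a2_sample=0.10,   # sample: mostly isotope 1
    a1_spike=0.05,  a2_spike=0.95,    # spike: mostly isotope 2
    r_mix=1.20,                       # measured ratio of the spiked sample
)
print(f"analyte in sample: {n_x:.3e} mol")
```

Because the result depends only on the measured ratio and the known spike, no external calibration plot is needed, which is the appeal of the ratio-based approach described above.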
Baird, D. (2002). "Analytical chemistry and the 'big' scientific instrumentation revolution". In Morris, Peter J. T. From ... They may be responsible for calibration, testing and maintenance of the system. In a research environment it is common for ... Common concerns of both are the selection of appropriate sensors based on size, weight, cost, reliability, accuracy, longevity ... in the 1970s. The transformation of instrumentation from mechanical pneumatic transmitters, controllers, and valves to ...
The 1970s brought many new aspects to the CFS duties with the addition of a forensic engineer and certain types of electronic ... This unit is part of a thorough system to ensure the accuracy and relevance of the evidence it produces. Communications between ... The new standard for CFS is the ISO 17025 standard for testing and calibration laboratories (along with relevant supplemental ... GOALS Newsletter (Government of Ontario Analytical Laboratories System), December 1992, Issue #9. http://www.csfs.ca/csfs_page. ...
Standard addition can be applied to most analytical techniques and is used instead of a calibration curve to solve the matrix ... Starting in approximately the 1970s and continuing into the present day, analytical chemistry has progressively become more inclusive of ... The error of a measurement is an inverse measure of accurate measurement, i.e., the smaller the error, the greater the accuracy of the ... Modern analytical chemistry is dominated by instrumental analysis. Many analytical chemists focus on a single type of ...
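The method of standard additions mentioned above can be sketched in a few lines: the signal is measured for the unspiked sample and for spiked aliquots, a line is fitted, and the unknown concentration is read from the x-intercept. The readings below are made up, and equal-volume spikes are assumed so dilution corrections are ignored.

```python
# Minimal sketch of the method of standard additions (hypothetical readings).
# Equal-volume spikes are assumed, so dilution corrections are ignored here.

added = [0.0, 1.0, 2.0, 3.0]             # concentration added to the sample, e.g. mg/L
signal = [0.212, 0.414, 0.609, 0.815]    # instrument response (made-up values)

n = len(added)
mean_x = sum(added) / n
mean_y = sum(signal) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal)) / \
        sum((x - mean_x) ** 2 for x in added)
intercept = mean_y - slope * mean_x

# The original (unspiked) concentration is the magnitude of the x-intercept:
c_sample = intercept / slope
print(f"slope={slope:.4f}, intercept={intercept:.4f}")
print(f"estimated sample concentration = {c_sample:.2f} mg/L")
```

Because the calibration is built inside the sample matrix itself, a proportional matrix effect changes the slope but not the extrapolated concentration, which is why the method substitutes for an external calibration curve.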
He let us think that we had some of the best ideas, but on reflection we knew where they came from." During the 1970s Gutowsky ... "Pittcon '92". Analytical Chemistry. 64 (3): 133A-137A. 31 May 2012. doi:10.1021/ac00027a716. Retrieved 15 June 2017. "2016 ... Comparing results from a variety of samples, Gutowsky and his group improved the accuracy of their instrument through careful ... Through rigorous calculation, convergence, calibration, experimental characterization, and correlation to chemical concepts, he ...
In the late 1960s and 1970s unmeasured variables were taken into account in the data reconciliation process. DVR also became ... Measurement errors can be categorized into two basic types: random errors due to intrinsic sensor accuracy and systematic ... ", "analytical redundancy", or "topological redundancy". Redundancy can be due to sensor redundancy, where sensors are ... errors (or gross errors) due to sensor calibration or faulty data transmission. Random errors mean that the measurement y ...
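A small worked example may help here. The sketch below reconciles three flow measurements around a single node so that the mass balance closes exactly, weighting each adjustment by the sensor variance; the flows and standard deviations are invented. A large normalized imbalance would hint at a gross (systematic) error rather than random noise.

```python
# Hedged sketch of steady-state data reconciliation around a single node
# (inlet flow = sum of two outlet flows).  Measured values and standard
# deviations are hypothetical.

measured = [100.5, 60.2, 41.0]     # [inlet, outlet1, outlet2], e.g. kg/h
sigma    = [1.0,   0.8,  0.8]      # sensor standard deviations (random error)

# Linear constraint a . x = 0 for inlet - outlet1 - outlet2 = 0
a = [1.0, -1.0, -1.0]
variances = [s * s for s in sigma]

residual = sum(ai * xi for ai, xi in zip(a, measured))          # imbalance
denom = sum(ai * ai * vi for ai, vi in zip(a, variances))       # a V a^T

# Weighted least-squares adjustment that exactly closes the balance:
reconciled = [xi - vi * ai * residual / denom
              for xi, vi, ai in zip(measured, variances, a)]

# |residual| / sqrt(denom) much larger than ~2 would suggest a gross error.
print("imbalance before:", round(residual, 3))
print("normalized imbalance:", round(residual / denom ** 0.5, 3))
print("reconciled flows:", [round(x, 3) for x in reconciled])
print("imbalance after:", round(sum(ai * xi for ai, xi in zip(a, reconciled)), 6))
```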
Calibration at low concentrations is often done with automated analyzers to save time and to eliminate variables of manual ... The "ultrapure water" term became more popular in the later 1970s and early 1980s as a way of describing the particular quality ... It is the standard location for the majority of analytical tests. The point of connection (POC) is another commonly used point ... They require no maintenance except for periodic verification of measurement accuracy, typically annually. Sodium is usually the ...
This effect is accounted for during calibration by using a different marine calibration curve; without this curve, modern ... C atoms that are detected.[62] In the late 1970s an alternative approach became available: directly counting the number of 14C ... To verify the accuracy of the method, several artefacts that were datable by other techniques were tested; the results of the ... the success of radiocarbon dating stimulated interest in analytical and statistical approaches to archaeological data.[102] ...
These short term fluctuations in the calibration curve are now known as de Vries effects, after Hessel de Vries. A calibration ... In the late 1970s an alternative approach became available: directly counting the number of 14 C and 12 C atoms in a given ... To verify the accuracy of the method, several artefacts that were datable by other techniques were tested; the results of the ... the success of radiocarbon dating stimulated interest in analytical and statistical approaches to archaeological data. Taylor ...
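The calibration step itself, converting a conventional radiocarbon age into a calendar age range using a calibration curve, can be sketched as a table lookup. The "curve" below is a toy table invented for illustration; real work would use a published curve such as IntCal and a proper probabilistic treatment.

```python
# Toy sketch of calibrating a radiocarbon age against a calibration curve.
# The curve is a made-up table (calendar year BP -> conventional 14C age BP).

curve = [  # (calendar_year_BP, radiocarbon_age_BP)
    (900, 1010), (910, 1005), (920, 990), (930, 975), (940, 970),
    (950, 955), (960, 950), (970, 945), (980, 930), (990, 920), (1000, 915),
]

measured_age = 950      # conventional radiocarbon age, years BP
uncertainty = 20        # one-sigma measurement error

# Calendar years whose curve value falls within +/- 2 sigma of the measurement:
matches = [cal for cal, c14 in curve
           if abs(c14 - measured_age) <= 2 * uncertainty]

print(f"calendar age range = {min(matches)}-{max(matches)} cal BP")
```

Wiggles such as the de Vries effects mentioned above are what make a single radiocarbon age map onto a range (sometimes several disjoint ranges) of calendar years.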
... accuracy, sensitivity) and for methods of calibration (Wadsö and Goldberg 2001). Modern IMC instruments are actually semi- ... Many modern IMC instrument designs stem from work done in Sweden in the late 1960s and early 1970s (Wadsö 1968, Suurkuusk & ... IMC is a powerful and versatile analytical tool for four closely related reasons: All chemical and physical processes are ... For operation in either hc or pc mode, routine calibration in commercial instruments is usually accomplished with built-in ...
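Since the passage above notes that routine calibration of commercial IMC instruments is usually done with built-in heaters, a minimal sketch of such an electrical (Joule) calibration is shown below. The heater power and raw signals are hypothetical.

```python
# Minimal sketch of an electrical (Joule) calibration of a heat-flow calorimeter.
# The heater power and baseline-corrected signals are hypothetical.

known_power_uW = 100.0                 # power dissipated by the built-in heater
raw_signal_during_cal = 2.53e-3        # detector output (e.g. volts), baseline-corrected

gain = known_power_uW / raw_signal_during_cal    # calibration constant, uW per volt

# Apply the constant to a later measurement on a sample:
raw_sample_signal = 4.1e-4
sample_heat_flow_uW = gain * raw_sample_signal
print(f"gain = {gain:.1f} uW/V, sample heat flow = {sample_heat_flow_uW:.2f} uW")
```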
In the 1970s, there was a claimed synthesis of a mercury(III) compound, but it is now thought to be false. Organic mercury ... which are traditionally used in optical spectroscopy for calibration of spectral position. Commercial calibration lamps are ... Mercury thermometers are still widely used for certain scientific applications because of their greater accuracy and working ... Analytical and Bioanalytical Chemistry. 383 (6): 1009-13. doi:10.1007/s00216-005-0069-7. PMID 16228199. "Merck's Manual 1899" ( ...
This model is able to capture, in analytical form, a range of shapes of hysteretic cycles which match the behaviour of a wide ... A more formal mathematical theory of systems with hysteresis was developed in the 1970s by a group of Russian mathematicians ... As of 2002, only desorption curves are usually measured during calibration of soil moisture sensors. Despite the fact ... Parkes, Martin (8 April 1999). "Subject: Accuracy of capacitance soil moisture .." SOWACS (Mailing list). Archived from the ...
"Central Analytical Laboratory. Archived from the original on 2010-06-15. Retrieved 2009-01-02.. ... Radar-derived rainfall estimates complement surface station data which can be used for calibration. To produce radar ... but its accuracy will depend on what ruler is used to measure the rain with. Any of the above rain gauges can be made at home, ... as well as an increase since the 1970s in the prevalence of droughts-especially in the tropics and subtropics. Changes in ...
Factors affecting the accuracy of various meters include calibration of the meter, ambient temperature, and the pressure used to wipe off the strip ... "Analytical Performance Requirements for Systems for Self-Monitoring of Blood Glucose With Focus on System Accuracy: Relevant ... It was used in American hospitals in the 1970s. A moving needle indicated the blood glucose after about a minute. Home glucose ... Accuracy of glucose meters is a common topic of clinical concern. Blood glucose meters must meet accuracy standards set by the ...
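Two common ways of summarizing meter accuracy against a laboratory reference are the mean absolute relative difference (MARD) and the fraction of readings within a fixed agreement bound. The paired values below are invented, and the ±15 mg/dL / ±15 % bound is only one commonly cited criterion (of the type used in ISO 15197:2013); the applicable standard should be consulted for the exact limits.

```python
# Sketch of summarizing glucose-meter accuracy against a laboratory reference.
# Paired readings are hypothetical.

reference = [85, 120, 160, 210, 60, 300, 140, 95]   # lab method, mg/dL
meter     = [90, 112, 171, 225, 52, 285, 150, 103]  # meter readings, mg/dL

abs_rel_diff = [abs(m - r) / r for m, r in zip(meter, reference)]
mard = 100 * sum(abs_rel_diff) / len(abs_rel_diff)

def within_bound(m, r):
    # +/-15 mg/dL below 100 mg/dL, +/-15 % at or above 100 mg/dL (assumed criterion)
    return abs(m - r) <= 15 if r < 100 else abs(m - r) <= 0.15 * r

agreement = 100 * sum(within_bound(m, r) for m, r in zip(meter, reference)) / len(reference)
print(f"MARD = {mard:.1f} %, readings within bound = {agreement:.0f} %")
```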
Analytical Transmission Scanning Electron Microscopy This project develops STEM-in-SEM methods or low-energy transmission ... Argon mini-arc meets its match: Use of a laser-driven plasma source in ultraviolet-detector calibrations The National Institute ... In the early 1970s, NIST researchers pioneered the near-field scanning technique, now the standard method for testing high- ... The accuracy to which these measurements can be accomplished requires minimization of the ...
Air-kerma Calibrations in Radiation Protection Level Cs-137 and Co-60 Gamma Ray Beams The Dosimetry Group maintains and ... Analytical Transmission Scanning Electron Microscopy This project develops STEM-in-SEM methods or low-energy transmission ... Alpha-gamma Counting for High Accuracy Fluence Measurement The Alpha-Gamma device is a totally-absorbing neutron detector that ... In the early 1970s, NIST researchers pioneered the near-field scanning technique, now the standard method for testing high- ...
... uses a software-based nomogram to estimate stroke volume without the need for lithium calibration. Accuracy of the latter has ... Pressure Recording Analytical Method (PRAM) (MostCare device, Vytech, Padua, Italy) is another device that estimates CO from the ... From its initial use in outpatient settings in the 1970s, echocardiography has found its place in all inpatient settings, ranging ... The latter system requires occasional set-up time and calibration. Data from either of these catheters can be used to derive ...
... namely wavelength calibration and absorbance/transmittance accuracy.. Instrument design. The current range of NIR spectrometers ... aBurgess Analytical Consultancy Limited, Rose Rae, The Lendings, Startforth, Barnard Castle, Co. Durham DL12 9AB, UK. bStarna ... since Technicon developed the original Infralyser range in the mid 1970s for compositional analysis of agricultural commodities ... the use of a solid artefact or standard cuvette for wavelength calibration for either transmittance or reflectance calibration ...
Sonic nozzles for gas calibrators, when simplicity provides accuracy! Calibration of gas analysers is a task required in many ... In the 1970s, the development of ambulatory infusion technology enabled the pumps to be used in research; ... Measurements in the trace range are performed and analytical device specifications need to be validated and ... ...
Analytical Laboratories. NHANES uses several methods to monitor the quality of the analyses performed by the contract ... Production of PCBs peaked in the early 1970s and was banned in the United States after 1979. Together with the PCDDs and PCDFs ... instrument calibration, reagents, and any special considerations are submitted to NCHS quarterly. The reports are reviewed for ... quality control and quality assurance performance criteria for accuracy and precision, similar to the Westgard rules (Caudill, ...
The pH sensor works in the range 6-8 pH units and it is characterised by an accuracy of 0.07 pH units and a response time of ... A calibration curve was determined, resulting in an easy-handling immobilization method with a cheap stable material. This ... Flow cytometry has been instrumental in rapid analysis of single cells since the 1970s. One of the common approaches is the ... The relative merits and pitfalls of each analytical method are discussed. Fluorescence intermittency and spectral shifts of ...
The technique of meter calibration is meter specific; some devices have automatic calibration, whereas others use lot-specific ... Analytical goals for meters and for patient-performed testing have been recently reported (53,54,55,56). If SMBG is prescribed ... By the mid-1970s it became clear that HbA1c resulted from a posttranslational modification of HbA by glucose and that there was ... Chan JC, Wong R, Cheung CK, Lam P, Chow CC, Yeung VT, Kan EC, Loo KM, Mong MY, Cockram CS: Accuracy, precision, and user- ...
The first major revolution occurred in the 1970s with the development of the MIRD models and calculational techniques. These ... One involves the limitations on spatial resolution and accuracy of activity quantification with nuclear medicine cameras. ... Small thermoluminescent dosimeters (TLDs) have been very useful in anthropomorphic phantoms in calibration of diagnostic and ... The use of point kernels and analytical methods are not as common currently as characterization of dose distributions using ...
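The MIRD schema referred to above computes the absorbed dose to a target region as the sum, over source regions, of cumulated activity multiplied by an S value. The sketch below uses that general formula with hypothetical residence times and S values; it does not reproduce any tabulated MIRD data.

```python
# Minimal sketch of the MIRD schema: absorbed dose to a target = sum over source
# organs of (cumulated activity) x (S value).  All numbers are hypothetical.

administered_MBq = 370.0

# residence time (h) in each source organ and S(target <- source) in mGy/(MBq*h)
sources = {
    "liver":   {"residence_h": 2.5, "S_mGy_per_MBq_h": 3.2e-3},
    "kidneys": {"residence_h": 0.8, "S_mGy_per_MBq_h": 1.1e-3},
}

dose_mGy = 0.0
for organ, d in sources.items():
    cumulated_activity = administered_MBq * d["residence_h"]   # MBq*h
    dose_mGy += cumulated_activity * d["S_mGy_per_MBq_h"]

print(f"absorbed dose to target = {dose_mGy:.1f} mGy")
```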
Journal of Analytical Methods in Chemistry is a peer-reviewed, Open Access journal that publishes original research articles as ... These samples were oven-dried at 105°C, homogenized, and passed through the extraction and analytical steps [29]. The accuracy ... Air pollution was not perceived as a major problem in most countries until the late 1960s and early 1970s; it was the global ... and calibrated using mixed calibration standard solutions prepared as mandated [14, 28]. Blank solutions were prepared without ...
Lack of analytical accuracy not only impacts current clinical care but also is holding back a better understanding of emerging ... "As is true for all methods, measurements based on mass spectrometry depend upon the accuracy of calibration, freedom from ... "Back in the 1970s, we used estradiol testing primarily in younger women for fertility management where measuring relatively ... it is using to assign target values to single donor serum materials that labs can use to assess the calibration and accuracy of ...
This instalment provides an overview for analytical-scale HPLC pumps, including their requirements, modern designs, operating ... The compositional accuracy of the pump can be verified by running step gradient profiles (for example, in 10% composition steps ... Syringe Pumps: Many early HPLC pumps in the 1960s and 1970s were syringe pumps (such as the Varian 8500 Dual Syringe Pump ... He holds a PhD in analytical chemistry from City University of New York. He has more than 100 publications and a best-selling ...
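The step-gradient check described above reduces to comparing each programmed composition with the composition actually delivered, inferred from plateau detector readings. The plateau values below are invented and simply scaled against the 100 % B plateau.

```python
# Sketch of evaluating pump compositional accuracy from a step-gradient test.
# Plateau readings come from a trace of a UV-absorbing tracer in solvent B;
# the numbers are hypothetical.

programmed_pct_B = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
plateau_signal   = [0.000, 0.101, 0.199, 0.302, 0.405, 0.498,
                    0.602, 0.697, 0.801, 0.902, 1.000]   # normalized, made up

full_scale = plateau_signal[-1]
for set_pct, signal in zip(programmed_pct_B, plateau_signal):
    delivered = 100 * signal / full_scale
    print(f"set {set_pct:3d}% B  delivered {delivered:5.1f}% B  error {delivered - set_pct:+.1f}%")
```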
The accuracy of glucometers may be considered as technical and clinical. Technical accuracy refers to the analytical result ... Calibration errors are also common for those meters which require calibration. History & Evolution of Continuous Glucose ... Evolution of CGM can be traced back to the mid-1970s followed by the development of sensor technology and implantable glucose ... Accuracy of CGM Sensors. Though sensor accuracy has improved over the years, the accuracy of sensors available for use in ...
... as well as to determine analytical accuracy and precision. Using these congeners and their respective ratios allows accurate ... 2008) were similar to those reported in the late 1970s (Gardner et al. 1978). 2011 K.F. Gaines, J.W. Summers, J.C. Cumbee, Jr ... Samples were quantified using a six-point calibration curve derived from dilutions of certified congener (Ultra Scientific) and ...
The calibration standard curve was prepared with seven points in rat plasma with drug concentrations ranging between 0.0097 and ... The precision and accuracy were less than 15% for the between-day variabilities, which were characterized at the 3 ... Analytical assays. (i) Colistin analysis in plasma. Determination of colistin concentrations in plasma was performed by a ... It is an old antibiotic that was abandoned in the 1970s due to its adverse effects (7). During the last 15 years, it has ...
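The between-day precision and accuracy figures mentioned above are typically computed as a coefficient of variation and a bias against the nominal QC concentration, each judged against a 15 % limit. The replicate values below are invented for illustration.

```python
# Sketch of between-day precision (%CV) and accuracy (%bias) at one QC level,
# with a 15 % acceptance limit.  Replicate values are hypothetical.
from statistics import mean, stdev

nominal = 0.50                                       # QC nominal concentration, ug/mL
between_day = [0.47, 0.52, 0.55, 0.49, 0.46, 0.53]   # one back-calculated result per day

cv_pct = 100 * stdev(between_day) / mean(between_day)          # precision
bias_pct = 100 * (mean(between_day) - nominal) / nominal       # accuracy

acceptable = abs(bias_pct) <= 15 and cv_pct <= 15
print(f"%CV = {cv_pct:.1f}, %bias = {bias_pct:+.1f}, acceptable: {acceptable}")
```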
... used to qualify absorbance accuracy can be used to qualify linearity provided they are compatible with the analytical ... Until the 1970s most laboratories used in-house prepared solutions or proprietary test materials to check the performance of ... The stability of the reference material is also very important, and the validity of the calibration should be stated on the CRM ... Control of wavelength accuracy. The user is required to:. "Control the wavelength accuracy of an appropriate number of bands in ...
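In practice, an absorbance-accuracy qualification of the kind described above amounts to comparing measured absorbances of a certified reference material against its certified values within a stated tolerance. The values and tolerance in the sketch below are hypothetical; the acceptance limits actually used should come from the CRM certificate and the applicable pharmacopoeial chapter.

```python
# Sketch of an absorbance-accuracy check against a certified reference material.
# Certified values, measured values, and the tolerance are all hypothetical.

certified = {235: 0.748, 257: 0.865, 313: 0.292, 350: 0.640}   # nm -> certified A
measured  = {235: 0.744, 257: 0.869, 313: 0.296, 350: 0.633}   # nm -> measured A
tolerance = 0.010                                              # absorbance units

for wavelength, a_cert in certified.items():
    a_meas = measured[wavelength]
    ok = abs(a_meas - a_cert) <= tolerance
    print(f"{wavelength} nm: certified {a_cert:.3f}, measured {a_meas:.3f}, "
          f"{'PASS' if ok else 'FAIL'}")
```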
The accuracy of the calibration curve was also checked with a 5 mg F⁻ L⁻¹ standard for the higher range calibration curve, and ... Analytical model of fluoride distribution in groundwater. Solute transport in a three-dimensional homogeneous aquifer was ... The first boreholes were installed in the 1970s but the year is not certain. Samples were collected for wet, dry, and shoulder ... A semi-analytical three-dimensional model (Domenico, 1987) of fluoride distribution in groundwater in Namoo was created using ...
Temperature-measuring instruments require calibration to confirm the accuracy of temperature measurements. Calibration of ... In the 1970s, the Wyoming Geological Survey (WGS) began a project to inventory all the thermal springs in the State of Wyoming ... inclusion of the uncertainty associated with the analytical technique clarifies the confidence of the interpretation. Analytic ... Calibration of temperature data loggers is an important component of the equipment's use. An easy calibration check is the ice- ...
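The ice-point check mentioned above can be reduced to reading the logger in a well-mixed ice-water bath and applying the observed offset as a correction. The readings below are invented, and a single-point offset is a simplification (no slope correction over the working range).

```python
# Sketch of an ice-point calibration check: read the logger in an ice-water bath
# (nominally 0.0 degC) and apply the offset as a correction.  Readings are hypothetical.
from statistics import mean

ice_bath_readings_C = [0.31, 0.29, 0.33, 0.30, 0.32]   # logger output in the ice bath
offset_C = mean(ice_bath_readings_C) - 0.0              # bias relative to 0 degC

def corrected(reading_C):
    """Apply the single-point offset correction to a field reading."""
    return reading_C - offset_C

print(f"offset = {offset_C:+.2f} degC, corrected 15.62 degC reading = {corrected(15.62):.2f} degC")
```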
In the late 1970s LC-based assays emerged that used refined extraction and chromatographic steps with some form of fixed- or ... accuracy, and variability) regularly over the past decade (Carter et al., 2004; Carter, 2009; Jones et al., 2009). These ... In the case of the Canadian survey of 25OHD levels in the Canadian population, the analytical methods used to determine 25OHD ... A standard reference material (SRM 972, Vitamin D in Human Serum) and calibration solution (SRM 2972, 25-Hydroxyvitamin D2 and ...
The researchers described the new calibration method and results in the July 15 issue of the journal Analytical Chemistry. In ... One of the most popular candidates for a unified theory is string theory, first developed in the late 1960s and early 1970s.. ... the researchers used DCC-calibrated Raman spectroscopy to significantly boost the accuracy of blood glucose measurements - an ... However, this calibration becomes more difficult immediately after the patient eats or drinks something sugary, because blood ...
  • This instalment provides an overview for analytical-scale HPLC pumps, including their requirements, modern designs, operating principles, trends, and best practices for trouble-free operation. (chromatographyonline.com)
  • Mixer volumes range from a few μL to 2 mL in analytical HPLC systems. (chromatographyonline.com)
  • A major change is that the scope is now extended to include high-performance liquid chromatography (HPLC) detectors and process analytical technology (PAT) as applications of ultraviolet/visible (UV/vis) spectrophotometry. (spectroscopyeurope.com)
  • Proper analytical screening including assay determination via High Performance Liquid Chromatography (HPLC), redox potential, pH measurement and general raw material compatibility testing helps in the development of an effective microbial protection program to inhibit or control the growth of microorganisms that cause biodeterioration. (coatingsworld.com)
  • 1966: HPLC was first named by Horvath at Yale University, but HPLC didn't catch on until the 1970s. 1978: W.C. Still introduced flash chromatography, where solvent is forced through a packed column with positive pressure. (scribd.com)
  • High Accuracy Spectrophotometry at the National Physical Laboratory (Teddington, Middlesex, UK). (wikipedia.org)
  • Last, having solved the above problems, the range of application areas now opening up to the technique also required the design of instruments of a more physically robust nature than usually found in an analytical laboratory, and by definition, portable in nature. (spectroscopyeurope.com)
  • 1994. Reported results met the Division of Laboratory Sciences' quality control and quality assurance performance criteria for accuracy and precision (similar to specifications outlined by Westgard, 1981). (cdc.gov)
  • This proposal seeks funds to build a tool for data collection for social scientists that would bring a new standard of accuracy to the data, broader accessibility for social scientists at reduced cost, blending the analytic power afforded by a state-of-the art computer network laboratory with the generalization power afforded by a representative sample survey of American households. (stanford.edu)
  • Published GLP regulations and guidelines have a significant impact on the daily operation of an analytical laboratory. (labcompliance.com)
  • The WHO criteria for AMI then established (and re-established since the 1970s) the laboratory requirements for CK-MB and detectable levels of troponin to correlate with clinical findings. (labmedicineblog.com)
  • Several non-invasive optical metrological, imaging and analytical techniques were applied to silver denarii of Diva Faustina from two different editions. (cosch.info)
  • The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age. (wikipedia.org); an interpolation sketch follows this list.
  • In industrial applications of polymer in water-based systems (e.g., coatings, adhesives, and sealants), analytical testing of the system enables a formulator to maximize biocide efficacy, thus protecting product integrity. (coatingsworld.com)
  • PCBs were widely used as dielectric fluids, plasticisers and adhesives from the 1930s to the 1970s (Backe et al. (springer.com)
  • In addition we can say more about how much of the observed tissue modulated spectra must be associated with a "modulation defect" resulting from various errors affecting the accuracy and precision in subtraction of static tissue contributions. (spie.org)
  • Nonetheless, precisions as high as 3 parts in 10¹³ have been achieved, well beyond the precision of any available calibration, and higher precision is expected in comparisons of antihydrogen and hydrogen. (sgms.ch)
  • Progress reports containing any problems encountered during shipping or receipt of specimens, summary statistics for each control pool, QC graphs, instrument calibration, reagents, and any special considerations are submitted to NCHS quarterly. (cdc.gov)
  • There is more detail on this topic as well as on a variety of other areas such as gas detection requirements and training, sensor technologies, and instrument calibration and maintenance. (labmate-online.com)
  • For the moisture content to be shown as a percentage, the instrument first needs to be calibrated. (docplayer.net); a two-point calibration sketch follows this list.
  • The current range of NIR spectrometers are radically different from "conventional" filter-based spectrometers, since Technicon developed the original Infralyser range in the mid 1970s for compositional analysis of agricultural commodities. (spectroscopyeurope.com)
  • The method is based on NIR spectroscopy and multivariate data analysis, with a reference collection of 199 historical canvas samples used for calibration and validation. (shroudstory.com); a generic multivariate-calibration sketch follows this list.
  • The as-synthesized QDs were characterized by various analytical tools such as ultraviolet-visible (UV-vis) absorption, photoluminescence (PL) spectroscopy, X-ray diffractometry (XRD) and transmission electron microscopy (TEM). (intechopen.com)
  • The analytical method for PCDDs/PCDFs/cPCB is described in Patterson et al. (cdc.gov)
  • The user is recommended to: "Define the measuring conditions to obtain a satisfactory signal-to-noise ratio and to select the scan range, scan rate and slit-width that provide the necessary optical resolution for the intended application without losing the required signal-to-noise ratio or the linearity of the analytical method." (spectroscopyeurope.com); a signal-to-noise estimation sketch follows this list.
  • A new analytical method was developed to non-destructively determine pH and degree of polymerisation (DP) of cellulose in fibres in 19th-20th century painting canvases, and to identify the fibre type: cotton, linen, hemp, ramie or jute. (shroudstory.com)
  • The ID/LC-MS/MS method has improved accuracy compared with immunoassay. (aaccjnls.org)
  • Another possibility involves the calibration of the passive sampler before its use, but this approach is time consuming and requires the use of an additional independent sampling method (for instance, an active air sampler). (rsc.org)
  • The diagnostic accuracy of total hormone measurements would equal that of free hormone if all patients had identical levels of binding proteins with similar affinities for the thyroid hormones. (nacb.org)
  • Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. (chemweb.com); a recovery-check sketch follows this list.
  • Gas analyser calibration is required in many applications under either legislation or quality-management systems. (environmental-expert.com)
  • With an emphasis on recent developments in remote sensing, the symposium program will focus on applications of satellite and other Earth observations to monitor, assess, and perform projections of future land and water resources, as well as big data and other analytical technologies to improve decision making using satellite data. (usgs.gov)
  • "Back in the 1970s, we used estradiol testing primarily in younger women for fertility management where measuring relatively high levels was the target," said Sluss. (aacc.org)
  • However, back in the 1970s evidence started accumulating on their environmental persistency and on their toxicity as human carcinogens. (rsc.org)
  • Data from Khirbat en-Nahas, and the nearby site of Rujm Hamra Ifdan, demonstrate the centrality of industrial-scale metal production during those centuries traditionally linked closely to political events in Edom's 10th century BCE neighbor ancient Israel. (pnas.org)
  • Specifically, the proposal seeks funds for three categories of expenses: (1) purchase of computer hardware, (2) installation of the computer hardware in a representative sample of 1,000 households across the country, and (3) calibration of a national network of these computers via the Internet, testing to assure that it works properly to permit social science data collection. (stanford.edu)
  • Then, once a month, respondents would provide a new round of data by accessing a secure webpage, and calibration of the computer network would be conducted. (stanford.edu)
  • "A systematic error in mass flow calorimetry demonstrated" (2002) , in which he reanalyzes the calorimetry results described this paper by Edmund Storms and suggests that a non-nuclear explanation he calls "Calibration Constant Shift" (CCS, Section 3 below) is compatible with the data. (coldfusionblog.net)
  • In the 1970s many of these techniques began to be used together as hybrid techniques to achieve a complete characterization of samples. (wikipedia.org)
  • However, the ABRWH and its contractor, SC&A, caution the reader that at the time of its release, this report is pre-decisional and has not been reviewed by the Board for factual accuracy or applicability within the requirements of 42 CFR 82. (cdc.gov)
  • This paper addresses some key areas where analytical testing can provide critical information to ensure the optimal protection of polymer dispersions. (coatingsworld.com)
  • In the 1970s, Burkitt proposed the hypothesis that dietary fibre reduces the risk of colorectal cancer, based on the observation of low rates of such cancer among rural Africans who ate a diet with a high fibre content. (bmj.com)
  • We validated the chronology by comparing it to independent high-accuracy, absolutely dated chronologies. (clim-past.net)
  • Discovering a chemical present in blood that increases the risk of cancer is the kind of finding an analytical chemist might be involved in. (wikipedia.org)
  • Regulatory interest has led to the promulgation of monographs for the specification, calibration and control of NIR spectrometers in both the United States5 and European Pharmacopoeias.6 Unfortunately these monographs have not been harmonised and there are significant differences. (spectroscopyeurope.com)
  • Due to the toxic effects of organochlorines in aquatic organisms, the use and/or sale of most organochlorine pesticides has been banned or restricted in many developed countries such as the United States of America and Sweden since the mid-1970s (Tanabe et al. (scielo.org.za)
  • With the discovery of DBPs in the mid-1970s, formed when halogenated (mainly chlorine-based) chemicals incompletely oxidize certain organics, their removal became a priority for regulators and utilities. (wateronline.com)
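Several of the items above lend themselves to short worked examples; the sketches that follow are illustrative only. First, the radiocarbon calibration-curve item: converting a measured radiocarbon age to a calendar age amounts to looking the measurement up on the curve. The sketch below uses simple linear interpolation and placeholder curve points rather than values from IntCal or any published curve; real calibration software also propagates measurement uncertainty.

```python
# Minimal sketch: interpolate a calendar age from a radiocarbon age using a
# calibration curve. The curve points below are placeholders, not real data.
import numpy as np

curve_14c_age = np.array([2400.0, 2450.0, 2500.0, 2550.0, 2600.0])   # radiocarbon years BP
curve_cal_age = np.array([2350.0, 2430.0, 2590.0, 2700.0, 2750.0])   # calendar years cal BP

def calibrate(radiocarbon_age_bp):
    """Return an interpolated calendar age (cal BP) for a measured radiocarbon age (BP)."""
    return float(np.interp(radiocarbon_age_bp, curve_14c_age, curve_cal_age))

print(calibrate(2520.0))  # ~2634 cal BP with the placeholder curve above
```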
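For the moisture-meter item, the calibration that converts a raw reading into a percentage can be as simple as a two-point linear fit against reference samples of known (for example, oven-dried) moisture content. The readings and reference values below are invented for illustration.

```python
# Minimal sketch: two-point linear calibration mapping raw readings to % moisture.
def two_point_calibration(raw_lo, ref_lo, raw_hi, ref_hi):
    """Return a function converting a raw instrument reading to % moisture."""
    slope = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    intercept = ref_lo - slope * raw_lo
    return lambda raw: slope * raw + intercept

# Calibrate against two reference samples of known moisture content (illustrative values)
to_percent = two_point_calibration(raw_lo=120, ref_lo=8.0, raw_hi=310, ref_hi=22.0)
print(f"{to_percent(200):.1f} % moisture")  # about 13.9 % with these numbers
```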
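The canvas-dating item pairs NIR spectra with multivariate calibration against a reference collection. The sketch below shows the general shape of such a workflow using partial least squares regression on synthetic data; it is not the published method, and the spectra, reference values and component count are all placeholder assumptions.

```python
# Minimal sketch of multivariate calibration/validation with PLS regression.
# All data here are synthetic; a real study would use measured NIR spectra and
# reference values (e.g. pH or degree of polymerisation) and would choose the
# number of latent variables by cross-validation.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(199, 600))        # 199 samples x 600 wavelength points (synthetic)
y = rng.uniform(4.0, 8.0, size=199)    # synthetic reference values

# Split into calibration and validation sets
X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)    # component count chosen arbitrarily here
pls.fit(X_cal, y_cal)

rmsep = float(np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2)))
print(f"RMSEP on the validation set: {rmsep:.2f}")
```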
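The spectrophotometry recommendation above hinges on achieving an adequate signal-to-noise ratio. One common, simple estimate compares peak height with the standard deviation of a signal-free baseline region; the sketch below uses a synthetic spectrum and arbitrary index ranges, so it illustrates the calculation rather than any particular instrument's procedure.

```python
# Minimal sketch: peak-height-over-baseline-noise SNR estimate for a 1-D spectrum.
import numpy as np

def signal_to_noise(spectrum, peak_index, baseline_slice):
    """Estimate SNR as (peak height above baseline) / (baseline standard deviation)."""
    baseline = spectrum[baseline_slice]
    return (spectrum[peak_index] - np.mean(baseline)) / np.std(baseline)

# Synthetic spectrum: one Gaussian peak on a noisy, flat baseline
x = np.arange(500)
spectrum = np.exp(-0.5 * ((x - 250) / 10.0) ** 2)
spectrum += np.random.default_rng(1).normal(0.0, 0.02, x.size)
print(f"SNR ~ {signal_to_noise(spectrum, peak_index=250, baseline_slice=slice(0, 100)):.0f}")
```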
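The soil-analysis item applies an 85-115 % recovery acceptance window. Checking an analyte against that window is a one-line calculation; the measured and expected values below are illustrative, not the study's data.

```python
# Minimal sketch: flag analytes whose spike recovery falls outside 85-115 %.
def percent_recovery(measured, expected):
    return 100.0 * measured / expected

def acceptable(recovery, low=85.0, high=115.0):
    return low <= recovery <= high

# Illustrative measured/expected concentration pairs
for element, (measured, expected) in {"Cr": (77.0, 100.0), "Zn": (98.0, 100.0)}.items():
    r = percent_recovery(measured, expected)
    print(f"{element}: {r:.0f} % recovery, acceptable: {acceptable(r)}")
```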
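Finally, the calorimetry item: the "Calibration Constant Shift" argument is that if the cell's calibration constant drifts after calibration but the original value is still used, the computed output power no longer matches the input power and a spurious "excess" appears. The sketch below is a deliberately simplified lumped model (output power proportional to a temperature difference) with invented numbers; it is not Storms' or Shanahan's actual mass-flow calorimetry analysis, only an illustration of how sensitive the inferred excess is to a small shift in the constant.

```python
# Simplified illustration (assumed lumped model, invented numbers): a ~1.5 % gap
# between the calibration constant used and the true constant produces an
# apparent excess power even when the real excess is zero.
def inferred_and_actual_excess(delta_t, p_in, k_used, k_true):
    """Excess power (W) computed with the stale constant vs. the true constant."""
    p_out_inferred = k_used * delta_t   # what the analyst calculates
    p_out_actual = k_true * delta_t     # what the cell actually dissipates
    return p_out_inferred - p_in, p_out_actual - p_in

inferred, actual = inferred_and_actual_excess(delta_t=5.0, p_in=10.0, k_used=2.03, k_true=2.00)
print(f"inferred excess: {inferred:+.2f} W, actual excess: {actual:+.2f} W")
```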