Determination, by measurement or comparison with a standard, of the correct value of each scale reading on a meter or other measuring instrument; or determination of the settings of a control device that correspond to particular values of voltage, current, frequency or other output.
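As a rough illustration of the calibration idea described above, the sketch below fits a straight-line calibration curve to hypothetical readings taken against known standards and then inverts that line to convert a new meter reading into an estimated true value. The standard values, readings, and function names are illustrative assumptions, not drawn from any particular instrument.

```python
# Minimal calibration-curve sketch (hypothetical data; assumes a roughly linear response).
# Requires Python 3.10+ for statistics.linear_regression.
import statistics


def fit_calibration(known_values, readings):
    """Fit reading = slope * value + intercept against known standards."""
    slope, intercept = statistics.linear_regression(known_values, readings)
    return slope, intercept


def reading_to_value(reading, slope, intercept):
    """Invert the calibration line to estimate the true value behind a reading."""
    return (reading - intercept) / slope


standards = [0.0, 5.0, 10.0, 20.0]      # known concentrations of reference standards
readings = [0.12, 5.30, 10.10, 20.40]   # what the meter actually displayed
slope, intercept = fit_calibration(standards, readings)
print(reading_to_value(12.7, slope, intercept))  # estimated value for a new reading
```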
The statistical reproducibility of measurements (often in a clinical context), including the testing of instrumentation or techniques to obtain reproducible results. The concept includes reproducibility of physiological measurements, which may be used to develop rules to assess probability or prognosis, or response to a stimulus; reproducibility of occurrence of a condition; and reproducibility of experimental results.
Binary classification measures to assess test results. Sensitivity (or recall) is the proportion of people with the condition who test positive. Specificity is the probability of correctly determining the absence of a condition. (From Last, Dictionary of Epidemiology, 2d ed)
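To make the sensitivity and specificity definitions concrete, here is a minimal sketch that computes both from a hypothetical 2x2 table of test results; the counts are made up for illustration.

```python
# Sensitivity and specificity from hypothetical true/false positive/negative counts.
def sensitivity(true_positives, false_negatives):
    # proportion of people with the condition whom the test flags positive
    return true_positives / (true_positives + false_negatives)


def specificity(true_negatives, false_positives):
    # proportion of people without the condition whom the test flags negative
    return true_negatives / (true_negatives + false_positives)


print(sensitivity(true_positives=90, false_negatives=10))   # 0.9
print(specificity(true_negatives=160, false_positives=40))  # 0.8
```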
A basis of value established for the measure of quantity, weight, extent or quality, e.g. weight standards, standard solutions, methods, techniques, and procedures used in diagnosis and therapy.
A procedure consisting of a sequence of algebraic formulas and/or logical steps used to solve a problem or carry out a given task.
A system for verifying and maintaining a desired level of quality in a product or process by careful planning, use of proper equipment, continued inspection, and corrective action as required. (Random House Unabridged Dictionary, 2d ed)
Methods of creating machines and devices.
Devices or objects in various imaging techniques used to visualize or enhance visualization by simulating conditions encountered in the procedure. Phantoms are used very often in procedures employing or measuring x-irradiation or radioactive material to evaluate performance. Phantoms often have properties similar to human tissue. Water demonstrates absorbing properties similar to normal tissue, hence water-filled phantoms are used to map radiation levels. Phantoms are used also as teaching aids to simulate real conditions with x-ray or ultrasonic machines. (From Iturralde, Dictionary and Handbook of Nuclear Medicine and Clinical Imaging, 1990)
A graphic means for assessing the ability of a screening test to discriminate between healthy and diseased persons; may also be used in other studies, e.g., distinguishing responses to a faint stimulus from responses when no stimulus is present.
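A short sketch of how ROC points can be computed from a hypothetical continuous test score: the decision threshold is swept over the observed scores and the true positive rate is paired with the false positive rate. Scores and labels below are invented for illustration.

```python
# ROC points for a hypothetical score; labels: 1 = diseased, 0 = healthy.
def roc_points(scores, labels):
    positives = sum(labels)
    negatives = len(labels) - positives
    points = []
    for threshold in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
        points.append((fp / negatives, tp / positives))  # (false positive rate, true positive rate)
    return points


scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0, 0]
print(roc_points(scores, labels))
```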
Statistical formulations or analyses which, when applied to data and found to fit the data, are then used to verify the assumptions and parameters used in the analysis. Examples of statistical models are the linear model, binomial model, polynomial model, two-parameter model, etc.
The evaluation of incidents involving the loss of function of a device. These evaluations are used for a variety of purposes such as to determine the failure rates, the causes of failures, costs of failures, and the reliability and maintainability of devices.
In screening and diagnostic tests, the probability that a person with a positive test is a true positive (i.e., has the disease), is referred to as the predictive value of a positive test; whereas, the predictive value of a negative test is the probability that the person with a negative test does not have the disease. Predictive value is related to the sensitivity and specificity of the test.
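The relation of predictive values to sensitivity, specificity, and disease prevalence can be shown with a small sketch; the numbers are hypothetical and the helper name is ours.

```python
# Positive and negative predictive values from sensitivity, specificity, and prevalence.
def predictive_values(sens, spec, prevalence):
    tp = sens * prevalence              # diseased and test-positive
    fn = (1 - sens) * prevalence        # diseased but test-negative
    tn = spec * (1 - prevalence)        # healthy and test-negative
    fp = (1 - spec) * (1 - prevalence)  # healthy but test-positive
    ppv = tp / (tp + fp)  # probability of disease given a positive test
    npv = tn / (tn + fn)  # probability of no disease given a negative test
    return ppv, npv


# With 2% prevalence, even a good test yields a modest positive predictive value.
print(predictive_values(sens=0.90, spec=0.95, prevalence=0.02))
```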
Concentration or quantity that is derived from the smallest measure that can be detected with reasonable certainty for a given analytical procedure.
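One widely used (though not universal) convention estimates the detection limit as the mean blank signal plus three standard deviations of the blank; the sketch below applies that convention to hypothetical blank readings.

```python
# Detection limit as blank mean + 3 standard deviations (one common convention; data hypothetical).
import statistics

blank_readings = [0.011, 0.013, 0.010, 0.012, 0.014, 0.011]
detection_limit = statistics.mean(blank_readings) + 3 * statistics.stdev(blank_readings)
print(f"estimated detection limit: {detection_limit:.4f} signal units")
```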
Liquid chromatographic techniques which feature high inlet pressures, high sensitivity, and high speed.
Computer-based representation of physical systems and phenomena such as chemical processes.
Remains, impressions, or traces of animals or plants of past geological times which have been preserved in the earth's crust.
Method of analyzing chemicals using automation.
Any device or element which converts an input signal into an output signal of a different form. Examples include the microphone, phonographic pickup, loudspeaker, barometer, photoelectric cell, automobile horn, doorbell, and underwater sound transducer. (McGraw Hill Dictionary of Scientific and Technical Terms, 4th ed)
Theoretical representations that simulate the behavior or activity of systems, processes, or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.
A technique of inputting two-dimensional images into a computer and then enhancing or analyzing the imagery into a form that is more useful to the human observer.
The analysis of a chemical substance by inserting a sample into a carrier stream of reagent using a sample injection valve that propels the sample downstream where mixing occurs in a coiled tube, then passes into a flow-through detector and a recorder or other data handling device.
Sequential operating programs and data which instruct the functioning of a digital computer.
Methodologies used for the isolation, identification, detection, and quantitation of chemical substances.
Substances used for the detection, identification, analysis, etc. of chemical, biological, or pathologic processes or conditions. Indicators are substances that change in physical appearance, e.g., color, at or approaching the endpoint of a chemical titration, e.g., on the passage between acidity and alkalinity. Reagents are substances used for the detection or determination of another substance by chemical or microscopical means, especially analysis. Types of reagents are precipitants, solvents, oxidizers, reducers, fluxes, and colorimetric reagents. (From Grant & Hackh's Chemical Dictionary, 5th ed, p301, p499)
A principle of estimation in which the estimates of a set of parameters in a statistical model are those quantities minimizing the sum of squared differences between the observed values of a dependent variable and the values predicted by the model.
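The least-squares principle for a straight line has a closed-form solution; the sketch below writes it out directly from the definitions rather than calling a library, using made-up data.

```python
# Ordinary least squares for y = slope * x + intercept, from the closed-form normal equations.
def least_squares_line(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept


print(least_squares_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 8.0]))
```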
A noninvasive technique that uses the differential absorption properties of hemoglobin and myoglobin to evaluate tissue oxygenation and indirectly can measure regional hemodynamics and blood flow. Near-infrared light (NIR) can propagate through tissues and at particular wavelengths is differentially absorbed by oxygenated vs. deoxygenated forms of hemoglobin and myoglobin. Illumination of intact tissue with NIR allows qualitative assessment of changes in the tissue concentration of these molecules. The analysis is also used to determine body composition.
Measuring and weighing systems and processes.
The measurement of radiation by photography, as in x-ray film and film badge, by Geiger-Mueller tube, and by SCINTILLATION COUNTING.
Any visible result of a procedure which is caused by the procedure itself and not by the entity being analyzed. Common examples include histological structures introduced by tissue processing, radiographic images of structures that are not naturally present in living tissue, and products of chemical reactions that occur during analysis.
A theorem in probability theory named for Thomas Bayes (1702-1761). In epidemiology, it is used to obtain the probability of disease in a group of people with some characteristic on the basis of the overall rate of that disease and of the likelihood of that characteristic in healthy and diseased individuals. The most familiar application is in clinical decision analysis where it is used for estimating the probability of a particular diagnosis given the appearance of some symptoms or test result.
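A minimal sketch of Bayes' theorem in the clinical form described above: the probability of disease given a positive finding is computed from the overall disease rate and the finding's frequency in diseased versus healthy people. All numbers are illustrative.

```python
# Bayes' theorem for post-test probability of disease (illustrative probabilities).
def posterior_probability(prior, p_finding_given_disease, p_finding_given_healthy):
    numerator = p_finding_given_disease * prior
    denominator = numerator + p_finding_given_healthy * (1 - prior)
    return numerator / denominator


print(posterior_probability(prior=0.01,
                            p_finding_given_disease=0.95,
                            p_finding_given_healthy=0.05))  # about 0.16
```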
Elements of limited time intervals, contributing to particular results or situations.
The process of generating three-dimensional images by electronic, photographic, or other methods. For example, three-dimensional images can be generated by assembling multiple tomographic images with the aid of a computer, while photographic 3-D images (HOLOGRAPHY) can be made by exposing film to the interference pattern created when two laser light sources shine on an object.
The failure by the observer to measure or identify a phenomenon accurately, which results in an error. Sources for this may be due to the observer's missing an abnormality, or to faulty technique resulting in incorrect test measurement, or to misinterpretation of the data. Two varieties are inter-observer variation (the amount observers vary from one another when reporting on the same material) and intra-observer variation (the amount one observer varies between observations when reporting more than once on the same material).
Improvement in the quality of an x-ray image by use of an intensifying screen, tube, or filter and by optimum exposure techniques. Digital processing methods are often employed.
Graphical representation of a statistical model containing scales for calculating the prognostic weight of a value for each individual variable. Nomograms are instruments that can be used to predict outcomes using specific clinical parameters. They use ALGORITHMS that incorporate several variables to calculate the predicted probability that a patient will achieve a particular clinical endpoint.
Chromatographic techniques in which the mobile phase is a liquid.
Methods developed to aid in the interpretation of ultrasound, radiographic images, etc., for diagnosis of disease.
The use of electronic equipment to observe or record physiologic processes while the patient undergoes normal daily activities.
A mass spectrometry technique using two (MS/MS) or more mass analyzers. With two in tandem, the precursor ions are mass-selected by a first mass analyzer, and focused into a collision region where they are then fragmented into product ions which are then characterized by a second mass analyzer. A variety of techniques are used to separate the compounds, ionize them, and introduce them to the first mass analyzer. For example, in GC-MS/MS, GAS CHROMATOGRAPHY-MASS SPECTROMETRY is involved in separating relatively small compounds by GAS CHROMATOGRAPHY prior to injecting them into an ionization chamber for the mass selection.
Any of a variety of procedures which use biomolecular probes to measure the presence or concentration of biological molecules, biological structures, microorganisms, etc., by translating a biochemical interaction at the probe surface into a quantifiable physical signal.
A microanalytical technique combining mass spectrometry and gas chromatography for the qualitative as well as quantitative determinations of compounds.
Statistical models in which the value of a parameter for a given value of a factor is assumed to be equal to a + bx, where a and b are constants. The models predict a linear regression.
Lack of correspondence between the way a stimulus is commonly perceived and the way an individual perceives it under given conditions.
Application of statistical procedures to analyze specific observed or assumed facts from a particular study.
A technique using antibodies for identifying or quantifying a substance. Usually the substance being studied serves as antigen both in antibody production and in measurement of antibody by the test substance.
Electric conductors through which electric currents enter or leave a medium, whether it be an electrolytic solution, solid, molten mass, gas, or vacuum.
Improvement of the quality of a picture by various techniques, including computer processing, digital filtering, echocardiographic techniques, light and ultrastructural MICROSCOPY, fluorescence spectrometry and microscopy, scintigraphy, and in vitro image processing at the molecular level.
Studies determining the effectiveness or value of processes, personnel, and equipment, or the material on conducting such studies. For drugs and devices, CLINICAL TRIALS AS TOPIC; DRUG EVALUATION; and DRUG EVALUATION, PRECLINICAL are available.
Theoretical representations that simulate the behavior or activity of biological processes or diseases. For disease models in living animals, DISEASE MODELS, ANIMAL is available. Biological models include the use of mathematical equations, computers, and other electronic equipment.
A mass spectrometry technique used for analysis of nonvolatile compounds such as proteins and macromolecules. The technique involves preparing electrically charged droplets from analyte molecules dissolved in solvent. The electrically charged droplets enter a vacuum chamber where the solvent is evaporated. Evaporation of solvent reduces the droplet size, thereby increasing the coulombic repulsion within the droplet. As the charged droplets get smaller, the excess charge within them causes them to disintegrate and release analyte molecules. The volatilized analyte molecules are then analyzed by mass spectrometry.
Procedures for finding the mathematical function which best describes the relationship between a dependent variable and one or more independent variables. In linear regression (see LINEAR MODELS) the relationship is constrained to be a straight line and LEAST-SQUARES ANALYSIS is used to determine the best fit. In logistic regression (see LOGISTIC MODELS) the dependent variable is qualitative rather than continuously variable and LIKELIHOOD FUNCTIONS are used to find the best relationship. In multiple regression, the dependent variable is considered to depend on more than a single independent variable.
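As a sketch of multiple regression (the last case mentioned above), the example below fits a dependent variable on two independent variables by least squares using NumPy; the data are fabricated for illustration.

```python
# Multiple linear regression with two independent variables via least squares.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([0.5, 0.4, 0.8, 0.9, 1.2])
y = np.array([2.3, 3.1, 4.9, 5.8, 7.4])

# Design matrix: a column of ones (intercept) plus the two predictors.
X = np.column_stack([np.ones_like(x1), x1, x2])
coefficients, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coefficients)  # [intercept, coefficient of x1, coefficient of x2]
```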
Controlled operation of an apparatus, process, or system by mechanical or electronic devices that take the place of human organs of observation, effort, and decision. (From Webster's Collegiate Dictionary, 1993)
A statistical analytic technique used with discrete dependent variables, concerned with separating sets of observed values and allocating new values. It is sometimes used instead of regression analysis.
Behavior of LIGHT and its interactions with itself and materials.
An analytical method used in determining the identity of a chemical based on its mass using mass analyzers/mass spectrometers.
Tomography using x-ray transmission and a computer algorithm to reconstruct the image.
Non-invasive method of demonstrating internal anatomy based on the principle that atomic nuclei in a strong magnetic field absorb pulses of radiofrequency energy and emit them as radiowaves which can be reconstructed into computerized images. The concept includes proton spin tomographic techniques.
The range or frequency distribution of a measurement in a population (of organisms, organs or things) that has not been selected for the presence of disease or abnormality.
Laboratory and other services provided to patients at the bedside. These include diagnostic and laboratory testing using automated information entry.
In INFORMATION RETRIEVAL, machine-sensing or identification of visible patterns (shapes, forms, and configurations). (Harrod's Librarians' Glossary, 7th ed)
Observation of a population for a sufficient number of persons over a sufficient number of years to generate incidence or mortality rates subsequent to the selection of the study group.
Computer systems or networks designed to provide radiographic interpretive information.
An extraction method that separates analytes using a solid phase and a liquid phase. It is used for preparative sample cleanup before analysis by CHROMATOGRAPHY and other analytical methods.
Materials used as reference points for imaging studies.
Computer-assisted processing of electric, ultrasonic, or electronic signals to interpret function and activity.
In statistics, a technique for numerically approximating the solution of a mathematical problem by studying the distribution of some random variable, often generated by a computer. The name alludes to the randomness characteristic of the games of chance played at the gambling casinos in Monte Carlo. (From Random House Unabridged Dictionary, 2d ed, 1993)
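A classic toy example of the Monte Carlo method is estimating pi by random sampling, shown in the sketch below; the sample count and seed are arbitrary.

```python
# Monte Carlo estimate of pi: fraction of random points falling inside a quarter circle.
import random


def estimate_pi(n_samples=100_000, seed=0):
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples


print(estimate_pi())  # close to 3.14159 for large sample counts
```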
The chemical and physical integrity of a pharmaceutical product.
Determination of the spectra of ultraviolet absorption by specific molecules in gases or liquids, for example Cl2, SO2, NO2, CS2, ozone, mercury vapor, and various unsaturated compounds. (McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed)
Application of computer programs designed to assist the physician in solving a diagnostic problem.
Drugs intended for human or veterinary use, presented in their finished dosage form. Included here are materials used in the preparation and/or formulation of the finished dosage form.
Commercially prepared reagent sets, with accessory devices, containing all of the major components and literature necessary to perform one or more designated diagnostic tests or procedures. They may be for laboratory or personal use.
Any deviation of results or inferences from the truth, or processes leading to such deviation. Bias can result from several sources: one-sided or systematic variations in measurement from the true value (systematic error); flaws in study design; deviation of inferences, interpretations, or analyses based on flawed data or data collection; etc. There is no sense of prejudice or subjectivity implied in the assessment of bias under these conditions.
Measuring instruments for determining the temperature of matter. Most thermometers used in the field of medicine are designed for measuring body temperature or for use in the clinical laboratory. (From UMDNS, 1999)
An increase in the rate of speed.
Method of tissue preparation in which the tissue specimen is frozen and then dehydrated at low temperature in a high vacuum. This method is also used for dehydrating pharmaceutical and food products.
Learning algorithms which are a set of related supervised computer learning methods that analyze data and recognize patterns, and are used for classification and regression analysis.
Self evaluation of whole blood glucose levels outside the clinical laboratory. A digital or battery-operated reflectance meter may be used. It has wide application in controlling unstable insulin-dependent diabetes.
Making measurements by the use of stereoscopic photographs.
A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.
Functions constructed from a statistical model and a set of observed data which give the probability of that data for various values of the unknown model parameters. Those parameter values that maximize the probability are the maximum likelihood estimates of the parameters.
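A small sketch of a likelihood function and its maximization: for a hypothetical binomial observation of 7 positives in 20 trials, the likelihood is evaluated over a grid of candidate proportions and the maximizing value (7/20 = 0.35) is the maximum likelihood estimate.

```python
# Likelihood of a binomial observation as a function of the unknown proportion p.
from math import comb


def binomial_likelihood(p, successes=7, trials=20):
    return comb(trials, successes) * p ** successes * (1 - p) ** (trials - successes)


candidates = [i / 100 for i in range(1, 100)]
mle = max(candidates, key=binomial_likelihood)
print(mle)  # 0.35, the maximum likelihood estimate
```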
The condition in which reasonable knowledge regarding risks, benefits, or the future is not available.
The specialty of ANALYTIC CHEMISTRY applied to assays of physiologically important substances found in blood, urine, tissues, and other biological fluids for the purpose of aiding the physician in making a diagnosis or following therapy.
The continuous measurement of physiological processes, blood pressure, heart rate, renal output, reflexes, respiration, etc., in a patient or experimental animal; includes pharmacologic monitoring, the measurement of administered drugs or their metabolites in the blood, tissues, or urine.
X-ray image-detecting devices that make a focused image of body structures lying in a predetermined plane from which more complex images are computed.
Instrumentation consisting of hardware and software that communicates with the BRAIN. The hardware component of the interface records brain signals, while the software component analyzes the signals and converts them into a command that controls a device or sends a feedback signal to the brain.
Thin strands of transparent material, usually glass, that are used for transmitting light waves over long distances.
The qualitative or quantitative estimation of the likelihood of adverse effects that may result from exposure to specified health hazards or from the absence of beneficial influences. (Last, Dictionary of Epidemiology, 1988)
Chemical analysis based on the phenomenon whereby light, passing through a medium with dispersed particles of a different refractive index from that of the medium, is attenuated in intensity by scattering. In turbidimetry, the intensity of light transmitted through the medium, the unscattered light, is measured. In nephelometry, the intensity of the scattered light is measured, usually, but not necessarily, at right angles to the incident light beam.
A field of biology concerned with the development of techniques for the collection and manipulation of biological data, and the use of such data to make biological discoveries or predictions. This field encompasses all computational methods and theories for solving biological problems including manipulation of models and datasets.
An examination of chemicals in the blood.
Clotting time of PLASMA recalcified in the presence of excess TISSUE THROMBOPLASTIN. Factors measured are FIBRINOGEN; PROTHROMBIN; FACTOR V; FACTOR VII; and FACTOR X. It is used for monitoring anticoagulant therapy with COUMARINS.
The visual display of data in a man-machine system. An example is when data is called from the computer and transmitted to a CATHODE RAY TUBE DISPLAY or LIQUID CRYSTAL display.
Computed tomography modalities which use a cone or pyramid-shaped beam of radiation.
The development and use of techniques and equipment to study or perform chemical reactions, with small quantities of materials, frequently less than a milligram or a milliliter.
Method of making images on a sensitized surface by exposure to light or other radiant energy.
Methods for assessing flow through a system by injection of a known quantity of an indicator, such as a dye, radionuclide, or chilled liquid, into the system and monitoring its concentration over time at a specific point in the system. (From Dorland, 28th ed)
Laboratory tests demonstrating the presence of physiologically significant substances in the blood, urine, tissue, and body fluids with application to the diagnosis or therapy of disease.
Use of a device (film badge) for measuring exposure of individuals to radiation. It is usually made of metal, plastic, or paper and loaded with one or more pieces of x-ray film.
The relationships of groups of organisms as reflected by their genetic makeup.
A statistical means of summarizing information from a series of measurements on one individual. It is frequently used in clinical pharmacology where the AUC from serum levels can be interpreted as the total uptake of whatever has been administered. As a plot of the concentration of a drug against time, after a single dose of medicine, producing a standard shape curve, it is a means of comparing the bioavailability of the same drug made by different companies. (From Winslade, Dictionary of Clinical Research, 1992)
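The sketch below computes an area under the curve by the trapezoidal rule for a hypothetical concentration-time series after a single dose; the times and concentrations are invented.

```python
# Area under a concentration-time curve (AUC) by the trapezoidal rule.
def auc_trapezoid(times, concentrations):
    return sum(
        (t2 - t1) * (c1 + c2) / 2
        for t1, t2, c1, c2 in zip(times, times[1:], concentrations, concentrations[1:])
    )


times = [0, 1, 2, 4, 8, 12]             # hours after dosing
conc = [0.0, 8.2, 6.1, 3.4, 1.2, 0.4]   # drug concentration, arbitrary units
print(auc_trapezoid(times, conc))       # total exposure in concentration x hours
```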
A barbiturate that is used as a sedative. Secobarbital is reported to have no anti-anxiety activity.
Incorrect diagnoses after clinical examination or technical diagnostic procedures.
Positive test results in subjects who do not possess the attribute for which the test is conducted. The labeling of healthy persons as diseased when screening in the detection of disease. (Last, A Dictionary of Epidemiology, 2d ed)
The narrow passage way that conducts the sound collected by the EAR AURICLE to the TYMPANIC MEMBRANE.
Theoretical representations that simulate the behavior or activity of genetic processes or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.
A specialized field of physics and engineering involved in studying the behavior and properties of light and the technology of analyzing, generating, transmitting, and manipulating ELECTROMAGNETIC RADIATION in the visible, infrared, and ultraviolet range.
Agents that emit light after excitation by light. The wave length of the emitted light is usually longer than that of the incident light. Fluorochromes are substances that cause fluorescence in other substances, i.e., dyes used to mark or label other compounds with fluorescent tags.
The branch of physics that deals with sound and sound waves. In medicine it is often applied in procedures in speech and hearing studies. With regard to the environment, it refers to the characteristics of a room, auditorium, theatre, building, etc. that determines the audibility or fidelity of sounds in it. (From Random House Unabridged Dictionary, 2d ed)
A rare metal element with a blue-gray appearance and atomic symbol Ge, atomic number 32, and atomic weight 72.63.
Surgical procedures conducted with the aid of computers. This is most frequently used in orthopedic and laparoscopic surgery for implant placement and instrument guidance. Image-guided surgery interactively combines prior CT scans or MRI images with real-time video.
Narrow pieces of material impregnated or covered with a substance used to produce a chemical reaction. The strips are used in detecting, measuring, producing, etc., other substances. (From Dorland, 28th ed)
Products or parts of products used to detect, manipulate, or analyze light, such as LENSES, refractors, mirrors, filters, prisms, and OPTICAL FIBERS.
Facilities equipped to carry out investigative procedures.
The technology of transmitting light over long distances through strands of glass or other transparent material.
The spontaneous transformation of a nuclide into one or more different nuclides, accompanied by either the emission of particles from the nucleus, nuclear capture or ejection of orbital electrons, or fission. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)
That portion of the electromagnetic spectrum usually sensed as heat. Infrared wavelengths are longer than those of visible light, extending into the microwave frequencies. They are used therapeutically as heat, and also to warm food in restaurants.
Solid dosage forms, of varying weight, size, and shape, which may be molded or compressed, and which contain a medicinal substance in pure or diluted form. (Dorland, 28th ed)
Apparatus and instruments that generate and operate with ELECTRICITY, and their electrical components.
Studies used to test etiologic hypotheses in which inferences about an exposure to putative causal factors are derived from data relating to characteristics of persons under study or to events or experiences in their past. The essential feature is that some of the persons under study have the disease or outcome of interest and their characteristics are compared with those of unaffected persons.
The amount of radiation energy that is deposited in a unit mass of material, such as tissues of plants or animals. In RADIOTHERAPY, radiation dosage is expressed in gray units (Gy). In RADIOLOGIC HEALTH, the dosage is expressed by the product of absorbed dose (Gy) and quality factor (a function of linear energy transfer), and is called radiation dose equivalent in sievert units (Sv).
Measurement of the various properties of light.
The use of computers for designing and/or manufacturing of anything, including drugs, surgical procedures, orthotics, and prosthetics.
A chromatography technique in which the stationary phase is composed of a non-polar substance with a polar mobile phase, in contrast to normal-phase chromatography in which the stationary phase is a polar substance with a non-polar mobile phase.
Spectrophotometric techniques by which the absorption or emission spectra of radiation from atoms are produced and analyzed.
The study of MAGNETIC PHENOMENA.
The process of cumulative change at the level of DNA; RNA; and PROTEINS, over successive generations.
The study of chemical changes resulting from electrical action and electrical activity resulting from chemical changes.
Computer systems or programs used in accurate computations for providing radiation dosage treatment to patients.
Continuous frequency distribution of infinite range. Its properties are as follows: 1, continuous, symmetrical distribution with both tails extending to infinity; 2, arithmetic mean, mode, and median identical; and 3, shape completely determined by the mean and standard deviation.
Electronic devices that increase the magnitude of a signal's power level or current.
Negative test results in subjects who possess the attribute for which the test is conducted. The labeling of diseased persons as healthy when screening in the detection of disease. (Last, A Dictionary of Epidemiology, 2d ed)
A clear, odorless, tasteless liquid that is essential for most animal and plant life and is an excellent solvent for many substances. The chemical formula is hydrogen oxide (H2O). (McGraw-Hill Dictionary of Scientific and Technical Terms, 4th ed)
The visualization of deep structures of the body by recording the reflections or echoes of ultrasonic pulses directed into the tissues. Use of ultrasound for imaging or diagnostic purposes employs frequencies ranging from 1.6 to 10 megahertz.
Studies to determine the advantages or disadvantages, practicability, or capability of accomplishing a projected plan, study, or project.
Fractionation of a vaporized sample as a consequence of partition between a mobile gaseous phase and a stationary phase held in a column. Two types are gas-solid chromatography, where the fixed phase is a solid, and gas-liquid, in which the stationary phase is a nonvolatile liquid supported on an inert solid matrix.
Analysis based on the mathematical function first formulated by Jean-Baptiste-Joseph Fourier in 1807. The function, known as the Fourier transform, describes the sinusoidal pattern of any fluctuating pattern in the physical world in terms of its amplitude and its phase. It has broad applications in biomedicine, e.g., analysis of the x-ray crystallography data pivotal in identifying the double helical nature of DNA and in analysis of other molecules, including viruses, and the modified back-projection algorithm universally used in computerized tomography imaging, etc. (From Segen, The Dictionary of Modern Medicine, 1992)
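As a hedged illustration of Fourier analysis, the sketch below uses NumPy's discrete Fourier transform to recover the dominant frequency of a synthetic 5 Hz sine wave sampled at 100 Hz; the signal is purely illustrative.

```python
# Recover the dominant frequency of a synthetic signal with the discrete Fourier transform.
import numpy as np

sample_rate = 100.0                       # samples per second
t = np.arange(0, 2.0, 1.0 / sample_rate)  # two seconds of time points
signal = np.sin(2 * np.pi * 5.0 * t)      # 5 Hz sine wave

spectrum = np.abs(np.fft.rfft(signal))
frequencies = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
print(frequencies[np.argmax(spectrum)])   # approximately 5.0 Hz
```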
The monitoring of the level of toxins, chemical pollutants, microbial contaminants, or other harmful substances in the environment (soil, air, and water), workplace, or in the bodies of people and animals present in that environment.
The normality of a solution with respect to HYDROGEN ions; H+. It is related to acidity measurements in most cases by pH = log10[1/(H+)] = -log10(H+), where (H+) is the hydrogen ion concentration in gram equivalents per liter of solution. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)
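In its familiar form, pH is the negative base-10 logarithm of the hydrogen ion concentration; the tiny sketch below evaluates it for two illustrative concentrations.

```python
# pH = -log10(H+) = log10(1/(H+)), with (H+) the hydrogen ion concentration.
from math import log10


def ph(hydrogen_ion_concentration):
    return -log10(hydrogen_ion_concentration)


print(ph(1e-7))    # 7.0, neutral water
print(ph(2.5e-4))  # about 3.6, acidic
```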
The property of emitting radiation while being irradiated. The radiation emitted is usually of longer wavelength than that incident or absorbed, e.g., a substance can be irradiated with invisible radiation and emit visible light. X-ray fluorescence is used in diagnosis.
A prediction of the probable outcome of a disease based on an individual's condition and the usual course of the disease as seen in similar situations.
Detection and counting of scintillations produced in a fluorescent material by ionizing radiation.
Input/output devices designed to receive data in an environment associated with the job to be performed, and capable of transmitting entries to, and obtaining output from, the system of which it is a part. (Computer Dictionary, 4th ed.)
Linear POLYPEPTIDES that are synthesized on RIBOSOMES and may be further modified, crosslinked, cleaved, or assembled into complex proteins with several subunits. The specific sequence of AMINO ACIDS determines the shape the polypeptide will take, during PROTEIN FOLDING, and the function of the protein.
The measurement of the amplitude of the components of a complex waveform throughout the frequency range of the waveform. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)
The diversion of RADIATION (thermal, electromagnetic, or nuclear) from its original path as a result of interactions or collisions with atoms, molecules, or larger particles in the atmosphere or other media. (McGraw-Hill Dictionary of Scientific and Technical Terms, 6th ed)
A process that includes the determination of AMINO ACID SEQUENCE of a protein (or peptide, oligopeptide or peptide fragment) and the information analysis of the sequence.
These instruments were complementary and were used in academic and analytical settings through the 1950s, 1960s, and 1970s. ... Clarke, F. J. J. (June 5, 1972). "High Accuracy Spectrophotometry at the National ... wavelength calibration reproducibility and wide range of scan speeds of its prism-grating double-monochromator. Cary, Henry H ... Sommer, L. (1989). Analytical absorption spectrophotometry in the visible and ultraviolet : the principles. Amsterdam: Elsevier ...
Calibration refers to the process of "tuning" or adjusting assumed simulation model inputs to match observed data from the ... Given the complexity of building energy and mass flows, it is generally not possible to find an analytical solution, so the ... This effort resulted in more powerful simulation engines released in the early 1970s, among those were BLAST, DOE-2, ESP-r, ... transparency and accuracy. Since some of these engines have been developed for more than 20 years (e.g. IDA ICE) and due to the ...
Baird, D. (2002). "Analytical chemistry and the 'big' scientific instrumentation revolution". In Morris, Peter J. T. (ed.). ... They may be responsible for calibration, testing and maintenance of the system. In a research environment it is common for ... Common concerns of both are the selection of appropriate sensors based on size, weight, cost, reliability, accuracy, longevity ... in the 1970s. The transformation of instrumentation from mechanical pneumatic transmitters, controllers, and valves to ...
Standard addition can be applied to most analytical techniques and is used instead of a calibration curve to solve the matrix ... Starting in approximately the 1970s into the present day analytical chemistry has progressively become more inclusive of ... Error of a measurement is an inverse measure of accurate measurement i.e. smaller the error greater the accuracy of the ... Modern analytical chemistry is dominated by instrumental analysis. Many analytical chemists focus on a single type of ...
This approach was first put forward in the 1970s and developed in 2002. Many analysts do not employ analytical equations for ... When calibration plots are markedly nonlinear, one can bypass the empirical polynomial fitting and employ the ratio of two ... M.J.T. Milton; J.A. Wang (2002). "High Accuracy Method for Isotope Dilution Mass Spectrometry with Application to the ... Analytical application of the radiotracer method is a forerunner of isotope dilution. This method was developed in the early ...
This reduces the calibration abilities of the process, i.e. it reduces the possibility of forming parts with small concave ... Designing the process has in the past been a challenging task, since initial analytical modeling is possible only for limited ... but it was industrially spread in the 1970s for the production of large T-shaped joints for the oil and gas industry. Today it ... for an increased work hardening of sheet material by distinctive stretching operations and provides better shape accuracy for ...
... the analytical method of general perturbations could no longer be applied to a high enough accuracy to adequately reproduce the ... In the 1970s and early 1980s, much work was done in the astronomical community to update the astronomical almanacs from the ... data calibrations, and dynamical model improvements, most significantly involving Jupiter, Saturn, Pluto, and the Kuiper Belt. ... Lunar Laser Ranging accuracy was improved, giving better positions of the Moon. DE403 covered the time span early 1599 to mid ...
He let us think that we had some of the best ideas, but on reflection we knew where they came from." During the 1970s Gutowsky ... "Pittcon '92". Analytical Chemistry. 64 (3): 133A-137A. 31 May 2012. doi:10.1021/ac00027a716. "2016 Awardees". American Chemical ... Comparing results from a variety of samples, Gutowsky and his group improved the accuracy of their instrument through careful ... Through rigorous calculation, convergence, calibration, experimental characterization, and correlation to chemical concepts, he ...
In the late 1960s and 1970s unmeasured variables were taken into account in the data reconciliation process., PDR also became ... The standard deviation is related to the accuracy of the measurement. For example, at a 95% confidence level, the standard ... due to sensor calibration or faulty data transmission. Random errors means that the measurement y {\displaystyle y\,\!} is a ... ", "analytical redundancy", or "topological redundancy". Redundancy can be due to sensor redundancy, where sensors are ...
Calibration at low concentrations is often done with automated analyzers to save time and to eliminate variables of manual ... The "ultrapure water" term became more popular in the later 1970s and early 1980s as a way of describing the particular quality ... It is the standard location for the majority of analytical tests. The point of connection (POC) is another commonly used point ... They require no maintenance except for periodic verification of measurement accuracy, typically annually. Sodium is usually the ...
This effect is accounted for during calibration by using a different marine calibration curve; without this curve, modern ... 14C atoms that are detected. In the late 1970s an alternative approach became available: directly counting the number of 14C ... To verify the accuracy of the method, several artefacts that were datable by other techniques were tested; the results of the ... the success of radiocarbon dating stimulated interest in analytical and statistical approaches to archaeological data. ...
... accuracy, sensitivity) and for methods of calibration (Wadsö and Goldberg 2001). Modern IMC instruments are actually semi- ... IMC is a powerful and versatile analytical tool for four closely related reasons: All chemical and physical processes are ... IMC was used to perform a large number of pioneering studies of cultured mammalian cell metabolismics in the 1970s and 1980s in ... 10X smaller dynamic range compared to hc mode. For operation in either hc or pc mode, routine calibration in ...
The analytical uncertainty is typically 1~2‰ in δD. Thus it is still being used today when highest levels of precision are ... GC-IRMS was first introduced by Matthews and Hayes in the late 1970s, and was later used for δ13C, δ15N, δ18O and δ34S. Helium ... The amount effect is also expected for hydrogen isotopes, but there are not as many calibration studies. Across southeast Asia ... The offline combustion/reduction has the highest accuracy and precision for hydrogen isotope measurement without limits for ...
These short-term fluctuations in the calibration curve are now known as de Vries effects, after Hessel de Vries. A calibration ... In the late 1970s an alternative approach became available: directly counting the number of 14C and 12C atoms in a given ... To verify the accuracy of the method, several artefacts that were datable by other techniques were tested; the results of the ... the success of radiocarbon dating stimulated interest in analytical and statistical approaches to archaeological data. Taylor ...
This methodology was proposed in the 1970s by Walter Hoppe. Under purely absorption contrast conditions, this set of images can ... 1979). Introduction to Analytical Electron Microscopy , SpringerLink (PDF). doi:10.1007/978-1-4757-5581-7. ISBN 978-1-4757-5583 ... Aperture assemblies are often equipped with micrometers to move the aperture, required during optical calibration. Imaging ... with repositioning accuracy on the order of nanometers. Earlier designs of TEM accomplished this with a complex set of ...
In the 1970s, more sophisticated analytical techniques were introduced in educational research, starting with the work of Gene ... For test accuracy and prediction, particularly when there are multivariate effects, other approaches which seek to estimate the ... GIM can be viewed as a model calibration method for integrating information with more flexibility. The meta-analysis estimate ... Willis BH, Hyde CJ (2015). "What is the test's accuracy in my practice population? Tailored meta-analysis provides a plausible ...
This model is able to capture in analytical form, a range of shapes of hysteretic cycles which match the behaviour of a wide ... A more formal mathematical theory of systems with hysteresis was developed in the 1970s by a group of Russian mathematicians ... As of 2002, only desorption curves are usually measured during calibration of soil moisture sensors. Despite the fact ... Parkes, Martin (8 April 1999). "Subject: Accuracy of capacitance soil moisture ..." SOWACS (Mailing list). ...
Radar-derived rainfall estimates complement surface station data which can be used for calibration. To produce radar ... but its accuracy will depend on what ruler is used to measure the rain with. Any of the above rain gauges can be made at home, ... as well as an increase since the 1970s in the prevalence of droughts, especially in the tropics and subtropics. Changes in ...
Factors affecting accuracy of various meters include calibration of meter, ambient temperature, pressure used to wipe off the strip ... "Analytical Performance Requirements for Systems for Self-Monitoring of Blood Glucose With Focus on System Accuracy: Relevant ... It was used in American hospitals in the 1970s. A moving needle indicated the blood glucose after about a minute. Home glucose ... Accuracy of glucose meters is a common topic of clinical concern. Blood glucose meters must meet accuracy standards set by the ...
In the early 1970s, psychologists Amos Tversky and Daniel Kahneman took a different approach, linking heuristics to cognitive ... In a full tree, in contrast, order does not matter for the accuracy of the classifications. Fast-and-frugal trees are ... On the other hand, systematic processing involves more analytical and inquisitive cognitive thinking. Individuals looks further ... "Calibration of probabilities: The state of the art to 1980", in Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.), Judgment ...
For transformation of the (x, y) information into scattering angle ϕ a removable calibration mask in front of the entrance ... It is defined as the ability of an analytical technique to detect a variation in atomic distribution as a function of depth. ... This article provides information about ERDA that has been around for a long time, since the mid-1970s. The article provides ... Depth resolution less than 1 nm can be obtained with good quantitative accuracy thus giving these techniques significant ...
In the 1970s, compounds of rare earths and iron were discovered with superior magnetomechanic properties, namely the Terfenol-D ... In the very broadest usage, this term can encompass virtually any analytical technique involving remotely generated sound, ... it is important to question their accuracy because of the assumptions inherent in making such estimations from sonar data. ... the latter are used in underwater sound calibration, due to their very low resonance frequencies and flat broadband ...
"Central Analytical Laboratory. Archived from the original on 2010-06-15. Retrieved 2009-01-02.. ... Radar-derived rainfall estimates complement surface station data which can be used for calibration. To produce radar ... but its accuracy will depend on what ruler is used to measure the rain with. Any of the above rain gauges can be made at home, ... as well as an increase since the 1970s in the prevalence of droughts-especially in the tropics and subtropics. Changes in ...
... make full use of the best workspace to reduce the random uncertainty of the volumetric error and improve the machining accuracy ... How to improve the accuracy of CNC machine tools has been gotten great attention [1]. To enhance the machining accuracy of CNC ... W. K. Abdul, Calibration of 5-Axis Machine Tools, Beihang University, Beijing, China, 2010. *A. H. Slocum, Precision Machine ... P. M. Ferreira and C. R. Liu, "An analytical quadratic model for the geometric error of a machine tool," Journal of ...
Analytical Transmission Scanning Electron Microscopy This project develops STEM-in-SEM methods or low-energy transmission ... Argon mini-arc meets its match: Use of a laser-driven plasma source in ultraviolet-detector calibrations The National Institute ... In the early 1970s, NIST researchers pioneered the near-field scanning technique, now the standard method for testing high- ... The accuracy to which these measurements can be accomplished requires minimization of the ...
Air-kerma Calibrations in Radiation Protection Level Cs-137 and Co-60 Gamma Ray Beams The Dosimetry Group maintains and ... Analytical Transmission Scanning Electron Microscopy This project develops STEM-in-SEM methods or low-energy transmission ... Alpha-gamma Counting for High Accuracy Fluence Measurement The Alpha-Gamma device is a totally-absorbing neutron detector that ... In the early 1970s, NIST researchers pioneered the near-field scanning technique, now the standard method for testing high- ...
... uses a software-based nomogram to estimate stroke volume without the need for lithium calibration. Accuracy of the latter has ... Pressure Recording Analytical Method (PRAM) (MostCare device, Vytech, Padu, Italy) is another device that estimates CO from the ... From its initial use in outpatient settings in 1970s, echocardiography has found its place in all inpatient settings, ranging ... The latter system requires occasional set-up time and calibration. Data from either of these catheters can be used to derive ...
... namely wavelength calibration and absorbance/transmittance accuracy. Instrument design. The current range of NIR spectrometers ... Burgess Analytical Consultancy Limited, Rose Rae, The Lendings, Startforth, Barnard Castle, Co. Durham DL12 9AB, UK. Starna ... since Technicon developed the original Infralyser range in the mid 1970s for compositional analysis of agricultural commodities ... the use of a solid artefact or standard cuvette for wavelength calibration for either transmittance or reflectance calibration ...
Sonic nozzles for gas calibrators, when simplicity provides accuracy! Gas analysers calibration is a task required in many ... In the 1970s, the development of ambulatory infusion technology enabled the pumps to be used in research; ... ... Measurements in traces range are performed and analytical devices specifications need to be validated and ... ...
Power meters, thermometers, and temperature controllers require periodic calibration to maintain their accuracy. As it can be ... Since the 1970s, Yokogawa has been a leading provider of measuring instruments with the high level of precision required for ... as well as analytical instrument converters. The key features of this product are as follows:. 1. Versatile. The 2553A can ... The 2553A outputs voltage and current with high accuracy and stability:. Accuracy: ±75 ppm* for voltage (one year). ±120 ppm ...
Analytical Laboratories. NHANES uses several methods to monitor the quality of the analyses performed by the contract ... Production of PCBs peaked in the early 1970s and was banned in the United States after 1979. Together with the PCDDs and PCDFs ... instrument calibration, reagents, and any special considerations are submitted to NCHS quarterly. The reports are reviewed for ... quality control and quality assurance performance criteria for accuracy and precision, similar to the Westgard rules (Caudill, ...
The pH sensor works in the range 6-8 pH units and it is characterised by an accuracy of 0.07 pH units and a response time of ... A calibration curve was determined, resulting in an easy-handling immobilization method with a cheap stable material. This ... Flow cytometry has been instrumental in rapid analysis of single cells since the 1970s. One of the common approaches is the ... The relative merits and pitfalls of each analytical method are discussed. Fluorescence intermittency and spectral shifts of ...
The technique of meter calibration is meter specific; some devices have automatic calibration, whereas others use lot-specific ... Analytical goals for meters and for patient-performed testing have been recently reported (53,54,55,56). If SMBG is prescribed ... By the mid-1970s it became clear that HbA1c resulted from a posttranslational modification of HbA by glucose and that there was ... Chan JC, Wong R, Cheung CK, Lam P, Chow CC, Yeung VT, Kan EC, Loo KM, Mong MY, Cockram CS: Accuracy, precision, and user- ...
The first major revolution occurred in the 1970s with the development of the MIRD models and calculational techniques. These ... One involves the limitations on spatial resolution and accuracy of activity quantification with nuclear medicine cameras. ... Small thermoluminescent dosimeters (TLDs) have been very useful in anthropomorphic phantoms in calibration of diagnostic and ... The use of point kernels and analytical methods are not as common currently as characterization of dose distributions using ...
The accuracy of glucometers may be considered as technical and clinical. Technical accuracy refers to the analytical result ... Calibration errors are also common for those meters which require calibration. History & Evolution of Continuous Glucose ... Evolution of CGM can be traced back to the mid-1970s followed by the development of sensor technology and implantable glucose ... Accuracy of CGM Sensors. Though sensor accuracy has improved over the years, the accuracy of sensors available for use in ...
Lack of analytical accuracy not only impacts current clinical care but also is holding back a better understanding of emerging ... "As is true for all methods, measurements based on mass spectrometry depend upon the accuracy of calibration, freedom from ... "Back in the 1970s, we used estradiol testing primarily in younger women for fertility management where measuring relatively ... it is using to assign target values to single donor serum materials that labs can use to assess the calibration and accuracy of ...
... as well as to determine analytical accuracy and precision. Using these congeners and their respective ratios allows accurate ... 2008) were similar to those reported in the late 1970s (Gardner et al. 1978). 2011 K.F. Gaines, J.W. Summers, J.C. Cumbee, Jr ... Samples were quantified using a six-point calibration curve derived from dilutions of certified congener (Ultra Scientific) and ...
This instalment provides an overview for analytical-scale HPLC pumps, including their requirements, modern designs, operating ... The compositional accuracy of the pump can be verified by running step gradient profiles (for example, in 10% composition steps ... Syringe Pumps: Many early HPLC pumps in the 1960s and 1970s were syringe pumps (such as the Varian 8500 Dual Syringe Pump ... He holds a PhD in analytical chemistry from City University of New York. He has more than 100 publications and a best-selling ...
The calibration standard curve was prepared with seven points in rat plasma with drug concentrations ranging between 0.0097 and ... The precision and accuracy were less than 15% for the between-day variabilities, which were characterized at the 3 ... Analytical assays. (i) Colistin analysis in plasma.Determination of colistin concentrations in plasma were performed by a ... It is an old antibiotic that was abandoned in the 1970s due to its adverse effects (7). During the last 15 years, it has ...
... used to qualify absorbance accuracy can be used to qualify linearity provided they are compatible with the analytical ... Until the 1970s most laboratories used in-house prepared solutions or proprietary test materials to check the performance of ... The stability of the reference material is also very important, and the validity of the calibration should be stated on the CRM ... Control of wavelength accuracy. The user is required to:. "Control the wavelength accuracy of an appropriate number of bands in ...
The accuracy of the calibration curve was also checked with 5 mg F? L? 1 standard for the higher range calibration curve, and ... Analytical model of fluoride distribution in groundwater. Solute transport in a three-dimensional homogeneous aquifer was ... The first boreholes were installed in the 1970s but the year is not certain. Samples were collected for wet, dry, and shoulder ... A semi-analytical three-dimensional model (Domenico, 1987) of fluoride distribution in groundwater in Namoo was created using ...
Temperature-measuring instruments require calibration to confirm the accuracy of temperature measurements. Calibration of ... In the 1970s, the Wyoming Geological Survey (WGS) began a project to inventory all the thermal springs in the State of Wyoming ... inclusion of the uncertainty associated with the analytical technique clarifies the confidence of the interpretation. Analytic ... Calibration of temperature data loggers is an important component of the equipments use. An easy calibration check is the ice- ...
methyl mercury analysis system methyl mercury distillation system zero air generator mercury vapor calibration industrial ... Tekran is known worldwide for the accuracy and dependability of our equipment and will continue to exceed customer expectations ... Tekran was founded in Toronto, Canada in 1989 to develop custom analytical instrumentation for environmental analysis. Our ... Japan in the 1950s and Iraq in the 1970s. Recent low-level exposure studies have been used to set government guidelines for ...
In the late 1970s LC-based assays emerged that used refined extraction and chromatographic steps with some form of fixed- or ... accuracy, and variability) regularly over the past decade (Carter et al., 2004; Carter, 2009; Jones et al., 2009). These ... In the case of the Canadian survey of 25OHD levels in the Canadian population, the analytical methods used to determine 25OHD ... A standard reference material (SRM 972, Vitamin D in Human Serum) and calibration solution (SRM 2972, 25-Hydroxyvitamin D2 and ...
The researchers described the new calibration method and results in the July 15 issue of the journal Analytical Chemistry. In ... One of the most popular candidates for a unified theory is string theory, first developed in the late 1960s and early 1970s.. ... the researchers used DCC-calibrated Raman spectroscopy to significantly boost the accuracy of blood glucose measurements - an ... However, this calibration becomes more difficult immediately after the patient eats or drinks something sugary, because blood ...
To compare its accuracy to other traditional detection methods, 27 swine stool samples from south of China were investigated ... All chemicals used were of analytical grade or higher. Apparatus. Absorbance of the HRP-based ELISA was measured using a ... and calibration line (C-line) using an automatic dispenser at a volume of 1 μL/cm. The NC membrane was dried at 37 °C and saved ... Since the first appearance in England in the early 1970s, outbreaks of PED have been reported in several European and Asian ...
Chapter 7 Calibration. 7.1 Introduction. 7.2 Calibration Concepts. 7.3 Calibration Methods. 7.4 Factors Affecting Calibration. ... Crowcon had been a founder member and strong supporter of CoGDEM since it was formed in the 1970s. Leigh's involvement with ... 8.5 Accuracy and Stability. 8.6 Electrical/Electronic Tests. 8.7 Software Evaluation. Acknowledgements ...
The analytical approach advocated here argues for an historical biblical archaeology rooted in the application of science-based ... To achieve subcentury dating accuracy for ancient historical archaeology, as is the case for Levantine historical biblical ... 2004) INTCAL04 terrestrial radiocarbon age calibration, 0-26 kyr BP. Radiocarbon 46:1029-1058. ... in the 1970s and 1980s (7), using relative ceramic dating methods, they assumed that the Iron Age (IA) in Edom did not start ...
The CCO-PLS method provided a new approach for multi-band selection to achieve high analytical accuracy for molecular ... Piero: I recall writing a multivariate regression programme way back in the mid-1970s on one of the first HP Programmable ... multivariate data analysis, while for calibration and validation a reference collection of 199 historical canvas samples was ... A new analytical method was developed to non-destructively determine pH and degree of polymerisation (DP) of cellulose in ...
  • This effort resulted in more powerful simulation engines released in the early 1970s, among those were BLAST, DOE-2, ESP-r, HVACSIM+ and TRNSYS. (wikipedia.org)
  • These instruments were complementary and were used in academic and analytical settings through the 1950s, 1960s, and 1970s. (wikipedia.org)
  • In the late 1960s and 1970s, RIA methods became available for this purpose. (aaccjnls.org)
  • Propanidid [(4-diethylcarbamoylmethoxy-3-methoxy-phenyl)-acetic acid propyl ester] is a short-acting sedative-hypnotic agent containing an ester moiety that was available in some countries in the 1960s and 1970s. (asahq.org)
  • This instalment provides an overview of analytical-scale HPLC pumps, including their requirements, modern designs, operating principles, trends, and best practices for trouble-free operation. (chromatographyonline.com)
  • Mixer volumes range from a few μL to 2 mL in analytical HPLC systems. (chromatographyonline.com)
  • A major change is that the scope is now extended to include high-performance liquid chromatography (HPLC) detectors and process analytical technology (PAT) as applications of ultraviolet/visible (UV/vis) spectrophotometry. (spectroscopyeurope.com)
  • Proper analytical screening including assay determination via High Performance Liquid Chromatography (HPLC), redox potential, pH measurement and general raw material compatibility testing helps in the development of an effective microbial protection program to inhibit or control the growth of microorganisms that cause biodeterioration. (coatingsworld.com)
  • 1966: HPLC was first named by Horvath at Yale University, but HPLC didn't catch on until the 1970s. 1978: W.C. Still introduced flash chromatography, where solvent is forced through a packed column with positive pressure. (scribd.com)
  • To compare its accuracy to other traditional detection methods, 27 swine stool samples from southern China were investigated with the newly developed ICA, a commercial strip, and RT-PCR. (springer.com)
  • When British archaeologists carried out the first controlled excavations in the highlands of Edom (southern Jordan) in the 1970s and 1980s ( 7 ), using relative ceramic dating methods, they assumed that the Iron Age (IA) in Edom did not start before the 7th c. (pnas.org)
  • Ira Rabin and Oliver Hahn from the BAM Federal Institute for Materials Research and Testing have studied scroll fragments which were discovered at four sites up to 50 km from the Qumran Cave, where more than 90% of the known fragments were found, as they discussed in Analytical Methods. (shroudstory.com)
  • The reference collection was analysed destructively using microscopy and chemical analytical methods. (shroudstory.com)
  • Analytical chemistry studies and uses instruments and methods used to separate, identify, and quantify matter. (wikipedia.org)
  • Analytical chemistry consists of classical, wet chemical methods and modern, instrumental methods. (wikipedia.org)
  • Analytical chemistry has been important since the early days of chemistry, providing methods for determining which elements and chemicals are present in the object in question. (wikipedia.org)
  • The PBI tests of the 1950s that estimated the TT4 concentration were replaced first by competitive protein binding methods in the 1960s, which were later superseded by RIA methods in the 1970s. (nacb.org)
  • The report is intended to be used by U.S. Department of Energy (DOE) laboratory managers and technicians as a guide for identifying those analytical methods that are acceptable to the EPA for radionuclide analysis. (docplayer.net)
  • EPA published a list of analytical methods for radionuclides in 40 CFR (a) that were approved for determining compliance with the maximum contaminant levels. (docplayer.net)
  • Until early 1997, Part (a) contained only a few approved methods, most of which were approved for use in the 1970s. (docplayer.net)
  • We demonstrated the accuracy of the method by comparing it with 2 ID/GC-MS methods and also assessed the imprecision, linearity, recovery, sensitivity, and specificity. (aaccjnls.org)
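As a rough illustration of two of the performance figures mentioned above, the sketch below tabulates imprecision (CV of replicates) and bias against a reference value assigned by the comparison ID/GC-MS procedure; the replicate results and reference value are hypothetical, and linearity, recovery, sensitivity, and specificity would be assessed with their own, separate experiments.

```python
import statistics

# Hypothetical replicate results from the candidate method (arbitrary units) and the
# mean value assigned to the same pool by a reference ID/GC-MS procedure.
replicates = [49.8, 51.2, 50.5, 49.9, 50.9, 50.3]
reference_value = 50.0

mean = statistics.mean(replicates)
cv_percent = 100.0 * statistics.stdev(replicates) / mean          # imprecision
bias_percent = 100.0 * (mean - reference_value) / reference_value  # bias vs reference method

print(f"mean {mean:.1f}, CV {cv_percent:.1f}%, bias vs ID/GC-MS {bias_percent:+.1f}%")
```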
  • The use of standardized methods of analysis in analytical chemistry is one of the most traditional ways of achieving comparability of measurement results. (iupac.org)
  • Especially in food analysis, agrochemicals, organic analysis, and other analytical areas where unstable samples and/or measurands are analyzed, the use of standardized methods is often prescribed by legislation. (iupac.org)
  • Two IUPAC internationally harmonized protocols have for many years served as a basis for validation and adoption of standardized analytical methods (procedures). (iupac.org)
  • The first is the IUPAC "Protocol for the Design, Conduct, and Interpretation of Collaborative Studies," 1 and the second the "Harmonized Protocols for the Adoption of Standardized Analytical Methods and for the Presentation of their Performance Characteristics. (iupac.org)
  • However, the world is changing rapidly and with the fast development of analytical instrumentation and the availability of new analytical techniques and procedures the prescription of methods to be used is sometimes a limiting factor. (iupac.org)
  • This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. (chemweb.com)
  • From the technology standpoint, lower detection limits have been made possible by improvement of the detection capabilities of the analytical methods and instruments. (foodsafetytech.com)
  • These methods now can provide fast turn-around time and better accuracy in comparison to microbiological methods. (foodsafetytech.com)
  • High Accuracy Spectrophotometry at the National Physical Laboratory (Teddington, Middlesex, UK). (wikipedia.org)
  • Last, having solved the above problems, the range of application areas now opening up to the technique also required the design of instruments of a more physically robust nature than usually found in an analytical laboratory, and by definition, portable in nature. (spectroscopyeurope.com)
  • As it can be quite time-consuming and costly to send an instrument back to the manufacturer or to a laboratory for calibration, many users elect to do this in-house, using a high-precision measuring instrument that they have purchased for use as a standard. (yokogawa.com)
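One common way to use such an in-house standard is to take paired readings against the reference at several points across the range and derive a simple correction for the instrument under test. The sketch below assumes a linear relationship and hypothetical readings; it is not a procedure from the cited source.

```python
import numpy as np

# Paired readings (hypothetical): in-house reference standard vs instrument under test.
reference = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
device    = np.array([0.3, 25.6, 50.9, 76.1, 101.4])

# Fit reference = a * device + b so raw readings can be corrected in routine use.
a, b = np.polyfit(device, reference, 1)

def corrected(reading: float) -> float:
    """Apply the derived linear correction to a raw device reading."""
    return a * reading + b

print(f"correction: true ~= {a:.4f} * reading {b:+.3f}")
print(f"raw 60.0 -> corrected {corrected(60.0):.2f}")
```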
  • This proposal seeks funds to build a tool for data collection for social scientists that would bring a new standard of accuracy to the data, broader accessibility for social scientists at reduced cost, blending the analytic power afforded by a state-of-the art computer network laboratory with the generalization power afforded by a representative sample survey of American households. (stanford.edu)
  • Published GLP regulations and guidelines have a significant impact on the daily operation of an analytical laboratory. (labcompliance.com)
  • Established internal quality-control practices and regular laboratory participation in proficiency testing constitute another very important pillar of quality assurance in analytical chemistry. (iupac.org)
  • The WHO criteria for AMI then established (and re-established since the 1970s) the laboratory requirements for CK-MB and detectable levels of troponin to correlate with clinical findings. (labmedicineblog.com)
  • Testing and calibration laboratories gain a great deal from a technically sound assessment and accreditation by an internationally recognized accreditation body. (labmanager.com)
  • For calibration laboratories, accreditation by an internationally recognized accreditation body validates their pivotal place in the unbroken chain of traceability to national and international measurement standards. (labmanager.com)
  • Other manufacturers that have in-house testing or calibration facilities can reduce or eliminate these overhead costs and subcontract with confidence to outside accredited laboratories. (labmanager.com)
  • Two IUPAC internationally harmonized documents, namely the "International Harmonized Protocol for the Proficiency Testing of (Chemical) Analytical Laboratories" 5 and the "Harmonized Guidelines for Internal Quality Control in Analytical Chemistry Laboratories" 6 still provide the basic rules, which have received wide international acceptance and utilization. (iupac.org)
  • Considering the experience gained over 13 years, the protocol has been updated and a revised version titled "The International Harmonized Protocol for the Proficiency Testing (PT) of Analytical Chemistry Laboratories" was published in 2006. (iupac.org)
  • 7 To supplement this so-called classical PT approach, the WPHQA recently initiated a separate project on the Selection and Use of Proficiency Testing Schemes for Limited Number of Participants (Chemical Analytical Laboratories). (iupac.org)
  • The implementation of standardization is constantly ongoing to reduce analytical discrepancies between laboratories. (dovepress.com)
  • It is particularly noteworthy among commercial instrumentation for the size, freedom from stray light, wavelength calibration reproducibility, and wide range of scan speeds of its prism-grating double-monochromator. (wikipedia.org)
  • Tekran was founded in Toronto, Canada in 1989 to develop custom analytical instrumentation for environmental analysis. (environmental-expert.com)
  • Although modern analytical chemistry is dominated by sophisticated instrumentation, the roots of analytical chemistry and some of the principles used in modern instruments are from traditional techniques, many of which are still used today. (wikipedia.org)
  • This renewed interest, particularly in the pharmaceutical industry, has led to the need for traceable standards for the calibration and qualification of the wavelength scale of NIR spectrometers in the regulated environment. (spectroscopyeurope.com)
  • Because of the versatility in the sample presentation modes in the NIR, there is a need to ensure wavelength accuracy in transmittance, reflectance and transflectance. (spectroscopyeurope.com)
  • However, for modern FT instruments the precision is so good that only the accuracy of the wavelength scale is important. (spectroscopyeurope.com)
  • The wavelength scale accuracy is important because of the excellent signal to noise ratio of the NIR especially in the region above 1200 nm. (spectroscopyeurope.com)
  • In the area of instrument qualification, we will look briefly at two aspects which continue to cause problems, namely wavelength calibration and absorbance/transmittance accuracy. (spectroscopyeurope.com)
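A wavelength-scale qualification ultimately reduces to comparing measured band positions against the certified values of a reference material and applying a tolerance. The sketch below is only an illustration: the peak list and the ±1 nm tolerance are assumptions, since pharmacopoeial tolerances vary with region, technique and spectral range.

```python
# Certified vs measured band positions (nm) for a wavelength reference material.
# Both the peak list and the +/-1.0 nm tolerance are illustrative assumptions.
bands = {1261.0: 1260.6, 1681.0: 1681.3, 1935.0: 1934.5}   # certified: measured
tolerance_nm = 1.0

for certified, measured in bands.items():
    error = measured - certified
    status = "PASS" if abs(error) <= tolerance_nm else "FAIL"
    print(f"{certified:7.1f} nm: measured {measured:7.1f} nm, error {error:+.1f} nm -> {status}")
```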
  • High-performance (HP) LC-MS/MS potentially has adequate sensitivity and specificity for low estradiol concentrations, but still is not without potential analytical pitfalls. (aacc.org)
  • Although RT-PCR serves as a good standard due to its high sensitivity and accuracy, this method relies heavily on sophisticated equipment and expensive apparatus. (springer.com)
  • Improve ICPMS Sensitivity by Reducing the Analytical. (labmate-online.com)
  • In mass spectrometry, the performance of the MS system in terms of mass resolution, mass accuracy and principally sensitivity, is highly dependent on the method of ion generation. (bruker.com)
  • The sensitivity of the system is extremely important in most MS analytical tasks. (bruker.com)
  • Analytical chemistry is also focused on improvements in experimental design , chemometrics , and the creation of new measurement tools. (wikipedia.org)
  • Analytical chemistry has broad applications to forensics, medicine, science and engineering. (wikipedia.org)
  • During this period significant contributions to analytical chemistry include the development of systematic elemental analysis by Justus von Liebig and systematized organic analysis based on the specific reactions of functional groups. (wikipedia.org)
  • Most of the major developments in analytical chemistry take place after 1900. (wikipedia.org)
  • From approximately the 1970s to the present day, analytical chemistry has progressively become more inclusive of biological questions (bioanalytical chemistry), whereas it had previously been largely focused on inorganic or small organic molecules. (wikipedia.org)
  • The late 20th century also saw an expansion of the application of analytical chemistry from somewhat academic chemical questions to forensic, environmental, industrial and medical questions, such as in histology. (wikipedia.org)
  • Modern analytical chemistry is dominated by instrumental analysis. (wikipedia.org)
  • Analytical chemistry plays an increasingly important role in the pharmaceutical industry where, aside from QA, it is used in discovery of new drug candidates and in clinical applications where understanding the interactions between the drug and the patient are critical. (wikipedia.org)
  • These techniques also tend to form the backbone of most undergraduate analytical chemistry educational labs. (wikipedia.org)
  • Today, after almost 30 years, that working party is the IUPAC Interdivisional Working Party for Harmonization of Quality Assurance (WPHQA), which is part of the Analytical Chemistry Division (ACD). (iupac.org)
  • Progress reports containing any problems encountered during shipping or receipt of specimens, summary statistics for each control pool, QC graphs, instrument calibration, reagents, and any special considerations are submitted to NCHS quarterly. (cdc.gov)
  • There is more detail on this topic as well as on a variety of other areas such as gas detection requirements and training, sensor technologies, and instrument calibration and maintenance. (labmate-online.com)
  • However, there are few calibrators on the market that are intended specifically for the measurement of direct current, so many users are forced to use expensive multifunctional calibrators or voltage/current sources that are not as accurate or stable as a dedicated calibration instrument. (yokogawa.com)
  • The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age. (wikipedia.org)
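The conversion described above is essentially a table lookup on the calibration curve. The sketch below shows only that idea, using invented numbers on a monotonic toy segment; real radiocarbon calibration uses the IntCal curves and probabilistic software (e.g. OxCal or CALIB), which also propagate measurement uncertainty.

```python
import numpy as np

# Toy segment of a calibration curve (calendar age BP vs radiocarbon age BP).
# Values are invented for illustration only.
cal_age_bp = np.array([2900, 2950, 3000, 3050, 3100])
c14_age_bp = np.array([2780, 2820, 2870, 2905, 2950])

measured_c14 = 2850   # radiocarbon determination for the sample, years BP

# Because this toy segment is monotonic, it can be inverted by simple interpolation.
estimated_cal = np.interp(measured_c14, c14_age_bp, cal_age_bp)
print(f"~{estimated_cal:.0f} cal BP for a {measured_c14} BP radiocarbon age")
```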
  • To gain acceptance in the trading process, the quality of analytical measurement results needs to be assured and demonstrated. (iupac.org)
  • IUPAC has a long tradition of activities related to quality assurance of analytical measurement results. (iupac.org)
  • The short description of activities that follows and the documents cited here are aimed at highlighting the important role that IUPAC, and specifically the WPHQA, plays in ensuring the quality of analytical measurement results. (iupac.org)
  • State-of-the-art technology for reliable and safe measurement of SF6 gas quality; includes a gas recovery system and a built-in test for instant calibration of the system accuracy. (e-magic.be)
  • Production of PCBs peaked in the early 1970s and was banned in the United States after 1979. (cdc.gov)
  • Since the early 1970s, NSAIDs have been routinely used in veterinary medical practice. (scirp.org)
  • Starting in the early 1970s, AAR gained new popularity when studies (e.g. (ukdiss.com)
  • In the United States, trace analysis of contaminants in food products began in the early 1970s following amendments to the Federal Food, Drug, and Cosmetic Act (FFDCA) in 1968. (foodsafetytech.com)
  • Several non-invasive optical metrological, imaging and analytical techniques were applied to silver denarii of Diva Faustina from two different editions. (cosch.info)
  • The calibrated accuracy for methane, ethane, and propane is within 3‰ of the values determined using isotope ratio mass spectrometry (IRMS), which is the current method of choice for compound-specific isotope analysis. (pnas.org)
  • The step-by-step protocols for calibration and using the three modules are presented. (jove.com)
  • When determining pH, some important factors must be considered regarding the selection of proper analytical equipment and test protocols. (coatingsworld.com)
  • With the discovery of DBPs in the mid-1970s, caused by incomplete oxidation of certain organics by halogenated chemicals - mainly chlorine-based - their removal became a priority for regulators and utilities. (wateronline.com)
  • In addition we can say more about how much of the observed tissue modulated spectra must be associated with a "modulation defect" resulting from various errors affecting the accuracy and precision in subtraction of static tissue contributions. (spie.org)
  • The analytical method for PCDDs/PCDFs/cPCB is described in Patterson et al. (cdc.gov)
  • The user is recommended to: "Define the measuring conditions to obtain a satisfactory signal-to-noise ratio and to select the scan range, scan rate and slit-width that provide the necessary optical resolution for the intended application without losing the required signal-to-noise ratio or the linearity of the analytical method. (spectroscopyeurope.com)
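One rough way to check that the chosen scan conditions still give a satisfactory signal-to-noise ratio is to divide the band height by the standard deviation of a blank or baseline region of the trace. The numbers below are hypothetical, and formal pharmacopoeial S/N definitions differ in detail from this shortcut.

```python
import numpy as np

# Hypothetical detector trace values: a flat baseline region and one analyte band.
baseline = np.array([0.0021, 0.0018, 0.0024, 0.0019, 0.0022, 0.0020])  # AU
peak_height = 0.4150   # band maximum above the fitted baseline, AU

noise = np.std(baseline, ddof=1)   # sample standard deviation of the baseline
snr = peak_height / noise
print(f"approximate S/N = {snr:.0f}")
```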
  • A new analytical method was developed to non-destructively determine pH and degree of polymerisation (DP) of cellulose in fibres in 19th-20th century painting canvases, and to identify the fibre type: cotton, linen, hemp, ramie or jute. (shroudstory.com)
  • The method is based on NIR spectroscopy and multivariate data analysis, while for calibration and validation a reference collection of 199 historical canvas samples was used. (shroudstory.com)
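As a minimal sketch of the multivariate-calibration idea referred to above (regressing a property such as pH on spectra with PLS, then validating on held-out samples), the snippet below uses synthetic data; the component count, the 70/30 split and the data themselves are assumptions, not the published canvas model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in: 199 "spectra" of 600 points with one latent property (e.g. pH).
rng = np.random.default_rng(0)
X = rng.normal(size=(199, 600))
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=199)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)   # component count is an assumption
pls.fit(X_cal, y_cal)

print(f"validation R^2 = {r2_score(y_val, pls.predict(X_val).ravel()):.2f}")
```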
  • Hence, this study provides the extensive explanation of the XIMIS method and a comprehensive calibration of XIMIS against EVA and HPM using data from 496 roof taps to quantify the value-dependence and separation/attachment zonal dependence. (frontiersin.org)
  • The ID/LC-MS/MS method has improved accuracy compared with immunoassay. (aaccjnls.org)
  • In precision machining, geometric errors of machine tools have considerable effect on geometrical and dimensional accuracies of machined features [ 7 ] and make up the major part of the inaccuracy of a machine tool [ 5 , 6 , 8 ]. (hindawi.com)
  • Since the 1970s, Yokogawa has been a leading provider of measuring instruments with the high level of precision required for use as standards in the calibration of voltage, current, resistance, pressure, and temperature instruments. (yokogawa.com)
  • The discovery of a chemical present in blood that increases the risk of cancer would be a discovery that an analytical chemist might be involved in. (wikipedia.org)
  • Adiabatic Calorimeters: In the late 1970s, the American Institute of Chemical Engineers (AIChE) initiated a research project investigating various aspects of reactor vent sizing. (ubriaco-magst.fun)
  • This study aimed to identify the chemical constituents of photoresist (PR) products and their by-products and to compare these constituents with material safety data sheets (MSDSs) and analytical results. (bvsalud.org)
  • The current range of NIR spectrometers is radically different from "conventional" filter-based spectrometers, since Technicon developed the original Infralyser range in the mid-1970s for compositional analysis of agricultural commodities. (spectroscopyeurope.com)
  • The analysis results can reveal the stochastic characteristic of volumetric error and are also helpful to make full use of the best workspace to reduce the random uncertainty of the volumetric error and improve the machining accuracy. (hindawi.com)
  • A volumetric error model, which is the relative error between the cutting tool and the work piece, is a system analysis implement, used when accuracy is an important measure of performance to predict and control the total error of a system or to achieve compensation. (hindawi.com)
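Full volumetric error models propagate the 21 geometric error components of a three-axis machine through homogeneous transformation matrices; as an extremely reduced illustration of the resultant-error idea only, the sketch below combines three hypothetical axis deviations at one commanded position into a single volumetric error figure.

```python
import math

# Hypothetical deviations (mm) of the tool relative to the workpiece at one
# commanded position of a 3-axis machine, resolved along X, Y and Z.
dx, dy, dz = 0.004, -0.002, 0.003

# Simplest possible volumetric error figure: the resultant of the three deviations.
# A real model would derive dx, dy, dz from the full set of geometric error
# components rather than assume them directly.
volumetric_error_mm = math.sqrt(dx**2 + dy**2 + dz**2)
print(f"volumetric error at this point: {volumetric_error_mm * 1000:.1f} um")
```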
  • AAR dating has been suggested as a cost-effective and rapid preliminary dating technique to identify qualitative relative age information in the analysis of a large number of samples, with the possibility of independent calibration by a separate geochronological technique. (ukdiss.com)
  • In recent years, there has been a rise in demand for DC voltage and current calibration instruments as the result of increased use of direct current in data centers to avoid the power losses caused by AC/DC conversion and rising sales of thermometers, analyzers, and DC servo motors. (yokogawa.com)
  • Data from Khirbat en-Nahas, and the nearby site of Rujm Hamra Ifdan, demonstrate the centrality of industrial-scale metal production during those centuries traditionally linked closely to political events in Edom's 10th century BCE neighbor ancient Israel. (pnas.org)
  • Specifically, the proposal seeks funds for three categories of expenses: (1) purchase of computer hardware, (2) installation of the computer hardware in a representative sample of 1,000 households across the country, and (3) calibration of a national network of these computers via the Internet, testing to assure that it works properly to permit social science data collection. (stanford.edu)
  • Then, once a month, respondents would provide a new round of data by accessing a secure webpage, and calibration of the computer network would be conducted. (stanford.edu)
  • The "industry standard" EVA, comprising sixteen 10-min epochs, gives the best accuracy, but is inefficient in its use of data. (frontiersin.org)
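The classical EVA step referred to above amounts to fitting a Type I (Gumbel) extreme-value distribution to the sixteen epoch maxima and taking a high fractile as the design value. The sketch below uses invented epoch maxima, and the 78% fractile is quoted as an assumption following the Cook-Mayne convention rather than taken from the cited study.

```python
from scipy import stats

# Sixteen hypothetical 10-min epoch maxima of a pressure coefficient at one roof tap.
epoch_maxima = [1.82, 2.05, 1.90, 2.31, 1.77, 2.10, 1.95, 2.22,
                1.88, 2.40, 1.99, 2.15, 1.84, 2.27, 2.03, 1.93]

# Fit a Type I (Gumbel) distribution and read off an assumed 78% design fractile.
loc, scale = stats.gumbel_r.fit(epoch_maxima)
design_cp = stats.gumbel_r.ppf(0.78, loc=loc, scale=scale)
print(f"mode {loc:.2f}, dispersion {scale:.2f}, design Cp ~ {design_cp:.2f}")
```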
  • With an emphasis on recent developments in remote sensing, the symposium program will focus on applications of satellite and other Earth observations to monitor, assess, and perform projections of future land and water resources, as well as big data and other analytical technologies to improve decision making using satellite data. (usgs.gov)
  • The diagnostic accuracy of total hormone measurements would equal that of free hormone if all patients had identical levels of binding proteins with similar affinities for the thyroid hormones. (nacb.org)
  • Along with rapid progress and development of science, technology and social economy, the machining accuracy of CNC machine tools is increasingly demanding. (hindawi.com)
  • In the development of a biocide program for a polymer dispersion system, analytical testing can provide valuable information. (coatingsworld.com)
  • This is partly explained by the fact that the shape of the particles was not spherical, although calibration of the sampling instruments was performed using spherical particles, and that the concentration at the UNP workplaces was very high, allowing the particles to aggregate more easily. (bvsalud.org)
  • Users have greater confidence in the accuracy of the test or calibration report they are purchasing because it has been generated by a competent facility. (labmanager.com)
  • Field investigations assessing mercury concentrations in marine biota have been conducted in all oceans and seas 8 , including the Mediterranean since the 1970s. (nature.com)
  • Power meters, thermometers, and temperature controllers require periodic calibration to maintain their accuracy. (yokogawa.com)
  • The 2553A has the optimum output range and functions required for the calibration of thermometers, temperature controllers, and analog power meters. (yokogawa.com)
  • In addition to being able to calibrate analog meters, the 2553A can calibrate thermometers and temperature controllers that utilize a thermocouple or RTD, as well as analytical instrument converters. (yokogawa.com)
  • The 2553A is easy to operate, having a dedicated output setting dial for each digit in the numeric display, a sensor selection dial, and a dial for selecting voltage output, current output, temperature calibration, and other functions. (yokogawa.com)
  • Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. (chemweb.com)
  • Over the past 40 years, this scientific investment has contributed to new analytical and modelling techniques for quantifying spatial and temporal patterns of mercury in different biotic and abiotic matrices. (nature.com)
  • [5] In the 1970s many of these techniques began to be used together as hybrid techniques to achieve a complete characterization of samples. (wikipedia.org)
  • As mentioned in the conclusions of Cook (2016b), a calibration using only two taps is not sufficient to demonstrate the accuracy and benefit of XIMIS. (frontiersin.org)
  • This paper addresses some key areas where analytical testing can provide critical information to ensure the optimal protection of polymer dispersions. (coatingsworld.com)
  • The presented automatic video tracking system accomplishes this by using a mirror system and a calibration procedure that corrects for the considerable error introduced by the transition of light from water to air. (jove.com)
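The cited system corrects this error with mirrors and its own calibration procedure; as a heavily simplified illustration of why a correction is needed at all, the sketch below uses the familiar apparent-depth relation for near-normal viewing through a flat water surface (apparent depth ≈ true depth divided by the refractive index of water). The example depths are hypothetical.

```python
# Simplified illustration of the water-to-air refraction error, not the paper's
# mirror-and-calibration procedure: at near-normal viewing angles an object at
# true depth d appears at roughly d / n_water below the surface.
N_WATER = 1.333

def apparent_depth(true_depth_cm: float) -> float:
    return true_depth_cm / N_WATER

def true_depth(apparent_cm: float) -> float:
    return apparent_cm * N_WATER

print(f"an object at 20.0 cm depth appears at ~{apparent_depth(20.0):.1f} cm")
print(f"an apparent depth of 15.0 cm corresponds to ~{true_depth(15.0):.1f} cm")
```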
  • Calibration of gas analysers is a task required in many applications by either legislation or quality management systems. (environmental-expert.com)
  • In industrial applications of polymer in water-based systems (e.g., coatings, adhesives, and sealants), analytical testing of the system enables a formulator to maximize biocide efficacy, thus protecting product integrity. (coatingsworld.com)
  • For example, the Tekran 3310 Elemental Mercury Calibration Source was chosen by the US National Institute of Standard and Technology (NIST) as the "NIST Prime" which is used to certify all other Hg-CEM calibration sources. (environmental-expert.com)
  • I started the company with a handful of employees at the end of the 1970s to manufacture computing technology. (docplayer.net)
  • However, machining accuracy of the multiaxis synchronized machine is mainly affected by the geometric errors of the guide system, structure stiffness, thermal behavior and the dynamic response, and so forth. (hindawi.com)
  • Proactive analytical screening for pH and redox potential can be done very quickly and will aid in the selection of a proper biocide for a given system. (coatingsworld.com)
  • Back in the 1970s, we used estradiol testing primarily in younger women for fertility management where measuring relatively high levels was the target," said Sluss. (aacc.org)
  • Despite these testing guidelines, a number of variables can affect the accuracy of VWF testing in general and this in turn may impact result interpretation and VWD diagnosis. (diapharma.com)
  • Regulatory interest has led to the promulgation of monographs for the specification, calibration and control of NIR spectrometers in both the United States [5] and European [6] Pharmacopoeias. Unfortunately these monographs have not been harmonised and there are significant differences. (spectroscopyeurope.com)
  • Tekran is known worldwide for the accuracy and dependability of our equipment and will continue to exceed customer expectations. (environmental-expert.com)