Quality review procedures necessary for rodent pathology databases and toxicogenomic studies: the National Toxicology Program experience. (49/791)

Accuracy of the pathology data is crucial, since rodent studies often provide the critical data used to set human chemical exposure standards. A diagnosis represents a judgment on the expected biological behavior of a lesion, and peer review can improve diagnostic accuracy and consistency. Through the conduct of 500 2-year rodent studies, the National Toxicology Program (NTP) has refined its process for comprehensive review of pathology data and diagnoses. We have found that careful judgment can improve and simplify the review, whereas simply applying a set review procedure may not assure study quality. The use of reviewing pathologists and pathology peer review groups is a very effective way to increase study quality with minimal time and cost. New genomic technology for assessing differential gene expression is being used to predict morphological phenotypes such as necrosis, hyperplasia, and neoplasia. The challenge for pathologists is to provide uniform pathology phenotypes that can be correlated with the gene expression changes. The lessons learned in assuring data quality in standard rodent studies also apply to the emerging field of toxicogenomics.

Qualitative and quantitative analysis of nonneoplastic lesions in toxicology studies. (50/791)

A pathology report is written to convey information concerning the pathologic findings in a study. This type of report must be complete and accurate, and must communicate the relative importance of the various findings in a study. The overall quality of the report is determined by three Quality Indicators: thoroughness, accuracy, and consistency. Thoroughness is the identification of every lesion present in a particular organ or tissue, including spontaneous background lesions. Experienced pathologists familiar with background lesions may disregard certain types of lesions or establish a threshold of severity above which background lesions are diagnosed. Accuracy is the ability to make, and precisely communicate, correct diagnoses. Nomenclature of lesions is a matter of definition, and experienced pathologists generally agree as to which terms are to be used. Consistency is the uniform use of a specific term to record a defined lesion and implies that the same diagnostic criteria are followed for each type of diagnosis. The relative severity of nonneoplastic lesions can be recorded either semiquantitatively or quantitatively. Semiquantitative analysis involves the application of defined severity grades or ranges to specific lesions. Quantitative analysis (counts and measurements) can be performed manually or electronically, using image analysis and stereological techniques to provide numerical values. When both qualitative and quantitative parameters are applied in preparing a pathology report, the recorded pathology findings can be interpreted and put into perspective. This approach assures the reader that the pathology report meets the highest standards.
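The semiquantitative grading described above can be sketched in code. This is a hypothetical illustration only: the grade labels, percentage cut-offs, and background threshold are assumptions made for the example, not standardized diagnostic criteria.

```python
# Hypothetical sketch of semiquantitative severity grading.
# Grade boundaries and the background threshold are illustrative
# assumptions, not published pathology standards.

SEVERITY_GRADES = {1: "minimal", 2: "mild", 3: "moderate", 4: "marked"}

def grade_lesion(percent_tissue_affected, background_threshold=5.0):
    """Assign a severity grade from the fraction of tissue affected.

    Lesions at or below the background threshold are treated as
    spontaneous background changes and are not diagnosed (None).
    """
    if percent_tissue_affected <= background_threshold:
        return None  # below threshold: background lesion, not recorded
    if percent_tissue_affected <= 10:
        return (1, SEVERITY_GRADES[1])
    if percent_tissue_affected <= 25:
        return (2, SEVERITY_GRADES[2])
    if percent_tissue_affected <= 50:
        return (3, SEVERITY_GRADES[3])
    return (4, SEVERITY_GRADES[4])
```

A fixed mapping of this kind is one way to make the "consistency" indicator concrete: every pathologist applying the same cut-offs records the same grade for the same lesion.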

Dose-response modeling of continuous endpoints. (51/791)

A family of nested dose-response models is introduced that can be used to describe the change in any continuous endpoint as a function of dose. A member of this family may be selected using the likelihood ratio test as a criterion, to prevent overparameterization. The proposed methodology provides a formal approach to model selection and a transparent way of assessing the benchmark dose. Apart from a number of natural constraints, the model expressions follow from an obvious way of quantifying differences in sensitivity between populations. As a consequence, dose-response data relating to both sexes can be analyzed efficiently by incorporating the data from both sexes in the same analysis, even if the sexes are not equally sensitive to the compound studied. The idea of differences in sensitivity is closely related to the assessment factors used in risk assessment. Thus, the models are directly applicable to estimating such factors when data on the populations to be compared are available. Such information is valuable for further validation or adjustment of default assessment factors, as well as for informing distributional assessment factors in a probabilistic risk assessment. The various applications of the proposed methodology are illustrated with real data sets.
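The nested-model selection step can be illustrated with a minimal sketch: a constant (no-effect) model is nested in a simple exponential dose-response model, and the likelihood ratio test decides whether the extra parameter is justified. The data, the exponential form, and the grid-search fit are illustrative assumptions, not the paper's actual model family or fitting method.

```python
import math

# Illustrative sketch: likelihood ratio test between two nested models
# for a continuous endpoint. M0 (constant) is nested in
# M1: y = a * exp(b * dose). Data and model forms are hypothetical.

doses = [0.0, 1.0, 2.0, 4.0, 8.0]
resp = [10.1, 9.2, 8.3, 6.9, 4.8]  # a continuous endpoint declining with dose
n = len(resp)

def sse(pred):
    return sum((y - p) ** 2 for y, p in zip(resp, pred))

# M0: constant mean (1 parameter)
mean = sum(resp) / n
sse0 = sse([mean] * n)

# M1: y = a * exp(b * dose) (2 parameters), fitted by a coarse grid search
best = (float("inf"), None, None)
for a in [x / 10 for x in range(80, 121)]:        # a in [8.0, 12.0]
    for b in [x / 1000 for x in range(-200, 1)]:  # b in [-0.2, 0.0]
        s = sse([a * math.exp(b * d) for d in doses])
        if s < best[0]:
            best = (s, a, b)
sse1 = best[0]

# With normal errors and MLE variance, 2 * (LL1 - LL0) = n * ln(SSE0 / SSE1)
lrt = n * math.log(sse0 / sse1)
# One extra parameter: compare with the chi-square critical value 3.84 (alpha = 0.05)
prefer_m1 = lrt > 3.84
```

The test accepts the richer model only when the drop in residual error is large enough to justify the added parameter, which is how overparameterization is prevented in a nested family.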

Analysis of rodent growth data in toxicology studies. (52/791)

To evaluate compound-related effects on the growth of rodents, body weight and food consumption data are commonly collected either weekly or biweekly in toxicology studies. Body weight gain, food consumption relative to body weight, and efficiency of food utilization can be derived from body weight and food consumption for each animal in an attempt to better understand the compound-related effects. These five parameters are commonly analyzed in toxicology studies for each sex using a one-factor analysis of variance (ANOVA) at each collection point. The objective of this manuscript is to present an alternative approach to the evaluation of compound-related effects on body weight and food consumption data from both subchronic and chronic rodent toxicology studies. This approach is to perform a repeated-measures ANOVA on a selected set of parameters and analysis intervals. Compared with the standard one-factor ANOVA, this method has greater power, reduces the number of false-positive claims, and consequently provides a succinct yet comprehensive summary of the compound-related effects. Data from a mouse carcinogenicity study are included to illustrate the repeated-measures ANOVA approach to analyzing growth data, in contrast with the one-factor ANOVA approach.
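A minimal sketch of the repeated-measures idea, assuming hypothetical body-weight data with one within-subject factor (collection week). A full analysis of the kind described would also include dose group as a between-subjects factor; this sketch only shows how the within-subject structure is partitioned.

```python
# Hypothetical sketch: one-way repeated-measures ANOVA with
# collection week as the within-subject factor. Data are invented
# for illustration; a real study analysis would also model dose group.

# rows = animals (subjects), columns = weekly body weights (g)
weights = [
    [22.0, 24.1, 26.0, 27.5],
    [21.5, 23.0, 25.2, 26.8],
    [23.1, 24.8, 26.9, 28.4],
    [22.4, 24.0, 25.8, 27.1],
]
n = len(weights)      # subjects
k = len(weights[0])   # time points

grand = sum(sum(row) for row in weights) / (n * k)
time_means = [sum(weights[i][j] for i in range(n)) / n for j in range(k)]
subj_means = [sum(row) / k for row in weights]

# Partition total variation: time effect, subject effect, residual error
ss_time = n * sum((m - grand) ** 2 for m in time_means)
ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
ss_total = sum((weights[i][j] - grand) ** 2
               for i in range(n) for j in range(k))
ss_error = ss_total - ss_time - ss_subj  # subject-by-time interaction as error

df_time, df_error = k - 1, (n - 1) * (k - 1)
f_stat = (ss_time / df_time) / (ss_error / df_error)
```

Removing between-animal variation (ss_subj) from the error term is what gives the repeated-measures analysis its extra power over separate per-week one-factor ANOVAs.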

An overview of toxicogenomics. (53/791)

Toxicogenomics is a rapidly developing discipline that promises to aid scientists in understanding the molecular and cellular effects of chemicals in biological systems. This field encompasses global assessment of biological effects using technologies such as DNA microarrays or high-throughput NMR and protein expression analysis. This review provides an overview of several advancing approaches (genomic, proteomic, metabonomic) that may extend our understanding of toxicology and highlights the importance of coupling such approaches with classical toxicity studies.

Advances in the use of mass spectral libraries for forensic toxicology. (54/791)

Gas chromatography in combination with mass spectrometry (GC-MS) plays an important role in the field of analytical toxicology. The identification of unknown compounds is very frequently undertaken with GC-MS and mass spectral libraries. Currently available libraries for analytical toxicology were compared for overlap and uniqueness of their entries. Furthermore, the widely known Pfleger-Maurer-Weber drugs and pesticides library for toxicology (PMW_tox2) was used to compare the search algorithms PBM (Probability Based Matching, Agilent Technologies), INCOS (Finnigan/Thermoquest), and MassLib (Max Planck Institute). To our knowledge, direct comparisons of mass spectral libraries and search programs for analytical toxicology have not been published previously. The capabilities and requirements of modern MS technology in the field of general unknown analysis are revealed, and some of the potential pitfalls are described.
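Library searching of this kind rests on a spectral similarity score. The sketch below uses a weighted dot-product (cosine) score of the general shape that underlies INCOS-style matching; the weighting exponents and the toy spectra are illustrative assumptions, not the exact published PBM, INCOS, or MassLib algorithms.

```python
import math

# Hedged sketch of mass spectral library searching via a weighted
# dot-product (cosine) similarity. Weighting exponents and the tiny
# "library" below are illustrative assumptions.

def weighted(spectrum, m_exp=1.0, i_exp=0.5):
    """Weight each peak by mass**m_exp * intensity**i_exp.

    spectrum is a dict mapping m/z -> relative intensity.
    """
    return {mz: (mz ** m_exp) * (inten ** i_exp)
            for mz, inten in spectrum.items()}

def similarity(unknown, reference):
    """Cosine similarity between two weighted peak lists, in [0, 1]."""
    u, r = weighted(unknown), weighted(reference)
    mzs = set(u) | set(r)
    dot = sum(u.get(m, 0.0) * r.get(m, 0.0) for m in mzs)
    nu = math.sqrt(sum(v * v for v in u.values()))
    nr = math.sqrt(sum(v * v for v in r.values()))
    return dot / (nu * nr) if nu and nr else 0.0

# Toy reference library (compound -> {m/z: relative intensity})
library = {
    "compound_A": {58: 999, 77: 120, 105: 430},
    "compound_B": {44: 999, 91: 600, 120: 150},
}
unknown = {58: 950, 77: 140, 105: 400}

# Report the best-matching library entry for the unknown spectrum
best = max(library, key=lambda name: similarity(unknown, library[name]))
```

Real search programs differ mainly in how they weight peaks, handle impurity peaks, and rank candidate hits, which is precisely what a comparison such as the one described above evaluates.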

Relevance of animal experiments to humans. (55/791)

The best evidence of an adverse human health effect is a properly conducted epidemiological study. But human beings should not be the sole test animal. Properly conducted animal studies have been shown to be predictive of carcinogenic and toxicologic responses in human populations. We need to develop more efficient predictive animal tests for all the common serious toxic effects caused by chemicals. One particularly important use of epidemiological studies is to validate (or invalidate) the laboratory animal experiments. There is no more powerful tool than the combination of well-conducted animal experiments and well-conducted epidemiological studies.

Role of scientific data in health decisions. (56/791)

The distinction between reality and models or methodological assumptions is necessary for understanding the use of data, whether economic, technical, or biological, in decision-making. The traditional modes of analysis used in decisions are discussed historically and analytically. Utilitarian-based concepts such as cost-benefit analysis and cannibalistic concepts such as "acceptable risk" are rejected on logical and moral grounds. Historical reality suggests the concept of socially necessary risk, determined through the dialectic process in a democracy.