Vermiculite, respiratory disease, and asbestos exposure in Libby, Montana: update of a cohort mortality study. (57/185)

BACKGROUND: Vermiculite from the mine near Libby, Montana, is contaminated with tremolite asbestos and other amphibole fibers (winchite and richterite). Asbestos-contaminated Libby vermiculite was used in loose-fill attic insulation that remains in millions of homes in the United States, Canada, and other countries. OBJECTIVE: This report describes asbestos-related occupational respiratory disease mortality among workers who mined, milled, and processed the Libby vermiculite. METHODS: This historical cohort mortality study uses life table analysis methods to compare the age-adjusted mortality experience through 2001 of 1,672 Libby workers to that of white men in the U.S. population. RESULTS: Libby workers were significantly more likely to die from asbestosis [standardized mortality ratio (SMR) = 165.8; 95% confidence interval (CI), 103.9-251.1], lung cancer (SMR = 1.7; 95% CI, 1.4-2.1), cancer of the pleura (SMR = 23.3; 95% CI, 6.3-59.5), and mesothelioma. Mortality from asbestosis and lung cancer increased with increasing duration and cumulative exposure to airborne tremolite asbestos and other amphibole fibers. CONCLUSIONS: The observed dose-related increases in asbestosis and lung cancer mortality highlight the need for better understanding and control of exposures that may occur when homeowners or construction workers (including plumbers, cable installers, electricians, telephone repair personnel, and insulators) disturb loose-fill attic insulation made with asbestos-contaminated vermiculite from Libby, Montana.
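Life-table comparisons of this kind reduce to a ratio of observed to expected deaths with a Poisson confidence interval. As a minimal sketch of that calculation (the counts below are hypothetical for illustration, not the Libby cohort's data), Byar's approximation to the exact Poisson limits gives the 95% CI:

```python
import math

def smr_byar_ci(observed, expected, z=1.96):
    """Standardized mortality ratio with Byar's approximate 95% CI.

    observed: observed deaths in the cohort
    expected: deaths expected from reference-population rates
    """
    smr = observed / expected
    o = observed
    # Byar's approximation to the exact Poisson lower limit
    lower = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3 / expected
    # Upper limit uses observed + 1
    ou = o + 1
    upper = ou * (1 - 1 / (9 * ou) + z / (3 * math.sqrt(ou))) ** 3 / expected
    return smr, lower, upper

# Hypothetical counts for illustration only
smr, lo, hi = smr_byar_ci(observed=12, expected=7.0)
print(f"SMR = {smr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # -> SMR = 1.71 (95% CI 0.88-2.99)
```

An SMR whose CI excludes 1.0 (or 100, on the percentage scale used for the asbestosis estimate above) indicates significantly elevated mortality relative to the reference population.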

Evaluation of calving seasons and marketing strategies in Northern Great Plains beef enterprises. II. Retained ownership systems. (58/185)

Two bioeconomic computer models were used to evaluate calving seasons in combination with calf marketing strategies for a range-based cow-calf enterprise in the Northern Great Plains. Calving seasons studied were spring (SP, calving beginning March 15 and weaning October 31), spring with calf mortality increased by 5% (SP-IM), summer (SU, calving beginning May 15 and weaning December 31), summer with early weaning (SU-EW, calving beginning May 15 and weaning October 31), and fall (FA, calving beginning August 15 and weaning February 1). Marketing scenarios for steer calves and nonreplacement heifer calves were as follows: sold after weaning (WS), backgrounded in Montana and sold as feeder cattle (WBS), backgrounded in Montana and then fed to slaughter BW in Nebraska (WBFS), and shipped to Nebraska at weaning and fed to slaughter BW (WFS). Quarterly inflation-adjusted cattle and feedstuff prices were representative of the 1990s cattle cycle. Cumulative gross margin (CGM), the sum of ranch gross margin and net return from retained ownership, was used to compare systems. At the peak of the cattle cycle, all forms of retained ownership (WBS, WBFS, WFS) were profitable for all calving seasons, but during the descending phase, only WBS increased CGM markedly over WS for SU-EW. At the cycle valley, retained ownership was not profitable for SP and SP-IM, whereas WBFS and WFS were profitable for SU and SU-EW, and all forms of retained ownership were profitable for FA. During the ascending phase, retained ownership was profitable for all calving season-marketing combinations. Systems with the greatest CGM at each phase of the cattle cycle were FA-WFS, SP-WBS, FA-WFS, and FA-WFS at the peak, descending, valley, and ascending phases, respectively.
In beef enterprises representative of the Northern Great Plains, with a restricted grazing season and limited access to low-cost, good-quality grazeable forage, no single calving season, and no single combination of calving season and calf marketing strategy, is expected to be superior throughout the cattle cycle. Fall calving systems most often benefit from retained ownership through slaughter.

Evaluation of calving seasons and marketing strategies in Northern Great Plains beef enterprises: I. Cow-calf systems. (59/185)

A bioeconomic computer model was used to evaluate alternate calving seasons in a cow-calf enterprise under range conditions representative of the Northern Great Plains. The simulated ranch utilized a rotational breeding system based on Hereford and Angus and had a fixed forage base (4,500 animal unit months of native range, 520 t of grass hay, and 183 t of alfalfa hay). Calving seasons studied were spring (SP, beginning March 15), summer (SU, beginning May 15), and fall (FA, beginning August 15). Weaning dates were October 31, December 15, and February 1 for SP, SU, and FA, respectively. The SP system was also simulated with a 5% increase in calf mortality (SP-IM), and SU with early weaning on October 31 (SU-EW). Herd size for the fixed resource was 509, 523, 519, 560, and 609 cows exposed per year for SP, SP-IM, SU, SU-EW, and FA, respectively. Corresponding values for weight weaned per cow exposed were 206, 186, 193, 153, and 145 kg. Steer calves, nonreplacement heifer calves, and cull cows were sold at the time of weaning. Quarterly cattle and feed prices used were representative of the peak, descending, valley, and ascending phases of the 1990s cattle cycle, adjusted for inflation. Estimates of ranch gross margin (gross returns minus variable costs) were greatest for SP, followed by SP-IM, SU, SU-EW, and FA, and the ranks were consistent across phases of the cattle cycle. Differences between ranch gross margin for SP-IM and SU were small. In beef enterprises representative of the Northern Great Plains, with a restricted grazing season, limited access to low-cost, high-quality grazeable forage, and calves sold at weaning, switching from early spring to a summer or fall calving date is not expected to improve profitability. If delaying calving improves calf survival, then calving in early summer may be a competitive choice.

Levels of abnormal prion protein in deer and elk with chronic wasting disease. (60/185)

Chronic wasting disease (CWD) of deer and elk is a widespread health concern because its potential for cross-species transmission is undetermined. CWD prevalence in wild elk is much lower than its prevalence in wild deer, and whether CWD-infected deer and elk differ in ability to infect other species is unknown. Because lymphoid tissues are important in the pathogenesis of some transmissible spongiform encephalopathies such as sheep scrapie, we investigated whether CWD-affected elk and deer differ in distribution or quantity of disease-associated prion protein (PrPres) in lymphoid tissues. Immunoblot quantification of PrPres from tonsil and retropharyngeal lymph nodes showed much higher levels of PrPres in deer than in elk. This difference correlated with the natural prevalence of CWD in these species and suggested that CWD-infected deer may be more likely than elk to transmit the disease to other cervids and may have a greater potential to transmit CWD to noncervids.

Internalization of Libby amphibole asbestos and induction of oxidative stress in murine macrophages. (61/185)

The community members of Libby, MT, have experienced significant asbestos exposure and developed numerous asbestos-related diseases, including fibrosis and lung cancer, because of an asbestos-contaminated vermiculite mine near the community. The asbestos in the contaminated vermiculite has been characterized as belonging to the amphibole family of fibers. However, the pathogenic effects of these fibers have not been previously characterized. The purpose of this study was to determine the cellular consequences of Libby amphibole exposure in macrophages compared with another well-characterized amphibole fiber, crocidolite asbestos. Our results indicate that Libby asbestos fibers are internalized by macrophages and localize to the cytoplasm and cytoplasmic vacuoles, similar to crocidolite fibers. Libby asbestos fiber internalization generates a significant increase in intracellular reactive oxygen species (ROS), as determined by dichlorofluorescein diacetate and dihydroethidine fluorescence, indicating that the superoxide anion is the major contributing ROS generated by Libby asbestos. Elevated superoxide levels in macrophages exposed to Libby asbestos coincide with a significant suppression of total superoxide dismutase activity. Both Libby and crocidolite asbestos generate oxidative stress in exposed macrophages by decreasing intracellular glutathione levels. Interestingly, crocidolite asbestos, but not Libby asbestos, induces significant DNA damage in macrophages. This study provides evidence that the difference in the level of DNA damage observed between Libby and crocidolite asbestos may be a combined consequence of the distinct chemical compositions of each fiber and the activation of separate cellular pathways during asbestos exposure.

Temperature effect on tert-butyl alcohol (TBA) biodegradation kinetics in hyporheic zone soils. (62/185)

BACKGROUND: Remediation of tert-butyl alcohol (TBA) in subsurface waters should be taken into consideration at reformulated-gasoline-contaminated sites, since TBA is a biodegradation intermediate of methyl tert-butyl ether (MTBE), ethyl tert-butyl ether (ETBE), and tert-butyl formate (TBF). The effect of temperature on TBA biodegradation has not been published in the literature. METHODS: Biodegradation of [U-14C]TBA was determined using hyporheic zone soil microcosms. RESULTS: First-order mineralization rate constants of TBA at 5 degrees C, 15 degrees C, and 25 degrees C were 7.84 +/- 0.14 x 10^-3, 9.07 +/- 0.09 x 10^-3, and 15.3 +/- 0.3 x 10^-3 day^-1, respectively (or 2.86 +/- 0.05, 3.31 +/- 0.03, and 5.60 +/- 0.14 year^-1, respectively). Temperature had a statistically significant effect on the mineralization rates, which were modelled using the Arrhenius equation with a frequency factor (A) of 154 day^-1 and an activation energy (Ea) of 23,006 J/mol. CONCLUSION: Results of this study are the first to determine mineralization rates of TBA at different temperatures. The kinetic rates determined in this study can be used in groundwater fate and transport modelling of TBA at the Ronan, MT, site and provide an estimate of TBA removal at other similar shallow aquifer sites and hyporheic zones as a function of seasonal change in temperature.
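The reported Arrhenius fit can be checked by back-calculating the first-order rate constants at each study temperature, k = A exp(-Ea / (R T)). The sketch below uses the A and Ea values from the abstract and assumes R = 8.314 J/(mol*K); the half-life follows from first-order kinetics as t1/2 = ln(2)/k:

```python
import math

R = 8.314       # universal gas constant, J/(mol*K) (assumed)
A = 154.0       # frequency factor, day^-1 (from the abstract)
Ea = 23006.0    # activation energy, J/mol (from the abstract)

def arrhenius_k(temp_c):
    """First-order TBA mineralization rate constant (day^-1) at temp_c (deg C)."""
    T = temp_c + 273.15  # convert to kelvin
    return A * math.exp(-Ea / (R * T))

# Compare the fitted constants against the measured values from the abstract
for temp_c, measured in [(5, 7.84e-3), (15, 9.07e-3), (25, 15.3e-3)]:
    k = arrhenius_k(temp_c)
    half_life = math.log(2) / k  # days, for first-order decay
    print(f"{temp_c} C: k = {k:.2e} day^-1 (measured {measured:.2e}), "
          f"t1/2 = {half_life:.0f} d")
```

The back-calculated constants agree with the measured ones to within roughly 15%, consistent with the reported fit, and the associated half-lives (on the order of 50-100 days) illustrate why seasonal temperature swings matter for TBA attenuation in shallow aquifers.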

Low-level fiber-induced radiographic changes caused by Libby vermiculite: a 25-year follow-up study. (63/185)

RATIONALE: From 1921 to 1990, vermiculite ore from Libby, Montana, was shipped worldwide for commercial and residential use. A 1980 study of a manufacturing facility using Libby vermiculite was the first to demonstrate a small but significant prevalence of pleural chest radiographic changes associated with amphibole fibers contained in the ore. OBJECTIVES: This follow-up study of the original cohort evaluated the extent of radiographic changes and cumulative fiber exposure (CFE) 25 years after cessation of exposure. METHODS: From the original cohort of 513 workers, 431 (84%) were living and available for participation and exposure reconstruction. Of these, 280 (65%) completed both chest radiographs and interviews. Primary outcomes were pleural and/or interstitial changes. MEASUREMENTS AND MAIN RESULTS: Pleural and interstitial changes were demonstrated in 80 (28.7%) and 8 (2.9%) participants, respectively. Of those participants with low lifetime CFE of less than 2.21 fiber/cc-years, 42 (20%) had pleural changes. A significant (P < 0.001) exposure-response relationship of pleural changes with CFE was demonstrated, ranging from 7.1 to 54.3% from the lowest to highest exposure quartile. Removal of individuals with commercial asbestos exposure did not alter this trend. CONCLUSIONS: This study indicates that exposure within an industrial process to Libby vermiculite ore is associated with pleural thickening at low lifetime CFE levels. The propensity of the Libby amphibole fibers to dramatically increase the prevalence of pleural changes 25 years after cessation of exposure at low CFE levels is a concern in view of the wide national distribution of this ore for commercial and residential use.

Ethnic differences in BMI, weight concerns, and eating behaviors: comparison of Native American, White, and Hispanic adolescents. (64/185)

Evidence suggests that substantial proportions of adolescents, regardless of ethnicity or gender, are engaged in excessive weight control behaviors. Crago and Shisslak (2003), however, have noted that small samples and poorly validated instruments have limited the value of previous ethnic difference studies. Using the McKnight Risk Factor Survey, we compared Native American, White, and Hispanic adolescents. Native students were divided into groups with one (NA-mixed) or two (NA) Native American biological parents. Surveys were completed by 5th through 10th grade students. BMI z-scores were significantly higher for boys and girls in the NA group, and boys in this group were significantly more engaged in weight control behaviors, including purging. A higher percentage of Native and Hispanic girls preferred a larger body size. BMI was positively correlated with weight and shape concerns and with weight control behaviors, regardless of ethnicity or gender. Overweight among Native adolescents may put them at greater risk for eating problems than their White peers.