Measurement variability associated with KXRF bone lead measurement in young adults.

In vivo K X-ray fluorescence (KXRF) bone lead measurement has been used to estimate long-term lead exposure, especially in adults; relatively few studies have applied the technique to young subjects. To explore the measurement variability of KXRF bone lead measurements in young subjects, the tibiae of two male cadavers from Boston, Massachusetts, 17 and 20 years of age, were obtained for repeated bone lead measurements. Bone lead concentrations were measured over a grid of nine locations, 1 cm apart, centered at the midpoint of the tibia, with five 60-min measurements at each location. Measured concentrations ranged from < 0 to 11.8 µg Pb/g bone mineral across a tibia. Mean concentrations at the midpoint locations were 0.8 µg Pb/g bone mineral (SD = 2.5) and 2.0 µg Pb/g bone mineral (SD = 1.9) for the left and right legs of the younger subject, and 3.6 µg Pb/g bone mineral (SD = 2.6) and 6.0 µg Pb/g bone mineral (SD = 3.3) for the left and right legs of the older subject. Although bone lead concentrations did not vary significantly by measurement location within an individual leg, reported measurement uncertainty increased significantly at locations 1 cm horizontally from the center of the tibia (p < 0.0001). Bone lead concentration was symmetric between legs for the 17-year-old subject, whereas potential asymmetry between the left and right legs was suggested for the 20-year-old subject (p = 0.06). These data describe the degree of variability that may be associated with bone lead measurements of young subjects with low bone lead concentrations using a standard spot-source KXRF instrument. Because of the importance of conducting additional research on adolescent lead toxicity, further improvements to the precision of KXRF measurement are needed.
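
As a rough illustration of how a between-leg difference of the size reported here could be tested from summary statistics alone, the sketch below applies Welch's t-test to the published means and SDs. The sample size of 45 measurements per leg (nine grid locations × five repeats) is an assumption not stated explicitly in the abstract, so the resulting p-values are illustrative only and are not the authors' analysis.

```python
# Illustrative only: Welch's t-test on the reported per-leg means and SDs.
# n = 45 per leg (9 locations x 5 repeats) is an assumed sample size.
from scipy.stats import ttest_ind_from_stats

n = 45  # assumed number of measurements per leg

# Younger (17-year-old) subject: left vs. right tibia (ug Pb/g bone mineral)
t_young, p_young = ttest_ind_from_stats(
    mean1=0.8, std1=2.5, nobs1=n,
    mean2=2.0, std2=1.9, nobs2=n,
    equal_var=False,  # Welch's correction for unequal variances
)

# Older (20-year-old) subject: left vs. right tibia
t_old, p_old = ttest_ind_from_stats(
    mean1=3.6, std1=2.6, nobs1=n,
    mean2=6.0, std2=3.3, nobs2=n,
    equal_var=False,
)

print(f"17-year-old left vs. right: t = {t_young:.2f}, p = {p_young:.3f}")
print(f"20-year-old left vs. right: t = {t_old:.2f}, p = {p_old:.3f}")
```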

Long-term results of arterial allograft below-knee bypass grafts for limb salvage: a retrospective multicenter study.

PURPOSE: Arterial allografts (AAs) have recently been reconsidered for the treatment of critical limb ischemia when vein material is absent, because of the disappointing results with artificial grafts. The aim of this study was to report the results observed in three centers where AAs were used for infrainguinal reconstruction in limb-threatening ischemia. METHODS: Between 1991 and 1997, 165 AA bypass procedures were performed in 148 patients (90 male) with a mean age of 70 years (range, 20-93 years). Indications for operation were rest pain in 54 cases and tissue loss in 111 cases. Mean resting ankle pressure was 53 mm Hg in the 96 patients without diabetes, and mean transcutaneous oxygen pressure was 10 mm Hg in the 52 patients with diabetes. In 123 cases (75%), there had been at least one previous revascularization of the same limb. AAs were obtained from cadaveric donors. The distal anastomosis was to the below-knee popliteal artery in 34 cases, to a tibial artery in 114 cases, and to a pedal artery in 17 cases. RESULTS: At 30 days, the mortality rate was 3.4%, the primary patency rate was 83.3%, the secondary patency rate was 90%, and the limb salvage rate was 98%. During follow-up (mean, 31 months), 65 grafts failed primarily. Causes of primary failure were thought to be progression of distal disease in 15 cases, myointimal hyperplasia in 16 cases, graft degradation in 10 cases (four dilations, three stenoses, two ruptures, and one dissection), miscellaneous in eight cases, and unknown in 16 cases. Primary patency rates at 1, 3, and 5 years were, respectively, 48.7% ± 4%, 34.9% ± 6%, and 16.1% ± 7%. Secondary patency rates at 1, 3, and 5 years were, respectively, 59.8% ± 4%, 42.1% ± 5%, and 25.9% ± 8%. Limb salvage rates at 1, 3, and 5 years were, respectively, 83.8% ± 3%, 76.4% ± 5%, and 74.2% ± 8%. CONCLUSION: AAs provide an acceptable limb salvage rate but poor patency rates. A randomized trial comparing AAs with polytetrafluoroethylene grafts should be undertaken.
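
The patency and limb salvage figures at 1, 3, and 5 years are life-table (Kaplan-Meier) estimates with standard errors. The sketch below shows how such estimates are typically produced from follow-up data; the `durations` and `events` arrays are hypothetical placeholders, not the study data, and the `lifelines` library is an assumed tool choice.

```python
# Minimal Kaplan-Meier sketch for graft patency; the data below are
# hypothetical placeholders, not the values from this study.
import numpy as np
from lifelines import KaplanMeierFitter

# Follow-up time in months and whether the graft failed (1) or was
# censored (0) at that time.
durations = np.array([3, 8, 12, 14, 20, 26, 31, 36, 40, 52, 60])
events    = np.array([1, 1,  0,  1,  1,  0,  1,  0,  1,  0,  0])

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="primary patency")

# Estimated patency (survival probability) at 12, 36, and 60 months
for t in (12, 36, 60):
    print(f"{t} months: {kmf.predict(t):.2f}")
```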

Superficial femoral popliteal vein: An anatomic study.

OBJECTIVE: The superficial femoral popliteal vein (SFPV) has been used as an alternative conduit for both arterial and venous reconstructive surgery. Its popularity continues to grow, despite concern about the potential for venous morbidity after harvest. The purpose of this study was to determine an anatomic "safe" length of SFPV for harvest, assuming that preservation of at least one valve and one significant collateral vein in the remaining popliteal vein (PV) segment can minimize venous morbidity. METHODS: Forty-four SFPVs were harvested from 39 cadaveric specimens. The lengths of the superficial femoral vein (SFV) and PV were measured, and the number and location of valves and of significant side branches (more than 2 mm in diameter) of the PV were recorded. The two-tailed Student t test was used to compare vein lengths between the sexes. Correlation coefficients were determined for the effect of height on vein length, stratified by sex. RESULTS: Vein length (SFV mean, 24.4 ± 4 cm; PV mean, 18.8 ± 4 cm) varied with sex (male SFV mean, 28.1 ± 5 cm; male PV mean, 21.5 ± 3 cm; female SFV mean, 22.6 ± 4 cm; female PV mean, 18.4 ± 3 cm; P = .01). Valve number (mean, 1.8 ± 0.5) and location, and collateral vein number (mean, 5 ± 1.8) and location, were variable and independent of height and sex. CONCLUSION: An anatomic "safe" length of SFPV for harvest to minimize venous morbidity would include all of the SFV plus 12 cm of PV in 95% of women and 15 cm of PV in 95% of men. Male sex was a significant determinant of a longer safe length of harvestable vein.
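
The analyses described in the methods (a two-tailed Student t test for vein length by sex, and correlation of length with height within each sex) can be reproduced on raw measurements as sketched below. The arrays are hypothetical example data in centimetres, and `scipy` is an assumed tool, not one named by the authors.

```python
# Sketch of the analyses described in the methods, on hypothetical data.
import numpy as np
from scipy.stats import ttest_ind, pearsonr

# Hypothetical superficial femoral vein lengths (cm) by sex
sfv_male   = np.array([27.0, 29.5, 26.8, 31.2, 28.4])
sfv_female = np.array([21.5, 23.0, 24.1, 20.8, 22.9])

# Two-tailed Student t test comparing vein length between sexes
t_stat, p_val = ttest_ind(sfv_male, sfv_female)
print(f"SFV length by sex: t = {t_stat:.2f}, p = {p_val:.3f}")

# Correlation of vein length with donor height (cm), within one sex
height_male = np.array([178, 182, 175, 190, 180])
r, p_corr = pearsonr(height_male, sfv_male)
print(f"Height vs. SFV length (men): r = {r:.2f}, p = {p_corr:.3f}")
```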

Simultaneous pancreas-kidney transplantation and living related donor renal transplantation in patients with diabetes: is there a difference in survival?

OBJECTIVE: To compare the outcome of simultaneous pancreas-kidney transplantation (SPK) and living related donor renal transplantation (LRD) in patients with diabetes. SUMMARY BACKGROUND DATA: It remains unanswered whether diabetic patients with end-stage renal failure are better served by LRD or SPK. METHODS: Using a longitudinal database, data from all diabetic patients receiving LRD or cadaveric renal transplants or SPKs from January 1986 through January 1996 were analyzed. Patient and graft survival, early graft function, and the causes of patient and graft loss were compared for 43 HLA-identical LRDs, 87 haplotype-identical LRDs, 379 SPKs, and 296 cadaveric renal transplants. RESULTS: The demographic composition of the SPK and LRD groups was similar, but because of less strict selection criteria in the cadaveric transplant group, those patients were 10 years older, more of them received dialysis, and they had been receiving dialysis longer before transplantation. Patient survival was similar in the SPK and LRD groups but was significantly lower in the cadaveric renal transplant group. Similarly, there was no difference in graft survival between SPK and LRD recipients, but graft survival was significantly lower in the cadaveric renal transplant group. Delayed graft function was significantly more common in the cadaveric renal transplant group. Discharge creatinine, the strongest predictor of patient and graft survival, was highest in the SPK group and lowest in the HLA-identical LRD group. The rate of rejection within the first year was greatest in SPK patients (77%), intermediate in the haplotype-identical LRD and cadaveric transplant groups (57% and 48%, respectively), and lowest (16%) in the HLA-identical LRD group. Cardiovascular disease was the primary cause of death in all groups. Acute rejection, chronic rejection, and death with a functioning graft were the predominant causes of graft loss. CONCLUSIONS: This study demonstrates that there was no difference in patient or graft survival between diabetic patients receiving LRD and those receiving SPK transplants. However, graft and patient survival rates in diabetic recipients of cadaveric renal transplants were significantly lower than in the other groups.
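
As a rough illustration of how the reported first-year rejection rates could be compared across the four groups, the sketch below runs a chi-square test of independence on counts reconstructed from the stated percentages and group sizes (77% of 379 SPK, 57% of 87 haplotype-identical LRD, 48% of 296 cadaveric, 16% of 43 HLA-identical LRD). These rounded counts are an approximation for illustration only, not the study's actual analysis.

```python
# Chi-square test of independence on first-year rejection counts
# reconstructed (with rounding) from the reported percentages.
from scipy.stats import chi2_contingency

groups   = ["SPK", "haplo LRD", "cadaveric", "HLA-identical LRD"]
n        = [379, 87, 296, 43]
rejected = [round(0.77 * 379), round(0.57 * 87),
            round(0.48 * 296), round(0.16 * 43)]

# Rows: groups; columns: rejected vs. not rejected within the first year
table = [[r, total - r] for r, total in zip(rejected, n)]
for g, r, total in zip(groups, rejected, n):
    print(f"{g}: {r}/{total} rejected in first year")

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```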

Influence of evaporation and solvent mixtures on the absorption of toluene and n-butanol in human skin in vitro.

The influence of forced ventilation on the percutaneous absorption of n-butanol and toluene was studied in vitro. Human skin was exposed to the neat solvents, to binary mixtures of the two, and to ternary mixtures with chloroform:methanol. Exposure was either unventilated or ventilated at various flow rates. Under ventilated exposure, skin absorption of all solvents and solvent mixtures was markedly reduced compared with unventilated exposure. Exposure to solvent mixtures increased both the amounts of solvent absorbed and the absorption rates. Absorption of the butanol component was most affected, increasing 11-fold with toluene and 9-fold with chloroform/methanol as co-solvent. Absorption rates also varied between individuals, by a factor of 3.5 for toluene and 4.3 for n-butanol across the three skin donors used. Skin absorption of volatile organic solvents under continuously ventilated conditions is related to their volatility and to the ventilation rate. Sufficient workplace ventilation is therefore an important occupational hygiene measure, not only to reduce exposure via respiration but also to reduce absorption of volatile compounds through the skin.

Transplantation of three adult patients with one cadaveric graft: wait or innovate.

Graft shortage continues to prolong waiting times for adults requiring liver transplantation. Living related donor transplantation is possible for only a small minority of adults. The techniques for in situ splitting of the liver used for right and left hepatectomies in living donors were adapted to a combined split-liver-domino procedure to obtain right and left hemiliver grafts from a patient undergoing total hepatectomy with liver transplantation for a metabolic disorder. The two grafts were adequate in size and function for transplantation to two adults with low priority for regular cadaver grafts. More frequent use of split-liver techniques in cadaver donors could considerably reduce the graft shortage and waiting time for adult liver recipients.

Comparison of conventional surgical versus Seldinger technique emergency cricothyrotomy performed by inexperienced clinicians.

BACKGROUND: Cricothyrotomy is the ultimate option for a patient with a life-threatening airway problem. METHODS: The authors compared the first-time performance of surgical (group 1) versus Seldinger technique (group 2) cricothyrotomy in cadavers. Intensive care unit physicians (n = 20) performed each procedure on two adult human cadavers. The methods were compared with regard to ease of use and the neck anatomy of the cadaver. Times to location of the cricothyroid membrane, to tracheal puncture, and to first ventilation were recorded. Each participant was allowed only one attempt per procedure. A pathologist dissected the neck of each cadaver and assessed the correctness of tube position and any injury inflicted. Each performer rated the technique and the cadaver's anatomy on a visual analog scale from 1 (easiest) to 5 (worst). RESULTS: Age, height, and weight of the cadavers were not different. Subjective assessment of the two methods (2.2 in group 1 vs. 2.4 in group 2) and of cadaver anatomy (2.2 in group 1 vs. 2.4 in group 2) showed no statistically significant difference between the groups. Tracheal placement of the tube was achieved in 70% (n = 14) in group 1 versus 60% (n = 12) in group 2 (P value not significant). Five attempts in group 2 had to be aborted because of kinking of the guide wire. Mean ± SD times from start to location of the cricothyroid membrane were 7 ± 9 s (group 1) versus 8 ± 7 s (group 2); to tracheal puncture, 46 ± 37 s versus 30 ± 28 s; and to first ventilation, 102 ± 42 s versus 100 ± 46 s (P value not significant). CONCLUSIONS: The two methods showed equally poor performance.
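
The difference in tracheal placement success (14 of 20 with the surgical technique versus 12 of 20 with the Seldinger technique) can be checked with Fisher's exact test, as sketched below. The abstract does not state which test the authors applied, so both the test choice and the use of `scipy` are assumptions for illustration.

```python
# Fisher's exact test on tracheal placement success (14/20 vs. 12/20).
from scipy.stats import fisher_exact

#            success  failure
table = [[14, 6],   # group 1: surgical cricothyrotomy
         [12, 8]]   # group 2: Seldinger technique

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
```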

Comparison of laryngeal mask and intubating laryngeal mask insertion by the naive intubator.

Seventy-five inexperienced participants were timed inserting the laryngeal mask airway (LMA) and the intubating laryngeal mask (ILM) in one of five cadavers. Adequacy of ventilation was assessed on a three-point scale depending on chest expansion and air leak. Participants were also asked to intubate the trachea via the ILM. The ILM was inserted faster than the LMA (P < 0.05) with a greater proportion achieving adequate ventilation after their first attempt (P < 0.05). Tracheal intubation via the ILM was completed successfully by 67% (52 of 75) of participants. In a questionnaire, participants stated that the ILM was easier to use and the preferred device in an emergency. The results suggest that inexperienced practitioners should use the ILM rather than the LMA for emergency ventilation.
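
A simple way to quantify the uncertainty around the reported 67% (52 of 75) tracheal intubation success rate is a binomial confidence interval, as sketched below. The Wilson method and the `statsmodels` package are choices assumed for illustration; the abstract itself reports only the point estimate.

```python
# Wilson 95% confidence interval for the reported 52/75 intubation success.
from statsmodels.stats.proportion import proportion_confint

successes, n = 52, 75
low, high = proportion_confint(successes, n, alpha=0.05, method="wilson")
print(f"success rate = {successes / n:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```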