How many beds should a hospital department serve?
Departmental cost functions are constructed for selected hospital departments, using the total number of beds served in the hospital as a proxy output measure. Calculating the maxima or minima of the resulting cost functions reveals that, on average, different departments reach the extremes of their cost functions at different levels of output. A relative cost index is constructed from the parameters of the departmental cost functions, and departmental costs are compared across regions. The significance of departmental differences in optimum output is discussed with regard to the sharing of services and modified system design.
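The abstract's approach can be sketched in a few lines: if a department's average cost is approximated by a convex quadratic in beds served, the cost-minimising bed count is the vertex of the curve. This is a minimal illustration, not the paper's actual model; the coefficients below are hypothetical.

```python
# Sketch, assuming a quadratic average-cost curve c(x) = a0 + a1*x + a2*x**2
# in x = number of beds served. Coefficients are illustrative, not the paper's.

def optimal_beds(a0: float, a1: float, a2: float) -> float:
    """Vertex of a convex quadratic cost curve (requires a2 > 0)."""
    if a2 <= 0:
        raise ValueError("cost curve must be convex (a2 > 0) to have a minimum")
    return -a1 / (2 * a2)

# Hypothetical coefficients for two departments: their cost minima fall
# at different bed counts, as the abstract reports across departments.
lab_optimum = optimal_beds(120.0, -0.8, 0.002)        # 200 beds
radiology_optimum = optimal_beds(90.0, -0.3, 0.0005)  # 300 beds
```

Comparing such vertices across departments is what makes shared services attractive: no single hospital size is optimal for every department at once.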
Process reengineering: the role of a planning methodology and picture archiving and communications system team building.
The acquisition of a picture archiving and communications system (PACS) is an opportunity to reengineer business practices and should optimally consider the entire process from image acquisition to communication of results. The purpose of this presentation is to describe the PACS planning methodology used by the Department of Defense (DOD) Joint Imaging Technology Project Office (JITPO), outline the critical procedures for each phase, and review the military experience using this model. The methodology is segmented into four phases: strategic planning, clinical scenario planning, installation planning, and implementation planning. Each is further subdivided based on the specific tasks that need to be accomplished within that phase. By using this method, an institution will have clearly defined program goals, objectives, and PACS requirements before vendors are contacted. The development of an institution-specific PACS requirement should direct the process of proposal comparisons to be based on functionality and exclude unnecessary equipment. This PACS planning methodology is being used at more than eight DOD medical treatment facilities. When properly executed, this methodology facilitates a seamless transition to the electronic environment and contributes to the successful integration of the healthcare enterprise. A crucial component of this methodology is the development of a local PACS planning team to manage all aspects of the process. A plan formulated by the local team is based on input from each department that will be integrating with the PACS. Involving all users in the planning process is paramount for successful implementation.
Estimating prevalence of alcohol abuse and dependence in one general hospital: an approach to reduce sample selection bias.
Prevalence estimates of alcohol abuse or dependence in general hospitals are often limited to single wards, short data-collection periods or insufficient diagnostic procedures. The present study therefore aimed to ascertain alcohol abuse or dependence in one general hospital, to compare prevalence data across all 11 wards and 6 intake months, to establish whether screening alone is sufficient or a two-step diagnostic procedure is needed, and to determine whether information for an alcohol diagnosis on suspicion is available. A sample of 1309 medical or surgical in-patients was screened by questionnaires or by medication for withdrawal and, if screening-positive, interviewed with the alcohol section of a standardized psychiatric interview. In screening-negative patients, a diagnosis on suspicion was given if medication to treat withdrawal had been used, or if there was evidence of single criteria of alcohol dependence, somatic disorders from alcohol drinking, laboratory parameters raised on grounds of alcohol drinking, or self-reported high alcohol consumption. Of the medical and surgical in-patients, 20.7% and 16.0% respectively met criteria for alcohol abuse or dependence, with prevalence rates ranging from 11.1 to 32.9% among wards and from 11.3 to 28.7% among intake months. Of the medical department in-patients, 1.9%, and of the surgical in-patients, 2.1%, were screened as false-positive cases. In addition, 5.5% of the medical and 12.0% of the surgical patients were given a diagnosis on suspicion. It is concluded that all general wards and different intake months should be taken into account when estimating the prevalence of alcohol abuse or dependence in a general hospital.
Difference between observed and predicted length of stay as an indicator of inpatient care inefficiency.
OBJECTIVES: To evaluate the performance of the difference between observed and predicted length of stay (OLOS-PLOS) as an indicator of inefficiency of care for inpatients. SETTING: The Internal Medicine and General Surgery departments of Hermanos Ameijeiras Hospital in Havana. DESIGN AND STUDY PARTICIPANTS: Two sets of clinical histories were used for each department: one to derive the predictive equation and another to validate it. The equation was a multiple linear regression model including variables recognized as affecting length of stay. The validation group of histories was thoroughly examined and separated into two groups: (i) adequate efficiency or mild problems and (ii) inefficiencies considered moderate or severe. This classification was the gold standard for obtaining a receiver operating characteristic (ROC) curve for the indicator. RESULTS: The function explained 41% of the total variation for Internal Medicine and 70% for General Surgery. The indicator's mean difference between the two validation groups of histories was around 10 days for both departments. The areas under the ROC curve were 0.80 for Internal Medicine and 0.88 for General Surgery. Sensitivity and specificity > 0.7 for detecting inefficiencies of care are achieved with a cut-off point of 2 days for Internal Medicine and 1 day for General Surgery. CONCLUSIONS: The use of predictive equations might be quite useful for detecting efficiency problems in inpatient health care.
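The indicator described above can be sketched as two small functions: a predicted length of stay from a linear model, and a flag raised when the observed stay exceeds the prediction by more than a department-specific cut-off. The regression coefficients and predictor variables below are hypothetical placeholders, not the paper's actual equation.

```python
# Hedged sketch of the OLOS-PLOS indicator. Coefficients and predictors
# are illustrative only; the study fitted its own multiple linear
# regression per department.

def predicted_los(age: float, n_diagnoses: int, emergency: bool) -> float:
    """Predicted length of stay in days (hypothetical coefficients)."""
    return 2.0 + 0.05 * age + 1.2 * n_diagnoses + (1.5 if emergency else 0.0)

def flag_inefficiency(observed_days: float, predicted_days: float,
                      cutoff_days: float = 2.0) -> bool:
    """Flag a stay when observed exceeds predicted by more than the cut-off
    (the study found 2 days worked for Internal Medicine, 1 for Surgery)."""
    return (observed_days - predicted_days) > cutoff_days

plos = predicted_los(age=70, n_diagnoses=3, emergency=True)  # 10.6 days
flagged = flag_inefficiency(observed_days=14.0, predicted_days=plos)  # True
```

Varying `cutoff_days` and tallying flags against a gold-standard chart review is exactly what tracing the ROC curve amounts to.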
Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study.
OBJECTIVES: To reduce errors made in the interpretation of radiographs in an emergency department. DESIGN: Longitudinal study. SETTING: Hospital emergency department. INTERVENTIONS: All staff reviewed all clinically significant discrepancies at monthly meetings. A file of clinically significant errors was created; the file was used for teaching. Later a team redesigned the process. A system was developed for interpreting radiographs that would be followed regardless of the day of the week or time of day. All standard radiographs were brought directly to the emergency physician for immediate interpretation. Radiologists reviewed the films within 12 hours as a quality control measure, and if a significant misinterpretation was found patients were asked to return. MAIN OUTCOME MEASURES: Reduction in number of clinically significant errors (such as missed fractures or foreign bodies) on radiographs read in the emergency department. Data on the error rate for radiologists and the effect of the recall procedure were not available so reliability modelling was used to assess the effect of these on overall safety. RESULTS: After the initial improvements the rate of false negative errors fell from 3% (95% confidence interval 2.8% to 3.2%) to 1.2% (1.03% to 1.37%). After the processes were redesigned it fell further to 0.3% (0.26% to 0.34%). Reliability modelling showed that the number of potential adverse effects per 1000 cases fell from 19 before the improvements to 3 afterwards and unmitigated adverse effects fell from 2.2/1000 before to 0.16/1000 afterwards, assuming 95% success in calling patients back. CONCLUSION: Systems of radiograph interpretation that optimise the skills of all clinicians involved and contain reliable processes for mitigating errors can reduce error rates substantially.
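The reliability modelling described above combines independent layers of defence: errors that slip past the emergency physician may still be caught at radiologist review, and caught errors are only mitigated if the patient is successfully recalled. A rough sketch of that arithmetic follows; the catch rate below is an illustrative assumption, not a figure from the study, so the outputs will not exactly reproduce the published 2.2/1000 and 0.16/1000.

```python
# Hedged sketch of layered-defence reliability arithmetic. The radiologist
# catch rate (0.95) is an assumed placeholder; only the ED error rates and
# the 95% recall success come from the abstract.

def unmitigated_per_1000(ed_error_rate: float,
                         radiologist_catch_rate: float,
                         recall_success: float) -> float:
    """Adverse effects per 1000 cases that survive every layer of defence."""
    missed_entirely = ed_error_rate * (1 - radiologist_catch_rate)
    caught = ed_error_rate * radiologist_catch_rate
    failed_recall = caught * (1 - recall_success)
    return 1000 * (missed_entirely + failed_recall)

before = unmitigated_per_1000(0.019, 0.95, 0.95)  # about 1.9 per 1000
after = unmitigated_per_1000(0.003, 0.95, 0.95)   # well under 0.5 per 1000
```

The multiplicative structure is the point: halving the front-line error rate and adding a reliable recall step each shrink the residual risk, and together they compound.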
Molecular epidemiology of an outbreak due to IRT-2 beta-lactamase-producing strains of Klebsiella pneumoniae in a geriatric department.
In February 1998, 195 patients in the geriatric department of a French hospital were screened for the presence of co-amoxiclav-resistant Klebsiella pneumoniae. All eleven co-amoxiclav-resistant isolates obtained produced an identical IRT-2 beta-lactamase. These K. pneumoniae isolates were clonally related and harboured a c. 55 kb non-conjugative plasmid encoding a non-class-1 integron-located blaIRT-2 gene. This study underlines that geriatric departments may serve as a reservoir for antibiotic-resistant strains and that IRT beta-lactamase-producing strains may be nosocomial pathogens.
Feasibility of hospital-based blood banking: a Tanzanian case study.
The demand for blood transfusion is high in sub-Saharan Africa because of the high prevalence of anaemia and pregnancy-related complications, but the practice is estimated to account for 10% of HIV infections in some regions. The main response to this problem by the international donor community is to establish vertically implemented blood transfusion services producing suitable (safe) blood at a cost of US$25-40 per unit. However, the economic sustainability of such interventions is questionable, and it is argued here that hospital-based blood transfusion services operating at a basic adequate level are sufficient for low-income African countries. The results of a project aimed at improving such services in Tanzania are presented. The main findings are: (1) the cost per suitable blood unit produced was US$12.4; (2) at an HIV test sensitivity of 93.5% during the study period, discounted financial benefits of the interventions exceeded costs by a factor of between 17.2 and 37.1; (3) the cost per undiscounted year of life saved by these interventions was US$2.7-2.8; and (4) safe blood transfusion practices can be assured at an annual cost of US$0.07 per capita. Recommendations are made to ensure safe blood transfusion practices at hospital-based blood banks in Tanzania.
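The study's headline ratios reduce to simple back-of-envelope arithmetic, sketched below. All inputs here are illustrative placeholders, not the study's full economic model, which discounted benefits over time.

```python
# Hedged back-of-envelope versions of the cost-effectiveness ratios.
# Input figures are hypothetical; the study's own model is richer
# (discounting, HIV test sensitivity, infections averted over time).

def benefit_cost_ratio(units: int, cost_per_unit: float,
                       infections_averted: float,
                       benefit_per_infection: float) -> float:
    """Monetised benefits of averted infections over production cost."""
    return (infections_averted * benefit_per_infection) / (units * cost_per_unit)

def annual_cost_per_capita(annual_programme_cost: float, population: int) -> float:
    """Programme cost spread over the served population."""
    return annual_programme_cost / population

ratio = benefit_cost_ratio(1000, 12.4, 10, 25_000)
per_capita = annual_cost_per_capita(700_000, 10_000_000)  # US$0.07
```

Framing the result as cost per capita, as the abstract does, is what makes the affordability argument: US$0.07 per person per year is within reach of a low-income health budget.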
Warfarin for stroke prevention still underused in atrial fibrillation: patterns of omission.
BACKGROUND AND PURPOSE: The value of warfarin in preventing stroke in patients with chronic atrial fibrillation is well established. However, the prevalence of such treatment generally lags behind actual requirements. The aim of this study was to evaluate doctor- and/or patient-related demographic, clinical, and echocardiographic factors that influence the decision to treat with warfarin. METHODS: Between 1990 and 1998, 1027 patients were discharged with chronic or persistent atrial fibrillation. This population comprised (1) patients with cardiac prosthetic valves (n=48), (2) those with increased bleeding risks (n=152), (3) physically or mentally handicapped patients (n=317), and (4) the remaining 510 patients, the main study group, who were subjected to thorough statistical analysis to determine factors influencing warfarin use. RESULTS: The respective rates of warfarin use on discharge in the 4 groups were 93.7%, 30.9%, 17.03%, and 59.4% (P=0.001); of the latter, an additional 28.7% were discharged on aspirin. In the main study group, warfarin treatment rates increased with each consecutive triennial period (29.7%, 53.6%, and 77.1%, respectively; P=0.001). Age >80 years, poor command of Hebrew, and being hospitalized in a given medical department emerged as independent variables negatively influencing warfarin use: P=0.0001, OR 0.30 (95% CI 0.17 to 0.55); P=0.02, OR 0.59 (95% CI 0.36 to 0.94); and P=0.0002, OR 0.26 (95% CI 0.12 to 0.52), respectively. In contrast, a past history of stroke and the availability of echocardiographic information, regardless of the findings, each increased warfarin use (P=0.03, OR 1.95 [95% CI 1.04 to 3.68], and P=0.0001, OR 3.52 [95% CI 2.16 to 5.72], respectively). CONCLUSIONS: Old age, language difficulties, insufficient doctor alertness to warfarin benefit, and patient disability produced reluctance to treat. Warfarin use still lags behind requirements.
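The odds ratios and confidence intervals reported above come from multivariate analysis. As a hedged illustration of where such numbers come from (the coefficient and standard error below are hypothetical, chosen only to land near the reported OR 0.30, CI 0.17 to 0.55 for age >80), a logistic-regression coefficient converts to an odds ratio and Wald 95% CI by exponentiation:

```python
# Sketch of converting a logistic-regression coefficient (beta) and its
# standard error (se) into an odds ratio with a Wald 95% CI.
# beta = -1.20, se = 0.30 below are hypothetical illustration values.
import math

def odds_ratio_ci(beta: float, se: float) -> tuple[float, float, float]:
    """Return (OR, CI lower, CI upper) from a log-odds coefficient."""
    return (math.exp(beta),
            math.exp(beta - 1.96 * se),
            math.exp(beta + 1.96 * se))

or_, lo, hi = odds_ratio_ci(-1.20, 0.30)  # OR ≈ 0.30, CI ≈ (0.17, 0.54)
```

An OR below 1 with a CI that excludes 1, as for the age and language variables here, is what justifies calling them independent negative influences on warfarin use.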