3.
Am Heart J ; 185: 101-109, 2017 Mar.
Article in English | MEDLINE | ID: mdl-28267463

ABSTRACT

Reducing 30-day readmissions continues to be problematic for most hospitals. This study reports the creation and validation of sex-specific inpatient (i) heart failure (HF) risk scores using electronic data from the beginning of inpatient care for effective and efficient prediction of 30-day readmission risk. METHODS: HF patients hospitalized at Intermountain Healthcare from 2005 to 2012 (derivation: n=6079; validation: n=2663) and Baylor Scott & White Health (North Region) from 2005 to 2013 (validation: n=5162) were studied. Sex-specific iHF scores were derived to predict post-hospitalization 30-day readmission using common HF laboratory measures and age. Risk scores adding social, morbidity, and treatment factors were also evaluated. RESULTS: The iHF model for females utilized potassium, bicarbonate, blood urea nitrogen, red blood cell count, white blood cell count, and mean corpuscular hemoglobin concentration; for males, components were B-type natriuretic peptide, sodium, creatinine, hematocrit, red cell distribution width, and mean platelet volume. Among females, odds ratios (OR) were OR=1.99 for iHF tertile 3 vs. 1 (95% confidence interval [CI]=1.28, 3.08) for Intermountain validation (P-trend across tertiles=0.002) and OR=1.29 (CI=1.01, 1.66) for Baylor patients (P-trend=0.049). Among males, iHF had OR=1.95 (CI=1.33, 2.85) for tertile 3 vs. 1 in Intermountain (P-trend <0.001) and OR=2.03 (CI=1.52, 2.71) in Baylor (P-trend < 0.001). Expanded models using 182-183 variables had predictive abilities similar to iHF. CONCLUSIONS: Sex-specific, laboratory-based, electronic health record-delivered iHF risk scores effectively predicted 30-day readmission among HF patients. The iHF scores are efficient to calculate and deliver to clinicians, and their recent clinical implementation suggests they are useful and usable for more precise clinical HF treatment.
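The tertile 3 vs. 1 odds-ratio comparisons above presuppose ranking patients by risk score and splitting the cohort into thirds. The abstract does not publish the iHF score coefficients, so the sketch below only illustrates the rank-based tertile assignment step for an arbitrary numeric score; it is not the study's scoring algorithm.

```python
def tertiles(scores):
    """Assign each patient's risk score to a tertile (1 = lowest third,
    3 = highest third) by rank, mirroring the tertile 3 vs. 1 comparison
    used in the abstract. Ties are broken by input order."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    out = [0] * len(scores)
    for rank, i in enumerate(ranked):
        out[i] = rank * 3 // len(scores) + 1
    return out
```

For example, `tertiles([60, 10, 50, 20, 40, 30])` returns `[3, 1, 3, 1, 2, 2]`: the two largest scores land in tertile 3, the two smallest in tertile 1.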


Subject(s)
Heart Failure/blood , Patient Readmission/statistics & numerical data , Risk Assessment/methods , Adolescent , Adrenergic beta-Antagonists/therapeutic use , Adult , Aged , Aged, 80 and over , Angiotensin Receptor Antagonists/therapeutic use , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Anticoagulants/therapeutic use , Bicarbonates/blood , Blood Urea Nitrogen , Calcium Channel Blockers/therapeutic use , Cardiotonic Agents/therapeutic use , Creatinine/blood , Diuretics/therapeutic use , Erythrocyte Count , Erythrocyte Indices , Heart Failure/drug therapy , Hematocrit , Hospitalization , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Hypoglycemic Agents/therapeutic use , Leukocyte Count , Logistic Models , Middle Aged , Multivariate Analysis , Natriuretic Peptide, Brain/blood , Odds Ratio , Platelet Aggregation Inhibitors/therapeutic use , Potassium/blood , Proportional Hazards Models , Reproducibility of Results , Sex Factors , Sodium/blood , Vasoconstrictor Agents/therapeutic use , Young Adult
4.
EGEMS (Wash DC) ; 5(3): 8, 2017 Dec 15.
Article in English | MEDLINE | ID: mdl-29881757

ABSTRACT

Current commercially available electronic medical record systems produce mainly text-based information focused on financial and regulatory performance. We combined an existing method for organizing complex computer systems, which we label activity-based design, with a proven approach for integrating clinical decision support into front-line care delivery: Care Process Models. The clinical decision support approach increased the structure of textual clinical documentation to the point where established methods for converting text into computable data (natural language processing) worked efficiently. In a simple trial involving radiology reports for examinations performed to rule out pneumonia, more than 98 percent of all documentation generated was captured as computable data. Use cases across a broad range of other physician, nursing, and physical therapy clinical applications subjectively show similar effects. The resulting system is clinically natural, puts clinicians in direct, rapid control of clinical content without information technology intermediaries, and can generate complete clinical documentation. It supports embedded secondary functions such as the generation of granular activity-based costing data and embedded generation of clinical coding (e.g., CPT, ICD-10, or SNOMED). Most important, widely available computable data has the potential to greatly improve care delivery management and outcomes.

5.
Harv Bus Rev ; 94(7-8): 102-11, 134, 2016.
Article in English | MEDLINE | ID: mdl-27526566

ABSTRACT

Recent studies suggest that at least 35% (and maybe over 50%) of all health care spending in the U.S. is wasted on inadequate, unnecessary, and inefficient care and suboptimal business processes. But efforts to get rid of that waste face a huge challenge: Under current payment methods, the providers who develop more-cost-effective approaches don't receive any of the savings. Instead, the money goes mainly to insurers. The providers, who are paid for the volume of services delivered, end up actually losing money, which undermines their finances and their ability to invest in more cost-saving innovations. To address this quandary, say two top execs from the nonprofit Intermountain Healthcare system, we need a different way to pay for health care: population-based payment. PBP gives care delivery groups a fixed per-person payment that covers all of an individual's health care services in a given year. Under it, providers benefit from the savings of all efforts to attack waste, encouraging them to do more of it. And though PBP may sound similar to the HMOs of the 1990s, there are significant twists: Payments go directly to care delivery groups, and patients' physicians, not insurance companies, assume responsibility for overseeing and managing the cost of treatment. Provider groups are also required to meet quality standards that further protect patients. By applying PBP in just part of its system, Intermountain, which serves 2 million people, has been able to chop $688 million in annual waste and bring total costs down 13%.


Subject(s)
Capitation Fee , Delivery of Health Care/economics , Efficiency, Organizational/economics , Quality Improvement , Cost Control , Delivery of Health Care/organization & administration , Delivery of Health Care/standards , Models, Organizational , United States
7.
Health Aff (Millwood) ; 32(2): 321-7, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23381525

ABSTRACT

Patient-centeredness (the idea that care should be designed around patients' needs, preferences, circumstances, and well-being) is a central tenet of health care delivery. For CEOs of health care organizations, patient-centered care is also quickly becoming a business imperative, with payments tied to performance on measures of patient satisfaction and engagement. In A CEO Checklist for High-Value Health Care, we, as executives of eleven leading health care delivery institutions, outlined ten key strategies for reducing costs and waste while improving outcomes. In this article we describe how implementation of these strategies benefits both health care organizations and patients. For example, Kaiser Permanente's Healthy Bones Program resulted in a 30 percent reduction in hip fracture rates for at-risk patients. And at Virginia Mason Health System in Seattle, nurses reorganized care patterns and increased the time they spent on direct patient care to 90 percent. Our experiences show that patient-engaged care can be delivered in ways that simultaneously improve quality and reduce costs.


Subject(s)
Cost Control/methods , Delivery of Health Care/organization & administration , Patient Participation/methods , Quality Improvement/organization & administration , Checklist , Decision Making , Delivery of Health Care/economics , Delivery of Health Care/methods , Delivery of Health Care/standards , Efficiency, Organizational , Evidence-Based Medicine/methods , Health Services Needs and Demand , Humans , Quality of Health Care/standards
8.
Jt Comm J Qual Patient Saf ; 38(9): 395-402, 2012 Sep.
Article in English | MEDLINE | ID: mdl-23002491

ABSTRACT

BACKGROUND: Emergency departments (EDs) are an important source of care for a large segment of the population of the United States. In 2009 there were more than 136 million ED visits, and more than half of hospital admissions began in the ED. Measurement and monitoring of emergency department performance have been prompted by The Joint Commission's patient flow standards. A study was conducted to correlate ED volume and other operating characteristics with performance on metrics. METHODS: A retrospective analysis of the Emergency Department Benchmarking Alliance annual ED survey data for the most recent year for which data were available (2009) was performed to explore observed patterns in ED performance relative to size and operating characteristics. The survey was based on 14.6 million ED visits in 358 hospitals across the United States, with an ED size representation (sampling) approximating that of the Emergency Medicine Network (EM Net). RESULTS: Larger EDs (with higher annual volumes) had longer lengths of stay (p < .0001), higher left-without-being-seen rates (p < .0001), and longer door-to-physician times (p < .0001), all suggesting poorer operational performance. Operating characteristics indicative of higher acuity were associated with worsened performance on metrics, and lower-acuity characteristics with improved performance. CONCLUSION: ED volume, which also correlates with many operating characteristics, is the strongest predictor of operational performance on metrics and can be used to categorize EDs for comparative analysis. Operating characteristics indicative of acuity also influence performance. The findings suggest that ED performance measures should take ED volume, acuity, and other characteristics into account and that these features have important implications for ED design, operations, and policy decisions.


Subject(s)
Efficiency, Organizational , Emergency Service, Hospital/statistics & numerical data , Workload/statistics & numerical data , Analysis of Variance , Benchmarking , Humans , Length of Stay/statistics & numerical data , Retrospective Studies , United States , Waiting Lists
9.
Health Aff (Millwood) ; 30(6): 1185-91, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21596758

ABSTRACT

It has been estimated that full implementation of the Affordable Care Act will extend coverage to thirty-two million previously uninsured Americans. However, rapidly rising health care costs could thwart that effort. Since 1988 Intermountain Healthcare has applied to health care delivery the insights of W. Edwards Deming's process management theory, which says that the best way to reduce costs is to improve quality. Intermountain achieved such quality-based savings through measuring, understanding, and managing variation among clinicians in providing care. Intermountain created data systems and management structures that increased accountability, drove improvement, and produced savings. For example, a new delivery protocol helped reduce rates of elective induced labor, unplanned cesarean sections, and admissions to newborn intensive care units. That one protocol saves an estimated $50 million in Utah each year. If applied nationally, it would save about $3.5 billion. "Organized care" along these lines may be central to the long-term success of health reform.


Subject(s)
Efficiency, Organizational/economics , Multi-Institutional Systems/economics , Quality Assurance, Health Care/methods , Total Quality Management/methods , Cesarean Section/statistics & numerical data , Cost Control/methods , Female , Humans , Infant , Intensive Care Units, Neonatal/statistics & numerical data , Labor, Induced/statistics & numerical data , Organizational Case Studies , Pregnancy , Utah
10.
Health Aff (Millwood) ; 30(4): 581-9, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21471476

ABSTRACT

Identification and measurement of adverse medical events is central to patient safety, forming a foundation for accountability, prioritizing problems to work on, generating ideas for safer care, and testing which interventions work. We compared three methods to detect adverse events in hospitalized patients, using the same patient sample set from three leading hospitals. We found that the adverse event detection methods commonly used to track patient safety in the United States today (voluntary reporting and the Agency for Healthcare Research and Quality's Patient Safety Indicators) fared very poorly compared to other methods and missed 90 percent of the adverse events. The Institute for Healthcare Improvement's Global Trigger Tool found at least ten times more confirmed, serious events than these other methods. Overall, adverse events occurred in one-third of hospital admissions. Reliance on voluntary reporting and the Patient Safety Indicators could produce misleading conclusions about the current safety of care in the US health care system and misdirect efforts to improve patient safety.


Subject(s)
Hospitals , Medical Errors/statistics & numerical data , Female , Hospital Mortality , Humans , Male , Medical Audit , Middle Aged , Quality Indicators, Health Care/statistics & numerical data , Retrospective Studies , United States/epidemiology
14.
J Hosp Med ; 4(8): 481-5, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19824097

ABSTRACT

BACKGROUND: Delays in discharges affect both efficiency and timeliness of care, two measures of quality of inpatient care. OBJECTIVE: Describe the number, length, and type of delays in hospital discharges. Characterize the impact of delays on overall length of stay (LOS) and costs. DESIGN: Prospective observational cohort study. SETTING: Tertiary-care children's hospital. PATIENTS: All children on 2 medical teams during August 2004. INTERVENTION: Two research assistants presented detailed data of patient care (from daily rounds) to 2 physicians who identified delays and classified the delay type. Discharge was identified as delayed if there was no medical reason for the patient to be in the hospital on a given day. MEASUREMENTS: Delays were classified using a validated and reliable instrument, the Delay Tool. LOS and costs were extracted from an administrative database. RESULTS: Two teams cared for 171 patients. Mean LOS and costs were 7.3 days (standard deviation [SD] 14.3) and $15,197 (SD 38,395), respectively: 22.8% of patients experienced at least 1 delay, accounting for 82 delay-related hospital days (9% of total hospital days) and $170,000 in costs (8.9% of hospital costs); 42.3% of the delays resulted from physician behavior, 21.8% were related to discharge planning, 14.1% were related to consultation, and 12.8% were related to test scheduling. CONCLUSIONS: Almost one-fourth of patients in this 1-month period could have been discharged sooner than they were. The impact of delays on LOS and costs is substantial. Interventions will need to address variations in physician criteria for discharge, more efficient discharge planning, and timely scheduling of consultation and diagnostic testing.


Subject(s)
Hospitals, Pediatric/economics , Patient Discharge/economics , Tertiary Prevention/economics , Child, Preschool , Cohort Studies , Hospitals, Pediatric/trends , Humans , Infant , Length of Stay/economics , Length of Stay/trends , Patient Discharge/trends , Prospective Studies , Tertiary Prevention/trends , Time Factors
15.
J Healthc Qual ; 31(4): 43-52; quiz 52-3, 2009.
Article in English | MEDLINE | ID: mdl-19753808

ABSTRACT

This quality improvement project was designed to improve rates of referral for colonoscopy screening in the Utah Health Research Network, University of Utah Community Clinics. The study was conducted between October 2004 and June 2007; the main intervention was a clinic workflow modification using computerized screening reminders embedded in the electronic medical record (EMR). The intervention led to sustained improvement, driven largely by the performance of two network clinics. This study demonstrates that a robust EMR with decision prompts, accompanied by clinic workflow changes and feedback to providers, can lead to sustained change in rates of colonoscopy referral.


Subject(s)
Colonoscopy , Electronic Health Records , Primary Health Care/methods , Referral and Consultation/statistics & numerical data , Reminder Systems , Humans , Quality Assurance, Health Care , Referral and Consultation/trends
16.
Pediatrics ; 123(1): 338-45, 2009 Jan.
Article in English | MEDLINE | ID: mdl-19117901

ABSTRACT

OBJECTIVE: Aspiration pneumonia is the most common cause of death in children with neurologic impairment who have gastroesophageal reflux disease. Fundoplications and gastrojejunal feeding tubes are frequently employed to prevent aspiration pneumonia in this population. Which of these approaches is more effective in preventing aspiration pneumonia and/or improving survival is unknown. The objective of this study was to compare outcomes for children with neurologic impairment and gastroesophageal reflux disease after either a first fundoplication or a first gastrojejunal feeding tube. PATIENTS AND METHODS: This was a retrospective, observational cohort study of children with neurologic impairment who had either a fundoplication or gastrojejunal feeding tube between January 1997 and December 2005 at a tertiary care children's hospital. Main outcome measures were postprocedure aspiration pneumonia-free survival and mortality. Propensity analyses were used to control for bias in treatment assignment and prognostic imbalances. RESULTS: Of the 366 children with neurologic impairment and gastroesophageal reflux disease, 43 had a first gastrojejunal feeding tube and 323 underwent a first fundoplication. Median length of follow-up was 3.4 years. Children who received a first fundoplication had similar rates of aspiration pneumonia and mortality after the procedure compared with those who had a first gastrojejunal feeding tube, when adjusting for the treatment assignment using propensity scores. CONCLUSIONS: Aspiration pneumonia and mortality are not uncommon events after either a first fundoplication or a first gastrojejunal feeding tube for the management of gastroesophageal reflux disease in children with neurologic impairment. Neither treatment option is clearly superior in preventing subsequent aspiration pneumonia or improving overall survival for these children. This complex clinical scenario needs to be studied in a prospective, multicenter, randomized controlled trial to evaluate definitively whether 1 of these 2 management options is more beneficial.


Subject(s)
Enteral Nutrition/mortality , Fundoplication/mortality , Gastroesophageal Reflux/mortality , Nervous System Diseases/mortality , Pneumonia, Aspiration/mortality , Pneumonia, Aspiration/prevention & control , Child, Preschool , Cohort Studies , Enteral Nutrition/methods , Female , Follow-Up Studies , Fundoplication/methods , Gastroesophageal Reflux/complications , Gastroesophageal Reflux/surgery , Humans , Infant , Male , Nervous System Diseases/complications , Nervous System Diseases/surgery , Pneumonia, Aspiration/surgery , Retrospective Studies , Survival Rate/trends
18.
J Hosp Med ; 2(3): 165-73, 2007 May.
Article in English | MEDLINE | ID: mdl-17549766

ABSTRACT

BACKGROUND: Children with neurological impairment (NI) commonly have gastroesophageal reflux disease (GERD) treated with a fundoplication. The impact of this procedure on quality of life is poorly understood. OBJECTIVES: To examine the quality of life of children with NI who have received a fundoplication for GERD and of their caregivers. METHODS: The study was a prospective cohort study of children with NI and GERD who underwent a fundoplication at a children's hospital between January 1, 2005, and July 7, 2006. Quality of life of the children was assessed with the Child Health Questionnaire (CHQ), and that of the caregivers with the Short Form-36 Health Survey (SF-36) and the Parenting Stress Index (PSI), both at baseline and 1 month after fundoplication. Functional status was assessed using the WeeFIM. Repeated-measures analyses were performed. RESULTS: Forty-four of the 63 parents (70%) were enrolled. The median WeeFIM score was 31.2 versus the age-normal score of 83 (P = .001). Compared with the baseline scores, mean CHQ scores improved over 1 month in the domains of bodily pain (32.8 vs. 47.5, P = .01), role limitations-physical (30.6 vs. 56.6, P = .01), mental health (62.7 vs. 70.6, P = .01), family limitation of activities (43.3 vs. 55.1, P = .03), and parental time (43.0 vs. 55.3, P = .03). The parental SF-36 domain of vitality improved from baseline over 1 month (41.3 vs. 48.2, P = .001), but there were no changes from baseline in Parenting Stress scores. CONCLUSIONS: Parents reported that the quality of life of children with NI who receive a fundoplication for GERD was improved from baseline in several domains 1 month after surgery. The quality of life and stress of caregivers showed no improvement in nearly all domains, at least in the short term.


Subject(s)
Caregivers , Fundoplication , Gastroesophageal Reflux/surgery , Nervous System Diseases/complications , Quality of Life , Case-Control Studies , Child, Preschool , Fundoplication/adverse effects , Gastroesophageal Reflux/complications , Humans , Infant , Prospective Studies , Stress, Psychological , Utah
19.
AMIA Annu Symp Proc ; : 274-8, 2007 Oct 11.
Article in English | MEDLINE | ID: mdl-18693841

ABSTRACT

The nature of clinical medicine is to focus on individuals rather than the populations from which they originate. This orientation can be problematic in the context of acute healthcare delivery during routine winter outbreaks of viral respiratory disease, where estimating an individual's likelihood of viral infection depends on knowledge of local disease incidence. The level of interest in, and perceived utility of, community and regional infection data for front-line clinicians providing acute care is unclear. Based on input from clinicians, we developed an automated analysis and reporting system that delivers pathogen-specific epidemic curves derived from a viral panel that tests for influenza, RSV, adenovirus, parainfluenza, and human metapneumovirus. Surveillance summaries were actively e-mailed to clinicians practicing in emergency, urgent, and primary care settings and posted on a web site for passive consumption. We demonstrated the feasibility and sustainability of a system that provides both timely and clinically useful surveillance information.


Subject(s)
Disease Outbreaks , Internet , Population Surveillance/methods , Respiratory Tract Infections/epidemiology , Virus Diseases/epidemiology , Adenovirus Infections, Human/epidemiology , Adult , Child , Clinical Laboratory Information Systems , Focus Groups , Humans , Influenza, Human/epidemiology , Metapneumovirus , Paramyxoviridae Infections/epidemiology , Respiratory Syncytial Virus Infections/epidemiology , Respiratory Tract Infections/diagnosis , United States , Virus Diseases/diagnosis
20.
J Thromb Thrombolysis ; 22(3): 191-7, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17111199

ABSTRACT

BACKGROUND: Warfarin has a narrow therapeutic range and wide inter-individual dosing requirements that may be related to functional variants of genes affecting warfarin metabolism (i.e., CYP2C9) and activity (i.e., vitamin K epoxide reductase complex subunit 1-VKORC1). We hypothesized that variants in these two genes explain a substantial proportion of variability in stable warfarin dose and could be used as a basis for improved dosing algorithms. METHODS: Consecutive consenting outpatients (n = 213) with stable INR (2-3) for >1 month were enrolled. Buccal DNA was extracted using a Qiagen mini-column, and CYP2C9*2 and VKORC1 genotyping was performed by the TaqMan 3' nuclease assay. For CYP2C9*3, genotyping was done by sequencing using Big Dye v3.1 terminator chemistry. Dose by genotype was assessed by linear regression. RESULTS: Weekly warfarin dose averaged 30.8 +/- 13.9 mg/week; average INR was 2.42 +/- 0.72. CYP2C9*2/*3 genotype distribution was: CC/AA (wild-type [WT]) = 71.4%, CT/AA = 18.3%, CC/AC = 9.4%, and CT/AC = 1%; VKORC1 genotypes were CC (WT) = 36.6%, CT = 50.7%, and TT = 12.7%. Warfarin doses (mg/week) varied by genotype: for CYP2C9, 33.3 mg/week for WT (CC/AA), 27.2 mg/week for CT/AA (P = 0.04 vs. WT), 23.0 mg/week for CC/AC (P = 0.003), and 6.0 mg/week for CT/AC (P < 0.001), representing dose reductions of 18-31% for single and 82% for double variant carriers; for VKORC1: 38.4 mg/week for WT (CC), 28.6 mg/week for CT (P < 0.001 vs. WT), 20.95 mg/week for TT (P < 0.001). In multiple linear regression, genotype was the dominant predictor of warfarin dose (P = 2.4 x 10(-15)); weak predictors were age, weight, and sex. Genotype-based modeling explained 33% of dose-variance, compared with 12% for clinical variables alone. CONCLUSION: In this large prospective study of warfarin genetic dose-determinants, carriage of a single or double CYP2C9 variant reduced warfarin dose by 18-72%, and carriage of a VKORC1 variant by 65%.
Genotype-based modeling explained almost one-half of dose-variance. A quantitative dosing algorithm incorporating genotypes for 2C9 and VKORC1 could substantially improve initial warfarin dose-selection and reduce related complications.
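The mean weekly doses by genotype reported in the results can be turned into a rough dose-estimation sketch. This is a hypothetical illustration, not the study's algorithm: the abstract gives only marginal mean doses per gene, so combining the two genes multiplicatively relative to wild type, scaled from the cohort mean, is an assumption made here for demonstration only.

```python
# Mean weekly warfarin doses (mg/week) by genotype, taken from the RESULTS above.
CYP2C9_DOSE = {"CC/AA": 33.3, "CT/AA": 27.2, "CC/AC": 23.0, "CT/AC": 6.0}
VKORC1_DOSE = {"CC": 38.4, "CT": 28.6, "TT": 20.95}
COHORT_MEAN = 30.8  # overall mean weekly dose reported in the abstract

def estimated_weekly_dose(cyp2c9: str, vkorc1: str) -> float:
    """Hypothetical estimate: scale the cohort mean by each gene's dose
    ratio relative to its wild type (CC/AA for CYP2C9, CC for VKORC1).
    The study itself fit a linear regression; this multiplicative
    combination of marginal means is an illustrative assumption only."""
    cyp_factor = CYP2C9_DOSE[cyp2c9] / CYP2C9_DOSE["CC/AA"]
    vkorc1_factor = VKORC1_DOSE[vkorc1] / VKORC1_DOSE["CC"]
    return round(COHORT_MEAN * cyp_factor * vkorc1_factor, 1)
```

A double wild-type patient gets the cohort mean back (30.8 mg/week), while a double variant carrier (CT/AC plus TT) would be flagged for a far lower starting dose, consistent with the large reductions reported above.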


Subject(s)
Anticoagulants/pharmacokinetics , Aryl Hydrocarbon Hydroxylases/genetics , Cytochrome P-450 Enzyme System/genetics , Mixed Function Oxygenases/genetics , Pharmacogenetics , Warfarin/pharmacokinetics , Adult , Aged , Aged, 80 and over , Anticoagulants/administration & dosage , Cytochrome P-450 CYP2C9 , Dose-Response Relationship, Drug , Female , Humans , International Normalized Ratio , Male , Middle Aged , Prospective Studies , Vitamin K Epoxide Reductases , Warfarin/administration & dosage