Results 1 - 20 of 80
1.
J Am Med Inform Assoc ; 30(1): 178-194, 2022 Dec 13.
Article in English | MEDLINE | ID: mdl-36125018

ABSTRACT

How to deliver best care in various clinical settings remains a vexing problem. Not all pertinent healthcare-related questions have been, can be, or will be addressed with costly, time- and resource-consuming controlled clinical trials. At present, evidence-based guidelines can address only a small fraction of the types of care that clinicians deliver. Furthermore, underserved areas rarely can access state-of-the-art evidence-based guidelines in real time, and often lack the wherewithal to implement advanced guidelines. Care providers in such settings frequently do not have sufficient training to undertake advanced guideline implementation. Nevertheless, in advanced modern healthcare delivery environments, use of eActions (validated clinical decision support systems) could help overcome the cognitive limitations of overburdened clinicians. Widespread use of eActions will require surmounting current healthcare technical and cultural barriers and installing clinical evidence/data curation systems. The authors expect that increased numbers of evidence-based guidelines will result from future comparative effectiveness clinical research carried out during routine healthcare delivery within learning healthcare systems.


Subject(s)
Decision Support Systems, Clinical; Delivery of Health Care; Computers
2.
Appl Clin Inform ; 12(1): 179-181, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33638138
3.
Am Heart J ; 219: 78-88, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31739181

ABSTRACT

OBJECTIVE: This study evaluated a risk score-guided multidisciplinary team-based care process (MTCP), delivered using augmented intelligence clinical decision tools, for its effect on 30-day readmission and 30-day mortality among heart failure (HF) patients across 20 Intermountain Healthcare hospitals. BACKGROUND: HF inpatient care and 30-day post-discharge management require quality improvement to impact patient health, optimize utilization, and avoid readmissions. METHODS: HF inpatients (N = 6182) were studied from January 2013 to November 2016. In February 2014, patients began receiving care via the MTCP based on a phased implementation in which the 8 largest Intermountain hospitals (accounting for 89.8% of HF inpatients) were sequentially crossed over in a stepped manner from control to MTCP over 2.5 years. After implementation, patient risk scores were calculated within 24 hours of admission and delivered electronically to clinicians. High-risk patients received MTCP care (n = 1221), while lower-risk patients received standard HF care (n = 1220). Controls had their readmission and mortality scores calculated retrospectively (high risk: n = 1791; lower risk: n = 1950). RESULTS: High-risk MTCP recipients had 21% lower 30-day readmission compared to high-risk controls (adjusted P = .013, HR = 0.79, CI = 0.66-0.95) and 52% lower 30-day mortality (adjusted P < .001, HR = 0.48, CI = 0.33-0.69). Lower-risk patients did not experience increased readmission (adjusted HR = 0.88, P = .19) or mortality (adjusted HR = 0.88, P = .61). Some utilization, such as prescription of home health services, was higher for MTCP recipients, with no changes in length of stay or overall costs. CONCLUSIONS: A risk score-guided MTCP was associated with lower 30-day readmission and 30-day mortality in high-risk HF inpatients. Further evaluation of this clinical management approach is required.


Subject(s)
Heart Failure/mortality; Heart Failure/therapy; Patient Care Team; Patient Readmission/statistics & numerical data; Aged; Cause of Death; Cross-Over Studies; Decision Support Techniques; Female; Humans; Inpatients; Male; Patient Readmission/economics; Precision Medicine; Quality Improvement; Risk Assessment; Time Factors
4.
Res Pract Thromb Haemost ; 3(3): 340-348, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31294320

ABSTRACT

BACKGROUND: Upper extremity deep vein thrombosis (UEDVT) constitutes approximately 10% of all deep vein thromboses (DVTs). The incidence of UEDVT is increasing in association with use of peripherally inserted central venous catheters. Treatment for UEDVT is derived largely from evidence for treatment of lower extremity DVT. Limited evidence exists for the use of a direct oral anticoagulant for the treatment of UEDVT. POPULATION: Sequential patients identified within the Intermountain Healthcare and University of Utah healthcare systems with symptomatic UEDVT, defined as the formation of thrombus within the internal jugular, subclavian, axillary, brachial, ulnar, or radial veins of the arm. INTERVENTION: Apixaban 10 mg PO twice daily for 7 days, followed by apixaban 5 mg twice daily for 11 weeks. COMPARISON: The rate of recurrent clinically overt objective venous thromboembolism (VTE) and VTE-related death reported in the historical literature. If the confidence interval for the observed rate excludes the threshold event rate of 4%, we will conclude that treatment with apixaban is noninferior and therefore a clinically valid approach to treat UEDVT. SAMPLE SIZE: We selected a sample size of 375 patients so that an exact 95% confidence interval would exclude an event rate of VTE in the observation cohort of 4%. OUTCOME: Ninety-day rate of new or recurrent objectively confirmed symptomatic venous thrombosis and VTE-related death. The primary safety outcome is the composite of major and clinically relevant nonmajor bleeding.

5.
Res Pract Thromb Haemost ; 2(3): 481-489, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30046752

ABSTRACT

BACKGROUND: Venous thromboembolism prophylaxis remains underutilized in hospitalized medical patients at high risk for venous thromboembolism. We previously reported that a multifaceted intervention was associated with a sustained increase in appropriate thromboprophylaxis and reduced symptomatic venous thromboembolism among medical patients hospitalized in two urban teaching hospitals. The effectiveness of this intervention in community hospitals is unknown. METHODS: We performed a prospective multicenter cohort study in three community hospitals. All medical patients admitted from February 1, 2011 to January 31, 2014 were eligible. Consecutive eligible patients were enrolled into the 12-month "control," 12-month "intervention," or 12-month "maintenance" group. We provided electronic alerts, physician performance feedback, and targeted medical education for the intervention group. Only the alert component of the intervention continued in the maintenance group. The primary outcome was the rate of appropriate thromboprophylaxis among patients at high risk for venous thromboembolism, defined as the prescription of guideline-recommended chemoprophylaxis or identification of a chemoprophylaxis contraindication. Secondary outcomes included rates of symptomatic venous thromboembolism, major bleeding, all-cause mortality, heparin-induced thrombocytopenia, physician satisfaction, and alert fatigue. RESULTS: Appropriate thromboprophylaxis was higher in the intervention group (85%) and the maintenance group (77%) than in the control group (67%; P < .001 for each comparison). A reduction of 90-day symptomatic venous thromboembolism accompanied the intervention (control 4.5%, intervention 3.4%, maintenance 3.0%, P = .04). CONCLUSIONS: This multifaceted intervention was associated with an overall increase in appropriate thromboprophylaxis of medical patients compared with the control period. Hospital-associated venous thrombosis rates decreased.

6.
Clin Infect Dis ; 67(4): 525-532, 2018 Aug 1.
Article in English | MEDLINE | ID: mdl-29790913

ABSTRACT

Background: Studies on the implementation of antibiotic stewardship programs (ASPs) in small hospitals are limited. Accreditation organizations now require all hospitals to have ASPs. Methods: The objective of this cluster-randomized intervention study was to assess the effectiveness of implementing ASPs in Intermountain Healthcare's 15 small hospitals. Each hospital was randomized to 1 of 3 ASPs of escalating intensity. Program 1 hospitals were provided basic antibiotic stewardship education and tools, access to an infectious disease hotline, and antibiotic utilization data. Program 2 hospitals received those interventions plus advanced education, audit and feedback for select antibiotics, and locally controlled antibiotic restrictions. Program 3 hospitals received program 2 interventions plus audit and feedback on the majority of antibiotics, and an infectious diseases-trained clinician approved restricted antibiotics and reviewed microbiology results. Changes in total and broad-spectrum antibiotic use within programs (intervention versus baseline) and the difference between programs in the magnitude of change in antibiotic use (eg, program 3 vs 1) were evaluated with mixed models. Results: Program 3 hospitals showed reductions in total (rate ratio, 0.89; confidence interval, .80-.99) and broad-spectrum (0.76; .63-.91) antibiotic use when the intervention period was compared with the baseline period. Program 1 and 2 hospitals did not experience a reduction in antibiotic use. Comparison of the magnitude of effects between programs showed a similar trend favoring program 3, but this was not statistically significant. Conclusions: Only the most intensive ASP intervention was associated with reduction in total and broad-spectrum antibiotic use when compared with baseline. Clinical Trials Registration: NCT03245879.


Subject(s)
Anti-Bacterial Agents/therapeutic use; Antimicrobial Stewardship/organization & administration; Health Plan Implementation; Hospitals, Community; Ambulatory Care Facilities; Cluster Analysis; Idaho; Utah
7.
Am J Infect Control ; 46(10): 1084-1091, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29778437

ABSTRACT

BACKGROUND: Clinical decision support (CDS) systems can help investigators use best practices when responding to outbreaks, but variation in guidelines between jurisdictions can make such systems hard to develop and implement. This study aimed to identify (1) the extent to which state-level guidelines adhere to national recommendations for norovirus outbreak response in health care settings and (2) the impact of variation between states on outbreak outcomes. METHODS: State guidelines were obtained from Internet searches and direct contact with state public health officials in early 2016. Outcomes from norovirus outbreaks that occurred in 2015 were compared using data from the National Outbreak Reporting System. RESULTS: Guidelines were obtained from 41 of 45 (91%) state health departments that responded to queries or had guidelines available on their Web sites. Most state guidelines addressed each of the national recommendations, but specific guidance varied considerably. For example, among 36 states with guidance on numbers of stool specimens to collect, there were 21 different recommendations. Furthermore, having guidelines consistent with national recommendations was associated with fewer outbreaks reported and more outbreaks with confirmed etiology. CONCLUSIONS: This study identified substantial variation in state health care-associated norovirus outbreak response guidelines, which must be considered when developing related CDS systems. More research is needed to understand why this variation exists, how it impacts outbreak outcomes, and where improvements in evidence-based recommendations and communication of national guidance are needed.


Subject(s)
Caliciviridae Infections/epidemiology; Caliciviridae Infections/virology; Disease Outbreaks; Guidelines as Topic; Population Surveillance; Humans; Norovirus; United States/epidemiology
8.
J Card Surg ; 33(4): 163-170, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29569750

ABSTRACT

BACKGROUND: Reducing preventable hospital readmissions after coronary artery bypass graft (CABG) surgery has become a national priority. Predictive models can be used to identify patients at high risk for readmission. However, the majority of existing models are based on data available at discharge. We sought to develop a model to predict hospital readmission using data available soon after admission for isolated CABG surgery. METHODS: Fifty risk factors were included in a bivariate analysis, 16 of which were significantly associated (P < 0.05) with readmissions; these were entered into a multivariate logistic regression and removed stepwise using backward elimination procedures. The derived model was then validated on 896 prospective isolated CABG cases. RESULTS: Of 2589 isolated CABG patients identified between December 1, 2010, and June 30, 2014, 237 (9.15%) were readmitted within 30 days. Five risk factors were predictive of 30-day all-cause readmission: age (odds ratio [OR] = 1.03; 95% confidence interval [CI]: 1.01-1.05; P = 0.004), prior heart failure (OR = 1.55; 95% CI: 1.07-2.24; P = 0.020), total albumin prior to surgery (OR = 0.68; 95% CI: 0.05-0.94; P = 0.021), previous myocardial infarction (OR = 1.44; 95% CI: 1.00-2.08; P = 0.050), and history of diabetes (OR = 1.54; 95% CI: 1.09-2.19; P = 0.015). The area under the curve c-statistic was 0.63 in the derivation sample and 0.65 in the validation sample, showing good discrimination. CONCLUSIONS: Thirty-day all-cause readmission among isolated CABG patients can be predicted soon after admission with a small number of risk factors.


Subject(s)
Coronary Artery Bypass; Patient Admission; Patient Readmission/statistics & numerical data; Risk Factors; Aged; Albumins; Confidence Intervals; Diabetes Mellitus; Female; Forecasting; Heart Failure; Humans; Logistic Models; Male; Middle Aged; Models, Statistical; Multivariate Analysis; Myocardial Infarction; Risk; Time Factors
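The readmission model above is a standard multivariable logistic regression. As a rough sketch of how published odds ratios map to a predicted probability, under the loud assumption of a hypothetical intercept (the abstract does not report one, so only relative effects are meaningful here):

```python
import math

# Adjusted odds ratios reported in the abstract.
ODDS_RATIOS = {"age": 1.03, "prior_hf": 1.55, "albumin": 0.68,
               "prior_mi": 1.44, "diabetes": 1.54}
INTERCEPT = -3.0  # hypothetical placeholder, for illustration only

def readmission_probability(age, prior_hf, albumin, prior_mi, diabetes):
    """Predicted 30-day readmission probability from the logistic form
    logit(p) = intercept + sum(ln(OR_i) * x_i)."""
    predictors = {"age": age, "prior_hf": prior_hf, "albumin": albumin,
                  "prior_mi": prior_mi, "diabetes": diabetes}
    logit = INTERCEPT + sum(math.log(ODDS_RATIOS[k]) * x
                            for k, x in predictors.items())
    return 1.0 / (1.0 + math.exp(-logit))
```

With the assumed intercept, a 70-year-old with prior heart failure, diabetes, and an albumin of 3.5 scores roughly 0.2; the absolute risks are not meaningful without the true intercept, but higher-risk profiles correctly score higher.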
9.
Chest ; 153(5): 1153-1159, 2018 May.
Article in English | MEDLINE | ID: mdl-29154971

ABSTRACT

BACKGROUND: Guidelines suggest anticoagulation of patients with high pretest probability of pulmonary embolism (PE) while awaiting diagnostic test results (preemptive anticoagulation). Data relevant to the practice of preemptive anticoagulation are not available. METHODS: We reviewed 3,500 consecutive patients who underwent CT pulmonary angiography (CTPA) at two EDs. We classified the pretest probability for PE using the revised Geneva Score (RGS) as low (RGS 0-3), intermediate (RGS 4-10), or high (RGS 11-18). We classified patients with a high pretest probability of PE as receiving preemptive anticoagulation if therapeutic anticoagulation was given before CTPA completion. Patients with a high bleeding risk and those receiving treatment for DVT before CTPA were excluded from the preemptive anticoagulation analysis. We compared the time elapsed between ED registration and CTPA completion for patients with a low, intermediate, and high pretest probability for PE. RESULTS: We excluded three of 3,500 patients because CTPA preceded ED registration. Of the remaining 3,497 patients, 167 (4.8%) had a high pretest probability for PE. After excluding 29 patients for high bleeding risk and 21 patients who were treated for DVT prior to CTPA, only two of 117 patients (1.7%) with a high pretest probability for PE received preemptive anticoagulation. Furthermore, 37 of the remaining 115 patients (32%) with a high pretest probability for PE had a preexisting indication for anticoagulation but did not receive preemptive anticoagulation. The time from ED registration to CTPA completion did not differ based on the pretest probability of PE. CONCLUSIONS: Physicians rarely use preemptive anticoagulation in patients with a high pretest probability for PE. Clinicians do not expedite CTPA examinations for patients with a high pretest probability for PE.


Subject(s)
Anticoagulants/therapeutic use; Guideline Adherence; Pulmonary Embolism/etiology; Pulmonary Embolism/prevention & control; Adult; Aged; Computed Tomography Angiography; Female; Humans; Male; Middle Aged; Patient Selection; Practice Guidelines as Topic; Practice Patterns, Physicians'; Probability; Pulmonary Embolism/diagnostic imaging; Retrospective Studies
10.
J Card Fail ; 23(10): 719-726, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28821391

ABSTRACT

BACKGROUND: Patients who need and receive timely advanced heart failure (HF) therapies have better long-term survival. However, many of these patients are not identified and referred as soon as they should be. METHODS: A clinical decision support (CDS) application sent secure email notifications to HF patients' providers when the patients transitioned to advanced disease. Patients identified with CDS in 2015 were compared with control patients from 2013 to 2014. Kaplan-Meier methods and Cox regression were used in this intention-to-treat analysis to compare differences in visits to specialized heart facilities and in survival. RESULTS: Intervention patients were referred to specialized heart facilities significantly more often within 30 days (57% vs 34%; P < .001), 60 days (69% vs 44%; P < .0001), 90 days (73% vs 49%; P < .0001), and 180 days (79% vs 58%; P < .0001). Age and sex did not predict heart facility visits, but renal disease did, and nonwhite patients were less likely to visit specialized heart facilities. Significantly more intervention patients were alive at 30 (95% vs 92%; P = .036), 60 (95% vs 90%; P = .0013), 90 (94% vs 87%; P = .0002), and 180 days (92% vs 84%; P = .0001). Age, sex, and some comorbid diseases were also predictors of mortality, but race was not. CONCLUSIONS: We found that CDS can facilitate the early identification of patients needing advanced HF therapy and that its use was associated with significantly more patients visiting specialized heart facilities and longer survival.


Subject(s)
Decision Support Systems, Clinical/standards; Heart Failure/diagnostic imaging; Heart Failure/therapy; Patient Selection; Referral and Consultation/standards; Aged; Decision Support Systems, Clinical/trends; Female; Humans; Male; Middle Aged; Referral and Consultation/trends; Retrospective Studies
12.
Appl Clin Inform ; 8(2): 651-659, 2017 Jun 20.
Article in English | MEDLINE | ID: mdl-28636063

ABSTRACT

BACKGROUND: In the summer of 2016 an international group of biomedical and health informatics faculty and graduate students gathered for the 16th meeting of the International Partnership in Health Informatics Education (IPHIE) masterclass at the University of Utah campus in Salt Lake City, Utah. This international biomedical and health informatics workshop was created to share knowledge and explore issues in biomedical health informatics (BHI). OBJECTIVE: The goal of this paper is to summarize the discussions of biomedical and health informatics graduate students who were asked to define interoperability, and make critical observations to gather insight on how to improve biomedical education. METHODS: Students were assigned to one of four groups and asked to define interoperability and explore potential solutions to current problems of interoperability in health care. RESULTS: We summarize here the student reports on the importance and possible solutions to the "interoperability problem" in biomedical informatics. Reports are provided from each of the four groups of highly qualified graduate students from leading BHI programs in the US, Europe and Asia. CONCLUSION: International workshops such as IPHIE provide a unique opportunity for graduate student learning and knowledge sharing. BHI faculty are encouraged to incorporate into their curriculum opportunities to exercise and strengthen student critical thinking to prepare our students for solving health informatics problems in the future.


Subject(s)
Internationality; Medical Informatics/education; Students, Medical/psychology; Humans
13.
Clin Infect Dis ; 63(10): 1273-1280, 2016 Nov 15.
Article in English | MEDLINE | ID: mdl-27694483

ABSTRACT

BACKGROUND: Antibiotic use and misuse is driving drug resistance. Much of US healthcare takes place in small community hospitals (SCHs); 70% of all US hospitals have <200 beds. Antibiotic use in SCHs is poorly described. We evaluated antibiotic use using data from the Centers for Disease Control and Prevention's National Healthcare Safety Network antimicrobial use option. METHODS: We used Intermountain Healthcare's monthly antibiotic use reports for 19 hospitals from 2011 to 2013. Hospital care units were categorized as intensive care, medical/surgical, pediatric, or miscellaneous. Antibiotics were categorized based on spectrum of coverage. Antibiotic use rates, expressed as days of therapy per 1000 patient-days (DOT/1000PD), were calculated for each SCH and compared with rates in large community hospitals (LCHs). Negative-binomial regression was used to relate antibiotic use to predictor variables. RESULTS: Total antibiotic use rates varied widely across the 15 SCHs (median, 436 DOT/1000PD; range, 134-671 DOT/1000PD) and were similar to rates in 4 LCHs (509 DOT/1000PD; 406-597 DOT/1000PD). The proportion of patient-days spent in the respective unit types varied substantially within SCHs and had a large impact on facility-level rates. Broad-spectrum antibiotics accounted for 26% of use in SCHs (range, 8%-36%), similar to the proportion in LCHs (32%; range, 26%-37%). Case mix index, proportion of patient-days in specific unit types, and season were significant predictors of antibiotic use. CONCLUSIONS: There is substantial variation in patterns of antibiotic use among SCHs. Overall usage in SCHs is similar to usage in LCHs. Small hospitals need to become a focus of stewardship efforts.


Subject(s)
Anti-Bacterial Agents/therapeutic use; Drug Utilization/statistics & numerical data; Hospitals, Community/statistics & numerical data; Humans; Idaho/epidemiology; Utah/epidemiology
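The usage rate above, days of therapy per 1000 patient-days (DOT/1000PD), is a simple normalization; a minimal sketch (the counts below are illustrative, not from the study):

```python
def dot_per_1000_patient_days(days_of_therapy, patient_days):
    """Antibiotic use rate: days of therapy per 1000 patient-days.
    Each antibiotic a patient receives on a calendar day counts as
    one day of therapy."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 1000.0 * days_of_therapy / patient_days

# Illustrative: 872 days of therapy over 2000 patient-days
rate = dot_per_1000_patient_days(872, 2000)  # 436.0 DOT/1000PD
```

Normalizing by patient-days is what makes facility-level rates comparable between small and large hospitals despite very different census sizes.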
14.
Am J Med ; 129(10): 1124.e17-26, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27288858

ABSTRACT

BACKGROUND: Venous thromboembolism chemoprophylaxis remains underutilized in hospitalized medical patients at high risk for venous thromboembolism. We assessed the effect of a health care quality-improvement initiative comprising a targeted electronic alert, comparative practitioner metrics, and practitioner-specific continuing medical education on the rate of appropriate venous thromboembolism chemoprophylaxis provided to medical inpatients at high risk for venous thromboembolism. METHODS: We performed a multicenter prospective observational cohort study in an urban Utah hospital system. All medical patients admitted to 1 of 2 participating hospitals from April 1, 2010 to December 31, 2012 were eligible. Patients were members of the "control" (April 1, 2010 to December 31, 2010), "intervention" (January 1, 2011 to December 31, 2011), or "subsequent year" (January 1, 2012 to December 31, 2012) group. The primary outcome was the rate of appropriate chemoprophylaxis among patients at high risk for venous thromboembolism. Secondary outcomes included rates of symptomatic venous thromboembolism, major bleeding, all-cause mortality, heparin-induced thrombocytopenia, physician satisfaction, and alert fatigue. RESULTS: The rate of appropriate chemoprophylaxis among patients at high risk for venous thromboembolism increased (66.1% control period vs 81.0% intervention period vs 88.1% subsequent year; P <.001 for each comparison). A significant reduction of 90-day symptomatic venous thromboembolism accompanied the quality initiative (9.3% control period, 9.7% intervention period, 6.7% subsequent year; P = .009); 30-day venous thromboembolism rates also significantly decreased. CONCLUSIONS: A multifaceted intervention was associated with increased appropriate venous thromboembolism chemoprophylaxis among medical inpatients at high risk for venous thromboembolism and reduced symptomatic venous thromboembolism. The effect of the intervention was sustained.


Subject(s)
Anticoagulants/therapeutic use; Chemoprevention/statistics & numerical data; Education, Medical, Continuing/methods; Heparin/therapeutic use; Medical Order Entry Systems; Quality Improvement; Venous Thromboembolism/prevention & control; Adult; Aged; Aged, 80 and over; Female; Hemorrhage/chemically induced; Hospitalization; Humans; Male; Middle Aged; Prospective Studies; Thrombocytopenia/chemically induced
15.
J Am Med Inform Assoc ; 23(5): 872-8, 2016 Sep.
Article in English | MEDLINE | ID: mdl-26911827

ABSTRACT

OBJECTIVE: To develop and evaluate an automated identification and predictive risk report for hospitalized heart failure (HF) patients. METHODS: Dictated free-text reports from the previous 24 hours were analyzed each day with natural language processing (NLP) to help improve the early identification of hospitalized patients with HF. A second application, which uses an Intermountain Healthcare-developed predictive score to determine each HF patient's risk for 30-day hospital readmission and 30-day mortality, was also developed. That information was included in an identification and predictive risk report, which was evaluated at a 354-bed hospital that treats high-risk HF patients. RESULTS: The addition of NLP-identified HF patients increased the identification score's sensitivity from 82.6% to 95.3% and its specificity from 82.7% to 97.5%, and the model's positive predictive value was 97.45%. Daily multidisciplinary discharge planning meetings are now based on the information provided by the HF identification and predictive report, and clinicians' review of potential HF admissions takes less time compared with the previously used manual methodology (10 vs 40 min). An evaluation of the use of the HF predictive report identified a significant reduction in 30-day mortality and a significant increase in patient discharges to home care instead of to a specialized nursing facility. CONCLUSIONS: Using clinical decision support to help identify HF patients and automatically calculate their 30-day all-cause readmission and 30-day mortality risks, coupled with a multidisciplinary care process pathway, was found to be an effective process to improve HF patient identification, significantly reduce 30-day mortality, and significantly increase patient discharges to home care.


Subject(s)
Decision Making, Computer-Assisted; Electronic Health Records; Heart Failure/diagnosis; Natural Language Processing; Risk Assessment; Analysis of Variance; Female; Heart Failure/mortality; Heart Failure/therapy; Hospital Information Systems; Hospitalization; Humans; Male; Patient Readmission; Pilot Projects; Sensitivity and Specificity; Severity of Illness Index
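The identification scores above (sensitivity, specificity, positive predictive value) are standard confusion-matrix metrics; a minimal sketch of their definitions (the counts below are illustrative, not the study's):

```python
def identification_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value (PPV)
    from confusion-matrix counts: true/false positives and negatives."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # precision of positive calls
    return sensitivity, specificity, ppv

# Illustrative counts for an identification tool
sens, spec, ppv = identification_metrics(tp=95, fp=3, fn=5, tn=97)
```

Note that PPV, unlike sensitivity and specificity, depends on how common HF admissions are in the screened population, which is why all three are reported.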
16.
J Am Coll Radiol ; 13(2 Suppl): R18-24, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26846530

ABSTRACT

PURPOSE: Incidental pulmonary nodules that require follow-up are often noted on chest CT. Evidence-based guidelines regarding appropriate follow-up have been published, but the rate of adherence to guideline recommendations is unknown. Furthermore, it is unknown whether the radiology report affects the nodule follow-up rate. METHODS: A review of 1,000 CT pulmonary angiographic studies ordered in the emergency department was performed to determine the presence of an incidental pulmonary nodule. Fleischner Society guidelines were applied to ascertain if follow-up was recommended. Radiology reports were classified on the basis of whether nodules were listed in the findings section only, were noted in the impression section, or had explicit recommendations for follow-up. Whether the rate of nodule follow-up was affected by the radiology report was determined according to these 3 groups. RESULTS: Incidental pulmonary nodules that required follow-up were noted on 9.9% (95% confidence interval, 8%-12%) of CT pulmonary angiographic studies. Follow-up for nodules was poor overall (29% [28 of 96]; 95% confidence interval, 20%-38%) and decreased significantly when the nodules were mentioned in the findings section only (0% [0 of 12]). Specific instructions to follow up nodules in radiology reports still resulted in a low follow-up rate of 29% (19 of 65; 95% confidence interval, 18%-40%). CONCLUSIONS: Incidental pulmonary nodules detected on CT pulmonary angiography are common and are frequently not followed up appropriately. Although the inclusion of a pulmonary nodule in the impression section of a radiology report is helpful, it does not ensure follow-up. Better systems for appropriate identification and follow-up of incidental findings are needed.

17.
Clin Appl Thromb Hemost ; 22(3): 265-73, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26346440

ABSTRACT

PURPOSE: To compare the incidence of 90-day venous thromboembolism (VTE) in obese critically ill medical patients receiving VTE chemoprophylaxis with that in nonobese patients of similar illness severity. We also identified other VTE risk factors. METHODS: Eligible patients spent ≥24 hours in an intensive care unit between November 2007 and November 2013 and received VTE chemoprophylaxis within 48 hours of admission. The primary outcome was 90-day VTE. RESULTS: A total of 11,111 patients were evaluated, of which 1732 obese and 1831 nonobese patients were enrolled, with mean body mass indices (BMIs) of 38.9 ± 9.2 kg/m² and 24.5 ± 3.1 kg/m² and mean Acute Physiology and Chronic Health Evaluation II scores of 28.4 ± 11.8 and 26.6 ± 11.7, respectively. The rate of 90-day VTE for the total cohort, obese patients, and nonobese patients was 6.5%, 7.5%, and 5.5%, respectively. Obese patients were more likely to develop VTE compared with nonobese patients (odds ratio [OR]: 1.41; 95% confidence interval [CI]: 1.03-1.93). Other risk factors significantly associated with 90-day VTE included prior VTE (OR: 3.93; 95% CI: 1.83-8.48), trauma with surgery in the previous 30 days (OR: 3.70; 95% CI: 1.39-9.86), central venous catheters (OR: 2.64; 95% CI: 1.87-3.72), surgery within 90 days (OR: 2.40; 95% CI: 1.61-3.58), mechanical ventilation (OR: 1.94; 95% CI: 1.39-2.71), male sex (OR: 1.55; 95% CI: 1.13-2.14), and increasing age in 1-year increments (OR: 1.02; 95% CI: 1.01-1.03). CONCLUSIONS: The rate of VTE in critically ill medical patients remains high despite standard chemoprophylaxis. Obesity is among 8 risk factors independently associated with 90-day VTE.


Subject(s)
Critical Care; Obesity/epidemiology; Venous Thromboembolism/epidemiology; Venous Thromboembolism/prevention & control; Adult; Aged; Catheterization, Central Venous/adverse effects; Critical Illness; Female; Humans; Incidence; Male; Middle Aged; Obesity/complications; Postoperative Complications/epidemiology; Postoperative Complications/prevention & control; Retrospective Studies; Risk Factors; Venous Thromboembolism/etiology; Wounds and Injuries/epidemiology; Wounds and Injuries/surgery
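As a sanity check on the headline comparison above, the unadjusted odds ratio implied by the raw event rates (7.5% obese vs 5.5% nonobese) can be computed directly; it lands near, but not exactly at, the adjusted OR of 1.41, since the published estimate controls for the other risk factors:

```python
def odds_ratio(p1, p0):
    """Unadjusted odds ratio comparing event probability p1 vs p0."""
    return (p1 / (1.0 - p1)) / (p0 / (1.0 - p0))

# 90-day VTE rates from the abstract: obese 7.5%, nonobese 5.5%
or_unadjusted = odds_ratio(0.075, 0.055)  # ~1.39, near the adjusted 1.41
```

The small gap between the crude and adjusted estimates suggests that confounding by the other measured factors was modest in this cohort.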
18.
Clin Appl Thromb Hemost ; 22(3): 239-47, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26566669

ABSTRACT

BACKGROUND: Antiphospholipid syndrome (APS) is an acquired thrombophilia characterized by thrombosis, pregnancy morbidity, and the presence of characteristic antibodies. Current therapy for patients having APS with a history of thrombosis necessitates anticoagulation with the vitamin K antagonist warfarin, a challenging drug to manage. Apixaban, approved for the treatment and prevention of venous thrombosis with a low rate of bleeding observed, has never been studied among patients with APS. AIMS AND METHODS: We report study rationale and design of Apixaban for the Secondary Prevention of Thrombosis Among Patients With Antiphospholipid Syndrome (ASTRO-APS), a prospective randomized open-label blinded event pilot study that will randomize patients with a clinical diagnosis of APS receiving therapeutic anticoagulation to either adjusted-dose warfarin or apixaban 2.5 mg twice a day. We aim to report our ability to identify, recruit, randomize, and retain patients with APS randomized to apixaban compared with warfarin. We will report clinically important outcomes of thrombosis and bleeding. All clinical outcomes will be adjudicated by a panel blinded to the treatment arm. A unique aspect of this study is the enrollment of patients with an established clinical diagnosis of APS. Also unique is our use of electronic medical record interrogation techniques to identify patients who would likely meet our inclusion criteria and use of an electronic portal for follow-up visit data capture. CONCLUSION: ASTRO-APS will be the largest prospective study to date comparing a direct oral anticoagulant with warfarin among patients with APS for the secondary prevention of thrombosis. Our inclusion criteria assure that outcomes obtained will be clinically applicable to the routine management of patients with APS receiving indefinite anticoagulation.


Subject(s)
Antiphospholipid Syndrome/drug therapy , Pyrazoles/administration & dosage , Pyridones/administration & dosage , Thrombosis/drug therapy , Warfarin/administration & dosage , Administration, Oral , Adult , Antiphospholipid Syndrome/blood , Antiphospholipid Syndrome/complications , Female , Humans , Male , Middle Aged , Pilot Projects , Pregnancy , Pregnancy Complications, Hematologic/blood , Pregnancy Complications, Hematologic/drug therapy , Pyrazoles/adverse effects , Pyridones/adverse effects , Thrombosis/blood , Thrombosis/etiology , Vitamin K/antagonists & inhibitors , Vitamin K/blood , Warfarin/adverse effects
19.
J Hosp Med ; 11(4): 306-10, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26662622

ABSTRACT

Peripherally inserted central catheters (PICCs) are being selected for venous access more frequently today than ever before. Often the choice of a PICC, when compared with other vascular access devices (VADs), is attractive because of perceived safety, availability, and ease of insertion. However, complications associated with PICCs exist, and there is a paucity of evidence to guide clinician choice for PICC selection and valid use. An international panel with expertise in the arena of venous access and populations associated with these devices was convened to clarify approaches for the optimal use of PICCs and VADs. Here we present for the busy hospital-based practitioner the methodology, key outcomes, and recommendations of the Michigan Appropriateness Guide for Intravenous Catheters (MAGIC) panelists for the appropriate use of VADs.


Subject(s)
Catheterization, Central Venous/standards , Catheterization, Peripheral/statistics & numerical data , Practice Guidelines as Topic/standards , Catheterization, Central Venous/statistics & numerical data , Catheterization, Peripheral/standards , Humans , Michigan
20.
Stud Health Technol Inform ; 216: 270-4, 2015.
Article in English | MEDLINE | ID: mdl-26262053

ABSTRACT

Hospitalized patients in the U.S. do not always receive optimal care. In light of this, Computerized Decision Support (CDS) has been recommended for the improvement of patient care. A number of methodologies, standards, and frameworks have been developed to facilitate the development and interoperability of computerized clinical guidelines and CDS logic. In addition, Health Information Exchange using Service-Oriented Architecture holds some promise to help realize that goal. We have used a framework at Intermountain Healthcare that employs familiar programming languages and technology to develop over 40 CDS applications during the past 13 years, applications on which clinicians depend each day. This paper describes the framework, technology, and CDS application development methods, and provides three distinct examples of applications that illustrate the need for and use of the framework in patient care improvement. The main limitation of this framework is its dependence on point-to-point interfaces to access patient data. We look forward to the use of validated and accessible Service-Oriented Architecture to facilitate patient data access across diverse databases.


Subject(s)
Community Networks/organization & administration , Decision Support Systems, Clinical/standards , Electronic Health Records/standards , Medical Record Linkage/standards , Practice Guidelines as Topic , Software/standards , Idaho , Utah , Utilization Review