Results 1 - 8 of 8
1.
Drug Saf ; 47(6): 557-569, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38478349

ABSTRACT

INTRODUCTION: Drug-drug interactions (DDIs) have the potential to cause patient harm, including lowering therapeutic efficacy. This study aimed to (i) determine the prevalence of potential DDIs (pDDIs); clinically relevant DDIs (cDDIs), that is, DDIs that could lead to patient harm, taking into account a patient's individual clinical profile, drug effects and the severity of the potential harmful outcome; and subsequent actual harm among hospitalized patients and (ii) examine the impact of transitioning from paper-based medication charts to electronic medication management (eMM) on DDIs and patient harms. METHODS: This was a secondary analysis of the control arm of a controlled pre-post study. Patients were randomly selected from three Australian hospitals. Retrospective chart review was conducted before and after the implementation of an eMM system, without accompanying clinical decision support alerts for DDIs. Harm was assessed by an expert panel. RESULTS: Of 1186 patient admissions, 70.1% (n = 831) experienced a pDDI, 42.6% (n = 505) a cDDI and 0.9% (n = 11) an actual harm in hospital. Of 15,860 pDDIs identified, 27.0% (n = 4285) were classified as cDDIs. The median numbers of pDDIs and cDDIs per 10 drugs were 6 [interquartile range (IQR) 2-13] and 0 (IQR 0-2), respectively. In cases where a cDDI was identified, both drugs were 44% less likely to be co-administered following eMM implementation (adjusted odds ratio 0.56, 95% confidence interval 0.46-0.73). CONCLUSION: Although most patients experienced a pDDI during their hospital stay, less than one-third of pDDIs were clinically relevant. The low prevalence of harm identified raises questions about the value of incorporating DDI decision support into eMM systems, given the potential negative impacts of DDI alerts.
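As a quick arithmetic check, the sketch below recomputes the prevalence figures and the odds reduction implied by the adjusted odds ratio using only the counts quoted in the abstract; the rounding to one decimal place is an assumption about how the percentages were reported.

```python
# Counts taken from the abstract.
admissions = 1186
pddi_patients = 831      # patients with at least one potential DDI
cddi_patients = 505      # patients with at least one clinically relevant DDI
harm_patients = 11       # patients with an actual harm in hospital
pddis_total = 15860
cddis_total = 4285

print(f"pDDI prevalence: {100 * pddi_patients / admissions:.1f}%")           # 70.1%
print(f"cDDI prevalence: {100 * cddi_patients / admissions:.1f}%")           # 42.6%
print(f"Harm prevalence: {100 * harm_patients / admissions:.1f}%")           # 0.9%
print(f"cDDIs as a share of pDDIs: {100 * cddis_total / pddis_total:.1f}%")  # 27.0%

# An adjusted odds ratio of 0.56 corresponds to a 1 - 0.56 = 44% reduction in
# the odds of co-administration following eMM implementation.
adjusted_or = 0.56
print(f"Reduction in odds: {100 * (1 - adjusted_or):.0f}%")                  # 44%
```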


Subject(s)
Drug Interactions , Hospitalization , Humans , Male , Female , Middle Aged , Retrospective Studies , Aged , Hospitalization/statistics & numerical data , Australia , Prevalence , Drug-Related Side Effects and Adverse Reactions/epidemiology , Adult , Patient Harm , Aged, 80 and over , Decision Support Systems, Clinical , Medication Errors/statistics & numerical data
2.
BMJ Open ; 14(3): e080610, 2024 Mar 12.
Article in English | MEDLINE | ID: mdl-38479736

ABSTRACT

OBJECTIVE: To identify barriers to hospital participation in controlled cluster trials of clinical decision support (CDS) and potential strategies for addressing these barriers. DESIGN: Qualitative descriptive design comprising semistructured interviews. SETTING: Five hospitals in New South Wales and one hospital in Queensland, Australia. PARTICIPANTS: Senior hospital staff, including department directors, chief information officers and those working in health informatics teams. RESULTS: Twenty senior hospital staff took part. Barriers to hospital-level recruitment primarily related to perceptions of risk associated with not implementing CDS as a control site. Perceived risks included reductions in patient safety, reputational risk and an increased likelihood that benefits would not be achieved following electronic medical record (EMR) implementation without CDS alerts in place. Senior staff recommended clear communication of trial information to all relevant stakeholders as a key strategy for boosting hospital-level participation in trials. CONCLUSION: Hospital participation in controlled cluster trials of CDS is hindered by perceptions that adopting an EMR without CDS is risky for both patients and organisations. The improvements in safety expected to follow CDS implementation make it challenging and counterintuitive for hospitals to implement an EMR without CDS alerts for the purposes of a research trial. To counteract these barriers, clear communication regarding the evidence base and rationale for a controlled trial is needed.


Subject(s)
Decision Support Systems, Clinical , Humans , Australia , Hospitals , Qualitative Research , Queensland
3.
Int J Qual Health Care ; 35(1)2023 Feb 20.
Article in English | MEDLINE | ID: mdl-36715081

ABSTRACT

Limited research has focused on understanding if and how evidence of health information technology (HIT) effectiveness drives the selection and implementation of technologies in practice. This study aimed to explore the views of senior hospital staff on the role evidence plays in the selection and implementation of HIT, with a particular focus on clinical decision support (CDS) alerts in electronic medication management systems. A qualitative descriptive design was used. Twenty senior hospital staff from six Australian hospitals in New South Wales and Queensland took part in a semistructured interview. Interviews were audio-recorded and transcribed, and a general inductive content analysis approach was used to identify themes. Participants acknowledged the importance of an evidence base, but reported that selection of CDS alerts, and HIT more broadly, was rarely underpinned by evidence that technologies improve patient care. Instead, investments in technologies were guided by the expectation that benefits will be achieved, bolstered by vendor assurances, and a perception that implementation of HIT is unavoidable. Postponing implementation of a technology until an evidence base is available was not always feasible. Although some technologies were seen as not requiring an evidence base, stakeholders viewed evidence as extremely valuable for informing decisions about selection of CDS alerts. In the absence of evidence, evaluation or monitoring of technologies postimplementation is critical, particularly to identify new errors or risks associated with HIT implementation and use. Increased transparency from vendors, with technology evaluation outcomes made directly available to healthcare organizations, may result in less reliance on logic, intuition, and vendor assertions and more evidence-based selection of HIT.


Subject(s)
Decision Support Systems, Clinical , Humans , Australia , Qualitative Research , Personnel, Hospital , Hospitals
4.
BMJ Open ; 9(8): e026034, 2019 08 18.
Article in English | MEDLINE | ID: mdl-31427312

ABSTRACT

INTRODUCTION: Drug-drug interaction (DDI) alerts in hospital electronic medication management (EMM) systems are generated at the point of prescribing to warn doctors about potential interactions in their patients' medication orders. This project aims to determine the impact of DDI alerts on DDI rates and on patient harm in the inpatient setting. It also aims to identify barriers and facilitators to optimal use of alerts, to quantify the alert burden posed to prescribers following implementation of DDI alerts and to develop algorithms to improve the specificity of DDI alerting systems. METHODS AND ANALYSIS: A controlled pre-post design will be used. Study sites include six major referral hospitals in two Australian states, New South Wales and Queensland. Three hospitals will act as control sites and will implement an EMM system without DDI alerts, and three as intervention sites with DDI alerts. The medical records of 280 patients admitted in the 6 months prior to, and 280 patients admitted in the 6 months following, implementation of the EMM system at each site (total 3360 patients) will be retrospectively reviewed by study pharmacists to identify potential DDIs, clinically relevant DDIs and associated patient harm. To identify barriers and facilitators to optimal use of alerts, 10-15 doctors working at each intervention hospital will take part in observations and interviews. Non-identifiable DDI alert data will be extracted from EMM systems 6-12 months after system implementation in order to quantify the alert burden on prescribers. Finally, data collected from chart review and EMM systems will be linked with clinically relevant DDIs to inform the development of algorithms that trigger only clinically relevant DDI alerts in EMM systems. ETHICS AND DISSEMINATION: This research was approved by the Hunter New England Human Research Ethics Committee (18/02/21/4.07). Study results will be published in peer-reviewed journals and presented at local and international conferences and workshops.
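The final aim above, algorithms that trigger only clinically relevant DDI alerts, is not specified further in the abstract. The sketch below is purely a hypothetical illustration of such a filter: the fields (severity, renal impairment, monitoring status) and the rules are invented for illustration and are not the study's criteria or data model.

```python
from dataclasses import dataclass

@dataclass
class PotentialDDI:
    # Hypothetical fields for illustration only; not the study's data model.
    drug_a: str
    drug_b: str
    severity: str              # e.g. "contraindicated", "major", "moderate", "minor"
    patient_renal_impairment: bool
    monitoring_in_place: bool  # e.g. relevant drug levels or labs already monitored

def should_alert(ddi: PotentialDDI) -> bool:
    """Hypothetical rule set: suppress low-severity interactions and those already
    mitigated by monitoring, so prescribers see fewer, more relevant alerts."""
    if ddi.severity == "contraindicated":
        return True
    if ddi.severity == "major":
        return not ddi.monitoring_in_place or ddi.patient_renal_impairment
    return False  # moderate/minor interactions are not alerted in this sketch

# Example usage with invented data:
example = PotentialDDI("drug_x", "drug_y", "major",
                       patient_renal_impairment=False, monitoring_in_place=True)
print(should_alert(example))  # False: major but already monitored, no renal risk
```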


Subject(s)
Decision Support Systems, Clinical , Electronic Health Records/statistics & numerical data , Medical Order Entry Systems/statistics & numerical data , Medication Systems, Hospital/standards , Reminder Systems/supply & distribution , Data Collection , Drug Interactions , Follow-Up Studies , Humans , New South Wales , Queensland , Retrospective Studies
5.
Water Res ; 128: 120-128, 2018 01 01.
Article in English | MEDLINE | ID: mdl-29091803

ABSTRACT

This study investigated the impact of coagulation-flocculation (CF) treatment on metal form and bioavailability in municipal wastewater. Real humus effluent samples were separated into particulate, colloidal and truly dissolved fractions before and after treatment with either ferric chloride (FeCl3) or the biopolymer Floculan. Results revealed that both reagents effectively (≥48%) eliminated Cu, Pb and Zn from the particulate fraction and removed Cu and Zn from the colloidal fraction in conjunction with colloidal organic carbon (COC). Although organics in the truly dissolved fraction were resistant to removal, Floculan reduced Cu in this fraction by 72% owing to the complexation of free Cu ions with phenol and amino groups along the polymeric chains, revealing an additional removal pathway. In fact, COC removed in the CF process by Floculan was replaced with truly dissolved compounds introduced as a result of this reagent's organic composition. Floculan therefore reduced the soluble concentration of Cu and Zn without changing the dissolved organic carbon (DOC) concentration, thus reducing the bioavailability of these metals in treated effluent. FeCl3 did not reduce the bioavailability of the target metals and thus did not deliver any environmental benefit. This work provides important information for the selection and development of high-performance coagulants to improve metal removal.
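The removal percentages quoted above follow the standard removal-efficiency calculation, (C_in - C_out) / C_in x 100. A minimal sketch is shown below; the concentrations used are invented for illustration, not measured values from the study.

```python
def removal_efficiency(c_in: float, c_out: float) -> float:
    """Percentage removal of a metal between untreated and treated samples."""
    return 100.0 * (c_in - c_out) / c_in

# Illustrative concentrations (ug/L) for a single metal in one size fraction;
# these are invented numbers, not values from the study.
before_treatment = 25.0
after_treatment = 12.0
print(f"Removal: {removal_efficiency(before_treatment, after_treatment):.0f}%")  # 52%
```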


Subject(s)
Metals, Heavy/chemistry , Metals, Heavy/pharmacokinetics , Waste Disposal, Fluid/methods , Water Pollutants, Chemical/chemistry , Water Pollutants, Chemical/pharmacokinetics , Biological Availability , Chlorides/chemistry , Ferric Compounds/chemistry , Flocculation , Lead/chemistry , Lead/pharmacokinetics , Microscopy, Electron, Scanning , Spectroscopy, Fourier Transform Infrared , Wastewater/chemistry , Zinc/chemistry , Zinc/pharmacokinetics
6.
Chemosphere ; 175: 239-246, 2017 May.
Article in English | MEDLINE | ID: mdl-28226277

ABSTRACT

The distribution of Cu, Pb, Ni and Zn between particulate, colloidal and truly dissolved size fractions in wastewater from a trickling filter treatment plant was investigated. Samples of influent, primary effluent, humus effluent, final effluent and sludge holding tank returns were collected and separated into particulate (i.e. > 0.45 µm), colloidal (i.e. 1 kDa to 0.45 µm) and truly dissolved (i.e. < 1 kDa) fractions using membrane filters. In the influent, substantial proportions of Cu (60%), Pb (67%) and Zn (32%) were present in the particulate fraction, which was removed in conjunction with suspended particles in subsequent treatment stages at the works. In the final effluent, sizeable proportions of Cu (52%), Pb (32%), Ni (44%) and Zn (68%) were found within the colloidal size fraction. Calculated ratios of soluble metal to organic carbon suggest that these metals are adsorbed to, or complexed with, non-humic macromolecules typically found within the colloidal size range. These findings suggest that technologies capable of removing particles within the colloidal fraction have good potential to enhance metal removal from wastewater.
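Fraction percentages like those reported above are commonly derived from sequential filtration: the particulate fraction is the difference between the total and the 0.45 µm filtrate, and the colloidal fraction is the difference between the 0.45 µm filtrate and the 1 kDa permeate. The sketch below shows that arithmetic with invented concentrations; it illustrates the general fractionation calculation, not the study's own data handling.

```python
def size_fractions(total: float, below_0_45um: float, below_1kda: float):
    """Split a total metal concentration into particulate, colloidal and truly
    dissolved fractions from sequential filtration results (all in ug/L)."""
    particulate = total - below_0_45um       # retained on the 0.45 um filter
    colloidal = below_0_45um - below_1kda    # between 1 kDa and 0.45 um
    truly_dissolved = below_1kda             # passes the 1 kDa membrane
    return {name: 100.0 * value / total
            for name, value in [("particulate", particulate),
                                ("colloidal", colloidal),
                                ("truly dissolved", truly_dissolved)]}

# Invented example concentrations, not data from the study:
print(size_fractions(total=50.0, below_0_45um=20.0, below_1kda=8.0))
# {'particulate': 60.0, 'colloidal': 24.0, 'truly dissolved': 16.0}
```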


Subject(s)
Colloids/analysis , Metals, Heavy/analysis , Particulate Matter/analysis , Water Pollutants, Chemical/analysis , Adsorption , Colloids/chemistry , Environmental Monitoring , Filtration , Metals, Heavy/chemistry , Particulate Matter/chemistry , Solubility , Waste Disposal, Fluid , Wastewater , Water Pollutants, Chemical/chemistry
7.
Int J Med Inform ; 92: 15-34, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27318068

ABSTRACT

OBJECTIVE: To review evidence of the effectiveness of information technology (IT) interventions to improve antimicrobial prescribing in hospitals. METHOD: MEDLINE (1950-March 2015), EMBASE (1947-March 2015) and PubMed (1966-March 2015) were searched for studies in which an IT intervention involving any device (e.g. computer, mobile phone) was evaluated in practice. All papers were assessed for quality using a 10-point rating scale. RESULTS: We identified 45 articles that evaluated an IT intervention to improve antimicrobial prescribing in hospitals. IT interventions took four main forms: (1) stand-alone computerized decision support systems (CDSSs), (2) decision support embedded within a hospital's electronic medical record (EMR) or computerized provider order entry (CPOE) system, (3) computerized antimicrobial approval systems (cAAS) and (4) surveillance systems (SSs). The reported results allowed us to perform meta-analyses for three outcome measures: appropriate use of antimicrobials, patient mortality and hospital length of stay (LOS). IT interventions increased appropriate use of antimicrobials (pooled RR: 1.49, 95% CI: 1.07-2.08); however, no evidence of an effect was found when the analysis included only studies with a quality score of five or above on the 10-point quality scale (pooled RR: 1.53, 95% CI: 0.96-2.44). There was little evidence of an effect of IT interventions on patient mortality or LOS. The range of study designs and outcome measures prevented meaningful comparisons from being made between the different IT intervention types. CONCLUSION: IT interventions can improve the appropriateness of antimicrobial prescribing. However, high-quality, systematic, multi-site comparative studies are critically needed to assist organizations in making informed decisions about the most effective IT interventions.
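The abstract reports pooled risk ratios but not the per-study data or the pooling model used. The sketch below illustrates a generic fixed-effect, inverse-variance pooling of log risk ratios on invented study counts, only to show how a pooled RR and 95% CI of the kind quoted above are typically obtained.

```python
import math

# Invented per-study 2x2 counts (events, total) for intervention and control arms;
# these are not the studies included in the review.
studies = [
    (45, 100, 30, 100),
    (60, 150, 40, 150),
    (25, 80, 20, 80),
]

weights, log_rrs = [], []
for e1, n1, e0, n0 in studies:
    rr = (e1 / n1) / (e0 / n0)
    se = math.sqrt(1/e1 - 1/n1 + 1/e0 - 1/n0)  # standard error of log RR
    log_rrs.append(math.log(rr))
    weights.append(1 / se**2)                   # inverse-variance weight

pooled_log_rr = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
lo, hi = (math.exp(pooled_log_rr - 1.96 * pooled_se),
          math.exp(pooled_log_rr + 1.96 * pooled_se))
print(f"Pooled RR {math.exp(pooled_log_rr):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```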


Subject(s)
Anti-Infective Agents/administration & dosage , Drug Prescriptions , Electronic Prescribing , Medical Order Entry Systems , Decision Support Systems, Clinical , Humans
8.
Water Air Soil Pollut ; 227: 89, 2016.
Article in English | MEDLINE | ID: mdl-26949273

ABSTRACT

It is important to understand the fate of Hg and Sb within the wastewater treatment process so as to examine potential treatment options and to ensure compliance with regulatory standards. The fate of Hg and Sb was investigated for an activated sludge process treatment works in the UK. Relatively high concentrations were observed in the crude wastewater at the works (Hg 0.092 µg/L, Sb 1.73 µg/L), whilst low removal rates within the primary (Hg 52.2%, Sb 16.3%) and secondary treatment stages (Hg 29.5%, Sb -28.9%) resulted in final effluent concentrations of 0.031 µg/L for Hg and 2.04 µg/L for Sb. Removal of Hg was positively correlated with suspended solids (SS) and chemical oxygen demand (COD) removal, whilst Sb removal was negatively correlated. The elevated final effluent Sb concentrations relative to crude values were suggested to result from Sb present in returned sludge liquors. Kepner-Tregoe (KT) analysis was applied to identify suitable treatment technologies. For Hg, chemical techniques (specifically precipitation) were found to be the most suitable, whilst for Sb, adsorption (using granulated ferric hydroxide) was deemed most appropriate. Operational solutions, such as lengthening hydraulic retention time, and treatment technologies deployed on sludge liquors were also reviewed but were not feasible for implementation at the works.
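As a quick consistency check, the sketch below compounds the quoted primary and secondary Hg removal rates through the treatment train, starting from the crude concentration; it assumes the stage removals simply multiply, which reproduces the reported final effluent Hg concentration of 0.031 µg/L.

```python
# Hg figures from the abstract: crude concentration and stage removal rates.
crude_hg = 0.092            # ug/L in crude wastewater
primary_removal = 0.522     # 52.2% removed in primary treatment
secondary_removal = 0.295   # 29.5% removed in secondary treatment

after_primary = crude_hg * (1 - primary_removal)
final_effluent = after_primary * (1 - secondary_removal)
overall_removal = 1 - (1 - primary_removal) * (1 - secondary_removal)

print(f"Predicted final effluent Hg: {final_effluent:.3f} ug/L")  # ~0.031, as reported
print(f"Overall Hg removal: {100 * overall_removal:.1f}%")        # ~66.3%
```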
