Results 1 - 20 of 210
1.
BMJ Open ; 14(6): e083635, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38951004

ABSTRACT

INTRODUCTION: Critically ill patients are at risk of suboptimal beta-lactam antibiotic (beta-lactam) exposure due to the impact of altered physiology on pharmacokinetics. Suboptimal concentrations can lead to treatment failure or toxicity. Therapeutic drug monitoring (TDM) involves adjusting doses based on measured plasma concentrations, individualising dosing to improve the likelihood of optimal exposure. Despite its potential benefits, its adoption has been slow, and data on implementation, dose adaptation and safety are sparse. The aim of this trial is to assess the feasibility and fidelity of implementing beta-lactam TDM-guided dosing in the intensive care unit setting. METHODS AND ANALYSIS: A beta-lactam antibiotic Dose AdaPtation feasibility randomised controlled Trial using Therapeutic Drug Monitoring (ADAPT-TDM) is a single-centre, unblinded, feasibility randomised controlled trial aiming to enrol up to 60 critically ill adult participants (≥18 years). TDM and dose adjustment will be performed daily in the intervention group; the standard-of-care group will undergo plasma sampling but no dose adjustment. The main outcomes include: (1) feasibility of recruitment, defined as the number of participants recruited from the pool of eligible participants, and (2) fidelity of TDM, defined as the degree to which TDM as a test is delivered as intended, from accurate sample collection and sample processing to result availability. Secondary outcomes include target attainment, uptake of TDM-guided dosing, and incidence of neurotoxicity, hepatotoxicity and nephrotoxicity. ETHICS AND DISSEMINATION: This study has been approved by the Alfred Hospital human research ethics committee, Office of Ethics and Research Governance (reference: Project No. 565/22; date of approval: 22/11/2022). Prospective consent will be obtained and the study will be conducted in accordance with the Declaration of Helsinki. The finalised manuscript, including aggregate data, will be submitted for publication in a peer-reviewed journal. ADAPT-TDM will determine whether beta-lactam TDM-guided dose adaptation is reproducible and feasible and will provide important information required to implement this intervention in a phase III trial. TRIAL REGISTRATION NUMBER: Australian New Zealand Clinical Trials Registry, ACTRN12623000032651.


Subject(s)
Anti-Bacterial Agents , Critical Illness , Drug Monitoring , Feasibility Studies , beta-Lactams , Humans , Drug Monitoring/methods , Anti-Bacterial Agents/administration & dosage , Anti-Bacterial Agents/pharmacokinetics , Critical Illness/therapy , beta-Lactams/administration & dosage , beta-Lactams/pharmacokinetics , Randomized Controlled Trials as Topic , Intensive Care Units
2.
Perfusion ; : 2676591241262261, 2024 Jun 16.
Article in English | MEDLINE | ID: mdl-38881099

ABSTRACT

INTRODUCTION: Venovenous extracorporeal membrane oxygenation (VV ECMO) is used for refractory hypoxaemia, yet in high cardiac output states hypoxaemia may persist despite its use. The administration of beta-blockers has been suggested as an approach in this scenario; however, the physiological consequences of this intervention are not clear. METHODS: We performed an in-silico study using a previously described mathematical model to evaluate the effect of beta-blockade on mixed venous and arterial oxygen saturations (SvO2, SaO2) in three different clinical scenarios, and considered the potential effects of beta-blockers on cardiac output, oxygen consumption and recirculation. Additionally, we assessed the interaction of beta-blockade with haemoglobin concentration. RESULTS: In scenario 1, simulating a patient with high cardiac output and partial lung shunt, SvO2 decreased from 53.5% to 44.7% despite SaO2 rising from 74.2% to 79.2%. In scenario 2, simulating a patient with high cardiac output and complete lung shunt, SvO2 remained unchanged at 52.2% while SaO2 rose from 71.9% to 85%. In scenario 3, simulating a patient with normal cardiac output and high recirculation, SvO2 fell from 50.8% to 25.5% and SaO2 also fell, from 82.4% to 78.3%. Across the remaining modelling examples the effect on SvO2 varied, but oxygen delivery was consistently reduced across all scenarios. CONCLUSION: The physiological effects of administering beta-blockers for refractory hypoxaemia during VV ECMO are unpredictable and may include reduced oxygen delivery, varying with patient and circuit features. This study does not support the use of beta-blockers for this indication.
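The mixing logic such in-silico models rely on can be illustrated with a minimal, hypothetical sketch (this is not the authors' published model; all parameter values and the fully saturating oxygenator assumption are illustrative): with a complete lung shunt, arterial saturation is the flow-weighted mix of membrane-oxygenated and unoxygenated venous blood, while venous saturation follows from the Fick principle.

```python
# Illustrative (hypothetical) oxygen-mixing sketch for VV ECMO with a complete
# intrapulmonary shunt; NOT the published model from this study.

def vv_ecmo_saturations(cardiac_output_lpm, ecmo_flow_lpm, recirculation,
                        vo2_ml_min, hb_g_dl, n_iter=200):
    """Return approximate (SaO2, SvO2) as fractions, assuming the native lung
    adds no oxygen and the membrane lung fully saturates the blood it captures."""
    effective_ecmo = ecmo_flow_lpm * (1.0 - recirculation)   # L/min of oxygenated flow reaching the patient
    o2_capacity_ml_per_l = 1.34 * hb_g_dl * 10.0             # mL O2 per litre of blood at 100% saturation
    sao2 = 0.9                                               # initial guess, refined by fixed-point iteration
    for _ in range(n_iter):
        # venous saturation from the Fick principle: VO2 = CO x (CaO2 - CvO2)
        svo2 = max(sao2 - vo2_ml_min / (cardiac_output_lpm * o2_capacity_ml_per_l), 0.0)
        # arterial saturation (= pulmonary-artery saturation under complete shunt)
        sao2 = (effective_ecmo * 1.0
                + (cardiac_output_lpm - effective_ecmo) * svo2) / cardiac_output_lpm
    return sao2, svo2

# Hypothetical example: lowering cardiac output (a beta-blocker-like effect)
# raises SaO2, yet oxygen delivery (CO x CaO2) falls.
for co in (9.0, 6.0):
    sao2, svo2 = vv_ecmo_saturations(co, ecmo_flow_lpm=5.0, recirculation=0.1,
                                     vo2_ml_min=250.0, hb_g_dl=9.0)
    do2 = co * 1.34 * 9.0 * 10.0 * sao2
    print(f"CO {co:.0f} L/min: SaO2 {sao2:.0%}, SvO2 {svo2:.0%}, DO2 ~{do2:.0f} mL/min")
```

Run with the invented parameters above, the sketch reproduces the qualitative pattern the abstract reports for the complete-shunt scenario: SaO2 rises as cardiac output falls, while oxygen delivery decreases.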

3.
ASAIO J ; 2024 May 07.
Article in English | MEDLINE | ID: mdl-38713630

ABSTRACT

Veno-arterial extracorporeal membrane oxygenation (VA ECMO) fundamentally alters patient physiology and blood flow relevant to contrast delivery for computed tomography (CT) imaging. Here, we present a comprehensive guide to contrast-enhanced CT scanning in adult ECMO patients, addressing common questions related to contrast delivery via the ECMO circuit and modifications to ECMO settings and scanning techniques that avoid non-diagnostic CT scans. The approach is described in detail for patients supported on VA ECMO with the return cannula sited in the femoral artery. Lesser modifications required for veno-venous ECMO (VV ECMO) are included in the supplemental material. Establishing a common understanding of the patient's blood flow physiology between the intensive care clinician, the CT radiographer and the radiologist is the overarching goal. Our stepwise approach facilitates clear communication around modifications to the ECMO pump settings, contrast route and rate, and scanning technique for each individual scenario.

5.
Crit Care ; 28(1): 184, 2024 05 28.
Article in English | MEDLINE | ID: mdl-38807143

ABSTRACT

BACKGROUND: The use of composite outcome measures (COM) in clinical trials is increasing. Whilst their use is associated with benefits, several limitations have been highlighted and there is limited literature exploring their use within critical care. The primary aim of this study was to evaluate the use of COM in high-impact critical care trials, and to compare study parameters (including sample size, statistical significance, and consistency of effect estimates) in trials using composite versus non-composite outcomes. METHODS: A systematic review of 16 high-impact journals was conducted. Randomised controlled trials published between 2012 and 2022 reporting a patient-important outcome and involving critical care patients were included. RESULTS: 8271 trials were screened and 194 were included. 39.1% of all trials used a COM, and this proportion increased over time. Of those using a COM, only 52.6% explicitly described the outcome as composite. The median number of components was 2 (IQR 2-3). Trials using a COM recruited fewer participants (409 (198.8-851.5) vs 584 (300-1566); p = 0.004), and their use was not associated with increased rates of statistical significance (19.7% vs 17.8%, p = 0.380). Predicted effect sizes were overestimated in all but 6 trials. For studies using a COM, the effect estimates were consistent across all components in 43.4% of trials. 93% of COM included components that were not patient important. CONCLUSIONS: COM are increasingly used in critical care trials; however, effect estimates are frequently inconsistent across COM components, confounding outcome interpretation. The use of COM was associated with smaller sample sizes and no increased likelihood of statistically significant results. Many of the limitations inherent to the use of COM are relevant to critical care research.


Subject(s)
Critical Care , Outcome Assessment, Health Care , Randomized Controlled Trials as Topic , Humans , Randomized Controlled Trials as Topic/methods , Randomized Controlled Trials as Topic/statistics & numerical data , Critical Care/methods , Critical Care/statistics & numerical data , Critical Care/standards , Outcome Assessment, Health Care/statistics & numerical data , Outcome Assessment, Health Care/methods , Outcome Assessment, Health Care/standards , Journal Impact Factor
6.
BMJ Open ; 14(4): e078435, 2024 Apr 28.
Article in English | MEDLINE | ID: mdl-38684259

ABSTRACT

OBJECTIVES: We aimed to assess the healthcare costs and impact on the economy at large arising from emergency medical services (EMS)-treated non-traumatic shock. DESIGN: We conducted a population-based cohort study, in which EMS-treated patients were individually linked to hospital-wide and state-wide administrative datasets. Direct healthcare costs (Australian dollars, AUD) were estimated for each element of care using a casemix funding method. The impact on productivity was assessed using a Markov state-transition model with a 3-year horizon. SETTING: Patients older than 18 years of age with shock not related to trauma who received care by EMS (1 January 2015-30 June 2019) in Victoria, Australia, were included in the analysis. PRIMARY AND SECONDARY OUTCOME MEASURES: The primary outcome assessed was total healthcare expenditure. Secondary outcomes included healthcare expenditure stratified by shock aetiology, years of life lived (YLL), productivity-adjusted life-years (PALYs) and productivity losses. RESULTS: A total of 21 334 patients (mean age 65.9 (±19.1) years; 9641 (45.2%) female) were treated by EMS for non-traumatic shock, with an average healthcare-related cost of $A11 031 per episode of care and a total cost of $A280 million. Annual costs remained stable throughout the study period, but average costs per episode of care increased (Ptrend=0.05). Among patients who survived to hospital, the average cost per episode of care varied by aetiology: $A24 382 for cardiogenic shock, $A21 254 for septic shock, $A19 915 for hypovolaemic shock and $A28 057 for obstructive shock. Modelling demonstrated that over a 3-year horizon the cohort lost 24 355 YLLs and 5059 PALYs. Lost human capital due to premature mortality led to productivity-related losses of $A374 million. When extrapolated to the entire Australian population, productivity losses approached $A1.5 billion ($A326 million annually). CONCLUSION: The direct healthcare costs and indirect loss of productivity among patients with non-traumatic shock are high. Targeted public health measures that seek to reduce the incidence of shock and improve systems of care are needed to reduce the financial burden of this syndrome.
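For readers unfamiliar with the method, a state-transition (Markov) productivity model of the kind described runs a cohort through annual cycles and accumulates the productivity-adjusted life-years forfeited by those who have died. The sketch below is a hypothetical two-state (alive/dead) version with invented transition probability, productivity index, wage and discount rate; it is not the study's model or its inputs.

```python
# Hypothetical two-state (alive/dead) Markov cohort sketch of productivity loss
# over a 3-year horizon; all parameter values are illustrative, not study data.

def productivity_loss(cohort_size, annual_death_prob, productivity_index,
                      annual_wage_aud, years=3, discount_rate=0.05):
    alive = float(cohort_size)
    lost_palys = 0.0
    lost_aud = 0.0
    for year in range(1, years + 1):
        alive *= (1.0 - annual_death_prob)          # Markov transition: alive -> dead
        dead = cohort_size - alive                  # cumulative deaths so far
        discount = 1.0 / (1.0 + discount_rate) ** year
        lost_palys += dead * productivity_index     # PALYs forfeited in this cycle
        lost_aud += dead * productivity_index * annual_wage_aud * discount
    return lost_palys, lost_aud

palys, aud = productivity_loss(cohort_size=21_334, annual_death_prob=0.08,
                               productivity_index=0.80, annual_wage_aud=75_000)
print(f"lost PALYs ~{palys:,.0f}; productivity losses ~A${aud / 1e6:,.0f} million")
```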


Subject(s)
Emergency Medical Services , Health Care Costs , Humans , Female , Male , Victoria , Aged , Health Care Costs/statistics & numerical data , Middle Aged , Emergency Medical Services/economics , Cost of Illness , Aged, 80 and over , Shock/economics , Shock/therapy , Cohort Studies , Adult , Quality-Adjusted Life Years , Health Expenditures/statistics & numerical data
7.
Crit Care Explor ; 6(4): e1069, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38545606

ABSTRACT

OBJECTIVES: To evaluate the current management of new-onset atrial fibrillation and to compare differences in practice regionally. DESIGN: Cross-sectional survey. SETTING: United States, Canada, United Kingdom, Europe, Australia, and New Zealand. SUBJECTS: Critical care attending physicians/consultants and fellows. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: A total of 386 surveys were included in our analysis. Rate control was the preferred treatment approach for hemodynamically stable patients (69.1%), and amiodarone was the most commonly used antiarrhythmic medication (70.9%). For hemodynamically unstable patients, a strategy of electrolyte supplementation and antiarrhythmic therapy was most common (54.7%). Physicians responding to the survey distributed by the Society of Critical Care Medicine were more likely than respondents from other regions to prescribe beta-blockers as a first-line antiarrhythmic medication (38.4%), to use transthoracic echocardiography (82.4%), and to refer patients with new-onset atrial fibrillation who survived their ICU stay for cardiology follow-up (57.2%). The majority of survey respondents (83.0%) were interested in participating in future studies of atrial fibrillation in critically ill patients. CONCLUSIONS: Significant variation, including regional variation, exists in the management of new-onset atrial fibrillation in critically ill patients. Further research is necessary to inform guidelines in this population and to establish whether differences in practice impact long-term outcomes.

8.
Aust Crit Care ; 37(4): 585-591, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38355389

ABSTRACT

BACKGROUND: There is increasing use of extracorporeal membrane oxygenation (ECMO) in intensive care, where nurses provide the majority of the required ongoing care of cannulas, circuit, and console. Limited evidence currently exists detailing nursing perspectives, experiences, and challenges with workload in the provision of ECMO care. OBJECTIVE: The objective of this study was to investigate intensive care nurses' perceptions of workload in providing specialist ECMO therapy and care in a high-volume ECMO centre. METHODS: The study used a qualitative descriptive methodology through semistructured interviews. Data were analysed using an inductive thematic analysis approach following Braun and Clarke's iterative process. This study was conducted in an intensive care unit within an Australian public, quaternary, university-affiliated hospital, which provides a specialist state-wide service for ECMO. FINDINGS: Thirty ECMO-specialist trained intensive care nurses were interviewed. This study identified three key themes impacting on intensive care nurses' workload in providing ECMO supportive therapy: (i) opportunity; (ii) knowledge and responsibilities; and (iii) systems and structures. CONCLUSIONS: Intensive care nurses require advanced clinical and critical thinking skills. Intensive care nurses are motivated and engaged to learn and acquire ECMO skills and competency as part of their ongoing professional development. Providing bedside ECMO management requires constant monitoring and surveillance from nurses caring for one of the most critically unwell patient populations in the intensive care unit setting. As such, ECMO nursing services require a suitably trained and educated workforce of intensive care trained nurses. ECMO services provide clinical development opportunities for nurses, increase their scope of practice, and create advanced practice-specialist roles.


Subject(s)
Critical Care Nursing , Extracorporeal Membrane Oxygenation , Interviews as Topic , Qualitative Research , Workload , Humans , Female , Male , Adult , Middle Aged , Intensive Care Units , Australia , Attitude of Health Personnel
9.
BMJ Open ; 14(2): e080614, 2024 Feb 21.
Article in English | MEDLINE | ID: mdl-38387978

ABSTRACT

INTRODUCTION: Traumatic brain injury (TBI) is a heterogeneous condition in terms of pathophysiology and clinical course. Outcomes from moderate to severe TBI (msTBI) remain poor despite concerted research efforts. The heterogeneity of clinical management represents a barrier to progress in this area. PRECISION-TBI is a prospective, observational cohort study that will establish a clinical research network across major neurotrauma centres in Australia. This network will enable the ongoing collection of injury and clinical management data from patients with msTBI to quantify variations in processes of care between sites. It will also pilot high-frequency data collection and analysis techniques, novel clinical interventions, and comparative effectiveness methodology. METHODS AND ANALYSIS: PRECISION-TBI will initially enrol 300 patients with msTBI and a Glasgow Coma Scale (GCS) score <13 requiring intensive care unit (ICU) admission for invasive neuromonitoring, from 10 Australian neurotrauma centres. Demographic data and process-of-care data (eg, prehospital, emergency and surgical intervention variables) will be collected. Clinical data will include prehospital and emergency department vital signs, and ICU physiological variables in the form of high-frequency neuromonitoring data. ICU treatment data will also be collected for specific aspects of msTBI care. The 6-month Glasgow Outcome Scale Extended (GOSE) score will be collected as the key outcome. Statistical analysis will focus on measures of between- and within-site variation. Reports documenting performance on selected key quality indicators will be provided to participating sites. ETHICS AND DISSEMINATION: Ethics approval has been obtained from The Alfred Human Research Ethics Committee (Alfred Health, Melbourne, Australia). All eligible participants will be included in the study under a waiver of consent (hospital data collection) and opt-out (6-month follow-up). Brochures explaining the rationale of the study will be provided to all participants and/or an appropriate medical treatment decision-maker, who can act on the patient's behalf if they lack capacity. Study findings will be disseminated via peer-reviewed publications. TRIAL REGISTRATION NUMBER: NCT05855252.


Subject(s)
Brain Injuries, Traumatic , Brain Injuries , Humans , Australia , Brain Injuries, Traumatic/therapy , Cohort Studies , Glasgow Coma Scale , Prospective Studies , Observational Studies as Topic
10.
J Neurotrauma ; 41(11-12): 1364-1374, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38279804

ABSTRACT

Traumatic brain injury (TBI) is a leading global cause of morbidity and mortality. Intracranial hypertension following moderate-to-severe TBI (m-sTBI) is a potentially modifiable secondary cerebral insult and one of the central therapeutic targets of contemporary neurocritical care. External ventricular drain (EVD) insertion is a common therapeutic intervention used to control intracranial hypertension and attenuate secondary brain injury. However, the optimal timing of EVD insertion in the setting of m-sTBI is uncertain and practice variation is widespread. We therefore aimed to assess whether there is an association between the timing of EVD placement and functional neurological outcome at 6 months post m-sTBI. We pooled individual patient data for all relevant harmonizable variables from the Erythropoietin in Traumatic Brain Injury (EPO-TBI) and Prophylactic Hypothermia Trial to Lessen Traumatic Brain Injury (POLAR) randomized controlled trials, and the Collaborative European NeuroTrauma Effectiveness Research in TBI (CENTER-TBI) Core Study version 3.0 and Australia-Europe NeuroTrauma Effectiveness Research in TBI (Oz-ENTER) prospective observational studies, to create a combined dataset. The Glasgow Coma Scale (GCS) score was used to define TBI severity, and we included all patients admitted to an intensive care unit with a GCS ≤12 who were 15 years or older and underwent EVD placement within 7 days of injury. We used hierarchical multivariable logistic regression models to study the association between EVD insertion within 24 h of injury (early) compared with EVD insertion more than 24 h after injury (late) and 6-month functional neurological outcome measured using the Glasgow Outcome Scale Extended (GOSE). In total, 2536 patients were assessed. Of these, 502 (20%) underwent early EVD insertion and 145 (6%) underwent late EVD insertion. Following adjustment for the IMPACT (International Mission for Prognosis and Analysis of Clinical Trials in TBI) extended score (Core + CT), sex, injury severity score, study and treatment site, patients receiving a late EVD had higher odds of death or severe disability (GOSE 1-4) at 6-month follow-up than those receiving an early EVD (adjusted odds ratio 2.14; 95% confidence interval 1.22-3.76; p = 0.008). Our study suggests that in patients with m-sTBI in whom an EVD is needed, early (≤24 h post-injury) insertion may result in better long-term functional outcomes. This finding supports future prospective investigation in this area.
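As a simplified illustration of the adjusted analysis described (a sketch under stated assumptions, not the study's hierarchical model, code or data), a flat logistic regression with study/site as a plain fixed effect could be fitted as follows; the data frame and its column names are hypothetical, and synthetic data are generated only so the sketch runs end to end.

```python
# Simplified sketch of the analysis described: logistic regression of death or
# severe disability (GOSE 1-4) on late EVD insertion, adjusted for covariates.
# The study used hierarchical models; site is a plain fixed effect here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_evd_timing_model(df: pd.DataFrame):
    df = df.assign(
        poor_outcome=(df["gose_6m"] <= 4).astype(int),            # GOSE 1-4 = death or severe disability
        late_evd=(df["evd_hours_post_injury"] > 24).astype(int),  # >24 h after injury = late
    )
    model = smf.logit(
        "poor_outcome ~ late_evd + impact_core_ct + C(sex) + iss + C(site)",
        data=df,
    ).fit(disp=False)
    adjusted_or = np.exp(model.params["late_evd"])
    ci_low, ci_high = np.exp(model.conf_int().loc["late_evd"])
    return adjusted_or, (ci_low, ci_high)

# Synthetic data purely to demonstrate the call; values carry no clinical meaning.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "gose_6m": rng.integers(1, 9, n),
    "evd_hours_post_injury": rng.uniform(1, 96, n),
    "impact_core_ct": rng.normal(0.4, 0.15, n),
    "sex": rng.choice(["F", "M"], n),
    "iss": rng.integers(9, 50, n),
    "site": rng.choice(["A", "B", "C"], n),
})
print(fit_evd_timing_model(df))
```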


Subject(s)
Brain Injuries, Traumatic , Drainage , Humans , Brain Injuries, Traumatic/surgery , Male , Female , Adult , Middle Aged , Drainage/methods , Treatment Outcome , Recovery of Function/physiology , Young Adult , Prospective Studies , Ventriculostomy/methods , Glasgow Coma Scale , Intracranial Hypertension/etiology , Time Factors
11.
J Crit Care ; 80: 154430, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38245376

ABSTRACT

BACKGROUND: Noradrenaline and metaraminol are commonly used vasopressors in critically ill patients. However, little is known of their dose equivalence. METHODS: We conducted a single-centre retrospective cohort study of all ICU patients who transitioned from metaraminol to noradrenaline infusions between August 26, 2016 and December 31, 2020. Patients receiving an additional vasoactive drug infusion were excluded. Dose equivalence was calculated from the last-hour metaraminol dose (in µg/min) and the first-hour noradrenaline dose (in µg/min) with the closest matched mean arterial pressure (MAP). Sensitivity analyses were performed on patients with acute kidney injury (AKI), sepsis and mechanical ventilation. RESULTS: We studied 195 patients. The median conversion ratio of metaraminol to noradrenaline was 12.5:1 (IQR 7.5-20.0) for the overall cohort. However, the coefficient of variation was 77% and the standard deviation was 11.8. Conversion ratios were unaffected by sepsis or mechanical ventilation but increased (14:1) with AKI. One in five patients had a MAP decrease of >10 mmHg during the transition period from metaraminol to noradrenaline. Post-transition noradrenaline dose (p < 0.001) and AKI (p = 0.045) were independently associated with metaraminol dose. The proportion of variation in noradrenaline dose predicted from metaraminol dose was low (R2 = 0.545). CONCLUSIONS: The median dose equivalence for metaraminol and noradrenaline in this study was 12.5:1. However, there was significant variance in dose equivalence, only about half of the variation in noradrenaline infusion dose was predicted by the metaraminol dose, and conversion-associated hypotension was common.
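As a worked illustration of the reported median conversion ratio (not a dosing recommendation, and subject to the wide between-patient variance the study emphasises), a metaraminol infusion of 25 µg/min corresponds to roughly 2 µg/min of noradrenaline at 12.5:1, and to between about 1.25 and 3.3 µg/min across the reported IQR of 7.5-20:

```python
# Worked example using the reported median metaraminol:noradrenaline conversion
# ratio (12.5:1, IQR 7.5-20.0). Illustrative only: the study found large
# variance around this ratio, so it is not a bedside dosing rule.

MEDIAN_RATIO = 12.5
IQR_RATIO = (7.5, 20.0)

def equivalent_noradrenaline(metaraminol_ug_min: float) -> dict:
    """Convert a metaraminol infusion rate (ug/min) to an approximate
    noradrenaline rate (ug/min) using the study's median ratio and IQR."""
    return {
        "median_estimate": metaraminol_ug_min / MEDIAN_RATIO,
        "iqr_range": (metaraminol_ug_min / IQR_RATIO[1],
                      metaraminol_ug_min / IQR_RATIO[0]),
    }

print(equivalent_noradrenaline(25.0))
# {'median_estimate': 2.0, 'iqr_range': (1.25, 3.33...)}
```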


Subject(s)
Acute Kidney Injury , Sepsis , Humans , Metaraminol , Norepinephrine , Retrospective Studies , Sepsis/complications , Acute Kidney Injury/complications
12.
Aust Crit Care ; 37(3): 422-428, 2024 May.
Article in English | MEDLINE | ID: mdl-37316370

ABSTRACT

BACKGROUND: Data on nutrition delivery over the whole hospital admission in critically ill patients with COVID-19 are scarce, particularly in the Australian setting. OBJECTIVES: The objective of this study was to describe nutrition delivery in critically ill patients admitted to Australian intensive care units (ICUs) with coronavirus disease 2019 (COVID-19), with a focus on post-ICU nutrition practices. METHODS: A multicentre observational study conducted at nine sites included adult patients with a positive COVID-19 diagnosis admitted to the ICU for >24 h and discharged to an acute ward, over a 12-month recruitment period from 1 March 2020. Data were extracted on baseline characteristics and clinical outcomes. Nutrition practice data from the ICU and weekly in the post-ICU ward (up to week four) included route of feeding, presence of nutrition-impacting symptoms, and nutrition support received. RESULTS: A total of 103 patients were included (71% male; age: 58 ± 14 years; body mass index: 30 ± 7 kg/m2), of whom 41.7% (n = 43) received mechanical ventilation within 14 days of ICU admission. While oral nutrition was received by more patients at any time point in the ICU (n = 93, 91.2% of patients) than enteral nutrition (EN) (n = 43, 42.2%) or parenteral nutrition (PN) (n = 2, 2.0%), EN was delivered for a greater proportion of feeding days (69.6%) than oral nutrition or PN (29.7% and 0.7%, respectively). More patients received oral intake than the other modes in the post-ICU ward (n = 95, 95.0%), and 40.0% (n = 38/95) of these patients were receiving oral nutrition supplements. In the week after ICU discharge, 51.0% of patients (n = 51) had at least one nutrition-impacting symptom, most commonly reduced appetite (n = 25; 24.5%) or dysphagia (n = 16; 15.7%). CONCLUSION: Critically ill patients during the COVID-19 pandemic in Australia were more likely to receive oral nutrition than artificial nutrition support at any time point, both in the ICU and in the post-ICU ward, whereas EN was provided for a greater duration when prescribed. Nutrition-impacting symptoms were common.


Subject(s)
COVID-19 , Critical Illness , Adult , Humans , Male , Middle Aged , Aged , Female , COVID-19 Testing , Pandemics , Energy Intake , Length of Stay , Australia , Hospitalization , Intensive Care Units
13.
Acta Anaesthesiol Scand ; 68(3): 361-371, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37944557

ABSTRACT

BACKGROUND: Prone positioning may improve oxygenation in acute hypoxemic respiratory failure and was widely adopted in COVID-19 patients. However, the magnitude and timing of its peak oxygenation effect remain uncertain, and the optimum dosage is unknown. We therefore aimed to investigate the magnitude of the peak effect of prone positioning on the PaO2:FiO2 ratio during prone positioning and, secondly, the time to peak oxygenation. METHODS: Multi-centre, observational study of invasively ventilated adults with acute hypoxemic respiratory failure secondary to COVID-19 treated with prone positioning. Baseline characteristics, prone positioning and patient outcome data were collected. All arterial blood gas (ABG) data during supine positioning, prone positioning and after return to the supine position were analysed. The magnitude of the peak PaO2:FiO2 ratio effect and the time to peak PaO2:FiO2 ratio effect were measured. RESULTS: We studied 220 patients (mean age 54 years) and 548 prone episodes. Prone positioning was applied a mean (±SD) of 3 (±2) times and for 16 (±3) hours per episode. The pre-proning PaO2:FiO2 ratio was 137 (±49) across all prone episodes. During the first episode, the mean PaO2:FiO2 ratio increased from 125 to a peak of 196 (p < .001). The peak effect was achieved during the first episode, after 9 (±5) hours in the prone position, and was maintained until return to the supine position. CONCLUSIONS: In ventilated adults with COVID-19 acute hypoxemic respiratory failure, the peak PaO2:FiO2 ratio effect occurred during the first prone positioning episode and after 9 h. Subsequent episodes also improved oxygenation but with a diminished effect on the PaO2:FiO2 ratio. This information can help guide the number and duration of prone positioning episodes.
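The per-episode analysis described, finding the peak PaO2:FiO2 ratio and the time at which it occurred, can be sketched as follows; the sample blood gas values are invented for illustration and are not study data.

```python
# Hypothetical sketch of a per-episode peak PaO2:FiO2 analysis: given the
# arterial blood gases drawn during one prone episode, find the peak ratio
# and the number of hours after turning prone at which it occurred.

def peak_pf(abgs):
    """abgs: list of (hours_since_prone, pao2_mmHg, fio2_fraction) tuples."""
    ratios = [(hours, pao2 / fio2) for hours, pao2, fio2 in abgs]
    peak_time, peak_ratio = max(ratios, key=lambda pair: pair[1])
    baseline_ratio = ratios[0][1]
    return baseline_ratio, peak_ratio, peak_time

# Invented episode: baseline P/F ~125, peaking around 9 h in the prone position.
episode = [(0, 75, 0.60), (3, 90, 0.55), (6, 98, 0.50), (9, 98, 0.45), (12, 88, 0.45)]
baseline, peak, hours = peak_pf(episode)
print(f"P/F rose from {baseline:.0f} to a peak of {peak:.0f} at {hours} h prone")
```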


Subject(s)
COVID-19 , Respiratory Distress Syndrome , Respiratory Insufficiency , Adult , Humans , Middle Aged , COVID-19/complications , COVID-19/therapy , Prone Position , Respiration, Artificial , Respiratory Distress Syndrome/therapy , Respiratory Insufficiency/therapy
14.
Blood Purif ; 53(3): 170-180, 2024.
Article in English | MEDLINE | ID: mdl-37992695

ABSTRACT

INTRODUCTION: Continuous renal replacement therapy (CRRT) is common in the intensive care unit (ICU), but a high net ultrafiltration rate (UFNET) calculated from daily data may increase mortality. We aimed to study early UFNET practice using minute-by-minute CRRT machine recordings and to assess its association with admission diagnosis and mortality. METHODS: We studied CRRT treatments in three adult ICUs over 7 years. We calculated early UFNET rates minute by minute and categorized UFNET into tertiles of mean UFNET in the first 72 h and by admission diagnosis. We applied Cox proportional hazards modelling with censoring of patients who died within 72 h. RESULTS: We studied 1,218 patients, 154,712 h, and 9,282,729 min of CRRT (5,702 circuits). Mean early UFNET was 1.52 (1.46-1.57) mL/kg/h. Early UFNET tertiles were similar to, but somewhat higher than, previously reported values, at 0.00-1.20 mL/kg/h, 1.21-1.93 mL/kg/h, and >1.93 mL/kg/h. UFNET values were similar whether evaluated at 24 or 72 h or over the entire duration of CRRT. There was, however, significant variation in UFNET practice by admission diagnosis: higher in respiratory diseases (pneumonia p = 0.01, other p < 0.0001) and cardiovascular disease (p = 0.005) but lower in cardiothoracic surgery (p = 0.04), renal (p = 0.0003) and toxicology-associated diagnoses (p = 0.01). Higher UFNET was associated with an increased hazard of death (HR 1.24, 1.13-1.37), independent of admission diagnosis, weight, age, sex, presence of end-stage kidney disease, and severity of illness. CONCLUSION: Early UFNET practice varies significantly by admission diagnosis. Higher early UFNET is independently associated with mortality. The impact of UFNET on mortality may vary by admission diagnosis. Further work is required to elucidate the nature and mechanisms responsible for this association.
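A minimal sketch of how an early UFNET in mL/kg/h could be derived from minute-by-minute machine recordings and assigned to the tertile bands reported here; the record format, weight and example trace are assumptions for illustration, not the study's data pipeline.

```python
# Hypothetical sketch: compute the mean early (first 72 h) net ultrafiltration
# rate in mL/kg/h from minute-by-minute CRRT net fluid removal records, then
# assign the tertile bands reported in this study.

EARLY_WINDOW_MIN = 72 * 60  # first 72 h of therapy, in minutes

def early_ufnet_ml_kg_h(net_removal_ml_per_min, weight_kg):
    """net_removal_ml_per_min: list of minute-by-minute net fluid removal (mL)."""
    early = net_removal_ml_per_min[:EARLY_WINDOW_MIN]
    hours = len(early) / 60.0
    return sum(early) / weight_kg / hours

def ufnet_tertile(ufnet):
    if ufnet <= 1.20:
        return "lowest tertile (0.00-1.20 mL/kg/h)"
    if ufnet <= 1.93:
        return "middle tertile (1.21-1.93 mL/kg/h)"
    return "highest tertile (>1.93 mL/kg/h)"

# e.g. a steady 2 mL/min of net removal in an 80 kg patient -> 1.50 mL/kg/h
trace = [2.0] * EARLY_WINDOW_MIN
rate = early_ufnet_ml_kg_h(trace, weight_kg=80.0)
print(f"early UFNET = {rate:.2f} mL/kg/h -> {ufnet_tertile(rate)}")
```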


Subject(s)
Acute Kidney Injury , Continuous Renal Replacement Therapy , Kidney Failure, Chronic , Adult , Humans , Renal Replacement Therapy , Ultrafiltration , Intensive Care Units , Retrospective Studies , Acute Kidney Injury/therapy , Critical Illness
15.
Nutrition ; 118: 112261, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37984244

ABSTRACT

OBJECTIVES: The main aim of this study was to describe nutrition provision in Australian and New Zealand (ANZ) pediatric intensive care units (PICUs), including mode of nutrition and adequacy of enteral nutrition (EN) to PICU day 28. Secondary aims were to determine the proportion of children undergoing dietetics assessment, the average time to this intervention, and the methods used to estimate energy and protein requirements. METHODS: This observational study was conducted in all ANZ tertiary-affiliated specialist PICUs. All children ≤18 y of age admitted to the PICU over a 2-wk period and remaining for ≥48 h were included. Data were collected on days 1 to 7, 14, 21, and 28 (unless discharged prior). Data points included oral intake, EN and parenteral nutrition support, estimated energy and protein adequacy, and dietetics assessment details. RESULTS: We enrolled 141 children, of whom 79 were boys (56%) and 84 were <2 y of age (60%). Thirty children (73%) received EN exclusively on day 7, with documented energy and protein targets for 22 (73%). Of these children, 14 (64%) received <75% of their estimated requirements. A dietetics assessment was provided to 80 children (57%), and the proportion assessed was significantly higher in those remaining in the PICU beyond the median length of stay (41% in patients staying ≤4.6 d versus 72% in those staying >4.6 d; P < 0.001). CONCLUSIONS: This prospective study of nutrition provision across ANZ PICUs identified important areas for improvement, particularly in EN adequacy and nutrition assessment. Further research to optimize nutrition provision in this setting is urgently needed.


Subject(s)
Energy Intake , Intensive Care Units, Pediatric , Child , Male , Humans , Female , Prospective Studies , New Zealand , Australia , Critical Illness
16.
Aust Crit Care ; 37(3): 490-494, 2024 May.
Article in English | MEDLINE | ID: mdl-37169654

ABSTRACT

BACKGROUND: Recommendations to facilitate evidence-based nutrition provision for critically ill children exist and indicate the importance of nutrition in this population. Despite these recommendations, it is currently unknown how well Australian and New Zealand (ANZ) paediatric intensive care units (PICUs) are equipped to provide nutrition care. OBJECTIVES: The objectives of this project were to describe the dietitian and nutrition-related practices and resources in ANZ PICUs. METHODS: A clinician survey was completed as a component of an observational study across nine ANZ PICUs in June 2021. The online survey comprised 31 questions. Data points included reporting on dietetics resourcing, local feeding-related guidelines and algorithms, nutrition screening and assessment practices, anthropometry practices, and indirect calorimetry (IC) device availability and local technical expertise. Data are presented as frequency (%), mean (standard deviation), or median (interquartile range). RESULTS: Survey responses were received from all nine participating sites. Dietetics staffing per available PICU bed ranged from 0.01 to 0.07 full-time equivalent (median: 0.03 [interquartile range: 0.02-0.04]). Nutrition screening was established in three (33%) units, all of which used the Paediatric Nutrition Screening Tool. Dietitians consulted all appropriate patients (or where capacity allowed) in six (66%) units and on a request or referral basis only in three (33%) units. All units possessed a local feeding guideline or algorithm. An IC device was available in two (22%) PICUs and was used in one of these units. CONCLUSIONS: This is the first study to describe the dietitian and nutrition-related practices and resources of ANZ PICUs. Areas for potential improvement include dietetics full-time equivalent, routine nutrition assessment, and access to IC.


Subject(s)
Nutritionists , Child , Humans , New Zealand , Australia , Nutritional Status , Intensive Care Units, Pediatric
17.
J Crit Care ; 79: 154469, 2024 02.
Article in English | MEDLINE | ID: mdl-37992464

ABSTRACT

PURPOSE: Neuromuscular blockers (NMBs) are often used during prone positioning to facilitate mechanical ventilation in COVID-19-related ARDS. However, their impact on oxygenation is uncertain. METHODS: Multi-centre observational study of invasively ventilated adults with COVID-19 ARDS treated with prone positioning. We collected data on baseline characteristics, prone positioning, NMB use and patient outcome. We assessed arterial blood gas data during supine and prone positioning and after return to the supine position. RESULTS: We studied 548 prone episodes in 220 patients (mean age 54 years, 61% male), of whom 164 (75%) received NMBs. The mean PaO2:FiO2 (P/F) ratio during the first prone episode reached 208 ± 63 mmHg with NMBs compared with 161 ± 66 mmHg without NMBs (Δmean = 47 ± 5 mmHg), for an absolute increase from baseline of 76 ± 56 mmHg versus 55 ± 56 mmHg (padj < 0.001). The mean P/F ratio on return to the supine position was 190 ± 63 mmHg in the NMB group versus 141 ± 64 mmHg in the non-NMB group, for an absolute increase from baseline of 59 ± 58 mmHg versus 34 ± 56 mmHg (padj < 0.001). CONCLUSION: During prone positioning, NMB use was associated with greater oxygenation improvement than non-NMB therapy, with a sustained effect on return to the supine position. These findings may help guide the use of NMB during prone positioning in COVID-19 ARDS.


Subject(s)
COVID-19 , Neuromuscular Blockade , Neuromuscular Diseases , Respiratory Distress Syndrome , Adult , Female , Humans , Male , Middle Aged , COVID-19/therapy , Prone Position , Pulmonary Gas Exchange , Respiration, Artificial , Respiratory Distress Syndrome/therapy
18.
Emerg Med Australas ; 2023 Dec 11.
Article in English | MEDLINE | ID: mdl-38081764

ABSTRACT

OBJECTIVE: Despite high in-hospital mortality, the epidemiology of prehospital suspected sepsis presentations is not well described. This retrospective cohort study aimed to quantify the burden of such presentations and to determine whether such a diagnosis was independently associated with longer-term mortality. METHODS: Retrospective, observational, population-based cohort study examining all adult prehospital presentations in Victoria between January 2015 and June 2019 that required subsequent in-hospital assessment. Linked data were extracted from clinical and administrative datasets. Demographics, illness severity, prehospital treatment and mortality were compared between prehospital suspected sepsis and non-sepsis patients. Multivariable logistic regression was used to determine the adjusted association between prehospital assessment (suspected sepsis vs non-sepsis) and 6-month mortality. RESULTS: A total of 1 218 047 patients were included. The age-adjusted incidence rate of prehospital suspected sepsis was 65 cases per 100 000 person-years. Those with prehospital suspected sepsis were older (74 vs 62 years), more frequently male (55% vs 47%), and had greater physiological derangement. Intravenous cannulas were more often inserted prehospital (60% vs 29%). Crude in-hospital mortality was 6.5-fold higher in the prehospital suspected sepsis group (11.8% vs 1.8%), and by 6 months 22.6% had died. After adjustment for demographics, illness severity, comorbidity, treatment and hospital location, a diagnosis of prehospital suspected sepsis was associated with a 35% higher likelihood of 6-month mortality (OR 1.35, 95% CI 1.29-1.41). CONCLUSIONS: The burden of prehospital suspected sepsis in the Australian setting is significant, with paramedics identifying patients at high risk of poor longer-term outcomes. This implies the need to consider improved care pathways for this highly vulnerable group.

19.
Aust Crit Care ; 2023 Dec 07.
Article in English | MEDLINE | ID: mdl-38065796

ABSTRACT

BACKGROUND: A mobile chest X-ray is traditionally performed to confirm the position of an internal jugular central venous catheter (CVC) after placement in the intensive care unit (ICU). Using chest radiography to confirm CVC position often results in delays in authorising the use of the CVC, requires the deployment of additional human resources, and is costly. OBJECTIVE: This study aimed to determine the feasibility and accuracy of using the central venous pressure (CVP) waveform to confirm the placement of internal jugular CVCs. METHODS: This retrospective study was conducted in a single quaternary ICU over a 6-month period. We included adult patients who had internal jugular CVC inserted and CVP transduced as part of their routine care in the ICU. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of CVP waveform analysis in confirming the position of internal jugular CVC relative to chest radiography were calculated. RESULTS: A total of 241 internal jugular CVCs were inserted (in 219 patients, 35.6% female), and the CVP waveform was assessed. In 231 cases, this suggested adequate placement in a central vein, which corresponded with a correct position on subsequent chest X-ray. On six occasions, the CVP waveforms were interpreted as suboptimal; however, on chest X-rays, the CVCs were noted to be in a suitable position (sensitivity: 97.5%). Four suboptimal CVP waveforms were obtained, and they correctly identified CVC malposition on subsequent chest X-ray (specificity: 100%). The average time from CVC insertion to radiological completion was 118 minutes. CONCLUSION: CVP waveform analysis provides a feasible and reliable method for confirming adequate internal jugular CVC position. The use of chest radiography can be limited to cases where suboptimal CVP waveforms are obtained.
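The reported accuracy figures can be reconstructed from the 2x2 table implied by the abstract, treating an adequate CVP waveform as the index test and chest X-ray position as the reference standard; the zero false-positive cell is an inference from the reported 100% specificity rather than a count stated in the abstract.

```python
# Reconstruction of the diagnostic accuracy figures from the 2x2 table implied
# by the abstract (241 internal jugular CVCs in total). The false-positive cell
# of 0 is inferred from the reported 100% specificity.
tp, fn, tn, fp = 231, 6, 4, 0

sensitivity = tp / (tp + fn)                 # 231/237 = 97.5%
specificity = tn / (tn + fp)                 # 4/4     = 100%
ppv = tp / (tp + fp)                         # 231/231 = 100%
npv = tn / (tn + fn)                         # 4/10    = 40%
accuracy = (tp + tn) / (tp + tn + fp + fn)   # 235/241 = 97.5%

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}, accuracy {accuracy:.1%}")
```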

20.
Ther Drug Monit ; 2023 Nov 28.
Article in English | MEDLINE | ID: mdl-38018820

ABSTRACT

BACKGROUND: Therapeutic drug monitoring (TDM) of beta-lactam antibiotics (beta-lactams) is increasingly recommended for optimizing antibiotic exposure in intensive care patients with sepsis. However, limited data are available on the implementation of beta-lactam TDM in complex health care settings. Theory-based approaches were used to systematically explore barriers and enablers perceived by key stakeholders in the implementation of beta-lactam TDM in the intensive care unit. METHODS: In this qualitative descriptive study, the authors interviewed key stakeholders (n = 40): infectious disease physicians, intensive care unit physicians, pharmacists, clinical leaders, scientists, and nurses. The data were thematically analyzed and coded using the theoretical domains framework, and the codes and themes were mapped to the relevant domains of the capability, opportunity, and motivation behavior-change wheel model. RESULTS: Barriers included a lack of knowledge, experience, evidence, and confidence, which led to concerns about capability, lack of resources, and harm in straying from standard practice. Access to education and guidelines, on-site assays with short turnaround times, communication among teams, and workflow integration were identified as enablers. A focus on patient care, trust in colleagues, and endorsement by hospital leaders were strong motivators. Pharmacist and nursing stakeholder groups emerged as key targets in the implementation of strategies. CONCLUSIONS: Using theory-based approaches, the authors identified the key barriers and enablers to establishing beta-lactam TDM. These data were used to identify strategies, policies, and key target groups for the implementation of interventions.
