Results 1 - 20 of 256
1.
BMC Infect Dis ; 24(1): 38, 2024 Jan 02.
Article in English | MEDLINE | ID: mdl-38166699

ABSTRACT

BACKGROUND: Respiratory tract infections (RTIs) are a major global health burden due to their high morbidity and mortality. This retrospective study described the epidemiology of respiratory pathogens in adults over a 5-year period at an Australian tertiary healthcare network. METHODS: All multiplex reverse transcription polymerase chain reaction respiratory samples taken between the 1st of November 2014 and the 31st of October 2019 were included in this study. Overall prevalence and variations according to season, age group and sex were analysed, as well as factors associated with prolonged hospital and intensive care length of stay. RESULTS: There were 12,453 pathogens detected amongst the 12,185 positive samples, with a coinfection rate of 3.7%. Picornavirus (Rhinovirus), Influenza A and respiratory syncytial virus were the most commonly detected pathogens. Mycoplasma pneumoniae was the most commonly detected atypical bacterium. Significant differences in the prevalence of Chlamydia pneumoniae and Human metapneumovirus infections were found between sexes. The longest median lengths of intensive care and hospital stay were seen with Legionella species. Seasonal variation was evident for certain pathogens. CONCLUSIONS: The high rates of pathogen detection and hospitalisation in this real-world study highlight the significant burden of RTIs, and the urgent need for an improved understanding of the pathogenicity of RTIs, as well as of preventive and treatment options.


Subject(s)
COVID-19 , Respiratory Syncytial Virus, Human , Respiratory Tract Infections , Adult , Humans , Australia/epidemiology , COVID-19/epidemiology , Multiplex Polymerase Chain Reaction , Respiratory System , Retrospective Studies , Seasons , Male , Female
2.
Aust Crit Care ; 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38637220

ABSTRACT

BACKGROUND: Adverse changes in muscle health (size and quality) are common in patients receiving extracorporeal membrane oxygenation (ECMO). Nutrition delivery may attenuate such changes, yet the relationship with muscle health remains poorly understood. This study explored the association between energy and protein delivery and changes in muscle health measured using ultrasound from baseline to day 10 and 20 in patients receiving ECMO. METHODS: A secondary analysis of data from a prospective study quantifying changes in muscle health using ultrasound in adults receiving ECMO was completed. Patients were eligible for inclusion if they were prescribed artificial nutrition within 3 days of enrolment and had >1 ultrasound measurement. The primary outcome was the association between protein delivery (grams delivered and percentage of targets received) and change in rectus femoris cross-sectional area (RF-CSA) to day 20. Secondary outcomes were the association between energy and protein delivery and change in RF-CSA to day 10, RF-echogenicity, and quadriceps muscle layer thickness to day 10 and 20. Associations were assessed using Spearman's rank correlation. RESULTS: Twenty-three patients (age: 48 years [standard deviation {SD}: 14], 44% male) were included. Mean energy and protein delivery were 1633 kcal (SD: 374 kcal) and 70 g (SD: 17 g), equating to 79% (SD: 19%) of energy and 73% (SD: 17%) of protein targets. No association was observed between protein delivery (r = 0.167; p = 0.495) or the percentage of targets received (r = 0.096; p = 0.694) and change in RF-CSA to day 20. No other significant associations were found between energy or protein delivery and change in RF-CSA, echogenicity, or quadriceps muscle layer thickness at any time point. CONCLUSIONS: This exploratory study observed no association between nutrition delivery and changes in muscle health measured using ultrasound in patients receiving ECMO. Larger prospective studies are required to investigate the association between nutrition delivery and changes in muscle health in patients receiving ECMO.
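
The correlation analysis described above can be outlined with SciPy; a minimal sketch assuming a per-patient table of protein delivery and RF-CSA change (the file and column names are illustrative, not from the study):

```python
# Minimal sketch of the Spearman correlation analysis described above.
# File and column names are assumptions, not taken from the study.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("ecmo_nutrition.csv")  # hypothetical per-patient table

# Association between protein delivered and % change in RF-CSA to day 20
rho, p = spearmanr(df["protein_g_per_day"], df["rfcsa_change_day20"],
                   nan_policy="omit")
print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")
```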

3.
Aust Crit Care ; 37(3): 422-428, 2024 May.
Article in English | MEDLINE | ID: mdl-37316370

ABSTRACT

BACKGROUND: Data on nutrition delivery over the whole hospital admission in critically ill patients with COVID-19 are scarce, particularly in the Australian setting. OBJECTIVES: The objective of this study was to describe nutrition delivery in critically ill patients admitted to Australian intensive care units (ICUs) with coronavirus disease 2019 (COVID-19), with a focus on post-ICU nutrition practices. METHODS: A multicentre observational study conducted at nine sites included adult patients with a positive COVID-19 diagnosis admitted to the ICU for >24 h and discharged to an acute ward over a 12-month recruitment period from 1 March 2020. Data were extracted on baseline characteristics and clinical outcomes. Nutrition practice data from the ICU and weekly in the post-ICU ward (up to week four) included route of feeding, presence of nutrition-impacting symptoms, and nutrition support received. RESULTS: A total of 103 patients were included (71% male, age: 58 ± 14 years, body mass index: 30 ± 7 kg/m2), of whom 41.7% (n = 43) received mechanical ventilation within 14 days of ICU admission. While oral nutrition was received by more patients at any time point in the ICU (n = 93, 91.2% of patients) than enteral nutrition (EN) (n = 43, 42.2%) or parenteral nutrition (PN) (n = 2, 2.0%), EN was delivered for a greater duration of time (69.6% of feeding days) than oral nutrition and PN (29.7% and 0.7%, respectively). More patients received oral intake than the other modes in the post-ICU ward (n = 95, 95.0%), and 40.0% (n = 38/95) of patients were receiving oral nutrition supplements. In the week after ICU discharge, 51.0% of patients (n = 51) had at least one nutrition-impacting symptom, most commonly a reduced appetite (n = 25; 24.5%) or dysphagia (n = 16; 15.7%). CONCLUSION: Critically ill patients during the COVID-19 pandemic in Australia were more likely to receive oral nutrition than artificial nutrition support at any time point both in the ICU and in the post-ICU ward, whereas EN was provided for a greater duration when it was prescribed. Nutrition-impacting symptoms were common.


Subject(s)
COVID-19 , Critical Illness , Adult , Humans , Male , Middle Aged , Aged , Female , COVID-19 Testing , Pandemics , Energy Intake , Length of Stay , Australia , Hospitalization , Intensive Care Units
4.
Eur J Haematol ; 110(5): 540-547, 2023 May.
Article in English | MEDLINE | ID: mdl-36656100

ABSTRACT

OBJECTIVE: The infection risk in patients receiving ibrutinib, idelalisib or venetoclax for chronic lymphocytic leukaemia (CLL) or B-cell lymphoma treated outside of clinical trials is incompletely defined. We sought to identify the severe infection rate and associated risk factors in a 'real-world' cohort. METHODS: We conducted a retrospective cohort study of adult patients with CLL or lymphoma treated with ibrutinib, idelalisib or venetoclax. RESULTS: Of 67 patients identified (ibrutinib n = 53, idelalisib n = 8 and venetoclax n = 6), 32 (48%) experienced severe infection. Severe infection occurred at a rate of 65 infections per 100 person-years, over a median of 17.8 months of therapy. Median time to first infection (IQR) was 5.4 months (1.4-15.9). Poor baseline Eastern Cooperative Oncology Group (ECOG) performance status and high Charlson Comorbidity Index (CCI) score were associated with an increased risk of severe infection [hazard ratios (95% CI) 1.57 (1.07-2.31, p = .018) and 1.30 (1.05-1.62, p = .016), respectively]. CONCLUSION: The severe infection rate for patients receiving ibrutinib, idelalisib or venetoclax for lymphoma and CLL exceeded the rates reported in clinical trials. Patients with a poor ECOG performance status or high CCI should be closely monitored for early signs of infection, and prevention strategies should be actively pursued. Further prospective research is required to define optimal antimicrobial prophylaxis recommendations.
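
As a rough illustration of how the quoted incidence rate and hazard ratios are derived, the sketch below computes infections per 100 person-years and fits a Cox proportional-hazards model with the lifelines package; the data file and variable names are hypothetical, not from the study:

```python
# Sketch only: person-time incidence rate and Cox model for time to first
# severe infection. File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cll_infections.csv")

# Incidence rate: severe infections per 100 person-years of therapy
person_years = df["months_on_therapy"].sum() / 12
rate = 100 * df["severe_infection"].sum() / person_years
print(f"{rate:.0f} severe infections per 100 person-years")

# Cox model for ECOG performance status and Charlson Comorbidity Index
cph = CoxPHFitter()
cph.fit(df[["time_to_infection_months", "severe_infection", "ecog", "cci"]],
        duration_col="time_to_infection_months", event_col="severe_infection")
cph.print_summary()  # hazard ratios are exp(coef)
```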


Subject(s)
Leukemia, Lymphocytic, Chronic, B-Cell , Lymphoma, B-Cell , Adult , Humans , Leukemia, Lymphocytic, Chronic, B-Cell/complications , Leukemia, Lymphocytic, Chronic, B-Cell/diagnosis , Leukemia, Lymphocytic, Chronic, B-Cell/drug therapy , Retrospective Studies , Lymphoma, B-Cell/drug therapy , Antineoplastic Combined Chemotherapy Protocols/adverse effects
5.
Pulm Pharmacol Ther ; 82: 102247, 2023 10.
Article in English | MEDLINE | ID: mdl-37574040

ABSTRACT

BACKGROUND: Recent advances in CFTR modulator therapy have the potential to change the face of cystic fibrosis (CF). This retrospective observational study describes real-world experience of the four available CFTR modulators in adults and children with CF in a single centre in Melbourne, Australia. METHOD: Data were collected for all patients treated with CFTR modulators at MonashCF between May 2012 and September 2020. Primary outcomes included lung function, admission days and BMI/BMI centile over time. Adverse events and reasons for changing or ceasing medications were also analysed. RESULTS: 55% (74/133) of adult and 46% (55/119) of paediatric patients were treated with CFTR modulators. FEV1 increased in adults treated with ivacaftor (IVA) and elexacaftor/tezacaftor/ivacaftor (ELX/TEZ/IVA) by 4.73% and 10.07% respectively, and BMI also improved in these groups. Nutrition improved in adults and children treated with lumacaftor/ivacaftor (LUM/IVA). There was no significant improvement in FEV1 or admission days with LUM/IVA or tezacaftor/ivacaftor (TEZ/IVA). 36% (31/85) of patients ceased LUM/IVA, due to adverse effects in 81% (25/31) of cases. Of these, 92% (23/25) changed to TEZ/IVA, and 78% (18/23) did so without significant adverse effects. CONCLUSIONS: Our findings for LUM/IVA and TEZ/IVA are less encouraging than those seen in clinical trials, with no significant improvement in lung function or admission days and a higher rate of adverse effects with LUM/IVA compared with phase 3 clinical trials. TEZ/IVA was generally well tolerated by those who experienced side effects with LUM/IVA. The small number of patients treated with ELX/TEZ/IVA had improvements in all parameters. These findings support ongoing use of IVA for individuals with gating mutations, and transition to ELX/TEZ/IVA once available for patients with at least one Phe508del mutation.


Subject(s)
Cystic Fibrosis Transmembrane Conductance Regulator , Cystic Fibrosis , Humans , Adult , Child , Cystic Fibrosis Transmembrane Conductance Regulator/genetics , Cystic Fibrosis Transmembrane Conductance Regulator/therapeutic use , Australia , Aminophenols/adverse effects , Cystic Fibrosis/drug therapy , Mutation
6.
Transpl Int ; 36: 10704, 2023.
Article in English | MEDLINE | ID: mdl-36744051

ABSTRACT

Everolimus (EVE) provides an alternative to maintenance immunosuppression when conventional immunosuppression cannot be tolerated. EVE can be utilized with a calcineurin inhibitor (CNI) minimization or elimination strategy. To date, clinical studies investigating EVE after lung transplant (LTx) have primarily focused on the minimization strategy to preserve renal function. The primary aim was to determine the preferred method of EVE utilization for lung transplant recipients (LTR). To address this aim, we compared the safety and efficacy outcomes of EVE as part of minimization and elimination immunosuppressant regimens. This was a single-center retrospective study of 217 LTR initiated on EVE (120 CNI minimization and 97 CNI elimination). Survival outcomes were calculated from the date of EVE commencement. On multivariate analysis, LTR who received EVE as part of the CNI elimination strategy had poorer survival outcomes compared to the CNI minimization strategy [HR 1.61, 95% CI: 1.11-2.32, p=0.010]. Utilization of EVE for renal preservation was associated with improved survival compared to other indications [HR 0.64, 95% CI: 0.42-0.97, p=0.032]. EVE can be successfully utilized for maintenance immunosuppression post LTx, particularly for renal preservation. However, immunosuppressive regimens containing low-dose CNI had superior survival outcomes, highlighting the importance of retaining a CNI wherever possible.


Subject(s)
Calcineurin Inhibitors , Everolimus , Adult , Humans , Calcineurin Inhibitors/therapeutic use , Everolimus/therapeutic use , Retrospective Studies , Transplant Recipients , Graft Rejection/prevention & control , Immunosuppressive Agents/therapeutic use , Immunosuppressive Agents/pharmacology , Immunosuppression Therapy/methods , Lung
7.
Transpl Int ; 36: 10581, 2023.
Article in English | MEDLINE | ID: mdl-36824294

ABSTRACT

Everolimus (EVE) has been used as a calcineurin inhibitor (CNI) minimization/elimination agent or to augment immunosuppression in lung transplant recipients (LTR) with CNI-induced nephrotoxicity or neurotoxicity. The long-term evidence for survival and progression to chronic lung allograft dysfunction (CLAD) is lacking. The primary aim was to compare survival outcomes of LTR starting EVE-based immunosuppression with those remaining on CNI-based regimens. Secondary outcomes were time to CLAD, incidence of CLAD and the emergence of obstructive (BOS) or restrictive (RAS) phenotypes. This was a single-center retrospective study of 91 LTR starting EVE-based immunosuppression, matched 1:1 with LTR remaining on CNI-based immunosuppression. On multivariate analysis, compared to those remaining on CNI-based immunosuppression, starting EVE was not associated with poorer survival [HR 1.04, 95% CI: 0.67-1.61, p = 0.853], or with a statistically significant faster time to CLAD [HR 1.34, 95% CI: 0.87-2.04, p = 0.182]. There was no difference in the emergence of CLAD (EVE [n = 57, 62.6%] vs. CNI-based [n = 52, 57.1%], p = 0.41), or in the incidence of BOS (p = 0.60) or RAS (p = 0.16) between the two groups. Introduction of EVE-based immunosuppression does not increase the risk of death or accelerate the progression to CLAD compared to CNI-based immunosuppression.


Subject(s)
Bronchiolitis Obliterans , Lung Transplantation , Humans , Everolimus/therapeutic use , Retrospective Studies , Incidence , Lung , Lung Transplantation/adverse effects , Calcineurin Inhibitors/adverse effects , Bronchiolitis Obliterans/etiology
8.
J Ultrasound Med ; 42(5): 1081-1091, 2023 May.
Article in English | MEDLINE | ID: mdl-36321412

ABSTRACT

OBJECTIVES: Cranial ultrasound (cUS) screening is recommended for preterm neonates born before 32 weeks' gestational age (GA). The primary aim of this study was to determine whether both day 3 and day 8 cUS screening examinations are necessary for all neonates. METHODS: A retrospective observational study was performed at a tertiary-level Australian hospital. Frequencies of cranial ultrasound abnormality (CUA) were compared between routine screening performed at postnatal days 3, 8, and 42. Univariate and multivariate analyses of risk factors for intraventricular hemorrhage (IVH) were performed using logistic regression. RESULTS: cUS examinations on 712 neonates born before 32 weeks' GA were included. Neonates were divided into 2 groups: 99 neonates born at 23-25 weeks 6 days GA (group A) and 613 neonates born at 26-31 weeks 6 days GA (group B). All CUAs occurred more frequently in group A neonates and in the subset of group B neonates who had defined risk factors. Low-risk group B neonates had a lower incidence of CUAs demonstrated on day 8 cUS than high-risk group B neonates, with no significant differences between day 3 and day 8. Logistic regression analysis identified a number of risk factors (vaginal delivery, small for GA, Apgar score <7 at 5 minutes, intubation, patent ductus arteriosus and infection) that were associated with an increased frequency of IVH on day 8. In neonates born between 30 and 31 weeks 6 days GA, 35% had a CUA identified. CONCLUSIONS: Low-risk preterm neonates born between 26 and 31 weeks 6 days GA, without complications, could be screened with a single early cUS examination around day 8 without missing substantial abnormality.
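
A hedged sketch of the logistic-regression step described above, using statsmodels; the predictors mirror the risk factors named in the abstract, but the data file and column names are assumptions:

```python
# Illustrative logistic regression of IVH on day 8 against the risk factors
# named in the abstract. Data file and column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cus_cohort.csv")  # hypothetical one-row-per-neonate table

model = smf.logit(
    "ivh_day8 ~ vaginal_delivery + small_for_ga + apgar5_lt7 "
    "+ intubation + pda + infection",
    data=df,
).fit()
print(model.summary())
print(np.exp(model.params))  # coefficients expressed as odds ratios
```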


Subject(s)
Infant, Premature, Diseases , Infant, Premature , Female , Infant, Newborn , Humans , Australia , Gestational Age , Infant, Premature, Diseases/diagnostic imaging , Cerebral Hemorrhage/diagnostic imaging , Retrospective Studies , Observational Studies as Topic
9.
Clin Exp Ophthalmol ; 51(8): 781-789, 2023 11.
Article in English | MEDLINE | ID: mdl-37700734

ABSTRACT

BACKGROUND: Posner Schlossman syndrome is a well-defined uveitis entity that is characterised by relapsing remitting unilateral anterior uveitis with markedly raised intraocular pressure. The aim of this study was to determine the risk factors for progression in patients with Posner Schlossman syndrome. METHODS: Ninety-eight patients were enrolled in a retrospective case series. Progression was defined as a composite endpoint of any of the development of permanent glaucoma (in patients with no evidence of glaucomatous loss on presentation), corneal failure, or chronic inflammation. Relapse was defined as a resolving episode of inflammation not meeting the criteria for progression. RESULTS: Seventy-seven percent of patients relapsed, on average once every 2.2 years. Forty percent of patients progressed. On univariate analysis, increased age at enrolment, immunocompromise at enrolment, the presence of glaucomatous optic neuropathy at enrolment, the performance of an anterior chamber tap and a positive anterior chamber tap were all associated with an increased risk of progression. On multivariate analysis, age at enrolment, immunocompromise at enrolment, the performance of an anterior chamber tap, and the presence of glaucomatous optic neuropathy at enrolment were independently associated with an increased risk of disease progression. CONCLUSIONS: Posner Schlossman syndrome is not a benign uveitis entity, and the risks of both relapse and progression are high. Older patients, immunocompromised patients, patients with glaucomatous optic neuropathy at enrolment and those with a positive anterior chamber tap are all at increased risk of progression.


Subject(s)
Glaucoma, Open-Angle , Glaucoma , Iridocyclitis , Optic Nerve Diseases , Uveitis, Anterior , Uveitis , Humans , Prognosis , Retrospective Studies , Glaucoma, Open-Angle/complications , Glaucoma/diagnosis , Glaucoma/complications , Uveitis/diagnosis , Uveitis/complications , Uveitis, Anterior/complications , Optic Nerve Diseases/complications , Inflammation , Recurrence , Intraocular Pressure
10.
Australas J Dermatol ; 64(1): 67-79, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36652275

ABSTRACT

BACKGROUND/OBJECTIVES: Sequential digital dermoscopic imaging (SDDI) and total body photography (TBP) are recommended as a two-step surveillance method for individuals at high risk of developing cutaneous melanoma. Dermoscopic features specific to melanoma have been well described; however, dynamic changes on serial imaging are less well understood. This study aims to identify and compare dermoscopic features in developing melanomas and benign naevi that underwent SDDI and TBP, to understand which dermoscopic features may be associated with malignant change. METHOD: Histopathology reports from a private specialist dermatology clinic from January 2007 to December 2019 were reviewed. Histopathologically confirmed melanomas and benign naevi that underwent SDDI and TBP with a minimum follow-up interval of 3 months were included. RESULTS: Eighty-nine melanomas (38.2% invasive, median Breslow thickness 0.35 mm, range: 0.2-1.45 mm) and 48 benign naevi were evaluated by three experienced dermatologists for dermoscopic changes. Features most strongly associated with melanoma included the development of neovascularisation, asymmetry and growth in pigment network, additional colours, shiny white structures, regression, structureless areas and change to a multi-component pattern. The presence of atypical vessels (p = 0.02) and shiny white structures (p = 0.02) was significantly associated with invasive melanoma. CONCLUSION: Evaluation for certain evolving dermoscopic features in melanocytic lesions monitored by SDDI and TBP is efficient in assisting clinical decision-making. SDDI with TBP is an effective tool for early detection of melanoma.


Subject(s)
Melanoma , Nevus, Pigmented , Skin Neoplasms , Humans , Melanoma/pathology , Skin Neoplasms/pathology , Dermoscopy/methods , Australia , Nevus, Pigmented/diagnostic imaging , Nevus, Pigmented/pathology , Photography , Melanoma, Cutaneous Malignant
11.
Emerg Radiol ; 30(4): 425-433, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37289287

ABSTRACT

INTRODUCTION: Computed tomography pulmonary angiography (CTPA) is the gold standard test to investigate pulmonary embolism (PE). This technique carries significant radiation risk in young females because of radiosensitive breast and thyroid tissues. A high-pitched CT technique offers significant radiation dose reduction (RDR) and minimises breathing artefact. The addition of CT tube tin-filtration may offer further RDR. The aim of this retrospective study was to assess RDR and image quality (IQ) of high-pitch tin-filtered (HPTF)-CTPA against conventional-CTPA. METHODS: Retrospective review of consecutive adult females aged < 50 years undergoing high pitch tin filtration (HPTF) and standard pitch no tin filtration (SPNF) during a 3-year period beginning in November 2017. CTs in both groups were compared for radiation dose, pulmonary artery contrast density (Hounsfield units (HU)) and movement artefact. Findings of both groups were compared with Student's t-test and the Mann-Whitney U test, with p < 0.05 considered significant. Diagnostic quality was also recorded. RESULTS: Ten female patients (mean age 33, 6/10 pregnant) in the HPTF group and 10 female patients (mean age 36, 1/10 pregnant) in the SPNF group were included. The HPTF group achieved 93% RDR (dose length product: 25.15 mGy.cm vs 337.10 mGy.cm, p < 0.01). There was a significant difference in contrast density between the two groups in the main, left and right pulmonary arteries (322.72 HU, 311.85 HU and 319.41 HU in the HPTF group vs 418.60 HU, 405.10 HU and 415.96 HU in the SPNF group respectively, p = 0.03, p = 0.03 and p = 0.04). 8/10 in the HPTF group and 10/10 in the SPNF group had > 250 HU in all three vessels; the remaining 2 HPTF CTPA were > 210 HU. All CT scans in both groups were of diagnostic quality and none exhibited movement artefact. CONCLUSION: This study was the first to demonstrate significant RDR with the HPTF technique whilst maintaining IQ in patients undergoing chest CTPA. This technique is particularly beneficial in young females and pregnant females with suspected PE.
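
The 93% radiation dose reduction quoted above is the relative drop in dose-length product; a short sketch of that arithmetic and of the Mann-Whitney U comparison described in the methods (the per-patient HU lists are placeholders, not study data):

```python
from scipy.stats import mannwhitneyu

# Relative dose reduction from the reported dose-length products (mGy.cm)
dlp_hptf, dlp_spnf = 25.15, 337.10
rdr = 100 * (1 - dlp_hptf / dlp_spnf)
print(f"Dose reduction: {rdr:.0f}%")   # rounds to ~93%

# Group comparison of contrast density (HU); the two lists are placeholders,
# not the study's per-patient measurements
hptf_hu = [310, 325, 298, 340, 315, 330, 305, 322, 318, 327]
spnf_hu = [400, 415, 392, 430, 410, 405, 398, 420, 412, 408]
stat, p = mannwhitneyu(hptf_hu, spnf_hu, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.4f}")
```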


Subject(s)
Pulmonary Embolism , Tin , Adult , Humans , Female , Middle Aged , Retrospective Studies , Drug Tapering , Radiation Dosage , Pulmonary Embolism/diagnostic imaging , Pulmonary Artery/diagnostic imaging , Tomography, X-Ray Computed/methods , Angiography/methods , Computed Tomography Angiography/methods
12.
Health Promot J Austr ; 34(2): 420-428, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36065155

ABSTRACT

BACKGROUND: Understanding smoking behaviors in hospital patients who smoke may improve inpatient cessation treatments. This study aimed to describe smoking-related behaviors, past quit attempts, and self-reported difficulties experienced in quitting among those who enrolled in a smoking cessation trial of varenicline. METHODS: Baseline data were obtained from adult hospitalized smokers (average ≥ 10 cigarettes/day in the 4 weeks prior to hospitalization) who enrolled in a randomized, placebo-controlled trial of varenicline ± nicotine lozenges at five Australian public hospitals. A logistic regression model tested the association between participant characteristics and quitting in the previous 12 months. RESULTS: Participants' (n = 320; 57% male, 52.5 ± 12.1 years old) motivation and confidence in quitting were high. A total of 120 participants (37.5%) had attempted quitting in the previous 12 months. Prior hospitalization (P = .008) and employment status (P = .015) were significantly associated with past quit attempts. No statistically significant differences were noted in the reason for hospitalization or the level of nicotine dependence between participants who attempted quitting in the previous 12 months and their counterparts. Smoking cessation pharmacotherapy was used by 55% of those attempting to quit, with nicotine replacement therapy (65.2%) and varenicline (16.7%) the most common. Stress or anxiety, urges to smoke and a lack of motivation were the difficulties experienced in past quit attempts. CONCLUSIONS: Those who had a prior hospitalization and were unemployed had significantly greater odds of reporting past quit attempts. Further research is needed to investigate the degree of adherence among inpatient smokers to smoke-free hospital policies and the frequency of NRT provision and uptake on admission.


Subject(s)
Smoking Cessation , Adult , Humans , Male , Middle Aged , Female , Varenicline/therapeutic use , Smokers , Motivation , Tobacco Use Cessation Devices , Australia/epidemiology , Smoking/epidemiology , Hospitals
13.
Aust Crit Care ; 36(4): 485-491, 2023 Jul.
Article in English | MEDLINE | ID: mdl-35810078

ABSTRACT

BACKGROUND: Establishing sequelae following critical illness is a public health priority; however, recruitment and retention of this cohort make assessing functional outcomes difficult. Completing patient-reported outcome measures (PROMs) via telephone may improve participant and researcher involvement; however, there is little evidence regarding the correlation of PROMs with performance-based outcome measures in critical care survivors. OBJECTIVES: The objective of this study was to assess the relationship between self-reported and performance-based measures of function in survivors of critical illness. METHODS: This was a nested cohort study of patients enrolled within a previously published study determining predictors of disability-free survival. Spearman's correlation (rs) was calculated between four performance-based outcomes (the Functional Independence Measure [FIM], 6-min walk distance [6MWD], Functional Reach Test [FRT], and grip strength) that were collected during a home visit 6 months following their intensive care unit admission, and two commonly used PROMs (the 12-item World Health Organization Disability Assessment Schedule 2.0 [WHODAS 2.0] and the EuroQol-5 Dimension-5 Level [EQ-5D-5L]) obtained via phone interview (via the PREDICT study) at the same time point. RESULTS: There were 38 PROMs obtained from 40 recruited patients (mean age = 59.8 ± 16 yrs, M:F = 24:16). All 40 completed the FIM and grip strength, 37 the 6MWD, and 39 the FRT. A strong correlation was found between the primary outcome of the WHODAS 2.0 and all performance-based outcomes apart from grip strength, where a moderate correlation was identified. Although strong correlations were also established between the EQ-5D-5L utility score and the FIM, 6MWD, and FRT, it only correlated weakly with grip strength. The EQ-5D overall global health rating had only very weak to moderate correlations with the performance-based outcomes. CONCLUSION: The WHODAS 2.0 correlated more strongly across multiple performance-based outcome measures of functional recovery and is recommended for use in survivors of critical illness.


Subject(s)
Critical Illness , Quality of Life , Humans , Adult , Middle Aged , Aged , Cohort Studies , Survivors , Patient Reported Outcome Measures , Critical Care , Surveys and Questionnaires
14.
Aust Crit Care ; 36(6): 955-960, 2023 11.
Article in English | MEDLINE | ID: mdl-36806392

ABSTRACT

BACKGROUND: The COVID-19 pandemic highlighted major challenges with usual nutrition care processes, leading to reports of malnutrition and nutrition-related issues in these patients. OBJECTIVE: The objective of this study was to describe nutrition-related service delivery practices across hospitalisation in critically ill patients with COVID-19 admitted to Australian intensive care units (ICUs) in the initial pandemic phase. METHODS: This was a multicentre (nine site) observational study in Australia, linked with a national registry of critically ill patients with COVID-19. Adult patients with COVID-19 who were discharged to an acute ward following ICU admission were included over a 12-month period. Data are presented as n (%), median (interquartile range [IQR]), and odds ratio (OR [95% confidence interval {CI}]). RESULTS: A total of 103 patients were included. Oral nutrition was the most common mode of nutrition (93 [93%]). In the ICU, 53 (52%) patients were seen by a dietitian (median 4 [2-8] occasions) and malnutrition screening occurred in 51 (50%) patients, most commonly with the malnutrition screening tool (50 [98%]). The odds of receiving a higher malnutrition screening tool score increased by 36% for every screening in the ICU (1st to 4th, OR: 1.39 [95% CI: 1.05-1.77], p = 0.018), indicating increasing risk of malnutrition. On the ward, 51 (50.5%) patients were seen by a dietitian (median time to consult: 44 [22.5-75] hours post ICU discharge). The odds of dietetic consult increased by 39% every week while on the ward (OR: 1.39 [1.03-1.89], p = 0.034). Patients who received mechanical ventilation (MV) were more likely to receive dietetic input than those who never received MV. CONCLUSIONS: During the initial phases of the COVID-19 pandemic in Australia, approximately half of the patients included were seen by a dietitian. An increased number of malnutrition screens was associated with a higher risk score in the ICU, and the likelihood of dietetic consult increased if patients received MV and as length of ward stay increased.
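
The abstract above expresses odds ratios as percentage changes in odds; a small worked illustration of that conversion (the helper function is illustrative, not from the study):

```python
# Converting an odds ratio into a "% change in odds", as done in the abstract.
# Helper function is illustrative only.
def pct_change_in_odds(odds_ratio: float) -> float:
    return (odds_ratio - 1) * 100

# OR 1.39 per additional week on the ward -> ~39% higher odds of dietetic consult
print(f"{pct_change_in_odds(1.39):.0f}%")   # prints 39%
```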


Subject(s)
COVID-19 , Malnutrition , Adult , Humans , Critical Illness , Pandemics , Australia/epidemiology , Hospitalization , Malnutrition/epidemiology , Malnutrition/diagnosis , Intensive Care Units
15.
Ann Surg ; 275(4): 654-662, 2022 04 01.
Article in English | MEDLINE | ID: mdl-35261389

ABSTRACT

OBJECTIVE: The aim of this study was to evaluate the diagnostic performance of all biomarkers studied to date for the early diagnosis of sepsis in hospitalized patients with burns. BACKGROUND: Early clinical diagnosis of sepsis in burns patients is notoriously difficult due to the hypermetabolic nature of thermal injury. A considerable variety of biomarkers have been proposed as potentially useful adjuncts to assist with making a timely and accurate diagnosis. METHODS: We searched Medline, Embase, Cochrane CENTRAL, Biosis Previews, Web of Science, and Medline In-Process to February 2020. We included diagnostic studies involving burns patients that assessed biomarkers against a reference sepsis definition of positive blood cultures or a combination of microbiologically proven infection with systemic inflammation and/or organ dysfunction. Pooled measures of diagnostic accuracy were derived for each biomarker using bivariate random-effects meta-analysis. RESULTS: We included 28 studies evaluating 57 different biomarkers and incorporating 1517 participants. Procalcitonin was moderately sensitive (73%) and specific (75%) for sepsis in patients with burns. C-reactive protein was highly sensitive (86%) but poorly specific (54%). White blood cell count had poor sensitivity (47%) and moderate specificity (65%). All other biomarkers had insufficient studies to include in a meta-analysis; however, brain natriuretic peptide, stroke volume index, tumor necrosis factor (TNF)-alpha, and cell-free DNA (on day 14 post-injury) showed the most promise in single studies. There was moderate to significant heterogeneity, reflecting different study populations, sepsis definitions and test thresholds. CONCLUSIONS: The most widely studied biomarkers are poorly predictive for sepsis in burns patients. Brain natriuretic peptide, stroke volume index, TNF-alpha, and cell-free DNA showed promise in single studies and should be further evaluated. A standardized approach to the evaluation of diagnostic markers (including time of sampling, cut-offs, and outcomes) would be useful.
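
The pooled accuracy figures above rest on per-study sensitivity and specificity computed from 2 x 2 tables against the reference sepsis definition; a minimal worked sketch of that step (the counts are invented, and the bivariate random-effects pooling itself is not shown):

```python
# Sensitivity and specificity from a single study's 2x2 table against the
# reference sepsis definition. Counts are invented for illustration; the
# review pooled such tables with a bivariate random-effects model.
tp, fn = 44, 16   # septic patients: biomarker positive / biomarker negative
tn, fp = 60, 20   # non-septic patients: biomarker negative / biomarker positive

sensitivity = tp / (tp + fn)   # 0.73, cf. the pooled 73% for procalcitonin
specificity = tn / (tn + fp)   # 0.75, cf. the pooled 75%
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```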


Subject(s)
Burns , Cell-Free Nucleic Acids , Sepsis , Biomarkers , Burns/complications , Burns/diagnosis , Early Diagnosis , Humans , Natriuretic Peptide, Brain , Sensitivity and Specificity , Sepsis/diagnosis
16.
Ann Surg ; 275(2): e401-e409, 2022 02 01.
Article in English | MEDLINE | ID: mdl-33470630

ABSTRACT

OBJECTIVES: To develop and validate a classification of sleeve gastrectomy leaks able to reliably predict outcomes, from protocolized computed tomography (CT) findings and readily available variables. SUMMARY OF BACKGROUND DATA: Leaks post sleeve gastrectomy remain morbid and resource-consuming. Incidence, treatments, and outcomes are variable, representing the heterogeneity of the problem. A predictive tool available at presentation would aid management and predict outcomes. METHODS: From a prospective database (2009-2018) we reviewed patients with staple line leaks. A Delphi process was undertaken on candidate variables (80-20). Correlations were performed to stratify 4 groupings based on outcomes (salvage resection, length of stay, and complications) and predictor variables. Training and validation cohorts were established by block randomization. RESULTS: A 4-tiered classification was developed based on CT appearance and duration postsurgery. Interobserver agreement was high (κ = 0.85, P < 0.001). There were 59 patients (training: 30, validation: 29). Age 42.5 ± 10.8 versus 38.9 ± 10.0 years (P = 0.187); female 65.5% versus 80.0% (P = 0.211); weight 127.4 ± 31.3 versus 141.0 ± 47.9 kg (P = 0.203). In the training group, there was a trend toward longer hospital stays as grading increased (I = 10.5 d; II = 24 d; III = 66.5 d; IV = 72 d; P = 0.005). The risk of salvage resection increased (risk ratio grade 4 = 9; P = 0.043), as did complication severity (P = 0.027). Findings were reproduced in the validation group: risk of salvage resection (P = 0.007), hospital stay (P = 0.001), and complications (P = 0.016). CONCLUSION: We have developed and validated a classification system, based on protocolized CT imaging, that predicts a step-wise increased risk of salvage resection, complication severity, and increased hospital stay. The system should aid patient management and facilitate comparisons of outcomes and efficacy of interventions.
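
Interobserver agreement for the CT-based grading above is reported as a kappa statistic; a hedged sketch of that calculation with scikit-learn, using unweighted Cohen's kappa (the study does not specify the weighting, and the two raters' gradings below are invented):

```python
# Cohen's kappa for agreement between two raters applying the four-tier
# leak grading. Gradings are invented, not the study data.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 2, 2, 3, 4, 1, 3, 2, 4, 1, 2, 3]
rater_b = [1, 2, 2, 3, 4, 1, 3, 3, 4, 1, 2, 3]
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"kappa = {kappa:.2f}")   # for reference, the study reports kappa = 0.85
```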


Subject(s)
Anastomotic Leak/classification , Anastomotic Leak/diagnostic imaging , Clinical Protocols , Gastrectomy/methods , Tomography, X-Ray Computed , Adult , Female , Humans , Male , Middle Aged , Prospective Studies , Random Allocation
17.
Crit Care Med ; 50(1): 61-71, 2022 01 01.
Article in English | MEDLINE | ID: mdl-34166283

ABSTRACT

OBJECTIVES: To evaluate the functional outcome and health-related quality of life of in-hospital cardiac arrest survivors at 6 and 12 months. DESIGN: A longitudinal cohort study. SETTING: Seven metropolitan hospitals in Australia. PATIENTS: Data were collected for hospitalized adults (≥ 18 yr) who experienced in-hospital cardiac arrest, defined as "a period of unresponsiveness, with no observed respiratory effort and the commencement of external cardiac compressions." INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Prior to hospital discharge, patients were approached for consent to participate in 6-month and 12-month telephone interviews. Outcomes included the modified Rankin Scale, Barthel Index, Euro-Quality of Life 5 Dimension 5 Level, return to work and hospital readmissions. Forty-eight patients (80%) consented to follow-up interviews. The mean age of participants was 67.2 (± 15.3) years, and 33 of 48 (68.8%) were male. Good functional outcome (modified Rankin Scale score ≤ 3) was reported by 31 of 37 participants (83.8%) at 6 months and 30 of 33 (90.9%) at 12 months. The median Euro-Quality of Life-5D index value was 0.73 (0.33-0.84) at 6 months and 0.76 (0.47-0.88) at 12 months. The median Euro-Quality of Life-Visual Analogue Scale score was 70 (55-80) at 6 months and 75 (50-87.5) at 12 months. Problems in all Euro-Quality of Life-5D-5L dimensions were reported frequently at both time points. Hospital readmission was reported by 23 of 37 patients (62.2%) at 6 months and 16 of 33 (48.5%) at 12 months. Less than half of previously working participants had returned to work by 12 months. CONCLUSIONS: The majority of in-hospital cardiac arrest survivors had a good functional outcome and health-related quality of life at 6 months, and this was largely unchanged at 12 months. Despite this, many reported problems with mobility, self-care, usual activities, pain, and anxiety/depression. Return-to-work rates were low, and hospital readmissions were common.


Subject(s)
Functional Status , Heart Arrest/epidemiology , Quality of Life , Survivors/statistics & numerical data , Activities of Daily Living , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Patient Readmission/statistics & numerical data , Return to Work/statistics & numerical data
18.
Clin Transplant ; 36(8): e14763, 2022 08.
Article in English | MEDLINE | ID: mdl-35761751

ABSTRACT

INTRODUCTION: Unintentional weight gain, overweight and obesity following solid organ transplantation (SOT) are well established and linked to morbidity and mortality risk factors. No interventional studies aimed at prevention have been undertaken among lung transplant (LTx) recipients. The combination of group education and telephone coaching is effective in the general population but is untested among SOT cohorts. METHODS: A non-randomized, interventional pilot study was conducted among new LTx recipients. The control group received standard care. In addition to standard care, the intervention involved four group education and four individual telephone coaching sessions over 12 months. Data collection occurred at 2 weeks, 3 months and 12 months post-LTx. Measurements included weight, BMI, fat mass (FM), fat mass index (FMI), fat-free mass (FFM), fat-free mass index (FFMI), waist circumference (WC), visceral adipose tissue (VAT), nutrition knowledge, diet, physical activity, lipid profile, HbA1c, FEV1, six-minute walk distance and patient satisfaction. RESULTS: Fifteen LTx recipients were recruited into each group. One control participant died 120 days post-LTx, unrelated to the study. There were trends towards smaller increases in weight (6.7 ± 7.2 kg vs. 9.8 ± 11.3 kg), BMI (9.6% of baseline vs. 13%), FM (19.7% vs. 40%), FMI, VAT (7.1% vs. 30.8%) and WC (5.5% vs. 9.5%), and greater increases in FFM and FFMI (all P > .05), in the intervention group by 12 months. The intervention was well accepted by participants. CONCLUSION: This feasible intervention demonstrated non-significant, but clinically meaningful, favorable weight and body composition trends among LTx recipients over 12 months compared to standard care.


Subject(s)
Lung Transplantation , Nutritionists , Body Composition , Body Mass Index , Humans , Lung Transplantation/adverse effects , Obesity/epidemiology , Obesity/surgery , Physical Therapy Modalities , Pilot Projects
19.
Aust Crit Care ; 35(4): 355-361, 2022 07.
Article in English | MEDLINE | ID: mdl-34321180

ABSTRACT

BACKGROUND: Nonurban residential living is associated with adverse outcomes for a number of chronic health conditions. However, it is unclear what effect it has amongst survivors of critical illness. OBJECTIVES: The purpose of this study was to determine whether patients living greater than 50 km from the treating intensive care unit (ICU) have disability outcomes at 6 months that differ from those of people living within 50 km. METHODS: This was a multicentre, prospective cohort study conducted in five metropolitan ICUs. Participants were adults admitted to the ICU who received >24 h of mechanical ventilation and survived to hospital discharge. In a secondary analysis of these data, the cohort was dichotomised based on residential distance from the treating ICU: <50 km and ≥50 km. The primary outcome was patient-reported disability using the 12-item World Health Organization's Disability Assessment Schedule (WHODAS 2.0). This was recorded at 6 months after ICU admission by telephone interview. Secondary outcomes included health status as measured by the EQ-5D-5L, return to work, and psychological function as measured by the Hospital Anxiety and Depression Scale (HADS). Multivariable logistic regression was used to assess the association between distance from the ICU and moderate to severe disability, adjusted for potential confounders. Variables included in the multivariable model were deemed to be clinically relevant and had baseline imbalance between groups (p < 0.10). These included marital status and hours of mechanical ventilation. Sensitivity analysis was also conducted using distance in kilometres as a continuous variable. RESULTS: A total of 262 patients were enrolled, and 169 (65%) lived within 50 km of the treating ICU and 93 (35%) lived ≥50 km from the treating ICU (interquartile range [IQR] 10-664 km). There was no difference in patient-reported disability at 6 months between patients living <50 km and those living ≥50 km (WHODAS total disability % [IQR] 10.4 [2.08-25] v 14.6 [2.08-20.8], P = 0.74). There was also no difference between groups for the six major life domains of the WHODAS. There was no difference in rates of anxiety or depression as measured by HADS score (HADS anxiety median [IQR] 4 [1-7] v 3 [1-7], P = 0.60) (HADS depression median [IQR] 3 [1-6] v 3 [1-6], P = 0.62); health status as measured by EQ-5D (mean [SD] 66.7 [20] v 69.8 [22.2], P = 0.24); or health-related unemployment (% (N) 39 [26] v 25 [29.1], P = 0.61). After adjusting for confounders, living ≥50 km from the treating ICU was not associated with increased disability (odds ratio 0.61, 95% confidence interval: 0.33-1.16; P = 0.13). CONCLUSIONS: Survivors of intensive care in Victoria, Australia, who live at least 50 km from the treating ICU did not have greater disability than people living less than 50 km away at 6 months after discharge. Living 50 km or more from the treating ICU was not associated with disability, nor was it associated with anxiety or depression, health status, or unemployment due to health.


Subject(s)
Intensive Care Units , Quality of Life , Adult , Critical Illness/psychology , Humans , Prospective Studies , Victoria
20.
Crit Care Med ; 49(9): e860-e869, 2021 09 01.
Article in English | MEDLINE | ID: mdl-33967203

ABSTRACT

OBJECTIVES: To determine the influence of active mobilization during critical illness on health status in survivors 6 months post ICU admission. DESIGN: Post hoc secondary analysis of a prospective cohort study conducted between November 2013 and March 2015. SETTING: Two tertiary hospital ICUs in Victoria, Australia. PATIENTS: Of 194 eligible patients admitted, mobility data for 186 patients were obtained. Inclusion and exclusion criteria were as per the original trial. INTERVENTIONS: The dosage of mobilization in the ICU was measured by: 1) the Intensive Care Mobility Scale, where a higher Intensive Care Mobility Scale level was considered a higher intensity of mobilization, or 2) the number of active mobilization sessions performed during the ICU stay. The data were extracted from medical records and analyzed against Euro-quality of life-5D-5 Level version answers obtained from phone interviews with survivors 6 months following ICU admission. The primary outcome was change in health status measured by the Euro-quality of life-5D-5 Level utility score, with change in the Euro-quality of life-5D-5 Level mobility domain a secondary outcome. MEASUREMENTS AND MAIN RESULTS: Achieving higher levels of mobilization (as per the Intensive Care Mobility Scale) was independently associated with improved outcomes at 6 months (Euro-quality of life-5D-5 Level utility score unstandardized regression coefficient [β] 0.022 [95% CI, 0.002-0.042]; p = 0.033; Euro-quality of life-5D-5 Level mobility domain β = 0.127 [CI, 0.049-0.205]; p = 0.001). Increasing the number of active mobilization sessions was not found to independently influence health status. Illness severity, total comorbidities, and admission diagnosis also independently influenced health status. CONCLUSIONS: In critically ill survivors, achieving higher levels of mobilization, but not increasing the number of active mobilization sessions, improved health status 6 months after ICU admission.
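
The adjusted association reported above (β = 0.022 per Intensive Care Mobility Scale level) comes from a multivariable regression; a minimal sketch of that kind of model with statsmodels, using an assumed data file and assumed column names:

```python
# Sketch of an adjusted linear regression of change in EQ-5D-5L utility on the
# highest ICU Mobility Scale level achieved, adjusting for the covariates
# named in the abstract. Data file and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mobilisation_cohort.csv")
model = smf.ols(
    "eq5d_utility_change ~ highest_ims_level + illness_severity "
    "+ total_comorbidities + C(admission_diagnosis)",
    data=df,
).fit()
print(model.params["highest_ims_level"])  # cf. the reported beta of 0.022
```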


Subject(s)
Early Ambulation/standards , Health Status , Survivors/statistics & numerical data , Adult , Aged , Cohort Studies , Critical Illness/nursing , Early Ambulation/statistics & numerical data , Female , Humans , Intensive Care Units/organization & administration , Intensive Care Units/statistics & numerical data , Length of Stay/statistics & numerical data , Male , Middle Aged , Prospective Studies , Victoria