ABSTRACT
BACKGROUND: Frailty is prevalent in lung transplant (LTx) candidates, but its impact and the subsequent frailty trajectory are unclear. This study aimed to investigate frailty over the first year after LTx. METHOD: Post-LTx recipients completed a thrice-weekly, 12-week, directly supervised exercise rehabilitation program. The Edmonton Frail Scale (EFS) was used to assess frailty. The primary outcome was 6-Minute Walk Distance (6MWD), measured pre-LTx, prerehabilitation, postrehabilitation, and 1 year post-LTx. RESULTS: 106 of 139 recruited participants underwent LTx: mean age 58 years, 48% male, 52% with chronic obstructive pulmonary disease. Mean (± SD) frailty scores pre-LTx and 1 year post-LTx were 5.54 ± 2.4 and 3.28 ± 1.5, respectively. Mean 6MWD improved significantly for all participants: prerehabilitation 326 m (SD 116) versus postrehabilitation 523 m (SD 101) (p < 0.001) and versus 1 year 512 m (SD 120) (p < 0.001). There were significant differences between those with an EFS > 7 (frail) and an EFS ≤ 7 (not frail) in 6MWD, grip strength (GS), anxiety, and depression. Postrehabilitation, there were no significant differences in 6MWD, GS, anxiety, or depression when comparing EFS > 7 versus ≤ 7. At 1 year, there was a significant difference in depression (p = 0.017) but not in 6MWD, GS, or anxiety between those with an EFS ≤ 7 and > 7. CONCLUSION: Participants in a structured post-LTx rehabilitation program improved in functional exercise capacity (6MWD), GS, depression, and anxiety. For frail participants, exercise capacity, depression, anxiety, and GS were well managed in rehabilitation, with no significant differences from those who were not frail. Pre-LTx frailty may be reversible post-LTx and should not be an absolute contraindication to LTx.
Subject(s)
Frailty, Lung Transplantation, Humans, Male, Female, Middle Aged, Follow-Up Studies, Prognosis, Exercise Therapy/methods, Aged, Risk Factors, Quality of Life, Chronic Obstructive Pulmonary Disease/rehabilitation, Chronic Obstructive Pulmonary Disease/surgery, Postoperative Complications
ABSTRACT
BACKGROUND: Respiratory tract infections (RTIs) are a major global health burden due to their high morbidity and mortality. This retrospective study described the epidemiology of respiratory pathogens in adults over a 5-year period at an Australian tertiary healthcare network. METHODS: All multiplex reverse transcription polymerase chain reaction respiratory samples taken between 1 November 2014 and 31 October 2019 were included in this study. Overall prevalence and variations according to season, age group, and sex were analysed, as were factors associated with prolonged hospital and intensive care length of stay. RESULTS: There were 12,453 pathogens detected amongst the 12,185 positive samples, with a coinfection rate of 3.7%. Picornavirus (Rhinovirus), Influenza A, and respiratory syncytial virus were the most commonly detected pathogens. Mycoplasma pneumoniae was the most commonly detected atypical bacterium. Significant differences in the prevalence of Chlamydia pneumoniae and Human metapneumovirus infections were found between sexes. The longest median intensive care and hospital lengths of stay were for Legionella species. Seasonal variations were evident for certain pathogens. CONCLUSIONS: The high rates of pathogen detection and hospitalisation in this real-world study highlight the significant burden of RTIs, and the urgent need for an improved understanding of their pathogenicity as well as of prevention and treatment options for RTIs.
Subject(s)
COVID-19, Human Respiratory Syncytial Virus, Respiratory Tract Infections, Adult, Humans, Australia/epidemiology, COVID-19/epidemiology, Multiplex Polymerase Chain Reaction, Respiratory System, Retrospective Studies, Seasons, Male, Female
ABSTRACT
OBJECTIVES: The role of nutrition in the recovery of critically ill children has not been investigated, and current nutrition provision in the post-pediatric intensive care unit (PICU) period is unknown. The primary objective of this study was to describe ward nutrition support in children following PICU discharge. METHODS: Children up to 18 years of age admitted to one of nine PICUs over a 2-week period with a length of stay >48 h were enrolled. Data were collected on the first full ward day following PICU discharge and on Days 7, 14, 21, and 28 following PICU admission. Data points included oral intake, enteral nutrition (EN) and parenteral nutrition (PN) support, and oral and EN energy and protein provision. RESULTS: Among the 108 children, on the first full ward day 75/108 (69%) received EN, 54/108 (50%) oral intake, and 8/108 (7%) PN. Of those receiving oral nutrition only on the first full ward day (25/108; 23%), 9/25 (36%) received <50% of their estimated energy and protein requirements. Of those provided EN only, and where nutrition targets were known, 8/46 (17%) and 7/46 (15%) met <75% of their estimated energy and protein requirements, respectively, on the first full ward day. By Day 28, this had increased to 4/12 (33%) and 5/12 (42%). CONCLUSIONS: In this study of ward-based nutrition support, key findings included consistent use of EN and PN up to at least 28 days following PICU admission, and a high proportion of children receiving EN or oral intake only who did not meet their estimated energy and protein requirements.
ABSTRACT
BACKGROUND: Adverse changes in muscle health (size and quality) are common in patients receiving extracorporeal membrane oxygenation (ECMO). Nutrition delivery may attenuate such changes, yet the relationship with muscle health remains poorly understood. This study explored the association between energy and protein delivery and changes in muscle health, measured using ultrasound, from baseline to days 10 and 20 in patients receiving ECMO. METHODS: A secondary analysis of data from a prospective study quantifying changes in muscle health using ultrasound in adults receiving ECMO was completed. Patients were eligible for inclusion if they were prescribed artificial nutrition within 3 days of enrolment and had more than one ultrasound measurement. The primary outcome was the association between protein delivery (grams delivered and percentage of targets received) and change in rectus femoris cross-sectional area (RF-CSA) to day 20. Secondary outcomes were the associations between energy and protein delivery and change in RF-CSA to day 10, and in RF-echogenicity and quadriceps muscle layer thickness to days 10 and 20. Associations were assessed using Spearman's rank correlation. RESULTS: Twenty-three patients (mean age: 48 years [standard deviation {SD}: 14]; 44% male) were included. Mean energy and protein delivery were 1633 kcal (SD: 374 kcal) and 70 g (SD: 17 g), equating to 79% (SD: 19%) of energy and 73% (SD: 17%) of protein targets. No association was observed between protein delivery (r = 0.167; p = 0.495) or the percentage of protein targets received (r = 0.096; p = 0.694) and change in RF-CSA to day 20. No other significant associations were found between energy or protein delivery and change in RF-CSA, echogenicity, or quadriceps muscle layer thickness at any time point. CONCLUSIONS: This exploratory study observed no association between nutrition delivery and changes in muscle health measured using ultrasound in patients receiving ECMO.
Larger prospective studies are required to investigate the association between nutrition delivery and changes in muscle health in patients receiving ECMO.
Subject(s)
Extracorporeal Membrane Oxygenation, Humans, Male, Female, Middle Aged, Retrospective Studies, Ultrasonography, Adult, Skeletal Muscle/diagnostic imaging, Quadriceps Muscle/diagnostic imaging
ABSTRACT
BACKGROUND: Data on nutrition delivery over the whole hospital admission in critically ill patients with COVID-19 are scarce, particularly in the Australian setting. OBJECTIVES: The objective of this study was to describe nutrition delivery in critically ill patients admitted to Australian intensive care units (ICUs) with coronavirus disease 2019 (COVID-19), with a focus on post-ICU nutrition practices. METHODS: A multicentre observational study conducted at nine sites included adult patients with a positive COVID-19 diagnosis admitted to the ICU for >24 h and discharged to an acute ward, over a 12-month recruitment period from 1 March 2020. Data were extracted on baseline characteristics and clinical outcomes. Nutrition practice data from the ICU and weekly on the post-ICU ward (up to week four) included route of feeding, presence of nutrition-impacting symptoms, and nutrition support received. RESULTS: A total of 103 patients were included (71% male; age: 58 ± 14 years; body mass index: 30 ± 7 kg/m2), of whom 41.7% (n = 43) received mechanical ventilation within 14 days of ICU admission. While oral nutrition was received by more patients at any time point in the ICU (n = 93, 91.2%) than enteral nutrition (EN) (n = 43, 42.2%) or parenteral nutrition (PN) (n = 2, 2.0%), EN was delivered for a greater proportion of feeding days (69.6%) than oral nutrition (29.7%) or PN (0.7%). More patients received oral intake than the other modes on the post-ICU ward (n = 95, 95.0%), and 40.0% (n = 38/95) of patients were receiving oral nutrition supplements. In the week after ICU discharge, 51.0% of patients (n = 51) had at least one nutrition-impacting symptom, most commonly reduced appetite (n = 25; 24.5%) or dysphagia (n = 16; 15.7%).
CONCLUSION: Critically ill patients during the COVID-19 pandemic in Australia were more likely to receive oral nutrition than artificial nutrition support at any time point both in the ICU and in the post-ICU ward, whereas EN was provided for a greater duration when it was prescribed. Nutrition-impacting symptoms were common.
Subject(s)
COVID-19, Critical Illness, Adult, Humans, Male, Middle Aged, Aged, Female, COVID-19 Testing, Pandemics, Energy Intake, Length of Stay, Australia, Hospitalization, Intensive Care Units
ABSTRACT
OBJECTIVE: The infection risk in patients receiving ibrutinib, idelalisib or venetoclax for chronic lymphocytic leukaemia (CLL) or B-cell lymphoma treated outside of clinical trials is incompletely defined. We sought to identify the severe infection rate and associated risk factors in a 'real-world' cohort. METHODS: We conducted a retrospective cohort study of adult patients with CLL or lymphoma treated with ibrutinib, idelalisib or venetoclax. RESULTS: Of 67 patients identified (ibrutinib n = 53, idelalisib n = 8, venetoclax n = 6), 32 (48%) experienced severe infection. Severe infection occurred at a rate of 65 infections per 100 person-years, over a median of 17.8 months of therapy. Median (IQR) time to first infection was 5.4 months (1.4-15.9). Poor baseline Eastern Cooperative Oncology Group (ECOG) performance status and a high Charlson Comorbidity Index (CCI) score were associated with an increased risk of severe infection [hazard ratios (95% CI): 1.57 (1.07-2.31), p = .018 and 1.30 (1.05-1.62), p = .016, respectively]. CONCLUSION: The severe infection rate for patients receiving ibrutinib, idelalisib or venetoclax for lymphoma and CLL exceeded that reported in clinical trials. Patients with poor ECOG performance status or a high CCI score should be closely monitored for early signs of infection, and prevention strategies should be actively pursued. Further prospective research is required to define optimal antimicrobial prophylaxis recommendations.
Subject(s)
Chronic B-Cell Lymphocytic Leukemia, B-Cell Lymphoma, Adult, Humans, Chronic B-Cell Lymphocytic Leukemia/complications, Chronic B-Cell Lymphocytic Leukemia/diagnosis, Chronic B-Cell Lymphocytic Leukemia/drug therapy, Retrospective Studies, B-Cell Lymphoma/drug therapy, Antineoplastic Combined Chemotherapy Protocols/adverse effects
ABSTRACT
BACKGROUND: Recent advances in CFTR modulator therapy have the potential to change the face of cystic fibrosis (CF). This retrospective observational study describes real-world experience with the four available CFTR modulators in adults and children with CF at a single centre in Melbourne, Australia. METHOD: Data were collected for all patients treated with CFTR modulators at MonashCF between May 2012 and September 2020. Primary outcomes included lung function, admission days, and BMI/BMI centile over time. Adverse events and reasons for changing or ceasing medications were also analysed. RESULTS: 55% (74/133) of adult and 46% (55/119) of paediatric patients were treated with CFTR modulators. FEV1 increased in adults treated with ivacaftor (IVA) and elexacaftor/tezacaftor/ivacaftor (ELX/TEZ/IVA) by 4.73% and 10.07%, respectively, and BMI also improved in these groups. Nutrition improved in adults and children treated with lumacaftor/ivacaftor (LUM/IVA). There was no significant improvement in FEV1 or admission days with LUM/IVA or tezacaftor/ivacaftor (TEZ/IVA). 36% (31/85) ceased LUM/IVA, due to adverse effects in 81% (25/31) of cases. Of these, 92% (23/25) changed to TEZ/IVA, 78% (18/23) of whom did so without significant adverse effects. CONCLUSIONS: Our findings for LUM/IVA and TEZ/IVA are less encouraging than those from clinical trials, with no significant improvement in lung function or admission days and a higher rate of adverse effects with LUM/IVA than in phase 3 clinical trials. TEZ/IVA was generally well tolerated by those who experienced side effects with LUM/IVA. The small number of patients treated with ELX/TEZ/IVA improved in all parameters. These findings support ongoing use of IVA for individuals with gating mutations, and transition to ELX/TEZ/IVA, once available, for patients with at least one Phe508del mutation.
Subject(s)
Cystic Fibrosis Transmembrane Conductance Regulator, Cystic Fibrosis, Humans, Adult, Child, Cystic Fibrosis Transmembrane Conductance Regulator/genetics, Cystic Fibrosis Transmembrane Conductance Regulator/therapeutic use, Australia, Aminophenols/adverse effects, Cystic Fibrosis/drug therapy, Mutation
ABSTRACT
Everolimus (EVE) provides an alternative for maintenance immunosuppression when conventional immunosuppression cannot be tolerated. EVE can be utilized with a calcineurin inhibitor (CNI) minimization or elimination strategy. To date, clinical studies investigating EVE after lung transplant (LTx) have primarily focused on the minimization strategy to preserve renal function. The primary aim was to determine the preferred method of EVE utilization for lung transplant recipients (LTR). To address this aim, we compared the safety and efficacy outcomes of EVE as part of CNI minimization and elimination immunosuppressant regimens. This was a single-center retrospective study of 217 LTR initiated on EVE (120 CNI minimization and 97 CNI elimination). Survival outcomes were calculated from the date of EVE commencement. On multivariate analysis, LTR who received EVE as part of the CNI elimination strategy had poorer survival outcomes than those on the CNI minimization strategy [HR 1.61, 95% CI: 1.11-2.32, p = 0.010]. Utilization of EVE for renal preservation was associated with improved survival compared to other indications [HR 0.64, 95% CI: 0.42-0.97, p = 0.032]. EVE can be successfully utilized for maintenance immunosuppression post-LTx, particularly for renal preservation. However, immunosuppressive regimens containing low-dose CNI had superior survival outcomes, highlighting the importance of retaining a CNI wherever possible.
Subject(s)
Calcineurin Inhibitors, Everolimus, Adult, Humans, Calcineurin Inhibitors/therapeutic use, Everolimus/therapeutic use, Retrospective Studies, Transplant Recipients, Graft Rejection/prevention & control, Immunosuppressive Agents/therapeutic use, Immunosuppressive Agents/pharmacology, Immunosuppression Therapy/methods, Lung
ABSTRACT
Everolimus (EVE) has been used as a calcineurin inhibitor (CNI) minimization/elimination agent or to augment immunosuppression in lung transplant recipients (LTR) with CNI-induced nephrotoxicity or neurotoxicity. Long-term evidence on survival and progression to chronic lung allograft dysfunction (CLAD) is lacking. The primary aim was to compare survival outcomes of LTR starting EVE-based immunosuppression with those remaining on CNI-based regimens. Secondary outcomes were time to CLAD, incidence of CLAD, and the emergence of obstructive (BOS) or restrictive (RAS) phenotypes. This was a single-center retrospective study of 91 LTR starting EVE-based immunosuppression, matched 1:1 with LTR remaining on CNI-based immunosuppression. On multivariate analysis, compared with remaining on CNI-based immunosuppression, starting EVE was not associated with poorer survival [HR 1.04, 95% CI: 0.67-1.61, p = 0.853] or with a statistically significantly faster time to CLAD [HR 1.34, 95% CI: 0.87-2.04, p = 0.182]. There was no difference in the emergence of CLAD (EVE [n = 57, 62.6%] vs. CNI-based [n = 52, 57.1%], p = 0.41), or in the incidence of BOS (p = 0.60) or RAS (p = 0.16), between the two groups. Introduction of EVE-based immunosuppression does not increase the risk of death or accelerate progression to CLAD compared with CNI-based immunosuppression.
Subject(s)
Bronchiolitis Obliterans, Lung Transplantation, Humans, Everolimus/therapeutic use, Retrospective Studies, Incidence, Lung, Lung Transplantation/adverse effects, Calcineurin Inhibitors/adverse effects, Bronchiolitis Obliterans/etiology
ABSTRACT
OBJECTIVES: Cranial ultrasound (cUS) screening is recommended for preterm neonates born before 32 weeks' gestational age (GA). The primary aim of this study was to determine whether both a day 3 and a day 8 cUS screening examination are necessary for all neonates. METHODS: A retrospective observational study was performed at a tertiary-level Australian hospital. Frequencies of cranial ultrasound abnormalities (CUAs) were compared between routine screening performed at postnatal days 3, 8, and 42. Univariate and multivariate analyses of risk factors for intraventricular hemorrhage (IVH) were performed using logistic regression. RESULTS: cUS examinations on 712 neonates born before 32 weeks' GA were included. Neonates were divided into 2 groups: 99 neonates born at 23 weeks to 25 weeks 6 days GA (group A) and 613 neonates born at 26 weeks to 31 weeks 6 days GA (group B). All CUAs occurred more frequently in group A neonates and in the subset of group B neonates with defined risk factors. Low-risk group B neonates had a lower incidence of CUAs on day 8 cUS than high-risk group B neonates, with no significant differences between day 3 and day 8. Logistic regression analysis identified a number of risk factors (vaginal delivery, small for GA, Apgar score <7 at 5 minutes, intubation, patent ductus arteriosus, and infection) associated with an increased frequency of IVH on day 8. In neonates born between 30 weeks and 31 weeks 6 days GA, 35% had a CUA identified. CONCLUSIONS: Low-risk preterm neonates born between 26 weeks and 31 weeks 6 days GA, without complications, could be screened with a single early cUS examination around day 8 without missing substantial abnormality.
Subject(s)
Premature Infant Diseases, Premature Infant, Female, Newborn Infant, Humans, Australia, Gestational Age, Premature Infant Diseases/diagnostic imaging, Cerebral Hemorrhage/diagnostic imaging, Retrospective Studies, Observational Studies as Topic
ABSTRACT
BACKGROUND: Posner Schlossman syndrome is a well-defined uveitis entity characterised by relapsing-remitting unilateral anterior uveitis with markedly raised intraocular pressure. The aim of this study was to determine the risk factors for progression in patients with Posner Schlossman syndrome. METHODS: Ninety-eight patients were enrolled in a retrospective case series. Progression was defined as a composite endpoint of any of: development of permanent glaucoma (in patients with no evidence of glaucomatous loss on presentation), corneal failure, or chronic inflammation. Relapse was defined as a resolving episode of inflammation not meeting the criteria for progression. RESULTS: Seventy-seven percent of patients relapsed, on average once every 2.2 years. Forty percent of patients progressed. On univariate analysis, increased age at enrolment, immunocompromise at enrolment, the presence of glaucomatous optic neuropathy at enrolment, the performance of an anterior chamber tap, and a positive anterior chamber tap were all associated with an increased risk of progression. On multivariate analysis, age at enrolment, immunocompromise at enrolment, the performance of an anterior chamber tap, and the presence of glaucomatous optic neuropathy at enrolment were independently associated with an increased risk of disease progression. CONCLUSIONS: Posner Schlossman syndrome is not a benign uveitis entity, and the risks of both relapse and progression are high. Older patients, immunocompromised patients, patients with glaucomatous optic neuropathy at enrolment, and those with a positive anterior chamber tap are all at increased risk of progression.
Subject(s)
Open-Angle Glaucoma, Glaucoma, Iridocyclitis, Optic Nerve Diseases, Anterior Uveitis, Uveitis, Humans, Prognosis, Retrospective Studies, Open-Angle Glaucoma/complications, Glaucoma/diagnosis, Glaucoma/complications, Uveitis/diagnosis, Uveitis/complications, Anterior Uveitis/complications, Optic Nerve Diseases/complications, Inflammation, Recurrence, Intraocular Pressure
ABSTRACT
BACKGROUND/OBJECTIVES: Sequential digital dermoscopic imaging (SDDI) and total body photography (TBP) are recommended as a two-step surveillance method for individuals at high risk of developing cutaneous melanoma. Dermoscopic features specific to melanoma have been well described; however, dynamic changes on serial imaging are less well understood. This study aimed to identify and compare dermoscopic features in developing melanomas and benign naevi monitored with SDDI and TBP, to understand which dermoscopic features may be associated with malignant change. METHOD: Histopathology reports from a private specialist dermatology clinic from January 2007 to December 2019 were reviewed. Histopathologically confirmed melanomas and benign naevi that underwent SDDI and TBP with a minimum follow-up interval of 3 months were included. RESULTS: Eighty-nine melanomas (38.2% invasive; median Breslow thickness 0.35 mm, range: 0.2-1.45 mm) and 48 benign naevi were evaluated by three experienced dermatologists for dermoscopic changes. Features most strongly associated with melanoma included the development of neovascularisation, asymmetry and growth in pigment network, additional colours, shiny white structures, regression, structureless areas, and change to a multi-component pattern. The presence of atypical vessels (p = 0.02) and shiny white structures (p = 0.02) was significantly associated with invasive melanoma. CONCLUSION: Evaluation for certain evolving dermoscopic features in melanocytic lesions monitored by SDDI and TBP is efficient in assisting clinical decision-making. SDDI with TBP is an effective tool for early detection of melanoma.
Subject(s)
Melanoma, Pigmented Nevus, Skin Neoplasms, Humans, Melanoma/pathology, Skin Neoplasms/pathology, Dermoscopy/methods, Australia, Pigmented Nevus/diagnostic imaging, Pigmented Nevus/pathology, Photography, Cutaneous Malignant Melanoma
ABSTRACT
INTRODUCTION: Computed tomography pulmonary angiography (CTPA) is the gold-standard test for investigating pulmonary embolism (PE). The technique carries a significant radiation risk in young females because of radiosensitive breast and thyroid tissues. A high-pitch CT technique offers significant radiation dose reduction (RDR) and minimises breathing artefact. The addition of CT tube tin-filtration may offer further RDR. The aim of this retrospective study was to assess the RDR and image quality (IQ) of high-pitch tin-filtered (HPTF) CTPA against conventional CTPA. METHODS: Retrospective review of consecutive adult females aged < 50 years undergoing high-pitch tin-filtration (HPTF) or standard-pitch no-tin-filtration (SPNF) CTPA during a 3-year period beginning in November 2017. CTs in the two groups were compared for radiation dose, pulmonary artery contrast density (Hounsfield units (HU)), and movement artefact. Findings of the two groups were compared with the Student's t-test and Mann-Whitney U test, with p < 0.05 considered significant. Diagnostic quality was also recorded. RESULTS: Ten female patients (mean age 33; 6/10 pregnant) in the HPTF group and 10 female patients (mean age 36; 1/10 pregnant) in the SPNF group were included. The HPTF group achieved a 93% RDR (dose length product: 25.15 mGy.cm vs 337.10 mGy.cm, p < 0.01). There was a significant difference in contrast density between the two groups in the main, left, and right pulmonary arteries (322.72 HU, 311.85 HU, and 319.41 HU in the HPTF group vs 418.60 HU, 405.10 HU, and 415.96 HU in the SPNF group, respectively; p = 0.03, p = 0.03, and p = 0.04). 8/10 in the HPTF group and 10/10 in the control group were > 250 HU in all three vessels; the remaining 2 HPTF CTPAs were > 210 HU. All CT scans in both groups were of diagnostic quality, and none exhibited movement artefact. CONCLUSION: This study was the first to demonstrate significant RDR with the HPTF technique whilst maintaining IQ in patients undergoing chest CTPA.
This technique is particularly beneficial in young females and pregnant females with suspected PE.
Subject(s)
Pulmonary Embolism, Tin, Adult, Humans, Female, Middle Aged, Retrospective Studies, Drug Tapering, Radiation Dosage, Pulmonary Embolism/diagnostic imaging, Pulmonary Artery/diagnostic imaging, X-Ray Computed Tomography/methods, Angiography/methods, Computed Tomography Angiography/methods
ABSTRACT
BACKGROUND: Understanding the smoking behaviors of hospital patients who smoke may improve inpatient cessation treatments. This study aimed to describe smoking-related behaviors, past quit attempts, and self-reported difficulties experienced in quitting among those who enrolled in a smoking cessation trial of varenicline. METHODS: Baseline data were obtained from adult hospitalized smokers (average ≥ 10 cigarettes/day in the 4 weeks prior to hospitalization) who enrolled in a randomized, placebo-controlled trial of varenicline ± nicotine lozenges at five Australian public hospitals. A logistic regression model tested the association between participant characteristics and quitting in the previous 12 months. RESULTS: Participants' (n = 320; 57% male; 52.5 ± 12.1 years old) motivation and confidence in quitting were high. A total of 120 participants (37.5%) had attempted quitting in the previous 12 months. Prior hospitalization (P = .008) and employment status (P = .015) were significantly associated with past quit attempts. No statistically significant differences were noted in the reason for hospitalization or the level of nicotine dependence between participants who had attempted quitting in the previous 12 months and their counterparts. Smoking cessation pharmacotherapy was used by 55% of those attempting to quit, with nicotine replacement therapy (65.2%) and varenicline (16.7%) the most common. Stress or anxiety, urges to smoke, and a lack of motivation were the difficulties experienced in past quit attempts. CONCLUSIONS: Those who had a prior hospitalization and were unemployed had significantly greater odds of reporting past quit attempts. Further research is needed to investigate the degree of adherence of inpatient smokers to smoke-free hospital policies and the frequency of NRT provision and uptake on admission.
Subject(s)
Smoking Cessation, Adult, Humans, Male, Middle Aged, Female, Varenicline/therapeutic use, Smokers, Motivation, Tobacco Use Cessation Devices, Australia/epidemiology, Smoking/epidemiology, Hospitals
ABSTRACT
BACKGROUND: Establishing the sequelae of critical illness is a public health priority; however, recruitment and retention of this cohort make assessing functional outcomes difficult. Completing patient-reported outcome measures (PROMs) via telephone may improve participant and researcher involvement; however, there is little evidence regarding the correlation of PROMs with performance-based outcome measures in critical care survivors. OBJECTIVES: The objective of this study was to assess the relationship between self-reported and performance-based measures of function in survivors of critical illness. METHODS: This was a nested cohort study of patients enrolled in a previously published study determining predictors of disability-free survival. Spearman's correlation (rs) was calculated between four performance-based outcomes (the Functional Independence Measure [FIM], 6-min walk distance [6MWD], Functional Reach Test [FRT], and grip strength), collected during a home visit 6 months following intensive care unit admission, and two commonly used PROMs (the 12-item World Health Organization Disability Assessment Schedule 2.0 [WHODAS 2.0] and the EuroQol-5 Dimension-5 Level [EQ-5D-5L]) obtained via phone interview (via the PREDICT study) at the same time point. RESULTS: There were 38 PROMs obtained from 40 recruited patients (mean age = 59.8 ± 16 years, M:F = 24:16). All 40 completed the FIM and grip strength, 37 the 6MWD, and 39 the FRT. A strong correlation was found between the primary outcome of the WHODAS 2.0 and all performance-based outcomes apart from grip strength, where a moderate correlation was identified. Although strong correlations were also established between the EQ-5D-5L utility score and the FIM, 6MWD, and FRT, it correlated only weakly with grip strength. The EQ-5D overall global health rating had only very weak to moderate correlations with the performance-based outcomes.
CONCLUSION: The WHODAS 2.0 correlated more strongly with multiple performance-based outcome measures of functional recovery and is recommended for use in survivors of critical illness.
Subject(s)
Critical Illness, Quality of Life, Humans, Adult, Middle Aged, Aged, Cohort Studies, Survivors, Patient Reported Outcome Measures, Critical Care, Surveys and Questionnaires
ABSTRACT
BACKGROUND: The COVID-19 pandemic highlighted major challenges with usual nutrition care processes, leading to reports of malnutrition and nutrition-related issues in these patients. OBJECTIVE: The objective of this study was to describe nutrition-related service delivery practices across hospitalisation in critically ill patients with COVID-19 admitted to Australian intensive care units (ICUs) in the initial pandemic phase. METHODS: This was a multicentre (nine-site) observational study in Australia, linked with a national registry of critically ill patients with COVID-19. Adult patients with COVID-19 who were discharged to an acute ward following ICU admission were included over a 12-month period. Data are presented as n (%), median (interquartile range [IQR]), and odds ratio (OR [95% confidence interval {CI}]). RESULTS: A total of 103 patients were included. Oral nutrition was the most common mode of nutrition (93 [93%]). In the ICU, 53 (52%) patients were seen by a dietitian (median 4 [2-8] occasions), and malnutrition screening occurred in 51 (50%) patients, most commonly with the malnutrition screening tool (50 [98%]). The odds of receiving a higher malnutrition screening tool score (indicating increasing risk of malnutrition) increased by 36% with every screening in the ICU (1st to 4th; OR: 1.39 [95% CI: 1.05-1.77], p = 0.018). On the ward, 51 (50.5%) patients were seen by a dietitian (median time to consult: 44 [22.5-75] hours post-ICU discharge). The odds of dietetic consult increased by 39% with every week on the ward (OR: 1.39 [1.03-1.89], p = 0.034). Patients who received mechanical ventilation (MV) were more likely to receive dietetic input than those who never received MV. CONCLUSIONS: During the initial phase of the COVID-19 pandemic in Australia, approximately half of the patients included were seen by a dietitian.
An increased number of malnutrition screens was associated with a higher risk score in the ICU, and the likelihood of dietetic consult increased if patients received MV and as length of ward stay increased.
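The odds-ratio figures above translate to percent changes in odds via (OR − 1) × 100. A minimal illustrative sketch (the function name is ours, not from the study):

```python
def pct_change_in_odds(odds_ratio):
    """Percent change in odds per one-unit increase in the predictor."""
    return (odds_ratio - 1) * 100

# Ward dietetic consult: OR 1.39 per additional week of ward stay
print(round(pct_change_in_odds(1.39)))  # 39
```

Note this converts the odds ratio itself; it is not a change in absolute risk.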
Asunto(s)
COVID-19 , Desnutrición , Adulto , Humanos , Enfermedad Crítica , Pandemias , Australia/epidemiología , Hospitalización , Desnutrición/epidemiología , Desnutrición/diagnóstico , Unidades de Cuidados Intensivos
RESUMEN
OBJECTIVES: To develop and validate a classification of sleeve gastrectomy leaks able to reliably predict outcomes, from protocolized computed tomography (CT) findings and readily available variables. SUMMARY OF BACKGROUND DATA: Leaks post sleeve gastrectomy remain morbid and resource-consuming. Incidence, treatments, and outcomes are variable, representing heterogeneity of the problem. A predictive tool available at presentation would aid management and predict outcomes. METHODS: From a prospective database (2009-2018) we reviewed patients with staple line leaks. A Delphi process was undertaken on candidate variables (80-20). Correlations were performed to stratify 4 groupings based on outcomes (salvage resection, length of stay, and complications) and predictor variables. Training and validation cohorts were established by block randomization. RESULTS: A 4-tiered classification was developed based on CT appearance and duration postsurgery. Interobserver agreement was high (κ = 0.85, P < 0.001). There were 59 patients (training: 30, validation: 29). Age 42.5 ± 10.8 versus 38.9 ± 10.0 years (P = 0.187); female 65.5% versus 80.0% (P = 0.211); weight 127.4 ± 31.3 versus 141.0 ± 47.9 kg (P = 0.203). In the training group, hospital stay lengthened as grading increased (I = 10.5 d; II = 24 d; III = 66.5 d; IV = 72 d; P = 0.005). Risk of salvage resection increased (risk ratio grade 4 = 9; P = 0.043), as did complication severity (P = 0.027). Findings were reproduced in the validation group: risk of salvage resection (P = 0.007), hospital stay (P = 0.001), complications (P = 0.016). CONCLUSION: We have developed and validated a classification system, based on protocolized CT imaging, that predicts a step-wise increased risk of salvage resection, complication severity, and increased hospital stay. The system should aid patient management and facilitate comparisons of outcomes and efficacy of interventions.
Asunto(s)
Fuga Anastomótica/clasificación , Fuga Anastomótica/diagnóstico por imagen , Protocolos Clínicos , Gastrectomía/métodos , Tomografía Computarizada por Rayos X , Adulto , Femenino , Humanos , Masculino , Persona de Mediana Edad , Estudios Prospectivos , Distribución Aleatoria
RESUMEN
OBJECTIVE: The aim of this study was to evaluate the diagnostic performance of all biomarkers studied to date for the early diagnosis of sepsis in hospitalized patients with burns. BACKGROUND: Early clinical diagnosis of sepsis in burns patients is notoriously difficult due to the hypermetabolic nature of thermal injury. A considerable variety of biomarkers have been proposed as potentially useful adjuncts to assist with making a timely and accurate diagnosis. METHODS: We searched Medline, Embase, Cochrane CENTRAL, Biosis Previews, Web of Science, and Medline In-Process to February 2020. We included diagnostic studies involving burns patients that assessed biomarkers against a reference sepsis definition of positive blood cultures or a combination of microbiologically proven infection with systemic inflammation and/or organ dysfunction. Pooled measures of diagnostic accuracy were derived for each biomarker using bivariate random-effects meta-analysis. RESULTS: We included 28 studies evaluating 57 different biomarkers and incorporating 1517 participants. Procalcitonin was moderately sensitive (73%) and specific (75%) for sepsis in patients with burns. C-reactive protein was highly sensitive (86%) but poorly specific (54%). White blood cell count had poor sensitivity (47%) and moderate specificity (65%). All other biomarkers had insufficient studies to include in a meta-analysis; however, brain natriuretic peptide, stroke volume index, tumor necrosis factor (TNF)-alpha, and cell-free DNA (on day 14 post-injury) showed the most promise in single studies. There was moderate to significant heterogeneity, reflecting different study populations, sepsis definitions, and test thresholds. CONCLUSIONS: The most widely studied biomarkers are poorly predictive for sepsis in burns patients. Brain natriuretic peptide, stroke volume index, TNF-alpha, and cell-free DNA showed promise in single studies and should be further evaluated.
A standardized approach to the evaluation of diagnostic markers (including time of sampling, cut-offs, and outcomes) would be useful.
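The pooled sensitivity and specificity figures above each derive from per-study 2 × 2 classification counts. A minimal sketch of the underlying arithmetic, using hypothetical counts chosen only to mirror the procalcitonin estimates (not data from the review):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical: 73 of 100 septic patients flagged, 75 of 100 non-septic cleared
sens, spec = sens_spec(tp=73, fn=27, tn=75, fp=25)
print(sens, spec)  # 0.73 0.75
```

The actual pooled estimates were produced by bivariate random-effects meta-analysis, not by simple averaging; this only shows how each per-study pair is defined.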
Asunto(s)
Quemaduras , Ácidos Nucleicos Libres de Células , Sepsis , Biomarcadores , Quemaduras/complicaciones , Quemaduras/diagnóstico , Diagnóstico Precoz , Humanos , Péptido Natriurético Encefálico , Sensibilidad y Especificidad , Sepsis/diagnóstico
RESUMEN
OBJECTIVES: To evaluate the functional outcome and health-related quality of life of in-hospital cardiac arrest survivors at 6 and 12 months. DESIGN: A longitudinal cohort study. SETTING: Seven metropolitan hospitals in Australia. PATIENTS: Data were collected for hospitalized adults (≥ 18 yr) who experienced in-hospital cardiac arrest, defined as "a period of unresponsiveness, with no observed respiratory effort and the commencement of external cardiac compressions." INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Prior to hospital discharge, patients were approached for consent to participate in 6-month and 12-month telephone interviews. Outcomes included the modified Rankin Scale, Barthel Index, Euro-Quality of Life 5 Dimension 5 Level, return to work, and hospital readmissions. Forty-eight patients (80%) consented to follow-up interviews. The mean age of participants was 67.2 (± 15.3) years, and 33 of 48 (68.8%) were male. Good functional outcome (modified Rankin Scale score ≤ 3) was reported by 31 of 37 participants (83.8%) at 6 months and 30 of 33 (90.9%) at 12 months. The median Euro-Quality of Life-5D index value was 0.73 (0.33-0.84) at 6 months and 0.76 (0.47-0.88) at 12 months. The median Euro-Quality of Life-Visual Analogue Scale score was 70 (55-80) at 6 months and 75 (50-87.5) at 12 months. Problems in all Euro-Quality of Life-5D-5L dimensions were reported frequently at both time points. Hospital readmission was reported by 23 of 37 patients (62.2%) at 6 months and 16 of 33 (48.5%) at 12 months. Less than half of previously working participants had returned to work by 12 months. CONCLUSIONS: The majority of in-hospital cardiac arrest survivors had a good functional outcome and health-related quality of life at 6 months, and this was largely unchanged at 12 months. Despite this, many reported problems with mobility, self-care, usual activities, pain, and anxiety/depression. Return-to-work rates were low, and hospital readmissions were common.
Asunto(s)
Estado Funcional , Paro Cardíaco/epidemiología , Calidad de Vida , Sobrevivientes/estadística & datos numéricos , Actividades Cotidianas , Anciano , Anciano de 80 o más Años , Femenino , Humanos , Masculino , Persona de Mediana Edad , Readmisión del Paciente/estadística & datos numéricos , Reinserción al Trabajo/estadística & datos numéricos
RESUMEN
INTRODUCTION: Unintentional weight gain, overweight, and obesity following solid organ transplantation (SOT) are well established and linked to morbidity and mortality risk factors. No interventional studies aimed at prevention have been undertaken among lung transplant (LTx) recipients. The combination of group education and telephone coaching is effective in the general population but is untested among SOT cohorts. METHODS: A non-randomized, interventional pilot study was conducted among new LTx recipients. The control group received standard care. In addition to standard care, the intervention involved four group education and four individual telephone coaching sessions over 12 months. Data collection occurred at 2 weeks, 3 months, and 12 months post-LTx. Measurements included weight, BMI, fat mass (FM), fat mass index (FMI), fat-free mass (FFM), fat-free mass index (FFMI), waist circumference (WC), visceral adipose tissue (VAT), nutrition knowledge, diet, physical activity, lipid profile, HbA1C, FEV1, six-minute walk distance, and patient satisfaction. RESULTS: Fifteen LTx recipients were recruited into each group. One control participant died 120 days post-LTx, unrelated to the study. By 12 months, there were trends towards smaller increases in weight (6.7±7.2 kg vs. 9.8±11.3 kg), BMI (9.6% of baseline vs. 13%), FM (19.7% vs. 40%), FMI, VAT (7.1% vs. 30.8%), and WC (5.5% vs. 9.5%), and greater increases in FFM and FFMI (all P > .05), among the intervention group. The intervention was well accepted by participants. CONCLUSION: This feasible intervention demonstrated non-significant, but clinically meaningful, favorable weight and body composition trends among LTx recipients over 12 months compared to standard care.
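The body-composition indices reported above (BMI, FMI, FFMI) all share the same form: a mass in kilograms divided by height in metres squared. A minimal sketch with illustrative values only (not participant data):

```python
def mass_index(mass_kg, height_m):
    """Generic mass/height^2 index (kg/m^2), as used for BMI, FMI, and FFMI."""
    return mass_kg / height_m ** 2

# Illustrative: 70 kg total body mass at 1.75 m gives a BMI of about 22.9
print(round(mass_index(70, 1.75), 1))  # 22.9
```

For FMI and FFMI, the numerator is simply the fat mass or fat-free mass component rather than total body mass.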