Results 1 - 9 of 9
1.
Hepatology ; 2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38825975

ABSTRACT

BACKGROUND AND AIMS: Improving the care of decompensated cirrhosis is a significant clinical challenge. The primary aim of this trial was to assess the efficacy of a chronic disease management (CDM) model to reduce liver-related emergency admissions (LREA). The secondary aims were to assess model effects on quality-of-care and patient-reported outcomes. APPROACH AND RESULTS: The study design was a 2-year, multicenter, randomized controlled study with 1:1 allocation of a CDM model versus usual care. The study setting involved both tertiary and community care. Participants were randomly allocated following a decompensated cirrhosis admission. The intervention was a multifaceted CDM model coordinated by a liver nurse. A total of 147 participants (intervention=75, control=71) were recruited, with a median Model for End-Stage Liver Disease score of 19. For the primary outcome, there was no difference in the overall LREA rate for the intervention group versus the control group (incidence rate ratio 0.89; 95% CI: 0.53-1.50, p=0.666) or in actuarial survival (HR=1.14; 95% CI: 0.66-1.96, p=0.646). However, there was a reduced risk of LREA due to encephalopathy in the intervention versus control group (HR=1.87; 95% CI: 1.18-2.96, p=0.007). Significant improvement in quality-of-care measures was seen for the performance of bone density testing (p<0.001), vitamin D testing (p<0.001), and HCC surveillance adherence (p=0.050). For assessable participants (44/74 intervention, 32/71 controls), significant improvements in patient-reported outcomes at 3 months were seen in self-management ability and in quality of life as assessed by visual analog scale (p=0.044). CONCLUSIONS: This CDM intervention did not reduce overall LREA events and may not be effective for this end point in decompensated cirrhosis.
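The trial's primary end point above is reported as an incidence rate ratio (IRR). As a minimal sketch of how such a ratio is computed from event counts and follow-up time, with entirely hypothetical numbers (not the trial's data):

```python
def incidence_rate(events, person_years):
    """Events per person-year of follow-up."""
    return events / person_years

def incidence_rate_ratio(events_a, py_a, events_b, py_b):
    """Ratio of two incidence rates (e.g., intervention vs control)."""
    return incidence_rate(events_a, py_a) / incidence_rate(events_b, py_b)

# Hypothetical counts (NOT the trial's data): 40 admissions over
# 90 person-years vs 45 admissions over 90 person-years.
irr = incidence_rate_ratio(40, 90.0, 45, 90.0)
print(round(irr, 3))  # 0.889 -- a ratio below 1 favours the intervention
```

A ratio whose confidence interval spans 1, as in the trial (0.53-1.50), cannot exclude "no effect", which is why the primary outcome was negative despite a point estimate below 1.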

2.
World J Hepatol ; 14(8): 1576-1583, 2022 Aug 27.
Article in English | MEDLINE | ID: mdl-36157868

ABSTRACT

BACKGROUND: Hepatitis C is a global epidemic, and an estimated 230 000 Australians were living with chronic hepatitis C in 2016. Through effective public health policy and state commitment, Australia has utilised the advent of direct-acting antiviral (DAA) therapy to transform the therapeutic landscape for hepatitis C virus (HCV). However, treatment rates are falling, and novel public health approaches are required to maintain momentum for HCV elimination. Contemporary discourse on cascades of care has focused on expanding testing capabilities, but less attention has been given to linking previously diagnosed patients back to care. Our simple and focused study rests on the premise that hospital admissions are an excellent opportunity to identify and refer previously diagnosed patients for HCV treatment. AIM: To assess whether inpatients with HCV are appropriately referred on for treatment. METHODS: We conducted a retrospective single-centre cohort study that examined all patients with HCV presenting to The Queen Elizabeth Hospital (QEH) inpatient service between January 1 and December 31, 2017. QEH is a tertiary care hospital in South Australia. The main inclusion criteria were patients with active HCV infection who were eligible for DAA therapy. Our study cohort was identified using a comprehensive list of diagnoses based on International Classification of Diseases, 10th Revision, Australian Modification (ICD-10-AM) codes for chronic viral hepatitis. Patients were excluded from the analysis if they had previously received DAA therapy or spontaneously cleared HCV. Patients presenting with decompensated liver cirrhosis or other systemic medical conditions conferring poor short-term prognosis were also excluded from the analysis. The primary outcome of our study was referral of patients for HCV treatment. Secondary outcomes included assessment of factors predicting treatment referral.
RESULTS: There were 309 inpatients identified with hepatitis C as a principal or additional diagnosis between January 1 and December 31, 2017. Of these patients, 148 had active HCV infection without prior treatment or spontaneous clearance. Overall, 131 patients were deemed eligible for DAA treatment and included in the main analysis. Mean patient age was 47.75 ± 1.08 years; 69% of the cohort were male, and 13% identified as Aboriginal or Torres Strait Islander. Liver cirrhosis was a complication of hepatitis C in 7% of the study cohort. Only 10 patients were newly diagnosed with HCV infection during the study period, with the remainder having been diagnosed prior to the study. CONCLUSION: Under 25% of hepatitis C patients presenting to an Australian tertiary hospital were appropriately referred for treatment. Advanced age, cirrhosis and admission under medical specialties were predictors of treatment referral.

3.
Am J Gastroenterol ; 116(11): 2235-2240, 2021 11 01.
Article in English | MEDLINE | ID: mdl-34543257

ABSTRACT

INTRODUCTION: "Push" or "pull" techniques with the use of snares, forceps, baskets, and grasping devices are conventionally used to manage esophageal food bolus impaction (FBI). A novel cap-assisted technique has recently been advocated to reduce the time taken for food bolus (FB) removal. This study aimed to compare the effectiveness of the cap-assisted technique against conventional methods of esophageal FB removal in a randomized controlled trial. METHODS: Consecutive patients with esophageal FBI requiring endoscopic removal, from 3 Australian tertiary hospitals between 2017 and 2019, were randomized to either the cap-assisted technique or the conventional technique. Primary outcomes were technical success and FB retrieval time. Secondary outcomes were en bloc removal rate, procedure-related complications, length of hospital stay, and cost of consumables. RESULTS: Over 24 months, 342 patients with esophageal FBI were randomized to a cap-assisted (n = 171) or conventional (n = 171) technique. Compared with the conventional approach, the cap-assisted technique was associated with (i) shorter FB retrieval time (4.5 ± 0.5 minutes vs 21.7 ± 0.9 minutes, P < 0.001), (ii) shorter total procedure time (23.0 ± 0.6 minutes vs 47.0 ± 1.3 minutes, P < 0.0001), (iii) higher technical success rate (170/171 vs 160/171, P < 0.001), (iv) higher rate of en bloc removal (159/171 vs 48/171, P < 0.001), and (v) lower rate of procedure-related mucosal tear and bleeding (0/171 vs 13/171, P < 0.001). There were no major adverse events or deaths within 30 days in either group. The total cost of consumables was higher in the conventional group (A$19,644.90 vs A$6,239.90). DISCUSSION: This multicenter randomized controlled trial confirmed that the cap-assisted technique is more effective and less costly than the conventional approach and should be first-line treatment for esophageal FBI.


Subject(s)
Esophagoscopy/methods , Esophagus/surgery , Food/adverse effects , Foreign Bodies/surgery , Postoperative Complications/epidemiology , Adult , Aged , Cost-Benefit Analysis/statistics & numerical data , Esophagoscopy/adverse effects , Esophagoscopy/economics , Esophagoscopy/instrumentation , Esophagus/diagnostic imaging , Esophagus/pathology , Female , Foreign Bodies/diagnosis , Foreign Bodies/etiology , Foreign Bodies/pathology , Hospitals, High-Volume/statistics & numerical data , Humans , Length of Stay/statistics & numerical data , Male , Middle Aged , Postoperative Complications/etiology , Tertiary Care Centers/statistics & numerical data , Treatment Outcome
4.
Eur J Gastroenterol Hepatol ; 32(10): 1381-1389, 2020 10.
Article in English | MEDLINE | ID: mdl-31895911

ABSTRACT

AIM: The objective was to study the long-term (lifetime) cost-effectiveness of four different hepatitis C virus (HCV) treatment models of care (MOC) with direct-acting antiviral drugs. METHODS: A cohort Markov model-based probabilistic cost-effectiveness analysis (CEA) was undertaken, extrapolating up to 30 years from cost and outcome data collected from a primary study involving a real-life Australian cohort. In this study, noncirrhotic patients treated for HCV from 1 March 2016 to 28 February 2017 at four major public hospitals and liaising sites in South Australia were studied retrospectively. The MOC were classified, depending on the person providing patient workup, treatment and monitoring, into MOC1 (specialist), MOC2 (mixed specialist and hepatitis nurse), MOC3 (hepatitis nurse) and MOC4 (general practitioner, GP). Incremental costs were estimated from the Medicare perspective. Incremental outcomes were estimated based on the quality-adjusted life years (QALY) gained by achieving a sustained virological response. A cost-effectiveness threshold of A$50 000 per QALY gained, the implicit criterion used for assessing the cost-effectiveness of new pharmaceuticals and medical services in Australia, was assumed. Net monetary benefit (NMB) estimates based on this threshold were calculated. RESULTS: A total of 1373 patients, 64% males, mean age 50 (SD ±11) years, were studied. In the CEA, MOC4 and MOC2 clearly dominated MOC1 over 30 years with lower costs and higher QALYs. Similarly, NMB was the highest in MOC4, followed by MOC2. CONCLUSION: Decentralized care using GP and mixed consultant-nurse models were cost-effective ways of promoting HCV treatment uptake in the setting of unrestricted access to new antivirals.
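The net monetary benefit framework used in this study converts health gains into dollars at a willingness-to-pay threshold. A minimal sketch of the calculation, assuming the A$50 000/QALY threshold from the abstract; the incremental QALY and cost figures below are hypothetical, not results from the study:

```python
WTP_THRESHOLD_AUD = 50_000  # A$ per QALY gained, the threshold assumed in the study

def net_monetary_benefit(qalys_gained, incremental_cost, threshold=WTP_THRESHOLD_AUD):
    """NMB = (QALYs gained * willingness-to-pay threshold) - incremental cost.
    A positive NMB means the strategy is cost-effective at that threshold."""
    return qalys_gained * threshold - incremental_cost

# Hypothetical incremental QALYs and costs for two models of care
# against a common comparator (illustrative figures only):
print(net_monetary_benefit(0.20, 4_000))  # 6000.0 -> cost-effective
print(net_monetary_benefit(0.05, 6_000))  # -3500.0 -> not cost-effective
```

Ranking strategies by NMB, as the study does, is equivalent to comparing incremental cost-effectiveness ratios against the threshold, but avoids the ratio's instability when QALY differences are small.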


Subject(s)
Hepatitis C, Chronic , Hepatitis C , Aged , Antiviral Agents/therapeutic use , Australia/epidemiology , Cost-Benefit Analysis , Female , Hepacivirus , Hepatitis C/drug therapy , Hepatitis C, Chronic/diagnosis , Hepatitis C, Chronic/drug therapy , Hepatitis C, Chronic/epidemiology , Humans , Male , Markov Chains , Middle Aged , National Health Programs , Quality-Adjusted Life Years , Retrospective Studies , South Australia/epidemiology
5.
World J Hepatol ; 9(17): 791-796, 2017 Jun 18.
Article in English | MEDLINE | ID: mdl-28660013

ABSTRACT

AIM: To evaluate the performance of the aspartate aminotransferase-to-platelet ratio index (APRI) score against FibroScan in predicting the presence of fibrosis. METHODS: Data of patients who concurrently had an APRI score, FibroScan and liver biopsy to assess their hepatitis C virus (HCV) or hepatitis B virus (HBV) infection over 6 years were retrospectively reviewed, and details of their disease characteristics and demographics were recorded. Advanced fibrosis was defined as ≥ F3. RESULTS: Of the 3619 patients who had FibroScans and APRI scores for HCV and HBV, 133 (47.5 ± 11.3 years; 97 male, 36 female) had concurrent liver biopsy. Advanced liver fibrosis was found in 27/133 (20%, F3 = 21 and F4 = 6) patients. Although the APRI score (P < 0.001, AUC = 0.83) and FibroScan (P < 0.001, AUC = 0.84) predicted the presence of advanced fibrosis, the sensitivities and specificities were only modest (APRI score: 51.9% sensitivity, 84.9% specificity; FibroScan: 63% sensitivity, 84% specificity). Whilst 13/27 (48%) patients with advanced fibrosis had APRI ≤ 1.0, no patient with APRI ≤ 0.5 had advanced fibrosis, i.e., a cut-off of 0.5 had 100% sensitivity. The use of APRI ≤ 0.5 would avoid the need for FibroScan in 43% of patients. CONCLUSION: The APRI score and FibroScan performed equally well in predicting advanced fibrosis. A proposed APRI cut-off score of 0.5 could be used as a screening tool for FibroScan, as a cut-off score of 1.0 would miss up to 48% of patients with advanced fibrosis. Further prospective validation studies are required to confirm this finding.
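The APRI score evaluated above has a simple closed form: (AST divided by the upper limit of normal for AST) × 100, divided by the platelet count in 10⁹/L. A sketch of the calculation and the screening cut-off the study proposes; the patient values and the ULN of 40 IU/L are illustrative assumptions, not data from the study:

```python
def apri(ast_iu_l, ast_uln_iu_l, platelets_10e9_l):
    """AST-to-platelet ratio index: (AST / ULN * 100) / platelet count."""
    return (ast_iu_l / ast_uln_iu_l * 100) / platelets_10e9_l

# Hypothetical patient: AST 80 IU/L with an assumed ULN of 40 IU/L,
# platelet count 160 x 10^9/L.
score = apri(80, 40, 160)
print(round(score, 2))  # 1.25

# Triage rule suggested by the abstract: APRI <= 0.5 effectively rules
# out advanced fibrosis; higher scores warrant FibroScan assessment.
print(score > 0.5)  # True
```

Under the abstract's figures, this rule would spare roughly 43% of patients a FibroScan while missing no advanced fibrosis in the studied cohort.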

6.
Gastroenterology ; 152(8): 1975-1984.e8, 2017 06.
Article in English | MEDLINE | ID: mdl-28274849

ABSTRACT

BACKGROUND & AIMS: Primary sclerosing cholangitis (PSC) is an orphan hepatobiliary disorder associated with inflammatory bowel disease (IBD). We aimed to estimate the risk of disease progression based on distinct clinical phenotypes in a large international cohort of patients with PSC. METHODS: We performed a retrospective outcome analysis of patients diagnosed with PSC from 1980 through 2010 at 37 centers in Europe, North America, and Australia. For each patient, we collected data on sex, clinician-reported age at and date of PSC and IBD diagnoses, phenotypes of IBD and PSC, and date and indication of IBD-related surgeries. The primary and secondary endpoints were liver transplantation or death (LTD) and hepatopancreatobiliary malignancy, respectively. Cox proportional hazards models were applied to determine the effects of individual covariates on rates of clinical events, with time-to-event analysis ascertained through Kaplan-Meier estimates. RESULTS: Of the 7121 patients in the cohort, 2616 met the primary endpoint (median time to event of 14.5 years) and 721 developed hepatopancreatobiliary malignancy. The most common malignancy was cholangiocarcinoma (n = 594); patients of advanced age at diagnosis had an increased incidence compared with younger patients (incidence rate: 1.2 per 100 patient-years for patients younger than 20 years old, 6.0 per 100 patient-years for patients 21-30 years old, 9.0 per 100 patient-years for patients 31-40 years old, 14.0 per 100 patient-years for patients 41-50 years old, 15.2 per 100 patient-years for patients 51-60 years old, and 21.0 per 100 patient-years for patients older than 60 years). Of all patients with PSC studied, 65.5% were men, 89.8% had classical or large-duct disease, and 70.0% developed IBD at some point. 
Assessing the development of IBD as a time-dependent covariate, Crohn's disease and no IBD (both vs ulcerative colitis) were associated with a lower risk of LTD (unadjusted hazard ratio [HR], 0.62; P < .001 and HR, 0.90; P = .03, respectively) and malignancy (HR, 0.68; P = .008 and HR, 0.77; P = .004, respectively). Small-duct PSC was associated with a lower risk of LTD or malignancy compared with classic PSC (HR, 0.30 and HR, 0.15, respectively; both P < .001). Female sex was also associated with a lower risk of LTD or malignancy (HR, 0.88; P = .002 and HR, 0.68; P < .001, respectively). In multivariable analyses assessing the primary endpoint, small-duct PSC characterized a low-risk phenotype in both sexes (adjusted HR for men, 0.23; P < .001 and adjusted HR for women, 0.48; P = .003). Conversely, patients with ulcerative colitis had an increased risk of liver disease progression compared with patients with Crohn's disease (HR, 1.56; P < .001) or no IBD (HR, 1.15; P = .002). CONCLUSIONS: In an analysis of data from individual patients with PSC worldwide, we found significant variation in clinical course associated with age at diagnosis, sex, and ductal and IBD subtypes. The survival estimates provided might be used to estimate risk levels for patients with PSC and select patients for clinical trials.


Subject(s)
Cholangitis, Sclerosing/epidemiology , Colitis, Ulcerative/epidemiology , Crohn Disease/epidemiology , Adult , Age Distribution , Australia/epidemiology , Chi-Square Distribution , Cholangitis, Sclerosing/diagnosis , Cholangitis, Sclerosing/mortality , Cholangitis, Sclerosing/surgery , Colitis, Ulcerative/diagnosis , Colitis, Ulcerative/mortality , Colitis, Ulcerative/surgery , Crohn Disease/diagnosis , Crohn Disease/mortality , Crohn Disease/surgery , Disease Progression , Europe/epidemiology , Female , Humans , Incidence , Kaplan-Meier Estimate , Liver Transplantation , Male , Middle Aged , Multivariate Analysis , North America/epidemiology , Phenotype , Prognosis , Proportional Hazards Models , Retrospective Studies , Risk Assessment , Risk Factors , Sex Distribution , Time Factors , Young Adult
7.
Gastrointest Endosc ; 85(6): 1212-1217, 2017 Jun.
Article in English | MEDLINE | ID: mdl-27894929

ABSTRACT

BACKGROUND AND AIMS: This study aims to evaluate the role of unsedated, ultrathin disposable gastroscopy (TDG) against conventional gastroscopy (CG) in the screening and surveillance of gastroesophageal varices (GEVs) in patients with liver cirrhosis. METHODS: Forty-eight patients (56.4 ± 1.3 years; 38 male, 10 female) with liver cirrhosis referred for screening (n = 12) or surveillance (n = 36) of GEVs were prospectively enrolled. Unsedated gastroscopy was initially performed with TDG, followed by CG with conscious sedation. The 2 gastroscopies were performed by different endoscopists blinded to the results of the previous examination. Video recordings of both gastroscopies were validated by an independent investigator in a random, blinded fashion. Endpoints were accuracy and interobserver agreement of detecting GEVs, safety, and potential cost saving. RESULTS: CG identified GEVs in 26 (54%) patients, 10 of whom (21%) had high-risk esophageal varices (HREV). Compared with CG, TDG had an accuracy of 92% for the detection of all GEVs, which increased to 100% for high-risk GEVs. The interobserver agreement for detecting all GEVs on TDG was 88% (κ = 0.74). This increased to 94% (κ = 0.82) for high-risk GEVs. There were no serious adverse events. CONCLUSIONS: Unsedated TDG is safe and has high diagnostic accuracy and interobserver reliability for the detection of GEVs. The use of clinic-based TDG would allow immediate determination of a follow-up plan, making it attractive for variceal screening and surveillance programs. (Clinical trial (ANZCTR) registration number: ACTRN12616001103459.)


Subject(s)
Disposable Equipment , Equipment Design , Esophageal and Gastric Varices/diagnosis , Gastroscopes , Conscious Sedation , Equipment Reuse , Esophageal and Gastric Varices/etiology , Female , Gastroscopy/instrumentation , Humans , Liver Cirrhosis/complications , Male , Mass Screening , Middle Aged , Prospective Studies , Reproducibility of Results
8.
World J Gastroenterol ; 21(45): 12835-42, 2015 Dec 07.
Article in English | MEDLINE | ID: mdl-26668507

ABSTRACT

AIM: To evaluate the practice of nutritional assessment and management of hospitalised patients with cirrhosis and the impact of malnutrition on their clinical outcome. METHODS: This was a retrospective cohort study on patients with liver cirrhosis consecutively admitted to the Department of Gastroenterology and Hepatology at the Royal Adelaide Hospital over 24 mo. Details were gathered related to the patients' demographics, disease severity, nutritional status and assessment, biochemistry and clinical outcomes. Nutritional status was assessed by a dietician and determined by subjective global assessment. Estimated energy and protein requirements were calculated by the Simple Ratio Method. Intake was estimated from dietary history and/or food charts, and represented as a percentage of estimated daily requirements. Median duration of follow-up was 14.9 (0-41.4) mo. RESULTS: Of the 231 cirrhotic patients (167 male, age: 56.3 ± 0.9 years, 9% Child-Pugh A, 42% Child-Pugh B and 49% Child-Pugh C), 131 (57%) had formal nutritional assessment during their admission, and 74 (56%) of these were judged to have malnutrition. In-hospital caloric (15.6 ± 1.2 kcal/kg vs 23.7 ± 2.3 kcal/kg, P = 0.0003) and protein (0.65 ± 0.06 g/kg vs 1.01 ± 0.07 g/kg, P = 0.0003) intakes were significantly reduced in patients with malnutrition. Of the malnourished cohort, 12 (16%) received enteral nutrition during hospitalisation and only 6 (8%) received ongoing dietetic review and assessment following discharge from hospital. The overall mortality was 51%, and was higher in patients with malnutrition compared with those without (HR = 5.29, 95%CI: 2.31-12.1; P < 0.001). CONCLUSION: Malnutrition is common in hospitalised patients with cirrhosis and is associated with higher mortality. Formal nutritional assessment, however, is inadequate. This highlights the need for meticulous nutritional evaluation and management in these patients.
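The Simple Ratio Method mentioned above estimates daily requirements as fixed per-kilogram ratios multiplied by body weight. A hedged sketch of the arithmetic; the 30 kcal/kg and 1.2 g/kg defaults are illustrative targets commonly cited for cirrhosis, not ratios reported by this study:

```python
def estimated_requirements(weight_kg, kcal_per_kg=30, protein_g_per_kg=1.2):
    """Simple Ratio Method: daily energy (kcal) and protein (g) as
    body weight times a fixed per-kilogram ratio. The default ratios
    are illustrative assumptions, not values taken from this study."""
    return weight_kg * kcal_per_kg, weight_kg * protein_g_per_kg

def intake_adequacy_pct(actual, required):
    """Observed intake as a percentage of the estimated requirement."""
    return 100 * actual / required

kcal_req, protein_req = estimated_requirements(70)  # 2100 kcal, 84 g for 70 kg
# The malnourished group's mean intake of 15.6 kcal/kg covers only about
# half of a 30 kcal/kg target:
print(round(intake_adequacy_pct(70 * 15.6, kcal_req)))  # 52
```

Expressing intake this way, as the study does, makes shortfalls comparable across patients of different weights.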


Subject(s)
Hospitalization , Inpatients , Liver Cirrhosis/therapy , Malnutrition/therapy , Nutritional Support/methods , Chronic Disease , Energy Intake , Energy Metabolism , Female , Hospital Mortality , Humans , Liver Cirrhosis/diagnosis , Liver Cirrhosis/mortality , Liver Cirrhosis/physiopathology , Male , Malnutrition/diagnosis , Malnutrition/mortality , Malnutrition/physiopathology , Middle Aged , Nutrition Assessment , Nutritional Status , Retrospective Studies , Risk Factors , South Australia , Time Factors , Treatment Outcome
9.
Curr Opin Gastroenterol ; 29(2): 208-15, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23314811

ABSTRACT

PURPOSE OF REVIEW: To highlight the recent developments in nutritional support for critically ill patients. RECENT FINDINGS: Increasing data support the benefits of early initiation of enteral nutrition, with improvements in small intestinal absorption and clinical outcomes. In contrast to previous belief, recent data suggest that caloric administration greater than 65-70% of the daily requirement is associated with poorer clinical outcomes, especially when supplemental parenteral nutrition is used to increase caloric delivery. The role of supplementary micronutrients and anti-inflammatory lipids has been further evaluated but remains inconclusive, and their use is not currently recommended. SUMMARY: Together, current findings indicate that intragastric enteral nutrition should be initiated within 24 h of admission to the ICU and that supplementary parenteral nutrition should be avoided. Future research should aim to clarify the optimal energy delivery for best clinical outcomes, and the role of small intestinal function and its flora in nutritional care and clinical outcomes.


Subject(s)
Critical Illness/therapy , Nutritional Support/methods , Critical Care/methods , Early Medical Intervention/methods , Enteral Nutrition/methods , Humans , Intensive Care Units , Parenteral Nutrition/methods , Time Factors