Results 1 - 20 of 21
1.
Eur J Immunol ; 54(8): e2350678, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38700055

ABSTRACT

BACKGROUND AND AIMS: Second-generation direct-acting antivirals (2G DAA) to cure HCV have led to dramatic clinical improvements. HCV-associated hepatocellular carcinoma (HCC), however, remains common. Impaired immune tumor surveillance may play a role in HCC development. Our cohort study evaluated the effects of innate immune types and clinical variables on outcomes including HCC. METHODS: Participants underwent full HLA class I/KIR typing and long-term HCV follow-up. RESULTS: A total of 353 HCV+ participants were followed for a mean of 7 years. Cirrhosis was present in 25% at baseline and developed in 12% during follow-up. 158 participants received 2G DAA therapy. HCC developed in 20 subjects without HCV therapy and in 24 after HCV therapy, 10 of these after 2G DAA. Two factors predicted HCC among 2G DAA-treated patients: cirrhosis (OR 10.0, p = 0.002) and HLA/KIR profiles predicting weak natural killer (NK) cell-mediated immunity (NK-cell complementation groups 6, 9, 11, 12; OR 5.1, p = 0.02). Without 2G DAA therapy, cirrhosis was the main clinical predictor of HCC (OR 30.8, p < 0.0001), and weak NK-cell-mediated immunity did not predict HCC. CONCLUSION: Cirrhosis is the main risk state predisposing to HCC, but weak NK-cell-mediated immunity may predispose to post-2G DAA HCC more than intermediate or strong NK-cell-mediated immunity.


Subject(s)
Antiviral Agents , Carcinoma, Hepatocellular , Hepacivirus , Killer Cells, Natural , Liver Neoplasms , Receptors, KIR , Humans , Carcinoma, Hepatocellular/immunology , Liver Neoplasms/immunology , Killer Cells, Natural/immunology , Male , Antiviral Agents/therapeutic use , Female , Middle Aged , Receptors, KIR/immunology , Aged , Hepacivirus/immunology , Hepatitis C/immunology , Hepatitis C/drug therapy , Hepatitis C/complications , HLA Antigens/immunology , Adult , Immunity, Cellular , Follow-Up Studies , Hepatitis C, Chronic/drug therapy , Hepatitis C, Chronic/immunology , Hepatitis C, Chronic/complications
2.
Liver Transpl ; 28(7): 1144-1157, 2022 07.
Article in English | MEDLINE | ID: mdl-35226793

ABSTRACT

Living donor liver transplantation (LDLT) is an attractive option to decrease waitlist dropout, particularly for patients with hepatocellular carcinoma (HCC) who face lengthening waiting times. Using the United Network for Organ Sharing (UNOS) national database, trends in LDLT utilization for patients with HCC were evaluated, and post-LT outcomes for LDLT versus deceased donor liver transplantation (DDLT) were compared. From 1998 to 2018, LT was performed in 20,161 patients with HCC including 726 (3.6%) who received LDLT. The highest LDLT utilization was prior to the 2002 HCC Model for End-Stage Liver Disease (MELD) exception policy (17.5%) and dropped thereafter (3.1%) with a slight increase following the 6-month wait policy in 2015 (3.8%). LDLT was more common in patients from long-wait UNOS regions with blood type O, in those with larger total tumor diameter (2.3 vs. 2.1 cm, p = 0.02), and higher alpha-fetoprotein at LT (11.5 vs. 9.0 ng/ml, p = 0.04). The 5-year post-LT survival (LDLT 77% vs. DDLT 75%), graft survival (72% vs. 72%), and HCC recurrence (11% vs. 13%) were similar between groups (all p > 0.20). In conclusion, LDLT utilization for HCC has remained low since 2002 with only a slight increase after the 6-month wait policy introduction in 2015. Given the excellent post-LT survival, LDLT appears to be an underutilized but valuable option for patients with HCC, especially those at high risk for waitlist dropout.


Subject(s)
Carcinoma, Hepatocellular , End Stage Liver Disease , Liver Neoplasms , Liver Transplantation , End Stage Liver Disease/etiology , End Stage Liver Disease/surgery , Humans , Liver Transplantation/adverse effects , Living Donors , Neoplasm Recurrence, Local , Retrospective Studies , Severity of Illness Index , Treatment Outcome
3.
Hepatology ; 71(3): 943-954, 2020 03.
Article in English | MEDLINE | ID: mdl-31344273

ABSTRACT

BACKGROUND AND AIMS: United Network for Organ Sharing (UNOS) recently implemented a national policy granting priority listing for liver transplantation (LT) in patients who achieved down-staging of hepatocellular carcinoma (HCC) to Milan criteria. We aimed to evaluate the national experience on down-staging by comparing two down-staging groups with (1) tumor burden meeting UNOS down-staging (UNOS-DS) inclusion criteria and (2) "all-comers" (AC-DS) with initial tumor burden beyond UNOS-DS criteria versus patients always within Milan. APPROACH AND RESULTS: This is a retrospective analysis of the UNOS database of 3,819 patients who underwent LT from 2012 to 2015, classified as always within Milan (n = 3,276), UNOS-DS (n = 422), and AC-DS (n = 121). Median time to LT was 12.8 months in long wait regions, 6.5 months in mid wait regions (MWR), and 2.6 months in short wait regions (SWR). On explant, vascular invasion was found in 23.7% of AC-DS versus 16.9% of UNOS-DS and 14.4% of Milan (P = 0.002). Kaplan-Meier 3-year post-LT survival was 83.2% for Milan, 79.1% for UNOS-DS (P = 0.17 vs. Milan), and 71.4% for AC-DS (P = 0.04 vs. Milan). Within down-staging groups, risk of post-LT death in multivariable analysis was increased in SWR or MWR (hazard ratio [HR], 3.1; P = 0.005) and with alpha-fetoprotein (AFP) ≥ 100 ng/mL at LT (HR, 2.4; P = 0.009). The 3-year HCC recurrence probability was 6.9% for Milan, 12.8% for UNOS-DS, and 16.7% for AC-DS (P < 0.001). In down-staging groups, AFP ≥ 100 (HR, 2.6; P = 0.02) was the only independent predictor of HCC recurrence. CONCLUSIONS: Our results validated UNOS-DS criteria based on comparable 3-year survival between UNOS-DS and Milan groups. Additional refinements based on AFP and wait time may further improve post-LT outcomes in down-staging groups, especially given that reported 3-year recurrence was higher than in those always within Milan criteria.


Subject(s)
Carcinoma, Hepatocellular/surgery , Liver Neoplasms/surgery , Liver Transplantation , Tumor Burden , Waiting Lists , alpha-Fetoproteins/analysis , Aged , Carcinoma, Hepatocellular/blood , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/pathology , Female , Humans , Liver Neoplasms/blood , Liver Neoplasms/mortality , Liver Neoplasms/pathology , Male , Middle Aged , Neoplasm Recurrence, Local/epidemiology , Neoplasm Staging , Retrospective Studies
4.
Liver Transpl ; 26(9): 1100-1111, 2020 09.
Article in English | MEDLINE | ID: mdl-32531867

ABSTRACT

Liver transplantation (LT) recipients with hepatocellular carcinoma (HCC) receive a higher proportion of livers from donation after circulatory death (DCD) donors compared with non-HCC etiologies. Nevertheless, data on outcomes in patients with HCC receiving DCD grafts are limited. We evaluated the influence of DCD livers on post-LT outcome among HCC patients. We identified 7563 patients in the United Network for Organ Sharing (UNOS) database who underwent LT with Model for End-Stage Liver Disease score exceptions from 2012 to 2016, including 567 (7.5%) who received a DCD donor organ and 6996 (92.5%) who received a donation after brain death (DBD) donor organ. Kaplan-Meier probabilities of post-LT HCC recurrence at 3 years were 7.6% for DCD and 6.4% for DBD recipients (P = 0.67) and post-LT survival at 3 years was 81.1% versus 85.5%, respectively (P = 0.008). On multivariate analysis, DCD donor (hazard ratio, 1.38; P = 0.005) was an independent predictor of post-LT mortality. However, a survival difference after LT was only observed in subgroups at higher risk for HCC recurrence including Risk Estimation of Tumor Recurrence After Transplant (RETREAT) score ≥4 (DCD 57.0% versus DBD 72.6%; P = 0.02), alpha-fetoprotein (AFP) ≥100 (60.1% versus 76.9%; P = 0.049), and multiple viable tumors on last imaging before LT (69.9% versus 83.1%; P = 0.002). In this analysis of HCC patients receiving DCD versus DBD livers in the UNOS database, we found that patients with a low-to-moderate risk of HCC recurrence (80%-90% of the DCD cohort) had equivalent survival regardless of donor type. It appears that DCD donation can best be used to increase the donor pool for HCC patients with decompensated cirrhosis or partial response/stable disease after locoregional therapy with AFP at LT <100 ng/mL.
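The 3-year recurrence and survival probabilities quoted above are Kaplan-Meier estimates. As a minimal, self-contained sketch of the product-limit estimator behind such figures (toy data only, not the UNOS cohort):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of survival S(t).

    times  : follow-up time for each subject
    events : 1 if the event (e.g., death) occurred, 0 if censored
    Returns [(t, S(t)), ...] at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # count events tied at time t (censored subjects leave the
        # risk set but do not change the survival estimate)
        j = i
        deaths = 0
        while j < len(data) and data[j][0] == t:
            deaths += data[j][1]
            j += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= j - i  # everyone observed at t leaves the risk set
        i = j
    return curve

# toy cohort: follow-up in years, 1 = death, 0 = censored
print(kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 1]))
```

In practice such curves are computed with dedicated survival packages, and group differences are tested with log-rank or regression methods as in the studies above; the sketch only illustrates the estimator itself.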


Subject(s)
Carcinoma, Hepatocellular , End Stage Liver Disease , Liver Neoplasms , Liver Transplantation , Tissue and Organ Procurement , Carcinoma, Hepatocellular/surgery , Death , Graft Survival , Humans , Liver Neoplasms/surgery , Liver Transplantation/adverse effects , Neoplasm Recurrence, Local/epidemiology , Retrospective Studies , Severity of Illness Index , Tissue Donors
5.
Can Liver J ; 6(3): 358-362, 2023 Oct.
Article in English | MEDLINE | ID: mdl-38020188

ABSTRACT

Background: Frailty is a clinical state of increased vulnerability and is common in patients with cirrhosis. The liver frailty index (LFI) is a validated tool to evaluate frailty in cirrhosis, comprising grip strength, chair stands, and balance tests. The chair-stand test is an easy-to-conduct frailty subcomponent that does not require specialized equipment and may be valuable for predicting adverse clinical outcomes in cirrhosis. The objective of this study was to determine whether the chair-stand test is an independent predictor of mortality and hospitalization in cirrhosis. Methods: A retrospective review of 787 patients with cirrhosis was conducted. Chair-stand times were collected at baseline in person and divided into three groups: <10 seconds (n = 276), 10-15 seconds (n = 290), and >15 seconds (n = 221). Fine-Gray proportional hazards regression models were used to evaluate the association between chair-stand times and the outcomes of mortality and non-elective hospitalization. Results: The hazard of mortality (HR 3.21, 95% CI 2.16-4.78, p < 0.001) and non-elective hospitalization (HR 2.24, 95% CI 1.73-2.91, p < 0.001) was increased in group 3 compared with group 1. A chair-stand time >15 seconds was associated with increased all-cause mortality (HR 2.78, 95% CI 2.01-3.83, p < 0.001) and non-elective hospitalization (HR 1.84, 95% CI 1.48-2.29, p < 0.001) compared with <15 seconds. Conclusions: A chair-stand time >15 seconds is independently associated with mortality and non-elective hospitalization. The test holds promise as a rapid prognostication tool in cirrhosis. Future work will include external validation and virtual assessment in this population.

6.
Alzheimers Dement (Amst) ; 14(1): e12321, 2022.
Article in English | MEDLINE | ID: mdl-35845260

ABSTRACT

Introduction: The impact of hepatorenal function on plasma biomarkers of neuropathology is unknown. Herein, we measured several plasma biomarkers in patients with cirrhosis. Methods: Plasma phosphorylated tau (p-tau181), neurofilament light chain (NfL), glial fibrillary acidic protein (GFAP), total tau (t-tau), and ubiquitin carboxyl-terminal hydrolase L1 (UCHL1) were measured in 135 adults with cirrhosis and 22 healthy controls using Simoa. Within cirrhosis, associations between biomarkers and hepatorenal function were explored using linear regression. Results: p-tau181, NfL, t-tau, and UCHL1 were increased 2- to 4-fold in cirrhosis, whereas GFAP was not increased. Within cirrhosis, creatinine moderately correlated with p-tau181 (ß = 0.75, P < .01), NfL (ß = 0.32, P < .01), and t-tau (ß = 0.31, P < .01), but not GFAP (ß = -0.01, P = .88) or UCHL1 (ß = -0.05, P = .60), whereas albumin showed weak, inverse correlations: p-tau181 (ß = -0.18, P < .01), NfL (ß = -0.22, P < .01), GFAP (ß = -0.17, P < .05), t-tau (ß = -0.20, P = .02), and UCHL1 (ß = -0.15, P = .09). Conclusions: Elevated p-tau181, NfL, and t-tau in cirrhosis were associated with renal impairment and hypoalbuminemia, suggesting that hepatorenal function may be important when interpreting plasma biomarkers of neuropathology.

7.
Hepatol Commun ; 6(1): 237-246, 2022 01.
Article in English | MEDLINE | ID: mdl-34558844

ABSTRACT

Physical frailty and impaired cognition are common in patients with cirrhosis. Physical frailty can be assessed using performance-based tests, but the extent to which impaired cognition may impact performance is not well characterized. We assessed the relationship between impaired cognition and physical frailty in patients with cirrhosis. We enrolled 1,623 ambulatory adult patients with cirrhosis awaiting liver transplantation at 10 sites. Frailty was assessed with the liver frailty index (LFI; "frail," LFI ≥ 4.4). Cognition was assessed at the same visit with the number connection test (NCT); impaired cognition was examined as a continuous variable in the primary analysis, with longer NCT times (more seconds) indicating worse cognition. For descriptive statistics, "impaired cognition" was defined as NCT ≥ 45 seconds. Linear regression associated frailty with impaired cognition; competing-risk regression estimated subhazard ratios (sHRs) of wait-list mortality (i.e., death/delisting for sickness). Median NCT was 41 seconds, and 42% had impaired cognition. Median LFI (4.2 vs. 3.8) and rates of frailty (38% vs. 20%) differed between those with and without impaired cognition. In adjusted analysis, every 10-second NCT increase was associated with a 0.08-LFI increase (95% confidence interval [CI], 0.07-0.10). In univariable analysis, both frailty (sHR, 1.63; 95% CI, 1.43-1.87) and impaired cognition (sHR, 1.07; 95% CI, 1.04-1.10) were associated with wait-list mortality. After adjustment, frailty but not impaired cognition remained significantly associated with wait-list mortality (sHR, 1.55; 95% CI, 1.33-1.79). Impaired cognition mediated 7.4% (95% CI, 2.0%-16.4%) of the total effect of frailty on 1-year wait-list mortality. Conclusion: Patients with cirrhosis and greater cognitive impairment displayed higher rates of physical frailty, yet frailty was independently associated with wait-list mortality while impaired cognition was not. Our data support using the LFI to understand mortality risk in patients with cirrhosis, even when concurrent cognitive impairment varies.


Subject(s)
Cognitive Dysfunction/etiology , Frailty/etiology , Liver Cirrhosis/complications , Aged , Female , Humans , Liver Cirrhosis/psychology , Liver Transplantation , Male , Middle Aged , Prospective Studies , Waiting Lists/mortality
8.
Transplantation ; 105(6): 1297-1302, 2021 06 01.
Article in English | MEDLINE | ID: mdl-33347261

ABSTRACT

BACKGROUND: The use of living donor liver transplantation (LDLT) for primary liver transplantation (LT) may quell concerns about allocating deceased donor organs if the need for retransplantation (re-LT) arises because the primary LT did not draw from the limited organ pool. However, outcomes of re-LT after LDLT are poorly studied. The purpose of this study was to analyze the Adult to Adult Living Donor Liver Transplantation Study (A2ALL) data to report outcomes of re-LT after LDLT, with a focus on long-term survival after re-LT. METHODS: A retrospective review of A2ALL data collected between 1998 and 2014 was performed. Patients were excluded if they received a deceased donor LT. Demographic data, postoperative outcomes and complications, graft and patient survival, and predictors of re-LT and patient survival were assessed. RESULTS: Of the 1065 patients who underwent LDLT during the study period, 110 recipients (10.3%) required re-LT. In multivariable analyses, hepatitis C virus, longer length of stay at LDLT, hepatic artery thrombosis, biliary stricture, infection, and disease recurrence were associated with an increased risk of re-LT. Patient survival among re-LT patients was significantly inferior to that of patients who underwent primary transplant only at 1 (86% versus 92%), 5 (64% versus 82%), and 10 years (44% versus 68%). CONCLUSIONS: Approximately 10% of A2ALL patients who underwent primary LDLT required re-LT. Compared with patients who underwent primary LT, survival among re-LT recipients was worse at 1, 5, and 10 years after LT, and re-LT was associated with a significantly increased risk of death in multivariable modeling (hazard ratio, 2.29; P < 0.001).


Subject(s)
Liver Transplantation , Living Donors , Reoperation , Adult , Age Factors , Female , Humans , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Male , Middle Aged , North America , Reoperation/adverse effects , Reoperation/mortality , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
9.
Transplantation ; 105(4): 824-831, 2021 04 01.
Article in English | MEDLINE | ID: mdl-32433235

ABSTRACT

BACKGROUND: Share 35 was a policy implemented in 2013 to increase regional sharing of deceased donor livers to patients with model for end-stage liver disease (MELD) scores ≥ 35, in order to decrease waitlist mortality for the sickest patients awaiting liver transplantation (LT). The purpose of this study was to determine whether live donor liver transplantation (LDLT) volume was impacted by the shift in allocation of deceased donor livers to patients with higher MELD scores. METHODS: Using United Network for Organ Sharing/Organ Procurement and Transplantation Network Standard Transplant Analysis and Research files, we identified all adults who received a primary LT between October 1, 2008, and March 31, 2018. LT from October 1, 2008, through June 30, 2013, was designated as the pre-Share 35 era, and July 1, 2013, through March 31, 2018, as the post-Share 35 era. Primary outcomes included transplant volumes, graft survival, and patient survival in both eras. RESULTS: In total, 48 779 primary adult single-organ LTs occurred during the study period (22 255 pre-Share 35, 26 524 post). LDLT increased significantly (6.8% post versus 5.7% pre, P < 0.001). LDLT volume varied significantly by region (P < 0.001), with regions 2, 4, 5, and 8 demonstrating significant increases in LDLT volume post-Share 35. The number of centers performing LDLT increased only in regions 4, 6, and 11. Throughout the 2 eras, there was no difference in graft or patient survival for LDLT recipients. CONCLUSIONS: Overall, LDLT volume increased following the implementation of Share 35, which was largely due to increased LDLT volume at centers with experience in LDLT, and corresponded to significant geographic variation in LDLT utilization.


Subject(s)
Decision Support Techniques , Donor Selection , End Stage Liver Disease/surgery , Liver Transplantation , Living Donors/supply & distribution , End Stage Liver Disease/diagnosis , Female , Humans , Liver Transplantation/adverse effects , Male , Middle Aged , Predictive Value of Tests , Risk Assessment , Risk Factors , Severity of Illness Index , Time Factors , Treatment Outcome , United States
10.
Circulation ; 120(11): 935-40, 2009 Sep 15.
Article in English | MEDLINE | ID: mdl-19720938

ABSTRACT

BACKGROUND: Use of an internal mammary artery (IMA) is a well-recognized, nationally endorsed quality indicator for evaluating the process of operative care for coronary artery bypass graft surgery. An objective assessment of the current status of IMA use has not been systematically performed. METHODS AND RESULTS: This cross-sectional observational study analyzed data on 541 368 coronary artery bypass graft surgery procedures reported by 745 hospitals in the Society of Thoracic Surgeons National Cardiac Database from 2002 through 2005. We assessed the current status of IMA use, the association of hospital volume and IMA use, and disparities in IMA use by patient gender and race and by region of hospital location. Rates of using at least 1 IMA and bilateral IMA were 92.4% and 4.0%, with increasing trends over the years. Hospital volume was not significantly associated with IMA use. IMAs were used less frequently in women than men (for at least 1 IMA: odds ratio, 0.62; 95% confidence interval, 0.61 to 0.63; for bilateral IMA: odds ratio, 0.65; 95% confidence interval, 0.63 to 0.68) and less frequently in nonwhite patients than white patients (for at least 1 IMA: odds ratio, 0.84; 95% confidence interval, 0.81 to 0.87; for bilateral IMA: odds ratio, 0.79; 95% confidence interval, 0.75 to 0.83). There were significant differences in frequency of IMA use by hospital region. CONCLUSIONS: Frequency of IMA use in coronary artery bypass graft surgery is increasing; however, many patients still do not receive the benefits of IMA grafts, and some hospitals have a very low IMA use rate. Hospital volume is not associated with IMA use in coronary artery bypass graft surgery. Analysis of this critical performance measure reveals significant gender and race disparities.
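The odds ratios reported above are accompanied by 95% confidence intervals of the usual Wald (Woolf) type. As a minimal sketch of how an odds ratio and its interval are derived from a 2×2 table (hypothetical counts, not the STS data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:

                exposed  unexposed
    event          a         b
    no event       c         d
    """
    or_ = (a * d) / (b * c)
    # standard error of log(OR) via the Woolf formula
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts, for illustration only
print(odds_ratio_ci(90, 95, 10, 5))
```

The study's estimates come from multivariable regression rather than a raw 2×2 table, so the sketch shows only the unadjusted calculation.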


Subject(s)
Coronary Artery Disease/epidemiology , Coronary Artery Disease/surgery , Internal Mammary-Coronary Artery Anastomosis/statistics & numerical data , Internal Mammary-Coronary Artery Anastomosis/standards , Quality of Health Care , Age Distribution , Aged , Cross-Sectional Studies , Databases, Factual , Female , Hospital Bed Capacity/statistics & numerical data , Humans , Internal Mammary-Coronary Artery Anastomosis/methods , Male , Middle Aged , Multivariate Analysis , Prevalence , Racial Groups/statistics & numerical data , Regression Analysis , Sex Distribution
11.
Transplantation ; 104(2): 285-292, 2020 02.
Article in English | MEDLINE | ID: mdl-31107823

ABSTRACT

BACKGROUND: Alcoholic liver disease (ALD) accounts for 15%-30% of transplants performed in the United States and Europe; however, the data on living donor liver transplantation (LDLT) for ALD remain sparse. The purpose of this study was to examine the outcomes following LDLT for ALD using data from the adult-to-adult living donor liver transplantation (A2ALL) study, which represents the largest Western experience with adult-to-adult LDLT. METHODS: A retrospective review of A2ALL data collected between 1998 and 2014 was performed. Patients were excluded if they received a deceased donor liver transplant. Demographic data, postoperative outcomes and complications, graft and patient survival, and predictors of graft and patient survival were assessed. RESULTS: Of the 1065 patients who underwent LDLT during the study time period, 168 (15.8%) were transplanted for a diagnosis of ALD. Comparing patients who underwent transplant for ALD with those who were transplanted for other etiologies of liver disease, there was no significant difference in graft survival at 1 (88% versus 84%), 5 (76% versus 74%), or 10 years following transplant (55% versus 61%, P = 0.29). Similarly, there was no difference in patient survival at 1 (94% versus 91%), 5 (83% versus 79%), or 10 years following transplant (61% versus 66%, P = 0.32). CONCLUSIONS: LDLT for ALD results in excellent 1-, 5-, and 10-year graft and patient survival. Patients with ALD and impaired renal function have a higher risk of graft loss and death. These findings support the notion that early LDLT for patients with ALD may help optimize outcomes.


Subject(s)
Liver Diseases, Alcoholic/complications , Liver Failure/surgery , Liver Transplantation/methods , Living Donors/statistics & numerical data , Risk Assessment/methods , Adult , Follow-Up Studies , Graft Survival , Humans , Incidence , Liver Diseases, Alcoholic/surgery , Liver Failure/epidemiology , Liver Failure/etiology , Middle Aged , Retrospective Studies , Survival Rate/trends , Time Factors , United States/epidemiology
12.
Circulation ; 117(7): 876-85, 2008 Feb 19.
Article in English | MEDLINE | ID: mdl-18250266

ABSTRACT

BACKGROUND: Few studies characterize contemporary clinical features, outcomes, or risk factors for operative mortality in cardiogenic shock (CS) patients undergoing coronary artery bypass grafting (CABG). METHODS AND RESULTS: We evaluated data of 708,593 patients with and without CS undergoing CABG enrolled in the Society of Thoracic Surgeons National Cardiac Database (2002-2005). Clinical, angiographic, and operative features and in-hospital outcomes were evaluated in patients with and without CS. Logistic regression was used to identify predictors of operative mortality and to estimate weights for an additive risk score. Patients with preoperative CS constituted 14,956 (2.1%) of patients undergoing CABG yet accounted for 14% of all CABG deaths. Operative mortality in CS patients was high and surgery specific, rising from 20% for isolated CABG to 33% for CABG plus valve surgery and 58% for CABG plus ventricular septal repair. Although mortality for CABG surgery overall declined significantly over time (P for trend <0.0001), mortality for CS patients undergoing CABG did not change significantly during the 4-year study period (P=0.07). Factors associated with higher death risk for CS patients undergoing CABG were identified by multivariable analysis and summarized into a simple bedside risk score (c statistic=0.74) that accurately stratified those with low (<10%) to very high (>60%) mortality risk. CONCLUSIONS: Patients with CS represent a minority of those undergoing CABG yet have persistently high operative risks, accounting for 14% of deaths in CABG patients. Estimation of patient-specific risk of mortality is feasible with the simplified additive risk tool developed in our study with the use of routinely available preprocedural data.


Subject(s)
Coronary Artery Bypass/statistics & numerical data , Severity of Illness Index , Shock, Cardiogenic/surgery , Aged , Canada/epidemiology , Coronary Artery Bypass/mortality , Databases, Factual , Female , Heart Rupture, Post-Infarction/epidemiology , Heart Rupture, Post-Infarction/surgery , Heart Valve Prosthesis Implantation/mortality , Heart Valve Prosthesis Implantation/statistics & numerical data , Hospital Mortality , Humans , Male , Middle Aged , Point-of-Care Systems , Postoperative Complications/mortality , Retrospective Studies , Risk , Risk Factors , Shock, Cardiogenic/etiology , Shock, Cardiogenic/mortality , Treatment Outcome , United States/epidemiology
13.
Circulation ; 116(6): 606-12, 2007 Aug 07.
Article in English | MEDLINE | ID: mdl-17646586

ABSTRACT

BACKGROUND: Previous studies showed 75% mortality before hospital discharge in patients with a ventricular assist device (VAD) placed for post-cardiac surgery shock. We examined a large national clinical database to assess trends in the incidence of post-cardiac surgery shock requiring VAD implantation, survival rates, and risk factors for mortality. METHODS AND RESULTS: We identified patients undergoing a VAD procedure after cardiac surgery at US hospitals participating in the Society of Thoracic Surgeons' National Cardiac Database during the years 1995 to 2004. Baseline characteristics and operative outcomes were analyzed in 2.5-year increments. Logistic regression modeling was performed to provide risk-adjusted operative mortality and morbidity odds ratios. A total of 5735 patients had a VAD placed during the 10-year period (0.3% cardiac surgeries). Overall survival rate to discharge after VAD placement was 54.1%. With the earliest period (January 1995 through June 1997) used as reference, the mortality odds ratio declined to 0.72 (July 1997 through December 1999) and eventually to 0.41 (July 2002 through December 2004; P<0.0001). The combined mortality/morbidity odds ratio also declined, to 0.84 and 0.48 over identical periods (P<0.0001). Preoperative characteristics associated with increased mortality were urgency of procedure, reoperation, renal failure, myocardial infarction, aortic stenosis, female sex, race, peripheral vascular disease, New York Heart Association class IV, cardiogenic shock, left main coronary stenosis, and valve procedure (c index=0.755). CONCLUSIONS: After adjustment for clinical characteristics of patients requiring mechanical circulatory support, rates of survival to hospital discharge have improved dramatically. Insertion of a VAD for post-cardiac surgery shock is an important therapeutic intervention that can salvage most of these patients.


Subject(s)
Cardiac Surgical Procedures/trends , Databases, Factual/trends , Heart-Assist Devices/trends , Postoperative Care/trends , Societies, Medical/trends , Thoracic Surgery/trends , Aged , Cardiac Surgical Procedures/instrumentation , Female , Humans , Male , Middle Aged , Postoperative Care/instrumentation , Postoperative Complications/mortality , Postoperative Complications/prevention & control , Thoracic Surgery/instrumentation , Treatment Outcome , United States
14.
Circulation ; 114(21): 2208-16; quiz 2208, 2006 Nov 21.
Article in English | MEDLINE | ID: mdl-17088458

ABSTRACT

BACKGROUND: Estimation of an individual patient's risk for postoperative dialysis can support informed clinical decision making and patient counseling. METHODS AND RESULTS: To develop a simple bedside risk algorithm for estimating patients' probability for dialysis after cardiac surgery, we evaluated data of 449,524 patients undergoing coronary artery bypass grafting (CABG) and/or valve surgery and enrolled in >600 hospitals participating in the Society of Thoracic Surgeons National Database (2002-2004). Logistic regression was used to identify major predictors of postoperative dialysis. Model coefficients were then converted into an additive risk score and internally validated. The model also was validated in a second sample of 86,009 patients undergoing cardiac surgery from January to June 2005. Postoperative dialysis was needed in 6451 patients after cardiac surgery (1.4%), ranging from 1.1% for isolated CABG procedures to 5.1% for CABG plus mitral valve surgery. Multivariable analysis identified preoperative serum creatinine, age, race, type of surgery (CABG plus valve or valve only versus CABG only), diabetes, shock, New York Heart Association class, lung disease, recent myocardial infarction, and prior cardiovascular surgery to be associated with need for postoperative dialysis (c statistic=0.83). The risk score accurately differentiated patients' need for postoperative dialysis across a broad risk spectrum and performed well in patients undergoing isolated CABG, off-pump CABG, isolated aortic valve surgery, aortic valve surgery plus CABG, isolated mitral valve surgery, and mitral valve surgery plus CABG (c statistic=0.83, 0.85, 0.81, 0.75, 0.80, and 0.75, respectively). CONCLUSIONS: Our study identifies the major patient risk factors for postoperative dialysis after cardiac surgery. These risk factors have been converted into a simple, accurate bedside risk tool. 
This tool should facilitate improved clinician-patient discussions about risks of postoperative dialysis.
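Two of the abstracts above describe converting logistic-regression model coefficients into a simple additive bedside score. A minimal sketch of one common conversion (scale each beta by a base unit and round to integer points), using hypothetical coefficients rather than the published STS model:

```python
import math

def to_points(betas, base=None):
    """Convert logistic-regression coefficients to integer score
    points: divide each beta by a base unit (default: the smallest
    absolute beta) and round to the nearest integer."""
    if base is None:
        base = min(abs(b) for b in betas.values())
    return {name: round(b / base) for name, b in betas.items()}

def predicted_risk(intercept, betas, present):
    """Logistic-model probability for a patient whose risk factors
    are listed in `present`."""
    logit = intercept + sum(betas[name] for name in present)
    return 1 / (1 + math.exp(-logit))

# hypothetical coefficients, for illustration only
betas = {"creatinine>2": 1.6, "shock": 1.2, "prior_surgery": 0.8, "diabetes": 0.4}
points = to_points(betas)  # integer weights for a bedside score
risk = predicted_risk(-4.5, betas, ["shock", "diabetes"])
print(points, round(risk, 3))
```

The rounded points trade a little accuracy for bedside usability, which is why published tools report a c statistic for the simplified score alongside the full model.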


Subject(s)
Algorithms , Cardiac Surgical Procedures/adverse effects , Point-of-Care Systems , Renal Dialysis , Renal Insufficiency/etiology , Renal Insufficiency/therapy , Aged , Coronary Artery Bypass/adverse effects , Female , Heart Valve Diseases/surgery , Humans , Logistic Models , Male , Middle Aged , Multivariate Analysis , Probability , Risk Assessment/methods , Risk Factors , Treatment Outcome
15.
J Thorac Cardiovasc Surg ; 138(6): 1297-302, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19783010

ABSTRACT

OBJECTIVE: Diffusing capacity is not routinely used in assessing risk of lung resection, perhaps owing to uncertainty as to whether patients with normal spirometric results require additional evaluation. We determined whether diffusing capacity is predictive of pulmonary complications after lung resection in patients with normal spirometric results. METHODS: We reviewed outcomes of major lung resection in The Society of Thoracic Surgeons General Thoracic Surgery Database from 2002 to 2008 to determine the relationship of diffusing capacity (expressed as percent of predicted) to postoperative pulmonary complications stratified by chronic obstructive pulmonary disease status. RESULTS: Percent of predicted diffusing capacity was measured in 7891 (57%) patients. There were 3905 women and 3986 men with a mean age of 66.3 ± 10.6 years who underwent lobectomy (6904; 87.5%), bilobectomy (463; 5.9%), and pneumonectomy (524; 6.6%). Chronic obstructive pulmonary disease was identified in 2711 (34.4%) patients. Pulmonary complications occurred in 13%, and the operative mortality was 1.9%. Percent of predicted diffusing capacity was strongly associated with the development of pulmonary complications (odds ratio, 1.12 per 10-point decrease; P < .0001). Decreasing percent of predicted diffusing capacity was incrementally related to an increased incidence of pulmonary complications regardless of chronic obstructive pulmonary disease status. There was no apparent interaction between percent of predicted diffusing capacity and chronic obstructive pulmonary disease status in the predictive model. CONCLUSIONS: Percent of predicted diffusing capacity predicts pulmonary complications after lung resection in patients without chronic obstructive pulmonary disease. We recommend measurement of diffusing capacity in lung resection candidates, regardless of chronic obstructive pulmonary disease, as an important element in the accurate assessment of operative risk.


Subject(s)
Lung Diseases/etiology , Pneumonectomy , Pulmonary Diffusing Capacity , Aged , Female , Humans , Male , Postoperative Complications , Predictive Value of Tests , Pulmonary Disease, Chronic Obstructive/physiopathology , Regression Analysis , Spirometry
16.
J Thorac Cardiovasc Surg ; 137(3): 587-95; discussion 596, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19258071

ABSTRACT

OBJECTIVE: To create a model for perioperative risk of esophagectomy for cancer using the Society of Thoracic Surgeons General Thoracic Database. METHODS: The Society of Thoracic Surgeons General Thoracic Database was queried for all patients treated with esophagectomy for esophageal cancer between January 2002 and December 2007. A multivariable risk model for mortality and major morbidity was constructed. RESULTS: There were 2315 esophagectomies performed by 73 participating centers. Hospital mortality was 63/2315 (2.7%). Major morbidity (defined as reoperation for bleeding [n = 12], anastomotic leak [n = 261], pneumonia [n = 188], reintubation [n = 227], ventilation beyond 48 hours [n = 71], or death [n = 63]) occurred in 553 patients (24%). Preoperative spirometry was obtained in 923/2315 (40%) of patients. A forced expiratory volume in 1 second < 60% of predicted was associated with major morbidity (P = .0044). Important predictors of major morbidity are: age 75 versus 55 (P = .005), black race (P = .08), congestive heart failure (P = .015), coronary artery disease (P = .017), peripheral vascular disease (P = .009), hypertension (P = .029), insulin-dependent diabetes (P = .009), American Society of Anesthesiologists rating (P = .001), smoking status (P = .022), and steroid use (P = .026). A strong volume-performance relationship was not observed for the composite measure of morbidity and mortality in this patient cohort. CONCLUSIONS: Thoracic surgeons participating in the Society of Thoracic Surgeons General Thoracic Database perform esophagectomy with a low mortality. We identified important predictors of major morbidity and mortality after esophagectomy for esophageal cancer. Volume alone is an inadequate proxy for quality assessment after esophagectomy.


Subject(s)
Esophageal Neoplasms/surgery , Esophagectomy/adverse effects , Aged , Aged, 80 and over , Databases, Factual , Female , Humans , Male , Middle Aged , Models, Statistical , Postoperative Complications/epidemiology , Prognosis , Risk Factors
17.
Ann Thorac Surg ; 88(2): 362-70; discussion 370-1, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19632374

ABSTRACT

BACKGROUND: Smoking cessation is presumed to be beneficial before resection of lung cancer. The effect of smoking cessation on outcome was investigated. METHODS: From January 1999 to July 2007, in-hospital outcomes for 7990 primary resections for lung cancer in adults were reported to the Society of Thoracic Surgeons General Thoracic Surgery Database. Risk of hospital death and respiratory complications was assessed according to timing of smoking cessation, adjusted for clinical confounders. RESULTS: Hospital mortality was 1.4% (n = 109), but 1.5% in patients who had smoked (105 of 6965) vs 0.39% in those who had not (4 of 1025). Compared with the latter, risk-adjusted odds ratios were 3.5 (p = 0.03), 4.6 (p = 0.03), 2.6 (p = 0.7), and 2.5 (p = 0.11) for those whose timing of smoking cessation was categorized as current smoker, quit from 14 days to 1 month, 1 to 12 months, or more than 12 months preoperatively, respectively. Prevalence of major pulmonary complications was 5.7% (456 of 7965) overall, but 6.2% in patients who had smoked (429 of 6941) vs 2.5% in those who had not (27 of 1024). Compared with the latter, risk-adjusted odds ratios were 1.80 (p = 0.03), 1.62 (p = 0.14), 1.51 (p = 0.20), and 1.29 (p = 0.3) for those whose timing of smoking cessation was categorized as above. CONCLUSIONS: Risks of hospital death and pulmonary complications after lung cancer resection were increased by smoking and mitigated slowly by preoperative cessation. No optimal interval of smoking cessation was identifiable. Patients should be counseled to stop smoking irrespective of surgical timing.


Subject(s)
Lung Neoplasms/surgery , Smoking Cessation , Aged , Cause of Death , Comorbidity , Female , Hospital Mortality , Humans , Logistic Models , Lung Neoplasms/epidemiology , Lung Neoplasms/mortality , Male , Middle Aged , Pneumonectomy/mortality , Postoperative Complications/epidemiology , Preoperative Care , Registries , Risk Factors , Smoking/epidemiology , Time Factors , United States/epidemiology
18.
Ann Thorac Surg ; 85(6): 2040-4; discussion 2045, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18498816

ABSTRACT

BACKGROUND: Hormonal status is a potentially important cause for gender differences in outcomes after cardiovascular operations. Estrogen withdrawal states may potentiate ischemia-reperfusion injury by impairing endothelial cell function and increasing inflammatory cytokine levels. We hypothesized that gender influences mortality after mitral valve operations and that it varies with age, especially during periods of declining ovarian function. METHODS: We studied 24,977 patients (49% women) in The Society of Thoracic Surgeons National Database who underwent isolated mitral valve repair or replacement from 2002 to 2005. Age-related gender differences in mortality after mitral valve operation were compared by risk-adjusted analysis. RESULTS: Gender and age had a pronounced impact on hospital mortality. Women aged 40 to 49 and 50 to 59 had significantly greater hospital mortality than risk-matched men. The adjusted female/male odds ratio for hospital mortality in the group aged 40 to 49 was 2.56 (95% confidence interval, 1.31 to 5.01) but progressively decreased in the four subsequent age groups. This pattern was statistically significant (p = 0.028 and p = 0.018 for 40 to 49 vs 70 to 79 and 80 to 89, respectively) and represents a declining relative mortality risk for women of advanced age. CONCLUSIONS: In patients aged 40 to 59 years, the mortality of mitral valve operation is approximately 2.5 times higher in women compared with men with similar risk factors. This survival disadvantage diminishes with further aging. Changes in ovarian function may be an important cause for this gender-age interaction and are a potential target for novel hormone-based therapies.


Subject(s)
Heart Valve Diseases/surgery , Mitral Valve/surgery , Perimenopause , Postoperative Complications/mortality , Sex Ratio , Adult , Age Factors , Aged , Aged, 80 and over , Cause of Death , Comorbidity , Female , Follow-Up Studies , Heart Valve Diseases/mortality , Hospital Mortality , Humans , Male , Middle Aged , Risk Adjustment
19.
J Thorac Cardiovasc Surg ; 135(2): 247-54, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18242243

ABSTRACT

OBJECTIVE: Our objective was to investigate the surgical management of primary lung cancer by board-certified thoracic surgeons participating in the general thoracic surgery portion of The Society of Thoracic Surgeons database. METHODS: We identified all pulmonary resections recorded in the general thoracic surgery prospective database from 1999 to 2006. Among the 49,029 recorded operations, 9033 pulmonary resections for primary lung cancer were analyzed. RESULTS: There were 4539 men and 4494 women with a median age of 67 years (range 20-94 years). Comorbidity affected 79% of patients and included hypertension in 66%, coronary artery disease in 26%, body mass index of 30 kg/m2 or more in 25.7%, and diabetes mellitus in 13%. The type of resection was a wedge resection in 1649 (18.1%), segmentectomy in 394 (4.4%), lobectomy in 6042 (67%), bilobectomy in 357 (4.0%), and pneumonectomy in 591 (6.5%). Mediastinal lymph nodes were evaluated in 5879 (65%) patients: via mediastinoscopy in 1928 (21%), nodal dissection in 3722 (41%), nodal sampling in 1124 (12.4%), and nodal biopsy in 729 (8%). Median length of stay was 5 days (range 0-277 days). Operative mortality was 2.5% (179 patients). One or more postoperative events occurred in 2911 (32%) patients. CONCLUSION: The patients in the general thoracic surgery database are elderly, gender balanced, and afflicted by multiple comorbid conditions. Mediastinal lymph node evaluation is common and the pneumonectomy rate is low. The length of stay is short and operative mortality is low, despite frequent postoperative events.


Subject(s)
Databases, Factual , Hospital Mortality , Lung Neoplasms/mortality , Lung Neoplasms/surgery , Pneumonectomy/methods , Thoracic Surgery/statistics & numerical data , Adult , Age Factors , Aged , Aged, 80 and over , Female , Humans , Lung Neoplasms/pathology , Male , Middle Aged , Neoplasm Staging , Pneumonectomy/adverse effects , Postoperative Complications/mortality , Registries , Risk Assessment , Sex Factors , Societies, Medical/statistics & numerical data , Survival Analysis , Thoracic Surgery, Video-Assisted/adverse effects , Thoracic Surgery, Video-Assisted/methods , Thoracotomy/adverse effects , Thoracotomy/methods , Treatment Outcome , United States
20.
Ann Thorac Surg ; 85(6): 1857-65; discussion 1865, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18498784

ABSTRACT

BACKGROUND: Few reliable estimations of operative risk exist for lung cancer patients undergoing lobectomy. This study identified risk factors associated with prolonged length of hospital stay (PLOS) after lobectomy for lung cancer as a surrogate for perioperative morbid events. METHODS: The Society of Thoracic Surgeons (STS) General Thoracic Surgery Database was queried for patients with lobectomy for lung cancer. A model of preoperative risk factors was developed by multivariate stepwise logistic regression setting the threshold for PLOS at 14 days. Morbidity was measured as postoperative events as defined in the STS database. Risk-adjusted results were reported to participating sites. RESULTS: From January 2002 to June 2006, 4979 lobectomies were performed for lung cancer at 56 STS sites, and 351 (7%) had a PLOS. They had more postoperative events than patients without PLOS (3.4 vs 1.2; p < 0.0001). Patients with PLOS also had higher mortality than those with normal LOS, at 10.8% (38 of 351) vs 0.7% (33 of 4628; p < 0.0001). Significant predictors of PLOS included age per 10 years (odds ratio [OR], 1.30; p < 0.001), Zubrod score (OR, 1.51; p < 0.001), male sex (OR, 1.45; p = 0.002), American Society of Anesthesiology score (OR, 1.54; p < 0.001), insulin-dependent diabetes (OR, 1.71; p = 0.037), renal dysfunction (OR, 1.79; p = 0.004), induction therapy (OR, 1.65; p = 0.001), percentage predicted forced expiratory volume in 1 second in 10% increments (OR, 0.88; p < 0.001), and smoking (OR, 1.33; p = 0.095). After risk adjustment, twofold interhospital variability existed in PLOS among STS sites. CONCLUSIONS: We identified significant predictors of PLOS, a surrogate morbidity marker after lobectomy for lung cancer. This model may be used to provide meaningful risk-adjusted outcome comparisons to STS sites for quality improvement purposes.


Subject(s)
Databases, Factual/statistics & numerical data , Length of Stay/statistics & numerical data , Lung Neoplasms/surgery , Pneumonectomy/statistics & numerical data , Postoperative Complications/epidemiology , Risk Adjustment , Age Factors , Aged , Biomarkers , Cohort Studies , Comorbidity , Diabetes Mellitus, Type 1/epidemiology , Diabetes Mellitus, Type 1/mortality , Female , Forced Expiratory Volume/physiology , Humans , Lung Neoplasms/epidemiology , Lung Neoplasms/mortality , Male , Middle Aged , Models, Statistical , Odds Ratio , Postoperative Complications/mortality , Renal Insufficiency/epidemiology , Renal Insufficiency/mortality , Sex Factors , Smoking/adverse effects , Smoking/epidemiology , Survival Analysis