ABSTRACT
BACKGROUND: Cardiac allograft vasculopathy (CAV) is the primary cause of late mortality after heart transplantation. We aimed to provide a comprehensive review of contemporary revascularization strategies in CAV. METHODS: PubMed and Web of Science were systematically searched by 3 authors. 1,870 articles were initially screened and 24 were included in this review. RESULTS: Percutaneous coronary intervention (PCI) is the main revascularization technique utilized in CAV. The pooled estimates for restenosis significantly favored drug-eluting stents (DES) over bare-metal stents (BMS) (OR 4.26; 95% CI: 2.54-7.13; p < 0.00001; I2 = 4%). There were insufficient data to quantitatively compare mortality following DES versus BMS. There was no difference in short-term mortality between coronary artery bypass grafting (CABG) and PCI. In-hospital mortality was 0.0% for CABG and ranged from 0.0% to 8.34% for PCI. One-year mortality was 8.0% for CABG and ranged from 5.0% to 25.0% for PCI. CABG had a potential advantage at 5 years: five-year mortality was 17.0% for CABG and ranged from 14.0% to 40.4% following PCI. Select measures of postoperative morbidity trended toward superior outcomes for CABG. CONCLUSION: PCI is the primary revascularization strategy utilized in CAV, with DES exhibiting superiority to BMS regarding postoperative morbidity. Further investigation into outcomes following CABG in CAV is required to conclusively elucidate the superior management strategy.
Subject(s)
Coronary Artery Disease , Drug-Eluting Stents , Heart Diseases , Heart Transplantation , Percutaneous Coronary Intervention , Coronary Artery Disease/surgery , Coronary Vessels , Heart Transplantation/adverse effects , Humans , Percutaneous Coronary Intervention/methods , Treatment Outcome
ABSTRACT
OBJECTIVES: We aimed to evaluate the association between levosimendan treatment and acute kidney injury (AKI), and to assess the clinical sequelae of AKI in cardiac surgery patients with depressed left ventricular function (ejection fraction <35%). METHODS: Patients in the LEVO-CTS trial undergoing on-pump coronary artery bypass grafting (CABG), valve, or CABG/valve surgery were stratified by occurrence and severity of postoperative AKI using the AKIN classification. The association between levosimendan infusion and AKI was modeled using multivariable regression. RESULTS: Among 854 LEVO-CTS patients, 231 (27.0%) experienced postoperative AKI, including 182 (21.3%) with stage 1, 35 (4.1%) with stage 2, and 14 (1.6%) with stage 3 AKI. The rate of AKI was similar between patients receiving levosimendan and placebo. The odds of 30-day mortality increased with AKI stage compared with those without AKI (stage 1: adjusted odds ratio [aOR] 2.0, 95% confidence interval [CI] 0.8-4.9; stage 2: aOR 9.1, 95% CI 3.2-25.7; stage 3: aOR 12.4, 95% CI 3.0-50.4). Levosimendan did not modify the association between AKI stage and the odds of 30-day mortality (interaction P = .69). Factors independently associated with AKI included increasing age, body mass index, diabetes, and increasing baseline systolic blood pressure. Increasing baseline eGFR and aldosterone antagonist use were associated with a lower risk of AKI. CONCLUSIONS: Postoperative AKI is common among high-risk patients undergoing cardiac surgery and associated with a significantly increased risk of 30-day death or dialysis. Levosimendan was not associated with the risk of AKI.
Subject(s)
Acute Kidney Injury/etiology , Cardiac Surgical Procedures/adverse effects , Cardiotonic Agents/adverse effects , Postoperative Complications/etiology , Simendan/adverse effects , Acute Kidney Injury/mortality , Aged , Cardiotonic Agents/therapeutic use , Coronary Artery Bypass , Female , Humans , Male , Middle Aged , Odds Ratio , Placebos/therapeutic use , Postoperative Complications/mortality , Regression Analysis , Risk Factors , Simendan/therapeutic use , Stroke Volume , Ventricular Dysfunction, Left/physiopathology
ABSTRACT
Legionella community-acquired pneumonia necessitating veno-venous extracorporeal membrane oxygenation (VV-ECMO) for severe acute respiratory distress syndrome (ARDS) has been reported in adults. However, few options remain in cases of refractory hypoxemia on VV-ECMO. Herein, we describe adjunctive extended therapeutic hypothermia for refractory hypoxemia despite VV-ECMO in the successful management of severe ARDS secondary to Legionella.
Subject(s)
Extracorporeal Membrane Oxygenation/methods , Hypothermia, Induced/methods , Hypoxia/therapy , Respiratory Distress Syndrome/therapy , Humans , Male , Middle Aged , Treatment Outcome
ABSTRACT
BACKGROUND: Available cardiac surgery risk scores have not been validated in octogenarians. Our objective was to compare the predictive ability of the Society of Thoracic Surgeons (STS) score, EuroSCORE I, and EuroSCORE II in elderly patients undergoing isolated coronary artery bypass grafting surgery (CABG). METHODS: All patients who underwent isolated CABG (2002-2008) were identified from the Alberta Provincial Project for Outcomes Assessment in Coronary Heart Disease (APPROACH) registry. All patients aged 80 and older (n = 304) were then matched 1:2 with a randomly selected control group of patients under age 80 (n = 608 of 4732). Risk scores were calculated. Discriminatory accuracy of the risk models was assessed by computing the area under the receiver operating characteristic curve (AUC) and comparing observed to predicted operative mortality. RESULTS: Octogenarians had a significantly higher predicted mortality by STS score (3 ± 2% vs. 1 ± 1%; p < 0.001), additive EuroSCORE (8 ± 3% vs. 4 ± 3%; p < 0.001), logistic EuroSCORE (15 ± 14% vs. 5 ± 6%; p < 0.001), and EuroSCORE II (4 ± 3% vs. 2 ± 2%; p < 0.001) compared to patients under age 80 years. Observed mortality was 2% and 1% for patients age 80 and older and under age 80, respectively (p = 0.323). AUCs for the STS score, additive EuroSCORE, logistic EuroSCORE, and EuroSCORE II were, respectively, 0.671, 0.709, 0.694, and 0.794 for patients age 80 and older, and 0.829, 0.750, 0.785, and 0.845 for patients under age 80. CONCLUSION: All risk prediction models assessed overestimated surgical risk, particularly in octogenarians. EuroSCORE II demonstrated better discriminatory accuracy in this population. Inclusion of new variables into these risk models, such as frailty, may allow for more accurate prediction of true operative risk.
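The discriminatory accuracy above is summarized by the AUC. As a rough illustration of what that statistic measures, here is a minimal sketch using hypothetical risk scores and outcomes (not data from this study):

```python
# Illustrative calculation of the area under the ROC curve (AUC), the
# discrimination statistic reported for the risk scores above. The risk
# scores and outcomes below are hypothetical, not data from the study.
predicted_risk = [0.02, 0.15, 0.04, 0.30, 0.20, 0.01]  # model-predicted mortality risk
died = [0, 1, 0, 1, 0, 0]                              # observed outcome (1 = death)

# AUC equals the probability that a randomly chosen patient who died has a
# higher predicted risk than a randomly chosen survivor (Mann-Whitney U).
pairs = concordant = ties = 0
for risk_d, outcome_d in zip(predicted_risk, died):
    if not outcome_d:
        continue  # outer loop iterates over deaths only
    for risk_s, outcome_s in zip(predicted_risk, died):
        if outcome_s:
            continue  # inner loop iterates over survivors only
        pairs += 1
        if risk_d > risk_s:
            concordant += 1
        elif risk_d == risk_s:
            ties += 1

auc = (concordant + 0.5 * ties) / pairs
print(auc)  # 0.875: 7 of 8 death-survivor pairs are correctly ranked
```

An AUC of 0.5 is chance-level ranking and 1.0 is perfect discrimination, which is why the EuroSCORE II value of 0.794 in octogenarians reads as "better" than the 0.671 of the STS score.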
Subject(s)
Coronary Artery Bypass/mortality , Coronary Disease/surgery , Risk Assessment/methods , Aged , Aged, 80 and over , Comorbidity , Coronary Artery Bypass/adverse effects , Coronary Disease/complications , Female , Humans , Male , ROC Curve , Retrospective Studies
ABSTRACT
BACKGROUND: Acute kidney injury (AKI) is a serious complication following lung transplantation (LTx). We aimed to describe the incidence and outcomes associated with AKI following LTx. METHODS: A retrospective population-based cohort study of all adult recipients of LTx at the University of Alberta between 1990 and 2011. The primary outcome was AKI, defined and classified according to the Kidney Disease: Improving Global Outcomes (KDIGO) criteria, in the first 7 post-operative days. Secondary outcomes included risk factors, utilization of renal replacement therapy (RRT), occurrence of post-operative complications, mortality and kidney recovery. RESULTS: Of 445 LTx recipients included, AKI occurred in 306 (68.8%), with severity classified as Stage I in 38.9% (n = 173), Stage II in 17.5% (n = 78) and Stage III in 12.4% (n = 55). RRT was received by 36 (8.1%). Factors associated with AKI included longer duration of cardiopulmonary bypass [per minute, odds ratio (OR) 1.003; 95% confidence interval (CI), 1.001-1.006; P = 0.02], longer duration of mechanical ventilation [per hour (log-transformed), OR 5.30; 95% CI, 3.04-9.24; P < 0.001], and use of cyclosporine (OR 2.03; 95% CI, 1.13-3.64; P = 0.02). In-hospital and 1-year mortality were significantly higher in those with AKI compared with no AKI (7.2% versus 0%; adjusted P = 0.001; 14.4% versus 5.0%; adjusted P = 0.02, respectively). At 3 months, those with AKI had greater sustained loss of kidney function compared with no AKI [estimated glomerular filtration rate, mean (SD): 68.9 (25.7) versus 75.3 (22.1) mL/min/1.73 m2, P = 0.01]. CONCLUSIONS: By the KDIGO definition, AKI occurred in two-thirds of patients following LTx. AKI portended greater risk of death and loss of kidney function.
Subject(s)
Acute Kidney Injury/epidemiology , Lung Transplantation/adverse effects , Acute Kidney Injury/mortality , Adult , Cardiopulmonary Bypass , Female , Humans , Incidence , Kaplan-Meier Estimate , Male , Middle Aged , Odds Ratio , Postoperative Complications/epidemiology , Renal Replacement Therapy , Retrospective Studies , Risk Factors , Time Factors
ABSTRACT
INTRODUCTION: There has been concern regarding the safety of cardiac surgical intervention during off-hours. Sleep deprivation, resource limitations, and increased case urgency have been postulated to increase off-hours surgical risk, although outcomes are inconsistent in the existing literature. In this systematic review and meta-analysis, we review the literature comparing patients undergoing cardiac surgery during on-hours and off-hours. EVIDENCE ACQUISITION: PubMed and Embase were systematically searched for literature published from January 2000-September 2023 comparing outcomes of patients undergoing cardiac surgery during on-hours and off-hours. Overall, 3540 manuscript titles and abstracts were screened and 11 articles were included. EVIDENCE SYNTHESIS: Overall aggregate analysis indicated no significant differences in rates of in-hospital mortality (OR 1.04; 95% CI, 0.41-2.63; P=0.93) and perioperative morbidity, including stroke (P=0.52), reoperation (P=0.92), major bleeding (P=0.10), and renal complications (P=0.55). Composite rates of sternal wound infection favored on-hours surgery (P=0.01). CONCLUSIONS: Although inferior outcomes in patients undergoing cardiac surgery during off-hours have been noted, aggregate analysis largely revealed equivalent perioperative morbidity and mortality between on-hours and off-hours surgery, although, with the exclusion of one outlier study, in-hospital mortality and reoperation favored on-hours surgery. Heterogeneity in outcomes is likely multifactorial, with surgical staff fatigue, patient preoperative risk, clinical setting, and resource limitations all contributing. Further investigation directly comparing emergent cardiac surgical intervention during on-hours and off-hours, controlling for baseline surgical risk, is required to elucidate the true impact of timing of surgery on postoperative outcomes.
ABSTRACT
Aim: Structural valvular deterioration of xenogenic heart valve replacements is thought to be due to a chronic immune response. We sought to engineer a porcine extracellular matrix that elicits a minimal inflammatory immune response. Materials & methods: Whole blood, bone marrow and pericardium were collected from patients undergoing elective cardiac surgery. Porcine extracellular matrix was decellularized, reseeded with homologous mesenchymal stem cells and exposed to whole blood. Results: DAPI stain confirmed the absence of cells after decellularization, and the presence of mesenchymal stem cells after recellularization. There was a significant reduction in IL-1β and TNF-α production in the recellularized matrix. Conclusion: Recellularization of porcine matrix is successful at attenuating the xenogenic immune response and may provide a suitable scaffold to address the current limitations of prosthetic heart valve replacements.
Deterioration of tissue heart valve replacements is thought to be due to a chronic immune response. We sought to remove cells from a pig-derived tissue and replace those cells with human stem cells to create a scaffold that elicits a reduced immune response. Whole blood, bone marrow and pericardium were collected from patients undergoing elective cardiac surgery. The pig-derived tissue had its cells removed and replaced with human stem cells, and was then exposed to whole blood. Tissue staining confirmed the absence of cells after removal, and the presence of stem cells after replacement. There was a significant reduction in markers of immune response in the recellularized tissue. Removal of cells from pig-derived tissue and replacement with human stem cells successfully reduces the immune response to animal tissue and may provide a suitable scaffold to address the current limitations of heart valve replacement options.
Subject(s)
Heart Valve Prosthesis , Tissue Engineering , Animals , Swine , Extracellular Matrix , Cells, Cultured
ABSTRACT
BACKGROUND: Evidence suggests that metabolic syndrome (MbS) is associated with early senescence of bioprosthetic aortic valve prostheses. The purpose of this study was to determine whether MbS is also associated with accelerated failure of bioprosthetic valve prostheses in the mitral position. METHODS: Records of all patients undergoing bioprosthetic mitral valve replacement (MVR) from 1993 to 2000 were reviewed. RESULTS: Of 114 patients undergoing bioprosthetic MVR, 48 (42%) had MbS. Mean age was 73 years (vs. 74 years for no MbS). Patients underwent MVR for regurgitation (n = 97; 85%), stenosis (n = 12; 11%), or mixed lesions (n = 4; 4%). Etiology was degenerative (n = 35; 32%), rheumatic (n = 26; 24%), ischemic (n = 30; 28%), calcific (n = 9; 8%), and endocarditis (n = 8; 8%). Mean follow-up was 4.5 years. Overall survival at 5 and 10 years was 56% and 26%, respectively. Survival was similar between groups (p = 0.15). Five patients (2 MbS; 4% vs. 3 no MbS; 5%) required mitral reoperation at a mean of 3.8 years after initial MVR. The risk of prosthetic valve failure was not different between groups (p = 0.66). Despite no initial difference in transmitral gradients, gradients beyond five-year follow-up were greater for those with MbS (6.8 mmHg MbS vs. 4.7 mmHg no MbS, p = 0.007). Independent predictors of gradient progression beyond two years were MbS (p = 0.027) and female gender (p = 0.012). There were no significant differences in valve area, regurgitation, or ejection fraction. CONCLUSIONS: Although overall survival following bioprosthetic MVR is limited, MbS did not predict diminished survival or excess reoperative risk compared to non-MbS patients. The trend toward more rapid progression of transprosthetic gradients in MbS patients warrants further investigation.
Subject(s)
Bioprosthesis , Heart Valve Diseases/surgery , Heart Valve Prosthesis Implantation/instrumentation , Heart Valve Prosthesis , Metabolic Syndrome/complications , Mitral Valve , Prosthesis Failure/etiology , Aged , Aged, 80 and over , Female , Follow-Up Studies , Heart Valve Diseases/complications , Humans , Male , Middle Aged , Reoperation/statistics & numerical data , Retrospective Studies , Treatment Outcome
ABSTRACT
Coronary artery disease (CAD) is common in candidates for lung transplantation (LTx) and has historically been considered a relative contraindication to transplantation. We aimed to review the outcomes of LTx in patients with CAD and determine the optimum revascularization strategy in LTx candidates. PubMed, Medline and Web of Science were systematically searched by three authors for articles comparing the outcomes of LTx in patients with CAD and receiving coronary revascularization. In total, 1668 articles were screened and 12 were included in this review. Preexisting CAD in LTx recipients was not associated with significantly increased postoperative morbidity or mortality. The pooled estimates of mortality rate at 1, 3 and 5 years indicated significantly inferior survival in LTx recipients with a prior history of coronary artery bypass grafting (CABG) [odds ratio (OR), 1.84; 95% confidence interval (CI), 1.53-2.22; P < 0.00001; I2 = 0%; OR, 1.52; 95% CI, 1.21-1.91; P = 0.0003; I2 = 0%; OR, 1.62; 95% CI, 1.13-2.33; P = 0.008; I2 = 71%, respectively]. However, contemporary literature suggests that survival rates in LTx recipients with CAD who received revascularization, either by percutaneous coronary intervention (PCI) or by previous or concomitant CABG, are similar to those of patients who did not receive revascularization. Trends in postoperative morbidity favored CABG in the rates of myocardial infarction and repeat revascularization, whereas rates of stroke favored PCI. The composite results of this study support the consideration of patients with CAD or previous coronary revascularization for LTx. Prospective, randomized controlled trials with consistent patient populations and outcomes reporting are required to fully elucidate the optimum revascularization strategy in LTx candidates.
Subject(s)
Coronary Artery Disease , Lung Transplantation , Percutaneous Coronary Intervention , Humans , Coronary Artery Disease/diagnostic imaging , Coronary Artery Disease/surgery , Prospective Studies , Coronary Artery Bypass , Lung Transplantation/adverse effects , Treatment Outcome
ABSTRACT
OBJECTIVES: Our aim was to address the role of autologous mesenchymal stem cell recellularization of xenogenic valves on the activation of the xenoreactive immune response in an in vivo rat model. METHODS: Explanted aortic valve constructs from female Hartley guinea pigs were procured and decellularized, followed by recellularization with autologous Sprague-Dawley rat mesenchymal stem cells. Aortic valve xenografts were then implanted into the infrarenal aorta of recipient rats. Grafts were implanted as autologous grafts, non-decellularized xenografts (NGP), or decellularized and recellularized xenografts (RGP). Rats were euthanized and exsanguinated after 7 or 21 days, and the grafts were explanted. RESULTS: The NGP grafts demonstrated a significant burden of granulocytes (14.3 cells/HPF) and CD3+ T cells (3.9 cells/HPF) compared to the autologous grafts (2.1 granulocytes/HPF and 0.72 CD3+ T cells/HPF) after 7 days. A lower absolute number of infiltrating granulocytes (NGP vs autologous, 6.4 vs 2.4 cells/HPF) and CD3+ T cells (NGP vs autologous, 2.8 vs 0.8 cells/HPF) was seen after 21 days. Granulocyte infiltration in the RGP grafts was equivalent to that in the autologous grafts after 7 days (2.4 vs 2.1 cells/HPF) and 21 days (2.8 vs 2.4 cells/HPF). CD3+ T-cell infiltration in the RGP grafts was likewise equivalent to that in the autologous grafts after 7 days (0.63 vs 0.72 cells/HPF) and 21 days (0.7 vs 0.8 cells/HPF). Immunoglobulin production was significantly greater in the NGP grafts compared to the autologous grafts at 7 (123.3 vs 52.7 mg/mL) and 21 days (93.3 vs 71.6 mg/mL), with a similar decreasing trend in absolute production. Immunoglobulin production in the RGP grafts was equivalent to that in the autologous grafts at 7 (40.8 vs 52.7 mg/mL) and 21 days (29.5 vs 71.6 mg/mL).
CONCLUSIONS: Autologous mesenchymal stem cell recellularization of xenogenic valves reduces the xenoreactive immune response in an in vivo rat model and may be an effective approach to decrease the progression of xenograft valve dysfunction.
Subject(s)
Bioprosthesis , Animals , Aortic Valve , Female , Heterografts , Humans , Immunity , Rats , Rats, Sprague-Dawley , Tissue Engineering
ABSTRACT
BACKGROUND: Preoperative anemia is a common comorbidity that often necessitates allogeneic blood transfusion (ABT). As there is risk associated with blood transfusions, preoperative intravenous (IV) iron has been proposed to increase hemoglobin and reduce perioperative transfusion; however, randomized controlled trials (RCTs) investigating the efficacy of IV iron are small, limited, and inconclusive. Consequently, a meta-analysis that pools these studies may provide new and clinically useful information. METHODS/DESIGN: The databases MEDLINE, EMBASE, EBM Reviews, the Cochrane controlled trial registry, Scopus, registries of health technology assessment and clinical trials, Web of Science, ProQuest Dissertations and Theses, ClinicalTrials.gov, and the Conference Proceedings Citation Index-Science (CPCI-S) were searched. We also screened all retrieved reference lists. SELECTION CRITERIA: Titles and abstracts were screened for relevance (i.e., relevant, irrelevant, or potentially relevant). We then screened the full texts of those citations identified as potentially relevant. RESULTS: Our search found 3195 citations, and ten RCTs (1039 participants) met our inclusion criteria. Preoperative IV iron supplementation significantly decreased ABT by 16% (risk ratio [RR]: 0.84, 95% confidence interval [CI]: 0.71, 0.99, p = 0.04). In addition, hemoglobin levels increased after receiving IV iron preoperatively (mean difference [MD] between the study groups: 7.15 g/L, 95% CI: 2.26, 12.04 g/L, p = 0.004) and at follow-up > 4 weeks postoperatively (MD: 6.46 g/L, 95% CI: 3.10, 9.81, p = 0.0002). IV iron was not associated with an increased incidence of non-serious or serious adverse effects across groups (RR: 1.13, 95% CI: 0.78, 1.65, p = 0.52 and RR: 0.96, 95% CI: 0.44, 2.10, p = 0.92, respectively).
CONCLUSIONS: With moderate certainty, owing to a high risk of bias in one or two domains in some studies, we found that preoperative IV iron supplementation is associated with a significant decrease in the transfusion rate and a modest rise in hemoglobin concentration compared with placebo or oral iron supplementation. However, further full-scale randomized controlled trials with robust methodology are required. In particular, the safety, quality of life, and cost-effectiveness of different intravenous iron preparations require further evaluation.
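The pooled risk ratio above comes from combining RCTs in a meta-analysis. As a rough illustration of how such pooling works, here is a minimal fixed-effect, inverse-variance sketch using three entirely hypothetical studies (not trials from this review):

```python
import math

# Illustrative fixed-effect, inverse-variance pooling of risk ratios across
# RCTs, the kind of pooling used to produce a combined RR and 95% CI. The
# three studies below are hypothetical, not trials from this review.
studies = [
    # (events_treated, n_treated, events_control, n_control)
    (20, 100, 28, 100),
    (15, 80, 18, 80),
    (30, 150, 35, 150),
]

log_rrs, weights = [], []
for a, n1, c, n2 in studies:
    rr = (a / n1) / (c / n2)                 # per-study risk ratio
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR), delta method
    log_rrs.append(math.log(rr))
    weights.append(1 / se**2)                # inverse-variance weight

pooled_log = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(f"pooled RR = {pooled_rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
# → pooled RR = 0.80, 95% CI 0.60-1.07
```

Pooling on the log scale with inverse-variance weights gives larger, more precise studies more influence; a published meta-analysis would typically also report heterogeneity (I2) and consider a random-effects model.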
Subject(s)
Anemia , Administration, Intravenous , Anemia/drug therapy , Blood Transfusion , Hemoglobins , Humans , Iron/therapeutic use
ABSTRACT
CASE PRESENTATION: A 34-year-old previously healthy man of Korean descent (height, 174 cm; weight, 47.4 kg) presented with dyspnea, cough, and chest tightness. The patient had no relevant occupational exposures and no history of illicit drug or tobacco use. His medical history was notable for chronic sinus tachycardia of undetermined cause, hypertension, gout, glaucoma of the right eye, and a remote history of an intracranial malignancy 24 years prior, treated with unspecified chemotherapy, craniotomy, and ventriculoperitoneal shunt placement. His active medications included diltiazem, candesartan, and colchicine as needed.
Subject(s)
Idiopathic Pulmonary Fibrosis/diagnosis , Pleural Diseases/diagnosis , Adult , Diagnosis, Differential , Diagnostic Imaging , Dyspnea , Humans , Male , Pneumothorax
ABSTRACT
OBJECTIVES: Acute kidney injury (AKI) is common after cardiac surgery. We quantified the mortality and costs of varying degrees of AKI using a population-based cohort in Alberta, Canada. METHODS: A cohort of patients undergoing cardiac surgery from 2004 to 2009 was assembled from linked Alberta administrative databases. AKI was classified by Kidney Disease Improving Global Outcomes stages of severity. Our outcomes were in-hospital mortality, length of stay, and costs; among survivors, we also examined mortality and costs at 365 days. Estimates were adjusted for demographic characteristics, comorbidities, and other covariates. RESULTS: Ten thousand one hundred seventy participants were included, of whom 9771 were discharged to the community. Overall in-hospital mortality, length of stay, and costs were 4%, 7 days, and Can $34,000, respectively. AKI occurred in 25% of patients after cardiac surgery. Compared with those without AKI, AKI was independently associated with increased in-hospital mortality across severity categories, with the highest risk (adjusted odds ratio, 37.1; 95% confidence interval, 26.3-52.1; P < .001) in patients who required acute dialysis. AKI severity was associated with increased hospital days and costs; after covariate adjustment, costs ranged from 1.21 times higher for stage 1 AKI (95% confidence interval, 1.17-1.23) to 2.74 times higher for acute dialysis (95% confidence interval, 2.49-3.00) (P < .001) than in patients without AKI. From discharge to 365 days, patients with AKI continued to experience costs up to 1.35-fold higher, and patients who required acute dialysis continued to experience a 2.86-fold increased mortality. CONCLUSIONS: AKI remains an important indicator of mortality and health care costs after cardiac surgery.
Subject(s)
Acute Kidney Injury/economics , Acute Kidney Injury/etiology , Cardiac Surgical Procedures/adverse effects , Cardiac Surgical Procedures/economics , Hospital Costs , Acute Kidney Injury/mortality , Acute Kidney Injury/therapy , Aged , Aged, 80 and over , Alberta , Cardiac Surgical Procedures/mortality , Databases, Factual , Female , Hospital Mortality , Humans , Length of Stay/economics , Male , Middle Aged , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: There is limited information about the impact of frailty on public payer costs in cardiac surgery. This study aimed to determine quality-adjusted life-years (QALYs) and costs associated with preoperative frailty in patients referred for cardiac surgery. METHODS: We retrospectively compared costs of frailty in a cohort of 529 patients aged ≥ 50 years who were referred for nonemergent cardiac surgery in Alberta. Patients were screened preoperatively for frailty, defined as a score of 5 or greater on the Clinical Frailty Scale. The primary outcome measure was public payer costs attributable to frailty, calculated in a difference-in-difference (DID) model. RESULTS: The prevalence of frailty was 10% (n = 51; 95% confidence interval [CI], 7%-12%). Median (interquartile range) costs for frail patients were higher in the first year postsurgery ($200,709 [$146,177-$486,852] vs $147,730 [$100,674-$177,025]; P < 0.001) compared with nonfrail patients; the DID attributable cost of frailty was $57,836 (95% CI, -$28,608 to $144,280). At 1 year, frail patients had fewer QALYs realized compared with nonfrail patients (0.71 [0.57-0.77] vs 0.82 [0.75-0.86], P < 0.001), whereas QALYs gained were similar (0.02 [-0.02 to 0.05] vs 0.02 [0.00-0.04], P = 0.58; median difference 0.003 [95% CI, -0.01 to 0.02]) in frail and nonfrail patients. CONCLUSIONS: Frailty screening identified a population with greater impairment in quality of life and greater healthcare costs. Costs attributable to frailty represent opportunity costs that should be considered in future cardiac surgical services planning in the context of our aging population and the growing prevalence of frailty.
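The attributable cost above comes from a difference-in-difference model. As a rough sketch of the idea, with entirely hypothetical dollar figures (not the study's data):

```python
# Illustrative difference-in-difference (DID) calculation of costs
# attributable to frailty. All dollar figures here are hypothetical,
# not the study's data: the DID estimate contrasts the pre-to-post
# cost change in frail patients with the change in nonfrail patients,
# so that background cost trends common to both groups cancel out.
frail_pre, frail_post = 40_000, 200_000         # costs year before / year after surgery
nonfrail_pre, nonfrail_post = 45_000, 147_000

frail_change = frail_post - frail_pre           # within-group change, frail
nonfrail_change = nonfrail_post - nonfrail_pre  # within-group change, nonfrail
attributable_cost = frail_change - nonfrail_change
print(attributable_cost)  # 58000
```

Because both groups' baseline costs are differenced away, the estimate isolates the extra post-surgery cost change associated with frailty rather than the raw post-surgery cost gap.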
ABSTRACT
Despite many studies documenting the prevalence of various co-occurring psychiatric symptoms in children and adults with autism spectrum disorder (ASD), less is known about how these symptoms relate to subtypes defined by particular phenotypic features within the ASD population. We examined the severity and prevalence of comorbid symptoms of psychopathology, emotion dysregulation, and maladaptive behaviors, as well as adaptive functioning, in a group of 65 minimally verbal children (n = 33) and adolescents (n = 32) with ASD. On the Child and Adolescent Symptom Inventory (CASI-5), for all the symptom classifications except oppositional defiant disorder and conduct disorder, more participants in our sample showed elevated or clinically concerning severity scores relative to the general population. On the Emotion Dysregulation Inventory (EDI), the mean scores for the Reactivity and Dysphoria factors in our sample were lower than in the autism calibration sample, which included a large number of inpatient youth with ASD. Overall, few differences were found between the children and adolescents within this severely impaired group of individuals with ASD based on clinical cutoff scores on the CASI-5 and EDI factor scores. Psychiatric comorbidities and emotion dysregulation measures were not correlated with autism symptom severity or with measures of adaptive functioning, and were largely unrelated to IQ in our sample. The number of clinically significant psychiatric symptoms on the CASI-5 emerged as the main predictor of maladaptive behaviors. Findings suggest a wide range of co-occurring psychopathology and a high degree of maladaptive behavior among minimally verbal children and adolescents with ASD, which are not directly attributable to autism symptom severity, intellectual disability, or limitations in adaptive functioning.
ABSTRACT
BACKGROUND AND AIMS: When children hear a novel word, they tend to associate it with a novel rather than a familiar object. The ability to map a novel word to its corresponding referent is thought to depend, at least in part, on language-learning strategies, such as mutual exclusivity and lexical contrast. Although the importance of word learning strategies has been broadly investigated in typically developing children as well as younger children with autism spectrum disorder, who are usually language delayed, there is a paucity of research on such strategies and their role in language learning in school-age children and adolescents with autism spectrum disorder who have failed to develop fluent speech. In this study, we examined the ability of minimally verbal children and adolescents with autism spectrum disorder to learn and retain novel words in an experimental task, as well as the cognitive, language, and social correlates of these abilities. We were primarily interested in the characteristics that differentiated between three subgroups of participants: those unable to use word learning strategies, particularly mutual exclusivity, to learn novel words; those able to learn novel words over several exposure trials but not able to retain them; and those able to retain the words they learned. METHODS: Participants were 29 minimally verbal individuals with autism spectrum disorder from 5 to 17 years of age. Participants completed a computerized touchscreen novel-word-learning procedure followed by assessments of immediate retention and of delayed retention, two hours later. Participants were grouped according to whether they passed or failed at least 7 of 8 (binomial p < .035) novel word learning trials and 7 of 8 immediate or delayed retention trials, and were compared on measures of nonverbal IQ, receptive and expressive vocabulary, phonological processing, joint attention, and symptom severity.
RESULTS: Of 29 participants, 14 failed both learning and immediate retention, 8 passed learning but failed immediate retention, and 7 passed both learning and immediate retention. Group performance was highly similar for delayed retention. Language level, particularly expressive vocabulary, differentiated between participants who did and did not succeed in retention, even while controlling for differences in nonverbal IQ. CONCLUSIONS: The ability of minimally verbal school-age children and adolescents with autism spectrum disorder to identify the referents of novel words was associated with nonverbal cognitive abilities. Retention of words was associated with concurrent expressive language abilities. IMPLICATIONS: Our findings of associations between the retention of novel words acquired in a lab-based experimental task and concurrent language ability warrant further investigation with larger samples and longitudinal research designs, which may support the incorporation of contrastive word learning strategies into language learning interventions for severely language-impaired individuals with autism spectrum disorder.
ABSTRACT
BACKGROUND: Cardiac surgery waitlist recommendations, which were developed based on expert opinion, poorly predict preoperative mortality. Studies reporting risk factors for waitlist mortality have not evaluated nonadherence to waitlist benchmarks as a risk factor. METHODS: In patients who underwent cardiac surgery or died on the waitlist between 2005 and 2015, we used a Fine and Gray competing risk model to identify independent predictors of waitlist mortality in 12,106 patients scheduled for urgent, semiurgent, or nonurgent surgery. The predictive variables were compared with Canadian Cardiovascular Society (CCS) waitlist recommendations using the Akaike information criterion. RESULTS: A total of 101 (0.8%) patients died awaiting surgery. The median wait times and frequencies of waitlist death among emergent, urgent, semiurgent, and nonurgent surgery were 0.6, 7.4, 69.0, and 55.5 days (P < 0.001) and 6.3%, 0.8%, 0.3%, and 0.6% (P < 0.001), respectively. Adherence to CCS waitlist recommendations was lower in patients who died on the waitlist (51.6% vs 70.8%, P = 0.001) and was not predictive of waitlist mortality (hazard ratio 1.48, 95% confidence interval 0.62-0.56). Independent predictors of waitlist mortality were age, aortic surgery, ejection fraction < 35%, urgent surgery, prior myocardial infarction, haemodynamic instability during cardiac catheterization, hypertension, and dyslipidemia. These variables were superior to current CCS guidelines (Akaike information criterion 1251 vs 1317, likelihood ratio test P < 0.001). CONCLUSIONS: CCS waitlist recommendations were poorly predictive of waitlist mortality, and the majority of waitlist deaths occurred within recommended benchmarks. We identified variables associated with waitlist mortality that offered improved predictive performance. Our findings suggest a need to re-evaluate cardiac surgical triage criteria using evidence-based data.
Subject(s)
Cardiac Surgical Procedures , Coronary Disease/surgery , Guideline Adherence , Population Surveillance , Risk Assessment/methods , Triage/methods , Waiting Lists/mortality , Aged , Alberta/epidemiology , Coronary Disease/mortality , Databases, Factual , Female , Follow-Up Studies , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate/trends , Time Factors
ABSTRACT
BACKGROUND: Single-centre studies support the No Touch (NT) saphenous vein graft (SVG) harvesting technique. The primary objective of the SUPERIOR SVG study was to determine whether NT versus conventional (CON) SVG harvesting was associated with improved SVG patency 1 year after coronary artery bypass grafting surgery (CABG). METHODS: Adults undergoing isolated CABG with at least 1 SVG were eligible. CT angiography was performed 1 year post-CABG. Leg adverse events were assessed with a questionnaire. A systematic review was performed for published NT graft patency studies, and results were aggregated including the SUPERIOR study results. RESULTS: Two hundred fifty patients were randomized across 12 centres (NT 127 versus CON 123 patients). The primary outcome (study SVG occlusion or cardiovascular (CV) death) was not significantly different between NT and CON (NT: 7/127 (5.5%), CON: 13/123 (10.6%), p = 0.15). Similarly, the proportion of study SVGs with significant stenosis or total occlusion was not significantly different between groups (NT: 8/102 (7.8%), CON: 16/107 (15.0%), p = 0.11). Vein harvest site infection was more common in the NT patients 1 month postoperatively (23.3% vs 9.5%, p < 0.01). In a meta-analysis including this study's results (3 randomized and 1 observational study), NT was associated with a significant reduction in SVG occlusion at 1 year postoperatively (odds ratio 0.49, 95% confidence interval 0.29-0.82, p = 0.007). CONCLUSIONS: The NT technique was not associated with improved patency of SVGs at 1 year following CABG, while early vein harvest infection was increased. The aggregated data are supportive of an important reduction of SVG occlusion at 1 year with NT harvesting. TRIAL REGISTRATION: NCT01047449 .
Subject(s)
Coronary Artery Bypass/methods , Saphenous Vein/transplantation , Tissue and Organ Harvesting/methods , Adult , Female , Humans , Male , Vascular Patency
ABSTRACT
Chronic mitral regurgitation (MR) remains a common cardiovascular condition resulting in significant morbidity and mortality. With an aging population, increasing trends for both primary (degenerative) and secondary (functional) MR have become apparent. Although the gold standard remains surgical intervention with mitral valve repair/replacement, comorbid conditions have steered the development of less invasive technologies to mitigate perioperative surgical risk. Transcatheter mitral valve repair using a percutaneous edge-to-edge technique is the most widely available choice at present. However, other transcatheter mitral valve repair techniques such as annuloplasty and chordal implantation are notable alternatives. Moreover, emerging technologies in transcatheter mitral valve replacement are rapidly establishing their roles in the field of chronic severe MR therapy. Hence, it is imperative to understand the indications and limitations of these various transcatheter mitral valve interventions to provide the best and most up-to-date clinical care for patients. This review will outline current evidence and patient selection criteria for such device-based therapies.
Subject(s)
Biomedical Technology/methods , Mitral Valve Annuloplasty , Mitral Valve Insufficiency/surgery , Aged , Humans , Mitral Valve Annuloplasty/methods , Mitral Valve Annuloplasty/trends , Patient Selection , Risk Adjustment
ABSTRACT
Rodent models have been essential to understanding the immune-mediated failure of aortic valve allografts (AVAs). Decellularization has been proposed to reduce the immunogenicity of AVAs. The objective of this study was to determine the most effective method to decellularize AVAs for use in a rat model. Three different decellularization techniques were compared in Lewis aortic valves. Detergent decellularization involved a series of hypotonic and hypertonic Tris buffers containing 0.5% Triton X-100 at 4 °C for 48 h per buffer, followed by a 72 h washout in phosphate-buffered saline. Osmotic decellularization was performed in a similar manner to the detergent-based technique but without the addition of Triton X-100. Enzymatic decellularization consisted of trypsin/EDTA at 37 °C for 48 h. Assessment was performed with light microscopy (H&E, Movat's pentachrome), immunohistochemistry for residual cellular elements, and hydroxyproline assays. Detergent-based methodology effected near-complete decellularization of both the leaflets and aortic wall in addition to preservation of the extracellular matrix (ECM). Osmotic lysis was associated with preservation of the ECM and moderate decellularization. Enzymatic decellularization resulted in complete decellularization but extensive degeneration and fragmentation of the ECM. When implanted into the infrarenal aorta of allogeneic rats for 1 week, valves decellularized with detergent-based and osmotic methodology failed to stimulate an allogeneic immune response, as evidenced by an absence of T cell infiltrates. Osmotic lysis protocols with low-dose detergent appear to be most effective at both removing antigenic cellular elements and preserving the ECM.