Results 1 - 20 of 94
1.
Am J Transplant ; 2024 Jul 16.
Article in English | MEDLINE | ID: mdl-39019437

ABSTRACT

Organ procurement organizations (OPOs) face increasing regulatory scrutiny, and the performance of predictive models used to assess OPO performance is critical. We sought to determine whether adding deceased donor physiological and critical care data to the existing Scientific Registry of Transplant Recipients (SRTR) heart yield model would improve the model's performance. Donor data and heart transplanted (yes/no), the outcome of interest, were obtained from the United Network for Organ Sharing Donor Management Goal (DMG) Registry for 19 141 donors after brain death, from 25 OPOs. The data were split into training and testing portions. Multivariable LASSO regression was used to develop a statistical model incorporating DMG data elements with the existing components of the SRTR model. The DMG + SRTR and SRTR models were applied to the test data to compare the predictive performance of the models. The sensitivity (84%-86%) and specificity (84%-86%) were higher for the DMG + SRTR model compared to the SRTR model (71%-75% and 76%-77%, respectively). For the DMG + SRTR model, the C-statistic was 0.92 to 0.93 compared to 0.80 to 0.81 for the SRTR model. DMG data elements improve the predictive performance of the heart yield model. The addition of DMG data elements to the Organ Procurement and Transplantation Network data collection requirements should be considered.
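The model comparison above rests on discrimination metrics computed on held-out test data: sensitivity, specificity, and the C-statistic (equivalent to the area under the ROC curve). As a minimal standard-library sketch, not the study's actual code, the C-statistic can be computed as the probability that a randomly chosen transplanted-heart donor is scored higher than a randomly chosen non-transplanted one:

```python
def c_statistic(scores, labels):
    """C-statistic (ROC AUC): probability that a random positive case
    scores higher than a random negative case; ties count as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(scores, labels, threshold=0.5):
    """Sensitivity and specificity at a fixed classification threshold."""
    n_pos = sum(1 for y in labels if y == 1)
    n_neg = sum(1 for y in labels if y == 0)
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= threshold)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < threshold)
    return tp / n_pos, tn / n_neg
```

Evaluating both candidate models on the same held-out split with these metrics reproduces the kind of head-to-head comparison the abstract reports (0.92-0.93 vs 0.80-0.81).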

2.
Harm Reduct J ; 21(1): 125, 2024 Jun 27.
Article in English | MEDLINE | ID: mdl-38937779

ABSTRACT

BACKGROUND: Patients with opioid use disorder (OUD) experience various forms of stigma at the individual, public, and structural levels that can affect how they access and engage with healthcare, particularly with medications for OUD treatment. Telehealth is a relatively new form of care delivery for OUD treatment. As reducing stigma surrounding OUD treatment is critical to address ongoing gaps in care, the aim of this study was to explore how telehealth impacts patient experiences of stigma. METHODS: In this qualitative study, we interviewed patients with OUD at a single urban academic medical center consisting of multiple primary care and addiction clinics in Oregon, USA. Participants were eligible if they had (1) at least one virtual visit for OUD between March 2020 and December 2021, and (2) a prescription for buprenorphine not exclusively used for chronic pain. We conducted phone interviews between October and December 2022; interviews were recorded, transcribed, dual-coded, and analyzed using reflexive thematic analysis. RESULTS: The mean age of participants (n = 30) was 40.5 years (range 20-63); 14 were women, 15 were men, and two were transgender, non-binary, or gender-diverse. Participants were 77% white, and 33% had experienced homelessness in the prior six months. We identified four themes regarding how telehealth for OUD treatment shaped patient perceptions of and experiences with stigma at the individual (1), public (2-3), and structural levels (4): (1) Telehealth offers wanted space and improved control over treatment setting; (2) Public stigma and privacy concerns can impact both telehealth and in-person encounters, depending on clinical and personal circumstances; (3) The social distance of telehealth could mitigate or exacerbate perceptions of clinician stigma, depending on both patient and clinician expectations; (4) The increased flexibility of telehealth translated to perceptions of increased clinician trust and respect. 
CONCLUSIONS: The forms of stigma experienced by individuals with OUD are complex and multifaceted, as are the ways in which those experiences interact with telehealth-based care. The mixed results of this study support policies allowing for a more individualized, patient-centered approach to care delivery that allows patients a choice over how they receive OUD treatment services.


Subject(s)
Opioid-Related Disorders , Qualitative Research , Social Stigma , Telemedicine , Humans , Female , Male , Adult , Middle Aged , Opioid-Related Disorders/psychology , Young Adult , Oregon , Buprenorphine/therapeutic use , Opiate Substitution Treatment/psychology , Opiate Substitution Treatment/methods
3.
JAMA Netw Open ; 7(2): e2353785, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-38416500

ABSTRACT

Importance: Delayed graft function in kidney-transplant recipients is associated with increased financial cost and patient burden. In donors with high Kidney Donor Profile Index whose kidneys are not pumped, therapeutic hypothermia has been shown to confer a protective benefit against delayed graft function. Objective: To determine whether hypothermia is superior to normothermia in preventing delayed graft function in low-risk nonpumped kidney donors after brain death. Design, Setting, and Participants: In a multicenter randomized clinical trial, brain-dead kidney donors deemed to be low risk and not requiring machine perfusion per Organ Procurement Organization protocol were prospectively randomized to hypothermia (34.0-35.0 °C) or normothermia (36.5-37.5 °C) between August 10, 2017, and May 21, 2020, across 4 Organ Procurement Organizations in the US (Arizona, Upper Midwest, Pacific Northwest, and Texas). The final analysis report is dated June 15, 2022, based on the data set received from the United Network for Organ Sharing on June 2, 2021. A total of 509 donors (normothermia: n = 245 and hypothermia: n = 236; 1017 kidneys) met inclusion criteria over the study period. Intervention: Donor hypothermia (34.0-35.0 °C) or normothermia (36.5-37.5 °C). Main Outcomes and Measures: The primary outcome was delayed graft function in the kidney recipients, defined as the need for dialysis within the first week following kidney transplant. The primary analysis follows the intent-to-treat principle. Results: A total of 934 kidneys were transplanted from 481 donors, of which 474 were randomized to the normothermia group and 460 to the hypothermia group. Donor characteristics were similar between the groups, with overall mean (SD) donor age 34.2 (11.1) years, and the mean donor creatinine level at enrollment of 1.03 (0.53) mg/dL. 
There was a predominance of Standard Criteria Donors (98% in each treatment arm) with similar low mean (SD) Kidney Donor Profile Index (normothermia: 28.99 [20.46] vs hypothermia: 28.32 [21.9]). Cold ischemia time was similar in the normothermia and hypothermia groups (15.99 [7.9] vs 15.45 [7.63] hours). Delayed graft function developed in 87 of the recipients (18%) in the normothermia group vs 79 (17%) in the hypothermia group (adjusted odds ratio, 0.92; 95% CI, 0.64-1.33; P = .66). Conclusions and Relevance: The findings of this study suggest that, in low-risk nonpumped kidneys from brain-dead kidney donors, therapeutic hypothermia compared with normothermia does not appear to prevent delayed graft function in kidney transplant recipients. Trial Registration: ClinicalTrials.gov Identifier: NCT02525510.


Subject(s)
Hypothermia, Induced , Hypothermia , Kidney Transplantation , Adult , Humans , Brain , Brain Death , Delayed Graft Function , Renal Dialysis , Young Adult
4.
JACC Heart Fail ; 12(4): 722-736, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38244008

ABSTRACT

BACKGROUND: Potential organ donors often exhibit abnormalities on electrocardiograms (ECGs) after brain death, but the physiological and prognostic significance of such abnormalities is unknown. OBJECTIVES: This study sought to characterize the prevalence of ECG abnormalities in a nationwide cohort of potential cardiac donors and their associations with cardiac dysfunction, use for heart transplantation (HT), and recipient outcomes. METHODS: The Donor Heart Study enrolled 4,333 potential cardiac organ donors at 8 organ procurement organizations across the United States from 2015 to 2020. A blinded expert reviewer interpreted all ECGs, which were obtained once hemodynamic stability was achieved after brain death and were repeated 24 ± 6 hours later. ECG findings were summarized, and their associations with other cardiac diagnostic findings, use for HT, and graft survival were assessed using univariable and multivariable regression. RESULTS: Initial ECGs were interpretable for 4,136 potential donors. Overall, 64% of ECGs were deemed clinically abnormal, most commonly as a result of a nonspecific ST-T-wave abnormality (39%), T-wave inversion (19%), and/or QTc interval >500 ms (17%). Conduction abnormalities, ectopy, pathologic Q waves, and ST-segment elevations were less common (each present in ≤5% of donors) and resolved on repeat ECGs in most cases. Only pathologic Q waves were significant predictors of donor heart nonuse (adjusted OR: 0.39; 95% CI: 0.29-0.53), and none were associated with graft survival at 1 year post-HT. CONCLUSIONS: ECG abnormalities are common in potential heart donors but often resolve on serial testing. Pathologic Q waves are associated with a lower likelihood of use for HT, but they do not portend worse graft survival.
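The abstract flags QTc intervals >500 ms as one of the most common abnormalities. The QT interval is heart-rate dependent, so it is conventionally rate-corrected before applying such a cutoff; the sketch below uses Bazett's formula (QTc = QT/√RR), which is an assumption on my part, since the study's correction method is not stated in the abstract:

```python
import math

def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett rate-corrected QT: QTc = QT / sqrt(RR), with RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm  # RR interval in seconds
    return qt_ms / math.sqrt(rr_s)

def flag_prolonged(qt_ms, heart_rate_bpm, cutoff_ms=500.0):
    """Apply the >500 ms QTc cutoff used in the abstract."""
    return qtc_bazett(qt_ms, heart_rate_bpm) > cutoff_ms
```

At a heart rate of 60 bpm the RR interval is 1 s and QTc equals the raw QT; at faster rates the same raw QT yields a longer QTc, which is why rate correction matters when screening tachycardic donors.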


Subject(s)
Heart Diseases , Heart Failure , Heart Transplantation , Tissue and Organ Procurement , Humans , Tissue Donors , Brain Death , Electrocardiography , Arrhythmias, Cardiac
5.
J Heart Lung Transplant ; 43(3): 387-393, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37802261

ABSTRACT

Primary graft dysfunction (PGD) is a leading cause of early morbidity and mortality following heart transplantation (HT). We sought to determine the association between pretransplant human leukocyte antigen (HLA) sensitization, as measured using the calculated panel reactive antibody (cPRA) value, and the risk of PGD. METHODS: Consecutive adult HT recipients (n = 596) from 1/2015 to 12/2019 at 2 US centers were included. Severity of PGD was based on the 2014 International Society for Heart and Lung Transplantation consensus statement. For each recipient, unacceptable HLA antigens were obtained and locus-specific cPRA (cPRA-LS) and pre-HT donor-specific antibodies (DSA) were assessed. RESULTS: Univariable logistic modeling showed that peak cPRA-LS for all loci and HLA-A was associated with increased severity of PGD as an ordinal variable (all loci: OR 1.78, 95% CI: 1.01-1.14, p = 0.025, HLA-A: OR 1.14, 95% CI: 1.03-1.26, p = 0.011). Multivariable analysis showed peak cPRA-LS for HLA-A, recipient beta-blocker use, total ischemic time, donor age, prior cardiac surgery, and United Network for Organ Sharing status 1 or 2 were associated with increased severity of PGD. The presence of DSA to HLA-B was associated with trend toward increased risk of mild-to-moderate PGD (OR 2.56, 95% CI: 0.99-6.63, p = 0.053), but DSA to other HLA loci was not associated with PGD. CONCLUSIONS: Sensitization for all HLA loci, and specifically HLA-A, is associated with an increased severity of PGD. These factors should be included in pre-HT risk stratification to minimize the risk of PGD.


Subject(s)
Heart Transplantation , Primary Graft Dysfunction , Adult , Humans , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/etiology , Heart Transplantation/adverse effects , HLA Antigens , Tissue Donors , Antibodies , HLA-A Antigens , Retrospective Studies
6.
Circulation ; 148(10): 822-833, 2023 09 05.
Article in English | MEDLINE | ID: mdl-37465972

ABSTRACT

BACKGROUND: Left ventricular dysfunction in potential donors meeting brain death criteria often results in nonuse of donor hearts for transplantation, yet little is known about its incidence or pathophysiology. Resolving these unknowns was a primary aim of the DHS (Donor Heart Study), a multisite prospective cohort study. METHODS: The DHS enrolled potential donors by neurologic determination of death (n=4333) at 8 organ procurement organizations across the United States between February 2015 and May 2020. Data included medications administered, serial diagnostic tests, and transthoracic echocardiograms (TTEs) performed: (1) within 48 hours after brain death was formally diagnosed; and (2) 24±6 hours later if left ventricular (LV) dysfunction was initially present. LV dysfunction was defined as an LV ejection fraction <50% and was considered reversible if LV ejection fraction was >50% on the second TTE. TTEs were also examined for presence of LV regional wall motion abnormalities and their reversibility. We assessed associations between LV dysfunction, donor heart acceptance for transplantation, and recipient 1-year survival. RESULTS: An initial TTE was interpreted for 3794 of the 4333 potential donors by neurologic determination of death. A total of 493 (13%) of these TTEs showed LV dysfunction. Among those donors with an initial TTE, LV dysfunction was associated with younger age, being underweight, and higher NT-proBNP (N-terminal pro-B-type natriuretic peptide) and troponin levels. A second TTE was performed within 24±6 hours for a subset of donors (n=224) with initial LV dysfunction; within this subset, 130 (58%) demonstrated reversibility. Sixty percent of donor hearts with normal LV function were accepted for transplant compared with 56% of hearts with reversible LV dysfunction and 24% of hearts with nonreversible LV dysfunction. Donor LV dysfunction, whether reversible or not, was not associated with recipient 1-year survival. 
CONCLUSIONS: LV dysfunction associated with brain death occurs in many potential heart donors and is sometimes reversible. These findings can inform decisions made during donor evaluation and help guide donor heart acceptance for transplantation.


Subject(s)
Heart Transplantation , Ventricular Dysfunction, Left , Humans , Tissue Donors , Heart Transplantation/methods , Prospective Studies , Brain Death , Ventricular Function, Left
7.
Clin Transplant ; 37(6): e14978, 2023 06.
Article in English | MEDLINE | ID: mdl-36964943

ABSTRACT

Heart and lung transplant recipients require care provided by clinicians from multiple different specialties, each contributing unique expertise and perspective. The period the patient spends in the intensive care unit is one of the most critical times in the perioperative trajectory. Various organizational models of intensive care exist, including those led by intensivists, surgeons, transplant cardiologists, and pulmonologists. Coordinating timely, efficient intensive care is an essential and logistically difficult goal. The present work, a product of the American Society of Transplantation's Thoracic and Critical Care Community of Practice Critical Care Task Force, outlines operational guidelines and principles that may be applied in different organizational models to optimize the delivery of intensive care for the cardiothoracic organ recipient.


Subject(s)
Intensive Care Units , Surgeons , Humans , Critical Care , Perioperative Care
8.
N Engl J Med ; 388(5): 418-426, 2023 02 02.
Article in English | MEDLINE | ID: mdl-36724328

ABSTRACT

BACKGROUND: Therapeutic hypothermia in brain-dead organ donors has been shown to reduce delayed graft function in kidney recipients after transplantation. Data are needed on the effect of hypothermia as compared with machine perfusion on outcomes after kidney transplantation. METHODS: At six organ-procurement facilities in the United States, we randomly assigned brain-dead kidney donors to undergo therapeutic hypothermia (hypothermia group), ex situ kidney hypothermic machine perfusion (machine-perfusion group), or both (combination-therapy group). The primary outcome was delayed graft function in the kidney transplant recipients (defined as the initiation of dialysis during the first 7 days after transplantation). We also evaluated whether hypothermia alone was noninferior to machine perfusion alone and whether the combination of both methods was superior to each of the individual therapies. Secondary outcomes included graft survival at 1 year after transplantation. RESULTS: From 725 enrolled donors, 1349 kidneys were transplanted: 359 kidneys in the hypothermia group, 511 in the machine-perfusion group, and 479 in the combination-therapy group. Delayed graft function occurred in 109 patients (30%) in the hypothermia group, in 99 patients (19%) in the machine-perfusion group, and in 103 patients (22%) in the combination-therapy group. Adjusted risk ratios for delayed graft function were 1.72 (95% confidence interval [CI], 1.35 to 2.17) for hypothermia as compared with machine perfusion, 1.57 (95% CI, 1.26 to 1.96) for hypothermia as compared with combination therapy, and 1.09 (95% CI, 0.85 to 1.40) for combination therapy as compared with machine perfusion. At 1 year, the frequency of graft survival was similar in the three groups. A total of 10 adverse events were reported, including cardiovascular instability in 9 donors and organ loss in 1 donor owing to perfusion malfunction. 
CONCLUSIONS: Among brain-dead organ donors, therapeutic hypothermia was inferior to machine perfusion of the kidney in reducing delayed graft function after transplantation. The combination of hypothermia and machine perfusion did not provide additional protection. (Funded by Arnold Ventures; ClinicalTrials.gov number, NCT02525510.).
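The risk ratios reported above are covariate-adjusted; as a quick plausibility check, crude risk ratios can be recomputed directly from the event counts given in the abstract (109/359, 99/511, 103/479):

```python
def crude_risk_ratio(events_a, total_a, events_b, total_b):
    """Unadjusted risk ratio of group A versus group B."""
    return (events_a / total_a) / (events_b / total_b)

# Delayed-graft-function events / kidneys transplanted, from the abstract
hypothermia = (109, 359)
machine_perfusion = (99, 511)
combination = (103, 479)

rr_hypo_vs_machine = crude_risk_ratio(*hypothermia, *machine_perfusion)  # ~1.57
rr_hypo_vs_combo = crude_risk_ratio(*hypothermia, *combination)          # ~1.41
```

The crude ratios (about 1.57 and 1.41) track the adjusted estimates (1.72 and 1.57), with the gap attributable to the covariate adjustment described in the trial.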


Subject(s)
Hypothermia, Induced , Hypothermia , Kidney Transplantation , Kidney , Organ Preservation , Perfusion , Humans , Brain Death , Delayed Graft Function/etiology , Delayed Graft Function/prevention & control , Graft Survival , Hypothermia, Induced/adverse effects , Hypothermia, Induced/methods , Kidney/surgery , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Organ Preservation/adverse effects , Organ Preservation/methods , Perfusion/adverse effects , Perfusion/methods , Tissue Donors
9.
Kidney Int Rep ; 8(1): 17-29, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36644345

ABSTRACT

Rhabdomyolysis-induced acute kidney injury (RIAKI) occurs following damage to the muscular sarcolemma sheath, resulting in the leakage of myoglobin and other metabolites that cause kidney damage. Currently, the sole recommended clinical treatment for RIAKI is aggressive fluid resuscitation, but other potential therapies, including pretreatments for those at risk for developing RIAKI, are under investigation. This review outlines the mechanisms and clinical significance of RIAKI, investigational treatments and their specific targets, and the status of ongoing research trials.

10.
J Heart Lung Transplant ; 42(5): 617-626, 2023 05.
Article in English | MEDLINE | ID: mdl-36682894

ABSTRACT

BACKGROUND: Primary graft dysfunction (PGD) is a major cause of early mortality following heart transplant (HT). Donor risk factors for the development of PGD are incompletely characterized. Donor management goals (DMG) are predefined critical care endpoints used to optimize donors. We evaluated the relationship between DMG and non-DMG parameters and the development of PGD after HT. METHODS: A cohort of HT recipients from 2 transplant centers between 1/1/12 and 12/31/19 was linked to their respective donors in the United Network for Organ Sharing (UNOS) DMG Registry (n = 1,079). PGD was defined according to modified ISHLT criteria. Variables were subject to univariate and multivariable multinomial modeling with development of mild/moderate or severe PGD as the outcome variable. A second multicenter cohort of 4,010 donors from the DMG Registry was used for validation. RESULTS: Mild/moderate and severe PGD occurred in 15% and 6% of the cohort. Multivariable modeling revealed 6 variables independently associated with mild/moderate and 6 associated with severe PGD, respectively. Recipient use of amiodarone plus beta-blocker, recipient mechanical circulatory support, donor age, donor fraction of inspired oxygen (FiO2), and donor creatinine increased risk whereas predicted heart mass ratio decreased risk of severe PGD. We found that donor age and FiO2 ≥ 40% were associated with an increased risk of death within 90 days post-transplant in a multicenter cohort. CONCLUSIONS: Donor hyperoxia at heart recovery is a novel risk factor for severe primary graft dysfunction and early recipient death. These results suggest that excessive oxygen supplementation should be minimized during donor management.


Subject(s)
Heart Transplantation , Hyperoxia , Primary Graft Dysfunction , Humans , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/etiology , Hyperoxia/complications , Risk Factors , Heart Transplantation/adverse effects , Tissue Donors , Oxygen , Retrospective Studies
12.
Am J Transplant ; 22(7): 1760-1765, 2022 07.
Article in English | MEDLINE | ID: mdl-35373509

ABSTRACT

Solid organ transplantation continues to be constrained by a lack of suitable donor organs. Advances in donor management and evaluation are needed to address this shortage, but the performance of research studies in deceased donors is fraught with challenges. Here we discuss several of the major obstacles we faced in the conduct of the Donor Heart Study-a prospective, multi-site, observational study of donor management, evaluation, and acceptance for heart transplantation. These included recruitment and engagement of participating organ procurement organizations, ambiguities related to study oversight, obtaining authorization for donor research, logistical challenges encountered during donor management, sustaining study momentum, and challenges related to study data management. By highlighting these obstacles encountered, as well as the solutions implemented, we hope to stimulate further discussion and actions that will facilitate the design and execution of future donor research studies.


Subject(s)
Heart Transplantation , Organ Transplantation , Tissue and Organ Procurement , Humans , Prospective Studies , Tissue Donors
13.
Clin Transplant ; 36(2): e14528, 2022 02.
Article in English | MEDLINE | ID: mdl-34739731

ABSTRACT

BACKGROUND: Delayed graft function (DGF) after kidney transplantation is a common occurrence and correlates with poor graft and patient outcomes. Donor characteristics and care are known to impact DGF. We sought to determine the relationship between achievement of specific donor management goals (DMGs) and DGF. METHODS: This is a retrospective case-control study using data from 14 046 adult kidney donations after brain death from hospitals in 18 organ procurement organizations (OPOs) which were transplanted to adult recipients between 2012 and 2018. Data on DMG compliance and donor, recipient, and ischemia-related factors were used to create multivariable logistic regression models. RESULTS: The overall rate of DGF was 29.4%. Meeting DMGs for urine output and vasopressor use were associated with decreased risk of DGF. Sensitivity analyses performed with different imputation methods, omitting recipient factors, and analyzing multiple time points yielded largely consistent results. CONCLUSIONS: The development of DMGs continues to show promise in improving outcomes in the kidney transplant recipient population. Studies in smaller cohorts have already shown increased utilization of kidneys and other organs, as well as decreased rates of DGF. Additional research and analysis are required to assess interactions between meeting DMGs and correlation versus causality in DMGs and DGF.


Subject(s)
Delayed Graft Function , Kidney Transplantation , Adult , Case-Control Studies , Delayed Graft Function/epidemiology , Delayed Graft Function/etiology , Goals , Graft Survival , Humans , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Tissue Donors
14.
Transplantation ; 106(6): 1227-1232, 2022 06 01.
Article in English | MEDLINE | ID: mdl-34310099

ABSTRACT

BACKGROUND: The development of cytomegalovirus (CMV) infection after kidney transplant remains a significant cause of posttransplant morbidity, graft loss, and mortality. Despite appropriate antiviral therapy, recipients without previous CMV exposure can currently be allocated a kidney from a donor with previous CMV infection (D+R-) that carries the greatest risk of posttransplant CMV infection and associated complications. Preferential placement of CMV D- organs in negative recipients (R-) has been shown to reduce the risk of viral infection and associated complications. METHODS: To assess the long-term survival and economic benefits of allocation policy reforms, a decision-analytic model was constructed to compare receipt of a CMV D- organ versus a CMV D+ organ in CMV R- recipients using data from transplant registry, Medicare claims, and pharmaceutical costs. RESULTS: For CMV R- patients, receipt of a CMV D- organ was associated with greater average survival (14.3 versus 12.6 y), superior quality-adjusted life years (12.6 versus 9.8), and lower costs ($529 512 versus $542 963). One-way sensitivity analysis demonstrated a survival advantage for patients waiting as long as 30 mo for a CMV D- kidney. CONCLUSIONS: Altering national allocation policy to preferentially offer CMV D- organs to CMV R- recipients could improve survival and lower costs after transplant if appropriately implemented.
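A decision-analytic comparison of this kind reduces to incremental cost and incremental QALYs: when one strategy is both cheaper and more effective it dominates, and no cost-effectiveness ratio is needed. A hedged sketch using the figures reported in the abstract (the function name is illustrative, not from the study):

```python
def incremental(cost_a, qaly_a, cost_b, qaly_b):
    """Compare strategy A against B. A negative cost delta with a
    non-negative QALY delta means A dominates; otherwise report the
    incremental cost-effectiveness ratio (ICER, $/QALY)."""
    d_cost, d_qaly = cost_a - cost_b, qaly_a - qaly_b
    if d_cost <= 0 and d_qaly >= 0:
        return d_cost, d_qaly, "dominant"
    return d_cost, d_qaly, d_cost / d_qaly

# CMV D- versus CMV D+ kidneys in CMV R- recipients (abstract figures)
d_cost, d_qaly, verdict = incremental(529_512, 12.6, 542_963, 9.8)
```

With the reported inputs, the D- strategy costs $13,451 less and yields 2.8 more QALYs, i.e., it dominates, which is why the abstract can recommend the policy without quoting an ICER.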


Subject(s)
Cytomegalovirus Infections , Kidney Transplantation , Aged , Antiviral Agents/therapeutic use , Cytomegalovirus , Cytomegalovirus Infections/diagnosis , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/prevention & control , Decision Support Techniques , Humans , Kidney Transplantation/adverse effects , Medicare , Retrospective Studies , Transplant Recipients , United States/epidemiology
15.
Transplant Direct ; 7(10): e771, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34604507

ABSTRACT

Early prediction of whether a liver allograft will be utilized for transplantation may allow better resource deployment during donor management and improve organ allocation. The national donor management goals (DMG) registry contains critical care data collected during donor management. We developed a machine learning model to predict transplantation of a liver graft based on data from the DMG registry. METHODS: Several machine learning classifiers were trained to predict transplantation of a liver graft. We utilized 127 variables available in the DMG dataset. We included data from potential deceased organ donors between April 2012 and January 2019. The outcome was defined as liver recovery for transplantation in the operating room. The prediction was made based on data available 12-18 h after the time of authorization for transplantation. The data were randomly separated into training (60%), validation (20%), and test sets (20%). We compared the performance of our models to the Liver Discard Risk Index. RESULTS: Of 13 629 donors in the dataset, 9255 (68%) livers were recovered and transplanted, 1519 were recovered but used for research or discarded, and 2855 were not recovered. The optimized gradient boosting machine classifier achieved an area under the receiver operating characteristic curve of 0.84 on the test set, outperforming all other classifiers. CONCLUSIONS: This model predicts successful liver recovery for transplantation in the operating room, using data available early during donor management. It performs favorably when compared to existing models. It may provide real-time decision support during organ donor management and transplant logistics.
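The 60/20/20 training/validation/test split described in the methods can be sketched with the standard library alone (the function name and seed are illustrative, not taken from the study):

```python
import random

def split_dataset(records, fractions=(0.6, 0.2, 0.2), seed=42):
    """Shuffle once, then slice into training/validation/test partitions."""
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    shuffled = list(records)
    rng.shuffle(shuffled)
    n_train = int(fractions[0] * len(shuffled))
    n_val = int(fractions[1] * len(shuffled))
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])
```

Fitting hyperparameters on the validation partition and reporting the AUC only on the untouched test partition, as the abstract does, guards against the optimistic bias of tuning and evaluating on the same data.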

16.
Am J Transplant ; 21(12): 4003-4011, 2021 12.
Article in English | MEDLINE | ID: mdl-34129720

ABSTRACT

Current risk-adjusted models for donor lung use and lung graft survival do not include donor critical care data. We sought to identify modifiable donor physiologic and mechanical ventilation parameters that predict donor lung use and lung graft survival. This is a prospective observational study of donors after brain death (DBDs) managed by 19 Organ Procurement Organizations from 2016 to 2019. Demographics, mechanical ventilation parameters, and critical care data were recorded at standardized time points during donor management. The lungs were transplanted from 1811 (30%) of 6052 DBDs. Achieving ≥7 critical care endpoints was a positive predictor of donor lung use. After controlling for recipient factors, donor blood pH positively predicted lung graft survival (OR 1.48 per 0.1 unit increase in pH) and the administration of dopamine during donor management negatively predicted lung graft survival (OR 0.19). Tidal volumes ≤8 ml/kg of predicted body weight (OR 0.65) and higher positive end-expiratory pressures (OR 0.91 per cm H2O) predicted decreased donor lung use without affecting lung graft survival. A randomized clinical trial is needed to inform optimal ventilator management strategies in DBDs.
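Odds ratios reported "per 0.1 unit" rescale multiplicatively on the log-odds scale: the OR of 1.48 per 0.1 pH unit implies roughly 1.48² ≈ 2.19 for a 0.2-unit rise. A one-line sketch of this rescaling (the function name is illustrative):

```python
def rescale_or(or_per_unit, reported_unit, new_delta):
    """Rescale an odds ratio reported per `reported_unit` change in the
    predictor to a `new_delta` change, assuming log-linearity."""
    return or_per_unit ** (new_delta / reported_unit)

# OR 1.48 per 0.1 pH unit, rescaled to a 0.2 pH-unit increase
or_for_0_2 = rescale_or(1.48, 0.1, 0.2)  # about 2.19
```

This is just exponent arithmetic on the logistic model's coefficient; it holds only over ranges where the log-linear assumption of the fitted model is reasonable.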


Subject(s)
Graft Survival , Tissue and Organ Procurement , Brain Death , Critical Care , Humans , Lung , Tissue Donors
17.
J Am Coll Surg ; 231(3): 351-360.e5, 2020 09.
Article in English | MEDLINE | ID: mdl-32562768

ABSTRACT

BACKGROUND: Current risk-adjusted models used to predict donor heart use and cardiac graft survival from organ donors after brain death (DBDs) do not include bedside critical care data. We sought to identify novel independent predictors of heart use and graft survival to better understand the relationship between donor management and transplantation outcomes. STUDY DESIGN: We conducted a prospective observational study of DBDs managed from 2008 to 2013 by 10 organ procurement organizations. Demographic data, critical care parameters, and treatments were recorded at 3 standardized time points during donor management. The primary outcomes measures were donor heart use and cardiac graft survival. RESULTS: From 3,433 DBDs, 1,134 hearts (33%) were transplanted and 969 cardiac grafts (85%) survived after 684 ± 392 days of follow-up. After multivariable analysis, independent positive predictors of heart use included standard criteria donor status (odds ratio [OR] 3.93), male sex (OR 1.68), ejection fraction > 50% (OR 1.64), and partial pressure of oxygen to fraction of inspired oxygen ratio > 300 (OR 1.31). Independent negative predictors of heart use included donor age (OR 0.94), BMI > 30 kg/m2 (OR 0.78), serum creatinine (OR 0.83), and use of thyroid hormone (OR 0.78). As for graft survival, after controlling for known recipient risk factors, thyroid hormone dose was the only independent predictor (OR 1.04 per µg/h). CONCLUSIONS: Modifiable critical care parameters and treatments predict donor heart use and cardiac graft survival. The discordant relationship between thyroid hormone and donor heart use (negative predictor) vs cardiac graft survival (positive predictor) warrants additional investigation.


Subject(s)
Graft Survival , Heart Transplantation , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/statistics & numerical data , Adult , Aged , Brain Death , Female , Humans , Male , Middle Aged , Prospective Studies
18.
J Trauma Acute Care Surg ; 88(6): 783-788, 2020 06.
Article in English | MEDLINE | ID: mdl-32459446

ABSTRACT

BACKGROUND: Delayed graft function (DGF), the need for dialysis in the first week following kidney transplant, affects approximately one quarter of deceased-donor kidney transplant recipients. Donor demographics, donor serum creatinine, and graft cold ischemia time are associated with DGF. However, there is no consensus on the optimal management of hemodynamic instability in organ donors after brain death (DBDs). Our objective was to determine the relationship between vasopressor selection during donor management and the development of DGF. METHODS: Prospective observational data, including demographic and critical care parameters, were collected for all DBDs managed by 17 organ procurement organizations from nine Organ Procurement and Transplantation Network Regions between 2012 and 2018. Recipient outcome data were linked with donor data through donor identification numbers. Donor critical care parameters, including type of vasopressor and doses, were recorded at three standardized time points during donor management. The analysis included only donors who received at least one vasopressor at all three time points. Vasopressor doses were converted to norepinephrine equivalent doses and analyzed as continuous variables. Univariate analyses were conducted to determine the association between donor variables and DGF. Results were adjusted for known predictors of DGF using binary logistic regression. RESULTS: Complete data were available for 5,554 kidney transplant recipients and 2,985 DBDs. On univariate analysis, donor serum creatinine, donor age, donor subtype, kidney donor profile index, graft cold ischemia time, phenylephrine dose, and dopamine dose were associated with DGF. After multivariable analysis, increased donor serum creatinine, donor age, kidney donor profile index, graft cold ischemia time, and phenylephrine dose remained independent predictors of DGF. CONCLUSION: Higher doses of phenylephrine were an independent predictor of DGF. 
With the exception of phenylephrine, the selection and dose of vasopressor during donor management did not predict the development of DGF. LEVEL OF EVIDENCE: Prognostic study, Level III.
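The methods above convert each donor's vasopressor doses to a single norepinephrine-equivalent value so that different agents can be analyzed on one continuous scale. A minimal sketch of that conversion is below; the conversion factors are one commonly published set and are an assumption for illustration, since the abstract does not state the factors the registry used.

```python
# Illustrative norepinephrine-equivalent (NEE) conversion.
# Factors below are assumptions (one commonly cited set); published
# conversion tables vary, and the study's own factors are not given.
CONVERSION = {
    "norepinephrine": 1.0,   # ug/kg/min, reference agent
    "epinephrine": 1.0,      # ug/kg/min
    "phenylephrine": 0.1,    # ug/kg/min -> NEE
    "dopamine": 0.01,        # ug/kg/min -> NEE
    "vasopressin": 2.5,      # U/min -> NEE
}

def norepinephrine_equivalent(doses):
    """Sum vasopressor doses as one norepinephrine-equivalent dose.

    doses: mapping of agent name -> dose in the units noted above.
    Unknown agents raise KeyError rather than being silently dropped.
    """
    return sum(CONVERSION[agent] * dose for agent, dose in doses.items())
```

Under these factors, a donor on phenylephrine 2 ug/kg/min plus dopamine 5 ug/kg/min would map to 0.25 ug/kg/min norepinephrine-equivalent.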


Subject(s)
Brain Death/physiopathology , Critical Care/statistics & numerical data , Delayed Graft Function/epidemiology , Kidney Transplantation/adverse effects , Kidney/drug effects , Vasoconstrictor Agents/adverse effects , Adult , Age Factors , Cold Ischemia/adverse effects , Critical Care/methods , Delayed Graft Function/etiology , Delayed Graft Function/prevention & control , Dose-Response Relationship, Drug , Female , Humans , Kidney/blood supply , Kidney/physiopathology , Kidney Transplantation/methods , Kidney Transplantation/statistics & numerical data , Male , Middle Aged , Phenylephrine/administration & dosage , Phenylephrine/adverse effects , Prospective Studies , Risk Assessment , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/statistics & numerical data , Vasoconstrictor Agents/administration & dosage , Young Adult
19.
Clin Transplant ; 34(5): e13835, 2020 05.
Article in English | MEDLINE | ID: mdl-32068301

ABSTRACT

BACKGROUND: No standard exists for the use of deceased donor liver biopsy during procurement. We sought to evaluate liver biopsy and the impact of findings on outcomes and graft utilization. METHODS: A prospective observational study of donors after neurologic determination of death was conducted from 02/2012-08/2017 (16 OPOs). Donor data were collected through the UNOS Donor Management Goals Registry Web Portal and linked to the Scientific Registry of Transplant Recipients (SRTR) for recipient outcomes. Recipients of biopsied donor livers (BxDL) were studied and a Cox proportional hazard analysis was used to identify independent predictors of 1-year graft survival. RESULTS: Data from 5449 liver transplant recipients were analyzed, of which 1791 (33%) received a BxDL. There was no difference in graft or patient survival between the non-BxDL and BxDL recipient groups. On adjusted analysis of BxDL recipients, macrosteatosis (21%-30% [n = 148] and >30% [n = 92]) was not found to predict 1-year graft survival, whereas increasing donor age (HR 1.02), donor Hispanic ethnicity (HR 1.62), donor INR (HR 1.18), and recipient life support (HR 2.29) were. CONCLUSIONS: Excellent graft and patient survival can be achieved in recipients of BxDL grafts. Notably, as demonstrated by the lack of effect of macrosteatosis on survival, donor to recipient matching may contribute to these outcomes.
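Graft-survival comparisons like the one above rest on standard time-to-event machinery. As a minimal sketch of the underlying idea (hypothetical follow-up data, not the SRTR linkage used in the study), a 1-year Kaplan-Meier survival estimate can be computed as:

```python
def kaplan_meier(times, events, horizon):
    """Kaplan-Meier survival probability at `horizon`.

    times:  follow-up time for each graft (e.g., days)
    events: 1 if the graft failed at that time, 0 if censored
    """
    survival = 1.0
    # Walk through distinct times in order, multiplying in the
    # conditional survival (1 - deaths/at_risk) at each event time.
    for t in sorted(set(times)):
        if t > horizon:
            break
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        at_risk = sum(1 for ti in times if ti >= t)
        if at_risk > 0 and deaths > 0:
            survival *= 1.0 - deaths / at_risk
    return survival
```

With failures at days 100 and 300 and censoring at days 200 and 400 in a cohort of four, the estimate at one year is (3/4) * (1/2) = 0.375. The study's adjusted analysis goes further, fitting a Cox proportional-hazards model so that donor and recipient covariates can be tested jointly.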


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Biopsy , Graft Survival , Humans , Liver , Living Donors , Tissue Donors , Transplant Recipients
20.
Transplantation ; 103(11): e365-e368, 2019 11.
Article in English | MEDLINE | ID: mdl-31356580

ABSTRACT

BACKGROUND: In a recent trial, targeted mild hypothermia in brain-dead organ donors significantly reduced the incidence of delayed graft function after kidney transplantation. This trial was stopped early for efficacy. Here, we report long-term graft survival for all organs along with donor critical care end points. METHODS: We assessed graft survival through 1 year of all solid organs transplanted from 370 donors who had been randomly assigned to hypothermia (34-35°C) or normothermia (36.5-37.5°C) before donation. Additionally, changes in standardized critical care end points were compared between donors in each group. RESULTS: Mild hypothermia was associated with a nonsignificant improvement in 1-year kidney transplant survival (95% versus 92%; hazard ratio, 0.61 [0.31-1.20]; P = 0.15). Mild hypothermia was associated with higher 1-year graft survival in the subgroup of standard criteria donors (97% versus 93%; hazard ratio, 0.39 [0.15-1.00]; P = 0.05). There were no significant differences in graft survival of extrarenal organs. There were no differences in critical care end points between groups. CONCLUSIONS: Mild hypothermia in the donor safely reduced the rate of delayed graft function in kidney transplant recipients without adversely affecting donor physiology or extrarenal graft survival. Kidneys from standard criteria donors who received targeted mild hypothermia had improved 1-year graft survival.
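As a rough consistency check on the reported kidney figures: under proportional hazards, S_treated = S_control ** HR, so the hazard ratio implied by two 1-year survival fractions is ln(S_treated) / ln(S_control). This is a back-of-envelope relation for illustration only, not the trial's actual Cox analysis:

```python
import math

def hazard_ratio_from_survival(s_treated, s_control):
    """Hazard ratio implied by two survival fractions under
    proportional hazards: S_treated = S_control ** HR."""
    return math.log(s_treated) / math.log(s_control)
```

For the overall cohort, ln(0.95) / ln(0.92) is approximately 0.62, in line with the reported hazard ratio of 0.61 (the small gap reflects that the trial's estimate comes from the full time-to-event data, not just the 1-year fractions).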


Subject(s)
Graft Survival , Hypothermia, Induced , Kidney Transplantation/methods , Tissue Donors , Adult , Aged , Body Temperature , Brain Death , Delayed Graft Function , Follow-Up Studies , Humans , Kidney/pathology , Kidney/surgery , Middle Aged , Patient Safety , Perfusion , Proportional Hazards Models , Tissue and Organ Procurement , Treatment Outcome