Results 1 - 20 of 21
1.
Transplant Proc ; 55(10): 2333-2344, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37925233

ABSTRACT

A more granular donor kidney grading scale, the kidney donor profile index (KDPI), has recently emerged in contradistinction to the standard criteria donor/expanded criteria donor framework. In this paper, we built a Markov decision process model to evaluate the survival, quality-adjusted life year (QALY), and cost advantages of accepting high-KDPI kidneys across multiple KDPI strata over a 60-month time horizon, as opposed to remaining on the waiting list for a lower-KDPI kidney. Data for the model were gathered from the Scientific Registry of Transplant Recipients and the United States Renal Data System Medicare parts A, B, and D databases. Of the 129,024 phenotypes delineated in this model, 65% would experience a survival benefit, 81% would experience an increase in QALYs, 87% would see cost savings, and 76% would experience cost savings per QALY from accepting a high-KDPI kidney rather than remaining on the waiting list for a kidney of lower KDPI. Classification and regression tree (CART) analysis revealed that the main drivers of increased survival in accepting high-KDPI kidneys were wait time ≥30 months, panel reactive antibody (PRA) <90, age ≥45 to 65, diagnosis leading to renal failure, and prior transplantation. The CART analysis showed the main drivers of increased QALYs in accepting high-KDPI kidneys were wait time ≥30 months, PRA <90, and age ≥55 to 65.
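The accept-versus-wait tradeoff the abstract describes can be illustrated with a toy expected-QALY comparison over the same 60-month horizon. This is not the authors' Markov model; every monthly survival probability and utility weight below is a hypothetical placeholder.

```python
# Toy accept-now vs wait-on-dialysis comparison over a 60-month horizon.
# All probabilities and utilities are hypothetical, not values from the study.

def expected_qalys(monthly_survival, monthly_utility, months=60):
    """Sum quality-adjusted months of expected survival; return in years."""
    alive = 1.0
    total = 0.0
    for _ in range(months):
        alive *= monthly_survival      # probability of still being alive
        total += alive * monthly_utility
    return total / 12.0

# Accept a high-KDPI kidney now: higher utility off dialysis.
accept = expected_qalys(monthly_survival=0.995, monthly_utility=0.80)
# Remain on the waiting list: dialysis carries a lower utility weight.
wait = expected_qalys(monthly_survival=0.994, monthly_utility=0.65)

print(f"accept: {accept:.2f} QALYs, wait: {wait:.2f} QALYs")
```

With these illustrative inputs the accept arm dominates, mirroring the direction of the paper's result for most phenotypes; the actual model additionally tracks costs, transition states, and patient-specific covariates.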


Subject(s)
Kidney Transplantation , Aged , Humans , United States , Kidney Transplantation/adverse effects , Cost-Benefit Analysis , Graft Survival , Medicare , Kidney , Tissue Donors , Retrospective Studies
2.
J Neurosurg Case Lessons ; 4(10)2022 Sep 05.
Article in English | MEDLINE | ID: mdl-36083772

ABSTRACT

BACKGROUND: Conditions that can mimic posterior fossa tumors are rare. Their identification is crucial to avoid unnecessary surgical intervention, especially when prompt initiation of medical therapy is critical. OBSERVATIONS: The authors presented a case of pseudotumoral hemorrhagic cerebellitis in a 3-year-old boy who presented initially with headache, persistent vomiting, and decreased level of consciousness 9 weeks after severe acute respiratory syndrome coronavirus 2 infection. Magnetic resonance imaging showed a left cerebellar hemorrhagic mass-like lesion with edema and mild hydrocephalus. The patient responded to high-dose steroids and was discharged 2 weeks later with complete recovery. LESSONS: When evaluating patients with possible tumor syndromes, it is important to also consider rarer inflammatory syndromes that can masquerade as neoplasms. Postinfectious hemorrhagic cerebellitis is one such syndrome.

3.
Int J Cardiol ; 321: 61-68, 2020 Dec 15.
Article in English | MEDLINE | ID: mdl-32800909

ABSTRACT

BACKGROUND: Depression is a significant concern after cardiac surgery but has not been studied in patients undergoing transcatheter aortic valve replacement (TAVR). We sought to examine the prevalence of pre-procedure depression and anxiety symptoms and explore whether brief bedside cognitive behavioral therapy (CBT) could prevent post-TAVR psychological distress. METHODS: We prospectively recruited consecutive TAVR patients and randomized them to receive brief CBT or treatment as usual (TAU) during their hospitalization. Multilevel regression techniques were used to evaluate changes by treatment arm in depression, anxiety, and quality of life from baseline to 1 month post-TAVR, adjusted for sex, race, DM, CHF, MMSE, and STS score. RESULTS: One hundred forty-six participants were randomized. The mean age was 82 years, and 43% were female. The proportions of patients whose self-reported depression and anxiety scores met cutoffs for clinical-level distress were 24.6% and 23.2%, respectively. The TAU and CBT groups had comparable improvements in depressive symptoms at 1 month (31% reduction for TAU and 35% for CBT, p = .83), and likewise comparable improvements in anxiety symptoms (8% reduction for TAU and 11% for CBT, p = .1). Quality-of-life scores also improved and did not differ significantly between the two groups. CONCLUSIONS: Pre-procedure depression and anxiety may be common among patients undergoing TAVR. However, TAVR patients show spontaneous improvement in depression and anxiety scores at 1-month follow-up, regardless of brief CBT. Further research is needed to determine whether more tailored CBT interventions may improve psychological and medical outcomes.


Subject(s)
Cognitive Behavioral Therapy , Transcatheter Aortic Valve Replacement , Aged, 80 and over , Depression/diagnosis , Depression/epidemiology , Depression/etiology , Female , Humans , Male , Quality of Life , Transcatheter Aortic Valve Replacement/adverse effects , Treatment Outcome
4.
Am J Transplant ; 20(11): 2997-3007, 2020 11.
Article in English | MEDLINE | ID: mdl-32515544

ABSTRACT

Clinical decision-making in kidney transplant (KT) during the coronavirus disease 2019 (COVID-19) pandemic is understandably a conundrum: both candidates and recipients may face increased acquisition risks and case fatality rates (CFRs). Given our poor understanding of these risks, many centers have paused or reduced KT activity, yet data to inform such decisions are lacking. To quantify the benefit/harm of KT in this context, we conducted a simulation study of immediate-KT vs delay-until-after-pandemic for different patient phenotypes under a variety of potential COVID-19 scenarios. A calculator was implemented (http://www.transplantmodels.com/covid_sim), and machine learning approaches were used to evaluate the important aspects of our modeling. Characteristics of the pandemic (acquisition risk, CFR) and length of delay (length of pandemic, waitlist priority when modeling deceased donor KT) had greatest influence on benefit/harm. In most scenarios of COVID-19 dynamics and patient characteristics, immediate KT provided survival benefit; KT only began showing evidence of harm in scenarios where CFRs were substantially higher for KT recipients (eg, ≥50% fatality) than for waitlist registrants. Our simulations suggest that KT could be beneficial in many centers if local resources allow, and our calculator can help identify patients who would benefit most. Furthermore, as the pandemic evolves, our calculator can update these predictions.
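The immediate-versus-delay comparison at the core of this simulation can be sketched as a simple survival-probability calculation. This is not the authors' simulator (available at the URL above); all acquisition risks, case fatality rates, and baseline survival values below are hypothetical scenario inputs.

```python
# Toy immediate-KT vs delay-until-after-pandemic comparison.
# All rates are hypothetical scenario parameters, not study estimates.

def survival_immediate(acquisition_risk, cfr_recipient, base_survival):
    """P(survive) if transplanted now: avoid COVID death, then post-KT survival."""
    covid_death = acquisition_risk * cfr_recipient
    return (1 - covid_death) * base_survival

def survival_delayed(waitlist_mortality, acquisition_risk, cfr_waitlist,
                     base_survival):
    """P(survive) if delayed: survive dialysis and COVID on the waitlist,
    then receive the transplant after the pandemic."""
    covid_death = acquisition_risk * cfr_waitlist
    return (1 - waitlist_mortality) * (1 - covid_death) * base_survival

imm = survival_immediate(acquisition_risk=0.10, cfr_recipient=0.20,
                         base_survival=0.95)
dly = survival_delayed(waitlist_mortality=0.08, acquisition_risk=0.10,
                       cfr_waitlist=0.05, base_survival=0.95)
print(f"immediate: {imm:.3f}, delayed: {dly:.3f}")
```

Even with a recipient CFR four times the waitlist CFR, immediate transplantation wins here because waitlist mortality accrues during the delay; the abstract reports harm only when recipient CFRs become substantially higher (e.g., ≥50%).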


Subject(s)
COVID-19/epidemiology , Kidney Failure, Chronic/epidemiology , Kidney Transplantation , Machine Learning , Pandemics , SARS-CoV-2 , Tissue Donors/supply & distribution , Adolescent , Adult , Aged , Child , Child, Preschool , Female , Humans , Infant , Infant, Newborn , Kidney Failure, Chronic/surgery , Male , Middle Aged , United States/epidemiology , Waiting Lists/mortality , Young Adult
5.
Transplantation ; 104(6): 1294-1303, 2020 06.
Article in English | MEDLINE | ID: mdl-32433232

ABSTRACT

BACKGROUND: Hepatitis C virus-positive (HCV+) kidney transplant (KT) recipients are at increased risks of rejection and graft failure. The optimal induction agent for this population remains controversial, particularly regarding concerns that antithymocyte globulin (ATG) might increase HCV-related complications. METHODS: Using Scientific Registry of Transplant Recipients and Medicare claims data, we studied 6780 HCV+ and 139,681 HCV- KT recipients in 1999-2016 who received ATG or interleukin-2 receptor antagonist (IL2RA) for induction. We first examined the association of recipient HCV status with receiving ATG (versus IL2RA) using multilevel logistic regression. Then, we studied the association of ATG (versus IL2RA) with KT outcomes (rejection, graft failure, and death) and hepatic complications (liver transplant registration and cirrhosis) among HCV+ recipients using logistic and Cox regression. RESULTS: HCV+ recipients were less likely to receive ATG than HCV- recipients (living donor, adjusted odds ratio [aOR] = 0.77, 95% confidence interval [CI] 0.64-0.91; deceased donor, aOR = 0.81, 95% CI 0.71-0.92). In contrast, HCV+ recipients who received ATG were at lower risk of acute rejection compared to those who received IL2RA (1-y crude incidence = 11.6% versus 12.6%; aOR = 0.82, 95% CI 0.68-0.99). There was no significant difference in the risks of graft failure (adjusted hazard ratio [aHR] = 1.00, 95% CI 0.86-1.17), death (aHR = 0.95, 95% CI 0.85-1.07), liver transplant registration (aHR = 0.97, 95% CI 0.58-1.61), and cirrhosis (aHR = 0.92, 95% CI 0.73-1.16). CONCLUSIONS: Our findings suggest that ATG, as compared to IL2RA, may lower the risk of acute rejection without increasing hepatic complications in HCV+ KT recipients. Given the higher rates of acute rejection in this population, ATG appears to be safe and reasonable for HCV+ recipients.


Subject(s)
Antilymphocyte Serum/administration & dosage , Graft Rejection/epidemiology , Hepatitis C/drug therapy , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Transplantation Conditioning/methods , Adult , Female , Graft Rejection/immunology , Graft Rejection/prevention & control , Graft Survival/drug effects , Graft Survival/immunology , Hepacivirus/drug effects , Hepacivirus/immunology , Hepacivirus/isolation & purification , Hepatitis C/complications , Hepatitis C/diagnosis , Hepatitis C/immunology , Humans , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/mortality , Male , Middle Aged , Receptors, Interleukin-2/antagonists & inhibitors , Receptors, Interleukin-2/immunology , Registries/statistics & numerical data , Survival Analysis , Transplant Recipients/statistics & numerical data , Treatment Outcome , United States/epidemiology
6.
Circ Cardiovasc Interv ; 13(4): e008587, 2020 04.
Article in English | MEDLINE | ID: mdl-32279562

ABSTRACT

BACKGROUND: Intracoronary acetylcholine (Ach) provocation testing is the gold standard for assessing coronary endothelial function. However, dosing regimens of Ach are quite varied in the literature, and there are limited data evaluating the optimal dose. We evaluated the dose-response relationship between Ach and minimal lumen diameter (MLD) by sex and studied whether incremental intracoronary Ach doses given during endothelial function testing improve its diagnostic utility. METHODS: We evaluated 65 men and 212 women with angina and no obstructive coronary artery disease who underwent endothelial function testing using the highest tolerable dose of intracoronary Ach, up to 200 µg. Epicardial endothelial dysfunction was defined as a decrease in MLD >20% after intracoronary Ach by quantitative coronary angiography. We used a linear mixed effects model to evaluate the dose-response relationship. Deming regression analysis was done to compare the %MLD constriction after incremental doses of intracoronary Ach. RESULTS: The mean age was 53.5 years. Endothelial dysfunction was present in 186 (68.1%). Among men with endothelial dysfunction, there was a significant decrease in MLD/10 µg of Ach at doses above 50 µg and 100 µg, while this decrease in MLD was not observed in women (P<0.001). The %MLD constrictions at 20 µg versus 50 µg and at 50 µg versus 100 µg were not equivalent, while those at 100 µg versus 200 µg were equivalent. CONCLUSIONS: Women and men appear to have different responses to Ach during endothelial function testing. In addition to having a greater response to intracoronary Ach at all doses, men also demonstrate an Ach-MLD dose-response relationship with doses up to 200 µg, while women have minimal change in MLD with doses above 50 µg. An incremental dosing regimen during endothelial function testing appears to improve the diagnostic utility of the test and should be adjusted based on the sex of the patient.


Subject(s)
Acetylcholine/administration & dosage , Angina Pectoris/diagnostic imaging , Coronary Angiography , Coronary Artery Disease/diagnostic imaging , Coronary Vasospasm/diagnostic imaging , Coronary Vessels/diagnostic imaging , Endothelium, Vascular/physiopathology , Vasoconstriction , Vasoconstrictor Agents/administration & dosage , Adult , Aged , Angina Pectoris/physiopathology , Coronary Artery Disease/physiopathology , Coronary Vasospasm/chemically induced , Coronary Vasospasm/physiopathology , Coronary Vessels/physiopathology , Dose-Response Relationship, Drug , Female , Humans , Male , Middle Aged , Predictive Value of Tests , Prospective Studies , Sex Factors
7.
Transplantation ; 103(10): 2113-2120, 2019 10.
Article in English | MEDLINE | ID: mdl-30801545

ABSTRACT

BACKGROUND: The Organ Procurement and Transplantation Network implemented Share 35 on June 18, 2013, to broaden deceased donor liver sharing within regional boundaries. We investigated whether increased sharing under Share 35 impacted geographic disparity in deceased donor liver transplantation (DDLT) across donation service areas (DSAs). METHODS: Using Scientific Registry of Transplant Recipients June 2009 to June 2017, we identified 86 083 adult liver transplant candidates and retrospectively estimated Model for End-Stage Liver Disease (MELD)-adjusted DDLT rates using nested multilevel Poisson regression with random intercepts for DSA and transplant program. From the variance in DDLT rates across 49 DSAs and 102 programs, we derived the DSA-level median incidence rate ratio (MIRR) of DDLT rates. MIRR is a robust metric of heterogeneity across each hierarchical level; larger MIRR indicates greater disparity. RESULTS: MIRR was 2.18 pre-Share 35 and 2.16 post-Share 35. Thus, 2 candidates with the same MELD in 2 different DSAs were expected to have a 2.2-fold difference in DDLT rate driven by geography alone. After accounting for program-level heterogeneity, MIRR was attenuated to 2.10 pre-Share 35 and 1.96 post-Share 35. For candidates with MELD 15-34, MIRR decreased from 2.51 pre- to 2.27 post-Share 35, and for candidates with MELD 35-40, MIRR increased from 1.46 pre- to 1.51 post-Share 35, independent of program-level heterogeneity in DDLT. DSA-level heterogeneity in DDLT rates was greater than program-level heterogeneity pre- and post-Share 35. CONCLUSIONS: Geographic disparity substantially impacted DDLT rates before and after Share 35, independent of program-level heterogeneity and particularly for candidates with MELD 35-40. Despite broader sharing, geography remains a major determinant of access to DDLT.


Subject(s)
End Stage Liver Disease/surgery , Healthcare Disparities/statistics & numerical data , Liver Transplantation/statistics & numerical data , Tissue and Organ Procurement/statistics & numerical data , Allografts/supply & distribution , End Stage Liver Disease/diagnosis , Female , Geography , Humans , Male , Middle Aged , Registries/statistics & numerical data , Retrospective Studies , Severity of Illness Index , Waiting Lists
8.
Int J Cardiol ; 282: 7-15, 2019 05 01.
Article in English | MEDLINE | ID: mdl-30527992

ABSTRACT

OBJECTIVE: While >20% of patients presenting to the cardiac catheterization laboratory with angina have no obstructive coronary artery disease (CAD), a majority (77%) have an occult coronary abnormality (endothelial dysfunction, microvascular dysfunction (MVD), and/or a myocardial bridge (MB)). There are limited data regarding the ability of noninvasive stress testing to identify these occult abnormalities in patients with angina in the absence of obstructive CAD. METHODS: We retrospectively evaluated 155 patients (76.7% women) with angina and no obstructive CAD who underwent stress echocardiography and/or electrocardiography before angiography. We evaluated Duke treadmill score, heart rate recovery (HRR), metabolic equivalents, and blood pressure response. During angiography, patients underwent invasive testing for endothelial dysfunction (decrease in epicardial coronary artery diameter >20% after intracoronary acetylcholine), MVD (index of microcirculatory resistance ≥25), and intravascular ultrasound for the presence of an MB. RESULTS: Stress echocardiography and electrocardiography were positive in 58 (43.6%) and 57 (36.7%) patients, respectively. Endothelial dysfunction was present in 96 (64%), MVD in 32 (20.6%), and an MB in 83 (53.9%). On multivariable logistic regression, stress echocardiography was not associated with any abnormality, while stress electrocardiography was associated with endothelial dysfunction. An abnormal HRR was associated with endothelial dysfunction and MVD, but not an MB. CONCLUSION: Conventional stress testing is insufficient for identifying occult coronary abnormalities that are frequently present in patients with angina in the absence of obstructive CAD. A normal stress test does not rule out a non-obstructive coronary etiology of angina, nor does it negate the need for comprehensive invasive testing.


Subject(s)
Angina Pectoris/diagnostic imaging , Coronary Artery Disease/diagnostic imaging , Echocardiography, Stress/standards , Exercise Test/standards , Adult , Aged , Angina Pectoris/physiopathology , Coronary Artery Disease/physiopathology , Electrocardiography/standards , Female , Humans , Male , Microcirculation/physiology , Middle Aged , Retrospective Studies
9.
Clin Transplant ; 32(7): e13291, 2018 07.
Article in English | MEDLINE | ID: mdl-29791039

ABSTRACT

Racial disparities in living donor kidney transplantation (LDKT) persist but the most effective target to eliminate these disparities remains unknown. One potential target could be delays during completion of the live donor evaluation process. We studied racial differences in progression through the evaluation process for 247 African American (AA) and 664 non-AA living donor candidates at our center between January 2011 and March 2015. AA candidates were more likely to be obese (38% vs 22%: P < .001), biologically related (66% vs 44%: P < .001), and live ≤50 miles from the center (64% vs 37%: P < .001) than non-AAs. Even after adjusting for these differences, AAs were less likely to progress from referral to donation (aHR for AA vs non-AA: 0.47, 95% CI 0.26-0.83; P = .01). We then assessed racial differences in completion of each step of the evaluation process and found disparities in progression from medical screening to in-person evaluation (aHR: 0.62, 95% CI 0.41-0.94; P = .02) and from clearance to donation (aHR: 0.51, 95% CI 0.28-0.91; P = .02), compared with from referral to medical screening (aHR: 1.02, 95% CI 0.78-1.33; P = .95) and from in-person evaluation to clearance (aHR: 0.93, 95% CI 0.59-1.44; P = .54). Delays may be a manifestation of the transplant candidate's social network; thus, targeted efforts to optimize networks for identification of donor candidates may help address LDKT disparities.


Subject(s)
Black or African American/statistics & numerical data , Healthcare Disparities/trends , Kidney Failure, Chronic/ethnology , Kidney Transplantation/statistics & numerical data , Living Donors/statistics & numerical data , White People/statistics & numerical data , Adult , Donor Selection , Female , Follow-Up Studies , Humans , Kidney Failure, Chronic/surgery , Male , Middle Aged , Needs Assessment , Treatment Outcome , United States
10.
Am J Transplant ; 18(6): 1510-1517, 2018 06.
Article in English | MEDLINE | ID: mdl-29437286

ABSTRACT

Kidney paired donation (KPD) can facilitate living donor transplantation for candidates with an incompatible donor, but requires waiting for a match while experiencing the morbidity of dialysis. The balance between waiting for KPD vs desensitization or deceased donor transplantation relies on the ability to estimate KPD wait times. We studied donor/candidate pairs in the National Kidney Registry (NKR), a large multicenter KPD clearinghouse, between October 2011 and September 2015 using a competing-risk framework. Among 1894 candidates, 52% were male, median age was 50 years, 66% were white, 59% had blood type O, 42% had panel reactive antibody (PRA) >80, and 50% obtained KPD through NKR. Median times to KPD ranged from 2 months for candidates with ABO-A and PRA 0, to over a year for candidates with ABO-O or PRA 98+. Candidates with PRA 80-97 and 98+ were 23% (95% confidence interval [CI], 6%-37%) and 83% (78%-87%) less likely to be matched than PRA 0 candidates. ABO-O candidates were 67% (61%-73%) less likely to be matched than ABO-A candidates. Candidates with ABO-B or ABO-O donors were 31% (10%-56%) and 118% (82%-162%) more likely to match than those with ABO-A donors. Providers should counsel candidates about realistic, individualized expectations for KPD, especially in the context of their alternative treatment options.


Subject(s)
Kidney Transplantation , Living Donors , Adult , Female , Histocompatibility Testing , Humans , Male , Middle Aged , Registries , Tissue and Organ Procurement
11.
Am J Transplant ; 18(3): 632-641, 2018 03.
Article in English | MEDLINE | ID: mdl-29165871

ABSTRACT

Kidney paired donation (KPD) is an important tool to facilitate living donor kidney transplantation (LDKT). Concerns remain over prolonged cold ischemia times (CIT) associated with shipping kidneys long distances through KPD. We examined the association between CIT and delayed graft function (DGF), allograft survival, and patient survival for 1267 shipped and 205 nonshipped/internal KPD LDKTs facilitated by the National Kidney Registry in the United States from 2008 to 2015, compared to 4800 unrelated, nonshipped, non-KPD LDKTs. Shipped KPD recipients had a median CIT of 9.3 hours (range = 0.25-23.9 hours), compared to 1.0 hour for internal KPD transplants and 0.93 hours for non-KPD LDKTs. Each hour of CIT was associated with a 5% increased odds of DGF (adjusted odds ratio: 1.05, 95% confidence interval [CI], 1.02-1.09, P < .01). However, there was not a significant association between CIT and all-cause graft failure (adjusted hazard ratio [aHR]: 1.01, 95% CI, 0.98-1.04, P = .4), death-censored graft failure (aHR: 1.02, 95% CI, 0.98-1.06, P = .4), or mortality (aHR: 1.00, 95% CI, 0.96-1.04, P > .9). This study of KPD-facilitated LDKTs found no evidence that long CIT is a concern for reduced graft or patient survival. Studies with longer follow-up are needed to refine our understanding of the safety of shipping donor kidneys through KPD.
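The reported per-hour odds ratio can be translated into the cumulative effect of a given cold ischemia time. The calculation below uses the abstract's aOR of 1.05 per hour and the median shipped CIT of 9.3 hours; the baseline DGF probability is a hypothetical illustration, not a figure from the study.

```python
# Scale a baseline DGF probability by the per-hour odds ratio for CIT.
# aOR 1.05/hour and median CIT 9.3 h are from the abstract; the 5%
# baseline DGF probability is hypothetical.

def dgf_probability(baseline_prob, or_per_hour, cit_hours):
    """Convert to odds, apply the cumulative odds ratio, convert back."""
    baseline_odds = baseline_prob / (1 - baseline_prob)
    odds = baseline_odds * or_per_hour ** cit_hours
    return odds / (1 + odds)

cumulative_or = 1.05 ** 9.3   # ~1.57-fold odds at the median shipped CIT
p = dgf_probability(baseline_prob=0.05, or_per_hour=1.05, cit_hours=9.3)
print(f"cumulative OR: {cumulative_or:.2f}, DGF probability: {p:.3f}")
```

Note that an odds ratio applied to a small baseline probability is close to, but not the same as, a relative risk; the odds-to-probability conversion above keeps the two distinct.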


Subject(s)
Cold Ischemia/adverse effects , Delayed Graft Function/etiology , Graft Rejection/etiology , Kidney Failure, Chronic/surgery , Kidney Transplantation/mortality , Living Donors , Tissue and Organ Harvesting/adverse effects , Travel/statistics & numerical data , Adult , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/mortality , Graft Survival , Humans , Kidney Function Tests , Male , Middle Aged , Organ Preservation , Prognosis , Risk Factors , Survival Rate , Time Factors , Tissue and Organ Procurement/methods , Transplant Recipients
12.
Am J Transplant ; 18(6): 1415-1423, 2018 06.
Article in English | MEDLINE | ID: mdl-29232040

ABSTRACT

The Kidney Allocation System fundamentally altered kidney allocation, causing a substantial increase in regional and national sharing that we hypothesized might impact geographic disparities. We measured geographic disparity in deceased donor kidney transplant (DDKT) rate under KAS (6/1/2015-12/1/2016), and compared that with pre-KAS (6/1/2013-12/3/2014). We modeled DSA-level DDKT rates with multilevel Poisson regression, adjusting for allocation factors under KAS. Using the model we calculated a novel, improved metric of geographic disparity: the median incidence rate ratio (MIRR) of transplant rate, a measure of DSA-level variation that accounts for patient casemix and is robust to outlier values. Under KAS, MIRR was 1.81 (95% CI 1.75-1.86) for adults, meaning that similar candidates across different DSAs have a median 1.81-fold difference in DDKT rate. The impact of geography was greater than the impact of factors emphasized by KAS: having an EPTS score ≤20% was associated with a 1.40-fold increase (IRR = 1.40, 95% CI 1.35-1.45, P < .01) and a three-year dialysis vintage was associated with a 1.57-fold increase (IRR = 1.57, 95% CI 1.56-1.59, P < .001) in transplant rate. For pediatric candidates, MIRR was even more pronounced, at 1.92 (95% CI 1.66-2.27). There was no change in geographic disparities with KAS (P = .3). Despite extensive changes to kidney allocation under KAS, geography remains a primary determinant of access to DDKT.
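The MIRR is conventionally computed from the variance of the cluster-level (here, DSA-level) random intercepts as MIRR = exp(√(2σ²) · Φ⁻¹(0.75)). The sketch below applies that standard formula; the variance value is back-calculated to reproduce the reported adult MIRR of 1.81 and is not taken from the paper.

```python
from math import exp, sqrt
from statistics import NormalDist

def median_incidence_rate_ratio(sigma_sq):
    """MIRR = exp(sqrt(2 * sigma^2) * Phi^-1(0.75)), where sigma^2 is the
    variance of the DSA-level random intercepts in the Poisson model."""
    return exp(sqrt(2 * sigma_sq) * NormalDist().inv_cdf(0.75))

# A between-DSA variance of ~0.387 reproduces the reported adult MIRR of
# 1.81 (the variance is back-calculated here purely for illustration).
mirr = median_incidence_rate_ratio(0.387)
print(round(mirr, 2))
```

The 0.75 quantile arises because the MIRR is the median ratio between the higher- and lower-rate cluster of two randomly drawn clusters, which makes it directly comparable to the covariate IRRs quoted in the abstract.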


Subject(s)
Geography , Health Care Rationing , Kidney Transplantation , Tissue and Organ Procurement , Adult , Female , Humans , Male , Middle Aged , Poisson Distribution , Renal Dialysis
13.
Clin Transplant ; 32(2)2018 02.
Article in English | MEDLINE | ID: mdl-29222929

ABSTRACT

BACKGROUND: HIV-infected (HIV+) donor organs can be transplanted into HIV+ recipients under the HIV Organ Policy Equity (HOPE) Act. Quantifying HIV+ donor referrals received by organ procurement organizations (OPOs) is critical for HOPE Act implementation. METHODS: We surveyed the 58 USA OPOs regarding HIV+ referral records and newly discovered HIV+ donors. Using data from OPOs that provided exact records and CDC HIV prevalence data, we projected a national estimate of HIV+ referrals. RESULTS: Fifty-five (95%) OPOs reported HIV+ referrals ranging from 0 to 276 and newly discovered HIV+ cases ranging from 0 to 10 annually. Six OPOs in areas of high HIV prevalence reported more than 100 HIV+ donor referrals. Twenty-seven (47%) OPOs provided exact HIV+ referral records and 28 (51%) OPOs provided exact records of discovered HIV+ cases, totaling 1450 HIV+ referrals and 39 discovered HIV+ donors in the prior year. These OPOs represented 67% and 59% of prevalent HIV cases in the USA; thus, we estimated 2164 HIV+ referrals and 66 discovered HIV+ cases nationally per year. CONCLUSIONS: OPOs reported a high volume of HIV+ referrals annually, of which a subset will be medically eligible for donation. Particularly in areas of high HIV prevalence, OPOs require ongoing support to implement the HOPE Act.


Subject(s)
Donor Selection , HIV Infections/virology , Organ Transplantation/standards , Referral and Consultation , Tissue Donors/statistics & numerical data , Tissue and Organ Procurement/organization & administration , Follow-Up Studies , HIV/isolation & purification , Humans , Prognosis , Tissue Donors/legislation & jurisprudence , Tissue and Organ Procurement/classification , Tissue and Organ Procurement/legislation & jurisprudence
14.
J Am Soc Nephrol ; 28(9): 2749-2755, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28450534

ABSTRACT

Studies have estimated the average risk of postdonation ESRD for living kidney donors in the United States, but personalized estimation on the basis of donor characteristics remains unavailable. We studied 133,824 living kidney donors from 1987 to 2015, as reported to the Organ Procurement and Transplantation Network, with ESRD ascertainment via Centers for Medicare and Medicaid Services linkage, using Cox regression with late entries. Black race (hazard ratio [HR], 2.96; 95% confidence interval [95% CI], 2.25 to 3.89; P<0.001) and male sex (HR, 1.88; 95% CI, 1.50 to 2.35; P<0.001) were associated with higher risk of ESRD in donors. Among nonblack donors, older age was associated with greater risk (HR per 10 years, 1.40; 95% CI, 1.23 to 1.59; P<0.001). Among black donors, older age was not significantly associated with risk (HR, 0.88; 95% CI, 0.72 to 1.09; P=0.3). Greater body mass index was associated with higher risk (HR per 5 kg/m2, 1.61; 95% CI, 1.29 to 2.00; P<0.001). Donors who had a first-degree biological relationship to the recipient had increased risk (HR, 1.70; 95% CI, 1.24 to 2.34; P<0.01). C-statistic of the model was 0.71. Predicted 20-year risk of ESRD for the median donor was only 34 cases per 10,000 donors, but 1% of donors had predicted risk exceeding 256 cases per 10,000 donors. Risk estimation is critical for appropriate informed consent and varies substantially across living kidney donors. Greater permissiveness may be warranted in older black candidate donors; young black candidates should be evaluated carefully.
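Under a proportional-hazards model, a candidate's relative hazard is the product of the hazard ratios for their characteristics. The sketch below combines the HRs reported in the abstract; it simplifies by ignoring reference categories, the race-age interaction, and baseline hazard, so it illustrates the arithmetic rather than reproducing the authors' calculator.

```python
# Combine reported hazard ratios multiplicatively, as a proportional-hazards
# model implies. HRs are from the abstract; the simplification of ignoring
# the race-age interaction is ours.

HR_BLACK = 2.96
HR_MALE = 1.88
HR_PER_5_BMI = 1.61          # per 5 kg/m^2 above the reference BMI
HR_FIRST_DEGREE = 1.70       # first-degree biological relative of recipient

def relative_hazard(black, male, bmi_steps_over_ref, first_degree):
    rh = 1.0
    if black:
        rh *= HR_BLACK
    if male:
        rh *= HR_MALE
    rh *= HR_PER_5_BMI ** bmi_steps_over_ref
    if first_degree:
        rh *= HR_FIRST_DEGREE
    return rh

# e.g., a Black man at the reference BMI donating to a first-degree relative:
print(round(relative_hazard(True, True, 0, True), 2))
```

Multiplying a relative hazard like this against the low median baseline (34 per 10,000 at 20 years) is what produces the wide spread the abstract describes, with 1% of donors exceeding 256 per 10,000.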


Subject(s)
Kidney Failure, Chronic/epidemiology , Kidney Transplantation , Living Donors/statistics & numerical data , Adult , Black or African American/statistics & numerical data , Age Factors , Body Mass Index , Female , Humans , Incidence , Kidney Failure, Chronic/ethnology , Male , Middle Aged , Risk Assessment , Risk Factors , Sex Factors , United States/epidemiology
15.
N Engl J Med ; 374(5): 411-21, 2016 Feb 04.
Article in English | MEDLINE | ID: mdl-26544982

ABSTRACT

BACKGROUND: Evaluation of candidates to serve as living kidney donors relies on screening for individual risk factors for end-stage renal disease (ESRD). To support an empirical approach to donor selection, we developed a tool that simultaneously incorporates multiple health characteristics to estimate a person's probable long-term risk of ESRD if that person does not donate a kidney. METHODS: We used risk associations from a meta-analysis of seven general population cohorts, calibrated to the population-level incidence of ESRD and mortality in the United States, to project the estimated long-term incidence of ESRD among persons who do not donate a kidney, according to 10 demographic and health characteristics. We then compared 15-year projections with the observed risk among 52,998 living kidney donors in the United States. RESULTS: A total of 4,933,314 participants from seven cohorts were followed for a median of 4 to 16 years. For a 40-year-old person with health characteristics that were similar to those of age-matched kidney donors, the 15-year projections of the risk of ESRD in the absence of donation varied according to race and sex; the risk was 0.24% among black men, 0.15% among black women, 0.06% among white men, and 0.04% among white women. Risk projections were higher in the presence of a lower estimated glomerular filtration rate, higher albuminuria, hypertension, current or former smoking, diabetes, and obesity. In the model-based lifetime projections, the risk of ESRD was highest among persons in the youngest age group, particularly among young blacks. The 15-year observed risks after donation among kidney donors in the United States were 3.5 to 5.3 times as high as the projected risks in the absence of donation. CONCLUSIONS: Multiple demographic and health characteristics may be used together to estimate the projected long-term risk of ESRD among living kidney-donor candidates and to inform acceptance criteria for kidney donors. 
(Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and others.).


Subject(s)
Kidney Failure, Chronic/epidemiology , Kidney Transplantation , Living Donors , Risk Assessment , Adult , Aged , Female , Glomerular Filtration Rate , Humans , Hypertension , Incidence , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/surgery , Male , Middle Aged , Models, Statistical , Risk Assessment/methods , Risk Factors , Sex Factors , United States/epidemiology
16.
Liver Transpl ; 21(8): 1031-9, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25990089

ABSTRACT

Concerns have been raised that optimized redistricting of liver allocation areas might have the unintended result of shifting livers from better-performing to poorer-performing organ procurement organizations (OPOs). We used liver simulated allocation modeling to simulate a 5-year period of liver sharing within either 4 or 8 optimized districts. We investigated whether each OPO's net liver import under redistricting would be correlated with 2 OPO performance metrics (observed to expected liver yield and liver donor conversion ratio), along with 2 other potential correlates (eligible deaths and incident listings above a Model for End-Stage Liver Disease score of 15). We found no evidence that livers would flow from better-performing OPOs to poorer-performing OPOs in either redistricting scenario. Instead, under these optimized redistricting plans, our simulations suggest that livers would flow from OPOs with more-than-expected eligible deaths toward those with fewer-than-expected eligible deaths and that livers would flow from OPOs with fewer-than-expected incident listings to those with more-than-expected incident listings; the latter is a pattern that is already established in the current allocation system. Redistricting liver distribution to reduce geographic inequity is expected to align liver allocation across the country with the distribution of supply and demand rather than transferring livers from better-performing OPOs to poorer-performing OPOs.


Subject(s)
Catchment Area, Health , Health Care Rationing , Health Services Needs and Demand , Healthcare Disparities , Liver Transplantation/methods , Process Assessment, Health Care , Tissue Donors/supply & distribution , Tissue and Organ Procurement/methods , Computer Simulation , Delivery of Health Care , Humans , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Models, Theoretical , Needs Assessment , Time Factors , Treatment Outcome , Waiting Lists
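The simulated-allocation question in the abstract above (does district-based sharing move livers toward or away from particular OPOs?) reduces to bookkeeping each OPO's net import, i.e., organs transplanted locally minus organs procured locally. A loose illustrative sketch, with fabricated OPO names and random donor/recipient assignments rather than the paper's liver simulated allocation model:

```python
# Toy sketch: tally each OPO's net liver import under shared allocation.
# OPO names, counts, and the uniform-random assignment are hypothetical
# placeholders, not the published simulation's logic or data.
import random

random.seed(0)
OPOS = ["OPO_A", "OPO_B", "OPO_C", "OPO_D"]

# Simulate 1,000 livers: each is procured by one OPO and transplanted
# at a center served by a (possibly different) OPO in the same district.
net = {o: 0 for o in OPOS}
for _ in range(1000):
    donor_opo = random.choice(OPOS)
    recipient_opo = random.choice(OPOS)
    net[donor_opo] -= 1      # export from the procuring OPO
    net[recipient_opo] += 1  # import to the transplanting OPO

print(net)  # net import per OPO; values sum to zero by construction
```

Because every export is matched by exactly one import, the net-import values always sum to zero, which is the invariant the real analysis exploits when correlating net flows against OPO performance metrics.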
18.
Transplantation ; 99(2): 360-6, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25594552

ABSTRACT

BACKGROUND: Most pediatric kidney transplant recipients eventually require retransplantation, and the most advantageous timing strategy regarding deceased and living donor transplantation in candidates with only 1 living donor remains unclear. METHODS: A patient-oriented Markov decision process model was designed to compare, for a given patient with 1 living donor, living-donor-first followed if necessary by deceased donor retransplantation versus deceased-donor-first followed if necessary by living donor (if still able to donate) or deceased donor (if not) retransplantation. Based on Scientific Registry of Transplant Recipients data, the model was designed to account for waitlist, graft, and patient survival, sensitization, increased risk of graft failure seen during late adolescence, and differential deceased donor waiting times based on pediatric priority allocation policies. Based on national cohort data, the model was also designed to account for aging or disease development, leading to ineligibility of the living donor over time. RESULTS: Given a set of candidate and living donor characteristics, the Markov model provides the expected patient survival over a time horizon of 20 years. For the most highly sensitized patients (panel reactive antibody > 80%), a deceased-donor-first strategy was advantageous, but for all other patients (panel reactive antibody < 80%), a living-donor-first strategy was recommended. CONCLUSIONS: This Markov model illustrates how patients, families, and providers can be provided information and predictions regarding the most advantageous use of deceased donor versus living donor transplantation for pediatric recipients.


Subject(s)
Decision Support Techniques , Donor Selection , Kidney Transplantation/methods , Living Donors/supply & distribution , Adolescent , Adult , Age Factors , Child , Computer Simulation , Eligibility Determination , Female , Graft Survival , HLA Antigens/immunology , Histocompatibility , Humans , Isoantibodies/blood , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Male , Markov Chains , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Registries , Reoperation , Risk Factors , Stochastic Processes , Time Factors , Treatment Outcome , United States , Waiting Lists , Young Adult
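The patient-oriented Markov model described above can be sketched as a small finite-horizon state machine: track the probability mass over health states year by year and accumulate expected years alive under each strategy. All states, transition probabilities, and the simplified structure below are hypothetical placeholders, not the published model's parameters:

```python
# Illustrative sketch of a finite-horizon Markov comparison of a
# "living-donor-first" vs. "deceased-donor-first" strategy. Transition
# probabilities are invented for illustration only.
HORIZON = 20  # years, matching the paper's stated time horizon

STATES = ["wait", "graft1", "graft2", "dead"]

# Hypothetical annual transition probabilities per strategy.
TRANSITIONS = {
    "living_first": {
        "wait":   {"graft1": 0.90, "dead": 0.02, "wait": 0.08},
        "graft1": {"graft1": 0.93, "graft2": 0.04, "dead": 0.01, "wait": 0.02},
        "graft2": {"graft2": 0.90, "dead": 0.03, "wait": 0.07},
        "dead":   {"dead": 1.0},
    },
    "deceased_first": {
        "wait":   {"graft1": 0.50, "dead": 0.03, "wait": 0.47},
        "graft1": {"graft1": 0.92, "graft2": 0.05, "dead": 0.01, "wait": 0.02},
        "graft2": {"graft2": 0.92, "dead": 0.02, "wait": 0.06},
        "dead":   {"dead": 1.0},
    },
}

def expected_survival(strategy: str) -> float:
    """Expected years alive over the horizon, starting on the waitlist."""
    dist = {s: 0.0 for s in STATES}
    dist["wait"] = 1.0
    years_alive = 0.0
    for _ in range(HORIZON):
        years_alive += 1.0 - dist["dead"]  # mass still alive this year
        nxt = {s: 0.0 for s in STATES}
        for s, mass in dist.items():
            for t, p in TRANSITIONS[strategy][s].items():
                nxt[t] += mass * p
        dist = nxt
    return years_alive

for strat in ("living_first", "deceased_first"):
    print(strat, round(expected_survival(strat), 2))
```

The published model layers sensitization, adolescent graft-failure risk, pediatric priority waiting times, and living-donor attrition onto this same chain structure; the sketch shows only the dynamic-programming skeleton.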
19.
Liver Transpl ; 21(3): 293-9, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25556648

ABSTRACT

Whether the liver allocation system shifts organs from better performing organ procurement organizations (OPOs) to poorer performing OPOs has been debated for many years. Models of OPO performance from the Scientific Registry of Transplant Recipients make it possible to study this question in a data-driven manner. We investigated whether each OPO's net liver import was correlated with 2 performance metrics [observed to expected (O:E) liver yield and liver donor conversion ratio] as well as 2 alternative explanations [eligible deaths and incident listings above a Model for End-Stage Liver Disease (MELD) score of 15]. We found no evidence to support the hypothesis that the allocation system transfers livers from better performing OPOs to poorer performing OPOs. Also, having fewer eligible deaths was not associated with a net import. However, having more incident listings was strongly correlated with the net import, both before and after Share 35. Most importantly, the magnitude of the variation in OPO performance was much lower than the variation in demand: although the poorest performing OPOs differed from the best ones by less than 2-fold in the O:E liver yield, incident listings above a MELD score of 15 varied nearly 14-fold. Although it is imperative that all OPOs achieve the best possible results, the flow of livers is not explained by OPO performance metrics, and instead, it appears to be strongly related to differences in demand.


Subject(s)
Catchment Area, Health , End Stage Liver Disease/surgery , Health Services Accessibility/organization & administration , Liver Transplantation/methods , Process Assessment, Health Care/organization & administration , Tissue Donors/supply & distribution , Tissue and Organ Procurement/organization & administration , Decision Support Techniques , End Stage Liver Disease/diagnosis , End Stage Liver Disease/mortality , Health Services Accessibility/standards , Health Services Needs and Demand/organization & administration , Healthcare Disparities , Humans , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Liver Transplantation/standards , Models, Organizational , Needs Assessment/organization & administration , Process Assessment, Health Care/standards , Quality Indicators, Health Care , Residence Characteristics , Severity of Illness Index , Tissue and Organ Procurement/standards , Treatment Outcome , United States , Waiting Lists
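The core analysis above is a set of pairwise correlations: each OPO's net liver import against a performance metric (e.g., observed-to-expected liver yield) and against a demand measure (incident listings above a MELD score of 15). A minimal sketch with fabricated per-OPO numbers, constructed so that demand tracks net import while performance does not:

```python
# Hypothetical correlation check in the spirit of the abstract above.
# All per-OPO values are invented for illustration, not study data.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Fabricated per-OPO data: net import, O:E yield, incident listings.
net_import = [120, -40, 60, -90, 30, -20, 80, -110]
oe_yield   = [1.02, 0.98, 1.05, 1.01, 0.97, 1.03, 0.99, 1.00]
listings   = [900, 300, 650, 180, 500, 380, 760, 150]

print("r(net import, O:E yield):", round(pearson_r(net_import, oe_yield), 2))
print("r(net import, listings): ", round(pearson_r(net_import, listings), 2))
```

In this contrived example the demand correlation is strong while the performance correlation is weak, mirroring the paper's qualitative finding that flows track demand, not OPO performance.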
20.
Liver Transpl ; 20(10): 1237-43, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24975028

ABSTRACT

Recent allocation policy changes have increased the sharing of deceased donor livers across local boundaries, and sharing even broader than this has been proposed as a remedy for persistent geographic disparities in liver transplantation. It is possible that broader sharing may increase cold ischemia times (CITs) and thus harm recipients. We constructed a detailed model of transport modes (car, helicopter, and fixed-wing aircraft) and transport times between all hospitals, and we investigated the relationship between the transport time and the CIT for deceased donor liver transplants. The median estimated transport time was 2.0 hours for regionally shared livers and 1.0 hour for locally allocated livers. The model-predicted transport mode was flying for 90% of regionally shared livers but for only 22% of locally allocated livers. The median CIT was 7.0 hours for regionally shared livers and 6.0 hours for locally allocated livers. Variation in the transport time accounted for only 14.7% of the variation in the CIT, and the transport time on average composed only 21% of the CIT. In conclusion, nontransport factors play a substantially larger role in the CIT than the transport time. Broader sharing will have only a marginal impact on the CIT but will significantly increase the fraction of transplants that are transported by flying rather than driving.


Subject(s)
Cold Ischemia/statistics & numerical data , Graft Survival , Kidney Transplantation , Liver Transplantation , Tissue Donors , Tissue and Organ Procurement/methods , Adult , Aged , Cadaver , Follow-Up Studies , Humans , Middle Aged , Retrospective Studies , Time Factors
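The "14.7% of the variation" figure above is the familiar R-squared of a simple one-predictor relationship: the squared Pearson correlation between transport time and total cold ischemia time (CIT). A minimal sketch, with invented transplant records rather than the study's data:

```python
# Toy sketch: what fraction of CIT variation does transport time explain?
# For a single predictor, R^2 is the squared Pearson correlation.
# Data points below are hypothetical, for illustration only.

def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

# Hypothetical transplants: (transport hours, total CIT hours).
transport = [0.5, 1.0, 1.5, 2.0, 2.5, 1.0, 3.0, 0.8]
cit       = [5.5, 7.0, 6.0, 8.5, 7.5, 9.0, 8.0, 6.5]

share = r_squared(transport, cit)
print(f"transport time explains {share:.1%} of CIT variation")
```

A low share, as the study found, means most CIT variation comes from nontransport factors (e.g., time at the donor and recipient hospitals), so broader sharing mainly changes transport mode rather than total ischemia time.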