1.
PLoS Comput Biol ; 15(4): e1006952, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30933973

ABSTRACT

The broadly neutralizing antibody (bnAb) VRC01 is being evaluated for its efficacy to prevent HIV-1 infection in the Antibody Mediated Prevention (AMP) trials. A secondary objective of AMP utilizes sieve analysis to investigate how VRC01 prevention efficacy (PE) varies with HIV-1 envelope (Env) amino acid (AA) sequence features. An exhaustive analysis that tests how PE depends on every AA feature with sufficient variation would have low statistical power. To design an adequately powered primary sieve analysis for AMP, we modeled VRC01 neutralization as a function of Env AA sequence features of 611 HIV-1 gp160 pseudoviruses from the CATNAP database, with objectives: (1) to develop models that best predict the neutralization readouts; and (2) to rank AA features by their predictive importance with classification and regression methods. The dataset was split in half, and machine learning algorithms were applied to each half, each analyzed separately using cross-validation and hold-out validation. We selected Super Learner, a nonparametric ensemble-based cross-validated learning method, for advancement to the primary sieve analysis. This method predicted the dichotomous resistance outcome of whether the IC50 neutralization titer of VRC01 for a given Env pseudovirus is right-censored (indicating resistance) with an average validated AUC of 0.868 across the two hold-out datasets. Quantitative log IC50 was predicted with an average validated R2 of 0.355. Features predicting neutralization sensitivity or resistance included 26 surface-accessible residues in the VRC01 and CD4 binding footprints, the length of gp120, the length of Env, the number of cysteines in gp120, the number of cysteines in Env, and 4 potential N-linked glycosylation sites; the top features will be advanced to the primary sieve analysis. This modeling framework may also inform the study of VRC01 in the treatment of HIV-infected persons.
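
For readers who want to reproduce this style of analysis, below is a minimal Python sketch of a Super Learner-style workflow: stack several base learners, train with internal cross-validation on one half of the data, and validate on the held-out half. The feature matrix, labels, and choice of base learners are placeholders; the published analysis used the Super Learner framework itself, which scikit-learn's stacking only approximates.

```python
# Minimal sketch of the workflow described above (not the authors' pipeline):
# stack base learners, train with internal cross-validation on one half of
# the data, and validate on the held-out half. X and y are random
# placeholders for Env AA features and the binary resistance outcome
# (1 = IC50 right-censored, i.e., resistant).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(611, 50))       # placeholder feature matrix
y = rng.integers(0, 2, size=611)     # placeholder resistance labels

X_train, X_hold, y_train, y_hold = train_test_split(
    X, y, test_size=0.5, random_state=0)

super_learner = StackingClassifier(
    estimators=[
        ("lasso", LogisticRegression(penalty="l1", solver="liblinear")),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    final_estimator=LogisticRegression(),
    cv=5,   # internal cross-validation for the meta-learner
)
super_learner.fit(X_train, y_train)
auc = roc_auc_score(y_hold, super_learner.predict_proba(X_hold)[:, 1])
print(f"hold-out AUC: {auc:.3f}")    # the paper reports ~0.868 on real data

# Ranking AA features could then use, e.g., the fitted random forest's
# feature_importances_ to prioritize features for the sieve analysis.
```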


Subject(s)
Antibodies, Monoclonal/pharmacology; HIV Envelope Protein gp160/genetics; HIV Envelope Protein gp160/immunology; Amino Acid Sequence; Antibodies, Monoclonal/genetics; Antibodies, Monoclonal/immunology; Antibodies, Neutralizing/immunology; Binding Sites; Broadly Neutralizing Antibodies; CD4 Antigens; Computer Simulation; Forecasting/methods; Glycosylation; HIV Antibodies/immunology; HIV Infections/virology; HIV-1/immunology; Humans; Protein Binding
2.
N Engl J Med ; 365(4): 318-26, 2011 Jul 28.
Article in English | MEDLINE | ID: mdl-21793744

ABSTRACT

BACKGROUND: More than 20,000 candidates for kidney transplantation in the United States are sensitized to HLA and may have a prolonged wait for a transplant, with a reduced transplantation rate and an increased rate of death. One solution is to perform live-donor renal transplantation after the depletion of donor-specific anti-HLA antibodies. Whether such antibody depletion results in a survival benefit as compared with waiting for an HLA-compatible kidney is unknown. METHODS: We used a protocol that included plasmapheresis and the administration of low-dose intravenous immune globulin to desensitize 211 HLA-sensitized patients who subsequently underwent renal transplantation (treatment group). We compared rates of death between the group undergoing desensitization treatment and two carefully matched control groups of patients on a waiting list for kidney transplantation who continued to undergo dialysis (dialysis-only group) or who underwent either dialysis or HLA-compatible transplantation (dialysis-or-transplantation group). RESULTS: In the treatment group, Kaplan-Meier estimates of patient survival were 90.6% at 1 year, 85.7% at 3 years, 80.6% at 5 years, and 80.6% at 8 years, as compared with rates of 91.1%, 67.2%, 51.5%, and 30.5%, respectively, for patients in the dialysis-only group and rates of 93.1%, 77.0%, 65.6%, and 49.1%, respectively, for patients in the dialysis-or-transplantation group (P<0.001 for both comparisons). CONCLUSIONS: Live-donor transplantation after desensitization provided a significant survival benefit for patients with HLA sensitization, as compared with waiting for a compatible organ. By 8 years, this survival advantage more than doubled. These data provide evidence that desensitization protocols may help overcome incompatibility barriers in live-donor renal transplantation. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and the Charles T. Bauer Foundation.).
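
The survival comparison above is a standard Kaplan-Meier analysis. Below is a minimal sketch with the lifelines library; the per-patient follow-up times, event indicators, and group labels are illustrative, not the study's data.

```python
# Minimal Kaplan-Meier sketch of the three-group comparison described above
# (treatment vs. dialysis-only vs. dialysis-or-transplantation).
# durations, events, and groups are assumed, illustrative inputs.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "years": [1.0, 3.2, 8.0, 5.1, 0.9, 7.5],    # follow-up time
    "died":  [0, 1, 0, 1, 1, 0],                # 1 = death observed
    "group": ["treatment", "dialysis_only", "treatment",
              "dialysis_only", "dialysis_or_tx", "dialysis_or_tx"],
})

kmf = KaplanMeierFitter()
for name, sub in df.groupby("group"):
    kmf.fit(sub["years"], event_observed=sub["died"], label=name)
    print(name, kmf.survival_function_at_times([1, 3, 5, 8]).tolist())

# Pairwise log-rank test, e.g. treatment vs. dialysis-only:
a = df[df["group"] == "treatment"]
b = df[df["group"] == "dialysis_only"]
result = logrank_test(a["years"], b["years"], a["died"], b["died"])
print("log-rank p =", result.p_value)
```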


Subject(s)
Desensitization, Immunologic/methods; Immunoglobulins, Intravenous/therapeutic use; Kidney Failure, Chronic/surgery; Kidney Transplantation/immunology; Plasmapheresis; Adult; Case-Control Studies; Female; Histocompatibility Testing; Humans; Immunosuppressive Agents/therapeutic use; Kaplan-Meier Estimate; Kidney Failure, Chronic/therapy; Kidney Transplantation/mortality; Living Donors; Male; Middle Aged; Mycophenolic Acid/analogs & derivatives; Mycophenolic Acid/therapeutic use; Plasmapheresis/adverse effects; Renal Dialysis; Tacrolimus/therapeutic use; Transplantation Conditioning/methods; Transplantation Immunology
3.
Cell Rep ; 43(5): 114174, 2024 May 28.
Article in English | MEDLINE | ID: mdl-38700982

ABSTRACT

Activating mutations in PIK3CA are frequently found in estrogen-receptor-positive (ER+) breast cancer, and the combination of the phosphatidylinositol 3-kinase (PI3K) inhibitor alpelisib with anti-ER inhibitors is approved for therapy. We have previously demonstrated that the PI3K pathway regulates ER activity through phosphorylation of the chromatin modifier KMT2D. Here, we discovered a methylation site on KMT2D, at K1330 directly adjacent to S1331, catalyzed by the lysine methyltransferase SMYD2. SMYD2 loss attenuates alpelisib-induced KMT2D chromatin binding and alpelisib-mediated changes in gene expression, including ER-dependent transcription. Knockdown or pharmacological inhibition of SMYD2 sensitizes breast cancer cells, patient-derived organoids, and tumors to PI3K/AKT inhibition and endocrine therapy in part through KMT2D K1330 methylation. Together, our findings uncover a regulatory crosstalk between post-translational modifications that fine-tunes KMT2D function at the chromatin. This provides a rationale for the use of SMYD2 inhibitors in combination with PI3Kα/AKT inhibitors in the treatment of ER+/PIK3CA mutant breast cancer.


Subject(s)
Breast Neoplasms; Chromatin; Histone-Lysine N-Methyltransferase; Humans; Histone-Lysine N-Methyltransferase/metabolism; Histone-Lysine N-Methyltransferase/genetics; Breast Neoplasms/genetics; Breast Neoplasms/drug therapy; Breast Neoplasms/metabolism; Breast Neoplasms/pathology; Female; Chromatin/metabolism; DNA-Binding Proteins/metabolism; DNA-Binding Proteins/genetics; Methylation/drug effects; Cell Line, Tumor; Animals; Mice; Proto-Oncogene Proteins c-akt/metabolism; Neoplasm Proteins/metabolism; Neoplasm Proteins/genetics; Receptors, Estrogen/metabolism; Gene Expression Regulation, Neoplastic/drug effects
4.
Elife ; 12, 2023 Jul 25.
Article in English | MEDLINE | ID: mdl-37489578

ABSTRACT

Integrin-mediated cell attachment rapidly induces tyrosine kinase signaling. Despite years of research, the role of this signaling in integrin activation and focal adhesion assembly is unclear. We provide evidence that the Src-family kinase (SFK) substrate Cas (Crk-associated substrate, p130Cas, BCAR1) is phosphorylated and associated with its Crk/CrkL effectors in clusters that are precursors of focal adhesions. The initial phospho-Cas clusters contain integrin β1 in its inactive, bent-closed conformation. Later, phospho-Cas and total Cas levels decrease as integrin β1 is activated and core focal adhesion proteins, including vinculin, talin, kindlin, and paxillin, are recruited. Cas is required for cell spreading and focal adhesion assembly in epithelial and fibroblast cells on collagen and fibronectin. Cas cluster formation requires Cas, Crk/CrkL, SFKs, and Rac1 but not vinculin. Rac1 provides positive feedback onto Cas through reactive oxygen species, opposed by negative feedback from the ubiquitin-proteasome system. The results suggest a two-step model for focal adhesion assembly in which clusters of phospho-Cas, effectors, and inactive integrin β1 grow through positive feedback prior to integrin activation and recruitment of core focal adhesion proteins.


Subject(s)
Focal Adhesions; Phosphoproteins; Phosphorylation; Focal Adhesions/metabolism; Phosphoproteins/metabolism; Integrin beta1/metabolism; Crk-Associated Substrate Protein/metabolism; Protein-Tyrosine Kinases/metabolism; Integrins/metabolism; Focal Adhesion Protein-Tyrosine Kinases/metabolism; Focal Adhesion Kinase 1/metabolism
5.
Elife ; 10, 2021 Jun 25.
Article in English | MEDLINE | ID: mdl-34169835

ABSTRACT

Integrin adhesion complexes regulate cytoskeletal dynamics during cell migration. Adhesion activates phosphorylation of integrin-associated signaling proteins, including Cas (p130Cas, BCAR1), by Src-family kinases. Cas regulates leading-edge protrusion and migration in cooperation with its binding partner, BCAR3. However, it has been unclear how Cas and BCAR3 cooperate. Here, using normal epithelial cells, we find that BCAR3 localization to integrin adhesions requires Cas. In return, Cas phosphorylation, as well as lamellipodia dynamics and cell migration, requires BCAR3. These functions require the BCAR3 SH2 domain and a specific phosphorylation site, Tyr 117, that is also required for BCAR3 downregulation by the ubiquitin-proteasome system. These findings place BCAR3 in a co-regulatory positive-feedback circuit with Cas, with BCAR3 requiring Cas for localization and Cas requiring BCAR3 for activation and downstream signaling. The use of a single phosphorylation site in BCAR3 for activation and degradation ensures reliable negative feedback by the ubiquitin-proteasome system.


Subject(s)
Adaptor Proteins, Signal Transducing/genetics; Crk-Associated Substrate Protein/genetics; Guanine Nucleotide Exchange Factors/genetics; Pseudopodia/metabolism; Signal Transduction; Adaptor Proteins, Signal Transducing/metabolism; Cell Adhesion; Cell Line; Crk-Associated Substrate Protein/metabolism; Epithelial Cells; Guanine Nucleotide Exchange Factors/metabolism; Humans; Integrins/metabolism; Phosphorylation; src Homology Domains
6.
J Am Soc Nephrol ; 20(1): 197-204, 2009 Jan.
Article in English | MEDLINE | ID: mdl-18776120

ABSTRACT

C4d deposition in peritubular capillaries is a specific marker for the presence of antidonor antibodies in renal transplant recipients and is usually associated with antibody-mediated rejection (AMR) in conventional allografts. In ABO-incompatible grafts, however, peritubular capillary C4d is often present on protocol biopsies lacking histologic features of AMR; the significance of C4d in this setting remains unclear. To address this, data from 33 patients who received ABO-incompatible renal allografts (after desensitization) were retrospectively reviewed. Protocol biopsies were performed at 1 and/or 3 and 6 months after transplantation in each recipient and at 12 months in 28 recipients. Twenty-one patients (group A) had strong, diffuse peritubular capillary C4d staining without histologic evidence of AMR or cellular rejection on their initial protocol biopsies. The remaining 12 patients (group B) had negative or weak, focal peritubular capillary C4d staining. Three grafts (two in group B) were lost, but not as a result of AMR. Excluding these three patients, serum creatinine levels were similar in the two groups at 6 and 12 months after transplantation and at last follow-up; however, recipients in group A developed significantly fewer overall chronic changes, as scored by the sum of Banff chronic indices, than group B during the first year after transplantation. These results suggest that diffuse peritubular capillary C4d deposition without rejection is associated with a lower risk for scarring in ABO-incompatible renal allografts; the generalizability of these results to conventional allografts remains unknown.


Subject(s)
ABO Blood-Group System/immunology; Blood Group Incompatibility/immunology; Cicatrix/prevention & control; Complement C4b/metabolism; Graft Rejection; Kidney Transplantation/immunology; Peptide Fragments/metabolism; Adult; Aged; Biopsy; Female; Humans; Kidney/pathology; Male; Middle Aged; Retrospective Studies; Transplantation, Homologous
7.
J Am Soc Nephrol ; 19(2): 349-55, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18094366

ABSTRACT

Current billing practices and mandates to report surgical outcomes are disincentives to surgical treatment of obese patients, who are at increased risk for longer hospital stays and higher complication rates. The objective of this study was to quantify the independent association between body mass index (BMI) and waiting time for kidney transplantation to identify potential provider bias against surgical treatment of the obese. A secondary data analysis was performed of a prospective cohort of 132,353 patients who were registered for kidney transplantation in the United States between 1995 and 2006. Among all patients awaiting kidney transplantation, the likelihood of receiving a transplant decreased with increasing degree of obesity, categorized by ranges of BMI (adjusted hazard ratios 0.96 for overweight, 0.93 for obese, 0.72 for severely obese, and 0.56 for morbidly obese, compared with a reference group of patients with normal BMI). Similarly, the likelihood of being bypassed when an organ became available increased in a graded manner with category of obesity (adjusted incidence rate ratio 1.02 for overweight, 1.05 for obese, 1.11 for severely obese, and 1.22 for morbidly obese). Although matching an available organ with an appropriate recipient requires clinical judgment, which could not be fully captured in this study, the observed differences are dramatic and warrant further studies to understand this effect better and to design a system that is less susceptible to unintended bias.
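
The adjusted hazard ratios above come from a Cox proportional hazards model of time to transplantation. Below is a minimal lifelines sketch on simulated data, with BMI-category dummies against a normal-BMI reference; the covariate set is an assumption, not the study's full adjustment.

```python
# Sketch: Cox proportional hazards model for time-to-transplant with
# BMI category as the exposure (normal BMI as reference), as in the
# adjusted hazard ratios quoted above. Data are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
bmi_cat = rng.integers(0, 5, size=n)   # 0=normal ... 4=morbidly obese
df = pd.DataFrame({
    "days_waiting": rng.exponential(900, size=n).round(),
    "transplanted": rng.integers(0, 2, size=n),
    "age": rng.normal(50, 12, size=n),
})
for k, name in enumerate(["overweight", "obese", "severe", "morbid"], start=1):
    df[f"bmi_{name}"] = (bmi_cat == k).astype(int)  # dummies vs. normal BMI

cph = CoxPHFitter()
cph.fit(df, duration_col="days_waiting", event_col="transplanted")
print(cph.hazard_ratios_)  # the study reports e.g. ~0.56 for morbidly obese
```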


Subject(s)
Health Services Accessibility/statistics & numerical data; Kidney Failure, Chronic/epidemiology; Kidney Failure, Chronic/surgery; Kidney Transplantation/statistics & numerical data; Obesity, Morbid/epidemiology; Adolescent; Adult; Body Mass Index; Cohort Studies; Female; Humans; Male; Middle Aged; Patient Selection; Registries; Regression Analysis; Time Factors; Waiting Lists
8.
J Am Soc Nephrol ; 19(10): 2011-9, 2008 Oct.
Article in English | MEDLINE | ID: mdl-18650478

ABSTRACT

Although the majority of deceased-donor kidneys are donated after brain death, increased recovery of kidneys donated after cardiac death could reduce the organ shortage and is now a national priority. Racial disparities in donations after brain death have been well described for renal transplantation, but it is unknown whether similar disparities occur in donations after cardiac death. In this study, outcomes of adult deceased-donor renal transplant recipients included in the United Network for Organ Sharing database (1993 through 2006) were analyzed. Among black recipients of kidneys obtained after cardiac death, those who received kidneys from black donors had better long-term graft and patient survival than those who received kidneys from white donors. In addition, compared with standard-criteria kidneys from white donors after brain death, kidneys from black donors after cardiac death conferred a 70% reduction in the risk for graft loss (adjusted hazard ratio 0.30; 95% confidence interval 0.14 to 0.65; P = 0.002) and a 59% reduction in risk for death (adjusted hazard ratio 0.41; 95% confidence interval 0.2 to 0.87; P = 0.02) among black recipients. These findings suggest that kidneys obtained from black donors after cardiac death may afford the best long-term survival for black recipients.


Subject(s)
Black People/statistics & numerical data; Kidney Diseases/surgery; Kidney Transplantation/ethnology; Tissue Donors; White People/statistics & numerical data; Adult; Cadaver; Cohort Studies; Female; Graft Survival; Humans; Kidney Diseases/ethnology; Kidney Diseases/mortality; Male; Middle Aged; Retrospective Studies; Risk Factors; Survival Rate; Treatment Outcome
9.
Ann Surg ; 248(5): 863-70, 2008 Nov.
Article in English | MEDLINE | ID: mdl-18948816

ABSTRACT

OBJECTIVE: To quantify the independent association between obesity and access to liver transplantation. BACKGROUND: Obesity is associated with higher complication rates, longer hospitalization, and worse survival after liver transplantation. Nevertheless, transplantation provides survival benefit to patients with end-stage liver disease, regardless of body mass index (BMI). We hypothesized that, despite survival benefit, providers were reluctant to transplant obese patients because of the inherent difficulty of these cases and their inferior outcomes. Our goal was to quantify the independent association between BMI and waiting time for orthotopic liver transplantation as a surrogate marker for this reluctance. METHODS: We studied 29,136 wait-list candidates in the model for end-stage liver disease (MELD) era, categorized as severely obese (BMI 35-40), morbidly obese (BMI 40-60), and reference (BMI 18.5-35). All models were adjusted for factors relevant to the allocation system, factors possibly influencing access to healthcare, and factors biologically related to disease progression and outcomes. RESULTS: The odds of receiving a MELD exception were 30% lower in severely obese and 38% lower in morbidly obese patients. Similarly, the likelihoods of being turned down for an organ were 10% and 16% higher, and the rates of being transplanted were 11% and 29% lower in severely obese and morbidly obese patients, respectively. CONCLUSIONS: Current practice seems to indicate a reluctance to transplant obese patients. If indeed as a community we feel that liver allografts should not be distributed to patients with excessive postoperative risk, we should consider expressing this as a formal change to our allocation policy rather than through informal practice patterns.
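
Odds figures like those above are typically estimated with adjusted logistic regression. Here is a minimal statsmodels sketch on simulated data; the model form and covariates are assumptions, not the paper's exact specification. Exponentiated coefficients give the odds ratios quoted above.

```python
# Sketch: logistic regression for the odds of receiving a MELD exception,
# with BMI-category dummies (reference: BMI 18.5-35). Data are simulated;
# exp(coef) gives odds ratios like the 30%/38% reductions quoted above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
bmi = rng.choice(["reference", "severe", "morbid"], size=n, p=[0.6, 0.25, 0.15])
df = pd.DataFrame({
    "severely_obese": (bmi == "severe").astype(int),
    "morbidly_obese": (bmi == "morbid").astype(int),
    "age": rng.normal(55, 10, size=n),
    "meld_exception": rng.integers(0, 2, size=n),
})

X = sm.add_constant(df[["severely_obese", "morbidly_obese", "age"]])
fit = sm.Logit(df["meld_exception"], X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios; the study reports ~0.70 and ~0.62
```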


Subject(s)
Liver Failure/epidemiology; Liver Transplantation/statistics & numerical data; Obesity/epidemiology; Tissue and Organ Procurement/organization & administration; Waiting Lists; Body Mass Index; Comorbidity; Diabetes Mellitus/epidemiology; Fatty Liver/epidemiology; Fatty Liver/surgery; Female; Health Services Accessibility; Hepatitis C/epidemiology; Hepatitis C/surgery; Humans; Liver Failure/surgery; Male; Middle Aged; Obesity, Morbid/epidemiology; Patient Selection; Regression Analysis; Resource Allocation/organization & administration; Tissue and Organ Procurement/statistics & numerical data
10.
Transplantation ; 85(7): 935-42, 2008 Apr 15.
Article in English | MEDLINE | ID: mdl-18408571

ABSTRACT

BACKGROUND: When the United Network for Organ Sharing changed its algorithm for liver allocation to the model for end-stage liver disease (MELD) system in 2002, highest priority shifted to patients with renal insufficiency as a major component of their end-stage liver disease. An unintended consequence of the new system was a rapid increase in the number of simultaneous liver-kidney transplants (SLK) performed yearly. METHODS: Adult recipients of deceased donor liver transplants (LT, n=19,137), kidney transplants (n=33,712), and SLK transplants (n=1,032) between 1987 and 2006 were evaluated based on United Network for Organ Sharing data. Recipients were stratified by donor subgroup, MELD score, pre- versus post-MELD era, and length of time on dialysis. Matched-control analyses were performed, and graft and patient survival were analyzed by Kaplan-Meier and Cox proportional hazards analyses. RESULTS: Outcomes in the MELD era demonstrate a decline in patient survival after SLK. Using matched-control analysis, we were unable to demonstrate a benefit in the SLK cohort compared with LT, despite the fact that higher-quality allografts are being used for SLK. Subgroup analysis of the SLK cohort did demonstrate an increase in overall 1-year patient and liver graft survival, but only in patients on long-term dialysis (≥3 months) compared with LT (84.5% vs. 70.8%, P=0.008; hazard ratio 0.57 [95% CI 0.34, 0.95], P=0.03). CONCLUSION: These findings suggest that SLK may be overused in the MELD era and that the current prioritization of kidney grafts to these liver failure patients results in the waste of limited resources.


Subject(s)
Kidney Transplantation; Liver Transplantation; Resource Allocation; Adolescent; Adult; Aged; Cohort Studies; Follow-Up Studies; Humans; Kidney Transplantation/mortality; Kidney Transplantation/trends; Liver Transplantation/mortality; Liver Transplantation/trends; Middle Aged; Patient Selection; Proportional Hazards Models; Renal Dialysis; Retrospective Studies; Survival Analysis; Survivors; Transplantation, Homologous; Treatment Outcome
11.
Hepatology ; 46(6): 1907-18, 2007 Dec.
Article in English | MEDLINE | ID: mdl-17918247

ABSTRACT

UNLABELLED: Elderly liver donors (ELDs) represent a possible expansion of the donor pool, although there is great reluctance to use ELDs because of reports that increasing donor age predicts graft loss and patient death. The goal of this study was to identify a subgroup of recipients who would be least affected by increased donor age and thus best suited to receive grafts from ELDs. A national registry of deceased donor liver transplants from 2002-2005 was analyzed. ELDs aged 70-92 (n = 1043) were compared with average liver donors (ALDs) aged 18-69 (n = 15,878) and ideal liver donors (ILDs) aged 18-39 (n = 6842). Recipient factors that modified the effect of donor age on outcomes were identified via interaction term analysis. Outcomes in recipient subgroups were compared using Kaplan-Meier survival analysis. Recipients preferred for ELD transplants were determined to be first-time recipients over the age of 45 with body mass index <35, non-status 1 registration, cold ischemic time <8 hours, and either hepatocellular carcinoma or an indication for transplantation other than hepatitis C. In preferred recipients, there were no differences in outcomes when ELD livers were used (3-year graft survival: ELD 75%, ALD 75%, ILD 77%, P > 0.1; 3-year patient survival: ELD 81%, ALD 80%, ILD 81%, P > 0.1). In contrast, there were significantly worse outcomes when ELD livers were used in nonpreferred recipients (3-year graft survival: ELD 50%, ALD 71%, ILD 75%, P < 0.001; 3-year patient survival: ELD 64%, ALD 77%, ILD 80%, P < 0.001). CONCLUSION: The risks of ELDs can be substantially minimized by appropriate recipient selection.
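
Effect modification of the kind used to define "preferred recipients" above is typically tested with interaction terms in the survival model. Below is a minimal lifelines sketch; the data and the single donor-age-by-recipient-age interaction are illustrative, not the registry model.

```python
# Sketch: interaction-term analysis for effect modification, as used to
# identify recipients least affected by donor age. Data are simulated;
# the donor_age_x_recip term tests whether the donor-age effect on graft
# loss differs by recipient subgroup.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 800
df = pd.DataFrame({
    "years_to_loss": rng.exponential(5, size=n),
    "graft_lost": rng.integers(0, 2, size=n),
    "donor_age": rng.uniform(18, 92, size=n),
    "recip_over_45": rng.integers(0, 2, size=n),
})
df["donor_age_x_recip"] = df["donor_age"] * df["recip_over_45"]

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_loss", event_col="graft_lost")
print(cph.summary[["coef", "p"]])  # a small interaction p-value flags effect modification
```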


Subject(s)
Liver Transplantation/statistics & numerical data; Patient Selection; Tissue Donors/statistics & numerical data; Adult; Age Factors; Aged; Aged, 80 and over; Cohort Studies; Female; Humans; Male; Middle Aged; Proportional Hazards Models; Prospective Studies; Risk Assessment/statistics & numerical data; Risk Reduction Behavior; United States/epidemiology
12.
Transplantation ; 82(12): 1683-8, 2006 Dec 27.
Article in English | MEDLINE | ID: mdl-17198260

ABSTRACT

BACKGROUND: Liver transplantation from donation after cardiac death (DCD) donors is an increasingly common approach for expansion of the donor organ supply. However, transplantation with DCD livers results in inferior graft survival. In this study, we examined donor and recipient characteristics that are associated with poor allograft outcomes and present a set of criteria that permit allograft survival comparable to that of donation after brain death (DBD) grafts in both low- and high-risk recipients. METHODS: The United Network for Organ Sharing/Organ Procurement and Transplantation Network Liver Transplantation Registry between January 1996 and March 2006 was investigated. Adult DCD liver transplants (n = 874) were included. RESULTS: A DCD risk index was developed using the statistically significant factors from a multivariate Cox model: history of previous transplantation, life support status at transplantation, donor age, donor warm ischemia time (DWIT), and cold ischemia time (CIT). Favorable DCD donor criteria were donor age ≤45 years, DWIT ≤15 min, and CIT ≤10 hr. Four risk groups with distinct graft survival were defined from the index scores. Graft survival in the favorable DCD group (84.9% at 1 year, 75.2% at 3 years, and 69.4% at 5 years) was comparable to that for DBD liver transplantation irrespective of recipient condition. Increasing donor age was more highly predictive of poor outcomes in DCD compared with DBD, especially in recipients in poor preoperative condition. CONCLUSIONS: DCD livers from young donors with short DWIT and CIT should be given greater consideration in order to expand the number of available donor organs.
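
A risk index of this kind is typically the linear predictor of the fitted Cox model, binned into groups. Below is a minimal sketch; the weights and cutpoints are placeholders, not the published index.

```python
# Sketch: building a DCD risk index as a weighted sum of Cox model
# coefficients, then binning into risk groups. All weights and cutpoints
# here are placeholders, not the published values.
import numpy as np

COEFS = {  # assumed log-hazard weights per risk factor
    "prior_transplant": 0.6,
    "on_life_support": 0.4,
    "donor_age_over_45": 0.5,
    "dwit_over_15min": 0.7,
    "cit_over_10hr": 0.3,
}

def dcd_risk_index(case: dict) -> float:
    """Sum the applicable log-hazard weights for one donor/recipient pair."""
    return sum(w for k, w in COEFS.items() if case.get(k))

def risk_group(score: float) -> str:
    """Bin the index into four groups (cutpoints are illustrative)."""
    cuts = [0.3, 0.8, 1.3]
    return ["favorable", "low", "moderate", "high"][np.searchsorted(cuts, score)]

case = {"donor_age_over_45": False, "dwit_over_15min": False, "cit_over_10hr": False}
print(risk_group(dcd_risk_index(case)))  # "favorable": young donor, short DWIT/CIT
```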


Subject(s)
Death, Sudden, Cardiac; Graft Survival; Liver Transplantation; Tissue Donors; Tissue and Organ Procurement/methods; Adult; Aged; Cadaver; Female; Humans; Male; Middle Aged; Risk Factors
13.
J Am Coll Surg ; 203(6): 827-30, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17116550

ABSTRACT

BACKGROUND: A low prevalence of high-level clinical studies in the surgical literature has been reported previously. We reviewed a recent sample of surgical publications to assess the current status of clinical research. STUDY DESIGN: A 3-month sample of journal articles in Archives of Surgery, Surgery, and Annals of Surgery in 2005 was evaluated by two independent reviewers to determine the distribution of articles in established evidence classes. RESULTS: A total of 133 publications were identified in the three journals during the time periods reviewed, including 101 clinical articles and 30 basic science articles. Among the clinical papers, there were 8 class I studies (7.9%), 34 class II studies (33.7%), and more than half were class III studies (59 of 101, or 58.4%). CONCLUSIONS: The low prevalence of high-level evidence to guide surgical management of patients persists in major general surgery journals. We believe that education about proper research methodology is not only important for researchers, but is also important for practicing surgeons, and can have important health policy implications as well.


Subject(s)
Bibliometrics; Biomedical Research/standards; General Surgery; Publishing/statistics & numerical data; Biomedical Research/statistics & numerical data; Humans; Periodicals as Topic; Publishing/classification; Publishing/standards
14.
Article in English | MEDLINE | ID: mdl-15975035

ABSTRACT

Antibody-mediated barriers to renal transplantation, including donor-specific anti-HLA and anti-blood-group antibodies, have become an increasingly important issue over the last forty years as the organ shortage has continued to grow. The inevitable result of the unmet demand for compatible organs has been a continuous increase in recipient waiting times. Over the last decade, two treatment strategies have been developed to address this problem. These regimens rely on the immunomodulatory properties of intravenous immunoglobulin (IVIG), administered either alone at relatively high doses or at lower doses in combination with non-selective antibody depletion by plasmapheresis. Both protocols have been used successfully to desensitize patients with donor-specific anti-HLA antibody and have allowed renal transplantation with excellent outcomes. The combined plasmapheresis/IVIG strategy has also been employed successfully for renal transplantation in recipients of ABO blood group-incompatible kidneys. This review provides an overview of these therapies and their application to incompatible renal transplantation.


Subject(s)
ABO Blood-Group System/immunology; Blood Group Incompatibility/therapy; Histocompatibility Testing; Immunoglobulins, Intravenous/therapeutic use; Kidney Transplantation; Plasmapheresis; Humans
15.
JAMA ; 294(13): 1655-63, 2005 Oct 05.
Article in English | MEDLINE | ID: mdl-16204665

ABSTRACT

CONTEXT: First proposed 2 decades ago, live kidney paired donation (KPD) was considered a promising new approach to addressing the shortage of organs for transplantation. Ethical, administrative, and logistical barriers initially proved formidable and prevented the implementation of KPD programs in the United States. OBJECTIVE: To determine the feasibility and effectiveness of KPD for the management of patients with incompatible donors. DESIGN, SETTING, AND PATIENTS: Prospective series of paired donations matched and transplanted from a pool of blood type or crossmatch incompatible donors and recipients with end-stage renal disease (6 conventional and 4 unconventional KPD transplants) at a US tertiary referral center (between June 2001 and November 2004) with expertise in performing transplants in patients with high immunologic risk. INTERVENTION: Kidney paired donation and live donor renal transplantation. MAIN OUTCOME MEASURES: Patient survival, graft survival, serum creatinine levels, rejection episodes. RESULTS: A total of 22 patients received transplants through 10 paired donations including 2 triple exchanges at Johns Hopkins Hospital. At a median follow-up of 13 months (range, 1-42 months), the patient survival rate was 100% and the graft survival rate was 95.5%. Twenty-one of the 22 patients have functioning grafts with a median 6-month serum creatinine level of 1.2 mg/dL (range, 0.8-1.8 mg/dL) (106.1 micromol/L [range, 70.7-159.1 micromol/L]). There were no instances of antibody-mediated rejection despite the inclusion of 5 patients who were highly sensitized to HLA antigens due to previous exposure to foreign tissue. Four patients developed acute cellular rejection (18%). CONCLUSIONS: This series of patients who received transplants from a single-center KPD pool provides evidence that recipients with incompatible live donors, even those with rare blood type combinations or high degrees of HLA antigen sensitization, can receive transplants through KPD with graft survival rates that appear to be equivalent to directed, compatible live donor transplants. If these results can be generalized, broader availability of KPD to the estimated 6000 patients with incompatible donors could result in a large expansion of the donor pool.
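
At its core, kidney paired donation is a matching problem on a directed compatibility graph. The sketch below uses networkx to enumerate two-way swaps and triple exchanges like those reported above; the pool is illustrative, not the Hopkins matching system.

```python
# Sketch: kidney paired donation as cycle-finding in a directed
# compatibility graph. An edge (i, j) means pair i's donor is compatible
# with pair j's recipient; 2-cycles are conventional swaps and 3-cycles
# are triple exchanges like the two reported above. Pool is illustrative.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("P1", "P2"), ("P2", "P1"),                  # a two-way swap
    ("P3", "P4"), ("P4", "P5"), ("P5", "P3"),    # a triple exchange
])

exchanges = [c for c in nx.simple_cycles(G) if len(c) <= 3]
print(exchanges)  # e.g. [['P1', 'P2'], ['P3', 'P4', 'P5']]
```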


Subject(s)
Kidney Transplantation; Living Donors; Tissue and Organ Procurement; Transplantation Immunology; Adolescent; Adult; Aged; Female; Graft Survival; Humans; Male; Middle Aged; Survival Analysis
16.
Transplantation ; 98(1): 54-65, 2014 Jul 15.
Article in English | MEDLINE | ID: mdl-24978035

ABSTRACT

BACKGROUND: Descriptions of the sequelae of ABO-incompatible (ABOi) kidney transplantation are limited to single-center reports, which may lack power to detect important effects. METHODS: We examined U.S. Renal Data System registry data to study associations of ABOi live-donor kidney transplantation with clinical complications in a national cohort. Among 14,041 Medicare-insured transplants in 2000 to 2007, 119 non-donor-A2 ABOi transplants were identified. A2-incompatible (n=35) transplants were categorized separately. Infection and hemorrhage events were identified by diagnosis codes on billing claims. Associations of ABO incompatibility with complications were assessed by multivariate Cox regression. RESULTS: Recipients of ABOi transplants experienced significantly (P<0.05) higher incidence of wound infections (12.7% vs. 7.3%), pneumonia (7.6% vs. 3.8%), and urinary tract infections (UTIs) or pyelonephritis (24.5% vs. 15.3%) in the first 90 days compared with ABO-compatible recipients. In adjusted models, ABO incompatibility was associated with twice the risk of pneumonia (adjusted hazard ratio [aHR], 2.22; 95% confidence interval [CI], 1.14-4.33) and 56% higher risk of UTIs or pyelonephritis (aHR, 1.56; 95% CI, 1.05-2.30) in the first 90 posttransplantation days, and 3.5 times the relative risk of wound infections in days 91 to 365 (aHR, 3.55; 95% CI, 1.92-6.57). ABOi recipients, 19% of whom underwent pre- or peritransplant splenectomy, experienced twice the adjusted risk of early hemorrhage (aHR, 1.96; 95% CI, 1.19-3.24). A2-incompatible transplantation was associated only with early risk of UTIs or pyelonephritis. CONCLUSION: ABOi transplantation offers patients with potential live donors an additional transplant option but with higher risks of infectious and hemorrhagic complications. Awareness of these complications may help improve protocols for the management of ABOi transplantation.


Subject(s)
ABO Blood-Group System/immunology; Blood Group Incompatibility/immunology; Histocompatibility; Kidney Transplantation/adverse effects; Living Donors; Medicare; Postoperative Hemorrhage/etiology; Pyelonephritis/etiology; Urinary Tract Infections/etiology; Adult; Female; Humans; Kaplan-Meier Estimate; Male; Middle Aged; Multivariate Analysis; Proportional Hazards Models; Registries; Risk Factors; Time Factors; Treatment Outcome; United States
19.
Transplantation ; 91(7): 765-71, 2011 Apr 15.
Article in English | MEDLINE | ID: mdl-21285917

ABSTRACT

BACKGROUND: Kidney transplants from pediatric donors after cardiac death (PDCD) have quadrupled in the past 9 years, but little data exist on outcomes using these donors. We hypothesized that pediatric organs might be more sensitive to the pathophysiology of cardiac death. METHODS: We evaluated outcomes and rates of discard of more than 12,000 pediatric kidneys recovered between 2000 and 2009. We compared short- and long-term graft function among adult and pediatric recipients of PDCD kidneys compared with recipients of pediatric kidneys from donors after brain death (PDBD). RESULTS: Overall, 6.3% of pediatric kidneys recovered were PDCD and 93.7% were PDBD. Discard rates were higher for PDCD kidneys (adjusted odds ratio=1.69, 95% confidence interval [CI]=1.31-2.18, P<0.001). Delayed graft function (DGF) was twice as common in recipients of PDCD grafts compared with PDBD (26.2% vs. 13.0%, P<0.001); however, among pediatric recipients, DGF rates were half of those observed in adults, and a statistically significant difference in DGF could not be detected between PDBD and PDCD grafts (6.9% vs. 4.9%, P=0.6). Among all recipients, PDCD kidneys had a greater risk of graft loss compared with PDBD kidneys (adjusted hazard ratio=1.32, 95% CI=1.06-1.65, P=0.01), although among pediatric recipients this increased risk was not statistically significant (adjusted hazard ratio=2.01, 95% CI=0.89-4.54, P=0.1). CONCLUSIONS: The differences in outcomes between adult recipients of PDCD and PDBD kidneys, and the attenuation of these differences among pediatric recipients, should be weighed against risks of prolonged waitlist time in recipients being considered for these grafts.


Subject(s)
Death; Kidney Transplantation; Tissue Donors; Adolescent; Adult; Child; Female; Graft Survival; Humans; Infant; Kidney Transplantation/mortality; Male; Treatment Outcome
20.
Arch Surg ; 146(11): 1261-6, 2011 Nov.
Article in English | MEDLINE | ID: mdl-22106317

ABSTRACT

HYPOTHESIS: The use of kidneys from deceased donors considered at increased infectious risk represents a strategy to increase the donor pool. DESIGN: Single-institution longitudinal observational study. SETTING: Tertiary care center. PATIENTS: Fifty patients who gave special informed consent to receive Centers for Disease Control and Prevention high-risk (CDCHR) donor kidneys were followed up by serial testing for viral transmission after transplantation. Nucleic acid testing for human immunodeficiency virus, hepatitis B virus, and hepatitis C virus was performed on all high-risk donors before transplantation. Outcomes of CDCHR kidney recipients were compared with outcomes of non-high-risk (non-HR) kidney recipients. MAIN OUTCOME MEASURES: New viral transmission, graft function, and waiting list time. RESULTS: No recipient seroconversion was detected during a median follow-up period of 11.3 months. Compared with non-HR donors, CDCHR donors were younger (mean [SD] age, 35 [11] vs 43 [18] years, P = .01), fewer were expanded criteria donors (2.0% vs 24.8%, P < .001), and fewer had a terminal creatinine level exceeding 2.5 mg/dL (4.0% vs 8.8%, P = .002). The median creatinine levels at 1 year after transplantation were 1.4 (interquartile range, 1.2-1.7) mg/dL for CDCHR recipients and 1.4 (interquartile range, 1.1-1.9) mg/dL for non-HR recipients (P = .4). Willingness to accept a CDCHR kidney significantly shortened the median waiting list time (274 vs 736 days, P < .001). CONCLUSIONS: We show safe use of CDCHR donor kidneys and good 1-year graft function. With continued use of these organs and careful follow-up care, we will be better able to gauge donor risk and match it to recipient need to expand the donor pool and optimize patient benefit.


Subject(s)
Centers for Disease Control and Prevention, U.S.; DNA, Viral/analysis; Kidney Failure, Chronic/surgery; Kidney Transplantation; Tissue Donors; Virus Diseases/diagnosis; Adult; Baltimore/epidemiology; Female; Follow-Up Studies; Humans; Incidence; Male; Middle Aged; Prospective Studies; Risk Factors; United States; Virus Diseases/epidemiology; Virus Diseases/transmission; Waiting Lists