ABSTRACT
Comprehensive genomic sequencing is becoming a critical component in the assessment of hematologic malignancies, with broad implications for patients' management. In this context, unequivocally discriminating somatic from germline events is challenging but greatly facilitated by matched analysis of tumor:normal pairs of samples. In contrast to solid tumors, in hematologic malignancies conventional sources of normal control material (peripheral blood, buccal swabs, saliva) could be highly involved by the neoplastic process, rendering them unsuitable. In this work we describe our real-world experience using cell-free DNA (cfDNA) isolated from nail clippings as an alternate source of normal control material, through the dedicated review of 2,610 tumor:nail pairs comprehensively sequenced by MSK-IMPACT-heme. Overall, we found that nail cfDNA is a robust germline control for paired genomic studies. In a subset of patients, nail DNA may be contaminated by tumor DNA, reflecting unique attributes of the hematologic disease and transplant history. Contamination is generally low level, but significantly more common among patients with myeloid neoplasms (20.5%; 304/1,482) than among those with lymphoid diseases (5.4%; 61/1,128) and particularly enriched in myeloproliferative neoplasms with marked myelofibrosis. When identified in patients with lymphoid and plasma-cell neoplasms, mutations commonly reflected a myeloid profile and correlated with a concurrent/evolving clonal myeloid neoplasm. Donor DNA was identified in 22% (11/50) of nails collected after allogeneic stem-cell transplantation. In this cohort, an association with a recent history of graft-versus-host disease was identified. These findings should be considered as a potential limitation to the use of nails as a source of normal control DNA but could also provide important diagnostic information regarding the disease process.
Subject(s)
Cell-Free Nucleic Acids , Hematologic Neoplasms , Nails , Humans , Hematologic Neoplasms/genetics , Hematologic Neoplasms/diagnosis , Nails/metabolism , Nails/pathology , Nails/chemistry , Male , Female , Cell-Free Nucleic Acids/genetics , Middle Aged , Adult , Aged , Genomics/methods , High-Throughput Nucleotide Sequencing , Mutation , Young Adult , Aged, 80 and over , Adolescent
ABSTRACT
Membranous nephropathy (MN) is one of the most common de novo glomerular diseases developing in patients after allogeneic hematopoietic stem cell transplantation (HSCT). Most authors have used immunosuppression for its treatment to target the underlying immune-mediated processes, akin to graft-versus-host disease, but the optimal management is currently unclear. Limited reports in the literature described the use of a conservative approach with success, particularly in cases with lower risks of progression, such as non-nephrotic-range proteinuria or early reduction of proteinuria by 6 months. We report two cases of post-HSCT MN with moderate risk features, namely prolonged durations of nephrotic-range proteinuria, that spontaneously resolved with conservative treatment. Patient 1 was of advanced age and in an immunocompromised state, while patient 2 was in need of a greater graft-versus-disease effect from the donor's immune system, which necessitated a balance between the risk of immunosuppression and the risk of progressive kidney function loss. These cases demonstrated that conservative treatment can be a reasonable approach in selected patients with post-HSCT MN, including those with moderate risk.
ABSTRACT
Comorbidity assessment before allogeneic haematopoietic cell transplantation (allo-HCT) is essential for estimating non-relapse mortality (NRM) risk. We previously developed the Simplified Comorbidity Index (SCI), which captures a small number of 'high-yield' comorbidities and older age. The SCI was predictive of NRM in myeloablative CD34-selected allo-HCT. Here, we evaluated the SCI in a single-centre cohort of 327 patients receiving reduced-intensity conditioning followed by unmanipulated allografts from HLA-matched donors. Among the SCI factors, age above 60, mild renal impairment, moderate pulmonary disease and cardiac disease were most frequent. SCI scores ranged from 0 to 8, with 39%, 20%, 20% and 21% having scores of 0-1, 2, 3 and ≥4 respectively. Corresponding cumulative incidences of 3-year NRM were 11%, 16%, 22% and 27%; p = 0.03. In multivariable models, higher SCI scores were associated with incremental risks of all-cause mortality and NRM. The SCI had an area under the receiver operating characteristic curve of 65.9%, 64.1% and 62.9% for predicting 1-, 2- and 3-year NRM versus 58.4%, 60.4% and 59.3% with the haematopoietic cell transplantation comorbidity index. These results demonstrate for the first time that the SCI is predictive of NRM in patients receiving allo-HCT from HLA-matched donors after reduced-intensity conditioning.
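The discrimination estimates above are receiver operating characteristic (ROC) areas for predicting NRM from a single comorbidity score. The sketch below (Python, illustrative only) computes a rank-based (Mann-Whitney) AUC for a binary 1-year NRM outcome; the published analysis may have used time-dependent ROC methods that account for censoring, and the scores and outcomes shown are invented toy data, not study values.

```python
# Minimal sketch: rank-based (Mann-Whitney) AUC of a comorbidity score for a
# binary 1-year NRM outcome. Scores and outcomes below are invented toy data,
# not values from the study.

def auc_from_score(scores, events):
    """AUC = probability that a randomly chosen patient with NRM has a higher
    score than a randomly chosen patient without NRM (ties count as 0.5)."""
    pos = [s for s, e in zip(scores, events) if e == 1]
    neg = [s for s, e in zip(scores, events) if e == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy cohort: SCI scores 0-8 and whether 1-year NRM occurred (1) or not (0).
sci_scores = [0, 1, 2, 2, 3, 3, 4, 5, 6, 8]
nrm_events = [0, 0, 0, 1, 0, 1, 0, 1, 1, 1]

print(f"AUC = {auc_from_score(sci_scores, nrm_events):.3f}")
```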
Subject(s)
Hematopoietic Stem Cell Transplantation , Tissue Donors , Humans , Comorbidity , Hematopoietic Stem Cell Transplantation/methods , Recurrence , Retrospective Studies , Transplantation Conditioning/methods , Transplantation, Homologous/methods , Mortality
ABSTRACT
Allogeneic hematopoietic cell transplantation (allo-HCT) is a potentially curative treatment for patients with acute leukemia. Despite this, studies have shown that only a minority of patients ultimately proceed to allo-HCT. The primary objectives of this prospective, observational study were to determine the rate of allo-HCT among patients for whom it was recommended and to identify the reasons why patients deemed appropriate and eligible for HCT did not ultimately undergo transplantation. Between April 2016 and April 2021, adult patients with newly diagnosed or relapsed/refractory acute leukemia were enrolled at the time of induction/reinduction therapy. Initial transplantation workup and allo-HCT recommendations were made during the early phase of induction/reinduction. Of the 307 enrolled patients, allo-HCT was recommended to 85% (n = 259), of whom 66% (n = 170) underwent transplantation. Donor sources comprised 54% human leukocyte antigen (HLA)-matched unrelated donors and 20% HLA-matched sibling donors; HLA-mismatched graft sources included 15% umbilical cord blood units, 8% HLA-mismatched unrelated donors, and 4% HLA-haploidentical donors. Among the 89 patients for whom allo-HCT was recommended but not performed, the most common reason was persistent/relapsed disease (70%), followed by early patient death (10%). In this prospective study, we report a high allo-HCT rate, which may be due to early transplant referral and workup. The main barrier to allo-HCT was disease control, followed by early patient death. With the increasing availability of HLA-mismatched graft sources, lack of donor availability was not a transplant barrier. Further development of novel transplant strategies for patients not achieving remission and improvements in induction regimens could increase allo-HCT utilization.
Subject(s)
Graft vs Host Disease , Hematopoietic Stem Cell Transplantation , Leukemia, Myeloid, Acute , Adult , Humans , Prospective Studies , Hematopoietic Stem Cell Transplantation/adverse effects , Unrelated Donors , Transplantation, Homologous , Leukemia, Myeloid, Acute/therapy , Leukemia, Myeloid, Acute/etiology , Acute Disease , HLA Antigens , Graft vs Host Disease/etiology , Retrospective Studies
ABSTRACT
BACKGROUND: We examined the correlation between persistent human herpesvirus 6 (HHV-6) DNAemia (p-HHV-6) and absolute lymphocyte count (ALC), platelet count (PLT), and all-cause mortality by 1 year after ex vivo T-cell-depleted (TCD) hematopoietic cell transplant (HCT). METHODS: We analyzed a cohort of adult TCD HCT recipients during 2012-2016 prospectively monitored for plasma HHV-6 by quantitative polymerase chain reaction from day +14 post-HCT through day +100 (D+100). p-HHV-6 was defined as ≥2 consecutive values of ≥500 copies/mL by D+100. PLT and ALC were compared between patients with and without p-HHV-6 using generalized estimating equations (GEE). Multivariable Cox proportional hazard models (PH) were used to identify the impact of p-HHV-6 on 1 year mortality. RESULTS: Of 312 patients, 83 (27%) had p-HHV-6 by D+100. p-HHV-6 was associated with lower ALC and PLT in the first year post-HCT. In multivariable models, p-HHV-6 was associated with higher mortality by 1 year post-HCT (adjusted hazard ratio, 2.97 [95% confidence interval, 1.62-5.47]; P = .0005), after adjusting for age, antiviral treatment, and ALC at D+100. CONCLUSIONS: p-HHV-6 was associated with lower ALC and PLT in the first year post-HCT. p-HHV-6 was an independent predictor of mortality in the first year after TCD HCT.
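As a reading aid for the persistence definition above (≥2 consecutive plasma values ≥500 copies/mL by D+100), here is a minimal Python sketch; the data layout of (day, copies/mL) pairs and the function name are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch of the persistence rule described above: p-HHV-6 is flagged
# when at least two consecutive plasma HHV-6 values on or before day +100 are
# >= 500 copies/mL.

def is_persistent_hhv6(viral_loads, threshold=500, window_end=100):
    consecutive = 0
    for day, copies in sorted(viral_loads):
        if day > window_end:
            break
        if copies >= threshold:
            consecutive += 1
            if consecutive >= 2:          # two consecutive qualifying values
                return True
        else:
            consecutive = 0               # a sub-threshold value resets the run
    return False

# Example: weekly monitoring from day +14 onward.
patient = [(14, 0), (21, 650), (28, 1200), (35, 300), (42, 0)]
print(is_persistent_hhv6(patient))  # True: days +21 and +28 are consecutive >= 500
```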
Subject(s)
Hematopoietic Stem Cell Transplantation , Herpesvirus 6, Human , Roseolovirus Infections , Adult , DNA, Viral , Hematopoietic Stem Cell Transplantation/adverse effects , Herpesvirus 6, Human/genetics , Humans , Proportional Hazards Models , T-Lymphocytes , Transplantation, Homologous/adverse effects
ABSTRACT
BACKGROUND: Cytomegalovirus (CMV)-seropositive (R+) hematopoietic cell transplant (HCT) recipients have a survival disparity compared with CMV-seronegative recipient/donor (R-D-) pairs. We hypothesized that primary letermovir prophylaxis (LET) may abrogate this disparity. We investigated the relationship between LET and mortality at 1 year post-HCT. METHODS: In this retrospective cohort study, we included adult R-D- or R+ patients who received HCT in the pre-LET era (1 January 2013 through 15 December 2017) or the post-LET era (16 December 2017 through December 2019). R+ patients were categorized by LET receipt as R+/LET or R+/no-LET. Cox proportional hazards models were used to estimate the association of LET with all-cause mortality at 1 year after transplantation. RESULTS: Of 848 patients analyzed, 305 were R-D-, 364 R+/no-LET, and 160 R+/LET. Because of similar mortality between pre-LET/R-D- and post-LET/R-D- (adjusted hazard ratio [aHR], 1.29 [95% confidence interval {CI}, .76-2.18]; P = .353), R-D- patients were combined into 1 group. Compared with R-D-, the aHR for mortality was 1.40 (95% CI, 1.01-1.93) for R+/no-LET and 0.89 (95% CI, .57-1.41) for R+/LET. Among R+ patients, LET was associated with a decreased risk of death (aHR, 0.62 [95% CI, .40-.98]); when conventional HCT and T-cell-depleted HCT were analyzed separately, the aHR was 0.86 (95% CI, .51-1.43) and 0.21 (95% CI, .07-.65), respectively. CONCLUSIONS: At 1 year post-HCT, LET was associated with closing the mortality disparity between R-D- and R+. Among all R+ patients, LET was associated with decreased mortality, driven by a 79% reduction in the incidence of death in T-cell-depleted HCT.
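For readers unfamiliar with the modelling step, the sketch below shows how adjusted hazard ratios for 1-year mortality across the three exposure groups (R-D-, R+/no-LET, R+/LET) could be estimated with a Cox proportional hazards model using the lifelines library; the DataFrame columns and toy rows are assumptions for illustration and omit the covariates used in the actual adjusted models.

```python
# Minimal sketch (not the study's code) of a Cox model comparing mortality
# across exposure groups, with R-D- as the reference category.

import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_days":    [365, 120, 365, 210, 365, 300, 90, 365],  # follow-up, capped at 1 year
    "died":         [0,   1,   0,   0,   1,   0,   1,  1],    # all-cause mortality event
    "r_pos_no_let": [0,   1,   0,   1,   0,   0,   1,  0],    # R+ without letermovir
    "r_pos_let":    [0,   0,   1,   0,   1,   1,   0,  0],    # R+ with letermovir
    # reference group (both indicators 0) = R-D-
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
cph.print_summary()   # the exp(coef) column gives the hazard ratio versus R-D-
```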
Subject(s)
Cytomegalovirus Infections , Hematopoietic Stem Cell Transplantation , Acetates , Adult , Antiviral Agents/therapeutic use , Cytomegalovirus , Cytomegalovirus Infections/epidemiology , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Quinazolines , Retrospective Studies
ABSTRACT
BACKGROUND: We investigated the association between the time-averaged area under the curve (AAUC) of cytomegalovirus (CMV) viral load (VL) by day 100 and overall survival (OS) at 1 year after hematopoietic cell transplantation (HCT). METHODS: In a retrospective cohort study including patients who received HCT at Memorial Sloan Kettering Cancer Center between June 2010 and December 2017, AAUC was calculated for patients with detected VL. Patients were categorized into non-controllers (Q4) and controllers (Q1-Q3) using the highest AAUC quartile as the cutoff. Cox models were used to estimate the association between AAUC and OS. Patients with non-detected CMV VL were categorized into elite-controllers (recipient+ [R+] or R-/donor+ [D+]) and R-/D-. RESULTS: The study (N = 952) included 282 controllers, 93 non-controllers, 275 elite-controllers, and 302 R-/D-. OS was 80.1% and 58.1% for controllers and non-controllers, respectively. In multivariable models, non-controllers had worse OS than controllers (adjusted hazard ratio [HR] = 2.65; 95% confidence interval [CI], 1.71-4.12). In landmark analyses, controllers had OS similar to that of elite-controllers (HR = 1.26; 95% CI, .83-1.91) or R-/D- (HR = 0.98; 95% CI, .64-1.5). CONCLUSIONS: Non-controllers had worse OS at 1 year post-HCT. Controllers had OS similar to that of elite-controllers or R-/D-. Future studies are needed to validate our AAUC cutoff across different cohorts and CMV management strategies.
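A time-averaged AUC is simply the trapezoidal area under the viral-load curve divided by the length of the observation window. The Python sketch below illustrates the idea; whether the study used raw or log10-transformed copies/mL, and how undetectable values were handled, is not stated here, so those choices are assumptions.

```python
# Minimal sketch of a time-averaged area under the curve (AAUC) for a
# viral-load series: trapezoidal area divided by the observation window.

import math

def time_averaged_auc(series, window_end=100):
    """series: list of (day, copies_per_mL) pairs, assumed sorted by day."""
    pts = [(d, math.log10(v) if v > 0 else 0.0) for d, v in series if d <= window_end]
    if len(pts) < 2:
        return 0.0
    area = sum((d2 - d1) * (y1 + y2) / 2.0            # trapezoid between samples
               for (d1, y1), (d2, y2) in zip(pts, pts[1:]))
    return area / (pts[-1][0] - pts[0][0])            # average height over the window

cmv = [(21, 500), (28, 2500), (35, 10000), (42, 1500), (49, 0)]
print(f"AAUC = {time_averaged_auc(cmv):.2f} log10 copies/mL")
```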
Subject(s)
Cytomegalovirus Infections , Hematopoietic Stem Cell Transplantation , Multiple Organ Failure/mortality , Viral Load , Cytomegalovirus , Cytomegalovirus Infections/mortality , Humans , Kinetics , Multiple Organ Failure/virology , Retrospective Studies , Survival Rate
ABSTRACT
BACKGROUND: We report on predictors of adenovirus (ADV) viremia and the correlation of ADV viral kinetics with mortality in ex vivo T-cell-depleted (TCD) hematopoietic cell transplant (HCT) recipients. METHODS: T cell-depleted HCT recipients from January 1, 2012 through September 30, 2018 were prospectively monitored for ADV in the plasma through day (D) +100 posttransplant or for 16 weeks after the onset of ADV viremia. Adenovirus viremia was defined as ≥2 consecutive viral loads (VLs) ≥1000 copies/mL through D+100. Time-averaged area under the curve (AAUC) or peak ADV VL through 16 weeks after onset of ADV viremia were explored as predictors of mortality in Cox models. RESULTS: Of 586 patients (adults, 81.7%), 51 (8.7%) developed ADV viremia by D+100. Age <18 years, recipient cytomegalovirus seropositivity, absolute lymphocyte count <300 cells/µL at D+30, and acute graft-versus-host disease were predictors of ADV viremia in multivariate models. Fifteen (29%) patients with ADV viremia died by D+180; 8 of 15 (53%) died from ADV. Peak ADV VL (hazard ratio [HR], 2.25; 95% confidence interval [CI], 1.52-3.33) and increasing AAUC (HR, 2.95; 95% CI, 1.83-4.75) correlated with mortality at D+180. CONCLUSIONS: In TCD HCT, peak ADV VL and ADV AAUC correlated with mortality at D+180. Our data support the potential utility of ADV viral kinetics as endpoints in clinical trials of ADV therapies.
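The sketch below illustrates the viremia definition used here (≥2 consecutive VLs ≥1000 copies/mL through D+100) and the extraction of the peak VL over the 16 weeks after onset; the input format and helper names are illustrative assumptions.

```python
# Minimal sketch of the ADV viremia definition (two or more consecutive plasma
# viral loads >= 1000 copies/mL through day +100) and of the peak viral load
# over the 16 weeks after onset.

def adv_onset_day(series, threshold=1000, window_end=100):
    """Return the day of the first of >=2 consecutive VLs >= threshold, else None."""
    prev_day, prev_qualifies = None, False
    for day, vl in sorted(series):
        if day > window_end:
            break
        if vl >= threshold:
            if prev_qualifies:
                return prev_day
            prev_day, prev_qualifies = day, True
        else:
            prev_qualifies = False
    return None

def peak_after_onset(series, onset_day, follow_up_days=16 * 7):
    return max(vl for day, vl in series
               if onset_day <= day <= onset_day + follow_up_days)

adv = [(30, 800), (37, 1500), (44, 4000), (51, 120000), (58, 9000)]
onset = adv_onset_day(adv)
if onset is not None:
    print(onset, peak_after_onset(adv, onset))   # 37 120000
```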
Subject(s)
Adenoviridae Infections/mortality , Hematopoietic Stem Cell Transplantation , Lymphocyte Depletion , T-Lymphocytes/immunology , Transplantation Conditioning , Viremia/mortality , Adenoviridae/growth & development , Adenoviridae Infections/immunology , Adolescent , Adult , Child , Child, Preschool , Female , Graft vs Host Disease/immunology , Graft vs Host Disease/mortality , Graft vs Host Disease/virology , Hematologic Diseases/immunology , Hematologic Diseases/mortality , Hematologic Diseases/virology , Humans , Kinetics , Male , Middle Aged , Myeloablative Agonists/therapeutic use , Survival Analysis , T-Lymphocytes/transplantation , Transplantation, Homologous , Viral Load , Viremia/immunology
ABSTRACT
Patients presenting for treatment of hematologic cancers may be at increased risk for cognitive dysfunction before allogeneic hematopoietic stem cell transplantation (HSCT) due to advanced age, previous chemotherapy treatment, deconditioning, and fatigue. Cognitive dysfunction may affect treatment decision making, the ability to recall or follow post-HSCT treatment recommendations, and overall survival (OS). A total of 448 patients admitted for HSCT between 2011 and 2014 were administered the Montreal Cognitive Assessment (MoCA) by occupational therapists during admission before transplantation, and 260 were reassessed following transplantation and before discharge. We examined selected predictor variables (age, Karnofsky Performance Status, sex, disease type, and psychotropic medications) and outcome variables (OS and nonrelapse mortality [NRM]). Before transplantation, 36.4% of patients met criteria for cognitive dysfunction. Age was a significant predictor of cognitive dysfunction, as was disease type (myelodysplastic syndrome [MDS] and myeloproliferative disorder [MPD]). No significant association was found between cognitive dysfunction and OS or NRM. Longitudinal analysis from pretransplantation to post-transplantation indicated a significant decline in cognitive performance following HSCT. Notably, one-third of the study cohort showed cognitive dysfunction at hospital discharge. A significant proportion of HSCT candidates present with cognitive dysfunction, with older patients and those diagnosed with MDS and MPD at greatest risk in this cohort. Attention to cognitive dysfunction before transplantation may alert the treatment team to high-risk cases that require increased oversight, involvement of caregivers, and referral to occupational therapy at discharge. Longitudinal follow-up studies are needed to clarify the specific effect of HSCT on cognitive dysfunction and the impact of cognitive dysfunction on transplantation outcomes.
Subject(s)
Cognitive Dysfunction , Hematologic Neoplasms , Hematopoietic Stem Cell Transplantation , Cognitive Dysfunction/etiology , Hematologic Neoplasms/therapy , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Retrospective Studies , Transplantation Conditioning/adverse effects , Transplantation, Homologous
ABSTRACT
(Val)ganciclovir (vGCV) and foscarnet (FCN), used as preemptive therapy (PET) for cytomegalovirus (CMV) after allogeneic hematopoietic cell transplantation (HCT), are associated with myelosuppression and nephrotoxicity, respectively. We analyzed a cohort of CMV-seropositive (R+) HCT recipients managed preemptively at a single center. The objectives of our study were to (1) quantify the frequencies of neutropenia and acute kidney injury (AKI) through day +100 (D100) post-HCT and at PET discontinuation and (2) assess the impact of PET on neutropenia and AKI in multivariate models. This was a retrospective cohort study of adult CMV R+ recipients who underwent allo-HCT at Memorial Sloan Kettering Cancer Center from March 18, 2013, through December 31, 2017, and were managed with PET. Patients were grouped by receipt of PET (PET and no PET). Neutropenia and AKI were defined by Common Terminology Criteria for Adverse Events version 4. Frequencies of toxicities by D100 were compared between relevant groups. The impact of PET on toxicities was examined in univariate and multivariate Poisson/negative binomial regression models. Of 368 CMV R+ HCT recipients, 208 (56.5%) received PET. Neutropenia by D100 occurred in 41.8% and 28.6% of patients in the PET and no-PET groups, respectively (P = .0009). PET increased the risk of neutropenia (adjusted relative risk = 1.81; 95% confidence interval [CI], 1.48 to 2.21; P < .0001) in multivariate analyses. AKI by D100 occurred in 12.0% and 7.8% of patients in the PET and no-PET groups, respectively (P = .19). PET increased the risk of AKI by 2.75-fold (95% CI, 1.71 to 4.42; P < .0001). When PET recipients were grouped by first antiviral, neutropenia by D100 occurred in 34.8% and 48.9% of vGCV and FCN recipients, respectively (P = .08), and AKI occurred in 13.0% and 34.0% of vGCV and FCN recipients, respectively (P = .001). At discontinuation of vGCV or FCN, neutropenia was present in 11.2% versus 2.1% of patients, respectively (P = .08), and AKI was present in 1.9% versus 12.8% of patients, respectively (P = .005). Preemptive therapy for CMV increased the risk of neutropenia and AKI in the first 100 days post-HCT by 1.8-fold and 2.8-fold, respectively. Our results underscore the need for safer antivirals for CMV management in HCT recipients.
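For orientation, a crude (unadjusted) relative risk can be computed directly from the frequencies quoted above; the published 1.81 estimate came from adjusted Poisson/negative binomial models, so the sketch below, which rounds the stated percentages to counts, will not reproduce it exactly.

```python
# Minimal sketch of an unadjusted relative risk with a 95% CI from the event
# frequencies quoted above (neutropenia by D100: ~41.8% of 208 PET recipients
# vs ~28.6% of 160 no-PET recipients). Counts are rounded approximations.

import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

rr, lo, hi = relative_risk(events_exposed=87, n_exposed=208,     # ~41.8% of 208
                           events_unexposed=46, n_unexposed=160)  # ~28.6% of 160
print(f"crude RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```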
Subject(s)
Cytomegalovirus Infections , Hematopoietic Stem Cell Transplantation , Adult , Antiviral Agents/adverse effects , Cohort Studies , Cytomegalovirus , Cytomegalovirus Infections/drug therapy , Ganciclovir/therapeutic use , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Retrospective Studies
ABSTRACT
Large series of patients with acute myelogenous leukemia (AML) after ex vivo T cell-depleted (TCD) allogeneic hematopoietic stem cell transplantation (allo-HSCT) have not been reported previously. We retrospectively analyzed the outcomes of 266 patients (median age, 54 years) with AML who received CD34-selected TCD allo-HSCTs while in first (75%) or second (25%) complete remission (CR1/CR2) at a single institution. The conditioning regimens were all myeloablative, and no additional graft-versus-host disease (GVHD) prophylaxis was given. The cumulative incidences of grade II-IV and grade III-IV acute GVHD at 180 days were 14% (95% confidence interval [CI], 10% to 18%) and 3% (95% CI, 1% to 5%), respectively. The cumulative incidence of chronic GVHD at 3 years was 3% (95% CI, 1% to 6%). The 3-year cumulative incidence of nonrelapse mortality was 21% (95% CI, 16% to 26%) and that of relapse was 21% (95% CI, 17% to 27%). Overall survival (OS) and disease-free survival (DFS) at 1, 3, and 5 years were 75%, 61%, and 56% and 68%, 57%, and 53%, respectively. There were no significant differences in OS, DFS, and relapse rates for patients who underwent transplantation in CR1 and those who did so in CR2. However, patients with high-risk cytogenetics at diagnosis had significantly poorer outcomes. The OS and DFS rates compare favorably with those for unmodified allo-HSCT, but with considerably lower rates of GVHD.
Subject(s)
Graft vs Host Disease , Hematopoietic Stem Cell Transplantation , Leukemia, Myeloid, Acute , Adult , Disease-Free Survival , Graft vs Host Disease/etiology , Graft vs Host Disease/prevention & control , Humans , Leukemia, Myeloid, Acute/therapy , Middle Aged , Retrospective Studies , T-Lymphocytes , Transplantation Conditioning , Transplantation, Homologous
ABSTRACT
Although histopathological differences have been reported between acute graft-versus-host disease (aGVHD) rash and non-aGVHD rash in CD34+-selected peripheral blood stem cell transplantation (PBSCT) recipients, skin biopsy alone is usually insufficient to determine rash etiology. As such, distinguishing inflammatory non-aGVHD rashes, such as drug eruptions, from cutaneous aGVHD after CD34+-selected PBSCT remains challenging and relies on clinical presentation. This study aimed to identify etiologies of skin rash in the first year after CD34+-selected PBSCT and to assess whether laboratory serologic markers, transplant characteristics, and rash morphology and symptomatology aid in differentiation of cutaneous aGVHD rash versus non-aGVHD rash. We conducted a retrospective study of 243 adult patients who underwent CD34+-selected PBSCT at Memorial Sloan Kettering Cancer Center between 2008 and 2011. Among this cohort of transplant recipients, only 43 patients (17.7%) developed cutaneous aGVHD. A total of 152 patients (63%) were identified with rash within 1 year after PBSCT. The proportion of patients who experienced peripheral eosinophilia was not different between those with an aGVHD versus non-aGVHD rash (P ≥ .90), nor when stratified by CD34+ selection method (Isolex, P = .70; CliniMACS, P ≥ .90). The proportion of patients with pruritus was also not different between those with an aGVHD rash versus non-aGVHD rash (P = .20), or when stratified by CD34+ selection modality (Isolex, P = .20; CliniMACS, P = .50). The most common cause of non-aGVHD rash among those with a clear etiology was a drug eruption (39% of Isolex; 26% of CliniMACS). Single drug culprits were identified in 51% of drug rashes. The most commonly reported offending agents included antibiotics, keratinocyte growth factor, chemotherapy, and recombinant glycosylated human IL-7.
Subject(s)
Exanthema , Peripheral Blood Stem Cell Transplantation , Pruritus , Acute Disease , Allografts , Antigens, CD34 , Exanthema/chemically induced , Exanthema/pathology , Female , Humans , Male , Middle Aged , Pruritus/chemically induced , Pruritus/epidemiology , Pruritus/pathology , Retrospective Studies
ABSTRACT
In recent years, vancomycin-resistant Enterococcus (VRE) colonization has been increasingly encountered in transplant recipients, and VRE has become one of the leading causes of bacteremia early after allogeneic hematopoietic stem cell transplantation (allo-HSCT). Data are sparse on the effect of empiric VRE therapy for febrile, neutropenic allo-HSCT recipients colonized with VRE. All allo-HSCT recipients aged ≥18 years who developed VRE bacteremia (VREB) between 2005 and 2014 were identified and categorized as to whether they received empiric or directed VRE therapy. There were 434 (33%) VRE-colonized and 872 (67%) non-VRE-colonized patients during the study period, and 172 of the 434 (40%) VRE-colonized patients received empiric therapy. There was no significant difference in the incidence of VREB among colonized patients who did or did not receive empiric therapy (28 of 172 [16%] vs 55 of 262 [21%]; P = .22). There were 95 patients with VREB, of whom the majority (83 of 95; 87%) were known to be VRE-colonized. Of the 95 VREB episodes, 29 (31%) were treated with empiric VRE therapy, whereas 66 (69%) were treated with directed therapy. No significant differences in clinical outcomes, including median duration of bacteremia (2 days vs 2 days; P = .39), recurrent VREB (3 of 29 [10%] vs 5 of 66 [8%]; P = .65), 30-day all-cause mortality (1 of 29 [3%] vs 4 of 66 [6%]; P = .62), or VRE-attributable mortality (1 of 29 [3%] vs 1 of 66 [2%]; P = .55), were observed between the empiric therapy and directed therapy groups. Kaplan-Meier analysis showed no significant difference in survival at 30 days between allo-HSCT recipients with VREB who received empiric therapy and those who received directed therapy (97% vs 94%; P = .62). Based on our data, we recommend against empiric use of VRE-active agents for fever and neutropenia in VRE-colonized patients undergoing allo-HSCT.
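The 30-day survival comparison above is a Kaplan-Meier estimate. The sketch below implements the estimator on invented times and event indicators (1 = death, 0 = censored) purely to show the mechanics; it is not the study's analysis code.

```python
# Minimal Kaplan-Meier sketch: survival probability drops only at event times,
# by the fraction of at-risk patients who die at that time.

def kaplan_meier(times, events):
    """Return [(time, survival probability)] at each observed event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        n_at_t = sum(1 for tt, _ in data if tt == t)
        at_risk -= n_at_t            # remove deaths and censored observations at t
        i += n_at_t
    return curve

times  = [5, 8, 12, 15, 20, 25, 30, 30, 30, 30]
events = [1, 0, 1, 0, 1, 0, 0, 0, 0, 0]
print(kaplan_meier(times, events))   # [(5, 0.9), (12, 0.7875), (20, 0.65625)]
```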
Subject(s)
Bacteremia/drug therapy , Hematopoietic Stem Cell Transplantation/adverse effects , Vancomycin-Resistant Enterococci/drug effects , Anti-Bacterial Agents/therapeutic use , Bacteremia/mortality , Fever/drug therapy , Fever/etiology , Humans , Neutropenia/drug therapy , Neutropenia/etiology , Retrospective Studies , Survival Analysis , Transplantation, Homologous/adverse effects , Treatment Outcome , Vancomycin Resistance
ABSTRACT
We quantified cytomegalovirus (CMV) antiviral use and hospital length of stay (LOS) associated with CMV infection in a contemporary cohort of conventional (CONV) and CD34-selected (T cell-depleted) hematopoietic cell transplantation (HCT) recipients managed by preemptive therapy (PET) at a single US center. Adults who received a first allogeneic HCT at Memorial Sloan Kettering Cancer Center from June 2010 through December 2014 were analyzed. Days on PET, number of readmissions, and readmission LOS by day 180 post-HCT were summarized. Estimated unit value (EUV) was defined as the expected number of PET days for a cohort of 100 HCT recipients with the same characteristics as the analyzed cohort. The standardized incidence ratio was calculated as the ratio of observed outcomes in patients with CMV viremia to outcomes in patients without CMV viremia. Of 318 patients, 88 received CONV and 230 received CD34-selected HCT. Rates of CMV viremia were 26.3% for CONV and 41.9% for CD34-selected HCT (P = .003). Among patients with viremia, 68.2% of CONV and 97.9% of CD34-selected recipients received PET. EUV for PET was 852 days for CONV and 2821 days for CD34-selected HCT. The standardized incidence ratios for number of readmissions and readmission LOS were 1.7 (95% confidence interval [CI], 1.4 to 2.1) and 1.2 (95% CI, 1.1 to 1.3), respectively, for CONV HCT and 1.7 (95% CI, 1.3 to 2.1) and 1.6 (95% CI, 1.5 to 1.7), respectively, for CD34-selected HCT. Overall survival was similar between patients with and without CMV viremia by HCT type. CMV end-organ disease was associated with lower overall survival only in CD34-selected HCT (P = .0007). CMV infection managed by PET requires substantial antiviral use and is associated with more readmissions and longer readmission LOS, particularly among CD34-selected HCT recipients.
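The two summary measures defined above reduce to simple arithmetic: EUV rescales observed PET days to a hypothetical cohort of 100 recipients, and the standardized incidence ratio divides the outcome rate in patients with CMV viremia by the rate in patients without. The sketch below uses invented counts, not study data.

```python
# Minimal sketch of the estimated unit value (EUV) and standardized incidence
# ratio (SIR) described above. All counts are illustrative.

def estimated_unit_value(total_pet_days, n_patients, reference_cohort=100):
    """Expected PET days per 100 transplanted patients."""
    return total_pet_days / n_patients * reference_cohort

def standardized_incidence_ratio(events_viremia, n_viremia,
                                 events_no_viremia, n_no_viremia):
    """Outcome rate with CMV viremia divided by outcome rate without viremia."""
    return (events_viremia / n_viremia) / (events_no_viremia / n_no_viremia)

print(estimated_unit_value(total_pet_days=1950, n_patients=230))          # PET days per 100 HCT
print(standardized_incidence_ratio(events_viremia=160, n_viremia=96,       # e.g., readmissions
                                   events_no_viremia=180, n_no_viremia=186))
```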
Subject(s)
Cytomegalovirus Infections/etiology , Hematopoietic Stem Cell Transplantation/adverse effects , Transplantation Conditioning/adverse effects , Transplantation, Homologous/adverse effects , Female , Humans , Male , Middle Aged , Treatment Outcome
ABSTRACT
Antithymocyte globulin (ATG) use mitigates the risk of graft rejection and graft-versus-host disease (GVHD) after allogeneic hematopoietic cell transplantation (allo-HCT), but ATG overexposure in the setting of lymphopenia negatively affects immune recovery. We hypothesized that standard empiric weight-based dosing of ATG, used to prevent graft rejection in ex vivo CD34-selected allo-HCT, may lead to serious adverse consequences on outcomes in certain patients. We evaluated 304 patients undergoing myeloablative-conditioned ex vivo CD34-selected allo-HCT with HLA-matched donors for the treatment of hematologic malignancies. Patients received rabbit ATG at a dose of 2.5 mg/kg/day i.v. on days -3 and/or -2. An ATG dosing cutoff of 450 mg was used for statistical analyses to assess the relationship between ATG and overall survival (OS). Among all patients, median total ATG dose was 360 mg (range, 130 to 510 mg); 279 (92%) received a total dose of ATG ≤450 mg, and 25 (8%) received a total dose >450 mg. On the first day of ATG administration (day -3), the median absolute lymphocyte count was .0 K/µL. For patients who received a total dose of ATG >450 mg or ≤450 mg, the incidences of acute and late-acute GVHD grade II-IV were statistically similar. At 3 years post-HCT, for patients who received a total dose of ATG >450 mg or ≤450 mg, nonrelapse mortality (NRM) rates were 35% and 18%, respectively (P = .029), disease-free survival (DFS) rates were 37% and 61%, respectively (P = .003), and OS rates were 40% and 67%, respectively (P = .001). Among all patient and HCT characteristics in multivariable analyses, receipt of a total dose of ATG >450 mg was associated with an increased risk of NRM (hazard ratio [HR], 2.9; P = .01), shorter DFS (HR, 2.0; P = .03), and inferior OS (HR, 2.1; P = .01). In summary, the use of weight-based ATG at a time of relative lymphopenia before ex vivo CD34-selected allo-HCT results in overdosing in heavier patients, leading to higher NRM and lower DFS and OS. Further pharmacokinetic investigation in this setting is critical to determining the optimal dosing strategy for ATG.
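The overdosing argument follows directly from the weight-based rule (2.5 mg/kg/day on days -3 and/or -2) and the 450 mg cutoff: with two doses, total ATG exceeds 450 mg once body weight passes roughly 90 kg. The sketch below shows the arithmetic; any rounding or capping of actual administered doses is not described here, so the plain multiplication is an assumption.

```python
# Minimal sketch of the weight-based rabbit ATG dosing described above and the
# 450 mg total-dose cutoff used in the analysis.

ATG_MG_PER_KG_PER_DAY = 2.5
CUTOFF_MG = 450

def total_atg_dose(weight_kg, n_days):
    """Total rabbit ATG dose (mg) for 1 or 2 days of 2.5 mg/kg dosing."""
    return ATG_MG_PER_KG_PER_DAY * weight_kg * n_days

for weight in (70, 85, 95, 110):
    dose = total_atg_dose(weight, n_days=2)
    flag = "> 450 mg" if dose > CUTOFF_MG else "<= 450 mg"
    print(f"{weight} kg x 2 days -> {dose:.0f} mg total ATG ({flag})")
```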
Subject(s)
Antilymphocyte Serum/adverse effects , Hematologic Neoplasms , Hematopoietic Stem Cell Transplantation , Lymphopenia , Transplantation Conditioning/adverse effects , Adult , Aged , Allografts , Antigens, CD34 , Antilymphocyte Serum/administration & dosage , Disease-Free Survival , Female , Follow-Up Studies , Hematologic Neoplasms/blood , Hematologic Neoplasms/mortality , Hematologic Neoplasms/therapy , Humans , Lymphopenia/blood , Lymphopenia/chemically induced , Lymphopenia/mortality , Male , Middle Aged , Retrospective Studies , Survival Rate
ABSTRACT
Immune-mediated cytopenias (ICs), such as immune thrombocytopenia and immune hemolytic anemia, are among the adverse events after allogeneic hematopoietic cell transplantation (allo-HCT). Previous reports suggest that in vivo T cell depletion may increase the incidence of IC after allo-HCT. We evaluated whether a strategy that reduces functional donor T cells via ex vivo CD34+ selection is associated with the development of IC in a cohort of 408 patients who underwent allo-HCT for hematologic malignancy. The cumulative incidence of IC at 6, 12, and 36 months after the 30-day landmark post-HCT was 3.4%, 4.9%, and 5.8%, respectively. Among 23 patients who developed IC, 7 died of relapse-related causes and 4 of nonrelapse causes. A median of 2 treatment types (range, 1 to 5) was required to resolve IC, and there was considerable heterogeneity in the therapies used. In univariable analyses, a hematologic malignancy Disease Risk Index (DRI) score of 3 was significantly associated with an increased risk of IC compared with a DRI of 1 or 2 (hazard ratio [HR], 4.12; P = .003), and IC (HR, 2.4; P = .03) was associated with an increased risk of relapse. In a multivariable analysis that included DRI, IC remained significantly associated with an increased risk of relapse (HR, 2.4; P = .03). Our findings show that IC events occur with relatively similar frequency in patients after ex vivo CD34+-selected allo-HCT compared with unmodified allo-HCT, suggesting that reduced donor T cell immunity is not causative of IC. Moreover, we noted a possible link between the development and/or treatment of IC and an increased risk of relapse.
Subject(s)
Blood Cell Count/methods , Hematopoietic Stem Cell Transplantation/adverse effects , Transplantation Conditioning/adverse effects , Transplantation, Homologous/adverse effects , Adult , Aged , Female , Hematopoietic Stem Cell Transplantation/methods , Humans , Male , Middle Aged , Retrospective Studies , Transplantation Conditioning/methods , Transplantation, Homologous/methods , Young Adult
ABSTRACT
Non-graft-versus-host disease (GVHD) ocular complications are generally uncommon after hematopoietic cell transplantation (HCT) but can cause prolonged morbidity affecting activities of daily living and quality of life. Here we provide an expert review of non-GVHD ocular complications in a collaboration between transplantation physicians and ophthalmologists through the Late Effects and Quality of Life Working Committee of the Center for International Blood and Marrow Transplant Research and the Transplant Complications Working Party of the European Society of Blood and Marrow Transplantation. Complications discussed in this review include cataracts, glaucoma, ocular infections, ocular involvement with malignancy, ischemic microvascular retinopathy, central retinal vein occlusion, retinal hemorrhage, retinal detachment and ocular toxicities associated with medications. We summarize the incidence, risk factors, screening, prevention, and treatment of individual complications and generate evidence-based recommendations. Baseline ocular evaluation before HCT should be considered in all patients who undergo HCT. Follow-up evaluations should be considered according to clinical signs and symptoms and risk factors. Better preventive strategies and treatments remain to be investigated for individual ocular complications after HCT. Both transplantation physicians and ophthalmologists should be knowledgeable about non-GVHD ocular complications and provide comprehensive collaborative team care.
Subject(s)
Eye Diseases/etiology , Hematopoietic Stem Cell Transplantation/adverse effects , Eye Diseases/diagnosis , Eye Diseases/prevention & control , Eye Diseases/therapy , Humans , Incidence , Mass Screening , Patient Care Team , Risk Factors
ABSTRACT
The development of reduced-intensity approaches for allogeneic hematopoietic cell transplantation has resulted in growing numbers of older related donors (RDs) of peripheral blood stem cells (PBSCs). The effects of age on donation efficacy, toxicity, and long-term recovery in RDs are poorly understood. To address this, we analyzed hematologic variables, pain, donation-related symptoms, and recovery in 1211 PBSC RDs aged 18 to 79 enrolled in the Related Donor Safety Study. RDs aged >60 had a lower median CD34+ level before apheresis compared with younger RDs (age >60, 59 × 10⁶/L; age 41 to 60, 81 × 10⁶/L; age 18 to 40, 121 × 10⁶/L; P < .001). This resulted in older donors undergoing more apheresis procedures (49% versus 30% with ≥2 collections, P < .001) and higher collection volumes (52% versus 32% with >24 L, P < .001), leading to high percentages of donors aged >60 with postcollection thrombocytopenia <50 × 10⁹/L (26% and 57% after 2 and 3 days of collection, respectively). RDs aged 18 to 40 had a higher risk of grades 2 to 4 pain and symptoms pericollection, but donors over age 40 had more persistent pain at 1, 6, and 12 months (odds ratio [OR], 1.7; P = .02) and a higher rate of nonrecovery to predonation levels (OR, 1.7; P = .01). The proportion of donors reporting comorbidities increased significantly with age, and those with comorbidities that would have led to deferral by National Marrow Donor Program unrelated donor standards had an increased risk for persistent grades 2 to 4 pain (OR, 2.41; P < .001) and failure to recover to predonation baseline for other symptoms (OR, 2.34; P = .004). This information should be used in counseling RDs regarding risk and can assist in developing practice approaches aimed at improving the RD experience for high-risk individuals.
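The comorbidity findings above are reported as odds ratios. As a reminder of what such an estimate looks like in its unadjusted form, the sketch below computes an OR with a 95% CI from a 2×2 table of invented counts; it will not reproduce the published, covariate-adjusted value of 2.41.

```python
# Minimal sketch of an unadjusted odds ratio with a 95% CI from a 2x2 table.
# Counts are invented for illustration only.

import math

def odds_ratio(a, b, c, d):
    """a, b = exposed with/without outcome; c, d = unexposed with/without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    return or_, lo, hi

# Toy table: persistent grade 2-4 pain among donors with vs without comorbidities.
print(odds_ratio(a=30, b=170, c=25, d=325))
```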
Subject(s)
Peripheral Blood Stem Cell Transplantation/methods , Peripheral Blood Stem Cells/metabolism , Adolescent , Adult , Aged , Blood Donors , Comorbidity , Female , Humans , Male , Middle Aged , Young Adult
ABSTRACT
TP53 alterations portend an extremely poor prognosis in patients with mantle cell lymphoma treated with standard treatment modalities. We reviewed outcomes of 42 patients with available TP53 status who had received a reduced-intensity or non-myeloablative allogeneic haematopoietic cell transplant at our institution. We demonstrated a 2-year overall survival and progression-free survival of 78% [95% confidence interval (CI) 60-88] and 61% (95% CI 43-75), respectively. The 2-year cumulative incidences of relapse and non-relapse mortality were 19% and 20%, respectively. Importantly, there was no significant difference in outcomes between patients with and without TP53 alterations, suggesting for the first time a beneficial treatment modality for these high-risk patients.
Subject(s)
Hematopoietic Stem Cell Transplantation/methods , Lymphoma, Mantle-Cell/therapy , Transplantation Conditioning/methods , Transplantation, Homologous/methods , Adult , Aged , Female , Humans , Lymphoma, Mantle-Cell/pathology , Male , Middle Aged , Prognosis , Tumor Suppressor Protein p53
ABSTRACT
In this study, we evaluated trends and outcomes of allogeneic hematopoietic cell transplantation (HCT) in adults ≥70 years with hematologic malignancies across the United States. Adults ≥70 years with a hematologic malignancy undergoing first allogeneic HCT in the United States between 2000 and 2013 and reported to the Center for International Blood and Marrow Transplant Research were eligible. Transplant utilization and transplant outcomes, including overall survival (OS), progression-free survival (PFS), and transplant-related mortality (TRM), were studied. One thousand one hundred and six patients ≥70 years underwent HCT across 103 transplant centers. The number and proportion of allografts performed in this population rose markedly over the past decade, from 0.1% of transplants in 2000 to 3.85% (N = 298) in 2013. Acute myeloid leukemia and myelodysplastic syndromes represented the most common disease indications. Two-year OS and PFS significantly improved over time (OS: 26% [95% confidence interval (CI), 21% to 33%] in 2000-2007 to 39% [95% CI, 35% to 42%] in 2008-2013, P < .001; PFS: 22% [16% to 28%] in 2000-2007 to 32% [95% CI, 29% to 36%] in 2008-2013, P = .003). Two-year TRM ranged from 33% to 35% and was unchanged over time (P = .54). Multivariable analysis of OS in the modern era (2008-2013) identified higher comorbidity burden (HCT comorbidity index ≥3; hazard ratio [HR], 1.27; P = .006), umbilical cord blood graft (HR, 1.97; P = .0002), and myeloablative conditioning (HR, 1.61; P = .0002) as adverse factors. Over the past decade, utilization of and survival after allogeneic transplantation have increased in patients ≥70 years. Select adults ≥70 years with hematologic malignancies should be considered for transplantation.