ABSTRACT
BACKGROUND: The current standard of care, in which patients are screened for symptoms such as depression, pain, and fatigue and referred for treatment, is not effective. This trial aimed to test the efficacy of an integrated screening and novel stepped collaborative care intervention versus standard of care for patients with cancer and at least one of the following symptoms: depression, pain, or fatigue. METHODS: This randomised, parallel, phase 3 trial was conducted in 29 oncology outpatient clinics associated with the UPMC Hillman Cancer Center in the USA. Patients (aged ≥21 years) with any cancer type and clinical levels of depression, pain, or fatigue (or all of these) were eligible. Eligible family caregivers were aged 21 years or older and providing care to a patient with cancer who had consented to this study. Patients were randomly assigned (1:1) to stepped collaborative care or standard of care using a central, permuted block design (block sizes of 2, 4, and 6) stratified by sex and prognostic status. The biostatistician, oncologists, and outcome assessors were masked to treatment assignment. Stepped collaborative care consisted of once-weekly 50-60 min sessions of cognitive behavioural therapy delivered by a care coordinator via telemedicine (eg, telephone or videoconferencing). Pharmacotherapy for symptoms could be initiated or changed if recommended by the treatment team or preferred by the patient. Standard of care was screening and referral to a health-care provider for treatment of symptoms. The primary outcome was health-related quality of life in patients at 6 months. Maintenance of treatment benefits was assessed at 12 months. The primary analysis was by intention to treat and included patients missing one or both follow-up assessments. This trial was registered with ClinicalTrials.gov (NCT02939755). FINDINGS: Between Dec 5, 2016, and April 8, 2021, 459 patients and 190 family caregivers were enrolled. 222 patients were assigned to standard of care and 237 to stepped collaborative care. Of 459 patients, 201 (44%) were male and 258 (56%) were female. Patients in the stepped collaborative care group had a greater 0-6-month improvement in health-related quality of life than patients in the standard-of-care group (p=0·013, effect size 0·09). Health-related quality of life was maintained in the stepped collaborative care group at 12 months (p=0·74, effect size 0·01). Patients in the stepped collaborative care group had greater 0-6-month improvements than the standard-of-care group in emotional (p=0·012), functional (p=0·042), and physical (p=0·033) wellbeing. No adverse events were reported by patients in either group, and deaths were considered unrelated to the study. INTERPRETATION: An integrated screening and novel stepped collaborative care intervention, compared with the current standard of care, is recommended to improve health-related quality of life. The findings of this study will advance the implementation of guideline-concordant care (screening and treatment) and have the potential to shift the screening and treatment paradigm nationwide, improving outcomes for patients diagnosed with cancer. FUNDING: US National Cancer Institute.
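The randomisation scheme described above (central, stratified, permuted blocks of sizes 2, 4, and 6) can be illustrated with a short sketch. This is a hypothetical illustration, not the trial's actual code: the function name and seed are invented, and in practice each sex × prognostic-status stratum would receive its own independently generated list.

```python
import random

# Permuted-block randomization for one stratum: each block contains an equal
# number of each arm (1:1 allocation), shuffled, so group sizes stay balanced
# after every completed block.
def permuted_block_list(n_patients, block_sizes=(2, 4, 6), seed=42):
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_patients:
        size = rng.choice(block_sizes)  # block sizes must be even for 1:1
        block = (["stepped collaborative care"] * (size // 2)
                 + ["standard of care"] * (size // 2))
        rng.shuffle(block)              # permute within the block
        assignments.extend(block)
    return assignments[:n_patients]

print(permuted_block_list(10))
```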
Subject(s)
Caregivers , Neoplasms , Female , Humans , Male , Fatigue , Neoplasms/diagnosis , Neoplasms/therapy , Pain , Quality of Life , Treatment Outcome , Young Adult , Adult
ABSTRACT
Older compatible living donor kidney transplant (CLDKT) recipients have higher mortality and death-censored graft failure (DCGF) compared to younger recipients. These risks may be amplified in older incompatible living donor kidney transplant (ILDKT) recipients who undergo desensitization and intense immunosuppression. In a 25-center cohort of ILDKT recipients transplanted between September 24, 1997, and December 15, 2016, we compared mortality, DCGF, delayed graft function (DGF), acute rejection (AR), and length of stay (LOS) between 234 older (age ≥60 years) and 1172 younger (age 18-59 years) recipients. To investigate whether the impact of age was different for ILDKT recipients compared to 17 542 CLDKT recipients, we used an interaction term to determine whether the relationship between posttransplant outcomes and transplant type (ILDKT vs CLDKT) was modified by age. Overall, compared to younger recipients, older recipients had higher mortality (hazard ratio 2.07, 95% confidence interval [CI] 1.63-2.65; P < .001), lower DCGF (hazard ratio 0.53, 95% CI 0.36-0.77; P = .001), lower AR (odds ratio 0.54, 95% CI 0.39-0.74; P < .001), and similar DGF (odds ratio 1.03, 95% CI 0.46-2.33; P = .9) and LOS (incidence rate ratio 0.98, 95% CI 0.88-1.10; P = .8). The impact of age on mortality (interaction P = .052), DCGF (interaction P = .7), AR (interaction P = .2), DGF (interaction P = .9), and LOS (interaction P = .5) was similar in ILDKT and CLDKT recipients. Age alone should not preclude eligibility for ILDKT.
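The interaction analysis described above can be sketched in a few lines. The sketch below is illustrative only: it uses simulated data, and the column names, effect sizes, and the choice of the lifelines library are assumptions rather than details from the study.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated toy data standing in for the registry cohort.
rng = np.random.default_rng(0)
n = 1000
age_ge60 = rng.integers(0, 2, n)           # 1 = recipient aged >=60 years
ildkt = rng.integers(0, 2, n)              # 1 = ILDKT, 0 = CLDKT
hazard = 0.08 * np.exp(0.7 * age_ge60 + 0.2 * ildkt)
years = rng.exponential(1.0 / hazard)
death = (years < 10).astype(int)           # administrative censoring at 10 y
years = np.minimum(years, 10.0)
df = pd.DataFrame({"years": years, "death": death,
                   "age_ge60": age_ge60, "ildkt": ildkt})

cph = CoxPHFitter()
# "age_ge60 * ildkt" expands to both main effects plus their interaction;
# the interaction term's P value tests whether age modifies the ILDKT effect.
cph.fit(df, duration_col="years", event_col="death",
        formula="age_ge60 * ildkt")
cph.print_summary()
```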
Subject(s)
Kidney Transplantation , Humans , Aged , Middle Aged , Adolescent , Young Adult , Adult , Kidney Transplantation/adverse effects , Living Donors , Graft Survival , Graft Rejection/etiology , HLA Antigens , Risk Factors
ABSTRACT
Steatotic livers represent a potentially underutilized resource to increase the donor graft pool; however, one barrier to the increased utilization of such grafts is the heterogeneity in the definition and measurement of macrovesicular steatosis (MaS). Digital imaging software (DIS) may better standardize definitions to study posttransplant outcomes. Using HALO, a DIS, we analyzed 63 liver biopsies, from 3 transplant centers, transplanted between 2016 and 2018, and compared macrovesicular steatosis percentage (%MaS) as estimated by transplant center, donor hospital, and DIS. We also quantified the relationship between DIS characteristics and posttransplant outcomes using log-linear regression for peak aspartate aminotransferase, peak alanine aminotransferase, and total bilirubin on postoperative day 7, as well as logistic regression for early allograft dysfunction. Transplant centers and donor hospitals overestimated %MaS compared with DIS, with better agreement at lower %MaS and less agreement at higher %MaS. No biopsies analyzed by DIS were calculated to be >20% MaS; however, 40% of biopsies read by transplant center pathologists were read as >30%. %MaS read by HALO was positively associated with peak aspartate aminotransferase (regression coefficient 1.08, 95% CI 1.04-1.12; p < 0.001), peak alanine aminotransferase (regression coefficient 1.08, 95% CI 1.04-1.12; p < 0.001), and early allograft dysfunction (OR 1.40, 95% CI 1.10-1.78; p = 0.006). There was no association between HALO %MaS and total bilirubin on postoperative day 7 (regression coefficient 1.01, 95% CI 0.99-1.04; p = 0.3). DIS provides reproducible quantification of steatosis that could standardize MaS definitions and identify phenotypes associated with good clinical outcomes to increase the utilization of steatotic livers.
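A minimal sketch of the two regression models named above (log-linear for peak labs, logistic for early allograft dysfunction), on simulated stand-in data. The column names (pct_mas, peak_ast, ead) and the use of statsmodels are assumptions; exponentiated slopes are printed because that is the scale on which the coefficients above are reported.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy data: DIS-read %MaS plus post-transplant labs/outcomes.
rng = np.random.default_rng(2)
n = 63
pct_mas = rng.uniform(0, 20, n)
peak_ast = np.exp(5.5 + 0.075 * pct_mas + rng.normal(0, 0.5, n))
ead = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 0.3 * pct_mas))), n)
df = pd.DataFrame({"pct_mas": pct_mas, "peak_ast": peak_ast, "ead": ead})

# Log-linear model: exponentiated slope = multiplicative change in peak AST
# per one-point increase in %MaS (e.g. 1.08 = +8% per point).
ols = smf.ols("np.log(peak_ast) ~ pct_mas", data=df).fit()
print(np.exp(ols.params["pct_mas"]))

# Logistic model for early allograft dysfunction; exponentiated slope = OR.
logit = smf.logit("ead ~ pct_mas", data=df).fit(disp=0)
print(np.exp(logit.params["pct_mas"]))
```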
Subject(s)
Fatty Liver , Image Processing, Computer-Assisted , Liver Transplantation , Humans , Alanine Transaminase , Aspartate Aminotransferases , Bilirubin , Biopsy , Fatty Liver/diagnostic imaging , Fatty Liver/pathology , Liver/diagnostic imaging , Liver/pathology , Liver Transplantation/methods , Software , Image Processing, Computer-Assisted/methods
ABSTRACT
The development of materials showing rapid proton conduction with a low activation energy and stable performance over a wide temperature range is an important and challenging line of research. Here, we report confinement of sulfuric acid within porous MFM-300(Cr) to give MFM-300(Cr)·SO4(H3O)2, which exhibits a record-low activation energy of 0.04 eV, resulting in stable proton conductivity of >10⁻² S cm⁻¹ between 25 and 80 °C. In situ synchrotron X-ray powder diffraction (SXPD), neutron powder diffraction (NPD), quasielastic neutron scattering (QENS), and molecular dynamics (MD) simulations reveal the pathways of proton transport and the molecular mechanism of proton diffusion within the pores. Confined sulfuric acid species together with adsorbed water molecules play a critical role in promoting proton transfer through this robust network to afford a material in which proton conductivity is almost temperature-independent.
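Activation energies like the 0.04 eV quoted above are conventionally extracted from an Arrhenius fit of variable-temperature conductivity data. The sketch below shows that analysis on invented data chosen to roughly match the figures in this abstract; it is a generic illustration, not the paper's actual fitting procedure.

```python
import numpy as np

# Arrhenius analysis: sigma*T = A*exp(-Ea/(kB*T)), so ln(sigma*T) vs 1/T is
# linear with slope -Ea/kB. (Some reports fit ln(sigma) instead; the chosen
# form should match the paper's convention.)
kB = 8.617333262e-5  # Boltzmann constant in eV/K

# Hypothetical conductivities (S/cm) at several temperatures (K), nearly
# temperature-independent as described above.
T = np.array([298.15, 313.15, 333.15, 353.15])
sigma = np.array([1.10e-2, 1.13e-2, 1.17e-2, 1.20e-2])

slope, _ = np.polyfit(1.0 / T, np.log(sigma * T), 1)
Ea = -slope * kB
print(f"Apparent activation energy: {Ea:.3f} eV")  # ~0.04 eV
```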
ABSTRACT
Climate change is predicted to cause widespread disruptions to global biodiversity. Most climate models operate at the macroscale, at ~1 km resolution, and predict future temperatures at 1.5-2 m above ground level, making them unable to predict microclimates at the scale at which many organisms experience temperature. We studied the effects of forest structure and vertical position on microclimatic air temperature within the forest canopy in a historically degraded tropical forest in Sikundur, Northern Sumatra, Indonesia. We collected temperature measurements in fifteen plots over 20 months, alongside vegetation structure data from the same fifteen 25 × 25 m plots. We also performed airborne surveys using an unmanned aerial vehicle (UAV) to record canopy structure remotely, both over the plot locations and a wider area. We hypothesised that old-growth forest structure would moderate microclimatic air temperature. Our data showed that Sikundur is a thermally dynamic environment, with simultaneously recorded temperatures at different locations within the canopy varying by up to ~15 °C. Our models (R² = 0.90 to 0.95) showed that temperature differences between data loggers at different sites were largely determined by variation in recording height and the amount of solar radiation reaching the topmost part of the canopy, although strong interactions between these abiotic factors and canopy structure shaped microclimatic air temperature variation. The impacts of forest degradation had a smaller relative influence on models of microclimatic air temperature than abiotic factors, but loss of canopy density increased temperature. This may render areas of degraded tropical forest unsuitable for some forest-dwelling species with the advent of future climate change.
Subject(s)
Forests , Microclimate , Biodiversity , Climate Change , Temperature , Trees , Tropical Climate
ABSTRACT
We report the reversible adsorption of ammonia (NH3) up to 9.9 mmol g⁻¹ in a robust Al-based metal-organic framework, MFM-303(Al), which is functionalized with free carboxylic acid and hydroxyl groups. The unique pore environment decorated with these acidic sites results in an exceptional packing density of NH3 at 293 K (0.801 g cm⁻³), comparable to that of solid NH3 at 193 K (0.817 g cm⁻³). In situ synchrotron X-ray diffraction and inelastic neutron scattering reveal the critical role of the free -COOH and -OH groups in immobilizing NH3 molecules. Breakthrough experiments confirm the excellent performance of MFM-303(Al) for the capture of NH3 at low concentrations under both dry and wet conditions.
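The packing density quoted above follows from the gravimetric uptake and the framework's pore volume. A back-of-envelope check, where the pore volume of ~0.21 cm³ g⁻¹ is an assumed illustrative value (not reported in this abstract):

```python
# Converting gravimetric NH3 uptake into a packing density inside the pores.
M_NH3 = 17.031      # molar mass of NH3, g/mol
uptake = 9.9e-3     # mol NH3 per g MOF (9.9 mmol/g, as reported)
pore_volume = 0.21  # cm^3 per g MOF -- assumed value for illustration

mass_adsorbed = uptake * M_NH3                 # g NH3 per g MOF
packing_density = mass_adsorbed / pore_volume  # g NH3 per cm^3 of pore space
print(f"{packing_density:.3f} g/cm^3")  # ~0.80, near solid NH3 (0.817 at 193 K)
```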
ABSTRACT
Incompatible living donor kidney transplant recipients (ILDKTr) have pre-existing donor-specific antibody (DSA) that, despite desensitization, may persist or reappear with resulting consequences, including delayed graft function (DGF) and acute rejection (AR). To quantify the risk of DGF and AR in ILDKT and their downstream effects, we compared 1406 ILDKTr to 17 542 compatible LDKT recipients (CLDKTr) using a 25-center cohort with novel SRTR linkage. We characterized DSA strength as positive Luminex, negative flow crossmatch (PLNF); positive flow, negative cytotoxic crossmatch (PFNC); or positive cytotoxic crossmatch (PCC). DGF occurred in 3.1% of CLDKT, 3.5% of PLNF, 5.7% of PFNC, and 7.6% of PCC recipients, which translated to higher DGF risk for PCC recipients (aOR 1.68, 95% CI 1.03-2.72). However, the impact of DGF on mortality and death-censored graft failure (DCGF) risk was no higher for ILDKT than for CLDKT (p interaction > .1). AR developed in 8.4% of CLDKT, 18.2% of PLNF, 21.3% of PFNC, and 21.7% of PCC recipients, which translated to higher AR risk (aOR: PLNF 2.09, 95% CI 1.45-3.02; PFNC 2.40, 95% CI 1.67-3.46; PCC 2.24, 95% CI 1.48-3.37). Although the impact of AR on mortality was no higher for ILDKT than for CLDKT (p interaction = .1), its impact on DCGF risk was less consequential for ILDKT (aHR 1.62, 95% CI 1.34-1.95) than for CLDKT (aHR 2.29, 95% CI 1.96-2.67) (p interaction = .004). Providers should consider these risks during preoperative counseling, and strategies to mitigate them should be considered.
Subject(s)
Kidney Transplantation , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Living Donors , Retrospective Studies , Risk Factors
ABSTRACT
Yttrium-90 (Y-90) radioembolization for the treatment of hepatocellular carcinoma can present safety challenges when transplanting recently treated Y-90 patients. To reduce surgeons' contact with radioactive tissue and remain within occupational dose limits, current guidelines recommend delaying transplants at least 14 days, if possible. We wanted to determine the level of radiation exposure to the transplant surgeon when explanting an irradiated liver before the recommended decay period. An ex vivo radiation exposure analysis was conducted on the explanted liver of a patient who received Y-90 therapy 46 h prior to orthotopic liver transplant. To estimate exposure to the surgeon's hands, radiation dosimeter rings were placed inside three different surgical glove configurations and exposed to the explanted liver. Estimated radiation doses corrected for Y-90 decay were calculated. Radiation safety gloves performed best, with an average radiation exposure rate of 5.36 mSv h⁻¹ in the static hand position, an 83% reduction in exposure over controls with no glove (31.31 mSv h⁻¹). Interestingly, non-radiation safety gloves also demonstrated reduced exposure rates, well below occupational regulation limits. Handling of Y-90 irradiated organs within the immediate post-treatment period can be done safely and does not exceed federal occupational dose limits if appropriate gloves are used and necessary precautions are exercised.
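Decay correction for Y-90 (half-life ≈ 64.1 h) is simple exponential arithmetic. The sketch below, using illustrative numbers drawn from the exposure rates above, shows why the 14-day guideline is so conservative; it is not the study's actual dosimetry calculation.

```python
import math

# Decay-correcting a measured Y-90 exposure rate to a later handling time.
T_HALF_H = 64.1            # Y-90 half-life in hours
rate_measured = 5.36       # mSv/h, measured at explant (46 h post-treatment)
t_since_treatment = 46.0   # hours between Y-90 therapy and explant

decay_constant = math.log(2) / T_HALF_H
# Exposure rate expected if the organ were instead handled 14 days
# after treatment, as current guidelines recommend.
t_guideline = 14 * 24.0
rate_at_guideline = rate_measured * math.exp(
    -decay_constant * (t_guideline - t_since_treatment))
print(f"{rate_at_guideline:.2f} mSv/h")  # roughly 0.2 mSv/h
```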
Subject(s)
Occupational Exposure , Radiation Exposure , Hepatectomy , Humans , Occupational Exposure/analysis , Radiation Dosage , Yttrium Radioisotopes/therapeutic use
ABSTRACT
PURPOSE OF REVIEW: Organ transplantation research has led to the discovery of several interesting individual mechanistic pathways, molecules, and potential drug targets, but there are still no comprehensive studies that have addressed how these varied mechanisms work in unison to regulate the posttransplant immune response that drives kidney rejection and dysfunction. RECENT FINDINGS: Systems biology is a rapidly expanding field that aims to integrate existing knowledge of molecular concepts and large-scale genomic and clinical datasets into networks that can be used in cutting-edge computational models to define disease mechanisms in a holistic manner. Systems biology approaches have brought a paradigm shift from a reductionist view of biology to a wider, agnostic assessment of disease from several lines of evidence. Although the complex nature of the posttransplant immune response makes it difficult to pinpoint mechanisms, systems biology is enabling discovery of unknown biological interactions using the cumulative power of genomic data sets, clinical data and endpoints, and improved computational methods for the systematic deconvolution of this response. SUMMARY: An integrative systems biology approach that leverages genomic data from varied technologies, such as DNA sequencing, copy number variation, RNA sequencing, and methylation profiles, along with long-term clinical follow-up data, has the potential to define a framework that can be mined to provide novel insights for developing therapeutic interventions in organ transplantation.
Subject(s)
DNA Copy Number Variations/immunology , Graft Survival/physiology , Immunity, Humoral/immunology , Kidney Transplantation/methods , Humans , Systems Biology/methods , Transplantation, Homologous
ABSTRACT
Noninvasive biomarkers are needed to monitor stable patients after kidney transplant (KT), because subclinical acute rejection (subAR), currently detectable only with surveillance biopsies, can lead to chronic rejection and graft loss. We conducted a multicenter study to develop a blood-based molecular biomarker for subAR using peripheral blood paired with surveillance biopsies and strict clinical phenotyping algorithms for discovery and validation. At a predefined threshold, 72% to 75% of KT recipients achieved a negative biomarker test correlating with the absence of subAR (negative predictive value: 78%-88%), while a positive test was obtained in 25% to 28%, correlating with the presence of subAR (positive predictive value: 47%-61%). The clinical phenotype and biomarker independently and statistically correlated with a composite clinical endpoint (renal function, biopsy-proved acute rejection, and grade ≥2 interstitial fibrosis and tubular atrophy), as well as with de novo donor-specific antibodies. We also found that <50% showed histologic improvement of subAR on follow-up biopsies despite treatment, and that the biomarker could predict this outcome. Our data suggest that a blood-based biomarker that reduces the need for the indiscriminate use of invasive surveillance biopsies and that correlates with transplant outcomes could be used to monitor KT recipients with stable renal function, including after treatment for subAR, potentially improving KT outcomes.
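The predictive values quoted above depend on subAR prevalence as well as test sensitivity and specificity. A quick sanity check with assumed (not reported) operating characteristics shows how numbers in these ranges arise:

```python
# Bayes' rule for predictive values; all three inputs are assumptions
# chosen only to land in the ballpark of the ranges quoted above.
prevalence = 0.30     # assumed subAR rate on surveillance biopsy
sensitivity = 0.55
specificity = 0.83

ppv = (sensitivity * prevalence) / (
    sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
npv = (specificity * (1 - prevalence)) / (
    specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
print(f"PPV {ppv:.0%}, NPV {npv:.0%}")  # ~58% and ~81% with these inputs
```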
Subject(s)
Biomarkers/blood , Graft Rejection/diagnosis , Kidney Transplantation , Adult , Aged , Algorithms , Biopsy , Female , Fibrosis/diagnosis , Glomerular Filtration Rate , Graft Rejection/blood , Graft Survival , Humans , Male , Middle Aged , Phenotype , Predictive Value of Tests , Treatment Outcome , Young Adult
ABSTRACT
BACKGROUND: A report from a high-volume single center indicated a survival benefit of receiving a kidney transplant from an HLA-incompatible live donor as compared with remaining on the waiting list, whether or not a kidney from a deceased donor was received. The generalizability of that finding is unclear. METHODS: In a 22-center study, we estimated the survival benefit for 1025 recipients of kidney transplants from HLA-incompatible live donors who were matched with controls who remained on the waiting list or received a transplant from a deceased donor (waiting-list-or-transplant control group) and controls who remained on the waiting list but did not receive a transplant (waiting-list-only control group). We analyzed the data with and without patients from the highest-volume center in the study. RESULTS: Recipients of kidney transplants from incompatible live donors had a higher survival rate than either control group at 1 year (95.0%, vs. 94.0% for the waiting-list-or-transplant control group and 89.6% for the waiting-list-only control group), 3 years (91.7% vs. 83.6% and 72.7%, respectively), 5 years (86.0% vs. 74.4% and 59.2%), and 8 years (76.5% vs. 62.9% and 43.9%) (P<0.001 for all comparisons with the two control groups). The survival benefit was significant at 8 years across all levels of donor-specific antibody: 89.2% for recipients of kidney transplants from incompatible live donors who had a positive Luminex assay for anti-HLA antibody but a negative flow-cytometric cross-match versus 65.0% for the waiting-list-or-transplant control group and 47.1% for the waiting-list-only control group; 76.3% for recipients with a positive flow-cytometric cross-match but a negative cytotoxic cross-match versus 63.3% and 43.0% in the two control groups, respectively; and 71.0% for recipients with a positive cytotoxic cross-match versus 61.5% and 43.7%, respectively. The findings did not change when patients from the highest-volume center were excluded. CONCLUSIONS: This multicenter study validated single-center evidence that patients who received kidney transplants from HLA-incompatible live donors had a substantial survival benefit as compared with patients who did not undergo transplantation and those who waited for transplants from deceased donors. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases.).
Subject(s)
Histocompatibility , Kidney Transplantation , Living Donors , Graft Survival , HLA Antigens , Histocompatibility Testing , Humans , Kidney Transplantation/mortality , Survival Analysis , Tissue and Organ Procurement , Waiting Lists
ABSTRACT
Thirty percent of kidney transplant recipients are readmitted in the first month posttransplantation. Those with donor-specific antibody requiring desensitization and incompatible live donor kidney transplantation (ILDKT) constitute a unique subpopulation that might be at higher readmission risk. Drawing on a 22-center cohort, 379 ILDKT recipients with Medicare primary insurance were matched to compatible transplant controls and to waitlist-only controls on panel reactive antibody, age, blood group, renal replacement time, prior kidney transplantation, race, gender, diabetes, and transplant date/waitlisting date. Readmission risk was determined using multilevel, mixed-effects Poisson regression. In the first month, ILDKT recipients had a 1.28-fold higher readmission risk than compatible controls (95% confidence interval [CI] 1.13-1.46; P < .001). Risk peaked at 6-12 months (relative risk [RR] 1.67, 95% CI 1.49-1.87; P < .001), attenuating by 24-36 months (RR 1.24, 95% CI 1.10-1.40; P < .001). ILDKT recipients had a 5.86-fold higher readmission risk (95% CI 4.96-6.92; P < .001) in the first month compared to waitlist-only controls. At 12-24 months (RR 0.85, 95% CI 0.77-0.95; P = .002) and 24-36 months (RR 0.74, 95% CI 0.66-0.84; P < .001), ILDKT recipients had a lower risk than waitlist-only controls. These findings, a higher readmission risk than compatible controls but a lower readmission risk after the first year than waitlist-only controls, should be considered in regulatory/payment schemas and in planning clinical care.
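The rate ratios above come from multilevel mixed-effects Poisson regression. The simplified sketch below fits a plain Poisson model with a person-time offset on simulated data to show how such relative risks are obtained; it omits the matching and multilevel structure, and all names and values are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy data: readmission counts with person-time of follow-up.
rng = np.random.default_rng(1)
n = 400
ildkt = rng.integers(0, 2, n)                    # 1 = ILDKT, 0 = control
persontime = rng.uniform(0.5, 3.0, n)            # years of follow-up
lam = persontime * 0.8 * np.exp(0.25 * ildkt)    # higher rate for ILDKT
readmits = rng.poisson(lam)
df = pd.DataFrame({"readmits": readmits, "ildkt": ildkt,
                   "persontime": persontime})

# Poisson GLM with log person-time offset; exp(coefficient) = rate ratio.
fit = smf.glm("readmits ~ ildkt", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["persontime"])).fit()
print(np.exp(fit.params["ildkt"]))
```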
Subject(s)
Blood Group Incompatibility/immunology , HLA Antigens/immunology , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Living Donors/supply & distribution , Patient Readmission/statistics & numerical data , Postoperative Complications , Adult , Case-Control Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Survival , Hospitalization/statistics & numerical data , Humans , Isoantibodies/blood , Isoantibodies/immunology , Kidney Function Tests , Male , Middle Aged , Prognosis , Risk Factors
ABSTRACT
Transplantation of liver grafts from donation after cardiac death (DCD) is limited. To identify barriers to DCD liver utilization, all active US liver transplant centers (n = 138) were surveyed, and the responses were compared with United Network for Organ Sharing (UNOS) data. In total, 74 (54%) centers responded, and diversity in attitudes was observed, with many not using organ and/or recipient prognostic variables defined in prior studies and UNOS data analysis. Most centers (74%) believed lack of a system allowing a timely retransplant is a barrier to utilization. UNOS data demonstrated worse 1- and 5-year patient survival (PS) and graft survival (GS) in DCD recipients (PS, 86% and 64%; GS, 82% and 59%, respectively) versus donation after brain death (DBD) recipients (PS, 90% and 71%; GS, 88% and 69%, respectively). Donor alanine aminotransferase (ALT), recipient Model for End-Stage Liver Disease (MELD) score, and cold ischemia time (CIT) significantly impacted DCD outcomes to a greater extent than DBD outcomes. At 3 years, relisting and retransplant rates were 7.9% and 4.6% higher in DCD recipients. To optimize outcomes, our data support the use of DCD liver grafts with CIT <6-8 hours in patients with MELD ≤20. In conclusion, standardization of donor and recipient criteria, defining the impact of ischemic cholangiopathy, addressing donor hospital policies, and developing a strategy for timely retransplant may help to expand the use of these organs.
Subject(s)
End Stage Liver Disease/surgery , Graft Survival , Liver Transplantation/methods , Practice Patterns, Physicians'/standards , Tissue and Organ Procurement/standards , Adult , Allografts/pathology , Allografts/transplantation , Attitude , Cold Ischemia/adverse effects , End Stage Liver Disease/mortality , Graft Rejection/epidemiology , Humans , Liver/pathology , Liver Transplantation/adverse effects , Liver Transplantation/psychology , Liver Transplantation/statistics & numerical data , Middle Aged , Practice Guidelines as Topic , Practice Patterns, Physicians'/organization & administration , Practice Patterns, Physicians'/statistics & numerical data , Retrospective Studies , Severity of Illness Index , Surveys and Questionnaires , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/organization & administration , Transplants , Treatment Outcome , United States
ABSTRACT
Many primate species currently subsist in fragmented and anthropogenically disturbed habitats. Different threats arise depending on the species' life history strategy, dietary requirements and habitat preference. Additionally, anthropogenic disturbance is far from uniform and may affect individual forest fragments in a single landscape in differing ways. We studied the effects of fragmentation on three species of diurnal primate, Cebus albifrons, Alouatta seniculus and Ateles hybridus, in Magdalena Valley, Colombia. We tested the assumption that generalist species are more resilient than specialist species to habitat degradation by examining the fragments' vegetation and spatial structure and how these affected primate presence and abundance patterns. We found C. albifrons, a generalist, to be the most abundant species in 9 of 10 forest fragments, regardless of the level of habitat disturbance. A. hybridus, a large-bodied primate with a specialist diet, was either absent or low in abundance in fragments that had experienced recent disturbances and was found only in higher-quality fragments, regardless of the fragment size. A. seniculus, a species considered to have a highly flexible diet and the ability to survive in degraded habitat, was found in intermediate abundances between those of Cebus spp. and Ateles spp., and was more frequently found in high-quality fragments.
Subject(s)
Alouatta/physiology , Atelinae/physiology , Cebus/physiology , Forests , Plant Dispersal/physiology , Adaptation, Biological , Animals , Colombia , Confounding Factors, Epidemiologic , Conservation of Natural Resources , Ecosystem , Geographic Information Systems , Population Density , Satellite Imagery
ABSTRACT
BACKGROUND: While surgical resection of pancreatic adenocarcinoma provides the only chance of cure, long-term survival remains poor. Immunotherapy may improve outcomes, especially as an adjuvant to local therapies. Gene-mediated cytotoxic immunotherapy (GMCI) generates a systemic anti-tumor response through local delivery of an adenoviral vector expressing the HSV-tk gene (aglatimagene besadenovec, AdV-tk) followed by an anti-herpetic prodrug. GMCI has demonstrated synergy with standard of care (SOC) in other tumor types. This is the first application in pancreatic cancer. METHODS: Four dose levels (3 × 10¹⁰ to 1 × 10¹² vector particles) were evaluated as adjuvant to surgery for resectable disease (Arm A) or to 5-FU chemoradiation for locally advanced disease (Arm B). Each patient received two cycles of AdV-tk + prodrug. RESULTS: Twenty-four patients completed therapy, 12 per arm, with no dose-limiting toxicities. All Arm A patients were explored; eight were resected, one was locally advanced, and three had distant metastases. CD8+ T cell infiltration increased an average of 22-fold (range sixfold to 75-fold) compared with baseline (p = 0.0021). PD-L1 expression increased in 5/7 samples analyzed. One node-positive resected patient is alive >66 months without recurrence. Arm B RECIST response rate was 25% with a median OS of 12 months and 1-year survival of 50%. Patient-reported quality of life showed no evidence of deterioration. CONCLUSIONS: AdV-tk can be safely combined with pancreatic cancer SOC without added toxicity. Response and survival compare favorably to expected outcomes, and immune activity increased. These results support further evaluation of GMCI with more modern chemoradiation and surgery as well as PD-1/PD-L1 inhibitors in pancreatic cancer.
Subject(s)
Acyclovir/analogs & derivatives , Adenocarcinoma/therapy , Genetic Therapy/methods , Immunotherapy/methods , Pancreatic Neoplasms/therapy , Valine/analogs & derivatives , Acyclovir/administration & dosage , Adenocarcinoma/genetics , Adenocarcinoma/immunology , Adenocarcinoma/pathology , Adenocarcinoma/surgery , Adenoviridae/genetics , Adenoviridae/immunology , Adult , Aged , Chemoradiotherapy , Combined Modality Therapy , Dose-Response Relationship, Drug , Female , Genetic Vectors/genetics , Genetic Vectors/immunology , Humans , Immunohistochemistry , Male , Middle Aged , Neoplasm Recurrence, Local , Pancreatic Neoplasms/genetics , Pancreatic Neoplasms/immunology , Pancreatic Neoplasms/surgery , Thymidine Kinase/genetics , Valacyclovir , Valine/administration & dosage , Pancreatic Neoplasms
ABSTRACT
The functionalisation of organic linkers in metal-organic frameworks (MOFs) to improve gas uptake is well-documented. Although the positive role of free carboxylic acid sites in MOFs for binding gas molecules has been proposed in computational studies, relatively little experimental evidence has been reported in support of this. This is primarily because of the inherent synthetic difficulty of preparing MOF materials bearing free, accessible -COOH moieties, which would normally bind to metal ions within the framework structure. Here, we describe the direct binding of CO2 and C2H2 molecules to the free -COOH sites within the pores of MFM-303(Al). MFM-303(Al) exhibits highly selective adsorption of CO2 and C2H2, with a high selectivity for C2H2 over C2H4. In situ synchrotron X-ray diffraction and inelastic neutron scattering, coupled with modelling, highlight the cooperative interactions of adsorbed CO2 and C2H2 molecules with the free -COOH and -OH sites within MFM-303(Al), thus rationalising the observed high selectivity for gas separation.
ABSTRACT
BACKGROUND: Despite immunization, patients on antineoplastic and immunomodulating agents have a heightened risk of COVID-19 infection. However, accurately attributing this risk to specific medications remains challenging. METHODS: An observational cohort study from December 11, 2020 to September 22, 2022, within a large healthcare system in San Diego, California, USA was designed to identify the medications associated with the greatest risk of postimmunization SARS-CoV-2 infection. Adults prescribed WHO Anatomical Therapeutic Chemical (ATC) classified antineoplastic and immunomodulating medications were matched (by age, sex, race, and number of immunizations) with control patients not prescribed these medications, yielding a population of 26 724 patients for analysis. From this population, 218 blood samples were collected from an enrolled subset to assess serological response and cytokine profile in relation to immunization. RESULTS: Prescription of WHO ATC classified antineoplastic and immunomodulatory agents was associated with an elevated postimmunization SARS-CoV-2 infection risk (HR 1.50, 95% CI 1.38 to 1.63). While multiple immunization doses demonstrated a decreased association with postimmunization SARS-CoV-2 infection risk, antineoplastic and immunomodulatory treated patients who received four doses remained at heightened risk (HR 1.23, 95% CI 1.06 to 1.43). Risk variation was identified among medication subclasses, with PD-1/PD-L1 inhibiting monoclonal antibodies, calcineurin inhibitors, and CD20 monoclonal antibody inhibitors identified as associated with increased risk of postimmunization SARS-CoV-2 infection. Antineoplastic and immunomodulatory treated patients also displayed a reduced IgG antibody response to SARS-CoV-2 epitopes alongside a unique serum cytokine profile. CONCLUSIONS: Antineoplastic and immunomodulating medications were associated with an elevated risk of postimmunization SARS-CoV-2 infection in a drug-specific manner. This comprehensive, unbiased analysis of all WHO ATC classified antineoplastic and immunomodulating medications identifies the medications associated with the greatest risk. These findings are crucial in guiding and refining vaccination strategies for patients prescribed these treatments, ensuring optimized protection for this susceptible population in future COVID-19 variant surges and potentially for other RNA immunization targets.
Subject(s)
Antineoplastic Agents , COVID-19 , Adult , Humans , SARS-CoV-2 , Immunomodulating Agents , Antibody Formation , Breakthrough Infections , Cytokines
ABSTRACT
Hydrologic reconstructions from North America are largely unknown for the Middle Miocene. Examination of fungal palynomorph assemblages, coupled with traditional plant-based palynology, permits delineation of local, as opposed to regional, climate signals and provides a baseline for the study of ancient fungas. Here, the Fungi in a Warmer World project presents the paleoecology and paleoclimatology of 351 fungal morphotypes from three sites in the United States: the Clarkia Konservat-Lagerstätte site (Idaho), the Alum Bluff site (Florida), and the Bouie River site (Mississippi). Of these, 83 fungi are identified as extant taxa and 41 are newly reported from the Miocene. Combining new plant-based paleoclimatic reconstructions with funga-based paleoclimate reconstructions, we demonstrate cooling and hydrologic changes from the Miocene climate optimum to the Serravallian. In the southeastern United States, the reconstructed cooling is comparable to that obtained from pollen and paleobotany alone. In the northwestern United States, cooling is greater than indicated by other reconstructions, and hydrology shifts seasonally, from no dry season to a dry summer season. Our results demonstrate the utility of fossil fungi as paleoecologic and paleoclimatic proxies and show that warmer-than-modern geological time intervals do not match the "wet gets wetter, dry gets drier" paradigm. Instead, both plants and fungi show an invigorated hydrological cycle across mid-latitude North America.
ABSTRACT
Mycophenolate mofetil (MMF) and sirolimus (SRL) have been used for calcineurin inhibitor (CNI) minimization to reduce nephrotoxicity following liver transplantation. In this prospective, open-label, multicenter study, patients undergoing transplantation from July 2005 to June 2007 who were maintained on MMF/CNI were randomized 4 to 12 weeks after transplantation to receive MMF/SRL (n = 148) or continue MMF/CNI (n = 145) and were included in the intent-to-treat population. The primary efficacy endpoints were the mean percentage change in the calculated glomerular filtration rate (GFR) and a composite of biopsy-proven acute rejection (BPAR), graft loss, death, and loss to follow-up 12 months after transplantation. Patients were followed for a median of 519 days after randomization. MMF/SRL was associated with a significantly greater improvement in renal function from baseline, with a mean percentage change in GFR of 19.7 ± 40.6 (versus 1.2 ± 39.9 for MMF/CNI, P = 0.0012). The composite endpoint demonstrated the noninferiority of MMF/SRL versus MMF/CNI (16.4% versus 15.4%, 90% confidence interval = -7.1% to 9.0%). The incidence of BPAR was significantly greater with MMF/SRL (12.2%) versus MMF/CNI (4.1%, P = 0.02). Graft loss (including death) occurred in 3.4% of the MMF/SRL-treated patients and in 8.3% of the MMF/CNI-treated patients (P = 0.04). Malignancy-related deaths were less frequent with MMF/SRL. Adverse events caused withdrawal for 34.2% of the MMF/SRL-treated patients and for 24.1% of the MMF/CNI-treated patients (P = 0.06). The use of MMF/SRL is an option for liver transplant recipients who can benefit from improved renal function but is associated with an increased risk of rejection (but not graft loss).
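The noninferiority conclusion above rests on the 90% CI for the between-group difference in the composite endpoint. A rough recomputation from the reported percentages and sample sizes, using a normal approximation; the published interval of -7.1% to 9.0% presumably reflects exact event counts and possibly a different variance method, so an exact match is not expected.

```python
import math

# 90% CI for the difference in composite endpoint rates (Wald approximation).
# Proportions are taken from the abstract; counts are back-calculated.
n1, p1 = 148, 0.164   # MMF/SRL
n2, p2 = 145, 0.154   # MMF/CNI

diff = p1 - p2
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
z = 1.645             # two-sided 90% confidence level
lo, hi = diff - z * se, diff + z * se
print(f"{diff:+.1%} (90% CI {lo:+.1%} to {hi:+.1%})")  # ~ +1.0% (-6% to +8%)
```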