ABSTRACT
River deltas rank among the most economically and ecologically valuable environments on Earth. Even in the absence of sea-level rise, deltas are increasingly vulnerable to coastal hazards as declining sediment supply and climate change alter their sediment budget, affecting delta morphology and possibly leading to erosion (1-3). However, the relationship between deltaic sediment budgets, oceanographic forces of waves and tides, and delta morphology has remained poorly quantified. Here we show how the morphology of about 11,000 coastal deltas worldwide, ranging from small bayhead deltas to mega-deltas, has been affected by river damming and deforestation. We introduce a model showing that present-day delta morphology varies across a continuum between wave (about 80 per cent), tide (around 10 per cent) and river (about 10 per cent) dominance, but that most large deltas are tide- and river-dominated. Over the past 30 years, despite sea-level rise, deltas globally have experienced a net land gain of 54 ± 12 square kilometres per year (2 standard deviations), with the largest 1 per cent of deltas being responsible for 30 per cent of all net land area gains. Humans are a considerable driver of these net land gains: 25 per cent of delta growth can be attributed to deforestation-induced increases in fluvial sediment supply. Yet for nearly 1,000 deltas, river damming (4) has resulted in a severe (more than 50 per cent) reduction in anthropogenic sediment flux, forcing a collective loss of 12 ± 3.5 square kilometres per year (2 standard deviations) of deltaic land. Not all deltas lose land in response to river damming: deltas transitioning towards tide dominance are currently gaining land, probably through channel infilling. With expected accelerated sea-level rise (5), however, recent land gains are unlikely to be sustained throughout the twenty-first century.
Understanding the redistribution of sediments by waves and tides will be critical for successfully predicting human-driven change to deltas, both locally and globally.
Subject(s)
Conservation of Natural Resources/statistics & numerical data , Geologic Sediments/analysis , Power Plants/supply & distribution , Rivers , Water Movements , Climate Change/statistics & numerical data , Geographic Mapping , Human Activities/statistics & numerical data , Humans , Internationality , Models, Theoretical , Sea Level Rise/statistics & numerical data
ABSTRACT
Alcohol-associated liver disease (ALD) is a major cause of morbidity and mortality worldwide, and a leading indication for liver transplantation (LT) in many countries, including the United States. However, LT for ALD is a complex and evolving field with ethical, social, and medical challenges. Thus, it requires a multidisciplinary approach and individualized decision-making. Short-term and long-term patient and graft survival of patients undergoing LT for ALD are comparable to other indications, but there is a continued need to develop better tools to identify patients who may benefit from LT, improve the pretransplant and posttransplant management of ALD, and evaluate the impact of LT for ALD on the organ donation and transplantation systems. In this review, we summarize the current evidence on LT for ALD, from alcohol-associated hepatitis to decompensated alcohol-associated cirrhosis. We discuss the indications, criteria, outcomes, and controversies of LT for these conditions and highlight the knowledge gaps and research priorities in this field.
ABSTRACT
Robotic surgery is an emerging minimally invasive option for living donor hepatectomy. Currently, there are no studies on the learning curve of robotic donor hepatectomy. Thus, we evaluated the learning curve for robotic donor right hepatectomy (RH). We retrospectively reviewed prospectively collected data from consecutive living donors who underwent robotic hepatectomy at two specialized centers between 2016 and 2022. We estimated the number of cases required to achieve stable operating times for robotic donor RH using cumulative sum (CUSUM) analysis. The complication rates were similar between the two centers (22.8% vs. 26.7%; p=0.74). Most complications were graded as minor (70.4%). Analysis of the total operative time demonstrated that the learning curves reached a peak at the 17th case in Center 1 and the 9th case in Center 2. The average operation times for cases 1-17 versus 18-99 in Center 1 were 603 versus 438 minutes (p<0.001), and cases 1-9 versus 10-15 in Center 2 were 532 versus 418 minutes (p=0.002). Complication rates were lower after the learning curves were achieved, although this did not reach statistical significance. A comparison of outcomes between centers suggests that a standardized approach to this complex operation can be successfully transferred.
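The learning-curve analysis above rests on cumulative sum (CUSUM) charting of operative times, in which the running sum of deviations from a reference time peaks at the inflection case. A minimal sketch of that idea, using hypothetical operative times rather than the study's data:

```python
# Illustrative CUSUM learning-curve sketch (hypothetical operative times,
# not the study's data): the CUSUM at case i is the running sum of
# (time_i - target), and its peak marks the learning-curve inflection.

def cusum(times, target):
    """Cumulative sum of deviations from a target operative time."""
    total, out = 0.0, []
    for t in times:
        total += t - target
        out.append(total)
    return out

# Hypothetical series: ten slow early cases (600 min), then faster cases (430 min).
times = [600] * 10 + [430] * 20
target = sum(times) / len(times)  # overall mean, a common CUSUM reference

curve = cusum(times, target)
peak_case = curve.index(max(curve)) + 1  # 1-based case number at the peak
print(peak_case)  # the curve declines after this case, i.e. times have stabilized
```

With these made-up numbers the peak falls at case 10, mirroring how the 17th and 9th cases were identified at the two centers.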
ABSTRACT
Intrahepatic cholangiocarcinoma (iCCA) is a rare biliary tract cancer with a high mortality rate. Complete resection of the iCCA lesion is the first choice of treatment, with good prognosis after margin-negative resection. Unfortunately, only 12%-40% of patients are eligible for resection at presentation due to cirrhosis, portal hypertension, or large tumor size. Liver transplantation (LT) offers margin-negative iCCA extirpation for patients with unresectable tumors. Initially, iCCA was a contraindication for LT until size-based selection criteria were introduced to identify patients with satisfactory post-LT outcomes. Recent studies have shown that tumor biology-based selection can yield high post-LT survival in patients with locally advanced iCCA. Another selection criterion is the tumor response to neoadjuvant therapy: patients who respond to neoadjuvant therapy have better outcomes after LT than those who do not. Biomarkers provide a further index for predicting treatment outcome. Improved survival outcomes have also opened the door for living donor LT for iCCA. Patients undergoing LT for iCCA now have statistically similar survival rates as patients undergoing resection. The combination of surgery and locoregional and systemic therapies improves the prognosis of iCCA patients.
Subject(s)
Bile Duct Neoplasms , Cholangiocarcinoma , Liver Transplantation , Humans , Liver Transplantation/adverse effects , Cholangiocarcinoma/pathology , Treatment Outcome , Bile Duct Neoplasms/pathology , Bile Ducts, Intrahepatic/surgery
ABSTRACT
PURPOSE OF REVIEW: The purpose of this review is to both summarize the current knowledge of hepatocellular carcinoma molecular biology and to suggest a framework in which to prospectively translate this knowledge into patient care. This is timely as recent guidelines recommend increased use of these technologies to advance personalized liver cancer care. RECENT FINDINGS: The main themes covered here address germline and somatic genetic alterations recently discovered in hepatocellular carcinoma, largely owing to next generation sequencing technologies, and nascent efforts to translate these into contemporary practice. SUMMARY: Early efforts of translating molecular profiling to hepatocellular carcinoma care demonstrate a growing number of potentially actionable alterations. Still lacking are a consensus on what biomarkers and technologies to adopt, at what scale and cost, and how to integrate them most effectively into care.
Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Humans , Carcinoma, Hepatocellular/genetics , Carcinoma, Hepatocellular/therapy , Carcinoma, Hepatocellular/pathology , Liver Neoplasms/genetics , Liver Neoplasms/therapy , Liver Neoplasms/pathology , High-Throughput Nucleotide Sequencing
ABSTRACT
OBJECTIVE: To design and establish a prospective biospecimen repository that integrates multi-omics assays with clinical data to study mechanisms of controlled injury and healing. BACKGROUND: Elective surgery is an opportunity to understand both the systemic and focal responses accompanying controlled and well-characterized injury to the human body. The overarching goal of this ongoing project is to define stereotypical responses to surgical injury, with the translational purpose of identifying targetable pathways involved in healing and resilience, and variations indicative of aberrant peri-operative outcomes. METHODS: Clinical data from the electronic medical record combined with large-scale biological data sets derived from blood, urine, fecal matter, and tissue samples are collected prospectively through the peri-operative period on patients undergoing 14 surgeries chosen to represent a range of injury locations and intensities. Specimens are subjected to genomic, transcriptomic, proteomic, and metabolomic assays to describe their genetic, metabolic, immunologic, and microbiome profiles, providing a multidimensional landscape of the human response to injury. RESULTS: The highly multiplexed data generated include changes in over 28,000 mRNA transcripts, 100 plasma metabolites, 200 urine metabolites, and 400 proteins over the longitudinal course of surgery and recovery. In our initial pilot dataset, we demonstrate the feasibility of collecting high-quality multi-omic data at pre- and postoperative time points and are already seeing evidence of physiologic perturbation between timepoints. CONCLUSIONS: This repository allows for longitudinal, state-of-the-art genomic, transcriptomic, proteomic, metabolomic, immunologic, and clinical data collection and provides a rich and stable infrastructure on which to fuel further biomedical discovery.
Subject(s)
Computational Biology , Proteomics , Genomics , Humans , Metabolomics , Prospective Studies , Proteomics/methods
ABSTRACT
Pancreatic cancer, a highly aggressive tumour type with uniformly poor prognosis, exemplifies the classically held view of stepwise cancer development. The current model of tumorigenesis, based on analyses of precursor lesions, termed pancreatic intraepithelial neoplasia (PanIN) lesions, makes two predictions: first, that pancreatic cancer develops through a particular sequence of genetic alterations (KRAS, followed by CDKN2A, then TP53 and SMAD4); and second, that the evolutionary trajectory of pancreatic cancer progression is gradual because each alteration is acquired independently. A shortcoming of this model is that clonally expanded precursor lesions do not always belong to the tumour lineage, indicating that the evolutionary trajectories of the tumour lineage and precursor lesions can be divergent. This prevailing model of tumorigenesis has contributed to the clinical notion that pancreatic cancer evolves slowly and presents at a late stage. However, the propensity for this disease to rapidly metastasize and the inability to improve patient outcomes, despite efforts aimed at early detection, suggest that pancreatic cancer progression is not gradual. Here, using newly developed informatics tools, we tracked changes in DNA copy number and their associated rearrangements in tumour-enriched genomes and found that pancreatic cancer tumorigenesis is neither gradual nor consistent with the accepted mutation order. Two-thirds of tumours harbour complex rearrangement patterns associated with mitotic errors, consistent with punctuated equilibrium as the principal evolutionary trajectory. In a subset of cases, the consequence of such errors is the simultaneous, rather than sequential, knockout of canonical preneoplastic genetic drivers that are likely to set off invasive cancer growth. These findings challenge the current progression model of pancreatic cancer and provide insights into the mutational processes that give rise to these aggressive tumours.
Subject(s)
Carcinogenesis/genetics , Carcinogenesis/pathology , Gene Rearrangement/genetics , Genome, Human/genetics , Models, Biological , Mutagenesis/genetics , Pancreatic Neoplasms/genetics , Pancreatic Neoplasms/pathology , Carcinoma in Situ/genetics , Chromothripsis , DNA Copy Number Variations/genetics , Disease Progression , Evolution, Molecular , Female , Genes, Neoplasm/genetics , Humans , Male , Mitosis/genetics , Mutation/genetics , Neoplasm Invasiveness/genetics , Neoplasm Invasiveness/pathology , Neoplasm Metastasis/genetics , Neoplasm Metastasis/pathology , Polyploidy , Precancerous Conditions/genetics
ABSTRACT
Induction therapy with rabbit anti-thymocyte globulin (rATG) in low-risk kidney transplant recipients (KTR) remains controversial, given the associated increased risk of cytomegalovirus (CMV) infection. This natural experiment compared 12-month clinical outcomes in low-risk KTR without CMV prophylaxis (January 3, 2013-September 16, 2015) receiving no induction or a single 3 mg/kg dose of rATG. We used logistic regression to characterize delayed graft function (DGF), negative binomial regression to characterize length of hospital stay (LOS), and Cox regression to characterize acute rejection (AR), CMV infection, graft loss, death, and hospital readmissions. Recipients receiving 3 mg/kg rATG had an 81% lower risk of AR (aHR 0.19, 95% CI 0.14-0.25; P < 0.001) but no increased rate of hospital readmissions because of infections (0.91, 95% CI 0.68-1.21; P = 0.5). There was no association between 3 mg/kg rATG and CMV infection/disease (aHR 1.10, 95% CI 0.86-1.40; P = 0.5), even when the analysis was stratified according to recipient CMV serostatus positive (aHR 1.25, 95% CI 0.94-1.65; P = 0.1) and negative (aHR 0.57, 95% CI 0.28-1.16; P = 0.1). There was no association between 3 mg/kg rATG and mortality (aHR 1.25, 95% CI 0.51-3.08; P = 0.6) or graft loss (aHR 0.73, 95% CI 0.34-1.55; P = 0.4). Among low-risk KTR receiving no CMV pharmacological prophylaxis, 3 mg/kg rATG induction was associated with a significant reduction in the incidence of AR without an increased risk of CMV infection, regardless of recipient pretransplant CMV serostatus.
Subject(s)
Cytomegalovirus Infections , Kidney Transplantation , Antilymphocyte Serum , Cytomegalovirus , Cytomegalovirus Infections/epidemiology , Graft Rejection/prevention & control , Humans , Immunosuppressive Agents , Incidence , Kidney Transplantation/adverse effects , Retrospective Studies , Transplant Recipients
ABSTRACT
Frailty, a measure of physiologic reserve, is associated with poor outcomes and mortality among kidney transplant (KT) candidates and recipients. National estimates of frailty in this population are lacking; such estimates could inform patient counseling and resource allocation at transplant centers. We studied 4616 KT candidates and 1763 recipients in our multicenter prospective cohort of frailty from 2008-2018 with Fried frailty measurements. Using Scientific Registry of Transplant Recipients (SRTR) data (KT candidates = 560 143 and recipients = 243 508), we projected the national prevalence of frailty (for KT candidates and recipients separately) using standardization through inverse probability weighting, accounting for candidate/recipient, donor, and transplant factors. In our multicenter cohort, 13.3% of KT candidates were frail at evaluation; 8.2% of LDKT recipients and 17.8% of DDKT recipients were frail at transplantation. Projected nationally, our modeling strategy estimated that 91 738 KT candidates, or 16.4% (95% confidence interval [CI] 14.4%-18.4%) of all KT candidates during the study period, were frail, and that 34 822 KT recipients, or 14.3% (95% CI 12.3%-16.3%) of all KT recipients, were frail (LDKT = 8.2%; DDKT = 17.8%). Given the estimated national prevalence of frailty, transplant programs should consider assessing the condition during KT evaluation to improve patient counseling and resource allocation along with identification of recipients at risk for poor outcomes.
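The projection above standardizes a cohort prevalence to the national registry population via inverse probability weighting: each cohort member is weighted by the inverse of their estimated probability of appearing in the cohort, so the weighted prevalence targets the registry rather than the study centers. A minimal sketch with hypothetical numbers (not the study's data or weights):

```python
# IPW-standardized prevalence sketch (hypothetical values): members with a
# low probability of being sampled get large weights, correcting for the
# cohort's over- or under-representation of subgroups.

def weighted_prevalence(frail_flags, sampling_probs):
    """Weighted prevalence: total weight among the frail over total weight."""
    weights = [1.0 / p for p in sampling_probs]
    frail_weight = sum(w for w, f in zip(weights, frail_flags) if f)
    return frail_weight / sum(weights)

# Hypothetical five-person cohort in which frail candidates were oversampled
# (higher inclusion probability), inflating the crude prevalence.
frail_flags    = [1, 1, 0, 0, 0]
sampling_probs = [0.50, 0.50, 0.25, 0.25, 0.25]

crude = sum(frail_flags) / len(frail_flags)
standardized = weighted_prevalence(frail_flags, sampling_probs)
print(crude, standardized)  # the standardized estimate is lower than the crude one
```

In the actual study the inclusion probabilities are modeled from candidate/recipient, donor, and transplant factors; here they are simply asserted.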
Subject(s)
Frailty , Kidney Transplantation , Frailty/epidemiology , Humans , Prevalence , Prospective Studies , Registries , Risk Factors , Transplant Recipients , United States/epidemiology
ABSTRACT
OBJECTIVE: We sought to compare kidney transplantation outcomes between Veterans Affairs (VA) and non-VA transplant centers. SUMMARY BACKGROUND DATA: Transplant care at the VA has previously been scrutinized due to geographic and systemic barriers. The recently instituted MISSION Act took effect on June 6, 2019, enabling veteran access to surgical care at civilian hospitals if certain eligibility criteria are met. METHODS: We evaluated observed-to-expected outcome ratios (O:E) for graft loss and mortality using the Scientific Registry of Transplant Recipients database for all kidney transplants during a 15-year period (July 1, 2001-June 30, 2016). Of 229,188 kidney transplants performed during the study period, 1508 were performed at VA centers (N = 7), 7750 at the respective academic institutions affiliated with these VA centers, and 227,680 at non-VA centers nationwide (N = 286). RESULTS: Aggregate O:E ratios for mortality were lower in VA centers compared with non-VA centers at 1 month and 1 year (O:E = 0.27 vs 1.00, P = 0.03 and O:E = 0.62 vs 1.00, P = 0.03, respectively). Graft loss at 1 month and 1 year was similar between groups (O:E = 0.65 vs 1.00, P = 0.11 and O:E = 0.79 vs 1.00, P = 0.15, respectively). Ratios for mortality and graft loss were similar between VA centers and their respective academic affiliates. Additionally, a subgroup analysis for graft loss and mortality at 3 years (study period January 1, 2009-December 31, 2013) demonstrated no significant differences between VA centers, VA affiliates, and all non-VA centers. CONCLUSIONS: Despite low clinical volume, VA centers offer excellent outcomes in kidney transplantation. Veteran referral to civilian hospitals should weigh the benefit of geographic convenience and patient preference with center outcomes.
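The O:E ratios above follow the standard risk-adjusted form: observed events at a center divided by the sum of model-predicted per-patient risks, so a ratio below 1 means fewer events than the center's case mix predicts. A minimal sketch with hypothetical numbers (not SRTR's risk model):

```python
# Observed-to-expected (O:E) ratio sketch (illustrative values): expected
# events are the sum of per-recipient predicted risks from a risk-adjustment
# model; O:E < 1 indicates better-than-expected outcomes.

def oe_ratio(observed_events, expected_risks):
    """O:E ratio for one center."""
    return observed_events / sum(expected_risks)

# Hypothetical center: 2 observed deaths against 8 recipients whose modeled
# one-year mortality risks sum to 4.0 expected deaths.
expected_risks = [0.5] * 8
ratio = oe_ratio(2, expected_risks)
print(ratio)  # 0.5: half the mortality the case mix predicts
```

An O:E of 0.27, as reported for VA centers at 1 month, would correspond to roughly a quarter of the model-expected deaths.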
Subject(s)
Forecasting , Hospitals, Veterans/statistics & numerical data , Kidney Transplantation/statistics & numerical data , Postoperative Complications/epidemiology , Registries , Transplant Recipients/statistics & numerical data , Follow-Up Studies , Hospitals/statistics & numerical data , Humans , Incidence , Retrospective Studies , Survival Rate/trends , United States/epidemiology , United States Department of Veterans Affairs/statistics & numerical data
ABSTRACT
BACKGROUND: Disability in general has been associated with poor outcomes in kidney transplant (KT) recipients. However, disability can be derived from various components, specifically visual, hearing, physical and walking impairments. Different impairments may compromise the patient through different mechanisms and might impact different aspects of KT outcomes. METHODS: In our prospective cohort study (June 2013-June 2017), 465 recipients reported hearing, visual, physical and walking impairments before KT. We used hybrid registry-augmented Cox regression, adjusting for confounders using the US KT population (Scientific Registry of Transplant Recipients, N = 66 891), to assess the independent association between impairments and post-KT outcomes [death-censored graft failure (DCGF) and mortality]. RESULTS: In our cohort of 465 recipients, 31.6% reported one or more impairments (hearing 9.3%, visual 16.6%, physical 9.1%, walking 12.1%). Visual impairment was associated with a 3.36-fold [95% confidence interval (CI) 1.17-9.65] higher DCGF risk, whereas hearing [2.77 (95% CI 0.78-9.82)], physical [0.67 (95% CI 0.08-3.35)] and walking [0.50 (95% CI 0.06-3.89)] impairments were not. Walking impairment was associated with a 3.13-fold (95% CI 1.32-7.48) higher mortality risk, whereas visual [1.20 (95% CI 0.48-2.98)], hearing [1.01 (95% CI 0.29-3.47)] and physical [1.16 (95% CI 0.34-3.94)] impairments were not. CONCLUSIONS: Impairments are common among KT recipients, yet only visual impairment and walking impairment are associated with adverse post-KT outcomes. Referring nephrologists and KT centers should identify recipients with visual and walking impairments who might benefit from targeted interventions pre-KT, additional supportive care and close post-KT monitoring.
Subject(s)
Disabled Persons/statistics & numerical data , Graft Rejection/mortality , Hearing Loss/physiopathology , Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Mobility Limitation , Vision Disorders/physiopathology , Adult , Female , Graft Rejection/etiology , Graft Rejection/pathology , Graft Survival , Humans , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Male , Middle Aged , Motor Activity , Prognosis , Prospective Studies , Registries , Risk Factors , Survival Rate , Transplant Recipients , Walking
ABSTRACT
BACKGROUND: Restoration of kidney function after kidney transplant generally improves cognitive function. It is unclear whether frail recipients, with higher susceptibility to surgical stressors, achieve such post-transplant cognitive improvements or whether they experience subsequent cognitive decline as they age with a functioning graft. METHODS: In this two-center cohort study, we assessed pretransplant frailty (Fried physical frailty phenotype) and cognitive function (Modified Mini-Mental State Examination) in adult kidney transplant recipients. To investigate potential short- and medium-term effects of frailty on post-transplant cognitive trajectories, we measured cognitive function up to 4 years post-transplant. Using an adjusted mixed effects model with a random slope (time) and intercept (person), we characterized post-transplant cognitive trajectories by pretransplant frailty, accounting for nonlinear trajectories. RESULTS: Of 665 recipients (mean age 52.0 years) followed for a median of 1.5 years, 15.0% were frail. After adjustment, pretransplant cognitive scores were significantly lower among frail patients compared with nonfrail patients (89.0 versus 90.8 points). By 3 months post-transplant, cognitive performance improved for both frail (slope = 0.22 points per week) and nonfrail (slope = 0.14 points per week) recipients. Between 1 and 4 years post-transplant, improvements plateaued among nonfrail recipients (slope = 0.005 points per week), whereas cognitive function declined among frail recipients (slope = -0.04 points per week). At 4 years post-transplant, cognitive scores were 5.8 points lower for frail recipients compared with nonfrail recipients. CONCLUSIONS: On average, both frail and nonfrail recipients experience short-term cognitive improvement post-transplant. However, frailty is associated with medium-term cognitive decline post-transplant. Interventions to prevent cognitive decline among frail recipients should be identified.
Subject(s)
Cognitive Dysfunction/etiology , Frailty/complications , Kidney Transplantation , Postoperative Complications/etiology , Cognition , Cohort Studies , Female , Follow-Up Studies , Humans , Male , Middle Aged , Time Factors
ABSTRACT
Several single-center reports of using HCV-viremic organs for HCV-uninfected (HCV-) recipients were recently published. We sought to characterize national utilization of HCV-exposed donors for HCV- recipients (HCV D+/R-) in kidney transplantation (KT) and liver transplantation (LT). Using SRTR data (April 1, 2015-December 2, 2018) and Gini coefficients, we studied center-level clustering of 1193 HCV D+/R- KTs and LTs. HCV-viremic (NAT+) D+/R- KTs increased from 1/month in 2015 to 22/month in 2018 (LTs: 0/month to 12/month). HCV-aviremic (Ab+/NAT-) D+/R- KTs increased from < 1/month in 2015 to 26/month in 2018 (LTs: <1/month to 8/month). HCV- recipients of viremic and aviremic kidneys spent a median (interquartile range [IQR]) of 0.7 (0.2-1.6) and 1.6 (0.4-3.5) years on the waitlist versus 1.8 (0.5-4.0) among HCV D-/R-. HCV- recipients of viremic and aviremic livers had median (IQR) MELD scores of 24 (21-30) and 25 (21-32) at transplantation versus 29 (23-36) among HCV D-/R-. 12 KT and 14 LT centers performed 81% and 76% of all viremic HCV D+/R- transplants; 11 KT and 13 LT centers performed 76% and 69% of all aviremic HCV D+/R- transplants. There have been marked increases in HCV D+/R- transplantation, although few centers are driving this practice; centers should continue to weigh the risks and benefits of HCV D+/R- transplantation.
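The center-level clustering reported above is summarized with a Gini coefficient: 0 when transplants are spread evenly across centers, approaching 1 when a handful of centers perform nearly all of them. A minimal sketch using hypothetical center volumes (not SRTR counts):

```python
# Gini coefficient sketch for center-level clustering (illustrative volumes),
# computed via the mean-absolute-difference formulation:
# G = (mean |x_i - x_j| over all pairs) / (2 * mean volume).

def gini(volumes):
    """Gini coefficient of a list of center volumes."""
    n = len(volumes)
    mean = sum(volumes) / n
    mad = sum(abs(a - b) for a in volumes for b in volumes) / (n * n)
    return mad / (2 * mean)

even   = [10, 10, 10, 10]  # transplants spread evenly across four centers
skewed = [37, 1, 1, 1]     # one center performs nearly everything
print(gini(even), gini(skewed))  # 0.0 versus a value near 0.7
```

A highly skewed distribution like the second list reflects the abstract's finding that a dozen or so centers account for three-quarters of HCV D+/R- transplants.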
Subject(s)
Hepatitis C/virology , Kidney Transplantation/methods , Liver Transplantation/methods , Practice Patterns, Physicians'/statistics & numerical data , Practice Patterns, Physicians'/trends , Tissue Donors/supply & distribution , Tissue and Organ Procurement/statistics & numerical data , Adult , Antiviral Agents/therapeutic use , Female , Follow-Up Studies , Hepacivirus/isolation & purification , Hepatitis C/drug therapy , Humans , Male , Middle Aged , Practice Patterns, Physicians'/standards , Prognosis , Transplant Recipients , Waiting Lists
ABSTRACT
In the United States, kidney donation from international (noncitizen/nonresident) living kidney donors (LKDs) is permitted; however, given the heterogeneity of healthcare systems, concerns remain regarding the international LKD practice and recipient outcomes. We studied a US cohort of 102 315 LKD transplants from 2000-2016, including 2088 international LKDs, as reported to the Organ Procurement and Transplantation Network. International LKDs were more tightly clustered among a small number of centers than domestic LKDs (Gini coefficient 0.76 vs 0.58, P < .001). Compared with domestic LKDs, international LKDs were more often young, male, Hispanic or Asian, and biologically related to their recipient (P < .001). Policy-compliant donor follow-up was substantially lower for international LKDs at 6, 12, and 24 months postnephrectomy (2015 cohort: 45%, 33%, 36% vs 76%, 71%, 70% for domestic LKDs, P < .001). Among international LKDs, Hispanic (aOR 0.36, 95% CI 0.23-0.56; P < .001) and biologically related (aOR 0.59, 95% CI 0.39-0.89; P < .01) donors were more compliant in donor follow-up than white and unrelated donors. Recipients of international living donor kidney transplant (LDKT) had similar graft failure (aHR 0.89, 95% CI 0.78-1.02; P = .1) but lower mortality (aHR 0.62, 95% CI 0.53-0.72; P < .001) compared with the recipients of domestic LDKT after adjusting for recipient, transplant, and donor factors. International LKDs may provide an alternative opportunity for living donation. However, efforts to improve international LKD follow-up and engagement are warranted.
Subject(s)
Graft Survival , Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Living Donors/supply & distribution , Tissue and Organ Harvesting/mortality , Tissue and Organ Procurement/organization & administration , Adult , Cohort Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Humans , International Agencies , Kidney Failure, Chronic/surgery , Kidney Function Tests , Male , Middle Aged , Nephrectomy/methods , Prognosis , Registries/statistics & numerical data , Risk Factors , Survival Rate , United States
ABSTRACT
Somatic mutations have been found in the mitochondria in different types of cancer cells, but it is not clear whether these affect tumorigenesis or tumor progression. We analyzed mitochondrial genomes of 268 early-stage, resected pancreatic ductal adenocarcinoma tissues and paired non-tumor tissues. We defined a mitochondrial somatic mutation (mtSNV) as a position where the difference in heteroplasmy fraction between tumor and normal sample was ≥0.2. Our analysis identified 304 mtSNVs, with at least 1 mtSNV in 61% (164 of 268) of tumor samples. The noncoding control region had the greatest proportion of mtSNVs (60 of 304 mutations); this region contains sites that regulate mitochondrial DNA transcription and replication. Frequently mutated genes included ND5, RNR2, and CO1, plus 29 mutations in transfer RNA genes. mtSNVs in 2 separate mitochondrial genes (ND4 and ND6) were associated with shorter overall survival time. This association appeared to depend on the level of mtSNV heteroplasmy. Non-random co-occurrence between mtSNVs and mutations in nuclear genes indicates interactions between nuclear and mitochondrial DNA. In an analysis of primary tumors and metastases from 6 patients, we found tumors to accumulate mitochondrial mutations as they progress.
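The mtSNV definition above is a simple threshold rule on heteroplasmy fractions, which can be sketched directly (the heteroplasmy values below are illustrative, not from the study):

```python
# Sketch of the abstract's mtSNV call rule: a mitochondrial position is a
# somatic variant when the heteroplasmy fraction (HF) differs between tumor
# and matched normal tissue by at least 0.2.

def is_mtsnv(tumor_hf, normal_hf, threshold=0.2):
    """True if the tumor-normal heteroplasmy shift meets the threshold."""
    return abs(tumor_hf - normal_hf) >= threshold

calls = [
    is_mtsnv(0.85, 0.05),  # large tumor-specific shift -> called
    is_mtsnv(0.30, 0.25),  # within noise -> not called
    is_mtsnv(0.00, 0.20),  # loss of a germline heteroplasmy -> called
]
print(calls)
```

Because the rule uses an absolute difference, it captures both gains and losses of heteroplasmy in the tumor.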
Subject(s)
Adenocarcinoma/genetics , Carcinoma, Pancreatic Ductal/genetics , DNA, Mitochondrial/genetics , Mutation , Pancreatic Neoplasms/genetics , Adenocarcinoma/mortality , Adenocarcinoma/pathology , Adult , Carcinoma, Pancreatic Ductal/mortality , Carcinoma, Pancreatic Ductal/pathology , Disease Progression , Female , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Pancreatic Neoplasms/mortality , Pancreatic Neoplasms/pathology , Time Factors
ABSTRACT
We conducted a case-control exome-wide association study to discover germline variants in coding regions that affect risk for pancreatic cancer, combining data from 5 studies. We analyzed exome and genome sequencing data from 437 patients with pancreatic cancer (cases) and 1922 individuals not known to have cancer (controls). In the primary analysis, BRCA2 had the strongest enrichment for rare inactivating variants (17/437 cases vs 3/1922 controls) (P = 3.27 × 10⁻⁶; exome-wide statistical significance threshold P < 2.5 × 10⁻⁶). Cases had more rare inactivating variants in DNA repair genes than controls, even after excluding 13 genes known to predispose to pancreatic cancer (adjusted odds ratio, 1.35; P = .045). At the suggestive threshold (P < .001), 6 genes were enriched for rare damaging variants (UHMK1, AP1G2, DNTA, CHST6, FGFR3, and EPHA1) and 7 genes had associations with pancreatic cancer risk, based on the sequence-kernel association test. We confirmed variants in BRCA2 as the most common high-penetrant genetic factor associated with pancreatic cancer and we also identified candidate pancreatic cancer genes. Large collaborations and novel approaches are needed to overcome the genetic heterogeneity of pancreatic cancer predisposition.
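The BRCA2 enrichment reported above can be reduced to a 2×2 carrier table, from which an unadjusted odds ratio follows directly (the paper's P value comes from an exact test, which this sketch omits):

```python
# Unadjusted odds ratio for the abstract's BRCA2 carrier counts:
# 17 of 437 cases vs 3 of 1922 controls carried a rare inactivating variant.

def odds_ratio(carriers_cases, n_cases, carriers_controls, n_controls):
    """OR from carrier counts: odds of carriage in cases over controls."""
    case_odds = carriers_cases / (n_cases - carriers_cases)
    control_odds = carriers_controls / (n_controls - carriers_controls)
    return case_odds / control_odds

or_brca2 = odds_ratio(17, 437, 3, 1922)
print(round(or_brca2, 1))  # roughly 26-fold enrichment in cases
```

The size of this ratio, together with the rarity of carriage in controls, is what makes BRCA2 the dominant high-penetrance signal in this dataset.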
Subject(s)
Biomarkers, Tumor/genetics , Exome Sequencing , Exome , Genetic Variation , Pancreatic Neoplasms/genetics , BRCA2 Protein/genetics , Case-Control Studies , Genetic Heterogeneity , Genetic Predisposition to Disease , Genome-Wide Association Study , Humans , Odds Ratio , Pancreatic Neoplasms/diagnosis , Phenotype , Risk Assessment , Risk Factors
ABSTRACT
PURPOSE OF REVIEW: We report the current state of HIV+ to HIV+ kidney transplantation in the United States and remaining challenges in implementing this practice nationally. RECENT FINDINGS: The HIV Organ Policy Equity (HOPE) Act, which was the first step in unlocking the potential of HIV+ organ donors, mandates clinical research on HIV+ to HIV+ transplantation. As of March 2019, there have been 57 HOPE donors, including both true- and false-positive HOPE donors, resulting in more than 120 transplants. SUMMARY: The HOPE Act, signed in 2013, reversed the federal ban on the transplantation of organs from HIV+ donors into HIV+ recipients. Ongoing national studies are exploring the safety, feasibility, and efficacy of both kidney and liver transplantation in this population. If successfully and fully implemented, HIV+ to HIV+ transplantation could attenuate the organ shortage for everyone waiting, resulting in a far-reaching public health impact.
Subject(s)
Kidney Transplantation/adverse effects , Kidney Transplantation/legislation & jurisprudence , Tissue and Organ Procurement/legislation & jurisprudence , Humans , United States
ABSTRACT
Three-dimensional electron microscopy is becoming a highly data-intensive field in which vast numbers of experimental images are acquired at high speed. To manage such large-scale projects, we had previously developed a modular workflow system called Scipion (de la Rosa-Trevín et al., 2016). We present here a major extension of Scipion that allows processing of EM images while the data are being acquired. This approach helps to detect problems at early stages, saves computing time and provides users with a detailed evaluation of data quality before the acquisition is finished. At present, Scipion has been deployed and is in production mode in seven cryo-EM facilities throughout the world.