1.
J Comput Graph Stat ; 32(3): 938-949, 2023.
Article in English | MEDLINE | ID: mdl-37822489

ABSTRACT

Proximal Markov chain Monte Carlo (MCMC) is a recent construct at the intersection of Bayesian computation and convex optimization that has helped popularize the use of nondifferentiable priors in Bayesian statistics. Existing formulations of proximal MCMC, however, require hyperparameters and regularization parameters to be prespecified. In this work, we extend the proximal MCMC paradigm by introducing a new class of nondifferentiable priors called epigraph priors. As a proof of concept, we place trend filtering, originally a nonparametric regression problem, in a parametric setting to provide a posterior median fit along with credible intervals as measures of uncertainty. The key idea is to replace the nonsmooth term in the posterior density with its Moreau-Yosida envelope, which enables the application of the gradient-based MCMC sampler Hamiltonian Monte Carlo. The proposed method identifies the appropriate amount of smoothing in a data-driven way, thereby automating regularization parameter selection. Compared with conventional proximal MCMC methods, our method is largely tuning free, achieving simultaneous calibration of the mean, scale, and regularization parameters in a fully Bayesian framework.
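
As background for the smoothing step mentioned above, the Moreau-Yosida envelope of a (possibly nondifferentiable) function g with smoothing parameter λ > 0 is a standard construction in convex analysis; in generic notation (not necessarily the paper's symbols):

```latex
% Moreau-Yosida envelope of g and its gradient; prox denotes the proximal map.
\[
  g_\lambda(x) = \min_{y} \Big\{ g(y) + \tfrac{1}{2\lambda}\, \lVert x - y \rVert_2^2 \Big\},
  \qquad
  \nabla g_\lambda(x) = \tfrac{1}{\lambda}\, \big( x - \operatorname{prox}_{\lambda g}(x) \big).
\]
```

Because the envelope is differentiable even when g is not, gradient-based samplers such as Hamiltonian Monte Carlo can be run on the smoothed posterior.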

2.
Technometrics ; 65(1): 117-126, 2023.
Article in English | MEDLINE | ID: mdl-37448596

ABSTRACT

Building on previous work by Chi and Chi (2022), this paper revisits estimation in robust structured regression under the L2E criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharp majorization achieves faster convergence than the earlier alternating proximal gradient descent algorithm (Chi and Chi, 2022). In addition, we reparameterize the model by substituting precision for scale and estimate the precision via a modified Newton's method, which simplifies and accelerates overall estimation. We also introduce distance-to-set penalties to enable constrained estimation under nonconvex constraint sets, a tactic that improves performance in coefficient estimation and structure recovery. Finally, we demonstrate the merits of these improved tactics through a rich set of simulation examples and a real data application.
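
As background, the L2E criterion for linear regression with Gaussian errors, written with precision τ = 1/σ, takes the following standard form (generic notation, not necessarily the paper's):

```latex
% L2E objective for residuals r_i = y_i - x_i^T beta with precision tau;
% phi denotes the standard normal density.
\[
  \mathrm{L2E}(\beta, \tau)
  = \frac{\tau}{2\sqrt{\pi}}
  - \frac{2\tau}{n} \sum_{i=1}^{n} \phi\!\big( \tau ( y_i - x_i^{\top} \beta ) \big).
\]
```

Minimizing this criterion jointly over β and τ downweights gross outliers because φ decays rapidly in the residual, which is what makes the resulting estimates robust.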

3.
Technometrics ; 65(4): 537-552, 2023.
Article in English | MEDLINE | ID: mdl-38213317

ABSTRACT

The growing prevalence of tensor data, or multiway arrays, in science and engineering applications motivates the need for tensor decompositions that are robust against outliers. In this paper, we present a robust Tucker decomposition estimator based on the L2 criterion, called the Tucker-L2E. Our numerical experiments demonstrate that Tucker-L2E has empirically stronger recovery performance in more challenging high-rank scenarios compared with existing alternatives. The appropriate Tucker-rank can be selected in a data-driven manner with cross-validation or hold-out validation. The practical effectiveness of Tucker-L2E is validated on real data applications in fMRI tensor denoising, PARAFAC analysis of fluorescence data, and feature extraction for classification of corrupted images.
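
For readers unfamiliar with the Tucker format, the sketch below computes a basic, non-robust Tucker decomposition (truncated higher-order SVD) in plain numpy; it illustrates the core-plus-factors structure that Tucker-L2E robustifies and is not the L2E estimator itself. The function name and ranks are illustrative.

```python
import numpy as np

def hosvd(X, ranks):
    """Truncated higher-order SVD: a basic (non-robust) Tucker decomposition."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold X along `mode` and keep the leading r left singular vectors.
        unfolding = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: contract X with each factor transpose along its mode.
    core = X
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Example: decompose a random 10 x 12 x 8 tensor with Tucker ranks (3, 4, 2).
rng = np.random.default_rng(0)
core, factors = hosvd(rng.normal(size=(10, 12, 8)), ranks=(3, 4, 2))
```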

4.
Stat Anal Data Min ; 15(3): 303-313, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35756358

ABSTRACT

Many machine learning algorithms depend on weights that quantify row and column similarities of a data matrix. The choice of weights can dramatically impact the effectiveness of the algorithm, yet the problem of choosing them has arguably received too little attention. When a data matrix is completely observed, Gaussian kernel affinities can be used to quantify the local similarity between pairs of rows and pairs of columns. Computing weights in the presence of missing data, however, becomes challenging. In this paper, we propose a new method, built on a co-clustering technique, to construct row and column affinities even when data are missing. The method solves the underlying optimization problem for multiple pairs of cost parameters, filling in the missing values with increasingly smooth estimates, and it exploits the coupled similarity structure among both the rows and columns of the data matrix. We show that these affinities can be used to perform tasks such as data imputation, clustering, and matrix completion on graphs.
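
As a point of reference for the complete-data case mentioned above, a minimal sketch of Gaussian kernel row and column affinities might look like the following; the bandwidth choice and names are illustrative assumptions, not the paper's algorithm for missing data.

```python
import numpy as np

def gaussian_affinities(X, bandwidth=1.0):
    """Row and column Gaussian kernel affinities for a fully observed matrix X."""
    def kernel(M):
        # Pairwise squared Euclidean distances between rows of M.
        sq = np.sum(M**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (M @ M.T)
        np.fill_diagonal(d2, 0.0)
        return np.exp(-d2 / (2.0 * bandwidth**2))

    row_affinity = kernel(X)      # similarities between pairs of rows
    col_affinity = kernel(X.T)    # similarities between pairs of columns
    return row_affinity, col_affinity

# Example usage on a small random matrix.
rng = np.random.default_rng(0)
W_rows, W_cols = gaussian_affinities(rng.normal(size=(20, 10)))
```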

5.
J Comput Graph Stat ; 31(4): 1051-1062, 2022.
Article in English | MEDLINE | ID: mdl-36721836

ABSTRACT

We introduce a user-friendly computational framework for implementing robust versions of a wide variety of structured regression methods with the L2 criterion. In addition to introducing an algorithm for performing L2E regression, our framework enables robust regression with the L2 criterion for additional structural constraints, works without requiring complex tuning procedures on the precision parameter, can be used to identify heterogeneous subpopulations, and can incorporate readily available non-robust structured regression solvers. We provide convergence guarantees for the framework and demonstrate its flexibility with some examples. Supplementary materials for this article are available online.
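
To make the L2 criterion concrete in the simplest (unstructured) case, here is a minimal sketch that fits a robust linear regression by direct numerical minimization of the standard L2E objective with a generic optimizer; it is an illustration under that assumption, not the framework's algorithm or its handling of structural constraints, and the data and names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def l2e_objective(params, X, y):
    """L2E loss for linear regression; last parameter is the log-precision."""
    beta, log_tau = params[:-1], params[-1]
    tau = np.exp(log_tau)                       # precision = 1 / sigma
    r = y - X @ beta
    return tau / (2 * np.sqrt(np.pi)) - 2 * tau * np.mean(norm.pdf(tau * r))

# Simulated data with a few gross outliers.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.3 * rng.normal(size=200)
y[:10] += 10.0                                  # contaminate 5% of responses

init = np.zeros(X.shape[1] + 1)
fit = minimize(l2e_objective, init, args=(X, y), method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
beta_hat, tau_hat = fit.x[:-1], np.exp(fit.x[-1])
```

The criterion is nonconvex, so initialization matters in practice; this sketch sidesteps those details.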

6.
Annu Int Conf IEEE Eng Med Biol Soc ; 2021: 4432-4435, 2021 11.
Article in English | MEDLINE | ID: mdl-34892203

ABSTRACT

Coronary bifurcation lesions are a leading cause of coronary artery disease (CAD). Despite their prevalence, coronary bifurcation lesions remain difficult to treat because of our incomplete understanding of how various features of lesion anatomy synergistically disrupt normal hemodynamic flow. In this work, we employ an interpretable machine learning algorithm, the classification and regression tree (CART), to model the impact of these geometric features on local hemodynamic quantities. We generate a synthetic arterial database via computational fluid dynamics simulations and apply the CART approach to predict the time-averaged wall shear stress (TAWSS) at two different locations within the cardiac vasculature. Our experimental results show that CART can estimate a simple, interpretable, yet accurately predictive nonlinear model of TAWSS as a function of such features. Clinical relevance: the fitted tree models have the potential to refine predictions of disturbed hemodynamic flow based on an individual's cardiac and lesion anatomy, and consequently represent progress toward personalized treatment planning for CAD patients.
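
As an illustration of the modeling step described above, fitting an interpretable regression tree to hypothetical geometric features (stand-ins, not the paper's CFD-derived database) might look like:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical geometric features of a bifurcation lesion (illustrative only):
# bifurcation angle, stenosis severity, and parent vessel diameter.
rng = np.random.default_rng(42)
n = 500
angle = rng.uniform(30, 90, n)
stenosis = rng.uniform(0.2, 0.9, n)
diameter = rng.uniform(2.0, 4.5, n)
X = np.column_stack([angle, stenosis, diameter])

# Synthetic nonlinear response standing in for time-averaged wall shear stress.
tawss = 2.0 + 5.0 * stenosis**2 - 0.01 * angle + 0.5 * rng.normal(size=n)

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20).fit(X, tawss)
print(export_text(tree, feature_names=["angle", "stenosis", "diameter"]))
```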


Subject(s)
Coronary Artery Disease , Hemodynamics , Heart , Humans , Machine Learning , Stress, Mechanical
7.
J Comput Graph Stat ; 30(1): 115-124, 2021.
Article in English | MEDLINE | ID: mdl-34025100

ABSTRACT

Joint models are popular for analyzing data with multivariate responses. We propose a sparse multivariate single index model, in which responses and predictors are linked by unspecified smooth functions and multiple matrix-level penalties are employed to select predictors and induce low-rank structures across responses. An alternating direction method of multipliers (ADMM)-based algorithm is proposed for model estimation. We demonstrate the effectiveness of the proposed model in simulation studies and in an application to a genetic association study.

8.
Bioinformatics ; 37(20): 3667-3669, 2021 Oct 25.
Article in English | MEDLINE | ID: mdl-33904580

ABSTRACT

SUMMARY: Biclustering is a generalization of clustering used to identify simultaneous grouping patterns in the observations (rows) and features (columns) of a data matrix. Recently, the biclustering task has been formulated as a convex optimization problem. While this convex recasting of the problem has attractive properties, existing algorithms do not scale well. To address this problem and make convex biclustering a practical tool for analyzing larger data, we propose a fast convex biclustering implementation called COBRAC, which reduces computing time by iteratively compressing the problem size along the solution path. We apply COBRAC to several gene expression datasets to demonstrate its effectiveness and efficiency. In addition to the standalone version of COBRAC, we developed a web server for online calculation and visualization, with downloadable interactive results. AVAILABILITY AND IMPLEMENTATION: The source code and test data are available at https://github.com/haidyi/cvxbiclustr or https://zenodo.org/record/4620218. The web server is available at https://cvxbiclustr.ericchi.com. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

9.
Sci Rep ; 11(1): 8145, 2021 04 14.
Article in English | MEDLINE | ID: mdl-33854076

ABSTRACT

Conventional invasive diagnostic imaging techniques do not adequately resolve complex Type B and C coronary lesions, which present unique challenges, require personalized treatment, and result in worsened patient outcomes. These lesions are often excluded from large-scale non-invasive clinical trials, and there is no validated approach to characterize hemodynamic quantities and guide percutaneous intervention for such lesions. This work identifies key biomarkers that differentiate complex Type B and C lesions from simple Type A lesions by introducing and validating a coronary angiography-based computational fluid dynamics (CFD-CA) framework for intracoronary assessment of complex lesions at ultrahigh resolution. Among the 14 patients selected in this study, 7 patients with Type B and C lesions were included in the complex lesion group, comprising ostial, bifurcation, and serial lesions as well as lesions whose flow was supplied by a collateral bed. The simple lesion group included 7 patients with lesions that were discrete, [Formula: see text] long, and readily accessible. Intracoronary assessment was performed using the CFD-CA framework and validated by comparison with clinically measured pressure-based indices such as FFR. Local pressure, endothelial shear stress (ESS), and velocity profiles were derived for all patients. We validated the accuracy of our CFD-CA framework and report excellent agreement with invasive measurements ([Formula: see text]). The ultrahigh resolution achieved by the model enables physiological assessment in complex lesions and quantification of hemodynamic metrics in all vessels down to 1 mm in diameter. Importantly, we demonstrate that, in contrast to traditional pressure-based metrics, intracoronary hemodynamic forces such as ESS differ significantly between complex and simple lesions at both resting and hyperemic physiological states [n = 14, [Formula: see text]]. Higher ESS was observed in the complex lesion group ([Formula: see text] Pa) than in the simple lesion group ([Formula: see text] Pa). Complex coronary lesions thus have higher ESS than simple lesions; such differential hemodynamic evaluation can provide much-needed insight into the increased adverse outcomes for these patients and has incremental prognostic value over traditional pressure-based indices such as FFR.


Subject(s)
Coronary Angiography/methods , Coronary Disease/diagnostic imaging , Radiographic Image Interpretation, Computer-Assisted/methods , Computer Simulation , Coronary Disease/classification , Diagnosis, Differential , Hemodynamics , Humans , Shear Strength
10.
Comput Sci Eng ; 23(6): 42-51, 2021.
Article in English | MEDLINE | ID: mdl-35784398

ABSTRACT

Modern technologies produce a deluge of complicated data. In neuroscience, for example, minimally invasive experimental methods can take recordings of large populations of neurons at high resolution under a multitude of conditions. Such data arrays possess non-trivial interdependencies along each of their axes. Insights into these data arrays may lay the foundations of advanced treatments for nervous system disorders. The potential impacts of such data, however, will not be fully realized unless the techniques for analyzing them keep pace. Specifically, there is an urgent, growing need for methods for estimating the low-dimensional structure and geometry in big and noisy data arrays. This article reviews a framework for identifying complicated underlying patterns in such data and also recounts the key role that the Department of Energy Computational Sciences Graduate Fellowship played in setting the stage for this work to be done by the author.

11.
Article in English | MEDLINE | ID: mdl-33312074

ABSTRACT

Cluster analysis is a fundamental tool for pattern discovery of complex heterogeneous data. Prevalent clustering methods mainly focus on vector or matrix-variate data and are not applicable to general-order tensors, which arise frequently in modern scientific and business applications. Moreover, there is a gap between statistical guarantees and computational efficiency for existing tensor clustering solutions due to the nature of their non-convex formulations. In this work, we bridge this gap by developing a provable convex formulation of tensor co-clustering. Our convex co-clustering (CoCo) estimator enjoys stability guarantees and its computational and storage costs are polynomial in the size of the data. We further establish a non-asymptotic error bound for the CoCo estimator, which reveals a surprising "blessing of dimensionality" phenomenon that does not exist in vector or matrix-variate cluster analysis. Our theoretical findings are supported by extensive simulation studies. Finally, we apply the CoCo estimator to the cluster analysis of advertisement click tensor data from a major online company. Our clustering results provide meaningful business insights to improve advertising effectiveness.

12.
Stat ; 8(1)2020.
Article in English | MEDLINE | ID: mdl-32655193

ABSTRACT

Canonical correlation analysis (CCA) is a multivariate analysis technique for estimating a linear relationship between two sets of measurements. Modern acquisition technologies, for example those arising in neuroimaging and remote sensing, produce data in the form of multidimensional arrays or tensors. Classic CCA is not appropriate for tensor data because of the multidimensional structure and ultrahigh dimensionality of such modern data. In this paper, we present tensor CCA (TCCA) to discover relationships between two tensors while simultaneously preserving the multidimensional structure of the tensors and using substantially fewer parameters. Furthermore, we show how to employ a parsimonious covariance structure to gain additional stability and efficiency. We delineate population and sample problems for each model and propose efficient estimation algorithms with global convergence guarantees. We also describe a probabilistic model for TCCA that enables the generation of synthetic data with desired canonical variates and correlations. Simulation studies illustrate the performance of our methods.
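
For contrast with the tensor setting, the following numpy sketch implements classic matrix CCA via whitened cross-covariance, i.e. the method that TCCA generalizes; it is background only and not one of the tensor algorithms in the paper.

```python
import numpy as np

def classic_cca(X, Y, k=1, reg=1e-8):
    """Top-k canonical correlations and directions for two data matrices."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / (n - 1)

    # Whiten each block with a Cholesky factor, then SVD the cross-covariance.
    Lx_inv = np.linalg.inv(np.linalg.cholesky(Sxx))
    Ly_inv = np.linalg.inv(np.linalg.cholesky(Syy))
    U, s, Vt = np.linalg.svd(Lx_inv @ Sxy @ Ly_inv.T)
    A = Lx_inv.T @ U[:, :k]        # canonical directions for X
    B = Ly_inv.T @ Vt[:k].T        # canonical directions for Y
    return s[:k], A, B

# Example usage on small random data.
rng = np.random.default_rng(0)
corr, A, B = classic_cca(rng.normal(size=(100, 5)), rng.normal(size=(100, 4)))
```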

13.
IEEE Signal Process Mag ; 37(6): 160-173, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33473243

ABSTRACT

Graph signal processing (GSP) is an important methodology for studying data residing on irregular structures. As acquired data is increasingly taking the form of multi-way tensors, new signal processing tools are needed to maximally utilize the multi-way structure within the data. In this paper, we review modern signal processing frameworks generalizing GSP to multi-way data, starting from graph signals coupled to familiar regular axes such as time in sensor networks, and then extending to general graphs across all tensor modes. This widely applicable paradigm motivates reformulating and improving upon classical problems and approaches to creatively address the challenges in tensor-based data. We synthesize common themes arising from current efforts to combine GSP with tensor analysis and highlight future directions in extending GSP to the multi-way paradigm.
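
As a brief refresher on the GSP building block that such tensor extensions start from, the graph Fourier transform of a signal is its expansion in the eigenbasis of the graph Laplacian. A minimal sketch on a small ring graph (purely illustrative, not an example from the paper):

```python
import numpy as np

# Adjacency matrix of a small ring graph.
n = 8
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0

# Combinatorial graph Laplacian and its eigendecomposition.
L = np.diag(W.sum(axis=1)) - W
eigvals, eigvecs = np.linalg.eigh(L)

# Graph Fourier transform of a node signal, and its inverse.
signal = np.sin(2 * np.pi * np.arange(n) / n)
signal_hat = eigvecs.T @ signal          # forward GFT
reconstructed = eigvecs @ signal_hat     # inverse GFT recovers the signal
```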

14.
Psychiatry Clin Neurosci ; 74(3): 183-190, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31747095

ABSTRACT

AIM: Acupuncture has benefits in the rehabilitation of neuropsychiatric sequelae of stroke. This study aimed to evaluate the effectiveness of dense cranial electroacupuncture stimulation plus body acupuncture (DCEAS+BA) in treating poststroke depression (PSD), functional disability, and cognitive deterioration. METHODS: In this assessor- and participant-blinded, randomized controlled trial, 91 stroke patients who initially had PSD were randomly assigned to either DCEAS+BA (n = 45) or minimum acupuncture stimulation (MAS) as controls (n = 46) for three sessions per week over 8 consecutive weeks. The primary outcome was the baseline-to-end-point change in score on the 17-item Hamilton Depression Rating Scale. Secondary outcomes included the Montgomery-Åsberg Depression Rating Scale for depressive symptoms, the Barthel Index for functional disability, and the Montreal Cognitive Assessment for cognitive function. RESULTS: DCEAS+BA-treated patients showed strikingly greater end-point reduction than MAS-treated patients in scores on all three symptom domains. The clinical response rate, defined as a reduction of at least 50% in 17-item Hamilton Depression Rating Scale score from baseline to end point, was markedly higher in the DCEAS+BA group than in controls (40.0% vs 17.4%, P = 0.031). The incidence of adverse events did not differ between the two groups. Subgroup analysis revealed that DCEAS+BA with electrical stimulation on forehead acupoints was more effective in reducing Barthel Index-measured disability than DCEAS+BA without electrical stimulation. CONCLUSION: DCEAS+BA, particularly with electrical stimulation on forehead acupoints, reduces PSD, functional disability, and cognitive deterioration in stroke patients. It can serve as an effective rehabilitation therapy for neuropsychiatric sequelae of stroke.


Subject(s)
Acupuncture Points , Acupuncture Therapy/methods , Cognitive Dysfunction/rehabilitation , Depression/rehabilitation , Outcome and Process Assessment, Health Care , Stroke Rehabilitation/methods , Stroke/therapy , Aged , Cognitive Dysfunction/etiology , Depression/etiology , Double-Blind Method , Electroacupuncture/methods , Extremities , Female , Forehead , Humans , Male , Middle Aged , Severity of Illness Index , Skull , Stroke/complications
15.
Stat Med ; 37(20): 2938-2953, 2018 09 10.
Article in English | MEDLINE | ID: mdl-29797335

ABSTRACT

A biologic is a product made from living organisms. A biosimilar is a new version of an already approved branded biologic. Regulatory guidelines recommend a totality-of-the-evidence approach with stepwise development for a new biosimilar. The initial steps for biosimilar development are (a) analytical comparisons to establish similarity in structure and function, followed by (b) potential animal studies and a human pharmacokinetics/pharmacodynamics equivalence study. The last step is a phase III clinical trial to confirm similar efficacy, safety, and immunogenicity between the biosimilar and the biologic. A high degree of analytical and pharmacokinetics/pharmacodynamics similarity could provide justification for an eased statistical threshold in the phase III trial, which could then further facilitate an overall abbreviated approval process for biosimilars. Bayesian methods can help in the analysis of clinical trials by incorporating proper prior information, thereby potentially decreasing the required sample size. We develop proper prior information for the analysis of a phase III trial for showing that a proposed biosimilar is similar to a reference biologic. For the reference product, we use a meta-analysis of published results to set a prior for the probability of efficacy, and we propose priors for the proposed biosimilar informed by the strength of the evidence generated in the earlier steps of the approval process. A simulation study shows that, with few exceptions, the Bayesian relative risk analysis provides greater power, shorter 90% credible intervals with more than 90% frequentist coverage, and better root mean squared error.
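
Purely as an illustration of how informative priors can enter such an analysis, the sketch below performs a conjugate Beta-binomial update in each arm and draws a Monte Carlo posterior for the relative risk; the priors, trial counts, and interval choice are hypothetical and are not taken from the paper or its meta-analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Beta prior for the reference product's response probability,
# standing in for a meta-analytic summary (numbers are illustrative only).
a_ref, b_ref = 120.0, 80.0
# Weakly informative prior for the proposed biosimilar.
a_bio, b_bio = 1.0, 1.0

# Hypothetical phase III data: responders / enrolled in each arm.
x_bio, n_bio = 130, 200
x_ref, n_ref = 125, 200

# Conjugate Beta posteriors and a Monte Carlo posterior for the relative risk.
p_bio = rng.beta(a_bio + x_bio, b_bio + n_bio - x_bio, size=100_000)
p_ref = rng.beta(a_ref + x_ref, b_ref + n_ref - x_ref, size=100_000)
rr = p_bio / p_ref
ci_90 = np.percentile(rr, [5, 95])     # 90% equal-tailed credible interval
```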


Subject(s)
Bayes Theorem , Biosimilar Pharmaceuticals , Clinical Trials, Phase III as Topic , Humans , Probability , Research Design
16.
J Food Drug Anal ; 26(1): 318-329, 2018 01.
Article in English | MEDLINE | ID: mdl-29389570

ABSTRACT

Tocopherols and tocotrienols, collectively known as vitamin E, have received a great deal of attention because of their interesting biological activities. In the present study, we reexamined and improved previous methods of sample preparation and the conditions of high-performance liquid chromatography for more accurate quantification of tocopherols, tocotrienols and their major chain-degradation metabolites. For the analysis of serum tocopherols/tocotrienols, we reconfirmed our method of mixing serum with ethanol followed by hexane extraction. For the analysis of tissue samples, we improved our methods by extracting tocopherols/tocotrienols directly from tissue homogenate with hexane. For the analysis of total amounts (conjugated and unconjugated forms) of side-chain degradation metabolites, the samples need to be deconjugated by incubating with β-glucuronidase and sulfatase; serum samples can be directly used for the incubation, whereas for tissue homogenates a pre-deproteination step is needed. The present methods are sensitive, convenient and are suitable for the determination of different forms of vitamin E and their metabolites in animal and human studies. Results from the analysis of serum, liver, kidney, lung and urine samples from mice that had been treated with mixtures of tocotrienols and tocopherols are presented as examples.


Subject(s)
Chromatography, High Pressure Liquid , Metabolomics , Tocopherols/analysis , Tocotrienols/analysis , Animals , Biomarkers , Humans , Mass Spectrometry , Metabolomics/methods , Mice , Molecular Structure , Tocopherols/blood , Tocopherols/chemistry , Tocotrienols/blood , Tocotrienols/chemistry
17.
J Biopharm Stat ; 28(2): 320-332, 2018.
Article in English | MEDLINE | ID: mdl-29173074

ABSTRACT

To improve patients' access to safe and effective biological medicines, abbreviated licensure pathways for biosimilar and interchangeable biological products have been established in the US, Europe, and other countries around the world. The US Food and Drug Administration and European Medicines Agency have published various guidance documents on the development and approval of biosimilars, which recommend a "totality-of-the-evidence" approach with a stepwise process to demonstrate biosimilarity. The approach relies on comprehensive comparability studies ranging from analytical and nonclinical studies to clinical pharmacokinetic/pharmacodynamic (PK/PD) and efficacy studies. A clinical efficacy study may be necessary to address residual uncertainty about the biosimilarity of the proposed product to the reference product and support a demonstration that there are no clinically meaningful differences. In this article, we propose a statistical strategy that takes into account the similarity evidence from analytical assessments and PK studies in the design and analysis of the clinical efficacy study in order to address residual uncertainty and enhance statistical power and precision. We assume that if the proposed biosimilar product and the reference product are shown to be highly similar with respect to the analytical and PK parameters, then they should also be similar with respect to the efficacy parameters. We show that the proposed methods provide correct control of the type I error and improve the power and precision of the efficacy study relative to the standard analysis that disregards the prior evidence. We confirm and illustrate the theoretical results through simulation studies based on the biosimilars development experience of many different products.


Subject(s)
Biosimilar Pharmaceuticals/pharmacology , Biosimilar Pharmaceuticals/pharmacokinetics , Clinical Trials, Phase III as Topic/statistics & numerical data , Computer Simulation/statistics & numerical data , Drug Approval/methods , Research Design/statistics & numerical data , Clinical Trials, Phase III as Topic/methods , Europe , Humans , Therapeutic Equivalency , Treatment Outcome , United States , United States Food and Drug Administration
18.
Biometrics ; 73(1): 10-19, 2017 03.
Article in English | MEDLINE | ID: mdl-27163413

ABSTRACT

In the biclustering problem, we seek to simultaneously group observations and features. While biclustering has applications in a wide array of domains, ranging from text mining to collaborative filtering, the problem of identifying structure in high-dimensional genomic data motivates this work. In this context, biclustering enables us to identify subsets of genes that are co-expressed only within a subset of experimental conditions. We present a convex formulation of the biclustering problem that possesses a unique global minimizer and an iterative algorithm, COBRA, that is guaranteed to identify it. Our approach generates an entire solution path of possible biclusters as a single tuning parameter is varied. We also show how to reduce the problem of selecting this tuning parameter to solving a trivial modification of the convex biclustering problem. The key contributions of our work are its simplicity, interpretability, and algorithmic guarantees: features that are arguably lacking in current alternative algorithms. We demonstrate the advantages of our approach, which include stably and reproducibly identifying biclusterings, on simulated and real microarray data.
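
Schematically, the convex biclustering objective combines a quadratic fit to the data matrix with row-wise and column-wise fusion penalties; in generic notation (the weights and symbols here are illustrative, not necessarily the paper's exact formulation):

```latex
% Convex biclustering objective: approximate the data matrix X by U while
% fusing similar rows and similar columns; lambda controls the amount of fusion.
\[
  F_\lambda(U) = \tfrac{1}{2}\,\lVert X - U \rVert_F^2
  + \lambda \Big[
      \sum_{i < j} w_{ij}\,\lVert U_{i\cdot} - U_{j\cdot} \rVert_2
      + \sum_{k < l} \tilde{w}_{kl}\,\lVert U_{\cdot k} - U_{\cdot l} \rVert_2
  \Big].
\]
```

Because the objective is strictly convex in U, it has the unique global minimizer referenced in the abstract, and the fused rows and columns of the minimizer define the biclusters.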


Subject(s)
Cluster Analysis , Data Interpretation, Statistical , Gene Regulatory Networks , Algorithms , Computational Biology/methods , Databases, Genetic , Gene Expression Profiling/methods , Oligonucleotide Array Sequence Analysis
19.
Mol Carcinog ; 56(1): 172-183, 2017 01.
Article in English | MEDLINE | ID: mdl-27175800

ABSTRACT

Tocopherols, the major forms of vitamin E, are a family of fat-soluble compounds that exist in alpha (α-T), beta (β-T), gamma (γ-T), and delta (δ-T) variants. A cancer preventive effect of vitamin E is suggested by epidemiological studies. However, past animal studies and human intervention trials with α-T, the most active vitamin E form, have yielded disappointing results. A possible explanation is that the cancer preventive activity of α-T is weak compared to other tocopherol forms. In the present study, we investigated the effects of δ-T, γ-T, and α-T (0.2% in diet) in a novel colon cancer model induced by the meat-derived dietary carcinogen, 2-amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) and promoted by dextran sodium sulfate (DSS)-induced colitis in CYP1A-humanized (hCYP1A) mice. PhIP/DSS treatments induced multiple polypoid tumors, mainly tubular adenocarcinomas, in the middle to distal colon of the hCYP1A mice after 10 wk. Dietary supplementation with δ-T and γ-T significantly reduced colon tumor formation and suppressed markers of oxidative and nitrosative stress (i.e., 8-oxo-dG and nitrotyrosine) as well as pro-inflammatory mediators (i.e., NF-κB p65 and p-STAT3) in tumors and adjacent tissues. By administering δ-T at different time periods, we obtained results suggesting that the inhibitory effect of δ-T against colon carcinogenesis is mainly due to protection against early cellular and DNA damages caused by PhIP. α-T was found to be ineffective in inhibiting colon tumors and less effective in attenuating the molecular changes. Altogether, we demonstrated strong cancer preventive effects of δ-T and γ-T in a physiologically relevant model of human colon cancer. © 2016 Wiley Periodicals, Inc.


Subject(s)
Anticarcinogenic Agents/therapeutic use , Carcinogenesis/drug effects , Colon/drug effects , Colonic Neoplasms/prevention & control , Tocopherols/therapeutic use , Vitamins/therapeutic use , gamma-Tocopherol/therapeutic use , Animals , Carcinogenesis/chemically induced , Carcinogenesis/genetics , Carcinogenesis/metabolism , Colon/metabolism , Colonic Neoplasms/chemically induced , Colonic Neoplasms/genetics , Colonic Neoplasms/metabolism , Cytochrome P-450 CYP1A1/metabolism , DNA Damage/drug effects , Dextran Sulfate , Humans , Imidazoles , Male , Mice , Oxidative Stress/drug effects
20.
Carcinogenesis ; 37(7): 723-730, 2016 07.
Article in English | MEDLINE | ID: mdl-27207656

ABSTRACT

Obesity is associated with an increased risk of cancer. To study the promotion of dietary carcinogen-induced gastrointestinal cancer by obesity, we employed 2-amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) to induce intestinal tumorigenesis in CYP1A-humanized (hCYP1A) mice, in which mouse Cyp1a1/1a2 was replaced with human CYP1A1/1A2. Obesity was introduced in hCYP1A mice by breeding with Lepr(db/+) mice to establish the genetically induced obese hCYP1A-Lepr(db/db) mice or by feeding hCYP1A mice a high-fat diet. PhIP induced the formation of small intestinal tumors at the ages of weeks 28-40 in obese hCYP1A mice, but not in lean hCYP1A mice. No tumors were found in colon and other gastrointestinal organs in the lean or obese mice. Using immunohistochemistry (IHC), we found strong positive staining of NF-κB p65, pSTAT3 and COX2 as well as elevated levels of nuclear β-catenin (Ctnnb1) in small intestinal tumors, but not in normal tissues. By sequencing Apc and Ctnnb1 genes, we found that most PhIP-induced small intestinal tumors in obese mice carried only a single heterozygous mutation in Apc. By bisulfite-sequencing of CpG islands of Apc, we found DNA hypermethylation in a CpG cluster located in its transcription initiation site, which most likely caused the inactivation of the wild-type Apc allele. Our findings demonstrate that PhIP-induced small intestinal carcinogenesis in hCYP1A-db/db mice is promoted by obesity and involves Apc mutation and inactivation by DNA hypermethylation. This experimental result is consistent with the association of obesity and the increased incidence of small intestinal cancer in humans in recent decades.


Subject(s)
Adenomatous Polyposis Coli Protein/genetics , Carcinogenesis/genetics , Cytochrome P-450 CYP1A1/genetics , Intestinal Neoplasms/genetics , Obesity/genetics , beta Catenin/genetics , Animals , Carcinogenesis/drug effects , Cytochrome P-450 CYP1A1/metabolism , DNA Methylation/drug effects , DNA Methylation/genetics , Humans , Imidazoles/toxicity , Intestinal Neoplasms/chemically induced , Intestinal Neoplasms/pathology , Intestine, Small/drug effects , Intestine, Small/pathology , Male , Mice , Mice, Transgenic , Mutation , Neoplasm Proteins/biosynthesis , Obesity/complications , Obesity/pathology , Receptors, Leptin/genetics , beta Catenin/biosynthesis