1.
Int J Mol Sci ; 23(17)2022 Aug 31.
Article in English | MEDLINE | ID: mdl-36077295

ABSTRACT

This study concerns the analysis of how the transcriptome of the Chronic Myeloid Leukemia (CML) cell model K562 is modulated following transfection with the tumor suppressor gene encoding Protein Tyrosine Phosphatase Receptor Type G (PTPRG) and treatment with the tyrosine kinase inhibitor (TKI) Imatinib. Specifically, we aimed to identify genes whose expression level is altered by PTPRG modulation and Imatinib concentration. Statistical tests such as differential expression analysis (DEA), supported by gene set enrichment analysis (GSEA) and modern methods of ontological term analysis, are presented along with results of current interest for forthcoming experimental research on the transcriptomic landscape of CML. In particular, we present two methods that differ in the order of the analysis steps. After a gene selection based on fold-change thresholding, we applied statistical tests to select differentially expressed genes. We then applied two different methods to the set of differentially expressed genes. With the first method (Method 1), we implemented GSEA, followed by the identification of transcription factors. With the second method (Method 2), we first selected the transcription factors from the set of differentially expressed genes and implemented GSEA on this subset. Method 1 is the standard approach commonly used in this type of analysis, while Method 2 is unconventional and is motivated by the intention to identify transcription factors more specifically involved in biological processes relevant to the CML condition. Both methods were equipped with ontological knowledge mining and word cloud analysis, as elements of novelty in our analytical procedure. Data analysis identified RARG and CD36 as potential PTPRG up-regulated genes, suggesting a possible induction of cell differentiation toward an erythromyeloid phenotype. The prediction was confirmed at the mRNA and protein level, further validating the approach and identifying a new molecular mechanism of tumor suppression governed by PTPRG in a CML context.
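A minimal Python sketch of the two analysis orderings, offered as an illustration rather than the authors' pipeline: fold-change thresholding plus a Welch t-test with Benjamini-Hochberg correction stands in for DEA, and a hypergeometric over-representation test stands in for full GSEA. The expression table, sample columns, and gene lists are hypothetical placeholders.

import pandas as pd
from scipy import stats
from statsmodels.stats.multitest import multipletests

def differential_expression(expr, treated, control, fc_thresh=1.0, alpha=0.05):
    # expr: genes x samples DataFrame of log2 expression; treated/control: column lists.
    log2_fc = expr[treated].mean(axis=1) - expr[control].mean(axis=1)
    candidates = expr.loc[log2_fc.abs() >= fc_thresh]          # fold-change pre-filter
    _, pvals = stats.ttest_ind(candidates[treated], candidates[control],
                               axis=1, equal_var=False)
    reject, _, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return set(candidates.index[reject])                       # differentially expressed genes

def overrepresentation_p(genes, gene_set, universe):
    # Hypergeometric test for one gene set: a simplified stand-in for GSEA.
    hits = len(genes & gene_set)
    return stats.hypergeom.sf(hits - 1, len(universe), len(gene_set), len(genes))

# Method 1: run enrichment on all differentially expressed genes, then pick out
# the transcription factors among the enriched hits.
# Method 2: intersect the differentially expressed genes with a transcription-factor
# list first, then run enrichment on that subset only.
# deg = differential_expression(expr, treated_cols, control_cols)
# tf_deg = deg & transcription_factors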


Subject(s)
Leukemia, Myelogenous, Chronic, BCR-ABL Positive ; Receptor-Like Protein Tyrosine Phosphatases, Class 5/genetics ; Drug Resistance, Neoplasm ; Gene Expression ; Genes, Tumor Suppressor ; Humans ; Imatinib Mesylate/therapeutic use ; K562 Cells ; Leukemia, Myelogenous, Chronic, BCR-ABL Positive/pathology ; Phosphoric Monoester Hydrolases/genetics ; Protein Kinase Inhibitors/therapeutic use ; Transcription Factors/genetics
2.
Comput Biol Med ; 148: 105699, 2022 09.
Article in English | MEDLINE | ID: mdl-35715259

ABSTRACT

Biomechanical simulation enables medical researchers to study complex mechano-biological conditions, although soft tissue modeling may require highly nonlinear multi-physics theories commonly implemented by expensive finite element (FE) solvers. This is a significantly time-consuming process on a regular computer and completely impractical in urgent situations. One remedy is to first generate a dataset of the possible inputs and outputs of the solver and then train an efficient machine learning (ML) model, i.e., a supervised ML-based surrogate, that replaces the expensive solver to speed up the simulation. However, this still requires a large number of expensive numerical samples. We therefore propose a hybrid ML (HML) method that uses a reduced-order model, defined by simplifying the complex multi-physics equations, to produce a dataset of low-fidelity (LF) results. The surrogate then combines this efficient numerical model with an ML model whose role is to raise the fidelity of its outputs to the level of the high-fidelity (HF) results. Based on our empirical tests over a group of diverse training and numerical modeling conditions, the proposed method improves training convergence for very limited training samples. In particular, while considerable time gains compared to the HF numerical models are observed, training of the HML models is also significantly more efficient than that of purely ML-based surrogates. From this, we conclude that this non-destructive HML implementation may increase the accuracy and efficiency of surrogate modeling of soft tissues with complex multi-physics properties in small data regimes.
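The toy sketch below mimics the hybrid idea with stand-in functions: lf_solver plays the role of the simplified reduced-order model, hf_solver the expensive FE solve, and a small regressor learns the LF-to-HF correction from a handful of expensive samples. The functions and parameter ranges are invented for illustration and are not the paper's code or data.

import numpy as np
from sklearn.neural_network import MLPRegressor

def lf_solver(p):
    # Toy reduced-order model: cheap but biased approximation of the quantity of interest.
    return np.array([np.sin(p).sum()])

def hf_solver(p):
    # Toy high-fidelity model standing in for the expensive multi-physics FE solve.
    return np.array([np.sin(p).sum() + 0.3 * np.cos(2.0 * p).prod()])

rng = np.random.default_rng(0)
P = rng.uniform(size=(20, 3))                      # only 20 expensive HF training samples
Y_lf = np.vstack([lf_solver(p) for p in P])
Y_hf = np.vstack([hf_solver(p) for p in P])

# Hybrid surrogate: the regressor only has to learn the LF -> HF correction,
# an easier target than mapping raw inputs to HF outputs directly.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(np.hstack([P, Y_lf]), Y_hf.ravel())

def hybrid_predict(p):
    # Cheap LF solve followed by the learned correction toward HF fidelity.
    features = np.hstack([p, lf_solver(p)]).reshape(1, -1)
    return model.predict(features)[0]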


Subject(s)
Machine Learning ; Computer Simulation ; Finite Element Analysis
3.
J Mech Behav Biomed Mater ; 114: 104203, 2021 02.
Article in English | MEDLINE | ID: mdl-33234496

ABSTRACT

Classical continuum mechanics has been widely used to implement material models of articular cartilage (AC), mainly with the aid of the finite element (FE) method, which, in many cases, considers the stress-free configuration as the initial configuration. In contrast, AC experimental tests typically begin from a pre-stressed state of both the material and geometrical properties. Indeed, imposing the initial pre-stress onto AC models, with the in vivo values as the initial state, would result in nonphysiological expansion of the FE mesh due to the soft nature of AC. This change in the model configuration can also affect the material behavior kinematically in mixture models of cartilage, owing to the intrinsic compressibility of the tissue. Although several fixed-point backward algorithms, the most straightforward pre-stressing methods, have already been developed to incorporate these initial conditions into FE models iteratively, such methods focus merely on the geometrical parameters and omit the material variations of anisotropic mixture models of AC. To address this issue, we propose an efficient algorithm that generalizes the backward schemes to restore stress-free conditions by optimizing both of the involved sets of variables (geometrical and material), and we hypothesize that this can affect the results considerably. To this end, a comparative simulation was implemented on an advanced and validated multiphasic model using both the new and the conventional algorithms. The results support the hypothesis: in our illustrative general AC model, the material parameters showed a maximum error of 16% compared to the initial in vivo data when the older algorithm was employed, which led to a maximum variation of 44% in the recorded stresses compared to the results of the new method. We conclude that our methodology enhances model fidelity and is applicable in most existing FE solvers for future mixture studies requiring accurate stress distributions.
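A schematic sketch of a backward fixed-point pre-stressing loop extended, as the abstract proposes, to restore both the stress-free geometry and the deformation-dependent material parameters. forward_solve and update_material are hypothetical callables standing in for the FE solve under in vivo loads and the material-restoration rule; this is not the authors' implementation.

import numpy as np

def prestress_fixed_point(forward_solve, update_material, x_target, m_init,
                          tol=1e-6, max_iter=100):
    # forward_solve(X0, m): deformed nodal coordinates for reference geometry X0
    # and material parameters m under the in vivo loads.
    # update_material(X0, m): material parameters made consistent with the new reference.
    X0, m = x_target.copy(), m_init.copy()         # start from the imaged (in vivo) geometry
    for _ in range(max_iter):
        x = forward_solve(X0, m)                   # trial forward solve
        residual = x - x_target
        if np.linalg.norm(residual) < tol:
            break
        X0 = X0 - residual                         # classic backward-displacement update
        m = update_material(X0, m)                 # generalization: also restore material state
    return X0, m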


Subject(s)
Cartilage, Articular ; Algorithms ; Anisotropy ; Computer Simulation ; Finite Element Analysis ; Models, Biological ; Stress, Mechanical
4.
Philos Trans A Math Phys Eng Sci ; 367(1895): 1951-69, 2009 May 28.
Article in English | MEDLINE | ID: mdl-19380320

ABSTRACT

Models of cardiac electrophysiology consist of a system of partial differential equations (PDEs) coupled with a system of ordinary differential equations representing cell membrane dynamics. Current software for solving such models does not provide the computational speed required for practical applications. One reason for this is that little use is made of recent developments in adaptive numerical algorithms for solving systems of PDEs. Studies have suggested that a speedup of up to two orders of magnitude is possible by using adaptive methods. The challenge lies in the efficient implementation of adaptive algorithms on massively parallel computers. The finite-element (FE) method is often used in heart simulators as it can encapsulate the complex geometry and small-scale details of the human heart. An alternative is the spectral element (SE) method, a high-order technique that provides the flexibility and accuracy of FE but with a reduced number of degrees of freedom. The feasibility of implementing a parallel SE algorithm based on fully unstructured all-hexahedra meshes is discussed. A major computational task is the solution of the large algebraic system resulting from FE or SE discretization. The choice of linear solver and preconditioner has a substantial effect on efficiency. A fully parallel implementation based on dynamic partitioning that accounts for load balance, communication and data movement costs is required. Each of these methods must be implemented on next-generation supercomputers in order to realize the necessary speedup. The problems that this may cause, and some of the techniques that are beginning to be developed to overcome these issues, are described.
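To make the coupled PDE/ODE structure concrete, the toy example below solves a 1-D monodomain equation with FitzHugh-Nagumo membrane kinetics by simple operator splitting on a finite-difference grid. Real heart simulators use FE or SE discretizations on 3-D meshes; all parameter values here are illustrative only.

import numpy as np

nx, dx, dt, steps = 200, 0.1, 0.01, 2000
D, a, eps, gamma = 0.1, 0.13, 0.02, 0.5            # diffusion and FitzHugh-Nagumo parameters
v = np.zeros(nx)                                   # transmembrane potential
w = np.zeros(nx)                                   # recovery variable
v[:10] = 1.0                                       # stimulate one end of the fibre

for _ in range(steps):
    # ODE step: cell-membrane (reaction) dynamics at every grid node.
    dv = v * (v - a) * (1.0 - v) - w
    dw = eps * (v - gamma * w)
    v += dt * dv
    w += dt * dw
    # PDE step: diffusion of the potential with no-flux boundaries.
    lap = np.zeros(nx)
    lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]
    v += dt * D * lap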


Subject(s)
Computers ; Electrocardiography ; Heart/physiology ; Algorithms ; Finite Element Analysis ; Humans