ABSTRACT
BACKGROUND: Patient heterogeneity poses significant challenges for managing individuals and designing clinical trials, especially in complex diseases. Existing classifications rely on outcome-predicting scores, potentially overlooking crucial elements that contribute to heterogeneity without necessarily impacting prognosis. METHODS: To address patient heterogeneity, we developed ClustALL, a computational pipeline that simultaneously handles common clinical-data challenges such as mixed data types, missing values, and collinearity. ClustALL enables the unsupervised identification of patient stratifications while filtering for stratifications that are robust against minor variations in the population (population-based) and against limited adjustments of the algorithm's parameters (parameter-based). RESULTS: Applied to a European cohort of patients with acutely decompensated cirrhosis (n = 766), ClustALL identified five robust stratifications using only data available at hospital admission. All stratifications included markers of impaired liver function and the number of organ dysfunctions or failures, and most included precipitating events. When focusing on one of these stratifications, patients were categorized into three clusters characterized by typical clinical features; notably, this 3-cluster stratification showed prognostic value. Re-assessing patient stratification during follow-up delineated patients' outcomes and further improved the prognostic value of the stratification. We validated these findings in an independent prospective multicentre cohort of patients from Latin America (n = 580). CONCLUSIONS: By applying ClustALL to patients with acutely decompensated cirrhosis, we identified three patient clusters. Following these clusters over time offers insights that could guide future clinical trial design. ClustALL is a novel and robust stratification method capable of addressing the multiple challenges of patient stratification in most complex diseases.
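As a rough illustration of the population-based robustness idea described above (not the published ClustALL code), the following Python sketch keeps a number of clusters only if re-clustering bootstrap resamples of the cohort agrees with the full-cohort solution; the use of k-means, the ARI threshold of 0.8 and all other parameters are illustrative assumptions.

```python
# Illustrative sketch only: population-based robustness check for a clustering,
# loosely inspired by the idea in the abstract (not the ClustALL pipeline itself).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def robust_k_values(X, k_range=range(2, 7), n_boot=50, ari_threshold=0.8, seed=0):
    """Return the numbers of clusters whose solutions are stable under resampling."""
    rng = np.random.default_rng(seed)
    stable = []
    for k in k_range:
        ref_labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
        aris = []
        for _ in range(n_boot):
            idx = rng.choice(len(X), size=len(X), replace=True)   # perturb the population
            boot_labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X[idx])
            aris.append(adjusted_rand_score(ref_labels[idx], boot_labels))
        if np.median(aris) >= ari_threshold:                      # keep only robust stratifications
            stable.append(k)
    return stable

if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(300, 8))            # stand-in for admission data
    print(robust_k_values(X))
```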
Subjects
Liver Cirrhosis; Humans; Male; Female; Cluster Analysis; Middle Aged; Prognosis; Acute Disease; Algorithms; Aged; Cohort Studies
ABSTRACT
A topic of growing interest in computational neuroscience is the discovery of fundamental principles underlying global dynamics and the self-organization of the brain. In particular, the notion that the brain operates near criticality has gained considerable support, and recent work has shown that the dynamics of different brain states may be modeled by pairwise maximum entropy Ising models at various distances from a phase transition, i.e., from criticality. Here we aim to characterize two brain states (psychedelic-induced and placebo) as captured by functional magnetic resonance imaging (fMRI), with features derived from the Ising spin model formalism (system temperature, critical point, susceptibility) and from algorithmic complexity. We hypothesized, along the lines of the entropic brain hypothesis, that psychedelics drive brain dynamics into a more disordered state at a higher Ising temperature and increased complexity. We analyze resting-state blood-oxygen-level-dependent (BOLD) fMRI data collected in an earlier study from fifteen subjects in a control condition (placebo) and during ingestion of lysergic acid diethylamide (LSD). Working with the automated anatomical labeling (AAL) brain parcellation, we first create "archetype" Ising models representative of the entire dataset (global) and of the data in each condition. Remarkably, we find that such archetypes exhibit a strong correlation with an average structural connectome template obtained from dMRI (r = 0.6). We compare the archetypes from the two conditions and find that the Ising connectivity in the LSD condition is lower than in the placebo one, especially in homotopic links (interhemispheric connectivity), reflecting a significant decrease of homotopic functional connectivity in the LSD condition. The global archetype is then personalized for each individual and condition by adjusting the system temperature. The resulting temperatures are all near but above the critical point of the model, in the paramagnetic (disordered) phase. The individualized Ising temperatures are higher in the LSD condition than in the placebo condition (p = 9 × 10⁻⁵). Next, we estimate the Lempel-Ziv-Welch (LZW) complexity of the binarized BOLD data and of the synthetic data generated with the individualized model using the Metropolis algorithm for each participant and condition. The LZW complexity computed from experimental data reveals a weak statistical relationship with condition (p = 0.04, one-tailed Wilcoxon test) and none with Ising temperature (r(13) = 0.13, p = 0.65), presumably because of the limited length of the BOLD time series. Similarly, we explore complexity using the block decomposition method (BDM), a more advanced method for estimating algorithmic complexity. The BDM complexity of the experimental data displays a significant correlation with Ising temperature (r(13) = 0.56, p = 0.03) and a weak but significant association with condition (p = 0.04, one-tailed Wilcoxon test). This study suggests that the effects of LSD increase the complexity of brain dynamics by loosening interhemispheric connectivity, especially homotopic links. In agreement with earlier work using the Ising formalism with BOLD data, we find that the brain state in the placebo condition is already above the critical point, with LSD resulting in a shift further away from criticality into a more disordered state.
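For readers unfamiliar with the simulation step described above, the sketch below shows a generic Metropolis sampler for a pairwise Ising model at temperature T and a zlib-based stand-in for LZW complexity of the binarized output; the random couplings and all parameter values are placeholders, not the dMRI-informed archetype used in the study.

```python
# Sketch (not the study code): Metropolis sampling of a pairwise Ising model at
# temperature T, plus a crude Lempel-Ziv-style complexity of the binary output.
import numpy as np
import zlib

def metropolis_sample(J, T, n_steps=20000, seed=0):
    """Generate spin configurations s in {-1,+1}^N from an Ising model with couplings J."""
    rng = np.random.default_rng(seed)
    N = J.shape[0]
    s = rng.choice([-1, 1], size=N)
    samples = []
    for t in range(n_steps):
        i = rng.integers(N)
        dE = 2.0 * s[i] * (J[i] @ s)          # energy change of flipping spin i (J_ii = 0)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
        if t % 100 == 0:
            samples.append(s.copy())
    return np.array(samples)

def lz_complexity(binary_matrix):
    """Compressed size of the binarized series as a rough LZ-style complexity proxy."""
    bits = ((binary_matrix + 1) // 2).astype(np.uint8).tobytes()
    return len(zlib.compress(bits, level=9))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    J = rng.normal(scale=0.1, size=(90, 90)); J = (J + J.T) / 2; np.fill_diagonal(J, 0.0)
    for T in (1.0, 2.0):                      # e.g. a colder vs. a more disordered regime
        X = metropolis_sample(J, T)
        print(T, lz_complexity(X))
```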
Subjects
Hallucinogens; Humans; Hallucinogens/pharmacology; Lysergic Acid Diethylamide/pharmacology; Temperature; Brain; Magnetic Resonance Imaging/methods
ABSTRACT
The relationship between stochastic transcriptional bursts and dynamic 3D chromatin states is not well understood. Using an innovative, ultra-sensitive technique, we address here enigmatic features underlying the communication between MYC and its enhancers in relation to the transcriptional process. MYC interacts with its flanking enhancers in a mutually exclusive manner, documenting that the enhancer hubs impinging on MYC detected in large cell populations likely do not exist in single cells. Dynamic encounters with pathologically activated enhancers responsive to a range of environmental cues involved <10% of active MYC alleles at any given time in colon cancer cells. Being the most central node of the chromatin network, MYC itself likely drives its communication with flanking enhancers, rather than vice versa. We submit that these features underlie an acquired ability of MYC to become dynamically activated in response to a diverse range of environmental cues encountered by the cell during the neoplastic process.
Subjects
Carcinogenesis/genetics; Chromatin Assembly and Disassembly; Gene Expression Regulation, Neoplastic; Proto-Oncogene Proteins c-myc/genetics; Animals; Drosophila; Gene Regulatory Networks; HCT116 Cells; Humans; Proto-Oncogene Proteins c-myc/metabolism; Stochastic Processes
ABSTRACT
OBJECTIVE: The objective of this study is to determine the prevalence and predictors of sickness absence (SA) and disability pension (DP) in women with metastatic breast cancer (mBC). METHODS: Data were obtained from Swedish registers concerning 1,240 adult women diagnosed with mBC in 1997-2011, covering the period from 1 year before diagnosis (y-1) to 1 year (y1) and 2 years (y2) after diagnosis. SA and DP prevalence was calculated. Adjusted odds ratios (AORs) were determined for factors associated with using long-term (SA > 180 days or DP > 0 days) sickness benefits. RESULTS: Prevalence of SA and DP was 56.0% and 24.8% during y-1, 69.9% and 28.9% during y1, and 64.0% and 34.7% during y2, respectively. Odds of using long-term sickness benefits in y1 and y2 were higher in patients who had used long-term sickness benefits the year before diagnosis (AOR = 3.82, 95% CI 2.91-5.02; AOR = 4.31, 95% CI 2.96-6.29, respectively), and in y2 in patients diagnosed with mBC in 1997-2000 (AOR = 1.84, 95% CI 1.10-3.08) and in those using long-term sickness benefits the year after diagnosis (AOR = 22.10, 95% CI 14.33-34.22). CONCLUSIONS: The prevalence of sickness benefit utilisation was high and increased after mBC diagnosis, particularly among patients using long-term sickness benefits prior to diagnosis. Further study is needed to determine factors that might reduce the need for sickness benefits and enhance work ability in these patients.
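For illustration only, the hedged sketch below shows how a period prevalence and an adjusted odds ratio of the kind reported above can be computed with logistic regression; the data frame, variable names and adjustment set are synthetic assumptions, not the Swedish register data.

```python
# Illustrative only: period prevalence and an adjusted odds ratio (AOR);
# the data and variable names are synthetic stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1240
df = pd.DataFrame({
    "long_term_y1": rng.integers(0, 2, n),    # long-term benefits the year after diagnosis
    "long_term_pre": rng.integers(0, 2, n),   # long-term benefits the year before diagnosis
    "age_group": rng.choice(["<=50", ">50"], n),
})

# Period prevalence: proportion of women with long-term benefit use during y1.
prevalence_y1 = df["long_term_y1"].mean()

# AOR for prior benefit use, adjusted for age group, via logistic regression.
model = smf.logit("long_term_y1 ~ long_term_pre + C(age_group)", data=df).fit(disp=0)
aor = np.exp(model.params["long_term_pre"])
ci_low, ci_high = np.exp(model.conf_int().loc["long_term_pre"])
print(f"prevalence y1 = {prevalence_y1:.1%}, AOR = {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```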
Subjects
Breast Neoplasms; Disabled Persons; Adult; Breast Neoplasms/diagnosis; Breast Neoplasms/therapy; Cohort Studies; Female; Humans; Pensions; Risk Factors; Sick Leave; Sweden/epidemiology
ABSTRACT
Dysregulation of signaling pathways in multiple sclerosis (MS) can be analyzed by phosphoproteomics in peripheral blood mononuclear cells (PBMCs). We performed in vitro kinetic assays on PBMCs from 195 MS patients and 60 matched controls and quantified the phosphorylation of 17 kinases using xMAP assays. Phosphoprotein levels were tested for association with genetic susceptibility by typing 112 single-nucleotide polymorphisms (SNPs) associated with MS susceptibility. We found increased phosphorylation of MP2K1 in MS patients relative to the controls. Moreover, we identified one SNP located in the PHDGH gene and another in the IRF8 gene that were associated with MP2K1 phosphorylation levels, providing a first clue as to how this MS risk gene may act. Analyses of patients treated with disease-modifying drugs identified the phosphorylation of the kinases downstream of each drug's receptor. Finally, using flow cytometry, we detected increased STAT1, STAT3, TF65, and HSPB1 phosphorylation in CD19+ cells from MS patients. These findings indicate the activation of cell survival and proliferation (MAPK) and proinflammatory (STAT) pathways in the immune cells of MS patients, primarily in B cells. The changes in the activation of these kinases suggest that these pathways may represent therapeutic targets for modulation by kinase inhibitors.
Subjects
B-Lymphocytes; MAP Kinase Signaling System/genetics; Multiple Sclerosis; Phosphoproteins; Polymorphism, Single Nucleotide; Proteomics; B-Lymphocytes/metabolism; B-Lymphocytes/pathology; Cell Proliferation; Cell Survival; Female; Humans; Male; Multiple Sclerosis/genetics; Multiple Sclerosis/metabolism; Multiple Sclerosis/pathology; Phosphoproteins/genetics; Phosphoproteins/metabolism; Phosphorylation/genetics; Protein Kinases/genetics; Protein Kinases/metabolism
ABSTRACT
BACKGROUND: Regulatory T cells (Tregs) expressing the transcription factor FOXP3 are crucial mediators of self-tolerance, preventing autoimmune diseases but possibly hampering tumor rejection. Clinical manipulation of Tregs is of great interest, and first-in-man trials of Treg transfer have achieved promising outcomes. Yet, the mechanisms governing induced Treg (iTreg) differentiation and the regulation of FOXP3 are incompletely understood. RESULTS: To gain a comprehensive and unbiased molecular understanding of FOXP3 induction, we performed time-series RNA sequencing (RNA-Seq) and proteomics profiling on the same samples during human iTreg differentiation. To enable the broad analysis of universal FOXP3-inducing pathways, we used five differentiation protocols in parallel. Integrative analysis of the transcriptome and proteome confirmed involvement of specific molecular processes, as well as overlap of a novel iTreg subnetwork with known Treg regulators and autoimmunity-associated genes. Importantly, we propose 37 novel molecules putatively involved in iTreg differentiation. Their relevance was validated by a targeted shRNA screen confirming a functional role in FOXP3 induction, discriminant analyses classifying iTregs accordingly, and comparable expression in an independent novel iTreg RNA-Seq dataset. CONCLUSION: The data generated by this novel approach facilitates understanding of the molecular mechanisms underlying iTreg generation as well as of the concomitant changes in the transcriptome and proteome. Our results provide a reference map exploitable for future discovery of markers and drug candidates governing control of Tregs, which has important implications for the treatment of cancer, autoimmune, and inflammatory diseases.
Subjects
Forkhead Transcription Factors/metabolism; Proteome/metabolism; T-Lymphocytes, Regulatory/metabolism; Transcriptome/physiology; Cell Differentiation/genetics; Cell Differentiation/physiology; Cell Line; Forkhead Transcription Factors/genetics; Gene Expression Regulation; Humans; Sequence Analysis, RNA; Signal Transduction; Transcriptome/genetics; Transforming Growth Factor beta/genetics; Transforming Growth Factor beta/metabolism
ABSTRACT
The principle of maximum entropy (Maxent) is often used to obtain prior probability distributions, as a method to obtain a Gibbs measure under some restriction, giving the probability that a system will be in a certain state compared to the rest of the elements in the distribution. Because classical entropy-based Maxent collapses cases, confounding all distinct degrees of randomness and pseudo-randomness, here we take into consideration the generative mechanism of the systems considered in the ensemble. This allows us to separate objects that may comply with the principle under some restriction and whose entropy is maximal but that may be generated recursively from those that are actually algorithmically random, thereby offering a refinement of classical Maxent. We take advantage of a causal algorithmic calculus to derive a thermodynamic-like result based on how difficult it is to reprogram a computer code. Using the distinction between computable and algorithmic randomness, we quantify the cost in information loss associated with reprogramming. To illustrate this, we apply the algorithmic refinement to Maxent on graphs and introduce a Maximal Algorithmic Randomness Preferential Attachment (MARPA) algorithm, a generalisation of previous approaches. We discuss the practical implications of evaluating network randomness. Our analysis suggests that the reprogrammability asymmetry appears to originate from a non-monotonic relationship to algorithmic probability, and it motivates further analysis of the origin and consequences of the aforementioned asymmetries, of reprogrammability, and of computation.
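The sketch below illustrates the flavour of a MARPA-style attachment step: each new node is connected to the candidates that most increase an approximated randomness score. Here zlib compression is used only as a crude stand-in for the algorithmic-probability estimators discussed in the text, and the graph and parameters are arbitrary.

```python
# Sketch of a greedy, randomness-maximizing attachment step; zlib is a stand-in
# for the algorithmic-probability/BDM estimators, not the published method.
import zlib
import networkx as nx

def complexity_proxy(G):
    """Compressed length of the adjacency matrix string: a crude randomness proxy."""
    nodes = sorted(G.nodes())
    bits = "".join("1" if G.has_edge(u, v) else "0" for u in nodes for v in nodes)
    return len(zlib.compress(bits.encode(), level=9))

def attach_max_randomness(G, new_node, n_edges=2):
    """Attach new_node with the edges that most increase the complexity proxy."""
    G.add_node(new_node)
    for _ in range(n_edges):
        best, best_score = None, -1
        for v in G.nodes():
            if v == new_node or G.has_edge(new_node, v):
                continue
            G.add_edge(new_node, v)            # try the candidate edge
            score = complexity_proxy(G)
            G.remove_edge(new_node, v)
            if score > best_score:
                best, best_score = v, score
        G.add_edge(new_node, best)
    return G

if __name__ == "__main__":
    G = nx.path_graph(6)
    attach_max_randomness(G, 6)
    print(sorted(G.edges()))
```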
ABSTRACT
We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdös-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
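As a minimal companion to the comparison above, the following sketch contrasts the Shannon entropy of a graph's degree distribution with a lossless-compressibility proxy (zlib-compressed adjacency matrix) for Erdös-Rényi graphs of different densities; these are illustrative counterparts of the measures discussed, not the exact estimators used in the paper.

```python
# Minimal sketch: degree-distribution entropy vs. adjacency-matrix compressibility
# for Erdos-Renyi graphs (illustrative measures only).
import zlib
import numpy as np
import networkx as nx

def degree_entropy(G):
    """Shannon entropy (bits) of the empirical degree distribution."""
    degs = np.array([d for _, d in G.degree()])
    _, counts = np.unique(degs, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def adjacency_compressed_size(G):
    """Length of the zlib-compressed adjacency matrix (a compressibility proxy)."""
    A = nx.to_numpy_array(G, dtype=np.uint8)
    return len(zlib.compress(A.tobytes(), level=9))

if __name__ == "__main__":
    for p in (0.01, 0.1, 0.5):
        G = nx.erdos_renyi_graph(200, p, seed=0)
        print(p, round(degree_entropy(G), 3), adjacency_compressed_size(G))
```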
Subjects
Information Theory; Metabolic Networks and Pathways; Algorithms; Animals; Entropy; Humans
ABSTRACT
Network inference is a rapidly advancing field, with new methods being proposed on a regular basis. Understanding the advantages and limitations of different network inference methods is key to their effective application in different circumstances. The common structural properties shared by diverse networks naturally pose a challenge when it comes to devising accurate inference methods, but surprisingly, there is a paucity of comparison and evaluation methods. Historically, every new methodology has only been tested against gold-standard (true-value), purpose-designed synthetic networks and validated real-world biological networks. In this paper we aim to assess the impact of taking topological and information-content aspects into consideration when evaluating the final accuracy of an inference procedure. Specifically, we compare the best inference methods, in both graph-theoretic and information-theoretic terms, for preserving topological properties and the original information content of synthetic and biological networks. New methods for performance comparison are introduced by borrowing ideas from gene set enrichment analysis and by applying concepts from algorithmic complexity. Experimental results show that no individual algorithm outperforms all others in all cases, and that the challenging and non-trivial nature of network inference is evident in the struggle of some of the algorithms to turn in a performance that is superior to random guesswork. Therefore, special care should be taken to suit the method to the purpose at hand. Finally, we show that evaluations from data generated using different underlying topologies have different signatures that can be used to better choose a network reconstruction method.
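A hedged sketch of the kind of evaluation discussed above: it scores an inferred network against a gold standard both at the edge level (precision/recall) and with a simple topological comparison of degree distributions. The example networks and the distance measure are illustrative choices, not those of the paper.

```python
# Sketch: edge-level and topological evaluation of an inferred network
# against a gold standard (illustrative only).
import numpy as np
import networkx as nx

def edge_precision_recall(G_true, G_pred):
    true_edges = {frozenset(e) for e in G_true.edges()}
    pred_edges = {frozenset(e) for e in G_pred.edges()}
    tp = len(true_edges & pred_edges)
    precision = tp / len(pred_edges) if pred_edges else 0.0
    recall = tp / len(true_edges) if true_edges else 0.0
    return precision, recall

def degree_distribution_distance(G_true, G_pred, bins=10):
    """L1 distance between degree histograms (a crude topological score)."""
    d1 = [d for _, d in G_true.degree()]
    d2 = [d for _, d in G_pred.degree()]
    hi = max(max(d1), max(d2)) + 1
    h1, _ = np.histogram(d1, bins=bins, range=(0, hi), density=True)
    h2, _ = np.histogram(d2, bins=bins, range=(0, hi), density=True)
    return float(np.abs(h1 - h2).sum())

if __name__ == "__main__":
    truth = nx.barabasi_albert_graph(100, 2, seed=0)
    guess = nx.erdos_renyi_graph(100, 0.04, seed=1)   # stand-in for an inference result
    print(edge_precision_recall(truth, guess), degree_distribution_distance(truth, guess))
```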
Subjects
Gene Regulatory Networks; Algorithms; Animals; Bayes Theorem; Entropy; Humans; Models, Genetic; Reverse Genetics
ABSTRACT
MOTIVATION: The use of ordinary differential equations (ODEs) is one of the most promising approaches to network inference. The success of ODE-based approaches has, however, been limited by the difficulty of estimating parameters and by their lack of scalability. Here, we introduce a novel method and pipeline to reverse-engineer gene regulatory networks from time-series gene expression and perturbation data, based upon an improved scheme for calculating derivatives and a pre-filtering step that reduces the number of possible links. The method introduces a linear differential equation model with adaptive numerical differentiation that is scalable to extremely large regulatory networks. RESULTS: We demonstrate the ability of this method to outperform current state-of-the-art methods applied to experimental and synthetic data, using test data from the DREAM4 and DREAM5 challenges. Our method displays greater accuracy and scalability. We benchmark the performance of the pipeline with respect to dataset size and levels of noise, and show that the computation time is linear over various network sizes. AVAILABILITY AND IMPLEMENTATION: The Matlab code of the HiDi implementation is available at: www.complexitycalculator.com/HiDiScript.zip. CONTACT: hzenilc@gmail.com or narsis.kiani@ki.se. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
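The general idea behind a linear ODE model with numerical differentiation can be sketched as follows; this is not the HiDi code, and the Lasso penalty, Euler simulation and parameter values are assumptions made for illustration.

```python
# Hedged sketch of ODE-based inference: fit dx_i/dt ≈ sum_j A_ij x_j from
# time-series data using numerical derivatives and a sparse regression per gene.
import numpy as np
from sklearn.linear_model import Lasso

def infer_linear_ode_network(X, t, alpha=0.01):
    """X: (timepoints, genes) expression matrix; t: time vector. Returns A (genes x genes)."""
    dXdt = np.gradient(X, t, axis=0)                 # simple numerical differentiation
    n_genes = X.shape[1]
    A = np.zeros((n_genes, n_genes))
    for i in range(n_genes):
        reg = Lasso(alpha=alpha, max_iter=10000).fit(X, dXdt[:, i])
        A[i, :] = reg.coef_                          # nonzero entries suggest regulatory links
    return A

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A_true = np.array([[-1.0, 0.8, 0.0], [0.0, -1.0, 0.5], [0.0, 0.0, -1.0]])
    t = np.linspace(0, 5, 50)
    X = np.zeros((len(t), 3)); X[0] = [1.0, 0.5, 1.5]
    for k in range(1, len(t)):                       # crude Euler simulation of dx/dt = A x
        X[k] = X[k-1] + (t[k] - t[k-1]) * (A_true @ X[k-1]) + rng.normal(scale=0.01, size=3)
    print(np.round(infer_linear_ode_network(X, t), 2))
```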
Subjects
Algorithms; Computational Biology/methods; Gene Regulatory Networks; Benchmarking; Gene Expression; Models, Genetic
ABSTRACT
Multiple Sclerosis (MS) is an autoimmune disease driving inflammatory and degenerative processes that damage the central nervous system (CNS). However, it is not well understood how these events interact and evolve to evoke such a highly dynamic and heterogeneous disease. We established a hypothesis whereby the variability in the course of MS is driven by the very same pathogenic mechanisms responsible for the disease, the autoimmune attack on the CNS that leads to chronic inflammation, neuroaxonal degeneration and remyelination. We propose that each of these processes acts more or less severely and at different times in each of the clinical subgroups. To test this hypothesis, we developed a mathematical model that was constrained by experimental data (the expanded disability status scale [EDSS] time series) obtained from a retrospective longitudinal cohort of 66 MS patients with a long-term follow-up (up to 20 years). Moreover, we validated this model in a second prospective cohort of 120 MS patients with a three-year follow-up, for which EDSS data and brain volume time series were available. The clinical heterogeneity in the datasets was reduced by grouping the EDSS time series using an unsupervised clustering analysis. We found that by adjusting certain parameters, albeit within their biological range, the mathematical model reproduced the different disease courses, supporting the dynamic CNS damage hypothesis to explain MS heterogeneity. Our analysis suggests that the irreversible axon degeneration produced in the early stages of progressive MS is mainly due to the higher rate of myelinated axon degeneration, coupled to the lower capacity for remyelination. However, and in agreement with recent pathological studies, degeneration of chronically demyelinated axons is not a key feature that distinguishes this phenotype. Moreover, the model reveals that lower rates of axon degeneration and more rapid remyelination make relapsing MS more resilient than the progressive subtype. Therefore, our results support the hypothesis of a common pathogenesis for the different MS subtypes, even in the presence of genetic and environmental heterogeneity. Hence, MS can be considered as a single disease in which specific dynamics can provoke a variety of clinical outcomes in different patient groups. These results have important implications for the design of therapeutic interventions for MS at different stages of the disease.
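The grouping step mentioned above (reducing clinical heterogeneity by clustering EDSS time series) can be illustrated with a short, hedged sketch using hierarchical clustering on synthetic trajectories; the trajectory shapes, Ward linkage and the choice of two groups are assumptions, not the study's actual configuration.

```python
# Illustrative only: unsupervised grouping of EDSS-like time series
# (synthetic trajectories, not patient data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
t = np.arange(0, 20)                                             # years of follow-up
relapsing = 2.0 + 0.05 * t + rng.normal(0, 0.3, (30, t.size))    # slow accumulation
progressive = 2.0 + 0.25 * t + rng.normal(0, 0.3, (30, t.size))  # steady progression
edss = np.clip(np.vstack([relapsing, progressive]), 0, 10)       # EDSS is bounded at 0-10

Z = linkage(edss, method="ward")                      # Euclidean distance between trajectories
labels = fcluster(Z, t=2, criterion="maxclust")       # ask for two trajectory groups
print(np.bincount(labels))                            # sizes of the recovered groups
```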
Subjects
Brain; Computational Biology/methods; Image Processing, Computer-Assisted/methods; Multiple Sclerosis; Brain/diagnostic imaging; Brain/physiopathology; Databases, Factual; Humans; Inflammation; Magnetic Resonance Imaging; Multiple Sclerosis/classification; Multiple Sclerosis/diagnostic imaging; Multiple Sclerosis/physiopathology; Prospective Studies
ABSTRACT
We introduce a definition of algorithmic symmetry in the context of geometric and spatial complexity that is able to capture mathematical aspects of different objects, using polyominoes and polyhedral graphs as case studies. We review, study and apply a method for approximating the algorithmic complexity (also known as Kolmogorov-Chaitin complexity) of graphs and networks based on the concept of Algorithmic Probability (AP). AP is a concept (and method) capable of recursively enumerating all properties of a computable (causal) nature beyond statistical regularities. We explore the connections of algorithmic complexity, both theoretical and numerical, with geometric properties, mainly symmetry and topology, from an (algorithmic) information-theoretic perspective. We show that approximations to algorithmic complexity by lossless compression and an Algorithmic Probability-based method can characterize spatial, geometric, symmetric and topological properties of mathematical objects and graphs.
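As a toy companion to the notion of algorithmic symmetry, the sketch below counts the distinct images of a polyomino under the eight dihedral transforms to obtain its symmetry order, and uses zlib-compressed size as a very crude complexity proxy in place of the Algorithmic Probability-based estimators used in the paper.

```python
# Toy sketch relating symmetry and a complexity proxy for polyominoes.
import zlib
import numpy as np

def canonical(cells):
    """Translation-normalized, sorted tuple of cells."""
    arr = np.array(sorted(cells))
    arr -= arr.min(axis=0)
    return tuple(sorted(tuple(int(v) for v in row) for row in arr))

def dihedral_images(cells):
    """Distinct canonical forms under the 8 rotations/reflections."""
    pts = np.array(list(cells))
    images = set()
    for _ in range(4):
        pts = pts @ np.array([[0, -1], [1, 0]])           # rotate 90 degrees
        images.add(canonical(map(tuple, pts)))
        images.add(canonical(map(tuple, pts * [1, -1])))  # add a reflection
    return images

def compressed_size(cells):
    """zlib size of the canonical form: a very crude complexity stand-in."""
    return len(zlib.compress(str(canonical(cells)).encode(), level=9))

if __name__ == "__main__":
    square = {(0, 0), (0, 1), (1, 0), (1, 1)}    # highly symmetric tetromino
    s_piece = {(0, 0), (1, 0), (1, 1), (2, 1)}   # less symmetric tetromino
    for shape in (square, s_piece):
        order = 8 // len(dihedral_images(shape)) # symmetry group order
        print(order, compressed_size(shape))
```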
ABSTRACT
Information-theoretic measures have been useful in quantifying network complexity. Here we briefly survey and contrast (algorithmic) information-theoretic methods that have been used to characterize graphs and networks. We illustrate the strengths and limitations of Shannon's entropy, lossless compressibility and algorithmic complexity when used to identify aspects and properties of complex networks. We review the fragility of computable measures on the one hand and the invariant properties of algorithmic measures on the other, demonstrating how current approaches to algorithmic complexity are misguided and suffer from limitations similar to those of traditional statistical approaches such as Shannon entropy. Finally, we review some current definitions of algorithmic complexity that are used in analyzing labelled and unlabelled graphs. This analysis opens up several new opportunities to advance beyond traditional measures.
ABSTRACT
We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Coding Theorem Method (CTM) that approximates local estimations of algorithmic complexity based on Solomonoff-Levin's theory of algorithmic probability, providing a closer connection to algorithmic complexity than previous attempts based on statistical regularities such as popular lossless compression schemes. The strategy behind BDM is to find small computer programs that produce the components of a larger, decomposed object. The set of short computer programs can then be artfully arranged in sequence so as to produce the original object. We show that the method provides efficient estimations of algorithmic complexity but that it performs like Shannon entropy when it loses accuracy. We estimate errors and study the behaviour of BDM for different boundary conditions, all of which are compared and assessed in detail. The measure may be adapted for use with multi-dimensional objects beyond strings, such as arrays and tensors. To test the measure we demonstrate the power of CTM on low-algorithmic-randomness objects that are assigned maximal entropy (e.g., π) but whose numerical approximations are closer to the theoretical low algorithmic-randomness expectation. We also test the measure on larger objects, including dual, isomorphic and cospectral graphs, for which we know that algorithmic randomness is low. Finally, we release implementations of the methods in most major programming languages (Wolfram Language/Mathematica, Matlab, R, Perl, Python, Pascal, C++, and Haskell), together with an online algorithmic complexity calculator.
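The aggregation rule behind BDM can be stated compactly: BDM(x) = sum over unique blocks b of [CTM(b) + log2(n_b)], where n_b is the multiplicity of block b in the decomposition. The sketch below implements that rule for binary strings, with an explicitly fake ctm() lookup standing in for the precomputed CTM tables.

```python
# Minimal sketch of the Block Decomposition Method aggregation rule:
# BDM(x) = sum over unique blocks of [CTM(block) + log2(multiplicity)].
import math
from collections import Counter

def ctm(block: str) -> float:
    """Placeholder for a CTM table lookup (NOT real CTM values)."""
    return float(len(set(block)) + len(block) / 4)

def bdm(x: str, block_size: int = 12) -> float:
    blocks = [x[i:i + block_size] for i in range(0, len(x), block_size)]
    counts = Counter(b for b in blocks if len(b) == block_size)   # drop the ragged tail block
    return sum(ctm(b) + math.log2(n) for b, n in counts.items())

if __name__ == "__main__":
    print(bdm("01" * 60))                                 # regular string: few unique blocks
    print(bdm("011010111001011011010001101001101100110110011010"))  # more irregular string
```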
ABSTRACT
The pathogenesis of multiple sclerosis (MS) involves alterations to multiple pathways and processes, which represent a significant challenge for developing more-effective therapies. Systems biology approaches that study pathway dysregulation should offer benefits by integrating molecular networks and dynamic models with current biological knowledge for understanding disease heterogeneity and response to therapy. In MS, abnormalities have been identified in several cytokine-signaling pathways, as well as in those of other immune receptors. Among the downstream molecules implicated are Jak/Stat, NF-κB, ERK1/3, p38 and Jun/Fos. Together, these data suggest that MS is likely to be associated with abnormalities in apoptosis/cell death, microglia activation, blood-brain barrier functioning, immune responses, cytokine production, and/or oxidative stress, although which pathways contribute to the cascade of damage and can be modulated remains an open question. While current MS drugs target some of these pathways, others remain untouched. Here, we propose a pragmatic systems analysis approach that involves the large-scale extraction of processes and pathways relevant to MS. These data serve as a scaffold on which computational modeling can be performed to identify disease subgroups based on the contribution of different processes. Such an analysis, targeting these relevant MS-signaling pathways, offers the opportunity to accelerate the development of novel individual or combination therapies.
Subjects
Multiple Sclerosis/drug therapy; Multiple Sclerosis/metabolism; Signal Transduction/drug effects; Signal Transduction/physiology; Drug Discovery; Humans
ABSTRACT
BACKGROUND: Network inference deals with the reconstruction of molecular networks from experimental data. Given N molecular species, the challenge is to find the underlying network. Due to data limitations, this is typically an ill-posed problem that requires the integration of prior biological knowledge or strong regularization. We here focus on the situation in which time-resolved measurements of a system's response after systematic perturbations are available. RESULTS: We present a novel method to infer signaling networks from time-course perturbation data. We utilize dynamic Bayesian networks with probabilistic Boolean threshold functions to describe protein activation. The model posterior distribution is analyzed using evolutionary MCMC sampling and subsequent clustering, resulting in probability distributions over alternative networks. We evaluate our method on simulated data and study its performance with respect to data set size and levels of noise. We then use our method to study EGF-mediated signaling in the ERBB pathway. CONCLUSIONS: Dynamic Probabilistic Threshold Networks is a new method to infer signaling networks from time-series perturbation data. It exploits the dynamic response of a system after external perturbation for network reconstruction. On simulated data, we show that the approach outperforms current state-of-the-art methods. On the ERBB data, our approach recovers a significant fraction of the known interactions and predicts novel mechanisms in the ERBB pathway.
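A probabilistic Boolean threshold function of the kind mentioned above can be sketched as follows: the probability that a protein is active at the next time step is a sigmoid of a weighted sum of its parents' current states. The three-node network, weights and thresholds below are invented for illustration and are unrelated to the ERBB model.

```python
# Hedged sketch of a probabilistic Boolean threshold update for protein activation.
import numpy as np

def simulate(W, theta, x0, n_steps=10, beta=5.0, seed=0):
    """W[i, j]: influence of protein j on protein i; theta: activation thresholds."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    trajectory = [x.copy()]
    for _ in range(n_steps):
        p_on = 1.0 / (1.0 + np.exp(-beta * (W @ x - theta)))   # probabilistic threshold rule
        x = (rng.random(len(x)) < p_on).astype(float)
        trajectory.append(x.copy())
    return np.array(trajectory)

if __name__ == "__main__":
    #             node A node B node C (hypothetical proteins)
    W = np.array([[0.0,   0.0,   0.0],   # A driven only by its threshold (external input)
                  [1.5,   0.0,   0.0],   # B activated by A
                  [1.0,  -0.5,   0.0]])  # C activated by A, dampened by B
    theta = np.array([-1.0, 0.5, 0.5])   # negative threshold keeps A mostly "on"
    print(simulate(W, theta, x0=[1, 0, 0]))
```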
Subjects
Algorithms; Signal Transduction; Systems Biology/methods; Bayes Theorem; Markov Chains; Monte Carlo Method; Time Factors
ABSTRACT
BACKGROUND AND HYPOTHESIS: Chronic Obstructive Pulmonary Disease (COPD) patients are characterized by heterogeneous clinical manifestations and patterns of disease progression. Two major factors that can be used to identify COPD subtypes are muscle dysfunction/wasting and co-morbidity patterns. We hypothesized that COPD heterogeneity is in part the result of complex interactions between several genes and pathways. We explored the possibility of using a Systems Medicine approach to identify such pathways, as well as to generate predictive computational models that may be used in clinical practice. OBJECTIVE AND METHOD: Our overarching goal is to generate clinically applicable predictive models that characterize COPD heterogeneity through a Systems Medicine approach. To this end we have developed a general framework, consisting of three steps/objectives: (1) feature identification, (2) model generation and statistical validation, and (3) application and validation of the predictive models in the clinical scenario. We used muscle dysfunction and co-morbidity as test cases for this framework. RESULTS: In the study of muscle wasting we identified relevant features (genes) by a network analysis and generated predictive models that integrate mechanistic and probabilistic models. This allowed us to characterize muscle wasting as a general de-regulation of pathway interactions. In the co-morbidity analysis we identified relevant features (genes/pathways) by integrating gene-disease and disease-disease associations. We further present a detailed characterization of co-morbidities in COPD patients that was implemented into a predictive model. In both use cases we were able to achieve predictive modeling, but we also identified several key challenges, the most pressing being validation and implementation in actual clinical practice. CONCLUSIONS: The results confirm the potential of the Systems Medicine approach to study complex diseases and generate clinically relevant predictive models. Our study also highlights important obstacles and bottlenecks for such approaches (e.g., data availability and normalization of frameworks, among others) and suggests specific proposals to overcome them.
Subjects
Decision Support Systems, Clinical; Pulmonary Disease, Chronic Obstructive/diagnosis; Pulmonary Disease, Chronic Obstructive/therapy; Biomarkers/metabolism; Comorbidity; Computer Simulation; Energy Metabolism; Humans; Muscle, Skeletal/pathology; Oxygen/chemistry; Reactive Oxygen Species; Translational Research, Biomedical/methods
ABSTRACT
We demonstrate that the assembly pathway method underlying assembly theory (AT) is an encoding scheme widely used by popular statistical compression algorithms. We show that in all cases (synthetic or natural) AT performs similarly to other simple coding schemes and underperforms compared to system-related indexes based upon algorithmic probability, which take into account not only statistical repetitions but also the likelihood of other computable patterns. Our results imply that the assembly index does not offer substantial improvements over existing methods, including traditional statistical ones, and that the separation between living and non-living compounds achieved by these methods has been reported before.
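The parallel drawn above between assembly pathways and statistical coding schemes can be made concrete with a small sketch: an LZ78-style parse builds a string from previously constructed parts, and its phrase count behaves like a simple coding-scheme index, shown next to zlib-compressed length. This is an illustration under those assumptions, not the published assembly-index algorithm.

```python
# Sketch: an LZ78-style parse assembles a string from previously built parts;
# the phrase count is a simple coding-scheme index (illustrative only).
import zlib

def lz78_phrase_count(s: str) -> int:
    """Number of phrases in an LZ78 parse (each phrase = known prefix + one new symbol)."""
    dictionary, phrase, count = {""}, "", 0
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch                      # reuse an already-assembled part
        else:
            dictionary.add(phrase + ch)       # assemble a new part from a known one
            phrase = ""
            count += 1
    return count + (1 if phrase else 0)

if __name__ == "__main__":
    repetitive = "ABAB" * 20
    irregular = "ABBABAABBAABABBAABBABAABABBAABBAABABABBAABABABAABBABABBABABABAABABBABABBABBABABA"
    for s in (repetitive, irregular):
        print(lz78_phrase_count(s), len(zlib.compress(s.encode(), level=9)))
```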
Subjects
Algorithms; Computational Biology/methods
ABSTRACT
BACKGROUND: Planning for return to work (RTW) is relevant among sub-groups of metastatic breast cancer (mBC) survivors. RTW and protective factors for RTW in patients with mBC were determined. METHODS: Patients with mBC, aged 18-63 years, were identified in Swedish registers, and data were collected starting 1 year before their mBC diagnosis. The prevalence of working net days (WNDs) (>90 and >180) during the year after mBC diagnosis (y1) was determined. Factors associated with RTW were assessed using regression analysis. The impact of contemporary oncological treatment of mBC on RTW and 5-year mBC-specific survival was compared between those diagnosed in 1997-2002 and in 2003-2011. RESULTS: Of 490 patients, 239 (48.8%) and 189 (36.8%) had >90 and >180 WNDs, respectively, during y1. Adjusted odds ratios (AORs) of having >90 or >180 WNDs during y1 were significantly higher for patients aged ≤50 years (AOR180 = 1.54), with synchronous metastasis (AOR90 = 1.68, AOR180 = 1.67), metastasis within 24 months (AOR180 = 1.51), soft tissue, visceral or brain as the first metastatic site (AOR90 = 1.47), and sickness absence <90 net days in the year before mBC diagnosis, suggesting limited comorbidities (AOR90 = 1.28, AOR180 = 2.00). Mean (standard deviation) WNDs were 134.9 (140.1) and 161.3 (152.4) for patients diagnosed with mBC in 1997-2002 and 2003-2011, respectively (p = 0.046). Median (standard error) mBC-specific survival was 41.0 (2.5) and 62.0 (9.6) months for patients diagnosed with mBC in 1997-2002 and 2003-2011, respectively (p < 0.001). CONCLUSIONS: RTW of more than 180 WNDs was associated with younger age, early development of metastases and limited comorbidities during the year before the diagnosis of mBC. Patients diagnosed with mBC in 2003 or later had more WNDs and better survival than those diagnosed earlier.