Results 1 - 11 of 11
1.
Syst Biol; 66(6): 950-963, 2017 Nov 01.
Article in English | MEDLINE | ID: mdl-28204787

ABSTRACT

Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently have large, accurate, and well-calibrated phylogenies become available that allow this hypothesis to be tested directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work, we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriini parrots, where we find strong signal for such jumps at the base of clades that transitioned into new adaptive zones, just as postulated by Simpson's hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits.]
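A minimal sketch of the trait model the abstract describes, assuming Python with NumPy: Brownian diffusion plus compound-Poisson jumps along a single branch. All parameter values are illustrative, and the paper's actual contribution, an efficient sampler over jump configurations on whole trees, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_branch(x0, t, sigma2, jump_rate, jump_var, n_steps=1000):
    """Simulate a quantitative trait along a branch of length t."""
    dt = t / n_steps
    x = x0
    for _ in range(n_steps):
        x += rng.normal(0.0, np.sqrt(sigma2 * dt))   # Brownian diffusion part
        if rng.random() < jump_rate * dt:            # Poisson jump event (first-order approx.)
            x += rng.normal(0.0, np.sqrt(jump_var))  # jump size drawn from the Lévy kernel
    return x

# A lineage entering a "new adaptive zone" corresponds to a branch on which a
# jump displaces the trait far beyond what diffusion alone would explain.
print(simulate_branch(x0=0.0, t=1.0, sigma2=0.1, jump_rate=0.5, jump_var=4.0))
```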


Subjects
Classification/methods, Genetic Models, Phylogeny, Algorithms, Animals, Biological Evolution, Lizards/classification, Parrots/classification
2.
Medicine (Baltimore); 98(17): e15194, 2019 Apr.
Article in English | MEDLINE | ID: mdl-31027064

ABSTRACT

INTRODUCTION: While the role of inflammation in acute coronary events is well established, the impact of inflammation-mediated vulnerability of coronary plaques across the entire coronary tree on the extent of ventricular remodeling and scarring has not yet been clarified. MATERIALS AND METHODS: The present manuscript describes the procedures of the VIABILITY trial, a descriptive prospective single-center cohort study. The main purpose of this trial is to assess the link between systemic inflammation, pan-coronary plaque vulnerability (plaque vulnerability within the entire coronary tree), myocardial viability, and ventricular remodeling in patients who have suffered a recent ST-segment elevation acute myocardial infarction (STEMI). One hundred patients with STEMI who underwent successful revascularization of the culprit lesion within the first 12 hours after the onset of symptoms will be enrolled in the study. The level of systemic inflammation will be evaluated based on serum biomarker levels (hs-CRP, matrix metalloproteinases, interleukin-6) in the acute phase of the myocardial infarction (MI) and at 1 month. Pan-coronary plaque vulnerability will be assessed based on serum biomarkers known to be associated with increased plaque vulnerability (VCAM-1, ICAM-1) and, at 1 month after infarction, based on computed tomography angiography analysis of the vulnerability features of all coronary plaques. Myocardial viability and remodeling will be assessed by 3D speckle-tracking echocardiography with dobutamine infusion and by LGE-CMR with post-processing imaging methods. The study population will be divided into 2 subgroups: subgroup 1 - subjects with STEMI and an increased inflammatory response at 7 days after the acute event (hs-CRP ≥ 3 mg/dl), and subgroup 2 - subjects with STEMI and no increased inflammatory response at 7 days (hs-CRP < 3 mg/dl). Study outcomes will consist of the rate of post-infarction heart failure development and the rate of major adverse cardiovascular events (MACE). CONCLUSION: VIABILITY is the first prospective study designed to evaluate the influence of the infarct-related inflammatory response on several major determinants of post-infarction outcomes, such as coronary plaque vulnerability, myocardial viability, and ventricular remodeling.


Subjects
Coronary Artery Disease/immunology, Inflammation/immunology, Atherosclerotic Plaque/immunology, ST Elevation Myocardial Infarction/immunology, Ventricular Remodeling/immunology, Biomarkers/blood, Coronary Artery Disease/blood, Coronary Artery Disease/diagnostic imaging, Humans, Inflammation/blood, Inflammation/diagnostic imaging, Atherosclerotic Plaque/blood, Atherosclerotic Plaque/diagnostic imaging, ST Elevation Myocardial Infarction/blood, ST Elevation Myocardial Infarction/diagnostic imaging, ST Elevation Myocardial Infarction/surgery
3.
Comput Methods Programs Biomed; 135: 15-26, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27586476

ABSTRACT

BACKGROUND: Graph-based hierarchical clustering algorithms become prohibitively costly in both execution time and storage space as the number of nodes approaches the order of millions. OBJECTIVE: A fast and highly memory-efficient Markov clustering algorithm is proposed to perform the classification of huge sparse networks on an ordinary personal computer. METHODS: Improvements over previous versions are achieved through adequately chosen data structures that facilitate the efficient handling of symmetric sparse matrices. Clustering is performed in two stages: the initial connected network is processed as a sparse matrix until it breaks into isolated, small, and relatively dense subgraphs, which are then processed separately until convergence. An intelligent stopping criterion is also proposed to abandon further processing of a subgraph that tends toward completeness with equal edge weights. The main advantage of this algorithm is that the necessary number of iterations is decided separately for each graph node. RESULTS: The proposed algorithm was tested on the SCOP95 and large synthetic protein sequence data sets. The validation process revealed that the proposed method reduces the processing time of huge sequence networks by a factor of 3-6 compared to previous Markov clustering solutions, without any loss of partition quality. CONCLUSIONS: A protein sequence network of one million nodes and one billion edges, defined by a BLAST similarity matrix, can be processed on a high-end personal computer in 100 minutes. Further improvement in speed is possible via parallel data processing, while the extension toward several million nodes requires intermediary data storage, for example on solid-state drives.
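For orientation, here is a minimal sketch of the Markov clustering (MCL) main loop on a SciPy sparse matrix. It shows only the generic expansion/inflation cycle; the paper's custom symmetric sparse structures, per-node iteration counts, and early-stopping criterion are not reproduced, and all parameter defaults are assumptions.

```python
import numpy as np
import scipy.sparse as sp

def column_normalize(M):
    col_sums = np.asarray(M.sum(axis=0)).ravel()
    col_sums[col_sums == 0.0] = 1.0                  # leave empty columns untouched
    return sp.csr_matrix(M.multiply(1.0 / col_sums))

def mcl(adjacency, inflation=2.0, iterations=50, prune=1e-6):
    M = column_normalize(sp.csr_matrix(adjacency, dtype=float))
    for _ in range(iterations):
        M = M @ M                                    # expansion: one random-walk step
        M = M.power(inflation)                       # inflation: sharpen column distributions
        M.data[M.data < prune] = 0.0                 # prune tiny entries to keep M sparse
        M.eliminate_zeros()
        M = column_normalize(M)                      # restore the probabilistic constraint
    return M                                         # nonzero rows of the limit act as cluster attractors
```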


Subjects
Algorithms, Markov Chains, Cluster Analysis
4.
Comput Biol Med; 48: 94-101, 2014 May.
Article in English | MEDLINE | ID: mdl-24657908

ABSTRACT

TRIBE-MCL is a Markov clustering algorithm that operates on a graph built from pairwise similarity information of the input data. Edge weights, stored in the stochastic similarity matrix, are alternately fed to the two main operations, inflation and expansion, and are normalized in each main loop to maintain the probabilistic constraint. In this paper we propose an efficient implementation of the TRIBE-MCL clustering algorithm, suitable for fast and accurate grouping of protein sequences. A modified sparse matrix structure is introduced that can efficiently handle most operations of the main loop. Taking advantage of the symmetry of the similarity matrix, a fast matrix squaring formula is also introduced to speed up the time-consuming expansion. The proposed algorithm was tested on protein sequence databases such as SCOP95. In terms of efficiency, the proposed solution improves execution speed by two orders of magnitude compared to recently published efficient solutions, reducing the total runtime to well below 1 minute for the 11,944 proteins of SCOP95. This improvement in computation time is achieved without any loss of partition quality. Convergence is generally reached in approximately 50 iterations. The efficient execution enabled us to perform a thorough evaluation of classification results and to formulate recommendations regarding the choice of the algorithm's parameter values.
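The symmetry trick can be sketched as follows: for a symmetric matrix M, entry (i, j) of M @ M is the dot product of rows i and j, so only the upper triangle needs to be computed. This is a hypothetical dense illustration in NumPy; the paper pairs the idea with its own sparse structure.

```python
import numpy as np

def symmetric_square(M):
    """Square a symmetric matrix computing only the upper triangle."""
    n = M.shape[0]
    out = np.empty_like(M)
    for i in range(n):
        for j in range(i, n):                       # upper triangle only
            out[i, j] = out[j, i] = M[i] @ M[j]     # (M @ M)[i, j] = row_i . row_j
    return out

M = np.array([[0.6, 0.4], [0.4, 0.6]])
assert np.allclose(symmetric_square(M), M @ M)      # result is again symmetric
```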


Subjects
Algorithms, Cluster Analysis, Computational Biology/methods, Protein Databases, Proteins, Protein Sequence Analysis/methods, Markov Chains, Proteins/chemistry, Proteins/classification
5.
Article in English | MEDLINE | ID: mdl-24109768

ABSTRACT

In this paper we propose an efficient reformulation of a Markov clustering algorithm, suitable for fast and accurate grouping of protein sequences based on pairwise similarity information. The proposed modification consists of an optimal reordering of the rows and columns of the similarity matrix after every iteration, transforming it into a matrix with several compact blocks along the diagonal and zero similarities outside the blocks. These blocks are treated separately in later iterations, reducing the computational burden of the algorithm. The proposed algorithm was tested on protein sequence databases such as SCOP95. In terms of efficiency, the proposed solution achieves a speed-up factor of 15-50 compared to conventional Markov clustering, depending on input data size and parameter settings. This improvement in computation time is achieved without any loss of partition accuracy. Convergence is usually reached within 40-50 iterations. Combining the proposed method with sparse matrix representation and parallel execution should lead to a significantly more efficient solution in the future.
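A hedged sketch of the block-splitting idea, assuming SciPy is available: once the similarity matrix decouples into connected components, each component's rows and columns form a compact diagonal block that can be iterated independently. This is not the authors' exact reordering procedure, only the underlying principle.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import connected_components

def split_into_blocks(S):
    """Split a sparse similarity matrix into its diagonal blocks."""
    n_comp, labels = connected_components(S, directed=False)
    blocks = []
    for c in range(n_comp):
        idx = np.flatnonzero(labels == c)   # rows/columns belonging to one block
        blocks.append(S[idx][:, idx])       # compact sub-matrix for that block
    return blocks                           # iterate each block to convergence separately
```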


Subjects
Protein Sequence Analysis/methods, Algorithms, Amino Acid Sequence, Cluster Analysis, Protein Databases, Markov Chains
6.
Comput Methods Programs Biomed; 108(1): 80-9, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22405524

ABSTRACT

Intensity inhomogeneity or intensity non-uniformity (INU) is an undesired phenomenon that represents the main obstacle for magnetic resonance (MR) image segmentation and registration methods. Various techniques have been proposed to eliminate or compensate for the INU, most of them embedded into classification or clustering algorithms; they generally have difficulties when the INU reaches high amplitudes, and they usually suffer from a high computational load. This study reformulates the design of c-means clustering based INU compensation techniques by identifying and separating the computationally costly global operations that can be applied to gray intensity levels instead of individual pixels. The theoretical assumptions are demonstrated using the fuzzy c-means algorithm, but the proposed modification is compatible with a wide range of c-means clustering based INU compensation and MR image segmentation algorithms. Experiments carried out using synthetic phantoms and real MR images indicate that the proposed approach produces practically the same segmentation accuracy as the conventional formulation, but runs 20-30 times faster.
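A minimal sketch of the histogram-level speed-up, assuming NumPy: fuzzy c-means updates are computed once per gray level, weighted by how often each level occurs, instead of once per pixel. The INU compensation itself is omitted, and the update formulas follow the standard FCM definitions rather than the paper's exact variant.

```python
import numpy as np

def fcm_on_histogram(image, c=3, m=2.0, iters=50):
    """Fuzzy c-means over gray levels, weighted by their histogram counts."""
    levels, counts = np.unique(image.ravel(), return_counts=True)
    v = np.linspace(levels.min(), levels.max(), c)         # initial centroids
    for _ in range(iters):
        d = np.abs(levels[:, None] - v[None, :]) + 1e-12   # level-to-centroid distances
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)                  # fuzzy memberships per level
        w = counts[:, None] * u ** m                       # histogram-weighted memberships
        v = (w * levels[:, None]).sum(axis=0) / w.sum(axis=0)
    return v, u                                            # centroids, per-level memberships
```

The cost of each iteration now scales with the number of distinct gray levels (at most a few thousand) rather than the number of pixels, which is where the reported 20-30x speed-up plausibly comes from.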


Subjects
Fuzzy Logic, Theoretical Models, Brain/physiology, Cluster Analysis, Humans
7.
Comput Methods Programs Biomed; 101(2): 183-200, 2011 Feb.
Article in English | MEDLINE | ID: mdl-20692715

ABSTRACT

This paper presents a patient-specific deformable heart model that incorporates the known electrical and mechanical properties of cardiac cells and tissue. The whole-heart model comprises the ten Tusscher ventricular and Nygren atrial cell models, the anatomical and electrophysiological model descriptions of the atria (introduced by Harrild et al.) and ventricles (given by Winslow et al.), and the mechanical model of the periodic cardiac contraction and relaxation phenomena proposed by Moireau et al. During the propagation of the depolarization wave, kinetic, compositional, and rotational anisotropy is handled by the tissue, organ, and torso models. The patient-specific parameters were determined by an evolutionary computation method. An intensive parameter reduction was performed using an abstract formulation of the search space. This patient-specific parameter representation enables the adjustment of deformable model parameters in real time. The validation process was performed using simultaneously measured ECG and ultrasound image records, which were compared with simulated signals and shapes using an abstract, parameterized evaluation criterion.
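A hypothetical sketch of evolutionary parameter fitting in the spirit described above, assuming NumPy: candidate parameter vectors are mutated and kept whenever they reduce the discrepancy between simulated and measured signals. Here simulate and measured are placeholders for the heart model and the ECG/ultrasound data; the paper's actual evolutionary method and parameter reduction are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_parameters(simulate, measured, n_params, generations=200, offspring=20):
    """(1+lambda)-style evolutionary search over model parameters."""
    best = rng.uniform(-1.0, 1.0, n_params)               # initial candidate
    best_err = np.linalg.norm(simulate(best) - measured)  # evaluation criterion
    for _ in range(generations):
        for _ in range(offspring):
            cand = best + rng.normal(0.0, 0.1, n_params)  # Gaussian mutation
            err = np.linalg.norm(simulate(cand) - measured)
            if err < best_err:                            # greedy selection
                best, best_err = cand, err
    return best
```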


Subjects
Heart/anatomy & histology, Heart/physiopathology, Humans, Anatomic Models
8.
Article in English | MEDLINE | ID: mdl-19163564

ABSTRACT

Intensity inhomogeneity or intensity non-uniformity (INU) is an undesired phenomenon that represents the main obstacle for MR image segmentation and registration methods. Various techniques have been proposed to eliminate or compensate for the INU, most of which are embedded into clustering algorithms. This paper proposes a multiple-stage fuzzy c-means (FCM) based algorithm for the estimation and compensation of slowly varying additive or multiplicative noise, supported by a pre-filtering technique for Gaussian and impulse noise elimination. The slowly varying behavior of the bias or gain field is assured by a smoothing filter that performs context-dependent averaging based on a morphological criterion. Experiments using 2-D synthetic phantoms and real MR images show that the proposed method provides accurate segmentation. The produced segmentation and fuzzy membership values can serve as excellent support for 3-D registration and segmentation techniques.
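A hedged sketch of one compensation stage for the additive case, assuming SciPy: the image reconstructed from cluster centroids approximates the true tissue intensities, the residual approximates the bias field, and smoothing enforces its slowly varying nature. The paper's morphological, context-dependent filter is replaced here by a plain Gaussian filter for brevity.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_additive_bias(observed, memberships, centroids, sigma=10.0):
    """Estimate a smooth additive bias field from one FCM clustering pass.

    observed:    (H, W) image; memberships: (H, W, c) fuzzy maps;
    centroids:   (c,) cluster intensities.
    """
    reconstructed = (memberships * centroids).sum(axis=-1)  # idealized, bias-free image
    residual = observed - reconstructed                     # raw bias estimate
    return gaussian_filter(residual, sigma)                 # enforce slow spatial variation

# One compensation step subtracts the estimate and re-runs the clustering:
# corrected = observed - estimate_additive_bias(observed, u, v)
```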


Subjects
Brain/pathology, Computer-Assisted Image Interpretation/methods, Magnetic Resonance Imaging/methods, Algorithms, Brain/anatomy & histology, Cluster Analysis, Humans, Statistical Models, Normal Distribution, Automated Pattern Recognition, Imaging Phantoms, Reproducibility of Results, Software
9.
Article in English | MEDLINE | ID: mdl-19163176

ABSTRACT

This paper presents a novel ECG telemetry system based on the Z-Wave communication protocol. The proposed system consists of small portable devices that acquire, compress, and transmit the ECG to an RF-USB interface connected to a central monitoring computer. The received signals are filtered, QRS complexes and P and T waves are localized, and the different waveforms are classified in order to provide diagnostic tools such as heart rate variability and heart rate turbulence analysis. Due to the limited communication bandwidth, the maximum number of measuring devices connected to one central monitor is four. The proposed system, composed of inexpensive components, can serve as a flexible alternative to current ECG monitoring systems.


Subjects
Computer Communication Networks/instrumentation, Electrocardiography/instrumentation, Telemetry/instrumentation, Telemetry/methods, Algorithms, Equipment Design, Heart Rate/physiology, Humans, Computer-Assisted Signal Processing
10.
Conf Proc IEEE Eng Med Biol Soc; 2006: 3998-4001, 2006.
Article in English | MEDLINE | ID: mdl-17946213

ABSTRACT

This paper presents an analysis of the Arruda accessory pathway localization method (for patients suffering from Wolff-Parkinson-White syndrome), with suggestions to increase its overall performance. The Arruda method was tested on a total of 121 patients, and a localization performance of 90% was reached, comparable to the highest published result (90%, reported by L. Boersma in 2002). After a deeper analysis of each decision point of the Arruda method, we found that lead aVF is not as relevant as the other leads used (I, II, III, V1). The overall performance (90%) was slightly lower than the correct decision rate (91.67%) at the weakest decision element (aVF+) of the method. The vector space constructed from the most frequently used leads (II, V1, aVF) is not orthogonal, which may explain the weaker rate in the case of aVF.


Subjects
Wolff-Parkinson-White Syndrome/physiopathology, Algorithms, Catheter Ablation, Electrocardiography, Humans, Reproducibility of Results, Research Design, Sensitivity and Specificity, Wolff-Parkinson-White Syndrome/therapy
11.
Conf Proc IEEE Eng Med Biol Soc; 2006: 1678-81, 2006.
Article in English | MEDLINE | ID: mdl-17946060

ABSTRACT

Computer-aided bedside patient monitoring requires real-time analysis of vital functions. Online Holter monitors need reliable and fast algorithms to perform all the necessary signal processing tasks. This paper presents the methods that were conceptualized and implemented during the development of such a monitoring system at Medical Clinic No. 4 of Targu-Mures. The system performs the following ECG signal processing steps: (1) decomposition of the ECG signals using a multi-resolution wavelet transform, which also eliminates most of the high- and low-frequency noise; the resulting components serve as input for the wave classification algorithms; (2) identification of QRS complexes and P and T waves using two different algorithms, a sequential clustering and a neural-network-based classification, the latter of which also distinguishes normal R waves from abnormal ones; (3) localization of several kinds of arrhythmia using a spectral method: an autoregressive (AR) model is applied to the series of R-R intervals, the coefficients of the AR model are estimated with a Kalman filter, and these coefficients determine a local spectrum for each QRS complex whose analysis identifies the different arrhythmia cases (see the sketch below). The algorithms were tested using the MIT-BIH signal database and our own multichannel ECG recordings. The QRS complex detection ratio is over 99.5%.
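A hedged sketch of step (3), assuming NumPy: the AR coefficients of the R-R interval series are tracked with a scalar-observation Kalman filter under a random-walk state model, and each coefficient estimate yields a local AR spectrum. The model order, noise variances, and spectrum normalization are illustrative, not taken from the paper.

```python
import numpy as np

def track_ar_spectrum(rr, p=4, q=1e-5, r_var=1e-3, n_freq=128):
    """Track AR(p) coefficients of the R-R series and emit a local spectrum per beat."""
    a = np.zeros(p)                        # AR coefficients = Kalman state
    P = np.eye(p)                          # state covariance
    f = np.linspace(0.0, 0.5, n_freq)      # normalized frequency grid
    spectra = []
    for t in range(p, len(rr)):
        h = rr[t - p:t][::-1]              # observation vector: previous p intervals
        P = P + q * np.eye(p)              # predict: random-walk coefficient drift
        k = P @ h / (h @ P @ h + r_var)    # Kalman gain (scalar observation)
        a = a + k * (rr[t] - h @ a)        # update with the prediction error
        P = P - np.outer(k, h) @ P
        A = 1.0 - np.exp(-2j * np.pi * np.outer(f, np.arange(1, p + 1))) @ a
        spectra.append(1.0 / np.maximum(np.abs(A), 1e-12) ** 2)  # local AR spectrum
    return np.array(spectra)               # one spectrum per QRS complex
```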


Subjects
Algorithms, Cardiac Arrhythmias/diagnosis, Cardiac Arrhythmias/physiopathology, Computer-Assisted Diagnosis/methods, Ambulatory Electrocardiography/methods, Heart Rate, Computer-Assisted Signal Processing, Computer Systems, Humans, Reproducibility of Results, Sensitivity and Specificity, Time Factors