1.
BMC Bioinformatics; 21(1): 376, 2020 Aug 31.
Article in English | MEDLINE | ID: mdl-32867673

ABSTRACT

BACKGROUND: Two-dimensional gel electrophoresis (2-DGE) is a commonly used tool for proteomic analysis. This gel-based technique separates the proteins in a sample according to their isoelectric point and molecular weight. 2-DGE images often present anomalies caused by the acquisition process, such as diffuse and overlapping spots and background noise. This study proposes a joint pre-processing framework that combines the capabilities of nonlinear filtering, background correction, and image normalization techniques for pre-processing 2-DGE images. In particular, nonlinear diffusion filtering, adaptive piecewise histogram equalization, and multilevel thresholding were evaluated jointly using both synthetic data and real 2-DGE images. RESULTS: An improvement of up to 46% in spot-detection efficiency was achieved for synthetic data using the proposed framework, compared to implementing a single technique (normalization, background correction, or filtering alone). Additionally, the proposed framework increased the detection of low-abundance spots by 20% for synthetic data compared to a normalization technique, and improved background estimation by 67% compared to a background-correction technique. On real data, the joint pre-processing framework reduced false positives by up to 93%. CONCLUSIONS: The proposed joint pre-processing framework outperforms any single approach. The best structure was obtained with the ordered combination of adaptive piecewise histogram equalization for image normalization, geometric nonlinear diffusion filtering (GNDF) for filtering, and multilevel thresholding for background correction.
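
As a rough sketch of how the best-performing order (normalization, then filtering, then background correction) could be wired together: the code below uses CLAHE from scikit-image for adaptive histogram equalization, a plain Perona-Malik anisotropic diffusion as a generic stand-in for the paper's geometric nonlinear diffusion filter, and multi-Otsu thresholding for background removal. The pipeline is an illustrative assumption, not the authors' implementation.

```python
# Sketch of the best-performing pre-processing order reported above:
# normalization -> nonlinear diffusion filtering -> multilevel thresholding.
# Perona-Malik diffusion is a generic stand-in for the paper's GNDF filter.
import numpy as np
from skimage import exposure, filters

def perona_malik(img, n_iter=20, kappa=0.1, gamma=0.2):
    """Simple 4-neighbour Perona-Malik anisotropic diffusion."""
    u = img.astype(float)
    for _ in range(n_iter):
        # Finite differences towards the four neighbours
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, 1, axis=1) - u
        dw = np.roll(u, -1, axis=1) - u
        # Edge-stopping conductance (exponential variant)
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

def preprocess_gel(img):
    # img: 2-D float gel image scaled to [0, 1]
    # 1) Normalization: contrast-limited adaptive histogram equalization
    norm = exposure.equalize_adapthist(img)
    # 2) Filtering: nonlinear diffusion smooths noise, preserves spot edges
    smooth = perona_malik(norm)
    # 3) Background correction: multi-Otsu thresholding; everything below
    #    the lowest threshold is treated as background
    thresholds = filters.threshold_multiotsu(smooth, classes=3)
    return np.where(smooth < thresholds[0], 0.0, smooth)
```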


Subject(s)
Electrophoresis, Gel, Two-Dimensional/methods; Databases, Protein; Humans; Image Processing, Computer-Assisted; Proteins/analysis; Proteomics/methods; Signal-To-Noise Ratio
2.
Entropy (Basel); 21(4), 2019 Apr 10.
Article in English | MEDLINE | ID: mdl-33267099

ABSTRACT

Permutation Entropy (PE) is a time series complexity measure commonly used in a variety of contexts, with medicine being the prime example. In its general form, it requires three input parameters for its calculation: time series length N, embedded dimension m, and embedded delay τ. Inappropriate choices of these parameters may potentially lead to incorrect interpretations. However, there are no specific guidelines for an optimal selection of N, m, or τ, only general recommendations such as N >> m!, τ = 1, or m = 3, …, 7. This paper deals specifically with the practical implications of N >> m!, since long time series are often not available, or non-stationary, and other preliminary results suggest that low N values do not necessarily invalidate PE usefulness. Our study analyses the PE variation as a function of the series length N and embedded dimension m in the context of a diverse experimental set, both synthetic (random, spike, or logistic-model time series) and real-world (climatology, seismic, financial, or biomedical time series), and the classification performance achieved with varying N and m. The results seem to indicate that lengths shorter than those suggested by N >> m! are sufficient for a stable PE calculation, and that even very short time series can be robustly classified based on PE measurements before the stability point is reached. This may be because there are forbidden patterns in chaotic time series, not all patterns are equally informative, and differences among classes are already apparent at very short lengths.
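
For reference, normalized permutation entropy with the parameters named in the abstract (series length N implicit in the input, embedded dimension m, embedded delay τ) can be computed along these lines; a minimal textbook-style sketch, not the code used in the study:

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of a 1-D series.

    m   : embedded dimension (ordinal pattern length)
    tau : embedded delay between samples within a pattern
    """
    x = np.asarray(x)
    n_patterns = len(x) - (m - 1) * tau
    # Map each window of m samples to its ordinal (rank) pattern
    patterns = Counter(
        tuple(np.argsort(x[i : i + m * tau : tau])) for i in range(n_patterns)
    )
    probs = np.array(list(patterns.values())) / n_patterns
    # Shannon entropy of the pattern distribution, normalized by log(m!)
    return -np.sum(probs * np.log(probs)) / math.log(math.factorial(m))

# White noise approaches 1; a monotone ramp (one single pattern) gives 0
rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(500), m=4))
print(permutation_entropy(np.arange(500), m=4))
```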

3.
Genomics Proteomics Bioinformatics; 16(1): 63-72, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29474888

ABSTRACT

Various methods and specialized software programs are available for processing two-dimensional gel electrophoresis (2-DGE) images. However, due to the anomalies present in these images, a reliable, automated, and highly reproducible system for 2-DGE image analysis has still not been achieved. The most common anomalies found in 2-DGE images include vertical and horizontal streaking, fuzzy spots, and background noise, which greatly complicate computational analysis. In this paper, we review the preprocessing techniques applied to 2-DGE images for noise reduction, intensity normalization, and background correction. We also present a quantitative comparison of non-linear filtering techniques applied to synthetic gel images, analyzing the performance of the filters under specific conditions. Synthetic proteins were modeled as two-dimensional Gaussian distributions with adjustable parameters for changing the size, intensity, and degradation. Three types of noise were added to the images: Gaussian, Rayleigh, and exponential, with signal-to-noise ratios (SNRs) ranging from 8 to 20 decibels (dB). We compared the performance of wavelet, contourlet, total variation (TV), and wavelet-total variation (WTTV) techniques using SNR and spot-detection efficiency as metrics. In terms of spot efficiency, contourlet and TV were more sensitive to noise than wavelet and WTTV. Wavelet worked best for images with SNRs ranging from 10 to 20 dB, whereas WTTV performed better at high noise levels. In terms of SNR, wavelet also presented the best performance at any level of Gaussian noise and at low levels (20-14 dB) of Rayleigh and exponential noise. Finally, the performance of the non-linear filtering techniques was evaluated using a real 2-DGE image with previously identified proteins marked. Wavelet achieved the best detection rate on the real image.
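
A minimal sketch of how such synthetic data can be generated: a 2-D Gaussian spot model with adjustable parameters, plus Gaussian, Rayleigh, or exponential noise scaled to a target SNR in dB. All parameter values here are illustrative assumptions, not those of the paper.

```python
import numpy as np

def gaussian_spot(shape=(128, 128), center=(64, 64), sigma=(6.0, 4.0), amp=1.0):
    """Model a protein spot as a 2-D Gaussian with adjustable size/intensity."""
    y, x = np.indices(shape)
    return amp * np.exp(-((y - center[0]) ** 2 / (2 * sigma[0] ** 2)
                          + (x - center[1]) ** 2 / (2 * sigma[1] ** 2)))

def add_noise_at_snr(img, snr_db, kind="gaussian", rng=None):
    """Add Gaussian, Rayleigh, or exponential noise at a target SNR (dB)."""
    rng = rng or np.random.default_rng()
    noise_power = np.mean(img ** 2) / 10 ** (snr_db / 10)
    if kind == "gaussian":
        noise = rng.normal(0.0, 1.0, img.shape)
    elif kind == "rayleigh":
        noise = rng.rayleigh(1.0, img.shape)
        noise -= noise.mean()          # center so the power scaling is meaningful
    else:                              # exponential
        noise = rng.exponential(1.0, img.shape)
        noise -= noise.mean()
    noise *= np.sqrt(noise_power / np.mean(noise ** 2))
    return img + noise

# Toy gel: two spots of different size/intensity, Rayleigh noise at 14 dB
gel = gaussian_spot() + gaussian_spot(center=(40, 90), sigma=(3.0, 3.0), amp=0.5)
noisy = add_noise_at_snr(gel, snr_db=14, kind="rayleigh")
```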


Subject(s)
Algorithms; Electrophoresis, Gel, Two-Dimensional/methods; Image Processing, Computer-Assisted/methods; Proteins/analysis; Proteomics/methods; Software; Animals; Humans
4.
Acta biol. colomb.; 21(3): 619-626, Sep.-Dec. 2016. illus.
Article in Spanish | LILACS | ID: biblio-827639

ABSTRACT

The Africanised bee is the most common bee in Colombian beekeeping, and therapeutic properties for different diseases have been attributed to its venom (apitoxin) without much scientific support. A literature search of reports on the proteomic analysis of honeybee venom yielded four different methods for extracting proteins from bee venom. The first method consists of resuspending the venom in 7 M urea, precipitating with acetone, and finally resuspending the pellet in 7 M urea and 4% CHAPS. In the second method, the venom is resuspended in lysis buffer, precipitated with trichloroacetic acid, and then resuspended in 7 M urea and 4% CHAPS. The third method is the same as the previous one, except that the precipitation step is performed with acetone instead of trichloroacetic acid. Finally, the fourth method resuspends the venom in distilled water, precipitates with acetone, and resuspends in 7 M urea and 4% CHAPS. This work compared the performance of these four extraction methods in order to determine which gives the best results in terms of the concentration and integrity of the proteins obtained. Of the four methods evaluated, the best results in terms of protein concentration and yield were obtained by resuspending the bee venom in lysis buffer followed by precipitation with acetone (method 3), and by resuspending in distilled water followed by precipitation with acetone (method 4). Of these, the method that best preserved protein integrity and yielded the best proteomic profile was resuspension of the bee venom in lysis buffer followed by precipitation with acetone (method 3).

5.
Article in English | MEDLINE | ID: mdl-24109851

ABSTRACT

The heart's mechanical activity can be appraised by auscultation recordings taken from the 4 Standard Auscultation Areas (4-SAA), one for each cardiac valve, since some murmurs go undetected when only a single area is examined. This paper presents an effective approach for cardiac murmur detection based on adaptive neuro-fuzzy inference systems (ANFIS) over acoustic representations derived from the Empirical Mode Decomposition (EMD) and Hilbert-Huang Transform (HHT) of 4-channel phonocardiograms (4-PCG). The 4-PCG database belongs to the National University of Colombia. Mel-Frequency Cepstral Coefficients (MFCC) and statistical moments of the HHT were estimated on combinations of different intrinsic mode functions (IMFs). A fuzzy-rough feature selection (FRFS) was applied in order to reduce complexity. An ANFIS network was implemented on the feature space, randomly initialized, adjusted using heuristic rules, and trained with a hybrid learning algorithm combining least squares and gradient descent. Global classification accuracy over the 4-SAA was around 98.9%, with satisfactory sensitivity and specificity, using a 50-fold cross-validation procedure (70/30 split). The representation capability of the EMD technique applied to 4-PCG and the neuro-fuzzy inference over acoustic features offered high performance in detecting cardiac murmurs.
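
As an illustration of the HHT feature step, the sketch below takes pre-computed IMFs (the EMD itself is not shown; any implementation, e.g. the PyEMD package, can supply them) and derives statistical moments of the Hilbert instantaneous amplitude and frequency. It is a plausible reading of the features named above, not the authors' code.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import skew, kurtosis

def hht_moment_features(imfs, fs):
    """Statistical moments of the Hilbert instantaneous amplitude and
    frequency, computed per IMF.

    imfs : (n_imfs, n_samples) array produced by any EMD implementation
    fs   : sampling rate in Hz
    """
    feats = []
    for imf in imfs:
        analytic = hilbert(imf)
        amp = np.abs(analytic)                         # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency
        for series in (amp, inst_freq):
            feats += [series.mean(), series.std(), skew(series), kurtosis(series)]
    return np.array(feats)
```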


Subject(s)
Acoustics; Algorithms; Fuzzy Logic; Neural Networks, Computer; Phonocardiography/instrumentation; Adult; Heart Auscultation; Humans
6.
Article in English | MEDLINE | ID: mdl-23367121

ABSTRACT

This paper presents a dimensionality reduction study based on fuzzy rough sets, with the aim of increasing the discriminant capability of representations of normal ECG beats and beats containing ischemic events. A novel procedure is proposed to obtain the fuzzy equivalence classes, based on entropy and neighborhood techniques, and a modification of the Quick Reduct algorithm is used to select the relevant features from a large feature space by means of a dependency function. The tests were carried out on a feature space made up of 840 wavelet features extracted from 900 normal ECG beats and 900 ECG beats with evidence of ischemia. Classification accuracies of around 99% were obtained. This methodology provides a reduced feature space with low complexity and high representation capability. Additionally, the discriminant strength of entropy for representing ischemic disorders from time-frequency information in ECG signals is highlighted.
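
The Quick Reduct idea referred to above can be sketched as a greedy search driven by a dependency score. In the paper the score is a fuzzy-rough dependency built from entropy- and neighborhood-based fuzzy equivalence classes; the placeholder function in the usage line below merely stands in for it.

```python
def greedy_reduct(n_features, dependency, tol=1e-6):
    """Skeleton of a Quick-Reduct-style greedy search: repeatedly add the
    feature that most increases a dependency score, stopping when no
    candidate improves it. `dependency(subset)` maps a list of feature
    indices to a quality score in [0, inf)."""
    selected, best = [], 0.0
    remaining = list(range(n_features))
    while remaining:
        score, f = max((dependency(selected + [g]), g) for g in remaining)
        if score - best <= tol:
            break
        selected.append(f)
        remaining.remove(f)
        best = score
    return selected

# Toy usage: a stand-in dependency that rewards features 2 and 5
print(greedy_reduct(10, lambda s: len({2, 5} & set(s))))  # -> [5, 2]
```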


Subject(s)
Fuzzy Logic; Myocardial Ischemia/diagnosis; Electrocardiography; Humans; Models, Theoretical; Myocardial Ischemia/physiopathology
7.
IEEE Trans Inf Technol Biomed; 13(4): 590-8, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19304491

ABSTRACT

An effective methodology for representing data in high-dimensional feature spaces is presented. It allows a better interpretation of the underlying physiological phenomena (namely, cardiac behavior related to cardiovascular disease) and is based on search criteria over a feature set that not only increase the capability to detect ischemic pathologies but also connect these features with the physiological representation of the ECG. The proposed dimension reduction scheme consists of three levels: projection, interpretation, and visualization. First, a hybrid algorithm is described that projects the multidimensional data to a lower-dimensional space, grouping the features that contribute similarly, in the sense of covariance reconstruction, in order to find information of clinical relevance in the initial training space. Next, a variable selection algorithm is provided that further reduces the dimension, taking into account only the variables that offer the greatest class separability. Finally, the selected feature set is projected to a 2-D space in order to verify the performance of the suggested dimension reduction algorithm in terms of its discrimination capability for ischemia detection. The ECG recordings used in this study are from the European ST-T database and from the Universidad Nacional de Colombia database. In both cases, over 99% feature reduction was obtained, and classification precision was over 99% using a five-nearest-neighbor (5-NN) classifier.
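
A loose scikit-learn analogue of the reduce-then-validate scheme (projection, variable selection, then validation with a 5-NN classifier) is sketched below on placeholder data; PCA and univariate selection are generic stand-ins, not the hybrid projection and selection algorithms of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Placeholder data standing in for an ECG feature matrix and ischemia labels
rng = np.random.default_rng(0)
X, y = rng.standard_normal((200, 500)), rng.integers(0, 2, 200)

pipeline = make_pipeline(
    PCA(n_components=20),                # level 1: project to a lower dimension
    SelectKBest(f_classif, k=5),         # level 2: keep the most separable variables
    KNeighborsClassifier(n_neighbors=5), # validate with a 5-NN classifier
)
print(cross_val_score(pipeline, X, y, cv=5).mean())
```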


Subject(s)
Electrocardiography/methods; Myocardial Ischemia/diagnosis; Signal Processing, Computer-Assisted; Adult; Aged; Algorithms; Cluster Analysis; Databases, Factual; Female; Humans; Male; Middle Aged; Models, Cardiovascular
8.
Ann Biomed Eng; 37(2): 337-53, 2009 Feb.
Article in English | MEDLINE | ID: mdl-19048376

ABSTRACT

This work presents a comparison of different approaches for the detection of murmurs from phonocardiographic signals. Taking into account the variability of phonocardiographic signals induced by valve disorders, three families of features were analyzed: (a) time-varying and time-frequency features; (b) perceptual features; and (c) fractal features. With the aim of improving the performance of the system, its accuracy was tested using several combinations of the aforementioned families of parameters; in a second stage, the main components extracted from each family were combined. The contribution of each family of features was evaluated by means of a simple k-nearest-neighbors classifier, showing that fractal features provide the best accuracy (97.17%), followed by time-varying and time-frequency features (95.28%) and perceptual features (88.7%). However, an accuracy of around 94% can be reached just by using the two main features of the fractal family; therefore, considering the difficulties related to the automatic intrabeat segmentation needed for the spectral and perceptual features, this scheme becomes an interesting alternative. The conclusion is that fractal-type features were the most robust family of parameters (in the sense of accuracy versus computational load) for the automatic detection of murmurs. This work was carried out on a database containing 164 phonocardiographic recordings (81 normal and 83 with murmurs), segmented to extract 360 representative individual beats (180 per class).
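
For illustration, one widely used fractal feature for 1-D signals, the Katz fractal dimension, is computed below and fed to a k-nearest-neighbors classifier on toy beats. The abstract does not name the specific fractal measures used, so this is only a generic example of the family, on synthetic data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def katz_fd(x):
    """Katz fractal dimension of a 1-D signal (common amplitude-based form)."""
    x = np.asarray(x, dtype=float)
    dists = np.abs(np.diff(x))
    L = dists.sum()                 # total "length" of the waveform
    d = np.abs(x - x[0]).max()      # maximal excursion from the first sample
    n = len(dists)
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

# Toy beats: a clean tone vs. a noisier one standing in for normal/murmur
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 400)
normal = [np.sin(t) + 0.05 * rng.standard_normal(400) for _ in range(20)]
murmur = [np.sin(t) + 0.40 * rng.standard_normal(400) for _ in range(20)]

# Two features per beat: Katz FD and the roughness of the first difference
X = np.array([[katz_fd(b), np.std(np.diff(b))] for b in normal + murmur])
y = np.array([0] * 20 + [1] * 20)
print(KNeighborsClassifier(n_neighbors=5).fit(X, y).score(X, y))
```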


Subject(s)
Algorithms; Heart Murmurs/physiopathology; Diastole/physiology; Humans; Phonocardiography/methods; Systole/physiology