Results 1 - 9 of 9
1.
BMC Bioinformatics ; 19(1): 123, 2018 04 05.
Article in English | MEDLINE | ID: mdl-29621971

ABSTRACT

BACKGROUND: Thanks to its reasonable cost and simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new algorithm for extracting peaks from raw spectra, in which the spectrum baseline and the spectrum peaks are processed jointly. The approach relies on an additive model composed of a smooth baseline plus a sparse peak list convolved with a known peak shape, fitted under a Gaussian noise assumption. The proposed method is well suited to low-resolution spectra with a strong baseline and unresolved peaks.

RESULTS: We developed a new peak deconvolution procedure. The paper describes the derivation of the method and discusses some of its interpretations. The algorithm is then given in pseudo-code form, with the required optimization procedure detailed. On synthetic data, the method is compared to a more conventional approach and is shown to reduce the artifacts caused by the usual two-step procedure (baseline removal followed by peak extraction). Finally, results on real linear MALDI-ToF spectra are provided.

CONCLUSIONS: We introduced a new peak-picking method in which peak deconvolution and baseline estimation are performed jointly. On simulated data we showed that this global approach performs better than a classical one in which baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra: a collection of spectra of spiked proteins was acquired and analyzed. The proposed method performed better in terms of accuracy and reproducibility, as validated by an extended statistical analysis.
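
As a rough illustration of this kind of joint model (a minimal sketch with assumed penalty weights and helper names, not the published algorithm), the Python code below fits a smooth baseline plus a sparse, non-negative peak list convolved with a known peak shape, alternating a closed-form baseline update with a proximal-gradient (ISTA) step on the peak amplitudes:

    # Minimal sketch: y ~ baseline + conv(peak_shape, x) + Gaussian noise,
    # with a second-difference smoothness penalty on the baseline and an
    # L1 penalty plus positivity constraint on the peak amplitudes x.
    import numpy as np

    def joint_peak_baseline(y, peak_shape, lam_smooth=1e4, lam_sparse=0.1, n_iter=200):
        n = len(y)
        # Dense convolution operator K for the known peak shape (kept simple here)
        K = np.zeros((n, n))
        half = len(peak_shape) // 2
        for j in range(n):
            lo, hi = max(0, j - half), min(n, j - half + len(peak_shape))
            K[lo:hi, j] = peak_shape[lo - (j - half):hi - (j - half)]
        D = np.diff(np.eye(n), 2, axis=0)          # second differences (baseline smoothness)
        A = np.eye(n) + lam_smooth * D.T @ D       # baseline normal-equations matrix
        x = np.zeros(n)                            # sparse peak amplitudes
        step = 1.0 / np.linalg.norm(K, 2) ** 2     # ISTA step size (1 / Lipschitz constant)
        for _ in range(n_iter):
            b = np.linalg.solve(A, y - K @ x)      # smooth baseline, closed form
            r = y - b - K @ x                      # current residual
            x = np.maximum(x + step * (K.T @ r) - step * lam_sparse, 0.0)  # ISTA + positivity
        return b, x                                # baseline estimate and peak list

Because both blocks are updated against the same residual, the baseline cannot silently absorb unresolved peaks, which is the kind of artifact the joint formulation aims to reduce.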


Subjects
Algorithms; Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization/methods; Artifacts
2.
BMC Bioinformatics ; 19(1): 73, 2018 03 01.
Article in English | MEDLINE | ID: mdl-29490628

ABSTRACT

BACKGROUND: In the field of biomarker validation with mass spectrometry, controlling the technical variability is a critical issue. In selected reaction monitoring (SRM) measurements, this issue creates an opportunity to use variance component analysis to distinguish the various sources of variability. However, with unbalanced data (an unequal number of observations across factor combinations), the classical methods cannot correctly estimate the various sources of variability, particularly in the presence of interaction. The present paper proposes an extension of variance component analysis that estimates the various components of the variance, including an interaction component, in the case of unbalanced data.

RESULTS: We applied an experimental design that uses a serial dilution to generate known relative protein concentrations and estimated these concentrations with two processing algorithms, a classical one and a more recent one. The extended method allowed estimating the variances explained by the dilution and by the technical process for each algorithm in an experiment with 9 proteins: L-FABP, 14.3.3 sigma, Calgi, Def.A6, Villin, Calmo, I-FABP, Peroxi-5, and S100A14. Whereas the recent algorithm gave a higher dilution variance and a lower technical variance than the classical one for two proteins with three peptides (L-FABP and Villin), there was no significant difference between the two algorithms across all proteins.

CONCLUSIONS: The extension of variance component analysis correctly estimated the variance components of protein concentration measurements in the case of an unbalanced design.
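
For readers who want to reproduce the general idea, here is a hedged sketch of a REML variance-component fit on unbalanced data with statsmodels. The column names ('abundance', 'dilution', 'run') are assumptions, and the technical run is treated as nested within dilution, which folds the interaction into that component instead of estimating it separately as the extension described above does:

    import pandas as pd
    import statsmodels.formula.api as smf

    def variance_components(df: pd.DataFrame):
        # df columns (assumed): 'abundance', 'dilution' (serial dilution level),
        # 'run' (technical replicate). REML remains valid with unequal cell counts.
        model = smf.mixedlm(
            "abundance ~ 1", df,
            groups="dilution",                        # dilution variance component
            vc_formula={"technical": "0 + C(run)"},   # run-within-dilution (+ interaction)
        )
        fit = model.fit(reml=True)
        print(fit.summary())                          # shows the group and 'technical' variances
        dilution_var = float(fit.cov_re.iloc[0, 0])   # random-intercept (dilution) variance
        technical_var = float(fit.vcomp[0])           # technical variance component
        residual_var = float(fit.scale)               # residual (within-cell) variance
        total = dilution_var + technical_var + residual_var
        return {"dilution": dilution_var / total,
                "technical": technical_var / total,
                "residual": residual_var / total}

REML (rather than sequential or adjusted sums of squares) is what keeps the component estimates well defined when the number of observations differs across factor combinations.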


Subjects
Algorithms; Biomarkers/analysis; Mass Spectrometry; Proteins/analysis; Analysis of Variance; Enzyme-Linked Immunosorbent Assay; Humans; Reproducibility of Results
3.
Biom J ; 60(2): 262-274, 2018 03.
Article in English | MEDLINE | ID: mdl-29230881

ABSTRACT

Controlling the technological variability along an analytical chain is critical for biomarker discovery. The sources of technological variability should be modeled, which calls for specific experimental design, signal processing, and statistical analysis. Furthermore, with unbalanced data, the various components of variability cannot be estimated with the sequential or adjusted sums of squares of the usual software programs. We propose a novel approach to variance component analysis, with an application to the matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) technology, and use this approach for protein quantification by a classical signal processing algorithm and two more recent ones (BHI-PRO 1 and 2). Given the high technological variability, the quantification failed to recover the known quantities of five out of nine proteins present in a controlled solution. There was a linear relationship between protein quantities and peak intensities for four out of nine peaks with all algorithms. The biological component of the variance was higher with BHI-PRO than with the classical algorithm (80-95% with BHI-PRO 1 and 79-95% with BHI-PRO 2 vs. 56-90%); thus, the BHI-PRO algorithms were more efficient for protein quantification. The technological component of the variance was higher with the classical algorithm than with BHI-PRO (6-25% vs. 2.5-9.6% with BHI-PRO 1 and 3.5-11.9% with BHI-PRO 2). The chemical component was also higher with the classical algorithm (3.6-18.7% vs. < 3.5%). Thus, the BHI-PRO algorithms were better at removing noise from the signal when the expected peaks were detected. Overall, either BHI-PRO algorithm may reduce the technological variance from 25% to 10% and thus improve protein quantification and biomarker validation.
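
In the notation below (symbols assumed, not taken from the paper), the reported percentages are the shares of the biological, technological and chemical components in the total variance of a peak intensity:

    \sigma^2_{\mathrm{tot}} = \sigma^2_{\mathrm{bio}} + \sigma^2_{\mathrm{tech}}
        + \sigma^2_{\mathrm{chem}} + \sigma^2_{\varepsilon},
    \qquad
    \mathrm{share}_c = 100 \times \frac{\sigma^2_c}{\sigma^2_{\mathrm{tot}}},
    \quad c \in \{\mathrm{bio}, \mathrm{tech}, \mathrm{chem}\}.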


Assuntos
Biometria/métodos , Proteínas/análise , Espectrometria de Massas por Ionização e Dessorção a Laser Assistida por Matriz , Algoritmos , Análise de Variância , Biomarcadores/análise , Biomarcadores/química , Modelos Lineares , Proteínas/química
4.
J Opt Soc Am A Opt Image Sci Vis ; 27(7): 1593-607, 2010 Jul 01.
Article in English | MEDLINE | ID: mdl-20596145

ABSTRACT

This paper tackles the problem of image deconvolution with joint estimation of the point spread function (PSF) parameters and the hyperparameters. Within a Bayesian framework, the solution is inferred via a global a posteriori law for the unknown parameters and the object. The estimate is chosen as the posterior mean, numerically calculated by means of a Markov chain Monte Carlo algorithm. The estimates are efficiently computed in the Fourier domain, and the effectiveness of the method is shown on simulated examples. Results show precise estimates of the PSF parameters and hyperparameters, as well as precise image estimates, including restoration of high frequencies and spatial details, within a global and coherent approach.
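
A schematic one-dimensional toy version of such a sampler is sketched below (priors, proposal settings and the Gaussian PSF model are assumptions, not the authors' algorithm). It alternates a Gaussian draw of the object, conjugate Gamma draws of the noise and prior precisions, and a random-walk Metropolis step on the PSF width, then averages the post-burn-in samples to approximate the posterior mean; the paper works in 2-D and gains efficiency by diagonalizing the circulant operators in the Fourier domain, whereas small dense matrices keep this sketch short and checkable:

    import numpy as np

    def psf_matrix(n, w):
        """Circulant convolution matrix for a 1-D Gaussian PSF of width w."""
        d = np.minimum(np.arange(n), n - np.arange(n))          # circular distances
        h = np.exp(-0.5 * (d / w) ** 2)
        h /= h.sum()
        return np.array([np.roll(h, k) for k in range(n)]).T

    def gibbs_deconv(y, n_iter=2000, w0=2.0, seed=0):
        n, rng = len(y), np.random.default_rng(seed)
        D = np.roll(np.eye(n), 1, axis=1) - np.eye(n)           # circular first differences
        x, w, gn, gx = np.zeros(n), w0, 1.0, 1.0
        x_mean = np.zeros(n)
        for it in range(n_iter):
            H = psf_matrix(n, w)
            # 1) object | rest : Gaussian with precision Q and mean m
            Q = gn * H.T @ H + gx * D.T @ D + 1e-8 * np.eye(n)
            L = np.linalg.cholesky(Q)
            m = np.linalg.solve(Q, gn * H.T @ y)
            x = m + np.linalg.solve(L.T, rng.standard_normal(n))
            # 2) precisions | rest : conjugate Gamma updates (vague Gamma(1e-3, 1e-3) priors)
            gn = rng.gamma(1e-3 + n / 2, 1.0 / (1e-3 + 0.5 * np.sum((y - H @ x) ** 2)))
            gx = rng.gamma(1e-3 + (n - 1) / 2,                  # n - 1 = rank of D
                           1.0 / (1e-3 + 0.5 * np.sum((D @ x) ** 2)))
            # 3) PSF width | rest : random-walk Metropolis on the likelihood (flat prior, w > 0.2)
            w_prop = w + 0.1 * rng.standard_normal()
            if w_prop > 0.2:
                Hp = psf_matrix(n, w_prop)
                dlog = -0.5 * gn * (np.sum((y - Hp @ x) ** 2) - np.sum((y - H @ x) ** 2))
                if np.log(rng.uniform()) < dlog:
                    w = w_prop
            if it >= n_iter // 2:                               # accumulate the posterior mean
                x_mean += x / (n_iter - n_iter // 2)
        return x_mean, w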

5.
Phys Med ; 65: 172-180, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31494371

ABSTRACT

Proton imaging can be carried out on microscopic samples by focusing the beam to a diameter ranging from a few micrometers down to a few tens of nanometers, depending on the required beam intensity and spatial resolution. Three-dimensional (3D) imaging by tomography is obtained from proton transmission (STIM: Scanning Transmission Ion Microscopy) and/or X-ray emission (PIXE: Particle Induced X-ray Emission). In these experiments, the samples are dehydrated for analysis under vacuum. In situ quantification of nanoparticles has been carried out at CENBG in the framework of nanotoxicology studies, on cells and on small organisms used as biological models, especially the Caenorhabditis elegans (C. elegans) nematode. Tomography experiments reveal the distribution of mass density and chemical content (in g·cm-3) within the analyzed volume. These density values are obtained using an inversion algorithm. To investigate the effect of this data reduction process, we defined different numerical phantoms, including a (dehydrated) C. elegans phantom whose geometry and density were derived from experimental data. A Monte Carlo simulation based on the Geant4 toolkit was developed. Using different simulation and reconstruction conditions, we compared the resulting tomographic images to the initial numerical reference phantom. A study of the relative error between the reconstructed and reference images led to the result that 20 protons per shot can be considered an optimal number for 3D STIM imaging. Preliminary results for PIXE tomography are also presented, showing the value of such numerical phantoms for producing reference data for future studies of X-ray signal attenuation in thick samples.
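
The hedged sketch below mimics the spirit of such a study on a generic numerical phantom, using the Shepp-Logan phantom, a Radon-transform projector and filtered back-projection from scikit-image. The noise model (averaging the energy loss of n protons per shot shrinks the fluctuation as 1/sqrt(n)) and every numerical value are illustrative assumptions, not the Geant4 pipeline or the paper's results:

    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, resize

    def stim_relative_error(n_protons, straggling=0.05, seed=0):
        rng = np.random.default_rng(seed)
        phantom = resize(shepp_logan_phantom(), (128, 128))   # stand-in density map
        theta = np.linspace(0.0, 180.0, 90, endpoint=False)
        sino = radon(phantom, theta=theta)                    # ideal STIM line integrals
        # Averaging n protons per shot reduces the energy-loss fluctuation as 1/sqrt(n)
        noise_sd = straggling * sino.max() / np.sqrt(n_protons)
        reco = iradon(sino + rng.normal(0.0, noise_sd, sino.shape),
                      theta=theta, filter_name="ramp")
        return np.linalg.norm(reco - phantom) / np.linalg.norm(phantom)

    for n in (1, 5, 20, 100):
        print(n, "protons/shot -> relative error:", round(stim_relative_error(n), 3))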


Subjects
Imaging, Three-Dimensional; Microscopy; Monte Carlo Method; Protons; Animals; Caenorhabditis elegans; Image Processing, Computer-Assisted; Phantoms, Imaging
6.
IEEE Trans Image Process ; 17(1): 16-26, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18229801

ABSTRACT

This paper proposes a non-Gaussian Markov field with a special feature: an explicit partition function. To the best of our knowledge, this is an original contribution. Moreover, the explicit expression of the partition function enables the development of an unsupervised, edge-preserving, convex deconvolution method. The method is fully Bayesian and produces an estimate in the sense of the posterior mean, numerically calculated by means of a Markov chain Monte Carlo technique. The approach is particularly effective, and the computational practicability of the method is shown on a simple simulated example.
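
In generic notation (symbols assumed, not the paper's), the reason an explicit partition function matters is that it appears in the hyperparameter conditional that the sampler must evaluate:

    p(x \mid \theta) = \frac{\exp\{-\theta\,\Phi(x)\}}{Z(\theta)},
    \qquad
    p(\theta \mid x, y) \propto \pi(\theta)\,\frac{\exp\{-\theta\,\Phi(x)\}}{Z(\theta)},

so Z(\theta) must be available in closed form for the hyperparameters to be sampled, which is what makes the unsupervised (fully Bayesian) treatment tractable.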


Subjects
Algorithms; Artificial Intelligence; Bayes Theorem; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Pattern Recognition, Automated/methods; Information Storage and Retrieval/methods; Reproducibility of Results; Sensitivity and Specificity
7.
EURASIP J Bioinform Syst Biol ; 2017(1): 9, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28710702

ABSTRACT

This paper addresses the question of biomarker discovery in proteomics. Given clinical data comprising a list of protein concentrations for a set of individuals, the problem tackled is to extract a small subset of proteins whose concentrations indicate the biological status (healthy or pathological). In this paper, it is formulated as a specific instance of variable selection. The originality is that the proteins are not investigated one after the other; instead, the best partition between discriminant and non-discriminant proteins is sought directly. In this way, correlations between the proteins are intrinsically taken into account in the decision. The developed strategy is derived in a Bayesian setting, and the decision is optimal in the sense that it minimizes a global mean error; it is ultimately based on the posterior probabilities of the partitions. The main difficulty is to calculate these probabilities, since they rely on the so-called evidence, which requires marginalizing out all the unknown model parameters. Two models are presented that relate the status to the protein concentrations, depending on whether the latter are biomarkers or not. The first model accounts for biological variability by assuming that the concentrations are Gaussian distributed with a mean and a covariance matrix that depend on the status only for the biomarkers. The second one is an extension that also takes into account the technical variability that may significantly impact the observed concentrations. The main contributions of the paper are: (1) a new Bayesian formulation of the biomarker selection problem, (2) the closed-form expression of the posterior probabilities in the noiseless case, and (3) a suitable approximate solution in the noisy case. The methods are numerically assessed and compared to state-of-the-art methods (t test, LASSO, Bhattacharyya distance, FOHSIC) on synthetic and real data from proteins quantified in human serum by mass spectrometry in selected reaction monitoring mode.
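
A hedged sketch of the partition-selection idea is given below. It scores every partition of the proteins with a BIC-style surrogate for the evidence rather than the closed-form expression derived in the paper, and the variable names are assumptions; exhaustive enumeration is only practical for small protein panels:

    import itertools
    import numpy as np
    from scipy.stats import multivariate_normal

    def gauss_loglik(X):
        """Log-likelihood of the rows of X under a fitted Gaussian (MLE mean/covariance)."""
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False, bias=True) + 1e-6 * np.eye(X.shape[1])
        return multivariate_normal.logpdf(X, mu, cov).sum()

    def best_partition(X, y):
        """X: (n_samples, n_proteins) concentrations; y: numpy array of 0/1 statuses."""
        n, p = X.shape
        scores = {}
        for k in range(p + 1):
            for S in itertools.combinations(range(p), k):      # candidate biomarker subset
                Sc = [j for j in range(p) if j not in S]
                ll, n_par = 0.0, 0
                if S:                                          # status-dependent Gaussian block
                    for c in (0, 1):
                        ll += gauss_loglik(X[y == c][:, list(S)])
                    n_par += 2 * (k + k * (k + 1) // 2)
                if Sc:                                         # status-independent Gaussian block
                    ll += gauss_loglik(X[:, Sc])
                    q = len(Sc)
                    n_par += q + q * (q + 1) // 2
                scores[S] = ll - 0.5 * n_par * np.log(n)       # BIC-approximated log evidence
        return max(scores, key=scores.get), scores

Because each partition is scored as a whole, correlated proteins are judged jointly rather than one at a time, which is the point of the formulation.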

8.
IEEE Trans Image Process ; 15(11): 3325-37, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17076393

ABSTRACT

Super-resolution (SR) techniques make use of subpixel shifts between frames in an image sequence to yield higher-resolution images. We propose an original observation model devoted to the case of nonisometric inter-frame motion, as required, for instance, in the context of airborne imaging sensors. First, we describe how the main observation models used in the SR literature deal with motion and explain why they are not suited to nonisometric motion. Then, we propose an extension of the observation model of Elad and Feuer adapted to affine motion. This model is based on a decomposition of affine transforms into successive shear transforms, each one efficiently implemented by row-by-row or column-by-column one-dimensional affine transforms. We demonstrate on synthetic and real sequences that our observation model, incorporated in an SR reconstruction technique, leads to better results in the case of variable-scale motions and provides equivalent results in the case of isometric motions.
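
The row/column decomposition can be illustrated with a short sketch (assumed linear interpolation and zero padding, translation omitted; not the authors' implementation). The linear part [[a, b], [c, d]] is applied as a row-by-row 1-D affine resampling followed by a column-by-column one, provided a and the determinant are non-zero:

    import numpy as np

    def affine_two_pass(img, A):
        """Warp img by the 2x2 matrix A = [[a, b], [c, d]] using two 1-D passes."""
        (a, b), (c, d) = A
        e = d - c * b / a                       # second-pass scale, equal to det(A) / a
        H, W = img.shape
        xs, ys = np.arange(W), np.arange(H)
        # Pass 1 (row by row): u = a*x + b*y  ->  sample row y at x = (u - b*y) / a
        tmp = np.empty_like(img, dtype=float)
        for y in ys:
            tmp[y] = np.interp((xs - b * y) / a, xs, img[y], left=0.0, right=0.0)
        # Pass 2 (column by column): v = (c/a)*u + e*y  ->  sample column u at y = (v - (c/a)*u) / e
        out = np.empty_like(tmp)
        for u in xs:
            out[:, u] = np.interp((ys - (c / a) * u) / e, ys, tmp[:, u], left=0.0, right=0.0)
        return out

Two passes suffice because the second-pass offset depends only on the coordinate already produced by the first pass; each pass is a shear combined with a 1-D scaling, which is what can be implemented efficiently line by line.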


Subjects
Algorithms; Artifacts; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Subtraction Technique; Video Recording/methods; Artificial Intelligence; Computer Simulation; Information Storage and Retrieval/methods; Models, Theoretical; Motion; Reproducibility of Results; Sensitivity and Specificity
9.
Appl Opt ; 43(2): 257-63, 2004 Jan 10.
Article in English | MEDLINE | ID: mdl-14735945

ABSTRACT

We address the issue of distinguishing point objects from a cluttered background and estimating their position by image processing. We are interested in the specific context in which the object's signature varies significantly with its random subpixel location because of aliasing. The conventional matched filter neglects this phenomenon and suffers a consistent degradation of detection performance. Alternative detectors are therefore proposed, and numerical results show the improvement brought by approximate and generalized likelihood-ratio tests compared with pixel-matched filtering. We also study the performance of two types of subpixel position estimator. Finally, we highlight the major influence of sensor design on both point object detection and position estimation.
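
The hedged toy comparison below (assumed Gaussian PSF, amplitudes and thresholds; not the paper's detectors or figures) illustrates why maximizing over candidate subpixel shifts helps when the sampled signature depends on the shift:

    import numpy as np

    def signature(shift, n=9, width=0.7):
        """Undersampled PSF sampled on the pixel grid with a subpixel shift."""
        x = np.arange(n) - n // 2
        s = np.exp(-0.5 * ((x - shift) / width) ** 2)
        return s / np.linalg.norm(s)

    def pixel_matched_filter(y):
        return float(y @ signature(0.0))                    # assumes an on-pixel signature

    def glrt_over_shifts(y, grid=np.linspace(-0.5, 0.5, 21)):
        return max(float(y @ signature(s)) for s in grid)   # maximize over the unknown shift

    rng = np.random.default_rng(0)
    amp, trials, threshold = 3.0, 2000, 2.5
    hits_mf = hits_glrt = 0
    for _ in range(trials):
        shift = rng.uniform(-0.5, 0.5)                      # unknown subpixel location
        y = amp * signature(shift) + rng.standard_normal(9)
        hits_mf += pixel_matched_filter(y) > threshold
        hits_glrt += glrt_over_shifts(y) > threshold
    # In a rigorous comparison the thresholds would be set separately so that both
    # detectors operate at the same false-alarm rate.
    print("detection rate, pixel-matched filter:", hits_mf / trials)
    print("detection rate, GLRT over shifts:   ", hits_glrt / trials)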
