Results 1 - 10 of 10
1.
Annu Int Conf IEEE Eng Med Biol Soc ; 2015: 7849-52, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26738111

ABSTRACT

Analysis of a fluid mixture using a chromatographic system is a standard technique in many biomedical applications, such as in vitro diagnostics of body fluids or air and water quality assessment. The analysis is often targeted at a set of molecules or biomarkers. However, owing to the complexity of the fluid, the number of mixture components is often larger than the list of targeted molecules. To make the analysis as exhaustive as possible, and to take possible interferences into account, it is important to identify and quantify all the components present in the chromatographic signal. The signal processing therefore aims to reconstruct a list of an unknown number of components together with their relative concentrations. We address this question as a problem of sparse representation of the chromatographic signal. The representation is based on a stochastic forward model that describes the transport of individual molecules through the chromatography column as a molecular random walk. We investigate three methods: two probabilistic Bayesian approaches, one parametric and one non-parametric, and a deterministic approach based on a parsimonious decomposition over a dictionary. We assess the performance of these three approaches on an experimental case devoted to the analysis of mixtures of polycyclic aromatic hydrocarbon (PAH) micro-pollutants in a methanol solution, at both high and low signal-to-noise ratio (SNR).
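As an aside on the third approach: a parsimonious decomposition over a dictionary of peak shapes can be sketched with a greedy matching pursuit. This is an illustrative stand-in, not the authors' stochastic forward model; the Gaussian atoms, dictionary size and amplitudes below are hypothetical.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy sparse decomposition: repeatedly pick the dictionary
    atom most correlated with the residual and subtract it."""
    residual = signal.astype(float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))
        coeffs[k] += correlations[k]
        residual -= correlations[k] * dictionary[:, k]
    return coeffs, residual

# Dictionary of unit-norm Gaussian peaks at candidate retention times.
t = np.linspace(0.0, 1.0, 200)

def peak(center, width=0.02):
    p = np.exp(-0.5 * ((t - center) / width) ** 2)
    return p / np.linalg.norm(p)

D = np.column_stack([peak(c) for c in np.linspace(0.1, 0.9, 40)])
# Synthetic two-component chromatogram built from two atoms.
y = 3.0 * D[:, 5] + 1.5 * D[:, 30]
coeffs, res = matching_pursuit(y, D, n_atoms=2)
```

On this noiseless toy signal the two atoms and their amplitudes are recovered exactly; on real chromatograms the number of components is unknown, which is precisely what the Bayesian and non-parametric variants address.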


Subjects
Chromatography/methods; Signal Processing, Computer-Assisted; Bayes Theorem; Chromatography, Gas/methods; Models, Molecular; Polycyclic Aromatic Hydrocarbons/analysis; Signal-To-Noise Ratio; Stochastic Processes; Water Pollutants, Chemical/analysis
2.
Comput Sci Eng ; 94(6): 521-539, 2012 Jun 01.
Article in English | MEDLINE | ID: mdl-22942787

ABSTRACT

Nanoinformatics has recently emerged to address the need for computing applications at the nanoscale. In this regard, the authors have participated in various initiatives to identify its concepts, foundations and challenges. While nanomaterials open up the possibility of developing new devices in many industrial and scientific areas, they also offer breakthrough perspectives for the prevention, diagnosis and treatment of diseases. In this paper, we analyze the different aspects of nanoinformatics and suggest five research topics to help catalyze new research and development in the area, with a particular focus on nanomedicine. We also discuss the use of informatics to further the biological and clinical applications of basic research in nanoscience and nanotechnology, and the related concept of an extended "nanotype" to coalesce information related to nanoparticles. We suggest how nanoinformatics could accelerate developments in nanomedicine, much as informatics did for the Human Genome and other -omics projects, on issues such as exchanging modeling and simulation methods and tools, linking toxicity information to clinical and personal databases, or developing new approaches to scientific ontologies, among many others.

3.
Article in English | MEDLINE | ID: mdl-18003376

ABSTRACT

There is a need to measure protein concentrations in biological samples precisely, both for early diagnosis of disease and for understanding fundamental biological processes. Many instruments have been proposed, and the corresponding data processing methods now have to be investigated. This paper focuses on data processing for proteomic experiments combining nano liquid chromatography and mass spectrometry. Experimental fluctuations in this process call for robust methods. Consequently, we propose a model of the acquisition system and a probabilistic Bayesian method to estimate the protein concentrations and the system parameters.
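For a single known elution profile, the Bayesian estimation step can be reduced to a conjugate Gaussian model; this one-component sketch is hypothetical and far simpler than the paper's joint estimation of several concentrations and system parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward model: observed signal y = c * s + Gaussian noise, where s
# is the known unit response of one protein (hypothetical profile).
n = 500
x = np.linspace(0.0, 1.0, n)
s = np.exp(-0.5 * ((x - 0.5) / 0.05) ** 2)
c_true, sigma = 2.5, 0.1
y = c_true * s + rng.normal(0.0, sigma, n)

# Conjugate Gaussian prior c ~ N(mu0, tau0^2) yields a closed-form
# Gaussian posterior over the concentration.
mu0, tau0 = 0.0, 10.0
precision = 1.0 / tau0**2 + (s @ s) / sigma**2
c_post = (mu0 / tau0**2 + (s @ y) / sigma**2) / precision
c_std = precision ** -0.5
```

The posterior standard deviation quantifies how experimental noise propagates into the concentration estimate, which is the kind of robustness the abstract calls for.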


Subjects
Chromatography, Liquid/methods; Gene Expression Profiling/methods; Mass Spectrometry/methods; Pattern Recognition, Automated/methods; Peptide Mapping/methods; Proteome/analysis; Proteome/chemistry; Algorithms; Artificial Intelligence; Bayes Theorem; Reproducibility of Results; Sensitivity and Specificity
4.
Article in English | MEDLINE | ID: mdl-18003377

ABSTRACT

This paper presents a full LC-MS (Liquid Chromatography-Mass Spectrometry) proteomics analysis chain combining bio-, nano- and information technologies in order to quantify targeted proteins in blood samples. The objective is to enable early detection of pancreatic cancer. We focus on the data processing step, which estimates the protein concentrations. First, we pre-process the data to correct the time shift between experiments, using a block-matching algorithm. Second, protein quantification is performed using chemometric approaches, namely the CLS, PLS, N-PLS and PARAFAC algorithms. The performance of the various methods is compared on LC-MS analyses of the protein cytochrome c.
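Among the chemometric methods mentioned, CLS (classical least squares) is the simplest: the mixture signal is modeled as a linear combination of known pure-component profiles, and the concentrations are recovered by ordinary least squares. The two Gaussian profiles and concentrations below are synthetic stand-ins, not cytochrome c data.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 300)

def profile(center, width):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Columns of S are the known pure-component responses.
S = np.column_stack([profile(0.25, 0.05), profile(0.6, 0.07)])
conc_true = np.array([1.2, 0.8])
y = S @ conc_true + rng.normal(0.0, 0.01, x.size)

# CLS: least-squares fit of the mixture onto the pure profiles.
conc_est, *_ = np.linalg.lstsq(S, y, rcond=None)
```

Unlike CLS, the PLS, N-PLS and PARAFAC variants do not require the pure-component responses to be fully known in advance, which is one reason the paper compares them.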


Subjects
Algorithms; Blood Chemical Analysis/methods; Blood Proteins/analysis; Chromatography, Liquid/methods; Combinatorial Chemistry Techniques/methods; Mass Spectrometry/methods; Pattern Recognition, Automated/methods; Biomarkers/blood; Reproducibility of Results; Sensitivity and Specificity
5.
Article in English | MEDLINE | ID: mdl-18003398

ABSTRACT

Detecting proteins in human blood holds the promise of a revolution in cancer diagnosis. The ability to perform laboratory operations at small scales using miniaturized lab-on-a-chip devices also has many benefits. Designing and fabricating such systems is extremely challenging, but physicists and engineers are beginning to construct highly integrated and compact labs-on-chips with exciting functionality. This paper focuses on the requirements of the information technology layer of such an integrated platform being developed in the LOCCANDIA project. LOCCANDIA is a Specific Targeted Research Project (STREP) funded under the 6th Framework Programme of the EC. Its ultimate objective is to develop an innovative nanotechnology-based lab-on-a-chip platform for the medical proteomics field. The paper presents the main engineering aspects, challenges and architecture involved in creating an Integrated Clinico-Proteomic Environment. The environment will be used to monitor and document the analysis and discovery chain, and to allow the physician to interpret, for diagnostic purposes, the digital spectrogram data delivered by the mass spectrometer.


Subjects
Blood Chemical Analysis/instrumentation; Computational Biology/instrumentation; Databases, Protein; Protein Array Analysis/instrumentation; Proteomics/instrumentation; Sequence Analysis, Protein/instrumentation; Software; Blood Chemical Analysis/methods; Computational Biology/methods; Database Management Systems; Equipment Design; Equipment Failure Analysis; Protein Array Analysis/methods; Proteomics/methods; Sequence Analysis, Protein/methods; Systems Integration
6.
Phys Med Biol ; 44(10): 2623-42, 1999 Oct.
Article in English | MEDLINE | ID: mdl-10533932

ABSTRACT

In SPECT, regularization is necessary to avoid divergence of the iterative algorithms used for non-uniform attenuation compensation. In this paper, we propose a spline-based regularization method for the minimal residual algorithm. First, the acquisition noise is filtered using a statistical model involving spline smoothing, so that the filtered projections belong to a Sobolev space with specific continuity and differentiability properties. Then, during the iterative reconstruction procedure, the continuity of the inverse Radon transform between Sobolev spaces is used to design a spline-regularized filtered backprojection method, by which the known regularity properties of the projections determine those of the corresponding reconstructed slices. This ensures that the activity distributions estimated at each iteration have regularity properties, which avoids computational noise amplification and thus stabilizes the iterative process. Analytical and Monte Carlo simulations show that the proposed spline-regularized minimal residual algorithm converges to a satisfactory stable solution in terms of restored activity and homogeneity, using at most 25 iterations, whereas the non-regularized version of the algorithm diverges. Choosing the number of iterations is therefore no longer a critical issue for this reconstruction procedure.
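A discrete analogue of the spline-smoothing step is the Whittaker/Tikhonov smoother, which penalizes second differences of the projection (the finite-dimensional counterpart of controlling Sobolev-type regularity). This is an illustrative 1-D sketch, not the paper's actual filter.

```python
import numpy as np

def smooth(y, lam):
    """Whittaker smoother: minimize ||x - y||^2 + lam * ||D2 x||^2,
    where D2 is the second-difference operator."""
    n = y.size
    D2 = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 100)
clean = np.sin(2.0 * np.pi * t)          # stand-in for a noise-free projection
noisy = clean + rng.normal(0.0, 0.2, t.size)
filtered = smooth(noisy, lam=10.0)
```

Penalizing second differences shrinks high-frequency noise while leaving the smooth component nearly untouched, which is what prevents noise amplification across iterations.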


Subjects
Heart/diagnostic imaging; Image Processing, Computer-Assisted; Phantoms, Imaging; Tomography, Emission-Computed, Single-Photon/methods; Algorithms; Humans; Models, Statistical; Reproducibility of Results
7.
Eur J Nucl Med ; 26(12): 1580-8, 1999 Dec.
Article in English | MEDLINE | ID: mdl-10638410

ABSTRACT

Epidepride labelled with iodine-123 is a suitable probe for the in vivo imaging of striatal and extrastriatal dopamine D2 receptors using single-photon emission tomography (SPET). Recently, this molecule has also been labelled with carbon-11. The goal of this work was to develop a method allowing the in vivo quantification of radioactivity uptake in baboon brain using SPET and to validate it using positron emission tomography (PET). SPET studies were performed in Papio anubis baboons using 123I-epidepride. Emission and transmission measurements were acquired on a dual-headed system with variable head angulation and low-energy ultra-high resolution (LEUHR) collimation. The imaging protocol consisted of one transmission measurement (24 min, heads at 90 degrees), obtained with two sliding line sources of gadolinium-153 prior to injection of 0.21-0.46 GBq of 123I-epidepride, and 12 emission measurements starting 5 min post injection. For scatter correction (SC) we used a dual-window method adapted to 123I. Collimator blurring correction (CBC) was done by deconvolution in Fourier space and attenuation correction (AT) was applied on a preliminary (CBC) filtered back-projection reconstruction using 12 iterations of a preconditioned, regularized minimal residual algorithm. For each reconstruction, a calibration factor was derived from a uniform cylinder filled with a 123I solution of a known radioactivity concentration. Calibration and baboon images were systematically built with the same reconstruction parameters. Uncorrected (UNC) and (AT), (SC + AT) and (SC + CBC + AT) corrected images were compared. PET acquisitions using 0.11-0.44 GBq of 11C-epidepride were performed on the same baboons and used as a reference. The radioactive concentrations expressed in percent of the injected dose per 100 ml (% ID/100 ml) obtained after (SC + CBC + AT) in SPET are in good agreement with those obtained with PET and 11C-epidepride. 
A method for the in vivo absolute quantitation of 123I-epidepride uptake using SPET has been developed that can be applied directly to other 123I-labelled molecules used in the study of the dopamine system. Further work will consist of using PET to model the radioligand-receptor interactions and to derive a simplified model applicable in SPET.
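The dual-window idea can be stated in a few lines: scatter in the photopeak window is approximated as a scaled copy of the counts recorded in an adjacent lower-energy window. All numbers below, including the scaling factor k, are hypothetical; the actual factor must be calibrated for 123I and the chosen windows.

```python
import numpy as np

# Hypothetical projection-bin counts.
primary = np.array([100.0, 400.0, 900.0, 400.0, 100.0])  # unscattered photons
scatter_pp = np.array([50.0, 60.0, 80.0, 60.0, 50.0])    # scatter in photopeak window
photopeak = primary + scatter_pp                          # what the camera records

# In this toy setup the lower window records twice the photopeak
# scatter, so the calibrated scaling factor is k = 0.5.
scatter_win = 2.0 * scatter_pp
k = 0.5
corrected = photopeak - k * scatter_win                   # scatter-corrected counts
```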


Subjects
Benzamides/pharmacokinetics; Brain/metabolism; Pyrrolidines/pharmacokinetics; Tomography, Emission-Computed, Single-Photon/methods; Tomography, Emission-Computed/methods; Animals; Benzamides/analysis; Biological Transport; Brain/diagnostic imaging; Carbon Radioisotopes; Iodine Radioisotopes; Papio; Pyrrolidines/analysis; Radiopharmaceuticals/pharmacokinetics; Reproducibility of Results
8.
Phys Med Biol ; 43(4): 715-27, 1998 Apr.
Article in English | MEDLINE | ID: mdl-9572498

ABSTRACT

This paper presents an iterative method, based on the minimal residual algorithm, for attenuation-compensated tomographic reconstruction from attenuated cone-beam projections, given the attenuation distribution. Unlike conjugate-gradient-based reconstruction techniques, the proposed minimal residual algorithm directly solves a quasisymmetric linear system, which is already preconditioned. It thus avoids the use of the normal equations, which improves the convergence rate. Two main contributions are introduced. First, a regularization method is derived for quasisymmetric problems, based on Tikhonov-Phillips regularization applied to the factorization of the symmetric part of the system matrix. This regularization is made spatially adaptive to avoid smoothing the region of interest. Second, our existing reconstruction algorithm for attenuation correction in parallel-beam geometry is extended to cone-beam geometry, for a circular orbit. Two preconditioning operators are proposed: Grangeat's inversion formula and Feldkamp's inversion formula. Experimental results obtained on simulated data are presented, and the shadow-zone effect on attenuated data is illustrated.
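The effect of Tikhonov-Phillips regularization on an ill-conditioned system can be seen on a toy problem; the random operator below is a stand-in for the attenuated cone-beam projector, and the spectrum and noise level are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
# Ill-conditioned operator with a rapidly decaying singular spectrum.
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
s = np.logspace(0, -6, n)
A = U @ np.diag(s) @ V.T

x_true = np.sin(np.linspace(0.0, 3.0, n))
b = A @ x_true + rng.normal(0.0, 1e-4, n)

# Unregularized solve: noise is amplified by the small singular values.
x_naive = np.linalg.solve(A, b)
# Tikhonov-Phillips: damp the small singular values with a penalty.
alpha = 1e-3
x_reg = np.linalg.solve(A.T @ A + alpha**2 * np.eye(n), A.T @ b)
```

The paper goes further, making the regularization spatially adaptive and applying it to the symmetric part of a quasisymmetric system, neither of which this sketch attempts.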


Subjects
Image Interpretation, Computer-Assisted/methods; Image Processing, Computer-Assisted/methods; Phantoms, Imaging; Tomography, Emission-Computed, Single-Photon; Algorithms; Calibration; Models, Statistical; Normal Distribution
9.
Comput Med Imaging Graph ; 17(4-5): 279-87, 1993.
Article in English | MEDLINE | ID: mdl-8306299

ABSTRACT

In this article we report on the TOMOCONIC project for cerebral SPECT using cone-beam collimators. First, we describe our experimental set-up. The cone-beam collimator improves both spatial resolution and sensitivity. We tilt the detector to center the focal-point rotation plane on the region to be reconstructed. Then, to minimize cone-beam artefacts, we use the Grangeat algorithm for image reconstruction, and we describe how it can be generalized to this tilted acquisition geometry. Finally, we present our first experimental results and a comparison with parallel-beam SPECT: the improvement ratio in transverse spatial resolution is 1.5. We conclude with our first clinical images.


Subjects
Brain/diagnostic imaging; Image Processing, Computer-Assisted/methods; Tomography, Emission-Computed, Single-Photon/instrumentation; Algorithms; Cerebrovascular Circulation; Computer Graphics; Humans; Models, Structural; Organotechnetium Compounds; Oximes; Sensitivity and Specificity; Software; Technetium Tc 99m Exametazime
10.
Phys Med Biol ; 37(3): 717-29, 1992 Mar.
Article in English | MEDLINE | ID: mdl-1565699

ABSTRACT

Interest in fully three-dimensional image reconstruction, especially in positron emission tomography (PET), has increased significantly over the last few years. Taking the cross-plane gamma rays into account in a three-dimensional reconstruction algorithm improves the sensitivity. At LETI, our specialty in PET is time-of-flight (TOF) measurement, so we present in this article two reconstruction techniques for 3D TOF PET. The first is a backprojection-convolution algorithm. Owing to the redundancy in the 3D data set, there exists an infinite number of suitable filters. As Defrise and co-workers did for classical tomography, we establish a general condition that characterizes these filters and propose an algorithm with a factorizable filter. However, this first technique requires an acquisition system with rotational symmetry. We therefore present a second technique, adapted to a detection geometry with a small number of angular positions: a multi-image deconvolution algorithm with a Wiener filter.


Subjects
Image Processing, Computer-Assisted/methods; Tomography, Emission-Computed/methods; Algorithms