Results 1 - 14 of 14
1.
Sensors (Basel) ; 23(13)2023 Jul 02.
Article in English | MEDLINE | ID: mdl-37447945

ABSTRACT

The development of a capnometry wristband is of great interest for monitoring patients at home. We consider a new architecture in which a non-dispersive infrared (NDIR) optical measurement is located close to the skin surface and combined with an open-chamber principle with a continuous air flow circulating through the collection cell. We propose a model for the temporal dynamics of the carbon dioxide exchange between the blood and the gas channel inside the device. The transport of carbon dioxide is modeled by convection-diffusion equations. We consider four compartments: blood, skin, the measurement cell, and the collection cell. We introduce the state-space equations and the transition matrix associated with a Markovian model. We define an augmented system by combining the state vector with a first-order autoregressive model describing the supply of carbon dioxide concentration in the blood compartment and its inertial resistance to change. We propose to use a Kalman filter to estimate the carbon dioxide concentration in the blood vessels recursively over time and thus monitor arterial carbon dioxide blood pressure in real time. Four performance factors with respect to the dynamic quantification of the CO2 blood concentration are considered, and a simulation is carried out based on data from a previous clinical study. The results demonstrate the feasibility of such a technological concept.
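The recursive estimation step can be sketched as a scalar Kalman filter tracking a slowly varying level through an AR(1) state model, in the spirit of the augmented autoregressive formulation above. Everything below (the AR coefficient, noise variances, and the simulated CO2 level) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def kalman_track(y, a=0.999, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Recursively estimate a hidden AR(1) state from noisy measurements y."""
    x, p = x0, p0
    estimates = []
    for z in y:
        # Predict: x_k = a * x_{k-1} + process noise of variance q
        x, p = a * x, a * a * p + q
        # Update: blend the prediction with the new measurement (variance r)
        k = p / (p + r)                  # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_level = 5.0                          # simulated blood CO2 level (arbitrary units)
y = true_level + 0.5 * rng.standard_normal(500)
est = kalman_track(y, x0=true_level)
```

With these settings the filter output hovers around the simulated level while smoothing out most of the measurement noise.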


Subject(s)
Capnography , Carbon Dioxide , Humans , Diffusion , Physiological Monitoring/methods
2.
Annu Int Conf IEEE Eng Med Biol Soc ; 2020: 4640-4643, 2020 07.
Article in English | MEDLINE | ID: mdl-33019028

ABSTRACT

The development of wearable devices for healthcare monitoring is of primary interest, in particular for homecare applications. However, it is challenging to develop an evaluation framework to test and optimize such a device while following a non-invasive protocol. As well-established reference devices exist for capnometry, we propose a protocol to evaluate and compare the performance of the transcutaneous carbon dioxide monitoring wristband that we are developing. We present this protocol, the signal processing pipeline, the data analysis based on signal alignment and an intercorrelation study, and the first results on a cohort of 13 healthy subjects. This test demonstrates the influence of the device response time and of the carbon dioxide content in the ambient air. Clinical Relevance: The protocol described here makes it possible to test and optimize the new device in clinical conditions simulating hypo- and hypercapnia variations in a subject at rest, as would be the case at home when monitoring the health status of chronic respiratory patients, and to compare its performance with reference devices. A strong intercorrelation, greater than 0.8, was observed in 5 of the 13 healthy subjects, and factors influencing the intercorrelation are suggested.
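The signal alignment and intercorrelation idea can be illustrated with a toy lag search: slide one signal against the other and keep the lag that maximizes the Pearson correlation of the overlapping parts. The signals, the 30-sample delay, and the `best_lag` helper are synthetic assumptions, not the study's pipeline:

```python
import numpy as np

def best_lag(ref, sig, max_lag):
    """Return (lag, correlation) maximizing the correlation of the overlaps."""
    best = (0, -1.0)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = ref[lag:], sig[:len(sig) - lag]
        else:
            a, b = ref[:lag], sig[-lag:]
        r = np.corrcoef(a, b)[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best

t = np.linspace(0, 60, 600)
ref = np.sin(2 * np.pi * t / 20)          # reference device trace (synthetic)
sig = np.roll(ref, 30)                    # "wristband" trace, delayed 30 samples
lag, r = best_lag(ref, sig, max_lag=50)
```

For these synthetic signals the search recovers the 30-sample shift (as lag = -30, since `sig` trails `ref`) with a correlation of essentially 1.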


Subject(s)
Carbon Dioxide , Hypercapnia , Capnography , Healthy Volunteers , Humans , Physiological Monitoring
3.
Annu Int Conf IEEE Eng Med Biol Soc ; 2019: 3352-3355, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31946599

ABSTRACT

We introduce an innovative wristband wireless device based on a dual-wavelength NDIR optical measurement and an optimized thermo-fluidic channel that improves the extraction of carbon dioxide gas from the blood within the heated skin region. We describe a signal processing model combining an innovative linear-quadratic model of the optical measurement with a fluidic model. The evaluation is carried out using a cardiopulmonary exercise test (CPET). We compare the carbon dioxide tension measured at the forearm with our device against an electrochemical measurement at the same site and an optical measurement of the end-tidal exhaled breath. These curves demonstrate a significant reduction of the variability of the carbon dioxide pressure measurement with respect to the pressure dynamic range during the test.


Subject(s)
Carbon Dioxide , Exercise Test , Wearable Electronic Devices , Blood Gas Analysis , Blood Gas Monitoring, Transcutaneous , Carbon Dioxide/analysis , Exercise Test/instrumentation , Humans , Physiological Monitoring
4.
Annu Int Conf IEEE Eng Med Biol Soc ; 2018: 1230-1233, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30440612

ABSTRACT

In medical applications, quantitative breath analysis may open new prospects for diagnosis and patient monitoring. To detect acetone, a breath biomarker for diabetes, we use a single metal-oxide (MOX) gas sensor operating in a dual-temperature mode. We propose a linear-quadratic model to describe the mixing model mapping gas concentrations to MOX sensor responses. For this purpose, it is necessary to invert the nonlinear problem in order to quantify the components of the gas mixture. As a proof of concept, we study a mixture of two gases, acetone and ethanol, diluted in an air buffer. To estimate the concentration of each gas, we introduce a supervised Bayesian source separation method. Based on MCMC stochastic sampling to estimate the mean of the posterior distribution, this Bayesian approach is robust to noise when solving this ill-posed nonlinear inversion problem. We analyze the performance on a set of samples with gas concentrations covering the range expected in exhaled breath. We use a cross-validation approach, calibrating the mixing parameters with some samples and validating the source estimation with others. Our new supervised method, applied to the linear-quadratic model, estimates acetone and ethanol concentrations with a precision of around 2 ppm.
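As a rough illustration of inverting a linear-quadratic mixing model, the sketch below replaces the paper's supervised Bayesian MCMC method with a plain grid search over the concentration plane. The coefficient matrices `A` and `B` and the ppm range are invented for the example:

```python
import numpy as np

# Assumed (invented) calibration: rows = two sensor temperatures,
# columns = (acetone, ethanol) contributions.
A = np.array([[1.0, 0.6], [0.4, 1.1]])        # linear mixing terms
B = np.array([[0.02, 0.05], [0.04, 0.01]])    # quadratic self terms

def respond(c):
    """Linear-quadratic response of the two virtual sensors to concentrations c."""
    c = np.asarray(c, dtype=float)
    return A @ c + B @ (c * c)

def invert(y, grid=np.linspace(0.0, 20.0, 401)):
    """Exhaustive search for the (c1, c2) pair best matching the responses y."""
    c1, c2 = np.meshgrid(grid, grid, indexing="ij")
    pred = np.stack([A[i, 0] * c1 + A[i, 1] * c2
                     + B[i, 0] * c1 ** 2 + B[i, 1] * c2 ** 2 for i in range(2)])
    err = np.sum((pred - np.asarray(y).reshape(2, 1, 1)) ** 2, axis=0)
    i, j = np.unravel_index(np.argmin(err), err.shape)
    return float(grid[i]), float(grid[j])

true_c = (6.0, 3.0)                            # ppm, within a breath-like range
c_hat = invert(respond(true_c))
```

The two operating temperatures provide two equations for two unknowns; as long as the linear part is well conditioned, the brute-force inversion recovers the simulated pair on the grid.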


Subject(s)
Acetone/analysis , Bayes Theorem , Breath Tests , Gases/analysis , Exhalation , Humans , Oxides
5.
Sensors (Basel) ; 18(6)2018 Jun 01.
Article in English | MEDLINE | ID: mdl-29865202

ABSTRACT

The aim of our work is to quantify two gases (acetone and ethanol) diluted in an air buffer using only a single metal-oxide (MOX) sensor. We took advantage of the low selectivity of the MOX sensor by exploiting a dual-temperature mode: operating the MOX sensitive layer at two temperatures provided diversity in the measurements, creating two virtual sensors to characterize the gas mixture. We present a linear-quadratic mixture sensing model that fits the experimental data more closely. To validate this model and the experimental protocol, we inverted the system of quadratic equations to quantify a mixture of the two gases. The linear-quadratic model was compared with the bilinear model proposed in the literature. We present an experimental evaluation on mixtures made of a few ppm of acetone and ethanol, obtaining a precision close to 1 ppm. This is an important step towards medical applications, particularly for diabetes, to deliver a non-invasive measurement with a low-cost device.

6.
BMC Bioinformatics ; 19(1): 123, 2018 04 05.
Article in English | MEDLINE | ID: mdl-29621971

ABSTRACT

BACKGROUND: Thanks to its reasonable cost and simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new algorithm for extracting peaks from raw spectra, in which the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline part plus a sparse peak list convolved with a known peak shape, fitted under a Gaussian noise model. The proposed method is well suited to processing low-resolution spectra with a strong baseline and unresolved peaks. RESULTS: We developed a new peak deconvolution procedure. The paper describes the derivation of the method and discusses some of its interpretations. The algorithm is then described in pseudo-code form, and the required optimization procedure is detailed. On synthetic data the method is compared to a more conventional approach: the new method reduces artifacts caused by the usual two-step procedure of baseline removal followed by peak extraction. Finally, results on real linear MALDI-ToF spectra are provided. CONCLUSIONS: We introduced a new method for peak picking in which peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one in which baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra: a collection of spectra of spiked proteins was acquired and analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.


Subject(s)
Algorithms , Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization/methods , Artifacts
7.
BMC Bioinformatics ; 19(1): 73, 2018 03 01.
Article in English | MEDLINE | ID: mdl-29490628

ABSTRACT

BACKGROUND: In the field of biomarker validation with mass spectrometry, controlling the technical variability is a critical issue. In selected reaction monitoring (SRM) measurements, this issue creates an opportunity to use variance component analysis to distinguish the various sources of variability. However, with unbalanced data (an unequal number of observations across factor combinations), the classical methods cannot correctly estimate the various sources of variability, particularly in the presence of interaction. The present paper proposes an extension of variance component analysis that estimates the various components of the variance, including an interaction component, in the case of unbalanced data. RESULTS: We applied an experimental design that uses a serial dilution to generate known relative protein concentrations and estimated these concentrations with two processing algorithms, a classical one and a more recent one. The extended method allowed us to estimate the variances explained by the dilution and by the technical process for each algorithm in an experiment with 9 proteins: L-FABP, 14.3.3 sigma, Calgi, Def.A6, Villin, Calmo, I-FABP, Peroxi-5, and S100A14. Whereas the recent algorithm gave a higher dilution variance and a lower technical variance than the classical one for two proteins with three peptides (L-FABP and Villin), there was no significant difference between the two algorithms across all proteins. CONCLUSIONS: The extension of variance component analysis correctly estimated the variance components of protein concentration measurements in the case of an unbalanced design.
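The underlying idea of variance component analysis can be sketched on simulated unbalanced one-way data with ANOVA method-of-moments estimators; the paper's extension additionally handles an interaction term, which is omitted here, and all numbers below are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_b2, sigma_w2 = 4.0, 1.0            # true between/within variance components
counts = [8, 12, 5, 15, 10, 7, 20, 9]    # unbalanced: unequal replicates per level
groups = [rng.normal(rng.normal(0, np.sqrt(sigma_b2)), np.sqrt(sigma_w2), m)
          for m in counts]

N, k = sum(counts), len(counts)
grand = np.concatenate(groups).mean()
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
ss_between = sum(m * (g.mean() - grand) ** 2 for m, g in zip(counts, groups))
ms_within = ss_within / (N - k)
ms_between = ss_between / (k - 1)

# Effective group size for unbalanced data (Searle's n0)
n0 = (N - sum(m * m for m in counts) / N) / (k - 1)
var_within_hat = ms_within
var_between_hat = max(0.0, (ms_between - ms_within) / n0)
```

With balanced data n0 reduces to the common group size; the n0 correction is what lets the method-of-moments estimator stay unbiased when the design is unbalanced.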


Subject(s)
Algorithms , Biomarkers/analysis , Mass Spectrometry , Proteins/analysis , Analysis of Variance , Enzyme-Linked Immunosorbent Assay , Humans , Reproducibility of Results
8.
Biom J ; 60(2): 262-274, 2018 03.
Article in English | MEDLINE | ID: mdl-29230881

ABSTRACT

Controlling the technological variability along an analytical chain is critical for biomarker discovery. The sources of technological variability should be modeled, which calls for specific experimental design, signal processing, and statistical analysis. Furthermore, with unbalanced data, the various components of variability cannot be estimated with the sequential or adjusted sums of squares of usual software programs. We propose a novel approach to variance component analysis, applied to the matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) technology, and use it to assess protein quantification by a classical signal processing algorithm and two more recent ones (BHI-PRO 1 and 2). Given the high technological variability, the quantification failed to recover the known quantities of five out of nine proteins present in a controlled solution. There was a linear relationship between protein quantities and peak intensities for four out of nine peaks with all algorithms. The biological component of the variance was higher with BHI-PRO than with the classical algorithm (80-95% with BHI-PRO 1 and 79-95% with BHI-PRO 2 vs. 56-90%); thus, the BHI-PRO algorithms were more efficient for protein quantification. The technological component of the variance was higher with the classical algorithm than with BHI-PRO (6-25% vs. 2.5-9.6% with BHI-PRO 1 and 3.5-11.9% with BHI-PRO 2). The chemical component was also higher with the classical algorithm (3.6-18.7% vs. < 3.5%). Thus, the BHI-PRO algorithms were better at removing noise from the signal when the expected peaks were detected. Overall, either BHI-PRO algorithm may reduce the technological variance from 25% to 10% and thus improve protein quantification and biomarker validation.


Subject(s)
Biometry/methods , Proteins/analysis , Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization , Algorithms , Analysis of Variance , Biomarkers/analysis , Biomarkers/chemistry , Linear Models , Proteins/chemistry
9.
EURASIP J Bioinform Syst Biol ; 2017(1): 9, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28710702

ABSTRACT

This paper addresses the question of biomarker discovery in proteomics. Given clinical data on a list of proteins for a set of individuals, the problem tackled is to extract a short subset of proteins whose concentrations indicate the biological status (healthy or pathological). In this paper, it is formulated as a specific instance of variable selection. The originality is that the proteins are not investigated one after the other; instead, the best partition between discriminant and non-discriminant proteins is sought directly, so that correlations between the proteins are intrinsically taken into account in the decision. The strategy is derived in a Bayesian setting, and the decision is optimal in the sense that it minimizes a global mean error; it is ultimately based on the posterior probabilities of the partitions. The main difficulty is calculating these probabilities, since they rely on the so-called evidence, which requires marginalization over all the unknown model parameters. Two models relating the status to the protein concentrations are presented, depending on whether the latter are biomarkers or not. The first model accounts for biological variability by assuming that the concentrations are Gaussian distributed with a mean and a covariance matrix that depend on the status only for the biomarkers. The second is an extension that also takes into account the technical variability that may significantly impact the observed concentrations. The main contributions of the paper are: (1) a new Bayesian formulation of the biomarker selection problem, (2) the closed-form expression of the posterior probabilities in the noiseless case, and (3) a suitable approximate solution in the noisy case. The methods are numerically assessed and compared to state-of-the-art methods (t-test, LASSO, Bhattacharyya distance, FOHSIC) on synthetic and real data from proteins quantified in human serum by mass spectrometry in selected reaction monitoring mode.
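The partition-based selection idea can be caricatured in a few lines: enumerate the partitions of a small protein set into discriminant / non-discriminant, score each with a BIC approximation of the evidence (standing in for the paper's closed-form expression, and treating proteins independently rather than with a full covariance model), and keep the maximum a posteriori partition. The data and all model choices below are invented for illustration:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n = 60
status = np.repeat([0, 1], n // 2)        # healthy / pathological labels
# Protein 0 is discriminant (class-dependent mean), proteins 1-2 are not
X = rng.standard_normal((n, 3))
X[status == 1, 0] += 2.0

def gauss_ll(v):
    """Maximized Gaussian log-likelihood of a sample (MLE mean and variance)."""
    s2 = v.var() + 1e-12
    return -0.5 * len(v) * (np.log(2 * np.pi * s2) + 1.0)

def bic_score(partition):
    """Higher = better; partition[j] True means protein j is discriminant."""
    score = 0.0
    for j, disc in enumerate(partition):
        if disc:   # separate mean/variance per class: 4 parameters
            ll = gauss_ll(X[status == 0, j]) + gauss_ll(X[status == 1, j])
            score += ll - 0.5 * 4 * np.log(n)
        else:      # one shared Gaussian: 2 parameters
            score += gauss_ll(X[:, j]) - 0.5 * 2 * np.log(n)
    return score

best = max(product([False, True], repeat=3), key=bic_score)
```

Scoring whole partitions rather than individual proteins is what allows the decision to account for how proteins behave jointly, which is the paper's central point; the exhaustive enumeration used here only scales to a handful of proteins.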

10.
IEEE Trans Med Imaging ; 26(2): 261-9, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17304739

ABSTRACT

This work concerns 2D + t dynamic tomography. We show that a much larger class of deformations than affine transforms can be compensated analytically within filtered back-projection algorithms in 2D parallel-beam and fan-beam dynamic tomography. We present numerical experiments on the Shepp-Logan phantom showing that non-affine deformations can be compensated. A generalization to 3D cone-beam tomography is proposed.


Subject(s)
Algorithms , Artifacts , Imaging, Three-Dimensional/methods , Radiographic Image Enhancement/methods , Radiographic Image Interpretation, Computer-Assisted/methods , Tomography, X-Ray Computed/methods , Motion , Reproducibility of Results , Sensitivity and Specificity , Time Factors
11.
Phys Med Biol ; 49(11): 2377-90, 2004 Jun 07.
Article in English | MEDLINE | ID: mdl-15248584

ABSTRACT

We give the sampling conditions of the 3D fan-beam x-ray transform (3DFBXRT). This work is motivated by the fact that helical tomography with a single detector line is simply a sampling of this transform under the helical constraint. We give a precise description of the geometry of the essential support of the Fourier transform of the 3DFBXRT and show how to derive efficient sampling schemes, which we then provide for parallel helical tomography. We present numerical experiments showing that efficient sampling on hexagonal interlaced schemes yields better reconstructions than the standard schemes in both parallel helical tomography (using QDO) and the 3DFBXRT. We discuss the practical drawbacks and advantages of these efficient schemes and the possible extension to fan-beam helical CT.


Subject(s)
Algorithms , Imaging, Three-Dimensional/methods , Information Storage and Retrieval/methods , Radiographic Image Enhancement/methods , Radiographic Image Interpretation, Computer-Assisted/methods , Sample Size , Tomography, Spiral Computed/methods , Numerical Analysis, Computer-Assisted , Phantoms, Imaging , Reproducibility of Results , Sensitivity and Specificity , Signal Processing, Computer-Assisted , Subtraction Technique , Tomography, Spiral Computed/instrumentation
12.
Phys Med Biol ; 49(11): 2169-82, 2004 Jun 07.
Article in English | MEDLINE | ID: mdl-15250070

ABSTRACT

This work is dedicated to reducing reconstruction artefacts due to motion occurring during the acquisition of computerized tomographic projections, a problem that must be solved when imaging moving organs such as the lungs or the heart. The proposed method belongs to the class of motion compensation algorithms, in which a model of the motion is included in the reconstruction formula. We address two fundamental questions: first, what conditions on the deformation are required to reconstruct the object from projections acquired sequentially during the deformation, and second, how to reconstruct the object from those projections. We answer these questions in the particular case of 2D general time-dependent affine deformations, assuming the motion parameters are known. We treat the problem of admissibility conditions on the deformation in the parallel-beam and fan-beam cases. We then propose exact reconstruction methods based on rebinning or sequential FBP formulae for each of these geometries and present reconstructed images obtained with the fan-beam algorithm on simulated data.
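The property that makes affine deformations tractable can be checked numerically: an affine map sends integration lines to lines, so each projection of the deformed object equals a rescaled projection of the reference object along a transformed line, which is what permits rebinning dynamic data into static projections. The phantom and the deformation parameters below are arbitrary illustrative choices:

```python
import numpy as np

A = np.array([[1.2, 0.3], [-0.1, 0.9]])      # linear part of the deformation
b = np.array([0.4, -0.2])                    # translation part
Ainv = np.linalg.inv(A)

def f(p):
    """Reference object: an anisotropic Gaussian blob (p has shape (2, n))."""
    return np.exp(-(p[0] ** 2 + 2.0 * p[1] ** 2) / 2.0)

def g(p):
    """Deformed object g(x) = f(A^{-1}(x - b))."""
    return f(Ainv @ (p - b[:, None]))

def line_integral(func, origin, direction, half_len=30.0, n=60001):
    """Line integral of func along origin + u * direction (unit direction)."""
    u = np.linspace(-half_len, half_len, n)
    pts = origin[:, None] + np.asarray(direction)[:, None] * u
    return func(pts).sum() * (u[1] - u[0])

theta, s = 0.7, 0.5                          # projection angle and line offset
d = np.array([np.cos(theta), np.sin(theta)]) # line direction
origin = s * np.array([-d[1], d[0]])         # a point on the line

# Projection of the deformed object along the original line ...
lhs = line_integral(g, origin, d)
# ... equals 1/|A^{-1} d| times the projection of the reference object
# along the mapped line (origin and direction pushed through A^{-1}).
v = Ainv @ d
rhs = line_integral(f, Ainv @ (origin - b), v / np.linalg.norm(v)) / np.linalg.norm(v)
```

The two quantities agree to numerical precision, confirming the change-of-variables identity behind the rebinning formulae.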


Subject(s)
Algorithms , Artifacts , Movement , Radiographic Image Enhancement/methods , Radiographic Image Interpretation, Computer-Assisted/methods , Subtraction Technique , Tomography, X-Ray Computed/methods , Humans , Information Storage and Retrieval/methods , Numerical Analysis, Computer-Assisted , Phantoms, Imaging , Radiography, Thoracic/methods , Reproducibility of Results , Respiration , Sensitivity and Specificity , Signal Processing, Computer-Assisted , Time Factors
13.
Phys Med Biol ; 47(15): 2611-25, 2002 Aug 07.
Article in English | MEDLINE | ID: mdl-12211208

ABSTRACT

Dynamic cone-beam reconstruction algorithms are required to reconstruct three-dimensional (3D) image sequences on dynamic 3D CT scanners combining multi-row two-dimensional (2D) detectors and sub-second rotation. Speeding up the rotating gantry improves the temporal resolution of the image sequence, but it also implies an increase in the dose delivered during a given time period in order to keep the signal-to-noise ratio of each frame constant. The alternative solution proposed in this paper is to acquire data over several half-turns in order to reduce the dose delivered per rotation at the same signal-to-noise ratio. To compensate for temporal evolution and motion artefacts, we use a dynamic particle model to describe the evolution of the object during the scan. In this article, we first introduce the dynamic particle model and the dynamic CT acquisition model. We then explain the principle of the proposed dynamic cone-beam reconstruction algorithm. Finally, we present preliminary results on simulated data.


Subject(s)
Algorithms , Computer Simulation , Imaging, Three-Dimensional/methods , Radiographic Image Enhancement/methods , Tomography, X-Ray Computed/methods , Humans , Models, Theoretical , Motion , Phantoms, Imaging , Radiography, Thoracic/methods , Sensitivity and Specificity , Tomography, X-Ray Computed/instrumentation
14.
Phys Med Biol ; 47(15): 2659-71, 2002 Aug 07.
Article in English | MEDLINE | ID: mdl-12211209

ABSTRACT

Some recent medical imaging applications, such as functional imaging (PET and SPECT) or interventional imaging (CT fluoroscopy), involve increasing amounts of data. In order to reduce the image reconstruction time, we develop a new fast 3D reconstruction algorithm based on a divide-and-conquer approach. The proposed multichannel algorithm performs an indirect frequency subband decomposition of the image f to be reconstructed (f = Σj fj) through filtering of the projections Rf. The subband images fj are reconstructed on a downsampled grid without loss of information. To reduce the computation time, we do not backproject the null-filtered projections, and we downsample the number of projections according to the Shannon conditions associated with each subband image. Our algorithm relies on filtering and backprojection operators. Using the same implementations of these basic operators, our approach is three and a half times faster than a classical FBP algorithm for a 2D 512 x 512 image and six times faster for a 3D 32 x 512 x 512 image.
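A 1D analogue of the subband divide-and-conquer principle, with ideal FFT-mask filters standing in for the paper's filters: split a signal into two half-band components, downsample the low band by 2 without aliasing, recover it exactly by zero-insertion plus ideal low-pass filtering (Shannon), and recombine the subbands without loss. The signal is random test data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 256
sig = rng.standard_normal(n)

# Ideal subband split: low band |f| < 0.25 (half of Nyquist), high band the rest
F = np.fft.fft(sig)
freqs = np.fft.fftfreq(n)
low_mask = np.abs(freqs) < 0.25
low = np.fft.ifft(F * low_mask).real
high = np.fft.ifft(F * (~low_mask)).real

# The low band occupies half the spectrum, so it survives downsampling by 2:
low_ds = low[::2]                        # decimation, alias-free by construction
z = np.zeros(n)
z[::2] = low_ds                          # zero-insertion upsampling
low_up = np.fft.ifft(2.0 * low_mask * np.fft.fft(z)).real
```

Here `low_up` matches `low` exactly and `low + high` restores the original signal, which is the sense in which the subband images fj can live on coarser grids "without loss of information".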


Subject(s)
Algorithms , Brain/diagnostic imaging , Radiographic Image Enhancement/methods , Signal Processing, Computer-Assisted , Fluoroscopy/methods , Fourier Analysis , Image Enhancement/methods , Phantoms, Imaging , Quality Control , Radiographic Image Enhancement/instrumentation , Sensitivity and Specificity , Tomography, Emission-Computed/methods