Results 1 - 14 of 14
1.
Comput Biol Med ; 167: 107698, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37956624

ABSTRACT

The resolution of the inverse problem of electrocardiography is of major interest for the diagnosis and catheter-based therapy of cardiac arrhythmia. In this context, the ability to simulate several cardiac electrical behaviors is crucial for evaluating and comparing the performance of inversion methods. For this application, existing models are either too complex or do not produce realistic cardiac patterns. In this work, a low-resolution heart-torso model generating realistic whole-heart cardiac mappings and electrocardiograms in healthy and pathological cases is designed. This model is built upon a simplified heart-torso geometry and implements the monodomain formalism using the finite element method. In addition, a model reduction step through a sensitivity analysis is proposed, where parameters are identified using an evolutionary optimization approach. Finally, the study illustrates the usefulness of the proposed model by comparing the performance of different variants of Tikhonov-based inversion methods for the determination of the regularization parameter in healthy, ischemic and ventricular tachycardia scenarios. First, the results of the sensitivity analysis show that, among 58 parameters, only 25 are influential; the level of influence of a parameter also depends on the heart region. Moreover, the synthesized electrocardiograms globally present the same characteristic shape as the reference ones, with a correlation value that reaches 88%. Regarding the inverse problem, the results highlight that only Robust Generalized Cross Validation and the Discrepancy Principle provide the best performance, with a quasi-perfect success rate for both and relative errors, between the generated electrocardiograms and the reference one, of 0.75 and 0.62, respectively.
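
As an illustration of the kind of regularization-parameter selection compared here, the following sketch solves a generic Tikhonov-regularized linear inverse problem and picks the regularization parameter by the Discrepancy Principle. It is a toy example under our own assumptions (the operator, noise level and signal are invented), not the paper's heart-torso model.

```python
import numpy as np

# Toy ill-posed problem y = A x + noise; A, x_true and sigma are assumptions
# for illustration, not the paper's torso-to-heart transfer matrix.
rng = np.random.default_rng(0)
n = 80
A = np.exp(-0.1 * (np.arange(n)[:, None] - np.arange(n)[None, :]) ** 2)
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
sigma = 0.05
y = A @ x_true + sigma * rng.standard_normal(n)

def tikhonov(A, y, lam):
    # Solve min ||A x - y||^2 + lam ||x||^2 via the normal equations.
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

# Discrepancy Principle: choose the largest lambda whose residual norm
# matches the expected noise level ||e|| ~ sigma * sqrt(n).
target = sigma * np.sqrt(n)
for lam in np.logspace(1, -8, 50):
    x_hat = tikhonov(A, y, lam)
    if np.linalg.norm(A @ x_hat - y) <= target:
        break
print(f"lambda = {lam:.2e}, relative error = "
      f"{np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.3f}")
```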


Subjects
Electrocardiography, Ventricular Tachycardia, Humans, Electrocardiography/methods, Pericardium, Mathematics, Diagnostic Imaging, Cardiovascular Models, Body Surface Potential Mapping/methods, Algorithms
2.
Front Psychol ; 14: 1122793, 2023.
Article in English | MEDLINE | ID: mdl-37251030

ABSTRACT

Mental workload (MWL) is a concept used as a reference for assessing the mental cost of activities. Current challenges in user experience research include determining the expected MWL value for a given activity and adapting task complexity in real time so as to achieve or maintain a desired MWL. It is therefore important to have at least one task that can reliably predict the MWL level associated with a given complexity level. In this study, we used several cognitive tasks to meet this need, including the N-back task, the commonly used reference test in the MWL literature, and the Corsi test. The tasks were adapted to generate different MWL classes, measured via the NASA-TLX and Workload Profile questionnaires. Our first objective was to identify which tasks yielded the most distinct MWL classes based on combined statistical methods. Our results indicated that the Corsi test satisfied this objective, producing three distinct MWL classes associated with three complexity levels and therefore offering a reliable model (about 80% accuracy) for predicting MWL classes. Our second objective was to achieve or maintain the desired MWL, which entailed the use of an algorithm to adapt the MWL class based on an accurate prediction model. This model needed to rely on an objective, real-time indicator of MWL. For this purpose, we identified different performance criteria for each task. The classification models obtained indicated that only the Corsi test would be a good candidate for this aim (more than 50% accuracy compared to a chance level of 33%), but performance was not sufficient to identify and adapt the MWL class online with sufficient accuracy during a task. Performance indicators therefore need to be complemented by other types of measures, such as physiological ones. Our study also highlights the limitations of the N-back task in favor of the Corsi test, which turned out to be the best candidate for modeling and predicting MWL among the cognitive tasks considered.
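
For readers who want to reproduce this kind of analysis, the sketch below trains a classifier to predict one of three MWL classes from task-performance indicators and reports cross-validated accuracy against the 33% chance level mentioned above. The features and synthetic data are placeholders of our own, not the study's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: per-trial performance indicators (accuracy, mean response
# time, memory span) for three complexity levels; the real study used
# Corsi/N-back measures with NASA-TLX / Workload Profile labels.
rng = np.random.default_rng(1)
n_per_class = 60
X, y = [], []
for mwl_class in range(3):
    feats = rng.normal(loc=[0.9 - 0.15 * mwl_class,   # task accuracy
                            0.8 + 0.3 * mwl_class,    # response time (s)
                            6.0 - 1.0 * mwl_class],   # memory span
                       scale=[0.05, 0.15, 0.8],
                       size=(n_per_class, 3))
    X.append(feats)
    y += [mwl_class] * n_per_class
X = np.vstack(X)

scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f} (chance level: 0.33)")
```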

3.
Biomolecules ; 13(2), 2023 Feb 02.
Article in English | MEDLINE | ID: mdl-36830655

ABSTRACT

Magnetic Resonance Imaging is a powerful non-destructive tool in the study of plant tissues. For potato tubers, it greatly assists the study of tissue defects and tissue evolution during storage. This paper describes the MRI analysis of potato tubers with internal defects in their flesh tissue at eight sampling dates from 14 to 33 weeks after harvest. Spatialized multi-exponential T2 relaxometry was used to generate bi-exponential T2 maps, coupled with a classification scheme to identify the different T2 homogeneous zones within the tubers. Six classes with statistically different relaxation parameters were identified at each sampling date, allowing the defects and the pith and cortex tissues to be detected. A further distinction could be made between three constitutive elements within the flesh, revealing the heterogeneity of this particular tissue. Relaxation parameters for each class and their evolution during storage were successfully analyzed. The work demonstrated the value of MRI for detailed non-invasive plant tissue characterization.
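
As a minimal illustration of the bi-exponential relaxometry underlying these maps, the sketch below fits a two-component T2 decay to a synthetic signal with scipy; in the paper one such fit is produced per voxel. The echo times, amplitudes and T2 values are invented for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(te, a1, t21, a2, t22):
    # Two-compartment transverse relaxation: S(TE) = A1 e^{-TE/T2_1} + A2 e^{-TE/T2_2}
    return a1 * np.exp(-te / t21) + a2 * np.exp(-te / t22)

# Synthetic decay curve; assumed echo times in ms and noise level.
te = np.arange(5, 500, 5.0)
rng = np.random.default_rng(2)
signal = biexp(te, 0.3, 45.0, 0.7, 350.0) + 0.005 * rng.standard_normal(te.size)

p0 = (0.5, 30.0, 0.5, 200.0)            # rough initial guess
params, _ = curve_fit(biexp, te, signal, p0=p0, bounds=(0, np.inf))
a1, t21, a2, t22 = params
print(f"T2 components: {t21:.0f} ms (w={a1:.2f}), {t22:.0f} ms (w={a2:.2f})")
```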


Subjects
Solanum tuberosum, Tubers, Magnetic Resonance Imaging/methods
4.
Magn Reson Imaging ; 87: 119-132, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34871716

ABSTRACT

The estimation of multi-exponential relaxation times T2 and their associated amplitudes A0 at the voxel level has been made possible by recent developments in the field of image processing. These data are of great interest for the characterization of biological tissues, such as fruit tissues. However, they represent a large amount of information that is not easily interpretable. Moreover, the non-uniformity of the MRI images, which directly impacts A0, can induce interpretation errors. In this paper, we propose a post-processing scheme that clusters voxels that are similar in terms of their multi-exponential relaxation parameters, in order to reduce the complexity of the information while avoiding the problems associated with intensity non-uniformity. We also suggest a data representation suitable for the visualization of the multi-T2 distribution within each tissue. We illustrate this work with results for different fruits, demonstrating the great potential of multi-T2 information to shed new light on fruit characterization.
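
One plausible form of such a post-processing step is sketched below, under assumptions of our own rather than the paper's exact scheme: amplitudes are first normalized to relative fractions, so that a smooth multiplicative intensity non-uniformity (which scales both amplitudes of a voxel together) cancels out, and the voxels are then clustered with k-means on the relaxation parameters.

```python
import numpy as np
from sklearn.cluster import KMeans

# Assumed voxel-wise parameters from a bi-exponential fit:
# columns = (A0_1, T2_1, A0_2, T2_2); random placeholder values here.
rng = np.random.default_rng(3)
params = np.abs(rng.normal(size=(5000, 4)) * [50, 40, 80, 300] + [100, 60, 150, 500])

# Normalise amplitudes to relative fractions so that a multiplicative
# intensity non-uniformity (scaling A0_1 and A0_2 together) cancels out.
a = params[:, [0, 2]]
fractions = a / a.sum(axis=1, keepdims=True)
features = np.column_stack([fractions[:, 0], params[:, 1], params[:, 3]])

# Standardise, then cluster voxels into T2-homogeneous classes.
features = (features - features.mean(0)) / features.std(0)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))  # voxels per cluster
```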


Subjects
Fruit, Magnetic Resonance Imaging, Computer-Assisted Image Processing/methods, Magnetic Resonance Imaging/methods
6.
Sci Rep ; 11(1): 4112, 2021 Feb 18.
Article in English | MEDLINE | ID: mdl-33603139

ABSTRACT

Wall shear stress (WSS) has been demonstrated to be a biomarker of the development of atherosclerosis. In vivo assessment of WSS is still challenging, but 4D Flow MRI represents a promising tool for providing 3D velocity data from which WSS can be calculated. In this study, a system based on Laser Doppler Velocimetry (LDV) was developed to validate new improvements in 4D Flow MRI acquisitions and the derived WSS computation. A hydraulic circuit was manufactured to allow both 4D Flow MRI and LDV velocity measurements. WSS profiles were calculated with one 2D and one 3D method. Results indicated an excellent agreement between MRI and LDV velocity data, and the set-up thus enabled the evaluation of the improved performance of the 3D WSS computation method with respect to the 2D one. To provide a concrete example of the efficacy of this method, the influence of the spatial resolution of MRI data on the derived 3D WSS profiles was investigated. This investigation showed that, with acquisition times compatible with standard clinical conditions, a refined MRI resolution does not improve WSS assessment unless the impact of noise is reduced. This study represents a reliable basis for validating 4D Flow MRI-based WSS calculation methods against LDV.
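
As an illustration of how such a cross-modality validation is typically quantified, the sketch below compares paired velocity profiles from two modalities using a correlation coefficient and Bland-Altman-style bias and limits of agreement. The profiles are synthetic placeholders, not data from this study.

```python
import numpy as np

# Paired velocity samples along a diameter, as the two modalities would
# measure them (synthetic placeholders for LDV and 4D Flow MRI profiles).
rng = np.random.default_rng(4)
r = np.linspace(-1, 1, 41)
v_ldv = 0.5 * (1 - r**2)                      # reference Poiseuille profile, m/s
v_mri = v_ldv + rng.normal(0, 0.01, r.size)   # MRI with measurement noise

# Agreement metrics: Pearson correlation and Bland-Altman bias / limits.
corr = np.corrcoef(v_ldv, v_mri)[0, 1]
diff = v_mri - v_ldv
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"r = {corr:.3f}, bias = {bias*1e3:.2f} mm/s, "
      f"95% limits: [{(bias-1.96*sd)*1e3:.1f}, {(bias+1.96*sd)*1e3:.1f}] mm/s")
```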

7.
Magn Reson Imaging ; 74: 232-243, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32889090

ABSTRACT

Wall shear stress (WSS) is a relevant hemodynamic indicator of the local stress applied to the endothelium surface. More specifically, its spatiotemporal distribution proves crucial in the evolution of many pathologies such as aneurysm, stenosis, and atherosclerosis. This paper introduces a new solution, called PaLMA, to quantify the WSS from 4D Flow MRI data. It relies on a two-step local parametric model to accurately describe the vessel wall and the velocity-vector field in the neighborhood of a given point of interest. Extensive validations have been performed on synthetic 4D Flow MRI data, including four datasets generated from patient-specific computational fluid dynamics simulations of carotids. The validation tests focus on the impact of the noise component, the resolution level, and the segmentation accuracy concerning the vessel position in the context of complex flow patterns. In simulated cases designed to reproduce clinical acquisition conditions, the WSS quantification performance reached by PaLMA is significantly higher (with a gain in RMSE of 12 to 27%) than that obtained with the reference smoothing B-spline method of Potters et al. (2015), while the computation time is equivalent for both WSS quantification methods.
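
PaLMA's exact formulation is not given in this abstract, but the sketch below conveys the general idea of a local parametric approach: velocity samples in the neighborhood of a wall point are fitted with a low-order polynomial in the wall-normal coordinate under a no-slip constraint, and the wall-normal velocity gradient needed for the WSS is read off the fitted coefficients. All numbers are placeholders.

```python
import numpy as np

# Wall-tangential velocity samples at wall-normal offsets d around one point
# of interest (placeholders standing in for 4D Flow MRI voxels near the wall).
d = np.array([0.2, 0.5, 0.8, 1.1, 1.4]) * 1e-3          # m
v = np.array([0.05, 0.12, 0.18, 0.23, 0.27])            # m/s

# Local parametric model v(d) = c1 d + c2 d^2 + c3 d^3 with v(0) = 0 (no slip).
V = np.column_stack([d, d**2, d**3])
c = np.linalg.lstsq(V, v, rcond=None)[0]

mu = 3.5e-3                     # dynamic viscosity of blood, Pa.s (typical value)
wss = mu * c[0]                 # dv/dd at the wall (d = 0) is the coefficient c1
print(f"local WSS estimate: {wss:.4f} Pa")
```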


Subjects
Carotid Arteries/diagnostic imaging, Carotid Arteries/physiology, Hemodynamics, Magnetic Resonance Imaging, Shear Strength, Mechanical Stress, Blood Flow Velocity, Humans, Cardiovascular Models
8.
IEEE Trans Med Imaging ; 39(11): 3725-3736, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32746117

ABSTRACT

In a low-statistics PET imaging context, the positive bias in regions of low activity is a burning issue. To overcome this problem, algorithms without the built-in non-negativity constraint may be used. They allow negative voxels in the image, which reduce or even cancel the bias. However, such algorithms increase the variance and are difficult to interpret, since the resulting images contain negative activities, which have no physical meaning when dealing with radioactive concentrations. In this paper, a post-processing approach is proposed to remove these negative values while preserving the local mean activities. Its original idea is to transfer the value of each voxel with negative activity to its direct neighbors, under the constraint of preserving the local means of the image. In that respect, the proposed approach is formalized as a linear programming problem with a specific symmetric structure, which makes it solvable in a very efficient way by a dual-simplex-like iterative algorithm. The relevance of the proposed approach is discussed on simulated and experimental data. Acquired data from a yttrium-90 phantom show that, on images produced by a non-constrained algorithm, a much lower variance in the cold area is obtained after the post-processing step, at the price of a slightly increased bias. More specifically, when compared with the classical OSEM algorithm, images are improved both in terms of bias and of variance.
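
The paper's linear program has a dedicated symmetric structure and solver, but the flavor of the formulation can be shown with a generic LP solver on a toy 1D signal: the total deviation from the input is minimized subject to non-negativity, with local sums preserved over non-overlapping 3-voxel blocks (a simplification of the paper's neighborhood constraint).

```python
import numpy as np
from scipy.optimize import linprog

# Toy 1D "image" with negative activities to be removed.
y = np.array([4.0, -1.0, 3.0, 2.0, -0.5, 5.0])
n = len(y)

# Variables [x, p, q] with x - y = p - q and p, q >= 0, so that the objective
# sum(p + q) equals the total absolute deviation |x - y|.
c = np.concatenate([np.zeros(n), np.ones(2 * n)])
A_eq, b_eq = [], []

# Preserve the sum (hence the mean) of each non-overlapping 3-voxel block.
for start in range(0, n, 3):
    row = np.zeros(3 * n)
    row[start:start + 3] = 1.0
    A_eq.append(row); b_eq.append(y[start:start + 3].sum())

# Linking constraints x - p + q = y.
for i in range(n):
    row = np.zeros(3 * n)
    row[i], row[n + i], row[2 * n + i] = 1.0, -1.0, 1.0
    A_eq.append(row); b_eq.append(y[i])

res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * (3 * n))
print(np.round(res.x[:n], 3))  # non-negative image, block means preserved
```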


Subjects
Computer-Assisted Image Processing, Positron-Emission Tomography, Algorithms, Imaging Phantoms
9.
Article in English | MEDLINE | ID: mdl-32406838

ABSTRACT

The relaxation signal inside each voxel of magnetic resonance images (MRI) is commonly fitted by a multi-exponential decay curve. The estimation of discrete multi-component relaxation model parameters from magnitude MRI data is a challenging nonlinear inverse problem, since it must be conducted over all image voxels under non-Gaussian noise statistics. This paper proposes an efficient algorithm for the joint estimation of relaxation time values and their amplitudes, using different criteria that take into account a Rician noise model, combined with a spatial regularization accounting for the low spatial variability of relaxation time constants and amplitudes between neighboring voxels. The Rician noise hypothesis is accounted for either by an adapted nonlinear least squares algorithm applied to a corrected least squares criterion, or by a majorization-minimization approach applied to the maximum likelihood criterion. In order to solve the resulting large-scale non-negativity-constrained optimization problem with reduced numerical complexity and computing time, an optimization algorithm based on a majorization approach ensuring separability of variables between voxels is proposed. The minimization is carried out iteratively using an adapted Levenberg-Marquardt algorithm that ensures convergence by imposing a sufficient decrease of the objective function and the non-negativity of the parameters. The importance of the regularization, alongside the Rician noise incorporation, is shown both visually and numerically on a simulated phantom and on magnitude MRI images acquired on fruit samples.
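
The first of the two criteria mentioned, a corrected least squares criterion under Rician noise, can be sketched generically using the exact second-moment identity E[M^2] = s^2 + 2*sigma^2 for Rician magnitudes: the squared magnitudes are fitted against the squared model plus the noise floor. In this sketch, scipy's bounded trust-region solver stands in for the paper's adapted Levenberg-Marquardt, the spatial regularization is omitted, and all numbers are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
te = np.arange(4, 400, 4.0)                   # echo times, ms (assumed)

def model(p, te):
    a1, t1, a2, t2 = p
    return a1 * np.exp(-te / t1) + a2 * np.exp(-te / t2)

# Simulate magnitude data with Rician noise: |s + n1 + i n2|.
sigma = 0.02
s = model([0.4, 30.0, 0.6, 200.0], te)
mag = np.abs(s + sigma * (rng.standard_normal(te.size)
                          + 1j * rng.standard_normal(te.size)))

# Corrected least squares: fit squared magnitudes using the exact identity
# E[M^2] = s^2 + 2 sigma^2 for Rician-distributed magnitudes.
def residuals(p):
    return (model(p, te) ** 2 + 2 * sigma ** 2) - mag ** 2

fit = least_squares(residuals, x0=[0.5, 20.0, 0.5, 150.0],
                    bounds=(0, np.inf))       # non-negativity of the parameters
print(np.round(fit.x, 1))
```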

10.
Appl Spectrosc ; 74(7): 780-790, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32452210

ABSTRACT

This work introduces hyper-resolution (HyRes), a numerical approach to spatial resolution enhancement that combines hyperspectral unmixing and super-resolution image restoration (SRIR). HyRes yields a substantial increase in the spatial resolution of Raman spectroscopy while preserving the undistorted spectral information. The resolving power of the technique is demonstrated on Raman spectroscopic data from a polymer nanowire sample. We demonstrate an achieved resolution of better than 14 nm, a more than eightfold improvement over single-channel image-based SRIR, 25x better than regular far-field Raman spectroscopy, and comparable to near-field probing techniques.
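
The abstract does not spell out the HyRes pipeline, so the sketch below only illustrates its two named ingredients in generic form: NMF-based hyperspectral unmixing into abundance maps, followed by Richardson-Lucy deconvolution of each map with a known PSF. The data cube, Gaussian PSF and component count are all invented.

```python
import numpy as np
from numpy.fft import fft2, ifft2
from sklearn.decomposition import NMF

rng = np.random.default_rng(6)
H, W, B = 32, 32, 100                 # image size and number of bands (assumed)
cube = np.abs(rng.normal(1.0, 0.2, (H, W, B)))   # placeholder hyperspectral data

# Step 1: unmixing -- factor the (pixels x bands) matrix into abundances x spectra.
k = 3                                 # assumed number of pure components
nmf = NMF(n_components=k, init='nndsvda', max_iter=500, random_state=0)
abund = nmf.fit_transform(cube.reshape(-1, B)).reshape(H, W, k)

# Step 2: restoration -- Richardson-Lucy deconvolution of each abundance map
# with a known (here: assumed Gaussian) optical PSF.
yy, xx = np.mgrid[:H, :W]
psf = np.exp(-((yy - H // 2) ** 2 + (xx - W // 2) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()

def conv(a, b):
    # Circular convolution via FFT; the centred PSF is shifted to the origin.
    return np.real(ifft2(fft2(a) * fft2(np.fft.ifftshift(b))))

def richardson_lucy(img, psf, n_iter=30):
    # Multiplicative RL updates; the Gaussian PSF is symmetric, hence self-adjoint.
    est = np.full_like(img, img.mean())
    for _ in range(n_iter):
        est = est * conv(img / np.maximum(conv(est, psf), 1e-12), psf)
    return est

sharp = np.stack([richardson_lucy(abund[..., i], psf) for i in range(k)], axis=-1)
print(sharp.shape)                    # (32, 32, 3) deconvolved abundance maps
```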

11.
Appl Spectrosc ; 73(8): 902-909, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30916988

ABSTRACT

Raman microscopy is a valuable tool for probing the physical and chemical properties of a sample material. When probing nanomaterials or nanocomposites, the spatial resolution of Raman microscopy is not always adequate, as it is limited by the optical diffraction limit. Numerical post-processing with super-resolution algorithms provides a means to enhance resolution and can be applied straightforwardly. The aim of this work is to present interior point least squares (IPLS) as a powerful tool for super-resolution in Raman imaging through constrained optimization. IPLS's potential for super-resolution is illustrated on numerically generated test images. Its resolving power is demonstrated on Raman spectroscopic data from a polymer nanowire sample. Comparison with atomic force microscopy data of the same sample substantiates that the presented method is a promising technique for analyzing nanomaterial samples.
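
IPLS itself is not detailed in this abstract, so the sketch below shows the generic problem it addresses, a non-negativity-constrained least-squares deconvolution min ||Hx - y||^2 subject to x >= 0, with scipy's bounded least-squares solver standing in for a dedicated interior-point method. Blur width, grid and peak positions are arbitrary.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import lsq_linear

# 1D toy deconvolution y = Hx + noise, with a Gaussian blur H standing in for
# the diffraction-limited optics.
rng = np.random.default_rng(7)
n = 120
H = toeplitz(np.exp(-0.5 * (np.arange(n) / 3.0) ** 2))
H /= H.sum(axis=1, keepdims=True)

x_true = np.zeros(n)
x_true[[40, 48, 80]] = [1.0, 0.8, 1.2]        # two close peaks and one isolated peak
y = H @ x_true + 0.002 * rng.standard_normal(n)

# Constrained least squares min ||Hx - y||^2 subject to x >= 0; scipy's bounded
# solver replaces the paper's interior-point algorithm in this sketch.
sol = lsq_linear(H, y, bounds=(0, np.inf))
print("recovered peak locations:", np.flatnonzero(sol.x > 0.3))
```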

12.
Magn Reson Imaging ; 49: 39-46, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29326046

ABSTRACT

Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The contribution of this paper is twofold. It first proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which, in the context of multi-contrast MRI data acquisition, allows the imaging sequence parameters to be set appropriately. Second, an efficient proportion quantification algorithm is proposed, based on the minimisation of a penalised least-squares criterion incorporating a regularity constraint on the spatial distribution of the proportions. The resulting developments are discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis, on the proving of a Danish pastry.
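
A minimal sketch of the spectral-unmixing view of partial volumes, under assumptions of our own: each voxel's multi-contrast signal is a proportion-weighted mixture of known tissue signatures, and the proportions are recovered by non-negative least squares with the sum-to-one constraint enforced via the standard row-augmentation trick. The paper's spatial regularity penalty is omitted here.

```python
import numpy as np
from scipy.optimize import nnls

# Assumed tissue signatures across 4 MRI contrasts (columns: 3 tissues).
M = np.array([[1.0, 0.2, 0.6],
              [0.8, 0.9, 0.3],
              [0.1, 0.7, 0.9],
              [0.5, 0.4, 0.2]])

def unmix(signal, M, delta=1e3):
    # Sum-to-one via augmentation: append a heavily weighted row of ones.
    M_aug = np.vstack([M, delta * np.ones(M.shape[1])])
    s_aug = np.append(signal, delta)
    p, _ = nnls(M_aug, s_aug)
    return p

# A voxel that is 50% tissue 1, 30% tissue 2, 20% tissue 3, plus noise.
p_true = np.array([0.5, 0.3, 0.2])
signal = M @ p_true + 0.01 * np.random.default_rng(8).standard_normal(4)
print(np.round(unmix(signal, M), 3))   # close to [0.5, 0.3, 0.2]
```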


Subjects
Food Analysis/methods, Computer-Assisted Image Processing/methods, Magnetic Resonance Imaging/methods, Algorithms, Least-Squares Analysis
13.
IEEE Trans Image Process ; 20(6): 1517-28, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21193375

ABSTRACT

This paper proposes accelerated subspace optimization methods in the context of image restoration. Subspace optimization methods belong to the class of iterative descent algorithms for unconstrained optimization. At each iteration of such methods, a stepsize vector allowing the best combination of several search directions is computed through a multidimensional search. It is usually obtained by an inner iterative second-order method ruled by a stopping criterion that guarantees the convergence of the outer algorithm. As an alternative, we propose an original multidimensional search strategy based on the majorize-minimize principle. It leads to a closed-form stepsize formula that ensures the convergence of the subspace algorithm regardless of the number of inner iterations. The practical efficiency of the proposed scheme is illustrated in the context of edge-preserving image restoration.
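
For a quadratic objective, the closed-form multidimensional stepsize is easy to exhibit: with the search directions stacked in a matrix D, minimizing f(x + Ds) over the stepsize vector s gives s = -(D^T A D)^{-1} D^T grad f(x). The sketch below applies this to a toy least-squares restoration with a two-direction memory-gradient subspace; for the paper's non-quadratic edge-preserving penalties, A would be replaced by a majorant matrix obtained from the majorize-minimize principle.

```python
import numpy as np

rng = np.random.default_rng(9)
m, n = 200, 100
H = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = rng.normal(size=n)
y = H @ x_true + 0.01 * rng.normal(size=m)

A = H.T @ H                       # Hessian of f(x) = 0.5 ||Hx - y||^2
b = H.T @ y

x = np.zeros(n)
d_prev = np.zeros(n)
for it in range(50):
    g = A @ x - b                 # gradient of f at x
    # Subspace of directions: steepest descent + memory of the previous step.
    D = np.column_stack([-g, d_prev]) if it > 0 else (-g)[:, None]
    # Closed-form multidimensional stepsize: s = argmin_s f(x + D s).
    s = np.linalg.solve(D.T @ A @ D + 1e-12 * np.eye(D.shape[1]), -D.T @ g)
    step = D @ s
    x, d_prev = x + step, step
print(f"relative error: {np.linalg.norm(x - x_true)/np.linalg.norm(x_true):.3f}")
```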


Subjects
Algorithms, Image Enhancement/methods, Computer-Assisted Image Interpretation/methods, Reproducibility of Results, Sensitivity and Specificity
14.
IEEE Trans Image Process ; 17(9): 1574-86, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18701396

ABSTRACT

Entropy-coded lattice vector quantization (ECLVQ) with codebooks dedicated to independent and identically distributed (i.i.d.) generalized Gaussian sources has proven its high coding performance in the wavelet domain. It is well known that wavelet coefficients with high magnitude (corresponding to edges and textures) tend to be clustered in a small number of vectors. In this paper, we first show that this property has a major influence on the performance of ECLVQ schemes. Since this clustering property cannot be taken into account by the classical i.i.d. assumption, our first proposal is to model the joint distribution of vectors by a multidimensional mixture of generalized Gaussian (MMGG) densities. The main outcome of this MMGG model is to provide a theoretical framework for simply deriving the corresponding MMGG R-D models from i.i.d. R-D models. In a second part, a new codebook better suited to wavelet coding is proposed: the so-called dead-zone lattice vector quantizer (DZLVQ). It generalizes the scalar dead zone to vector quantization by thresholding vectors according to their energy. We show that DZLVQ improves the rate-distortion tradeoff. Experimental results are provided for the pyramidal LVQ scheme under the assumption of a multidimensional mixture of Laplacian (MML) densities. Results on a set of real-life images show the precision of the analytical R-D curves and the efficiency of the DZLVQ scheme.
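
A dead-zone lattice vector quantizer is easy to sketch: vectors whose energy falls below a threshold are mapped to zero, and the remaining ones are rounded to the nearest point of a lattice. The scaled Z^n lattice and all constants below are our simplification; the paper works with pyramidal LVQ codebooks.

```python
import numpy as np

def dzlvq(vectors, step=0.5, dead_zone=0.8):
    """Dead-zone lattice VQ on the scaled Z^n lattice (an illustrative
    stand-in for the paper's pyramidal lattice codebooks)."""
    energy = np.linalg.norm(vectors, axis=1)
    q = step * np.round(vectors / step)     # nearest point of step * Z^n
    q[energy < dead_zone] = 0.0             # vector dead zone: energy threshold
    return q

# Toy 4-dimensional wavelet-coefficient vectors: most have low energy, a few
# "edge/texture" vectors carry high magnitudes.
rng = np.random.default_rng(10)
v = rng.laplace(scale=0.2, size=(1000, 4))
v[:20] *= 10.0                              # clustered high-energy vectors

q = dzlvq(v)
zero_rate = np.mean(np.all(q == 0, axis=1))
print(f"{zero_rate:.0%} of vectors fall in the dead zone")
```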


Subjects
Algorithms, Data Compression/methods, Image Enhancement/methods, Computer-Assisted Image Interpretation/methods, Computer-Assisted Signal Processing, Computer Simulation, Statistical Models, Reproducibility of Results, Sensitivity and Specificity