Results 1 - 2 of 2

1.
Nat Methods; 20(12): 1949-1956, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37957430

ABSTRACT

Live-cell super-resolution microscopy enables the imaging of biological structure dynamics below the diffraction limit. Here we present enhanced super-resolution radial fluctuations (eSRRF), substantially improving image fidelity and resolution compared to the original SRRF method. eSRRF incorporates automated parameter optimization based on the data itself, giving insight into the trade-off between resolution and fidelity. We demonstrate eSRRF across a range of imaging modalities and biological systems. Notably, we extend eSRRF to three dimensions by combining it with multifocus microscopy. This realizes live-cell volumetric super-resolution imaging with an acquisition speed of ~1 volume per second. eSRRF provides an accessible super-resolution approach, maximizing information extraction across varied experimental conditions while minimizing artifacts. Its optimal parameter prediction strategy is generalizable, moving toward unbiased and optimized analyses in super-resolution microscopy.
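The core idea behind radial fluctuation methods can be illustrated with a toy sketch: at each pixel, measure how strongly intensity gradients sampled on a small surrounding ring converge toward that pixel, then average this "radiality" over a fluctuating image stack. This is a minimal illustration only, not the authors' eSRRF implementation; the function names, the single-pixel ring radius, and the 8-point ring sampling are all simplifying assumptions.

```python
import numpy as np

def radiality_map(frame, ring_radius=1.0, n_ring=8):
    """Toy radiality: how strongly intensity gradients on a ring around each
    pixel point inward toward it (large where a fluorophore is centered)."""
    gy, gx = np.gradient(frame.astype(float))
    h, w = frame.shape
    rad = np.zeros((h, w), dtype=float)
    for th in np.linspace(0.0, 2.0 * np.pi, n_ring, endpoint=False):
        dy, dx = ring_radius * np.sin(th), ring_radius * np.cos(th)
        # sample the gradient field on the ring (nearest pixel, clipped edges)
        iy = np.clip(np.round(np.arange(h)[:, None] + dy).astype(int), 0, h - 1)
        ix = np.clip(np.round(np.arange(w)[None, :] + dx).astype(int), 0, w - 1)
        # project the sampled gradient onto the inward (toward-center) direction
        rad += -(gx[iy, ix] * dx + gy[iy, ix] * dy) / ring_radius
    return rad / n_ring

def srrf_stack(stack):
    """Average radiality over a stack of fluctuating frames."""
    return np.mean([radiality_map(f) for f in stack], axis=0)
```

On a stack of frames containing a fluctuating diffraction-limited spot, the averaged radiality map peaks sharply at the emitter position, which is the intuition the fluctuation analysis builds on; the real method adds temporal correlation analysis and the data-driven parameter optimization described above.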


Subjects
Artifacts, Fluorescence Microscopy/methods
2.
Front Bioinform; 1: 775379, 2021.
Article in English | MEDLINE | ID: mdl-36303735

ABSTRACT

Multiple fields in biological and medical research produce large amounts of point cloud data with high dimensionality and complexity. In addition, a large set of experiments generates point clouds, including segmented medical data and single-molecule localization microscopy, in which individual molecules are observed within their natural cellular environment. Analyzing this type of experimental data is a complex task that presents unique challenges, where providing extra physical dimensions for visualization and analysis can be beneficial. Furthermore, whether highly noisy data come from single-molecule recordings or segmented medical data, the need to guide analysis with user intervention creates both an ergonomic challenge, to facilitate this interaction, and a computational challenge, to provide fluid interaction while information is being processed. Several applications, including our software DIVA for image stacks and our platform Genuage for point clouds, have leveraged virtual reality (VR) to visualize and interact with data in 3D. While the visualization aspects can be made compatible with different types of data, quantifications, on the other hand, are far from standard. In addition, complex analyses can require significant computational resources, making the real-time VR experience uncomfortable. Moreover, visualization software is mainly designed to represent a set of data points but lacks flexibility in manipulating and analyzing the data. This paper introduces new libraries that enhance the interaction and human-in-the-loop analysis of point cloud data in virtual reality, and integrates them into the open-source platform Genuage. We first detail a new toolbox of communication tools that enhance the user experience and improve flexibility. Then, we introduce a mapping toolbox that represents physical properties in space overlaid on a 3D mesh while maintaining a point-cloud-dedicated shader. We then introduce a new, programmable video capture tool for VR and desktop modes for intuitive data dissemination. Finally, we highlight protocols that allow simultaneous analysis and fluid manipulation of data at a high refresh rate. We illustrate this principle by performing real-time inference of random walk properties of recorded trajectories with a pre-trained Graph Neural Network running in Python.
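The closing example infers random walk properties with a pre-trained Graph Neural Network; the quantity typically sought there, a trajectory's anomalous diffusion exponent, can be sketched with a classical mean squared displacement (MSD) fit instead. This is a stand-in for illustration, not the paper's GNN pipeline; the function names and the short-lag cutoff are illustrative choices.

```python
import numpy as np

def msd(traj, max_lag=None):
    """Time-averaged mean squared displacement of a 2D trajectory (N, 2).
    Restricting to short lags keeps the per-lag averages well sampled."""
    n = len(traj)
    if max_lag is None:
        max_lag = max(10, n // 10)
    lags = np.arange(1, max_lag + 1)
    m = np.array([np.mean(np.sum((traj[l:] - traj[:-l]) ** 2, axis=1))
                  for l in lags])
    return lags, m

def anomalous_exponent(traj):
    """Fit MSD ~ t**alpha on a log-log scale; alpha ~ 1 for Brownian motion,
    alpha < 1 for subdiffusion, alpha > 1 for superdiffusion."""
    lags, m = msd(traj)
    alpha, _ = np.polyfit(np.log(lags), np.log(m), 1)
    return alpha
```

For a simulated Brownian trajectory the fitted exponent comes out close to 1; a learned model such as the GNN mentioned above is used precisely because it can outperform this kind of MSD fit on short, noisy experimental trajectories.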
