Results 1 - 5 of 5
1.
Sci Rep ; 14(1): 5033, 2024 02 29.
Article in English | MEDLINE | ID: mdl-38424155

ABSTRACT

Quantifying healthy and degraded inner tissues in plants is of great interest in agronomy, for example to assess plant health and quality and to monitor physiological traits or diseases. However, detecting functional and degraded plant tissues in vivo without harming the plant is extremely challenging. New solutions are needed for ligneous and perennial species, for which the sustainability of plantations is crucial. To tackle this challenge, we developed a novel approach based on multimodal 3D imaging and artificial-intelligence-based image processing that allows a non-destructive diagnosis of inner tissues in living plants. The method was successfully applied to grapevine (Vitis vinifera L.). Vineyard sustainability is threatened by trunk diseases, yet the sanitary status of vines cannot be ascertained without injuring the plants. By combining MRI and X-ray CT 3D imaging with automatic voxel classification, we discriminated intact, degraded, and white-rot tissues with a mean global accuracy of over 91%. The contribution of each imaging modality to tissue detection was evaluated, and we identified quantitative structural and physiological markers characterizing the stages of wood degradation. A combined study of inner tissue distribution versus the history of external foliar symptoms demonstrated that white-rot and intact tissue contents are key measurements for evaluating the sanitary status of vines. Finally, we proposed a model for accurate trunk disease diagnosis in grapevine. This work opens new routes for precision agriculture and in situ monitoring of tissue quality and plant health across plant species.
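The tissue discrimination described above rests on co-registered MRI and X-ray CT volumes fed to an automatic voxel classifier. The snippet below is a minimal sketch of that idea, not the authors' implementation: it stacks the two modalities into per-voxel features and trains a random forest. The classifier choice, the label encoding, and all function names are assumptions for illustration only.

```python
# Minimal sketch of per-voxel tissue classification from two co-registered
# 3D modalities (MRI and X-ray CT). Classifier, features and labels are
# illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def voxel_features(mri: np.ndarray, ct: np.ndarray) -> np.ndarray:
    """Stack the two co-registered volumes into one feature vector per voxel."""
    assert mri.shape == ct.shape, "volumes must be registered onto the same grid"
    return np.stack([mri.ravel(), ct.ravel()], axis=1)

def train_tissue_classifier(mri, ct, labels):
    """labels: integer volume (0=intact, 1=degraded, 2=white rot) -- hypothetical encoding."""
    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
    clf.fit(voxel_features(mri, ct), labels.ravel())
    return clf

def classify_volume(clf, mri, ct):
    """Return a predicted label volume with the same shape as the inputs."""
    return clf.predict(voxel_features(mri, ct)).reshape(mri.shape)
```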


Subject(s)
Artificial Intelligence, Vitis, Imaging, Three-Dimensional, Workflow, Plant Diseases, Machine Learning
2.
Plant Methods ; 18(1): 130, 2022 Dec 08.
Article in English | MEDLINE | ID: mdl-36482291

ABSTRACT

BACKGROUND: High-throughput phenotyping platforms allow the study of the form and function of a large number of genotypes subjected to different growing conditions (GxE). A number of image acquisition and processing pipelines have been developed to automate this process, for micro-plots in the field and for individual plants in controlled conditions. Capturing shoot development requires extracting from images both the evolution of the 3D plant architecture as a whole and a temporal tracking of the growth of its organs.

RESULTS: We propose PhenoTrack3D, a new pipeline to extract a 3D + t reconstruction of maize. It allows the study of plant architecture and individual organ development over time, during the entire growth cycle. The method tracks the development of each organ from a time series of plants whose organs have already been segmented in 3D using existing methods, such as Phenomenal [Artzet et al. in BioRxiv 1:805739, 2019], which was chosen in this study. First, a novel deep-learning-based stem detection method is used to precisely locate the point of separation between ligulated and growing leaves. Second, a new multiple sequence alignment algorithm performs the temporal tracking of ligulated leaves, which have a consistent geometry over time and an unambiguous topological position. Finally, growing leaves are back-tracked with a distance-based approach. This pipeline was validated on a challenging dataset of 60 maize hybrids imaged daily from emergence to maturity in the PhenoArch platform (ca. 250,000 images). The stem tip was precisely detected over time (RMSE < 2.1 cm). After tracking, 97.7% of ligulated leaves and 85.3% of growing leaves were assigned to the correct rank, on 30 plants × 43 dates. The pipeline allowed various development and architecture traits to be extracted at the organ level, with good overall correlation to manual observations on random subsets of 10-355 plants.

CONCLUSIONS: We developed a novel phenotyping method based on sequence alignment and deep learning. It characterises the development of maize architecture at the organ level, automatically and at high throughput. It has been validated on hundreds of plants over the entire development cycle, demonstrating its applicability to GxE analyses of large maize datasets.
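PhenoTrack3D casts leaf tracking as a sequence alignment problem. The sketch below illustrates that idea in a deliberately simplified, pairwise form: a Needleman-Wunsch-style alignment of the ordered ligulated leaves of two successive dates, scored by insertion height. The actual pipeline uses a multiple sequence alignment over the whole time series; the gap penalty, scoring, and function names here are assumptions, not the paper's parameters.

```python
# Simplified, pairwise illustration of tracking leaves by sequence alignment.
# Not PhenoTrack3D's multiple sequence alignment; scores and names are assumed.
import numpy as np

def align_leaf_sequences(heights_t0, heights_t1, gap_penalty=5.0):
    """Align two ordered lists of leaf insertion heights (cm); return matched index pairs."""
    n, m = len(heights_t0), len(heights_t1)
    score = np.zeros((n + 1, m + 1))
    score[:, 0] = np.arange(n + 1) * gap_penalty   # unmatched leaves cost a gap
    score[0, :] = np.arange(m + 1) * gap_penalty
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = score[i - 1, j - 1] + abs(heights_t0[i - 1] - heights_t1[j - 1])
            score[i, j] = min(match,
                              score[i - 1, j] + gap_penalty,
                              score[i, j - 1] + gap_penalty)
    # Trace back to recover which leaf rank at t0 matches which rank at t1.
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        diag = score[i - 1, j - 1] + abs(heights_t0[i - 1] - heights_t1[j - 1])
        if np.isclose(score[i, j], diag):
            pairs.append((i - 1, j - 1)); i -= 1; j -= 1
        elif np.isclose(score[i, j], score[i - 1, j] + gap_penalty):
            i -= 1
        else:
            j -= 1
    return pairs[::-1]
```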

3.
Plant Methods ; 18(1): 127, 2022 Dec 01.
Article in English | MEDLINE | ID: mdl-36457133

ABSTRACT

BACKGROUND: High-throughput phenotyping is crucial for the genetic and molecular understanding of adaptive root system development. In recent years, imaging automata have been developed to acquire the root system architecture of many genotypes grown in Petri dishes, in order to explore the genotype x environment (GxE) interaction. There is now increasing interest in understanding the dynamics of adaptive responses, such as organ appearance or growth rate. However, due to the increasing complexity of root architectures during development, accurately describing the topology, geometry, and dynamics of a growing root system remains a challenge.

RESULTS: We designed a high-throughput phenotyping method, combining an imaging device and an automatic analysis pipeline based on registration and topological tracking, capable of accurately describing the topology and geometry of observed root systems in 2D + t. The method was tested on a challenging Arabidopsis seedling dataset including numerous root occlusions and crossovers. Static phenes are estimated with high accuracy ([Formula: see text] and [Formula: see text] for primary and second-order root lengths, respectively). This performance is similar to state-of-the-art results obtained on root systems of equal or lower complexity. In addition, our pipeline accurately estimates dynamic phenes between two successive observations ([Formula: see text] for lateral root growth).

CONCLUSIONS: We designed a novel root-tracking method that accurately and automatically measures both static and dynamic parameters of root system architecture from a novel high-throughput root phenotyping platform. It has been used to characterise developmental patterns of root systems grown under various environmental conditions. It provides a solid basis for exploring the GxE interaction controlling the dynamics of root system architecture adaptive responses. In future work, our approach will be adapted to a wider range of imaging configurations and species.
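The pipeline couples image registration with topological tracking of the root system over time. As a hedged illustration only, not the authors' pipeline, the sketch below pairs a translation-only registration (phase correlation via scikit-image) with nearest-neighbour matching of root tips between two dates; all names, the shift sign convention, and the matching threshold are assumptions.

```python
# Illustrative "register, then track" sketch for 2D+t root time series.
# A translation-only registration stands in for the registration step and
# nearest-neighbour tip matching stands in for the topological tracking.
import numpy as np
from skimage.registration import phase_cross_correlation

def register_translation(img_t0, img_t1):
    """Estimate the rigid shift of the dish between two observations."""
    shift, _, _ = phase_cross_correlation(img_t0, img_t1)
    return shift  # (dy, dx); sign convention may need flipping for your data

def track_root_tips(tips_t0, tips_t1, shift, max_dist=50.0):
    """Match each root tip at t0 to the closest shift-corrected tip at t1."""
    tips_t1 = np.asarray(tips_t1, dtype=float) + shift  # bring t1 into the t0 frame
    matches = []
    for i, p in enumerate(np.asarray(tips_t0, dtype=float)):
        d = np.linalg.norm(tips_t1 - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            matches.append((i, j, float(d[j])))  # distance ~ growth between dates
    return matches
```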

4.
Bioinformatics ; 37(10): 1482-1484, 2021 06 16.
Article in English | MEDLINE | ID: mdl-32997734

ABSTRACT

SUMMARY: The growing interest of animal and plant research communities in biomedical 3D imaging devices is giving rise to new research topics. The anatomy, structure and function of tissues can be observed non-destructively in time-lapse multimodal imaging experiments by combining the outputs of devices such as X-ray CT and MRI scanners. However, living samples cannot remain in these devices for long periods. Manual positioning and the natural growth of living samples induce variations in shape, position and orientation across the acquired images, which require a preprocessing step of 3D registration prior to analysis. This registration step becomes more complex when combining observations from devices that highlight different tissue structures: identifying image invariants across modalities is challenging and can lead to intractable problems. Fijiyama, a Fiji plugin built upon biomedical registration algorithms, is aimed at non-specialists and facilitates the automatic alignment of 3D images acquired at successive times and/or with different imaging systems. Its versatility was assessed on four case studies combining multimodal and time-series data, spanning micro to macro scales.

AVAILABILITY AND IMPLEMENTATION: Fijiyama is open source software (GPL license) implemented in Java. The plugin is available through the official Fiji release. Extensive documentation is available at the official page: https://imagej.github.io/Fijiyama.

SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
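Fijiyama itself is a Java plugin operated from the Fiji interface, so there is no Python API to show. As a hedged illustration of the registration problem it automates, the snippet below performs a mutual-information rigid registration of two 3D volumes with SimpleITK; the file names and optimizer settings are placeholders, and this is not Fijiyama's algorithm.

```python
# Hedged illustration of 3D rigid registration between two acquisitions
# (different times and/or modalities) using SimpleITK. Not Fijiyama's API.
import SimpleITK as sitk

fixed = sitk.ReadImage("scan_day0.tif", sitk.sitkFloat32)    # hypothetical file
moving = sitk.ReadImage("scan_day7.tif", sitk.sitkFloat32)   # hypothetical file

# Initialize a rigid (Euler) transform aligning the volume centers.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)  # tolerant to modality changes
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)
reg.SetInterpolator(sitk.sitkLinear)

transform = reg.Execute(fixed, moving)
# Resample the moving image onto the fixed grid for voxel-wise comparison.
aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0,
                        moving.GetPixelID())
sitk.WriteImage(aligned, "scan_day7_registered.tif")
```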


Subject(s)
Algorithms, Software, Animals, Imaging, Three-Dimensional, Magnetic Resonance Imaging, Time-Lapse Imaging
5.
Nat Methods ; 7(7): 547-53, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20543845

ABSTRACT

Quantitative information on growing organs is required to better understand morphogenesis in both plants and animals. However, detailed analyses of growth patterns at cellular resolution have remained elusive. We developed an approach, multiangle image acquisition, three-dimensional reconstruction and cell segmentation-automated lineage tracking (MARS-ALT), in which we imaged whole organs from multiple angles, computationally merged and segmented these images to provide accurate cell identification in three dimensions and automatically tracked cell lineages through multiple rounds of cell division during development. Using these methods, we quantitatively analyzed Arabidopsis thaliana flower development at cell resolution, which revealed differential growth patterns of key regions during early stages of floral morphogenesis. Lastly, using rice roots, we demonstrated that this approach is both generic and scalable.
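MARS-ALT chains multi-angle fusion, 3D cell segmentation, and automated lineage tracking. The sketch below is only an illustration of the last two steps under simplifying assumptions, not the MARS-ALT implementation: a seeded watershed segmentation of a fused intensity volume, followed by a naive lineage assignment by maximum voxel overlap between two registered, segmented time points.

```python
# Illustrative sketch of 3D cell segmentation and overlap-based lineage tracking.
# Not the MARS-ALT implementation; parameters and assumptions are hypothetical.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

def segment_cells(volume, min_distance=5):
    """Seeded watershed on a fused 3D intensity volume; returns a label volume."""
    seeds = peak_local_max(-volume, min_distance=min_distance)  # assumes dark cell interiors
    markers = np.zeros(volume.shape, dtype=int)
    markers[tuple(seeds.T)] = np.arange(1, len(seeds) + 1)
    return watershed(volume, markers)

def lineage_by_overlap(labels_t0, labels_t1):
    """Map each mother cell at t0 to the (at most two) daughter labels at t1 overlapping it most."""
    lineage = {}
    for mother in np.unique(labels_t0):
        if mother == 0:  # skip background
            continue
        daughters, counts = np.unique(labels_t1[labels_t0 == mother], return_counts=True)
        keep = daughters != 0
        order = np.argsort(-counts[keep])
        lineage[int(mother)] = [int(d) for d in daughters[keep][order][:2]]
    return lineage
```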


Subject(s)
Arabidopsis/cytology, Cell Lineage/physiology, Flowers/cytology, Flowers/growth & development, Image Processing, Computer-Assisted/methods, Meristem/cytology, Algorithms, Cell Division/physiology, Green Fluorescent Proteins, Meristem/growth & development, Plant Proteins/metabolism, Reproducibility of Results, Time Factors