Results 1 - 4 of 4
2.
Sensors (Basel); 20(3), 2020 Jan 28.
Article in English | MEDLINE | ID: mdl-32012976

ABSTRACT

An understanding of marine ecosystems and their biodiversity is relevant to the sustainable use of the goods and services they offer. Since marine areas host complex ecosystems, it is important to develop spatially widespread monitoring networks capable of providing large amounts of multiparametric information, encompassing both biotic and abiotic variables and describing the ecological dynamics of the observed species. In this context, imaging devices are valuable tools that complement other biological and oceanographic monitoring devices. Nevertheless, the large volumes of images and video they produce cannot all be processed manually, and autonomous routines for recognizing relevant content, classification, and tagging are urgently needed. In this work, we propose a pipeline for the analysis of visual data that integrates video/image annotation tools for defining, training, and validating datasets with video/image enhancement and machine and deep learning approaches. Such a pipeline is required to achieve good performance in recognizing and classifying mobile and sessile megafauna, in order to obtain integrated information on spatial distribution and temporal dynamics. A prototype implementation of the analysis pipeline is demonstrated on deep-sea videos taken by one of the fixed cameras of the LoVe Ocean Observatory network of the Lofoten Islands (Norway), at 260 m depth in the Barents Sea; it achieved good classification results on an independent test dataset, with an accuracy of 76.18% and an area under the curve (AUC) of 87.59%.
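The pipeline itself is not reproduced in this record, but the sketch below illustrates two of the steps the abstract names: contrast enhancement of deep-sea frames and evaluation of a trained classifier on an independent test set via accuracy and AUC. It is a minimal sketch assuming OpenCV and scikit-learn; CLAHE is one common enhancement choice (not necessarily the authors'), and `model`, `X_test`, and `y_test` are hypothetical placeholders.

```python
# Minimal illustration (not the authors' code) of enhancement + evaluation.
import cv2
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

def enhance_frame(frame_bgr: np.ndarray) -> np.ndarray:
    """Apply CLAHE to the luminance channel to lift low-contrast deep-sea imagery."""
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)

def evaluate(model, X_test: np.ndarray, y_test: np.ndarray) -> dict:
    """Score a binary megafauna presence classifier on an independent test set."""
    proba = model.predict_proba(X_test)[:, 1]   # probability of the positive class
    preds = (proba >= 0.5).astype(int)          # hard labels at the 0.5 threshold
    return {"accuracy": accuracy_score(y_test, preds),
            "auc": roc_auc_score(y_test, proba)}
```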


Subjects
Aquatic Organisms/physiology; Biodiversity; Ecosystem; Video Recording/methods; Animals; Aquatic Organisms/classification; Deep Learning; Humans; Image Enhancement/methods; Machine Learning; Neural Networks, Computer; Oceans and Seas
3.
Sensors (Basel); 20(21), 2020 Nov 04.
Article in English | MEDLINE | ID: mdl-33158174

ABSTRACT

Imaging technologies are being deployed on cabled observatory networks worldwide. They allow the biological activity of deep-sea organisms to be monitored on temporal scales never attained before. In this paper, we customized Convolutional Neural Network (CNN) image processing to track behavioral activities in an iconic deep-sea species of conservation concern, the bubblegum coral Paragorgia arborea, in response to ambient oceanographic conditions at the Lofoten-Vesterålen observatory. Images and concomitant oceanographic data were taken hourly from February to June 2018. We quantified coral activity in terms of bloated, semi-bloated, and non-bloated surfaces, as proxies for polyp filtering, retraction, and transient activity, respectively. A test accuracy of 90.47% was obtained. Chronobiology-oriented statistics and advanced Artificial Neural Network (ANN) multivariate regression modeling showed that a daily coral filtering rhythm occurs within one major dusk phase and is independent of tides. Polyp activity, in particular extrusion, increased from March to June and was able to cope with an increase in chlorophyll concentration, indicating seasonality. Our study shows that it is possible to establish a model for the development of automated pipelines able to extract biological information from time series of images, which is helpful for obtaining multidisciplinary information from cabled observatory infrastructures.
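As a rough illustration of the kind of three-class CNN classifier described above (bloated / semi-bloated / non-bloated coral surface), the sketch below attaches a new three-way head to an ImageNet-pretrained backbone. It assumes PyTorch and torchvision; the backbone, loss, and learning rate are illustrative choices, not the authors' exact setup.

```python
# Minimal sketch of a 3-class coral-state classifier via transfer learning.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # bloated, semi-bloated, non-bloated

def build_coral_classifier() -> nn.Module:
    """ImageNet-pretrained ResNet-18 with a fresh 3-way head for polyp states."""
    net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    net.fc = nn.Linear(net.fc.in_features, NUM_CLASSES)
    return net

model = build_coral_classifier()
criterion = nn.CrossEntropyLoss()                       # multi-class objective
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
```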


Subjects
Anthozoa/physiology; Image Processing, Computer-Assisted; Neural Networks, Computer; Periodicity; Animals
4.
Sensors (Basel); 20(6), 2020 Mar 13.
Article in English | MEDLINE | ID: mdl-32183233

ABSTRACT

This paper presents the technological developments and the policy contexts for the project "Autonomous Robotic Sea-Floor Infrastructure for Bentho-Pelagic Monitoring" (ARIM). The development builds on national experience with robotic component technologies that are combined and merged into a new product for autonomous and integrated ecological deep-sea monitoring. Traditional monitoring is often vessel-based and thus resource demanding, and fulfilling current ecosystem-monitoring policy with traditional approaches is economically unviable. This project therefore developed platforms for bentho-pelagic monitoring using an arrangement of crawler and stationary platforms at the Lofoten-Vesterålen (LoVe) observatory network (Norway). Visual and acoustic imaging, along with standard oceanographic sensors, have been combined to support advanced and continuous spatio-temporal monitoring near cold-water coral mounds. Just as important are the automatic processing techniques under development, which have been implemented to allow quantification of species (or categories of species), i.e., tracking and classification. At the same time, real-time outboard-processed three-dimensional (3D) laser scanning has been implemented to increase mission autonomy, delivering quantifiable information on habitat features (i.e., for seascape approaches). The first version of platform autonomy has already been tested under controlled conditions, with a tethered crawler exploring the vicinity of a cabled, stationary, instrumented garage. Our vision is that eliminating the tether, combined with inductive battery recharge through fuel cell technology, will enable self-sustained long-term autonomous operations over large areas, serving not only the needs of science but also subsea industries such as oil and gas, and mining.
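The ARIM processing chain is not published in this record; the sketch below shows only a generic baseline for the kind of automatic species-quantification step mentioned above: detecting moving animals in fixed-camera video via background subtraction and counting per-frame detections. It assumes OpenCV; the `count_moving_objects` helper and its thresholds are hypothetical, not part of the project's pipeline.

```python
# Baseline motion-detection count for fixed-camera seafloor video (illustrative only).
import cv2

def count_moving_objects(video_path: str, min_area: int = 500) -> list[int]:
    """Return the number of sufficiently large moving blobs detected in each frame."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(
        history=500, varThreshold=32, detectShadows=False)
    counts = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                      # foreground mask
        mask = cv2.morphologyEx(                            # remove speckle noise
            mask, cv2.MORPH_OPEN,
            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        counts.append(sum(1 for c in contours if cv2.contourArea(c) >= min_area))
    cap.release()
    return counts
```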


Subjects
Ecosystem; Environmental Monitoring/methods; Oceanography/methods; Oceans and Seas; Acoustics/instrumentation; Animals; Anthozoa/physiology; Humans; Robotics/instrumentation; Video Recording/methods