Results 1 - 7 of 7
1.
Sensors (Basel) ; 23(16)2023 Aug 18.
Article in English | MEDLINE | ID: mdl-37631778

ABSTRACT

As pollinators, insects play a crucial role in ecosystem management and world food production. However, insect populations are declining, necessitating efficient monitoring methods. Existing methods analyze video or time-lapse images of insects in nature, but analysis is challenging because insects are small objects in complex and dynamic natural vegetation scenes. In this work, we provide a dataset of primarily honeybees visiting three different plant species over two summer months. The dataset consists of 107,387 annotated time-lapse images from multiple cameras, including 9,423 annotated insects. We present a two-step method for detecting insects in time-lapse RGB images. First, the images are preprocessed with a motion-informed enhancement technique that uses motion and color cues to make insects more salient. Second, the enhanced images are fed into a convolutional neural network (CNN) object detector. The method improves on the deep learning object detectors You Only Look Once (YOLO) and Faster Region-based CNN (Faster R-CNN): with motion-informed enhancement, the YOLO detector's average micro F1-score improves from 0.49 to 0.71, and the Faster R-CNN detector's from 0.32 to 0.56. Our dataset and proposed method are a step toward automating time-lapse camera monitoring of flying insects.
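The preprocessing step lends itself to a short illustration. Below is a minimal sketch of a motion-informed enhancement of this kind, assuming OpenCV and NumPy are available; the frame differencing, blurring and blending weights are illustrative choices, not the authors' exact implementation.

```python
# Minimal sketch of motion-informed enhancement for time-lapse frames.
# Assumption: the blending scheme below is illustrative, not the published method.
import cv2
import numpy as np

def enhance_frame(frame_bgr: np.ndarray, prev_bgr: np.ndarray) -> np.ndarray:
    """Highlight small moving objects (insects) against static vegetation."""
    # Motion cue: absolute difference to the previous time-lapse frame.
    motion = cv2.absdiff(frame_bgr, prev_bgr)
    motion_gray = cv2.cvtColor(motion, cv2.COLOR_BGR2GRAY)
    motion_mask = cv2.GaussianBlur(motion_gray, (5, 5), 0)

    # Color cue: keep the original appearance so the detector still sees texture,
    # but boost pixels where motion was observed.
    boost = cv2.merge([motion_mask] * 3).astype(np.float32) / 255.0
    enhanced = frame_bgr.astype(np.float32) * (0.5 + 0.5 * boost)
    return np.clip(enhanced, 0, 255).astype(np.uint8)

# The enhanced frames would then be passed to a YOLO or Faster R-CNN detector
# trained on the annotated time-lapse dataset.
```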


Subjects
Ecosystem, Insects, Bees, Animals, Time-Lapse Imaging, Food, Motion (Physics)
2.
Ecol Lett ; 25(12): 2753-2775, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36264848

ABSTRACT

High-resolution monitoring is fundamental to understanding ecosystem dynamics in an era of global change and biodiversity declines. While real-time and automated monitoring of abiotic components has been possible for some time, monitoring biotic components (for example, individual behaviours and traits, and species abundance and distribution) is far more challenging. Recent technological advancements offer potential solutions through: (i) increasingly affordable high-throughput recording hardware, which can collect rich multidimensional data, and (ii) increasingly accessible artificial intelligence approaches, which can extract ecological knowledge from large datasets. However, automating the monitoring of ecological communities with such technologies has so far been achieved only at low spatiotemporal resolution and within limited steps of the monitoring workflow. Here, we review existing technologies for data recording and processing that enable automated monitoring of ecological communities. We then present novel frameworks that combine such technologies into fully automated pipelines to detect, track, classify and count multiple species, and to record behavioural and morphological traits, at resolutions that have previously been impossible to achieve. Based on these rapidly developing technologies, we illustrate a solution to one of the greatest challenges in ecology: the ability to rapidly generate high-resolution, multidimensional and standardised data across complex ecologies.
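The detect-track-classify-count workflow described here can be pictured as a thin pipeline skeleton. The sketch below is a hypothetical illustration; `detect`, `track` and `classify` are placeholders for whatever CNN detector, multi-object tracker and species classifier a particular study plugs in.

```python
# Skeleton of an automated monitoring pipeline (detect -> track -> classify -> count).
# All component functions are placeholders supplied by the caller.
from collections import Counter

def run_pipeline(frames, detect, track, classify):
    """Return per-species counts of individuals observed in a frame sequence."""
    detections_per_frame = [detect(frame) for frame in frames]   # bounding boxes per frame
    tracks = track(detections_per_frame)                         # one track per individual
    return Counter(classify(t) for t in tracks)                  # species label per track
```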


Subjects
Artificial Intelligence, Ecosystem, Biodiversity, Biota
3.
PeerJ ; 10: e13837, 2022.
Article in English | MEDLINE | ID: mdl-36032940

ABSTRACT

Image-based methods for species identification offer cost-efficient solutions for biomonitoring. This is particularly relevant for invertebrate studies, where bulk samples often represent insurmountable workloads for sorting, identifying, and counting individual specimens. On the other hand, image-based classification using deep learning tools has strict requirements for the amount of training data, which is often a limiting factor. Here, we examine how classification accuracy increases with the amount of training data using the BIODISCOVER imaging system, constructed for image-based classification and biomass estimation of invertebrate specimens. We use a balanced dataset of 60 specimens of each of 16 taxa of freshwater macroinvertebrates to systematically quantify how the classification performance of a convolutional neural network (CNN) increases for individual taxa and for the overall community as the number of specimens used for training is increased. We show a striking 99.2% classification accuracy when the CNN (EfficientNet-B6) is trained on 50 specimens of each taxon, and we show how the lower accuracy of models trained on less data is particularly evident for morphologically similar species within the same taxonomic order. Even with as few as 15 specimens used for training, classification accuracy reached 97%. Our results add to a recent body of literature showing the huge potential of image-based methods and deep learning for specimen-based research, and they offer a perspective on future automated approaches for deriving ecological data from bulk arthropod samples.
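As a rough illustration of such a learning-curve experiment, the sketch below trains a classifier on an increasing number of specimens per taxon and reports test accuracy. The folder layout, image size and short fine-tuning loop are assumptions made for the illustration; EfficientNet-B6 is used only because the abstract names it, and this is not the study's code.

```python
# Learning-curve sketch: accuracy vs. number of training specimens per taxon.
import random
import torch
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, models, transforms

tf = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("specimens/train", transform=tf)  # hypothetical layout
test_set = datasets.ImageFolder("specimens/test", transform=tf)

def subset_per_class(dataset, n_per_class):
    """Pick at most n_per_class images from each taxon."""
    by_class = {}
    for idx, (_, label) in enumerate(dataset.samples):
        by_class.setdefault(label, []).append(idx)
    picked = [i for idxs in by_class.values()
              for i in random.sample(idxs, min(n_per_class, len(idxs)))]
    return Subset(dataset, picked)

device = "cuda" if torch.cuda.is_available() else "cpu"
for n in (15, 30, 50):                      # specimens per taxon used for training
    model = models.efficientnet_b6(weights="IMAGENET1K_V1")
    model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features,
                                          len(train_set.classes))
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loader = DataLoader(subset_per_class(train_set, n), batch_size=16, shuffle=True)
    model.train()
    for epoch in range(5):                  # short fine-tuning loop for illustration
        for x, y in loader:
            opt.zero_grad()
            loss = torch.nn.functional.cross_entropy(model(x.to(device)), y.to(device))
            loss.backward()
            opt.step()
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in DataLoader(test_set, batch_size=16):
            correct += (model(x.to(device)).argmax(1) == y.to(device)).sum().item()
            total += y.numel()
    print(f"{n} specimens/taxon -> accuracy {correct / total:.3f}")
```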


Subjects
Arthropods, Deep Learning, Animals, Neural Networks (Computer), Biological Monitoring, Fresh Water
4.
HardwareX ; 12: e00331, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35795086

ABSTRACT

Climate change is rapidly altering the Arctic environment. Although long-term environmental observations have been made at a few locations in the Arctic, the incomplete coverage from ground stations is a main limitation on observations in these remote areas. Here we present a wind- and solar-powered multi-purpose mobile observatory (ARC-MO) that enables near-real-time measurements of air, ice, land, river, and marine parameters in remote off-grid areas. Two test units were constructed and placed in Northeast Greenland, where they have collected data from cabled and wireless instruments deployed in the environment since late summer 2021. The two units can communicate locally via WiFi (the units are placed 25 km apart) and transmit near-real-time data globally over satellite. Data are streamed live and are accessible at https://gios.org. The cost of one mobile observatory unit is approximately €304,000. These test units demonstrate the possibility of integrative and automated environmental data collection in remote coastal areas and could serve as models for a proposed global observatory system.

5.
Sensors (Basel) ; 21(18)2021 Sep 13.
Article in English | MEDLINE | ID: mdl-34577335

ABSTRACT

Invasive alien plant species (IAPS) pose a threat to biodiversity as they propagate and outcompete natural vegetation. In this study, a system for monitoring IAPS on the roadside is presented. The system consists of a camera mounted on a vehicle that follows the traffic and acquires images at high speed. Images of seven IAPS (Cytisus scoparius, Heracleum, Lupinus polyphyllus, Pastinaca sativa, Reynoutria, Rosa rugosa, and Solidago) were collected on Danish motorways. Three deep convolutional neural networks, two for classification (ResNet50V2 and MobileNetV2) and one for object detection (YOLOv3), were trained and evaluated at different image sizes. The results showed that network performance varied with the input image size as well as with the size of the IAPS in the images. Binary classification of IAPS vs. non-IAPS performed better than classification of the individual IAPS. This study shows that automatic detection and mapping of invasive plants along the roadside is possible at high driving speeds.
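As a hedged sketch of the binary IAPS vs. non-IAPS setup, the snippet below builds a MobileNetV2 classifier at several input resolutions, since the abstract reports that performance depends on image size. The chosen sizes, pooling, dropout and training configuration are illustrative assumptions, not the study's settings.

```python
# Binary IAPS / non-IAPS classifier at different input sizes (illustrative only).
import tensorflow as tf

def build_binary_classifier(image_size: int) -> tf.keras.Model:
    """MobileNetV2 backbone with a single sigmoid output (IAPS present / absent)."""
    backbone = tf.keras.applications.MobileNetV2(
        input_shape=(image_size, image_size, 3),
        include_top=False,
        weights="imagenet",
        pooling="avg",
    )
    x = tf.keras.layers.Dropout(0.2)(backbone.output)
    out = tf.keras.layers.Dense(1, activation="sigmoid")(x)
    model = tf.keras.Model(backbone.input, out)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Compare models built at different input resolutions before training on roadside images.
for size in (128, 160, 224):
    model = build_binary_classifier(size)
    print(size, model.count_params())
```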


Subjects
Deep Learning, Introduced Species, Biodiversity, Neural Networks (Computer), Plants
6.
Sensors (Basel) ; 21(2)2021 Jan 06.
Article in English | MEDLINE | ID: mdl-33419136

ABSTRACT

Insect monitoring methods are typically very time-consuming and involve substantial investment in species identification following manual trapping in the field. Insect traps are often serviced only weekly, resulting in low temporal resolution of the monitoring data, which hampers ecological interpretation. This paper presents a portable computer vision system capable of attracting and detecting live insects. More specifically, it proposes detection and classification of species by recording images of live individuals attracted to a light trap. An Automated Moth Trap (AMT) with multiple light sources and a camera was designed to attract and monitor live insects during twilight and night hours. A computer vision algorithm, referred to as Moth Classification and Counting (MCC) and based on deep learning analysis of the captured images, tracked and counted the number of insects and identified moth species. Observations over 48 nights resulted in the capture of more than 250,000 images, an average of 5,675 images per night. A customized convolutional neural network was trained on 2,000 labeled images of live moths from eight different classes, achieving a high validation F1-score of 0.93. The algorithm achieved an average classification and tracking F1-score of 0.71 and a tracking detection rate of 0.79. Overall, the proposed computer vision system and algorithm show promising results as a low-cost solution for non-destructive and automatic monitoring of moths.
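The idea of linking detections across successive trap images and counting each individual once can be illustrated with a toy centroid tracker. This is a hypothetical sketch, not the published MCC algorithm; the nearest-centroid matching and the distance threshold are assumptions.

```python
# Toy track-and-count: link insect centroids across trap images, count each track once.
import math

def track_and_count(detections_per_frame, max_dist=50.0):
    """detections_per_frame: list of lists of (x, y) insect centroids per image."""
    next_id = 0
    active = {}        # track_id -> last centroid
    finished = 0
    for detections in detections_per_frame:
        unmatched = dict(active)
        new_active = {}
        for (x, y) in detections:
            # Link to the closest still-unmatched track, if close enough.
            best = min(unmatched.items(),
                       key=lambda kv: math.hypot(kv[1][0] - x, kv[1][1] - y),
                       default=None)
            if best and math.hypot(best[1][0] - x, best[1][1] - y) <= max_dist:
                track_id = best[0]
                del unmatched[track_id]
            else:
                track_id, next_id = next_id, next_id + 1
            new_active[track_id] = (x, y)
        finished += len(unmatched)   # tracks with no match ended in this frame
        active = new_active
    return finished + len(active)    # total individuals observed

# Example: one insect seen in frames 1-2, another appearing in frames 2-3 -> 2 insects.
print(track_and_count([[(10, 10)], [(12, 11), (80, 40)], [(81, 41)]]))
```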


Subjects
Deep Learning, Moths, Animals, Computers, Insects, Neural Networks (Computer)
7.
Proc Natl Acad Sci U S A ; 118(2)2021 01 12.
Article in English | MEDLINE | ID: mdl-33431561

ABSTRACT

Most animal species on Earth are insects, and recent reports suggest that their abundance is in drastic decline. Although these reports come from a wide range of insect taxa and regions, the evidence to assess the extent of the phenomenon is sparse. Insect populations are challenging to study, and most monitoring methods are labor intensive and inefficient. Advances in computer vision and deep learning provide potential new solutions to this global challenge. Cameras and other sensors can effectively, continuously, and noninvasively perform entomological observations throughout diurnal and seasonal cycles. The physical appearance of specimens can also be captured by automated imaging in the laboratory. When trained on these data, deep learning models can provide estimates of insect abundance, biomass, and diversity. Further, deep learning models can quantify variation in phenotypic traits, behavior, and interactions. Here, we connect recent developments in deep learning and computer vision to the urgent demand for more cost-efficient monitoring of insects and other invertebrates. We present examples of sensor-based monitoring of insects. We show how deep learning tools can be applied to exceptionally large datasets to derive ecological information and discuss the challenges that lie ahead for the implementation of such solutions in entomology. We identify four focal areas, which will facilitate this transformation: 1) validation of image-based taxonomic identification; 2) generation of sufficient training data; 3) development of public, curated reference databases; and 4) solutions to integrate deep learning and molecular tools.


Subjects
Deep Learning, Ecological Parameter Monitoring/trends, Entomology/trends, Insects, Animals, Ecological Parameter Monitoring/instrumentation, Entomology/instrumentation