1.
Ecol Evol ; 11(12): 8254-8263, 2021 Jun.
Article in English | MEDLINE | ID: mdl-34188884

ABSTRACT

Animal movement studies are conducted to monitor ecosystem health, understand ecological dynamics, and address management and conservation questions. In marine environments, traditional sampling and monitoring methods to measure animal movement are invasive, labor intensive, costly, and limited in the number of individuals that can be feasibly tracked. Automated detection and tracking of small-scale movements of many animals through cameras are possible but are largely untested in field conditions, hampering applications to ecological questions.

Here, we aimed to test the ability of an automated object detection and object tracking pipeline to track the small-scale movement of many individuals in videos. We applied the pipeline to track fish movement in the field and characterize movement behavior. We automated the detection of a common fisheries species (yellowfin bream, Acanthopagrus australis) along a known movement passageway from underwater videos. We then tracked fish movement with three types of tracking algorithms (MOSSE, Seq-NMS, and SiamMask) and evaluated their accuracy at characterizing movement.

We successfully detected yellowfin bream in a multispecies assemblage (F1 score = 91%). At least 120 of the 169 individual bream present in the videos were correctly identified and tracked. Accuracy varied among the three tracking architectures, with MOSSE and SiamMask achieving an accuracy of 78% and Seq-NMS 84%.

By employing this integrated object detection and tracking pipeline, we demonstrated a noninvasive and reliable approach to studying fish behavior by tracking their movement under field conditions. These cost-effective technologies provide a means for future studies to scale up the analysis of movement across many visual monitoring systems.
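A brief illustration of the detection metric reported above: the F1 score is the harmonic mean of precision and recall. The counts below are hypothetical, not the study's actual confusion matrix; only the 120-of-169 tracked-bream figure comes from the abstract.

```python
# F1 score for object detection: harmonic mean of precision and recall.
# Counts are hypothetical (the abstract reports F1 = 91%, not raw counts).

def f1_score(tp: int, fp: int, fn: int) -> float:
    """tp = true positives, fp = false positives, fn = missed detections."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Example: 120 of 169 bream tracked gives recall 120/169 ~= 0.71;
# F1 additionally penalises false-positive detections.
print(round(f1_score(tp=120, fp=10, fn=49), 3))
```

Equivalently, F1 = 2·TP / (2·TP + FP + FN), which avoids computing precision and recall separately.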

2.
Sci Data ; 8(1): 84, 2021 03 16.
Article in English | MEDLINE | ID: mdl-33727570

ABSTRACT

This paper describes benthic coral reef community composition point-based field data sets derived from georeferenced photoquadrats using machine learning. Annually over a 17-year period (2002-2018), data were collected using downward-looking photoquadrats that capture an approximately 1 m2 footprint along 100 m-1500 m transect surveys distributed along the reef slope and across the reef flat of Heron Reef (28 km2), Southern Great Barrier Reef, Australia. Benthic community composition for the photoquadrats was automatically interpreted through deep learning, following initial manual calibration of the algorithm. The resulting data sets support understanding of coral reef biology, ecology, mapping, and dynamics. Similar methods to derive the benthic data have been published for seagrass habitats; here, however, we have adapted the methods for application to coral reef habitats, with the integration of automatic photoquadrat analysis. The approach presented is globally applicable to various submerged and benthic community ecological applications, and provides the basis for further studies at this site, for regional to global comparative studies, and for the design of similar monitoring programs elsewhere.
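The point-based approach described above reduces each photoquadrat to a set of classified points, from which community composition is the proportion of points per benthic class. A minimal sketch, assuming hypothetical class labels output by a classifier (the label names and point count are invented for illustration):

```python
from collections import Counter

# Point-based benthic composition: each photoquadrat yields a grid of
# classified points; proportional cover is the fraction of points per class.

def proportional_cover(point_labels: list[str]) -> dict[str, float]:
    counts = Counter(point_labels)
    total = len(point_labels)
    return {label: n / total for label, n in counts.items()}

# Hypothetical labels for one ~1 m2 quadrat annotated at 25 points:
labels = ["coral"] * 10 + ["algae"] * 8 + ["sand"] * 7
print(proportional_cover(labels))
```

Aggregating these per-quadrat proportions along a transect then gives the community composition for a survey.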


Subject(s)
Biota , Coral Reefs , Animals , Australia
3.
Sci Data ; 7(1): 355, 2020 Oct 20.
Article in English | MEDLINE | ID: mdl-33082344

ABSTRACT

Addressing the global decline of coral reefs requires effective actions from managers, policymakers and society as a whole. Coral reef scientists are therefore challenged with the task of providing prompt and relevant inputs for science-based decision-making. Here, we provide a baseline dataset, covering 1300 km of tropical coral reef habitats globally, comprising over one million geo-referenced, high-resolution photo-quadrats analysed using artificial intelligence to automatically estimate the proportional cover of benthic components. The dataset contains information on five major reef regions, and spans 2012-2018, including surveys before and after the 2016 global bleaching event. The taxonomic resolution attained by image analysis, as well as the spatially explicit nature of the images, allow for multi-scale spatial analyses and temporal assessments (decline and recovery), and serve to support image recognition developments. This standardised dataset across broad geographies offers a significant contribution towards a sound baseline for advancing our understanding of coral reef ecology and thereby taking collective and informed actions to mitigate catastrophic losses in coral reefs worldwide.


Subject(s)
Coral Reefs , Environmental Monitoring , Animals , Anthozoa/classification , Artificial Intelligence , Earth, Planet
4.
Environ Monit Assess ; 192(11): 698, 2020 Oct 12.
Article in English | MEDLINE | ID: mdl-33044609

ABSTRACT

Environmental monitoring guides conservation and is particularly important for aquatic habitats, which are heavily impacted by human activities. Underwater cameras and uncrewed devices monitor aquatic wildlife, but manual processing of footage is a significant bottleneck to rapid data processing and dissemination of results. Deep learning has emerged as a solution, but its ability to accurately detect animals across habitat types and locations is largely untested for coastal environments. Here, we produce five deep learning models using an object detection framework to detect an ecologically important fish, luderick (Girella tricuspidata). We trained two models on footage from single habitats (seagrass or reef) and three on footage from both habitats. All models were subjected to tests from both habitat types. Models performed well on test data from the same habitat type (mAP50: 91.7% and 86.9% for seagrass and reef, respectively) but poorly on test sets from a different habitat type (73.3% and 58.4%, respectively). The models trained on footage from both habitats produced the highest object detection scores on both tests (an average of 92.4% and 87.8%, respectively). The ability of the combination-trained models to correctly estimate the ecological abundance metric, MaxN, showed similar patterns. The findings demonstrate that deep learning models extract ecologically useful information from video footage accurately and consistently, and can perform across habitat types when trained on footage from a variety of habitat types.
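MaxN, the abundance metric mentioned above, is conventionally the maximum number of individuals of a species detected in any single frame of a video, which avoids double-counting the same fish across frames. A minimal sketch, assuming per-frame detection counts from a hypothetical detector (not the study's data):

```python
# MaxN: maximum per-frame count of a species across a video clip.
# Using the max of per-frame counts avoids recounting the same individual.

def max_n(per_frame_counts: list[int]) -> int:
    return max(per_frame_counts, default=0)

# Hypothetical luderick detections per frame for one clip:
frames = [0, 2, 3, 5, 4, 1, 0]
print(max_n(frames))
```

Comparing detector-derived MaxN against manually annotated MaxN, as the abstract describes, is a common way to validate that detection accuracy translates into accurate abundance estimates.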


Subject(s)
Deep Learning , Environmental Monitoring , Animals , Ecosystem , Environment , Fishes , Humans