Results 1 - 2 of 2
1.
R Soc Open Sci ; 11(6): 240271, 2024 Jun.
Article in English | MEDLINE | ID: mdl-39100157

ABSTRACT

Marine predators are integral to the functioning of marine ecosystems, and their consumption requirements should be integrated into ecosystem-based management policies. However, estimating prey consumption in diving marine predators requires innovative methods as predator-prey interactions are rarely observable. We developed a novel method, validated by animal-borne video, that uses tri-axial acceleration and depth data to quantify prey capture rates in chinstrap penguins (Pygoscelis antarctica). These penguins are important consumers of Antarctic krill (Euphausia superba), a commercially harvested crustacean central to the Southern Ocean food web. We collected a large data set (n = 41 individuals) comprising overlapping video, accelerometer and depth data from foraging penguins. Prey captures were manually identified in videos, and those observations were used in supervised training of two deep learning neural networks (convolutional neural network (CNN) and V-Net). Although the CNN and V-Net architectures and input data pipelines differed, both trained models were able to predict prey captures from new acceleration and depth data (linear regression slope of predictions against video-observed prey captures = 1.13; R² ≈ 0.86). Our results illustrate that deep learning algorithms offer a means to process the large quantities of data generated by contemporary bio-logging sensors to robustly estimate prey capture events in diving marine predators.
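
The abstract describes supervised training on video-labelled acceleration and depth windows but does not include code. The sketch below is purely illustrative, not the authors' implementation: it assumes a hypothetical 4-channel input (tri-axial acceleration plus depth), an assumed window length and sampling rate, and a small 1D CNN written in PyTorch; the paper's actual CNN and V-Net architectures and data pipelines differ.

# Minimal sketch (not the authors' code): a 1D CNN that classifies fixed-length
# windows of tri-axial acceleration + depth (4 channels) as containing a prey
# capture or not. Window length, sampling rate, layer sizes, and training
# details are illustrative assumptions.
import torch
import torch.nn as nn

WINDOW_LEN = 125      # e.g. 5 s at an assumed 25 Hz sampling rate
N_CHANNELS = 4        # accelerometer x, y, z + depth

model = nn.Sequential(
    nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3),
    nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(32, 64, kernel_size=5, padding=2),
    nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Flatten(),
    nn.Linear(64 * (WINDOW_LEN // 4), 64),
    nn.ReLU(),
    nn.Linear(64, 1),   # logit: probability that the window contains a capture
)

# Supervised training against video-labelled windows (label 1 = capture observed).
x = torch.randn(8, N_CHANNELS, WINDOW_LEN)   # dummy batch of sensor windows
y = torch.randint(0, 2, (8, 1)).float()      # dummy video-derived labels
loss = nn.BCEWithLogitsLoss()(model(x), y)
loss.backward()

Predicted captures per dive from such a model could then be regressed against video-observed captures to assess calibration, in the spirit of the slope and R² the abstract reports.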

2.
Am J Primatol ; 86(4): e23599, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38244194

ABSTRACT

The urgent need for effective wildlife monitoring solutions in the face of global biodiversity loss has resulted in the emergence of conservation technologies such as passive acoustic monitoring (PAM). While PAM has been extensively used for marine mammals, birds, and bats, its application to primates is limited. Black-and-white ruffed lemurs (Varecia variegata) are a promising species for testing PAM, owing to their distinctive, loud roar-shrieks. Furthermore, these lemurs are challenging to monitor via traditional methods because of their fragmented and often unpredictable distribution in Madagascar's dense eastern rainforests. Our goals in this study were to develop a machine learning pipeline for automated call detection from PAM data, compare the effectiveness of PAM versus in-person observations, and investigate diel patterns in lemur vocal behavior. We conducted this study at Mangevo, Ranomafana National Park, in May-July 2019, concurrently carrying out focal follows and deploying autonomous recorders. We used transfer learning to build a convolutional neural network (optimized for recall) that automated the detection of lemur calls (57-h runtime; recall = 0.94, F1 = 0.70). We found that PAM outperformed in-person observations, saving time, money, and labor while also providing re-analyzable data. Using PAM yielded novel insights into V. variegata diel vocal patterns; we present the first published evidence of nocturnal calling. We developed a graphical user interface and open-sourced our data and code to serve as a resource for primatologists interested in implementing PAM and machine learning. By leveraging the potential of this pipeline, we can address the urgent need for effective primate population surveys to inform conservation strategies.
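
The abstract describes a transfer-learning CNN for automated call detection but does not include code. The sketch below is an assumption-laden illustration, not the authors' pipeline: it fine-tunes the head of an ImageNet-pretrained ResNet-18 on two-class spectrogram clips (roar-shriek vs. background) using PyTorch/torchvision; the backbone choice, input size, class labels, and training settings are all hypothetical.

# Minimal sketch (not the authors' pipeline): transfer learning for call
# detection from PAM spectrograms. A pretrained ResNet-18 backbone is used
# purely for illustration; the paper's actual architecture, input format,
# and training settings may differ.
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained backbone and freeze its weights.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False

# Replace the classifier head: 2 classes (roar-shriek vs. background noise).
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

# Dummy batch of spectrogram "images" (3 channels to match the pretrained input).
specs = torch.randn(4, 3, 224, 224)
labels = torch.tensor([1, 0, 1, 0])

# Train only the new head on the labelled clips.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss = nn.CrossEntropyLoss()(backbone(specs), labels)
loss.backward()
optimizer.step()

In practice, the decision threshold on the call class would then be tuned on held-out data to favor recall over precision, consistent with the recall-optimized detector the abstract reports.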


Subjects
Deep Learning, Lemur, Lemuridae, Strepsirhini, Humans, Animals, Madagascar, Parks, Recreational, Acoustics, Mammals