Results 1 - 4 of 4
1.
Conserv Biol ; 35(2): 654-665, 2021 04.
Article in English | MEDLINE | ID: mdl-32537779

ABSTRACT

Collisions with buildings cause up to 1 billion bird fatalities annually in the United States and Canada. However, efforts to reduce collisions would benefit from studies conducted at large spatial scales across multiple study sites with standardized methods and consideration of species- and life-history-related variation and correlates of collisions. We addressed these research needs through coordinated collection of data on bird collisions with buildings at sites in the United States (35), Canada (3), and Mexico (2). We collected all carcasses and identified species. After removing records for unidentified carcasses, species lacking distribution-wide population estimates, and species with distributions overlapping fewer than 10 sites, we retained 269 carcasses of 64 species for analysis. We estimated collision vulnerability for 40 bird species with ≥2 fatalities based on their North American population abundance, distribution overlap with study sites, and sampling effort. Of the 10 species we identified as most vulnerable to collisions, some have been identified previously (e.g., Black-throated Blue Warbler [Setophaga caerulescens]), whereas others emerged for the first time (e.g., White-breasted Nuthatch [Sitta carolinensis]), possibly because we used a more standardized sampling approach than past studies. Building size and glass area were positively associated with the number of collisions for 5 of the 8 species with enough observations to analyze independently. Vegetation around buildings influenced collisions for only 1 of those 8 species (Swainson's Thrush [Catharus ustulatus]). Life history predicted collisions; numbers of collisions were greatest for migratory, insectivorous, and woodland-inhabiting species. Our results provide new insight into the species most vulnerable to building collisions, and therefore potentially in greatest need of conservation attention, and into species- and life-history-related variation and correlates of building collisions, information that can help refine collision management.
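The abstract does not describe the study's actual vulnerability model, so the sketch below shows only one rough way such an index could be formed: observed fatalities divided by the fatalities expected if collisions scaled solely with population abundance, site overlap, and sampling effort. The species rows are taken from the abstract, but every number, column name, and the expectation formula itself are invented placeholders for illustration, not the paper's data or method.

```python
# Hypothetical sketch of a relative collision-vulnerability index.
# NOT the study's model; all numbers below are made-up placeholders.
import pandas as pd

df = pd.DataFrame({
    "species": ["Setophaga caerulescens", "Sitta carolinensis", "Catharus ustulatus"],
    "fatalities": [12, 9, 30],          # observed collision carcasses (placeholder)
    "abundance": [2.4e6, 9.7e6, 5.6e7], # continental population estimate (placeholder)
    "sites_overlapped": [22, 35, 38],   # study sites within the species' range (placeholder)
    "effort_days": [1400, 2300, 2500],  # cumulative survey effort at those sites (placeholder)
})

# Null expectation: fatalities proportional to each species' sampled "exposure".
exposure = df["abundance"] * df["sites_overlapped"] * df["effort_days"]
df["expected"] = df["fatalities"].sum() * exposure / exposure.sum()

# Values above 1 indicate more collisions than exposure alone would predict.
df["vulnerability"] = df["fatalities"] / df["expected"]
print(df[["species", "vulnerability"]].sort_values("vulnerability", ascending=False))
```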


Subject(s)
Conservation of Natural Resources, Songbirds, Animals, Canada, Mexico, North America, United States
2.
Sensors (Basel) ; 21(17), 2021 Aug 24.
Article in English | MEDLINE | ID: mdl-34502588

ABSTRACT

In recent years, small unmanned aircraft systems (sUAS) have been used widely to monitor animals because of their customizability, ease of operation, ability to access difficult-to-navigate places, and potential to minimize disturbance to animals. Automatic identification and classification of animals in images acquired with a sUAS may solve critical problems such as monitoring large areas with high vehicle traffic for animals, for example to prevent animal-aircraft collisions at airports. In this research, we demonstrate automated identification of four animal species using deep learning classification models trained on sUAS-collected images. We used a sUAS mounted with visible-spectrum cameras to capture 1288 images of four animal species: cattle (Bos taurus), horses (Equus caballus), Canada Geese (Branta canadensis), and white-tailed deer (Odocoileus virginianus). We chose these animals because they were readily accessible and easily identifiable within aerial imagery, and because white-tailed deer and Canada Geese are considered aviation hazards. A four-class classification problem involving these species was developed from the acquired data using deep learning neural networks. We compared the performance of two deep neural network models: a convolutional neural network (CNN) and a deep residual network (ResNet). Results indicate that the 18-layer ResNet model, ResNet-18, may be an effective algorithm for classifying among these animals while using a relatively small number of training samples. The best ResNet architecture produced a 99.18% overall accuracy (OA) in animal identification and a Kappa statistic of 0.98; the highest OA and Kappa produced by the CNN were 84.55% and 0.79, respectively. These findings suggest that ResNet is effective at distinguishing among the four species tested and shows promise for classifying larger datasets of more diverse animals.
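For readers wanting a concrete starting point, the sketch below fine-tunes an ImageNet-pretrained ResNet-18 for a four-class image classifier using PyTorch and torchvision. The directory layout, image size, and hyperparameters are assumptions for illustration; the authors' exact training setup is not given in the abstract.

```python
# Minimal fine-tuning sketch (not the authors' code); assumes torchvision >= 0.13
# and a folder layout of data/train/<class_name>/*.jpg for the four species.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)   # assumed path
train_dl = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

# ImageNet-pretrained ResNet-18 with its final layer replaced for 4 classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 4)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)      # assumed hyperparameters

for epoch in range(10):                                        # small epoch count for illustration
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Overall accuracy and the Kappa statistic on a held-out set could then be computed with, for example, sklearn.metrics.cohen_kappa_score; whether the authors used that implementation is not stated in the abstract.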


Subject(s)
Deep Learning, Deer, Aircraft, Algorithms, Animals, Cattle, Horses, Neural Networks, Computer
3.
Database (Oxford) ; 2024, 2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39043628

ABSTRACT

Drones (unoccupied aircraft systems) have become effective tools for wildlife monitoring and conservation. Automated animal detection and classification using artificial intelligence (AI) can substantially reduce logistical and financial costs and improve drone surveys. However, compared with other fields, the lack of annotated animal imagery for training AI is a critical bottleneck to achieving accurate algorithm performance. To bridge this gap for drone imagery and help advance and standardize automated animal classification, we created the Aerial Wildlife Image Repository (AWIR), a dynamic, interactive database of annotated images captured from drone platforms using visible and thermal cameras. The AWIR provides the first open-access repository in which users can upload, annotate, and curate images of animals acquired from drones. The AWIR also provides annotated imagery and benchmark datasets that users can download to train AI algorithms to automatically detect and classify animals and to compare algorithm performance. The AWIR contains 6587 animal objects in 1325 visible and thermal drone images, predominantly of large birds and mammals of 13 species in open areas of North America. As contributors increase the taxonomic and geographic diversity of available images, the AWIR will open future avenues for AI research to improve animal surveys using drones for conservation applications. Database URL: https://projectportal.gri.msstate.edu/awir/.
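The abstract does not specify the AWIR's annotation export format, so the sketch below simply assumes a COCO-style JSON file (a common convention for detection datasets) and tallies annotated animal objects per species. The file name and JSON keys are assumptions for illustration, not the repository's documented interface.

```python
# Hypothetical sketch: summarize a downloaded COCO-style annotation file.
# The AWIR's real export format is not given in the abstract; the file name
# and keys used here are assumptions.
import json
from collections import Counter

with open("awir_benchmark_annotations.json") as f:   # assumed file name
    coco = json.load(f)

# Map category IDs (species) to names, then count annotated objects per species.
id_to_name = {c["id"]: c["name"] for c in coco["categories"]}
counts = Counter(id_to_name[a["category_id"]] for a in coco["annotations"])

print(f"{len(coco['images'])} images, {len(coco['annotations'])} annotated objects")
for species, n in counts.most_common():
    print(f"{species}: {n}")
```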


Subject(s)
Aircraft, Animals, Wild, Artificial Intelligence, Databases, Factual, Animals, Algorithms, Birds
4.
Sci Rep ; 13(1): 10385, 2023 06 27.
Article in English | MEDLINE | ID: mdl-37369669

ABSTRACT

Visible and thermal images acquired from drones (unoccupied aircraft systems) have substantially improved animal monitoring. Combining complementary information from both image types provides a powerful approach for automating the detection and classification of multiple animal species to augment drone surveys. We compared eight image-fusion methods using thermal and visible drone images, combined with two supervised deep learning models, to evaluate the detection and classification of white-tailed deer (Odocoileus virginianus), domestic cattle (Bos taurus), and domestic horses (Equus caballus). We classified visible and thermal images separately and compared the results with those of image fusion. Fused images provided minimal improvement for cattle and horses compared with visible images alone, likely because the size, shape, and color of these species made them conspicuous against the background. For white-tailed deer, which were typically cryptic against their backgrounds and often in shadow in visible images, the added information from thermal images improved detection and classification with fusion methods by 15 to 85%. Our results suggest that image fusion is ideal for surveying animals that are inconspicuous against their backgrounds, and our approach requires few image pairs for training compared with typical machine-learning methods. We discuss computational and field considerations for improving drone surveys using our fusion approach.
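The abstract does not name the eight fusion methods, so the sketch below shows only a generic baseline: a weighted (alpha-blend) fusion of one co-registered visible/thermal image pair with OpenCV. The file names, the assumption that the pair is already co-registered, and the 0.6/0.4 weights are illustrative choices, not the study's methods.

```python
# Generic baseline sketch, not one of the study's eight fusion methods:
# weighted blending of a co-registered visible/thermal image pair.
# File names and blend weights are assumptions for illustration.
import cv2

visible = cv2.imread("pair_0001_visible.jpg")                     # 3-channel color image (BGR in OpenCV)
thermal = cv2.imread("pair_0001_thermal.png", cv2.IMREAD_GRAYSCALE)

# Assume the pair is already co-registered; resize the thermal frame to match
# and replicate it to 3 channels so the arrays can be blended.
thermal = cv2.resize(thermal, (visible.shape[1], visible.shape[0]))
thermal_3c = cv2.cvtColor(thermal, cv2.COLOR_GRAY2BGR)

# Simple alpha blend: 60% visible detail, 40% thermal contrast.
fused = cv2.addWeighted(visible, 0.6, thermal_3c, 0.4, 0.0)
cv2.imwrite("pair_0001_fused.png", fused)
```

A fused image like this could then be fed to the same detection or classification model used for single-sensor imagery; the study's actual fusion and model pipeline is not detailed in the abstract.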


Subject(s)
Deer, Female, Animals, Cattle, Horses, Unmanned Aerial Devices, Aircraft