Results 1 - 6 of 6
1.
Conserv Biol; 35(2): 654-665, 2021 Apr.
Article in English | MEDLINE | ID: mdl-32537779

ABSTRACT

Collisions with buildings cause up to 1 billion bird fatalities annually in the United States and Canada. However, efforts to reduce collisions would benefit from studies conducted at large spatial scales across multiple study sites with standardized methods and consideration of species- and life-history-related variation and correlates of collisions. We addressed these research needs through coordinated collection of data on bird collisions with buildings at sites in the United States (35), Canada (3), and Mexico (2). We collected all carcasses and identified species. After removing records for unidentified carcasses, species lacking distribution-wide population estimates, and species with distributions overlapping fewer than 10 sites, we retained 269 carcasses of 64 species for analysis. We estimated collision vulnerability for 40 bird species with ≥2 fatalities based on their North American population abundance, distribution overlap in study sites, and sampling effort. Of 10 species we identified as most vulnerable to collisions, some have been identified previously (e.g., Black-throated Blue Warbler [Setophaga caerulescens]), whereas others emerged for the first time (e.g., White-breasted Nuthatch [Sitta carolinensis]), possibly because we used a more standardized sampling approach than past studies. Building size and glass area were positively associated with number of collisions for 5 of 8 species with enough observations to analyze independently. Vegetation around buildings influenced collisions for only 1 of those 8 species (Swainson's Thrush [Catharus ustulatus]). Life history predicted collisions; numbers of collisions were greatest for migratory, insectivorous, and woodland-inhabiting species. 
Our results provide new insight into the species most vulnerable to building collisions, and therefore potentially in greatest need of conservation attention, as well as into species- and life-history-related variation and correlates of building collisions, information that can help refine collision management.
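The vulnerability estimate described above weighs collision counts against population abundance, distribution overlap with study sites, and sampling effort. A minimal sketch of one such index, assuming a simple rate normalization (the formula and the values below are illustrative and not the study's actual statistical model):

```python
# Illustrative collision-vulnerability index: observed collisions scaled
# by population abundance, the number of study sites overlapping the
# species' distribution, and sampling effort at those sites. This simple
# normalization is an assumption for illustration, NOT the study's model.

def vulnerability_index(collisions, abundance, sites_overlapped, effort_days):
    """Collisions per million individuals per overlapping site-day."""
    exposure = abundance / 1_000_000 * sites_overlapped * effort_days
    return collisions / exposure

# Hypothetical species records with equal carcass counts
sparse_species = vulnerability_index(
    collisions=6, abundance=2_000_000, sites_overlapped=12, effort_days=21)
abundant_species = vulnerability_index(
    collisions=6, abundance=80_000_000, sites_overlapped=30, effort_days=21)

# Equal carcass counts imply higher vulnerability for the rarer,
# less-exposed species
print(sparse_species > abundant_species)
```

This is why a species with few carcasses can still rank among the most vulnerable: the same count against a much smaller exposed population yields a much higher per-capita rate.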




Subject(s)
Conservation of Natural Resources, Songbirds, Animals, Canada, Mexico, North America, United States
2.
Sensors (Basel); 21(17), 2021 Aug 24.
Article in English | MEDLINE | ID: mdl-34502588

ABSTRACT

In recent years, small unmanned aircraft systems (sUAS) have been used widely to monitor animals because of their customizability, ease of operation, ability to access difficult-to-navigate places, and potential to minimize disturbance to animals. Automatic identification and classification of animals in images acquired by a sUAS may solve critical problems such as monitoring large areas with high vehicle traffic for animals, for example to prevent animal-aircraft collisions at airports. In this research we demonstrate automated identification of four animal species using deep learning classification models trained on sUAS-collected images. We used a sUAS mounted with visible-spectrum cameras to capture 1288 images of four animal species: cattle (Bos taurus), horses (Equus caballus), Canada Geese (Branta canadensis), and white-tailed deer (Odocoileus virginianus). We chose these animals because they were readily accessible and easily identifiable within aerial imagery, and because white-tailed deer and Canada Geese are considered aviation hazards. A four-class classification problem involving these species was developed from the acquired data using deep neural networks. We compared the performance of two deep neural network models: convolutional neural networks (CNN) and deep residual networks (ResNet). Results indicate that the 18-layer ResNet model, ResNet-18, may be effective at classifying animals while using a relatively small number of training samples. The best ResNet architecture produced a 99.18% overall accuracy (OA) in animal identification and a Kappa statistic of 0.98. The highest OA and Kappa produced by the CNN were 84.55% and 0.79, respectively. These findings suggest that ResNet is effective at distinguishing among the four species tested and shows promise for classifying larger, more diverse animal datasets.
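The two metrics reported above, overall accuracy (OA) and the Kappa statistic, are both computed from a classifier's confusion matrix. A minimal sketch (the 4x4 matrix below is illustrative, not the study's data):

```python
# Overall accuracy and Cohen's kappa from a multi-class confusion matrix,
# the two metrics commonly reported for animal classifiers.

def overall_accuracy(cm):
    """Fraction of all samples on the matrix diagonal (correct)."""
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def cohens_kappa(cm):
    """Agreement corrected for chance: (po - pe) / (1 - pe)."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    po = sum(cm[i][i] for i in range(n)) / total  # observed agreement
    # expected chance agreement from row/column marginals
    pe = sum(
        sum(cm[i]) * sum(cm[r][i] for r in range(n))
        for i in range(n)
    ) / (total * total)
    return (po - pe) / (1 - pe)

# Hypothetical confusion matrix: rows = true class, cols = predicted
# (cattle, horse, goose, deer)
cm = [
    [30, 1, 0, 0],
    [1, 28, 0, 1],
    [0, 0, 32, 0],
    [0, 1, 0, 29],
]
print(overall_accuracy(cm), cohens_kappa(cm))
```

Kappa discounts the agreement a classifier would achieve by chance given the class frequencies, which is why it is reported alongside OA when classes are imbalanced.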


Subject(s)
Deep Learning, Deer, Aircraft, Algorithms, Animals, Cattle, Horses, Neural Networks (Computer)
3.
MethodsX; 13: 102933, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39286441

ABSTRACT

Thermal sensors mounted on drones (unoccupied aircraft systems) are popular and effective tools for monitoring cryptic animal species, although few studies have quantified the sampling error of animal counts from thermal images. Using decoys is one effective strategy to quantify bias and count accuracy; however, plastic decoys do not mimic the thermal signatures of the species they represent. Our objective was to produce heat signatures in animal decoys that realistically match thermal images of live animals obtained from a drone-based sensor. We tested commercially available methods to heat plastic decoys of three size classes, including chemical foot warmers; manually heated water; electric socks, pads, and blankets; and mini and small electric space heaters. We used criteria in two categories, 1) external temperature differences from ambient temperature (ambient difference) and 2) color bins from a palette in thermal images obtained from a drone near the ground and in the air, to determine whether heated decoys adequately matched the respective live animals in four body regions. Three methods achieved thermal signatures similar to live animals in three to four body regions in external temperature and predominantly matched the corresponding yellow color bins in thermal drone images from the ground and in the air. Pigeon decoys were best and most consistently heated with three foot warmers. Goose and deer decoys were best heated by mini and small space heaters, respectively, placed in their body cavities, with a heated sock in the head of the goose decoy. The materials and equipment for our best heating methods were relatively inexpensive, commercially available items that provide sustained heat and could be adapted to various shapes and sizes for a wide range of avian and mammalian species.
Our heating methods could be used in future studies to quantify bias and validate methodologies for drone surveys of animals with thermal sensors.
• We determined optimal heating methods for plastic animal decoys with inexpensive and commercially available equipment to mimic thermal signatures of live animals.
• Methods could be used to quantify bias and improve thermal surveys of animals with drones in future studies.
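The "ambient difference" criterion described above can be sketched as a per-region comparison: a decoy region matches when its rise above ambient temperature falls within some tolerance of the live animal's rise. The 2.0 °C tolerance and all readings below are illustrative assumptions, not values from the study:

```python
# Sketch of the "ambient difference" matching criterion. The tolerance
# and the temperature readings are hypothetical, NOT the study's values.

TOLERANCE_C = 2.0  # hypothetical matching tolerance

def ambient_difference(surface_c, ambient_c):
    """Rise of a surface temperature above ambient."""
    return surface_c - ambient_c

def regions_matched(decoy_temps, live_temps, ambient_c, tol=TOLERANCE_C):
    """Return the body regions where the decoy's ambient difference is
    within tol of the live animal's ambient difference."""
    matched = []
    for region, live_c in live_temps.items():
        decoy_c = decoy_temps.get(region)
        if decoy_c is None:
            continue
        if abs(ambient_difference(decoy_c, ambient_c)
               - ambient_difference(live_c, ambient_c)) <= tol:
            matched.append(region)
    return matched

# Hypothetical readings for a goose decoy vs. a live goose at 15 C ambient
live = {"head": 28.0, "body": 30.5, "wing": 24.0, "leg": 22.0}
decoy = {"head": 27.2, "body": 31.8, "wing": 19.5, "leg": 21.5}
print(regions_matched(decoy, live, ambient_c=15.0))
```

Note that because both readings are differenced against the same ambient value, the criterion effectively compares surface temperatures directly; expressing it as ambient differences keeps readings taken on different days comparable.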

4.
Database (Oxford); 2024, 2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39043628

ABSTRACT

Drones (unoccupied aircraft systems) have become effective tools for wildlife monitoring and conservation. Automated animal detection and classification using artificial intelligence (AI) can substantially reduce logistical and financial costs and improve drone surveys. However, the lack of annotated animal imagery for training AI, relative to other fields, is a critical bottleneck to achieving accurate algorithm performance. To bridge this gap for drone imagery and help advance and standardize automated animal classification, we have created the Aerial Wildlife Image Repository (AWIR), a dynamic, interactive database of annotated images captured from drone platforms using visible and thermal cameras. The AWIR provides the first open-access repository where users can upload, annotate, and curate images of animals acquired from drones. The AWIR also provides annotated imagery and benchmark datasets that users can download to train AI algorithms to automatically detect and classify animals, and to compare algorithm performance. The AWIR contains 6587 animal objects in 1325 visible and thermal drone images, predominantly of large birds and mammals of 13 species in open areas of North America. As contributors increase the taxonomic and geographic diversity of available images, the AWIR will open future avenues for AI research to improve animal surveys using drones for conservation applications. Database URL: https://projectportal.gri.msstate.edu/awir/.
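Annotated imagery for training detectors is typically stored as per-image records of labeled bounding boxes. A minimal sketch of such a record; this schema is illustrative only and is not the AWIR's actual format:

```python
# Hypothetical, minimal annotation record of the kind an image
# repository might serve for training detection models.
# This schema is illustrative and NOT the AWIR's actual format.
import json

annotation = {
    "image": "thermal_0425.tif",        # hypothetical file name
    "sensor": "thermal",                # "visible" or "thermal"
    "objects": [
        {"species": "Branta canadensis",
         "bbox": [412, 130, 36, 28]},   # x, y, width, height in pixels
        {"species": "Odocoileus virginianus",
         "bbox": [88, 301, 54, 61]},
    ],
}

# Serialize for storage or exchange, then restore
payload = json.dumps(annotation)
restored = json.loads(payload)
print(len(restored["objects"]))
```

Records like this are what make benchmark datasets directly consumable by training pipelines: each image maps to a list of (species, box) pairs that a detector learns to reproduce.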


Subject(s)
Aircraft, Wild Animals, Artificial Intelligence, Factual Databases, Animals, Algorithms, Birds
5.
Sci Rep; 13(1): 10385, 2023 Jun 27.
Article in English | MEDLINE | ID: mdl-37369669

ABSTRACT

Visible and thermal images acquired from drones (unoccupied aircraft systems) have substantially improved animal monitoring. Combining complementary information from both image types provides a powerful approach for automating the detection and classification of multiple animal species to augment drone surveys. We compared eight image fusion methods using thermal and visible drone images combined with two supervised deep learning models to evaluate the detection and classification of white-tailed deer (Odocoileus virginianus), domestic cattle (Bos taurus), and domestic horses (Equus caballus). We classified visible and thermal images separately and compared the results with those of image fusion. Fused images provided minimal improvement for cattle and horses compared with visible images alone, likely because the size, shape, and color of these species made them conspicuous against the background. For white-tailed deer, which were typically cryptic against their backgrounds and often in shadow in visible images, the added information from thermal images improved detection and classification in fusion methods from 15 to 85%. Our results suggest that image fusion is ideal for surveying animals that are inconspicuous against their backgrounds, and our approach requires few image pairs for training compared with typical machine-learning methods. We discuss computational and field considerations for improving drone surveys using our fusion approach.
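The simplest member of the image-fusion family is a per-pixel weighted average of co-registered visible and thermal images. The sketch below illustrates that baseline idea only; the eight methods compared in the study are more sophisticated, and the 0.5 weight is an illustrative choice:

```python
# Minimal pixel-level fusion sketch: a weighted average of co-registered
# visible and thermal grayscale images (nested lists, same shape).
# This baseline is for illustration and is NOT the study's method.

def fuse_weighted(visible, thermal, w_thermal=0.5):
    """Fuse two images by per-pixel weighted averaging."""
    if len(visible) != len(thermal):
        raise ValueError("images must have the same dimensions")
    fused = []
    for row_v, row_t in zip(visible, thermal):
        if len(row_v) != len(row_t):
            raise ValueError("images must have the same dimensions")
        fused.append([
            (1 - w_thermal) * v + w_thermal * t
            for v, t in zip(row_v, row_t)
        ])
    return fused

# A cryptic animal: dim in the visible band, bright in the thermal band.
# Fusion lifts the animal's pixels well above the background.
visible = [[40, 42], [41, 43]]
thermal = [[40, 200], [41, 210]]
fused = fuse_weighted(visible, thermal, w_thermal=0.5)
```

This toy example shows why fusion helps most for cryptic species: background pixels, similar in both bands, stay unchanged, while animal pixels that are bright only in the thermal band are pulled up in the fused image.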


Subject(s)
Deer, Female, Animals, Cattle, Horses, Unmanned Aerial Devices, Aircraft
6.
Environ Evid; 12(1): 3, 2023 Feb 13.
Article in English | MEDLINE | ID: mdl-39294790

ABSTRACT

BACKGROUND: Small unoccupied aircraft systems (UAS) are replacing or supplementing occupied aircraft and ground-based surveys in animal monitoring because of improved sensors, efficiency, costs, and logistical benefits. Numerous UAS platforms and sensors are available and have been used in various ways, yet justification for the platform, sensor, or methods chosen is not typically offered in the published literature. Furthermore, existing reviews do not adequately cover past and current UAS applications for animal monitoring, nor their associated UAS/sensor characteristics and environmental considerations. We present a systematic map that collects and consolidates evidence pertaining to UAS monitoring of animals.

METHODS: We investigated the current state of knowledge on UAS applications in terrestrial animal monitoring using an accurate, comprehensive, and repeatable systematic map approach. We searched relevant peer-reviewed and grey literature, as well as dissertations and theses, using online publication databases, Google Scholar, requests through a professional network of collaborators, and publicly available websites. We used a tiered approach to article exclusion; eligible studies were those that monitor (e.g., identify, count, estimate) terrestrial vertebrate animals. Extracted metadata concerning UAS, sensors, animals, methodology, and results were recorded in Microsoft Access. We queried and catalogued evidence in the final database to produce tables, figures, and geographic maps to accompany this full narrative review, answering our primary and secondary questions.

REVIEW FINDINGS: Our literature searches returned 5539 articles, of which 216 were included, with extracted metadata categories, in our database and narrative review. Studies exhibited exponential growth over time but levelled off between 2019 and 2021, and were conducted primarily in North America, Australia, and Antarctica. Each metadata category had major clusters and gaps, which are described in the narrative review.

CONCLUSIONS: Our systematic map provides a useful synthesis of current UAS-animal studies and identifies major knowledge clusters (well-represented subtopics that are amenable to full synthesis by a systematic review) and gaps (unreported or underrepresented topics that warrant additional primary research) that guide future research directions and UAS applications. The literature on using UAS to conduct animal surveys has expanded rapidly since its inception in 2006 but is still in its infancy. Since 2015, technological improvements and subsequent cost reductions have facilitated widespread research, often to validate UAS technology for surveying single species with descriptive statistics over limited spatial and temporal scales. Studies since the 2015 expansion have still generally focused on large birds or mammals in open landscapes of four countries, and regulations, such as maximum altitude and line-of-sight limitations, remain barriers to improved animal surveys with UAS. Critical knowledge gaps include the lack of (1) best practices for using UAS to conduct standardized surveys in general, (2) best practices for surveying whole wildlife communities in delineated areas, and (3) data on factors affecting bias in counting animals from UAS images. Promising advances include the use of thermal sensors in forested environments or nocturnal surveys and the development of automated or semi-automated machine-learning algorithms to accurately detect, identify, and count animals from UAS images.
