Results 1 - 8 of 8
1.
Brain Behav Evol; 2024 Apr 03.
Article in English | MEDLINE | ID: mdl-38569487

ABSTRACT

INTRODUCTION: Transitions in temporal niche have occurred many times over the course of mammalian evolution. These are associated with changes in the sensory stimuli available to animals, particularly visual cues, because levels of light are so much higher during the day than at night. This relationship between temporal niche and available sensory stimuli elicits the expectation that evolutionary transitions between diurnal and nocturnal lifestyles will be accompanied by modifications of sensory systems that optimize the ability of animals to receive, process, and react to important stimuli in the environment.

METHODS: This study examines the influence of temporal niche on investment in sensory brain tissue of 13 rodent species (five diurnal; eight nocturnal). Animals were euthanized and the brain immediately frozen on dry ice; the olfactory bulbs were subsequently dissected and weighed, and the remaining brain was weighed, sectioned, and stained. Stereo Investigator was used to calculate volumes of four sensory regions that function in processing visual (lateral geniculate nucleus, superior colliculus) and auditory (medial geniculate nucleus, inferior colliculus) information. A phylogenetic framework was used to assess the influence of temporal niche on the relative sizes of these brain structures and on olfactory bulb weights.

RESULTS: Compared to nocturnal species, diurnal species had larger visual regions, whereas nocturnal species had larger olfactory bulbs than their diurnal counterparts. Of the two auditory structures examined, one (medial geniculate nucleus) was larger in diurnal species, while the other (inferior colliculus) did not differ significantly with temporal niche.

CONCLUSION: Our results indicate a possible indirect association between temporal niche and auditory investment and suggest probable tradeoffs of investment between olfactory and visual areas of the brain, with diurnal species investing more in processing visual information and nocturnal species investing more in processing olfactory information.
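As an illustration of how such a relative-size comparison can be set up, the sketch below regresses log structure volume on log brain weight and contrasts the residuals of diurnal versus nocturnal species. It is a simplified, hypothetical example: the function name and inputs are ours, and it deliberately omits the phylogenetic correction that the study's framework applies.

```python
# Minimal, non-phylogenetic sketch (not the authors' analysis): relative size of a
# sensory region is taken as the residual from an allometric log-log fit against
# brain weight, and diurnal vs. nocturnal species are contrasted on those residuals.
import numpy as np

def relative_size_contrast(volumes, brain_weights, is_diurnal):
    """volumes, brain_weights: per-species measurements; is_diurnal: boolean flags."""
    log_v = np.log10(np.asarray(volumes, dtype=float))
    log_b = np.log10(np.asarray(brain_weights, dtype=float))
    diurnal = np.asarray(is_diurnal, dtype=bool)

    # Ordinary least-squares allometric fit: log(volume) ~ log(brain weight)
    slope, intercept = np.polyfit(log_b, log_v, deg=1)
    residuals = log_v - (slope * log_b + intercept)

    # Positive value -> the region is relatively larger in diurnal species
    return residuals[diurnal].mean() - residuals[~diurnal].mean()
```

A full reanalysis would fit this model with phylogenetic generalized least squares over the species tree rather than ordinary least squares.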

2.
Ecol Evol; 11(9): 4494-4506, 2021 May.
Article in English | MEDLINE | ID: mdl-33976825

ABSTRACT

A time-consuming challenge faced by camera trap practitioners is the extraction of meaningful data from images to inform ecological management. An increasingly popular solution is automated image classification software. However, most solutions are not sufficiently robust to be deployed on a large scale due to lack of location invariance when transferring models between sites. This prevents optimal use of ecological data, resulting in significant expenditure of time and resources to annotate and retrain deep learning models.

We present a method ecologists can use to develop optimized location-invariant camera trap object detectors by (a) evaluating publicly available image datasets characterized by high intradataset variability in training deep learning models for camera trap object detection and (b) using small subsets of camera trap images to optimize models for high-accuracy domain-specific applications.

We collected and annotated three datasets of images of striped hyena, rhinoceros, and pigs from the image-sharing websites FlickR and iNaturalist (FiN) to train three object detection models. We compared the performance of these models to that of three models trained on the Wildlife Conservation Society and Camera CATalogue datasets when tested on out-of-sample Snapshot Serengeti datasets. We then increased FiN model robustness by infusing small subsets of camera trap images into training.

In all experiments, the mean Average Precision (mAP) of the FiN-trained models was significantly higher (82.33%-88.59%) than that achieved by the models trained only on camera trap datasets (38.5%-66.74%). Infusion further improved mAP by 1.78%-32.08%.

Ecologists can use FiN images to train deep learning object detection solutions for camera trap image processing and develop location-invariant, robust, out-of-the-box software. Models can be further optimized by infusing 5%-10% camera trap images into the training data. This would allow AI technologies to be deployed on a large scale in ecological applications. Datasets and code related to this study are open source and available at this repository: https://doi.org/10.5061/dryad.1c59zw3tx.
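The "infusion" step described above amounts to adding a small, randomly drawn fraction of camera trap images to the FiN training set. The snippet below is an illustrative sketch of that idea, not the released Dryad code; the function name and arguments are ours.

```python
# Illustrative sketch of training-set infusion (not the study's released code):
# combine FiN image paths with a small random fraction of camera trap image paths.
import random

def infuse_training_set(fin_images, camera_trap_images, fraction=0.05, seed=42):
    """Return the FiN images plus roughly `fraction` of the camera trap images."""
    rng = random.Random(seed)
    k = round(len(camera_trap_images) * fraction)
    subset = rng.sample(list(camera_trap_images), k)
    return list(fin_images) + subset

# e.g. infuse_training_set(fin_paths, trap_paths, fraction=0.10) for 10% infusion
```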

3.
Sensors (Basel); 21(8), 2021 Apr 08.
Article in English | MEDLINE | ID: mdl-33917792

ABSTRACT

Image data is one of the primary sources of ecological data used in biodiversity conservation and management worldwide. However, classifying and interpreting large numbers of images is expensive in time and resources, particularly in the context of camera trapping. Deep learning models have been used to achieve this task but are often not suited to specific applications due to their inability to generalise to new environments and their inconsistent performance. Models need to be developed for specific species cohorts and environments, but the technical skills required to achieve this are a key barrier to the accessibility of this technology for ecologists. There is therefore a strong need to democratise access to deep learning technologies by providing an easy-to-use software application that allows non-technical users to train custom object detectors. U-Infuse addresses this issue by giving ecologists the ability to train customised models using publicly available images and/or their own images, without specific technical expertise. Auto-annotation and annotation-editing functionalities minimise the burden of manually annotating and pre-processing large numbers of images. U-Infuse is a free and open-source software solution that supports both multi-class and single-class training and object detection, giving ecologists access, on their own devices, to deep learning technologies usually available only to computer scientists, customised for their application and without sharing intellectual property or sensitive data. It provides ecological practitioners with the ability to (i) easily perform object detection within a user-friendly GUI, generating a species distribution report and other useful statistics, (ii) custom-train deep learning models using publicly available and custom training data, and (iii) perform supervised auto-annotation of images for further training, with the option of editing annotations to ensure quality datasets. Broad adoption of U-Infuse by ecological practitioners will improve ecological image analysis and processing by allowing significantly more image data to be processed with minimal expenditure of time and resources, particularly for camera trap images. Ease of training and the use of transfer learning mean that domain-specific models can be trained rapidly and frequently updated without the need for computer science expertise or data sharing, protecting intellectual property and privacy.
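Conceptually, the auto-annotation functionality amounts to running an existing detector over unlabelled images and keeping its confident predictions as draft annotations for the user to review and edit. The sketch below illustrates that idea with a hypothetical `detector` callable; it is not U-Infuse's actual API.

```python
# Conceptual sketch of auto-annotation (hypothetical `detector`, not U-Infuse's API):
# keep confident detections as draft annotations that a user can later edit.
def auto_annotate(image_paths, detector, min_confidence=0.8):
    """detector(path) is assumed to return a list of (label, confidence, bbox) tuples."""
    drafts = {}
    for path in image_paths:
        detections = detector(path)
        drafts[path] = [d for d in detections if d[1] >= min_confidence]
    return drafts  # {image_path: [(label, confidence, bbox), ...]}
```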

4.
Mycologia; 112(6): 1075-1085, 2020.
Article in English | MEDLINE | ID: mdl-32678700

ABSTRACT

Rodents are the most widespread and diverse order of vertebrate mycophagists and are key to the dispersal of mycorrhizal fungi. Rodents consume and subsequently disperse fungi through their feces on every continent except Antarctica. This study examines the fungal taxa consumed by the Hastings River mouse (Pseudomys oralis), an endangered Australian endemic rodent of the family Muridae. We analyzed 251 fecal samples collected over a 19-year period between 1993 and 2012 at sites throughout the species' distribution in New South Wales and Queensland. We show that at least 16 genera of mycorrhizal fungi are eaten by this species and that it therefore plays an important role as a vector of ectomycorrhizal truffle-like fungi in eastern Australia. As in the fungal diets of other mammals in eastern Australia, fungal consumption was greatest in autumn and winter. The dietary diversity of P. oralis also appeared to follow a south-to-north geographic trend: samples from sites in the southern part of the species' range yielded more fungal taxa, and greater dietary diversity, than those from northern sites. This study provides novel insights into the diet of P. oralis and highlights the importance of previously overlooked ecosystem services this species provides through its dispersal of mycorrhizal fungi.
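For readers wanting to see the shape of the seasonal comparison, the snippet below is a minimal tally of fungal genus richness per season from per-sample records. It is an illustrative sketch only; the record format and function name are ours, not the study's analysis code.

```python
# Illustrative tally (not the study's analysis code): number of distinct fungal
# genera detected per season, from (sample_id, season, genus) records.
from collections import defaultdict

def genus_richness_by_season(records):
    """records: iterable of (sample_id, season, genus) tuples."""
    seen = defaultdict(set)
    for _sample_id, season, genus in records:
        seen[season].add(genus)
    return {season: len(genera) for season, genera in seen.items()}
```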


Subjects
Feces/microbiology, Fungi/classification, Fungi/isolation & purification, Mice/microbiology, Mycorrhizae/isolation & purification, Animals, Biodiversity, Diet, Ecosystem, Endangered Species, Female, Fungi/genetics, Male, Mycorrhizae/classification, Mycoses/transmission, New South Wales, Queensland, Rivers
5.
Animals (Basel); 10(1), 2019 Dec 27.
Article in English | MEDLINE | ID: mdl-31892236

ABSTRACT

We present ClassifyMe, a software tool for the automated identification of animal species from camera trap images. ClassifyMe is intended to be used by ecologists both in the field and in the office. Users can download a pre-trained model specific to their location of interest and then upload images from a camera trap to a laptop or workstation. ClassifyMe will identify animals and other objects (e.g., vehicles) in the images, provide a report file with the most likely species detections, and automatically sort the images into sub-folders corresponding to these species categories. False triggers (no visible object present) are also filtered and sorted. Importantly, ClassifyMe operates on the user's local machine (their own laptop or workstation), not via an internet connection. This gives users access to state-of-the-art camera trap computer vision software in situ, rather than only in the office. The software also incurs minimal cost for the end user, as there is no need for expensive data uploads to cloud services. Furthermore, processing images locally on the user's own device keeps the data under their control and resolves privacy issues surrounding transfer and third-party access to their datasets.
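The sorting behaviour described above can be pictured as follows. This is a sketch under our own assumptions, with a hypothetical `classify` callable standing in for ClassifyMe's detector; it is not the tool's actual code or API.

```python
# Sketch of sorting classified images into species sub-folders (hypothetical
# `classify` function, not ClassifyMe's API). Images with no detection are
# routed to a false-trigger folder.
import shutil
from pathlib import Path

def sort_into_species_folders(image_paths, classify, output_dir,
                              false_trigger_label="false_trigger"):
    """classify(path) is assumed to return the most likely species label, or None."""
    out = Path(output_dir)
    for path in image_paths:
        label = classify(path) or false_trigger_label
        dest = out / label
        dest.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, dest)  # copy rather than move, so originals are preserved
```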

6.
Ecol Evol; 6(10): 3216-25, 2016 May.
Article in English | MEDLINE | ID: mdl-27096080

ABSTRACT

Camera trapping is widely used in ecological studies. It is often considered nonintrusive simply because animals are not captured or handled. However, the emission of light and sound from camera traps can be intrusive. We evaluated the daytime and nighttime behavioral responses of four mammalian predators to camera traps in road-based, passive (no bait) surveys, in order to determine how this might affect ecological investigations. Wild dogs, European red foxes, feral cats, and spotted-tailed quolls all exhibited behaviors indicating that they noticed camera traps. Their recognition of camera traps was more likely when animals were approaching the device than when they were walking away from it. Some individuals of each species retreated from camera traps and some moved toward them, with negative behaviors slightly more common during the daytime. There was no consistent response to camera traps within species; both attraction and repulsion were observed. Camera trapping is clearly an intrusive sampling method for some individuals of some species. This may limit the utility of conclusions about animal behavior obtained from camera trapping. Similarly, behavioral responses to camera traps could affect detection probabilities, introducing as yet unmeasured biases into camera trap abundance surveys. These effects demand consideration when using camera traps in ecological research and should prompt further work to quantify the associated biases in detection probabilities.

7.
PLoS One; 9(10): e110832, 2014.
Article in English | MEDLINE | ID: mdl-25354356

ABSTRACT

Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. Variability between camera trap models and the methods used with them is considerable, and little is known about how animals respond to camera trap emissions. Some animals have been reported to respond to camera traps; because such responses are often undesirable in research, it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured the ultrasonic (n = 5) and infrared illumination (n = 7) outputs of a subset of the camera trap models. We then compared the perceptive hearing ranges (n = 21) and assessed the visual ranges (n = 3) of mammal species (where data existed) to determine whether animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive hearing range of most mammals and produce illumination that can be seen by many species.
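The comparison at the heart of the study can be reduced to a simple interval-overlap check between a device's acoustic emission band and a species' perceptive hearing range. The sketch below shows that check; the function and its inputs are illustrative, and no values from the paper are embedded.

```python
# Simple interval-overlap check (illustrative; values are caller-supplied, not
# measurements from the paper): is a camera trap's sound output within a species'
# perceptive hearing range?
def emission_audible(emission_hz, hearing_range_hz):
    """emission_hz and hearing_range_hz are (low, high) frequency tuples in Hz."""
    e_low, e_high = emission_hz
    h_low, h_high = hearing_range_hz
    return e_low <= h_high and h_low <= e_high  # True if the intervals overlap
```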


Subjects
Mammals/physiology, Photography/instrumentation, Video Recording/instrumentation, Animals, Hearing, Photography/methods, Video Recording/methods, Ocular Vision
8.
Conserv Biol; 28(2): 572-9, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24283832

ABSTRACT

The taxonomic uniqueness of island populations is often uncertain, which hinders effective prioritization for conservation. The Christmas Island shrew (Crocidura attenuata trichura) is the only member of the highly speciose eutherian family Soricidae recorded from Australia. It is currently classified as a subspecies of the Asian gray or long-tailed shrew (C. attenuata), although it was originally described as a subspecies of the southeast Asian white-toothed shrew (C. fuliginosa). The Christmas Island shrew is currently listed as endangered and has not been recorded in the wild since 1984-1985, when two specimens were collected after an 80-year absence of records. We aimed to obtain DNA sequence data for cytochrome b (cytb) from Christmas Island shrew museum specimens to determine their taxonomic affinities and to confirm the identity of the 1980s specimens. The cytb sequences from five specimens collected in 1898 and one collected in 1985 were identical. In addition, the Christmas Island shrew cytb sequence was divergent at the species level from all available Crocidura cytb sequences. Current evidence therefore suggests that, rather than being a population of a widespread species, the Christmas Island shrew is a critically endangered endemic species, C. trichura, and a high priority for conservation. Because the decisions typically required to save declining species can be delayed or deferred when the taxonomic status of the population in question is uncertain, it is hoped that the history of the Christmas Island shrew will encourage clarification of taxonomy to be seen as an important first step in initiating informed and effective conservation action.
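As a rough illustration of how sequence divergence of this kind is quantified, the snippet below computes an uncorrected p-distance between two aligned cytb sequences. It is a minimal sketch, not the study's pipeline; in practice, corrected distance models and tree-based methods would be used alongside such a measure.

```python
# Minimal sketch (not the study's pipeline): uncorrected p-distance between two
# aligned DNA sequences, skipping sites with gaps or ambiguous bases.
def p_distance(seq1, seq2):
    """seq1, seq2: aligned sequences of equal length (strings)."""
    valid_bases = set("ACGT")
    compared = differences = 0
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a in valid_bases and b in valid_bases:
            compared += 1
            if a != b:
                differences += 1
    return differences / compared if compared else float("nan")
```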


Subjects
Biodiversity, Conservation of Natural Resources, Shrews/classification, Shrews/genetics, Animals, Australia, Cytochromes b/genetics, Endangered Species, Indian Ocean Islands, Molecular Sequence Data, Phylogeny, Polymerase Chain Reaction, DNA Sequence Analysis