Results 1 - 20 of 191
1.
AIDS Behav ; 28(9): 2941-2949, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38780868

ABSTRACT

The primary goal of antiretroviral treatment is to improve the health of individuals with HIV, and a secondary goal is to prevent further transmission. In 2016, Rwanda adopted the World Health Organization's "treat-all" approach in combination with the differentiated service delivery (DSD) model. The model's goal was to shorten the time from HIV diagnosis to treatment initiation, regardless of the CD4 T-cell count. This study sought to identify perceptions, enablers, and challenges associated with DSD model adoption among people living with HIV (PLHIV). The study included selected health centers in Kigali city, Rwanda, between August and September 2022. The patients included were those exposed to the new HIV care (DSD) model and those exposed to the previous model who had transitioned to the current model. Interviews and focus group discussions were held to obtain views and opinions on the DSD model. The data were collected via questionnaires and audio-recorded focus group discussions and were subsequently analyzed. The study identified several themes, including participants' initial emotions about a new HIV diagnosis, disclosure, experiences with transitioning to the DSD model, the effect of peer education, and barriers to and facilitators of the DSD model. Participants appreciated the reduced clinic visits under the DSD model but faced challenges with the transition and with peer educator mobility. The DSD model reduces waiting times, educates patients, and aligns with national goals. The identified barriers call for training and improved peer educator retention. Recommendations include enhancing the DSD model and future research to evaluate its long-term impact and cost-effectiveness.


Subject(s)
Delivery of Health Care, Focus Groups, HIV Infections, Humans, Rwanda/epidemiology, HIV Infections/psychology, HIV Infections/drug therapy, HIV Infections/therapy, Male, Female, Adult, Delivery of Health Care/organization & administration, Middle Aged, Surveys and Questionnaires, Qualitative Research, CD4 Lymphocyte Count, Anti-HIV Agents/therapeutic use
2.
AIDS Behav ; 28(2): 583-590, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38127168

ABSTRACT

Multi-month dispensing (MMD) has been widely adopted by national HIV programs as a key strategy for improving the quality of HIV care and treatment services while meeting the unique needs of diverse client populations. We assessed the clinical outcomes of clients receiving MMD in Kenya by conducting a retrospective cohort study using routine programmatic data from 32 government health facilities. We included clients who were eligible for multi-month antiretroviral therapy (ART) dispensing for ≥ 3 months (≥ 3MMD) according to national guidelines. The primary exposure was enrollment into ≥ 3MMD. The outcomes were loss to follow-up (LTFU) and viral rebound. Multilevel modified-Poisson regression models with robust standard errors were used to compare clinical outcomes between clients enrolled in ≥ 3MMD and those receiving ART dispensing for less than 3 months (< 3MMD). A total of 3,501 clients eligible for ≥ 3MMD were included in the analysis, of whom 65% were enrolled in ≥ 3MMD at entry into the cohort. There was no difference in LTFU of ≥ 180 days between the two types of care (aRR 1.1, 95% CI 0.7-1.6), while ≥ 3MMD was protective against viral rebound (aRR 0.1, 95% CI 0.0-0.2). As more diverse client-focused service delivery models are being implemented, robust evaluations are essential to guide implementation, monitor progress, and assess acceptability and effectiveness in delivering optimal people-centered care.
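[Illustrative note] The adjusted risk ratios above come from modified-Poisson models, i.e. Poisson regression on a binary outcome with robust standard errors. The minimal sketch below shows one common way to fit such a model; the data frame, column names (rebound, mmd3plus, age, facility) and the use of cluster-robust errors as a stand-in for the multilevel structure are assumptions for illustration, not the authors' code or data.

```python
# Illustrative sketch only: modified-Poisson regression (Poisson model on a
# binary outcome with cluster-robust standard errors) to estimate adjusted
# risk ratios, as described in the abstract above. All data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "rebound":  rng.binomial(1, 0.05, n),   # binary outcome (viral rebound)
    "mmd3plus": rng.binomial(1, 0.65, n),   # exposure: >=3-month dispensing
    "age":      rng.normal(38, 10, n),      # example adjustment covariate
    "facility": rng.integers(0, 32, n),     # clustering unit (health facility)
})

# Poisson regression with robust errors yields risk ratios rather than odds
# ratios; clustering on facility roughly stands in for the multilevel structure.
fit = smf.poisson("rebound ~ mmd3plus + age", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["facility"]}, disp=0)
print(np.exp(fit.params))       # exp(coefficients) = adjusted risk ratios
print(np.exp(fit.conf_int()))   # 95% CIs on the risk-ratio scale
```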


Subject(s)
Anti-HIV Agents, HIV Infections, Humans, Retrospective Studies, Anti-HIV Agents/therapeutic use, HIV Infections/drug therapy, Kenya/epidemiology, Cohort Studies
3.
J Dairy Sci ; 106(2): 1065-1077, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36543638

ABSTRACT

Hoof overgrowth is associated with poor conformation, an altered weight-bearing surface, and a reduction in the hoof's anatomic and functional integrity. As a result of housing systems that promote hoof overgrowth, hoof trimming is considered a priority in dairy goats. However, there are few data on the effects of the timing of first trimming on hoof conformation and growth rate. The aims of this study were (1) to evaluate the long-term effects of 2 different hoof trimming start times and (2) to investigate the pattern of hoof growth across the first 2 yr of life. Eighty 5-mo-old female Saanen-cross commercially housed dairy goats were allocated randomly to 1 of 2 treatments: (1) early trimmed (trimming beginning at 5 mo old; hooves were trimmed every 4 mo thereafter) and (2) late trimmed (trimming beginning at 13 mo old; hooves were trimmed every 4 mo thereafter). Using a combination of photographs and radiographs, hoof conformation, joint positions, and hoof wall length were assessed before the 13- and 25-mo trimming events. Hoof growth was assessed every 12 wk using caliper measurements. Overall, starting hoof trimming earlier had minor and inconsistent effects. However, detrimental changes in conformation and joint positions occurred between trimming events, particularly in the hind hooves, regardless of trimming treatment. At both assessments, there was a high percentage of overgrown toes and dipped heels, with the hind hooves being more affected compared with the front (overgrown toes at 13 mo, 97.1 vs. 79.1 ± 5.2%; overgrown toes at 25 mo, 91.7 vs. 56.3 ± 6.7%; dipped heels at 13 mo, 98.5 vs. 19.3 ± 5.0%; dipped heels at 25 mo, 88.3 vs. 4.9 ± 4.8%). In addition, at both assessments, the distal interphalangeal joint angle was greater in the hind hooves compared with the front (13 mo, 79.5 vs. 65.2 ± 1.7°; 25 mo, 79.0 vs. 66.7 ± 0.9°), whereas heel angles were less in the hind hooves compared with the front (13 mo, 41.8 vs. 57.1 ± 1.5°; 25 mo, 44.9 vs. 55.9 ± 1.1°). On average, the front hooves grew 4.39 mm/mo and the hind hooves grew 4.20 mm/mo. Early trimming did not have consistent effects on hoof growth rate. Importantly, our results suggest that trimming every 4 mo is not sufficient to prevent hoof overgrowth, the development of poor conformation, and detrimental changes in joint positions, particularly in the hind hooves. Furthermore, the detrimental changes may have masked any long-term treatment effects. Therefore, trimming frequency and age of first trimming should be considered when devising hoof care protocols for dairy goats housed in environments that do not offer opportunities for natural hoof wear.


Subject(s)
Cattle Diseases, Goat Diseases, Hoof and Claw, Female, Animals, Cattle, Hoof and Claw/surgery, New Zealand, Weight-Bearing, Goats
4.
Epidemiol Infect ; 149: e80, 2021 03 25.
Article in English | MEDLINE | ID: mdl-33762052

ABSTRACT

This study aimed to identify an appropriate simple mathematical model to fit the number of coronavirus disease 2019 (COVID-19) cases at the national level for the early portion of the pandemic, before significant public health interventions could be enacted. The total number of cases over time for the COVID-19 epidemic in 28 countries was analysed and fit to several simple rate models. The resulting model parameters were used to extrapolate projections for more recent data. While the Gompertz growth model (mean R2 = 0.998) best fit the current data, uncertainties in the eventual case limit introduced significant model errors. However, the quadratic rate model (mean R2 = 0.992) fit the current data best for 25 (89%) countries, as determined by the R2 values of the remaining models. Projection to the future using the simple quadratic model accurately forecast the total number of future cases 50% of the time up to 10 days in advance. Extrapolation to the future with the simple exponential model significantly overpredicted the total number of future cases. These results demonstrate that accurate future predictions of the case load in a given country can be made using this very simple model.
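[Illustrative note] As an illustration of the kind of model comparison described above, the sketch below fits exponential, quadratic and Gompertz curves to a cumulative case series with scipy and compares R2 values. The functional forms are standard textbook versions and the case series is synthetic, so the parameters and starting values are assumptions rather than the study's.

```python
# Illustrative sketch: fitting simple growth models (exponential, quadratic,
# Gompertz) to a cumulative case series and comparing R^2, in the spirit of
# the study above. The case series is synthetic, not real surveillance data.
import numpy as np
from scipy.optimize import curve_fit

def exponential(t, a, b):
    return a * np.exp(b * t)

def quadratic(t, a, b, c):
    return a * t**2 + b * t + c

def gompertz(t, K, b, c):
    # K = eventual case limit; b and c shape the onset and growth rate
    return K * np.exp(-b * np.exp(-c * t))

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

t = np.arange(40, dtype=float)          # days since an arbitrary starting point
cases = gompertz(t, 50000, 4.0, 0.12)   # synthetic cumulative case counts

for name, f, p0 in [("exponential", exponential, (1000.0, 0.1)),
                    ("quadratic",   quadratic,   (1.0, 1.0, 1.0)),
                    ("gompertz",    gompertz,    (60000.0, 4.0, 0.1))]:
    params, _ = curve_fit(f, t, cases, p0=p0, maxfev=20000)
    print(f"{name:>11}: R^2 = {r_squared(cases, f(t, *params)):.4f}")
```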


Subject(s)
COVID-19/diagnosis, Logistic Models, Theoretical Models, COVID-19/epidemiology, Europe/epidemiology, Humans, Pandemics/prevention & control
5.
N Z Vet J ; 68(2): 84-91, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31607211

ABSTRACT

Aims: To characterise and classify wounds in sheep suspected to have been caused by attacks by kea (Nestor notabilis) (kea strike), and to report the prevalence of these wounds on five high country farms in the South Island of New Zealand. Methods: Data were collected from farms between 28 August 2012 and 20 September 2013. Sheep were examined opportunistically immediately after shearing for signs of wounds caused by kea. The age and sex of sheep were also recorded. Wounds were measured and characterised as recent, healing, or healed, and the estimated true prevalence was calculated for each farm. Results: Injuries consistent with kea strike wounds were identified in 70/13,978 (0.5%) sheep examined. The estimated true prevalence varied between farms, from 0 (95% CI = 0-0.16) to 1.25 (95% CI = 0.97-1.61)%. Of the 76 wounds identified, 61 (80%) were located in the lumbar region, and 74 (97%) consisted of full-thickness ulceration of the skin; one showed evidence of injury to muscle and one to bone. The median length of the 63 wounds measured was 6 (min 1, max 23.5) cm, and 10/63 (13%) were categorised as healed, 47/63 (62%) as healing, and 17/63 (22%) as recent wounds. Conclusions: The results of this study show that kea strike on sheep was occurring at a low prevalence on the high country farms surveyed. The wounds identified were survivable, but the welfare impact of kea strike on sheep should be considered in balance with the conservation status of kea. There was clear variation in the prevalence of wounds attributed to kea strike between the farms but we were not able to identify the risk factors contributing to these differences. Future studies of kea strike should examine variables such as altitude, local kea density and distribution, and differences in kea strike management and husbandry practices, and should include high country farms without a history of kea strike.


Subject(s)
Bites and Stings, Parrots/physiology, Sheep/injuries, Wounds and Injuries/veterinary, Animals, New Zealand/epidemiology, Prevalence, Wounds and Injuries/epidemiology, Wounds and Injuries/pathology
6.
Asian-Australas J Anim Sci ; 33(11): 1848-1857, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32054195

ABSTRACT

OBJECTIVE: An experiment was designed to determine if behaviour traits expressed by twin- and triplet-born lambs and their dams at 3 to 18 hours of age (after the immediate ewe-lamb bonding had occurred) were associated with lamb survival to weaning. METHODS: The behaviour of twin and triplet lambs and their dams was assessed in the paddock at 3 to 18 hours after birth. The number of high- and low-pitched bleats and the times to stand, make contact with the dam, suck from the dam and follow the dam were recorded for each lamb. The maternal behaviour score of each dam was assessed. A random sub-sample of lambs was assessed during a maternal-recognition test at 12 or 24 hours of age. Traits included time spent standing, sitting and walking, time taken to reach the ewes and time spent with the ewes, as well as the number of high- and low-pitched bleats emitted by the lamb. RESULTS: In the paddock, for each additional second required for twin-born lambs to follow their dam, lambs were 1.004 (95% confidence interval [CI] 1.000 to 1.008) times more likely to survive to weaning (p<0.05). The opposite relationship, however, was seen in triplet lambs. For each additional second required for triplet-born lambs to follow their dam, lambs were 0.996 (95% CI 0.993 to 0.999) times as likely to survive to weaning (p<0.05). During the maternal recognition test, twin-born lambs were 0.989 (95% CI 0.979 to 1.000) times as likely to survive to weaning for every additional second they took to reach the contact zone (p<0.05). Similarly, triplet-born lambs were 0.994 (95% CI 0.989 to 0.999) times as likely to survive for every additional second they took to reach their dam (p<0.05). CONCLUSION: All ewe behaviours and the majority of lamb paddock and test behaviours were not associated with the survival of twin- or triplet-born lambs and, therefore, are of little use as indicators of lamb survival to weaning.
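[Illustrative note] The per-second estimates quoted above (e.g. 1.004 times as likely to survive per additional second) are the kind of effect sizes produced by regressing a binary survival outcome on a behaviour time. The abstract does not state the exact model, so the sketch below simply shows how a per-second odds ratio and 95% CI can be obtained from a logistic regression; the data frame and column names are hypothetical.

```python
# Illustrative sketch (not the authors' model): logistic regression of lamb
# survival to weaning on time taken to follow the dam, reporting the per-second
# odds ratio and its 95% CI. Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "survived":       rng.binomial(1, 0.85, n),                  # survival to weaning
    "follow_seconds": rng.gamma(shape=2.0, scale=60.0, size=n),  # time to follow dam (s)
    "birth_rank":     rng.choice(["twin", "triplet"], n),
})

twins = df[df["birth_rank"] == "twin"]
fit = smf.logit("survived ~ follow_seconds", data=twins).fit(disp=0)
print(np.exp(fit.params["follow_seconds"]))          # odds ratio per extra second
print(np.exp(fit.conf_int().loc["follow_seconds"]))  # 95% CI for that odds ratio
```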

7.
Biol Lett ; 14(4)2018 04.
Article in English | MEDLINE | ID: mdl-29618521

ABSTRACT

Almost all mammals communicate using sound, but few species produce complex songs. Two baleen whales sing complex songs that change annually, though only the humpback whale (Megaptera novaeangliae) has received much research attention. This study focuses on the other baleen whale singer, the bowhead whale (Balaena mysticetus). Members of the Spitsbergen bowhead whale population produced 184 different song types over a 3-year period, based on duty-cycled recordings from a site in Fram Strait in the northeast Atlantic. Distinct song types were recorded over short periods, lasting at most some months. This song diversity could be the result of population expansion, or immigration of animals from other populations that are no longer isolated from each other by heavy sea ice. However, this explanation does not account for the within season and annual shifting of song types. Other possible explanations for the extraordinary diversity in songs could be that it results either from weak selection pressure for interspecific identification or for maintenance of song characteristics or, alternatively, from strong pressure for novelty in a small population.


Subject(s)
Bowhead Whale/physiology, Animal Vocalization/physiology, Animals, Atlantic Ocean, Seasons, Svalbard, Time
8.
J Dairy Sci ; 101(5): 4491-4497, 2018 May.
Article in English | MEDLINE | ID: mdl-29477516

ABSTRACT

Numerical rating scales are frequently used in gait scoring systems as indicators of lameness in dairy animals. The gait scoring systems commonly used in dairy goats are based on 4-point scales that focus on detecting and judging the severity of a definite limp. An uneven gait, such as a shortened stride or not "tracking up," is arguably the precursor to the development of a limp; thus, identifying such changes in gait could provide an opportunity for early treatment. The objectives of this study were (1) to develop a 5-point gait scoring system that included an "uneven gait" category and compare the distribution of gait scores generated using this system to scores generated using a 4-point system, and (2) to determine whether this system could be reliably used. Forty-eight Saanen-cross 2- and 3-yr-old lactating does were enrolled from a commercial dairy goat farm. Two observers carried out weekly live gait scoring sessions for 7 wk using the developed 5-point scoring system. The first 2 wk were used as training sessions (training sessions 1-2), with the subsequent 5 wk completed as gait assessments (assessments 1-5). In addition to training session 1 being live scored, the goats were also video-recorded. This allowed observer 1 to re-score the session 4 times: twice using the developed 5-point system and twice using the previously used 4-point system. Comparisons of score distributions could then be made. Using the 4-point system, 81% of the goats were assigned score 1 (normal gait). Using the 5-point system, only 36% of the goats were assigned score 1 (normal gait), with 50% assigned score 2 (uneven gait). High levels of intra-observer reliability were achieved by observer 1 using both gait scoring systems [weighted kappa (κw) = 1.00 for the 4-point and κw = 0.96 for the 5-point system]. At training session 1 (wk 1), inter-observer reliability was only moderate (κw = 0.54), but this was improved during the subsequent training session 2 (κw = 0.89). Inter-observer reliability was high among assessments 1 to 5 (κw = 0.90-1.00). During the training sessions, sensitivity for gait scores 1 and 2 was 77 and 65% (training session 1) and 89 and 94% (training session 2), respectively. Sensitivity was high among assessments 1 to 5 (score 1: 83-100%, score 2: 97-100%). This highlights the likely reason why existing gait scoring systems for dairy goats do not include an "uneven gait" category, as distinguishing it from a normal gait was challenging without training. In conclusion, with training, a 5-point gait scoring system could be reliably used. The 5-point system was found to be more sensitive than the 4-point system, allowing for a potential precursor to lameness to be identified. Further work is needed to determine whether the score can be reliably used in an on-farm setting.
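[Illustrative note] Intra- and inter-observer reliability above is summarised with weighted kappa (κw). A minimal sketch of computing weighted kappa for two sets of ordinal gait scores is shown below using scikit-learn; the score vectors are invented, and the abstract does not say whether linear or quadratic weights were used, so both are shown.

```python
# Illustrative sketch: weighted Cohen's kappa for agreement between two sets of
# ordinal gait scores (1-5), as used to report reliability in the study above.
# The scores are invented; the weighting scheme actually used is not stated.
from sklearn.metrics import cohen_kappa_score

scores_a = [1, 2, 2, 3, 1, 2, 4, 2, 1, 3]   # e.g. observer 1, live scoring
scores_b = [1, 2, 3, 3, 1, 2, 4, 2, 2, 3]   # e.g. observer 2, same goats

# Linear weights penalise disagreement in proportion to distance on the scale;
# quadratic weights penalise larger gaps more heavily.
print(cohen_kappa_score(scores_a, scores_b, weights="linear"))
print(cohen_kappa_score(scores_a, scores_b, weights="quadratic"))
```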


Subject(s)
Dairying/methods, Gait/physiology, Goat Diseases/diagnosis, Animal Lameness/diagnosis, Animals, Farms, Female, Goats, Lactation, Observer Variation, Reproducibility of Results, Video Recording
9.
Am J Transplant ; 17(2): 551-556, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27458893

ABSTRACT

Renal transplant has become an important option for human immunodeficiency virus (HIV)-infected patients with end-stage renal disease; however, these patients experience a high rate of acute cellular rejection (ACR). Guidelines do not currently exist for the optimal duration of viral suppression prior to transplantation. In a retrospective cohort analysis of 47 HIV-infected renal transplant recipients, we compared the rate of ACR between patients according to the duration of viral suppression prior to transplantation. Of the patients who achieved viral suppression for >6 months but <2 years prior to transplantation (n = 15), 60% experienced ACR, compared to 41% of patients suppressed for 2 years or more (n = 32) prior to transplant (p = 0.21). Patients suppressed <2 years experienced ACR at 2.48 times the rate of those suppressed 2 years or longer. Induction immunosuppression, HLA mismatch and panel-reactive antibodies (PRAs) did not significantly differ between the two groups.


Subject(s)
Anti-HIV Agents/therapeutic use, Graft Rejection/etiology, HIV Infections/complications, HIV-1/pathogenicity, Chronic Kidney Failure/surgery, Kidney Transplantation/adverse effects, Adult, Aged, Female, Follow-Up Studies, Graft Rejection/diagnosis, Graft Survival, HIV Infections/drug therapy, HIV Infections/virology, Humans, Chronic Kidney Failure/epidemiology, Male, Middle Aged, Postoperative Complications, Prognosis, Retrospective Studies, Risk Factors
10.
Med Vet Entomol ; 29(2): 171-7, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25604709

ABSTRACT

The cat flea, Ctenocephalides felis felis (Bouche, 1835) (Siphonaptera: Pulicidae), which is found worldwide and which parasitizes many species of wild and domestic animal, is a vector and/or reservoir of bacteria, protozoa and helminths. To aid in the study of the physiology and behaviour of fleas and of their transmission of pathogens, it would be of value to improve the laboratory rearing of pathogen-free fleas. The conditions under which fleas are artificially reared at the University of Bristol (U.K.) and the Rickettsial Diseases Institute (France) were studied, using different ratios of male to female fleas per chamber (25 : 50, 50 : 100, 100 : 100, 200 : 200). The fleas were fed with bovine, ovine, caprine, porcine or human blood containing the anticoagulants sodium citrate or EDTA. Egg production was highest when fleas were kept in chambers with a ratio of 25 males to 100 females. In addition, the use of EDTA as an anticoagulant rather than sodium citrate resulted in a large increase in the number of eggs produced per female; however, the percentage of eggs developing through to adult fleas was lower with EDTA. The modifications described in our rearing methods will improve the rearing of cat fleas for research.


Subject(s)
Ctenocephalides/growth & development, Parasitology/methods, Animal Husbandry, Animals, Blood/metabolism, Ctenocephalides/metabolism, Edetic Acid/pharmacology, Female, Humans, Larva/growth & development, Larva/metabolism, Male, Ruminants/physiology, Sex Ratio, Species Specificity, Sus scrofa/physiology
11.
Asian-Australas J Anim Sci ; 28(3): 360-8, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25656209

ABSTRACT

The present study evaluated the effect of controlled ryegrass-white clover herbage availability from day 128 until day 142 of pregnancy, in comparison to unrestricted availability, on the performance of twin-bearing ewes of varying body condition score (BCS; 2.0, 2.5, or 3.0) and their lambs. It was hypothesised that under conditions of controlled herbage availability, the performance of lambs born to ewes with a greater BCS would be greater than that of lambs born to ewes with a lower BCS. During the period that the nutritional regimens were imposed, the pre- and post-grazing herbage masses of the Control regimen (1,070±69 and 801±30 kg dry matter [DM]/ha) were lower than those of the ad libitum regimen (1,784±69 and 1,333±33 kg DM/ha; p<0.05). The average herbage masses during lactation were 1,410±31 kg DM/ha. Nutritional regimen had no effect on ewe live weight, BCS and back fat depth or on lamb live weight, indices of colostrum uptake, maximal heat production, total litter weight weaned or survival to weaning (p>0.05). The difference in ewe BCSs and back fats observed among body condition groups was maintained throughout pregnancy (p<0.05). At weaning, ewes from the BCS2.0 group had lower BCS and live weight (2.4±0.2, 74.3±2.6 kg) than both the BCS2.5 (2.6±0.2, 78.6±2.4 kg) and BCS3.0 ewes (2.7±0.2, 79.0±2.6 kg; p<0.05), which did not differ (p>0.05). Ewe BCS group had no effect on lamb live weight at birth or weaning or on maximal heat production (p>0.05). Serum gamma glutamyl transferase concentrations of lambs born to BCS3.0 ewes were higher within 36 hours of birth than those of lambs born to BCS2.0 ewes and BCS2.5 ewes (51.8±1.9 vs 46.5±1.9 and 45.6±1.9 IU/mL, respectively [p<0.05]). There was, however, no effect of ewe body condition on lamb plasma glucose concentration (p>0.05). Lamb survival was the only lamb parameter that showed an interaction between ewe nutritional regimen and ewe BCS, whereby survival of lambs born to BCS2.5 and BCS3.0 ewes differed but only within the Control nutritional regimen ewes (p<0.05). These results indicate that farmers can provide twin-bearing ewes with pre- and post-grazing ryegrass-white clover herbage covers of approximately 1,100 and 800 kg DM/ha in late pregnancy, provided that herbage covers are approximately 1,400 kg DM/ha in lactation, without affecting lamb performance to weaning. The present results also indicate that under these grazing conditions, there is little difference in ewe performance within the BCS range of 2.0 to 3.0 and therefore they do not need to be managed separately.

12.
Vet Res Commun ; 48(2): 1073-1082, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38103118

ABSTRACT

Hoof overgrowth in commercially housed dairy goats is a major health and welfare concern; thus, it is important to better understand hoof trimming, a priority practice which addresses hoof growth. We evaluated the immediate effects of trimming on external conformation, internal joint positions, and hoof wall overgrowth of front and hind hooves. Eighty female goats were enrolled. Pre- and post-trimming data were collected at 13, 17, 21 and 25 months of age. Overall, before trimming, a high percentage of hooves were scored as overgrown (77.8%). Trimming decreased the percentage of overgrown hooves (17.6%; P < 0.001) and other moderate/severe conformational issues: dipped heels (49.3% vs. 26.7%; P < 0.001), misshapen claws (37.0% vs. 17.6%; P < 0.001), splayed claws (73.7% vs. 56.7%; P < 0.001). More hind than front hooves had dipped heels pre-trimming (91.3% vs. 7.3%; P < 0.001) and post-trimming (52.8% vs. 0.6%; P < 0.001); over half of the hind heels were not restored to an upright position. A greater proportion of toe length was removed from the hind hooves compared to the front (0.50 vs. 0.43, P < 0.001), with the greatest proportion of hoof wall overgrowth removed from the hind hoof medial claw at the 13-month assessment (P < 0.001). Following trimming, distal interphalangeal joint angle decreased more in hind compared to front hooves (11.0° vs. 6.9°; P < 0.001); distal interphalangeal joint height decreased (0.21 cm, P < 0.001), and proximal interphalangeal joint and heel angles increased (7.76° and 8.93°, respectively; P < 0.001). Trimming every 4 months did not restore the conformation of all hooves, suggesting a need to investigate reasons for underlying poor conformation, including trimming frequency.


Subject(s)
Hoof and Claw, Female, Animals, Hoof and Claw/surgery, Goats
13.
Mol Ecol ; 22(6): 1717-32, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23205556

ABSTRACT

Fungal mitospores may function as dispersal units and/or spermatia and thus play a role in distribution and/or mating of species that produce them. Mitospore production in ectomycorrhizal (EcM) Pezizales is rarely reported, but here we document mitospore production by a high diversity of EcM Pezizales on three continents, in both hemispheres. We sequenced the internal transcribed spacer (ITS) and partial large subunit (LSU) nuclear rDNA from 292 spore mats (visible mitospore clumps) collected in Argentina, Chile, China, Mexico and the USA between 2009 and 2012. We collated spore mat ITS sequences with 105 fruit body and 47 EcM root sequences to generate operational taxonomic units (OTUs). Phylogenetic inferences were made through analyses of both molecular data sets. A total of 48 OTUs from spore mats represented six independent EcM Pezizales lineages and included truffles and cup fungi. Three clades of seven OTUs have no known meiospore stage. Mitospores failed to germinate on sterile media or to form ectomycorrhizas on Quercus, Pinus and Populus seedlings, consistent with a hypothesized role as spermatia. The broad geographic range, high frequency and phylogenetic diversity of spore mats produced by EcM Pezizales suggests that a mitospore stage is important for many species in this group in terms of mating, reproduction and/or dispersal.


Subject(s)
Ascomycota/classification, Mycorrhizae/classification, Phylogeny, Ascomycota/genetics, Fungal DNA/genetics, Ribosomal Spacer DNA/genetics, Likelihood Functions, Molecular Sequence Data, Mycorrhizae/genetics, Pinus/microbiology, Plant Roots/microbiology, Populus/microbiology, Quercus/microbiology, DNA Sequence Analysis, Fungal Spores/classification, Fungal Spores/genetics
14.
Br Poult Sci ; 54(1): 12-23, 2013.
Article in English | MEDLINE | ID: mdl-23444850

ABSTRACT

1. Faecal samples from 19 commercial, 65 week old free-range egg laying flocks were examined to assess the prevalence and number of parasitic nematode eggs. Data were collected to characterise the housing, husbandry, behaviour and welfare of the flocks to examine possible relationships with the egg counts. 2. Eggs of at least one genus of nematode were present in the faeces of all 19 flocks. Heterakis eggs were detected in 17 (89%) flocks, Ascaridia in 16 (84%), Trichostrongylus in 9 (47%), and Syngamus in 6 (32%). Faecal egg counts (FEC) were greatest for Ascaridia and Heterakis. 3. For each nematode genus, there was no significant difference in FEC between organic (N = 9) and non-organic (N = 10) flocks, or between static (N = 8) and mobile (N = 11) flocks. 4. FEC were correlated with a range of housing, husbandry and management practices which varied between the nematode genus and included depth of the litter, percentage of hens using the range, and number of dead hens. Statistical analysis indicated relationships with FEC that included light intensity above the feeder, indoor and outdoor stocking density, fearfulness in the shed and on the range, distance to the nearest shelter, and swollen toes. 5. None of the FEC for any of the genera was correlated with weekly egg production or cumulative mortality. 6. Although nematode FEC were highly prevalent among the flocks, the overall lack of relation to other welfare and production measures suggests that these infections were not severe.


Subject(s)
Chickens/parasitology, Nematode Infections/veterinary, Poultry Diseases/epidemiology, Animal Husbandry/methods, Animal Welfare, Animals, Feces/parasitology, Female, Nematode Infections/epidemiology, Organic Agriculture/methods, Parasite Egg Count, Prevalence
15.
J Appl Anim Welf Sci ; 26(1): 91-101, 2023.
Article in English | MEDLINE | ID: mdl-34541975

ABSTRACT

Namibia needs a robust welfare assessment protocol for beef cattle for benchmarking and trade. As there is presently no such protocol, one was developed for Namibian conditions based on a protocol designed for extensive beef cattle in New Zealand, which had in turn been derived from the Welfare Quality and UC Davis Cow-Calf protocols. The modified protocol was evaluated in a semi-commercial farming village during the pregnancy testing of 141 cows from 5 herds belonging to different households. Animal- and stockperson-based measures were assessed directly, cows were observed at grazing, and a questionnaire-guided interview was conducted. The protocol provided a good basis for welfare assessment, but additional measures and modifications were needed for the Namibian system. These addressed the effects of recurrent drought, predation, plant poisoning, external parasites, walking long distances to water and grazing, compulsory hot-iron branding, extraneous cattle marking, and variable standards of handling facilities. The protocol was modified to incorporate these changes, resulting in a total of 40 measures. It now needs full validation through widespread testing across the range of beef production systems used in Namibia.


Subject(s)
Animal Husbandry, Animal Welfare, Pregnancy, Female, Cattle, Animals, Animal Husbandry/methods, Namibia, Feasibility Studies, New Zealand, Farms
16.
Res Vet Sci ; 140: 251-258, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34537551

ABSTRACT

Ovine pre-partum vaginal prolapse (known as bearings in sheep) occurs within a few weeks prior to lambing and, unless treated, both ewes and unborn lambs will die. It is a worldwide problem with no clear aetiology. Rates of prolapse in New Zealand typically vary from 0.1 to 2% per annum, varying between seasons and farms. In order to determine preclinical changes leading to prolapse, blood samples were collected prior to prolapse occurring and analysed for changes in both protein and specific hormone and vitamin levels. A total of 650 ewes were ear-tagged and blood samples were taken one month prior to the beginning of lambing; 28 of these ewes subsequently prolapsed. Using an improved proteomic method, plasma samples were subjected to 2D DIGE (two-dimensional difference in-gel electrophoresis) to determine if there were differences between the pre-prolapse and non-prolapsing ewes. Acidic isoforms of haptoglobin, a major acute phase protein in ruminants, increased approximately 3-fold in ewes prior to prolapse occurring. Total haptoglobin quantitation was confirmed with an independent assay. Although another plasma protein, α-1B-glycoprotein, was downregulated close to prolapse, the biological significance of this is unknown. While vitamin D levels were not associated with subsequent prolapse, there was a negative correlation between cortisol and days to prolapse from sampling (r2 = 0.36); i.e. ewes sampled closest to prolapse had higher plasma cortisol concentrations than controls. This raises the possibility that the ewes which prolapsed may have been suffering from chronic stress. Further research is needed.


Subject(s)
Sheep Diseases, Uterine Prolapse, Animals, Biomarkers, Female, Proteomics, Sheep, Sheep Diseases/diagnosis, Uterine Prolapse/veterinary, Vitamins
17.
Aust Vet J ; 99(6): 255-262, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33748942

ABSTRACT

OBJECTIVES: To describe the distribution, and determine the incidence, of veterinary reported injuries experienced by greyhounds during racing in New Zealand. MATERIALS AND METHODS: This retrospective cohort study utilised data obtained on all greyhound race starts and all racing injuries sustained in New Zealand between 10 September 2014 and 19 June 2019. Greyhound injuries were described by the number and percentage of the type, location, and presumed cause of injuries. The overall incidence of injuries per 1000 racing starts was calculated and stratified incidence rates were calculated for race year, racetrack, race number, sex of the greyhound, country of origin of the greyhound, starting box number, race type, race class and race distance. Poisson regression was used to calculate incidence rate ratios for the outcome of injury and race exposure variables. RESULTS: There were 213,630 race starts and 4100 injuries. The incidence of injury was 19.2 per 1000 starts, while the number of fatalities at the track was 1.3 per 1000 race starts. Most injuries experienced by greyhounds on race-day were minor (soft-tissue). Most injuries affected the limbs of the greyhounds (82.8%, n = 3393/4100). The rate of injuries was higher in Australian dogs compared with New Zealand dogs, the incidence rate of injury increased with advancing age group and the incidence rate varied among racetracks. CONCLUSION: The injury rates were similar to those previously reported for racing greyhounds in New Zealand. This study highlighted the need for greater uniformity and conciseness around the classification of injuries to permit comparisons across jurisdictions.
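[Illustrative note] The incidence figures above are counts per 1,000 race starts, and the incidence rate ratios come from Poisson regression of injury counts on race exposure variables. The sketch below shows a standard way to compute both, using a log-exposure offset; the counts, strata and column names are hypothetical, not the study's data.

```python
# Illustrative sketch: injury incidence per 1,000 race starts and a Poisson
# regression with a log(exposure) offset to obtain incidence rate ratios,
# in the spirit of the analysis above. All numbers here are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "injuries": [310, 295, 412, 268],           # injury counts per stratum
    "starts":   [15000, 14200, 20100, 13900],   # race starts per stratum (exposure)
    "origin":   ["NZ", "NZ", "AUS", "AUS"],     # e.g. country of origin
})

# Crude incidence per 1,000 race starts
print(1000 * df["injuries"].sum() / df["starts"].sum())

# Poisson model with offset = log(starts): exp(coefficient) is the rate ratio
fit = smf.glm("injuries ~ origin", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["starts"])).fit()
print(np.exp(fit.params))   # incidence rate ratios relative to the reference level
```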


Subject(s)
Retrospective Studies, Animals, Australia, Dogs, Incidence, New Zealand/epidemiology
18.
Aust Vet J ; 99(8): 334-343, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34002368

ABSTRACT

This study set out to explore how euthanasia decision-making for animals was taught to students in eight Australasian veterinary schools. A questionnaire-style interview guide was used by a representative at each university to interview educators. Educators were interviewed about their teaching of euthanasia decision-making for four categories of animals: livestock, equine, companion and avian/wildlife. Using thematic analysis, the terms provided by participants to describe how (mode of teaching) and what (specific content) they taught to students were categorised. Information about content was categorised into human-centred factors that influence decision-making, and animal-based indicators used to directly inform decision-making. All eight representatives reported some teaching relevant to euthanasia decision-making at their university for livestock, companion animals and avian/wildlife. One representative reported no such teaching for equids at their university. Observation of a euthanasia case was rarely reported as a teaching method. Five universities reported multiple modes of teaching relevant information, while two universities made use of modalities that could be described as opportunistic teaching (e.g., 'Discussion of clinical cases'). Factors taught at most universities included financial considerations, and that it is the owner's decision to make, while animal-based indicators taught included QoL/animal welfare, prognosis and behaviour change. Overall, most universities used a variety of methods to cover relevant material, usually including lectures and several other approaches for all animal types. However, because two universities relied on presentation of clinical cases, not all students at these veterinary schools will be exposed to making, or assisting in making, euthanasia decisions.


Subject(s)
Veterinary Education, Animals, Wild Animals, Animal Euthanasia, Horses, Humans, Quality of Life, Students
19.
Med Vet Entomol ; 24(2): 210-3, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20202108

ABSTRACT

The ability of three commercially available trap types to catch Lucilia (Diptera: Calliphoridae) blowflies was assessed on three sheep farms in southwest England in 2008. The aim was to evaluate their relative value for the control of ovine cutaneous myiasis (sheep blowfly strike) on farms. There was a highly significant difference between the total number of female Lucilia caught per day by the traps, with an Agrilure Trap (Agrimin Ltd, Brigg, U.K.) catching more than the other trap types (Rescue Disposable Fly Trap, Sterling International, Spokane, U.S.A.; Redtop Trap, Miller Methods, Johannesburg, South Africa). However, there was no significant difference between the traps in the numbers of female Lucilia sericata (Meigen) caught. Nevertheless, consideration of the rate at which female L. sericata were caught over time showed that the Agrilure trap did not begin catching until about 30 days after its initial deployment. It subsequently caught L. sericata at a faster rate than the other two traps. The data suggest that the freeze-dried liver bait used in the Agrilure trap required a period of about 30 days to become fully rehydrated and decompose to the degree required to attract and catch L. sericata. Once the bait was attractive, however, the trap outperformed the other two traps in terms of the rate of L. sericata capture. The Agrilure trap would appear to be the most effective of the designs tested for use against sheep blowfly and blowfly strike in the U.K., but care would be needed to ensure that the traps were deployed in advance of the blowfly season so that the bait was suitably aged when trapping was required.


Subject(s)
Diptera/physiology, Insect Control/instrumentation, Animals, Female, Time Factors, United Kingdom
20.
J Dairy Sci ; 93(8): 3602-9, 2010 Aug.
Article in English | MEDLINE | ID: mdl-20655429

ABSTRACT

The role of the autonomic nervous system (ANS) in mediating eye temperature responses during painful procedures was examined in thirty 4-mo-old bull calves randomly assigned to 4 treatments: 1) sham handling control (C; n=8), 2) surgical castration (SC; n=6), 3) local anesthesia with sham handling (LAC; n=8), and 4) local anesthesia with surgical castration (LASC; n=8). Maximum eye temperature (°C), measured by infrared thermography, heart rate (HR), and heart rate variability (HRV) were recorded continuously from 25 min before to 20 min after castration. The HRV was analyzed by examining segments of 512 interbeat intervals before and after treatments and comparing the root mean square of successive differences (RMSSD), high and low frequency (HF and LF, respectively) power, and the ratio of LF and HF powers (LF:HF). Jugular blood samples were analyzed for norepinephrine and epinephrine in C and SC treatments and for cortisol during all treatments. There was an immediate increase in HR following castration in SC (+15.3 ± 2.8 beats/min) and LASC (+6.3 ± 2.4 beats/min) calves. Eye temperature increased during the 20-min observation period in SC and LASC calves (+0.47 ± 0.05°C and +0.28 ± 0.05°C, respectively), and there was a small increase in C calves (+0.10 ± 0.05°C). Following castration in SC calves, there was an increase in RMSSD (+25.8 ± 6.4) and HF power (+11.0 ± 6.5) and LF:HF decreased (-2.1 ± 0.7). Following castration in LASC, there was an increase in RMSSD (+18.1 ± 4.9) and a decrease in LF power (-10.2 ± 5.0). Cortisol increased above baseline within 15 min following treatment in both castrated groups, but was greater for SC calves (+18.4 ± 2.3 ng/mL) than for LASC calves (+11.1 ± 1.9 ng/mL). After castration, norepinephrine increased 3-fold and epinephrine increased by half in SC calves but not in C calves. There were no changes in HR, HRV, or cortisol responses to C or LAC treatments. Local anesthetic reduced, but did not eliminate, responses to surgical castration. The synchronized increase in catecholamine and HR responses immediately following SC treatment suggests the initial response was mediated by the sympathetic branch of the ANS. The subsequent changes in RMSSD, HF power, and LF:HF ratio indicated this was followed by an increase in parasympathetic activity. The use of HR, HRV, and infrared thermography measurements together provide a noninvasive means to assess ANS responses as indicators of acute pain in cattle.
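[Illustrative note] RMSSD and the LF/HF ratio reported above are standard heart-rate-variability measures computed from interbeat (R-R) interval series. The sketch below computes them for a synthetic 512-interval segment; the 4 Hz resampling and the 0.04-0.15 Hz / 0.15-0.4 Hz band limits are common defaults (defined for humans), not necessarily the settings used by the authors.

```python
# Illustrative sketch: RMSSD and LF/HF power from a 512-beat interbeat-interval
# segment, the HRV measures used in the study above. The interval series is
# synthetic and the band limits are common defaults, not the authors' settings.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

rng = np.random.default_rng(2)
rr_ms = 1000 + 40 * rng.standard_normal(512)   # 512 interbeat intervals (ms)

# Time-domain: root mean square of successive differences (parasympathetic index)
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))

# Frequency-domain: resample the unevenly spaced series at 4 Hz, then estimate
# power in the LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands with Welch's method.
beat_times = np.cumsum(rr_ms) / 1000.0         # beat times (s)
fs = 4.0
t_even = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
rr_even = interp1d(beat_times, rr_ms, kind="cubic")(t_even)
freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

lf_band = (freqs >= 0.04) & (freqs < 0.15)
hf_band = (freqs >= 0.15) & (freqs < 0.40)
lf = np.trapz(psd[lf_band], freqs[lf_band])
hf = np.trapz(psd[hf_band], freqs[hf_band])
print(rmssd, lf, hf, lf / hf)
```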


Subject(s)
Autonomic Nervous System/physiology, Body Temperature/physiology, Castration/veterinary, Eye, Pain Measurement/methods, Pain/veterinary, Local Anesthesia/veterinary, Animals, Castration/methods, Cattle, Epinephrine/blood, Heart Rate/physiology, Hydrocortisone/blood, Male, Norepinephrine/blood, Pain/physiopathology, Thermography/veterinary