Results 1 - 14 of 14
1.
Front Vet Sci; 10: 1298756, 2023.
Article in English | MEDLINE | ID: mdl-38317789

ABSTRACT

Hair cortisol is a stress indicator and could be used to assess pigs' exposure to stressors in the weeks or months prior to non-invasive hair sampling. The main aim of this study was to describe the hair cortisol concentration (HCC) variability between individuals within a batch, between farms, and between batches within a farm. The secondary aim was to determine how the number of sampled pigs influences the characterization of HCC within a batch. Twenty farrow-to-finish pig farms were recruited considering the diversity of their management practices and health status, for which data were collected. Hair was sampled in two separate batches, 8 months apart. The necks of 24 finishing pigs were clipped per batch the week prior to slaughter. To describe the variability in HCC, an analysis of variance model was run with three explanatory variables (batch, farm and their interaction). To identify farm clusters, a principal component analysis followed by hierarchical clustering was carried out with four active variables (means and standard deviations of the two batches per farm) and 17 supplementary variables (management practices, herd health data). We determined how the number of sampled pigs influenced the characterization of HCC within a batch by selecting subsamples of the results. HCC ranged from 0.4 to 121.6 pg/mg, with a mean of 25.9 ± 16.2 pg/mg. The variability in HCC was mainly explained by differences between pigs (57%), then between farms (24%), between batches within the same farm (16%) and between batches (3%). Three clusters of farms were identified: low homogeneous concentrations (n = 3 farms), and heterogeneous concentrations with either higher (n = 7) or lower (n = 10) HCC in batch 2 than in batch 1. The diversity of management practices and health statuses made it possible to discuss hypotheses explaining the observed HCC variations. We highlighted the need to sample more than 24 pigs to characterize HCC in a pig batch. HCC differences between batches on six farms suggest sampling pigs in more than one batch to describe HCC at the farm level. The HCC variations described here confirm the need to study their links with the exposure of pigs to stressors.
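A minimal sketch of the variance decomposition described above, assuming a hypothetical file "hcc.csv" with one row per sampled pig and illustrative columns farm, batch (1 or 2) and hcc (pg/mg); the residual term stands for the pig-to-pig variability within a batch.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("hcc.csv")  # hypothetical input file

# Two-way ANOVA: farm, batch and their interaction.
model = smf.ols("hcc ~ C(farm) * C(batch)", data=df).fit()
anova = sm.stats.anova_lm(model, typ=2)

# Share of the total sum of squares explained by each term; the residual
# corresponds to between-pig variability within a batch.
anova["pct_of_total"] = 100 * anova["sum_sq"] / anova["sum_sq"].sum()
print(anova[["sum_sq", "pct_of_total"]])
```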

2.
BMC Immunol; 23(1): 61, 2022 Dec 10.
Article in English | MEDLINE | ID: mdl-36496363

ABSTRACT

BACKGROUND: Multiple antigenic stimulations are crucial to immune system training during early post-natal life. These stimulations can be due either to commensals, which account for the acquisition and maintenance of tolerance, or to pathogens, which trigger immunity. In pigs, few studies have previously explored the influence of natural exposure to pathogens on immune competence. We report here the results of a multicentric field study conducted on 265 piglets exposed to contrasting pathogen levels in their living environment. Piglets were housed on 15 different commercial farms, sorted into two groups, low (HSLOW)- and high (HSHIGH)-health-status farms, depending on their recurrent exposure to five common swine pathogens. RESULTS: Using animal-based measures, we compared the immune competence and growth performance of HSLOW and HSHIGH pigs around weaning. As expected, we observed a rise in the number of circulating leucocytes with age, which affected different cell populations. Monocyte, antigen-experienced and cytotoxic lymphocyte subpopulation counts were higher in piglets reared on HSLOW farms than in their HSHIGH counterparts. The age-dependent evolution in γδ T cell and neutrophil counts was also significantly affected by the health status. With age, circulating IFNα levels decreased and IgM levels increased, both being greater in HSLOW piglets at any time. After weaning, LPS-stimulated blood cells derived from HSLOW piglets were more prone to secrete IL-8 than those derived from HSHIGH pigs. Monocytes and granulocytes from HSLOW pigs also exhibited comparable phagocytosis capacity. Altogether, our data emphasize the more robust immunophenotype of HSLOW piglets. Finally, piglets raised under higher pathogen pressure grew less than HSHIGH piglets and exhibited a different metabolic profile. The higher cost of the immune responses associated with the low farm health status may account for the lower performance of HSLOW piglets. CONCLUSIONS: Altogether, our data, obtained under field conditions, provide evidence that early exposure to pathogens shapes the immune competence of piglets. They also document the negative impact of overstimulation of the immune system on piglets' growth.


Subjects
Neutrophils, Phagocytosis, Swine, Animals, Weaning, Leukocyte Count, Leukocytes
3.
Parasit Vectors; 12(1): 353, 2019 Jul 16.
Article in English | MEDLINE | ID: mdl-31311591

ABSTRACT

BACKGROUND: Faecal egg counts (FEC) and the FEC reduction test (FECRT) for assessing gastrointestinal nematode (GIN) infection and the efficacy of anthelmintics are rarely carried out on ruminant farms because of the cost of individual analyses. The use of pooled faecal samples is a promising method to reduce time and costs, but few studies are available for cattle, especially on the evaluation of different pool sizes and on FECRT application. METHODS: A study was conducted to assess FEC strategies based on pooled faecal samples using different pool sizes and to evaluate the pen-side use of a portable FEC-kit for the assessment of FEC on cattle farms. A total of 19 farms representing 29 groups of cattle were investigated in Italy and France. On each farm, individual faecal samples from heifers were collected before (D0) and two weeks after (D14) anthelmintic treatment with ivermectin or benzimidazoles. FEC were determined individually and on pooled samples using the Mini-FLOTAC technique. Four different pool sizes were used: pools of 5 individual samples, pools of 10 individual samples, a global pool, and a global pool prepared on-farm. Correlations and agreements between individual and pooled results were estimated with Spearman's correlation coefficient and Lin's concordance correlation coefficient, respectively. RESULTS: High correlation and agreement coefficients were found between the mean of individual FEC and the mean FEC of the different pool sizes when considering all FEC obtained at D0 and D14. However, these parameters were lower for FECR calculation due to a poorer estimate of FEC at D14 from the faecal pools. When using FEC from pooled samples only at D0, higher correlation and agreement coefficients were found between FECR data, with the best results obtained with pools of 5 samples. Interestingly, FEC obtained on pooled samples with the portable FEC-kit on-farm showed high correlation and agreement with FEC obtained on individual samples in the laboratory. This field approach has to be validated on a larger scale to assess its feasibility and reliability. CONCLUSIONS: The present study highlights that the pooling strategy and the use of portable FEC-kits on-farm are rapid and cost-effective procedures for the assessment of GIN egg excretion and can be used cautiously for FECR calculation following the administration of anthelmintics in cattle.
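A hedged sketch of the agreement statistics used above: Spearman correlation and Lin's concordance correlation coefficient (CCC) between mean individual FEC and pooled-sample FEC, plus the usual FECR formula. The arrays are toy values, not study data.

```python
import numpy as np
from scipy.stats import spearmanr

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sx, sy = x.var(), y.var()                       # population variances
    cov = np.mean((x - x.mean()) * (y - y.mean()))  # population covariance
    return 2 * cov / (sx + sy + (x.mean() - y.mean()) ** 2)

# Hypothetical group-level means: individual FEC vs FEC from pools of 5 (eggs/g).
individual_mean = [120, 300, 80, 450, 60, 210]
pooled_mean     = [110, 320, 90, 430, 70, 190]

rho, p = spearmanr(individual_mean, pooled_mean)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f}), "
      f"Lin's CCC = {lins_ccc(individual_mean, pooled_mean):.2f}")

# FEC reduction after treatment, from group mean FEC at D0 and D14.
def fecr(mean_d0, mean_d14):
    return 100 * (1 - mean_d14 / mean_d0)

print(f"FECR = {fecr(250.0, 10.0):.1f}%")
```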


Subjects
Cattle/parasitology, Feces/parasitology, Nematode Infections/veterinary, Parasite Egg Count/methods, Animals, Anthelmintics/therapeutic use, Cattle Diseases/drug therapy, Cattle Diseases/parasitology, Female, France, Italy, Nematode Infections/diagnosis, Parasite Egg Count/instrumentation, Reagent Kits, Diagnostic, Reproducibility of Results, Specimen Handling/methods
4.
Vet Rec; 181(24): 657, 2017 Dec 16.
Article in English | MEDLINE | ID: mdl-29051316

ABSTRACT

Pig farmers are strongly encouraged to reduce their antimicrobial usage because of the rising threat of antimicrobial resistance. However, such efforts should not compromise herd health status and performance. This study aimed to describe the profile of so-called 'top-farms' that managed to combine both high technical performance and low antimicrobial usage. A cross-sectional study was conducted among 227 farrow-to-finish farms in Belgium, France, Germany and Sweden. Among them, 44 farms were allocated to the top-farms group and were compared with the 'regular' farms group in terms of farm characteristics, biosecurity and health status. Top-farms had fewer gastrointestinal symptoms in suckling pigs and fewer respiratory symptoms in fatteners, which could partly explain their reduced need for antimicrobials and higher performance. They also had higher biosecurity and were located in areas with low pig density. However, 14 farms of the top-farms group were located in areas with high pig density but still managed to have low usage and high technical performance; they had higher internal biosecurity and more extensive vaccination against respiratory pathogens. These results illustrate that it is possible to control infectious diseases using approaches other than high antimicrobial usage, even on farms with challenging environmental and health conditions.


Subjects
Animal Husbandry/methods, Anti-Infective Agents/therapeutic use, Farms/organization & administration, Swine Diseases/prevention & control, Animals, Cross-Sectional Studies, Europe, Humans, Swine, Vaccination/veterinary
5.
Vet Parasitol; 237: 17-29, 2017 Apr 15.
Article in English | MEDLINE | ID: mdl-28274492

ABSTRACT

Targeted-selective treatments against gastrointestinal nematodes (GIN) in adult dairy cows require the identification of "cows to treat", i.e. cows whose milk production (MP) would increase after treatment. This study aimed at quantifying the ability of multi-indicator profiles to identify such cows. A randomized controlled clinical trial was conducted at housing in 25 French pasture-based dairy herds. In each herd, treated cows received fenbendazole orally and control cows remained untreated. Daily MP was recorded, and the MP variation between the pre- and post-visit periods (ΔMP) was calculated for each cow. ΔMP was modelled from control-cow data (n = 412) using a piecewise linear mixed model. The estimated parameters were applied to treated-cow data (n = 414) to predict the ΔMP expected in treated cows had they not been treated. Treated cows with an observed ΔMP (with treatment) higher than the expected ΔMP (without treatment) were labelled as "cows to treat". Herds where at least 50% of the young cows were "cows to treat" were qualified as "herds to target". To characterize such cows and herds, the candidate indicators available were (i) at the cow level: parity, stage of lactation, production level, faecal egg count (FEC), serum pepsinogen level and anti-Ostertagia antibody level (expressed as ODR); and (ii) at the herd level: bulk tank milk (BTM) Ostertagia ODR, Time of Effective Contact (TEC, in months) with GIN infective larvae before the first calving, and percentage of positive FEC. These indicators were tested one by one or in combination to assess their ability to characterize "herds to target" and "cows to treat" (chi-square tests). Of the 414 treated cows, 115 (27.8%) were considered "cows to treat", and 9 out of 22 herds were qualified as "herds to target". The indicators retained to profile such cows and herds were parity, production level, BTM Ostertagia ODR and TEC. Multi-indicator profiles were much more specific than single-indicator profiles and induced lower treatment rates, thereby minimizing the selection pressure on parasite populations. In particular, to target a herd, specificity was better with the profile "high BTM Ostertagia ODR and low TEC" than with the BTM ODR value alone. The targeted-selective treatment of young cows belonging to herds with a high BTM ODR at housing and a low TEC appeared to be a pertinent option, enabling a global approach to GIN control in which GIN control in heifers is connected to GIN control in adult cows.
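A simplified sketch of the "cows to treat" logic described above: fit a model on control cows, predict the expected ΔMP for treated cows, and label cows whose observed ΔMP exceeds that expectation. A plain herd random intercept stands in for the piecewise linear mixed model of the study; the file "cows.csv" and its columns (delta_mp, parity, stage, herd, treated) are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

cows = pd.read_csv("cows.csv")           # hypothetical one-row-per-cow table
control = cows[cows["treated"] == 0]
treated = cows[cows["treated"] == 1].copy()

# Fixed effects on parity and lactation stage, random intercept per herd.
m = smf.mixedlm("delta_mp ~ parity + stage",
                data=control, groups=control["herd"]).fit()

# Expected ΔMP for treated cows had they not been treated (fixed effects only).
treated["expected_delta_mp"] = m.predict(treated)
treated["cow_to_treat"] = treated["delta_mp"] > treated["expected_delta_mp"]

# Herds where at least half of the treated cows respond are flagged.
share = treated.groupby("herd")["cow_to_treat"].mean()
print(treated["cow_to_treat"].mean(), share[share >= 0.5].index.tolist())
```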


Subjects
Anthelmintics/therapeutic use, Cattle Diseases/drug therapy, Fenbendazole/therapeutic use, Gastrointestinal Diseases/veterinary, Ostertagia/drug effects, Ostertagiasis/veterinary, Animal Husbandry, Animals, Cattle, Cattle Diseases/parasitology, Female, Gastrointestinal Diseases/drug therapy, Gastrointestinal Diseases/parasitology, Housing, Animal, Lactation/drug effects, Milk/metabolism, Ostertagia/immunology, Ostertagia/isolation & purification, Ostertagiasis/drug therapy, Ostertagiasis/parasitology, Pregnancy
6.
Prev Vet Med; 138: 104-112, 2017 Mar 01.
Article in English | MEDLINE | ID: mdl-28237225

ABSTRACT

A two-year study was carried out to assess the feasibility of a targeted selective treatment to control gastrointestinal nematodes (GIN) in 24 groups of first grazing season (FGS) cattle. A two-step procedure was used, aiming to define the exposure risk at group level and to identify the most infected individuals within groups through measurement of average daily weight gain (ADWG) at housing. The first step was to define retrospectively, using grazing management practice (GMP) indicators, two levels of group exposure to GIN as determined by anti-O. ostertagi antibody ODR level (cut-off 0.7). For the low exposure level, no relationship between parasitological parameters and heifer growth was seen, whereas for the high exposure level ADWG was negatively correlated with increasing Ostertagia ODR values. The best classification was obtained with an expert system modelling the number of Ostertagia L3 generations on plots. GMP inputs for the expert system included standard data (turnout/housing dates and supplementary feeding amount) combined with paddock rotation planning and monthly temperatures. A threshold of 3 or more successive generations of L3 on plots allowed the groups to be identified as having a low or high infection exposure level, except for two groups that were misclassified as highly exposed. In the second step, individual ADWG was found to be negatively associated with Ostertagia ODR in heifers from groups classified as highly exposed (≥3 generations of L3). In these groups, the sensitivity and specificity of ADWG thresholds were calculated for several individual Ostertagia ODR thresholds. The best compromise between sensitivity (i.e., correctly treating the heifers that need to be treated) and specificity (i.e., not treating animals that should not be treated) was 76% and 56%, respectively (AUC ≈ 0.7), and was reached using an end-of-season ADWG threshold of 683 g/day to detect animals with an Ostertagia ODR above the cut-off of 0.93. Other ADWG thresholds were proposed taking into account the farmers' or the veterinarians' objectives: either maximizing production by raising the ADWG threshold and thus the sensitivity, or keeping a significant nematode population in refugia, with a corresponding limitation of anthelmintic treatments, by lowering the ADWG threshold and increasing the specificity. Finally, a targeted selective treatment for FGS cattle based on GMP and flexible ADWG thresholds seems feasible at housing without laboratory analysis, accepting that some resilient animals with a high Ostertagia ODR will not be treated owing to their ability to perform under parasitic challenge.
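A minimal sketch of the threshold analysis above: treating "high Ostertagia ODR" (above 0.93) as the condition to detect and scanning ADWG cut-offs for the sensitivity/specificity trade-off. The data here are simulated toy values; in the study, odr and adwg would come from the field measurements.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
odr = rng.uniform(0.3, 1.4, 200)                    # anti-Ostertagia ODR (toy)
adwg = 900 - 250 * odr + rng.normal(0, 80, 200)     # g/day, negatively related to ODR

needs_treatment = odr >= 0.93                       # animals we want to detect

def sens_spec(threshold):
    flagged = adwg <= threshold                     # slow-growing animals get treated
    sens = (flagged & needs_treatment).sum() / needs_treatment.sum()
    spec = (~flagged & ~needs_treatment).sum() / (~needs_treatment).sum()
    return sens, spec

for thr in (600, 683, 750):
    s, sp = sens_spec(thr)
    print(f"ADWG <= {thr} g/day: sensitivity = {s:.2f}, specificity = {sp:.2f}")

# AUC of ADWG (lower growth = more likely to need treatment) as a detector of high ODR.
print("AUC:", roc_auc_score(needs_treatment, -adwg))
```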


Subjects
Cattle Diseases/parasitology, Gastrointestinal Diseases/veterinary, Ostertagiasis/veterinary, Animal Husbandry, Animals, Anthelmintics/therapeutic use, Antibodies, Helminth, Cattle, Cattle Diseases/drug therapy, Feces/parasitology, France, Gastrointestinal Diseases/drug therapy, Gastrointestinal Diseases/parasitology, Linear Models, Nematoda, Ostertagia, Ostertagiasis/drug therapy, Parasite Egg Count, ROC Curve, Risk Assessment, Weight Gain
7.
PLoS One; 11(1): e0147835, 2016.
Article in English | MEDLINE | ID: mdl-26808824

ABSTRACT

Gastrointestinal nematode (GIN) infection can impair milk production (MP) in dairy cows. To investigate whether MP would be optimized by spring targeted-selective anthelmintic treatment in grazing cows, we assessed (1) the effect on MP of an anthelmintic treatment applied 1.5 to 2 months after turn-out, and (2) the herd and individual indicators associated with the post-treatment MP response. A randomized controlled clinical trial was conducted in 13 dairy farms (578 cows) in western France in spring 2012. In each herd, lactating cows of the treatment group received fenbendazole orally, while control cows remained untreated. Daily cow MP was recorded from 2 weeks before until 15 weeks after treatment. Individual serum pepsinogen and anti-Ostertagia antibody levels (expressed as ODR), faecal egg count and bulk tank milk (BTM) Ostertagia ODR were measured at treatment time. Anthelmintic treatment applied during the previous housing period was recorded for each cow. In each herd, information regarding the heifers' grazing and anthelmintic treatment history was collected to assess the Time of Effective Contact (TEC, in months) with GIN infective larvae before the first calving. The effect of treatment on weekly MP averages and its relationships with herd and individual indicators were studied using linear mixed models with two nested random effects (cow within herd). Unexpectedly, spring treatment had a significant detrimental effect on MP (-0.92 kg/cow/day on average). This negative MP response was particularly marked in high-producing cows, in cows not treated during the previous housing period or with high pepsinogen levels, and in cows from herds with a high TEC or a high BTM ODR. This post-treatment decrease in MP may be associated with immuno-inflammatory mechanisms. Until further studies can assess whether this unexpected result can be generalized, non-persistent treatment of immunized adult dairy cows against GIN should not be recommended in the early grazing season.
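A hedged sketch of the model structure described above: weekly milk yield as a function of treatment and week, with cow nested within herd as random effects, fitted with statsmodels MixedLM using a variance-components formula. The file "milk.csv" and its columns (mp, treated, week, herd, cow) are assumptions, not the study's data layout.

```python
import pandas as pd
import statsmodels.formula.api as smf

d = pd.read_csv("milk.csv")  # hypothetical weekly records, one row per cow-week

model = smf.mixedlm(
    "mp ~ treated * week",               # fixed effects: treatment, week, interaction
    data=d,
    groups=d["herd"],                    # herd-level random intercept
    vc_formula={"cow": "0 + C(cow)"},    # cow random intercept nested within herd
)
result = model.fit()
print(result.summary())

# The 'treated' and 'treated:week' fixed effects estimate the post-treatment
# change in milk yield; a negative estimate corresponds to the -0.92 kg/cow/day
# average response reported above.
```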


Subjects
Lactation/physiology, Animals, Antinematodal Agents/therapeutic use, Cattle, Cattle Diseases/drug therapy, Cattle Diseases/parasitology, Feces/parasitology, Female, Fenbendazole/therapeutic use, Ostertagia/physiology, Ostertagiasis/complications, Ostertagiasis/drug therapy, Parasite Egg Count, Random Allocation, Seasons
8.
J Dairy Sci; 97(10): 6135-50, 2014 Oct.
Article in English | MEDLINE | ID: mdl-25087027

ABSTRACT

In response to increasing risks of emerging infectious diseases, syndromic surveillance can be a suitable approach for detecting outbreaks of such diseases across a large territory at an early stage. To implement a syndromic surveillance system, the primary challenge is to find appropriate health-related data. The objective of this study was to evaluate whether routinely collected dates of reproductive events in dairy cattle could be used to build indicators of health anomalies for syndromic surveillance. The evaluation was performed on data collected in France between 2003 and 2009. First, a set of 5 indicators was proposed to assess several types of reproductive disorders. For each indicator, the demographic coverage relative to the total number of cattle at risk was analyzed in time and space. Second, the ability to detect an emerging disease at an early stage was retrospectively evaluated during the epidemics of bluetongue serotypes 1 and 8 (BTV-1, BTV-8) in France in 2007 and 2008. Reproductive indicators were analyzed weekly during these epidemics in each infected French district (16 in 2007 and 50 in 2008, out of 94 districts). The indicators were able to detect the BTV epidemics despite their low weekly demographic coverage relative to the total number of cattle (median = 1.21%; range = 0-11.7%). Four indicators related to abortions, late embryonic death, and short gestations were abnormally elevated during both BTV epidemics. Median times to abnormal elevations in these indicators were 20 to 71 d after the first notification of clinical signs of BTV by veterinarians. These results demonstrate that reproduction data can be used as indicators of disease emergence, although in the specific case of these BTV epidemics, detection via these indicators was later than clinical detection by veterinarians. The emergence of bluetongue in 2007 in France was associated with gestations that were a few days shorter than expected. A short-gestation indicator showed high elevations relative to prior random fluctuations and was the earliest of the 4 indicators to show abnormal elevations, making it possible to detect this emergence.
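A simplified sketch of turning reproductive-event dates into a weekly surveillance indicator and flagging abnormal elevations against a historical baseline. The input "gestations.csv" (district, calving_date, gestation_days), the 270-day cut-off and the 3-standard-deviation rule are illustrative assumptions, not the method used in the study.

```python
import pandas as pd

g = pd.read_csv("gestations.csv", parse_dates=["calving_date"])
g["short_gestation"] = g["gestation_days"] < 270          # illustrative cut-off
g["week"] = g["calving_date"].dt.to_period("W")

# Weekly proportion of short gestations per district.
weekly = (g.groupby(["district", "week"])["short_gestation"]
            .mean()
            .rename("indicator")
            .reset_index())

def flag_elevations(series, baseline_weeks=104, k=3.0):
    """Flag weeks where the indicator exceeds baseline mean + k * sd."""
    base = series.iloc[:baseline_weeks]
    return series > base.mean() + k * base.std()

weekly["alarm"] = (weekly.groupby("district")["indicator"]
                         .transform(flag_elevations))
print(weekly[weekly["alarm"]].head())
```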


Subjects
Bluetongue virus, Bluetongue/epidemiology, Cattle Diseases/diagnosis, Cattle Diseases/epidemiology, Reproduction, Abortion, Veterinary/epidemiology, Animals, Bluetongue/complications, Bluetongue virus/classification, Cattle, Cattle Diseases/virology, Disease Notification/statistics & numerical data, Disease Outbreaks/prevention & control, Disease Outbreaks/veterinary, Female, Fetal Death, France/epidemiology, Gestational Age, Pregnancy, Retrospective Studies, Sheep, Veterinarians
9.
Prev Vet Med; 113(4): 484-91, 2014 Mar 01.
Article in English | MEDLINE | ID: mdl-24433639

ABSTRACT

Two Culicoides-borne diseases, Bluetongue (BTV) and Schmallenberg, have emerged in the European cattle population since 2006. Other diseases transmitted by these vectors could emerge. This justifies the development of syndromic surveillance programs whereby one or several indicators would be routinely monitored for the early detection of emerging diseases. The aim of this study was to evaluate milk yield from milk recording in dairy cattle as an indicator to be included in an emerging disease surveillance system. It was hypothesized that emergences would result in episodes of low milk production clustered in space and time. The 2007 BTV epizootic in France was used as a case study. Because the disease had already emerged in neighbouring countries, its emergence was expected and notification was mandatory. Herd test-day milk production was predicted for the entire country for 2006 and 2007 from herd historical data using linear mixed models. The differences between observed and predicted milk production were averaged per week and per municipality and used as input for a space-time prospective scan statistic. Log-likelihood ratios (LLR) associated with clusters were used to define alarms. The threshold chosen was a trade-off between detection timeliness and the number of false alarms per week. The first four BTV notifications occurred on 12 July (two notifications), 13 July and 27 July 2007. The 12th of July was considered to be the date of emergence. Alarms occurring before 1 March 2007 were considered false alarms. Using an LLR of 50, there was an average of 1.7 false alarms per week and the BTV emergence was detected 7 weeks after emergence. Using an LLR of 100, there was an average of 0.8 false alarms per week and the BTV emergence was detected 9 weeks after emergence. Detection may have been delayed because of a discontinuation of milk recording between mid-July and mid-August. The first cluster with an LLR > 100 located in the emergence area was further investigated. A difference between observed and predicted production of > 1 kg/cow/day was observed around the time of emergence. However, a difference of equal magnitude was observed during the year preceding the outbreak. Milk production predicted from herd history alone did not allow the detection of the 2007 BTV emergence in France. Further research should be conducted on improving the prediction of test-day milk yield and on combining it with other indicators based on routinely collected data.
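A small sketch of the threshold trade-off discussed above: given the weekly clusters returned by a space-time scan (here a toy table of week and LLR), count false alarms per week before the emergence date and the detection delay for a given LLR threshold. The cluster table itself would come from a scan statistic run on observed-minus-predicted milk yields; the values below are invented for illustration.

```python
import pandas as pd

emergence_week = pd.Timestamp("2007-07-12")
surveillance_start = pd.Timestamp("2007-01-01")

# Hypothetical scan output: one row per weekly cluster with its LLR.
clusters = pd.DataFrame({
    "week": pd.date_range("2007-01-01", periods=40, freq="W"),
    "llr": [30, 120, 45, 60, 20, 55, 80, 110, 35, 70] * 4,
})

def evaluate(threshold):
    alarms = clusters[clusters["llr"] >= threshold]
    false_alarms = alarms[alarms["week"] < emergence_week]
    weeks_monitored = (emergence_week - surveillance_start).days / 7
    detections = alarms[alarms["week"] >= emergence_week]
    delay = ((detections["week"].min() - emergence_week).days // 7
             if not detections.empty else None)
    return len(false_alarms) / weeks_monitored, delay

for thr in (50, 100):
    fa_rate, delay = evaluate(thr)
    print(f"LLR >= {thr}: {fa_rate:.2f} false alarms/week, "
          f"detection delay = {delay} weeks")
```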


Subjects
Bluetongue virus/immunology, Bluetongue/epidemiology, Cattle Diseases/epidemiology, Disease Outbreaks/veterinary, Milk/virology, Animals, Bluetongue/virology, Cattle, Cattle Diseases/virology, France/epidemiology, Population Surveillance, Prospective Studies, Seasons
10.
Vet J; 199(1): 184-7, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24239263

ABSTRACT

Under the assumption that milk yield may be reduced in herds with impaired welfare, the present study investigated whether milk yield could be used as a reliable indicator of welfare. In 125 commercial French dairy herds, the association between herd welfare (evaluated using the Welfare Quality assessment protocol) and cow milk yield was investigated using linear mixed models. Positive associations were identified between milk yield and both low levels of aggression between cows and a good emotional state of the herd, but there was a negative association with good health as assessed through the occurrence of diseases and injuries. These opposing associations resulted in no association with the overall welfare of the herd. Milk yield should therefore not be used as an indicator of overall welfare.


Subjects
Animal Welfare, Cattle/physiology, Dairying/methods, Lactation/physiology, Milk/physiology, Animals, Female, France
11.
Comp Immunol Microbiol Infect Dis; 37(1): 1-9, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24184019

ABSTRACT

The effectiveness of vaccination of dairy cows, with or without antibiotics (oxytetracycline), in controlling Coxiella burnetii (Cb) shedding at the herd level was investigated in 77 herds clinically affected by Q fever. In addition to the vaccination of nulliparous heifers, one of the four following medical strategies was randomly assigned to the dairy cows in each herd: vaccination (using a phase I vaccine) alone, vaccination combined with oxytetracycline, oxytetracycline alone, or nothing. Their effectiveness in reducing the Cb load in quarterly samples of bulk tank milk (BTM) and of pooled milk from primiparous cows was assessed through hierarchical logistic models. A significant reduction in Cb load was observed in herds where at least 80% of the dairy cows were vaccinated, whereas the use of antibiotics was ineffective. Our findings support the value of a whole-herd vaccination strategy and provide evidence for decreasing the use of antibiotics in dairy cattle herds.
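A hedged sketch of a herd-level logistic analysis of bulk tank milk results over the follow-up, using a GEE with exchangeable correlation as a simple stand-in for the hierarchical logistic models of the study. The file "btm.csv" and its columns (herd, quarter, strategy, cb_positive) are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

btm = pd.read_csv("btm.csv")   # hypothetical quarterly BTM records, one row per herd-quarter

model = smf.gee(
    "cb_positive ~ C(strategy) + quarter",        # strategy: vaccine, vaccine+ABX, ABX, none
    groups="herd",                                # repeated quarterly samples per herd
    data=btm,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```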


Subjects
Bacterial Vaccines/administration & dosage, Cattle Diseases/microbiology, Cattle Diseases/prevention & control, Coxiella burnetii/immunology, Oxytetracycline/pharmacology, Q Fever/veterinary, Vaccination/veterinary, Animals, Bacterial Vaccines/immunology, Cattle, Cattle Diseases/immunology, Coxiella burnetii/genetics, DNA, Bacterial/chemistry, DNA, Bacterial/genetics, Female, France, Logistic Models, Milk/microbiology, Q Fever/microbiology, Q Fever/prevention & control, Real-Time Polymerase Chain Reaction/veterinary, Vaccination/methods, Vaccination/standards
12.
PLoS One; 8(9): e73726, 2013.
Article in English | MEDLINE | ID: mdl-24069227

ABSTRACT

Two vector-borne diseases, caused by the Bluetongue and Schmallenberg viruses respectively, have emerged in European ruminant populations since 2006. Several diseases are transmitted by the same vectors and could emerge in the future. Syndromic surveillance, which consists of the routine monitoring of indicators for the detection of adverse health events, may allow early detection. Milk yield is routinely measured in a large proportion of dairy herds and could be incorporated as an indicator in a surveillance system. However, few studies have evaluated continuous indicators for syndromic surveillance. The aim of this study was to develop a framework for quantifying both the disease characteristics and the model predictive abilities that are important for a continuous indicator to be sensitive, timely and specific for the detection of a vector-borne disease emergence. Emergences with a range of spread characteristics and effects on milk production were simulated. Milk yields collected monthly in 48,713 French dairy herds were used to simulate 576 disease emergence scenarios. First, the effect of disease characteristics on the sensitivity and timeliness of detection was assessed: spatio-temporal clusters of low milk production were detected with a scan statistic using the difference between observed and simulated milk yields as input. In a second step, the specificity of the system was evaluated by running the scan statistic on the difference between observed and predicted milk yields, in the absence of a simulated emergence. The timeliness of detection depended mostly on how easily the disease spread between and within herds. The time and location of the emergence, or adding random noise to the simulated effects, had a limited impact on the timeliness of detection. The main limitation of the system was its low specificity, i.e. the high number of clusters detected from the difference between observed and predicted production in the absence of disease.
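A toy sketch of the scenario-injection step described above: starting from herd locations, a simulated emergence spreads to more herds each month around a seed point and reduces their milk yield, producing the deviations that feed the detection step. The spread rate, effect size and column names are illustrative assumptions, not the scenarios of the study.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
herds = pd.DataFrame({
    "herd": range(500),
    "x": rng.uniform(0, 100, 500),     # herd coordinates (arbitrary units)
    "y": rng.uniform(0, 100, 500),
})

def simulate_emergence(herds, start_month, n_months=6,
                       radius_per_month=10.0, effect=-1.5):
    """Return a herd x month table of simulated milk-yield deviations (kg/cow/day)."""
    seed = herds.sample(1, random_state=1)[["x", "y"]].to_numpy()[0]
    dist = np.hypot(herds["x"] - seed[0], herds["y"] - seed[1])
    records = []
    for m in range(n_months):
        affected = dist <= radius_per_month * (m + 1)   # emergence spreads outwards
        deviation = np.where(affected, effect, 0.0) + rng.normal(0, 0.3, len(herds))
        records.append(pd.DataFrame({
            "herd": herds["herd"], "month": start_month + m, "deviation": deviation}))
    return pd.concat(records, ignore_index=True)

scenario = simulate_emergence(herds, start_month=0)
print(scenario.groupby("month")["deviation"].mean())
```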


Subjects
Cattle Diseases/diagnosis, Milk, Animals, Cattle, Cattle Diseases/physiopathology
13.
Int J Food Microbiol; 150(1): 8-13, 2011 Oct 17.
Article in English | MEDLINE | ID: mdl-21788093

ABSTRACT

A study was conducted in 2009 to identify risk factors for Campylobacter spp. transmission from the digestive tract to the carcasses of standard broilers (slaughter age: 37 days; carcass weight: 1.3 kg on average). Campylobacter counts were performed on pools of 10 ceca and 10 neck-skins from 108 Campylobacter ceca-positive batches in three slaughterhouses. Technical and health data were also collected on the broilers: age, size, carcass weight (mean and standard deviation), condemnation rate, mortality rate and nature of treatment during the rearing period. Cecal counts varied from 4.8 to 10.2 log10 cfu/g. In seventeen batches (15.7%), the skin count was below the detection limit. In the 91 batches with positive neck-skin test results, counts varied from 2.0 to 5.2 log10 cfu/g. The standard deviation of carcass weight, condemnation rate, slaughter rate and cecal count were significantly lower, and the growth rate higher, in the 17 batches in which neck-skin results were below the detection limit. Multivariate analysis showed that batches with a higher standard deviation of carcass weight were 5- to 9-fold more likely to have detectable carcass contamination. Among the 91 positive neck-skin batches, only slaughter rate and cecal counts were found to have a significant but limited effect on the level of neck-skin contamination. Since body weight homogeneity may be affected by disease, better health control can contribute to reducing the contamination of broiler carcasses in Campylobacter-carrier batches.
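A minimal sketch of the batch-level risk analysis above: logistic regression of detectable neck-skin contamination on the standard deviation of carcass weight and other batch covariates, with coefficients converted to odds ratios. The file "batches.csv" and its column names are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

batches = pd.read_csv("batches.csv")   # hypothetical table, one row per slaughter batch

m = smf.logit(
    "skin_detectable ~ sd_carcass_weight + cecal_count_log10 + slaughter_rate",
    data=batches,
).fit()

# Exponentiate coefficients and confidence bounds to get odds ratios.
odds_ratios = pd.DataFrame({
    "OR": np.exp(m.params),
    "2.5%": np.exp(m.conf_int()[0]),
    "97.5%": np.exp(m.conf_int()[1]),
})
print(odds_ratios)
```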


Subjects
Abattoirs, Campylobacter/growth & development, Cecum/microbiology, Food Microbiology, Poultry/microbiology, Animals, Campylobacter/isolation & purification, Chickens/microbiology, Colony Count, Microbial, Food Handling/methods, Humans
14.
Occup Environ Med; 67(7): 493-9, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20581259

ABSTRACT

OBJECTIVES: Waste incineration releases into the atmosphere a mixture of chemicals with high embryotoxic potential, including heavy metals and dioxins/furans. In a previous ecological study we found an association between the risk of urinary tract birth defects and residence in the vicinity of municipal solid waste incinerators (MSWIs). The objective of the present study was to specifically test this association. METHODS: A population-based case-control study compared 304 infants with urinary tract birth defects diagnosed in the Rhône-Alpes region (2001-2003) with a random sample of 226 population controls frequency-matched for infant sex and for year and district of birth. Exposure to dioxins in early pregnancy at the place of residence, used as a tracer of the mixture released by 21 active waste incinerators, was predicted with second-generation Gaussian modelling (ADMS3 software). Other industrial emissions of dioxins, population density and neighbourhood deprivation were also assessed. Individual risk factors, including consumption of local food, were obtained through interviews with 62% of the case families and all of the control families. RESULTS: The risk was increased for mothers exposed to dioxin levels above the median at the beginning of pregnancy (OR 2.95, 95% CI 1.47 to 5.92 for dioxin deposits). When only interviewed cases were considered, risk estimates decreased, mainly because the non-interviewed cases were more likely to live in exposed residential environments (OR 2.05, 95% CI 0.92 to 4.57). The results suggest that consumption of local food modifies this risk. CONCLUSIONS: This study confirms our previous observation of a link between the risk of urinary tract birth defects and exposure to MSWI emissions in early pregnancy and illustrates the effect of participation bias on risk estimates of environmental health impacts.
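A short sketch of the exposure odds-ratio calculation underlying a case-control analysis like the one above: logistic regression of case status on dichotomised dioxin exposure (above vs below the median), adjusted for covariates. The file "cases_controls.csv" and its columns are assumptions, not the study dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cc = pd.read_csv("cases_controls.csv")            # hypothetical one-row-per-infant table
cc["exposed"] = (cc["dioxin_deposit"] > cc["dioxin_deposit"].median()).astype(int)

m = smf.logit("case ~ exposed + neighbourhood_deprivation + population_density",
              data=cc).fit()

# Odds ratio and 95% CI for high dioxin exposure.
or_exposed = np.exp(m.params["exposed"])
ci_low, ci_high = np.exp(m.conf_int().loc["exposed"])
print(f"OR for high dioxin exposure: {or_exposed:.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f})")
```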


Subjects
Air Pollution/adverse effects, Congenital Abnormalities/epidemiology, Dioxins/toxicity, Hazardous Waste/adverse effects, Incineration/instrumentation, Prenatal Exposure Delayed Effects/epidemiology, Urinary Tract/abnormalities, Adult, Case-Control Studies, Congenital Abnormalities/prevention & control, Female, France/epidemiology, Humans, Infant, Newborn, Pregnancy, Pregnancy Outcome, Surveys and Questionnaires