ABSTRACT
Shiga toxin-producing Escherichia coli (STEC) is carried in the intestine of ruminant animals, and outbreaks have occurred after contact with ruminants or their environment. The presence of STEC virulence genes in the environment was investigated along recreational walking paths in the North West and East Anglia regions of England. In all, 720 boot sock samples from walkers' shoes were collected between April 2013 and July 2014. Multiplex PCR was used to detect E. coli, based on amplification of the uidA gene, and to investigate the STEC-associated virulence genes eaeA, stx1 and stx2. The eaeA virulence gene was detected in 45.5% of samples, while stx1 and/or stx2 was detected in 12.4% of samples. There was a difference between the two regions sampled, with the North West exhibiting a higher proportion of stx-positive boot socks than East Anglia. In univariate analysis, ground conditions, river flow and temperature were associated with positive boot socks. The detection of stx genes in the boot sock samples suggests that STEC is present in the English countryside and that individuals may be at risk of infection after outdoor activities even without direct contact with animals. SIGNIFICANCE AND IMPACT OF THE STUDY: Several outbreaks within the UK have highlighted the danger of contracting Shiga toxin-producing Escherichia coli from contact with areas recently vacated by livestock. Such transmission is more likely for STEC than for other zoonotic bacteria, given the low infectious dose required. While studies have determined the prevalence of STEC on farms and in petting zoos, the risk to individuals enjoying recreational outdoor activities near where livestock may be present is less well researched. This study describes the prevalence with which stx genes, indicative of STEC bacteria, were found in the environment in the English countryside.
Subject(s)
Bacterial Adhesins/genetics, Escherichia coli Proteins/genetics, Shiga Toxin 1/genetics, Shiga Toxin 2/genetics, Shiga-Toxigenic Escherichia coli/genetics, Shiga-Toxigenic Escherichia coli/pathogenicity, Animals, England, Escherichia coli Infections/microbiology, Feces/microbiology, Geography, Humans, Livestock/microbiology, Multiplex Polymerase Chain Reaction, Shiga-Toxigenic Escherichia coli/isolation & purification, Shoes, Virulence/genetics, Virulence Factors/genetics
ABSTRACT
BACKGROUND: Worldwide, syndromic surveillance is increasingly used for improved and timely situational awareness and early identification of public health threats. Syndromic data streams are fed into detection algorithms, which produce statistical alarms highlighting potential activity of public health importance. All alarms must be assessed to confirm whether they are of public health importance. In England, approximately 100 alarms are generated daily and, although their analysis is formalised through a risk assessment process, the process requires notable time, training, and maintenance of an expertise base to determine which alarms are of public health importance. The process is made more complicated by the observation that only 0.1% of statistical alarms are deemed to be of public health importance. Therefore, the aim of this study was to evaluate machine learning as a tool for computer-assisted human decision-making when assessing statistical alarms. METHODS: A record of the risk assessment process was obtained from Public Health England for all 67,505 statistical alarms between August 2013 and October 2015. This record contained information on the characteristics of each alarm (e.g. size, location). We used three Bayesian classifiers - naïve Bayes, tree-augmented naïve Bayes and Multinets - to examine the risk assessment record in England with respect to the final 'Decision' outcome ('Alert', 'Monitor' or 'No-action') made by an epidemiologist. Two further classifiers based upon tree-augmented naïve Bayes and Multinets were implemented to account for the predominance of 'No-action' outcomes. RESULTS: The attributes of each individual risk assessment were linked to the final decision made by an epidemiologist, providing confidence in the current process. The naïve Bayes classifier performed best, correctly classifying 51.5% of 'Alert' outcomes. If the 'Alert' and 'Monitor' actions are combined, performance increases to 82.6% correctly classified. We demonstrate how a decision support system based upon a naïve Bayes classifier could be operationalised within a syndromic surveillance system. CONCLUSIONS: Within syndromic surveillance systems, machine learning techniques have the potential to make risk assessment following statistical alarms more automated, robust, and rigorous. However, our results also highlight the importance of specialist human input to the process.
Subject(s)
Decision Making, Machine Learning, Public Health/methods, Risk Assessment/methods, Sentinel Surveillance, Algorithms, Bayes Theorem, England, Humans
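To make the classification step in the abstract above concrete, here is a minimal sketch of a naïve Bayes triage model that maps categorical alarm attributes to the 'Alert'/'Monitor'/'No-action' decision. The attribute names and category values are hypothetical placeholders rather than the fields of the Public Health England risk assessment record, and scikit-learn is assumed rather than the study's own implementation.

```python
# A minimal sketch of naive Bayes alarm triage under hypothetical attributes.
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

# Each row is one statistical alarm: (signal size, syndrome, region).
alarms = np.array([
    ["large", "respiratory", "London"],
    ["small", "gastrointestinal", "North West"],
    ["large", "gastrointestinal", "London"],
    ["small", "respiratory", "East of England"],
])
decisions = np.array(["Alert", "No-action", "Monitor", "No-action"])

# CategoricalNB expects integer-coded categories.
enc = OrdinalEncoder()
X = enc.fit_transform(alarms)

clf = CategoricalNB()
clf.fit(X, decisions)

# Posterior probabilities for a new alarm, one per decision class.
new_alarm = enc.transform([["large", "gastrointestinal", "North West"]])
print(dict(zip(clf.classes_, clf.predict_proba(new_alarm)[0])))
```

In a decision support setting, the posterior probabilities would be displayed alongside each alarm to inform, not replace, the epidemiologist's judgement, consistent with the computer-assisted role described above.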
ABSTRACT
BACKGROUND: Campylobacteriosis is the most commonly reported food-borne infection in the European Union, with an annual number of cases estimated at around 9 million. In many countries, campylobacteriosis has a striking seasonal peak during early/mid-summer. In the early 2000s, several publications reported on campylobacteriosis seasonality across Europe and associations with temperature and precipitation. Subsequently, many European countries have introduced new measures against this food-borne disease. AIM: To examine how the seasonality of campylobacteriosis varied across Europe from 2008 to 2016, to explore associations with temperature and precipitation, and to compare these results with previous studies. We also sought to assess the utility of The European Surveillance System (TESSy) for cross-European seasonal analysis of campylobacteriosis. METHODS: Ward's minimum variance clustering was used to group countries with similar seasonal patterns of campylobacteriosis. A two-stage multivariate meta-analysis methodology was used to explore associations with temperature and precipitation. RESULTS: Nordic countries had a pronounced seasonal campylobacteriosis peak in mid- to late summer (weeks 29-32), while most other European countries had a less pronounced peak earlier in the year. The United Kingdom, Ireland, Hungary and Slovakia had a slightly earlier peak (week 24). Campylobacteriosis cases were positively associated with temperature and, to a lesser degree, precipitation. CONCLUSION: Across Europe, the strength and timing of campylobacteriosis peaks have remained similar to those observed previously. In addition, TESSy is a useful resource for cross-European seasonal analysis of infectious diseases such as campylobacteriosis, but its utility depends upon each country's reporting infrastructure.
Subject(s)
Campylobacter Infections/epidemiology, Campylobacter/isolation & purification, Disease Outbreaks, Epidemiological Monitoring, Europe/epidemiology, Humans, Incidence, Seasons, Sentinel Surveillance, Temperature
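As an illustration of the clustering method named in the abstract above, the sketch below applies Ward's minimum variance criterion to per-country weekly case profiles that have been normalised so that grouping reflects seasonal shape rather than case volume. The simulated profiles (a week-30 peak versus a week-24 peak) are hypothetical stand-ins for the TESSy-derived national series.

```python
# A minimal sketch of Ward's minimum variance clustering on weekly
# seasonal profiles (rows = countries, columns = ISO weeks).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
weeks = np.arange(1, 53)
# Hypothetical seasonal shapes: a late-summer peak and an earlier peak.
late_peak = np.exp(-0.5 * ((weeks - 30) / 4) ** 2)
early_peak = np.exp(-0.5 * ((weeks - 24) / 5) ** 2)
profiles = np.vstack(
    [late_peak + 0.05 * rng.standard_normal(52) for _ in range(3)]
    + [early_peak + 0.05 * rng.standard_normal(52) for _ in range(3)]
)
# Normalise each row so clustering reflects seasonal shape, not volume.
profiles = profiles / profiles.sum(axis=1, keepdims=True)

Z = linkage(profiles, method="ward")  # Ward's minimum variance criterion
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # countries sharing a seasonal pattern share a label
```

Normalising each row to sum to one is one reasonable pre-processing choice; the abstract does not describe the study's exact normalisation.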
ABSTRACT
The potential for contaminant uptake from recycled materials used in livestock farming into animal tissues and organs was investigated in three practical modular studies involving broiler chickens, laying chickens and pigs. Six types of commercially available recycled materials were used either as bedding material for chickens or as fertilizer for cropland that later housed outdoor-reared pigs. The contaminants studied included regulated contaminants, e.g. polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs, dioxins) and polychlorinated biphenyls (PCBs), but related contaminants such as polybrominated diphenyl ethers (PBDEs), hexabromocyclododecane (HBCDD), polychlorinated naphthalenes (PCNs), polybrominated dioxins (PBDD/Fs) and perfluoroalkyl substances (PFAS) were also investigated. Contaminant occurrence in the recycled materials was verified prior to the studies, and the relationship to tissue and egg concentrations in market-ready animals was investigated using a weight of evidence approach. Contaminant uptake into animal tissues and eggs was observed in all the studies, but the extent varied depending on the species and the recycled material. PCBs, PBDEs, PCDD/Fs, PCNs and PFAS showed the highest potential to transfer, with laying chickens showing the most pronounced effects. PBDD/Fs showed low concentrations in the recycled materials, making it difficult to evaluate potential transfer. Higher resulting occurrence levels in laying chickens relative to broilers suggest that the period of contact with the materials may influence the extent of uptake in chickens. Bio-transfer factors (BTFs) estimated for PCDD/Fs and PCBs were of greater magnitude for chicken muscle tissue than for pigs, with the highest values observed for PCBs in laying chickens. There were no significant differences between BTFs for the different chicken tissues, which contrasted with the high BTF values for pig liver relative to muscle. The study raises further questions that require investigation, such as the effects of repeated or yearly application of recycled materials as fertilizers, and the batch homogeneity/consistency of available recycled materials.
Subject(s)
Agriculture/methods, Animal Feed/analysis, Environmental Monitoring, Environmental Pollutants/analysis, Food Contamination/analysis, Animals, Polychlorinated Dibenzofurans/analysis, Livestock, Polychlorinated Biphenyls/analysis, Polychlorinated Dibenzodioxins/analysis, Recycling
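For reference, bio-transfer factors of the kind estimated above are commonly defined as the ratio of the contaminant concentration in an edible tissue (or egg) to the animal's daily dietary intake of that contaminant. The definition below is one common convention; the abstract does not state which denominator or tissue basis (fat or fresh weight) this study used.

```latex
% One common bio-transfer factor convention, giving BTF units of day/kg;
% the study's exact normalisation is an assumption here.
\mathrm{BTF}\ \left[\mathrm{day\,kg^{-1}}\right]
  = \frac{C_{\mathrm{tissue}}\ \left[\mathrm{ng\,kg^{-1}}\right]}
         {I_{\mathrm{daily}}\ \left[\mathrm{ng\,day^{-1}}\right]}
```

Under this convention, a higher BTF for laying-hen muscle than for pig muscle means that, per unit of daily contaminant intake, more of the contaminant partitions into the chicken tissue.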
ABSTRACT
PURPOSE OF REVIEW: We present a review of the likely consequences of climate change for foodborne pathogens and associated human illness in higher-income countries. RECENT FINDINGS: The relationships between climate and food are complex and hence the impacts of climate change uncertain. This makes it difficult to know which foodborne pathogens will be most affected, what the specific effects will be, and on what timescales changes might occur. Hence, a focus upon current capacity and adaptation potential against foodborne pathogens is essential. We highlight a number of developments that may enhance preparedness for climate change. These include: adoption of novel surveillance methods, such as syndromic methods, to speed up detection and increase the fidelity of intervention in foodborne outbreaks; genotype-based approaches to surveillance of food pathogens to enhance spatiotemporal resolution in tracing and tracking of illness; ever-increasing integration of plant, animal and human surveillance systems (One Health) to maximise the potential for identifying threats; increased commitment to cross-border (global) information initiatives (including big data); improved clarity regarding the governance of complex societal issues such as the conflict between food safety and food waste; and strong user-centric (social) communications strategies to engage diverse stakeholder groups. The impact of climate change upon foodborne pathogens and associated illness is uncertain. This emphasises the need to enhance current capacity and adaptation potential against foodborne illness. A range of developments are explored in this paper to enhance preparedness.
Subject(s)
Climate Change, Developed Countries, Foodborne Diseases/etiology, Climate Change/statistics & numerical data, Developed Countries/statistics & numerical data, Food Microbiology/methods, Foodborne Diseases/epidemiology, Foodborne Diseases/microbiology, Foodborne Diseases/prevention & control, Humans
ABSTRACT
BACKGROUND: Numerous studies have suggested an inverse relationship between drinking water hardness and cardiovascular disease. However, the weight of evidence is insufficient for the WHO to implement a health-based guideline for water hardness. This study followed WHO recommendations to assess the feasibility of using ecological time series data from areas exposed to step changes in water hardness to investigate this issue. METHOD: Monthly time series of cardiovascular mortality data, subdivided by age and sex, were systematically collected from areas reported to have undergone step changes in water hardness, calcium and magnesium in England and Wales between 1981 and 2005. Time series methods were used to investigate the effect of water hardness changes on mortality. RESULTS: No evidence was found of an association between step changes in drinking water hardness or drinking water calcium and cardiovascular mortality. The lack of areas with large populations and a reasonable change in magnesium levels precludes a definitive conclusion about the impact of this cation. We use our results on the variability of the series to consider the data requirements (population size, timing of the water hardness change) for such a study to have sufficient power. Only data from areas with large populations (>500,000) are likely to be able to detect a change of the size suggested by previous studies (rate ratio of 1.06). CONCLUSION: Ecological time series studies of populations exposed to changes in drinking water hardness may not be able to provide conclusive evidence on the links between water hardness and cardiovascular mortality unless very large populations are studied. Investigations of individuals may be more informative.
Subject(s)
Calcium/analysis, Cardiovascular Diseases/mortality, Drinking, Environmental Monitoring/methods, Magnesium/analysis, Water/chemistry, Aged, Cardiovascular Diseases/epidemiology, England/epidemiology, Epidemiological Monitoring, Feasibility Studies, Female, Humans, Male, Middle Aged, Risk Factors, Vital Statistics, Wales/epidemiology
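The population threshold quoted above can be explored with a rough power calculation. The sketch below assumes independent Poisson-distributed monthly deaths, a simple before/after comparison around the step change, and a hypothetical baseline cardiovascular mortality rate; none of these figures come from the study, and the real series' seasonality, trend and overdispersion would reduce power further.

```python
# A rough power sketch for detecting a step change in mortality rate,
# using a normal approximation to the test of the log rate ratio.
from math import log, sqrt
from scipy.stats import norm

def power_step_change(population, months_each_side, baseline_rate,
                      rate_ratio, alpha=0.05):
    """Approximate power of a two-sided test for a before/after rate change."""
    e_before = population * months_each_side * baseline_rate  # expected deaths
    e_after = e_before * rate_ratio
    se = sqrt(1 / e_before + 1 / e_after)  # SE of the log rate ratio
    z = norm.ppf(1 - alpha / 2)
    return norm.cdf(abs(log(rate_ratio)) / se - z)

# Hypothetical baseline: ~30 cardiovascular deaths per 100,000 per month,
# with five years of data on each side of the step change.
for pop in (100_000, 500_000, 1_000_000):
    print(pop, round(power_step_change(pop, 60, 30e-5, 1.06), 2))
```

Even under these optimistic assumptions, power falls away sharply for populations of around 100,000, in line with the conclusion that only large populations (>500,000) are likely to detect a rate ratio of 1.06.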
ABSTRACT
The effects of temperature on reported cases of a number of foodborne illnesses in England and Wales were investigated. We also explored whether the impact of temperature had changed over time. Food poisoning, campylobacteriosis, salmonellosis, Salmonella Typhimurium infections and Salmonella Enteritidis infections were positively associated (P<0.01) with temperature in the current and previous week. Only food poisoning, salmonellosis and S. Typhimurium infections were associated with temperature 2-5 weeks previously (P<0.01). There were also significant reductions in the impact of temperature on foodborne illnesses over time. This applies to temperature in the current and previous week for all illness types (P<0.01) except S. Enteritidis infection (P=0.079). Temperature 2-5 weeks previously diminished in importance for food poisoning and S. Typhimurium infection (P<0.001). The results are consistent with reduced pathogen concentrations in food and improved food hygiene over time. These adaptations to temperature imply that current estimates of how climate change may alter the foodborne illness burden are overly pessimistic.
Subject(s)
Campylobacter Infections/epidemiology, Foodborne Diseases/epidemiology, Salmonella Food Poisoning/epidemiology, Temperature, England/epidemiology, Greenhouse Effect, Humans, Models, Biological, Risk, Salmonella enteritidis, Salmonella typhimurium, Wales/epidemiology
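A minimal sketch of the kind of lagged temperature regression described above follows: weekly case counts are modelled against temperature in the current week, the previous week, and the average over weeks 2-5 back. The data are simulated with a built-in short-lag effect purely for illustration, and the study's actual model specification, including any change in the temperature effect over time, is not reproduced here.

```python
# A minimal sketch of a Poisson regression of weekly case counts on
# current and lagged temperature, using simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 520  # ten years of weeks
temp = 10 + 7 * np.sin(2 * np.pi * np.arange(n) / 52) + rng.normal(0, 2, n)
df = pd.DataFrame({"temp": temp})
df["temp_lag1"] = df["temp"].shift(1)
df["temp_lag2_5"] = df["temp"].shift(2).rolling(4).mean()  # mean of lags 2-5
df = df.dropna().copy()
# Simulated counts with a positive short-lag temperature effect built in.
mu = np.exp(4 + 0.02 * df["temp"] + 0.015 * df["temp_lag1"])
df["cases"] = rng.poisson(mu)

model = smf.glm("cases ~ temp + temp_lag1 + temp_lag2_5",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary().tables[1])
```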
ABSTRACT
In New Zealand, human cryptosporidiosis demonstrates spring and autumn peaks of incidence, with the spring peak being three times greater in magnitude than the autumn peak. The imbalance between the two peaks is notable and may be associated with the high livestock density in New Zealand. In summer and autumn, the cryptosporidiosis rate was positively associated with temperatures in the current and previous month, highlighting the importance of outdoor recreation to transmission. No associations between spring incidence and weather were found, providing little support for the importance of drinking-water pathways. Imported travel cases do not appear to be an important factor in the aetiology of cryptosporidiosis in New Zealand.