Results 1 - 20 of 72
1.
Prev Med ; 177: 107774, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37992976

ABSTRACT

Installation of technologies to remove or deactivate respiratory pathogens from indoor air is a plausible non-pharmaceutical infectious disease control strategy. OBJECTIVE: We undertook a systematic review of worldwide observational and experimental studies, published 1970-2022, to synthesise evidence about the effectiveness of suitable indoor air treatment technologies to prevent respiratory or gastrointestinal infections. METHODS: We searched for data about infection and symptom outcomes for persons who spent a minimum of 20 h/week in shared indoor spaces subjected to air treatment strategies hypothesised to change the risk of respiratory or gastrointestinal infections or symptoms. RESULTS: Pooled data from 32 included studies suggested no net benefits of air treatment technologies for symptom severity or symptom presence, in the absence of confirmed infection. Infection incidence was lower in three cohort studies for persons exposed to high efficiency particulate air filtration (RR 0.4, 95%CI 0.28-0.58, p < 0.001) and in one cohort study that combined ionisers with electrostatic nano filtration (RR 0.08, 95%CI 0.01-0.60, p = 0.01); other types of air treatment technologies, and air treatment in other study designs, were not strongly linked to fewer infections. The infection outcome data exhibited strong publication bias. CONCLUSIONS: Although several air treatment strategies, especially germicidal lights and high efficiency particulate air filtration, reduce pathogen detections in environmental and surface samples, robust evidence has yet to emerge that these technologies are effective at reducing respiratory or gastrointestinal infections in real-world settings. Several randomised trials have yet to report; their findings will be a welcome addition to the evidence base.
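
As a rough illustration of how cohort-study relative risks like those above can be pooled, the sketch below combines hypothetical 2x2 counts by inverse-variance weighting on the log scale; the counts are invented and are not the studies in the review.

# Illustrative sketch: inverse-variance pooling of relative risks (RR) on the
# log scale, as used when combining cohort studies of air filtration.
# The counts below are hypothetical, not data from the review.
import numpy as np
from scipy import stats

# (events_exposed, n_exposed, events_control, n_control) per cohort study
studies = [(12, 500, 30, 500), (8, 400, 22, 410), (15, 900, 35, 880)]

log_rr, var = [], []
for a, n1, c, n0 in studies:
    rr = (a / n1) / (c / n0)
    log_rr.append(np.log(rr))
    var.append(1/a - 1/n1 + 1/c - 1/n0)   # variance of log RR (delta method)

w = 1 / np.array(var)                      # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)    # fixed-effect pooled log RR
se = np.sqrt(1 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
z = pooled / se
print(f"Pooled RR {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), "
      f"p = {2 * stats.norm.sf(abs(z)):.4f}")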


Subjects
Respiratory Tract Infections, Humans, Cohort Studies, Respiratory Tract Infections/prevention & control
2.
J Water Health ; 20(10): 1506-1516, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36308495

ABSTRACT

A small island community in Malaysia uses gravity-fed drinking water and has rejected water treatment by the authorities. This study was conducted to evaluate the community's risk perception of its untreated water supply by interviewing one adult per household in four of the eight villages on the island. The survey asked questions on risk perception, socioeconomic characteristics, and perception of water supply quality. Water samples were collected from a total of 24 sampling locations across the four villages, and 91.7% of them were positive for E. coli. The study surveyed 218 households and found that 61.5% of respondents agreed to some degree that the water is safe to drink without treatment, 67.9% disagreed to some degree that drinking tap water is associated with health risks, and 73.3% agreed to some degree that it is safe to drink directly from taps fitted with water filters. Using factor analysis to group the risk perception questions and a multivariable GLM to explore relationships with underlying factors, the study found that older age, lower income, positive perception of water odour and positive perception of water supply reliability were associated with lower risk perception. The village of residence also significantly affected risk perception in the model.
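
A minimal sketch of the analysis pattern described above (factor analysis of Likert risk-perception items followed by a multivariable GLM on respondent characteristics), using simulated survey data rather than the study's own; variable names such as odour_ok are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 218
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "income": rng.integers(1, 6, n),    # ordinal income band
    "odour_ok": rng.integers(1, 6, n),  # perceived water odour (Likert)
})
# Five hypothetical risk-perception items (Likert 1-5)
items = rng.integers(1, 6, size=(n, 5)).astype(float)

# One latent "risk perception" factor extracted from the survey items
score = FactorAnalysis(n_components=1, random_state=0).fit_transform(items)[:, 0]

# Gaussian GLM of the factor score on respondent characteristics
X = sm.add_constant(df[["age", "income", "odour_ok"]])
fit = sm.GLM(score, X, family=sm.families.Gaussian()).fit()
print(fit.summary())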


Subjects
Drinking Water, Reproducibility of Results, Water Quality, Water Supply, Perception, Drinking
3.
Euro Surveill ; 27(11)2022 03.
Article in English | MEDLINE | ID: mdl-35301981

ABSTRACT

When SARS-CoV-2 Omicron emerged in 2021, S gene target failure enabled differentiation between Omicron and the dominant Delta variant. In England, where S gene target surveillance (SGTS) was already established, this led to rapid identification (within ca 3 days of sample collection) of possible Omicron cases, alongside real-time surveillance and modelling of Omicron growth. SGTS was key to public health action (including case identification and incident management), and we share applied insights on how and when to use SGTS.


Subjects
COVID-19, SARS-CoV-2, COVID-19/epidemiology, Humans, Membrane Glycoproteins/genetics, SARS-CoV-2/genetics, Spike Glycoprotein, Coronavirus/genetics, Viral Envelope Proteins/genetics
4.
Epidemiol Infect ; 149: e73, 2021 03 08.
Article in English | MEDLINE | ID: mdl-33678199

ABSTRACT

The spatio-temporal dynamics of an outbreak provide important insights to help direct public health resources intended to control transmission. They also provide a focus for detailed epidemiological studies and allow the timing and impact of interventions to be assessed. A common approach is to aggregate case data to administrative regions. Whilst providing a good visual impression of change over space, this method masks spatial variation and assumes that disease risk is constant across space. Risk factors for COVID-19 (e.g. population density, deprivation and ethnicity) vary from place to place across England, so it follows that risk will also vary spatially. Kernel density estimation compares the spatial distribution of cases relative to the underlying population, unfettered by arbitrary geographical boundaries, to produce a continuous estimate of spatially varying risk. Using test results from healthcare settings in England (Pillar 1 of the UK Government testing strategy) and freely available methods and software, we estimated the spatial and spatio-temporal risk of COVID-19 infection across England for the first 6 months of 2020. Widespread transmission was underway when partial lockdown measures were introduced on 23 March 2020, and the greatest risk tended towards large urban areas. The rapid growth phase of the outbreak coincided with multiple introductions to England from the European mainland. The spatio-temporal risk was highly labile throughout. In terms of controlling transmission, the most important practical application of our results is the accurate identification of areas within regions that may require tailored intervention strategies. We recommend that this approach be absorbed into routine surveillance outputs in England. Further risk characterisation using widespread community testing (Pillar 2) data is needed, as is the increased use of predictive spatial models at fine spatial scales.
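
A small sketch of the kernel density estimation idea, assuming simulated case and population coordinates: estimate one density surface for cases and one for the background population, and map their (log) ratio as spatially varying risk.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
cases = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(300, 2)).T   # (2, n) coordinates
popn = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(3000, 2)).T

kde_cases = gaussian_kde(cases)
kde_popn = gaussian_kde(popn)

# Evaluate both densities on a regular grid and take the log relative risk
xx, yy = np.meshgrid(np.linspace(-3, 3, 100), np.linspace(-3, 3, 100))
grid = np.vstack([xx.ravel(), yy.ravel()])
log_rr = np.log(kde_cases(grid) + 1e-12) - np.log(kde_popn(grid) + 1e-12)
risk_surface = log_rr.reshape(xx.shape)
print("Highest log relative risk on the grid:", risk_surface.max().round(2))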


Subjects
COVID-19/diagnosis, Time Factors, COVID-19/classification, COVID-19/epidemiology, England/epidemiology, Humans, Population Surveillance/methods, Risk Evaluation and Mitigation, Risk Factors, Spatio-Temporal Analysis, Urban Population/statistics & numerical data
5.
Proc Natl Acad Sci U S A ; 115(24): 6243-6248, 2018 06 12.
Article in English | MEDLINE | ID: mdl-29844166

ABSTRACT

The Paris Climate Agreement aims to hold the rise in global-mean temperature well below 2 °C and to pursue efforts to limit it to 1.5 °C above preindustrial levels. While it is recognized that there are benefits for human health in limiting global warming to 1.5 °C, the magnitude with which those societal benefits will be accrued remains unquantified. Crucial to public health preparedness and response is the understanding and quantification of such impacts at different levels of warming. Using dengue in Latin America as a case study, a climate-driven dengue generalized additive mixed model was developed to predict global warming impacts using five different global circulation models, all scaled to represent multiple global-mean temperature assumptions. We show that policies to limit global warming to 2 °C could reduce dengue cases by about 2.8 (0.8-7.4) million cases per year by the end of the century compared with a no-policy scenario that warms by 3.7 °C. Limiting warming further to 1.5 °C produces an additional drop in cases of about 0.5 (0.2-1.1) million per year. Furthermore, we found that by limiting global warming we can limit the expansion of the disease toward areas where incidence is currently low. We anticipate our study to be a starting point for more comprehensive studies incorporating socioeconomic scenarios and how they may further impact dengue incidence. Our results demonstrate that although future climate change may amplify dengue transmission in the region, impacts may be avoided by constraining the level of warming.
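
A toy illustration of the scenario arithmetic described above, assuming a made-up exposure-response curve rather than the paper's fitted generalized additive mixed model: apply the curve under different warming levels and report the avoided cases.

import numpy as np

def annual_cases(warming_c, baseline_cases=10_000_000):
    # Hypothetical exposure-response: cases grow ~12% per degree of warming
    return baseline_cases * (1.12 ** warming_c)

no_policy = annual_cases(3.7)
paris_2c = annual_cases(2.0)
paris_15c = annual_cases(1.5)
print(f"Avoided by limiting warming to 2.0 C: {no_policy - paris_2c:,.0f} cases/yr")
print(f"Additional avoided at 1.5 C:          {paris_2c - paris_15c:,.0f} cases/yr")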


Subjects
Dengue/epidemiology, Dengue/etiology, Carbon Dioxide/chemistry, Climate Change, Global Warming, Humans, Incidence, Latin America/epidemiology, Temperature
6.
Risk Anal ; 41(12): 2286-2292, 2021 12.
Article in English | MEDLINE | ID: mdl-34076284

ABSTRACT

The COVID-19 pandemic has disrupted economies and societies throughout the world since early 2020. Education is especially affected, with schools and universities widely closed for long periods. People under 25 years have the lowest risk of severe disease but their activities can be key to persistent ongoing community transmission. A challenge arose over how to provide education, including at university level, without the activities of students increasing wider community SARS-CoV-2 infections. We used a Hazard Analysis of Critical Control Points (HACCP) framework to assess the risks associated with university student activity and to recommend how to mitigate these risks. This tool appealed because it relies on multiagency collaboration and interdisciplinary expertise yet is low cost, allowing rapid generation of evidence-based recommendations. We identified key critical control points associated with university students' activities, lifestyles, and interaction patterns both on and off campus. Unacceptable contact thresholds and the most up-to-date guidance were used to identify levels of risk for potential SARS-CoV-2 transmission, as well as recommendations based on existing research and emerging evidence for strategies that can reduce the risks of transmission. Employing the preventative measures we suggest can reduce the risks of SARS-CoV-2 transmission among and from university students. Reduction of infectious disease transmission in this demographic will reduce overall community transmission, lower demands on health services and reduce the risk of harm to clinically vulnerable individuals, while allowing vital education activity to continue. HACCP assessment proved a flexible tool for risk analysis in a specific setting in response to an emerging infectious disease threat. Systematic approaches to assessing hazards and critical control points (#HACCP) enable robust strategies for protecting students and staff in higher education settings during the #COVID19 pandemic.


Subjects
COVID-19/epidemiology, Hazard Analysis and Critical Control Points, Students, Universities, COVID-19/prevention & control, COVID-19/virology, Humans, Pandemics, SARS-CoV-2/isolation & purification
7.
Bioinformatics ; 35(17): 3110-3118, 2019 09 01.
Article in English | MEDLINE | ID: mdl-30689731

ABSTRACT

MOTIVATION: Public health authorities can provide more effective and timely interventions to protect populations during health events if they have effective multi-purpose surveillance systems. These systems rely on aberration detection algorithms to identify potential threats within large datasets. Ensuring the algorithms are sensitive, specific and timely is crucial for protecting public health. Here, we evaluate the performance of three detection algorithms extensively used for syndromic surveillance: the 'rising activity, multilevel mixed effects, indicator emphasis' (RAMMIE) method and the improved quasi-Poisson regression-based method known as 'Farrington Flexible', both currently used at Public Health England (PHE), and the 'Early Aberration Reporting System' (EARS) method used at the US Centers for Disease Control and Prevention. We model the wide range of data structures encountered within the daily syndromic surveillance systems used by PHE. We undertake extensive simulations to identify which algorithms work best across different types of syndromes and different outbreak sizes. We evaluate RAMMIE for the first time since its introduction. Performance metrics were computed and compared in the presence of a range of simulated outbreak types that were added to baseline data. RESULTS: We conclude that amongst the algorithm variants that have a high specificity (i.e. >90%), Farrington Flexible has the highest sensitivity and specificity, whereas RAMMIE has the highest probability of outbreak detection and is the most timely, typically detecting outbreaks 2-3 days earlier. AVAILABILITY AND IMPLEMENTATION: R code developed for this project is available through https://github.com/FelipeJColon/AlgorithmComparison. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
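
For orientation, a minimal sketch of an EARS-style C1 detector of the kind compared in the paper: flag a day whose count exceeds the mean of the previous seven days by more than three standard deviations. The baseline series is simulated, and RAMMIE and Farrington Flexible use richer regression models not reproduced here.

import numpy as np

rng = np.random.default_rng(2)
counts = rng.poisson(20, 120).astype(float)
counts[100:105] += 25                        # injected outbreak signal

def ears_c1(y, baseline=7, threshold=3.0):
    alarms = []
    for t in range(baseline, len(y)):
        window = y[t - baseline:t]
        mu, sd = window.mean(), max(window.std(ddof=1), 1e-6)
        if (y[t] - mu) / sd > threshold:     # standardised exceedance
            alarms.append(t)
    return alarms

print("Alarm days:", ears_c1(counts))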


Subjects
Sentinel Surveillance, Algorithms, Disease Outbreaks, England, Humans
8.
J Water Health ; 18(2): 145-158, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32300088

ABSTRACT

Cholera is a severe diarrhoeal disease affecting vulnerable communities. A long-term solution to cholera transmission is improved access to and uptake of water, sanitation and hygiene (WASH). Climate change threatens WASH. A systematic review and meta-analysis determined five overarching WASH factors incorporating 17 specific WASH factors associated with cholera transmission, focussing upon community cases. Eight WASH factors showed lower odds and six showed higher odds for cholera transmission. These results were combined with findings in the climate change and WASH literature, to propose a health impact pathway illustrating potential routes through which climate change dynamics (e.g. drought, flooding) impact on WASH and cholera transmission. A causal process diagram visualising links between climate change dynamics, WASH factors, and cholera transmission was developed. Climate change dynamics can potentially affect multiple WASH factors (e.g. drought-induced reductions in handwashing and rainwater use). Multiple climate change dynamics can influence WASH factors (e.g. flooding and sea-level rise affect piped water usage). The influence of climate change dynamics on WASH factors can be negative or positive for cholera transmission (e.g. drought could increase pathogen desiccation but reduce rainwater harvesting). Identifying risk pathways helps policymakers focus on cholera risk mitigation, now and in the future.


Subjects
Cholera/transmission, Climate Change, Hygiene, Sanitation, Causality, Humans, Risk Factors, Water, Water Supply
9.
Euro Surveill ; 25(49)2020 12.
Article in English | MEDLINE | ID: mdl-33303066

ABSTRACT

BACKGROUND: Evidence for face-mask wearing in the community to protect against respiratory disease is unclear. AIM: To assess the effectiveness of wearing face masks in the community to prevent respiratory disease, and to recommend improvements to this evidence base. METHODS: We systematically searched Scopus, Embase and MEDLINE for studies evaluating respiratory disease incidence after face-mask wearing (or not). Narrative synthesis and random-effects meta-analysis of attack rates for primary and secondary prevention were performed, subgrouped by design, setting, face barrier type, and who wore the mask. The preferred outcome was influenza-like illness. Grading of Recommendations, Assessment, Development and Evaluations (GRADE) quality assessment was undertaken and deficits in the evidence base were described. RESULTS: 33 studies (12 randomised controlled trials (RCTs)) were included. Mask wearing reduced primary infection by 6% (odds ratio (OR): 0.94; 95% CI: 0.75-1.19 for RCTs) to 61% (OR: 0.85; 95% CI: 0.32-2.27; OR: 0.39; 95% CI: 0.18-0.84 and OR: 0.61; 95% CI: 0.45-0.85 for cohort, case-control and cross-sectional studies respectively). RCTs suggested the lowest secondary attack rates when both well and ill household members wore masks (OR: 0.81; 95% CI: 0.48-1.37). While RCTs might underestimate effects due to poor compliance and controls wearing masks, observational studies likely overestimate effects, as mask wearing might be associated with other risk-averse behaviours. GRADE quality was low or very low. CONCLUSION: Wearing face masks may reduce primary respiratory infection risk, probably by 6-15%. It is important to balance evidence from RCTs and observational studies when their conclusions differ widely and both are at risk of significant bias. COVID-19-specific studies are required.
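
A minimal sketch of the random-effects pooling step (DerSimonian-Laird on the log odds-ratio scale) used in meta-analyses of this kind; the odds ratios and standard errors below are placeholders, not the studies in the review.

import numpy as np

log_or = np.log(np.array([0.94, 0.61, 0.39, 0.85]))   # study log odds ratios
se = np.array([0.12, 0.17, 0.39, 0.50])               # their standard errors

w = 1 / se**2
fixed = np.sum(w * log_or) / np.sum(w)
q = np.sum(w * (log_or - fixed) ** 2)                 # Cochran's Q
k = len(log_or)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (se**2 + tau2)                             # random-effects weights
pooled = np.sum(w_re * log_or) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"Pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se_re):.2f}-{np.exp(pooled + 1.96*se_re):.2f}), "
      f"tau^2 = {tau2:.3f}")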


Subjects
COVID-19/prevention & control, Eye Protective Devices, Influenza, Human/prevention & control, Masks, Picornaviridae Infections/prevention & control, Respiratory Tract Infections/prevention & control, Tuberculosis/prevention & control, COVID-19/transmission, Coronavirus Infections/prevention & control, Coronavirus Infections/transmission, Humans, Influenza, Human/transmission, Picornaviridae Infections/transmission, Respiratory Protective Devices, Respiratory Tract Infections/transmission, SARS-CoV-2, Tuberculosis/transmission
10.
J Transl Med ; 17(1): 34, 2019 01 21.
Article in English | MEDLINE | ID: mdl-30665426

ABSTRACT

BACKGROUND: With over 800 million cases globally, campylobacteriosis is a major cause of food-borne disease. In temperate climates incidence is highly seasonal but the underlying mechanisms are poorly understood, making human disease control difficult. We hypothesised that observed disease patterns reflect complex interactions between weather, patterns of human risk behaviour, immune status and level of food contamination. Only by understanding these can we find effective interventions. METHODS: We analysed trends in human Campylobacter cases in NE England from 2004 to 2009, investigating the associations between different risk factors and disease using time-series models. We then developed an individual-based (IB) model of risk behaviour, human immunological responses to infection and environmental contamination driven by weather and land use. We parameterised the IB model for NE England and compared outputs to observed numbers of reported cases each month in the population in 2004-2009. Finally, we used it to investigate different community level disease reduction strategies. RESULTS: Risk behaviours such as countryside visits and consumption of barbecued food were strongly associated with weather: countryside visits with temperature and rainfall (t = 3.665, P < 0.001 and t = -2.187, P = 0.029, respectively), and barbecued food consumption with weekly average temperature, average maximum temperature and rain (t = 3.219, P = 0.002; t = 2.015, P = 0.045; and t = 2.254, P = 0.02527, respectively). This suggests that the effect of weather was indirect, acting through changes in risk behaviour. The seasonal pattern of cases predicted by the IB model was significantly related to observed patterns (r = 0.72, P < 0.001), indicating that simulating risk behaviour could produce the observed seasonal patterns of cases. A vaccination strategy providing short-term immunity was more effective than educational interventions to modify human risk behaviour. Extending immunity to 1 year from 20 days reduced disease burden by an order of magnitude (from 2412-2414 to 203-309 cases per 50,000 person-years). CONCLUSIONS: This is the first interdisciplinary study to integrate environment, risk behaviour, socio-demographics and immunology to model Campylobacter infection, including pathways to mitigation. We conclude that vaccination is likely to be the best route for intervening against campylobacteriosis, despite the technical problems associated with understanding both the underlying human immunology and genetic variation in the pathogen, and the likely cost of vaccine development.
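
A highly simplified sketch of the individual-based mechanism described above, assuming invented parameters: daily temperature drives the probability of risk behaviour, exposure can infect susceptible individuals, and infection confers short-lived immunity.

import numpy as np

rng = np.random.default_rng(3)
n_people, days, immunity_days = 50_000, 365, 20
temp = 12 + 8 * np.sin(2 * np.pi * (np.arange(days) - 120) / 365)   # seasonal temperature

immune_until = np.full(n_people, -1)
cases = np.zeros(days, dtype=int)
for d in range(days):
    p_behaviour = 0.002 + 0.0006 * max(temp[d] - 10, 0)   # more exposure when warm
    exposed = rng.random(n_people) < p_behaviour
    susceptible = immune_until < d
    new_cases = exposed & susceptible & (rng.random(n_people) < 0.05)
    immune_until[new_cases] = d + immunity_days            # temporary immunity
    cases[d] = new_cases.sum()

print("Simulated cases per 50,000 person-years:", cases.sum())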


Subjects
Behavior, Campylobacter Infections/epidemiology, Climate, Cost of Illness, Environment, Models, Biological, Seasons, Animals, Chickens, England/epidemiology, Humans, Rain, Temperature
11.
Epidemiol Infect ; 147: e101, 2019 01.
Article in English | MEDLINE | ID: mdl-30869042

ABSTRACT

Syndromic surveillance is a form of surveillance that generates information for public health action by collecting, analysing and interpreting routine health-related data on symptoms and clinical signs reported by patients and clinicians, rather than being based on microbiologically or clinically confirmed cases. In England, a suite of national real-time syndromic surveillance systems (SSS) has been developed over the last 20 years, utilising data from a variety of health care settings (a telehealth triage system, general practice and emergency departments). The real-time systems in England have been used for early detection (e.g. seasonal influenza), for situational awareness (e.g. describing the size and demographics of the impact of a heatwave) and for reassurance of lack of impact on population health of mass gatherings (e.g. the London 2012 Olympic and Paralympic Games). We highlight the lessons learnt from running SSS for nearly two decades, and propose questions and issues still to be addressed. We feel that syndromic surveillance is an example of the use of 'big data', but contend that the focus for sustainable and useful systems should be on the added value of such systems and the importance of people working together to maximise the public health value of syndromic surveillance services.


Subjects
Public Health Surveillance/methods, Sentinel Surveillance, England, Humans
12.
BMC Infect Dis ; 19(1): 255, 2019 Mar 13.
Article in English | MEDLINE | ID: mdl-30866826

ABSTRACT

BACKGROUND: Campylobacteriosis is a major public health concern. The weather factors that influence its spatial and seasonal distributions are not fully understood. METHODS: To investigate the impacts of temperature and rainfall on Campylobacter infections in England and Wales, cases of Campylobacter were linked to local temperature and rainfall at laboratory postcodes in the 30 days before the specimen date. Methods for investigation included comparative conditional incidence, wavelet, clustering, and time series analyses. RESULTS: The increase of Campylobacter infections in the late spring was significantly linked to temperature two weeks before, with an increase in conditional incidence of 0.175 cases per 100,000 per week for weeks 17 to 24; the relationship to temperature was not linear. A generalized structural time series model revealed that changes in temperature accounted for 33.3% of the expected cases of campylobacteriosis, with an indication of the direction and relevant temperature range. Wavelet analysis showed a strong annual cycle with additional harmonics at four and six months. Cluster analysis showed three clusters of seasonality with geographic similarities representing metropolitan, rural, and other areas. CONCLUSIONS: The association of campylobacteriosis with temperature is likely to be indirect. High-resolution spatio-temporal linkage of weather parameters and cases is important in improving weather associations with infectious diseases. The primary driver of Campylobacter incidence remains to be determined; other avenues, such as insect contamination of chicken flocks through poor biosecurity, should be explored.
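
As a sketch of how the reported seasonal harmonics (annual plus roughly four- and six-month cycles) can be recovered from a weekly case series, the snippet below uses a simple Fourier periodogram on simulated data; the paper's wavelet analysis additionally localises these cycles in time.

import numpy as np

rng = np.random.default_rng(4)
weeks = np.arange(52 * 6)                                   # six years of weekly counts
series = (100
          + 30 * np.sin(2 * np.pi * weeks / 52)             # annual cycle
          + 10 * np.sin(2 * np.pi * weeks / 26)             # six-month harmonic
          + rng.normal(0, 5, len(weeks)))

power = np.abs(np.fft.rfft(series - series.mean())) ** 2
freq = np.fft.rfftfreq(len(weeks), d=1.0)                   # cycles per week
top = np.argsort(power)[-3:][::-1]
for i in top:
    if freq[i] > 0:
        print(f"period ~ {1/freq[i]:.1f} weeks, power {power[i]:.0f}")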


Subjects
Campylobacter Infections/epidemiology, Weather, Animals, Chickens, England/epidemiology, Humans, Seasons, Wales/epidemiology
13.
Epidemiol Infect ; 146(15): 1928-1939, 2018 11.
Article in English | MEDLINE | ID: mdl-30205851

ABSTRACT

Infection with STEC O157 is relatively rare but has potentially serious sequelae, particularly for children. Large outbreaks have prompted considerable efforts designed to reduce transmission primarily from food and direct animal contact. Despite these interventions, numbers of infections have remained constant for many years and the mechanisms leading to many sporadic infections remain unclear. Here, we show that two-thirds of all cases reported in England between 2009 and 2015 were sporadic. Crude rates of infection differed geographically and were highest in rural areas during the summer months. Living in rural areas with high densities of cattle, sheep or pigs and those served by private water supplies were associated with increased risk. Living in an area of lower deprivation contributed to increased risk but this appeared to be associated with reported travel abroad. Fresh water coverage and residential proximity to the coast were not risk factors. To reduce the overall burden of infection in England, interventions designed to reduce the number of sporadic infections with STEC should focus on the residents of rural areas with high densities of livestock and the effective management of non-municipal water supplies. The role of sheep as a reservoir and potential source of infection in humans should not be overlooked.


Subjects
Bacterial Typing Techniques, Escherichia coli Infections/epidemiology, Escherichia coli O157/classification, Escherichia coli O157/isolation & purification, Spatio-Temporal Analysis, Animal Husbandry, Animals, England/epidemiology, Geography, Humans, Occupational Exposure, Risk Factors, Rural Population, Seasons, Socioeconomic Factors, Water Supply
14.
BMC Public Health ; 18(1): 544, 2018 04 24.
Article in English | MEDLINE | ID: mdl-29699520

ABSTRACT

BACKGROUND: Syndromic surveillance complements traditional public health surveillance by collecting and analysing health indicators in near real time. The rationale of syndromic surveillance is that it may detect health threats faster than traditional surveillance systems permitting more timely, and hence potentially more effective public health action. The effectiveness of syndromic surveillance largely relies on the methods used to detect aberrations. Very few studies have evaluated the performance of syndromic surveillance systems and consequently little is known about the types of events that such systems can and cannot detect. METHODS: We introduce a framework for the evaluation of syndromic surveillance systems that can be used in any setting based upon the use of simulated scenarios. For a range of scenarios this allows the time and probability of detection to be determined and uncertainty is fully incorporated. In addition, we demonstrate how such a framework can model the benefits of increases in the number of centres reporting syndromic data and also determine the minimum size of outbreaks that can or cannot be detected. Here, we demonstrate its utility using simulations of national influenza outbreaks and localised outbreaks of cryptosporidiosis. RESULTS: Influenza outbreaks are consistently detected with larger outbreaks being detected in a more timely manner. Small cryptosporidiosis outbreaks (<1000 symptomatic individuals) are unlikely to be detected. We also demonstrate the advantages of having multiple syndromic data streams (e.g. emergency attendance data, telephone helpline data, general practice consultation data) as different streams are able to detect different outbreak types with different efficacy (e.g. emergency attendance data are useful for the detection of pandemic influenza but not for outbreaks of cryptosporidiosis). We also highlight that for any one disease, the utility of data streams may vary geographically, and that the detection ability of syndromic surveillance varies seasonally (e.g. an influenza outbreak starting in July is detected sooner than one starting later in the year). We argue that our framework constitutes a useful tool for public health emergency preparedness in multiple settings. CONCLUSIONS: The proposed framework allows the exhaustive evaluation of any syndromic surveillance system and constitutes a useful tool for emergency preparedness and response.
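
A compact sketch of the framework's core loop under stated assumptions: inject simulated outbreaks of a given size into baseline syndromic counts, run a detector, and record the probability and timeliness of detection. The detector here is a simple threshold rule (as in the EARS sketch earlier), not the operational algorithms.

import numpy as np

rng = np.random.default_rng(5)

def detect(y, baseline=7, threshold=3.0):
    # Return the first day flagged as aberrant, or None
    for t in range(baseline, len(y)):
        window = y[t - baseline:t]
        if (y[t] - window.mean()) / max(window.std(ddof=1), 1e-6) > threshold:
            return t
    return None

def evaluate(outbreak_size, n_sim=500, days=60, start=30):
    detected, delays = 0, []
    for _ in range(n_sim):
        y = rng.poisson(50, days).astype(float)
        y[start:start + 7] += np.linspace(0, outbreak_size, 7)   # ramped outbreak
        t = detect(y)
        if t is not None and t >= start:
            detected += 1
            delays.append(t - start)
    return detected / n_sim, (np.median(delays) if delays else None)

for size in (10, 50, 200):
    p, delay = evaluate(size)
    print(f"outbreak size {size}: P(detect) = {p:.2f}, median delay = {delay} days")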


Subjects
Disease Outbreaks/prevention & control, Pandemics/prevention & control, Public Health Surveillance/methods, Sentinel Surveillance, Cryptosporidiosis/epidemiology, England/epidemiology, Humans, Influenza, Human/epidemiology
15.
Appl Environ Microbiol ; 83(14)2017 07 15.
Article in English | MEDLINE | ID: mdl-28500040

ABSTRACT

This paper introduces a novel method for sampling pathogens in natural environments. It uses fabric boot socks worn over walkers' shoes to allow the collection of composite samples over large areas. Wide-area sampling is better suited to studies focusing on human exposure to pathogens (e.g., recreational walking). This sampling method is implemented using a citizen science approach: groups of three walkers wearing boot socks undertook one of six routes, 40 times over 16 months in the North West (NW) and East Anglian (EA) regions of England. To validate this methodology, we report the successful implementation of this citizen science approach, the observation that Campylobacter bacteria were detected on 47% of boot socks, and the observation that multiple boot socks from individual walks produced consistent results. The findings indicate higher Campylobacter levels in the livestock-dominated NW than in EA (55.8% versus 38.6%). Seasonal differences in the presence of Campylobacter bacteria were found between the regions, with indications of winter peaks in both regions but a spring peak in the NW. The presence of Campylobacter bacteria on boot socks was negatively associated with ambient temperature (P = 0.011) and positively associated with precipitation (P < 0.001), results consistent with our understanding of Campylobacter survival and the probability of material adhering to boot socks. Campylobacter jejuni was the predominant species found; Campylobacter coli was largely restricted to the livestock-dominated NW. Source attribution analysis indicated that the potential source of C. jejuni was predominantly sheep in the NW and wild birds in EA but did not differ between peak and nonpeak periods of human incidence. IMPORTANCE: There is debate in the literature on the pathways through which pathogens are transferred from the environment to humans. We report on the success of a novel method for sampling human-pathogen interactions using boot socks and citizen science techniques, which enable us to sample human-pathogen interactions that may occur through visits to natural environments. This contrasts with traditional environmental sampling, which is based on spot sampling techniques and does not sample human-pathogen interactions. Our methods are of practical value to scientists trying to understand the transmission of pathogens from the environment to people. Our findings provide insight into the risk of Campylobacter exposure from recreational visits and an understanding of seasonal differences in risk and the factors behind these patterns. We highlight the Campylobacter species predominantly encountered and the potential sources of C. jejuni.
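
A sketch of the kind of association test reported above, assuming simulated boot-sock data with the reported signs (negative for temperature, positive for precipitation): logistic regression of Campylobacter positivity on ambient temperature and precipitation.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 480                                            # boot-sock samples
temp = rng.uniform(2, 22, n)
precip = rng.exponential(3.0, n)
logit_p = -0.2 - 0.08 * temp + 0.15 * precip       # invented effect sizes
positive = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(np.column_stack([temp, precip]))
fit = sm.Logit(positive.astype(int), X).fit(disp=0)
print(fit.params)       # intercept, temperature, precipitation coefficients
print(fit.pvalues)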


Subjects
Campylobacter Infections/microbiology, Campylobacter Infections/veterinary, Campylobacter/isolation & purification, Livestock/microbiology, Microbiological Techniques/methods, Animals, Animals, Wild/microbiology, Campylobacter/classification, Campylobacter/genetics, Campylobacter/physiology, England, Environment, Humans, Microbiological Techniques/instrumentation, Seasons, Shoes
16.
Environ Health ; 16(Suppl 1): 117, 2017 12 05.
Article in English | MEDLINE | ID: mdl-29219100

ABSTRACT

This review examined the likely impact of climate change upon food-borne disease in the UK using Campylobacter and Salmonella as example organisms. Campylobacter is an important food-borne disease and an increasing public health threat. There is a reasonable evidence base that the environment and weather play a role in its transmission to humans. However, uncertainty as to the precise mechanisms through which weather affects disease, make it difficult to assess the likely impact of climate change. There are strong positive associations between Salmonella cases and ambient temperature, and a clear understanding of the mechanisms behind this. However, because the incidence of Salmonella disease is declining in the UK, any climate change increases are likely to be small. For both Salmonella and Campylobacter the disease incidence is greatest in older adults and young children. There are many pathways through which climate change may affect food but only a few of these have been rigorously examined. This provides a high degree of uncertainty as to what the impacts of climate change will be. Food is highly controlled at the National and EU level. This provides the UK with resilience to climate change as well as potential to adapt to its consequences but it is unknown whether these are sufficient in the context of a changing climate.


Subjects
Campylobacter Infections/epidemiology, Climate Change, Foodborne Diseases/epidemiology, Salmonella Infections/epidemiology, Campylobacter/physiology, Campylobacter Infections/transmission, Foodborne Diseases/etiology, Humans, Incidence, Public Health, Salmonella/physiology, Salmonella Infections/transmission, Uncertainty, United Kingdom/epidemiology
17.
Bull World Health Organ ; 94(6): 424-32, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-27274594

ABSTRACT

OBJECTIVE: To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. METHODS: A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. FINDINGS: The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. CONCLUSION: The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease.


Subjects
Ebolavirus, Medical Waste Disposal/methods, Hemorrhagic Fever, Ebola/prevention & control, Humans
18.
Euro Surveill ; 21(41)2016 Oct 13.
Article in English | MEDLINE | ID: mdl-27762208

ABSTRACT

During August 2015, a boil water notice (BWN) was issued across parts of North West England following the detection of Cryptosporidium oocysts in the public water supply. Using prospective syndromic surveillance, we detected statistically significant increases in the presentation of cases of gastroenteritis and diarrhoea to general practitioner services and related calls to the national health telephone advice service in those areas affected by the BWN. In the affected areas, average in-hours general practitioner consultations for gastroenteritis increased by 24.8% (from 13.49 to 16.84) during the BWN period; average diarrhoea consultations increased by 28.5% (from 8.33 to 10.71). Local public health investigations revealed no laboratory reported cases confirmed as being associated with the water supply. These findings suggest that the increases reported by syndromic surveillance of cases of gastroenteritis and diarrhoea likely resulted from changes in healthcare seeking behaviour driven by the intense local and national media coverage of the potential health risks during the event. This study has further highlighted the potential for media-driven bias in syndromic surveillance, and the challenges in disentangling true increases in community infection from those driven by media reporting.


Subjects
Cryptosporidiosis/epidemiology, Cryptosporidium, Disease Outbreaks, Mass Media, Population Surveillance/methods, Water Microbiology, Water Supply, Animals, Cryptosporidiosis/diagnosis, Diarrhea/epidemiology, Diarrhea/microbiology, Disease Notification, England/epidemiology, Female, Gastroenteritis/epidemiology, Gastroenteritis/microbiology, Health Education, Humans, Prospective Studies
19.
BMC Public Health ; 14: 781, 2014 Aug 22.
Article in English | MEDLINE | ID: mdl-25149418

ABSTRACT

BACKGROUND: Dengue fever is the most prevalent mosquito-borne viral disease worldwide. Dengue transmission is critically dependent on climatic factors and there is much concern as to whether climate change would spread the disease to areas currently unaffected. The occurrence of autochthonous infections in Croatia and France in 2010 has raised concerns about a potential re-emergence of dengue in Europe. The objective of this study is to estimate dengue risk in Europe under climate change scenarios. METHODS: We used a Generalized Additive Model (GAM) to estimate dengue fever risk as a function of climatic variables (maximum temperature, minimum temperature, precipitation, humidity) and socioeconomic factors (population density, urbanisation, GDP per capita and population size), under contemporary conditions (1985-2007) in Mexico. We then used our model estimates to project dengue incidence under baseline conditions (1961-1990) and three climate change scenarios: short-term 2011-2040, medium-term 2041-2070 and long-term 2071-2100 across Europe. The model was used to calculate the average number of yearly dengue cases on a 10 × 10 km grid covering the entire land surface of the current 27 EU member states. To our knowledge, this is the first attempt to model dengue fever risk in Europe in terms of disease occurrence rather than mosquito presence. RESULTS: The results were presented using a Geographical Information System (GIS) and allowed identification of areas at high risk. Dengue fever hot spots were clustered around the coastal areas of the Mediterranean and Adriatic seas and the Po Valley in northern Italy. CONCLUSIONS: This risk assessment study is likely to be a valuable tool assisting effective and targeted adaptation responses to reduce the likely increased burden of dengue fever in a warmer world.
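
An illustrative sketch of the modelling pattern, assuming simulated data and hypothetical variable names (tmax, precip, pop_density): a Poisson regression with spline terms for climate covariates plus a socioeconomic term, then prediction over a regular grid. The paper's model is a fuller GAM with additional covariates and scenarios.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "tmax": rng.uniform(5, 35, n),            # maximum temperature
    "precip": rng.uniform(0, 300, n),         # precipitation
    "pop_density": rng.uniform(1, 5000, n),   # people per km^2
})
mu = 100 * np.exp(-4 + 0.12 * df.tmax + 0.002 * df.precip + 0.0002 * df.pop_density)
df["cases"] = rng.poisson(mu)

model = smf.glm("cases ~ bs(tmax, df=5) + bs(precip, df=5) + np.log(pop_density)",
                data=df, family=sm.families.Poisson()).fit()

# Predict expected yearly cases over a regular climate grid (fixed population density)
grid = pd.DataFrame({
    "tmax": np.repeat(np.linspace(5, 35, 30), 30),
    "precip": np.tile(np.linspace(0, 300, 30), 30),
    "pop_density": 500.0,
})
grid["expected_cases"] = model.predict(grid)
print(grid.sort_values("expected_cases", ascending=False).head())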


Subjects
Climate Change, Dengue/epidemiology, Aedes, Animals, Europe/epidemiology, Geographic Information Systems, Humans, Incidence, Models, Theoretical, Population Density, Risk Assessment, Weather
20.
Lancet Microbe ; 5(2): e173-e180, 2024 02.
Article in English | MEDLINE | ID: mdl-38244555

ABSTRACT

BACKGROUND: Whole-genome sequencing (WGS) is the gold standard diagnostic tool to identify and genetically characterise emerging pathogen mutations (variants), but cost, capacity, and timeliness limit its use when large populations need rapidly assessing. We assessed the potential of genotyping assays to provide accurate and timely variant information at scale by retrospectively examining surveillance for SARS-CoV-2 variants in England between March and September, 2021, when genotyping assays were used widely for variant detection. METHODS: We chose a panel of four RT-PCR genotyping assays to detect circulating variants of SARS-CoV-2 in England and developed a decision algorithm to assign a probable SARS-CoV-2 variant to samples using the assay results. We extracted surveillance data from the UK Health Security Agency databases for 115 934 SARS-CoV-2-positive samples (March 1-Sept 6, 2021) when variant information was available from both genotyping and WGS. By comparing the genotyping and WGS variant result, we calculated accuracy metrics (ie, sensitivity, specificity, and positive predictive value [PPV]) and the time difference between the sample collection date and the availability of variant information. We assessed the number of samples with a variant assigned from genotyping or WGS, or both, over time. FINDINGS: Genotyping and an initial decision algorithm (April 10-May 11, 2021 data) were accurate for key variant assignment: sensitivities and PPVs were 0·99 (95% CI 0·99-0·99) for the alpha, 1·00 (1·00-1·00) for the beta, and 0·91 (0·80-1·00) for the gamma variants; specificities were 0·97 (0·96-0·98), 1·00 (1·00-1·00), and 1·00 (1·00-1·00), respectively. A subsequent decision algorithm over a longer time period (May 27-Sept 6, 2021 data) remained accurate for key variant assignment: sensitivities were 0·91 (95% CI 0·74-1·00) for the beta, 0·98 (0·98-0·99) for the delta, and 0·93 (0·81-1·00) for the gamma variants; specificities were 1·00 (1·00-1·00), 0·96 (0·96-0·97), and 1·00 (1·00-1·00), respectively; and PPVs were 0·83 (0·62-1·00), 1·00 (1·00-1·00), and 0·78 (0·59-0·97), respectively. Genotyping produced variant information a median of 3 days (IQR 2-4) after the sample collection date, which was faster than with WGS (9 days [8-11]). The flexibility of genotyping enabled a nine-fold increase in the quantity of samples tested for variants by this method (from 5000 to 45 000). INTERPRETATION: RT-PCR genotyping assays are suitable for high-throughput variant surveillance and could complement WGS, enabling larger scale testing for known variants and timelier results, with important implications for effective public health responses and disease control globally, especially in settings with low WGS capacity. However, the choice of panels of RT-PCR assays is highly dependent on database information on circulating variants generated by WGS, which could limit the use of genotyping assays when new variants are emerging and spreading rapidly. FUNDING: UK Health Security Agency and National Institute for Health Research Health Protection Research Unit in Emergency Preparedness and Response.
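
A small sketch of the accuracy calculation described above for a single variant, assuming invented counts: treat the WGS call as truth, cross-tabulate it against the genotyping call, and compute sensitivity, specificity and PPV.

# Hypothetical counts for one variant (not the study's data):
# rows: WGS says variant / not variant; columns: genotyping says variant / not variant
tp, fn = 9_800, 120       # WGS-positive samples, split by genotyping call
fp, tn = 310, 12_000      # WGS-negative samples, split by genotyping call

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, PPV {ppv:.2f}")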


Subjects
COVID-19, Humans, COVID-19/diagnosis, COVID-19/epidemiology, Genotype, Retrospective Studies, Reverse Transcriptase Polymerase Chain Reaction, SARS-CoV-2/genetics, England/epidemiology, COVID-19 Testing