Results 1-20 of 49
1.
Prev Med ; 177: 107774, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37992976

ABSTRACT

Installation of technologies to remove or deactivate respiratory pathogens from indoor air is a plausible non-pharmaceutical infectious disease control strategy. OBJECTIVE: We undertook a systematic review of worldwide observational and experimental studies, published 1970-2022, to synthesise evidence about the effectiveness of suitable indoor air treatment technologies to prevent respiratory or gastrointestinal infections. METHODS: We searched for data about infection and symptom outcomes for persons who spent a minimum of 20 h/week in shared indoor spaces subjected to air treatment strategies hypothesised to change the risk of respiratory or gastrointestinal infections or symptoms. RESULTS: Pooled data from 32 included studies suggested no net benefits of air treatment technologies for symptom severity or symptom presence, in the absence of confirmed infection. Infection incidence was lower in three cohort studies for persons exposed to high efficiency particulate air (HEPA) filtration (RR 0.4, 95%CI 0.28-0.58, p < 0.001) and in one cohort study that combined ionisers with electrostatic nano filtration (RR 0.08, 95%CI 0.01-0.60, p = 0.01); other types of air treatment technologies, and air treatment in other study designs, were not strongly linked to fewer infections. The infection outcome data exhibited strong publication bias. CONCLUSIONS: Although contamination in environmental and surface samples is reduced by several air treatment strategies, especially germicidal lights and HEPA filtration, robust evidence has yet to emerge that these technologies are effective at reducing respiratory or gastrointestinal infections in real-world settings. Several randomised trials have yet to report; their data will be a welcome addition to the evidence base.


Subjects
Respiratory Tract Infections, Humans, Cohort Studies, Respiratory Tract Infections/prevention & control
2.
J Water Health ; 20(10): 1506-1516, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36308495

ABSTRACT

A small island community in Malaysia uses gravity-fed drinking water and has rejected water treatment by the authorities. This study evaluated the community's risk perception of its untreated water supply by interviewing one adult per household in four of the island's eight villages. The survey asked about risk perception, socioeconomic characteristics, and perception of water supply quality. Water samples were collected from 24 sampling locations across the four villages; 91.7% of them were positive for E. coli. The study surveyed 218 households and found that 61.5% of respondents agreed to some degree that the water is safe to drink without treatment, 67.9% disagreed to some degree that drinking tap water is associated with health risks, and 73.3% agreed to some degree that it is safe to drink directly from taps fitted with water filters. Using factor analysis to group the risk perception questions and a multivariable GLM to explore relationships with underlying factors, the study found that older age, lower income, positive perception of water odour and positive perception of water supply reliability were associated with lower risk perception. The village of residence also significantly affected risk perception in the model.


Subjects
Drinking Water, Reproducibility of Results, Water Quality, Water Supply, Perception, Drinking
3.
Euro Surveill ; 27(11)2022 03.
Article in English | MEDLINE | ID: mdl-35301981

ABSTRACT

When SARS-CoV-2 Omicron emerged in 2021, S gene target failure enabled differentiation between Omicron and the dominant Delta variant. In England, where S gene target surveillance (SGTS) was already established, this led to rapid identification (within ca 3 days of sample collection) of possible Omicron cases, alongside real-time surveillance and modelling of Omicron growth. SGTS was key to public health action (including case identification and incident management), and we share applied insights on how and when to use SGTS.


Subjects
COVID-19, SARS-CoV-2, COVID-19/epidemiology, Humans, Membrane Glycoproteins/genetics, SARS-CoV-2/genetics, Spike Glycoprotein, Coronavirus/genetics, Viral Envelope Proteins/genetics
4.
Epidemiol Infect ; 149: e73, 2021 03 08.
Article in English | MEDLINE | ID: mdl-33678199

ABSTRACT

The spatio-temporal dynamics of an outbreak provide important insights to help direct public health resources intended to control transmission. They also provide a focus for detailed epidemiological studies and allow the timing and impact of interventions to be assessed. A common approach is to aggregate case data to administrative regions. Whilst providing a good visual impression of change over space, this method masks spatial variation and assumes that disease risk is constant across space. Risk factors for COVID-19 (e.g. population density, deprivation and ethnicity) vary from place to place across England, so it follows that risk will also vary spatially. Kernel density estimation compares the spatial distribution of cases relative to the underlying population, unfettered by arbitrary geographical boundaries, to produce a continuous estimate of spatially varying risk. Using test results from healthcare settings in England (Pillar 1 of the UK Government testing strategy) and freely available methods and software, we estimated the spatial and spatio-temporal risk of COVID-19 infection across England for the first 6 months of 2020. Widespread transmission was underway when partial lockdown measures were introduced on 23 March 2020, and the greatest risk was concentrated in large urban areas. The rapid growth phase of the outbreak coincided with multiple introductions to England from the European mainland. The spatio-temporal risk was highly labile throughout. In terms of controlling transmission, the most important practical application of our results is the accurate identification of areas within regions that may require tailored intervention strategies. We recommend that this approach is absorbed into routine surveillance outputs in England. Further risk characterisation using widespread community testing (Pillar 2) data is needed, as is the increased use of predictive spatial models at fine spatial scales.
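The kernel density approach described above can be illustrated in a few lines: smooth case and population locations with the same kernel and take the ratio of the two surfaces. The snippet below is a toy numpy sketch with synthetic coordinates and an arbitrary bandwidth, not the published analysis.

```python
import numpy as np

def gaussian_kde_2d(points, grid, bandwidth):
    """Evaluate a 2-D Gaussian kernel density estimate at the grid locations."""
    diffs = grid[:, None, :] - points[None, :, :]            # (G, N, 2)
    sq = (diffs ** 2).sum(axis=2) / (2 * bandwidth ** 2)
    dens = np.exp(-sq).sum(axis=1)
    return dens / (2 * np.pi * bandwidth ** 2 * len(points))

def relative_risk_surface(cases, population, grid, bandwidth):
    """Case density divided by population density: a spatially varying risk estimate."""
    f_cases = gaussian_kde_2d(cases, grid, bandwidth)
    f_pop = gaussian_kde_2d(population, grid, bandwidth)
    return f_cases / np.maximum(f_pop, 1e-12)

# Toy data: cases clustered near the origin, population spread more widely.
rng = np.random.default_rng(42)
population = rng.normal(0, 2.0, size=(500, 2))
cases = rng.normal(0, 0.5, size=(60, 2))
grid = np.array([[0.0, 0.0], [3.0, 3.0]])                    # centre vs periphery
rr = relative_risk_surface(cases, population, grid, bandwidth=1.0)
print(rr)  # risk should be elevated at the case cluster relative to the periphery
```

In practice the ratio would be evaluated on a fine grid over the study region, with bandwidth chosen by cross-validation rather than fixed by hand.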


Subjects
COVID-19/diagnosis, Time Factors, COVID-19/classification, COVID-19/epidemiology, England/epidemiology, Humans, Population Surveillance/methods, Risk Evaluation and Mitigation, Risk Factors, Spatio-Temporal Analysis, Urban Population/statistics & numerical data
5.
Proc Natl Acad Sci U S A ; 115(24): 6243-6248, 2018 06 12.
Article in English | MEDLINE | ID: mdl-29844166

ABSTRACT

The Paris Climate Agreement aims to hold the rise in global-mean temperature to well below 2 °C above preindustrial levels and to pursue efforts to limit it to 1.5 °C. While it is recognized that there are benefits for human health in limiting global warming to 1.5 °C, the magnitude of those societal benefits remains unquantified. Crucial to public health preparedness and response is the understanding and quantification of such impacts at different levels of warming. Using dengue in Latin America as a case study, a climate-driven generalized additive mixed model of dengue was developed to predict global warming impacts using five different global circulation models, all scaled to represent multiple global-mean temperature assumptions. We show that policies to limit global warming to 2 °C could reduce dengue cases by about 2.8 (0.8-7.4) million cases per year by the end of the century compared with a no-policy scenario that warms by 3.7 °C. Limiting warming further to 1.5 °C produces an additional drop of about 0.5 (0.2-1.1) million cases per year. Furthermore, we found that limiting global warming would also limit the expansion of the disease toward areas where incidence is currently low. We anticipate our study to be a starting point for more comprehensive studies incorporating socioeconomic scenarios and how they may further impact dengue incidence. Our results demonstrate that although future climate change may amplify dengue transmission in the region, these impacts may be avoided by constraining the level of warming.
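The scenario logic (fit an incidence-temperature relationship, then project cases under different warming levels) can be sketched as below. The synthetic data and cubic response curve are invented for illustration; the paper's generalized additive mixed model and climate inputs are far richer.

```python
import numpy as np

# Toy stand-in for a climate-driven incidence model: fit a smooth
# incidence-temperature curve on synthetic "historical" data, then
# compare projected cases under 2.0 C and 3.7 C of additional warming.
rng = np.random.default_rng(7)
temp = rng.uniform(18, 32, 300)                       # historical mean temps (C)
incidence = np.exp(0.12 * (temp - 18)) + rng.normal(0, 0.2, 300)

# Fit a cubic polynomial on the log scale (a crude smooth term).
coef = np.polyfit(temp, np.log(np.clip(incidence, 0.05, None)), deg=3)

def predict(t):
    """Predicted incidence at temperature(s) t from the fitted curve."""
    return np.exp(np.polyval(coef, t))

baseline_temp = np.full(100, 26.0)                    # 100 hypothetical locations
cases_2c = predict(baseline_temp + 2.0).sum()
cases_37c = predict(baseline_temp + 3.7).sum()
print(cases_2c < cases_37c)  # limiting warming limits projected cases
```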


Subjects
Dengue/epidemiology, Dengue/etiology, Carbon Dioxide/chemistry, Climate Change, Global Warming, Humans, Incidence, Latin America/epidemiology, Temperature
6.
Bioinformatics ; 35(17): 3110-3118, 2019 09 01.
Article in English | MEDLINE | ID: mdl-30689731

ABSTRACT

MOTIVATION: Public health authorities can provide more effective and timely interventions to protect populations during health events if they have effective multi-purpose surveillance systems. These systems rely on aberration detection algorithms to identify potential threats within large datasets. Ensuring the algorithms are sensitive, specific and timely is crucial for protecting public health. Here, we evaluate the performance of three detection algorithms extensively used for syndromic surveillance: the 'rising activity, multilevel mixed effects, indicator emphasis' (RAMMIE) method and the improved quasi-Poisson regression-based method known as 'Farrington Flexible', both currently used at Public Health England (PHE), and the 'Early Aberration Reporting System' (EARS) method used at the US Centers for Disease Control and Prevention. We model the wide range of data structures encountered within the daily syndromic surveillance systems used by PHE. We undertake extensive simulations to identify which algorithms work best across different types of syndromes and different outbreak sizes, and we evaluate RAMMIE for the first time since its introduction. Performance metrics were computed and compared in the presence of a range of simulated outbreak types added to baseline data. RESULTS: We conclude that, amongst the algorithm variants with high specificity (i.e. >90%), Farrington Flexible has the highest sensitivity and specificity, whereas RAMMIE has the highest probability of outbreak detection and is the most timely, typically detecting outbreaks 2-3 days earlier. AVAILABILITY AND IMPLEMENTATION: R code developed for this project is available at https://github.com/FelipeJColon/AlgorithmComparison. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
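As a rough illustration of the aberration detection task, here is a simplified EARS-style detector: the C1 variant flags a count exceeding the mean plus three standard deviations of a short baseline window. The window length, threshold floor and toy series are illustrative assumptions, not the implementations that were evaluated.

```python
import numpy as np

def ears_c1(counts, baseline_days=7, threshold=3.0):
    """Flag days whose count exceeds mean + threshold * SD of the prior window."""
    counts = np.asarray(counts, dtype=float)
    alarms = []
    for t in range(baseline_days, len(counts)):
        window = counts[t - baseline_days:t]
        mu = window.mean()
        sigma = max(window.std(ddof=1), 0.5)   # floor avoids divide-by-zero
        if (counts[t] - mu) / sigma > threshold:
            alarms.append(t)
    return alarms

daily = [10, 12, 9, 11, 10, 13, 11, 12, 10, 40]   # spike on the final day
print(ears_c1(daily))  # [9]
```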


Subjects
Sentinel Surveillance, Algorithms, Disease Outbreaks, England, Humans
7.
Euro Surveill ; 25(49)2020 12.
Article in English | MEDLINE | ID: mdl-33303066

ABSTRACT

BACKGROUND: Evidence for face-mask wearing in the community to protect against respiratory disease is unclear. AIM: To assess the effectiveness of wearing face masks in the community to prevent respiratory disease, and to recommend improvements to this evidence base. METHODS: We systematically searched Scopus, Embase and MEDLINE for studies evaluating respiratory disease incidence after face-mask wearing (or not). Narrative synthesis and random-effects meta-analysis of attack rates for primary and secondary prevention were performed, subgrouped by design, setting, face barrier type, and who wore the mask. The preferred outcome was influenza-like illness. Grading of Recommendations, Assessment, Development and Evaluations (GRADE) quality assessment was undertaken and deficits in the evidence base were described. RESULTS: 33 studies (12 randomised controlled trials (RCTs)) were included. Mask wearing reduced primary infection by 6% (odds ratio (OR): 0.94; 95% CI: 0.75-1.19 for RCTs) to 61% (OR: 0.85; 95% CI: 0.32-2.27; OR: 0.39; 95% CI: 0.18-0.84 and OR: 0.61; 95% CI: 0.45-0.85 for cohort, case-control and cross-sectional studies respectively). RCTs suggested the lowest secondary attack rates when both well and ill household members wore masks (OR: 0.81; 95% CI: 0.48-1.37). While RCTs might underestimate effects due to poor compliance and controls wearing masks, observational studies likely overestimate effects, as mask wearing might be associated with other risk-averse behaviours. GRADE was low or very low quality. CONCLUSION: Wearing face masks may reduce primary respiratory infection risk, probably by 6-15%. It is important to balance evidence from RCTs and observational studies when their conclusions differ widely and both are at risk of significant bias. COVID-19-specific studies are required.
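The random-effects pooling step can be sketched with a DerSimonian-Laird calculation in the standard library, recovering standard errors from the 95% CI widths on the log-odds scale. The three input ORs reuse the observational estimates quoted above, so this is a simplified stand-in for the published meta-analysis, not a reproduction of it.

```python
import math

def pooled_or_dl(odds_ratios, ci_lows, ci_highs):
    """DerSimonian-Laird random-effects pooling of odds ratios given 95% CIs."""
    y = [math.log(o) for o in odds_ratios]
    # SE on the log scale from the CI width: (ln(high) - ln(low)) / (2 * 1.96).
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)
          for l, h in zip(ci_lows, ci_highs)]
    w = [1 / s ** 2 for s in se]
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # Between-study heterogeneity (tau^2) from Cochran's Q.
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_star = [1 / (s ** 2 + tau2) for s in se]
    y_re = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    return math.exp(y_re)

# The three observational ORs (cohort, case-control, cross-sectional) above.
pooled = pooled_or_dl([0.85, 0.39, 0.61], [0.32, 0.18, 0.45], [2.27, 0.84, 0.85])
print(round(pooled, 2))  # 0.59
```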


Subjects
COVID-19/prevention & control, Eye Protective Devices, Influenza, Human/prevention & control, Masks, Picornaviridae Infections/prevention & control, Respiratory Tract Infections/prevention & control, Tuberculosis/prevention & control, COVID-19/transmission, Coronavirus Infections/prevention & control, Coronavirus Infections/transmission, Humans, Influenza, Human/transmission, Picornaviridae Infections/transmission, Respiratory Protective Devices, Respiratory Tract Infections/transmission, SARS-CoV-2, Tuberculosis/transmission
8.
BMC Infect Dis ; 19(1): 255, 2019 Mar 13.
Article in English | MEDLINE | ID: mdl-30866826

ABSTRACT

BACKGROUND: Campylobacteriosis is a major public health concern. The weather factors that influence its spatial and seasonal distributions are not fully understood. METHODS: To investigate the impacts of temperature and rainfall on Campylobacter infections in England and Wales, cases of Campylobacter were linked to local temperature and rainfall at laboratory postcodes in the 30 days before the specimen date. Methods for investigation included comparative conditional incidence, wavelet, clustering, and time series analyses. RESULTS: The increase of Campylobacter infections in the late spring was significantly linked to temperature two weeks before, with an increase in conditional incidence of 0.175 cases per 100,000 per week for weeks 17 to 24; the relationship to temperature was not linear. A generalized structural time series model revealed that changes in temperature accounted for 33.3% of the expected cases of campylobacteriosis, with an indication of the direction and relevant temperature range. Wavelet analysis showed a strong annual cycle with additional harmonics at four and six months. Cluster analysis showed three clusters of seasonality with geographic similarities representing metropolitan, rural, and other areas. CONCLUSIONS: The association of campylobacteriosis with temperature is likely to be indirect. High-resolution spatio-temporal linkage of weather parameters and cases is important in improving weather associations with infectious diseases. The primary driver of Campylobacter incidence remains to be determined; other avenues, such as insect contamination of chicken flocks through poor biosecurity, should be explored.
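One simple way to reproduce an annual cycle with additional four- and six-month harmonics, of the kind identified by the wavelet analysis, is harmonic regression: fit cosine and sine terms at the corresponding periods by least squares. The numpy sketch below uses synthetic weekly counts; it is not the wavelet method used in the paper.

```python
import numpy as np

def harmonic_fit(weekly_counts, periods=(52, 26, 13)):
    """Least-squares fit of an intercept plus seasonal harmonics (in weeks)."""
    t = np.arange(len(weekly_counts))
    cols = [np.ones(len(t))]
    for p in periods:
        cols += [np.cos(2 * np.pi * t / p), np.sin(2 * np.pi * t / p)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, weekly_counts, rcond=None)
    return X @ coef                       # fitted seasonal curve

# Toy series: two years of weekly counts with an annual cycle plus noise.
rng = np.random.default_rng(1)
t = np.arange(104)
counts = 100 + 30 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 5, 104)
fitted = harmonic_fit(counts)
residual_sd = np.std(counts - fitted)
print(residual_sd < np.std(counts - counts.mean()))  # harmonics absorb the cycle
```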


Subjects
Campylobacter Infections/epidemiology, Weather, Animals, Chickens, England/epidemiology, Humans, Seasons, Wales/epidemiology
9.
Epidemiol Infect ; 146(15): 1928-1939, 2018 11.
Article in English | MEDLINE | ID: mdl-30205851

ABSTRACT

Infection with STEC O157 is relatively rare but has potentially serious sequelae, particularly for children. Large outbreaks have prompted considerable efforts designed to reduce transmission primarily from food and direct animal contact. Despite these interventions, numbers of infections have remained constant for many years and the mechanisms leading to many sporadic infections remain unclear. Here, we show that two-thirds of all cases reported in England between 2009 and 2015 were sporadic. Crude rates of infection differed geographically and were highest in rural areas during the summer months. Living in rural areas with high densities of cattle, sheep or pigs and those served by private water supplies were associated with increased risk. Living in an area of lower deprivation contributed to increased risk but this appeared to be associated with reported travel abroad. Fresh water coverage and residential proximity to the coast were not risk factors. To reduce the overall burden of infection in England, interventions designed to reduce the number of sporadic infections with STEC should focus on the residents of rural areas with high densities of livestock and the effective management of non-municipal water supplies. The role of sheep as a reservoir and potential source of infection in humans should not be overlooked.


Subjects
Bacterial Typing Techniques, Escherichia coli Infections/epidemiology, Escherichia coli O157/classification, Escherichia coli O157/isolation & purification, Spatio-Temporal Analysis, Animal Husbandry, Animals, England/epidemiology, Geography, Humans, Occupational Exposure, Risk Factors, Rural Population, Seasons, Socioeconomic Factors, Water Supply
10.
BMC Public Health ; 18(1): 544, 2018 04 24.
Article in English | MEDLINE | ID: mdl-29699520

ABSTRACT

BACKGROUND: Syndromic surveillance complements traditional public health surveillance by collecting and analysing health indicators in near real time. The rationale of syndromic surveillance is that it may detect health threats faster than traditional surveillance systems, permitting more timely, and hence potentially more effective, public health action. The effectiveness of syndromic surveillance largely relies on the methods used to detect aberrations. Very few studies have evaluated the performance of syndromic surveillance systems and consequently little is known about the types of events that such systems can and cannot detect. METHODS: We introduce a framework for the evaluation of syndromic surveillance systems that can be used in any setting, based upon the use of simulated scenarios. For a range of scenarios this allows the time and probability of detection to be determined, with uncertainty fully incorporated. In addition, we demonstrate how such a framework can model the benefits of increases in the number of centres reporting syndromic data and also determine the minimum size of outbreaks that can or cannot be detected. Here, we demonstrate its utility using simulations of national influenza outbreaks and localised outbreaks of cryptosporidiosis. RESULTS: Influenza outbreaks are consistently detected, with larger outbreaks being detected in a more timely manner. Small cryptosporidiosis outbreaks (<1000 symptomatic individuals) are unlikely to be detected. We also demonstrate the advantages of having multiple syndromic data streams (e.g. emergency attendance data, telephone helpline data, general practice consultation data), as different streams are able to detect different outbreak types with different efficacy (e.g. emergency attendance data are useful for the detection of pandemic influenza but not for outbreaks of cryptosporidiosis).
We also highlight that for any one disease, the utility of data streams may vary geographically, and that the detection ability of syndromic surveillance varies seasonally (e.g. an influenza outbreak starting in July is detected sooner than one starting later in the year). We argue that our framework constitutes a useful tool for public health emergency preparedness in multiple settings. CONCLUSIONS: The proposed framework allows the exhaustive evaluation of any syndromic surveillance system and constitutes a useful tool for emergency preparedness and response.
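The injection-based evaluation idea can be sketched as: simulate baseline counts, inject an outbreak of a given size and shape, and record how often an alarm fires. Everything below (Poisson baseline, the five-day outbreak shape, the z-score detector) is an illustrative assumption, not the framework's actual components.

```python
import numpy as np

rng = np.random.default_rng(0)

def detect(series, baseline_days=7, z=3.0):
    """Return the first day a z-score alarm fires, or None if none does."""
    for t in range(baseline_days, len(series)):
        w = series[t - baseline_days:t]
        sigma = max(w.std(ddof=1), 0.5)
        if (series[t] - w.mean()) / sigma > z:
            return t
    return None

def detection_probability(outbreak_size, n_sims=500, days=30, baseline_mean=20):
    """Monte Carlo estimate of the probability that an injected outbreak is detected."""
    shape = np.array([0.1, 0.25, 0.3, 0.25, 0.1])   # outbreak spread over 5 days
    hits = 0
    for _ in range(n_sims):
        series = rng.poisson(baseline_mean, days).astype(float)
        series[14:19] += rng.poisson(outbreak_size * shape)
        if detect(series) is not None:
            hits += 1
    return hits / n_sims

small, large = detection_probability(20), detection_probability(200)
print(small, large)  # larger outbreaks should be detected far more often
```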


Subjects
Disease Outbreaks/prevention & control, Pandemics/prevention & control, Public Health Surveillance/methods, Sentinel Surveillance, Cryptosporidiosis/epidemiology, England/epidemiology, Humans, Influenza, Human/epidemiology
11.
Appl Environ Microbiol ; 83(14)2017 07 15.
Article in English | MEDLINE | ID: mdl-28500040

ABSTRACT

This paper introduces a novel method for sampling pathogens in natural environments. It uses fabric boot socks worn over walkers' shoes to allow the collection of composite samples over large areas. Wide-area sampling is better suited to studies focusing on human exposure to pathogens (e.g., recreational walking). This sampling method is implemented using a citizen science approach: groups of three walkers wearing boot socks undertook one of six routes, 40 times over 16 months, in the North West (NW) and East Anglian (EA) regions of England. To validate this methodology, we report the successful implementation of this citizen science approach, the observation that Campylobacter bacteria were detected on 47% of boot socks, and the observation that multiple boot socks from individual walks produced consistent results. The findings indicate higher Campylobacter levels in the livestock-dominated NW than in EA (55.8% versus 38.6%). Seasonal differences in the presence of Campylobacter bacteria were found between the regions, with indications of winter peaks in both regions but a spring peak in the NW. The presence of Campylobacter bacteria on boot socks was negatively associated with ambient temperature (P = 0.011) and positively associated with precipitation (P < 0.001), results consistent with our understanding of Campylobacter survival and the probability of material adhering to boot socks. Campylobacter jejuni was the predominant species found; Campylobacter coli was largely restricted to the livestock-dominated NW. Source attribution analysis indicated that the potential source of C. jejuni was predominantly sheep in the NW and wild birds in EA but did not differ between peak and nonpeak periods of human incidence. IMPORTANCE: There is debate in the literature on the pathways through which pathogens are transferred from the environment to humans.
We report on the success of a novel method for sampling human-pathogen interactions using boot socks and citizen science techniques, which enable us to sample human-pathogen interactions that may occur through visits to natural environments. This contrasts with traditional environmental sampling, which is based on spot sampling techniques and does not sample human-pathogen interactions. Our methods are of practical value to scientists trying to understand the transmission of pathogens from the environment to people. Our findings provide insight into the risk of Campylobacter exposure from recreational visits and an understanding of seasonal differences in risk and the factors behind these patterns. We highlight the Campylobacter species predominantly encountered and the potential sources of C. jejuni.


Subjects
Campylobacter Infections/microbiology, Campylobacter Infections/veterinary, Campylobacter/isolation & purification, Livestock/microbiology, Microbiological Techniques/methods, Animals, Animals, Wild/microbiology, Campylobacter/classification, Campylobacter/genetics, Campylobacter/physiology, England, Environment, Humans, Microbiological Techniques/instrumentation, Seasons, Shoes
12.
Environ Health ; 16(Suppl 1): 117, 2017 12 05.
Article in English | MEDLINE | ID: mdl-29219100

ABSTRACT

This review examined the likely impact of climate change on food-borne disease in the UK, using Campylobacter and Salmonella as example organisms. Campylobacter is an important food-borne disease and an increasing public health threat. There is a reasonable evidence base that the environment and weather play a role in its transmission to humans. However, uncertainty as to the precise mechanisms through which weather affects disease makes it difficult to assess the likely impact of climate change. There are strong positive associations between Salmonella cases and ambient temperature, and a clear understanding of the mechanisms behind this. However, because the incidence of Salmonella disease is declining in the UK, any climate change increases are likely to be small. For both Salmonella and Campylobacter, disease incidence is greatest in older adults and young children. There are many pathways through which climate change may affect food, but only a few of these have been rigorously examined. This leaves a high degree of uncertainty as to what the impacts of climate change will be. Food is highly controlled at the national and EU level. This provides the UK with resilience to climate change as well as the potential to adapt to its consequences, but it is unknown whether these are sufficient in the context of a changing climate.


Subjects
Campylobacter Infections/epidemiology, Climate Change, Foodborne Diseases/epidemiology, Salmonella Infections/epidemiology, Campylobacter/physiology, Campylobacter Infections/transmission, Foodborne Diseases/etiology, Humans, Incidence, Public Health, Salmonella/physiology, Salmonella Infections/transmission, Uncertainty, United Kingdom/epidemiology
13.
BMC Public Health ; 14: 781, 2014 Aug 22.
Article in English | MEDLINE | ID: mdl-25149418

ABSTRACT

BACKGROUND: Dengue fever is the most prevalent mosquito-borne viral disease worldwide. Dengue transmission is critically dependent on climatic factors, and there is much concern as to whether climate change would spread the disease to areas currently unaffected. The occurrence of autochthonous infections in Croatia and France in 2010 has raised concerns about a potential re-emergence of dengue in Europe. The objective of this study is to estimate dengue risk in Europe under climate change scenarios. METHODS: We used a Generalized Additive Model (GAM) to estimate dengue fever risk as a function of climatic variables (maximum temperature, minimum temperature, precipitation, humidity) and socioeconomic factors (population density, urbanisation, GDP per capita and population size) under contemporary conditions (1985-2007) in Mexico. We then used our model estimates to project dengue incidence under baseline conditions (1961-1990) and three climate change scenarios (short-term, 2011-2040; medium-term, 2041-2070; long-term, 2071-2100) across Europe. The model was used to calculate the average number of yearly dengue cases on a 10 × 10 km grid covering the land surface of the (then) 27 EU member states. To our knowledge, this is the first attempt to model dengue fever risk in Europe in terms of disease occurrence rather than mosquito presence. RESULTS: The results were mapped using a Geographical Information System (GIS), allowing identification of areas at high risk. Dengue fever hot spots were clustered around the coastal areas of the Mediterranean and Adriatic seas and the Po Valley in northern Italy. CONCLUSIONS: This risk assessment study is likely to be a valuable tool assisting effective and targeted adaptation responses to reduce the likely increased burden of dengue fever in a warmer world.


Subjects
Climate Change, Dengue/epidemiology, Aedes, Animals, Europe/epidemiology, Geographic Information Systems, Humans, Incidence, Models, Theoretical, Population Density, Risk Assessment, Weather
14.
Lancet Microbe ; 5(2): e173-e180, 2024 02.
Article in English | MEDLINE | ID: mdl-38244555

ABSTRACT

BACKGROUND: Whole-genome sequencing (WGS) is the gold standard diagnostic tool to identify and genetically characterise emerging pathogen mutations (variants), but cost, capacity, and timeliness limit its use when large populations need rapid assessment. We assessed the potential of genotyping assays to provide accurate and timely variant information at scale by retrospectively examining surveillance for SARS-CoV-2 variants in England between March and September, 2021, when genotyping assays were used widely for variant detection. METHODS: We chose a panel of four RT-PCR genotyping assays to detect circulating variants of SARS-CoV-2 in England and developed a decision algorithm to assign a probable SARS-CoV-2 variant to samples using the assay results. We extracted surveillance data from the UK Health Security Agency databases for 115 934 SARS-CoV-2-positive samples (March 1-Sept 6, 2021) for which variant information was available from both genotyping and WGS. By comparing the genotyping and WGS variant results, we calculated accuracy metrics (ie, sensitivity, specificity, and positive predictive value [PPV]) and the time difference between the sample collection date and the availability of variant information. We assessed the number of samples with a variant assigned from genotyping or WGS, or both, over time. FINDINGS: Genotyping and an initial decision algorithm (April 10-May 11, 2021 data) were accurate for key variant assignment: sensitivities and PPVs were 0·99 (95% CI 0·99-0·99) for the alpha, 1·00 (1·00-1·00) for the beta, and 0·91 (0·80-1·00) for the gamma variants; specificities were 0·97 (0·96-0·98), 1·00 (1·00-1·00), and 1·00 (1·00-1·00), respectively.
A subsequent decision algorithm over a longer time period (May 27-Sept 6, 2021 data) remained accurate for key variant assignment: sensitivities were 0·91 (95% CI 0·74-1·00) for the beta, 0·98 (0·98-0·99) for the delta, and 0·93 (0·81-1·00) for the gamma variants; specificities were 1·00 (1·00-1·00), 0·96 (0·96-0·97), and 1·00 (1·00-1·00), respectively; and PPVs were 0·83 (0·62-1·00), 1·00 (1·00-1·00), and 0·78 (0·59-0·97), respectively. Genotyping produced variant information a median of 3 days (IQR 2-4) after the sample collection date, which was faster than with WGS (9 days [8-11]). The flexibility of genotyping enabled a ninefold increase in the number of samples tested for variants by this method (from 5000 to 45 000). INTERPRETATION: RT-PCR genotyping assays are suitable for high-throughput variant surveillance and could complement WGS, enabling larger scale testing for known variants and timelier results, with important implications for effective public health responses and disease control globally, especially in settings with low WGS capacity. However, the choice of panels of RT-PCR assays is highly dependent on database information on circulating variants generated by WGS, which could limit the use of genotyping assays when new variants are emerging and spreading rapidly. FUNDING: UK Health Security Agency and National Institute for Health Research Health Protection Research Unit in Emergency Preparedness and Response.
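The accuracy metrics reported above are simple functions of the two-by-two table of genotyping calls against the WGS reference standard; a minimal sketch with hypothetical counts:

```python
def accuracy_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)   # of truly variant-X samples, fraction called X
    specificity = tn / (tn + fp)   # of truly non-X samples, fraction called non-X
    ppv = tp / (tp + fp)           # of samples called X, fraction truly X
    return sensitivity, specificity, ppv

# Hypothetical counts: genotyping calls for one variant vs the WGS reference.
sens, spec, ppv = accuracy_metrics(tp=950, fp=30, fn=10, tn=9010)
print(round(sens, 2), round(spec, 2), round(ppv, 2))  # 0.99 1.0 0.97
```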


Subjects
COVID-19, Humans, COVID-19/diagnosis, COVID-19/epidemiology, Genotype, Retrospective Studies, Reverse Transcriptase Polymerase Chain Reaction, SARS-CoV-2/genetics, England/epidemiology, COVID-19 Testing
15.
Ann Epidemiol ; 82: 66-76.e6, 2023 06.
Article in English | MEDLINE | ID: mdl-37001627

ABSTRACT

PURPOSE: Most index cases with novel coronavirus infections transmit disease to just one or two other individuals, but some individuals "super-spread": they infect many secondary cases. Understanding common factors that super-spreaders may share could inform outbreak models and be used to guide contact tracing during outbreaks. METHODS: We searched MEDLINE, Scopus, and preprints to identify studies about people documented as transmitting the pathogens that cause SARS, MERS, or COVID-19 to at least nine other people. We extracted data on age, sex, location, occupation, activities, symptom severity, any underlying conditions and disease outcome, and undertook quality assessment for outbreaks published by June 2021. RESULTS: The most typical super-spreader was a male aged 40 or older. Most SARS or MERS super-spreaders were very symptomatic; the super-spreading occurred in hospital settings and the individual frequently died. In contrast, COVID-19 super-spreaders often had very mild disease and most COVID-19 super-spreading happened in community settings. CONCLUSIONS: SARS and MERS super-spreaders were often symptomatic, middle- or older-age adults who had a high mortality rate. In contrast, COVID-19 super-spreaders tended to have mild disease and were of any adult age. More outbreak reports should be published with anonymized but useful demographic information to improve understanding of super-spreading, super-spreaders, and the settings in which super-spreading happens.


Assuntos
COVID-19 , Adulto , Masculino , Humanos , COVID-19/epidemiologia , SARS-CoV-2 , Surtos de Doenças
16.
Am J Infect Control ; 51(7): 792-799, 2023 07.
Artigo em Inglês | MEDLINE | ID: mdl-36332725

RESUMO

BACKGROUND: Staff actions to prevent the introduction and transmission of infection in long-term care facilities (LTCFs) were key to reducing morbidity and mortality from COVID-19. Implementing infection control measures (ICMs) requires training, adherence, and complex decision making while trying to deliver high-quality care. We surveyed LTCF staff in England about their preparedness and morale at three timepoints during the COVID-19 epidemic. METHODS: Online structured survey targeted at LTCF workers (any role), administered at three timepoints (November 2020-January 2021; August-November 2021; March-May 2022). Answers were summarised narratively, with differences between survey waves assessed statistically (Pearson's chi-square or Fisher's exact test for proportions). RESULTS: Across all three survey waves, 387 responses were received. Morale, attitudes towards the working environment, and perceptions of collaboration with colleagues were mostly positive at all survey points. Infection control training was perceived as adequate. Staff felt mostly positive emotions at work. The working environment remained challenging. Masks were the form of PPE most consistently used; eye protection was the least used. Mask-wearing was linked to poorer communication and resident discomfort, as well as mild negative health impacts on many staff, such as dehydration and adverse skin reactions. Hand sanitizer caused skin irritation. CONCLUSIONS: Staff morale and working practices were generally good, even though the working environment posed many new challenges that did not exist pre-pandemic.
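The between-wave comparison of proportions described in the methods can be illustrated with the closed-form chi-square statistic for a 2x2 table (1 degree of freedom). The counts below are hypothetical, not taken from the survey:

```python
# Chi-square test of proportions for a 2x2 table (df = 1), as used to
# compare answers between survey waves. Counts are illustrative, not
# from the survey.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 30/50 agreed in wave 1 vs 20/50 in wave 2.
stat = chi_square_2x2(30, 20, 20, 30)
print(stat)  # exceeds the 3.841 critical value at alpha = 0.05, df = 1
```

Fisher's exact test, mentioned as the alternative in the abstract, would be preferred when any expected cell count is small (conventionally below 5), where the chi-square approximation is unreliable.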


Assuntos
COVID-19 , Humanos , COVID-19/prevenção & controle , Pandemias/prevenção & controle , Controle de Infecções , Instalações de Saúde , Moral
17.
Lancet Public Health ; 8(11): e850-e858, 2023 11.
Artigo em Inglês | MEDLINE | ID: mdl-37832574

RESUMO

BACKGROUND: During the COVID-19 pandemic, cases were tracked using multiple surveillance systems. Some systems were completely novel, and others incorporated multiple data streams to estimate case incidence and prevalence. How well these different surveillance systems worked as epidemic indicators is unclear, which has implications for future disease surveillance and outbreak management. The aim of this study was to compare case counts, prevalence and incidence, timeliness, and comprehensiveness of different COVID-19 surveillance systems in England. METHODS: For this retrospective observational study of COVID-19 surveillance systems in England, data from 12 surveillance systems were extracted from publicly available sources (Jan 1, 2020-Nov 30, 2021). The main outcomes were correlations between different indicators of COVID-19 incidence or prevalence. These data were integrated as daily time-series and comparisons undertaken using Spearman correlation between candidate alternatives and the most timely (updated daily, clinical case register) and the least biased (from comprehensive household sampling) COVID-19 epidemic indicators, with comparisons focused on the period of Sept 1, 2020-Nov 30, 2021. 
FINDINGS: Spearman correlations over the full focus period between the least biased indicator (from household surveys) and the other epidemic indicator time-series were 0·94 (95% CI 0·92 to 0·95; clinical cases, the most timely indicator), 0·92 (0·90 to 0·94; incidence estimates incorporating self-reported case status on the ZoeApp, a digital app), 0·67 (0·60 to 0·73; emergency department attendances), 0·64 (0·60 to 0·68; NHS 111 website visits), 0·63 (0·56 to 0·69; wastewater viral genome concentrations), 0·60 (0·52 to 0·66; admissions to hospital with positive COVID-19 status), 0·45 (0·36 to 0·52; NHS 111 calls), 0·08 (-0·03 to 0·18; Google search rank for "covid"), -0·04 (-0·12 to 0·05; in-hours consultations with general practitioners), and -0·37 (-0·46 to -0·28; Google search rank for "coronavirus"). Time lags (-14 to +14 days) did not markedly improve these rho statistics. Clinical cases (the most timely indicator) captured a more consistent proportion of cases than the self-report digital app did. INTERPRETATION: A suite of monitoring systems is useful. The household survey system was the most comprehensive and least biased epidemic monitor, but not very timely. Data from laboratory testing, the self-reporting digital app, and attendances at emergency departments were comparatively useful, fairly accurate, and timely epidemic trackers. FUNDING: National Institute for Health and Care Research Health Protection Research Unit in Emergency Preparedness and Response, a partnership between the UK Health Security Agency, King's College London, and the University of East Anglia.
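The lag analysis described above (Spearman rho recomputed at shifts of -14 to +14 days) can be sketched in pure Python. The series below are synthetic, constructed so that one indicator is an exact 3-day-early copy of the other; nothing here reproduces the study's data:

```python
# Sketch of the lag-scan comparison: Spearman correlation between two
# daily time-series at shifts of -14..+14 days. Pure-Python ranking
# (average ranks for ties); the data below are synthetic.

def rank(xs):
    """Average ranks (1-based); tied values share the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def best_lag(ref, cand, max_lag=14):
    """Return the (lag, rho) pair with the highest Spearman rho."""
    best = (0, -2.0)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = ref[lag:], cand[:len(cand) - lag] if lag else (ref, cand)[1]
        else:
            a, b = ref[:lag], cand[-lag:]
        rho = spearman(a, b)
        if rho > best[1]:
            best = (lag, rho)
    return best

# Synthetic daily series: `cand` leads `ref` by exactly 3 days.
s = [3, 5, 9, 14, 11, 8, 13, 20, 25, 22, 18, 15, 21, 28, 34, 30,
     26, 23, 27, 33, 40, 37, 32, 29, 35, 41]
ref, cand = s[3:], s[:-3]
print(best_lag(ref, cand))  # recovers a lag of -3 with rho ~ 1.0
```

On real surveillance data the maximum rho over lags is typically only slightly above the lag-0 value, which is what the abstract means by lags not "markedly" improving the statistics.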


Assuntos
COVID-19 , Humanos , COVID-19/epidemiologia , Pandemias/prevenção & controle , Inglaterra/epidemiologia , Estudos Retrospectivos , Londres
18.
Sci Total Environ ; 892: 164441, 2023 Sep 20.
Artigo em Inglês | MEDLINE | ID: mdl-37245822

RESUMO

Some types of poultry bedding made from recycled materials have been reported to contain environmental contaminants such as polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs, dioxins), polychlorinated biphenyls (PCBs), brominated flame retardants (BFRs), polychlorinated naphthalenes (PCNs), polybrominated dioxins (PBDD/Fs), and perfluoroalkyl substances (PFAS). In one of the first studies of its kind, the uptake of these contaminants into chicken muscle tissue, liver, and eggs from three types of recycled, commercially available bedding material was investigated simultaneously, using conventional husbandry to raise day-old chickens to maturity. A weight-of-evidence analysis showed that PCBs, polybrominated diphenyl ethers (PBDEs), PCDD/Fs, PCNs, and PFAS displayed the highest potential for uptake, which varied depending on the type of bedding material used. During the first three to four months of laying, an increasing trend was observed in the concentrations of ΣTEQ (summed toxic equivalence of PCDD/Fs, PCBs, PBDD/Fs, PCNs, and polybrominated biphenyls), NDL-PCBs, and PBDEs in the eggs of chickens raised on shredded cardboard. Further analysis using bio-transfer factors (BTFs), once egg production had reached a steady state, revealed that some PCB congeners (PCBs 28, 81, 138, 153, and 180), irrespective of molecular configuration or chlorine number, showed the highest tendency for uptake. Conversely, BTFs for PBDEs correlated well with bromine number, increasing to a maximum value for BDE-209. This relationship was reversed for PCDFs (and to some extent for PCDDs), with tetra- and penta-chlorinated congeners showing a greater tendency for selective uptake. The overall patterns were consistent, although some variability in BTF values was observed between tested materials, which may relate to differences in bioavailability.
The results indicate a potentially overlooked source of food chain contamination as other livestock products (cow's milk, lamb, beef, duck, etc.) could be similarly impacted.
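A bio-transfer factor of the kind used above is, at steady state, simply the ratio of a congener's concentration in the animal product to its concentration in the source material. A minimal sketch with invented concentrations (not measurements from the study):

```python
# Bio-transfer factor (BTF) as a steady-state concentration ratio:
# congener concentration in the egg divided by the concentration in the
# source material (bedding). All values below are illustrative.

def btf(conc_egg_ng_per_kg: float, conc_source_ng_per_kg: float) -> float:
    """Bio-transfer factor: egg concentration / source concentration."""
    return conc_egg_ng_per_kg / conc_source_ng_per_kg

# Hypothetical congener concentrations (ng/kg): (egg, bedding)
congeners = {"PCB-153": (850.0, 1200.0), "BDE-209": (40.0, 2500.0)}
for name, (egg, bedding) in congeners.items():
    print(name, round(btf(egg, bedding), 3))
```

A BTF near or above 1 indicates efficient carry-over into the egg, which is why congener-by-congener BTFs can reveal selective uptake patterns like the bromine-number trend reported for PBDEs.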


Assuntos
Dioxinas , Fluorocarbonos , Bifenilos Policlorados , Dibenzodioxinas Policloradas , Feminino , Bovinos , Animais , Ovinos , Dioxinas/análise , Bifenilos Policlorados/análise , Galinhas , Dibenzodioxinas Policloradas/análise , Dibenzofuranos/análise , Éteres Difenil Halogenados/análise , Fluorocarbonos/análise , Dibenzofuranos Policlorados/análise , Monitoramento Ambiental
19.
Sci Rep ; 13(1): 3893, 2023 03 23.
Artigo em Inglês | MEDLINE | ID: mdl-36959189

RESUMO

Vibrio vulnificus is an opportunistic bacterial pathogen occurring in warm, low-salinity waters. V. vulnificus wound infections due to seawater exposure are infrequent, but mortality rates are high (~18%). Seawater bacterial concentrations are increasing, but assessments of changing disease patterns and climate-change projections are rare. Here, using a 30-year database of V. vulnificus cases for the Eastern USA, changing disease distribution was assessed. An ecological niche model was developed, trained, and validated to identify links to oceanographic and climate data. This model was used to predict future disease distribution using data simulated by seven Global Climate Models (GCMs) belonging to the newest Coupled Model Intercomparison Project (CMIP6). Risk was estimated by calculating the total population within 200 km of the disease distribution. Predictions were generated for different "pathways" of global socioeconomic development which incorporate projections of greenhouse gas emissions and demographic change. In the Eastern USA between 1988 and 2018, V. vulnificus wound infections increased eightfold (from 10 to 80 cases per annum) and the northern limit of cases shifted northwards by 48 km per annum. By 2041-2060, V. vulnificus infections may expand their current range to encompass major population centres around New York (40.7°N). Combined with a growing and increasingly elderly population, annual case numbers may double. By 2081-2100, V. vulnificus infections may be present in every Eastern USA state under medium-to-high future emissions and warming. The projected expansion of V. vulnificus wound infections underscores the need for greater individual and public health awareness in these areas.
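The "population within 200 km" risk proxy described above can be sketched as a great-circle distance filter over population centres. The coordinates and populations below are hypothetical examples, not the study's inputs:

```python
# Sketch of the risk proxy: total population within 200 km of a point on
# the disease-distribution limit, using haversine great-circle distance.
# The population centres below are hypothetical examples.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def population_at_risk(limit_lat, limit_lon, centres, radius_km=200.0):
    """Sum population of centres within radius_km of the disease limit."""
    return sum(pop for lat, lon, pop in centres
               if haversine_km(limit_lat, limit_lon, lat, lon) <= radius_km)

# Hypothetical population centres: (lat, lon, population)
centres = [(40.7, -74.0, 8_400_000),   # at the projected limit
           (42.4, -71.1, 4_900_000),   # roughly 300 km away
           (25.8, -80.2, 6_100_000)]   # far to the south
print(population_at_risk(40.7, -74.0, centres))
```

In practice the study would evaluate this along the whole projected distribution rather than a single point, and gridded population data would replace a short list of centres; the distance-threshold logic is the same.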


Assuntos
Vibrioses , Vibrio vulnificus , Infecção dos Ferimentos , Humanos , Idoso , Vibrioses/epidemiologia , América do Norte
20.
Environ Sci Technol ; 45(11): 5017-24, 2011 Jun 01.
Artigo em Inglês | MEDLINE | ID: mdl-21548556

RESUMO

The first investigation into PBDE levels in food produced from flood-prone land in industrial river catchments was conducted. In August 2008, samples of cows' milk, grass, and soil were taken from 5 pairs of flood-prone and control farms on the River Trent (Central UK). The sum of 7 BDE congeners (28, 47, 99, 100, 153, 154, and 183) was calculated. Higher PBDE levels occurred in soil on flood-prone farms than on control farms (median 770 vs 280 ng/kg dry weight). These higher levels were not reflected in the grass samples, indicating that PBDE contamination in soil is not transferred efficiently to grass. This observation, together with the facts that cows on flood-prone farms also spend time on non-flood-prone land and are fed substantial quantities of commercial feed, explains why higher PBDE levels were not found in milk from flood-prone farms (median 300 vs 250 ng/kg fat weight). The BDE47/BDE99 ratios in soil and grass samples were similar to that of the PBDE product commonly used in the UK, indicating few differences between congeners in source-pathway transfer efficiency. The BDE47/BDE99 ratio in the milk samples was greater than in the grass and feed, indicating that food-to-milk transfer efficiency differs between congeners.


Assuntos
Poluentes Ambientais/análise , Inundações , Éteres Difenil Halogenados/análise , Leite/química , Animais , Bovinos , Poaceae/química , Rios , Solo/química , Reino Unido