Results 1 - 20 of 26
1.
Prev Med ; 177: 107774, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37992976

ABSTRACT

Installation of technologies to remove or deactivate respiratory pathogens from indoor air is a plausible non-pharmaceutical infectious disease control strategy. OBJECTIVE: We undertook a systematic review of worldwide observational and experimental studies, published 1970-2022, to synthesise evidence about the effectiveness of suitable indoor air treatment technologies to prevent respiratory or gastrointestinal infections. METHODS: We searched for data about infection and symptom outcomes for persons who spent a minimum of 20 h/week in shared indoor spaces subjected to air treatment strategies hypothesised to change the risk of respiratory or gastrointestinal infections or symptoms. RESULTS: Pooled data from 32 included studies suggested no net benefit of air treatment technologies for symptom severity or symptom presence in the absence of confirmed infection. Infection incidence was lower in three cohort studies for persons exposed to high efficiency particulate air (HEPA) filtration (RR 0.4, 95% CI 0.28-0.58, p < 0.001) and in one cohort study that combined ionisers with electrostatic nano filtration (RR 0.08, 95% CI 0.01-0.60, p = 0.01); other types of air treatment technologies, and air treatment in other study designs, were not strongly linked to fewer infections. The infection outcome data exhibited strong publication bias. CONCLUSIONS: Although pathogen loads in environmental and surface samples are reduced by several air treatment strategies, especially germicidal lights and HEPA filtration, robust evidence has yet to emerge that these technologies are effective at reducing respiratory or gastrointestinal infections in real-world settings. Several randomised trials have yet to report, and their data will be a welcome addition to the evidence base.
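The risk ratios pooled above (e.g. RR 0.4, 95% CI 0.28-0.58 for particulate air filtration) follow the standard log-scale Wald construction. A minimal sketch of that calculation; the 20/500 vs 50/500 counts below are hypothetical, chosen only to reproduce an RR of 0.4, and are not taken from the review:

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio with a Wald 95% CI computed on the log scale.
    a/n1 = infections/total in the exposed group,
    c/n2 = infections/total in the unexposed group."""
    rr = (a / n1) / (c / n2)
    # Standard error of log(RR) for cohort-style counts
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical cohort: 20/500 infected under filtration, 50/500 without
rr, lo, hi = risk_ratio_ci(20, 500, 50, 500)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 0.4 0.24 0.66
```

A CI that excludes 1.0, as here, corresponds to the reported p < 0.05.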


Subjects
Respiratory Tract Infections; Humans; Cohort Studies; Respiratory Tract Infections/prevention & control
2.
Int J Mol Sci ; 24(19)2023 Sep 30.
Article in English | MEDLINE | ID: mdl-37834258

ABSTRACT

Brain-derived neurotrophic factor (BDNF) has been studied as a biomarker of major depressive disorder (MDD). Besides diagnostic biomarkers, clinically useful biomarkers can inform response to treatment. We aimed to review all studies that sought to relate BDNF baseline levels, or BDNF polymorphisms, with response to treatment in MDD. In order to achieve this, we performed a systematic review of studies that explored the relation of BDNF with both pharmacological and non-pharmacological treatment. Finally, we reviewed the evidence that relates peripheral levels of BDNF and BDNF polymorphisms with the development and management of treatment-resistant depression.


Subjects
Depressive Disorder, Major; Depressive Disorder, Treatment-Resistant; Humans; Depressive Disorder, Major/diagnosis; Depressive Disorder, Major/drug therapy; Depressive Disorder, Major/genetics; Brain-Derived Neurotrophic Factor/genetics; Brain-Derived Neurotrophic Factor/therapeutic use; Biomarkers; Polymorphism, Genetic
3.
Lancet Public Health ; 8(11): e850-e858, 2023 11.
Article in English | MEDLINE | ID: mdl-37832574

ABSTRACT

BACKGROUND: During the COVID-19 pandemic, cases were tracked using multiple surveillance systems. Some systems were completely novel, and others incorporated multiple data streams to estimate case incidence and prevalence. How well these different surveillance systems worked as epidemic indicators is unclear, which has implications for future disease surveillance and outbreak management. The aim of this study was to compare case counts, prevalence and incidence, timeliness, and comprehensiveness of different COVID-19 surveillance systems in England. METHODS: For this retrospective observational study of COVID-19 surveillance systems in England, data from 12 surveillance systems were extracted from publicly available sources (Jan 1, 2020-Nov 30, 2021). The main outcomes were correlations between different indicators of COVID-19 incidence or prevalence. These data were integrated as daily time-series, and comparisons were undertaken using Spearman correlation between candidate alternatives and both the most timely (updated daily, clinical case register) and the least biased (from comprehensive household sampling) COVID-19 epidemic indicators, with comparisons focused on the period Sept 1, 2020-Nov 30, 2021.
FINDINGS: Spearman correlations during the full focus period between the least biased indicator (from household surveys) and the other epidemic indicator time-series were 0.94 (95% CI 0.92 to 0.95; clinical cases, the most timely indicator), 0.92 (0.90 to 0.94; incidence estimates incorporating self-reported case status on the ZoeApp, a digital app), 0.67 (0.60 to 0.73; emergency department attendances), 0.64 (0.60 to 0.68; NHS 111 website visits), 0.63 (0.56 to 0.69; wastewater viral genome concentrations), 0.60 (0.52 to 0.66; hospital admissions with positive COVID-19 status), 0.45 (0.36 to 0.52; NHS 111 calls), 0.08 (-0.03 to 0.18; Google search rank for "covid"), -0.04 (-0.12 to 0.05; in-hours consultations with general practitioners), and -0.37 (-0.46 to -0.28; Google search rank for "coronavirus"). Time lags (-14 to +14 days) did not markedly improve these rho statistics. Clinical cases (the most timely indicator) captured a more consistent proportion of cases than the self-report digital app did. INTERPRETATION: A suite of monitoring systems is useful. The household survey system was the most comprehensive and least biased epidemic monitor, but not very timely. Data from laboratory testing, the self-reporting digital app, and attendances to emergency departments were comparatively useful, fairly accurate, and timely epidemic trackers. FUNDING: National Institute for Health and Care Research Health Protection Research Unit in Emergency Preparedness and Response, a partnership between the UK Health Security Agency, King's College London, and the University of East Anglia.
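The core comparison in this study — Spearman correlation between daily indicator time-series, rechecked at shifts of -14 to +14 days — can be sketched as follows. The data are synthetic (smooth epidemic-like curves, with the candidate indicator leading the reference by 3 days); only the rank-correlation-with-lag-scan design mirrors the paper:

```python
import math

def rank(xs):
    """Average ranks, 1-based; tied values share the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho: Pearson correlation of the rank-transformed data."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

def best_lag(reference, candidate, max_lag=14):
    """Shift the candidate series by -max_lag..+max_lag days against the
    reference and return (lag, rho) for the strongest correlation."""
    best = (0, -2.0)
    n = len(reference)
    for lag in range(-max_lag, max_lag + 1):
        xs, ys = [], []
        for t in range(n):
            if 0 <= t - lag < n:
                xs.append(reference[t])
                ys.append(candidate[t - lag])
        rho = spearman(xs, ys)
        if rho > best[1]:
            best = (lag, rho)
    return best

# Synthetic epidemic curves: candidate peaks 3 days before the reference
reference = [math.exp(-((t - 50) / 15) ** 2) for t in range(100)]
candidate = [math.exp(-((t - 47) / 15) ** 2) for t in range(100)]
lag, rho = best_lag(reference, candidate)
print(lag, round(rho, 2))  # → 3 1.0
```

With real, noisy series the maximal rho is below 1 and, as the paper found, shifting by a lag need not improve it much.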


Subjects
COVID-19; Humans; COVID-19/epidemiology; Pandemics/prevention & control; England/epidemiology; Retrospective Studies; London
4.
J Emerg Manag ; 21(7): 85-96, 2023.
Article in English | MEDLINE | ID: mdl-37154447

ABSTRACT

The COVID-19 pandemic had a global reach and impact, introducing stay-at-home orders, social distancing, facemask wearing, and the closure of national and international borders. Yet the need for international disaster aid arising from previous disasters and ongoing crises remained. Interviews with staff from United Kingdom aid agencies and their partner organizations examined how development and humanitarian activities changed during the first six months of the pandemic. Seven key themes were highlighted. The need to recognize individual country contexts and experiences when dealing with a pandemic was emphasized, together with appropriate strategic decisions around guidance and supporting staff, and the value of learning from previous experiences. Restrictions limited agencies' ability to monitor programs and ensure accountability effectively, but relationships between partners adjusted, with a move to greater reliance on local partners and increased empowerment of these groups. Trust was vital to allow the continuation of programs and services during the first months of the pandemic. Most programs continued, but with significant adaptations. An enhanced use of communication technology was a key adaptation, though caveats remained around access. Concern around safeguarding and stigmatization of vulnerable groups was reported as an increasing issue in some contexts. The impact of COVID-19 restrictions on ongoing disaster aid was rapid and extensive, forcing aid agencies at different scales to work swiftly to ensure as little disruption as possible, and generating important lessons for both ongoing and future crises.


Subjects
COVID-19; Disasters; Humans; COVID-19/epidemiology; Pandemics
5.
Sci Total Environ ; 892: 164441, 2023 Sep 20.
Article in English | MEDLINE | ID: mdl-37245822

ABSTRACT

Some types of poultry bedding made from recycled materials have been reported to contain environmental contaminants such as polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs, dioxins), polychlorinated biphenyls (PCBs), brominated flame retardants (BFRs), polychlorinated naphthalenes (PCNs), polybrominated dioxins (PBDD/Fs), and perfluoroalkyl substances (PFAS), among others. In one of the first studies of its kind, the uptake of these contaminants into chicken muscle tissue, liver, and eggs from three types of recycled, commercially available bedding material was simultaneously investigated, using conventional husbandry to raise day-old chickens to maturity. A weight-of-evidence analysis showed that PCBs, polybrominated diphenylethers (PBDEs), PCDD/Fs, PCNs and PFAS displayed the highest potential for uptake, which varied depending on the type of bedding material used. During the first three to four months of laying, an increasing trend was observed in the concentrations of ΣTEQ (summed toxic equivalence of PCDD/Fs, PCBs, PBDD/Fs, PCNs and polybrominated biphenyls), NDL-PCBs and PBDEs in the eggs of chickens raised on shredded cardboard. Further analysis using bio-transfer factors (BTFs), once egg production had reached a steady state, revealed that some PCB congeners (PCBs 28, 81, 138, 153 and 180), irrespective of molecular configuration or chlorine number, showed the highest tendency for uptake. Conversely, BTFs for PBDEs correlated well with bromine number, increasing to a maximum value for BDE-209. This relationship was reversed for PCDFs (and to some extent for PCDDs), with tetra- and penta-chlorinated congeners showing a greater tendency for selective uptake. The overall patterns were consistent, although some variability in BTF values was observed between tested materials, which may relate to differences in bioavailability. The results indicate a potentially overlooked source of food chain contamination, as other livestock products (cow's milk, lamb, beef, duck, etc.) could be similarly impacted.


Subjects
Dioxins; Fluorocarbons; Polychlorinated Biphenyls; Polychlorinated Dibenzodioxins; Female; Cattle; Animals; Sheep; Dioxins/analysis; Polychlorinated Biphenyls/analysis; Chickens; Polychlorinated Dibenzodioxins/analysis; Dibenzofurans/analysis; Halogenated Diphenyl Ethers/analysis; Fluorocarbons/analysis; Polychlorinated Dibenzofurans/analysis; Environmental Monitoring
6.
Ann Epidemiol ; 82: 66-76.e6, 2023 06.
Article in English | MEDLINE | ID: mdl-37001627

ABSTRACT

PURPOSE: Most index cases with novel coronavirus infections transmit disease to just one or two other individuals, but some individuals "super-spread": they infect many secondary cases. Understanding common factors that super-spreaders may share could inform outbreak models and be used to guide contact tracing during outbreaks. METHODS: We searched MEDLINE, Scopus, and preprints to identify studies about people documented as transmitting the pathogens that cause SARS, MERS, or COVID-19 to at least nine other people. We extracted data describing them by age, sex, location, occupation, activities, symptom severity, any underlying conditions, and disease outcome, and undertook quality assessment for outbreaks published by June 2021. RESULTS: The most typical super-spreader was a male aged 40+. Most SARS or MERS super-spreaders were very symptomatic, the super-spreading occurred in hospital settings, and the individual frequently died. In contrast, COVID-19 super-spreaders often had very mild disease, and most COVID-19 super-spreading happened in community settings. CONCLUSIONS: SARS and MERS super-spreaders were often symptomatic, middle-aged or older adults who had a high mortality rate. In contrast, COVID-19 super-spreaders tended to have mild disease and could be adults of any age. More outbreak reports should be published with anonymized but useful demographic information to improve understanding of super-spreading, super-spreaders, and the settings in which super-spreading happens.


Subjects
COVID-19; Adult; Male; Humans; COVID-19/epidemiology; SARS-CoV-2; Disease Outbreaks
7.
Sci Rep ; 13(1): 3893, 2023 03 23.
Article in English | MEDLINE | ID: mdl-36959189

ABSTRACT

Vibrio vulnificus is an opportunistic bacterial pathogen occurring in warm, low-salinity waters. V. vulnificus wound infections due to seawater exposure are infrequent, but mortality rates are high (~18%). Seawater bacterial concentrations are increasing, but assessments of changing disease patterns and climate change projections are rare. Here, using a 30-year database of V. vulnificus cases for the Eastern USA, changing disease distribution was assessed. An ecological niche model was developed, trained and validated to identify links to oceanographic and climate data. This model was used to predict future disease distribution using data simulated by seven Global Climate Models (GCMs) belonging to the newest Coupled Model Intercomparison Project (CMIP6). Risk was estimated by calculating the total population within 200 km of the disease distribution. Predictions were generated for different "pathways" of global socioeconomic development, which incorporate projections of greenhouse gas emissions and demographic change. In the Eastern USA between 1988 and 2018, V. vulnificus wound infections increased eightfold (10-80 cases p.a.) and the northern case limit shifted northwards at 48 km p.a. By 2041-2060, V. vulnificus infections may expand their current range to encompass major population centres around New York (40.7°N). Combined with a growing and increasingly elderly population, annual case numbers may double. By 2081-2100, V. vulnificus infections may be present in every Eastern USA state under medium-to-high future emissions and warming. The projected expansion of V. vulnificus wound infections stresses the need for increased individual and public health awareness in these areas.


Subjects
Vibrio Infections; Vibrio vulnificus; Wound Infection; Humans; Aged; Vibrio Infections/epidemiology; North America
8.
Nutrients ; 14(3)2022 Jan 18.
Article in English | MEDLINE | ID: mdl-35276767

ABSTRACT

Vitamin A deficiency is a major health risk for infants and children in low- and middle-income countries. This scoping review identified, quantified, and mapped research for use in updating nutrient requirements and upper limits for vitamin A in children aged 0 to 48 months, using health-based or modelling-based approaches. Structured searches were run on Medline, EMBASE, and Cochrane Central, from inception to 19 March 2021. Titles and abstracts were assessed independently in duplicate, as were 20% of full texts. Included studies were tabulated by question, methodology and date, with the most relevant data extracted and assessed for risk of bias. We found that the most recent health-based systematic reviews and trials assessed the effects of supplementation, though some addressed the effects of staple food fortification, complementary foods, biofortified maize or cassava, and fortified drinks, on health outcomes. Recent isotopic tracer studies and modelling approaches may help quantify the effects of bio-fortification, fortification, and food-based approaches for increasing vitamin A depots. A systematic review and several trials identified adverse events associated with higher vitamin A intakes, which should be useful for setting upper limits. We have generated and provide a database of relevant research. Full systematic reviews, based on this scoping review, are needed to answer specific questions to set vitamin A requirements and upper limits.


Subjects
Vitamin A Deficiency; Vitamin A; Child; Child, Preschool; Food, Fortified; Humans; Infant; Infant, Newborn; Nutritional Requirements; Nutritional Status; Vitamin A Deficiency/prevention & control
9.
Rev Med Chil ; 149(7): 1014-1022, 2021 Jul.
Article in Spanish | MEDLINE | ID: mdl-34751303

ABSTRACT

BACKGROUND: A significant proportion of the clinical record is in free-text format, making it difficult to extract key information and make secondary use of patient data. Automatic detection of information within narratives initially requires humans, following specific protocols and rules, to identify the medical entities of interest. AIM: To build a linguistic resource of annotated medical entities in texts produced in Chilean hospitals. MATERIAL AND METHODS: A clinical corpus was constructed using 150 referrals from public hospitals. Three annotators identified six medical entities: clinical findings, diagnoses, body parts, medications, abbreviations, and family members. An annotation scheme was designed, and an iterative approach to training the annotators was applied. The F1-Score metric was used to assess the progress of inter-annotator agreement during training. RESULTS: An average F1-Score of 0.73 was observed at the beginning of the project. After the training period, it increased to 0.87. Annotation of clinical findings and body parts showed significant discrepancy, while abbreviations, medications, and family members showed high agreement. CONCLUSIONS: A linguistic resource with annotated medical entities in texts produced in Chilean hospitals was built and made available, working with annotators with backgrounds related to medicine. The iterative annotation approach allowed us to improve performance metrics. The corpus and annotation protocols will be released to the research community.
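The F1-Score used here to track inter-annotator agreement can be computed pairwise: one annotator's entities serve as the reference set and the other's as predictions, with exact span-and-label matches counted as true positives. A small sketch; the character offsets and labels below are invented for illustration:

```python
def entity_f1(reference, candidate):
    """Pairwise agreement between two annotators, scored as F1.
    Entities are (start, end, label) tuples; only exact matches count."""
    ref, cand = set(reference), set(candidate)
    tp = len(ref & cand)                       # exact span+label matches
    precision = tp / len(cand) if cand else 0.0
    recall = tp / len(ref) if ref else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical annotations of one referral by two annotators
ann_a = {(0, 12, "Finding"), (20, 28, "Medication"), (35, 40, "Body_Part")}
ann_b = {(0, 12, "Finding"), (20, 28, "Medication"), (50, 57, "Abbreviation")}
print(round(entity_f1(ann_a, ann_b), 2))  # → 0.67 (2 matches out of 3 vs 3)
```

Averaging this score over annotator pairs and entity types gives corpus-level figures comparable to the 0.73 and 0.87 reported.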


Subjects
Electronic Data Processing; Chile; Humans
11.
Sci Total Environ ; 765: 142787, 2021 Apr 15.
Article in English | MEDLINE | ID: mdl-33246727

ABSTRACT

Many types of bioresource materials are beneficially recycled in agriculture for soil improvement and as alternative bedding materials for livestock, but they also potentially transfer contaminants into plant and animal foods. Representative types of industrial and municipal bioresources were selected to assess the extent of organic chemical contamination, including: (i) land-applied materials: treated sewage sludge (biosolids), meat and bone meal ash (MBMA), poultry litter ash (PLA), paper sludge ash (PSA) and compost-like-output (CLO), and (ii) bedding materials: recycled waste wood (RWW), dried paper sludge (DPS), paper sludge ash (PSA) and shredded cardboard. The materials generally contained lower concentrations of polychlorinated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs) and dioxin-like polychlorinated biphenyls (PCBs) relative to earlier reports, indicating the decline in environmental emissions of these established contaminants. However, concentrations of polycyclic aromatic hydrocarbons (PAHs) remain elevated in biosolids samples from urban catchments. Polybrominated dibenzo-p-dioxins/dibenzofurans (PBDD/Fs) were present in larger amounts in biosolids and CLO compared to their chlorinated counterparts and hence are of potentially greater significance in contemporary materials. The presence of non-ortho PCBs in DPS was probably due to non-legacy sources of PCBs in paper production. Flame retardant chemicals were one of the most significant and extensive groups of contaminants found in the bioresource materials. Decabromodiphenyl ether (deca-BDE) was the most abundant polybrominated diphenyl ether (PBDE) and may explain the formation and high concentrations of PBDD/Fs detected. Emerging flame retardant compounds, including decabromodiphenylethane (DBDPE) and organophosphate flame retardants (OPFRs), were also detected in several of the materials. The profile of perfluoroalkyl substances (PFAS) depended on the type of waste category; perfluoroundecanoic acid (PFUnDA) was the most significant PFAS for DPS, whereas perfluorooctane sulfonate (PFOS) was dominant in biosolids and CLO. The concentrations of polychlorinated alkanes (PCAs) and di-2-ethylhexyl phthalate (DEHP) were generally much larger than those of the other contaminants measured, indicating that there are major anthropogenic sources of these potentially hazardous chemicals entering the environment. The study results suggest that continued vigilance is required to control emissions and sources of these contaminants to support the beneficial use of secondary bioresource materials.


Subjects
Polychlorinated Biphenyls; Polychlorinated Dibenzodioxins; Agriculture; Animals; Dibenzofurans; Environmental Monitoring; Polychlorinated Biphenyls/analysis; Polychlorinated Dibenzodioxins/analysis; United Kingdom
12.
Euro Surveill ; 25(49)2020 12.
Article in English | MEDLINE | ID: mdl-33303066

ABSTRACT

Background: Evidence for face-mask wearing in the community to protect against respiratory disease is unclear. Aim: To assess the effectiveness of wearing face masks in the community to prevent respiratory disease, and to recommend improvements to this evidence base. Methods: We systematically searched Scopus, Embase and MEDLINE for studies evaluating respiratory disease incidence after face-mask wearing (or not). Narrative synthesis and random-effects meta-analysis of attack rates for primary and secondary prevention were performed, subgrouped by design, setting, face barrier type, and who wore the mask. The preferred outcome was influenza-like illness. Grading of Recommendations, Assessment, Development and Evaluations (GRADE) quality assessment was undertaken, and deficits in the evidence base were described. Results: 33 studies (12 randomised controlled trials (RCTs)) were included. Mask wearing reduced primary infection by 6% (odds ratio (OR): 0.94; 95% CI: 0.75-1.19 for RCTs) to 61% (OR: 0.85; 95% CI: 0.32-2.27; OR: 0.39; 95% CI: 0.18-0.84; and OR: 0.61; 95% CI: 0.45-0.85 for cohort, case-control and cross-sectional studies respectively). RCTs suggested the lowest secondary attack rates when both well and ill household members wore masks (OR: 0.81; 95% CI: 0.48-1.37). While RCTs might underestimate effects due to poor compliance and controls wearing masks, observational studies likely overestimate effects, as mask wearing might be associated with other risk-averse behaviours. GRADE evidence quality was low or very low. Conclusion: Wearing face masks may reduce primary respiratory infection risk, probably by 6-15%. It is important to balance evidence from RCTs and observational studies when their conclusions differ widely and both are at risk of significant bias. COVID-19-specific studies are required.
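Pooling study-level odds ratios like those quoted above is typically done by inverse-variance weighting of log odds ratios; the sketch below uses a fixed-effect simplification of the random-effects model this review actually used, and recovers each standard error from the CI width. The three input ORs are the cohort, case-control and cross-sectional estimates quoted in the abstract; pooling them across designs here is purely illustrative:

```python
import math

def pool_fixed_effect(studies, z=1.96):
    """Inverse-variance (fixed-effect) pooling of odds ratios.
    Each study is (OR, ci_lower, ci_upper); the SE of log(OR) is
    recovered from the CI width on the log scale."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1.0 / se ** 2              # weight = 1 / variance of log(OR)
        num += w * math.log(or_)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))

# Study-level ORs quoted in the abstract (cohort, case-control,
# cross-sectional)
studies = [(0.85, 0.32, 2.27), (0.39, 0.18, 0.84), (0.61, 0.45, 0.85)]
pooled, lo, hi = pool_fixed_effect(studies)
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

Note how the narrow-CI cross-sectional estimate dominates the weighted average; a random-effects model would additionally widen the pooled CI to absorb between-study heterogeneity.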


Subjects
COVID-19/prevention & control; Eye Protective Devices; Influenza, Human/prevention & control; Masks; Picornaviridae Infections/prevention & control; Respiratory Tract Infections/prevention & control; Tuberculosis/prevention & control; COVID-19/transmission; Coronavirus Infections/prevention & control; Coronavirus Infections/transmission; Humans; Influenza, Human/transmission; Picornaviridae Infections/transmission; Respiratory Protective Devices; Respiratory Tract Infections/transmission; SARS-CoV-2; Tuberculosis/transmission
13.
J Water Health ; 18(2): 145-158, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32300088

ABSTRACT

Cholera is a severe diarrhoeal disease affecting vulnerable communities. A long-term solution to cholera transmission is improved access to and uptake of water, sanitation and hygiene (WASH). Climate change threatens WASH. A systematic review and meta-analysis determined five overarching WASH factors incorporating 17 specific WASH factors associated with cholera transmission, focussing upon community cases. Eight WASH factors showed lower odds and six showed higher odds for cholera transmission. These results were combined with findings in the climate change and WASH literature, to propose a health impact pathway illustrating potential routes through which climate change dynamics (e.g. drought, flooding) impact on WASH and cholera transmission. A causal process diagram visualising links between climate change dynamics, WASH factors, and cholera transmission was developed. Climate change dynamics can potentially affect multiple WASH factors (e.g. drought-induced reductions in handwashing and rainwater use). Multiple climate change dynamics can influence WASH factors (e.g. flooding and sea-level rise affect piped water usage). The influence of climate change dynamics on WASH factors can be negative or positive for cholera transmission (e.g. drought could increase pathogen desiccation but reduce rainwater harvesting). Identifying risk pathways helps policymakers focus on cholera risk mitigation, now and in the future.


Subjects
Cholera/transmission; Climate Change; Hygiene; Sanitation; Causality; Humans; Risk Factors; Water; Water Supply
14.
Sci Total Environ ; 683: 240-248, 2019 Sep 15.
Article in English | MEDLINE | ID: mdl-31132703

ABSTRACT

Common ragweed is a highly allergenic invasive species in Europe, expected to become widespread under climate change. Allergy to ragweed manifests as eye, nasal and lung symptoms, and children may retain these throughout life. The dose-response relationship between symptoms and pollen concentrations is unclear. We undertook a longitudinal study, assessing the association between ragweed pollen concentration and allergic eye, nasal and lung symptoms in children living under a range of ragweed pollen concentrations in Croatia. Over three years, 85 children completed daily diaries, detailing allergic symptoms alongside daily location, activities and medication, resulting in 10,130 individual daily entries. The daily ragweed pollen concentration for the children's locations was obtained, alongside daily weather and air pollution. Parents completed a home/lifestyle/medical questionnaire. Generalised Additive Mixed Models established the relationship between pollen concentrations and symptoms, alongside other covariates. Eye symptoms were associated with mean daily pollen concentration over four days (day of symptoms plus 3 previous days); 61 grains/m3/day (95% CI: 45, 100) was the threshold at which 50% of children reported symptoms. Nasal symptoms were associated with mean daily pollen concentration over 12 days (day of symptoms plus 11 previous days); the threshold for 50% of children reporting symptoms was 40 grains/m3/day (95% CI: 24, 87). Lung symptoms showed a relationship with mean daily pollen concentration over 19 days (day of symptoms plus 18 previous days), with a threshold of 71 grains/m3/day (95% CI: 59, 88). Taking medication on the day of symptoms showed higher odds, suggesting responsive behaviour. Taking medication on the day prior to symptoms showed lower odds of reporting, indicating preventative behaviour. Different symptoms in children demonstrate varying dose-response relationships with ragweed pollen concentrations. Each symptom type responded to pollen exposure over different time periods. Using medication prior to symptoms can reduce symptom presence. These findings can be used to better manage paediatric ragweed allergy symptoms.
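The reported thresholds (e.g. 61 grains/m3/day for eye symptoms over a 4-day window) come from dose-response curves. With a plain logistic curve standing in for the paper's Generalised Additive Mixed Model, the 50% threshold is simply the concentration where the linear predictor crosses zero; the coefficients below are invented so that the threshold lands at 61:

```python
import math

def lagged_mean(series, day, window):
    """Exposure metric: mean pollen over the day of symptoms plus the
    previous window-1 days (a 4-day window for eye symptoms)."""
    return sum(series[day - window + 1 : day + 1]) / window

def symptom_probability(pollen, b0, b1):
    """Logistic dose-response: probability of reporting symptoms at a
    given mean daily pollen concentration (grains/m3/day)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * pollen)))

def threshold_50(b0, b1):
    """Concentration at which 50% report symptoms: solve b0 + b1*x = 0."""
    return -b0 / b1

b0, b1 = -3.05, 0.05   # hypothetical fitted coefficients
print(round(threshold_50(b0, b1), 1))             # → 61.0
print(round(symptom_probability(61, b0, b1), 2))  # → 0.5
```

A GAMM replaces the straight-line predictor b0 + b1*x with a smooth function of x (and adds per-child random effects), so the fitted curve need not be symmetric around the threshold, but the 50%-crossing is read off in the same way.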


Subjects
Allergens/adverse effects; Antigens, Plant/adverse effects; Plant Extracts/adverse effects; Rhinitis, Allergic, Seasonal/immunology; Allergens/analysis; Ambrosia/physiology; Antigens, Plant/analysis; Child; Child, Preschool; Croatia; Female; Humans; Longitudinal Studies; Male; Plant Extracts/analysis; Rhinitis, Allergic, Seasonal/etiology
15.
J Transl Med ; 17(1): 34, 2019 01 21.
Article in English | MEDLINE | ID: mdl-30665426

ABSTRACT

BACKGROUND: With over 800 million cases globally, campylobacteriosis is a major cause of food-borne disease. In temperate climates incidence is highly seasonal, but the underlying mechanisms are poorly understood, making human disease control difficult. We hypothesised that observed disease patterns reflect complex interactions between weather, patterns of human risk behaviour, immune status and level of food contamination. Only by understanding these can we find effective interventions. METHODS: We analysed trends in human Campylobacter cases in NE England from 2004 to 2009, investigating the associations between different risk factors and disease using time-series models. We then developed an individual-based (IB) model of risk behaviour, human immunological responses to infection, and environmental contamination driven by weather and land use. We parameterised the IB model for NE England and compared outputs to the observed numbers of reported cases each month in the population in 2004-2009. Finally, we used it to investigate different community-level disease reduction strategies. RESULTS: Risk behaviours such as countryside visits (t = 3.665, P < 0.001 and t = -2.187, P = 0.029 for temperature and rainfall respectively) and consumption of barbecued food were strongly associated with weather (t = 3.219, P = 0.002 and t = 2.015, P = 0.045 for weekly average temperature and average maximum temperature respectively), and also with rain (t = 2.254, P = 0.025). This suggests that the effect of weather was indirect, acting through changes in risk behaviour. The seasonal pattern of cases predicted by the IB model was significantly related to observed patterns (r = 0.72, P < 0.001), indicating that simulating risk behaviour could reproduce the observed seasonal patterns of cases. A vaccination strategy providing short-term immunity was more effective than educational interventions to modify human risk behaviour. Extending immunity to 1 year from 20 days reduced disease burden by an order of magnitude (from 2412-2414 to 203-309 cases per 50,000 person-years). CONCLUSIONS: This is the first interdisciplinary study to integrate environment, risk behaviour, socio-demographics and immunology to model Campylobacter infection, including pathways to mitigation. We conclude that vaccination is likely to be the best route for intervening against campylobacteriosis, despite the technical problems associated with understanding both the underlying human immunology and genetic variation in the pathogen, and the likely cost of vaccine development.


Subjects
Behavior; Campylobacter Infections/epidemiology; Climate; Cost of Illness; Environment; Models, Biological; Seasons; Animals; Chickens; England/epidemiology; Humans; Rain; Temperature
17.
Artigo em Inglês | MEDLINE | ID: mdl-29949854

RESUMO

Ragweed allergy is a major public health concern. Within Europe, ragweed is an introduced species, and research indicates that amounts of ragweed pollen are likely to increase over Europe due to climate change, with corresponding increases in ragweed allergy. To address this threat, we need to improve our understanding of factors that predispose to allergic sensitisation to ragweed and to disease, focusing in particular on factors that are potentially modifiable (i.e., environmental). In this study, a total of 4013 children aged 2-13 years were recruited across Croatia to undergo skin prick tests to determine sensitisation to ragweed and other aeroallergens. A parental questionnaire collected home environment, lifestyle, family and personal medical history, and socioeconomic information. Environmental variables were obtained using Geographical Information Systems and data from nearby pollen, weather, and air pollution stations. Logistic regression was performed (clustered on school), focusing on risk factors for allergic sensitisation and disease. Ragweed sensitisation was strongly associated with ragweed pollen at levels over 5000 grains m⁻³ year⁻¹; above these levels, the risk of sensitisation was 12-16 times greater than in low pollen areas with about 400 grains m⁻³ year⁻¹. Genetic factors were strongly associated with sensitisation, but nearly all potentially modifiable factors were not significant. This included measures of local land use and proximity to potential sources of ragweed pollen. Rural residence was protective (odds ratio (OR) 0.73, 95% confidence interval (CI) 0.55-0.98), but the factors underlying this association were unclear. Being sensitised to ragweed doubled (OR 2.17, 95% CI 1.59-2.96) the risk of rhinoconjunctivitis. No other potentially modifiable risk factors were associated with rhinoconjunctivitis.
Ragweed sensitisation was strongly associated with ragweed pollen, and sensitisation was significantly associated with rhinoconjunctivitis. Apart from ragweed pollen levels, few other potentially modifiable factors were significantly associated with ragweed sensitisation. Hence, strategies to lower the risk of sensitisation should focus upon ragweed control.
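Odds ratios like those reported above come from logistic regression or, in the simplest case, 2x2-table arithmetic. A minimal stdlib-only sketch of computing an odds ratio with a Wald 95% confidence interval (the counts below are made up for illustration, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table:
         a = exposed cases,   b = exposed non-cases,
         c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts, chosen only to give an OR near 2:
or_, lo, hi = odds_ratio_ci(80, 120, 60, 200)
```

A CI that excludes 1 corresponds to a statistically significant association at the 5% level, which is how results such as OR 2.17 (95% CI 1.59-2.96) are read.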


Assuntos
Ambrosia/imunologia , Antígenos de Plantas/imunologia , Hipersensibilidade/epidemiologia , Extratos Vegetais/imunologia , Adolescente , Poluição do Ar , Alérgenos/efeitos adversos , Antígenos de Plantas/toxicidade , Estudos de Casos e Controles , Criança , Pré-Escolar , Mudança Climática , Croácia/epidemiologia , Feminino , Humanos , Hipersensibilidade/etiologia , Masculino , Razão de Chances , Extratos Vegetais/toxicidade , Pólen/imunologia , Fatores de Risco , Testes Cutâneos , Tempo (Meteorologia)
18.
Appl Environ Microbiol ; 83(14)2017 07 15.
Artigo em Inglês | MEDLINE | ID: mdl-28500040

RESUMO

This paper introduces a novel method for sampling pathogens in natural environments. It uses fabric boot socks worn over walkers' shoes to allow the collection of composite samples over large areas. Wide-area sampling is better suited to studies focusing on human exposure to pathogens (e.g., recreational walking). This sampling method is implemented using a citizen science approach: groups of three walkers wearing boot socks undertook one of six routes, 40 times over 16 months in the North West (NW) and East Anglian (EA) regions of England. To validate this methodology, we report the successful implementation of this citizen science approach, the observation that Campylobacter bacteria were detected on 47% of boot socks, and the observation that multiple boot socks from individual walks produced consistent results. The findings indicate higher Campylobacter levels in the livestock-dominated NW than in EA (55.8% versus 38.6%). Seasonal differences in the presence of Campylobacter bacteria were found between the regions, with indications of winter peaks in both regions but a spring peak in the NW. The presence of Campylobacter bacteria on boot socks was negatively associated with ambient temperature (P = 0.011) and positively associated with precipitation (P < 0.001), results consistent with our understanding of Campylobacter survival and the probability of material adhering to boot socks. Campylobacter jejuni was the predominant species found; Campylobacter coli was largely restricted to the livestock-dominated NW. Source attribution analysis indicated that the potential source of C. jejuni was predominantly sheep in the NW and wild birds in EA but did not differ between peak and nonpeak periods of human incidence. IMPORTANCE: There is debate in the literature on the pathways through which pathogens are transferred from the environment to humans.
We report on the success of a novel method for sampling human-pathogen interactions using boot socks and citizen science techniques, which enable us to sample human-pathogen interactions that may occur through visits to natural environments. This contrasts with traditional environmental sampling, which is based on spot sampling techniques and does not sample human-pathogen interactions. Our methods are of practical value to scientists trying to understand the transmission of pathogens from the environment to people. Our findings provide insight into the risk of Campylobacter exposure from recreational visits and an understanding of seasonal differences in risk and the factors behind these patterns. We highlight the Campylobacter species predominantly encountered and the potential sources of C. jejuni.
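A regional prevalence difference such as 55.8% versus 38.6% can be checked with a standard two-proportion z-test. A stdlib-only sketch (the sock counts below are hypothetical, chosen only to reproduce those percentages):

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for a difference in proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)              # pooled proportion
    se = math.sqrt(p * (1 - p) * (1/n1 + 1/n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# hypothetical positive socks / total socks per region:
z, pval = two_prop_ztest(134, 240, 93, 241)   # ~55.8% NW vs ~38.6% EA
```

With samples of this size the difference is comfortably significant; in practice the paper's models additionally adjust for weather covariates such as temperature and precipitation.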


Assuntos
Infecções por Campylobacter/microbiologia , Infecções por Campylobacter/veterinária , Campylobacter/isolamento & purificação , Gado/microbiologia , Técnicas Microbiológicas/métodos , Animais , Animais Selvagens/microbiologia , Campylobacter/classificação , Campylobacter/genética , Campylobacter/fisiologia , Inglaterra , Meio Ambiente , Humanos , Técnicas Microbiológicas/instrumentação , Estações do Ano , Sapatos
19.
Environ Health Perspect ; 125(3): 385-391, 2017 03.
Artigo em Inglês | MEDLINE | ID: mdl-27557093

RESUMO

BACKGROUND: Globally, pollen allergy is a major public health problem, but the likely impact of climate change on it is a fundamental unknown. To our knowledge, this is the first study to quantify the consequences of climate change for pollen allergy in humans. OBJECTIVES: We produced quantitative estimates of the potential impact of climate change on pollen allergy in humans, focusing on common ragweed (Ambrosia artemisiifolia) in Europe. METHODS: A process-based model estimated the change in ragweed's range under climate change. A second model simulated current and future ragweed pollen levels. These findings were translated into health burdens using a dose-response curve generated from a systematic review, together with current and future population data. Models considered two different suites of regional climate/pollen models, two greenhouse gas emissions scenarios [Representative Concentration Pathways (RCPs) 4.5 and 8.5], and three different plant invasion scenarios. RESULTS: Our primary estimates indicated that sensitization to ragweed will more than double in Europe, from 33 to 77 million people, by 2041-2060. According to our projections, sensitization will increase in countries with an existing ragweed problem (e.g., Hungary, the Balkans), but the greatest proportional increases will occur where sensitization is currently uncommon (e.g., Germany, Poland, France). Higher pollen concentrations and a longer pollen season may also increase the severity of symptoms. Our model projections were driven predominantly by changes in climate (66%) but were also influenced by current trends in the spread of this invasive plant species. Assumptions about the rate at which ragweed spreads throughout Europe had a large influence upon the results. CONCLUSIONS: Our quantitative estimates indicate that ragweed pollen allergy will become a common health problem across Europe, expanding into areas where it is currently uncommon.
Control of ragweed spread may be an important adaptation strategy in response to climate change. Citation: Lake IR, Jones NR, Agnew M, Goodess CM, Giorgi F, Hamaoui-Laguel L, Semenov MA, Solomon F, Storkey J, Vautard R, Epstein MM. 2017. Climate change and future pollen allergy in Europe. Environ Health Perspect 125:385-391; http://dx.doi.org/10.1289/EHP173.
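The burden-estimation step described above (applying a dose-response curve to population counts under current and future pollen levels) can be sketched as simple arithmetic. The logistic coefficients and inputs below are invented for illustration, not the fitted curve from the systematic review:

```python
import math

def sensitisation_prevalence(pollen, beta0=-3.0, beta1=0.0004):
    """Illustrative logistic dose-response: prevalence of ragweed
    sensitisation as a function of annual pollen load (grains/m3/yr).
    Coefficients are made up for illustration only."""
    return 1 / (1 + math.exp(-(beta0 + beta1 * pollen)))

def burden(pop_millions, pollen_now, pollen_future):
    """Sensitised population (millions) now vs. under a future scenario."""
    now = pop_millions * sensitisation_prevalence(pollen_now)
    fut = pop_millions * sensitisation_prevalence(pollen_future)
    return now, fut

# hypothetical exposed population and pollen loads:
now, fut = burden(500, 400, 6000)
```

The design choice in the paper is the same: hold the dose-response relationship fixed and vary the exposure (pollen) and population inputs across climate and invasion scenarios.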


Assuntos
Alérgenos/análise , Mudança Climática/estatística & dados numéricos , Exposição Ambiental/estatística & dados numéricos , Pólen , Rinite Alérgica Sazonal/epidemiologia , Europa (Continente)/epidemiologia , Hipersensibilidade
20.
J Aging Phys Act ; 24(4): 599-616, 2016 10.
Artigo em Inglês | MEDLINE | ID: mdl-27049356

RESUMO

We examine the relative importance of objective and perceived environmental features for physical activity in older English adults. Self-reported physical activity levels of 8,281 older adults were used to compute volumes of outdoor recreational and commuting activity. Perceptions of neighborhood environment supportiveness were drawn from a questionnaire survey, and a geographical information system was used to derive objective measures. Negative binomial regression models were fitted to examine associations. Perceptions of neighborhood environment were more strongly associated with outdoor recreational activity (over 10% change per standard deviation) than objective measures (5-8% change). Commuting activity was associated with several objective measures (up to 16% change). We identified different environmental determinants of recreational and commuting activity in older adults. Perceptions of environmental supportiveness for recreational activity appear more important than actual neighborhood characteristics. Understanding how older people perceive neighborhoods might be key to encouraging outdoor recreational activity.
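In log-link count models such as the negative binomial regressions used here, a coefficient for a standardised predictor converts to a percentage change in expected activity volume via exp(beta) - 1. A minimal sketch (the coefficient value is illustrative, not from the paper):

```python
import math

def pct_change_per_sd(beta, sd=1.0):
    """Percentage change in the expected count per `sd`-unit increase in
    a predictor, for a model with a log link (e.g. negative binomial):
    100 * (exp(beta * sd) - 1)."""
    return (math.exp(beta * sd) - 1) * 100

# an illustrative standardised coefficient of ~0.095 implies ~10% change:
change = pct_change_per_sd(0.0953)
```

This is how statements like "over 10% change per standard deviation" map back to the fitted model coefficients.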


Assuntos
Meio Ambiente , Exercício Físico/fisiologia , Idoso , Inglaterra , Feminino , Sistemas de Informação Geográfica , Humanos , Masculino , Recreação , Autorrelato , Inquéritos e Questionários