ABSTRACT
The peripheral protons of the dye molecule hypericin can undergo structural interconversion (tautomerization) between different isomers separated by a low energy barrier, with rates that depend sensitively on the local chemical environment defined by the nature of the host material. We investigate the deuterium (D) isotope effect on hypericin tautomerization in different polymer matrices at the single-molecule level, avoiding ensemble averaging, using a combined spectroscopic and computational approach. In the 'innocent' PMMA matrix, only intramolecular isotope effects on the internal conversion channel and on tautomerization are observed, whereas PVA interacts specifically with the probe via H- and D-bonding. This establishes a single-molecule picture of how intra- and intermolecular nano-environment effects control chromophore photophysics and photochemistry.
ABSTRACT
There is growing evidence that weather alters SARS-CoV-2 transmission, but it remains unclear what drives the phenomenon. One prevailing hypothesis is that people spend more time indoors in cooler weather, leading to increased spread of SARS-CoV-2 related to time spent in confined spaces and close contact with others. However, the evidence in support of that hypothesis is limited and, at times, conflicting. We used a mediation framework, combining daily weather, COVID-19 hospital surveillance, cellphone-based mobility data, and building footprints to estimate the relationship between daily indoor and outdoor weather conditions, mobility, and COVID-19 hospitalizations. We quantified the direct health impacts of weather on COVID-19 hospitalizations and the indirect effects of weather, via time spent indoors away from home, on COVID-19 hospitalizations within five Colorado counties between March 4, 2020 and January 31, 2021. We also evaluated the evidence for seasonal effect modification by comparing the results of all-season models (using season as a covariate) to season-stratified models. Four weather conditions were associated with both time spent indoors away from home and 12-day-lagged COVID-19 hospital admissions in one or more seasons: high minimum temperature (all-season), low maximum temperature (spring), low minimum absolute humidity (winter), and high solar radiation (all-season and winter). In our mediation analyses, we found evidence that changes in 12-day-lagged hospital admissions occurred primarily through the direct effects of weather conditions rather than through indirect effects in which weather alters time spent indoors away from home. Our findings do not support the hypothesis that weather impacted SARS-CoV-2 transmission via changes in mobility patterns during the first year of the pandemic. Rather, weather appears to have impacted SARS-CoV-2 transmission primarily via mechanisms other than human movement.
We recommend further analysis of this phenomenon to determine whether these findings generalize to current SARS-CoV-2 transmission dynamics, as well as to other seasonal respiratory pathogens.
Subjects
COVID-19, Cell Phone, SARS-CoV-2, Weather, COVID-19/transmission, COVID-19/epidemiology, Humans, Hospitalization/statistics & numerical data, Seasons, Colorado/epidemiology
ABSTRACT
Background: There is growing evidence that weather alters SARS-CoV-2 transmission, but it remains unclear what drives the phenomenon. One prevailing hypothesis is that people spend more time indoors in cooler weather, leading to increased spread of SARS-CoV-2 related to time spent in confined spaces and close contact with others. However, the evidence in support of that hypothesis is limited and, at times, conflicting. Objectives: We aimed to evaluate the extent to which weather impacts COVID-19 via time spent away from home in indoor spaces, as compared to a direct effect of weather on COVID-19 hospitalization, independent of mobility. Methods: We used a mediation framework, combining daily weather, COVID-19 hospital surveillance, cellphone-based mobility data, and building footprints to estimate the relationship between daily indoor and outdoor weather conditions, mobility, and COVID-19 hospitalizations. We quantified the direct health impacts of weather on COVID-19 hospitalizations and the indirect effects of weather, via time spent indoors away from home, on COVID-19 hospitalizations within five Colorado counties between March 4, 2020 and January 31, 2021. Results: We found evidence that changes in 12-day-lagged hospital admissions occurred primarily through the direct effects of weather conditions rather than through indirect effects in which weather alters time spent indoors away from home. Sensitivity analyses evaluating time at home as a mediator were consistent with these conclusions. Discussion: Our findings do not support the hypothesis that weather impacted SARS-CoV-2 transmission via changes in mobility patterns during the first year of the pandemic. Rather, weather appears to have impacted SARS-CoV-2 transmission primarily via mechanisms other than human movement. We recommend further analysis of this phenomenon to determine whether these findings generalize to current SARS-CoV-2 transmission dynamics and other seasonal respiratory pathogens.
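The direct-versus-indirect decomposition described in this abstract can be illustrated on simulated data. The sketch below is not the authors' model (which used lagged hospitalization counts and weather covariates); it is a linear toy example using ordinary least squares, where the total effect of an exposure splits exactly into a direct path plus a mediated, product-of-coefficients path:

```python
import random

random.seed(42)
n = 5000
# Simulated linear example: "weather" W affects a mediator M (time spent
# indoors) and the outcome Y both directly and through M. True paths:
a_true, b_true, direct_true = 0.5, 0.8, 1.2
W = [random.gauss(0, 1) for _ in range(n)]
M = [a_true * w + random.gauss(0, 1) for w in W]
Y = [direct_true * w + b_true * m + random.gauss(0, 1) for w, m in zip(W, M)]

def ols1(x, y):
    """Slope of y ~ x (with intercept)."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    return sum((u - mx) * (v - my) for u, v in zip(x, y)) / sum((u - mx) ** 2 for u in x)

def ols2(x1, x2, y):
    """Slopes of y ~ x1 + x2 (with intercept), via centered normal equations."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [v - m1 for v in x1]; c2 = [v - m2 for v in x2]; cy = [v - my for v in y]
    s11 = sum(v * v for v in c1); s22 = sum(v * v for v in c2)
    s12 = sum(u * v for u, v in zip(c1, c2))
    s1y = sum(u * v for u, v in zip(c1, cy)); s2y = sum(u * v for u, v in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

a_hat = ols1(W, M)                 # exposure -> mediator path
total_hat = ols1(W, Y)             # total effect of exposure on outcome
direct_hat, b_hat = ols2(W, M, Y)  # direct effect and mediator -> outcome path
indirect_hat = a_hat * b_hat       # product-of-coefficients indirect effect
```

In linear models this decomposition is an exact algebraic identity (total = direct + indirect); with lagged counts or nonlinear links, as in the study above, mediation effects instead require counterfactual definitions.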
ABSTRACT
We evaluated detectable viral load (VL) in pregnant women established on antiretroviral therapy (ART) for at least 6 months before conception and in those self-reporting as ART naïve at first antenatal care (ANC) visit at two government clinics in Southern Malawi. We used logistic regression to identify predictors of detectable VL, defined as any measure greater than 400 copies/ml. Of 816 women, 67.9% were established on ART and 32.1% self-reported as ART naïve. Among women established on ART, 10.8% had detectable VL and 9.9% had VL >1000 copies/ml (the WHO criterion for virological failure). In adjusted analysis among women established on ART, virological failure was associated with younger age (p = 0.02), being single/widowed (p = 0.001), and no previous deliveries (p = 0.05). One fifth of women who reported being ART naïve were found to have an undetectable VL at first ANC. None of the demographic factors significantly differentiated those with high versus low VL in the ART-naïve sub-sample. In this cohort, approximately 90% of women who had initiated ART prior to conception had an undetectable VL at first ANC. This demonstrates the success of the ART program but also identifies high-risk populations that require additional support.
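For a single binary predictor, the odds ratio from a univariable logistic regression like the one used above equals the cross-product ratio of the 2×2 table. The sketch below is a hedged illustration on simulated data, with an invented exposure (say, younger age), invented prevalences, and a true odds ratio of 2; it is not the study's data or adjusted model:

```python
import math, random

random.seed(7)
n = 4000
# Simulated cohort: a binary factor (purely illustrative) doubles the odds
# of a detectable viral load (>400 copies/ml).
p_unexposed = 0.10                     # baseline risk of detectable VL
true_or = 2.0
odds_exp = true_or * p_unexposed / (1 - p_unexposed)
p_exposed = odds_exp / (1 + odds_exp)  # about 0.18

a = b = c = d = 0  # 2x2 cells: (exposed, unexposed) x (detectable, undetectable)
for _ in range(n):
    exposed = random.random() < 0.5
    detectable = random.random() < (p_exposed if exposed else p_unexposed)
    if exposed and detectable:   a += 1
    elif exposed:                b += 1
    elif detectable:             c += 1
    else:                        d += 1

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Wald SE of log-OR
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An adjusted analysis, as in the study, would instead fit a multivariable logistic model so that each odds ratio is conditional on the other covariates.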
ABSTRACT
Clinical immunity against Plasmodium falciparum (Pf) infection develops in residents of malaria-endemic regions, manifesting in reduced clinical symptoms during infection and in protection against severe disease, but the underlying mechanisms are not fully understood. Here, we compare the cellular and humoral immune responses of clinically immune (0-1 episodes over 18 months) and susceptible (at least 3 episodes) individuals during a mild episode of Pf malaria infection in a malaria-endemic region of Malawi, by analysing peripheral blood samples using high-dimensional mass cytometry (CyTOF), spectral flow cytometry, and single-cell transcriptomic analyses. In the clinically immune, we find increased proportions of circulating follicular helper T cells and classical monocytes, while the humoral immune response shows characteristic age-related differences in the protected. Memory CD4+ T cell clones with a strong cytolytic ZEB2+ T helper 1 effector signature, sharing identical T cell receptor clonotypes and recognizing the Pf-derived circumsporozoite protein (CSP) antigen, are found in the blood of Pf-infected participants gaining protection. Moreover, in clinically protected participants, ZEB2+ memory CD4+ T cells express lower levels of inhibitory and chemotactic receptors. We thus propose that clonally expanded ZEB2+ CSP-specific cytolytic memory CD4+ Th1 cells may contribute to clinical immunity against sporozoite- and liver-stage Pf malaria.
Subjects
Malaria Vaccines, Falciparum Malaria, Malaria, Humans, Plasmodium falciparum, Falciparum Malaria/prevention & control, Malaria/prevention & control, Th1 Cells, Protozoan Proteins, Clonal Cells
ABSTRACT
BACKGROUND: Infants under 6 months of age are often excluded from malaria surveillance and observational studies. The impact of malaria during early infancy on health later in childhood remains unknown. METHODS: Infants from two birth cohorts in Malawi were monitored at quarterly intervals, and whenever they were ill, from birth through 24 months for Plasmodium falciparum infections and clinical malaria. Poisson regression and linear mixed-effects models measured the effect of exposure to malaria in infancy on subsequent malaria incidence, weight-for-age z-scores (WAZ), and haemoglobin concentrations after 6 months. RESULTS: Infants with at least one P. falciparum infection during their first 6 months had an increased incidence of subsequent P. falciparum infection (incidence rate ratio [IRR] = 1.27, 95% CI 1.06-1.52) and clinical malaria (IRR = 2.37, 95% CI 2.02-2.80) compared to infants without infection. Infants with clinical malaria in the first 6 months had an increased incidence of P. falciparum infection between 6 and 24 months (IRR = 1.64, 95% CI 1.38-1.94) and of clinical malaria (IRR = 1.85, 95% CI 1.48-2.32). Exposure to malaria was associated with lower WAZ over time (p = 0.02) and lower haemoglobin levels than in unexposed infants at every time interval (p = 0.02). CONCLUSIONS: Infants experiencing malaria infection or clinical malaria in early infancy are at increased risk of subsequent infection and disease, have poorer growth, and have lower haemoglobin concentrations.
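Incidence rate ratios like those quoted above are typically estimated with Poisson regression over person-time; for a single binary exposure, the crude IRR reduces to a ratio of event rates, with a Wald interval on the log scale. A minimal sketch with invented counts (not the study's data):

```python
import math

def rate_ratio(events_exposed, py_exposed, events_unexposed, py_unexposed, z=1.96):
    """Incidence rate ratio with a Wald confidence interval on the log scale."""
    irr = (events_exposed / py_exposed) / (events_unexposed / py_unexposed)
    se = math.sqrt(1 / events_exposed + 1 / events_unexposed)  # SE of log-IRR
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical counts: 60 infections over 150 child-years in exposed infants
# versus 40 infections over 200 child-years in unexposed infants.
irr, lo, hi = rate_ratio(60, 150, 40, 200)  # IRR = 2.0
```

The regression version of this estimator additionally allows adjustment for covariates such as age and site.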
Subjects
Anemia, Falciparum Malaria, Malaria, Humans, Infant, Plasmodium falciparum, Falciparum Malaria/complications, Falciparum Malaria/epidemiology, Malaria/complications, Anemia/epidemiology, Anemia/complications, Hemoglobins
ABSTRACT
Respiratory syncytial virus (RSV) is the most common cause of early-childhood lower respiratory tract infection (LRTI) in low- and middle-income countries (LMICs). Maternal vaccines, birth-dose extended half-life monoclonal antibodies (mAbs), and pediatric vaccines are under development for prevention of RSV LRTI in young children. We analyzed the health and economic impact of RSV interventions used alone or in combination in Mali. We modeled age-specific and season-specific risks of RSV LRTI in children through three years of age, using WHO Preferred Product Characteristics and data generated in Mali. Health outcomes included RSV LRTI cases, hospitalizations, deaths, and disability-adjusted life-years (DALYs). We identified the optimal combination of products across a range of scenarios. We found that a mAb delivered at birth could avert 878 DALYs per birth cohort at an incremental cost-effectiveness ratio (ICER) of $597 per DALY averted compared to no intervention, if the product were available at $1 per dose. Combining the mAb with a pediatric vaccine administered at 10/14 weeks would prevent 1947 DALYs, at an ICER of $1514 per DALY averted compared to mAb alone. Incorporating parameter uncertainty, mAb alone is likely to be optimal from the societal perspective at efficacy against RSV LRTI above 66%. The optimal strategy was sensitive to economic considerations, including product prices and willingness-to-pay for DALYs. For example, the combination of mAb and pediatric vaccine would be optimal from the government perspective at a willingness-to-pay above $775 per DALY. Maternal vaccine alone or in combination with other interventions was never the optimal strategy, even at high vaccine efficacy. The same was true for a pediatric vaccine administered at 6/7 months.
At prices comparable to existing vaccine products, extended half-life RSV mAbs would be impactful and efficient components of prevention strategies in LMICs such as Mali.
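The ICERs above follow the standard definition: incremental cost divided by incremental health effect between a strategy and its comparator. A minimal sketch, with cost figures invented only so that the ratios reproduce the $597 and $1514 per DALY averted reported in the abstract (the study's actual cost totals are not given here):

```python
def icer(cost_new, effect_new, cost_ref, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra
    DALY averted relative to the comparator strategy."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Strategies as (total cost in USD, DALYs averted) per birth cohort.
# Costs are hypothetical, chosen to match the quoted ICERs.
no_intervention = (0.0, 0.0)
mab_alone = (524_166.0, 878.0)       # -> $597 per DALY averted vs nothing
combination = (2_142_632.0, 1947.0)  # -> $1514 per DALY averted vs mAb alone

icer_mab = icer(*mab_alone, *no_intervention)
icer_combo = icer(*combination, *mab_alone)
```

Comparing each strategy against the next-best non-dominated alternative (rather than against "no intervention" for all) is what makes the combination's ICER much higher than the mAb's.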
ABSTRACT
BACKGROUND: Diarrheal disease is heterogeneous, including watery diarrhea (WD) and dysentery, some cases of which become persistent diarrhea (PD). Changes in risk over time necessitate updated knowledge of these syndromes in sub-Saharan Africa. METHODS: The Vaccine Impact on Diarrhea in Africa (VIDA) study was an age-stratified, case-control study of moderate-to-severe diarrhea among children <5 years old in The Gambia, Mali, and Kenya (2015-2018). We analyzed cases with follow-up of about 60 days after enrollment to detect PD (lasting ≥14 days), examined the features of WD and dysentery, and examined determinants of progression to and sequelae from PD. Data were compared with those from the Global Enteric Multicenter Study (GEMS) to detect temporal changes. Etiology was assessed from stool samples using pathogen attributable fractions (AFs), and predictors were assessed using χ² tests or multivariate regression, where appropriate. RESULTS: Among 4606 children with moderate-to-severe diarrhea, 3895 (84.6%) had WD and 711 (15.4%) had dysentery. PD was more frequent among infants (11.3%) than in children aged 12-23 months (9.9%) or 24-59 months (7.3%) (P = .001), and more frequent in Kenya (15.5%) than in The Gambia (9.3%) or Mali (4.3%) (P < .001); the frequencies were similar among children with WD (9.7%) and those with dysentery (9.4%). Compared to children not treated with antibiotics, those who received antibiotics had a lower frequency of PD overall (7.4% vs 10.1%, P = .01), particularly among those with WD (6.3% vs 10.0%; P = .01) but not among children with dysentery (8.5% vs 11.0%; P = .27). For those with watery PD, Cryptosporidium and norovirus had the highest AFs among infants (0.16 and 0.12, respectively), while Shigella had the highest AF (0.25) in older children. The odds of PD decreased significantly over time in Mali and Kenya while increasing significantly in The Gambia.
CONCLUSIONS: The burden of PD endures in sub-Saharan Africa, with nearly 10% of episodes of WD and dysentery becoming persistent.
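The pathogen attributable fractions (AFs) above come from case-control modeling. One standard case-based form (often attributed to Bruzzi et al.) combines a pathogen's positivity prevalence among cases with its odds ratio; the figures below are invented for illustration, and the study's actual AFs were derived from adjusted, quantity-dependent models:

```python
def attributable_fraction(p_positive_cases, odds_ratio):
    """Population attributable fraction from case-control data
    (Bruzzi-type formula): AF = p_c * (1 - 1/OR), where p_c is the
    pathogen-positivity prevalence among cases."""
    return p_positive_cases * (1.0 - 1.0 / odds_ratio)

# Hypothetical: pathogen detected in 30% of cases, with an odds ratio of 3.
af = attributable_fraction(0.30, 3.0)  # fraction of cases attributable ~ 0.2
```

Intuitively, only the excess odds (the part of detections beyond what would occur in controls) counts toward the attributable burden.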
Assuntos
Criptosporidiose , Cryptosporidium , Disenteria , Vacinas contra Rotavirus , Lactente , Criança , Humanos , Pré-Escolar , Estudos de Casos e Controles , Criptosporidiose/complicações , Diarreia/epidemiologia , Diarreia/prevenção & controle , Diarreia/etiologia , Disenteria/complicações , Fatores de Risco , Quênia/epidemiologia , AntibacterianosRESUMO
OBJECTIVES: Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, which causes coronavirus disease 2019 (COVID-19), is spread primarily through exposure to respiratory droplets from close contact with an infected person. To inform prevention measures, we conducted a case-control study among Colorado adults to assess the risk of SARS-CoV-2 infection from community exposures. METHODS: Cases were symptomatic Colorado adults (aged ≥18 years) with a positive SARS-CoV-2 test by reverse transcription-polymerase chain reaction (RT-PCR) reported to Colorado's COVID-19 surveillance system. From March 16 to December 23, 2021, cases were randomly selected from surveillance data ≤12 days after their specimen collection date. Cases were matched on age, zip code (urban areas) or region (rural/frontier areas), and specimen collection date with controls randomly selected among persons with a reported negative SARS-CoV-2 test result. Data on close contact and community exposures were obtained from surveillance and a survey administered online. RESULTS: The most common exposure locations among all cases and controls were place of employment, social events, or gatherings, and the most frequently reported exposure relationship was co-worker or friend. Cases were more likely than controls to work outside the home (adjusted odds ratio [aOR] 1.18, 95% confidence interval [CI]: 1.09-1.28) in industries and occupations related to accommodation and food services, retail sales, and construction. Cases were also more likely than controls to report contact with a non-household member with confirmed or suspected COVID-19 (aOR 1.16, 95% CI: 1.06-1.27). CONCLUSIONS: Understanding the settings and activities associated with a higher risk of SARS-CoV-2 infection is essential for informing prevention measures aimed at reducing the transmission of SARS-CoV-2 and other respiratory diseases.
These findings emphasize the risk of community exposure to infected persons and the need for workplace precautions in preventing ongoing transmission.
Subjects
COVID-19, SARS-CoV-2, Adult, Humans, Adolescent, COVID-19/diagnosis, COVID-19/epidemiology, Case-Control Studies, Colorado/epidemiology, Ocular Accommodation
ABSTRACT
BACKGROUND: When people with human immunodeficiency virus (HIV) infection (PWH) develop malaria, they are at risk of poor anti-malarial treatment efficacy resulting from impairment of the immune response and/or drug-drug interactions that alter anti-malarial metabolism. The therapeutic efficacy of artemether-lumefantrine (AL) was evaluated in a cohort of PWH on antiretroviral therapy (ART), including measurement of day 7 lumefantrine levels in a subset to evaluate associations between lumefantrine exposure and treatment response. METHODS: Adults living with HIV (≥18 years), on ART for ≥6 months with an undetectable HIV RNA viral load and a CD4 count ≥250/mm3, were randomized to daily trimethoprim-sulfamethoxazole (TS), weekly chloroquine (CQ), or no prophylaxis. After diagnosis of uncomplicated Plasmodium falciparum malaria, therapeutic efficacy monitoring was conducted with PCR correction according to WHO guidelines. Plasma lumefantrine levels on day 7 were measured in 100 episodes of uncomplicated malaria. A frailty proportional hazards model with random effects to account for clustering examined the relationship between participant characteristics and malaria treatment failure within 28 days. Pearson's chi-squared test was used to compare lumefantrine concentrations between patients with treatment failure and those with adequate clinical and parasitological response (ACPR). RESULTS: 411 malaria episodes were observed among 186 participants over 5 years. The unadjusted ACPR rate was 81% (95% CI 77-86). However, after PCR correction to exclude new infections, the ACPR rate was 94% (95% CI 92-97). Increasing age and living in Ndirande were associated with a decreased hazard of treatment failure. In this population of adults with HIV on ART, 54% (51/94) had levels below a previously defined optimal day 7 lumefantrine level of 200 ng/ml. This occurred more commonly among participants receiving efavirenz-based ART compared to other ART regimens (OR 5.09 [95% CI 1.52-7.9]). Participants who experienced treatment failure had lower median day 7 lumefantrine levels (91 ng/ml [95% CI 48-231]) than participants who experienced ACPR (190 ng/ml [95% CI 101-378], p < 0.008). CONCLUSION: Recurrent malaria infections are frequent in this population of PWH on ART. The PCR-adjusted efficacy of AL meets the WHO criteria for acceptable treatment efficacy. Nevertheless, lumefantrine levels tend to be low in this population, particularly in those on efavirenz-based regimens, with lower concentrations associated with more frequent malaria infections following treatment. These results highlight the importance of understanding drug-drug interactions when diseases commonly co-occur.
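Proportions with intervals like the 94% ACPR (95% CI 92-97) above are binomial point estimates. The sketch below uses a simple Wald interval and invented counts (the exact PCR-corrected denominator is not reported in the abstract), so it illustrates the calculation rather than reproducing the study's exact interval:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Binomial proportion with a simple Wald confidence interval,
    clipped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts: 386 adequate responses among 411 episodes
# gives roughly the 94% ACPR quoted above.
p, lo, hi = wald_ci(386, 411)
```

For proportions near 0 or 1, or small samples, a Wilson or exact (Clopper-Pearson) interval is usually preferred over Wald.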
Subjects
Antimalarials, Artemisinins, HIV Infections, Falciparum Malaria, Malaria, Humans, Adult, Antimalarials/therapeutic use, Malawi, Artemisinins/therapeutic use, Artemether/therapeutic use, Drug Combinations, Artemether-Lumefantrine Combination/therapeutic use, Malaria/drug therapy, Falciparum Malaria/drug therapy, Falciparum Malaria/prevention & control, Lumefantrine/therapeutic use, HIV Infections/drug therapy, Treatment Outcome, Ethanolamines/therapeutic use, Fluorenes/therapeutic use
ABSTRACT
Climate change may alter access to safe drinking water, with important implications for health. We assessed the relationship between temperature, rainfall, and utilization of basic drinking water (BDW) in The Gambia, Mozambique, Pakistan, and Kenya. The outcomes of interest were (a) whether the reported drinking water source used in the past 2 weeks met the World Health Organization definition of BDW and (b) use of a BDW source that was always available. Temperature and precipitation data were compiled from weather stations and satellite data and summarized to account for long- and short-term weather patterns and lags. We utilized random forests and logistic regression to identify the key weather variables that predicted each outcome by site and to estimate the associations between important weather variables and BDW use. Higher temperatures were associated with decreased BDW use at three of four sites and with decreased use of a BDW source that was always available at all four sites. Increasing rainfall, both in the long- and short-term, was associated with increased BDW use at three sites. We found evidence of interactions between household wealth and weather variables at two sites, suggesting lower-wealth populations may be more sensitive to weather-driven changes in water access. Changes in temperature and precipitation can alter safe water use in low-resource settings; investigating the drivers of these relationships can inform efforts to build climate resilience.
ABSTRACT
Adverse events may be a cause of the observed poor completion of isoniazid preventive therapy (IPT) among people living with HIV in high tuberculosis burden areas. Data on IPT-related adverse events (AEs) from sub-Saharan Africa are scarce. We report IPT-related AEs, associated clinical characteristics, and IPT discontinuations in adults who were stable on antiretroviral therapy (ART) when they initiated IPT. This was a cohort study nested within a randomized, controlled clinical trial of cotrimoxazole and chloroquine prophylaxis in Malawians aged ≥18 years and virologically suppressed on ART. Eight hundred sixty-nine patients were followed for a median of 6 months after IPT initiation. IPT relatedness of AEs was determined retrospectively with the World Health Organization case-causality tool. Frailty survival regression modeling identified factors associated with time to the first probably IPT-related AE. The overall IPT-related AE incidence rate was 1.1 per person-year of observation. IPT relatedness was mostly uncertain, and few AEs were severe. Most common were liver and hematological toxicities. Higher age increased the risk of a probably IPT-related AE (aHR = 1.02; 95% CI 1.00-1.06; P = .06) and higher weight reduced this risk (aHR = 0.98; 95% CI 0.96-1.00; P = .03). Of 869 patients, 114 (13%) discontinued IPT, and 94/114 (82%) discontinuations occurred at the time of a possibly or probably IPT-related AE. We observed a high incidence of mostly mild IPT-related AEs among individuals who were stable on ART. More than 1 in 8 persons discontinued IPT. These findings inform strategies to improve implementation of IPT in adults on ART, including close monitoring of groups at higher risk of IPT-related AEs.
Subjects
HIV Infections, Isoniazid, Adult, Antitubercular Agents/adverse effects, Chloroquine/therapeutic use, Cohort Studies, HIV Infections/epidemiology, Humans, Isoniazid/adverse effects, Retrospective Studies, Trimethoprim-Sulfamethoxazole Combination/therapeutic use
ABSTRACT
Malaria in pregnancy (MIP) causes poor birth outcomes, but its impact on neurocognitive development has not been well characterized. Between 2012 and 2014, we enrolled 307 mother-infant pairs and monitored 286 infants for neurocognitive development using the Malawi Developmental Assessment Tool at 6, 12, and 24 months of age. MIP was diagnosed from peripheral blood and placental specimens. Cord blood cytokine levels were assessed for a subset of neonates. Predictors of developmental delay were examined using mixed-effects logistic regression. Among the participants, 78 mothers (25.4%) had MIP, and 45 infants (15.7%) experienced severe neurocognitive delay. MIP was not associated with differences in cord blood cytokine levels or neurocognitive development. Preterm birth, low birthweight, increasing maternal education level, and increasing interleukin 6 levels were significantly associated with delay. The results highlight the prevalence of severe delay and a need for broad access to early childhood support in this setting.
Subjects
Malaria, Premature Birth, Preschool Child, Infant, Newborn Infant, Pregnancy, Female, Humans, Placenta, Malaria/complications, Malaria/epidemiology, Inflammation, Cytokines
ABSTRACT
Shigella continues to be a major contributor to diarrheal illness and dysentery in children younger than 5 years of age in low- and middle-income countries. Strategies for the prevention of shigellosis have focused on enhancing adaptive immunity. The interaction between Shigella and intrinsic host factors, such as the microbiome, remains unknown. We hypothesized that Shigella infection would impact the developing microbial community in infancy and, conversely, that changes in the gastrointestinal microbiome may predispose to infection. To test this hypothesis, we used 16S rRNA amplicon sequencing to characterize the gastrointestinal microbiota in a longitudinal birth cohort from Malawi that was monitored for Shigella infection. Children with at least one Shigella quantitative polymerase chain reaction (qPCR)-positive sample during the first 2 years of life (cases) were compared to uninfected controls matched for sex and age. Overall, microbial species diversity, as measured by the Shannon diversity index, increased over time, regardless of case status. At early time points, the microbial community was dominated by Bifidobacterium longum and Escherichia/Shigella. A greater abundance of Prevotella 9 and Bifidobacterium kashiwanohense was observed at 2 years of age. While no single species was associated with susceptibility to Shigella infection, significant increases in Lachnospiraceae NK4A136 and Fusicatenibacter saccharivorans were observed following Shigella infection. Both taxa are in the family Lachnospiraceae, which are known short-chain fatty acid producers that may improve gut health. Our findings identified temporal changes in the gastrointestinal microbiota associated with Shigella infection in Malawian children and highlight the need to further elucidate the microbial communities associated with disease susceptibility and resolution.
IMPORTANCE Shigella causes more than 180 million cases of diarrhea globally, mostly in children living in poor regions. Infection can lead to severe health impairments that reduce quality of life. There is increasing evidence that disruptions in the gut microbiome early in life can influence susceptibility to illnesses. A delayed or impaired reconstitution of the microbiota following infection can further impact overall health. Aiming to improve our understanding of the interaction between Shigella and the developing infant microbiome, we investigated changes in the gut microbiome of Shigella-infected and uninfected children over the course of their first 2 years of life. We identified species that may be involved in recovery from Shigella infection and in driving the microbiota back to homeostasis. These findings support future studies into the elucidation of the interaction between the microbiota and enteric pathogens in young children and into the identification of potential targets for prevention or treatment.
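The Shannon diversity index used in this study summarizes both the richness and the evenness of the taxa in a sample: H' = -Σ p_i ln p_i over relative abundances p_i. A minimal sketch with illustrative counts (not the study's 16S data):

```python
import math

def shannon_index(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over taxa counts."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# An evenly distributed community of 4 taxa attains the maximum H' = ln(4);
# a community dominated by one taxon has much lower diversity.
h_even = shannon_index([25, 25, 25, 25])
h_skew = shannon_index([97, 1, 1, 1])
```

This is why the index rises as infant microbiomes diversify with age, regardless of infection status.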
Subjects
Bacillary Dysentery, Gastrointestinal Microbiome, Shigella, Infant, Humans, Child, Preschool Child, Gastrointestinal Microbiome/genetics, Bacillary Dysentery/epidemiology, 16S Ribosomal RNA/genetics, Quality of Life, Feces/microbiology, Shigella/genetics, Diarrhea/microbiology
ABSTRACT
OBJECTIVE: Many individuals living with human immunodeficiency virus (HIV) infection and receiving antiretroviral therapy (ART) reside in areas at high risk for malaria, but how malaria affects clinical outcomes is not well described in this population. We evaluated the burden of malaria infection and clinical malaria, and their impact on HIV viral load and CD4+ cell count, among adults on ART. DESIGN: We recruited Malawian adults on ART who had an undetectable viral load and ≥250 CD4+ cells/µl to participate in this randomized trial and to continue daily trimethoprim-sulfamethoxazole (TS; co-trimoxazole), discontinue TS, or switch to weekly chloroquine (CQ). METHODS: We defined clinical malaria as symptoms consistent with malaria plus a positive blood smear, and malaria infection as Plasmodium falciparum DNA detected from dried blood spots (collected every 4-12 weeks). CD4+ cell count and viral load were measured every 24 weeks. We used Poisson regression and survival analysis to compare the incidence of malaria infection and clinical malaria. Clinicaltrials.gov NCT01650558. RESULTS: Among 1499 participants enrolled, clinical malaria incidence was 21.4, 2.4, and 1.9 per 100 person-years of observation (PYO) in the no-prophylaxis, TS, and CQ arms, respectively. We identified twelve cases of malaria that led to hospitalization; all individuals recovered. The preventive effect of staying on prophylaxis was approximately 90% compared to no prophylaxis (TS: incidence rate ratio [IRR] 0.11, 95% confidence interval [CI] 0.08-0.15; CQ: IRR 0.09, 95% CI 0.06-0.13). P. falciparum infection prevalence across all visits was 187/1475 (12.7%), 48/1563 (3.1%), and 29/1561 (1.9%) in the no-prophylaxis, TS, and CQ arms, respectively. Malaria infection and clinical malaria were not associated with changes in CD4+ cell count or viral load. CONCLUSION: In clinically stable adults living with HIV on ART, clinical malaria was common after chemoprophylaxis stopped.
However, neither malaria infection nor clinical illness appeared to affect HIV disease progression.
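Using the incidence rates reported in the abstract above, the approximate 90% preventive effect can be reproduced as one minus the crude rate ratio. This is a simplification: the study's IRRs came from regression models, so the crude values below only roughly match them:

```python
# Clinical malaria incidence per 100 person-years of observation (PYO),
# as reported in the abstract above.
rates = {"none": 21.4, "TS": 2.4, "CQ": 1.9}

def preventive_effect(rate_arm, rate_reference):
    """Preventive effectiveness as 1 minus the crude incidence rate ratio."""
    return 1.0 - rate_arm / rate_reference

eff_ts = preventive_effect(rates["TS"], rates["none"])  # ~0.89
eff_cq = preventive_effect(rates["CQ"], rates["none"])  # ~0.91
```

Both values sit near the "approximately 90%" summary, consistent with the reported IRRs of 0.11 (TS) and 0.09 (CQ).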
Subjects
Antimalarials, HIV Infections, Malaria, Adult, Antimalarials/therapeutic use, CD4 Lymphocyte Count, Chemoprevention, HIV Infections/complications, HIV Infections/drug therapy, HIV Infections/epidemiology, Humans, Malaria/epidemiology, Trimethoprim-Sulfamethoxazole Combination/therapeutic use
ABSTRACT
In China, bovines are believed to be the most common animal source of human schistosomiasis infections, though little is known about what factors promote bovine infections. The current body of literature features inconsistent and sometimes contradictory results, and to date few studies have looked beyond physical characteristics to identify the broader environmental conditions that predict bovine schistosomiasis. Because schistosomiasis is a sanitation-related, water-borne disease transmitted by many animals, we hypothesised that several environmental factors, such as the lack of improved sanitation systems or participation in water-intensive agricultural production, could promote schistosomiasis infection in bovines. Using data collected as part of a repeat cross-sectional study conducted in rural villages in Sichuan, China from 2007 to 2016, we used a Random Forests machine-learning approach to identify the best physical and environmental predictors of bovine Schistosoma japonicum infection. Candidate predictors included: (i) physical/biological characteristics of bovines, (ii) human sources of environmental schistosomes, (iii) socio-economic indicators, (iv) animal reservoirs, and (v) agricultural practices. The density of bovines in a village and agricultural practices, such as the area of rice and dry summer crops planted and the use of night soil as an agricultural fertilizer, were among the top predictors of bovine S. japonicum infection in all collection years. Additionally, human infection prevalence, pig ownership, and bovine age were strong predictors of bovine infection in at least 1 year. Our findings highlight that presumptively treating bovines in villages with high bovine density or high human infection prevalence may help to interrupt transmission.
Furthermore, village-level predictors were stronger predictors of bovine infection than household-level predictors, suggesting future investigations may need to apply a broad ecological lens to identify potential underlying sources of persistent transmission.
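The Random Forests approach described above can be illustrated with a deliberately minimal sketch: an ensemble of decision stumps, each trained on a bootstrap sample with a randomly chosen feature, voting on infection status. The two toy covariates (a village bovine-density index and a human-prevalence index) and all values below are hypothetical, not the study's data or actual model.

```python
import random

def stump_fit(X, y, feat):
    """Best single-feature threshold by training accuracy."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(row[feat] for row in X)):
        preds = [1 if row[feat] >= t else 0 for row in X]
        acc = sum(p == label for p, label in zip(preds, y)) / len(y)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return feat, best_t

def forest_fit(X, y, n_trees=25, seed=0):
    """Bagging: each stump sees a bootstrap sample and a random feature."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]        # bootstrap resample
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        trees.append(stump_fit(Xb, yb, rng.randrange(d))) # random feature
    return trees

def forest_predict(trees, row):
    """Majority vote across the stump ensemble."""
    votes = sum(1 if row[f] >= t else 0 for f, t in trees)
    return 1 if 2 * votes >= len(trees) else 0

# Hypothetical village records: [bovine-density index, human-prevalence index]
X = [[1, 2], [2, 1], [3, 3], [1, 3], [2, 2], [3, 1],      # uninfected villages
     [8, 9], [9, 8], [10, 10], [8, 10], [9, 9], [10, 8]]  # infected villages
y = [0] * 6 + [1] * 6
trees = forest_fit(X, y)
```

A real analysis would use full decision trees over many covariates and rank predictors by out-of-bag variable importance, as the study describes; the sketch only shows the bagging-and-voting core of the method.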
Subjects
Schistosoma japonicum, Schistosomiasis japonica, Schistosomiasis, Animals, Cattle, China/epidemiology, Cross-Sectional Studies, Humans, Prevalence, Schistosoma, Schistosomiasis/epidemiology, Schistosomiasis japonica/epidemiology, Schistosomiasis japonica/veterinary, Snails, Swine, Water
ABSTRACT
Wildfire management in the US relies on a complex nationwide network of shared resources that are allocated based on regional need. While this network bolsters firefighting capacity, it may also provide pathways for transmission of infectious diseases between fire sites. In this manuscript, we present a first attempt at building an epidemiological model adapted to the interconnected fire system, with the aims of supporting prevention and mitigation efforts and understanding potential impacts to workforce capacity. Specifically, we developed an agent-based model of COVID-19 built on historical wildland fire assignments using detailed dispatch data from 2016-2018, which form a network of firefighters dispersed spatially and temporally across the US. We used this model to simulate SARS-CoV-2 transmission under several intervention scenarios, including vaccination and social distancing. We found that vaccination and social distancing are effective at reducing transmission at fire incidents. Under a scenario assuming High Compliance with recommended mitigations (including vaccination), infection rates, the number of outbreaks, and worker days missed are effectively negligible, suggesting the recommended interventions could successfully mitigate the risk of cascading infections between fires. Under a contrasting Low Compliance scenario, cascading outbreaks can emerge, leading to relatively high numbers of worker days missed. As the model was built in 2021, before the emergence of the Delta and Omicron variants, the modeled viral parameters and isolation/quarantine policies may have less relevance to 2022, but they nevertheless underscore the importance of following basic prevention and mitigation guidance. This work could set the foundation for future modeling efforts focused on mitigating the spread of infectious disease at wildland fire incidents to manage both the health of fire personnel and system capacity.
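As a toy illustration of the agent-based approach (not the manuscript's model, which runs on real dispatch networks with calibrated viral parameters), the sketch below simulates transmission among firefighters at a single incident. A single hypothetical `compliance` parameter stands in for the combined effect of vaccination and social distancing by scaling down per-contact transmission; all parameter values are assumptions.

```python
import random

def simulate_incident(n_agents=200, n_days=60, contacts_per_day=8,
                      p_transmit=0.05, compliance=0.0, seed=1):
    """Toy agent-based outbreak: states 0=susceptible, 1=infectious,
    2=recovered. `compliance` in [0, 1] scales per-contact transmission
    down, standing in for vaccination plus distancing."""
    rng = random.Random(seed)
    state = [0] * n_agents
    days_sick = [0] * n_agents
    for k in range(5):
        state[k] = 1  # index cases arriving infectious from other fires
    p_eff = p_transmit * (1.0 - compliance)
    for _ in range(n_days):
        infectious = [i for i, s in enumerate(state) if s == 1]
        for i in infectious:
            for _ in range(contacts_per_day):
                j = rng.randrange(n_agents)  # random mixing in camp
                if state[j] == 0 and rng.random() < p_eff:
                    state[j] = 1
            days_sick[i] += 1
            if days_sick[i] >= 7:  # recover (and stop transmitting)
                state[i] = 2
    return sum(s != 0 for s in state)  # total ever infected
```

Comparing a low-compliance run (`compliance=0.0`) against a high-compliance run (`compliance=0.9`) reproduces the qualitative finding: the effective reproduction number drops below one and the outbreak stays near the seeded cases.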
Subjects
COVID-19, Fires, Wildfires, COVID-19/epidemiology, COVID-19/prevention & control, Humans, SARS-CoV-2, Workforce
ABSTRACT
BACKGROUND: In areas highly endemic for malaria, Plasmodium falciparum infection prevalence peaks in school-age children, adversely affecting health and education. School-based intermittent preventive treatment reduces this burden but concerns about cost and widespread use of antimalarial drugs limit enthusiasm for this approach. School-based screening and treatment is an attractive alternative. In a prospective cohort study, we evaluated the impact of school-based screening and treatment on the prevalence of P. falciparum infection and anemia in 2 transmission settings. METHODS: We screened 704 students in 4 Malawian primary schools for P. falciparum infection using rapid diagnostic tests (RDTs), and treated students who tested positive with artemether-lumefantrine. We determined P. falciparum infection by microscopy and quantitative polymerase chain reaction (qPCR), and hemoglobin concentrations over 6 weeks in all students. RESULTS: Prevalence of infection by RDT screening was 37% (9%-64% among schools). An additional 9% of students had infections detected by qPCR. Following the intervention, significant reductions in infections were detected by microscopy (adjusted relative reduction [aRR], 48.8%; P < .0001) and qPCR (aRR, 24.5%; P < .0001), and in anemia prevalence (aRR, 30.8%; P = .003). Intervention impact was reduced by infections not detected by RDT and new infections following treatment. CONCLUSIONS: School-based screening and treatment reduced P. falciparum infection and anemia. This approach could be enhanced by repeating screening, using more-sensitive screening tests, and providing longer-acting drugs. CLINICAL TRIALS REGISTRATION: NCT04858087.
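The relative reductions reported above are, at their core, percent changes from baseline prevalence. The one-liner below shows only that unadjusted arithmetic with made-up prevalences; the study's aRR values are additionally covariate-adjusted, which this sketch does not attempt.

```python
def relative_reduction(p_before, p_after):
    """Percent reduction in prevalence relative to baseline."""
    return 100.0 * (p_before - p_after) / p_before

# Hypothetical prevalences: 40% infected before the intervention, 20% after
rr = relative_reduction(0.40, 0.20)  # -> 50.0
```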
Subjects
Anemia, Antimalarials, Malaria, Falciparum, Malaria, Anemia/diagnosis, Anemia/epidemiology, Anemia/prevention & control, Antimalarials/therapeutic use, Artemether, Artemether, Lumefantrine Drug Combination/therapeutic use, Child, Humans, Malaria/epidemiology, Malaria, Falciparum/diagnosis, Malaria, Falciparum/drug therapy, Malaria, Falciparum/epidemiology, Malawi/epidemiology, Plasmodium falciparum, Prevalence, Prospective Studies, Schools
ABSTRACT
Since early 2020, non-pharmaceutical interventions (NPIs)-implemented at varying levels of severity and based on widely divergent perspectives of risk tolerance-have been the primary means to control SARS-CoV-2 transmission. This paper aims to identify how risk tolerance and vaccination rates impact the rate at which a population can return to pre-pandemic contact behavior. To this end, we developed a novel mathematical model and used techniques from feedback control to inform data-driven decision-making. We use this model to identify optimal levels of NPIs across geographical regions in order to guarantee that hospitalizations will not exceed given risk tolerance thresholds. Results are shown for the state of Colorado, United States, and they suggest that: coordination in decision-making across regions is essential to maintain the daily number of hospitalizations below the desired limits; increasing risk tolerance can decrease the number of days required to discontinue NPIs, at the cost of an increased number of deaths; and if vaccination uptake is less than 70%, at most levels of risk tolerance, a return to pre-pandemic contact behaviors before the early months of 2022 may once again jeopardize the healthcare system. The sooner we can achieve population-level vaccination of greater than 70%, the sooner we can safely return to pre-pandemic behaviors.
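The feedback-control idea can be sketched with a toy discrete SIR model in which an NPI intensity u in [0, 1] is driven up by an integral controller whenever a crude hospital-load proxy approaches a risk-tolerance threshold. All rates, the hospitalization fraction, the threshold, and the gain below are hypothetical illustrations, not the paper's calibrated regional model.

```python
def run_sir(days=180, beta0=0.3, gamma=0.1, hosp_frac=0.05,
            N=1_000_000, H_limit=500.0, gain=0.002):
    """Discrete SIR with an integral feedback controller on NPI level u.
    gain=0 disables the controller (no NPIs). Returns the peak of a
    crude hospital-load proxy."""
    S, I, R = N - 100.0, 100.0, 0.0
    u, peak_H = 0.0, 0.0
    for _ in range(days):
        H = hosp_frac * I  # hospital-load proxy (hypothetical fraction)
        peak_H = max(peak_H, H)
        # tighten NPIs while load exceeds 80% of the tolerance threshold,
        # relax them otherwise; u stays clipped to [0, 1]
        u = min(1.0, max(0.0, u + gain * (H - 0.8 * H_limit)))
        new_inf = beta0 * (1.0 - u) * S * I / N  # NPIs scale contacts down
        new_rec = gamma * I
        S -= new_inf
        I += new_inf - new_rec
        R += new_rec
    return peak_H
```

Comparing `run_sir()` against the uncontrolled epidemic `run_sir(gain=0.0)` shows the controller holding peak hospital load far below the unmitigated peak, which is the qualitative mechanism the paper's risk-tolerance thresholds formalize.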
Subjects
COVID-19, Influenza, Human, COVID-19/epidemiology, COVID-19/prevention & control, Humans, Influenza, Human/epidemiology, Models, Theoretical, Pandemics/prevention & control, SARS-CoV-2, United States
ABSTRACT
In the rapidly urbanizing region of West Africa, Aedes mosquitoes pose an emerging threat of infectious disease that is compounded by limited vector surveillance. Citizen science has been proposed as a way to fill surveillance gaps by training local residents to collect and share information on disease vectors. Understanding the distribution of arbovirus vectors in West Africa can inform researchers and public health officials on where to conduct disease surveillance and focus public health interventions. We used citizen science data collected through NASA's GLOBE Observer mobile phone application, together with data from a previously published literature review on Aedes mosquito distribution, to examine the contribution of citizen science to understanding the distribution of Ae. aegypti in West Africa using Maximum Entropy modeling. Combining citizen science and literature-derived observations improved the fit of the model compared to models created from each data source alone but did not alleviate location bias within the models, likely due to a lack of widespread observations. Understanding Ae. aegypti distribution will require greater investment in Aedes mosquito surveillance in the region, and citizen science should be used as a tool in this mission to increase the reach of surveillance.
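Maximum Entropy presence-only modeling is closely related to fitting a logistic model that separates pooled presence records from background points. The sketch below works in that spirit: it pools two hypothetical observation sources (citizen-science and literature records of a single made-up "urbanization" covariate) and fits a presence/background logistic model by gradient descent. It is an assumption-laden stand-in, not the MaxEnt software or covariates the study used.

```python
import math

def fit_presence_model(presence, background, lr=0.1, epochs=2000):
    """Full-batch logistic regression on presence (1) vs background (0)
    points over one covariate -- a simple stand-in for MaxEnt-style
    species distribution modeling."""
    data = [(x, 1) for x in presence] + [(x, 0) for x in background]
    w, b = 0.0, 0.0
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x  # gradient of log-loss w.r.t. weight
            gb += p - y        # gradient w.r.t. intercept
        w -= lr * gw / len(data)
        b -= lr * gb / len(data)
    return w, b

def suitability(w, b, x):
    """Predicted relative suitability at covariate value x."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# Pool the two observation sources, as the abstract describes
citizen_sci = [0.80, 0.90, 0.70, 0.85]                     # hypothetical
literature  = [0.75, 0.95, 0.65]                           # hypothetical
background  = [0.10, 0.20, 0.30, 0.15, 0.25, 0.05, 0.35]   # random points
w, b = fit_presence_model(citizen_sci + literature, background)
```

After fitting, predicted suitability is high where the pooled presence records cluster and low near the background points; the real analysis likewise compared models fit to each source alone against the pooled model.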