ABSTRACT
Environmentally mediated protozoan diseases like cryptosporidiosis and giardiasis are likely to be highly impacted by extreme weather, as climate-related conditions like temperature and precipitation have been linked to their survival, distribution, and overall transmission success. Our aim was to investigate the relationship between extreme temperature and precipitation and cryptosporidiosis and giardiasis infection using monthly weather data and case reports from Colorado counties over a 21-year period. Data on reportable diseases and weather among Colorado counties were collected using the Colorado Electronic Disease Reporting System (CEDRS) and the Daily Surface Weather and Climatological Summaries (Daymet) Version 3 dataset, respectively. We used a conditional Poisson distributed-lag nonlinear modeling approach to estimate the lagged association (0 to 12 months) between relative temperature and precipitation extremes and the risk of cryptosporidiosis and giardiasis infection in Colorado counties between 1997 and 2017, relative to the risk found at average values of temperature and precipitation for a given county and month. We found distinctly different patterns in the associations of temperature extremes with cryptosporidiosis versus giardiasis. When maximum or minimum temperatures were high (90th percentile) or very high (95th percentile), we found a significant increase in cryptosporidiosis risk, but a significant decrease in giardiasis risk, relative to risk at the county and calendar-month mean. Conversely, we found very similar relationships between precipitation extremes and both cryptosporidiosis and giardiasis, which highlighted the prominent role of long-term (>8 months) lags. Our study presents novel insights into the influence that extreme temperature and precipitation can have on parasitic disease transmission in real-world settings. Additionally, we present preliminary evidence that the standard lag periods typically used in epidemiological studies to assess the impacts of extreme weather on cryptosporidiosis and giardiasis may not capture the entire relevant period.
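For intuition, a minimal Python sketch of the core idea (an unconstrained distributed-lag Poisson regression with county-by-calendar-month adjustment) is shown below. It uses simulated data and only approximates, rather than reproduces, the conditional Poisson distributed-lag nonlinear model described above, which would typically be fit with dedicated DLNM tooling such as R's dlnm and gnm packages; all variable names are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical county-month panel: case counts plus monthly weather summaries.
n_months, counties = 252, ["A", "B", "C"]
df = pd.DataFrame({
    "county": np.repeat(counties, n_months),
    "t": np.tile(np.arange(n_months), len(counties)),
    "cases": rng.poisson(5, n_months * len(counties)),
    "tmax": rng.normal(20, 8, n_months * len(counties)),
})
df["calmonth"] = df["t"] % 12

# Unconstrained distributed-lag terms for maximum temperature, lags 0-12 months.
MAX_LAG = 12
for lag in range(MAX_LAG + 1):
    df[f"tmax_lag{lag}"] = df.groupby("county")["tmax"].shift(lag)
df = df.dropna().reset_index(drop=True)

# The published model conditioned on county x calendar-month strata; here that
# stratification is approximated with fixed-effect indicator terms.
strata = pd.get_dummies(df["county"] + "_" + df["calmonth"].astype(str),
                        drop_first=True, dtype=float)
lags = df[[f"tmax_lag{lag}" for lag in range(MAX_LAG + 1)]]
X = sm.add_constant(pd.concat([strata, lags], axis=1))

fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(fit.params.filter(like="tmax_lag"))  # lag-specific log rate ratios per degree
```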
Subject(s)
Cryptosporidiosis, Giardiasis, Weather, Cryptosporidiosis/epidemiology, Colorado/epidemiology, Giardiasis/epidemiology, Humans, Nonlinear Dynamics, Temperature, Rain
ABSTRACT
There is growing evidence that weather alters SARS-CoV-2 transmission, but it remains unclear what drives the phenomenon. One prevailing hypothesis is that people spend more time indoors in cooler weather, leading to increased spread of SARS-CoV-2 related to time spent in confined spaces and close contact with others. However, the evidence in support of that hypothesis is limited and, at times, conflicting. We use a mediation framework, and combine daily weather, COVID-19 hospital surveillance, cellphone-based mobility data and building footprints to estimate the relationship between daily indoor and outdoor weather conditions, mobility, and COVID-19 hospitalizations. We quantify the direct health impacts of weather on COVID-19 hospitalizations and the indirect effects of weather via time spent indoors away-from-home on COVID-19 hospitalizations within five Colorado counties between March 4th 2020 and January 31st 2021. We also evaluated the evidence for seasonal effect modification by comparing all-season models (using season as a covariate) to season-stratified models. Four weather conditions were associated with both time spent indoors away-from-home and 12-day lagged COVID-19 hospital admissions in one or more seasons: high minimum temperature (all-season), low maximum temperature (spring), low minimum absolute humidity (winter), and high solar radiation (all-season & winter). In our mediation analyses, we found evidence that changes in 12-day lagged hospital admissions were primarily via the direct effects of weather conditions, rather than via indirect effects by which weather changes time spent indoors away-from-home. Our findings do not support the hypothesis that weather impacted SARS-CoV-2 transmission via changes in mobility patterns during the first year of the pandemic. Rather, weather appears to have impacted SARS-CoV-2 transmission primarily via mechanisms other than human movement. We recommend further analysis of this phenomenon to determine whether these findings generalize to current SARS-CoV-2 transmission dynamics, as well as other seasonal respiratory pathogens.
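As a rough illustration of the logic behind such a mediation analysis (not the authors' actual model, which used lagged hospitalization counts, cellphone-derived mobility, and season-specific weather terms), the difference-in-coefficients idea can be sketched with simulated data: the exposure coefficient is compared with and without adjustment for the mediator.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300

# Simulated daily data: a weather exposure, a mobility mediator partly driven by
# weather, and lagged hospital admissions driven by both.
df = pd.DataFrame({"weather": rng.normal(size=n)})            # e.g., standardized min temperature
df["indoor_time"] = 0.5 * df["weather"] + rng.normal(size=n)  # time indoors away from home
admissions = rng.poisson(np.exp(1.0 + 0.30 * df["weather"] + 0.10 * df["indoor_time"]))

total = sm.GLM(admissions, sm.add_constant(df[["weather"]]),
               family=sm.families.Poisson()).fit()
direct = sm.GLM(admissions, sm.add_constant(df[["weather", "indoor_time"]]),
                family=sm.families.Poisson()).fit()

# If the weather coefficient barely changes after adjusting for the mediator,
# most of the weather effect is "direct" rather than mediated by mobility.
print("total effect of weather:  ", round(total.params["weather"], 3))
print("direct effect of weather: ", round(direct.params["weather"], 3))
print("implied indirect share:   ",
      round(1 - direct.params["weather"] / total.params["weather"], 3))
```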
Subject(s)
COVID-19, Cell Phone, SARS-CoV-2, Weather, COVID-19/transmission, COVID-19/epidemiology, Humans, Hospitalization/statistics & numerical data, Seasons, Colorado/epidemiology
ABSTRACT
OBJECTIVES: To assess whether increasing levels of hospital stress - measured by intensive care unit (ICU) bed occupancy (primary), ventilators in use and emergency department (ED) overflow - were associated with decreasing COVID-19 ICU patient survival in Colorado ICUs during the pre-Delta, Delta and Omicron variant eras. DESIGN: A retrospective cohort study using discrete-time survival models, fit with generalised estimating equations. SETTING: 34 hospital systems in Colorado, USA, with the highest-patient-volume ICUs during the COVID-19 pandemic. PARTICIPANTS: 9196 non-paediatric SARS-CoV-2 patients in Colorado hospitals admitted once to an ICU between 1 August 2020 and 1 March 2022 and followed for 28 days. OUTCOME MEASURES: Death or discharge to hospice. RESULTS: For Delta-era COVID-19 ICU patients in Colorado, the odds of death were estimated to be 26% greater for patients exposed every day of their ICU admission to a facility experiencing its all-era 75th percentile ICU fullness or above, versus patients exposed for none of their days (OR: 1.26; 95% CI: 1.04 to 1.54; p=0.0102), adjusting for age, sex, length of ICU stay, vaccination status and hospital quality rating. For both Delta-era and Omicron-era patients, we also detected significantly increased mortality hazard associated with high ventilator utilisation rates and (in a subset of facilities) states of ED overflow. For pre-Delta-era patients, we estimated relatively null or even protective effects for the same fullness exposures, a meaningful contrast to previous studies, which found increased hazards but were limited to pre-Delta study windows. CONCLUSIONS: Overall, and especially during the Delta era (when most Colorado facilities were at their fullest), increasing exposure to a fuller hospital was associated with an increasing mortality hazard for COVID-19 ICU patients.
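A discrete-time survival model fit with generalised estimating equations is, in essence, a binary regression on person-interval records with a working correlation structure for repeated observations on the same patient. A minimal sketch with simulated person-day data (covariates and effect sizes are invented, not taken from the study) might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Simulated person-day records: one row per patient per ICU day, with a
# time-varying exposure (facility fullness above its 75th percentile that day)
# and a binary event indicator (death/discharge to hospice on that day).
rows = []
for pid in range(500):
    age = rng.integers(30, 90)
    for day in range(1, 29):                      # followed up to 28 days
        full = rng.integers(0, 2)                 # 1 = high ICU occupancy that day
        logit = -5.0 + 0.03 * (age - 60) + 0.25 * full
        event = rng.random() < 1 / (1 + np.exp(-logit))
        rows.append((pid, day, age, full, int(event)))
        if event:
            break
df = pd.DataFrame(rows, columns=["pid", "day", "age", "full", "event"])

X = sm.add_constant(df[["age", "full", "day"]])
gee = sm.GEE(df["event"], X, groups=df["pid"],
             family=sm.families.Binomial(),
             cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(gee.params["full"]))  # per-day odds ratio for high-occupancy exposure
```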
Subject(s)
COVID-19, Hospital Mortality, Intensive Care Units, SARS-CoV-2, Humans, COVID-19/mortality, COVID-19/epidemiology, Colorado/epidemiology, Retrospective Studies, Intensive Care Units/statistics & numerical data, Male, Female, Middle Aged, Aged, Bed Occupancy/statistics & numerical data, Adult, Emergency Service, Hospital/statistics & numerical data
ABSTRACT
Background: There is growing evidence that weather alters SARS-CoV-2 transmission, but it remains unclear what drives the phenomenon. One prevailing hypothesis is that people spend more time indoors in cooler weather, leading to increased spread of SARS-CoV-2 related to time spent in confined spaces and close contact with others. However, the evidence in support of that hypothesis is limited and, at times, conflicting. Objectives: We aim to evaluate the extent to which weather impacts COVID-19 via time spent away-from-home in indoor spaces, as compared to a direct effect of weather on COVID-19 hospitalization, independent of mobility. Methods: We use a mediation framework, and combine daily weather, COVID-19 hospital surveillance, cellphone-based mobility data and building footprints to estimate the relationship between daily indoor and outdoor weather conditions, mobility, and COVID-19 hospitalizations. We quantify the direct health impacts of weather on COVID-19 hospitalizations and the indirect effects of weather via time spent indoors away-from-home on COVID-19 hospitalizations within five Colorado counties between March 4th 2020 and January 31st 2021. Results: We found evidence that changes in 12-day lagged hospital admissions were primarily via the direct effects of weather conditions, rather than via indirect effects by which weather changes time spent indoors away-from-home. Sensitivity analyses evaluating time at home as a mediator were consistent with these conclusions. Discussion: Our findings do not support the hypothesis that weather impacted SARS-CoV-2 transmission via changes in mobility patterns during the first year of the pandemic. Rather, weather appears to have impacted SARS-CoV-2 transmission primarily via mechanisms other than human movement. We recommend further analysis of this phenomenon to determine whether these findings generalize to current SARS-CoV-2 transmission dynamics and other seasonal respiratory pathogens.
ABSTRACT
BACKGROUND: Although the presence of intermediate snails is a necessary condition for local schistosomiasis transmission to occur, using them as surveillance targets in areas approaching elimination is challenging because the patchy and dynamic quality of snail host habitats makes collecting and testing snails labor-intensive. Meanwhile, geospatial analyses that rely on remotely sensed data are becoming popular tools for identifying environmental conditions that contribute to pathogen emergence and persistence. METHODS: In this study, we assessed whether open-source environmental data can be used to predict the presence of human Schistosoma japonicum infections among households with a similar or improved degree of accuracy compared to prediction models developed using data from comprehensive snail surveys. To do this, we used infection data collected from rural communities in Southwestern China in 2016 to develop and compare the predictive performance of two Random Forest machine learning models: one built using snail survey data, and one using open-source environmental data. RESULTS: The environmental data models outperformed the snail data models in predicting household S. japonicum infection, with an estimated accuracy of 0.89 and Cohen's kappa of 0.49 for the environmental model, compared to an accuracy of 0.86 and kappa of 0.37 for the snail model. The Normalized Difference Water Index (an indicator of surface water presence) within half to one kilometer of the home and the distance from the home to the nearest road were among the top-performing predictors in our final model. Homes were more likely to have infected residents if they were further from roads, or nearer to waterways. CONCLUSION: Our results suggest that in low-transmission environments, leveraging open-source environmental data can yield more accurate identification of pockets of human infection than using snail surveys. Furthermore, the variable importance measures from our models point to aspects of the local environment that may indicate increased risk of schistosomiasis. For example, households were more likely to have infected residents if they were further from roads or were surrounded by more surface water, highlighting areas to target in future surveillance and control efforts.
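The modeling comparison can be outlined with scikit-learn; the household-level features below are placeholders rather than the study's actual remote-sensing covariates, and the simulated labels exist only to make the example run end to end.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1000

# Hypothetical household-level predictors (e.g., NDWI near the home, distance to
# the nearest road) and a binary infection label.
X = pd.DataFrame({
    "ndwi_500m": rng.normal(0, 1, n),
    "dist_to_road_m": rng.gamma(2.0, 300.0, n),
    "elevation_m": rng.normal(500, 50, n),
})
logit = -2.5 + 0.8 * X["ndwi_500m"] + 0.001 * X["dist_to_road_m"]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)

# Accuracy alone is misleading when infection is rare; kappa corrects for chance
# agreement, which is why the study reports both.
print("accuracy:", round(accuracy_score(y_te, pred), 2))
print("kappa:   ", round(cohen_kappa_score(y_te, pred), 2))
```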
Subject(s)
Schistosomiasis japonica, Schistosomiasis, Humans, Schistosomiasis/diagnosis, Schistosomiasis/epidemiology, Schistosomiasis/prevention & control, Schistosomiasis japonica/epidemiology, Schistosomiasis japonica/prevention & control, Ecosystem, China/epidemiology, Water
ABSTRACT
Climate change may alter access to safe drinking water, with important implications for health. We assessed the relationship between temperature and rainfall and utilization of basic drinking water (BDW) in The Gambia, Mozambique, Pakistan, and Kenya. The outcomes of interest were (a) whether the reported drinking water source used in the past 2 weeks met the World Health Organization definition of BDW and (b) use of a BDW source that was always available. Temperature and precipitation data were compiled from weather stations and satellite data and summarized to account for long- and short-term weather patterns and lags. We utilized random forests and logistic regression to identify key weather variables that predicted outcomes by site and the association between important weather variables and BDW use. Higher temperatures were associated with decreased BDW use at three of four sites and decreased use of BDW that is always available at all four sites. Increasing rainfall, both in the long- and short-term, was associated with increased BDW use in three sites. We found evidence for interactions between household wealth and weather variables at two sites, suggesting lower-wealth populations may be more sensitive to weather-driven changes in water access. Changes in temperature and precipitation can alter safe water use in low-resource settings; investigating drivers for these relationships can inform efforts to build climate resilience.
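One hedged way to encode the wealth-by-weather interactions described above is a logistic model with an interaction term; the variable names and effect sizes below are hypothetical and not drawn from the site surveys.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000

# Hypothetical household survey rows: BDW use in the past two weeks, recent
# rainfall, temperature, and a household wealth index.
df = pd.DataFrame({
    "temp_30d": rng.normal(28, 4, n),        # mean temperature, prior 30 days (C)
    "rain_30d": rng.gamma(2.0, 20.0, n),     # total rainfall, prior 30 days (mm)
    "wealth": rng.normal(0, 1, n),           # standardized wealth index
})
logit = (0.5 - 0.06 * (df["temp_30d"] - 28) + 0.01 * df["rain_30d"]
         + 0.4 * df["wealth"] + 0.03 * df["wealth"] * (df["temp_30d"] - 28))
df["bdw_use"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# The wealth x temperature term tests whether poorer households are more
# sensitive to hot periods, as suggested at two of the four sites.
fit = smf.logit("bdw_use ~ temp_30d * wealth + rain_30d", data=df).fit(disp=0)
print(fit.params)
```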
ABSTRACT
The global community has adopted ambitious goals to eliminate schistosomiasis as a public health problem, and new tools are needed to achieve them. Mass drug administration programs, for example, have reduced the burden of schistosomiasis, but the identification of hotspots of persistent and reemergent transmission threaten progress toward elimination and underscore the need to couple treatment with interventions that reduce transmission. Recent advances in DNA sequencing technologies make whole-genome sequencing a valuable and increasingly feasible option for population-based studies of complex parasites such as schistosomes. Here, we focus on leveraging genomic data to tailor interventions to distinct social and ecological circumstances. We consider two priority questions that can be addressed by integrating epidemiological, ecological, and genomic information: (1) how often do non-human host species contribute to human schistosome infection? and (2) what is the importance of locally acquired versus imported infections in driving transmission at different stages of elimination? These questions address processes that can undermine control programs, especially those that rely heavily on treatment with praziquantel. Until recently, these questions were difficult to answer with sufficient precision to inform public health decision-making. We review the literature related to these questions and discuss how whole-genome approaches can identify the geographic and taxonomic sources of infection, and how such information can inform context-specific efforts that advance schistosomiasis control efforts and minimize the risk of reemergence.
Subject(s)
Parasites, Schistosomiasis, Animals, Genomics, Mass Drug Administration, Schistosoma, Schistosomiasis/epidemiology, Schistosomiasis/prevention & control
ABSTRACT
In China, bovines are believed to be the most common animal source of human schistosomiasis infections, though little is known about what factors promote bovine infections. The current body of literature features inconsistent and sometimes contradictory results, and to date, few studies have looked beyond physical characteristics to identify the broader environmental conditions that predict bovine schistosomiasis. Because schistosomiasis is a sanitation-related, water-borne disease transmitted by many animals, we hypothesized that several environmental factors - such as the lack of improved sanitation systems, or participation in agricultural production that is water-intensive - could promote schistosomiasis infection in bovines. Using data collected as part of a repeat cross-sectional study conducted in rural villages in Sichuan, China from 2007 to 2016, we used a Random Forests machine learning approach to identify the best physical and environmental predictors of bovine Schistosoma japonicum infection. Candidate predictors included: (i) physical/biological characteristics of bovines, (ii) human sources of environmental schistosomes, (iii) socio-economic indicators, (iv) animal reservoirs, and (v) agricultural practices. The density of bovines in a village and agricultural practices such as the area of rice and dry summer crops planted, and the use of night soil as an agricultural fertilizer, were among the top predictors of bovine S. japonicum infection in all collection years. Additionally, human infection prevalence, pig ownership and bovine age were found to be strong predictors of bovine infection in at least 1 year. Our findings highlight that presumptively treating bovines in villages with high bovine density or human infection prevalence may help to interrupt transmission. Furthermore, village-level predictors were stronger predictors of bovine infection than household-level predictors, suggesting future investigations may need to apply a broad ecological lens to identify potential underlying sources of persistent transmission.
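Variable importance of the kind reported above can be quantified in several ways; the sketch below uses scikit-learn's permutation importance on simulated bovine records (predictor names are illustrative only), which is one common alternative to the impurity-based importances built into Random Forests.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(5)
n = 800

# Hypothetical bovine-level records with village- and household-level predictors.
X = pd.DataFrame({
    "village_bovine_density": rng.gamma(2.0, 5.0, n),
    "village_human_prevalence": rng.beta(2, 20, n),
    "rice_area_mu": rng.gamma(1.5, 2.0, n),
    "night_soil_use": rng.integers(0, 2, n),
    "bovine_age_yr": rng.integers(1, 15, n),
})
logit = -3 + 0.15 * X["village_bovine_density"] + 8 * X["village_human_prevalence"]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Permutation importance: how much predictive performance is lost when each
# predictor is shuffled, averaged over repeats.
imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
ranking = pd.Series(imp.importances_mean, index=X.columns).sort_values(ascending=False)
print(ranking)
```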
Subject(s)
Schistosoma japonicum, Schistosomiasis japonica, Schistosomiasis, Animals, Cattle, China/epidemiology, Cross-Sectional Studies, Humans, Prevalence, Schistosoma, Schistosomiasis/epidemiology, Schistosomiasis japonica/epidemiology, Schistosomiasis japonica/veterinary, Snails, Swine, Water
ABSTRACT
Understanding the genetic underpinnings of schistosome host preferences is critical. Luo et al. recently identified genes associated with intermediate and definitive host-switching based on a new chromosome-level genome for Schistosoma japonicum, population genetic comparisons, and follow-up experiments. This work represents a guide for fully mapping selected schistosome genes using population genetics.
Subject(s)
Schistosoma japonicum, Animals, Genetics, Population, Genomics, Host Specificity/genetics, Schistosoma japonicum/genetics
ABSTRACT
Since early 2020, non-pharmaceutical interventions (NPIs)-implemented at varying levels of severity and based on widely-divergent perspectives of risk tolerance-have been the primary means to control SARS-CoV-2 transmission. This paper aims to identify how risk tolerance and vaccination rates impact the rate at which a population can return to pre-pandemic contact behavior. To this end, we developed a novel mathematical model and we used techniques from feedback control to inform data-driven decision-making. We use this model to identify optimal levels of NPIs across geographical regions in order to guarantee that hospitalizations will not exceed given risk tolerance thresholds. Results are shown for the state of Colorado, United States, and they suggest that: coordination in decision-making across regions is essential to maintain the daily number of hospitalizations below the desired limits; increasing risk tolerance can decrease the number of days required to discontinue NPIs, at the cost of an increased number of deaths; and if vaccination uptake is less than 70%, at most levels of risk tolerance, return to pre-pandemic contact behaviors before the early months of 2022 may newly jeopardize the healthcare system. The sooner we can acquire population-level vaccination of greater than 70%, the sooner we can safely return to pre-pandemic behaviors.
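The paper's model and controller are considerably more detailed; the toy sketch below only illustrates the feedback idea, namely stepping transmission-reducing NPIs up or down so that a hospitalization proxy stays under a tolerance threshold. All parameter values are illustrative and not calibrated to Colorado.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy SEIR with a crude hospitalization proxy and a simple threshold feedback
# rule: the contact-reduction level steps up whenever hospital demand exceeds a
# tolerance threshold, and relaxes otherwise.
N, BETA, SIGMA, GAMMA, HOSP_FRAC = 5.8e6, 0.4, 1 / 4, 1 / 7, 0.05
THRESHOLD = 2000               # tolerated concurrent hospitalizations
NPI_LEVELS = [0.0, 0.3, 0.6]   # fraction of contacts removed by NPIs


def seir(t, y, npi):
    S, E, I, R = y
    beta_eff = BETA * (1 - npi)
    return [-beta_eff * S * I / N,
            beta_eff * S * I / N - SIGMA * E,
            SIGMA * E - GAMMA * I,
            GAMMA * I]


y = [N - 100, 0, 100, 0]
npi, traj = 0.0, []
for week in range(52):                      # re-evaluate the NPI level weekly
    sol = solve_ivp(seir, (0, 7), y, args=(npi,), t_eval=[7])
    y = sol.y[:, -1]
    hospitalized = HOSP_FRAC * y[2]
    # Feedback rule: tighten NPIs if over threshold, relax otherwise.
    idx = NPI_LEVELS.index(npi)
    idx = min(idx + 1, 2) if hospitalized > THRESHOLD else max(idx - 1, 0)
    npi = NPI_LEVELS[idx]
    traj.append((week, round(hospitalized), npi))

print(traj[:10])
```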
Subject(s)
COVID-19, Influenza, Human, COVID-19/epidemiology, COVID-19/prevention & control, Humans, Influenza, Human/epidemiology, Models, Theoretical, Pandemics/prevention & control, SARS-CoV-2, United States
ABSTRACT
Emerging evidence supports a link between environmental factors-including air pollution and chemical exposures, climate, and the built environment-and severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission and coronavirus disease 2019 (COVID-19) susceptibility and severity. Climate, air pollution, and the built environment have long been recognized to influence viral respiratory infections, and studies have established similar associations with COVID-19 outcomes. More limited evidence links chemical exposures to COVID-19. Environmental factors were found to influence COVID-19 through four major interlinking mechanisms: increased risk of preexisting conditions associated with disease severity; immune system impairment; viral survival and transport; and behaviors that increase viral exposure. Both data and methodologic issues complicate the investigation of these relationships, including reliance on coarse COVID-19 surveillance data; gaps in mechanistic studies; and the predominance of ecological designs. We evaluate the strength of evidence for environment-COVID-19 relationships and discuss environmental actions that might simultaneously address the COVID-19 pandemic, environmental determinants of health, and health disparities.
Subject(s)
Air Pollution, COVID-19, Air Pollution/adverse effects, COVID-19/epidemiology, Humans, Incidence, Pandemics, SARS-CoV-2
ABSTRACT
In the rapidly urbanizing region of West Africa, Aedes mosquitoes pose an emerging threat of infectious disease that is compounded by limited vector surveillance. Citizen science has been proposed as a way to fill surveillance gaps by training local residents to collect and share information on disease vectors. Understanding the distribution of arbovirus vectors in West Africa can inform researchers and public health officials on where to conduct disease surveillance and focus public health interventions. We utilized citizen science data collected through NASA's GLOBE Observer mobile phone application and data from a previously published literature review on Aedes mosquito distribution to examine the contribution of citizen science to understanding the distribution of Ae. aegypti in West Africa using Maximum Entropy modeling. Combining citizen science and literature-derived observations improved the fit of the model compared to models created by each data source alone but did not alleviate location bias within the models, likely due to lack of widespread observations. Understanding Ae. aegypti distribution will require greater investment in Aedes mosquito surveillance in the region, and citizen science should be utilized as a tool in this mission to increase the reach of surveillance.
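Maximum Entropy species distribution models are usually fit with dedicated software (for example, the standalone Maxent application); as a loose stand-in, presence/background data can be fit with a penalized logistic regression, to which MaxEnt is closely related. The covariates and data below are hypothetical, not the GLOBE Observer or literature records.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)

# Presence points (e.g., citizen-science plus literature records) versus random
# background points, each with environmental covariates sampled at that location.
n_presence, n_background = 300, 3000

def covariates(n, shift):
    return pd.DataFrame({
        "annual_precip_mm": rng.normal(1200 + shift * 200, 150, n),
        "mean_temp_c": rng.normal(27 + shift, 1.5, n),
        "pop_density": rng.gamma(2.0, 500.0 * (1 + shift), n),
    })

X = pd.concat([covariates(n_presence, 1), covariates(n_background, 0)])
y = np.r_[np.ones(n_presence), np.zeros(n_background)]

# L1-penalized logistic regression on presence vs. background, a rough analogue
# of MaxEnt's regularized fit (not the Maxent software itself).
sdm = make_pipeline(StandardScaler(),
                    LogisticRegression(penalty="l1", solver="liblinear", C=0.5))
sdm.fit(X, y)
suitability = sdm.predict_proba(X)[:, 1]   # relative habitat suitability scores
print(dict(zip(X.columns, sdm.named_steps["logisticregression"].coef_[0].round(2))))
```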
Subject(s)
Aedes, Arboviruses, Citizen Science, Animals, Disease Vectors, Africa, Western, Mosquito Vectors
ABSTRACT
Schistosomiasis is a neglected tropical disease caused by multiple parasitic Schistosoma species; it impacts over 200 million people globally, mainly in low- and middle-income countries. Genomic surveillance to detect evidence for natural selection in schistosome populations represents an emerging and promising approach to identify and interpret schistosome responses to ongoing control efforts or other environmental factors. Here we review how genomic variation is used to detect selection, how these approaches have been applied to schistosomes, and how future studies to detect selection may be improved. We discuss the theory of genomic analyses to detect selection, identify experimental designs for such analyses, and review studies that have applied these approaches to schistosomes. We then consider the biological characteristics of schistosomes that are expected to respond to selection, particularly those that may be impacted by control programs. Examples include drug resistance, host specificity, and life history traits, and we review our current understanding of specific genes that underlie them in schistosomes. We also discuss how inherent features of schistosome reproduction and demography pose substantial challenges for effective identification of these traits and their genomic bases. We conclude by discussing how genomic surveillance for selection should be designed to improve understanding of schistosome biology, and how the parasite changes in response to selection.
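One common way to scan for the selection signals discussed here is to compute a differentiation statistic such as Fst in windows along the genome between two populations (for example, pre- versus post-treatment). The numpy sketch below uses Hudson's estimator on simulated allele counts; a real analysis would run on called variants with dedicated tools such as scikit-allel or VCFtools.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated biallelic SNPs: alternate-allele counts and sample sizes (number of
# chromosomes) in two schistosome populations, e.g., pre- vs. post-treatment.
n_snps, n1, n2 = 5000, 60, 60
p_anc = rng.uniform(0.05, 0.95, n_snps)
p1 = np.clip(p_anc + rng.normal(0, 0.05, n_snps), 0.01, 0.99)
p2 = np.clip(p_anc + rng.normal(0, 0.05, n_snps), 0.01, 0.99)
p2[2000:2100] = np.clip(p2[2000:2100] + 0.4, 0.01, 0.99)   # a "selected" region
f1 = rng.binomial(n1, p1) / n1
f2 = rng.binomial(n2, p2) / n2

# Hudson Fst components per SNP (Bhatia et al. 2013), combined per window as a
# ratio of averages.
num = (f1 - f2) ** 2 - f1 * (1 - f1) / (n1 - 1) - f2 * (1 - f2) / (n2 - 1)
den = f1 * (1 - f2) + f2 * (1 - f1)

window = 100
for start in range(0, n_snps, window):
    fst = num[start:start + window].sum() / den[start:start + window].sum()
    if fst > 0.05:   # arbitrary threshold, just to flag the elevated window
        print(f"SNPs {start}-{start + window}: windowed Fst = {fst:.3f}")
```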
ABSTRACT
Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) emerged in 2019 and has become a major global pathogen in an astonishingly short period of time. The emergence of SARS-CoV-2 has been notable due to its impacts on residents in long-term care facilities (LTCFs). LTCF residents tend to possess several risk factors for severe outcomes of SARS-CoV-2 infection, including advanced age and the presence of comorbidities. Indeed, residents of LTCFs represent approximately 40% of SARS-CoV-2 deaths in the United States. Few studies have focused on the prevalence and transmission dynamics of SARS-CoV-2 among LTCF staff during the early months of the pandemic, prior to mandated surveillance testing. To assess the prevalence and incidence of SARS-CoV-2 among LTCF staff, characterize the extent of asymptomatic infections, and investigate the genomic epidemiology of the virus within these settings, we sampled staff for 8 to 11 weeks at six LTCFs with nasopharyngeal swabs from March through June of 2020. We determined the presence and levels of viral RNA and infectious virus and sequenced 54 nearly complete genomes. Our data revealed that over 50% of infections were asymptomatic or mildly symptomatic, that viral RNA (vRNA) levels were strongly associated with infectious virus, that a subset of individuals had prolonged infections with persistent vRNA (4+ weeks), and that incidence declined over time. Our data suggest that asymptomatic SARS-CoV-2-infected LTCF staff contributed to virus persistence and transmission within the workplace during the early pandemic period. Genetic epidemiology data generated from samples collected during this period support that SARS-CoV-2 was commonly spread between staff within an LTCF and that multiple-introduction events were less common. IMPORTANCE Our work comprises unique data on the characteristics of SARS-CoV-2 dynamics among staff working at LTCFs in the early months of the SARS-CoV-2 pandemic prior to mandated staff surveillance testing. During this time period, LTCF residents were largely sheltering-in-place. Given that staff were able to leave and return daily and could therefore be a continued source of imported or exported infection, we performed weekly SARS-CoV-2 PCR on nasal swab samples collected from this population. There are limited data from the early months of the pandemic comprising longitudinal surveillance of staff at LTCFs. Our data reveal the surprisingly high level of asymptomatic/presymptomatic infections within this cohort during the early months of the pandemic and show genetic epidemiological analyses that add novel insights into both the origin and transmission of SARS-CoV-2 within LTCFs.
Subject(s)
COVID-19 Testing/methods, COVID-19/diagnosis, COVID-19/epidemiology, Hospitals, Long-Term Care, SARS-CoV-2/isolation & purification, Sequence Analysis/methods, Adolescent, Adult, Aged, Asymptomatic Infections/epidemiology, COVID-19/virology, Cohort Studies, Diagnostic Tests, Routine, Epidemiological Monitoring, Female, High-Throughput Nucleotide Sequencing, Humans, Male, Middle Aged, Pandemics, Phylogeny, Prevalence, RNA, Viral, SARS-CoV-2/classification, SARS-CoV-2/genetics, Specimen Handling, Young Adult
ABSTRACT
The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic necessitated rapid local public health response, but studies examining the impact of social distancing policies on SARS-CoV-2 transmission have struggled to capture regional-level dynamics. We developed a susceptible-exposed-infected-recovered transmission model, parameterized to Colorado, USA-specific data, to estimate the impact of coronavirus disease-related policy measures on mobility and SARS-CoV-2 transmission in real time. During March-June 2020, we estimated unknown parameter values and generated scenario-based projections of future clinical care needs. Early coronavirus disease policy measures, including a stay-at-home order, were accompanied by substantial decreases in mobility and reduced the effective reproductive number well below 1. When some restrictions were eased in late April, mobility increased to near baseline levels, but transmission remained low (effective reproductive number <1) through early June. Over time, our model parameters were adjusted to more closely reflect reality in Colorado, leading to modest changes in estimates of intervention effects and more conservative long-term projections.
Subject(s)
COVID-19, SARS-CoV-2, Colorado/epidemiology, Humans, Pandemics, Policy
ABSTRACT
Schistosomiasis persists in Asian regions despite aggressive elimination measures. To identify factors enabling continued parasite transmission, we performed reduced representation genome sequencing on Schistosoma japonicum miracidia collected across multiple years from transmission hotspots in Sichuan, China. We discovered strong geographic structure, suggesting that local, rather than imported, reservoirs are key sources of persistent infections in the region. At the village level, parasites collected after referral for praziquantel treatment are closely related to local pre-treatment populations. Schistosomes within villages are also highly related, suggesting that only a few parasites from a limited number of hosts drive re-infection. The close familial relationships among miracidia from different human hosts also implicate short transmission routes among humans. At the individual host level, genetic evidence indicates that multiple humans retained infections following referral for treatment. Our findings suggest that end-game schistosomiasis control measures should focus on completely extirpating local parasite reservoirs and confirming successful treatment of infected human hosts.
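Relatedness among miracidia is typically estimated from genome-wide SNPs. The sketch below computes a VanRaden-style genomic relationship matrix from simulated 0/1/2 genotypes rather than the study's reduced-representation data; off-diagonal values near 0.5 are consistent with first-degree relatives.

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulated genotype matrix: rows are miracidia, columns are biallelic SNPs
# coded 0/1/2 (copies of the alternate allele). A pair of "relatives" is built
# by copying roughly half of one individual's genotypes into another.
n_ind, n_snps = 20, 5000
p = rng.uniform(0.1, 0.9, n_snps)
G = rng.binomial(2, p, size=(n_ind, n_snps)).astype(float)
shared = rng.random(n_snps) < 0.5
G[1, shared] = G[0, shared]        # individuals 0 and 1 now share ~half their genome

# VanRaden-style genomic relationship matrix: center genotypes at 2p and scale
# by total expected heterozygosity.
Z = G - 2 * p
K = Z @ Z.T / (2 * np.sum(p * (1 - p)))

print(np.round(K[0, 1], 2))   # elevated kinship between the constructed pair
print(np.round(K[0, 2], 2))   # near zero for unrelated individuals
```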
Subject(s)
Genetic Variation, Metagenomics, Schistosoma japonicum/genetics, Schistosomiasis japonica/parasitology, Selection, Genetic, Animals, China/epidemiology, Genotype, Humans, Schistosoma japonicum/classification, Schistosoma japonicum/isolation & purification, Schistosomiasis japonica/epidemiology, Schistosomiasis japonica/transmission
ABSTRACT
Urbanization increases human mobility in ways that can alter the transmission of classically rural, vector-borne diseases like schistosomiasis. The impact of human mobility on individual-level Schistosoma risk is poorly characterized. Travel outside endemic areas may protect against infection by reducing exposure opportunities, whereas travel to other endemic regions may increase risk. Using detailed monthly travel- and water-contact surveys from 27 rural communities in Sichuan, China, in 2008, we aimed to describe human mobility and to identify mobility-related predictors of S. japonicum infection. Candidate predictors included timing, frequency, distance, duration, and purpose of recent travel as well as water-contact measures. Random forests machine learning was used to detect key predictors of individual infection status. Logistic regression was used to assess the strength and direction of associations. Key mobility-related predictors included frequent travel and travel during July, both of which were associated with decreased probability of infection and less time engaged in risky water-contact behavior, suggesting travel may remove opportunities for schistosome exposure. The importance of July travel and July water contact suggests a high-risk window for cercarial exposure. The frequency and timing of human movement out of endemic areas should be considered when assessing potential drivers of rural infectious diseases.
Subject(s)
Endemic Diseases/statistics & numerical data, Population Dynamics/statistics & numerical data, Rural Population/statistics & numerical data, Schistosomiasis japonica/epidemiology, Travel/statistics & numerical data, Adult, China/epidemiology, Female, Humans, Logistic Models, Male, Middle Aged, Schistosomiasis japonica/etiology, Time Factors, Water Resources
ABSTRACT
Genomic approaches hold great promise for resolving unanswered questions about transmission patterns and responses to control efforts for schistosomiasis and other neglected tropical diseases. However, the cost of generating genomic data and the challenges associated with obtaining sufficient DNA from individual schistosome larvae (miracidia) from mammalian hosts have limited the application of genomic data for studying schistosomes and other complex macroparasites. Here, we demonstrate the feasibility of utilizing whole genome amplification and sequencing (WGS) to analyze individual archival miracidia. As an example, we sequenced whole genomes of 22 miracidia from 11 human hosts representing two villages in rural Sichuan, China, and used these data to evaluate patterns of relatedness and genetic diversity. We also down-sampled our dataset to test how lower coverage sequencing could increase the cost effectiveness of WGS while maintaining power to accurately infer relatedness. Collectively, our results illustrate that population-level WGS datasets are attainable for individual miracidia and represent a powerful tool for ultimately providing insight into overall genetic diversity, parasite relatedness, and transmission patterns for better design and evaluation of disease control efforts.