1.
Water Res; 212: 118070, 2022 Apr 01.
Article in English | MEDLINE | ID: mdl-35101695

ABSTRACT

Wastewater surveillance has emerged as a useful tool in the public health response to the COVID-19 pandemic. While wastewater surveillance has been applied at various scales to monitor population-level COVID-19 dynamics, there is a need for quantitative metrics to interpret wastewater data in the context of public health trends. Twenty-four-hour composite wastewater samples were collected from March 2020 through May 2021 at a Massachusetts wastewater treatment plant, and SARS-CoV-2 RNA concentrations were measured using RT-qPCR. The relationship between wastewater copy numbers of SARS-CoV-2 gene fragments and COVID-19 clinical cases and deaths varies over time. We demonstrate the utility of three new metrics to monitor changes in COVID-19 epidemiology: (1) the ratio between wastewater copy numbers of SARS-CoV-2 gene fragments and clinical cases (WC ratio), (2) the time lag between wastewater and clinical reporting, and (3) a transfer function between the wastewater and clinical case curves. The WC ratio increases after key events, providing insight into the balance between disease spread and the public health response. Time lag and transfer function analysis showed that wastewater data preceded clinically reported cases in the first wave of the pandemic but did not serve as a leading indicator in the second wave, likely because increased testing capacity allowed for more timely case detection and reporting. These three metrics could help further integrate wastewater surveillance into the public health response to the COVID-19 pandemic and future pandemics.
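
The abstract does not spell out how these metrics are computed; as a rough sketch of metrics (1) and (2), the Python below derives a WC-style ratio and a best-correlation lag from synthetic daily series. The 7-day smoothing window, the simulated 4-day lead, and all variable names are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Illustrative daily series only; these are not data from the study.
rng = np.random.default_rng(0)
days = np.arange(120)
clinical_cases = 50 + 40 * np.sin(days / 15) + rng.normal(0, 5, days.size)
wastewater_copies = np.roll(clinical_cases, -4) * 120 + rng.normal(0, 300, days.size)

def wc_ratio(wastewater, cases, window=7):
    """Ratio of smoothed wastewater signal to smoothed clinical cases (WC-style ratio)."""
    kernel = np.ones(window) / window
    ww = np.convolve(wastewater, kernel, mode="same")
    cc = np.convolve(cases, kernel, mode="same")
    return ww / np.maximum(cc, 1e-9)  # guard against division by zero

def lead_time(wastewater, cases, max_lag=21):
    """Lag (days) at which today's wastewater signal best matches later clinical cases."""
    corrs = []
    for lag in range(max_lag + 1):
        w = wastewater[:len(wastewater) - lag] if lag else wastewater
        corrs.append(np.corrcoef(w, cases[lag:])[0, 1])
    best = int(np.argmax(corrs))
    return best, corrs[best]

print("median WC ratio:", float(np.median(wc_ratio(wastewater_copies, clinical_cases))))
print("best lag (days) and correlation:", lead_time(wastewater_copies, clinical_cases))
```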


Subject(s)
COVID-19 , Pandemics , Benchmarking , Humans , RNA, Viral , SARS-CoV-2 , Wastewater , Wastewater-Based Epidemiological Monitoring
2.
Sci Total Environ; 805: 150121, 2022 Jan 20.
Article in English | MEDLINE | ID: mdl-34534872

ABSTRACT

Current estimates of COVID-19 prevalence are largely based on symptomatic, clinically diagnosed cases. The existence of a large number of undiagnosed infections hampers population-wide investigation of viral circulation. Here, we quantify the SARS-CoV-2 concentration and track its dynamics in wastewater at a major urban wastewater treatment facility in Massachusetts, between early January and May 2020. SARS-CoV-2 was first detected in wastewater on March 3. SARS-CoV-2 RNA concentrations in wastewater correlated with clinically diagnosed new COVID-19 cases, with the trends appearing 4-10 days earlier in wastewater than in clinical data. We inferred viral shedding dynamics by modeling wastewater viral load as a convolution of back-dated new clinical cases with the average population-level viral shedding function. The inferred viral shedding function showed an early peak, likely before symptom onset and clinical diagnosis, consistent with emerging clinical and experimental evidence. This finding suggests that SARS-CoV-2 concentrations in wastewater may be primarily driven by viral shedding early in infection. This work shows that longitudinal wastewater analysis can be used to identify trends in disease transmission in advance of clinical case reporting, and infer early viral shedding dynamics for newly infected individuals, which are difficult to capture in clinical investigations.
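
The convolution idea described above can be sketched in a few lines: daily wastewater load is modeled as new infections convolved with a population-level shedding kernel that peaks early in infection. The gamma-shaped kernel and every number below are illustrative assumptions, not the quantities inferred in the study; because clinical reporting adds a further delay after infection, a signal built this way can still lead reported cases.

```python
import numpy as np

# Toy epidemic curve of daily new infections (illustrative only).
days = np.arange(150)
new_cases = 5 * np.exp(0.05 * days) * np.exp(-((days - 90) / 40) ** 2)

# Assumed population-level shedding kernel: relative copies shed on each day
# since infection, peaking early (around day 6) and normalized to sum to 1.
tau = np.arange(30)
shedding = tau ** 2 * np.exp(-tau / 3.0)
shedding /= shedding.sum()

# Wastewater viral load ~ convolution of daily new infections with the kernel,
# with multiplicative noise to mimic sampling variability.
rng = np.random.default_rng(1)
wastewater = np.convolve(new_cases, shedding, mode="full")[: days.size]
wastewater *= rng.lognormal(0, 0.2, days.size)

print("day of peak new infections:", int(np.argmax(new_cases)))
print("day of peak wastewater load:", int(np.argmax(wastewater)))
```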


Subject(s)
COVID-19 , SARS-CoV-2 , Humans , RNA, Viral , Virus Shedding , Wastewater
3.
Cell; 184(26): 6229-6242.e18, 2021 Dec 22.
Article in English | MEDLINE | ID: mdl-34910927

ABSTRACT

SARS-CoV-2 variants of concern exhibit varying degrees of transmissibility and, in some cases, escape from acquired immunity. Much effort has been devoted to measuring these phenotypes, but understanding their impact on the course of the pandemic, especially that of immune escape, has remained a challenge. Here, we use a mathematical model to simulate the dynamics of wild-type and variant strains of SARS-CoV-2 in the context of vaccine rollout and nonpharmaceutical interventions. We show that variants with enhanced transmissibility frequently increase epidemic severity, whereas those with partial immune escape either fail to spread widely or primarily cause reinfections and breakthrough infections. However, when these phenotypes are combined, a variant can continue spreading even as immunity builds up in the population, limiting the impact of vaccination and exacerbating the epidemic. These findings help explain the trajectories of past and present SARS-CoV-2 variants and may inform variant assessment and response in the future.
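
A stylized two-strain SIR-type sketch of the competition described above is shown below (vaccination and nonpharmaceutical interventions are omitted for brevity). The transmission rates, the 50% immune escape, and the seeding day are illustrative assumptions, not the paper's parameters; with these numbers the variant, seeded after the wild-type wave, still spreads because partial escape lets it reinfect part of the recovered pool.

```python
import numpy as np

def simulate(beta_wt=0.25, beta_var=0.35, escape=0.5, gamma=0.1,
             days=500, dt=0.5, variant_day=150):
    # Compartments as population fractions.
    S, I_wt, R_wt, I_var, R_var = 1.0 - 1e-4, 1e-4, 0.0, 0.0, 0.0
    traj = []
    for step in range(int(days / dt)):
        t = step * dt
        if abs(t - variant_day) < dt / 2:
            I_var += 1e-6                              # seed the variant
        inf_wt = beta_wt * S * I_wt                    # wild type infects susceptibles
        inf_var_S = beta_var * S * I_var               # variant infects susceptibles
        inf_var_R = beta_var * escape * R_wt * I_var   # variant partially evades WT immunity
        dS = -inf_wt - inf_var_S
        dI_wt = inf_wt - gamma * I_wt
        dR_wt = gamma * I_wt - inf_var_R
        dI_var = inf_var_S + inf_var_R - gamma * I_var
        dR_var = gamma * I_var
        S, I_wt, R_wt = S + dt * dS, I_wt + dt * dI_wt, R_wt + dt * dR_wt
        I_var, R_var = I_var + dt * dI_var, R_var + dt * dR_var
        traj.append((t, I_wt, I_var))
    return np.array(traj)

traj = simulate()
peak_wt = traj[np.argmax(traj[:, 1])]
peak_var = traj[np.argmax(traj[:, 2])]
print(f"wild-type peak prevalence {peak_wt[1]:.3f} on day {peak_wt[0]:.0f}")
print(f"variant peak prevalence {peak_var[2]:.3f} on day {peak_var[0]:.0f}")
```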


Subject(s)
COVID-19/immunology , COVID-19/transmission , Immune Evasion , SARS-CoV-2/immunology , COVID-19/epidemiology , COVID-19/virology , Computer Simulation , Humans , Immunity , Models, Biological , Reinfection , Vaccination
4.
Water Res; 202: 117400, 2021 Sep 01.
Article in English | MEDLINE | ID: mdl-34274898

ABSTRACT

Wastewater-based disease surveillance is a promising approach for monitoring community outbreaks. Here we describe a nationwide campaign to monitor SARS-CoV-2 in the wastewater of 159 counties in 40 U.S. states, covering 13% of the U.S. population from February 18 to June 2, 2020. Out of 1,751 total samples analyzed, 846 samples were positive for SARS-CoV-2 RNA, with overall viral concentrations declining from April to May. Wastewater viral titers were consistent with, and appeared to precede, clinical COVID-19 surveillance indicators, including daily new cases. Wastewater surveillance had a high detection rate (>80%) of SARS-CoV-2 when the daily incidence exceeded 13 per 100,000 people. Detection rates were positively associated with wastewater treatment plant catchment size. To our knowledge, this work represents the largest-scale wastewater-based SARS-CoV-2 monitoring campaign to date, encompassing a wide diversity of wastewater treatment facilities and geographic locations. Our findings demonstrate that a national wastewater-based approach to disease surveillance may be feasible and effective.
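
The detection-rate figure quoted above can be illustrated with a simple stratification: group samples by catchment incidence on the sampling day and compute the share of positive wastewater results. The synthetic data and the logistic curve used to generate them are assumptions for illustration only, not the campaign's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples = 1751
# Hypothetical daily clinical incidence (cases per 100,000) in each catchment.
incidence = rng.lognormal(mean=2.0, sigma=1.0, size=n_samples)
# Assumed detection curve: probability a sample tests positive rises with incidence.
p_detect = 1 / (1 + np.exp(-(incidence - 8) / 3))
positive = rng.random(n_samples) < p_detect

above = incidence > 13
print(f"detection rate when incidence > 13/100k:  {positive[above].mean():.2f}")
print(f"detection rate when incidence <= 13/100k: {positive[~above].mean():.2f}")
```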


Subject(s)
COVID-19 , SARS-CoV-2 , Disease Outbreaks , Humans , RNA, Viral , Wastewater
5.
Cell Host Microbe; 29(7): 1048-1051, 2021 Jul 14.
Article in English | MEDLINE | ID: mdl-34265244

ABSTRACT

If enough individuals in a population are immune to a pathogen, it cannot cause an outbreak. Deliberately seeking such herd immunity through infection during a potentially lethal pandemic is contrary to all principles of public health, given the potential for uncontrolled outbreaks and risks to vulnerable populations.
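
For context, the "enough individuals" in the first sentence is often summarized by the classical homogeneous-mixing threshold of 1 - 1/R0. The R0 values below are illustrative assumptions, not estimates from the article.

```python
# Classical herd immunity threshold for a homogeneously mixing population:
# an outbreak cannot grow once a fraction 1 - 1/R0 of the population is immune.
for r0 in (2.5, 4.0, 8.0):
    print(f"R0 = {r0}: herd immunity threshold ~ {1 - 1 / r0:.0%}")
```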


Subject(s)
COVID-19/immunology , Immunity, Herd , Pandemics , COVID-19/transmission , COVID-19 Vaccines , Disease Outbreaks , Humans , Public Health , SARS-CoV-2 , Vaccination
6.
medRxiv; 2021 Jun 16.
Article in English | MEDLINE | ID: mdl-34159339

ABSTRACT

Wastewater surveillance has emerged as a useful tool in the public health response to the COVID-19 pandemic. While wastewater surveillance has been applied at various scales to monitor population-level COVID-19 dynamics, there is a need for quantitative metrics to interpret wastewater data in the context of public health trends. We collected 24-hour composite wastewater samples from March 2020 through May 2021 from a Massachusetts wastewater treatment plant and measured SARS-CoV-2 RNA concentrations using RT-qPCR. We show that the relationship between wastewater viral titers and COVID-19 clinical cases and deaths varies over time. We demonstrate the utility of three new metrics to monitor changes in COVID-19 epidemiology: (1) the ratio between wastewater viral titers and clinical cases (WC ratio), (2) the time lag between wastewater and clinical reporting, and (3) a transfer function between the wastewater and clinical case curves. We find that the WC ratio increases after key events, providing insight into the balance between disease spread and public health response. We also find that wastewater data preceded clinically reported cases in the first wave of the pandemic but did not serve as a leading indicator in the second wave, likely due to increased testing capacity. These three metrics could complement a framework for integrating wastewater surveillance into the public health response to the COVID-19 pandemic and future pandemics.

7.
Eur J Epidemiol; 36(4): 429-439, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33881667

ABSTRACT

Nonpharmaceutical interventions (NPIs), such as contact tracing and quarantine, have been the primary means of controlling the spread of SARS-CoV-2; however, it remains uncertain which interventions are most effective at reducing transmission at the population level. Using serial interval data from before and after the rollout of NPIs in China, we estimate that the relative frequency of presymptomatic transmission increased from 34% before the rollout to 71% afterward. The shift toward earlier transmission indicates a disproportionate reduction in transmission post-symptom onset. We estimate that, following the rollout of NPIs, transmission post-symptom onset was reduced by 82%, whereas presymptomatic transmission decreased by only 16%. The observation that only one-third of transmission was presymptomatic at baseline, combined with the finding that NPIs reduced presymptomatic transmission by less than 20%, suggests that the overall impact of NPIs was driven in large part by reductions in transmission following symptom onset. This implies that interventions which limit opportunities for transmission in the later stages of infection, such as contact tracing and isolation, are particularly important for control of SARS-CoV-2. Interventions which specifically reduce opportunities for presymptomatic transmission, such as quarantine of asymptomatic contacts, are likely to have smaller, but non-negligible, effects on overall transmission.
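
One common way to reason about the presymptomatic fraction is by Monte Carlo: draw an incubation period and a generation interval for each transmission pair and count how often transmission precedes the infector's symptom onset. The sketch below does exactly that with assumed lognormal and gamma distributions; it is not the estimation procedure used in the study, which works from observed serial intervals.

```python
import numpy as np

# A transmission event counts as presymptomatic when the generation interval
# (infection to onward transmission) is shorter than the infector's incubation
# period. Distribution parameters here are illustrative assumptions.
rng = np.random.default_rng(3)
n = 100_000
incubation = rng.lognormal(mean=np.log(5.0), sigma=0.45, size=n)  # days to symptom onset
generation = rng.gamma(shape=2.5, scale=2.4, size=n)              # days to transmission

presymptomatic_fraction = np.mean(generation < incubation)
print(f"estimated presymptomatic fraction: {presymptomatic_fraction:.0%}")
```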


Subject(s)
COVID-19/physiopathology , COVID-19/transmission , SARS-CoV-2 , China , Contact Tracing , Databases, Factual , Humans , Incidence , Models, Statistical , Quarantine , SARS-CoV-2/pathogenicity
8.
medRxiv; 2021 Mar 14.
Article in English | MEDLINE | ID: mdl-33758888

ABSTRACT

Wastewater-based disease surveillance is a promising approach for monitoring community outbreaks. Here we describe a nationwide campaign to monitor SARS-CoV-2 in the wastewater of 159 counties in 40 U.S. states, covering 13% of the U.S. population from February 18 to June 2, 2020. Out of 1,751 total samples analyzed, 846 samples were positive for SARS-CoV-2 RNA, with overall viral concentrations declining from April to May. Wastewater viral titers were consistent with, and appeared to precede, clinical COVID-19 surveillance indicators, including daily new cases. Wastewater surveillance had a high detection rate (>80%) of SARS-CoV-2 when the daily incidence exceeded 13 per 100,000 people. Detection rates were positively associated with wastewater treatment plant catchment size. To our knowledge, this work represents the largest-scale wastewater-based SARS-CoV-2 monitoring campaign to date, encompassing a wide diversity of wastewater treatment facilities and geographic locations. Our findings demonstrate that a national wastewater-based approach to disease surveillance may be feasible and effective.

9.
medRxiv; 2020 Jul 06.
Article in English | MEDLINE | ID: mdl-32607521

ABSTRACT

Current estimates of COVID-19 prevalence are largely based on symptomatic, clinically diagnosed cases. The existence of a large number of undiagnosed infections hampers population-wide investigation of viral circulation. Here, we use longitudinal wastewater analysis to track SARS-CoV-2 dynamics in wastewater at a major urban wastewater treatment facility in Massachusetts, between early January and May 2020. SARS-CoV-2 was first detected in wastewater on March 3. Viral titers in wastewater increased exponentially from mid-March to mid-April, after which they began to decline. Viral titers in wastewater correlated with clinically diagnosed new COVID-19 cases, with the trends appearing 4-10 days earlier in wastewater than in clinical data. We inferred viral shedding dynamics by modeling wastewater viral titers as a convolution of back-dated new clinical cases with the viral shedding function of an individual. The inferred viral shedding function showed an early peak, likely before symptom onset and clinical diagnosis, consistent with emerging clinical and experimental evidence. Finally, we found that wastewater viral titers at the neighborhood level correlate better with demographic variables than with population size. This work suggests that longitudinal wastewater analysis can be used to identify trends in disease transmission in advance of clinical case reporting, and may shed light on infection characteristics that are difficult to capture in clinical investigations, such as early viral shedding dynamics.

11.
Nat Med; 26(4): 506-510, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32284616

ABSTRACT

As of 29 February 2020, there were 79,394 confirmed cases and 2,838 deaths from COVID-19 in mainland China. Of these, 48,557 cases and 2,169 deaths occurred in the epicenter, Wuhan. A key public health priority during the emergence of a novel pathogen is estimating clinical severity, which requires properly adjusting for the case ascertainment rate and the delay between symptom onset and death. Using public and published information, we estimate that the overall symptomatic case fatality risk (the probability of dying after developing symptoms) of COVID-19 in Wuhan was 1.4% (0.9-2.1%), which is substantially lower than both the corresponding crude confirmed case fatality risk (2,169/48,557 = 4.5%) and the naïve approximation of deaths/(deaths + recoveries) (2,169/(2,169 + 17,572) = 11%) as of 29 February 2020. Compared to those aged 30-59 years, those aged below 30 and above 59 years were 0.6 (0.3-1.1) and 5.1 (4.2-6.1) times as likely to die after developing symptoms. The risk of symptomatic infection increased with age (for example, at ~4% per year among adults aged 30-60 years).
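
The two crude figures quoted above follow directly from the counts; the sketch below reproduces that arithmetic. The adjusted 1.4% estimate additionally corrects for case ascertainment and the onset-to-death delay, which this snippet does not attempt.

```python
# Figures as of 29 February 2020, taken from the abstract above.
deaths, confirmed_cases, recoveries = 2_169, 48_557, 17_572

crude_cfr = deaths / confirmed_cases             # deaths / confirmed cases
crude_resolved = deaths / (deaths + recoveries)  # deaths / (deaths + recoveries)
print(f"crude confirmed-case fatality risk: {crude_cfr:.1%}")      # ~4.5%
print(f"deaths / (deaths + recoveries):     {crude_resolved:.1%}")  # ~11%
```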


Subject(s)
Age Factors , Coronavirus Infections , Models, Statistical , Pandemics , Pneumonia, Viral , Severity of Illness Index , Adolescent , Adult , Aged , Aged, 80 and over , Asymptomatic Diseases , Betacoronavirus , COVID-19 , COVID-19 Testing , Child , Child, Preschool , China/epidemiology , Clinical Laboratory Techniques , Coronavirus Infections/complications , Coronavirus Infections/diagnosis , Coronavirus Infections/epidemiology , Female , Humans , Infant , Male , Middle Aged , Mortality , Pneumonia, Viral/complications , Pneumonia, Viral/diagnosis , Pneumonia, Viral/epidemiology , Prognosis , Real-Time Polymerase Chain Reaction , Risk Factors , SARS-CoV-2
12.
Evol Med Public Health; 2020(1): 30-34, 2020.
Article in English | MEDLINE | ID: mdl-32099654

ABSTRACT

Lay Summary: Competition often occurs among diverse parasites within a single host, but control efforts could change its strength. We examined how the interplay between competition and control could shape the evolution of parasite traits like drug resistance and disease severity.

13.
J R Soc Interface; 16(155): 20190165, 2019 Jun 28.
Article in English | MEDLINE | ID: mdl-31238835

ABSTRACT

Theoretical models suggest that mixed-strain infections, or co-infections, are an important driver of pathogen evolution. However, the within-host dynamics of co-infections vary enormously, which complicates efforts to develop a general understanding of how co-infections affect evolution. Here, we develop a general framework which condenses the within-host dynamics of co-infections into a few key outcomes, the most important of which is the overall R0 of the co-infection. Much as the fitness of a heterozygote is jointly determined by its two alleles, the R0 of a co-infection is jointly determined by the R0 values of the co-infecting strains, shaped by their interaction at the within-host level. Extending the analogy, we propose that the overall R0 reflects the dominance of the co-infecting strains, and that the ability of a mutant strain to invade a population is a function of its dominance in co-infections. To illustrate the utility of these concepts, we use a within-host model to show how dominance arises from the within-host dynamics of a co-infection, and then use an epidemiological model to demonstrate that dominance is a robust predictor of the ability of a mutant strain to save a maladapted wild-type strain from extinction (evolutionary emergence).
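
As a stylized illustration of the dominance idea (not the paper's actual model), the snippet below treats the co-infection's overall R0 as a dominance-weighted combination of a maladapted wild type and a fitter mutant; once the mutant's dominance pushes the co-infection R0 above 1, emergence becomes possible.

```python
import numpy as np

# Assumed single-infection R0 values for illustration: a wild type below the
# epidemic threshold and a mutant above it.
r0_wild_type, r0_mutant = 0.8, 1.6

for dominance in np.linspace(0, 1, 6):
    # Dominance = share of the co-infection's transmission attributable to the mutant.
    r0_coinfection = dominance * r0_mutant + (1 - dominance) * r0_wild_type
    verdict = "can sustain transmission" if r0_coinfection > 1 else "dies out"
    print(f"mutant dominance {dominance:.1f}: co-infection R0 = {r0_coinfection:.2f} ({verdict})")
```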


Subject(s)
Coinfection , Host-Pathogen Interactions , Models, Biological , Animals
14.
PLoS Biol; 16(8): e2005712, 2018 Aug.
Article in English | MEDLINE | ID: mdl-30130363

ABSTRACT

In the malaria parasite P. falciparum, drug resistance generally evolves first in low-transmission settings, such as Southeast Asia and South America. Resistance takes noticeably longer to appear in the high-transmission settings of sub-Saharan Africa, although it may spread rapidly thereafter. Here, we test the hypothesis that competitive suppression of drug-resistant parasites by drug-sensitive parasites may inhibit evolution of resistance in high-transmission settings, where mixed-strain infections are common. We employ a cross-scale model, which simulates within-host (infection) dynamics and between-host (transmission) dynamics of sensitive and resistant parasites for a population of humans and mosquitoes. Using this model, we examine the effects of transmission intensity, selection pressure, fitness costs of resistance, and cross-reactivity between strains on the establishment and spread of resistant parasites. We find that resistant parasites, introduced into the population at a low frequency, are more likely to go extinct in high-transmission settings, where drug-sensitive competitors and high levels of acquired immunity reduce the absolute fitness of the resistant parasites. Under strong selection from antimalarial drug use, however, resistance spreads faster in high-transmission settings than low-transmission ones. These contrasting results highlight the distinction between establishment and spread of resistance and suggest that the former but not the latter may be inhibited in high-transmission settings. Our results suggest that within-host competition is a key factor shaping the evolution of drug resistance in P. falciparum.
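
The establishment half of this distinction can be sketched with a textbook branching-process argument: a rare resistant lineage with effective reproduction number R_eff escapes stochastic extinction with probability 1 - q, where q = exp(R_eff (q - 1)) for Poisson offspring. Representing stronger within-host competition simply as a lower R_eff, with values that are assumptions rather than outputs of the cross-scale model, reproduces the qualitative pattern.

```python
import numpy as np

def establishment_probability(r_eff, iters=200):
    """Probability a rare lineage with Poisson offspring (mean r_eff) avoids extinction."""
    q = 0.5
    for _ in range(iters):            # fixed-point iteration for the extinction probability
        q = np.exp(r_eff * (q - 1))
    return 1 - q

# Illustrative effective reproduction numbers for a newly introduced resistant strain.
for label, r_eff in [("low transmission (weak competition)", 1.5),
                     ("high transmission (strong competition)", 1.05)]:
    print(f"{label}: P(establishment) ~ {establishment_probability(r_eff):.2f}")
```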


Subject(s)
Adaptation, Biological/physiology , Host-Parasite Interactions/physiology , Plasmodium falciparum/physiology , Africa South of the Sahara , Animals , Antimalarials/therapeutic use , Culicidae , Disease Transmission, Infectious , Drug Resistance , Drug Resistance, Bacterial/genetics , Drug Resistance, Bacterial/physiology , Humans , Malaria/parasitology , Malaria, Falciparum/drug therapy , Plasmodium falciparum/drug effects , Plasmodium falciparum/genetics , Plasmodium falciparum/pathogenicity , South America
15.
Proc Biol Sci; 283(1826): 20153038, 2016 Mar 16.
Article in English | MEDLINE | ID: mdl-26984625

ABSTRACT

Infections with the malaria parasite Plasmodium falciparum typically comprise multiple strains, especially in high-transmission areas where infectious mosquito bites occur frequently. However, little is known about the dynamics of mixed-strain infections, particularly whether strains sharing a host compete or grow independently. Competition between drug-sensitive and drug-resistant strains, if it occurs, could be a crucial determinant of the spread of resistance. We analysed 1341 P. falciparum infections in children from Angola, Ghana and Tanzania and found compelling evidence for competition in mixed-strain infections: overall parasite density did not increase with additional strains, and densities of individual chloroquine-sensitive (CQS) and chloroquine-resistant (CQR) strains were reduced in the presence of competitors. We also found that CQR strains exhibited low densities compared with CQS strains (in the absence of chloroquine), which may underlie observed declines of chloroquine resistance in many countries following retirement of chloroquine as a first-line therapy. Our observations support a key role for within-host competition in the evolution of drug-resistant malaria. Malaria control and resistance-management efforts in high-transmission regions may be significantly aided or hindered by the effects of competition in mixed-strain infections. Consideration of within-host dynamics may spur development of novel strategies to minimize resistance while maximizing the benefits of control measures.
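
The competition signature described above, a total density that stays roughly flat as strains are added so that per-strain density falls, can be illustrated on synthetic numbers; these are assumptions for illustration, not the field data from Angola, Ghana and Tanzania.

```python
import numpy as np

rng = np.random.default_rng(4)
carrying_capacity = 10_000            # assumed total parasites/uL supportable in a host
for n_strains in (1, 2, 3, 4):
    # Under competition the total density is independent of the number of strains...
    totals = rng.lognormal(np.log(carrying_capacity), 0.5, size=300)
    # ...so the average density of each individual strain declines.
    per_strain = totals / n_strains
    print(f"{n_strains} strain(s): total ~ {totals.mean():,.0f}/uL, "
          f"per strain ~ {per_strain.mean():,.0f}/uL")
```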


Subject(s)
Antimalarials/pharmacology , Chloroquine/pharmacology , Drug Resistance , Malaria, Falciparum/parasitology , Plasmodium falciparum/drug effects , Plasmodium falciparum/physiology , Angola , Child , Child, Preschool , Ghana , Humans , Infant , Infant, Newborn , Malaria, Falciparum/drug therapy , Plasmodium falciparum/genetics , Tanzania
16.
Antimicrob Agents Chemother; 59(10): 6096-6100, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26195521

ABSTRACT

Routine therapeutic efficacy monitoring to measure the response to antimalarial treatment is a cornerstone of malaria control. To correctly measure drug efficacy, therapeutic efficacy studies require genotyping parasites from late treatment failures to differentiate between recrudescent infections and reinfections. However, there is a lack of statistical methods to systematically classify late treatment failures from genotyping data. A Bayesian algorithm was developed to estimate the posterior probability that a late treatment failure is the result of a recrudescent infection from microsatellite genotyping data. The algorithm was implemented using a Markov chain Monte Carlo (MCMC) approach and was used to classify late treatment failures using published microsatellite data from therapeutic efficacy studies in Ethiopia and Angola. The algorithm classified 85% of the Ethiopian and 95% of the Angolan late treatment failures as either likely reinfection or likely recrudescence, defined as a posterior probability of recrudescence of <0.1 or >0.9, respectively. The adjusted efficacies calculated using the new algorithm differed from efficacies estimated using commonly used methods for differentiating recrudescence from reinfection. In a high-transmission setting such as Angola, as few as 15 samples needed to be genotyped to have enough power to correctly classify treatment failures. Analysis of microsatellite genotyping data for differentiating between recrudescence and reinfection benefits from an approach that both systematically classifies late treatment failures and estimates the uncertainty of these classifications. Researchers analyzing genotyping data from antimalarial therapeutic efficacy monitoring are urged to publish their raw genetic data and to estimate the uncertainty around their classifications.
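
Downstream of the MCMC fit (not reproduced here), the classification step is straightforward. The sketch below applies the 0.1/0.9 cutoffs to hypothetical posterior probabilities and derives an adjusted efficacy by summing posterior recrudescence probabilities, one simple way to propagate the classification uncertainty; all numbers are made up for illustration.

```python
# Hypothetical per-sample posterior probabilities of recrudescence, as would be
# produced by the MCMC fit described above (values invented for illustration).
posterior_recrudescence = [0.02, 0.97, 0.45, 0.08, 0.99, 0.01, 0.93, 0.04]
n_treated = 120                      # hypothetical number of patients followed

def classify(p, lower=0.1, upper=0.9):
    if p < lower:
        return "likely reinfection"
    if p > upper:
        return "likely recrudescence"
    return "unresolved"

calls = [classify(p) for p in posterior_recrudescence]
expected_recrudescences = sum(posterior_recrudescence)   # expected count, carrying uncertainty
print({c: calls.count(c) for c in set(calls)})
print(f"adjusted efficacy ~ {1 - expected_recrudescences / n_treated:.1%}")
```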


Subject(s)
Algorithms , Antimalarials/therapeutic use , Malaria/drug therapy , Malaria/genetics , Microsatellite Repeats/genetics , Genotype , Humans , Monte Carlo Method , Treatment Outcome
17.
PLoS Genet; 9(3): e1003312, 2013.
Article in English | MEDLINE | ID: mdl-23516369

ABSTRACT

Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), together with associated genes (cas), form the CRISPR-cas adaptive immune system, which can provide resistance to viruses and plasmids in bacteria and archaea. Here, we use mathematical models, population dynamic experiments, and DNA sequence analyses to investigate the host-phage interactions in a model CRISPR-cas system, Streptococcus thermophilus DGCC7710 and its virulent phage 2972. At the molecular level, the bacteriophage-immune mutant bacteria (BIMs) and CRISPR-escape mutant phage (CEMs) obtained in this study are consistent with those anticipated from an iterative model of this adaptive immune system: resistance by the addition of novel spacers and phage evasion of resistance by mutation in matching sequences or flanking motifs. While CRISPR BIMs were readily isolated and CEMs generated at high rates (frequencies in excess of 10⁻⁶), our population studies indicate that there is more to the dynamics of phage-host interactions and the establishment of a BIM-CEM arms race than predicted from existing assumptions about phage infection and CRISPR-cas immunity. Among the unanticipated observations are: (i) the invasion of phage into populations of BIMs resistant by the acquisition of one (but not two) spacers, (ii) the survival of sensitive bacteria despite the presence of high densities of phage, and (iii) the maintenance of phage-limited communities due to the failure of even two-spacer BIMs to become established in populations with wild-type bacteria and phage. We attribute (i) to incomplete resistance of single-spacer BIMs. Based on the results of additional modeling and experiments, we postulate that (ii) and (iii) can be attributed to the phage infection-associated production of enzymes or other compounds that induce phenotypic phage resistance in sensitive bacteria and kill resistant BIMs. We present evidence in support of these hypotheses and discuss the implications of these results for the ecology and (co)evolution of bacteria and phage.
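
A back-of-the-envelope calculation helps explain why escape against a single spacer is easy while simultaneous escape against two is not: if escape mutants arise at roughly the 10⁻⁶ frequency quoted above and the two targeted sites mutate independently (an assumption for illustration, as are the population sizes), typical phage populations contain single escapes but essentially no double escapes.

```python
import math

mu = 1e-6                      # assumed escape-mutant frequency per targeted protospacer
for phage_pop in (1e7, 1e9):
    p_single = 1 - math.exp(-phage_pop * mu)        # escape vs. a one-spacer BIM
    p_double = 1 - math.exp(-phage_pop * mu ** 2)   # escape vs. a two-spacer BIM
    print(f"phage population {phage_pop:.0e}: "
          f"P(single escape) = {p_single:.3f}, P(double escape) = {p_double:.2e}")
```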


Subject(s)
Bacteriophages/genetics , Biological Evolution , Genetics, Population , Interspersed Repetitive Sequences/genetics , Streptococcus thermophilus , Bacteriophages/immunology , Base Sequence , DNA, Intergenic/genetics , Immunity/genetics , Models, Theoretical , Molecular Sequence Data , Mutation , Sequence Analysis, DNA , Streptococcus thermophilus/genetics , Streptococcus thermophilus/immunology , Streptococcus thermophilus/virology