ABSTRACT
Discerning the effect of pharmacological exposures on intestinal bacterial communities in cancer patients is challenging. Here, we deconvoluted the relationship between drug exposures and changes in microbial composition by developing and applying a new computational method, PARADIGM (parameters associated with dynamics of gut microbiota), to a large set of longitudinal fecal microbiome profiles with detailed medication-administration records from patients undergoing allogeneic hematopoietic cell transplantation (allo-HCT). We observed that several non-antibiotic drugs, including laxatives, antiemetics, and opioids, are associated with increased Enterococcus relative abundance and decreased alpha diversity. Shotgun metagenomic sequencing further demonstrated subspecies competition, leading to increased dominant-strain genetic convergence during allo-HCT that is significantly associated with antibiotic exposures. We integrated drug-microbiome associations to predict clinical outcomes in two validation cohorts on the basis of drug exposures alone, suggesting that this approach can generate biologically and clinically relevant insights into how pharmacological exposures can perturb or preserve microbiota composition. The application of a computational method called PARADIGM to a large dataset of cancer patients' longitudinal fecal specimens and detailed daily medication records reveals associations between drug exposures and the intestinal microbiota that recapitulate in vitro findings and are also predictive of clinical outcomes.
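PARADIGM's implementation is not reproduced in this abstract, but the core idea — associating daily drug exposures with daily changes in a taxon's abundance — can be sketched as a regularized regression on synthetic data. Everything below (the exposure matrix, effect sizes, noise level, and ridge penalty) is an illustrative assumption, not a value from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_drugs = 200, 5

# Binary exposure matrix: X[d, k] = 1 if drug k was administered on day d (synthetic).
X = rng.integers(0, 2, size=(n_days, n_drugs)).astype(float)

# Hypothetical per-drug effects on the daily change in a taxon's log relative abundance.
true_beta = np.array([0.8, -0.5, 0.0, 0.3, 0.0])
y = X @ true_beta + rng.normal(0.0, 0.2, n_days)  # observed daily changes + noise

# Ridge regression (closed form): stabilizes estimates when exposures co-occur.
lam = 1.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_drugs), X.T @ y)
print(np.round(beta_hat, 2))
```

Repeating one such regression per taxon would yield a drug-by-taxon association matrix, which is the kind of object a method like PARADIGM could then use to predict outcomes from medication records alone.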
Subject(s)
Gastrointestinal Microbiome , Hematopoietic Stem Cell Transplantation , Microbiota , Neoplasms , Humans , Gastrointestinal Microbiome/genetics , Feces/microbiology , Metagenome , Anti-Bacterial Agents , Neoplasms/drug therapy
ABSTRACT
An outbreak of over 1,000 COVID-19 cases in Provincetown, Massachusetts (MA), in July 2021 (the first large outbreak mostly in vaccinated individuals in the US) prompted a comprehensive public health response, motivating changes to national masking recommendations and raising questions about infection and transmission among vaccinated individuals. To address these questions, we combined viral genomic and epidemiological data from 467 individuals, including 40% of outbreak-associated cases. The Delta variant accounted for 99% of cases in this dataset; it was introduced from at least 40 sources, but 83% of cases derived from a single source, likely through transmission across multiple settings over a short time rather than a single event. Genomic and epidemiological data supported multiple transmissions of Delta from and between fully vaccinated individuals. However, despite its magnitude, the outbreak had limited onward impact in MA and the US overall, likely due to high vaccination rates and a robust public health response.
Subject(s)
COVID-19/epidemiology , COVID-19/immunology , COVID-19/transmission , SARS-CoV-2/genetics , SARS-CoV-2/immunology , Adolescent , Adult , Aged , Aged, 80 and over , COVID-19/virology , Child , Child, Preschool , Contact Tracing/methods , Disease Outbreaks , Female , Genome, Viral , Humans , Infant , Infant, Newborn , Male , Massachusetts/epidemiology , Middle Aged , Molecular Epidemiology , Phylogeny , SARS-CoV-2/classification , Vaccination , Whole Genome Sequencing , Young Adult
ABSTRACT
In this issue of Cell, Washington et al. and Alpert et al. demonstrate the value of genomic surveillance when studying the introduction of the B.1.1.7 variant to the US and illustrate the challenge that results from the lack of good sampling strategies.
Subject(s)
COVID-19/epidemiology , Communicable Diseases, Emerging/epidemiology , Epidemiological Monitoring , Metagenomics/methods , SARS-CoV-2/isolation & purification , COVID-19/virology , Communicable Diseases, Emerging/virology , Humans , SARS-CoV-2/genetics , United States/epidemiology
ABSTRACT
SARS-CoV-2 variants of concern exhibit varying degrees of transmissibility and, in some cases, escape from acquired immunity. Much effort has been devoted to measuring these phenotypes, but understanding their impact on the course of the pandemic, especially that of immune escape, has remained a challenge. Here, we use a mathematical model to simulate the dynamics of wild-type and variant strains of SARS-CoV-2 in the context of vaccine rollout and nonpharmaceutical interventions. We show that variants with enhanced transmissibility frequently increase epidemic severity, whereas those with partial immune escape either fail to spread widely or primarily cause reinfections and breakthrough infections. However, when these phenotypes are combined, a variant can continue spreading even as immunity builds up in the population, limiting the impact of vaccination and exacerbating the epidemic. These findings help explain the trajectories of past and present SARS-CoV-2 variants and may inform variant assessment and response in the future.
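The interaction between transmissibility and immune escape described above can be illustrated with a minimal two-strain SIR sketch. This is not the authors' model: the compartments, the escape term (the variant can infect a fraction `esc` of those recovered from the wild type), and all parameter values are illustrative assumptions.

```python
def variant_peak(tau, esc, beta=0.25, gamma=0.1, days=300, dt=0.1):
    """Peak variant prevalence in a two-strain SIR with transmissibility
    multiplier `tau` and partial immune escape `esc` (toy model, Euler steps)."""
    # S: susceptible; Iw/Iv: infected with wild type / variant;
    # Rw: recovered from wild type (reachable by variant via escape); Rv: immune to both.
    S, Iw, Iv, Rw, Rv = 0.989, 0.01, 0.001, 0.0, 0.0
    peak = Iv
    for _ in range(int(days / dt)):
        dS  = -beta * S * Iw - tau * beta * S * Iv
        dIw =  beta * S * Iw - gamma * Iw
        dIv =  tau * beta * (S + esc * Rw) * Iv - gamma * Iv
        dRw =  gamma * Iw - tau * beta * esc * Rw * Iv
        dRv =  gamma * Iv
        S, Iw, Iv, Rw, Rv = (S + dt * dS, Iw + dt * dIw, Iv + dt * dIv,
                             Rw + dt * dRw, Rv + dt * dRv)
        peak = max(peak, Iv)
    return peak

p_neutral  = variant_peak(tau=1.0, esc=0.0)  # no advantage, smaller seed
p_escape   = variant_peak(tau=1.0, esc=0.5)  # escape only
p_combined = variant_peak(tau=1.6, esc=0.5)  # transmissibility + escape
print(p_neutral, p_escape, p_combined)
```

With these toy parameters the combined phenotype produces by far the largest variant wave, echoing the abstract's qualitative conclusion.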
Subject(s)
COVID-19/immunology , COVID-19/transmission , Immune Evasion , SARS-CoV-2/immunology , COVID-19/epidemiology , COVID-19/virology , Computer Simulation , Humans , Immunity , Models, Biological , Reinfection , Vaccination
ABSTRACT
The gut microbiota influences development(1-3) and homeostasis(4-7) of the mammalian immune system, and is associated with human inflammatory(8) and immune diseases(9,10) as well as responses to immunotherapy(11-14). Nevertheless, our understanding of how gut bacteria modulate the immune system remains limited, particularly in humans, where the difficulty of direct experimentation makes inference challenging. Here we study hundreds of hospitalized, closely monitored patients with cancer receiving haematopoietic cell transplantation as they recover from chemotherapy and stem-cell engraftment. This aggressive treatment causes large shifts in both circulatory immune cell and microbiota populations, enabling the relationships between the two to be studied simultaneously. Analysis of observed daily changes in circulating neutrophil, lymphocyte and monocyte counts and more than 10,000 longitudinal microbiota samples revealed consistent associations between gut bacteria and immune cell dynamics. High-resolution clinical metadata and Bayesian inference allowed us to compare the effects of bacterial genera in relation to those of immunomodulatory medications, revealing a considerable influence of the gut microbiota, together and over time, on systemic immune cell dynamics. Our analysis establishes and quantifies the link between the gut microbiota and the human immune system, with implications for microbiota-driven modulation of immunity.
Subject(s)
Gastrointestinal Microbiome/immunology , Leukocytes/cytology , Leukocytes/immunology , Age Factors , Bayes Theorem , Fecal Microbiota Transplantation , Female , Humans , Leukocyte Count , Lymphocytes/cytology , Lymphocytes/immunology , Monocytes/cytology , Monocytes/immunology , Neutrophils/cytology , Neutrophils/immunology , Reproducibility of Results
ABSTRACT
BACKGROUND: Highly cross-linked polyethylene (HXLPE) has been an excellent bearing for total hip arthroplasty (THA) due to improved wear characteristics compared to conventional materials. Patients 50 years of age or younger are at high risk for wear-related complications of their THA, and few studies have followed patients who have HXLPE into the third decade. METHODS: In a retrospective review, 88 consecutive THAs performed in 77 patients aged 50 years and younger (mean 41 years; range, 20 to 50), in which HXLPE was utilized, were evaluated for clinical and radiographic results at an average 20-year follow-up (range, 18 to 24 years). The current study reports longer-term follow-up of our previously published series at shorter follow-up times. Patients were categorized by femoral head material: cobalt chrome (n = 14), ceramic (n = 30), and oxidized zirconium (n = 22), and by femoral head size: 26 mm (n = 12), 28 mm (n = 46), and 32 mm (n = 8). Harris Hip Scores were collected preoperatively and at the most recent follow-up. Radiographs were evaluated for linear and volumetric wear, radiolucent lines, and osteolysis. RESULTS: Mean Harris Hip Scores improved from 47.1 (standard deviation [SD] 8.8) preoperatively to 92.0 (SD 7.7) at 20-year follow-up (P < .0001). One hip was revised for recurrent instability, and no hip demonstrated radiographic evidence of loosening or osteolysis. The mean polyethylene linear wear rate was 0.017 (SD 0.012) mm/y, and the mean polyethylene volumetric wear rate was 3.15 (SD 2.8) mm3/y, with no significant differences based on articulation type or head size. CONCLUSIONS: Total hip arthroplasty with HXLPE in patients ≤ 50 years of age continues to demonstrate excellent long-term clinical and radiographic outcomes with low wear characteristics at 20-year follow-up, regardless of femoral head material or size.
Subject(s)
Arthroplasty, Replacement, Hip , Hip Prosthesis , Polyethylene , Prosthesis Design , Humans , Middle Aged , Arthroplasty, Replacement, Hip/instrumentation , Retrospective Studies , Follow-Up Studies , Adult , Male , Female , Young Adult , Prosthesis Failure , Radiography , Hip Joint/surgery , Hip Joint/diagnostic imaging , Treatment Outcome
ABSTRACT
BACKGROUND: Tibial bone defects are commonly encountered in revision total knee arthroplasty (rTKA) and can be managed with metaphyseal cones or sleeves. Few studies have directly compared tibial cones and sleeves in rTKA, and none have limited this comparison to the most severe tibial defects. The purpose of this study was to evaluate and compare the outcomes of metaphyseal cones and sleeves for tibial reconstruction in rTKA regarding implant fixation and clinical outcomes. METHODS: A retrospective review was conducted on patients undergoing rTKA in which metaphyseal cones or sleeves were utilized to address metaphyseal bone loss (34 cones and 18 sleeves). Tibial bone loss was classified according to the Anderson Orthopaedic Research Institute bone defect classification, with types 2B and 3 being included. Patient-reported outcomes and postoperative complications were collected, and a radiographic evaluation of osseointegration or loosening was performed. RESULTS: There were 52 knees included (34 cones, 18 sleeves), with a median follow-up of 41.0 months. All-cause implant survival was 100% at 2 years and 96% (95% confidence interval: 76 to 99%) at 4 years, with 98% of tibial components demonstrating osseointegration at the final follow-up. During follow-up, there were a total of 11 revisions, of which 1 sleeve was revised secondary to implant loosening. Tibial sleeves had a higher risk of revision compared to tibial cones (P < .01), and sleeves fixed with a hybrid technique were more likely to need revision than cones fixed by the same method (P = .01). CONCLUSIONS: Porous metaphyseal tibial cones and tibial metaphyseal sleeves both performed well at a 41-month median follow-up, with no difference in aseptic survivorship between the 2 constructs. Both demonstrate high rates of osseointegration, low rates of aseptic failure, and significant improvement in Knee Society Scores in patients with severe tibial defects in rTKA.
Subject(s)
Arthroplasty, Replacement, Knee , Knee Prosthesis , Reoperation , Tibia , Humans , Retrospective Studies , Female , Male , Arthroplasty, Replacement, Knee/instrumentation , Arthroplasty, Replacement, Knee/methods , Aged , Tibia/surgery , Tibia/diagnostic imaging , Middle Aged , Osseointegration , Treatment Outcome , Prosthesis Failure , Knee Joint/surgery , Knee Joint/diagnostic imaging , Knee Joint/physiopathology , Aged, 80 and over , Follow-Up Studies
ABSTRACT
Managed aquifer recharge (MAR) offers a potential innovative solution for addressing groundwater resource issues, enabling excess surface water to be stored underground for later abstraction. Given its favourable hydrogeological properties, the Pliocene sand and gravel (Crag) aquifer in Suffolk, UK, was selected for a demonstration MAR scheme, with the goal of supplying additional summer irrigation water. The recharge source was a 4.6 km drainage channel that discharges to the River Deben estuary. Trialling the scheme in June 2022, 12,262 m3 of source water were recharged to the aquifer over 12 days via a lagoon and an array of 565 m of buried slotted pipes. Groundwater levels were raised by 0.3 m at the centre of the recharge mound with an approximate radius of 250 m, with no detrimental impact on local water features observed. The source water quality remained stable during the trial with a mean chloride concentration (133 mg L-1) below the regulatory requirement (165 mg L-1). The fraction of recharge water mixing with the groundwater ranged from 69% near the centre of the recharge mound to 5% at its boundary, leading to a reduction in nitrate-N concentration of 23.6 mg L-1 at the centre of the mound. During July-September 2022, 12,301 m3 of recharge water were abstracted from two 18 m boreholes to supplement surface irrigation reservoirs during drought conditions. However, the hydraulic conductivity of the Crag aquifer (~10 m day-1) restricted the yield and thereby reduced the economic viability of the scheme. Construction costs for the MAR system were comparatively low but the high costs of data collection and securing regulatory permits brought the overall capital costs to within 18% of an equivalent surface storage reservoir, demonstrating that market-based mechanisms and more streamlined regulatory processes are required to incentivise similar MAR schemes.
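Mixing fractions of this kind are conventionally estimated with a two-end-member chloride balance. The sketch below shows the calculation; only the recharge-water mean of 133 mg/L comes from the abstract, while the groundwater and mixed-sample concentrations are hypothetical values chosen for illustration.

```python
def recharge_fraction(cl_mixed, cl_groundwater, cl_recharge=133.0):
    """Fraction of recharge water in a sampled mixture, from a conservative
    chloride tracer: f = (C_gw - C_mix) / (C_gw - C_recharge)."""
    return (cl_groundwater - cl_mixed) / (cl_groundwater - cl_recharge)

# Illustrative concentrations (mg/L), not measured values from the trial:
f_centre = recharge_fraction(cl_mixed=152.0, cl_groundwater=195.0)
print(round(f_centre, 2))  # 0.69 with these illustrative numbers
```

The same dilution logic explains the reported nitrate-N reduction: a borehole dominated by low-nitrate recharge water sees its nitrate concentration fall roughly in proportion to the recharge fraction.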
Subject(s)
Groundwater , Water Resources , Sand , Water Supply , United Kingdom
ABSTRACT
BACKGROUND: The Omicron variant of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is highly transmissible in vaccinated and unvaccinated populations. The dynamics that govern its establishment and propensity toward fixation (reaching 100% frequency in the SARS-CoV-2 population) in communities remain unknown. Here, we describe the dynamics of Omicron at 3 institutions of higher education (IHEs) in the greater Boston area. METHODS: We use diagnostic and variant-specifying molecular assays and epidemiological analytical approaches to describe the rapid dominance of Omicron following its introduction into 3 IHEs with asymptomatic surveillance programs. RESULTS: We show that the establishment of Omicron at IHEs precedes that of the state and region and that the time to fixation is shorter at IHEs (9.5-12.5 days) than in the state (14.8 days) or region. We show that the trajectory of Omicron fixation among university employees resembles that of students, with a 2- to 3-day delay. Finally, we compare cycle threshold values in Omicron vs Delta variant cases on college campuses and identify lower viral loads among college affiliates who harbor Omicron infections. CONCLUSIONS: We document the rapid takeover of the Omicron variant at IHEs, reaching near-fixation within the span of 9.5-12.5 days despite lower viral loads, on average, than the previously dominant Delta variant. These findings highlight the transmissibility of Omicron, its propensity to rapidly dominate small populations, and the ability of robust asymptomatic surveillance programs to offer early insights into the dynamics of pathogen arrival and spread.
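The reported times to fixation imply a very large growth advantage under a standard logistic-replacement approximation. This back-of-the-envelope sketch is not the paper's estimator; it simply converts a 5%-to-95% takeover window into the implied logistic selection rate.

```python
import math

def time_5_to_95(s_per_day):
    """Days for a variant to go from 5% to 95% frequency under logistic
    replacement p(t) = 1 / (1 + exp(-s * (t - t0)))."""
    return 2 * math.log(19) / s_per_day  # logit(0.95) - logit(0.05) = 2*ln(19)

# A ~10-day takeover, as observed at the IHEs, implies roughly:
s = 2 * math.log(19) / 10.0
print(round(s, 2))  # ≈ 0.59 per day
```

A selection rate near 0.6 per day is far above typical between-variant differences, consistent with the abstract's emphasis on Omicron's transmissibility in small, densely sampled populations.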
Subject(s)
COVID-19 , Humans , COVID-19/epidemiology , SARS-CoV-2/genetics , Universities , Boston
ABSTRACT
We describe the direct measurement of the expulsion of a magnetic field from a plasma driven by heat flow. Using a laser to heat a column of gas within an applied magnetic field, we isolate Nernst advection and show how it changes the field over a nanosecond timescale. Reconstruction of the magnetic field map from proton radiographs demonstrates that the field is advected by heat flow in advance of the plasma expansion with a velocity v_N = (6 ± 2) × 10^5 m/s. Kinetic and extended magnetohydrodynamic simulations agree well in this regime due to the buildup of a magnetic transport barrier.
ABSTRACT
BACKGROUND: Throughout the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic, healthcare workers (HCWs) have faced risk of infection from within the workplace via patients and staff as well as from the outside community, complicating our ability to resolve transmission chains in order to inform hospital infection control policy. Here we show how the incorporation of sequences from public genomic databases aided genomic surveillance early in the pandemic when circulating viral diversity was limited. METHODS: We sequenced a subset of discarded, diagnostic SARS-CoV-2 isolates between March and May 2020 from Boston Medical Center HCWs and combined this data set with publicly available sequences from the surrounding community deposited in GISAID with the goal of inferring specific transmission routes. RESULTS: Contextualizing our data with publicly available sequences reveals that 73% (95% confidence interval, 63%-84%) of coronavirus disease 2019 cases in HCWs are likely novel introductions rather than nosocomial spread. CONCLUSIONS: We argue that introductions of SARS-CoV-2 into the hospital environment are frequent and that expanding public genomic surveillance can better aid infection control when determining routes of transmission.
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , SARS-CoV-2/genetics , Pandemics/prevention & control , COVID-19/epidemiology , Infection Control , Health Personnel , Hospitals
ABSTRACT
Studies of the relationship between the gastrointestinal microbiota and outcomes in allogeneic hematopoietic stem cell transplantation (allo-HCT) have thus far largely focused on early complications, predominantly infection and acute graft-versus-host disease (GVHD). We examined the potential relationship of the microbiome with chronic GVHD (cGVHD) by analyzing stool and plasma samples collected late after allo-HCT using a case-control study design. We found lower circulating concentrations of the microbe-derived short-chain fatty acids (SCFAs) propionate and butyrate in day 100 plasma samples from patients who developed cGVHD, compared with those who remained free of this complication, in the initial case-control cohort of transplant patients and in a further cross-sectional cohort from an independent transplant center. An additional cross-sectional patient cohort from a third transplant center was analyzed; however, serum (rather than plasma) was available, and the differences in SCFAs observed in the plasma samples were not recapitulated. In sum, our findings from the primary case-control cohort and 1 of 2 cross-sectional cohorts explored suggest that the gastrointestinal microbiome may exert immunomodulatory effects in allo-HCT patients at least in part due to control of systemic concentrations of microbe-derived SCFAs.
Subject(s)
Butyrates/blood , Gastrointestinal Microbiome , Graft vs Host Disease/microbiology , Propionates/blood , Adult , Allografts , Bacteria/isolation & purification , Bacteria/metabolism , Case-Control Studies , Chronic Disease , Dysbiosis/etiology , Dysbiosis/microbiology , Feces/microbiology , Graft vs Host Disease/blood , Graft vs Host Disease/etiology , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Metabolome , Ribotyping
ABSTRACT
Dramatic microbiota changes and loss of commensal anaerobic bacteria are associated with adverse outcomes in hematopoietic cell transplantation (HCT) recipients. In this study, we demonstrate these dynamic changes at high resolution through daily stool sampling and assess the impact of individual antibiotics on those changes. We collected 272 longitudinal stool samples (with mostly daily frequency) from 18 patients undergoing HCT and determined their composition by multiparallel 16S rRNA gene sequencing as well as the density of bacteria in stool by quantitative PCR (qPCR). We calculated microbiota volatility to quantify rapid shifts and developed a new dynamic systems inference method to assess the specific impact of antibiotics. The greatest shifts in microbiota composition occurred between stem cell infusion and reconstitution of healthy immune cells. Piperacillin-tazobactam caused the most severe declines among obligate anaerobes. Our approach of daily sampling, bacterial density determination, and dynamic systems modeling allowed us to infer the independent effects of specific antibiotics on the microbiota of HCT patients.
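The "microbiota volatility" idea — quantifying how rapidly composition shifts between consecutive daily samples — can be sketched as a dissimilarity between adjacent compositional profiles. The Bray-Curtis form below is a common choice for such a metric; the paper's exact definition may differ, and the two-taxon example profiles are synthetic.

```python
import numpy as np

def volatility(profiles):
    """Per-interval volatility of a microbiota time series.

    profiles: (days, taxa) array of relative abundances, rows summing to 1.
    Returns the Bray-Curtis dissimilarity between each pair of consecutive days.
    """
    d = []
    for a, b in zip(profiles[:-1], profiles[1:]):
        d.append(np.abs(a - b).sum() / (a + b).sum())  # Bray-Curtis dissimilarity
    return np.array(d)

# Synthetic examples: a stable community vs an abrupt shift toward one taxon.
stable  = np.array([[0.50, 0.30, 0.20], [0.48, 0.32, 0.20]])
shifted = np.array([[0.50, 0.30, 0.20], [0.05, 0.05, 0.90]])
print(volatility(stable)[0], volatility(shifted)[0])
```

High volatility values flag exactly the kind of rapid compositional shifts the abstract reports between stem cell infusion and immune reconstitution.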
Subject(s)
Anti-Bacterial Agents/pharmacology , Feces/microbiology , Gastrointestinal Microbiome/drug effects , Hematopoietic Stem Cell Transplantation , Microbiota/drug effects , Adult , Aged , Bacteria/genetics , Female , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Male , Middle Aged , RNA, Ribosomal, 16S
ABSTRACT
Vancomycin-resistant Enterococcus faecium (VRE) is a leading cause of hospital-acquired infections. This is particularly true in immunocompromised patients, where the damage to the microbiota caused by antibiotics can lead to VRE domination of the intestine, increasing a patient's risk for bloodstream infection. In previous studies we observed that the intestinal domination by VRE of patients hospitalized to receive allogeneic bone marrow transplantation can persist for weeks, but little is known about subspecies diversification and evolution during prolonged domination. Here we combined a longitudinal analysis of patient data and in vivo experiments to reveal previously unappreciated subspecies dynamics during VRE domination that appeared to be stable from 16S rRNA microbiota analyses. Whole-genome sequencing of isolates obtained from sequential stool samples provided by VRE-dominated patients revealed an unanticipated level of VRE population complexity that evolved over time. In experiments with ampicillin-treated mice colonized with a single CFU, VRE rapidly diversified and expanded into distinct lineages that competed for dominance. Mathematical modeling shows that in vivo evolution follows mostly a parabolic fitness landscape, where each new mutation provides diminishing returns and, in the setting of continuous ampicillin treatment, reveals a fitness advantage for mutations in penicillin-binding protein 5 (pbp5) that increase resistance to ampicillin. Our results reveal the rapid diversification of host-colonizing VRE populations, with implications for epidemiologic tracking of in-hospital VRE transmission and susceptibility to antibiotic treatment.
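A "parabolic" fitness landscape with diminishing returns, as described above, can be made concrete with a tiny model in which the i-th beneficial mutation adds a smaller increment than the (i-1)-th. The linear decay of per-mutation gains below is an illustrative assumption, not the paper's fitted landscape.

```python
def cumulative_fitness(k, a=0.10, b=0.01):
    """Cumulative fitness gain after k mutations when the i-th mutation
    (i = 0, 1, ...) adds a - b*i. The sum a*k - b*k*(k-1)/2 is parabolic in k,
    so each new mutation provides diminishing returns."""
    return sum(a - b * i for i in range(k))

increments = [cumulative_fitness(k + 1) - cumulative_fitness(k) for k in range(5)]
print([round(x, 2) for x in increments])  # each step smaller than the last
```

Under continuous antibiotic pressure, a mutation class that keeps paying off (such as the pbp5 ampicillin-resistance mutations the abstract highlights) would stand out against this diminishing-returns background.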
Subject(s)
DNA, Bacterial/genetics , Enterococcus faecium/genetics , Genetic Variation , Gram-Positive Bacterial Infections/microbiology , Vancomycin-Resistant Enterococci/genetics , Animals , Biological Evolution , DNA Mutational Analysis , Feces/microbiology , Humans , Longitudinal Studies , RNA, Ribosomal, 16S/genetics
ABSTRACT
Extracellular dopamine and serotonin concentrations are determined by the presynaptic dopamine (DAT) and serotonin (SERT) transporters, respectively. Numerous studies have investigated the DAT and SERT structural elements contributing to inhibitor and substrate binding. To date, crystallographic studies have focused on conserved transmembrane domains, where multiple substrate binding and translocation features are conserved. However, it is unknown what, if any, role the highly divergent intracellular N and C termini contribute to these processes. Here, we used chimeric proteins to test whether DAT and SERT N and C termini contribute to transporter substrate and inhibitor affinities. Replacing the DAT N terminus with that of SERT had no effect on DA transport Vmax but significantly decreased DAT substrate affinities for DA and amphetamine. Similar losses in uptake inhibition were observed for small DAT inhibitors, whereas substituting the DAT C terminus with that of SERT affected neither substrate nor inhibitor affinities. In contrast, the N-terminal substitution was completely tolerated by the larger DAT inhibitors, which exhibited no loss in apparent affinity. Remarkably, all affinity losses were rescued in DAT chimeras encoding both SERT N and C termini. The sensitivity to amino-terminal substitution was specific for DAT, because replacing the SERT N and/or C termini affected neither substrate nor inhibitor affinities. Taken together, these findings provide compelling experimental evidence that DAT N and C termini synergistically contribute to substrate and inhibitor affinities.
Subject(s)
Dopamine Plasma Membrane Transport Proteins/metabolism , Amino Acid Substitution , Biological Transport, Active , Cell Line , Dopamine Plasma Membrane Transport Proteins/genetics , Humans , Mutation, Missense , Protein Domains , Serotonin Plasma Membrane Transport Proteins/genetics , Serotonin Plasma Membrane Transport Proteins/metabolism
ABSTRACT
Multiple virus particles can infect a target host cell. Such multiple infections (MIs) have significant and varied ecological and evolutionary consequences for both virus and host populations. Yet, the in situ rates and drivers of MIs in virus-microbe systems remain largely unknown. Here, we develop an individual-based model (IBM) of virus-microbe dynamics to probe how spatial interactions drive the frequency and nature of MIs. In our IBMs, we identify increasingly spatially correlated clusters of viruses given sufficient decreases in viral movement. We also identify increasingly spatially correlated clusters of viruses and clusters of hosts given sufficient increases in viral infectivity. The emergence of clusters is associated with an increase in multiply infected hosts as compared to expectations from an analogous mean field model. We also observe long-tails in the distribution of the multiplicity of infection in contrast to mean field expectations that such events are exponentially rare. We show that increases in both the frequency and severity of MIs occur when viruses invade a cluster of uninfected microbes. We contend that population-scale enhancement of MI arises from an aggregate of invasion dynamics over a distribution of microbe cluster sizes. Our work highlights the need to consider spatially explicit interactions as a potentially key driver underlying the ecology and evolution of virus-microbe communities.
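The contrast between mean-field and spatially clustered multiplicity of infection (MOI) can be demonstrated without a full individual-based model. In the sketch below — a simplification, not the authors' IBM — distributing virions uniformly over hosts yields a Poisson MOI, while concentrating the same virions on a cluster of hosts produces the long-tailed MOI distribution the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)
N, V = 10_000, 10_000  # hosts and virions (illustrative counts)

# Mean-field: every host is equally likely to adsorb each virion -> Poisson MOI.
uniform_hosts = rng.integers(0, N, size=V)
# Clustered: the same virions land on only 10% of hosts (a crude stand-in
# for reduced viral movement concentrating infections locally).
cluster_hosts = rng.integers(0, N // 10, size=V)

moi_uniform = np.bincount(uniform_hosts, minlength=N)
moi_cluster = np.bincount(cluster_hosts, minlength=N)
print(moi_uniform.max(), moi_cluster.max())  # clustered tail is far longer
```

Both scenarios have the same mean MOI (V/N = 1), so the difference in maxima isolates the effect of spatial aggregation on the frequency and severity of multiple infections.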
Subject(s)
Bacteria/virology , Bacteriophages/physiology , Microbial Interactions , Bacterial Physiological Phenomena , Biological Evolution , Kinetics , Molecular Dynamics Simulation , Spatial Analysis
ABSTRACT
Dynamic models, often deterministic in nature, were used to estimate the basic reproductive number, R0, of the 2014-2015 Ebola virus disease (EVD) epidemic in West Africa. Estimates of R0 were then used to project the likelihood of large outbreak sizes, e.g., exceeding hundreds of thousands of cases. Yet fitting deterministic models can lead to over-confidence in the confidence intervals of the fitted R0 and, in turn, in the type and scope of necessary interventions. In this manuscript we propose a hybrid stochastic-deterministic method to estimate R0 and associated confidence intervals (CIs). The core idea is that stochastic realizations of an underlying deterministic model can be used to evaluate the compatibility of candidate values of R0 with observed epidemic curves. The compatibility is based on comparing the distribution of expected epidemic growth rates with the observed epidemic growth rate given "process noise", i.e., noise arising from stochastic transmission, recovery and death events. By applying our method to reported EVD case counts from Guinea, Liberia and Sierra Leone, we show that prior estimates of R0 based on deterministic fits appear to be more confident than analysis of stochastic trajectories suggests is possible. Moving forward, we recommend including process noise among other sources of noise when estimating R0 CIs for emerging epidemics. Our hybrid procedure represents an adaptable and easy-to-implement approach to such estimation.
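The core idea — that process noise spreads out the distribution of early growth rates compatible with a given R0 — can be sketched with a discrete-time stochastic SIR-like simulation. This is a simplified stand-in for the paper's method, with illustrative parameters (R0 = 1.8, 14-day infectious period, 60-day observation window).

```python
import numpy as np

rng = np.random.default_rng(2)

def growth_rate(R0, gamma=1 / 14, days=60, I0=10):
    """Exponential growth rate (per day) of one stochastic epidemic realization,
    estimated by a log-linear fit to the simulated prevalence series."""
    beta, I, series = R0 * gamma, I0, [I0]
    for _ in range(days):
        infections = rng.poisson(beta * I)                # stochastic transmission
        recoveries = rng.binomial(I, 1 - np.exp(-gamma))  # stochastic recovery
        I = max(I + infections - recoveries, 0)
        series.append(I)
    series = np.array(series, dtype=float)
    t, good = np.arange(len(series)), series > 0
    return np.polyfit(t[good], np.log(series[good]), 1)[0]

# Distribution of growth rates compatible with a candidate R0 of 1.8:
rates = np.array([growth_rate(R0=1.8) for _ in range(200)])
print(rates.mean().round(3), rates.std().round(3))
```

The spread of `rates` around the deterministic expectation (roughly gamma*(R0-1) ≈ 0.057 per day here) is exactly the process-noise contribution the abstract argues should widen R0 confidence intervals: an observed growth rate is compatible with any candidate R0 whose simulated distribution covers it.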
Subject(s)
Communicable Diseases, Emerging/epidemiology , Hemorrhagic Fever, Ebola/epidemiology , Models, Theoretical , Stochastic Processes , Africa, Western , Communicable Diseases, Emerging/transmission , Confidence Intervals , Epidemics , Hemorrhagic Fever, Ebola/transmission , Virus Replication
ABSTRACT
Virophages are viruses that rely on the replication machinery of other viruses to reproduce within eukaryotic hosts. Two different modes of coinfection have been posited based on experimental observation. In one mode, the virophage and the virus enter the host independently. In the other mode, the virophage adheres to the virus so both virophage and virus enter the host together. Here we ask: what are the ecological effects of these different modes of coinfection? In particular, what ecological effects are common to both infection modes, and what are the differences particular to each mode? We develop a pair of biophysically motivated ODE models of viral-host population dynamics, corresponding to dynamics arising from each mode of infection. We find that both modes of coinfection allow for the coexistence of the virophage, virus, and host either at a stable fixed point or through cyclical dynamics. In both models, virophage tends to be the most abundant population and their presence always reduces the viral abundance and increases the host abundance. However, we do find qualitative differences between models. For example, via extensive sampling of biologically relevant parameter space, we only observe bistability when the virophage and the virus enter the host together. We discuss how such differences may be leveraged to help identify modes of infection in natural environments from population level data.
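The structure of such host-virus-virophage ODE models can be illustrated with a deliberately simplified three-species caricature. This is not one of the authors' biophysically motivated models: the trilinear coinfection term (host, virus, and virophage must all meet) and every parameter value below are assumptions made for the sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, r=1.0, K=10.0, phi=0.1, beta=5.0, m=1.0, eta=0.1, g=2.0, mp=0.5):
    """Toy host (h) - virus (v) - virophage (p) dynamics.

    Virophage reproduction requires coinfection (the eta*h*v*p term), which
    simultaneously diverts viral production - a crude caricature of the
    independent-entry mode of coinfection."""
    h, v, p = y
    coinf = eta * h * v * p
    dh = r * h * (1 - h / K) - phi * h * v   # logistic host growth minus infection
    dv = beta * phi * h * v - coinf - m * v  # viral production reduced by coinfection
    dp = g * coinf - mp * p                  # virophage produced only via coinfection
    return [dh, dv, dp]

sol  = solve_ivp(rhs, (0, 50), [5.0, 1.0, 1.0], method="LSODA", rtol=1e-8, atol=1e-10)
sol0 = solve_ivp(rhs, (0, 50), [5.0, 1.0, 0.0], method="LSODA", rtol=1e-8, atol=1e-10)
print(sol.y[:, -1])  # final (host, virus, virophage) abundances
```

Comparing `sol` against the virophage-free run `sol0` is the kind of experiment used to ask whether virophage presence suppresses the virus and boosts the host, as the abstract reports for both infection modes.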
Subject(s)
Coinfection , Ecosystem , Host-Pathogen Interactions , Models, Biological , Virus Diseases , Virus Physiological Phenomena , Viruses
ABSTRACT
Motivations for owning rural land are shifting from an agricultural-production orientation to a preference for natural and cultural amenities. Resultant changes in land management have significant implications for the type and distribution of landscape-level disturbances that affect the delivery of ecosystem services. We examined the relationship between motivations for owning land and the implementation of conservation land management practices by landowners in the Southern Great Plains of the United States. Using a mail survey, we classified landowners into three groups: agricultural production, multiple-objective, and lifestyle-oriented. Cross tabulations of landowner group with past, current, and future use of 12 different land management practices (related to prescribed grazing, vegetation management, restoration, and water management) found that lifestyle-oriented landowners were overall less likely to adopt these practices. To the degree that the cultural landscape of rural lands transitions from production-oriented to lifestyle-oriented landowners, the ecological landscape and the associated flow of ecosystem services will likely change. This poses new challenges to natural resource managers regarding education, outreach, and policy; however, a better understanding about the net ecological consequences of lower rates of adoption of conservation management practices requires consideration of the ecological tradeoffs associated with the changing resource dependency of rural landowners.