ABSTRACT
Persistent SARS-CoV-2 infections may act as viral reservoirs that could seed future outbreaks1-5, give rise to highly divergent lineages6-8 and contribute to cases with post-acute COVID-19 sequelae (long COVID)9,10. However, the population prevalence of persistent infections, their viral load kinetics and their evolutionary dynamics over the course of infection remain largely unknown. Here, using viral sequence data collected as part of a national infection survey, we identified 381 individuals with SARS-CoV-2 RNA at high titre persisting for at least 30 days, of whom 54 had viral RNA persisting for at least 60 days. We refer to these as 'persistent infections', as the available evidence suggests that they represent ongoing viral replication, although the persistence of non-replicating RNA cannot be ruled out in all cases. Individuals with persistent infection had more than 50% higher odds of self-reporting long COVID than individuals with non-persistent infection. We estimate that 0.1-0.5% of infections may become persistent, typically with rebounding high viral loads, and last for at least 60 days. In some individuals, we identified many viral amino acid substitutions, indicating periods of strong positive selection, whereas others had no consensus change in the sequences for prolonged periods, consistent with weak selection. Substitutions included mutations that are lineage defining for SARS-CoV-2 variants, occur at target sites for monoclonal antibodies and/or are commonly found in immunocompromised people11-14. This work has profound implications for understanding and characterizing SARS-CoV-2 infection, epidemiology and evolution.
Subject(s)
COVID-19 , Health Surveys , Persistent Infection , SARS-CoV-2 , Humans , Amino Acid Substitution , Antibodies, Monoclonal/immunology , COVID-19/epidemiology , COVID-19/virology , Evolution, Molecular , Immunocompromised Host/immunology , Mutation , Persistent Infection/epidemiology , Persistent Infection/virology , Post-Acute COVID-19 Syndrome/epidemiology , Post-Acute COVID-19 Syndrome/virology , Prevalence , RNA, Viral/analysis , RNA, Viral/genetics , SARS-CoV-2/chemistry , SARS-CoV-2/classification , SARS-CoV-2/genetics , SARS-CoV-2/immunology , SARS-CoV-2/isolation & purification , Selection, Genetic , Self Report , Time Factors , Viral Load , Virus Replication
ABSTRACT
In this study, we evaluated the impact of viral variant, in addition to other variables, on within-host viral burden, by analysing cycle threshold (Ct) values derived from nose and throat swabs, collected as part of the UK COVID-19 Infection Survey. Because viral burden distributions determined from community survey data can be biased due to the impact of variant epidemiology on the time-since-infection of samples, we developed a method to explicitly adjust observed Ct value distributions to account for the expected bias. By analysing the adjusted Ct values using partial least squares regression, we found that among unvaccinated individuals with no known prior exposure, viral burden was 44% lower among Alpha variant infections, compared to those with the predecessor strain, B.1.177. Vaccination reduced viral burden by 67%, and among vaccinated individuals, viral burden was 286% higher among Delta variant, compared to Alpha variant, infections. In addition, viral burden increased by 17% for every 10-year age increment of the infected individual. In summary, within-host viral burden increases with age, is reduced by vaccination, and is influenced by the interplay of vaccination status and viral variant.
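The percentage differences above rest on the fact that Ct values sit on an inverse log2 scale: PCR roughly doubles the target each cycle, so a shift in Ct converts directly into a fold change in viral burden. A minimal sketch of that conversion, using back-calculated illustrative Ct shifts rather than the study's fitted coefficients:

```python
def ct_shift_to_percent_change(delta_ct: float) -> float:
    """Convert a shift in cycle threshold (Ct) into a percent change
    in viral burden, assuming perfect doubling per PCR cycle: a +1 Ct
    shift corresponds to roughly half the starting viral RNA."""
    fold = 2.0 ** (-delta_ct)
    return (fold - 1.0) * 100.0

# A +0.84 Ct shift corresponds to roughly 44% lower viral burden
print(round(ct_shift_to_percent_change(0.84)))   # -44
# A -1.95 Ct shift corresponds to roughly 286% higher viral burden
print(round(ct_shift_to_percent_change(-1.95)))  # 286
```

The specific delta_ct values here are hypothetical, chosen only to reproduce percentages of the magnitude reported.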
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Selection Bias , SARS-CoV-2/genetics , Viral Load , COVID-19/epidemiology , COVID-19/prevention & control , Vaccination
ABSTRACT
[This corrects the article DOI: 10.1371/journal.ppat.1011461.].
ABSTRACT
The Office for National Statistics Coronavirus (COVID-19) Infection Survey (ONS-CIS) is the largest surveillance study of SARS-CoV-2 positivity in the community, and collected data on the United Kingdom (UK) epidemic from April 2020 until March 2023 before being paused. Here, we report on the epidemiological and evolutionary dynamics of SARS-CoV-2 determined by analysing the sequenced samples collected by the ONS-CIS during this period. We observed a series of sweeps or partial sweeps, with each sweeping lineage having a distinct growth advantage over its predecessor, although this was accompanied by a gradual fall in average viral burdens from June 2021 to March 2023. The sweeps also generated an alternating pattern in which most samples had either S-gene target failure (SGTF) or non-SGTF over time. Evolution was characterized by steadily increasing divergence and diversity within lineages, but with step increases in divergence associated with each sweeping major lineage. This led to a faster overall rate of evolution when measured at the between-lineage level than within lineages, and to fluctuating levels of diversity. These observations highlight the value of viral sequencing integrated into community surveillance studies for monitoring the epidemiology and evolution of SARS-CoV-2, and potentially of other pathogens.
Subject(s)
COVID-19 , Epidemics , Humans , COVID-19/epidemiology , SARS-CoV-2 , United Kingdom/epidemiology , Surveys and Questionnaires
ABSTRACT
Lymphoid tissue is a key reservoir established by HIV-1 during acute infection. It is a site associated with viral production, storage of viral particles in immune complexes, and viral persistence. Although combinations of antiretroviral drugs usually suppress viral replication and reduce viral RNA to undetectable levels in blood, it is unclear whether treatment fully suppresses viral replication in lymphoid tissue reservoirs. Here we show that virus evolution and trafficking between tissue compartments continues in patients with undetectable levels of virus in their bloodstream. We present a spatial and dynamic model of persistent viral replication and spread that indicates why the development of drug resistance is not a foregone conclusion under conditions in which drug concentrations are insufficient to completely block virus replication. These data provide new insights into the evolutionary and infection dynamics of the virus population within the host, revealing that HIV-1 can continue to replicate and replenish the viral reservoir despite potent antiretroviral therapy.
Subject(s)
Carrier State/drug therapy , Carrier State/virology , HIV Infections/drug therapy , HIV Infections/virology , HIV-1/growth & development , Viral Load , Virus Replication , Anti-HIV Agents/administration & dosage , Anti-HIV Agents/pharmacology , Anti-HIV Agents/therapeutic use , Carrier State/blood , Drug Resistance, Viral/drug effects , HIV Infections/blood , HIV-1/drug effects , HIV-1/genetics , HIV-1/isolation & purification , Haplotypes/drug effects , Humans , Lymph Nodes/drug effects , Lymph Nodes/virology , Models, Biological , Molecular Sequence Data , Phylogeny , Selection, Genetic/drug effects , Sequence Analysis, DNA , Spatio-Temporal Analysis , Time Factors , Viral Load/drug effects , Virus Replication/drug effects
ABSTRACT
Although antiretroviral drug therapy suppresses human immunodeficiency virus type 1 (HIV-1) to undetectable levels in the blood of treated individuals, reservoirs of replication-competent HIV-1 endure. Upon cessation of antiretroviral therapy, the reservoir usually allows outgrowth of virus, and approaches to targeting the reservoir have had limited success. Ongoing cycles of viral replication in regions with low drug penetration contribute to this persistence. Here, we use a mathematical model to illustrate a new approach to eliminating the part of the reservoir attributable to persistent replication in drug sanctuaries. Reducing the residency time of CD4 T cells in drug sanctuaries renders ongoing replication unsustainable in those sanctuaries. We hypothesize that, in combination with antiretroviral drugs, a strategy to orchestrate CD4 T cell trafficking could contribute to a functional cure for HIV-1 infection.
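The residency-time argument can be caricatured with a single number: an effective reproduction number for infection inside the sanctuary, which falls below one once cells exit faster than new infections can be seeded. This is a hypothetical toy reduction, not the paper's full spatial model, and all rates below are invented:

```python
def sanctuary_R(beta: float, death: float, exit_rate: float) -> float:
    """Effective reproduction number of infection inside a drug
    sanctuary: new infections generated per infected cell, where
    cells leave the sanctuary at rate `exit_rate` (the reciprocal
    of mean residency time). Illustrative toy model only."""
    return beta / (death + exit_rate)

# Slow trafficking (long residency): replication is sustained (R > 1)
print(sanctuary_R(beta=1.0, death=0.5, exit_rate=0.1) > 1)  # True
# Fast trafficking (short residency): replication dies out (R < 1)
print(sanctuary_R(beta=1.0, death=0.5, exit_rate=1.0) > 1)  # False
```

Raising `exit_rate` (shortening residency) always lowers R, which is the intuition behind orchestrating CD4 T cell trafficking.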
Subject(s)
Disease Reservoirs/virology , HIV Infections/therapy , Virus Replication/drug effects , CD4 Lymphocyte Count/methods , CD4-Positive T-Lymphocytes/physiology , CD4-Positive T-Lymphocytes/virology , Computer Simulation , HIV Infections/virology , HIV-1/pathogenicity , HIV-1/physiology , Humans , Models, Theoretical , T-Lymphocytes/physiology , Viral Load/methods , Virus Latency/physiology , Virus Replication/physiology
Subject(s)
Anti-Bacterial Agents , Drug Resistance, Bacterial , Climate , Climate Change , Global Warming , Humans
ABSTRACT
Identifying human immunodeficiency virus (HIV) immune escape mutations has implications for understanding the impact of host immunity on pathogen evolution and guiding the choice of vaccine antigens. One means of identifying cytotoxic-T-lymphocyte (CTL) escape mutations is to search for statistical associations between mutations and host human leukocyte antigen (HLA) class I alleles at the population level. The impact of evolutionary rates on the strength of such associations is not well defined. Here, we address this topic using a mathematical model of within-host evolution and between-host transmission of CTL escape mutants that predicts the prevalence of escape mutants at the population level. We ask how the rates at which an escape mutation emerges in a host who bears the restricting HLA and reverts when transmitted to a host who does not bear the HLA affect the strength of an association. We consider the impact of these factors when using a standard statistical method to test for an association and when using an adaptation of that method that corrects for phylogenetic relationships. We show that with both methods, the average sample size required to identify an escape mutation is smaller if the mutation escapes and reverts quickly. Thus, escape mutations identified as HLA associated systematically favor those that escape and revert rapidly. We also present expressions that can be used to infer escape and reversion rates from cross-sectional escape prevalence data.
Subject(s)
Genes, MHC Class I , HIV Infections/immunology , HIV Infections/virology , HIV/genetics , HIV/immunology , Mutation , T-Lymphocytes, Cytotoxic/immunology , Epitopes, T-Lymphocyte/genetics , Epitopes, T-Lymphocyte/immunology , HIV/classification , HIV Infections/transmission , Humans , Models, Theoretical , Phylogeny , Viral Proteins/genetics , Viral Proteins/immunology
ABSTRACT
During infection with human immunodeficiency virus (HIV), immune pressure from cytotoxic T-lymphocytes (CTLs) selects for viral mutants that confer escape from CTL recognition. These escape variants can be transmitted between individuals where, depending upon their cost to viral fitness and the CTL responses made by the recipient, they may revert. The rates of within-host evolution, and their impact upon the rate of spread of escape mutants at the population level, are uncertain. Here we present a mathematical model of within-host evolution of escape mutants, transmission of these variants between hosts and subsequent reversion in new hosts. The model is an extension of the well-known SI model of disease transmission and includes three further parameters that describe host immunogenetic heterogeneity and rates of within-host viral evolution. We use the model to explain why some escape mutants appear to have stable prevalence whilst others are spreading through the population. Further, we use it to compare diverse datasets on CTL escape, highlighting where different sources agree or disagree on within-host evolutionary rates. The several dozen CTL epitopes we survey from HIV-1 gag, RT and nef reveal a relatively sedate rate of evolution, with average rates of escape measured in years and reversion in decades. For many epitopes in HIV, occasional rapid within-host evolution is not reflected in fast evolution at the population level.
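The core escape/reversion balance in a model of this kind can be sketched in a few lines: infections in hosts restricting the epitope acquire the escape mutation, infections in other hosts lose it, and population prevalence settles where the two flows balance. This toy caricature ignores the transmission bookkeeping of the full SI extension, and all parameter values are illustrative, not the fitted rates:

```python
def escape_prevalence(f_hla: float, esc: float, rev: float,
                      steps: int = 20000, dt: float = 0.01) -> float:
    """Toy caricature of escape/reversion dynamics: among infections
    in HLA-matched hosts (population fraction f_hla) the escape
    mutation arises at rate `esc`; in mismatched hosts it reverts at
    rate `rev`. Returns the long-run fraction of infections carrying
    the escape variant (Euler integration)."""
    p = 0.0  # fraction of infections with the escape variant
    for _ in range(steps):
        gain = f_hla * (1.0 - p) * esc    # escape in restricting hosts
        loss = (1.0 - f_hla) * p * rev    # reversion in other hosts
        p += dt * (gain - loss)
    return p

# Fast escape, slow reversion: the mutant spreads well beyond the
# 20% of hosts that restrict the epitope
print(escape_prevalence(f_hla=0.2, esc=1.0, rev=0.05) > 0.5)   # True
# Slow escape, fast reversion: the mutant stays rare
print(escape_prevalence(f_hla=0.2, esc=0.05, rev=1.0) < 0.05)  # True
```

The equilibrium is f·esc / (f·esc + (1-f)·rev), which makes explicit why mutants that escape quickly but revert slowly accumulate at the population level.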
Subject(s)
Biological Evolution , HIV Reverse Transcriptase/genetics , HIV-1/immunology , Models, Theoretical , Mutation/genetics , T-Lymphocytes, Cytotoxic/immunology , gag Gene Products, Human Immunodeficiency Virus/genetics , nef Gene Products, Human Immunodeficiency Virus/genetics , Epitopes, T-Lymphocyte , HIV Infections/genetics , HIV Infections/immunology , HIV Infections/pathology , HIV-1/genetics , Humans , Phylogeny
ABSTRACT
Because cytotoxic T-lymphocytes (CTLs) have been shown to play a role in controlling human immunodeficiency virus (HIV) infection and because CTL-based simian immunodeficiency virus (SIV) vaccines have proved effective in non-human primates, one goal of HIV vaccine design is to elicit effective CTL responses in humans. Such a vaccine could improve viral control in patients who later become infected, thereby reducing onwards transmission and enhancing life expectancy in the absence of treatment. The ability of HIV to evolve mutations that evade CTLs and the ability of these 'escape mutants' to spread amongst the population pose a challenge to the development of an effective and robust vaccine. We present a mathematical model of within-host evolution and between-host transmission of CTL escape mutants amongst a population receiving a vaccine that elicits CTL responses to multiple epitopes. Within-host evolution at each epitope is represented by the outgrowth of escape mutants in hosts who restrict the epitope and their reversion in hosts who do not restrict the epitope. We use this model to investigate how the evolution and spread of escape mutants could affect the impact of a vaccine. We show that in the absence of escape, such a vaccine could markedly reduce the prevalence of both infection and disease in the population. However, the impact of such a vaccine could be significantly abated by CTL escape mutants, especially if their selection in hosts who restrict the epitope is rapid and their reversion in hosts who do not restrict the epitope is slow. We also use the model to address whether a vaccine should span a broad or narrow range of CTL epitopes and target epitopes restricted by rare or common HLA types. We discuss the implications and limitations of our findings.
Subject(s)
AIDS Vaccines/immunology , HIV Infections/immunology , HIV Infections/virology , HIV/immunology , Models, Immunological , T-Lymphocytes, Cytotoxic/immunology , Computational Biology , Epidemics , Epitopes, T-Lymphocyte/immunology , HIV/genetics , HIV Infections/epidemiology , HLA Antigens/immunology , Host-Pathogen Interactions , Humans , Mutation
ABSTRACT
To determine the relative fitness of oseltamivir-resistant strains compared to susceptible wild-type viruses, we combined mathematical modeling and statistical techniques with a novel in vivo "competitive-mixtures" experimental model. Ferrets were coinfected with either pure populations (100% susceptible wild-type or 100% oseltamivir-resistant mutant virus) or mixed populations of wild-type and oseltamivir-resistant influenza viruses (80%:20%, 50%:50%, and 20%:80%) at equivalent infectivity titers, and the changes in the relative proportions of these two viruses were monitored over the course of infection, both within hosts and across host-to-host transmission events, in a ferret contact model. Coinfection of ferrets with mixtures of an oseltamivir-resistant R292K mutant A(H3N2) virus and an R292 oseltamivir-susceptible wild-type virus demonstrated that the R292K mutant virus was rapidly outgrown by the R292 wild-type virus in artificially infected donor ferrets and did not transmit to any of the recipient ferrets. The competitive-mixtures model was also used to investigate the fitness of the seasonal A(H1N1) oseltamivir-resistant H274Y mutant and showed that within infected ferrets the H274Y mutant virus was marginally outgrown by the wild-type strain but demonstrated equivalent transmissibility between ferrets. This novel in vivo experimental method and accompanying mathematical analysis provide greater insight into the relative fitness, both within the host and between hosts, of two different influenza virus strains compared to more traditional methods that infect ferrets with only pure populations of viruses. Our statistical inferences are essential for the development of the next generation of mathematical models of the emergence and spread of oseltamivir-resistant influenza in human populations.
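A standard way to summarize within-host fitness from such mixture data is a selection coefficient fitted on the logit scale of the mutant proportion, which is linear in time under simple two-strain competition. A minimal version of that calculation, with hypothetical numbers rather than the ferret data:

```python
import math

def fitness_difference(p0: float, pt: float, t: float) -> float:
    """Per-day selective advantage of a mutant inferred from its
    proportion changing from p0 to pt over t days, assuming
    logit-linear (logistic) competition between two strains.
    Illustrative method, not the paper's full statistical model."""
    logit = lambda p: math.log(p / (1.0 - p))
    return (logit(pt) - logit(p0)) / t

# Hypothetical: the mutant falls from 50% to 20% of the mixed
# population in 3 days, so its fitness difference is negative
s = fitness_difference(0.5, 0.2, 3.0)
print(s < 0)  # True: the mutant is less fit than wild type
```

A negative coefficient corresponds to the mutant being outgrown within hosts, as observed for R292K; a coefficient near zero would be consistent with the marginal disadvantage seen for H274Y.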
Subject(s)
Antiviral Agents/pharmacology , Drug Resistance, Viral , Influenza A Virus, H1N1 Subtype/physiology , Influenza A Virus, H3N2 Subtype/physiology , Oseltamivir/pharmacology , Virus Replication , Animals , Ferrets , Humans , Influenza A Virus, H1N1 Subtype/drug effects , Influenza A Virus, H1N1 Subtype/growth & development , Influenza A Virus, H1N1 Subtype/isolation & purification , Influenza A Virus, H3N2 Subtype/drug effects , Influenza A Virus, H3N2 Subtype/growth & development , Influenza A Virus, H3N2 Subtype/isolation & purification , Influenza, Human/virology , Models, Theoretical , Molecular Sequence Data , Mutation, Missense , Neuraminidase/genetics , RNA, Viral/genetics , Sequence Analysis, DNA , Viral Proteins/genetics
ABSTRACT
BACKGROUND: We aimed to characterize the molecular epidemiology of HIV type-1 (HIV-1) and the prevalence of drug-associated mutations prior to initiating highly active antiretroviral therapy (HAART) in the Free State province, South Africa. The Free State has a population of 3 million, an antenatal HIV prevalence of approximately 34% and a well established infrastructure for antiretroviral (ARV) provision. METHODS: HIV-1 polymerase genes were sequenced from 425 HAART-naive HIV-1-positive patients at voluntary primary healthcare HIV testing centres, who were subsequently attending district centres for assessment for commencing ARVs. Patients (>18 years) were sampled randomly with no exclusion for gender or clinical criteria. Sequences were analysed according to phylogeny and drug resistance. RESULTS: Phylogenetic clustering within the cohort was suggestive of multiple introductions of subtype C virus into the region. Drug resistance mutations (according to the International AIDS Society-USA classification) were distributed randomly across the cohort phylogeny with an overall prevalence of 2.3% in the sampled patients. When stratified according to CD4(+) T-cell count, the prevalence of resistance was 3.6%, 0.9% and 1.2% for CD4(+) T-cell counts <100, 200-350 and >500 cells/microl, respectively, and was most common for non-nucleoside reverse transcriptase inhibitor resistance (3.1% in patients with CD4(+) T-cell count <100 cells/microl). We surveyed all drug-selected mutations and found further significant clustering among patients with low CD4(+) T-cell counts (P=0.003), suggesting unrecognized exposure to ARVs. CONCLUSIONS: In the Free State population, there was a statistical association between low CD4(+) T-cell counts and drug-associated viral polymorphisms. Our data advocate the benefit of detailed history taking from patients starting HAART at low CD4(+) T-cell counts with close follow-up of the virological response.
Subject(s)
Drug Resistance, Multiple, Viral/genetics , HIV Infections/epidemiology , HIV-1/genetics , Mutation , Adult , Antiretroviral Therapy, Highly Active , CD4 Lymphocyte Count , Cohort Studies , Female , HIV Infections/drug therapy , HIV Infections/immunology , HIV-1/drug effects , Humans , Male , Middle Aged , Prevalence , RNA, Viral/analysis , RNA, Viral/genetics , South Africa/epidemiology
ABSTRACT
Strong competition between cytotoxic T-lymphocytes (CTLs) specific for different epitopes in human immunodeficiency virus (HIV) infection would have important implications for the design of an HIV vaccine. To investigate evidence for this type of competition, we analysed CTL response data from 97 patients with chronic HIV infection who were frequently sampled for up to 96 weeks. For each sample, CTL responses directed against a range of known epitopes in gag, pol and nef were measured using an enzyme-linked immunospot assay. The Lotka-Volterra model of competition was used to predict patterns that would be expected from these data if competitive interactions materially affect CTL numbers. In this application, the model predicts that when hosts make responses to a larger number of epitopes, they would have diminished responses to each epitope, and that if one epitope-specific response becomes dramatically smaller, others would increase in size to compensate; conversely, if one response grows, others would shrink. Analysis of the experimental data reveals results that are wholly inconsistent with these predictions. In hosts who respond to more epitopes, the average epitope-specific response tends to be larger, not smaller. Furthermore, responses to different epitopes almost always increase in unison or decrease in unison. Our findings are therefore inconsistent with the hypothesis that there is competition between CTL responses directed against different epitopes in HIV infection. This suggests that vaccines that elicit broad responses would be favourable because they would direct a larger total response against the virus, in addition to being more robust to the effects of CTL escape.
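The Lotka-Volterra prediction being tested can be illustrated by simulating two competing responses: under strong competition each settles below the level it would reach alone, which is why broader responders were predicted to show smaller per-epitope responses. This is a toy sketch with made-up parameters, not the paper's fitted model:

```python
def simulate_lv(c: float, steps: int = 20000, dt: float = 0.01):
    """Two epitope-specific CTL populations under Lotka-Volterra
    competition of strength c (0 = independent growth, 1 = full
    competition for one shared resource), each with carrying
    capacity 1.0. Euler integration; illustrative parameters."""
    x, y = 0.1, 0.2
    for _ in range(steps):
        dx = x * (1.0 - x - c * y)
        dy = y * (1.0 - y - c * x)
        x += dt * dx
        y += dt * dy
    return x, y

# Strong competition suppresses both responses below carrying capacity
x, y = simulate_lv(c=0.9)
print(x < 0.99 and y < 0.99)  # True
# Without competition, both reach carrying capacity independently
x0, y0 = simulate_lv(c=0.0)
print(round(x0, 2), round(y0, 2))  # 1.0 1.0
```

The observed data, in which responses rise and fall in unison and broader responders mount larger per-epitope responses, are the opposite of the suppression pattern this model produces.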
Subject(s)
Epitopes, T-Lymphocyte/immunology , HIV Infections/immunology , HIV-1 , T-Lymphocytes, Cytotoxic/immunology , Humans , Immunoenzyme Techniques , Models, Theoretical , Switzerland
ABSTRACT
Although no sheep naturally infected with bovine spongiform encephalopathy (BSE) has ever been discovered, it remains possible that BSE once infected the UK sheep population, has been transmitted between sheep, and is still present today. We constructed a mathematical model to assess the current maximum theoretical exposure to consumers from BSE-infected ovine material and to estimate the risk reduction that could be achieved by abattoir-based control options if BSE-infected sheep were ever found in the national flock. We predict that, if present, the exposure to consumers from a single BSE-infected sheep would be high: one sheep, close to the end of its incubation period, is likely to contribute 10-1000 times more infectious material than a fully infectious cow. Furthermore, 30% of this exposure comes from infectivity residing in lymphatic and peripheral tissue that cannot be completely removed from a carcass. We are 95% confident that throughout Great Britain, no more than four sheep flocks currently harbour an ongoing BSE epidemic. However, since the exposure from a single infected sheep is high, the annual human exposure from four 'typical' BSE-infected flocks could be considerable. Small reductions in exposure could be achieved by strategies based on tissue testing, a 12-month age restriction or expanded definitions of high-risk tissues. A six-month age restriction is likely to be more effective, and genotype-based strategies the most effective.
Subject(s)
Disease Outbreaks/veterinary , Disease Transmission, Infectious/veterinary , Encephalopathy, Bovine Spongiform/epidemiology , Food Contamination/analysis , Mass Screening/veterinary , Models, Biological , Sheep Diseases/epidemiology , Animals , Cattle , Disease Transmission, Infectious/prevention & control , Encephalopathy, Bovine Spongiform/transmission , Mass Screening/methods , Meat , Population Surveillance/methods , Risk Assessment/methods , Sheep , Sheep Diseases/transmission , United Kingdom/epidemiology
ABSTRACT
OBJECTIVE: This article predicts the future epidemiology of HIV-2 in Caió, a rural region of Guinea-Bissau, and investigates whether HIV-2, which halved in prevalence between 1990 and 2007 and is now almost absent in young adults in Caió, can persist as an infection of the elderly. DESIGN: A mathematical model of the spread of HIV-2 was tailored to the epidemic in Caió. METHODS: An age-stratified difference equation model of HIV-2 transmission was fitted to age-stratified HIV-2 incidence and prevalence data from surveys conducted in Caió in 1990, 1997 and 2007. A stochastic version of the same model was used to make projections. RESULTS: HIV-2 infection is predicted to continue to decline rapidly in Caió, such that new infections will cease and prevalence will reach low levels (e.g. below 0.1%) within a few decades. HIV-2 is not predicted to persist in the elderly. CONCLUSION: HIV-2 is predicted to go extinct in Caió during the second half of this century.
Subject(s)
Disease Eradication , HIV Infections/epidemiology , HIV Infections/virology , HIV-2/isolation & purification , Adolescent , Adult , Aged , Aged, 80 and over , Female , Guinea-Bissau/epidemiology , Humans , Incidence , Male , Middle Aged , Models, Theoretical , Prevalence , Rural Population , Young Adult
ABSTRACT
BACKGROUND: Relatives often experience considerable problems looking after a family member with severe mental illness. The problems arising from verbal and physical abuse are not well researched or acknowledged. AIMS: To examine the frequency with which family carers experienced verbal and physical abuse from relatives who were being looked after by a community mental health service, and to identify the correlates and consequences of that abuse. METHOD: Interviews with all the clients of a community mental health service in suburban Melbourne who had regular contact with a family carer, together with interviews with the carers. RESULTS: One hundred and one clients and their family carers were interviewed. Supporting a previous study of patients on an acute admission ward, the experiences of verbal and physical abuse were positively correlated. Higher rates of abuse were associated with poor relationships between patients and their families and a history of poly-drug misuse and previous criminal offences on the part of the patient. Relatives experiencing higher levels of abuse were more likely to have symptoms of emotional distress and were rated as experiencing more burden. CONCLUSIONS: Verbal and physical abuse are not infrequent problems facing family members caring for a relative with severe mental illness. Some of the risk factors for such abuse can be identified. Care plans for family carers could usefully target risk reduction strategies to minimise the occurrence of abuse.
Subject(s)
Caregivers , Family Relations , Mental Disorders/psychology , Adolescent , Adult , Female , Humans , Male , Middle Aged , Severity of Illness Index
ABSTRACT
Cost-benefit is rarely combined with nonlinear dynamic models when evaluating control options for infectious diseases. The current strategy for scrapie in Great Britain requires that all genetically susceptible livestock in affected flocks be culled (Compulsory Scrapie Flock Scheme or CSFS). However, this results in the removal of many healthy sheep, and a recently developed pre-clinical test for scrapie now offers a strategy based on disease detection. We explore the flock-level cost-effectiveness of scrapie control using a deterministic transmission model and industry estimates of costs associated with genotype testing, pre-clinical tests and the value of a sheep culled. Benefit was measured in terms of the reduction in the number of infected sheep sold on, compared to a baseline strategy of doing nothing, using incremental cost-effectiveness analysis to compare across strategies. As market data were not available for pre-clinical testing, a threshold analysis was used to set a unit cost giving equal costs for CSFS and multiple pre-clinical testing (MT; one test each year for three consecutive years). Assuming a 40% within-flock proportion of susceptible genotypes and a test sensitivity of 90%, a single test (ST) was cheaper but less effective than either the CSFS or MT strategies (30 infected-sales-averted over the lifetime of the average epidemic). The MT strategy was slightly less effective than the CSFS and would be a dominated strategy unless pre-clinical testing was cheaper than the threshold price of £6.28, but may be appropriate for flocks with particularly valuable livestock. Though the ST is not currently recommended, the proportion of susceptible genotypes in the national flock is likely to continue to decrease; this may eventually make it a cost-effective alternative to the MT or CSFS.
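The comparison metric used in analyses of this kind is the incremental cost-effectiveness ratio against the baseline. A minimal sketch with made-up figures (the paper's own costs are industry estimates not reproduced here):

```python
def icer(cost: float, benefit: float,
         base_cost: float = 0.0, base_benefit: float = 0.0) -> float:
    """Incremental cost-effectiveness ratio of a strategy versus a
    baseline of doing nothing: extra cost per extra unit of benefit
    (here, infected-sales-averted). All figures are hypothetical."""
    return (cost - base_cost) / (benefit - base_benefit)

# Hypothetical: a testing strategy costing £5,000 more than doing
# nothing while averting 30 infected sales
print(round(icer(5000.0, 30.0)))  # 167 (£ per infected-sale-averted)
```

A strategy is "dominated" when another option is both cheaper and more effective, which is the comparison drawn between MT and CSFS above.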
Subject(s)
Scrapie/economics , Scrapie/prevention & control , Animals , Costs and Cost Analysis , Scrapie/epidemiology , United Kingdom/epidemiology
ABSTRACT
Understanding the circumstances under which exposure to transmissible spongiform encephalopathies (TSEs) leads to infection is important for managing risks to public health. Based upon ideas in toxicology and radiology, it is plausible that exposure to harmful agents, including TSEs, is completely safe if the dose is low enough. However, the existence of a threshold, below which infection probability is zero, has never been demonstrated experimentally. Here we explore this question by combining data and mathematical models that describe scrapie infections in mice following experimental challenge over a broad range of doses. We analyse data from 4338 mice inoculated at doses ranging over ten orders of magnitude. These data are compared to results from a within-host model in which prions accumulate according to a stochastic birth-death process. Crucially, this model assumes no threshold on the dose required for infection. Our data reveal that infection is possible at the very low dose of a 1,000-fold dilution of the dose that infects half the challenged animals (ID50). Furthermore, the dose-response curve closely matches that predicted by the model. These findings imply that there is no safe dose of prions and that assessments of the risk from low-dose exposure are right to assume a linear relationship between dose and probability of infection. We also refine two common perceptions about TSE incubation periods: that their mean values decrease linearly with logarithmic decreases in dose and that they are highly reproducible between hosts. The model and data both show that the linear decrease in incubation period holds only for doses above the ID50. Furthermore, variability in incubation periods is greater than predicted by the model, not smaller. This result poses new questions about the sources of variability in prion incubation periods. It also provides insight into the limitations of the incubation period assay.
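The no-threshold conclusion follows naturally from branching-process reasoning: each founding prion either dies out by chance or establishes, so at low dose the probability of infection is linear in dose with no cutoff. A toy version of that calculation, with illustrative rates rather than the fitted model:

```python
import math

def infection_probability(dose: float, birth: float, death: float) -> float:
    """Probability that at least one founding prion escapes stochastic
    extinction, when prions replicate (birth) and are cleared (death)
    as a branching process (assumes birth > death) and the number of
    founders is Poisson with mean `dose`. Illustrative toy model."""
    establish = 1.0 - death / birth   # survival chance of one lineage
    return 1.0 - math.exp(-dose * establish)

# No threshold: at low doses the probability is linear in dose, so
# doubling the dose roughly doubles the risk of infection
p1 = infection_probability(1e-4, birth=2.0, death=1.0)
p2 = infection_probability(2e-4, birth=2.0, death=1.0)
print(round(p2 / p1, 3))  # 2.0
```

Because the curve passes through the origin with nonzero slope, any nonzero dose carries some risk, which is the sense in which there is no safe dose.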