ABSTRACT
Cytochrome P450 (CYP)3A4 induction by drugs and pesticides plays a critical role in the enhancement of pyrrolizidine alkaloid (PA) toxicity, as it leads to increased formation of hepatotoxic dehydro-PA metabolites. Addressing the need for a quantitative analysis of this interaction, we developed a physiologically based toxicokinetic (PBTK) model. Specifically, the model describes the impact of the well-characterized CYP3A4 inducer rifampicin on the kinetics of retrorsine, a prototypic PA and contaminant in herbal teas. Based on consumption data, the kinetics after daily intake of retrorsine were simulated with concomitant rifampicin treatment. The strongest impact on retrorsine kinetics (plasma AUC24 and Cmax reduced to 67% and 74% of the rifampicin-free reference) was predicted directly after withdrawal of rifampicin. At this time point, the competitive inhibitory effect of rifampicin had ceased, while CYP3A4 induction was still near its maximum. Due to the altered metabolism kinetics, the cumulative formation of intestinal retrorsine CYP3A4 metabolites increased to 254% (from 10 to 25 nmol), while the cumulative formation of hepatic CYP3A4 metabolites was not affected (57 nmol). Return to baseline PA toxicokinetics was predicted 14 days after the end of a 14-day rifampicin treatment. In conclusion, the PBTK model proved to be a promising tool to assess the dynamic interplay of enzyme induction and toxification pathways.
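The induction dynamics described here are commonly captured with an enzyme-turnover equation, dE/dt = k_deg * (1 + Emax*C/(EC50 + C)) - k_deg * E. Below is a minimal sketch of this building block; it is not the published PBTK model, and all parameter values and the exposure profile are illustrative assumptions.

```python
# Minimal sketch of a CYP3A4 enzyme-turnover model with induction, as
# commonly used in PBTK modeling. NOT the published model; K_DEG, E_MAX,
# EC50 and the rifampicin exposure profile are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

K_DEG = 0.03   # 1/h, CYP3A4 degradation rate constant (assumed)
E_MAX = 9.0    # maximal fold-stimulation of synthesis (assumed)
EC50 = 0.34    # mg/L, inducer conc. at half-maximal effect (assumed)

def c_rif(t):
    """Rifampicin exposure: constant 1 mg/L for 14 days, then withdrawn."""
    return 1.0 if t <= 14 * 24 else 0.0

def enzyme_turnover(t, y):
    """dE/dt: synthesis stimulated by the inducer, first-order degradation.
    E is CYP3A4 relative to baseline (E = 1 at steady state)."""
    e = y[0]
    stimulation = 1.0 + E_MAX * c_rif(t) / (EC50 + c_rif(t))
    return [K_DEG * stimulation - K_DEG * e]

sol = solve_ivp(enzyme_turnover, (0.0, 28 * 24), [1.0],
                dense_output=True, max_step=6.0)
print("fold-induction on day 14:", round(sol.sol(14 * 24)[0], 2))
print("fold-induction on day 28:", round(sol.sol(28 * 24)[0], 2))
```

With these toy values the enzyme level has essentially returned to baseline two weeks after withdrawal, mirroring the 14-day recovery predicted in the abstract.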
Subject(s)
Cytochrome P-450 CYP3A Inducers , Cytochrome P-450 CYP3A , Models, Biological , Pyrrolizidine Alkaloids , Rifampin , Toxicokinetics , Humans , Male , Cytochrome P-450 CYP3A/drug effects , Cytochrome P-450 CYP3A/metabolism , Drug Interactions , Liver/drug effects , Liver/metabolism , Pyrrolizidine Alkaloids/toxicity , Pyrrolizidine Alkaloids/pharmacokinetics , Rifampin/toxicity , Rifampin/pharmacokinetics
ABSTRACT
Retrorsine is a hepatotoxic pyrrolizidine alkaloid (PA) found in herbal supplements and medicines, food and livestock feed. Dose-response studies enabling the derivation of a point of departure, including a benchmark dose, for risk assessment of retrorsine in humans and animals are not available. Addressing this need, a physiologically based toxicokinetic (PBTK) model of retrorsine was developed for mouse and rat. Comprehensive characterization of retrorsine toxicokinetics revealed that both the fraction absorbed from the intestine (78%) and the fraction unbound in plasma (60%) are high, hepatic membrane permeation is dominated by active uptake rather than passive diffusion, liver metabolic clearance is 4-fold higher in rat than in mouse, and renal excretion contributes 20% of the total clearance. The PBTK model was calibrated with kinetic data from available mouse and rat studies using maximum likelihood estimation. Model evaluation showed convincing goodness-of-fit for hepatic retrorsine and retrorsine-derived DNA adducts. Furthermore, the developed model enabled the translation of in vitro liver toxicity data on retrorsine into in vivo dose-response data. Resulting benchmark dose confidence intervals (mg/kg bodyweight) are 24.1-88.5 in mice and 79.9-104 in rats for acute liver toxicity after oral retrorsine intake. As the PBTK model was built to enable extrapolation to different species and other PA congeners, this integrative framework constitutes a flexible tool to address gaps in the risk assessment of PA.
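As a toy illustration of the benchmark-dose step, the sketch below reads a benchmark dose off a fitted dose-response curve at a predefined benchmark response; the Hill-type curve and its parameters are invented stand-ins, not the study's fitted model.

```python
# Illustrative benchmark-dose (BMD) calculation: find the dose at which a
# fitted dose-response curve reaches a predefined benchmark response (BMR).
# The Hill curve and its parameters are invented, not the study's estimates.
from scipy.optimize import brentq

def response(dose, top=1.0, ed50=60.0, hill=2.0):
    """Fraction responding (acute liver toxicity) at an oral dose in
    mg/kg bodyweight; a generic Hill curve used as a stand-in."""
    return top * dose**hill / (ed50**hill + dose**hill)

BMR = 0.10  # 10% benchmark response, a common regulatory choice

bmd = brentq(lambda d: response(d) - BMR, 1e-9, 1e4)
print(f"BMD10 = {bmd:.1f} mg/kg bodyweight")  # 20.0 for these toy values
```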
Subject(s)
Pyrrolizidine Alkaloids , Humans , Rats , Mice , Animals , Pyrrolizidine Alkaloids/metabolism , Liver/metabolism , Microsomes, Liver/metabolism , DNA Adducts/metabolism
ABSTRACT
BACKGROUND: Various methods exist for statistical inference about a prevalence that consider misclassification due to an imperfect diagnostic test. However, traditional methods are known to suffer from truncation of the prevalence estimate and of the confidence intervals constructed around the point estimate, as well as from under-performance of the confidence intervals' coverage. METHODS: In this study, we used simulated data sets to validate a Bayesian prevalence estimation method and compare its performance to frequentist methods, i.e. the Rogan-Gladen estimate (RGE) for prevalence in combination with several methods of confidence interval construction. Our performance measures are (i) the error distribution of the point estimate against the simulated true prevalence and (ii) the coverage and length of the confidence interval, or credible interval in the case of the Bayesian method. RESULTS: Across all data sets, the Bayesian point estimate and the RGE produced similar error distributions, with slight advantages of the former over the latter. In addition, the Bayesian estimate did not suffer from the RGE's truncation problem at zero or unity. With respect to coverage performance of the confidence and credible intervals, all of the traditional frequentist methods exhibited strong under-coverage, whereas the Bayesian credible interval as well as a newly developed frequentist method by Lang and Reiczigel performed as desired, with the Bayesian method having a very slight advantage in terms of interval length. CONCLUSION: The Bayesian prevalence estimation method should be preferred over traditional frequentist methods. An acceptable alternative is to combine the Rogan-Gladen point estimate with the Lang-Reiczigel confidence interval.
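For readers unfamiliar with the frequentist baseline, the Rogan-Gladen estimate and the truncation problem discussed here fit in a few lines; the test characteristics in the example are assumed values, not those of the simulation study.

```python
# Rogan-Gladen adjustment of an apparent prevalence for misclassification
# by an imperfect test; the truncation at zero/unity criticized above is
# made explicit. Example inputs are assumed values.
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Point estimate of true prevalence; the denominator is Youden's index."""
    estimate = (apparent_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(estimate, 0.0), 1.0)  # truncation to [0, 1]

print(rogan_gladen(0.12, 0.90, 0.95))   # ~0.082
print(rogan_gladen(0.02, 0.90, 0.95))   # 0.0 (raw estimate is negative)
```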
Subject(s)
Bayes Theorem , Prevalence , Humans
ABSTRACT
Bacteria of the genus Campylobacter are among the most common causes of gastroenteritis and can lead to serious sequelae. Several studies have estimated the disease burden of Campylobacter spp. with the quantitative metric of disability-adjusted life years (DALYs). The aim of this systematic review is to give an overview of the countries and periods for which DALYs have been calculated and of how comparable the different results are. One of the most important transmission pathways for Campylobacter spp. is food. Therefore, special attention was given to studies that estimated only the foodborne disease burden of Campylobacter bacteria. With a systematic search for the period 1/1996-6/2016, one worldwide and 21 country-specific publications of the WHO were identified. Because of the different methods and the varying quality of the underlying data sets, the estimates for all Campylobacter health outcomes in the country-specific studies range from 0.4 DALYs per 100,000 people in France to 109 DALYs per 100,000 population in Poland. The calculation of the attributable foodborne disease burden was based on the estimated incidences of all Campylobacter health outcomes, with the associated uncertainty for each result. Accordingly, the estimates of the foodborne disease burden also show a large range, from 0.5 DALYs per 100,000 people in Greece to 21.2 DALYs per 100,000 people in New Zealand. This span can only partially be explained by country-specific variability in food production, consumption behavior and the incidence of Campylobacter infections.
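The DALY metric used throughout is simply the sum of years of life lost (YLL) and years lived with disability (YLD); the sketch below shows the arithmetic with invented example numbers, not figures from any of the reviewed studies.

```python
# DALY = YLL + YLD; all input values are invented for illustration only.
def dalys(cases, case_fatality, years_of_life_lost,
          disability_weight, duration_years):
    yll = cases * case_fatality * years_of_life_lost   # mortality component
    yld = cases * disability_weight * duration_years   # morbidity component
    return yll + yld

# Hypothetical campylobacteriosis figures for a population of 100,000:
print(dalys(cases=200, case_fatality=0.001, years_of_life_lost=20,
            disability_weight=0.10, duration_years=0.02))  # 4.4 DALYs
```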
Subject(s)
Campylobacter Infections/pathology , Cost of Illness , Foodborne Diseases , Gastroenteritis/pathology , Campylobacter/pathogenicity , Campylobacter Infections/complications , Communicable Diseases/microbiology , Gastroenteritis/complications , Germany , Humans , Poland , Quality-Adjusted Life Years
ABSTRACT
The microscopic agglutination test (MAT) is still considered the gold standard for the diagnosis of leptospirosis, although studies have shown that the test is an imperfect gold standard for clinical samples and unsuitable for epidemiological studies. Here, the test characteristics of an in-house ELISA were determined for both subclinical and clinical populations using Bayesian latent class models. A conditional dependence model for two diagnostic tests and two populations was adapted to analyse a clinical and a subclinical scenario, respectively. These Bayesian models were used to estimate the sensitivity and specificity of the in-house ELISA and the MAT as well as the prevalences. The Bayesian estimates of the in-house ELISA were: clinical sensitivity=83.0%, clinical specificity=98.5%, subclinical sensitivity=85.7% and subclinical specificity=99.1%. In contrast, the estimates of the MAT were: clinical sensitivity=65.6%, clinical specificity=97.7%, subclinical sensitivity=54.9% and subclinical specificity=97.3%. The results show the suitability of the in-house ELISA for both clinical investigations and epidemiological studies in mildly endemic areas.
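Under the simpler assumption of conditional independence given the true status, the likelihood of a two-test latent class model can be written down directly; the sketch below shows these cell probabilities (the paper's model additionally allows conditional dependence between the tests and uses two populations). All counts and parameter values are illustrative.

```python
# Cell probabilities and log-likelihood of a two-test latent class model
# under conditional independence given true infection status. The counts
# and parameter values below are illustrative only.
import numpy as np

def cell_probs(prev, se1, sp1, se2, sp2):
    """P(pattern) for test-result patterns (+,+), (+,-), (-,+), (-,-)."""
    return np.array([
        prev * se1 * se2             + (1 - prev) * (1 - sp1) * (1 - sp2),
        prev * se1 * (1 - se2)       + (1 - prev) * (1 - sp1) * sp2,
        prev * (1 - se1) * se2       + (1 - prev) * sp1 * (1 - sp2),
        prev * (1 - se1) * (1 - se2) + (1 - prev) * sp1 * sp2,
    ])

def log_lik(counts, prev, se1, sp1, se2, sp2):
    """Multinomial log-likelihood of the observed 2x2 pattern counts."""
    return float(np.sum(counts * np.log(cell_probs(prev, se1, sp1, se2, sp2))))

counts = np.array([40, 10, 15, 235])  # hypothetical ELISA/MAT patterns
print(log_lik(counts, prev=0.15, se1=0.857, sp1=0.991, se2=0.549, sp2=0.973))
```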
Subject(s)
Leptospira/immunology , Leptospirosis/diagnosis , Serologic Tests/methods , Bayes Theorem , Enzyme-Linked Immunosorbent Assay/methods , Humans , Leptospirosis/epidemiology , Sensitivity and Specificity , Seroepidemiologic Studies
ABSTRACT
The advent of new testing systems and "omics" technologies has left regulatory toxicology facing one of its biggest challenges in decades: the question of whether and how these methods can be used for regulatory purposes. The new methods undoubtedly enable regulators to address important open questions of toxicology, such as species-specific toxicity, mixture toxicity, low-dose effects, endocrine effects and nanotoxicology, while promising faster and more efficient toxicity testing with the use of fewer animals. Consequently, the respective assays, methods and testing strategies are the subject of several research programs worldwide. On the other hand, the practical application of such tests for regulatory purposes is a matter of ongoing debate. This document summarizes key aspects of this debate in the light of the European "regulatory status quo", while elucidating new perspectives for regulatory toxicity testing.
Subject(s)
Animal Testing Alternatives/methods , Toxicity Tests/methods , Toxicology/methods , Animal Testing Alternatives/legislation & jurisprudence , Animals , Europe , Government Regulation , Humans , Species Specificity , Toxicity Tests/standards , Toxicity Tests/trends , Toxicology/legislation & jurisprudence , Toxicology/standards , Toxicology/trends , United States
ABSTRACT
The Fiber Pathogenicity Paradigm (FPP) establishes connections between fiber structure, durability, and disease-causing potential, as observed for asbestos and synthetic fibers. While emerging nanofibers are anticipated to exhibit pathogenic traits according to the FPP, their nanoscale diameter limits rigidity, leading to tangling and loss of fiber characteristics. The absence of validated rigidity measurement methods complicates nanofiber toxicity assessment. By comprehensively analyzing 89 transcriptomics and 37 proteomics studies, this study aims to improve the understanding of carbon material toxicity and proposes an alternative strategy to assess morphology-driven toxicity. Carbon materials are categorized as non-fibrous, high aspect ratio with shorter lengths, tangled, and rigid fibers. Mitsui-7 serves as a benchmark for pathogenic fibers. The meta-analysis reveals distinct cellular changes for each category, effectively distinguishing rigid fibers from other carbon materials. Subsequently, a robust random forest model is developed to predict morphology, unveiling the pathogenicity of NM-400, previously deemed non-pathogenic, due to its secondary structures. This study fills a crucial gap in nanosafety by linking toxicological effects to material morphology, in particular for fibers. It demonstrates the significant impact of morphology on toxicological behavior and the necessity of integrating morphological considerations into regulatory frameworks.
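The classification step can be pictured with a minimal scikit-learn sketch; the feature matrix and labels below are random stand-ins for the summarized omics features (so accuracy sits at chance level), and nothing here reproduces the study's actual model.

```python
# Sketch of a random-forest classifier for the four morphology categories
# (non-fibrous, short high-aspect-ratio, tangled, rigid fiber). The data
# are random stand-ins, NOT the study's omics-derived features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 50))     # e.g., per-study pathway-level changes
y = rng.integers(0, 4, size=120)   # 0..3 encoding the four categories

clf = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                             random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # ~0.25 on pure noise
```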
Subject(s)
Asbestos , Carbon , Carbon/toxicity , Proteomics , Asbestos/chemistry , Gene Expression Profiling , Structure-Activity Relationship
ABSTRACT
The Shiga toxin-producing Escherichia coli O104:H4 outbreak in Germany in 2011 required the development of appropriate tools in real-time for tracing suspicious foods along the supply chain, namely salad ingredients, sprouts, and seeds. Food commodities consumed at locations identified as the most probable sites of infection (outbreak clusters) were traced back in order to identify connections between different disease clusters via the supply chain of the foods. A newly developed relational database with integrated consistency and plausibility checks was used to collate these data for further analysis. Connections between suppliers, distributors, and producers were visualized in network graphs and geographic projections. Finally, this trace-back and trace-forward analysis led to the identification of sprouts produced by a horticultural farm in Lower Saxony as the vehicle for the pathogen, and a specific lot of fenugreek seeds imported from Egypt as the most likely source of contamination. Network graphs have proven to be a powerful tool for summarizing and communicating complex trade relationships to various stakeholders. The present article gives a detailed description of the newly developed tracing tools and recommendations for necessary requirements and improvements for future foodborne outbreak investigations.
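The core trace-back idea, intersecting the upstream supply chains of several outbreak clusters, can be sketched with a directed graph; all node names below are hypothetical, and the real system was a relational database with consistency and plausibility checks.

```python
# Trace-back sketch: deliveries as directed edges (supplier -> recipient);
# nodes upstream of ALL outbreak clusters are candidate common sources.
# Every node name here is hypothetical.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("seed importer", "sprout producer"),
    ("sprout producer", "distributor A"),
    ("sprout producer", "distributor B"),
    ("distributor A", "restaurant 1"),
    ("distributor B", "restaurant 2"),
])

clusters = ["restaurant 1", "restaurant 2"]
common_upstream = set.intersection(*(nx.ancestors(g, c) for c in clusters))
print(common_upstream)  # {'sprout producer', 'seed importer'}
```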
Subject(s)
Disease Outbreaks , Enterobacteriaceae Infections/epidemiology , Foodborne Diseases/epidemiology , Shiga-Toxigenic Escherichia coli/pathogenicity , Cluster Analysis , Egypt , Enterobacteriaceae Infections/microbiology , Food Contamination/analysis , Food Microbiology , Foodborne Diseases/microbiology , Germany/epidemiology , Humans , Plant Extracts , Shiga-Toxigenic Escherichia coli/isolation & purification , Trigonella/microbiology
ABSTRACT
BACKGROUND: The proportionality principle has been used broadly for over 10 years in regulatory assessments of pesticide residues. It allows extrapolation from supervised field trials conducted at lower or higher application rates than the use pattern under evaluation by adjusting the measured concentrations, assuming direct proportionality between the rates applied and the resulting residues. This work revisits the idea underlying the principle by using sets of supervised residue trials conducted under identical conditions but with deviating application rates. Four different statistical methods were used to investigate the relationship between application rates and residue concentrations and to draw conclusions on the statistical significance of the assumed direct proportionality. RESULTS: Based on over 5000 individual trial results, the assumption of direct proportionality was not confirmed to be statistically significant (P > 0.05) using three models: a direct comparison of the ratios of application rates and residue concentrations, and two linear log-log regression models correlating application rate and residue concentration or only residue concentrations per se. In addition, a fourth model analysed deviations between the concentrations expected after direct proportional adjustment and the measured residue values from corresponding field trials. In 56% of all cases, the deviation was larger than ±25%, the tolerance usually accepted for the selection of supervised field trials in regulatory assessments. CONCLUSION: Overall, the assumption of direct proportionality between application rates and resulting residue concentrations of pesticides was not statistically significant. Although the proportionality approach is highly pragmatic in regulatory practice, its use should be considered carefully on a case-by-case basis. © 2023 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
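One of the regression approaches mentioned can be sketched as follows: residue = k * rate**beta implies log(residue) = log(k) + beta * log(rate), so direct proportionality corresponds to beta = 1 and can be examined with a t-test on the slope. The data below are simulated with beta = 0.8 purely for illustration, not taken from the trials.

```python
# Log-log regression check of direct proportionality: residue = k*rate**beta,
# so proportionality corresponds to slope beta = 1 on the log scale.
# Data are simulated (true beta = 0.8), not the supervised trial results.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
rate = rng.uniform(0.5, 4.0, size=200)                    # application rate
residue = 0.3 * rate**0.8 * rng.lognormal(0.0, 0.4, 200)  # mg/kg (simulated)

fit = sm.OLS(np.log(residue), sm.add_constant(np.log(rate))).fit()
beta, se = fit.params[1], fit.bse[1]
print(f"beta = {beta:.2f}, t-statistic for H0 beta=1: {(beta - 1.0) / se:.2f}")
```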
Subject(s)
Pesticide Residues , Pesticides , Pesticide Residues/analysis , Pesticides/analysis , Crops, Agricultural , Food Contamination/analysis
ABSTRACT
The Treaty of Amsterdam, in force since 1 May 1999, has established new ground rules for the actions of the European Union (EU) on animal welfare. It recognizes that animals are sentient beings and obliges the European Institutions to pay full regard to the welfare requirements of animals when formulating and implementing Community legislation. In order to properly address welfare issues, these need to be assessed in a scientific and transparent way. The principles of risk assessment, in terms of transparency and use of available scientific data, are probably well suited for this area. The application of risk assessment to terrestrial and aquatic animal welfare is a relatively new area. This paper describes the work developed in the context of the European Food Safety Authority (EFSA) opinions on the application of a risk assessment methodology to fish welfare. Risk assessment is a scientifically based process that seeks to determine the likelihood and consequences of an adverse event, which is referred to as a hazard. It generally consists of the following steps: (i) hazard identification, (ii) hazard characterisation, (iii) exposure assessment and (iv) risk characterisation. Different approaches can be used for risk assessments, such as qualitative, semi-quantitative and quantitative approaches. These are discussed in the context of fish welfare, using examples from assessments of aquaculture husbandry systems and stunning/killing methods for farmed fish. A critical review of the applications and limitations of the risk methodology in fish welfare is given. There is a need to develop appropriate indicators of fish welfare. Yet, risk assessment methodology provides a transparent approach to identify significant hazards and support recommendations for improved welfare.
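A semi-quantitative approach of the kind mentioned typically scores each hazard on ordinal scales and combines the scores multiplicatively; the sketch below is a generic illustration with invented hazards and scores, not an EFSA scoring scheme.

```python
# Generic semi-quantitative risk scoring: score = exposure * likelihood *
# severity per hazard. Hazards and scores are invented examples only.
hazards = {
    "chronic low dissolved oxygen": {"exposure": 3, "likelihood": 2, "severity": 2},
    "crowding before stunning":     {"exposure": 1, "likelihood": 3, "severity": 3},
}
for name, s in sorted(hazards.items()):
    score = s["exposure"] * s["likelihood"] * s["severity"]
    print(f"{name}: risk score {score}")
```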
Subject(s)
Animal Welfare , Fisheries/standards , Fishes/physiology , Risk Assessment , Animal Welfare/legislation & jurisprudence , Animals , Risk Assessment/standards
ABSTRACT
The hepatitis E virus (HEV) can cause acute and chronic hepatitis in humans. Infections with the zoonotic HEV genotype 3, which can be transmitted from infected wild boar and deer to humans, are increasingly detected in Europe. To investigate the spatiotemporal HEV infection dynamics in wild animal populations, a study involving 3572 samples of wild boar and three deer species from six different geographic areas in Germany over a 4-year period was conducted. The HEV-specific antibody detection rates increased between 2013-2014 and 2016-2017 in wild boar from 9.5% to 22.8%, and decreased in deer from 1.1% to 0.2%. At the same time, HEV-RNA detection rates increased in wild boar from 2.8% to 13.3% and in deer from 0.7% to 4.2%. Marked differences were recorded between the investigated areas, with constantly high detection rates in one area and new HEV introductions followed by increasing detection rates in others. Molecular typing identified HEV subtypes 3c, 3f, 3i and a putative new subtype related to Italian wild boar strains. In areas where sufficient numbers of positive samples were available for further analysis, a specific subtype dominated over the whole observation period. Phylogenetic analysis confirmed the close relationship between strains from the same area and identified closely related human strains from Germany. The results suggest that the HEV infection dynamics in wild animals depend on the particular geographical area, where area-specific dominant strains circulate over a long period. The virus can spread from wild boar, which represent the main wild animal reservoir, to deer, and generally from wild animals to humans.
Subject(s)
Deer , Hepatitis E virus , Hepatitis E , Swine Diseases , Animals , Animals, Wild , Genotype , Germany/epidemiology , Hepatitis Antibodies , Hepatitis E/epidemiology , Hepatitis E/veterinary , Hepatitis E virus/genetics , Humans , Phylogeny , RNA , RNA, Viral/genetics , Sus scrofa , Swine , Swine Diseases/epidemiology
ABSTRACT
The social structure of animal groups is considered to have an impact on their health and welfare. This could also be true for animals kept under commercial conditions, but research in this area has been limited. Pigs, for example, are known to be very social animals, but information about their grouping behavior is mostly derived from wild boars and from a limited number of studies under seminatural and commercial conditions. Under commercial conditions specifically, it is still unclear to what extent pig herds organize themselves into subgroups and how such group patterns emerge. To answer these questions, we tracked the positions of about 200 sows inside a barn during ongoing production over a period of five weeks and used these data to construct and analyze the animal contact networks. Our analysis showed a very high contact density and only little variation in the number of other animals that a specific animal is in contact with. Nevertheless, in each week we consistently detected three subgroups inside the barn, which also showed a clear spatial separation. Our results show that even in the high-density environment of a commercial pig farm, the pigs' tendency to form differentiated groups is consistent with their behavior under seminatural conditions. Furthermore, our findings imply that the barn layout could play an important role in the formation of the grouping pattern. These insights could be used to better monitor and understand the spread of infectious diseases inside the barn, and could potentially help to improve the welfare of pigs.
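The subgroup detection step can be sketched with a weighted contact graph and modularity-based community detection; the contact counts below are simulated with three planted groups, standing in for the real position-derived contacts.

```python
# Sketch: subgroup detection in a weighted contact network via modularity
# communities. Contacts are simulated with three planted groups of 20
# animals each; the real study derived contacts from position tracking.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(2)
g = nx.Graph()
for i in range(60):
    for j in range(i + 1, 60):
        same_group = (i // 20 == j // 20)
        w = rng.poisson(8 if same_group else 2)  # simulated contact counts
        if w > 0:
            g.add_edge(i, j, weight=w)

communities = greedy_modularity_communities(g, weight="weight")
print([len(c) for c in communities])  # ideally three groups of ~20
```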
Subject(s)
Animal Husbandry , Housing, Animal , Social Behavior , Sus scrofa/psychology , Animals , Female
ABSTRACT
Quantitative risk assessments for bovine spongiform encephalopathy (BSE) necessitate estimates for key parameters such as the prevalence of infection, the probability of absence of infection in defined birth cohorts, and the number of BSE-infected but non-detected cattle entering the food chain. We estimated three key parameters, with adjustment for misclassification, from the German BSE surveillance data, using a Gompertz model for the latent (i.e., unobserved) age-dependent detection probabilities and a Poisson response model for the number of BSE cases for the birth cohorts 1999 to 2015. The models were combined in a Bayesian framework. We estimated the median true BSE prevalence between 3.74 and 0.216 cases per 100,000 animals for the birth cohorts 1990 to 2001 and observed a peak for the 1996 birth cohort with a point estimate of 16.41 cases per 100,000 cattle. For the birth cohorts 2002 to 2013, the estimated median prevalence was below one case per 100,000 heads. The calculated confidence in freedom from disease (design prevalence 1 in 100,000) was above 99.5% for the birth cohorts 2002 to 2006. In conclusion, BSE surveillance in the healthy slaughtered cattle chain was extremely sensitive at the time when BSE repeatedly occurred in Germany (2000-2009), because the entry of BSE-infected cattle into the food chain could virtually be prevented by the extensive surveillance program during these years and until 2015 (estimated non-detected cases per 100,000 [95% credible interval] in 2000, 2009 and 2015: 0.64 [0.5, 0.8], 0.05 [0.01, 0.14] and 0.19 [0.05, 0.61], respectively).
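The two model building blocks can be written down compactly: a Gompertz curve for the age-dependent detection probability, feeding into a Poisson mean for the detected case count (animals tested x true prevalence x detection probability). All parameter values below are invented, not the paper's estimates.

```python
# Building blocks of the surveillance model: a Gompertz-shaped detection
# probability rising with age, feeding into Poisson means for detected
# case counts. All parameter values below are invented assumptions.
import numpy as np

def p_detect(age_years, a=1.0, b=8.0, c=0.9):
    """Gompertz curve: probability that an infected animal is detected."""
    return a * np.exp(-b * np.exp(-c * age_years))

true_prev = 5e-5                      # assumed: 5 infected per 100,000
tested = np.array([2e6, 2e6, 2e6])    # animals tested per age stratum
ages = np.array([4.0, 6.0, 8.0])      # mean ages of the strata (years)

poisson_means = tested * true_prev * p_detect(ages)
print(poisson_means)  # expected numbers of detected BSE cases
```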
Subject(s)
Encephalopathy, Bovine Spongiform , Animals , Bayes Theorem , Cattle , Encephalopathy, Bovine Spongiform/diagnosis , Encephalopathy, Bovine Spongiform/epidemiology , Freedom , Prevalence , Risk Assessment
ABSTRACT
Bacteria of the genus Campylobacter are an important cause of human illness worldwide. Campylobacter infections manifest as gastroenteritis and can lead to severe sequelae like reactive arthritis, Guillain-Barré syndrome, irritable bowel syndrome and inflammatory bowel disease. In Germany, Campylobacter-associated gastroenteritis cases are notifiable, but there is no reporting obligation for the sequelae, and the disease burden is therefore clearly underestimated. The aim of our study was to reliably quantify the current disease burden of all Campylobacter spp.-associated diseases for Germany with the method of disability-adjusted life years (DALYs). DALYs combine mortality and morbidity in a single summary measure, whereby one DALY represents the loss of one year in full health. For acute gastroenteritis, we estimated 967 DALYs, of which only 484 DALYs were detected within the reporting system. Overall, we estimated that 8811 DALYs were caused by the Campylobacter-related diseases known so far. 98% of the DALYs were associated with morbidity and 2% with mortality. Mortality was caused exclusively by the health outcomes gastroenteritis and Guillain-Barré syndrome.
Subject(s)
Campylobacter Infections/mortality , Campylobacter , Cost of Illness , Gastroenteritis/mortality , Guillain-Barre Syndrome/mortality , Acute Disease , Female , Germany/epidemiology , Humans , Male
ABSTRACT
Game meat may contain elevated concentrations of lead, especially if lead-containing ammunition is used for hunting; in that case, a health risk is possible for consumer groups with high game meat intake. The lead concentrations in three edible parts of red deer (Cervus elaphus) (marketable meat from the area close to the wound channel, saddle and haunch) were compared between animals hunted with lead and non-lead ammunition. Furthermore, lead levels in game meat of lead-shot red deer were compared with those of lead-shot roe deer and lead-shot wild boar. Ninety red deer were shot and killed in the context of this study (64 with lead and 26 with non-lead ammunition). Since the lead concentration of a number of the samples was below the limit of detection or the limit of quantification, statistical methods for left-censored data were applied. The median concentrations of lead in game meat did not differ significantly between lead-shot and non-lead-shot animals. However, the more elevated lead concentrations were significantly higher in edible parts of animals shot with lead ammunition than with non-lead ammunition. The highest concentrations were found in samples of edible meat from the area close to the wound channel (max 3442 mg Pb/kg), followed by the saddle (max 1.14 mg Pb/kg), with the lowest levels in the haunch (max 0.09 mg Pb/kg). A comparison of game species revealed that the lead concentrations in haunch and saddle of lead-shot red deer were higher than in the corresponding samples of lead-shot roe deer. Our results show that by using non-lead ammunition, a significant reduction of the lead concentration, especially in edible parts near the wound channel, is possible.
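A standard left-censoring treatment of the kind referred to here is maximum likelihood for a lognormal in which values below the detection limit contribute through the CDF rather than the density; the sketch below uses simulated concentrations, not the study's measurements.

```python
# Maximum likelihood for left-censored lognormal concentrations: values
# below the LOD enter the likelihood via the normal CDF on the log scale.
# Data are simulated stand-ins, not the red deer measurements.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
true = rng.lognormal(mean=-3.0, sigma=1.0, size=300)  # mg Pb/kg (simulated)
LOD = 0.03
obs = np.where(true < LOD, np.nan, true)              # NaN marks "< LOD"

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                 # keeps sigma positive
    cens = np.isnan(obs)
    ll = stats.norm.logpdf(np.log(obs[~cens]), mu, sigma).sum()
    ll += cens.sum() * stats.norm.logcdf(np.log(LOD), mu, sigma)
    return -ll

fit = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
print(fit.x[0], np.exp(fit.x[1]))  # estimates near the true (-3, 1)
```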
Subject(s)
Deer , Food Contamination/analysis , Lead/analysis , Meat/analysis , Sus scrofa , Animals , Female , Male
ABSTRACT
This study aimed to estimate the disease burden of methylmercury for children born in Germany in the year 2014. Humans are mainly exposed to methylmercury through the consumption of fish and seafood. Prenatal methylmercury exposure is associated with IQ loss. To quantify this disease burden, we used Monte Carlo simulation based on empirical data to estimate the incidence of mild and severe mental retardation in children born to mothers who consume fish. Subsequently, we calculated the disease burden with the disability-adjusted life years (DALY) method. DALYs combine mortality and morbidity in one measure and quantify the gap between an ideal situation, in which the entire population experiences the standard life expectancy without disease and disability, and the actual situation. Thus, one DALY corresponds to the loss of one year of life in good health. The methylmercury-induced burden of disease for the German birth cohort 2014 was on average 14,186 DALYs (95% CI 12,915-15,440 DALYs). The large majority of the DALYs was attributed to morbidity rather than mortality: 98% of the total disease burden was attributed to mild mental retardation, which leads to morbidity only. The remaining disease burden resulted from severe mental retardation, with equal proportions of premature death and morbidity.
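The Monte Carlo step described here amounts to drawing the uncertain inputs, computing the DALYs per draw, and summarizing the resulting distribution; all distributions and values in the sketch are invented, not the study's inputs.

```python
# Monte Carlo propagation of input uncertainty into a DALY estimate.
# All distributions below are invented stand-ins for the study's inputs.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
cases_mild = rng.poisson(lam=1500, size=n)     # mild retardation cases
dw_mild = rng.uniform(0.04, 0.10, size=n)      # disability weight
duration = rng.normal(70.0, 3.0, size=n)       # years lived with condition

dalys = cases_mild * dw_mild * duration        # morbidity-only outcome
print(round(dalys.mean()), np.percentile(dalys, [2.5, 97.5]).round())
```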
Subject(s)
Methylmercury Compounds/toxicity , Body Burden , Child, Preschool , Cohort Studies , Disabled Persons , Environmental Exposure , Female , Germany , History, 21st Century , Humans , Intellectual Disability/chemically induced , Male , Methylmercury Compounds/pharmacokinetics , Monte Carlo Method , Quality-Adjusted Life Years , Seafood/analysis
ABSTRACT
The toxicity of lead has been known for a long time, and no safe uptake level can be derived for humans. Consumers' intake via food should therefore be kept as low as possible. Game meat can contain elevated levels of lead due to the use of lead ammunition for hunting. A risk assessment conducted in 2010 by the German Federal Institute for Risk Assessment, including various consumption scenarios, revealed a possible health risk for extreme consumers of game meat hunted with lead ammunition (i.e. hunters and members of hunters' households). Babies, infants, children and women of childbearing age were identified as vulnerable groups with regard to the developmental neurotoxicity of lead. It was noted that a sound data base was required in order to refine the assessment. Therefore, the research project "Safety of game meat obtained through hunting" (LEMISI) was conducted in Germany, with the aims of determining the concentrations of lead (as well as of copper and zinc) brought into the edible parts of game meat (roe deer (Capreolus capreolus) and wild boar (Sus scrofa)) by using either lead or non-lead hunting ammunition, whilst concurrently taking geogenic (i.e. "background") levels of lead into account. Compared to non-lead ammunition, lead ammunition significantly increased lead concentrations in the game meat. Both lead and non-lead ammunition deposited copper and zinc in the edible parts of game meat, at concentrations in the range of those regularly detected in meat of farm animals. For the average consumer of game meat in Germany, the additional uptake of lead makes only a minor contribution to the average alimentary lead exposure. However, for consumers from hunters' households, the resulting uptake of lead due to lead ammunition can be several times higher than the average alimentary lead exposure. Non-lead bullets, in combination with suitable game meat hygiene measures, are therefore recommended in order to ensure "state of the art" consumer health protection.
Subject(s)
Food Contamination/analysis , Lead/analysis , Meat/analysis , Animals , Child , Deer , Female , Humans , Sus scrofa , Swine
ABSTRACT
BACKGROUND: Surrogate markers of protective immunity to malaria in humans are needed to rationalize malaria vaccine discovery and development. In an effort to identify such markers, and thereby provide a clue to the complex equation malaria vaccine development is facing, we investigated the relationship between protection acquired through exposure in the field and naturally occurring immune responses (i.e., induced by the parasite) to molecules that are considered valuable vaccine candidates. METHODS AND FINDINGS: We analyzed, under comparative conditions, the antibody responses of each of six isotypes to five leading malaria vaccine candidates in relation to protection acquired by exposure to natural challenges in 217 of the 247 inhabitants of the African village of Dielmo, Senegal (96 children and 121 older adolescents and adults). The status of susceptibility or resistance to malaria was determined by active case detection performed daily by medical doctors over 6 years in a unique follow-up study of this village. Of the 30 immune responses measured, only one, antibodies of the IgG3 isotype directed to merozoite surface protein 3 (MSP3), was strongly associated with clinical protection against malaria in all age groups, i.e., independently of age. This immunological parameter had a higher statistical significance than the sickle cell trait, the strongest known protective factor against Plasmodium falciparum. A single determination of antibody was significantly associated with the clinical outcome over six consecutive years in children subjected to massive natural parasite challenges by mosquitoes (over three parasite inoculations per week). Finally, the target epitopes of these antibodies were found to be fully conserved. CONCLUSIONS: Since anti-MSP3 IgG3 antibodies can develop naturally along with protection against P. falciparum infection in young children, our results provide the encouraging indication that it should be possible to elicit these antibodies by vaccination early in life. Since these antibodies have been found to achieve parasite killing under in vitro and in vivo conditions, and since they can be readily elicited by immunisation in naïve volunteers, our immunoepidemiological findings support the further development of MSP3-based vaccine formulations.
Subject(s)
Antigens, Protozoan/immunology , Immunoglobulin G/blood , Malaria, Falciparum/immunology , Plasmodium falciparum/immunology , Protozoan Proteins/immunology , Adolescent , Adult , Animals , Antigens, Protozoan/genetics , Child , Child, Preschool , Epitopes/genetics , Epitopes/immunology , Female , Humans , Immunity, Innate/immunology , Immunoglobulin G/immunology , Infant , Infant, Newborn , Malaria Vaccines/genetics , Malaria Vaccines/immunology , Malaria, Falciparum/epidemiology , Male , Merozoites/immunology , Molecular Sequence Data , Plasmodium falciparum/genetics , Plasmodium falciparum/growth & development , Protozoan Proteins/genetics , Senegal/epidemiology , Sequence Analysis, DNA , Seroepidemiologic Studies
ABSTRACT
Though predation, productivity (nutrient richness), spatial heterogeneity, and disturbance regimes are known to influence species diversity, interactions between these factors remain largely unknown. Predation has been shown to interact with productivity and with spatial heterogeneity, but few experimental studies have focused on how predation and disturbance interact to influence prey diversity. We used theory and experiments to investigate how these factors influence diversification of Pseudomonas fluorescens by manipulating both predation (presence or absence of Bdellovibrio bacteriovorus) and disturbance (its frequency and intensity). Our results show that in a homogeneous environment, predation is essential to promote prey species diversity. However, in most but not all treatments, elevated diversity was transitory, implying that the effect of predation on diversity was strongly influenced by disturbance. Both our experimental and theoretical results suggest that disturbance interacts with predation by modifying the interplay of resource and apparent competition among prey.
Subject(s)
Bdellovibrio/physiology , Biodiversity , Pseudomonas fluorescens/virology , Models, Biological , Mutation , Pseudomonas fluorescens/classification , Pseudomonas fluorescens/genetics , Species Specificity
ABSTRACT
BACKGROUND: Non-lead hunting ammunition is an alternative to bullets that contain lead. The use of lead ammunition can result in severe contamination of game meat, thus posing a health risk to consumers. With any kind of hunting ammunition, the terminal effectiveness of bullets is an animal welfare issue. Doubts about the effectiveness of non-lead bullets for a humane kill of game animals have been discussed. The length of the escape distance after the shot has previously been used as an indicator of bullet performance. OBJECTIVE: The objective of this study was to determine how the bullet material (lead or non-lead) influences the observed escape distances. METHODS: 1,234 records of the shooting of roe deer (Capreolus capreolus) and 825 records of the shooting of wild boar (Sus scrofa) were evaluated. As the bullet material cannot be regarded as the sole cause of variability in escape distances, interactions with other potentially influencing variables, such as shot placement and shooting distance, were analyzed using conditional regression trees and two-part hurdle models. RESULTS: The length of the escape distance is not influenced by the use of lead or non-lead ammunition in either roe deer or wild boar. In roe deer, the length of the escape distance is influenced significantly by the shot placement and the type of hunting. Increasing shooting distances increased the length of the escape distance. In wild boar, shot placement and the age of the animals were found to be significant influencing factors on the length of the escape distance. CONCLUSIONS: The length of the escape distance can be used as an indicator of adequate bullet effectiveness for the humane killing of game animals in hunting. Non-lead bullets already exist which have an equally reliable killing effect as lead bullets.
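The two-part hurdle idea mentioned in the methods can be sketched as a logistic model for whether the animal moves at all plus a log-linear model for how far it moves given that it does; the covariate and all coefficients below are simulated, not the study's records.

```python
# Two-part hurdle sketch for escape distances: part 1 models P(distance > 0)
# with logistic regression, part 2 models log(distance) for the movers.
# The covariate and all coefficients are simulated, not the study's data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 800
placement = rng.integers(0, 2, n).astype(float)  # 0 = chest shot, 1 = other
X = sm.add_constant(placement)

p_move = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * placement)))
moved = rng.binomial(1, p_move)                  # 1 if the animal fled
dist = np.exp(3.0 + 0.6 * placement + rng.normal(0.0, 0.5, n))  # metres

part1 = sm.Logit(moved, X).fit(disp=False)                     # hurdle part
part2 = sm.OLS(np.log(dist[moved == 1]), X[moved == 1]).fit()  # positive part
print(part1.params, part2.params)
```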