ABSTRACT
Consumers can be exposed to many foodborne biological hazards that cause diseases with varying outcomes and incidence and, therefore, represent different levels of public health burden. To help French risk managers rank these hazards and prioritize food safety actions, we developed a three-step approach. The first step was to establish a list of foodborne hazards of health concern in mainland France. From an initial list of 335 human pathogenic biological agents, the final list of "retained hazards" comprised 24 hazards: 12 bacteria (including bacterial toxins and metabolites), 3 viruses and 9 parasites. The second step was to collect data to estimate the disease burden (incidence, Disability Adjusted Life Years) associated with these hazards through food during two time periods: 2008-2013 and 2014-2019. The ranks of the different hazards changed slightly according to the period considered. The third step was to rank the hazards with a multicriteria decision support model using the ELECTRE III method. Three ranking criteria were used: two reflecting the severity of the effects (years of life lost and years lost due to disability) and one reflecting the likelihood (incidence) of the disease. The multicriteria decision analysis approach takes into account the preferences of the risk managers, through different sets of weights, and the uncertainties associated with the data. The method and the data collected made it possible to estimate the health burden of foodborne biological hazards in mainland France and to define a prioritization list for the health authorities.
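As a rough, purely illustrative sketch of how the burden criteria mentioned above can be combined, the Python snippet below computes DALYs as YLL + YLD for a few hypothetical hazards and ranks them with one arbitrary set of weights; all figures are invented, and ELECTRE III itself relies on pairwise outranking relations with preference and veto thresholds rather than the simple weighted sum used here.

```python
# Illustrative only: hypothetical incidence, YLL and YLD figures per hazard,
# combined into DALYs and ranked with one arbitrary set of weights.
# A real ELECTRE III analysis builds pairwise outranking relations with
# indifference, preference and veto thresholds instead of a weighted sum.

hazards = {
    # hazard: (annual foodborne incidence, years of life lost, years lived with disability)
    "Hazard A": (12_000, 150.0, 420.0),
    "Hazard B": (800, 900.0, 60.0),
    "Hazard C": (50_000, 30.0, 110.0),
}

weights = {"incidence": 0.2, "yll": 0.4, "yld": 0.4}  # assumed risk-manager preferences

def normalise(values):
    """Rescale a criterion to [0, 1] so that criteria are comparable."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

names = list(hazards)
inc_n = normalise([hazards[h][0] for h in names])
yll_n = normalise([hazards[h][1] for h in names])
yld_n = normalise([hazards[h][2] for h in names])

scores = {
    h: weights["incidence"] * inc_n[i] + weights["yll"] * yll_n[i] + weights["yld"] * yld_n[i]
    for i, h in enumerate(names)
}

for h in sorted(scores, key=scores.get, reverse=True):
    incidence, yll, yld = hazards[h]
    print(f"{h}: DALY = {yll + yld:.0f} (YLL {yll:.0f} + YLD {yld:.0f}), "
          f"incidence = {incidence}, score = {scores[h]:.2f}")
```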
Subject(s)
Foodborne Diseases , Risk Management , Foodborne Diseases/epidemiology , France/epidemiology , Humans , Food Safety , Food Microbiology , Incidence , Risk Assessment , Food Contamination/analysis
ABSTRACT
Temperature and relative humidity are major factors determining virus inactivation in the environment. This article reviews inactivation data regarding coronaviruses on surfaces and in liquids from published studies and develops secondary models to predict coronavirus inactivation as a function of temperature and relative humidity. A total of 102 D values (i.e., the time to obtain a 1-log10 reduction of virus infectivity), including values for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), were collected from 26 published studies. The values obtained from the different coronaviruses and studies were found to be generally consistent. Five different models were fitted to the global data set of D values. The most appropriate model considered both temperature and relative humidity. A spreadsheet predicting the inactivation of coronaviruses and the associated uncertainty is presented and can be used to predict virus inactivation for untested temperatures, time points, or any coronavirus strain belonging to the Alphacoronavirus and Betacoronavirus genera.
IMPORTANCE The prediction of the persistence of SARS-CoV-2 on fomites is essential in investigating the importance of contact transmission. This study collects available information on inactivation kinetics of coronaviruses on surfaces and in liquids and creates a mathematical model for the impact of temperature and relative humidity on virus persistence. The predictions of the model can support more robust decision-making and could be useful in various public health contexts. A calculator for the natural clearance of SARS-CoV-2 depending on temperature and relative humidity could be a valuable operational tool for public authorities.
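A minimal sketch of how such a secondary model can be applied is given below, assuming a Bigelow-type form in which log10(D) decreases linearly with temperature and relative humidity; the reference D value and the z-type parameters are placeholders, not the estimates fitted in this study.

```python
# Hedged sketch of a secondary inactivation model: log10(D) is assumed to
# decrease linearly with temperature and relative humidity (Bigelow-type form).
# The reference D value and the z-type parameters below are placeholders,
# NOT the estimates fitted in the study.
LOG10_D_REF = 2.0            # log10 of D (hours) at the reference condition (assumed)
T_REF, RH_REF = 20.0, 50.0   # reference temperature (degrees C) and relative humidity (%)
Z_T, Z_RH = 15.0, 80.0       # increases in T / RH giving a 10-fold drop in D (assumed)

def d_value_hours(temp_c: float, rh_percent: float) -> float:
    """Predicted time (hours) for a 1-log10 drop in infectious titre."""
    log10_d = LOG10_D_REF - (temp_c - T_REF) / Z_T - (rh_percent - RH_REF) / Z_RH
    return 10 ** log10_d

def log10_reduction(temp_c: float, rh_percent: float, hours: float) -> float:
    """Log-linear inactivation: number of log10 reductions after `hours`."""
    return hours / d_value_hours(temp_c, rh_percent)

if __name__ == "__main__":
    for t, rh in [(4, 40), (20, 50), (30, 70)]:
        print(f"T={t} C, RH={rh}%: D = {d_value_hours(t, rh):.1f} h, "
              f"reduction after 24 h = {log10_reduction(t, rh, 24):.2f} log10")
```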
Subject(s)
Betacoronavirus/physiology , Coronavirus Infections/virology , Models, Biological , Pneumonia, Viral/virology , Virus Inactivation , COVID-19 , Fomites/virology , Humans , Humidity , Pandemics , Public Health , SARS-CoV-2 , Suspensions , Temperature
ABSTRACT
Shiga toxin-producing Escherichia coli (STEC) strains may cause human infections ranging from simple diarrhea to haemolytic uremic syndrome (HUS). The five main pathogenic serotypes of STEC (MPS-STEC) identified thus far in Europe are O157:H7, O26:H11, O103:H2, O111:H8, and O145:H28. Because STEC strains can survive or grow during cheese making, particularly in soft cheeses, a stochastic quantitative microbial risk assessment model was developed to assess the risk of HUS associated with the five MPS-STEC in raw milk soft cheeses. A baseline scenario represents a theoretical worst-case scenario in which no intervention is considered throughout the farm-to-fork continuum; the risk level assessed with this baseline scenario is the risk-based level. The impact of seven preharvest scenarios (vaccines, probiotics, milk farm sorting) on the risk-based level was expressed in terms of risk reduction. The impact of the preharvest interventions ranged from 76% to 98% risk reduction, with the highest values predicted for scenarios combining a decrease in the number of cows shedding STEC and in the STEC concentration in feces. The impact of postharvest interventions on the risk-based level was also tested by applying five microbiological criteria (MC) at the end of ripening. The five MCs differ in terms of sample size, the number of samples that may yield a value larger than the microbiological limit, and the analysis methods. The predicted risk reduction varied from 25% to 96% when applying the MCs without preharvest interventions and from 1% to 96% when combining pre- and postharvest interventions.
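The sketch below illustrates, under invented assumptions, how such risk reductions are typically expressed: two Monte Carlo runs of a simplified exposure and dose-response chain, one for the baseline and one for an intervention, compared as a percentage reduction. The contamination distributions and the exponential dose-response parameter are placeholders, not those of the cheese model.

```python
import math
import random

# Hedged sketch of how a risk reduction relative to a baseline scenario can be
# derived from two Monte Carlo runs of a simplified exposure chain. The
# lognormal contamination parameters and the exponential dose-response
# parameter are placeholders, not values from the STEC / raw milk cheese model.
random.seed(1)
R_DR = 1e-4        # assumed exponential dose-response parameter (per CFU)
SERVING_G = 25.0   # assumed serving size (g)

def mean_risk_per_serving(mean_log10_cfu_g: float, sd_log10_cfu_g: float,
                          n_iter: int = 50_000) -> float:
    """Mean probability of illness per serving for a lognormal concentration."""
    total = 0.0
    for _ in range(n_iter):
        log10_conc = random.gauss(mean_log10_cfu_g, sd_log10_cfu_g)
        dose = (10 ** log10_conc) * SERVING_G
        total += 1.0 - math.exp(-R_DR * dose)
    return total / n_iter

baseline = mean_risk_per_serving(-2.0, 1.0)
intervention = mean_risk_per_serving(-3.0, 1.0)  # e.g. a preharvest measure lowering contamination
print(f"risk reduction vs baseline: {100 * (1 - intervention / baseline):.0f}%")
```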
Subject(s)
Escherichia coli O157/pathogenicity , Milk/microbiology , Risk Assessment , Uremia/complications , Animals , Escherichia coli O157/isolation & purification , Humans , Models, Theoretical
ABSTRACT
Microbiological food safety is an important economic and health issue in the context of globalization and presents food business operators with new challenges in providing safe foods. The hazard analysis and critical control point approach involves identifying the main steps in food processing and the physical and chemical parameters that have an impact on the safety of foods. In the risk-based approach, as defined in the Codex Alimentarius, controlling these parameters in such a way that the final products meet a food safety objective (FSO), set by the competent authorities, is a major challenge and of great interest to food business operators. Process risk models, derived from the quantitative microbiological risk assessment framework, provide useful tools in this respect. We propose a methodology, called multivariate factor mapping (MFM), for establishing a link between process parameters and compliance with a FSO. For a stochastic and dynamic process risk model of Listeria monocytogenes in soft cheese made from pasteurized milk with many uncertain inputs, multivariate sensitivity analysis and MFM are combined to (i) identify the critical control points (CCPs) for L. monocytogenes throughout the food chain and (ii) compute the critical limits of the most influential process parameters, located at the CCPs, with regard to the specific process implemented in the model. Due to certain forms of interaction among parameters, the results show some new possibilities for the management of microbiological hazards when a FSO is specified.
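A minimal sketch of the factor-mapping idea is shown below: uncertain process parameters are sampled, a toy growth model is run, each iteration is classified against an assumed FSO, and the parameter values of the compliant runs are inspected to suggest a critical limit. The toy model, the FSO value and all parameter ranges are placeholders, not those of the soft cheese model.

```python
import random

# Hedged sketch of Monte Carlo filtering ("factor mapping"): sample uncertain
# process parameters, classify each run against a food safety objective (FSO),
# and inspect the parameter values of the compliant runs to suggest a critical
# limit. The toy growth model and all numbers are placeholders, not the soft
# cheese model of the study.
random.seed(2)
FSO_LOG10_CFU_PER_G = 2.0   # assumed FSO at the moment of consumption

def final_log10_conc(initial_log10: float, storage_temp_c: float, storage_days: float) -> float:
    """Toy model: growth rate (log10/day) increases linearly with temperature."""
    rate = max(0.0, 0.05 * (storage_temp_c - 1.0))
    return initial_log10 + rate * storage_days

compliant_temps = []
n_runs = 20_000
for _ in range(n_runs):
    initial = random.gauss(-1.0, 0.5)    # log10 CFU/g at the end of processing
    temp = random.uniform(2.0, 14.0)     # storage temperature (degrees C)
    days = random.uniform(5.0, 30.0)     # storage time (days)
    if final_log10_conc(initial, temp, days) <= FSO_LOG10_CFU_PER_G:
        compliant_temps.append(temp)

compliant_temps.sort()
# One way to read off a candidate critical limit: the temperature below which
# 95% of the FSO-compliant runs fall.
limit = compliant_temps[int(0.95 * len(compliant_temps)) - 1]
print(f"{len(compliant_temps)}/{n_runs} runs meet the FSO; "
      f"illustrative storage-temperature critical limit: {limit:.1f} C")
```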
Subject(s)
Cheese/microbiology , Listeria monocytogenes/isolation & purification , Milk/microbiology , Pasteurization , Uncertainty , Animals , Multivariate Analysis
ABSTRACT
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practice how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese or the process environment is simulated over time, across space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.
Subject(s)
Cheese/microbiology , Dairying , Food Safety/methods , Listeria monocytogenes/isolation & purification , Safety Management/methods , Animals , Cattle , Food Handling , Food Microbiology , Humans , Listeria monocytogenes/growth & development , Listeriosis/etiology , Listeriosis/prevention & control , Milk/microbiology , Models, Statistical , Pasteurization , Risk Assessment
ABSTRACT
Better knowledge regarding the Listeria monocytogenes dose-response (DR) model is needed to refine the assessment of the risk of foodborne listeriosis. In 2018, the European Food Safety Authority (EFSA) derived a lognormal Poisson DR model for 14 different age-sex sub-groups, marginalized over strain virulence. In the present study, new sets of parameters are developed by integrating the EFSA model for these sub-groups together with three classes of strain virulence characteristics ("less virulent", "virulent", and "more virulent"). Considering classes of virulence leads to estimated relative risks (RRs) of listeriosis following the ingestion of 1000 bacteria of "less virulent" vs. "more virulent" strains ranging from 21.6 to 24.1, depending on the sub-group. These relatively low RRs, when compared with the RRs linked to comorbidities described in the literature, suggest that the influence of comorbidity on the occurrence of invasive listeriosis for a given exposure is much more important than the influence of the virulence of the strains. The updated model parameters allow better prediction of the risk of invasive listeriosis across a population of interest, provided the necessary data on population demographics and the proportional contribution of strain virulence classes in food products of interest are available. An R package is made available to facilitate the use of these dose-response models.
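The sketch below illustrates the general form of such a model, assuming an exponential (single-hit) dose-response whose parameter r follows a lognormal distribution and is marginalized by Monte Carlo simulation; the mu and sigma values are placeholders, not the published EFSA or virulence-class-specific estimates.

```python
import math
import random

# Hedged sketch of a marginal lognormal-Poisson dose-response: the single-hit
# parameter r varies between exposures with log10(r) ~ Normal(mu, sigma), and
# the marginal probability of invasive listeriosis at dose d is the average of
# 1 - exp(-r * d). The mu and sigma values are placeholders, not the EFSA
# sub-group estimates nor the virulence-class-specific parameters of this study.
random.seed(3)
MU_LOG10_R, SIGMA_LOG10_R = -12.0, 1.5   # assumed parameters for one sub-group / virulence class

def p_illness(dose_cfu: float, n_iter: int = 100_000) -> float:
    """Monte Carlo marginalisation over the variability of r."""
    total = 0.0
    for _ in range(n_iter):
        r = 10 ** random.gauss(MU_LOG10_R, SIGMA_LOG10_R)
        total += 1.0 - math.exp(-r * dose_cfu)
    return total / n_iter

if __name__ == "__main__":
    for dose in (1e3, 1e6, 1e9):
        print(f"dose {dose:.0e} CFU -> P(invasive listeriosis) ~ {p_illness(dose):.2e}")
```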
ABSTRACT
A review of the published quantitative risk assessment (QRA) models of L. monocytogenes in meat and meat products was performed, with the objective of appraising the intervention strategies deemed suitable for implementation along the food chain as well as their relative effectiveness. A systematic review retrieved 23 QRA models; most of them (87%) focused on ready-to-eat meat products and the majority (78%) also covered short supply chains (end processing/retail to consumption, or consumption only). The processing-to-table scope was the choice for models of processed meats such as chorizo, bulk-cooked meat, fermented sausage and dry-cured pork, in which the effects of processing were simulated. Sensitivity analysis demonstrated the importance of obtaining accurate estimates for lag time, growth rate and maximum microbial density, in particular when affected by growth inhibitors and lactic acid bacteria. In the case of deli meats, QRA models showed that delicatessen meats sliced at retail were associated with a higher risk of listeriosis than deli meats pre-packed by the manufacturer. Many models converged on the findings that (1) controlling cold storage temperature led to greater reductions in the final risk than decreasing the time to consumption and, furthermore, that (2) lower numbers and a lower prevalence of L. monocytogenes at the end of processing were far more effective than keeping low temperatures and/or short times during retail and/or home storage. Therefore, future listeriosis QRA models for meat products should encompass a processing module in order to assess the intervention strategies that lead to lower numbers and prevalence, such as the use of bio-preservation and novel technologies. Future models should be built upon accurate microbial kinetic parameters, and should realistically represent cross-contamination events along the food chain.
ABSTRACT
Invasive listeriosis, due to its severe nature in susceptible populations, has been the focus of many quantitative risk assessment (QRA) models aiming to provide a valuable guide in future risk management efforts. A review of the published QRA models of Listeria monocytogenes in seafood was performed, with the objective of appraising the effectiveness of the control strategies at different points along the food chain. It is worth noting, however, that the outcomes of a QRA model are context-specific, and influenced by the country and target population, the assumptions that are employed, and the model architecture itself. Studies containing QRA models were retrieved through a literature search using properly connected keywords on Scopus and PubMed®. All 13 QRA models that were recovered were of short scope, covering, at most, the period from the end of processing to consumption; the majority (85%) focused on smoked or gravad fish. Since the modelled pathways commenced with the packaged product, none of the QRA models addressed cross-contamination events. Many models agreed that keeping the product's temperature at 4.0-4.5 °C leads to greater reductions in the final risk of listeriosis than reducing the shelf life by one week and that the effectiveness of both measures can be surpassed by reducing the initial occurrence of L. monocytogenes in the product (at the end of processing). It is, therefore, necessary that future QRA models for RTE seafood contain a processing module that can provide insight into intervention strategies that can retard L. monocytogenes' growth, such as the use of bacteriocins, ad hoc starter cultures and/or organic acids, and other strategies seeking to reduce cross-contamination at the facilities, such as stringent controls for sanitation procedures. Since risk estimates were shown to be moderately driven by growth kinetic parameters, namely, the exponential growth rate, the minimum temperature for growth, and the maximum population density, further work is needed to reduce uncertainties.
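Because these estimates are moderately driven by growth kinetic parameters, the sketch below illustrates the kind of model involved: a square-root (Ratkowsky-type) secondary model for the growth rate, capped at a maximum population density. The coefficient, minimum growth temperature and maximum density used here are placeholders, not validated estimates for L. monocytogenes in smoked or gravad fish.

```python
import math

# Hedged sketch of the growth kinetics that typically drive these risk
# estimates: a square-root (Ratkowsky-type) secondary model for the growth
# rate, capped at a maximum population density. The coefficient, minimum
# growth temperature and maximum density are placeholders, not validated
# estimates for L. monocytogenes in smoked or gravad fish.
B = 0.1            # assumed square-root model coefficient (sqrt(ln CFU)/day per degree C)
T_MIN = -1.7       # assumed theoretical minimum growth temperature (degrees C)
N_MAX_LOG10 = 7.5  # assumed maximum population density (log10 CFU/g)

def growth_rate_log10_per_day(temp_c: float) -> float:
    """Square-root model: sqrt(mu) = B * (T - Tmin), converted to log10 units."""
    if temp_c <= T_MIN:
        return 0.0
    mu_ln_per_day = (B * (temp_c - T_MIN)) ** 2
    return mu_ln_per_day / math.log(10)

def log10_count(initial_log10: float, temp_c: float, days: float) -> float:
    """Exponential growth capped at the maximum population density."""
    return min(initial_log10 + growth_rate_log10_per_day(temp_c) * days, N_MAX_LOG10)

if __name__ == "__main__":
    for temp in (4.0, 4.5, 8.0):
        print(f"{temp} C after 21 d: {log10_count(0.0, temp, 21.0):.1f} log10 CFU/g")
```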
ABSTRACT
A review of quantitative risk assessment (QRA) models of Listeria monocytogenes in produce was carried out, with the objective of appraising and contrasting the effectiveness of the control strategies placed along the food chains. Although nine of the thirteen QRA models recovered focused on fresh or RTE leafy greens, none of them represented important factors or sources of contamination at primary production, such as the type of cultivation, water, fertilisers or irrigation methods/practices. Cross-contamination at processing and during consumers' handling was modelled using transfer rates, which were shown to moderately drive the final risk of listeriosis, therefore highlighting the importance of accurately representing the transfer coefficient parameters. Many QRA models agreed that temperature fluctuations at retail or temperature abuse at home were key factors contributing to increasing the risk of listeriosis. In addition to a primary production module that could help assess current on-farm practices and potential control measures, future QRA models for minimally processed produce should also contain a refined sanitisation module able to estimate the effectiveness of various sanitisers as a function of type, concentration and exposure time. Finally, L. monocytogenes growth in the products down the supply chain should be estimated by using realistic time-temperature trajectories and validated microbial kinetic parameters, both of which are currently available in the literature.
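As a purely illustrative sketch of the transfer-rate representation of cross-contamination mentioned above, the snippet below moves a fixed fraction of cells between a food-contact surface and the produce at each handling contact; the transfer rates and initial load are invented, not values from the reviewed models.

```python
# Hedged sketch of how cross-contamination is commonly represented with
# transfer rates: a fixed fraction of cells moves between a food-contact
# surface and the produce at each handling contact. The transfer rates and
# initial load below are placeholders, not values from the reviewed models.
TR_SURFACE_TO_PRODUCE = 0.02  # assumed: 2% of the cells on the surface transfer per contact
TR_PRODUCE_TO_SURFACE = 0.05  # assumed: 5% of the cells on the produce transfer back

def one_contact(cells_on_surface: float, cells_on_produce: float) -> tuple:
    """Exchange cells between surface and produce during one contact."""
    to_produce = TR_SURFACE_TO_PRODUCE * cells_on_surface
    to_surface = TR_PRODUCE_TO_SURFACE * cells_on_produce
    return (cells_on_surface - to_produce + to_surface,
            cells_on_produce - to_surface + to_produce)

surface, produce = 1_000.0, 0.0   # initial L. monocytogenes cells (illustrative)
for contact in range(1, 4):
    surface, produce = one_contact(surface, produce)
    print(f"after contact {contact}: surface {surface:.0f} cells, produce {produce:.0f} cells")
```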
ABSTRACT
A review of the published quantitative risk assessment (QRA) models of L. monocytogenes in dairy products was undertaken in order to identify and appraise the relative effectiveness of control measures and intervention strategies implemented at primary production, processing, retail, and consumer practices. A systematic literature search retrieved 18 QRA models; most of them (9) investigated raw and pasteurized milk cheeses, and the majority covered long supply chains (4 farm-to-table and 3 processing-to-table scopes). On-farm contamination sources, either from shedding animals or from the broad environment, have been demonstrated by different QRA models to impact the risk of listeriosis, in particular for raw milk cheeses. Through scenarios and sensitivity analysis, QRA models demonstrated the importance of the modeled growth rate and lag phase duration and showed that the risk contribution of consumers' practices is greater than that of retail conditions. Storage temperature was shown to be a stronger determinant of the final risk than storage time. Despite the pathogen's known ability to reside in damp spots or niches, re-contamination and/or cross-contamination were modeled in only two QRA studies. Future QRA models of dairy products should entail the full farm-to-table scope, should represent cross-contamination and the use of novel technologies, and should estimate L. monocytogenes growth more accurately by means of better-informed kinetic parameters and realistic time-temperature trajectories.
ABSTRACT
The present study aims to compare ochratoxin A (OTA) exposure through the intake of three cereal derivative products (bread, pasta and semolina) in two different Moroccan climatic regions (littoral and continental). OTA weekly intakes from cereal products were calculated using a deterministic approach for each region. Results showed a statistically significant difference (p < 0.05) in OTA exposure between the two regions. Indeed, the median OTA exposure was estimated at 48.97 ng/kg b.w./week in the littoral region, whereas it was estimated at 6.36 ng/kg b.w./week in the continental region. The probabilistic approach showed that, due to uncertainties, the 95th percentile of weekly OTA exposure associated with the three cereal products ranged from 66.18 to 137.79 ng/kg b.w./week (95% CI), with a median of 97.44 ng/kg body weight (b.w.)/week. Compared to the threshold of 100 ng/kg b.w./week, 95% of the cumulative distributions predicted an exceedance frequency between 0.42% and 17.30% (95% CI), with a median exceedance frequency of 4.43%. Results showed that cereal derivatives constitute an important vector of OTA exposure and cause a significant exceedance of the toxicological reference value among high consumers in the littoral region, which suggests the urgency of reconsidering the maximum regulatory limit (3 µg/kg) set for OTA in cereal derivatives by the Moroccan authorities.
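A minimal sketch of the deterministic intake calculation described above is given below; the contamination and consumption figures are placeholders rather than the survey data, and serve only to show how a weekly exposure compares with the 100 ng/kg b.w./week threshold.

```python
# Hedged sketch of the deterministic exposure calculation: weekly OTA intake
# (ng/kg body weight/week) summed over the cereal products and compared with
# the 100 ng/kg b.w./week threshold. The contamination and consumption figures
# are placeholders, not the Moroccan survey data.
BODY_WEIGHT_KG = 60.0
THRESHOLD_NG_PER_KG_BW_WEEK = 100.0

products = {
    # product: (OTA concentration in ug/kg food, i.e. ng/g; consumption in g/day)
    "bread":    (0.9, 250.0),
    "pasta":    (0.4, 60.0),
    "semolina": (0.6, 80.0),
}

weekly_intake = sum(conc_ng_per_g * grams_per_day * 7
                    for conc_ng_per_g, grams_per_day in products.values()) / BODY_WEIGHT_KG

status = "above" if weekly_intake > THRESHOLD_NG_PER_KG_BW_WEEK else "below"
print(f"weekly OTA exposure: {weekly_intake:.1f} ng/kg b.w./week "
      f"({status} the {THRESHOLD_NG_PER_KG_BW_WEEK:.0f} ng/kg b.w./week threshold)")
```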
Subject(s)
Edible Grain , Ochratoxins , Edible Grain/chemistry , Food Contamination/analysis , Ochratoxins/analysis , Bread
ABSTRACT
Food safety is a constant challenge for stakeholders in the food industry. To manage the likelihood of microbiological contamination, food safety management systems must be robust, including food and environmental testing. Environmental monitoring programs (EMPs) have emerged over the last decade, aiming to validate cleaning-sanitation procedures and other environmental pathogen control programs. The need to monitor production environments has become evident because of recent foodborne outbreaks. However, environmental monitoring is not limited to the management of pathogens; it also extends to spoilage and hygiene indicator microorganisms, allergens, and other hygiene monitoring. Surfaces in production environments can be a source of contamination, either through ineffective cleaning and disinfection procedures or through contamination during production by flows or operators. This study analyses the current practices of 37 French agri-food companies (small, medium, or large), reporting their objectives for EMPs, microbial targets, types, numbers and frequency of sampling, analysis of results, and types of corrective actions.
ABSTRACT
SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2), a virus causing severe acute respiratory disease in humans, emerged in late 2019. This respiratory virus can spread via aerosols, fomites, and contaminated hands or surfaces, as can other coronaviruses. Studying their persistence under different environmental conditions represents a key step towards better understanding virus transmission. This work aimed to present a reproducible procedure for collecting stability and inactivation kinetics data from the scientific literature, in order to identify data useful for characterizing the persistence of viruses in food production plants. As a result, a large dataset related to persistence on matrices or in liquid media under different environmental conditions is presented. This procedure, combining a bibliographic survey, data digitization techniques and predictive microbiological modelling, identified 65 research articles providing 455 coronavirus kinetics. A ranking step as well as a technical validation with a Gage Repeatability & Reproducibility process were performed to check the quality of the kinetics. All data were deposited in public repositories for future use by other researchers.
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Food Handling , Kinetics , Plants, Edible , Reproducibility of Results , Databases, Factual
ABSTRACT
Assessment of potential human health risks associated with environmental and other agents requires careful evaluation of all available and relevant evidence for the agent of interest, including both data-rich and data-poor agents. With the advent of new approach methodologies in toxicological risk assessment, guidance on integrating evidence from multiple evidence streams is needed to ensure that all available data are given due consideration in both qualitative and quantitative risk assessment. The present report summarizes the discussions among academic, government, and private sector participants from North America and Europe in an international workshop convened to explore the development of an evidence-based risk assessment framework, taking into account all available evidence in an appropriate manner in order to arrive at the best possible characterization of potential human health risks and associated uncertainty. Although consensus among workshop participants was not a specific goal, there was general agreement on the key considerations involved in evidence-based risk assessment incorporating 21st century science into human health risk assessment. These considerations have been embodied in an overarching prototype framework for evidence integration that will be explored in more depth in a follow-up meeting.
Subject(s)
Risk Assessment , Humans , Europe
ABSTRACT
The foodborne disease burden (FBDB) related to 26 major biological hazards in France was attributed to foods and poor food-handling practices at the final food preparation step, in order to develop effective intervention strategies, especially food safety campaigns. Campylobacter spp. and non-typhoidal Salmonella accounted for more than 60% of the FBDB. Approximately 30% of the FBDB was attributed to 11 other hazards, including bacteria, viruses and parasites. Meats were estimated to be the main contributing food category, causing 50-69% (CI90) of the FBDB, with 33-44%, 9-21% and 4-20% (CI90) of the FBDB attributed to poultry, pork and beef, respectively. Dairy products, eggs, raw produce and complex foods each caused approximately 5-20% (CI90) of the FBDB. When foods are contaminated before the final preparation step, we estimated that inadequate cooking, cross-contamination and inadequate storage contribute 19-49%, 7-34% and 9-23% (CI90) of the FBDB, respectively; 15-33% (CI90) of the FBDB was attributed to the initial contamination of ready-to-eat foods, without any contribution from final food handlers. The thorough implementation of good hygienic practices (GHPs) at the final food preparation step could potentially reduce the FBDB by 67-85% (CI90), mainly through the prevention of cross-contamination and adequate cooking and storage.
ABSTRACT
Entomophagy has long been part of human diets in a significant part of the world, but insects are considered a novel food everywhere else. Insects appear to be a strategic alternative for the future of the human diet, helping to ensure food security for a growing world population with production systems that are more environmentally sustainable than those required for rearing other animals. Tenebrio molitor, the yellow mealworm, is one of the most interesting insect species for mass rearing, and can be processed into a powder that ensures a long shelf life and use in many potential products. When considering insects as food or feed, it is necessary to guarantee their safety. Manufacturers must therefore implement a Hazard Analysis and Critical Control Point (HACCP) plan to limit risks to consumers' health. The aim of this case study was to develop a HACCP plan for Tenebrio molitor larvae powders intended as food, using a risk-based approach, to support its implementation in industry. Specific purposes were to identify the significant biological hazards involved and to assess the efficiency of different manufacturing process steps when used as critical control points. Combinations of four different processes with four potential uses of the powders by consumers, in burgers, protein shakes, baby porridge and biscuits, were then analyzed with regard to their safety.
ABSTRACT
Food safety risk assessments and large-scale epidemiological investigations have the potential to provide better and new types of information when whole genome sequence (WGS) data are effectively integrated. Today, the NCBI Pathogen Detection database WGS collections have grown significantly through improvements in technology, coordination, and collaboration, such as the GenomeTrakr and PulseNet networks. However, high-quality genomic data are not often coupled with high-quality epidemiological or food chain metadata. We have created a set of tools for cleaning, curation, integration, analysis and visualization of microbial genome sequencing data. It has been tested using Salmonella enterica and Listeria monocytogenes data sets provided by NCBI Pathogen Detection (160,000 sequenced isolates in 2018). GenomeGraphR presents foodborne pathogen WGS data and associated curated metadata in a user-friendly interface that allows a user to query a variety of research questions, such as transmission sources and dynamics, global reach, and persistence of genotypes associated with contamination in the food supply and foodborne illness across time or space. The application is freely available (https://fda-riskmodels.foodrisk.org/genomegraphr/).
Subject(s)
Food Microbiology , Food Safety , Foodborne Diseases/microbiology , Whole Genome Sequencing/statistics & numerical data , Databases, Genetic , Foodborne Diseases/epidemiology , Genome, Bacterial , Humans , Internet , Listeria monocytogenes/genetics , Listeria monocytogenes/isolation & purification , Listeriosis/epidemiology , Listeriosis/microbiology , Metadata , Molecular Epidemiology , Polymorphism, Single Nucleotide , Risk Assessment , Salmonella Food Poisoning/epidemiology , Salmonella Food Poisoning/microbiology , Salmonella enterica/genetics , Software , User-Computer Interface
ABSTRACT
With increased interest in source attribution of foodborne pathogens, there is a need to sort and assess the applicability of currently available methods. Here we review the most frequently applied methods for source attribution of foodborne diseases, discussing their main strengths and weaknesses to be considered when choosing the most appropriate method based on the type, quality, and quantity of data available, the research questions to be addressed, and the (epidemiological and microbiological) characteristics of the pathogens in question. A variety of source attribution approaches have been applied in recent years. These methods can be defined as top-down, bottom-up, or combined. Top-down approaches assign human cases back to their sources of infection based on epidemiological (e.g., outbreak data analysis, case-control/cohort studies), microbiological (i.e., microbial subtyping), or combined (e.g., the so-called 'source-assigned case-control study' design) methods. Methods based on microbial subtyping are further differentiated according to the modeling framework adopted: frequency-matching models (e.g., the Dutch and Danish models) or population genetics models (e.g., Asymmetric Island Models and STRUCTURE), relying on the modeling of either phenotyping or genotyping data of pathogen strains from human cases and putative sources. Conversely, bottom-up approaches such as comparative exposure assessment start from the level of contamination (prevalence and concentration) of a given pathogen in each source, and then move up the transmission chain, incorporating factors related to human exposure to these sources and dose-response relationships. Other approaches include intervention studies, 'natural experiments,' and expert elicitations. A number of methodological challenges concerning all these approaches are discussed. In the absence of a universally agreed-upon 'gold standard', i.e., a single method that satisfies all situations and needs for all pathogens, combining different approaches or applying them in a comparative fashion seems to be a promising way forward.
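As a purely illustrative sketch of the frequency-matching idea, the snippet below attributes human cases of each subtype to putative sources in proportion to the relative occurrence of that subtype in each source; the figures are invented, and fuller Dutch and Danish (Hald-type) models additionally weight sources by factors such as the amount of food consumed.

```python
# Hedged sketch of a frequency-matching attribution: human cases of each
# subtype are split across putative sources in proportion to the relative
# occurrence of that subtype in each source (here without the additional
# source weights, such as amount of food consumed, used in fuller models).
# All figures are invented for illustration.
human_cases = {"ST-A": 120, "ST-B": 40, "ST-C": 15}   # reported cases per subtype

occurrence = {
    # subtype: {source: number of source isolates of that subtype}
    "ST-A": {"poultry": 30, "pork": 5, "cattle": 2},
    "ST-B": {"poultry": 1, "pork": 12, "cattle": 6},
    "ST-C": {"poultry": 0, "pork": 2, "cattle": 9},
}

attributed = {source: 0.0 for source in ("poultry", "pork", "cattle")}
for subtype, cases in human_cases.items():
    total = sum(occurrence[subtype].values())
    if total == 0:
        continue  # subtype never observed in any source: left unattributed here
    for source, count in occurrence[subtype].items():
        attributed[source] += cases * count / total

for source, n in attributed.items():
    print(f"{source}: {n:.1f} attributed human cases")
```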
ABSTRACT
EFSA received an application from the Dutch Competent Authority, under Article 20 of Regulation (EC) No 1069/2009 and Regulation (EU) No 142/2011, for the evaluation of an alternative method for treatment of Category 3 animal by-products (ABP). It consists of the hydrolysis of the material into short carbon chains, resulting in medium-chain fatty acids that may contain up to 1% hydrolysed protein, for use in animal feed. A physical process, with ultrafiltration followed by nanofiltration to remove hazards, is also used. Process efficacy has been evaluated based on the ability of the membrane barriers to retain the potential biological hazards present. Small viruses passing the ultrafiltration membrane will be retained at the nanofiltration step, which represents a Critical Control Point (CCP) in the process. This step requires the Applicant to validate and provide certification for the specific use of the nanofiltration membranes employed. Continuous monitoring and membrane integrity tests should be included as control measures in the HACCP plan. The ultrafiltration and nanofiltration techniques are able to remove particles the size of viruses, bacteria and parasites from liquids. If used under controlled and appropriate conditions, the processing methods proposed should reduce the risk in the end product to a degree that is at least equivalent to that achieved with the processing standards laid down in the Regulation for Category 3 material. The possible presence of small bacterial toxins produced during the fermentation steps cannot be avoided by the nanofiltration step, and this hazard should be controlled by a CCP elsewhere in the process. The limitations specified in the current legislation, and any future modifications in relation to the end use of the product, also apply to this alternative process, and no hydrolysed protein of ruminant origin (except from ruminant hides and skins) can be included in feed for farmed animals or for aquaculture.