Results 1 - 6 of 6
1.
J Food Prot ; 83(5): 767-778, 2020 May 01.
Article in English | MEDLINE | ID: mdl-32294762

ABSTRACT

ABSTRACT: According to the U.S. Food and Drug Administration's (FDA's) rule on "Prevention of Salmonella Enteritidis in Shell Eggs during Production, Storage, and Transportation," shell eggs intended for human consumption are required to be held or transported at or below 45°F (7.2°C) ambient temperature beginning 36 h after time of lay. Meanwhile, eggs in hatcheries are typically stored at a temperature of 65°F (18.3°C). Although most of those eggs are directed to incubators for hatching, excess eggs have the potential to be diverted for human consumption as egg products through the "breaker" market if these eggs are refrigerated in accordance with FDA's requirement. Combining risk assessment models developed by the U.S. Department of Agriculture's Food Safety and Inspection Service for shell eggs and for egg products, we quantified and compared Salmonella Enteritidis levels in eggs held at 65°F versus 45°F, Salmonella Enteritidis levels in the resulting egg products, and the risk of human salmonellosis from consumption of those egg products. For eggs stored 5 days at 65°F (following 36 h at 75°F [23.9°C] in the layer house), the mean level of Salmonella Enteritidis contamination is 30-fold higher than for eggs stored at 45°F. These increased levels of contamination lead to a 47-fold increase in the risk of salmonellosis from consumption of egg products made from these eggs, with some variation in the public health risk depending on the egg product type (e.g., whole egg versus whole egg with added sugar). Assuming that 7% of the liquid egg product supply originates from eggs stored at 65°F versus 45°F, this study estimates an additional burden of 3,562 cases of salmonellosis per year in the United States. A nominal range uncertainty analysis suggests that the relative increase in risk linked to the storage of eggs at the higher temperature is robust to the uncertainty surrounding the model parameters. The diversion of eggs from broiler hatcheries to human consumption under the current storage practice of 65°F (versus 45°F) would therefore present a substantial overall increase in the risk of salmonellosis.
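The population-scale arithmetic behind these figures can be sketched directly from the abstract's numbers, under the simplifying (and not stated) assumption that per-serving risks mix linearly across the supply; the baseline case count in the sketch is a hypothetical placeholder, not a value from the study.

```python
# Relative-risk mixture using the abstract's figures: 7% of the liquid
# egg product supply comes from eggs stored at 65F, each such serving
# carrying 47-fold the baseline salmonellosis risk.
share_65F = 0.07
rr_65F = 47.0

# Blended relative risk across the whole supply, assuming per-serving
# risks combine linearly (a simplification, not stated in the study).
blended_rr = (1.0 - share_65F) * 1.0 + share_65F * rr_65F
print(f"blended relative risk: {blended_rr:.2f}")

# Additional annual cases for a hypothetical baseline of B cases/year
# attributable to egg products (B is illustrative, not from the study).
B = 1000
extra_cases = B * (blended_rr - 1.0)
print(f"extra cases per year at baseline {B}: {extra_cases:.0f}")
```

Even with only 7% of the supply affected, the 47-fold per-serving factor dominates the blended risk, which is why the study's projected burden is substantial.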


Subject(s)
Egg Shell/microbiology , Food Storage/instrumentation , Salmonella Food Poisoning , Salmonella enteritidis/growth & development , Animals , Chickens , Eggs/microbiology , Food Microbiology , Food Safety , Humans , Salmonella Food Poisoning/etiology , United States
2.
J Food Prot ; 79(7): 1076-88, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27357026

ABSTRACT

Cross-contamination, improper holding temperatures, and insufficient sanitary practices are known retail practices that may lead to product contamination and growth of Listeria monocytogenes. However, the relative importance of control options to mitigate the risk of invasive listeriosis from ready-to-eat (RTE) products sliced or prepared at retail is not well understood. This study illustrates the utility of a quantitative risk assessment model described in the first article of this series (Pouillot, R., D. Gallagher, J. Tang, K. Hoelzer, J. Kause, and S. B. Dennis, J. Food Prot. 78:134-145, 2015) to evaluate the public health impact associated with changes in retail deli practices and interventions. Twenty-two mitigation scenarios were modeled and evaluated under six different baseline conditions. These scenarios were related to sanitation, worker behavior, use of growth inhibitors, cross-contamination, storage temperature control, and reduction of the level of L. monocytogenes on incoming RTE food products. The mean risk per serving of RTE products obtained under these scenarios was then compared with the risk estimated in the baseline condition. Some risk mitigations had a consistent impact on the predicted listeriosis risk in all baseline conditions (e.g., presence or absence of a growth inhibitor), whereas others were greatly dependent on the initial baseline conditions or practices in the deli (e.g., preslicing of products). Overall, control of bacterial growth and control of contamination at its source were major factors of listeriosis risk in these settings. Although control of cross-contamination and continued sanitation were also important, the decrease in the predicted risk was not amenable to a simple solution. Findings from these predictive scenario analyses are intended to encourage improvements to retail food safety practices and mitigation strategies to control L. monocytogenes in RTE foods more effectively and to demonstrate the utility of quantitative risk assessment models to inform risk management decisions.


Subject(s)
Listeria monocytogenes , Meat Products/microbiology , Food Contamination , Food Microbiology , Humans , Listeriosis , Risk Assessment
3.
J Food Prot ; 78(1): 134-45, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25581188

ABSTRACT

The Interagency Risk Assessment-Listeria monocytogenes (Lm) in Retail Delicatessens provides a scientific assessment of the risk of listeriosis associated with the consumption of ready-to-eat (RTE) foods commonly prepared and sold in the delicatessen (deli) of a retail food store. The quantitative risk assessment (QRA) model simulates the behavior of retail employees in a deli department and tracks the Lm potentially present in this environment and in the food. Bacterial growth, bacterial inactivation (following washing and sanitizing actions), and cross-contamination (from object to object, from food to object, or from object to food) are evaluated through a discrete event modeling approach. The QRA evaluates the risk per serving of deli-prepared RTE food for the susceptible and general population, using a dose-response model from the literature. This QRA considers six separate retail baseline conditions and provides information on the predicted risk of listeriosis for each. Among the baseline conditions considered, the model predicts that (i) retail delis without an environmental source of Lm (such as niches), retail delis without niches that do apply temperature control, and retail delis with niches that do apply temperature control lead to lower predicted risk of listeriosis relative to retail delis with niches and (ii) retail delis with incoming RTE foods that are contaminated with Lm lead to higher predicted risk of listeriosis, directly or through cross-contamination, whether the contaminated incoming product supports growth or not. The risk assessment predicts that listeriosis cases associated with retail delicatessens result from a sequence of key events: (i) the contaminated RTE food supports Lm growth; (ii) improper retail and/or consumer storage temperature or handling results in the growth of Lm on the RTE food; and (iii) the consumer of this RTE food is susceptible to listeriosis. The risk assessment model, therefore, predicts that cross-contamination with Lm at retail predominantly results in sporadic cases.
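The discrete event approach the abstract describes can be sketched minimally: objects and foods carry Lm counts, and each event (slicing, sanitizing) transfers or inactivates cells. Every event type, transfer fraction, and log reduction below is an illustrative placeholder, not a parameter from the QRA.

```python
# Minimal discrete-event sketch: a slicer and a deli meat exchange
# L. monocytogenes cells via a transfer fraction; sanitizing applies a
# log reduction on the slicer. All parameters are illustrative only.
state = {"slicer": 1000.0, "meat": 0.0}  # CFU on each object

def slice_event(state, transfer_frac=0.1):
    # object-to-food transfer during slicing
    moved = state["slicer"] * transfer_frac
    state["slicer"] -= moved
    state["meat"] += moved

def sanitize_event(state, log_reduction=3.0):
    # washing/sanitizing inactivates cells on the slicer
    state["slicer"] *= 10.0 ** (-log_reduction)

# a short event sequence: slice, slice, sanitize, slice
for event in (slice_event, slice_event, sanitize_event, slice_event):
    event(state)

print(state)
```

The sequence illustrates the model's qualitative point: cells transferred to food before sanitation dominate the final contamination, so controlling contamination at its source matters more than any single cleaning step.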


Subject(s)
Food Contamination/analysis , Listeria monocytogenes/growth & development , Meat Products/microbiology , Disinfection/methods , Equipment Contamination , Food Microbiology , Humans , Listeriosis/epidemiology , Microbial Viability , Models, Theoretical , Risk Assessment , Small Business , Temperature
4.
Risk Anal ; 35(1): 90-108, 2015 Jan.
Article in English | MEDLINE | ID: mdl-24975545

ABSTRACT

Evaluations of Listeria monocytogenes dose-response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well-established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal-Poisson dose-response model was chosen, and proved able to reconcile dose-response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta-Poisson dose-response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose-response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.
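The exponential model referenced above gives P(ill | dose) = 1 - exp(-r x dose) for a single parameter r; letting r vary across strains and hosts yields a marginal risk averaged over that variability. The sketch below shows one common construction of such a model by Monte Carlo, with log10(r) drawn from a normal distribution; the parameters are illustrative placeholders, not the fitted subgroup values from the paper.

```python
import math
import random

def p_ill_exponential(dose, r):
    """Classical exponential dose-response: P(ill) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def p_ill_marginal(dose, mu, sigma, n=100_000, seed=0):
    """Risk averaged over strain/host variability, with log10(r) drawn
    from Normal(mu, sigma) -- one common construction of a lognormal-style
    dose-response model. mu and sigma are illustrative, not fitted."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        r = 10.0 ** rng.gauss(mu, sigma)
        total += 1.0 - math.exp(-r * dose)
    return total / n

dose = 1e6  # mean ingested dose in CFU (illustrative)
fixed = p_ill_exponential(dose, r=1e-12)
marginal = p_ill_marginal(dose, mu=-12.0, sigma=3.0)
print(f"fixed r:    {fixed:.3e}")
print(f"variable r: {marginal:.3e}")
```

With the same median r, the heavy upper tail of the variable-r model inflates the marginal risk by orders of magnitude; this extra flexibility for susceptible subgroups and virulent strains is what the abstract reports the classical forms lacked.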


Subject(s)
Host-Parasite Interactions , Listeria monocytogenes/pathogenicity , Models, Theoretical , Virulence
5.
J Food Prot ; 76(3): 376-85, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23462073

ABSTRACT

Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
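The pipeline the abstract describes (prevalence, concentration, process stages, consumption, dose-response) can be sketched as a toy Monte Carlo risk ranking of food-hazard pairs. Every distribution, parameter, and food name below is an invented placeholder to show the shape of the computation, not iRISK's data, models, or interface.

```python
import math
import random

def mean_risk_per_serving(prevalence, log10_conc, log10_growth,
                          serving_g, r, n=50_000, seed=0):
    """Toy food-hazard pair model: prevalence times the Monte Carlo mean
    of an exponential dose-response, with normal variability on log10
    concentration and log10 net growth. All parameters illustrative."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        conc = 10.0 ** rng.gauss(*log10_conc)      # CFU/g at production
        growth = 10.0 ** rng.gauss(*log10_growth)  # net change to consumption
        dose = conc * growth * serving_g
        total += 1.0 - math.exp(-r * dose)         # exponential dose-response
    return prevalence * total / n

# Rank two hypothetical food-hazard pairs by mean risk per serving
pairs = {
    "deli meat / Lm": mean_risk_per_serving(0.02, (-1, 1), (2, 1), 50, 1e-10),
    "ice cream / Lm": mean_risk_per_serving(0.01, (-1, 1), (0, 0.5), 80, 1e-10),
}
for name, risk in sorted(pairs.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {risk:.2e}")
```

Here the pair that supports growth dominates the ranking despite similar starting contamination, which is the kind of comparison across food-hazard pairs that a tool like iRISK is built to make systematically.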


Subject(s)
Food Contamination/analysis , Food Handling/standards , Food Microbiology , Food/standards , Risk Assessment , Food Safety , Foodborne Diseases/prevention & control , Humans , Models, Theoretical , Monte Carlo Method , United States , United States Food and Drug Administration
6.
J Food Prot ; 67(3): 616-23, 2004 Mar.
Article in English | MEDLINE | ID: mdl-15035384

ABSTRACT

Local health departments that investigate foodborne disease outbreaks do not have adequate guidelines for collecting data that could be used to estimate dose-response relationships, a key component of hazard characterization in quantitative microbial risk assessment. To meet this need, criteria and a questionnaire template for the collection of appropriate dose-response data in the context of outbreaks were developed and applied in the investigation of a point-source outbreak linked to Salmonella serotype Enteritidis in a salmon entrée in February 2000. In this outbreak, the attack rate and risk of hospitalization increased with the amount of salmon entrée consumed, and detailed data were obtained on illness severity measures and host susceptibility factors. Local health departments might consider broadening investigations to include the collection of additional data when investigating outbreaks that have met a specific set of conditions. These data could provide information needed by federal regulatory agencies and other organizations for quantitative microbial risk assessment. Intensive investigations of outbreaks could prevent future illnesses by providing information needed to develop approaches to minimizing risk.
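The key dose-response signal such a questionnaire captures, an attack rate that rises with the amount consumed, can be tabulated directly from an outbreak line list. The records below are invented for illustration, not data from the February 2000 investigation.

```python
# Hypothetical outbreak line list: (servings of the implicated entree
# consumed, fell ill?). Data are invented to show the attack-rate-by-dose
# tabulation that a structured outbreak questionnaire enables.
line_list = [
    (0.5, False), (0.5, False), (0.5, True),
    (1.0, False), (1.0, True), (1.0, True),
    (2.0, True), (2.0, True), (2.0, True), (2.0, False),
]

def attack_rate_by_dose(records):
    """Attack rate (ill / exposed) per consumed-amount category."""
    buckets = {}
    for amount, ill in records:
        n, cases = buckets.get(amount, (0, 0))
        buckets[amount] = (n + 1, cases + ill)
    return {a: cases / n for a, (n, cases) in sorted(buckets.items())}

rates = attack_rate_by_dose(line_list)
print(rates)
```

A monotone trend like this one, combined with a measured concentration in the implicated food, is exactly the pairing of dose and response that quantitative microbial risk assessment needs from outbreak investigations.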


Subject(s)
Foodborne Diseases/prevention & control , Risk Assessment/methods , Chicago/epidemiology , Colony Count, Microbial , Consumer Product Safety , Data Collection , Disease Outbreaks , Food Handling/methods , Food Microbiology , Foodborne Diseases/epidemiology , Forecasting , Guidelines as Topic , Humans , Salmonella Food Poisoning/epidemiology , Salmonella Food Poisoning/prevention & control , Time Factors