1.
Front Nutr ; 11: 1290680, 2024.
Article in English | MEDLINE | ID: mdl-38425480

ABSTRACT

Qualitative and quantitative risk-benefit assessments (RBA) can be used to support public health decisions in food safety. We conducted an evidence scan to understand the state of the science regarding RBA in seafood to help inform seafood dietary advice in the United States. We collected published RBA studies assessing seafood consumption, designed inclusion and exclusion criteria to screen these studies, and conducted systematic data extraction for the relevant studies published since 2019. Our findings indicate the selection of health risks and benefits does not generally follow a systematic approach. Uncertainty and variability in RBAs are often not addressed, and quantitative RBAs making use of a single health metric generally have not been leveraged to directly support published regulatory decisions or dietary guidance. To elevate the role of RBA in supporting regulatory decision-making, risk assessors and risk managers must work together to set expectations and goals. We identified the need for a prioritization phase (e.g., a multicriteria decision analysis model) to determine the risks and benefits of greatest public health impact to inform the RBA design. This prioritization would consider not only the degree of public health impact of each risk and benefit, but also the potential for risks and benefits to converge on common health outcomes and their importance to subpopulations. Including a prioritization could improve the utility of RBAs to better inform risk management decisions and advance public health. Our work serves to guide the United States Food and Drug Administration's approaches to RBA in foods.
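As a rough illustration of the prioritization phase described above, a weighted-sum multicriteria decision analysis (MCDA) can rank candidate risks and benefits. Everything below — the criteria, weights, candidates, and scores — is hypothetical, not taken from the evidence scan:

```python
# Hypothetical weighted-sum MCDA sketch for prioritizing risks/benefits
# in a seafood RBA. Criteria, weights, and 0-10 scores are illustrative only.

CRITERIA_WEIGHTS = {          # must sum to 1.0
    "public_health_impact": 0.5,
    "overlap_with_other_outcomes": 0.3,
    "subpopulation_importance": 0.2,
}

def mcda_score(scores: dict) -> float:
    """Weighted sum of criterion scores for one candidate risk/benefit."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

candidates = {  # made-up candidates and scores
    "methylmercury_neurodevelopment": {"public_health_impact": 9, "overlap_with_other_outcomes": 7, "subpopulation_importance": 9},
    "omega3_cardiovascular_benefit":  {"public_health_impact": 8, "overlap_with_other_outcomes": 6, "subpopulation_importance": 5},
    "dioxins_cancer":                 {"public_health_impact": 5, "overlap_with_other_outcomes": 3, "subpopulation_importance": 4},
}

ranking = sorted(candidates, key=lambda k: mcda_score(candidates[k]), reverse=True)
```

The highest-scoring risks and benefits would then define the scope of the RBA itself.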

2.
Regul Toxicol Pharmacol ; 144: 105487, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37640100

ABSTRACT

The U.S. Food and Drug Administration (FDA) developed an oral toxicological reference value (TRV) for characterizing potential health concerns from dietary exposure to cadmium (Cd). The development of the TRV leveraged the FDA's previously published research, including (1) a systematic review for adverse health effects associated with oral Cd exposure and (2) a human physiologically based pharmacokinetic (PBPK) model adapted from Kjellstrom and Nordberg (1978) for use in reverse dosimetry applied to the U.S. population. Adverse effects of Cd on the bone and kidney are associated with similar points of departure (PODs) of approximately 0.50 µg Cd/g creatinine for females aged 50-60 based on available epidemiologic data. We also used the upper bound estimate of the renal cortical concentration (50 µg/g Cd) occurring in the U.S. population at 50 years of age as a POD. Based on the output from our reverse dosimetry PBPK model, a range of 0.21-0.36 µg/kg bw/day was developed for the TRV. The animal data used for the animal TRV derivation (0.63-1.8 µg/kg bw/day) confirm biological plausibility for both the bone and kidney endpoints.
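Reverse dosimetry inverts a pharmacokinetic model to find the chronic intake that produces a target internal dose. The sketch below is a deliberately minimal one-compartment stand-in, not the Kjellstrom and Nordberg PBPK model the FDA used; every parameter value is an illustrative assumption, and the resulting intake should not be read as the FDA's TRV:

```python
import math

# Minimal one-compartment reverse-dosimetry sketch (NOT the FDA's adapted
# Kjellstrom-Nordberg PBPK model): solve for the chronic daily intake that
# yields a target renal cortex Cd concentration after `years` of exposure.
# All parameter values below are illustrative assumptions.

def steady_intake_for_target(target_ug_g, years, f_gut=0.05, f_kidney=0.33,
                             kidney_mass_g=300.0, half_life_y=20.0):
    """Invert C(t) = (f_gut*f_kidney*I*365/(k*M)) * (1 - exp(-k*t)) for I (µg/day)."""
    k = math.log(2) / half_life_y                 # first-order loss rate, per year
    accumulation = (1 - math.exp(-k * years)) / k  # effective years of buildup
    return target_ug_g * kidney_mass_g / (f_gut * f_kidney * 365.0 * accumulation)

# Example: intake consistent with a 50 µg/g renal cortex POD reached at age 50
intake = steady_intake_for_target(50.0, years=50.0)
```

Note that longer exposure durations require a lower daily intake to reach the same target concentration, which is why the age at which the POD occurs matters.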


Subject(s)
Cadmium, Environmental Exposure, Female, Animals, Humans, Middle Aged, Cadmium/toxicity, Environmental Exposure/adverse effects, Reference Values, Food, Kidney
3.
Article in English | MEDLINE | ID: mdl-33735599

ABSTRACT

In food safety, process pathway risk assessments usually estimate the risk of illness from a single hazard and a single food and can inform food safety decisions and consumer advice. To evaluate the health impact of a potential change in diet, we need to understand not only the risk posed by the considered hazard and food but also the risk posed by the substitution food and other potential hazards. We developed a framework to provide decision-makers with a multi-faceted evaluation of the impact of dietary shifts on risk of illness. Our case study explored exposure to inorganic arsenic (iAs) and aflatoxins through consumption of infant cereals and the risk of developing lung, bladder and liver cancer over a lifetime. The estimated additional Disability-Adjusted Life Year (DALY) burden in the U.S. from exposure to iAs and aflatoxin, based on available contamination and consumption patterns of infant rice and oat cereal, is 4,921 (90% CI 414; 9,071). If all infant cereal consumers shift intake (maintaining equivalent serving size and frequency) to only consuming infant rice cereal, the predicted DALY burden increases to 6,942 (90% CI 326; 12,931). If all infant cereal consumers shift intake to only consuming infant oat cereal, the predicted DALY burden decreases to 1,513 (90% CI 312; 3,356). Changes in contaminant concentrations or in the percentage of consumers, which could occur in the future, also significantly affect the predicted risk. Uncertainty in these risk predictions is primarily driven by the dose-response models. A risk-risk analysis framework provides decision-makers with a nuanced understanding of the public health impact of dietary changes and can be applied to other food safety and nutrition questions.
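The risk-risk comparison above can be sketched as a simple population DALY calculation under alternative consumption mixes. All numeric inputs below (slope factors, DALY weights, exposures, population size) are hypothetical placeholders, not the study's fitted values:

```python
# Illustrative risk-risk comparison in the spirit of the framework above:
# lifetime DALYs from two contaminants under alternative consumption mixes.
# All numeric inputs are hypothetical placeholders, not the study's values.

SLOPE = {"iAs": 1.5e-3, "aflatoxin": 4.0e-3}    # cancer cases per (µg/kg bw/day), hypothetical
DALY_PER_CASE = {"iAs": 12.0, "aflatoxin": 15.0}  # hypothetical severity weights
EXPOSURE = {  # mean µg/kg bw/day by cereal type, hypothetical
    "rice": {"iAs": 0.30, "aflatoxin": 0.00},
    "oat":  {"iAs": 0.02, "aflatoxin": 0.05},
}

def daly(mix: dict, population: float = 4_000_000) -> float:
    """Population DALYs for a consumption mix, e.g. {'rice': 0.6, 'oat': 0.4}."""
    total = 0.0
    for cereal, share in mix.items():
        for hazard, dose in EXPOSURE[cereal].items():
            total += share * population * dose * SLOPE[hazard] * DALY_PER_CASE[hazard]
    return total

baseline = daly({"rice": 0.6, "oat": 0.4})
all_rice = daly({"rice": 1.0})
all_oat  = daly({"oat": 1.0})
```

With these made-up inputs the all-rice scenario carries the highest burden and the all-oat scenario the lowest, mirroring the direction (not the magnitude) of the study's results.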


Subject(s)
Eating, Edible Grain/chemistry, Food Analysis, Food Contamination/analysis, Infant Food/analysis, Neoplasms/diagnosis, Food Safety, Humans, Infant, Risk Assessment
4.
PLoS One ; 15(4): e0231393, 2020.
Article in English | MEDLINE | ID: mdl-32352974

ABSTRACT

Whole genome sequencing (WGS) was performed on 201 Listeria monocytogenes isolates recovered from 102 of 27,389 refrigerated ready-to-eat (RTE) food samples purchased at retail in U.S. FoodNet sites as part of the 2010-2013 interagency L. monocytogenes Market Basket Survey (Lm MBS). Core genome multi-locus sequence typing (cgMLST) and in-silico analyses were conducted, and these data were analyzed with metadata for isolates from five food groups: produce, seafood, dairy, meat, and combination foods. Six of 201 isolates, from 3 samples, were subsequently confirmed as L. welshimeri. Three samples contained one isolate per sample; among the 96 samples that contained two isolates per sample, 3 samples each contained two different strains and 93 samples each contained duplicate isolates. After 93 duplicate isolates were removed, the remaining 102 isolates were delineated into 29 clonal complexes (CCs) or singletons based on their sequence type. The five most prevalent CCs were CC155, CC1, CC5, CC87, and CC321. Shannon's diversity index for clones per food group ranged from 1.49 for dairy to 2.32 for produce isolates, which were not significantly different in pairwise comparisons. The most common molecular serogroup as determined by in-silico analysis was IIa (45.6%), followed by IIb (27.2%), IVb (20.4%), and IIc (4.9%). The proportions of isolates within lineages I, II, and III were 48.0%, 50.0% and 2.0%, respectively. Full-length inlA was present in 89.3% of isolates. Listeria pathogenicity island 3 (LIPI-3) and LIPI-4 were found in 51% and 30.6% of lineage I isolates, respectively. Stress survival islet 1 (SSI-1) was present in 34.7% of lineage I isolates, 80.4% of lineage II isolates and the 2 lineage III isolates; SSI-2 was present only in the CC121 isolate. Plasmids were found in 48% of isolates, including 24.5% of lineage I isolates and 72.5% of lineage II isolates.
Among the plasmid-carrying isolates, 100% contained at least one cadmium resistance cassette and 89.8% contained bcrABC, involved in quaternary ammonium compound tolerance. Multiple clusters of isolates from different food samples were identified by cgMLST which, along with available metadata, could aid in the investigation of possible cross-contamination and persistence events.
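Shannon's diversity index used above to compare clonal diversity across food groups is H' = -Σ p_i ln p_i over the proportions of isolates in each clonal complex. A minimal sketch with made-up isolate counts:

```python
import math

# Shannon's diversity index H' = -sum(p_i * ln p_i) over clonal-complex
# proportions, as used above to compare diversity across food groups.
# The isolate counts below are made up for illustration.

def shannon_index(counts):
    """H' for a list of category counts (here, isolates per clonal complex)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

produce_ccs = [5, 4, 3, 3, 2, 2, 1, 1, 1, 1]   # hypothetical isolates per CC
dairy_ccs   = [6, 3, 2, 1, 1]

h_produce = shannon_index(produce_ccs)
h_dairy = shannon_index(dairy_ccs)
```

More clonal complexes with a more even spread of isolates yield a higher index, which is the sense in which produce isolates were more diverse than dairy isolates.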


Subject(s)
Food Microbiology, Genetic Variation, Listeria monocytogenes/genetics, Virulence/genetics, Bacterial Proteins/genetics, DNA, Bacterial/chemistry, DNA, Bacterial/metabolism, Humans, Listeria monocytogenes/classification, Listeria monocytogenes/isolation & purification, Listeria monocytogenes/pathogenicity, Listeriosis/pathology, Listeriosis/transmission, Multilocus Sequence Typing, Phylogeny, Plasmids/genetics, Plasmids/metabolism, Serogroup, Whole Genome Sequencing
5.
J Food Prot ; 83(5): 767-778, 2020 May 01.
Article in English | MEDLINE | ID: mdl-32294762

ABSTRACT

According to the U.S. Food and Drug Administration's (FDA's) rule on "Prevention of Salmonella Enteritidis in Shell Eggs during Production, Storage, and Transportation," shell eggs intended for human consumption are required to be held or transported at or below 45°F (7.2°C) ambient temperature beginning 36 h after time of lay. Meanwhile, eggs in hatcheries are typically stored at a temperature of 65°F (18.3°C). Although most of those eggs are directed to incubators for hatching, excess eggs have the potential to be diverted for human consumption as egg products through the "breaker" market if these eggs are refrigerated in accordance with FDA's requirement. Combining risk assessment models developed by the U.S. Department of Agriculture's Food Safety and Inspection Service for shell eggs and for egg products, we quantified and compared Salmonella Enteritidis levels in eggs held at 65°F versus 45°F, Salmonella Enteritidis levels in the resulting egg products, and the risk of human salmonellosis from consumption of those egg products. For eggs stored 5 days at 65°F (following 36 h at 75°F [23.9°C] in the layer house), the mean level of Salmonella Enteritidis contamination is 30-fold higher than for eggs stored at 45°F. These increased levels of contamination lead to a 47-fold increase in the risk of salmonellosis from consumption of egg products made from these eggs, with some variation in the public health risk on the basis of the egg product type (e.g., whole egg versus whole egg with added sugar). Assuming that 7% of the liquid egg product supply originates from eggs stored at 65°F versus 45°F, this study estimates an additional burden of 3,562 cases of salmonellosis per year in the United States. A nominal range uncertainty analysis suggests that the relative increase in the risk linked to the storage of eggs at higher temperature estimated in this study is robust to the uncertainty surrounding the model parameters.
The diversion of eggs from broiler production to human consumption under the current storage practices of 65°F (versus 45°F) would present a substantive overall increase in the risk of salmonellosis.
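The burden estimate above reduces to simple arithmetic: if a fraction f of the liquid egg supply comes from 65°F-stored eggs carrying an rr-fold relative risk, the extra cases are baseline × f × (rr − 1). The baseline case count below is a hypothetical value chosen only to show the shape of the calculation:

```python
# Back-of-the-envelope version of the burden calculation above: if a
# fraction f of the liquid egg supply comes from eggs stored at 65°F,
# each such serving carrying rr-fold the baseline risk, the extra cases
# are baseline * f * (rr - 1). The baseline count here is hypothetical.

def additional_cases(baseline_cases: float, f_diverted: float, rr: float) -> float:
    """Extra annual illnesses from diverting a fraction of supply to higher risk."""
    return baseline_cases * f_diverted * (rr - 1)

# With the study's 7% diversion share and 47-fold relative risk, a
# hypothetical baseline of ~1,100 cases/year implies ~3,500 extra cases.
extra = additional_cases(1_100, 0.07, 47)
```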


Subject(s)
Egg Shell/microbiology, Food Storage/instrumentation, Salmonella Food Poisoning, Salmonella enteritidis/growth & development, Animals, Chickens, Eggs/microbiology, Food Microbiology, Food Safety, Humans, Salmonella Food Poisoning/etiology, United States
6.
Regul Toxicol Pharmacol ; 111: 104579, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31945454

ABSTRACT

FDA developed the interim reference level (IRL) for lead of 3 µg/day in children and 12.5 µg/day in women of childbearing age (WOCBA) to better protect the fetus from lead toxicity. These IRLs correspond to a blood lead level (BLL) of 0.5 µg/dL in both populations. The current investigation was performed to determine whether the IRL for WOCBA should apply to the general population of adults. A literature review of epidemiological studies was conducted to determine whether a BLL of 0.5 µg/dL is associated with adverse effects in adults. Some studies reported adverse effects over a wide range of BLLs that included 0.5 µg/dL, adding uncertainty to conclusions about effects at 0.5 µg/dL; however, no studies clearly identified this BLL as an adverse effect level. Results also showed that the previously developed provisional total tolerable daily intake (PTTDI) for adults of 75 µg/day lead may not be health protective, supporting use of a lower reference value for lead toxicity in this population group. Use of the 12.5 µg/day IRL as a benchmark for dietary lead intake is one way FDA will ensure that dietary lead intake in adults is reduced.


Subject(s)
Dietary Exposure/adverse effects, Dietary Exposure/standards, Lead/administration & dosage, Lead/adverse effects, Adult, Environmental Pollutants, Humans, Lead/blood
7.
J Food Sci ; 85(2): 260-267, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31957884

ABSTRACT

Cadmium has long been recognized as an environmental contaminant that poses risks to human health. Cadmium is of particular concern because nearly everyone in the general population is exposed to the metal through the food supply and because the element can accumulate in the body over a lifetime. In support of the United States Food and Drug Administration's (FDA) Toxic Elements Working Group's efforts to reduce the risks associated with elements in food, this review sought to identify current or new mitigation efforts that have the potential to reduce exposure to cadmium throughout the food supply chain. Cadmium contamination of foods can occur at various stages, including agronomic production, processing, and consumer preparation. The presence of cadmium in food is variable and depends on the geographical location, the bioavailability of cadmium from the soil, crop genetics, the agronomic practices used, and postharvest operations. Although there are multiple points in the food supply system where foods can be contaminated and mitigations can be applied, a key step in reducing cadmium in the diet is to reduce or prevent initial uptake by plants consumed as food or feed crops. Due to the complex interactions of soil chemistry, plant genetics, and agronomic practices, additional research is needed. Support for field-based experimentation and testing is needed to inform risk modeling and to develop practical farm-specific management strategies. This review can also assist the FDA in determining where to focus resources so that research and regulatory efforts can have the greatest impact on reducing cadmium exposures from the food supply. PRACTICAL APPLICATION: The presence of cadmium in food is highly variable and highly dependent on the geographical location, the bioavailability of cadmium from the soil, crop genetics, and the agronomic practices used.
This study can assist the FDA in determining where to focus resources so that research and regulatory efforts can have the greatest impact on reducing cadmium exposures from the food supply.


Subject(s)
Cadmium/analysis, Dietary Exposure/prevention & control, Food Contamination/prevention & control, Animals, Cadmium/toxicity, Crops, Agricultural/chemistry, Dietary Exposure/analysis, Food Contamination/analysis, Humans, Soil Pollutants/analysis, Soil Pollutants/toxicity
8.
Article in English | MEDLINE | ID: mdl-31647750

ABSTRACT

Dietary exposures to lead were estimated for older children, females of childbearing age and adults based on lead concentration data from the FDA's Total Diet Study and on food consumption data from What We Eat In America (WWEIA), the food survey portion of the National Health and Nutrition Examination Survey (NHANES). Estimated mean exposures varied based on the population and on the three different substitution scenarios for lead values below the limit of detection (non-detects = 0; non-detects = limit of detection; hybrid approach). Estimated mean lead exposures range from 1.4 to 4.0 µg/day for older children (males and females 7-17 years), 1.6 to 4.6 µg/day for women of childbearing age (females 16-49 years) and 1.7 to 5.3 µg/day for adults (males and females 18 years and older). Estimated 90th percentile lead exposures range from 2.3 to 5.8 µg/day for older children, 2.8 to 6.7 µg/day for women of childbearing age and 3.2 to 7.8 µg/day for adults. Exposure estimates suggest some older children may be exposed to dietary lead above the FDA interim reference level for lead in children of 3 µg/day. The results of this study can be used by the FDA to prioritise research and regulatory efforts in the area of dietary lead exposure.
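The three non-detect substitution scenarios can be sketched as below, on fabricated concentration data. Note the "hybrid" here simply uses LOD/2 as a middle-bound convention; the FDA's actual hybrid approach may assign non-detects differently:

```python
# The three non-detect substitution scenarios described above, sketched on
# fabricated lead-concentration data. "Hybrid" here follows a common
# convention (ND = LOD/2); the FDA's hybrid approach may differ.

LOD = 0.005  # mg/kg, hypothetical limit of detection

# (value, detected?) pairs; None marks a non-detect
samples = [(0.012, True), (None, False), (0.020, True), (None, False), (None, False)]

def mean_concentration(samples, nd_value):
    """Mean concentration with non-detects replaced by nd_value."""
    vals = [v if det else nd_value for v, det in samples]
    return sum(vals) / len(vals)

lower  = mean_concentration(samples, 0.0)        # non-detects = 0
upper  = mean_concentration(samples, LOD)        # non-detects = LOD
hybrid = mean_concentration(samples, LOD / 2)    # assumed middle-bound convention
```

The spread between the lower- and upper-bound means is why the abstract reports exposure ranges rather than single point estimates.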


Subject(s)
Diet Surveys, Diet/statistics & numerical data, Food Contamination/analysis, Lead/analysis, United States Food and Drug Administration, Adolescent, Adult, Child, Female, Humans, Male, Middle Aged, United States, Young Adult
9.
Regul Toxicol Pharmacol ; 110: 104516, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31707132

ABSTRACT

Reducing lead exposure is a public health priority for the US Food and Drug Administration as well as other federal agencies. The goals of this research were to 1) update the maximum daily dietary intake of lead from food, termed an interim reference level (IRL), for children and for women of childbearing age (WOCBA) and 2) confirm through a literature review that, with the exception of neurodevelopment (which was not evaluated here), no adverse effects of lead consistently occur at the blood lead level (BLL) associated with the IRL. Because no safe level of lead exposure has yet been identified for children's health, the IRLs of 3 µg/day for children and 12.5 µg/day for WOCBA were derived from the Centers for Disease Control and Prevention reference value of 5 µg/dL BLL, the level at which public health actions should be initiated. The literature review showed that no adverse effects of lead consistently occurred at the BLL associated with the IRLs (0.5 µg/dL). The IRLs of 3 µg/day for children and 12.5 µg/day for WOCBA should serve as useful benchmarks in evaluating the potential for adverse effects of dietary lead.
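The IRL arithmetic can be sketched as a target blood lead level divided by a diet-to-blood slope factor. The slope values below are assumptions based on commonly cited estimates, not values quoted in this abstract:

```python
# Sketch of the IRL derivation described above: the CDC reference value of
# 5 µg/dL with a 10x safety margin gives a 0.5 µg/dL target BLL, which is
# divided by a diet-to-blood slope factor. The slope values are assumptions
# based on commonly cited estimates, not quoted from this abstract.

CDC_REFERENCE_BLL = 5.0    # µg/dL
SAFETY_FACTOR = 10.0
SLOPE = {"children": 0.16, "wocba": 0.04}   # µg/dL BLL per µg/day intake (assumed)

def interim_reference_level(group: str) -> float:
    """Dietary intake (µg/day) corresponding to the 0.5 µg/dL target BLL."""
    target_bll = CDC_REFERENCE_BLL / SAFETY_FACTOR   # 0.5 µg/dL
    return target_bll / SLOPE[group]

irl_children = interim_reference_level("children")   # ~3 µg/day
irl_wocba = interim_reference_level("wocba")          # 12.5 µg/day
```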


Subject(s)
Dietary Exposure/standards, Environmental Pollutants/standards, Lead/standards, Adult, Child, Child, Preschool, Dietary Exposure/prevention & control, Environmental Pollutants/toxicity, Female, Humans, Infant, Infant, Newborn, Lead/toxicity, Pregnancy, United States, United States Food and Drug Administration
10.
Risk Anal ; 38(8): 1738-1757, 2018 08.
Article in English | MEDLINE | ID: mdl-29341180

ABSTRACT

We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0-5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400-248,000) cases/year. Risk reduction (by 5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33-448) or 1.4 (95% CI <1-4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10-146) or <1 (95% CI <1-1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted, e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22-298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
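A toy Monte Carlo sketch in the spirit of the model above: sample per-serving doses, apply a k-log10 seed treatment, and accumulate predicted illnesses through an exponential dose-response. Every parameter is invented; the only point is that each additional log10 of treatment cuts the predicted cases further:

```python
import math
import random

# Toy Monte Carlo in the spirit of the sprout risk model above. All
# parameters (dose distribution, dose-response r, serving count) are
# invented; this is not the published model.

random.seed(1)
R_DR = 1e-6              # exponential dose-response parameter (hypothetical)
N_SERVINGS = 100_000
LOG_DOSES = [random.gauss(4.0, 1.5) for _ in range(N_SERVINGS)]  # log10 CFU/serving

def predicted_cases(log10_reduction: float) -> float:
    """Expected illnesses across servings after a log10 seed-treatment reduction."""
    return sum(1 - math.exp(-R_DR * 10 ** (ld - log10_reduction)) for ld in LOG_DOSES)

no_treatment = predicted_cases(0)
three_log = predicted_cases(3)
five_log = predicted_cases(5)
```

Because the same sampled doses are reused for every treatment level, the predicted cases decrease strictly as the treatment stringency increases, mirroring the ordering (though not the magnitudes) in the published results.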


Subject(s)
Food Microbiology, Medicago sativa/adverse effects, Medicago sativa/microbiology, Salmonella Food Poisoning/etiology, Water Microbiology, Agricultural Irrigation, Bacterial Load, Food Safety/methods, Humans, Public Health, Risk Assessment, Risk Reduction Behavior, Salmonella/growth & development, Salmonella/pathogenicity, Salmonella Food Poisoning/prevention & control, Seeds/growth & development, Seeds/microbiology, United States
11.
J Food Prot ; 80(6): 903-921, 2017 06.
Article in English | MEDLINE | ID: mdl-28437165

ABSTRACT

A multiyear interagency Listeria monocytogenes Market Basket Survey was undertaken for selected refrigerated ready-to-eat foods purchased at retail in four FoodNet sites in the United States. Food samples from 16 food categories in six broad groups (seafood, produce, dairy, meat, eggs, and combination foods) were collected weekly at large national chain supermarkets and independent grocery stores in California, Maryland, Connecticut, and Georgia for 100 weeks between December 2010 and March 2013. Of the 27,389 total samples, 116 samples tested positive by the BAX PCR system for L. monocytogenes, and the pathogen was isolated and confirmed for 102 samples. Among the 16 food categories, the proportion of positive samples (i.e., without considering clustering effects) based on recovery of a viable isolate of L. monocytogenes ranged from 0.00% (95% confidence interval: 0.00, 0.18) for the category of soft-ripened and semisoft cheese to 1.07% (0.63, 1.68) for raw cut vegetables. Among the 571 samples that tested positive for Listeria-like organisms, the proportion of positive samples ranged from 0.79% (0.45, 1.28) for soft-ripened and semisoft cheese to 4.76% (2.80, 7.51) for fresh crab meat or sushi. Across all 16 categories, L. monocytogenes contamination was significantly associated with the four states (P < 0.05) but not with the packaging location (prepackaged by the manufacturer versus made and/or packaged in the store), the type of store (national chain versus independent), or the season. Among the 102 samples positive for L. monocytogenes, levels ranged from <0.036 most probable number per g to 6.1 log CFU/g. For delicatessen (deli) meats, smoked seafood, seafood salads, soft-ripened and semisoft cheeses, and deli-type salads without meat, the percentage of positive samples was significantly lower (P < 0.001) in this survey than that reported a decade ago based on comparable surveys in the United States.
Use of mixed logistic regression models to address clustering effects with regard to the stores revealed that L. monocytogenes prevalence ranged from 0.11% (0.03, 0.34) for sprouts (prepackaged) to 1.01% (0.58, 1.74) for raw cut vegetables (prepackaged).
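For context, the proportion-of-positives confidence intervals reported above can be approximated with a Wilson score interval, a simple stand-in for the exact (Clopper-Pearson) intervals such surveys typically report:

```python
import math

# Wilson score interval for a proportion of positive samples — a simple
# approximation to the exact binomial intervals reported in surveys like
# the one above.

def wilson_ci(positives: int, n: int, z: float = 1.96):
    """95% Wilson score interval (lo, hi) for positives out of n samples."""
    p = positives / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Overall survey figure: 102 confirmed-positive of 27,389 samples
lo, hi = wilson_ci(102, 27389)
```

Category-level intervals (e.g., the 0.63-1.68% band around 1.07% for raw cut vegetables) follow the same pattern, just with the per-category counts.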


Subject(s)
Listeria monocytogenes/isolation & purification, Meat Products, California, Connecticut, Food Contamination, Food Microbiology, Georgia, Maryland, Prevalence, United States
12.
J Food Prot ; 79(7): 1076-88, 2016 07.
Article in English | MEDLINE | ID: mdl-27357026

ABSTRACT

Cross-contamination, improper holding temperatures, and insufficient sanitary practices are known retail practices that may lead to product contamination and growth of Listeria monocytogenes. However, the relative importance of control options to mitigate the risk of invasive listeriosis from ready-to-eat (RTE) products sliced or prepared at retail is not well understood. This study illustrates the utility of a quantitative risk assessment model described in the first article of this series (Pouillot, R., D. Gallagher, J. Tang, K. Hoelzer, J. Kause, and S. B. Dennis, J. Food Prot. 78:134-145, 2015) to evaluate the public health impact associated with changes in retail deli practices and interventions. Twenty-two mitigation scenarios were modeled and evaluated under six different baseline conditions. These scenarios were related to sanitation, worker behavior, use of growth inhibitors, cross-contamination, storage temperature control, and reduction of the level of L. monocytogenes on incoming RTE food products. The mean risk per serving of RTE products obtained under these scenarios was then compared with the risk estimated in the baseline condition. Some risk mitigations had a consistent impact on the predicted listeriosis risk in all baseline conditions (e.g., presence or absence of a growth inhibitor), whereas others were greatly dependent on the initial baseline conditions or practices in the deli (e.g., preslicing of products). Overall, control of bacterial growth and control of contamination at its source were the major factors of listeriosis risk in these settings. Although control of cross-contamination and continued sanitation were also important, the decrease in the predicted risk was not amenable to a simple solution. Findings from these predictive scenario analyses are intended to encourage improvements to retail food safety practices and mitigation strategies to control L. monocytogenes in RTE foods more effectively and to demonstrate the utility of quantitative risk assessment models to inform risk management decisions.


Subject(s)
Listeria monocytogenes, Meat Products/microbiology, Food Contamination, Food Microbiology, Humans, Listeriosis, Risk Assessment
13.
J Food Prot ; 78(2): 240-7, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25710137

ABSTRACT

A field trial in Salinas Valley, California, was conducted during July 2011 to quantify the microbial load that transfers from wildlife feces onto nearby lettuce during foliar irrigation. Romaine lettuce was grown using standard commercial practices and irrigated using an impact sprinkler design. Five grams of rabbit feces was spiked with 1.29 × 10^8 CFU of Escherichia coli O157:H7 and placed 3, 2, and 1 days before, and immediately before, a 2-h irrigation event. Immediately after irrigation, 168 heads of lettuce ranging from ca. 23 to 69 cm (9 to 27 in.) from the fecal deposits were collected, and the concentration of E. coli O157:H7 was determined. Thirty-eight percent of the collected lettuce heads had detectable E. coli O157:H7, ranging from 1 MPN to 2.30 × 10^5 MPN per head, with a mean concentration of 7.37 × 10^3 MPN per head. Based on this weighted arithmetic mean concentration of 7.37 × 10^3 MPN per positive head, only 0.00573% of the mean load of 1.29 × 10^8 CFU in the original 5 g of scat was transferred to the positive heads of lettuce. Bacterial contamination was limited to the outer leaves of the lettuce. In addition, factors associated with the transfer of E. coli O157:H7 from scat to lettuce were the distance between the scat and the lettuce, the age of the scat before irrigation, and the mean distance between the scat and the irrigation sprinkler heads. This study quantified the transfer coefficient between scat and adjacent heads of lettuce as a function of irrigation. The data can be used to populate a quantitative produce risk assessment model for E. coli O157:H7 in romaine lettuce to inform risk management and food safety policies.
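The transfer fraction reported above follows directly from the stated means: the weighted mean load per positive head divided by the inoculated load, expressed as a percentage:

```python
# The transfer-fraction arithmetic reported above: mean load on a positive
# lettuce head divided by the inoculated load in the scat, as a percentage.
# Using the rounded values from the abstract reproduces ~0.0057%.

MEAN_PER_POSITIVE_HEAD = 7.37e3   # MPN E. coli O157:H7 per positive head
INOCULATED_LOAD = 1.29e8          # CFU spiked into the 5 g of rabbit feces

transfer_pct = MEAN_PER_POSITIVE_HEAD / INOCULATED_LOAD * 100
```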


Subject(s)
Escherichia coli O157/isolation & purification, Feces/microbiology, Food Contamination/analysis, Lactuca/microbiology, Animals, Animals, Wild, California, Colony Count, Microbial, Consumer Product Safety, Food Microbiology, Food Safety, Plant Leaves/microbiology
14.
J Food Prot ; 78(1): 134-45, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25581188

ABSTRACT

The Interagency Risk Assessment-Listeria monocytogenes (Lm) in Retail Delicatessens provides a scientific assessment of the risk of listeriosis associated with the consumption of ready-to-eat (RTE) foods commonly prepared and sold in the delicatessen (deli) of a retail food store. The quantitative risk assessment (QRA) model simulates the behavior of retail employees in a deli department and tracks the Lm potentially present in this environment and in the food. Bacterial growth, bacterial inactivation (following washing and sanitizing actions), and cross-contamination (from object to object, from food to object, or from object to food) are evaluated through a discrete event modeling approach. The QRA evaluates the risk per serving of deli-prepared RTE food for the susceptible and general population, using a dose-response model from the literature. This QRA considers six separate retail baseline conditions and provides information on the predicted risk of listeriosis for each. Among the baseline conditions considered, the model predicts that (i) retail delis without an environmental source of Lm (such as niches), retail delis without niches that do apply temperature control, and retail delis with niches that do apply temperature control lead to lower predicted risk of listeriosis relative to retail delis with niches and (ii) retail delis with incoming RTE foods that are contaminated with Lm lead to higher predicted risk of listeriosis, directly or through cross-contamination, whether the contaminated incoming product supports growth or not. The risk assessment predicts that listeriosis cases associated with retail delicatessens result from a sequence of key events: (i) the contaminated RTE food supports Lm growth; (ii) improper retail and/or consumer storage temperature or handling results in the growth of Lm on the RTE food; and (iii) the consumer of this RTE food is susceptible to listeriosis. 
The risk assessment model, therefore, predicts that cross-contamination with Lm at retail predominantly results in sporadic cases.


Subject(s)
Food Contamination/analysis, Listeria monocytogenes/growth & development, Meat Products/microbiology, Disinfection/methods, Equipment Contamination, Food Microbiology, Humans, Listeriosis/epidemiology, Microbial Viability, Models, Theoretical, Risk Assessment, Small Business, Temperature
15.
Risk Anal ; 35(1): 90-108, 2015 Jan.
Article in English | MEDLINE | ID: mdl-24975545

ABSTRACT

Evaluations of Listeria monocytogenes dose-response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well-established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal-Poisson dose-response model was chosen, and proved able to reconcile dose-response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta-Poisson dose-response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose-response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.
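The contrast drawn above can be sketched directly: the exponential model uses a fixed dose-response parameter r, while a lognormal-Poisson model averages the same exponential form over r drawn from a lognormal distribution to represent strain-virulence and host-susceptibility variability. Parameter values here are illustrative, not the paper's fitted values:

```python
import math
import random

# Exponential dose-response P(ill | dose D) = 1 - exp(-r * D), and a
# lognormal-Poisson variant that averages over r ~ Lognormal(mu, sigma)
# to capture strain/host variability, as discussed above. The mu and
# sigma values are illustrative, not the fitted values from the paper.

def p_ill_exponential(dose: float, r: float) -> float:
    return 1 - math.exp(-r * dose)

def p_ill_lognormal_poisson(dose, mu=-14.0, sigma=2.0, n_mc=50_000, seed=7):
    """Monte Carlo average of the exponential model over lognormal r."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_mc):
        r = math.exp(rng.gauss(mu, sigma))   # lognormally distributed r
        total += 1 - math.exp(-r * dose)
    return total / n_mc

p_fixed = p_ill_exponential(1e6, r=math.exp(-14.0))
p_mixed = p_ill_lognormal_poisson(1e6)
```

The mixture spreads probability mass toward both very low and very high r values, which is what lets one dose-response curve cover susceptible subgroups and hypervirulent strains at once.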


Subject(s)
Host-Parasite Interactions, Listeria monocytogenes/pathogenicity, Models, Theoretical, Virulence
16.
J Food Prot ; 76(3): 376-85, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23462073

ABSTRACT

Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
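The exposure-to-illness chain that a tool like iRISK evaluates can be illustrated with a minimal Monte Carlo sketch: sample a contamination level, apply a process step expressed as a log reduction, convert to a dose per serving, and push the dose through a dose-response curve. iRISK itself is a Web-based system, not a Python library; the function name and all parameter values below are hypothetical.

```python
import math
import random

def mean_risk_per_serving(log10_conc_mean, log10_conc_sd, serving_g, r,
                          log10_reduction=0.0, n=20_000, seed=1):
    """Monte Carlo sketch of an iRISK-style exposure chain:
    concentration (lognormal in log10 CFU/g) -> process step (log10
    reduction) -> dose per serving -> exponential dose-response ->
    mean risk of illness per serving."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        log10_c = rng.gauss(log10_conc_mean, log10_conc_sd) - log10_reduction
        dose = serving_g * 10.0 ** log10_c  # CFU per serving
        total += 1.0 - math.exp(-r * dose)
    return total / n

# Compare a baseline against an intervention adding a 2-log10 reduction
baseline = mean_risk_per_serving(0.0, 1.0, 50.0, 1e-9)
treated = mean_risk_per_serving(0.0, 1.0, 50.0, 1e-9, log10_reduction=2.0)
```

Ranking food-hazard pairs or interventions then amounts to comparing these mean per-serving risks (or, with severity weights attached to health outcomes, comparing burden metrics such as DALYs).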


Subject(s)
Food Contamination/analysis, Food Handling/standards, Food Microbiology, Food/standards, Risk Assessment, Food Safety, Foodborne Diseases/prevention & control, Humans, Theoretical Models, Monte Carlo Method, United States, United States Food and Drug Administration
17.
Int J Food Microbiol ; 157(2): 267-77, 2012 Jul 02.
Article in English | MEDLINE | ID: mdl-22704063

ABSTRACT

Listeria monocytogenes is readily found in the environment of retail deli establishments and can occasionally contaminate food handled in these establishments. Here we synthesize the available scientific evidence to derive probability distributions and mathematical models of bacterial transfers between environmental surfaces and foods, including those during slicing of food, and of bacterial removal during cleaning and sanitizing (models available at www.foodrisk.org). Transfer coefficients varied considerably by surface type, and after log(10) transformation were best described by normal distributions with means ranging from -0.29 to -4.96 and standard deviations that ranged from 0.07 to 1.39. Transfer coefficients during slicing were best described by a truncated logistic distribution with location 0.07 and scale 0.03. In the absence of protein residues, mean log inactivation indicated a greater than 5 log(10) reduction for sanitization with hypochlorite (mean: 6.5 log(10); 95% confidence interval (CI): 5.0-8.1 log(10)) and quaternary ammonium compounds (mean: 5.5 log(10); 95% CI: 3.6-7.3 log(10)), but in the presence of protein residues efficacy was dramatically reduced for hypochlorite (mean: 3.8 log(10); 95% CI: 2.1-5.4 log(10)) as well as for quaternary ammonium compounds (mean: 4.4 log(10); 95% CI: 2.5-6.4 log(10)). Overall, transfer coefficients are therefore low, even though cross-contamination can be extremely efficient under certain conditions. Dozens of food items may consequently be contaminated from a single contaminated slicer blade, albeit at low concentrations. Correctly performed sanitizing efficiently reduces L. monocytogenes contamination in the environment and therefore limits cross-contamination, even though sanitization is only performed a few times per day. However, under unfavorable conditions reductions in bacterial concentration may be far below 5 log(10).
The probability distributions and mathematical models derived here can be used to evaluate L. monocytogenes cross-contamination dynamics in environments where foods are handled, and to assess the potential impact of different intervention strategies.
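A minimal simulation of sequential blade-to-slice transfer and a sanitizing step, using log10-normal transfer coefficients of the kind described above, might look like the following. The specific default parameters are illustrative picks within the reported ranges, not the fitted distributions themselves.

```python
import random

def simulate_slicing(blade_cfu, n_slices, tc_log10_mean=-2.0,
                     tc_log10_sd=0.5, seed=2):
    """Sequential transfer from a contaminated slicer blade to successive
    food slices. Each slice's transfer coefficient is drawn from a normal
    distribution on the log10 scale (mean/sd chosen within the reported
    ranges for illustration); the blade is depleted as it transfers."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n_slices):
        tc = min(10.0 ** rng.gauss(tc_log10_mean, tc_log10_sd), 1.0)
        dose = blade_cfu * tc       # CFU deposited on this slice
        blade_cfu -= dose           # blade is progressively depleted
        doses.append(dose)
    return doses

def sanitize(cfu, log10_reduction):
    """Apply a sanitizing step expressed as a log10 reduction, e.g. a mean
    of ~6.5 log10 for hypochlorite on a protein-free surface vs ~3.8 log10
    with protein residues, per the values reported above."""
    return cfu / 10.0 ** log10_reduction
```

Because typical transfer coefficients are small, each slice receives only a fraction of the blade's load, which is consistent with the observation that many items can be contaminated from one blade, each at low concentration.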


Subject(s)
Listeria monocytogenes, Theoretical Models, Microbial Colony Count, Consumer Product Safety, Disinfection, Food Contamination, Food Microbiology, Listeria monocytogenes/growth & development, Listeria monocytogenes/isolation & purification, Meat/microbiology, Sanitation
18.
Foodborne Pathog Dis ; 9(7): 661-73, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22612229

ABSTRACT

Several listeriosis outbreaks have been linked to the consumption of fresh or processed produce in recent years. One major determinant of listeriosis risk is the ability of a food to support growth of Listeria monocytogenes during storage. However, data on the ability to support growth of L. monocytogenes are scarce or nonexistent for many produce commodities. Here we synthesize the available data regarding growth behavior of L. monocytogenes on produce, compare the growth data with listeriosis outbreak data, and evaluate the adequacy of the data for predictive modeling. Growth rates and maximum L. monocytogenes population densities differed markedly among produce commodities, and post-harvest processing had a considerable effect on growth dynamics for certain commodities such as tomatoes. However, data scarcity prevented reliable estimation of growth rates for many commodities. Produce outbreaks were frequently associated with processed produce and often involved storage under suboptimal conditions (e.g., at room temperature for several hours or for several months in the refrigerator) or environmental cross-contamination after processing. However, no clear association between high growth rates of L. monocytogenes on fresh produce and outbreaks was detected. In conclusion, produce commodities differ in the growth rate of L. monocytogenes they support, the maximum attainable L. monocytogenes population density, and possibly in the impact of post-harvest processing; however, data are currently insufficient to predict growth behavior, and listeriosis risk also appears to be governed by additional factors.
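The two quantities the abstract highlights, a commodity-specific growth rate and a maximum attainable population density, are exactly the parameters of the simplest predictive-microbiology growth sketch: log-linear growth capped at Nmax. This sketch ignores lag phase and temperature dependence, and all parameter values are illustrative.

```python
def log10_count(t_h, n0_log10, mu_log10_per_h, nmax_log10):
    """Log-linear (exponential-phase) growth capped at the maximum
    population density Nmax, all on the log10 CFU scale.

    t_h            storage time in hours
    n0_log10       initial contamination, log10 CFU/g
    mu_log10_per_h growth rate, log10 CFU/g per hour (commodity-specific)
    nmax_log10     maximum population density, log10 CFU/g
    """
    return min(n0_log10 + mu_log10_per_h * t_h, nmax_log10)
```

Fitting such a model per commodity is what the synthesized data would support in principle; the abstract's point is that for many commodities the data are currently too sparse to estimate mu reliably.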


Subject(s)
Disease Outbreaks/prevention & control, Food Contamination/prevention & control, Foodborne Diseases/epidemiology, Listeria monocytogenes/growth & development, Listeria monocytogenes/pathogenicity, Listeriosis/epidemiology, Consumer Product Safety, Factual Databases, Food Handling/methods, Food Microbiology/methods, Food Storage/methods, Foodborne Diseases/microbiology, Foodborne Diseases/physiopathology, Humans, Listeriosis/microbiology, Listeriosis/physiopathology, Theoretical Models
19.
Vet Res ; 43: 18, 2012 Mar 14.
Article in English | MEDLINE | ID: mdl-22417207

ABSTRACT

Listeriosis is a leading cause of hospitalization and death due to foodborne illness in the industrialized world. Animal models have played fundamental roles in elucidating the pathophysiology and immunology of listeriosis, and will almost certainly continue to be integral components of research on listeriosis. Data derived from animal studies helped, for example, to characterize the importance of cell-mediated immunity in controlling infection, allowed evaluation of chemotherapeutic treatments for listeriosis, and contributed to quantitative assessments of the public health risk associated with L. monocytogenes-contaminated food commodities. Nonetheless, a number of pivotal questions remain unresolved, including dose-response relationships, which represent essential components of risk assessments. Newly emerging data about species-specific differences have recently raised concern about the validity of most traditional animal models of listeriosis. However, considerable uncertainty about the best choice of animal model remains. Here we review the available data on traditional and potential new animal models to summarize the currently recognized strengths and limitations of each model. This knowledge is instrumental for devising future studies and for interpreting current data. We deliberately chose a historical, comparative, and cross-disciplinary approach, striving to reveal clues that may help predict the ultimate value of each animal model in spite of incomplete data.


Subject(s)
Animal Disease Models, Listeria monocytogenes/physiology, Listeria monocytogenes/pathogenicity, Listeriosis/microbiology, Listeriosis/physiopathology, Animals, Humans, Listeriosis/epidemiology, Listeriosis/veterinary, Species Specificity
20.
J Food Prot ; 73(2): 312-21, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20132677

ABSTRACT

Home refrigeration temperatures and product storage times are important factors for controlling the growth of Listeria monocytogenes in refrigerated ready-to-eat foods. In 2005, RTI International, in collaboration with Tennessee State University and Kansas State University, conducted a national survey of U.S. adults to characterize consumers' home storage and refrigeration practices for 10 different categories of refrigerated ready-to-eat foods. No distributions of storage time or refrigeration temperature were presented in any of the resulting publications. This study used classical parametric survival modeling to derive parametric distributions from the RTI International storage practices data set. Depending on the food category, variability in product storage times was best modeled using either exponential or Weibull distributions. The shape and scale of the distributions varied greatly depending on the food category. Moreover, the results indicated that consumers tend to keep a product that is packaged by a manufacturer for a longer period of time than a product that is packaged at retail. Refrigeration temperatures were comparable to those previously reported, with the variability in temperatures best fit using a Laplace distribution, as an alternative to the empirical distribution. In contrast to previous research, limited support was found for a correlation between storage time and temperature. The distributions provided in this study can be used to better model consumer behavior in future risk assessments.
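The distribution families named above (exponential or Weibull for storage times, Laplace for refrigeration temperatures) can be sampled by inverse-CDF transforms of a uniform draw, which is how such distributions are typically fed into a Monte Carlo risk assessment. The parameters used in the usage below are illustrative, not the fitted values from the RTI International data set.

```python
import math
import random

def sample_storage_time_weibull(shape, scale, rng):
    """Inverse-CDF sample from Weibull(shape, scale) storage times:
    T = scale * (-ln(1 - U))^(1/shape). With shape = 1 this reduces to
    the exponential distribution, the study's other fitted family."""
    u = rng.random()
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

def sample_temp_laplace(loc, scale, rng):
    """Inverse-CDF sample from Laplace(loc, scale) refrigerator
    temperatures: X = loc - scale * sign(U - 0.5) * ln(1 - 2|U - 0.5|)."""
    u = rng.random() - 0.5
    return loc - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
```

Pairing a sampled storage time with a sampled temperature (independently, given the limited support found for a time-temperature correlation) yields the consumer-phase inputs for a growth model in a risk assessment.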


Subject(s)
Consumer Product Safety, Food Contamination/analysis, Food Preservation/methods, Listeria monocytogenes/growth & development, Food Handling/methods, Food Microbiology, Humans, Listeria monocytogenes/isolation & purification, Refrigeration, Risk Assessment, Surveys and Questionnaires, Temperature, Time Factors, United States