1.
Foodborne Pathog Dis ; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38963777

ABSTRACT

Consumers can be exposed to many foodborne biological hazards that cause diseases with varying outcomes and incidence and, therefore, represent different levels of public health burden. To help French risk managers rank these hazards and prioritize food safety actions, we developed a three-step approach. The first step was to develop a list of foodborne hazards of health concern in mainland France. From an initial list of 335 human pathogenic biological agents, the final list of "retained hazards" consists of 24 hazards, including 12 bacteria (including bacterial toxins and metabolites), 3 viruses, and 9 parasites. The second step was to collect data to estimate the disease burden (incidence, Disability Adjusted Life Years) associated with these hazards through food during two time periods: 2008-2013 and 2014-2019. The ranks of the different hazards changed slightly according to the period considered. The third step was the ranking of hazards according to a multicriteria decision support model using the ELECTRE III method. Three ranking criteria were used: two reflect the severity of the effects (years of life lost and years lost due to disability) and one reflects the likelihood (incidence) of the disease. The multicriteria decision analysis approach takes into account the preferences of the risk managers through different sets of weights, as well as the uncertainties associated with the data. The method and the data collected allowed us to estimate the health burden of foodborne biological hazards in mainland France and to define a prioritization list for the health authorities.
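The burden metric used in the second step, the Disability Adjusted Life Year, is the sum of Years of Life Lost (YLL) and Years Lost due to Disability (YLD). A minimal sketch of how hazards could be scored on this criterion (all hazard parameters below are hypothetical, and the actual study ranked hazards with ELECTRE III outranking rather than a simple sort):

```python
def daly(cases, fatal_fraction, years_lost_per_death,
         disability_weight, duration_years):
    """DALY = YLL + YLD for one hazard.
    YLL = deaths x expected years of life lost per death.
    YLD = cases x disability weight x mean illness duration (years)."""
    yll = cases * fatal_fraction * years_lost_per_death
    yld = cases * disability_weight * duration_years
    return yll + yld

# Hypothetical hazards: (annual cases, case fatality, years lost per death,
# disability weight, illness duration in years)
hazards = {
    "Hazard A": (30_000, 0.0005, 20, 0.10, 0.02),
    "Hazard B": (500, 0.02, 15, 0.30, 0.05),
}
ranked = sorted(hazards, key=lambda h: daly(*hazards[h]), reverse=True)
```

A high-incidence, mild hazard and a rare, severe hazard can end up with comparable DALY totals, which is why the study also weighted YLL, YLD, and incidence separately in the multicriteria model.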

2.
Foods ; 13(5)2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38472864

ABSTRACT

Better knowledge of the Listeria monocytogenes dose-response (DR) model is needed to refine the assessment of the risk of foodborne listeriosis. In 2018, the European Food Safety Authority (EFSA) derived a lognormal Poisson DR model for 14 different age-sex subgroups, without stratifying by strain virulence. In the present study, new sets of parameters are developed by integrating the EFSA model for these subgroups with three classes of strain virulence characteristics ("less virulent", "virulent", and "more virulent"). Considering classes of virulence leads to estimated relative risks (RRs) of listeriosis following the ingestion of 1,000 bacteria of "more virulent" vs. "less virulent" strains ranging from 21.6 to 24.1, depending on the subgroup. These relatively low RRs, when compared with the RRs linked to comorbidities described in the literature, suggest that the influence of comorbidity on the occurrence of invasive listeriosis for a given exposure is much greater than the influence of strain virulence. The updated model parameters allow better prediction of the risk of invasive listeriosis across a population of interest, provided the necessary data on population demographics and the proportional contribution of strain virulence classes in the food products of interest are available. An R package is made available to facilitate the use of these dose-response models.
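The lognormal Poisson structure can be sketched as an exponential (single-hit) dose-response whose parameter r varies lognormally across the population; the marginal risk at a given dose is the expectation over that distribution, estimated here by Monte Carlo. The parameter values below are illustrative placeholders, not the EFSA or updated fitted values:

```python
import math
import random

def marginal_risk(dose, mu_log10_r, sigma_log10_r, n=100_000, seed=1):
    """Marginal probability of invasive listeriosis at a given ingested dose,
    integrating the single-hit model 1 - exp(-r * dose) over a lognormal
    distribution of r (log10 r ~ Normal(mu, sigma))."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        r = 10 ** rng.gauss(mu_log10_r, sigma_log10_r)
        total += 1.0 - math.exp(-r * dose)
    return total / n

# Hypothetical parameters for two virulence classes (not the fitted values):
risk_less = marginal_risk(1000, mu_log10_r=-14.0, sigma_log10_r=1.0)
risk_more = marginal_risk(1000, mu_log10_r=-12.5, sigma_log10_r=1.0)
rr = risk_more / risk_less   # relative risk between virulence classes
```

Because risks at low doses are nearly linear in r, the RR between classes is driven almost entirely by the shift in the mean of the r distribution.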

3.
PLoS One ; 18(12): e0294624, 2023.
Article in English | MEDLINE | ID: mdl-38051743

ABSTRACT

The serovars of Salmonella enterica display dramatic differences in pathogenesis and host preferences. We developed a process (patent pending) for grouping Salmonella isolates and serovars by their public health risk. We collated a curated set of 12,337 S. enterica isolate genomes from human, beef, and bovine sources in the US. After annotating a virulence gene catalog for each isolate, we used unsupervised random forest methods to estimate the proximity (similarity) between isolates based upon the genomic presentation of putative virulence traits. We then grouped isolates into virulence clusters using hierarchical clustering (Ward's method), used non-parametric bootstrapping to assess cluster stability, and externally validated the clusters against epidemiological virulence measures from FoodNet, the National Outbreak Reporting System (NORS), and US federal sampling of beef products. We identified five stable virulence clusters of S. enterica serovars. Cluster 1 (higher virulence) serovars yielded an annual incidence rate of domestically acquired sporadic cases roughly one and a half times higher than the other four clusters combined (Clusters 2-5, lower virulence). Compared with the other clusters, Cluster 1 also had a higher proportion of infections leading to hospitalization and was implicated in more foodborne and beef-associated outbreaks, despite being isolated from beef products at a frequency similar to the other clusters. We also identified subpopulations within 11 serovars. Remarkably, we found S. Infantis and S. Typhimurium subpopulations that differed significantly in genome length and clinical case presentation. Further, we found that the presence of the pESI plasmid accounted for the genome length differences between the S. Infantis subpopulations. Our results show that S. enterica strains associated with the highest incidence of human infections share a common virulence repertoire. This work could be updated regularly and used in combination with foodborne surveillance information to prioritize serovars of public health concern.
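As a simplified stand-in for the pipeline above (the study used unsupervised random forest proximities with Ward's hierarchical clustering and bootstrapping), the grouping step can be sketched with Jaccard distances on virulence-gene presence/absence and a naive average-linkage agglomeration; the isolate names and gene sets are hypothetical:

```python
from itertools import combinations

def jaccard_distance(a, b):
    """1 - |intersection| / |union| over sets of virulence genes present."""
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

def agglomerate(profiles, k):
    """Naive average-linkage clustering of gene-presence sets into k clusters."""
    clusters = [[name] for name in profiles]

    def avg_dist(c1, c2):
        pairs = [(x, y) for x in c1 for y in c2]
        return sum(jaccard_distance(profiles[x], profiles[y])
                   for x, y in pairs) / len(pairs)

    while len(clusters) > k:
        # Merge the pair of clusters with the smallest average distance
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: avg_dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical isolates and virulence-gene sets:
profiles = {
    "iso1": {"invA", "sopE", "pESI"},
    "iso2": {"invA", "sopE"},
    "iso3": {"invA", "sseL"},
    "iso4": {"sseL", "pipB"},
}
clusters = agglomerate(profiles, k=2)
```

Random forest proximities, unlike plain Jaccard distance, weight genes by how informative they are for separating isolates, which is one reason the study's approach can recover subtler subpopulations.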


Subjects
Salmonella enterica, Animals, Cattle, Humans, United States/epidemiology, Virulence/genetics, Serogroup, Salmonella, Genomics
4.
Regul Toxicol Pharmacol ; 144: 105487, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37640100

ABSTRACT

The U.S. Food and Drug Administration (FDA) developed an oral toxicological reference value (TRV) for characterizing potential health concerns from dietary exposure to cadmium (Cd). The development of the TRV leveraged the FDA's previously published research, including (1) a systematic review of adverse health effects associated with oral Cd exposure and (2) a human physiologically based pharmacokinetic (PBPK) model, adapted from Kjellström and Nordberg (1978), used in reverse dosimetry applied to the U.S. population. Adverse effects of Cd on the bone and kidney are associated with similar points of departure (PODs) of approximately 0.50 µg Cd/g creatinine for females aged 50-60, based on available epidemiologic data. We also used the upper-bound estimate of the renal cortical concentration (50 µg/g Cd) occurring in the U.S. population at 50 years of age as a POD. Based on the output from our reverse dosimetry PBPK model, a range of 0.21-0.36 µg/kg bw/day was developed for the TRV. The animal data used for the animal TRV derivation (0.63-1.8 µg/kg bw/day) confirm biological plausibility for both the bone and kidney endpoints.
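Reverse dosimetry inverts a forward pharmacokinetic model: given the urinary-Cd point of departure, solve for the chronic dietary intake that would produce it. A toy linear steady-state sketch (the lumped slope is a hypothetical placeholder, not a parameter of the FDA's PBPK model):

```python
def forward_ucd(intake, slope=1.6):
    """Toy linear steady-state model: urinary Cd (ug/g creatinine) produced
    by a chronic dietary intake (ug/kg bw/day). The slope is a hypothetical
    lumped coefficient, not a fitted PBPK parameter."""
    return slope * intake

def reverse_dosimetry(target_ucd, slope=1.6):
    """Invert the forward model: the intake that yields the target urinary Cd."""
    return target_ucd / slope

pod_ucd = 0.50                    # POD from the abstract, ug Cd/g creatinine
trv = reverse_dosimetry(pod_ucd)  # intake consistent with the POD (toy slope)
```

In the actual PBPK model the forward mapping is nonlinear and age-dependent, so the inversion is done numerically rather than by dividing by a single slope.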


Subjects
Cadmium, Environmental Exposure, Female, Animals, Humans, Middle Aged, Cadmium/toxicity, Environmental Exposure/adverse effects, Reference Values, Food, Kidney
6.
Risk Anal ; 43(9): 1713-1732, 2023 Sep.
Article in English | MEDLINE | ID: mdl-36513596

ABSTRACT

The objective of this study was to leverage quantitative risk assessment to investigate possible root causes of foodborne illness outbreaks related to Shiga toxin-producing Escherichia coli O157:H7 (STEC O157) infections linked to leafy greens in the United States. To this end, we developed the FDA leafy green quantitative risk assessment epidemic curve prediction model (FDA-LG QRA-EC), which simulated the lettuce supply chain. The model was used to predict the number of reported illnesses and the epidemic curve associated with lettuce contaminated with STEC O157 for a wide range of scenarios representing various contamination conditions and facility processing/sanitation practices. Model predictions were generated for fresh-cut and whole lettuce, quantifying the differing impacts of facility processing and home preparation on predicted illnesses. Our model revealed that the timespan (i.e., the number of days with at least one reported illness) and the peak (i.e., the day with the highest predicted number of reported illnesses) of the epidemic curve of a STEC O157-lettuce outbreak were not strongly influenced by facility processing/sanitation practices and were instead indicative of the contamination pattern among incoming lettuce batches received by the facility or distribution center. Through comparisons with the observed numbers of illnesses from recent STEC O157-lettuce outbreaks, the model identified contamination conditions on incoming lettuce heads that could result in an outbreak of similar size, which can be used to narrow down potential root cause hypotheses.
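The epidemic-curve outputs discussed here (timespan and peak) can be sketched by pushing contaminated servings through a per-serving illness probability and a reporting delay; the delay distribution and probabilities below are hypothetical placeholders, not the FDA-LG QRA-EC parameters:

```python
import math
import random
from collections import Counter

def predict_epicurve(servings_by_day, p_ill, seed=3):
    """Simulate reported illnesses by day: each contaminated serving consumed
    on day d causes a reported case with probability p_ill, after a lognormal
    incubation-plus-reporting delay (hypothetical parameters)."""
    rng = random.Random(seed)
    curve = Counter()
    for day, n_servings in servings_by_day.items():
        for _ in range(n_servings):
            if rng.random() < p_ill:
                delay = rng.lognormvariate(math.log(4), 0.4)  # ~4-day median
                curve[day + round(delay)] += 1
    return curve

# A point-source pattern: one contaminated lot consumed over days 0-2
point_source = predict_epicurve({0: 4000, 1: 4000, 2: 4000}, p_ill=0.005)
timespan = max(point_source) - min(point_source) + 1
peak_day = max(point_source, key=point_source.get)
```

Running the same simulation with contamination spread thinly over many days would stretch the timespan and flatten the peak, which is the signature the model uses to infer the incoming contamination pattern.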


Subjects
Epidemics, Escherichia coli O157, Foodborne Diseases, Humans, United States/epidemiology, Lactuca, Disease Outbreaks, Foodborne Diseases/epidemiology
7.
Toxicol Lett ; 367: 67-75, 2022 Aug 15.
Article in English | MEDLINE | ID: mdl-35901988

ABSTRACT

The goal of this study was to assess a cadmium (Cd) physiologically based pharmacokinetic (PBPK) model to evaluate Cd toxicological reference values (e.g., reference dose, tolerable intake, minimum risk level) adapted to the U.S. population. We reviewed and evaluated previously published Cd PBPK models and developed further adaptations of the 1978 Kjellström and Nordberg (KN) model. Specifically, we propose adaptations with updated U.S.-specific body weight, kidney weight, and creatinine excretion models using NHANES data, as well as a stochastic PBPK model that provides credible intervals of uncertainty around mean population estimates. We present our model review and adaptations, along with estimates from the newly adapted models using observed U.S. urinary Cd values as a function of gender and age, given dietary exposure as evaluated from NHANES/WWEIA and U.S. Total Diet Study data. Results show that all newly adapted models provide acceptable mean estimates of urinary Cd in the U.S. The stochastic model provides credible intervals to further inform regulatory decision making. Validation of the estimated kidney Cd (K-Cd) concentration values was not possible, as data for a representative population were not available. We developed a web-based tool implementing these models and other potential adaptations to facilitate comparison of PBPK model estimates.
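The stochastic element can be sketched as Monte Carlo sampling of inter-individual pharmacokinetic variability, with credible intervals read off the empirical quantiles. This toy steady-state model is not the adapted KN model, and all parameter values are illustrative:

```python
import math
import random
import statistics

def simulate_ucd(intake_ug_per_day, n=20_000, seed=7):
    """Toy steady-state sketch: urinary Cd = intake x absorbed fraction x
    urinary transfer coefficient, with lognormal inter-individual variability.
    Returns the population mean and a 95% credible interval from the
    empirical 2.5th and 97.5th percentiles. All values are illustrative."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        absorbed = rng.lognormvariate(math.log(0.05), 0.25)   # gut absorption
        transfer = rng.lognormvariate(math.log(0.005), 0.35)  # urinary transfer
        draws.append(intake_ug_per_day * absorbed * transfer)
    draws.sort()
    mean = statistics.fmean(draws)
    ci = (draws[int(0.025 * n)], draws[int(0.975 * n)])
    return mean, ci

mean_ucd, ci95 = simulate_ucd(10)   # hypothetical intake of 10 ug/day
```

The point of the stochastic formulation is exactly this interval: a mean estimate alone hides how wide the plausible range of urinary Cd is across individuals.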


Subjects
Cadmium, Diet, Models, Biological, Nutrition Surveys, Reference Values, Risk Assessment
9.
Environ Res ; 212(Pt B): 113315, 2022 09.
Article in English | MEDLINE | ID: mdl-35436451

ABSTRACT

We developed an association model to estimate the risk of femoral neck low bone mass and osteoporosis from exposure to cadmium for women and men aged 50-79 in the U.S., as a function of urinary cadmium (U-Cd) levels. We analyzed data from the NHANES 2005-2014 surveys and evaluated the relationship between U-Cd and femoral neck bone mineral density (BMD) using univariate and multivariate regression models with a combination of NHANES cycle, gender, age, smoking, race/ethnicity, height, body weight, body mass index, lean body mass, diabetes, kidney disease, physical activity, menopausal status, hormone replacement therapy, urinary lead, and prednisone intake as confounding variables. The regression coefficient between U-Cd and femoral neck BMD obtained with the best multivariate regression was used to develop an association model that can estimate the additional risk of low bone mass or osteoporosis in the population given a certain level of U-Cd. Results showed a linear relationship between U-Cd and BMD, conditional on body weight, in which individuals with higher U-Cd had lower BMD values. Our results do not support the hypothesis of a threshold for the effect of Cd on bone. Our model estimates that exposure to Cd results in an increase of 0.51 percentage points (95% CI: 0.00, 0.92) in the proportion of the population diagnosed with osteoporosis, compared with a theoretical absence of exposure. We estimate that 16% (95% CI: 0.00, 40%) of osteoporosis cases in the U.S. population aged 50-79 are a result of Cd exposure. This study presents the first continuous model estimating low bone mass and osteoporosis risk in the U.S. population given actual or potential changes in U-Cd levels. Our model will provide information to inform the FDA's Closer to Zero initiative goal of reducing exposure to toxic elements.
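The association-model logic can be sketched by shifting a normal population BMD distribution by the regression coefficient times U-Cd and reading off the additional fraction falling below a diagnostic threshold; all numeric values below are hypothetical, not the fitted NHANES estimates:

```python
import math

def normal_cdf(x, mu, sd):
    """Cumulative probability of a Normal(mu, sd) at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def excess_osteoporosis(mu_bmd, sd_bmd, beta, ucd, threshold):
    """Additional fraction of the population below the osteoporosis BMD
    threshold when mean BMD is shifted by beta x U-Cd, versus no exposure.
    All parameter values used below are illustrative placeholders."""
    p_exposed = normal_cdf(threshold, mu_bmd + beta * ucd, sd_bmd)
    p_unexposed = normal_cdf(threshold, mu_bmd, sd_bmd)
    return p_exposed - p_unexposed

# Hypothetical inputs: mean femoral neck BMD 0.75 g/cm2, SD 0.12,
# beta = -0.02 g/cm2 per ug Cd/g creatinine, diagnostic threshold 0.56
extra = excess_osteoporosis(0.75, 0.12, -0.02, 0.3, 0.56)
```

Because the shift is linear in U-Cd with no threshold term, any reduction in exposure maps directly to a reduction in the predicted excess fraction, which is what makes the model useful for the Closer to Zero goal.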


Subjects
Cadmium, Osteoporosis, Adult, Body Weight, Bone Density, Cadmium/toxicity, Female, Humans, Male, Nutrition Surveys, Osteoporosis/chemically induced, Osteoporosis/epidemiology
10.
J Food Prot ; 85(8): 1177-1191, 2022 08 01.
Article in English | MEDLINE | ID: mdl-35358310

ABSTRACT

Reduction of foodborne illness caused by norovirus (NoV) continues to be a focus for the food safety community. Using a previously published quantitative risk assessment model, we evaluated more than 60 scenarios examining the impact of implementation of and compliance with risk management strategies identified in the U.S. Food and Drug Administration Food Code for (a) surface cleaning and sanitizing, (b) hand hygiene, (c) exclusion, or (d) restriction of ill employees. Implementation of and compliance with hand hygiene and ill food employee exclusion strategies had the largest impact on the predicted number of highly contaminated food servings and associated consumer illnesses. In scenarios in which gloves were always worn and hand washing compliance was 90%, the model estimated reductions in the number of highly contaminated food servings and ill consumers to 39 and 43% of baseline estimates (i.e., typical practice), respectively. Reductions were smaller when gloves were never worn. Hand washing compliance after using the restroom strongly impacted the predicted numbers of highly contaminated servings and consumer illnesses. Ten percent compliance with removing or excluding ill food employees was predicted to increase the number of highly contaminated food servings and ill consumers to 221 and 213% of baseline estimates, respectively. Ninety-four percent compliance with exclusion of ill food employees was predicted to decrease these numbers to 69 and 71% of baseline estimates, respectively. Surface cleaning in food establishments had a relatively small impact on these measures. Restriction of ill food employees (removal from contact with food and with food contact equipment and utensils) was not effective for reducing NoV illness unless the restriction included additional provisions. The results from this study can help risk managers prioritize mitigation strategies and their implementation for controlling the transmission of NoV and subsequent consumer foodborne illness.


Subjects
Foodborne Diseases, Norovirus, Food, Food Handling, Foodborne Diseases/prevention & control, Humans, Restaurants, Risk Assessment
11.
Risk Anal ; 42(2): 344-369, 2022 02.
Article in English | MEDLINE | ID: mdl-34121216

ABSTRACT

Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Bivalve molluscan shellfish is one commodity commonly identified as being a vector of NoV. Bivalve molluscan shellfish are grown in waters that may be affected by contamination events, tend to bioaccumulate viruses, and are frequently eaten raw. In an effort to better assess the elements that contribute to potential risk of NoV infection and illness from consumption of bivalve molluscan shellfish, the U.S. Department of Health and Human Services/Food and Drug Administration (FDA), Health Canada (HC), the Canadian Food Inspection Agency (CFIA), and Environment and Climate Change Canada (ECCC) collaborated to conduct a quantitative risk assessment for NoV in bivalve molluscan shellfish, notably oysters. This study describes the model and scenarios developed and results obtained to assess the risk of NoV infection and illness from consumption of raw oysters harvested from a quasi-steady-state situation. Among the many factors that influence the risk of NoV illness for raw oyster consumers, the concentrations of NoV in the influent (raw, untreated) and effluent (treated) of wastewater treatment plants (WWTP) were identified to be the most important. Thus, mitigation and control strategies that limit the influence from human waste (WWTP outfalls) in oyster growing areas have a major influence on the risk of illness from consumption of those oysters.


Subjects
Caliciviridae Infections, Norovirus, Ostreidae, Animals, Caliciviridae Infections/epidemiology, Canada, Food Contamination/analysis, Humans, Risk Assessment, United States
12.
Article in English | MEDLINE | ID: mdl-33735599

ABSTRACT

In food safety, process pathway risk assessments usually estimate the risk of illness from a single hazard in a single food and can inform food safety decisions and consumer advice. To evaluate the health impact of a potential change in diet, we need to understand not only the risk posed by the hazard and food under consideration but also the risk posed by the substitution food and other potential hazards. We developed a framework to provide decision-makers with a multi-faceted evaluation of the impact of dietary shifts on the risk of illness. Our case study explored exposure to inorganic arsenic (iAs) and aflatoxins through consumption of infant cereals and the risk of developing lung, bladder, and liver cancer over a lifetime. The estimated additional Disability-Adjusted Life Years (DALYs) in the U.S. from exposure to iAs and aflatoxin, based on available contamination and consumption patterns of infant rice and oat cereal, is 4,921 (90% CI: 414; 9,071). If all infant cereal consumers shift intake (maintaining equivalent serving size and frequency) to only infant rice cereal, the predicted DALYs increase to 6,942 (90% CI: 326; 12,931). If all infant cereal consumers shift intake to only infant oat cereal, the predicted DALYs decrease to 1,513 (90% CI: 312; 3,356). Changes in contaminant concentrations or the percentage of consumers, which could occur in the future, also significantly impact the predicted risk. Uncertainty in these risk predictions is primarily driven by the dose-response models. A risk-risk analysis framework provides decision-makers with a nuanced understanding of the public health impact of dietary changes and can be applied to other food safety and nutrition questions.


Subjects
Eating, Edible Grain/chemistry, Food Analysis, Food Contamination/analysis, Infant Food/analysis, Neoplasms/diagnosis, Food Safety, Humans, Infant, Risk Assessment
13.
PLoS One ; 15(4): e0231393, 2020.
Article in English | MEDLINE | ID: mdl-32352974

ABSTRACT

Whole genome sequencing (WGS) was performed on 201 Listeria monocytogenes isolates recovered from 102 of 27,389 refrigerated ready-to-eat (RTE) food samples purchased at retail in U.S. FoodNet sites as part of the 2010-2013 interagency L. monocytogenes Market Basket Survey (Lm MBS). Core genome multi-locus sequence typing (cgMLST) and in-silico analyses were conducted, and these data were analyzed with metadata for isolates from five food groups: produce, seafood, dairy, meat, and combination foods. Six of the 201 isolates, from 3 samples, were subsequently confirmed as L. welshimeri. Three samples contained one isolate per sample; among the 96 samples that contained two isolates per sample, 3 samples each contained two different strains and 93 samples each contained duplicate isolates. After the 93 duplicate isolates were removed, the remaining 102 isolates were delineated into 29 clonal complexes (CCs) or singletons based on their sequence type. The five most prevalent CCs were CC155, CC1, CC5, CC87, and CC321. Shannon's diversity index for clones per food group ranged from 1.49 for dairy isolates to 2.32 for produce isolates; these values were not significantly different in pairwise comparisons. The most common molecular serogroup as determined by in-silico analysis was IIa (45.6%), followed by IIb (27.2%), IVb (20.4%), and IIc (4.9%). The proportions of isolates within lineages I, II, and III were 48.0%, 50.0%, and 2.0%, respectively. Full-length inlA was present in 89.3% of isolates. Listeria pathogenicity island 3 (LIPI-3) and LIPI-4 were found in 51% and 30.6% of lineage I isolates, respectively. Stress survival islet 1 (SSI-1) was present in 34.7% of lineage I isolates, 80.4% of lineage II isolates, and the 2 lineage III isolates; SSI-2 was present only in the CC121 isolate. Plasmids were found in 48% of isolates, including 24.5% of lineage I isolates and 72.5% of lineage II isolates. Among the plasmid-carrying isolates, 100% contained at least one cadmium resistance cassette and 89.8% contained bcrABC, which is involved in quaternary ammonium compound tolerance. Multiple clusters of isolates from different food samples were identified by cgMLST, which, along with available metadata, could aid in the investigation of possible cross-contamination and persistence events.
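Shannon's diversity index used for the per-food-group comparison is H' = -sum(p_i ln p_i) over the clone frequencies in each group. A minimal sketch (the clonal-complex assignments below are hypothetical, not the survey's data):

```python
import math
from collections import Counter

def shannon_index(labels):
    """Shannon's diversity index H' = -sum(p_i * ln p_i) over clone frequencies."""
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

# Hypothetical clonal-complex assignments for isolates in one food group:
dairy_ccs = ["CC155", "CC155", "CC1", "CC5", "CC155", "CC1"]
h = shannon_index(dairy_ccs)
```

Higher H' means clones are more numerous and more evenly represented, which is why produce (2.32) scored above dairy (1.49) in the survey.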


Subjects
Food Microbiology, Genetic Variation, Listeria monocytogenes/genetics, Virulence/genetics, Bacterial Proteins/genetics, DNA, Bacterial/chemistry, DNA, Bacterial/metabolism, Humans, Listeria monocytogenes/classification, Listeria monocytogenes/isolation & purification, Listeria monocytogenes/pathogenicity, Listeriosis/pathology, Listeriosis/transmission, Multilocus Sequence Typing, Phylogeny, Plasmids/genetics, Plasmids/metabolism, Serogroup, Whole Genome Sequencing
14.
J Food Prot ; 83(5): 767-778, 2020 May 01.
Article in English | MEDLINE | ID: mdl-32294762

ABSTRACT

According to the U.S. Food and Drug Administration's (FDA's) rule on "Prevention of Salmonella Enteritidis in Shell Eggs during Production, Storage, and Transportation," shell eggs intended for human consumption are required to be held or transported at or below 45°F (7.2°C) ambient temperature beginning 36 h after the time of lay. Meanwhile, eggs in hatcheries are typically stored at a temperature of 65°F (18.3°C). Although most of those eggs are directed to incubators for hatching, excess eggs have the potential to be diverted for human consumption as egg products through the "breaker" market, provided these eggs are refrigerated in accordance with the FDA's requirement. Combining risk assessment models developed by the U.S. Department of Agriculture's Food Safety and Inspection Service for shell eggs and for egg products, we quantified and compared Salmonella Enteritidis levels in eggs held at 65°F versus 45°F, Salmonella Enteritidis levels in the resulting egg products, and the risk of human salmonellosis from consumption of those egg products. For eggs stored 5 days at 65°F (following 36 h at 75°F [23.9°C] in the layer house), the mean level of Salmonella Enteritidis contamination is 30-fold higher than for eggs stored at 45°F. These increased levels of contamination lead to a 47-fold increase in the risk of salmonellosis from consumption of egg products made from these eggs, with some variation in the public health risk depending on the egg product type (e.g., whole egg versus whole egg with added sugar). Assuming that 7% of the liquid egg product supply originates from eggs stored at 65°F rather than 45°F, this study estimates an additional burden of 3,562 cases of salmonellosis per year in the United States. A nominal range uncertainty analysis suggests that the relative increase in risk linked to the storage of eggs at the higher temperature estimated in this study is robust to the uncertainty surrounding the model parameters. The diversion of eggs from broiler production to human consumption under the current storage practices of 65°F (versus 45°F) would present a substantive overall increase in the risk of salmonellosis.


Subjects
Egg Shell/microbiology, Food Storage/instrumentation, Salmonella Food Poisoning, Salmonella enteritidis/growth & development, Animals, Chickens, Eggs/microbiology, Food Microbiology, Food Safety, Humans, Salmonella Food Poisoning/etiology, United States
15.
PLoS One ; 14(2): e0213039, 2019.
Article in English | MEDLINE | ID: mdl-30818354

ABSTRACT

Food safety risk assessments and large-scale epidemiological investigations have the potential to provide better and new types of information when whole genome sequence (WGS) data are effectively integrated. Today, the NCBI Pathogen Detection database WGS collections have grown significantly through improvements in technology, coordination, and collaboration, such as the GenomeTrakr and PulseNet networks. However, high-quality genomic data are not often coupled with high-quality epidemiological or food chain metadata. We have created a set of tools for cleaning, curation, integration, analysis, and visualization of microbial genome sequencing data. It has been tested using Salmonella enterica and Listeria monocytogenes data sets provided by NCBI Pathogen Detection (160,000 sequenced isolates in 2018). GenomeGraphR presents foodborne pathogen WGS data and associated curated metadata in a user-friendly interface that allows a user to query a variety of research questions such as transmission sources and dynamics, global reach, and persistence of genotypes associated with contamination in the food supply and foodborne illness across time or space. The application is freely available (https://fda-riskmodels.foodrisk.org/genomegraphr/).


Subjects
Food Microbiology, Food Safety, Foodborne Diseases/microbiology, Whole Genome Sequencing/statistics & numerical data, Databases, Genetic, Foodborne Diseases/epidemiology, Genome, Bacterial, Humans, Internet, Listeria monocytogenes/genetics, Listeria monocytogenes/isolation & purification, Listeriosis/epidemiology, Listeriosis/microbiology, Metadata, Molecular Epidemiology, Polymorphism, Single Nucleotide, Risk Assessment, Salmonella Food Poisoning/epidemiology, Salmonella Food Poisoning/microbiology, Salmonella enterica/genetics, Software, User-Computer Interface
16.
Foodborne Pathog Dis ; 16(4): 290-297, 2019 04.
Article in English | MEDLINE | ID: mdl-30735066

ABSTRACT

Listeria monocytogenes is a foodborne pathogen that disproportionately affects pregnant females, older adults, and immunocompromised individuals. Using U.S. Foodborne Diseases Active Surveillance Network (FoodNet) surveillance data, we examined listeriosis incidence rates and rate ratios (RRs) by age, sex, race/ethnicity, and pregnancy status across three periods from 2008 to 2016, as recent incidence trends in U.S. subgroups had not been evaluated. The invasive listeriosis annual incidence rate per 100,000 for 2008-2016 was 0.28 cases among the general population (excluding pregnant females) and 3.73 cases among pregnant females. For adults ≥70 years, the annual incidence rate per 100,000 was 1.33 cases. No significant change in estimated listeriosis incidence was found over the 2008-2016 period, except for a small but significantly lower pregnancy-associated rate in 2011-2013 compared with 2008-2010. Among the nonpregnancy-associated cases, RRs increased with age from 0.43 (95% confidence interval: 0.25-0.73) for 0- to 14-year-olds to 44.9 (33.5-60.0) for ≥85-year-olds, compared with 15- to 44-year-olds. Males had an incidence of 1.28 (1.12-1.45) times that of females. Compared with non-Hispanic whites, the incidence was 1.57 (1.18-1.20) times higher among non-Hispanic Asians, 1.49 (1.22-1.83) among non-Hispanic blacks, and 1.73 (1.15-2.62) among Hispanics. Among females of childbearing age, non-Hispanic Asian females had 2.72 (1.51-4.89) and Hispanic females 3.13 (2.12-4.89) times higher incidence than non-Hispanic whites. We observed a higher percentage of deaths among older patient groups compared with 15- to 44-year-olds. This study is the first to characterize higher RRs for listeriosis in the United States among non-Hispanic blacks and Asians compared with non-Hispanic whites. This information may spur further research by public health risk managers to understand whether differences in listeriosis rates relate to differences in consumption patterns of foods with higher contamination levels, food handling practices, comorbidities, immunodeficiencies, health care access, or other factors.
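The rate ratios reported here compare incidence between a subgroup and a reference group. A standard sketch of the point estimate with a log-transform 95% confidence interval (the case counts and person-years below are hypothetical, not FoodNet data):

```python
import math

def rate_ratio(cases1, pop1, cases2, pop2, z=1.96):
    """Incidence rate ratio (group 1 vs. reference group 2) with a 95% CI
    from the standard log-transform method: SE(ln RR) = sqrt(1/a + 1/b),
    where a and b are the case counts in the two groups."""
    rr = (cases1 / pop1) / (cases2 / pop2)
    se = math.sqrt(1.0 / cases1 + 1.0 / cases2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 120 cases in 40M person-years vs. 250 in 120M
rr, lo, hi = rate_ratio(120, 40_000_000, 250, 120_000_000)
```

A CI that excludes 1.0, as here, is what supports statements such as the significantly higher incidence among some subgroups in the abstract.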


Subjects
Listeria monocytogenes/isolation & purification, Listeriosis/epidemiology, Adolescent, Adult, Age Factors, Aged, Aged, 80 and over, Child, Child, Preschool, Ethnicity, Female, Foodborne Diseases/epidemiology, Foodborne Diseases/microbiology, Humans, Incidence, Infant, Infant, Newborn, Listeriosis/microbiology, Male, Middle Aged, Population Surveillance, Pregnancy, Pregnancy Complications, Infectious/epidemiology, Pregnancy Complications, Infectious/microbiology, Sex Factors, United States/epidemiology
17.
J Food Prot ; 82(1): 45-57, 2019 01.
Article in English | MEDLINE | ID: mdl-30586329

ABSTRACT

We assessed the risk of human salmonellosis from consumption of shelled walnuts in the United States and the impact of 0- to 5-log reduction treatments for Salmonella during processing. We established a baseline model with Salmonella contamination data from 2010 to 2013 surveys of walnuts from California operations to estimate baseline prevalence and levels of Salmonella during preshelling storage and typical walnut processing stages, considered U.S. consumption data, and applied an adapted dose-response model from the Food and Agriculture Organization and the World Health Organization to evaluate the risk of illness per serving and per year. Our baseline model predicted 1 case of salmonellosis per 100 million servings (95% confidence interval [CI], 1 case per 3 million to 1 case per 2 billion servings) of walnuts untreated during processing and uncooked by consumers, resulting in an estimated 6 cases of salmonellosis per year (95% CI, <1 to 278 cases) in the United States. A minimum 3-log reduction treatment for Salmonella during processing of walnuts eaten alone or as an uncooked ingredient resulted in a mean risk of <1 case per year. We modeled the impact on risk per serving of three atypical situations in which Salmonella levels were increased by 0.5 to 1.5 log CFU per unit pretreatment during processing, at the float tank or during preshelling storage, or posttreatment, during partitioning into consumer packages. No change in risk was associated with the small increase in Salmonella levels at the float tank, whereas an increase in risk was estimated for each of the other two atypical events. In a fourth scenario, we estimated the risk per serving associated with consumption of walnuts with Salmonella prevalence and levels from a 2014 to 2015 U.S. retail survey. Risk-per-serving estimates were two orders of magnitude larger than those of the baseline model without treatment. Further research is needed to determine whether this finding reflects variability in Salmonella contamination across the supply or a rare event affecting a portion of the supply.
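The structure of the risk characterization, prevalence times a dose-response probability, scaled to annual servings, with treatments modeled as log reductions of the dose, can be sketched as follows; the single-hit parameter, prevalence, contamination level, and serving count are illustrative placeholders, not the FAO/WHO fitted values or the survey data:

```python
import math

def risk_per_serving(prevalence, mean_cfu_per_serving, r, log_reduction=0.0):
    """Expected risk of salmonellosis per serving: contamination prevalence x
    single-hit dose-response 1 - exp(-r * dose), with the dose reduced
    10-fold per log of processing treatment. All inputs are illustrative."""
    dose = mean_cfu_per_serving * 10 ** (-log_reduction)
    return prevalence * (1.0 - math.exp(-r * dose))

annual_servings = 3e9   # hypothetical annual U.S. servings
baseline = risk_per_serving(1e-4, 2.0, 2e-3)
treated = risk_per_serving(1e-4, 2.0, 2e-3, log_reduction=3)
cases_baseline = baseline * annual_servings
cases_treated = treated * annual_servings
```

At low doses the single-hit model is nearly linear, so a 3-log treatment cuts predicted cases by roughly three orders of magnitude, matching the qualitative conclusion that a minimum 3-log reduction drives the mean annual risk below one case.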


Subjects
Food Contamination/analysis, Juglans/microbiology, Salmonella Food Poisoning, California, Food Microbiology, Humans, Risk Assessment, Salmonella Food Poisoning/epidemiology, Salmonella Infections, United States
18.
J Food Prot ; 81(6): 1001-1014, 2018 06.
Article in English | MEDLINE | ID: mdl-29757010

ABSTRACT

We developed a quantitative risk assessment model to assess the risk of human nontyphoidal salmonellosis from consumption of pistachios in the United States and to evaluate the impact of Salmonella treatments (1- to 5-log reductions). The exposure model, which estimates the prevalence and contamination levels of Salmonella at consumption, included pistachio processing steps such as transport from grower to huller, removal of the hull through wet abrasion, separation of pistachio floaters (immature, smaller nuts) and sinkers (mature, larger nuts) in a flotation tank, drying, storage, and partitioning. The risks of illness per serving and per year were evaluated by combining a Salmonella dose-response model with U.S. consumption data. The modeled atypical situations were the spread of Salmonella through float tank water, a delay in drying resulting in growth, increased Salmonella levels through pest infestation during storage (pre- and posttreatment), and a simulation of the 2016 U.S. salmonellosis outbreak linked to consumption of pistachios. The baseline model predicted one case of salmonellosis per 2 million servings (95% CI: one case per 5 million to one case per 800,000 servings) for sinker pistachios and one case per 200,000 servings (95% CI: one case per 400,000 to one case per 40,000 servings) for floater pistachios when no Salmonella treatment was applied and pistachios were consumed as a core product (>80% pistachio) uncooked at home. Assuming 90% of the pistachio supply is sinkers and 10% is floaters, the model estimated 419 salmonellosis cases per year (95% CI: 200 to 1,083 cases) when no Salmonella treatment was applied. A mean risk of illness of less than one case per year was estimated when a minimum 4-log reduction treatment was applied to the U.S. pistachio supply, similar to the results of the Salmonella risk assessment for almonds. This analysis revealed that the predicted risk of illness per serving was higher for all modeled atypical situations than for the baseline, with the delay in drying having the greatest impact on consumer risk.
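The headline numbers in this abstract can be reproduced with simple arithmetic. Below is a minimal sketch that assumes the per-serving risk scales linearly with a 10^-n treatment reduction at low doses and back-derives the implied number of annual servings from the reported 419 cases/year; both are illustrative assumptions, not figures taken from the study's model.

```python
# Back-of-envelope reproduction of the pistachio model's annual-risk arithmetic.
# Assumption: at low doses, per-serving risk scales as 10**-log_reduction.

RISK_SINKER = 1 / 2_000_000    # baseline risk/serving, sinker pistachios
RISK_FLOATER = 1 / 200_000     # baseline risk/serving, floater pistachios
SHARE_SINKER, SHARE_FLOATER = 0.90, 0.10

def weighted_risk_per_serving(log_reduction=0.0):
    """Supply-weighted mean risk per serving after a treatment of `log_reduction` logs."""
    base = SHARE_SINKER * RISK_SINKER + SHARE_FLOATER * RISK_FLOATER
    return base * 10 ** -log_reduction

# Annual servings back-derived from the reported 419 cases/year (hypothetical).
SERVINGS_PER_YEAR = 419 / weighted_risk_per_serving(0.0)

def cases_per_year(log_reduction=0.0):
    return SERVINGS_PER_YEAR * weighted_risk_per_serving(log_reduction)

print(round(cases_per_year(0)))   # ~419 cases/year with no treatment
print(cases_per_year(4))          # ~0.04 cases/year with a 4-log treatment
```

Under this linear assumption a 4-log treatment drops the expected burden well below one case per year, consistent with the abstract's conclusion.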


Subjects
Food Contamination/analysis, Pistacia, Salmonella Food Poisoning, Food Microbiology, Humans, Pistacia/microbiology, Risk Assessment, Salmonella Food Poisoning/epidemiology, United States
19.
Risk Anal ; 38(8): 1718-1737, 2018 08.
Article in English | MEDLINE | ID: mdl-29315715

ABSTRACT

We developed a probabilistic mathematical model for the postharvest processing of leafy greens, focusing on Escherichia coli O157:H7 contamination of fresh-cut romaine lettuce as the case study. Our model can (i) support the investigation of cross-contamination scenarios and (ii) evaluate and compare different risk mitigation options. We used an agent-based modeling framework to predict pathogen prevalence and levels in bags of fresh-cut lettuce and to quantify the spread of E. coli O157:H7 from contaminated lettuce to the surface areas of processing equipment. Using an unbalanced factorial design, we propagated combinations of random values assigned to model inputs through the different processing steps and ranked statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags and batches of fresh-cut lettuce was most significantly impacted by the level of free chlorine in the flume tank and the frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh-cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or on the surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh-cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that a flexible yet mathematically rigorous modeling tool, a "virtual laboratory," can provide valuable insights into the effectiveness of individual and combined risk mitigation options.
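A stripped-down version of such a "virtual laboratory" can be sketched in a few lines: heads pass through a shared flume tank, shedding cells into and picking cells up from the wash water, while free chlorine inactivates cells in the water between heads. All rates below (transfer fractions, chlorine kill, water-change schedule, detection threshold) are illustrative assumptions, not the study's fitted parameters.

```python
import random

def simulate_batch(n_heads=1000, p_contam=0.01, init_cfu=1e4,
                   transfer_frac=0.1, chlorine_log_kill=2.0,
                   water_change_every=500, seed=42):
    """Fraction of bags contaminated (>= 1 CFU) after flume-tank washing."""
    rng = random.Random(seed)
    water_cfu = 0.0
    contaminated = 0
    for i in range(n_heads):
        if i % water_change_every == 0:
            water_cfu = 0.0                      # replace the wash water
        cfu = init_cfu if rng.random() < p_contam else 0.0
        cfu += transfer_frac * water_cfu          # pick-up from the water
        water_cfu += transfer_frac * cfu          # shedding into the water
        water_cfu *= 10 ** -chlorine_log_kill     # chlorine inactivation
        if cfu >= 1.0:
            contaminated += 1
    return contaminated / n_heads

# Higher free chlorine suppresses cross-contamination via the shared water:
print(simulate_batch(chlorine_log_kill=0.5))   # low chlorine, more spread
print(simulate_batch(chlorine_log_kill=3.0))   # high chlorine, less spread
```

Because the incoming contamination pattern is fixed by the seed, raising the chlorine kill can only shrink the set of contaminated bags, mirroring the abstract's finding that free-chlorine level and water-change frequency drive predicted prevalence.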

20.
Risk Anal ; 38(8): 1738-1757, 2018 08.
Article in English | MEDLINE | ID: mdl-29341180

ABSTRACT

We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0- to 5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination of seeds, Salmonella growth and spread during sprout production, sprout consumption, and the Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400-248,000) cases/year. The risk reduction (5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to that from SIW testing alone, and each additional 1-log10 of seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33-448) or 1.4 (95% CI <1-4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10-146) or <1 (95% CI <1-1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted; e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22-298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
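The less-than-proportional effect of a 1-log10 seed treatment (5- to 7-fold rather than 10-fold) can be illustrated with a toy Monte Carlo: when growth during sprouting pushes doses into the saturated region of the dose-response curve, log reductions applied to seeds translate sub-proportionally into risk reductions. The exponential dose-response form and every parameter value below are illustrative assumptions, not the study's fitted inputs.

```python
import math, random

R_DOSE_RESPONSE = 2.5e-3   # exponential dose-response parameter (assumed)
GROWTH_LOG10 = 4.0         # log10 growth during sprout production (assumed)

def mean_risk(seed_treatment_log10, n=100_000, seed=1):
    """Mean per-serving illness risk after a seed treatment of the given logs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # seed contamination: ~lognormal spread around 1 CFU (assumed)
        log10_cfu = rng.gauss(0.0, 1.0) - seed_treatment_log10 + GROWTH_LOG10
        dose = 10 ** log10_cfu
        total += 1 - math.exp(-R_DOSE_RESPONSE * dose)  # exponential model
    return total / n

for t in (0, 1, 3, 5):
    print(t, round(mean_risk(t), 5))
```

In this sketch the first 1-log10 of treatment buys far less than a 10-fold risk reduction because most contaminated servings still reach saturating doses; deeper treatments, which move doses into the linear region, pay off proportionally more, qualitatively matching the pattern in the abstract.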


Subjects
Food Microbiology, Medicago sativa/adverse effects, Medicago sativa/microbiology, Salmonella Food Poisoning/etiology, Water Microbiology, Agricultural Irrigation, Bacterial Load, Food Safety/methods, Humans, Public Health, Risk Assessment, Risk Reduction Behavior, Salmonella/growth & development, Salmonella/pathogenicity, Salmonella Food Poisoning/prevention & control, Seeds/growth & development, Seeds/microbiology, United States