Results 1 - 20 of 86
1.
Foodborne Pathog Dis ; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38963777

ABSTRACT

Consumers can be exposed to many foodborne biological hazards that cause diseases with varying outcomes and incidence and, therefore, represent different levels of public health burden. To help French risk managers rank these hazards and prioritize food safety actions, we developed a three-step approach. The first step was to develop a list of foodborne hazards of health concern in mainland France. From an initial list of 335 human pathogenic biological agents, the final list of "retained hazards" consists of 24 hazards, including 12 bacteria (including bacterial toxins and metabolites), 3 viruses, and 9 parasites. The second step was to collect data to estimate the disease burden (incidence, Disability Adjusted Life Years) associated with these hazards through food during two time periods: 2008-2013 and 2014-2019. The ranks of the hazards changed slightly between the two periods. The third step was the ranking of hazards according to a multicriteria decision support model using the ELECTRE III method. Three ranking criteria were used, of which two reflect the severity of the effects (Years of life lost and Years lost due to disability) and one reflects the likelihood (incidence) of the disease. The multicriteria decision analysis approach takes into account the preferences of the risk managers through different sets of weights and the uncertainties associated with the data. The method and the data collected allowed us to estimate the health burden of foodborne biological hazards in mainland France and to define a prioritization list for the health authorities.
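The burden metric behind the ranking criteria can be sketched in a few lines. This is a minimal illustration of the DALY arithmetic (DALY = YLL + YLD) and a single-criterion ranking, not the ELECTRE III outranking procedure used in the study; the hazard names and numbers are invented.

```python
# Hedged sketch: DALY aggregation for ranking foodborne hazards.
# DALY = YLL + YLD; hazard names and values are illustrative, not from the study.
hazards = {
    # hazard: (incidence per 100,000, YLL, YLD) -- invented values
    "Hazard A": (120.0, 300.0, 150.0),
    "Hazard B": (5.0, 900.0, 60.0),
    "Hazard C": (40.0, 100.0, 400.0),
}

def daly(yll, yld):
    """Disability-Adjusted Life Years = Years of Life Lost + Years Lost to Disability."""
    return yll + yld

# Rank hazards by total DALY burden (descending) -- a single-criterion
# simplification of the multicriteria ELECTRE III approach in the paper.
ranked = sorted(hazards, key=lambda h: daly(hazards[h][1], hazards[h][2]), reverse=True)
print(ranked)
```

ELECTRE III additionally handles criterion weights and data uncertainty through pairwise outranking relations, which is why the study used it instead of a simple sum.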

2.
Regul Toxicol Pharmacol ; 144: 105487, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37640100

ABSTRACT

The U.S. Food and Drug Administration (FDA) developed an oral toxicological reference value (TRV) for characterizing potential health concerns from dietary exposure to cadmium (Cd). The development of the TRV leveraged the FDA's previously published research, including (1) a systematic review of adverse health effects associated with oral Cd exposure and (2) a human physiologically based pharmacokinetic (PBPK) model adapted from Kjellstrom and Nordberg (1978) for use in reverse dosimetry applied to the U.S. population. Adverse effects of Cd on the bone and kidney are associated with similar points of departure (PODs) of approximately 0.50 µg Cd/g creatinine for females aged 50-60 based on available epidemiologic data. We also used the upper bound estimate of the renal cortical concentration (50 µg/g Cd) occurring in the U.S. population at 50 years of age as a POD. Based on the output from our reverse dosimetry PBPK model, a range of 0.21-0.36 µg/kg bw/day was developed for the TRV. The animal data used for the animal TRV derivation (0.63-1.8 µg/kg bw/day) confirm biological plausibility for both the bone and kidney endpoints.
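The reverse-dosimetry step can be illustrated with a deliberately simplified linear version. The steady-state coefficient below is a made-up placeholder; the FDA work used a full PBPK model, not this one-line back-calculation.

```python
# Hedged sketch: reverse dosimetry from a urinary point of departure (POD)
# to a daily intake. K_STEADY_STATE is an illustrative placeholder, not a
# fitted PBPK parameter from the FDA model.
POD_URINARY = 0.50    # µg Cd/g creatinine (POD from the abstract)
K_STEADY_STATE = 1.8  # (µg/g creatinine) per (µg/kg bw/day) -- assumed value

def reverse_dosimetry(pod, k):
    """Back-calculate the daily intake that would produce the urinary POD."""
    return pod / k

trv = reverse_dosimetry(POD_URINARY, K_STEADY_STATE)
print(round(trv, 3))  # µg/kg bw/day; near the 0.21-0.36 range reported
```

The placeholder coefficient was chosen so the output lands near the published TRV range; a real PBPK model resolves age, sex, and kinetic nonlinearity that this linearization ignores.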


Subject(s)
Cadmium , Environmental Exposure , Female , Animals , Humans , Middle Aged , Cadmium/toxicity , Environmental Exposure/adverse effects , Reference Values , Food , Kidney
3.
Risk Anal ; 43(9): 1713-1732, 2023 Sep.
Article in English | MEDLINE | ID: mdl-36513596

ABSTRACT

The objective of this study was to leverage quantitative risk assessment to investigate possible root cause(s) of foodborne illness outbreaks related to Shiga toxin-producing Escherichia coli O157:H7 (STEC O157) infections in leafy greens in the United States. To this end, we developed the FDA leafy green quantitative risk assessment epidemic curve prediction model (FDA-LG QRA-EC), which simulated the lettuce supply chain. The model was used to predict the number of reported illnesses and the epidemic curve associated with lettuce contaminated with STEC O157 for a wide range of scenarios representing various contamination conditions and facility processing/sanitation practices. Model predictions were generated for fresh-cut and whole lettuce, quantifying the differing impacts of facility processing and home preparation on predicted illnesses. Our model revealed that the timespan (i.e., number of days with at least one reported illness) and the peak (i.e., the day with the highest predicted number of reported illnesses) of the epidemic curve of a STEC O157-lettuce outbreak were not strongly influenced by facility processing/sanitation practices and were instead indicative of the contamination pattern among incoming lettuce batches received by the facility or distribution center. Through comparisons with observed numbers of illnesses from recent STEC O157-lettuce outbreaks, the model identified contamination conditions on incoming lettuce heads that could result in an outbreak of similar size, which can be used to narrow down potential root cause hypotheses.
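The two epidemic-curve summary statistics named in the abstract are straightforward to compute once a daily case series exists. The series below is a toy example, not output of the FDA-LG QRA-EC model.

```python
# Hedged sketch: the "timespan" (days with >= 1 reported illness) and "peak"
# (day with the highest count) metrics from the abstract, applied to an
# invented daily case series.
daily_reported = [0, 0, 2, 5, 9, 4, 1, 0, 1, 0]  # illustrative illnesses/day

timespan = sum(1 for c in daily_reported if c >= 1)  # days with at least one case
peak_day = max(range(len(daily_reported)), key=lambda d: daily_reported[d])

print(timespan, peak_day)
```

In the study, the shapes of such curves (long flat spans vs. sharp peaks) were what pointed back to the contamination pattern among incoming batches.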


Subject(s)
Epidemics , Escherichia coli O157 , Foodborne Diseases , Humans , United States/epidemiology , Lactuca , Disease Outbreaks , Foodborne Diseases/epidemiology
4.
Environ Res ; 212(Pt B): 113315, 2022 09.
Article in English | MEDLINE | ID: mdl-35436451

ABSTRACT

We developed an association model to estimate the risk of femoral neck low bone mass and osteoporosis from exposure to cadmium for women and men aged 50-79 in the U.S., as a function of urinary cadmium (U-Cd) levels. We analyzed data from the NHANES 2005-2014 surveys and evaluated the relationship between U-Cd and femoral neck bone mineral density (BMD) using univariate and multivariate regression models with a combination of NHANES cycle, gender, age, smoking, race/ethnicity, height, body weight, body mass index, lean body mass, diabetes, kidney disease, physical activity, menopausal status, hormone replacement therapy, urinary lead, and prednisone intake as confounding variables. The regression coefficient between U-Cd and femoral neck BMD obtained with the best multivariate regression was used to develop an association model that can estimate the additional risk of low bone mass or osteoporosis in the population given a certain level of U-Cd. Results showed a linear relationship between U-Cd and BMD, conditional on body weight, where individuals with higher U-Cd had decreased BMD values. Our results do not support the hypothesis of a threshold for the effect of Cd on bone. Our model estimates that exposure to Cd results in an increase of 0.51 percentage points (CI95% 0.00, 0.92) in the population diagnosed with osteoporosis, compared to a theoretical absence of exposure. We estimate that 16% (CI95%: 0.00, 40%) of osteoporosis cases in the U.S. population aged 50-79 are a result of Cd exposure. This study presents the first continuous model estimating low bone mass and osteoporosis risk in the U.S. population given actual or potential changes in U-Cd levels. Our model provides information to support the goal of FDA's Closer to Zero initiative to reduce exposure to toxic elements.
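The relationship between the abstract's two headline figures (a 0.51 percentage-point excess and a 16% attributable fraction) is simple arithmetic. The observed prevalence below is an assumed value, chosen only so the ratio reproduces the reported fraction.

```python
# Hedged sketch: attributable-fraction arithmetic consistent with the abstract.
# prev_observed is an illustrative assumption, not an NHANES estimate.
prev_observed = 3.2  # % of the population with osteoporosis, Cd exposure included
excess = 0.51        # percentage points attributed to Cd exposure (from abstract)

prev_no_exposure = prev_observed - excess
attributable_fraction = excess / prev_observed
print(round(100 * attributable_fraction))  # percent of cases attributed to Cd
```

With these numbers the attributable fraction comes out at about 16%, matching the abstract's estimate by construction.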


Subject(s)
Cadmium , Osteoporosis , Adult , Body Weight , Bone Density , Cadmium/toxicity , Female , Humans , Male , Nutrition Surveys , Osteoporosis/chemically induced , Osteoporosis/epidemiology
5.
Risk Anal ; 42(2): 344-369, 2022 02.
Article in English | MEDLINE | ID: mdl-34121216

ABSTRACT

Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Bivalve molluscan shellfish is one commodity commonly identified as being a vector of NoV. Bivalve molluscan shellfish are grown in waters that may be affected by contamination events, tend to bioaccumulate viruses, and are frequently eaten raw. In an effort to better assess the elements that contribute to potential risk of NoV infection and illness from consumption of bivalve molluscan shellfish, the U.S. Department of Health and Human Services/Food and Drug Administration (FDA), Health Canada (HC), the Canadian Food Inspection Agency (CFIA), and Environment and Climate Change Canada (ECCC) collaborated to conduct a quantitative risk assessment for NoV in bivalve molluscan shellfish, notably oysters. This study describes the model and scenarios developed and results obtained to assess the risk of NoV infection and illness from consumption of raw oysters harvested from a quasi-steady-state situation. Among the many factors that influence the risk of NoV illness for raw oyster consumers, the concentrations of NoV in the influent (raw, untreated) and effluent (treated) of wastewater treatment plants (WWTP) were identified to be the most important. Thus, mitigation and control strategies that limit the influence from human waste (WWTP outfalls) in oyster growing areas have a major influence on the risk of illness from consumption of those oysters.


Subject(s)
Caliciviridae Infections , Norovirus , Ostreidae , Animals , Caliciviridae Infections/epidemiology , Canada , Food Contamination/analysis , Humans , Risk Assessment , United States
6.
Foodborne Pathog Dis ; 16(4): 290-297, 2019 04.
Article in English | MEDLINE | ID: mdl-30735066

ABSTRACT

Listeria monocytogenes is a foodborne pathogen that disproportionately affects pregnant females, older adults, and immunocompromised individuals. Using U.S. Foodborne Diseases Active Surveillance Network (FoodNet) surveillance data, we examined listeriosis incidence rates and rate ratios (RRs) by age, sex, race/ethnicity, and pregnancy status across three periods from 2008 to 2016, as recent incidence trends in U.S. subgroups had not been evaluated. The invasive listeriosis annual incidence rate per 100,000 for 2008-2016 was 0.28 cases among the general population (excluding pregnant females) and 3.73 cases among pregnant females. For adults ≥70 years, the annual incidence rate per 100,000 was 1.33 cases. No significant change in estimated listeriosis incidence was found over the 2008-2016 period, except for a small but significantly lower pregnancy-associated rate in 2011-2013 when compared with 2008-2010. Among the nonpregnancy-associated cases, RRs increased with age from 0.43 (95% confidence interval: 0.25-0.73) for 0- to 14-year-olds to 44.9 (33.5-60.0) for ≥85-year-olds, compared with 15- to 44-year-olds. Males had an incidence 1.28 (1.12-1.45) times that of females. Compared with non-Hispanic whites, the incidence was 1.57 (1.18-1.20) times higher among non-Hispanic Asians, 1.49 (1.22-1.83) among non-Hispanic blacks, and 1.73 (1.15-2.62) among Hispanics. Among females of childbearing age, non-Hispanic Asian females had 2.72 (1.51-4.89) and Hispanic females 3.13 (2.12-4.89) times higher incidence than non-Hispanic whites. We observed a higher percentage of deaths among older patient groups compared with 15- to 44-year-olds. This study is the first to characterize higher RRs for listeriosis in the United States among non-Hispanic blacks and Asians compared with non-Hispanic whites. This information for public health risk managers may spur further research to understand whether differences in listeriosis rates relate to differences in consumption patterns of foods with higher contamination levels, food handling practices, comorbidities, immunodeficiencies, health care access, or other factors.
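The rate ratios with 95% confidence intervals quoted above are typically computed on the log scale. A minimal sketch of that standard surveillance calculation, using invented case counts and person-years (not the FoodNet data):

```python
# Hedged sketch: incidence rate ratio (RR) with a Wald 95% CI on the log scale,
# the usual calculation behind figures like "1.28 (1.12-1.45)". Counts below
# are illustrative only.
import math

def rate_ratio_ci(cases1, py1, cases0, py0, z=1.96):
    """RR of group 1 vs group 0 with a log-scale Wald confidence interval."""
    rr = (cases1 / py1) / (cases0 / py0)
    se = math.sqrt(1 / cases1 + 1 / cases0)  # SE of log(RR), Poisson counts
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = rate_ratio_ci(cases1=320, py1=250_000_000, cases0=250, py0=250_000_000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

Equal person-years were chosen here so the RR reduces to the case-count ratio; with real surveillance data the denominators differ by subgroup.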


Subject(s)
Listeria monocytogenes/isolation & purification , Listeriosis/epidemiology , Adolescent , Adult , Age Factors , Aged , Aged, 80 and over , Child , Child, Preschool , Ethnicity , Female , Foodborne Diseases/epidemiology , Foodborne Diseases/microbiology , Humans , Incidence , Infant , Infant, Newborn , Listeriosis/microbiology , Male , Middle Aged , Population Surveillance , Pregnancy , Pregnancy Complications, Infectious/epidemiology , Pregnancy Complications, Infectious/microbiology , Sex Factors , United States/epidemiology
7.
Risk Anal ; 38(8): 1718-1737, 2018 08.
Article in English | MEDLINE | ID: mdl-29315715

ABSTRACT

We developed a probabilistic mathematical model for the postharvest processing of leafy greens, focusing on Escherichia coli O157:H7 contamination of fresh-cut romaine lettuce as the case study. Our model can (i) support the investigation of cross-contamination scenarios and (ii) evaluate and compare different risk mitigation options. We used an agent-based modeling framework to predict the pathogen prevalence and levels in bags of fresh-cut lettuce and quantify spread of E. coli O157:H7 from contaminated lettuce to surface areas of processing equipment. Using an unbalanced factorial design, we were able to propagate combinations of random values assigned to model inputs through the different processing steps and to rank statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags of fresh-cut lettuce and batches was most significantly impacted by the level of free chlorine in the flume tank and the frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh-cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh-cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that developing a flexible yet mathematically rigorous modeling tool, a "virtual laboratory," can provide valuable insights into the effectiveness of individual and combined risk mitigation options.

8.
Risk Anal ; 38(8): 1738-1757, 2018 08.
Article in English | MEDLINE | ID: mdl-29341180

ABSTRACT

We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0- to 5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400-248,000) cases/year. The risk reduction (5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33-448) or 1.4 (95% CI <1-4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10-146) or <1 (95% CI <1-1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted; e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22-298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
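A naive way to read a "k-log10 seed treatment" is as a 10^-k scaling of surviving Salmonella. The sketch below shows only that log10 arithmetic under an assumed linear low-dose relation between surviving cells and cases; it deliberately does not reproduce the full model, which accounts for regrowth during sprouting and therefore predicted smaller benefits (e.g., 139 cases/year for a 3-log10 treatment, not the 76.6 this linearization gives).

```python
# Hedged sketch: linear screening approximation of a k-log10 seed treatment.
# Assumes cases scale with surviving Salmonella on seeds -- an assumption the
# full risk model does NOT make (it models growth during sprout production).
BASELINE_CASES = 76_600  # predicted cases/year with no intervention (abstract)

def cases_after_treatment(log10_reduction):
    """Cases/year if risk scaled linearly with a 10^-k reduction on seeds."""
    return BASELINE_CASES * 10 ** (-log10_reduction)

for k in (1, 3, 5):
    print(k, round(cases_after_treatment(k), 1))
```

The gap between this linear sketch and the published predictions is itself instructive: Salmonella growth during production erodes part of the benefit of seed treatment.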


Subject(s)
Food Microbiology , Medicago sativa/adverse effects , Medicago sativa/microbiology , Salmonella Food Poisoning/etiology , Water Microbiology , Agricultural Irrigation , Bacterial Load , Food Safety/methods , Humans , Public Health , Risk Assessment , Risk Reduction Behavior , Salmonella/growth & development , Salmonella/pathogenicity , Salmonella Food Poisoning/prevention & control , Seeds/growth & development , Seeds/microbiology , United States
9.
Appl Environ Microbiol ; 83(23)2017 Dec 01.
Article in English | MEDLINE | ID: mdl-28939600

ABSTRACT

This study examined the inactivation of human norovirus (HuNoV) GI.1 and GII.4 by chlorine under conditions mimicking sewage treatment. Using a porcine gastric mucin-magnetic bead (PGM-MB) assay, no statistically significant loss in HuNoV binding (inactivation) was observed for secondary effluent treatments of ≤25 ppm total chlorine; for both strains, 50 and 100 ppm treatments resulted in ≤0.8-log10 unit and ≥3.9-log10 unit reductions, respectively. Treatments of 10, 25, 50, and 100 ppm chlorine inactivated 0.31, 1.35, >5, and >5 log10 units, respectively, of the norovirus indicator MS2 bacteriophage. Evaluation of treatment time indicated that the vast majority of MS2 and HuNoV inactivation occurred in the first 5 min for 0.2-µm-filtered, prechlorinated secondary effluent. Free chlorine measurements of secondary effluent seeded with MS2 and HuNoV demonstrated substantial oxidative burdens. With 25, 50, and 100 ppm treatments, free chlorine levels after 5 min of exposure ranged from 0.21 to 0.58 ppm, from 0.28 to 16.7 ppm, and from 11.6 to 53 ppm, respectively. At chlorine treatment levels of >50 ppm, statistically significant differences were observed between reductions for PGM-MB-bound HuNoV (potentially infectious) particles and those for unbound (noninfectious) HuNoV particles or total norovirus particles. While results suggested that MS2 and HuNoV (measured as PGM-MB binding) behave similarly, although not identically, both have limited susceptibility to chlorine treatments of ≤25 ppm total chlorine. Since sewage treatment is performed at ≤25 ppm total chlorine, targeting free chlorine levels of 0.5 to 1.0 ppm, these results suggest that traditional chlorine-based sewage treatment does not inactivate HuNoV efficiently. IMPORTANCE: HuNoV is ubiquitous in sewage. A receptor binding assay was used to assess inactivation of HuNoV by chlorine-based sewage treatment, given that the virus cannot be routinely propagated in vitro. Results reported here indicate that chlorine treatment of sewage is not effective for inactivating HuNoV unless chlorine levels are above those routinely used for sewage treatment.
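The log10 reduction unit used throughout this abstract is computed directly from before/after titers. A minimal sketch with illustrative titers:

```python
# Hedged sketch: computing a log10 reduction from before/after titers, the unit
# used in the abstract (e.g., ">=3.9-log10 reduction at 100 ppm").
import math

def log10_reduction(titer_before, titer_after):
    """Return log10(N0/N); both titers must be in the same units (PFU/mL, gc/mL, ...)."""
    return math.log10(titer_before / titer_after)

# Illustrative titers only, not measured values from the study.
print(round(log10_reduction(1e6, 1e2), 1))  # 4.0-log10 reduction
```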


Subject(s)
Chlorine/pharmacology , Disinfectants/pharmacology , Levivirus/drug effects , Norovirus/drug effects , Sewage/virology , Waste Disposal, Fluid/methods , Animals , Humans , Levivirus/growth & development , Norovirus/growth & development , Sewage/chemistry , Swine , Virus Inactivation/drug effects
11.
Risk Anal ; 37(11): 2080-2106, 2017 11.
Article in English | MEDLINE | ID: mdl-28247943

ABSTRACT

We developed a quantitative risk assessment model using a discrete event framework to quantify and study the risk associated with norovirus transmission to consumers through food contaminated by infected food employees in a retail food setting. This study focused on the impact of ill food workers experiencing symptoms of diarrhea and vomiting and potential control measures for the transmission of norovirus to foods. The model examined the behavior of food employees regarding exclusion from work while ill and after symptom resolution and preventive measures limiting food contamination during preparation. The mean numbers of infected customers estimated for 21 scenarios were compared to the estimate for a baseline scenario representing current practices. Results show that prevention strategies examined could not prevent norovirus transmission to food when a symptomatic employee was present in the food establishment. Compliance with exclusion from work of symptomatic food employees is thus critical, with an estimated range of 75-226% of the baseline mean for full to no compliance, respectively. Results also suggest that efficient handwashing, handwashing frequency associated with gloving compliance, and elimination of contact between hands, faucets, and door handles in restrooms reduced the mean number of infected customers to 58%, 62%, and 75% of the baseline, respectively. This study provides quantitative data to evaluate the relative efficacy of policy and practices at retail to reduce norovirus illnesses and provides new insights into the interactions and interplay of prevention strategies and compliance in reducing transmission of foodborne norovirus.


Subject(s)
Caliciviridae Infections/transmission , Food Contamination , Food Handling , Norovirus , Risk Assessment/methods , Algorithms , Disease Outbreaks , Food , Food Microbiology , Foodborne Diseases/epidemiology , Gastroenteritis/virology , Humans , Statistical Models , Occupational Exposure , Prevalence , Restaurants , Time Factors
12.
Foodborne Pathog Dis ; 14(9): 524-530, 2017 09.
Article in English | MEDLINE | ID: mdl-28632414

ABSTRACT

Listeria monocytogenes is an important cause of foodborne illness hospitalization, fetal loss, and death in the United States. The listeriosis incidence rate varies significantly among population subgroups, with pregnant women, older persons, and the Hispanic population having increased relative risks compared with other subpopulations. Using estimated rates of listeriosis per subpopulation based on FoodNet data from 2004 to 2009, we evaluate the expected number of cases and incidence rates of listeriosis in the US population and the pregnant women subpopulation as the demographic composition changes over time with respect to ethnicity, pregnancy status, and age distribution. If the incidence rate per subpopulation is held constant, the overall US population listeriosis incidence rate would increase from 0.25 per 100,000 (95% confidence interval [CI]: 0.19-0.34) in 2010 to 0.28 (95% CI: 0.22-0.38) in 2020 and 0.32 (95% CI: 0.25-0.43) in 2030, because of the changes in the population structure. Similarly, the pregnancy-associated incidence rate is expected to increase from 4.0 per 100,000 pregnant women (95% CI: 2.5-6.5) in 2010 to 4.1 (95% CI: 2.6-6.7) in 2020 and 4.4 (95% CI: 2.7-7.2) in 2030 as the proportion of Hispanic pregnant women increases. We further estimate that a reduction of 12% in the exposure of the US population to L. monocytogenes would be needed to maintain a constant incidence rate from 2010 to 2020 (current trend), assuming infectivity (strain virulence distribution and individual susceptibility) is unchanged. To reduce the overall US population incidence rate by one-third (Healthy People 2020 goal) would require a reduction in exposure (or infectivity) to L. monocytogenes of 48% over the same time period. Reduction or elimination of exposure in the pregnant and Hispanic subpopulations alone could not meet this target. This information may be useful in setting public health targets, developing risk management options, and interpreting trends in the public health burden of foodborne listeriosis in the United States.
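The projection mechanism described above is direct standardization: subgroup rates are held fixed while the demographic mix shifts. A minimal sketch with invented subgroup shares (the rates only loosely echo the orders of magnitude reported):

```python
# Hedged sketch: population-weighted incidence under a shifting demographic mix,
# with per-subgroup rates held constant. All shares and rates are illustrative.
rates = {"general": 0.20, "elderly": 1.30, "pregnant": 4.0}  # per 100,000 (assumed)

def overall_rate(shares):
    """Population-weighted incidence per 100,000 for a given demographic mix."""
    return sum(rates[g] * shares[g] for g in rates)

mix_2010 = {"general": 0.90, "elderly": 0.09, "pregnant": 0.01}
mix_2030 = {"general": 0.86, "elderly": 0.13, "pregnant": 0.01}  # aging population

print(round(overall_rate(mix_2010), 3), round(overall_rate(mix_2030), 3))
```

Because the higher-risk subgroups grow as a share of the population, the overall rate rises even though every subgroup rate is unchanged, which is exactly the effect the abstract quantifies.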


Subject(s)
Foodborne Diseases/epidemiology , Listeria monocytogenes/physiology , Listeriosis/epidemiology , Pregnancy Complications, Infectious/epidemiology , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Demography , Female , Foodborne Diseases/microbiology , Hispanic or Latino/statistics & numerical data , Humans , Incidence , Infant , Listeriosis/microbiology , Male , Middle Aged , Pregnancy , Pregnancy Complications, Infectious/microbiology , Public Health , United States/epidemiology , Young Adult
13.
Emerg Infect Dis ; 22(12): 2113-2119, 2016 12.
Article in English | MEDLINE | ID: mdl-27869595

ABSTRACT

The relationship between the number of ingested Listeria monocytogenes cells in food and the likelihood of developing listeriosis is not well understood. Data from an outbreak of listeriosis linked to milkshakes made from ice cream produced in 1 factory showed that contaminated products were distributed widely to the public without any reported cases, except for 4 cases of severe illness in persons who were highly susceptible. Ingestion of high doses of L. monocytogenes by the patients infected through milkshakes was unlikely, provided that additional contamination during preparation of the milkshakes can be ruled out. This outbreak illustrated that the vast majority of the population did not become ill after ingesting a low level of L. monocytogenes, but it raises the question of listeriosis cases in highly susceptible persons after distribution of low-level contaminated products that did not support the growth of this pathogen.


Subject(s)
Disease Outbreaks , Foodborne Diseases/epidemiology , Ice Cream/microbiology , Listeriosis/epidemiology , Listeriosis/microbiology , Aged , Aged, 80 and over , Bacterial Load , Food Contamination , Food Microbiology , History, 21st Century , Humans , Listeria monocytogenes , Listeriosis/history , Listeriosis/transmission , Population Surveillance , United States/epidemiology
14.
Appl Environ Microbiol ; 81(14): 4669-81, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25934626

ABSTRACT

Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs.
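The meta-analysis means combine by simple addition on the log10 scale: an influent concentration plus a (negative) log10 reduction gives an expected effluent concentration. A one-line sketch using two means quoted in the abstract:

```python
# Hedged sketch: expected effluent concentration from the meta-analysis means.
# Values are the abstract's point estimates; combining means this way ignores
# the WWTP-to-WWTP variability the paper quantifies with credible intervals.
influent_log10 = 3.9    # NoV GII mean influent, log10 gc/liter
reduction_log10 = -2.7  # mean log10 reduction, mechanical system + chlorine

effluent_log10 = influent_log10 + reduction_log10
print(round(effluent_log10, 1))  # log10 gc/liter expected in effluent
```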


Subject(s)
Coliphages/growth & development , Fresh Water/virology , Norovirus/growth & development , Wastewater/microbiology , Water Purification/instrumentation , Coliphages/genetics , Coliphages/isolation & purification , Disinfection , Fresh Water/chemistry , Genotype , Norovirus/genetics , Norovirus/isolation & purification , Wastewater/chemistry
15.
Food Microbiol ; 45(Pt B): 245-53, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25500390

ABSTRACT

When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise.
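The mechanism behind the overestimation can be shown with a small worked comparison. The sketch below is an illustration of the paper's general point, not its actual model: a concentration carried through a 3-log10 inactivation followed by 3-log10 growth deterministically retains "fractional cells," while tracking discrete cells recognizes that most contaminated units lose all cells and only a few see large regrowth. Under a nonlinear dose-response, the two routes give very different risks; the dose-response parameter is an assumed value.

```python
# Hedged sketch: risk from modeling concentrations vs. discrete cell numbers
# through inactivation (10^-3 survival) then growth (10^3), with an assumed
# exponential dose-response. Not the simulation design of the paper.
import math

R = 0.01  # exponential dose-response parameter (illustrative assumption)

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def risk_concentration(n0, log_inact, log_growth):
    # Fractional "cells" survive deterministically and then grow.
    dose = n0 * 10 ** (-log_inact) * 10 ** log_growth
    return 1 - math.exp(-R * dose)

def risk_numbers(n0, log_inact, log_growth):
    # Integer survivors: Binomial(n0, 10^-log_inact), then each grows 10^log_growth.
    p = 10 ** (-log_inact)
    growth = 10 ** log_growth
    return sum(binom_pmf(k, n0, p) * (1 - math.exp(-R * k * growth))
               for k in range(n0 + 1))

rc = risk_concentration(5, 3, 3)
rn = risk_numbers(5, 3, 3)
print(round(rc / rn, 1))  # roughly a 10-fold overestimate in this scenario
```

This reproduces, in miniature, the paper's finding that large inactivation followed by large growth is exactly the regime where modeling concentrations inflates the average risk by more than 10-fold.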


Subject(s)
Bacteria/growth & development , Food Microbiology , Bacteria/chemistry , Food Contamination , Theoretical Models , Risk Assessment
16.
Risk Anal ; 35(1): 90-108, 2015 Jan.
Article in English | MEDLINE | ID: mdl-24975545

ABSTRACT

Evaluations of Listeria monocytogenes dose-response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well-established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal-Poisson dose-response model was chosen, and proved able to reconcile dose-response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta-Poisson dose-response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose-response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.


Subject(s)
Host-Parasite Interactions , Listeria monocytogenes/pathogenicity , Theoretical Models , Virulence
17.
Foods ; 13(5)2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38472864

ABSTRACT

Better knowledge regarding the Listeria monocytogenes dose-response (DR) model is needed to refine the assessment of the risk of foodborne listeriosis. In 2018, the European Food Safety Authority (EFSA) derived a lognormal Poisson DR model for 14 different age-sex sub-groups, marginally to strain virulence. In the present study, new sets of parameters are developed by integrating the EFSA model for these sub-groups together with three classes of strain virulence characteristics ("less virulent", "virulent", and "more virulent"). Considering classes of virulence leads to estimated relative risks (RRs) of listeriosis following the ingestion of 1000 bacteria of "less virulent" vs. "more virulent" strains ranging from 21.6 to 24.1, depending on the sub-group. These relatively low RRs, when compared with RRs linked to comorbidities described in the literature, suggest that the influence of comorbidity on the occurrence of invasive listeriosis for a given exposure is much more important than the influence of the virulence of the strains. The updated model parameters allow better prediction of the risk of invasive listeriosis across a population of interest, provided the necessary data on population demographics and the proportional contribution of strain virulence classes in food products of interest are available. An R package is made available to facilitate the use of these dose-response models.

18.
Food Microbiol; 36(2): 149-60, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24010593

RESUMEN

In response to increased concerns about spice safety, the United States Food and Drug Administration (FDA) initiated research to characterize the prevalence and levels of Salmonella in imported spices. A total of 299 imported dried capsicum shipments and 233 imported sesame seed shipments offered for entry into the United States were sampled. Observed Salmonella shipment prevalence was 3.3% (1500 g examined; 95% confidence interval (CI) 1.6-6.1%) for capsicum and 9.9% (1500 g; 95% CI 6.3-14%) for sesame seed. Within-shipment contamination was not inconsistent with a Poisson distribution. Estimated shipment-mean Salmonella levels among contaminated shipments ranged from 6 × 10⁻⁴ to 0.09 MPN/g (capsicum) or 6 × 10⁻⁴ to 0.04 MPN/g (sesame seed). Among the six parametric models considered, a gamma-Poisson model provided the best fit to the observed data for both the imported capsicum shipments and the imported sesame seed shipments sampled in this study. Shipment-mean Salmonella levels vary widely between shipments, and many contaminated shipments contain low levels of contamination. Examination of sampling plan efficacy for identifying contaminated spice shipments from these distributions indicates that the mass of spice examined is critical: sampling protocols examining 25 g samples are predicted to identify only a small fraction of contaminated shipments of imported capsicum or sesame seed.
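Under the Poisson within-shipment assumption, the probability that an analytical sample of m grams contains at least one cell is 1 − exp(−c·m), where c is the shipment mean level in MPN/g. A minimal sketch, using levels from the range reported above, shows why 25 g samples catch few contaminated shipments while 1500 g samples perform far better:

```python
import math

def p_detect(mean_mpn_per_g, sample_g):
    """Probability that a sample of `sample_g` grams contains at least one
    Salmonella cell, assuming cells are Poisson-distributed in the shipment."""
    return -math.expm1(-mean_mpn_per_g * sample_g)

# Shipment mean levels spanning the range reported for contaminated shipments.
for c in (6e-4, 0.01, 0.09):
    print(f"{c:.0e} MPN/g: P(detect | 25 g) = {p_detect(c, 25):.3f}, "
          f"P(detect | 1500 g) = {p_detect(c, 1500):.3f}")
```

At the lowest reported level (6 × 10⁻⁴ MPN/g), a 25 g sample detects contamination only about 1.5% of the time, while a 1500 g composite detects it more than half the time.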


Subject(s)
Capsicum/microbiology, Food Contamination/analysis, Salmonella/isolation & purification, Sesamum/microbiology, Spices/microbiology, Food Contamination/economics, Food Safety, Salmonella/classification, Salmonella/genetics, Seeds/microbiology, Spices/economics, United States, United States Food and Drug Administration/statistics & numerical data
19.
Foodborne Pathog Dis; 10(11): 907-15, 2013 Nov.
Article in English | MEDLINE | ID: mdl-23869961

RESUMEN

While adequate, statistically designed sampling plans should be used whenever feasible, inference about the presence of pathogens in food occasionally has to be made from small numbers of samples. To aid the interpretation of such results, we reviewed the impact of small sample sizes on pathogen detection and prevalence estimation, evaluating four situations commonly encountered in practice. The first two examples evaluate the combined impact of sample size and pathogen prevalence (i.e., the fraction of contaminated food items in a given lot) on pathogen detection and prevalence estimation. The latter two extend these examples to consider the impact of pathogen concentration and imperfect test sensitivity. The examples highlight the difficulty of drawing inference from small numbers of samples and emphasize the importance of using appropriate statistical sampling designs whenever possible.
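The basic sample-size situation described above has a closed form: if a lot has prevalence p and n items are tested, the probability that at least one tests positive is 1 − (1 − p·Se)^n, where Se is the per-item test sensitivity (Se = 1 for a perfect test). A short sketch:

```python
def p_detect_lot(prevalence, n_samples, sensitivity=1.0):
    """Probability that at least one of n sampled items tests positive, given
    the within-lot prevalence and the per-item test sensitivity."""
    p_positive = prevalence * sensitivity
    return 1.0 - (1.0 - p_positive) ** n_samples

# Even at 10% prevalence, 5 samples detect contamination well under half the time:
print(p_detect_lot(0.10, 5))                   # perfect test, ≈ 0.41
print(p_detect_lot(0.10, 5, sensitivity=0.8))  # imperfect test, ≈ 0.34
```

Imperfect sensitivity simply scales the effective prevalence, so the two effects compound, which is one reason small-sample "negative" results are weak evidence of absence.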


Subject(s)
Food Contamination/analysis, Food Microbiology/methods, Probability, Sample Size, Sensitivity and Specificity
20.
PLoS One; 18(12): e0294624, 2023.
Article in English | MEDLINE | ID: mdl-38051743

RESUMEN

The serovars of Salmonella enterica display dramatic differences in pathogenesis and host preferences. We developed a process (patent pending) for grouping Salmonella isolates and serovars by their public health risk. We collated a curated set of 12,337 S. enterica isolate genomes from human, beef, and bovine sources in the US. After annotating a virulence gene catalog for each isolate, we used unsupervised random forest methods to estimate the proximity (similarity) between isolates based upon the genomic presentation of putative virulence traits. We then grouped isolates into virulence clusters using hierarchical clustering (Ward's method), used non-parametric bootstrapping to assess cluster stability, and externally validated the clusters against epidemiological virulence measures from FoodNet, the National Outbreak Reporting System (NORS), and US federal sampling of beef products. We identified five stable virulence clusters of S. enterica serovars. Cluster 1 (higher virulence) serovars yielded an annual incidence rate of domestically acquired sporadic cases roughly one and a half times higher than the other four clusters combined (Clusters 2-5, lower virulence). Compared with the other clusters, Cluster 1 also had a higher proportion of infections leading to hospitalization and was implicated in more foodborne and beef-associated outbreaks, despite being isolated from beef products at a frequency similar to the other clusters. We also identified subpopulations within 11 serovars. Remarkably, we found S. Infantis and S. Typhimurium subpopulations that differed significantly in genome length and clinical case presentation. Further, we found that the presence of the pESI plasmid accounted for the genome length differences between the S. Infantis subpopulations. Our results show that the S. enterica strains associated with the highest incidence of human infections share a common virulence repertoire. This work could be updated regularly and used in combination with foodborne surveillance information to prioritize serovars of public health concern.
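The pipeline sketched above (pairwise proximities between virulence-gene profiles, then agglomerative clustering) can be illustrated in miniature. Note the deliberate simplifications: a Jaccard similarity on presence/absence profiles stands in for the unsupervised random-forest proximity, single linkage stands in for Ward's method, and the gene matrix is a toy, not study data.

```python
from itertools import combinations

def jaccard_proximity(a, b):
    """Similarity between two binary virulence-gene profiles.
    (A simple stand-in for the random-forest proximity used in the study.)"""
    both = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return both / either if either else 1.0

def single_linkage_clusters(profiles, threshold):
    """Greedy agglomerative clustering: merge two clusters whenever any pair
    of their members is at least `threshold` similar (single linkage, a
    simplification of the Ward's-method clustering described above)."""
    clusters = [{i} for i in range(len(profiles))]
    merged = True
    while merged:
        merged = False
        for i, j in combinations(range(len(clusters)), 2):
            if any(jaccard_proximity(profiles[a], profiles[b]) >= threshold
                   for a in clusters[i] for b in clusters[j]):
                clusters[i] |= clusters.pop(j)  # j > i, so index i is safe
                merged = True
                break
    return clusters

# Toy presence/absence matrix: rows = isolates, columns = virulence genes.
profiles = [
    [1, 1, 1, 0, 0],  # isolates 0 and 1 share most genes -> one cluster
    [1, 1, 0, 0, 0],
    [0, 0, 0, 1, 1],  # isolates 2 and 3 share most genes -> second cluster
    [0, 0, 1, 1, 1],
]
print(single_linkage_clusters(profiles, threshold=0.5))  # → [{0, 1}, {2, 3}]
```

In the study itself, proximities come from leaf co-occurrence across an unsupervised random forest, and cluster stability is checked by bootstrapping; the sketch only conveys the proximity-then-cluster structure of the approach.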


Subject(s)
Salmonella enterica, Animals, Cattle, Humans, United States/epidemiology, Virulence/genetics, Serogroup, Salmonella, Genomics