Results 1 - 16 of 16
1.
Int J Food Microbiol ; 378: 109801, 2022 Oct 02.
Article in English | MEDLINE | ID: mdl-35749912

ABSTRACT

The United States Department of Agriculture's Food Safety and Inspection Service (FSIS) implemented Salmonella performance standards for establishments producing chicken parts in 2016. The standards were chosen based on the assumption that implementation of the performance standard program would result in a 30% reduction in the occurrence of Salmonella-contaminated chicken parts samples (i.e., legs, breasts, or wings). The performance standards were derived from data collected prior to their implementation, and sampling has continued in the intervening years, so overall changes in the Salmonella contamination of this product can be assessed. This study presents a historical review of changes in Salmonella contamination on chicken parts as these changes relate to the performance standard. The analysis demonstrates that the reduction in Salmonella-contaminated chicken parts samples exceeded 75%, so the FSIS risk assessment significantly underestimated the actual reduction in Salmonella contamination. An analysis of chicken parts samples collected at retail demonstrates reductions of a similar magnitude. Changes in the characteristics of Salmonella contamination that are potentially relevant to the occurrence or severity of human illness, such as seasonal changes in contamination, the composition of serotypes, and changes in antimicrobial resistance, are also assessed. Small but significant seasonal increases in contamination were observed, with peaks occurring in late winter rather than the more traditional late-summer peak. Rapid changes in both the five most common serotypes and antimicrobial resistance patterns were also observed.


Subjects
Anti-Infective Agents, Chickens, Animals, Anti-Infective Agents/analysis, Food Contamination/analysis, Food Contamination/prevention & control, Food Microbiology, Humans, Meat/analysis, Salmonella, United States
2.
Emerg Infect Dis ; 27(1): 214-222, 2021 01.
Article in English | MEDLINE | ID: mdl-33350919

ABSTRACT

Foodborne illness source attribution is foundational to a risk-based food safety system. We describe a method for attributing US foodborne illnesses caused by nontyphoidal Salmonella enterica, Escherichia coli O157, Listeria monocytogenes, and Campylobacter to 17 food categories using statistical modeling of outbreak data. This method adjusts for epidemiologic factors associated with outbreak size, down-weights older outbreaks, and estimates credibility intervals. On the basis of 952 reported outbreaks and 32,802 illnesses during 1998-2012, we attribute 77% of foodborne Salmonella illnesses to 7 food categories (seeded vegetables, eggs, chicken, other produce, pork, beef, and fruits), 82% of E. coli O157 illnesses to beef and vegetable row crops, 81% of L. monocytogenes illnesses to fruits and dairy, and 74% of Campylobacter illnesses to dairy and chicken. However, because Campylobacter outbreaks probably overrepresent dairy as a source of nonoutbreak campylobacteriosis, we caution against using these Campylobacter attribution estimates without further adjustment.
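
The attribution model itself is not reproduced in this record; as a rough illustration of two of the ingredients mentioned (down-weighting older outbreaks and reporting credibility intervals), here is a minimal Python sketch in which the outbreak records, food categories, and the 5-year half-life are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative outbreak records: (year, food category, outbreak illnesses).
# These numbers are made up for the sketch, not taken from the paper.
outbreaks = [
    (1999, "chicken", 40), (2004, "eggs", 120), (2010, "chicken", 25),
    (2011, "seeded vegetables", 300), (2008, "pork", 60), (2012, "beef", 45),
]
categories = sorted({c for _, c, _ in outbreaks})

# Down-weight older outbreaks (assumed exponential decay with a 5-year half-life)
# so that recent outbreaks dominate the attribution estimate.
reference_year, half_life = 2012, 5.0
weights = {c: 0.0 for c in categories}
for year, cat, illnesses in outbreaks:
    w = 0.5 ** ((reference_year - year) / half_life)
    weights[cat] += w * illnesses

# Dirichlet resampling of the weighted illness totals gives simple
# credibility intervals for the attribution fractions.
alpha = np.array([weights[c] for c in categories]) + 1.0  # +1 acts as a weak prior
draws = rng.dirichlet(alpha, size=10_000)
for i, cat in enumerate(categories):
    lo, med, hi = np.percentile(draws[:, i], [2.5, 50, 97.5])
    print(f"{cat:20s} median {med:.2f}  95% CrI ({lo:.2f}, {hi:.2f})")
```

The paper's model additionally adjusts for epidemiologic factors associated with outbreak size, which this sketch omits.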


Subjects
Campylobacter Infections, Foodborne Diseases, Gastroenteritis, Listeria monocytogenes, Animals, Campylobacter Infections/epidemiology, Cattle, Disease Outbreaks, Food Microbiology, Foodborne Diseases/epidemiology, United States/epidemiology
3.
J Food Prot ; 81(11): 1851-1863, 2018 11.
Article in English | MEDLINE | ID: mdl-30325223

ABSTRACT

Buffered peptone water is the rinsate commonly used for chicken rinse sampling. A new formulation of buffered peptone water was developed to address concerns about the transfer of antimicrobials, used during poultry slaughter and processing, into the rinsate. This new formulation contains additives to neutralize the antimicrobials, and this neutralizing buffered peptone water replaced the original formulation for all chicken carcass and chicken parts sampling programs run by the Food Safety and Inspection Service beginning in July 2016. Our goal was to determine whether the change in rinsate resulted in significant differences in the observed proportion of positive chicken rinse samples for both Salmonella and Campylobacter. This assessment compared sampling results for the 12-month periods before and after implementation. The proportion of carcass samples that tested positive for Salmonella increased from approximately 0.02 to almost 0.06. Concurrently, the proportion of chicken parts samples that tested positive for Campylobacter decreased from 0.15 to 0.04. There were no significant differences associated with neutralizing buffered peptone water for the other two product-pathogen pairs. Further analysis of the effect of the new rinsate on corporations that operate multiple establishments demonstrated that changes in the percent-positive rates differed across corporations: some corporations were unaffected, whereas others saw all of their establishments move from passing to failing the performance standard, or vice versa. The results validated earlier concerns that antimicrobial contamination of rinse samples was causing false-negative Salmonella test results for chicken carcasses. The results also indicate that additional development work may still be required before the rinsate is sufficiently robust for use in Campylobacter testing.
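
As a small illustration of the kind of before-versus-after comparison described here, the sketch below runs a two-proportion z-test on assumed sample counts; the 12-month sample sizes are placeholders, not FSIS data.

```python
import numpy as np
from scipy.stats import norm

# Assumed sample sizes; the paper compares 12-month windows before and after
# July 2016, but the counts below are placeholders for illustration only.
n_before, pos_before = 4000, int(4000 * 0.02)   # Salmonella, carcasses, pre-change
n_after,  pos_after  = 4000, int(4000 * 0.06)   # Salmonella, carcasses, post-change

p1, p2 = pos_before / n_before, pos_after / n_after
p_pool = (pos_before + pos_after) / (n_before + n_after)
se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_before + 1 / n_after))
z = (p2 - p1) / se
p_value = 2 * norm.sf(abs(z))   # two-sided test of equal proportions

print(f"before {p1:.3f}, after {p2:.3f}, z = {z:.2f}, p = {p_value:.2g}")
```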


Subjects
Campylobacter, Chickens, Food Handling/methods, Food Microbiology, Salmonella/isolation & purification, Animals, Campylobacter/isolation & purification, Food Contamination, Meat, Peptones, Prevalence, Water, Water Microbiology
4.
Int J Food Microbiol ; 282: 24-27, 2018 Oct 03.
Article in English | MEDLINE | ID: mdl-29885974

ABSTRACT

Advances in microbiological testing methods have led to faster and less expensive assays. Given these advances, it is logical to adopt the new assays within the sampling plan of an existing microbiological criterion. However, a change in the performance characteristics of the assay can alter the intended effect of the microbiological criterion. This study describes a method for updating a 2-class attributes sampling plan to account for the different test sensitivity and specificity of a new assay and provides an example based on the replacement of a culture-based assay with a real-time polymerase chain reaction assay.
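
The record does not include the derivation, but the core idea can be sketched as follows: compute the acceptance probability of the existing 2-class attributes plan through the "apparent prevalence" implied by the old assay's sensitivity and specificity, then pick the acceptance number for the new assay that restores a comparable operating characteristic. The plan parameters (n = 52, c = 7), the reference prevalence, and both assays' sensitivity/specificity below are assumptions for illustration only.

```python
from scipy.stats import binom

def accept_prob(p_true, n, c, se, sp):
    """Probability a lot is accepted (<= c test positives among n samples)
    when the true prevalence is p_true and the assay has sensitivity se
    and specificity sp."""
    p_test_pos = se * p_true + (1 - sp) * (1 - p_true)  # apparent prevalence
    return binom.cdf(c, n, p_test_pos)

# Existing plan (assumed): n = 52 samples, accept if no more than c = 7 positives,
# evaluated with a culture-based assay.
n, c_old = 52, 7
se_old, sp_old = 0.85, 1.00      # assumed culture performance
se_new, sp_new = 0.95, 0.97      # assumed real-time PCR performance
p_ref = 0.15                     # reference prevalence at which the plans are matched

target = accept_prob(p_ref, n, c_old, se_old, sp_old)

# Smallest acceptance number for the new assay whose acceptance probability at the
# reference prevalence is at least the old plan's.
c_new = next(c for c in range(n + 1)
             if accept_prob(p_ref, n, c, se_new, sp_new) >= target)
print(f"old plan c = {c_old} (P_accept = {target:.3f}); new assay c = {c_new} "
      f"(P_accept = {accept_prob(p_ref, n, c_new, se_new, sp_new):.3f})")
```

Matching the acceptance probability at a single reference prevalence is a simplification; the paper works from the intended effect of the criterion across the full operating characteristic.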


Subjects
Campylobacter/isolation & purification, Chickens/microbiology, Meat Products/microbiology, Microbiological Techniques/methods, Animals, Campylobacter/genetics, Laboratories, Microbiological Techniques/economics, Real-Time Polymerase Chain Reaction, Sensitivity and Specificity
5.
Emerg Infect Dis ; 22(7): 1193-200, 2016 07.
Article in English | MEDLINE | ID: mdl-27314510

ABSTRACT

Outbreak data have been used to estimate the proportion of illnesses attributable to different foods. Applying outbreak-based attribution estimates to nonoutbreak foodborne illnesses requires an assumption of similar exposure pathways for outbreak and sporadic illnesses. This assumption cannot be tested, but other comparisons can assess its veracity. Our study compares demographic, clinical, temporal, and geographic characteristics of outbreak and sporadic illnesses from Campylobacter, Escherichia coli O157, Listeria, and Salmonella bacteria ascertained by the Foodborne Diseases Active Surveillance Network (FoodNet). Differences among FoodNet sites in outbreak and sporadic illnesses might reflect differences in surveillance practices. For Campylobacter, Listeria, and Escherichia coli O157, outbreak and sporadic illnesses are similar for severity, sex, and age. For Salmonella, outbreak and sporadic illnesses are similar for severity and sex. Nevertheless, the percentage of outbreak illnesses in the youngest age category was lower. Therefore, we do not reject the assumption that outbreak and sporadic illnesses are similar.
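
One way to make such comparisons concrete is a contingency-table test of, say, the age-category distributions for outbreak versus sporadic cases; the sketch below uses invented counts rather than FoodNet data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: outbreak vs sporadic illnesses; columns: age categories (<5, 5-17, 18-59, 60+).
# Counts are placeholders, not FoodNet data.
table = np.array([
    [ 30, 120,  600, 150],   # outbreak-associated illnesses
    [400, 500, 2500, 700],   # sporadic illnesses
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
# A small p-value would indicate the age distributions differ, as the paper
# reports for the youngest age category of Salmonella outbreak illnesses.
```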


Subjects
Disease Outbreaks, Epidemiological Monitoring, Food Microbiology, Foodborne Diseases/epidemiology, Population Surveillance/methods, Campylobacter, Campylobacter Infections/epidemiology, Campylobacter Infections/microbiology, Escherichia coli Infections/epidemiology, Escherichia coli Infections/microbiology, Escherichia coli O157, Humans, Retrospective Studies, Salmonella, Salmonella Infections/epidemiology, Salmonella Infections/microbiology, United States/epidemiology
6.
Int J Food Microbiol ; 208: 114-21, 2015 Sep 02.
Article in English | MEDLINE | ID: mdl-26065728

ABSTRACT

The proportion of Campylobacter-contaminated food and water samples collected by different surveillance systems often exhibits seasonal patterns. The incidence of foodborne campylobacteriosis also tends to exhibit strong seasonal patterns. Among the various product classes, the occurrence of Campylobacter contamination can be high on raw poultry products, and chicken is often considered one of the leading food vehicles for campylobacteriosis. Two different federal agencies in the United States collected samples of raw chicken products and tested them for the presence of Campylobacter. During the same time period, a consortium of federal and state agencies operated a nationwide surveillance system to monitor cases of campylobacteriosis in the United States. This study uses a common modeling approach to estimate trends and seasonal patterns in both the proportion of raw chicken product samples that test positive for Campylobacter and cases of campylobacteriosis. The results generally support the hypothesis of a weak seasonal increase in the proportion of Campylobacter-positive chicken samples in the summer months, though the number of Campylobacter on test-positive samples is slightly lower during this period. In contrast, campylobacteriosis cases exhibit a strong seasonal pattern that generally precedes the increase in contaminated raw chicken. These results suggest that while contaminated chicken products may be responsible for a substantial number of campylobacteriosis cases, they are most likely not the primary driver of the seasonal pattern in human illness.
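
A "common modeling approach" for trend plus seasonality is often a harmonic (sine/cosine) regression; the sketch below fits one to simulated monthly percent-positive data and reports the seasonal amplitude, peak month, and linear trend. The data and model form are assumptions, not the agencies' models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated monthly proportions of Campylobacter-positive chicken samples with a
# weak summer peak and no trend (illustrative only, not the agencies' data).
months = np.arange(1, 61)                                    # five years, monthly
truth = 0.30 + 0.03 * np.sin(2 * np.pi * (months - 4) / 12)  # peak near month 7
observed = truth + rng.normal(0, 0.02, size=months.size)

# Harmonic regression: p_t = b0 + b1*sin(2*pi*t/12) + b2*cos(2*pi*t/12) + b3*(t/12)
X = np.column_stack([
    np.ones_like(months, dtype=float),
    np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 12),
    months / 12.0,                                           # linear trend, per year
])
b, *_ = np.linalg.lstsq(X, observed, rcond=None)

amplitude = np.hypot(b[1], b[2])                             # size of the seasonal swing
phase = np.arctan2(b[2], b[1])                               # fit ~ A*sin(2*pi*t/12 + phase)
peak_month = ((np.pi / 2 - phase) / (2 * np.pi) * 12) % 12
print(f"amplitude ~ {amplitude:.3f}, peak near month {peak_month:.1f}, "
      f"trend per year ~ {b[3]:+.4f}")
```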


Subjects
Campylobacter Infections/epidemiology, Campylobacter/physiology, Food Microbiology, Meat/microbiology, Animals, Campylobacter Infections/microbiology, Chickens, Environmental Microbiology, Humans, Incidence, Poultry Products/microbiology, Seasons, Time Factors, United States/epidemiology
7.
Int J Food Microbiol ; 162(3): 266-75, 2013 Apr 01.
Article in English | MEDLINE | ID: mdl-23454818

ABSTRACT

This report illustrates how uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model of Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second-order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home, and consumption. The model accounted for growth-inhibitor use and retail cross-contamination, and applied an FAO/WHO dose-response model to evaluate the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all servings consumed per annum, and the model was used to solve for the corresponding PO risk metric, defined as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that, if the industry complies with a particular PO, the resulting risk per serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are redistributed throughout the remaining concentration distribution, and (3) no dose-response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence, respectively, that the target ALOP is met. The PO concentrations from this example scenario are more stringent than the current typical monitoring level of absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criterion of absence in 125 g (i.e., -2.1 log10 cfu/g). This example, and others, demonstrates that a PO for L. monocytogenes would be far below current monitoring capabilities. Furthermore, this work highlights the demands placed on risk managers and risk assessors when applying uncertain risk models within the current risk metric framework.
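
The FSIS/FAO-WHO model is far richer than anything that fits here, but a toy second-order Monte Carlo can illustrate the back-calculation step: for each uncertainty draw of the dose-response parameter, find the largest establishment-level concentration cap (the PO) whose mean risk per serving meets the target ALOP, then read confidence levels off the resulting PO distribution. Every distribution and parameter below is an assumption; only the -6.41 log10 target comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)
target_alop = 10 ** -6.41          # target mean risk of illness per serving
serving_g = 50.0                   # assumed serving size

def mean_risk_per_serving(po_log10_cfu_g, r, n_var=5_000):
    """Toy variability loop: concentration at the establishment is normal on the
    log10 scale, truncated at the candidate PO, grows during distribution and
    storage, and is run through an exponential dose-response with parameter r."""
    conc = rng.normal(-3.0, 1.5, n_var)                 # log10 cfu/g at establishment (assumed)
    conc = np.minimum(conc, po_log10_cfu_g)             # industry complies with the PO
    growth = rng.normal(2.0, 1.0, n_var).clip(min=0)    # log10 growth to consumption (assumed)
    dose = 10 ** (conc + growth) * serving_g            # cfu per serving
    return np.mean(1 - np.exp(-r * dose))

def solve_po(r):
    """Largest PO (log10 cfu/g) on a coarse grid whose mean risk meets the ALOP."""
    grid = np.arange(-8.0, 0.0, 0.1)
    ok = [po for po in grid if mean_risk_per_serving(po, r) <= target_alop]
    return max(ok) if ok else np.nan

# Uncertainty loop: the dose-response parameter r is uncertain (assumed lognormal).
po_samples = np.array([solve_po(r) for r in rng.lognormal(np.log(3e-11), 0.5, 100)])
for q in (75, 90):
    print(f"{q}% confidence of meeting the ALOP at PO <= "
          f"{np.percentile(po_samples, 100 - q):.2f} log10 cfu/g")
```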


Subjects
Food Contamination/prevention & control, Food Microbiology/organization & administration, Listeria monocytogenes/growth & development, Meat/microbiology, Statistical Models, Risk Management, Food Handling/standards, Humans, Listeria monocytogenes/isolation & purification, Maximum Allowable Concentration, Meat Products/microbiology, Monte Carlo Method, Risk Assessment, Uncertainty
8.
Int J Food Microbiol ; 139(3): 140-6, 2010 May 15.
Article in English | MEDLINE | ID: mdl-20385419

ABSTRACT

Rinse sampling is a common method for determining the level of microbial contamination on poultry carcasses. One advantage of rinse sampling over other carcass sampling methods is that the results can be used both for process control applications and to estimate the total microbial level on a carcass. The latter objective is possible because rinse sampling removes a portion of the bacteria from the entire carcass, whereas methods such as neck-skin sampling focus on a small area of the carcass where the level of contamination may not be representative of the entire carcass. Two recurring issues with rinse sampling are differences in sampling protocols and the difficulty of determining the proportion of bacteria removed during sampling. A situation arose in which 300 rinse samples were collected using two different rinse fluid volumes (i.e., 100 and 400 ml). The original intent of the study was to demonstrate the similarity of the removal rates for the two methods, but summary statistics suggested substantial differences. A Bayesian model was constructed to estimate the removal rates for the two sampling methods and to estimate the parameters of distributions describing the carcass-level contamination across 3 days of processing. The results suggest that approximately 11 times as many bacteria are removed from the carcass with a 400 ml rinse sample as with a 100 ml rinse sample. While this estimate is subject to a rather large degree of uncertainty, the 95% Bayesian credible interval for the ratio of the two removal rate parameters, (7.5, 17.0), still indicates a significant difference in the removal rates for the two sampling methods.
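
The paper's Bayesian model is not reproduced here; as a much simpler stand-in, the sketch below simulates rinse counts for the two volumes from a lognormal carcass load and estimates the removal-rate ratio with a bootstrap interval. The removal fractions, load distribution, and sample split are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated data standing in for the 300 rinse samples: each carcass carries a
# lognormal number of bacteria, and a rinse removes a volume-dependent fraction.
n = 150
carcass_load = rng.lognormal(mean=np.log(5e4), sigma=1.0, size=2 * n)
frac_100, frac_400 = 0.002, 0.022          # assumed removal fractions (ratio ~11)
counts_100 = rng.poisson(frac_100 * carcass_load[:n])
counts_400 = rng.poisson(frac_400 * carcass_load[n:])

# Crude estimator of the removal-rate ratio: ratio of mean recovered counts,
# with a bootstrap percentile interval (the paper instead fits a Bayesian model
# that also estimates the carcass-level contamination distribution).
def ratio(c100, c400):
    return c400.mean() / c100.mean()

boot = np.array([
    ratio(rng.choice(counts_100, n, replace=True),
          rng.choice(counts_400, n, replace=True))
    for _ in range(5_000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimated ratio {ratio(counts_100, counts_400):.1f}  (95% interval {lo:.1f}-{hi:.1f})")
```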


Subjects
Bacteria/isolation & purification, Food Handling/methods, Food Microbiology, Biological Models, Poultry/microbiology, Animals, Bayes Theorem
9.
J Food Prot ; 72(10): 2151-61, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19833039

ABSTRACT

The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies that incorporate the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). Use of these metrics is suggested to tie the results of quantitative microbial risk assessments (QMRAs) more closely to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. We demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment; the example used is the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and extract the required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for the illness rate allows specification of an ALOP that, with defined confidence, corresponds to current industry practices.
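
A minimal sketch of the two-dimensional (uncertainty and variability) structure described here: variability in dose is simulated inside each uncertainty iteration, yielding an uncertainty distribution of the per-serving illness rate from which an ALOP can be read at a chosen confidence level. All distributions and parameter values are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_unc, n_var = 500, 20_000    # uncertainty (outer) and variability (inner) iterations

illness_rate = np.empty(n_unc)
for i in range(n_unc):
    # Uncertain inputs drawn once per outer iteration (all values assumed).
    r = rng.lognormal(np.log(1e-9), 0.7)            # dose-response slope
    mean_log_dose = rng.normal(1.0, 0.5)            # mean log10 dose at consumption
    # Variability across servings, given those uncertain inputs.
    dose = 10 ** rng.normal(mean_log_dose, 1.5, n_var)
    illness_rate[i] = np.mean(1 - np.exp(-r * dose))

# The cumulative uncertainty distribution of the per-serving illness rate:
# an ALOP stated "with defined confidence" corresponds to a chosen percentile,
# e.g. the 95th percentile bounds the rate consistent with current practice.
print(f"median risk/serving   {np.median(illness_rate):.2e}")
print(f"95th pct risk/serving {np.percentile(illness_rate, 95):.2e}")
```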


Subjects
Clostridium perfringens/growth & development, Consumer Product Safety, Food Contamination/analysis, Food Handling/methods, Meat Products/microbiology, Poultry Products/microbiology, Microbial Colony Count, Cooking/methods, Food Microbiology, Humans, Biological Models, Monte Carlo Method, Risk Assessment, Risk Factors, Risk Management, United States, United States Department of Agriculture
10.
Foodborne Pathog Dis ; 6(7): 827-35, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19737061

ABSTRACT

Highly pathogenic avian influenza (HPAI) H5N1 is an infectious disease of fowl that can cause rapid and pervasive mortality resulting in complete flock loss. It has also been shown to cause death in humans. Although H5N1 HPAI virus (HPAIV) has not been identified in the United States, there are concerns about whether an infected flock could remain undetected long enough to pose a risk to consumers. This paper considers exposure from an Asian-lineage H5N1 HPAIV-infected chicken flock given that no other flocks have been identified as H5N1 HPAIV positive (the index flock). A state-transition model is used to evaluate the probability of an infected flock remaining undetected until slaughter. The model describes three possible states within the flock (susceptible, infected, and dead) and the transition probabilities that govern movement between these states. Assuming a 20,000-bird house with 1 bird initially infected, the probability that an H5N1 HPAIV-infected flock would be detected before slaughter is approximately 94%. This is because H5N1 HPAIV spreads rapidly through a flock, and bird mortality quickly reaches high levels. It is assumed that bird mortality of approximately 2% or greater due to H5N1 HPAIV would result in on-farm identification of the flock as infected. The only infected flock likely to reach slaughter undetected is one that was infected within approximately 3.5 days of shipment; in this situation, there is not enough time for high mortality to present. These results suggest that the probability of an infected, undetected flock going to slaughter is low, yet such an event could occur if a flock is infected shortly before shipment.
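
A toy version of such a state-transition model is sketched below: a daily-step susceptible/infected/dead simulation of a 20,000-bird house with detection triggered at 2% cumulative mortality. The transmission and mortality parameters are assumptions chosen only to make the qualitative behavior (detection within a few days unless infection occurs shortly before shipment) visible.

```python
import numpy as np

rng = np.random.default_rng(11)

def detected_before_slaughter(days_to_slaughter, house_size=20_000, beta=6.0,
                              mean_days_to_death=1.5, detect_mortality=0.02):
    """Daily-step stochastic susceptible/infected/dead simulation of one house.
    beta (effective contacts per day) and mean_days_to_death are assumed values;
    the house size, single initially infected bird, and the ~2% cumulative
    mortality detection threshold follow the abstract."""
    S, I, D = house_size - 1, 1, 0
    for _ in range(days_to_slaughter):
        new_inf = rng.binomial(S, 1 - np.exp(-beta * I / house_size))
        new_dead = rng.binomial(I, 1 - np.exp(-1 / mean_days_to_death))
        S, I, D = S - new_inf, I + new_inf - new_dead, D + new_dead
        if D >= detect_mortality * house_size:
            return True          # flock identified on-farm before shipment
    return False

for days in (2, 3, 5, 7):
    p = np.mean([detected_before_slaughter(days) for _ in range(500)])
    print(f"infected {days} days before slaughter: P(detected on-farm) ~ {p:.2f}")
```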


Subjects
Chickens/virology, Food Contamination/prevention & control, Influenza A Virus H5N1 Subtype/isolation & purification, Avian Influenza/diagnosis, Avian Influenza/transmission, Animal Husbandry/methods, Animal Husbandry/statistics & numerical data, Animals, Chick Embryo, Computer Simulation, Disease Susceptibility, Environmental Exposure/statistics & numerical data, Food Contamination/statistics & numerical data, Food Supply/statistics & numerical data, Humans, Influenza A Virus H5N1 Subtype/pathogenicity, Influenza A Virus H5N1 Subtype/physiology, Avian Influenza/mortality, Human Influenza/prevention & control, Meat-Packing Industry/methods, Meat-Packing Industry/statistics & numerical data, Biological Models, Probability, Risk Assessment, Statistics as Topic, Time Factors, United States, Virus Latency
11.
J Food Prot ; 72(7): 1376-84, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19681258

ABSTRACT

An assessment of the risk of illness associated with Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products was completed to estimate how changing the maximum allowed 1-log growth of C. perfringens during stabilization (cooling after the manufacturing heat step) would affect the annual frequency of illnesses. The exposure assessment modeled stabilization, storage, and consumer preparation such as reheating and hot-holding. The model predicted that a 10- or 100-fold increase over the assumed 1-log (maximum allowable) growth of C. perfringens results in a 1.2- or 1.6-fold increase in C. perfringens-caused illnesses, respectively, at the median of the uncertainty distribution. Improper retail and consumer refrigeration accounted for approximately 90% of the 79,000 C. perfringens illnesses predicted by the model at 1-log growth during stabilization. Improper hot-holding accounted for 8% of predicted illnesses, although model limitations imply that this is an underestimate. Stabilization accounted for less than 1% of illnesses. Efforts to reduce illnesses from C. perfringens in ready-to-eat and partially cooked meat and poultry products should therefore focus on retail and consumer storage and preparation methods.
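
The sketch below is not the FSIS exposure model, but it illustrates why extra growth during stabilization produces a much smaller than proportional increase in predicted illnesses: risk is dominated by heavily temperature-abused servings whose doses already saturate the dose-response (or hit the maximum population density), so an extra log at stabilization adds relatively little. All parameters and distributions are assumed, and the resulting fold changes will not match the paper's 1.2 and 1.6.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000
serving_g = 100.0
r = 1e-7          # assumed exponential dose-response slope (illness needs ~1e7 cells)
n_max = 7.5       # assumed maximum population density, log10 cfu/g

# Assumed level surviving the heat step and heavy-tailed growth during
# retail/consumer temperature abuse (the dominant contributor in the paper).
# Absolute risks from this toy are meaningless; only the relative change matters.
log_start = rng.normal(-2.0, 1.0, n)       # log10 cfu/g after cooking
abuse_growth = rng.gamma(2.0, 1.5, n)      # log10 growth at retail / in the home

def mean_risk(stabilization_growth_log10):
    log_conc = np.minimum(log_start + stabilization_growth_log10 + abuse_growth, n_max)
    dose = 10 ** log_conc * serving_g
    return np.mean(1 - np.exp(-r * dose))

base = mean_risk(1.0)                      # current 1-log allowance
for g in (2.0, 3.0):                       # 10x and 100x the allowed growth
    print(f"{g:.0f}-log stabilization growth -> {mean_risk(g) / base:.2f}-fold risk change")
```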


Subjects
Clostridium perfringens/growth & development, Food Contamination/analysis, Food Handling/methods, Meat Products/microbiology, Poultry Products/microbiology, Microbial Colony Count, Consumer Product Safety, Cooking/methods, Food Microbiology, Humans, Biological Models, Risk Assessment, Bacterial Spores
12.
Vet Microbiol ; 131(3-4): 215-28, 2008 Oct 15.
Article in English | MEDLINE | ID: mdl-18479846

ABSTRACT

As laying hens age, egg production and quality decrease. Egg producers can impose an induced molt on older hens, which results in increased egg productivity and decreased hen mortality compared with non-molted hens of the same age. This review discusses the effect of induced molting by feed removal on immune parameters, Salmonella enterica serovar Enteritidis (SE) invasion, and the subsequent production of SE-contaminated eggs. Experimental oral infections with SE show that molted hens are more susceptible to SE infection and produce more SE-contaminated eggs in the first few weeks post-molt compared with pre-molt egg production. In addition, molted hens appear more likely to disseminate SE into their environment. Molted hens are also more susceptible to SE infection by contact exposure to experimentally infected hens; thus, transmission of SE among molted hens could be more rapid than among non-molted birds. Histological examination of the gastrointestinal tracts of molted SE-infected hens revealed more frequent and severe intestinal mucosal lesions compared with non-molted SE-infected hens. These data suggest that induced molting by feed deprivation alters the normal asymptomatic host-pathogen relationship. Published data suggest that the highest proportion of SE-positive eggs is produced within 1-5 weeks post-molt, decreases sharply by 6-10 weeks, and dissipates to the background level for non-molted hens by 11-20 weeks. Appropriate treatment of eggs produced in the first 5 weeks post-molt may decrease the risk of foodborne infections in humans.


Subjects
Chickens/microbiology, Chickens/physiology, Food Deprivation, Molting, Ovum/microbiology, Salmonella enteritidis/isolation & purification, Animals, Female
13.
Foodborne Pathog Dis ; 5(1): 59-68, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18260816

ABSTRACT

As part of the process for developing risk-based performance standards for egg product processing, the United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) undertook a quantitative microbial risk assessment for Salmonella spp. in pasteurized egg products. The assessment was designed to assist risk managers in evaluating egg handling and pasteurization performance standards for reducing the likelihood of Salmonella in pasteurized egg products and the subsequent risk to human health. The following seven pasteurized liquid egg product formulations were included in the risk assessment model, with the value in parentheses indicating the estimated annual number of human Salmonella illnesses attributed to each: egg white (2636), whole egg (1763), egg yolk (708), whole egg with 10% salt (407), whole egg with 10% sugar (0), egg yolk with 10% salt (11), and egg yolk with 10% sugar (0). Increased levels of pasteurization were predicted to be highly effective mitigations for reducing the number of illnesses. For example, if all egg white products were pasteurized for a 6-log10 reduction of Salmonella, the estimated annual number of illnesses from these products would be reduced from 2636 to 270. The risk assessment identified several data gaps and research needs, including a quantitative study of cross-contamination during egg product processing and characterization of egg storage times and temperatures (i) on farms and in homes, (ii) for eggs produced off-line, and (iii) for egg products at retail. Pasteurized egg products are a relatively safe food; however, findings from this study suggest that increased pasteurization can make them safer.
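
As a sketch of how such pasteurization scenarios are typically evaluated (not the FSIS model itself), the code below pushes an assumed pre-pasteurization contamination distribution through a log reduction and an exponential dose-response to get expected annual illnesses; in this simplified linear-dose regime each additional log of pasteurization cuts predicted illnesses roughly 10-fold. Prevalence, concentrations, servings, and the dose-response slope are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 400_000
servings_per_year = 1e9     # assumed annual servings containing the product
serving_g = 30.0            # assumed serving size
r = 2e-4                    # assumed exponential dose-response slope for Salmonella

# Assumed pre-pasteurization Salmonella levels in liquid egg white: 5% of servings
# drawn from contaminated lots, with a lognormal concentration among those.
contaminated = rng.random(n) < 0.05
log_conc = np.where(contaminated, rng.normal(-1.0, 1.0, n), -np.inf)  # log10 cfu/g

def annual_illnesses(log_reduction):
    dose = 10 ** (log_conc - log_reduction) * serving_g   # cfu per serving
    return np.mean(1 - np.exp(-r * dose)) * servings_per_year

baseline, improved = annual_illnesses(5.0), annual_illnesses(6.0)
print(f"assumed 5-log baseline: ~{baseline:,.0f} illnesses/yr; "
      f"6-log pasteurization: ~{improved:,.0f} illnesses/yr")
```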


Subjects
Eggs/microbiology, Food Contamination/analysis, Food Handling/methods, Risk Assessment, Salmonella/growth & development, Animals, Chickens, Microbial Colony Count, Consumer Product Safety, Egg White/microbiology, Egg Yolk/microbiology, Food Preservation/methods, Hot Temperature, Humans, Time Factors, United States, United States Department of Agriculture
14.
Foodborne Pathog Dis ; 3(4): 403-12, 2006.
Article in English | MEDLINE | ID: mdl-17199522

ABSTRACT

In 1998, the United States Department of Agriculture's Food Safety and Inspection Service (FSIS) and the Food and Drug Administration completed a risk assessment that indicated multiple interventions along the farm-to-table chain were needed to reduce the risk of human illness from Salmonella Enteritidis in shell eggs. Based on newly available data and improved modeling techniques, FSIS completed an updated risk assessment to examine the effect of pasteurization and refrigeration on reducing human illnesses from S. Enteritidis in shell eggs. The risk assessment model was written in Visual Basic for Applications (Microsoft, Redmond, WA) and run using Monte Carlo methods. The model estimated that if all shell eggs produced in the United States were pasteurized for a 3-log10 reduction of S. Enteritidis, the annual number of illnesses from S. Enteritidis in eggs would decrease from approximately 130,000 to 40,000. Pasteurization for a 5-log10 reduction of S. Enteritidis was estimated to reduce the annual number of illnesses to 19,000. The model also estimated that if all eggs produced in the United States were stored and held at 7.2 degrees C within 12 hours of lay, the annual number of illnesses from S. Enteritidis in eggs would decrease from 130,000 to 28,000. As a result, rapid cooling and pasteurization of shell eggs were predicted to be highly effective mitigations for reducing illnesses from consumption of S. Enteritidis in shell eggs.


Subjects
Consumer Product Safety, Eggs/microbiology, Food Contamination/analysis, Risk Assessment, Salmonella Food Poisoning/epidemiology, Salmonella enteritidis/isolation & purification, Animals, Chickens, Eggs/standards, Food Inspection, Humans, Monte Carlo Method, Salmonella Food Poisoning/etiology, United States/epidemiology
15.
J Toxicol Environ Health A ; 67(8-10): 667-85, 2004.
Article in English | MEDLINE | ID: mdl-15192861

ABSTRACT

To estimate the risk or probability of adverse events in risk assessment, it is necessary to identify the important variables that contribute to the risk and to provide descriptions of the distributions of these variables for well-defined populations. One component of dose-response modeling that can create uncertainty is the inherent genetic variability among pathogenic bacteria. For many microbial risk assessments, the "default" assumption used for dose response does not account for strain or serotype variability in pathogenicity and virulence, other than, perhaps, recognizing the existence of avirulent strains. However, an examination of data sets from human clinical trials in which Salmonella spp. and Campylobacter jejuni strains were administered reveals significant strain differences. This article discusses the evidence for strain variability and concludes that more biologically based alternatives are needed to replace the default assumptions commonly used in microbial risk assessment, specifically regarding strain variability.
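
One way to examine the strain variability discussed here is to fit a dose-response model separately to each strain's feeding-trial data and compare the fitted parameters; the sketch below does this with an exponential model and invented trial counts (the actual human clinical trial data are not reproduced).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical feeding-trial data for two strains: (dose in cfu, subjects, ill).
# These numbers are invented to illustrate the fitting step, not taken from
# the human clinical trials discussed in the paper.
trials = {
    "strain A": [(1e3, 10, 1), (1e5, 10, 4), (1e7, 10, 9)],
    "strain B": [(1e3, 10, 0), (1e5, 10, 1), (1e7, 10, 4)],
}

def neg_log_lik(log10_r, data):
    """Binomial negative log-likelihood for the exponential model P = 1 - exp(-r*dose)."""
    r = 10 ** log10_r
    ll = 0.0
    for dose, subjects, ill in data:
        p = 1 - np.exp(-r * dose)
        p = min(max(p, 1e-12), 1 - 1e-12)   # guard against log(0)
        ll += ill * np.log(p) + (subjects - ill) * np.log(1 - p)
    return -ll

for name, data in trials.items():
    fit = minimize_scalar(neg_log_lik, bounds=(-12, 0), args=(data,), method="bounded")
    print(f"{name}: exponential dose-response r ~ 10^{fit.x:.2f}")
# A large difference in fitted r between strains is the kind of variability the
# paper argues default dose-response assumptions ignore.
```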


Subjects
Campylobacter Infections/microbiology, Campylobacter jejuni/classification, Food Microbiology, Risk Assessment, Salmonella Food Poisoning/microbiology, Salmonella/classification, Campylobacter jejuni/pathogenicity, Humans, Salmonella/pathogenicity
16.
Infect Immun ; 70(4): 1761-71, 2002 Apr.
Article in English | MEDLINE | ID: mdl-11895937

ABSTRACT

Campylobacter jejuni has been identified as the leading cause of acute bacterial diarrhea in the United States, yet compared with other enteric pathogens, considerably less is understood concerning the virulence factors of this human pathogen. A random in vivo transposon mutagenesis system was recently developed for the purpose of creating a library of C. jejuni transformants. A total of 1,065 C. jejuni transposon mutants were screened for their ability to swarm on motility agar plates and autoagglutinate in liquid cultures; 28 mutants were subsequently identified. The transposon insertion sites were obtained by using random-primed PCR, and the putative genes responsible for these phenotypes were identified. Of these mutants, all 28 were found to have diminished motility (0 to 86% that of the control). Seventeen motility mutants had insertions in genes with strong homology to functionally known motility and chemotaxis genes; however, 11 insertions were in genes of unknown function. Twenty motility mutants were unable to autoagglutinate, suggesting that the expression of flagella is correlated with autoagglutination (AAG). However, four mutants expressed wild-type levels of surface FlaA, as indicated by Western blot analysis, yet were unable to autoagglutinate (Cj1318, Cj1333, Cj1340c, and Cj1062). These results suggest that FlaA is necessary but not sufficient to mediate the AAG phenotype. Furthermore, two of the four AAG mutants (Cj1333 and Cj1062) were unable to invade INT-407 intestinal epithelial cells, as determined by a gentamicin treatment assay. These data identify novel genes important for motility, chemotaxis, and AAG and demonstrate their potential role in virulence.


Subjects
Agglutination, Campylobacter jejuni/genetics, DNA Transposable Elements, Bacterial Adhesion, Campylobacter jejuni/pathogenicity, Campylobacter jejuni/physiology, Cell Line, Flagellin/analysis, Intestinal Mucosa/microbiology, Fluorescence Microscopy, Movement, Mutagenesis, Mutation