1.
J Food Prot ; 87(1): 100201, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38036175

ABSTRACT

Whole genome sequencing (WGS) is a powerful tool that may be used to assist in identifying Listeria contamination sources and movement within environments, and to assess persistence. This study investigated sites in a produce packinghouse where Listeria had historically been isolated and aimed to characterize dispersal patterns and identify cases of transient and resident Listeria. Environmental swab samples (n = 402) were collected from 67 sites at two time-points on three separate visits. Each sample was tested for Listeria, and Listeria isolates were characterized by partial sigB sequencing to determine species and allelic type (AT). Representative isolates from the three most common L. monocytogenes ATs (n = 79) were further characterized by WGS. Of the 144 Listeria species-positive samples (35.8%), L. monocytogenes was the most prevalent species. L. monocytogenes was often co-isolated with another species of Listeria. WGS identified cases of sporadic and continued reintroduction of L. monocytogenes from the cold storages into the packinghouse and demonstrated cases of L. monocytogenes persistence over 2 years in cold storages, drains, and on a forklift. Nine distinct clusters were found in this study. Two clusters showed evidence of persistence. Isolates in these two clusters (N = 11, with one historical isolate) were obtained predominantly and over multiple samplings from cold storages, with sporadic movement to sites in the packing area, suggesting residence in cold storages with opportunistic dispersal within the packinghouse. The other seven clusters demonstrated evidence of transient Listeria, as isolation was sporadic over time and space during the packing season. Our data provide important insights into likely L. monocytogenes harborage points and transfer in a packinghouse, which is key to root cause analysis. While results support Listeria spp. as a suitable indicator organism for environmental monitoring surveys, findings were unable to establish a specific species as an index organism for L. monocytogenes. Findings also suggest long-term persistence with substantial SNP diversification, information that may assist in identifying potential contamination sources and implementing control measures.
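The cluster definitions above rest on pairwise SNP distances derived from WGS. As a rough illustration only (the study's actual pipeline, SNP threshold, and isolate identifiers are not given in this abstract), the sketch below shows how putative clusters could be drawn from a hypothetical SNP distance matrix using single-linkage hierarchical clustering in Python:

```python
# Minimal sketch: defining putative clusters from a pairwise SNP distance
# matrix with single-linkage hierarchical clustering. The isolate names,
# distances, and 20-SNP cutoff are hypothetical, not values from this study.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

isolates = ["cooler_drain_1", "forklift_2", "packline_3", "cooler_wall_4"]

# Hypothetical symmetric matrix of pairwise SNP distances between isolates
snp_dist = np.array([
    [ 0,  5, 80,  7],
    [ 5,  0, 76,  9],
    [80, 76,  0, 82],
    [ 7,  9, 82,  0],
])

# Condense the square matrix and cluster with single linkage
condensed = squareform(snp_dist, checks=False)
tree = linkage(condensed, method="single")

# Cut the tree at an assumed 20-SNP threshold to define clusters
clusters = fcluster(tree, t=20, criterion="distance")
for name, c in zip(isolates, clusters):
    print(f"{name}: cluster {c}")
```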


Subject(s)
Listeria monocytogenes; Listeria; Listeria monocytogenes/genetics; Food Microbiology; Whole Genome Sequencing
2.
Heliyon ; 9(9): e19676, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37809630

ABSTRACT

During harvest, pecan nuts are at risk of contamination with foodborne pathogens from extended contact with the ground. The objective of this study was to determine the potential transfer of Escherichia coli and Salmonella from the ground to in-shell pecans during the harvesting process. Plots (2 m²) were sprayed with 1 L of a rifampicin (rif)-resistant strain of either E. coli TVS 353 or an attenuated Salmonella Typhimurium inoculum at a low (∼4 log CFU/mL), mid (∼6 log CFU/mL), or high (∼8 log CFU/mL) concentration. The following day, nuts were mechanically harvested and samples from each plot were collected at 1 min, 4 h, and 24 h. Samples were enumerated for Salmonella and E. coli on tryptic soy agar supplemented with rif. The Salmonella levels in the soil from the inoculated plots were 2.0 ± 0.3, 4.1 ± 0.1, and 6.4 ± 0.2 log CFU/g for the low, mid, and high inocula, respectively. The E. coli levels in the soil from the inoculated plots were 1.5 ± 0.4, 3.7 ± 0.3, and 5.8 ± 0.1 log CFU/g for the low, mid, and high inocula, respectively. There was a significant difference in the average daily rainfall among the three trials: trial 3 received 23.8 ± 9.2 cm, while trials 1 and 2 received much less (0.1 ± 0.1 and 0.0 ± 0.0 cm, respectively). Inoculation concentration and trial were significant (P < 0.05) factors that influenced the transfer of E. coli and Salmonella to pecans. For the high inoculum treatment, bacterial transfer to pecans ranged from 0.7 ± 0.3 to 4.1 ± 0.2 log CFU/g for E. coli and 1.3 ± 0.7 to 4.3 ± 0.4 log CFU/g for Salmonella. For the mid inoculum treatment, transfer ranged from <0.3 to 1.5 ± 0.1 log CFU/g for E. coli and <0.3 to 1.9 ± 0.2 log CFU/g for Salmonella. For the low inoculum treatment, transfer ranged from <0.3 to 0.4 ± 0.2 and <0.3 to 0.5 ± 0.1 log CFU/g for E. coli and Salmonella, respectively. These results show the need for implementing agricultural practices that prevent potential transfer of foodborne pathogens onto the surface of in-shell pecans during harvest.
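The transfer values above are reported as log CFU/g with a detection limit of about 0.3 log CFU/g. A minimal sketch of the underlying arithmetic, converting a plate count to log CFU per gram and flagging samples below the detection limit, is shown below; the colony counts, dilutions, and 1:10 suspension are hypothetical, not values from this study:

```python
import math

def log_cfu_per_gram(colonies, plate_dilution, plated_ml, initial_dilution=10,
                     detection_limit_log=0.3):
    """Convert a plate count to log10 CFU per gram of sample.

    colonies:         colonies counted on the plate
    plate_dilution:   decimal dilution that was plated (e.g. 1e-2)
    plated_ml:        volume spread on the plate (mL)
    initial_dilution: factor for the initial suspension (10 for a 1:10 suspension)
    """
    if colonies == 0:
        return f"<{detection_limit_log} log CFU/g (below detection limit)"
    cfu_per_ml = colonies / (plate_dilution * plated_ml)  # CFU per mL of suspension
    cfu_per_g = cfu_per_ml * initial_dilution             # CFU per g of sample
    return round(math.log10(cfu_per_g), 2)

# Hypothetical example: 25 colonies on a 10^-2 dilution plate, 0.1 mL plated,
# pecans prepared as a 1:10 suspension in buffer.
print(log_cfu_per_gram(25, 1e-2, 0.1))  # -> 5.4 (log CFU/g)
print(log_cfu_per_gram(0, 1e0, 0.1))    # -> below detection limit
```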

3.
J Food Prot ; 85(12): 1842-1847, 2022 Dec 01.
Article in English | MEDLINE | ID: mdl-36150096

ABSTRACT

ABSTRACT: Many studies have examined the survival of Escherichia coli and foodborne pathogens in agricultural soils. The results of these studies can be influenced by the various growth conditions and growth media used when preparing cultures for an experiment. The objectives of this study were to (i) determine the growth curves of rifampin (R)-resistant E. coli in three types of growth media containing R (tryptic soy agar [TSA-R], tryptic soy broth [TSB-R], and poultry pellet extract [PPE-R]) and (ii) evaluate the influence of growth media on the survival of E. coli in agricultural soil. Poultry pellet extract (PPE) was prepared by filter-sterilizing a 1:10 suspension of heat-treated poultry pellets in sterile water. Generic E. coli (TVS 353) acclimated to 80 µg/mL of R was inoculated into TSA-R, TSB-R, and PPE-R at 3.0 to 3.5 log CFU/mL and incubated at 37°C. Growth curves were determined by quantifying E. coli populations at 0, 4, 8, 16, 24, and 32 h. Soil microcosms were inoculated with E. coli (6.0 log CFU/g) previously cultured in one of the three media types and stored at 25°C, and soil samples were quantified for E. coli on days 0, 1, 3, 7, 14, 28, and 42. Growth curves and survival models were generated by using DMFit and GInaFiT, respectively. E. coli growth rates were 0.88, 0.77, and 0.69 log CFU/mL/h in TSA-R, TSB-R, and PPE-R, respectively. E. coli populations in the stationary phase were greater for cultures grown in TSA-R (9.4 log CFU/mL) and TSB-R (9.1 log CFU/mL) than in PPE-R (7.9 log CFU/mL). The E. coli populations in the soil remained stable for up to 3 days before declining. A decline of approximately 2 log CFU/g in E. coli in soil was observed for each culture type between days 3 and 7, after which E. coli populations declined more slowly from days 7 to 42. A biphasic model with a shoulder was fitted to evaluate E. coli survival in soil for each growth medium. Using standardized culture preparation methods may aid in determining the complex interactions that influence enteric pathogen survival in soils.
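The soil survival curves above were fitted in GInaFiT with a biphasic shoulder model. GInaFiT itself is an Excel add-in, so the sketch below only illustrates the general form of a Geeraerd-type biphasic model with a shoulder, fitted with SciPy to hypothetical counts that mimic the stable-then-declining pattern described; none of the numbers are from this study:

```python
import numpy as np
from scipy.optimize import curve_fit

def biphasic_shoulder(t, log_n0, f, kmax1, kmax2, sl):
    """Geeraerd-type biphasic inactivation model with a shoulder (log10 scale).

    f:     fraction of the population in the fast-declining subpopulation
    kmax1: maximum inactivation rate of the fast subpopulation (1/day)
    kmax2: maximum inactivation rate of the tailing subpopulation (1/day)
    sl:    shoulder length (days)
    """
    shoulder = np.exp(kmax1 * sl) / (1.0 + (np.exp(kmax1 * sl) - 1.0) * np.exp(-kmax1 * t))
    pop = f * np.exp(-kmax1 * t) * shoulder + (1.0 - f) * np.exp(-kmax2 * t)
    return log_n0 + np.log10(pop)

# Hypothetical soil counts: stable for ~3 days, a ~2-log drop by day 7,
# then a slow decline out to day 42 (log CFU/g).
days = np.array([0, 1, 3, 7, 14, 28, 42], dtype=float)
log_counts = np.array([6.0, 6.0, 5.9, 3.9, 3.6, 3.2, 2.9])

p0 = [6.0, 0.99, 1.5, 0.02, 3.0]                 # guesses: log N0, f, kmax1, kmax2, Sl
bounds = ([0, 0, 0, 0, 0], [10, 1, 10, 5, 20])   # keep parameters in plausible ranges
popt, _ = curve_fit(biphasic_shoulder, days, log_counts, p0=p0, bounds=bounds)
print(dict(zip(["logN0", "f", "kmax1", "kmax2", "Sl"], np.round(popt, 3))))
```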


Subject(s)
Escherichia coli O157; Soil; Animals; Agar; Colony Count, Microbial; Culture Media; Food Microbiology; Plant Extracts; Poultry
4.
J Appl Microbiol ; 132(3): 2342-2354, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34637586

ABSTRACT

AIMS: This study investigated Salmonella concentrations following combinations of horticultural practices, including anaerobic soil disinfestation (ASD), soil amendment type, and irrigation regimen. METHODS AND RESULTS: Sandy-loam soil was inoculated with a five-serovar Salmonella cocktail (5.5 ± 0.2 log CFU per gram) and subjected to one of six treatments: (i) no soil amendment with ASD (ASD control), (ii) no soil amendment without ASD (non-ASD control), and (iii to vi) soil amended with pelletized poultry litter, rye, rapeseed, or hairy vetch with ASD. The effect of irrigation regimen was determined by collecting samples 3 and 7 days after irrigation. Twenty-five-gram soil samples were collected pre-ASD, post-soil saturation (i.e., during the ASD process), and at 14 time-points post-ASD, and Salmonella levels were enumerated. Log-linear models examined the effect of amendment type and irrigation regimen on Salmonella die-off during and post-ASD. During ASD, Salmonella concentrations significantly decreased in all treatments (range: -0.2 to -2.7 log CFU per gram), although the smallest decrease (-0.2 log CFU per gram, observed in the pelletized poultry litter treatment) was of negligible magnitude. Salmonella die-off rates varied by amendment, with an average post-ASD rate of -0.05 log CFU per gram per day (CI = -0.05, -0.04). Salmonella concentrations remained highest over the 42 days post-ASD in the pelletized poultry litter treatment, followed by the rapeseed and hairy vetch treatments. Findings suggested ASD was not able to eliminate Salmonella in soil, and certain soil amendments facilitated enhanced Salmonella survival. Salmonella serovar distribution differed by treatment, with pelletized poultry litter supporting survival of S. Newport over the other serovars. Irrigation appeared to assist Salmonella survival, with concentrations being 0.14 log CFU per gram (CI = 0.05, 0.23) greater at 3 days post-irrigation than at 7 days post-irrigation. CONCLUSIONS: ASD does not eliminate Salmonella in soil and may, in fact, depending on the soil amendment used, facilitate Salmonella survival. SIGNIFICANCE AND IMPACT OF THE STUDY: The synergistic and antagonistic effects of horticultural practices on food safety hazards should be considered.
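The post-ASD die-off rate reported above (about -0.05 log CFU per gram per day) comes from a log-linear model. A minimal sketch of estimating such a slope by ordinary least squares on log-transformed counts is shown below; the sampling days and counts are hypothetical, chosen only to produce a slope of roughly that magnitude:

```python
import numpy as np

# Hypothetical post-ASD sampling days and Salmonella counts (log CFU/g)
days = np.array([0, 3, 7, 10, 14, 21, 28, 35, 42], dtype=float)
log_cfu = np.array([3.2, 3.1, 2.9, 2.8, 2.6, 2.3, 2.0, 1.6, 1.2])

# Log-linear die-off: log10 N(t) = log10 N0 + k * t, where k is the slope
k, log_n0 = np.polyfit(days, log_cfu, deg=1)
print(f"die-off rate: {k:.3f} log CFU/g per day, intercept: {log_n0:.2f} log CFU/g")
```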


Subject(s)
Soil Microbiology; Soil; Agricultural Irrigation; Agriculture/methods; Anaerobiosis; Salmonella
5.
J Food Prot ; 85(1): 22-26, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-34469547

ABSTRACT

ABSTRACT: The process of washing tomatoes in dump (flume) tanks has been identified as a potential source of cross-contamination. This study's objective was to assess the potential for Salmonella enterica cross-contamination at various inoculation levels in the presence of free chlorine (HOCl) and organic matter. Uninoculated tomatoes were introduced into a laboratory-based model flume containing tomatoes inoculated with a cocktail of five rifampin-resistant S. enterica serovars at 10⁴, 10⁶, or 10⁸ CFU per tomato in water containing 0 or 25 mg/L HOCl and 0 or 300 mg/L chemical oxygen demand (COD). Uninoculated tomatoes exposed to the inoculated tomatoes were removed from the water after 5, 30, 60, and 120 s and placed in bags containing tryptic soy broth supplemented with rifampin and 0.1% sodium thiosulfate. Following incubation, enrichment cultures were plated on tryptic soy agar supplemented with rifampin and on xylose lysine deoxycholate agar to determine the presence of Salmonella. HOCl and pH were measured before and after each trial. The HOCl in water containing 300 mg/L COD significantly declined (P ≤ 0.05) by the end of each 120-s trial, most likely due to the increased demand for the oxidant. Higher inoculum levels and lower HOCl concentrations were significant factors (P ≤ 0.05) that contributed to increased cross-contamination. At 25 mg/L HOCl, no Salmonella was recovered under any condition from uninoculated tomatoes exposed to tomatoes inoculated at 10⁴ CFU per tomato. When the inoculum was increased to 10⁶ and 10⁸ CFU per tomato, cross-contamination was observed, independent of COD levels. The results from this study indicate that the currently required sanitizer concentration (e.g., 100 or 150 mg/L) for flume water may be higher than necessary and warrants reevaluation.


Subject(s)
Disinfectants; Solanum lycopersicum; Chlorine/pharmacology; Colony Count, Microbial; Disinfectants/pharmacology; Food Handling/methods; Food Microbiology; Salmonella
6.
J Food Prot ; 84(10): 1784-1792, 2021 Oct 01.
Article in English | MEDLINE | ID: mdl-34086886

ABSTRACT

ABSTRACT: Monitoring and maintenance of water quality in dump tanks or flume systems are crucial to maintaining proper sanitizer levels to prevent pathogen cross-contamination during postharvest washing of tomatoes, but there is limited information on how organic matter influences sanitizer efficacy in the water. The main objective of this study was to monitor water quality in flume tanks and evaluate the efficacy of postharvest washing of tomatoes in commercial packinghouses. Flume tank water samples (n = 3) were collected on an hourly basis from three packinghouses in Florida and analyzed for pH, total dissolved solids (TDS), free chlorine, chemical oxygen demand (COD), oxidation-reduction potential, and turbidity. Additionally, three flume-water samples were collected and tested for total aerobic plate count (APC), total coliforms (TC), and Escherichia coli. Fresh tomatoes (n = 3), both before and after washing, were collected and analyzed for the same bacterial counts. Turbidity, COD, and TDS levels in flume water increased over time in all packinghouses. Correlations were observed between COD and turbidity (r = 0.631), turbidity and TDS (r = 0.810), and oxidation-reduction potential and chlorine (r = 0.660). APC in water samples ranged on average from 0.0 to 4.7 log CFU/mL, and TC likewise ranged on average from 0.0 to 4.7 log CFU/mL. All water samples were negative for E. coli. The average APC for pre- and postflume tomatoes from the three packinghouses was 6.0 log CFU per tomato and ranged from 2.2 to 7.4 log CFU per tomato. The average TC count was <1.5 and 7.0 log CFU per tomato for pre- and postwash tomatoes, respectively. There was no significant effect (P > 0.05) of postharvest washing on the microbiological quality of the tomatoes. Water quality in flume tanks deteriorated over time in all packinghouses during a typical operational day of 4 to 8 h.
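The r values above are pairwise correlations between water-quality parameters. As a small illustration, the sketch below computes a Pearson correlation matrix from hypothetical hourly flume-water readings; the column names and values are assumptions, not data from this study:

```python
import pandas as pd

# Hypothetical hourly flume-water readings from one packinghouse
readings = pd.DataFrame({
    "hour":           [0, 1, 2, 3, 4, 5, 6, 7],
    "cod_mg_l":       [40, 85, 140, 210, 290, 360, 450, 520],
    "turbidity_ntu":  [2, 6, 11, 18, 24, 33, 41, 50],
    "tds_mg_l":       [310, 340, 380, 420, 470, 520, 560, 610],
    "free_cl_mg_l":   [150, 138, 121, 104, 92, 80, 71, 60],
    "orp_mv":         [820, 800, 770, 740, 715, 690, 660, 640],
})

# Pairwise Pearson correlations among the water-quality parameters
corr = readings.drop(columns="hour").corr(method="pearson")
print(corr.round(3))
```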


Subject(s)
Disinfectants; Solanum lycopersicum; Bacterial Load; Chlorine; Colony Count, Microbial; Escherichia coli; Florida; Food Handling; Food Microbiology; Water Quality
7.
Front Microbiol ; 12: 590303, 2021.
Article in English | MEDLINE | ID: mdl-33796083

ABSTRACT

The use of untreated biological soil amendments of animal origin (BSAAO) has been identified as one potential mechanism for the dissemination and persistence of Salmonella in the produce growing environment. Data on factors influencing Salmonella concentrations in amended soils are therefore needed. The objectives here were to (i) compare die-off between 12 Salmonella strains following inoculation in amended soil and (ii) characterize any significant effects associated with soil type, irrigation regimen, and amendment on Salmonella survival and die-off. Three greenhouse trials were performed using a randomized complete block design. Each strain (~4 log CFU/g) was homogenized with amended or non-amended sandy-loam or clay-loam soil. Salmonella levels were enumerated in 25 g samples 0, 0.167 (4 h), 1, 2, 4, 7, 10, 14, 21, 28, 56, 84, 112, 168, 210, 252, and 336 days post-inoculation (dpi), or until two consecutive samples were enrichment negative. Regression analysis was performed between strain, soil type, irrigation, and (i) time to last detection (survival) and (ii) concentration at each time-point (die-off rate). Similar effects of strain, irrigation, soil type, and amendment were identified using the survival and die-off models. Strain explained up to 18% of the variance in survival and up to 19% of the variance in die-off rate. On average, Salmonella survived for 129 days in amended soils; however, Salmonella survived, on average, 30 days longer in clay-loam soils than in sandy-loam soils [95% confidence interval (CI) = 45, 15], with survival time ranging from 84 to 210 days for the individual strains during daily irrigation. When strain-specific associations were investigated using regression trees, S. Javiana and S. Saintpaul were found to survive longer in sandy-loam soil, whereas most of the other strains survived longer in clay-loam soil. Salmonella also survived, on average, 128 days longer when irrigated weekly, compared to daily (CI = 101, 154), and 89 days longer in amended soils than in non-amended soils (CI = 61, 116). Overall, this study provides insight into Salmonella survival following contamination of field soils by BSAAO. Specifically, Salmonella survival may be strain-specific as affected by both soil characteristics and management practices. These data can assist in risk assessment and strain selection for use in challenge and validation studies.
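The strain-specific associations above were explored with regression trees. The sketch below shows one way such a tree could be fitted, predicting days to last detection from strain, soil type, and irrigation with scikit-learn; the records, one-hot encoding, and tree settings are hypothetical and only illustrate the approach:

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical trial records: days to last detection by strain, soil, irrigation
records = pd.DataFrame({
    "strain":     ["Javiana", "Javiana", "Saintpaul", "Newport", "Newport", "Typhimurium"],
    "soil":       ["sandy-loam", "clay-loam", "sandy-loam", "clay-loam", "sandy-loam", "clay-loam"],
    "irrigation": ["daily", "weekly", "daily", "weekly", "daily", "weekly"],
    "days_to_last_detect": [168, 112, 160, 252, 84, 210],
})

# One-hot encode the categorical factors and fit a shallow regression tree
X = pd.get_dummies(records[["strain", "soil", "irrigation"]])
y = records["days_to_last_detect"]
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Print the learned splits for inspection
print(export_text(tree, feature_names=list(X.columns)))
```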

8.
J Food Prot ; 84(4): 597-610, 2021 Apr 01.
Article in English | MEDLINE | ID: mdl-33232452

ABSTRACT

ABSTRACT: Listeria monocytogenes was associated with more than 60 produce recalls, including tomato, cherry, broccoli, lemon, and lime, between 2017 and 2020. This study describes the effects of temperature, time, and food substrate as factors influencing L. monocytogenes behavior on whole intact raw fruits and vegetables. Ten intact whole fruit and vegetable commodities were chosen based on data gaps identified in a systematic literature review. The produce investigated belongs to major commodity families: Rosaceae (blackberry, raspberry, and sweet cherry), Ericaceae (blueberry), Rutaceae (lemon and mandarin orange), Solanaceae (tomato), Brassicaceae (cauliflower and broccoli), and Apiaceae (carrot). A cocktail of five L. monocytogenes strains that included clinical, food, or environmental isolates linked to foodborne outbreaks was used to inoculate intact whole fruits and vegetables. Samples were incubated at 2, 12, 22, 30, and 35°C with relative humidities matched to typical real-world conditions. Foods were sampled (n = 6) for up to 28 days, depending on temperature. Growth and decline rates were estimated using DMFit, an Excel add-in. Growth rates were compared with ComBase modeling predictions for L. monocytogenes. Almost every experiment showed initial growth followed by subsequent decline. L. monocytogenes was able to grow on the whole intact surface of all produce tested, except for carrot. The 10 produce commodities supported growth of L. monocytogenes at 22 and 35°C. Growth and survival at 2 and 12°C varied by produce commodity. The standard deviations of the square-root-transformed growth and decline rates showed significantly greater variability among replicates as temperature increased. When L. monocytogenes growth occurred, it was conservatively predicted by ComBase Predictor, and growth was generally followed by decreases in concentration. This research will assist in understanding the risks of foodborne disease outbreaks and recalls associated with L. monocytogenes on fresh whole produce.
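The abstract refers to square-root-transformed growth rates across incubation temperatures. It does not state which secondary model was used, but the classical square-root (Ratkowsky-type) relationship is a common choice for describing temperature dependence; the sketch below fits that relationship to hypothetical growth rates at the study's temperatures, purely as an illustration:

```python
import numpy as np

# Hypothetical L. monocytogenes growth rates (log CFU per sample per day)
# at the incubation temperatures used in the study
temps_c = np.array([2, 12, 22, 30, 35], dtype=float)
growth_rate = np.array([0.05, 0.45, 1.6, 3.1, 4.2])

# Square-root (Ratkowsky) secondary model: sqrt(mu) = b * (T - Tmin)
b, intercept = np.polyfit(temps_c, np.sqrt(growth_rate), deg=1)
t_min = -intercept / b
print(f"b = {b:.3f} per sqrt(rate)-degree, estimated Tmin = {t_min:.1f} °C")
```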


Subject(s)
Listeria monocytogenes; Colony Count, Microbial; Food Handling; Food Microbiology; Fruit; Humans; Temperature; Vegetables
9.
J Food Prot ; 83(5): 858-864, 2020 May 01.
Article in English | MEDLINE | ID: mdl-31928419

ABSTRACT

ABSTRACT: Understanding a food's ability to support the growth and/or survival of a pathogen throughout the supply chain is essential to minimizing large-scale contamination events. The purpose of this study was to examine the behavior (growth and/or survival) of Listeria monocytogenes on broccoli and cauliflower florets stored at the different postharvest temperatures used along the supply chain. Broccoli and cauliflower samples were inoculated with L. monocytogenes at approximately 3 log CFU/g and stored at 23 ± 2, 12 ± 2, 4 ± 2, and -18 ± 2°C. Samples were evaluated for L. monocytogenes levels after 0, 0.167 (4 h), 1, 2, 3, and 4 days at 23 ± 2°C; 0, 0.167, 1, 2, 3, 4, 7, 10, and 14 days at 12 ± 2°C; 0, 0.167, 1, 2, 3, 4, 7, 10, 14, 21, and 28 days at 4 ± 2°C; and 0, 1, 7, 28, 56, 84, 112, 140, and 168 days at -18 ± 2°C. L. monocytogenes populations were determined by plating samples onto tryptic soy agar and modified Oxford agar supplemented with nalidixic acid. Broccoli and cauliflower supported the growth of L. monocytogenes at 23, 12, and 4°C, and higher growth rates were observed at higher temperatures. Populations of L. monocytogenes on broccoli and cauliflower samples significantly increased within 1 day at 23°C (by 1.6 and 2.0 log CFU/g, respectively) (P ≤ 0.05). At 12°C, populations of L. monocytogenes on broccoli and cauliflower samples significantly increased over 14 days by 1.4 and 1.9 log CFU/g, respectively (P ≤ 0.05). Under refrigeration (4°C), no significant difference over time was observed in L. monocytogenes populations on broccoli and cauliflower samples (P > 0.05) until populations began to grow by day 10 in both commodities. Under frozen storage (-18°C), populations of L. monocytogenes survived on broccoli and cauliflower for at least 168 days. Storage of broccoli and cauliflower at lower temperatures can minimize L. monocytogenes growth potential; growth rates were lower at 4°C than at 12 and 23°C.


Subject(s)
Brassica; Food Storage/methods; Listeria monocytogenes; Brassica/microbiology; Colony Count, Microbial; Food Handling; Food Microbiology; Food Preservation; Listeria monocytogenes/growth & development; Temperature
10.
J Food Prot ; 82(2): 301-309, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30682262

ABSTRACT

Cucumbers were associated with four multistate outbreaks of Salmonella in the United States between 2013 and 2016. This study evaluated the fate of Listeria monocytogenes and Salmonella on whole and sliced cucumbers at various storage temperatures. Cucumbers were inoculated with five-strain cocktails of L. monocytogenes or Salmonella, air dried, and stored at 23 ± 2, 4 ± 2, and -18 ± 2°C. Whole and sliced cucumber samples were enumerated on nonselective and selective media at 0, 0.21, 1, 2, 3, and 4 days (23 ± 2°C); 0, 1, 2, 3, 7, 14, and 21 days (4 ± 2°C); and 0, 7, 28, 60, 90, and 120 days (-18 ± 2°C). For Salmonella, additional time points were added at 8 and 17 h (23 ± 2°C) and at 17 h (4 ± 2°C). Population levels were calculated for whole (CFU per cucumber) and sliced (CFU per gram) cucumbers. Both pathogens grew on whole and sliced cucumbers held at ambient temperatures. At 23 ± 2°C, L. monocytogenes and Salmonella populations significantly increased on whole (2.3 and 3.4 log CFU per cucumber, respectively) and sliced (1.7 and 3.2 log CFU/g, respectively) cucumbers within 1 day. Salmonella populations significantly increased on whole and sliced cucumbers after only 5 h (2.1 log CFU per cucumber and 1.5 log CFU/g, respectively), whereas L. monocytogenes populations were not significantly different on whole and sliced cucumbers at 5 h. L. monocytogenes and Salmonella populations survived up to 21 days on refrigerated whole and sliced cucumbers. At 4 ± 2°C, L. monocytogenes populations significantly increased on whole (2.8 log CFU per cucumber) and sliced (2.9 log CFU/g) cucumbers, whereas Salmonella populations significantly decreased on whole (0.6 log CFU per cucumber) and sliced (1.3 log CFU/g) cucumbers over 21 days. Both pathogens survived on frozen whole and sliced cucumbers for at least 120 days. The ability of L. monocytogenes and Salmonella to grow on whole and sliced cucumbers in short amounts of time at ambient temperatures, and to survive on whole and sliced cucumbers past the recommended shelf life at refrigeration temperatures, highlights the need to reduce the likelihood of contamination events throughout the cucumber supply chain.
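Populations above are reported per whole cucumber or per gram of slices. A minimal sketch of the unit bookkeeping, turning hypothetical plate counts into log CFU per cucumber and log CFU per gram and computing a log change over storage, is shown below; the rinse volumes, dilutions, and colony counts are illustrative, not data from this study:

```python
import math

def log10_cfu(total_cfu):
    return round(math.log10(total_cfu), 2)

# Hypothetical whole cucumber: rinsed in 100 mL buffer; 1 mL plated undiluted
# with 42 colonies at day 0, and a 10^-2 dilution plate with 85 colonies at day 1.
day0_per_cucumber = 42 / 1.0 * 100            # CFU per cucumber
day1_per_cucumber = 85 / (1e-2 * 1.0) * 100   # CFU per cucumber

# Hypothetical sliced sample: 1:10 initial dilution (25 g in 225 mL),
# 0.1 mL of a 10^-1 dilution plated, 30 colonies.
sliced_cfu_per_g = 30 / (1e-1 * 0.1) * 10     # CFU per gram of slices

print("whole, day 0:", log10_cfu(day0_per_cucumber), "log CFU/cucumber")
print("whole, day 1:", log10_cfu(day1_per_cucumber), "log CFU/cucumber")
print("log change over 1 day:",
      round(log10_cfu(day1_per_cucumber) - log10_cfu(day0_per_cucumber), 2))
print("sliced:", log10_cfu(sliced_cfu_per_g), "log CFU/g")
```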


Subject(s)
Cucumis sativus; Food Contamination/analysis; Listeria monocytogenes; Salmonella/growth & development; Colony Count, Microbial; Cucumis sativus/microbiology; Food Handling; Food Microbiology; Listeria monocytogenes/growth & development; Temperature