Results 1 - 20 of 113
1.
BMC Public Health ; 21(1): 2307, 2021 12 20.
Article in English | MEDLINE | ID: mdl-34930193

ABSTRACT

BACKGROUND: Effective syndromic surveillance, alongside COVID-19 testing behaviours in the population including higher-risk and hard-to-reach subgroups, is vital to detect re-emergence of COVID-19 transmission in the community. The aim of this paper was to identify the prevalence of acute respiratory infection (ARI) symptoms and coronavirus testing behaviour among South Australians using data from a population-based survey. METHODS: We used cross-sectional data on 6857 respondents aged 18 years and above from the 2020 state-wide population-level health survey. Descriptive statistics were used to explore risk factors, and multivariable logistic regression models were used to assess factors associated with ARI symptoms and coronavirus testing behaviour after adjusting for gender, age, household size, household income, Aboriginal and/or Torres Strait Islander status, SEIFA, country of birth, number of chronic diseases, wellbeing, psychological distress, and mental health. RESULTS: We found that 19.3% of respondents reported symptoms of acute respiratory infection; the most commonly reported symptoms were a runny nose (11.2%), coughing (9.9%) and sore throat (6.2%). Fever and cough together were reported by 0.8% of participants. Of the symptomatic respondents, 32.6% reported seeking health advice from a nurse, doctor or healthcare provider. Around 18% (n = 130) of symptomatic respondents had sought testing, and a further 4.3% (n = 31) reported they intended to get tested. The regression results suggest that older age, larger household size, a higher number of chronic diseases, a mental health condition, poor wellbeing, and psychological distress were associated with higher odds of ARI symptoms. Higher household income was associated with lower odds of being tested or intending to be tested for coronavirus after adjusting for other explanatory variables. CONCLUSIONS: There were relatively high rates of self-reported acute respiratory infection during a period of very low COVID-19 prevalence, and a low rate of coronavirus testing among symptomatic respondents. Ongoing monitoring of testing uptake, including in higher-risk groups, and possible interventions to improve testing uptake are key to early detection of disease.
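
A minimal sketch of the kind of multivariable logistic regression named in the METHODS, assuming a flat respondent-level data file; the file name and column names are hypothetical, and this is not the authors' code.

```python
# Hypothetical sketch of an adjusted logistic regression for ARI symptoms;
# file and column names are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sa_health_survey_2020.csv")  # hypothetical: 6857 respondents

model = smf.logit(
    "ari_symptoms ~ C(gender) + C(age_group) + household_size"
    " + C(income_band) + n_chronic_diseases + psych_distress + wellbeing",
    data=df,
).fit()

# Adjusted odds ratios with 95% confidence intervals
ors = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([ors.rename("OR"), ci], axis=1))
```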


Subject(s)
COVID-19 Testing , COVID-19 , Aged , Australia/epidemiology , Cross-Sectional Studies , Health Surveys , Humans , SARS-CoV-2 , South Australia/epidemiology
2.
J Dairy Sci ; 104(12): 12332-12341, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34600705

ABSTRACT

Certain cheeses can be legally produced in the United States using raw milk, but they must be aged for at least 60 d to reduce pathogen risks. However, some varieties, even when aged for 60 d, have been shown to support growth of Listeria monocytogenes or survival of Shiga toxin-producing Escherichia coli (STEC). Thermization, a subpasteurization heat treatment, has been proposed as a control to reduce the risk of pathogens in raw cheese milk while retaining some quality attributes in the cheese. However, the temperature-time combinations needed to enhance safety have not been well characterized. The objective of this research was to determine and validate decimal reduction values (D-values) for L. monocytogenes and STEC at thermization temperatures of 65.6, 62.8, and 60.0°C; a D-value at 57.2°C was also determined for L. monocytogenes only. Nonhomogenized, pasteurized whole-milk samples (1 mL) were inoculated with 8 log cfu/mL of L. monocytogenes or STEC (5- or 7-strain mixtures, respectively), vacuum-sealed in moisture-impermeable pouches, and heated via water bath submersion. Duplicate samples were removed at appropriate intervals and immediately cooled in an ice bath. Surviving bacteria were enumerated on modified Oxford or sorbitol MacConkey agar overlaid with tryptic soy agar to aid recovery of heat-injured cells. Duplicate trials were conducted, and survival data were used to calculate thermal inactivation rates. D-values at 65.6, 62.8, and 60.0°C were 17.1 and 7.2 s, 33.8 and 16.9 s, and 146.6 and 60.0 s for L. monocytogenes and STEC, respectively, and a D-value at 57.2°C of 909.1 s was determined for L. monocytogenes. Triplicate validation trials were conducted for each test temperature using 100 mL of milk inoculated with 3 to 4 log cfu/mL of each pathogen cocktail. A 3-log reduction of each pathogen was achieved faster in the larger volumes than predicted by the D-values (i.e., the D-values were fail-safe). The data were additionally compared with published results from 21 scientific studies investigating L. monocytogenes and STEC in whole milk heated to thermization temperatures (55.0-71.7°C). These data can give producers of artisanal raw-milk cheese flexibility in designing thermal processes to reduce L. monocytogenes and STEC populations to levels that are not infectious to consumers.
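
For readers unfamiliar with D-values, a short sketch of the underlying calculation: survival is assumed log-linear in heating time, and the D-value is the negative reciprocal of the slope of log10 counts versus time. The data points below are invented to illustrate the arithmetic, not the study's measurements.

```python
# D-value from a log-linear survivor curve: D = -1/slope, i.e. the heating
# time giving a 1-log10 (90%) reduction. Numbers are illustrative only.
import numpy as np

time_s = np.array([0.0, 15.0, 30.0, 45.0, 60.0])   # seconds at temperature
log10_cfu = np.array([8.0, 7.1, 6.2, 5.4, 4.5])    # log10 cfu/mL surviving

slope, _ = np.polyfit(time_s, log10_cfu, 1)
d_value = -1.0 / slope
print(f"D = {d_value:.1f} s")  # 17.1 s with these invented data
```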


Subject(s)
Cheese , Listeria monocytogenes , Shiga-Toxigenic Escherichia coli , Animals , Cheese/analysis , Colony Count, Microbial/veterinary , Food Microbiology , Milk
3.
BMC Med ; 19(1): 50, 2021 02 17.
Article in English | MEDLINE | ID: mdl-33596902

ABSTRACT

BACKGROUND: Following implementation of strong containment measures, several countries and regions have low detectable community transmission of COVID-19. We developed an efficient, rapid, and scalable surveillance strategy to detect remaining COVID-19 community cases through exhaustive identification of every active transmission chain. We identified measures to enable early detection and effective management of any reintroduction of transmission once containment measures are lifted, so that strong containment measures do not need to be reinstated. METHODS: We compared the efficiency and sensitivity of detecting community transmission chains through testing of hospital cases; testing for fever, cough and/or acute respiratory infection at community/primary care; and asymptomatic testing. We used surveillance evaluation methods and mathematical modelling with data from Australia, varying testing capacities, the reproduction number (R) and the weekly cumulative incidence of COVID-19 and of non-COVID-19 respiratory symptoms. We assessed the system requirements, per million population, to identify all transmission chains and follow up all cases and primary contacts within each chain. RESULTS: Assuming 20% of cases are asymptomatic and 30% of symptomatic COVID-19 cases present for testing, with R = 2.2, a median of 14 unrecognised community cases (8 infectious) occur by the time a transmission chain is identified through hospital surveillance, versus 7 unrecognised cases (4 infectious) through community-based surveillance. The 7 unrecognised upstream community cases are estimated to generate a further 55-77 primary contacts requiring follow-up. The number of unrecognised community cases rises to 10 if 50% of cases are asymptomatic. Screening asymptomatic community members cannot exhaustively identify all cases under any of the scenarios assessed. The most important determinant of testing requirements for symptomatic screening is the level of non-COVID-19 respiratory illness. If 4% of the community have respiratory symptoms, and 1% of those with symptoms have COVID-19, exhaustive symptomatic screening requires approximately 11,600 tests/million population using 1/4 pooling, with 98% of cases detected (2% missed), given 99.9% test sensitivity. Even with a drop in sensitivity to 70%, pooling was more effective at detecting cases than individual testing under all scenarios examined. CONCLUSIONS: Screening all acute respiratory disease in the community, in combination with exhaustive and meticulous case and contact identification and management, enables appropriate early detection and elimination of COVID-19 community transmission. An important component is identification, testing, and management of all contacts, including upstream contacts (i.e. potential sources of infection for identified cases, and their related transmission chains). Pooling allows increased case detection when testing capacity is limited, even given reduced test sensitivity. Critical to the effectiveness of all aspects of surveillance are appropriate community engagement and messaging to optimise testing uptake and compliance with other measures.
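
The ~11,600 tests/million figure can be reproduced with standard pooled-testing arithmetic under the abstract's stated inputs, assuming Dorfman-style pooling in which positive pools are retested individually (an assumption; the paper's pooling protocol may differ in detail).

```python
# Dorfman pooling back-of-envelope: each person costs 1/k of one pool test,
# plus an individual retest whenever their pool tests positive.
pop = 1_000_000
symptomatic = 0.04 * pop          # 4% of the community with respiratory symptoms
p = 0.01                          # 1% of symptomatic people have COVID-19
k = 4                             # 1/4 pooling

tests_per_person = 1 / k + (1 - (1 - p) ** k)   # pool share + expected retest
print(round(symptomatic * tests_per_person))    # ~11,600 tests per million
```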


Subject(s)
COVID-19/epidemiology , COVID-19/prevention & control , Independent Living/trends , Models, Theoretical , Population Surveillance/methods , Australia/epidemiology , Basic Reproduction Number/prevention & control , COVID-19/transmission , Early Diagnosis , Feasibility Studies , Hospitalization/trends , Humans , Longitudinal Studies , Mass Screening/methods , Mass Screening/trends
4.
Arch Gynecol Obstet ; 301(6): 1579-1588, 2020 06.
Article in English | MEDLINE | ID: mdl-32377787

ABSTRACT

PURPOSE: Owing to modern and individualised treatments, women of reproductive age have a high survival rate after cancer therapy. What are the pregnancy and birth rates of women after cancer, and how often do they use cryopreserved ovarian tissue or gametes? METHODS: From 2007 to 2015, 162 women aged 26.7 ± 6.9 years were counselled about fertility preservation at a single university fertility centre. A questionnaire study was performed on average 3 and 6 years after the diagnosis of cancer. The women were asked about their fertility, partnership, family planning, and pregnancy history. 72 women (51%) answered a written questionnaire in 2016, and 59 of them (82%) were reached again by phone in 2019. RESULTS: The preferred method of fertility preservation was ovarian tissue cryopreservation (n = 36, 50%); none of the women underwent ovarian hyperstimulation to cryopreserve oocytes. About 3 years after treatment, 37 of the 72 women (51%; mean age 29.9 years) had a strong wish to conceive. 21/72 (29%) had actively tried to conceive after successful cancer treatment; eight women (11%) were already pregnant or had children. Six years after cancer diagnosis, 16/59 women (27%) had ongoing anticancer treatment. 12/59 (20%) were pregnant or had children, while 39% (23/59) had no menstrual cycle. Only one woman used her cryopreserved ovarian tissue, but she did not become pregnant. CONCLUSION: After cancer and gonadotoxic treatment, women's desire to have a child is substantial. In this study, the rate of spontaneous pregnancies and births was 20% at 6 years after gonadotoxic therapy. Not every woman, however, has the opportunity to conceive: factors impairing fertility include ongoing cancer treatment or persistent disease, lack of a partner, absence of a menstrual cycle, and other causes of infertility.


Subject(s)
Fertility Preservation/methods , Fertility/physiology , Infertility/etiology , Neoplasms/complications , Adult , Female , Humans
5.
Epidemiol Infect ; 147: e152, 2019 01.
Article in English | MEDLINE | ID: mdl-31063089

ABSTRACT

Clostridium difficile infections (CDIs) affect patients in hospitals and in the community, but the relative importance of transmission in each setting is unknown. We developed a mathematical model of C. difficile transmission in a hospital and surrounding community that included infants, adults and transmission from animal reservoirs. We assessed the role of these transmission routes in maintaining disease and evaluated the recommended classification system for hospital- and community-acquired CDIs. The reproduction number in the hospital was <1 in all scenarios, while the reproduction number in the community was >1 for nearly all scenarios without transmission from animal reservoirs (range: 1.0-1.34); however, the reproduction number for the human population overall was <1 when 3.5-26.0% of human exposures originated from animal reservoirs. Symptomatic adults accounted for <10% of transmission in the community. Under conservative assumptions, infants accounted for 17% of community transmission. An estimated 33-40% of community-acquired cases were reported, but 28-39% of these reported cases were misclassified as hospital-acquired by the recommended definitions. Transmission could be plausibly sustained by asymptomatically colonised adults and infants in the community or by exposure to animal reservoirs, but not by hospital transmission alone. Under-reporting of community-onset cases and systematic misclassification underplay the role of community transmission.


Subject(s)
Carrier State/epidemiology , Carrier State/veterinary , Clostridium Infections/transmission , Community-Acquired Infections/transmission , Disease Reservoirs , Disease Transmission, Infectious , Animals , Carrier State/microbiology , Clostridium Infections/epidemiology , Community-Acquired Infections/epidemiology , Humans , Infant , Models, Theoretical
6.
J Neonatal Perinatal Med ; 12(2): 231-237, 2019.
Article in English | MEDLINE | ID: mdl-30829620

ABSTRACT

BACKGROUND: Simulation is widely used in graduate medical education. A prior survey showed that 80% of Neonatal-Perinatal Medicine (NPM) fellowship programs in the U.S. use simulation. There are multiple ways to provide simulation-based education; one such method is intensive simulation-based education sessions held at the beginning of a training program, commonly called 'boot camps'. The aim of this study was to describe the use of simulation-based boot camps in NPM fellowship programs. METHODS: Survey study of Accreditation Council for Graduate Medical Education (ACGME)-accredited NPM fellowships in the U.S. RESULTS: Fifty-nine of 98 programs (60%) responded. Thirty-six (61%) participated in first-year fellow boot camps, which focused on procedural skills and newborn resuscitation. Nearly half of the programs participated in regional boot camps. Most boot camps were one or two days long. Eleven programs (19%) held second- or third-year fellow boot camps, which focused on advanced resuscitation and communication. Barriers included lack of protected faculty time (57%), funding (39%), and lack of faculty experience (31%). CONCLUSIONS: A majority of ACGME-accredited NPM fellowships participate in first-year fellow boot camps, and many participate in regional boot camps. A few programs hold second- or third-year fellow boot camps. Lack of time, funding, and faculty experience were common barriers.


Subject(s)
Education, Medical, Graduate/methods , Perinatology/education , Simulation Training/methods , Cross-Sectional Studies , Fellowships and Scholarships , Humans , Surveys and Questionnaires , Training Support
7.
J Hosp Infect ; 102(2): 157-164, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30880267

ABSTRACT

BACKGROUND: Clostridium difficile infection (CDI) is the leading cause of antibiotic-associated diarrhoea, with peak incidence in late winter or early spring. Although CDI is commonly associated with hospitals, community transmission is important. AIM: To explore potential drivers of CDI seasonality and the effect of community-based interventions to reduce transmission. METHODS: A mechanistic compartmental model of C. difficile transmission in a hospital and surrounding community was used to determine the effect of reducing transmission or antibiotic prescriptions in these settings. The model was extended to allow for seasonal antibiotic prescriptions and seasonal transmission. FINDINGS: Modelling antibiotic seasonality reproduced the seasonality of CDI, including its approximate magnitude (13.9-15.1% above the annual mean) and the timing of peaks (0.7-1.0 months after peak antibiotic prescribing). Halving seasonal excess prescriptions reduced the incidence of CDI by 6-18%. Seasonal transmission produced larger seasonal peaks in the prevalence of community colonization (14.8-22.1% above the mean) than seasonal antibiotic prescriptions did (0.2-1.7% above the mean). Reducing transmission from symptomatic or hospitalized patients had little effect on community-acquired CDI, but reducing transmission in the community by ≥7%, or transmission from infants by ≥30%, eliminated the pathogen. Reducing antibiotic prescription rates led to approximately proportional reductions in infections, but limited reductions in the prevalence of colonization. CONCLUSION: Seasonal variation in antibiotic prescription rates can account for the observed magnitude and timing of C. difficile seasonality. Even complete prevention of transmission from hospitalized or symptomatic patients cannot eliminate the pathogen, but interventions to reduce transmission from community residents or infants could have a large impact on both hospital- and community-acquired infections.
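
A toy sketch of the mechanism the FINDINGS describe: when exposure risk in a simple colonisation model is forced by a seasonal antibiotic prescription cycle, prevalence peaks shortly after peak prescribing. The structure and parameters below are invented for illustration and are far simpler than the paper's mechanistic model.

```python
# Toy susceptible-colonised model with seasonally forced antibiotic pressure;
# not the authors' model, just the peak-lag mechanism it describes.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, beta, gamma, amp):
    s, c = y
    forcing = 1.0 + amp * np.cos(2.0 * np.pi * t)  # prescribing peaks at t = 0, 1, 2, ...
    new = beta * forcing * s * c                   # exposure scales with prescribing
    rec = gamma * c
    return [-new + rec, new - rec]

sol = solve_ivp(rhs, (0.0, 6.0), [0.95, 0.05], args=(3.0, 2.0, 0.2),
                dense_output=True, max_step=0.01)
t = np.linspace(5.0, 6.0, 1000)                    # one year, after burn-in
lag_months = 12.0 * (t[np.argmax(sol.sol(t)[1])] - 5.0)
print(f"colonisation peaks ~{lag_months:.1f} months after peak prescribing")
```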


Subject(s)
Anti-Bacterial Agents/therapeutic use , Clostridium Infections/prevention & control , Clostridium Infections/transmission , Disease Transmission, Infectious/prevention & control , Drug Utilization , Infection Control/methods , Models, Theoretical , Adult , Aged , Humans , Infant , Prescriptions/statistics & numerical data , Prevalence , Seasons
8.
Prev Vet Med ; 165: 8-14, 2019 Apr 01.
Article in English | MEDLINE | ID: mdl-30851932

ABSTRACT

As of 2018, Australia had experienced seven outbreaks of highly pathogenic avian influenza (HPAI) in poultry since 1976, all of which involved chickens. There is concern that increases in free-range farming could heighten HPAI outbreak risk because of the potential for greater contact between chickens and wild birds, which are known to carry low pathogenic avian influenza (LPAI). We use mathematical models to assess the effect of a shift to free-range farming on the risk of H5 or H7 HPAI outbreaks in the Australian commercial chicken industry, and the potential for intervention strategies to reduce this risk. We find that a shift of 25% of conventional indoor farms to free-range farming practices would increase the risk of an HPAI outbreak by 6-7%. Current practices to treat water are highly effective, reducing the risk of outbreaks by 25-28% compared with no water treatment. Halving wild bird presence in feed storage areas could reduce risk by 16-19%, and halving shed access by wild birds that are potential bridge species could reduce outbreak risk by 23-25%; relatively small improvements in biosecurity measures could therefore entirely compensate for the increased risk from a growing proportion of free-range farms in the industry. The short production cycle and cleaning practices for chicken meat sheds considerably reduce the risk that an introduced LPAI virus is maintained in the flock until it is detected as HPAI through increased chicken mortality. These findings help explain the history of HPAI outbreaks in Australia and suggest practical changes in biosecurity practices that could reduce the risk of future outbreaks.


Subject(s)
Animal Husbandry/methods , Disease Outbreaks/veterinary , Influenza in Birds/prevention & control , Poultry Diseases/prevention & control , Animals , Australia/epidemiology , Chickens/virology , Disease Outbreaks/prevention & control , Housing, Animal , Influenza in Birds/epidemiology , Models, Theoretical , Poultry Diseases/epidemiology , Poultry Diseases/virology
9.
Epidemiol Infect ; 146(15): 1903-1908, 2018 11.
Article in English | MEDLINE | ID: mdl-30103838

ABSTRACT

Salmonellosis is a leading cause of hospitalisation due to gastroenteritis in Australia. A previous source attribution analysis for a temperate state in Australia attributed most infections to chicken meat or eggs. Queensland is in northern Australia and includes subtropical and tropical climate zones. We analysed Queensland notifications for salmonellosis and conducted source attribution to compare reservoir sources with those in southern Australia. In contrast to temperate Australia, most infections were due to non-Typhimurium serotypes, with particularly high incidence in children under 5 years and strong seasonality, peaking in summer. We attributed 65.3% (95% credible interval (CrI) 60.6-73.2) of cases to either chicken meat or eggs and 15.5% (95% CrI 7.0-19.5) to nuts. The subtypes with the strongest associations with nuts were Salmonella Aberdeen, S. Birkenhead, S. Hvittingfoss, S. Potsdam and S. Waycross. All five subtypes had high rates of illness in children under 5 years (ranging from 4/100 000 to 23/100 000), suggesting that nuts may be serving as a proxy for environmental transmission in the model. Australia's climatic range allows us to conduct source attribution in different climate zones with similar food consumption patterns. This attribution provides evidence for environment-mediated transmission of salmonellosis in sub-tropical regions.


Subject(s)
Disease Transmission, Infectious , Foodborne Diseases/epidemiology , Gastroenteritis/epidemiology , Salmonella Infections/epidemiology , Salmonella/isolation & purification , Adolescent , Adult , Aged , Aged, 80 and over , Animals , Chickens/microbiology , Child , Child, Preschool , Eggs/microbiology , Female , Humans , Incidence , Infant , Infant, Newborn , Male , Meat/microbiology , Middle Aged , Nuts/microbiology , Queensland/epidemiology , Salmonella/classification , Seasons , Serogroup , Young Adult
10.
J Hosp Infect ; 99(4): 453-460, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29258917

ABSTRACT

BACKGROUND: Clostridium difficile infections occur frequently among hospitalized patients, with some infections acquired in hospital and others in the community. International guidelines classify cases as hospital-acquired if symptom onset occurs more than two days after admission. This classification informs surveillance and infection control, but has not been verified by empirical or modelling studies. AIM: To assess the current classification of C. difficile acquisition using a simulation model as a reference standard. METHODS: C. difficile transmission was simulated in a range of hospital scenarios. The sensitivity, specificity and precision of classifications using cut-offs ranging from 0.25 h to 40 days were calculated. Two reference cut-offs were identified: the optimal cut-off, which correctly estimated the proportion of cases that were hospital-acquired, and the balanced cut-off, which had equal sensitivity and specificity. FINDINGS: The recommended two-day cut-off overestimated the incidence of hospital-acquired cases in all scenarios, and by >100% in the base scenario. The two-day cut-off had good sensitivity (96%) but poor specificity (48%) and precision (52%) for identifying cases acquired during the current hospitalization. A five-day cut-off was balanced, and a six-day cut-off was optimal, in the base scenario. The optimal and balanced cut-offs were more than two days for nearly all scenarios considered (ranges: four to nine days and two to eight days, respectively). CONCLUSION: Current guidelines for classifying C. difficile infections overestimate the proportion of cases acquired in hospital in all model scenarios. To reduce misclassification bias, an infection should be classified as acquired prior to admission if symptoms begin within five days of admission.
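
A sketch of the evaluation logic in the METHODS: given simulated cases whose true acquisition route is known, classify each case by an onset-day cut-off and score the classification. The onset-day distributions below are invented placeholders, not the model's outputs.

```python
# Score a "hospital-acquired if onset > cutoff days after admission" rule
# against simulated ground truth; all distributions are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_ha = rng.random(n) < 0.3                  # hypothetical hospital-acquired share
onset = np.where(true_ha,
                 rng.gamma(4.0, 2.0, size=n),  # later onset when hospital-acquired
                 rng.exponential(2.0, size=n)) # earlier onset when community-acquired

def score(cutoff):
    called_ha = onset > cutoff
    tp = np.sum(called_ha & true_ha)
    sens = tp / true_ha.sum()
    spec = np.sum(~called_ha & ~true_ha) / (~true_ha).sum()
    prec = tp / called_ha.sum()
    return sens, spec, prec

for cutoff in (2, 5, 6):                       # recommended, balanced, optimal
    print(cutoff, [round(x, 2) for x in score(cutoff)])
```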


Subject(s)
Clostridioides difficile/isolation & purification , Clostridium Infections/epidemiology , Community-Acquired Infections/diagnosis , Community-Acquired Infections/epidemiology , Cross Infection/diagnosis , Cross Infection/epidemiology , Epidemiologic Methods , Clostridioides difficile/classification , Clostridioides difficile/genetics , Clostridium Infections/microbiology , Community-Acquired Infections/microbiology , Cross Infection/microbiology , Humans , Incidence , Models, Theoretical , Sensitivity and Specificity
11.
Epidemiol Infect ; 145(4): 839-847, 2017 03.
Article in English | MEDLINE | ID: mdl-27938447

ABSTRACT

Campylobacter spp. are a globally significant cause of gastroenteritis. Although rates of infection in Australia are among the highest in the industrialized world, studies describing campylobacteriosis incidence in Australia are lacking. Using national disease notification data for 1998-2013, we examined Campylobacter infections by gender, age group, season, and state and territory. Negative binomial regression was used to estimate incidence rate ratios (IRRs), including trends by age group over time, with post-estimation commands used to obtain adjusted incidence rates. The incidence rate for males was significantly higher than for females [IRR 1·20, 95% confidence interval (CI) 1·18-1·21], and a distinct seasonality was demonstrated, with higher rates in both spring (IRR 1·18, 95% CI 1·16-1·20) and summer (IRR 1·17, 95% CI 1·16-1·19). Examination of trends in age-specific incidence over time showed declines in incidence in those aged <40 years, combined with contemporaneous increases in older age groups, notably those aged 70-79 years (IRR 1998-2013: 1·75, 95% CI 1·63-1·88). While crude rates continue to be highest in children, our findings suggest the age structure of campylobacteriosis in Australia is changing, with significant public health implications for older Australians.
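
A minimal sketch of the IRR analysis named in the abstract, assuming a count-format extract with a population denominator; the file and column names are hypothetical, not the authors' code.

```python
# Negative binomial regression of notification counts with a log-population
# offset; exponentiated coefficients are incidence rate ratios (IRRs).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("campylobacter_notifications.csv")  # hypothetical extract

fit = smf.negativebinomial(
    "cases ~ C(sex) + C(season) + C(age_group) + C(state) + year",
    data=df,
    offset=np.log(df["population"]),                 # person-time denominator
).fit()

print(np.exp(fit.params.drop("alpha")))              # IRRs (alpha is dispersion)
```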


Subject(s)
Campylobacter Infections/epidemiology , Campylobacter/isolation & purification , Australia/epidemiology , Demography , Humans , Incidence , Seasons , Sex Factors , Spatial Analysis
12.
Epidemiol Infect ; 145(3): 575-582, 2017 02.
Article in English | MEDLINE | ID: mdl-27780483

ABSTRACT

Clostridium difficile is the principal cause of infectious diarrhoea in hospitalized patients. We investigated the incidence of, and risk factors for, hospitalization due to C. difficile infection (CDI) in older Australians. We linked data from a population-based prospective cohort study (the 45 and Up Study) of 266 922 adults aged ≥45 years recruited in New South Wales, Australia, to hospitalization and death records for 2006-2012. We estimated the incidence of CDI hospitalization and calculated days in hospital and costs per hospitalization. We also estimated hazard ratios (HRs) for CDI hospitalization using Cox regression with age as the underlying time variable. Over a total follow-up of 1 126 708 person-years, 187 adults had an incident CDI hospitalization. The crude incidence of CDI hospitalization was 16·6/100 000 person-years, with a median hospital stay of 6 days and a median cost of AUD 6102 per admission. Incidence increased with age and year of follow-up, with a threefold increase over 2009-2012. After adjustment, CDI hospitalization rates were significantly lower in males than in females (adjusted HR 0·6, 95% confidence interval 0·4-0·7). CDI hospitalization rates increased significantly over 2009-2012, and there is a need to better understand the increasing risk of CDI hospitalization in women.
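
A sketch of a Cox model with age as the underlying time variable, as in the METHODS: subjects enter the risk set at their recruitment age and exit at event or censoring age. This uses the lifelines library; the linked-data column names are hypothetical.

```python
# Cox regression on the age time-scale via delayed entry (entry_col);
# column names are invented for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("45_and_up_linked.csv")   # hypothetical linked extract

cph = CoxPHFitter()
cph.fit(
    df[["age_at_recruitment", "age_at_exit", "cdi_hosp", "male", "comorbidities"]],
    duration_col="age_at_exit",
    entry_col="age_at_recruitment",        # late entry puts age on the time axis
    event_col="cdi_hosp",
)
cph.print_summary()                        # hazard ratios with 95% CIs
```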


Subject(s)
Clostridioides difficile/isolation & purification , Clostridium Infections/epidemiology , Diarrhea/epidemiology , Hospitalization , Age Factors , Aged , Aged, 80 and over , Clostridium Infections/microbiology , Diarrhea/microbiology , Female , Health Care Costs , Humans , Incidence , Length of Stay , Longitudinal Studies , Male , Middle Aged , New South Wales/epidemiology , Prospective Studies , Risk Factors , Sex Factors
13.
Epidemiol Infect ; 145(2): 266-271, 2017 01.
Article in English | MEDLINE | ID: mdl-27821195

ABSTRACT

From a population-based birth cohort of 245 249 children born in Western Australia during 1996-2005, we used linkage of laboratory and birth record datasets to obtain all respiratory syncytial virus (RSV) detections during infancy in a subcohort of 87 981 singleton children born in the Perth metropolitan area from 2000 to 2004. Using log binomial regression, we found that the risk of infant RSV detection increases with the number of older siblings, with those having ≥3 older siblings experiencing almost three times the risk (relative risk 2·83, 95% confidence interval 2·46-3·26) of firstborn children. We estimate that 45% of the RSV detections in our subcohort were attributable to infection from an older sibling. The sibling effect was significantly higher for infants who were younger during the season of peak risk (winter) than for those who were older. Although older siblings were present in our cohort, they had very few RSV detections that could be temporally linked to an infant's infection. We conclude that RSV infection in older children leads to less severe symptoms but is nevertheless an important source of infant infection. Our results support a vaccination strategy that includes family members, to provide maximum protection for newborn babies.
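
Log binomial regression (a GLM with binomial family and log link) returns relative risks directly rather than odds ratios; a sketch with hypothetical column names, not the authors' code:

```python
# Log-binomial model: exponentiated coefficients are relative risks of infant
# RSV detection by number of older siblings; data columns are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("rsv_birth_cohort.csv")   # hypothetical linked cohort

fit = smf.glm(
    "rsv_detected ~ C(older_siblings)",    # 0 (firstborn), 1, 2, 3+
    data=df,
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit()

print(np.exp(fit.params))                  # relative risks vs firstborn
```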


Subject(s)
Family Health , Respiratory Syncytial Virus Infections/epidemiology , Respiratory Syncytial Viruses/isolation & purification , Siblings , Adult , Child , Child, Preschool , Cohort Studies , Disease Transmission, Infectious , Female , Humans , Infant , Infant, Newborn , Male , Respiratory Syncytial Virus Infections/pathology , Respiratory Syncytial Virus Infections/transmission , Risk Assessment , Urban Population , Western Australia/epidemiology , Young Adult
14.
J Dairy Sci ; 100(1): 841-847, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27816245

ABSTRACT

Development of science-based interventions in raw milk cheese production is challenging due to the large diversity of production procedures and final products. Without an agreed-upon categorization scheme, science-based food safety evaluations and validation of preventive controls would have to be completed separately for each individual cheese product, which is not feasible considering the large diversity of products and the typically small scale of production. Thus, a need exists to systematically group raw milk cheeses into logically agreed-upon categories for use in food safety evaluations. This paper proposes and outlines one such categorization scheme, providing 30 general categories of cheese. As a basis for this systematization and categorization of raw milk cheese, we used Table B of the US Food and Drug Administration's 2013 Food Code, which represents the interaction of pH and water activity for control of vegetative cells and spores in non-heat-treated food. Building on this table, we defined a set of more granular pH and water activity categories to better represent the pH and water activity range of different raw milk cheeses. The resulting categorization scheme was validated using pH and water activity values determined for 273 different cheese samples collected in the marketplace throughout New York State, indicating how commercially available cheeses are distributed among the categories proposed here. This consensus categorization of cheese provides a foundation for a feasible approach to developing science-based solutions that assure cheese processors' compliance with food safety regulations, such as those required by the US Food Safety Modernization Act. The key purpose of the categorization proposed here is to facilitate product assessment for food safety risks and to provide scientifically validated guidance on effective interventions for general cheese categories. Once preventive controls for a given category have been defined, these categories would represent safe havens that allow cheesemakers to safely and legally produce raw milk cheeses meeting appropriate science-based safety requirements (e.g., a risk to human health equivalent to that of pasteurized milk cheeses).
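
The proposed scheme is essentially a two-way lookup on pH and water activity; a sketch of that idea with invented band edges (the paper's actual 30-category boundaries are not reproduced here):

```python
# Assign a cheese to a coarse (pH, water-activity) category; the band edges
# below are illustrative placeholders, not the paper's Table B-derived values.
def categorize(ph: float, aw: float) -> str:
    ph_bands = [(4.6, "pH <= 4.6"), (5.0, "4.6 < pH <= 5.0"),
                (5.4, "5.0 < pH <= 5.4"), (float("inf"), "pH > 5.4")]
    aw_bands = [(0.92, "aw <= 0.92"), (0.95, "0.92 < aw <= 0.95"),
                (float("inf"), "aw > 0.95")]
    ph_label = next(lbl for edge, lbl in ph_bands if ph <= edge)
    aw_label = next(lbl for edge, lbl in aw_bands if aw <= edge)
    return f"{ph_label} / {aw_label}"

print(categorize(5.2, 0.97))   # -> "5.0 < pH <= 5.4 / aw > 0.95"
```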


Subject(s)
Cheese/analysis , Consensus , Food Handling , Water/analysis , Animals , Cheese/microbiology , Dairying , Food Contamination/analysis , Food Microbiology , Food Safety , Hydrogen-Ion Concentration , Milk/chemistry , Milk/microbiology , New York
15.
Epidemiol Infect ; 144(13): 2874-82, 2016 10.
Article in English | MEDLINE | ID: mdl-27097518

ABSTRACT

An innovative strategy to reduce dengue transmission uses the bacterium Wolbachia. We analysed the effects of Wolbachia on dengue transmission dynamics in the presence of two serotypes of dengue using a mathematical model, allowing for differences in the epidemiological characteristics of the serotypes. We found that Wolbachia has a greater effect on secondary infections than on primary infections across a range of epidemiological characteristics. If one serotype is more transmissible than the other, it will dominate primary infections and Wolbachia will be less effective at reducing secondary infections of either serotype. Differences in the antibody-dependent enhancement of the two serotypes have considerably less effect on the benefits of Wolbachia than differences in transmission probability. Even if the antibody-dependent enhancement rate is high, Wolbachia is still effective in reducing dengue. Our findings suggest that Wolbachia will be effective in the presence of more than one serotype of dengue; however, a better understanding of serotype-specific differences in transmission probability may be needed to optimize delivery of a Wolbachia intervention.


Subject(s)
Aedes/microbiology , Dengue Virus/physiology , Dengue/epidemiology , Insect Vectors/microbiology , Aedes/virology , Animals , Coinfection/prevention & control , Coinfection/transmission , Coinfection/virology , Dengue/prevention & control , Dengue/transmission , Dengue/virology , Dengue Virus/genetics , Humans , Incidence , Insect Vectors/virology , Models, Theoretical , Serogroup , Wolbachia
16.
Epidemiol Infect ; 144(5): 897-906, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26455517

ABSTRACT

Estimates of the proportion of illness transmitted by food for different enteric pathogens are essential for foodborne burden-of-disease studies. Owing to insufficient scientific data, a formal synthesis of expert opinion, known as expert elicitation, is commonly used to produce such estimates. Eleven experts participated in an elicitation to estimate the proportion of illnesses due to food in Australia for nine pathogens over three rounds: first, based on their own knowledge alone; second, after being provided with systematic reviews of the literature and Australian data; and finally, at a workshop where experts reflected on the evidence. Estimates changed significantly across the three rounds (P = 0·002) as measured by analysis of variance. Following the workshop in round 3, estimates showed smoother distributions with significantly less variation for several pathogens. When estimates were combined to provide a combined distribution for each pathogen, the width of these distributions reflected experts' perceptions of the availability of evidence, with narrower intervals for pathogens for which the evidence was judged to be strongest. Our findings show that the choice of expert elicitation process can significantly influence final estimates. Our structured process, and the workshop in particular, produced robust estimates and distributions appropriate for inclusion in burden-of-disease studies.


Subject(s)
Expert Testimony/methods , Food Microbiology , Food Safety/methods , Foodborne Diseases/epidemiology , Australia/epidemiology , Foodborne Diseases/microbiology , Humans
17.
Risk Anal ; 36(3): 561-70, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26133008

ABSTRACT

Salmonellosis is a significant cause of foodborne gastroenteritis in Australia, and rates of illness have increased over recent years. We adopted a Bayesian source attribution model to estimate the contribution of different animal reservoirs to illness due to Salmonella spp. in South Australia between 2000 and 2010, together with 95% credible intervals (CrIs). We excluded known travel-associated cases and those of rare subtypes (fewer than 20 human cases, or fewer than 10 isolates from the included sources, over the 11-year period); the remaining 76% of cases were classified as sporadic or outbreak associated. Source-related parameters were included to allow for differences in handling and consumption practices. We attributed 35% (95% CrI: 20-49) of sporadic cases to chicken meat and 37% (95% CrI: 23-53) to eggs. Of outbreak-related cases, 33% (95% CrI: 20-62) were attributed to chicken meat and 59% (95% CrI: 29-75) to eggs. A comparison of alternative model assumptions indicated that biases due to possible clustering of samples from sources had relatively minor effects on these estimates. Analysis of the source-related parameters showed a higher risk of illness from contaminated eggs than from contaminated chicken meat, suggesting that consumption and handling practices play a bigger role in illness due to eggs, given the low prevalence of Salmonella on eggs. Our results strengthen the evidence that eggs and chicken meat are important vehicles for salmonellosis in South Australia.
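
A heavily simplified sketch of the Hald-type idea behind such attribution models: expected cases of each subtype from each source scale with the subtype's prevalence in that source times a source-specific factor capturing handling and consumption. Real analyses estimate these factors by MCMC with priors and report credible intervals; the numbers here are invented.

```python
# Toy point-estimate version of Bayesian source attribution; real models
# fit the source factors a_j by MCMC rather than fixing them.
import numpy as np

# rows: Salmonella subtypes; cols: (chicken meat, eggs); invented prevalences
prev = np.array([[0.30, 0.05],
                 [0.02, 0.25],
                 [0.10, 0.10]])
a = np.array([1.0, 2.3])          # source factors: handling/consumption risk

expected = prev * a               # expected cases ~ prevalence x source factor
share = expected.sum(axis=0) / expected.sum()
print(dict(zip(["chicken", "eggs"], share.round(2))))
```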


Subject(s)
Salmonella Food Poisoning/epidemiology , Salmonella Food Poisoning/prevention & control , Animals , Bacterial Typing Techniques , Bayes Theorem , Chickens , Disease Outbreaks , Eggs , Food Microbiology , Food Safety , Health Policy , Humans , Meat , Salmonella Food Poisoning/etiology , Salmonella Infections/epidemiology , South Australia , Travel
18.
J Public Health (Oxf) ; 36(1): 5-12, 2014 Mar.
Article in English | MEDLINE | ID: mdl-23735960

ABSTRACT

The 2009 H1N1 influenza pandemic posed challenges for governments worldwide. Strategies designed to limit community transmission, such as antiviral deployment, were largely ineffective due to both feasibility constraints and the generally mild nature of the disease, which resulted in incomplete case ascertainment. Reviews of national pandemic plans have identified pandemic impact, primarily linked to measures of transmissibility and severity, as a key concept to incorporate into the next generation of plans. While an assessment of impact provides the rationale for interventions, it does not directly indicate whether particular interventions will be effective. Such considerations motivate our introduction of the concept of pandemic controllability. For case-targeted interventions, such as antiviral treatment and post-exposure prophylaxis, we identify the visibility and transmissibility of a pandemic as the key drivers of controllability. Taking a case-study approach, we suggest that high-impact pandemics, for which control is most desirable, are likely uncontrollable with case-targeted interventions; strategies that do not rely on the identification of cases may prove relatively more effective. By introducing a pragmatic framework relating the assessment of impact to the ability to mitigate an epidemic (controllability), we aim to address an omission identified in current pandemic response plans.


Subject(s)
Influenza, Human/prevention & control , Pandemics/prevention & control , Antiviral Agents/therapeutic use , Health Planning , Humans , Influenza, Human/epidemiology , Influenza, Human/transmission , Organizational Case Studies , Post-Exposure Prophylaxis
19.
J Vet Intern Med ; 26(6): 1443-8, 2012.
Article in English | MEDLINE | ID: mdl-23113879

ABSTRACT

BACKGROUND: Isolation of multiple bacterial species is common in foals with Rhodococcus equi pneumonia. HYPOTHESIS: There is no association between isolation of other microorganisms and outcome. ANIMALS: 155 foals with pneumonia caused by R. equi. METHODS: Case records of foals diagnosed with R. equi pneumonia based on culture of the respiratory tract were reviewed at 2 referral hospitals (University of Florida [UF] and Texas A&M University [TAMU]). RESULTS: R. equi was cultured from a tracheobronchial aspirate (TBA) in 115 foals and from lung tissue in 38 foals. Survival was significantly higher at UF (71%; 70/99) than at TAMU (50%; 28/56). R. equi was significantly more likely to grow in pure cultures from samples obtained from foals at UF (55%; 54/99) than from foals at TAMU (23%; 13/56). Microorganisms cultured with R. equi included Gram-positive bacteria in 40, Gram-negative bacteria in 41, and fungi in 23 foals. The most common bacteria isolated were beta-hemolytic streptococci (n = 26) and Escherichia coli (n = 18). Mixed infections were significantly more likely to be encountered in TBA than in lung tissue. Only foals from which R. equi was cultured from a TBA were included in the analysis for association between mixed infection and outcome. After adjusting for the effect of hospital using multivariate logistic regression, mixed culture, mixed bacterial culture, Gram-positive bacteria, beta-hemolytic streptococci, Gram-negative bacteria, enteric Gram-negative bacteria, nonenteric Gram-negative bacteria, and fungi were not significantly associated with outcome. CONCLUSIONS AND CLINICAL IMPORTANCE: Isolation of multiple bacteria or fungi from a TBA along with R. equi does not negatively impact prognosis.


Subject(s)
Actinomycetales Infections/veterinary , Horse Diseases/microbiology , Pneumonia, Bacterial/veterinary , Rhodococcus equi , Actinomycetales Infections/pathology , Animals , Horse Diseases/pathology , Horses , Odds Ratio , Pneumonia, Bacterial/microbiology , Prognosis , Retrospective Studies
20.
Epidemics ; 3(3-4): 152-8, 2011 Sep.
Article in English | MEDLINE | ID: mdl-22094338

ABSTRACT

Most household models of disease transmission assume static household distributions. Although this is a reasonable simplification for assessing vaccination strategies at a single point in time or over the course of an outbreak, it has considerable drawbacks for assessing long-term vaccination policies or for predicting future changes in immunity. We demonstrate that household models that include births, deaths and movement between households can show dramatically different patterns of infection and immunity from those of static population models. When immunity is assumed to be lifelong, the pattern of births by household size is the key driver of infection, suggesting that the influx of susceptibles has the greatest impact on infection risk in the household. In a comparison of 12 countries, we show that both the crude birth rate and the mean household size affect the risk of infection in households.


Subject(s)
Birth Rate , Communicable Diseases/epidemiology , Communicable Diseases/transmission , Family Characteristics , Population Dynamics , Algorithms , Australia/epidemiology , Communicable Diseases/mortality , Disease Outbreaks , Humans , Models, Theoretical , Population Density , Prevalence , Vaccination/statistics & numerical data