Results 1 - 20 of 120
1.
J Med Toxicol ; 2024 Jul 11.
Article in English | MEDLINE | ID: mdl-38992233

ABSTRACT

BACKGROUND: Acetaminophen toxicity remains one of the most common causes of liver failure and is treated with a course of N-acetylcysteine (NAC). This exceptionally effective medication is traditionally administered using a complicated three-bag protocol that is prone to administration errors. OBJECTIVE: We aimed to assess whether switching to a novel two-bag protocol (150 mg/kg over 1 h followed by 150 mg/kg over 20 h) reduced administration errors without increasing liver injury or anaphylactoid reactions. METHODS: This was a retrospective chart review of hospital encounters for patients with acetaminophen toxicity, comparing outcomes before and after the change from a three-bag protocol to a two-bag protocol at two affiliated institutions. The primary outcome was the incidence of medication errors, with secondary outcomes including acute liver injury (ALI) and the incidence of non-allergic anaphylactoid reactions (NAAR). The study was approved by the health system's Institutional Review Board. RESULTS: A total of 483 encounters were included for analysis (239 in the three-bag and 244 in the two-bag groups). NAAR were identified in 11 patients, with no difference seen between groups. Similarly, no differences were seen in ALI. Medication administration errors were observed significantly less often in the two-bag group (OR 0.24) after adjusting for confounders. CONCLUSION: Transitioning to a novel two-bag NAC regimen decreased administration errors. This adds to the literature that two-bag NAC regimens are not only safe but also may have significant benefits over the traditional NAC protocol.
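
As a minimal illustration of the dosing arithmetic behind the two-bag regimen described above (150 mg/kg over 1 h followed by 150 mg/kg over 20 h), the following Python sketch computes per-bag doses and infusion rates for a given patient weight. The dosing-weight ceiling is an assumption added for illustration and is not taken from the study protocol.

    def two_bag_nac(weight_kg, max_dosing_weight_kg=100):
        """Illustrative dose calculation for a two-bag N-acetylcysteine regimen:
        150 mg/kg over 1 h, then 150 mg/kg over 20 h. The 100 kg dosing-weight
        ceiling is an assumption for this sketch, not part of the study protocol."""
        dosing_weight = min(weight_kg, max_dosing_weight_kg)
        bag1_mg = 150 * dosing_weight           # loading bag, infused over 1 hour
        bag2_mg = 150 * dosing_weight           # maintenance bag, infused over 20 hours
        return {
            "bag1_dose_mg": bag1_mg,
            "bag1_rate_mg_per_h": bag1_mg / 1,
            "bag2_dose_mg": bag2_mg,
            "bag2_rate_mg_per_h": bag2_mg / 20,
        }

    print(two_bag_nac(70))   # e.g. 10500 mg over 1 h, then 10500 mg at 525 mg/h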

2.
Article in English | MEDLINE | ID: mdl-38791793

ABSTRACT

Recreational waterbodies with high levels of faecal indicator bacteria (FIB) pose health risks and are an ongoing challenge for urban-lake managers. Lake Burley Griffin (LBG) in Australia's capital city, Canberra, is a popular site for water-based recreation, but analyses of seasonal and long-term patterns in enterococci that exceed alert levels (>200 CFU per 100 mL, leading to site closures) are lacking. This study analysed enterococci concentrations from seven recreational sites between 2001 and 2021 to examine spatial and temporal patterns in exceedances during the swimming season (October-April), when exposure is highest. Enterococci concentrations varied significantly across sites and across the summer months. The frequency of exceedances was higher in the 2009-2015 period than in the 2001-2005 and 2015-2021 periods. The odds of alert-level concentrations were greater in November, December, and February compared to October. The odds of exceedance were higher at the Weston Park East site (a swimming beach) and lower at the Ferry Terminal and Weston Park West sites compared to the East Basin site. This preliminary examination highlights the need for site-specific assessments of environmental and management-related factors that may affect the public health risks of using the lake, such as inflows, turbidity, and climatic conditions. The insights from this study confirm the need for targeted monitoring efforts during high-risk months and at specific sites. The study also advocates for implementing measures to minimise faecal pollution at its sources.
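
The following sketch shows, on synthetic monitoring data, how odds of alert-level exceedance by month and site can be estimated with a logistic regression (odds ratios relative to October and the East Basin site). It is a simplified analogue of this type of analysis, not the study's model; the data are simulated and carry no real month or site effects.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic swimming-season samples standing in for the real record: one row per
    # sample with its site, month, and enterococci count (CFU per 100 mL).
    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "site": rng.choice(["EastBasin", "WestonParkEast", "WestonParkWest", "FerryTerminal"], n),
        "month": rng.choice(["Oct", "Nov", "Dec", "Jan", "Feb", "Mar", "Apr"], n),
        "cfu": rng.lognormal(mean=3.5, sigma=1.5, size=n),
    })
    df["exceed"] = (df["cfu"] > 200).astype(int)      # alert-level exceedance (>200 CFU/100 mL)

    # Logistic regression of exceedance on month and site, with October and the
    # East Basin site as the reference categories.
    model = smf.logit(
        "exceed ~ C(month, Treatment(reference='Oct')) + C(site, Treatment(reference='EastBasin'))",
        data=df,
    ).fit(disp=0)
    print(np.exp(model.params))                        # odds ratios vs. the reference levels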


Subject(s)
Enterococcus , Environmental Monitoring , Lakes , Recreation , Water Quality , Lakes/microbiology , Enterococcus/isolation & purification , Water Microbiology , Seasons , Spatio-Temporal Analysis
3.
BMC Infect Dis ; 24(1): 510, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38773455

ABSTRACT

BACKGROUND: Respiratory syncytial virus (RSV) is the most common cause of acute lower respiratory infections in children worldwide. The highest incidence of severe disease is in the first 6 months of life, with infants born preterm at greatest risk for severe RSV infections. The licensure of new RSV therapeutics (a long-acting monoclonal antibody and a maternal vaccine) in Europe, the USA, the UK, and most recently in Australia, has driven the need for strategic decision making on the implementation of RSV immunisation programs. Data-driven approaches, considering the local RSV epidemiology, are critical to advise on the optimal use of these therapeutics for effective RSV control. METHODS: We developed a dynamic compartmental model of RSV transmission fitted to individually linked population-based laboratory, perinatal, and hospitalisation data for 2000-2012 from metropolitan Western Australia (WA), stratified by age and prior exposure. We accounted for the differential risk of RSV-hospitalisation in full-term and preterm infants (defined as < 37 weeks gestation). We formulated a function relating age, RSV exposure history, and preterm status to the risk of RSV-hospitalisation given infection. RESULTS: The age-to-risk function shows that the risk of hospitalisation, given RSV infection, declines quickly in the first 12 months of life for all infants and is 2.6 times higher in preterm compared with term infants. The hospitalisation risk, given infection, declines to < 10% of the risk at birth by age 7 months for term infants and by 9 months for preterm infants. CONCLUSIONS: The dynamic model, using the age-to-risk function, characterises RSV epidemiology for metropolitan WA and can now be extended to predict the impact of prevention measures. The stratification of the model by preterm status will enable the comparative assessment of potential strategies in the extended model that target this RSV risk group relative to all-population approaches. Furthermore, the age-to-risk function developed in this work has wider relevance to the epidemiological characterisation of RSV.
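
The abstract does not reproduce the fitted age-to-risk function, so the sketch below simply assumes an exponential decline in hospitalisation risk with age and a constant preterm multiplier, with parameters chosen only to echo the reported summary (preterm risk about 2.6 times term risk; risk falling below 10% of the risk at birth by roughly 7 months for term and 9 months for preterm infants).

    import numpy as np

    def hosp_risk_given_infection(age_months, preterm, risk_at_birth=0.30):
        """Illustrative age-to-risk function: risk of RSV-hospitalisation given
        infection, declining exponentially with age. The functional form, the
        baseline risk at birth, and the decay rates are assumptions chosen to
        mirror the abstract's summary, not the study's fitted parameters."""
        months_to_10pct = 9.0 if preterm else 7.0      # age at which risk falls to 10% of the risk at birth
        decay = np.log(10) / months_to_10pct
        multiplier = 2.6 if preterm else 1.0
        return risk_at_birth * multiplier * np.exp(-decay * np.asarray(age_months, dtype=float))

    ages = np.arange(0, 13)
    print(hosp_risk_given_infection(ages, preterm=False).round(3))
    print(hosp_risk_given_infection(ages, preterm=True).round(3))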


Subject(s)
Hospitalization , Infant, Premature , Respiratory Syncytial Virus Infections , Humans , Respiratory Syncytial Virus Infections/epidemiology , Respiratory Syncytial Virus Infections/prevention & control , Hospitalization/statistics & numerical data , Infant , Infant, Newborn , Western Australia/epidemiology , Female , Respiratory Syncytial Virus, Human , Age Factors , Male , Risk Assessment , Risk Factors
4.
Prev Vet Med ; 228: 106212, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38704921

ABSTRACT

African swine fever (ASF) is a viral disease that affects domestic and feral pigs. While not currently present in Australia, ASF outbreaks have been reported nearby in Indonesia, Timor-Leste, and Papua New Guinea. Feral pigs are found in all Australian states and territories and are distributed across a variety of habitats. To investigate the impacts of an ASF introduction event in Australia, we used a stochastic network-based metapopulation feral pig model to simulate ASF outbreaks in different regions of Australia. Outbreak intensity and persistence in feral pig populations were governed by local pig recruitment rates, population size, carcass decay period, and, where applicable, metapopulation topology. In Northern Australia, the carcass decay period was too short for prolonged persistence, whereas endemic transmission could possibly occur in cooler southern areas. Populations in the Macquarie Marshes in New South Wales and in Namadgi National Park in the Australian Capital Territory had the highest rates of persistence, although the two regions differed in the mode of transmission that led to long-term persistence. Endemic Macquarie Marshes simulations were characterised by rapid transmission caused by high population density, which required a fragmented metapopulation to act as a bottleneck to slow transmission. Endemic simulations in Namadgi, with low density and relatively slow transmission, relied on large, well-connected populations coupled with long carcass decay times. Despite the potential for endemic transmission, both settings required potentially unlikely population sizes and dynamics for prolonged disease persistence.
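
A full network metapopulation model is beyond a short sketch, but the toy single-patch stochastic simulation below illustrates the mechanism the abstract highlights: persistence depends on recruitment of susceptible pigs and on how long infectious carcasses remain on the landscape. It is not the study's model, and every parameter value is an illustrative assumption.

    import numpy as np

    def asf_patch_sim(n0=500, beta_live=0.30, beta_carcass=0.05, carcass_decay_days=30,
                      infectious_days=7, birth_rate=0.001, days=3 * 365, seed=0):
        """Toy single-patch stochastic ASF model with daily time steps: susceptible
        pigs (S), infectious pigs (I), and infectious carcasses (C). Returns True
        if the virus is still circulating at the end of the simulation."""
        rng = np.random.default_rng(seed)
        S, I, C = n0 - 1, 1, 0
        for _ in range(days):
            n_alive = max(S + I, 1)
            foi = beta_live * I / n_alive + beta_carcass * C / n_alive    # force of infection
            new_inf = rng.binomial(S, 1 - np.exp(-foi))
            deaths = rng.binomial(I, 1 - np.exp(-1 / infectious_days))    # dead pigs become carcasses
            decayed = rng.binomial(C, 1 - np.exp(-1 / carcass_decay_days))
            births = rng.poisson(birth_rate * n_alive)                    # recruitment of susceptibles
            S, I, C = S + births - new_inf, I + new_inf - deaths, C + deaths - decayed
            if I == 0 and C == 0:
                return False                                              # fade-out
        return True

    # Compare a short (hot climate) and a long (cool climate) carcass decay period.
    for decay_days in (10, 60):
        persistence = np.mean([asf_patch_sim(carcass_decay_days=decay_days, seed=s) for s in range(100)])
        print(f"carcass decay {decay_days} days: virus persisted in {persistence:.0%} of runs")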


Subject(s)
African Swine Fever , Disease Outbreaks , Animals , Swine , African Swine Fever/epidemiology , African Swine Fever/transmission , African Swine Fever/virology , Disease Outbreaks/veterinary , Australia/epidemiology , Animals, Wild/virology , Population Density , Models, Biological , Sus scrofa
5.
PLoS One ; 19(2): e0296774, 2024.
Article in English | MEDLINE | ID: mdl-38300944

ABSTRACT

In low-to-middle-income countries (LMICs), enteric pathogens contribute to child malnutrition, affecting nutrient absorption, inducing inflammation, and causing diarrhoea. This is a substantial problem in LMICs due to high disease burden, poor sanitation and nutritional status, and the cyclical nature of pathogen infection and malnutrition. This relationship remains understudied in Timor-Leste. In our pilot study of enteric pathogens and malnutrition in Dili, Timor-Leste (July 2019-October 2020), we recruited 60 infants in a birth cohort from Hospital Nacional Guido Valadares (HNGV) with up to four home visits. We collected faecal samples and details of demographics, anthropometrics, diet and food practices, and animal husbandry. Additionally, we collected faecal samples, diagnostics, and anthropometrics from 160 children admitted to HNGV with a clinical diagnosis of severe diarrhoea or severe acute malnutrition (SAM). We tested faeces using the BioFire® FilmArray® Gastrointestinal Panel. We detected high prevalence of enteric pathogens in 68.8% (95%CI 60.4-76.2%) of infants at home, 88.6% of SAM cases (95%CI 81.7-93.3%) and 93.8% of severe diarrhoea cases (95%CI 67.7-99.7%). Diarrhoeagenic Escherichia coli and Campylobacter spp. were most frequently detected. Pathogen presence did not significantly differ in birth cohort diarrhoeal stool, but hospital data indicated associations between Salmonella and Shigella and diarrhoea. We observed wasting in 18.4% (95%CI 9.2-32.5%) to 30.8% (95%CI 17.5-47.7%) of infants across home visits, 57.9% (95%CI 34.0-78.9%) of severe diarrhoea cases, and 92.5% (95%CI 86.4-96.2%) of SAM cases. We associated bottle feeding with increased odds of pathogen detection when compared with exclusive breastfeeding at home (OR 8.3, 95%CI 1.1-62.7). We detected high prevalence of enteric pathogens and signs of malnutrition in children in Dili. Our pilot is proof of concept for a study to fully explore the risk factors and associations between enteric pathogens and malnutrition in Timor-Leste.
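
For context on the interval estimates quoted above, the snippet below shows how a prevalence and its 95% confidence interval are typically computed from a numerator and denominator. The counts are hypothetical and the Wilson method is an assumption; the abstract does not state which interval method the study used.

    from statsmodels.stats.proportion import proportion_confint

    positives, tested = 97, 141          # hypothetical counts for illustration only
    prevalence = positives / tested
    low, high = proportion_confint(positives, tested, alpha=0.05, method="wilson")
    print(f"prevalence {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")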


Subject(s)
Child Nutrition Disorders , Malnutrition , Severe Acute Malnutrition , Infant , Child , Animals , Female , Humans , Pilot Projects , Child Nutrition Disorders/epidemiology , Child Nutrition Disorders/complications , Birth Cohort , Timor-Leste/epidemiology , Malnutrition/epidemiology , Malnutrition/complications , Diarrhea/epidemiology , Diarrhea/etiology , Severe Acute Malnutrition/complications , Hospitals
7.
Microb Genom ; 10(1)2024 Jan.
Article in English | MEDLINE | ID: mdl-38214338

ABSTRACT

Campylobacter spp. are a common cause of bacterial gastroenteritis in Australia, primarily acquired from contaminated meat. We investigated the relationship between genomic virulence characteristics and the severity of campylobacteriosis, hospitalisation, and other host factors. We recruited 571 campylobacteriosis cases from three Australian states and territories (2018-2019). We collected demographic, health status, risk factor, and self-reported disease data. We whole-genome sequenced 422 C. jejuni and 84 C. coli case isolates along with 616 retail meat isolates. We classified case illness severity using a modified Vesikari scoring system, performed phylogenomic analysis, and explored risk factors for hospitalisation and illness severity. On average, cases experienced a 7.5-day diarrhoeal illness with additional symptoms including stomach cramps (87.1%), fever (75.6%), and nausea (72.0%). Cases aged ≥75 years had milder symptoms, lower Vesikari scores, and higher odds of hospitalisation compared to younger cases. Chronic gastrointestinal illnesses also increased the odds of hospitalisation. We observed significant diversity among isolates, with 65 C. jejuni and 21 C. coli sequence types. Antimicrobial resistance genes were detected in 20.4% of isolates, but multidrug resistance was rare (0.04%). Key virulence genes such as cdtABC (C. jejuni) and cadF were prevalent (>90% presence) but did not correlate with disease severity or hospitalisation. However, certain genes (e.g. fliK, Cj1136, and Cj1138) appeared to distinguish human C. jejuni cases from food source isolates. Campylobacteriosis generally presents similarly across cases, though some are more severe. Genotypic virulence factors identified in the literature to date do not predict disease severity but may differentiate human C. jejuni cases from food source isolates. Host factors such as age and comorbidities have a greater influence on health outcomes than virulence factors.


Subject(s)
Campylobacter Infections , Campylobacter coli , Campylobacter jejuni , Gastroenteritis , Humans , Campylobacter Infections/epidemiology , Campylobacter Infections/microbiology , Campylobacter coli/genetics , Australia/epidemiology , Virulence Factors/genetics , Genomics
8.
BMC Public Health ; 23(1): 2466, 2023 12 11.
Article in English | MEDLINE | ID: mdl-38082260

ABSTRACT

BACKGROUND: Achieving adequate COVID-19 vaccine coverage in low- and middle-income countries remains challenging. As supplies increase, coverage is increasingly determined by rollout capacity. METHODS: We developed a deterministic compartmental model of COVID-19 transmission to explore how age-, risk-, and dose-specific vaccine prioritisation strategies can minimise severe outcomes of COVID-19 in Sierra Leone. RESULTS: Prioritising booster doses to older adults and adults with comorbidities could reduce the incidence of severe disease by 23% and deaths by 34% compared to using these doses as primary doses for all adults. Providing a booster dose to pregnant women who present to antenatal care could prevent 38% of neonatal deaths associated with COVID-19 infection during pregnancy. Vaccinating children is not justified unless supply is sufficient that doses delivered to adults are not affected. CONCLUSIONS: Our paper supports the current WHO SAGE vaccine prioritisation guidelines (released January 2022). Individuals at the highest risk of developing severe outcomes should be prioritised, and opportunistic vaccination strategies should be considered in settings with limited rollout capacity.
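
A minimal sketch of the kind of deterministic compartmental model described above is given below: a two-group SEIR system in which vaccination is assumed, for simplicity, to reduce only the risk of severe disease, so that prioritising a fixed number of doses to the high-risk group can be compared with spreading the same doses uniformly. All parameter values (contact rates, severity risks, vaccine efficacy, group sizes) are illustrative assumptions, not the calibrated values used for Sierra Leone.

    import numpy as np
    from scipy.integrate import solve_ivp

    beta = np.array([[0.30, 0.10],            # transmission rates within/between groups
                     [0.10, 0.15]])           # group 0 = low-risk adults, group 1 = high-risk adults
    sigma, gamma = 1 / 3, 1 / 5               # 1/latent period and 1/infectious period (per day)
    N = np.array([4.0e6, 1.0e6])              # group sizes
    severe_risk = np.array([0.005, 0.05])     # risk of severe disease given infection
    ve_severe = 0.85                          # assumed vaccine efficacy against severe disease

    def seir(t, y):
        S, E, I, R = y.reshape(4, 2)
        lam = beta @ (I / N)                  # force of infection on each group
        return np.concatenate([-lam * S, lam * S - sigma * E, sigma * E - gamma * I, gamma * I])

    def severe_cases(coverage):
        """Cumulative severe cases over one year given vaccine coverage per group
        (vaccination here protects against severe disease only, for simplicity)."""
        y0 = np.concatenate([N - 10, [10, 10], [0, 0], [0, 0]])
        sol = solve_ivp(seir, (0, 365), y0)
        attack = (N - sol.y[0:2, -1]) / N     # fraction of each group ever infected
        effective_risk = severe_risk * (1 - ve_severe * np.asarray(coverage))
        return float(np.sum(attack * N * effective_risk))

    # Same total number of doses (1.3 million) allocated two ways.
    print("prioritise high-risk group:", round(severe_cases([0.10, 0.90])))
    print("uniform allocation:        ", round(severe_cases([0.26, 0.26])))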


Subject(s)
COVID-19 , Perinatal Death , Pregnancy , Child , Infant, Newborn , Humans , Female , Aged , COVID-19/epidemiology , COVID-19/prevention & control , COVID-19 Vaccines , Sierra Leone/epidemiology , Vaccination
9.
Biology (Basel) ; 12(11)2023 Nov 13.
Article in English | MEDLINE | ID: mdl-37998028

ABSTRACT

Ross River virus (RRV) causes the most common mosquito-borne disease in Australia, with Queensland recording a high incidence (an annual average incidence rate of 0.05% over the last 20 years). Accurate prediction of RRV incidence is critical for disease management and control. Many factors, including mosquito abundance, climate, weather, geographical factors, and socio-economic indices, can influence the RRV transmission cycle and thus have potential utility as predictors of RRV incidence. We collected mosquito data from the city councils of Brisbane, Redlands, and Mackay in Queensland, together with other meteorological and geographical data. Selected predictors were used to build negative binomial generalised linear models for prediction. The models demonstrated excellent performance in Brisbane and Redlands but were less satisfactory in Mackay. Mosquito abundance was retained as a predictor in the Brisbane model and improved predictive performance. Sufficient sample sizes of continuous mosquito data and RRV cases were essential for accurate and effective prediction, highlighting the importance of routine vector surveillance for disease management and control. Our results are consistent with variation in transmission cycles across different cities, and our study demonstrates the usefulness of mosquito surveillance data for predicting RRV incidence within small geographical areas.
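
As an illustration of the modelling approach named above, the sketch below fits a negative binomial generalised linear model of monthly RRV case counts to synthetic mosquito-abundance, temperature, and rainfall data. The variable names, the dispersion parameter, and the simulated effect sizes are assumptions, not the study's fitted model.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 120                                             # ten years of monthly records
    df = pd.DataFrame({
        "mosquito_abundance": rng.gamma(5, 40, n),      # mean trap count per month
        "mean_temp_c": rng.normal(24, 4, n),
        "rainfall_mm": rng.gamma(2, 50, n),
    })
    # Simulate case counts from an assumed log-linear relationship to the predictors.
    mu = np.exp(-1 + 0.004 * df.mosquito_abundance + 0.08 * df.mean_temp_c + 0.002 * df.rainfall_mm)
    df["cases"] = rng.negative_binomial(n=5, p=5 / (5 + mu))

    X = sm.add_constant(df[["mosquito_abundance", "mean_temp_c", "rainfall_mm"]])
    model = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
    print(np.exp(model.params))                         # rate ratios per unit of each predictor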

10.
Open Forum Infect Dis ; 10(10): ofad450, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37790944

ABSTRACT

Background: The association between early-life respiratory syncytial virus (RSV) infections and later respiratory morbidity is well established. However, there is limited evidence on factors that influence this risk. We examined sociodemographic and perinatal factors associated with later childhood respiratory morbidity requiring secondary care following exposure to a laboratory-confirmed RSV episode in the first 2 years. Methods: We used a probabilistically linked whole-of-population-based birth cohort including 252 287 children born in Western Australia between 2000 and 2009 with follow-up to the end of 2012. Cox proportional hazards models estimated adjusted hazard ratios (aHRs) of the association of various risk factors with the first respiratory episode for asthma, wheezing, and unspecified acute lower respiratory infection beyond the age of 2 years. Results: The analytic cohort included 4151 children with a confirmed RSV test before age 2 years. The incidence of subsequent respiratory morbidity following early-life RSV infection decreased with child age at outcome (highest incidence in 2-<4-year-olds: 41.8 per 1000 child-years; 95% CI, 37.5-46.6), increased with age at RSV infection (6-<12-month-olds: 23.6/1000 child-years; 95% CI, 19.9-27.8; 12-<24-month-olds: 22.4/1000 child-years; 95% CI, 18.2-22.7) and decreasing gestational age (50.8/1000 child-years; 95% CI, 33.5-77.2 for children born extremely preterm, <28 weeks gestation). Risk factors included age at first RSV episode (6-<12 months: aHR, 1.42; 95% CI, 1.06-1.90), extreme prematurity (<28 weeks: aHR, 2.22; 95% CI, 1.40-3.53), maternal history of asthma (aHR, 1.33; 95% CI, 1.04-1.70), and low socioeconomic index (aHR, 1.76; 95% CI, 1.03-3.00). Conclusions: Our results suggest that in addition to preterm and young infants, children aged 12-<24 months could also be potential target groups for RSV prevention to reduce the burden of later respiratory morbidities associated with RSV.
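
The sketch below mirrors the type of Cox proportional hazards analysis described above on synthetic data, using the lifelines library: time to the first respiratory episode beyond age 2 years regressed on a few binary risk factors. The covariate names, effect sizes, and censoring scheme are assumptions for illustration only.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(7)
    n = 4000
    df = pd.DataFrame({
        "rsv_6_to_12m": rng.binomial(1, 0.30, n),        # first RSV episode at 6-<12 months
        "extremely_preterm": rng.binomial(1, 0.02, n),   # born <28 weeks gestation
        "maternal_asthma": rng.binomial(1, 0.15, n),
        "low_ses": rng.binomial(1, 0.20, n),
    })
    # Simulate event times from an assumed proportional-hazards relationship.
    log_hr = (0.35 * df.rsv_6_to_12m + 0.80 * df.extremely_preterm
              + 0.29 * df.maternal_asthma + 0.57 * df.low_ses)
    df["time_years"] = rng.exponential(scale=8 * np.exp(-log_hr))
    df["event"] = (df["time_years"] < 10).astype(int)    # administrative censoring at 10 years
    df.loc[df["event"] == 0, "time_years"] = 10.0

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_years", event_col="event")
    cph.print_summary()                                  # exp(coef) column gives adjusted hazard ratios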

11.
Vaccine X ; 15: 100386, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37727365

ABSTRACT

Continued efforts to reduce the burden of COVID-19 require the consideration of additional booster doses and emerging oral antivirals. This study explored the individual- and population-level impacts of booster dose and oral antivirals in Indonesia, Fiji, Papua New Guinea, and Timor-Leste. Our mathematical model included age structure, vaccine coverage, prevalence of comorbidities, and immunity from prior infection fit to incidence data from our study settings. We explored a range of eligibility criteria and found that boosters had the largest impact per dose when prioritised to high-risk adults and adults who had not previously received a booster. Antivirals were most effective in settings with low vaccine-derived immunity. In general, fewer antivirals than booster doses were required to prevent a hospitalisation or death. Only in settings with very high vaccine uptake was the impact per dose of providing booster doses to high-risk adults comparable to providing oral antivirals to high-risk adults. Together, booster doses and oral antivirals could prevent 80%, 64%, 49%, and 65% of deaths, and 38%, 37%, 16%, and 34% of hospitalisations in Fiji, Indonesia, Papua New Guinea, and Timor-Leste respectively. Therefore, our findings support the continued provision of COVID-19 booster doses to high-risk adults in 2023, and advocate for increased access to oral antivirals, especially in settings with low vaccine coverage such as Papua New Guinea. Future work should consider the threshold at which self-financing of COVID-19 oral antivirals would be viable for middle-income countries in South-East Asia and the Pacific.

12.
Lancet Microbe ; 4(11): e953-e962, 2023 11.
Article in English | MEDLINE | ID: mdl-37683688

ABSTRACT

Whole-genome sequencing (WGS) has resulted in improvements to pathogen characterisation for the rapid investigation and management of disease outbreaks and surveillance. We conducted a systematic review to synthesise the economic evidence of WGS implementation for pathogen identification and surveillance. Of the 2285 unique publications identified through online database searches, 19 studies met the inclusion criteria. The economic evidence to support the broader application of WGS as a front-line pathogen characterisation and surveillance tool is insufficient and of low quality. WGS has been evaluated in various clinical settings, but these evaluations are predominantly investigations of a single pathogen. There are also considerable variations in the evaluation approach. Economic evaluations of costs, effectiveness, and cost-effectiveness are needed to support the implementation of WGS in public health settings.


Subject(s)
Cross Infection , Public Health Surveillance , Humans , Cost-Benefit Analysis , Whole Genome Sequencing/methods , Disease Outbreaks , Public Health
13.
Foodborne Pathog Dis ; 20(10): 419-426, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37610847

ABSTRACT

Foodborne illnesses cause a significant health burden, with Campylobacter and norovirus the most common causes of illness and Salmonella a common cause of hospitalization and occasional cause of death. Estimating the cost of illness can assist in quantifying this health burden, with pathogen-specific costs informing prioritization of interventions. We used a simulation-based approach to cost foodborne disease in Australia, capturing the cost of premature mortality, direct costs of nonfatal illness (including health care costs, medications, and tests), indirect costs of illness due to lost productivity, and costs associated with pain and suffering. In Australia circa 2019, the cost in Australian Dollars (AUD) of foodborne illness and its sequelae was 2.44 billion (90% uncertainty interval 1.65-3.68) each year, with the highest pathogen-specific costs for Campylobacter, non-typhoidal Salmonella, non-Shiga toxin-producing pathogenic Escherichia coli, and norovirus. The highest cost per case was for Listeria monocytogenes (AUD 776,000). Lost productivity was the largest component cost for foodborne illness due to all causes and for most individual pathogens; the exceptions were pathogens causing more severe illness such as Salmonella and L. monocytogenes, where premature mortality was the largest component cost. Foodborne illness results in a substantial cost to Australia; interventions to improve food safety across industry, retail, and consumers are needed to maintain public health safety.
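
The study's full cost model is not reproduced here; the snippet below only sketches the general simulation-based approach: draw uncertain inputs, sum the component costs (direct healthcare, lost productivity, pain and suffering, premature mortality), and summarise the total with a median and 90% uncertainty interval. Every distribution and value is a placeholder assumption.

    import numpy as np

    rng = np.random.default_rng(3)
    draws = 10_000
    cases = rng.normal(4.1e6, 0.4e6, draws)                  # annual foodborne illness cases (assumed)
    direct_cost = rng.lognormal(np.log(80), 0.3, draws)      # healthcare cost per case, AUD (assumed)
    productivity = rng.lognormal(np.log(250), 0.4, draws)    # lost productivity per case, AUD (assumed)
    pain_suffering = rng.lognormal(np.log(120), 0.5, draws)  # pain and suffering per case, AUD (assumed)
    mortality_cost = rng.normal(4.0e8, 1.0e8, draws)         # total premature mortality cost, AUD (assumed)

    total = cases * (direct_cost + productivity + pain_suffering) + mortality_cost
    median, lo, hi = np.percentile(total, [50, 5, 95])
    print(f"total annual cost: AUD {median / 1e9:.2f} billion (90% UI {lo / 1e9:.2f}-{hi / 1e9:.2f})")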

14.
PLOS Glob Public Health ; 3(8): e0000915, 2023.
Article in English | MEDLINE | ID: mdl-37619237

ABSTRACT

Maternal pneumococcal vaccines have been proposed as a method of protecting infants in the first few months of life. In this paper, we use results from a dynamic transmission model to assess the cost-effectiveness of a maternal pneumococcal polysaccharide vaccine from both healthcare and societal perspectives. We estimate the costs of delivering a maternal pneumococcal polysaccharide vaccine, the healthcare costs averted, and the productivity losses avoided through the prevention of severe pneumococcal outcomes such as pneumonia and meningitis. Our model estimates that a maternal pneumococcal program would cost $606 (2020 USD, 95% prediction interval 437 to 779) per DALY averted from a healthcare perspective and $132 (95% prediction interval -1 to 265) per DALY averted from a societal perspective, for one year of vaccine delivery. Hence, a maternal pneumococcal vaccine would be cost-effective from a societal perspective but not from a healthcare perspective using Sierra Leone's GDP per capita of $527 as a cost-effectiveness threshold. Sensitivity analysis demonstrates how the choice to discount ongoing health benefits determines whether the maternal pneumococcal vaccine is deemed cost-effective from a healthcare perspective. Without discounting, the cost per DALY averted would be $292 (55% of Sierra Leone's GDP per capita) from a healthcare perspective. Further, the cost per DALY averted would be $142 (27% of GDP per capita) from a healthcare perspective if PPV could be procured at the same cost relative to PCV in Sierra Leone as on the PAHO reference price list. Overall, our paper demonstrates that maternal pneumococcal vaccines have the potential to be cost-effective in low-income settings; however, the likelihood of low-income countries self-financing this intervention will depend on negotiations with vaccine providers on vaccine price. Vaccine price is the largest program cost driving the cost-effectiveness of a future maternal pneumococcal vaccine.
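
The arithmetic behind the discounting sensitivity analysis can be illustrated with hypothetical numbers: holding the program cost fixed, discounting a stream of future DALYs averted shrinks the denominator and therefore raises the cost per DALY averted. The figures below are placeholders, not the study's inputs.

    program_cost = 1_000_000        # hypothetical net program cost (USD)
    dalys_per_year = 200            # hypothetical DALYs averted per year of ongoing benefit
    years_of_benefit = 20
    discount_rate = 0.03

    def discounted_dalys(rate):
        # Sum of the DALY stream, discounted back to the present at the given rate.
        return sum(dalys_per_year / (1 + rate) ** t for t in range(years_of_benefit))

    print(f"cost per DALY averted, no discounting: {program_cost / discounted_dalys(0.0):,.0f}")
    print(f"cost per DALY averted, 3% discounting: {program_cost / discounted_dalys(discount_rate):,.0f}")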

15.
Vaccine ; 41(36): 5216-5220, 2023 08 14.
Article in English | MEDLINE | ID: mdl-37474407

ABSTRACT

Respiratory syncytial virus contributes to significant global infant morbidity and mortality. We applied a previously developed statistical prediction model incorporating pre-pandemic RSV testing data and hospital admission data to estimate infant RSV-hospitalizations by birth month and prematurity, focusing on infants aged <1 year. The overall predicted RSV-hospitalization incidence rates were 32.7/1,000 child-years (95% CI: 31.8, 33.5) in infants aged <6 months and 3.1/1,000 child-years (95% CI: 3.0, 3.1) in infants aged 6-<12 months. Predicted RSV-hospitalization rates for infants aged <6 months were highest for infants born in April/May. Predicted rates for preterm infants born at 29-32 weeks gestation were highest in March-May, whereas infants born >33 weeks had peak RSV-hospitalization rates from May-June, similar to late preterm or term births. RSV-hospitalization rates in the pre-pandemic era were highly seasonal, and seasonality varied with the degree of prematurity. Accurate estimates of RSV-hospitalization in high-risk sub-groups are essential to understand the preventable burden of RSV, especially given the current prevention landscape.


Subject(s)
Respiratory Syncytial Virus Infections , Respiratory Syncytial Virus, Human , Humans , Infant, Newborn , Infant , Infant, Premature , Incidence , Respiratory Syncytial Virus Infections/epidemiology , Western Australia/epidemiology , Seasons , Hospitalization , Palivizumab/therapeutic use , Antiviral Agents/therapeutic use
16.
PLoS Negl Trop Dis ; 17(5): e0011347, 2023 05.
Article in English | MEDLINE | ID: mdl-37200375

ABSTRACT

American Samoa underwent seven rounds of mass drug administration (MDA) for lymphatic filariasis (LF) from 2000-2006, but subsequent surveys found evidence of ongoing transmission. American Samoa has since undergone further rounds of MDA in 2018, 2019, and 2021; however, recent surveys indicate that transmission is still ongoing. GEOFIL, a spatially-explicit agent-based LF model, was used to compare the effectiveness of territory-wide triple-drug MDA (3D-MDA) with targeted surveillance and treatment strategies. Both approaches relied on treatment with ivermectin, diethylcarbamazine, and albendazole. We simulated three levels of whole-population coverage for 3D-MDA: 65%, 73%, and 85%, while the targeted strategies relied on surveillance in schools, workplaces, and households, followed by targeted treatment. In the household-based strategies, we simulated 1-5 teams travelling village-to-village and offering antigen (Ag) testing to randomly selected households in each village. If an Ag-positive person was identified, treatment was offered to members of all households within 100 m-1 km of the positive case. All simulated interventions were completed by 2027, and their effectiveness was judged by their 'control probability': the proportion of simulations in which microfilariae prevalence decreased between 2030 and 2035. Without future intervention, we predict Ag prevalence will rebound. With 3D-MDA, a 90% control probability required an estimated ≥ 4 further rounds with 65% coverage, ≥ 3 rounds with 73% coverage, or ≥ 2 rounds with 85% coverage. While household-based strategies were substantially more testing-intensive than 3D-MDA, they could offer comparable control probabilities with substantially fewer treatments; e.g. three teams aiming to test 50% of households and offering treatment within a 500 m radius had approximately the same control probability as three rounds of 73% 3D-MDA but used < 40% of the number of treatments. School- and workplace-based interventions proved ineffective. Regardless of strategy, reducing Ag prevalence below the 1% target threshold recommended by the World Health Organization was a poor indicator of the interruption of LF transmission, highlighting the need to review blanket elimination targets.
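
The 'control probability' summary is straightforward to compute from simulation output: it is simply the proportion of stochastic runs in which microfilariae prevalence is lower in 2035 than in 2030. The snippet below shows the calculation on synthetic trajectories (not GEOFIL output).

    import numpy as np

    rng = np.random.default_rng(11)
    n_sims = 1000
    prev_2030 = rng.beta(2, 60, n_sims)                # simulated mf prevalence in 2030 (synthetic)
    change = rng.normal(-0.002, 0.004, n_sims)         # simulated five-year change per run (synthetic)
    prev_2035 = np.clip(prev_2030 + change, 0, 1)

    control_probability = np.mean(prev_2035 < prev_2030)
    print(f"control probability: {control_probability:.2f}")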


Subject(s)
Elephantiasis, Filarial , Filaricides , Animals , Humans , Elephantiasis, Filarial/drug therapy , Elephantiasis, Filarial/epidemiology , Elephantiasis, Filarial/prevention & control , Mass Drug Administration , Wuchereria bancrofti , Filaricides/therapeutic use , Filaricides/pharmacology , American Samoa/epidemiology , Albendazole/therapeutic use
17.
Risk Anal ; 43(12): 2527-2548, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37032319

ABSTRACT

Campylobacter jejuni and Campylobacter coli infections are the leading cause of foodborne gastroenteritis in high-income countries. Campylobacter colonizes a variety of warm-blooded hosts that are reservoirs for human campylobacteriosis. The proportions of Australian cases attributable to different animal reservoirs are unknown but can be estimated by comparing the frequency of different sequence types in cases and reservoirs. Campylobacter isolates were obtained from notified human cases and from raw meat and offal from the major livestock species in Australia between 2017 and 2019. Isolates were typed using multi-locus sequence genotyping. We used Bayesian source attribution models, including the asymmetric island model, the modified Hald model, and their generalizations. Some models included an "unsampled" source to estimate the proportion of cases attributable to wild, feral, or domestic animal reservoirs not sampled in our study. Model fits were compared using the Watanabe-Akaike information criterion. We included 612 food and 710 human case isolates. The best-fitting models attributed >80% of Campylobacter cases to chickens, with a greater proportion of C. coli (>84%) than C. jejuni (>77%). The best-fitting model that included an unsampled source attributed 14% (95% credible interval [CrI]: 0.3%-32%) to the unsampled source and only 2% to ruminants (95% CrI: 0.3%-12%) and 2% to pigs (95% CrI: 0.2%-11%). The best-fitting model that did not include an unsampled source attributed 12% to ruminants (95% CrI: 1.3%-33%) and 6% to pigs (95% CrI: 1.1%-19%). Chickens were the leading source of human Campylobacter infections in Australia in 2017-2019 and should remain the focus of interventions to reduce burden.
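
As a deliberately simplified analogue of source attribution (and emphatically not the asymmetric island or modified Hald models used in the study), the sketch below estimates the mixture of source sequence-type frequency profiles that best matches the profile observed in human cases, using non-negative least squares. All sequence-type frequencies are made up for illustration.

    import numpy as np
    from scipy.optimize import nnls

    sources = ["chicken", "ruminant", "pig"]
    # Rows are sequence types (STs); columns give each ST's relative frequency in each source.
    source_profiles = np.array([
        [0.50, 0.05, 0.02],
        [0.30, 0.10, 0.03],
        [0.10, 0.60, 0.15],
        [0.05, 0.20, 0.60],
        [0.05, 0.05, 0.20],
    ])
    case_profile = np.array([0.45, 0.28, 0.15, 0.08, 0.04])   # ST frequencies among human cases

    weights, _ = nnls(source_profiles, case_profile)           # non-negative mixing weights
    attribution = weights / weights.sum()
    print(dict(zip(sources, attribution.round(2))))            # estimated proportion of cases per source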


Subject(s)
Campylobacter Infections , Campylobacter jejuni , Campylobacter , Gastroenteritis , Animals , Humans , Swine , Campylobacter Infections/epidemiology , Bayes Theorem , Chickens , Australia/epidemiology , Multilocus Sequence Typing , Campylobacter/genetics , Campylobacter jejuni/genetics , Ruminants
18.
PLoS Negl Trop Dis ; 17(3): e0010450, 2023 03.
Article in English | MEDLINE | ID: mdl-36857390

ABSTRACT

Shigellosis is an increasing cause of gastroenteritis in Australia, with prolonged outbreaks reported in remote Aboriginal and Torres Strait Islander (hereafter "First Nations") communities and among men who have sex with men (MSM) in major cities. To determine associations between Shigella species and demographic and geographic factors, we used multivariate negative binomial regression to analyse national case notifications of shigellosis from 2001 to 2019. Between 2001 and 2019, Australian states and territories reported 18,363 shigellosis cases to the National Notifiable Diseases Surveillance System (NNDSS), of which age, sex and organism information were available for >99% (18,327/18,363) of cases. Of the cases included in our analysis, 42% (7,649/18,327) were S. sonnei, 29% (5,267/18,327) were S. flexneri, 1% (214/18,327) were S. boydii, less than 1% (87/18,327) were S. dysenteriae, and species information was unknown for 28% (5,110/18,327) of cases. Males accounted for 54% (9,843/18,327) of cases, and the highest proportion of cases were in children aged 0-4 years (19%; 3,562/18,327). Crude annual notification rates ranged from 2.2 cases per 100,000 in 2003 and 2011 to 12.4 cases per 100,000 in 2019. Nationally, notification rates increased from 2001 to 2019, with yearly notification rate ratios of 1.04 (95% CI 1.02-1.07) for S. boydii and 1.05 (95% CI 1.04-1.06) for S. sonnei. Children aged 0-4 years had the highest burden of infection for S. flexneri, S. sonnei and S. boydii, and males had a higher notification rate for S. sonnei (notification rate ratio 1.24, 95% CI 1.15-1.33). First Nations Australians were disproportionately affected by shigellosis, with the notification rate in this population peaking in 2018 at 92.1 cases per 100,000 population. Over the study period, we also observed a shift in the testing method used to diagnose shigellosis, with culture-independent diagnostic testing (CIDT) increasing from 2014; this also coincided with an increase in notifications of untyped Shigella. This change in testing methodology may have contributed to the observed increase in shigellosis notifications since 2014, with CIDT being more sensitive than culture-dependent testing methods. The findings of this study provide important insights into the epidemiological characteristics of shigellosis in Australia, including the identification of high-risk groups. This can be used to inform public health prevention and control strategies, such as targeted communication programs in First Nations communities and in places with high levels of interaction between young children, such as childcare centres. Our study findings also highlight the implications of culture-independent testing for shigellosis surveillance, particularly a reduction in the availability of species-level information. This emphasises the continued importance of culture-dependent testing for national surveillance of shigellosis.
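
The sketch below shows, on synthetic annual counts, the general form of a negative binomial regression of notification counts with a log-population offset, so that exponentiated coefficients are notification rate ratios (for example, per year or male versus female). The data, dispersion parameter, and effect sizes are simulated assumptions, not the NNDSS analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    rows = []
    for year in range(2001, 2020):
        for male in (0, 1):
            pop = 9.5e6 + 8e4 * (year - 2001)                       # population at risk (assumed)
            mu = pop * 4e-5 * np.exp(0.05 * (year - 2001) + 0.2 * male)
            rows.append({"year": year - 2001, "male": male, "pop": pop, "cases": rng.poisson(mu)})
    df = pd.DataFrame(rows)

    X = sm.add_constant(df[["year", "male"]])
    model = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial(alpha=0.05),
                   offset=np.log(df["pop"])).fit()
    print(np.exp(model.params[["year", "male"]]))                   # yearly and male:female rate ratios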


Subject(s)
Dysentery, Bacillary , Sexual and Gender Minorities , Shigella , Child , Male , Humans , Child, Preschool , Dysentery, Bacillary/epidemiology , Dysentery, Bacillary/diagnosis , Homosexuality, Male , Australia/epidemiology
19.
JBI Evid Synth ; 21(3): 507-519, 2023 03 01.
Article in English | MEDLINE | ID: mdl-36683451

ABSTRACT

OBJECTIVE: This study aimed to assess the utility of a unified tool (MASTER) for bias assessment against design-specific tools in terms of content and coverage. METHODS: Each of the safeguards in the design-specific tools was compared and matched to safeguards in the unified MASTER scale. The design-specific tools were the JBI, Scottish Intercollegiate Guidelines Network (SIGN), and the Newcastle-Ottawa Scale (NOS) tools for analytic study designs. Duplicates, safeguards that could not be mapped to the MASTER scale, and items not applicable as safeguards against bias were flagged and described. RESULTS: Many safeguards across the JBI, SIGN, and NOS tools were common, with a minimum of 10 to a maximum of 23 unique safeguards across various tools. These 3 design-specific toolsets were missing 14 to 26 safeguards from the MASTER scale. The MASTER scale had complete coverage of safeguards within the 3 toolsets for analytic designs. CONCLUSIONS: The MASTER scale provides a unified framework for bias assessment of analytic study designs, has good coverage, avoids duplication, has less redundancy, and is more convenient when used for methodological quality assessment in evidence synthesis. It also allows assessment across designs that cannot be done using a design-specific tool.


Subject(s)
Research Design , Humans , Bias
20.
Arch Dis Child Fetal Neonatal Ed ; 108(4): 400-407, 2023 Jul.
Article in English | MEDLINE | ID: mdl-36593112

ABSTRACT

OBJECTIVE: There is an expectation among the public and within the profession that the performance and outcomes of neonatal intensive care units (NICUs) should be comparable between centres with a similar setting. This study aims to benchmark and audit performance variation in a regional Australian network of eight NICUs. DESIGN: Cohort study using prospectively collected data. SETTING: All eight perinatal centres in New South Wales and the Australian Capital Territory, Australia. PATIENTS: All live-born infants born between 23+0 and 31+6 weeks gestation admitted to one of the tertiary perinatal centres from 2007 to 2020 (n=12 608). MAIN OUTCOME MEASURES: Early and late confirmed sepsis, intraventricular haemorrhage, medically and surgically treated patent ductus arteriosus, chronic lung disease (CLD), postnatal steroids for CLD, necrotising enterocolitis, retinopathy of prematurity (ROP), surgery for ROP, hospital mortality and home oxygen. RESULTS: NICUs showed variations in maternal and neonatal characteristics and resources. The unadjusted funnel plots for neonatal outcomes showed apparent variation, with multiple centres outside the 99.8% control limits of the network values. Hierarchical model-based risk adjustment accounting for differences in patient characteristics showed that discharge home with oxygen was the only outcome above the 99.8% control limits. CONCLUSIONS: Hierarchical model-based risk-adjusted estimates of morbidity rates plotted on funnel plots provide a robust and straightforward graphical tool for presenting variations in outcome performance, detecting aberrations in healthcare delivery, and guiding timely intervention. We propose using hierarchical model-based risk adjustment and funnel plots in real or near real time to detect aberrations and start timely intervention.
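
For intuition on the funnel-plot construction mentioned above, the snippet below computes unadjusted 99.8% binomial control limits around a network-wide rate as a function of each unit's number of admissions and flags units falling outside them. The counts are synthetic, and the study's approach additionally applies hierarchical model-based risk adjustment before plotting.

    import numpy as np
    from scipy.stats import norm

    network_rate = 0.12                                  # network-wide outcome proportion (assumed)
    admissions = np.array([600, 900, 1200, 1500, 1800, 2100, 2300, 2208])
    events = np.array([80, 96, 200, 170, 200, 260, 270, 300])
    rates = events / admissions

    z = norm.ppf(0.999)                                  # two-sided 99.8% control limits
    se = np.sqrt(network_rate * (1 - network_rate) / admissions)
    lower, upper = network_rate - z * se, network_rate + z * se

    for n, r, lo, hi in zip(admissions, rates, lower, upper):
        flag = "outside limits" if (r < lo or r > hi) else "within limits"
        print(f"n={n:5d}  rate={r:.3f}  99.8% limits=({lo:.3f}, {hi:.3f})  {flag}")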


Subject(s)
Lung Diseases , Retinopathy of Prematurity , Humans , Infant, Newborn , Australia/epidemiology , Cohort Studies , Hospitals , Infant, Premature , Intensive Care Units, Neonatal , Oxygen