Results 1 - 20 of 26
1.
Syst Rev ; 13(1): 126, 2024 May 08.
Article in English | MEDLINE | ID: mdl-38720337

ABSTRACT

BACKGROUND: The unprecedented volume and speed at which COVID-19-related systematic reviews (SRs) have been produced have raised questions regarding the quality of this evidence. It is plausible that pandemic-related factors led to an impairment in quality (reduced internal validity, increased risk of bias [RoB]). This may have serious implications for decision-making related to public health and individual healthcare. OBJECTIVE: The primary objective was to compare the quality of SRs published during the pandemic that were related to COVID-19 with SRs published during the pandemic that were unrelated to COVID-19 (all of which were fully appraised in the KSR Evidence database of SRs in healthcare). Our secondary objective was to compare the quality of SRs published during the pandemic (regardless of research topic) with SRs published pre-pandemic. METHODS: We compared all SRs related to COVID-19 with all SRs unrelated to COVID-19 that (i) were published during the pandemic (between 1 March 2020 and 14 September 2022), (ii) were included in KSR Evidence, and (iii) had been appraised using the ROBIS tool. We then compared all SRs published during the pandemic (regardless of research topic) with a pre-pandemic sample of SRs. RESULTS: For SRs published during the pandemic, we found no statistically significant difference in quality between SRs tagged as related to COVID-19 and those that were not [relative risk (RR) of low RoB for COVID-19 versus COVID-19-unrelated reviews: 0.94; 95% confidence interval (CI): 0.66 to 1.34]. Generally, COVID-19 SRs and COVID-19-unrelated SRs were both of low quality, with only 10% of COVID-19 reviews and 11% of COVID-19-unrelated reviews rated as low RoB.
However, SRs (regardless of topic) published during the pandemic were of lower quality than those published pre-pandemic (RR for low RoB for 'during pandemic' versus 'pre-pandemic': 0.30; 95% CI: 0.26 to 0.34), with 11% of pandemic and 36% of pre-pandemic SRs rated as low RoB. CONCLUSION: These results suggest that COVID-19 and COVID-19-unrelated SRs published during the pandemic are of equally low quality. SRs published during the pandemic were generally of lower quality than SRs published pre-pandemic, irrespective of COVID-19 focus. Moreover, SR quality in general is seriously lacking, and considerable efforts need to be made to substantially improve the quality and rigour of the SR process.
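For readers unfamiliar with the relative risk statistics quoted above, the calculation can be sketched in a few lines. The counts below are hypothetical (the abstract does not report the underlying group sizes); this illustrates the standard log-RR normal approximation, not the authors' analysis code:

```python
import math

def relative_risk(events_a, total_a, events_b, total_b):
    """Relative risk of an outcome in group A vs group B, with a 95% CI
    from the standard normal approximation on the log-RR scale."""
    p_a = events_a / total_a
    p_b = events_b / total_b
    rr = p_a / p_b
    # Standard error of log(RR)
    se = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 10% of 100 COVID-19 SRs vs 11% of 200 unrelated SRs rated low RoB
rr, lo, hi = relative_risk(10, 100, 22, 200)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A CI that straddles 1 (as here, and as in the abstract's 0.66 to 1.34) is what "no statistically significant difference" refers to.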


Subject(s)
COVID-19 , SARS-CoV-2 , Systematic Reviews as Topic , COVID-19/epidemiology , Humans , Pandemics
2.
Water Res ; 253: 121207, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38401469

ABSTRACT

Wastewater-based epidemiology (WBE) is an emerging, practical surveillance tool for monitoring community levels of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2, SC2). However, a paucity of data exists regarding SARS-CoV-2 and viral biomarker behaviour in aqueous and wastewater environments. Therefore, there is a pressing need to develop efficient and robust methods that both improve sensitivity and reduce time and cost. We present a novel method for SARS-CoV-2, Human Coronavirus 229E (229E), and Pepper Mild Mottle Virus (PMMoV) recovery utilizing surface charge-based attraction via the branched cationic polymer polyethylenimine (PEI). Initially, dose-optimization experiments demonstrated that a low concentration of PEI (0.001% w/v) was most effective at flocculating suspended viruses and viral material, including additional unbound SC2 viral fragments and/or RNA, from raw wastewater. A design-of-experiments (DOE) approach was used to optimize virus and/or viral material aggregation behaviour and recovery across varying aqueous conditions, revealing pH as a major influence on recoverability in this system, owing combinatorially to both a reduction in viral material surface charge and increased protonation of PEI-bound amine groups. Overall, this method shows great promise in significantly improving quantitative viral recovery, providing a straightforward and effective augmentation to standard centrifugation techniques.


Subject(s)
COVID-19 , RNA, Viral , Humans , SARS-CoV-2 , Polyethyleneimine , Wastewater
3.
Front Digit Health ; 5: 1185586, 2023.
Article in English | MEDLINE | ID: mdl-37534029

ABSTRACT

Background: Strategies to increase physical activity (PA) and improve nutrition would contribute to substantial health benefits in the population, including reducing the risk of several types of cancer. The increasing accessibility of digital technologies means that these tools could potentially facilitate the improvement of health behaviours among young people. Objective: We conducted a review of systematic reviews to assess the available evidence on digital interventions aimed at increasing physical activity and good nutrition in sub-populations of young people [school-aged children, college/university students, young adults only (over 18 years), and both adolescents and young adults (<25 years)]. Methods: Searches for systematic reviews were conducted across relevant databases including KSR Evidence (www.ksrevidence.com), the Cochrane Database of Systematic Reviews (CDSR) and the Database of Abstracts of Reviews of Effects (DARE; CRD). Records were independently screened by title and abstract by two reviewers, and those deemed eligible were obtained for full-text screening. Risk of bias (RoB) was assessed with the ROBIS tool. We employed a narrative analysis and developed evidence gap maps. Results: Twenty-four reviews were included, with at least one for each sub-population and employing a range of digital interventions. The quality of evidence was limited, with only one of the 24 reviews judged as low RoB overall. Definitions of "digital intervention" varied greatly across systematic reviews, with some reported interventions fitting into more than one category (i.e., an internet intervention could also be a mobile phone or computer intervention); however, definitions as reported in the relevant reviews were used. No reviews reported cancer incidence or related outcomes. Available evidence was limited both by sub-population and type of intervention, but evidence was most pronounced in school-aged children.
In school-aged children, eHealth interventions, defined as school-based programmes delivered by the internet, computers, tablets, mobile technology, or tele-health methods, improved outcomes. Accelerometer-measured (standardised mean difference [SMD]: 0.33, 95% confidence interval [CI]: 0.05 to 0.61) and self-reported (SMD: 0.14, 95% CI: 0.05 to 0.23) PA increased, as did fruit and vegetable intake (SMD: 0.11, 95% CI: 0.03 to 0.19) (review rated as low RoB, minimal to considerable heterogeneity across results). No difference was reported for consumption of fat post-intervention (SMD: -0.06, 95% CI: -0.15 to 0.03), for sugar-sweetened beverage (SSB) and snack consumption combined post-intervention (SMD: -0.02, 95% CI: -0.10 to 0.06), or at follow-up (studies reported 2 weeks to 36 months of follow-up) after the intervention (SMD: -0.06, 95% CI: -0.15 to 0.03) (review rated low RoB, minimal to substantial heterogeneity across results). Smartphone-based interventions utilising Short Messaging Service (SMS), app or combined approaches also improved PA measured using objective and subjective methods (SMD: 0.44, 95% CI: 0.11 to 0.77) when compared with controls, with increases in total PA [weighted mean difference (WMD): 32.35 min per day, 95% CI: 10.36 to 54.33] and in daily steps (WMD: 1,185, 95% CI: 303 to 2,068) (review rated as high RoB, moderate to substantial heterogeneity across results). For all results, interpretation is limited by RoB and the presence of unexplained heterogeneity. Conclusions: This review of reviews has identified limited evidence suggesting some potential for digital interventions to increase PA and, to a lesser extent, improve nutrition in school-aged children. However, effects can be small and based on less robust evidence. The body of evidence is characterised by a considerable level of heterogeneity, unclear/overlapping populations and intervention definitions, and low methodological quality of the systematic reviews.
The heterogeneity across studies is further complicated when the age (older vs. more recent), interactivity (feedback/surveys vs. no or less feedback/surveys), and accessibility (type of device) of the digital intervention are considered. This underscores the difficulty of synthesising evidence in a field with rapidly evolving technology and the resulting challenges in recommending the use of digital technology in public health. There is an urgent need for further research using contemporary technology and appropriate methods.
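The standardised mean differences (SMDs) quoted in these results are, in their simplest form (Cohen's d), the between-group difference in means divided by the pooled standard deviation. A minimal sketch with fabricated accelerometer figures, purely for illustration; the reviews themselves may have used bias-corrected variants such as Hedges' g:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d: mean difference over the pooled SD (the simplest SMD)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical trial: intervention 45 min/day PA (SD 20, n=120) vs control 39 min/day (SD 18, n=115)
d = cohens_d(45, 20, 120, 39, 18, 115)
print(f"SMD (Cohen's d) = {d:.2f}")
```

An SMD around 0.3, as computed here, sits in the same "small to moderate" range as the accelerometer-measured PA effect reported above.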

4.
Front Digit Health ; 5: 1178407, 2023.
Article in English | MEDLINE | ID: mdl-37288171

ABSTRACT

Background: Strategies to reduce alcohol consumption would contribute to substantial health benefits in the population, including reducing cancer risk. The increasing accessibility and applicability of digital technologies make these powerful tools suitable for facilitating behaviour change in young people, which could then translate into both immediate and long-term improvements to public health. Objective: We conducted a review of systematic reviews to assess the available evidence on digital interventions aimed at reducing alcohol consumption in sub-populations of young people [school-aged children, college/university students, young adults only (over 18 years) and both adolescents and young adults (<25 years)]. Methods: Searches were conducted across relevant databases including KSR Evidence, the Cochrane Database of Systematic Reviews (CDSR) and the Database of Abstracts of Reviews of Effects (DARE). Records were independently screened by title and abstract, and those that met inclusion criteria were obtained for full-text screening by two reviewers. Risk of bias (RoB) was assessed with the ROBIS tool. We employed a narrative analysis. Results: Twenty-seven systematic reviews were included that addressed relevant interventions in one or more of the sub-populations, but those reviews were mostly assessed as low quality. Definitions of "digital intervention" varied greatly across systematic reviews. Available evidence was limited both by sub-population and type of intervention. No reviews reported cancer incidence or influence on cancer-related outcomes. In school-aged children, eHealth multiple health behaviour change interventions delivered through a variety of digital methods were not effective in preventing or reducing alcohol consumption, with no effect on the prevalence of alcohol use [odds ratio (OR) = 1.13, 95% CI: 0.95-1.36, review rated low RoB, minimal heterogeneity].
While in adolescents and/or young adults identified as risky drinkers, the use of computer- or mobile device-based interventions resulted in reduced alcohol consumption when comparing the digital intervention with no/minimal intervention (-13.4 g/week, 95% CI: -19.3 to -7.6, review rated low RoB, moderate to substantial heterogeneity). In university/college students, a range of e-interventions reduced the number of drinks consumed per week compared with assessment-only controls, although the overall effect was small [standardised mean difference (SMD): -0.15, 95% CI: -0.21 to -0.09]. Web-based personalised feedback interventions demonstrated a small to medium effect on alcohol consumption (SMD: -0.19, 95% CI: -0.27 to -0.11) (review rated high RoB, minimal heterogeneity). In risky drinkers, stand-alone computerised interventions reduced short-term (SMD: -0.17, 95% CI: -0.27 to -0.08) and long-term (SMD: -0.17, 95% CI: -0.30 to -0.04) alcohol consumption compared with no intervention, while a small effect (SMD: -0.15, 95% CI: -0.25 to -0.06) in favour of computerised assessment and feedback versus assessment only was observed. No short-term (SMD: -0.10, 95% CI: -0.30 to 0.11) or long-term effect (SMD: -0.11, 95% CI: -0.53 to 0.32) was demonstrated for computerised brief interventions when compared with counsellor-based interventions (review rated low RoB, minimal to considerable heterogeneity). In young adults and adolescents, SMS-based interventions did not significantly reduce the quantity of drinks per occasion from baseline (SMD: 0.28, 95% CI: -0.02 to 0.58) or the average number of standard glasses per week (SMD: -0.05, 95% CI: -0.15 to 0.05), but increased the risk of binge drinking episodes (OR = 2.45, 95% CI: 1.32-4.53, review rated high RoB; minimal to substantial heterogeneity). For all results, interpretation has limitations in terms of risk of bias and heterogeneity.
Conclusions: Limited evidence suggests some potential for digital interventions, particularly those with feedback, in reducing alcohol consumption in certain sub-populations of younger people. However, this effect is often small or inconsistent, or diminishes when only methodologically robust evidence is considered. There is no systematic review evidence that digital interventions reduce cancer incidence through alcohol moderation in young people. To reduce alcohol consumption, a major cancer risk factor, further methodologically robust research is warranted to explore the full potential of digital interventions and to form the basis of evidence-based public health initiatives.

5.
Pharmacoeconomics ; 41(8): 857-867, 2023 08.
Article in English | MEDLINE | ID: mdl-37129774

ABSTRACT

The National Institute for Health and Care Excellence (NICE) invited the manufacturer (Celgene) of oral azacitidine (ONUREG), as part of the Single Technology Appraisal (STA) process, to submit evidence for the clinical effectiveness and cost-effectiveness of oral azacitidine for maintenance treatment of acute myeloid leukaemia (AML) after induction therapy compared with watch-and-wait plus best supportive care (BSC) and midostaurin. Kleijnen Systematic Reviews Ltd, in collaboration with Maastricht University Medical Centre+, was commissioned to act as the independent Evidence Review Group (ERG). This paper summarises the company submission (CS), presents the ERG's critical review of the clinical and cost-effectiveness evidence in the CS, highlights the key methodological considerations and describes the development of the NICE guidance by the Appraisal Committee. In the QUAZAR AML-001 trial, oral azacitidine significantly improved overall survival (OS) versus placebo: median OS gain of 9.9 months (24.7 months versus 14.8 months; hazard ratio (HR) 0.69 (95% CI 0.55-0.86), p < 0.001). The median time to relapse was also better for oral azacitidine, and the incidences of treatment-emergent adverse events (TEAEs) were similar in the two arms. The company excluded two of the comparators listed in the scope, low-dose cytarabine and subcutaneous azacitidine, informed only by clinical expert opinion, leaving only BSC and midostaurin for the FLT3-ITD and/or FLT3-TKD (FLT3 mutation)-positive subgroup. An indirect treatment comparison (ITC) of oral azacitidine versus midostaurin as maintenance therapy in the appropriate subgroup demonstrated that the OS and relapse-free survival (RFS) HRs were favourable for oral azacitidine when compared with midostaurin. However, in the only available trial of midostaurin as maintenance treatment in AML that was used for this ITC, subjects were not randomised at the maintenance phase but at induction, which posed a substantial risk of bias.
The revised and final probabilistic incremental cost-effectiveness ratio (ICER) presented by the company, including a commercial arrangement, was £32,480 per quality-adjusted life year (QALY) gained for oral azacitidine versus watch-and-wait plus BSC. Oral azacitidine was dominant versus midostaurin in the FLT3 subgroup. The ERG's concerns included the approach to modelling haematopoietic stem cell transplantation (HSCT), the generalisability to UK clinical practice of the QUAZAR AML-001 trial population and its number of cycles of pre-treatment consolidation therapy, and uncertainty in the relapse utility. The revised and final ERG base case resulted in a similar probabilistic ICER of £33,830 per QALY gained versus watch-and-wait plus BSC, but with remaining uncertainty. Oral azacitidine remained dominant versus midostaurin in the FLT3 subgroup. After the second NICE Appraisal Committee meeting, the Committee recommended oral azacitidine (according to the commercial arrangement), within its marketing authorisation, as an option for maintenance treatment for AML in adults who are in complete remission, or complete remission with incomplete blood count recovery, after induction therapy with or without consolidation treatment, and who cannot have or do not want HSCT.
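The ICERs above follow a simple definition: incremental cost divided by incremental QALYs, with "dominance" arising when one strategy is both cheaper and more effective. A minimal sketch; the cost and QALY totals below are invented for illustration (the actual model inputs are confidential):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained.
    Returns None when the new strategy dominates (no dearer, no less effective)."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return None  # dominant: cheaper and at least as effective
    return d_cost / d_qaly

# Hypothetical totals: new therapy £65,000 / 3.0 QALYs vs comparator £16,000 / 1.5 QALYs
print(icer(65_000, 3.0, 16_000, 1.5))  # ~£32,667 per QALY gained
```

The second branch mirrors the abstract's "dominant versus midostaurin" result, where no ratio is reported because the intervention is both less costly and more effective.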


Subject(s)
Induction Chemotherapy , Leukemia, Myeloid, Acute , Adult , Humans , Neoplasm Recurrence, Local/drug therapy , Azacitidine/therapeutic use , Leukemia, Myeloid, Acute/drug therapy , Cost-Benefit Analysis , Technology Assessment, Biomedical/methods , Quality-Adjusted Life Years
6.
Pharmacoeconomics ; 41(7): 741-750, 2023 07.
Article in English | MEDLINE | ID: mdl-36952138

ABSTRACT

The National Institute for Health and Care Excellence (NICE) invited the manufacturer (Eli Lilly) of abemaciclib (Verzenios) to submit evidence for the clinical and cost effectiveness of this drug in combination with endocrine therapy (ET) for the treatment of adult patients with hormone receptor (HR)-positive, human epidermal growth factor receptor 2 (HER2)-negative, node-positive early breast cancer at high risk of recurrence, as part of the Institute's Single Technology Appraisal (STA) process. Kleijnen Systematic Reviews Ltd, in collaboration with Newcastle University, was commissioned to act as the independent Evidence Review Group (ERG). This paper summarises the Company Submission (CS), presents the ERG's critical review of the clinical and cost-effectiveness evidence in the CS, highlights the key methodological considerations, and describes the development of the NICE guidance by the Appraisal Committee. The ERG produced a critical review of the clinical and cost-effectiveness evidence in the CS, independently searched for relevant evidence, and modified the manufacturer's decision analytic model to examine the impact of altering some of the key assumptions. A systematic literature review identified the MonarchE trial, an ongoing, open-label, randomised trial involving 5637 people comparing abemaciclib in combination with ET versus ET alone. The trial included two cohorts that used different inclusion criteria to define high risk of recurrence. The ERG considered Cohort 1 an adequate representation of this population, and the Appraisal Committee concluded that Cohort 1 was generalisable to National Health Service clinical practice. Trial results showed improvements in invasive disease-free survival for the abemaciclib arm, which was considered an appropriate surrogate outcome.
The ERG believed that the modelling structure presented in the de novo economic model by the company was appropriate but highlighted several areas of uncertainty that had the potential to have a significant impact on the resulting incremental cost-effectiveness ratio (ICER). Areas of uncertainty included the extrapolation of long-term survival curves, the duration of treatment effect and treatment waning, and the proportion of patients who receive other CDK4/6 treatments for metastatic disease after receiving abemaciclib. ICER estimates were £9164 per quality-adjusted life-year gained for the company's base-case and £17,810 for the ERG's base-case. NICE recommended abemaciclib with ET as an option for the adjuvant treatment of HR-positive, HER2-negative, node-positive early breast cancer at high risk of recurrence.


Subject(s)
Breast Neoplasms , Adult , Humans , Female , Breast Neoplasms/drug therapy , State Medicine , Aminopyridines , Benzimidazoles , Adjuvants, Immunologic , Cost-Benefit Analysis , Technology Assessment, Biomedical/methods , Quality-Adjusted Life Years , Randomized Controlled Trials as Topic
7.
Pharmacoeconomics ; 41(3): 239-251, 2023 03.
Article in English | MEDLINE | ID: mdl-36725788

ABSTRACT

The National Institute for Health and Care Excellence invited the manufacturer (Galapagos) of filgotinib (Jyseleca®), as part of the Single Technology Appraisal process, to submit evidence for the clinical effectiveness and cost effectiveness of filgotinib for treating moderately to severely active ulcerative colitis in adults who have had an inadequate response, loss of response or were intolerant to a previous biologic agent or conventional therapy. Kleijnen Systematic Reviews Ltd, in collaboration with Maastricht University Medical Centre+, was commissioned to act as the independent Evidence Review Group. This paper summarises the company submission, presents the Evidence Review Group's critical review on the clinical and cost-effectiveness evidence in the company submission, highlights the key methodological considerations and describes the development of the National Institute for Health and Care Excellence guidance by the Appraisal Committee. The company submission included one relevant study for the comparison of filgotinib versus placebo: the SELECTION trial. As there was no head-to-head evidence with any of the comparators, the company performed two separate network meta-analyses, one for the biologic-naïve population and one for the biologic-experienced population, and for both the induction and maintenance phases. The Evidence Review Group questioned the validity of the maintenance network meta-analysis because it assumed all active treatments to be comparators in this phase, which is not in line with clinical practice. The economic analysis used a number of assumptions that introduced substantial uncertainty, which could not be fully explored, for instance, the assumption that a risk of loss of response would be independent of health state and constant over time. 
Company and Evidence Review Group results indicate that at its current price, and disregarding confidential discounts for comparators and subsequent treatments, filgotinib dominates some comparators (golimumab and adalimumab in the company base case, all but intravenous and subcutaneous vedolizumab in the Evidence Review Group's base case) in the biologic-naïve population. In the biologic-experienced population, filgotinib dominates all comparators in both the company and the Evidence Review Group's base case. Results should be interpreted with caution as some important uncertainties were not included in the modelling. These uncertainties were mostly centred around the maintenance network meta-analysis, loss of response, health-related quality-of-life estimates and modelling of dose escalation. The National Institute for Health and Care Excellence recommended filgotinib within its marketing authorisation, as an option for treating moderately to severely active ulcerative colitis in adults when conventional or biological treatment cannot be tolerated, or if the disease has not responded well enough or has stopped responding to these treatments, and if the company provides filgotinib according to the commercial arrangement.


Subject(s)
Biological Products , Colitis, Ulcerative , Adult , Humans , Adalimumab , Colitis, Ulcerative/drug therapy , Cost-Benefit Analysis , Pyridines , Quality-Adjusted Life Years , Technology Assessment, Biomedical
8.
Int J Hyg Environ Health ; 248: 114077, 2023 03.
Article in English | MEDLINE | ID: mdl-36462411

ABSTRACT

The province of Ontario comprises the largest groundwater-reliant population in Canada, serving approximately 1.6 million individuals. Unlike municipal water systems, private well water is not required to meet water quality regulatory standards, and thus source maintenance, treatment and testing remain the responsibility of the well owner. Infections associated with private drinking water systems are rarely documented given their typically sporadic nature; thus, the human health effects (e.g., acute gastrointestinal illness (AGI)) on consumers remain relatively unknown, representing a significant gap in water safety management. The current study sought to quantify the risk of waterborne AGI attributed to Giardia, Shiga toxin-producing E. coli (STEC) and norovirus from private drinking water sources in Ontario using Monte Carlo simulation-based quantitative microbial risk assessment (QMRA). Findings suggest that consumption of contaminated private well water in Ontario is responsible for approximately 4823 AGI cases annually, with 3464 (71.8%) and 1359 (28.1%) AGI cases predicted to occur in consolidated and unconsolidated aquifers, respectively. By pathogen, waterborne AGI was attributed to norovirus (62%; 2991/4823), Giardia (24.6%; 1186/4823) and STEC (13.4%; 646/4823). The developed QMRA framework was used to assess the potential health impacts of partial and total well water treatment system failure. In the unlikely event of total treatment failure, total mean annual illnesses are predicted to almost double (4217 to 7064 cases per year), highlighting the importance of effective water treatment and comprehensive testing programs in reducing infectious health risks attributable to private well water in Ontario. Study findings indicate significant underreporting of waterborne AGI rates at the provincial level, likely biasing public health interventions and programs that are effective in monitoring and minimizing the health risk associated with private well water.
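The Monte Carlo QMRA approach described chains a few steps: draw a pathogen concentration, convert it to a daily dose, apply a dose-response model, and compound daily risk into an annual probability. The sketch below is a generic single-pathogen illustration; every parameter value is an assumed placeholder for demonstration, not a value from the study:

```python
import math
import random

random.seed(1)

# Illustrative (assumed) inputs, not values from the study
MEAN_LOG10_CONC = -2.0   # log10 organisms per litre in well water
SD_LOG10_CONC = 1.0
DAILY_INTAKE_L = 1.0     # litres of untreated well water consumed per day
R_EXPONENTIAL = 0.02     # exponential dose-response parameter
DAYS_EXPOSED = 365

def annual_infection_risk():
    """One Monte Carlo draw of the annual probability of infection."""
    conc = 10 ** random.gauss(MEAN_LOG10_CONC, SD_LOG10_CONC)
    daily_dose = conc * DAILY_INTAKE_L
    p_daily = 1 - math.exp(-R_EXPONENTIAL * daily_dose)  # exponential model
    # Compound independent daily risks into an annual risk
    return 1 - (1 - p_daily) ** DAYS_EXPOSED

draws = [annual_infection_risk() for _ in range(10_000)]
mean_risk = sum(draws) / len(draws)
print(f"Mean annual infection risk: {mean_risk:.4f}")
```

Multiplying such a per-person risk by the exposed population and an illness-given-infection probability is what yields case estimates of the kind reported (e.g., ~4823 AGI cases annually).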


Subject(s)
Drinking Water , Giardiasis , Groundwater , Humans , Ontario/epidemiology , Water Wells , Escherichia coli , Risk Assessment , Water Microbiology , Water Supply
9.
Transl Anim Sci ; 6(3): txac099, 2022 Jul.
Article in English | MEDLINE | ID: mdl-36000073

ABSTRACT

Genetic evaluations provide producers with a tool to aid breeding decisions and highlight the increase in performance achievable at the farm level through genetic gain. Despite this, large-scale validation of sheep breeding objectives using field data is lacking in the scientific literature. The objective of the present study was to evaluate the phenotypic differences for a range of economically important traits in animals divergent in genetic merit for the Irish national maternal and terminal sheep breeding objectives. A dataset of 17,356 crossbred ewes and 54,322 progeny differing in their maternal and terminal breeding index, recorded in 139 commercial flocks, was available. Associations between the maternal index of the ewe or the terminal index of the ram and a range of phenotypic performance traits, including lambing, lamb performance, ewe performance, and health traits, were assessed. Ewes excelling on the maternal index had larger litter sizes and produced progeny with greater perinatal lamb survival, heavier live weights from birth to postweaning and reduced days to slaughter (P < 0.05). Ewe maternal index had no quantifiable impact on lambing ease, carcass conformation or fat, the health status of the ewe or lamb, ewe barren rate, or ewe live weight. Lambs born to rams of superior terminal index were heavier from preweaning onwards, with reduced days to slaughter (P < 0.05). Lambing traits, lamb health, and carcass characteristics of the progeny did not differ between sires stratified as low or high on the terminal index (P > 0.05). Results from this study highlight that selecting either ewes or rams of superior maternal or terminal attributes will improve pertinent performance traits of the national sheep flock, resulting in greater flock productivity and profitability.

10.
Environ Pollut ; 309: 119784, 2022 Sep 15.
Article in English | MEDLINE | ID: mdl-35843457

ABSTRACT

Approximately 1.6 million individuals in Ontario rely on private water wells. Private well water quality in Ontario remains the responsibility of the well owner, and due to the absence of regulation, quantitative microbial risk assessment (QMRA) likely represents the most effective approach to estimating and mitigating waterborne infection risk(s) from these supplies. Annual contamination duration (i.e., contaminated days per annum) represents a central input for waterborne QMRA; however, it is typically based on laboratory studies or meta-analyses, representing an important limitation for risk assessment, as groundwater mesocosms cannot accurately replicate subsurface conditions. The present study sought to address these limitations using a large spatio-temporal in-situ groundwater quality dataset (>700,000 samples) to evaluate aquifer-specific E. coli die-off rates (CFU/100 mL decline per day), subsequent contamination sequence duration(s) and the likelihood of overlapping contamination events. Findings indicate median E. coli die-off rates of 0.38 CFU/100 mL per day and 0.64 CFU/100 mL per day for private wells located in unconsolidated and consolidated aquifers, respectively, with mean calculated contamination sequence durations of 18 days (unconsolidated) and 11 days (consolidated). Study findings support and permit the development of increasingly evidence-based, regionally and temporally specific quantitative waterborne risk assessment.
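Given linear die-off rates of the kind reported (CFU/100 mL lost per day), a contamination sequence duration follows directly as initial concentration divided by die-off rate. A back-of-envelope sketch; the initial concentrations are assumptions, chosen here only so the outputs line up with the reported mean durations:

```python
def sequence_duration(initial_cfu_per_100ml, dieoff_per_day):
    """Days until E. coli falls to zero, assuming linear die-off
    (a simplification; real decay kinetics may be non-linear)."""
    return initial_cfu_per_100ml / dieoff_per_day

# Median die-off rates from the study; initial concentrations are illustrative
unconsolidated = sequence_duration(6.8, 0.38)  # ~18 days
consolidated = sequence_duration(7.0, 0.64)    # ~11 days
print(f"{unconsolidated:.0f} days (unconsolidated), {consolidated:.0f} days (consolidated)")
```

The faster die-off in consolidated aquifers is what drives their shorter mean contamination sequences despite similar starting concentrations.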


Subject(s)
Escherichia coli , Groundwater , Humans , Ontario , Risk Assessment , Water Quality , Water Supply
11.
Clin Infect Dis ; 75(12): 2266-2274, 2022 12 19.
Article in English | MEDLINE | ID: mdl-35856638

ABSTRACT

The duration of protection after a single dose of yellow fever vaccine is a matter of debate. To summarize the current knowledge, we performed a systematic literature review and meta-analysis. Studies on the duration of protection after 1 and ≥2 vaccine doses were reviewed. Data were stratified by time since vaccination. In our meta-analysis, we used random-effects models. We identified 36 studies from 20 countries, comprising more than 17,000 participants aged 6 months to 85 years. Among healthy adults and children, pooled seroprotection rates after a single vaccine dose were close to 100% by 3 months and remained high in adults for 5 to 10 years. In children vaccinated before age 2 years, the seroprotection rate was 52% within 5 years after primary vaccination. For immunodeficient persons, the data indicate relevant waning. The extent of waning of seroprotection after yellow fever vaccination depends on age and immune status at primary vaccination.
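Random-effects pooling of seroprotection rates, as used in this meta-analysis, can be sketched with the DerSimonian-Laird estimator applied to logit-transformed proportions (one common approach; the review does not specify its exact implementation). The study counts below are fabricated for illustration only:

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and tau^2."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and the moment estimator of between-study variance
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

def logit_and_var(events, n):
    """Logit proportion and its approximate (delta-method) variance."""
    return math.log(events / (n - events)), 1 / events + 1 / (n - events)

# Fabricated studies: seroprotected / total
studies = [(95, 100), (180, 190), (140, 150)]
effects, variances = zip(*(logit_and_var(e, n) for e, n in studies))
pooled_logit, tau2 = dersimonian_laird(list(effects), list(variances))
pooled_rate = 1 / (1 + math.exp(-pooled_logit))
print(f"Pooled seroprotection: {pooled_rate:.1%} (tau^2 = {tau2:.3f})")
```

Back-transforming the pooled logit gives a pooled seroprotection proportion of the kind reported (e.g., "close to 100%" in healthy adults), with tau^2 quantifying between-study heterogeneity.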


Subject(s)
Yellow Fever Vaccine , Yellow Fever , Adult , Child , Humans , Yellow Fever/prevention & control , Vaccination , Time Factors , Antigens, Viral
12.
Sci Total Environ ; 846: 157478, 2022 Nov 10.
Article in English | MEDLINE | ID: mdl-35868388

ABSTRACT

A spatiotemporally static total coliform (TC) concentration threshold of five colony-forming units (CFU) per 100 mL is used in Ontario to determine whether well water is of acceptable quality for drinking. The current study sought to assess the role of TC and associated thresholds as microbial water quality parameters, as the authors hypothesized that, since static TC thresholds are not evidence-based, they may not be appropriate for all well water consumers. A dataset containing the microbial water quality information of 795,023 samples (including TC and Escherichia coli (E. coli) counts) collected from 253,136 private wells in Ontario between 2010 and 2017 was used. To accurately assess the relationship between E. coli and non-E. coli TC, "non-E. coli coliform" (NEC) counts were calculated from microbial water quality data and replaced TC throughout analyses. This study analysed NEC and E. coli detection rates to determine differences between the two, and NEC:E. coli concentration ratios to assess links, if any, between NEC and E. coli contamination. Study findings suggest that spatiotemporally static NEC thresholds are not appropriate because seasonal, spatial, and well-specific susceptibility factors are associated with distinct contamination trends. For example, NEC detection rates exhibited bimodality, with summer (29.4%) and autumn (30.2%) detection rates being significantly higher (p < 0.05) than winter (21.9%) and spring (19.9%) rates. E. coli detection rates also varied seasonally, but peaked in summer rather than autumn. As such, it is recommended that these factors be considered during the development of private well water guidelines and that static thresholds be avoided. Furthermore, the authors propose that, because NEC:E. coli concentration ratios change in the context of the aforementioned factors, they may have a role in inferring groundwater contamination mechanisms, with high ratios associated with generalized aquifer contamination and low ratios with localized contamination.
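The NEC derivation and seasonal comparison described above can be sketched as follows. The sample records, field names, and helper functions below are hypothetical illustrations, not the study's code or data:

```python
from collections import defaultdict

# Hypothetical samples: TC and E. coli counts in CFU/100 mL.
samples = [
    {"season": "summer", "tc": 12, "ecoli": 2},
    {"season": "summer", "tc": 0,  "ecoli": 0},
    {"season": "winter", "tc": 3,  "ecoli": 3},
    {"season": "winter", "tc": 0,  "ecoli": 0},
]

def nec_count(sample):
    """Non-E. coli coliforms: total coliforms minus E. coli."""
    return max(sample["tc"] - sample["ecoli"], 0)

def seasonal_detection_rates(samples):
    """Fraction of samples per season with NEC detected (NEC > 0)."""
    totals, detects = defaultdict(int), defaultdict(int)
    for s in samples:
        totals[s["season"]] += 1
        if nec_count(s) > 0:
            detects[s["season"]] += 1
    return {season: detects[season] / totals[season] for season in totals}

rates = seasonal_detection_rates(samples)
# Note the winter sample with TC = 3 and E. coli = 3: E. coli is detected
# but NEC is not, which is why the two detection rates can diverge.
```

The toy winter record illustrates the study's point that NEC and E. coli detection patterns need not coincide.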


Subject(s)
Drinking Water , Groundwater , Escherichia coli , Ontario , Quality Indicators, Health Care , Water Microbiology , Water Quality , Water Supply , Water Wells
13.
J Anim Breed Genet ; 139(3): 342-350, 2022 May.
Article in English | MEDLINE | ID: mdl-35106841

ABSTRACT

Genetic evaluations in sheep have proven to be an effective way of increasing farm profitability. Much research has previously been conducted on producing within-country genetic evaluations; however, to date, no across-country sheep genetic evaluations have been produced between Ireland and the UK. The objective of the present study was to examine the feasibility of an across-country genetic evaluation of live body weight and carcass composition traits for Texel sheep raised in Ireland and the UK. The benefit of genetic selection based on across-country genetic evaluations, in comparison with within-country genetic evaluations, was also quantified. Animal traits included early-life and postweaning live body weights, and muscle and fat depth ultrasound measurements. Irish and UK data were combined, common animals (those with progeny records in both countries) were identified, and a series of bivariate analyses were performed separately for each trait to produce across-country genetic evaluations. Fixed effects included contemporary group, age at first lambing of the dam, parity of the dam (Ireland), dam age at lamb's birth (UK), a gender by age of the lamb interaction, a birth type by rearing type of the lamb interaction and country of birth of the lamb. Random effects included the animal additive genetic, dam maternal, litter common environment and residual effects. The model for postweaning weight, muscle depth and fat depth included only the animal additive genetic and litter common environmental random effects. Genetic correlations between the two countries ranged from 0.82 to 0.88 for the various traits. Across-country breeding values were estimated for all animals and response to selection was predicted using the top 10 and top 20 sires in both within- and across-country analyses for the two countries.
Overall, results showed that rates of genetic gain could potentially increase by between 2.59% and 19.63% with selection based on across-country genetic evaluations compared to within-country evaluations alone. Across-country evaluations are feasible and would be of significant benefit to both the Irish and UK sheep industries. To realize these potential gains, however, sheep breeders would need to shift their emphasis towards using objective traits as their primary selection criteria.
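The value of across-country selection under an imperfect genetic correlation can be illustrated with the standard correlated-response formula from quantitative genetics. The selection intensity, heritabilities, and phenotypic SD below are invented for illustration; only the 0.82-0.88 correlation range comes from the study:

```python
# Correlated response to selection (standard quantitative-genetics result):
# selecting in country A changes the trait in country B by
# CR_B = i * h_A * h_B * r_g * sigma_P(B).
def correlated_response(i, h2_a, h2_b, r_g, sd_p_b):
    """Response in country B per generation of selection in country A."""
    return i * (h2_a ** 0.5) * (h2_b ** 0.5) * r_g * sd_p_b

# Direct response: selecting on B's own records is the r_g = 1 case.
direct = correlated_response(i=1.4, h2_a=0.3, h2_b=0.3, r_g=1.0, sd_p_b=4.0)

# Indirect response with r_g = 0.85, the middle of the study's 0.82-0.88
# range of between-country genetic correlations.
indirect = correlated_response(i=1.4, h2_a=0.3, h2_b=0.3, r_g=0.85, sd_p_b=4.0)

ratio = indirect / direct  # fraction of the direct response retained
```

With these toy inputs, selection on the other country's evaluations retains 85% of the direct response, which is why high between-country correlations make joint evaluations attractive.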


Subject(s)
Red Meat , Animals , Body Weight/genetics , Female , Ireland , Meat , Parity , Parturition , Phenotype , Pregnancy , Sheep/genetics
14.
Environ Monit Assess ; 194(3): 225, 2022 Feb 26.
Article in English | MEDLINE | ID: mdl-35217908

ABSTRACT

The Sustainable Development Goal 6 calls for global progress by 2030 in treating domestic wastewater and providing access to adequate sanitation facilities. However, meeting these goals will be a challenge for most Small Island Developing States, including Caribbean island nations. In the nearshore zone of the Soufriere region on the Caribbean island of St. Lucia, there is a history of high levels of bacteria of fecal origin. Possible land-based sources of microbial contamination in the Soufriere Bay include discharges from the Soufriere River and transport of wastewater, including fecal material from the town of Soufriere. This area is an important tourist destination and supports a local fishery. To identify the sources of microbial contamination in Soufriere Bay, a range of monitoring methods were employed in this study. In grab samples of surface water collected from the Soufriere River, counts of total coliforms and Escherichia coli were elevated above water quality guidelines. However, the spikes in concentrations of these indicator organisms in the river did not necessarily coincide with the spikes in the levels of total coliforms and E. coli detected in samples collected on the same dates in Soufriere Bay, indicating that there are other sources of pollution in the Bay besides discharges from the river. Monitoring for chemical indicators of wastewater (i.e., caffeine, sucralose, fluconazole) in the Soufriere River indicated that there are inputs of sewage or human fecal material throughout the watershed. However, analysis of Bacteroidales 16S rRNA genetic markers for fecal bacteria originating from humans, bovine ruminants, or other warm-blooded animals indicated that the majority of microbial contamination in the river was not from humans. 
Monitoring for chemical indicators of wastewater using passive samplers deployed in Soufriere Bay indicated that there are two "hot spots" of contamination located offshore of economically depressed areas of the town of Soufriere. This study indicates that efforts to control contamination of Soufriere Bay by fecal microorganisms must include management of pollution originating from both sewage and domestic animals in the watershed.


Subject(s)
Environmental Monitoring , Escherichia coli , Animals , Cattle , Environmental Monitoring/methods , Escherichia coli/genetics , Feces/microbiology , Humans , RNA, Ribosomal, 16S/genetics , Rivers/chemistry , Saint Lucia , Water Microbiology , Water Pollution/analysis
15.
J Am Pharm Assoc (2003) ; 62(2): 612-619, 2022.
Article in English | MEDLINE | ID: mdl-34802944

ABSTRACT

BACKGROUND: During the coronavirus disease 2019 (COVID-19) pandemic, physician focus shifted from continuity of care to pandemic duties. However, patients still required in-person visits for acute or chronic complaints. Specially trained pharmacists were utilized to alleviate Family Medicine Walk-In (FMWI) provider shortages. OBJECTIVE: To describe the innovative practice utilizing diagnostic pharmacists in FMWI, evaluate their impact on provider time, compare workload with traditional advanced practice providers (APPs), and evaluate the types of visits and medications prescribed. PRACTICE DESCRIPTION: Pharmacists at an Indian Health Service medical center staffed FMWI 2.5 days per week to alleviate provider shortages during the COVID-19 pandemic. The privileged pharmacist had a diagnostic scope similar to that of APPs. Non-privileged pharmacists provided care to patients utilizing current protocols and were required to present all new complaints to providers. PRACTICE INNOVATION: The facility utilized pharmacists who had completed, or were progressing through, the local diagnostic training program to alleviate provider shortages. EVALUATION METHODS: The absolute number of visits by pharmacists was determined, and the number of provider hours shifted to pharmacy was estimated. The number of visits by provider type was calculated and compared. ICD-10 codes were evaluated to determine the purpose of visits. New prescriptions written by pharmacists were categorized and reimbursement rates determined. RESULTS: Pharmacists were responsible for 677 visits during 88 clinic days, with an estimated 338 provider hours shifted to pharmacists. Pharmacists saw 5.8 patients per day, APPs 5.2, and physicians 5.7. Pharmacists primarily evaluated hypertension, diabetes, musculoskeletal, and infectious disease complaints. New prescription categories included pain management, endocrine, cardiovascular, and infectious disease. The single billable pharmacist was reimbursed $77,945.
CONCLUSION: Diagnostic pharmacists in FMWI have allowed providers to shift to other pandemic duties and demonstrate a workload similar to that of APPs. Most visits and prescriptions fall within established pharmacist practice. Pharmacists in this setting generate sufficient reimbursement to pay for the position and remain integrated in FMWI.
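As a rough consistency check on the workload figures above: the reported 338 provider hours are consistent with an average visit length of about 30 minutes, though that assumption is ours and is not stated in the abstract (the 5.8 visits/day figure is evidently per individual pharmacist, not per clinic day):

```python
# Figures taken from the abstract:
visits = 677       # total pharmacist visits
clinic_days = 88   # clinic days covered

# Assumption (ours): an average visit length of 30 minutes, which
# reproduces the ~338 provider hours reported as shifted to pharmacy.
hours_per_visit = 0.5

provider_hours_shifted = visits * hours_per_visit  # 338.5 h, ~338 as reported
visits_per_clinic_day = visits / clinic_days       # ~7.7 across all pharmacists
```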


Subject(s)
COVID-19 , Pharmaceutical Services , Family Practice , Humans , Pandemics , Pharmacists
16.
Environ Pollut ; 285: 117263, 2021 Sep 15.
Article in English | MEDLINE | ID: mdl-33940229

ABSTRACT

Groundwater quality monitoring typically employs testing for the presence of E. coli as a fecal indicator of recent ingress of human or animal fecal material. The efficacy of fecal indicator organisms (FIOs) is based on the primary criterion that the organism does not reproduce in the aquatic environment. However, recent studies have reported that E. coli may proliferate in (i.e., has adapted to) the external environment, including soil and surface water. To date, the presence of environmentally-adapted E. coli in groundwater has not been examined. The current study employed Clermont phylotyping and the presence of six accessory genes to identify the likely presence of adapted E. coli in private groundwater sources. E. coli isolates (n = 325) from 76 contaminated private water wells located in a southeastern Ontario watershed were compared with geographically analogous human and animal fecal E. coli isolates (n = 234). Cryptic clades III-V, a well-described environmentally-adapted Escherichia population, were identified in three separate groundwater wells, one of which exclusively comprised this adapted population. Dimensionality reduction (via Principal Component Analysis) was used to develop an "E. coli adaptation model" comprising three distinct components (groundwater, animal feces, human feces); the model suggests that adaptation occurs frequently in the groundwater environment. Model findings indicate that 23/76 (30.3%) wells had an entirely adapted community. Accordingly, the use of E. coli as an FIO returned a false positive result in these instances, while an additional 23/76 (30.3%) wells exhibited some evidence of adaptation (i.e., not all isolates were adapted), representing an overestimate of the magnitude (concentration) of contamination. Study findings highlight the need to further characterize environmentally-adapted E. coli in the groundwater environment and the potential implications with respect to water quality policy, legislation, and determinants of human health risk, both regionally and internationally.
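The dimensionality-reduction step can be sketched as a minimal PCA on binary accessory-gene profiles. The data, random seed, and component count below are invented for illustration and do not reproduce the study's model:

```python
import numpy as np

# Rows are isolates, columns are six accessory genes (presence/absence);
# every value here is made up.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(12, 6)).astype(float)

# Centre each gene column, then extract principal components via SVD.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = Xc @ Vt[:3].T                 # isolate coordinates on the first 3 PCs
explained = (S ** 2) / (S ** 2).sum()  # variance fraction per component
```

In a workflow of this kind, isolates would be projected into the component space and compared against reference fecal isolates to judge which component (here, hypothetically, groundwater vs. animal vs. human) they resemble.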


Subject(s)
Escherichia coli , Groundwater , Animals , Environmental Monitoring , Feces , Humans , Ontario , Water Microbiology , Water Wells
17.
Water Res ; 197: 117089, 2021 Jun 01.
Article in English | MEDLINE | ID: mdl-33836295

ABSTRACT

Groundwater resources are under increasing threat from contamination and overuse, posing direct risks to human and environmental health. The purpose of this study is to better understand drivers of, and relationships between, well and aquifer characteristics, sampling frequencies, and microbiological contamination indicators (specifically E. coli) as a precursor for improving knowledge and tools to assess aquifer vulnerability and well contamination within Ontario, Canada. A dataset with 795,023 microbiological testing observations over an eight-year period (2010 to 2017) from 253,136 unique wells across Ontario was employed. Variables in this dataset include date and location of test, test results (E. coli concentration), well characteristics (well depth, location), and hydrogeological characteristics (bottom of well stratigraphy, specific capacity). Association rule analysis, univariate and bivariate analyses, regression analyses, and variable discretization techniques were utilized to identify relationships between E. coli concentration and the other variables in the dataset. These relationships can be used to identify drivers of contamination, their relative importance, and therefore potential public health risks associated with the use of private wells in Ontario. Key findings are that: i) bedrock wells completed in sedimentary or igneous rock are more susceptible to contamination events; ii) while shallow wells pose a greater risk to consumers, deep wells are also subject to contamination events and pose a potentially unanticipated risk to health of well users; and iii) well testing practices are influenced by results of previous tests. Further, while there is a general correlation between months with the greatest testing frequencies and concentrations of E. coli occurring in samples, an offset in this timing is observed in recent years. Testing remains highest in July while peaks in adverse results occur up to three months later. These trends warrant further exploration of their underlying causes.
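Association rule analysis, one of the techniques named above, rests on support, confidence, and lift. A toy sketch with invented well records and an invented rule (not the study's data or rules):

```python
# Hypothetical well records; fields and values are invented.
records = [
    {"bedrock": "sedimentary", "shallow": True,  "ecoli": True},
    {"bedrock": "sedimentary", "shallow": True,  "ecoli": False},
    {"bedrock": "sedimentary", "shallow": False, "ecoli": True},
    {"bedrock": "igneous",     "shallow": False, "ecoli": False},
    {"bedrock": "igneous",     "shallow": True,  "ecoli": False},
]

def rule_metrics(records, antecedent, consequent):
    """Support, confidence and lift of the rule antecedent -> consequent.
    Both arguments are predicates over a record."""
    n = len(records)
    n_a = sum(1 for r in records if antecedent(r))
    n_c = sum(1 for r in records if consequent(r))
    n_ac = sum(1 for r in records if antecedent(r) and consequent(r))
    support = n_ac / n
    confidence = n_ac / n_a if n_a else 0.0
    lift = confidence / (n_c / n) if n_c else 0.0
    return support, confidence, lift

# Example rule: sedimentary bedrock -> E. coli detected.
support, confidence, lift = rule_metrics(
    records,
    antecedent=lambda r: r["bedrock"] == "sedimentary",
    consequent=lambda r: r["ecoli"],
)
```

A lift above 1 indicates the antecedent and consequent co-occur more often than chance, which is how rules pointing at contamination drivers (e.g. bedrock type) would surface.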


Subject(s)
Drinking Water , Groundwater , Water Pollutants, Chemical , Environmental Monitoring , Escherichia coli , Humans , Machine Learning , Ontario , Water Pollutants, Chemical/analysis , Water Supply , Water Wells
18.
Sci Total Environ ; 738: 140382, 2020 Oct 10.
Article in English | MEDLINE | ID: mdl-32806349

ABSTRACT

Approximately 1.5 million individuals in Ontario are supplied by private water wells (private groundwater supplies). Unlike municipal supplies, private well water quality remains unregulated, with owners responsible for testing, treating, and maintaining their own water supplies. The primary goal of this study was to assess the effect of repeat sampling of private well water in Ontario and investigate the efficacy of geographically- and/or temporally specific testing recommendations and health risk assessments. The current study combines the Well Water Information System Dataset and the Well Water Testing Dataset from 2010 to 2017, inclusive. These two large existing province-wide datasets collated over an eight-year period were merged using an integrated spatial fuzzy logic and (next-)nearest neighbour approach. Provincial sampling data from 239,244 wells (702,861 samples) were analyzed for Escherichia coli to study the relationship between sampling frequency and Escherichia coli detection. Dataset variables were delineated based on hydrogeological setting (e.g. aquifer type, overburden depth, well depth, bedrock type) and seasonality to provide an in-depth understanding of Escherichia coli detection in private well water. Findings reveal differences between detection rates in consolidated and unconsolidated aquifers (p = 0.0191), and across seasons (p < 0.0001). The variability associated with Escherichia coli detection rates was explored by estimating sentinel sampling rates for private wells sampled three times, twelve times and twenty-four times per year. As sample size increases on an annual basis, so too does detection rate, highlighting the need to address current testing frequency guidelines.
Future health risk assessments for private well water should consider the impact of spatial and temporal factors on the susceptibility of this drinking water source, leading to an increasingly accurate depiction of private well water contamination and the estimated effects on human health.
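The link between sampling frequency and detection rate described above can be illustrated with a simple independence assumption (ours, not the study's): if each sample independently detects E. coli with probability p, the chance of at least one positive result rises quickly with the number of samples per year:

```python
# Assumption (ours): independent samples with a constant per-sample
# detection probability p. The probability of at least one positive
# result over n samples is then 1 - (1 - p)^n.
def annual_detection_probability(p, n):
    """P(at least one E. coli-positive result in n independent samples)."""
    return 1 - (1 - p) ** n

p = 0.05  # illustrative per-sample detection probability, not a study estimate
three = annual_detection_probability(p, 3)    # ~0.14
twelve = annual_detection_probability(p, 12)  # ~0.46
```

Even under this simplistic model, moving from three to twelve samples per year roughly triples the chance of catching at least one contamination event, which mirrors the study's observation that detection rate grows with annual sample size.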


Subject(s)
Drinking Water , Groundwater , Humans , Ontario , Risk Assessment , Water Wells
19.
Transl Anim Sci ; 4(1): 242-249, 2020 Jan.
Article in English | MEDLINE | ID: mdl-32704983

ABSTRACT

The decision on which ewe lamb to retain versus which to sell is likely to vary by producer based on personal preference. What is not known, however, is whether any commonality exists among producers in the characteristics of ewe lambs that influence their eventual fate. The objective of the present study was to determine what genetic and nongenetic factors associate with the fate of maiden ewe lambs. The fate of each ewe lamb born in the present study was defined as either: 1) subsequently lambing in the flock, or 2) being slaughtered without any recorded lambing event. A total of 9,705 ewe lamb records from 41 crossbred flocks were used. The logit of the odds of the ewe lamb being retained for lambing was modeled using logistic regression. Variance components were then estimated for the binary trait representing the fate of the ewe lamb using animal linear and threshold mixed models. The genetic correlations between fate of the ewe lamb and preweaning, weaning, or postweaning liveweight were also estimated. From the edited data set, 45% of ewe lambs born entered the mature flock as ewes. Ewe lambs reared as singles, with greater levels of heterosis but lower levels of recombination loss, born to dams that lambed for the first time as hoggets, with greater breed proportion of the Belclare, Suffolk, Texel, and Lleyn breeds were more likely (P < 0.001) to eventually lamb in the flock than be slaughtered without ever lambing. Irrespective of the age of the animal when weighed, heavier ewe lambs were more likely to eventually lamb (P < 0.001). The genetic SD and direct heritability of fate of the ewe lamb estimated in the univariate linear model were 26.58 percentage units and 0.31 (SE = 0.03), respectively; the heritability was 0.30 when estimated using the threshold model.
The corresponding direct heritability of fate of the ewe lamb estimated in the bivariate analyses with liveweight ranged from 0.29 (SE = 0.03; preweaning weight) to 0.35 (SE = 0.04; postweaning weight). The genetic correlations estimated between fate of the ewe lamb and the liveweight traits were weak to moderate but strengthened as the age of the ewe lamb at weighing increased. Results from this study provide an understanding of the factors producers consider when selecting females for retention versus slaughter which may form useful parameters in the development of a decision support tool to identify suitable ewe lambs for retention.
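The "logit of the odds" framing used in the retention analysis can be sketched as follows: a fitted logistic model turns a linear predictor into a retention probability, and exponentiating a coefficient gives an odds ratio. The intercept and per-kg liveweight coefficient below are invented for illustration, not estimates from the study:

```python
import math

def retention_probability(intercept, coef, x):
    """Inverse logit of (intercept + coef * x)."""
    eta = intercept + coef * x
    return 1 / (1 + math.exp(-eta))

beta_weight = 0.08                  # hypothetical log-odds increase per kg
odds_ratio = math.exp(beta_weight)  # each extra kg multiplies the odds by this

p_30kg = retention_probability(-2.5, beta_weight, 30.0)  # lighter ewe lamb
p_40kg = retention_probability(-2.5, beta_weight, 40.0)  # heavier ewe lamb
```

Under these made-up coefficients the heavier lamb has the higher retention probability, matching the direction (though not the magnitude) of the liveweight effect reported above.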

20.
Sci Total Environ ; 717: 137188, 2020 May 15.
Article in English | MEDLINE | ID: mdl-32062277

ABSTRACT

Approximately 12% of the Canadian population uses private wells for daily water consumption; however, well water testing rates are on the decline, resulting in an increased risk of waterborne acute gastrointestinal illness. To date, limited research has explored the determinants influencing well testing practices. Accordingly, the current study sought to investigate the drivers of "one-off" and repeat well water testing in southern Ontario during the 5-year period 2012-2016, using the world's largest private groundwater testing dataset. Data from >400,000 wells were geospatially integrated with all tests conducted by the provincial laboratory in southern Ontario. The Ontario Marginalization Index (ON-Marg) was used as a proxy measure of socioeconomic status (SES), with rurality, based on population density, season, and index (1st) test results assessed as effect modifiers. Multivariate analysis was undertaken using log-binomial regression. Approximately 27.5% of wells (n = 417,406) were tested during the study period, 66.7% of which were sampled more than once; 3% of all samples tested positive for E. coli (>0 colony-forming units/100 mL). In rural regions (<150 people/km²), wells located in low SES areas were 13% more likely to be tested compared to high SES areas (95% CI: 1.11, 1.15). In urban (>400 people/km²) and peri-urban regions (>150 and <400 people/km²), wells located in low SES areas were 14% (95% CI: 0.78, 0.95) and 15% (95% CI: 0.76, 0.94) less likely to be tested compared to high SES areas. Wells located in low SES areas were 6% more likely to be re-tested (95% CI: 1.04, 1.07). Positive index tests were associated with a 17% increased likelihood of repeat testing (95% CI: 1.16, 1.18). Accordingly, the authors conclude that location and SES are significant predictors of well water testing, with index test status being the most influential predictor of repeat well testing.
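A crude version of the relative-risk-with-CI calculations of the kind reported above (e.g. "13% more likely ... 95% CI: 1.11, 1.15") can be sketched from a 2x2 table. The counts below are invented, and the study's log-binomial regression adjusts for covariates in a way this simple estimator does not:

```python
import math

def relative_risk(a, n1, b, n2):
    """Crude RR of group 1 (a events / n1) vs group 2 (b events / n2),
    with a Wald 95% CI computed on the log scale."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# E.g. 120/1000 wells tested in low-SES areas vs 100/1000 in high-SES areas:
rr, lo, hi = relative_risk(120, 1000, 100, 1000)  # RR = 1.2
```

Note that with these toy counts the interval crosses 1.0, so the 20% excess would not be statistically significant; the study's much larger sample sizes are what make its narrow intervals possible.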
