Results 1 - 20 of 79
1.
J Antimicrob Chemother ; 76(9): 2464-2471, 2021 08 12.
Article in English | MEDLINE | ID: mdl-34109397

ABSTRACT

BACKGROUND: Understanding antimicrobial consumption is essential to mitigate the development of antimicrobial resistance, yet robust data in children are sparse and methodologically limited. Electronic prescribing systems provide an important opportunity to analyse and report antimicrobial consumption in detail. OBJECTIVES: We investigated the value of electronic prescribing data from a tertiary children's hospital to report temporal trends in antimicrobial consumption in hospitalized children and compare commonly used metrics of antimicrobial consumption. METHODS: Daily measures of antimicrobial consumption [days of therapy (DOT) and DDDs] were derived from the electronic prescribing system between 2010 and 2018. Autoregressive moving-average models were used to infer trends and the estimates were compared with simulated point prevalence surveys (PPSs). RESULTS: More than 1.3 million antimicrobial administrations were analysed. There was significant daily and seasonal variation in overall consumption, which reduced annually by 1.77% (95% CI 0.50% to 3.02%). Relative consumption of meropenem decreased by 6.6% annually (95% CI -3.5% to 15.8%) following the expansion of the hospital antimicrobial stewardship programme. DOT and DDDs exhibited similar trends for most antimicrobials, though inconsistencies were observed where changes to dosage guidelines altered consumption calculation by DDDs, but not DOT. PPS simulations resulted in estimates of change over time, which converged on the model estimates, but with much less precision. CONCLUSIONS: Electronic prescribing systems offer significant opportunities to better understand and report antimicrobial consumption in children. This approach to modelling administration data overcomes the limitations of using interval data and dispensary data. It provides substantially more detailed inferences on prescribing patterns and the potential impact of stewardship interventions.


Subject(s)
Anti-Infective Agents , Antimicrobial Stewardship , Electronic Prescribing , Anti-Bacterial Agents/therapeutic use , Child , Child, Hospitalized , Humans
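
The entry above contrasts two consumption metrics, days of therapy (DOT) and defined daily doses (DDDs). As a minimal sketch of how the two differ, the snippet below computes both from a toy table of administration records; the column names and the reference DDD values are assumptions for illustration, not the authors' data schema.

```python
import pandas as pd

# Toy administration records: one row per dose given (assumed schema).
admins = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "drug":       ["meropenem", "meropenem", "gentamicin", "meropenem", "meropenem"],
    "date":       pd.to_datetime(["2018-03-01", "2018-03-01", "2018-03-01",
                                  "2018-03-02", "2018-03-03"]),
    "dose_g":     [0.5, 0.5, 0.08, 1.0, 1.0],
})

# Assumed reference defined daily doses in grams (illustrative values only).
ddd_reference_g = {"meropenem": 3.0, "gentamicin": 0.24}

# DOT: each patient-drug-calendar-day with at least one administration counts once.
dot = (admins.drop_duplicates(["patient_id", "drug", "date"])
             .groupby("drug").size().rename("DOT"))

# DDD: total grams administered divided by the reference daily dose.
total_g = admins.groupby("drug")["dose_g"].sum()
ddd = (total_g / pd.Series(ddd_reference_g)).rename("DDD")

print(pd.concat([dot, ddd], axis=1))
```

Because DOT counts patient-days on a drug while DDDs divide total grams by a fixed reference dose, a dosage-guideline change shifts the DDD figure but leaves DOT unchanged, which is the inconsistency the abstract describes.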
2.
Spat Spatiotemporal Epidemiol ; 27: 61-70, 2018 11.
Article in English | MEDLINE | ID: mdl-30409377

ABSTRACT

Giardia and Cryptosporidium are both waterborne parasites and leading causes of gastroenteritis. Although specimens from diarrhoeic patients are routinely examined for Cryptosporidium, they are often not examined for Giardia, so many cases go undiagnosed. Since 2002, all faecal specimens in Central Lancashire have been tested for infection with Giardia and Cryptosporidium. The aim of this paper is to gain insight into the factors contributing to giardiasis and cryptosporidiosis, including evidence of transmission via drinking water. Our analysis found a higher risk of both conditions for young children and a second peak in risk of giardiasis in adults. There was a significantly higher risk of giardiasis for males and a higher risk of cryptosporidiosis for females. The geographical location was significant, with an increased risk in the north. Residence in an area with increased supply from one water treatment works was a significant predictor for cryptosporidiosis.


Subject(s)
Cryptosporidiosis , Drinking Water/standards , Giardiasis , Waterborne Diseases , Adult , Age Factors , Aged , Child , Cryptosporidiosis/epidemiology , Cryptosporidiosis/etiology , Cryptosporidiosis/prevention & control , England/epidemiology , Female , Giardiasis/epidemiology , Giardiasis/etiology , Giardiasis/prevention & control , Humans , Male , Risk Assessment/methods , Risk Factors , Spatial Analysis , Water Supply/methods , Water Supply/standards , Water Supply/statistics & numerical data , Waterborne Diseases/epidemiology , Waterborne Diseases/etiology , Waterborne Diseases/prevention & control
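
The age- and sex-specific risk patterns described above are the kind of result a rate regression can express. The sketch below fits a Poisson model of case counts with a population offset to hypothetical numbers; it is a generic illustration under assumed data, not the spatial model used in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical annual case counts and population denominators by age group and sex.
df = pd.DataFrame({
    "age_group":  ["0-4", "0-4", "5-14", "5-14", "15-44", "15-44", "45+", "45+"],
    "sex":        ["M", "F"] * 4,
    "cases":      [30, 24, 12, 10, 22, 15, 9, 8],
    "population": [4000, 3800, 9000, 8700, 21000, 21500, 18000, 19000],
})

# Poisson regression with a log-population offset gives incidence rate ratios
# for age group and sex (the exponentiated coefficients).
fit = smf.glm("cases ~ C(age_group) + C(sex)", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["population"])).fit()
print(np.exp(fit.params).round(2))
```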
3.
Epidemiol Infect ; 145(16): 3438-3448, 2017 12.
Article in English | MEDLINE | ID: mdl-29173242

ABSTRACT

Infectious diseases frequently have multiple potential routes of intraspecific transmission of pathogens within wildlife and other populations. For pathogens causing zoonotic diseases, knowing whether these transmission routes occur in the wild, and their relative importance, is critical for understanding maintenance, improving control measures and ultimately preventing human disease. The Norway rat (Rattus norvegicus) is the primary reservoir of leptospirosis in the urban slums of Salvador, Brazil. There is biological evidence for potentially three different transmission routes of leptospire infection occurring in the rodent population. Using newly obtained prevalence data from rodents trapped at an urban slum field site, we present changes in cumulative risk of infection in relation to age-dependent transmission routes to infer which intraspecific transmission routes occur in the wild. We found that a significant proportion of animals leave the nest with infection and that the risk of infection increases throughout the lifetime of Norway rats. We did not observe a significant effect of sexual maturity on the risk of infection. In conclusion, our results suggest that vertical and environmental transmission of leptospirosis both occur in wild populations of Norway rats.


Subject(s)
Leptospira , Leptospirosis , Rodent Diseases , Aging , Animals , Body Weight , Brazil/epidemiology , Carrier State/epidemiology , Carrier State/transmission , Carrier State/veterinary , Female , Infectious Disease Transmission, Vertical , Leptospirosis/epidemiology , Leptospirosis/transmission , Leptospirosis/veterinary , Male , Prevalence , Rats , Rodent Diseases/epidemiology , Rodent Diseases/transmission , Survival Analysis
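
One standard way to translate age-specific prevalence into inference about transmission routes is a catalytic model in which a fraction of animals is already infected at birth (vertical transmission) and the remainder acquire infection at a constant environmental rate. The sketch below fits such a model by maximum likelihood to made-up prevalence data; it is a generic illustration, not the authors' analysis.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical prevalence data: rats grouped by age (months), number tested
# and number leptospire-positive.
age      = np.array([1.0, 3.0, 6.0, 9.0, 12.0])
n_tested = np.array([40, 55, 60, 35, 20])
n_pos    = np.array([6, 14, 24, 18, 13])

def neg_log_lik(params):
    p0, lam = params  # p0: prevalence at birth, lam: environmental force of infection
    p = p0 + (1.0 - p0) * (1.0 - np.exp(-lam * age))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(n_pos * np.log(p) + (n_tested - n_pos) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=[0.05, 0.1],
               bounds=[(0, 1), (0, None)], method="L-BFGS-B")
p0_hat, lam_hat = fit.x
print(f"estimated prevalence at birth: {p0_hat:.2f}, "
      f"force of infection: {lam_hat:.3f} per month")
```

A p0 estimate well above zero corresponds to the abstract's finding that a significant proportion of animals leave the nest already infected, while a positive lam reflects the rising risk over the rats' lifetime.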
4.
Diabet Med ; 34(8): 1136-1144, 2017 08.
Article in English | MEDLINE | ID: mdl-28294392

ABSTRACT

AIM: To analyse the cost-effectiveness of different interventions for Type 2 diabetes prevention within a common framework. METHODS: A micro-simulation model was developed to evaluate the cost-effectiveness of a range of diabetes prevention interventions including: (1) soft drinks taxation; (2) retail policy in socially deprived areas; (3) workplace intervention; (4) community-based intervention; and (5) screening and intensive lifestyle intervention in individuals with high diabetes risk. Within the model, individuals follow metabolic trajectories (for BMI, cholesterol, systolic blood pressure and glycaemia); individuals may develop diabetes, and some may exhibit complications of diabetes and related disorders, including cardiovascular disease, and eventually die. Lifetime healthcare costs, employment costs and quality-adjusted life-years are collected for each person. RESULTS: All interventions generate more life-years and lifetime quality-adjusted life-years and reduce healthcare spending compared with doing nothing. Screening and intensive lifestyle intervention generates the greatest lifetime net benefit (£37) but is costly to implement. In comparison, soft drinks taxation or retail policy generate a lower net benefit (£11 each) but are cost-saving in a shorter time period, preferentially benefit individuals from deprived backgrounds and reduce employer costs. CONCLUSION: The model enables a wide range of diabetes prevention interventions to be evaluated according to cost-effectiveness, employment and equity impacts over the short and long term, allowing decision-makers to prioritize policies that maximize the expected benefits, as well as fulfilling other policy targets, such as addressing social inequalities.


Subject(s)
Diabetes Mellitus, Type 2/prevention & control , Diet, Healthy , Health Policy , Health Promotion/economics , Healthy Lifestyle , Models, Economic , Quality of Life , Carbonated Beverages/adverse effects , Carbonated Beverages/economics , Computer Simulation , Cost Savings , Cost-Benefit Analysis , Costs and Cost Analysis , Diabetes Mellitus, Type 2/blood , Diabetes Mellitus, Type 2/diagnosis , Diabetes Mellitus, Type 2/economics , Diet, Healthy/economics , England , Health Education/economics , Health Surveys , Humans , Mass Screening/economics , Residence Characteristics , Taxes , Workplace
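
To make the micro-simulation idea concrete, the toy loop below tracks BMI, annual diabetes risk, costs and quality-adjusted life-years for simulated individuals and compares a hypothetical BMI-lowering intervention against doing nothing. Every parameter value is an illustrative assumption; the published model's trajectories, complications and costing are far richer.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n=10_000, years=40, bmi_reduction=0.0):
    """Toy individual-level simulation: annual diabetes risk rises with BMI;
    diabetes adds healthcare cost and reduces quality of life. All parameter
    values are illustrative assumptions, not those of the published model."""
    bmi = rng.normal(27, 4, n) - bmi_reduction
    diabetic = np.zeros(n, dtype=bool)
    qalys = np.zeros(n)
    costs = np.zeros(n)
    for _ in range(years):
        bmi += rng.normal(0.1, 0.2, n)                  # gradual BMI drift
        risk = 1 / (1 + np.exp(-(-7.0 + 0.15 * bmi)))   # logistic annual risk
        diabetic |= rng.random(n) < risk                 # incident diabetes
        qalys += np.where(diabetic, 0.78, 0.85)          # annual utility
        costs += np.where(diabetic, 2000.0, 300.0)       # annual cost (£)
    return qalys.mean(), costs.mean()

q0, c0 = simulate()
q1, c1 = simulate(bmi_reduction=1.0)   # hypothetical population-wide intervention
print(f"QALYs gained per person: {q1 - q0:.4f}, "
      f"cost change per person: £{c1 - c0:.2f}")
```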
5.
Diabet Med ; 34(5): 632-640, 2017 05.
Article in English | MEDLINE | ID: mdl-28075544

ABSTRACT

AIMS: To develop a cost-effectiveness model to compare Type 2 diabetes prevention programmes targeting different at-risk population subgroups with a lifestyle intervention of varying intensity. METHODS: An individual patient simulation model was constructed to simulate the development of diabetes in a representative sample of adults without diabetes from the UK population. The model incorporates trajectories for HbA1c, 2-h glucose, fasting plasma glucose, BMI, systolic blood pressure, total cholesterol and HDL cholesterol. Patients can be diagnosed with diabetes, cardiovascular disease, microvascular complications of diabetes, cancer, osteoarthritis and depression, or can die. The model collects costs and utilities over a lifetime horizon. The perspective is the UK National Health Service and personal social services. We used the model to evaluate the population-wide impact of targeting a lifestyle intervention of varying intensity to six population subgroups defined as high risk for diabetes. RESULTS: The intervention produces 0.0003 to 0.0009 incremental quality-adjusted life-years and saves up to £1.04 per person in the general population, depending upon the subgroup targeted. Cost-effectiveness increases with intervention intensity. The most cost-effective options are to target individuals with HbA1c > 42 mmol/mol (6%) or with a high Finnish Diabetes Risk (FINDRISC) probability score (> 0.1). CONCLUSION: The model indicates that diabetes prevention interventions are likely to be cost-effective and may be cost-saving over a lifetime. In the model, the criteria for selecting at-risk individuals differentially impact upon diabetes and cardiovascular disease outcomes, and on the timing of benefits. These findings have implications for deciding who should be targeted for diabetes prevention interventions.


Subject(s)
Diabetes Mellitus, Type 2/prevention & control , Primary Prevention , Risk Reduction Behavior , Adult , Aged , Aged, 80 and over , Cardiovascular Diseases/etiology , Cardiovascular Diseases/prevention & control , Cost-Benefit Analysis , Diabetes Mellitus, Type 2/complications , Diabetes Mellitus, Type 2/economics , Female , Health Promotion/economics , Health Promotion/methods , Humans , Life Style , Male , Middle Aged , Primary Prevention/economics , Primary Prevention/methods , Quality-Adjusted Life Years , Risk Assessment , Risk Factors , Young Adult
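
Cost-effectiveness results like those above are usually summarised as incremental cost-effectiveness ratios or net monetary benefit at a willingness-to-pay threshold. The sketch below shows the arithmetic for two hypothetical strategies; the numbers are placeholders, not the paper's estimates.

```python
# Incremental results versus "do nothing" for two hypothetical strategies.
# All figures are placeholders for illustration.
WTP = 20_000  # willingness-to-pay threshold, £ per QALY

strategies = {
    # name: (incremental QALYs per person, incremental cost per person in £)
    "strategy A": (0.0009, -1.00),   # more effective and cost-saving
    "strategy B": (0.0004, 5.00),    # more effective but more costly
}

for name, (d_qaly, d_cost) in strategies.items():
    nmb = WTP * d_qaly - d_cost                       # net monetary benefit
    if d_qaly > 0 and d_cost < 0:
        verdict = "dominant (more effective and cheaper)"
    else:
        verdict = f"ICER = £{d_cost / d_qaly:,.0f} per QALY"
    print(f"{name}: NMB £{nmb:.2f} per person; {verdict}")
```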
6.
Parasite Immunol ; 38(7): 387-402, 2016 07.
Article in English | MEDLINE | ID: mdl-27108767

ABSTRACT

Bovine tuberculosis (BTB), caused by Mycobacterium bovis, has an annual incidence in cattle of 0.5% in the Republic of Ireland and 4.7% in the UK, despite long-standing eradication programmes being in place. Failure to achieve complete eradication is multifactorial, but the limitations of diagnostic tests are significant complicating factors. Previously, we demonstrated that Fasciola hepatica infection, highly prevalent in these areas, reduces the sensitivity of the standard diagnostic tests for BTB in animals co-infected with F. hepatica and M. bovis. This was accompanied by a reduced M. bovis-specific Th1 immune response. We hypothesized that these changes in co-infected animals would be accompanied by enhanced growth of M. bovis. However, we show here that mycobacterial burden in cattle is reduced in animals co-infected with F. hepatica. Furthermore, we demonstrate a lower mycobacterial recovery and uptake in blood monocyte-derived macrophages (MDM) from F. hepatica-infected cattle, which is associated with suppression of pro-inflammatory cytokines and a switch to alternative activation of macrophages. However, the cell surface expression of TLR2 and CD14 in MDM from F. hepatica-infected cattle is increased. These findings, reflecting the bystander effect of helminth-induced downregulation of pro-inflammatory responses, provide insights into host-pathogen interactions during co-infection.


Subject(s)
Cytokines/immunology , Fasciola hepatica/physiology , Fascioliasis/immunology , Mycobacterium bovis/growth & development , Tuberculosis, Bovine/microbiology , Animals , Cattle , Coinfection/immunology , Coinfection/microbiology , Coinfection/parasitology , Cytokines/genetics , Fascioliasis/parasitology , Host-Pathogen Interactions , Macrophages/immunology , Macrophages/microbiology , Mycobacterium bovis/physiology , Tuberculosis, Bovine/immunology
8.
BJOG ; 122(9): 1226-34, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25958769

ABSTRACT

OBJECTIVE: (Primary) To establish the effect of antenatal group self-hypnosis for nulliparous women on intra-partum epidural use. DESIGN: Multi-method randomised controlled trial (RCT). SETTING: Three NHS Trusts. POPULATION: Nulliparous women not planning elective caesarean, without medication for hypertension and without psychological illness. METHODS: Randomisation at 28-32 weeks' gestation to usual care, or to usual care plus brief self-hypnosis training (two × 90-minute groups at around 32 and 35 weeks' gestation; daily audio self-hypnosis CD). Follow-up at 2 and 6 weeks postnatal. MAIN OUTCOME MEASURES: Primary: epidural analgesia. Secondary: associated clinical and psychological outcomes; cost analysis. RESULTS: Six hundred and eighty women were randomised. There was no statistically significant difference in epidural use: 27.9% (intervention), 30.3% (control), odds ratio (OR) 0.89 [95% confidence interval (CI): 0.64-1.24], or in 27 of 29 pre-specified secondary clinical and psychological outcomes. Women in the intervention group had lower actual than anticipated levels of fear and anxiety between baseline and 2 weeks postnatal (anxiety: mean difference -0.72, 95% CI -1.16 to -0.28, P = 0.001; fear: mean difference -0.62, 95% CI -1.08 to -0.16, P = 0.009) [Correction added on 7 July 2015, after first online publication: 'Mean difference' replaced 'Odds ratio (OR)' in the preceding sentence.]. Postnatal response rates were 67% overall at 2 weeks. The additional cost in the intervention arm per woman was £4.83 (CI -£257.93 to £267.59). CONCLUSIONS: Allocation to two third-trimester group self-hypnosis training sessions did not significantly reduce intra-partum epidural analgesia use or a range of other clinical and psychological variables. The impact of women's anxiety and fear about childbirth needs further investigation.


Subject(s)
Analgesia, Epidural/statistics & numerical data , Analgesia, Obstetrical/statistics & numerical data , Hypnosis , Labor Pain/therapy , Pain Management , Patient Compliance/statistics & numerical data , Self Care/methods , Adult , Female , Humans , Labor Pain/epidemiology , Pain Management/methods , Patient Education as Topic , Patient Satisfaction , Pregnancy , Reminder Systems , Surveys and Questionnaires , Treatment Outcome
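
The primary result above is reported as an odds ratio with a 95% confidence interval. The snippet below shows the standard log-odds-ratio calculation with a Wald interval on a hypothetical 2 × 2 table; the counts are invented and do not reproduce the trial data.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table (epidural yes/no by trial arm); the counts are made up
# for illustration and are not the trial's data.
a, b = 80, 207   # intervention arm: epidural, no epidural
c, d = 92, 212   # control arm:      epidural, no epidural

or_hat = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of the log odds ratio
z = stats.norm.ppf(0.975)
ci = np.exp(np.log(or_hat) + np.array([-z, z]) * se_log_or)
print(f"OR = {or_hat:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```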
9.
Contemp Clin Trials ; 41: 100-9, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25602581

ABSTRACT

BACKGROUND: Relapse prevention interventions for Bipolar Disorder are effective but implementation in routine clinical services is poor. Web-based approaches offer easily accessible, low-cost delivery of evidence-based interventions, and have been shown to be effective for other mood disorders. METHODS/DESIGN: This protocol describes the development and feasibility testing of the ERPonline web-based intervention using a single-blind randomised controlled trial. Data will include the extent to which the site was used, detailed feedback from users about their experiences of the site, reported benefits and costs to mental health and wellbeing of users, and costs and savings to health services. We will gain an estimate of the likely effect size of ERPonline on a range of important outcomes including mood, functioning, quality of life and recovery. We will explore potential mechanisms of change, giving us a greater understanding of the underlying processes of change, and consequently how the site could be made more effective. We will be able to determine rates of recruitment and retention, and identify what factors could improve these rates. DISCUSSION: The findings will be used to improve the site in accordance with user needs, and inform the design of a large-scale evaluation of the clinical and cost-effectiveness of ERPonline. They will further contribute to the growing evidence base for web-based interventions designed to support people with mental health problems.


Subject(s)
Bipolar Disorder/therapy , Internet , Patient Acceptance of Health Care , Secondary Prevention , Self Care/methods , Therapy, Computer-Assisted/methods , Adaptation, Psychological , Adult , Feasibility Studies , Female , Humans , Male , Qualitative Research , Quality of Life , Single-Blind Method , Treatment Outcome
10.
Epidemiol Infect ; 143(8): 1692-701, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25266562

ABSTRACT

Many cases of giardiasis in the UK go undiagnosed and, among other things, diagnosis depends on the readiness of GPs to request a specimen. The aim of this study is to assess the rate of specimens requested per GP practice in Central Lancashire, to examine the differences between GP practices and to estimate the pattern of unexplained spatial variation in the practice rate of specimens after adjustment for deprivation. To achieve this, we fitted a set of binomial and Poisson regression models, with random effects for GP practice. Our analysis suggests that there were differences in the rate of specimens by GP practices (P < 0·001) for a single year, but no difference in the proportion of positive tests per specimen submitted or in the rate of positive specimens per practice population. There was a difference in the cumulative rate of positive specimens per practice population over a 9-year period (P < 0·001). Neither the specimen rate per practice for a single year nor the cumulative rate of positive specimens over multiple years demonstrated significant spatial correlation. Hence, spatial variation in the incidence of giardiasis is unlikely to be confounded by variation in GP rate of specimens.


Subject(s)
Feces/parasitology , General Practice/statistics & numerical data , Giardiasis/diagnosis , Practice Patterns, Physicians'/statistics & numerical data , Specimen Handling/statistics & numerical data , England/epidemiology , Giardiasis/epidemiology , Humans , Regression Analysis , Socioeconomic Factors
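
The paper fits binomial and Poisson regression models with random effects for GP practice. As a simpler stand-in, the sketch below fits a plain Poisson GLM of specimen counts on deprivation with practice list size as the exposure, using simulated data; practice-level overdispersion, which the random effects capture, could be handled with a mixed model or a negative binomial family instead.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Hypothetical practice-level data: specimens requested in a year, list size
# and a deprivation score for 30 GP practices.
n = 30
df = pd.DataFrame({
    "practice": np.arange(n),
    "list_size": rng.integers(3000, 15000, n),
    "deprivation": rng.normal(0, 1, n),
})
df["specimens"] = rng.poisson(0.004 * df["list_size"] * np.exp(0.2 * df["deprivation"]))

# Poisson GLM of specimen counts on deprivation, with the practice list size
# as the exposure (i.e. modelling the specimen rate per registered patient).
fit = smf.glm("specimens ~ deprivation", data=df,
              family=sm.families.Poisson(), exposure=df["list_size"]).fit()
print(np.exp(fit.params).round(3))   # rate ratios per unit deprivation
```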
11.
Epidemiol Infect ; 142(4): 861-70, 2014 Apr.
Article in English | MEDLINE | ID: mdl-23830295

ABSTRACT

In a 2-year longitudinal study of adult animals on 15 dairy farms and four sheep farms in Lancashire, UK, Arcobacter spp. were isolated from all farms although not at every sampling occasion. Faecal samples were collected and cultured using standard techniques for isolation of campylobacters. Assignment to species was via PCR assays. Apparent prevalence of Arcobacter spp. was higher in dairy cattle compared to sheep (40.1% vs. 8%, P < 0.001) and in housed cattle compared to cattle at pasture (50.1% vs. 20.9%, P < 0.001). This was reflected in the higher prevalence observed in herds that were housed (n = 4) all year compared to herds that grazed cattle on pasture in the summer and housed cattle in the winter (n = 11) (55.5% vs. 36%, P < 0.001). In the case of sheep, peak prevalence was observed in autumn with increased prevalence also being associated with improving pasture quality. There was an apparent inverse association between the faecal pat prevalence of Arcobacter spp. and Campylobacter jejuni although this may in part be an artefact of laboratory test method sensitivity, whereby a relative increase in the frequency of one bacterial species would reduce the sensitivity of detecting the other.


Subject(s)
Arcobacter/isolation & purification , Feces/microbiology , Gram-Negative Bacterial Infections/epidemiology , Gram-Negative Bacterial Infections/microbiology , Animals , Arcobacter/genetics , Bacteriological Techniques , Campylobacter/genetics , Campylobacter/isolation & purification , Campylobacter Infections/epidemiology , Campylobacter Infections/microbiology , Campylobacter Infections/veterinary , Cattle , Cluster Analysis , Cohort Studies , Gram-Negative Bacterial Infections/veterinary , Logistic Models , Sheep
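
Prevalence contrasts such as housed versus pastured cattle are typically tested with a chi-square test on the 2 × 2 table. The snippet below uses hypothetical counts chosen only to roughly mirror the reported percentages.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x2 comparison of Arcobacter-positive faecal samples in housed
# cattle vs cattle at pasture (counts are hypothetical, chosen to roughly
# mirror the reported 50.1% vs 20.9% prevalences).
housed_pos, housed_n   = 501, 1000
pasture_pos, pasture_n = 209, 1000

table = np.array([[housed_pos, housed_n - housed_pos],
                  [pasture_pos, pasture_n - pasture_pos]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, df = {dof}, P = {p:.2g}")
```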
12.
Epidemiol Infect ; 141(4): 687-96, 2013 Apr.
Article in English | MEDLINE | ID: mdl-22687530

ABSTRACT

This study investigated the relationships between Legionnaires' disease (LD) incidence and weather in Glasgow, UK, by using advanced statistical methods. Using daily meteorological data and 78 LD cases with known exact date of onset, we fitted a series of Poisson log-linear regression models with explanatory variables for air temperature, relative humidity, wind speed and year, and sine-cosine terms for within-year seasonal variation. Our initial model showed an association between LD incidence and 2-day lagged humidity (positive, P = 0·0236) and wind speed (negative, P = 0·033). However, after adjusting for year-by-year and seasonal variation in cases there were no significant associations with weather. We also used normal linear models to assess the importance of short-term, unseasonable weather values. The most significant association was between LD incidence and air temperature residual lagged by 1 day prior to onset (P = 0·0014). The contextual role of unseasonably high air temperatures is worthy of further investigation. Our methods and results have further advanced understanding of the role which weather plays in risk of LD infection.


Subject(s)
Legionnaires' Disease/epidemiology , Linear Models , Weather , Adult , Aged , Aged, 80 and over , Female , Humans , Humidity , Incidence , Male , Middle Aged , Scotland/epidemiology , Seasons , Temperature , Wind
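
The model structure described above (a Poisson log-linear regression with lagged weather covariates, sine and cosine seasonal terms, and a year effect) can be written down directly. The sketch below fits that structure to simulated daily data; the variables and effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Simulated daily data standing in for the Glasgow series: case counts,
# air temperature, relative humidity and wind speed.
days = pd.date_range("2005-01-01", periods=2000, freq="D")
doy = days.dayofyear.to_numpy()
df = pd.DataFrame({
    "cases": rng.poisson(0.08, len(days)),
    "humidity": rng.normal(80, 8, len(days)),
    "wind": rng.gamma(2, 2, len(days)),
}, index=days)

# Two-day lagged weather covariates, sine/cosine terms for within-year
# seasonality, and a year factor.
df["humidity_lag2"] = df["humidity"].shift(2)
df["wind_lag2"] = df["wind"].shift(2)
df["sin_t"] = np.sin(2 * np.pi * doy / 365.25)
df["cos_t"] = np.cos(2 * np.pi * doy / 365.25)
df["year"] = df.index.year

fit = smf.glm("cases ~ humidity_lag2 + wind_lag2 + sin_t + cos_t + C(year)",
              data=df.dropna(), family=sm.families.Poisson()).fit()
print(fit.summary())
```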
13.
Epidemiol Infect ; 141(8): 1764-71, 2013 Aug.
Article in English | MEDLINE | ID: mdl-22995184

ABSTRACT

Meningococcal meningitis is a major public health problem in the African meningitis belt. Despite the obvious seasonality of epidemics, the factors driving them are still poorly understood. Here, we provide a first attempt to predict epidemics at the spatio-temporal scale required for in-year response, using a purely empirical approach. District-level weekly incidence rates for Niger (1986-2007) were discretized into latent, alert and epidemic states according to pre-specified epidemiological thresholds. We modelled the probabilities of transition between states, accounting for seasonality and spatio-temporal dependence. One-week-ahead predictions for entering the epidemic state were generated with specificity and negative predictive value >99%, sensitivity and positive predictive value >72%. On the annual scale, we predict the first entry of a district into the epidemic state with sensitivity 65·0%, positive predictive value 49·0%, and an average time gained of 4·6 weeks. These results could inform decisions on preparatory actions.


Subject(s)
Epidemics , Meningitis, Meningococcal/epidemiology , Models, Biological , Humans , Incidence , Markov Chains , Niger/epidemiology , Public Health , Seasons , Time Factors
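
The core of the approach above is to discretize weekly incidence into latent, alert and epidemic states and then model the transition probabilities between them. The snippet below shows the discretization and a crude empirical transition matrix for one hypothetical district; the thresholds are placeholders, and the paper additionally models seasonality and spatio-temporal dependence.

```python
import numpy as np
import pandas as pd

# Hypothetical weekly incidence (cases per 100,000) for one district, and
# illustrative thresholds for the latent / alert / epidemic states. The
# threshold values are placeholders, not those used in the paper.
incidence = pd.Series([0.5, 1.2, 3.0, 6.0, 11.0, 15.0, 9.0, 4.0, 2.0, 0.8])
alert_threshold, epidemic_threshold = 5.0, 10.0

states = pd.cut(incidence,
                bins=[-np.inf, alert_threshold, epidemic_threshold, np.inf],
                labels=["latent", "alert", "epidemic"])

# Empirical one-step transition counts between consecutive weeks; dividing each
# row by its total gives a crude estimate of the transition probabilities.
transitions = pd.crosstab(states[:-1].to_numpy(), states[1:].to_numpy(),
                          rownames=["from"], colnames=["to"])
print(transitions.div(transitions.sum(axis=1), axis=0).round(2))
```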
14.
Equine Vet J ; 44(3): 289-96, 2012 May.
Article in English | MEDLINE | ID: mdl-21848534

ABSTRACT

REASONS FOR PERFORMING STUDY: The increasing prevalence of antimicrobial-resistant bacteria such as methicillin-resistant Staphylococcus aureus (MRSA) and antimicrobial-resistant Escherichia coli represents a significant problem. However, the carriage of such bacteria by horses in the UK has not been well characterised. OBJECTIVES: To estimate the prevalence of nasal carriage of MRSA and faecal carriage of antimicrobial-resistant E. coli amongst horses in the general equine community of the mainland UK. METHODS: A cross-sectional study of horses recruited by 65 randomly selected equine veterinary practices was conducted, with nasal swabs and faecal samples collected. Faecal samples were cultured for antimicrobial-resistant E. coli. Nasal swabs were cultured for staphylococcal species; methicillin-resistant isolates identified as S. aureus were characterised by SCCmec and spa gene typing. Multilevel logistic regression models were used to calculate prevalence estimates with adjustment for clustering at practice and premises levels. Spatial variation in risk of antimicrobial resistance was also examined. RESULTS: In total, 650 faecal samples and 678 nasal swabs were collected from 692 horses located on 525 premises. The prevalence of faecal carriage of E. coli with resistance to any antimicrobial was 69.5% (95% CI 65.9-73.1%) and the prevalence of extended-spectrum β-lactamase (ESBL)-producing E. coli was 6.3% (95% CI 4.1-9.6%). The prevalence of nasal carriage of MRSA was 0.6% (95% CI 0.2-1.5%). Spatial analysis indicated variation across the UK for risk of carriage of resistant and multidrug-resistant (resistant to more than 3 antimicrobial classes) E. coli. CONCLUSIONS AND POTENTIAL RELEVANCE: Carriage of MRSA by horses in the community appears rare, but the prevalence of antimicrobial-resistant E. coli (including ESBL-producing E. coli) is higher. A high prevalence of antimicrobial-resistant bacteria could have significant health implications for the horse population of the UK.


Subject(s)
Anti-Bacterial Agents/pharmacology , Escherichia coli Infections/veterinary , Escherichia coli/drug effects , Horse Diseases/microbiology , Methicillin-Resistant Staphylococcus aureus/isolation & purification , Staphylococcal Infections/veterinary , Animals , Carrier State/epidemiology , Carrier State/microbiology , Carrier State/veterinary , Cross-Sectional Studies , Drug Resistance, Bacterial , Escherichia coli/isolation & purification , Escherichia coli Infections/epidemiology , Escherichia coli Infections/microbiology , Feces/microbiology , Female , Horse Diseases/epidemiology , Horses , Male , Methicillin Resistance , Methicillin-Resistant Staphylococcus aureus/drug effects , Nasal Mucosa/microbiology , Prevalence , Staphylococcal Infections/epidemiology , Staphylococcal Infections/microbiology , United Kingdom/epidemiology
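
Prevalence estimates in this study are adjusted for clustering of horses within practices and premises using multilevel logistic regression. A cluster bootstrap over practices, shown below on simulated data, is a simpler stand-in that likewise respects the clustered sampling when building a confidence interval.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)

# Hypothetical horse-level data: carriage of resistant E. coli (0/1) clustered
# within veterinary practices, with practice-level variation in carriage risk.
n_practice, horses_per = 40, 16
practice = np.repeat(np.arange(n_practice), horses_per)
practice_effect = rng.normal(0.0, 0.6, n_practice)[practice]
carriage = (rng.random(practice.size) < 1 / (1 + np.exp(-(0.8 + practice_effect)))).astype(int)
df = pd.DataFrame({"practice": practice, "resistant": carriage})

# Per-practice positive counts and sample sizes, then a cluster bootstrap:
# resample whole practices with replacement and recompute the prevalence.
grp = df.groupby("practice")["resistant"].agg(["sum", "count"])
boot = []
for _ in range(5000):
    idx = rng.integers(0, len(grp), len(grp))
    sample = grp.iloc[idx]
    boot.append(sample["sum"].sum() / sample["count"].sum())
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"prevalence {df['resistant'].mean():.1%} "
      f"(95% cluster-bootstrap CI {lo:.1%} to {hi:.1%})")
```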
15.
Ann Bot ; 108(4): 749-63, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21724655

ABSTRACT

BACKGROUND AND AIMS: Plants are sessile organisms that face selection by both herbivores and pollinators. Herbivores and pollinators may select on the same traits and/or mediate each other's effects. Erysimum capitatum (Brassicaceae) is a widespread and variable plant species with generalized pollination that is attacked by a number of herbivores. The following questions were addressed. (a) Are pollinators and herbivores attracted by similar plant traits? (b) Does herbivory affect pollinator preferences? (c) Do pollinators and/or herbivores affect fitness and select on plant traits? (d) Do plant compensatory responses affect the outcome of interactions among plants, pollinators and herbivores? (e) Do interactions among E. capitatum and its pollinators and herbivores differ among sites and years? METHODS: In 2005 and 2006, observational and experimental studies were combined in four populations at different elevations to examine selection by pollinators and herbivores on floral traits of E. capitatum. KEY RESULTS: Pollinator and herbivore assemblages varied spatially and temporally, as did their effects on plant fitness and selection. Both pollinators and herbivores preferred plants with more flowers, and herbivory sometimes reduced pollinator visitation. Pollinators did not select on plant traits in any year or population and E. capitatum was not pollen limited; however, supplemental pollen resulted in altered plant resource allocation. Herbivores reduced fitness and selected for plant traits in some populations, and these effects were mediated by plant compensatory responses. CONCLUSIONS: Individuals of Erysimum capitatum are visited by diverse groups of pollinators and herbivores that shift in abundance and importance in time and space. Compensatory reproductive mechanisms mediate interactions with both pollinators and herbivores and may allow E. capitatum to succeed in this complex selective environment.


Subject(s)
Erysimum/physiology , Herbivory/physiology , Pollination/physiology , Animals , Ecotype , Flowers/physiology , Models, Biological , Time Factors
16.
J Hosp Infect ; 78(4): 256-9, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21669476

ABSTRACT

Blood culture is a vital investigation and can be the first step in obtaining a definitive diagnosis in a patient with presumed sepsis, but can also have serious adverse consequences for the patient. The aim of this study was to evaluate the extent of the blood culture contamination problem at the Lancashire Teaching Hospitals (LTH) and to assess the impact of the introduction of a new blood culture collection kit on the contamination rate. The blood culture contamination rate at the LTH before the introduction of the blood culture collection kit was 9.2%. A fall in contamination rate was observed after kit introduction, to 3.8%, a proportion approaching the American Society for Microbiology's recommended standard of ≤3%. The reduction in contamination was associated with an unintended, yet sustained, reduction in the total number of blood culture sets collected and an unwanted reduction in the number of genuine Gram-negative bacteraemias. This reduction may reflect education and training issues at the time of the introduction. In the era of 'root cause analyses', it may also reflect fears by junior colleagues of the consequences of being found responsible for a blood culture contaminant. The study recommended continuing with the blood culture collection kit, but ensuring regular training and education sessions are carried out in a no-blame manner.


Subject(s)
Blood/microbiology , Microbiological Techniques/methods , Sepsis/diagnosis , Sepsis/etiology , Specimen Handling/methods , Hospitals , Humans , Quality Control , United Kingdom
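
The before-and-after contamination rates quoted above can be compared with a two-proportion z-test and Wilson confidence intervals. The counts below are hypothetical, chosen only to reproduce the reported 9.2% and 3.8% rates.

```python
from statsmodels.stats.proportion import proportions_ztest, proportion_confint

# Hypothetical numbers of contaminated sets out of total blood culture sets
# before and after the collection-kit introduction (placeholder denominators).
contaminated = [184, 57]     # before, after
total_sets   = [2000, 1500]

z, p = proportions_ztest(contaminated, total_sets)
print(f"z = {z:.2f}, P = {p:.3g}")
for label, k, n in zip(["before", "after"], contaminated, total_sets):
    lo, hi = proportion_confint(k, n, method="wilson")
    print(f"{label}: {k / n:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```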
17.
J Hosp Infect ; 79(1): 32-7, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21684038

ABSTRACT

The standard approach for norovirus control in hospitals in the UK, as outlined by the Health Protection Agency guidance and implemented previously by Lancashire Teaching Hospitals, involves the early closure of affected wards. However, this has a major impact on bed-days lost and cancelled admissions. In 2008, a new strategy was introduced in the study hospital, key elements of which included closure of affected ward bays (rather than wards), installation of bay doors, enhanced cleaning, a rapid in-house molecular test and an enlarged infection control team. The impact of these changes was assessed by comparing two norovirus seasons (2007-08 and 2009-10) before and after implementation of the new strategy, expressing the contrast between seasons as a ratio (r) of expected counts in the two seasons. There was a significant decrease in the ratio of confirmed hospital outbreaks to community outbreaks (r = 0.317, P = 0.025), the number of days of restricted admissions on hospital wards per outbreak (r = 0.742, P = 0.041), and the number of hospital bed-days lost per outbreak (r = 0.344, P < 0.001). However, there was no significant change in the number of patients affected per hospital outbreak (r = 1.080, P = 0.517), or the number of hospital staff affected per outbreak (r = 0.651, P = 0.105). Closure of entire wards during norovirus outbreaks is not always necessary. The changes implemented at the study hospital resulted in a significant reduction in the number of bed-days lost per outbreak, and this, together with a reduction in outbreak frequency, resulted in considerable cost savings.


Subject(s)
Caliciviridae Infections/epidemiology , Caliciviridae Infections/prevention & control , Cross Infection/epidemiology , Cross Infection/prevention & control , Disease Outbreaks , Infection Control/methods , Norovirus/isolation & purification , Health Services Research , Hospital Units , Humans , United Kingdom/epidemiology
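
The seasons are compared above as ratios (r) of expected counts. A rough illustration of that kind of contrast is a rate ratio with a Wald interval on the log scale, as sketched below with placeholder numbers; the paper's ratios come from a model of expected counts rather than this simple calculation.

```python
import numpy as np
from scipy import stats

# Hypothetical bed-days lost and outbreak counts for the two norovirus seasons
# (placeholder values, not the paper's data).
bed_days_lost = np.array([1300, 450])   # 2007-08, 2009-10
outbreaks     = np.array([25, 25])

rate = bed_days_lost / outbreaks
ratio = rate[1] / rate[0]
# Wald CI on the log rate ratio, treating bed-days lost per season as Poisson.
se = np.sqrt(1 / bed_days_lost[0] + 1 / bed_days_lost[1])
z = stats.norm.ppf(0.975)
lo, hi = np.exp(np.log(ratio) + np.array([-z, z]) * se)
print(f"bed-days lost per outbreak, season ratio r = {ratio:.3f} "
      f"(95% CI {lo:.3f} to {hi:.3f})")
```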
18.
Epidemiol Infect ; 139(12): 1854-62, 2011 Dec.
Article in English | MEDLINE | ID: mdl-21303589

ABSTRACT

The AEGISS (Ascertainment and Enhancement of Disease Surveillance and Statistics) project uses spatio-temporal statistical methods to identify anomalies in the incidence of gastrointestinal infections in the UK. The focus of this paper is the modelling of temporal variation in incidence using data from the Southampton area in southern England. We identified and fitted a hierarchical stochastic model for the time series of daily incident cases to enable probabilistic prediction of temporal variation in risk, and demonstrated the resulting gains in predictive accuracy by comparison with a conventional analysis based on an over-dispersed Poisson log-linear regression model. We used Bayesian methods of inference in order to incorporate parameter uncertainty in our predictive inference of risk. Incorporation of our model into the overall spatio-temporal model will contribute to the accurate and timely prediction of unusually high food-poisoning incidence, and thus to the identification and prevention of future outbreaks.


Subject(s)
Foodborne Diseases/epidemiology , Gastrointestinal Diseases/epidemiology , Models, Biological , Bayes Theorem , England/epidemiology , Foodborne Diseases/microbiology , Foodborne Diseases/prevention & control , Gastrointestinal Diseases/microbiology , Gastrointestinal Diseases/prevention & control , Humans , Incidence , Monte Carlo Method , Population Surveillance , Regression Analysis , Risk Assessment , Space-Time Clustering , Stochastic Processes
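
The conventional benchmark mentioned above is an over-dispersed Poisson log-linear regression of daily counts. One simple way to allow for extra-Poisson variation is a negative binomial GLM with seasonal and day-of-week terms, sketched below on simulated data; the hierarchical Bayesian model the paper prefers is not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Simulated daily counts of gastrointestinal cases with seasonality and
# overdispersion, standing in for the Southampton surveillance series.
days = pd.date_range("2001-01-01", periods=1500, freq="D")
doy = days.dayofyear.to_numpy()
mu = np.exp(1.0 + 0.4 * np.sin(2 * np.pi * doy / 365.25))
df = pd.DataFrame({"cases": rng.negative_binomial(5, 5 / (5 + mu)),
                   "sin_t": np.sin(2 * np.pi * doy / 365.25),
                   "cos_t": np.cos(2 * np.pi * doy / 365.25),
                   "dow": days.dayofweek}, index=days)

# Negative binomial log-linear model with harmonic seasonal terms and a
# day-of-week factor to absorb reporting patterns.
fit = smf.glm("cases ~ sin_t + cos_t + C(dow)", data=df,
              family=sm.families.NegativeBinomial()).fit()
print(fit.summary())
```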
19.
Epidemiol Infect ; 139(11): 1661-71, 2011 Nov.
Article in English | MEDLINE | ID: mdl-21134320

ABSTRACT

Multi-locus sequence typing was performed on 1003 Campylobacter jejuni isolates collected in a 2-year longitudinal study of 15 dairy farms and four sheep farms in Lancashire, UK. There was considerable farm-level variation in the occurrence and prevalence of clonal complexes (CC). Clonal complexes ST61, ST21, ST403 and ST45 were most prevalent in cattle, while in sheep CC ST42, ST21, ST48 and ST52 were most prevalent. CC ST45, a complex previously shown to be more common in summer months in human cases, was more prevalent in summer in our ruminant samples. Gene flow analysis demonstrated a high level of genetic heterogeneity at the within-farm level. Sequence-type diversity was greater in cattle compared to sheep, in cattle at pasture vs. housed cattle, and in isolates from farms on the Pennines compared to the Southern Fylde. Sequence-type diversity was greatest in isolates belonging to CC ST21, ST45 and ST206.


Subject(s)
Campylobacter Infections/veterinary , Campylobacter jejuni/genetics , Cattle Diseases/epidemiology , Sheep Diseases/epidemiology , Animals , Campylobacter Infections/epidemiology , Campylobacter Infections/microbiology , Campylobacter jejuni/isolation & purification , Cattle , Cattle Diseases/microbiology , Cross-Sectional Studies , England/epidemiology , Feces/microbiology , Female , Genetic Variation , Longitudinal Studies , Male , Molecular Epidemiology , Multilocus Sequence Typing , Multivariate Analysis , Sheep , Sheep Diseases/microbiology
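
Sequence-type diversity comparisons of the kind reported above are often summarised with Simpson's index of diversity. The helper below computes the simple plug-in form of the index for two invented lists of sequence types; the ST labels and counts are arbitrary, not the study's 1003 isolates.

```python
from collections import Counter

def simpsons_diversity(types):
    """Simpson's index of diversity, 1 - sum(p_i^2): the probability that two
    isolates drawn at random have different sequence types (plug-in estimate)."""
    counts = Counter(types)
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# Hypothetical sequence-type lists for cattle and sheep isolates.
cattle_sts = ["ST61"] * 30 + ["ST21"] * 25 + ["ST403"] * 20 + ["ST45"] * 15 + ["ST19"] * 10
sheep_sts  = ["ST42"] * 45 + ["ST21"] * 30 + ["ST48"] * 15 + ["ST52"] * 10

print(f"cattle: {simpsons_diversity(cattle_sts):.3f}")
print(f"sheep:  {simpsons_diversity(sheep_sts):.3f}")
```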
20.
Ann Bot ; 106(2): 309-19, 2010 Aug.
Article in English | MEDLINE | ID: mdl-20519237

ABSTRACT

BACKGROUND AND AIMS: Variability in embryo development can influence the rate of seed maturation and seed size, which may have an impact on offspring fitness. While it is expected that embryo development will be under maternal control, more controversial hypotheses suggest that the pollen donor and the embryo itself may influence development. These latter possibilities are, however, poorly studied. Characteristics of 10-d-old embryos and seeds of wild radish (Raphanus sativus) were examined to address: (a) the effects of maternal plant and pollen donor on development; (b) the effects of earlier reproductive events (pollen tube growth and fertilization) on embryos and seeds, and the influence of embryo size on mature seed mass; (c) the effect of water stress on embryos and seeds; (d) the effect of stress on correlations of embryo and seed characteristics with earlier and later reproductive events and stages; and (e) changes in maternal and paternal effects on embryo and seed characteristics during development. METHODS: Eight maternal plants (two each from four families) and four pollen donors were crossed, and developing gynoecia were collected at 10 d post-pollination. Half of the maternal plants experienced water stress. Characteristics of embryos and seeds were summarized and also compared with earlier and later developmental stages. KEY RESULTS: In addition to the expected effects of the maternal plants, all embryo characters differed among pollen donors. Paternal effects varied over time, suggesting that there are windows of opportunity for pollen donors to influence embryo development. Water-stress treatment altered embryo characteristics; embryos were smaller and less developed. In addition, correlations of embryo characteristics with earlier and later stages changed dramatically with water stress. CONCLUSIONS: The expected maternal effects on embryo development were observed, but there was also evidence for an early paternal role. The relative effects of these controls may change over time. Thus, there may be times in development when selection on the maternal, paternal or embryo contributions to development is more or less likely.


Subject(s)
Raphanus/embryology , Seeds/embryology , Pollen/physiology , Raphanus/genetics , Seeds/genetics