Results 1 - 20 of 39
1.
Eur Heart J Open ; 3(3): oead037, 2023 May.
Article in English | MEDLINE | ID: mdl-37143610

ABSTRACT

Aims: In patients with non-valvular atrial fibrillation (NVAF) prescribed warfarin, the association between guideline-defined international normalised ratio (INR) control and adverse outcomes is unknown. We aimed to (i) determine stroke and systemic embolism (SSE) and bleeding events in NVAF patients prescribed warfarin; and (ii) estimate the increased risk of these adverse events associated with poor INR control in this population. Methods and results: Individual-level, population-scale linked patient data were used to investigate the association between INR control and both SSE and bleeding events, using the National Institute for Health and Care Excellence (NICE) criteria of poor INR control [time in therapeutic range (TTR) <65%, two INRs <1.5 or two INRs >5 in a 6-month period, or any INR >8]. A total of 35 891 patients were included for SSE and 35 035 for bleeding outcome analyses. Mean CHA2DS2-VASc score was 3.5 (SD = 1.7), and the mean follow-up was 4.3 years for both analyses. Mean TTR was 71.9%, with 34% of time spent in poor INR control according to NICE criteria. SSE and bleeding event rates (per 100 patient-years) were 1.01 (95% CI 0.95-1.08) and 3.4 (95% CI 3.3-3.5), respectively, during adequate INR control, rising to 1.82 (95% CI 1.70-1.94) and 4.8 (95% CI 4.6-5.0) during poor INR control. Poor INR control was independently associated with increased risk of both SSE [HR = 1.69 (95% CI 1.54-1.86), P < 0.001] and bleeding [HR = 1.40 (95% CI 1.33-1.48), P < 0.001] in Cox multivariable models. Conclusion: Guideline-defined poor INR control is associated with significantly higher SSE and bleeding event rates, independent of recognised risk factors for stroke or bleeding.
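Illustrative sketch (not the study's code): the exposure measure above rests on time in therapeutic range and the NICE markers of poor INR control. The snippet below shows, under stated assumptions (Rosendaal-style linear interpolation between INR measurements, a 2.0-3.0 target range, and a 182-day window standing in for "6 months"), how those quantities might be computed for a single patient.

```python
import numpy as np

def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """TTR (%) by linear interpolation between consecutive INR measurements."""
    in_range = total = 0.0
    for (d0, y0), (d1, y1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        if span <= 0:
            continue
        ys = np.linspace(y0, y1, int(span) + 1)          # interpolated daily INR values
        in_range += np.mean((ys >= low) & (ys <= high)) * span
        total += span
    return 100 * in_range / total if total else np.nan

def nice_poor_control(days, inrs, ttr, window_days=182):
    """Flag the markers quoted above: TTR <65%, two INRs <1.5 or two INRs >5
    within a 6-month period, or any INR >8."""
    days, inrs = np.asarray(days), np.asarray(inrs)
    if ttr < 65 or (inrs > 8).any():
        return True
    for mask in (inrs < 1.5, inrs > 5):
        d = days[mask]
        if len(d) >= 2 and np.diff(d).min() <= window_days:
            return True
    return False

inr_days = [0, 30, 60, 95, 130]        # days since warfarin initiation (toy data)
inr_vals = [2.4, 1.3, 1.4, 3.1, 2.6]
ttr = rosendaal_ttr(inr_days, inr_vals)
print(f"TTR = {ttr:.1f}%, poor control = {nice_poor_control(inr_days, inr_vals, ttr)}")
```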

2.
Age Ageing ; 51(12)2022 12 05.
Article in English | MEDLINE | ID: mdl-36469089

ABSTRACT

BACKGROUND: dementia may increase care home residents' risk of COVID-19, but there is a lack of evidence on this effect and on interactions with individual and care home-level factors. METHODS: we created a national cross-sectional retrospective cohort of care home residents in Wales for 1 September to 31 December 2020. Risk factors were analysed using multi-level logistic regression to model the likelihood of SARS-CoV-2 infection and mortality. RESULTS: the cohort included 9,571 individuals in 673 homes. Dementia was diagnosed in 5,647 individuals (59%); 1,488 (15.5%) individuals tested positive for SARS-CoV-2. We estimated the effects of age, dementia, frailty, care home size, proportion of residents with dementia, nursing and dementia services, communal space and region. The final model included the proportion of residents with dementia (OR for positive test 4.54 (95% CI 1.55-13.27) where 75% of residents had dementia compared to no residents with dementia) and frailty (OR 1.29 (95% CI 1.05-1.59) for severe frailty compared with no frailty). Analysis suggested 76% of the variation was due to setting rather than individual factors. Additional analysis suggested that severe frailty and the proportion of residents with dementia were associated with all-cause mortality, as was dementia diagnosis. Mortality analyses were challenging to interpret. DISCUSSION: whilst individual frailty increased the risk of COVID-19 infection, dementia was a risk factor at care home level but not at individual level. These findings suggest that whole-setting interventions, particularly in homes with high proportions of residents with dementia and including residents with low/no individual risk factors, may reduce the impact of COVID-19.


Subject(s)
COVID-19 , Dementia , Frailty , Humans , SARS-CoV-2 , COVID-19/epidemiology , COVID-19/therapy , Nursing Homes , Retrospective Studies , Prevalence , Incidence , Cross-Sectional Studies , Frailty/diagnosis , Frailty/epidemiology , Dementia/diagnosis , Dementia/epidemiology , Dementia/therapy
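A worked illustration of one figure quoted above: in a two-level (resident-within-home) logistic model, the share of variation attributed to the setting is commonly reported as the latent-response intraclass correlation. The between-home variance below is an assumed value chosen only to show the arithmetic, not an estimate from the paper.

```python
import math

def care_home_icc(sigma2_home):
    """Latent-response ICC for a two-level logistic model:
    between-home variance / (between-home variance + pi^2 / 3)."""
    return sigma2_home / (sigma2_home + math.pi ** 2 / 3)

# A random-intercept variance of about 10.4 on the logit scale would place
# roughly 76% of the variation at the care-home level, as in the abstract.
print(round(care_home_icc(10.4), 2))   # 0.76
```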
3.
J R Soc Med ; 115(12): 467-478, 2022 12.
Article in English | MEDLINE | ID: mdl-35796183

ABSTRACT

OBJECTIVES: To better understand the risk of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare workers, leading to recommendations for the prioritisation of personal protective equipment, testing, training and vaccination. DESIGN: Observational, longitudinal, national cohort study. SETTING: Our cohort were secondary care (hospital-based) healthcare workers employed by NHS Wales (United Kingdom) organisations from 1 April 2020 to 30 November 2020. PARTICIPANTS: We included 577,756 monthly observations among 77,587 healthcare workers. Using linked anonymised datasets, participants were grouped into 20 staff roles. Additionally, each role was deemed either patient-facing, non-patient-facing or undetermined. This was linked to individual demographic details and dates of positive SARS-CoV-2 PCR tests. MAIN OUTCOME MEASURES: We used univariable and multivariable logistic regression models to determine odds ratios (ORs) for the risk of a positive SARS-CoV-2 PCR test. RESULTS: Patient-facing healthcare workers were at the highest risk of SARS-CoV-2 infection, with an adjusted OR of 2.28 (95% confidence interval [CI] 2.10-2.47). We found that, after adjustment, foundation year doctors (OR 1.83 [95% CI 1.47-2.27]), healthcare support workers (OR 1.36 [95% CI 1.20-1.54]) and hospital nurses (OR 1.27 [95% CI 1.12-1.44]) were at the highest risk of infection among all staff groups. Younger healthcare workers and those living in more deprived areas were at a higher risk of infection. We also observed that infection rates varied over time and by organisation. CONCLUSIONS: These findings have important policy implications for the prioritisation of vaccination, testing, training and personal protective equipment provision for patient-facing roles and the higher risk staff groups.


Subject(s)
COVID-19 , Humans , Cohort Studies , Longitudinal Studies , COVID-19/epidemiology , SARS-CoV-2 , United Kingdom/epidemiology , Health Personnel
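A hedged sketch of the kind of model described above: multivariable logistic regression giving adjusted odds ratios for a positive PCR test. The data, column names and coefficients are synthetic and purely illustrative; only the modelling pattern (fit a logit, exponentiate coefficients and confidence limits) reflects the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "patient_facing": rng.integers(0, 2, n),
    "age": rng.integers(20, 66, n),
    "deprivation_quintile": rng.integers(1, 6, n),   # assume 1 = most deprived
})
# Simulate higher infection odds for patient-facing and younger staff (toy effects)
logit_p = -3 + 0.8 * df.patient_facing - 0.02 * (df.age - 40) - 0.1 * (df.deprivation_quintile - 3)
df["positive"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("positive ~ patient_facing + age + C(deprivation_quintile)", data=df).fit(disp=0)
summary = pd.DataFrame({"OR": np.exp(fit.params),            # adjusted odds ratios
                        "2.5%": np.exp(fit.conf_int()[0]),   # lower 95% confidence limit
                        "97.5%": np.exp(fit.conf_int()[1])}) # upper 95% confidence limit
print(summary.round(2))
```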
4.
Age Ageing ; 51(5)2022 05 01.
Article in English | MEDLINE | ID: mdl-35291009

ABSTRACT

BACKGROUND: defining features of the COVID-19 pandemic in many countries were the tragic extent to which care home residents were affected and the difficulty in preventing the introduction and subsequent spread of infection. Management of risk in care homes requires good evidence on the most important transmission pathways. One hypothesised route at the start of the pandemic, prior to widespread testing, was the transfer of patients from hospitals that were experiencing high levels of nosocomial events. METHODS: we tested the hypothesis that hospital discharge events increased the intensity of care home cases using a national individually linked health record cohort in Wales, UK. We monitored 186,772 hospital discharge events over the period from March to July 2020, tracking individuals to 923 care homes and recording the daily case rate in the homes populated by 15,772 residents. We estimated the risk of an increase in case rates following exposure to a hospital discharge using multi-level hierarchical logistic regression and a novel stochastic Hawkes process outbreak model. FINDINGS: in regression analysis, after adjusting for care home size, we found no significant association between hospital discharge and subsequent increases in care home case numbers (odds ratio: 0.99, 95% CI: 0.82, 1.90). Risk factors for increased cases included care home size, care home resident density and provision of nursing care. Using our outbreak model, we found a significant effect of hospital discharge on the subsequent intensity of cases. However, the effect was small and considerably less than the effect of care home size, suggesting the highest risk of introduction came from interaction with the community. We estimated that approximately 1.8% of hospital discharged patients may have been infected. INTERPRETATION: there is growing evidence in the UK that the risk of transfer of COVID-19 from the high-risk hospital setting to the high-risk care home setting during the early stages of the pandemic was relatively small. Although access to testing was limited to initial symptomatic cases in each care home at this time, our results suggest that reduced numbers of discharges, selection of patients and action taken within care homes following transfer all may have contributed to the mitigation. The precise key transmission routes from the community remain to be quantified.


Subject(s)
COVID-19 , COVID-19/epidemiology , Hospitals , Humans , Nursing Homes , Pandemics/prevention & control , Patient Discharge , United Kingdom/epidemiology
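The outbreak model named above is a stochastic Hawkes process; the snippet below is not the authors' model but a minimal sketch of the underlying idea: the conditional intensity of new cases in a home combines a background rate, self-excitation from recent cases, and an extra exogenous term triggered by hospital discharges. All parameter values are invented for illustration.

```python
import numpy as np

def case_intensity(t, case_times, discharge_times,
                   mu=0.02, alpha=0.4, beta=0.3, gamma=0.05, delta=0.1):
    """lambda(t) = mu + sum_i alpha*exp(-beta*(t - t_i))   (past cases)
                      + sum_j gamma*exp(-delta*(t - d_j))  (past discharges)"""
    case_times = np.asarray(case_times, dtype=float)
    discharge_times = np.asarray(discharge_times, dtype=float)
    past_cases = case_times[case_times < t]
    past_discharges = discharge_times[discharge_times < t]
    self_excitation = alpha * np.exp(-beta * (t - past_cases)).sum()
    discharge_term = gamma * np.exp(-delta * (t - past_discharges)).sum()
    return mu + self_excitation + discharge_term

# Daily case intensity 5 days after a discharge on day 10, with cases on days 12 and 14
print(round(case_intensity(15, case_times=[12, 14], discharge_times=[10]), 3))
```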
5.
Age Ageing ; 51(1)2022 01 06.
Article in English | MEDLINE | ID: mdl-34850818

ABSTRACT

BACKGROUND: vaccinations for COVID-19 have been prioritised for older people living in care homes. However, vaccination trials included limited numbers of older people. AIM: we aimed to study infection rates of SARS-CoV-2 for older care home residents following vaccination and identify factors associated with increased risk of infection. STUDY DESIGN AND SETTING: we conducted an observational data-linkage study including 14,104 vaccinated older care home residents in Wales (UK) using anonymised electronic health records and administrative data. METHODS: we used Cox proportional hazards models to estimate hazard ratios (HRs) for the risk of testing positive for SARS-CoV-2 infection following vaccination, after landmark times of either 7 or 21 days post-vaccination. We adjusted HRs for age, sex, frailty, prior SARS-CoV-2 infections and vaccination type. RESULTS: we observed a small proportion of care home residents with a positive polymerase chain reaction (PCR) test following vaccination (1.05%, N = 148), with 90% of infections occurring within 28 days. For the 7-day landmark analysis we found a reduced risk of SARS-CoV-2 infection for vaccinated individuals who had a previous infection; HR (95% confidence interval) 0.54 (0.30, 0.95). For the 21-day landmark analysis, we observed high HRs for individuals with low and intermediate frailty compared with those without: 4.59 (1.23, 17.12) and 4.85 (1.68, 14.04), respectively. CONCLUSIONS: increased risk of infection after 21 days was associated with frailty. We found most infections occurred within 28 days of vaccination, suggesting extra precautions to reduce transmission risk should be taken in this time frame.


Subject(s)
COVID-19 , Aged , Cohort Studies , Humans , Longitudinal Studies , SARS-CoV-2 , Wales/epidemiology
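A hedged sketch of a landmark Cox analysis of the type described above, using the lifelines library on toy data: residents whose follow-up ends before the landmark are excluded, the clock is reset at the landmark, and hazard ratios are read off as exponentiated coefficients. Variable names and data are illustrative assumptions, not the study's.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "days_from_vaccination": np.round(rng.exponential(120, n)) + 1,  # time to test/censoring
    "tested_positive": rng.integers(0, 2, n),                        # event indicator (toy)
    "age": rng.integers(65, 100, n),
    "frailty_severe": rng.integers(0, 2, n),
    "prior_infection": rng.integers(0, 2, n),
})

LANDMARK = 7                                                # or 21, as in the abstract
lm = df[df.days_from_vaccination > LANDMARK].copy()         # drop events/censoring before landmark
lm["time_from_landmark"] = lm.days_from_vaccination - LANDMARK

cols = ["time_from_landmark", "tested_positive", "age", "frailty_severe", "prior_infection"]
cph = CoxPHFitter()
cph.fit(lm[cols], duration_col="time_from_landmark", event_col="tested_positive")
print(cph.hazard_ratios_.round(2))                          # adjusted hazard ratios
```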
6.
Eur J Prev Cardiol ; 28(8): 854-861, 2021 07 23.
Article in English | MEDLINE | ID: mdl-34298561

ABSTRACT

AIMS: European Society of Cardiology/European Atherosclerosis Society 2019 guidelines recommend more aggressive lipid targets in high- and very high-risk patients and the addition of adjuvant treatments to statins in uncontrolled patients. We aimed to assess (a) achievement of prior and new European Society of Cardiology/European Atherosclerosis Society lipid targets and (b) lipid-lowering therapy prescribing in a nationwide cohort of very high-risk patients. METHODS: We conducted a retrospective observational population study using linked health data in patients undergoing percutaneous coronary intervention (2012-2017). Follow-up was for one year post-discharge. RESULTS: Altogether, 10,071 patients had a documented low-density lipoprotein cholesterol (LDL-C) level, of whom 48% had LDL-C <1.8 mmol/l (2016 target) and 23% <1.4 mmol/l (2019 target). A total of 5,340 patients had non-high-density lipoprotein cholesterol (non-HDL-C) documented, with 57% <2.6 mmol/l (2016) and 37% <2.2 mmol/l (2019). In patients with recurrent vascular events, fewer than 6% achieved the 2019 LDL-C target of <1.0 mmol/l. A total of 10,592 patients had triglyceride (TG) levels documented, of whom 14% were ≥2.3 mmol/l and 41% ≥1.5 mmol/l (2019). High-intensity statins were prescribed in 56.4% of the cohort; only 3% were prescribed ezetimibe, fibrates or prescription-grade N-3 fatty acids. Prescribing of these agents was lower amongst patients above target LDL-C, non-HDL-C and triglyceride levels. Females were more likely to have LDL-C, non-HDL-C and triglyceride levels above target. CONCLUSION: There was a low rate of achievement of the new European Society of Cardiology/European Atherosclerosis Society lipid targets in this large post-percutaneous coronary intervention population and relatively low rates of intensive lipid-lowering therapy prescribing in those with uncontrolled lipids. There is considerable potential to optimise lipid-lowering therapy further through statin intensification and appropriate use of novel lipid-lowering therapy, especially in women.


Subject(s)
Hydroxymethylglutaryl-CoA Reductase Inhibitors , Percutaneous Coronary Intervention , Aftercare , Female , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Lipids , Patient Discharge , Percutaneous Coronary Intervention/adverse effects , Retrospective Studies
7.
BMJ Paediatr Open ; 5(1): e001049, 2021.
Article in English | MEDLINE | ID: mdl-34192199

ABSTRACT

Background: Better understanding of the role that children and school staff play in the transmission of SARS-CoV-2 is essential to guide policy development on controlling infection while minimising disruption to children's education and well-being. Methods: Our national e-cohort (n=464,531) study used anonymised linked data for pupils, staff and associated households linked via educational settings in Wales. We estimated the odds of testing positive for SARS-CoV-2 infection for staff and pupils over the period August-December 2020, dependent on measures of recent exposure to known cases linked to their educational settings. Results: The total number of cases in a school was not associated with a subsequent increase in the odds of testing positive (staff OR per case: 0.92, 95% CI 0.85 to 1.00; pupil OR per case: 0.98, 95% CI 0.93 to 1.02). Among pupils, the number of recent cases within the same year group was significantly associated with subsequent increased odds of testing positive (OR per case: 1.12, 95% CI 1.08 to 1.15). These effects were adjusted for a range of demographic covariates, and in particular any known cases within the same household, which had the strongest association with testing positive (staff OR: 39.86, 95% CI 35.01 to 45.38; pupil OR: 9.39, 95% CI 8.94 to 9.88). Conclusions: In a national school cohort, the odds of staff testing positive for SARS-CoV-2 infection were not significantly increased in the 14-day period after case detection in the school. However, pupils were found to be at increased odds following cases appearing within their own year group, where most of their contacts occur. Strong mitigation measures over the whole of the study period may have reduced wider spread within the school environment.


Subject(s)
COVID-19 , Child , Humans , SARS-CoV-2 , Schools , Semantic Web , Wales/epidemiology
8.
Influenza Other Respir Viruses ; 15(3): 371-380, 2021 05.
Article in English | MEDLINE | ID: mdl-33547872

ABSTRACT

BACKGROUND: The population of adult residential care homes has been shown to have high morbidity and mortality in relation to COVID-19. METHODS: We examined 3115 hospital discharges to a national cohort of 1068 adult care homes and subsequent outbreaks of COVID-19 occurring between 22 February and 27 June 2020. A Cox proportional hazards regression model was used to assess the impact of time-dependent exposure to hospital discharge on incidence of the first known outbreak, over a window of 7-21 days after discharge, and adjusted for care home characteristics, including size and type of provision. RESULTS: A total of 330 homes experienced an outbreak, and 544 homes received a discharge over the study period. Exposure to hospital discharge was not associated with a significant increase in the risk of a new outbreak (hazard ratio 1.15, 95% CI 0.89, 1.47, P = .29) after adjusting for care home characteristics. Care home size was the most significant predictor. Hazard ratios (95% CI) in comparison with homes of <10 residents were as follows: 3.40 (1.99, 5.80) for 10-24 residents; 8.25 (4.93, 13.81) for 25-49 residents; and 17.35 (9.65, 31.19) for 50+ residents. When stratified for care home size, the outbreak rates were similar for periods when homes were exposed to a hospital discharge, in comparison with periods when homes were unexposed. CONCLUSION: Our analyses showed that large homes were at considerably greater risk of outbreaks throughout the epidemic, and after adjusting for care home size, a discharge from hospital was not associated with a significant increase in risk.


Subject(s)
COVID-19/epidemiology , Disease Outbreaks , Nursing Homes , SARS-CoV-2 , Cohort Studies , Humans , Patient Discharge , Proportional Hazards Models
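A sketch (synthetic data, not the authors' code) of how a time-dependent exposure such as "recent hospital discharge" is typically handled in a Cox model: follow-up for each home is split into intervals, the exposure flag switches on only during the 7-21-day post-discharge window, and the model is fitted on the long-format table with lifelines' CoxTimeVaryingFitter. Parameters and data structure are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(2)
rows = []
for home in range(400):
    beds = int(rng.integers(5, 80))
    outbreak_day = rng.exponential(90) + 1
    discharge_day = rng.uniform(0, 120)
    end = min(outbreak_day, 126)                      # ~18-week observation window
    event = outbreak_day <= 126
    # split follow-up at the boundaries of the post-discharge exposure window
    cuts = sorted(c for c in {0.0, discharge_day + 7, discharge_day + 21, end} if c <= end)
    for start, stop in zip(cuts, cuts[1:]):
        exposed = discharge_day + 7 <= start < discharge_day + 21
        rows.append(dict(home=home, start=start, stop=stop, beds=beds,
                         recent_discharge=int(exposed),
                         outbreak=int(bool(event) and stop == end)))

long_df = pd.DataFrame(rows)
ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="home", event_col="outbreak", start_col="start", stop_col="stop")
print(np.exp(ctv.params_).round(2))                   # hazard ratios for exposure and home size
```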
9.
Eur Heart J Cardiovasc Pharmacother ; 7(1): 40-49, 2021 01 16.
Article in English | MEDLINE | ID: mdl-31774502

ABSTRACT

AIMS: In patients with non-valvular atrial fibrillation prescribed warfarin, the UK National Institute for Health and Care Excellence (NICE) defines poor anticoagulation as a time in therapeutic range (TTR) of <65%, any two international normalized ratios (INRs) ≤1.5 ('low') within a 6-month period, two INRs ≥5 within 6 months, or any INR ≥8 ('high'). Our objectives were to (i) quantify the number of patients with poor INR control and (ii) describe the demographic and clinical characteristics associated with poor INR control. METHODS AND RESULTS: Linked anonymized health record data for Wales, UK (2006-2017) were used to evaluate patients prescribed warfarin who had at least 6 months of INR data. A total of 32 380 patients were included. In total, 13 913 (43.0%) patients had at least one of the NICE markers of poor INR control. Importantly, among the 24 123 (74.6%) of the cohort with an acceptable TTR (≥65%), 5676 (23.5%) had either low or high INR readings at some point in their history. In a multivariable regression, female gender, age (≥75 years), excess alcohol, diabetes, heart failure, ischaemic heart disease, and respiratory disease were independently associated with all markers of poor INR control. CONCLUSION: INR control according to NICE standards is frequently poor. Of those with an acceptable TTR (≥65%), one-quarter still had unacceptably low or high INR levels according to NICE criteria. Thus, using only TTR to assess effectiveness with warfarin has the potential to miss a large number of patients with non-therapeutic INRs who are likely to be at increased risk.


Subject(s)
Atrial Fibrillation , Warfarin , Aged , Atrial Fibrillation/drug therapy , Female , Humans , International Normalized Ratio , Male , Warfarin/therapeutic use
10.
Age Ageing ; 50(1): 25-31, 2021 01 08.
Article in English | MEDLINE | ID: mdl-32951042

ABSTRACT

BACKGROUND: mortality in care homes has had a prominent focus during the COVID-19 outbreak. Care homes are particularly vulnerable to the spread of infectious diseases, which may lead to increased mortality risk. Multiple and interconnected challenges face the care home sector in the prevention and management of outbreaks of COVID-19, including adequate supply of personal protective equipment, staff shortages and insufficient or lack of timely COVID-19 testing. AIM: to analyse the mortality of older care home residents in Wales during COVID-19 lockdown and compare this with the population of Wales and the previous 4 years. STUDY DESIGN AND SETTING: we used anonymised electronic health records and administrative data from the Secure Anonymised Information Linkage Databank to create a cross-sectional cohort study. We anonymously linked data for Welsh residents to mortality data up to 14 June 2020. METHODS: we calculated survival curves and adjusted Cox proportional hazards models to estimate hazard ratios (HRs) for the risk of mortality. We adjusted HRs for age, gender, socioeconomic status and prior health conditions. RESULTS: survival curves show an increased proportion of deaths between 23 March and 14 June 2020 in care homes for older people, with an adjusted HR of 1.72 (1.55, 1.90) compared with 2016. Compared with the general population, adjusted care home mortality HRs for older adults rose from 2.15 (2.11, 2.20) in 2016-2019 to 2.94 (2.81, 3.08) in 2020. CONCLUSIONS: the survival curves and increased HRs show a significantly increased risk of death in the 2020 study period.


Subject(s)
COVID-19 Testing , COVID-19 , Homes for the Aged/statistics & numerical data , Infection Control , Nursing Homes/statistics & numerical data , Aged , COVID-19/mortality , COVID-19/prevention & control , COVID-19/therapy , COVID-19 Testing/methods , COVID-19 Testing/standards , Female , Health Status Disparities , Humans , Infection Control/methods , Infection Control/organization & administration , Infection Control/statistics & numerical data , Male , Mortality , Needs Assessment , Personal Protective Equipment/supply & distribution , Risk Assessment , SARS-CoV-2/isolation & purification , Wales/epidemiology , Workload/standards
11.
J R Soc Interface ; 17(173): 20200775, 2020 12.
Article in English | MEDLINE | ID: mdl-33292095

ABSTRACT

Controlling the regional re-emergence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) after its initial spread in ever-changing personal contact networks and disease landscapes is a challenging task. In a landscape context, contact opportunities within and between populations are changing rapidly as lockdown measures are relaxed and a number of social activities re-activated. Using an individual-based metapopulation model, we explored the efficacy of different control strategies across an urban-rural gradient in Wales, UK. Our model shows that isolation of symptomatic cases or regional lockdowns in response to local outbreaks have limited efficacy unless the overall transmission rate is kept persistently low. Additional isolation of non-symptomatic infected individuals, who may be detected by effective test-and-trace strategies, is pivotal to reducing the overall epidemic size over a wider range of transmission scenarios. We define an 'urban-rural gradient in epidemic size' as a correlation between regional epidemic size and connectivity within the region, with more highly connected urban populations experiencing relatively larger outbreaks. For interventions focused on regional lockdowns, the strength of such gradients in epidemic size increased with higher travel frequencies, indicating a reduced efficacy of the control measure in the urban regions under these conditions. When both non-symptomatic and symptomatic individuals are isolated or regional lockdown strategies are enforced, we further found the strongest urban-rural epidemic gradients at high transmission rates. This effect was reversed for strategies targeted at symptomatic individuals only. Our results emphasize the importance of test-and-trace strategies and maintaining low transmission rates for efficiently controlling SARS-CoV-2 spread, both at landscape scale and in urban areas.


Subject(s)
COVID-19/prevention & control , Communicable Disease Control/methods , Pandemics/prevention & control , SARS-CoV-2 , Asymptomatic Infections/epidemiology , COVID-19/epidemiology , COVID-19/transmission , Computer Simulation , Contact Tracing , Humans , Models, Biological , Physical Distancing , Rural Population , Social Interaction , Urban Population , Wales/epidemiology
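Not the paper's individual-based metapopulation model, but a deliberately simplified deterministic stand-in to make the intervention comparison concrete: two coupled patches (a well-connected "urban" one and a "rural" one), with a share of infectious individuals isolated. Comparing isolation of symptomatic cases only against isolation that also reaches non-symptomatic infections illustrates the qualitative point in the abstract. All parameter values are assumptions.

```python
import numpy as np

def final_attack_fraction(days=300, beta=0.25, gamma=0.1, coupling=(0.15, 0.05),
                          frac_symptomatic=0.6, iso_symptomatic=0.7, iso_asymptomatic=0.0):
    N = np.array([200_000.0, 20_000.0])            # urban, rural population sizes
    I = np.array([10.0, 1.0])
    S, R = N - I, np.zeros(2)
    C = np.array(coupling)                          # share of contacts made in the other patch
    for _ in range(days):
        # infectious pressure after removing the isolated share of infections
        effective_I = I * (1 - frac_symptomatic * iso_symptomatic
                           - (1 - frac_symptomatic) * iso_asymptomatic)
        prevalence = effective_I / N
        force = beta * ((1 - C) * prevalence + C * prevalence[::-1])
        new_infections = force * S                  # daily Euler step
        S, I, R = S - new_infections, I + new_infections - gamma * I, R + gamma * I
    return R / N

print("isolate symptomatic only:    ", final_attack_fraction().round(3))
print("also isolate non-symptomatic:", final_attack_fraction(iso_asymptomatic=0.7).round(3))
```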
12.
BMJ Open ; 10(10): e043010, 2020 10 21.
Article in English | MEDLINE | ID: mdl-33087383

ABSTRACT

INTRODUCTION: The emergence of the novel respiratory SARS-CoV-2 and the subsequent COVID-19 pandemic have required rapid assimilation of population-level data to understand and control the spread of infection in the general and vulnerable populations. Rapid analyses are needed to inform policy development and target interventions to at-risk groups to prevent serious health outcomes. We aim to provide an accessible research platform to determine demographic, socioeconomic and clinical risk factors for infection, morbidity and mortality of COVID-19, to measure the impact of COVID-19 on healthcare utilisation and long-term health, and to enable the evaluation of natural experiments of policy interventions. METHODS AND ANALYSIS: Two privacy-protecting population-level cohorts have been created and derived from multisourced demographic and healthcare data. The C20 cohort consists of 3.2 million people in Wales on 1 January 2020 with follow-up until 31 May 2020. The complete cohort dataset will be updated monthly, with some individual datasets available daily. The C16 cohort consists of 3 million people in Wales on 1 January 2016 with follow-up to 31 December 2019. C16 is designed as a counterfactual cohort to provide contextual comparative population data on disease, health service utilisation and mortality. Study outcomes will: (a) characterise the epidemiology of COVID-19, (b) assess socioeconomic and demographic influences on infection and outcomes, (c) measure the impact of COVID-19 on short-term and longer-term population outcomes and (d) undertake studies on the transmission and spatial spread of infection. ETHICS AND DISSEMINATION: The Secure Anonymised Information Linkage independent Information Governance Review Panel has approved this study. The study findings will be presented to policy groups, public meetings, national and international conferences, and published in peer-reviewed journals.


Subject(s)
Betacoronavirus , Coronavirus Infections/therapy , Delivery of Health Care/standards , Pandemics/prevention & control , Pneumonia, Viral/therapy , COVID-19 , Coronavirus Infections/epidemiology , Humans , Pneumonia, Viral/epidemiology , Risk Factors , SARS-CoV-2 , Wales/epidemiology
13.
Int J Popul Data Sci ; 5(4): 1715, 2020.
Article in English | MEDLINE | ID: mdl-35677101

ABSTRACT

Background: Population-level information on dispensed medication provides insight on the distribution of treated morbidities, particularly if linked to other population-scale data at an individual level. Objective: To evaluate the impact of COVID-19 on dispensing patterns of medications. Methods: Retrospective observational study using population-scale, individual-level dispensing records in Wales, UK. Total dispensed drug items for the population between 1 January 2016 and 31 December 2019 (3 years, pre-COVID-19) were compared to 2020, with follow-up until 27 July 2021 (COVID-19 period). We compared trends across all years and British National Formulary (BNF) chapters and highlighted the trends in three major chapters for 2019-21: 1-Cardiovascular system (CVD); 2-Central Nervous System (CNS); 3-Immunological & Vaccine. We developed an interactive dashboard to enable monitoring of changes as the pandemic evolves. Results: Amongst all BNF chapters, 73,410,543 items were dispensed in 2020 compared to 74,121,180 items in 2019, a relative decrease of 0.96% in 2020. Comparison of monthly patterns showed an average difference (D) of -59,220 items and an average relative change (RC) of -0.74% between the number of dispensed items in 2020 and 2019. The maximum RC was observed in March 2020 (D = +1,224,909 and RC = +20.62%), followed by a second peak in June 2020 (D = +257,920, RC = +4.50%). A third peak was observed in September 2020 (D = +264,138, RC = +4.35%). Large increases in March 2020 were observed for CVD and CNS medications across all age groups. Immunological & Vaccine products dropped to very low levels across all age groups and in all months (including during the March dispensing peak). Conclusions: Reconfiguration of routine clinical services during COVID-19 led to substantial changes in community pharmacy drug dispensing. This change may contribute to a long-term burden of COVID-19, raising the importance of comprehensive and timely monitoring of changes for evaluation of the potential impact on clinical care and outcomes.
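A worked example of the two comparison metrics used above, the absolute difference D and the relative change RC between matched months; the item counts below are round illustrative numbers, not the study's figures.

```python
def monthly_change(items_current, items_baseline):
    d = items_current - items_baseline          # D: absolute difference in dispensed items
    rc = 100 * d / items_baseline               # RC: relative change in percent
    return d, rc

# A March-2020-style spike: ~7.17m items against ~5.94m in the same month a year earlier
d, rc = monthly_change(items_current=7_165_000, items_baseline=5_940_000)
print(f"D = {d:+,} items, RC = {rc:+.2f}%")
```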

14.
J Am Heart Assoc ; 8(21): e012812, 2019 11 05.
Article in English | MEDLINE | ID: mdl-31658860

ABSTRACT

Background Early discontinuation of P2Y12 antagonists post-percutaneous coronary intervention may increase risk of stent thrombosis or nonstent recurrent myocardial infarction. Our aims were to (1) analyze the early discontinuation rate of P2Y12 antagonists post-percutaneous coronary intervention, (2) explore factors associated with early discontinuation, and (3) analyze the risk of major adverse cardiovascular events (death, acute coronary syndrome, revascularization, or stroke) associated with discontinuation from a prespecified prescribing instruction of 1 year. Method and Results We studied 2090 patients (2011-2015) who were recommended for clopidogrel for 12 months (+aspirin) post-percutaneous coronary intervention within a retrospective observational population cohort. Relationships between clopidogrel discontinuation and major adverse cardiac events were evaluated over 18-month follow-up. Discontinuation of clopidogrel in the first 4 quarters was low at 1.1%, 2.6%, 3.7%, and 6.1%, respectively. Previous revascularization, previous ischemic stroke, and age >80 years were independent predictors of early discontinuation. In a time-dependent multiple regression model, clopidogrel discontinuation and bleeding (hazard ratio=1.82 [1.01-3.30] and hazard ratio=5.30 [3.14-8.94], respectively) were independent predictors of major adverse cardiac events as were age <49 and ≥70 years (versus those aged 50-59 years), hypertension, chronic kidney disease stage 4+, previous revascularization, ischemic stroke, and thromboembolism. Furthermore, in those with both bleeding and clopidogrel discontinuation, hazard ratio for major adverse cardiac events was 9.34 (3.39-25.70). Conclusions Discontinuation of clopidogrel is low in the first year post-percutaneous coronary intervention, where a clear discharge instruction to treat for 1 year is provided. Whereas this is reassuring from the population level, at an individual level discontinuation earlier than the intended duration is associated with an increased rate of adverse events, most notably in those with both bleeding and discontinuation.


Subject(s)
Clopidogrel/administration & dosage , Medication Adherence , Percutaneous Coronary Intervention , Purinergic P2Y Receptor Antagonists/administration & dosage , Age Factors , Aged , Aged, 80 and over , Cohort Studies , Drug Prescriptions/statistics & numerical data , Female , Hemorrhage/epidemiology , Humans , Hypertension/epidemiology , Male , Middle Aged , Renal Insufficiency, Chronic/epidemiology , Retreatment , Retrospective Studies , Stroke/epidemiology , Thromboembolism/epidemiology , Wales/epidemiology
15.
Sci Rep ; 8(1): 7668, 2018 05 16.
Article in English | MEDLINE | ID: mdl-29769554

ABSTRACT

Most randomised controlled trials (RCTs) are relatively short term and, due to costs and available resources, have limited opportunity to be re-visited or extended. There is no guarantee that effects of treatments remain unchanged beyond the study. Here, we illustrate the feasibility, benefits and cost-effectiveness of enriching standard trial design with electronic follow-up. We completed a 5-year electronic follow-up of an RCT investigating the impact of probiotics on asthma and eczema in children born 2005-2007, with traditional fieldwork follow-up to two years. Participants and trial outcomes were identified and analysed after five years using secure, routine, anonymised, person-based electronic health service databanks. At two years, we identified 93% of participants and compared fieldwork with electronic health records, highlighting areas of agreement and disagreement. Retention of children from lower socio-economic groups was improved, reducing volunteer bias. At five years, identification fell to 82% of participants. These data allowed the trial's first robust analysis of asthma endpoints. We found no indication that probiotic supplementation to pregnant mothers and infants protected against asthma or eczema at 5 years. Continued longer-term follow-up is technically straightforward.


Subject(s)
Asthma/prevention & control , Eczema/prevention & control , Electronic Health Records/statistics & numerical data , Mothers/statistics & numerical data , Probiotics/therapeutic use , Child, Preschool , Double-Blind Method , Female , Humans , Infant, Newborn , Pregnancy , Quality of Life
16.
Mov Ecol ; 4: 22, 2016.
Article in English | MEDLINE | ID: mdl-27688882

ABSTRACT

BACKGROUND: We are increasingly using recording devices with multiple sensors operating at high frequencies to produce large volumes of data which are problematic to interpret. A particularly challenging example comes from studies on animals and humans where researchers use animal-attached accelerometers on moving subjects to attempt to quantify behaviour, energy expenditure and condition. METHODS: We examine the extent to which novel, gravity-based spherical plots can produce revealing visualizations that incorporate the complexity of such multidimensional acceleration data, using a suite of different acceleration-derived metrics, with a view to highlighting patterns that are not obvious using current approaches. The basis for the visualisation involved three-dimensional plots of the smoothed acceleration values, which then occupied points on the surface of a sphere. This sphere was divided into facets and point density within each facet expressed as a histogram. Within each facet-dependent histogram, data were also grouped into frequency bins of any desirable parameters, most particularly dynamic body acceleration (DBA), which were then presented as discs on a central spine radiating from the facet. Greater radial distances from the sphere surface indicated greater DBA values, while greater disc diameter indicated larger numbers of data points with that particular value. RESULTS: The approach taken effectively concatenated three complex lines of acceleration into one visualization that highlighted patterns that were otherwise not obvious. The summation of data points within sphere facets and their presentation as histograms on the sphere surface effectively dealt with data occlusion. Further frequency binning of data within facets, and representation of these bins as discs on spines radiating from the sphere, allowed patterns in DBA associated with different postures to become obvious. CONCLUSIONS: We indicate how this approach links behaviour and proxies for energetics and can inform our identification and understanding of movement-related processes, highlighting subtle differences in movement and its associated energetics. This approach has ramifications that should expand to areas as disparate as disease identification, lifestyle, sports practice and wild animal ecology. Ethics approval: UCT Science Faculty Animal Ethics 2014/V10/PR (valid until 2017).
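A minimal sketch of the core computation described in the methods (not the authors' software): smoothed "static" acceleration is normalised onto a unit sphere, the sphere is divided into facets by azimuth and elevation, and dynamic body acceleration (DBA) is binned within each facet. Window length, facet counts and the DBA proxy are assumptions.

```python
import numpy as np

def spherical_facets(acc, window=40, n_az=24, n_el=12, dba_bins=8):
    """acc: (n, 3) tri-axial acceleration in g. Returns a dict mapping
    (azimuth_bin, elevation_bin) -> histogram of DBA values for that facet."""
    kernel = np.ones(window) / window
    static = np.column_stack([np.convolve(acc[:, i], kernel, mode="same") for i in range(3)])
    dba = np.abs(acc - static).sum(axis=1)                     # simple per-sample DBA proxy
    unit = static / np.linalg.norm(static, axis=1, keepdims=True)
    az = np.arctan2(unit[:, 1], unit[:, 0])                    # -pi .. pi
    el = np.arcsin(np.clip(unit[:, 2], -1, 1))                 # -pi/2 .. pi/2
    az_bin = np.digitize(az, np.linspace(-np.pi, np.pi, n_az + 1)) - 1
    el_bin = np.digitize(el, np.linspace(-np.pi / 2, np.pi / 2, n_el + 1)) - 1
    edges = np.linspace(0, dba.max() + 1e-9, dba_bins + 1)
    facets = {}
    for key, d in zip(zip(az_bin, el_bin), dba):
        hist = facets.setdefault(key, np.zeros(dba_bins, dtype=int))
        hist[int(np.clip(np.searchsorted(edges, d, side="right") - 1, 0, dba_bins - 1))] += 1
    return facets

acc = np.random.default_rng(3).normal([0.0, 0.0, 1.0], 0.1, size=(2000, 3))  # mostly "upright" toy data
print(len(spherical_facets(acc)), "occupied facets")
```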

17.
BMJ Open ; 5(5): e007447, 2015 May 11.
Article in English | MEDLINE | ID: mdl-25968000

ABSTRACT

OBJECTIVE: To classify wear and non-wear time of accelerometer data for accurately quantifying physical activity in public health or population level research. DESIGN: A bi-moving-window-based approach was used to combine acceleration and skin temperature data to identify wear and non-wear time events in triaxial accelerometer data that monitor physical activity. SETTING: Local residents in Swansea, Wales, UK. PARTICIPANTS: 50 participants aged under 16 years (n=23) and over 17 years (n=27) were recruited in two phases: phase 1: design of the wear/non-wear algorithm (n=20) and phase 2: validation of the algorithm (n=30). METHODS: Participants wore a triaxial accelerometer (GeneActiv) against the skin surface on the wrist (adults) or ankle (children). Participants kept a diary to record the timings of wear and non-wear and were asked to ensure that events of wear/non-wear last for a minimum of 15 min. RESULTS: The overall sensitivity of the proposed method was 0.94 (95% CI 0.90 to 0.98) and specificity 0.91 (95% CI 0.88 to 0.94). It performed equally well for children compared with adults, and females compared with males. Using surface skin temperature data in combination with acceleration data significantly improved the classification of wear/non-wear time when compared with methods that used acceleration data only (p<0.01). CONCLUSIONS: Using either accelerometer seismic information or temperature information alone is prone to considerable error. Combining both sources of data can give accurate estimates of non-wear periods thus giving better classification of sedentary behaviour. This method can be used in population studies of physical activity in free-living environments.


Subject(s)
Accelerometry/methods , Exercise , Monitoring, Ambulatory/methods , Sedentary Behavior , Acceleration , Adolescent , Adult , Algorithms , Ankle , Body Temperature , Child , Female , Humans , Male , Motor Activity , Skin , Wales , Wrist , Young Adult
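A hedged sketch of the general idea behind combining the two data streams (not the published bi-moving-window algorithm): a window is labelled non-wear only when acceleration is near-static on every axis and near-body temperature has fallen towards ambient. Thresholds, window length and sampling rate are illustrative assumptions.

```python
import numpy as np

def classify_wear(acc_xyz, skin_temp_c, fs=100, window_s=60,
                  sd_threshold_g=0.013, temp_threshold_c=27.0):
    """acc_xyz: (n, 3) acceleration in g; skin_temp_c: (n,) near-body temperature (C).
    Returns one boolean per window (True = worn)."""
    samples_per_window = fs * window_s
    n_windows = len(skin_temp_c) // samples_per_window
    worn = np.empty(n_windows, dtype=bool)
    for w in range(n_windows):
        sl = slice(w * samples_per_window, (w + 1) * samples_per_window)
        static = acc_xyz[sl].std(axis=0).max() < sd_threshold_g   # little movement on all axes
        cool = skin_temp_c[sl].mean() < temp_threshold_c          # temperature near ambient
        worn[w] = not (static and cool)
    return worn

rng = np.random.default_rng(4)
n = 100 * 60 * 10                                                 # ten one-minute windows at 100 Hz
acc = rng.normal(0, 0.05, size=(n, 3))                            # worn: visible movement
acc[-100 * 60 * 4:] = rng.normal(0, 0.002, size=(100 * 60 * 4, 3))  # last four windows: static
temp = np.r_[np.full(100 * 60 * 6, 33.0), np.full(100 * 60 * 4, 24.0)]
print(classify_wear(acc, temp))                                   # expect six True then four False
```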
18.
Ann Card Anaesth ; 18(1): 45-51, 2015.
Article in English | MEDLINE | ID: mdl-25566711

ABSTRACT

OBJECTIVE: Objective platelet function assessment after cardiac surgery can predict postoperative blood loss, guide transfusion requirements and discriminate the need for surgical re-exploration. We conducted this study to assess the predictive value of point-of-care platelet function testing using the Multiplate® device. METHODS: Patients undergoing isolated coronary artery bypass grafting were prospectively recruited (n = 84). Group A (n = 42) patients were on anti-platelet therapy until surgery; patients in Group B (n = 42) stopped anti-platelet treatment at least 5 days preoperatively. Multiplate® and thromboelastography (TEG) tests were performed in the perioperative period. The primary end-point was excessive bleeding (>2.5 ml/kg/h) within the first 3 h postoperatively. Secondary end-points included transfusion requirements, re-exploration rates, and intensive care unit and in-hospital stays. RESULTS: Patients in Group A had more excessive bleeding (59% vs. 33%, P = 0.02), higher re-exploration rates (14% vs. 0%, P < 0.01) and higher rates of blood (41% vs. 14%, P < 0.01) and platelet (14% vs. 2%, P = 0.05) transfusion. On multivariate analysis, preoperative platelet function testing was the most significant predictor of excessive bleeding (odds ratio [OR]: 2.3, P = 0.08), need for blood (OR: 5.5, P < 0.01) and platelet transfusion (OR: 15.1, P < 0.01). The postoperative "ASPI test" best predicted the need for transfusion (sensitivity 0.86) and excessive blood loss (sensitivity 0.81). TEG results did not correlate well with any of these outcome measures. CONCLUSIONS: Perioperative platelet function assessment with Multiplate® was the strongest predictor of bleeding and transfusion requirements in patients on anti-platelet therapy until the time of surgery.


Subject(s)
Blood Transfusion/methods , Coronary Artery Bypass/methods , Platelet Function Tests/methods , Point-of-Care Systems , Postoperative Hemorrhage/diagnosis , Blood Transfusion/statistics & numerical data , Endpoint Determination , Female , Humans , Male , Middle Aged , Postoperative Care/methods , Predictive Value of Tests , Preoperative Care , Prospective Studies , Thrombelastography
19.
PLoS One ; 9(11): e113592, 2014.
Article in English | MEDLINE | ID: mdl-25409038

ABSTRACT

Although inequalities in health and socioeconomic status have an important influence on childhood educational performance, the interactions between the multiple factors underlying variation in educational outcomes at micro-level are unknown, and how to evaluate the many possible interactions of these factors is not well established. This paper aims to examine multi-dimensional deprivation factors and their impact on childhood educational outcomes at micro-level, focusing on geographic areas with widely different disparity patterns, in which each area is characterised by six deprivation domains (Income, Health, Geographical Access to Services, Housing, Physical Environment, and Community Safety). Traditional health statistical studies tend to use one global model to describe the whole population for macro-analysis. In this paper, we combine linked educational and deprivation data across small areas (median population of 1,500), then use a local modelling technique, the Takagi-Sugeno fuzzy system, to predict area educational outcomes at ages 7 and 11. We define two new metrics, "Micro-impact of Domain" and "Contribution of Domain", to quantify the variation in local impacts of multidimensional factors on educational outcomes across small areas. The two metrics highlight differing priorities. Our study reveals complex multi-way interactions between the deprivation domains, which could not be captured by traditional health statistical methods based on a single global model. We demonstrate that although Income has an expected central role, all domains contribute, and in some areas Health, Environment, Access to Services, Housing or Community Safety could each be the dominant factor. Thus the relative importance of health and socioeconomic factors varies considerably between areas, depending on the levels of each of the other factors, and therefore each component of deprivation must be considered as part of a wider system. Childhood educational achievement could benefit from policies and intervention strategies that are tailored to local geographic areas' profiles.


Subject(s)
Health Status , Models, Theoretical , Socioeconomic Factors , Child , Databases, Factual , Housing , Humans , Income , Risk Factors , Social Environment
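A simplified sketch of Takagi-Sugeno-style local modelling (not the paper's fitted system): each fuzzy rule pairs a Gaussian membership over the deprivation-domain inputs with a local linear model, and the prediction for an area is the membership-weighted average of the local models. The two rules, their centres and coefficients below are invented for illustration.

```python
import numpy as np

def ts_predict(x, rules):
    """x: vector of deprivation-domain scores for one small area.
    Each rule is (centre, width, coefficients, intercept) defining a local linear model."""
    weights, local_outputs = [], []
    for centre, width, coeffs, intercept in rules:
        firing = np.exp(-np.sum((x - centre) ** 2) / (2 * width ** 2))  # Gaussian membership
        weights.append(firing)
        local_outputs.append(np.dot(coeffs, x) + intercept)             # local linear prediction
    weights = np.array(weights)
    return float(np.dot(weights, local_outputs) / weights.sum())

# Two toy rules over (income, health) deprivation scores predicting the % of children
# reaching the expected educational level in an area.
rules = [
    (np.array([0.2, 0.2]), 0.3, np.array([-20.0, -5.0]), 92.0),   # "less deprived" regime
    (np.array([0.8, 0.7]), 0.3, np.array([-8.0, -25.0]), 95.0),   # "more deprived" regime
]
print(round(ts_predict(np.array([0.3, 0.25]), rules), 1))
```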
20.
Glob Chang Biol ; 20(1): 140-6, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24323534

ABSTRACT

Populations may potentially respond to climate change in various ways, including moving to new areas or alternatively staying where they are and adapting as conditions shift. Traditional laboratory and mesocosm experiments last days to weeks and thus only give a limited picture of thermal adaptation, whereas ocean warming occurring over decades allows the potential for selection of new strains better adapted to warmer conditions. Evidence for adaptation in natural systems is equivocal. We used a 50-year time series comprising 117 056 samples in the NE Atlantic to quantify the abundance and distribution of two particularly important and abundant members of the ocean plankton (copepods of the genus Calanus) that play a key trophic role for fisheries. Abundances of C. finmarchicus, a cold-water species, and C. helgolandicus, a warm-water species, were negatively and positively related to sea surface temperature (SST), respectively. However, the abundance vs. SST relationship for neither species changed over time in a manner consistent with thermal adaptation. Accompanying the lack of evidence for thermal adaptation, there has been an unabated range contraction for C. finmarchicus and range expansion for C. helgolandicus. Our evidence suggests that thermal adaptation has not mitigated the impacts of ocean warming, as reflected in the dramatic range changes of these key species, and points to continued dramatic climate-induced changes in the biology of the oceans.


Subject(s)
Climate Change , Copepoda/physiology , Adaptation, Physiological , Animals , Atlantic Ocean , Biodiversity , Population Density , Temperature
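A hedged stand-in for the kind of test implied by asking whether the abundance vs. SST relationship changed over time: regress (log) abundance on SST, period and their interaction, and examine whether the SST slope differs between periods. The data here are simulated so that the slope is stable, i.e. the "no thermal adaptation" pattern; none of it is from the plankton survey itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 4000
df = pd.DataFrame({
    "sst": rng.uniform(6, 20, n),
    "period": rng.choice(["early", "late"], n),
})
# Simulate a cold-water species whose log abundance declines with SST identically
# in both periods (no change in the thermal response).
df["log_abundance"] = 3.0 - 0.15 * df.sst + rng.normal(0, 0.5, n)

fit = smf.ols("log_abundance ~ sst * C(period)", data=df).fit()
print(fit.params.round(3))     # the sst:period interaction is the change in slope between periods
print(fit.pvalues.round(3))
```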