Results 1 - 10 of 10
1.
Surg Endosc ; 35(8): 4371-4379, 2021 08.
Article in English | MEDLINE | ID: mdl-32909207

ABSTRACT

BACKGROUND: Surgery has a recognised role in the treatment of 'sportsman's groin'. This study hypothesises that elite athletes have an advantage in both pre- and post-operative rehabilitation and will therefore present and resume sporting activities sooner. METHODS: A retrospective analysis of a secure database of athletes presenting with groin pain who underwent surgery for 'inguinal disruption'. All data were explored via appropriate descriptive statistics and comparisons were made between elite and amateur athletes. RESULTS: All patients were male (n = 144). The median age was 33 years (range 14-72). The median return to sporting activity was 4.5 weeks (range 2.0-16.0), with one amateur athlete being unable to return to sporting activity. Using the mean of both sides, a comparison of VAS pain scores at the pre-operative and 1-month post-operative time points showed a significant reduction (p < 0.001). Comparing 'elite' versus 'amateur' athletes, significant differences were seen in patient age (median 26 vs 40 years; p < 0.001), lead time to clinic presentation (median 62.0 vs 111.5 days; p = 0.004), and time to return to sporting activity (4 vs 5 weeks; p = 0.019). Additional MRI findings within the groin girdle were found in 89 patients (66.4%) and 34 patients (23.6%) had an MRI finding within the adductor tendon. CONCLUSION: The Manchester Groin Repair is an effective surgical management for 'inguinal disruption'. Elite athletes present sooner and return to sport sooner. Given the prevalence of other findings, a multidisciplinary approach to the 'sportsman's groin' is required.


Subject(s)
Athletic Injuries , Hernia, Inguinal , Adolescent , Adult , Aged , Athletes , Athletic Injuries/surgery , Groin/injuries , Groin/surgery , Hernia, Inguinal/surgery , Humans , Male , Middle Aged , Retrospective Studies , Treatment Outcome , Young Adult
2.
Diabet Med ; 36(3): 383-387, 2019 03.
Article in English | MEDLINE | ID: mdl-30307056

ABSTRACT

AIMS: To assess the impact of social deprivation, demographics and centre on HbA1c outcomes with continuous subcutaneous insulin infusion (CSII) in adults with Type 1 diabetes. METHODS: Demographic data, postcode-derived English Index of Multiple Deprivation data and 12-month average HbA1c (mmol/mol) pre- and post-CSII were collated from three diabetes centres in the north west of England: University Hospital of South Manchester (UHSM), Salford Royal Foundation Hospital (SRFT) and Manchester Royal Infirmary (MRI). Univariable and multivariable regression models explored relationships between demographics, Index of Multiple Deprivation, centre and HbA1c outcomes. RESULTS: Data were available for 693 (78%) individuals (UHSM, n = 90; SRFT, n = 112; MRI, n = 491), of whom 59% were women. Median age at CSII start was 39 (IQR 29.5-49.0) years and median diabetes duration was 20 (11-29) years. Median Index of Multiple Deprivation was 15 193 (6313-25 727). Overall median HbA1c improved from 69 to 64 mmol/mol (8.5% to 8.0%) within the first year of CSII. In multivariable analysis, higher pre-CSII HbA1c was significantly associated with higher deprivation (P = 0.036), being female (P < 0.001) and centre (MRI; P = 0.005). Following adjustment for pre-CSII HbA1c, neither post-CSII HbA1c nor HbA1c change was related to demographic factors or deprivation, but both remained significantly related to centre; UHSM and SRFT had larger reductions in HbA1c with CSII than MRI [median -7.0 (-0.6%) vs. -6.0 (-0.55%) vs. -4.5 (-0.45%) mmol/mol; P = 0.005]. CONCLUSIONS: Higher pre-CSII HbA1c levels were associated with higher deprivation and being female. CSII improves HbA1c irrespective of social deprivation and demographics. Significant differences in HbA1c improvements were observed between centres. Further work is warranted to explain these differences and minimize variation in clinical outcomes with CSII.
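The abstract quotes HbA1c in both IFCC units (mmol/mol) and NGSP units (%). The two quoted pairs (69 → 8.5%, 64 → 8.0%) are consistent with the standard published NGSP/IFCC master equation; a minimal sketch of that conversion (the constants come from the published equation, not from this study):

```python
def ifcc_to_ngsp(mmol_per_mol: float) -> float:
    """Convert HbA1c from IFCC units (mmol/mol) to NGSP units (%).

    Uses the published NGSP/IFCC master equation:
    NGSP(%) = 0.0915 * IFCC(mmol/mol) + 2.15
    """
    return 0.0915 * mmol_per_mol + 2.15

# The abstract's pre-/post-CSII medians: 69 mmol/mol and 64 mmol/mol.
for mmol in (69, 64):
    print(f"{mmol} mmol/mol ~ {ifcc_to_ngsp(mmol):.1f}%")  # 8.5% and 8.0%
```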


Subject(s)
Cultural Deprivation , Diabetes Mellitus, Type 1 , Glycated Hemoglobin/metabolism , Health Services Accessibility/statistics & numerical data , Insulin Infusion Systems , Insulin/administration & dosage , Adult , Blood Glucose/metabolism , Demography , Diabetes Mellitus, Type 1/blood , Diabetes Mellitus, Type 1/drug therapy , Diabetes Mellitus, Type 1/epidemiology , England/epidemiology , Female , Geography , Glycated Hemoglobin/analysis , Humans , Infusions, Subcutaneous , Male , Middle Aged , Psychological Distance , Retrospective Studies , Treatment Outcome
3.
Ann R Coll Surg Engl ; 106(1): 19-28, 2024 Jan.
Article in English | MEDLINE | ID: mdl-36927080

ABSTRACT

INTRODUCTION: Outcomes following pancreas transplantation are suboptimal and better donor selection is required to improve this. Vasoactive drugs (VaD) are commonly used to correct the abnormal haemodynamics of organ donors in intensive care units. VaDs can differentially affect insulin secretion positively (dobutamine) or negatively (noradrenaline). The hypothesis was that some VaDs might induce beta-cell stress or rest and therefore impact pancreas transplant outcomes. The aim of the study was to assess relationships between VaD use and pancreas transplant graft survival. METHODS: Data from the UK Transplant Registry on all pancreas transplants performed between 2004 and 2016 with complete follow-up data were included. Univariable- and multivariable-adjusted Cox regression analyses determined risks of graft failure associated with VaD use. RESULTS: In 2,183 pancreas transplants, VaDs were used in the following numbers of donors: dobutamine 76 (3.5%), dopamine 84 (3.8%), adrenaline 161 (7.4%), noradrenaline 1,589 (72.8%) and vasopressin 1,219 (55.8%). In multivariable models, adjusted for covariates and the co-administration of other VaDs, noradrenaline use (vs non-use) was a strong predictor of better graft survival (hazard ratio [95% confidence interval] 0.77 [0.64-0.94], p = 0.01). CONCLUSIONS: Noradrenaline use was associated with better graft survival in models adjusted for donor and recipient variables - this may be related to inhibition of pancreatic insulin secretion initiating pancreatic beta-cell 'rest'. Further research is required to replicate these findings and establish whether relationships are causal. Identification of alternative methods of inducing beta-cell rest could be valuable in improving graft outcomes.


Subject(s)
Pancreas Transplantation , Humans , Pancreas Transplantation/methods , Norepinephrine/therapeutic use , Dobutamine , Treatment Outcome , Tissue Donors , Allografts , Graft Survival
4.
Cochlear Implants Int ; 21(5): 239-245, 2020 09.
Article in English | MEDLINE | ID: mdl-32299308

ABSTRACT

Introduction: Standardized outcome measures are important for accurately monitoring the language development of pre-lingually deaf children receiving auditory implants. Current commonly used outcome measures are time-consuming, limiting the practicality of regular testing. To address these limitations, the Manchester Spoken Language Development Scale (MSLDS) was developed as a quick and easily applicable interim measurement. This is an 11-point scale designed to provide a streamlined overview of a child's expressive language development. This study describes the MSLDS, evaluates its ease of use and inter-rater reliability, and outlines its application in the paediatric auditory implant population. Methods: Sixteen speech therapists and teachers for the deaf reviewed videos of paediatric cochlear implant assessments and rehabilitation sessions at a UK auditory implant centre. Twenty-five videos from fourteen children were used in this validation study. Reviewers were asked to evaluate a child's language development using the MSLDS by assigning a score for each video and to evaluate the ease of use of the scale. Each video was rated by three different reviewers. Results: MSLDS scores showed a high degree of consistency between raters for each child. 8/25 (32%) videos demonstrated perfect agreement on the MSLDS. In 15/25 (60%) videos, there was a one-point difference between MSLDS scores. The remaining 2/25 (8%) videos varied by 2 points. Statistical analysis demonstrated an intra-class correlation coefficient (ICC) of 0.987, indicating a high level of agreement between users of the scale. Qualitative feedback from the raters suggested further modifications which have been incorporated into the scale. Conclusion: The high inter-rater agreement reflects the potential for the MSLDS to be a reliable tool for monitoring language development in the paediatric auditory implant population.


Subject(s)
Child Language , Cochlear Implants , Correction of Hearing Impairment/psychology , Deafness/psychology , Language Tests/standards , Adolescent , Child , Child, Preschool , Cochlear Implantation , Deafness/rehabilitation , Female , Humans , Infant , Male , Postoperative Period , Reproducibility of Results , Treatment Outcome
5.
Hernia ; 24(3): 591-599, 2020 06.
Article in English | MEDLINE | ID: mdl-32152806

ABSTRACT

AIM: The aim of the study was to evaluate social, occupational and physical factors that may influence the occurrence or cause of a primary inguinal hernia in two European countries. METHODS: A questionnaire was completed prospectively by all respondents in the setting of an out-patient clinic at the time of initial presentation, and the data were collected on a secure database. All responses for each question were explored via appropriate descriptive statistics. Statistical comparisons were made using Fisher's exact test where appropriate. RESULTS: 537 adults completed the questionnaire and had their data analysed. Comparisons between those that presented with a primary complaint of either 'bulge/swelling' or 'discomfort/pain' found no differences in occupation, age or any other demographic data. Equal proportions of patients who described a single strenuous event presented with a bulge/swelling or discomfort/pain. The reporting of a causative single strenuous event was not significantly influenced by occupation, lifestyle or amount of activity carried out, nor was there any significant influence upon when a hernia presented after the suspected strenuous event, although the majority reported a lump within 1 week. CONCLUSION: This study cannot at present support the belief that a single strenuous event is the sole cause of the development of a primary inguinal hernia.
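The abstract states that comparisons used Fisher's exact test. As an illustration of the method only (a from-scratch sketch, not the authors' analysis code), the two-sided p-value for a 2x2 contingency table can be computed from the hypergeometric distribution:

```python
from math import comb


def fisher_exact_2x2(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]], summed over the hypergeometric distribution."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def p_table(x: int) -> float:
        # Probability of a table with x in the top-left cell,
        # given fixed row and column totals.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Two-sided: sum probabilities of all tables at least as extreme
    # (i.e. no more probable) than the observed one.
    return sum(p for x in range(lo, hi + 1)
               if (p := p_table(x)) <= p_obs + 1e-12)


# Illustrative table (not study data): p = 34/70 ~ 0.486
print(fisher_exact_2x2(3, 1, 1, 3))
```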


Subject(s)
Hernia, Inguinal/etiology , Adolescent , Adult , Aged , Aged, 80 and over , Causality , Female , Hernia, Inguinal/epidemiology , Hernia, Inguinal/surgery , Humans , Male , Middle Aged , Netherlands/epidemiology , Surveys and Questionnaires , United Kingdom/epidemiology , Young Adult
6.
Semin Thorac Cardiovasc Surg ; 31(3): 583-592, 2019.
Article in English | MEDLINE | ID: mdl-30529157

ABSTRACT

Thoracotomy is a common surgical procedure performed worldwide for lung disease. Despite major advances in analgesia, patients still experience severe shoulder, central back and surgical incision site pain in the postoperative period. This study aimed to assess whether intraoperative phrenic nerve infiltration reduces the incidence of postoperative pain and improves peak flow volume measurements during incentive spirometry. 90 patients undergoing open lobectomy were randomly assigned to have phrenic nerve infiltration (n = 46) or not (n = 44). The phrenic nerve infiltration group received 10 mL of 0.25% bupivacaine into the periphrenic fat pad. Preoperative assessments of spirometry and pain scores were recorded (at rest and with movement). Postoperative assessments included peak flow and pain measurements at intervals up to 72 hours. Less shoulder pain was experienced with phrenic nerve infiltration up to 6 hours postsurgery at rest (P = 0.005) and up to 12 hours with movement (P < 0.001). Reduced back pain was reported in the phrenic nerve infiltration group up to 6 hours after surgery both at rest (P = 0.001) and with movement (P = 0.00). Phrenic nerve infiltration reduced pain at the incision site for up to 3 hours both at rest (P < 0.001) and with movement (P = 0.001). Spirometry readings dropped in both groups, with consistently lower readings at baseline and follow-up in the phrenic nerve infiltration group (P = 0.007). Lower usage of patient-controlled analgesia morphine (P < 0.0001), epipleural bupivacaine (P = 0.001) and oramorph/zomorph (P = 0.0002) was recorded. Our findings indicate that the use of phrenic nerve infiltration significantly reduced patient pain scores during the early postoperative period, particularly during movement. Each technique has advantages and disadvantages; however, further studies with larger sample sizes are warranted.


Subject(s)
Anesthetics, Local/administration & dosage , Back Pain/prevention & control , Bupivacaine/administration & dosage , Nerve Block/methods , Pain, Postoperative/prevention & control , Phrenic Nerve , Pneumonectomy , Shoulder Pain/prevention & control , Thoracotomy , Aged , Aged, 80 and over , Anesthetics, Local/adverse effects , Back Pain/diagnosis , Back Pain/epidemiology , Bupivacaine/adverse effects , England , Female , Humans , Incidence , Male , Middle Aged , Nerve Block/adverse effects , Pain Measurement , Pain, Postoperative/diagnosis , Pain, Postoperative/epidemiology , Pneumonectomy/adverse effects , Shoulder Pain/diagnosis , Shoulder Pain/epidemiology , Spirometry , Thoracotomy/adverse effects , Time Factors , Treatment Outcome
7.
Eye (Lond) ; 31(8): 1229-1236, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28452995

ABSTRACT

Purpose: To investigate the frequencies, trends, and in vitro drug susceptibilities of the causative pathogens in microbial keratitis in Manchester Royal Eye Hospital. Patients and methods: Corneal scrape results recorded by the microbiology service between 2004 and 2015 were extracted from an established database. A total of 4229 corneal scrape specimens were identified. First-line antibiotic treatment in our centre during the study period was ofloxacin; second line was cefuroxime and gentamicin. Results: Mean age was 45.9±21.0. A total of 1379 samples (32.6%) were culture positive. One hundred forty-eight (10.7%) specimens cultured multiple organisms. Of the 1539 organisms identified, 63.3% were Gram-positive bacteria, 27.3% Gram-negative bacteria, 7.1% fungi, and 2.3% Acanthamoebae. A decreasing trend in Gram-positive isolates was found, together with a stable trend in Gram negatives and an increasing trend in Acanthamoeba and fungi. There appeared to be a significant increasing trend of Moraxella infection (P=0.001). In all, 83.1 and 90.8% of Gram-positive and -negative isolates tested were susceptible to ofloxacin, respectively. Cefuroxime covered 86.6% of Gram-positive and 61.4% of Gram-negative isolates, whereas gentamicin covered 88.8 and 96.5% of Gram-positive and -negative isolates, respectively. Conclusion: We found a change in the type of Gram-negative organisms isolated over time, with the Moraxella species on the rise. Reassuringly, no significant increase in resistance was observed in vitro for any of the commonly used antibiotics. Ofloxacin remains a good first-line antibiotic treatment, but duo-therapy does have broader coverage and should be considered in non-responsive cases.


Subject(s)
Anti-Bacterial Agents/therapeutic use , Eye Infections, Bacterial/microbiology , Keratitis/microbiology , Acanthamoeba/isolation & purification , Adult , Aged , Anti-Bacterial Agents/pharmacology , Corneal Ulcer/microbiology , Drug Resistance, Bacterial/drug effects , Eye Infections, Bacterial/drug therapy , Female , Fungi/isolation & purification , Gram-Negative Bacteria/drug effects , Gram-Negative Bacteria/isolation & purification , Gram-Positive Bacteria/drug effects , Gram-Positive Bacteria/isolation & purification , Humans , Keratitis/drug therapy , Male , Microbial Sensitivity Tests , Middle Aged , Retrospective Studies , Tertiary Care Centers/statistics & numerical data , United Kingdom
8.
Article in English | MEDLINE | ID: mdl-35515202

ABSTRACT

Background: Effective paediatric basic life support improves survival and outcomes. Current cardiopulmonary resuscitation (CPR) training involves 4-yearly courses plus annual updates, yet skills degrade within 3-6 months. No method has been described to motivate frequent and persistent CPR practice. To achieve this, we explored the use of competition and a leaderboard, as a gamification technique, on a CPR training feedback device, to increase CPR usage and performance. Objective: To assess whether self-motivated CPR training with integrated CPR feedback improves the quality of infant CPR over time, in comparison with no refresher CPR training. Design: Randomised controlled trial (RCT) assessing the effect of self-motivated manikin-based learning on infant CPR skills over time. Setting: A UK tertiary children's hospital. Participants: 171 healthcare professionals randomly assigned to self-motivated CPR training (n=90) or no refresher CPR training (n=81) and followed for 26 weeks. Intervention: The intervention comprised 24-hour access to a CPR training feedback device and an anonymous leaderboard. The device calculated a compression score based on rate, depth, hand position and release, and a ventilation score derived from rate and volume. Main outcome measure: Infant CPR technical skill performance score, defined as the mean of the compression and ventilation scores provided by the CPR training feedback device software. The primary analysis considered change in score from baseline to 6 months. Results: Overall, the control group showed little change in score (median 0, IQR -7.00-5.00) from baseline to 6 months, while the intervention group had a median increase of 0.50 (IQR 0.00-33.50). The changes differed significantly between the two groups (p<0.001). Conclusions: Access to self-motivated refresher CPR training, a competitive leaderboard and a CPR training feedback device had a significant effect on CPR performance.
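The composite outcome defined above (the mean of the device's compression and ventilation sub-scores, compared against baseline) can be sketched as follows; the function names and example inputs are illustrative, not taken from the study or the device's software:

```python
def cpr_skill_score(compression_score: float, ventilation_score: float) -> float:
    """Composite infant CPR technical skill score: the mean of the
    compression and ventilation sub-scores, as defined in the study."""
    return (compression_score + ventilation_score) / 2.0


def change_from_baseline(baseline: float, followup: float) -> float:
    """Primary analysis quantity: change in composite score over the trial."""
    return followup - baseline


# Hypothetical sub-scores for one participant (not study data):
base = cpr_skill_score(70, 60)    # composite 65.0 at baseline
final = cpr_skill_score(80, 70)   # composite 75.0 at 6 months
print(change_from_baseline(base, final))  # 10.0
```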

9.
Int J STD AIDS ; 24(6): 449-53, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23970747

ABSTRACT

Clinical staging determines antiretroviral therapy (ART) eligibility when CD4 count is not available. Haemoglobin (Hb) ≤8 g/dL is an indication for treatment. We measured Hb in HIV-positive Malawian adults undergoing clinical assessment for ART eligibility and calculated the percentage of patients with CD4 ≤ 350 cells/µL deemed eligible for ART by clinical staging with and without Hb measurement, using the existing threshold and an alternative proposed after comparing Hb values to CD4 counts. Three hundred and thirty-eight patients had CD4 counts measured and 226 (67%) had CD4 ≤ 350 cells/µL. Thirty-six (16%) patients with low CD4 count were eligible for ART by clinical assessment alone, 48 (21%) when Hb was also measured with a threshold of ≤8 g/dL, and 74 (34%) with a threshold of ≤10 g/dL. Measuring Hb alongside clinical assessment could increase the number of patients with CD4 ≤ 350 cells/µL starting ART by 33% using a threshold of Hb ≤ 8 g/dL or 114% with a threshold of ≤10 g/dL.
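The 33% figure for the Hb ≤ 8 g/dL threshold follows directly from the counts given (36 patients eligible by clinical assessment alone vs 48 once Hb is also measured). A minimal arithmetic check, with a hypothetical helper name:

```python
def relative_increase(n_before: int, n_after: int) -> float:
    """Percentage increase in the number of patients deemed ART-eligible
    when an additional criterion is added to clinical staging."""
    return 100.0 * (n_after - n_before) / n_before


# Abstract figures: 36 eligible by clinical staging alone,
# 48 when Hb <= 8 g/dL is also measured.
print(round(relative_increase(36, 48)))  # 33
```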


Subject(s)
Anemia/complications , Antiretroviral Therapy, Highly Active , Eligibility Determination , HIV Infections/drug therapy , Mass Screening/methods , Adolescent , Adult , Aged , Aged, 80 and over , Anemia/epidemiology , Anti-HIV Agents/therapeutic use , CD4 Lymphocyte Count , Female , HIV Infections/complications , HIV Infections/epidemiology , Hemoglobins/metabolism , Humans , Malawi/epidemiology , Male , Middle Aged , Predictive Value of Tests , Prevalence , Young Adult