Results 1 - 20 of 256
1.
J Prev Alzheimers Dis ; 11(3): 549-557, 2024.
Article in English | MEDLINE | ID: mdl-38706271

ABSTRACT

BACKGROUND: In an exploratory 91-participant phase 2a clinical trial (AscenD-LB, NCT04001517) in dementia with Lewy bodies (DLB), neflamapimod showed improvement over placebo on multiple clinical endpoints. To confirm those results, a phase 2b clinical study (RewinD-LB, NCT05869669) similar to AscenD-LB has been initiated. OBJECTIVES: To optimize the choice of patient population, primary endpoint, and biomarker evaluations in RewinD-LB. DESIGN: Evaluation of the efficacy results from AscenD-LB; its main results, and a re-analysis stratified by the absence or presence of AD co-pathology (assessed by plasma ptau181), have been published. In addition, the MRI data from a prior phase 2a clinical trial in early Alzheimer's disease (AD) were reviewed. SETTING: 22 clinical sites in the US and 2 in the Netherlands. PARTICIPANTS: Probable DLB by consensus criteria and abnormal dopamine uptake on DaTscan™ (Ioflupane I123 SPECT). INTERVENTION: Neflamapimod 40 mg capsules or matching placebo capsules, twice a day (BID) or three times a day (TID), for 16 weeks. MEASUREMENTS: 6-test Neuropsychological Test Battery (NTB) assessing attention and executive function, Clinical Dementia Rating Sum-of-Boxes (CDR-SB), Timed Up and Go (TUG), International Shopping List Test (ISLT). RESULTS: Within AscenD-LB, patients without evidence of AD co-pathology exhibited a neflamapimod treatment effect that was greater than that in the overall population and substantial (Cohen's d effect size vs. placebo ≥ for CDR-SB, TUG, Attention, and ISLT-recognition). In addition, the CDR-SB and TUG performed better than the cognitive tests at demonstrating a neflamapimod treatment effect relative to placebo. Further, clinical trial simulations indicate that, with 160 patients (randomized 1:1), RewinD-LB conducted in patients without AD co-pathology has >95% (approaching 100%) statistical power to detect significant improvement over placebo on the CDR-SB. Preliminary evidence of positive treatment effects on beta functional connectivity by EEG and on basal forebrain atrophy by MRI was obtained in AscenD-LB and the early AD study, respectively. CONCLUSION: In addition to the use of a single dose regimen of neflamapimod (40 mg TID), key distinctions between phase 2b and phase 2a include RewinD-LB (1) excluding patients with AD co-pathology, (2) having CDR-SB as the primary endpoint, and (3) including MRI studies to evaluate effects on basal forebrain atrophy.
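The power claim above comes from clinical trial simulations. As a minimal sketch of the idea — not the study's actual simulation, which presumably modeled CDR-SB change scores with a more elaborate analysis — the Python below estimates power by Monte Carlo for 160 patients randomized 1:1, applying a plain two-sample t-test to a standardized endpoint. The Cohen's d of 0.7 is an illustrative assumption; the abstract does not print the observed effect size.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulated_power(n_per_arm=80, effect_size=0.7, alpha=0.05, n_sims=10_000):
    """Monte Carlo power for a 1:1 two-arm trial analyzed with a two-sample t-test."""
    hits = 0
    for _ in range(n_sims):
        placebo = rng.normal(0.0, 1.0, n_per_arm)          # standardized endpoint
        treated = rng.normal(effect_size, 1.0, n_per_arm)  # shifted by Cohen's d
        _, p = stats.ttest_ind(treated, placebo)
        hits += p < alpha
    return hits / n_sims

# 160 patients randomized 1:1: power approaches 1 for a moderately large effect.
print(simulated_power())  # ~0.99 for d = 0.7, 80 per arm
```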


Subject(s)
Benzylamines , Fluorocarbons , Indoles , Lewy Body Disease , Humans , Lewy Body Disease/drug therapy , Lewy Body Disease/diagnostic imaging , Aged , Female , Male , Double-Blind Method , Magnetic Resonance Imaging , Biomarkers/blood , Aged, 80 and over , Neuropsychological Tests
2.
Front Cardiovasc Med ; 10: 1332868, 2023.
Article in English | MEDLINE | ID: mdl-38292455

ABSTRACT

Background: Catheter ablation (CA) for symptomatic atrial fibrillation (AF) offers the best outcomes for patients. Despite the benefits of CA, a significant proportion of patients suffer a recurrence; hence, there is scope to improve outcomes through technical innovations such as ablation index (AI) guidance during AF ablation. We present real-world 5-year follow-up data for AI-guided pulmonary vein isolation. Methods: We retrospectively followed 123 consecutive patients who underwent AI-guided CA shortly after its introduction to routine practice. Data were collected from the MPH AF Ablation Registry with the approval of the institutional research board. Results: Our patient cohort was older, with higher BMI, greater CHA2DS2-VASc scores, and larger left atrial sizes than similar previously published cohorts, while gender balance and other characteristics were similar. The probability of freedom from atrial arrhythmia, allowing for repeat procedures, was 0.95 at year 1, 0.92 at year 2, 0.85 at year 3, 0.79 at year 4, and 0.72 at year 5. Age >75 years (p = 0.02, HR: 2.7, CI: 1.14-6.7), BMI >35 kg/m2 (p = 0.0009, HR: 4.6, CI: 1.8-11.4), and left atrial width in the upper quartile as measured on CT (p = 0.04, HR: 2.5, CI: 1.0-5.7) were statistically significant independent predictors of recurrent AF. Conclusion: AI-guided CA is an effective treatment for AF, with 95.8% of patients remaining free from atrial arrhythmia at 1 year and 72.3% at 5 years, allowing for repeat procedures. It is safe, with a low major complication rate of 1.25%. Age >75 years, BMI >35 kg/m2, and markedly enlarged atria were associated with higher recurrence rates.
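Freedom-from-arrhythmia probabilities of this kind are typically read off a Kaplan-Meier curve. Below is a minimal sketch using the lifelines library, assuming a hypothetical table of follow-up months and a recurrence flag; the registry's actual field names and data are not shown in the abstract.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up data: time to recurrence or censoring (months),
# and whether atrial arrhythmia recurred (1) or the patient was censored (0).
df = pd.DataFrame({
    "months":   [14, 60, 33, 8, 60, 47, 60, 21, 60, 52],
    "recurred": [1,  0,  1,  1, 0,  1,  0,  0,  0,  1],
})

kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["recurred"])

# Estimated probability of remaining free from atrial arrhythmia at each year.
for year in range(1, 6):
    print(f"year {year}: {kmf.predict(12 * year):.2f}")
```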

3.
Eur Spine J ; 31(11): 2866-2874, 2022 11.
Article in English | MEDLINE | ID: mdl-35786771

ABSTRACT

PURPOSE: To determine the predictive validity of the STarT Back tool (SBT), administered at baseline and 6 weeks, in classifying Emergency Department (ED) patients with LBP into groups at low, medium, or high risk of persistent disability at 3 months. A secondary aim was to evaluate the clinical effectiveness of pragmatic risk-matched treatment in an ED cohort at 3 months. METHODS: A prospective observational multi-centre study took place in the physiotherapy services linked to the EDs of four teaching hospitals in Dublin, Ireland. Patients were stratified into low-, medium-, and high-risk groups at their baseline assessment. Participants received stratified care, in which the content of their treatment was matched to their risk profile. Outcomes collected at baseline and 3 months included pain and disability. Linear regression analyses assessed whether baseline or 6-week SBT scores were predictive of disability at 3 months. Changes in the primary outcome of disability were dichotomised into those who did or did not achieve a 30% improvement in their RMDQ at 6 weeks and 3 months. RESULTS: The study enrolled 118 patients with a primary complaint of LBP ± leg pain, with 67 (56.7%) completing their 6-week and 3-month follow-up. Baseline RMDQ and being in the medium- or high-risk SBT group at 6 weeks were predictive of persistent disability at 3 months. A total of 54 (80.6%) participants reported a >30% improvement at 3 months. CONCLUSION: Disability at baseline and the SBT administered at 6 weeks more accurately predicted disability at 3 months than the SBT at baseline in an ED population.
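The primary outcome was dichotomised on a 30% within-patient improvement in the Roland-Morris Disability Questionnaire (RMDQ). A small sketch of that rule follows; the strict ">30%" cut-off is taken from the abstract, while the function name and the zero-baseline guard are my own.

```python
def improved_30pct(baseline_rmdq: float, followup_rmdq: float) -> bool:
    """True if RMDQ (0-24; higher = more disability) improved by more than 30%."""
    if baseline_rmdq <= 0:
        return False  # no disability at baseline, so no improvement to measure
    return (baseline_rmdq - followup_rmdq) / baseline_rmdq > 0.30

print(improved_30pct(14, 9))   # True: 5/14 ≈ 36% improvement
print(improved_30pct(14, 11))  # False: 3/14 ≈ 21% improvement
```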


Subject(s)
Low Back Pain , Humans , Low Back Pain/therapy , Disability Evaluation , Treatment Outcome , Prospective Studies , Emergency Service, Hospital
4.
Pulmonology ; 28(4): 297-309, 2022.
Article in English | MEDLINE | ID: mdl-35227650

ABSTRACT

BACKGROUND AND AIM: Tuberculosis (TB) is associated with high mortality in the intensive care unit (ICU), especially in subjects with Acute Respiratory Distress Syndrome (ARDS) requiring mechanical ventilation. Despite its global burden of morbidity and mortality, TB is an uncommon cause of ICU admission; however, its mortality remains disproportionately high given the advances made in diagnosis and treatment. Herein we report a systematic review of published studies. METHODS: A literature search was conducted to identify studies on outcomes of individuals with TB admitted to the ICU. We report and review in-hospital mortality, predictors of poorer outcomes, the usefulness of severity scoring systems, and the potential benefits of intravenous antibiotics. Pubmed, Embase, Cochrane, and Medline were searched from inception to March 2020. Only literature in English was included. RESULTS: Of 529 potentially relevant articles, 17 were included. Mortality across all studies ranged from 29% to 95%, with an average of 52.9%. All severity scores underestimated average mortality. The most common indication for ICU admission was acute respiratory failure (36.3%). Negative predictors of outcome included hospital-acquired infections, need for mechanical ventilation and vasopressors, delay in initiation of anti-TB treatment, more than one organ failure, and a higher severity score. Low-income, high-incidence countries showed a 23.4% higher mortality rate than high-income, low-incidence countries. CONCLUSION: Mortality in individuals with TB admitted to the ICU is high. Earlier detection and treatment initiation are needed.
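The abstract does not say whether the 52.9% average is a simple mean of study-level rates or a patient-weighted pool. The sketch below shows the patient-weighted version with invented counts, purely to illustrate the arithmetic.

```python
def pooled_mortality(studies):
    """Patient-weighted pooled mortality.

    studies: iterable of (deaths, patients) pairs, one per study (hypothetical).
    """
    deaths = sum(d for d, _ in studies)
    patients = sum(n for _, n in studies)
    return deaths / patients

# Three invented studies spanning roughly the 29-95% range reported above.
print(f"{pooled_mortality([(29, 100), (57, 60), (38, 40)]):.1%}")  # 62.0%
```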


Subject(s)
Respiratory Distress Syndrome , Tuberculosis, Pulmonary , Critical Care , Humans , Intensive Care Units , Respiration, Artificial , Tuberculosis, Pulmonary/diagnosis , Tuberculosis, Pulmonary/drug therapy
5.
J Shoulder Elbow Surg ; 30(5): 1060-1067, 2021 May.
Article in English | MEDLINE | ID: mdl-32853790

ABSTRACT

HYPOTHESIS AND BACKGROUND: Complex glenoid bone loss and deformity present a challenge for the shoulder arthroplasty surgeon. Eccentric reaming, bone grafting, augmented glenoid components, and salvage hemiarthroplasty are common strategies for managing these patients. The glenoid vault reconstruction system (VRS; Zimmer-Biomet) is a novel solution for both primary and revision arthroplasty using a custom glenoid baseplate. We hypothesized that patients undergoing reverse shoulder arthroplasty (RSA) with the VRS would have acceptable short-term outcomes and complication rates. METHODS: Patients who underwent RSA with the VRS for severe glenoid deformity or bone loss by one of 4 board-certified, fellowship-trained shoulder and elbow surgeons at 3 academic tertiary referral centers between September 2015 and November 2018 were eligible for inclusion. Patient data were obtained via medical record review and telephone questionnaires. The Numeric Pain Rating Scale (NPRS), Single Assessment Numeric Evaluation (SANE), American Shoulder and Elbow Surgeons Standardized Shoulder Assessment Form (ASES), and Penn Shoulder Scores, as well as range of motion (ROM) measurements, were obtained pre- and postoperatively. Radiographs were reviewed at final follow-up for evidence of component loosening or hardware failure. Any complication was documented. Outcomes were compared using Wilcoxon signed-rank tests, with P < .05 considered significant. RESULTS: Twelve shoulders (11 patients) were included, with a mean age of 68 years; 7 were primary arthroplasties and 5 were revisions. At an average follow-up of 30 months, the median improvement was 7 points in NPRS score, 43% in SANE score, 45 points in ASES score, and 49 points in Penn Shoulder Score. There were statistically significant improvements in median ROM measurements (forward elevation 20°, external rotation 40°, internal rotation 2 spinal levels). At final follow-up, all implants were radiographically stable without loosening. There were no complications. DISCUSSION AND CONCLUSION: This study demonstrates that RSA using the custom VRS glenoid implant is a safe and effective technique for addressing complex glenoid deformity or bone loss in both primary and revision settings. At short-term follow-up, all patient-reported outcomes and ROM measures improved significantly, and there were no complications. Future work should determine mid- and long-term outcomes, preferably in a prospective manner with defined patient populations.
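Paired pre/post comparisons on 12 shoulders with non-normal outcome scores are a textbook case for the Wilcoxon signed-rank test named above. A minimal sketch with SciPy follows, using invented paired scores; the study's raw data are not shown in the abstract.

```python
from scipy import stats

# Invented paired ASES-style scores (0-100) for 12 shoulders, pre and post.
pre  = [20, 35, 28, 15, 40, 22, 30, 18, 25, 33, 27, 21]
post = [70, 72, 85, 55, 88, 60, 74, 52, 69, 80, 77, 66]

stat, p = stats.wilcoxon(pre, post)  # paired, non-parametric
print(f"W = {stat}, P = {p:.4f}")    # P < .05 would be declared significant
```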


Subject(s)
Arthroplasty, Replacement, Shoulder , Glenoid Cavity , Shoulder Joint , Aged , Follow-Up Studies , Glenoid Cavity/diagnostic imaging , Glenoid Cavity/surgery , Humans , Prospective Studies , Range of Motion, Articular , Retrospective Studies , Shoulder Joint/diagnostic imaging , Shoulder Joint/surgery , Treatment Outcome
6.
J Syst Integr Neurosci ; 7, 2020 Apr 30.
Article in English | MEDLINE | ID: mdl-32934824

ABSTRACT

In the face of the global pandemic of COVID-19, approaching 1.75 million infected worldwide (4/12/2020) and associated mortality (over 108,000 as of 4/12/2020), as well as other catastrophic events including the opioid crisis, a focus on brain health seems prudent [1] (https://www.coronavirus.gov). This manuscript reports on the systemic benefits of restoring and achieving dopamine homeostasis to reverse and normalize the thoughts and behaviors of Reward Deficiency Syndrome (RDS) dysfunctional conditions and their effects on behavioral physiology; the function of reward genes; and digestive, immune, and eye health, along with the constellation of symptomatic behaviors. The role of nutrigenomic interventions in restoring normal brain function, and their benefits for these systems, is discussed. We demonstrate that modulation of dopamine homeostasis using nutrigenomic dopamine agonists, instead of pharmaceutical interventions, is achievable. The allied interlinking with diverse chronic diseases and disorders, the roles of free radicals, and the incidence of anaerobic events are highlighted extensively. In conjunction, the role of dopamine in aspects of sleep, rapid eye movement, and waking is discussed extensively. The integral aspects of food indulgence, the influence of taste sensations, and gut-brain signaling are also discussed, with a special emphasis on ocular health. Detailed mechanistic insight into dopamine, immune competence, and the allied aspects of autoimmune disorders is also provided. Finally, the integration of dopamine homeostasis utilizing a patented gene test and a research-validated nutrigenomic intervention is presented. Overall, a cutting-edge nutrigenomic intervention could prove to be a technological paradigm shift in our understanding of the extent to which achieving dopamine homeostasis will benefit overall health.

8.
Eur Ann Otorhinolaryngol Head Neck Dis ; 136(6): 439-445, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31477531

ABSTRACT

OBJECTIVE: To assess the impact of rehabilitation systems (CROS: Contralateral Routing of Signal; BAHA: Bone-Anchored Hearing Aid; CI: cochlear implant) on cortical auditory evoked potentials (CAEP) and auditory performance in unilateral hearing loss. SUBJECTS AND METHOD: Twenty-one adults with unilateral hearing loss, using CROS (n=6), BAHA (n=6) or CI (n=9), were included. Seven normal-hearing subjects served as controls. CAEPs were recorded in response to a speech stimulus (/ba/); for patients, tests were conducted with and without their auditory rehabilitation. The amplitude and latency of the various CAEP components of the global field power (GFP) were measured, and scalp potential fields were mapped. Behavioral assessment used sentence recognition in noise, with and without spatial cues. RESULTS: Only CI induced a change in N1 peak amplitude (P<0.05). CI and CROS increased polarity-inversion amplitude in the contralateral ear, and frontocentral negativity on the scalp potential map. CI improved understanding when speech was presented to the implanted ear and noise to the healthy ear, and vice versa. CONCLUSION: Cochlear implantation had the greatest impact on CAEP morphology and auditory performance. A longitudinal study could analyze the progression of cortical reorganization.
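Global field power, from which the CAEP component amplitudes and latencies above were measured, is conventionally defined (after Lehmann and Skrandies) as the standard deviation of the average-referenced potentials across all electrodes at each time point. The study's exact computation is not given, so take this as the standard formula:

```latex
\mathrm{GFP}(t) = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(u_i(t)-\bar{u}(t)\bigr)^{2}},
\qquad
\bar{u}(t) = \frac{1}{N}\sum_{i=1}^{N} u_i(t)
```

where u_i(t) is the potential at electrode i of N electrodes at time t.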


Subject(s)
Auditory Cortex/physiopathology , Evoked Potentials, Auditory/physiology , Hearing Loss, Unilateral/rehabilitation , Adult , Aged , Aged, 80 and over , Cochlear Implants , Female , Functional Laterality/physiology , Hearing Aids , Hearing Loss, Unilateral/physiopathology , Humans , Male , Middle Aged , Perceptual Masking/physiology , Reaction Time/physiology , Speech Perception/physiology
9.
Eur J Orthop Surg Traumatol ; 29(6): 1319-1323, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30963325

ABSTRACT

INTRODUCTION: Opioids are commonly used for post-operative pain control. It is known that diabetic patients with ankle fractures experience prolonged healing, a higher risk of hardware failure, and an increased risk of infection. However, the opioid requirements of this patient cohort have not been previously evaluated. Thus, the purpose of this study was to retrospectively compare opioid utilization between ankle fracture patients with and without diabetes mellitus (DM). METHODS: IRB approval was obtained for the retrospective review of patients who presented with an ankle fracture and underwent surgery between November 2013 and January 2017. A total of 180 patients (144 without DM, 36 with DM) with a mean age of 50 years (±18 years) were included. Opioid consumption was quantified using a morphine-milliequivalent conversion algorithm. A repeated measures ANOVA was conducted to compare opioid consumption. A two-tailed p value of 0.05 was set as the threshold for statistical significance. RESULTS: Repeated measures ANOVA revealed a statistically significant decrease in total opioid consumption over the 4-month duration (p < 0.001). The model demonstrated a mean difference in opioid consumption of -214.3 morphine meq between the patients without and with DM (p = 0.022). Post hoc pair-wise comparison revealed less opioid consumption amongst non-diabetic patients at 2 months (-418.5 meq; p = 0.009), 3 months (-355.6 meq; p = 0.021), and 4 months (-152.6 meq; p = 0.006) after surgery. CONCLUSION: Our study revealed increased opioid consumption amongst diabetic patients treated surgically for ankle fractures. With increasing efforts aimed at reducing opioid administration, orthopaedic surgeons should be aware of the higher opioid consumption in this patient cohort. Further studies are needed to verify these results.
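The conversion algorithm itself is not published in the abstract. As a hedged sketch of how such a quantification typically works, the Python below sums dispensed doses using widely published morphine-equivalent factors (e.g. from CDC guidance); the factor table and function name are illustrative, not the study's.

```python
# Illustrative morphine-equivalent conversion factors (per published CDC
# guidance); the study's actual algorithm and factors are not stated.
MORPHINE_EQ_FACTOR = {
    "morphine": 1.0,
    "hydrocodone": 1.0,
    "oxycodone": 1.5,
    "hydromorphone": 4.0,
    "codeine": 0.15,
    "tramadol": 0.1,
}

def total_morphine_equivalents(prescriptions):
    """Sum opioid doses as morphine equivalents.

    prescriptions: iterable of (drug_name, total_mg_dispensed) pairs.
    """
    return sum(MORPHINE_EQ_FACTOR[drug.lower()] * mg for drug, mg in prescriptions)

print(total_morphine_equivalents([("oxycodone", 150), ("tramadol", 500)]))  # 275.0
```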


Subject(s)
Analgesics, Opioid , Ankle Fractures/surgery , Diabetes Mellitus/epidemiology , Fracture Fixation/adverse effects , Pain, Postoperative/drug therapy , Postoperative Complications , Analgesics, Opioid/administration & dosage , Analgesics, Opioid/adverse effects , Ankle Fractures/epidemiology , Comorbidity , Drug Utilization Review , Female , Fracture Fixation/methods , Humans , Male , Middle Aged , Pain Management/methods , Pain Management/statistics & numerical data , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Prescription Drug Overuse/prevention & control , Retrospective Studies
10.
Am J Sports Med ; 47(6): 1294-1301, 2019 05.
Article in English | MEDLINE | ID: mdl-30995074

ABSTRACT

BACKGROUND: The use of artificial turf in American football continues to grow in popularity, and the effect of these playing surfaces on athletic injuries remains controversial. Knee injuries account for a significant portion of injuries in National Collegiate Athletic Association (NCAA) football; however, the effect of artificial surfaces on knee injuries remains ill-defined. HYPOTHESIS: There is no difference in the rate or mechanism of knee ligament and meniscal injuries during NCAA football events on natural grass and artificial turf playing surfaces. STUDY DESIGN: Descriptive epidemiology study. METHODS: The NCAA Injury Surveillance System Men's Football Injury and Exposure Data Sets for the 2004-2005 through 2013-2014 seasons were analyzed to determine the incidence of anterior cruciate ligament (ACL), posterior cruciate ligament (PCL), medial collateral ligament (MCL), medial meniscal, and lateral meniscal tears. Injury rates were calculated per 10,000 athlete-exposures, and rate ratios (RRs) were used to compare injury rates during practices and competitions on natural grass and artificial turf in NCAA football as a whole and by competition level (Division I; Divisions II and III). Mechanisms of injury were calculated for each injury on natural grass and artificial turf surfaces. RESULTS: A total of 3,009,205 athlete-exposures and 2460 knee injuries were reported from 2004 to 2014: 1389 MCL, 522 ACL, 269 lateral meniscal, 164 medial meniscal, and 116 PCL. Athletes experienced all knee injuries at a significantly higher rate in competitions than in practices. Athletes competing on artificial turf experienced PCL injuries at 2.94 times the rate of those playing on grass (RR = 2.94; 95% CI, 1.61-5.68). When stratified by competition level, Division I athletes competing on artificial turf experienced PCL injuries at 2.99 times the rate of those playing on grass (RR = 2.99; 95% CI, 1.39-6.99), and athletes in the lower NCAA divisions (II and III) experienced ACL injuries at 1.63 times the rate (RR = 1.63; 95% CI, 1.10-2.45) and PCL injuries at 3.13 times the rate (RR = 3.13; 95% CI, 1.14-10.69) on artificial turf compared with grass. There was no statistically significant difference in the rate of MCL, medial meniscal, or lateral meniscal injuries on artificial turf versus grass when stratified by event type or level of NCAA competition. No difference was found in the mechanisms of knee injuries on natural grass and artificial turf. CONCLUSION: Artificial turf is an important risk factor for specific knee ligament injuries in NCAA football. Injury rates for PCL tears were significantly higher in competitions played on artificial turf than on natural grass. The lower NCAA divisions (II and III) also showed higher rates of ACL injuries in competitions on artificial turf versus natural grass.
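Rates per 10,000 athlete-exposures and their rate ratios follow standard epidemiological arithmetic, with the 95% CI usually obtained on the log scale. Below is a sketch with invented counts (the data set's actual tallies are not reproduced in the abstract), chosen only so the ratio lands near the reported PCL figure.

```python
import math

def rate_per_10k(injuries: int, athlete_exposures: int) -> float:
    return injuries / athlete_exposures * 10_000

def rate_ratio_95ci(a: int, n1: int, b: int, n2: int):
    """Rate ratio of group 1 vs. group 2 with a log-normal 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a + 1 / b)  # SE of ln(RR) for Poisson counts
    return rr, rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

# Invented counts: 25 PCL injuries in 1.0M turf exposures vs. 17 in 2.0M grass.
rr, lo, hi = rate_ratio_95ci(25, 1_000_000, 17, 2_000_000)
print(f"RR = {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")  # RR ≈ 2.94
```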


Subject(s)
Football/injuries , Knee Injuries/epidemiology , Poaceae , Anterior Cruciate Ligament , Anterior Cruciate Ligament Injuries/epidemiology , Athletes , Athletic Injuries/epidemiology , Humans , Incidence , Male , Menisci, Tibial , Posterior Cruciate Ligament/injuries , Risk Factors , Seasons , Tibial Meniscus Injuries/epidemiology , United States/epidemiology , Universities
11.
Open Respir Med J ; 12: 1-10, 2018.
Article in English | MEDLINE | ID: mdl-29456774

ABSTRACT

Gastro-Oesophageal Reflux (GOR) has been associated with chronic airway diseases, while the passage of foreign matter into the airways and lungs through aspiration has the potential to initiate a wide spectrum of pulmonary disorders. The clinical syndrome resulting from such aspiration depends both on the quantity and nature of the aspirate and on the individual host response. Aspiration of gastric fluids may damage the airway epithelium, not only because acidity is toxic to bronchial epithelial cells but also through the action of digestive enzymes such as pepsin and of bile salts. Experimental models have shown that direct instillation of these factors onto the airway epithelium causes damage with a consequent inflammatory response. The pathophysiology of these responses is gradually being dissected; a better understanding of acute gastric aspiration injury, a major cause of acute lung injury, is providing opportunities for therapeutic intervention and, potentially, an improved understanding of the chronic airway response to aspiration. Ultimately, clarification of the inflammatory pathways related to micro-aspiration of pepsin and bile acid salts may progress to pharmacological intervention and surgical studies to assess the clinical benefits of such therapies in driving symptom improvement or reducing disease progression.

13.
Ir Med J ; 110(7): 615, 2017 Aug 12.
Article in English | MEDLINE | ID: mdl-29168997

ABSTRACT

In 2015, the Department of Health published the first annual report of the "National Healthcare Quality Reporting System." Connolly Hospital was reported to have a mortality rate within 30 days of acute myocardial infarction (AMI) of 9.87 per 100 cases, which was statistically significantly higher than the national rate. We carried out a retrospective audit of patients HIPE-coded as having died within 30 days of AMI from 2011 to 2013 and identified 42 patients. On review, only 23 patients (54.8%) were confirmed as having had an AMI. We identified 12 patients whose death certificates listed AMI without any evidence of same. If the 22 incorrectly coded patients were excluded, the mortality rate within 30 days post-AMI in Connolly Hospital would fall to 4.14 deaths per 100 cases, well below the national average. Inaccuracies in data collection can lead to erroneous conclusions when examining healthcare data.


Subject(s)
Hospital Mortality , Myocardial Infarction/mortality , Humans , Quality of Health Care , Retrospective Studies , Time Factors
14.
Ir J Med Sci ; 186(3): 607-613, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28238200

ABSTRACT

BACKGROUND: Left cardiac sympathetic denervation (LCSD) is a surgical procedure that has been shown to have an antiarrhythmic and antifibrillatory effect. Evidence indicating its antiarrhythmic effect has been available for over 100 years. It involves removal of the lower half of the stellate ganglion and the T2-T4 sympathetic ganglia and is carried out as either a unilateral or a bilateral procedure. With advancements in thoracic surgery, it can be safely performed via a minimally invasive Video-Assisted Thoracoscopic Surgery (VATS) approach, resulting in significantly less morbidity and a shortened inpatient stay. LCSD provides a valuable treatment option for patients with life-threatening channelopathies and cardiomyopathies. AIMS AND METHODS: This case series reports the preliminary paediatric and adult experience with LCSD in the Republic of Ireland, describing five recently treated cases and outlining the operative procedure employed. Of the five cases included, two were paediatric and three were adult. RESULTS: One of the paediatric patients had a diagnosis of the rare catecholaminergic polymorphic ventricular tachycardia (CPVT) and the other a diagnosis of long-QT syndrome. Both paediatric patients experienced excellent outcomes. Of the three adult patients, two benefitted greatly and remain well at follow-up (one with inappropriate sinus tachycardia and one with CPVT). One patient with idiopathic ventricular fibrillation (VF) died of intractable VF despite all attempts at resuscitation. CONCLUSION: In this case series, we highlight that LCSD provides a critical adjunct to existing medical therapies and should be considered for all patients with life-threatening refractory arrhythmias, especially those on maximal medical therapy.


Subject(s)
Sympathectomy/methods , Child , Female , Humans , Ireland , Male , Middle Aged , Treatment Outcome
15.
Bone Marrow Transplant ; 51(12): 1594-1598, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27427918

ABSTRACT

Emerging evidence suggests that psychosocial factors assessed pre-transplant predict survival in cancer patients undergoing hematopoietic stem cell transplantation (HSCT). These studies, however, typically have small sample sizes, short-term follow-ups, or a limited panel of medical covariates. We extend this research in a large, well-characterized sample of transplant patients, asking whether patients' perceived emotional support and psychological distress predict mortality over 2 years. Prior to transplant, 400 cancer patients (55.5% male; 82.8% White; mean age = 50.0 years; 67.0% leukemia, 20.0% lymphoma) were interviewed by a social caseworker, who documented the patients' perceived emotional support and psychological distress. Subsequently, patients received an allogeneic HSCT (51.0% matched related donor, 42.0% matched unrelated donor, and 7.0% cord blood). HSCT outcomes were obtained from medical records. Controlling for demographic characteristics (age, sex, race/ethnicity, and marital status) and medical confounders (disease type, conditioning regimen, remission status, cell dosage, donor and recipient CMV seropositivity, donor sex, comorbidities, and disease risk), ratings of good emotional support pre-transplant predicted longer overall survival (hazard ratio (HR) = 0.61; 95% confidence interval (CI), 0.42-0.91; P = 0.013). Pre-transplant psychological distress, however, was unrelated to survival (all Ps > 0.58). Emotional support was marginally associated with lower rates of treatment-related mortality (HR = 0.58; CI, 0.32-1.05; P = 0.073). These findings are consistent with the hypothesis that emotional support contributes to better outcomes following HSCT. Future studies should examine whether interventions to optimize emotional resources can improve survival in cancer patients.
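Hazard ratios of this kind come from a Cox proportional-hazards model of survival time on the support rating plus covariates. A minimal sketch with the lifelines library on a toy data frame follows; column names and values are invented, and the real model adjusted for the full covariate list above.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: survival time (months, capped at 24), death indicator, and a
# binary rating of good pre-transplant emotional support. Values invented.
df = pd.DataFrame({
    "months":       [24, 6, 24, 13, 20, 3, 18, 24, 9, 24],
    "died":         [0,  1, 0,  1,  1,  1, 0,  0,  1, 0],
    "good_support": [1,  0, 1,  0,  1,  0, 0,  1,  0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # exp(coef) for good_support is the hazard ratio
```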


Subject(s)
Caregivers/psychology , Leukemia/psychology , Lymphoma/psychology , Social Support , Adult , Female , Hematopoietic Stem Cell Transplantation/mortality , Humans , Leukemia/mortality , Leukemia/therapy , Lymphoma/mortality , Lymphoma/therapy , Male , Middle Aged , Prognosis , Stress, Psychological/psychology , Survival Rate
16.
Eur Arch Otorhinolaryngol ; 273(8): 2019-26, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26329899

ABSTRACT

The objective of this study was to investigate the usefulness of auditory steady-state responses (ASSRs) for estimating hearing thresholds in young children, compared with behavioural thresholds. The second objective was to investigate ASSR thresholds obtained with insert earphones versus supra-aural headphones to determine which transducer produces ASSR thresholds most similar to behavioural thresholds measured with supra-aural headphones. This retrospective study included 29 participants (58 ears): 12 children (24 ears) in the insert group and 17 children (34 ears) in the supra-aural group. No general anaesthesia was used. For both groups, there was a strong correlation between behavioural and ASSR thresholds, with a stronger correlation for the insert group. When behavioural thresholds are difficult to obtain, ASSR may be a useful objective measure that can be combined with other audiometric procedures to estimate hearing thresholds and to determine appropriate auditory rehabilitation approaches.
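The behavioural/ASSR agreement reported above is a paired correlation across ears. A minimal sketch with SciPy follows, using invented thresholds in dB HL (ASSR thresholds typically sit slightly above behavioural ones).

```python
from scipy import stats

# Invented paired thresholds (dB HL) for one test frequency, per ear.
behavioural = [20, 35, 50, 65, 30, 45, 55, 25]
assr        = [30, 45, 55, 75, 40, 50, 70, 35]

r, p = stats.pearsonr(behavioural, assr)
print(f"r = {r:.2f}, P = {p:.4f}")  # a strong correlation gives r close to 1
```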


Subject(s)
Auditory Threshold/physiology , Child Behavior/physiology , Hearing Loss , Transducers , Audiometry/instrumentation , Audiometry/methods , Child, Preschool , Comparative Effectiveness Research , Female , Hearing Loss/diagnosis , Hearing Loss/psychology , Humans , Infant , Male , Retrospective Studies , Transducers/classification , Transducers/standards
17.
J Psychiatr Ment Health Nurs ; 22(10): 773-83, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26459938

ABSTRACT

ACCESSIBLE SUMMARY: What is known on the subject? Stress can affect students during mental health nurse training. This can have implications at the individual level (e.g. their own mental health) and at the level of the organization (e.g. sickness absence and attrition). What does this paper add to existing knowledge? We interviewed 12 mental health nursing students regarding the stress they experienced during training. Participants described how the academic demands can at times be unbearable during clinical placements. There were also issues with 'being a student' on some placements, with participants describing negative attitudes towards them from staff. The younger participants reported feeling overwhelmed on their initial placements and described some of the main challenges of mental health work for them. Raising concerns about the quality of care on wards was also described as particularly challenging for the students. What are the implications for practice? This paper can help training providers support mental health nursing students. Recommendations include reducing academic demands during clinical placements and extending and promoting existing support services beyond normal 9 am-5 pm working hours, even if these services are limited. Younger students could be better supported by being allocated to the more well-resourced placements in the early stages of their training. Raising awareness among staff of the tasks students can and cannot perform can help improve staff/student relations. Finally, students should be educated about the issues around raising concerns on placements, to help the government's drive for a more open and transparent National Health Service (NHS). INTRODUCTION: Previous studies investigating stress in nursing students focus on general nursing students or adopt quantitative measures. PURPOSE OF STUDY: A qualitative study focusing specifically on mental health nursing students is required. METHOD: One-to-one interviews were carried out with mental health nursing students (n = 12). Data were thematically analysed. RESULTS: Participants reported unreasonable demands during clinical blocks, and described how control/support is lowest on placements with staff shortages. Negative attitudes towards students from staff and related issues were also discussed. Younger participants described struggling with mental health work during the early stages of training. DISCUSSION: Training providers should strive to provide adequate support to students to help them manage stress during training. IMPLICATIONS FOR PRACTICE: Academic demands should be reasonable during clinical blocks, and support services outside normal working hours should be available to students, even if these are limited in scope. Greater consideration of the allocation of placements for younger students in the mental health branch could be helpful. Furthermore, staff on placements should be aware of the tasks students can and cannot perform, to help improve staff/student relations. Educating students on the issues of raising concerns can help the government's drive for a more open and transparent National Health Service (NHS).


Subject(s)
Education, Nursing, Baccalaureate , Psychiatric Nursing/education , Stress, Psychological/psychology , Students, Nursing/psychology , Adult , Female , Humans , Interview, Psychological , Male , Nursing Education Research , Qualitative Research , Young Adult
18.
Am J Transplant ; 15(6): 1475-83, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25807873

ABSTRACT

The development of organ transplantation as a therapy for end-stage organ failure is among the most significant achievements of 20th century medicine, but chronic rejection remains a barrier to achieving long-term success. Current therapeutic regimens consist of immunosuppressive drugs that are efficient at delaying rejection but are associated with significant risks such as opportunistic infections, toxicity, and malignancy. Thus, the induction of specific immune tolerance to transplant antigens is the coveted aim of researchers. The use of 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide (ECDI)-treated, autoantigen-coupled syngeneic leukocytes has been developed as a specific immunotherapy in preclinical models of autoimmunity and is currently in a phase II clinical trial for the treatment of multiple sclerosis. In this review, we discuss the use of allogeneic ECDI-treated apoptotic donor leukocytes (allo-ECDI-SP) as a strategy for inducing antigen-specific tolerance in allogeneic transplantation. Allo-ECDI-SP therapy induces long-term systemic immune tolerance to transplant antigens by subverting alloimmune recognition and exploiting apoptotic cell uptake pathways to recapitulate innate mechanisms of peripheral tolerance. Lastly, we discuss potential indications and challenges for transitioning allo-ECDI-SP therapy into clinical practice.


Subject(s)
Apoptosis/immunology , Immune Tolerance/immunology , Immunity, Innate/immunology , Leukocytes/immunology , Tissue Donors , Transplants/immunology , Animals , Apoptosis/physiology , Cytokines/physiology , Ethyldimethylaminopropyl Carbodiimide/pharmacology , Graft Rejection/immunology , Graft Rejection/prevention & control , Humans , Immune Tolerance/physiology , Immunity, Cellular/immunology , Immunity, Cellular/physiology , Immunity, Innate/physiology , Immunotherapy/methods , Leukocytes/cytology , Leukocytes/drug effects , Models, Animal , Transplantation, Homologous , Transplants/cytology , Transplants/physiology
19.
Int J Clin Pract ; 69(5): 518-30, 2015 May.
Article in English | MEDLINE | ID: mdl-25684069

ABSTRACT

BACKGROUND AND OBJECTIVES: Rivastigmine patch is approved for the treatment of all stages of Alzheimer's disease (AD). Application site reactions may be a concern to clinicians, and we used two large clinical trial databases to investigate the incidence of skin reactions in patients receiving rivastigmine patch. METHODS: Data from a 24-week, randomised, double-blind (DB) evaluation of 13.3 vs. 4.6 mg/24 h rivastigmine patch in severe AD (ACTION) and a 72- to 96-week study comprising an initial open-label (IOL) phase followed by a 48-week randomised, DB phase (13.3 vs. 9.5 mg/24 h rivastigmine patch) in declining patients with mild-to-moderate AD (OPTIMA) were analyzed. The incidence, frequency, severity, management, and predictors of application site reactions were assessed. RESULTS: Application site reactions were mostly mild or moderate in severity and were reported in similar proportions in each treatment group (ACTION: 13.3 mg/24 h, 24.5% and 4.6 mg/24 h, 24.2%; OPTIMA: IOL 9.5 mg/24 h, 22.9%; DB 13.3 mg/24 h, 11.4% and 9.5 mg/24 h, 12.0%); none were rated serious. In both studies, <9% of patients required treatment for application site reactions. Application site reactions led to discontinuation of 1.7% and 2.5% of the 13.3 mg/24 h and 4.6 mg/24 h groups, respectively, in ACTION; 8.7% in OPTIMA IOL; and 1.8% and 3.5% of the 13.3 mg/24 h and 9.5 mg/24 h groups, respectively, in OPTIMA DB. CONCLUSIONS: Application site reactions were experienced by <25% of patients in both studies, with no notable effect of dose. No reactions qualified as serious, and skin reactions were uncommon as a reason for study discontinuation.


Subject(s)
Alzheimer Disease/drug therapy , Drug Eruptions/etiology , Neuroprotective Agents/administration & dosage , Rivastigmine/administration & dosage , Administration, Cutaneous , Aged , Dose-Response Relationship, Drug , Drug Eruptions/pathology , Female , Humans , Incidence , Male , Neuroprotective Agents/adverse effects , Randomized Controlled Trials as Topic , Rivastigmine/adverse effects , Severity of Illness Index , Transdermal Patch
20.
Arch Orthop Trauma Surg ; 134(9): 1211-7, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25077784

ABSTRACT

INTRODUCTION: The optimal timing of surgery for multiply injured patients with operative spinal injuries remains unknown. The purported benefits of early intervention must be weighed against the morbidity of surgery in the early post-injury period. The performance of spine surgery in the Afghanistan theater permits analysis of the morbidity of early surgery on military casualties. The objective was to compare the surgical morbidity of early spinal surgery in multiply injured patients versus stable patients. MATERIALS AND METHODS: Patients were retrospectively categorized as stable or borderline unstable depending on the presence of at least one of the following: ISS >40; ISS >20 and chest injury; exploratory laparotomy or thoracotomy; lactate >2.5 mEq/L; platelet count <110,000/mm3; or >10 U PRBCs transfused pre-operatively. Surgical morbidity, complications, and neurologic improvement between the two groups were compared retrospectively. RESULTS: Thirty casualties underwent 31 spine surgeries during a 12-month period. Sixteen of 30 patients met criteria indicating a borderline unstable patient. Although there were no significant differences in the procedures performed for stable and borderline unstable patients as measured by the Surgical Invasiveness Index (7.5 vs. 6.9, p = 0.8), borderline unstable patients had significantly higher operative time (4.3 vs. 3.0 h, p = 0.01), blood loss (1,372 vs. 366 mL, p = 0.001), PRBCs transfused intra-operatively (3.88 vs. 0.14 U, p < 0.001), and total PRBCs transfused in theater (10.18 vs. 0.31 U, p < 0.001). CONCLUSIONS: The results indicate that published criteria defining a borderline unstable patient may have a role in predicting increased morbidity of early spine surgery. The perceived benefits of early intervention should be weighed against the greater risks of performing extensive spinal surgeries on multiply injured patients in the early post-injury period, especially in the setting of combat trauma.
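The "borderline unstable" definition above is a simple any-of rule and can be written down directly. In the sketch below, the parameter names are illustrative; the thresholds are the abstract's.

```python
def is_borderline_unstable(iss: int, chest_injury: bool,
                           laparotomy_or_thoracotomy: bool,
                           lactate_meq_l: float,
                           platelets_per_mm3: int,
                           preop_prbc_units: int) -> bool:
    """True if at least one of the study's instability criteria is met."""
    return any([
        iss > 40,
        iss > 20 and chest_injury,
        laparotomy_or_thoracotomy,
        lactate_meq_l > 2.5,
        platelets_per_mm3 < 110_000,
        preop_prbc_units > 10,
    ])

# Example: ISS 24 with a chest injury satisfies the second criterion.
print(is_borderline_unstable(24, True, False, 1.8, 250_000, 0))  # True
```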


Subject(s)
Military Personnel , Multiple Trauma/surgery , Spinal Cord Injuries/surgery , Spinal Injuries/surgery , Adult , Afghan Campaign 2001- , Blood Loss, Surgical/statistics & numerical data , Erythrocyte Transfusion/statistics & numerical data , Female , Humans , Injury Severity Score , Male , Middle Aged , Operative Time , Quality Improvement , Retrospective Studies , Time Factors , Treatment Outcome , United States