Results 1 - 20 of 59
1.
J Environ Manage ; 365: 121521, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38959774

ABSTRACT

As part of electronic waste (e-waste), the fastest-growing solid waste stream in the world, discarded liquid crystal displays (LCDs) contain substantial amounts of both valuable and potentially harmful metals, offering opportunities for resource recovery but also posing environmental threats. This comprehensive study investigated the bioleaching of indium from discarded LCD panels, focusing on shredded (Sh-LCDs) and powdered (P-LCDs) materials at high pulp density. An acidophilic consortium was used, and two pathways, a mixed sulfur-iron pathway and a sulfur-only pathway, were explored to understand the bioleaching mechanisms. Indium bioleaching efficiencies through the mixed sulfur-iron pathway were approximately 60% and 100% for Sh-LCDs and P-LCDs, respectively. Three mechanisms were involved in the extraction of indium from the LCD samples: acidolysis, complexolysis, and redoxolysis. Streak-plating of the microbial community adapted to a pulp density of 32.5 g/L revealed that sulfur-oxidizing bacteria dominated, yielding minimum indium extractions of 10% and 55% for the Sh-LCDs and P-LCDs samples, respectively. Ferric ions acting as oxidants were effective for indium bioleaching from both Sh-LCDs and P-LCDs, implying that cooperation and interaction within the microbial community enhanced the overall effectiveness of indium extraction from LCD panels. The adapted consortium detoxifies heavy metals through a combination of microbial transformation, efflux systems, and chelation by extracellular substances. At a pulp density of 32.5 g/L, the adapted microbial community achieved better indium leaching efficiency (50%) than the non-adapted community, which reached maxima of 29% and 5% from Sh-LCDs and P-LCDs, respectively. The advantages of the adapted community are attributed to factors such as high metabolic activity and improved tolerance to heavy metals; the protective role of the biofilm it forms is particularly noteworthy, as it contributes to resilience in the presence of inhibitory substances. This information is valuable for understanding and optimizing bioleaching processes for indium recovery and, by extension, possibly for other metals.


Subject(s)
Electronic Waste , Indium , Liquid Crystals
2.
IEEE Rev Biomed Eng ; 16: 136-151, 2023.
Article in English | MEDLINE | ID: mdl-34669577

ABSTRACT

Optical pulse detection, or photoplethysmography (PPG), provides a low-cost and unobtrusive means of physiological monitoring that is popular in many wearable devices. However, the accuracy, robustness and generalizability of single-wavelength PPG sensing are sensitive to biological characteristics as well as sensor configuration and placement; this is significant given the increasing adoption of single-wavelength wrist-worn PPG devices in clinical studies and healthcare. Because different wavelengths interact with the skin to varying degrees, researchers have explored multi-wavelength PPG to improve sensing accuracy, robustness and generalizability. This paper contributes a novel and comprehensive state-of-the-art review of wearable multi-wavelength PPG sensing, covering motion artifact reduction and the estimation of physiological parameters, together with the theory of multi-wavelength PPG sensing and the effects of biological characteristics. The review highlights promising developments in motion artifact reduction using multi-wavelength approaches, the effects of skin temperature on PPG sensing, the need for greater participant diversity in PPG sensing studies and the lack of studies investigating the combined effects of these factors. Recommendations are made for standardized and complete reporting of study design, sensing technology and participant characteristics.


Subject(s)
Wearable Electronic Devices , Wrist , Humans , Monitoring, Physiologic , Photoplethysmography , Heart Rate/physiology , Signal Processing, Computer-Assisted , Algorithms
3.
Annu Int Conf IEEE Eng Med Biol Soc ; 2022: 1651-1654, 2022 07.
Article in English | MEDLINE | ID: mdl-36086420

ABSTRACT

Wearable photoplethysmography (PPG) has gained prominence as a low-cost, unobtrusive and continuous method for physiological monitoring. The quality of the collected PPG signals is affected by several sources of interference, predominantly physical motion. Many methods for estimating heart rate (HR) from PPG signals have been proposed, with deep neural networks (DNNs) gaining popularity in recent years. However, the "black-box" and complex nature of DNNs has caused a lack of trust in their predictions. This paper contributes DeepPulse, an uncertainty-aware DNN method for estimating HR from PPG and accelerometer signals, with the aim of increasing trust in the predicted HR values. To the best of the authors' knowledge, no PPG HR estimation method has previously considered aleatoric and epistemic uncertainty metrics. The results show that DeepPulse is the most accurate of the DNN methods with smaller network sizes. Finally, recommendations are given for reducing epistemic uncertainty, validating uncertainty estimates, improving the accuracy of DeepPulse and reducing the model size for resource-constrained edge devices.
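The abstract does not describe DeepPulse's architecture, so the following is only a minimal sketch, in PyTorch, of one common way to obtain the two uncertainty types it mentions: a Gaussian-likelihood head whose predicted variance captures aleatoric uncertainty, plus Monte Carlo dropout at inference to approximate epistemic uncertainty. The layer sizes, names and input features are illustrative assumptions, not the published model.

```python
# Minimal sketch (not the DeepPulse architecture): a regression head that outputs
# a mean and log-variance for the aleatoric term, with Monte Carlo dropout at
# inference for the epistemic term.
import torch
import torch.nn as nn

class UncertainHREstimator(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, p_drop: float = 0.2):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.mean_head = nn.Linear(hidden, 1)     # predicted HR (bpm)
        self.logvar_head = nn.Linear(hidden, 1)   # per-sample aleatoric log-variance

    def forward(self, x):
        h = self.backbone(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mean, logvar, target):
    # Heteroscedastic Gaussian negative log-likelihood (trains the aleatoric head).
    return (0.5 * (torch.exp(-logvar) * (target - mean) ** 2 + logvar)).mean()

@torch.no_grad()
def predict_with_uncertainty(model, x, n_samples: int = 30):
    # Keep dropout active at inference (MC dropout) to sample the epistemic term.
    model.train()
    means, alea = [], []
    for _ in range(n_samples):
        m, lv = model(x)
        means.append(m)
        alea.append(torch.exp(lv))
    means = torch.stack(means)                 # (n_samples, batch, 1)
    hr = means.mean(dim=0)                     # point estimate
    epistemic = means.var(dim=0)               # spread across stochastic passes
    aleatoric = torch.stack(alea).mean(dim=0)  # average predicted noise variance
    return hr, aleatoric, epistemic
```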


Subject(s)
Photoplethysmography , Wrist , Heart Rate/physiology , Neural Networks, Computer , Photoplethysmography/methods , Signal Processing, Computer-Assisted , Uncertainty , Wrist/physiology
4.
PLoS Med ; 18(10): e1003783, 2021 10.
Article in English | MEDLINE | ID: mdl-34637437

ABSTRACT

BACKGROUND: Unkept outpatient hospital appointments cost the National Health Service £1 billion each year. Given the associated costs and morbidity of unkept appointments, this is an issue requiring urgent attention. We aimed to determine rates of unkept outpatient clinic appointments across hospital trusts in England. In addition, we aimed to examine the predictors of unkept outpatient clinic appointments across specialties at Imperial College Healthcare NHS Trust (ICHT). Our final aim was to train machine learning models to determine the effectiveness of a potential intervention in reducing unkept appointments. METHODS AND FINDINGS: UK Hospital Episode Statistics outpatient data from 2016 to 2018 were used for this study. Machine learning models (gradient boosting machines) were trained to determine predictors of unkept appointments and their relative importance. In 2017-2018 there were approximately 85 million outpatient appointments, with an unkept appointment rate of 5.7%. Within ICHT, there were almost 1 million appointments, with an unkept appointment rate of 11.2%. Hepatology had the highest rate of unkept appointments (17%), and medical oncology had the lowest (6%). The most important predictors of unkept appointments included the recency (25%) and frequency (13%) of previous unkept appointments and age at appointment (10%). For specialties with at least 10,000 appointments in 2016-2017 (after data cleaning), the overall sensitivity was 0.287: if the 10% of patients predicted as least likely to attend received a successful intervention, 28.7% of unkept appointments would be captured. Study limitations include that some unkept appointments may have been missed from the analysis, because recording of unkept appointments is not mandatory in England, and that the predictor analysis is based on a single trust, so results may not be generalisable to other locations. CONCLUSIONS: Unkept appointments remain an ongoing concern for healthcare systems internationally. Using machine learning, we can identify those most likely to miss their appointment and implement more targeted interventions to reduce unkept appointment rates.
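As a rough illustration of the modelling and the top-decile intervention analysis described above, the sketch below trains a gradient boosting classifier and computes the share of unkept appointments captured when the 10% of appointments with the highest predicted risk are targeted. The file name, column names and feature set are hypothetical placeholders, not the HES fields used in the study.

```python
# Illustrative sketch only: a gradient boosting classifier for "appointment not
# kept" and the sensitivity obtained by intervening on the top 10% highest-risk
# appointments. Column names are placeholders, not the study's HES fields.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("appointments.csv")  # hypothetical extract
features = ["prior_dna_recency_days", "prior_dna_count", "age_at_appointment"]
X, y = df[features], df["unkept"]     # unkept = 1 when the appointment was missed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Relative importance of predictors (the paper reports recency, frequency, age).
print(pd.Series(model.feature_importances_, index=features).sort_values(ascending=False))

# Sensitivity if the top decile of predicted risk received an intervention.
risk = model.predict_proba(X_te)[:, 1]
targeted = risk >= np.quantile(risk, 0.90)
sensitivity = (targeted & (y_te.values == 1)).sum() / (y_te.values == 1).sum()
print(f"Share of unkept appointments captured in the top 10%: {sensitivity:.1%}")
```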


Subject(s)
Appointments and Schedules , Health Services , Machine Learning , Outpatients , Cohort Studies , Delivery of Health Care , England , Humans , Likelihood Functions , Models, Theoretical
5.
Stud Health Technol Inform ; 281: 1106-1107, 2021 May 27.
Article in English | MEDLINE | ID: mdl-34042859

ABSTRACT

Extracting accurate heart rate estimates from wrist-worn photoplethysmography (PPG) devices is challenging because the signal contains artifacts from several sources. Deep learning approaches have shown very promising results, outperforming classical methods by 21% and 31% on two state-of-the-art datasets. This paper provides an analysis of several data-driven methods for creating deep neural network architectures, with the aim of further improvements.


Subject(s)
Signal Processing, Computer-Assisted , Wearable Electronic Devices , Algorithms , Artifacts , Heart Rate , Neural Networks, Computer
6.
Front Public Health ; 8: 556789, 2020.
Article in English | MEDLINE | ID: mdl-33224912

ABSTRACT

Technological innovations such as artificial intelligence and robotics may be of potential use in telemedicine and in building capacity to respond to future pandemics beyond the current COVID-19 era. Our international consortium of interdisciplinary experts in clinical medicine, health policy, and telemedicine has identified gaps in the uptake and implementation of telemedicine or telehealth across geographies and medical specialties. This paper discusses various artificial intelligence- and robotics-assisted telemedicine or telehealth applications during COVID-19 and presents an alternative artificial intelligence-assisted telemedicine framework to accelerate the rapid deployment of telemedicine and improve access to quality, cost-effective healthcare. We postulate that this artificial intelligence-assisted telemedicine framework would be indispensable in creating futuristic and resilient health systems that can support communities amidst pandemics.


Subject(s)
COVID-19 , Telemedicine , Artificial Intelligence , Humans , Pandemics , SARS-CoV-2
7.
Front Public Health ; 8: 556720, 2020.
Article in English | MEDLINE | ID: mdl-33178656

ABSTRACT

Coronavirus disease 2019 (COVID-19) has accelerated the adoption of telemedicine globally. This consortium critically examines existing telemedicine frameworks, identifies gaps in their implementation and investigates how telemedicine frameworks have changed during COVID-19 across the globe. A streamlined global public health preparedness framework that is interoperable and allows for collaboration and sharing of resources, with telemedicine as an integral part of the public health response during outbreaks such as COVID-19, should be pursued. With adequate reinforcement, telemedicine has the potential to act as the "safety net" of the public health response to an outbreak. The focus on telemedicine must also shift to developing and under-developed nations, which carry a disproportionate burden of vulnerable communities at risk from COVID-19.


Subject(s)
COVID-19 , Telemedicine , Humans , Pandemics/prevention & control , Public Health , SARS-CoV-2
8.
Front Public Health ; 8: 410, 2020.
Article in English | MEDLINE | ID: mdl-33014958

ABSTRACT

Technology has acted as a great enabler of continuity of patient care through remote consultation, ongoing monitoring, and patient education using telephone and videoconferencing in the coronavirus disease 2019 (COVID-19) era. The devastating impact of COVID-19 is bound to prevail beyond its current reign. The vulnerable sections of our community, including the elderly, those from lower socioeconomic backgrounds, those with multiple comorbidities, and immunocompromised patients, endure a relatively higher burden of a pandemic such as COVID-19. The rapid adoption of different technologies across countries, driven by the need to provide continued medical care in the era of social distancing, has catalyzed the penetration of telemedicine. Limiting the exposure of patients, healthcare workers, and systems is critical to controlling viral spread. Telemedicine offers an opportunity to improve the delivery, access, and efficiency of health systems. This article critically examines the current telemedicine landscape and the challenges to its adoption for remote/tele-delivery of care across various medical specialties. The consortium provides a roadmap and framework, along with recommendations, for telemedicine uptake and implementation in clinical practice during and beyond COVID-19.


Subject(s)
Ambulatory Care Facilities , COVID-19/prevention & control , Telemedicine/trends , Health Personnel , Humans , Pandemics , Physical Distancing , Videoconferencing
9.
Front Neurol ; 11: 664, 2020.
Article in English | MEDLINE | ID: mdl-32695066

ABSTRACT

With the rapid pace and scale of the emerging coronavirus disease 2019 (COVID-19) pandemic, a growing body of evidence has shown a strong association of COVID-19 with pre- and post-infection neurological complications. This has created the need to incorporate targeted neurological care for this subgroup of patients, which warrants further reorganization of services, the healthcare workforce, and the ongoing management of chronic neurological cases. The social distancing and shutdowns imposed by several nations in the midst of COVID-19 have severely impacted the ongoing care, access and support of patients with chronic neurological conditions such as multiple sclerosis, epilepsy, neuromuscular disorders, migraine, dementia, and Parkinson's disease. There is a pressing need for governing bodies, including national and international professional associations, health ministries and health institutions, to harmonize policies, guidelines, and recommendations relating to the management of chronic neurological conditions. These harmonized guidelines should ensure continuity of patient care across the spectrum of hospital and community care, including the well-being, safety, and mental health of patients, their care partners and the health professionals involved. This article provides an in-depth analysis of the impact of COVID-19 on chronic neurological conditions and specific recommendations to minimize the potential harm to those at high risk.

10.
J Am Heart Assoc ; 8(22): e013485, 2019 11 19.
Article in English | MEDLINE | ID: mdl-31718445

ABSTRACT

Background Women are underrepresented in cardiac resynchronization therapy (CRT) trials. Some studies suggest that women fare better than men after CRT. We sought to explore clinical outcomes in women and men undergoing CRT-defibrillation or CRT-pacing in real-world clinical practice. Methods and Results A national database (Hospital Episode Statistics for England) was used to quantify clinical outcomes in 43 730 patients (women: 10 890 [24.9%]; men: 32 840 [75.1%]) undergoing CRT over 7.6 years (median follow-up 2.2 years, interquartile range 1-4 years). In analysis of the total population, the primary end point of total mortality (adjusted hazard ratio [aHR], 0.73; 95% CI, 0.69-0.76) and the secondary end point of total mortality or heart failure hospitalization (aHR, 0.79; 95% CI, 0.75-0.82) were lower in women, independent of known confounders. Total mortality (aHR, 0.73; 95% CI, 0.70-0.76) and total mortality or heart failure hospitalization (aHR, 0.79; 95% CI, 0.75-0.82) were lower for CRT-defibrillation than for CRT-pacing. In analyses of patients with (aHR, 0.89; 95% CI, 0.80-0.98) or without (aHR, 0.70; 95% CI, 0.66-0.73) a myocardial infarction, women had a lower total mortality. In sex-specific analyses, total mortality was lower after CRT-defibrillation in women (aHR, 0.83; P=0.013) and men (aHR, 0.69; P<0.001). Conclusions Compared with men, women lived longer and were less likely to be hospitalized for heart failure after CRT. In both sexes, CRT-defibrillation was superior to CRT-pacing with respect to survival and heart failure hospitalization. The longest survival after CRT was observed in women without a history of myocardial infarction.
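The adjusted hazard ratios above come from survival modelling on the linked administrative data; a minimal sketch of that kind of covariate-adjusted Cox model, using the lifelines library, is shown below. The file, column names and covariate set are illustrative assumptions rather than the study's actual adjustment model.

```python
# Sketch of a covariate-adjusted Cox model behind an adjusted hazard ratio for
# sex and device type. Column names and covariates are illustrative only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("crt_cohort.csv")  # hypothetical extract
cols = ["followup_years", "died",             # time-to-event and event indicator
        "female", "crt_defibrillator",        # exposures of interest (0/1)
        "age", "prior_mi", "diabetes", "ckd"]  # example confounders

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_years", event_col="died")
print(cph.hazard_ratios_[["female", "crt_defibrillator"]])  # adjusted HRs
cph.print_summary()  # coefficients, 95% CIs and p-values
```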


Subject(s)
Cardiac Resynchronization Therapy Devices , Cardiac Resynchronization Therapy/methods , Defibrillators, Implantable , Heart Failure/therapy , Hospitalization/statistics & numerical data , Mortality , Aged , Aged, 80 and over , Electric Countershock/methods , Female , Humans , Male , Middle Aged , Proportional Hazards Models , Retrospective Studies , Sex Factors , Survival Rate
11.
Europace ; 21(5): 754-762, 2019 May 01.
Article in English | MEDLINE | ID: mdl-30590500

ABSTRACT

AIMS: Randomized controlled trials have shown that cardiac resynchronization therapy (CRT) prolongs survival in patients with heart failure. No studies have explored survival after CRT relative to individuals in the general population (relative survival, RS). We sought to determine observed and relative survival in a nationwide cohort undergoing CRT. METHODS AND RESULTS: A national administrative database was used to quantify observed mortality for patients undergoing CRT, and RS was quantified using life tables. In 50 084 patients [age 72.1 ± 11.6 years (mean ± standard deviation)] undergoing CRT with (CRT-D, n = 25 273) or without (CRT-P, n = 24 811) defibrillation over 8.8 years (median follow-up 2.7 years, interquartile range 1.3-4.8), expected survival decreased with age. Device type, male sex, ischaemic heart disease, diabetes, and chronic kidney disease predicted excess mortality. In multivariate analyses, excess mortality (the analogue of RS) was lower after CRT-D than after CRT-P in all patients [adjusted hazard ratio (aHR) 0.80, 95% confidence interval (CI) 0.76-0.84] as well as in subgroups with (aHR 0.79, 95% CI 0.74-0.84) or without (aHR 0.82, 95% CI 0.74-0.91) ischaemic heart disease. A Charlson Comorbidity Index (CCI) ≥3 portended higher excess mortality (aHR 3.04, 95% CI 2.76-3.34). Relative survival was higher in 2015-2017 than in 2009-2011 (aHR 0.64, 95% CI 0.59-0.69). CONCLUSION: Reference RS data after CRT are presented. Sex, ischaemic heart disease, diabetes, chronic kidney disease, and CCI were major determinants of RS after CRT. CRT-D was associated with higher RS than CRT-P in patients with or without ischaemic heart disease. Relative survival after CRT improved from 2009 to 2017.
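Relative survival compares observed survival in the CRT cohort with the survival expected for matched individuals in the general population, taken from national life tables. The sketch below shows the basic calculation; the life-table rates, file and column names are placeholders, and matching here is on age only rather than the full age/sex/calendar-year matching a formal analysis would use.

```python
# Illustrative sketch of relative survival: observed Kaplan-Meier survival in
# the cohort divided by the survival expected from life tables. The life-table
# rates below are placeholders, not real national statistics.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

cohort = pd.read_csv("crt_cohort.csv")  # hypothetical: followup_years, died, age

# Observed survival at whole years of follow-up.
km = KaplanMeierFitter().fit(cohort["followup_years"], cohort["died"])
years = np.arange(1, 6)
observed = km.survival_function_at_times(years).values

# Expected survival: for each patient, multiply annual survival probabilities
# (1 - q_x) from the matched life table, then average over the cohort.
life_table_qx = {70: 0.02, 71: 0.022, 72: 0.024, 73: 0.027, 74: 0.030}  # placeholder rates

def expected_survival(age_at_implant: int, years_out: int) -> float:
    probs = [1 - life_table_qx.get(age_at_implant + t, 0.03) for t in range(years_out)]
    return float(np.prod(probs))

expected = np.array([
    cohort["age"].apply(lambda a: expected_survival(int(a), int(y))).mean() for y in years
])

relative_survival = observed / expected  # < 1 indicates excess mortality vs the general population
print(pd.DataFrame({"year": years, "observed": observed,
                    "expected": expected, "relative": relative_survival}))
```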


Subject(s)
Cardiac Resynchronization Therapy , Heart Failure , Myocardial Ischemia , Age Factors , Cardiac Resynchronization Therapy/adverse effects , Cardiac Resynchronization Therapy/methods , Cause of Death , Databases, Factual/statistics & numerical data , Female , Heart Failure/mortality , Heart Failure/therapy , Humans , Male , Middle Aged , Mortality , Myocardial Ischemia/diagnosis , Myocardial Ischemia/epidemiology , Risk Factors , Survival Analysis , United Kingdom/epidemiology
12.
Am J Hosp Palliat Care ; 36(2): 147-153, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30157670

ABSTRACT

OBJECTIVE: To determine the rate and predictors of palliative care referral (PCR) in hospitalized patients with acute heart failure (AHF). INTRODUCTION: PCR is commonly utilized in terminal conditions such as metastatic cancers. There are no data on trends and predictors of PCR in patients with AHF from a large-scale, general-population registry. METHODS: For this retrospective study, data were obtained from the National Inpatient Sample database from 2010 to 2014. We used International Classification of Diseases, Ninth Revision diagnosis codes to identify cases with a principal diagnosis of AHF. These patients were divided into 2 groups: (1) PCR and (2) no PCR. We performed multivariate analysis to identify predictors of PCR and report PCR trends from 2010 to 2014. RESULTS: Of 37 312 324 hospitalizations in the database, 621 947 unweighted cases with a primary diagnosis of AHF were selected for further analysis. About 2.8% received a PCR, with an uptrend from 2.0% to 3.6% between 2010 and 2014. Metastatic cancer, ventilator-dependent respiratory failure, and cardiogenic shock were strongly associated with PCR. Percutaneous coronary intervention and African American or other race were negative predictors of PCR. In the PCR group, 31.4% of patients died during hospitalization. CONCLUSION: Palliative care referrals were made in a very small proportion of patients with AHF, although we observed a steady rise in PCR utilization. Chronic conditions, advancing age, and high-risk status were major predictors of PCR.
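As an illustration of the multivariate analysis described above, the sketch below fits a multivariable logistic regression for PCR and converts coefficients to odds ratios with statsmodels. Variable names are placeholders, and the NIS survey weights and design are ignored here for brevity.

```python
# Sketch of a multivariable logistic model for palliative care referral (PCR).
# Variable names are illustrative; NIS survey design/weights are not applied.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

nis = pd.read_csv("nis_ahf_2010_2014.csv")  # hypothetical extract of AHF admissions

model = smf.logit(
    "pcr ~ age + metastatic_cancer + ventilator_dependent + cardiogenic_shock"
    " + pci + C(race)",
    data=nis,
).fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(model.conf_int()[0]),
    "CI_high": np.exp(model.conf_int()[1]),
})
print(odds_ratios)  # OR > 1: positive predictor of PCR; OR < 1: negative predictor
```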


Subject(s)
Heart Failure/epidemiology , Inpatients/statistics & numerical data , Palliative Care/statistics & numerical data , Adolescent , Adult , Age Factors , Aged , Aged, 80 and over , Cardiovascular Diseases/epidemiology , Chronic Disease , Female , Hospitalization , Humans , Male , Middle Aged , Multivariate Analysis , Racial Groups , Retrospective Studies , Risk Factors , Young Adult
13.
Open Heart ; 5(1): e000704, 2018.
Article in English | MEDLINE | ID: mdl-29344378

ABSTRACT

Objectives: Healthcare expenditure per capita is higher in the USA than in England. We hypothesised that clinical outcomes after cardiac revascularisation are better in the USA. We compared costs and outcomes of patients undergoing coronary artery bypass grafting (CABG) and percutaneous coronary intervention (PCI) in England and New York State (NYS). Methods: Costs and total mortality were assessed using the Hospital Episode Statistics for England and the Statewide Planning and Research Cooperative System for NYS. Outcomes were assessed in patients undergoing a first CABG (n=142 969) or a first PCI (n=431 416). Results: After CABG, crude total mortality in England was 0.72% lower at 30 days and 3.68% lower at 1 year (both P<0.001). After PCI, crude total mortality was 0.35% lower at 30 days and 3.55% lower at 1 year (both P<0.001). After covariate adjustment, no differences emerged in total mortality at 30 days after either CABG (England: HR 1.02, 95% CI 0.94 to 1.10) or PCI (HR 1.04, 95% CI 0.99 to 1.09). At 1 year, adjusted total mortality was lower in England after both CABG (HR 0.74, 95% CI 0.71 to 0.78) and PCI (HR 0.66, 95% CI 0.65 to 0.68). After adjustment for cost-to-charge ratios and purchasing power parities, costs in NYS were 3.8-fold higher for CABG and 3.6-fold higher for PCI. Conclusions: Total mortality after CABG and PCI was similar at 30 days and lower in England at 1 year. Costs were approximately fourfold higher in NYS.
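The cost comparison rests on two adjustments: scaling billed charges to costs with cost-to-charge ratios, and converting currencies with purchasing power parities. The arithmetic sketch below shows the shape of that calculation; every number in it is a placeholder, not a figure from the paper.

```python
# Back-of-envelope sketch of the cost comparison: hospital charges are scaled
# to costs with a cost-to-charge ratio, converted with purchasing power
# parities, then compared with English costs. All numbers are placeholders.
def nys_cost_gbp(charge_usd: float, cost_to_charge_ratio: float, ppp_usd_per_gbp: float) -> float:
    cost_usd = charge_usd * cost_to_charge_ratio  # billed charge -> estimated cost
    return cost_usd / ppp_usd_per_gbp             # express in PPP-adjusted GBP

nys_cabg = nys_cost_gbp(charge_usd=120_000, cost_to_charge_ratio=0.35, ppp_usd_per_gbp=1.45)
england_cabg = 8_000.0                            # placeholder HES-derived cost (GBP)
print(f"NYS/England cost uplift for CABG: {nys_cabg / england_cabg:.1f}-fold")
```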

14.
Ann Am Thorac Soc ; 14(7): 1094-1102, 2017 07.
Article in English | MEDLINE | ID: mdl-28590164

ABSTRACT

Burn specialists have long recognized the need for, and have role-modeled, a comprehensive approach incorporating relief of distress as part of care during critical illness. More recently, palliative care specialists have become part of the healthcare team in many U.S. hospitals, especially larger academic institutions that are more likely to have designated burn centers. No current literature describes the intersection of palliative care and burn care or the integration of primary and specialist palliative care in this unique context. This Perspective gives an overview of burn care; focuses on pain and other symptoms in burn intensive care unit settings; addresses the special needs of critically ill burned patients, their families, and clinicians for high-quality palliative care; and highlights potential benefits of integrating primary and specialist palliative care in burn critical care. MEDLINE and the Cumulative Index to Nursing and Allied Health Literature were searched, and an e-mail survey was used to obtain information from U.S. Burn Fellowship Program directors about palliative medicine training. The Improving Palliative Care in the Intensive Care Unit Project Advisory Board synthesized published evidence with their own research and clinical experience in preparing this article. Mortality and severe morbidity for critically ill burned patients remain high. American Burn Association guidelines lay the foundation for a robust system of palliative care delivery, embedding palliative care principles and processes in intensive care by burn providers. Understanding basic burn care, the challenges of symptom management and communication, and the culture of the particular burn unit can optimize the quality and integration of primary and specialist palliative care in this distinctive setting.


Subject(s)
Burns/therapy , Palliative Care/methods , Terminal Care/methods , Burns/physiopathology , Burns/psychology , Caregivers/psychology , Child , Critical Care , Humans
15.
BMC Infect Dis ; 16: 166, 2016 Apr 18.
Article in English | MEDLINE | ID: mdl-27091375

ABSTRACT

BACKGROUND: Early review of antimicrobial prescribing decisions within 48 h is recommended to reduce the overall use of unnecessary antibiotics, and in particular the use of broad-spectrum antibiotics. When parenteral antibiotics are used, blood culture results provide valuable information to help decide whether to continue, alter or stop antibiotics at 48 h. The objective of this study was to investigate the frequency of parenteral antibiotic use, broad-spectrum antibiotic use and use of blood cultures when parenteral antibiotics are initiated in patients admitted via the Emergency Department. METHODS: We used electronic health records from patients admitted from the Emergency Department at University Hospital Birmingham in 2014. RESULTS: Six percent (4562/72939) of patients attending the Emergency Department and one-fifth (4357/19034) of those patients admitted to hospital were prescribed a parenteral antimicrobial. More than half of parenteral antibiotics used were either co-amoxiclav or piperacillin-tazobactam. Blood cultures were obtained in less than one-third of patients who were treated with a parenteral antibiotic. CONCLUSIONS: Parenteral antibiotics are frequently used in those admitted from the Emergency Department; they are usually broad spectrum and are usually initiated without first obtaining cultures. Blood cultures may have limited value to support prescribing review as part of antimicrobial stewardship initiatives.


Subject(s)
Anti-Infective Agents/therapeutic use , Bacteremia/drug therapy , Microbiological Techniques , Aged , Amoxicillin-Potassium Clavulanate Combination/therapeutic use , Bacteremia/prevention & control , Cross-Sectional Studies , Emergency Service, Hospital , Female , Hospitals, University , Humans , Male , Penicillanic Acid/analogs & derivatives , Penicillanic Acid/therapeutic use , Piperacillin/therapeutic use , Retrospective Studies , Tazobactam
16.
J R Soc Med ; 109(6): 230-238, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27053359

ABSTRACT

OBJECTIVE: Current advice for patients being discharged from hospital suggests a body mass index (BMI) of 18.5 to 24 kg/m2, although this aspirational target may often not be achieved. We examined the relationship between BMI on discharge from hospital and subsequent mortality over a maximum follow-up of 3.8 years. DESIGN: We conducted a survival analysis using linked hospital records with national Hospital Episode Statistics and national death certification data. PARTICIPANTS & SETTING: The analysis included adult patients admitted to University Hospitals Birmingham NHS Foundation Trust for more than 24 h during 2011, excluding day cases and regular day-case attenders. MAIN OUTCOME MEASURES: The relationship between BMI and medium-term mortality was estimated separately in men and women, after accounting for case mix. RESULTS: For both males and females, the relationship between BMI at discharge and the log hazard of death was strongly non-linear (p = 0.0002 for females and p < 0.0001 for males) and predictive (both p < 0.0001). In all models, the optimal BMI range associated with best survival was 25 to 35 kg/m2, with a sharp increase in risk at lower BMI. CONCLUSIONS: There was little evidence to support current aspirational BMI targets in the discharge population. Hospitals should ensure adequate nutrition, especially among those with a reduced BMI.
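The non-linear relationship between BMI and the log hazard of death calls for a flexible survival model; the abstract does not state the exact specification, so the sketch below uses a simpler banded-BMI Cox model (lifelines) purely to illustrate how such a U-shaped risk pattern can be probed. File and column names are assumptions.

```python
# Sketch of one way to expose a non-linear BMI-mortality relationship: a Cox
# model with BMI bands rather than a linear BMI term. The paper's actual
# specification is not given in the abstract; this banded version is illustrative.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("discharges_2011.csv")  # hypothetical: bmi, followup_years, died, age, male

df["bmi_band"] = pd.cut(
    df["bmi"], bins=[0, 18.5, 25, 30, 35, 100],
    labels=["<18.5", "18.5-25", "25-30", "30-35", ">=35"],
)
design = pd.get_dummies(df[["followup_years", "died", "age", "male", "bmi_band"]],
                        columns=["bmi_band"], drop_first=True, dtype=float)  # reference: <18.5

cph = CoxPHFitter().fit(design, duration_col="followup_years", event_col="died")
# Hazard ratios well below 1 for the 25-35 bands (vs <18.5) would mirror the
# reported sharp increase in risk at lower BMI.
print(cph.hazard_ratios_.filter(like="bmi_band"))
```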

17.
J Pain Symptom Manage ; 51(5): 898-906, 2016 05.
Article in English | MEDLINE | ID: mdl-26988848

ABSTRACT

CONTEXT: Emergency medicine (EM) residents perceive palliative care (PC) skills as important and want training, yet there is a general lack of formal PC training in EM residency programs. A clearer definition of the PC educational needs of EM trainees is a research priority. OBJECTIVES: To assess PC competency education in EM residency programs. METHODS: This was a mixed-mode survey of residency program directors, associate program directors, and assistant program directors at accredited EM residency programs, evaluating four educational domains: 1) importance of specific competencies for senior EM residents, 2) senior resident skills in PC competencies, 3) effectiveness of educational methods, and 4) barriers to training. RESULTS: The response rate was 50% from more than 100 residency programs. Most respondents (64%) identified PC competencies as important for residents to learn, and 59% reported that they teach PC skills in their residency program. In Domains 1 and 2, crucial conversations, management of pain, and management of the imminently dying had the highest scores for importance and residents' skill. In Domain 3, bedside teaching, mentoring from hospice and palliative medicine faculty, and case-based simulation were the most effective educational methods. In Domain 4, lack of PC expertise among faculty and lack of interest by faculty and residents were the greatest barriers. There were differences between competency importance and senior resident skill level for management of the dying child, withdrawal/withholding of nonbeneficial interventions, and ethical/legal issues. CONCLUSION: There are specific barriers and opportunities for PC competency training and gaps in resident skill level. In particular, the discrepancies between competency importance and resident skill in the management of the dying child, nonbeneficial interventions, and ethical and legal issues could be a focus for educational interventions in PC competency training in EM residencies.


Subject(s)
Emergency Medicine/education , Internship and Residency , Palliative Care , Advance Care Planning , Clinical Competence , Communication , Cross-Sectional Studies , Cultural Competency , Emergency Medicine/organization & administration , Female , Humans , Male , Multivariate Analysis , Physicians , Regression Analysis , Surveys and Questionnaires , United States
18.
Exp Clin Transplant ; 14(1): 50-7, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26862824

ABSTRACT

OBJECTIVES: Stroke is a major cause of mortality in the general population, but data regarding stroke-related hospitalization or mortality after a kidney transplant are limited. We determined the risk of stroke-related episodes after a kidney transplant in a population-based cohort study of 19,103 kidney allograft recipients in England between 2001 and 2012. MATERIALS AND METHODS: The incidence of stroke-related events after a kidney transplant in recipients with a pretransplant history of stroke, the incidence of stroke-related hospitalization or death among all kidney allograft recipients, and risk factors for stroke-related mortality after a kidney transplant were examined. Data were obtained from Hospital Episode Statistics (an administrative data warehouse that contains admissions to all National Health Service hospitals in England), linked to the Office for National Statistics, which collects information on all registered deaths in England. RESULTS: There were 782 nonfatal stroke-related hospitalizations and 113 stroke-related deaths (5.4% of total deaths) after a kidney transplant (median follow-up 4.4 y). The risk of all-cause mortality was higher for recipients with, compared to those without, a history of stroke (21.5% vs 10.8%; P < .001); however, the risk of stroke-related mortality after a kidney transplant was no different. Kidney allograft recipients with nonfatal stroke episodes after transplant were at higher risk of both all-cause and stroke-related mortality. In a Cox regression model, a pretransplant history of stroke was an independent risk factor for all-cause mortality but not stroke-related mortality, while posttransplant hospitalization with nonfatal stroke was a risk factor for both. CONCLUSIONS: Fatal and nonfatal stroke-related events are common among kidney allograft recipients. Further research is warranted to allow better risk stratification and facilitate clinical trials for risk attenuation of stroke after a kidney transplant.


Subject(s)
Hospital Mortality , Hospitalization , Kidney Transplantation/adverse effects , Stroke/mortality , Stroke/therapy , Adult , Aged , Allografts , Chi-Square Distribution , England/epidemiology , Female , Humans , Incidence , Kaplan-Meier Estimate , Kidney Transplantation/mortality , Male , Middle Aged , Multivariate Analysis , Prevalence , Proportional Hazards Models , Retrospective Studies , Risk Assessment , Risk Factors , Stroke/diagnosis , Time Factors , Treatment Outcome
19.
J Crit Care ; 31(1): 172-7, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26507641

ABSTRACT

PURPOSE: To describe educational features in palliative and end-of-life care (PEOLC) in pulmonary/critical care fellowships and identify the features associated with perceptions of trainee competence in PEOLC. METHODS: A survey of educational features in 102 training programs and the perceived skill and comfort level of trainees in 6 PEOLC domains: communication, symptom control, ethical/legal, community/institutional resources, specific syndromes, and ventilator withdrawal. We evaluated associations between perceived trainee competence/comfort in PEOLC and training program features, using regression analyses. RESULTS: Fifty-five percent of program directors (PDs) reported faculty with training in PEOLC; 30% had a written PEOLC curriculum. Neither feature was associated with trainee competence/comfort. Program directors and trainees rated bedside PEOLC teaching highly. Only 20% offered PEOLC rotations; most trainees judged these valuable. Most PDs and trainees reported that didactic teaching was insufficient in communication, although sufficient teaching of this was associated with perceived trainee competence in communication. Perceived trainee competence in managing institutional resources was rated poorly. Program directors reporting significant barriers to PEOLC education also judged trainees less competent in PEOLC. Time constraint was the greatest barrier. CONCLUSION: This survey of PEOLC education in US pulmonary/critical care fellowships identified associations between certain program features and perceived trainee skill in PEOLC. These results generate hypotheses for further study.


Subject(s)
Critical Care , Education, Medical, Graduate/methods , Palliative Care , Terminal Care , Adult , Attitude of Health Personnel , Clinical Competence , Curriculum , Education, Medical, Graduate/standards , Fellowships and Scholarships , Female , Humans , Male , Regression Analysis , United States
20.
Int J Cardiol ; 203: 196-203, 2016 Jan 15.
Article in English | MEDLINE | ID: mdl-26512837

ABSTRACT

OBJECTIVES: Various risk models exist to predict short-term risk-adjusted outcomes after cardiac surgery. Statistical models constructed using clinical registry data usually perform better than those based on administrative datasets. We constructed a procedure-specific risk prediction model based on administrative hospital data for England and compared its performance with the EuroSCORE (ES) and its variants. METHODS: The Hospital Episode Statistics (HES) risk prediction model was developed using administrative data linked to the national mortality statistics register for patients undergoing CABG (n = 35,115), valve surgery (n = 18,353) and combined CABG and valve surgery (n = 8392) from 2008 to 2011 in England, and tested using an independent dataset sampled from the financial years 2011-2013. Specific models were constructed to predict mortality within 1 year of discharge. Comparisons with the EuroSCORE models were performed on a local cohort of 2580 patients from 2008 to 2013. RESULTS: The discrimination of the HES model demonstrates good performance early and up to 1 year following surgery (c-statistics: CABG 81.6%, 78.4%; isolated valve 78.6%, 77.8%; CABG and valve 76.4%, 72.0%, respectively). Extended testing in subsequent financial years shows that the models maintained performance outside the development period. Calibration of the HES model shows a small difference between observed and expected mortality rates (CABG 0.15%; isolated valve 0.39%; CABG and valve 0.63%) and delivers a good estimate of risk. Discrimination of the HES model for in-hospital deaths is similar to the EuroSCORE models for CABG (logistic ES 79.0%) and combined CABG and valve surgery (logistic ES 71.6%) patients, and superior for valve patients (logistic ES 70.9%). The c-statistics of the EuroSCORE models for longer periods are numerically lower than those of the HES model. CONCLUSION: The national administrative dataset has produced an accurate, stable and clinically useful model for early and 1-year mortality prediction after cardiac surgery.
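The two performance measures reported for the HES model are discrimination (the c-statistic, i.e. the area under the ROC curve) and calibration (the gap between observed and expected mortality). The sketch below computes both on a held-out set using scikit-learn, with a plain logistic regression standing in for the actual HES model; file and column names are illustrative.

```python
# Sketch of the two reported performance measures: discrimination (c-statistic,
# i.e. ROC AUC) and calibration (observed minus expected mortality). The
# logistic model is only a stand-in for the HES model; columns are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

dev = pd.read_csv("cabg_2008_2011.csv")  # hypothetical development set
val = pd.read_csv("cabg_2011_2013.csv")  # hypothetical independent test years
features = ["age", "female", "diabetes", "ckd", "urgent_admission"]

model = LogisticRegression(max_iter=1000).fit(dev[features], dev["died_1yr"])
pred = model.predict_proba(val[features])[:, 1]

c_statistic = roc_auc_score(val["died_1yr"], pred)      # discrimination
calibration_gap = val["died_1yr"].mean() - pred.mean()  # observed - expected rate
print(f"c-statistic: {c_statistic:.3f}, observed-expected gap: {calibration_gap:.2%}")
```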


Subject(s)
Cardiac Surgical Procedures/mortality , Hospital Information Systems , Models, Statistical , Adult , Aged , Aged, 80 and over , England , Female , Hospital Mortality , Humans , Male , Middle Aged , Prognosis , Reproducibility of Results , Risk Assessment , Time Factors , Young Adult