Results 1 - 20 of 809
1.
Medicina (Kaunas) ; 60(9)2024 Aug 29.
Article in English | MEDLINE | ID: mdl-39336453

ABSTRACT

Background and Objectives: Hyponatraemia is a common electrolyte disorder. For patients with severe hyponatraemia, intensive care unit (ICU) admission may be required, enabling close monitoring and safe, effective management of sodium levels. While severe hyponatraemia may be associated with significant symptoms, rapid overcorrection of hyponatraemia can lead to complications. We aimed to describe the management and outcomes of severe hyponatraemia in our ICU and identify risk factors for overcorrection. Materials and Methods: This was a retrospective single-centre cohort study that included consecutive adults admitted to the ICU with serum sodium < 120 mmol/L between 1 January 2017 and 8 March 2023. Anonymised data were collected from electronic records. We included 181 patients (median age 67 years, 51% male). Results: Median admission serum sodium was 113 mmol/L (IQR: 108-117), with an average rate of improvement over the first 48 h of 10 mmol/L/day (IQR: 5-15 mmol/L). A total of 62 patients (34%) met the criteria for overcorrection at 48 h; they were younger, presented with severe symptoms (seizures/arrhythmias), and had lower admission sodium concentrations. They were also more likely to be treated with hypertonic saline infusions. Lower admission sodium was an independent risk factor for overcorrection within 48 h, whereas the presence of liver cirrhosis and fluid restriction was associated with normal correction. No difference was identified between the normal and overcorrected cohorts for ICU/hospital length of stay or mortality. Conclusions: In some patients with severe hyponatraemia, overcorrection may be unavoidable when treating severe symptoms such as seizures and arrhythmias, and we therefore highlight the key factors associated with overcorrection. Overall, we found that overcorrection was common, consistent with the current literature.


Subject(s)
Critical Care , Hyponatremia , Intensive Care Units , Humans , Hyponatremia/therapy , Male , Female , Aged , Retrospective Studies , Middle Aged , Critical Care/methods , Critical Care/statistics & numerical data , Intensive Care Units/statistics & numerical data , Risk Factors , Cohort Studies , Sodium/blood , Aged, 80 and over
2.
Article in English | MEDLINE | ID: mdl-39333028

ABSTRACT

BACKGROUND: Current understanding of clinical practice and care for maternal kidney disease in pregnancy in Australia is hampered by limitations in available renal-specific datasets. AIMS: To capture the epidemiology, management, and outcomes of women with significant kidney disease in pregnancy and demonstrate feasibility of a national cohort study approach. MATERIALS AND METHODS: An Australian prospective study (2017-2018) using a new kidney disease-specific survey within the Australasian Maternity Outcomes Surveillance System (AMOSS). Women who gave birth with acute kidney injury (AKI), advanced chronic kidney disease (CKD), dialysis dependence or a kidney transplant were included. Demographic data, renal and obstetric management, and perinatal outcomes were collected. RESULTS: Among 58 case notifications from 12 hospitals in five states, we included 23 cases with kidney transplant (n = 12), pre-existing CKD (n = 8), newly diagnosed CKD (n = 2) and dialysis (n = 1). No cases of AKI were reported. Reporting rates were better in states with study investigators and, overall, cases were likely under-reported. Nearly 35% of women had a non-delivery-related antenatal admission. Nephrology involvement was 78.3% during pregnancy and 91% post-partum. Adverse events were increased, including pre-eclampsia (21.7%), and preterm birth (60.9%). Women had high rates of aspirin (82.6%) and antihypertensive (73.9%) use, indwelling catheter for labour/delivery (65.2%), caesarean delivery (60.9%), and blood transfusion (21.7%). CONCLUSIONS: This first-ever Australian prospective study of significant kidney diseases in pregnancy provided novel insights into renal-specific clinical patterns and practices. However, under-reporting was likely. Future studies need to overcome the challenges of case identification and data collection burden.

3.
Clin Transl Radiat Oncol ; 49: 100851, 2024 Nov.
Article in English | MEDLINE | ID: mdl-39308635

ABSTRACT

Background and purpose: Radical surgery is the standard of care for early rectal cancer. However, alternative organ-preserving approaches are attractive, especially in frail or elderly patients as these avoid surgical complications. We have assessed the efficacy of sole Contact X-ray Brachytherapy (CXB) treatment in stage-1 rectal cancer patients who were unsuitable for or declined surgery. Materials and methods: This retrospective multi-centre study (2009-2021) evaluated 76 patients with T1/2-N0-M0 rectal adenocarcinomas who were treated with CXB alone. Outcomes were assessed for the entire cohort and sub-groups based on the T-stage and the criteria for receiving CXB alone; Group A: patients who were fit enough for surgery but declined, Group B: patients who were high-risk for surgery and Group C: patients who had received prior pelvic radiation for a different cancer. Results: With a median follow-up of 26(IQR:12-49) months, initial clinical Complete Response (cCR) was 82(70-93)% with rates of local regrowth 18(8-29)%, 3-year actuarial local control (LC) 84(75-95)%, distant relapse 3 %, and no nodal relapse. 5-year disease-free survival (DFS) and overall survival (OS) were 66(48-78)% and 58(44-75)%. Lower OS was observed in Groups B [HR:2.54(95 %CI:1.17, 5.59), p = 0.02] and C [HR:2.75(95 %CI:1.15, 6.58), p = 0.03]. Previous pelvic radiation predicted lower cCR and OS. The main toxicity was G1-2 rectal bleeding (26 %) and symptoms of impaired anal sphincter function were not reported in any patients. Conclusion: CXB treatment alone achieved a high cCR rate with satisfactory LC and DFS. Inferior oncological outcomes were observed in patients who had received prior pelvic radiotherapy. CXB alone, with its favourable toxicity profile and avoidance of general anaesthesia and surgery risks, therefore, can be considered for patients who are unsuitable for or refuse surgery.

4.
bioRxiv ; 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39149364

ABSTRACT

Peripheral artery disease (PAD) is the narrowing of the arteries that carry blood to the lower extremities. PAD has traditionally been associated with atherosclerosis. However, recent studies have found that medial arterial calcification (MAC) is the primary cause of chronic limb ischemia below the knee. MAC involves calcification of the elastin fibers surrounding smooth muscle cells (SMCs) in arteries. Matrix GLA Protein (MGP) binds circulating calcium and inhibits vascular calcification. Mgp-/- mice develop severe MAC and die within 8 weeks of birth due to aortic rupture or heart failure. We previously discovered a rare genetic disease, Arterial Calcification due to Deficiency of CD73 (ACDC), in which patients present with extensive MAC in their lower extremity arteries. Using a patient-specific induced pluripotent stem cell model, we found that rapamycin inhibited calcification. Here we investigated whether rapamycin could reduce MAC in vivo using Mgp-/- mice as a model. Mgp+/+ and Mgp-/- mice received 5 mg/kg rapamycin or vehicle. Calcification content was assessed via microCT, and vascular morphology and extracellular matrix content were assessed histologically. Immunostaining and western blot analysis were used to examine SMC phenotypes and cellular functions. Rapamycin prolonged the lifespan of Mgp-/- mice, decreased mineral density in the arteries, and increased smooth muscle actin protein levels; however, calcification volume, vessel morphology, SMC proliferation, and autophagy flux were all unchanged. These findings suggest that rapamycin's effects in the Mgp-/- mouse are independent of the vascular phenotype.

5.
J Infus Nurs ; 47(4): 255-265, 2024.
Article in English | MEDLINE | ID: mdl-38968588

ABSTRACT

Oncology and critical care patients often require central vascular access devices (CVADs), which can make them prone to central line-associated bloodstream infections (CLABSIs) and thrombotic occlusions. According to the literature, CLABSIs are rampant and increased by 63% during the COVID-19 pandemic, highlighting the need for innovative interventions. Four percent ethylenediaminetetraacetic acid (4% EDTA) is an antimicrobial locking solution that reduces CLABSIs, thrombotic occlusions, and biofilm. This retrospective pre-post quality improvement project determined whether 4% EDTA could improve patient safety by decreasing CLABSIs and central catheter occlusions. The locking solution was implemented in all adult cancer and critical care units at a regional cancer hospital and center. Before implementing 4% EDTA, there were 36 CLABSI cases in 16 months (27 annualized). After implementation, there were 6 cases in 6 months (12 annualized), a statistically significant decrease of 59% in CLABSIs per 1000 catheter days. However, there was no significant difference in occlusions (alteplase use). Eighty-eight percent of patients had either a positive or neutral outlook, while most nurses reported that 4% EDTA needed to be available in prefilled syringes. The pandemic and nursing shortages may have influenced the results; hence, randomized controlled trials are needed to establish a causal relationship between 4% EDTA and rates of CLABSIs and occlusions.
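For context, the annualized counts quoted above follow from simple calendar scaling, and the headline metric is the standard infections-per-1000-central-line-days rate; the catheter-day denominators are not reported in this abstract, so only the general formula is shown.

36 \times \tfrac{12}{16} = 27 \text{ CLABSIs/year (pre)}, \qquad 6 \times \tfrac{12}{6} = 12 \text{ CLABSIs/year (post)}

\text{CLABSI rate} = \frac{\text{number of CLABSIs}}{\text{central-line days}} \times 1000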


Subject(s)
COVID-19 , Catheter-Related Infections , Edetic Acid , Quality Improvement , Humans , Retrospective Studies , COVID-19/prevention & control , Catheter-Related Infections/prevention & control , Canada , Central Venous Catheters/adverse effects , Catheterization, Central Venous/adverse effects , Female , Male , Middle Aged
6.
Proc Natl Acad Sci U S A ; 121(28): e2408072121, 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-38950363

ABSTRACT

Female mosquitoes produce eggs in gonadotrophic cycles that are divided between a previtellogenic and vitellogenic phase. Previtellogenic females consume water and sugar sources like nectar while also being attracted to hosts for blood feeding. Consumption of a blood meal activates the vitellogenic phase, which produces mature eggs and suppresses host attraction. In this study, we tested the hypothesis that neuropeptide Y-like hormones differentially modulate host attraction behavior in the mosquito Aedes aegypti. A series of experiments collectively indicated that enteroendocrine cells (EECs) in the posterior midgut produce and release neuropeptide F (NPF) into the hemolymph during the previtellogenic phase which stimulates attraction to humans and biting behavior. Consumption of a blood meal, which primarily consists of protein by dry weight, down-regulated NPF in EECs until mature eggs developed, which was associated with a decline in hemolymph titer. NPF depletion depended on protein digestion but was not associated with EEC loss. Other experiments showed that neurons in the terminal ganglion extend axons to the posterior midgut and produce RYamide, which showed evidence of increased secretion into circulation after a blood meal. Injection of RYamide-1 and -2 into previtellogenic females suppressed host attraction, while coinjection of RYamides with or without short NPF-2 also inhibited the host attraction activity of NPF. Overall, our results identify NPF and RYamide as gut-associated hormones in A. aegypti that link host attraction behavior to shifts in diet during sequential gonadotrophic cycles.


Subject(s)
Aedes , Neuropeptides , Animals , Aedes/metabolism , Aedes/physiology , Neuropeptides/metabolism , Female , Feeding Behavior/physiology , Hemolymph/metabolism , Enteroendocrine Cells/metabolism , Insect Proteins/metabolism , Humans , Vitellogenesis/physiology
7.
Health Technol Assess ; 28(27): 1-97, 2024 06.
Article in English | MEDLINE | ID: mdl-38940695

ABSTRACT

Background: Anterior cruciate ligament injury of the knee is common and leads to decreased activity and risk of secondary osteoarthritis of the knee. Management of patients with a non-acute anterior cruciate ligament injury can be non-surgical (rehabilitation) or surgical (reconstruction). However, insufficient evidence exists to guide treatment. Objective(s): To determine in patients with non-acute anterior cruciate ligament injury and symptoms of instability whether a strategy of surgical management (reconstruction) without prior rehabilitation was more clinically effective and cost-effective than non-surgical management (rehabilitation). Design: A pragmatic, multicentre, superiority, randomised controlled trial with two-arm parallel groups and 1:1 allocation. Due to the nature of the interventions, no blinding could be carried out. Setting: Twenty-nine NHS orthopaedic units in the United Kingdom. Participants: Participants with a symptomatic (instability) non-acute anterior cruciate ligament-injured knee. Interventions: Patients in the surgical management arm underwent surgical anterior cruciate ligament reconstruction as soon as possible and without any further rehabilitation. Patients in the rehabilitation arm attended physiotherapy sessions and were listed for reconstructive surgery only if instability continued following rehabilitation. Surgery following initial rehabilitation was an expected outcome for many patients and within protocol. Main outcome measures: The primary outcome was the Knee Injury and Osteoarthritis Outcome Score 4 at 18 months post randomisation. Secondary outcomes included return to sport/activity, intervention-related complications, patient satisfaction, expectations of activity, generic health quality of life, knee-specific quality of life and resource usage. Results: Three hundred and sixteen participants were recruited between February 2017 and April 2020, with 156 randomised to surgical management and 160 to rehabilitation. Forty-one per cent (n = 65) of those allocated to rehabilitation underwent subsequent reconstruction within 18 months, with 38% (n = 61) completing rehabilitation and not undergoing surgery. Seventy-two per cent (n = 113) of those allocated to surgery underwent reconstruction within 18 months. Follow-up at the primary outcome time point was 78% (n = 248; surgical, n = 128; rehabilitation, n = 120). Both groups improved over time. Adjusted mean Knee Injury and Osteoarthritis Outcome Score 4 scores at 18 months had increased to 73.0 in the surgical arm and to 64.6 in the rehabilitation arm. The adjusted mean difference was 7.9 (95% confidence interval 2.5 to 13.2; p = 0.005) in favour of surgical management. The per-protocol analyses supported the intention-to-treat results, with all treatment effects favouring surgical management at a level reaching statistical significance. There was a significant difference in Tegner Activity Score at 18 months. Sixty-eight per cent (n = 65) of surgery patients did not reach their expected activity level compared to 73% (n = 63) in the rehabilitation arm. There were no differences between groups in surgical complications (n = 1 surgery, n = 2 rehab) or clinical events (n = 11 surgery, n = 12 rehab). Of surgery patients, 82.9% were satisfied compared to 68.1% of rehabilitation patients.
Health economic analysis found that surgical management led to improved health-related quality of life compared to non-surgical management (0.052 quality-adjusted life-years, p = 0.177), but with higher NHS healthcare costs (£1107, p < 0.001). The incremental cost-effectiveness ratio for the surgical management programme versus rehabilitation was £19,346 per quality-adjusted life-year gained. Using £20,000-30,000 per quality-adjusted life-year thresholds, surgical management is cost-effective in the UK setting with a probability of being the most cost-effective option at 51% and 72%, respectively. Limitations: Not all surgical patients underwent reconstruction, but this did not affect trial interpretation. The adherence to physiotherapy was patchy, but the trial was designed as pragmatic. Conclusions: Surgical management (reconstruction) for non-acute anterior cruciate ligament-injured patients was superior to non-surgical management (rehabilitation). Although physiotherapy can still provide benefit, later-presenting non-acute anterior cruciate ligament-injured patients benefit more from surgical reconstruction without delaying for a prior period of rehabilitation. Future work: Confirmatory studies and those to explore the influence of fidelity and compliance will be useful. Trial registration: This trial is registered as Current Controlled Trials ISRCTN10110685; ClinicalTrials.gov Identifier: NCT02980367. Funding: This award was funded by the National Institute of Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 14/140/63) and is published in full in Health Technology Assessment; Vol. 28, No. 27. See the NIHR Funding and Awards website for further award information.
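For readers unfamiliar with the headline cost-effectiveness figure, the incremental cost-effectiveness ratio is simply the incremental NHS cost divided by the incremental QALY gain (a generic definition, not an additional result of this trial); the £19,346 per QALY reported above comes from the trial's adjusted health-economic model, so dividing the rounded summary figures quoted here will not reproduce it exactly.

\text{ICER} = \frac{\Delta \text{Cost}}{\Delta \text{QALY}} = \frac{C_{\text{surgical}} - C_{\text{rehabilitation}}}{Q_{\text{surgical}} - Q_{\text{rehabilitation}}}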


The study aimed to find out whether it is better to offer surgical reconstruction or rehabilitation first to patients with a more long-standing injury of their anterior cruciate ligament in their knee. This injury causes physical giving way of the knee and/or sensations of it being wobbly (instability). The instability can affect daily activities, work, sport and can lead to arthritis. There are two main treatment options for this problem: non-surgical rehabilitation (prescribed exercises and advice from physiotherapists) or an operation by a surgeon to replace the damaged ligament (anterior cruciate ligament reconstruction). Although studies have highlighted the best option for a recently injured knee, the best management was not known for patients with a long-standing injury, perhaps occurring several months previously. Because the surgery is expensive to the NHS (around £100 million per year), it was also important to look at the costs involved. We carried out a study recruiting 316 non-acute anterior cruciate ligament-injured patients from 29 different hospitals and allocated each patient to either surgery or rehabilitation as their treatment option. We measured how well they did with special function and activity scores, patient satisfaction and costs of treatment. Patients in both groups improved substantially. It was expected that some patients in the rehabilitation group would want surgery if non-surgical management was unsuccessful. Forty-one per cent of patients who initially underwent rehabilitation subsequently elected to have reconstructive surgery. Overall, the patients allocated to the surgical reconstruction group had better results in terms of knee function and stability, activity level and satisfaction with treatment than patients allocated to the non-operative rehabilitation group. There were few problems or complications with either treatment option. Although the surgery was a more expensive treatment option, it was found to be cost-effective in the UK setting. The evidence can be discussed in shared decision-making with anterior cruciate ligament-injured patients. Both strategies of management led to improvement. Although a rehabilitation strategy can be beneficial, especially for recently injured patients, it is advised that later-presenting non-acute and more long-standing anterior cruciate ligament-injured patients undergo surgical reconstruction without necessarily delaying for a period of rehabilitation.


Subject(s)
Anterior Cruciate Ligament Injuries , Anterior Cruciate Ligament Reconstruction , Cost-Benefit Analysis , Humans , Male , Female , Anterior Cruciate Ligament Injuries/surgery , Anterior Cruciate Ligament Injuries/rehabilitation , Adult , United Kingdom , Anterior Cruciate Ligament Reconstruction/rehabilitation , Quality of Life , Quality-Adjusted Life Years , Middle Aged , Young Adult , State Medicine , Joint Instability/surgery , Joint Instability/rehabilitation , Adolescent , Technology Assessment, Biomedical
8.
Breast Cancer Res ; 26(1): 85, 2024 May 28.
Article in English | MEDLINE | ID: mdl-38807211

ABSTRACT

BACKGROUND: Abbreviated breast MRI (FAST MRI) is being introduced into clinical practice to screen women with mammographically dense breasts or with a personal history of breast cancer. This study aimed to optimise diagnostic accuracy through the adaptation of interpretation-training. METHODS: A FAST MRI interpretation-training programme (short presentations and guided hands-on workstation teaching) was adapted to provide additional training during the assessment task (interpretation of an enriched dataset of 125 FAST MRI scans) by giving readers feedback about the true outcome of each scan immediately after each scan was interpreted (formative assessment). Reader interaction with the FAST MRI scans used purpose-developed software (RiViewer) that recorded reader opinions and reading times for each scan. The training programme was additionally adapted for remote e-learning delivery. STUDY DESIGN: Prospective, blinded interpretation of an enriched dataset by multiple readers. RESULTS: 43 mammogram readers completed the training, 22 who interpreted breast MRI in their clinical role (Group 1) and 21 who did not (Group 2). Overall sensitivity was 83% (95%CI 81-84%; 1994/2408), specificity 94% (95%CI 93-94%; 7806/8338), readers' agreement with the true outcome kappa = 0.75 (95%CI 0.74-0.77) and diagnostic odds ratio = 70.67 (95%CI 61.59-81.09). Group 1 readers showed similar sensitivity (84%) to Group 2 (82%, p = 0.14), but slightly higher specificity (94% v. 93%, p = 0.001). Concordance with the ground truth increased significantly with the number of FAST MRI scans read through the formative assessment task (p = 0.002) but by differing amounts depending on whether or not a reader had previously attended FAST MRI training (interaction p = 0.02). Concordance with the ground truth was significantly associated with reading batch size (p = 0.02), tending to worsen when more than 50 scans were read per batch. Group 1 took a median of 56 seconds (range 8-47,466) to interpret each FAST MRI scan compared with 78 (14-22,830, p < 0.0001) for Group 2. CONCLUSIONS: Provision of immediate feedback to mammogram readers during the assessment test set reading task increased specificity for FAST MRI interpretation and achieved high diagnostic accuracy. Optimal reading-batch size for FAST MRI was 50 reads per batch. Trial registration (25/09/2019): ISRCTN16624917.
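The reported diagnostic odds ratio can be reproduced from the counts above, taking 2408 as the total of true positives plus false negatives and 8338 as the total of true negatives plus false positives:

\text{DOR} = \frac{TP \times TN}{FP \times FN} = \frac{1994 \times 7806}{(8338 - 7806)\times(2408 - 1994)} = \frac{1994 \times 7806}{532 \times 414} \approx 70.7

which is consistent with the reported 70.67 (95%CI 61.59-81.09).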


Subject(s)
Breast Neoplasms , Learning Curve , Magnetic Resonance Imaging , Mammography , Humans , Female , Breast Neoplasms/diagnostic imaging , Breast Neoplasms/diagnosis , Magnetic Resonance Imaging/methods , Mammography/methods , Middle Aged , Early Detection of Cancer/methods , Prospective Studies , Aged , Sensitivity and Specificity , Image Interpretation, Computer-Assisted/methods , Breast/diagnostic imaging , Breast/pathology
9.
Health Soc Care Deliv Res ; 12(14): 1-182, 2024 May.
Article in English | MEDLINE | ID: mdl-38794956

ABSTRACT

Background: Acute inpatient mental health services report high levels of safety incidents. The application of patient safety theory has been sparse, particularly concerning interventions that proactively seek patient perspectives. Objective(s): Develop and evaluate a theoretically based, digital monitoring tool to collect real-time information from patients on acute adult mental health wards about their perceptions of ward safety. Design: Theory-informed mixed-methods study. A prototype digital monitoring tool was developed from a co-design approach, implemented in hospital settings, and subjected to qualitative and quantitative evaluation. Setting and methods: Phase 1: scoping review of the literature on patient involvement in safety interventions in acute mental health care; evidence scan of digital technology in mental health contexts; qualitative interviews with mental health patients and staff about perspectives on ward safety. This, alongside stakeholder engagement with advisory groups, service users and health professionals, informed the development processes. Most data collection was virtual. Phase 1 resulted in the technical development of a theoretically based digital monitoring tool that collected patient feedback for proactive safety monitoring. Phase 2: implementation of the tool in six adult acute mental health wards across two UK NHS trusts; evaluation via focused ethnography and qualitative interviews. Statistical analysis of WardSonar data and routine ward data involved construction of an hour-by-hour data set per ward, permitting detailed analysis of the use of the WardSonar tool. Participants: A total of 8 patients and 13 mental health professionals participated in Phase 1 interviews; 33 staff and 34 patients participated in Phase 2 interviews. Interventions: Patients could use a web application (the WardSonar tool) to record real-time perceptions of ward safety. Staff could access aggregated, anonymous data to inform timely interventions. Results: Coronavirus disease 2019 restrictions greatly impacted the study. Stakeholder engagement permeated the project. Phase 1 delivered a theory-based, collaboratively designed digital tool for proactive patient safety monitoring. Phase 2 showed that the tool was user friendly and broadly acceptable to patients and staff. The aggregated safety data were infrequently used by staff. Feasibility depended on engaged staff and embedding use of the tool in ward routines. There is strong evidence that an incident leads to increased probability of further incidents within the next 4 hours. This quantifies the extent to which social/behavioural contagion persists. There is weak evidence to suggest that an incident leads to greater use of the WardSonar tool in the following hour, but none to suggest that ward atmosphere predicts future incidents. Therefore, how often patients use the tool seems to send a stronger signal about potential incidents than patients' real-time reports about ward atmosphere. Limitations: Implementation was limited to two NHS trusts. Coronavirus disease 2019 impacted design processes including stakeholder engagement, implementation, and evaluation of the monitoring tool in routine clinical practice. Higher uptake could enhance validity of the results. Conclusions: WardSonar has the potential to provide a valuable route for patients to communicate safety concerns.
The WardSonar monitoring tool has a strong patient perspective and uses proactive real-time safety monitoring rather than traditional retrospective data review. Future work: The WardSonar tool can be refined and tested further in a post Coronavirus disease 2019 context. Study registration: This study is registered as ISRCTN14470430. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health and Social Care Delivery Research programme (NIHR award ref: NIHR128070) and is published in full in Health and Social Care Delivery Research; Vol. 12, No. 14. See the NIHR Funding and Awards website for further award information.
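To make the hour-by-hour analysis described above concrete, the sketch below is illustrative only: it is not the study's code, and the single simulated ward, the column names and the 5% hourly incident rate are all invented. It builds one row per ward-hour and fits a logistic regression testing whether an incident in the preceding 4 hours raises the odds of an incident in the current hour.

# Minimal sketch of a ward-hour "contagion" analysis on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_hours = 24 * 70  # ten weeks of hourly observations for one illustrative ward

ward = pd.DataFrame({
    "hour": pd.date_range("2022-03-01", periods=n_hours, freq="h"),
    "incident": rng.binomial(1, 0.05, size=n_hours),  # 0/1 safety-incident flag (simulated)
})

# Exposure: any incident recorded in the previous 4 hours (current hour excluded).
ward["incident_prev_4h"] = (
    ward["incident"].shift(1).rolling(window=4, min_periods=1).max().fillna(0)
)

# Logistic regression of current-hour incidents on the 4-hour lag indicator.
X = sm.add_constant(ward[["incident_prev_4h"]])
fit = sm.Logit(ward["incident"], X).fit(disp=0)
print(np.exp(fit.params["incident_prev_4h"]))  # odds ratio for a recent incident

With real data this would be repeated per ward (or fitted with ward effects) and extended with hourly WardSonar usage counts as additional predictors, mirroring the comparisons reported above.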


Mental health wards can feel unsafe. We know that patients and staff have different ideas about what makes a hospital ward safe or unsafe. Patients are often the first to know when the atmosphere on a ward becomes tense but, often, no one asks them for input or feedback at the time. We worked with service users and staff to develop new technology to make it easy for patients to tell staff about changes in the ward atmosphere. We put everyone's ideas together and some technical developers then built a digital safety tool to use on a tablet computer. Patients put in anonymous information about the ward atmosphere and staff can read it straight away. We tested it on six adult acute mental health wards for 10 weeks. We asked patients and staff what they thought about the tool and we looked at how it was being used. Patients and staff liked the look of the tool on the tablet computer. Some staff said they did not need it because they could tell how patients were feeling, but patients told us that staff did not talk with them much and did not always know when patients were feeling tense. Coronavirus disease 2019 made life difficult on the wards. Most ward managers said the tool could be helpful, but they had not had time to get used to it on the wards. Occasionally, the tablet computers were out of action. Many staff tried hard to use the tool. Most patient information was gathered when it was calm, perhaps because staff were not too busy to help them. We found that this tool could help staff know about tensions on the ward, but they need to get used to it and bring it into ward routines.


Subject(s)
COVID-19 , Patient Safety , Humans , Adult , Male , Female , COVID-19/epidemiology , Psychiatric Department, Hospital/organization & administration , United Kingdom , Qualitative Research , Middle Aged , Digital Technology , Mental Health Services/organization & administration , State Medicine/organization & administration , Patient Participation/methods
10.
Radiol Artif Intell ; 6(4): e230431, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38775671

ABSTRACT

Purpose To develop an artificial intelligence (AI) deep learning tool capable of predicting future breast cancer risk from a current negative screening mammographic examination and to evaluate the model on data from the UK National Health Service Breast Screening Program. Materials and Methods The OPTIMAM Mammography Imaging Database contains screening data, including mammograms and information on interval cancers, for more than 300 000 female patients who attended screening at three different sites in the United Kingdom from 2012 onward. Cancer-free screening examinations from women aged 50-70 years were identified and classified as risk-positive or risk-negative based on the occurrence of cancer within 3 years of the original examination. Examinations with confirmed cancer and images containing implants were excluded. From the resulting 5264 risk-positive and 191 488 risk-negative examinations, training (n = 89 285), validation (n = 2106), and test (n = 39 351) datasets were produced for model development and evaluation. The AI model was trained to predict future cancer occurrence based on screening mammograms and patient age. Performance was evaluated on the test dataset using the area under the receiver operating characteristic curve (AUC) and compared across subpopulations to assess potential biases. Interpretability of the model was explored, including with saliency maps. Results On the hold-out test set, the AI model achieved an overall AUC of 0.70 (95% CI: 0.69, 0.72). There was no evidence of a difference in performance across the three sites, between patient ethnicities, or across age groups. Visualization of saliency maps and sample images provided insights into the mammographic features associated with AI-predicted cancer risk. Conclusion The developed AI tool showed good performance on a multisite, United Kingdom-specific dataset. Keywords: Deep Learning, Artificial Intelligence, Breast Cancer, Screening, Risk Prediction. Supplemental material is available for this article. ©RSNA, 2024.
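As an illustration of the headline evaluation metric only (a sketch under assumptions, not the authors' pipeline: the labels, risk scores, prevalence and bootstrap scheme below are invented), the hold-out AUC and a simple bootstrap 95% CI could be computed as follows.

# Minimal sketch: AUC with a bootstrap confidence interval on simulated test data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.binomial(1, 0.03, size=5000)        # 1 = cancer within 3 years (simulated)
y_score = rng.normal(size=5000) + 0.75 * y_true  # model risk scores (simulated)

auc = roc_auc_score(y_true, y_score)

boot = []
for _ in range(1000):
    idx = rng.integers(0, len(y_true), size=len(y_true))
    if y_true[idx].min() == y_true[idx].max():
        continue  # resample contains a single class; AUC is undefined
    boot.append(roc_auc_score(y_true[idx], y_score[idx]))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"AUC {auc:.2f} (95% CI {ci_low:.2f}, {ci_high:.2f})")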


Subject(s)
Breast Neoplasms , Deep Learning , Early Detection of Cancer , Mammography , Humans , Breast Neoplasms/diagnosis , Breast Neoplasms/epidemiology , Breast Neoplasms/diagnostic imaging , Female , United Kingdom/epidemiology , Middle Aged , Mammography/methods , Aged , Early Detection of Cancer/methods , Risk Assessment/methods , Mass Screening/methods , Cohort Studies
11.
Sci Total Environ ; 927: 172118, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38569959

ABSTRACT

Declines in insect pollinators have been linked to a range of causative factors such as disease, loss of habitats, the quality and availability of food, and exposure to pesticides. Here, we analysed an extensive dataset generated from pesticide screening of foraging insects, pollen-nectar stores/beebread, pollen and ingested nectar across three species of bees collected at 128 European sites set in two types of crop. In this paper, we aimed to (i) derive a new index to summarise key aspects of complex pesticide exposure data and (ii) understand the links between pesticide exposures depicted by the different matrices, bee species and apple orchards versus oilseed rape crops. We found that summary indices were highly correlated with the number of pesticides detected in the related matrix but not with which pesticides were present. Matrices collected from apple orchards generally contained a higher number of pesticides (7.6 pesticides per site) than matrices from sites collected from oilseed rape crops (3.5 pesticides), with fungicides being highly represented in apple crops. A greater number of pesticides were found in pollen-nectar stores/beebread and pollen matrices compared with nectar and bee body matrices. Our results show that for a complete assessment of pollinator pesticide exposure, it is necessary to consider several different exposure routes and multiple species of bees across different agricultural systems.


Subject(s)
Crops, Agricultural , Environmental Monitoring , Pesticides , Pollination , Animals , Bees/physiology , Pesticides/analysis , Pollen , Malus , Environmental Exposure/statistics & numerical data
12.
JMIR Form Res ; 8: e53726, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38607663

ABSTRACT

BACKGROUND: Acute mental health services report high levels of safety incidents that involve both patients and staff. The potential for patients to be involved in interventions to improve safety within a mental health setting is acknowledged, and there is a need for interventions that proactively seek the patient perspective of safety. Digital technologies may offer opportunities to address this need. OBJECTIVE: This research sought to design and develop a digital real-time monitoring tool (WardSonar) to collect and collate daily information from patients in acute mental health wards about their perceptions of safety. We present the design and development process and underpinning logic model and programme theory. METHODS: The first stage involved a synthesis of the findings from a systematic review and evidence scan, interviews with patients (n=8) and health professionals (n=17), and stakeholder engagement. Cycles of design activities and discussion followed with patients, staff, and stakeholder groups, to design and develop the prototype tool. RESULTS: We drew on patient safety theory and the concepts of contagion and milieu. The data synthesis, design, and development process resulted in three prototype components of the digital monitoring tool (WardSonar): (1) a patient recording interface that asks patients to input their perceptions into a tablet computer, to assess how the ward feels and whether the direction is changing, that is, "getting worse" or "getting better"; (2) a staff dashboard and functionality to interrogate the data at different levels; and (3) a public-facing ward interface. The technology is available as open-source code. CONCLUSIONS: Recent patient safety policy and research priorities encourage innovative approaches to measuring and monitoring safety. We developed a digital real-time monitoring tool to collect information from patients in acute mental health wards about perceived safety, to support staff to respond and intervene to changes in the clinical environment more proactively.
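As a purely illustrative sketch of the kind of record the patient recording interface described above might produce (the field names and the coarse rating scale are assumptions, not taken from the open-source WardSonar code), a single anonymous entry could be modelled as:

# Hypothetical data model for one anonymous ward-atmosphere entry (illustrative only).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WardEntry:
    ward_id: str           # which ward the tablet belongs to
    recorded_at: datetime  # timestamp of the entry
    atmosphere: int        # how the ward feels right now, e.g. 1 (calm) to 5 (tense)
    direction: str         # "getting better" or "getting worse"

entry = WardEntry("ward-a", datetime(2022, 3, 1, 14, 0), atmosphere=4, direction="getting worse")

A staff dashboard would then aggregate such entries (for example, counts and direction trends per hour) without exposing individual responses, consistent with the anonymous, aggregated reporting described above.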

13.
Altern Lab Anim ; 52(3): 149-154, 2024 May.
Article in English | MEDLINE | ID: mdl-38606566

ABSTRACT

In the cosmetics sector, many products such as shampoos have a probability of accidental ocular exposure during their routine use. One very specific safety parameter is the residence time of the substance on the corneal surface, as prolonged exposure may cause injury. In this study, we developed a system that simulates corneal exposure to blinking and tear flow, for comparing the corneal clearance times of viscous detergent formulations. The Ex Vivo Eye Irritation Test (EVEIT), which uses corneal explants from discarded rabbit eyes from an abattoir, was used as the basis for the new system. To simulate blinking, we developed a silicone wiping membrane to regularly move across the corneal surface, under conditions of constant addition and aspiration of fluid, to mimic tear flow. Six shampoo formulations were tested and were shown to differ widely in their corneal clearance time. Three groups could be identified according to the observed clearance times (fast, intermediate and slow); the reference shampoo had the shortest clearance time of all tested formulations. With this new system, it is now possible to investigate an important physicochemical parameter, i.e. corneal clearance time, for the consideration of ocular safety during the development of novel cosmetic formulations.


Subject(s)
Blinking , Cornea , Animals , Rabbits , Cornea/drug effects , Blinking/drug effects , Animal Testing Alternatives/methods , Hair Preparations , Tears/drug effects
14.
Adv Rheumatol ; 64(1): 31, 2024 04 22.
Article in English | MEDLINE | ID: mdl-38650049

ABSTRACT

BACKGROUND: To illustrate how (standardised) effect sizes (ES) vary based on calculation method and to provide considerations for improved reporting. METHODS: Data from three trials of tanezumab in subjects with osteoarthritis were analyzed. ES of tanezumab versus comparator for WOMAC Pain (outcome) was defined as least squares difference between means (mixed model for repeated measures analysis) divided by a pooled standard deviation (SD) of outcome scores. Three approaches to computing the SD were evaluated: Baseline (the pooled SD of WOMAC Pain values at baseline [pooled across treatments]); Endpoint (the pooled SD of these values at the time primary endpoints were assessed); and Median (the median pooled SD of these values based on the pooled SDs across available timepoints). Bootstrap analyses were used to compute 95% confidence intervals (CI). RESULTS: ES (95% CI) of tanezumab 2.5 mg based on Baseline, Endpoint, and Median SDs in one study were - 0.416 (- 0.796, - 0.060), - 0.195 (- 0.371, - 0.028), and - 0.196 (- 0.373, - 0.028), respectively; negative values indicate pain improvement. This pattern of ES differences (largest with Baseline SD, smallest with Endpoint SD, Median SD similar to Endpoint SD) was consistent across all studies and doses of tanezumab. CONCLUSION: Differences in ES affect interpretation of treatment effect. Therefore, we advocate clearly reporting individual elements of ES in addition to its overall calculation. This is particularly important when ES estimates are used to determine sample sizes for clinical trials, as larger ES will lead to smaller sample sizes and potentially underpowered studies. TRIAL REGISTRATION: Clinicaltrials.gov NCT02697773, NCT02709486, and NCT02528188.
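Written out, the effect size described above is the mixed-model least-squares mean difference scaled by a pooled SD; the pooling formula below is the conventional two-sample version (an assumption, since the abstract does not spell it out), and the Baseline, Endpoint and Median approaches differ only in which timepoint(s) supply the SDs being pooled.

ES = \frac{\hat{\mu}_{\text{tanezumab}} - \hat{\mu}_{\text{comparator}}}{SD_{\text{pooled}}}, \qquad SD_{\text{pooled}} = \sqrt{\frac{(n_{1}-1)s_{1}^{2} + (n_{2}-1)s_{2}^{2}}{n_{1}+n_{2}-2}}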


Subject(s)
Antibodies, Monoclonal, Humanized , Osteoarthritis , Randomized Controlled Trials as Topic , Humans , Antibodies, Monoclonal, Humanized/therapeutic use , Data Interpretation, Statistical , Osteoarthritis/drug therapy , Pain Measurement , Treatment Outcome
15.
Health Rep ; 35(3): 3-17, 2024 03 20.
Article in English | MEDLINE | ID: mdl-38527107

ABSTRACT

Background: Small area estimation refers to statistical modelling procedures that leverage information or "borrow strength" from other sources or variables. This is done to enhance the reliability of estimates of characteristics or outcomes for areas that do not contain sufficient sample sizes to provide disaggregated estimates of adequate precision and reliability. There is growing interest in secondary research applications for small area estimates (SAEs). However, it is crucial to assess the analytic value of these estimates when used as proxies for individual-level characteristics or as distinct measures that offer insights at the area level. This study assessed novel area-level community belonging measures derived using small area estimation and examined associations with individual-level measures of community belonging and self-rated health. Data and methods: SAEs of community belonging within census tracts produced from the 2016-2019 cycles of the Canadian Community Health Survey (CCHS) were merged with respondent data from the 2020 CCHS. Multinomial logistic regression models were run between area-level SAEs, individual-level sense of community belonging, and self-rated health on the study sample of people aged 18 years and older. Results: Area-level community belonging was associated with individual-level community belonging, even after adjusting for individual-level sociodemographic characteristics, despite limited agreement between individual- and area-level measures. Living in a neighbourhood with low community belonging was associated with higher odds of reporting being in fair or poor health, versus being in very good or excellent health (odds ratio: 1.53; 95% confidence interval: 1.22, 1.91), even after adjusting for other factors such as individual-level sense of community belonging, which was also associated with self-rated health. Interpretation: Area-level and individual-level sense of community belonging were independently associated with self-rated health. The novel SAEs of community belonging can be used as distinct measures of neighbourhood-level community belonging and should be understood as complementary to, rather than proxies for, individual-level measures of community belonging.
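For reference, the odds ratio and confidence interval quoted above are the usual back-transformation of a logistic-regression coefficient and its standard error (a generic construction, not anything additional reported by this study); note that the reported interval (1.22, 1.91) is approximately symmetric around 1.53 on the log scale, as this construction implies.

\text{OR} = e^{\hat{\beta}}, \qquad 95\%\ \text{CI} = \left(e^{\hat{\beta} - 1.96\,SE(\hat{\beta})},\; e^{\hat{\beta} + 1.96\,SE(\hat{\beta})}\right)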


Subject(s)
Health Status , Residence Characteristics , Humans , Socioeconomic Factors , Reproducibility of Results , Canada , Health Surveys
16.
Value Health ; 27(4): 469-477, 2024 04.
Article in English | MEDLINE | ID: mdl-38307389

ABSTRACT

OBJECTIVES: The EQ-5D-5L is a commonly used health-related quality of life instrument for evaluating interventions in patients receiving dialysis; however, the minimal important difference (MID) that constitutes a meaningful treatment effect for this population has not been established. This study aims to estimate the MID for the EQ-5D-5L utility index in dialysis patients. METHODS: 6-monthly EQ-5D-5L measurements were collected from adult dialysis patients between April 2017 and November 2020 at a renal network in Sydney, Australia. EQ-VAS and Integrated Palliative care Outcome Scale Renal symptom burden scores were collected simultaneously and used as anchors. MID estimates for the EQ-5D-5L utility index were derived using anchor-based and distribution-based methods. RESULTS: A total of 352 patients with ≥1 EQ-5D-5L observation were included, constituting 1127 observations. Mean EQ-5D-5L utility index at baseline was 0.719 (SD ± 0.267), and mean EQ-5D-5L utility decreased over time by -0.017 per year (95% CI -0.029 to -0.006, P = .004). Using cross-sectional anchor-based methods, MID estimates ranged from 0.073 to 0.107. Using longitudinal anchor-based methods, MID for improvement and deterioration ranged from 0.046 to 0.079 and -0.111 to -0.048, respectively. Using receiver operating characteristic curves, MID for improvement and deterioration ranged from 0.037 to 0.122 and -0.074 to -0.063, respectively. MID estimates from distribution-based methods were consistent with anchor-based estimates. CONCLUSIONS: Anchor-based and distribution-based approaches provided EQ-5D-5L utility index MID estimates ranging from 0.034 to 0.134. These estimates can inform the target difference or "effect size" for clinical trial design among dialysis populations.
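As a hedged illustration of the distribution-based approach mentioned above (the abstract does not state which formulas were used), two common conventions are half the baseline SD and the standard error of measurement; with the baseline SD of 0.267 reported here, the half-SD rule gives a value that coincides with the upper end of the reported MID range.

\text{MID}_{0.5\,SD} = 0.5 \times 0.267 \approx 0.134, \qquad \text{MID}_{SEM} = SD_{\text{baseline}} \times \sqrt{1 - r}

where r is a reliability coefficient (not reported in the abstract).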


Subject(s)
Quality of Life , Renal Dialysis , Adult , Humans , Cross-Sectional Studies , Surveys and Questionnaires , Psychometrics
17.
Mutat Res ; 828: 111853, 2024.
Article in English | MEDLINE | ID: mdl-38401335

ABSTRACT

The widespread use of chemicals and the presence of chemical and metal residues in various foods, beverages, and other consumables have raised concerns about the potential for enhanced toxicity. This study assessed the cytotoxic effects of piperonyl butoxide (PBO) and their enhancement in combination with major co-contaminants, including imidacloprid and metals, using different cytotoxic and genotoxic assays in Chinese hamster ovary (CHO) cells. PBO exhibited elevated cytotoxic effects in poly(ADP-ribose) polymerase (PARP)-deficient CHO mutants but not in glutathione S-transferase-deficient CHO mutants. PBO cytotoxicity was enhanced by the PARP inhibitor olaparib. PBO cytotoxicity was also enhanced by co-exposure to imidacloprid, lead chloride, or sodium selenite. PBO induced γH2AX foci formation and apoptosis. The induction of DNA damage markers was elevated with PARP deficiency and with co-exposure to imidacloprid, lead chloride, or sodium selenite. Moreover, PBO triggered the formation of etch pits on plastic surfaces. These results reveal novel mechanisms of PBO cytotoxicity associated with PARP and synergistic effects with other environmental pollutants. The toxicological mechanisms underlying exposure to various combinations at different concentrations, including concentrations below the permitted limit of intake or the level of concern, require further study.


Subject(s)
Cricetulus , Drug Synergism , Neonicotinoids , Nitro Compounds , Piperonyl Butoxide , Animals , CHO Cells , Neonicotinoids/toxicity , Nitro Compounds/toxicity , Piperonyl Butoxide/toxicity , Imidazoles/toxicity , Cricetinae , Apoptosis/drug effects , DNA Damage/drug effects , Lead/toxicity , Piperazines/toxicity , Insecticides/toxicity , Poly(ADP-ribose) Polymerase Inhibitors/pharmacology , Phthalazines
18.
Sci Rep ; 14(1): 3524, 2024 02 12.
Article in English | MEDLINE | ID: mdl-38347035

ABSTRACT

Infectious and parasitic agents (IPAs) and their associated diseases are major environmental stressors that jeopardize bee health, both alone and in interaction with other stressors. Their impact on pollinator communities can be assessed by studying multiple sentinel bee species. Here, we analysed the field exposure of three sentinel managed bee species (Apis mellifera, Bombus terrestris and Osmia bicornis) to 11 IPAs (six RNA viruses, two bacteria, three microsporidia). The sentinel bees were deployed at 128 sites in eight European countries adjacent to either oilseed rape fields or apple orchards during crop bloom. Adult bees of each species were sampled before their placement and after crop bloom. The IPAs were detected and quantified using a harmonised, high-throughput and semi-automated qPCR workflow. We describe differences among bee species in IPA profiles (richness, diversity, detection frequencies, loads and their change upon field exposure, and exposure risk), with no clear patterns related to the country or focal crop. Our results suggest that the most frequent IPAs in adult bees are more appropriate for assessing the bees' IPA exposure risk. We also report positive correlations of IPA loads, supporting potential IPA transmission among sentinels and suggesting that careful consideration should be taken when introducing managed pollinators into ecologically sensitive environments.


Subject(s)
Bacteria , Pollination , Bees , Animals , Europe
19.
G3 (Bethesda) ; 14(4)2024 04 03.
Article in English | MEDLINE | ID: mdl-38334143

ABSTRACT

Pollinators are vital for food security and the maintenance of terrestrial ecosystems. Bumblebees are important pollinators across northern temperate, arctic, and alpine ecosystems, yet are in decline across the globe. Vairimorpha bombi is a parasite belonging to the fungal class Microsporidia that has been implicated in the rapid decline of bumblebees in North America, where it may be an emerging infectious disease. To investigate the evolutionary basis of pathogenicity of V. bombi, we sequenced and assembled its genome using Oxford Nanopore and Illumina technologies and performed phylogenetic and genomic evolutionary analyses. The genome assembly for V. bombi is 4.73 Mb, from which we predicted 1,870 protein-coding genes and 179 tRNA genes. The genome assembly has low repetitive content and low GC content. V. bombi's genome assembly is the smallest of the Vairimorpha and closely related Nosema genera, but larger than those found in the Encephalitozoon and Ordospora sister clades. Orthology and phylogenetic analysis revealed 18 core conserved single-copy microsporidian genes including the histone acetyltransferase (HAT) GCN5. Surprisingly, V. bombi was unique among the microsporidia in not encoding the second predicted HAT, ESA1. The V. bombi genome assembly annotation included 265 unique genes (i.e. not predicted in other microsporidia genome assemblies), 20% of which encode a secretion signal, which is a significant enrichment. Intriguingly, of the 36 microsporidian genomes we analyzed, 26 also had a significant enrichment of secretion signals encoded by unique genes, ranging from 6 to 71% of those predicted genes. These results suggest that microsporidia are under selection to generate and purge diverse and unique genes encoding secreted proteins, potentially contributing to or facilitating infection of their diverse hosts. Furthermore, V. bombi shares 5/7 conserved spore wall proteins (SWPs) with its closest relative V. ceranae (which primarily infects honeybees), while also uniquely encoding four additional SWPs. This gene class is thought to be essential for infection, providing both environmental protection and recognition and uptake into the host cell. Together, our results show that SWPs and unique genes encoding a secretion signal are rapidly evolving in the microsporidia, suggesting that they underpin key pathobiological traits including host specificity and pathogenicity.


Subject(s)
Ecosystem , Microsporidia , Nosema , Bees/genetics , Animals , Phylogeny , Nosema/genetics , North America
20.
Commun Biol ; 7(1): 125, 2024 01 24.
Article in English | MEDLINE | ID: mdl-38267685

ABSTRACT

Marine heatwaves (MHWs) cause disruption to marine ecosystems, deleteriously impacting macroflora and fauna. However, effects on microorganisms are relatively unknown despite ocean temperature being a major determinant of assemblage structure. Using data from thousands of Southern Hemisphere samples, we reveal that during an "unprecedented" 2015/16 Tasman Sea MHW, temperatures approached or surpassed the upper thermal boundary of many endemic taxa. Temperate microbial assemblages underwent a profound transition to niche states aligned with sites over 1000 km equatorward, adapting to higher temperatures and lower nutrient conditions brought on by the MHW. MHW conditions also modulate seasonal patterns of microbial diversity and support novel assemblage compositions. The most significant effects of MHWs on microbial assemblages occurred during warmer months, when temperatures exceeded the upper climatological bounds. Trends in microbial response across several MHWs in different locations suggest these are emergent properties of temperate ocean warming, which may facilitate monitoring, prediction and adaptation efforts.


Subject(s)
Ecosystem , Infrared Rays , Nutrients , Temperature