Results 1 - 20 of 672
1.
Women Birth; 37(4): 101624, 2024 May 09.
Article in English | MEDLINE | ID: mdl-38728845

ABSTRACT

BACKGROUND: The provision of high-quality midwifery education relies on well-prepared educators. Faculty members need professional development and support to deliver quality midwifery education. AIM: To identify the development needs of midwifery faculty in low- and middle-income countries of the Asia Pacific region, to inform program content and the development of guidelines for faculty development programs. METHODS: An online learning needs assessment survey was conducted with midwifery faculty from low- and middle-income countries in the Asia Pacific region. Quantitative survey data were analysed using descriptive statistics. Textual data were condensed using a general inductive approach to summarise responses and establish links between the research aim and the findings. FINDINGS: One hundred and thirty-one faculty completed the survey, and a high need for development in all aspects of faculty practice was identified. Development in research and publication was the top priority for faculty, followed closely by leadership and management development and then the more traditional activities of teaching and curriculum development. The preferred mode of program delivery was a blended learning approach. DISCUSSION: Historically, faculty development programs have focussed primarily on learning and teaching methods and educational development. Yet contemporary faculty members are expected to function in roles that include the scholarly activities of research and publication, institutional leadership and management, and program design and implementation. Unfortunately, development programs are rarely based on identified need and fail to consider the expanded role expectations of contemporary faculty practice. CONCLUSION: Future midwifery faculty development programs should address the identified need for development in all expected faculty roles.

2.
Article in English | MEDLINE | ID: mdl-38763793

ABSTRACT

BACKGROUND: An estimated 12 million adults in the United States experience delayed diagnoses and other diagnostic errors annually. Ambulatory safety nets (ASNs) are an intervention to reduce delayed diagnoses by identifying patients with abnormal results overdue for follow-up using registries, workflow redesign, and patient navigation. The authors sought to co-design a collaborative and implement colorectal cancer (CRC) ASNs across various health care settings. METHODS: A working group was convened to co-design implementation guidance, measures, and the collaborative model. Collaborative sites were recruited through a medical professional liability insurance program and chose to begin with developing an ASN for positive at-home CRC screening or overdue surveillance colonoscopy. The 18-month Breakthrough Series Collaborative ran from January 2022 to July 2023, with sites continuing to collect data while sustaining their ASNs. Data were collected from sites monthly on patients in the ASN, including the proportions who were successfully contacted, scheduled, and completed a follow-up colonoscopy. RESULTS: Six sites participated; four had an operational ASN at the end of the Breakthrough Series, with the remaining sites launching three months later. From October 2022 through February 2024, the Collaborative ASNs collectively identified 5,165 patients from the registry as needing outreach. Among patients needing outreach, 3,555 (68.8%) were successfully contacted, 2,060 (39.9%) were scheduled for a colonoscopy, and 1,504 (29.1%) completed their colonoscopy. CONCLUSION: The Collaborative successfully identified patients with previously abnormal CRC screening and facilitated completion of follow-up testing. The CRC ASN Implementation Guide offers a comprehensive road map for health care leaders interested in implementing CRC ASNs.
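The outreach-cascade percentages reported above follow directly from the registry counts given in the abstract; the short sketch below simply reproduces that arithmetic (variable names are illustrative, not from the study's data system).

```python
# Reproduce the outreach-cascade percentages from the registry counts above.
identified = 5165                     # patients flagged by the registry as needing outreach
cascade = {
    "contacted": 3555,
    "scheduled for colonoscopy": 2060,
    "completed colonoscopy": 1504,
}

for step, count in cascade.items():
    print(f"{step}: {count} ({100 * count / identified:.1f}%)")
# contacted: 3555 (68.8%), scheduled for colonoscopy: 2060 (39.9%), completed colonoscopy: 1504 (29.1%)
```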

3.
Clin Neurophysiol; 163: 39-46, 2024 Apr 10.
Article in English | MEDLINE | ID: mdl-38703698

ABSTRACT

OBJECTIVE: We set out to evaluate whether response to treatment for epileptic spasms is associated with specific candidate computational EEG biomarkers, independent of clinical attributes. METHODS: We identified 50 children with epileptic spasms, with pre- and post-treatment overnight video-EEG. After EEG samples were preprocessed in an automated fashion to remove artifacts, we calculated amplitude, power spectrum, functional connectivity, entropy, and long-range temporal correlations (LRTCs). To evaluate the extent to which each feature is independently associated with response and relapse, we conducted logistic and proportional hazards regression, respectively. RESULTS: After statistical adjustment for the duration of epileptic spasms prior to treatment, we observed an association between response and stronger baseline and post-treatment LRTCs (P = 0.042 and P = 0.004, respectively), and higher post-treatment entropy (P = 0.003). On an exploratory basis, freedom from relapse was associated with stronger post-treatment LRTCs (P = 0.006) and higher post-treatment entropy (P = 0.044). CONCLUSION: This study suggests that multiple EEG features, especially LRTCs and entropy, may predict response and relapse. SIGNIFICANCE: This study represents a step toward a more precise approach to measure and predict response to treatment for epileptic spasms.
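The abstract does not say how the LRTCs were computed; detrended fluctuation analysis (DFA) is a common choice for EEG, so the sketch below illustrates the general technique under that assumption rather than the authors' pipeline. A scaling exponent near 0.5 indicates an uncorrelated signal, while values above 0.5 indicate long-range temporal correlations.

```python
import numpy as np

def dfa_exponent(signal, scales=None):
    """Estimate the DFA scaling exponent of a 1-D signal.
    alpha ~ 0.5 -> uncorrelated; alpha > 0.5 -> long-range temporal correlations."""
    x = np.asarray(signal, dtype=float)
    profile = np.cumsum(x - x.mean())                  # integrated, mean-removed series
    if scales is None:
        scales = np.unique(np.logspace(np.log10(10), np.log10(len(x) // 4), 20).astype(int))
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        segments = profile[: n_win * s].reshape(n_win, s)
        t = np.arange(s)
        # detrend each window with a least-squares line, keep the RMS residual
        rms = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(0)
print(dfa_exponent(rng.standard_normal(5000)))         # white noise: alpha close to 0.5
```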

5.
BMJ Paediatr Open; 8(1), 2024 May 15.
Article in English | MEDLINE | ID: mdl-38754893

ABSTRACT

BACKGROUND: Poor-quality care is linked to higher rates of neonatal mortality in low-income and middle-income countries (LMICs). Limited educational and upskilling opportunities for healthcare professionals, particularly those who work in remote areas, are key barriers to providing quality neonatal care. Novel digital technologies, including mobile applications and virtual reality, can help bridge this gap. This scoping review aims to identify, analyse and compare available digital technologies for staff education and training to improve newborn care. METHODS: We conducted a structured search of seven databases (MEDLINE (Ovid), EMBASE (Ovid), EMCARE (Ovid), Global Health (CABI), CINAHL (EBSCO), Global Index Medicus (WHO) and the Cochrane Central Register of Controlled Trials) on 1 June 2023. Eligible studies were those that aimed to improve healthcare providers' competency in newborn resuscitation and management of sepsis or respiratory distress during the early postnatal period. Studies published in English from 1 January 2000 onwards were included. Data were extracted using a predefined data extraction format. RESULTS: The review identified 93 eligible studies, of which 35 were conducted in LMICs. E-learning platforms and mobile applications were common technologies used in LMICs for neonatal resuscitation training. Digital technologies were generally well accepted by trainees. Few studies reported on the long-term effects of these tools on healthcare providers' education or on neonatal health outcomes. Limited studies reported on costs and other resources necessary to maintain the educational intervention. CONCLUSIONS: Lower-cost digital methods such as mobile applications, simulation games and/or mobile mentoring that engage healthcare providers in continuous skills practice are feasible methods for improving neonatal resuscitation skills in LMICs. To further consider the use of these digital technologies in resource-limited settings, assessments of the resources needed to sustain the intervention and of the effectiveness of the digital technologies on long-term health provider performance and neonatal health outcomes are required.


Subject(s)
Digital Technology, Resuscitation, Humans, Infant, Newborn, Resuscitation/education, Health Personnel/education, Developing Countries, Clinical Competence
7.
Am J Epidemiol; 2024 May 13.
Article in English | MEDLINE | ID: mdl-38751312

ABSTRACT

The Cohort Study of Mobile Phone Use and Health (COSMOS) has repeatedly collected self-reported and operator-recorded data on mobile phone use. Assessing health effects using self-reported information is prone to measurement error, but operator data were available prospectively for only part of the study population and did not cover past mobile phone use. To optimize the available data and reduce bias, we evaluated different statistical approaches for constructing mobile phone exposure histories within COSMOS. We evaluated and compared the performance of four regression calibration (RC) methods (simple, direct, inverse, and a generalized additive model for location, scale, and shape), complete-case (CC) analysis, and multiple imputation (MI) in a simulation study with a binary health outcome. We used self-reported and operator-recorded mobile phone call data collected at baseline (2007-2012) from participants in Denmark, Finland, the Netherlands, Sweden, and the UK. Parameter estimates obtained using the simple, direct, and inverse RC methods were associated with less bias and lower mean squared error than those obtained with CC analysis or MI. We showed that RC methods resulted in more accurate estimation of the relationship between mobile phone use and health outcomes by combining self-reported data with objective operator-recorded data available for a subset of participants.
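As a rough illustration of the simple regression calibration idea described above: fit a calibration model for the operator-recorded exposure given the self-report in the subset with operator data, then use the calibrated prediction as the exposure in the outcome model. This is a minimal sketch on simulated data, not the COSMOS code; all names and parameters are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
true_use = rng.gamma(shape=2.0, scale=50.0, size=n)            # "true" call-time (hours)
self_report = true_use * rng.lognormal(0.0, 0.5, size=n)       # error-prone self-report
outcome = rng.binomial(1, 1 / (1 + np.exp(-(-3 + 0.002 * true_use))))

has_operator = rng.random(n) < 0.3                             # operator data for a subset only
operator = true_use[has_operator] + rng.normal(0, 5, has_operator.sum())

# Step 1: calibration model E[operator-recorded | self-report] in the validation subset.
cal = sm.OLS(operator, sm.add_constant(self_report[has_operator])).fit()

# Step 2: replace the self-report with its calibrated prediction for everyone.
calibrated = cal.predict(sm.add_constant(self_report))

# Step 3: outcome model using the calibrated exposure.
fit = sm.Logit(outcome, sm.add_constant(calibrated)).fit(disp=0)
print(fit.params)
```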

8.
Healthcare (Basel); 12(7), 2024 Mar 29.
Article in English | MEDLINE | ID: mdl-38610164

ABSTRACT

Cancer patients undergoing major interventions face numerous challenges, including the adverse effects of cancer and the side effects of treatment. Cancer rehabilitation is vital in ensuring cancer patients have the support they need to maximise treatment outcomes and minimise treatment-related side effects and symptoms. The Active Together service is a multi-modal rehabilitation service designed to address critical support gaps for cancer patients. The service is located and provided in Sheffield, UK, an area with higher cancer incidence and mortality rates than the national average. The service aligns with local and regional cancer care objectives and aims to improve the clinical and quality-of-life outcomes of cancer patients by using lifestyle behaviour-change techniques to address their physical, nutritional, and psychological needs. This paper describes the design and initial implementation of the Active Together service, highlighting its potential to support and benefit cancer patients.

10.
J Addict Med; 2024 Apr 10.
Article in English | MEDLINE | ID: mdl-38598300

ABSTRACT

OBJECTIVE: Buprenorphine is a medication for opioid use disorder that reduces mortality. This study aims to investigate the less well-understood relationship between the dose in the early stages of treatment and the subsequent risk of death. METHODS: We used Kentucky prescription monitoring data to identify adult Kentucky residents initiating transmucosal buprenorphine medication for opioid use disorder (January 2017 to November 2019). The average daily buprenorphine dose for days covered in the first 30 days of treatment was categorized as ≤8 mg, >8 to ≤16 mg, and >16 mg. Patients were followed for 365 days after the first 30 days of buprenorphine treatment. Endpoints were opioid-involved overdose death and death from other causes. Causes and dates of death were obtained using Kentucky death certificate records. Associations were evaluated using multivariable Fine and Gray models adjusting for patient baseline characteristics. RESULTS: In the cohort of 49,857 patients, there were 227 opioid-involved overdose deaths and 459 deaths from other causes. Compared with ≤8 mg, the adjusted subdistribution hazard ratio (aSHR) of opioid-involved overdose death was 55% lower (aSHR, 0.45; 95% confidence interval [CI], 0.34-0.60) for patients receiving >8 to ≤16 mg and 64% lower (aSHR, 0.36; 95% CI, 0.25-0.52) for patients receiving >16 mg. The incidence of death from other causes was lower in patients receiving >8 to ≤16 mg (aSHR, 0.78; 95% CI, 0.62-0.98) and >16 mg (aSHR, 0.62; 95% CI, 0.47-0.80) versus the ≤8 mg dose. CONCLUSIONS: Higher buprenorphine doses in the first 30 days were associated with reduced opioid-involved overdose death and death from other causes, supporting the benefit of higher doses in reducing mortality.
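The Fine and Gray subdistribution hazard model used in the study is usually fit with R's cmprsk package and is not available in the common Python survival libraries; as a rough stand-in, the sketch below fits a cause-specific Cox model with lifelines, treating the competing event as censoring. This estimates a different quantity than the subdistribution hazard, and the file and column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative columns: follow-up days, event code (0 = censored,
# 1 = opioid-involved overdose death, 2 = death from other causes),
# and the first-30-day dose category as dummy variables.
df = pd.read_csv("buprenorphine_cohort.csv")            # hypothetical file
df["event_overdose"] = (df["event"] == 1).astype(int)   # competing event treated as censoring

cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "event_overdose", "dose_8_16mg", "dose_gt16mg", "age", "male"]],
    duration_col="followup_days",
    event_col="event_overdose",
)
cph.print_summary()   # hazard ratios for the dose categories vs. the <=8 mg reference
```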

11.
IJID Reg; 11: 100361, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38634070

ABSTRACT

Objectives: The spread of extended-spectrum cephalosporin-resistant Enterobacterales (ESCrE) and carbapenem-resistant Enterobacterales (CRE) has resulted in increased morbidity, mortality, and health care costs worldwide. To identify the factors associated with ESCrE and CRE colonization within hospitals, we enrolled hospitalized patients at a regional hospital located in Guatemala. Methods: Stool samples were collected from randomly selected patients using a cross-sectional study design (March-September, 2021), and samples were tested for the presence of ESCrE and CRE. Hospital-based and household variables were examined for associations with ESCrE and CRE colonization using lasso regression models, clustered by ward (n = 21). Results: A total of 641 patients were enrolled, of whom 593 had complete data sets. Colonization with ESCrE (72.3%, n = 429/593) was negatively associated with carbapenem administration (odds ratio [OR] 0.21, 95% confidence interval [CI] 0.11-0.42) and positively associated with ceftriaxone administration (OR 1.61, 95% CI 1.02-2.53), as was reported hospital admission within 30 days of the current hospitalization (OR 2.84, 95% CI 1.19-6.80). Colonization with CRE (34.6%, n = 205/593) was associated with carbapenem administration (OR 2.62, 95% CI 1.39-4.97), reported previous hospital admission within 30 days of the current hospitalization (OR 2.58, 95% CI 1.17-5.72), hospitalization in wards with more patients (OR 1.05, 95% CI 1.02-1.08), hospitalization for ≥4 days (OR 3.07, 95% CI 1.72-5.46), and intubation (OR 2.51, 95% CI 1.13-5.59). No household-based variables were associated with ESCrE or CRE colonization in hospitalized patients. Conclusion: The hospital-based risk factors identified in this study are similar to those reported for the risk of health care-associated infections, consistent with colonization being driven by hospital settings rather than community factors. This also suggests that colonization with ESCrE and CRE could be a useful metric for evaluating the efficacy of infection prevention and control programs in clinics and hospitals.
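The abstract reports lasso (L1-penalized) regression models for colonization; a minimal scikit-learn sketch of that kind of model is below. It does not reproduce the ward-level clustering used in the study, and the file and column names are hypothetical.

```python
import pandas as pd
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative feature set; 'escre' is 1 if the stool sample grew ESCrE.
df = pd.read_csv("guatemala_colonization.csv")        # hypothetical file
X = df[["carbapenem", "ceftriaxone", "prior_admission_30d",
        "ward_census", "los_ge_4d", "intubation"]]
y = df["escre"]

# The L1 penalty performs variable selection; C controls the penalty strength.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
odds_ratios = pd.Series(np.exp(model.coef_[0]), index=X.columns)
print(odds_ratios)    # exponentiated coefficients read as (penalized) odds ratios
```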

12.
Nanotoxicology; 18(2): 214-228, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38557361

ABSTRACT

Carbon nanotubes (CNTs) are increasingly used in industrial applications, but toxicological data in animals and humans remain sparse. To assess the toxicological dose-response of CNTs and to evaluate their pulmonary biopersistence, their quantification in tissues, especially lungs, is crucial. There are currently no reference methods or reference materials for low levels of CNTs in organic matter, and among existing analytical methods, few have been fully and properly validated. To remedy this, we undertook an inter-laboratory comparison on samples of freeze-dried pig lung, ground and spiked with CNTs. Eight laboratories were enrolled to analyze 3 types of CNTs, each at 2 concentration levels, in this organic matrix. Depending on the analysis technique used (specific to each laboratory) and the material being analyzed, sample preparation may or may not have involved prior digestion of the matrix. Overall, although challenging, the laboratories' ability to quantify CNT levels in organic matter was demonstrated. However, CNT levels were often overestimated. Trueness analysis identified effective methods, but systematic errors persisted for some. Choosing the assigned value proved complex. Indirect analysis methods, despite requiring additional steps, outperformed direct methods. The study emphasizes the need for reference materials, enhanced precision, and organized inter-laboratory comparisons.


Subject(s)
Lung, Carbon Nanotubes, Carbon Nanotubes/chemistry, Carbon Nanotubes/toxicity, Animals, Swine, Lung/chemistry, Lung/drug effects, Laboratories/standards, Organic Chemicals/analysis, Organic Chemicals/chemistry
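A small illustration of the trueness (relative bias) check described in the abstract above: compare each laboratory's reported CNT level in the spiked lung material against the assigned value and express the difference as a percentage. All numbers here are made up for illustration.

```python
import numpy as np

assigned = 10.0    # assigned CNT level in the spiked lung material (illustrative units and value)
reported = np.array([9.1, 10.8, 12.5, 10.2, 14.0, 9.7, 11.9, 10.5])   # one value per laboratory, illustrative

bias_pct = 100 * (reported - assigned) / assigned   # relative bias (trueness) per laboratory
print(np.round(bias_pct, 1))
print(f"mean bias: {bias_pct.mean():+.1f}% (positive values indicate overestimation)")
```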
13.
Environ Int; 185: 108552, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38458118

ABSTRACT

BACKGROUND: Each new generation of mobile phone technology has triggered discussions about potential carcinogenicity from exposure to radiofrequency electromagnetic fields (RF-EMF). Available evidence has been insufficient to draw conclusions about long-term and heavy mobile phone use, limited by differential recall and selection bias or crude exposure assessment. The Cohort Study on Mobile Phones and Health (COSMOS) was specifically designed to overcome these shortcomings. METHODS: We recruited participants in Denmark, Finland, the Netherlands, Sweden, and the UK in 2007-2012. The baseline questionnaire assessed lifetime history of mobile phone use. Participants were followed through population-based cancer registers to identify glioma, meningioma, and acoustic neuroma cases during follow-up. Non-differential exposure misclassification was reduced by adjusting estimates of mobile phone call-time through regression calibration methods based on self-reported data and objective operator-recorded information at baseline. Hazard ratios (HR) and 95% confidence intervals (CI) for glioma, meningioma, and acoustic neuroma in relation to lifetime history of mobile phone use were estimated with Cox regression models with attained age as the underlying time scale, adjusted for country, sex, educational level, and marital status. RESULTS: 264,574 participants accrued 1,836,479 person-years. During a median follow-up of 7.12 years, 149 incident cases of glioma, 89 of meningioma, and 29 of acoustic neuroma were diagnosed. The adjusted HR per 100 regression-calibrated cumulative hours of mobile phone call-time was 1.00 (95% CI 0.98-1.02) for glioma, 1.01 (95% CI 0.96-1.06) for meningioma, and 1.02 (95% CI 0.99-1.06) for acoustic neuroma. For glioma, the HR for ≥1908 regression-calibrated cumulative hours (90th percentile cut-point) was 1.07 (95% CI 0.62-1.86). More than 15 years of mobile phone use was not associated with an increased tumour risk; for glioma the HR was 0.97 (95% CI 0.62-1.52). CONCLUSIONS: Our findings suggest that the cumulative amount of mobile phone use is not associated with the risk of developing glioma, meningioma, or acoustic neuroma.


Subject(s)
Brain Neoplasms, Cell Phone Use, Cell Phone, Glioma, Meningeal Neoplasms, Meningioma, Acoustic Neuroma, Humans, Meningioma/epidemiology, Meningioma/etiology, Cohort Studies, Acoustic Neuroma/epidemiology, Acoustic Neuroma/etiology, Prospective Studies, Brain Neoplasms/epidemiology, Brain Neoplasms/etiology, Glioma/epidemiology, Glioma/etiology, Electromagnetic Fields, Surveys and Questionnaires, Case-Control Studies
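One common way to implement "attained age as the underlying time scale" is a Cox model with delayed entry: participants enter the risk set at their age at recruitment and exit at their age at diagnosis or censoring. The sketch below shows this with lifelines' entry_col; the file and column names are hypothetical and the covariate set is simplified relative to the study.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative columns: age_entry (age at recruitment), age_exit (age at diagnosis
# or end of follow-up), event (1 = incident glioma), the regression-calibrated
# exposure in hundreds of cumulative call-hours, and dummy-coded covariates.
df = pd.read_csv("cosmos_analytic_file.csv")           # hypothetical file

cph = CoxPHFitter()
cph.fit(
    df[["age_entry", "age_exit", "event",
        "call_hours_per100", "female", "university_education", "married"]],
    duration_col="age_exit",
    event_col="event",
    entry_col="age_entry",                             # delayed entry -> attained age is the time scale
)
cph.print_summary()   # HR for call_hours_per100 is per 100 calibrated cumulative hours
```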
14.
Appetite; 197: 107319, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38514019

ABSTRACT

Research suggests that protein intake, recognised as vital for combating negative health outcomes, consistently falls below recommendations in older adults. Decreased food intake, combined with age-related eating complications, is a major determinant of this protein undernutrition. If nutritional interventions are to be effective and sustainable, they must enable eating pleasure, cater for personal preferences and be adaptable to different eating patterns. As such, we aimed to identify successful strategies for at-home protein fortification to empower older adults to take a personalised approach to their nutrition without requiring a large behavioural change. To explore healthy older adults' (age 70+) acceptability of, and preferences for, at-home protein fortification, the European project Fortiphy led discussions with older adults (n = 37) and caregivers of older adults (n = 15) to develop high-protein recipes, which were then utilised in a home-use trial with healthy older adults (n = 158). Each fortified recipe was paired with a questionnaire to rate ease of preparation and liking, and an end-of-study questionnaire was provided to capture overall opinions and preferences. A unique feature of this study is that the protein-fortified recipes were prepared and tested by older adults themselves, in their own homes. Findings showed that older adults were unaware of the importance of protein in ageing and did not currently have a desire to fortify their foods. Yet they were positive about the concept and highlighted the importance of taste, familiar ingredients, and preferred preparation methods. Cultural preferences across countries were identified as having the most influence on the liking of fortified meals. This study also indicated a need for increased awareness of protein requirements to motivate the use of fortification.


Subject(s)
Fortified Foods, Nutritional Status, Humans, Aged, Aging, France, United Kingdom
15.
bioRxiv; 2024 May 14.
Article in English | MEDLINE | ID: mdl-38496442

ABSTRACT

Sepsis-associated encephalopathy (SAE) is a common manifestation in septic patients that is associated with increased risk of long-term cognitive impairment. SAE is driven, at least in part, by brain endothelial dysfunction in response to systemic cytokine signaling. However, the mechanisms driving SAE and its consequences remain largely unknown. Here, we performed translating ribosome affinity purification and RNA-sequencing (TRAP-seq) from the brain endothelium to determine the transcriptional changes after an acute endotoxemic (LPS) challenge. LPS induced a strong acute transcriptional response in the brain endothelium that partially correlated with the whole-brain transcriptional response and suggested an endothelial-specific hypoxia response. Consistent with a crucial role for IL-6, loss of the main negative regulator of this pathway, SOCS3, led to a broadening of the population of genes responsive to LPS, suggesting that overactivation of the IL-6/JAK/STAT3 pathway produces an increased transcriptional response that could explain our prior findings of severe brain injury in these mice. To identify any potential sequelae of this acute response, we performed brain TRAP-seq following a battery of behavioral tests in mice after apparent recovery. We found that the transcriptional response returned to baseline within days post-challenge. Despite the transient nature of the response, mice that recovered from the endotoxemic shock showed mild, sex-dependent cognitive impairment, suggesting that the acute brain injury led to sustained, non-transcriptional effects. A better understanding of the transcriptional and non-transcriptional changes in response to shock is needed in order to prevent and/or reverse the devastating consequences of septic shock.

16.
Ann Pharmacother; 10600280241236507, 2024 Mar 14.
Article in English | MEDLINE | ID: mdl-38486351

ABSTRACT

BACKGROUND: Albumin resuscitation in septic shock is recommended only in patients who have received large volumes of crystalloid resuscitation, regardless of serum albumin concentration. The role of albumin is still largely debated, and the evidence to support its use is still lacking. OBJECTIVE: The objective of this study was to evaluate whether albumin replacement increases the number of vasopressor-free days in patients with septic shock and hypoalbuminemia. METHODS: A retrospective analysis was conducted to assess the effect of albumin replacement in septic shock. Hypoalbuminemic patients with septic shock who received albumin were retrospectively compared with a cohort who did not. The primary outcome was the number of vasopressor-free days at day 14 from shock presentation, which was analyzed using a linear regression model adjusted for confounders. RESULTS: There was no difference in vasopressor-free days at day 14 in patients who received albumin versus those who did not, after adjusting for confounders of exposure (0.50, 95% CI = -0.97 to 1.97; P = 0.502). There was also no difference in secondary outcomes except for the need for invasive mechanical ventilation (MV), which was significantly lower in patients who received albumin (61 [54.4%] vs 88 [67.7%]; P = 0.035). CONCLUSIONS AND RELEVANCE: We observed no difference in vasopressor-free days at day 14 in patients with hypoalbuminemia who received albumin compared with those who did not. However, patients who received albumin required invasive MV significantly less often, although further studies are warranted to assess this effect.
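A minimal sketch of the kind of adjusted linear regression described above, with vasopressor-free days at day 14 as the outcome and an albumin-exposure indicator plus confounders as covariates, using statsmodels' formula interface. The file, column names, and the particular confounders are hypothetical, not taken from the study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative columns: vasopressor-free days at day 14, an albumin exposure flag,
# and a few plausible confounders (not necessarily those used in the study).
df = pd.read_csv("septic_shock_cohort.csv")        # hypothetical file

model = smf.ols(
    "vaso_free_days_d14 ~ albumin + age + apache_ii + baseline_lactate + crystalloid_volume",
    data=df,
).fit()

# The 'albumin' coefficient is the adjusted mean difference in vasopressor-free days.
print(model.params["albumin"], model.conf_int().loc["albumin"].tolist())
```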

17.
Nat Med; 30(4): 1075-1084, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38429522

ABSTRACT

Chronic pain is a common problem, with more than one-fifth of adult Americans reporting pain daily or on most days. It adversely affects the quality of life and imposes substantial personal and economic costs. Efforts to treat chronic pain using opioids had a central role in precipitating the opioid crisis. Despite an estimated heritability of 25-50%, the genetic architecture of chronic pain is not well-characterized, in part because studies have largely been limited to samples of European ancestry. To help address this knowledge gap, we conducted a cross-ancestry meta-analysis of pain intensity in 598,339 participants in the Million Veteran Program, which identified 126 independent genetic loci, 69 of which are new. Pain intensity was genetically correlated with other pain phenotypes, level of substance use and substance use disorders, other psychiatric traits, education level and cognitive traits. Integration of the genome-wide association study findings with functional genomics data shows enrichment for putatively causal genes (n = 142) and proteins (n = 14) expressed in brain tissues, specifically in GABAergic neurons. Drug repurposing analysis identified anticonvulsants, β-blockers and calcium-channel blockers, among other drug groups, as having potential analgesic effects. Our results provide insights into key molecular contributors to the experience of pain and highlight attractive drug targets.


Subject(s)
Chronic Pain, Veterans, Adult, Humans, Chronic Pain/drug therapy, Chronic Pain/genetics, Genome-Wide Association Study/methods, Pain Measurement, Quality of Life, Genetic Predisposition to Disease, Single Nucleotide Polymorphism/genetics
18.
J Neurol Phys Ther; 48(2): 102-111, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38441461

ABSTRACT

BACKGROUND/PURPOSE: Gait impairments in Parkinson disease (PD) contribute to decreased quality of life. This randomized controlled trial examined immediate- and longer-term effects of a single-joint robotic exoskeleton device (EXOD), the Honda Walking Assist device, on gait. METHODS: Participants (n = 45) with PD (Hoehn and Yahr stages 1-3) were randomized to a robotic-assisted gait training (RAGT) group (n = 23) or control (CON) group (n = 22). The RAGT group was tested with and without the EXOD at baseline and then received supervised in-home and community training with the EXOD twice weekly for 8 weeks. The CON group received no interventions. Outcome measures included gait speed (primary), gait endurance (6-minute walk test), perceived ease of walking, and questionnaires and logs assessing performance of daily activities, freezing of gait, and daily activity levels. RESULTS: Forty participants completed the study. No significant immediate impact of EXOD usage on participants' gait measures was found. Changes in gait speed and secondary outcome measures postintervention did not differ significantly between the RAGT and CON groups. Participants with greater disease severity (worse baseline motor scores) had greater improvements in stride length during unassisted walking after the intervention than those with lower severity (mean difference: 3.22, 95% confidence interval: 0.05-6.40; P = 0.04). DISCUSSION AND CONCLUSIONS: All RAGT participants could use the EXOD safely. The RAGT treatment used in this mostly low-impairment population of people with PD may have been ineffective and/or insufficiently dosed to produce a positive treatment effect. Our findings suggest that RAGT interventions in PD may be more effective in individuals with greater motor impairments.


Subject(s)
Neurologic Gait Disorders, Parkinson Disease, Robotic Surgical Procedures, Humans, Neurologic Gait Disorders/etiology, Quality of Life, Gait, Walking, Exercise Therapy
19.
Soc Sci Med; 345: 116652, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38364721

ABSTRACT

BACKGROUND: The World Health Organization Surgical Safety Checklist (SSC) is a tool designed to enhance team communication and patient safety. When used properly, the SSC acts as a layer of defence against never events. In this study, we performed a secondary qualitative analysis of operating theatre (OT) SSC observational notes to examine how the SSC was used after an intensive SSC re-implementation effort, drawing on relevant theories to shed light on the observed patterns of behaviour. We aimed to go beyond assessing checklist compliance and to understand potential sociopsychological mechanisms underlying the variations in SSC practices. METHODS: Direct observation notes of 109 surgical procedures across 13 surgical disciplines were made by two trained nurses in the OT of a large tertiary hospital in Singapore from February to April 2022, three months after SSC re-implementation. Only notes relevant to the use of the SSC were extracted and analyzed using reflexive thematic analysis. Data were coded following an inductive process to identify themes or patterns of SSC practices. These patterns were subsequently interpreted against a relevant theory to appreciate the potential sociopsychological forces behind them. RESULTS: Two broad types of SSC practices and their respective sub-themes were identified. Type 1 (vs. Type 2) SSC practices are characterized by patience and thoroughness (vs. hurriedness and omission) in carrying out the SSC process, dedication and attention (vs. delegation and distraction) to the SSC safety checks, and frequent (vs. absent) safety voices during the conduct of the SSC. These patterns were conceptualized as safety-seeking action vs. ritualistic action using Merton's social deviance theory. CONCLUSION: Ritualistic practice of the SSC can undermine surgical safety by creating conditions conducive to never events. To fully realize the SSC's potential as an essential tool for communication and safety, a concerted effort is needed to balance thoroughness with efficiency. Additionally, fostering a culture of collaboration and collegiality is crucial to reinforce and enhance the culture of surgical safety.


Subject(s)
Checklist, Operating Rooms, Humans, Qualitative Research, Patient Safety, Medical Errors
20.
J Med Internet Res; 26: e45114, 2024 Feb 07.
Article in English | MEDLINE | ID: mdl-38324379

ABSTRACT

BACKGROUND: Adolescents are susceptible to mental illness and have experienced substantial disruption owing to the COVID-19 pandemic. The digital environment is increasingly important in the context of a pandemic when in-person social connection is restricted. OBJECTIVE: This study aims to estimate whether depression and anxiety had worsened compared with the prepandemic period and examine potential associations with sociodemographic characteristics and behavioral factors, particularly digital behaviors. METHODS: We analyzed cross-sectional and longitudinal data from a large, representative Greater London adolescent cohort study: the Study of Cognition, Adolescents and Mobile Phones (SCAMP). Participants completed surveys at T1 between November 2016 and July 2018 (N=4978; aged 13 to 15 years) and at T2 between July 2020 and June 2021 (N=1328; aged 16 to 18 years). Depression and anxiety were measured using the Patient Health Questionnaire and Generalized Anxiety Disorder scale, respectively. Information on the duration of total mobile phone use, social network site use, and video gaming was also collected using questionnaires. Multivariable logistic regression was used to assess the cross-sectional and longitudinal associations of sociodemographic characteristics, digital technology use, and sleep duration with clinically significant depression and anxiety. RESULTS: The proportion of adolescents who had clinical depression and anxiety significantly increased at T2 (depression: 140/421, 33.3%; anxiety: 125/425, 29.4%) compared with the proportion of adolescents at T1 (depression: 57/421, 13.5%; anxiety: 58/425, 13.6%; P for 2-proportion z test <.001 for both depression and anxiety). Depression and anxiety levels were similar between the summer holiday, school opening, and school closures. Female participants had higher odds of new incident depression (odds ratio [OR] 2.5, 95% CI 1.5-4.18) and anxiety (OR 2.11, 95% CI 1.23-3.61) at T2. A high level of total mobile phone use at T1 was associated with developing depression at T2 (OR 1.89, 95% CI 1.02-3.49). Social network site use was associated with depression and anxiety cross-sectionally at T1 and T2 but did not appear to be associated with developing depression or anxiety longitudinally. Insufficient sleep at T1 was associated with developing depression at T2 (OR 2.26, 95% CI 1.31-3.91). CONCLUSIONS: The mental health of this large sample of adolescents from London deteriorated during the pandemic without noticeable variations relating to public health measures. The deterioration was exacerbated in girls, those with preexisting high total mobile phone use, and those with preexisting disrupted sleep. Our findings suggest the necessity for allocating resources to address these modifiable factors and target high-risk groups.


Subject(s)
COVID-19, Digital Technology, Adolescent, Female, Humans, Cohort Studies, Longitudinal Studies, Pandemics, Cross-Sectional Studies, Depression, Anxiety
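The two-proportion z test reported in the abstract above (clinical depression in 140/421 adolescents at T2 vs 57/421 at T1) can be reproduced with statsmodels; the sketch below is a quick check of the reported P < .001.

```python
from statsmodels.stats.proportion import proportions_ztest

# Depression prevalence among adolescents assessed at both waves (from the abstract):
# T2: 140 of 421 (33.3%) vs. T1: 57 of 421 (13.5%).
counts = [140, 57]
nobs = [421, 421]

z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.2e}")      # p is well below .001, as reported
```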