Results 1 - 3 of 3
1.
JAMA Netw Open ; 7(8): e2425923, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39110461

ABSTRACT

Importance: Residents must prepare for effective communication with patients after medical errors. The video-based communication assessment (VCA) is software that plays video of a patient scenario, asks the physician to record what they would say, engages crowdsourced laypeople to rate audio recordings of physician responses, and presents feedback to physicians. Objective: To evaluate the effectiveness of VCA feedback in resident error disclosure skill training. Design, Setting, and Participants: This single-blinded, randomized clinical trial was conducted from July 2022 to May 2023 at 7 US internal medicine and family medicine residencies (10 total sites). Participants were second-year residents attending required teaching conferences. Data analysis was performed from July to December 2023. Intervention: Residents completed 2 VCA cases at time 1 and were randomized to the intervention, an individual feedback report provided in the VCA application after 2 weeks, or to control, in which feedback was not provided until after time 2. Residents completed 2 additional VCA cases after 4 weeks (time 2). Main Outcomes and Measures: Panels of crowdsourced laypeople rated recordings of residents disclosing simulated medical errors to create scores on a 5-point scale. Reports included learning points derived from layperson comments. Mean time 2 ratings were compared to test the hypothesis that residents who had access to feedback on their time 1 performance would score higher at time 2 than those without feedback access. Residents were surveyed about demographic characteristics, disclosure experience, and feedback use. The intervention's effect was examined using analysis of covariance. Results: A total of 146 residents (87 [60.0%] aged 25-29 years; 60 female [41.0%]) completed the time 1 VCA, and 103 (70.5%) completed the time 2 VCA (53 randomized to intervention and 50 randomized to control); of those, 28 (54.9%) reported reviewing their feedback. 
Analysis of covariance found a significant main effect of feedback between intervention and control groups at time 2 (mean [SD] score, 3.26 [0.45] vs 3.14 [0.39]; difference, 0.12; 95% CI, 0.08-0.48; P = .01). In post hoc comparisons restricted to residents without prior disclosure experience, intervention residents scored higher than those in the control group at time 2 (mean [SD] score, 3.33 [0.43] vs 3.09 [0.44]; difference, 0.24; 95% CI, 0.01-0.48; P = .007). Worse performance at time 1 was associated with increased likelihood of dropping out before time 2 (odds ratio, 2.89; 95% CI, 1.06-7.84; P = .04). Conclusions and Relevance: In this randomized clinical trial, self-directed review of crowdsourced feedback was associated with higher ratings of internal medicine and family medicine residents' error disclosure skill, particularly for those without real-life error disclosure experience, suggesting that such feedback may be an effective way for residency programs to address their requirement to prepare trainees for communicating with patients after medical harm. Trial Registration: ClinicalTrials.gov Identifier: NCT06234085.
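The trial's primary analysis is an analysis of covariance: time 2 scores are compared between arms while adjusting for time 1 performance. A minimal sketch of that model, fit by ordinary least squares via the normal equations, is below; the resident data and the small solver are illustrative assumptions, not the study's dataset or statistical code.

```python
# Sketch of an ANCOVA like the trial's: time 2 communication score modeled
# on treatment arm, adjusting for time 1 (baseline) score.
# All numbers are made-up illustrative data, not the study's.

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    m = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * c for a, c in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def ancova_group_effect(group, baseline, outcome):
    """Fit outcome = b0 + b1*group + b2*baseline by OLS (normal equations).
    b1 is the baseline-adjusted between-arm difference the ANCOVA tests."""
    n = len(outcome)
    X = [[1.0, float(g), float(x)] for g, x in zip(group, baseline)]
    XtX = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
           for r in range(3)]
    Xty = [sum(X[i][r] * outcome[i] for i in range(n)) for r in range(3)]
    _, b1, _ = solve3(XtX, Xty)
    return b1

# Hypothetical residents: 1 = feedback arm, 0 = control.
group    = [1, 1, 1, 1, 0, 0, 0, 0]
baseline = [3.0, 3.2, 3.4, 3.1, 3.0, 3.2, 3.4, 3.1]
time2    = [3.3, 3.4, 3.6, 3.3, 3.1, 3.2, 3.4, 3.2]
effect = ancova_group_effect(group, baseline, time2)
```

With identical baseline distributions in both arms, the adjusted effect `b1` reduces to the raw between-arm difference in time 2 means; in the trial, the covariate adjustment is what accounts for chance baseline imbalance.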


Subject(s)
Crowdsourcing , Internship and Residency , Medical Errors , Humans , Internship and Residency/methods , Female , Male , Crowdsourcing/methods , Adult , Medical Errors/prevention & control , Clinical Competence/statistics & numerical data , Clinical Competence/standards , Single-Blind Method , Truth Disclosure , Internal Medicine/education , Physician-Patient Relations , Feedback
2.
JMIR Med Educ ; 8(4): e40758, 2022 Oct 03.
Article in English | MEDLINE | ID: mdl-36190751

ABSTRACT

BACKGROUND: US residents require practice and feedback to meet Accreditation Council for Graduate Medical Education mandates and patient expectations for effective communication after harmful errors. Current instructional approaches rely heavily on lectures, rarely provide individualized feedback to residents about communication skills, and may not assure that residents acquire the skills desired by patients. The Video-based Communication Assessment (VCA) app is a novel tool for simulating communication scenarios for practice and obtaining crowdsourced assessments and feedback on physicians' communication skills. We previously established that crowdsourced laypeople can reliably assess residents' error disclosure skills with the VCA app. However, its efficacy for error disclosure training has not been tested. OBJECTIVE: We aimed to evaluate the efficacy of using VCA practice and feedback as a stand-alone intervention for the development of residents' error disclosure skills. METHODS: We conducted a pre-post study in 2020 with pathology, obstetrics and gynecology, and internal medicine residents at an academic medical center in the United States. At baseline, residents each completed 2 specialty-specific VCA cases depicting medical errors. Audio responses were rated by at least 8 crowdsourced laypeople using 6 items on a 5-point scale. At 4 weeks, residents received numerical and written feedback derived from layperson ratings and then completed 2 additional cases. Residents were randomly assigned cases at baseline and after feedback assessments to avoid order effects. Ratings were aggregated to create overall assessment scores for each resident at baseline and after feedback. Residents completed a survey of demographic characteristics. We used a 2×3 split-plot ANOVA to test the effects of time (pre-post) and specialty on communication ratings.
RESULTS: In total, 48 residents completed 2 cases at time 1, received a feedback report at 4 weeks, and completed 2 more cases. The mean ratings of residents' communication were higher at time 2 versus time 1 (3.75 vs 3.53; P<.001). Residents with prior error disclosure experience performed better at time 1 compared to those without such experience (ratings: mean 3.63 vs mean 3.46; P=.02). No differences in communication ratings based on specialty or years in training were detected. Residents' communication was rated higher for angry cases versus sad cases (mean 3.69 vs mean 3.58; P=.01). Less than half of all residents (27/62, 44%) reported prior experience with disclosing medical harm to patients; experience differed significantly among specialties (P<.001) and was lowest for pathology (1/17, 6%). CONCLUSIONS: Residents at all training levels can potentially improve error disclosure skills with VCA practice and feedback. Error disclosure curricula should prepare residents for responding to various patient affects. Simulated error disclosure may particularly benefit trainees in diagnostic specialties, such as pathology, with infrequent real-life error disclosure practice opportunities. Future research should examine the effectiveness, feasibility, and acceptability of VCA within a longitudinal error disclosure curriculum.
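The aggregation step this study describes (several laypeople each rating 6 items per case, combined into one overall score per resident per time point) can be sketched as follows. The nested-list layout and the ratings are illustrative assumptions, not the VCA app's actual data model; real panels used at least 8 raters per response.

```python
# Sketch of aggregating crowdsourced VCA ratings into one score per resident:
# ratings are averaged across items, raters, and the 2 cases at a time point.
# Layout and numbers are illustrative, not the app's data model.

from statistics import mean

def resident_score(cases):
    """cases: list of cases; each case is a list of raters; each rater is a
    list of six 1-5 item ratings. Returns the overall mean rating."""
    return mean(
        item
        for case in cases
        for rater in case
        for item in rater
    )

# Hypothetical resident at one time point: 2 cases x 2 raters x 6 items.
time1 = [
    [[3, 4, 3, 3, 4, 3], [4, 4, 3, 4, 4, 4]],
    [[3, 3, 3, 4, 3, 3], [4, 3, 4, 4, 3, 4]],
]
score = resident_score(time1)
```

Averaging over all items, raters, and cases yields the single pre- and post-feedback score per resident that the study's 2×3 split-plot ANOVA compares.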

3.
JMIR Med Educ ; 8(2): e30988, 2022 Apr 29.
Article in English | MEDLINE | ID: mdl-35486423

ABSTRACT

BACKGROUND: Residents may benefit from simulated practice with personalized feedback to prepare for high-stakes disclosure conversations with patients after harmful errors and to meet Accreditation Council for Graduate Medical Education mandates. Ideally, feedback would come from patients who have experienced communication after medical harm, but medical researchers and leaders have found it difficult to reach this community, which has made this approach impractical at scale. The Video-Based Communication Assessment app is designed to engage crowdsourced laypeople to rate physician communication skills but has not been evaluated for use with medical harm scenarios. OBJECTIVE: We aimed to compare the reliability of 2 assessment groups (crowdsourced laypeople and patient advocates) in rating physician error disclosure communication skills using the Video-Based Communication Assessment app. METHODS: Internal medicine residents used the Video-Based Communication Assessment app; the case, which consisted of 3 sequential vignettes, depicted a delayed diagnosis of breast cancer. Panels of patient advocates who had experienced harmful medical error, either personally or through a family member, and crowdsourced laypeople used a 5-point scale to rate the residents' error disclosure communication skills (6 items) based on audio-recorded responses. Ratings were aggregated across items and vignettes to create a numerical communication score for each physician. We used analysis of variance to compare stringency and the Pearson correlation between patient advocate and layperson ratings to identify whether rank order would be preserved between groups. We used generalizability theory to examine the difference in assessment reliability between patient advocates and laypeople. RESULTS: Internal medicine residents (n=20) used the Video-Based Communication Assessment app. All patient advocates (n=8) and 42 of 59 crowdsourced laypeople who had been recruited provided complete, high-quality ratings.
Patient advocates rated communication more stringently than crowdsourced laypeople (patient advocates: mean 3.19, SD 0.55; laypeople: mean 3.55, SD 0.40; P<.001), but patient advocates' and crowdsourced laypeople's ratings of physicians were highly correlated (r=0.82, P<.001). Reliability for 8 raters and 6 vignettes was acceptable (patient advocates: G coefficient 0.82; crowdsourced laypeople: G coefficient 0.65). Decision studies estimated that 12 crowdsourced layperson raters and 9 vignettes would yield an acceptable G coefficient of 0.75. CONCLUSIONS: Crowdsourced laypeople may represent a sustainable source of reliable assessments of physician error disclosure skills. For a simulated case involving delayed diagnosis of breast cancer, laypeople correctly identified high and low performers. However, at least 12 raters and 9 vignettes are required to ensure adequate reliability, and future studies are warranted. Crowdsourced laypeople rate less stringently than raters who have experienced harm. Future research should examine the value of the Video-Based Communication Assessment app for formative assessment, summative assessment, and just-in-time coaching of error disclosure communication skills.
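The G coefficients and decision (D) study this abstract reports come from generalizability theory. As a simplified one-facet illustration (the study's actual design also crossed vignettes with raters), the relative G coefficient for a persons × raters design is var_persons / (var_persons + var_residual / n_raters), and a D study re-evaluates it at hypothetical rater counts. The variance components below are assumed values chosen for illustration, not the study's estimates.

```python
# Sketch of a one-facet generalizability (G) coefficient and decision study.
# G rises as rater error is averaged over more raters per response.
# Variance components are illustrative assumptions, not the study's estimates.

def g_coefficient(var_persons, var_residual, n_raters):
    """Relative G coefficient for a persons x raters crossed design:
    true-score variance over true-score plus averaged rater error."""
    return var_persons / (var_persons + var_residual / n_raters)

# Hypothetical variance components for layperson ratings.
var_p, var_res = 0.20, 0.80

g8 = g_coefficient(var_p, var_res, 8)    # reliability with a panel of 8 raters
g12 = g_coefficient(var_p, var_res, 12)  # D study: what if we used 12 raters?
```

With these assumed components, moving from 8 to 12 raters lifts the coefficient to 0.75, mirroring how the study's D study arrived at its recommended panel size of 12 raters and 9 vignettes.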
