Results 1 - 6 of 6
1.
BMC Med Educ ; 22(1): 899, 2022 Dec 28.
Article in English | MEDLINE | ID: mdl-36578064

ABSTRACT

BACKGROUND: Physician-delivered weight management counseling (WMC) occurs infrequently, and physicians report a lack of training and poor self-efficacy. The purpose of this study was to develop and test the Video-based Communication Assessment (VCA) for WMC training in medical residents. METHODS: This study was a mixed-methods pilot conducted in 3 phases. First, we created five vignettes based on our prior data and expert feedback, then administered the vignettes via the VCA to Internal Medicine categorical residents (n = 16) from a University Medical School. Analog patients rated responses and also provided comments. We created individualized feedback reports, which residents were able to view on the VCA. Lastly, we conducted debriefing interviews with the residents (n = 11) to obtain their feedback on the vignettes and the personalized feedback reports. Interviews were transcribed, and we used thematic analysis to generate and apply codes, followed by identifying themes. RESULTS: Descriptive statistics were calculated and learning points were created for the individualized feedback reports. In VCA debriefing interviews with residents, five themes emerged: 1) Overall, the VCA was easy to use, helpful, and more engaging than traditional learning and assessment modes; 2) Patient scenarios were similar to those encountered in the clinic, including diversity, health literacy, and different stages of change; 3) The knowledge, skills, and reminders from the VCA can be transferred to practice; 4) Feedback reports were helpful, to the point, and informative, including the exemplar response showing how best to respond to the scenario; and 5) The VCA provides alternative practice scenarios when real-life patient situations are not accessible. CONCLUSIONS: We demonstrated the feasibility and acceptability of the VCA, a technology-delivered platform, for delivering WMC training to residents. The VCA exposed residents to diverse patient experiences and provided potential opportunities to tailor providers' responses to sociological and cultural factors in WMC scenarios. Future work will examine the effect of the VCA on WMC in actual clinical practice.
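As a rough sketch of how the individualized feedback reports described above could be assembled, the snippet below aggregates analog-patient ratings and comments per vignette. The record structure, the 5-point scale, and the field names are assumptions made for illustration only, not the study's actual data pipeline.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rating records: (resident_id, vignette_id, rating_1_to_5, comment)
ratings = [
    ("res01", "v1", 4, "Warm opening, asked permission to discuss weight."),
    ("res01", "v1", 3, "Could have explored the patient's own goals more."),
    ("res01", "v2", 5, "Clear, non-judgmental advice."),
    ("res02", "v1", 2, "Jumped to advice before listening."),
]

def build_feedback_report(resident_id, ratings):
    """Aggregate analog-patient ratings into a per-vignette feedback summary."""
    by_vignette = defaultdict(list)
    for rid, vignette, score, comment in ratings:
        if rid == resident_id:
            by_vignette[vignette].append((score, comment))
    report = {}
    for vignette, entries in by_vignette.items():
        scores = [s for s, _ in entries]
        report[vignette] = {
            "mean_rating": round(mean(scores), 2),
            "n_raters": len(scores),
            "comments": [c for _, c in entries],
        }
    return report

print(build_feedback_report("res01", ratings))
```

In practice, a report of this kind would also attach the vignette's learning points and a highly rated exemplar response, which the residents singled out as helpful.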


Subject(s)
Internship and Residency, Humans, Clinical Competence, Communication, Counseling, Learning
2.
JMIR Med Educ ; 8(4): e40758, 2022 Oct 03.
Article in English | MEDLINE | ID: mdl-36190751

ABSTRACT

BACKGROUND: US residents require practice and feedback to meet Accreditation Council for Graduate Medical Education mandates and patient expectations for effective communication after harmful errors. Current instructional approaches rely heavily on lectures, rarely provide individualized feedback to residents about communication skills, and may not assure that residents acquire the skills desired by patients. The Video-based Communication Assessment (VCA) app is a novel tool for simulating communication scenarios for practice and obtaining crowdsourced assessments and feedback on physicians' communication skills. We previously established that crowdsourced laypeople can reliably assess residents' error disclosure skills with the VCA app. However, its efficacy for error disclosure training has not been tested. OBJECTIVE: We aimed to evaluate the efficacy of using VCA practice and feedback as a stand-alone intervention for the development of residents' error disclosure skills. METHODS: We conducted a pre-post study in 2020 with pathology, obstetrics and gynecology, and internal medicine residents at an academic medical center in the United States. At baseline, residents each completed 2 specialty-specific VCA cases depicting medical errors. Audio responses were rated by at least 8 crowdsourced laypeople using 6 items on a 5-point scale. At 4 weeks, residents received numerical and written feedback derived from layperson ratings and then completed 2 additional cases. Residents were randomly assigned cases at baseline and after feedback assessments to avoid ordinal effects. Ratings were aggregated to create overall assessment scores for each resident at baseline and after feedback. Residents completed a survey of demographic characteristics. We used a 2×3 split-plot ANOVA to test the effects of time (pre-post) and specialty on communication ratings. RESULTS: In total, 48 residents completed 2 cases at time 1, received a feedback report at 4 weeks, and completed 2 more cases. The mean ratings of residents' communication were higher at time 2 versus time 1 (3.75 vs 3.53; P<.001). Residents with prior error disclosure experience performed better at time 1 compared to those without such experience (ratings: mean 3.63 vs mean 3.46; P=.02). No differences in communication ratings based on specialty or years in training were detected. Residents' communication was rated higher for angry cases versus sad cases (mean 3.69 vs mean 3.58; P=.01). Less than half of all residents (27/62, 44%) reported prior experience with disclosing medical harm to patients; experience differed significantly among specialties (P<.001) and was lowest for pathology (1/17, 6%). CONCLUSIONS: Residents at all training levels can potentially improve error disclosure skills with VCA practice and feedback. Error disclosure curricula should prepare residents for responding to various patient affects. Simulated error disclosure may particularly benefit trainees in diagnostic specialties, such as pathology, with infrequent real-life error disclosure practice opportunities. Future research should examine the effectiveness, feasibility, and acceptability of VCA within a longitudinal error disclosure curriculum.
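To make the pre-post analysis above concrete, here is a minimal sketch that aggregates crowdsourced layperson ratings into one score per resident per time point and then compares time 1 with time 2. A paired t-test stands in for the 2×3 split-plot ANOVA used in the study, and the data and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical long-format ratings: one row per layperson rating of one response.
df = pd.DataFrame({
    "resident": ["r1"] * 4 + ["r2"] * 4,
    "time":     [1, 1, 2, 2] * 2,
    "rating":   [3.0, 3.5, 3.8, 4.0, 3.2, 3.4, 3.6, 3.9],  # 5-point scale
})

# Aggregate the layperson ratings into one score per resident per time point.
scores = df.groupby(["resident", "time"])["rating"].mean().unstack("time")

# Simplified stand-in for the 2x3 split-plot ANOVA: paired comparison of
# each resident's time-1 vs time-2 aggregate score.
t, p = stats.ttest_rel(scores[1], scores[2])
print(scores)
print(f"t = {t:.2f}, p = {p:.3f}")
```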

3.
JMIR Med Educ ; 8(2): e30988, 2022 Apr 29.
Article in English | MEDLINE | ID: mdl-35486423

ABSTRACT

BACKGROUND: Residents may benefit from simulated practice with personalized feedback to prepare for high-stakes disclosure conversations with patients after harmful errors and to meet Accreditation Council for Graduate Medical Education mandates. Ideally, feedback would come from patients who have experienced communication after medical harm, but medical researchers and leaders have found it difficult to reach this community, which has made this approach impractical at scale. The Video-Based Communication Assessment app is designed to engage crowdsourced laypeople to rate physician communication skills but has not been evaluated for use with medical harm scenarios. OBJECTIVE: We aimed to compare the reliability of 2 assessment groups (crowdsourced laypeople and patient advocates) in rating physician error disclosure communication skills using the Video-Based Communication Assessment app. METHODS: Internal medicine residents used the Video-Based Communication Assessment app; the case, which consisted of 3 sequential vignettes, depicted a delayed diagnosis of breast cancer. Panels of patient advocates who had experienced a harmful medical error, either personally or through a family member, and crowdsourced laypeople used a 5-point scale to rate the residents' error disclosure communication skills (6 items) based on audio-recorded responses. Ratings were aggregated across items and vignettes to create a numerical communication score for each physician. We used analysis of variance to compare rating stringency and Pearson correlation to determine whether the rank order of physicians was preserved between patient advocates and laypeople. We used generalizability theory to examine the difference in assessment reliability between patient advocates and laypeople. RESULTS: Internal medicine residents (n=20) used the Video-Based Communication Assessment app. All patient advocates (n=8) and 42 of 59 crowdsourced laypeople who had been recruited provided complete, high-quality ratings. Patient advocates rated communication more stringently than crowdsourced laypeople (patient advocates: mean 3.19, SD 0.55; laypeople: mean 3.55, SD 0.40; P<.001), but patient advocates' and crowdsourced laypeople's ratings of physicians were highly correlated (r=0.82, P<.001). Reliability for 8 raters and 6 vignettes was acceptable (patient advocates: G coefficient 0.82; crowdsourced laypeople: G coefficient 0.65). Decision studies estimated that 12 crowdsourced layperson raters and 9 vignettes would yield an acceptable G coefficient of 0.75. CONCLUSIONS: Crowdsourced laypeople may represent a sustainable source of reliable assessments of physician error disclosure skills. For a simulated case involving delayed diagnosis of breast cancer, laypeople correctly identified high and low performers. However, at least 12 raters and 9 vignettes are required to ensure adequate reliability, and future studies are warranted. Crowdsourced laypeople rate less stringently than raters who have experienced harm. Future research should examine the value of the Video-Based Communication Assessment app for formative assessment, summative assessment, and just-in-time coaching of error disclosure communication skills.
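The generalizability and decision-study figures above can be illustrated with the standard formula for a fully crossed person × rater × vignette random design, where the projected G coefficient is the physician variance divided by the physician variance plus error variance shrunk by the numbers of raters and vignettes. The variance components below are made up for illustration; the abstract does not report them.

```python
def g_coefficient(var_p, var_pr, var_pv, var_res, n_raters, n_vignettes):
    """Relative G coefficient for a fully crossed p x r x v random design.

    var_p   : variance due to physicians (the object of measurement)
    var_pr  : physician-by-rater interaction variance
    var_pv  : physician-by-vignette interaction variance
    var_res : residual (p x r x v, error) variance
    """
    error = (var_pr / n_raters
             + var_pv / n_vignettes
             + var_res / (n_raters * n_vignettes))
    return var_p / (var_p + error)

# Hypothetical variance components, chosen only for illustration.
components = dict(var_p=0.20, var_pr=0.15, var_pv=0.30, var_res=0.60)

for n_r, n_v in [(8, 6), (12, 9)]:
    g = g_coefficient(**components, n_raters=n_r, n_vignettes=n_v)
    print(f"{n_r} raters x {n_v} vignettes -> G = {g:.2f}")
```

With any fixed set of components, the same function shows the decision-study logic: adding raters and vignettes shrinks the error term and raises the projected G coefficient.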

4.
Patient Educ Couns ; 104(9): 2297-2303, 2021 09.
Article in English | MEDLINE | ID: mdl-33715944

ABSTRACT

OBJECTIVE: Effective physician-patient communication is important, but physicians who are seeking to improve have few opportunities to practice or to receive actionable feedback. The Video-based Communication Assessment (VCA) provides both. Using the VCA, physicians respond to communication dilemmas depicted in brief video vignettes; crowdsourced analog patients rate responses and offer comments. We characterized analog patients' comments and generated actionable recommendations for improving communication. METHODS: Physicians and residents completed the VCA; analog patients rated responses and answered: "What would you want the provider to say in this situation?" We used qualitative analysis to identify themes. RESULTS: Forty-three participants completed the VCA; 556 analog patients provided 1035 comments. We identified overarching themes (e.g., caring, empathy, respect) and generated actionable recommendations, incorporating analog patient quotes. CONCLUSION: While analog patients' comments could be provided directly to users, conducting a thematic analysis and developing recommendations for physician-patient communication reduced the burden on users and allowed for focused feedback. Research is needed into physicians' reactions to the recommendations and the impact on communication. PRACTICE IMPLICATIONS: Physicians seeking to improve communication skills may benefit from practice and feedback. The VCA was designed to provide both, incorporating the patient voice on how best to communicate in clinical situations.


Subject(s)
Crowdsourcing, Physicians, Communication, Feedback, Humans, Physician-Patient Relations
5.
JMIR Med Educ ; 5(1): e10400, 2019 Feb 14.
Article in English | MEDLINE | ID: mdl-30710460

ABSTRACT

Good clinician-patient communication is essential to provide quality health care and is key to patient-centered care. However, individuals and organizations seeking to improve in this area face significant challenges. A major barrier is the absence of an efficient system for assessing clinicians' communication skills and providing meaningful, individual-level feedback. The purpose of this paper is to describe the design and creation of the Video-Based Communication Assessment (VCA), an innovative, flexible system for assessing and ultimately enhancing clinicians' communication skills. We began by developing the VCA concept. Specifically, we determined that it should be convenient and efficient, accessible via computer, tablet, or smartphone; be case based, using video patient vignettes to which users respond as if speaking to the patient in the vignette; be flexible, allowing content to be tailored to the purpose of the assessment; allow incorporation of the patient's voice by crowdsourcing ratings from analog patients; provide robust feedback including ratings, links to highly rated responses as examples, and learning points; and ultimately, have strong psychometric properties. We collected feedback on the concept and then proceeded to create the system. We identified several important research questions, which will be answered in subsequent studies. The VCA is a flexible, innovative system for assessing clinician-patient communication. It enables efficient sampling of clinicians' communication skills, supports crowdsourced ratings of these spoken samples using analog patients, and offers multifaceted feedback reports.
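One way to picture the assessment-and-feedback flow described above is as a small data model linking vignettes, spoken responses, crowdsourced analog-patient ratings, and a feedback report. The classes and field names below are hypothetical, intended only to make the described workflow concrete; they are not the VCA's actual implementation.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Vignette:
    vignette_id: str
    video_url: str          # short video patient vignette shown to the clinician
    learning_points: list[str] = field(default_factory=list)

@dataclass
class Rating:
    rater_id: str           # crowdsourced analog patient
    score: int              # e.g., 1-5 overall communication rating
    comment: str = ""

@dataclass
class Response:
    clinician_id: str
    vignette: Vignette
    audio_url: str          # clinician speaks as if to the patient in the vignette
    ratings: list[Rating] = field(default_factory=list)

    def feedback_report(self) -> dict:
        """Summarize crowdsourced ratings into a multifaceted feedback report."""
        scores = [r.score for r in self.ratings]
        return {
            "vignette": self.vignette.vignette_id,
            "mean_rating": round(mean(scores), 2) if scores else None,
            "comments": [r.comment for r in self.ratings if r.comment],
            "learning_points": self.vignette.learning_points,
        }
```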

6.
Acad Med ; 88(11): 1670-5, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24072122

ABSTRACT

The National Board of Medical Examiners (NBME) reviewed all components of the United States Medical Licensing Examination as part of a strategic planning activity. One recommendation generated from the review called for enhancements of the communication skills component of the Step 2 Clinical Skills (Step 2 CS) examination. To address this recommendation, the NBME created a multidisciplinary team that comprised experts in communication content, communication measurement, and implementation of standardized patient (SP)-based examinations. From 2007 through 2012, the team reviewed literature in physician-patient communication, examined performance characteristics of the Step 2 CS exam, observed case development and quality assurance processes, interviewed SPs and their trainers, and reviewed video recordings of examinee-SP interactions. The authors describe perspectives gained by their team from the review process and outline the resulting enhancements to the Step 2 CS exam, some of which were rolled out in June 2012.


Subject(s)
Communication, Educational Measurement/standards, Medical Licensure, Physician-Patient Relations, Humans, Patient Simulation, United States