2.
BMJ Simul Technol Enhanc Learn ; 7(5): 360-365, 2021.
Article in English | MEDLINE | ID: mdl-35515739

ABSTRACT

Background: Evidence for the conventional wisdom that debriefing quality determines the effectiveness of learning in simulation-based training is lacking. We investigated whether the quality of debriefing in simulation-based team training correlated with the degree of learning of participants.
Methods: Forty-two teams of medical and undergraduate nursing students participated in simulation-based training sessions using a two-scenario format with after-action debriefing. Observers rated team performance with the 11-item Teamwork Assessment Scales (TAS) instrument (three subscales: team-based behaviours (5 items), shared mental model (3 items), adaptive communication and response (3 items)). Two independent, blinded raters evaluated video-recorded facilitator team prebriefs and debriefs using the 8-item Objective Structured Assessment of Debriefing (OSAD) tool. Descriptive statistics were calculated, t-test comparisons made, and multiple linear regression and univariate analyses used to compare OSAD item scores with changes in TAS scores.
Results: Statistically significant improvements in all three TAS subscales occurred from scenario 1 to scenario 2. Seven faculty teams taught the learners; all but two of their scores were ≥3.0 for prebriefs and all but one were ≥3.5 for debriefs (OSAD rating: 1 = done poorly to 5 = done well). Linear regression analysis revealed a single statistically significant correlation, between debrief engagement and the adaptive communication and response score, which was not significant on univariate analysis.
Conclusions: Quality of debriefing does not seem to increase the degree of learning in interprofessional education using simulation-based training of prelicensure student teams. This finding may be due to the relatively high quality of the prebriefs and debriefs of the faculty teams involved in the training.
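To make the analysis described above concrete, the sketch below pairs a paired t-test on scenario 1 versus scenario 2 TAS subscale scores with a simple linear regression of TAS change on an OSAD debrief item score. This is a minimal illustration only: the data, variable names, and effect sizes are simulated placeholders, not the study's data or code.

```python
# Hypothetical sketch (not the study's code): paired comparison of TAS
# subscale scores between scenario 1 and scenario 2, plus a linear
# regression of TAS change on a simulated OSAD "engagement" item score.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_teams = 42  # number of student teams reported in the abstract

# Simulated TAS "team-based behaviours" subscale scores (1-5) per team.
tas_scenario1 = rng.normal(3.2, 0.5, n_teams).clip(1, 5)
tas_scenario2 = rng.normal(3.7, 0.5, n_teams).clip(1, 5)

# Paired t-test: did subscale scores improve from scenario 1 to 2?
t_stat, p_value = stats.ttest_rel(tas_scenario2, tas_scenario1)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

# Regression of TAS change on a hypothetical OSAD debrief item score.
tas_change = tas_scenario2 - tas_scenario1
osad_engagement = rng.normal(4.0, 0.4, n_teams).clip(1, 5)
model = sm.OLS(tas_change, sm.add_constant(osad_engagement)).fit()
print(model.summary())
```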

3.
J Am Med Inform Assoc ; 25(10): 1284-1291, 2018 10 01.
Article in English | MEDLINE | ID: mdl-30299477

ABSTRACT

Objective: The Objective Structured Assessment of Debriefing (OSAD) is an evidence-based, 8-item tool that uses a behaviorally anchored rating scale in paper-based form to evaluate the quality of debriefing in medical education. The objective of this project was twofold: (1) to create an easy-to-use electronic format of the OSAD (eOSAD) in order to streamline data entry; and (2) to pilot its use on videoed debriefings.
Materials and Methods: The eOSAD was developed in collaboration with the LSU Health New Orleans Epidemiology Data Center using SurveyGizmo (Widgix Software, LLC, Boulder, CO, USA) software. It was then piloted by 2 trained evaluators who rated 37 videos of faculty teams conducting prebriefing and debriefing after a high-fidelity trauma simulation. Inter-rater reliability was assessed, and the evaluators' qualitative feedback was obtained.
Results: Inter-rater reliability was good [prebrief: intraclass correlation coefficient (ICC) = 0.955 (95% CI, 0.912-0.977), P < .001; debrief: ICC = 0.853 (95% CI, 0.713-0.924), P < .001]. Qualitative feedback indicated that the eOSAD was easy to complete, simple to read and to add comments to, and that it reliably stored data that were readily retrievable, enabling smooth dissemination of the information collected.
Discussion: The eOSAD features a secure login, a shareable internet access link for distant evaluators, and immediate export of data into a secure database for future analysis. It was convenient for end users, produced reliable assessments among independent evaluators, and eliminated multiple sources of possible data corruption.
Conclusion: The eOSAD tool format advances the post-debriefing evaluation of videoed interprofessional team training in high-fidelity simulation.


Subject(s)
Education, Medical , Feedback , High Fidelity Simulation Training , Video Recording , Clinical Competence , Educational Measurement/methods , Humans , User-Computer Interface
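The reliability figures reported in the eOSAD abstract above (ICC with 95% CI for prebrief and debrief ratings) correspond to a standard two-rater intraclass correlation analysis. The sketch below shows one way such an analysis might be run, assuming the pingouin Python package and simulated placeholder scores; it is not the authors' code, and the column names and data are invented for illustration.

```python
# Hypothetical sketch (not the study's code): two-rater inter-rater
# reliability for OSAD ratings of recorded debriefs, expressed as an
# intraclass correlation coefficient (ICC) with a 95% CI.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
n_videos = 37  # number of recorded sessions in the pilot

# Simulated total OSAD scores (8 items, 1-5 each) from two raters,
# each rater scoring every video.
true_quality = rng.normal(30, 4, n_videos)
scores = pd.DataFrame({
    "video": np.tile(np.arange(n_videos), 2),
    "rater": np.repeat(["rater_1", "rater_2"], n_videos),
    "score": np.concatenate([
        true_quality + rng.normal(0, 1.5, n_videos),
        true_quality + rng.normal(0, 1.5, n_videos),
    ]),
})

# ICC table: the two-way random-effects forms (ICC2/ICC2k) are the
# usual choice when the same raters score every video.
icc = pg.intraclass_corr(data=scores, targets="video",
                         raters="rater", ratings="score")
print(icc[["Type", "ICC", "CI95%", "pval"]])
```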