Results 1 - 7 of 7
1.
J Clin Transl Sci ; 8(1): e24, 2024.
Article in English | MEDLINE | ID: mdl-38384910

ABSTRACT

The University of Michigan created the Practice-Oriented Research Training (PORT) program and implemented it between 2008 and 2018. The PORT program provided research training and funding opportunities for allied healthcare professionals. The program consisted of weekly didactics and group discussions on topics relevant to developing specific research ideas into projects, plus funding for a mentored research project for those who submitted a competitive grant application. The goal of this evaluation was to assess the long-term impact of the PORT program on the research careers of the participants. Ninety-two participants (74 staff and 18 faculty) participated in both phases of the program. A mixed-methods approach to evaluation was used: 25 participants who received funding for their research completed surveys, and semi-structured interviews were conducted with eight program participants. In addition, data were collected on participants' publication history. Fifteen of the 74 staff participants published 31 first-authored papers after participating in PORT. Twelve of the 15 staff participants who published first-authored papers did so for the first time after participating in the PORT program. Results of the quantitative and qualitative analyses suggest that the PORT program had positive impacts on both the participants and the research community.

2.
J Clin Transl Sci ; 7(1): e195, 2023.
Article in English | MEDLINE | ID: mdl-37771414

ABSTRACT

Introduction: Community health workers and promotoras (CHW/Ps) have a fundamental role in facilitating research with communities. However, no national standard training exists as part of the CHW/P job role. We developed and evaluated a culturally and linguistically tailored online research best practices course for CHW/Ps to fill this gap. Methods: After the research best practices course was developed, we advertised the opportunity to CHW/Ps nationwide to complete the training online in English or Spanish. Following course completion, CHW/Ps received an online survey to rate their skills in community-engaged research and their perceptions of the course using Likert scales of agreement. A qualitative content analysis was conducted on open-ended response data. Results: A total of 104 CHW/Ps completed the English or Spanish course (n = 52 for each language; mean age 42 ± 12 years); 88% of individuals identified as female and 56% identified as Hispanic, Latino, or Spaniard. Between 96% and 100% of respondents reported improvement in various skills. Nearly all CHW/Ps (97%) agreed the course was relevant to their work, and 96% felt the training was useful. Qualitative themes related to working more effectively as a result of the training included enhanced skills, increased resources, and building bridges between communities and researchers. Discussion: The CHW/P research best practices course was rated as useful and relevant by CHW/Ps, particularly for communicating about research with community members. This course can serve as a professional development resource for CHW/Ps and could form the foundation for a national standardized training on their role in research best practices.

3.
Eval Health Prof ; 44(3): 268-278, 2021 09.
Article in English | MEDLINE | ID: mdl-31867997

ABSTRACT

Although there is extensive research literature on clinical skill competencies and the use of competency-based frameworks for clinical research, the appropriate methods to assess these competencies are not as well understood. Our goal in this systematic literature review is to identify, compare, and critique assessments of clinical research competencies. Articles were included in this review if they examined clinical investigators or clinical investigators in training, focused on research-based skills, and included some form of assessment of research-based competencies. A total of 76 articles were identified in the initial search; 16 met the criteria for inclusion. Two types of assessments of clinical research competence were identified: subjective self-assessments (n = 13) and objective tests (n = 6). These assessments covered a wide range of competencies, but no competency domains were common to all. Most assessments had limited validation. Training was consistently associated with self-assessed competence but had little relationship to objective measures of competence. In contrast, experience was consistently associated with objectively assessed competence but not with self-assessed competence. These findings have important implications for those interested in assessing medical education programs. We describe a recommended validity standard for assessments used for summative program assessment.


Subject(s)
Clinical Competence , Research Personnel , Humans , Self-Assessment
4.
MedEdPublish (2016) ; 10: 143, 2021.
Article in English | MEDLINE | ID: mdl-38486548

ABSTRACT

Introduction: The Research Objective Structured Clinical Exam (R-OSCE) described in this paper was designed as part of a comprehensive program to assess competency in specific domains of clinical and translational research (CTR) for students enrolled in a 12-week introductory summer research program. Methods: The program curriculum was mapped to core competencies developed by the National Center for Advancing Translational Sciences (NCATS) and used to develop R-OSCE stations. Twelve stations were developed, with five administered during orientation as a practice test and seven administered post-program. A scoring rubric using an anchored scale of 1-5 was developed, and six qualified raters were trained in its use. The exam was self-paced and delivered through a secure online computer-based platform. Results: Forty-seven students (three cohorts) completed the post-program R-OSCE. Most respondents scored at 3 (developing competence) or higher on most stations for both the practice and post-program exams, the exceptions being the stations involving writing research questions and engaging communities in research. Students indicated they liked demonstrating CTR skills through the R-OSCE. Most participants agreed that the exam tasks were related to stated program competencies and that the stations were realistic. Discussion: The R-OSCE is best used as part of a comprehensive assessment program and may be useful in providing formative feedback that trainees can share with their mentors. Additionally, this study demonstrated that it could feasibly be used to evaluate the effectiveness of research education programs. However, additional time was needed to train raters and score the R-OSCE. Modifications were made to administer the exam through an online format on a modest budget. The computer-based format provides a solution to the current need for assessments that can be administered remotely.

5.
J Clin Transl Sci ; 4(6): 480-484, 2020 Jul 06.
Article in English | MEDLINE | ID: mdl-33948223

ABSTRACT

Although several initiatives have produced core competency domains for training the translational science workforce, training resources to help clinical research professionals advance these skills reside primarily within local departments or institutions. The Development, Implementation, and AssessMent of Novel Training in Domain (DIAMOND) project was designed to make this training more readily and publicly available. DIAMOND includes a digital portal to catalog publicly available educational resources and an ePortfolio to document professional development. DIAMOND is a nationally crowdsourced, federated, online catalog providing a platform for practitioners to find and share training and assessment materials. Contributors can share their own educational materials using a simple intake form that creates an electronic record; the portal enables users to browse or search this catalog of digital records and access the resources. Since September 2018, the portal has been visited more than 5,700 times and has received over 280 contributions from professionals. The portal facilitates opportunities to connect and collaborate regarding future applications of these resources. Consequently, growing the collection and increasing the numbers of both contributors and users remains a priority. Results from a small subset of users indicated that over half accomplished their purpose for visiting the site, while qualitative results showed that users identified several benefits and helpful features of the ePortfolio.

6.
J Clin Transl Sci ; 3(2-3): 75-81, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31552144

ABSTRACT

INTRODUCTION: There is a clear need to educate and train the clinical research workforce to conduct scientifically sound clinical research. Meeting this need requires tools to assess an individual's preparedness to function efficiently in the clinical research enterprise, as well as tools to evaluate the quality and effectiveness of programs designed to educate and train clinical research professionals. Here we report the development and validation of a competency self-assessment entitled the Competency Index for Clinical Research Professionals, version II (CICRP-II). METHODS: CICRP-II was developed using data collected from clinical research coordinators (CRCs) participating in the "Development, Implementation and Assessment of Novel Training In Domain-Based Competencies" project at four Clinical and Translational Science Award (CTSA) hubs and partnering institutions. RESULTS: An exploratory factor analysis identified a two-factor structure: the first factor measures self-reported competence to perform routine clinical research functions, while the second factor measures competence to perform advanced clinical functions. We demonstrate between-groups validity by comparing CRCs working in different research settings. DISCUSSION: The excellent psychometric properties of CICRP-II, its ability to distinguish between experienced CRCs at research-intensive CTSA hubs and CRCs working in less intensive community-based sites, and the simplicity of alternative methods for scoring respondents make it a valuable tool both for gauging an individual's perceived preparedness for the CRC role and for evaluating the value and effectiveness of clinical research education and training programs.

7.
J Clin Transl Sci ; 2(2): 95-102, 2018 Apr.
Article in English | MEDLINE | ID: mdl-31660222

ABSTRACT

INTRODUCTION: The Best Practices in Social and Behavioral Research Course was developed to provide instruction on good clinical practice for social and behavioral trials. This study evaluated the new course. METHODS: Participants across 4 universities took the course (n = 294) and were sent surveys following course completion and 2 months later. Outcomes included perceived relevance, how engaging the course was, and whether participants worked differently because of the course. Open-ended questions were posed to understand how work was impacted. RESULTS: Participants rated the course as relevant and engaging (6.4 and 5.8 of 7 points, respectively) and reported working differently (4.7 of 7 points). Participants with less experience in social and behavioral trials were most likely to report working differently 2 months later. DISCUSSION: The course was perceived as relevant and engaging. Participants described actions taken to improve rigor in implementing trials. Future studies with a larger sample and additional participating sites are recommended.
