ABSTRACT
In an ongoing effort to incorporate active learning and promote higher order learning outcomes in undergraduate organic chemistry, a hybrid ("flipped") classroom structure has been used to facilitate a series of collaborative activities in the first two courses of the lower division organic chemistry sequence. An observational study of seven classes over a five-year period reveals a strong correlation between performance on the in-class activities and performance on the final exam across all classes; however, a significant number of students in these courses continue to struggle on both the in-class activities and the final exam. The Activity Engagement Survey (AcES) was administered in the most recent course offering included in this study, and these preliminary data suggest that students who achieved lower scores on the in-class activities had lower levels of emotional and behavioral/cognitive engagement and were less likely to work in collaborative groups. Taken together, these findings suggest that if students can be guided to engage more successfully with the in-class activities, they are likely to be more successful in carrying out the higher order learning required on the final exam. In addition to the analyses of student performance and engagement in the in-class activities, the implementation of the flipped classroom structure is described herein, along with suggestions for how student engagement in higher order learning might be improved in future iterations of the class.
ABSTRACT
Measuring students' perceptions of active learning activities may provide valuable insight into their engagement and subsequent performance outcomes. A recently published measure, the Assessing Student Perspective of Engagement in Class Tool (ASPECT), was developed to assess student perceptions of various active learning environments. As such, we sought to use this measure in our courses to assess students' perceptions of different active learning environments. Initial results analyzed with confirmatory factor analysis (CFA) indicated that the ASPECT did not function as expected in our active learning environments. Therefore, before administration within an introductory biology course that incorporated two types of active learning strategies, additional items were created and the wording of some original items was modified to better align with the structure of each strategy, thereby producing two modified ASPECT (mASPECT) versions. Evidence of response process validity of the data collected was gathered using cognitive interviews with students, while internal structure validity evidence was assessed through exploratory factor analysis (EFA). When data were collected after a "deliberative democracy" (DD) activity, 17 items were found to contribute to 3 factors related to 'personal effort', 'value of the environment', and 'instructor contribution'. However, data collected after a "clicker" day resulted in 21 items that contributed to 4 factors: 3 were similar to those from the DD activity, and a fourth was related to 'social influence'. Overall, these results suggest that the same measure may not function identically when used within different types of active learning environments, even with the same population, and highlight the need to collect validity evidence when adopting and/or adapting measures.