1.
CBE Life Sci Educ ; 23(1): ar3, 2024 03.
Article in English | MEDLINE | ID: mdl-38100316

ABSTRACT

Students struggle to regulate their learning during independent study sessions. In this study, we ask whether an online behavioral intervention helped introductory students decrease distraction while studying. The intervention consisted of exam 1 reflection, exam 2 planning, and exam 2 reflection exercises. During planning, students formed a goal, mentally contrasted (MC) a positive outcome of their goal with their present reality, identified an obstacle, and formed an implementation intention (II) to overcome that obstacle. During reflection, students self-reported their distraction while studying. Distraction was the most frequently reported study obstacle, and decreasing distraction was the second most frequently reported study goal. While students who aimed to decrease distraction as a goal did not follow through, students who planned for distraction obstacles did follow through on decreasing distraction levels. Only about half of students generated an II that aligned with their study goal, which may provide one reason for the opposing follow-through of distraction framed as a goal versus as an obstacle. Lastly, we examined the specificity of students' IIs and found no relationship with follow-through. Overall, MC with II holds promise as a self-regulatory technique to help introductory biology students change their behaviors while studying.


Subject(s)
Learning , Students , Humans , Educational Measurement/methods , Biology/education
2.
CBE Life Sci Educ ; 21(4): ar65, 2022 12.
Article in English | MEDLINE | ID: mdl-36112624

ABSTRACT

Previous studies have found that students' concept-building approaches, identified a priori with a cognitive psychology laboratory task, are associated with student exam performances in chemistry classes. Abstraction learners (those who extract the principles underlying related examples) performed better than exemplar learners (those who focus on memorizing the training exemplars and responses) on transfer exam questions but not retention questions, after accounting for general ability. We extended these findings to introductory biology courses in which active-learning techniques were used to try to foster deep conceptual learning. Exams were constructed to contain both transfer and retention questions. Abstraction learners demonstrated better performance than exemplar learners on the transfer questions but not on the retention questions. These results were not moderated by indices of crystallized or fluid intelligence. Our central interpretation is that students identified as abstraction learners appear to construct a deep understanding of the concepts (presumably based on abstract underpinnings), thereby enabling them to apply and generalize the concepts to scenarios and instantiations not seen during instruction (transfer questions). By contrast, other students appear to base their representations on memorized instructed examples, leading to good performance on retention questions but not transfer questions.


Subject(s)
Educational Measurement , Students , Biology/education , Educational Measurement/methods , Humans , Learning , Problem-Based Learning
3.
CBE Life Sci Educ ; 21(2): fe1, 2022 06 01.
Article in English | MEDLINE | ID: mdl-35544201

ABSTRACT

Problem solving plays an essential role in all scientific disciplines, and solving problems can reveal essential concepts that underlie those disciplines. Thus, problem solving serves both as a common tool and desired outcome in many science classes. Research on teaching problem solving offers principles for instruction that are guided by learning theories. This essay describes an online, evidence-based teaching guide (https://lse.ascb.org/evidence-based-teaching-guides/problem-solving) intended to guide instructors in the use of these principles. The guide describes the theoretical underpinnings of problem-solving research and instructional choices that can place instruction before problem solving (e.g., peer-led team learning and worked examples) or problem solving before instruction (e.g., process-oriented guided inquiry learning, contrasting cases, and productive failure). Finally, the guide describes assessment choices that help instructors consider alternative outcomes for problem-solving instruction. Each of these sections consists of key points that can be gleaned from the literature as well as summaries and links to articles that inform these points. The guide also includes an instructor checklist that offers a concise summary of key points with actionable steps to direct instructors as they develop and refine their problem-solving instruction.


Subject(s)
Problem Solving , Students , Humans , Learning , Teaching
4.
CBE Life Sci Educ ; 20(1): ar6, 2021 03.
Article in English | MEDLINE | ID: mdl-33444109

ABSTRACT

Students' study sessions outside class are important learning opportunities in college courses. However, we often depend on students to study effectively without explicit instruction. In this study, we described students' self-reported study habits and related those habits to their performance on exams. Notably, in these analyses, we controlled for potential confounds, such as academic preparation, self-reported class absences, and self-reported total study time. First, we found that, on average, students used approximately four active strategies to study and that they spent about half of their study time using active strategies. In addition, both the number of active strategies and the proportion of their study time using active strategies positively predicted exam performance. Second, on average, students started studying 6 days before an exam, but how early a student started studying was not related to performance on in-term (immediate) or cumulative (delayed) exams. Third, on average, students reported being distracted about 20% of their study time, and distraction while studying negatively predicted exam performance. These results add nuance to lab findings and help instructors prioritize study habits to target for change.


Subject(s)
Educational Measurement , Students , Habits , Humans , Learning , Universities
5.
CBE Life Sci Educ ; 19(3): ar42, 2020 09.
Article in English | MEDLINE | ID: mdl-32870077

ABSTRACT

We previously reported that students' concept-building approaches, identified a priori using a cognitive psychology laboratory task, extend to learning complex science, technology, engineering, and mathematics topics. This prior study examined student performance in both general and organic chemistry at a select research institution, after accounting for preparation. We found that abstraction learners (defined cognitively as learning the theory underlying related examples) performed higher on course exams than exemplar learners (defined cognitively as learning by memorizing examples). In the present paper, we further examined this initial finding by studying a general chemistry course using a different pedagogical approach (process-oriented guided-inquiry learning) at an institution focused on health science majors, and then extended our studies via think-aloud interviews to probe the effect concept-building approaches have on the problem-solving behaviors of students with average exam performance. From interviews with students in the average-achieving group, using problems at three transfer levels, we found that: 1) abstraction learners outperformed exemplar learners at all problem levels; 2) abstraction learners relied on understanding, while exemplar learners relied predominantly on an algorithm without understanding at all problem levels; and 3) students with both concept-building approaches had weaknesses in their metacognitive monitoring accuracy, specifically their postperformance confidence in the accuracy of their solutions.


Subject(s)
Problem Solving , Students , Engineering , Humans , Learning , Mathematics
6.
CBE Life Sci Educ ; 19(1): ar2, 2020 03.
Article in English | MEDLINE | ID: mdl-31916912

ABSTRACT

As research has shown, collaborative peer learning is effective for improving student learning. Peer-led team learning (PLTL) is one well-known collaborative-group approach in which groups are facilitated by trained undergraduate peer leaders. This paper contributes to the literature on peer-leader training by examining how peer leaders for a large introductory science course translate their training into practice during their sessions. By conducting qualitative analysis on annual advice books written by emergent peer leaders, we examined the practiced advice and strategies of these peer leaders as they facilitate PLTL groups in a university-level general chemistry course. These advice books are passed on to future peer instructors, creating a community of practice between new and more experienced peer leaders. From the analysis, we discovered that peer leaders focus on developing robust student-student discussion during complex problem solving by 1) creating a community-oriented social and intellectual environment, 2) adapting their tactics and the collaborative-learning strategies to balance different personalities and promote equal participation among all students, and 3) modifying collaborative group approaches when facilitating their sessions. Also, in their correspondence across cohorts, peer leaders provided near-peer support to one another. These annual books disseminate practiced advice between peer-leader generations and are used during new peer-leader training.


Subject(s)
Learning , Peer Group , Science , Humans , Science/education , Students , Universities
7.
CBE Life Sci Educ ; 18(2): ar15, 2019 06.
Article in English | MEDLINE | ID: mdl-31025914

ABSTRACT

Low-stakes testing, or quizzing, is a formative assessment tool often used to structure course work. After students complete a quiz, instructors commonly encourage them to use those quizzes again to retest themselves near exam time (i.e., delayed re-quizzing). In this study, we examine student use of online, ungraded practice quizzes that are reopened near exam time after a first graded attempt 1-3 weeks prior. We find that, when controlling for preparation (performance in a previous science, technology, engineering, and mathematics [STEM] course and incoming biology knowledge), re-quizzing predicts better performance on two cumulative exams in introductory biology: a course posttest and final exam. Additionally, we describe a preliminary finding that, for the final exam but not the posttest, re-quizzing benefits students with lower performance in a previous STEM course more than their higher-performing peers. Unfortunately, however, these struggling students are also less likely to participate in re-quizzing. Together, these data suggest that a common practice, reopening quizzes for practice near exam time, can effectively benefit student performance. This study adds to a growing body of literature that suggests quizzing can be used as both an assessment tool and a learning tool by showing that the "testing effect" extends to delayed re-quizzing within the classroom.


Subject(s)
Biology/education , Educational Measurement , Engineering/education , Humans , Learning , Mathematics/education , Models, Educational , Peer Group , Students , Technology/education
8.
CBE Life Sci Educ ; 17(2): ar30, 2018 06.
Article in English | MEDLINE | ID: mdl-29786474

ABSTRACT

Active learning with clickers is a common approach in high-enrollment, lecture-based courses in science, technology, engineering, and mathematics. In this study, we describe the procedures that faculty at one institution used when implementing clicker-based active learning, and how they situated these activities in their class sessions. Using a mixed-methods approach, we categorized faculty into four implementation styles based on quantitative observation data and conducted qualitative interviews to further understand why faculty used these styles. We found that faculty tended to use similar procedures when implementing a clicker activity, but differed on how they situated the clicker-based active learning into their courses. These variations were attributed to different faculty goals for using clicker-based active learning, with some using it to engage students at specific time points throughout their class sessions and others who selected it as the best way to teach a concept from several possible teaching techniques. Future research should continue to investigate and describe how active-learning strategies from literature may differ from what is being implemented.


Subject(s)
Engineering/education , Mathematics/education , Problem-Based Learning , Science/education , Technology/education , Faculty , Female , Humans , Students , Teaching , Time Factors
9.
CBE Life Sci Educ ; 15(4)2016.
Article in English | MEDLINE | ID: mdl-27856548

ABSTRACT

The PULSE Vision & Change Rubrics, version 1.0, assess life sciences departments' progress toward implementation of the principles of the Vision and Change report. This paper reports on the development of the rubrics, their validation, and their reliability in measuring departmental change aligned with the Vision and Change recommendations. The rubrics assess 66 different criteria across five areas: Curriculum Alignment, Assessment, Faculty Practice/Faculty Support, Infrastructure, and Climate for Change. The results from this work demonstrate the rubrics can be used to evaluate departmental transformation equitably across institution types and represent baseline data about the adoption of the Vision and Change recommendations by life sciences programs across the United States. While all institution types have made progress, liberal arts institutions are farther along in implementing these recommendations. Generally, institutions earned the highest scores on the Curriculum Alignment rubric and the lowest scores on the Assessment rubric. The results of this study clearly indicate that the Vision & Change Rubrics, version 1.0, are valid and equitable and can track long-term progress of the transformation of life sciences departments. In addition, four of the five rubrics have broad applicability and can be used to evaluate departmental transformation by other science, technology, engineering, and mathematics disciplines.


Subject(s)
Biological Science Disciplines/education , Educational Measurement/methods , Universities , Analysis of Variance , Databases as Topic , Faculty , Principal Component Analysis , Reproducibility of Results