We demonstrate how students' use of modeling can be examined and assessed using student notebooks collected from an upper-division electronics lab course. The use of models is a ubiquitous practice in undergraduate physics education, but the process of constructing, testing, and refining these models is much less common. We focus our attention on a lab course that has been transformed to engage students in this modeling process during lab activities. The design of the lab activities was guided by a framework that captures the different components of model-based reasoning, called the Modeling Framework for Experimental Physics. We demonstrate how this framework can be used to assess students' written work and to identify how students' model-based reasoning differed from activity to activity. Broadly speaking, we were able to identify the different steps of students' model-based reasoning and assess the completeness of their reasoning. The varying degrees of scaffolding across the activities affected how thoroughly students engaged in the full modeling process, with more scaffolded activities resulting in more thorough engagement. Finally, we identified that the step in the process with which students had the most difficulty was the comparison between their interpreted data and their model prediction. Students did not use sufficiently sophisticated criteria in evaluating such comparisons, which had the effect of halting the modeling process. This may indicate that, in order to engage students further in model-based reasoning during lab activities, instructors need to provide additional scaffolding for how students make these types of experimental comparisons. This is an important design consideration for other courses attempting to incorporate modeling as a learning goal.
The use of lab notebooks for scientific documentation is a ubiquitous part of physics research. However, it is common for undergraduate physics laboratory courses not to emphasize the development of documentation skills, despite the fact that such courses are some of the earliest opportunities for students to start engaging in this practice. One potential impediment to the inclusion of explicit documentation training is that it may be unclear to instructors which features of authentic documentation practice are efficacious to teach and how to incorporate these features into the lab class environment. In this work, we outline some of the salient features of authentic documentation, informed by interviews with physics researchers, and provide recommendations for how these can be incorporated into the lab curriculum. We do not focus on structural details or templates for notebooks. Instead, we address holistic considerations for the purpose of scientific documentation that can guide students to develop their own documentation style. In considering all the aspects that can help improve students' documentation, it is also important to consider the design of the lab activities themselves. Students should have experience with implementing these authentic features of documentation during lab activities in order for them to find practice with documentation beneficial.
Although developing proficiency with modeling is a nationally endorsed learning outcome for upper-division undergraduate physics lab courses, no corresponding research-based assessments exist. Our long-term goal is to develop assessments of students' modeling ability that are relevant across multiple upper-division lab contexts. To this end, we interviewed 19 instructors from 16 institutions about optics lab activities that incorporate photodiodes. Interviews focused on how those activities were designed to engage students in some aspects of modeling. We find that, according to many interviewees, iteration is an important aspect of modeling. In addition, interviewees described four distinct types of iteration: revising apparatuses, revising models, revising data-taking procedures, and repeating data collection using existing apparatuses and procedures. We provide examples of each type of iteration and discuss implications for the development of future modeling assessments.
Proficiency with calculating, reporting, and understanding measurement uncertainty is a nationally recognized learning outcome for undergraduate physics lab courses. The Physics Measurement Questionnaire (PMQ) is a research-based assessment tool that measures such understanding. The PMQ was designed to characterize student reasoning into point or set paradigms, where the set paradigm is more aligned with expert reasoning. We analyzed over 500 open-ended student responses collected at the beginning and end of a traditional introductory lab course at the University of Colorado Boulder. We discuss changes in students' understanding over a semester by analyzing pre-post shifts in student responses regarding data collection, data analysis, and data comparison.
In this exploratory qualitative study, we describe instructors' self-reported practices for teaching and assessing students' ability to troubleshoot in electronics lab courses. We collected audio data from interviews with 20 electronics instructors from 18 institutions that varied by size, selectivity, and other factors. In addition to describing participants' instructional practices, we characterize their perceptions about the role of troubleshooting in electronics, the importance of the ability to troubleshoot more generally, and what it means for students to be competent troubleshooters. One major finding of this work is that, while almost all instructors in our study said that troubleshooting is an important learning outcome for students in electronics lab courses, only half said they directly assessed students' ability to troubleshoot. Based on our findings, we argue that there is a need for research-based instructional materials that attend to both cognitive and non-cognitive aspects of troubleshooting proficiency. We also identify several areas for future investigation related to troubleshooting instruction in electronics lab courses.
The general problem of effectively using interactive engagement in non-introductory physics courses remains open. We present a three-year study comparing different approaches to lecturing in an intermediate mechanics course at the Colorado School of Mines. In the first year, the lectures were fairly traditional. In the second year, the lectures were modified to include Socratic dialogs between the instructor and students. In the third year, the instructor used a personal response system and Peer Instruction-like pedagogy. All other course materials were nearly identical to those of an established traditional lecture course. We present results from a new instructor-constructed conceptual survey, exams, and course evaluations. We observe little change in student exam performance as lecture techniques varied, though students consistently stated that clickers were the best part of the course and the one from which they learned the most. Indeed, when clickers were used in this course, students were considerably more likely to become engaged than students in CSM introductory courses using the same methods.