In this exploratory qualitative study, we describe instructors' self-reported practices for teaching and assessing students' ability to troubleshoot in electronics lab courses. We collected audio data from interviews with 20 electronics instructors from 18 institutions that varied by size, selectivity, and other factors. In addition to describing participants' instructional practices, we characterize their perceptions about the role of troubleshooting in electronics, the importance of the ability to troubleshoot more generally, and what it means for students to be competent troubleshooters. One major finding of this work is that, while almost all instructors in our study said that troubleshooting is an important learning outcome for students in electronics lab courses, only half said they directly assessed students' ability to troubleshoot. Based on our findings, we argue that there is a need for research-based instructional materials that attend to both cognitive and non-cognitive aspects of troubleshooting proficiency. We also identify several areas for future investigation related to troubleshooting instruction in electronics lab courses.
The ability to develop, use, and refine models of experimental systems is a nationally recognized learning outcome for undergraduate physics lab courses. However, no assessments of students' model-based reasoning exist for upper-division labs. This study is the first step toward the development of modeling assessments for optics and electronics labs. To identify test objectives that are likely to be relevant across many institutional contexts, we interviewed 35 lab instructors about the ways they incorporate modeling in their course learning goals and activities. The study design was informed by the Modeling Framework for Experimental Physics, which conceptualizes modeling as consisting of multiple subtasks: making measurements, constructing system models, comparing data to predictions, proposing causes for discrepancies, and enacting revisions to models or apparatus. We found that each modeling subtask was identified by multiple instructors as an important learning outcome for their course. Based on these results, we argue that test objectives should include probing students' competence with most modeling subtasks, and that test items should be designed to elicit students' justifications for choosing particular modeling pathways. In addition to discussing these and other implications for assessment, we also identify future areas of research related to the role of modeling in optics and electronics labs.
We demonstrate how students' use of modeling can be examined and assessed using student notebooks collected from an upper-division electronics lab course. The use of models is a ubiquitous practice in undergraduate physics education, but the process of constructing, testing, and refining these models is much less common. We focus our attention on a lab course that has been transformed to engage students in this modeling process during lab activities. The design of the lab activities was guided by a framework that captures the different components of model-based reasoning, called the Modeling Framework for Experimental Physics. We demonstrate how this framework can be used to assess students' written work and to identify how students' model-based reasoning differed from activity to activity. Broadly speaking, we were able to identify the different steps of students' model-based reasoning and assess the completeness of their reasoning. The varying degrees of scaffolding across the activities affected how thoroughly students engaged in the full modeling process, with more scaffolded activities resulting in more thorough engagement. Finally, we identified that the step with which students had the most difficulty was the comparison between their interpreted data and their model predictions. Students did not use sufficiently sophisticated criteria when evaluating such comparisons, which halted the modeling process. This may indicate that, to engage students further in model-based reasoning during lab activities, instructors need to provide additional scaffolding for how students make these kinds of experimental comparisons. This is an important design consideration for other courses attempting to incorporate modeling as a learning goal.
While laboratory instruction is a cornerstone of physics education, the impact of student behaviours in labs on retention, persistence in the field, and the formation of students' physics identity remains an open question. In this study, we performed in-lab observations of student actions over two semesters in two pedagogically different sections of the same introductory physics course. We used a cluster analysis to identify different categories of student behaviour and analyzed how they correlate with lab structure and gender. We find that, in lab structures which fostered collaborative group work and promoted decision making, tasks were divided along gender lines with respect to laptop and equipment usage; we found no such divide among students in guided verification labs.
One way to foster a supportive culture in physics departments is for instructors to provide students with personal attention regarding their academic difficulties. To this end, we have developed the Guided Reflection Form (GRF), an online tool that facilitates student reflections and personalized instructor responses. In the present work, we report on the experiences and practices of two instructors who used the GRF in an introductory physics lab course. Our analysis draws on two sources of data: (i) post-semester interviews with both instructors and (ii) the instructors' written responses to 134 student reflections. The interviews focused on the instructors' perceptions about the goals and framing of the GRF activity and the characteristics of good or bad feedback. Their GRF responses were analyzed for the presence of up to six types of statement: encouraging statements, normalizing statements, empathizing statements, strategy suggestions, resource suggestions, and feedback to the student on the structure of their reflection. We find that both instructors used all six response types, in alignment with their perceptions of what counts as good feedback. This exploratory qualitative investigation demonstrates that the GRF can serve as a mechanism for instructors to pay personal attention to their students. In addition, it opens the door to future work on the impact of the GRF on student-teacher interactions.
Recently, there have been several national calls to emphasize physics practices and skills within laboratory courses. In this paper, we describe the redesign and implementation of a two-course sequence of algebra-based physics laboratories at Michigan State University called Design, Analysis, Tools, and Apprenticeship (DATA) Lab. This large-scale course transformation removes physics-specific content from the overall learning goals of the course and instead uses physics concepts to focus on specific laboratory practices and research skills that students can take into their future careers. Students in DATA Lab engage in the exploration of physical systems to increase their understanding of the experimental process, data analysis, collaboration, and scientific communication. To ensure that students are making progress toward the skills outlined in the course learning goals, we designed all of the course assessments to evaluate progress on these laboratory practices. Here, we describe the structures, scaffolds, goals, and assessments of the course.