Traditional methods of reporting changes in student responses have focused on class-wide averages. Such models hide information about switches in individual students' responses over the course of a semester. We extend unpublished work by Steven Kanim on escalator diagrams, which show changes in student responses from correct to incorrect (and vice versa) while representing pre- and post-instruction results on questions. Our extension consists of consistency plots in which we represent three forms of data: method of solution and correctness of solution, both before and after instruction. Our data are from an intermediate mechanics class, and come from (nearly) identical midterm and final examination questions.
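The bookkeeping behind such a plot can be illustrated with a short sketch. The function and the method labels below are hypothetical, not taken from the paper: each student record pairs a solution method and a correctness flag before and after instruction, and the tally counts how many students fall into each transition cell.

```python
from collections import Counter

def tally_transitions(records):
    """Count students in each (pre, post) method/correctness cell.

    Each record is a tuple:
      (pre_method, pre_correct, post_method, post_correct)
    where *_method is a short label (e.g. "kinematics", "energy")
    and *_correct is a bool. Labels here are illustrative only.
    """
    counts = Counter()
    for pre_method, pre_correct, post_method, post_correct in records:
        counts[(pre_method, pre_correct, post_method, post_correct)] += 1
    return counts

# Example: two students switch method and become correct;
# one keeps the same method but regresses.
records = [
    ("kinematics", False, "energy", True),
    ("kinematics", False, "energy", True),
    ("energy", True, "energy", False),
]
counts = tally_transitions(records)
```

Each cell count could then be drawn as an arrow or band whose width is proportional to the number of students making that transition.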
One way to foster a supportive culture in physics departments is for instructors to provide students with personal attention regarding their academic difficulties. To this end, we have developed the Guided Reflection Form (GRF), an online tool that facilitates student reflections and personalized instructor responses. In the present work, we report on the experiences and practices of two instructors who used the GRF in an introductory physics lab course. Our analysis draws on two sources of data: (i) post-semester interviews with both instructors and (ii) the instructors' written responses to 134 student reflections. Interviews focused on the instructors' perceptions of the goals and framing of the GRF activity, and on the characteristics of good or bad feedback. Their GRF responses were analyzed for the presence of up to six types of statements: encouraging statements, normalizing statements, empathizing statements, strategy suggestions, resource suggestions, and feedback to the student on the structure of their reflection. We find that both instructors used all six response types, in alignment with their perceptions of what counts as good feedback. This exploratory qualitative investigation demonstrates that the GRF can serve as a mechanism for instructors to pay personal attention to their students. In addition, it opens the door to future work about the impact of the GRF on student-teacher interactions.
Immersive virtual reality (VR) has enormous potential for education, but classroom resources are limited. Thus, it is important to identify whether and when VR provides sufficient advantages over other modes of learning to justify its deployment. In a between-subjects experiment, we compared three methods of teaching Moon phases (a hands-on activity, VR, and a desktop simulation) and measured student improvement on existing learning and attitudinal measures. While a substantial majority of students preferred the VR experience, we found no significant differences in learning between conditions. However, we found differences between conditions based on gender, which was highly correlated with experience with video games. These differences may indicate certain groups have an advantage in the VR setting.
We describe a study of the conceptual difficulties faced by college students in understanding the hydrodynamics of ideal fluids. This study was based on responses obtained in hundreds of written exams and in oral interviews held with first-year Engineering and Science university students. Their responses allowed us to identify a series of misconceptions not previously reported in the literature. The study findings demonstrate that the most important difficulties arise from students' inability to establish a link between the kinematics and dynamics of moving fluids, and from a lack of understanding of how different regions of a system interact.
A novel way of picturing the processing of quantum information is described, allowing a direct visualization of teleportation of quantum states and providing a simple and intuitive understanding of this fascinating phenomenon. The discussion is aimed at providing physicists with a method of explaining teleportation to non-scientists. The basic ideas of quantum physics are first explained in lay terms, after which these ideas are used with a graphical description, out of which teleportation arises naturally.
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
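As a minimal sketch of where error bars on such a plot might come from, the code below treats each plot coordinate as a sample proportion and uses the standard binomial standard error, sqrt(p(1-p)/n). This is an assumption for illustration; the paper's actual method, which extends Sommer and Lindell, may differ.

```python
import math

def proportion_with_se(successes, n):
    """Return a sample proportion and its binomial standard error.

    A simple (hypothetical) choice for model-plot error bars:
    se = sqrt(p * (1 - p) / n).
    """
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, se

# Example: in a class of 100, 60 students choose the correct answer
# and 30 choose answers consistent with the incorrect model.
x, dx = proportion_with_se(60, 100)  # correct-model coordinate
y, dy = proportion_with_se(30, 100)  # incorrect-model coordinate
```

The two (value, error) pairs would then be plotted as a single point with horizontal and vertical error bars in the unit square of the model plot.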