
Characterizing lab instructors' self-reported learning goals to inform development of an experimental modeling skills assessment

Posted by Dimitri Dounas-Frazer
Publication date: 2017
Research field: Physics
Paper language: English





The ability to develop, use, and refine models of experimental systems is a nationally recognized learning outcome for undergraduate physics lab courses. However, no assessments of students' model-based reasoning exist for upper-division labs. This study is the first step toward development of modeling assessments for optics and electronics labs. In order to identify test objectives that are likely relevant across many institutional contexts, we interviewed 35 lab instructors about the ways they incorporate modeling in their course learning goals and activities. The study design was informed by the Modeling Framework for Experimental Physics. This framework conceptualizes modeling as consisting of multiple subtasks: making measurements, constructing system models, comparing data to predictions, proposing causes for discrepancies, and enacting revisions to models or apparatus. We found that each modeling subtask was identified by multiple instructors as an important learning outcome for their course. Based on these results, we argue that test objectives should include probing students' competence with most modeling subtasks, and test items should be designed to elicit students' justifications for choosing particular modeling pathways. In addition to discussing these and other implications for assessment, we also identify future areas of research related to the role of modeling in optics and electronics labs.




Read also

In this exploratory qualitative study, we describe instructors' self-reported practices for teaching and assessing students' ability to troubleshoot in electronics lab courses. We collected audio data from interviews with 20 electronics instructors from 18 institutions that varied by size, selectivity, and other factors. In addition to describing participants' instructional practices, we characterize their perceptions about the role of troubleshooting in electronics, the importance of the ability to troubleshoot more generally, and what it means for students to be competent troubleshooters. One major finding of this work is that, while almost all instructors in our study said that troubleshooting is an important learning outcome for students in electronics lab courses, only half of the instructors said they directly assessed students' ability to troubleshoot. Based on our findings, we argue that there is a need for research-based instructional materials that attend to both cognitive and non-cognitive aspects of troubleshooting proficiency. We also identify several areas for future investigation related to troubleshooting instruction in electronics lab courses.
We demonstrate how students' use of modeling can be examined and assessed using student notebooks collected from an upper-division electronics lab course. The use of models is a ubiquitous practice in undergraduate physics education, but the process of constructing, testing, and refining these models is much less common. We focus our attention on a lab course that has been transformed to engage students in this modeling process during lab activities. The design of the lab activities was guided by a framework that captures the different components of model-based reasoning, called the Modeling Framework for Experimental Physics. We demonstrate how this framework can be used to assess students' written work and to identify how students' model-based reasoning differed from activity to activity. Broadly speaking, we were able to identify the different steps of students' model-based reasoning and assess the completeness of their reasoning. Varying degrees of scaffolding present across the activities had an impact on how thoroughly students would engage in the full modeling process, with more scaffolded activities resulting in more thorough engagement with the process. Finally, we identified that the step in the process with which students had the most difficulty was the comparison between their interpreted data and their model prediction. Students did not use sufficiently sophisticated criteria in evaluating such comparisons, which had the effect of halting the modeling process. This may indicate that, in order to engage students further in using model-based reasoning during lab activities, the instructor needs to provide further scaffolding for how students make these types of experimental comparisons. This is an important design consideration for other such courses attempting to incorporate modeling as a learning goal.
Writing is an integral part of the process of science. In the undergraduate physics curriculum, the most common place that students engage with scientific writing is in lab classes, typically through lab notebooks, reports, and proposals. There has not been much research on why and how we include writing in physics lab classes, and instructors may incorporate writing for a variety of reasons. Through a broader study of multiweek projects in advanced lab classes, we have developed a framework for thinking about and understanding the role of writing in lab classes. This framework defines and describes the breadth of goals for incorporating writing in lab classes, and is a tool we can use to begin to understand why, and subsequently how, we teach scientific writing in physics.
The ability to construct, use, and revise models is a crucial experimental physics skill. Many existing frameworks describe modeling in science education at introductory levels. However, most have limited applicability to the context of upper-division physics lab courses or experimental physics. Here, we discuss the Modeling Framework for Experimental Physics, a theoretical framework tailored to labs and experimentation. A key feature of the Framework is recursive interaction between models and apparatus. Models are revised to account for new evidence produced by apparatus, and apparatus are revised to better align with the simplifying assumptions of models. Another key feature is the distinction between the physical phenomenon being investigated and the measurement equipment used to conduct the investigation. Models of physical systems facilitate explanation or prediction of phenomena, whereas models of measurement systems facilitate interpretation of data. We describe the Framework, provide a chronological history of its development, and summarize its applications to research and curricular design. Ultimately, we argue that the Modeling Framework is a theoretically sound and well-tested tool that is applicable to multiple physics domains and research purposes. In particular, it is useful for characterizing students' approaches to experimentation, designing or evaluating curricula for lab courses, and developing instruments to assess students' experimental modeling skills.
One way to foster a supportive culture in physics departments is for instructors to provide students with personal attention regarding their academic difficulties. To this end, we have developed the Guided Reflection Form (GRF), an online tool that facilitates student reflections and personalized instructor responses. In the present work, we report on the experiences and practices of two instructors who used the GRF in an introductory physics lab course. Our analysis draws on two sources of data: (i) post-semester interviews with both instructors and (ii) the instructors' written responses to 134 student reflections. Interviews focused on the instructors' perceptions about the goals and framing of the GRF activity, and characteristics of good or bad feedback. Their GRF responses were analyzed for the presence of up to six types of statement: encouraging statements, normalizing statements, empathizing statements, strategy suggestions, resource suggestions, and feedback to the student on the structure of their reflection. We find that both instructors used all six response types, in alignment with their perceptions of what counts as good feedback. This exploratory qualitative investigation demonstrates that the GRF can serve as a mechanism for instructors to pay personal attention to their students. In addition, it opens the door to future work about the impact of the GRF on student-teacher interactions.