Analysis of stochastic processes can be used to engender critical thinking. Quantum dots exhibit a reversible, stochastic transition between luminescent and non-luminescent states. This luminescence intermittency is known as blinking and is not evident from ensemble measurements. To stimulate critical thinking, students design, perform, and analyze a semiconductor quantum dot blinking laboratory experiment. The design of the experiment and the stochastic nature of the data require students to make judgements throughout the single-particle measurement and analysis. Some of these decisions have no uniquely correct answers, challenging the students to engage in critical thinking. We propose that students' self-examined decision making develops a constructivist view of science. The experiment is visually striking and interdisciplinary, and it develops higher-order thinking.
The photoluminescence intermittency (blinking) of quantum dots is interesting because it is an easily measured quantum process whose transition statistics cannot be explained by Fermi's Golden Rule. Commonly, the transition statistics are power-law distributed, implying that quantum dots possess at least trivial memories. By investigating the temporal correlations in the blinking data, we demonstrate with high statistical confidence that quantum dot blinking data has non-trivial memory, which we define to be statistical complexity greater than one. We show that this memory cannot be discovered using the transition distribution, and we show by simulation that it does not arise from standard data manipulations. Finally, we conclude that at least three physical mechanisms can explain the measured non-trivial memory: (1) storage of state information in the chemical structure of a quantum dot; (2) the existence of more than two intensity levels in a quantum dot; and (3) the overlap in the intensity distributions of the quantum dot states, which arises from fundamental photon statistics.
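As a minimal illustration of the kind of analysis this abstract alludes to, the sketch below generates synthetic, memoryless dwell times from a power law, recovers the exponent by maximum likelihood (the continuous-power-law estimator of Clauset et al. 2009), and then applies a simple memory probe that the dwell-time distribution alone cannot provide: the rank correlation between consecutive dwell times. The synthetic data and all variable names are ours; this is a toy probe, not the statistical-complexity machinery of the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic "on"-state dwell times drawn i.i.d. from a power law
    # P(t) ~ t^(-alpha), t >= t_min (inverse-transform sampling); i.i.d.
    # draws are memoryless by construction, so they serve as a null case.
    alpha, t_min, n = 1.8, 1.0, 5000
    dwell = t_min * (1.0 - rng.random(n)) ** (-1.0 / (alpha - 1.0))

    # 1) Power-law exponent by maximum likelihood:
    #    alpha_hat = 1 + n / sum(ln(t_i / t_min))
    alpha_hat = 1.0 + n / np.sum(np.log(dwell / t_min))
    print(f"estimated exponent: {alpha_hat:.3f} (true value {alpha})")

    # 2) A memory probe invisible to the dwell-time distribution: rank
    #    correlation between consecutive dwell times.  It is ~0 here; a
    #    significantly nonzero value indicates temporal correlations.
    rho, p = stats.spearmanr(dwell[:-1], dwell[1:])
    print(f"successive-dwell Spearman rho = {rho:.3f} (p = {p:.2f})")

On real blinking traces the dwell times would first have to be extracted by thresholding the intensity time series, one of the data manipulations whose effects the abstract's simulations are designed to rule out.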
We discuss our outreach efforts to introduce school students to network science and explain why network researchers should be involved in such outreach activities. We provide overviews of modules that we have designed for these efforts, comment on our successes and failures, and illustrate the potentially enormous impact of such outreach efforts.
Science students must deal with the errors inherent to all physical measurements and be conscious of the need to express them as a best estimate and a range of uncertainty. Errors are routinely classified as statistical or systematic. Although statistical errors are usually dealt with in the first years of science studies, the typical approaches are based on manually performing repetitive observations. Our work proposes a set of laboratory experiments to teach errors and uncertainties using data recorded with the sensors available in many mobile devices. The main aspects addressed are the physical meaning of the mean value and standard deviation, and the interpretation of histograms and distributions. The normality of the fluctuations is analyzed qualitatively, by comparing histograms with normal curves, and quantitatively, by comparing the number of observations in given intervals with the number expected from a normal distribution and by performing a chi-squared test. We show that the fluctuations usually follow a normal distribution; however, when the sensor is placed on top of a loudspeaker playing a pure tone, significant deviations from normality are observed. As applications to everyday situations, we discuss the intensity of the fluctuations in different scenarios, such as placing the device on a table or holding it in the hands in different ways. Other activities focus on the smoothness of a road, quantified in terms of the fluctuations registered by the accelerometer. The present proposal contributes to gaining deep insight into modern technologies and statistical errors and, finally, to motivating and encouraging engineering and science students.
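As a sketch of the quantitative normality check described above, the following compares binned accelerometer-style readings with the counts expected for a normal distribution and runs a chi-squared test. The synthetic readings stand in for samples exported from a phone's sensor app, and all variable names are ours.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Stand-in for accelerometer readings of a device resting on a table
    # (synthetic; replace with samples exported from a sensor app).
    a = rng.normal(loc=9.81, scale=0.02, size=2000)
    mu, sigma = a.mean(), a.std(ddof=1)

    # Bin the readings and compute the counts expected for a normal
    # distribution with the sample mean and standard deviation.
    edges = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 13)  # 12 bins
    observed, _ = np.histogram(a, bins=edges)
    expected = a.size * np.diff(stats.norm.cdf(edges, mu, sigma))
    expected *= observed.sum() / expected.sum()  # match totals for the test

    # Chi-squared test; ddof=2 because mu and sigma were estimated
    # from the same data.
    chi2, p = stats.chisquare(observed, expected, ddof=2)
    print(f"chi2 = {chi2:.1f}, p = {p:.3f}")

Applied to readings taken on top of the loudspeaker, the same test would be expected to return a very small p-value, flagging the departure from normality that the abstract reports.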
Computational Thinking (CT) is still a relatively new term in the lexicon of learning objectives and science standards. There is not yet widespread agreement on the precise definition or implementation of CT, and efforts to assess it are still maturing, even as more states adopt K-12 computer science standards. In this article we try to summarize what CT means for a typical introductory (i.e., high school or early college) physics class, including a discussion of the ways that instructors may already be incorporating elements of CT in their classes without realizing it. Our intention in writing this article is to provide a helpful, concise, and readable introduction to the topic for physics instructors. We also put forward some ideas for what the future of CT in introductory physics may look like.
Wave mixing is an archetypal phenomenon in bosonic systems. In optomechanics, the bi-directional conversion between electromagnetic waves, or photons, at optical frequencies and elastic waves, or phonons, at radio frequencies builds on precisely this fundamental principle. Surface acoustic waves (SAWs) provide a versatile interconnect on a chip and thus enable the optomechanical control of remote systems. Here, we report on coherent nonlinear three-wave mixing between the coherent fields of two radio frequency SAWs and optical laser photons via the dipole transition of a single quantum dot exciton. In the resolved sideband regime, we demonstrate fundamental acoustic analogues of sum and difference frequency generation between the two SAWs and employ phase matching to deterministically enhance or suppress individual sidebands. This bi-directional transfer between the acoustic and optical domains is described by a theory that fully takes into account direct and virtual multi-phonon processes. Finally, we show that the precision of the wave mixing is limited by the frequency accuracy of modern radio frequency electronics.
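In schematic notation (our symbols, not the paper's), a laser at frequency \(\omega_L\) scattering off an exciton modulated by two SAWs at frequencies \(\Omega_1\) and \(\Omega_2\) produces sidebands at

    \[
      \omega_{n_1, n_2} \;=\; \omega_L + n_1\,\Omega_1 + n_2\,\Omega_2,
      \qquad n_1, n_2 \in \mathbb{Z},
    \]

so the sideband with \((n_1, n_2) = (1, 1)\) is the acoustic analogue of sum frequency generation and \((1, -1)\) that of difference frequency generation; per the abstract, the relative phase of the two SAWs determines which of these mixed sidebands is enhanced or suppressed.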