Reduced motor control is one of the most frequent features associated with aging and disease. Nonlinear and fractal analyses have proved to be useful in investigating human physiological alterations with age and disease. However, similar findings have not been established for any of the model organisms typically studied by biologists. If the physiology of a simpler model organism displays the same characteristics, this would open a new research window on the control mechanisms that organisms use to regulate physiological processes during aging and stress. Here, we use a recently introduced animal tracking technology to simultaneously follow tens of Caenorhabditis elegans for several hours and use tools from fractal physiology to quantitatively evaluate the effects of aging and temperature stress on nematode motility. As in human physiological signals, scaling analysis reveals long-range correlations in numerous motility variables, fractal properties in behavioral shifts, and fluctuation dynamics over a wide range of timescales. These properties change as a result of a superposition of age- and stress-related adaptive mechanisms that regulate motility.
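As a concrete illustration of the kind of scaling analysis referred to above, the sketch below implements a minimal detrended fluctuation analysis (DFA), a standard way to quantify long-range correlations; the signal name, window sizes, and use of a first-order detrend are illustrative assumptions, not the authors' actual pipeline.

    import numpy as np

    def dfa(signal, window_sizes):
        """Minimal detrended fluctuation analysis: returns fluctuation F(n) per window size.

        A scaling exponent alpha ~ 0.5 indicates uncorrelated noise;
        alpha > 0.5 indicates long-range (persistent) correlations.
        """
        # Integrate the mean-subtracted signal (the "profile")
        profile = np.cumsum(signal - np.mean(signal))
        fluctuations = []
        for n in window_sizes:
            n_windows = len(profile) // n
            rms = []
            for i in range(n_windows):
                segment = profile[i * n:(i + 1) * n]
                # Detrend each window with a linear fit
                x = np.arange(n)
                trend = np.polyval(np.polyfit(x, segment, 1), x)
                rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
            fluctuations.append(np.mean(rms))
        return np.array(fluctuations)

    # Example: estimate the scaling exponent from a log-log fit
    speed = np.random.randn(10000)                       # placeholder for a motility time series
    sizes = np.unique(np.logspace(1, 3, 20).astype(int))
    F = dfa(speed, sizes)
    alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
    print(f"scaling exponent alpha ~ {alpha:.2f}")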
A quantitative understanding of how sensory signals are transformed into motor outputs places useful constraints on brain function and helps reveal the brain's underlying computations. We investigate how the nematode C. elegans responds to time-varying mechanosensory signals using a high-throughput optogenetic assay and automated behavior quantification. In the prevailing picture of the touch circuit, the animal's behavior is determined by which neurons are stimulated and by the stimulus amplitude. In contrast, we find that the behavioral response is tuned to temporal properties of the mechanosensory signal, such as its integral and derivative, that extend over many seconds. Mechanosensory signals, even in the same neurons, can be tailored to elicit different behavioral responses. Moreover, we find that the animal's response also depends on its behavioral context. Most dramatically, the animal ignores all tested mechanosensory stimuli during turns. Finally, we present a linear-nonlinear model that predicts the animal's behavioral response to the stimulus.
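To make the linear-nonlinear picture concrete, here is a minimal sketch of how such a cascade maps a time-varying stimulus to a behavioral response probability; the filter shape, nonlinearity, and parameter values are illustrative assumptions rather than the fitted model from the study.

    import numpy as np

    def linear_nonlinear_response(stimulus, dt=0.1):
        """Minimal linear-nonlinear (LN) cascade: stimulus -> linear filter -> sigmoid.

        The linear stage integrates the stimulus over several seconds
        (here an exponential filter); the static nonlinearity maps the
        filtered signal to a response probability.
        """
        # Linear stage: causal exponential filter spanning ~5 s (assumed shape)
        t = np.arange(0, 5, dt)
        kernel = np.exp(-t / 2.0)
        kernel /= kernel.sum()
        filtered = np.convolve(stimulus, kernel)[:len(stimulus)]

        # Nonlinear stage: logistic function with assumed gain and threshold
        gain, threshold = 4.0, 0.5
        return 1.0 / (1.0 + np.exp(-gain * (filtered - threshold)))

    # Example: a step stimulus produces a delayed, saturating response probability
    stim = np.zeros(300)
    stim[100:200] = 1.0
    p_response = linear_nonlinear_response(stim)
    print(p_response[150], p_response[250])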
The ability to acquire large-scale recordings of neuronal activity in awake and unrestrained animals poses a major challenge for studying neural coding of animal behavior. We present a new instrument capable of recording intracellular calcium transients from every neuron in the head of a freely behaving C. elegans with cellular resolution while simultaneously recording the animals position, posture and locomotion. We employ spinning-disk confocal microscopy to capture 3D volumetric fluorescent images of neurons expressing the calcium indicator GCaMP6s at 5 head-volumes per second. Two cameras simultaneously monitor the animals position and orientation. Custom software tracks the 3D position of the animals head in real-time and adjusts a motorized stage to keep it within the field of view as the animal roams freely. We observe calcium transients from 78 neurons and correlate this activity with the animals behavior. Across worms, multiple neurons show significant correlations with modes of behavior corresponding to forward, backward, and turning locomotion. By comparing the 3D positions of these neurons with a known atlas, our results are consistent with previous single-neuron studies and demonstrate the existence of new candidate neurons for behavioral circuits.
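As a rough sketch of the kind of activity-behavior correlation described above, the snippet below correlates each neuron's calcium trace with a behavioral variable such as signed crawling velocity; the array names, synthetic data, and significance test are illustrative assumptions, not the authors' analysis code.

    import numpy as np
    from scipy import stats

    def correlate_neurons_with_behavior(calcium, behavior):
        """Correlate each neuron's calcium trace with a behavioral time series.

        calcium  : array of shape (n_neurons, n_timepoints), e.g. GCaMP6s dF/F
        behavior : array of shape (n_timepoints,), e.g. signed crawling velocity
        Returns Pearson r and p-value per neuron.
        """
        results = []
        for trace in calcium:
            r, p = stats.pearsonr(trace, behavior)
            results.append((r, p))
        return np.array(results)

    # Example with synthetic data: 78 neurons sampled at 5 volumes/s for 10 minutes
    n_neurons, n_t = 78, 5 * 60 * 10
    calcium = np.random.randn(n_neurons, n_t)
    velocity = np.random.randn(n_t)
    stats_per_neuron = correlate_neurons_with_behavior(calcium, velocity)
    significant = np.where(stats_per_neuron[:, 1] < 0.05 / n_neurons)[0]  # Bonferroni correction
    print(f"{len(significant)} neurons pass the corrected threshold")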
Fueled by breakthrough technology developments, the biological, biomedical, and behavioral sciences are now collecting more data than ever before. There is a critical need for time- and cost-efficient strategies to analyze and interpret these data to advance human health. The recent rise of machine learning as a powerful technique to integrate multimodal, multifidelity data and reveal correlations between intertwined phenomena presents a special opportunity in this regard. However, classical machine learning techniques often ignore the fundamental laws of physics and result in ill-posed problems or non-physical solutions. Multiscale modeling is a successful strategy to integrate multiscale, multiphysics data and uncover mechanisms that explain the emergence of function. However, multiscale modeling alone often fails to efficiently combine large data sets from different sources and different levels of resolution. We show how machine learning and multiscale modeling can complement each other to create robust predictive models that integrate the underlying physics to manage ill-posed problems and explore massive design spaces. We critically review the current literature, highlight applications and opportunities, address open questions, and discuss potential challenges and limitations in four overarching topical areas: ordinary differential equations, partial differential equations, data-driven approaches, and theory-driven approaches. Towards these goals, we leverage expertise in applied mathematics, computer science, computational biology, biophysics, biomechanics, engineering mechanics, experimentation, and medicine. Our multidisciplinary perspective suggests that integrating machine learning and multiscale modeling can provide new insights into disease mechanisms, help identify new targets and treatment strategies, and inform decision making for the benefit of human health.
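One minimal way to picture combining data-driven fitting with physics-based constraints, as advocated above, is to augment a data-mismatch loss with a penalty on the residual of a governing differential equation; the toy exponential-decay ODE, polynomial model, and parameter values below are illustrative assumptions, not drawn from the review itself.

    import numpy as np
    from scipy.optimize import minimize

    # Sparse, noisy observations of a decay process governed by dy/dt = -k*y
    k_true = 1.0
    t_data = np.array([0.0, 0.5, 2.5, 3.0])
    y_data = np.exp(-k_true * t_data) + 0.05 * np.random.randn(len(t_data))
    t_colloc = np.linspace(0.0, 3.0, 30)   # collocation points for the physics term

    def loss(coeffs, weight=1.0):
        """Fit a cubic y(t) = c0 + c1*t + c2*t^2 + c3*t^3 while penalizing the ODE residual."""
        y_fit = np.polyval(coeffs[::-1], t_data)
        data_term = np.mean((y_fit - y_data) ** 2)
        # Physics residual dy/dt + k*y evaluated at the collocation points
        y_c = np.polyval(coeffs[::-1], t_colloc)
        dy_c = np.polyval(np.polyder(coeffs[::-1]), t_colloc)
        physics_term = np.mean((dy_c + k_true * y_c) ** 2)
        return data_term + weight * physics_term

    result = minimize(loss, x0=np.zeros(4))
    print("fitted coefficients:", result.x)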
We present a high-throughput optogenetic illumination system capable of simultaneous closed-loop light delivery to specified targets in populations of moving Caenorhabditis elegans. The instrument addresses three technical challenges: it delivers targeted illumination to specified regions of the animal's body, such as its head or tail; it automatically delivers stimuli triggered upon the animal's behavior; and it achieves high throughput by targeting many animals simultaneously. The instrument was used to optogenetically probe the animal's behavioral response to competing mechanosensory stimuli in the anterior and posterior soft touch receptor neurons. Responses to more than $10^4$ stimulus events from a range of anterior-posterior intensity combinations were measured. The animal's probability of sprinting forward in response to a mechanosensory stimulus depended on both the anterior and posterior stimulation intensity, while the probability of reversing depended primarily on the posterior stimulation intensity. We also probed the animal's response to mechanosensory stimulation during the onset of turning, a relatively rare behavioral event, by delivering stimuli automatically when the animal began to turn. Using this closed-loop approach, over $10^3$ stimulus events were delivered during turning onset at a rate of 9.2 events per worm-hour, a greater than 25-fold increase in throughput compared to previous investigations. These measurements validate with greater statistical power previous findings that turning acts to gate mechanosensory-evoked reversals. Compared to previous approaches, the current system offers targeted optogenetic stimulation of specific body regions or behaviors with many-fold increases in throughput to better constrain quantitative models of sensorimotor processing.
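To illustrate the kind of dose-response analysis summarized above, the short sketch below tabulates the probability of each behavioral response for every anterior-posterior intensity combination; the trial record format and response labels are assumptions for illustration, not the study's actual data structures.

    from collections import defaultdict

    def response_probabilities(trials):
        """Estimate P(response | anterior, posterior intensity) from stimulus trials.

        Each trial is a dict like
        {"anterior": 0.5, "posterior": 1.0, "response": "reversal"}.
        Returns {(anterior, posterior): {response: probability}}.
        """
        counts = defaultdict(lambda: defaultdict(int))
        totals = defaultdict(int)
        for trial in trials:
            key = (trial["anterior"], trial["posterior"])
            counts[key][trial["response"]] += 1
            totals[key] += 1
        return {
            key: {resp: n / totals[key] for resp, n in resp_counts.items()}
            for key, resp_counts in counts.items()
        }

    # Example with a few synthetic trials
    trials = [
        {"anterior": 1.0, "posterior": 0.0, "response": "reversal"},
        {"anterior": 1.0, "posterior": 0.0, "response": "reversal"},
        {"anterior": 0.0, "posterior": 1.0, "response": "sprint"},
        {"anterior": 0.0, "posterior": 1.0, "response": "pause"},
    ]
    print(response_probabilities(trials))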
The roundworm C. elegans exhibits robust escape behavior in response to rapidly rising temperature. The behavior lasts for a few seconds, shows history dependence, involves both sensory and motor systems, and is too complicated to model mechanistically using currently available knowledge. Instead, we model the process phenomenologically, and we use the Sir Isaac dynamical inference platform to infer the model in a fully automated fashion directly from experimental data. The inferred model requires incorporation of an unobserved dynamical variable and is biologically interpretable. The model makes accurate predictions about the dynamics of the worm's behavior, and it can be used to characterize the functional logic of the dynamical system underlying the escape response. This work illustrates the power of modern artificial intelligence to aid in the discovery of accurate and interpretable models of complex natural systems.
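As a rough sketch of phenomenological dynamical modeling with an unobserved variable, the code below fits a two-variable ODE to an observed response trace while treating the second variable as hidden; the specific equations, parameters, and synthetic data are illustrative assumptions and are not the model inferred by Sir Isaac.

    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import minimize

    def model(state, t, params):
        """Two-variable phenomenological model: x is observed, h is hidden."""
        x, h = state
        a, b, c = params
        dxdt = -a * x + b * h          # observed response relaxes, driven by h
        dhdt = -c * h                  # hidden variable decays (e.g. adaptation)
        return [dxdt, dhdt]

    def fit_loss(params, t, x_observed):
        """Squared error between the observed trace and the model's x-component."""
        traj = odeint(model, [x_observed[0], 1.0], t, args=(params,))
        return np.mean((traj[:, 0] - x_observed) ** 2)

    # Generate a synthetic "escape response" trace with noise, then recover the parameters
    t = np.linspace(0, 10, 200)
    true_params = (1.0, 2.0, 0.5)
    x_true = odeint(model, [0.0, 1.0], t, args=(true_params,))[:, 0]
    x_observed = x_true + 0.02 * np.random.randn(len(t))
    result = minimize(fit_loss, x0=(0.5, 1.0, 1.0), args=(t, x_observed), method="Nelder-Mead")
    print("recovered parameters:", result.x)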