
Exploring vestibulo-ocular adaptation in a closed-loop neuro-robotic experiment using STDP. A simulation study

Posted by Niceto R. Luque
Publication date: 2020
Paper language: English





Studying and understanding the computational primitives of our neural system requires a diverse and complementary set of techniques. In this work, we use the Neuro-robotic Platform (NRP) to evaluate vestibulo-ocular cerebellar adaptation (vestibulo-ocular reflex, VOR) mediated by two STDP mechanisms located at the cerebellar molecular layer and the vestibular nuclei, respectively. This simulation study adopts an experimental setup (rotatory VOR) widely used by neuroscientists to better understand the contribution of specific cerebellar properties (i.e. distributed STDP, neural properties, coding cerebellar topology, etc.) to r-VOR adaptation. The work proposes and describes an embodiment solution in which we endow a simulated humanoid robot (iCub) with a spiking cerebellar model by means of the NRP, and we confront the humanoid with an r-VOR task. The results validate the adaptive capabilities of the spiking cerebellar model (with STDP) in a perception-action closed loop (r-VOR), causing the simulated iCub robot to mimic human behavior.
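As a rough illustration of the kind of plasticity involved, the following is a minimal pair-based STDP sketch in Python. The amplitudes, time constant, and the toy spike pairings are illustrative assumptions only; they do not reproduce the cerebellar kernels actually used in the NRP model.

```python
import numpy as np

# Hypothetical parameters; the paper's cerebellar model (run inside the NRP)
# uses its own plasticity kernels and time constants, not reproduced here.
A_LTP, A_LTD = 0.005, 0.006   # potentiation / depression amplitudes
TAU = 20.0                     # plasticity time window (ms)

def stdp_update(w, t_pre, t_post, w_min=0.0, w_max=1.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic spike, depress otherwise, and clip the weight to bounds."""
    dt = t_post - t_pre
    if dt > 0:
        dw = A_LTP * np.exp(-dt / TAU)
    else:
        dw = -A_LTD * np.exp(dt / TAU)
    return float(np.clip(w + dw, w_min, w_max))

# Toy pairings: in the real closed loop, retinal slip during head rotation
# would gate plasticity at the parallel-fibre/Purkinje-cell and
# vestibular-nuclei synapses.
w = 0.5
w = stdp_update(w, t_pre=10.0, t_post=14.0)   # causal pairing -> LTP
w = stdp_update(w, t_pre=22.0, t_post=18.0)   # anti-causal pairing -> LTD
print(f"weight after two pairings: {w:.4f}")
```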




Read also

Although neuromorphic engineering promises the deployment of low-latency, adaptive and low-power systems that can lead to the design of truly autonomous artificial agents, the development of a fully neuromorphic artificial agent is still missing. While neuromorphic sensing and perception, as well as decision-making systems, are now mature, the control and actuation part is lagging behind. In this paper, we present a closed-loop motor controller implemented on mixed-signal analog-digital neuromorphic hardware using a spiking neural network. The network performs a proportional control action by encoding target, feedback, and error signals using a spiking relational network. It continuously calculates the error through a connectivity pattern which relates the three variables by means of feed-forward connections. Recurrent connections within each population are used to speed up convergence, decrease the effect of mismatch, and improve selectivity. The neuromorphic motor controller is interfaced with the iCub robot simulator. We tested our spiking P controller in a single-joint control task, specifically for the robot head yaw. The spiking controller sends the target positions, reads the motor state from its encoder, and sends back the motor commands to the joint. The performance of the spiking controller is tested in a step-response experiment and in a target-pursuit task. In this work, we optimize the network structure to make it more robust to noisy inputs and device mismatch, which leads to better control performance.
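For intuition, here is a minimal rate-based sketch of the same closed control loop in Python. The paper implements the controller as a spiking relational network on neuromorphic hardware; the gain, plant model, and time step below are purely illustrative assumptions.

```python
# Minimal rate-based sketch of a closed-loop proportional controller.
KP = 0.8          # proportional gain (assumed)
DT = 0.01         # control period in seconds (assumed)

def plant_step(position, command):
    """Toy first-order model of the head-yaw joint driven by a velocity command."""
    return position + command * DT

def p_control(target, feedback):
    """Error = target - feedback; command proportional to the error."""
    return KP * (target - feedback)

target = 0.5       # desired yaw (rad), a step input as in the step-response test
position = 0.0
for _ in range(1000):                      # 10 s of simulated control
    command = p_control(target, position)
    position = plant_step(position, command)
print(f"yaw after 10 s: {position:.3f} rad (target {target} rad)")
```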
Stimulation of target neuronal populations using optogenetic techniques during specific sleep stages has begun to elucidate the mechanisms and effects of sleep. To conduct closed-loop optogenetic sleep studies in untethered animals, we designed a fully integrated, low-power system-on-chip (SoC) for real-time sleep stage classification and stage-specific optical stimulation. The SoC consists of a 4-channel analog front-end for recording polysomnography signals, a mixed-signal machine-learning (ML) core, and a 16-channel optical stimulation back-end. A novel ML algorithm and innovative circuit design techniques improved the online classification performance while minimizing power consumption. The SoC was designed and simulated in 180 nm CMOS technology. In an evaluation using an expert-labeled sleep database with 20 subjects, the SoC achieves a high sensitivity of 0.806 and a specificity of 0.947 in discriminating 5 sleep stages. Overall power consumption in continuous operation is 97 uW.
The dendritic cell algorithm is an immune-inspired technique for processing time-dependent data. Here we propose it as a possible solution for a robotic classification problem. The dendritic cell algorithm is implemented on a real robot, and an investigation is performed into the effects of varying the migration threshold median for the cell population. The algorithm performs well on a classification task with very little tuning. Ways of extending the implementation to allow it to be used as a classifier within the field of robotic security are suggested.
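As a schematic illustration of how the dendritic cell algorithm operates, the Python sketch below accumulates PAMP, danger, and safe signals until a per-cell migration threshold is crossed and then records whether the migrating cell's context is anomalous or normal. The signal weights, thresholds, and toy input stream are illustrative assumptions, not the values used in the robot experiment above.

```python
import random

class DendriticCell:
    def __init__(self, migration_threshold):
        self.threshold = migration_threshold
        self.csm = 0.0      # co-stimulatory signal (drives migration)
        self.k = 0.0        # context: >0 leans "anomalous", <0 leans "normal"

    def sample(self, pamp, danger, safe):
        """Accumulate one time step of input signals."""
        self.csm += pamp + danger + safe
        self.k += pamp + danger - 2.0 * safe

    def migrated(self):
        return self.csm >= self.threshold

def classify(signal_stream, population_size=10, median_threshold=5.0):
    """Return the fraction of migrated cells reporting an anomalous context."""
    cells = [DendriticCell(random.uniform(0.5, 1.5) * median_threshold)
             for _ in range(population_size)]
    anomalous = normal = 0
    for pamp, danger, safe in signal_stream:
        for cell in cells:
            cell.sample(pamp, danger, safe)
            if cell.migrated():
                if cell.k > 0:
                    anomalous += 1
                else:
                    normal += 1
                cell.__init__(cell.threshold)   # reset the cell and keep sampling
    total = anomalous + normal
    return anomalous / total if total else 0.0

# Toy stream: mostly "safe" signals with a burst of danger in the middle.
stream = [(0.0, 0.1, 1.0)] * 20 + [(0.5, 1.0, 0.1)] * 10 + [(0.0, 0.1, 1.0)] * 20
print(f"anomaly score: {classify(stream):.2f}")
```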
This paper develops a flexible and robust robotic system for autonomous drawing on 3D surfaces. The system takes 2D drawing strokes and a 3D target surface (mesh or point clouds) as input. It maps the 2D strokes onto the 3D surface and generates a robot motion to draw the mapped strokes using visual recognition, grasp pose reasoning, and motion planning. The system is flexible compared to conventional robotic drawing systems as we do not fix drawing tools to the end of a robot arm. Instead, the robot selects drawing tools using a vision system and holds them in its hand while painting. Meanwhile, despite this flexibility, the system achieves high robustness thanks to the following crafts: First, a high-quality mapping method is developed to minimize deformation in the strokes. Second, visual detection is used to re-estimate the drawing tool's pose before executing each drawing motion. Third, force control is employed to compensate for noisy visual detection and calibration and to ensure a firm touch between the pen tip and the target surface. Fourth, error detection and recovery are implemented to deal with unexpected problems. The planning and executions are performed in a closed-loop manner until the strokes are successfully drawn. We evaluate the system and analyze the necessity of the various crafts using different real-world tasks. The results show that the proposed system is flexible and robust enough to generate a robot motion from picking and placing the pens to successfully drawing 3D strokes on given surfaces.
We present a high-throughput optogenetic illumination system capable of simultaneous closed-loop light delivery to specified targets in populations of moving Caenorhabditis elegans. The instrument addresses three technical challenges: it delivers targeted illumination to specified regions of the animal's body such as its head or tail; it automatically delivers stimuli triggered upon the animal's behavior; and it achieves high throughput by targeting many animals simultaneously. The instrument was used to optogenetically probe the animal's behavioral response to competing mechanosensory stimuli in the anterior and posterior soft touch receptor neurons. Responses to more than $10^4$ stimulus events from a range of anterior-posterior intensity combinations were measured. The animal's probability of sprinting forward in response to a mechanosensory stimulus depended on both the anterior and posterior stimulation intensity, while the probability of reversing depended primarily on the posterior stimulation intensity. We also probed the animal's response to mechanosensory stimulation during the onset of turning, a relatively rare behavioral event, by delivering stimuli automatically when the animal began to turn. Using this closed-loop approach, over $10^3$ stimulus events were delivered during turning onset at a rate of 9.2 events per worm-hour, a greater than 25-fold increase in throughput compared to previous investigations. These measurements validate with greater statistical power previous findings that turning acts to gate mechanosensory evoked reversals. Compared to previous approaches, the current system offers targeted optogenetic stimulation to specific body regions or behaviors with many-fold increases in throughput to better constrain quantitative models of sensorimotor processing.