
Artificial SA-I and RA-I Afferents for Tactile Sensing of Ridges and Gratings

Added by Nicholas Pestell
Publication date: 2021
Research language: English





For robot touch to converge with the human sense of touch, artificial transduction should involve biologically-plausible population codes analogous to those of natural afferents. Using a biomimetic tactile sensor with 3D-printed skin based on the dermal-epidermal boundary, we propose two novel feature sets to mimic slowly-adapting and rapidly-adapting type-I tactile mechanoreceptor function. Their plausibility is tested with three classic experiments from the study of natural touch: impingement on a flat plate to probe adaptation and spatial modulation; stimulation by spatially-complex ridged stimuli to probe single afferent responses; and perception of grating orientation to probe the population response. Our results show a match between artificial and natural afferent responses in their sensitivity to edges and gaps; likewise, the human and robot psychometric functions match for grating orientation. These findings could benefit robot manipulation, prosthetics and the neurophysiology of touch.
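As a rough illustration of the two feature sets described above, the sketch below derives a sustained (SA-I-like) and a transient (RA-I-like) population response from tracked skin-marker displacements. The function name, array shapes, and the rectified-derivative definition of the transient channel are illustrative assumptions, not the paper's actual feature definitions.

```python
import numpy as np

def afferent_features(marker_positions, rest_positions, dt):
    """SA-I-like and RA-I-like population features from a time series
    of tracked skin-marker positions (hypothetical formulation).

    marker_positions: (T, N, 2) marker xy coordinates over T frames
    rest_positions:   (N, 2) marker positions with no contact
    dt:               sampling interval in seconds
    """
    # Displacement of each marker from its rest position, per frame.
    disp = np.linalg.norm(marker_positions - rest_positions, axis=2)  # (T, N)

    # SA-I-like channel: sustained, tracks static skin deformation.
    sa1 = disp

    # RA-I-like channel: transient, tracks the (rectified) rate of
    # change of deformation, so it adapts away under static contact.
    ra1 = np.abs(np.diff(disp, axis=0)) / dt                          # (T-1, N)
    return sa1, ra1

# Example with arbitrary numbers: 100 frames, 127 markers, 100 Hz.
rng = np.random.default_rng(0)
rest = rng.uniform(0.0, 10.0, size=(127, 2))
frames = rest + 0.1 * rng.standard_normal((100, 127, 2))
sa1, ra1 = afferent_features(frames, rest, dt=0.01)
print(sa1.shape, ra1.shape)  # (100, 127) (99, 127)
```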




Read More

U. Dammalapati, K. Jungmann, 2016
Energy levels, wavelengths, lifetimes and hyperfine structure constants for the isotopes of the first and second spectra of radium, Ra I and Ra II, have been compiled. Wavelengths and wave numbers are tabulated for ²²⁶Ra and for other Ra isotopes. Isotope shifts and hyperfine structure constants of even- and odd-A isotopes of the neutral radium atom and singly ionized radium are included. Experimental lifetimes of the states for both neutral and ionic Ra are also added, where available. The information is beneficial for present and future experiments aimed at different physics motivations using neutral Ra and singly ionized Ra.
This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using the NeuTouch and Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent, power-efficient robot systems.
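To make the event-driven fusion idea concrete, here is a toy late-fusion sketch that bins a visual and a tactile event stream into spike-count histograms and concatenates them into one feature vector. The channel counts and binning scheme are assumptions for illustration; the actual VT-SNN trains spiking layers end-to-end on raw events rather than hand-binned counts.

```python
import numpy as np

def fuse_spike_counts(visual_events, tactile_events, n_bins, duration,
                      n_vis_channels=4, n_tac_channels=4):
    """Concatenate per-channel spike-count histograms of two event
    streams; each stream is a list of (timestamp, channel) pairs."""
    def bin_stream(events, n_channels):
        counts = np.zeros((n_bins, n_channels))
        for t, ch in events:
            b = min(int(t / duration * n_bins), n_bins - 1)
            counts[b, ch] += 1
        return counts

    v = bin_stream(visual_events, n_vis_channels)   # e.g. coarse pixels
    u = bin_stream(tactile_events, n_tac_channels)  # e.g. taxels
    return np.concatenate([v.ravel(), u.ravel()])

feat = fuse_spike_counts([(0.01, 0), (0.50, 3)], [(0.20, 1)],
                         n_bins=5, duration=1.0)
print(feat.shape)  # (40,)
```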
To perform complex tasks, robots must be able to interact with and manipulate their surroundings. One of the key challenges in accomplishing this is robust state estimation during physical interactions, where the state involves not only the robot and the object being manipulated, but also the state of the contact itself. In this work, within the context of planar pushing, we extend previous inference-based approaches to state estimation in several ways. We estimate the robot, object, and contact state on multiple manipulation platforms configured with a vision-based articulated model tracker, and either a biomimetic tactile sensor or a force-torque sensor. We show how to fuse raw measurements from the tracker and tactile sensors to jointly estimate the trajectory of the kinematic states and the forces in the system via probabilistic inference on factor graphs, in both batch and incremental settings. We perform several benchmarks with our framework and show how performance is affected by incorporating various geometric and physics-based constraints, occluding vision sensors, or injecting noise in tactile sensors. We also compare with prior work on multiple datasets and demonstrate that our approach can effectively optimize over multi-modal sensor data and reduce uncertainty to find better state estimates.
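In the linear-Gaussian special case, the batch inference described above collapses to a single weighted least-squares solve over the whole trajectory. The 1-D toy below fuses two noisy position sensors with a constant-position motion prior; the sensor names and noise levels are illustrative assumptions, not the paper's actual factor graph.

```python
import numpy as np

def fuse_batch(z_vision, z_tactile, sigma_v, sigma_t, sigma_m):
    """Toy 1-D batch smoother: MAP-estimate positions x_0..x_{T-1}
    from two noisy sensors plus a smoothness (motion) prior."""
    T = len(z_vision)
    rows, rhs, wts = [], [], []
    # Unary measurement factors from both sensors.
    for t in range(T):
        for z, s in ((z_vision[t], sigma_v), (z_tactile[t], sigma_t)):
            r = np.zeros(T); r[t] = 1.0
            rows.append(r); rhs.append(z); wts.append(1.0 / s)
    # Binary motion factors: x_{t+1} - x_t should stay near zero.
    for t in range(T - 1):
        r = np.zeros(T); r[t], r[t + 1] = -1.0, 1.0
        rows.append(r); rhs.append(0.0); wts.append(1.0 / sigma_m)
    w = np.array(wts)
    A = np.array(rows) * w[:, None]   # whitened design matrix
    b = np.array(rhs) * w             # whitened targets
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

print(fuse_batch([0.0, 1.1, 2.0], [0.2, 0.9, 2.1], 0.3, 0.2, 0.5))
```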
Humans display the remarkable ability to sense the world through tools and other held objects. For example, we are able to pinpoint impact locations on a held rod and tell apart different textures using a rigid probe. In this work, we consider how we can enable robots to have a similar capacity, i.e., to embody tools and extend perception using standard grasped objects. We propose that vibro-tactile sensing using dynamic tactile sensors on the robot fingers, along with machine learning models, enables robots to decipher contact information that is transmitted as vibrations along rigid objects. This paper reports on extensive experiments using the BioTac micro-vibration sensor and a new event dynamic sensor, the NUSkin, capable of multi-taxel sensing at 4 kHz. We demonstrate that fine localization on a held rod is possible using our approach (with errors less than 1 cm on a 20 cm rod). Next, we show that vibro-tactile perception can lead to reasonable grasp stability prediction during object handover, and accurate food identification using a standard fork. We find that multi-taxel vibro-tactile sensing at a sufficiently high sampling rate (above 2 kHz) leads to the best performance across the various tasks and objects. Taken together, our results provide both evidence and guidelines for using vibro-tactile sensing to extend perception through held objects, which we believe will lead to enhanced competency with tools and better physical human-robot interaction.
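As a sketch of one plausible front end for such pipelines, the code below summarizes a vibration trace as log band-energies of its spectrum, features that could feed any downstream classifier for contact localization or food identification. The band layout and cutoff frequency are assumptions; the paper's actual models may differ.

```python
import numpy as np

def spectral_features(signal, fs, n_bands=8, fmax=2000.0):
    """Log-energy in n_bands linear frequency bands up to fmax."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    edges = np.linspace(0.0, fmax, n_bands + 1)
    feats = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = spec[(freqs >= lo) & (freqs < hi)]
        feats.append(np.log(band.sum() + 1e-12))  # avoid log(0)
    return np.array(feats)

# Example: a 4 kHz recording of a 300 Hz vibration plus noise.
fs = 4000
t = np.arange(0, 0.25, 1.0 / fs)
x = np.sin(2 * np.pi * 300 * t) \
    + 0.1 * np.random.default_rng(0).standard_normal(t.size)
print(spectral_features(x, fs))  # energy peaks in the 250-500 Hz band
```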
There is a wide range of features that tactile contact provides, each carrying different aspects of information that can be used for object grasping, manipulation, and perception. In this paper, inference of several key tactile features (tip displacement, contact location, and shear direction and magnitude) is demonstrated by introducing a novel method of transducing a third dimension to the sensor data via Voronoi tessellation. The inferred features are displayed throughout the work in a new visualisation mode derived from the Voronoi tessellation; these visualisations make it easier to interpret data from an optical tactile sensor that measures local shear from the displacement of internal pins (the TacTip). The output values of tip displacement and shear magnitude are calibrated to appropriate mechanical units and validate the direction of shear inferred from the sensor. We show that these methods can infer the direction of shear to ~2.3° without the need for training a classifier or regressor. The approach demonstrated here will increase the versatility and generality of the sensors and thus allow them to be used in more unstructured and unknown environments, as well as improve the use of these tactile sensors in more complex systems such as robot hands.
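As a minimal illustration of classifier-free shear inference, the sketch below estimates the global shear direction as the angle of the mean pin-displacement vector. This simplification is an assumption on our part and omits the Voronoi-derived third dimension that the paper introduces.

```python
import numpy as np

def shear_direction(pins_now, pins_rest):
    """Global shear direction in degrees, taken as the angle of the
    mean pin-displacement vector; no trained model required."""
    d = (pins_now - pins_rest).mean(axis=0)
    return np.degrees(np.arctan2(d[1], d[0]))

rest = np.random.default_rng(1).uniform(0.0, 10.0, size=(127, 2))
sheared = rest + np.array([0.3, 0.1])            # uniform applied shear
print(round(shear_direction(sheared, rest), 1))  # ~18.4 degrees
```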
