
Extrinsic Contact Sensing with Relative-Motion Tracking from Distributed Tactile Measurements

Published by Daolin Ma
Publication date: 2021
Research field: Informatics Engineering
Language: English





This paper addresses the localization of contacts of an unknown grasped rigid object with its environment, i.e., extrinsic to the robot. We explore the key role that distributed tactile sensing plays in localizing contacts external to the robot, in contrast to the role that aggregated force/torque measurements play in localizing contacts on the robot. When in contact with the environment, an object will move in accordance with the kinematic and possibly frictional constraints imposed by that contact. Small motions of the object, which are observable with tactile sensors, indirectly encode those constraints and the geometry that defines them. We formulate the extrinsic contact sensing problem as a constraint-based estimation. The estimation is subject to the kinematic constraints imposed by the tactile measurements of object motion, as well as the kinematic (e.g., non-penetration) and possibly frictional (e.g., sticking) constraints imposed by rigid-body mechanics. We validate the approach in simulation and with real experiments on the case studies of fixed point and line contacts. This paper discusses the theoretical basis for the value of distributed tactile sensing in contrast to aggregated force/torque measurements. It also provides an estimation framework for localizing environmental contacts with potential impact in contact-rich manipulation scenarios such as assembling or packing.
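To make the constraint-based formulation concrete, here is a minimal sketch (not the paper's implementation) for the fixed-point case: a sticking point contact at position p pins the object's velocity there, so each small object twist (v, ω) recovered from the tactile sensor must satisfy v + ω × p = 0, and stacking that constraint over several measured twists yields a linear least-squares problem in p.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix so that skew(w) @ p == np.cross(w, p)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def estimate_fixed_contact_point(twists):
    """Least-squares estimate of a fixed extrinsic contact point p.

    `twists` is a list of (v, w) pairs: small object motions (linear,
    angular) assumed recovered from the distributed tactile measurements.
    A sticking point contact at p pins the object velocity there:
        v + w x p = 0   =>   skew(w) @ p = -v.
    Stacking one such constraint per measured twist gives an
    overdetermined linear system in p.
    """
    A = np.vstack([skew(np.asarray(w)) for _, w in twists])
    b = np.hstack([-np.asarray(v) for v, _ in twists])
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Synthetic check: motions pivoting about p_true yield v = -w x p_true,
# so the estimator should recover p_true from a handful of twists.
rng = np.random.default_rng(0)
p_true = np.array([0.05, -0.02, 0.10])
twists = [(-np.cross(w, p_true), w) for w in rng.normal(size=(5, 3))]
print(estimate_fixed_contact_point(twists))  # ~ [0.05, -0.02, 0.10]
```

With at least two twists whose angular velocities are not parallel, the stacked system has a unique least-squares solution for p; richer contacts such as lines would add the further rigid-body constraints discussed above.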




Read also

Robotic touch, particularly when using soft optical tactile sensors, suffers from distortion caused by motion-dependent shear. The manner in which the sensor contacts a stimulus is entangled with the tactile information about the geometry of the stimulus. In this work, we propose a supervised convolutional deep neural network model that learns to disentangle, in the latent space, the components of sensor deformation caused by contact geometry from those caused by sliding-induced shear. The approach is validated by reconstructing unsheared tactile images from sheared images and showing that they match unsheared tactile images collected with no sliding motion. In addition, the unsheared tactile images give a faithful reconstruction of the contact geometry that is not possible from the sheared data, and a robust estimate of the contact pose that can be used for servo-controlled sliding around various 2D shapes. Finally, the contact geometry reconstruction, in conjunction with servo-controlled sliding, was used for faithful full reconstruction of various 2D object shapes. The methods have broad applicability to deep learning models for robots with a shear-sensitive sense of touch.
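A minimal sketch of this kind of latent-space disentanglement, with an assumed toy architecture and a 64x64 image size rather than the authors' actual network: the encoder's latent code is split into geometry and shear components, and the decoder is supervised to reconstruct the paired unsheared image from the geometry component alone.

```python
import torch
import torch.nn as nn

class ShearDisentangler(nn.Module):
    """Toy encoder-decoder that splits the latent code into a contact-
    geometry part and a shear part. Supervision pairs each sheared
    tactile image with its unsheared counterpart (collected without
    sliding), and the decoder sees only the geometry component."""

    def __init__(self, z_geom=32, z_shear=8):
        super().__init__()
        self.z_geom = z_geom
        self.encoder = nn.Sequential(                          # 1x64x64 in
            nn.Conv2d(1, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * 16 * 16, z_geom + z_shear),
        )
        self.decoder = nn.Sequential(
            nn.Linear(z_geom, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, sheared_img):
        z = self.encoder(sheared_img)
        z_geometry = z[:, :self.z_geom]   # shear lives in z[:, self.z_geom:]
        return self.decoder(z_geometry)   # reconstructed unsheared image

# One supervised training step on a placeholder batch of paired images.
model = ShearDisentangler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sheared = torch.rand(4, 1, 64, 64)
unsheared = torch.rand(4, 1, 64, 64)
opt.zero_grad()
loss = nn.functional.mse_loss(model(sheared), unsheared)
loss.backward()
opt.step()
```

A second head regressing the shear from the remaining latent dimensions would complete the separation; it is omitted here for brevity.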
Rotational displacement about the grasping point is a common grasp failure when an object is grasped at a location away from its center of gravity. Tactile sensors with soft surfaces, such as GelSight sensors, can detect the rotation patterns on the contacting surfaces when the object rotates. In this work, we propose a model-based algorithm that detects those rotational patterns and measures rotational displacement using the GelSight sensor. We also integrate the rotation detection feedback into a closed-loop regrasping framework, which detects rotational grasp failure at an early stage and drives the robot to a stable grasp pose. We validate the proposed rotation detection algorithm and grasp-regrasp system on a self-collected dataset and in online experiments, showing that our approach accurately detects rotation and increases grasp stability.
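One simple model-based way to measure such rotational displacement, offered as a hypothetical stand-in for the paper's detector and assuming marker positions are already tracked in the GelSight image, is a closed-form 2D Procrustes fit of the marker field before and after the motion; the recovered angle can then trigger regrasping once it crosses a threshold.

```python
import numpy as np

def planar_rotation_from_markers(p, q):
    """In-plane rotation angle (radians) that best maps marker positions
    p (N x 2, before) onto q (N x 2, after): a 2D Procrustes fit on the
    marker displacement field, with translation removed first."""
    pc, qc = p - p.mean(axis=0), q - q.mean(axis=0)
    M = pc.T @ qc                     # 2x2 cross-covariance
    # tr(R M) is maximised at this angle: the closed-form 2D solution.
    return np.arctan2(M[0, 1] - M[1, 0], M[0, 0] + M[1, 1])

# Regrasp trigger sketch: rotate markers 3 degrees about their centroid
# and flag failure once the measured slip angle exceeds a threshold.
theta = np.radians(3.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
markers = np.random.default_rng(1).uniform(-1, 1, size=(50, 2))
angle = planar_rotation_from_markers(markers, markers @ R.T)
needs_regrasp = abs(np.degrees(angle)) > 2.0   # e.g. 2-degree threshold
```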
There is a wide range of features that tactile contact provides, each with different aspects of information that can be used for object grasping, manipulation, and perception. In this paper, inference of some key tactile features (tip displacement, contact location, shear direction and magnitude) is demonstrated by introducing a novel method of transducing a third dimension to the sensor data via Voronoi tessellation. The inferred features are displayed throughout the work in a new visualisation mode derived from the Voronoi tessellation; these visualisations make it easier to interpret data from an optical tactile sensor that measures local shear from the displacement of internal pins (the TacTip). The output values of tip displacement and shear magnitude are calibrated to appropriate mechanical units, and the direction of shear inferred from the sensor is validated. We show that these methods can infer the direction of shear to $\sim$2.3$^{\circ}$ without the need to train a classifier or regressor. The approach demonstrated here will increase the versatility and generality of the sensors, allowing them to be used in more unstructured and unknown environments, and will improve their use in more complex systems such as robot hands.
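A rough sketch of the Voronoi idea under stated assumptions (2D pin centroids already extracted from TacTip frames; SciPy as a stand-in for whatever tessellation the authors used): each pin's Voronoi cell area serves as the transduced third dimension, and the mean pin displacement gives the shear direction with no trained classifier or regressor.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_cell_areas(pins):
    """Area of the Voronoi cell around each 2D pin position. Growth or
    shrinkage of a cell acts as the transduced third dimension (local
    contact intensity); unbounded border cells are returned as NaN."""
    vor = Voronoi(pins)
    areas = np.full(len(pins), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:
            continue                                  # open cell at the rim
        areas[i] = ConvexHull(vor.vertices[region]).volume  # 2D volume = area
    return areas

def shear_direction_deg(pins_rest, pins_now):
    """Global shear direction as the angle of the mean pin displacement,
    with no trained classifier or regressor involved."""
    dx, dy = (pins_now - pins_rest).mean(axis=0)
    return np.degrees(np.arctan2(dy, dx))
```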
In this paper, we present an approach to tactile pose estimation from the first touch for known objects. First, we create an object-agnostic map from real tactile observations to contact shapes. Next, for a new object with known geometry, we learn a tailored perception model completely in simulation. To do so, we simulate the contact shapes that a dense set of object poses would produce on the sensor. Then, given a new contact shape obtained from the sensor output, we match it against the pre-computed set using the object-specific embedding learned purely in simulation using contrastive learning. This results in a perception model that can localize objects from a single tactile observation. It also allows reasoning over pose distributions and including additional pose constraints coming from other perception systems or multiple contacts. We provide quantitative results for four objects. Our approach provides high accuracy pose estimations from distinctive tactile observations while regressing pose distributions to account for those contact shapes that could result from different object poses. We further extend and test our approach in multi-contact scenarios where several tactile sensors are simultaneously in contact with the object. Website: http://mcube.mit.edu/research/tactile_loc_first_touch.html
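The matching step might look like the following sketch, where all names and tensor shapes are illustrative rather than the authors' code: a contrastively trained embedding maps the observed contact shape into the same space as the precomputed simulated set, and a temperature-scaled softmax over cosine similarities yields a distribution over candidate poses that can be fused with other contacts or priors.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def localize_from_touch(embed, contact_img, sim_embeddings, sim_poses, temp=0.1):
    """Match one observed contact shape against a precomputed simulated set.

    embed          -- trained embedding network (contrastively learned)
    contact_img    -- (1, 1, H, W) contact shape from the sensor
    sim_embeddings -- (K, D) unit-norm embeddings of simulated contact shapes
    sim_poses      -- (K, 6) object poses that generated them
    Returns the best-matching pose and a distribution over all K poses.
    """
    z = F.normalize(embed(contact_img), dim=-1)   # (1, D), unit norm
    sims = sim_embeddings @ z.squeeze(0)          # (K,) cosine similarities
    weights = F.softmax(sims / temp, dim=0)       # distribution over poses
    return sim_poses[weights.argmax()], weights

# Placeholder check with a random "embedding network" and K = 500
# simulated poses; in the paper the embedding is trained contrastively.
embed = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 64))
sim_embeddings = F.normalize(torch.randn(500, 64), dim=-1)
sim_poses = torch.randn(500, 6)
pose, weights = localize_from_touch(embed, torch.rand(1, 1, 32, 32),
                                    sim_embeddings, sim_poses)
```

Because the output is a distribution rather than a single pose, one natural multi-contact fusion rule is to multiply and renormalize the per-sensor weight vectors.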
Nathan F. Lepora (2021)
Reproducing the capabilities of the human sense of touch in machines is an important step in enabling robot manipulation to have the ease of human dexterity. A combination of robotic technologies will be needed, including soft robotics, biomimetics and the high-resolution sensing offered by optical tactile sensors. This combination is considered here as a SoftBOT (Soft Biomimetic Optical Tactile) sensor. This article reviews the BRL TacTip as a prototypical example of such a sensor. Topics include the relation between artificial skin morphology and the transduction principles of human touch, the nature and benefits of tactile shear sensing, 3D printing for fabrication and integration into robot hands, the application of AI to tactile perception and control, and the recent step-change in capabilities due to deep learning. This review consolidates those advances from the past decade to indicate a path for robots to reach human-like dexterity.