
Soft Biomimetic Optical Tactile Sensing with the TacTip: A Review

Published by: Nathan Lepora
Publication date: 2021
Research field: Informatics Engineering
Language: English
Author: Nathan F. Lepora





Reproducing the capabilities of the human sense of touch in machines is an important step in enabling robot manipulation to have the ease of human dexterity. A combination of robotic technologies will be needed, including soft robotics, biomimetics and the high-resolution sensing offered by optical tactile sensors. This combination is considered here as a SoftBOT (Soft Biomimetic Optical Tactile) sensor. This article reviews the BRL TacTip as a prototypical example of such a sensor. Topics include the relation between artificial skin morphology and the transduction principles of human touch, the nature and benefits of tactile shear sensing, 3D printing for fabrication and integration into robot hands, the application of AI to tactile perception and control, and the recent step-change in capabilities due to deep learning. This review consolidates those advances from the past decade to indicate a path for robots to reach human-like dexterity.


Read also

We present a modified TacTip biomimetic optical tactile sensor design which demonstrates the ability to induce and detect incipient slip, as confirmed by recording the movement of markers on the sensor's external surface. Incipient slip is defined as slippage of part, but not all, of the contact surface between the sensor and object. The addition of ridges, which mimic the friction ridges in the human fingertip, arranged in a concentric ring pattern, allowed localised shear deformation to occur on the sensor surface for a significant duration prior to the onset of gross slip. By detecting incipient slip we were able to predict when several differently shaped objects were at risk of falling and prevent them from doing so. Detecting incipient slip is useful because a corrective action can be taken before slippage occurs across the entire contact area, thus minimising the risk of objects being dropped.
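The partial-slip criterion described above can be sketched as a simple classifier over marker displacements: incipient slip corresponds to some, but not all, markers moving beyond a threshold. This is an illustrative sketch, not the paper's implementation; the function name, thresholds, and fractions are assumptions.

```python
import numpy as np

def detect_incipient_slip(markers_prev, markers_curr,
                          disp_thresh=0.5, frac_low=0.1, frac_high=0.9):
    """Classify the contact state from marker positions (N x 2 arrays, pixels).

    Incipient slip: part of the contact surface has slipped, i.e. some markers
    moved beyond disp_thresh while others stayed put. Thresholds are
    illustrative assumptions, not calibrated values from the paper.
    """
    disps = np.linalg.norm(markers_curr - markers_prev, axis=1)
    moved = np.mean(disps > disp_thresh)  # fraction of markers that slipped
    if moved < frac_low:
        return "stuck"
    if moved > frac_high:
        return "gross_slip"
    return "incipient_slip"
```

A corrective regrasp action would then be triggered on the "incipient_slip" state, before gross slip occurs.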
Zeyi Yang, Sheng Ge, Fang Wan (2020)
Robotic fingers made of soft material and compliant structures usually lead to superior adaptation when interacting with the unstructured physical environment. In this paper, we present an embedded sensing solution using optical fibers for an omni-adaptive soft robotic finger with exceptional adaptation in all directions. In particular, we managed to insert a pair of optical fibers inside the finger's structural cavity without interfering with its adaptive performance. The resultant integration is scalable as a versatile, low-cost, and moisture-proof solution for physically safe human-robot interaction. In addition, we experimented with our finger design on an object sorting task, identifying the sectional diameters of 94% of objects within a ±6 mm error and measuring 80% of the structural strains within a ±0.1 mm/mm error. The proposed sensor design opens many doors for future applications of soft robotics in scalable and adaptive physical interactions in unstructured environments.
Rotational displacement about the grasping point is a common grasp failure when an object is grasped at a location away from its center of gravity. Tactile sensors with soft surfaces, such as GelSight sensors, can detect the rotation patterns on the contacting surfaces when the object rotates. In this work, we propose a model-based algorithm that detects those rotational patterns and measures rotational displacement using the GelSight sensor. We also integrate the rotation detection feedback into a closed-loop regrasping framework, which detects the rotational failure of grasp in an early stage and drives the robot to a stable grasp pose. We validate our proposed rotation detection algorithm and grasp-regrasp system on a self-collected dataset and in online experiments to show how our approach accurately detects the rotation and increases grasp stability.
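The in-plane rotation of a contact patch can be estimated from marker positions on the sensor surface; one standard route is the 2-D Procrustes/Kabsch solution. This is a hedged sketch of that generic technique, not the paper's model-based algorithm:

```python
import numpy as np

def estimate_rotation(markers_ref, markers_cur):
    """Estimate in-plane rotation (radians) of a contact patch from marker
    positions (N x 2 arrays), via the 2-D Procrustes/Kabsch closed form.

    A generic illustration; the reviewed paper's own algorithm may differ.
    """
    p = markers_ref - markers_ref.mean(axis=0)  # center both point sets
    q = markers_cur - markers_cur.mean(axis=0)
    h = p.T @ q  # 2x2 cross-covariance of the centered marker sets
    # Optimal rotation angle from the cross-covariance entries
    return np.arctan2(h[0, 1] - h[1, 0], h[0, 0] + h[1, 1])
```

A regrasp controller could monitor this angle and trigger a corrective grasp once it exceeds a tolerance.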
This paper proposes a controller for stable grasping of unknown-shaped objects by two robotic fingers with tactile fingertips. The grasp is stabilised by rolling the fingertips on the contact surface and applying a desired grasping force to reach an equilibrium state. The validation is both in simulation and on a fully-actuated robot hand (the Shadow Modular Grasper) fitted with custom-built optical tactile sensors (based on the BRL TacTip). The controller requires the orientations of the contact surfaces, which are estimated by regressing a deep convolutional neural network over the tactile images. Overall, the grasp system is demonstrated to achieve stable equilibrium poses on various objects ranging in shape and softness, with the system being robust to perturbations and measurement errors. This approach also has promise to extend beyond grasping to stable in-hand object manipulation with multiple fingers.
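The control idea above, rolling the fingertips on the contact surface while applying a desired grasping force, can be sketched as two proportional terms: one that rolls the fingertip toward the estimated surface orientation and one that adjusts grip until the target force is reached. All names, gains, and targets here are illustrative assumptions, not the paper's controller:

```python
def grasp_control_step(theta_est, force_est,
                       theta_target=0.0, force_target=2.0,
                       k_roll=0.5, k_force=0.1):
    """One proportional-control step toward a grasp equilibrium.

    theta_est: estimated contact-surface orientation (rad), e.g. regressed by
    a deep CNN over tactile images as in the paper; force_est: estimated
    normal force (N). Gains and targets are hypothetical values.
    Returns (fingertip roll increment, grip closure increment).
    """
    d_roll = -k_roll * (theta_est - theta_target)   # roll flat onto surface
    d_grip = k_force * (force_target - force_est)   # close until target force
    return d_roll, d_grip
```

Iterating this step on both fingers would drive the grasp toward the equilibrium state described in the abstract.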
To perform complex tasks, robots must be able to interact with and manipulate their surroundings. One of the key challenges in accomplishing this is robust state estimation during physical interactions, where the state involves not only the robot and the object being manipulated, but also the state of the contact itself. In this work, within the context of planar pushing, we extend previous inference-based approaches to state estimation in several ways. We estimate the robot, object, and the contact state on multiple manipulation platforms configured with a vision-based articulated model tracker, and either a biomimetic tactile sensor or a force-torque sensor. We show how to fuse raw measurements from the tracker and tactile sensors to jointly estimate the trajectory of the kinematic states and the forces in the system via probabilistic inference on factor graphs, in both batch and incremental settings. We perform several benchmarks with our framework and show how performance is affected by incorporating various geometric and physics based constraints, occluding vision sensors, or injecting noise in tactile sensors. We also compare with prior work on multiple datasets and demonstrate that our approach can effectively optimize over multi-modal sensor data and reduce uncertainty to find better state estimates.
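The fusion idea above, combining vision-tracker and tactile measurements with motion constraints in a factor graph, can be illustrated with a toy 1-D batch problem: unary measurement factors from each sensor plus binary smoothness factors between consecutive states, solved as weighted linear least squares. This is a stand-in sketch with assumed noise parameters, not the paper's framework:

```python
import numpy as np

def fuse_trajectory(z_vision, z_tactile,
                    sigma_v=1.0, sigma_t=0.5, sigma_motion=0.2):
    """Batch MAP estimate of a 1-D object trajectory from two sensor streams.

    Each residual row is weighted by the inverse noise scale: unary factors
    tie each state to its vision and tactile measurements, and binary motion
    factors keep consecutive states close. Noise values are assumptions.
    """
    n = len(z_vision)
    rows, rhs = [], []
    for i in range(n):
        row = np.zeros(n); row[i] = 1.0 / sigma_v      # vision factor
        rows.append(row); rhs.append(z_vision[i] / sigma_v)
        row = np.zeros(n); row[i] = 1.0 / sigma_t      # tactile factor
        rows.append(row); rhs.append(z_tactile[i] / sigma_t)
    for i in range(n - 1):                              # motion prior
        row = np.zeros(n)
        row[i], row[i + 1] = -1.0 / sigma_motion, 1.0 / sigma_motion
        rows.append(row); rhs.append(0.0)
    x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return x
```

Real factor-graph solvers exploit the same sparsity incrementally rather than re-solving the batch problem each step.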