
Design of an Optoelectronically Innervated Gripper for Rigid-Soft Interactive Grasping

Posted by Chaoyang Song
Publication date: 2020
Research field: Informatics Engineering
Paper language: English





Over the past few decades, efforts have been made towards robust robotic grasping and, in turn, dexterous manipulation. Soft grippers have shown their potential for robust grasping due to their inherent properties: low control complexity and high adaptability. However, the deformation of a soft gripper when interacting with objects introduces inaccuracy in the grasp, which causes instability for robust grasping and further manipulation. In this paper, we present an omni-directional adaptive soft finger that senses its own deformation through embedded optical fibers, applying machine learning methods to interpret the transmitted light intensities. Furthermore, to make use of the tactile information provided by the soft finger, we design a low-cost, multi-degree-of-freedom gripper that actively conforms to the shape of objects and optimizes the grasping policy, an approach we call Rigid-Soft Interactive Grasping. This grasping policy provides two main advantages: more robust grasping can be achieved through active adaptation, and the collected tactile information can support further manipulation.
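As a rough, hypothetical illustration of the sensing step described in the abstract (not the paper's actual pipeline), the sketch below fits a small regression model that maps the light intensities transmitted through the embedded fibers to the finger's deformation; the file names, array shapes, and model choice are assumptions.

# Minimal sketch (not the authors' implementation): map the light intensities
# read from the embedded optical fibers to the finger's deformation state with
# a small regression model. File names and array shapes are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical calibration data: fiber intensity readings and the measured
# bending/twisting of the finger recorded at the same time steps.
intensities = np.load("fiber_intensities.npy")   # shape (N, n_fibers), assumed file
deformation = np.load("finger_deformation.npy")  # shape (N, n_dof), assumed file

X_train, X_test, y_train, y_test = train_test_split(
    intensities, deformation, test_size=0.2, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))

# At run time, the gripper would query the model with live fiber readings to
# estimate how each finger has deformed around the grasped object.
print("estimated deformation:", model.predict(X_test[:1]))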




Read also

Inspired by the widespread use of soft fingers in grasping, we propose a method of rigid-soft interactive learning, aiming at reducing the time needed for data collection. In this paper, we classify the interaction categories into Rigid-Rigid, Rigid-Soft, and Soft-Rigid according to the interaction surface between grippers and target objects. We find experimental evidence that the interaction type between grippers and target objects plays an essential role in the learning methods. We use soft, stuffed toys for training, instead of everyday objects, to reduce the integration complexity and computational burden, and we exploit such rigid-soft interaction by switching to soft gripper fingers when dealing with rigid, daily-life items such as the Yale-CMU-Berkeley (YCB) objects. With a small data collection of 5K picking attempts in total, our results suggest that such Rigid-Soft and Soft-Rigid interactions are transferable. Moreover, combining different grasp types shows better performance on the grasping test. We achieve the best grasping performance of 97.5% for easy YCB objects and 81.3% for difficult YCB objects when using a precision grasp with a two-soft-finger gripper to collect training data and a power grasp with a four-soft-finger gripper for testing.
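As a loose illustration of the transfer experiment described above (not the authors' learning pipeline), the sketch below trains a grasp-success predictor on picking attempts from one interaction type and evaluates it on attempts from another; the file names, feature layout, and classifier are hypothetical.

# Rough illustration only: train a grasp-success predictor on attempts logged
# with one interaction type (e.g. gripping soft toys) and evaluate how well it
# transfers to another (e.g. soft fingers on rigid YCB objects).
import numpy as np
from sklearn.linear_model import LogisticRegression

def load_attempts(path):
    """Each row: flattened grasp features, with the success label in the last column."""
    data = np.load(path)
    return data[:, :-1], data[:, -1]

X_train, y_train = load_attempts("rigid_soft_train_toys.npy")  # assumed training log
X_test, y_test = load_attempts("soft_rigid_test_ycb.npy")      # assumed test log

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy on the unseen interaction type:", clf.score(X_test, y_test))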
This paper aims to improve robots' versatility and adaptability by allowing them to use a large variety of end-effector tools and quickly adapt to new tools. We propose AdaGrasp, a method to learn a single grasping policy that generalizes to novel grippers. By training on a large collection of grippers, our algorithm is able to acquire generalizable knowledge of how different grippers should be used in various tasks. Given a visual observation of the scene and the gripper, AdaGrasp infers the possible grasp poses and their grasp scores by computing the cross convolution between the shape encodings of the gripper and the scene. Intuitively, this cross-convolution operation can be considered an efficient way of exhaustively matching the scene geometry with the gripper geometry under different grasp poses (i.e., translations and orientations), where a good match of 3D geometry leads to a successful grasp. We validate our method in both simulation and real-world environments. Our experiments show that AdaGrasp significantly outperforms the existing multi-gripper grasping policy method, especially when handling cluttered environments and partial observations. A video is available at https://youtu.be/kknTYTbORfs
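The cross-convolution idea can be sketched roughly as follows (a simplified stand-in, not the released AdaGrasp code): the gripper's shape encoding is used as a convolution kernel over the scene encoding, so every output location scores one candidate translation, and rotating the kernel enumerates a coarse set of orientations. The tensor sizes and the four-way rotation set are illustrative assumptions.

# Simplified sketch of scoring grasp poses by cross-convolving a gripper shape
# encoding with a scene encoding. Shapes and the rotation set are assumptions.
import torch
import torch.nn.functional as F

C, H, W = 16, 64, 64          # assumed encoding channels and scene size
h, w = 9, 9                   # assumed spatial extent of the gripper encoding
scene_feat = torch.randn(1, C, H, W)     # stand-in for the scene encoder output
gripper_feat = torch.randn(1, C, h, w)   # stand-in for the gripper encoder output

def rotate(feat, k):
    """Rotate the gripper encoding by k * 90 degrees (coarse orientation set)."""
    return torch.rot90(feat, k, dims=(2, 3))

# For each candidate orientation, treat the gripper encoding as a convolution
# kernel; each output pixel is then the match score for placing the gripper at
# that translation with that orientation.
score_maps = []
for k in range(4):
    kernel = rotate(gripper_feat, k)                        # (1, C, h, w)
    score_maps.append(F.conv2d(scene_feat, kernel, padding=h // 2))  # (1, 1, H, W)

scores = torch.cat(score_maps, dim=1)                       # (1, n_rot, H, W)
best = scores.flatten().argmax().item()
n_rot, Hs, Ws = scores.shape[1:]
rot, rem = divmod(best, Hs * Ws)
y, x = divmod(rem, Ws)
print(f"best grasp: rotation index {rot}, pixel ({y}, {x})")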
Soft robotic hands and grippers are increasingly attracting attention as robotic end-effectors. Compared with rigid counterparts, they are safer for human-robot and environment-robot interactions, easier to control, lower in cost and weight, and more compliant. Current soft robotic hands have mostly focused on soft fingers and bending actuators; however, the palm is also an essential part of grasping. In this work, we propose a novel design of a soft humanoid hand with pneumatic soft fingers and a soft palm. The hand is inexpensive to fabricate. The configuration of the soft palm is based on a modular design that can easily be applied to actuate the various kinds of soft fingers proposed before. The splaying of the fingers, the bending of the whole palm, and the abduction and adduction of the thumb are implemented by the soft palm. Moreover, we present a new soft finger design, called the hybrid bending soft finger (HBSF), which can both bend along the grasping axis and deflect along the side-to-side axis for human-like motion. The functions of the HBSF and the soft palm were simulated in the SOFA framework, and their performance was tested in experiments. Six fingers with 1 to 11 segments were tested and analyzed. The versatility of the soft hand was evaluated and demonstrated through grasping experiments in a real scenario according to the Feix taxonomy. The results present the diversity of achievable grasps and show promise for grasping a variety of objects with different shapes and weights.
We propose a novel tri-fingered soft robotic gripper with decoupled stiffness and shape control capability for performing adaptive grasping with minimal system complexity. The proposed soft fingers adaptively conform to object shapes, facilitating the handling of objects of different types, shapes, and sizes. Each soft gripper finger has an inextensible articulable backbone and is actuated by pneumatic muscles. We derive a kinematic model of the gripper and use an empirical approach to map input pressures to the stiffness and bending deformation of the fingers. We use these mappings to achieve decoupled stiffness and shape control. We conduct tests to quantify the ability to hold objects as the gripper changes orientation, the ability to maintain the grasping status as the gripper moves, and the amount of force required to release the object from the gripped fingers, respectively. The results validate the proposed gripper's performance and show how stiffness control can improve the grasping quality.
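A minimal sketch of the empirical decoupling step, under assumed data and an assumed affine model (the paper does not specify this form): fit a map from the two muscle pressures of one finger to its measured stiffness and bending, then invert it so stiffness and shape can be commanded independently.

# Minimal sketch, not the paper's actual calibration: fit an empirical map from
# the two actuation pressures of one finger to its stiffness and bending angle,
# then invert it for decoupled commands. Data file and model form are assumed.
import numpy as np

# Hypothetical calibration log: columns p1, p2 (kPa), stiffness (N/mm), bend (deg).
data = np.loadtxt("finger_calibration.csv", delimiter=",", skiprows=1)
pressures, responses = data[:, :2], data[:, 2:]

# Fit an affine map: responses ≈ [p1, p2, 1] @ A (least squares).
P = np.hstack([pressures, np.ones((len(pressures), 1))])
A, *_ = np.linalg.lstsq(P, responses, rcond=None)

def pressures_for(stiffness_target, bend_target):
    """Invert the fitted map to get the pressures for a desired (stiffness, bend)."""
    M, b = A[:2].T, A[2]                 # responses = M @ [p1, p2] + b
    return np.linalg.solve(M, np.array([stiffness_target, bend_target]) - b)

print("commanded pressures:", pressures_for(stiffness_target=0.8, bend_target=45.0))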
This paper presents INVIGORATE, a robot system that interacts with humans through natural language and grasps a specified object in clutter. The objects may occlude, obstruct, or even stack on top of one another. INVIGORATE embodies several challenges: (i) inferring the target object among other occluding objects from input language expressions and RGB images, (ii) inferring object blocking relationships (OBRs) from the images, and (iii) synthesizing a multi-step plan to ask questions that disambiguate the target object and to grasp it successfully. We train separate neural networks for object detection, visual grounding, question generation, and OBR detection and grasping. They allow for unrestricted object categories and language expressions, subject to the training datasets. However, errors in visual perception and ambiguity in human language are inevitable and negatively impact the robot's performance. To overcome these uncertainties, we build a partially observable Markov decision process (POMDP) that integrates the learned neural network modules. Through approximate POMDP planning, the robot tracks the history of observations and asks disambiguation questions in order to achieve a near-optimal sequence of actions that identify and grasp the target object. INVIGORATE combines the benefits of model-based POMDP planning and data-driven deep learning. Preliminary experiments with INVIGORATE on a Fetch robot show significant benefits of this integrated approach to object grasping in clutter with natural language interactions. A demonstration video is available at https://youtu.be/zYakh80SGcU.
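As a greatly simplified, hypothetical sketch of the ask-or-grasp trade-off (far more basic than INVIGORATE's approximate POMDP planner): maintain a belief over which detected object is the referred target, update it with Bayes' rule after each yes/no question, and grasp once the belief is concentrated; the answer-noise model and confidence threshold are made up.

# Toy sketch of disambiguation-by-asking under uncertainty. The grounding
# scores, answer reliability, and threshold below are all made-up assumptions.
import numpy as np

grounding_scores = np.array([0.45, 0.35, 0.20])   # assumed visual-grounding output
belief = grounding_scores / grounding_scores.sum()
ANSWER_RELIABILITY = 0.9                          # assumed question-answer noise model
CONFIDENCE_TO_GRASP = 0.8

def update(belief, asked_idx, answer_yes):
    """Bayes update of the target belief after asking about one candidate object."""
    likelihood = np.where(
        np.arange(len(belief)) == asked_idx,
        ANSWER_RELIABILITY if answer_yes else 1 - ANSWER_RELIABILITY,
        1 - ANSWER_RELIABILITY if answer_yes else ANSWER_RELIABILITY)
    posterior = likelihood * belief
    return posterior / posterior.sum()

# Greedy loop: ask about the most likely candidate until confident, then grasp.
while belief.max() < CONFIDENCE_TO_GRASP:
    candidate = int(belief.argmax())
    answer_yes = input(f"Do you mean object {candidate}? [y/n] ").strip().lower().startswith("y")
    belief = update(belief, candidate, answer_yes)

print(f"grasping object {int(belief.argmax())} with belief {belief.max():.2f}")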