Assistive free-flying robots are a promising platform for supporting and working alongside astronauts in tasks that require interaction with the environment. However, current free-flying robot platforms are constrained by existing manipulation technologies in their ability to grasp and manipulate surrounding objects. Gecko-inspired adhesives offer many advantages for an alternative grasping and manipulation paradigm in assistive free-flyer applications. In this work, we present the design of a gecko-inspired adhesive gripper for performing perching and grasping maneuvers with the Astrobee robot, a free-flying robot currently operating on board the International Space Station. We present software and hardware integration details for the gripper units that were launched to the International Space Station in 2019 for in-flight experiments with Astrobee. Finally, we present preliminary results from ground experiments conducted with the gripper and Astrobee on a free-floating spacecraft test bed.
Monitoring the state of contact is essential for robotic devices, especially grippers that implement gecko-inspired adhesives, where intimate contact is crucial for firm attachment. However, due to the lack of deformable sensors, few works have demonstrated tactile sensing for gecko grippers. We present Viko, an adaptive gecko gripper that utilizes vision-based tactile sensors to monitor contact state. The sensor provides high-resolution, real-time measurements of contact area and shear force. Moreover, the sensor is adaptive, low-cost, and compact. We integrated gecko-inspired adhesives into the sensor surface without impeding its adaptiveness and performance. Using a robotic arm, we evaluate the performance of the gripper through a series of grasping tests. The gripper achieves a maximum payload of 8 N even at a low fingertip pitch angle of 30 degrees. We also showcase the gripper's ability to adjust fingertip pose for better contact using sensor feedback. Further, everyday object picking is presented as a demonstration of the gripper's adaptiveness.
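As a hedged illustration of how such sensor feedback might be used to adjust fingertip pose, the sketch below closes a simple loop between a measured contact area and a commanded fingertip pitch. The sensor/gripper interfaces (read_contact_area, get_fingertip_pitch, set_fingertip_pitch) and the target values are illustrative assumptions, not the Viko API.

```python
# Hypothetical sketch: increase fingertip pitch until the tactile sensor
# reports enough contact area for the adhesive to engage. Interface names
# and thresholds are assumptions for illustration only.
import time

TARGET_CONTACT_AREA = 400.0   # mm^2, assumed threshold for firm adhesive contact
PITCH_STEP = 1.0              # degrees per correction step (assumed)
MAX_PITCH = 90.0              # upper pitch limit in degrees (assumed)

def adjust_fingertip_pose(sensor, gripper, cycles=50, rate_hz=20):
    """Incrementally pitch the fingertip until the contact area reaches the target."""
    pitch = gripper.get_fingertip_pitch()
    area = sensor.read_contact_area()
    for _ in range(cycles):
        if area >= TARGET_CONTACT_AREA:
            break
        # Press the adhesive pad more fully onto the surface.
        pitch = min(MAX_PITCH, pitch + PITCH_STEP)
        gripper.set_fingertip_pitch(pitch)
        time.sleep(1.0 / rate_hz)
        area = sensor.read_contact_area()
    return pitch, area
```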
Achieving short-distance flight helps improve the efficiency of humanoid robots moving in complex environments (e.g., crossing large obstacles or reaching high places) for rapid emergency missions. This study proposes the design of a flying humanoid robot named Jet-HR2. The robot has 10 joints driven by brushless motors and harmonic drives for locomotion. To overcome the challenge of stable-attitude takeoff under low thrust-to-weight conditions, the robot was designed around the concept of thrust vectoring. The propulsion system consists of four ducted fans: two fixed on the waist of the robot and two mounted on the feet for thrust vector control. The thrust vector is controlled by adjusting the attitude of the foot during flight. A simplified model and control strategies are proposed to solve the problem of attitude instability caused by mass errors and joint position errors during takeoff. The experimental results show that the robot's spin and dive behaviors during takeoff were effectively suppressed by controlling the thrust vector of the foot-mounted ducted fans. The robot successfully achieved takeoff at a thrust-to-weight ratio of 1.17 (17 kg / 20 kg) and maintained a stable attitude, reaching a takeoff height of over 1000 mm.
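To make the thrust-vectoring idea concrete, the minimal sketch below maps a body pitch error to a foot joint command so that the foot-mounted ducted fans tilt their thrust against the error. It is a generic PD scheme under assumed gains and limits, not the paper's simplified model or control law.

```python
# Minimal PD sketch of foot-based thrust vectoring during takeoff:
# body pitch error -> foot joint angle command. Gains, limits, and the
# sign convention are illustrative assumptions.
KP, KD = 2.0, 0.4               # PD gains on body pitch (assumed)
FOOT_LIMIT_DEG = 25.0           # mechanical limit of the foot joint (assumed)

def foot_pitch_command(body_pitch_deg, body_pitch_rate_dps, pitch_ref_deg=0.0):
    """Return a foot joint angle that redirects fan thrust against the pitch error."""
    error = pitch_ref_deg - body_pitch_deg
    cmd = KP * error - KD * body_pitch_rate_dps
    # Saturate at the joint's mechanical limit.
    return max(-FOOT_LIMIT_DEG, min(FOOT_LIMIT_DEG, cmd))
```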
Over the past few decades, efforts have been made towards robust robotic grasping and, by extension, dexterous manipulation. Soft grippers have shown potential for robust grasping due to their inherent properties: low control complexity and high adaptability. However, the deformation of a soft gripper when interacting with objects introduces inaccuracy in the position of grasped objects, which causes instability for robust grasping and further manipulation. In this paper, we present an omni-directional adaptive soft finger that senses its deformation using embedded optical fibers and machine learning methods that interpret the transmitted light intensities. Furthermore, to use the tactile information provided by the soft finger, we design a low-cost, multi-degree-of-freedom gripper that actively conforms to the shape of objects and optimizes the grasping policy, an approach we call Rigid-Soft Interactive Grasping. This grasping policy provides two main advantages: a more robust grasp can be achieved through active adaptation, and the collected tactile information can aid further manipulation.
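The learning step described above amounts to a regression from raw fiber light intensities to finger deformation. The sketch below shows one plausible way to set it up with scikit-learn; the data files, label definition, and network size are placeholders and not the authors' dataset or model.

```python
# Illustrative regression from optical-fiber light intensities to finger
# deformation. Data files and label meaning are hypothetical; MLPRegressor is
# one possible model choice, not necessarily the one used in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

intensities = np.load("fiber_intensities.npy")    # (n_samples, n_fibers), assumed recording
deformation = np.load("finger_deformation.npy")   # (n_samples, k) deformation labels, assumed

X_train, X_test, y_train, y_test = train_test_split(
    intensities, deformation, test_size=0.2, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```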
In this paper, we present a novel passive single degree-of-freedom (DoF) manipulator design and its integration on an autonomous drone to capture a moving target. The end-effector is designed to be passive so that it can disengage the moving target from a flying UAV and capture it efficiently in the presence of disturbances, with minimal energy usage. It is also designed to handle target sway and the effect of downwash. The passive manipulator is integrated with the drone through a single-DoF arm, and experiments are carried out in an outdoor environment. The rack-and-pinion mechanism incorporated in this manipulator ensures safety by extending the manipulator beyond the body of the drone to capture the target. The autonomous capturing experiments are conducted using a red ball hanging first from a stationary drone and subsequently from a moving drone. The experiments show that the manipulator captures the target with a success rate of 70% even under environmental and measurement uncertainties and errors.
Regular irradiation of indoor environments with ultraviolet C (UVC) light has become a routine task in many indoor settings as a result of COVID-19, but current robotic systems attempting to automate it suffer from high costs and inefficient irradiation. In this paper, we propose a purpose-built, inexpensive robotic platform using off-the-shelf components and standard navigation software that, combined with a novel algorithm for finding optimal irradiation locations, addresses both shortcomings and offers an affordable, efficient solution for UVC irradiation. We demonstrate the efficacy of the algorithm in simulation and show a prototypical run of the integrated autonomous robotic system in an indoor environment. In our sample instances, the proposed algorithm reduces the time needed by roughly 30% while increasing coverage by 35% (compared to the best possible placement of a static light).
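For intuition about what an irradiation-placement algorithm of this kind might look like, the sketch below greedily picks robot stop locations on a grid until every free cell receives a required UVC dose. This is a generic coverage heuristic under assumed inputs (candidate cells, dose model, threshold), not the algorithm proposed in the paper.

```python
# Greedy coverage sketch: choose irradiation stops that maximize the remaining
# dose demand they satisfy. dose_fn(stop, cell) gives the UVC dose delivered
# to a cell from a stop; all inputs are illustrative assumptions.
def greedy_irradiation_stops(candidate_cells, free_cells, dose_fn, required_dose):
    """Pick stops until all free cells reach required_dose (or no stop helps)."""
    remaining = {c: required_dose for c in free_cells}
    stops = []
    while remaining:
        best, best_gain = None, 0.0
        for s in candidate_cells:
            # Gain = how much outstanding dose demand this stop would satisfy.
            gain = sum(min(dose_fn(s, c), d) for c, d in remaining.items())
            if gain > best_gain:
                best, best_gain = s, gain
        if best is None:
            break  # no candidate contributes further dose
        stops.append(best)
        remaining = {c: d - dose_fn(best, c)
                     for c, d in remaining.items()
                     if d - dose_fn(best, c) > 1e-9}
    return stops
```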