This paper presents a dynamic constraint formulation that provides protective virtual fixtures for 3D anatomical structures from polygon mesh representations. The proposed approach can anisotropically limit the tool motion of surgical robots without any assumption about the local anatomical shape close to the tool. Using a bounded search strategy and a Principal-Directed tree, the proposed system runs efficiently at 180 Hz for a mesh object containing 989,376 triangles and 493,460 vertices. The proposed algorithm has been validated in both simulation and skull-cutting experiments. The skull-cutting setup uses a novel piezoelectric bone-cutting tool designed for the da Vinci Research Kit. The results show that virtual-fixture-assisted teleoperation achieves statistically significant improvements in cutting-path accuracy and penetration-depth control. The code has been made publicly available at https://github.com/mli0603/PolygonMeshVirtualFixture.
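As a rough illustration of how such a mesh-based virtual fixture can limit tool motion (a minimal sketch, not the authors' released implementation linked above), the following Python snippet builds a one-sided constraint from the closest mesh point to the tool tip and clips only the penetrating component of the commanded displacement; the brute-force nearest-vertex query, the `constrain_motion` helper, and the safety margin are illustrative assumptions standing in for the bounded Principal-Directed-tree search.

```python
import numpy as np

def nearest_vertex(tip, vertices):
    """Placeholder for the PD-tree bounded search: closest mesh vertex to the tip."""
    d = np.linalg.norm(vertices - tip, axis=1)
    i = int(np.argmin(d))
    return vertices[i], i

def constrain_motion(tip, delta, vertices, vertex_normals, safety_margin=1e-3):
    """Project the commanded displacement so the tip stays on the safe side
    of the plane through the closest mesh point (outward normal n)."""
    p, i = nearest_vertex(tip, vertices)
    n = vertex_normals[i] / np.linalg.norm(vertex_normals[i])
    # Signed distance of the commanded target to the constraint plane.
    target = tip + delta
    dist = n @ (target - p) - safety_margin
    if dist >= 0.0:
        return delta                      # constraint inactive
    # Remove only the penetrating component along -n (anisotropic limiting:
    # tangential motion is untouched, motion into the anatomy is clipped).
    return delta - dist * n

# Example: a tip just above a flat mesh patch, commanded to move downward.
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
normals = np.array([[0., 0., 1.]] * 3)
safe_delta = constrain_motion(np.array([0.1, 0.1, 0.02]),
                              np.array([0.0, 0.0, -0.05]),
                              verts, normals)
print(safe_delta)  # downward motion is clipped at the safety margin
```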
The objective of this paper is to present a systematic review of existing sensor-based control methodologies for applications that involve direct interaction between humans and robots, in the form of either physical collaboration or safe coexistence. To this end, we first introduce the basic formulation of the sensor-servo problem, then present the most common approaches: vision-based, touch-based, audio-based, and distance-based control. Afterwards, we discuss and formalize the methods that integrate heterogeneous sensors at the control level. The surveyed body of literature is classified according to the type of sensor, to the way multiple measurements are combined, and to the target objectives and applications. Finally, we discuss open problems, potential applications, and future research directions.
Reliable real-time planning for robots is essential in today's rapidly expanding automated ecosystem. In such environments, traditional methods that plan by relaxing constraints become unreliable or slow down for kinematically constrained robots. This paper describes Dynamic Motion Planning Networks (Dynamic MPNet), an extension of Motion Planning Networks for non-holonomic robots that addresses the challenge of real-time motion planning using a neural planning approach. We propose modifications to the training and planning networks that make real-time planning possible while improving the data efficiency of training and the generalizability of the trained models. We evaluate our model in simulation on planning tasks for a non-holonomic robot. We also demonstrate experimental results for an indoor navigation task using a Dubins car.
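For readers unfamiliar with MPNet-style planners, the sketch below shows the general shape of such a neural planning loop, assuming a trained network `planner_net` that proposes the next waypoint from the current state, the goal, and an obstacle encoding, and a `collision_free` edge checker; these names, the re-planning shortcut, and the termination tolerance are assumptions for illustration, not the Dynamic MPNet code.

```python
import numpy as np

def neural_plan(start, goal, obstacle_code, planner_net, collision_free,
                goal_tol=0.1, max_steps=200):
    """Roll out the neural planner waypoint by waypoint until the goal is reached."""
    path = [np.asarray(start, dtype=float)]
    current = path[0]
    for _ in range(max_steps):
        proposal = planner_net(current, goal, obstacle_code)  # next proposed state
        if collision_free(current, proposal):
            path.append(proposal)
            current = proposal
        else:
            # In the full algorithm the failed segment is re-planned (e.g.,
            # recursively or with a classical fallback planner); for brevity
            # this sketch simply reports failure.
            return None
        if np.linalg.norm(current - np.asarray(goal)) < goal_tol:
            path.append(np.asarray(goal, dtype=float))
            return path
    return None
```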
This paper tackles a friction compensation problem without using a friction model. The unique feature of the proposed friction observer is that the nominal motor-side signal is fed back into the controller instead of the measured signal. By doing so, asymptotic stability and passivity of the controller are maintained. Another advantage of the proposed observer is that it provides a clear understanding of stiction compensation, which is hard to capture in model-free approaches. This allows the design of observers that do not overcompensate for stiction. The proposed scheme is validated through simulations and experiments.
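One way a model-free, nominal-model-based friction observer of this flavor can be structured is sketched below for a single motor: a friction-free nominal motor is simulated in parallel with the commanded torque, the controller closes its loop on the nominal motor-side signal, and the friction estimate is a gain on the velocity discrepancy. The gain `L`, the Euler integration, and the omission of joint-coupling torques are assumptions; this is not the paper's exact observer law.

```python
class NominalFrictionObserver:
    def __init__(self, J, L, dt):
        self.J, self.L, self.dt = J, L, dt
        self.q_nom = 0.0        # nominal (friction-free) motor position
        self.dq_nom = 0.0       # nominal motor velocity
        self.tau_f_hat = 0.0    # friction torque estimate

    def update(self, tau_c, dq_meas):
        """tau_c: commanded motor torque, dq_meas: measured motor velocity."""
        # Friction estimate from the mismatch between nominal and measured velocity.
        self.tau_f_hat = self.L * (self.dq_nom - dq_meas)
        # Propagate the friction-free nominal motor dynamics J * ddq = tau_c
        # (joint/load coupling torques omitted in this single-motor sketch).
        ddq_nom = tau_c / self.J
        self.dq_nom += ddq_nom * self.dt
        self.q_nom += self.dq_nom * self.dt
        # The controller closes its loop on (q_nom, dq_nom) rather than the
        # measured motor signals, and tau_f_hat is added to the torque command.
        return self.q_nom, self.dq_nom, self.tau_f_hat
```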
Pruning is the art of cutting unwanted and unhealthy plant branches and is one of the most difficult tasks in field robotics. It becomes even more complex when the plant branches are moving. Moreover, the reproducibility of robot pruning skills is another challenge, owing to the heterogeneous nature of vines in the vineyard. This research proposes a multi-modal framework for dealing with dynamic vines, with the aim of sim2real skill transfer. The 3D models of vines are constructed in the Blender engine and rendered in a simulated environment, as required for training the robot. A Natural Admittance Controller (NAC) is applied to handle the dynamics of the vines; it uses force feedback and compensates for friction effects while maintaining the passivity of the system. Faster R-CNN is used to detect the spurs on the vines, and a statistical pattern-recognition algorithm based on K-means clustering is then applied to find the effective pruning points. The proposed framework is tested in both simulated and real environments.
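As an illustration of the clustering step only (a minimal sketch under the assumption that Faster R-CNN has already returned spur locations as an array of points), K-means can group the detections and take the cluster centers as candidate pruning points; the choice of k and the use of scikit-learn are assumptions, not details from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def pruning_points(spur_detections, k=5):
    """Cluster detected spur locations and return candidate pruning points."""
    pts = np.asarray(spur_detections, dtype=float)
    k = min(k, len(pts))                      # cannot have more clusters than points
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pts)
    return km.cluster_centers_

# Example with synthetic 3D detections spread along a vine cane.
rng = np.random.default_rng(0)
detections = rng.normal(size=(40, 3)) * 0.02 + np.linspace(0, 1, 40)[:, None]
print(pruning_points(detections, k=4))
```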
To achieve highly dynamic jumps of legged robots, it is essential to control the rotational dynamics of the robot. In this paper, we aim to improve jumping performance by proposing a unified model for planning highly dynamic jumps that approximately models the centroidal inertia. This model abstracts the robot as a single rigid body for the base and point masses for the legs. The model, called the Lump Leg Single Rigid Body Model (LL-SRBM), can be used to plan motions for both bipedal and quadrupedal robots. By taking the effects of leg dynamics into account, LL-SRBM provides a computationally efficient way for the motion planner to change the centroidal inertia of the robot with various leg configurations. Concurrently, we propose a novel contact detection method based on the norm of the average spatial velocity. Once contact is detected, the controller switches to force control to achieve a soft landing. Twisting-jump and forward-jump experiments on the bipedal robot SLIDER and the quadrupedal robot ANYmal demonstrate the improved jumping performance obtained by actively changing the centroidal inertia. These experiments also show the generalizability and robustness of the integrated planning and control framework.
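A minimal sketch of one plausible realization of the stated contact-detection idea is given below: the spatial velocity (twist) of the base is averaged over a short window, and contact is flagged when the norm of that average drops below a threshold, at which point the controller would hand over to force control. The window length, threshold, and thresholding rule are assumptions, not the criterion or values used on SLIDER or ANYmal.

```python
from collections import deque
import numpy as np

class ContactDetector:
    def __init__(self, window=10, threshold=0.05):
        self.buffer = deque(maxlen=window)   # recent 6D spatial velocities (twists)
        self.threshold = threshold

    def update(self, twist):
        """twist: 6-vector [angular; linear] spatial velocity of the base.
        Returns True when the averaged twist norm indicates contact."""
        self.buffer.append(np.asarray(twist, dtype=float))
        avg_twist = np.mean(self.buffer, axis=0)
        return np.linalg.norm(avg_twist) < self.threshold

detector = ContactDetector()
# In the control loop: if detector.update(base_twist): switch to force control.
```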