Human factors and ergonomics are essential constituents of teleoperation interfaces and can significantly affect the human operator's performance. A quantitative evaluation of these elements, together with the ability to establish reliable bases for comparing different teleoperation interfaces, is therefore key to selecting the most suitable interface for a particular application. However, most work on teleoperation has so far focused on stability analysis and transparency improvement of these systems, and does not cover these important usability aspects. In this work, we propose a foundation for a general framework to analyze human factors and ergonomics when employing diverse teleoperation interfaces. The proposed framework goes beyond traditional subjective usability analyses by complementing them with online measurements of the human body configuration. On this basis, multiple quantitative metrics are introduced, such as joint usage, range-of-motion comfort, center-of-mass divergence, and posture comfort. To demonstrate the potential of the proposed framework, two different teleoperation interfaces are considered, and real-world experiments are conducted with eleven participants performing a simulated industrial remote pick-and-place task. The quantitative results of this analysis are reported and compared with subjective questionnaires, illustrating the effectiveness of the proposed framework.
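A metric of the kind listed above, such as range-of-motion comfort, can be illustrated with a minimal sketch: score each joint by how close its measured angle stays to the center of its range of motion, then average over joints. The function name, the linear scoring rule, and the example joint limits below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def rom_comfort(joint_angles, rom_min, rom_max):
    """Hypothetical range-of-motion comfort score in [0, 1].

    Each joint scores 1.0 when its angle sits at the midpoint of its
    range of motion and decays linearly to 0.0 at either limit; the
    overall score is the mean over all joints.
    """
    mid = (rom_min + rom_max) / 2.0          # midpoint of each joint's range
    half_span = (rom_max - rom_min) / 2.0    # half-width of each range
    deviation = np.abs(joint_angles - mid) / half_span
    return float(np.mean(np.clip(1.0 - deviation, 0.0, 1.0)))

# Example with two joints and assumed (illustrative) limits in degrees:
rom_min = np.array([0.0, -30.0])
rom_max = np.array([90.0, 30.0])
print(rom_comfort(np.array([45.0, 0.0]), rom_min, rom_max))   # mid-range posture
print(rom_comfort(np.array([90.0, 30.0]), rom_min, rom_max))  # posture at joint limits
```

Such per-sample scores could then be tracked over the course of a pick-and-place trial and aggregated alongside the subjective questionnaire results.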