
A Survey on Sensor Technologies for Unmanned Ground Vehicles

Posted by: Qi Liu
Publication date: 2020
Research field: Informatics Engineering
Paper language: English

Unmanned ground vehicles (UGVs) have huge development potential in both civilian and military fields and have become a research focus in many countries. Moreover, high-precision, high-reliability sensors are essential for the efficient operation of UGVs. This paper presents a brief review of sensor technologies for UGVs. Firstly, the characteristics of various sensors are introduced. Then the strengths and weaknesses of different sensors, as well as their application scenarios, are compared. Furthermore, sensor applications in some existing UGVs are summarized. Finally, hotspots in sensor technology are forecast to indicate the direction of future development.




See also

Unmanned Aerial Vehicle (UAV)-based applications have become increasingly critical to serving civilian and/or military missions. The significantly increased attention on UAV applications has also led to security concerns, particularly in the context of networked UAVs. Networked UAVs are vulnerable to malicious attacks over open-air radio space, and accordingly intrusion detection systems (IDSs) have naturally been derived to deal with these vulnerabilities and/or attacks. In this paper, we briefly survey the state-of-the-art IDS mechanisms that deal with vulnerabilities and attacks in networked UAV environments. In particular, we classify the existing IDS mechanisms according to information gathering sources, deployment strategies, detection methods, detection states, IDS acknowledgment, and intrusion types. We conclude the paper with research challenges, insights, and future research directions towards a networked UAV-IDS system that meets the required standards of effectiveness and efficiency with respect to both security and performance goals.
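As a minimal sketch of how such a classification could be encoded when cataloguing surveyed IDS proposals, the snippet below uses Python enums and a dataclass over the dimensions named in the abstract; the concrete category values and the example entry are illustrative assumptions, not the paper's exact taxonomy.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Illustrative values for three of the classification axes named in the survey.
# The labels below are assumptions for this sketch, not the paper's own categories.
class GatheringSource(Enum):
    HOST_BASED = auto()
    NETWORK_BASED = auto()
    HYBRID = auto()

class DeploymentStrategy(Enum):
    CENTRALIZED = auto()
    DISTRIBUTED = auto()
    HIERARCHICAL = auto()

class DetectionMethod(Enum):
    SIGNATURE_BASED = auto()
    ANOMALY_BASED = auto()
    SPECIFICATION_BASED = auto()

@dataclass
class IDSMechanism:
    """One surveyed IDS proposal, tagged along the classification axes."""
    name: str
    source: GatheringSource
    deployment: DeploymentStrategy
    method: DetectionMethod
    intrusion_types: tuple[str, ...]  # e.g. kinds of attacks the IDS targets

# Hypothetical catalogue entry, for illustration only:
example = IDSMechanism(
    name="Example-UAV-IDS",
    source=GatheringSource.NETWORK_BASED,
    deployment=DeploymentStrategy.DISTRIBUTED,
    method=DetectionMethod.ANOMALY_BASED,
    intrusion_types=("GPS spoofing", "jamming"),
)
```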
84 - Xianqi He, Zirui Li, Xufeng Yin (2020)
Unmanned vehicles often need to locate targets with high precision during operation. In an unmanned material-handling workshop, the unmanned vehicle needs to perform high-precision pose estimation of a workpiece in order to grasp it accurately. In this context, this paper proposes a high-precision unmanned vehicle target positioning system based on binocular vision. The system uses a region-based stereo matching algorithm to obtain a disparity map and the RANSAC algorithm to extract position and posture features, which achieves estimation of the position and attitude of a cylindrical workpiece in six degrees of freedom. To verify the system, the accuracy and computation time of its outputs are collected for the cylinder in different poses. The experimental data show that the position accuracy of the system is 0.61-1.17 mm and the angular accuracy is 1.95-5.13°, achieving good high-precision positioning performance.
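A minimal sketch, assuming OpenCV and NumPy, of the kind of pipeline described above: block-based stereo matching to a disparity map, reprojection to a 3D point cloud, and a plain RANSAC loop to hypothesize the cylinder axis. The matcher parameters, the cylinder model, and the inlier test are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def estimate_cylinder_axis(left_gray, right_gray, Q, n_iters=500, radius_tol=2.0):
    """Sketch: disparity map -> 3D points -> RANSAC fit of a cylinder axis.

    Q is the 4x4 reprojection matrix from stereo calibration (cv2.stereoRectify).
    All parameters here are placeholders, not tuned values from the paper.
    """
    # Region/block-based stereo matching (semi-global block matching).
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # Reproject valid disparities to a cloud of 3D points in the camera frame.
    points3d = cv2.reprojectImageTo3D(disparity, Q)
    pts = points3d[disparity > 0]

    # Plain RANSAC: sample two points to hypothesize the axis line, then count
    # points whose distance to that line clusters around a common radius.
    best_inliers, best_axis = 0, None
    rng = np.random.default_rng(0)
    for _ in range(n_iters):
        p1, p2 = pts[rng.choice(len(pts), 2, replace=False)]
        d = p2 - p1
        norm = np.linalg.norm(d)
        if norm < 1e-6:
            continue
        d /= norm
        # Distance of every point to the candidate axis line.
        dist = np.linalg.norm(np.cross(pts - p1, d), axis=1)
        radius = np.median(dist)  # crude radius estimate for this hypothesis
        inliers = np.sum(np.abs(dist - radius) < radius_tol)
        if inliers > best_inliers:
            best_inliers, best_axis = inliers, (p1, d, radius)
    return best_axis  # (point on axis, unit direction, estimated radius)
```

The axis direction and a point on it give five of the six pose degrees of freedom; recovering the sixth (position along the axis) would need the cylinder ends or additional features, which this sketch leaves out.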
The use of unmanned aerial vehicles (UAVs) is growing rapidly across many civil application domains, including real-time monitoring, providing wireless coverage, remote sensing, search and rescue, delivery of goods, security and surveillance, precision agriculture, and civil infrastructure inspection. Smart UAVs are the next big revolution in UAV technology, promising new opportunities in different applications, especially for civil infrastructure in terms of reduced risks and lower cost. Civil infrastructure is expected to dominate the more than $45 billion market value of UAV usage. In this survey, we present UAV civil applications and their challenges. We also discuss current research trends and provide future insights for potential UAV uses. Furthermore, we present the key challenges for UAV civil applications, including charging, collision avoidance and swarming, and networking and security. Based on our review of the recent literature, we discuss open research challenges and draw high-level insights on how these challenges might be approached.
Current driver assistance systems and autonomous driving stacks are limited to well-defined environment conditions and geo-fenced areas. To increase driving safety in adverse weather conditions, broadening the application spectrum of autonomous driving and driver assistance systems is necessary. To enable this development, reproducible benchmarking methods are required to quantify the expected distortions. In this publication, a testing methodology for disturbances from spray is presented. It introduces a novel lightweight and configurable spray setup alongside an evaluation scheme to assess the disturbances caused by spray. The analysis covers an automotive RGB camera and two different LiDAR systems, as well as downstream detection algorithms based on YOLOv3 and PV-RCNN. In a common scenario of a closely cutting-in vehicle, the distortions severely affect the perception stack for up to four seconds, showing the necessity of benchmarking the influence of spray.
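To illustrate the kind of evaluation scheme such a benchmark could rest on, here is a minimal sketch that compares per-frame detection counts between a clear-weather reference run and a spray run of the same manoeuvre; the frame rate, data layout, and metric are assumptions for illustration, not the paper's protocol.

```python
import numpy as np

def detection_drop(clear_counts, spray_counts, fps=10):
    """Sketch: per-frame loss of detections and the total affected duration.

    clear_counts / spray_counts: detections per frame for the same manoeuvre
    driven without and with spray (hypothetical data layout).
    """
    clear = np.asarray(clear_counts, dtype=float)
    spray = np.asarray(spray_counts, dtype=float)
    drop = np.clip(clear - spray, 0, None)   # detections missed per frame
    duration_s = np.count_nonzero(drop) / fps  # affected time in seconds
    return drop, duration_s

# Hypothetical example: the cutting-in vehicle is missed for a stretch of frames.
drop, seconds = detection_drop([1] * 60, [1] * 10 + [0] * 40 + [1] * 10, fps=10)
print(f"frames affected: {np.count_nonzero(drop)}, ~{seconds:.1f} s of degradation")
```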
85 - Tianjiao Li, Jun Liu, Wei Zhang (2021)
Human behavior understanding with unmanned aerial vehicles (UAVs) is of great significance for a wide range of applications, which simultaneously brings an urgent demand for large, challenging, and comprehensive benchmarks for the development and evaluation of UAV-based models. However, existing benchmarks have limitations in terms of the amount of captured data, types of data modalities, categories of provided tasks, and diversity of subjects and environments. Here we propose a new benchmark, UAV-Human, for human behavior understanding with UAVs, which contains 67,428 multi-modal video sequences and 119 subjects for action recognition, 22,476 frames for pose estimation, 41,290 frames and 1,144 identities for person re-identification, and 22,263 frames for attribute recognition. Our dataset was collected by a flying UAV in multiple urban and rural districts, in both daytime and nighttime, over three months, hence covering extensive diversity w.r.t. subjects, backgrounds, illuminations, weather conditions, occlusions, camera motions, and UAV flying attitudes. Such a comprehensive and challenging benchmark should be able to promote research on UAV-based human behavior understanding, including action recognition, pose estimation, re-identification, and attribute recognition. Furthermore, we propose a fisheye-based action recognition method that mitigates the distortions in fisheye videos by learning unbounded transformations guided by flat RGB videos. Experiments show the efficacy of our method on the UAV-Human dataset. Project page: https://github.com/SUTDCV/UAV-Human
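As a rough illustration of warping fisheye frames with a learned transformation, assuming PyTorch, the sketch below predicts a per-pixel sampling grid and applies it with `grid_sample`; the network, the guidance loss, and all hyperparameters are placeholders, not the paper's model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnedUnwarp(nn.Module):
    """Sketch: predict a per-pixel sampling grid to rectify a fisheye frame.

    A stand-in for the guided transformation learning idea, not the paper's model.
    """
    def __init__(self):
        super().__init__()
        # Tiny conv net mapping the fisheye frame to a 2-channel grid offset.
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, 3, padding=1),
        )

    def forward(self, fisheye):  # fisheye: (N, 3, H, W)
        n, _, h, w = fisheye.shape
        # Base identity grid in normalized [-1, 1] coordinates (x, y order).
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
        base = torch.stack((xs, ys), dim=-1).to(fisheye).expand(n, h, w, 2)
        offset = self.net(fisheye).permute(0, 2, 3, 1)  # learned displacement
        grid = base + offset  # not clamped, so coordinates may leave [-1, 1]
        return F.grid_sample(fisheye, grid, align_corners=False)

# Training would pull warped fisheye frames toward paired flat RGB guidance,
# e.g. loss = F.l1_loss(model(fisheye_batch), flat_rgb_batch)  (assumption).
```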