|Article title||THE VARIANTS OF IMPLEMENTATION OF COMPUTER VISION SYSTEM FOR UAS AUTO-TAKEOFF AND AUTO-LANDING PROCEDURE|
|Authors||S. V. Kuleshov, A. A. Zaytseva|
|Section||SECTION V. MACHINE VISION|
|Month, Year||01, 2018|
|Abstract||The key purpose of this research is the development and testing of an approach to implementing an auto-takeoff and auto-landing system for an unmanned aircraft system (UAS) of the multi-rotor or helicopter type based on a computer vision system (CV system). The study is relevant because of modern requirements for UAS, including the need for greater autonomy of control in the flight stabilization mode. An analysis of the current state of research in this field shows that UAS technologies are of greatest interest to state agencies and services whose functions involve the protection and monitoring of facilities, including emergency response, as well as to companies engaged in the acquisition of spatial data. To achieve the goal of this study, the following tasks are solved: analysis of existing auto-takeoff and auto-landing systems that do not use global positioning systems; comparison of various layout options and the relative placement of system components between the aircraft and the landing pad; and development of recommendations on the use of markers in auto-takeoff and auto-landing systems, depending on the features of their implementation and use. In the course of solving these tasks and experimenting with various layouts and algorithms, a detailed analysis was carried out of the time distribution of the UAV position localization algorithm using controlled markers in the developed CV-based auto-takeoff and auto-landing system. It has been established that the most effective markers are those controlled by the landing system computer via a control channel (radio channel), which simplifies marker recognition and localization. It becomes possible to exploit the time factor with a priori knowledge of the marker state and to use simple CV methods, such as frame differencing.
There is no need to vary the characteristics of the markers themselves or to use marker patterns, since only one particular marker can be switched on at any given time, which facilitates its identification. Based on the research and its conclusions, problems for further work are formulated: the study of CV methods for implementing auto-takeoff and auto-landing in difficult weather conditions, and the development of modular software implementing the developed algorithms.|
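The abstract's controlled-marker scheme (the landing-pad computer switches a single marker on over the radio channel, and the CV system then localizes it by frame differencing) can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the fixed intensity threshold, and the use of raw NumPy grayscale frames are all illustrative assumptions.

```python
import numpy as np

def locate_marker(frame_off, frame_on, threshold=40):
    """Localize an actively controlled marker by frame differencing.

    frame_off: grayscale frame (2-D uint8 array) captured while the
               marker is known to be switched off.
    frame_on:  frame captured after the landing-pad computer switches
               the marker on via the control (radio) channel.
    Returns the (row, col) centroid of the changed pixels, or None
    if no pixel changed by more than `threshold` (illustrative value).
    """
    # Signed difference to avoid uint8 wrap-around, then magnitude.
    diff = np.abs(frame_on.astype(np.int16) - frame_off.astype(np.int16))
    mask = diff > threshold
    if not mask.any():
        return None  # marker not visible in this frame pair
    rows, cols = np.nonzero(mask)
    # Centroid of all changed pixels approximates the marker position.
    return (rows.mean(), cols.mean())
```

Because a priori knowledge guarantees at most one marker is lit at a time, the centroid of the thresholded difference image is sufficient for identification; no pattern matching or per-marker appearance model is required.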
|Keywords||Unmanned aerial vehicles (UAV); unmanned aircraft systems (UAS); multicopters; markers; CV system; auto-takeoff; auto-landing.|
|References||1. Aksenov A.Y., Kuleshov S.V., Zaytseva A.A. An application of computer vision systems to solve the problem of unmanned aerial vehicle control, J. Transport and Telecommunication, 2014, Vol. 15, No. 3, pp. 209-214.
2. Aksenov A.Yu., Zaytseva A.A., Kuleshov S.V., Nenausnikov K.V. Varianty obespecheniya posadki pri avtonomnom upravlenii bespilotnymi mul'tirotornymi letatel'nymi apparatami [Variants of landing providing for autonomous control of unmanned multi-rotor vehicles], Trudy MAI [Trudy MAI], Issue No. 96. Available at: http://trudy.mai.ru/published.php?ID=85880.
3. Barbasov V.K. i dr. Mnogorotornye bespilotnye letatel'nye apparaty i vozmozhnosti ikh ispol'zovaniya dlya distantsionnogo zondirovaniya zemli [Multi-rotor unmanned aerial vehicles and their use for remote sensing of the earth], Inzhenernye izyskaniya [Engineering survey], 2012, No. 10, pp. 38-42.
4. Barbasov V.K., Grechishchev A.V. Mul'tirotornye bespilotnye letatel'nye apparaty, predstavlennye na rossiyskom rynke: obzor [Multirotor unmanned aerial vehicles in the Russian market: a review], Inzhenernye izyskaniya [Engineering survey], 2014, No. 8, pp. 27-31.
5. DJI Innovations. Naza for Multi-Rotor User Manual (V2.8, Revision of 2013.05.03). Guangdong, 2013.
6. Saripalli S., Montgomery J.F., Sukhatme G.S. Visually guided landing of an unmanned aerial vehicle, IEEE Transactions on Robotics and Automation, 2003, Vol. 19, Is. 3.
7. Horaud R., Hansard M., Evangelidis G., Ménier C. An Overview of Depth Cameras and Range Scanners Based on Time-of-Flight Technologies, Machine Vision and Applications Journal, 2016. Available at: https://hal.inria.fr/hal-01325045.
8. Lidar: range-resolved optical remote sensing of the atmosphere series, Springer series in optical sciences. Vol. 102, C. Weitkamp (Ed.). New York: Springer, 2005, 460 p.
9. Avtomaticheskaya posadka BPLA na dvizhushchiysya avtomobil' [Automatic landing of a UAV on a moving car]. Available at: http://absrf.ru/ru/technology/2016-01-26.htm.
10. Veremeenko K.K., Pron'kin A.N., Repnikov A.V. Algoritmy strukturnoy perestroyki bortovykh podsistem integrirovannoy sistemy posadki bespilotnogo letatel'nogo apparata [Algorithms of structural rearrangement of the onboard subsystems of the integrated landing system of an unmanned aerial vehicle], Trudy MAI [Trudy MAI], 2011, Issue No. 49. Available at: http://trudymai.ru/published.php?ID=28110.
11. Pavlova N.V., Smeyukha A.V. Povyshenie effektivnosti vypolneniya poletnogo zadaniya sovremennymi manevrennymi letatel'nymi apparatami [Increasing the efficiency of flight task execution by modern maneuverable aircraft], Trudy MAI [Trudy MAI], 2016, Issue No. 87. Available at: http://trudy.mai.ru/upload/iblock/ebf/pavlova_smeyukha_rus.pdf.
12. Corke P. An inertial and visual sensing system for a small autonomous helicopter, J. Robot. Syst., 2004, Vol. 21 (2), pp. 43-51.
13. Cesetti A., Frontoni E., Mancini A. et al. A Visual Global Positioning System for Unmanned Aerial Vehicles Used in Photogrammetric Applications, J. Intell. Robot. Syst., 2011, 61: 157. DOI: 10.1007/s10846-010-9489-5.
14. Garcia Carrillo L.R., Dzul Lopez A.E., Lozano R. et al. Combining Stereo Vision and Inertial Navigation System for a Quad-Rotor UAV, J. Intell. Robot. Syst., 2012, 65: 373. DOI: 10.1007/s10846-011-9571-7.
15. Bai G., Liu J., Song Y., Zuo Y. Two-UAV Intersection Localization System Based on the Airborne Optoelectronic Platform, Sensors, 2017, Vol. 17, 98.
16. Cesetti A., Frontoni E., Mancini A., Zingaretti P., Longhi S. A Vision-Based Guidance System for UAV Navigation and Safe Landing using Natural Landmarks, Selected papers from the 2nd International Symposium on UAVs, Reno, Nevada, U.S.A. June 8–10, 2009, pp. 233-257.
17. Levin A., Szeliski R. Visual odometry and map correlation, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA, 2004.
18. ETH IDSC. Flying Machine Arena. Zurich. 2014. Available at: http://www.idsc.ethz.ch.
19. Ritz R., Müller M.W., Hehn M., D'Andrea R. Cooperative quadrocopter ball throwing and catching, Proceedings of Intelligent Robots and Systems (IROS), IEEE/RSJ International Conference. Vilamoura. October 2012, IEEE. 2012, pp. 4972-4978.
20. Aksenov A.Y., Kuleshov S.V., Zaytseva A.A. An Application of Computer Vision Systems to Unmanned Aerial Vehicle Autolanding, A. Ronzhin et al. (Eds.): ICR 2017, LNAI 10459, pp. 105-112, 2017. DOI: 10.1007/978-3-319-66471-2_12.|