|Article title||METHOD OF SPATIAL LOCALIZATION OF STATIC OBJECTS BASED ON DEPTH SENSOR AND RGB CAMERA DATA|
|Authors||A. O. Pyavchenko, A. V. Ilchenko|
|Section||SECTION V. MACHINE VISION|
|Month, Year||01, 2018|
|Abstract||The aim of this research is to improve the quality of the onboard computer vision system (CVS) of a mobile robotic platform that localizes static obstacles (objects) in a three-dimensional, a priori non-deterministic environment by jointly using onboard depth sensors and an RGB camera. The article examines the spatial object localization method known as DARP (Depth-Assisted Rectification of Patches) and evaluates its effectiveness for detecting and localizing objects from onboard depth sensor data with different feature detectors. To make the DARP method more effective for mobile robotic platforms, it is proposed to combine it with a number of SLAM (Simultaneous Localization and Mapping) algorithms. In particular, it is recommended to use the ORB method (Oriented FAST and Rotated BRIEF) as both the feature detector and the feature descriptor, and to analyze the current scene taking its development history into account. As a result, the onboard computer can estimate object sizes in real time, correct or partially restore distorted or damaged sections of image frames, and determine which object within the CVS coverage area of the platform a given range measurement belongs to. These theoretical assumptions were tested with research software developed in the ROS environment on a Linux-compatible operating platform, implementing the proposed approach to detecting and localizing objects from depth sensor data. The developed software complex was successfully tested with an Intel NUC NUC6I5SYH mini-PC onboard computer and two Intel RealSense 3D cameras (RGB-D cameras), allowing RGB and range data coming from the onboard vision system of the mobile robotic platform to be received, processed, and visualized.
A number of experiments performed on the proposed complex have shown that, in comparison with analogues, the developed and software-implemented set of ORB+DARP algorithms, oriented toward use with RGB-D cameras, provides the required improvement of the onboard vision system in detecting and spatially localizing objects located along the direction of motion of the mobile robotic platform.|
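As an illustration of the geometry underlying depth-based object localization described in the abstract (this is a minimal sketch, not the authors' software; the intrinsic parameters fx, fy, cx, cy are hypothetical placeholders, chosen only to be roughly plausible for an RGB-D camera), a depth map can be back-projected into a camera-frame 3D point cloud with the pinhole model:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) into camera-frame 3D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy 4x4 depth map: a flat wall 2 m away with one invalid pixel.
depth = np.full((4, 4), 2.0)
depth[0, 0] = 0.0
pts = depth_to_point_cloud(depth, fx=380.0, fy=380.0, cx=2.0, cy=2.0)
print(pts.shape)  # (15, 3) -- 16 pixels minus the one invalid reading
```

In a full pipeline such points would then be grouped and matched against rectified image patches to decide which object each range reading belongs to.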
|Keywords||Computer vision; mobile robotic platform; object detection and object localization; RGB-D camera; point cloud; local navigation; DARP method; image features resistant to scale distortion; ORB; software complex; ROS environment.|
|References||1. Razrabotka mul'timediynykh prilozheniy s ispol'zovaniem bibliotek OpenCV i IPP: Zadacha detektirovaniya ob"ektov na izobrazheniyakh i metody ee resheniya [Development of multimedia applications using the OpenCV and IPP libraries: The task of detecting objects in images and methods for solving it], NOU INTUIT [NEU INTUIT]. Available at: https://www.intuit.ru/studies/courses/10622/1106/lecture/18020?page=4 (accessed 26 February 2018).
2. Analiz dvizheniya v zadachakh videonablyudeniya [Motion analysis in video surveillance problems], Tekhnicheskoe zrenie [Technical vision]. Available at: http://wiki.technicalvision.ru/index.php?title=Analiz_dvizheniya_v_zadachakh_videonablyudeniya&oldid=896 (accessed 26 February 2018).
3. Svertochnaya neyronnaya set' [Convolutional Neural Network], Vikipediya. Data obnovleniya: 11.01.2018 [Wikipedia contributors. Last changes: 11.01.2018]. Available at: http://ru.wikipedia.org/?oldid=90226289 (accessed 26 February 2018).
4. Schmidhuber J. Deep Learning in Neural Networks: An Overview, Neural Networks, Jan 2015, Vol. 61, pp. 85-117.
5. Lima J.P., Simoes F., Uchiyama H., Teichrieb V., Marchand E. Depth-assisted rectification of patches using RGB-D consumer devices to improve real-time keypoint matching, In: Int. Conf. on Computer Vision Theory and Applications, VISAPP 2013, pp. 651-656.
6. Lima J.P., Teichrieb V., Uchiyama H., Marchand E. Object detection and pose estimation from natural features using consumer RGB-D sensors: Applications in augmented reality, In: IEEE Int. Symp. on Mixed and Augmented Reality (doctoral symposium), ISMAR'12, pp. 1-4.
7. Lima J.P., Simoes F., Uchiyama H., Teichrieb V., Marchand E. Depth-assisted rectification for real-time object detection and pose estimation.
8. Harris C., Stephens M. A combined corner and edge detector, Alvey vision conference. Manchester: The Plessey Company, 1988, Vol. 15, pp. 147-151.
9. Rosten E., Drummond T. Machine learning for high-speed corner detection, European Conference on Computer Vision (ECCV). Graz, 2006, Vol. 9, pp. 430-443.
10. Lowe D.G. Distinctive image features from scale-invariant keypoints, International Journal of Computer Vision. Springer Netherlands, 2004, Vol. 60 (2), pp. 91-110.
11. Berkmann J., Caelli T. Computation of surface geometry and segmentation using covariance techniques, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1994, Vol. 16 (11), pp. 1114-1116.
12. Brockett R. Robotic manipulators and the product of exponentials formula, International Symposium on Mathematical Theory of Networks and Systems, 1984, pp. 120-127.
13. Rublee E. et al. ORB: an efficient alternative to SIFT or SURF, International Conference on Computer Vision (ICCV). Barcelona: IEEE, 2011.
14. Bay H., Ess A., Tuytelaars T., Van Gool L. Speeded-up robust features (surf), Computer vision and image understanding, 2008, Vol. 110 (3), pp. 346-359.
15. Calonder M., Lepetit V., Strecha C., and Fua P. Brief: Binary robust independent elementary features, European Conference on Computer Vision, 2010, pp. 778-792.
16. Lee W., Park N., Woo W. Depth-assisted Real-time 3D Object Detection for Augmented Reality, International Conference on Artificial Reality and Telexistence (ICAT). Osaka, 2011, Vol. 21, pp. 126-132.
17. Rosten E., Drummond T. Fusing points and lines for high performance tracking, IEEE International Conference on Computer Vision, Vol. 2. IEEE, 2005, pp. 1508-1511.
18. Rosin P.L. Measuring corner properties, Computer Vision and Image Understanding, 1999, Vol. 73 (2), pp. 291-307.
19. Everingham M., Zisserman A., Williams C., Van Gool L. The PASCAL Visual Object Classes Challenge 2006 (VOC2006). [Report], 57 p.
20. Labbé M., Michaud F. Memory management for real-time appearance-based loop closure detection, in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011, pp. 1271-1276 (IEEE Xplore).
21. Labbé M., Michaud F. Appearance-Based Loop Closure Detection for Online Large-Scale and Long-Term Operation, in IEEE Transactions on Robotics, 2013, Vol. 29, No. 3, pp. 734-745 (IEEE Xplore).
22. Il'chenko A.V., P'yavchenko A.O. Problemy postroeniya sistemy tekhnicheskogo zreniya mobil'nogo robota na osnove infrakrasnogo 3D-datchika glubiny okruzhayushchego prostranstva [The constructing's problems of a mobile robot's computer vision system based on an infrared 3D depth sensor of the environment], Sb. trudov XIII Vserossiyskoy nauchn. konf. mol. uch., asp. i stud. «Informatsionnye tekhnologii, sistemnyy analiz i upravlenie (ITSAiU-2015)» [Proceedings of the XIII All-Russian Scientific Conference of junior scientists, postgraduate students and students. «Information technology, systems analysis and management (ITSAaM-2015)»]. Rostov-on-Don: Izd-vo YuFU, 2016, Vol. 3, pp. 50-60.
23. P'yavchenko A.O., Pereverzev V.A., Il'chenko A.V. Tekhnologiya SLAM i metodologicheskie problemy ee realizatsii v robototekhnike [SLAM technology and methodological problems of its application in robotics], Sb. trudov XIV Vserossiyskoy nauchn. konf. mol. uch., asp. i stud. «Informatsionnye tekhnologii, sistemnyy analiz i upravlenie (ITSAU-2016)» [Proceedings of the XIV All-Russian Scientific Conference of junior scientists, postgraduate students and students. «Information technology, systems analysis and management (ITSAaM-2016)»]. Rostov-on-Don: Izd-vo YuFU, 2016, Vol. 2, pp. 345-351.