Experimental Evaluation of the Effectiveness of Using Visual Cues for Controlling Unmanned Vehicles





unmanned vehicle, localization, fiducial marker, visual tag, ArUco, ARToolKitPlus, QR code, JeVois-A33 machine vision camera


Purpose. The study aims to experimentally evaluate the effectiveness of visual cues, namely ARToolKitPlus markers, ArUco markers, and two-dimensional QR (quick response) codes, for localizing unmanned vehicles (UVs) indoors. Methodology. To support marker-based localization and perception of unmanned vehicles, the structure of a visual marker processing subsystem has been developed. An algorithm for the combined use of the three marker types (ARToolKitPlus, ArUco, and QR codes) for localizing unmanned vehicles and identifying cargo is proposed, illustrated with an online store warehouse scenario. The experiments on the markers were performed with the JeVois-A33 smart machine vision camera running the JeVois Markers Combo software module and the JeVois Inventor graphical interface. Findings. An experimental study of the correct recognition of visual markers under indoor operating conditions was carried out. A series of experiments determined the rates of correct recognition of ArUco markers, ARToolKitPlus markers, and QR codes when scanned perpendicular to the camera axis at distances from 0.3 to 2 meters. Originality. The study obtained the probabilities of correct recognition of ArUco markers, ARToolKitPlus markers, and two-dimensional QR codes under the conditions of indoor localization of unmanned vehicles. Practical value. The results can be used to create, simulate, and analyze the effectiveness of algorithms for indoor localization and perception of unmanned vehicles using the corresponding visual markers. The proposed generalized structure of the visual marker processing subsystem of the localization and perception system can be used in developing unmanned vehicle control systems.
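The abstract describes a combined-marker scheme in which fiducial markers (ArUco, ARToolKitPlus) serve vehicle localization while QR codes carry cargo identifiers, with detections valid only in the experimentally tested 0.3 to 2 m range. The thesis itself is not reproduced here, so the following is only an illustrative sketch of such a dispatch step; the `Detection` type, field names, and return strings are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass
from enum import Enum, auto

class MarkerType(Enum):
    ARUCO = auto()          # fiducial landmark used for localization
    ARTOOLKITPLUS = auto()  # alternative fiducial landmark for localization
    QR = auto()             # two-dimensional code carrying a cargo identifier

@dataclass
class Detection:
    marker_type: MarkerType
    payload: str            # marker ID or decoded QR text (hypothetical field)
    distance_m: float       # estimated camera-to-marker distance

def route_detection(d: Detection) -> str:
    """Dispatch a detection to the localization or cargo-identification branch.

    Detections outside the distance range tested in the experiments
    (0.3 to 2 m, scanned perpendicular to the camera axis) are discarded
    as unreliable.
    """
    if not (0.3 <= d.distance_m <= 2.0):
        return "discard"
    if d.marker_type is MarkerType.QR:
        return f"identify-cargo:{d.payload}"
    return f"localize:{d.payload}"
```

For example, `route_detection(Detection(MarkerType.QR, "box-42", 1.2))` yields `"identify-cargo:box-42"`, while an ArUco detection at 2.5 m is discarded.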


Ostapets, Y. D. (2024). Research of systems and algorithms for controlling unmanned vehicles (Master’s thesis). Ukrainian State University of Science and Technology. Dnipro. (in English)

Benligiray, B., Topal, C., & Akinlar, C. (2019). STag: A stable fiducial marker system. Image and Vision Computing, 89, 158-169. DOI: https://doi.org/10.1016/j.imavis.2019.06.007 (in English)

dos Santos Cesar, D. B., Gaudig, C., Fritsche, M., dos Reis, M. A., & Kirchner, F. (2015, May). An evaluation of artificial fiducial markers in underwater environments. In OCEANS 2015 – Genova (pp. 1-7). Genova, Italy. DOI: https://doi.org/10.1109/oceans-genova.2015.7271491 (in English)

Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. J., & Marín-Jiménez, M. J. (2014). Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognition, 47(6), 2280-2292. DOI: https://doi.org/10.1016/j.patcog.2014.01.005 (in English)

Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. J., & Medina-Carnicer, R. (2016). Generation of fiducial marker dictionaries using Mixed Integer Linear Programming. Pattern Recognition, 51, 481-491. DOI: https://doi.org/10.1016/j.patcog.2015.09.023 (in English)

Lightbody, P., Krajník, T., & Hanheide, M. (2017). An efficient visual fiducial localisation system. ACM SIGAPP Applied Computing Review, 17(3), 28-37. DOI: https://doi.org/10.1145/3161534.3161537 (in English)

Minor, B. Reverse-Engineering Fiducial Markers For Perception. Tangram Vision. Retrieved from https://www.tangramvision.com/blog/reverse-engineering-fiducial-markers-for-perception#further-resources (in English)

Pătru, G-C., Pîrvan, A-I., Rosner, D., & Rughiniş, R-V. (2023). Fiducial marker systems overview and empirical analysis of ArUco, AprilTag and CCTag. U.P.B. Sci. Bull. Series C, 85(2), 49-62. (in English)

Siki, Z., & Takács, B. (2021). Automatic Recognition of ArUco Codes in Land Surveying Tasks. Baltic Journal of Modern Computing, 9(1), 115-125. DOI: https://doi.org/10.22364/bjmc.2021.9.1.06 (in English)

Zakiev, A., Tsoy, T., Shabalina, K., Magid, E., & Saha, S. K. (2020, July). Virtual Experiments on ArUco and AprilTag Systems Comparison for Fiducial Marker Rotation Resistance under Noisy Sensory Data. In 2020 International Joint Conference on Neural Networks (pp. 1-6). Glasgow, United Kingdom. DOI: https://doi.org/10.1109/ijcnn48605.2020.9207701 (in English)

Zhang, X., Fronz, S., & Navab, N. (2002, Oct.). Visual marker detection and decoding in AR systems: a comparative study. In IEEE and ACM International Symposium on Mixed and Augmented Reality (pp. 97-106). Darmstadt, Germany. DOI: https://doi.org/10.1109/ismar.2002.1115078 (in English)



How to Cite

Ostapets, Y. D. (2024). Experimental Evaluation of the Effectiveness of Using Visual Cues for Controlling Unmanned Vehicles. Science and Transport Progress, 2(106), 34–42. https://doi.org/10.15802/stp2024/306148