Keywords: Robot localization, Encoders, IMU, Sensor Fusion, EKF
Social robots deployed in public areas such as offices have recently become a popular research topic. The robot considered here is an employee-assistant robot in the Telkom Surabaya Institute of Technology building, tasked with delivering packages to assigned destinations. A frequent problem is error in the robot's localization system, which causes position error as the robot moves toward its target point. This research contributes a comparative evaluation of two localization methods on a mobile-robot prototype: the first uses a rotary encoder sensor alone, and the second uses sensor fusion based on the extended Kalman filter (EKF). The study aims to develop a sensor system suited to the robot's design and its test environment, and to compare the two methods. EKF-based sensor fusion provides more accurate robot localization, especially on complex paths, because it combines several sensors, such as rotary encoders and an IMU (Inertial Measurement Unit), to give more complete and accurate information about the robot's position and orientation. In this study, sensor fusion reduced the localization error to 0.63 m on a straight path and 0.29 m on a complex path, compared with the larger error of 0.89 m obtained with a single sensor. Based on these results, EKF-based sensor fusion can be considered a potential solution for improving the accuracy and performance of other social robots performing similar tasks in the future.
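The fusion scheme described above can be illustrated with a minimal sketch: a differential-drive EKF whose prediction step comes from encoder odometry and whose correction step uses an IMU yaw reading. All class and variable names and every noise value (`Q`, `R`, the initial covariance) are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

class EncoderImuEKF:
    """Minimal EKF fusing wheel-encoder odometry (prediction) with an
    IMU yaw measurement (correction) for a differential-drive robot.
    State vector: [x, y, theta]. Noise values are assumed for illustration."""

    def __init__(self):
        self.x = np.zeros(3)                      # pose estimate [x, y, theta]
        self.P = np.eye(3) * 0.01                 # state covariance (assumed)
        self.Q = np.diag([0.02, 0.02, 0.01])      # process noise (assumed)
        self.R = np.array([[0.005]])              # IMU yaw noise (assumed)

    @staticmethod
    def _wrap(a):
        """Wrap an angle to (-pi, pi]."""
        return (a + np.pi) % (2 * np.pi) - np.pi

    def predict(self, d, dtheta):
        """d: distance traveled from encoder ticks; dtheta: heading change."""
        th = self.x[2]
        self.x += np.array([d * np.cos(th), d * np.sin(th), dtheta])
        self.x[2] = self._wrap(self.x[2])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -d * np.sin(th)],
                      [0.0, 1.0,  d * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, yaw_imu):
        """Correct the heading estimate with the IMU yaw measurement."""
        H = np.array([[0.0, 0.0, 1.0]])           # we observe theta only
        y = np.array([self._wrap(yaw_imu - self.x[2])])  # innovation
        S = H @ self.P @ H.T + self.R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)       # Kalman gain
        self.x += (K @ y).ravel()
        self.x[2] = self._wrap(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P
```

In use, `predict` would be called at each encoder sample and `update` whenever an IMU yaw reading arrives; the correction pulls the encoder-only heading toward the IMU value, which is the mechanism by which fusion reduces drift on complex paths.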