Enhancing Interface Efficiency: Adaptive Virtual Keyboard Minimizing Keystrokes in Electrooculography-Based Control

Arrya Anandika
Pringgo Dwi Laksono
Muhammad Syaiful Amri bin Suhaimi
Joseph Muguro
Muhammad Ilhamdi Rusydi

Keywords

Control, EOG, Virtual Keyboard, Adaptive

Abstract

Rapid technological development has enabled communication between humans and machines through biosignals. One such biosignal is the electrooculogram (EOG), which is obtained from eye movement. EOG-based control of virtual keyboards using eye-gaze motion has been widely studied, but previous work has mostly drawn conclusions from the time consumed in typing paragraphs rather than from the number of eye-gaze motions the user must perform. In this research, an adaptive virtual keyboard controlled by EOG signals is built. The keyboard has 7x7 dimensions with 49 buttons, comprising main buttons, letters, numbers, symbols, and unused buttons. Its layout is divided into six zones, each requiring a different number of steps to reach; characters located in the same zone require the same number of steps. The adaptive feature rearranges the character buttons based on the characters used previously. In the experiments, 30 respondents typed 7 paragraphs using both the static and the adaptive virtual keyboard. The adaptive mode rearranges button positions after every k selections by the respondent, with k set to 10, 30, 50, 70, and 100. The two keyboard modes are evaluated by the number of steps required to type the paragraphs. Test results show that the adaptive virtual keyboard shortens the number of user steps compared to the static mode: by up to 283 steps in the optimal-system test and by up to 258 steps, or about 40%, in the respondent tests.
These results highlight the potential of EOG-driven adaptive virtual keyboards to improve typing efficiency, and they point to a promising direction for future human-machine interface development.
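The adaptive rearrangement described in the abstract can be illustrated with a short sketch. Note that the zone step costs, zone sizes, and the frequency-ranking rule below are illustrative assumptions, not the authors' published algorithm; only the overall idea (six zones of differing step cost, layout rebuilt after every k selections, evaluation by total steps) comes from the abstract.

```python
from collections import Counter

# Assumed cost model: six zones, cheapest first. The real keyboard's
# zone sizes are not given in the abstract; these sum to 49 buttons.
ZONE_STEPS = [1, 2, 3, 4, 5, 6]    # gaze steps to reach a character in each zone
ZONE_SIZES = [4, 6, 8, 9, 10, 12]  # illustrative character capacity per zone

def build_layout(frequency):
    """Place the most frequently used characters in the cheapest zones."""
    ranked = [c for c, _ in frequency.most_common()]
    layout, i = {}, 0
    for zone, size in enumerate(ZONE_SIZES):
        for c in ranked[i:i + size]:
            layout[c] = zone
        i += size
    return layout

def total_steps(text, k=10, default_zone=5):
    """Type `text`, rebuilding the layout after every k selections."""
    freq, steps, layout = Counter(), 0, {}
    for n, ch in enumerate(text):
        zone = layout.get(ch, default_zone)  # unseen characters start in the far zone
        steps += ZONE_STEPS[zone]
        freq[ch] += 1
        if (n + 1) % k == 0:                 # adaptive rearrangement point
            layout = build_layout(freq)
    return steps
```

A static keyboard corresponds to never rebuilding the layout (k larger than the text), so comparing `total_steps` for the two settings mirrors the step-count evaluation used in this study.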

[33] A. Anandika, M. I. Rusydi, P. P. Utami, R. Hadelina, and M. Sasaki, "Hand Gesture to Control Virtual Keyboard using Neural Network," JITCE (Journal of Information Technology and Computer Engineering), vol. 7, no. 01, pp. 40–48, Mar. 2023, doi: 10.25077/jitce.7.01.40-48.2023.