Machine Learning Techniques for Biometric Unsupervised Gaze Estimation in ADHD Screening
DOI: https://doi.org/10.47611/jsrhs.v13i2.6439
Keywords: Gaze Estimation, ADHD Screening, Convolutional Neural Network
Abstract
Attention-Deficit/Hyperactivity Disorder (ADHD) is a complex neurodevelopmental condition characterized by persistent challenges with attention, hyperactivity, and impulsivity that significantly impact daily functioning and developmental trajectories. The traditional ADHD diagnostic process typically involves comprehensive assessments conducted by healthcare professionals such as psychologists, psychiatrists, or pediatricians. These assessments rely heavily on subjective observations and reports from multiple sources. However, this approach is time-consuming, labor-intensive, and often requires multiple appointments, making it a resource-intensive process. To address this issue, I propose an unsupervised learning-based gaze estimation system for ADHD screening. The proposed system takes eye images as input and generates gaze vectors that indicate the individual's current focal point. By aggregating these gaze vectors over a time series, the system can identify abnormalities in gaze patterns, facilitating early screening for ADHD. Comprehensive experiments demonstrate that the proposed system outperforms previous methods, and the results confirm the feasibility of using the proposed method as a biometric for ADHD screening.
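The abstract describes a pipeline in which a convolutional network maps eye images to gaze vectors, and the vectors collected over a time window are then summarized into gaze-pattern statistics used for screening. The sketch below illustrates that idea only; it is not the system evaluated in the paper. The toy network architecture, the 36x60 grayscale input size, the dispersion and saccade-rate statistics, and the 0.1 rad jump threshold are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): a small CNN maps eye-image
# crops to 2-D gaze vectors (yaw, pitch), and the vectors collected over a time
# window are reduced to simple gaze-pattern statistics for screening.
import torch
import torch.nn as nn


class GazeCNN(nn.Module):
    """Toy convolutional gaze estimator: eye image -> (yaw, pitch) in radians."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Linear(32 * 4 * 4, 2)  # predicts (yaw, pitch)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


def gaze_pattern_stats(gaze_seq: torch.Tensor) -> dict:
    """Aggregate a (T, 2) sequence of gaze vectors into screening features.

    Dispersion (how widely gaze wanders) and the rate of large frame-to-frame
    jumps (a crude saccade-frequency proxy) are example statistics only.
    """
    dispersion = gaze_seq.std(dim=0).mean().item()
    jumps = (gaze_seq[1:] - gaze_seq[:-1]).norm(dim=1)
    saccade_rate = (jumps > 0.1).float().mean().item()  # 0.1 rad threshold is illustrative
    return {"dispersion": dispersion, "saccade_rate": saccade_rate}


if __name__ == "__main__":
    model = GazeCNN().eval()
    frames = torch.rand(120, 1, 36, 60)   # 120 synthetic grayscale eye crops
    with torch.no_grad():
        gaze = model(frames)               # (120, 2) gaze vectors over time
    print(gaze_pattern_stats(gaze))
```

In practice, statistics like these would be computed per recording session and compared against typical gaze behavior; the specific features and thresholds used for ADHD screening are detailed in the full paper, not here.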
Copyright (c) 2024 Yaejoon Jung; Eunju Moon
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright holder(s) granted JSR a perpetual, non-exclusive license to distribute and display this article.