| Issue | EPJ Web Conf., Volume 341, 2025: 2nd International Conference on Advent Trends in Computational Intelligence and Communication Technologies (ICATCICT 2025) |
|---|---|
| Article Number | 01009 |
| Number of page(s) | 10 |
| DOI | https://doi.org/10.1051/epjconf/202534101009 |
| Published online | 20 November 2025 |
Eye Detection for Attention-Based Video Playback
Ajeenkya D. Y. Patil School of Engineering, Lohegaon, Pune, India
* Corresponding author
Abstract
Eye-tracking technology has become an important component of human-computer interaction, enabling automation based on where a user looks. We present a study that applies eye tracking to video control: the video pauses or plays depending on whether the user's attention is on the screen. We outline ways in which eye-tracking systems can be used in media players, not only for automation but also to enhance accessibility, user engagement, and the overall viewing experience. Our findings indicate that video playback is a meaningful context for integrating eye trackers, and that eye tracking can change how users interact with videos, reducing the distractions and slowdowns involved in accessing media; for example, if head movements are accurately captured, a user could browse and view videos without touching the screen. The study also proposes next steps for combining eye tracking with artificial intelligence and deep learning to improve the accuracy and usefulness of eye tracking in video playback systems. The paper additionally presents examples of how eye-tracking technology is applied in education (training prospective medical practitioners and healthcare simulation), in mental health and disability learning, and in gaming, where it is used to study player behaviour. The prototype we built used OpenCV and several basic programming libraries, with an eye tracker as the only input device for tracking the user's eyes. Testing showed that the prototype performed adequately under normal operating conditions, though there remains considerable room for improvement. As noted above, beyond media control, the system could also improve access for people living with disabilities.
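As an illustrative sketch only (the paper does not publish its source code), the pause/play behaviour described in the abstract could be built from OpenCV's bundled Haar cascade eye detector plus a small debounce state machine, so that a single blink or missed detection does not stop the video. All class, function, and parameter names below (`AttentionController`, `miss_limit`, `run_attention_loop`) are hypothetical, not the authors' implementation.

```python
# Sketch of attention-based playback control (hypothetical names throughout).

class AttentionController:
    """Debounced play/pause decision: pause only after `miss_limit`
    consecutive frames in which no eyes were detected."""

    def __init__(self, miss_limit=15):
        self.miss_limit = miss_limit  # frames of "no eyes" before pausing
        self.misses = 0
        self.playing = True

    def update(self, eyes_found):
        """Feed one detection result per frame; returns True to keep playing."""
        if eyes_found:
            self.misses = 0
            self.playing = True
        else:
            self.misses += 1
            if self.misses >= self.miss_limit:
                self.playing = False
        return self.playing


def run_attention_loop(miss_limit=15):
    """Webcam loop: detect eyes in each frame and decide play/pause.
    Requires opencv-python; the cascade XML file ships with OpenCV."""
    import cv2  # imported here so the controller above stays dependency-free

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    ctrl = AttentionController(miss_limit)
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            eyes = cascade.detectMultiScale(
                gray, scaleFactor=1.1, minNeighbors=5)
            if ctrl.update(len(eyes) > 0):
                pass  # hook: resume playback in the media player
            else:
                pass  # hook: pause playback in the media player
    finally:
        cap.release()
```

At a typical 30 fps webcam rate, `miss_limit=15` corresponds to roughly half a second of absent gaze before pausing; the Haar cascade could later be swapped for a deep-learning gaze model, in line with the future work the paper suggests.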
Key words: Self-regulated learning / cognitive load / eye-tracking / machine learning / optimized learning
© The Authors, published by EDP Sciences, 2025
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

