Eye-tracking Research Provides a Glimpse Into the Future of Mobile Device Interaction

A recent study of how users’ eyes alone can operate mobile devices may offer a glimpse into the future of gaze-based interaction with smartphones, according to the researchers behind it.

Human-computer interaction researchers from institutions in Scotland, Germany, and Portugal took a close look at how eyes can be used to control mobile devices, and they offer a set of recommendations for building gaze interaction into future generations of technology.

For the first time, the researchers examined how three types of gaze interaction perform while users are sitting or walking, and which approach people prefer in each scenario. The results are set to be presented as a paper at the ACM Conference on Human Factors in Computing Systems later this month.

The findings could help shape the user experience of future mobile devices, which are expected to incorporate eye-tracking technology as front-facing cameras improve.

The paper is based on the experiences of 24 research participants who tested a range of eye-based interaction techniques, first while seated at a desk and then while walking around a room. Participants used each technique to select targets from a grid of white circles on a smartphone screen whenever one of the circles turned black.

Throughout the trial, participants were prompted to select from a varying number of onscreen targets. The number of targets ranged from two to 32, and conditions were counterbalanced between participants to minimize the influence of extraneous factors, such as practice or fatigue, on the experimental results.

The three techniques participants used were Dwell Time, Pursuits, and Gaze Gestures. With Dwell Time, users select an item by fixating on a target for 800 milliseconds. With Pursuits, users select a target by following a small object orbiting it with their eyes. With Gaze Gestures, users look off-screen to the left or right to narrow down the set of targets until only the one they wish to select remains.
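
To make the Dwell Time mechanic concrete, the sketch below shows one way a dwell-based selector could be implemented. It is not code from the study: the DwellSelector class, its update method, and the assumption that gaze samples have already been hit-tested against on-screen targets are all illustrative choices; only the 800-millisecond threshold comes from the article.

    DWELL_THRESHOLD_S = 0.8  # the 800 ms dwell duration used in the study

    class DwellSelector:
        """Fires a selection when gaze rests on one target long enough."""

        def __init__(self, threshold_s=DWELL_THRESHOLD_S):
            self.threshold_s = threshold_s
            self.current_target = None   # target the gaze is currently on
            self.dwell_start = None      # when the gaze arrived there

        def update(self, target_id, timestamp):
            """Feed one gaze sample; return a target id on selection, else None.

            target_id is the on-screen target the gaze currently falls on
            (None if the gaze is on no target). Mapping raw gaze coordinates
            to a target -- the hit-testing step -- is assumed to happen
            upstream and is not shown here.
            """
            if target_id != self.current_target:
                # Gaze moved to a different target (or off all targets):
                # restart the dwell timer.
                self.current_target = target_id
                self.dwell_start = timestamp
                return None
            if target_id is not None and timestamp - self.dwell_start >= self.threshold_s:
                # Dwell threshold reached: select, then reset so the same
                # fixation does not immediately re-fire.
                self.current_target = None
                return target_id
            return None

Pursuits works differently: implementations of that technique in the research literature typically correlate the recent gaze trajectory with each target’s orbiting motion (for example, with a Pearson correlation over a sliding window) and select the target whose motion matches best, though this article does not spell out the study’s exact mechanism.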

The study found that participants preferred Pursuits while sitting. It was also the quickest technique in that setting: on average, users selected a target in 1.36 seconds, compared with 2.33 seconds for Dwell Time and 5.17 seconds for Gaze Gestures.

While walking, users preferred Dwell Time instead. Selections took 2.76 seconds on average, slower than Pursuits (2.14 seconds) but still considerably faster than Gaze Gestures (6.68 seconds).

Dr. Mohamed Khamis, of the University of Glasgow’s School of Computing Science, supervised the study and co-authored the paper. He said: “Eye tracking has been studied extensively in recent years across a variety of user settings, but most of that work has kept the user, the camera, or both stationary.

“Despite the challenge of both the device and the user moving at the same time, eye tracking has become much more practical as smartphone camera technology has improved.

“It’s a very promising way to enable fast interaction with devices, and it could make smartphones easier to use for people with mobility impairments, as well as expanding the range of situations where devices can be used. Anyone who has tried to make an important phone call while wearing gloves, or while carrying something heavy in one hand, knows how hard it can be to free up a hand to touch their smartphone.

“What we set out to do was to investigate how users might prefer to use eye-tracking to control their devices and to lay the groundwork for future developments.”

The paper’s first author is Omar Namnakani, a student at the School of Computing Science. “In the paper, we propose a few guidelines for deciding how gaze-based interaction should be used in different situations,” he said.

“According to our study, Pursuits seems to be the best technique to use when there are fewer than nine targets on the screen and users are seated. When there are more targets, Pursuits can become tiring to use. In those circumstances, both while sitting still and while moving around, Dwell Time is the better choice.

“However, despite being slower and less preferred by users in our study, Gaze Gestures were the most accurate selection method both while users were seated and while they were moving. It should be the technique of choice whenever accuracy matters more than speed.”
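
Read together, those guidelines amount to a simple decision rule. The sketch below is purely illustrative, not an API from the study: the function name, the posture and accuracy flags, and the way the nine-target cutoff is encoded are all assumptions drawn from the quotes above.

    def choose_gaze_technique(num_targets, is_walking, accuracy_first=False):
        """Map the guidelines quoted above to a technique choice.

        All names and flags here are illustrative assumptions; only the
        guidelines themselves come from the researchers.
        """
        if accuracy_first:
            # Gaze Gestures were the most accurate method, seated or
            # moving, despite being the slowest.
            return "Gaze Gestures"
        if not is_walking and num_targets < 9:
            # Pursuits was fastest and preferred while seated with
            # fewer than nine targets on screen.
            return "Pursuits"
        # With more targets, or while on the move, Dwell Time is
        # the better choice.
        return "Dwell Time"

For example, choose_gaze_technique(4, is_walking=False) would return "Pursuits", while the same four-target grid used on the move would fall back to "Dwell Time".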

The study is published in the Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23). The researchers intend to continue working together to explore additional gaze interaction methods that could improve on the three techniques examined in the paper. They are also interested in the implications of eye-tracking technology for users’ digital privacy.