Specialist hardware not needed as gaming and interfaces set to benefit, writes former Engineers Ireland president Dr Chris Horn.
Eye-tracking technology has a number of applications, but the most impressive of these is helping people with severe motor impairment to be able to interact with computers. Sufferers of motor neuron disease, cerebral palsy and other illnesses may lose the ability to speak, as well as the control of hand and finger movement.
Text-to-speech unit
Technology which can accurately track the gaze of an eye can be trained to help sufferers to type and select words, thus enabling them to interact with a computer. In turn, the computer can use a text-to-speech unit to speak the 'eyed-in' phrases aloud.
The celebrated physicist Stephen Hawking used eye-tracking technology for a time to write and synthesise speech. Unfortunately his eyelids then began to droop, impairing the effectiveness of the system, and a replacement was built which tracked tension in his cheek muscle. Nevertheless, eye tracking is widely used to help those challenged by motor impairment to type and to have their speech synthesised.
One of the first eye trackers was built by an educational psychologist, Edmund Huey, in 1908. It used a plastic lens over the eye with a hole which tracked the position of the eye pupil. The lens had an aluminium pointer which traced onto a rotating drum.
Modern eye trackers are special-purpose hardware units. They continuously scan the difference in the position of the centre of the eye pupil and of an illumination generated by the device, reflected off the spherical surface of the cornea.
With calibration for each user, and knowing the position of the illumination relative to the eye, this angular difference can be used to derive the gaze direction. The illumination itself is usually in the infrared spectrum and hence invisible to the user.
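The calibration step described above can be sketched in code. The quadratic mapping and all names below are illustrative assumptions, not any vendor's actual method: the tracker is assumed to report the 2-D offset between pupil centre and corneal reflection, and a short calibration session (the user looks at a handful of known on-screen targets) fits a mapping from offsets to screen positions.

```python
# Minimal sketch of per-user gaze calibration, assuming the tracker
# reports the 2-D vector between pupil centre and corneal reflection.
# The quadratic feature set and function names are illustrative only.
import numpy as np

def fit_calibration(vectors, targets):
    """Fit a quadratic mapping from (dx, dy) offsets to screen points.

    vectors: (n, 2) pupil-minus-reflection offsets recorded while the
             user looks at n known calibration targets.
    targets: (n, 2) on-screen coordinates of those targets.
    Returns the least-squares coefficient matrix.
    """
    dx, dy = vectors[:, 0], vectors[:, 1]
    # Quadratic feature expansion: 1, dx, dy, dx*dy, dx^2, dy^2
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
    coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coeffs

def estimate_gaze(coeffs, vector):
    """Map one measured offset vector to an estimated screen point."""
    dx, dy = vector
    features = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    return features @ coeffs
```

A nine-point grid of calibration targets, as in the test below, is enough to determine the six coefficients per axis; real trackers refine this with head-pose compensation.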
Interest and attention
With the advent of the commercial web in the mid-1990s, there was a surge of industry interest in eye-tracking technology to discover which designs and layouts for web pages and websites were the most conducive to advertising. The best placement of adverts on a web page full of text segments and multiple images depends heavily on the other content on the page.
However, tracking the movement of a computer mouse across a web page can also be used as a proxy for a user's interest and attention. Unlike eye tracking, it requires neither specialist hardware nor per-user calibration.
On the other hand, mouse tracking is not a high-fidelity proxy for intent, since a mouse may often be at rest on the page while a user nevertheless scans content. Today, neither eye tracking nor mouse tracking is normally used.
Instead, web page designs are improved by showing slightly different layouts concurrently to two or more groups of users, and testing which layout elicits the best response over a set period of time. The process can be iterated, leading to significant improvements on the initial page design.
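The split-test comparison described above can be illustrated with a standard two-proportion z-test: each group's click-through rate is compared and a p-value indicates whether the difference is likely to be real. The figures in the test are invented for illustration, and real experimentation platforms layer much more on top of this.

```python
# Hedged sketch of comparing two page layouts shown to separate user
# groups, using a two-proportion z-test on click-through rates.
from math import sqrt, erf

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Return the z statistic and two-sided p-value for the difference
    between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF: Phi(x) = 0.5*(1+erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With, say, 120 clicks from 10,000 views of layout A against 150 from 10,000 views of layout B, the p-value comes out around 0.07 — suggestive but not yet conclusive, which is why such tests run for a set period before a winner is declared.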
Eye-tracking technology is frequently applied in designing the best visual layout of instruments and controls for machine operators. Most car and aircraft manufacturers use the technology to monitor the attention of test drivers and pilots to alternative dashboard and cockpit layouts, and so derive safer designs.
Some car manufacturers have integrated eye tracking to detect when a driver does not appear to be concentrating on the road ahead, and to generate an appropriate warning.
Once just a niche technology for assistive technologies and industry ergonomic design, eye tracking may now become a mainstream consumer technology and potentially represents a significant market opportunity.
The intense computational demands of computer gaming have driven many developments in computing. Head tracking is supported by a number of games, and there is strong interest in tracking eye gaze in addition to head position. There are eye-tracking attachments for gaming consoles, and the technology is being incorporated into some virtual- and augmented-reality headsets and glasses.
Once a game can dynamically track a player’s gaze, then it can for example enhance the detail of its generated images at the focus of the gaze, swivel the image to the centre of the screen, or activate a control.
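The first of these uses — concentrating rendering detail where the player is actually looking, often called foveated rendering — can be sketched very simply. The tier radii below are invented parameters for illustration, not taken from any real engine.

```python
# Illustrative sketch of gaze-driven detail: render at full resolution
# near the gaze point and progressively coarser further away.
# The radii are assumed thresholds, not real engine parameters.
from math import hypot

def detail_level(pixel, gaze, radii=(100.0, 300.0)):
    """Return a detail tier for a pixel given the current gaze point.

    0 = full detail (foveal region), 1 = medium, 2 = coarse periphery.
    Distances and radii are in screen pixels.
    """
    distance = hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if distance <= radii[0]:
        return 0
    if distance <= radii[1]:
        return 1
    return 2
```

The payoff is that most of the screen falls in the coarse tier at any instant, so the renderer spends its budget only where the eye can actually resolve fine detail.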
A Swiss start-up, Eyeware Tech, has recently announced eye tracking technology which completely dispenses with the need for specialist hardware units.
Exploiting the improvement in the capabilities of smartphone devices, including built-in depth-sensing technology, Eyeware has launched a software-only eye-tracking solution using off-the-shelf iPhones and iPads.
By downloading the Eyeware Beam app (currently in beta testing) from the Apple App Store, a user simply has to place their iPhone or iPad beside their gaming screen. Eyeware has suggested that its business model will be to offer its eye-tracking enhancement as an in-app purchase within each game.
Rather than using corneal reflections in the infrared spectrum, the app instead dynamically and continuously analyses the image of the user's face to track eye movements and gaze. Using just normal light and standard smart devices, the strategy threatens to disrupt the established eye-tracking suppliers.
This article first appeared in The Irish Times on September 9, 2021.
Author: Dr Chris Horn, former president of Engineers Ireland, is the co-founder and former CEO and chairman of IONA Technologies, and an industry expert on Irish technology development, trends and business. An honorary Doctor of Science of Trinity College Dublin and former TCD lecturer in computer science, Dr Horn is at the forefront of the Irish high-tech debate.