A team of researchers led by Professor Paul MacNeilage of the University of Nevada has developed a head-mounted video system that shows in detail how a person looks at the world. It simultaneously records the scene in front of the wearer and the movements of their eyes, then aligns the two streams into a single model. The first experiments showed that our perception of what we have seen differs markedly from where our eyes actually looked.
The system is built around a headset from the German company Pupil Labs, fitted with a pair of micro-cameras that track eye movements and an inertial measurement unit (IMU) for orientation in space. MacNeilage's team added two more forward-facing cameras that capture the wearer's view of the scene. To keep the headset itself light, the computing is offloaded to a laptop carried in a backpack, to which the headset is connected.
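One way to picture the output of such a rig is as a stream of synchronized samples, each bundling readings from every sensor at one moment. The record below is a purely hypothetical sketch: the field names and layout are my assumptions for illustration, not Pupil Labs' actual data format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical record for one synchronized sample from the headset.
# Field names are illustrative, not the device's real schema.
@dataclass
class HeadsetSample:
    timestamp_s: float                             # shared clock for all streams
    left_pupil: Tuple[float, float]                # pupil center, eye-camera pixels
    right_pupil: Tuple[float, float]
    head_quat: Tuple[float, float, float, float]   # IMU orientation (w, x, y, z)
    scene_frame_id: int                            # index into forward-camera video
    gps: Optional[Tuple[float, float]] = None      # (lat, lon) when available

sample = HeadsetSample(
    timestamp_s=12.034,
    left_pupil=(96.2, 58.7),
    right_pupil=(101.4, 60.1),
    head_quat=(1.0, 0.0, 0.0, 0.0),
    scene_frame_id=361,
)
print(sample.scene_frame_id)  # → 361
```

Keeping every stream on one shared clock is what later lets eye movements be matched frame-by-frame against the scene video.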
The resulting system tracks the movements of the eyes and the head simultaneously while keeping the two data streams separate, so it can show not only which way a person is facing but also which point their gaze is fixed on. The scene changes constantly, and even during a leisurely walk our gaze is in continuous motion. Combined with GPS data, video from the forward-facing cameras, and shared timestamps, this yields a complete model of what the user saw, highlighting the objects and details that caught their attention.
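The reason eye and head data must be separated and then recombined is a coordinate-frame transform: the eye tracker reports gaze direction in the head's frame, while the IMU gives the head's orientation in the world frame; composing the two yields where the person is actually looking in the world. A minimal sketch in plain Python, assuming simple yaw rotations for illustration (the angles and matrix setup are my own example, not the team's pipeline):

```python
import math

def rot_y(deg):
    # Rotation matrix about the vertical (yaw) axis, angle in degrees.
    a = math.radians(deg)
    return [[math.cos(a), 0.0, math.sin(a)],
            [0.0, 1.0, 0.0],
            [-math.sin(a), 0.0, math.cos(a)]]

def apply(R, v):
    # Matrix-vector product: rotate vector v by matrix R.
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Eye tracker: gaze direction in the *head* frame --
# the eyes are rotated 30 degrees to the left of straight ahead (+z).
gaze_in_head = apply(rot_y(30), [0.0, 0.0, 1.0])

# IMU: head orientation in the *world* frame --
# the head itself is turned 30 degrees to the right.
head_in_world = rot_y(-30)

# Composing the two gives gaze in world coordinates: the eye rotation
# cancels the head rotation, so the person is looking straight ahead
# in the world even though neither sensor alone would say so.
gaze_in_world = apply(head_in_world, gaze_in_head)
print([round(c, 6) for c in gaze_in_world])  # → [0.0, 0.0, 1.0]
```

This cancellation is exactly why the two streams must stay separate until fusion: summing them naively, or logging only one, would lose the distinction between "where the head points" and "where the gaze lands".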
In the next step, MacNeilage's team wants to record a series of videos with volunteers ranging in age from 5 to 70. The resulting dataset will be analyzed with a variety of methods to answer the question: how exactly do we visually assess the world? What attracts the eye and why, what holds it, how does visual interaction unfold? The findings could later inform the development of implants, artificial intelligence, and new sensors, as well as work in neuroscience and even art.