What are you looking at?


By Bryn Farnsworth, iMotions

That’s the question psychologists and market researchers have been asking for decades, and the answer has reliably come from eye tracking technology. Now, as more and more of the devices we use, from our phones to our cars, seek to understand how we interact with the world, eye tracking is being used more than ever.

There are a lot of things happening right now as you look at this screen: Your eyes are tracking the words, and maybe they’re searching for the header, or furtively glancing at the sidebar. This kind of information is critical to market researchers and businesses, who strive to understand how users make decisions (and which of those decisions leads to a purchase).

Psychologists have been using even finer-grained measurements to break down the cognitive and psychophysiological processes that underlie our attention. Knowing what you’re looking at has been central to understanding human beings, a journey that is still very much in progress.

But recently, eye tracking has begun to be applied in a myriad of new ways, thanks to the increased portability and reduced cost of the tracking units. Eye tracking devices can now be almost seamlessly integrated into glasses, offering a way to monitor eye movements with as little distraction as possible. They can be worn in the supermarket for market research, for example, or built discreetly into our phones to improve how we use them. All in all, this opens up new ways for technology to work with human attention.


New tech

It’s of course no surprise that eye tracking, as an increasingly utilized piece of tech, would be introduced to the technology du jour: VR. Eye tracking has been integrated into VR headsets designed by Fove, a Kickstarter-backed startup that uses the attention of users to shape the virtual environment they are placed in. By tracking where the user is looking, scenes can be changed in response to the eyes. This is just one example of these two technologies merging, and it is something we will see more and more of as VR develops.

Human-computer interaction based on eye tracking is being explored and used in applications ranging from assistive technologies to car safety to e-learning, and more. There is a wide, and increasing, range of opportunities for human attention to be paid attention to.

New fields

We’re also seeing eye tracking used not only in new technologies, but also in new fields of study. Urban design, neuroarchitecture, and even the study of art have recently adopted this tool to understand how people examine their surroundings. Knowing how people pay attention has enabled researchers to better understand what is appealing about an environment, what doesn’t work, and what is confusing.

Such studies have also given researchers better insight into what users will do next, helping them anticipate future actions.

For example, Perkins + Will, a research-based architecture firm, uses eye tracking (along with measurements of galvanic skin response (GSR) and other measures) to understand how someone will connect with the buildings it has designed. By placing people in a virtual environment, the firm gains a new layer of insight into how an individual experiences a building before it has even left the drawing board.


Medicine

As for medical uses, there is increasing interest in using eye tracking to help diagnose, and potentially treat, neurological disorders. For example, infants usually like to look at images of people’s faces, scenes that have a social element. Research from UCSD has shown that infants who go on to develop autism are much more likely to prefer images featuring geometric shapes, suggesting that the analysis of eye movements may help guide early diagnosis.
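
Quantifying that kind of preference usually comes down to how long the eyes dwell on each category of content. Below is a minimal sketch of the idea in Python, assuming fixations have already been mapped to labeled areas of interest; the labels, durations, and cut-off are illustrative for the example, not taken from the UCSD research.

    # Hypothetical sketch: quantify gaze preference from fixation data.
    # Each fixation is (area_of_interest, duration_ms); the AOI labels
    # ("social", "geometric") and the 0.5 cut-off are illustrative.

    def preference_for_geometric(fixations, cutoff=0.5):
        """Return the share of looking time spent on geometric images,
        and whether it exceeds the preference cut-off."""
        totals = {"social": 0.0, "geometric": 0.0}
        for aoi, duration_ms in fixations:
            if aoi in totals:
                totals[aoi] += duration_ms
        looked = totals["social"] + totals["geometric"]
        if looked == 0:
            return 0.0, False
        share = totals["geometric"] / looked
        return share, share >= cutoff

    # Example: three fixations on geometric shapes, one on a face.
    sample = [("geometric", 800), ("geometric", 650), ("social", 300), ("geometric", 500)]
    share, prefers_geometric = preference_for_geometric(sample)
    print(f"geometric share: {share:.2f}, flagged: {prefers_geometric}")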

Research platforms, in response to growing demand from companies and research groups, have increasingly focused on supporting such discoveries through eye tracking. The company iMotions, for example, has concentrated on making it simpler to set up an eye tracking study, while also enabling integration with other psychophysiological measurements.

Integrating eye tracking with additional sources of physiological information has drawn increasing attention for the possibilities it offers. While eye tracking alone clearly provides many opportunities to learn about people, its capabilities multiply when it is used in tandem with other measurements.

For example, research has used both eye tracking and GSR (the latter of which can be measured through wearable technology) to study decision making with human-computer interfaces, providing more knowledge about how these should be designed. EEG recordings have also been combined with eye tracking and GSR for emotion recognition.

The possibilities opened up by the breadth of data that these devices offer when used in combination are huge, with great potential for increasing the ease and speed of human-computer interaction. It won’t be too long before we no longer have to use a computer mouse (although we might still want to).
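
To give a flavor of what mouse-free interaction could look like, here is a simplified Python sketch of gaze-as-pointer: smoothing noisy gaze coordinates and treating a steady dwell as a click. The sample rate, smoothing window, and dwell settings are assumptions made for the example, not any particular tracker’s API.

    # Simplified sketch of gaze-as-pointer: smooth noisy gaze samples and
    # treat a sustained dwell inside a small radius as a "click".
    # Sample rate, window size, dwell time, and radius are illustrative.

    from collections import deque
    import math

    SAMPLE_HZ = 60          # assumed tracker sample rate
    SMOOTH_WINDOW = 6       # samples to average (~100 ms at 60 Hz)
    DWELL_SECONDS = 0.8     # how long gaze must stay put to count as a click
    DWELL_RADIUS_PX = 40    # how still "staying put" has to be

    def gaze_to_pointer(samples):
        """Yield (x, y, clicked) pointer events from raw (x, y) gaze samples."""
        window = deque(maxlen=SMOOTH_WINDOW)
        dwell_anchor, dwell_count = None, 0
        for x, y in samples:
            window.append((x, y))
            sx = sum(p[0] for p in window) / len(window)
            sy = sum(p[1] for p in window) / len(window)
            if dwell_anchor is not None and math.dist((sx, sy), dwell_anchor) <= DWELL_RADIUS_PX:
                dwell_count += 1
            else:
                dwell_anchor, dwell_count = (sx, sy), 1
            clicked = dwell_count >= DWELL_SECONDS * SAMPLE_HZ
            if clicked:
                dwell_anchor, dwell_count = None, 0  # reset after the click
            yield sx, sy, clicked

    # Example: 60 noisy samples hovering around one point produce a click.
    hover = [(400 + (i % 3), 300 - (i % 2)) for i in range(60)]
    clicks = sum(1 for _, _, c in gaze_to_pointer(hover) if c)
    print(f"clicks emitted: {clicks}")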


The future

As technology improves and hardware costs continue to decline, we will likely see eye tracking integrated ever further into various aspects of our lives, making our actions throughout the world simpler and more effective than ever before. Tracking hand gestures, voice, and emotions (through facial expressions) all offer practical routes to improving our experience with technology.

A growing body of work within the automotive industry uses both eye tracking and emotion recognition to improve the safety and intelligence of the cars that we drive (and, in the future, of AI-driven cars too). The combined technologies can examine both our attentional focus and how we are feeling: if we’re tired and not paying attention, perhaps the car can tell us to pull over and take a rest.
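
One common ingredient in such systems is an eyelid-closure measure, often called PERCLOS: the proportion of recent time the eyes are mostly closed, combined with a check on whether gaze stays on the road. The Python sketch below illustrates the idea; the thresholds and signal names are assumptions for illustration, not from any production system.

    # Rough sketch of a drowsiness/inattention check from eye tracking data.
    # eyelid_openness: 0.0 (closed) to 1.0 (fully open), one value per sample;
    # gaze_on_road: True if the gaze point falls in the road area of interest.
    # All thresholds are illustrative, not from any production system.

    def should_warn_driver(eyelid_openness, gaze_on_road,
                           perclos_limit=0.15, off_road_limit=0.4):
        """Return True if the driver looks drowsy or inattentive."""
        if not eyelid_openness or not gaze_on_road:
            return False
        # PERCLOS-style measure: share of samples with eyes mostly closed.
        perclos = sum(o < 0.2 for o in eyelid_openness) / len(eyelid_openness)
        # Attention measure: share of samples where gaze is off the road.
        off_road = sum(not on for on in gaze_on_road) / len(gaze_on_road)
        return perclos > perclos_limit or off_road > off_road_limit

    # Example: heavy blinking and wandering gaze over a short window.
    openness = [0.1, 0.15, 0.9, 0.1, 0.05, 0.8, 0.1, 0.1]
    on_road = [True, True, False, False, True, False, True, False]
    print(should_warn_driver(openness, on_road))  # True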

However the future shapes up, it will surely feature eye tracking as a way to enhance our daily lives and to make the machines around us all the more intelligent as they work with us. It’s something we’ll keep looking at, and keep looking forward to.

Bryn Farnsworth is a neuroscientist and psychologist, and the science editor at iMotions. He has a PhD in neuroscience and developmental biology, alongside a bachelor’s degree in psychology, and a master’s degree in cognitive and computational neuroscience. A big fan of the brain and mind, Bryn believes in the power of well-captured data to provide answers about who we are, what we think, and why we behave in the way that we do.
