
Reading the mind by tracking the eyes: Computer attempts to understand humans’ visual reasoning

Common sense and thousands of years of observation have given rise to the axiom that there is no substitute for experience. A National Science Foundation-funded project recently completed by researchers at the University of Missouri takes aim at changing that truism.

The project, led by Chi-Ren Shyu, Shumaker Endowed Professor of Computer Science in the MU College of Engineering, used the built-in camera on a computer to track and record the eye movements of both seasoned and novice health professionals as they viewed identical radiography images on the computer screen. All participants were given identical instructions for tasks ranging from simple routine viewing to complex diagnostic processes.

A heat sensor map tracks eye movement during a diagnostic task: finding pathologies in an HRCT lung image.

Computer algorithms developed for this project automatically processed millions of points from haystacks of eye-movement data and mined them for patterns, tracking anatomical and spatial gaze behaviors in order to model visual knowledge and study how people reason with images.
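The article does not describe the team's algorithms in detail. As a rough, illustrative sketch of the kind of processing involved, the Python snippet below bins raw gaze samples into a heat map over an image and reports how much attention each labeled region received. Every function name, parameter, and region label here is a hypothetical stand-in for illustration, not the project's actual code.

```python
import numpy as np

def gaze_heatmap(gaze_xy, image_shape, bin_size=16):
    """Bin raw (x, y) gaze samples into a coarse heat map over the image.

    gaze_xy: array of shape (n_samples, 2) holding pixel coordinates.
    image_shape: (height, width) of the viewed image.
    """
    h, w = image_shape
    bins_y, bins_x = h // bin_size, w // bin_size
    heat, _, _ = np.histogram2d(
        gaze_xy[:, 1], gaze_xy[:, 0],
        bins=[bins_y, bins_x], range=[[0, h], [0, w]]
    )
    return heat / heat.sum()  # normalize so the map sums to 1

def region_dwell_fractions(heat, regions):
    """Fraction of viewing attention falling in each labeled region.

    regions: dict mapping a label to a boolean mask over heat-map cells,
    e.g. a hypothetical lung-lobe mask for an HRCT image.
    """
    return {label: float(heat[mask].sum()) for label, mask in regions.items()}

# Toy example with synthetic gaze samples on a 512x512 image.
rng = np.random.default_rng(0)
samples = rng.normal(loc=[200, 150], scale=40, size=(5000, 2)).clip(0, 511)
heat = gaze_heatmap(samples, image_shape=(512, 512))
left_half = np.zeros_like(heat, dtype=bool)
left_half[:, : heat.shape[1] // 2] = True
print(region_dwell_fractions(heat, {"left half": left_half}))
```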

The researchers discovered that health professionals with varying levels of expertise looked at images in different ways. Less experienced professionals' visual reasoning behaviors differed significantly from those of experienced ones, whose visual behaviors were more consistent and accurate, especially for complex diagnostic processes.

The research team is building a searchable database of the tacit knowledge inherent in experienced practitioners' responses. They believe it could eventually play a role in error identification and decision-making because it will be able to pinpoint when, where, and why a specific pathology's visual diagnostic footprint appears.
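As another assumption-laden illustration rather than the team's actual system, the sketch below shows one minimal way such a searchable store of expert "visual footprints" might be organized: heat maps indexed by a pathology label and reading step, queried by similarity. The class name, labels, and distance measure are all invented for this example.

```python
import numpy as np

class FootprintIndex:
    """Toy searchable store of expert gaze heat maps ('visual footprints').

    Keys are hypothetical labels such as a pathology name plus the reading
    step at which the pattern was recorded; values are normalized heat maps
    like those produced by the earlier gaze_heatmap() sketch.
    """

    def __init__(self):
        self._store = {}

    def add(self, label, heatmap):
        self._store[label] = heatmap / heatmap.sum()

    def query(self, heatmap, top_k=3):
        """Return the stored labels whose footprints are closest to the query."""
        q = heatmap / heatmap.sum()
        scored = [
            (label, float(np.abs(q - ref).sum()))  # L1 distance between maps
            for label, ref in self._store.items()
        ]
        return sorted(scored, key=lambda pair: pair[1])[:top_k]

# Usage sketch: index two fake expert footprints, then look up a new reading.
rng = np.random.default_rng(1)
index = FootprintIndex()
index.add("nodule / survey step", rng.random((32, 32)))
index.add("effusion / confirm step", rng.random((32, 32)))
print(index.query(rng.random((32, 32)), top_k=1))
```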

The research findings also have potential applications in security.

When participants were tasked with looking for foreign objects in an X-ray of a left shoulder, researchers tracked the eye's trajectory across the different spots it glanced upon.

Even twins and immediate family members cannot mimic a person's eye behavior across a series of images captured within a few seconds. This research provides a theoretical framework for a type of computer authentication that differs from existing biometric technologies.
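The article gives no details of how gaze-based authentication would work; the toy sketch below only illustrates the general idea under strong assumptions, comparing a freshly captured gaze trajectory against an enrolled template with a simple distance threshold. A real system would need resampling to a common length, a more robust similarity measure such as dynamic time warping, and calibrated thresholds.

```python
import numpy as np

def scanpath_distance(path_a, path_b):
    """Mean pointwise distance between two gaze trajectories of equal length.

    Each path is an (n, 2) array of (x, y) fixation coordinates captured
    while viewing the same short image sequence.
    """
    return float(np.linalg.norm(path_a - path_b, axis=1).mean())

def authenticate(candidate_path, enrolled_template, threshold=25.0):
    """Accept the candidate only if their scanpath is close to the template.

    The threshold is an arbitrary pixel distance chosen for this toy example.
    """
    return scanpath_distance(candidate_path, enrolled_template) < threshold

# Toy check: a noisy replay of the enrolled pattern passes, a random one fails.
rng = np.random.default_rng(2)
template = rng.uniform(0, 512, size=(40, 2))
same_user = template + rng.normal(scale=5.0, size=template.shape)
impostor = rng.uniform(0, 512, size=(40, 2))
print(authenticate(same_user, template), authenticate(impostor, template))
```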

Additionally, these methodologies may prove useful for the intelligence community. Visual knowledge that can be difficult to articulate can be archived by capturing large-scale eye-gaze patterns as seasoned analysts interpret imagery. As a generation of image analysts retires, this research can ensure their practiced wisdom and tacit knowledge are never lost.

A variety of collaborators worked with Shyu on this project, including: Jeffrey Rouder, professor of psychological sciences; Sandra Erdelez, professor of information science and learning technology; Guilherme DeSouza, associate professor of electrical and computer engineering; Matthew Klaric, assistant research professor of electrical and computer engineering; Carla Allen, clinical associate professor of health professions; Gerald Arthur, pathologist and faculty member in pathology and anatomical science; and students from the MU Informatics Institute (MUII) and the Computer Science Department.