Mizzou Engineer developing system to combat cybersickness experienced by soldiers using XR goggles

November 15, 2023


In theory, augmented and virtual reality are ideal tools to train soldiers for battle in safe, controlled settings. In reality, these extended reality (XR) goggles are causing all sorts of problems — headaches, nausea, eye strain and other forms of so-called cybersickness.

Now, a Mizzou Engineer is working with the U.S. Army to develop user-friendly artificial intelligence (AI) that can not only detect cybersickness but also explain why it's flagging those issues and recommend ways to alleviate them.

Khaza Anuarul Hoque has recently received $450,000 in funding from the U.S. Army Combat Capabilities Development Command Army Research Laboratory for the work. Hoque is an assistant professor of electrical engineering and computer science and director of the Dependable Cyber-Physical Systems (DCPS) Lab.

“Recently, there has been a trend in the military using extended reality to train soldiers for different warfighting situations,” Hoque said. “But last year, a report came out that more than 80% of soldiers using XR goggles were experiencing cybersickness.”

Previous research has proposed automated methods using machine learning and deep learning, types of AI, to detect cybersickness. However, these systems are complex, hard to understand and require a lot of computational power and energy.

Hoque is the first to propose an explainable AI model that also accounts for energy efficiency and user experience.

Hoque and his team will investigate methods and prototype a system that can predict, explain and mitigate cybersickness in a trustworthy, interactive and efficient manner, incorporating explainable AI throughout.

The project has three components. First, the team will develop new techniques that allow an AI model to automatically detect cybersickness and explain why it flagged the symptoms it did.

“If you are developing AI models, you need to know the features that are making the decision to detect cybersickness,” Hoque said. “If the model is making a wrong decision, you need to know which feature contributed to that decision so that you can debug it and make it more accurate.”
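As a rough illustration of that idea, a simple classifier can be paired with a feature-attribution method such as permutation importance to show which inputs drive its cybersickness predictions. The features, data and model below are placeholder assumptions, not details of the project.

```python
# Minimal sketch: train a toy cybersickness classifier on hypothetical
# physiological features and ask which inputs its decisions rely on.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical sensor streams a headset might log.
X = np.column_stack([
    rng.normal(70, 10, n),    # heart rate (bpm)
    rng.normal(15, 5, n),     # blink rate (per minute)
    rng.normal(0.5, 0.2, n),  # head-motion variance
])
feature_names = ["heart_rate", "blink_rate", "head_motion_var"]
# Synthetic label loosely tied to heart rate and head motion.
y = ((X[:, 0] > 75) & (X[:, 2] > 0.55)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: how much accuracy drops when a feature is shuffled,
# i.e., which inputs the model actually depends on for its decision.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, score in zip(feature_names, result.importances_mean):
    print(f"{name}: {score:.3f}")
```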

The model would also be designed to deploy the appropriate cybersickness mitigation from a library of preprogrammed techniques. For instance, if a soldier is suffering eye strain, one mitigation technique could be adjusting the depth of field in the scene.
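A minimal sketch of that lookup is below; the symptom labels and mitigation actions are made up for illustration and stand in for whatever the team's library would actually contain.

```python
# Sketch: map a detected symptom to a preprogrammed mitigation technique.
from typing import Callable, Dict

def reduce_depth_of_field() -> str:
    return "Adjusted depth of field to ease eye strain."

def narrow_field_of_view() -> str:
    return "Temporarily narrowed field of view to reduce nausea."

MITIGATIONS: Dict[str, Callable[[], str]] = {
    "eye_strain": reduce_depth_of_field,
    "nausea": narrow_field_of_view,
}

def mitigate(symptom: str) -> str:
    action = MITIGATIONS.get(symptom)
    return action() if action else f"No preprogrammed mitigation for '{symptom}'."

print(mitigate("eye_strain"))
```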

To make the system user-friendly, Hoque is using a large language model similar to ChatGPT so that a soldier in the field can communicate with it. A user would be able to give voice or text commands to interact with the detection model, including deciding which mitigation strategy to use.
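The article doesn't describe the interface itself, but the idea is that the language model would translate a free-text request into a structured action the system understands. The sketch below fakes that translation with simple keyword matching so it runs without any model API; in the envisioned system, a language model would do the interpretation.

```python
# Sketch: route a soldier's free-text command to a mitigation choice.
# A crude keyword match stands in for the language model here.
def interpret_command(text: str) -> str:
    text = text.lower()
    if "eye" in text or "blurry" in text:
        return "eye_strain"
    if "nausea" in text or "sick" in text or "dizzy" in text:
        return "nausea"
    return "unknown"

def handle_command(text: str) -> str:
    symptom = interpret_command(text)
    if symptom == "unknown":
        return "Could not map the request to a known symptom."
    return f"Applying the preprogrammed mitigation for {symptom}."

print(handle_command("My eyes hurt, can you do something?"))
```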

Finally, the model will be designed to account for resource and energy constraints. Commercial XR headsets are often heavy and, when coupled with large AI models, can drain batteries quickly. Hoque is proposing a system in which the AI can determine if certain features aren’t necessary and can be eliminated to make the headsets more energy efficient.
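One common way to realize that kind of saving is to drop inputs a trained model barely uses, so the headset no longer has to collect and process them. The sketch below is again only illustrative, with assumed features and an assumed threshold.

```python
# Sketch: keep only the input features the model actually relies on,
# so fewer sensor streams need to be captured and processed on-device.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 1000
feature_names = np.array(["heart_rate", "blink_rate", "head_motion_var", "ambient_noise"])
X = rng.normal(size=(n, 4))
# Synthetic label that ignores the last feature entirely.
y = (X[:, 0] + X[:, 2] > 0).astype(int)

model = RandomForestClassifier(random_state=0).fit(X, y)

# Drop any feature whose importance falls below the average importance.
importances = model.feature_importances_
keep = importances >= importances.mean()
print("Features worth keeping:", list(feature_names[keep]))
```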

“If successful,” Hoque said, “this research will transform state-of-the-art XR applications and improve training and operational effectiveness.”

Learn more about research in the Dependable Cyber-Physical Systems Lab.