Team studies virtual reality platforms using new CAVE

February 02, 2022

Members of the research team stand in the CAVE, a virtual reality facility at Mizzou Engineering.

Mixed reality platforms immerse users in virtual environments that can be used to map out drone flights, simulate aircraft training, assess traffic in real time and plan infrastructure projects. But before users can fully experience the benefits of a realistic virtual environment, there are some challenges to solve.

A Mizzou Engineering team recently tested the capability of the Unity game engine to create photorealistic renderings for virtual reality (VR). They discovered that conventional software practices need to be transformed to meet the needs a user might have while accessing these environments. For instance, a scene created for a Microsoft HoloLens will require re-tooling to produce a high-quality user experience in a full-scale facility.

Specifically, the team set out to develop new software that can create large-scale virtual models in the CAVE on campus. Funded through a National Science Foundation grant, the CAVE (computer assisted virtual environment) consists of adjustable walls, a floor and motion sensors to provide virtual 3D surroundings.

The group conducted the study during the Research Experiences for Undergraduates (REU) in Consumer Network Technologies this past summer. The students and faculty advisors were the first to test out the CAVE and presented their findings at the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Big Data in December.

“This was the first paper regarding the CAVE facility that we presented,” said Prasad Calyam, Greg Gilliom Professor of Cyber Security and director of the Mizzou Center for Cyber Education, Research and Infrastructure, where the CAVE is located. “We were happy to see it was accepted at an international IEEE conference.”

For the REU project, the group specifically looked at 3D modeling of urban cities using aerial images of Columbia, Missouri, and Albuquerque, New Mexico. These images were collected by Kannappan Palaniappan, professor of electrical engineering and computer science, and his team from the Computational Imaging & Visualization Analysis (CIVA) Lab.

The researchers used 3D point clouds created from the images, then turned those point clouds into a mesh. In computer graphics, a triangular mesh is a collection of vertices, edges and planar faces that make up a 3D object.
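The vertex/edge/face structure of a triangular mesh can be sketched in a few lines. Below is a minimal illustrative example (not the team's code) using a tetrahedron, the smallest closed triangular mesh; it also checks Euler's formula V − E + F = 2, which holds for any closed mesh of this kind.

```python
# Four 3D vertices of a tetrahedron
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
]

# Each face is a triangle, referencing vertices by index
faces = [
    (0, 1, 2),
    (0, 1, 3),
    (0, 2, 3),
    (1, 2, 3),
]

# Edges are derived from the faces; a set removes the duplicates
# that arise because each edge is shared by two faces
edges = {tuple(sorted((f[i], f[(i + 1) % 3]))) for f in faces for i in range(3)}

V, E, F = len(vertices), len(edges), len(faces)
print(V, E, F)       # 4 6 4
print(V - E + F)     # Euler characteristic of a closed mesh: 2
```

Meshing algorithms such as those evaluated by the team construct the `faces` list automatically from a raw point cloud, which at the outset is only the `vertices` part of this structure.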

“The idea was how do we make 3D models of cities in immersive environments, which is different from modeling on a desktop scale,” Calyam said. “We showed how city scale meshes appeared based on how they are developed from point clouds and how the end user experiences it, either in the CAVE or through the HoloLens.”

The team — led by REU participants Calvin Davis and Emily Lattanzio, and guided by PhD student Jaired Collins, who works under the supervision of Palaniappan — demonstrated that city-scale point clouds can be created with open-source software such as CloudCompare and MeshLab. However, these point cloud visualization tools have limited capabilities when displaying large-scale scenes for long durations or providing the interactivity needed for an immersive experience.

“The undergraduate students, Calvin and Emily, learned a lot about computer graphics and 3D surface mesh generation,” said Palaniappan, who was the REU team advisor. “They helped with evaluating four meshing algorithms for representation accuracy, including the ball-pivoting algorithm, greedy surface triangulation, Poisson surface reconstruction, and screened Poisson surface reconstruction, by doing extensive tuning experiments to find the ideal parameter settings and application tradeoffs for each meshing method.”

The authors also made specific recommendations on which techniques to use depending on intended applications, such as city planning, gaming or planning drone flights.

“This paper developed an end-to-end workflow for creating urban scale real-world environments that can be experienced in the CAVE or using the HoloLens,” Calyam said. “It was the first to address the entirety of a city-to-synthetic environment pipeline for different applications using city-scale data.”

The project illustrates the type of successful collaborations that can come from the CAVE facility, said Ye Duan, an associate professor of electrical engineering and computer science and principal investigator of the NSF grant funding the CAVE facility.

The CAVE team has begun to work with researchers across campus to study other potential applications using the virtual environment.

“We had very good synergy working with the various research teams,” he said. “Bringing in the REU program definitely helped showcase how it can be used to support undergraduate research and education. We’re hoping to see more of this type of work and collaboration in the future.”