February 10, 2020
Four students represented the College of Engineering at the MIT Reality Hack where participants were challenged with developing new cross reality experiences using the industry’s best technology.
The hackathon took place in January over five days filled with workshops, networking, and of course, hacking.
The MIT Reality Hack is an international competition, split roughly evenly between students and industry practitioners, that strives to foster uninhibited creativity and originality in augmented and virtual reality, collectively known as cross reality, or XR for short.
Participants must apply to take part in the hackathon, and four of the College’s students, Scottie Murrell, Weiyu Feng, Joseph Hays and Samuel Jr. Frimpong, were accepted to go to the MIT Media Lab and develop alongside some of the world’s best minds.
Feng says team members all submitted prior research and coursework they had completed in the Virtual Reality Media Capture lab in the Information Technology Program in the College of Engineering. The hackathon application also required a video recording of a virtual reality lab simulation project.
“The project was a complete simulation of the radiology lab located on the ground floor of Engineering Building North,” he said. “People could follow the instructions in the virtual lab to learn how to operate machines and learn about the lab process. This can help new students learn about the lab process so they don’t hurt themselves or waste materials.”
At the competition, hackers form teams with others they’ve only just met. The goal is to develop groundbreaking new XR experiences, and doing so requires diverse ways of thinking, which new teammates readily supply.
It was Feng’s first time at the competition, and he said one of the best parts was being surrounded by people who shared his passion.
“It’s a good place to build connections,” he said. “You feel like you’re connected with all the people who have the same interests, so that’s pretty cool.”
The hackathon had over 300 participants and industry sponsors like Microsoft, HP and Nreal.
Another key component of the competition was the opportunity for attendees to receive guidance and troubleshooting help from expert mentors.
Murrell, who went to the competition last year, found that wasn’t always necessary.
“Even though it is a competition, the atmosphere of friendship and enthusiasm to develop is so unique that teams would actually help each other if they had problems,” he said. “We were all learning and loving it.”
Feng and his team won the Best of VIVE Pro Eye Tracking award for their project called The Mind’s Eye.
The Mind’s Eye is a whodunit detective game in which players are tasked with figuring out which character committed the crime and with what weapon.
The eye tracking technology follows the player’s gaze, and the characters share information about whatever the player is looking at. Using that information, players must deduce which character committed the crime.
Using the eye tracking technology, Feng’s team reduced the need for players to move around, movement that often causes motion sickness in VR. Feng said that makes for a better overall user experience.
All of the objects in the crime scene are prominently displayed, so users can see them equally well.
While the sleuthing is going on, the game collects data on how long the player looks at specific characters and objects. As players progress through the game, the new information they’re given is based on what they learned previously and how long they looked at certain objects.
At the end, the player is shown how long they looked at certain weapons, like a knife, or characters, like the grandma.
The goal for the player may be to solve the crime, but the goal of Feng’s team was to create an XR experience that would highlight the user’s own biases.
“So we give you the opportunity and give the chance to let you do logical thinking, critical thinking,” Feng said. “It’s also called cognitive bias which means you only see what you want to see instead of observing everything equally. We want to build the players’ critical thinking when they reach the end of the game.”
Murrell’s group developed teaching software aimed at increasing understanding and classroom engagement through augmented reality. Students can wear the AR glasses and be fully immersed, literally, in what is being taught.
For example, one of the simulations Murrell’s group made was the solar system.
“You and the teacher could actually move through the solar system with the computer and point out things,” he said. “And then the students could actually ask questions, and the teacher can see who’s asking questions.”
He had the rare opportunity to work with Nreal’s AR glasses, which hadn’t yet hit the market, making him one of the first developers to work with them.
Murrell said it was an incredible learning experience, and despite never having developed for AR lenses before, he came away having created a complete application for them.
“You’ll never meet more talented people,” he said. “I learned so much. By the time I was done there, I made a software that actually can be used on AR glasses.”