Computing the future
The MU College of Engineering’s Research Experience for Undergraduates (REU) recently wrapped up its 10-week summer session, with brilliant students from around the country completing work on cutting-edge computer science and Big Data projects.
A total of 10 students, guided by faculty from the Electrical Engineering and Computer Science Department, split into groups and prepared presentations, posters and final papers, which they shared at the Undergraduate Research and Creative Achievements Forum on July 27.
Program Director and EECS Assistant Professor Prasad Calyam said 76 people applied to participate in the REU, but only 10 were selected for the prestigious National Science Foundation-funded program.
“The purpose is to get them away from their homes and create an exciting research environment,” he said, “and give them 10 weeks of focused attention on solving a real problem pertaining to applications in healthcare and public safety, working with experts on problems built around consumer networking technologies such as Google Home, HTC Vive virtual reality headsets and Microsoft Kinect sensors.” Students also visited the MU Data Center and the local EEE middle school to interact with other experts and students.
The four projects were: Stroke Patient Daily Activity Observation System (mentored by EECS Professor Marjorie Skubic), Virtual Reality for Online Social Training of Children with Autism (mentored by EECS Professor Zhihai He and Calyam), Visual Computing at the Network Edge for Disaster Incident Response (mentored by Calyam) and Healthy Coping in Diabetes (mentored by EECS Professor Yi Shang). EECS graduate student Roshan Neupane coordinated the program’s activities and mentored the students as well.
Stroke Patient Daily Activity Observation System
Jaired Collins and Joseph Warren of Missouri Southern State University worked on a project that entailed building a system to help therapists provide the highest quality care to stroke patients. Working with Skubic’s team, they used the Microsoft Kinect motion capture device to create a non-intrusive observation system, allowing caregivers to track patients’ movements and spot potential problems earlier. This way, therapy can be geared toward addressing those specific problems before they become more serious.
“We’re trying to combine some of the latest and greatest technologies in analyzing algorithms to be able to acquire information about movements in their homes so they don’t have to leave to get assessed,” Warren explained.
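The idea of assessing movement from in-home sensor data can be illustrated with a minimal sketch. The skeleton frames, joint names and coordinates below are hypothetical stand-ins for the kind of joint-position data a Kinect-style sensor produces, not the team’s actual pipeline:

```python
import math

# Hypothetical skeleton frames: each frame maps a joint name to an
# (x, y, z) position, as a Kinect-style sensor would report.
frames = [
    {"wrist_right": (0.10, 1.00, 2.00)},
    {"wrist_right": (0.15, 1.05, 2.00)},
    {"wrist_right": (0.25, 1.15, 2.00)},
]

def joint_path_length(frames, joint):
    """Total distance a joint travels across consecutive frames."""
    total = 0.0
    for prev, curr in zip(frames, frames[1:]):
        total += math.dist(prev[joint], curr[joint])
    return total

# A sustained drop in a limb's movement over time could flag reduced
# use, prompting targeted therapy before the problem worsens.
print(round(joint_path_length(frames, "wrist_right"), 3))  # → 0.212
```

A real system would aggregate metrics like this over days of recordings so caregivers can see trends without the patient leaving home.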
Virtual Reality for Online Social Training of Children with Autism
Devin Hudson of Truman State University, Adam Starr of Pomona College and Chiara Zizza of Grinnell College worked with He’s and Calyam’s teams to develop a virtual reality classroom to support the specialized education of students with Autism Spectrum Disorder in urban/rural areas, where such services are in shorter supply. The team used High Fidelity, an open-source VR platform, to connect remote users with special educators for immersive lessons.
“Research shows that virtual reality is very effective in leading to a greater generalization of social skills [for these students],” Starr said.
Zizza’s mother is heavily involved in the education of children with ASD, which helped motivate her to work on a related research question. She added that her mother appreciates her work and that of other researchers in the field.
“I wanted to apply that knowledge into this project and help manipulate the environment in a way that can help,” Zizza said.
Visual Computing at the Network Edge for Disaster Incident Response
Kyle Coleman of Saint Louis University, Andrew Crutcher of Southeast Missouri State University and Caleb Koch of Cornell University were tasked with developing a method of computation task partitioning for visual data processing in disaster scenarios. Working under the tutelage of Calyam and his team, the trio used linear regression to develop what’s called a “hyperprofile,” which provides a resource-cost estimate for offloading computation tasks to a given server. This lets users prioritize their data processing locations so the most important information is processed fastest while using the least energy on a given device.
“We want to be able to query resources within this space to get which servers would be optimal to offload to,” Koch explained. “The idea is different network metrics are different dimensions in this space, and in order to begin processing on these devices, we had to understand how various network metrics are related to each other.”
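The core idea can be sketched in a few lines: fit a regression model to benchmark measurements for each server, then query the fitted profiles to pick the cheapest place to offload. The server names, the single “input size” metric and the cost numbers below are illustrative assumptions, not the team’s actual hyperprofile formulation:

```python
# Hypothetical benchmark history per server: (input_size_mb, measured_cost).
history = {
    "edge-1":  [(1, 0.9), (2, 1.6), (4, 3.1), (8, 6.0)],
    "cloud-1": [(1, 2.1), (2, 2.4), (4, 2.9), (8, 4.1)],
}

def fit_line(points):
    """Least-squares slope and intercept: cost as a function of input size."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

def best_server(history, size_mb):
    """Query the fitted profiles for the server with the lowest predicted cost."""
    profiles = {name: fit_line(pts) for name, pts in history.items()}
    return min(profiles, key=lambda s: profiles[s][0] * size_mb + profiles[s][1])

print(best_server(history, 1))  # → edge-1  (low overhead wins for small tasks)
print(best_server(history, 8))  # → cloud-1 (flatter cost curve wins for large tasks)
```

With more network metrics as regression features, each server’s profile becomes a point in a multi-dimensional space that can be queried the way Koch describes.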
Healthy Coping in Diabetes
Amy Cheng of Auburn University and Vaishnavi Raghavaraju of the University of Cincinnati tackled the problem of making cutting-edge technology approachable for elderly diabetes patients so they can properly self-manage their condition. The duo worked with the teams of Shang and Min Soon Kim, an assistant professor of health management and informatics, to develop a voice-activated application for the Google Home.
“Research identified a lot of problems with mobile apps when a geriatric population uses it, because they experience high learning curves and deal with problems dealing with their physical disabilities,” Cheng said. “We’ve been working on creating an app on Google Home which utilizes voice interface rather than a tactile one, so they can effectively manage diabetes with their voice rather than touch.”
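The voice-first interaction Cheng describes can be sketched with a toy intent matcher: spoken words are matched against keyword sets and answered with canned replies. The intents, phrasings and the sample reading below are purely illustrative, not the team’s actual Google Home app:

```python
# Hypothetical intent table: keyword sets → diabetes self-management replies.
INTENTS = [
    ({"blood", "sugar"}, "Your last reading was 110 mg/dL, within your target range."),
    ({"insulin", "dose"}, "Your next scheduled insulin dose is at 6 PM."),
    ({"log", "meal"}, "Okay, I have logged your meal."),
]
FALLBACK = "Sorry, I did not catch that. You can ask about blood sugar, insulin or meals."

def respond(utterance):
    """Answer a spoken request by checking which intent's keywords it contains."""
    words = set(utterance.lower().split())
    for keywords, reply in INTENTS:
        if keywords <= words:  # every keyword appears in the utterance
            return reply
    return FALLBACK

print(respond("what is my blood sugar today"))
```

A production voice assistant would use a real speech platform’s intent-recognition service rather than keyword matching, but the design goal is the same: no touchscreen, no typing, just speech.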