While virtual reality is often associated with gaming and entertainment, it serves a wider range of purposes spanning education, fitness, training for military or medical personnel, physical rehabilitation and more. John Quarles, a UT San Antonio professor of computer science, conducts research that aims to enhance the virtual and augmented reality experience across all these applications.
Quarles says his interest in these subjects was first sparked during his youth while spending time with his father, a physics professor, who also helped introduce him to the world of research.
“I played a lot of video games with my dad,” Quarles says. “I also worked at his lab a few times as a lab assistant kind of thing, and I enjoyed the whole research aspect and being in that environment.”
After his initial experience using a VR headset during college, Quarles decided to follow a path similar to his father’s and pursue academic research.
Quarles went on to earn his PhD in computer engineering from the University of Florida and joined the UT San Antonio faculty soon afterward, in 2009. There he has continued contributing to the growing field of VR and AR research.
Since arriving at UT San Antonio, Quarles has been involved in several impactful projects, from developing AR-powered patient holograms for training Army medics to finding solutions to improve the VR experience for people with balance issues. The latter is a National Science Foundation-backed project he is leading alongside fellow UT San Antonio faculty members Alberto Cordova, a kinesiology professor, and Kevin Desai, an assistant professor of computer science.
Their team operates out of a lab in the San Pedro I building, part of UT San Antonio’s growing downtown footprint, within the College of AI, Cyber and Computing. Quarles is one of many UT San Antonio faculty members conducting interdisciplinary research in the new college.
Additionally, Quarles directs the San Antonio Virtual Environments (SAVE) Lab at UT San Antonio where he leads a team of students conducting research in VR, AR and mixed reality. Their current projects focus on reducing and preventing cybersickness, a type of motion sickness that can occur while in a virtual environment.
“If you’re on a roller coaster in VR, for example, it looks like you’re moving, and your eyes think you’re moving,” Quarles says. “But your vestibular system, or your inner ears, are wondering why you’re not physically moving.”
To help ease this sensory conflict between the eyes and inner ears, the team has tested different solutions including blurring the subject’s peripheral vision, introducing auditory feedback and using vibrating haptic vests. But the effectiveness of these solutions can vary widely from person to person and from one virtual environment to another. These cybersickness remedies can also disrupt the immersive experience that VR and AR are supposed to provide.
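One of the mitigations mentioned above, restricting peripheral vision during virtual motion, is often implemented as a "comfort vignette": the center of the view stays sharp while the periphery fades, and the vignette tightens as virtual speed rises. The sketch below illustrates that idea only; the function name, parameters and falloff values are illustrative assumptions, not the SAVE Lab's implementation.

```python
import numpy as np

def vignette_mask(height, width, speed, max_speed=10.0,
                  min_radius=0.3, max_radius=1.5):
    """Return an (height, width) alpha mask in [0, 1].

    1.0 = fully visible, 0.0 = fully obscured. The clear radius shrinks
    linearly from max_radius (stationary) toward min_radius (top speed).
    """
    # Normalized distance of each pixel from the view center.
    ys = np.linspace(-1.0, 1.0, height)[:, None]
    xs = np.linspace(-1.0, 1.0, width)[None, :]
    dist = np.sqrt(ys**2 + xs**2)

    # Faster virtual motion -> smaller clear region.
    s = min(speed / max_speed, 1.0)
    radius = max_radius - s * (max_radius - min_radius)

    # Smooth falloff over a 0.2-wide band past the clear radius.
    return np.clip(1.0 - (dist - radius) / 0.2, 0.0, 1.0)

# Stationary: whole view visible. Moving fast: strong tunnel effect.
slow = vignette_mask(200, 200, speed=0.0)
fast = vignette_mask(200, 200, speed=10.0)
print(slow[100, 100], fast[100, 100])  # center stays fully visible
print(slow[0, 0], fast[0, 0])          # corner fades only at speed
```

In a real headset this mask would modulate a blur or fade pass in the rendering pipeline each frame, driven by the camera's virtual velocity.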
This is where artificial intelligence comes into play, Quarles says. By integrating AI, they can analyze large sets of data and develop models that may predict and ultimately help prevent cybersickness.
“Part of what we’re using is deep learning and time series forecasting,” he says. “We try to take all the data that we’re collecting, like heart rate, skin response and eye tracking, which is one of the best metrics. We take all that data from people in real time and then we try to predict when cybersickness will occur.”
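The workflow Quarles describes, turning streams of physiological data into a running prediction of cybersickness, can be illustrated with a deliberately simplified stand-in. The sketch below slices multichannel sensor time series into sliding windows and fits a plain logistic-regression classifier on synthetic data; the lab's actual system uses deep learning and time series forecasting, and every variable name and threshold here is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_windows(signals, labels, window=30):
    """Slice a multichannel time series into overlapping windows.

    signals: (timesteps, channels) array, e.g. columns for heart rate,
             skin response and an eye-tracking metric.
    labels:  per-timestep 0/1 flag (1 = participant reports sickness).
    Returns flattened window features and one label per window (the
    flag at the window's final timestep).
    """
    X, y = [], []
    for t in range(window, len(signals)):
        X.append(signals[t - window:t].ravel())
        y.append(labels[t])
    return np.array(X), np.array(y)

def train_logreg(X, y, lr=0.1, epochs=200):
    """Fit w, b for sigmoid(X @ w + b) with batch gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Synthetic session: three sensor channels drift upward once the
# (simulated) participant becomes sick at timestep 400.
T, onset = 600, 400
t = np.arange(T)
signals = rng.normal(size=(T, 3)) + (t >= onset)[:, None] * 1.5
labels = (t >= onset).astype(float)

X, y = make_windows(signals, labels)
w, b = train_logreg(X, y)
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
print(f"training accuracy: {(preds == y).mean():.2f}")
```

In the real setting the same windowing would run on live sensor streams, and a forecasting model would flag sickness *before* onset so a mitigation (like the vignette above) could be triggered in time.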
Quarles’ many research projects all share a common goal: to create VR, AR and mixed reality experiences that are more effective and accessible so all users can take advantage of their wide-ranging applications.
“That’s where our research really fits — it’s kind of at the intersection of virtual reality and augmented reality, and deep learning, AI and computer vision,” Quarles says. “The intention is to advance VR, and to make VR better.”