NUWC Division, Keyport Strives to Leap Ahead in Virtual Reality
(Source: US Navy; issued May 02, 2019)
KEYPORT, Wash. --- The Naval Undersea Warfare Center (NUWC) Division, Keyport is bringing what used to be science fiction into the world of science fact through the development of practical augmented reality and virtual reality (AR/VR) technology.

The Human Performance Engineering (HPE) Branch of NUWC Division, Keyport’s In-Service Engineering Department sees AR/VR as a game-changer for naval training, maintenance, and even operations. The HPE team believes the ability to create a “heads-up” display over the real world (augmented reality), or to immerse the user in a digitally created world (virtual reality), can revolutionize much of the Navy by providing a better-managed, more efficient means of teaching Sailors their trades and of operating specialized equipment.

“We’re trying to develop a baseline for integrating VR technology and improve any aspects of the technology we can get our hands on,” said Fallon Orr, a 3D modeler and virtual environment specialist on NUWC Division, Keyport’s HPE team. She said they are looking at ways to upgrade existing technology to serve the high-pressure world of the Navy and Department of Defense.

“How can we take what’s readily available and integrate it into the Navy?” Orr asks. “My personal favorite part is that we’re not working on just one aspect.”

Orr has been working for NUWC Division, Keyport for a little over a year. A graduate of the University of Idaho, she said a professor recommended she apply to NUWC Division, Keyport after learning of her interest in developing practical uses for virtual worlds. She is involved in everything from designing virtual worlds to helping iron out the kinks encountered while testing the technology.

The HPE team has already begun slowly fielding AR/VR technology to the fleet in a concerted effort to expand the Navy’s capabilities. However, traditional virtual reality tech requires the user to manipulate special controllers. Orr is focusing on technology to let the operator use their own hands.

For about eight months, the team has been working with commercially available technology, Orr said. She said the product uses infrared sensors to detect and track a user’s hands, thereby eliminating the need for specialized controllers. The HPE team is evaluating this existing technology to create a baseline understanding of how such technology must be adapted to function successfully in a military environment.

The potential benefits of letting the user’s own hands be the VR tool are fairly obvious. A trainee can better learn how to perform maintenance by having their own hands practice the techniques of a job. Future use of such technology in operational settings might allow a user to more easily manipulate the arms of a remotely operated vehicle or direct a tiny robot in a confined space. This would enable maintenance to be performed in areas normally inaccessible except by expensive and time-consuming disassembly of major components. Eventually, even explosive ordnance disposal teams might be able to more easily and accurately control their robots, thereby significantly reducing their exposure to danger.

“The problem is occlusion,” Orr said. Setting aside her coffee, she maneuvered her hands as if operating an unseen system. “When one hand blocks the other, it blocks the system. But in a year that might not be a problem because people are out there right now solving these problems.”

Orr enjoys working on the HPE team partly because of the technology she is helping bring to practical reality, but also because she is on a truly multi-disciplinary team.

“We have artists mixing with scientists and with psychologists,” Orr said. “This diverse array of fields creates a dynamic team able to truly keep the user in mind.”

Orr herself has an extensive background in art, and that artistic skillset is one of the more unexpected assets helping to make the AR/VR effort successful.

“I was told during my interview I would be helping increase the realism of the virtual worlds,” Orr said. She explained it is important to move the technology from looking like a computer-generated cartoon to a realistic environment in order to make VR as relevant to the user as possible. She applies her background in art in many innovative ways. She will help craft a virtual world one week, and the next she might be designing simplified “heads-up” displays for potential AR technology that will augment a user’s ability to do their job without overwhelming them with too much information.

This diverse array of talent also helps in the process of introducing the technology to potential users who might be averse to trying something as radical as training a Sailor to take apart and rebuild major systems on a submarine without the trainee ever touching a real submarine. Even so, Orr said that convincing potential users to give AR/VR technology a try is not a hard sell once they can see the technology in action.

“As soon as you see it, you can visualize it, and it’s easier to put yourself in that situation and see how to integrate it,” Orr said. In her personal experience, even people who describe themselves as “technology-averse” are generally convinced once they experience how easy VR tech is to use, or how much data they can call up when experimenting with an AR system while working on a simulated job.

The future is already here, and Orr’s team is working hard to make sure these cutting-edge technologies provide realistic, tangible benefits to the Navy and Department of Defense. It is a two-way relationship from Orr’s perspective; she and her team are developing a better future for the Navy while finding their teamwork improves their own professional skills.

“There’s always an opportunity here to continue learning, and that makes our prospects better,” Orr said.

