Augmented reality helps interpret gunnery commands / by Scott A.

The Navy has successfully demonstrated a prototype for a new command communications system that could help sailors work more effectively amid the noise and confusion of combat. 
 

The GunnAR system relies on augmented reality, translating commands from a ship’s gunnery liaison officer into easy-to-interpret visual cues for the gunner manning a naval gun system. 
 

Lt. Robert McClenning, a training officer aboard USS Gridley, proposed the idea to the Office of Naval Research and won $100,000 in prototyping funds during the March 2016 Innovation Jam. The Space and Naval Warfare Systems Center Pacific (SSC Pacific) Battlespace Exploitation of Mixed Reality (BEMR) Lab in San Diego undertook development work with contractor DAQRI, an AR technology company based in Los Angeles, and a mid-December demonstration put the system through its paces. 
 

The augmented-reality solution aims to resolve challenges faced by gunners trying to receive and interpret commands in the heat of battle. 
 

“The gunner’s mate has a problem when the shipboard environment is noisy or busy. There is a lot of confusion and a lot of noise when they are firing on targets,” said BEMR Director Heidi Buck. 
 

Often the chain of command can be fragmented. “The gunner’s mate hears the firing commands from someone, who is hearing them from the radio, and those commands are coming from the bridge. So it’s like playing telephone with guns firing and wind blowing,” she said. 
 

 


McClenning’s proposed solution augments that audio muddle with a clear visual signal conveyed via helmet. The sailor sees commands overlaid on the visual landscape — words like “fire” and “cease fire” — as well as images indicating the position and nature of targets.

The helmet, now in prototype phase, is in keeping with BEMR’s other mixed-reality efforts, including 360-degree training scenarios. 
 

“When a warfighter goes out and does a live training scenario with weapons or tanks or other soldiers, we can film that with 360-degree cameras and then put that film in a virtual reality headset,” Buck said. The service member can use this immersive environment to review or prepare for live events. “It allows them to experience things more fully, so they retain the information a whole lot better.” 
 

In applying this immersive approach to firing commands, BEMR officials say they are looking to help those in combat perform better. 
 

“We want the gunner and the liaison officer on the bridge to be focused on making the best decisions, and not having to deal with confusion or misunderstandings or time delays,” Buck said. “This removes the potential for error, and it speeds the information flow, which can reduce the chances of making mistakes.” 
 

The visual nature of the solution may be especially helpful when officers in different locations are facing in different directions. “Which is the target to the left or to the right? Your left, or mine? So to be able to see the target symbol overlay, it eases that confusion,” Buck said. 

GunnAR is slated for its next demonstration at the June 2017 Trident Warrior event. It’s likely the system will undergo continued development between now and then. 
 

“We are dealing with cutting-edge technology that fleet users have never seen before. It is straight from startup companies and Silicon Valley and my lab’s job now is to figure out what this technology can and should be used for in the Navy,” Buck said. 
 

“It’s obvious that this type of technology would do well to address this problem,” she said. “But there are so many startups in this space, in six months there will be new headsets with new features, or there will be new software that will open up new doors. Literally the morning of our demonstration the DAQRI company came down and gave us a new headset. That’s how fast this stuff is changing.”