Using AI and simulations, Rowan engineers are helping the Army reshape the future of combat

Aiming to make game-changing impacts on its future combat capabilities, the Department of Defense has awarded Rowan University an additional $3 million to create virtual and mixed reality combat simulations augmented by artificial intelligence (AI). The project is a continuation of a $5.5 million partnership between Rowan University and the U.S. Army Combat Capabilities Development Command – Armaments Center at Picatinny Arsenal, New Jersey. 

Advances in technology are reshaping combat weapons and tactics. Drones, sensors on tactical vehicles and artificial intelligence can work together to protect exposed crew members and gunners operating in combat vehicle turrets. 

“We are developing secure, immersive and autonomous mixed-reality environments that can enhance the operational evaluation of next-generation gunner turret systems and accelerate their development,” said the project’s principal investigator, Dr. Nidhal Bouaynaya, associate dean for research and graduate studies and professor in the Department of Electrical and Computer Engineering in the Henry M. Rowan College of Engineering.

To safely simulate, test and refine its new technologies, the team is creating virtual, augmented and mixed reality environments featuring armored and tactical vehicles, gunner protection kits (GPKs), threats and engagement scenarios. Artists and programmers will work with veterans who were previously deployed to create realistic combat scenarios. 

The team is also building situational awareness systems to alert military personnel to threats using secure, high-speed wireless communication between the vehicle and the crew members. A sensor suite will capture data from sensors mounted inside and outside the tactical vehicles, as well as physiological data from the gunner and information about the weapon, such as its position and misfeeds. 

Engineers will feed the data to the AI system so it can identify threats and make predictions. Ultimately, each element of the project will be woven seamlessly together, giving the gunner access to processed information through a combination of head-up displays and VR headsets. 

To date, the Rowan team has created a virtual environment with photo-realistic visuals, including a tactical vehicle and multiple terrain options. The team’s AI system recently demonstrated it could detect aerial drone threats in real time within the simulated environment.   

Next, the human factors team will collect physiological data from test subjects (Rowan University student volunteers) by immersing them in a number of virtual scenarios. The information will help the team determine the best methods for presenting information to a gunner.

Bouaynaya and her team are pushing the boundaries of convergence research in virtual and mixed reality, AI, sensors, human factors, and advanced materials. Because the research is conducted within a mixed reality simulation environment, Bouaynaya expects to answer many questions about how well these tools can work in a real combat environment. 

“The combat landscape is changing rapidly,” Bouaynaya said, “and will be changing to include more robots and more AI. The framework we are developing is one way to keep pace and maybe get one step ahead of those changes.”