Deployment lessons learned move F-16 software upgrade forward

By Samuel King Jr., Team Eglin Public Affairs
For two weeks in June, two Eglin units, the 40th Flight Test Squadron and the 85th Test and Evaluation Squadron, tested the M7 F-16 Fighting Falcon software upgrade over the wilds of Alaska in the joint exercise known as Northern Edge.

Upon returning from the large-scale deployment, the squadrons' Airmen continued to analyze the vast amounts of test data recorded and use the lessons learned to further improve the software's performance.

"Historically, this type of full spectrum testing has been accomplished much later in the software development process and any issues found usually were not able to be corrected unless they would affect fielding," said Lt. Col. Tim Stevens, the 40th FLTS commander.  "By testing earlier, we were able to identify problems and provide time for the software engineers and coders to fix those issues."

The type of testing the deployment provided pushed the software's limits by creating a dense, constantly busy wartime environment in the air and on the ground. At times, approximately 140 aircraft were participating in the large force exercise. That traffic was coupled with a denied and degraded electronic environment featuring GPS, Link 16, radar and communications jamming.

"We were able to evaluate the system in an environment that it is likely to be employed in and also one we aren't able to test in here," said Capt. Jonathan Gilbert, an M7 project pilot.

Those test flights averaged two to three hours with aerial refueling and covered diverse scenarios such as bomb drops, hostile boat engagements, enemy surface-to-air missile defenses and advance/defend air-to-air combat. Based on those scheduled scenarios, the squadrons' Airmen would adjust the planned sortie to gather more data and isolate problem areas with the software.

A goal of the aggressive testing in the deployment was to discover the software's problem spots, so they could be addressed now.

"(The software) was not as mature as we thought we noticed it had a few issues in this environment, but we were able to gather data in those areas and provide it to our engineers in hopes of fixing those issues," said Gilbert.

Gilbert said the simultaneous developmental and operational testing was the key to the success of the Northern Edge deployment.

"Everyone was able to integrate together to include the engineers during those tests.   This allowed us to better learn the software we are testing," he said.  "It was a true test enterprise effort."

Traditionally, developmental testing (DT) is done to check the software against its specifications. The software is then passed to operational testing (OT) for suitability, effectiveness and tactics development. At Northern Edge, those test parameters were evaluated concurrently.

"The goal of this concept is to reduce the test timeline by familiarizing OT with problems early, therefore effect on operational suitability, and ultimately producing a better product once DT is complete," said Gilbert.