As the prime contractor for the U.S. Navy’s Trident D5 missile guidance system, Draper Laboratory oversees the MARK 6 MOD 1 development team of more than 700 engineers from Draper and its major subcontractors as it modernizes the missile’s MARK 6 inertial guidance system. The goal is to extend the life of the MARK 6 to 2042 while lowering the Navy’s future maintenance and support costs and providing a flexible architecture to support new missions and upgrades.
The guidance system’s primary functional requirements are to determine position, velocity, and attitude of the missile; issue steering commands to the missile; and maintain executive control of the system’s modes once in flight. It also provides the capability to calibrate and test the system while resident in a submarine as part of the overall weapon system operations.
Part 1, which appeared in the September issue, described the infrastructure developed to facilitate design, simulation, and test of the MARK 6 MOD 1 with an overall systems engineering approach. Part 2 addresses hardware development, various environmental test cells, and the analyses executed throughout design and verification.
Modeling and simulation are central to the design process for MARK 6 MOD 1 and took the form of developing a series of simulations known as virtual systems.
The first virtual system, known as the functional simulation, consists of a simulation faithfully modeling the partitioning of the system into modules in Simulink, with each module represented by models of the control laws and dynamic equations. Simulink, a hybrid simulation tool developed by The MathWorks, is capable of modeling digital processes, discrete events, and continuous time-domain dynamics within a graphical environment.
The shared serial data network (SSDN) model is implemented in a library generated directly from the system interface control document (ICD) through a series of processing scripts. This allows the simulation to accurately model the content and timing of data exchanged between the modules in the system. This was the first full system simulation developed on the program and serves as the truth model capable of generating data sets that can be used as input to open-loop test benches for unit development and tests.
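The idea of auto-generating a bus-message library from an ICD table can be sketched as follows. This is an illustrative sketch only: the ICD format, message names, rates, and field names shown are hypothetical stand-ins, not the program's actual interface document or processing scripts.

```python
import csv
import io

# Hypothetical ICD excerpt. In practice, the processing scripts would read
# the program's actual interface control document, not an inline table.
ICD_CSV = """message,rate_hz,fields
GIMBAL_STATUS,400,angle_x;angle_y;angle_z
NAV_STATE,100,pos;vel;attitude
STEERING_CMD,50,pitch;yaw
"""

def build_ssdn_library(icd_text):
    """Generate a bus-message library (name -> rate, period, field list)
    directly from an ICD table, so that simulated traffic always matches
    the documented content and timing of each message."""
    library = {}
    for row in csv.DictReader(io.StringIO(icd_text)):
        rate = float(row["rate_hz"])
        library[row["message"]] = {
            "rate_hz": rate,
            "period_s": 1.0 / rate,
            "fields": row["fields"].split(";"),
        }
    return library

lib = build_ssdn_library(ICD_CSV)
```

Because the library is regenerated whenever the ICD changes, the simulation cannot silently drift out of step with the documented interfaces.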
Instruction Set Simulator
The next level of simulation incorporates an instruction set simulator (ISS) based on the Simics tool suite. Simics is a simulation and software verification environment provided by Virtutech. It supports a flexible and scalable simulation incorporating detailed models of electronics with which embedded software interacts.
This technology provides a functionally accurate model of the software-hardware interface in the target processor. The model is capable of executing code compiled for the target processor in a fully virtual environment with control over time and complete visibility into the code execution.
In addition to the off-the-shelf capability provided by Simics, detailed models of the memory controller, databus, and memories were developed in parallel to the design and development of the actual flight computer modules. This allowed the software development process to be decoupled from the hardware delivery schedule.
To provide meaningful stimulus to the flight software under test, data sets logged from the functional simulation were driven into the ISS. The software development teams used this capability to test and debug their designs before the software ever ran on actual hardware.
Flight Software Simulation
As embedded flight software matured, the ISS-based test capabilities were merged into a higher fidelity simulation called the flight software simulation. This simulation consisted of four instances of the ISS, representing the four flight computers in the system, along with the sensor and plant models from the functional simulation integrated within a simulation tool called EASY5.
EASY5 is a multidomain modeling and simulation tool from MSC. It provides a hierarchical, systems engineering view of the simulated models and allows the integration of third-party software.
This environment supported a detailed representation of the databus not only in terms of timing and content, but also message formatting, again derived directly from the ICD. Instead of using control laws in the form of block diagrams to control the system as in the functional simulation, the actual flight software compiled for the target processors would be loaded into the ISSs within flight software simulation, allowing detailed verification of the flight software in a system context.
Detailed System Simulation
The highest fidelity virtual system on the program incorporates the digital VHDL designs of the ASICs and processors using a computer called a Palladium. The Palladium from Cadence is a special-purpose computer known as a hardware accelerator, expressly designed to speed up the verification of digital designs. This technology accelerates the execution of simulations 1,000x over traditional workstation-based simulation.
The seven unique ASIC designs present in the system were integrated into a virtual system per the system design and synthesized into an executable form for the Palladium. This environment emulates the digital logic of the designs at the gate level and executes that logic at system clock rates on the order of tens of megahertz. This level of simulation gives the verification teams a highly accurate and observable means of verifying digital aspects of the design that could not be evaluated by other means.
As the digital designs matured, the dynamic models from the functional simulation were again reused. This time, the gimbals and missile models were executed on a separate UNIX workstation while the digital logic was emulated on the Palladium. Eventually, the capability to fly full missions was established, fully exercising the digital designs over entire mission scenarios and providing design teams with full visibility and access to the ASICs’ states for verification in ways that could not otherwise be accomplished.
In parallel with the modeling and simulation activities, hardware was developed. Each module had a dedicated test bench to support its verification.
For modules interfacing to the SSDN, a communications emulator provided a means to drive message traffic into the module and log its response. Through this emulator, data logged from the virtual systems also could be driven into the hardware prototype, providing meaningful test cases. Also, data logged from the prototype could be brought back into the desktop analysis environment and compared against the expected results from the virtual system. Emulators for other interfaces also were developed, providing the interfaces and behavior of sensors and actuators necessary to verify the electronics module in a unit test environment as much as possible before integration.
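The replay-and-compare step can be sketched minimally as below; the signal values and tolerance are hypothetical, chosen only to illustrate comparing a hardware log against the virtual system's truth data.

```python
def compare_to_truth(truth, measured, tol):
    """Compare a signal logged from prototype hardware against the
    truth data logged from the virtual system, sample by sample,
    and return the samples that disagree beyond the tolerance."""
    mismatches = []
    for i, (t, m) in enumerate(zip(truth, measured)):
        if abs(t - m) > tol:
            mismatches.append((i, t, m))
    return mismatches

# Hypothetical samples: truth from the virtual system, measured
# from the prototype module on its test bench.
truth = [0.000, 0.105, 0.210, 0.315]
measured = [0.000, 0.105, 0.212, 0.390]
bad = compare_to_truth(truth, measured, tol=0.005)
```

Any returned mismatches point the test engineer at the exact samples where the hardware departed from the expected behavior.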
As functionality was achieved in prototype modules, they were integrated into the flat system test bench (FSTB) with the goal of complete system integration of all of the electronics modules with a real-time simulation computer and the sensor emulators. In this case, the 37 electronics modules in the system were integrated on the laboratory bench. Models running on the simulation computer fed dynamics into the sensor emulators, identical to those used on the module test benches, which in turn were read by the module electronics.
Commands from the mission processor would be sent to a missile model flying in the simulation computer, allowing closure of all the control loops in the system and demonstration of the full system over a range of operations. This capability provided a means to demonstrate the electronics designs in an open architecture in the lab before assembly into the final closed system, permitting probes to be inserted for troubleshooting and verification of intermediate signals between modules.
Simulations are not the only assets available to verification teams. As the full engineering evaluation units are assembled including the sensors in the sealed Inertial Measurement Unit (IMU), test cells are available to evaluate the system. These include the thermal, vibration, and stellar (TVS), centrifuge, and POD test cells.
The TVS test cell is capable of creating dynamic environments on the guidance system for verifying its performance over a range of shock, vibration, and thermal conditions. Not only does this environment stress the performance of the gyroscopes, accelerometers, and gimbal control loops, the test cell also incorporates a star simulator. Sighting a star under these controlled environments reveals details about overall system performance and exposes to the designers the critical sensitivities of interest for each sensor.
The precision centrifuge test cell has a radius of 35 feet from its spin axis to its tip and can achieve 25g of acceleration on the unit under test (UUT) with a variety of rate profiles. For testing the MARK 6 MOD 1, both the IMU and Electronics Assembly (EA) are mounted at a distance of 32 feet from the spin axis. Three variable-speed DC motors rotate the centrifuge and, under the current test scenario, bring the guidance system to 7g at a rate of 1g/minute.
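The spin rate behind these numbers follows from the centripetal relation a = omega^2 * r. A short sketch, using the 32-foot mounting radius and 7g level from the test scenario described above:

```python
import math

G = 9.80665      # standard gravity, m/s^2
FT_TO_M = 0.3048

def spin_rate_rpm(g_level, radius_ft):
    """Spin rate needed to produce a given centripetal g-level at a
    given radius, from a = omega^2 * r."""
    r = radius_ft * FT_TO_M
    omega = math.sqrt(g_level * G / r)   # rad/s
    return omega * 60.0 / (2.0 * math.pi)

rpm = spin_rate_rpm(7, 32)   # roughly 25 rpm at the 32-ft mounting radius
```

At that radius, holding 7g requires on the order of 25 revolutions per minute, which is why slip rings are needed to route signals off the continuously rotating arm.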
Signals from the guidance system are routed through an interface adapter and slip rings to the support equipment and the test station. Sensors are mounted on the end of the arm to sense position relative to surveyed references around the test facility. This instrumentation data is routed to a data acquisition computer to serve as the truth reference against which the performance of the guidance system is measured.
The POD test cell is mounted to the wing of an F-15E and contains the guidance system along with dataloggers, GPS receivers, and an independent IMU to provide instrumentation reference data for subsequent analysis (Figure 1). After take-off, the pilot flies a series of maneuvers designed to generate dynamic loads on the IMU representative of those in a real missile flight. Through a series of turns, dives, and climbs, the IMU senses accelerations and decelerations, exciting various modes in the system.
Figure 1. Pod Test Cell Mounted Under the Wing of an F-15E
Courtesy of the U.S. Navy
Upon landing, the data is removed from the POD for subsequent analysis to verify the performance of the system. Data from the GPS receivers and independent IMU is instrumentation data, providing a reference against which the performance of the guidance system can be analyzed.
Throughout the MARK 6 MOD 1 development, these assets have been used in different ways at the appropriate times to support the design efforts, verify implementation, and mitigate risks.
Verification of Preliminary Interfaces and Requirements
During the preliminary design phase, the system engineering design team was tasked with synthesizing the high-level system requirements into a viable system architecture with derived requirements for modules in the system and definitions for each module’s interface. To support this effort, the development of the functional simulation was tightly integrated with the system design activities.
As requirements for modules were identified, model libraries were updated and integrated into the simulation to evaluate performance at the system level. The ICD evolved with the definition of the module requirements. Because the databus model was generated, or auto-coded, directly from the system design team's ICD, the functional simulation remained a high-fidelity representation of the evolving system design. This forced the model of each module in the system simulation to consume and generate its interface data at the rates specified in the system design.
The system design team was able to use the simulation to evaluate the performance of each control loop in the system throughout each mission phase. By the time of the Preliminary Design Review (PDR), the functional simulation had been used to develop and verify requirements for completeness and correctness, including the content and timing of data exchange across the databus. As part of the entry criteria for PDR, the designers were able to demonstrate that the pieces of the system, if built to the derived requirements, should integrate, communicate, and execute as a whole to achieve the higher-level system requirements.
Although a processor based upon a commercially available design was chosen for the system's embedded computers, the full processor modules, including support chips and memories, were not available to the software developers until well into the program. Rather than delay development of flight software, the software design teams immediately began work on their designs by compiling code for the target environment and executing it on the ISS, which accurately emulated the flight computers.
The teams started with framework code, developing the lower-level functions required to interact with the processor and I/O interfaces. As the module requirements and the functional simulation matured, both requirements for the application software and test data sets became available.
The software development teams were able to begin work on the four software applications to run on the four separate processors in the system. By directly implementing the system design, the functional simulation was immediately available to generate data sets for driving the code in the open-loop ISS, serving as a truth reference for the expected output for the application code. The developers were able to work in the virtual environments to design and test their software as requirements evolved and the hardware was being developed.
Integration Check-Out and Risk Mitigation
One of the greatest risks in the design of a new system is omitting the flow-down and specification of a lower-level requirement. This results in an incompatibility between modules that is not discovered until late in the program when designs have been solidified and the cost to recover is great. To mitigate this risk, the MARK 6 MOD 1 program set a demonstration of a full system with prototype electronics flying a full mission as entry criteria to its Critical Design Review (CDR). This motivated the early development of the FSTB and forced communications between module design teams and resolution of integration issues early in the program (Figure 2).
Figure 2. Electronics Modules of MARK 6 MOD 1 Integrated Into the Flat
System Test Bench
With the functional simulation as a reference, the FSTB was incrementally assembled with the prototype modules. As each module was integrated, test scenarios from the lab were compared with the expected behavior observed in the functional simulation.
When it came to software integration, much of the software already had been tested and verified against the scenarios expected upon integration through the use of the ISS and subsystem test benches. Time in the integration lab was therefore dedicated to subsystem integration issues present only in real hardware, rather than higher-level functionality issues, which had been resolved in advance through simulation.
Although the schedule for the FSTB was aggressive, the integration of the mission flight software took only approximately one-quarter of the time allocated. This savings in time is credited to the software development team’s rigorous application of peer reviews and verification through their virtual environments before delivery to the integration teams. Well in advance of system CDR, the design teams were able to demonstrate that the detailed designs of the electronics modules worked in both isolated test beds and a fully integrated system and could perform in all modes of operation across a mission.
The portion of the MARK 6 MOD 1 implemented in software is significant, including control loops for positioning the gimbals in the IMU, interfacing the guidance system to fire control and the missile control electronics, sighting the star, and performing the navigation and guidance calculations. To test the full scope of software, an independent software verification and validation (V&V) team develops test scenarios for execution in the various simulation and test environments.
One of the primary tools available to the V&V team is the flight software simulation. This environment provides the team total control over test scenarios, the ability to inject faults and test corner cases, and visibility into the state of all elements in the simulation including the state of the software execution in the detailed models of the processors in the ISSs.
All of the requirements originally allocated to the software design teams become goals for verification test scenarios that the V&V team creates. As test scenarios are executed in the virtual environments, the status of the design is communicated back to the design teams. Whenever a requirement or the design intent is not met, the design is updated to ensure its quality for the final delivery.
Characterization of System Accuracy
Ultimately, the success of guidance system design is measured in terms of how accurately the system navigates. To characterize the performance of the MARK 6 MOD 1 system, a team dedicated to the accuracy analysis of the system makes extensive use of all available data from the full range of test cells.
In the absence of an exhaustive and prohibitively expensive missile flight test program, the accuracy team designs unique tests intended to excite the system in ways characteristic of the environments expected in flight for each cell. Data from the test cells includes the instrumentation reference data and the navigation data from the guidance system under test.
Using a sophisticated suite of analysis tools developed over years of guidance system design and detailed models of all of the expected error sources in the system, the team assesses the overall accuracy of the system as well as identifies the contribution of each individual error to the overall performance. Presently, MARK 6 MOD 1 has undergone one set of centrifuge tests with another centrifuge series expected in the near future. Preparations also are underway for tests in the TVS and POD test cells.
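The program's analysis suite is far more sophisticated, but the underlying idea of attributing observed navigation error to modeled error sources can be illustrated with a simple least-squares fit. The two-term error model (instrument bias plus scale-factor error) and the synthetic data below are purely hypothetical.

```python
def fit_error_sources(accel, nav_error):
    """Fit a two-term error model, nav_error ~ bias + scale * accel,
    by ordinary least squares, attributing the observed error to a
    bias term and a scale-factor term."""
    n = len(accel)
    sx = sum(accel)
    sy = sum(nav_error)
    sxx = sum(a * a for a in accel)
    sxy = sum(a * e for a, e in zip(accel, nav_error))
    scale = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    bias = (sy - scale * sx) / n
    return bias, scale

# Synthetic data: a 0.002 m/s^2 bias plus a 100-ppm scale-factor error.
accel = [0.0, 10.0, 20.0, 40.0, 68.6]
errs = [0.002 + 1e-4 * a for a in accel]
bias, scale = fit_error_sources(accel, errs)
```

Fitting against instrumentation truth data in this way separates each modeled error's contribution to the total, which is what lets the team tell designers which sensor sensitivities dominate overall accuracy.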
The design and verification of MARK 6 MOD 1 represent a significant challenge, with the need to meet high expectations set by the existing MARK 6 system in a cost-effective manner. To address this challenge, Draper’s use of a system engineering process that has tightly incorporated simulation and test from the beginning has proved to be a successful approach to the incremental design, development, integration, and test of a system.
At each phase of the program, the system has been tested at the system level to an appropriate level of fidelity. The system architecture, module interfaces, and module requirements were demonstrated through missions flown in the functional simulation by PDR; electronics prototypes flying similar missions in the FSTB by CDR; and software verification and system accuracy assessments ongoing through transition to production on the flight software simulation and initial engineering units. Each environment and its associated verification activities provide confidence in the design and implementation before the first missile flight. In this incremental, spiraling fashion, risks are retired early, and each subsequent phase of commitment to the design is approached with confidence.
Todd Jackson, Ph.D., is a systems engineer in the Modeling and Simulation Group at Draper Laboratory. Dr. Jackson has participated in the design process of the Trident MK6 MOD1, supporting both the design of the system and development of simulations and related infrastructure to verify the design. He is a graduate of Princeton University with a B.S.E. in aerospace engineering and earned his Ph.D. in computer-aided design and fabrication from MIT. e-mail: firstname.lastname@example.org