The US Department of Defense (DOD) has a long history of dealing with data interoperability challenges. Many solutions have been successfully implemented, but all have exhibited limitations when the scope of their application has been expanded. The introduction of the internet and the concept of Netcentric Warfare have magnified both the need for and the challenges of data interoperability. It is becoming increasingly clear that the fundamental problem is one of complexity, and this understanding has influenced recent Joint military guidance and directives. This paper discusses the problem domain and solutions as a case study in complex systems engineering (CSE). The case study provides preliminary insights involving scale and emergence in CSE.
DOD data interoperability is an ideal case study for complex systems engineering (CSE). There is a long history of limited successes and ultimate failures, so the challenges are well recognized. Further, new concepts for networked operations present additional challenges, and it is recognized that new kinds of solutions are needed. Experience has shown that achieving data interoperability across a large, diverse enterprise is intractable. In addition to technical challenges, there are organizational and cultural issues. It is becoming increasingly clear that complexity is the true problem. Recognizing this, the Joint DOD community has begun to approach data interoperability as a problem in CSE. This paper reports on some preliminary issues and insights.
CSE is about achieving a process, not an end state. Systems engineering implies some level of deliberate control or standardization, while complexity implies a level of unpredictable evolutionary development. Instituting a process that allows for both standardization and variability is a key CSE challenge. A related challenge is balancing competition and collaboration. Competition is valuable because it drives innovation, but collaboration is essential to provide coherence. Taken to extremes, competition can lead to counterproductive battles between "magic bullet" solutions, while collaboration can involve cumbersome processes, which are also counterproductive.
CSE is a human process with concomitant complexities and limitations. As the size and diversity of a stakeholder group grows, fewer system details can be standardized. Matching the degree of standardization to the scale of the stakeholder group is another key CSE challenge. Effective communication is a critical element, and useful system documentation is a key enabler.
1.1 What is Data Interoperability?
Data interoperability is "the ability to correctly interpret data that crosses system or organizational boundaries" [Renner undated]. Specifically, data interoperability means that one system (and its users) can understand and interpret the data that comes from another system (and its users). This understanding includes both the syntax (format) and the semantics (meaning) of the data. Data interoperability does not include "process interoperability", which ensures that the other system has the information that is needed, nor does it include "communications interoperability", which ensures that there is a data transfer mechanism between the systems.
Human communication is a key element of data interoperability. System builders and their users must share a common vocabulary. Typically, shared understanding is defined implicitly by stakeholders in a community of common interest. Disconnects among these implicit understandings can inhibit data interoperability within, and especially between, communities of interest.
1.2 Why is Data Interoperability Complex?
Four characteristics contribute to the complexity of DOD data interoperability. First is the large number of systems. The number of interfaces increases as the square of the number of systems. An enterprise with many thousands of systems has potentially many millions of interfaces. The DOD, for example, has over a thousand logistics systems alone.
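The quadratic growth in interfaces can be made concrete with a short calculation. The counts below are illustrative arithmetic, not DOD inventory figures:

```python
def pairwise_interfaces(n_systems: int) -> int:
    """Potential point-to-point interfaces among n systems: n choose 2."""
    return n_systems * (n_systems - 1) // 2

# A thousand systems already imply roughly half a million potential
# interfaces; a few thousand push the count past a million.
print(pairwise_interfaces(1_000))  # 499500
print(pairwise_interfaces(2_000))  # 1999000
```

Even if only a fraction of these interfaces are ever built, the space of potential interfaces that must be considered grows far faster than the system inventory itself.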
Second, changing operational needs continually require new and modified systems with new and modified interfaces. Changes occur rapidly during war, when new tactics and procedures evolve quickly to meet changing combat conditions and enemy tactics. Changing requirements can be imposed even when a system is still in development. For example, the end of the Cold War, with the concomitant emphasis on tactical missions, led to modifications to many DOD military systems and their interfaces.
Third, interfaces must accommodate asynchronous implementation and deployment of systems. Interfacing systems may be developed in parallel by different contractors under different program timelines, and multiple versions of each system may be in the field simultaneously. For example, an entire surveillance aircraft fleet cannot be grounded for retrofitting, so upgrading is an asynchronous multi-year process. Since new upgrades are being introduced continuously, each aircraft system may be unique.
Finally, diverse communities have diverse information content with domain-specific vocabularies. For example, the DOD uses health, training, housing, transportation, procurement, and accounting information, in addition to weapon systems, support and logistics information. The deployment of hundreds of thousands of warfighters may require coordination and interoperability among all these communities. However, even within a single community, vocabularies vary in subtle ways. For example, does the "aircraft weight" include the on-board mission equipment, the fuel, and the cargo weights? Does the cargo weight include shipping boxes or pallets? Yet "weight" is a simple concept relative to abstractions, such as "mission".
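The "aircraft weight" ambiguity can be sketched as two records that are syntactically identical but semantically different. The tail number, field names, and values here are invented purely for illustration:

```python
# Two communities describe the same aircraft with the same field name,
# but under different implicit weight conventions.
maintenance_view = {"tail_number": "X-0001", "weight_lbs": 83_000}  # empty weight only
transport_view = {"tail_number": "X-0001", "weight_lbs": 141_500}   # adds fuel, crew, cargo

# The schemas match exactly, so a purely syntactic check passes...
assert maintenance_view.keys() == transport_view.keys()
# ...yet the values disagree, because the semantics were never documented.
print(transport_view["weight_lbs"] - maintenance_view["weight_lbs"])  # 58500
```

No format validator can catch this kind of disconnect; only a shared, documented vocabulary can.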
2 Past Solutions
Many approaches to data interoperability have been successful in small to medium communities of stakeholders but have become intractable as the number and diversity of stakeholders increases. One approach is to "do nothing", assuming that interoperability will happen because it is in everyone's best interest. However, this approach is rarely, if ever, successful. "Doing nothing" has resulted in stand-alone systems with undefined or poorly defined interfaces (or no interfaces) to critical information.
A more successful approach for a few systems is to define and control the interfaces between them. Each pair of systems agrees to comply with a detailed documented interface, with changes effected via an "Interface Control Board" that includes the relevant stakeholders. Negotiating interface changes between systems with different contractors, users, funding sources, and development timelines is challenging but has sometimes been successful. In practice, several issues arise. The documentation may include only the data syntax, not the semantics, because the stakeholders come from a small community where the semantics are implicitly understood. This makes it difficult for those outside the community to reuse the interfaces. For systems with many complicated interfaces and frequent changes, the documentation is costly to maintain and may become out of date. Achieving backward compatibility with numerous versions of fielded systems is also challenging. Worse, the number of interfacing pairs of systems increases as the square of the number of systems, so this approach seems to be most successful for a "hub and spokes" system-of-systems configuration, for example for interfaces between an aircraft and its support equipment.
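The scaling advantage of the hub-and-spokes configuration over pairwise interface control can be sketched as follows; the system count is an arbitrary illustration:

```python
def full_mesh_interfaces(n: int) -> int:
    """Every pair of systems maintains its own controlled interface."""
    return n * (n - 1) // 2

def hub_and_spokes_interfaces(n: int) -> int:
    """One hub; each of the remaining n - 1 systems interfaces only with it."""
    return n - 1

# For fifty systems, pairwise control means over a thousand interface
# documents to negotiate and maintain; a hub reduces that to forty-nine.
print(full_mesh_interfaces(50))       # 1225
print(hub_and_spokes_interfaces(50))  # 49
```

The trade-off, of course, is that the hub itself becomes a single standardization point whose change process every spoke depends on.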
Two approaches have been successful for interfaces between larger numbers of systems: standardized message sets and standardized databases. Message sets and databases encounter similar problems as the number of systems increases. These include expanding documentation of highly complicated standards (which makes the standards expensive to implement), backward compatibility issues, and an unresponsive change process. As a result, implementations become fragmented and incomplete.
For example, each of the many DOD message standards may have several versions, which are not compatible with each other, and each version may have a hundred or more defined messages, each with a hundred or more defined fields. The documentation for a message standard may fill a bookshelf. Due to cost and schedule constraints, an entire standard is rarely, if ever, implemented in a single system. Moreover, interpretations of the standard may differ. As a result, using a message standard does not guarantee interoperability.
Databases also become unwieldy. In an attempt to insulate systems from frequent changes in database content and structure, systems-of-systems may develop modified versions of the database, limiting interoperability outside the system-of-systems.
A final approach has been to standardize vocabulary and data models. This approach has been effective for some well-defined communities. However, it has had the same limitations for diverse communities as the other standards. Migrating legacy systems to new data model standards strains limited resources. Cost and schedule limitations have also led implementers of new systems to ignore complicated standardized vocabularies and data models.
Although past solutions have merit, all have failed to provide interoperability for large diverse communities. Solutions involving coordinated standards have led to workarounds and subsequent incompatibilities. Nevertheless, uncoordinated interface control has also failed.
In the absence of deliberate engineering, data interoperability does not emerge because the three major groups of stakeholders (those who fund, use, and build the systems) have other pressing priorities. Funders are primarily interested in providing new capabilities; users want to fix today's deficiencies quickly; and system builders are focused on meeting severe cost and schedule constraints.
At the same time, standardization is fundamentally limited. Large amounts of diverse data cannot be standardized over large diverse groups of stakeholders because it is too expensive to implement and too slow to change. Complexity limits the ability to manage large standards.
3 New Guidance and Directives
Recognizing these problems, the DOD has issued new guidance and directives to foster both coordination and evolution. There are two thrusts. One involves establishing Communities of Interest (COIs), defined as groups of stakeholders who use a common vocabulary to exchange information. The other involves requirements for disciplined data asset engineering, where data assets are broadly defined to include output files, databases, documents, or web pages, as well as services that may be provided to access the data from an application.
3.1 Communities of Interest (COI)
COIs have been established by the DOD to develop shared vocabularies, define shared information spaces, and align responsibilities for information owners and data producers. Successful COIs have been triads including users, funders, and builders. COIs are envisioned to encompass both centralized coordination via "organizational" COIs and decentralized problem solving via "ad hoc" COIs. Organizational COIs provide a structure for more stable stakeholder communities, while ad hoc COIs provide a mechanism to coordinate among communities with immediate data interoperability needs. One valuable enabler has been leveraging established stakeholder groups and standards (based on existing messages, databases, and data models).
It is sometimes envisioned that the organizational COIs should provide oversight of the ad hoc COIs, but that view implies that the DOD domain can be cleanly decomposed. It is not yet clear how to bring these two kinds of COIs together under a single process. It is impossible to divide the enterprise cleanly into an organization of non-duplicative COIs, but it is equally impossible for interoperability to emerge from uncoordinated problem-solving COIs.
This is at the heart of the CSE challenge. Engineering presumes a deliberative process to ensure an end result, in this case data interoperability. In the face of increasing complexity, however, structured engineering fails. Standardization can only go so far; it is critical to allow for decentralized variations. Several competing data models have been proposed as "ultimate solutions", but the past failure of global solutions has made the community wary of "magic bullets".
3.2 Data Asset Engineering
Data asset engineering is part of good system engineering practice, and much of the new DOD guidance and directives has simply underscored the requirements for this practice. The DOD key tenets are that data should be visible, understandable, accessible, and trusted. This means that users and applications can discover the existence of data assets (e.g., via metadata catalogs [DOD CIO, 2005]) and that they can interpret the data, both structurally and semantically. In addition, it means that data assets are available to users and applications except where limited by policy, regulation, or security. There is a new emphasis on accommodating unanticipated future users and on coordinating with COIs to identify recommended data standards and interoperability test opportunities.
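As a sketch of how a metadata catalog entry supports the "visible and understandable" tenets, consider a minimal discovery record. The field names and values below are invented for illustration and do not follow the actual DDMS schema:

```python
# A hypothetical discovery-metadata record for a data asset.  Field names are
# illustrative only; a real catalog entry would conform to a specification
# such as DDMS.
record = {
    "title": "Unit Readiness Summary",
    "description": "Daily readiness rollup by unit",
    "format": "XML",                   # structural (syntactic) documentation
    "vocabulary": "Logistics COI",     # semantic context (assumed COI name)
    "access": "restricted-by-policy",  # accessible except where limited by policy
}

def is_discoverable(rec: dict) -> bool:
    """Visible and understandable: the minimum fields a future, possibly
    unanticipated, consumer needs to find and interpret the asset."""
    required = {"title", "description", "format", "vocabulary", "access"}
    return required <= rec.keys()

print(is_discoverable(record))  # True
```

The point of such a record is precisely the unanticipated-user case: a consumer who was never party to the original interface agreement can still locate the asset and learn both its structure and its semantic context.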
However, there are many challenges. These requirements strain resources and push the limits of current technology, although technologies are continuing to advance in the commercial sector (e.g., to support the semantic web). Many existing systems are poorly documented, and the funding for improved documentation would have to come at the expense of improved system capabilities. For new systems, it is difficult to anticipate the required resources because best practices have not been institutionalized.
In addition to technical challenges and limited resources, there are organizational and cultural issues. These include evolving lines of authority for data interoperability and for program management and funding.
Some progress has been made. Metadata registries have been established, although improvements are needed in both the quality of the documentation and the capabilities of the registries. Developers of new systems are beginning to collaborate via established COIs to identify and evolve applicable standards. There has also been some effort to develop "small" standards (e.g., "Cursor on Target" [Byrne, 2004], which has been useful across a diverse stakeholder community).
4 CSE Preliminary Insights
The DOD experiences in data interoperability engineering can be generalized to other applications of CSE. A key insight is that standardization should be applied at the appropriate scale. Specifically, the degree of standardization should relate to the size and diversity of the stakeholder group. Traditionally, data interoperability standards have been inclusive and flexible. As a result, they have become unmanageable as the size and diversity of the stakeholder group has grown. It may be that the reverse should occur: standards should be limited to what is common to the stakeholder group, and simplicity should be favored over flexibility.
Another insight is that CSE needs to combine a structured engineering process and an unstructured evolutionary process in a manner that promotes the emergence of a desired characteristic. "Ad hoc" COIs have been effective in solving focused data interoperability problems, and "organizational" COIs have been effective in defining community standards. It is not yet clear how to combine these two processes, but the author believes communication (via well-constructed metadata catalogs) to be a key enabler.
A final insight involves the dual role of competition and collaboration. The CSE process should promote both multiple competing solutions and collaborations to establish standardization. Data interoperability is a goal that will never be fully achieved, but it is unclear what level of diversity is healthy and what level of non-interoperability is acceptable.
Providing data interoperability is an ongoing DOD challenge, and many recognize that the central issue is complexity. As the community struggles with this challenge, it should continue to reflect on how its experiences generalize and inform CSE.
One unsolved challenge is how to measure progress, or lack of progress, toward interoperability. While it may be clear that improvements are needed to meet growing needs, it is unclear how to determine whether data interoperability is in fact improving. Although various metrics have been proposed, both at the system and at the enterprise level, there is no consensus. Until this challenge is met, it will be difficult to understand what solutions work.
Byrne, R., "Cursor on Target Improves Efficiency," The Edge, Vol. 8, No. 2, The MITRE Corporation, Fall 2004.
Renner, S., "A History of DoD Data Management," USAF briefing, undated.
DoD Directive 8100.1, "Global Information Grid (GIG) Overarching Policy," September 19, 2002.
Deputy Secretary of Defense, Management Initiative Decision No. 912 (MID-912), Joint Battle Management Command and Control (JBMC2), January 7, 2003.
Department of Defense Chief Information Officer, "Net-Centric Data Strategy," May 9, 2003.
CJCSINST 6212.01C, "Interoperability and Supportability of Information Technology and National Security Systems," November 20, 2003.
Department of Defense Chief Information Officer, "DoD Discovery Metadata Specification (DDMS)," Version 1.2, January 3, 2005.
Deputy Secretary of Defense Memorandum (OSD 03246-04), "Information Technology Portfolio Management," March 22, 2004.
Chairman of the Joint Chiefs of Staff Memorandum CM-2040-04, "Assignment of Warfighting Mission Area (WMA) Responsibilities to Support Global Information Grid Enterprise Services (GIG ES)," September 8, 2004.
The Joint Staff, Joint Requirements Oversight Council Memorandum (JROCM 199-04), "Data Strategy Implementation for Warfighter Domain Systems," October 29, 2004.
USD(AT&L), Defense Acquisition Guidebook, Version 1.0, Section 7, "Acquiring Information Technology and National Security Systems," November 17, 2004.
DoD Directive 8320.2, "Data Sharing in a Net-Centric Department of Defense," December 2, 2004.