Recommendations for Collaborative Management of Government Data Standardization Projects

By Dennis D. McDonald, Ph.D.

Introduction

Standardizing how the U.S. government collects, manages, and publishes budget and expenditure data, as required by the DATA Act currently before the U.S. Congress, is an example of a long-term and complex project. It will require careful planning, management, and sufficient resources to be successful. Government agencies are already organizing themselves to support its implementation.

As the U.S. Government continues its practice of making data files accessible to the public, interest in data standardization is bound to increase. This blog post, developed independently by the author, is part of a series discussing important project management issues relating to how data and metadata standards are developed and implemented. It concludes with a list of areas requiring special management attention when undertaking initiatives where data and metadata standardization is being delivered as part of an overall program.

Ripple effects

Many organizations, systems, and processes are involved in implementing standards-related initiatives that focus on increasing the accessibility and usability of Federal budget and spending data. The “ripple effects” of changing how financial data are managed extend through many industries and levels of government; they include how financial information is reported, how regulations are managed, how systems are acquired, and how data are published and exchanged both inside and outside government agencies. This is not just a Federal issue, since state and local governments are closely tied to how Federal funds are distributed and used.

Costs and benefits

How data and metadata standards are implemented can impact the costs and benefits experienced by standardization participants:

  • In the short term: “Off the shelf” systems and services are already available and in use that can provide rapid and economical access to data files, whether or not those files are officially “standards compliant.”
  • In the long term: Implementing standardization to promote interoperability of data and metadata will require planning, resources, and time, given the variety of systems, organizations, and regulations that impact or consume government financial data. Longer-term efforts will also have to address the variety of formats and systems currently in use.

Ideally, such short- and long-term efforts can be coordinated so that public access to Federal spending and other data is maximized while the total cost of standardizing such data is controlled and minimized over time.

Standards processes

Traditionally, the introduction of data standards into an industry has depended on a structured standards development process involving many committee meetings and the review and revision of multiple standards drafts. One example is the diagram of the steps in such a standardization process published by the Environmental Information Exchange Network.

The success of such efforts is greatly dependent upon market forces and major corporate participation with appropriate government bodies sometimes expressing official and public commitment to the standards. A thorough but readable review of such processes from an economic perspective is available in the Manchester Business School’s research report, The Economics of Standardization.

Formalized standards processes exist today and already impact how government regulations and corporate financial data are exchanged, examined, and reported; examples include the implementation of XML- and XBRL-based financial reporting standards. Also, data standardization and conversion efforts supporting data center and financial system consolidation involve structured processes such as data dictionary development, data extraction, conversion to a target format, and reloading or exchange with now-interoperable systems. Such processes are typical of data conversion or data transformation projects where one goal is to reduce the number of systems (and associated costs) required to manage data that support similar business processes.
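
To make such a conversion step concrete, here is a minimal Python sketch of a data-dictionary-driven transformation. The source field names, formats, and target schema are hypothetical assumptions for illustration; in a real project they would come from the agreed-upon standard.

    # A minimal sketch of a data-dictionary-driven conversion step.
    # Source field names, formats, and the target schema are hypothetical.
    import csv
    from datetime import datetime

    # Data dictionary: source field -> (standard field name, converter)
    DATA_DICTIONARY = {
        "AGY_CD": ("agency_code", str.strip),
        "OBL_AMT": ("obligated_amount", lambda s: round(float(s), 2)),
        "OBL_DT": ("obligation_date",
                   lambda s: datetime.strptime(s, "%m/%d/%Y").date().isoformat()),
    }

    def convert_record(source_row):
        """Extract mapped fields and convert each to the target format."""
        return {name: convert(source_row[field])
                for field, (name, convert) in DATA_DICTIONARY.items()}

    def convert_file(source_path, target_path):
        """Read a legacy extract and write records in the standardized layout."""
        with open(source_path, newline="") as src, \
             open(target_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(
                dst, fieldnames=[name for name, _ in DATA_DICTIONARY.values()])
            writer.writeheader()
            for row in reader:
                writer.writerow(convert_record(row))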

One side effect of standards implementation is that supporting businesses sometimes evolve that provide tools and advice to medium-size and smaller industry participants who need help incorporating standards into their own offerings in order to remain competitive. This availability of potentially innovative third party tools and services can be critical to industry adoption of standards that require changes to existing systems and business processes. (For a conceptual model of how standards relate to different types of innovation see slide 8 of Cesare Riillo’s presentation Standards and Innovation – What Relationships? A Literature Review.)

Today’s environment

Online data exchange and collaboration support systems have become widely available and accessible. Collaborative work that once might have been conducted via physical meetings, annual conferences, or dedicated teleconferencing systems can now also be conducted online through web-based systems that anyone can configure and use. For example, project management software now typically incorporates functionality to support information sharing and collaboration.

Rapid access to shared data files, ad hoc review of intermediate drafts, and prompt consultation and review by industry experts and impacted parties can accelerate the entire “standards lifecycle,” including distribution and implementation support. Furthermore, such processes can now be conducted “in the open,” bypassing the secrecy that sometimes shrouds efforts dominated by a few key industry players. (A detailed case study of such collaborative and open processes is provided by Standards-Battles in Open Source Software: The Case of Firefox.)

In some cases external access to and re-use of government-generated data can have significant economic and employment benefits, as illustrated by Cap Gemini’s research report The Open Data Economy: Unlocking Economic Value by Opening Government and Public Data.

Not all standards are created equal

There are differences among full-fledged industrial or engineering standards (such as detailed document-oriented markup standards), flexible taxonomies for classifying data for public access on municipalities’ “open government” data portals, and specialized hashtags that support communication during a weekend meeting of data hackers. The formalism and level of effort involved in developing and using such “standards” obviously differ; yet, taken together, they can all play a role in improving both transparency and interoperability.

One thing that such standards have in common, regardless of whether they are industry-official, government-sanctioned, de facto, or simply adopted socially by a group of dedicated practitioners, is that implementing them requires changes to systems and processes that vary in complexity, cost, risk, and ease of adoption. How this overall process is managed will impact not only successful adoption and use but also the total cost of standardization.

Recommendations for collaborative standards management

Any organization that considers data and metadata standards critical to the success of its data access and reporting responsibilities needs to integrate a range of formal and informal standards and collaborative techniques into its program management portfolio. Here are some recommendations for accomplishing this; note that some of them relate to basic program management principles, with special attention given to collaborative project management techniques:

  1. The managing organization needs to be involved, as participant or leader, in all relevant standards bodies.
  2. The organization should employ, where appropriate, open source systems and tools such as those available for software development, testing, and sharing.
  3. The organization should encourage public access to data “as is” while at the same time developing a roadmap for how released data may need to change over time to accommodate new or emerging standards.
  4. The organization should experiment with and make available APIs and related software tools that can help bridge the gap between systems based on different data standards or architectures (a minimal sketch of this pattern follows this list).
  5. The organization should provide clarity about which data sets are considered to be “standards compliant.”
  6. The organization should support and engage in communication among all stakeholders, using traditional as well as modern communication, collaboration, and social networking media.
  7. The organization should provide a clear public statement addressing short and long-term goals, activities, and deliverables.
  8. The organization should report regularly on its progress to all its stakeholders.
  9. The organization should provide an open environment in which standards, documentation, software, and other tools relevant to standards implementation can be developed, shared, and used.
  10. The organization should publicize information about its own finances and the economic interests of its stakeholders.
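
As an illustration of recommendations 4 and 5, here is a minimal Python sketch of an adapter that normalizes records from a hypothetical legacy format into a single standard schema and labels each record’s compliance. The schemas and field names are assumptions for illustration, not any official standard.

    # A minimal sketch of an adapter bridging two data standards.
    # The "legacy" and "standard" schemas are hypothetical; the pattern is
    # to normalize at the boundary so consumers see one format.

    STANDARD_FIELDS = ["agency_code", "obligated_amount", "obligation_date"]

    def from_legacy(record):
        """Map a legacy-format record onto the standard schema."""
        return {
            "agency_code": record["AGY_CD"].strip(),
            "obligated_amount": float(record["OBL_AMT"]),
            "obligation_date": record["OBL_DT"],  # assumed already ISO 8601
        }

    def is_compliant(record):
        """Simple check: every standard field is present and non-empty."""
        return all(record.get(f) not in (None, "") for f in STANDARD_FIELDS)

    def serve_standardized(source_records):
        """What an API endpoint might return: standardized, labeled records."""
        results = []
        for raw in source_records:
            rec = from_legacy(raw)
            rec["standards_compliant"] = is_compliant(rec)
            results.append(rec)
        return results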

Implementation

Implementing these recommendations requires a governance process that includes a program office empowered (and resourced) to provide a range of management support services. Managing this will require a variety of approaches ranging from highly structured and hierarchical to decentralized and collaborative. The actions of some participants will need to be closely controlled, while the actions of others can only be monitored.

Communication with stakeholders, as is the case with any complex project, will be critical to success. Performance metrics developed around key deliverables must be tracked and progress against them communicated and acted upon. Uncertainties caused by irregular departmental funding, sequestration, and uncontrolled actions of volunteer participants will also need to be accommodated. 
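
As a minimal sketch of what tracking progress against key deliverables might look like (milestone names and dates below are hypothetical), a program office could compute schedule variance along these lines:

    # A minimal sketch of tracking progress against key deliverables.
    # Milestone names and dates are hypothetical.
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class Deliverable:
        name: str
        planned: date
        completed: Optional[date] = None  # None until delivered

        def days_vs_plan(self, as_of):
            """Positive = behind schedule; zero or negative = on track."""
            return ((self.completed or as_of) - self.planned).days

    milestones = [
        Deliverable("Draft data dictionary", date(2013, 9, 1), date(2013, 9, 15)),
        Deliverable("Pilot agency conversion", date(2013, 12, 1)),
    ]

    today = date(2013, 10, 1)
    for d in milestones:
        status = "done" if d.completed else "open"
        print(f"{d.name}: {status}, {d.days_vs_plan(today):+d} days vs. plan")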

It’s important to recognize that the development and adoption of data and metadata standards are not ends in themselves but means to a variety of ends. Standardization is generally intended to control costs over time and to promote an infrastructure that supports commerce, communication, efficiency, and innovation; understanding total impact, however, requires more targeted measures.

As suggested in my white paper A Framework for Transparency Program Planning and Assessment, performance metrics must address a range of objectives, starting with the program objectives of the agencies responsible for the data being standardized. Understanding the link between those programs and the standardized data will be critically important, given the likely need to prioritize changes to systems and processes affected by the standards.

Related reading:

Fixing the NSA: Getting Real About What Really Matters

Scoping Out the ‘Total Cost of Standardization’ in Federal Financial Reporting

Copyright © 2013 by Dennis D. McDonald, Ph.D.