Given the likely impact of the coming U.S. DATA Act on access to federal spending data, and continued development of federal “open data” programs, I wanted to “pick the brain” of someone for whom data and metadata standardization is a “meat and potatoes” business.
Pushing for standardization of such regulatory information will greatly enhance public access and transparency — as long as effective governance and sufficient resources are made available to support the process and the systems that provide public access.
What’s different about this newly announced NOAA program is not just the potential “big data” scope of the program but the way in which private sector cloud vendors are involved as intermediaries not only to the public but also to potential data vendors and resellers.
Much of what the EPA staff talked about involved processes and activities that are necessarily associated not only with “open data” but with any data-intensive business process. Data must be managed. Systems that share data need to be coordinated. Resources need to be allocated and shared. Such requirements are not unique to “open data” but are universally relevant.
We want systems and processes to be more effective and transparent, and we want to be able to take advantage of improved standards and technologies when they make sense — but we also need to balance the costs and benefits of change in a fiscally austere and change-resistant environment.
You do need to avoid adopting standards and processes developed by other organizations without first understanding how and why they were developed. That requires research, communication with the other organization, and thoughtful planning.
On balance I believe passage of the DATA Act will be a good thing as long as its implementation is effectively planned and managed. Before commenting on its ultimate success or failure, though, I’d like to see a detailed work plan with resources, responsibilities, a timeline, and deliverables. That’s just good project management.
The basic ideas behind the DATA Act’s focus on financial data standardization make such eminently good sense that efforts to weaken such standardization should be carefully and openly assessed. Fundamentally, data standardization, if managed well, can reduce costs, improve data manageability, reduce errors, and improve communication. Implementing data standards can also improve how data transparency efforts are supported — as long as the people who operate the underlying systems want to be more transparent.
Standardizing how the U.S. government collects, manages, and publishes budget and expenditure data, as required by the DATA Act currently before the U.S. Congress, is an example of a long-term and complex project. It will require careful planning, management, and sufficient resources to be successful.
Last week I attended a meeting in DC sponsored by the Data Transparency Coalition, PwC, and Intel. Representatives of the Federal agencies likely to manage implementation of the evolving DATA Act presented their thoughts on implementing the Act’s requirements for standardizing and reporting on Federal financial data.
While improving data transparency for both financial and non-financial data is an important goal, any effort requiring an enterprise-level shift in data formats or associated business processes can also demand a substantial early peak in resources. If in fact we need to delay implementing data transparency programs, perhaps we can use our time wisely by doing a more detailed job of research and planning.