While a central program management operation can define detailed technical requirements, technical approaches, and management tools, implementation work needs to happen locally – while the “train is still running.” How this overall governance process is managed will determine how long DATA Act implementation takes, how much it costs, and whether it succeeds.
On balance, I believe passage of the DATA Act will be a good thing, as long as its implementation is effectively planned and managed. Before commenting on its ultimate success or failure, though, I’d like to see a detailed work plan with resources, responsibilities, a timeline, and deliverables. That’s just good project management.
Standardizing how the U.S. government collects, manages, and publishes budget and expenditure data, as required by the DATA Act currently before the U.S. Congress, is an example of a long-term, complex project. It will require careful planning, management, and sufficient resources to be successful.
Last week I attended a meeting in DC sponsored by the Data Transparency Coalition, PwC, and Intel. Representatives of the Federal agencies likely to manage implementation of the evolving DATA Act presented their thoughts on meeting the Act’s requirements for standardizing and reporting on Federal financial data.
I’ve been researching government program transparency and the hype surrounding “big data.” Given OMB’s recent statement of support for improving access to accurate Federal spending data, I’ve also been giving some thought to what improved access might actually mean, based on my own experience with data conversion and consolidation projects.
In Forrester’s Top 15 Emerging Technologies To Watch: Now to 2018, Brian Hopkins provides a peek at the results of a 2012 online survey asking respondents which technologies they expect to be the most “disruptive.”
I appreciate that scaling, discoverability, and innovation can all be enhanced when data sets of greater size, variety, quality, and number surrounding a particular process or function are aggregated and exposed. Jewels can become visible. Inconsistencies can be identified and resolved. Impacts can be tracked.