
Why Haven’t We Replaced Obsolete Federal Government Computer Systems?

By Dennis D. McDonald

Eventually, the old systems will break down. They will die when patching and repairing are no longer viable. Even venerable B-52 Stratofortresses will have to be replaced some day. The same will inevitably happen to all the Federal legacy systems that need replacement.

John Kamensky's review of the Federal government's progress in modernizing its IT resources is concise and well thought out. He quickly reviews steps taken since 2008 to transform Federal IT. He is optimistic that modernization efforts now underway, as envisioned in the recently released Federal CIO Council’s State of Federal Information Technology, can succeed in transforming IT away from a decades-long proliferation of custom applications built on top of legacy systems.

According to Kamensky, the necessary groundwork for this transformation has been laid in three areas:

  1. Establishing "digital services" teams to help agencies modernize and upgrade citizen-facing government services.
  2. Upgrading and adapting government procurement processes to streamline and accelerate IT resource acquisition.
  3. Developing and promulgating improved processes and standards for creating digital services (for example, the Digital Services Playbook).

How successful have these efforts been in laying the groundwork for continued "digital transformation" and the replacement of outmoded legacy systems with more modern and efficient resources?

Perhaps it is significant that the three system success stories mentioned by Kamensky -- a faster VA disability compensation process, a streamlined certification process for small businesses, and an improved college scorecard tool -- seem not to replace existing legacy systems but to make better use of the data those systems already provide.

The major efforts reviewed involve people, procurement, and process, not necessarily technology. That's all very positive. Still, adding more layers on top of existing legacy systems does not necessarily move us closer to the goal of replacing those systems.

This is a typical system integration dilemma, one often faced by organizations that were not fortunate enough to grow up in the "digital natives" era. The GAO's 2016 report Federal Agencies Need to Address Aging Legacy Systems, also cited by Kamensky, illustrates the challenge:

Federal legacy IT investments are becoming increasingly obsolete: many use outdated software languages and hardware parts that are unsupported. Agencies reported using several systems that have components that are, in some cases, at least 50 years old. For example, Department of Defense uses 8-inch floppy disks in a legacy system that coordinates the operational functions of the nation's nuclear forces. In addition, Department of the Treasury uses assembly language code—a computer language initially used in the 1950s and typically tied to the hardware for which it was developed.

Even though OMB and GAO have provided much-needed management and oversight for efforts at Federal computer system "transformation," the devil is in the details. It always is with system integration efforts that touch on a wide variety of technical systems and architectures. If you have ever studied a huge wall chart illustrating the multiple interconnected systems that support major Federal programs at agencies such as EPA, VA, or NOAA, you’ll know what I mean.

Also, just keeping legacy systems running in the background to feed more modern user-facing systems is, as the GAO report points out, a risky and expensive proposition. Agile development and procurement methods alone will not solve such problems. What's needed is a continuous effort -- supported at the top -- to promote a unified development vision that supports both the modernization and the retirement and replacement of legacy systems.

Will such an effort be sustained by the Trump administration? That's hard to say, given the shift of funding priorities away from non-defense budgets along with a possible reduction in Federally sponsored data collection and data availability. Maintaining a strategic focus in such a rapidly changing environment may be difficult given the tendency of stressed organizations to concentrate on short-term survival and “keeping the lights on.”

Eventually the old systems do break down and die when patching and repairing are no longer viable. Even venerable B-52 Stratofortresses will have to be replaced some day, at which time the current patchwork of logistics, maintenance, and weapons systems upgrades will go away.

The same will happen, eventually, with all the Federal legacy systems that need replacement. The question is, will replacement occur rationally, or only after disaster or organizational turmoil?

I, for one, vote for a rational approach, one that ties open data access and intelligent data governance to systems and processes that effectively and efficiently serve the American public.

Copyright © 2017 by Dennis D. McDonald
