
Interim Report on the Generalizability of the NOAA Big Data Project’s Management Model

By Dennis D. McDonald

I’ve been independently researching the generalizability of programs like NOAA’s Big Data Project, in which cloud vendors (Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp., and the Open Cloud Consortium) are recruited to provide both public access to government data and support for its commercialization.

The goal is a win-win situation:

  • The government wins by providing, and the public wins by gaining, access to data that might otherwise have to remain “behind the firewall” (a brief illustration of such access follows this list).
  • The private sector wins by generating revenue from new products that meet market demands.
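
For a concrete sense of what “public access” via an anchor partner can look like, here is a minimal sketch in Python that lists a few files from NOAA’s NEXRAD Level II radar archive as hosted on Amazon S3. The bucket name (noaa-nexrad-level2) and the year/month/day/station key layout are assumptions based on how the archive is publicly hosted, so treat this as illustrative rather than as an official program interface.

    # Minimal sketch (assumed names): list public NOAA NEXRAD Level II files
    # hosted on Amazon S3 under the Big Data Project.
    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    # The bucket is public, so unsigned (anonymous) requests work;
    # no AWS account or credentials are needed.
    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

    # Assumed key layout: year/month/day/station/
    response = s3.list_objects_v2(
        Bucket="noaa-nexrad-level2",
        Prefix="2015/06/01/KTLX/",
        MaxKeys=5,
    )
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])

Because the data are public, no registration is required to read them; the other anchor partners presumably offer analogous access mechanisms on their own platforms.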

Experiments like this make sense. It’s good to see the Federal government and the private sector working together to create value from data, value that might not be realized were its use restricted to specifically funded and legislated programs.

The question arises: how do you manage a program like this? Other government agencies are interested in the NOAA model as a way to open up their data resources for commercial use. What will we learn from the NOAA experiment? And, importantly, how will this learning be shared?

To begin answering questions like these I have been talking with people both inside and outside the government. This article is an interim report on what I’ve been finding, organized as follows:

  1. Which management model is most appropriate?
  2. How much transparency is appropriate?
  3. Some preliminary observations

1. Which management model is most appropriate?

The project has been initiated via a series of CRADAs:

A Cooperative Research and Development Agreement (CRADA) is a written agreement between a private company and NOAA to work together on a project. Created as the result of the Stevenson-Wydler Technology Innovation Act of 1980, as amended by the Federal Technology Transfer Act of 1986, a CRADA allows NOAA and non-Federal partners to optimize their resources, share technical expertise in a protected environment, share intellectual property emerging from the effort, and speed the commercialization of NOAA-developed technology.

CRADAs are one of the principal mechanisms used by NOAA laboratories to engage in collaborative efforts with non-federal partners to achieve the goals of technology transfer. The CRADA, which is not an acquisition or procurement vehicle, is designed to be a relatively easy mechanism to implement, requiring less time and effort to initiate than previous methods for working with non-government organizations. The CRADA is also intended to take into account the needs and desires of private industry when commercializing a product.

What this means in practice for the NOAA project is that requirements for what the program will do and how it will do it have not been completely worked out.

It’s an experiment. One of the things that’s typical of experiments is that you can’t always predict how things will turn out.

You don’t manage an experiment the same way you manage, say, a construction project. For example, you handle risk and uncertainty very differently. Uncertainties include the marketability of products and services that might be developed from NOAA’s public data and the investment necessary to bring a product to market.

This question of revenue potential weighs heavily on potential private sector participants, including those that may already have worked for NOAA or its prime contractors via conventional contracting methods. “How are we going to make money from this?” was a typical comment I heard from contractors I interviewed. Another typical comment: “There’s no way my management will approve our participation in this given our need to make an unknown investment.”

On the other hand, one data-market-savvy contractor told me, “The government is not qualified to make data commercialization decisions and should let the private sector decide what will make money.”

In management terms, then, when (a) risk and uncertainty are involved and (b) success will be at least partly defined by the development of marketable products or services, the appropriate model must at minimum be one where uncertainty and risk are surfaced and addressed as quickly as possible.

2. How much transparency is appropriate?

One of NOAA’s goals is to “open up” access to more data so that more people can take advantage of data generated by publicly funded programs. While it’s possible to position the program as an example of open data programs generally (which have received mixed reviews recently), the NOAA program’s partners and participants must grapple with how best to generate value from public data via activities that will ideally stimulate the development of jobs and useful revenue-generating products and services.

One advantage NOAA has is that its data are for the most part sensor- and satellite-generated. Questions of personal privacy are not as severe as they are with data related to, say, health or personal income, so requirements for data obfuscation are reduced.

Still, openness and transparency in how the program operates and is managed are important issues here. “Lessons learned” will be an important program justification both for NOAA itself (the CRADA is not really a management and oversight tool) and for other Commerce and non-Commerce agencies that may want to use the NOAA program as a model to help improve public access to their data.

Given the commercial status of most of the anchor partners, it will be interesting to see how “open and transparent” their activities are, considering the competitive nature of most private-sector-sponsored product development efforts.

Having myself been involved in managing the development and delivery of private sector data-based products, I know that some level of secrecy and confidentiality is typical in order to prevent the inappropriate release of information to competitors.

If you are interested in partnering on a typical federal government contract listed on FedBizOpps.gov (www.fbo.gov), you can list your organization publicly as an “Interested Vendor.” Your contact information is then openly available to anyone tracking that opportunity. This facilitates partnerships among those interested in the contract and is potentially useful for organizations without established relationships that are trying to enter a new market.

On the other hand, if you are interested in partnering with Amazon on the NOAA project and sign up on the official Amazon NOAA Big Data Project web page, your contact information will not be made publicly available.

From a competitive private sector perspective that makes perfect sense. As a potential partner, though, do you now need to “sign up” and begin engaging with multiple anchor partners in order to determine whether there is a commercial opportunity for you? Also, to what extent will information about anchor partner activities be shared across Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp., and the Open Cloud Consortium? Will all have the same definition of what program and market information should be open or closed? And how will NOAA coordinate the activities of the different partners, assuming NOAA decides such coordination is appropriate?

3. Some preliminary observations

The above are some of the questions to work through as the government considers how to manage the NOAA Big Data Project and how to apply its model to other areas. Additional observations:

  1. Program uniqueness. Even though the volume and variety of the data NOAA generates are huge, highly targeted products might also be developed to address smaller niche markets. It may therefore be premature to conclude that the NOAA program is not applicable to lower-volume and more static Federal data sources. The anchor partners are learning, too, as they adapt ongoing cloud marketing methods to NOAA. It will therefore be useful to sort out which lessons learned are specific to the NOAA program and its data and which are generalizable.
  2. Collaboration. It makes sense that a standard procurement-and-contractor model may not be appropriate for managing such an experimental program. The private sector is being asked to do many things here, all of which carry cost and risk, including doing what the Federal government is not always well qualified to do: build and sell potentially commercially viable products. The management and oversight process might therefore be more appropriately focused on promoting collaboration and on sharing information among participants and stakeholders. Management must also pay attention to balancing justifiable competitive and confidentiality interests. For example, an attempt might be made to establish a central clearinghouse to promote and share information among potential partners and vendors, where both technical and market information can be reasonably shared. Just explaining what the program is all about to people inside and outside the government is going to be a major task. Keeping everyone “on the same page” with the same set of facts should be a management priority.
  3. Strategic Agility. We may find that only well-established partners with prior knowledge of both the data and the markets for the data will be successful. While that may not be a surprising finding – the big and established do tend to get bigger and more established, after all – that would in my opinion be an unfortunate outcome. What I’ve noted in my research is that a wide variation exists in how companies react to the ideas behind the NOAA Big Data Project. There’s still a lot of “that’s not how we do things here” at work both inside and outside the government. Some people with an ingrained and traditional government-procurement relationship model seem to have a hard time thinking more strategically about whether or not they might be able to participate in a program like this. Smaller and more agile (and potentially more innovative) companies may be more open to experimenting. Given that NOAA seems to be “outsourcing” the relationship with these smaller and more agile companies to its anchor partners, will NOAA be able to learn from how these relationships evolve so that other programs can avoid having to reinvent the wheel?
  4. Success. There is no guarantee of success. If and when “failures” do occur, will the political fallout be such that the entire program is tarred and feathered? Or will “lessons learned” be documented and transferred quickly to others so the same mistakes can be avoided? This to me is the biggest danger of too much secrecy. It’s one thing to tolerate secrecy despite some increase in inefficiency; it’s another for secrecy to increase the likelihood of failure. Where do you draw the line? That’s hard to say. After all, this is an experimental program. Hopefully NOAA will establish management and oversight processes that address such issues with an appropriate level of transparency.

Copyright © 2015 by Dennis D. McDonald, Ph.D.