
Understanding the Challenges of Big Data Project Management: “The Data Must Flow”

By Dennis D. McDonald


I’m currently researching big data project management in order to better understand what makes big data projects different from other tech-related projects. So far I’ve interviewed more than a dozen government, private sector, and academic professionals, all of them experienced in managing data-intensive projects. What I’m finding is that these experienced professionals believe that big data projects do share characteristics with other types of projects.

Such similarities, however, have less to do with technology than with how projects are governed in relation to other enterprise processes and systems. For example:

  1. Much learning is needed. Many if not most organizations still have a lot to learn when it comes to making use of emerging big data analysis techniques. Understanding of the potential costs and benefits of predictive and descriptive analytics is unevenly distributed throughout most organizations. Senior leadership support aside, this means that adoption of new analytical techniques and data sources as standard inputs to management and decision-making still has a long way to go and is heavily dependent on what types of costs and benefits are experienced along the way. Organizations that are already to some extent “data-driven” in their decision-making will be better positioned to adopt new techniques than those that are not; organizations that are not already data-driven will take longer to recognize and internalize big data benefits.
  2. Data type matters. Data types that are already well defined and understood will be adopted and used by organizations more readily than highly specialized or process-specific data. An example of the former: financial data subject to standard audit definitions and practices. An example of the latter: specialized process or transaction data that can be easily misinterpreted (or even misused) if made available without appropriate contextual information to support its interpretation.
  3. Overlapping enterprise changes. It is often difficult to separate the planning and management of big data projects from other ongoing organizational changes. An example is a push to make better use of data science techniques at the same time the organization is moving data management and application hosting from in-house data centers to the cloud. Scheduling and risk management in such situations become more complex given the sequence in which cost and benefit dependencies emerge.
  4. “The data must flow.” As the emergence of corporate email did for previous generations, managing data at the enterprise level forces management to confront the data ownership and data siloing issues that hinder managing data as an enterprise-level asset. Just as the imperial edict “the spice must flow” recognized the interdependency of the Houses in the Dune universe, so too should “the data must flow” be the watchword in organizations taking a holistic view of data and its potential uses. This may be a complex undertaking in organizations where data management systems are not integrated or where organizational units have traditionally operated independently. Even where resistance to unified data governance is not highly politicized, basic differences in enterprise technology architecture may still have to be accounted for if management wants to make data management more systematic and aligned with corporate strategy.
  5. Enterprise portfolio management. Data governance needs to be integrated with enterprise portfolio management. What this integration looks like will necessarily vary from organization to organization. If we view enterprise portfolio management as the systematic planning and management of all corporate resources in alignment with corporate business and performance objectives, relevant data likewise need to be viewed as resources to be managed. That means that business process, technical project, and data management will all need to be integrated as projects are organized into programs for efficient management; a rough sketch of what such an integrated data-asset inventory might look like follows this list.
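
To make point 5 a bit more concrete, here is a minimal, purely illustrative sketch in Python of a data-asset inventory that links each data resource to an owner, a steward, and the business objectives it supports. The names used here (DataAsset, business_objectives, assets_by_objective, and the sample entries) are my own assumptions for illustration, not a reference to any particular governance tool or to the organizations I interviewed.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class DataAsset:
    """A single data resource tracked alongside other portfolio resources."""
    name: str
    owner: str                      # accountable business unit
    steward: str                    # person responsible for quality and context
    business_objectives: List[str]  # corporate objectives the asset supports
    contextual_notes: str = ""      # interpretation guidance (see point 2 above)


def assets_by_objective(assets: List[DataAsset]) -> Dict[str, List[str]]:
    """Group data assets under the business objectives they support, so that
    data governance reviews and portfolio reviews can share the same view."""
    view: Dict[str, List[str]] = {}
    for asset in assets:
        for objective in asset.business_objectives:
            view.setdefault(objective, []).append(asset.name)
    return view


if __name__ == "__main__":
    # Hypothetical inventory entries, purely for illustration.
    inventory = [
        DataAsset("general_ledger", "Finance", "A. Analyst",
                  ["quarterly reporting", "cost reduction"],
                  "Subject to standard audit definitions."),
        DataAsset("plant_sensor_logs", "Operations", "B. Engineer",
                  ["cost reduction"],
                  "Requires process context to interpret correctly."),
    ]
    for objective, names in assets_by_objective(inventory).items():
        print(f"{objective}: {', '.join(names)}")
```

The point is not the code itself but the shared view it represents: when data assets, their owners, and the business objectives they support sit in one inventory, data governance and enterprise portfolio management can work from the same picture.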

Managing data and metadata at an enterprise level to facilitate efficient tool use can be a complex undertaking. This is especially true when corporate actions such as transitioning IT resources to the cloud, constantly upgrading technologies, and increasing attention to privacy and security must also be considered. Such complexity should not be a cause for discouragement but should help drive the organization to become more disciplined in how it generates value from its data.

If what Nate Silver said in a recent presentation is true – “Big Data has Peaked, and that’s a Good Thing” – perceptions about big data are maturing.  Perhaps we are becoming better positioned now than before to make disciplined assessments of the costs as well as the benefits of big data.

Do you have some insights to share about managing big data projects? Let’s talk. Call me at 703-402-7382 or email me at ddmcd@yahoo.com.

The above article was also published as “Big Data Project Management: Data Must Flow!” on CTOVision.com.

Copyright (c) 2015 by Dennis D. McDonald, Ph.D. 

Follow me on LinkedIn, Twitter, and Google+