Data is the New Oil

December 20th, 2010

I attended my first “unconference” this week; the event was CloudCamp, where early adopters of cloud computing technologies exchange ideas. The host of the event, Dave Nielsen, a web services industry professional, made an interesting comment during the event: “data is the new oil”. After the CloudCamp unconference I did some thinking about the statement and came up with my own analogy of how data is like the “new oil”.

In the oil & gas industry there is a theory called “Peak Oil”, this is the point in time when the maximum rate of global petroleum extraction is reached – after which the rate of production enters terminal decline; as a result you see the value of oil continuing to rise. One interesting concept to solve the peak oil crisis is using enhanced oil recovery techniques. This is when an oil well/field reaches the end of life; when 75% of the oil is not recovered this is known as stranded oil. In the energy industry they are using enhanced oil recovery techniques like oil sands and heavy oil recovery in which they are able to recover the stranded oil from the ground and to get it production.

When I work with customers on new CRM projects, I typically find they have no shortage of data. It is stored on the desktop in applications like MS Outlook, MS Excel, and PDFs, and on servers in accounting systems, CRMs, and web sites. The challenge for the sales and marketing teams is that they don’t use 75% of that data because it is stranded within these standalone applications. Companies often invest in new applications when they need more data to grow the business, but they quickly hit “peak data”: the point where adding more systems produces diminishing returns on the use of the new data.

To solve this issue, companies need to go back to the sources of data that already exist within the organization and apply optimization techniques while centralizing the data into a CRM. Dave Nielsen also commented that “you need to prepare the data to move it out to the cloud” – in the case of POIM customers the cloud is SalesForce.com. To optimize your existing data you will need to perform data deduplication and data normalization.

Data deduplication – In computing, data deduplication is a specific form of compression in which redundant data is eliminated, typically to improve storage utilization. In the deduplication process, duplicate data is deleted, leaving only one copy of the data to be stored. …
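To make the idea concrete, here is a minimal sketch of deduplicating a contact list before loading it into a CRM. It assumes a hypothetical export file named contacts.csv with "email", "first_name", and "last_name" columns (those names are illustrative, not from any particular system), and keeps one record per normalized e-mail address.

```python
import csv

def dedupe_contacts(path):
    """Keep one record per e-mail address (case- and whitespace-insensitive)."""
    seen = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = row["email"].strip().lower()
            # First occurrence wins; later duplicates are dropped.
            seen.setdefault(key, row)
    return list(seen.values())

if __name__ == "__main__":
    unique = dedupe_contacts("contacts.csv")  # hypothetical export file
    print(f"{len(unique)} unique contacts")
```

In practice you would pick a smarter survivorship rule than “first one wins” (for example, keep the most recently updated record), but the principle is the same: one copy of each contact goes to the cloud.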

Data normalization – In the field of relational database design, normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and free of certain undesirable characteristics—insertion, update, and deletion anomalies—that could lead to a loss of data integrity. …
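As a rough illustration of normalization, the sketch below splits a flat contact list into two related tables, accounts and contacts, so company details are stored once instead of being repeated on every contact row. The column names ("company", "website", etc.) are assumptions for the example, not fields from any specific CRM.

```python
def normalize(flat_rows):
    """Split flat rows into account records (stored once) and contact records
    that reference their account by id, removing repeated company data."""
    accounts = {}   # company name -> account record
    contacts = []   # contact records linked to accounts via account_id
    for row in flat_rows:
        name = row["company"].strip()
        if name not in accounts:
            accounts[name] = {
                "id": len(accounts) + 1,
                "name": name,
                "website": row.get("website", ""),
            }
        contacts.append({
            "account_id": accounts[name]["id"],
            "first_name": row["first_name"],
            "last_name": row["last_name"],
            "email": row["email"],
        })
    return list(accounts.values()), contacts
```

Once the data is structured this way, updating a company’s website means changing one account record rather than dozens of contact rows, which is exactly the kind of update anomaly normalization is meant to prevent.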

If data is the “new oil” for your business, take the time to go back to your existing data and use optimization techniques to maximize the benefit of an existing asset.

http://www.cloudcamp.org/
http://www.platformd.com/
http://en.wikipedia.org/wiki/Peak_oil
