In our discussions of data we seem to focus on two major areas: collecting a lot of it, and making sure it's clean and tidy. But there is a third idea worth considering in relation to overall data management, too. The collectors are the usual suspects who espouse the importance of capturing customer crumbs so that you can figure out what customers, at least in aggregate, want, think, and feel.
Then there are the cleansers, the vendors that provide ETL tools (extraction, transformation, and loading), which neatly describes a round trip also known as wash, rinse, and repeat. It's all good and valuable, but if that's all you have, you're missing something.
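For readers who haven't lived inside one of these tools, the wash-rinse-repeat cycle is easy to sketch. This is a minimal, hypothetical ETL round trip over a CSV of customer records; the field names are illustrative, not any vendor's schema.

```python
# A minimal ETL round trip: extract raw rows, transform (clean) them,
# load the tidy result. Field names ("name", "email") are assumptions.
import csv

def extract(path):
    """Pull raw rows out of the source file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Wash and rinse: normalize names and emails, drop rows with no email."""
    cleaned = []
    for row in rows:
        if not row.get("email"):
            continue
        cleaned.append({
            "name": row["name"].strip().title(),
            "email": row["email"].strip().lower(),
        })
    return cleaned

def load(rows, dest):
    """Write the cleaned rows into the destination store."""
    with open(dest, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "email"])
        writer.writeheader()
        writer.writerows(rows)
```

Real ETL products add scheduling, connectors, and error handling on top, but the round trip itself is no more than this.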
What’s missing is encapsulated in the sage saying that you don’t know what you don’t know. The data we collect is overwhelmingly first order in my conception. By that I mean that one way or another, I can collect a bunch of data about a customer and derive information from it. But the information won’t necessarily be enough for decision-making because it will not necessarily be complete.
Even in a business-to-business (B2B) setting, if I collect data about your company, where it's located, who the executives are, who the decision-maker is for the products I sell, and more, I might not stumble across the fact that your company is owned by some other company and, most importantly, that decision-making is handled by the other company. I could simply ask someone I'm calling on in a sales situation about that, and that's what I did in the bad old days. But truthfully, by the time I'm in the account it is already becoming too late, because I've invested time and resources. In other words, the cement isn't hardened yet, but it is setting up.
It would be preferable for planning purposes and resource allocation if I knew all that before I picked up the phone. As it turns out, that's at least part of the value of Data as a Service: the ability to fill in information gaps that might not get filled any other way. In a similar vein, a service might also yield non-trivial information such as a prospect's credit rating, which could come in handy when you want to get paid.
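Mechanically, that gap-filling is a merge: what you already know about the account, plus what the service knows. Here is a sketch under stated assumptions; `fake_daas` is a canned stand-in for a real DaaS lookup, which in practice would be an HTTP API with its own schema, and the field names (`parent_company`, `credit_rating`) are hypothetical.

```python
# Enrich a first-party account record with DaaS fields before any call
# is made. The lookup and field names here are illustrative assumptions.
def enrich(account, daas_lookup):
    """Merge externally supplied fields (parent company, credit rating)
    into what we already know, letting first-party fields win on conflict."""
    external = daas_lookup(account["company"])
    merged = dict(external)
    merged.update(account)  # first-party data takes precedence
    return merged

def fake_daas(company):
    """Stand-in for a real data service: a tiny in-memory directory."""
    directory = {
        "Acme Ltd": {"parent_company": "Globex Corp", "credit_rating": "BBB"},
    }
    return directory.get(company, {})
```

The point of the sketch is the ordering: the enrichment happens during planning, before resources are committed, not after the rep is already in the account.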
All this is to point out that the growing data industry is something of a three-legged stool, consisting of vendors that supplement, clean, and manage data for your company. This is obviously completely different from the data management you need to do to keep on top of accounts and revenues, but that's usually an in-house job, and there are other vendors for that.
I keep DaaS separate in my mind from sales or marketing intelligence. There's no reason a vendor can't supply both kinds of solutions, and it's likely many will want to, but they are different enough in my estimation to keep them distinct. Intelligence vendors provide breaking-news-style information that is often consumed by sales and service organizations, while DaaS can be consumed by sales and marketing but also by finance to help guide financial decision-making. The two services overlap, and it's getting harder to tell them apart. Each provides a relatively low-cost approach to more rational decision-making that didn't exist a short while ago.
Add this data discussion to a changing market and you can see where this is going. We are inexorably changing the way we sell, from a face-to-face process to something a bit more automated, especially in the early going. That means less opportunity to ferret out hidden data and greater reliance on electronic sources. As we contemplate the new year and our sales and marketing planning, it might be a good idea to examine how our organizations capture, manage, and use data, and whether or not we have enough of it.
Admit it: you never think about data storage anymore. We used to, quite a bit. Long before the storage glut that exists today, there was already too much data and it was hard to find places to store it. When we built software, it was with an eye toward squeezing the most out of our limited storage capacities; we looked after computer memory the way thirsty people in a desert manage their canteens. Bill Gates is often quoted, perhaps apocryphally, as saying we'd never need more than 640 KB on a computer. That was a long time ago.
Then we all got fat on data. In my lifetime, a megabyte went from a million dollars to chump change. The computer I write this on has four gigabytes of memory plus another half terabyte on an itty-bitty spindle. Storage is now just another elastic resource, available for a song on the Internet.
The availability of storage has encouraged us to capture and store data about the minutest details of our lives and our business processes. For a long time, vendors have captured data about keystrokes and mouse clicks, analyzing it in the hope of finding a nugget of value. This brute-force approach is inelegant, but it is brought to us by the ubiquity of fast, cheap storage and powerful analytics software increasingly built by out-of-work rocket scientists and Wall Street quants. It's there, so we use it.
I was at a conference hosted by Pervasive Software last week at which the company announced new storage and data integration products; more on that in a moment. While there, someone made the comment that data is the new oil, and I think I agree. Cheap energy made possible the huge economic activity driven by transportation, and oil doubles as a raw material. Is data next? It already is the raw material of the digital age, and it seems like collecting and analyzing data is the default answer to the question of what we do next in a variety of business situations.
Now, we already have a variety of IT products delivered as services. Everything from software to infrastructure is now available as commodity services easily accessed through the Internet. Pervasive and companies like it are talking about data as a service (DaaS) and with DaaS comes the need for data integration as a service too.
This is interesting on several levels, not least because we're watching the DaaS industry skip over its commodity and product phases and move directly into service. It's Joe Pine and Jim Gilmore's cup of coffee from their landmark book "The Experience Economy." You know the story: a vendor adds value to a commodity (coffee beans) through roasting and packaging and delivers a branded product to market. Then someone brews the stuff and serves it to you at a counter, and it becomes a service. A third person pipes in music, adds some overstuffed chairs and mood lighting, and you have an experience.
Data has become a service. I can’t help but wonder what the data experience will be but I think it might look a lot like Pervasive’s new product called Pervasive Galaxy. According to the press release “Pervasive Galaxy is a combination of service marketplace, revenue sharing, integration store and community that serves its OEMs and ISVs.” Sounds like an experience, no?
This opens up a lot of new territory in the cloud. For instance, a customer might want to use multiple SaaS applications and share data between them. That’s a reasonable objective and just as with more terrestrial applications, an integration has to be done to make the applications talk to one another. That’s where Galaxy comes in. With it vendors can write application connectors for popular packages and offer them — at commodity prices — to others in an environment not unlike iTunes or the AppExchange.
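At its core, a connector of the kind described above is a translation layer between two applications' record shapes. This is a minimal sketch of the idea; the field names are invented for illustration and are not any vendor's real schema.

```python
# Sketch of an application connector: a declarative field mapping that
# translates one SaaS app's record vocabulary into another's.
# All field names below are hypothetical examples.
FIELD_MAP = {
    "CustomerName": "Name",
    "EmailAddr": "Email",
    "Region": "BillingState",
}

def connect(record, field_map=FIELD_MAP):
    """Translate a source record into the target app's vocabulary,
    silently dropping fields the target doesn't know about."""
    return {dst: record[src] for src, dst in field_map.items() if src in record}
```

Because the mapping is just data, a marketplace like the one described can sell many such connectors cheaply: the code is the same, only the map changes.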
Intuit (QuickBooks) and Salesforce recently announced a packaged integration of their products aimed at the low end of the market. The glue that holds the integration together is based on this technology. Small companies that have used QuickBooks for years can slurp their customer data (admittedly a highly jargonized technical term; sorry) into Salesforce and build a reasonably relevant customer target list for things like cross-selling and up-selling.
A more robust application of the idea might seek to upload old data to generate a map of the customer base, especially useful for an organization trying to balance its sales territories or equalize service approaches. The permutations are as vast as your imagination.
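The territory-balancing idea is, computationally, just a roll-up of uploaded customer records by region. A sketch, assuming each record carries a hypothetical `state` field:

```python
# Sketch of a "map of the customer base": roll customer records up by
# territory so an ops team can spot imbalances. The "state" field is
# an assumed attribute of the uploaded records.
from collections import Counter

def territory_counts(customers):
    """Count customers per territory, bucketing records with no
    territory field under 'unknown'."""
    return Counter(c.get("state") or "unknown" for c in customers)
```

A lopsided count (say, one rep's territory holding triple another's customers) is exactly the signal a sales manager would use to redraw the lines.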
Now, I can understand the attractiveness of data as a subject — it is the reason for building the above-mentioned infrastructure, which is as necessary as plumbing. But if we follow the coffee analogy noted above, then the ultimate object of the data experience isn’t simply data or clean data or even cheap and reliable data. It is the information that we derive from it. The territory map just mentioned is merely data for the data supplier just as a cup of coffee is always a cup of coffee. But in the hands of the customer the coffee can become an experience and the data becomes vital information.
My reason for this obvious statement is simple. Now that data is a service, I think smart service vendors will see the importance of quickly advancing to the idea of selling information, or at least aiding and abetting its creation. Like the four-dollar cup of coffee at Starbucks, information carries a higher price tag and requires almost no additional work.