Last Friday the Labor Department announced that the U.S. economy added 176,000 private sector jobs in April while shedding about 11,000 in the public sector. The stock market rejoiced. The private sector number will likely be revised upward next month when May’s numbers come in, as has been the pattern for a while. So far in 2013, the economy has added an average of nearly 200,000 jobs per month, according to an article in the New York Times.
However, everywhere we look there are stories of decline and sluggishness. In my own unscientific data gathering I see great signs of new company formation, of venture capital and private equity firms sifting through the industry, calling me up for ideas, and trying to put some of their huge stockpiles of money to work. I also see too many companies trying to participate in what ought to be a recovery, but they’re putting only one foot in the water, testing it rather than committing enough to make a real difference.
So I see many vendors spending a little on marketing, but only enough to keep from missing the next wave, if that wave indeed comes in, and not enough to really make the wave happen. That kind of strategy works well in one’s personal life, at a microeconomic level, but it makes for poor macroeconomic performance.
In any economy, my spending is your income and vice versa, so if everyone decides not to spend, the result is a recession. Incomes go down, economic activity slows, you know the drill.
According to the U.S. Bureau of Economic Analysis, gross domestic product (GDP) in the United States expanded 2.5 percent in the first quarter of 2013, but the long-term average from 1947 to the present is 3.23 percent.
On the employment front we are trending down from the eight percent range. Unemployment was 7.5 percent in April according to the Bureau of Labor Statistics. The same office showed unemployment between 4.5 percent and 5.0 percent throughout 2007, the low point before the economy cratered.
We’re stuck in a false dichotomy in which we are all waiting for someone else to start the heavy lifting. But there is no one else. Perhaps now that Reinhart and Rogoff’s analysis, which supposedly showed austerity to be the solution to the stagnation that afflicts us, has been proved false, we’ll start to see more of a turnaround. But the economy is big and not subject to being turned on a dime. Nevertheless, I think 2013 is a pivot year and that things accelerate from here. That’s why I get concerned about timidity in the face of what I see as great opportunity.
It hit me last week while attending Oracle’s Analyst World briefing. We convened in a conference center on the Oracle campus in Redwood Shores to learn about Oracle’s latest developments in hardware and software and to be briefed on the company’s future roadmap. How extensive was it? Let’s just say that my brain hurt when it was over and I had to sign a five-year NDA to get into the building.
So what hit me? What ethical dilemma are Oracle and other enterprise companies facing? Ethics and the software industry may make for strange bedfellows to some people, but I do not believe we’ve ever seen an ethical dilemma quite like this one before, though others may well have existed.
Clay Christensen wrote elegantly about the Innovator’s Dilemma: the point at which an innovator must decide to supersede a product, or a whole line, with something offering better performance characteristics and a lower cost profile, or risk having a competitor do it and disrupt its established business. As Christensen showed, many, if not most, companies are pretty terrible at doing this. So the minicomputer makers completely missed the microcomputer wave, Kodak missed digital photography, and the list goes on.
But this dilemma also breeds an ethical problem of the same order. Suppose an innovator successfully transitions from the old product line to the new, and suppose further that the vendor continues to offer both lines. Which one does the vendor lead with or push through the sales force? Typically, the sales force is comfortable with the old line and, having made a good living selling it, is not very interested in selling the new stuff, which is why compensation plans get adjusted to incent the right behavior.
This is not far-fetched and is, in fact, what happens all the time. Typically, the new solution offers better price-performance characteristics, and that alone is enough to get it adopted by the market. More often than not there are also financial incentives for the customer that make the new solution so appealing that the decision about which product to buy never rises to the level of a dilemma, ethical or otherwise. This time, though, is different.
Here’s the rub. The new generation of hardware and software that Oracle and others are introducing might run well in a private data center, but its full benefits come through in cloud configurations. In fact, some customers will find the cost considerations work out best when they use the new devices in cloud configurations. In the cloud, as we all know, it’s not necessary to own the stack. Cloud vendors typically own the stack and sell it incrementally to customers on a periodic basis. I think this is one of Oracle’s long-term plays.
Oracle, SAP, Microsoft and others — except Salesforce, which set up camp in the cloud a long time ago — are now in a straddle position offering new technologies to old markets or hybrid configurations for companies that might be changing over slowly.
The question is: what do you lead with? Is there a duty for a vendor selling into a traditional on-premises data center to point out the obvious? This is what I consider the ethical dilemma.
I think there’s an obligation to inform customers that the choice between on-premises and cloud computing is no longer, at best, a toss-up. There are significant benefits and consequences to be considered. The market’s direction is clear: data centers are consolidating into the cloud and delivering major benefits, including lower costs, greater reliability and better security. If, after informed consent is obtained, the customer still wants to invest in the data center, that’s fine. I also recognize that these decisions are not as simple as my example. That’s why it’s a dilemma.
At some point in the not-too-distant future, though, it will be impossible to justify on-premises computing for routine business application work. So when customers are considering new purchases, salespeople today have a responsibility to inform them — and to capture informed consent — that the direction of the market, along with cost-benefit considerations, now favors cloud computing. Buying a solution that uses cloud-oriented “hybrid” architectures might be a palliative approach to the conflict between premises-based and cloud solutions, but the subject has to be broached.
“Oh, you want to refit your in-house data center with a new generation of technology? Ok, are you aware of the significant advantages of cloud computing? Are you aware of the market’s movement in that direction?” These and other questions now need to precede the standard, “Sign here. Press hard. The third copy is yours.”
I saw an ad for a webcast the other day and it said in part:
“The scope, scale and complexity of enterprise data centers is rapidly rising due to increased use of virtualization, cloud, big data and mobility. Applications and workloads are becoming more dynamic and volatile and IT staff are being asked to become more efficient and responsive. Automation across physical, virtual and cloud data centers is vital for effective operations and consistent service levels.”
Did you catch the change? Today it’s virtualization, cloud, big data and mobility, the new four horsemen of business advance. In case you’re wondering, they replace social, SaaS, mobile and cloud. Small difference? Yeah, but big change. If you were hip over the last five or so years you did the social, SaaS, etc. thing, but if you missed the onramp, virtualization and big data give you a chance to save face. You weren’t being overly conservative. No, no, NO! You were being prudent, waiting for the technologies to mature into a coherent whole.
Really? After all this time and all the disruptive innovation cycles, you were waiting? Coherent?
In case you were wondering, virtualization and cloud made SaaS acceptable to those who worried obsessively that their data, the same data they couldn’t find an elephant hiding in, would suddenly reveal golden nuggets to hackers. Big Data gives us all a way to accept social without ever for one minute admitting that our employees were not simply “playing” with social media at work — you can and should thank analytics for that. And mobility is mobility because your customers and employees are walking — some away from you and some toward you, and you need to know which is which and put that knowledge to use.
I was in The Valley the other day talking with a guy who is sometimes a client but always a friend. He’s a young guy who worked at Salesforce back in the day, did another startup with a Salesforce alumnus and is now on his third company, this time running the whole marketing shebang. His take? Companies are looking to form data centers of excellence around analytics.
My take? It’s IT’s way of preserving itself. Remember Gartner’s forecast that the CMO would soon be spending more on technology than the CIO? This is IT’s response and I think it’s a good one because it potentially shows both groups reaching out to create greater value for the enterprise.
As commodity servers take hold of the world, it becomes less and less rational for a company to run its own IT, so virtualization and cloud, here we come. But what’s left behind is very interesting. IT might be buying the commodity farm, but the secret sauce is still information and how you use it. So the IT data center of excellence is both a way to keep IT employed, and more or less in house, and an important way for IT to save some serious coin on commodity processing.
Larry Dignan of ZDNet put his finger on it about a week ago when he examined the possibility that IBM might sell off its x86 server business to new pal Lenovo. Servers are not going away, but they are going to the farm, and with that change comes greater focus on the management systems overlaying everything, because server farms are becoming huge. This opens up opportunities for companies like Oracle/Sun. Despite the catcalls from critics calling advanced servers the new mainframes, they have an important purpose and a growing niche, not to mention a new and as yet unstated goal of nine nines of reliability to achieve the promise of true utility computing.
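To put nine nines in perspective, here’s a quick back-of-the-envelope sketch, written in Python purely as my own illustration (nothing in it comes from Oracle or any vendor), of how much annual downtime each level of availability actually permits.

# Back-of-the-envelope arithmetic: allowable downtime per year at each
# level of "nines" availability. Illustrative only.

SECONDS_PER_YEAR = 365.25 * 24 * 3600  # roughly 31.6 million seconds

for nines in (3, 5, 7, 9):
    downtime_seconds = SECONDS_PER_YEAR * 10 ** (-nines)
    print(f"{nines} nines of availability: {downtime_seconds:.4f} seconds of downtime per year")

Three nines comes to nearly nine hours of downtime a year; nine nines comes to about 32 milliseconds, which is why the phrase "true utility computing" is apt.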
So, yes, the scope, scale and complexity of data centers, wherever they’re located and whatever they’re called, are rapidly increasing, and as the economy continues its rebound I will remain interested in finding out which vendors commit to the opportunity with both feet and which keep testing the water.
Erik Brynjolfsson and Andrew McAfee of the MIT Center for Digital Business and the Sloan School of Management have written an interesting book for our times — our economic times — with an appealing metaphor that any technologist will appreciate. Race Against the Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy is short and to the point, and it ought to be required reading.
The subject matter is employment growth, or the lack of it, in this rather austere recovery, and the effect on future employment and growth. More specifically, it is about the changing relationship between humans and their creation, the computer — the almost-thinking machine — and how it can out-compete its masters not only in routine manufacturing tasks but, increasingly, in jobs that were once thought to be the exclusive province of human thinking.
The metaphor, borrowed from futurist Ray Kurzweil, holds the narrative together and is worth pondering before we continue. It is told in the form of a story about the invention of chess. The emperor, the story goes, was so delighted with the game that he granted its inventor a wish. The sly inventor asked for a grain of rice to be placed on the first square of the chessboard, two on the second, and double the prior square’s total on each succeeding square.
It is a story of exponential growth. Accumulating the rice on the first half of the chessboard was manageable, but the second-half totals were truly staggering; from small beginnings arose a mountain of rice that would dwarf Mt. Everest. McAfee and Brynjolfsson apply Kurzweil’s story to another runaway exponential progression, Moore’s Law. You may not need to be reminded that Moore suggested that computing power would double, and its cost halve, every 12 months or so. With some fine-tuning the period was revised to 18 months, and the progression has continued for virtually the entire careers of everyone in the technology industry today.
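For readers who like to see the arithmetic, here is a minimal sketch, in Python (my choice of illustration, not the authors’), that tallies the grains on each half of the board.

# Kurzweil's chessboard: one grain on the first square, doubling on each of
# the 64 squares. Illustrative arithmetic only.

first_half = sum(2 ** square for square in range(32))        # squares 1-32: ~4.3 billion grains
second_half = sum(2 ** square for square in range(32, 64))   # squares 33-64: ~18.4 quintillion grains

print(f"First half of the board:  {first_half:,} grains")
print(f"Second half of the board: {second_half:,} grains")
print(f"Ratio: {second_half // first_half:,} to one")        # about 4.3 billion to one

The second half alone holds roughly 4.3 billion times as much rice as the first, which is the authors’ point about where we now stand on the board.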
But that’s not the crucial part of the story or the book. The authors calculate that we have only recently (in the last few years) crossed over from the first 32 squares of the chessboard into the second half, where a metaphorical Everest awaits us. The gains in the second half of the chessboard will likely come from advanced software and algorithms, not hardware per se. They point out that while computing power increased a thousandfold on the first half of the chessboard, the power and quality of our algorithms increased 43,000-fold.
The chessboard’s second half is already giving us systems that can diagnose better than doctors, out-lawyer lawyers and, of course, kick booty on Jeopardy!. So what will happen next? McAfee and Brynjolfsson are quick to point out that the thinking that machines do is not the thinking of humans. It is often the lightning-fast brute force of crunching a great deal of data to ferret out an answer. Also, training a machine to pick up a pencil from a random tabletop, let alone use it, is still elusive.
The major point of Race Against the Machine is that there really is no race, or if you think there is, be prepared to lose. But the book is fundamentally hopeful because it suggests that machines are tools that ought to offload people’s rote tasks so they can concentrate on the creative, entrepreneurial and innovative endeavors that only humans can engage in.
What’s powerful about this argument is that it offers a prescription for a solution: Leverage the machine rather than fight it. Crunch big data in every way you can imagine, ask “What if” questions ad nauseam and above all, innovate.
As I look at CRM and the broader technology world, it seems to me that the subdivided chessboard is visible everywhere. What we once called innovation was surely innovative, but it was really no more than the automation of what existed before. True innovation starts on square 33, when we realize that most of the automation is behind us and innovation means creating totally new concepts.
McAfee and Brynjolfsson speculate that we reached square 33 in the middle of the last decade. If true, one of the first true innovations we all witnessed was the social media revolution. Combined with the mobile revolution and today’s quest to master Big Data, we have a potent nucleus on which to invent new businesses, models, and processes.
And if we combine what we know about the chessboard, and where we are on it, with what we know about our industry, we can clearly see that some vendors are simply automating on the first half of the board while others are innovating on the second half.
Let’s not fool ourselves, though: simply innovating on the second half of the chessboard is no guarantee of success; as always, bad ideas will still yield bad results. But it is also true that failing even to try to enter the second half is a sure route to oblivion. This is not simply a matter for the tech sector or even the nation. It’s a large-scale economic issue that will affect our species.
Historians and others often debate when specific eras start, because eras rarely follow calendars and precise dates. Some people say the twentieth century really started in 1890 with the closing of the American frontier, for instance. With that as a guide, and McAfee and Brynjolfsson’s fine, short book as context, I’d say the twenty-first century started in about 2006.