This is hard: moving a huge number of customers from on-premises systems to cloud systems while continuing to run the business. Hard, but there are few alternatives. The last time anyone seriously tried to rip and replace was the Y2K conversion of back-office systems to accommodate the new century’s date format, an effort that nearly clobbered many companies.
This time, vendors like Oracle have no interest in repeating that mistake, so as they migrate their installed bases to the cloud they are being careful to provide interim steps that lessen the load and the complexity, though probably not the costs. In announcements made by founder, chairman, and CTO Larry Ellison, Oracle outlined a series of cloud services designed to help all manner of customers, from partners to enterprises, migrate to cloud systems. But note that migration does not automatically mean moving to a multi-tenant architecture like Salesforce’s and many other vendors’. For Oracle, moving to the cloud means only a literal translation from premises to cloud. Call it step one.
For many customers that’s enough, because moving off applications they may have been using for 15 or more years is a heavy lift. The alternative would be to build new cloud apps on Oracle’s platform, but for some companies that would take many years and many dollars. The solution of moving the existing applications to the cloud and then contemplating a rewrite seems inelegant, but in fact it makes a lot of sense. Many customers will realize significant savings by sending parts of their datacenters to the cloud, monies they can apply to new business process support.
Here are the services announced and my take on them.
Oracle Database Cloud—Exadata Service. This is very interesting because the Exadata hardware supporting this service is worth the price of admission. Exadata provides orders of magnitude speedups for most database functions because it operates in memory virtually all the time. So big reports, analytics and other database operations run much closer to memory speeds than disk speeds—in other words about a million times faster. That’s nice especially if you need to find more cycles to dedicate to data encryption for security.
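The “memory speeds versus disk speeds” point is easy to sanity-check. Using ballpark latency figures of my own (illustrative assumptions, not Oracle benchmarks), a random memory access costs on the order of 100 nanoseconds while a rotating-disk seek costs on the order of 10 milliseconds:

```python
# Back-of-envelope comparison of random-access latency, memory vs. disk.
# Both figures below are assumed ballpark numbers, not vendor benchmarks.
DRAM_ACCESS_S = 100e-9   # ~100 nanoseconds per random memory access
DISK_SEEK_S = 10e-3      # ~10 milliseconds per random disk seek

speedup = DISK_SEEK_S / DRAM_ACCESS_S
print(f"Memory is roughly {speedup:,.0f}x faster for random access")
```

On these assumptions the ratio is about 100,000x, which is the several-orders-of-magnitude neighborhood the article describes; the exact multiple depends heavily on workload and hardware.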
Oracle Archive Storage Cloud Service. Archiving is necessary and far from glamorous, but somebody’s got to do it, and I can see many enterprises happily paying whatever Oracle charges.
Oracle Big Data Cloud Service and Big Data SQL Cloud Service. If you need Hadoop and NoSQL databases in your enterprise, this is for you. Though this is another less-than-sexy service, its need is readily apparent for large enterprises and small. It’s also probably more than many will need but that’s likely to be viewed as a good thing.
Oracle Integration Cloud Service. Everyone needs integration services, but it’s surprising to see this elevated to the scale of a cloud. Many other vendors get by with their platforms and APIs, and that’s telling. If you’ve been an Oracle customer since the days of green screens on a VAX, then this is something you might need to make sense of your less-than-third-normal-form relational database.
Oracle Mobile Cloud Service. This is a tool for developing and deploying mobile apps. It sounds great, but it raises the question: why can’t we just define apps once and generate running code for multiple target platforms? The answer is that some apps don’t have definitions that plug into code generators; they’re real code. So this is another tool for helping move and preserve what’s out there.
Oracle Process Cloud Service. I was very happy to see this because I think process orientation is where we’re all headed. The applications moving to the cloud in this migration, like many others, are built around capturing and manipulating data. But the future is about what you do with the information you glean from that data, information that tells you something useful about customers. Among other things, it will enable businesses to serve customers better by being more intimately involved in their moments of truth, which leads to bonding, the holy grail of modern business. This service is the tip of an important iceberg, and it provides justification for all of the other services.
Altogether, the Oracle Cloud Platform and Infrastructure services present a vivid picture of the state of modern business and computing. There’s a huge legacy base that has to keep working even as it is being moved. You could sniff that Oracle is enabling legacy systems to continue operating rather than replacing them en masse, but replacement is an impractical idea.
The announcements Ellison made show a customer-centric company focused on helping customers make generational transitions safely and economically. That might not be your first conception of Oracle. Yes, they will make money on this; I am a big fan of the motivation possible when money is involved. For many companies concerned about betting their business on new technology (been there, done that, got the scars and the T-shirt), this should be seen as a gradualist approach to the last generational transition of their working lives.
News out in the virtual world is that some analysts have trimmed their sails regarding Oracle’s financial picture. The company missed its revenue forecast last time and today the financial guys are concerned about competition in the database business and lack of strong market support for the company’s hardware.
Competition from companies like SAP, with its in-memory database solution, offers the potential to disrupt part of Oracle’s database business. Oracle has an in-memory strategy of its own, and there are other factors to consider. For instance, how do SAP customers feel about buying their database services from their application provider rather than from their database company?
The same kind of question can be asked about hardware from Oracle, traditionally a software company. Oracle’s purchase of Sun a couple of years ago changed that equation and you can argue that Oracle is more of a hardware company than SAP is a database company but I think this all misses the point.
Whether we’re talking about in-memory databases, very large computers, or data storage and analytics appliances, we are seeing fundamental disruption in multiple markets. What’s interesting about these disruptions is that the incumbent leaders in these markets are leading the creative destruction themselves.
In a more conventional market disruption we could expect a crop of small companies flying under the radar to create products and nip at the heels of the big guys for a few years until — Oh my gosh the sky is falling! — the moment when the little guys tripped up the big guys. That’s not happening here. The big guys, especially Oracle, are doing their own disrupting.
Perhaps it’s because hardware is no longer a place where Steve Wozniak can put a few chips together on a breadboard and invent the new, new thing. It takes big bucks and lots of R&D to build what Marc Benioff derisively calls a new mainframe, or the data storage and analytics appliances that Oracle has brought to market in the last two years.
Oracle has done its job: it has brought out some impressive next-generation technologies and seeded its biggest customers, the early adopters, with the gear. It has also received good reviews, albeit with some first-generation glitches.
Wall Street is doing its job too, though you might want to question the rationale. The Street has a ninety-day time horizon; we know this. But disruptions have their own internal clocks and they don’t kowtow to the analysts. So we have a situation in which Oracle is having some lackluster results regarding market uptake of its newest and priciest products. Not to worry, I say.
The alternative to really big iron is widely distributed iron and it will be interesting to see how this plays out. A widely distributed scenario can use smaller and older technologies, but you lose economies of scale, even in a cloud computing situation.
So although Oracle is experiencing slower demand for its new products, I think the situation is temporary. The new gear strikes me as the disruptive innovation that the market needs and we are going through a normal process of market uptake. The only difference between this and a more conventional disruption is that as a big public company, Oracle is going through this in full view of all the critics.
Oracle is a big company and that point gets driven home when you start to go in-depth on their products. At a show like OpenWorld which is dedicated more or less to touching on every aspect of the business, you can quickly get out of your depth.
Since the opening keynote on Sunday the talk has been mainly around things that I know about but don’t cover. So I’ve learned about what’s new in the company’s computing hardware, operating systems, the subject of Big Data and the new SPARC T4 chip.
You have to expect that. Oracle is a database company first, so something like Exadata, a storage machine for very large databases, is very important, especially for large Oracle customers. Exadata is supposed to speed up data processing by orders of magnitude and let users compress their data, significantly reducing the number of disks needed and the electricity to drive and cool them. As I say, it’s important if you are a big company with significant IT issues, not the least of which is power consumption.
I can say the same about Exalogic, a compute server that offers massive parallelism, meaning multiple parts doing the same job within the box. Parallelism is important if you need to support hundreds of thousands or even millions of users on a website and need to ensure uptime all the time. And Oracle just introduced Exalytics, a machine that does for business intelligence what the others do for their respective IT fiefdoms.
So, it’s all definitely important but not exactly CRM, except for one small idea. All of the gear mentioned is necessary for taking the next step from cloud computing to a genuine form of utility computing. Think about your phone service or electric service, or older services like water or natural gas. Those things just about never go out. True, they can falter in a natural disaster, but other than that outages are rather rare. The providers generally achieve seven to nine “9s” of reliability (as many as 99.9999999 percent uptime).
Do the math: it’s like a hiccup once a year. Now compare that with the three or four 9s a cloud offering provides these days. Though still rare, outages are something cloud computing still grapples with, in part because many cloud vendors do not offer the kind of redundancy and massive parallelism that other utilities do. It isn’t perfection that keeps the electric grid up all the time; it’s massive parallelism. The same is true in almost any system we’ve come to depend on. In its Exa- hardware, Oracle is trying to provide that parallelism.
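To actually do the math: each additional 9 cuts allowed downtime by a factor of ten, and the textbook reason parallelism helps is that n independent components, each available a fraction a of the time, are all down at once with probability only (1 − a)^n. A quick sketch (my own arithmetic, assuming idealized independent failures, which real correlated outages violate):

```python
# Yearly downtime implied by "N nines" of availability, plus the
# standard redundancy formula assuming independent failures.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 seconds

def downtime_seconds(nines: int) -> float:
    """Yearly downtime for an availability of, e.g., 99.999% (nines=5)."""
    availability = 1 - 10 ** -nines
    return (1 - availability) * SECONDS_PER_YEAR

for n in (3, 4, 9):
    print(f"{n} nines -> {downtime_seconds(n):,.4f} seconds/year")
# Three nines allows roughly 8.8 hours a year; nine nines, ~0.03 seconds.

def parallel_availability(a: float, n: int) -> float:
    """Availability of n redundant, independent components, where the
    service is up as long as at least one component is up."""
    return 1 - (1 - a) ** n

# Under the independence assumption, three 99.9% components
# together already reach nine nines:
print(f"{parallel_availability(0.999, 3):.10f}")
```

The point the analogy makes is visible in the numbers: modest components, massively paralleled, yield utility-grade availability, at least on paper.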
My point is that cloud computing is growing up and not a moment too soon. The amount of data they’re talking about at this show will soon be measured in zettabytes (10²¹ bytes) if my friends at IDC are right. It is a number that makes even the national debt look insignificant, and it’s just on the horizon. What’s also on the horizon is whopping energy costs for data processing unless we find better ways to power and cool our machines, and the Exa- family is certainly a way towards that.
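A zettabyte is 10²¹ bytes, and the scale is easy to make tangible. Assuming, purely for illustration, a hypothetical 10-terabyte drive (my figure, not IDC’s):

```python
# How much hardware a zettabyte implies, given an assumed
# (hypothetical) drive capacity of 10 terabytes.
ZETTABYTE = 10 ** 21   # bytes
TERABYTE = 10 ** 12    # bytes
DRIVE_TB = 10          # assumed capacity per drive, in terabytes

drives_needed = ZETTABYTE // (DRIVE_TB * TERABYTE)
print(f"{drives_needed:,} drives per zettabyte")  # 100,000,000 drives
```

A hundred million drives per zettabyte, all spinning and all drawing power, is exactly why the energy question looms so large.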
Implicit in all this is the inexorable move toward cloud computing. This week I am seeing and hearing a lot about private, public, and hybrid clouds, and much of this will lead to more efficient processing that will make the zettabyte world realistic. Nonetheless, I must say that some of what I hear, not just from Oracle but from partners like EMC too, seems a bit self-serving. Specifically, I must respectfully disagree with anyone who uses the term “private cloud” for what is simply moving a data center from within a company’s walls to a vendor’s data center.
While the IT processing might be delivered through the Internet and the solution might shave off some computing costs, cloud computing is much more than moving the data center and continuing to run the same applications in the same old way. Hint: if your data center is moving to the cloud, shouldn’t some of your business processes too? Shouldn’t you be looking at reinventing your business processes as well as making them cheaper to run? Sometimes I worry that extracting cost out of IT this way will simply kick the business process question down the road and the processes will just ossify.
We’re embarking on a new computing era that will be powered by massively parallel machines, yet we seem to behave as if we can simply take our heretofore terrestrial applications to the cloud. That will work, but it won’t take advantage of all the opportunities the cloud offers.
Now’s the time to rethink the ways we do business and the applications we do business with. The result of that thinking should be a set of programs in every company designed to move into the new era of business, one that gets closer to the customer and, by the way, the remote employee or contractor. Granted, that’s a big job, but so far it’s the elephant in the room: no one is talking about it. New languages, tools, and approaches are being rolled out as if they’re simply nice to have. Not so; they’re necessary.
It would be better if there were more recognition of the imperatives of the era we’re embarking on, because it might provide a better frame for the discussions we’re having around all the shiny new objects.