The tools you work with have a big impact on what you can accomplish, and in software especially, the more sophisticated the tools, the better. Beagle Research just completed a study of companies using a DevOps strategy with the Salesforce Lightning Platform; you can get a copy of it here. The work was sponsored by Copado, a DevOps solutions provider. DevOps is a strategy for building, changing, and deploying enterprise software that can be used with Scrum, Agile, and other methodologies. Rather than concentrating narrowly on code, DevOps is holistic, looking at culture and infrastructure in their broadest manifestations.
Even if you’ve never built systems, you can surmise that planning, developing, assessing, testing, and deploying software are all critical milestones, and they’re often spread across the technical departments of IT, such as development and operations; hence the name.
It can be challenging to compare enterprise software strategies. For instance, an on-premises hardware and software stack has been the norm for decades, but with the advent of cloud computing, users find they can stop worrying about a good deal of their development environments, leaving that to the cloud vendor. How do you compare overhead and costs between cloud and on-premises deployments? What are the effects on speed to market, reliability, and security? To control for some variables and enable an apples-to-apples comparison, we chose to research only companies developing and maintaining systems using Salesforce Lightning and a DevOps strategy.
Companies ranged in size from fewer than 100 employees to more than 10,000, with similar spreads in the number of Salesforce users, developers, and production orgs. Two-thirds (67 percent) said they run between 2 and 7 production orgs. Most of the respondents were C-level executives (48 percent) or upper management (40 percent).
We found that DevOps is delivering value for most of its users, though larger organizations face greater challenges (more on that in a moment). Seventeen percent claim over $5 million in benefits from using a DevOps strategy. These respondents understand that software flexibility drives business agility, and impressively, 54 percent say their lead time for making changes to their Salesforce orgs is between one day and one week. Compare that to a more traditional process that takes weeks or months.
But we also identified an elite group that operates even faster: 21 percent say their lead time for changes is less than a day, and 8 percent say it takes less than an hour. Taken together, 83 percent can make changes in a week or less.
In other recent research I’ve been involved in, delivering running, tested and deployable code was much slower. Clearly, if a business depends on its ability to quickly change to meet changing market demands this is where you want to be.
On the other hand
As you might expect though, the benefits of a DevOps strategy were not evenly distributed across all users. Generally, smaller businesses with smaller development groups did better overall at establishing DevOps programs and at excelling within them.
The most successful businesses using DevOps are those that use a well-integrated set of tools to move through development and deployment. Many organizations, especially smaller ones, use a combination of in-house and open-source management tools. The great variety of tool choices suggested to me that some best practices are still being worked out.
Even with Salesforce Lightning and a DevOps approach you can still have issues, and almost everyone had the experience of deploying a release to a production org and causing a service degradation. A plurality of respondents (43 percent) said a problem occurred up to 15 percent of the time, and the vast majority (86 percent) said service degradations happen less than half the time. This is an important snapshot of the state of the industry: speed of delivery slightly exceeds stability of releases, indicating a need to bring the two metrics into closer alignment.
Some best practices considerations
- A strong majority (60 percent) say each developer in the business has a private development environment.
- Also, 77 percent say they use version control to store code and click-based Salesforce customizations.
- Most synchronize their development environments with the latest changes from other teams with 41 percent doing this on-demand or at most once per day and 42 percent saying they do this between once per day and once a week.
- 75 percent say changes made in version control trigger automation tests.
- 87 percent have confidence that when automated tests pass, the software is ready for release. However, meta-analysis of the data strongly suggests that the greater a team’s confidence in its tests, the higher its change failure rate. Skeptics who were neutral on this question experienced a 40 percent lower change failure rate than those who expressed strong confidence in their tests.
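The two metrics this research leans on, lead time for changes and change failure rate, are straightforward to compute from a deployment log. Here is a minimal sketch in Python; the record format and the sample figures are hypothetical, purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical deployment records:
# (commit timestamp, production deploy timestamp, caused a service degradation?)
deployments = [
    (datetime(2021, 3, 1, 9), datetime(2021, 3, 1, 17), False),
    (datetime(2021, 3, 2, 9), datetime(2021, 3, 4, 12), False),
    (datetime(2021, 3, 5, 9), datetime(2021, 3, 5, 10), True),
    (datetime(2021, 3, 8, 9), datetime(2021, 3, 9, 9), False),
]

# Lead time for changes: elapsed time from commit to running in production.
lead_times = [deploy - commit for commit, deploy, _ in deployments]
avg_lead = sum(lead_times, timedelta()) / len(lead_times)

# Change failure rate: share of deployments that degraded service.
failure_rate = sum(failed for _, _, failed in deployments) / len(deployments)

print(f"Average lead time: {avg_lead}")
print(f"Change failure rate: {failure_rate:.0%}")
```

Tracking both numbers side by side is what surfaces the tension the survey found: a team can have excellent lead times while its failure rate quietly drifts upward.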
It’s good to be skeptical.
Part of the allure of the digital disruption is having the capacity to change a business process to take advantage of changing market conditions, and many businesses are already having that experience. Big data and analytics tell us what needs attention, but then we still need to change our systems’ behaviors. Flexible software contributes to a business’s agility, and that’s good. But that speed and flexibility need to be balanced by security and what I can only call the bulletproof-ness of the new or changed code.
The businesses most able to reap the rewards of DevOps tend to be smaller though large enterprises have their bragging points. While larger businesses already see benefits from a DevOps strategy, they are the ones with the greatest potential to do more. What’s holding them back?
In any organization, size breeds complexity, which causes business friction. We don’t have all the data to say so unequivocally, but it seems that bigger organizations have more walls to break down.
It looks to me like the development tools are pretty good. Not enough businesses have well-integrated management suites to handle the complexity, and it also seems that culture forms stovepipes, which cause less stellar performance. If that’s so, there’s still cultural work to be done enhancing communications within and between developer groups and the business. DevOps tools can be a big part of that help, but as with the psychiatrist trying to change a lightbulb, the bulb still has to want to change.
Drop Tank is a small company in footprint (it started with only 22 people) but with an outsized mission: to provide loyalty and discount programs to thousands of gasoline retailers. It has been decades since gas stations offered incentives to purchase their products. Last time, in the 1960s, retailers would routinely offer silverware, glassware, and china for fill-ups. Alternatively, some would offer Green Stamps, collectible coupons redeemable for merchandise. That was the state of the art for loyalty programs.
All of these arrangements were relatively easy to administer and there was no back-end data to massage. A customer made a purchase and got a reward on the spot. Those were the days! Today’s loyalty programs often capture metadata from the transaction that credits points to a customer’s account and the customer can determine how best to spend them.
Drop Tank started out in 2012 providing single-use, cents-per-gallon discount cards to customers. Today it offers the same fuel discounting scheme but also gift cards and combined offers for gasoline plus consumer packaged goods sold in convenience stores that also sell gas.
Less than five years ago (2016), Drop Tank began designing a back-office system to ride herd on customer gas purchases and the points they accumulated. Any customer could join by adding a phone number which could then be used at the pump to identify the purchaser. The system has been through several iterations in its short life, each time becoming simpler and more efficient and thus better able to support an expanding mission.
When Drop Tank started out, it produced a small black box device that connected to the pumps capturing customer data and purchases for later upload and processing. The device was needed because 65 percent of retailers are independent and have the ability to choose their POS (point of sale) system. This resulted in integration challenges best met by hardware. But less than ten years later, the black box is gone and the latest iteration leverages the Oracle Autonomous Database which enables several important improvements for Drop Tank and its customers.
Drop Tank’s principal partner is Marathon Oil, a vertically integrated petroleum company that leads in oil refining in the US and runs a large distribution network located primarily in the Eastern half of the US, though recent acquisitions have spread the footprint into the West.
Although the original black box solution was good for its time, installing those boxes at more than 3,500 retail locations was time consuming and expensive, requiring a visit to each retailer.
Tim Miller, vice president of technology for Drop Tank, was a co-founder of the company and has run IT since the founding, so each iteration of the system was his responsibility to develop, deploy, and maintain: all the incentive he’d need to support continuous improvement.
Over time, Miller has been able to replace the black boxes with server-side software for all of the POS systems popular with his customers, the retailers. Importantly, the evolution of technology in the dealer network mirrored and supported the evolution of increasingly sophisticated services provided by Drop Tank to retailers and, recently, to CPG companies. Consider these milestones:
Stage 1, 2012
Initially, Drop Tank functioned as a promotional products company with cards and codes that dropped the price at the pump. Customers didn’t need to enroll in a program to enjoy savings, but Drop Tank had limited ability to leverage data to engage with them.
Stage 2, 2016
The original cents-off system was built on Rackspace and was adequate for the need. But by 2016, Miller and a small team had designed an improvement that would take the black boxes out of the equation, letting the retailer simply connect a POS system to Drop Tank’s headquarters using Oracle Cloud Infrastructure. It made sense because it reduced the time and expense associated with deployment from four hours per site to 30 minutes; all of the integration and conversion could now be handled by software. It also appealed to retailers who wanted fewer devices behind the counter connecting to other systems and printers. So, simplification was good for all parties.
The dealer network reacted, and Drop Tank’s new retailer signups doubled.
Stage 3, 2018
With its system built, in place, and connected directly to the POS systems, Miller and his team discovered they could gather additional point-of-sale data that other systems either were not capturing or, because there were so many disparate systems in use, no one could easily aggregate. This stood in stark contrast to the way other retailers capture data and provide it to CPG companies. For example, supermarkets can routinely report to CPG vendors not only what sold but what other items made up the transaction.
Enabling business analysis with Oracle Autonomous Data Warehouse
Drop Tank discovered that it could collect data that CPG vendors needed and provide more information to a growing set of them. But the company needed a data warehouse that could handle the load while enabling it to retain its small size, still a 22-person business. So, finding the right amount of automation was critical. That’s when they turned to Oracle.
While Drop Tank captures a great deal of data in its data warehouse, there are always times when CPG companies want specialized information that isn’t easily available the way the system was originally set up.
It’s a common problem. For years, data warehouse users have tried to build a perfect warehouse containing every kind of data and supporting every possible analysis, but that isn’t realistic. So, Miller took the company in a different direction. Today he says, “Don’t try to build a data warehouse that’s perfect; instead, leverage the power of the Autonomous Database,” which he does.
When a CPG customer makes a special request, Miller says that, thanks to the Oracle Autonomous Data Warehouse, Drop Tank can simply spin up a new database automatically in about an hour, without the traditional overhead of building, tuning, and maintaining it. “Spinning up a traditional database and tuning it can take days or weeks. Within an hour I can have the database running, within 4 hours we can load data, and then within another hour we can have answers. That’s a difference-maker.”
Oracle technology has enabled Drop Tank to grow in several ways. Oracle PaaS and IaaS have enabled the company to run its IT in the cloud using a service-oriented architecture (SOA) that helps the company reduce overhead and save headcount. Oracle Autonomous Database enabled the company to branch into providing information services to CPG companies from its starting point as a loyalty program provider. And, Oracle automation has enabled this small company to remain small in headcount but to continue to grow its services while significantly increasing productivity.
CRM guru, Esteban Kolsky, and I did some primary research paid for by Zoho earlier this year. We wanted to better understand what buyers of CRM systems today were most interested in and to discover their highest priorities. Our survey population comprised more than 200 highly qualified executives and managers (47 percent C-level) in companies with at least 500 employees and ranging to several thousand. All respondents indicated a need to make a CRM purchase decision in the months (not years) ahead. So, we felt our data represented a good measure of current need.
Our findings were what you might expect from such a group though some data points puzzled us. For example, few executives seemed to understand the need for platform technology to support their ambitions. Those aims included taking on the digital disruption and leading their organizations to be more agile, things that platform-based CRM is ideally positioned for. So that seemed like a big disconnect.
CRM all in
After the better part of two decades in which CRM was seen in some circles as a technology suite to keep an eye on but not necessarily purchase, our data clearly showed that most people surveyed see a CRM solution as a necessary addition to business strategy. Most said they wanted greater technology flexibility (80 percent), increased ability to take on new opportunities (63 percent), and better information sharing among the groups in the front office (60 percent).
Since all of the members of our study had purchased CRM before, the great yearning for technology flexibility speaks to some of the limitations of earlier CRM systems. It also suggests pent-up demand that could result in a new adoption wave.
These findings are also in line with other research showing that a majority of CRM purchasers today seek opportunities for greater differentiation in their markets. You can’t blame them. With CRM well distributed in many markets, the dividend from installing first- or even second-generation CRM systems has evaporated. Today, users with systems that only capture and store customer data (systems of record) are being out-competed by businesses that can perform some amount of data analysis and make relevant recommendations about what to do next, integrate other systems easily, and offer support for social media.
The latter systems are often referred to as systems of engagement. In other research I’ve documented an important chain of cause and effect: engagement drives loyalty, which drives profits. No wonder there’s so much interest in modernizing CRM. A note of caution, though: your notion of engagement might not be the thing that motivates customers to engage.
Best of breed?
Over time I’ve seen the number of disparate best of breed applications in organizations steadily climb from a low of several dozen when I started tracking to many hundreds now. Part of this finding simply reflects the success of cloud computing. As the number of cloud vendors has steadily increased so has the number of apps available. And the cloud technology and business models make it easy to add a new app.
But at some point, which I am confident we’ve passed, the sheer number of different apps capturing and trying to share data produces its own limitations. With hundreds of apps needing to integrate to a CRM suite it becomes more than a full-time job to keep all of the apps synched and the integrations in good repair, even with modern cloud technology.
There are two keys to success in this scenario: 1) limit the number of apps the organization will take in, or 2) invent better ways to integrate systems. Good luck with the first idea: departments are now fully capable of bringing cloud-based IT solutions into their workflows without seeking permission. Too often IT only discovers a new app when it is asked to fix something.
The second approach calls for platform technology from a CRM vendor. With a platform, a third party builds to the platform’s specifications and the user has a much easier time bringing apps onboard. So it was a great surprise to us that fewer than ten percent of the executives surveyed had an inkling of the centrality of platform technology to their search.
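A back-of-the-envelope calculation shows why the platform approach wins as app counts climb. If every app must integrate point-to-point with every other, the number of connections grows quadratically; if each app instead builds to one shared platform, it grows linearly. A minimal sketch (the function names are mine, for illustration):

```python
def point_to_point(n: int) -> int:
    """Connections needed to integrate n apps pairwise: n choose 2."""
    return n * (n - 1) // 2

def via_platform(n: int) -> int:
    """Connections needed when every app builds to one platform: one each."""
    return n

# Compare the integration burden as the app count grows.
for n in (10, 100, 500):
    print(f"{n} apps: {point_to_point(n)} pairwise vs {via_platform(n)} via platform")
```

At the hundreds of apps the text describes, the pairwise figure runs into the tens of thousands of integrations to build and keep in repair, which is exactly the "more than a full-time job" problem.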
When I started in this business, implementing a CRM system in an enterprise could easily take a year, given the complexity of deploying CRM for the very first time. The rule of thumb for a full CRM deployment was that the cost of the effort would easily be two or three times the cost of the software, thanks to the need for an army of SI specialists. Software costs have come down considerably thanks to competition and cloud computing, but the time involved has barely budged, though it has been refactored.
In our study, 63 percent expected to complete the selection process in 4 to 6 months, though a smaller cohort expected it to take upwards of a year. With selection complete, 43 percent think purchase to rollout should take 2 to 4 months, while an additional 33 percent expect the process to take 4 to 6 months. When everything is laid out and accounted for, the executives still think the process will last a year between purchase and first proof of ROI.
Perhaps this can be partly explained by the additional need for setting up AI rules and algorithms and training machine learning systems. Also, some explanation may rest in the need to customize by adding vertical market expertise.
My two bits
We might be in the early stage of a new CRM deployment wave. The situation in the industry and between vendors has changed a lot since cloud computing came to dominate and AI and machine learning made appearances. To a degree platform orientation within a CRM product set could significantly alleviate the need for substantial rip and replace efforts. With a good platform it’s far more likely that a vendor could update customers with new technology in-line with the maintenance process. Yet another reason to pay attention to platforms.
There’s a chicken and egg issue with digital disruption. Making decisions based on numbers instead of gut instinct is recognized to be a superior approach in many situations, but before you can get to decision-making, people have to be able to use things like AI and machine learning. Humans are not naturals when it comes to numbers; thinking back to high school algebra is all it takes to convince most of us.
Humans are really good at things like relationships and reading faces. So there should be a natural affinity between providing crunched numbers to customer-facing employees and their using them. But before you can expect employees to think with numbers more than they ever have, it’s got to be dead solid easy to crunch the numbers and deliver their meaning. For much of the AI universe so far, that crunching and delivery has focused on next-best algorithms: next best offer in sales, perhaps, or next best service solution in customer service. But there’s a lot more we can do.
Salesforce delivers Einstein analytics for a broader audience
Today, Salesforce announced four new products based on Einstein, its analytics engine, that are designed to spread analytics to more parts of an organization and to enable more types of employees to work with the tools. All the introductions support clicks or code, enabling admins and developers to access functionality according to their skill levels. Briefly, the introductions include:
- Einstein Translation, which enables admins and developers to set up automatic language translation. If a user enters data in a different language, the system instantly translates it. There was no statement, however, about how many languages are initially supported. The product is in pilot, so look for more information later.
- Einstein Optical Character Recognition (OCR). OCR has been around a long time because it works, and it is an important part of scraping usable data off documents. Initially, Salesforce sees this as a way to streamline data entry. Also in pilot.
- Einstein Prediction Builder enables admins and developers to build AI models for apps running on the Salesforce platform with a declarative setup tool. Generally available.
- Einstein Predictions Service enables admins to embed Einstein AI analytics into third party systems like ERP or HR. Also generally available.
In a move that seems like a commentary on the troubles social media companies are having, Salesforce also restated its commitment to its core values, especially trust. The company went out of its way to state that its AI products are transparent, responsible, and accountable. For instance, the system gives users justifications for predictions based on the factors that influence them. Protected fields warn of potential bias in datasets with pop-up alerts, and Model Metrics evaluate the accuracy and performance of AI models. If only things like this were available in social media.
My two bits
A few years ago, when sales analytics were the only analytics game in town, I remember some emerging vendors telling me it was hard to get customers to use their tools to develop their own unique analyses. Customers were happy, though, to use all of the reports that came with the tool out of the box, which led vendors to deliver a large number of them.
In my experience this rang true because sales veterans (and I am one) seem highly attached to their unique approaches. At the time, I thought that asking them to develop their own analyses was akin to asking a fish to invent fire. In the years since, I’ve discovered that salespeople are not unique in this. So making analytics as easy as possible to use is a prerequisite for getting on with a company’s digital disruption.
Clicks and code, the two approaches Salesforce emphasized in this announcement, are not out of the ordinary for the company. It wants to reach the broadest possible audience for its solutions, and that’s good. But the approach has extra importance at the intersection of digital disruption and analytics.