Shipping giant Matson made news back in November with its major migration to Amazon Web Services (AWS). Matson’s announcement that it is going ‘all in’ with AWS is another example of a large enterprise realising the intrinsic value of the public cloud.
But merely getting set up on one of the large cloud providers is now table stakes – everyone has a pilot. True cloud success is another matter.
According to its CIO in the news coverage, the migration took Matson years to complete, so it’s not surprising that many companies are wary. In a 2016 RightScale survey, companies ranked “lack of resources/expertise” as their greatest cloud challenge, ahead even of security. While that signifies an evolution in attitudes toward cloud, it also foreshadows new challenges. One important topic Matson’s CIO raised is something all enterprises need to consider when embarking on their cloud journey – cloud provider lock-in.
The perils of cloud lock-in
Concerns about getting ‘locked in’ to a public cloud provider are not just about contracts, pricing or negotiating leverage. CIOs also worry about migration costs, unavoidable provider outages, redundancy and disaster recovery. They hear about innovations available from one cloud provider but not another, and about shifting performance metrics. Many buyers have unpleasant memories of lock-in from the mega-vendor and enterprise resource planning (ERP) heyday. In the age of the cloud, we want and expect choices.
That’s why many companies are thinking about how to architect a multi-cloud strategy across multiple public cloud providers, even if they may not be planning to implement it anytime soon. Risk-averse companies want to hedge their bets, innovators want flexibility and many still aren’t certain which cloud provider is best suited to their needs.
But companies struggle with exactly how to adopt services in the public cloud – especially for big data. Lock-in concerns are particularly acute for analytics and data management workloads, because the risk goes beyond vendor or infrastructure lock-in.
Built-in or lock-in?
Cloud providers increasingly offer their own components for data warehousing and big data, alongside marketplaces of third-party solutions and services. For example, AWS offers Amazon Redshift for data warehousing and Amazon EMR for big data and Hadoop workloads. Similarly, Microsoft Azure offers Azure SQL Data Warehouse and HDInsight. Compared to DIY, the built-in components can seem like a simple option. But they also represent a strategic step with long-term implications.
That’s because companies often need to make significant investments in custom tooling to integrate their existing data flows with these built-in services. That makes it hard to move workloads to another cloud provider later without duplicating the investment – a big deal for big data, with ripple effects across costs, related processes, data sources and tools.
If an organisation relies on a cloud provider’s built-in services, it is making a long-term platform bet with high switching costs. And when we’re talking about enterprise data – potentially huge volumes – that’s a much bigger deal than, say, switching smartphone platforms, and even that seems a little painful, doesn’t it?
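One common way teams try to contain those switching costs is to keep pipeline code behind a thin, provider-neutral interface rather than calling one provider’s SDK directly. The sketch below is purely illustrative – all names are hypothetical, and a real deployment would add backends wrapping each provider’s SDK:

```python
# Hypothetical sketch of a provider-neutral storage interface.
# Pipeline code depends only on ObjectStore, not on any one cloud SDK.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal storage contract a pipeline needs."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend for local development and tests."""
    def __init__(self):
        self._blobs = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

# An S3-backed or Azure-backed store would implement the same two
# methods by wrapping the respective SDK; the pipeline code below
# would not change.
def archive_daily_report(store: ObjectStore, day: str, report: bytes) -> None:
    """Pipeline step written against the interface, not a provider."""
    store.put(f"reports/{day}.csv", report)

store = InMemoryStore()
archive_daily_report(store, "2017-01-31", b"shipments,42\n")
print(store.get("reports/2017-01-31.csv").decode())
```

The point isn’t the two methods themselves but the boundary: switching providers means writing one new backend, not rewriting every pipeline.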
Multi-cloud services emerge
Fortunately, thanks to the rise of multi-cloud managed services, enterprises can now get the most out of the cloud’s capabilities while minimising lock-in. Services are emerging for a variety of cloud functions, including big data. Using services that work across cloud providers is a prime example of a multi-sourcing or dual-sourcing strategy, and it also supports high-availability goals by making it easier to move workloads to a different provider or region as needed.
Migrating to the cloud unlocks incredible agility and productivity gains, and multi-cloud managed service providers can make things easier. Don’t get locked out over fears of lock-in.
Matson, AWS, and overcoming the cloud lock-in challenge for big data