
Big Data analytics in the cloud: The Enterprise wants it now

The journey to cloud computing, and to its adoption in the Enterprise, has been a rather manic-depressive one. When the first public cloud platforms became available, many thought the model would immediately gain favor in the Enterprise world. But it didn’t. The Platform as a Service (PaaS) model seemed especially enticing, because it freed developers from having to worry about any environmental factors other than their own applications. But the code changes and paradigm shifts that PaaS models required were a deal-killer in a great many cases.

Cloud computing has, conversely, worked very well for startups. Companies like Netflix and Airbnb run their mission-critical infrastructure entirely in the cloud, for example. The cloud’s low barrier to entry, and its ability to scale up and down with little advance planning or notice, is perfect for new companies that want their costs to grow only as their revenue does.

Survey says
But now that we have finally become accustomed to startups being cloud-happy and Enterprises being cloud-averse, along comes some data to counter that. The data I speak of comes from the results of a survey on Big Data and cloud computing, conducted by Gigaom Research and underwritten by Cazena.

On Monday, we published a report presenting and highlighting the survey results. Those results were surprising and yielded some real pearls of cloud wisdom. And on reflection, they illuminate a rather straightforward to-do list for cloud providers looking to bring cloud computing into the Enterprise mainstream.

Cloud demand reaching a threshold
Number one finding? Our survey respondents aren’t so cloud-averse after all. A majority – 53% – are either using the cloud for Big Data analytics now (28%) or planning to do so (25%).

Perhaps even more impressive is the fact that 55% of those survey-takers who self-identified as “hesitating or not planning to implement any analytic processes to the cloud” told us that they would reassess their strategy if they had a “better understanding of the security posture of the cloud.”

In other words, (1) a majority of Enterprise customers are convinced of the cloud’s efficacy and (2) a large group who are not convinced are nonetheless very willing to reconsider. But they need to understand cloud providers’ policies toward security. And even though 39% of respondents say that “Privacy regulation prevents our data from being stored off-site,” it turns out that an equal percentage say cloud-enabling regulatory changes are “imminent” or “coming in the next 1-2 years.”

Data volumes: not just peanuts
Another interesting outcome? It turns out that most cloud Big Data candidates want to move non-trivial amounts of data into the cloud. Specifically, 92% of survey participants want to move more than a terabyte. In fact, 20% want to move more than 100 terabytes.

This tells us that Enterprises that are, or soon will be, ready to work with data in the cloud intend to do so with significant data volumes that eclipse what proof-of-concept projects require. Enterprise cloud data customers, present and future, want to dive in, not just put a toe in the water. The time for experimentation is over.

Path of least resistance
Some of the survey results weren’t surprising at all – in fact, they provided useful corroboration for a key assumption: Enterprises want any changes to have low impact on their production environments. For example, Enterprises want to keep using their existing BI and analytics tools to query and analyze their cloud-based data. Specifically, 54.5% want to use existing BI tools like those from SAP, IBM, Tableau and Qlik, and 48.8% want to use existing analytical tools like those from SAS.
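As a rough illustration of what that compatibility implies in practice (a hypothetical sketch – the DSN names, credentials, table and the use of pyodbc below are placeholders, not any particular vendor’s setup): a tool that already speaks standard SQL over an ODBC/JDBC connection should be able to point at a cloud data warehouse by swapping nothing more than the connection details.

```python
# Hypothetical sketch: the same SQL that an existing BI or analytics tool
# issues today can run unchanged against a cloud data warehouse, provided
# the warehouse exposes a standard ODBC endpoint. Only the connection
# string changes; the query and the downstream tooling do not.
import pyodbc

ON_PREM_DSN = "DSN=warehouse_onprem;UID=analyst;PWD=***"  # placeholder
CLOUD_DSN = "DSN=warehouse_cloud;UID=analyst;PWD=***"     # placeholder

QUERY = """
    SELECT region, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY region
    ORDER BY total_revenue DESC
"""

def run_report(dsn: str):
    # The only thing that differs between on-premises and cloud is the DSN.
    conn = pyodbc.connect(dsn)
    try:
        return conn.cursor().execute(QUERY).fetchall()
    finally:
        conn.close()

rows = run_report(CLOUD_DSN)  # swap in ON_PREM_DSN and nothing else changes
```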

Enterprise customers also want to avoid complexity – 24% of respondents have concerns about whether their available bandwidth can accommodate pushing their data volumes up to the cloud; 23% worry that processes and tools will have to change as cloud Big Data gains traction; and 21% worry that they don’t have the expertise to carry out the necessary data migration.
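To see why bandwidth tops that list of worries, a quick back-of-envelope estimate helps (a rough sketch that ignores protocol overhead, compression and parallel uploads, so treat the numbers as optimistic lower bounds):

```python
# Back-of-envelope transfer-time estimate: how long does it take to push a
# given data volume to the cloud over a given uplink? Ignores protocol
# overhead, compression and parallelism -- a rough lower bound only.

def transfer_time_hours(volume_tb: float, uplink_mbps: float) -> float:
    bits = volume_tb * 1e12 * 8           # decimal terabytes -> bits
    seconds = bits / (uplink_mbps * 1e6)  # megabits/s -> bits per second
    return seconds / 3600

for volume_tb in (1, 10, 100):            # survey-relevant volumes
    for uplink_mbps in (100, 1000):       # 100 Mbps and 1 Gbps uplinks
        hours = transfer_time_hours(volume_tb, uplink_mbps)
        print(f"{volume_tb:>4} TB over {uplink_mbps:>4} Mbps: {hours:8.1f} hours")
```

Even at these idealized rates, 1 TB takes roughly 22 hours over a 100 Mbps link, and 100 TB takes more than nine days of continuous transfer over 1 Gbps – which puts that 24% figure in context.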

Marching orders
What do cloud providers need to do in order to drive their platforms into the mainstream? It comes down to education around security protocols and compliance standards; integration with existing tool sets and skill sets; and the reduction of complexity in provisioning, tooling and data movement, especially for the 1TB+ data sets that most customers, it seems, are champing at the bit to push up.

In other words, cloud providers need to keep it simple and make it easy. File access over cloud storage needs to be easy. Data ingest shouldn’t be a source of trepidation. Existing tools should just work. And if there are rough edges that prevent such seamless connectivity, cloud providers should take on the burden of smoothing them out. Complexity has costs: training, lost productivity and missed opportunities. Cloud providers should shoulder those complexities themselves so that customers can be happy and productive. That’s what will drive adoption when all is said and done.