In this GigaOM article, Janko Roettgers talks about the need for data portability in the cloud. “As more data is moving to the cloud, customers often have to make a tough choice: Do they want to make use of the most advanced offering, or do they want to rely on standardized solutions that offer them an easy way to move their data to another provider if necessary? ‘You have innovation, and then you have standards,’ CloudSigma Co-Founder and CTO Robert Jenkins remarked at GigaOM’s Structure:Data 2013 conference in New York Wednesday.”
This is really about the problem of data integration, which is not a new issue. As we migrate data into public clouds, there is an obvious need to sync that data with information back inside the enterprise, as well as with other public clouds. The suggestion of a public data exchange is interesting, but that idea has also been around for years within specific verticals, and it has never caught on.
Data integration is going to be a huge impediment to the success of cloud computing. Worse, it’s something that’s not yet on the radar screens of many of those beginning to leverage public clouds. There is no killer technology that solves this problem, and it’s not really about portability standards.
This is a problem that’s solved with advance planning, meaning that we understand the data integration requirements up front and select the right enabling technology to meet them. Chances are, there will be two or three dominant cloud data integration players, with different technologies for synchronizing data with on-premises applications and databases, as well as cloud to cloud.
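To make the synchronization requirement concrete, here is a minimal sketch of what keeping an on-premises store and a cloud store consistent involves. This is purely illustrative: the store layout (a dict of timestamped records) and the last-write-wins conflict rule are assumptions for the example, not any vendor's actual approach.

```python
# Minimal last-write-wins sync between two record stores.
# Each store maps a key to a (timestamp, value) tuple; on conflict,
# the record with the newer timestamp wins on both sides.

def sync(on_prem, cloud):
    """Merge two {key: (timestamp, value)} stores; newer record wins."""
    merged = {}
    for key in set(on_prem) | set(cloud):
        a = on_prem.get(key)
        b = cloud.get(key)
        # keep whichever side has the record, or the newer of the two
        merged[key] = max(filter(None, (a, b)), key=lambda rec: rec[0])
    on_prem.update(merged)
    cloud.update(merged)
    return merged

on_prem = {"cust:1": (100, "Alice"), "cust:2": (105, "Bob v2")}
cloud = {"cust:2": (103, "Bob v1"), "cust:3": (110, "Carol")}
merged = sync(on_prem, cloud)
```

Even this toy version exposes the planning questions the paragraph above alludes to: which side is authoritative, how conflicts are resolved, and how often the sync runs. Real cloud-to-cloud integration adds schema mapping, latency, and API differences on top of this.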