As Cloud Computing Goes International, Whose Laws Matter?

Executive Summary

Cloud computing solutions of various flavors continue to grow in popularity, as individuals, small startups and global corporations turn to the cloud to store data, distribute computing tasks, or deliver applications ranging from email and calendaring to customer relationship management and gene sequencing. While cloud advocates tend to present ‘the cloud’ as global, seamless and ubiquitous, the true picture is more complicated, shaped by laws and notions of territoriality developed long before the birth of today’s global network. What issues do today’s legislative realities raise, and what are cloud providers — and their customers — doing to adapt?

A global cloud can’t please everyone

Software as a Service (SaaS) offerings such as Google Apps, Microsoft Live!, Zoho and others certainly appear a compelling proposition for organizations seeking to cut costs while offering core services like email and calendaring in a more flexible manner. Hosted out in the cloud, maintained day and night by someone else, enhanced frequently and without your intervention, often free, and available on almost any device with a web browser and a network connection — what’s not to like?

Plenty, if you’re the University of Oxford. The globally distributed nature of these services proved an insurmountable obstacle for the university, as described in a 2008 briefing document commissioned by the UK’s Joint Information Systems Committee (JISC) and Universities & Colleges Information Systems Association (UCISA):

“The issue of privacy and confidentiality became the biggest factor why Oxford did not outsource the provision of email. Without the guarantee that the hosting of data (specifically email) would be held in the UK, or at least the European Economic Area, it would be challenging if not politically impossible for the service to be ‘sold’ to the University.”

While it’s true that a growing number of universities (and other organizations) have undertaken evaluations very similar to Oxford’s and reached quite different conclusions, the issues flagged at Oxford remain real. Europeans in particular tend to contrast their fierce protection of personal data with the more laissez-faire attitudes they perceive on the other side of the Atlantic, where the majority of SaaS providers site their data centers. Given the maturity of the U.S. and European markets, these tend to be the focus of debates around portability of data, but very similar issues arise in other geographies as well.

Subdividing the cloud
Lower down the cloud computing stack, providers of computing and storage infrastructure, from Amazon’s Web Services to Microsoft’s Windows Azure, offer a simple but effective way to overcome legislative restrictions on the movement of personal data: they allow customers to allocate compute cycles and data to specific geographic regions. Amazon already has a European zone and two in the United States, and Microsoft’s Matt Deacon confirms that Windows Azure will emerge from Community Technology Preview (beta) with three discrete zones serviced by data centers in Dublin and Amsterdam, Chicago and San Antonio, and Hong Kong and Singapore.
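In practice, this kind of region pinning surfaces to developers as simple configuration or policy code. The sketch below is illustrative only: the policy table mapping a customer's jurisdiction to permitted regions is a hypothetical assumption (it is not legal advice), though the region identifiers mirror Amazon's real naming such as `us-east-1` and `eu-west-1`.

```python
# Hypothetical data-residency helper: map a customer's jurisdiction to the
# cloud regions where their personal data may be stored. The policy table
# is an illustrative assumption, not a statement of any provider's rules.

RESIDENCY_POLICY = {
    "EEA": ["eu-west-1"],             # EEA data stays in the European zone
    "US": ["us-east-1", "us-west-1"],
    "APAC": ["ap-southeast-1"],
}

def allowed_regions(jurisdiction: str) -> list:
    """Return the regions permitted for a customer's jurisdiction."""
    try:
        return RESIDENCY_POLICY[jurisdiction]
    except KeyError:
        raise ValueError("no residency policy for %r" % jurisdiction)

def pick_region(jurisdiction: str, preferred: str) -> str:
    """Use the preferred region if policy allows it; otherwise fall back
    to the first region the policy does permit."""
    regions = allowed_regions(jurisdiction)
    return preferred if preferred in regions else regions[0]
```

A European customer asking for the (cheaper) U.S. zone would be quietly redirected: `pick_region("EEA", "us-east-1")` returns `"eu-west-1"`.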

For many applications, of course, the complex inter-relationships between different sets of privacy legislation are not necessarily significant. Rackspace’s Simon Abrahams notes that around 25 percent of customers for the Rackspace Cloud originate in his EMEA (Europe, the Middle East, Africa) region, although the servers powering the Rackspace Cloud today are all in North America. The sizable lead Guy Rosen sees Amazon’s us-east-1 region maintaining over its European equivalent must surely be due in part to European customers placing more emphasis on saving a cent or more per instance per hour than on the more ephemeral issues of ‘data sovereignty.’

Bringing traditional thinking to the cloud
European data protection laws present particular issues for those wishing to transfer personal information to servers located outside the continent, but mechanisms such as the existing ‘Safe Harbor’ provisions enable many service providers to comply cost-effectively. These laws, and the ever-present specter of powers granted under the United States’ PATRIOT Act, mask a wider and far more significant set of issues regarding the manner in which enterprises around the world treat their data. Simon Mackie addresses specific issues around data ownership in a recent GigaOM Pro article, and security also features in a piece (subscription required) by David Talbot for the January/February 2010 issue of MIT’s Technology Review.

Michelle Dennedy, Chief Governance Officer for Cloud Computing at Sun Microsystems, is quick to suggest that “something old-fashioned needs to happen before an Enterprise can take these risks” and entrust data to the Cloud. Harriet Pearson, Chief Privacy Officer at IBM, agrees. Both stress the need to accurately classify and audit data held in systems inside the organization, long before any decision to push data into the cloud. “The firewall around the enterprise no longer works,” argues Dennedy. Rather than rely upon robust defenses at the edge of corporate IT infrastructure, the benefits promised by cloud computing require a far more granular approach to protecting core assets. Data audits, robust identity management solutions, serious asset management procedures and more are both necessary and feasible before any move off-premise.
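The classification step Dennedy and Pearson describe can be illustrated with a toy audit that partitions a record into fields cleared for the cloud and fields that must stay on-premise. The field names, sensitivity classes and policy here are all hypothetical assumptions for illustration; a real audit would follow the organization's own classification scheme.

```python
# Toy data audit: tag each field with a sensitivity class before deciding
# what may leave the premises. Labels and policy are illustrative only.

SENSITIVITY = {
    "email_address": "personal",
    "patient_notes": "restricted",
    "invoice_total": "internal",
    "press_release": "public",
}

CLOUD_OK = {"public", "internal"}  # classes this toy policy allows off-premise

def audit(record: dict) -> dict:
    """Split a record into cloud-eligible and on-premise-only fields.
    Unknown fields are treated as 'restricted' and held back by default."""
    cleared, held = {}, {}
    for field, value in record.items():
        cls = SENSITIVITY.get(field, "restricted")
        (cleared if cls in CLOUD_OK else held)[field] = value
    return {"cloud": cleared, "on_premise": held}
```

The default matters: anything the audit has never classified stays inside the firewall, which is the conservative posture both executives advocate.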

Matthew Yeager, Practice Leader for Data Storage and Protection at Europe’s Computacenter, echoes Rackspace’s Abrahams in suggesting that many of the data storage issues posed by cloud computing are actually ‘nothing new,’ having been faced by larger enterprises for many years. The real challenge, they assert, is that the affordable utility model of cloud computing exposes a whole new set of smaller companies to these issues. These are, on the whole, companies without robust internal procedures or the expertise of a Pearson or a Dennedy that would enable them to cope. Cloud computing doesn’t create these problems; it simply exposes the flaws in internal processes, and leads these newcomers to make their mistakes in public.

In the latest iteration of its Security Guidance for Critical Areas of Focus in Cloud Computing, released on Dec. 17, the Cloud Security Alliance makes a series of recommendations to both service providers and potential customers with respect to mitigating risk. There is a clear focus upon informed assessment of the risks and opportunities involved: “Just as a critical application might be too important to move to a public cloud provider, there might be little or no reason to apply extensive security controls to low-value data migrating to cloud-based storage.”

Trust me, I’m a cloud
Cloud computing offers clear efficiency gains and potential cost savings across a range of application areas, but those benefits must be offset against the inevitable risk of diminishing control over valuable applications and data. In certain domains, such as medicine and banking, stringent legal and policy frameworks, not unreasonably, constrain the ways in which data may be treated. In less regulated environments, common sense should still be used in evaluating whether a specific data set or process should run on public cloud infrastructure, on a carefully guarded machine isolated even from the rest of your corporate network, or on any of the myriad options in between.

Cloud computing need not be less safe than an on-premise data center, and in many cases it is likely to be more secure, more actively guarded, and more rigorously monitored than the machine room down the hall from your office. Nevertheless, Rackspace’s Abrahams notes: “Choose the right tool for the job; the public cloud will not always be the right answer.”

Trust and transparency emerge as common themes in reaching informed decisions as to what that ‘right tool’ might be. Microsoft’s Deacon points to the value of appropriate accreditation for data center infrastructure, and stresses the need for separate attestations (such as HIPAA compliance) specifically concerned with the manner in which data is handled. Deacon highlights Windows Azure’s ability to run any application at any tier (on-premise, in a container parked outside, off-premise at a hosting partner, or in Microsoft’s own data centers), and suggests that a new breed of applications will require explicit business logic to make best use of this flexibility.

Sun’s Dennedy identifies one significant difference between cloud computing and more traditional models of IT outsourcing, arguing that the former tends to be accessed in a lightweight and fluid fashion, while the latter tends to be the result of protracted and careful negotiation. As a result, she suggests, there is far more of a need for cloud computing providers to be transparent about their policies and capabilities. A recent paper from Sun explores this in greater depth, presenting a possible model for disclosure based upon the ISO 27000 series of standards, which is intended to reassure potential customers while maintaining the provider’s security and associated competitive advantage.

Where next?
IBM’s Pearson takes a long-term view of managing these concerns as more and more data flows over trans-national networks. Today, she stresses the need for data owners and hosting providers to understand and comply with current laws, policies and procedures. She exhorts companies to ‘choose your Cloud provider wisely,’ highlighting very similar issues to those Dennedy encapsulates in the Sun paper. Pearson goes on to suggest that legislators and service providers are engaged in a process to assess the relevance of today’s often confusing legislative framework, pointing favorably to discussions during the recent International Conference of Data Protection & Privacy Commissioners in Madrid, Spain.

More optimistic about opportunities for short-term change than Deacon or Abrahams, Pearson suggests there is growing recognition that while “privacy is local, data flows have to be global.” She sees possibilities to apply legislation local to the point at which data are created, possibly independent of the ways in which those data flow over global networks. Will it ever be feasible for data about French customers, collected by a French company according to French law, to be stored and processed in a server farm in Nevada? In principle, at least, it should not matter where the bits are stored or processed, and French laws should continue to apply to the data as it circles the globe, free from local interference.

The final step in Pearson’s suggested approach is technological, and she postulates that we may see growing enthusiasm for ‘privacy by design’: systems that are built in a manner that minimizes the need to transfer sensitive data in the first place. Homomorphic encryption, the use of identifying tokens in place of actual data disclosure, and other techniques are growing in promise, she suggests. Lori MacVittie of F5 Networks makes a very similar point, pondering the feasibility of codifying security policies in the infrastructure upon which applications run. She points to F5’s current geolocation capabilities as an early piece in a far richer infrastructure, capable of applying very different security procedures on the basis of a user’s location.
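The tokenization technique Pearson mentions can be sketched in a few lines: sensitive values stay in a local vault, and only opaque, random tokens cross borders. This is a minimal sketch under stated assumptions; the in-memory dictionary stands in for what would, in any real deployment, be a hardened and access-controlled store.

```python
import secrets

# Minimal tokenization sketch: sensitive values never leave the local
# 'vault'; remote systems see only random, meaningless tokens.

class TokenVault:
    def __init__(self):
        self._by_token = {}

    def tokenize(self, value: str) -> str:
        """Store the sensitive value locally; return an opaque token
        that can safely be sent to systems in other jurisdictions."""
        token = secrets.token_urlsafe(16)
        self._by_token[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Resolve a token back to the original value (local systems only)."""
        return self._by_token[token]

vault = TokenVault()
token = vault.tokenize("jean.dupont@example.fr")  # only the token goes abroad
assert token != "jean.dupont@example.fr"          # token reveals nothing
assert vault.detokenize(token) == "jean.dupont@example.fr"
```

The design choice is the point: the question of whose laws govern the remote copy largely evaporates when the remote copy carries no personal data at all.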

At one level, we will doubtless see increasing prominence for neutral countries that shield Europeans from the overbearing powers of the PATRIOT Act, while enabling American companies to offset at least some of the burdens imposed by operating inside European borders. Iceland, for one, has many environmental advantages when it comes to building and running data centers like the new one taking shape not far from Reykjavík. Might the country’s geo-political status as a member of the European Economic Area (rather than the more tightly legislated European Community) ultimately prove to be worth more to its neighbors on both sides of the Atlantic than abundant geothermal power and fresh-air cooling?

Conclusion
Many of those giving serious attention to transforming their IT infrastructure to benefit from some combination of public, private or hybrid cloud computing are taking real and pragmatic steps today in auditing, understanding and appropriately segmenting their data and processes. Some of that data will be perfectly suited to the public cloud — and some will not. There is a balance to be struck between innovation and a drive for efficiency on one hand and protective risk-aversion on the other; many of the associated judgments remain highly subjective.

If mainstream enterprises are to embrace cloud computing for mainstream processes, the cloud’s much-vaunted utility model may need to flex a little. To gain the trust of those enterprise customers, cloud providers will need to provide more than VeriSign-secured credit card forms and automated uptime web sites; demanding enterprise customers want to know, understand and trust the people and processes behind the services upon which they rely. For some cloud computing providers, the cost of embracing this more demanding breed of customer may be too high, and they will choose to continue servicing the clientele they already have. For those able to adapt, the currently untapped market is huge and, ultimately, extremely profitable.

Relevant Analyst
Paul Miller

Founder, The Cloud of Data
