
Even though data is nominally located in the cloud, the physical location of the underlying server infrastructure needs to be known. This is because the information may become subject to the varying privacy and data management rules of different jurisdictions.
For example, jurisdiction might be critical if your chosen cloud provider suddenly goes out of business, your relationship with them sours, or you become subject to some form of e-discovery process, all scenarios in which you need to be able to retrieve your information, and fast.
“The terms and conditions (T&Cs) will have a governing law clause in them, such that if you use a foreign [cloud] provider your data will more than likely be governed by the laws in that place,” explained Abrahams.
“It means enterprises could find themselves subject to laws that they aren’t familiar with.”
According to Abrahams, there are also restrictions under Australian law on the export of some personal information to other countries.
Tim Smith, marketing manager A/NZ at Hitachi Data Systems, claims there are some ‘83 pieces of legislation around data retention and document destruction’ locally that could also come into play.
“Users need to be vigilant to ensure the way that personal information is being protected in clouds outside of Australia,” said Abrahams.
Abrahams urged prospective cloud customers to think through all potential ramifications of data accessibility before signing on to a service.
“What happens if the provider were to go bust? Will the data be in a format that makes it accessible and readable?” Abrahams asked.
“You’re putting a lot of trust in the provider, and while you can try to protect access to data contractually, short of backing it up in escrow or having a bespoke agreement, you will just have to trust the provider to make the data available to you again.”
He also warned customers to prepare for the possibility of a falling out with the cloud operator.
“If the relationship sours they have a massive negotiating position on you because they could effectively stymie access to your proprietary data,” said Abrahams.
Abrahams advised potential cloud operators to pay close attention to the drafting of their standard terms and conditions for use. He also urged potential customers to read the T&Cs carefully and, where possible, negotiate a bespoke agreement in place of accepting standard terms.
“For the most part, standard T&Cs are used so the customer doesn’t have a great deal of say in what goes into the contract,” explained Abrahams.
He added: “It’s important to note that none of these issues render the cloud business model ineffective. They just need to be considered as part of the risk analysis of having or using some form of hosted solution.”
Shades of grey
One of the other key side effects of today’s environment is the complete lack of standards and reference architectures available to prospective cloud operators.
It surprised even iTnews how closely some hardware vendors guard the reference architectures that could be used as the basis for a cloud service.
For example, HDS has them – but only under non-disclosure agreement. IBM’s Business Consulting Services division and HP Australia are similar – our attempts to gain access to that knowledge fell on deaf ears.
What could be perceived as a trend towards proprietary cloud knowledge might be the impetus needed to drive development of more open reference architectures and industry standard models for Cloud Computing moving forward.
“I think there needs to be some open standards,” said Sean Casey, enterprise business development manager for Intel A/NZ.
“Otherwise there is a danger of fragmented mini clouds forming and everyone doing their own thing.”
HP’s Angus Young concurred. “We’re relying on the industry to come up with a standard for Cloud Computing going forward,” he said.
There are a number of models that could vie for industry standard status. In July, Intel, HP and Yahoo! created an open source cloud computing testbed initiative for large-scale, global research projects.
Separately, Intel and Oracle are working together to ‘identify and drive standards to enable flexible deployment across private and public clouds’.
The latter initiative is perhaps a recognition that there may be a number of different types of ‘cloud’ in the not-too-distant future – not just external EC2-type offerings that exist today.
“We see corporates and large organisations having their own clouds within their own environments,” said Dell’s Justin Boyd.
Other hybrid models that skirt the line between public and private are also possible.
“You will see variations to the theme – it could be a private cloud, on-premise or external type service or a mix of all three,” said IDC’s Linus Lai. “There will be grey areas – it’s not going to be all black and white.”
Cloud Computing is a big topic. The flexibility it introduces could take virtualisation to the next level. Being aware of the potential side-effects will be the key to making it work for your business.
Above all, in its formative stages, remember – always read the warnings.
And if you can negotiate on the T&Cs, don’t simply accept cloud services on the vendor’s terms.
An extended version of this article appeared in the 24th November 2008 issue of CRN magazine.