
No matter how you feel about the NSA and the wide net it casts in surveilling people, one thing is certain: all this talk of spying on Americans and foreigners is having an effect on some businesses, namely American cloud companies. It has also created an opening for cloud companies based in other countries, which see a chance to grab customers from their American rivals. Officials in some countries – Germany, the Netherlands, France, Sweden and Switzerland, as well as the European Union – have even been using scare tactics to steer business to companies in their jurisdictions.

For instance, German federal data protection commissioners have threatened new bureaucratic hurdles for anyone using U.S. cloud providers. The nation’s Interior Minister, Hans-Peter Friedrich, announced that “whoever fears their communication is being intercepted in any way should use services that don’t go through American servers.” Justice Minister Jörg-Uwe Hahn called for an outright boycott of U.S. companies. Royal Dutch Shell Plc., Europe’s largest oil company and one of Microsoft’s biggest clients in the region, decided to store its data in Germany with T-Systems rather than with Microsoft, whose offering is U.S.-based.

The Dutch Security and Justice Minister, Ivo Opstelten, told Parliament that U.S. companies will be excluded from bidding for IT services by his government because of fears that the U.S. Patriot Act may allow data to be compromised. “It’s extremely important to have the governments of Europe take care of this issue,” said Jean-Francois Audenard, the cloud security advisor to France Telecom. “If all the data of enterprises were going to be under the control of the U.S., it’s not really good for the future of the European people.”

Sweden and Switzerland have jumped on the bandwagon. Johan Christenson, CEO of Sweden’s City Network, noted, “[t]here are a lot of customers that come to us because they want to store their data in Sweden (instead of the U.S.A.).” Mateo Meier, CEO of Artmotion, Switzerland’s largest hosting company, said that revenues have jumped 45 percent since the Snowden leaks.

Meanwhile, the European Commission will present tighter data-protection rules to shield individuals from data loss on the web while at the same time creating a “level playing field for European companies” by smoothing out differences across European countries, according to EU Justice Commissioner Viviane Reding.

In any case, if your operations and your customers could be confined to a single country, and that country happened to have world-class cloud providers, restricting yourself to local clouds would be sublime. But the cloud doesn’t work that way; it knows no borders and never will. One reason the cloud became so popular in the first place is that it ignores national and cultural boundaries. By choosing these localized boutique cloud providers, you lose access to the best providers and limit yourself to sub-standard ones simply because they are local and abide by your country’s laws. The result is cloud silos: best-of-breed providers give way to low-quality, high-cost providers chosen out of fear. Also keep in mind that there is no guarantee these local providers will guard your sensitive data with their lives either; they could be a bigger disaster waiting to happen.

So what does a business that wants a cloud-based solution do now?

You can either allow yourself to be scared off by these stories and steer away from U.S.-based cloud implementations, or take countermeasures that make your use of the cloud safer, not just for now but for the future.

How do you do that? As I explained in my earlier Pandodaily post, the easiest and quickest way is to not send sensitive data to the cloud at all. If you must, you need to take steps to protect it first.

Problem solved, right?

Wrong. This is actually a complicated process, but bear with me. The company you save may be your own.

First, let us explore the various ways to do this, then look at the possible options:

  1. Big bang approach. This is the legacy way of doing things. Once you identify the parts of your system and the workloads that will move to the cloud, you rip and replace: touch every component that needs modification and rewrite the system so that sensitive, unsecured data is encrypted, tokenized, or redacted. This may mean changing your entire architecture, and it will likely be the most expensive, most painful, and slowest approach. While it can be effective as a one-off solution, it becomes a problem if you have to repeat it every time you enable a new system or application to use cloud services.
  2. API/SDK approach. More efficient than the big bang, this approach retrofits applications, processes, systems, databases, and so on to call an API (or SDK) that converts sensitive data into one of the approved formats before it is sent to the cloud for storage or processing. The procedure is minimally invasive: you don’t have to change your entire architecture, but you still have to touch every component that needs to be compliant. This method is much faster to market, and it lets you adapt quickly when requirements change.
  3. Gateway approach. Here you monitor the traffic between the enterprise and the cloud and de-sensitize the data in transit, using either global policies or policies specific to a location, device, system, or cloud. This method is usually the cheapest and the quickest to market, but its biggest advantage is that changes to your existing systems are minimal to nil. Essentially, you route sensitive data through the gateway, which de-sensitizes it before it leaves your perimeter.
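To make the gateway idea concrete, here is a minimal, illustrative Python sketch of the policy-driven de-sensitizing step such a gateway would perform on outbound records. The field names, the `POLICY` mapping, and the in-memory `TOKEN_VAULT` are all hypothetical; a real gateway would sit on the network path and keep originals in a hardened, access-controlled vault.

```python
import secrets

# Hypothetical in-memory vault; a real gateway would use a secure datastore.
TOKEN_VAULT = {}

def tokenize(value):
    """Replace a sensitive value with a random token and vault the original."""
    token = "tok_" + secrets.token_hex(8)
    TOKEN_VAULT[token] = value
    return token

def redact(value, keep=4):
    """Mask all but the last `keep` characters (irreversible)."""
    return "*" * max(len(value) - keep, 0) + value[-keep:]

# Policy: which action applies to which field before data leaves the perimeter.
POLICY = {"ssn": tokenize, "card_number": redact}

def desensitize(record, policy):
    """Apply the policy to a record in transit; untouched fields pass through."""
    return {k: policy[k](v) if k in policy else v for k, v in record.items()}

outbound = {"name": "Alice", "ssn": "078-05-1120",
            "card_number": "4111111111111111"}
safe = desensitize(outbound, POLICY)  # only `safe` ever leaves the perimeter
```

The design choice to keep the policy as data (rather than hard-coding the transformations) is what lets a gateway support the per-location or per-cloud policies described above without touching the applications themselves.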

Let us look at what I mean by not sending unprotected sensitive data to the cloud. You can do one of the following:

  1. Encrypt the data: This turns the original into unreadable ciphertext. While it is a good way to de-sensitize data, the strength of the protection depends on the key strength and the algorithm used. You also take on key management issues (key rotation, key expiration, re-keying old data, etc.).
  2. Format-preserving encryption: A variation of the above in which the output fits the original format of the data, so it won’t break backend systems.
  3. Tokenize your data: Create a random token that looks, feels, and acts like the original data; store the original in a secure vault inside your enterprise; and send the token to the cloud in place of the sensitive data.
  4. Redact the data: Keep portions of the data and mask the rest. This is the safest of all the formats, but you cannot recover the original data from it, so it is best used for archival purposes.
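As a rough illustration of why format preservation matters to backend systems, here is a sketch that generates a random replacement with the same shape as the original value: digits stay digits, letters stay letters, and separators are kept, so downstream format validation still passes. Note this is illustrative format-preserving tokenization, not real format-preserving encryption (which uses a keyed cipher such as NIST’s FF1); the function name is my own.

```python
import random
import string

def format_preserving_token(value):
    """Return a random stand-in with the same shape as `value`:
    digits map to digits, letters to letters, punctuation is preserved."""
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(random.choice(string.digits))
        elif ch.isalpha():
            out.append(random.choice(string.ascii_letters))
        else:
            out.append(ch)  # keep separators like '-' so formats still validate
    return "".join(out)

# A stand-in for a U.S. Social Security number keeps the NNN-NN-NNNN shape.
token = format_preserving_token("078-05-1120")
```

Because the stand-in matches the original format, a backend that validates "three digits, dash, two digits, dash, four digits" accepts it without modification, which is exactly the property that makes options 2 and 3 above attractive for retrofitting.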

Don’t be a naïve cloud user by shifting responsibility to your cloud provider and expecting it to protect your data. Get smart by putting solutions and controls in place that will let you maintain complete control.

[Image via Ebay]