How simpler language might increase the use of encryption and secure email

By David Sirota, written on June 17, 2014

From The News Desk

Amid allegations that telecom and tech companies helped the National Security Agency sweep up user data, those companies could lose serious money. Indeed, studies show that they could lose somewhere between $35 billion and $180 billion as privacy- and security-seeking customers move their business to companies and countries perceived to be better protected from the NSA’s panopticon.

One of the companies that has a lot to lose is Google, both because of its general prominence and because of the specific NSA-related headlines about the company. For example, Google featured prominently in the original Snowden revelations about the NSA’s PRISM program, which, according to the Guardian, “allows officials to collect material including search history, the content of emails, file transfers and live chats” of users. More recently, documents obtained by Al Jazeera America show that Google may have had a far more cozy relationship with the agency than it wants to admit.

In a response aimed at reassuring users about its security and privacy standards, Google recently announced that it is adding better encryption tools to its email service. As the Wall Street Journal noted, the company also “called out other email providers – including Comcast and France’s Orange – for not using encryption,” which means the announcement could have a ripple effect, setting industry-wide norms.

At a consumer level, however, there remains a big obstacle to the goal of increasing encryption: usability and comprehension.

On the usability front, the ACLU’s Christopher Soghoian notes: “I don't think you can call Google's end to end encryption tool easy to use yet. It is a work in progress.” In a blog post about a new Chrome extension for encryption, Google seems to recognize this, lamenting the fact that “while end-to-end encryption tools like PGP and GnuPG have been around for a long time, they require a great deal of technical know-how and manual effort to use.” Thus, the Christian Science Monitor reports that as Google tackles the usability question, “only the source code (for the extension) has been released, with the intention that savvy users will test the Chrome extension and then report any bugs that are found” so that the product will be as simple to use as possible upon the wider release.

Making encryption far more seamless and easy to use than it is today will likely be the critical X-factor in whether it becomes a consumer standard. After all, as a new Princeton University study points out, excellent encryption has been around for a long time in the form of Pretty Good Privacy (PGP). Yet the paper also notes that “for all of its cryptographically guaranteed security” PGP has been “nearly impenetrable for those without technical backgrounds.”

Part of that has to do with complicated user interfaces, which Google’s Chrome extension clearly aims to address. But the Princeton researchers say part of it also stems from the very language used to describe the encryption process. This vernacular is populated with potentially confusing terms like “public key” and “private key,” which the researchers believe confuse the average user:

In PGP’s metaphors, each user possesses two items, a private key and a public key. Have you inferred how the protocol works yet? Unless you have previous exposure to cryptography, likely not. Why do I have two keys? What do these keys open? Aren’t all keys private?...

Did the message come from me or someone pretending to be me? To prove I am who I claim to be, I sign all messages I send with my private key. Wait a minute – how do you possibly sign something with a key? Finally, you can make sure my signature is valid by checking it with my public key. That’s right, you verify my signature using the same object that you use to encrypt messages you send to me. Mathematically, this makes perfect sense. Metaphor-wise, it’s a nightmare...

The researchers then tested out a more straightforward set of metaphors to see if that might improve the average user's understanding of encryption:
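The two-key math the researchers are describing can be sketched with a toy, textbook-sized RSA example. To be clear, this is an illustration only, not how anyone should do secure email: the primes here are absurdly small and real systems use vetted tools like GnuPG. But it shows why the same pair of objects handles both encryption and signing.

```python
# Toy RSA with tiny textbook primes, to illustrate PGP's two-key model.
# INSECURE by design; real encryption uses large keys via GnuPG or
# an audited crypto library.

p, q = 61, 53            # small primes (never this small in practice)
n = p * q                # modulus, shared by both keys
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent: modular inverse of e

public_key = (e, n)      # published for everyone
private_key = (d, n)     # kept secret

def transform(message, key):
    """Apply a key to a number: the same operation encrypts,
    decrypts, signs, and verifies, depending on which key is used."""
    exponent, modulus = key
    return pow(message, exponent, modulus)

# Encrypt TO me with my PUBLIC key; only my PRIVATE key reverses it.
ciphertext = transform(42, public_key)
assert transform(ciphertext, private_key) == 42

# I sign FROM me with my PRIVATE key; anyone verifies with my PUBLIC key.
signature = transform(42, private_key)
assert transform(signature, public_key) == 42
```

The code makes the study’s complaint concrete: mathematically, “signing with a key” and “verifying with the same object you encrypt with” are just the same modular exponentiation run with the roles swapped, which is elegant as arithmetic and baffling as a metaphor.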

We decided to test whether better metaphors might be able to close this gap between security and usability. Specifically, we wanted metaphors that represented the cryptographic actions a user performs to send secure email and were evocative enough that users could reason about the security properties of PGP without needing to read a lengthy, technical introduction. We settled on four objects: a key, lock, seal and imprint. To send someone a message, secure it with that person’s lock. Only this recipient has the corresponding key, so only they can open it. To prove your identity, stamp the message with your seal. Since everyone knows what your seal’s imprint looks like, it’s easy to verify that the message came from you...
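The study’s four objects map one-to-one onto the two halves of a key pair. A minimal sketch of that mapping, reusing the same toy RSA numbers (the names `my_lock`, `my_seal`, etc. are illustrative, taken from the study’s vocabulary, not from any real library):

```python
# The Princeton metaphors layered over toy RSA:
#   lock    = public encryption key   (anyone can secure mail to me)
#   key     = private decryption key  (only I can open it)
#   seal    = private signing key     (only I can stamp)
#   imprint = public verification key (anyone can recognize my stamp)
# Same insecure textbook numbers as before; illustration only.

p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)

my_lock, my_key = (e, n), (d, n)
my_seal, my_imprint = (d, n), (e, n)

def use(message, obj):
    exponent, modulus = obj
    return pow(message, exponent, modulus)

secured = use(99, my_lock)              # "secure it with that person's lock"
assert use(secured, my_key) == 99       # only the matching key opens it

stamped = use(99, my_seal)              # "stamp the message with your seal"
assert use(stamped, my_imprint) == 99   # everyone can check the imprint
```

Nothing about the mathematics changes; only the names do. Each object now suggests its own usage rule, which is exactly the property the researchers wanted to test.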

We put these ideas to the test by developing a quiz that measured a subject’s ability to understand and reason about secure email. We gave different groups of users various forms of documentation, stretching from a technical introduction of traditional PGP metaphors to a narrative that did little more than show our new objects in use. Our results indicated that the new metaphors themselves were no more effective than public and private keys, but that far less documentation was necessary to achieve an equivalent level of understanding.

These results suggest that at the consumer level, language itself presents at least some barrier to wider use of encryption. Many non-technical users who might be interested in encryption are likely deterred the first time they hit the technical jargon and arcane metaphors explaining how to use it on their own computers.

The good news is that out of all the challenges involved in better securing Internet communications, this one is probably the easiest to overcome. It just requires a bit more effort to translate tech-speak into plain language.

[Image by jeuxsansfrontieres (Creative Commons)]