Pando

Welcome to Airbnb. You can check out any time. But your data can never leave

By Kevin Kelleher, written on September 9, 2015

From The Sharing Economy Desk

Years ago, on a dark desert highway, I was driving with the cool wind in my hair.

I had to stop for the night and found a roadside hotel. The night man told me he had a vacancy, but I'd need to answer a few questions first.

“When did you live in Japan?” he asked. Startled, I asked how he knew and why that mattered. “It is for your own protection,” he answered. “Now, what is the name of the elderly woman who lives across the street from you?”

Again, feeling this to be intrusive and very, very weird, I asked what was going on here. “Relax,” said the night man. “These questions are aimed at upholding trust and facilitating connection. Now, what city does your uncle own a house in?” When I pointed out that my uncle had passed away, the night man shrugged. “Let's just pretend that never happened so we can check you in, shall we?” I quickly left and continued down the highway, warm smell of colitas rising up through the air.

Ha, just kidding! That story is totally made up. It's far too surreal and ghoulish to have ever actually happened.

But here is something that did happen this weekend. I found myself looking for a last-minute accommodation in Santa Cruz. I'd never been there, and since the weekend plans we'd made suddenly fell through, I thought this was the perfect chance to visit. The problem, as I discovered, was that it's all but impossible to find last-minute accommodations in Santa Cruz on Labor Day weekend for less than $400 a night.

HomeAway, VRBO, Kayak, Hotels.com – all came up with nothing. I remembered the creation myth of Airbnb, how it was designed to help people find affordable, last-minute places to crash. I'd used Airbnb before, but not often because I usually opted for alternatives that didn't charge me a service fee.

To improve my chances of finding a place, I took up the company's suggestion to “Verify Me.” I'd read the horror stories experienced by guests and hosts alike and supported the company's efforts to keep both safe. When I clicked on the rose-colored “Verify Me” button, it gave me two options. The first was to take a photo of my passport or driver's license and upload it to a company called Jumio.

Simple enough. I thought about the times I'd had to present a passport or a driver's license to a stranger. Typically in situations where regulators believe it's all too easy to commit some crime – customs checkpoints, liquor stores, pot dispensaries. In those cases, some law had been crafted to filter out the bad characters from the good. But no law demanded that I hand over a copy of my driver's license to Airbnb.

Then I thought of business transactions when this was necessary. And I recalled that this practice was traditionally reserved for pretty weighty matters. Like taking out a mortgage to buy a house. Since when did paying, say, $85 a night to share a bunk bed listed on Airbnb hold the gravity of taking out a six-figure mortgage to own a home for 20 or 30 years?

There was something else bugging me. I was aware of Jumio, but I'm also aware of the rash of break-ins at corporate and government servers to harvest personal data. I've had my own data purloined (hello Adobe, hello Gawker), so I try to minimize my exposure to future incursions. And no matter how hard the images on Jumio's site try to make it look completely normal to upload images of my personal ID to some server I'll never have the privilege to access, I just feel it isn't safe in the long run. It's not that Jumio's good intentions can't be trusted. It's more that no company's IT security can be trusted in 2015.

So, thanks for that green “SECURE” logo next to Jumio, but all the same, no thanks on option one.

That left me with the second option – the surreal, ghoulish option: I could “confirm a few personal details that only I would know.” I clicked on this, thinking that since I'd given Airbnb access to my Facebook account, it would ask me about something I'd posted there. I was wrong.

Airbnb asked me to confirm basic information I'd entered before – name, address – as well as the last four digits of my social security number. I swear, I don't remember giving this to Airbnb because I'm leery about sharing it at all, but I might have. I don’t know. And I really wanted that weekend getaway, so.

Then I saw this.

-In which of the following states does “Robert Kelleher” currently live or own property?

-In which of the following countries have you ever lived or owned property?

-Which of the following people have you known?

If you've ordered your credit report from one of the four companies that insist on pretending that your financial data belong to them and not you, then you have seen these kinds of questions. But even this was different. The security questions these credit agencies ask are all contained in the reports you are seeking access to.

Airbnb was asking me questions just as private, if not more so – the most intimate questions I’ve ever been asked by any travel company in all my travels around the world, even after the tightened security that followed 9/11. Seeing these questions on Airbnb was like buying a box of Wheaties and having the grocery clerk rifle through your financial records to make sure you are worthy of the breakfast meal endorsed by Tom Brady. It was just... wrong.

But that's not what got to me. What got to me were the other two questions. “Robert Kelleher” is my father. Who in fact doesn't live in or own property in any state other than the state of grace. He died not long ago. And every moment that reminds me of that still strikes my heart with a cold and severe pain that I can never put into words. Here was Airbnb pulling up data so old it was false, and asking me to verify it, just so I could prove myself worthy of using its platform. All the while ready to charge me a service fee for the injury.

As for the third question, I hesitated before clicking on the menu. What people should I know, according to Airbnb? This verification process was leaving me feeling awfully powerless. And all I ever wanted was just a room for the weekend. It turned out the “person I have known” was a neighbor across the street. So I am, after all, no Philip Jennings or Walter White, but this whole big data thing was starting to creep me out.

We live our lives in ways that spill lots of personal information into public databases. Each piece of data we eject in innocence. When we aggregate those bits of data into our personal histories, they become our memories. When the same thing happens in the hands of an anonymous corporation, it becomes sinister and intrusive. It's the definition of creepy.

Which is why, instead of answering Airbnb's questions, I complained on Twitter. A social-media worker quickly responded. Whoever this person is, he or she (I couldn't tell; that piece of information was denied to me) said everything exactly right given his or her role. You can be on the wrong side of an argument and still say the right things along the way.

So Airbnb doesn't seek this information out itself; it pays a company called IDology to do it. (By the way, it’s pronounced ideology, not idology – as in, the logos of a false idol. I leave it to you, reader, to decide which pronunciation is less disturbing.) IDology was founded the same year as Facebook, only where Facebook seduced you into openly sharing the details of your daily life, IDology was secretly hoovering up the breadcrumbs of data you spilled without even knowing it.

IDology offers an ID-scanning service like Jumio's, only Airbnb chose not to pay for it. And Airbnb didn't choose to place the snazzy, green “SECURE” logo next to IDology's technology like it did for Jumio's. Instead, Airbnb paid for an IDology technology called ExpectID IQ, which in the company's own words “provides a simple, non-intrusive way to test an identity” and “verifies customers in a way that won’t alienate them” – all while “preventing fraud.”

IDology didn’t deliver on the non-intrusive and non-alienating promises, at least in my case. But let’s look closer at that preventing-fraud thing. And let me make the point again – because I'm going to come back to it – that I agree completely that combating fraud is important.

When you look into IDology, there are a few fishy things about it that stand out. The first is that the site looks like it was designed in 2007. If they spend so little to update their customer interface, what are they spending on IT security?

Second, IDology offers this perfunctory, unconvincing video on consumer-data privacy. “Here at IDology, consumer privacy and protection is at the forefront of what we do,” says an SVP in a taupe blazer, with eyes as large as Elliot Alderson's. You can't watch it without recalling that this is exactly the kind of thing some companies bragged about right before they were hacked.

If user data is collected from public information, what does it matter if someone hacks it? A lot. We share information in public documents according to laws passed before lawmakers had any clear definition of data mining. We consumers shared the data the law demanded, with the understanding it would be used exactly as we shared it – incrementally, discreetly, scattered so widely and randomly that only an insane, obsessive man could piece it together. But companies like IDology collect it in ways regulators never imagined – or wanted.

What guarantees are there that someone won't steal that aggregate public information from IDology? IDology's site doesn't say. Its clients are corporations like Airbnb, not consumers like us. (That video about consumer privacy protection wasn’t, you slowly and dispiritingly realize as you watch it, meant to assuage consumers, but rather companies that fear a backlash from angry consumers.) If, as Airbnb says, this is based on the same information you use to access a credit report, could someone steal it from IDology's servers and access anyone's credit report?

Finally, IDology's CEO is named John Dancu. Unlike a lot of tech visionaries, he doesn't have much of a presence online. Which I kind of understand in this era of privacy intrusions. But John Dancu – the guy who wants to collect and sell everything ever publicly disclosed about your history, your family, your neighbors, all while not telling you he's done so – this guy won't even put his fucking last name on his LinkedIn page.

I've gone into deconstructive detail about a minor online transaction for a reason. I understand Airbnb's urge to verify users. I trust they vet hosts just as vigorously, and if they do, I feel better about using them. No harm resulted from the verification process this weekend. I am not ranting about wrongs done; I am just really, really worried about where things are headed in general as more of our lives and activities migrate online.

Here's where things have gotten to: Horrible things have happened to Airbnb guests and hosts, and they rightly attract media attention. Airbnb sensibly reacts to prevent fraud. But there is a risk it's going too far in search of a solution.

Using personal data to verify one's identity is like using an antibiotic to treat an infection. It's startlingly effective in the short term, but overuse has perilous consequences in the long term, as the bad bacteria adapt and grow more powerful. Passports uploaded are passports that can be stolen. Public data, once aggregated, can empower identity thieves.

In this era of big data, companies are thinking about what data can do for them today. Their sales don't depend on the costs incurred when that data is abused. To book a room on Airbnb, you may need to share intimate info you wouldn’t even share with friends. But the more readily that info is shared, the more easily it can be abused by identity thieves. It's not wrong for Airbnb to want that data for verification, but it's very wrong for it not to be aware of its potential for abuse.

There is a minority view emerging – which I share – about personal data: It belongs to the user – sorry, the person – and can only be shared with consent. I appreciate the choice to upload a photo of my passport (thank you, but no), but when you ask me to confirm information I've never voluntarily shared with you, just because I want a pillow to lay my head on for the night, you cross a line.

The saddest thing I saw on IDology's site was this: “All of IDology’s solutions are designed with customer protection in mind.” Which sounds great, until you remember the customer is a company like Airbnb, and not an unwitting citizen like you or me. The user's data is exposed, but IDology and its clients are protected. Over and over, IDology says this is ostensibly done in the name of stopping fraud. Here is Merriam-Webster's definition of the word fraud:

intentional perversion of truth in order to induce another to part with something of value or to surrender a legal right

And I'm just going to leave it to John D, the CEO of IDology (along with any of its clients), to equivocate over whether, by the very definition of fraud, what you are doing with our public, personal data – really, when you think about it, the most valuable right we’re left with once we users are commoditized into database entries – is fraudulent. For the rest of us, the answer is pretty clear.

We ended up not booking anything on Airbnb this weekend. But I started thinking about its use of our data.

And I was thinking to myself, this could be heaven or this could be hell.

Well, okay. Yeah. It's probably hell.