How to gain consumer trust in the post-Snowden world
Ah, those were the days.
I'm talking about the days before documents leaked by Edward Snowden revealed that the NSA sucks up massive amounts of personal information from the biggest technology companies in the world. Somehow the privacy scandals we used to get angry about just don't seem as urgent anymore. Remember those halcyon hours we spent raging against Target's marketers for knowing a teenage girl was pregnant before her father did? And all the fun we had when we gasped in horror at discovering that Facebook and Instagram could use your party pics in Pepsi ads? Or Facebook's Beacon?
Nowadays these outrages seem quaint by comparison, with national law enforcement agencies systematically dismantling the encryption standards created to protect our privacy. Even if we're aware of widespread NSA snooping, shouldn't we still care at least a little how companies manage and share our data when it comes to marketing?
That's the main takeaway of a new study by the Boston Consulting Group. The study finds that companies that earn consumers' trust, and that explain clearly and preemptively how they will use personal information, are five to ten times more likely to be given that data, depending on the country and industry. This falls in line with another recent study we reported on, which found customers were more likely to click on ads if they knew why they were seeing them.
"The magnitude of that is a surprise," says John Rose, head of the media sector at BCG and the study's main author. Somewhat surprisingly, the finding cuts across age demographics. "There's not a lot of difference between millennial attitudes toward their data compared to the older populations. The lore is that this is an over 25, over 30 issue. Teens and young adults don't care." The data, however, shows this isn't true.
So what should companies do to, as Rose puts it, "expand their trust footprint"?
The first step is internal. "Most companies have really good privacy policies and statements, but they don't go much further than that," he says, adding that companies need to actively encourage their employees to adhere to these policies and create metrics and incentives surrounding them.
The next step is external transparency. "People need to know what uses you're putting to their data, they need to be able to find that out simply, they need to know what your code of conduct is... and they need to understand how you are holding yourself accountable."
Finally, companies need to get out of the habit of asking themselves whether it's "right or wrong" to collect a certain piece of information. It's only wrong, Rose says, if customers don't understand the data collection process.
Is all this transparency for naught because we're scared of data falling into the hands of the government? After all, the NSA put a gag order on Google regarding its disclosure of what information it shares with the government, so already there's a limit to how transparent companies can be. Maybe so, but Rose emphasizes that this resonates with customers as a "government problem" not a "Google problem" or an "Apple problem."
"I don't think people believe that this is going to be worse on iTunes than it would be on Android or anywhere else."
In other words, even if companies are compelled to share our data with the government, the least we can do is hold them accountable for sharing our data with third parties for marketing purposes.
Check out more of BCG's findings here: