The security researcher who revealed during the HOPE/X conference earlier this month that there are back doors installed on 600 million iOS devices has responded to Apple’s claim that the back doors are used only for diagnostic and enterprise purposes. The gist? “That’s bullshit.”
The initial report focused on small diagnostic services that allow anyone in possession of a pairing file (the file created when an iOS device is unlocked and set to trust a PC) to retrieve unencrypted data from the device. Apple confirmed that the services exist, but said in a statement that they're actually harmless:
We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers, and Apple for troubleshooting technical issues. A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data. The user must agree to share this information, and data is never transferred without their consent.
As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.
The researcher, Jonathan Zdziarski, isn’t satisfied with that answer. As he wrote in a blog post published earlier this week:
I understand that every OS has diagnostic functions, however these services break the promise that Apple makes with the consumer when they enter a backup password; that the data on their device will only come off the phone encrypted. The consumer is also not aware of these mechanisms, nor are they prompted in any way by the device. There is simply no way to justify the massive leak of data as a result of these services, and without any explicit consent by the user.
I don’t buy for a minute that these services are intended solely for diagnostics. The data they leak is of an extreme personal nature. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption. Tell me, what is the point in promising the user encryption if there is a back door to bypass it?
There are a few possible answers for this. The first is that Apple is lying when it denies building back doors into its products for government use. The second is that Apple has once again failed in its attempts to secure its customers' personal information, which wouldn't be all that surprising, given the company's worrisome history with digital security and with acknowledging its mistakes.
Consumers are getting screwed either way. If Apple is cooperating with intelligence agencies despite its repeated claims to the contrary, people may have been fooled into thinking that their iPhones are secure from government snooping. If the company just sucks at securing personal data, people have to worry about hackers and other attackers in addition to the government.
It’s a lose-lose situation for consumers, and it doesn’t seem likely to change any time soon. As the Guardian notes in its follow-up report on Zdziarski’s presentation and subsequent blog post:
Though many want more information from Apple on these previously undisclosed services and security bypasses, the company can continue to point out that an attacker would need to be in control of that pairing file and in proximity of a target iPhone to retrieve data.
The company had not responded to a request for comment at the time of publication.
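For context, the pairing file the Guardian refers to is cached on any computer an iOS device has been unlocked against and trusted. A minimal sketch of how someone might check a machine for these records, assuming the commonly documented default locations (paths can vary by OS version, and the helper name here is illustrative):

```shell
#!/bin/sh
# List iOS pairing records ("trust" files) cached on this computer.
# Assumed default locations (may differ by OS version):
#   macOS:   /var/db/lockdown
#   Windows: %ProgramData%\Apple\Lockdown
list_pairing_records() {
  for dir in "$@"; do
    if [ -d "$dir" ]; then
      echo "Pairing records in $dir:"
      ls "$dir"   # each .plist here corresponds to one trusted device
    fi
  done
}

list_pairing_records /var/db/lockdown "${PROGRAMDATA:-}/Apple/Lockdown"
```

An attacker with a copy of one of these plists, plus network or USB proximity to the matching iPhone, is the scenario Apple's defense assumes away.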
Welcome to the new age, where companies are responsible for safeguarding more and more personal information, yet aren’t required to be honest with consumers about the extent to which that information might be compromised by anyone who knows where to look.