
By Chand Bellur
May 20, 2020 at 2:39 p.m. PT
- Apple contracted external quality assurance resources to evaluate real-life Siri requests from Apple customers without their knowledge.
- One of the contractors stepped forward anonymously, acknowledging that other people were listening in on Apple customers’ Siri requests, some of which were embarrassing in nature.
- Today, that contractor has shed his anonymity, concerned that Apple faced no consequences for this intentional privacy violation.
Apple’s Privacy Stance Undermined by Contracted Quality Assurance
Apple claims to protect end-users’ privacy; however, recent developments indicate otherwise. To improve Siri, the Cupertino tech company hired an outside firm to provide quality assurance. Members of the team listened to real Siri requests from Apple customers, some of which revealed embarrassing information.
When a company like Apple hires an outside contracting firm, it loses a degree of oversight and authority. Apple can exercise due diligence and specify requirements, but the mechanisms for enforcing those conditions are weak. Beyond contractual leverage and the courts, a third-party firm operates with far more autonomy than in-house quality assurance testers.
The lack of oversight ended up being a problem for Apple. The Guardian first reported this story in July 2019. According to the report, the contracted firm had a high employee turnover rate, and its employees were privy to what should have been private information. The contractors listened to medical conversations with doctors, sex-related queries, and other embarrassing matters.
Although Apple attempted to appease consumers and privacy advocates, claiming testers did not access personal information, whistleblower Thomas le Bonniec recounts a different experience:
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
Le Bonniec goes on to detail the firm’s high employee turnover. The combination of a shady, fly-by-night workforce and access to embarrassing personal data leaves Apple customers open to blackmail. Although no such incidents are known to have occurred, the arrangement shows Apple is not respecting customer privacy.
Le Bonniec Steps Forward with Formal Complaint
Whether the episode reflects hypocrisy or a simple lack of oversight, it contradicts Apple’s claim to protect consumer privacy. After the story broke and nothing happened, a frustrated Thomas Le Bonniec filed a formal complaint with European data protection regulators. Apple violated consumer privacy and got away with it: although users can now opt out of Siri quality assurance, the company faced no other consequences.
By coming forward under his true identity, Le Bonniec risks limiting his employment prospects. Most large corporations are wary of hiring whistleblowers, fearing they may find and report wrongdoing. He is motivated by doing the right thing, not by profit.
One of Le Bonniec’s most shocking revelations is that even people who don’t use Apple products had their privacy compromised, simply because an Apple user referenced or recorded them in a Siri request:
“The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device. The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever.”
Given that the issue extends beyond Apple users, Le Bonniec feels the company needs to do more than simply let users opt out. On the other hand, without real-world testing, Siri’s development will be stunted.
Apple is not the only tech company that lets employees or contractors listen in on digital assistant requests; both Google and Amazon engage in similar practices. The problem for Apple is that it claims to be a staunch privacy advocate. The reality is quite different.
It seems reasonable that if Apple needs to listen to real-world Siri requests, it should use only carefully vetted employees, not contractors with a high turnover rate. Furthermore, only users who opt in should have their requests used for quality assurance. End-users should receive a notification when their Siri conversation becomes QA testing data, and ideally, customers should be able to hear exactly what the QA testers are accessing. The notification should also offer another chance to opt out, so consumers can reassess whether they want to continue participating in Apple’s testing program.