Apple Pledges to Stop Spying on Customers with Siri

Apple’s quality assurance team used real-world Siri conversations to test the technology. The practice has been suspended until a future software update allows users to give consent.

Apple claims to protect customers’ privacy; however, The Guardian uncovered a quality assurance practice that left customers exposed. Quality assurance engineers regularly listened to real-life Siri conversations in order to test the system.

Such practices are common across the tech industry. In healthcare IT, for example, developers with administrative privileges can access patient data; these employees are trained in HIPAA privacy laws. Google and Amazon have similar quality assurance practices; however, they allow users to opt out. The problem is that Apple differentiates itself from the competition on its privacy features, yet it doesn’t let users opt out of this quality assurance program.

The Guardian reported that Apple contractors, tasked with grading Siri on its performance, were privy to illegal drug deals, sexual activity, and medical information from real-life Apple customers.

Apple responded that it uses only a small fraction of customer data to test Siri and that testers have no knowledge of the user’s Apple ID. According to Apple, these safeguards protect users’ privacy, but the practice may still be disconcerting to some users.

The practice was exposed by an anonymous whistleblower working in quality assurance for Apple. The source reported that the “Hey Siri” feature can be triggered by certain words on TV, resulting in customers’ voices being recorded without their knowledge or consent. These accidental activations provided the most salacious information about customers.

The biggest concern is that the people listening are not Apple employees. Apple outsourced the project to a third-party firm, which gives the company less control over employee training.

For now, Apple has decided to suspend the program. Future releases of iOS, macOS, and tvOS will allow users to opt out of quality assurance. If you use the “Hey Siri” feature, it’s probably a good idea to opt out. Although quality assurance testers cannot match recordings with an Apple ID, the practice still strikes some customers as creepy.

© 2022 Appledystopia | Privacy & Cookie Policy | Terms of Service