Privacy: Apple allegedly using Siri to listen in on conversations without users' consent
An employee of Apple subcontractor Globe Technical Services exposed what the company was purportedly doing without the public's knowledge.
When international healthcare systems called on Google and Apple to help in the fight against COVID-19, some were worried about the violation of their rights. Given that smartphones would be used to track individuals potentially exposed to SARS-CoV-2, privacy concerns were immediately raised. Now, the Cupertino, California-based tech group is being called out for allegedly using Siri to record users' conversations without their consent. This was first reported in 2019, but it seems that an investigation was never launched.
Thomas le Bonniec, an employee of Apple subcontractor Globe Technical Services, exposed what the company was purportedly doing without the public's knowledge. He reportedly submitted a letter to the European authorities responsible for regulating tech companies' data protection practices. The Independent notes that his letter described the matter as a "massive violation of the privacy of millions of citizens."
According to the whistleblower, it is alarming that regulators are not investigating the matter. Even though it is already public knowledge, those responsible for taking action appear to be turning a blind eye to the issue. His work tasked him with correcting errors in transcriptions of recordings sourced from Apple Watches, iPads, and iPhones. Surprisingly, most of these were supposedly recorded even when Siri was not engaged by the user.
He also wrote, "In other words, staff assigned to the project had access to personal user information, and used it to be able to link it to Siri commands. This means that users' playlists, contact details, notes, calendars, photos, maps, etc. were gathered in huge data sets, ready to be exploited by Apple for other projects."
From Apple's end, the company communicated in August 2019 that it had identified over 300 workers involved in the alleged audio recording incident. However, le Bonniec wonders why nothing was done to confirm that the questionable practices had been completely stopped.
In a related report, consumers who regularly use smart speakers or virtual assistants should be aware of the risks involved. Hackers have been observed using malware to listen in on live conversations even when the device indicates that its microphones are not active. Apple has yet to respond to the latest developments.
© Copyright IBTimes 2024. All rights reserved.