It's only a couple of weeks since we learned, for certain, that Google is listening to what people say to Google Assistant. Now -- and perhaps surprising no one -- it transpires that Siri is just as much of a privacy invasion.
Just as Amazon and Google do with Alexa and Google Assistant, Apple shares some of the recordings made via Siri with contractors, with a view to improving the service. But while it may mean that Siri gets better at responding to queries, it also means that the contractors charged with "grading" Siri's performance "regularly hear confidential details" -- everything from people having sex, to people making drug deals.
A whistleblowing Apple contractor told the Guardian: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data".
One of the reasons such private conversations are heard by Siri in the first place is that the digital assistant can be inadvertently triggered by something other than the designated "Hey Siri" wake phrase. While it may come as some comfort to know that Apple says it does not link recordings to the user account they came from, users may still be disturbed to find that real people -- not just the AI assistant -- are hearing what they say.
The contractor said that while revealing recordings come from a range of devices, it is Apple's HomePod that is the greatest source of material, and, unsurprisingly, it is accidental activations of Siri that yield the most sensitive recordings. The contractor explained that "you can definitely hear a doctor and patient talking about the medical history of the patient. Or you'd hear someone, maybe with car engine background noise -- you can't say definitely, but it's a drug deal... you can definitely hear it happening. And you'd hear, like, people engaging in sexual acts that are accidentally recorded on the Pod or the Watch".
In a statement given to the Guardian in response to the revelations, Apple said:
A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements.
Perhaps most worryingly, the Apple contractor said that workers are pushed to such an extent that there is little incentive to report accidental recordings to the company. "We're encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you're listening to seems to be for technical problems. There's nothing about reporting the content".
They went on to say:
There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad. It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers -- addresses, names and so on.
Apple is subcontracting out, there's a high turnover. It's not like people are being encouraged to have consideration for people's privacy, or even consider it. If there were someone with nefarious intentions, it wouldn't be hard to identify.