Apple's Siri 'listens to users' intimate moments, including sex', says whistleblower

From Amazon's Alexa to Apple's Siri, intelligent assistants are now part of many people's daily lives.

But a new letter from a former Apple contractor suggests that Siri may be doing more than helping users with their daily tasks.

Thomas Le Bonniec claims that Siri listens in on users' intimate moments, with the recordings then "graded" by contractors.

Le Bonniec says that, during his time at Apple, he graded recordings of medical discussions, criminal activity, sex and business conversations.

Speaking anonymously to The Guardian last July, he said: “There is not much vetting of who works there, and the amount of data that we are free to look through seems quite broad.

"It wouldn't be difficult to identify the person you're listening to, especially with accidental triggers – addresses, names and so on."

Le Bonniec has now revealed his identity and written an open letter to European data protection regulators about his concerns.

The letter said: “It is worrying that Apple (and undoubtedly not just Apple) keeps ignoring and violating fundamental rights, and continues its massive collection of data.

“I am extremely concerned that big tech companies are essentially wiretapping entire populations, despite European citizens being told that the EU has one of the strongest data protection laws in the world.

"Passing a law is not enough: it needs to be applied to privacy violators."

During his time at Apple's Cork offices, Le Bonniec says he heard thousands of recordings a day from users' iPhones, Apple Watches and iPads.


According to Le Bonniec, these included recordings of the devices' owners, as well as their friends, family and colleagues.

Apple previously admitted that a small portion of Siri requests are analysed to improve the smart assistant and its dictation feature.

In July last year, an Apple spokesman said: “We know that customers have been concerned by recent reports of people listening to Siri audio recordings as part of our Siri quality evaluation process – which we call grading.

“We have heard these concerns, immediately suspended human grading of Siri requests and begun a thorough review of our practices and policies. We have decided to make some changes to Siri as a result."

Mirror Online has contacted Apple for further comment.
