Apple fires 300 in Ireland for listening to Siri’s recordings of people having sex

Apple said it will work to ensure privacy, will not retain audio recordings of Siri interactions, and will use computer-generated transcripts to help improve Siri.

Apple has reportedly laid off 300 contractors in Cork, Ireland, who listened to over 1,000 Siri recordings per shift, including recordings of people having sex.

According to an Engadget report citing the Guardian on Wednesday, the Cupertino-based iPhone maker has decided to terminate the Siri "grading" programme after suspending it last month.

"More contractors throughout Europe may have been let go", said the report.

An earlier report in the Irish Examiner said that contractors in the city of Cork listened to over 1,000 Siri recordings per shift before Apple suspended the programme last month.

They regularly heard drug deals, sensitive business deals and even recordings of people having sex picked up by Apple's digital assistant.

The contractors' job was to listen to and grade recordings made by Apple's virtual assistant Siri. One employee said that the details of each Siri user were kept anonymous.

"They (the recordings) were about a few seconds long, occasionally we would hear personal data or snippets of conversations but mostly it would be Siri commands," the employee was quoted as saying.

The details were revealed after a whistleblower told the Guardian last month that Apple contractors worldwide regularly heard users' conversations.

Apple users had no prior knowledge that their Siri recordings were being listened to.

After details of the practice came to light, Apple suspended transcription and grading work on Siri recordings last month.

Improving privacy

Apple has said it will no longer retain audio recordings of Siri interactions and will instead use computer-generated transcripts to help Siri improve.

"The users will be able to opt in to help Siri improve by learning from the audio samples of their requests," Apple said in a statement late on Wednesday.

Those who choose to participate will be able to opt out at any time.

"When customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri," said the company.

"Apple is committed to customer privacy and made the decision to suspend Siri grading while we conduct a thorough review of our processes. We're working closely with our partners as we do this to ensure the best possible outcome for our suppliers, their employees and our customers around the world," a spokesperson for Apple said earlier.
