Apple Spies on Users’ Conversations via Siri For ‘Grading’ Purposes


With the advent of voice assistants have come quite a few privacy concerns. Amazon and Google, which own Alexa and Google Assistant respectively, both came clean this year, and now Apple has done the same with its virtual assistant, Siri.

A recent report by The Guardian revealed that Apple contractors regularly listen to recordings of users' conversations with Siri. They monitor the recordings in order to 'grade' Siri's responses, judging factors such as whether Siri was activated accidentally and whether its answer was an appropriate response to the query.

The company claims that reviewers listen to only a tiny portion of recordings and that just 1% is used for grading. The recordings are also not associated with any particular Apple ID, so it would be nearly impossible to trace them back to their original owners.

    This is what Apple had to say:

    “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”


Siri is often activated accidentally and ends up recording private conversations, such as those between a doctor and patient, or the details of a confidential business deal. An anonymous contractor also said that apparent criminal activity, such as drug deals, gets recorded. People who own a HomePod or Apple Watch are especially prone to these accidental recordings.

Accidental activations of Siri can only be reported as a 'technical problem', and there are no procedures in place for dealing with sensitive or criminal recordings. The contractor's motive for going public is a fear of this information being misused or falling into the wrong hands. He says there is little vetting of who works there or of how much information they can access; it is relatively easy to identify the person speaking in a recording, and accidental triggers sometimes reveal an address or a name.

Apple does provide an option to disable Siri entirely, which keeps your privacy intact. For the sake of transparency, the company should also tell users that this human review exists. Alexa and Google Assistant offer an option to turn off the microphone while still allowing the voice assistant to operate; moving forward, Siri should gain the same feature.
