Siri, Apple's voice assistant, listens to users even while they are having sex or talking with their doctor.

As technology companies adopt increasingly intrusive practices, users' privacy grows more vulnerable every day. According to cybersecurity specialists, Apple and its contractors have access to users' confidential information, such as medical details, medicine purchases, and even recordings of their sexual encounters, thanks to the "quality control" process for Siri, the company's voice assistant.

A small portion of the interactions between Siri and its users is sent to Apple's contractors, which operate in practically every part of the world. The job of these companies is to evaluate, according to a set of variables, the responses the voice assistant gives the user. As might be expected, Apple does not clearly disclose this practice to its customers.

The company only states that it collects some data to make constant updates to Siri, although cybersecurity specialists consider this a clear violation of privacy, since Apple does not explicitly mention that this work is done by humans who listen to users' recordings.

When questioned about this practice, the company said: "A small portion of the requests and questions that Siri receives are subjected to rigorous analysis to improve the user experience. These records are in no way associated with users' IP addresses or physical locations. In addition, the analysis is performed in secure facilities in accordance with Apple's data confidentiality measures." It is estimated that around 1% of the daily requests the voice assistant receives are subjected to this procedure.

An anonymous informant who works at one of Apple's contractor companies said that users and organizations should stay alert to the intrusive capabilities of tools like Siri: "Considering how frequently voice assistants activate accidentally, these companies could end up with recordings of users in private circumstances," the informant said.

"Siri can wake up even without clearly hearing the activation phrase ('Hey, Siri'). Even an Apple Watch with Siri can begin recording if it detects any movement or hears random words," the informant added.

Although Siri is included in most Apple products, the main sources of this data are the Apple Watch, the iPhone, and the HomePod smart speaker. According to cybersecurity specialists, the Apple Watch is the device with the highest number of unauthorized activations of the voice assistant.

Finally, the informant mentioned that, at Apple's express request, accidental activations of Siri must be reported as technical errors: "There is no special procedure for handling the voice assistant's accidental activations, so in the end the company is not really making much effort to reduce these service errors."

According to cybersecurity specialists at the International Institute of Cyber Security (IICS), when a user asks Siri about access to their voice records, the assistant only replies, "I only listen when you're talking to me." This is completely false, since it has been shown that users' voice recordings are stored by Apple even when the assistant is activated by mistake.
