GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Siri ‘regularly’ records sex encounters, sends ‘countless’ private moments to Apple (https://gfy.com/showthread.php?t=1316242)

wehateporn 07-27-2019 12:18 PM

Siri ‘regularly’ records sex encounters, sends ‘countless’ private moments to Apple
 
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed.

Recordings from Apple’s Siri voice assistant are fed to human contractors around the world, who grade the AI based on the quality of its response and whether its activation was deliberate, according to an anonymous contractor who spoke to the Guardian. They claimed accidental activations are much more frequent than Apple lets on, especially with Apple Watch users, and they want the company to own up to the problem.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower revealed.

Continued https://www.rt.com/news/465181-apple...n-contractors/

RedFred 07-27-2019 12:36 PM

https://scontent.fapa1-2.fna.fbcdn.n...4c&oe=5DA5DD20

bronco67 07-27-2019 01:06 PM

Quote:

Originally Posted by wehateporn (Post 22508587)
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed. [...]

No, Siri doesn't do that. RT is full of shit and so are you. Die.

GFED 07-27-2019 01:08 PM

Quote:

Originally Posted by wehateporn (Post 22508587)
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed. [...]

All of the wake-on-voice-command devices, phones, etc. do this.

TrafficTitan 07-27-2019 01:13 PM

If people opt in to this, it wouldn't even be illegal.

wehateporn 07-27-2019 01:14 PM

Quote:

Originally Posted by bronco67 (Post 22508600)
No Siri doesn't do that. RT is full of shit and so are you. Die.

So everything on RT is fake news? Who do you trust, the Guardian? Maybe the Guardian got tricked by Putin into reporting the same fake story.

https://www.theguardian.com/technolo...iri-recordings

Bladewire 07-28-2019 09:36 AM

Quote:

Originally Posted by bronco67 (Post 22508600)
No Siri doesn't do that. RT is full of shit and so are you. Die.

Well said :thumbsup

MrBottomTooth 07-28-2019 01:38 PM

I can go into my Alexa app and listen to every single time I triggered my Alexa. You can go in and delete the recordings if you want. Amazon admitted they have people analyzing these, no doubt to improve reliability and accuracy. I'm sure all the companies do this. In my case 90% of the recordings are me turning lights on and off.

Bladewire 07-29-2019 03:42 PM

Quote:

Originally Posted by MrBottomTooth (Post 22509030)
I can go into my Alexa app and listen to every single time I triggered my Alexa. You can go in and delete the recordings if you want. Amazon admitted they have people analyzing these, no doubt to improve reliability and accuracy. I'm sure all the companies do this. In my case 90% of the recordings are me turning lights on and off.

Google is creating voice profiles for every person using voice-to-text so that, in the future, your Google personal assistant will have your voice, if you choose.

Scary

pimpmaster9000 07-29-2019 03:56 PM

This is why I will only buy Huawei from now on... nobody cares if some China dude is spying on him... avoid all US tech, it's all spyware and cannot be trusted...


All times are GMT -7.

Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.