Fricktured Friday
Technology is becoming an ever more important part of our lives; we use it and rely on it for almost everything these days. Over the years, though, products have been getting more and more invasive. And Amazon’s Alexa may have just shown us why we should still be wary of these technological advancements.
Since its inception, the people behind Alexa have been assuring and reassuring people that the device is safe and non-invasive, and that 100% of the time it is only listening if you command it to, or “wake it up” by saying “Alexa” or “Alexa, wake up.” But no matter what, it only records and listens if you tell it to...
At least that’s what we’ve been told. But this week a couple had a private conversation recorded by their Alexa, and that recording was sent to an almost completely random person on their contact list, prompting that person to text them, “Unplug your Alexa, you’ve been hacked.”
But when the person sent this text to a girl named Danielle, at first she didn’t believe what she was being told. Once the contact added, “You’re talking about hardwood floors,” though, she knew he wasn’t lying and that her private conversation had indeed been recorded.
After being assured again, and again, and again that Amazon’s devices would never do this, and after trusting Amazon, Danielle felt as though her and her partner’s privacy had been violated, and for good reason. Who wouldn’t?
Instead of the conversation with her partner staying private, as it was meant to, it was recorded and sent off to a stranger without either of their knowledge. How fun!....
Now, with a privacy violation this big, their customer Danielle’s trust in their product shattered, and a PR nightmare on their hands, you’d think Amazon would do everything they could to get to the bottom of this, especially since the company assured people this wasn’t possible and would never happen.... Instead, they released this statement:
“Echo woke up due to a word in background conversation sounding like ‘Alexa’. Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’.”
Does that sound RIDICULOUSLY implausible? Don’t worry, they thought so too, which is probably why they added this caveat:
“As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
Basically, this translates to: “We have no idea what the hell happened, or why this thing did what it did. Here’s our best guess! Please don’t stop using Amazon products, we’re working on this problem.... Even if we have no clue what it is, don’t be afraid! Alexa loves you.”
Or at least that’s how I see it... All I know is this has solidified my absolute distrust of these artificial “assistants,” and has made me even more wary of A.I. If their best explanation for why Alexa recorded a private conversation and sent it off to some random contact is that unbelievably unlikely series of events, do they even have this thing under control? Or are we being surveilled by a company that has been lying to its customers since this product’s launch?
No matter what the reason, this is an unacceptable breach of privacy by a company, and it must be dealt with. But with the increasing popularity of devices such as Alexa, and with every phone coming equipped with some iteration of an assistant like it, is there really any escape?
Just another AWESOME story from the realm of A.I. and technology, something I’m sure won’t kill us all one day! :P
Anywho... here’s a little doodle I did that I’m thinking about turning into a more legit piece. Everything I’ve worked on this week just hasn’t really “felt” right, or something, so I’m considering doing a more planned-out piece based on this doodle’s design. Let me know what you think!
Have a good weekend, everybody.
Sincerely, Bret Frick.