A proof of concept of this was revealed some months ago when a Burger King TV commercial said "Siri, tell me about the Whopper." Maybe it was "Hey Google"; I don't remember. Anyway, it was rapidly blocked, then BK came out with another commercial, and they had a little war back and forth. BBC apparently tried it too, with "Hey Siri, remind me to watch Doctor Who on BBC America." I was particularly amused by "Hey Siri, remind me to watch Broadchurch on BBC America" airing during the final episode of that series. I burst out laughing when that ad came on and had to explain it to the spousal unit. And as Sam Clemens said, or is alleged to have said, 'Analyzing humor is like dissecting a frog: you can do it, but the frog isn't good for much afterwards.'
Well, some Chinese researchers have found another way: pitch the audio above the range of human hearing. The microphones can still catch it, and the command still works. Now, I don't have voice-activated Siri on my iPhone; I have to hold down the button, because I find that, for me, Siri is mostly garbage. I don't think it's my enunciation, but maybe it is.
Makes me wonder if they'll add a low-pass filter that cuts mic input above 18-20 kHz or so to prevent this sort of abuse.
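For the curious, that countermeasure is simple enough to sketch. Here's a minimal illustration in Python using SciPy's Butterworth filter; the 96 kHz sample rate, 18 kHz cutoff, and 8th-order design are my assumptions, not anything any vendor has published:

```python
# A minimal sketch of the low-pass idea: cut everything above ~18 kHz
# before the audio reaches the wake-word detector. Parameters here are
# illustrative assumptions, not a vendor's actual design.
import numpy as np
from scipy.signal import butter, sosfilt

def strip_ultrasonics(samples, fs=96_000, cutoff_hz=18_000):
    """Low-pass the mic signal so ultrasonic content never reaches
    the speech recognizer. `samples` is a 1-D float array."""
    # 8th-order Butterworth low-pass, as second-order sections
    # for numerical stability.
    sos = butter(8, cutoff_hz, btype="low", fs=fs, output="sos")
    return sosfilt(sos, samples)

# Demo: a 1 kHz voice-band tone passes nearly untouched, while a
# 30 kHz "DolphinAttack-style" tone is strongly attenuated.
fs = 96_000
t = np.arange(fs) / fs
voice = np.sin(2 * np.pi * 1_000 * t)
ultrasonic = np.sin(2 * np.pi * 30_000 * t)
for name, sig in [("1 kHz tone", voice), ("30 kHz tone", ultrasonic)]:
    out = strip_ultrasonics(sig, fs)
    print(name, "RMS after filter:", round(float(np.sqrt(np.mean(out**2))), 4))
```

Of course, this only helps if the filtering happens before whatever nonlinearity demodulates the attack signal; a purely digital filter after the microphone hardware may be too late.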
I read about this last week, perhaps on the day I went down to help out that medical practice with their ransomware attack. The clinic was handling its last patients of the day, and the office manager was running the front desk, using his iPhone with Siri voice commands. He looked a little shocked when I told him about this attack.
https://apple.slashdot.org/story/17/09/06/2026247/hackers-can-take-control-of-siri-and-alexa-by-whispering-to-them-in-frequencies-humans-cant-hear
Here's the Slashdot summary:
Chinese researchers have discovered a vulnerability in voice assistants from Apple, Google, Amazon, Microsoft, Samsung, and Huawei. It affects every iPhone and Macbook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon's Alexa assistant. From a report:
Using a technique called the DolphinAttack, a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear. The researchers didn't just activate basic commands like "Hey Siri" or "Okay Google," though. They could also tell an iPhone to "call 1234567890" or tell an iPad to FaceTime the number. They could force a Macbook or a Nexus 7 to open a malicious website. They could order an Amazon Echo to "open the backdoor." Even an Audi Q3 could have its navigation system redirected to a new location. "Inaudible voice commands question the common design assumption that adversaries may at most try to manipulate a [voice assistant] vocally and can be detected by an alert user," the research team writes in a paper just accepted to the ACM Conference on Computer and Communications Security.
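The core trick, as the paper describes it, is plain amplitude modulation: shift the voice command up onto an ultrasonic carrier, and the nonlinearity in the device's MEMS microphone demodulates it back into the audible band on the receiving side. A minimal sketch, assuming a 30 kHz carrier and full modulation depth (illustrative guesses, not the paper's exact parameters):

```python
# Hedged sketch of the DolphinAttack signal trick: double-sideband AM
# of a voice command onto an ultrasonic carrier. A nonlinear microphone
# response recovers the baseband command even though no human hears
# the transmission. Carrier frequency and depth are my assumptions.
import numpy as np

def am_ultrasonic(command, fs, carrier_hz=30_000, depth=1.0):
    """Shift a baseband voice command (1-D float array in [-1, 1])
    up to an inaudible carrier via classic amplitude modulation."""
    t = np.arange(len(command)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # (1 + m(t)) * carrier: the squared term of a nonlinear mic
    # response puts m(t) back into the audible band.
    return (1.0 + depth * command) * carrier

# fs must comfortably exceed twice the carrier, so the attack needs a
# wide-band DAC and transducer to play this back.
fs = 96_000
t = np.arange(fs) / fs
fake_command = 0.5 * np.sin(2 * np.pi * 500 * t)  # stand-in for speech
tx = am_ultrasonic(fake_command, fs)
```

Note that ordinary phone speakers mostly roll off well before 30 kHz, which is presumably why this kind of attack calls for an ultrasonic transducer rather than stock playback hardware.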