3 easy ways to stop your gadgets from spying on you

There has been so much progress in technology over the past decade that we seem to be living in a futuristic world.

Some of today's technology once resided only in the world of science fiction, seen on the big screen. When I was a kid, I could only dream of one day having a “Star Trek” communicator like Captain Kirk. Now, we have smartphones even more powerful than Mr. Spock could have imagined.

But sometimes the bad comes along with the good.

One downside to many innovations is the loss of privacy. You would be surprised to know about some of the gadgets that allow scammers to spy on you.

That’s why you need to know about these three things that might be spying on you right now.

1. Headphones
Cybercriminals are always looking for new ways to pull off a scam. Now, it looks like our headphones aren’t even safe.

A team of security researchers from Ben-Gurion University has demonstrated that it is entirely possible to hijack a pair of garden-variety headphones and use them as spying devices.

Their proof-of-concept malware, called “speak(a)r,” exploits the headphone jack “retasking” feature of Realtek audio codec chips, found on most computer motherboards, and essentially turns your headphone speakers into a microphone.

The principle behind the hack is quite simple. In case you don’t know, any analog headphone pair can be turned into a working microphone simply by plugging it into the audio-in jack.

Since analog headphones convert electrical signals into audible speaker vibrations, the process can be run in reverse: the speaker membrane picks up vibrations in the air and converts them back into electrical signals, which is essentially how a microphone works.

(You can try this trick by plugging a pair of mic-less headphones into the audio-in jack and starting a recording.)
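If you're comfortable with a little Python, here's a rough sketch of that experiment. It assumes Python 3 with the third-party sounddevice and soundfile packages installed and mic-less headphones plugged into the audio-in jack; the package choice and device selection are my assumptions, not part of the researchers' work.

```python
# Rough sketch: record a few seconds from the default audio input and save it.
# Assumes the "sounddevice" and "soundfile" packages
# (pip install sounddevice soundfile) and mic-less headphones
# plugged into the audio-in jack.
import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 44100   # samples per second
DURATION = 5          # seconds to record

# If the wrong device is used, run sd.query_devices() and pass
# device=<index> to sd.rec() to pick the line-in jack explicitly.
print(f"Recording for {DURATION} seconds...")
recording = sd.rec(int(DURATION * SAMPLE_RATE),
                   samplerate=SAMPLE_RATE,
                   channels=1)   # mono is enough to hear the effect
sd.wait()                        # block until the recording finishes

sf.write("headphone_mic_test.wav", recording, SAMPLE_RATE)
print("Saved headphone_mic_test.wav; play it back and listen.")
```

Speak or tap near the headphones while it records; the playback will likely be faint, but you should be able to make out the sounds.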

The researchers say that since Realtek audio chips are so common in computer motherboards, the attack can work on almost any computer, regardless of operating system, be it Windows or macOS. The team is still determining whether other audio chips and smartphones are vulnerable to these types of attacks, but they believe it is very likely.

“This is a real vulnerability,” said Ben-Gurion lead researcher Mordechai Guri. “This is what makes almost every computer today vulnerable to this type of attack.”
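If you're curious whether your own computer uses a Realtek codec, here's a small, Linux-only Python sketch. It simply reads the codec information the audio driver exposes under /proc/asound/, so the exact paths and output will vary by machine, and other operating systems won't have these files at all.

```python
# Linux-only sketch: check whether the sound card reports a Realtek codec.
# The HDA audio driver exposes codec details under /proc/asound/card*/codec#*;
# on other operating systems, check the audio device properties instead.
from pathlib import Path

for codec_file in Path("/proc/asound").glob("card*/codec#*"):
    for line in codec_file.read_text(errors="ignore").splitlines():
        if line.startswith("Codec:"):
            print(f"{codec_file}: {line}")
            if "Realtek" in line:
                print("  -> Realtek codec found; jack retasking is likely supported.")
            break
```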

The attack also works quite well. During tests, the researchers plugged in a pair of Sennheiser headphones and found they could record sounds from up to 20 feet away. “It’s very effective,” Guri said. “Your headphones make a good, quality microphone.”

Although the proof-of-concept “speak(a)r” malware was created purely for research and precautionary purposes, the vulnerability is certainly there for a determined hacker to try and exploit.

“People don’t think about this privacy vulnerability,” Guri said. “Even if you remove your computer’s microphone, you may be recorded if you use headphones.”

The researchers believe this vulnerability cannot be fixed with a simple software security update. The retasking capability of Realtek audio chips is not a bug but a baked-in feature, and it can't be removed without a complete redesign of the chips installed in future motherboards.

At least for now, this hack won’t work with digital (non-analog) headphones. USB, wireless, Bluetooth and Lightning headphones should be impervious to speak(a)r-type analog input retasking attacks.

Also, since this hack requires swapping the output and input ports, there is a telltale sign that your headphone speakers have been retasked: they stop playing sound. And, of course, your headphones need to be plugged in for the swap to work at all.
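There's no foolproof way to spot a retasked jack from software, but it doesn't hurt to know what input devices your operating system thinks it has. This sketch, again assuming Python and the sounddevice package, just lists them; an input you don't recognize, or a headphone jack showing up as an input, is worth a closer look.

```python
# Sketch: list every audio input device the OS currently exposes.
# Assumes the third-party "sounddevice" package is installed.
import sounddevice as sd

for index, device in enumerate(sd.query_devices()):
    if device["max_input_channels"] > 0:     # keep inputs only
        print(f"[{index}] {device['name']} "
              f"({device['max_input_channels']} input channel(s))")
```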

2. Smart Toys
Concerns are also being raised about internet-connected smart toys. Several privacy groups have filed complaints with the Federal Trade Commission (FTC) as well as with European Union regulators, accusing Genesis Toys and its tech partner Nuance of deceptive practices and violations of privacy laws.

The toys in question are the i-Que robot and the My Friend Cayla doll. The privacy groups claim that these toys record children’s voices without the necessary permission and send the recordings to Nuance. The fear is that these databases of recordings could be sold to police or intelligence agencies.

Another concern is the way the toys connect over Bluetooth. The toys have no pairing safeguards, which means any unauthorized Bluetooth device within range can connect to them.

Privacy groups say this could allow a stranger to listen in on children. In the formal complaint, they state that it could lead to “predator pursuit and physical danger”.
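To get a feel for how visible unprotected gadgets are, here's an illustrative Python sketch using the third-party bleak package to scan for nearby Bluetooth Low Energy devices. This is my own example, not part of the complaint, and some toys use classic Bluetooth rather than BLE, so they may not appear in this kind of scan; the point is simply that anyone within radio range can go looking.

```python
# Illustrative sketch: scan for nearby Bluetooth Low Energy devices.
# Assumes the third-party "bleak" package (pip install bleak).
# Some smart toys use classic Bluetooth instead of BLE, so they may not
# show up here; this only demonstrates how easy nearby discovery is.
import asyncio
from bleak import BleakScanner

async def scan(seconds: float = 5.0) -> None:
    devices = await BleakScanner.discover(timeout=seconds)
    for device in devices:
        print(device.address, device.name or "(unnamed)")

if __name__ == "__main__":
    asyncio.run(scan())
```

Run it on a laptop near a connected toy and see whether the toy shows up in the list.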
