Facial recognition to make smart glasses even more of a privacy nightmare
BY CHARLIE FRIPP, KOMANDO.COM
FEBRUARY 15, 2022
In the latest Watch Dogs video game, the protagonists have a wealth of technology available at the push of a button. They can quickly scan people around them through a mobile phone. This brings up details like their name, occupation and personality traits.
That might seem like the conjuring of an overactive imagination, but Watch Dogs has always been somewhat rooted in reality. In fact, the first game in the series drew a lot of inspiration from the real world’s devastating Stuxnet virus.
Think we are still years away from the same technology as in Watch Dogs? Think again. The tech already exists, and you’ll never know if you have been captured by it. Read on to see how smart glasses with facial recognition technology could be a privacy nightmare.
Here’s the backstory
If the name Clearview AI rings a bell, it should. The company has been at the center of many privacy concerns. It was thrust into the spotlight in 2020, when a New York Times report revealed that law enforcement agencies have access to Clearview’s facial recognition system.
Has your local police department used facial recognition software? Tap or click here to check this database and find out.
While some agencies have solved crimes using the technology, there are fears that it could soon become available to the public. The biggest concern is how the technology works and the little information it needs to get results.
The tech is so powerful that it can identify a person from a single photograph. It can also scan government databases to gather relevant information.
Unfortunately, with so many photos from government records like driver’s licenses and passports available, it is only a matter of time before people use it for nefarious purposes. The FBI alone has a database of over 700 million photos.
But the technology is now moving into even more dangerous territory. The New York Times recently revealed that Clearview AI signed a $50,000 contract with the U.S. Air Force Research Laboratory to incorporate the tech into augmented reality glasses.
The intent is to increase security around airfields and military bases by having officers wear augmented reality glasses that could, in theory, quickly scan nearby faces for identification. However, the research lab promptly pointed out that no devices were delivered with the contract.
The department stressed that the collaboration with Clearview AI is limited to evaluating the “scientific and technical merit and feasibility” of such smart glasses.
What you can do about it
Thankfully, there isn’t anything you must do. Yet. The technology may still be under development, but with so many government agencies eager to fully (or stealthily) adopt it, it might be here sooner than you think.
The bulk of Clearview AI’s data comes, unsurprisingly, from social media networks. Concerned that your image could be included in such technology? Now might be a good time for a digital cleanout.
We’ve highlighted ways to delete yourself from the internet, but there are other methods to keep your data out of other people’s hands.
For example, if deleting a social media profile feels too drastic, you can make it private instead. Check your security settings, especially on Facebook and Instagram, and adjust who can tag you in photos.