Facial Recognition Is the Worst of What AI Has to Offer, Especially for Children
Children may be in grave danger from the use of AI-powered facial recognition
Protecting children online has never been easy. And with the advent of public-facing facial recognition search engines, that risk is growing. According to reporting by The Intercept, a startup called PimEyes makes it disturbingly easy to find “potentially explicit” images of children, not to mention almost anybody else.
The face-searching site launched in 2017 and has been a nuisance ever since. It lets users — free of charge — track down online photos of anyone, and all it takes is uploading just one image of the target person. In other words, it’s like a reverse image search except far more intrusive, since it can find photos that aren’t anywhere near an exact match.
The tool actually used to be even more invasive. Before developers removed the feature in the wake of public backlash, PimEyes was able to search for photos on social media sites as well.
Its owner, Giorgi Gobronidze, even facetiously admitted to The Intercept that the service was “tailor-designed for stalkers,” though he rationalized that the company has subsequently cleaned up its act by no longer crawling social media.
While PimEyes claims that it’s only meant to be used in self-searches and is “not intended for the surveillance of others,” it nonetheless provides paid subscription tiers that allow users to search up to 25 times per day, which seems like a lot of searches per day just to check up on your own face. “This is just another example of the large overarching problem within technology, surveillance-built or not,” Electronic Frontier Foundation staff technologist Daly Barnett told The Intercept.
“There isn’t privacy built from the get-go with it, and users have to opt-out of having their privacy compromised.”
Readily providing anyone with an internet connection the ability to trace a person by their photo has troubling implications on its own, but when that capability also compromises the privacy of children, the ramifications get even uglier. Child advocacy groups have championed the use of facial recognition tech to combat child trafficking, but when that tech gets into the hands of creeps, it can also hand them powerful new stalking tools.
The report details several test searches on PimEyes using AI-generated faces of children. Distressingly, the searches yielded photos of actual kids, along with potentially identifying details on the websites those photos were linked to.
In The Intercept’s testing, PimEyes was able to dig up children’s faces on everything from personal family blogs to school photos. Most disturbing of all, PimEyes returned images of children labeled as “potentially explicit” while still providing a link to the source website. We don’t need to spell out just how easily stalkers and predators could use that function to exploit children.
“The fact that PimEyes doesn’t have safeguards in place for children and apparently is not sure how to provide safeguards for children only underlines the risks of this kind of facial recognition service,” Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center, told The Intercept. “Participating in public, whether online or offline, should not mean subjecting yourself to privacy-invasive services like PimEyes.”
“Congress needs to act to not only protect our children, but all of us from the dangers of facial recognition technology,” Scott added. “Services like this should be banned. That’s how you should regulate it.”