Is This The ‘Most Disturbing Website Ever’?

Recently, all you need to do is turn on the news to see the latest story about the dangers artificial intelligence (AI) reportedly poses to our society. But one tool that has continued to cause a stir since its release in 2017, and that has recently been making waves in the press, is the AI-powered online facial recognition search engine PimEyes.

Imagine if you could find every single picture of yourself that has ever existed online in one place: every awkward teenage photo, every fleeting Instagram story of you and your friends, every drunken selfie you’ve ever posted.

This is the purpose of PimEyes. Thanks to AI, this website will find every photo of you that exists online, even the ones you intended for no one to see.

Because of its alarming and intrusive capabilities, PimEyes has since been dubbed ‘the most disturbing AI website on the internet’ by several publications, including UNILAD, as it enables anyone with a photo of you to find potentially inappropriate or explicit pictures with disturbing ease.

How Does The Website Work?

To see every photo you appear in, you simply give PimEyes a picture of yourself, and the website then uses it to gather every image of you it can find on the internet.

A user on TikTok known as @tcmsecurity posted a video of his experience with the site to show others how it works. Whilst demonstrating how to use PimEyes, the user also commented that he felt it was “super scary to see all of these pictures in one place off of one photo. This is nothing that Google Image search or any other search engine can do”.

“This wins my creepiest tool of the day I found on the internet”, the user concluded.

An employee at LADbible also posted about their experience with the site and found that the images PimEyes returns are not always accurate. Writing for LADbible, they stated: “It’s perhaps better at finding your doppelgangers than it is at tracking down every picture of you on the internet, but it is incredibly fast.”

“The first two were pictures of me, though the remaining six were images of other people who shared some similar facial features, mostly the eyebrows and the beard.”

So, it seems that PimEyes has a little way to go before it can be considered a fully accurate facial recognition engine. Nevertheless, what this website can do should still be considered a huge concern.

Are There Any Benefits To What PimEyes Can Offer?

It’s difficult to see how a website that collects all of your photos could be beneficial. However, it does offer one advantage: PimEyes helps users find unwanted photos of themselves, such as photos posted without consent, so they can have them removed from the internet.

According to the site, PimEyes can help you exercise the rights set out in the GDPR and the DMCA.

The General Data Protection Regulation (GDPR) is a European Union law that has been mirrored by many countries outside the EU. One of its provisions, often known as the ‘right to be forgotten’ (formally, the right to erasure), states that you can request the deletion of personal data concerning yourself, including pictures.

Furthermore, the Digital Millennium Copyright Act (DMCA) is a United States copyright law whose takedown process is used by websites around the world. It focuses on your rights as the author of a photo: in short, if someone steals a photo of yours and publishes it online, you have the right to demand its removal.

The rights set out by both the GDPR and the DMCA are important ones but are, alas, difficult to enforce, as people can get hold of your pictures and share them around very easily. Once that has happened, it can be hard to track them all down and have them removed.

PimEyes makes it much easier for an individual to track down all the unwanted photos of themselves and delete them.

The Dangers Of PimEyes

Whilst PimEyes may offer the benefit of tracking down any unwanted photos you want to get rid of, many still find it a dangerous and disturbing tool.

Some have even called the website ‘a stalker’s dream’; if a person can get their hands on just one photo of you, they may enter it into PimEyes and draw up every online picture of you in existence.

At the end of last year, the privacy rights group Big Brother Watch even filed a legal complaint against PimEyes with the UK’s Information Commissioner. The complaint alleges that PimEyes unlawfully processes the biometric data of millions of UK citizens.

PimEyes allows anyone to upload an image of a person to their website, which is then processed using facial recognition technology to find potential matches from an index of billions of photos on the internet.
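PimEyes has not published how its matching works, but face search engines of this kind generally convert each face into a numerical ‘embedding’ and then compare embeddings for similarity. The Python sketch below illustrates that general idea only; it uses random vectors in place of real face embeddings, and the function names and the 128-dimensional embedding size are assumptions for illustration, not PimEyes’s actual system.

```python
import numpy as np

def cosine_similarity(query, index):
    """Cosine similarity between one embedding and every row of an index."""
    query = query / np.linalg.norm(query)
    index = index / np.linalg.norm(index, axis=1, keepdims=True)
    return index @ query

def top_matches(query, index, k=3):
    """Return the row indices of the k most similar embeddings, best first."""
    scores = cosine_similarity(query, index)
    return np.argsort(scores)[::-1][:k]

# Toy data: 1,000 random 128-dimensional "face embeddings".
rng = np.random.default_rng(0)
index = rng.normal(size=(1000, 128))

# A query that is a slightly noisy copy of entry 42 -- like a new
# photo of a face already in the index.
query = index[42] + rng.normal(scale=0.05, size=128)

print(top_matches(query, index))  # entry 42 should rank first
```

A real system would build its index by crawling billions of photos and running a face-embedding model over each one; the search step is then the same nearest-neighbour comparison shown here, at vastly larger scale.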

The website places no limits on the type of images that may be searched for and has no safeguards to prevent people from searching for photos of someone other than themselves.

Big Brother Watch believes that because of this, PimEyes is likely to process millions of facial images of UK citizens without their knowledge or consent. The website could therefore be used for sinister purposes such as searching for images of an inappropriate nature.

The privacy rights group worries that because there is no limit on the age of individuals in the photos searched, PimEyes could also be used to track children across the web to find pornographic material.

Madeleine Stone, Legal and Policy Officer at Big Brother Watch said: “PimEyes enables privacy intrusion and stalking on a scale previously unimaginable. This facial recognition search engine lawlessly scans billions of our photos without our knowledge or permission.

“Images of anyone, including children, can be scoured and tracked across the internet. This extraordinary power is available to anyone at the click of a button and could be secretly used by potential employers, university admissions officers, domestic abusers or stalkers.”

Stone’s statement brings attention to how dangerous the artificial intelligence PimEyes uses can be, but this is not the first time the abilities of AI have raised concern. Apple co-founder Steve Wozniak and Skype co-founder Jaan Tallinn are just two of the many who have signed an open letter calling on all labs training the most powerful AI systems to pause for at least six months.

“Recent months have seen AI labs locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict or reliably control,” the letter reads.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable.”

This letter echoes the sentiment behind Big Brother Watch’s complaint: that AI can produce applications and websites that are unsafe, inaccurate, opaque and dangerous to the public.

PimEyes is just one example of how artificial intelligence is being used to exploit our personal information. But are we already too far down the rabbit hole to stop whatever harm AI-powered facial recognition does next?