Among the thousands of photos we publish on social media, many are extracted without our authorization and routinely used by developers to build facial recognition systems.
A recently launched tool called Exposing.AI promises to help users find out whether their photos are among those collected for training, testing or improving biometric technologies. The site currently searches 3,630,582 Flickr photos.
To use the tool, just go to the website and enter your Flickr username, a photo URL or a hashtag into the search bar; the tool then scans that set of more than 3.6 million photos to see whether your images are included.
To produce a result, the search engine matches Flickr identifiers such as the username and the photo ID. When an exact match is found, the results are loaded directly from Flickr and displayed on screen.
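The exact-match lookup described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Exposing.AI's actual code: the index structure, function name, and the usernames and photo IDs below are all invented for the example.

```python
# Toy "dataset index" of Flickr identifiers, as (username, photo_id) pairs.
# In the real tool this would cover the ~3.6 million indexed photos.
dataset_index = {
    ("alice_photos", "14273601885"),
    ("bob_travels", "9981234567"),
}

def find_matches(username=None, photo_id=None):
    """Return indexed entries that exactly match the given identifiers.

    Only exact matches count: a misspelled username or a photo that was
    never indexed returns nothing, mirroring the behavior described above.
    """
    return [
        (user, pid)
        for (user, pid) in sorted(dataset_index)
        if (username is None or user == username)
        and (photo_id is None or pid == photo_id)
    ]

# Querying by username returns every indexed photo from that account.
print(find_matches(username="alice_photos"))
```

In this sketch a query by username surfaces all of that account's indexed photos, while an unknown identifier simply yields an empty list, which is why the tool can only confirm inclusion, never prove absence from other, unindexed data sets.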
How does Exposing.AI work?
Exposing.AI grew out of MegaPixels, an independent art and research project led by Adam Harvey and Jules LaPlace with support from Mozilla, which investigates, from an ethical standpoint, the origins and individual-privacy implications of the image sets used to train biometric systems.
For the creators of Exposing.AI, Flickr was an obvious choice: the photo-sharing service is not only one of the most widely used sources in artificial intelligence research, but its content licenses are also among the most permissive.
The Next Web tested the tool and, right away, the second account it tried turned up on Exposing.AI; in other words, the system works. However, it is not yet possible to remove your face from data sets that have already been distributed, an option that may come in the future as laws and the sites' privacy policies change.