Clearview AI’s facial recognition technology, which has been at the center of controversy for some time, is about to be granted a patent! Here are the details…
Clearview AI is preparing to secure a patent for its controversial facial recognition technology. The company’s system, which is used by law enforcement agencies including the FBI, collects images of people from social media sites without their consent. The system’s database of billions of images has drawn heavy criticism for this reason. However, the company claims that the images it collects are public and therefore fair game.
The latest news, arriving in the middle of this debate, has drawn attention…
The Clearview AI facial recognition patent is very close!
According to CNET’s report, the company received a notice from the US Patent and Trademark Office (USPTO) that its application is set to be approved. Once the patent is granted, Clearview will be able to use the system on public internet data to match people against government lists and security camera footage.
(Photo: Thomas Peter/Reuters)
Clearview AI now only needs to pay an administrative fee to secure the patent for its facial recognition system. However, the fact that Clearview builds databases of scraped images has been criticized as “concerning”. In addition, the governments of several countries, such as Australia and the United Kingdom, hold that this kind of facial recognition violates their data protection laws.
Although there are no clear rules governing the use of this technology, it is becoming increasingly common. In theory, the system could be used to suppress dissent or covertly stalk people. However, in a statement to Politico, Clearview founder Hoan Ton-That said the company will sell the tool only to government customers (including law enforcement), with the aim of speeding up searches.
Ton-That argued that Clearview has no plans to sell to anyone other than government customers, and that it is important for such systems to be unbiased. Clearview also claimed that its patent would be the first facial recognition patent covering large-scale internet data.
In addition, Clearview’s CEO argued that the technology is not a surveillance tool but is intended to identify criminal suspects. Critics counter that such systems could one day allow a passerby to capture your image with a smartphone and then uncover personal data about you.