Facial recognition systems can help you unlock your iPhone, track down offenders, and identify customers in stores. These systems are built on algorithms that identify human faces and hundreds of distinguishing characteristics. Although still imperfect, the technology is improving rapidly.

To get there, algorithms must be fed hundreds of thousands of images of faces with diverse characteristics. Increasingly, those photos are pulled from the internet, without the knowledge of the millions of people pictured. The images are then classified by age, gender, skin tone, and dozens of other attributes, and shared with researchers at universities and companies.

As the algorithms grow more capable, able to distinguish gender and skin color in ways they previously could not, legal experts and civil rights advocates have raised the alarm about researchers' use of people's images. Those faces were used without their owners' permission to build a technology that could later be used to surveil them.

"This is a troubling secret of artificial intelligence (AI) training: researchers often take whatever images they can find, indiscriminately," said Jason Schultz, a professor of law at New York University.

The latest company to enter this field is the technology giant IBM. In January, the company released a collection of nearly one million photos taken from the photo-sharing site Flickr and annotated to describe the subjects' appearance. IBM presented the collection to researchers as a major step toward reducing bias in facial recognition.

But the photographers whose images appear in IBM's dataset were surprised and unsettled when NBC News told them that their photos had been annotated in detail with facial contours and skin color, and could be used to develop facial recognition algorithms. (NBC News obtained IBM's dataset from a source after the company declined to share it, saying the images were available only to academic or corporate research groups.)

"None of the people I photographed knew that their images were used like this," Greg Peverill-Conti said; he is a Boston PR coordinator with more than 700 photos in IBM photo gallery. "This is a bit dishonest when IBM uses those images without saying anything to anyone," he added.

John Smith, who oversees AI research at IBM, said the company is committed to "protecting the privacy of individuals" and "will work with anyone who requests that a URL be removed from the dataset."

Yet although IBM has assured Flickr users that they can opt out of the database, NBC News found that removing those images is nearly impossible. What's more, IBM has not published the list of Flickr users and photos included in the dataset, so it is difficult to find out whose images have been taken. IBM did not respond to questions about this issue.

IBM says its dataset is intended to help academic researchers make facial recognition technology fairer. The company is not alone in using online images this way: dozens of other research organizations have also collected photos to train facial recognition systems, and most of those collections are scraped from the internet.

Several experts and activists argue that this practice not only violates the privacy of the millions of people whose images are taken, it also raises broader concerns about the advance of facial recognition technology, namely the fear that law enforcement agencies will use it to target minority communities.

Law enforcement's use of surveillance systems built on facial recognition is so contentious that a coalition of 85 civil rights and racial justice groups has called on technology companies to refuse to sell such systems to the government. They argue that the technology deepens existing prejudice and harms communities that are already subject to excessive policing and surveillance.

"People allow their images to be shared on many websites." Meredith Whittaker - co-director of Institute AI Now, research on social applications of artificial intelligence - said. "Now, they are not ready or unaware that the image appears in systems that could later put pressure on their own communities."

Source: NBC News