Flickr Photos Quietly Used by IBM for Facial Recognition Training
IBM took roughly a million photos from Flickr, used them to figure out how to train facial recognition programs, and then shared them with outside researchers.
As NBC points out, the Flickr photographs were used to develop facial recognition systems without the subjects' permission, and those systems could later be used for surveillance and to identify the very people pictured.
While the photographers may have had permission to take pictures of these people, one photographer told NBC News that the people photographed had no idea their images were being annotated for facial recognition and could be used to train algorithms.
“None of the people I photographed had any idea their images were being used in this way,” one photographer told NBC.
Read: Facial Recognition (What You Are About To See Will Shock You)
The photos were not originally compiled by IBM, by the way: they are part of a larger collection of around 99.2 million photos, known as YFCC100M, which Flickr's former owner Yahoo originally put together for research. All of these photos were shared under Creative Commons licenses, which is typically a signal that they can be used freely, with some limitations.
But the possibility that they could be used to train facial recognition systems to profile people by ethnicity, to take one example, may not be a use that even Creative Commons' most permissive licenses anticipated.
This is not a theoretical example: IBM built a video analytics product that used body camera footage to figure out people's races. IBM denied that it would "participate in work involving racial profiling," it told The Verge.
Read: Facial Recognition (Fun or Deception – Viral Trend of 2019)
It is also worth mentioning that IBM's original intentions may well have been to prevent AI (artificial intelligence) from being used against certain groups, though: when it announced the collection in January, the company explained that it needed a large dataset to help train for "fairness" as well as accuracy.