Facebook censors iconic napalm photo: Are algorithms undermining news?
A Norwegian newspaper editor publicly censured Facebook founder Mark Zuckerberg after his social media platform took down a famous wartime photograph, citing concerns about child pornography.
Facebook’s decision to censor the famous photo of a naked child fleeing a napalm attack during the Vietnam war has prompted questions about Facebook’s policies and its use of algorithms to control the content posted online.
"The media have a responsibility to consider publication [of stories] in every single case," wrote Espen Egil Hansen, editor at Norway’s largest newspaper, in an open letter to Mr. Zuckerberg. "This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California."
Facebook defended its policies, saying that it is "difficult to create a distinction between allowing a photograph of a nude child in one instance and not others."
The controversy began after Norwegian writer Tom Egeland posted an album of historic war photographs on the site, including Nick Ut’s iconic photo of Kim Phuc fleeing the napalm attack. Facebook responded by removing the Pulitzer Prize-winning photo and later suspending Mr. Egeland when he posted a response on his account.
Mr. Hansen, Egeland’s editor at Aftenposten, refused to take down the photo and accused Zuckerberg of abusing his power as the “world’s most powerful editor.”
“The free and independent media have an important task in bringing information, even including pictures, which sometimes may be unpleasant, and which the ruling elite and maybe even ordinary citizens cannot bear to see or hear, but which might be important precisely for that reason,” Hansen wrote.
Egeland initially posted the photo series, including the image that Facebook later removed, as part of a discussion about the way photographs changed warfare.
Even the photo’s subject, Kim Phuc, objected to Facebook’s decision. As a spokesperson told Norway’s Dagsavisen, "Kim is saddened by those who would focus on the nudity in the historic picture rather than the powerful message it conveys."
It all comes down to algorithms. Facebook officials recently decided to rely more heavily on algorithms after claims that the company’s human news curators were selecting news items with a pro-liberal bias. As computer programs, algorithms like the one behind Facebook’s trending news feature and its photo-moderation system were intended to do away with bias. Yet critics say that algorithms’ reputation for neutrality is a myth.
Algorithms do not find an objectively right solution, technology and society expert Zeynep Tufekci wrote in a May op-ed for the New York Times. Instead, they “optimize” the solutions they provide based on parameters their creators give them.
For Facebook’s news algorithm, this means that the news results are sorted by those likely to best increase engagement with the site. For the site’s photo sorting algorithm, it means that no matter the thematic content of the photograph – that is, whether real child pornography or a historically transformative image – the same set of rules governs whether or not the photo will remain online.
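The point can be made concrete with a minimal sketch. The filter below is hypothetical, not Facebook’s actual code; it simply illustrates the article’s claim that a single, context-blind rule set evaluates every photo by the same parameters, so historical significance never enters the decision.

```python
# Hypothetical single-rule moderation filter. The only input the rule
# "sees" is the parameter its creators chose; context is not a parameter.

def moderate(photo: dict) -> str:
    """Return 'remove' or 'keep' using one context-blind rule."""
    if photo.get("contains_child_nudity"):
        return "remove"
    return "keep"

# Both photos trip the same rule, because the rule has no way to
# distinguish a historically transformative image from any other.
war_photo = {"title": "The Terror of War (Nick Ut, 1972)",
             "contains_child_nudity": True}
ordinary_upload = {"title": "unknown upload",
                   "contains_child_nudity": True}

print(moderate(war_photo))       # remove
print(moderate(ordinary_upload)) # remove
```

Any distinction between the two cases would require adding context as an input to the rule, which is exactly the case-by-case editorial judgment Hansen argues algorithms cannot replace.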
Creating just one set of rules to govern a world of expression, says Hansen, limits the ways in which human beings can relate to one another. And in cases like the censored war photo, Facebook’s narrow definition of what is acceptable and its refusal to consider images on a case-by-case basis can severely reduce the kind of dialogue that members of the global community most need to engage in.
"You are a nice channel for persons who wish to share music videos, family dinners and other experiences. On this level you are bringing people closer to each other. But if you wish to increase the real understanding between human beings, you have to offer more liberty in order to meet the entire width of cultural expressions and discuss substantial matters," wrote Hansen.
"Editors cannot live with you, Mark, as a master editor."