
With 'Napalm Girl,' Facebook Humans (Not Algorithms) Struggle To Be Editor

This is a June 8, 1972, file photo of South Vietnamese forces following after terrified children, including 9-year-old Kim Phuc (center). The Pulitzer Prize-winning image is at the center of a heated debate about freedom of speech in Norway after Facebook deleted it from a Norwegian author's page.
Nick Ut / AP

It is tempting to make every fiasco at Facebook about the power (and the abuse of power) of the algorithm. The "napalm girl" controversy does not neatly fit that storyline. A little-known team of humans at Facebook decided to remove the iconic photo from the site this week.

That move revealed, in a klutzy way, just how much the company is struggling internally to exercise the most basic editorial judgment, despite claims by senior leadership that the system is working.

Zuckerberg's Silence

In a public address in 2014, CEO Mark Zuckerberg said his company's goal is to be the "perfect personalized newspaper for every person in the world."

Well, if that's the case, Facebook doesn't show much deference to the editorial wisdom of newsrooms.

Back in 1972, in the bloodied fields of the Vietnam War, an Associated Press photographer took a photo of children screaming, mouths wide open, as they flee a napalm attack. One of them, a 9-year-old girl, is naked.

Fast-forward to 2016. A Norwegian writer shared that image and six others on Facebook, in a post about photos that "changed the history of warfare." His account was suspended; according to Facebook, he had violated its bans on nudity and child pornography.

The decision was extraordinary. A photo deemed so significant that it won a Pulitzer Prize did not make the cut for Facebook's Community Standards — the rules on what you can and cannot post.

It took a global campaign — with the editor-in-chief of a Norwegian newspaper running a front-page letter to Zuckerberg, with the Norwegian prime minister re-posting the now-banned photo, and with the world watching — for Facebook to back down, days later, and say: OK, we'll let the photo stay after all.

Notably, Zuckerberg did not issue a public apology. He stayed silent.

The company put a vice president in charge of media partnerships, Justin Osofsky, out front. Osofsky wrote in a post that Facebook "made a mistake" but, on the bright side, "one of the most important things about Facebook is our ability to listen to our community and evolve, and I appreciate everyone who has helped us make things right."

It was a teachable moment, and the lesson was learned.

Facebook users had different reactions. Uma Venkatraman shot back in a comment: "A simple 'sorry, we were wrong' would have been more effective."

The Humans At The Wheel

Facebook does use algorithms to decide what viral stories should be "trending news," even when the "news" is factually false. But in this "napalm girl" controversy, humans are at the wheel.

One could point out that Facebook has algorithms to spot what might be child porn, in order to take it down. But in this instance, it was a human who flagged the post and a human inside the company who decided to hit the delete button.

In fact, with very few exceptions (for example, for spam attacks), people at Facebook are the ones manually removing content, according to Monika Bickert, Facebook's head of policy.

Bickert has a curious dual role. She's responsible for deciding what stays up and what comes down on the site; and she has to travel the world and kiss the rings of global leaders who are angry with Facebook. She faces the unenviable challenge of developing global standards that work across markets, striking the right balance between permitting free speech and suppressing harmful content.

In a July interview with NPR about hate speech in the U.S., Bickert explained that humans have to remain in charge of moderating posts because of the problem of context.

Algorithms can't make out if a racial slur is being used to attack a person or as social commentary in a rap song. Algorithms can't make out if a violent photo is mocking victims or educating the public. The technology simply isn't there yet.

"Context is so important, it's critical," Bickert said and repeated for emphasis. "Context is everything."

Her commitment to context is so clear, it's hard to imagine how the "napalm girl" photo ever got taken down. Unless, that is, you look at which humans are making the call.

Facebook did not reveal details of its internal decision-making process. NPR scraped LinkedIn for the resumes of a few hundred employees and contractors in the "community operations" teams, the self-described "safety specialists" in charge.

The team members are scattered around the world — in California, Ireland, India. Many are recent college grads with little apparent training for what would be considered, in legacy newsrooms, very tough calls that only veteran editors get to make.

And the volume of work is extraordinary. While Bickert would not say how many posts Facebook removes on average, her colleague Osofsky shared in his non-apology: "It's hard to screen millions of posts on a case-by-case basis every week."

Repeat: millions. That would make the unit an editorial sweatshop.

Market Forces

Facebook did not have to remove "napalm girl" under law, according to Thomas Vinje, an attorney in the European Union who represents American internet companies. The idea that anyone in the company had a legitimate concern about liability is "unimaginable," he says. "Anyone with basic knowledge knows it's a very famous picture. It's a strange situation."

The episode is revealing about the business. While Facebook maintains it is just a platform (and thus not liable for the content its users choose to share), the company is also a multinational entity trying to build a digestible product: a digital Coca-Cola that everyone, from hipster San Francisco to conservative Jordan, can enjoy.

That'll take a lot of editorial judgment.

Vinje says that in this respect, Facebook could be playing a dangerous game. If the company goes too far in pulling content at its discretion, "then they cross the line into becoming something at least akin to a publisher."

As traditional newsrooms know, with great power comes great responsibility.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.