Facebook, committed to connecting people, now must decide who to disconnect

Today's expression: Come to a head
Explore more: Lesson #90
October 1, 2018

Technology companies like Facebook, Twitter, Apple, and even Spotify are now in the uncomfortable position of policing content and deciding which users to disconnect from their services. They must navigate hateful speech, people inciting violence, and even foreign governments trying to influence elections. It is not a job they enjoy. Plus, learn the English phrase "come to a head."


Who decides what’s allowed on social media? Increasingly, it’s the social media companies themselves

Facebook, Twitter and other social media companies are facing an increasing number of ethical dilemmas with respect to free speech. If someone says something offensive, hurtful, inaccurate, bigoted, or—God forbid—something that provokes violence, what should the social media companies do about it?

Hi everyone, welcome to Plain English, the perfect podcast for learning English. Right here, twice a week, every Monday and Thursday, we pick one thing from the news and talk about it at a slower pace for people who are practicing their English listening. Long-time listeners also know we pick one word or phrase to talk about in detail in the second half of the program. Today that phrase is “come to a head.” So listen for that phrase in the first part of the program, and we’ll talk more about it later on.

Today is episode 90, so that means you can see the episode transcripts with instant translations of key words and phrases into Spanish, Portuguese, French, Chinese, Japanese and Italian at PlainEnglish.com/90.


Social media companies are now more active in policing content

If someone says something on social media that provokes violence against another person, should that person be banned from Facebook? What if someone just spreads lies and misinformation? These are ethical dilemmas that the social media companies increasingly have to answer—reluctantly.

This is not the role that Mark Zuckerberg, the founder of Facebook, wanted when he started his social media website. Facebook was all about connecting people. You can criticize the company for any number of things, but all it ever wanted to do was grow, get users to participate on its platform, and sell ads that would appear in front of all those eyeballs. But now the company founded to connect people has to decide which of its users to disconnect. And it’s not an easy job.

The issue has come to a head in the last few months. For starters, Apple, Facebook and YouTube removed content produced by a website called Infowars, saying the site repeatedly glorified violence and promulgated hate speech, in violation of their community standards. The site’s controversial founder immediately said it was a case of big-tech censorship.

Right around the same time, WhatsApp, which is owned by Facebook, came under pressure to control forwarded messages. That’s because in India, somehow, and this is really sad, false rumors accusing people of being child kidnappers spread like wildfire; people forwarded these false rumors to thousands of others at a time. As a result, huge mobs who believed these rumors attacked innocent people. Over 20 people have been killed by mobs, all because of false rumors. WhatsApp recently limited the number of people you can forward a message to; the limit is now 200 people.

And in Myanmar, a Muslim minority community called the Rohingya have been victims of political violence, and people have been using Facebook to incite hatred and violence against that community. Facebook, based in California, polices the platform for offensive posts, but it didn’t have many employees who spoke Burmese, the language of Myanmar, so it missed a lot of the offensive content. Some of the things that were posted were really, really bad, almost worse than you can imagine one human being ever saying about another. Facebook was caught flat-footed, without enough Burmese speakers to police that content.

How did we get here? The social media companies originally took the position that they weren’t responsible for the content on their platforms. They were just the technology; don’t complain to them about what users posted. That hands-off approach tended to work when most of the content was people in the United States posting on Facebook about what they were doing that day, or uploading family videos to YouTube. The companies could selectively take down whatever was blatantly bad, and the stakes were relatively low.

But over the years, social media got to be a bigger and bigger part of our lives. And life is complicated; life is messy; in many places, life is dangerous and violent; and some of the messiness of life spilled onto social media. In 2010, a college student in New Jersey was secretly filmed in an embarrassing private moment; when the video was shared on Twitter, he was so ashamed that he committed suicide. In 2016, groups of people in Russia and Eastern Europe bought targeted ads on Facebook that attempted to influence the votes of people in the United States. Bad actors around the world are now using Facebook to their advantage: after hiring a cybersecurity expert from the Obama administration, Facebook pulled down hundreds of profiles from Iran and Russia that foreign governments were using to influence public opinion in the United States.

And now the examples I shared earlier: the offensive speech, the incitement to violence. These technology companies are now so big, and so profitable, that they can’t just excuse themselves from being the policemen of their own platforms. The public, including elected officials in Europe and the United States, now expects the companies to act responsibly. So each now has elaborately designed community standards that try to balance the ideal of free speech, letting everyone participate, with the reality that they have to actively pull the plug on some of the worst types of content. The trouble is determining what is what. If something is merely offensive, should it really be taken down? And if it’s offensive and might cause violence, then what?

Facebook, Apple, and all these companies now employ thousands of people whose whole job is to look at flagged content and determine what violates the community standards. Now think about what that job could do to a person: if your whole job is viewing offensive videos and speech, how do you sleep at night?


You can say what you want about Facebook: they’re too big, they only care about profit, they sell your data, whatever you want to say. But I’ll tell you, Mark Zuckerberg never asked for this. Neither did the founders of Twitter or Google. Never in their worst nightmares did they think they would personally be responsible for controlling crazed mobs in India, preventing election interference from Russia, or stopping ethnic violence in Myanmar. I don’t envy their position.

On a lighter note, how about we say hi to a couple of listeners? First up, Margarita in Germany. She is studying chemistry and is just starting out in English. Great to have you along, Margarita; welcome to this great and complicated language, and happy Oktoberfest! I also want to say thank you to Marcela, who left a very nice review of the show on Facebook. She said the topics are interesting and it’s nice to stay informed as you practice listening. That’s a good point, Marcela, and not only do you all get to stay informed about things going on in the world, but I also get to stay informed. This is good for me too. So thanks to my two M’s, Margarita and Marcela, for saying hi this week.

Here’s a quick reminder that JR and I send out episode summaries every Monday and Thursday. They summarize the main topic, but they also have links to English articles about the main topic. So if you like the topic, you can choose to read more about it, all in English. If that sounds like something you’d like to get, then go to PlainEnglish.com/mail and enter your details.




Expression: Come to a head