Facebook Won't Address Problems Until Controversy Comes [Content Made Simple]

Also, how Facebook uses our off-Facebook internet activity to manipulate us.

TOP OF THE WEEK

FACEBOOK WAITS FOR CONTROVERSY TO FORCE CHANGE

Facebook doesn’t fix its shit until there’s a controversy — that needs to change

Facebook will take advantage of its users for as long as it can, fixing problems only once they erupt into controversy. That pattern needs to change.

[READ MORE]

Quote:

After multiple stories in India about how Facebook executives courted politicians for years, the company can’t even pretend to not understand the culture.

Does Facebook know better? For sure. Does it want to be proactive and take action? Its actions suggest that’s not the case.

The biggest social network in the world has done it time and time again: it takes no action against hate speech despite reports from numerous people.

There’s a well-defined cycle. Wait for media reports to emerge. Issue statements of apology. Take some minor rectifying actions. Rinse. Repeat.

Commentary:

I have written about this very phenomenon repeatedly in this newsletter, and I'm writing about it at length in my book. Facebook, blinded by pride and selfish gain, avoids fixing the problems it profits from for as long as possible. Only when a problem creates a PR crisis that threatens Facebook's long-term value does the company act to curtail misuse of its platform.


ON THE POD

We have no new podcast episode this week, as we took the week off for the Labor Day holiday.

But you can check out a social media podcast with Jonathan Howe, Elizabeth Hyndman, and me anytime you want right here.


HITTING THE LINKS

Link #1: How Facebook Manipulates Us Part 3: Our Off-Facebook Internet Activity.

This is part three of a five-part series I am writing about how Facebook uses its power as the largest social media platform in the world to manipulate people. You can access parts one and two at the top of part three. Part four will be published Thursday and part five next Monday.

Allow me to introduce you to the Facebook Pixel, a small piece of code installed on countless websites that sends your web activity back to Facebook to give it more information about you. The Facebook Pixel is why you see advertisements for workout apparel after buying a new pair of running shoes. It's why you see an advertisement for Tylenol after searching, "Is my headache a sign of stress or a brain tumor?" No one knows how many websites on the internet have the Facebook Pixel installed...well, except probably Facebook.
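To make the mechanism concrete, here is a minimal sketch of the idea behind any tracking pixel (this is illustrative only, not Facebook's actual code; the function name, cookie IDs, and URLs are all hypothetical). Each page that embeds the pixel causes the visitor's browser to request a tiny resource from the tracker's servers, sending along an identifying cookie and the page URL. Aggregating those requests across sites reconstructs a browsing profile:

```python
from collections import defaultdict

def build_profiles(pixel_requests):
    """Group pixel hits by visitor cookie to reconstruct each
    visitor's cross-site browsing history (illustrative sketch)."""
    profiles = defaultdict(list)
    for cookie_id, page_url in pixel_requests:
        profiles[cookie_id].append(page_url)
    return dict(profiles)

# Simulated pixel hits arriving from three unrelated websites:
hits = [
    ("visitor-123", "https://shoestore.example/checkout"),
    ("visitor-456", "https://news.example/politics"),
    ("visitor-123", "https://fitnessblog.example/marathon-training"),
]

profiles = build_profiles(hits)
# visitor-123's profile now links a shoe purchase to marathon-training
# reading on a different site, which is exactly the kind of signal
# that drives the workout-apparel ads described above.
```

The key point is that no single website hands over your history; the tracker assembles it itself, one embedded request at a time.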

Link #2: Facebook Moves to Limit Election Chaos

Facebook has decided to try to limit election ads within the last couple of weeks leading up to the election. This isn’t a preemptive action so much as it is a four-years-late response to the election meddling that took place on its platform in 2016.

On Tuesday, Facebook said the Kremlin-backed group that interfered in the 2016 presidential election, the Internet Research Agency, tried to meddle on its service again using fake accounts and a website set up to look like a left-wing news site. Facebook said it was warned by the Federal Bureau of Investigation about the Russian effort and removed the fake accounts and news site before they had gained much traction.

Thursday’s changes, which are a tacit acknowledgment by Facebook of how powerful its effect on public discourse can be, are unlikely to satisfy its critics. Some of its measures, such as the blocking of new political ads a week before Election Day, are temporary. Yet they demonstrate that Facebook has sweeping abilities to shut down untruthful ads should it choose to do so.

Link #3: Facebook Fires Employee Who Gathered Evidence of Conservative Favoritism

The largest social media platform in the world is dealing with as much fighting inside the company as among its users.

On July 22, a Facebook employee posted a message to the company’s internal misinformation policy group noting that some misinformation strikes against Breitbart had been cleared by someone at Facebook seemingly acting on the publication's behalf.

“A Breitbart escalation marked ‘urgent: end of day’ was resolved on the same day, with all misinformation strikes against Breitbart’s page and against their domain cleared without explanation,” the employee wrote.

The same employee said a partly false rating applied to an Instagram post from Charlie Kirk was flagged for “priority” escalation by Joel Kaplan, the company’s vice president of global public policy. Kaplan once served in George W. Bush’s administration and drew criticism for publicly supporting Brett Kavanaugh’s controversial nomination to the Supreme Court.


THE FUNNY PART

If you like this, you should subscribe to my free newsletter of funny content I find online. It’s called The Funnies. It delivers on Saturday mornings.

You can subscribe to The Funnies here.

Remember, if you’d like additional access to the two blog posts per week I’m writing behind a paywall, you can upgrade your subscription. Just click below.