In Sweden on Sunday, three men were arrested on charges of raping a woman — live on Facebook.
And twice also in the Chicago area in the last month, we have been reminded what horrible things people will do to get attention on social media.
Why, we have to wonder, doesn’t Facebook do a better job of preventing, flagging and reporting such crimes?
In Sweden, the three young men allegedly assaulted a woman who was clearly half-conscious, perhaps intoxicated, according to prosecutors. The assault, one investigator said, was shown on a “closed group” Facebook page “where you could post rather special things. Even for that group this was not anything normal.”
What did Facebook have to say about this? The usual.
“This is a hideous crime and we do not tolerate this kind of content,” the company said in a statement.
Last week in the Chicago area, a girl who attends Stevenson High School in Lincolnshire allegedly had a classmate film an attack against a younger student and post it on Facebook. Last month, an 18-year-old special-needs student was held captive and assaulted in Chicago in an attack that was streamed live on Facebook.
Charges have been filed in both Chicago area cases, but the obvious lesson here is that Facebook must do more to keep its platform from being used to glorify violence. Facebook also should act more quickly to keep posts from being shared to other online sites, where they can linger even if they disappear from Facebook.
As it works now, when someone posts text, photos or videos that encourage violence, Facebook doesn’t react until a viewer who runs across the objectionable material flags the post. That alerts the “community operations team,” which, National Public Radio reported last fall, has several thousand people worldwide working out of such places as Manila, Philippines, and Warsaw, Poland. That’s a lot of workers, but Facebook is so huge that each flagged post is typically viewed for only seconds before a decision is made to leave it up or take it down.
We know it’s a complicated job. Some posts that may at first appear to be violent in nature might really be newsworthy. Last fall, for example, Facebook was properly criticized for taking down the iconic Vietnam War “Napalm Girl” photo because it showed a naked child. After public backlash, the post was restored.
But Facebook is a multibillion-dollar company. It can afford to do a better job. The company needs to take more seriously its role as a vast conduit of text and images, taking into account not only what people might see but also what the ability to widely share text and images — particularly violent ones — might inspire them to do.
To begin with, Facebook could be transparent about how many posts are flagged each day and how many are removed. And it should unleash its vast engineering resources to develop ways to respond faster and more accurately to posts involving violence.
Send letters to firstname.lastname@example.org