Facebook and other social media platforms cannot stop all disinformation and hate on the internet. But they can call a halt to fanning the flames.
On Tuesday, former Facebook product manager Frances Haugen gave more than three hours of testimony before Congress. She said Facebook knows its platform is being used to spread hate, violence and disinformation, but it doesn’t crack down because doing so would cut into profits. Haugen has made public tens of thousands of pages of Facebook research and documents that she says demonstrate the company knows its apps are harmful, including Instagram, which can have a toxic effect on teenage girls.
Facebook Chairman and CEO Mark Zuckerberg said Haugen’s claims “don’t make any sense.” And Facebook says users can always opt out of the algorithm that ranks content, though the company fails to note that doing so is not easy.
Congress should find a way to hold Facebook, Twitter, Instagram, TikTok and other social media companies responsible for how their algorithms promote harmful content.
Using artificial intelligence, the algorithms decide what information to show to users. When the algorithms are tuned to spread content that titillates and inflames segments of society, people spend more time on the sites and share more of that content, which increases company profits. But the content that gets the most clicks, shares and “likes” also tends to be the most polarizing. For society as a whole, it pushes people further into warring camps that aren’t listening to each other.
During the COVID-19 pandemic, for example, it seems probable that Facebook’s algorithms helped irresponsible claims from anti-vaxxers and mask opponents to go viral, undermining the nation’s effort to get the virus under control. Social media companies’ algorithms also often help to spread extremely destructive conspiracy theories.
In all likelihood, this is partly why some normally staid school board meetings around the country are being inundated with furious protesters spouting falsehoods and threatening violence over mask policies. Social media stands accused, deservedly so, of undermining the democratic process by spreading false information, at times so effectively that its reach exceeds that of legitimate news reports.
After fierce opposition from lawmakers, child advocacy groups and attorneys general, Instagram, which is owned by Facebook, recently announced it was putting a plan on hold to develop a version of its platform for children under the age of 13. Studies have shown that social media, full of bullying and debilitating messaging about body image and the like, is particularly bad for teenage girls.
In developing countries, Facebook has been used to facilitate human trafficking, aid drug cartels, incite violence against minorities and suppress dissent.
It doesn’t have to be that way. Other types of media, such as newspapers and television stations, have long had to take responsibility for the information they share with the public. Social media companies could, to some degree, be held to a similar standard.
Tristan Harris, co-founder and president of the Center for Humane Technology, recommends limiting the number of times a post can be shared with nothing more than a click, to keep disinformation from going viral. Haugen says Facebook should fix its algorithms so they don’t favor incendiary material. Sen. Bernie Sanders, I-Vt., says it’s time to break up Facebook, but that won’t change the incentive for social media companies to use the algorithms that are causing problems today.
It would be particularly hard for any single social media company to make needed changes unilaterally because that company would lose audience and profits. Congress needs to enact comprehensive reform that would encourage social media companies to rely less on artificial intelligence and give users more control.
Like other huge and seemingly intractable problems of the past — auto fatalities, for example — we can’t expect to eliminate the scourges of online troll farms, lies and hate. But just as driving, measured on a per-mile basis, has become less deadly over the decades, thoughtful regulations can make the internet a safer place for Americans.
Consign the dangerous fringe to the fringe.