Facebook’s oversight board is finally here. How will it affect content moderation?
Facebook now has an oversight board that will have final say over what content stays or goes on the social media platform. Its 20 members will serve as a high court of sorts to weigh tricky decisions of free speech versus harmful content.
Facebook committed to creating an oversight board two years ago, a day after The New York Times published an investigative report about Facebook’s handling of high-profile incidents such as Russian interference in the 2016 presidential race.
The company has been criticized for past content moderation decisions, and has denied accusations that it more heavily targets conservative-leaning posts for removal.
The board is launching with those 20 members and is expected to grow to 40.
Members include a former prime minister of Denmark, a former human rights judge and a Nobel Peace Prize winner.
The group will decide some of the thorniest issues surrounding content moderation on Facebook and its sibling site Instagram, such as hate speech, harassment, safety and privacy.
The board’s four co-chairs say their decisions will be final and binding. It’s funded through a $130 million trust set up by Facebook and is supposed to operate independently.