Jason Koebler and Joseph Cox:

This spring, Facebook reached out to a few dozen leading social media academics with an invitation: Would they like to have a casual dinner with Mark Zuckerberg to discuss Facebook’s problems?

According to five people who attended the series of off-the-record dinners at Zuckerberg’s home in Palo Alto, California, the conversation largely centered on the most important problem plaguing the company: content moderation.

In recent months, Facebook has been attacked from all sides: by conservatives for what they perceive as a liberal bias, by liberals for allowing white nationalism and Holocaust denial on the platform, by governments and news organizations for allowing fake news and disinformation to flourish, and by human rights organizations for its use as a platform to facilitate gender-based harassment and livestream suicide and murder. Facebook has even been blamed for contributing to genocide.

These situations have been largely framed as individual public relations fires that Facebook has tried to put out one at a time. But the need for content moderation is better understood as a systemic issue in Facebook’s business model. Zuckerberg has said that he wants Facebook to be one global community, a radical ideal given the vast diversity of communities and cultural mores around the globe. Facebook believes highly nuanced content moderation can resolve this tension, but it is an unfathomably complex logistical problem that has no obvious solution, that fundamentally threatens Facebook’s business, and that has largely shifted the role of free speech arbitration from governments to a private platform.