Now we know some of what Facebook knows about how it’s hurting us
Facebook knows a lot about how it affects its users, because it’s investigated possible negative impacts.
For instance, internal research showed that one of its algorithms actually encourages angrier content, and that Instagram, which it owns, makes body image issues worse for teen girls.
Even though it knows all this, Facebook doesn’t share the information, not when Congress asks and not even with its own oversight board. That’s the finding of an investigation out this week from The Wall Street Journal called “The Facebook Files.” It’s a topic for “Quality Assurance,” where we take a second look at a big tech story.
Jeff Horwitz is a reporter for The Wall Street Journal and an author of the series. He said the tech giant deserves credit for one thing. The following is an edited transcript of our conversation.
Jeff Horwitz: Of the tech companies, Facebook is the only one that I am aware of that has done this level of research into its own societal effects. And I think that is a really valuable thing, and I understand why the company would prefer to be able to do that work in private, but the stuff they’re uncovering, and have uncovered, is really serious.
Jed Kim: I mean, uncovering things is one thing, but what about when it comes to addressing them?
Horwitz: So I think that’s exactly the issue, right? They’ve hired Ph.D. data scientists, social psychologists, sort of political types, to look at what their platform is actually doing. And these people are coming up with some pretty uncomfortable things for the company: that it makes political discourse worse, that it makes young women dislike their own bodies, that it’s falling short in protecting vulnerable communities across the world. And not that much changes. Facebook still kind of operates the same way it always has, and I think the concern has frequently been that it’s just very focused on growth.
Kim: What are the risks to this kind of internal research continuing?
Horwitz: What would stop it?
Kim: No, I mean, now that they’re getting scrutinized for not acting on what they’re finding, is —
Horwitz: Yeah, Jed, that’s a, that is a deeply depressing thought.
Kim: That’s me.
Horwitz: Journalism can’t be a permanent sort of check on the company in terms of providing information. And so I think there’s a legitimate question as to whether this attention disincentivizes the company from doing the sort of research to understand itself that I think everyone wants it to do. But at the same time, there has to be some other solution to understanding what is happening inside a company that has such a transformative, really powerful product.
Kim: So you’re not seeing, like, a path forward to addressing it?
Horwitz: No, I think one of the interesting things is that, very clearly looking at the research, people internally do have solutions they’re proposing. And in fact, some of them have been tested and found to be somewhat effective, but they do tend to work against the company’s interests. And that’s, I think, a hard sell inside of a for-profit business.
Kim: So if they’re not addressing things satisfactorily, I gotta ask, what’s the point of doing the research?
Horwitz: I think some of it is just truly, absolutely necessary to avoid being blindsided. We’re talking about a company that has repeatedly declared “break glass” measures in recent months; that’s their term for sort of emergency steps to slow the platform down. If they don’t study this stuff, they have big problems on their hands. And we can talk about how we got to Jan. 6, but there’s no question that viral misinformation about elections is a problem. I don’t think just ignoring it completely and sticking their head in the sand is an option.
Kim: Well, OK, so tell me how this reporting started. Like, why focus on Facebook?
Horwitz: Aside from Facebook being my beat, it touches pretty much everything. I used to cover politics, and I realized that politics had basically turned into kind of a tech beat already. And I think that’s sort of the range of stories we’ve got this week, which deal with powerful political actors, overseas humanitarian crises, the mental health of teenagers, and the ability to literally flip a switch and change the tenor of political discourse around the world. It just illustrates the range and power of this platform.
Kim: And how did you? I mean …
Horwitz: I can’t tell you that (laughs).
Kim: (Laughs) I mean, just spell the name.
Horwitz: Yeah. It’s not a coincidence that a lot of the people Facebook has asked to work on deeply serious, sometimes deadly serious, societal issues end up talking to folks like me. As some of these former employees have noted on Twitter in recent days, they end up talking because they aren’t able to proceed with their work inside the company.
Kim: What role do you think the oversight board plays here?
Horwitz: That is very much to be determined. The oversight board has two powers. One of them is to issue binding rulings on specific Facebook posts that are generally months old, right? So [Donald Trump’s] removal from the platform was something they kind of have binding authority over. The other power they’ve got is the ability to ask Facebook questions, and under their deal, Facebook is required to respond and give them information, as long as the questions are relevant. And I think one of the things our reporting showed is that Facebook, at the very least, deeply misled its own oversight board, an entity the company created literally last year and put $130 million behind to supposedly provide accountability. In the words of Kate Klonick, a law professor who has studied the board closely, Facebook lied to it. Under those circumstances, there’s a real question about what the oversight board is going to do in response to this. It has issued sort of an expression of concern, and obviously there are other ways it could threaten Facebook, in terms of its cooperation with the company and things of that nature. Facebook has invested a lot in this program, but it’s unclear what you do if the big-picture power you’ve got isn’t one that Facebook is going to respect.
Related links: More insight from Jed Kim
We’ve got links to the ongoing coverage Jeff and team are doing at The Wall Street Journal. They are working ’round the clock to publish the “Facebook Files.” Check it out.
Facebook already takes aggressive action against disinformation campaigns spread by fake accounts and waged by, say, Russia. Now, Facebook tells Reuters it’s using the same techniques against coordinated campaigns run by real accounts, like those in the lead-up to the Jan. 6 attack on the U.S. Capitol. Oversight is handled not by content moderators but by Facebook’s security experts. It’s not yet clear how this will affect public debate.
This might be super relevant. “Marketplace Tech” host Molly Wood interviewed Emanuel Moss last year about ethics in tech. He believes big data companies need to hire people whose job it is to think through even unintended ethical problems with their products. One idea: form a “red team” to think through how a system may be misused. Another: consider impacts not only to users but also to people who don’t use your product. Check it out if you missed it.
As we’ve mentioned, Facebook does have an oversight board, which a Bloomberg op-ed said could be a route to institutional change. The board upheld Facebook’s Jan. 7 decision to block President Trump’s account. Some people think it has no teeth because the company pays board members six-figure salaries. Still, it has made suggestions for change and appears a little frustrated at the lack of progress. The board is set to double in size, from 20 to 40 members, in the coming months.