Susan Wojcicki has been a key figure at Google for more than two decades. She was one of the company’s first employees, and now she leads one of its most popular platforms: YouTube. In a recent interview with “Marketplace’s” Kimberly Adams, she talks about creators, content moderation, and regulation.
Kimberly Adams: YouTube, as well as your parent company Google, had a pretty good pandemic, financially. Ad revenue soared as people who were stuck at home, for obviously unfortunate reasons, started turning to the platform. What has that meant for creators specifically?
Susan Wojcicki: Well, the pandemic definitely brought a lot of changes to our business. We saw that when people were stuck at home, they turned to more streaming services, and we saw big growth in our traffic and engagement. And creators in many ways, because they were used to filming at home or with fewer people around them, were able to continue to produce and create a lot of really valuable content. We also saw that during the pandemic people needed help. They needed to learn a lot of skills they hadn’t ever learned before: how to fix things in their house, how to cut hair, how to fix their refrigerator, things they might have called someone to help with before. As well as, of course, kids who were at home. I mean, it was tragic to see so many kids out of school. And so we have a lot of valuable learning content: how to help your kids, how to help with homeschooling. So we definitely saw YouTube be a really critical resource. We also really invested to make sure people had the information they needed around COVID. I’ve never seen us launch as big a campaign as what we did around COVID. We served hundreds of billions of impressions and worked with health organizations globally, I think over 86 health organizations around the world. So it was a really important moment for us, where we wanted to show up and be able to help creators and also help all of our users.
Adams: Can you talk to me a little bit more about the YouTube creator economy — how it’s grown over the years and how you see it changing moving forward?
Wojcicki: Sure. Our creators are really such an incredible part of YouTube, really the backbone of YouTube, and we have over two million creators who generate revenue from YouTube. When we look, for example, at what that actually means in the U.S., we see that it contributes really significantly to both GDP and jobs: over $20 billion in GDP and almost 400,000 jobs. And that’s from the Oxford Economics report that just came out. I mean, what creators can do is take their passion, something they always loved, and start creating videos. YouTube does the monetization and the distribution, and it becomes a next-generation media company. Actually, during the pandemic, for example, there are many stories of people who became creators for the first time because something happened in their life. For example, we have this creator, Randy Lau, [whose] business shut down during the pandemic, like a lot of people’s businesses. His father had been a Chinese chef. And so he started filming his dad making Chinese recipes. And they are amazing. Every time you look at them, you think, “Wow, this is how the restaurant made all those recipes that I wanted to have at home.” And now they have a very successful business. They have almost 500,000 subscribers. So it’s just a way for individuals to take their passion, turn it into a next-generation media company and generate revenue.
Adams: Speaking of generating revenue: Based on what I can see online, most of YouTube’s highest-paid creators are white. And I know you’ve had a big initiative to try to elevate more creators of color. But what is the long-term strategy for increasing the diversity of the types of people who can get rich on YouTube?
Wojcicki: Well, first of all, we think YouTube is a great way to diversify the voices that we hear in media. Because YouTube isn’t a gatekeeper, anyone can just post their video and start creating content and connecting with their audience. They don’t need to create a script and have someone review it and then approve it, fund it, etc. They can just start filming in their house. We really do see that YouTube has enabled a lot of voices from underrepresented communities that we never would have heard from in media before. It’s been a very important priority for us. We think it’s a differentiator for YouTube, the fact that we have so many different types of content and different voices coming from all over the world. It’s been hard for us to really understand exactly what the diversity of YouTube is, because our creators don’t report to us; they don’t tell us what their backgrounds are. And so we actually launched something recently called self-ID, where creators can identify themselves and tell us their background and how they affiliate. And that is going to be really useful for us to understand exactly what the makeup of our creator base is, and how we continue to support them in all ways, whether it’s monetization, distribution, creation — what is the exact makeup of our ecosystem and how can we continue to support it? But we do a number of different programs across the board to support all different types of groups. We actually have a $100 million Black Voices Fund that we’ve been using to elevate different Black voices. We have a number of programs for women. We have ways for all different types of groups to get started, to really foster and support the diverse ecosystem we see on YouTube.
Adams: You said that YouTube isn’t a gatekeeper. There are a lot of people I talk to here on Capitol Hill who would argue the opposite: that you do have to make some tough decisions about who does get space on your platform, and who does get to monetize their content on your platform. How do you see that role for yourself, and how has it changed over the last couple of years?
Wojcicki: Well, we do have community guidelines. We’ve always had community guidelines. We want to make sure that when you come to YouTube, you have some expectations. For example, adult content, violence, violent extremism, hateful content — there are many different categories that we think don’t belong on YouTube, for many different reasons. But we do tell our creators that as long as they meet our community guidelines — which are posted on the internet and very available, and we do videos to try to train our creators on exactly what they mean — then that content can be posted and be monetized, provided it meets a certain threshold. Which is very different from how TV works. If you think about TV, you used to have to submit a script that would be reviewed by people, they would have to fund it, and there’s a limited number of channels available. The advantage of YouTube is there are millions of channels. I mean, when I was growing up, there was a handful of channels. And so the fact that there are now millions means there are so many new voices and opportunities and topics that can be discussed that we never would have seen beforehand.
Adams: I want to follow up on this content moderation piece, because many advocates have said that YouTube has been slow to act in this area. For example, you all banned ads on videos denying climate change just last month. Can you talk about the process of making those kinds of decisions, and is it going to be moving faster in the future?
Wojcicki: Well, I’d say that we have been working really hard on this for many, many years. When we make a decision, we want to make sure that decision is very thoughtful, that we’ve consulted with all different parties, and that once we roll it out, we roll it out in a consistent way that’s implemented for all of our creators in the same way. So I would characterize our work as very thorough, and we’re working hard to make it consistent. And we’ve done that for many, many years. And I’d say that we’re doing a really good job of moderating our ecosystem. I understand there are people on both sides of the aisle who disagree with us, but the fact that you see both sides arguing with us means that we are really striking a balance. When we make policies, sometimes we’ll consult with literally dozens of different groups to try to understand the different perspectives, and how we can do this in a very thoughtful way. So if you look across all our policies, I think you’ll actually see that we have been very thorough, and in many cases, we have been ahead. But we’re also dealing with many, many different topics. You can choose any one topic and say, “Oh, were you early? Were you late?” But again, we’re dealing with many topics, and I really would like to think that we’ve been thoughtful and forward-thinking with all of them.
Adams: Several researchers and some members of Congress have described the strategy you’ve laid out here as a bit of a whack-a-mole approach — blocking content that violates standards as you see it, for the most part, rather than blocking organizations or people known for posting harmful information, which is something that Facebook does. Why do you take this particular approach?
Wojcicki: So I would say that we are incredibly consistent. I mean, whack-a-mole does not characterize our approach at all. When we make a policy, again, we consult with many different people, but then it needs to be rolled out so that thousands of reviewers all around the world can implement it consistently, and we have many different checks and measures to make sure that is happening. We also have the three-strike system, which distinguishes us, and YouTube has had that pretty much since the very beginning. And we tell all of our creators that we have the system and this is how it works. Basically, if somebody gets three strikes within a certain period of time, that account will be terminated. And we tell creators, “Look, you got your first strike, you got your second strike, you got your third strike.” And actually, ahead of the first strike, we have a one-time warning, for creators who might not even be aware that we have a strike system. And of course, there are some types of content where it doesn’t take three strikes. If it’s violent extremism or child safety, that’s content that is immediately removed. But we really do try to work with our creators, so they understand the rules and the system is clear. And, again, we don’t want to focus on the person; we want to focus on what they say and what they do, as opposed to characterizing any individual with some kind of label. It’s more: What did that channel say? What’s the content on it? And be content-based. So we can stand behind a decision and say, “Look, here’s the policy. It’s posted on the internet. This is the content that was said, it happened multiple times, which is documented, and that’s why we’ve made the decisions that we have.”
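(To make the mechanics Wojcicki describes concrete, here is a minimal sketch of that kind of strike policy in Python. It is illustrative only, not YouTube’s actual implementation: the 90-day window for active strikes is an assumption, not a figure from the interview, and the severe-violation shortcut mirrors her violent-extremism and child-safety examples.)

```python
from datetime import datetime, timedelta

# Illustrative sketch only; not YouTube's actual implementation.
# Assumes a 90-day window for active strikes (an assumption) and a
# one-time warning before the first strike, as described above.
STRIKE_WINDOW = timedelta(days=90)

class Channel:
    def __init__(self, name: str):
        self.name = name
        self.warned = False      # one-time warning precedes any strike
        self.strikes = []        # timestamps of active strikes
        self.terminated = False

    def report_violation(self, now: datetime, severe: bool = False) -> str:
        """Apply the described policy to one confirmed violation."""
        if self.terminated:
            return "account already terminated"
        # Severe categories (violent extremism, child safety) bypass
        # the strike system entirely.
        if severe:
            self.terminated = True
            return "severe violation: terminated immediately"
        if not self.warned:
            self.warned = True
            return "one-time warning issued"
        # Keep only strikes still inside the window, then add this one.
        self.strikes = [t for t in self.strikes if now - t < STRIKE_WINDOW]
        self.strikes.append(now)
        if len(self.strikes) >= 3:
            self.terminated = True
            return "third strike: account terminated"
        return f"strike {len(self.strikes)} issued"

channel = Channel("example-channel")
start = datetime(2021, 11, 1)
for week in range(4):
    print(channel.report_violation(start + timedelta(weeks=week)))
# -> one-time warning issued, strike 1 issued, strike 2 issued,
#    third strike: account terminated
```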
Adams: And you think that’s a better strategy — to look at the individual content as opposed to the person or the group? Say, if it’s a white supremacist group that’s posting about flowers, would that still be fine? Because some of the researchers we’ve talked to about this say that what that is doing is providing an off-ramp for people to find more extremist content off of your platform.
Wojcicki: Well, first of all, we do see that groups — I mean, you talked about white supremacy. White supremacy groups do not post about flowers. They usually post content that becomes violative, and then that content is removed under our hate policy. And so that is generally what we have seen. And if a channel crosses a certain threshold, or has any of those violations, then that channel will be removed, because we have a policy against hate. I think it’s a dangerous line to start characterizing individuals and saying, “This person is not allowed, for whatever reason.” We really want to be able to say, “No, this is our policy. Everyone is held to the same standards, whether you’re a politician or an individual. These are the lines, the lines are posted on the internet, and if you cross them, there will be consequences.” We certainly want to make sure that our policies aren’t being gamed; people would like to tiptoe right up to the line. And we have gotten smarter and smarter and worked with a lot of experts to understand: What does that symbol mean? Is there a dog whistle here? Is there a subtext? Is there an image that we know is a symbol for something violative? Is it a song that we know actually represents something that would violate our policy? So that’s the way our policies have evolved to get smarter and make sure we strike the right balance.
Adams: With so much content being uploaded to YouTube every day, it’s inevitable that some of this moderation work that you’re describing has to be done by machines, automation and AI. But there’s a ton of research out there that shows there is bias built into a lot of the AI that is being used across many platforms. How do you account for that?
Wojcicki: So, we do use AI for really everything we do. We use it for our recommendation system, we use it for the ads, we use it to find content that is violative. But we also use people. And in the end, the people are in charge. The people train the machines, and sometimes there are biases that people have, and those will get translated into the machines. We may become aware of that issue, and then, as a result, we’ll need to retrain our machines. And so this area of machine-learning fairness, or AI fairness, is a really important area where Google has done a tremendous amount of work. We are also working incredibly hard to make sure that the way we implement our algorithms is fair and equal to everyone. We make sure that we are working with researchers, third parties, to identify any issues that come up and address them. So if we see some type of issue, we will right away look at it, retrain our systems and figure out how to address it.
Adams: I hear you talking about all these things that you’re doing, but I’m here in Washington, D.C., where big tech is the favorite punching bag, and everyone is saying that you’re not doing enough, quickly enough. And there’s a ton of regulation coming down the pipeline headed your way. What do you think should be the approach to regulating your industry?
Wojcicki: First of all, we already are regulated on many different fronts. If you look at copyright, privacy, kids’ content, those are all areas where YouTube already deals with a lot of regulation, and we’re compliant. We’re also operating on a global scale, so there are many, many bills in many countries where we are working very hard to make sure that we’re compliant. And the rules are always changing, so that is challenging. But we are already subject to a very large amount of regulation. There certainly is a lot of discussion going forward, for sure. And we are working closely with different regulators to talk about the perspective that we have, and to make sure that the regulation, which makes sense in many places, doesn’t in other places have a number of unintended consequences. We just want to make sure that, as we work with regulators, regulation achieves what they are looking for, as opposed to doing something that could ultimately be harmful to the creator ecosystem — literally millions of small businesses creating content, a lot of it educational or from valuable, underrepresented groups. We want to make sure that we are speaking on behalf of those groups so they can continue to do the good work that they’re doing. So it’s a tricky balance.
Adams: Yeah, what do you think of calls for more transparency around your algorithm itself? Or, there’s one piece of legislation that says users should be allowed to opt out of letting algorithms decide what they see.
Wojcicki: I think on transparency, we are working to be more transparent, and we are working to continue to give more visibility. I would say that for regulators, actually seeing the algorithm itself would of course be a very complex undertaking, and I don’t think it would achieve what most regulators want. What would be most helpful is to talk about the end metrics that they would like us to optimize for. At YouTube, for example, we just came out with a metric called the Violative View Rate, which is the share of views on our platform of content that violates the policies we’ve said we have. And we believe that’s actually a really comprehensive way to understand: Are we meeting our own goals in terms of removing content that violates our policies? We share that and update it regularly. And right now, it’s about 19 or 20 views in every 10,000 that would be violative of our policies. We’re going to continue to work to pull down that content, but that’s a way that we are working to be more transparent.
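(For scale, her figure works out to a fraction of a percent. The sketch below is just that arithmetic in Python; the 20-per-10,000 rate comes from the interview, while the absolute view counts are invented for illustration.)

```python
# Violative View Rate (VVR): the share of all views that land on
# policy-violating content. The 20-per-10,000 rate is from the
# interview; the absolute counts below are made up for illustration.
violative_views = 20
total_views = 10_000
vvr = violative_views / total_views
print(f"VVR = {vvr:.2%}")  # VVR = 0.20%, i.e., roughly 1 in 500 views
```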
Adams: Do you have any plans to make your algorithm more transparent and open for researchers? I get that members of Congress may not be able to dive into it, but what about experts outside of YouTube?
Wojcicki: We definitely see that there’s a need, and we’re going to continue to look into that and explore ways that would make sense for us to do that. We do see right now that there is a large amount of research already happening on YouTube. We see a lot of researchers, PhDs, professors, think tanks and institutions that already write a lot of papers about YouTube and its effects, because everything we do is public. And so we do see them doing a fair amount of work. But we certainly will continue to work and find ways to be more transparent.
Adams: Do you have a timeline on that?
Wojcicki: I mean, I think certainly we’re talking about it for 2022. We’re open to finding more ways to work with researchers, and we want to be careful about how we do it to make sure that it’s done in a productive way. But, I mean, we understand that there needs to be more transparency.
I realize I didn’t answer your earlier question about people wanting to opt out of recommendations. I think YouTube would just be completely useless if people didn’t have recommendations. Many times I compare YouTube to a library, because we have a lot of content — it’s a video library that’s publicly available. We have 500 hours of video uploaded to YouTube every minute; we have a very large amount of content. So if you went into the Library of Congress, for example, and you didn’t have a card catalog, and you didn’t have a librarian there recommending or helping you find books, it wouldn’t be a very useful experience. We could post, “Here are the top 10 videos that are popular on YouTube.” But the reason people come to YouTube is because they have very specific needs. They’re looking to make a pie for Thanksgiving, or they’re looking for a specific cookie recipe, or they want to watch a James Baldwin speech. So you need to have a system that is able to help you go through large amounts of information and know the kind of information you’re looking for. It’s like having a very well-trained librarian who knows you and says, “Oh, look. I know you came in last time and you were interested in these three individuals, or you’re interested in science. Here are some of the latest science videos we think you’d be interested in.” And without that, it’s not a very useful service.
Adams: I mean, YouTube knows exactly which videos I want to watch. I was on there the other night learning how to make ramen noodles from scratch. I was one of those people who figured out how to uninstall their dishwasher during the pandemic using YouTube. But at the same time, you have the investigation into the Christchurch massacre in New Zealand, which found that the gunman used YouTube to learn how to modify his weapons to kill more people, and that he was radicalized on YouTube — two very different ways to use the same platform. How do you think about how all this information coexists, and should it all coexist in the same place?
Wojcicki: First of all, we work really, really hard to keep updating our policies and working with experts across the board. Any kind of violent extremism would be content we would work incredibly hard to remove. You mentioned weapons: We’ve updated our weapons policy a number of times. So we want to make sure that we are removing that content. We have also seen a lot of researchers who have talked about our recommendation system, which really has changed a lot over the last five years. We’re updating it and changing it all the time. And we certainly have seen a number of researchers, from think tanks to universities, who have reviewed our recommendation system and how it’s changed, and have commented and found that it is not leading users to more extreme content. But we’re always open to feedback, we’re always figuring out how to provide a better service, and that’s, again, why we’ll continue to work with various experts to get their feedback on all of these really tough topics.
Adams: Content moderation, obviously a challenge, you’re working on it, you have a strategy. What is the next big, kind of scary thing that you’re worried about on YouTube?
Wojcicki: Right now, I mean, content moderation has been a huge area for us, and I think it’s something we’re always going to have to stay on top of. I think we’re in a good place; in the last couple of years, we’ve put so much work into people, technology, process and experts to really make sure that we’re on top of it. But I’m always going to tell you that’s going to be my top responsibility. We can never take our eye off the ball; we’re always going to be focused on that. But in terms of other things to worry about, I’d say regulation. We’re certainly very focused on that, because I look across the globe and there are literally hundreds of bills in discussion right now that all have a variety of different ways of impacting YouTube. And we want to make sure that our creators are able to continue to publish and do the great work that they do. I’d also say there’s a lot of competition. Right now, everyone is talking about video, we see a lot of growth. It’s not just U.S. companies; there are global companies that we’re competing with.
Adams: And y’all are getting into podcasting.
Wojcicki: We’re excited about podcasting, for sure. But that’s a place where there are also many different companies. We do think it’s a good opportunity for people producing podcasts to generate revenue and have more distribution. We crossed 50 million subscribers for our YouTube Music and Premium services, so we know that users are paying for the service, and the more podcasts we can offer there, the more valuable a service we think it will be for our users. So I have many things that could keep me up at night. But I’m also excited about innovation, and that’s really why I ultimately came to Google and to YouTube: the ability to continue to create and use technology to improve our lives. And that’s what I’m hopeful I can do more of in the coming years.
Adams: If I can ask about something that’s keeping me up at night, but also ties into your point about innovation: What about deep fakes? The technology to replicate audio and video that looks so convincing is just getting better. Do you all have the technology at YouTube to help users identify whether or not what they’re looking at is real?
Wojcicki: We definitely have invested in this space, and we’ve done a fair amount of work. For example, we released a data set for researchers to work with, to train on and understand deep fakes better. That certainly is something I’m asked about a lot, and I do agree it’s an area we need to continue to work on. But the reality is that right now, I could point more to shallow fakes as the issue, where people will just mix an old video with some new audio, or old audio with a new video. And those have been cases where we’ve been able to very quickly identify it. We’ll be able to say, for example, “Hey, we saw this video before, many years ago, and it didn’t have this audio.” So we can identify that something is going on with how the video has been altered and changed. We have a lot of techniques and tools to identify that. I’m more worried in the short term about shallow fakes. But we definitely are investing for the long term, for a point where deep fakes become more of an issue. And I’m not saying it’s not going to happen. I believe that it will; what I’m saying is that it’s not something that is causing us issues right at this moment.
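(The “we saw this video before, and it didn’t have this audio” check she describes is essentially content fingerprinting. Below is a minimal sketch of the idea in Python; it is not YouTube’s actual pipeline, which would rely on robust perceptual hashing at enormous scale, and every identifier and hash value here is invented.)

```python
# Toy illustration of flagging a "shallow fake": an upload whose video
# fingerprint matches known content but whose audio fingerprint does not
# (or vice versa). Not YouTube's system; all values here are invented.

def flag_shallow_fakes(upload: dict, catalog: list) -> list:
    """Compare an upload's video/audio fingerprints against a catalog
    and flag mixed-provenance matches."""
    flags = []
    for known in catalog:
        video_match = upload["video_fp"] == known["video_fp"]
        audio_match = upload["audio_fp"] == known["audio_fp"]
        if video_match and not audio_match:
            flags.append((known["id"], "known video, different audio"))
        elif audio_match and not video_match:
            flags.append((known["id"], "known audio, different video"))
    return flags

catalog = [{"id": "speech-2016", "video_fp": "a1f3", "audio_fp": "9c2e"}]
upload = {"video_fp": "a1f3", "audio_fp": "77d0"}  # old video, new audio
print(flag_shallow_fakes(upload, catalog))
# -> [('speech-2016', 'known video, different audio')]
```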
Adams: I know we’re almost out of time. Just a couple more quick points. I’m here in Washington, D.C., and I’m struck by the fact that other tech CEOs are called to Capitol Hill to testify all the time, but not you, even though 81% of American adults, according to Pew, use your platform. Several of the researchers we talked to while preparing for this interview say that YouTube, and you specifically, have gotten something of a pass when it comes to the backlash and scrutiny around misinformation online. What do you think about that?
Wojcicki: Well, I’m always open to testifying if I’m called. So there’s never been a situation where I was called and I didn’t go. I have been many times to Congress and met with many people on the Hill. Google owns YouTube, and Google is part of Alphabet. And the CEO of Google and the CEO of Alphabet is Sundar [Pichai]. So Sundar has testified a number of times, and he has answered many questions about YouTube. And I certainly, if I were asked or needed to attend, I certainly would be there. I think it’s more that people, a lot of times, have wanted to have the questions about Google and Alphabet as a whole, and that’s why Sundar has testified a number of times.
Adams: Obviously, you are one of the most experienced people in the tech field. And I wonder: given how long you’ve been in this space, and how much you know about how the internet works, how does that shape how you use the internet and how you tell your loved ones to use the internet?
Wojcicki: Well, I mean, I’m old enough to remember before the internet.
Adams: Me too. (laughs)
Wojcicki: (Laughs) So, I think we’re reaching a stage where there’s some generations who don’t really remember that. So I definitely appreciate the many benefits that we have from the internet. On the other hand, because I’m also from a generation that remembers life before the internet, I also see benefits of being able to disengage from the internet, too. I think, like anything, it’s something that needs to be managed, that you need to have some time that you’re outside, engaging with friends, having other hobbies. Certainly, my recommendations and how I personally use the internet is I try to use it for all the good stuff, and I appreciate all the benefits and what I’ve been able to learn. But I also believe balance is critical, and that everybody needs time when they are also engaging in real life.
Adams: I have encountered more than one kid, some in my own family, who have told me that when they grow up, they want to be YouTubers. And they use that word specifically. What do you think of that — of kids looking to this kind of content creation as a career choice?
Wojcicki: Well, certainly there are many creators who are doing very well, and a lot of creators actually do it as a hobby. So certainly many do it full-time, but many do it as something in addition. And if you think about it, it’s an opportunity to really engage in something you’re passionate about, share it with a global audience and generate revenue. And a lot of creators will do more than just the videos: Once they have the videos and they have the audience, they’ll write a book, they’ll have a product, they’ll hire people to work on it. I mean, if you think about it from a more academic standpoint, if you’re creating a video, you need to have a point of view. You need to be able to express it articulately. You need to think about how to tell that story. So I do think, academically, there are many benefits to thinking about how you create a video that has a story, has meaning, is engaging. From a career standpoint, it certainly is a very valid career for some set of people. It’s a very valid hobby for some set of people. And for others, it’s just an opportunity to engage and do something that is a way to explore or grow their passion. There’ll be some set, of course, who will not have a big audience. They’ll have a small set of people who watch their videos, but that doesn’t mean it’s not still a compelling experience. I encourage people to explore their passions and see what makes sense for them. And also, some people do it, then do something else, and then come back to it. We gave that story about Made with Lau, but we also see other people. For example, we have many creators coming from the farming community, which has been really fascinating for me to see — FarmTube, we’ve been calling it — full-time farmers from the Midwest who are documenting their lives as farmers. For example, we have this one millennial farmer, Zach Johnson, who is a fifth-generation farmer, and if you watch his videos, it’s really interesting; you learn a lot about how they use their equipment and handle the harvest.
Adams: I’ve seen some of them.
Wojcicki: Yeah, so that’s just an example where someone is doing it in addition. It’s not that he’s only a YouTuber. He’s a farmer and a YouTuber. So we see many people combining it with their area of interest or passion.