

On this episode of Unlocking Us

I was looking for some certainty around the tough issues of censorship and misinformation — legal definitions, rules, and clear lines — so I called Ben Wizner, a lawyer with the ACLU and the director of its Speech, Privacy, and Technology Project. I’d hoped we’d have a Free Speech 101–type conversation, with tidy resolutions and a clear path forward. But what I suspected, and Ben confirmed, is that the law gives us very few answers to the hardest questions that we have. So in this episode, the two of us grapple with issues of balance and boundaries, unpacking the harms that speech can cause and the harms that censorship can cause. I’m glad that we had over an hour to talk, because as tempting as it is to approach issues like this with firm certainty or with 140 characters, it’s much more important to unpack the nuances and unlock the opportunity for growth and learning.

About the guest

Ben Wizner

Ben Wizner has been an ACLU lawyer since 2001 and, since 2012, the director of its Speech, Privacy, and Technology Project. In two decades at the ACLU he has litigated numerous cases involving freedom of expression, surveillance practices, government watch lists, targeted killing, and torture. Since 2013, he has been the lead attorney for NSA whistleblower Edward Snowden. Ben is a graduate of Harvard College and New York University School of Law.

“Sometimes I’m too clever for my own good. I was trying to choose things that would be both sincerely related to my taste but also topical for our conversation, so that’s how this came into being.”

Transcript

Brené Brown: Hi everyone, I’m Brené Brown. And this is Unlocking Us.

[music]

BB: I’m glad to be back. It has been, as many of you know, a very, very tough couple of weeks. What has it been, two weeks or three weeks, Barrett?

Barrett Guillen: Two weeks.

BB: Two weeks. It’s been long and it’s been hard, and the world is not built… I’m learning, for pausing and learning more and researching, I’ve learned that. What else have I learned? I’ve learned that bots that have cartoon avatars that are eagles are really mean AF, I can just tell y’all that right now. I’ve learned that this community is incredible, and even when we don’t agree, which there are a ton of you that didn’t agree with me taking a pause as I dug into what was happening at Spotify to better understand, you asked me tough questions, held me accountable, but did so with respect and curiosity, and I really appreciate that. I’m tired. I’m…I don’t know, getting back up on my feet a little bit. The worst thing about this whole shit show is I would do it again the exact same way. I didn’t know enough to tell you even why I was pausing, because I didn’t know what I was going to have to go look at.

BB: And I wasn’t sure what I was even allowed to say, to be honest with you, and it’s been hard, but I’m glad to be back [chuckle] and I’m coming back, interestingly, with Ben Wizner. He has been an attorney at the ACLU since 2001, and since 2012, he has been the director of its Speech, Privacy, and Technology Project. In two decades at the ACLU, he has litigated numerous cases involving freedom of expression, surveillance practices, government watch lists, targeted killing, and torture. Since 2013, he has been a lead attorney for NSA whistleblower, Edward Snowden. He went to Harvard, and then he went to NYU for law school. I have to say that I have a very strange relationship with the ACLU. I’ve supported them financially before. I recently did a pro bono leadership event for them, and I… No one pisses me off like them. It’s like, it’s actually… This is the craziest thing to say, it’s… The analogy is with church, which I know that seems weird to be using an analogy of church with the ACLU, but I go to church for the reminder of who I want to be.

BB: I want to find the face of God in everyone that I look at, I want to love people, I don’t want to move through the world full of lovelessness and hate and rage, and so I go to church sometimes just to pass the peace and sing and break bread with people I really can’t stand. And the ACLU is the same way. I support them, because a lot of times I agree with what they’re doing and sometimes who they defend, it just makes me crazy, but then I think about the importance of free speech, and I’m about as anti-censorship as you can get. So I thought this would be a good conversation. I will tell you that I was looking for a lot of certainty in this conversation, I was hoping for some rules and some clarity and legal definitions, and that went south about what?

BG: Thirty seconds.

BB: Thirty seconds into it, I’m like, “Okay, let’s do free speech 101.” He’s like, “No, we’re not going to do that. We’re going to grapple with really hard things that the law can’t answer.” I’m like, “Shit.” So, this is Ben Wizner on free speech, misinformation, and the case for nuance. I’m glad y’all are here.

[music]

BB: So let me start by saying, welcome to Unlocking Us to Ben Wizner. Hi.

Ben Wizner: Hi, what a pleasure.

BB: Thank you for being here.

BW: I’m looking forward to it. I hope.

[laughter]

BB: You know what, I think this is going to be… I think this is going to be probably for you, like you were teaching free speech 101.

BW: I actually think it won’t be because…

BB: Oh, you don’t.

BW: Because most of the things that we’re going to be talking about today are not about the law, and the law is going to give us very few answers to the hardest questions that we have, so this is going to be the two of us just grappling with issues that we both thought about with no right or wrong answers, probably.

BB: Oh, holy shit. [chuckle] I was really… I was really looking for some certainty, but okay. So before we get started, I have to tell everybody that you are with the ACLU, and I have supported the ACLU financially in the past, and most recently, y’all invited me to do a fireside leadership talk, and I did that pro bono, so I want to be really up front about that. I also want to tell you that I appreciate the work you do, and God, I hate y’all so much that I could just spit sometimes.

BW: They say about the ACLU that if you agree with us 80% of the time, you should be a member, and if you agree with us 60% of the time, you should be a board member, so there’s no one in the organization who agrees with everything that we do. It’s an ongoing argument. And it has been for 100 years.

BB: So tell us a little bit about the ACLU and tell us about what you do there.

BW: Well, the ACLU has been around a lot longer than I have, although when I was writing a quick bio for your show and saw that I’ve been there now for 20 years, not at all what I expected when I showed up there in August of 2001, that is now 20% of the history of this organization. But the ACLU has been the country’s constitutional and civil rights watchdog now for a century. We have offices in every single state, and we work on a broad range of civil rights, civil liberties, and human rights issues, from free speech and freedom to protest, to the rights of immigrants, to women’s rights (a project that was started by the late Justice Ruth Bader Ginsburg), to racial justice, to criminal law reform. You can imagine with all of those issues under a single umbrella, there are sometimes going to be pressures and tensions that arise even within the ACLU. So, I showed up here as a baby lawyer 20 years ago to be just a generalist, to kind of work on everything.

BW: And then, just a few weeks later were the 9/11 attacks, and so the first half of my career was really shaped by the country’s response to 9/11 and what we believe to be rights abusing responses to that including secret detention and torture and excess surveillance, targeted killing, and then after doing that work primarily for a little over a decade, I took over the leadership of our free speech and privacy team at the ACLU, so sort of working on both some of the oldest issues that the ACLU works on and some of the newest, because as you know, and as we’ll talk about today, technology has had a huge effect on how we think about rights and liberties and speech and privacy. So that’s what I’ve been up to.

BB: You also… Are you the chief lawyer for Snowden?

BW: I have, since the summer of 2013, been his main lawyer and the one who coordinates the other lawyers in different countries on his team. That’s been a sort of wild and unexpected life and career experience. And apropos of our conversation today, I have to admit that the only time in my life I’ve ever heard a Joe Rogan podcast was when Edward Snowden went on to hawk his memoir. So I may have some expertise on the issues, but I don’t have a specific expertise on the man.

BB: Okay, so let me jump in now that I know what you do. Let me ask you something. We have a lot of college students that listen. Was there just a time in your young kind of baby-lawyer life where you said, “I should really stir things up a bit and be a lawyer for the ACLU”? How did that happen actually?

BW: I think I was always the kind of kid who people would say, after a loud objection or an argument, why don’t you channel that into something like being a lawyer. I kind of had two directions and there was a push-pull. I’m very passionate about reading and literature, and there was a side of me that thought maybe I should be a humanities professor, and then at the same time, there’s the very strong activist side of me that wants to be out there stirring shit up, as you put it, and college was the place that clarified it. The process for me of writing a senior thesis showed me that I would be a real fraud as an academic, that I would cut every corner I possibly could just to get out of it. [chuckle] Because I did not want to spend my days and nights in a cubicle surrounded by books. I want to spend my evenings on a couch with a book, but my days I want to spend surrounded by people just… I work better in a social environment, and so I decided to make activism my life and literature my hobby rather than the other way around.

BB: Yeah, location and yeah, I get that. Alright, let’s just start. Can we just start where I am? I’m really struggling. I’m just… Confession, I was really looking for some certainty here. Really, y’all can’t see Ben right now, but he’s laughing at me. Where is the legal line, where does a First Amendment protection end?

BW: Well, so no rights are absolute, so let’s start with that proposition. The right to speak is not absolute, the right to own a gun is not absolute, even if the Constitution really does protect an individual right to bear arms, which I don’t believe it does, it doesn’t protect your right to have a surface-to-air, heat-seeking missile that can take down a commercial airliner. The right to free speech doesn’t mean that nothing that you can say can be proscribed or punished. So for example, we’re not allowed to engage in true threats, the kind of speech that would put someone else in real apprehension of physical danger. We’re not allowed to incite violence. We can’t stand in front of a crowd of rowdy people in a leadership position and urge them on to imminent violence. We’re not allowed to harass. If I called you up and said, let’s go out for coffee, that would be free speech, but if I did it 50 times in the next hour after you had told me politely no, that could be harassment. So there are lines that we are not permitted to cross, even though there is free speech. Now on the topic that you want to talk about, misinformation, disinformation.

BW: It’s actually a lot more complicated. The Supreme Court has held that for the most part, the First Amendment protects the right not only to say things that are false, but to do so deliberately, it actually protects the right to lie. Congress passed a law called the Stolen Valor Act, which was aimed at trying to punish these people who pretended that they were military veterans and used that. The Supreme Court struck that law down.

BB: People were saying that they were war heroes.

BW: Yeah, that’s right.

BB: Who didn’t actually serve or they weren’t war heroes either way.

BW: Exactly right. And it seemed like about a decade ago, we were hearing these stories all the time of people who either in politics or in business had invented identities as war heroes or veterans that were false, and you can imagine how aggrieved and offended actual veterans were that people would do this, and Congress passed a law aimed at criminalizing this conduct, and the Supreme Court said, “No.” The Supreme Court struck that law down and said, “We’re not going to have the government as an arbiter of truth or falsity,” and that doesn’t mean that every lie is protected. Fraud, for example, can be criminalized, if I’m lying to you in order to get you to purchase a defective product from me, but there has to be that kind of concrete harm attached to the lie before it stops being constitutionally protected. And that’s why I said to you in the beginning of this conversation, law is not going to help us very much here. Now that would be the case even if we were talking about government intervention, and we’re not. We’re talking about a private company and one of their business partners, but even if we were talking about government regulation, the Supreme Court has been really protective of even false speech in most circumstances, because remember…

BW: Every time you have a rule saying, “On this side, it’s okay, and on that side, it’s not okay,” someone has to be the decider, and this was really brought home to me in the early months of the Trump administration, because if you remember at that time, we weren’t using the term misinformation, it was fake news, that was what was on everybody’s lips, “What are we going to do about fake news?” “What are we going to do about fake news?,” and I think what they meant is, “What can the government do about fake news? How can we stop this fake news?,” and then before you could even blink, the person who had co-opted the term fake news was Trump.

BW: And Trump was using the term fake news to describe any accurate news story that cast any aspersions on him or his administration, and that should have been a reminder to everybody that supposing Congress had passed a fake news law in 2017, who would have been enforcing that law? The Trump administration would have been enforcing that law. Someone has to actually decide what fits into that category and what doesn’t fit into that category, and that’s a really huge power to give to government. We can’t expect governments to apply it neutrally, fairly, scientifically. We know that they’re going to apply it politically, parochially for their own benefit, and that really is why the Supreme Court has been very, very, very, very cautious about letting Congress put limits on any kind of speech, even false speech.

BB: Okay, this is going to be a big question for me, because who’s the arbiter of truth, like who gets to decide… That’s the scary thing for me. So can I play something back to make sure I’m saying what you said correctly or I’m thinking about it correctly?

BW: Sure.

BB: That the government, the Supreme Court has been very cautious in allowing the government to be the arbiter of what’s truthful and what’s not. Is that a fair statement?

BW: That’s a true statement.

BB: Okay. So let me just give you an example. Inciting violence. You said that that was one of the exceptions. Is that the right word maybe, or considerations?

BW: Yeah. I think so. I think you can say… or a limitation on the right to free speech.

BB: A limitation.

BW: So in almost every circumstance, we want to hold people responsible for their own conduct, not other people’s conduct, but in this narrow category of incitement, when my words directly lead to someone else’s violence, then you might be able to hold both of us liable. And here you might think about the Hutu radio broadcasters in Rwanda, who really were using the airwaves to incite a genocide, or, closer to home, and these were much closer calls, when Trump was at his rallies, telling his very riled-up supporters to kick out protesters, that is almost the paradigmatic example of a situation where you might hold the speaker liable. Now in most of those instances, he was careful to say, “Don’t use any violence.” Whether that was with a wink or a nod, who’s to say? But those are really narrow situations, because it has to be the intent of the speaker that the words lead to violence and the violence needs to be imminent. We’re not going to say that something that you put on Twitter today that arguably has an effect days or weeks later fits into this very, very narrow exception.

BB: It does not.

BW: It shouldn’t, because of the imminence requirement. Again, the principle here is that people should be responsible for their own conduct, not other people’s conduct. The incitement doctrine is an exception to that principle, where you can hold a speaker responsible for other people’s conduct, but we need that to be a very, very narrow exception. Otherwise, all kinds of speech can be forbidden.

BB: So here’s what I’m struggling with. I agree with everything you just said. It pisses me off, but I agree with it. I agree that that limitation should be very narrow and very imminent, but I personally struggle with not just the belief, but I think we have historical data that language is a precursor to dehumanization and dehumanization leads to violence. Like, are you saying that that goes too far, that there’s not enough direct connection?

BW: Legally, yes. But I don’t think that’s an adequate answer to your question, because obviously words matter. Obviously words have power. Obviously, they have meaning in the world, they have impact in the world, otherwise we wouldn’t be so concerned about this, we wouldn’t need so much protection for free speech. But we have so many examples, through history, of ideas, arguments, concepts that were prohibited one day and then became accepted consensus almost mainstream in a later era, so we only really need free speech protection for ideas that are deeply unpopular to someone, if not most people. Other words and ideas don’t need protection at all.

BW: If you’re saying things that most people agree with, you don’t need a constitutional right. We need a constitutional right in the Bill of Rights for ideas, concepts, things that you would say that majorities of people might be tempted to prohibit or disapprove of. So these are the tensions. We have to acknowledge on the one hand that words, propaganda, disparagement, discrimination, these things are real, that words can cause harm in the world, and then we have to be humble about our ability as a society to be able to make these distinctions correctly at any time, because majorities have been so wrong for so long about these things, and we have to give a wide berth for disagreement, and even offense, when we’re talking about legal interventions.

BB: Okay, I’m learning so much. Man, I’m so grateful for this conversation. I’m frustrated by it, but I’m really grateful for it. Okay, so let’s talk about… This really confuses me and I don’t know if this is… Maybe you’ll have the answers and maybe you don’t. Let’s talk about this argument by technology companies trying to tease out whether their platforms are publishers, and then based on that answer, what their responsibilities are to the public? Can you help me understand that?

BW: I can, yeah. So, let’s start with something that’s a little bit more straightforward, let’s start with a traditional news publisher like the New York Times. You may or may not know that the New York Times is in court this week, they’ve been sued by Sarah Palin for defamation and that case has gone to trial, because the Times wrote an Op-Ed, an editorial piece, that seemed to suggest that Sarah Palin’s political speech had led to, effectively incited, the shooting of a politician. They corrected it the next day, and there were at least some opportunities for misunderstanding of that, but she has sued them for defamation. The New York Times is responsible for every word it publishes. The New York Times exercises full editorial control over everything that shows up in the newspaper. Contrast that with something like Facebook. Facebook has 2.5 billion users who are able to upload their words, ideas, thoughts, pictures instantly without any kind of review in between. If I write a letter to the New York Times, someone looks at it, maybe asks me to edit it, ultimately they publish it, but when they publish it, they’ve published it after an exercise of their own editorial control. Facebook is exactly the opposite.

BW: It may have rules that they want me to follow, but they have been architected, as has most of the modern for-profit Internet, to allow us to speak first and then have the rules be applied later, if at all. This was by design. Congress, in a 1990s law called the Communications Decency Act, which it has amended more than once, essentially understood that there was no way to treat Facebook or, say, Yelp, or any kind of enthusiast site, the same as a publisher in terms of liability, if we wanted to allow people to be able to have those kinds of interactions online. Imagine if Yelp and its reviews were subject to defamation law for everything that any of the millions of reviewers wrote there. You could have a system that had that set of rules, but you wouldn’t have Yelp; you can have one or the other. Either we’re going to have these kinds of social networks like Twitter, where our tweets go up without being reviewed by a legal team at Twitter, or you won’t have social media at all, and I’m sure some people would be happy with that as an outcome, but if you want to have Twitter, Facebook, Instagram operate roughly the way they operate right now, you cannot make the tech companies liable in the same way that we can make the New York Times liable for its editorial decisions.

BW: That doesn’t mean that there isn’t gray area in between, there aren’t things that you can do, for example, and Europe is experimenting with this as well. If, once tech companies are aware that comments have been posted or material has been posted that violates their Terms of Service or a law, they could be required to take it down, and this is how our copyright regime works. It’s why you see on YouTube, if you look, lots of things that used to be there aren’t there anymore. Someone posted it, but then YouTube found out that it was illegal copyright content, and they took it down. So again, there’s some in-between here, but the idea that we should just apply the same rules that we’ve applied to publishers to all of these social network tech platforms, doesn’t make any sense if you believe that we should be able to have social media the way that we have right now, which is just largely without curation or editing, or we’ll get into the loaded word censorship from a major corporation. And then of course, there are platforms that are somewhere in between, we’ll get to Spotify, I’m sure, because we’re on that platform and that’s in the news lately, but that’s the basic distinction that law recognizes.

BB: So, can I ask you some detailed questions? I’m going to have to really close my eyes as I think about this because it’s so complicated. So, Sarah Palin, the thing that you talked about in court this week with the New York Times, that’s falling into one of those very narrow limitations, right?

BW: Correct. Which is for libel or defamation. If I say something about you, a factual statement, knowing it to be false, in some instances, you can sue me for having harmed your reputation. That is an exception to free speech, but it’s much harder for a public figure like Sarah Palin to do that. She has to show that the New York Times acted with what’s called actual malice, and actual malice means they either knew what they were saying was false or acted with reckless disregard for whether it was true or false. So, that’s a hard thing to prove at trial. She’s going to lose. It’s important to make predictions that can be verified or not, I’m putting myself out, I’m being brave, as you would say.

BB: Yeah.

BW: She’s going to lose her trial, but yes, this is one of those narrow exceptions for defamation, but imagine if you applied the defamation regime… as I said a minute ago, to Twitter, for all the defamation that takes place on that platform on an hourly basis, you couldn’t have Twitter.

BB: Okay. So, do we use the terms editorial control and curation synonymously, or are they just supportive of each other?

BW: Like censorship, those are not terms of art… Those terms don’t have legal meaning.

BB: Got it.

BW: So, you can use them however you like, as long as your listeners more or less understand you.

BB: Okay. So, if you’re going to use terms like curation or editorial control, you should define those terms when you’re talking about them, just to be in a decent argument or discourse. Yeah. Okay, so here’s one question. While I was watching the fistfight in my Twitter feed, getting cancelled, and then I’m canceling you, because you’re canceling someone who cancelled someone, who’s cancelled, like all that bullshit, I saw a lot of people saying, “I don’t know why you’re accusing her of censorship, only the government can censor people.” Is that true?

BW: Again, people can use that term how they choose, it might be clarifying to use the term private censorship, but even censorship when the government does it is not a legal term of art, censorship is not a legal word. So I understand what they’re saying. What they’re saying is the government can be legally constrained from censorship activities, whereas a private actor like Spotify has its own First Amendment right to decide who it wants to associate with. That’s true. But if Spotify said, for example, “You know what, this whole debate about critical race theory is just too divisive and it’s too controversial, and we’ve decided we don’t want to have any content about that on our platform, whether pro or con, it’s tearing this country apart.”

BW: I would regard that as censorship, if they did the same thing for other controversial issues. It doesn’t mean that they’ve done anything illegal, they’ve done something legal. In the same way that if a private school decides, you know what, we don’t want to have any books in our library that have any sexual content, so we’re getting rid of The Bluest Eye by Toni Morrison and we’re taking it off our shelves. I would regard that as an act of censorship, even though legally, as a private school, they are not constrained by any law; they can decide what books are on their shelf without a court having any say in the matter. So once again, censorship is a word that you ought to define if you’re using it. I think when people say private actors can’t censor, what they’re really saying is the law protects private actors in the way that the law would not protect a government action. And that’s absolutely true.

BB: Can you say that again? Can you say that again? That seems really important.

BW: Right. So, let’s put it this way. If Spotify says, “We have decided to define misinformation in this way and to ban it from our platform,” the law protects Spotify’s decision to do that. Whereas, if Congress passed a law saying we are defining misinformation in this way and Spotify can’t have it on its platform, that would be a First Amendment violation.

BB: Got it.

BW: Right. So, that I think is what, being charitable, people are saying when they say it can’t be censorship if it’s a private entity. Again, the decision by Amazon or Walmart not to sell a certain book is not illegal, but I would regard it as censorship if it was driven by ideology. And maybe the easiest way to clarify this is just to distinguish between government censorship and private censorship.

BB: Okay, that’s so helpful for me. I have this question, and it’s going to be in the vein of the dehumanizing language thing. You said, fraud is a limitation on free speech, if you’re using your speech to defraud.

BW: I would put it another way, which is…

BB: Okay. How would you put it?

BW: I would say that governments, state governments, local governments, the federal government, can pass criminal laws prohibiting fraud, and have those survive First Amendment challenge, because they tie the false speech to economic harm: I’m defrauding you out of your money. If it’s just aimed at confusing you or giving you false information, that’s going to be considered protected speech, but if it is about actually giving you false information so that you’ll give me your money, then that can be regulated without running afoul of the First Amendment.

BB: So what if I say things that are untrue in order to gain a specific audience, which raises how much money I make? Do you understand what I’m saying?

BW: I do, but I don’t think that you can cut it that cleanly, I think that would be protected speech. We all speak to gain audience, and gaining audience often brings material benefits to us. And so, that wouldn’t fall into the fraud exception to free speech. If you were selling a COVID pill that you knew was actually just a placebo, but you put a label on it and you sell it as an elixir without any government approval and people give you their money, that can be regulated, whereas if you’re just, shall we say, lying for aggrandizement and to grow your audience, then that would be protected speech. I mean, that sounds a lot like politics to me.

BB: Yeah, yeah, yeah, that’s why I’m testing the limits here. It’s really interesting because all these kinds of limitations or exceptions are so tight. This is making sense to me, and I actually agree with it. It scares me because I don’t know who’s going to be the arbiter, so it scares me to have… Do you know what I mean?

BW: I do. But I think that we want to leave as much room as possible for political speech. Political speech can be really offensive, it can be really demeaning, it can be very, very challenging, but we will always come up against this problem of, as you just said, who is the arbiter? And hateful speech is another category, where other countries have come up with different systems than we have. We don’t have a concept called hate speech in American law. Hate speech is… Hateful speech is protected under American law, and when you think about who these arbiters are going to be, in almost every case, it’s not going to be the Attorney General of the United States, it’s going to be the principal of a school in a rural district, it’s going to be a sheriff somewhere, it’s going to be a university president under pressure.

BW: One person’s hate group will be the KKK, another’s is going to be BLM, and another’s is going to be BDS, depending on where you are politically. BDS, of course, is the Boycott, Divestment, and Sanctions movement aimed at boycotting Israel, which 25 states have passed laws aimed at restricting those protests, Black Lives Matter, and the Ku Klux Klan. There is no consensus among the political leadership of our country about what is hateful. Many politicians have labeled the ACLU a hate group and have said it’s one of the most dangerous organizations in the United States, and that’s why we want the law to have a very, very soft touch in these areas and really police things at certain extremes, but we don’t want to have regular intrusions, because history shows that these powers will not be used in the ways that you might expect.

BB: Yeah. So, what are your thoughts about Spotify pulling down all of Joe Rogan’s episodes where he uses the N word?

BW: Look, I think these are really hard calls. First of all, I should say this is not a legal question. I’ve said this before, I just want to repeat it: Spotify has its own First Amendment right, constitutionally protected right, to decide with whom it wants to associate, to decide what content it wants to publish. When we’re talking about who the speaker is here, in terms of regulation, Spotify is actually the speaker.

BB: No, I don’t… You lost me.

BW: In terms of government regulation, if the government tried to tell Spotify to take these down, right? In that case, Spotify is the speaker, and Spotify says you can’t actually interfere with our free speech, we’ve decided we want to publish this. Using the N-word is constitutionally protected, so there’s no illegality there; this is a question of Spotify’s values and what Spotify wants to be associated with. In this instance, it seems like Rogan has apologized. I don’t know whether he objected, he may even have supported having some of those old episodes taken down, so that one might not be as difficult as some of the other harder questions, like, should he still have a platform? Should they still be hosting him? But I will also say that one of the reasons why we need to be humble and cautious here, well, a few reasons. First of all, stepping way back, my life until I went to college is unrecorded. And I thank my lucky stars for that all the time. My parents didn’t even own a camera, there are school photos of me once a year, and none of the stupidest shit that I said when I was 15 years old is preserved anywhere, or 16 or 17, or even 21, and it’s only remembered, if at all, by the people who were immediately around me.

BW: And we’re now living in a technologically merciless world where everything is forever. That’s why Snowden’s memoir is called Permanent Record, that we are now surrounded by ourselves, and it makes it so much harder to try on personas, to try out ideas, to just experiment with different versions of ourselves without having consequences. So, I always get a little nervous when people start digging up something from seven years ago, nine years ago, 11 years ago, and using it to form a conclusion, not about what someone said, but about who they are, as if that’s a static thing. And you see it even happening to 15- and 16-year-olds who are having their college acceptances revoked on the argument that we don’t want a racist here, not someone who said something racist at 15, a racist. And in public interest law, we like to say people are a lot more than the worst thing they’ve ever done or said. It’s why we’re against capital punishment, it’s why we want mercy for people who are in prison, even people who have committed serious crimes.

BW: And we don’t extend that mercy to people for speech crimes against the current norms, and so I think we need to be really cautious given how those norms change about taking today’s set of understandings even if it’s a consensus we agree with and applying it uncritically to people’s past statements and using that to form conclusions about their identities. So that… And I say all this with a lot of caution, it seems like this is a week where every day we’re going to find out something different that Joe Rogan said that is going to be appalling and offensive, and the point of my speech just now was not to defend any of that. And not to really say anything about him. I don’t know him or what he stands for or whether he’s changed, I try to take apologies at face value if they’re sincere, but to go back to your question, taking down these old episodes, leaving him with his current platform to show whether his remorse is sincere, it doesn’t strike me as an un-principled stance for Spotify to be taking. It strikes me as one of many possible principled stances they could take.

BB: Let me ask you this question, what is your perspective… This is such a hard topic for me, just to be honest with you, because I’m married to a physician whose life seems threatened by it, but… What is your perspective on COVID misinformation?

BW: It’s a big and hard topic, I’ll start with me and then go out to the idea… I’ve had COVID twice, I’ve been very lucky. I had the Alpha version in March of 2020, when the ambulances were going by all day and night and there were no tests available. I had the Omicron version just last month, both of them were mild enough, so I feel fortunate in that regard.

BB: I’m glad.

BW: Look, I am dismayed that our vaccination rates are not high enough to allow us to return to a much more normal life, and the reason why we can’t return to a much more normal life is if our hospital rooms are so full of COVID patients that they can’t treat other patients, then that’s just not a way for a society to be able to function. And so, I wish that the vaccination rates were considerably higher, given what we now know to be their really shockingly effective rates of keeping people from serious illness and death. I completely understand, even if I don’t relate to, the specific distrust of institutions that’s driving some of the hesitation here. Our mantra, trust the science, trust the experts, and all of this, is a lot less persuasive when you look at the last 20 years of American life…

BB: God, dang, that’s true.

BW: When you look at the experts who brought us the war in Iraq and torture prisons, who brought us the housing bubble and financial crisis and deregulation, who have brought us some of the worst income inequality that you see anywhere in the world, and then even during COVID of course, the medical consensus has shifted, and I think one of the problems was that we sort of set up science on one side as being the thing that’s right, and then everything else, whereas, as you know, because you’re married to a physician, science is a series of questions and hypotheses that have to be tested, and so if you say, “This is the truth. The truth is you don’t need a mask unless you are working in an emergency room.”

BW: Now, at the time, the reason for that message was, “We are worried that we don’t have enough of this PPE equipment for people on the front lines, and we’re lying for the public’s own good.” Well, that always backfires when you try that kind of messaging, and it really undermines trust, and so when the voice of authority turns out to have been incorrect, it doesn’t just change our view about that particular question, but it undermines the authority of the institution. We have now seen this is an institution that is willing to be misleading for our best interests, as they understand our best interests, and that’s really problematic. Look, there was another pretty controversial public health misstep when the same public health experts who had been saying, “Don’t go outside at all,” when the George Floyd protests started, said, “You know what, go out there and protest because racism is also a public health problem.”

BW: Now, I was out there protesting, so this wasn’t a problem for me, but you can see how you erode your authority when you say how you should behave depends on why you’re doing it. And then I would also say people have good reason to be suspicious of big pharma. Big pharma has brought us the opioid crisis, and so the idea that we’re all just going to trust this voice of reason, it’s unrealistic, it ignores reality, and what we really need to be thinking about is not how we can bully people with reason, but how we can persuade, how you re-establish or how you establish trust and authority in these kinds of situations. Look, I hope that there are not millions of people listening to the Joe Rogan podcast, hearing whoever he has on there and taking everything that they hear at face value. Surely some are, the numbers are too big.

BW: If he has 11 million listeners, some of them are probably listening to that. I hope people realize that he is the host of Fear Factor, he’s not someone who brings any expertise to this, all he’s doing is asking questions, and I think that their response to this, the response to this mini-scandal, has been fairly positive in the sense that Spotify is going to be clearer about what their rules are, they are going to attach better information to worse information. But I have to say, and this is why I wish that we had someone in this conversation who was really an expert on misinformation and public messaging: How do you regain the trust of large numbers of people who have lost faith in the wise authority of these institutions? I don’t think it’s by kicking people off platforms. I don’t think that’s actually going to make the situation better. I think Joe Rogan, if he leaves Spotify, will bring most of his audience with him, wherever he goes.

BB: Oh, for sure. 100%.

BW: And I don’t think that the message that they will take from this is we should trust him less, or we should trust his guests less or we should trust science more. And that’s why I want us to be focused on how you can ameliorate the misinformation, not how we can punish the speaker or reveal our own purity, how do we actually understand why someone like this draws an audience like that, and how do we find a way to communicate with that audience in a way that is not just finger wagging and scolding.

BB: It’s true, because there’s some data that points to the fact that silencing debates about vaccines actually increases vaccine hesitancy and that rigorous debate increases vaccine compliance. So for me, as someone who believes in the efficacy and safety of the vaccines, I would not want to see the debate go away.

BW: I think that’s right. And I think tone matters a lot too. I remember when I was a young lawyer at the ACLU, and I had been quoted in the newspaper about something, I don’t remember what… And someone from our Communications Department came into my office and she said, “Nice quote in the newspaper, but you started your sentence with the word ‘Obviously.’ And when an ACLU lawyer starts a sentence with the word ‘Obviously,’ 30% of the people tune out, because they’re used to being lectured by someone like you, a snooty elitist ACLU lawyer. So say the exact same thing, and take out ‘Obviously.’” And it was really one of these light bulb moments, where I was like, “Wow, I’m really condescending, and that’s going to affect my ability to communicate here,” even though I think I’m just as right. But how I say it, how I approach it, how I understand, how I appear, how I show up in a quote in the newspaper as an ACLU lawyer, is really important to getting the message across in a way that works.

[music]

BB: I want to read something. Before I read this, I want to ask you about this. So I was reading in Lawfare, I don’t know what that is, but it’s L-A-W-F-A-R-E… Do you know what that is?

BW: Yeah, it’s like a blog for people in the broad world of national security law.

BB: Okay, so it’s interesting. It was talking about warnings that work and warnings that don’t. We’re pulling away a little bit from your “Let’s dig in to the deeper question,” because I do want to talk about this a little bit. So the title of the article, again from Lawfare, is “Warnings that Work: Combating Misinformation Without Deplatforming.” I’m actually, to be honest with you, not a fan of deplatforming. That feels dangerous to me and a slippery slope into something that… There’re a fuck-load of people that would love to see me deplatformed right now, so I’m not sure that I’m a fan of it just because…

BB: But what was interesting about this article is they draw a comparison to early Internet days when there were security warnings about… You had web browser warnings and malware, and none of it really worked, but they’re calling these contextual warnings, which is like what we see now on, I think, Facebook, we see that on Twitter, now on Spotify, contextual warnings, but they did a study where interstitial warnings… Do you know what these are?

BW: I don’t think so.

BB: They come up and you have to click that you understand.

BW: I see. Like they’re like, “We’re using cookies.”

BB: Yeah, and so, “While the contextual warnings,” I’m reading this verbatim because I don’t want to screw it up, for obvious reasons, “really didn’t slow anyone down or make anyone think twice. These interstitial warnings,” this is the quote, “dramatically changed what users chose to read. Users overwhelmingly noticed the warnings, considered the warnings, and then either declined to read the flagged content or sought out alternative information to verify it.” That’s interesting to me. Do you think that’s interesting or no?

BW: Very much so, and I think we need a lot more research of this kind, of exactly this kind. What works, what works?

BB: Me too.

BW: I think there is a lot of that going on right now, just because, as we said before, starting after Brexit and the Trump election in 2016, there have been a lot more claims made about the power of fake news or misinformation or disinformation, but there hadn’t been enough concrete research about both what effect it has and what the best ways are to ameliorate it. How do you get around this? And it’s a really hard problem, because we talk about this in a lot of contexts, but when opinions ossify into identities, then it’s not just a question of what you believe, but it is a question of who you are…

BB: Your identity…

BW: Yeah, I think what’s been so surprising for me is that this has shown up in the context of the pandemic, that people now have identities on different sides of the pandemic response. Masking is obviously a good example, and we saw it in some of our bluer communities, people wearing masks outdoors even after the CDC said, “Well, maybe that’s not so necessary,” because it had been really a way of showing I care about others, and I’m not Trump, and much more…

BB: It’s the new NPR bag, right?

BW: Much more so, of course, on the other side, where people are now using vaccines for identity purposes, and you have Fox News hosts who refuse to answer the question, “Have you been vaccinated?,” because they don’t want to admit that they have, because it’s the wrong answer in the identity that they’re trying to forge here, so that has been somewhat surprising. And the question is, do these kinds of interventions that you’re describing have a way of getting around that kind of identity defense, and on that I’m really not an expert at all.

BB: Yeah, I just think it’s interesting. I think one of the hard problems that they talk about here, I’ve read three or four peer-reviewed articles, this is not a peer-reviewed article, it’s in the process of being submitted as one, but the three or four that I’ve read around what kind of interventions work around misinformation all said that part of the problem is the very small amount of data public platforms are willing to release. So let’s talk about something where we’ve got a bigger problem. We don’t have a bigger problem, but we have a root problem, around the power these platforms have. What would you say about that? Are these anti-trust issues? Are these…

BW: I think so, I think, yes, I endorse that comment 100%. What we’ve seen is that the most important places for free speech in our society are now platforms that are privately owned by Silicon Valley oligarchs, and you’d be making a huge mistake if you thought that those oligarchs share your political views and values. They do when it works for them, and they won’t hesitate to turn on a dime if the political environment requires them to do that. So these are institutions that are not answerable to the public, they’re answerable to their shareholders and bottom line, they are corporations, and they are the ones now that are making the godlike decisions about who gets to speak and how… Now, as you say, that wouldn’t be as big of a problem if there were a much more diverse ecosystem of those kinds of platforms, but when you have really dominant ones, when you have billions of people around the world on the same platform, when you have huge percentages of the public getting their news or information from a handful of essentially advertising companies, that is a major threat to free expression that our constitutional law has nothing to say about…

BW: Because these are private entities, Facebook gets to decide if they want to have nudity on their site, the government doesn’t get to object to Facebook’s policy on that, and Twitter has made a different decision. And it may be that anti-trust is the only lever that can be applied effectively. We have not had, in the last two generations, much of a tradition of robust anti-trust enforcement, and we may not even have the right legal framework for it. The same thing is happening in Europe, by the way, they call it competition law, where the Europeans are even more aggrieved by this because these are American companies that are dominating this global infrastructure. So I do think that the government could do a lot more to try to improve the playing field, the market by which these kinds of companies are allowed to become the behemoths that they are right now.

BB: It’s very hard when I hear you saying this, Ben, because when I read how these founders and CEOs talk about their mission and their vision, they don’t necessarily talk about market cap or those issues, they just talk about owning audio globally.

BW: Yeah. I thought you were going to say something different, which is that they talk about not being evil, or they talk about the wonderful benefits…

BB: I don’t think they say that as much as they say… Just to be… Just what I’ve heard, what I hear when they’re talking, at least to Wall Street or when they’re talking to investors, what they’re talking about is global domination as the business goal, and when I hear you saying, again, I am going to play this back and you can correct me because I’ve been off a couple of times at least during this conversation, is that… On a scale from one to 10, 10 being really confident and one being this shit will never happen, how confident are you that we could put together, that we will put together, a framework for anti-trust, for competition, in the next 10 years that will loosen up this clot?

BW: Five. I do think that…

BB: What if I said five was forced… I’m a researcher, you don’t get a five, you go one through four, six through ten.

BW: Okay, look, I think that there is a global recognition that these corporations have gotten too powerful. I think part of the same conversation, though, needs to be not just their dominance of our communications platforms, but how these massive corporations have contributed to income inequality. Relative to the size of these corporations, they’re not employing a lot of people, and if you look at something like Amazon, they’ve wiped out main streets across the country, and the law has had a hard time dealing with that. Because for you as an individual, Amazon is great. You click a button and they deliver it to your door for less than you would have paid at the old store in the downtown… For us as a society, it can be very damaging, because all of a sudden we now have way fewer middle class families who have professions across that supply chain that is now owned by one company. So these are really, really hard issues. As efficiency has made more and more middle class jobs obsolete, and as we move into more automation with AI, these things are going to accelerate, and now, this is where it’s really going to go dark in this conversation, but we don’t have too many persuasive examples of having democracy without a strong middle class.

BW: And when you have some people who are very rich and most people who really aren’t and are losing in the system and feel themselves losing in the system, that’s when people become much more susceptible to populists and demagoguery, and particularly the right-wing versions of that, and so then you start seeing Trump’s election in 2016, not as just the product of some meddling Russians and Facebook, but as part of a trend that we’re seeing across Western societies of right-wing populists coming to power and the old traditional more centrist parties shrinking. So we’re going to have to get a handle on that problem, how are we going to share the benefits of all of the great efficiency that technology is bringing us and not just have a smaller and smaller number of winners and everybody else being on the outside. So I think that went quite a bit beyond where we meant to go here, but I think that is the bigger challenge.

BB: No, it’s exactly where I wanted to go. I believe there’s a profound danger in disconnecting these things, or tapping out when it gets too dark or too complex or too related in hard ways. I just think that’s… I think you can’t separate what’s happening. I don’t think… I just don’t think you can… And as a social worker and a former union steward, class, working conditions, these are things that unless put in front of us over and over can so slip away in every conversation, but… I’m so glad you brought it up. It’s really important, and I think where I got chills was, if you look back at history, which I love. When you said this, I was like, “Is that true? Stay focused on what he’s saying, but is that true?” Very few examples of strong democracies that are not held together, really, by a strong middle class. Even a strong working class. Yeah, I think this is a really important conversation. I’m so grateful you connected that for us.

BW: Yeah, and it’s why, when people talk about Facebook as a threat to democracy, the last thing we should be worried about is their content moderation policies. The first thing we should be worried about is how these companies are contributing to the hoarding of resources by a few and the hollowing out of the middle of the country in a way that’s going to have really long-term corrosive consequences unless we get a handle on it somehow. And maybe the tax code is the way that you get a handle on it, maybe anti-trust is part of the problem, but as I said before, automation, AI, is only going to accelerate these kinds of trends, and to me, even though it’s not what I work on, economic justice, that’s the urgent problem that’s at the center of all of these issues.

BB: Alright, I’m going to close by asking you a question about you and I’ve changed them up a little bit, but I’m going to go to our rapid fire questions. I’m kind of scared of one of them because I know you’ll just say what it is, but let’s start. Fill in the blank for me. Vulnerability is…

BW: Difficult.

BB: Do you want to say more?

BW: Yeah. The great Buddhist line, “there is no self to defend,” is pretty hard to live up to, and it’s particularly hard for me, so I have a lot of defenses.

BB: Yeah. Me too. Okay, number two, you are called to be very brave, but your fear is real and you can feel it right in your throat. What is the very first thing you do?

BW: I try to take a deep breath and hold it.

BB: Okay. Something that people often get wrong about you.

BW: They always think that I’m kidding. And can’t tell when I’m actually being earnest.

BB: Really?

BW: Yeah, that hasn’t come through as much in this conversation, but I would say I sit on that razor’s edge between irony and earnestness a lot. And so when I’m not wise-cracking and when I’m trying to say something sincere, people think that I’m putting them on.

BB: Do you have any strategies for when you have to defend an individual or a group whose behavior or speech you find just offensive or even devastating?

BW: It’s very rare, I want to say that. People’s image of us is doing this all the time, defending the people who want to burn down our office. These are very, very rare instances, but I think my strategy is more intellectual than emotional in those situations, and it is just to imagine a world where the shoe is on the other foot, the wrong people have the power, and I’m the one being censored.

BB: Powerful. Okay, we’re taking a hard turn, are you ready?

BW: I am.

BB: Last TV show that you binged and loved.

BW: This is going to be a little… It’s going to make me sound more elitist than I am because I try to read books and watch movies rather than TV shows, but let’s just say I re-watched Deadwood, the HBO Western. And, I wish there were more shows like that one, just brilliant.

BB: Favorite movie of all time.

BW: You know that you have to have five or ten favorite movies of all time.

BB: You can have two max.

BW: I can have two. Well, let me do it this way. I’m going to do it this way. The best lawyer movie of all time, and it’s not even close, is My Cousin Vinny. [laughter] Everything else is distant. The movie that I’ve seen the most times is The Big Lebowski, probably followed by Dazed and Confused, but I’m still a sucker for Godfather One and Godfather Two.

BB: I just learned so much about you in 60 seconds, so much. A concert that you’ll never forget.

BW: There’s so many. Alright, I will be brave and admit it, 13 years old, Billy Joel at Madison Square Garden.

BB: Oh, get… Come on. Yeah, that’s a classic. I don’t even think you’re the first person that said that on our podcast. Favorite meal of all time.

BW: Sally’s Pizza in New Haven, Connecticut. That would be my last meal. If they ever tried to execute me, that’ll be my last meal.

BB: What do you get on your pizza?

BW: Oh, the main rule is not too many things, maximum of three things, two would be even better, but you have to have four or five pizzas on the table.

BB: Got it. So you’re sampling around. Okay, tell us what’s on your nightstand, I’m so curious about this one.

BW: Right now, well, I read very, very, very little non-fiction because I feel like my work and life are non-fiction. So I’m reading right now a novel by Heinrich Böll, who was a post-war German writer most famous for a novel called The Clown. This one is called The Safety Net, and it’s really fascinating. It’s pretty much about the way in which the security that surrounds this important person to protect his life in fact completely constrains his freedom and that of his family and corrodes his relationships and all of that. And it’s a pretty great literary representation of being trapped in your shelter. You put this thing up to protect yourself, but actually you’re the one who’s caught inside.

BB: You’re the prisoner.

BW: Right, which is what I think that we do to our society sometimes in the face of certain kinds of threats. It’s certainly how I would describe our reaction to 9/11: that we had to take all of these harsh actions against a real but pretty distant threat. And do we really need to be scaring people on TV every night with color-coded terror alerts? Is that actually useful? Who is it helping? So this is a really terrific novel.

BB: Wow. Okay, give me a snapshot of an ordinary moment in your life that really brings you joy.

BW: I love it when I can give my senior rescue dog a really good bone from a restaurant that I shouldn’t give her because it’s a cooked bone from something that I ate, but she was a street dog for seven years, so she can handle it, and it was just that moment of just pure gratitude that I love as she goes to chew on a lamb shank.

BB: Okay, I had to add this one. What keeps you up at night?

BW: I’m going to talk about one night, in particular.

BB: Okay. Interesting.

BW: I had a very fortunate pandemic, I became a first-time parent at the ripe young age of 49 on Halloween in 2020.

BB: Congratulations.

BW: She was actually due on Election Day, and the day that this baby girl came home from the hospital was Election Day, and we went to sleep that night not knowing what the outcome of that election was. And this goes beyond who the president was going to be, but I just had this crystallized thought: what is this world going to be in 20 years, 40 years, 60 years, 80 years, and how can you launch a new life into this right now, when it seems at this moment, at this dark moment in my night, that it may be that we’re coming to the end of this peaceful, safe, secure, free world that I’ve lived 50 years in, and we’re about to enter a world of no democracy, climate catastrophe, violence. And let’s just say I felt much better in the morning, but there were just these things coming together at the same time, this… Are we about to turn this dark corner? And here’s this new life. That made it a really, really long night for me.

BB: That is… Yeah, that is a… You were right in the middle of life right there, man. Yeah, I love that. Congratulations on your daughter.

BW: Thank you.

BB: I wish y’all could see his face. He’s got a big smile. Alright, we asked you for five songs for your mini-mix tape, let me tell you what you picked, “Freedom Blues,” by Little Richard, “Talkin’ Loud and Sayin’ Nothing,” by James Brown. “Perfect Day,” by Lou Reed, “As,” by Stevie Wonder, and “Happiness is a Warm Gun,” by The Beatles. In one sentence, what does this mini-mix tape say about you, Ben Wizner?

BW: It says, sometimes I’m too clever for my own good. I was trying to choose things that would be both sincerely related to my taste, but also topical for our conversation, so that’s how this came into being.

BB: You nailed it. Thank you so much for your time and I really… You were right, there was a lot more grappling than there was certainty.

BW: I think that’s right, and I just want to say, I want to make sure I get a chance to say this, that I think you’ve been spot on. I think the statement that you put out about how these are hard issues, how we need to find a way to balance the harms that speech can cause and the harms that censorship can cause, that actually is precisely what we’re trying to grapple with here, and that anybody who approaches an issue like this with firm certainty is probably going to be angry at both of us for the conversation that we just had and we’re going to have to live with that.

BB: Yeah, I think misinformation is a threat to democracy, censorship is a threat to democracy, and an unwillingness to grapple with nuanced, difficult things is an equal threat to democracy. It’s just… I want answers, because I love certainty, but I don’t have them, and I think we as a culture are just not willing to pause and ask questions before we launch so much pain. And I could have probably done a lot of things better. But I’m really trying to understand, and that is not easy.

BW: No, especially when we’re operating in a dopamine casino like Twitter that really runs on outrage. You need to be able to step back from that. I’m glad that we had over an hour to talk about this, and I didn’t have to force it into some evil corporate ontology of 140 characters.

BB: No. Yeah, it just doesn’t work.

BW: It doesn’t work.

BB: Thank you so much, Ben.

BW: Thank you so much.

[music]

BB: Alright, y’all. That conversation kind of… I don’t know. It surprised me, I think, is the word that I’m looking for. I was surprised by how complex these issues are, but you know what really surprised me the most is that the center will hold if we don’t allow ourselves to have these knee-jerk reactions, race to ideology, scream and yell at people when we don’t understand. I think the whole system works if we’re willing to have nuanced, complex conversations and engage in real critical thinking, but we’re just not willing to do that. It’s so interesting, because I paused my podcast and then I went up a couple days later, on February 2nd, I think, and said, here’s why. And the most vitriol I’ve received, for sure, is off Twitter, but what was interesting is that less than 10% of the people who read the position I posted on the website came from Twitter, so people are responding without even reading what I’m thinking, or they’re just… It’s just, again, a lot of it is bots. But I think the system is so beautifully designed, the architecture of it, but it requires education and critical thought and thinking and debate in a way that we just don’t do anymore, myself included sometimes.

BB: Look, if you don’t all… While I stand for free speech all the time and have a long history of doing it, we all have little inner censors with us that are like, “Hey, you shut up. You on the other hand, you talk, but you over there, you shut up.” That’s maybe the biggest threat to the system, in addition to censorship and misinformation: the unwillingness to engage thoughtfully. We’ll be back next week with Unlocking Us and Dare to Lead, and guess who’s in the studio? Hi, Laura.

Laura Mayes: Hi.

BB: Hi. Is next week Valentine’s Day?

LM: Next week is Valentine’s Day.

BB: Yeah, so Dare to Lead will come out on Monday, and we have a little special Valentine’s gift, an extra episode of Unlocking Us for you as a Valentine’s Day present from us, and it’s an excerpt from the new audiobook of Atlas of the Heart. I just got it recorded. And it was so fun because they let me ad-lib, so I read the book, but then they let me describe things and what I’ve learned since the book came out, and so it’s kind of fun. I’m excited about it. And then Wednesday, we’re coming back with a… Monday is Dan Pink.

LM: Dare to Lead, Dan Pink.

BB: Dare to Lead, Dan Pink, we’re talking about regret. You should not regret going in, except for the shit show last week, but I don’t regret it. It just sucked. And then Wednesday is… Oh no, Monday is our Valentine’s Day present, and then a bonus Unlocking Us, and then on Wednesday it is… Oh my God. Is it Jason?

LM: Yes. Perfect for Valentine’s week.

BB: Perfect for Valentine’s week. Jason Reynolds. Oh my God, this interview was so fun. Alright, as always, you can listen to Dare to Lead on Monday and Unlocking Us on Wednesday. Every episode of Unlocking Us and Dare to Lead has an episode page on brenebrown.com. You can visit those episode pages for resources, downloads, and transcripts. It takes us about five days to get the transcripts up. I’m just happy to be back. I know it’s been really hard, I know some of you are still grappling with a lot of issues, #Iamtoo, but I’m going to make the best podcast I can because I believe in the conversations, and I believe in this community. So thank you and stay awkward, brave, and kind. I’ll see you next week.

[music]

BB: Unlocking Us is a Spotify original from Parcast. It’s hosted by me, Brené Brown, it’s produced by Max Cutler, Kristen Acevedo, Carleigh Madden, and Tristan McNeil, and by Weird Lucy Productions. Sound design by Tristan McNeil and music is by the amazing Carrie Rodriguez and the amazing Gina Chavez.

© 2022 Brené Brown Education and Research Group, LLC. All rights reserved.

Brown, B. (Host). (2022, Feb 9). Brené with Ben Wizner on Free Speech, Misinformation, and the Case for Nuance. [Audio podcast episode]. In Unlocking Us with Brené Brown. Parcast Network. https://brenebrown.com/podcast/free-speech-misinformation-and-the-case-for-nuance/