The Just Security Podcast

Social Media, Government Jawboning, and the First Amendment at the Supreme Court

March 11, 2024 · Just Security · Episode 59

On March 6, 2024, Just Security and the Reiss Center on Law and Security at NYU School of Law co-hosted an all-star panel of experts to discuss the issue of government “jawboning” – a practice of informal government efforts to persuade, or strong-arm, private platforms to change their content-moderation practices. Many aspects of jawboning remain unsettled but could come to a head later this month when the Supreme Court hears arguments in a case called Murthy v. Missouri on March 18. 

Murthy poses several questions that defy easy answers, driving at the heart of how we wish to construct and regulate what some consider to be the modern public square.

The expert panel consists of Jameel Jaffer, the Executive Director of the Knight First Amendment Institute at Columbia University and an Executive Editor at Just Security; Kathryn Ruemmler, the Chief Legal Officer and General Counsel of Goldman Sachs and former White House Counsel to President Barack Obama; and Colin Stretch, the Chief Legal Officer and Corporate Secretary of Etsy and the former General Counsel of Facebook (now Meta). Just Security’s Co-Editor-in-Chief, Ryan Goodman, moderated the discussion. 

This NYU Law Forum was sponsored by the law firm Latham & Watkins. 


Paras Shah: Hello and welcome to a special episode of the Just Security podcast. I’m your host, Paras Shah. 

On March 6, 2024, Just Security and the Reiss Center on Law and Security at NYU School of Law co-hosted an all-star panel of experts to discuss the issue of government “jawboning” – a practice of informal government efforts to persuade, or strong-arm, private platforms to change their content-moderation practices. Many aspects of jawboning remain unsettled but could come to a head later this month when the Supreme Court hears arguments in a case called Murthy v. Missouri on March 18. 

Murthy poses several questions that defy easy answers, driving at the heart of how we wish to construct and regulate what some consider to be the modern public square.

The expert panel consists of Jameel Jaffer, Executive Director of the Knight First Amendment Institute at Columbia University and an Executive Editor at Just Security; Kathryn Ruemmler, the Chief Legal Officer and General Counsel of Goldman Sachs and former White House Counsel to President Barack Obama; and Colin Stretch, the Chief Legal Officer and Corporate Secretary of Etsy, and the former General Counsel of Facebook (now Meta). Just Security’s Co-Editor-in-Chief, Ryan Goodman, moderated the discussion. 

This NYU Law Forum was sponsored by the law firm Latham & Watkins. 

Ryan Goodman: So, welcome to the Latham and Watkins forum at NYU School of Law with our topic, “Social Media, Government Jawboning, and the First Amendment at the Supreme Court.” This event is also co-hosted by the Reiss Center on Law and Security and Just Security. I'm Ryan Goodman. I'm a Co-Director of the Reiss Center and Co-Editor-in-Chief of Just Security. 

So, I thought to start with a few more words than normal in trying to describe the topic for today, because it raises a certain set of complicated questions and issues, and it’d be good to just kind of level set. So, about two weeks from now, on March 18, the Supreme Court will hear oral arguments in Murthy v. Missouri, in which the justices will consider under what circumstances government communications with social media companies are so coercive, or so intertwined with the social media companies’ content moderation decisions, that those government actions may violate the First Amendment. This practice, of government actors pressuring private actors to make decisions within their companies, has become known as jawboning — and in this context, conceivably a form of censorship by private proxy, or state control in another industry. The issue cuts across ideological and partisan divides. There are thorny First Amendment questions, questions about the government’s use of the bully pulpit, and more. It could implicate Republican and Democratic administrations, Republican and Democratic policies, and the like. There’s no clear partisan line to this. 

The plaintiffs, a group of social media users along with Missouri and Louisiana, alleged that officials across the executive branch — ranging from the White House to the Surgeon General to the CDC, to the FBI, and CISA, the Cybersecurity and Infrastructure Security Agency — coerced or colluded with social media platforms to suppress disfavored speakers and content, the speech having to do with the COVID-19 pandemic, foreign interference in the 2020 election, and yes, Hunter Biden’s laptop. In July 2023, a federal judge in the Western District of Louisiana agreed with the plaintiffs. The court also issued a sweeping injunction prohibiting a large swath of federal officials’ and agencies’ communications with the companies, with a sort of proviso, or carve out, for national security issues. The Fifth Circuit upheld the District Court in large part with a narrowed injunction. Its injunction said that the US government shall not, “coerce or significantly encourage social media companies to remove, delete, suppress or reduce, including through altering their algorithms, posted social media content containing protected free speech.” 

The Supreme Court stayed the injunction and granted cert, and now faces a complicated task of line drawing that our panel will explore in detail. When does the use of the bully pulpit and the attempted persuasion of private actors become coercive? Do the facts cited in the District Court and Fifth Circuit properly represent the power relations between the US government and social media companies and what went on? And did the Fifth Circuit and the District Court properly apply the tests of coercion and encouragement to the alleged facts? 

Our panel is basically a dream team. We thought about who would be the best people to speak on this issue, and all of them said yes. So, I’ll just say a few words about the panelists — whether you’re here in person or joining virtually, you’re here to be with them, not to hear from me at length, so I’ll keep it brief. And I just want to thank the panelists for spending their time with us today, especially since they have extraordinarily busy schedules. 

So, Kathryn Ruemmler is the Chief Legal Officer and General Counsel of Goldman Sachs. She served as White House Counsel for President Barack Obama. Jameel Jaffer, to her left, is Executive Director of the Knight First Amendment Institute at Columbia University, and also Executive Editor at Just Security. Colin Stretch is Chief Legal Officer at Etsy and former General Counsel of Facebook. They come at this issue with deep experience, thoughtfulness, and a diverse range of perspectives. So, after I ask a few questions to get the conversation rolling, we’ll also have time at the end to open it up to questions from the audience. 

So just to begin the conversation — and this could be to any of you; Jameel, I might invite you first, but then Kathy or Colin as well — what is at stake here in the Supreme Court’s opinion? Do you think a lot rides on the outcome, both for social media companies and potentially for other industries more broadly? Because the idea of using the bully pulpit, or persuading private actors, private companies, to bend to the interests of the executive branch from one policy to another, could obviously arise in a range of other industries — not just social media companies, not just the First Amendment. So, would one of you like to start us off? Maybe Jameel?

Jameel Jaffer: Sure. Okay, well, so first, thanks for the invitation to speak on this panel. So, let me start with something that you said in your intro, Ryan, which I think is very important: this case, you know, has a particular partisan valence. It’s important to kind of get that out of your heads, because the same issues that are presented in this case, with Republicans on one side and a Democratic administration on the other, are, you know, the next time around, going to be presented in a very different way. And you can just, you know, imagine if, say, in the summer of 2020, the Trump administration had engaged in a concerted effort over the course of multiple months to persuade the social media companies to take down speech that was supportive in some way of Black Lives Matter. Imagine that that campaign included some public statements — say, President Trump went on TV, and he said the social media companies are killing people; by not taking down speech that is incendiary, or, you know, encouraging people to act lawlessly, they’re responsible for violence against police. And imagine that some of those communications were public, but others were private, and that some of the private communications were accompanied by threats — maybe extremely vague threats, maybe implausible threats — but nonetheless, accompanied by threats. 

Or you could, you know, spin out another hypothetical relating to speech about reproductive freedom post-Dobbs, or today, speech relating to Palestine or Israel, right? So, you know, pick your category of speech, change the actors, and try to see this in a principled way, even though the facts, as they’re presented to the court, really are, you know, straight out of the culture wars.

So, you know, that’s one thing. And maybe that’s an indirect response to your question, right? Like, how significant are these issues? I think that these are really significant issues, and whatever the court says in this case will have implications that go far beyond these particular circumstances. FIRE, the free speech organization, filed a brief — which I don’t entirely agree with, but it very usefully lists other contexts in which government actors, both Democrats and Republicans, have engaged in efforts to pressure speech intermediaries to change their policies, or take down speech, or leave speech up. And it’s a long list. You know, it’s a long list.  

So, that's not to say I think the answer to this, you know, this particular case is easy. I think this is, in some ways, a hard case. Well, you know, we'll get into it. But, to answer at least part of your question, Ryan, I think, you know, it's an important case. It's going to have real implications, not just, you know, in this context of speech relating to public health, but also speech related to all sorts of other things. 

Ryan: Okay. Do any of you want to weigh in? 

Kathryn Ruemmler: Sure, I'm happy to. You know, one of the things that I think is most interesting about this case, that makes it different than the hypotheticals that Jameel set forth — which I agree, I mean, that you do have to think about, really, the broader implications of whatever the Supreme Court might do, as opposed to the particular circumstances presented by this case — but one thing that I thought was interesting, when I was reading through the lower court decision and the appellate court decision, is there really was no recognition by the courts that the vast majority of these communications between the government officials and the social media companies related to a global health crisis. And that, I think, is really critical because, you know, if you think about what is the purpose of the government, why do governments exist, it's really, you know, to protect the health and safety and welfare of its citizens. And so, this was really a crisis situation, in which the only way to get this pandemic under control was really to get very significant parts of the population vaccinated. 

And so, you know, I think that it is fair to say — and just as I'm, you know, reading between the lines and some of the communications, and we're going to get into some more details — but that urgency of the officials in really trying to make sure that misinformation was not being broadly disseminated to the population in the wake of this crisis, I think makes this case much more difficult in some ways than, you know, some of the hypotheticals that Jameel put forth.

Colin Stretch: I would just add, from the industry point of view, I think what these services mostly are trying to do — I mean, they’re trying to do a lot of different things — but in this context, really what they’re trying to do is prevent their online services from essentially contributing to or resulting in offline harm. That’s sort of the basic principle that tends to govern, and has for, you know, a decade or more, the kind of going-in perspective on, are we going to moderate content? And if so, how? 

When I think about what’s at stake — I mean, it’s so interesting, it was a global health crisis. Hard to imagine anything sort of more important, with more possibility of offline harm — so, more relevant to that lens. Also, somewhat bizarrely, it’s still kind of hard to wrap your head around why it became so politicized, and yet it was acutely politicized. And that’s often not the case with the possibility of offline harm. So, for example, the government talks a lot about terrorism and child safety. Those are important moderation contexts that are typically not as politicized, where these questions don’t come up in the same sort of frame. 

The point I'll make is that when the companies are doing this, they're doing this in a lot of different lanes. You know, they're technology experts, they're not terrorism experts, they're not child safety experts, they're not health experts. And whenever they choose to sort of wade into a topic, they have a lot of learning to do. And how do they learn? Well, they talk to people. They talk to experts. There's lots of NGOs who have contributed enormously to these content policies and the enforcement decisions that these companies make. The government's probably the best resource on most of those topics. So, when I think about what's at stake, I think about, God, that'd be a real bummer if you're trying to do that job, and you weren't allowed to talk to the people who are most informed.

And if you can sort of lift out of the sort of, as Jameel said, the politicization of the issue, it becomes, in my view, a relatively straightforward case. Do you want the companies to be able to talk to the government about this stuff? Do you want the government to be able to talk to the companies about this stuff? I say yes.

Jameel Jaffer: Can I —

Colin Stretch: We're going to join issue sooner rather than later.

Jameel: I think that’s right. Yeah. So, okay. So first, let me just say, I’m very sympathetic to the Biden administration in that particular episode involving vaccine misinformation. I mean, I think that the Biden administration had a responsibility to counter vaccine misinformation, had a responsibility to, in some way, alert social media companies when what the Biden administration saw as vaccine misinformation, or public health misinformation, was being disseminated on those platforms. I’m very sympathetic to, you know, to all of that. 

But here are the things that make this, I think, maybe a more difficult case than Kathy and Colin acknowledge it is. So first, I don’t see a real distinction between a public health crisis and what the Trump administration would have characterized as a civil unrest crisis or a law enforcement crisis. And I think, you know, after, say, the Snowden disclosures, the Obama administration would have characterized those disclosures as sparking a national security crisis, or as themselves a national security crisis. And so, I’m not sure that we can cabin — I worry about whether we can cabin any rules we come up with in this context to the public health context, even though I’m very sympathetic to, you know, the Biden administration on that particular set of issues. So that’s one thing. 

The second thing is — even if you can cabin it, right? Even if you think, well, public health is different: in a public health crisis, the government needs to have the power to talk to the public, to talk to social media companies, to talk to other speech intermediaries, and even to pressure them and to coerce — even if you accept that public health is different and we’d have a different set of rules in this context, so much is at stake in that context, right? So much is at stake. But isn’t it for that very reason also true that it’s especially important that government speech be subjected to real checks and counterweights, right? And if what was going on here — and we’ll talk about what actually was going on here, because, you know, the facts are important — if what was going on is that the government was essentially overriding the judgment of the social media companies, then, you know, I don’t think that’s a good thing. It’s not a good thing for a democracy, in the context of a public health crisis, to have a government that has the power to do that. So, you know, for all those reasons, I just think that this case is more complicated. 

Since I have the floor already and introduced the facts, I think that one of the things that the Fifth Circuit did, and even more, the District Court did here, is sort of lump together a lot of different things — some of which might be totally unproblematic, and others that, you know, raise real questions — and then draw kind of broad conclusions, in a way that makes it difficult to know precisely which facts were the basis for which conclusions. Let me just leave it there.

Ryan: Kathy, can I ask you to put on your hat as former White House Counsel? And the question of — you can either talk about the facts that we have, or just basically, from that role, what do you want to see US government officials doing? What are the kinds of instructions that you think they should abide by in their communications with social media companies? Are there — what are the kinds of things that you're worried about or considering in that role? And are there red lines that you were trying to instruct folks in the White House, and maybe in the departments and agencies, about what they should or shouldn't be saying, by way of what might be the examples of what might cross a line of coercion, what might cross a line of threatening the company with X, Y, Z sanction, or official action? Is that what you're thinking about? What are the things that you would say, or at least should be best practices? And then we can talk about whether or not we have in the record here something suggesting they crossed it?

Kathy: Yeah. Well, I think there are some things in the record here. I’ll start there, and then I’ll back up. I would say I might have phrased a few things a little differently if I were scripting people — so, I’ll say that diplomatically. But, you know, the conclusions that the Court drew, particularly with respect to, you know, the people who were speaking — the government officials who were speaking — let’s talk about the White House officials, because I think that’s where there were some pretty broad conclusions drawn by the Court about, you know, sort of the great power of the White House, which is, I think, quite detached from reality.

And, you know, to Jameel’s point about facts mattering, if you start with the social media companies — I mean, these are some of the most powerful and sophisticated companies literally in the world. They have multiple former government officials working there. While they, you know, need to consult, to Colin’s point, about particular subject matters, these are companies that understand kind of how the world works, and how the government works. These are not individual citizens who might, you know, be kind of overwhelmed by getting a call from, you know, the guy who’s running the digital strategy at the White House. So, you know, I think that is an important context.  

And then who is speaking — which White House officials are speaking? That actually matters, too. And if you have an appreciation for where the real enforcement authority lies, it’s not with the guy who’s running the digital strategy at the White House, I can tell you that. That guy literally has zero authority or ability to influence a decision at a federal agency that would have some regulatory authority over any of these companies. So that, I think, is a factual thing that matters. 

And I think, too, you know, when I said there are some things that I think I would have drafted differently — just to get into the facts for a second — some of the comments that were made, including references made by the White House press secretary, were along the lines of, well, there could be reforms to Section 230, right? And so again, going back to the sophistication point, that’s not something the White House can do. That’s something that has to be addressed by Congress. And it’s kind of like saying, you know, the sky is blue. I mean, it’s so obvious. There’s been a lot of, you know, public debate about whether there should be reforms there in any event. And so, the idea that that in some way would be coercive to this very sophisticated set of companies is, I think, kind of silly, frankly. 

I think there was passing reference to antitrust enforcement. That was one that, wearing my White House Counsel hat, I think was unfortunate. And I’ve been very diplomatic here. But the reason I think it was unfortunate is actually not for the reasons that the Court suggested. It’s not because I think in any way, shape, or form, there would be some enforcement action brought as a result of the companies’ unwillingness to moderate content, but rather that if, independently — which is the way this happens in the real world — there were some antitrust issue, the fact that that statement was made has now complicated the actual on-the-merits case that the Justice Department or the FTC might bring. So, you know, those stray comments in my mind were — again, given the audience, given the people who heard the conversations — sort of political rhetoric, really, to show that the White House was doing everything it could to ensure that it was really addressing this public health crisis. 

Ryan: And I guess, Colin, can you come at the same question from the other side, from within the social media companies? Let’s put it this way — one part of the legal test of coercion is whether or not the recipient of the communication perceives it to be threatening. Are there any communications here that crossed the red line, or that you think about as a red line?

Colin: I mean, I share Kathy’s view that, you know, these are big companies. They don’t scare easy. They’ve got folks who, you know, have worked for the government. So, they kind of get it. And, you know, I left Facebook in fall of ‘19. I think 2020, 2021 — that was a very intense time. And I do think there was a lot of pressure brought to bear. So, I don’t want to overstate this. But the reality is, there’s often a lot of pressure being brought to bear on these companies from government officials all over the world. And the question when you have an incoming is, typically, not how mad are they, but what are the actual consequences? You know, it’s not like, gee, they’re really mad we haven’t returned their email. Like, okay, let’s return their email — that’ll, you know, sort of take the temperature down, and you always want to try to settle things down. But the question is, like, what are they going to do? And in the US, happily, from my perspective at least, not that much, right? 

So, to me, the fact that really underscores how the companies would have reacted to this is the kind of blood-on-the-hands moment — they’re killing people, right? So, President Biden went out and said, you know, these companies are killing people, which is extraordinary, right? That’s an extraordinary statement by the President of the United States. It felt, I’m sure, terrible to the companies. Certainly, you know, these are people who work very hard every day. They’re trying their best to do right by their company and by their users. And I’m sure that felt awful.

And what did they do? Mark went out and said, don’t blame us for your failed vaccine rollout strategy, right? That was not a company that was like, oh, we’re going to be cowed into submission. It was like, this is on you, not on us. And that, to me, is as it should be, right? Who’s right or who’s wrong, you know, I won’t mediate that. But they weren’t like, oh, now we’re going to tuck our tail between our legs and do whatever the government says. It’s sort of rhetoric, it’s playing out in a political forum. And, you know, that’s kind of, that’s life in the big leagues, to switch to a sports metaphor, and these companies have been playing there for a while. They’re sort of okay, in my view.

Ryan: So, Jameel, do you want to come in on that — in this particular case, with this particular set of facts, dealing with the large social media companies? Is there anything that you see that crosses the constitutional line — which, if you agree with it, is really between what is persuasion, using the bully pulpit, which in some sense is what you could say Biden was doing by standing at the bully pulpit and saying these companies are killing people, and something that’s more about coercion, where coercion involves a specific threat of some kind of sanction? Because what I’m hearing Kathy and Colin saying is that that’s not at play here. These are — in some ways, even, the companies are bigger than the government. I mean, the power relationship here is not one in which, at least with these actors — 

Colin: That’s not what I was trying to say about that, but, yeah. 

Ryan: Sorry — or maybe they are bigger than the particular individual, like the person who’s doing social media content for the White House. So, is there something you see that is more about the principle in the case — about future cases, smaller companies, other situations, other episodes? 

Jameel: So, let me answer the bigger question. So, maybe it’s worthwhile just to go back to why coercion is the line — or at least I think it’s the line, and maybe everybody on this panel thinks it’s the line.

So, I think that these cases are hard because there are First Amendment interests, or at least interests that sound in free speech, on both sides of the V here. Like, on the one hand, you have the interests of the people in having a government that can actually govern, and governing requires speaking, including speaking to private speech intermediaries. It’s entirely legitimate, I think, for a government to pressure private speech intermediaries to be more attentive to what the government views as the public interest. So, all of that is on one side of the V. And I see that as a kind of free speech interest because I think, ultimately, it’s not a government speech right — it’s really a right that inheres in the citizenry. It’s a right of the citizenry to have a government that can do those things. 

And Colin mentioned this earlier, but sometimes the government has information that nobody else has, and you want a government that can share that kind of information, right? We want our government to be able to share whatever the CDC comes up with with private speakers. And the same is true in national security. The government often has information that others don’t have. And while the government’s information is not always reliable, we want the government to be able to share that information with journalists and, you know, with others.

So that all sounds, at least in my mind, in free speech. Those are all kind of free speech interests, even if you don’t usually think of them as First Amendment interests. On the other side of the V, you have the interests of the companies — the social media companies, or speech intermediaries more broadly — and their users in having expressive spaces that are free from government coercion, and that reflect the autonomous editorial decisions of the people who are in charge of those spaces. Both of those are important interests, and the way that our law reconciles or balances those two interests is through this distinction between persuasion, on one hand, and coercion on the other. So that’s where I see this persuasion-versus-coercion rule coming from. It’s an acknowledgment that there are important interests on, you know, both sides.

Applying that rule is much more difficult than stating what the rule is, because, you know, so many facts might be relevant to whether something is coercive, right? It might have to do with the tone of the communication. It might have to do with whether there was a threat, whether the threat was explicit or implicit, whether the person making the threat had regulatory authority. In the brief that we at the Knight Institute filed in the case, we put a lot of emphasis on whether the threat was public or private. And we see private communications presenting a much more significant First Amendment problem than public ones, because the private ones are not subject to other checks, right? When President Biden goes on TV and says social media companies are killing people, lots of other people can weigh in on whether that’s actually the case — whether it’s the social media companies that are responsible for this, or it’s the Biden administration’s failed rollout, right? Whereas if it’s a private communication — you know, somebody at the White House sends an email to somebody at Facebook saying, take down this post or else — that doesn’t, you know, get any pushback, because nobody knows that it exists. 

So, this question of whether something is coercive, I think, you know, is inevitably a fact-intensive one. It’s very difficult to come up with principles in that space. But, with respect to this particular case — you know, are the social media companies just un-coercible? — I think there are a few things to consider here. One is, in every regulated industry — and I think of social media companies as part of a regulated industry, even though the regulation is, you know, arguably pretty light touch — in every potentially regulated space, you have a phenomenon which Daphne Keller has called — I don’t know if this is her term, but I’m stealing it from her — anticipatory obedience. Sometimes, for the regulated entity, or the entity that thinks it might be regulated, one of the things going on in its decision making is, well, why don’t we do it this way just to avoid the possibility that in the future, the government will regulate us more harshly, or regulate us in a way that we’re not regulated now, right? And it’s hard to, you know, prove that that phenomenon actually had some traction in any particular context. But it’s a phenomenon that we’ve seen in other contexts, and it would be quite a surprise if it didn’t exist in this one, too. 

The second thing is the scale of the social media companies. You know, social media companies are so big that they just have no stake in whether — they just don't care whether any particular piece of content stays up or not, or at least they don't care very much about it, right? It doesn't have a lot to do with their bottom line. Whether any particular piece of content stays up or not doesn't actually change the character of the expressive community that they have created. And so, if you have a government official saying, take down this speech, it's just easier for a social media company to say yes. That's not to say they always say yes. They don't. But it's easier for them to say yes, because they don't care that much. It's different from, you know, the government going to a bookstore that specializes in books for the LGBT community and asking that bookstore to take down this set of books. That bookstore is going to have a kind of expressive interest that social media companies don't have in any particular piece of content.

Last thing I'll say. One of the things that I think First Amendment doctrine can help do in this sphere is protect a kind of editorial diversity. You know, if you allow the government to coerce the social media companies, we're going to have one public sphere — the digital public sphere is going to be the same on Facebook as it is on Twitter as it is on every other platform, right? Because everybody will, at least at the margins, adopt the government's rules. So, one thing that this rule against coercion does is protect a kind of diversity. But as a matter of practice, the social media companies often operate in a kind of cartel-like way — now I'm stealing somebody else's work — they operate in a cartel-like way in the sense that once Facebook has decided that, say, Alex Jones shouldn't be on its platform, most of the other platforms just follow. It's just easier to follow suit than to strike out on your own path here.

And so, you know, where does that leave us? You know, I guess I'm just saying, it might be naive to expect the social media companies to be reliable proxies for the speech interests of their users.

Kathy: Well, I love the sympathy for regulated institutions. I work at Goldman Sachs, which is probably in the most regulated industry, and I often feel coerced every day, but that's a different subject. What I would say, just in response, is, how do you draw the lines — and I do think we sort of all agree that coercion is probably an appropriate line. It's a rule that we have in the existing case law. But you know, in my mind, one way to do it — and I think if I were on the Supreme Court, this is probably how I'd do it — is to say, it better be really, really clear that there was, in fact, coercion, and that the recipient of this speech actually felt coerced. And in at least my review of the record, that's just not present here. I mean, there's really no evidence that I saw that anybody in any of these companies actually felt coerced. There's no internal email saying, oh, if we don't do this, they're going to bring an antitrust case against us. Or, if we don't do this, you know, they're going to do X, and we better do what they say. And there's not really any evidence that, you know, anybody at the government was telling them to change their policies. They were rather saying, look, we want you to enforce your content moderation policies. There's really no dispute in the case, as far as I can see, that the content at issue was, in fact, misleading or a misrepresentation or otherwise potentially harmful to the public.

So, you know, one way, potentially, that the Supreme Court, I think, could do something relatively non-dramatic, is just to say, look, we think that there are cases in which coercion is present. But this ain't it. And, you know, that's easier said than done because they have to articulate some things. But in my mind that's sort of the most important thing. And I think the nature of the recipient of the alleged coercive activity — you know, how sophisticated that person is, is that person an individual, what is the power differential between the government and the recipient — I think those are all really important factors that, frankly, the lower courts didn't spend a whole lot of time really even addressing.

Colin: So yeah, I kept looking for the — so if you read the Fifth Circuit decision, there's this whole, like, wait, is correlation causation? Because they kept saying, and lo and behold, the companies got in line, and this content came down. And just one sort of background understanding: anytime you're talking about these companies generally, and particularly on the moderation side, the scale is mind-boggling. And they're not very good at it, right? I mean, they've gotten a lot better. But it's a combination of people and machines that are trying to apply these rules that are kind of complicated across just a mind-boggling array of content. And they make tons of mistakes.

And so frequently — I mean, this started happening in, you know, probably 2012 or so — we'd get a lot of incoming from media that would say, like, hey, this is up, this is up, this is up, this is up — and we'd look at it and say, oops, take it down. So, is that coercion? No, that's like, oh, God, we looked at it, and it should come down. And so frequently, what's happening is it's just drawing renewed attention to content that should have come down. I was sort of looking for the — Jameel, I think you used the term "overriding" the judgment of the companies. And I didn't see it. And in the absence of that, and in the absence of anything, like Kathy said, suggesting the companies were like, oh, gosh, you know, there's going to be a consequence if we don't get in line — it's hard for me to see, you know, a real problem in this case.

I don't want to say there's no issue here, though. I think, you know, Jameel at the outset brought up the fact that this topic was sort of intensely political. Frequently, this stuff comes up in contexts that aren't political, where nobody is going to stand on the side of speech. So, I think the terrorism stuff is a great example, and child safety is another one, where, you know, for years the companies had policies that would prohibit anybody who was a registered sex offender from having, for example, a Facebook account, or a YouTube account, or various other services. So, people who had paid their debt to society were effectively locked out of participating in the digital world by virtue of the fact that they had been convicted of a sexual crime — which was never the law. It was solely, at least in my experience, the result of a particular state attorney general who went around poking the companies and saying, you need to do this. You need to do this. You need to do this. No process. Certainly no indication that the disability this attorney general was trying to force on the companies was not overbroad. And it resulted in every company adopting the policy and a lot of people being, you know, kept offline. And who's going to stand up for the registered sex offender, right? Nobody. So that's, I think, a really good example.

And I think there are probably others, particularly in the terrorism space, where you do have, I'd say, you know, quite problematic content that is being used to radicalize youth around the world. But, you know, are they drawing the line where it needs to be drawn? Or are they drawing the line a little bit more broadly, because who's really going to stand up and defend somebody who's been accused of fomenting terrorism? Nobody. And so, there are issues here about the companies' line drawing in the back of all of this. I think the basic fact — and I think this is really what's driving a lot of it — is, you have companies that matter a lot to our discourse that have very little transparency into how they make decisions affecting content and very little accountability. So, I mean, that's kind of what's going on here, right? There's a deep suspicion, among some people at least, that they're making decisions with political motivation, or other motivations in mind, and nobody can really test it. And that's where this case comes from. I think that's where the NetChoice cases come from, involving the Texas and Florida statutes. I think that's where the Facebook Oversight Board comes from, right — the companies' recognition that we've got to try something to create some degree of accountability. Otherwise — I mean, that's sort of a good example of, what did you call it? Anticipatory obedience — this state of play where there's very little transparency and very little accountability on questions of speech, that's not a sustainable situation.

Kathy: Well, and the reality is that this case — it's really not a coercion case. What's really going on here is that the Biden administration had more influence with these companies than perhaps the Trump administration would have had. And so, you know, it's interesting that the communications back and forth, at least in my mind, actually show a willingness on behalf of the tech companies to be very engaged, because they're sort of like-minded on this topic. My guess is — I'm not sitting in the room — but my guess is that most of the people in senior positions at companies like, you know, Facebook, and what was then Twitter — it might be different now — were sort of like, yes, we believe that masking is helpful to, you know, stemming the pandemic. We believe in the vaccinations. In some ways, the government was pushing on an open door, which is why I think it's interesting that this is in the coercive frame, because I think in the real world, that actually wasn't what was going on at all.

Jameel: It's possible. I think you're right that they were in some sense like-minded. But it's also possible that they were even more like-minded because they knew that there was a cost, or there might be a cost, to not being like-minded, right? And the question that Colin was asking earlier — you know, where's the evidence that somebody actually changed their editorial judgment? I don't think it's possible to say with any certainty that any particular editorial judgment was made differently by one of these platforms, at least based on my understanding of the facts. And so, if that's the test, then the plaintiffs here fail.

But I guess I might approach the question a different way and ask not, can you show that there was a change in editorial policy as a consequence of the purported jawboning? I might just ask, why should the First Amendment protect this activity on the part of the government? What is the rationale for, say, protecting this communication from somebody at the White House — and, you know, this person had no regulatory authority. I mean, if somebody at the White House writes to somebody at Facebook saying, you've got to take this down, and then there's an email that follows up saying, you know, you're hiding the ball, and a third email that says, if you don't fix this, we're internally considering our options on what to do about it, right? It's a private email from the White House to Facebook.

Now, I don't know if that resulted in Facebook doing something differently, but why protect it? You know, what is the interest we're trying to serve — why does the government need the ability to send that kind of private email to a speech intermediary? Why couldn't it say something similar publicly, or why couldn't it say it without the implicit threat?

You know, I guess I don't see the justification for distorting the First Amendment to accommodate that kind of government conduct. It’s better just to say to the government, you know, if you want to communicate with the social media companies, do it in a different way. You can share information. You should share information. You can communicate that they're not doing what you think they should be doing, but communicate that publicly, so that other people can, you know, can see it and respond to it. 

Kathy: But isn't it appropriate for the government to say to a company that has an enormous amount of social influence and authority, you know what? You say you have these policies; you should enforce them — to kind of hold the company's feet to the fire? Isn't that an appropriate thing for the government to do, because the government is one of the few entities that actually, you know, is able to have some persuasive authority over the company? And, you know, the government's not sending those emails — which were the emails that I thought were not so great — to an individual speaker. They're not sending them to the person who is speaking. They're sending them to this platform — as you say, the speech intermediary. And as an individual person, I don't have a right to publish my content on Facebook. I have a right to publish my content consistent with their policies. And so, I do think it's important, at least in my mind, analytically, to focus in on the fact that it's not that the government is censoring speech. In this instance, the government is saying, you know what? You, Facebook, are making a lot of money. You have a lot of influence on our citizens. You have a policy. You say you uphold your policy. Boy, I'm looking at this content — sure doesn't look like you're upholding your policy to me. That seems appropriate, in my mind, for a government official to do.

Jameel: I'd feel differently about it if it were public, rather than private. But also, I mean, you're obviously right that, at least as a matter of First Amendment law, a user doesn't have a constitutionally protected right to publish on Facebook's platform. On the other hand, a user does have a right to publish on Facebook's platform to the extent Facebook wants to publish that speech. And the user has a right not to have that relationship distorted by government coercion, right? So, I think it just brings us back to the question, well, is it coercive or not? Because the user does have a right not to have his or her speech interfered with by that kind of unconstitutional pressure — if it's unconstitutional.

Colin: I'm pro-transparency. So, Jameel, I think you and I are aligned on this — I think it answers a lot of questions if they're willing to say it publicly. And then the companies can do some sort of transparency report to say, here's how frequently we, you know, adhere to these requests. Again, every government where at least Facebook operates has a mechanism for making requests to the company. Most of the time — I shouldn't say most of the time — a lot of the time, the requests that come from foreign governments are really problematic. They're sort of like, you know, this person's a terrorist. Actually, no, this person is a political opponent of your cousin. So, you know, we're going to leave that up.

But, you know, bringing all of that out into the light, I think, is really healthy, and I don't really see a lot of value in enabling, as Jameel said, private communications. It just, you know, creates more possibility of abuse. It reminds me a little bit of the wake of the Snowden revelations. Now — I'm going to date myself — we're going back a decade or so, but I think there was a lot of concern that a lot of what the government was doing with respect to gaining access to information from both telephone companies and internet companies just wasn't aboveboard — that it was done confidentially and not pursuant to legal process. Now, it turns out that much of that, maybe even most of it, was overblown and inaccurate and reflected just a basic lack of understanding of the tools that the government had available to gain access to information for various national security purposes.

But what I think really helped address that situation was the steps that both the government and the companies took towards transparency, right? Towards saying, here's how many requests in these different lanes we're getting, and here's how we're acting on them. And then people can look at that and say, okay, this is a vanishingly small amount of information that's actually being turned over to the government, and it's pursuant to these legal authorities. Let's just move on. I mean, it took a few years to move on. And, you know, it's a little hard to see exactly how that would apply in this case, but I do think the instinct is right: let's regularize this. Let's have channels where you can report either complaints about policy, and then the companies can decide whether to change policy, or you can flag content that you think violates policy, and then the companies can say yes or no, it comes down or not. And let's just be public about it.

Ryan: I guess I want to, in a certain sense, shift to another potential legal avenue for the Supreme Court, which is the opposite of coercion, in a certain sense — the way I think of it, as strong encouragement. So, if we are in a realm in which the companies are aligned in their interests with the government, and the relationship is cooperative, is there a risk — and are the lawyers who are having to litigate this thinking about the risk — that if they say, actually, it's not coercive at all, it's highly cooperative, then they've kind of slid into a separate test of strong encouragement that makes it a form of state action, and then the actions of the company are attributable to the government? And it's funny — one of the questions presented to the Court reads as whether the government's challenged conduct transformed private social media companies' content-moderation decisions into state action and violated respondents' First Amendment rights, which in some ways even sounds a little bit more like the strong encouragement theory, even though I think coercion is really the heart of it. But is that also something that we should think about? Have we kind of veered into that lane, or are we still in some sweet spot in between the two?

Jameel: I mean, I guess I'm not very sympathetic to that particular argument, for two reasons. One is factual: you know, I think it's important to recognize that what's going on here with the social media companies may be different in degree from what goes on with other speech intermediaries, but it's not really different in kind. Many conversations between journalists and government officials include jawboning of one kind or another, right? If you're a national security journalist, and you're talking to a national security official about some secret government program and how it should be characterized, a lot of that conversation is going to be the government trying to convince you, the journalist, that the program should be framed in a particular way.

And I, you know, actually talked about this case with the publisher of a major media organization, and his reaction to it was, I hope nobody starts thinking about the conversations that we have with government officials every day — because, you know, they're entangled with the government. And after the Snowden disclosures, The Guardian, The New York Times, and The Washington Post all had conversations with the government before they decided to publish. They gave the government an opportunity to say, if you publish this, the sky is going to fall. Now, they didn't listen, or they didn't always listen, right?

Colin: They said it and they published anyway.  

Jameel: Yeah, well, they didn’t publish everything though, right? And the same was true in 2005, when Eric Lichtblau and James Risen wrote the story on the NSA’s warrantless wiretapping program. You know, conversations between The New York Times and the government were part of the reason The New York Times held its story for almost a year before publishing it. So, I'm not defending the, you know, the result of these conversations, but it's a fact that journalists and editors of conventional, or sort of legacy, media organizations have these kinds of conversations with the government all the time, and those conversations involve jawboning. 

So, this goes back, I guess, Ryan, to something that you asked right at the beginning, which is, you know, what are the broader implications of this case beyond social media? Social media isn't the only place where jawboning takes place. So, on the facts here — you asked about this argument that the social media companies have become state actors because of the entanglement — I think this entanglement is not really distinctive to social media, even if it's a little bit different in degree.

The other reason I don't love this argument is that I think about what the remedy would be. So, let's say we accept that, you know, Facebook became a state actor by participating in this cooperative conversation with the government about what should stay up and what should come down. What's the result? A court is going to issue an injunction requiring Facebook to keep up certain information that even Facebook says it wants to take down? I guess I just don't see this leading anywhere good.

Colin: I have to say, I had the same reaction on just tracing through the doctrine. But if you look at the industry brief, which is very short, they're basically saying, whatever you do, don't say that we're governed by the First Amendment — the First Amendment's wonderful, but it would create a massively permissive set of moderation rules where, you know, you couldn't take down just about anything.

Kathy: I was going to say that, you know, the plaintiffs with the best argument would be the registered sex offenders on that score. 

Colin: Yeah. 

Kathy: Because there probably is really a much stronger evidentiary case there for, like, you know, real coordination — such coordination that it's almost entanglement. So that would be a perverse outcome, no pun intended.

Ryan: So, one other question, then I'll open it up to audience.  

Kathy: Could I just say one thing about the first question that was presented? Because I think, to me, it's super fascinating, which is the question of standing. You know, if there's no sort of coercion — putting the entanglement thing aside — if there's no coercion, it strikes me that there's no standing, right? Because there's only an injury in fact if, in fact, there's some First Amendment violation. And there's only a First Amendment violation if, in fact, you know, the government has caused that violation. So, it'll be interesting to see how the Court sequences this — you know, the first question presented is, do they have standing? It's not entirely clear to me that they do.

Colin: I was picturing sort of a trial where the Great Barrington Declaration, the validity of it actually gets litigated. Like, is it misinformation or not? 

Kathy: Yeah, right, exactly. I mean, it's just a very, very interesting thing, because there is no injury in fact if you haven't found that the government was, you know, responsible here for, you know, "censoring this speech." So, I don't know how they're going to figure out how to sequence that argument. But anyways, I was fascinated by that.

Ryan: And it's also fascinating, just in terms of the entire conversation, that you have to always keep in mind that the social media companies are not the plaintiffs in the case. They're not the ones saying, we are aggrieved, we were coerced — and that comes up with the standing question as well.

And that is also the segue to the one question I was thinking to ask you all before opening it up, which is prediction. So, how do you think the Court will rule? And in part, because — going back to, I think, one of the very first things that Jameel said — this particular set of facts comes out of the culture wars. There is a lot of raging disinformation that I think informed the district court judge's opinion, including about the role of Fauci, the role of the FBI, and the Hunter Biden laptop.

So, there's that part of it as well — this is such an unusual case, in which I think we might actually have a Supreme Court ruling, or concurring or dissenting opinions, that are based on disinformation itself. And there's the question of the technological capacity of the justices and their clerks to grapple with this particular set of cases — in the sense that the Fifth Circuit, for example, conflates what it means to demote content versus to take content off the platform entirely. They conflate the two in their own analysis, and that's been pointed out by others. So, where do you think the Supreme Court might land on this, given that it's got some of the same kinds of technological problems that they've faced in the past and have dodged, and it's also got this culture wars element to it? So, they might not perceive it as non-ideological and non-partisan. Any sense of where you think this will end up, or what we'll see in the oral argument itself?

Kathy: My prediction is they're going to try to drastically simplify this case. And I think that they'll reverse the Fifth Circuit, and I think that they will probably try not to pronounce a new rule, would be my prediction.

Ryan: Just want to go down the line — Jameel/Colin, Colin/Jameel?

Jameel: I think Colin’s prepared to answer the question in a way that I’m not. 

Colin: Yeah — look, I'm not a scholar in this area, so I don't know how much the courts are crying out for clarification on, sort of, where the coercion line is, and whether the significant encouragement test needs to be, sort of, updated.

Yeah, I'd like to see them go off on standing. I'd like to see something approaching, you know, a significant majority. I do think, though, to my earlier point, there's deep skepticism and concern among some members of the Court — and you saw this at oral argument in the NetChoice cases. And I think the fact that those are being argued in such close proximity really matters. I think there's deep skepticism among some members of the Court about the way in which these companies censor speech. And so, even if they do resolve the case relatively narrowly, I would expect to see somebody saying something reflecting that concern.

Jameel: Yeah, the short answer is, I don't know how the Court is going to come out. But one thing that is clear from the NetChoice argument, and from the previous social media arguments — the Gonzalez argument and the Taamneh argument — is that multiple members of the Court, on the left and the right, are struggling with this question of whether the First Amendment doctrine we've inherited is actually adequate to this new factual context. And, in particular, this question of whether it's enough for the First Amendment to be concerned with the speech intermediaries, or whether the doctrine needs to somehow accommodate, or be sensitive to, the speech interests of the users. According to existing doctrine, the relevant question is whether the speech intermediaries have been coerced. But if you are skeptical that the speech intermediaries are reliable proxies for the expressive interests of their users, then maybe focusing myopically on whether the platforms were coerced is not adequate. You need a doctrine that somehow, you know, accommodates the speech interests of the users.

And I think that the Court was struggling with that question in NetChoice, you know, where these two states, Florida and Texas, have enacted social media laws that are purportedly about protecting the free speech interests of the users. You know, even justices who seemed skeptical of those particular laws seemed sympathetic to the idea that maybe something does need to be put in place to ensure that this handful of social media companies that have, you know, become gatekeepers to the digital public sphere are actually representing the interests of their users. And I think you see that across the political spectrum on the Court. I don't know what that tells you about what they're going to do in this case.

Ryan: So, I think there are two mics on either end, if anybody would like to ask a question.

Audience Member: Thank you for having this session — and it is a dream team. I find myself, oddly, perhaps for the first time, in deep agreement with Mr. Jaffer. I would like to ask you to reconsider something. It's the fate of any senior person at a public company to be threatened by government officials. I can't tell you the number of times it happens. So, I'm not worried about senior officials being threatened by government officials — although I do carve out regulatory officials, and you've addressed that. That's a much different kettle of fish.

I think we ought to reconsider the importance, though, of the intertwining — the second potential route for transforming private action into government action — because I think that the reporter-publisher scenario that was played out in mainstream media is vastly different from the way a large corporation is going to deal with a repeated set of government interactions, right? In my experience, what happens is that these things tend to get routinized. If there's going to be a consistent, private bully pulpit — let's give it that name, though I don't quite understand how there can be a private bully pulpit; it's sort of internally inconsistent.

But if there's going to be a consistent set of communications where the government is going to be telling you these things — you're not doing a good job with your content moderation — which, I think, once it starts, is not going to be limited to a really severe crisis, because the threshold crisis is going to get less critical all the time, then the government is going to get used to doing this. The company is going to get used to doing this. This gets routinized. It gets routinized by pushing the interaction down in the corporate hierarchy to people who will not claim sophistication — because, you know, what general counsel are you going to find who will say, I'm so unsophisticated, I'm coerced by the government? We're all going to say we're totally sophisticated and uncoerced. But the guy, or the woman, who's, like, an assistant director, or someone else, who has someone from the White House call them up and say, you're not doing a good job — that, I think, is something to be worried about, right? Because the intertwining is going to happen at that level, and then there's going to be a frequency to it, and then people are going to get used to it. And before you know it, there are going to be decisions that are changed by this consistent interaction with the government. So, I would ask you to rethink the notion that that part of it is unimportant, because I think that these things get routinized in a way that strips out the ability of more sophisticated people to say no.

Kathy: I think — just to respond to that — that's a fair point. I just don't think that's a First Amendment problem. Maybe it's a question of whether there should be a regulation — it would have to be a statute to apply to the whole government — that in some way restricts the way in which government officials interact with private companies. But that strikes me as, yes, an absolutely real dynamic. And it isn't fair to just say there's no coercion, because there clearly is — it's clearly influential at a minimum, right? Even with people at senior levels. At any company, they're going to pay attention to what government officials say. So, I agree with you on that principle. I'm just not sure that the remedy is to deem the government to be, basically, censoring speech. But that's a fine distinction. 

Colin: So, you know, the dynamic you described, I think, is very, very real. I think you've stated it accurately. And I'm sort of sitting here trying to think about why it doesn't bother me that much. And I think the reason it doesn't bother me that much is that it is important, but not isolated. And by that, I mean that dynamic — that sort of intertwining and that sort of almost collaborative engagement — is happening with dozens, maybe hundreds of groups. And so, at least in my experience, it was never the case that the government had pride of place in those discussions, with the possible exception of national security issues — although even there, there were a lot of folks who had a lot of important information to share who were not associated with the government. And so, I think that's part of my relative ease or comfort with the dynamic you're describing. 

And then the other thing I'll say is, you know, I did mention the lack of accountability in terms of speech decisions that are made by these companies. But there is one bit of accountability, which is that anytime it's politically charged, the side whose ox is getting gored sees it very quickly. They see it in the reach of their pages. They see it in their impressions. And they go bat blank crazy. I mean, they are extremely vocal, and so the minute you alter an algorithm or adopt a policy that's going to disfavor one side in a charged area, you're hearing it very loudly from the other side. And that's true regardless of what the genesis of the change is, so that also gives me some comfort that there's not an inevitable creep in the nature of these decisions that would go unnoticed.

Ryan: So, I guess one last question, on something that's been touched on a bit — but just a bit — and, Kathy, you raised in your last comment the role of Congress. We've talked about the executive branch, we've talked about the social media companies — but is there a role for Congress here, in the sense of, could there be statutory reforms that would regulate how government interacts with the companies? Or, Colin, flipping it to you as well, what do you think about legislation that does, in fact, require some forms of transparency on the part of the social media companies, if that's something that could also help fix at least the levels of distrust?

Kathy: Well, the question of liability, right, like, that's the big question. And that, yeah, that would make a big difference, right, because then these plaintiffs’ beef would be with the social media companies, not with the government officials. And so, that would certainly simplify the world and make it look kind of a lot like a lot of other things. And, you know, and that, obviously, will continue to be a source of, you know, debate.

I think it's difficult to be too restrictive — think about just this scenario: let's assume that the Supreme Court upholds the Fifth Circuit opinion and actually says, yeah, you know what, the government crossed the line here. Okay. Like, what? So? I honestly think that would probably be the White House's reaction. There are no damages, right — you can't get economic damages in a case like this. And so the White House says, okay, you know, my bad, we crossed the line. But, by the way, they probably would also say, I'd do it again, because this was an urgent health crisis, and we were going to do everything we could to ensure that the people of the United States were fully informed with factual information. So, would it affect future government officials' behavior? Sure, but probably only a little bit on the margin, again, depending on where the Supreme Court draws the line. 

So to me — other than really dealing with the big elephant in the room, which is whether the social media companies should have some liability for these decisions that they make — I haven't given it a huge amount of thought, but I don't think there's a lot more that Congress could do. And that's obviously a big deal that would make people in the tech world pretty unhappy. 

Jameel: You know, I think that the Biden administration screwed up here, no question about it. There's the hard question about whether any particular interaction crossed the constitutional line. But, you know, Kathy, maybe you wouldn't use the phrase I just did, but you've already said that you would have advised them to act differently than they did, at least in some of those contexts. 

And, you know, it's worth remembering that — as I said at the beginning, and this is true, I'm very sympathetic to the Biden administration's effort at a kind of high level — they got some important things wrong. And it's not surprising. It was in the middle of, you know, a crisis. And it wasn't clear what the right facts were. And they thought one thing was true, and it turned out that something else was true. That's always going to happen. But that's a reason to build a system that preserves the possibility of dissent and counterweight, and the rule against coercion is meant to do that. 

And, you know, if the Supreme Court comes out with a rule that says, you know, in a public health crisis, the government can override the — they're not going to do this — but override the editorial decision making of social media platforms, that would be a bad thing, because, precisely because, it's a public health crisis. And we need to make sure that collectively we make good decisions. And the way we do that is to preserve the space for dissent.

Ryan: Great. Well, please join me in thanking the panel.

Paras: We hope you enjoyed this discussion. 

You can find all of Just Security’s coverage of jawboning and the First Amendment on our website. 

If you enjoyed this episode, please give us a five star rating on Apple Podcasts or wherever you listen.