The Just Security Podcast
The Supreme Court’s TikTok Decision
On Friday, Jan. 17, the U.S. Supreme Court upheld the constitutionality of the Protecting Americans from Foreign Adversary Controlled Applications Act, the law that could effectively ban TikTok from operating in the United States unless it is sold to a U.S. company. The case is the latest round in a legal battle involving free speech, national security, and the popular social media app, which is used by more than 170 million Americans. U.S. lawmakers argue that TikTok’s ties to the Chinese government raise serious data protection and content manipulation concerns. Free speech advocates see the law as a fundamental affront to the First Amendment.
How did the Supreme Court decide the case? And how might this decision impact future efforts to regulate social media companies with ties to foreign governments?
Joining the show to discuss the Court’s opinion and its implications are Marty Lederman, Asha Rangappa, and Xiangnong (George) Wang.
Marty is an Executive Editor at Just Security and a Professor at Georgetown University Law Center. He has served in senior roles at the Justice Department, including in the Office of Legal Counsel. Asha is an Editor at Just Security, a Senior Lecturer at Yale’s Jackson School of Global Affairs, and a former FBI agent specializing in counterintelligence investigations. George is a staff attorney at the Knight First Amendment Institute at Columbia University.
Show Notes:
- Marty Lederman (Bluesky – X)
- Asha Rangappa (Bluesky – X)
- Xiangnong (George) Wang (Bluesky – LinkedIn)
- Paras Shah (LinkedIn – X)
- Just Security’s U.S. Supreme Court coverage
- Just Security’s TikTok coverage
- Music: “Broken” by David Bullard from Uppbeat: https://uppbeat.io/t/david-bullard/broken (License code: OSC7K3LCPSGXISVI)
Paras Shah: On Friday, January 17, the U.S. Supreme Court upheld the constitutionality of the Protecting Americans from Foreign Adversary Controlled Applications Act, the law that could effectively ban TikTok from operating in the United States unless it is sold to a U.S. company. The case is the latest round in a legal battle involving free speech, national security, and the popular social media app, which is used by more than 170 million Americans. U.S. lawmakers argue that TikTok’s ties to the Chinese government raise serious data protection and content manipulation concerns. Free speech advocates see the law as a fundamental affront to the First Amendment.
How did the Supreme Court decide the case? And how might this decision impact future efforts to regulate social media companies with ties to foreign governments?
This is the Just Security Podcast. I’m your host, Paras Shah.
Joining the show to discuss the Court’s opinion and its implications are Marty Lederman, Asha Rangappa, and George Wang.
Marty is an Executive Editor at Just Security and a Professor at Georgetown University Law Center. He has served in senior roles at the Justice Department, including in the Office of Legal Counsel. Asha is an Editor at Just Security, a Senior Lecturer at Yale’s Jackson School of Global Affairs, and a former FBI agent specializing in counterintelligence investigations. George is a staff attorney at the Knight First Amendment Institute at Columbia University.
Asha, Marty, George, welcome to the show. Thanks so much for joining us to talk about this important decision from the Supreme Court, which just came down. And as a reference, we're recording this on the afternoon of January 17, because a lot of these facts are moving very quickly. So, Marty, this is a complicated case. There are several arguments on both sides, and the law at issue here is also quite complicated. Could you start by giving us some background on the case, the political dynamics, and the law? How did this end up at the Supreme Court?
Marty Lederman: Yeah, thanks, Paras. This is one of those rare cases in which I can say I told you so. My prognostication is usually very much off, but it did strike me the more I read about the case and understood it, that this is what the court was going to do. Let me describe how the statute works briefly and then talk about the different rationales. It was a nine to zero decision to uphold the constitutionality of the statute against a free speech objection, several free speech objections by different sorts of parties.
So, most of our viewers presumably know what TikTok is, and that what gets sent out through a feed, or gets pushed out to users, like on other social media platforms, is based largely on an algorithm that takes what users have liked, what they've viewed, and the like, and figures out what they should be seeing of the countless things that are available to them. And this algorithm is proprietary information, and the TikTok platform is run in the first instance by a U.S. corporation in California called TikTok USA, which is, in turn, a subsidiary of — there's a chain of ownership that goes back to a company called ByteDance, which has its headquarters in the People's Republic of China. And ByteDance, in turn, is subject, as are all other companies in China, to Chinese laws and Chinese governmental pressures to do what China wishes. And in fact, China does not allow the algorithm to be shared or exported overseas.
And TikTok, in the course of its business, collects an enormous amount of data or information about users and about the people they share things with and the like, and that is available to ByteDance and ultimately to the PRC, if they wish. And the PRC and ByteDance also, at least, have the capacity to control the algorithm and what gets pushed out onto TikTok. I'm oversimplifying dramatically, and George can fill in to the extent I've gotten things wrong; he's of the generation that understands this technology, and I don't, so I'm just the law guy.
So, over the course of several years, beginning with President Trump, in his first administration, there were deep concerns within the intelligence community and the national security community in the U.S. government about China's control, or its prospective control, its ability to control TikTok and to obtain information about U.S. users that would not only compromise their privacy interests, but could be used down the road, years later, to blackmail them or to create all sorts of profiles about U.S. persons that could be used to the detriment of the United States and U.S. persons in the United States, for obvious reasons. Now, China's obviously not alone in trying to collect data off of social media sites, but the perception was that China is one of our adversaries; Congress has declared them an adversary in some important sense, at least an economic adversary and an intelligence adversary, and their ability to control TikTok was something that has concerned the intelligence community and national security figures in both political branches for several years. Efforts were made to try to get TikTok itself and ByteDance to change their practices and the control that ByteDance has over the algorithm and over the data that's collected by TikTok, during negotiations that took quite a long time. President Trump himself issued executive orders requiring divestiture of TikTok from ByteDance. That was tied up in court, and remains tied up in court; it was put on hold in the DC Circuit when eventually Congress got around, last April, to passing this law that was the subject of today's Supreme Court decision.
And the law effectively requires that ByteDance — that there be some sort of very thorough, very comprehensive divestiture between ByteDance and TikTok, and that it happen by Sunday, 48 hours from now. If there is not such a divestiture, the way the Act works, the regulated parties are actually U.S. companies that provide services with respect to TikTok: either app stores that provide TikTok to users, or hosts that host TikTok on various different platforms. And they are told that they cannot do so come this Sunday if there has not been this divestiture. And so, the statute works against them, and if they were to disregard it, they would be subject to quite extensive civil fines in lawsuits brought by the U.S. Attorney General against them. They have not been parties to the case, these third-party providers. They've been pretty silent about their views about the case and about the legal questions in it. They've just been sort of waiting to see what would happen. They've been preparing, by all accounts, to comply with the law come Sunday if the law were upheld, as it now has been, and I assume, on Sunday, they will start complying with it. They will stop making TikTok available in app stores and stop hosting TikTok. That doesn't mean TikTok shuts down immediately, but it will quickly become pretty much obsolete. All of the value that people get from TikTok here in the United States will quickly diminish to the point where TikTok will not be an important or viable platform, certainly will not be the central sort of platform that it is today.
And by all accounts, that will happen. There was a provision in the law that allowed the president to extend the deadline, and still does. Joe Biden could extend the deadline, but it requires that the parties be pretty far along in terms of being able to reach a qualified divestiture in order to extend the deadline. And as far as I know, there's no suggestion at all that they are, or that there's been any movement toward the kind of divestiture that the statute requires.
So, one thing that's interesting — one thing that seems to be apparent now, before I get into the meat and bones of the case and the legal questions, is that this was something of a game of chicken, right, in that the government was trying to get China and ByteDance to divest TikTok for several years. There were negotiations; the parties were willing to do it on certain terms, but not others. Those were not sufficient to address the national security concerns that the United States had. And so, Congress passes this law that basically tells China and ByteDance, you have to do this, or your TikTok USA is going to be basically shut down, and sort of daring them to do it.
Now, my take on this is that there aren't a lot of people in either the executive branch or Congress who want TikTok to be shut down. They want it to continue, but with the algorithm and the platform run by American, or at least non-Chinese-controlled, entities; that's the ideal situation. And I think they were hoping that that would happen, but it hasn't happened. And so now, just in the last day or two, you have all sorts of folks, you know, leaks from the Biden administration, unnamed officials in the Biden administration, and Majority Leader Chuck Schumer, making noises about not wanting TikTok to be shut down and being desperate to do something about it in the next days or weeks or months to stop that. And I think that reflects the fact that this has been a failure. To a certain extent, China and ByteDance have called their bluff and said, we’re not going to. We're not selling. And so, it results in a situation that is not ideal from virtually anyone's perspective — the users of TikTok or politicians here in the United States, or the like.
So, that's the basic mechanics. Why don't I pause there just to let George supplement that, or correct what I assume are my many mistakes, before we get into what the legal arguments were and what the court did.
George Wang: Well, first of all, thanks Paras for having me, and thanks Marty for that great description. I think the only thing that I'll add to that description of the law is, you know, just how to characterize the law and how to think about the law. You'll hear me and others refer to the law as a TikTok ban, even though, you know, as Marty explained it, you know, the law doesn't say it is illegal to use TikTok in the U.S. It's, you know, it's not quite that simple, but I refer to it as a TikTok ban, because, you know, the law expressly authorizes a ban. Lawmakers, I think, intended the law to be a ban. And you know, the practical effect is likely a ban, especially since it doesn't seem like there's going to be divestiture in the next couple days. You know, even the Supreme Court, in its opinion, just to look a little bit ahead, called the law an effective ban on a social media platform with 170 million U.S. users. And that's just to explain why you might hear me talk about this as a ban and call it the TikTok ban, even though I acknowledge that it's, you know, not quite as simple as a straight ban.
Marty: I think that's right. But I also think, I mean, there's a reason the Solicitor General kept referring to it as a divestiture requirement, right? Because I do think that was the goal here: to somehow salvage TikTok, but make it not subject to Chinese control. As of today, it doesn't look like that's going to happen, but who knows what the future will bring.
Paras: Thanks for that really helpful overview of the law and the political dynamics. Let's get into the Supreme Court's decision itself. And Marty, could you walk us through the opinion? What did the justices decide here?
Marty: So, the government had two basic rationales. The lawsuits were brought by TikTok itself and by ByteDance, which — TikTok and ByteDance were represented by the same counsel. It's an interesting question whether it was a good idea to have ByteDance be one of the parties challenging it, because they are the ones that are controlled by China and they don't have any First Amendment rights. But TikTok USA challenged it, and they were represented by a number of excellent attorneys, such as Andy Pincus, and ultimately in the Supreme Court by Noel Francisco and Hashim M. Mooppan of Jones Day, who did an excellent job briefing and arguing the case.
And then the creators, a handful of creators on TikTok who would lose the ability to reach the audiences that they've had for many months or years, represented by Melvin, Ian Myers, Jeff Fisher, Josh Reves and others. And they, too, did a really wonderful job briefing and arguing the case, I think, and putting it in its best possible light. And the principal arguments were that, in some respects, you know, this obviously is a regulation. I haven't gone into the details of the statute, but as the court's opinion suggests, there are sort of two categories of platforms that are regulated. One is TikTok by name, which is quite unusual in American law, to single out a particular speaker for disfavored treatment. And then other kinds of user-generated platforms that meet certain criteria involving Chinese or other adversarial control, or prospective control, over them.
But so, it's definitely a statute that is directed towards speech platforms, right? Not all platforms, but speech platforms, a particular subcategory of speech platforms, which are user-generated speech platforms, and not, for instance, e-commerce platforms, which were carved out of the statute for reasons that are a bit obscure, probably just ordinary lobbying and the like. And the argument is that, because Congress went after speech, you know, media, expressive companies, and in particular, certain subcategories of companies, you know, these different levels of carving the salami here, right? So, user-generated platforms, and within them, TikTok by name, it reflected at least some form of content discrimination or speaker-based discrimination that was subject to some level of heightened scrutiny under the free speech clause of the First Amendment to the U.S. Constitution.
And the government came in and argued in the first instance that there's no First Amendment protection at all here, for reasons that I'm not going to get deeply into. They’re kind of arcane and complex, and that got a little bit of sympathy from the Supreme Court. The Supreme Court did not decide whether any heightened scrutiny is appropriate here. They thought that this statute was quite different than the types of speech regulations or speech media regulations that they have confronted in the past, but they reserved that question in today's opinion. They assumed arguendo that it is subject to some form of heightened scrutiny. Justices Sotomayor and Gorsuch wrote separately to say of course it is, but they're the only two who went on record, if that. I think ultimately, the court would hold that some form of heightened scrutiny applies, but the court merely assumed it here.
And then we get to the two rationales that the government gave for this. And the one that's gotten by far the most public attention, and that, I'd say, dominated about 70 or 80 percent of the oral argument in the Supreme Court, was what came to be called the covert content manipulation concern on the part of the government, which is the idea that an adversary nation such as China has the capability of basically using the algorithm and programming the algorithm to push out content to foreign audiences, including a U.S. audience, that is very much structured in accord with China's interests in what sort of content should or should not be viewed by American audiences, and doing so covertly, so that the hand of the Communist Party in China is not seen, right? It's all done behind the scenes, and therefore trying to manipulate various sorts of speech that's seen by U.S. audiences in particular, although audiences around the world as well, to the benefit of China, which they could use in all sorts of different ways to try to affect how Americans think about a whole bunch of different issues, from elections to economics to politics, to their views of China, for that matter, and international trade and the like.
And the government argued that because China doesn't have any First Amendment rights, and because ordinarily, we think that Congress can prohibit foreign entities, and certainly foreign nations, from speaking to American audiences even overtly, it follows that they can prohibit covert speech as well, and that they can do so even when the foreign nation is speaking through the conduit of a U.S. corporation. That's what made this a complicated question — that China is not speaking or structuring the algorithm directly, but is doing so at least in some relationship with TikTok USA, which is a U.S. corporation, which ordinarily does have First Amendment rights to structure the speech that appears on the platform, and to do so according to the algorithms that they think best advance their expressive interests, right, or make the platform most appealing to their users. And the Supreme Court has held in recent cases that the ability of a company to use algorithms to structure how speech is collated and sent out to people on social platforms is itself a First Amendment activity worthy of constitutional solicitude.
And this rationale, the covert content manipulation rationale, raises to my mind a host of really sticky, unresolved and, in fact, unexplored, unexamined questions under the free speech clause. It really is novel in a way that today's short, per curiam opinion appreciates, right? They're like, we don't want to get anywhere near these questions, particularly since we only had eight days to write this opinion, which is incredibly short order. We don't know what the ramifications are of making certain holdings with respect to that rationale, and we want to stay a million miles away from it. And they did. They don't decide whether this is entitled to heightened scrutiny at all, and they don't decide anything about the covert content manipulation rationale. Instead, the court did what I predicted they would do and relied on the second rationale, which was actually the first rationale the government had been using: the data protection rationale. That is the fear that China has the capability of collecting and then manipulating and weaponizing huge amounts of data of U.S. persons, by virtue of its relationship with ByteDance, which in turn has this relationship with TikTok USA.
And the court said, obviously, Congress would have passed the statute for that reason alone, which I think is a fair reading of the legislative history. Just reviewing the House report, you will be struck by how dominant this interest was, at least within the executive branch and the committees in Congress. Now, of course, there are over 400 members of Congress who voted for this, and so, who knows? I'm sure they had a mixture of rationales and reasons for voting for it, but certainly, the data protection rationale was predominant, and was, to my mind, obviously sufficient to have this bill be passed by very, very substantial majorities in both houses and signed by President Biden. And the court saw that, and the court said that, basically. They said, we don't have to deal with a case in which it's unclear whether this rationale would have independently supported the statute, because obviously it did here.
And the data protection rationale is not related to the content on TikTok. It's a non-content, non-viewpoint-based reason that the government wants to require this divestiture, therefore intermediate scrutiny applies, and that's a fairly forgiving level of heightened scrutiny. It sits in between rational basis and strict scrutiny, and under intermediate scrutiny, the government's reasons for requiring divestiture as a condition of these third parties being able to continue to service TikTok pass muster, as these things ordinarily do, particularly in the area of national security concerns. And so, the data protection rationale easily captured all nine justices' votes, including Justices Sotomayor and Gorsuch, as I suspected it would. And so, the other rationales the court might have used, which George and others were very, very concerned about, and I was concerned about them too, about their implications for free speech, were just not reached by the court. The court settled on the data protection rationale. They said that was sufficient to sustain this kind of divestiture condition on continuing to operate in the United States. And that's where we are. So, it's, in some respects, doctrinally, anyway, a much narrower decision than many had feared it might be, and one that I think is doctrinally quite solid and on very, very solid ground.
I don't have strong feelings about whether this was a wise thing for Congress to do or not. I'm the last person in the world to have the expertise to know the answer to that question. I did work in the Department of Justice, and a lot of people I trust who care a great deal about freedom of speech assured me that there were these data protection problems that were quite severe. But, you know, don't take my word for that. Certainly, large majorities of both houses of Congress were persuaded that that was the case. And so, doctrinally, anyway, I don't know if George will agree, but I think this is a less convulsive, less troubling decision for free speech than other rationales might have been, and therefore it was very attractive to the court, particularly working on such a short timeline, to go in that direction, and they did so nine to zero.
Paras: Asha, what's your view of the decision?
Asha Rangappa: Sure, I was not surprised. You know, looking at the bipartisan nature of this ban, you know, this law, and the fact that this has been a national security threat that has been on the radar since the first Trump administration, before Congress passed this law, and moving into the Biden administration, it just seemed a big stretch to me that the court was going to second-guess the determination of both political branches that this constitutes enough of a national security threat that it warranted taking this step. And also, I think, the framing of the government that this was really targeting the foreign ownership, not speech, which the opinion mentions, where the majority says that they don't believe this to be a content-based restriction — to me, all suggested that they were going to uphold the ban.
George: So, I guess I can give my reaction. You know, I think my top line reaction to the Supreme Court's decision is that it's, you know, disappointing and concerning, even if it was maybe unsurprising. I agree with what Marty was saying that it could have been worse, but I think the decision we ended up with was still, you know, really troubling from my perspective. I think it has the potential to do real damage to the First Amendment, because it permits the federal government to go as far as banning an entire social media platform in the name of national security. And as I see it, the court today, you know, authorized the government to shut down Americans’ access to a wildly popular platform for speech, at least in part because it dislikes some of the content and viewpoints it believes are popular on the app. And I think that just sets a dangerous precedent for how we regulate speech online.
Here, maybe it makes sense to, you know, kind of break down, you know, how I saw the opinion going and, you know, flag some of my issues. So, I saw the opinion as addressing three main issues, which I think Marty summarized as well. You know, first, whether the First Amendment was implicated at all by the law. Second, whether the law was content-based or content-neutral, which factored into the level of scrutiny the court applied. And then third, whether the government's concerns justified the law. And you know, I have some bones to pick with the court’s reasoning as to sort of all three of those issues.
So, on the first one, just very briefly, you know, the court, as Marty mentioned, assumed, without deciding, that the First Amendment was implicated. But, you know, I think it really treated the issue as if it were a close question. And to my mind, it isn't, you know? The First Amendment isn't only implicated when a law directly restricts speech. I think the question is whether it burdens the freedom of speech, and here the law clearly does, because the mechanism for its enforcement is to shut down a popular social media platform. Plus, it is a direct restriction on speech. It's a direct restriction on the rights of Americans to receive information and ideas from their chosen source, including if it comes from abroad, and it implicates their rights to access the media they're choosing. And overall, I think a disappointing part of the opinion was that it didn't really seem to take seriously the rights of American users, and particularly the rights of American users to receive information and ideas, including information that the government might deem dangerous.
So, on the second point about the level of scrutiny, the court concluded that the law was content-neutral on its face and was supported by content-neutral justifications, and therefore, at most, intermediate scrutiny was appropriate. As to that first part, I think it's true that not all speaker-based laws should be subject to strict scrutiny. But here, I think the law identifies a particular speaker, and it identifies it in part because of the government's disagreement with the messages the speaker presents and its disagreement with certain content and viewpoints perceived as popular on TikTok. As for the government's justifications, as Marty mentioned, it defended the law on covert content manipulation grounds and data privacy grounds. And, you know, the court focuses on the data privacy rationale, which makes a lot of sense, because I do agree that the data privacy rationale is content-neutral. But I think maybe where I disagree with Marty is that, you know, I think it's a fiction to say that Congress enacted the law primarily for that reason. I really think the lawmakers' statements, and even how the government defended the law as of just a few days ago, make clear to me that the content manipulation rationale was the primary reason. And you know, at the very least, it was a significant part of the law. And, you know, this is jumping ahead a little bit, but I'm not sure that the court can sort of avoid, sort of turn a blind eye to, that justification, the content manipulation rationale, which I think is plainly content-based and, you know, should have been enough to warrant strict scrutiny here.
So, on the third point, you know, the court reasoned that the law survives intermediate scrutiny because it furthers an important government interest unrelated to the suppression of speech, and does not burden more speech than necessary. You know, first, I want to acknowledge that protecting Americans' data privacy is, of course, an important interest, but I'm not sure how the court can conclude that a ban doesn't burden more speech than necessary to further that interest. I think lawmakers should absolutely be concerned with the amount of data digital platforms collect on Americans, and the fact that they could share that information with others. That's a problem with TikTok, for sure, but I think that's a problem with most major digital platforms, and I think the more effective way to address the government's data privacy concerns is through comprehensive data privacy legislation and laws that restrict data brokerage and sale to China as well as anyone else, you know. I think, just as a matter of common sense, is this law really the way that we expect Congress to address data privacy? I think it's pretty roundabout, you know? The court in its opinion tries to get around this. It calls it a conditional ban. It says the prohibitions prevent China from gathering data from U.S. TikTok users unless and until a qualified divestiture severs China's control. And that sounds, you know, reasonable, I suppose, to me, but that's not what the law says. You know, the law effectively bans Americans from accessing the platform until that qualified divestiture happens. And I think that obviously burdens those Americans’ speech rights.
The other thing I will mention about the court's analysis on intermediate scrutiny is, I think there's a really great amount of deference that the court showed to the government's view of the circumstances, particularly on available alternatives. You know, the court was purportedly applying intermediate scrutiny, but to me, it treated it more like rational basis review. And so, you know, I think you can read this case as an extension of some of the court's national security exceptionalism, and I think that reading of the case presents dangers not just in this case, but beyond.
The other thing I'll mention here is, you know, importantly, the court also concluded, as I mentioned before and as Marty mentioned before, that the other rationale, the interest in preventing content manipulation, sort of doesn't taint the data privacy rationale. But as I said before, I think that ignores, you know, one of the primary reasons, if not the very reason, that lawmakers enacted the law, and more importantly, how the government has principally defended the law, certainly before the DC Circuit, and I think in its briefing before the Supreme Court, even if it maybe read the tea leaves during oral argument and shifted its position a little bit. You know, I viewed the government's defense of the law as resting primarily on that content manipulation rationale.
And I think this reasoning may be one of the more concerning parts of the ruling. I think it enables the government to enact laws with illicit motives, so long as there is also some potentially licit motive. And I think that's particularly concerning when national security is involved. You know, I think there are many laws that would otherwise plainly violate the First Amendment, but that could be justified on some content-neutral ground. And you know, I worry about extending that sort of reasoning to future cases.
Paras: Just this week, millions of U.S. users have migrated over to other Chinese apps, including RedNote, which are similar to TikTok. What do you think is happening there? Are users just not concerned about the security risks, and might those apps also face bans similar to the TikTok ban?
Asha: So, I can't speak to what the consumers are thinking, obviously. I think the pushback against this TikTok ban — you know, it's a collective action problem, right? Like, each individual person thinks, well, I don't care if China has my data, you know? This isn't a big deal. All the social media platforms do this. So maybe they don't care, and they're moving over to these other ones. The other apps are potentially subject to this too — now, this law mentions TikTok specifically, but it also creates the option for the president to designate other foreign adversary-controlled apps to fall under its purview as well. So, it would require a presidential determination. My guess is that these other applications, RedNote, what was the other one? I don't use these things. They may just not have the kind of critical mass right now to present the sort of threat that TikTok does. There's something that they have discovered about the way TikTok is being used and manipulated that has, you know, really raised alarms. It may very well be that these other ones will in the future.
George: Yeah, well, first on the movement of so-called, you know, TikTok refugees to RedNote, which, you know, has a much stronger connection to the Chinese government than, you know, anyone has even claimed about ByteDance, along with, you know, the various videos that people have posted on TikTok joking about saying goodbye to my Chinese spy. I mean, I think these actions maybe make the obvious point that, you know, people who use TikTok understand that TikTok has a connection to ByteDance, and ByteDance has a connection to the Chinese government and, you know, might be subject to pressure from the Chinese government. And so, I think, you know, at least when the sort of covert content manipulation rationale was, you know, at the center of, for instance, the DC Circuit’s position, I think it, you know, essentially makes clear that people are aware of this, right?
You know, as for the wisdom of doing that, you know, I don't really know. I honestly don't use TikTok or any of these apps, but, you know, I do still think that it is very much American users’ right to choose whichever platforms they want to have access to. And, you know, I think some of these activities make clear, first of all, that users of TikTok are really invested in TikTok and care about its survival. And, you know, I view a lot of these actions as, really, you know, forms of protest, of humor, to sort of reject the government's vision of what this looks like.
In terms of, you know, what precedent this sets, the Supreme Court in this TikTok opinion mentioned multiple times that, you know, it intended this decision to be narrow. It emphasized its narrowness multiple times, but I'm not really sure how it can be. You know, again, at least as I see it, the decision permits the government to shut down an extremely popular platform for speech based, at least a little bit, and Marty and I can disagree about this, on grounds that were plainly content- and viewpoint-motivated. And, you know, it's interesting that when it talks about its narrowness, it uses TikTok’s popularity as a reason that the ban is constitutional, rather than as an acknowledgement that it burdens the rights of millions of American users, which I think itself sets a dangerous precedent for how we think about regulating sort of media online.
And you know, I think it's worth underscoring how significant I think the stakes of this decision are. You know, we're talking about a government attempt to shut down an entire platform for speech, and one that's enjoyed by millions of Americans every day. And the scale and scope of that kind of restriction really does feel unprecedented to me, and the Supreme Court's authorization of that practice feels unprecedented too. So, you know, whether this means that the government will have a greater hand in restricting Americans’ access to other platforms, I think that remains to be seen. At the very least, I think, you know, it really raises this question of, when the government raises national security concerns, how much are courts going to really look into those justifications?
Asha: I think there are basically three big national security threats, two of which are mentioned in this opinion. The first is the influence operation — the ability of a foreign government to manipulate an algorithm to basically engage in a perception management operation, to shape the attitudes and opinions of Americans. The court didn't seem very persuaded about the, you know, urgency of that threat at oral argument, and basically, they glossed over it in this opinion by saying that the second threat really meets the government's burden of being an important government interest, which is to prevent, essentially, the siphoning of huge amounts of data about Americans into the hands of the Chinese government, and the nefarious purposes for which that can be used, which include espionage operations, blackmail, those kinds of things. That was persuasive to the court. And I agree that that is the, you know, next highest threat, or actually higher than the influence operation in terms of the threat to the U.S. government, because it can potentially compromise our own intelligence efforts and create a major counterintelligence threat here, if they're able to recruit people to basically spy on behalf of China.
But there's a third national security threat that we don't know about. So, the government alludes to this — alluded to this in their brief. They talked about the threat that's laid out in the classified record. Clearly, this is something that Congress was privy to and was able to see in terms of wanting to pass this law, and the court doesn't mention it because they didn't look at it. And, you know, I'm not really sure how I feel about it. Gorsuch addresses this, that, you know, they made their decision on the public record, because basing their decision on secret information, you know, is antithetical, I guess, to, you know, the transparency and legitimacy of the court. That might be the case. But whatever that classified thing is, which we know is not the perception management operation, not the espionage concerns, it's something even greater. That is where I think the heart of this matter is. And the government hasn't fully explained it, and there may be very good reasons why they're unwilling to do that. I suspect it has to do with, you know, something on the order of malware and potential threats to critical infrastructure, and some of the other really egregious things that we've seen China engage in over the last few years.
Marty: And I guess I would simply say that I wouldn't read the opinion to suggest that anything other than a divestiture from a foreign adversary nation is really implicated by this. I think at the very least, the justices would be all over the map on something that wasn't so limited, or at the very least, it would take them a lot longer to issue their decision. I do think that here, they really see this not as an attempt to ban TikTok or to deny 170 million Americans the right, or the ability, to use it, but instead as an effort by the political branches to switch it over to an American owner from Chinese control.
George: You know, I think even if the focus, though, is on sort of adversary control, you know, I wonder where that logic ends when you think about traditional media, you know, does this logic extend to foreign ownership of, you know, newspapers, or really any adversarial ownership of newspapers? And, you know, I think the other point I'll make is that I think ownership does really matter. You know, of course, ownership matters for traditional media, like newspapers. A different owner for a paper can have an influence on the editorial, you know, decision making of that paper, as I think we've seen recently with the LA Times and The Washington Post. And I think that's certainly true for social media, you know, the example of Elon Musk taking over Twitter, or now X, you know, I think that fundamentally changed the community that was on X and had a major impact on how people engage with that platform. And so, you know, I think ownership matters, and I would be concerned about, you know, even sort of that narrow view of the opinion and how far that can extend.
Paras: And some people have talked about this idea of a slippery slope, that if you ban one app, then it could lead to banning many other apps. But, Asha, you've also explained that there's a slippery slope in the other direction. So, what are the other potential slippery slopes here?
Asha: Yeah, you know, I think, look, we always have to look extra carefully when the government is trying to shut down anything that offers the opportunity for people to engage in expression and speech. Fine. Here's the thing: if the government were to do this with other applications, either under this law, or try to do it with other social media platforms, we do have recourse to the courts here, which is something that the TikTok case has shown. And I think, even with the concurring opinions in this case, it seems like, you know, along the political spectrum of the court, everybody seems to be concerned about the First Amendment implications of this kind of thing.
Here's the other side of it, which is that the kind of data that China is able to collect through this platform is something that, if the U.S. government wanted it in its own hands here in the United States, it would have to jump through a lot of hoops to get. It would have to go and get search warrants and FISA orders, or whatever, you know, it would take in terms of having an active criminal or national security investigation. It would be hard. And here's where I've been thinking: if people are handing this over to China, and we already know that Trump is coming in wanting to, you know, make deals, and he's already been on the phone with President Xi even before his inauguration, discussing, you know, TikTok and the future of TikTok, it's not clear to me what would stop him from getting the data that China has gathered on people about whom he has an interest in gathering dirt. He has broad latitude for foreign affairs authority under his Article II powers. We know that he doesn't have any witnesses in the room when he's talking to these, you know, foreign leaders, particularly of rogue states. And so, you know, it really does allow a president who may not necessarily be acting in the best interest of the United States or of the American people to use this as a back door around our own due process and, you know, civil liberties protections that we have in the courts. And I really hope that some people think about that.
Paras Shah: And it's really difficult to read the tea leaves here, because so many things will be in flux in just the next couple of days. The Biden administration, according to press reports, has said that it won't enforce the ban on Sunday, and of course, President-elect Donald Trump will become President Donald Trump at noon on Monday, and he has said that his views are forthcoming and will be decided soon. If he decides not to enforce the ban or to delay enforcement, how might that happen as a legal matter?
Marty: I mean, this is something — there’s a lot of bluster out there, and there's a lot of posturing, I think. As I said, the way this is enforced is, the Attorney General brings civil actions for fines if third-party providers disregard the statute’s prohibitions. I don't think they are going to disregard the statute’s prohibitions. So, there's not going to be anything to enforce. And so, all this discussion about how we're not going to enforce it — well, TikTok is not going to exist, probably, in a couple of days. So, what steps could President Trump or Chuck Schumer or others take to revive it, right, to salvage it after the law goes into effect on Sunday?
The most obvious thing is a statutory amendment of some sort, and maybe there are votes for that, right? Maybe there will be some sort of, at least, extension of the time by which the divestiture has to take place, although I have to say, you know, China and ByteDance sort of called their bluff and refused to make this sort of divestiture by Sunday. And so what? Why think that they'll do it by July or by next December, or whatever the next deadline would be? Or, I don't know. I mean, maybe President Trump thinks that he can actually induce China to sell this algorithm to a U.S. owner. I don't know how that would work. But if that were to happen, the statute does allow that if the divestiture occurs after Sunday, at that point, things would change, right? And the platform would once again become available, once the president would basically attest or declare that the terms of the divestiture have been satisfied. So maybe, what do I know? Maybe Donald Trump will be able to convince China to give up this very valuable algorithm to an American purchaser.
But short of those two things, statutory amendment and working a divestiture through, I'm not sure things will change. I imagine Donald Trump might say, my Attorney General is not going to enforce this if Google or YouTube disregards it and starts making TikTok available again. I would be very, very surprised if the lawyers for those corporations advised their clients to go ahead and disregard the statute just because Donald Trump is forbearing enforcement of it. Typically, that is not a defense, right, unless Trump thought it was illegal to enforce it. So, someone could reach back and penalize them for having disregarded the statute during the period of enforcement forbearance.
But again, I don't know. I mean, maybe you'll get enough people willing to take that legal chance, who are more aggressive, who will break the law for a while knowing that Donald Trump is not — Pam Bondi is not going to come after them, which is probably true. I don't quite — I'm not sure anyone really knows what they will do to try to accomplish what they want to accomplish. I think that, right now, there's a lot of rhetoric out there that’s a little bit half-baked.
George: Yeah, I don't know. I mean, I certainly agree with you that I have no idea what, you know, President-elect Trump is going to do when he takes office. And, you know, I think in general, we shouldn't be looking to Trump to just sort of save TikTok. You know, after all, in 2020, it was initially his idea to ban TikTok in the first place. And as a more general matter, I think it's a mistake to invest government officials with such broad authority to restrict citizens from accessing foreign media platforms. That's a power that can be readily abused by anyone who wants to manipulate and distort domestic public debate, and I think we should be concerned about the ways that power might be abused in the future. So, you know, even if Trump has some sort of discretion under the law, even if he has maybe some other ways to avoid enforcing it, you know, I don't think that's necessarily a solution that's going to be satisfying, at least for me.
Paras: Thanks for that. Are there any concluding thoughts or issues that we haven't touched on yet that you'd like to add?
Asha: The only thing I would want to add is, I personally think that TikTok does constitute a national security threat. And I mean, there are just so few things that, you know, Republicans and Democrats can agree on, and the very fact that this actually had bipartisan support and, you know, passed and was signed into law, to me, tells me that this is something that is long overdue. I saw a Twitter post by Tom Cotton where he talked about the threat that TikTok poses. And, you know, I kind of disassociate from my body whenever I see something from Tom Cotton or Ted Cruz or whoever that I agree with. But I think in this case, it's one of those rare instances where you have some consensus on both sides of the aisle. And I think we would be wise to pay attention.
George: I guess the, you know, the last thought I'll just mention is, you know, I think part of what was really disappointing about the decision is that, you know, I at least view this law as, you know, plainly an attempt to block Americans’ access to a foreign media platform. And I think that practice is really a tactic we ordinarily associate with repressive regimes, not democracies, and I think it really unflatteringly recalls our government's own past efforts to control Americans’ access to information and ideas from abroad. This was part of a point that the Knight Institute, where I work, along with two of our peer organizations, Free Press and PEN America, made in our amicus brief before the court.
Marty: Which I recommend to Just Security readers. It's an excellent — George, it's a really excellent brief, a really fine amicus brief.
George: Thank you. Thank you. You know, we make the point that during the Cold War, the U.S. government used restrictions barring communists from coming into the country to keep out a wide range of its critics, including folks like Pablo Neruda, Gabriel García Márquez, Jorge Luis Borges, and other prominent cultural and political figures. It intercepted and detained mail sent to Americans from abroad that it deemed communist political propaganda, and it barred Americans from receiving books, newspapers, and other publications from so-called enemy nations. And of course, you know, it did all of that on claims of national security, and the effect of these restrictions was to limit Americans' access to information and ideas, and also to cause others around the world to doubt our country's dedication to its ideals. You know, we ended those ill-advised practices, and we now view them with embarrassment and shame. And I think that history should be a lesson for today. You know, at its core, I view the TikTok ban, the law, as an effort by the government to shape and control the information and ideas Americans are permitted to engage with. And I think the Supreme Court's decision today effectively permits that to happen.
Marty: Just to re-emphasize: if the court had held that, that would be very troubling. The court was very careful not to say that that would be a legitimate justification for doing this. And I think that, you know, if there were to be a qualifying divestiture, TikTok, with all of its content, would continue to thrive in the United States, and Congress would not be making efforts to try to ban it just because of its content, if it really were taken out of the — if China could not get at the data of U.S. persons. So, it's just a cautionary note, which is, I think it's in all of our interests not to describe the court as having done something much more dramatic and much more speech-suppressive than it did. We don't need those kinds of precedents, and I don't think this is one. So, maybe one or two cheers for the court for not raising the kind of free speech concerns that many of us were most concerned about.
George: I do definitely cheer that the court did not reach that conclusion.
Paras: And let’s leave it there. Asha, Marty, George, thanks so much to all of you for joining the show. We'll be continuing to track all of these developments at Just Security. Thanks again.
Asha: Thanks.
George: Thanks to you, too.
Marty: Thank you very much to all of you.
Paras: This episode was hosted and produced by me, Paras Shah, with help from Clara Apt.
Special thanks to Marty Lederman, Asha Rangappa, and George Wang. You can read all of Just Security’s coverage of the TikTok case and the Supreme Court on our website. If you enjoyed this episode, please give us a five-star review on Apple Podcasts or wherever you listen.