The Just Security Podcast

Counterterrorism and Human Rights (Part 2: Spyware and Data Collection)

November 27, 2023 | Just Security Episode 47

Some of the biggest risks to human rights in the twenty-first century come from governments misusing surveillance technology originally designed for counterterrorism. These spyware tools are manufactured around the world, including in the United States, the European Union, China, Israel, and the United Arab Emirates. 

The technology is difficult to detect and allows access to a target’s communications, contacts, geolocation, and metadata. It can even delete information or plant incriminating data on a person’s phone. Now, nations are using it to spy on politicians, journalists, human rights activists, lawyers, and ordinary citizens with no links to terrorism. 

As a reminder, this is Part 2 of a conversation with Fionnuala Ní Aoláin. Fionnuala recently completed her tenure as the United Nations Special Rapporteur on Human Rights and Counterterrorism.

For nearly six years, she examined global and country-level counterterrorism practices and how they do or don’t comply with human rights standards. To hear Part 1 of our discussion, including Fionnuala’s insights from her experience documenting the conditions at the U.S. detention facility at Guantanamo Bay, Cuba, and in prisons and sprawling camps in Northeast Syria, please tune in to last week’s episode, which you can find in the show notes and on our website. 

Show Notes:

  • Fionnuala Ní Aoláin (@NiAolainF)
  • Paras Shah (@pshah518)
  • Viola Gienger (@ViolaGienger)
  • Part 1 of our conversation with Fionnuala 
  • Fionnuala and Adriana Edmeades Jones’ Just Security article “Spyware Out of the Shadows”  
  • Just Security’s Ending Perpetual War Symposium 
  • Just Security’s counterterrorism coverage
  • Just Security’s technology coverage
  • The U.N. Special Rapporteur on counter-terrorism and human rights’ website (including reports during Fionnuala's term, which ended Oct. 31)
  • Music: “The Parade” by “Hey Pluto!” from Uppbeat: https://uppbeat.io/t/hey-pluto/the-parade (License code: 36B6ODD7Y6ODZ3BX)
  • Music: “Gnome” by Danijel Zambo from Uppbeat: https://uppbeat.io/t/danijel-zambo/gnome (License code: MIZAQ1JSL9JRTUN8)

Paras Shah: Some of the biggest risks to human rights in the twenty-first century come from governments misusing surveillance technology originally designed for counterterrorism. These spyware tools are manufactured across the world, including in the United States, the European Union, China, Israel, and the United Arab Emirates. The technology is difficult to detect and allows access to a target’s communications, contacts, geolocation, and metadata. It can even delete information or plant incriminating data on a person’s phone. Now, nations are using it to spy on politicians, journalists, human rights activists, lawyers, and ordinary citizens with no links to terrorism. 

This is the Just Security podcast. I’m your host, Paras Shah. Co-hosting with me today is Just Security’s Washington Senior Editor, Viola Gienger. 

This is Part 2 of a conversation with Fionnuala Ní Aoláin. Fionnuala recently completed her tenure as the United Nations Special Rapporteur on Human Rights and Counterterrorism. For nearly six years, she examined global and country-level counterterrorism practices and how they do or don’t comply with human rights standards. To hear Part 1 of our discussion, including Fionnuala’s insights from her experience documenting the conditions at the U.S. detention facility at Guantanamo Bay, Cuba, and in prisons and sprawling camps in Northeast Syria, please tune in to last week’s episode, which you can find in the show notes and on our website.

Now, we’ll jump right back into the conversation. 

I want to pivot slightly and discuss another area where your mandate did so much work, which was the spread of technology, particularly spyware. These tools range from covert programs like the NSO Group's Pegasus, which can discreetly take over a device and access everything on a person's phone, to more visible forms of surveillance, like facial recognition or street cameras. And as you've observed in your reports, these tools have been used to target human rights advocates, lawyers, journalists, and political leaders. What do you see as the most acute threats from spyware going forward? 

Fionnuala Ní Aoláin: So, I think spyware is an existential threat to democracy and to rule of law-based societies. We already have an enormous amount of evidence — whether through litigation, for example, in the United Kingdom, or through the work of the PEGA committee of the European Parliament, or through the documentation we've seen done by journalists and investigative groups across the world — so we have a sense of the ways in which spyware is being used to target a whole range of actors. And that's happening. 

I think it's really important to say that the persistent, global misuse of this technology is happening in both democratic and non-democratic spaces. The PEGA report of the European Parliament looked at misuse in countries like Spain and Poland, and recognized the ways in which democracies are misusing this kind of technology. And so we now, I think, fully understand that when the companies that produce spyware insist that the technology is only intended for legitimate use by investigative authorities, the facts on the ground tell us that this is entirely not how these technologies are being used. In fact, the seductiveness of these technologies, and of their sale, transfer, and use, lies precisely in their appeal to both democratic and non-democratic governments as powerful tools to spy on political opponents, dissidents, and ordinary citizens. 

But I do want to acknowledge that there's a whole universe of things we don't know about the surveillance technology available to states that is not commercial, meaning state-developed surveillance technologies that don't rely on the private sector. Even with the work of groups like Citizen Lab or Amnesty's Security Lab, which can check your phone (as I do regularly) to make sure it has not been infiltrated by spyware, we have yet to be sure that we have the technology to adequately track whether our phones, our devices, and our other technologies are being massively infiltrated and used by states in ways we can't yet detect. And that's what I mean by existential threat: you have this collusion of commercial spyware and state-generated surveillance technologies coming together to control public space and civil society, and essentially to obliterate the capacity, in some cases, for individuals to function in their lives without surveillance. 

And I mean, I undertook a global study on the impact of counterterrorism on civic space last year, and I traveled the world, and the single thing that human rights defenders — whether in the Philippines, or in Hong Kong, or in Ramallah, or even in the United States — were most worried about was surveillance: they didn't know what they didn't know about the kind of surveillance they were subjected to. And that meant that many of them were preemptively cautious, afraid of things they might say or do that would land them in trouble because they were being surveilled. So it's the obvious and the non-obvious impact of spyware that I think makes it such an existential threat. 

But the problem we have is that we don't have regulation. And I have been calling for it very explicitly; one of my last statements as Special Rapporteur made clear a number of things. One, I endorsed the PEGA committee findings: the European Union, as the leader in this space, has to move to regulate effectively, which includes increasing its capacity to regulate dual use and other arrangements that prevent the sale and transfer of these technologies to governments who will abuse them. I also made clear that we need a liability-based model: companies have to be held accountable for the damage that they do. And this technology has to be developed in a way that's human rights compliant. And if it can't be, if you literally cannot do that, then this technology has to be banned because of its existential threat. And maybe I'll close by saying, you know, we've heard a lot from these companies about self-regulation, and I just think that's not appropriate. This is not a place where self-regulation by the most shadowy, non-transparent, and harmful of companies should be allowed. Self-regulation is a non-starter in this space.

Paras: There's so much to follow and look for there, and your work on that issue has been particularly illuminating. Another area that your final report as Special Rapporteur touched on was the massive collection of personal data when people fly on commercial airlines for business or for leisure. And I know around the holidays, this is on all of our minds as many of us are traveling. Can you give us an overview of that report and its findings?

Fionnuala: Sure. So one of the things that many of us don't realize is that when we do the most mundane of things — like next Wednesday and Thursday, as we come up on Thanksgiving, when hundreds of thousands of people will be getting on planes and traveling — they, along with billions of passengers on air transportation around the world, will have their personal data collected, either as Advance Passenger Information (API) or as Passenger Name Record (PNR) data. API and PNR data collection happen via commercial airlines. And when those airlines collect your data as you check in for a flight — often when you're traveling overseas — it's going to be shared with the state that you're traveling to.  

And API and PNR information sharing has just accelerated across the globe. And it's not — I want to be clear — it's not that I think we don't need security protocols when we travel. But in the collection of this information, the impact on individuals’ rights has been extraordinary. And I want to explain why that's true, because sometimes it's not obvious how much personal data these particular forms of data collection are capturing. API data is a very basic form of data collection as you travel. It identifies things like your name, your date of birth, your gender, your citizenship or nationality, the country that issued your travel documents, and the specifics of your flight details. And while these things look generic, from that very basic information there are a number of salient features, including your gender and your nationality, that can make you highly vulnerable to certain kinds of human rights abuses, whether from the state that's collecting that data, from the commercial airline sharing it in the state you're leaving, or from the state that you're traveling to. 

And the key point is that when this data is collected, it's intended to be matched against terrorism watch lists, domestic or international, to make sure that the person getting on the plane is not a terrorist. So no one could disagree: we don't want terrorists getting on planes. But I want to compare this to a huge, heavy hammer being used to hit a tiny little nail. You've got massive amounts of data being collected on the populations that are traveling, but the number of terrorists that we think get on planes most years is tiny. And in fact, many of them are smart enough to know that they shouldn't be getting on planes anymore, because that's not a good way to go. 

The second form of data, PNR data, is more expansive. It's an umbrella term that goes beyond your mere identity and can include all kinds of other information: for example, everything about your ticket, who paid for it, what was used to pay for it, your dietary preferences, your frequent flyer information, your baggage information, the hotel you might be going to stay in. All of this is highly identifying information. Just for example, if you asked for a kosher or halal meal on a flight, that's already an indicator of a religious identity that can make you vulnerable to targeting in either the country that you're leaving or the country that you're going to. And my key problem here is that the collection of this data, and the identification that this data allows, happens almost without any human rights protections in any country in the world. Even countries that have, for example, data protection statutes generally have national security opt-outs. So it turns out that even if you're in a country that looks like it has data protection, so that the data you've just handed over to get on a plane won't be misused, that protection is opted out of, because this is considered national security data.  
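
[Editor's note: To make the API/PNR distinction concrete, here is a minimal, purely illustrative sketch based only on the fields Fionnuala describes above. The field names, types, and the toy watch-list check are the editor's hypothetical constructions, not the actual ICAO, airline, or government message formats.]

```python
# Illustrative sketch only: approximates the API vs. PNR fields described
# in the conversation. Field names and the watch-list check are hypothetical,
# not real airline or government data formats.
from dataclasses import dataclass
from typing import Optional

@dataclass
class APIRecord:
    """Advance Passenger Information: basic identity and flight details."""
    full_name: str
    date_of_birth: str            # e.g., "1980-01-31"
    gender: str                   # flagged above as a vulnerability vector
    nationality: str
    travel_document_issuer: str   # country that issued the travel documents
    flight_details: str           # flight number, origin, destination

@dataclass
class PNRRecord:
    """Passenger Name Record: an umbrella of booking data beyond identity."""
    api: APIRecord
    ticket_payer: Optional[str] = None        # who paid for the ticket
    payment_method: Optional[str] = None      # what was used to pay
    meal_preference: Optional[str] = None     # e.g., kosher/halal can reveal religion
    frequent_flyer_id: Optional[str] = None
    baggage_info: Optional[str] = None
    hotel_booking: Optional[str] = None

def matches_watchlist(record: APIRecord, watchlist: set[tuple[str, str]]) -> bool:
    """Toy version of the screening step the transcript describes:
    match (name, date of birth) against a terrorism watch list."""
    return (record.full_name, record.date_of_birth) in watchlist
```

Even this toy structure shows the asymmetry Fionnuala describes: the screening step needs only a couple of API fields, while the PNR record sweeps in far more identifying detail than the match itself requires.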

The second point about this data is that it's transferred between states. Sitting in the United States, we might not have a big issue with our data being shared with Canada. But what happens if you're flying to China, or traveling to countries that have a less-than-stellar democratic record, or even a terrible record of dealing with journalists, humanitarians, civil society actors, and dissenters, when there are no built-in protections in these systems to ensure that they won't be misused? And maybe just to close by saying, this is a general set of concerns, but my biggest concern in this final report is that the United Nations is in the business, through its goTravel program, of actually supplying states with this technology. And it is entirely human rights free. It has no intrinsic and robust built-in due diligence or human rights protections, so it's ripe for abuse. And fundamentally, as I call out in this report, there needs to be a moratorium on U.N. transfer of this technology, because if you're going to supply it to a country like Sudan, which is on the list of countries allegedly set to receive it, you risk the lives, the security, the liberty, and the protection from harm of significant numbers of people.

Paras: And given all the grave human rights concerns that you identify, and the need to collect this data at a certain level, how should states and the U.N. respond to this problem? How do you make this collection human rights compliant?

Fionnuala: Well, there are a couple of really major recommendations in this report, but I would start by saying that these systems, these programs, have to start with human rights thinking at the center of the data collection exercise. When you do that, you're not just thinking, “Oh, we have to protect ourselves from these identified potential terrorists who might use a plane.” You also have to protect the actual people who are traveling, whose data you're collecting, and wherever that data goes. It's the start of a different mindset. You're not only thinking in terms of one risk, the risk of a terrorist traveling; you're thinking about a more actualized risk, which is that governments will misuse this data if there are no protections. 

I think the second thing we would do is restructure these programs. You start with a different set of presumptions, and then what that means is that you build in due diligence when you transfer, meaning the country you're transferring to has to have data protection legislation. And frankly, if the U.N. is going to transfer the technology to these countries, then you should insist that the national security carve-outs don't apply, because the scale of the number of people you're collecting data on makes, to me, the national security exemption disproportionate to the actual harm that will be actualized through misuse. 

And the third thing I think we absolutely need to do is ensure that there are independent monitors in the countries that have this data collection who can call out misuse, that there's a right of remedy for the individual whose data is misused, and that some countries, frankly, simply never get this data. If a country has persistent-violator status in human rights — from human rights treaty bodies or the Universal Periodic Review — or is on a designated sanctions list, like Sudan, it should never be in the frame for getting this kind of technology until its human rights record has improved. So at all of those levels — it's a meta point, but you literally have to build in protection at every nodal point if you're going to give states the technology to collect this kind of data.

Viola Gienger: Well, that gives us so much to think about. I really appreciate your perspective on all of these and the incredible work that you've done. You have had such a consequential tenure as Special Rapporteur for six years, in addition to now your two law school positions. What's next for you? 

Fionnuala: So I'm hoping to take a short-ish break, in the sense that it's been a really extraordinarily fruitful time. And I have felt enormous privilege — it's a huge privilege to do this work and to serve the people whose rights are so negatively impacted by the misuse of counterterrorism — whether that's meeting men at Guantanamo Bay, Cuba, and having access to that site, with all the emotional and symbolic significance, for me personally, of being in that space, or meeting young boys — the age of my own boys — in detention facilities in northeast Syria and trying to be a voice for them. That's been a privilege, though not always one where I feel I've managed to do the things I would like to do to really ameliorate the situation of the people I've met along the way. 

But I go back to teaching. I'm back in the classroom, which I'm really enjoying. I'm back to being a Just Security editor and trying to do more work in the places and the organizations that mean the most to me. And I will continue to do this work; I know that at some point I will be back on a front line somewhere, I just don't know where that front line is going to be yet. My front lines in Belfast remain standing. And I go back to the work that I've always done: writing, thinking, and supporting, as a member of civil society, the issues that are so, so important to me. As I say to all of those I've met in the last six years: I know we'll meet again, I just don't know where it will be. But for now, it's a good thing for me to be back in the classroom and back to writing as an academic, and I'm really enjoying both of those things.

Paras: Fionnuala, thank you again for joining the show. 

Fionnuala: Thank you.

Paras: This episode was co-hosted by me, Paras Shah, and Viola Gienger. It was edited and produced by Tiffany Chang, Michelle Eigenheer, and Clara Apt. Our theme song is “The Parade” by Hey Pluto! 

Special thanks to Fionnuala Ní Aoláin and Brianna Rosen. You can read Just Security’s coverage of counterterrorism, including spyware and surveillance technology, on our website. If you enjoyed this episode, please give us a five-star rating on Apple Podcasts or wherever you listen.