How is technology being used to track Black Lives Matter protestors?

Wednesday 18th Nov 2020, 11.57am

We’ve probably all heard the phrase ‘Big Brother is watching you’ (a reference to the fictional character in George Orwell’s dystopian novel ‘1984’) – but are we really under constant surveillance? Is it actually possible to be a fully functioning member of modern society without being tracked by some sort of surveillance system? And how is technology being used to track protestors – such as those involved in the Black Lives Matter demonstrations? We catch up with Anjuli Shere, from Oxford’s Department of Computer Science and Channel 4’s show ‘Hunted’, to find out.

Eager to find out more? No worries, we’ve got you covered! Follow the links to check out the work of Simone Browne, Shoshana Zuboff and Ken Klippenstein.


Emily Elias: Protesting is a part of our democratic rights, but it appears there are more folks watching than you might think. Surveillance systems are getting more complex, and simply posting a picture could endanger those gathered to decry institutional injustice. So, on this episode of the Oxford Sparks Big Questions Podcast, we are asking: how is technology being used to track Black Lives Matter protestors?

Hello, I’m Emily Elias and this is the show where we seek out the brightest minds at the University of Oxford and we ask them the big questions. For this one, we’ve reached a researcher who thinks about all of the data-related things that you don’t want to think about.

Anjuli Shere: My name is Anjuli Shere. I’m a doctoral student in Cyber Security at the Centre for Doctoral Training in Cyber Security at Oxford and my research focuses on protecting journalists from new emerging cyber threats.

Emily: You also have a very unique skillset that has actually got you on TV.

Anjuli: Yes. My professional experience covers a wide range of things. I briefly, when I was a teenager, worked at a private investigative agency, I’ve written for a national news outlet and my most notable one, I guess, is my summer job as an intelligence analyst on the television show, ‘Hunted’, which is on Channel 4.

Voiceover: “Nine ordinary people are about to go on the run from a team of expert hunters.”

Anjuli: I guess my primary responsibility is profiling, which often involves going through publicly available online information as well as data from personal devices to put together the pieces and track someone down. It’s a fugitive simulation and all of those bits of expertise combine to form my doctoral topic.

Emily: Anjuli is an all-around badass when it comes to surveillance, not her words, mine. When you spend as much time in this world as she does and then take a look at what’s happening with protestors in the Black Lives Matter movement, well, it’s worrying.

Anjuli: It’s tricky because my experience on ‘Hunted’ is from the other side of things. It’s very much we’re simulating a law enforcement or intelligence agency tracking down people who might be on the run so I’ve seen some of the capabilities that are available to people who are on that side of the equation.

But, as a brown woman in Britain, I’m very sympathetic to, and a strong supporter of, the Black Lives Matter movement, and so I was thinking more about how these protestors might not realise that they’re being surveilled. So it’s taking a negative view of the technological capabilities.

Emily: There is the low-hanging fruit of surveillance that we’re all familiar with: you’ve got your CCTV, which is a part of daily life, drones and helicopters following protestors around, and social media. You could see how police could use those to identify people at protests, but Anjuli says what we don’t talk about is how much that technology has actually evolved.

Anjuli: The technological capabilities that the police have access to now have massive advantages over the kinds of things that they were able to do 10 years ago. Those are upgrades in terms of capability and quality and prevalence. The example that I always think about is CCTV cameras. The rise of internet-connected consumer devices has facilitated the growth of an informal side of state surveillance infrastructure.

This is really well encapsulated by things like private camera-equipped doorbells that have an internet connection and allow police to access video footage; those have become very popular. Technologies like facial recognition, artificial intelligence and biometrics are all emerging at the same time.

Rather than being able to view CCTV cameras as a single, self-contained issue, as people might have been able to do in the ‘90s, you now have cameras that have multiple functionalities, are connected to the internet, and are manufactured and sold cheaply. They aren’t just sold to members of the public: in fact, some police forces have given away Amazon Ring doorbells to communities for free to enable them to visually monitor the areas in front of people’s houses.

That’s fostered massive racial profiling of video subjects, which, of course, is particularly concerning considering police profiling of activists such as those who support Black Lives Matter, and potentially fatal when you consider the FBI report showing that white supremacists have infiltrated American law enforcement to a very disconcerting level.

Emily: Back in May, protests in Portland, Oregon began over the killing of George Floyd. They continued through the summer and President Donald Trump sent in federal law enforcement to quash them.

Anjuli: In addition to normal policing efforts, and then also very, very aggressive policing efforts, we saw the deployment of these federal agents. There were reports of them not wearing badges or nametags, kidnapping people off the streets, bundling them into vans and holding them in federal buildings, ostensibly arresting Black Lives Matter protestors, and surveilling, profiling and intimidating them.

Emily: How exactly were they surveilling them? Well, Anjuli says they were using a piece of kit that is typically used by military surveillance teams.

Anjuli: You have these different forms of technological surveillance that take in information on protestors en masse, and one of those forms is low-level voice intercept operations. Those involve using these very specialised military technologies to pick up on communications by scanning transmissions, scanning the airwaves, and then capturing short bursts of conversation. Those bursts could then be used to incriminate or prosecute someone, even if they are too brief to be grounded in enough context to tell you what the person was really saying.

Imagine having a conversation, imagine having a break-up and then talking to your friend in person in a public park and saying, “I just want to kill him.” Right? Obviously, you don’t actually want to kill your ex but that short burst of conversation could be picked up and then used afterwards to incriminate you.

Emily: It’s not just police surveillance that worries Anjuli; technology has been moving so quickly. I mean, a few years ago, we didn’t have doorbell cameras or Alexas or Fitbits watching and listening and tracking us. These companies have piles and piles of data about you, and of the laws out there to protect you, some are okay, some are out of date and some are just bad.

Anjuli: In terms of policing, you no longer have, in my opinion, state-run law enforcement agencies that are completely siloed and isolated from any kind of corporate influence. The Amazon Ring cameras I mentioned are a really clear example of that.

On top of that, it’s a legal but ethically grey area where you’ve got law enforcement and intelligence agencies allying with these corporate hegemons to create a state industrial surveillance complex that definitely threatens certain civil liberties, like the freedom to protest without having a dossier of your details compiled and kept on file as though you’re a criminal when, actually, protesting is very legal.

I think this is particularly concerning when you realise that it’s not just the technologies that are the problem but also the law changing. Where the technologies can’t be directly accessed by law enforcement, you would expect warrants to be an important factor in protecting people’s data, but there have been very clear instances of states implementing legislation that forces metadata or data to be retained by the technology or communications companies that created the devices gathering it.

For example, in Australia there’s mandatory retention of metadata for at least two years, with state agencies paying the companies that have to retain this data for access. On top of that, in America, you have examples of data being bought in bulk from, for example, social media companies and then used for an agency’s own purposes, without necessarily needing to announce that there’s journalistic data or protestors’ data included in that bulk load.

That’s particularly concerning for proponents of free speech because there’s a clear erosion of civil liberties that were previously protected by established legal mechanisms, like the need for a warrant from a judge if someone wants to look into a journalist or a protestor.

This is where I sound like a tin-foil-hat-wearing conspiracist, but it’s my personal belief that you can’t engage in normal human life and be a functional member of society at present, in the UK or in the US in particular, without also interacting with these massive technological structures and institutions that are springing up and that are very closely interlinked with the state.

There’s always going to be an asymmetry between where the technology is and where the law is, but we need to try to close that gap a little bit. Beyond that, we need to stop what’s actually happening at the moment, which is that the law is increasingly being used to facilitate technology being used against civilians.

Emily: So what can you do? How do you stay safe and protest? Well, Anjuli says you’ve got to go basic.

Anjuli: I want to make it clear that it’s close to impossible to entirely avoid these technological surveillance measures. The best someone can hope for might just be to make it so difficult or so expensive for someone tracking you to work out where you’re going or who you are that they decide to focus their efforts elsewhere. But there are old-school methods that still have utility today.

For example, we’ve seen some of these crop up at recent protests in Hong Kong. Protestors might wear black or very dark clothing that isn’t branded to protest, and then, afterwards, slip out of these clothes into some other, slightly more colourful clothing, so that police near the protest can’t say with certainty that they’re the same people who were just there and identify them as protestors based on the ‘uniform’ they were wearing. That’s a very simple one.

It’s also key, I would say, to try to keep up to date on any news articles that tell you what kind of surveillance tools have been used and there are lots of articles out there that investigative reporters, who are used to needing to know about this stuff, have written to help protestors. There are outlets, like The Intercept, that are really great at publishing these.

Advice from these sources tends to include things like, I would say, using a burner phone that is on airplane mode, writing down the numbers of people that you would want to call in an emergency, and not taking anything that could identify you or track you to a protest. Obviously, it’s very easy for me to say that in the abstract; in practice that means not wearing your smart watch, and not taking your normal mobile phone or your debit card with you.

I think it’s Mr Weasley in Harry Potter: “Don’t trust anything that keeps its brain where you can’t see it.” Don’t take technologies with you. It’s the equivalent of taking along someone who has a clear allegiance to the forces that you’re protesting against.

Emily: I mean, I’m not going to lie, after talking to you I feel like I never want to leave my house again. Do you get that response from people often?

Anjuli: I try not to bring this up at dinner parties. No, I think there’s a very fine line between, like I said, wearing a tin foil hat and, also, having a realistic understanding of what the dangers are. Honestly, I think that the default position that everyone should have regarding these new technologies is suspicion. I don’t think that that’s because I’m a conspiracist, I think it’s because I’m actively researching these issues and so I have a greater awareness of what the problems are than, perhaps, other people.

I also rely on these technologies. I might not have any internet-connected devices that would be considered novel or interesting but I have a laptop. I still use the internet. I still use social media and it would be impossible to be a useful member of society if I didn’t have those things and, especially during the pandemic, it would be impossible to stay connected to anyone. I would lose my mind.

Emily: When it comes to technology, Anjuli says you can’t just bury your head in the sand and hope it will be okay. She’s a firm believer that we need to act.

Anjuli: I would say, in addition to protesting for political changes and for changes in the structure of these institutions, it’s also really important to push for legal changes and increased regulations of these technologies because that could have a massive positive effect.

Emily: This podcast was brought to you by Oxford Sparks from the University of Oxford, with music by John Lyons and a special thanks to Anjuli Shere. There are a lot of other smart people out there who are thinking about these issues so, if you are looking for more information, we highly recommend you check out writing by Simone Browne, Shoshana Zuboff and Ken Klippenstein.

Links to their work are going to be in the show notes of this episode so you don’t have to go too far to find them. Tell us what you think about this podcast. We are on the internet @OxfordSparks and we’ve got a website, oxfordsparks.ox.ac.uk. I’m Emily Elias. Bye for now.