“A State of Emergency”: In Conversation with Jake Scott on Surveillance Technology, Part I
Georgia: What is your opinion on facial recognition technology (FRT)?
Jake: So the thing about facial recognition technology is that it was almost an inevitability with the invention of video. And even before video was a mass phenomenon, there were concerns about its use and implementation, most obviously in Orwell's novel 1984. And whilst we don't have, at least on the scale of Orwell's dystopia, a two-way screen in every home, there has been an increasing presence of video and facial recognition technology in the public square.
For instance, in Birmingham, where I live, there is New Street Station. When it was rebuilt as Grand Central, it had these three "Eyes", as they were referred to: large eye-shaped screens that were marketed as the next step in advertising, because they were able to change what they showed based on the composition of the crowd they were looking out on.
So, as I said, there are three of these Eyes, and the point is that they are called Eyes because they do literally watch the crowd. It's obviously based on mass data collection: if the screen detects a majority young female crowd, then it will probably shift to makeup or similar kinds of products.
The reason this matters so much is that the public square was historically somewhere that was not necessarily as observed as the private sphere. The interesting thing is that in Birmingham especially there are clear, distinct lines between public and private. For instance, if you walk to the Bullring, there is a line in the pavement where the paving slabs change, and that's because on one side it's public land and on the other it's private.
Public demonstrations can be broken up so much more quickly with facial recognition technology; you can essentially prevent them from occurring before they even start. But the other reason this matters in terms of the public square is that the public square is the place of spontaneity in public life.
The fact that it's observed so closely, and so systematically, is almost a way of stifling that spontaneity. Public squares especially: you've got paradigmatic examples like Speakers' Corner in Hyde Park.
These are meant to be places to begin a public debate. And if you can prevent that public debate from occurring, then you can, in many ways, stifle the idea of the public itself. This might sound hyperbolic or exaggerated, but it's the honest-to-God truth.
It's part of the reason that Britain needs to be more concerned about the fact that we are the most observed nation in the world. We have three times as many CCTV cameras in public as most other countries.
A specific problem with facial recognition technology in the public square is that it profiles the crowd. Obviously, this raises a number of ethical issues, especially regarding racial, sexual, and gender discrimination.
Until recently, it was a given that the state did not know where you were. The state was historically small (the nightwatchman state was as big as the state got in history), and then after the Second World War that radically changed. This is why people should be concerned about things like facial recognition technology, alongside vaccine passports. They're not just an efficiency question; they radically alter the relationship between citizen and state.
If the state has the capacity to know where you are at all times, its relationship to you changes. We can't necessarily know how it changes yet, but the fact of the matter is that it will. If you can walk through a large city and not be recognized by facial recognition technology, then you can essentially achieve anonymity; CCTV does observe us, but not on the same level.
Georgia: Facial recognition technology is obviously, as we can tell by the name, specifically about the face, as opposed to, say, the build, height, and clothing of a person. The most extreme example of this, of course, is China, where they've combined it with a social credit system to essentially track socially undesirable people. If someone has debt, or has committed a crime, or anything of that sort, the social credit system in China informs the people around them.
Jake: Sure, that's absurdly terrifying. One of the last remaining civil rights of our civilization is your ability to hide your personal life from strangers. Granted, there are some things you can't hide, like being on a sexual offence registry, but the fact that I could stand next to someone in public and not know anything about them is a remarkably rare thing these days, or it will become an increasingly rare thing in the civilized world.
So facial recognition technology, for me, is a radical transformation of the relationship between citizen and state, between state and public square, but also between citizen and citizen.
Georgia: What would you say to the people who say, okay, this will radically change the relationship between person and state and between different people, but that's okay, it's for the greater good, it's for security, and so on? How would you respond to that? I've spoken to one person who thinks that FRT should be used to stop human trafficking, for example.
Jake: The issue with government structures and systems is that there is always the possibility they will be used by your ideological opponents. And that doesn't just mean liberal versus Tory versus socialist; that means radically alternative viewpoints. Again, this might sound like hyperbole or exaggeration, but one of the important things in political theory especially is that we have to consider the worst possible circumstance. Weimar Germany set up its constitution to avoid tyranny.
And the ironic thing was that the moment a tyrant entered the system, it completely fell apart; it was the very system designed to prevent tyrants that facilitated the rise of a tyrant. Especially Article 48: Article 48 of the Weimar constitution stated that the constitution could be suspended in a moment of crisis. Now, liberal approaches to government, and by that I don't mean ideologically liberal.
I mean, specifically, philosophically liberal approaches to government have always begun with the question: what is the relationship between citizen and state, and how far can the state be restrained to secure the liberty of the individual? The question of how far the state can be restrained is asked not to prevent, say, welfare or security measures, but because of a rational fear that eventually that system or structure might be overtaken by the very people you need protection against.
So Nazi Germany is the perfect example. Likewise with Tsarist Russia: the reason Russia was even capable of becoming the USSR is that it didn't have liberal protections against state overreach. With systems-wide facial recognition technology, it's all very well and good when we believe that we have a benign, neutral civil service that doesn't have an ideological agenda, but what happens if it suddenly does?
And some critical people might say it already does, on both the left and the right: you've got radical socialists who would say it exists to perpetuate a class structure or class system, or you've got critical race theorists. The point is that if you have something that can be used in this way, there is always the possibility that it will be used to suppress individual freedom. Now, I can understand and appreciate the human trafficking concerns, because obviously that is an extreme example; but, if I may speak candidly, extreme examples are always a danger, because they facilitate the erosion of the norm.
This is part of why the separation of powers in liberal philosophy is so important: it prevents, or at least is meant to prevent, one part of government overstepping its mark. So facial recognition technology can be introduced for the emergency, but very quickly the emergency becomes the norm.
I'm sure over the last 18 months we don't need to be reminded of how quickly the emergency becomes the norm, especially with coronavirus and all these sorts of things. It's also fascinating that we're talking about facial recognition technology at the same time that we're talking about people wearing face masks more and more, which disrupts facial recognition technology. It would be wrong to think that there's some sort of conspiracy behind introducing it, but that doesn't change the fact that there will be someone out there who would use it for that purpose.
And for that reason, unless we can entirely secure the infrastructure against misuse, it is likely to be misused.
Georgia: In terms of that being a global conspiracy, with all those kinds of things there are so many different factors that mean it's just not possible, like so many of these wacky conspiracy theories. I don't think the United States government, the UK government, and China are in cahoots to introduce FRT. But obviously the Chinese government does have that system, and it is oppressing people. And in the UK we are using Chinese-made cameras, and there's evidence that data could be sent back to CCP databases with people's information.
Jake: Well, this is the other problem: we live in a transnational world, an increasingly globalized and internationalized one as well. What is there to say that the data will only ever stay in a closed circuit? There's no guarantee. This is the same reason people are concerned about Huawei building 5G towers: not to stereotype the Chinese government too much, but it does tend to be the perpetrator of these things.
All the Chinese companies that operate here are state-sanctioned companies, because of the nature of China, so I think it's absolutely fine to characterize them that way: all corporations there are sanctioned by the Chinese Communist Party, which is evil anyway. And it doesn't even matter that some of the conspiracies, like the 5G ones, are obviously a bit of rubbish; from a national security perspective, you shouldn't be allowing them to build the infrastructure in the first place.
And also, just from an economic perspective, we have the ability to make these technologies ourselves.
The other thing to bear in mind, to sum up what I was talking about: there is a simple maxim with government, which is that if it needs an emergency to enact the powers it wants to enact, it will create the emergency. And that will happen if you introduce a system like facial recognition technology and, for whatever reason, need a crisis to justify its introduction. I'm not necessarily saying that government will manufacture that crisis, but it will certainly jump on it; crisis manufacture, or crisis exploitation, is the job of government.
And that's not to make me sound like an anarchist or a small-state libertarian, because I'm not a libertarian; I'm a Tory, really. I believe there is a role for government, but the government has to be restrained, and the logic of government is to fight that restraint. So I can't see how creating a system that requires an emergency to justify it will benefit us.
Georgia: I guess, with facial recognition technology, it already does operate in the UK; it's just a question of the extent. In London, for example, local councils will be using it, and they'll say it's not one specific crisis; they'll refer to the crisis of violent crime, which is obviously a massive issue in London, and then sort of go along with it. But I think it is just one of those issues that, because there's so much going on, is not necessarily top of the news agenda.
Jake: Well, the thing about a democratized political system is that it relies on an informed public, which never exists, never has existed, and never will exist. So that's the first issue. The second issue is that you can't have a vote on every single issue, because if you did, you'd have a paralyzed system; that's the point of representative government. So not only can we not really have a public say on this from a practical perspective, but also we wouldn't be able to implement the public's perspective on it.
So the introduction of things like facial recognition technology needs to be seriously considered at the highest level, and I mean MI6-level, because it's a security concern; that's what it's there for. The problem with just introducing it by fiat is that you are subverting your own state security. In what possible world would that benefit you?
Georgia: It certainly benefits the MPs who are getting money from tech companies, Chinese tech companies; that's all I'll say. And the other ones are probably just not really interested. But there has been a lot of pushback. And ironically, given we talk about representative democracy and so on, there's been a lot of pushback in the House of Lords. Well, not ironically, because they obviously push back on a lot of things. There's been a lot of pushback in the House of Lords over the Telecommunications Bill, with, like I was saying, Hikvision operating a lot of those cameras.
Jake Scott is a postgraduate student in political philosophy at the University of Birmingham, and Chair of The Mallard magazine.
My short film, NOTHING TO HIDE, NOTHING TO FEAR | The Future of Facial Recognition Technology, produced as part of the Young Voices tech policy fellowship, can be viewed on YouTube now: https://www.youtube.com/watch?v=feJeGNJiGDc&ab_channel=GeorgiaL.Gilholy