Case In Point

Face Off: Clearview AI v The World

Miniature Productions

In 2017 a little-known Australian technologist develops a revolutionary form of facial recognition technology. 

Clearview AI is being used by law enforcement agencies to hunt down criminals. But is there a catch?

The resulting legal disputes around the world are among the most stark and alarming of our time.

Show Notes

Guest: Professor Jeannie Paterson, University of Melbourne (Co-Director, Centre for AI and Digital Ethics)

Update: Since recording, the Office of the Australian Information Commissioner released this Statement on Clearview AI (21 August 2024). 

The Secretive Company That Might End Privacy as We Know It by Kashmir Hill (The New York Times, 18 January 2020)

Clearview AI breached Australians’ privacy: Statement by Office of the Australian Information Commissioner (3 November 2021)

ACLU v. Clearview AI case summary (American Civil Liberties Union, 11 May 2022)

Facial recognition startup Clearview AI settles privacy suit (AP News, 22 June 2024)



[Archival] Hoan Ton-That: My name's Hoan Ton-That, I'm the co-founder and CEO of Clearview AI. Clearview AI is a facial recognition search engine. When I was a kid, I was obsessed with electronics and computers. I'd be reading a lot of stuff on AI, especially around image recognition. The issue with the previous stuff was you'd have to get a perfectly aligned face. It wasn't accurate across all these different demographics. I'm a person of mixed race. Accuracy on this kind of technology is important to me. In my mind, I was like, that could be a solvable problem.

James Pattison: It's 2017. A little-known Australian technologist joins forces with an American political operative to co-found a company to develop a revolutionary form of technology. Melissa, tell me about this company.

Melissa Castan: The company's called Clearview AI, James, and they created a groundbreaking form of facial recognition software and used it in ways that we never really expected.

James Pattison: And I am terrified already. It's the way that this technology works that has raised a lot of red flags all around the world. And we're going to hear more about the tech, but basically what it does is it matches uploaded images of faces against a database of billions, billions of images that have been scraped from social media accounts, public websites, all that sort of thing, without the consent of the people whose faces are in those photos.

James Pattison: So the question today is, has the stage been set for a showdown between technological advancement on the one hand and our individual freedoms on the other?

James Pattison: I'm James Pattison. 

Melissa Castan: I'm Melissa Castan, and this is Case in Point. Today, the very contentious case of Clearview AI. Is your face among the three billion images sitting in Clearview's database? 

James Pattison: Mine?

Melissa Castan: Yeah, yours. 

James Pattison: Jeannie Paterson is the person to talk to about the legal issues in the Clearview case. She's a professor of law at the University of Melbourne and she's also founding co-director of the Centre for AI and Digital Ethics.

James Pattison: Jeannie, welcome to Case in Point. 

Jeannie Paterson: Thank you so much for having me. 

James Pattison: What exactly is Clearview AI? 

Jeannie Paterson: Clearview AI started as an app that was a party trick. You could take a photo of someone you met at a party and all of a sudden information about them would pop up: their latest social media posts, possibly their name and where they had last been.

Jeannie Paterson: It was a game. And then the founder of Clearview AI realized that this product could be marketed to police forces. 

James Pattison: So, and everything was fine. Everything was completely fine. Now this founder, who is this founder? 

Jeannie Paterson: The founder is an Australian who was kind of playing around with technology for a while, living in the U.S. His first app put Donald Trump's hair on someone else. So that's the space he was playing in, but he got really interested in facial recognition technology and realized the potential of AI before anybody else had really woken up to it.

James Pattison: And this founder, his name is Hoan Ton-That. I've looked him up and, I mean, he looks like a male model.

James Pattison: He's got long, long locks. He looks kind of like the sort of guy who would create a facial recognition app like this. But, I mean, he's everywhere. I've seen some amazing places that this guy's popped up, including in Ukraine. We'll get to that a little bit later. Who are some of the other figures who are involved in this company?

Jeannie Paterson: Well, the other figures involved in this company are actually quite murky. They keep changing. He hired a great AI guy. He hired a great marketing guy. And he hired a great, you know, app builder. But the people in the company keep changing. It's him who remains constant.

James Pattison: Backers? 

Jeannie Paterson: Backers, lots of backers, but we don't know who they are either, to be honest.

Jeannie Paterson: For a while, there was a lot of interest, but once the ship hit the fan about the technology, a lot of the backers pulled out. The app, Clearview AI, was kicked off the Apple App Store and it was kicked off the Google Play store because people were so worried about it. And that's where it became really murky.

Melissa Castan: So Jeannie, you said that the app had been kicked off the platforms and something's gone haywire here or something's gone a bit wrong if people are turfing it off platforms. What is it that the app is being used for and why is it causing that controversy? 

Jeannie Paterson: This is a story actually of investigative journalism.

Jeannie Paterson: So in 2020, a journalist called Kashmir Hill, who subsequently wrote a book, broke the story of Clearview AI in the New York Times and made the point that this app, Clearview AI, was scraping faces from social media and using that social media to re-identify people. And at that point in time, the app was available to individuals.

Jeannie Paterson: It was available to private companies and it was also being offered for free to law enforcement agencies. And the concern here was, we go, well, you know, people weren't asked to give permission for their faces to be scraped, but people had put their faces on social media. What's the problem? The problem is biometrics.

Jeannie Paterson: It's built on biometrics. So what it does, if you take a photo of a person you want to identify, is map the geometry of their face, the vectors, and then match that geometry to the vast number of images in its database, to provide information about the person whose face vectors match the image in the database.

Jeannie Paterson: So that can include who they were with at any point in time, where they were, information that's been tagged to those social media images, often their name. And this is available to anyone who's got access to the app. Now think about that for a minute. That's not like looking on social media for someone; that's actually a quick and easy way to match a face you've seen to a name and a location.

Jeannie Paterson: And it's not just in the hands of the police, it's in the hands of private people. 
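
[Producer's note: for listeners curious about the mechanics Jeannie describes here, below is a minimal, hypothetical Python sketch of what "mapping the geometry of a face" and matching vectors against a database amounts to. The embedding function, similarity threshold, and database layout are illustrative assumptions, not Clearview's actual system; a real deployment would index billions of vectors with approximate nearest-neighbour search rather than the linear scan shown, but the principle is the same: the face becomes a searchable key into everything tagged to it.]

```python
import numpy as np

def face_to_vector(image_bytes: bytes) -> np.ndarray:
    """Stand-in for a real face-embedding model: in practice a neural network
    maps a face image to a fixed-length vector encoding facial geometry."""
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)  # unit length, so dot product = cosine similarity

# The scraped "database": face vectors paired with the social-media metadata
# described in the episode (name, source, location, who they were tagged with).
database: list[tuple[np.ndarray, dict]] = []

def enroll(image_bytes: bytes, metadata: dict) -> None:
    """Add one scraped face and its metadata to the index."""
    database.append((face_to_vector(image_bytes), metadata))

def search(probe_bytes: bytes, threshold: float = 0.7) -> list[dict]:
    """Return metadata for every stored face whose vector is close to the probe's."""
    probe = face_to_vector(probe_bytes)
    scored = [(float(probe @ vec), meta) for vec, meta in database]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [meta for score, meta in scored if score >= threshold]
```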

Melissa Castan: So was that offer of the app and the facility to the law enforcement agencies some kind of virtue signaling, sort of saying, look, we're doing this for good, to stop crime and to prevent, you know, bad things happening?

Melissa Castan: Well, to be honest, 

Jeannie Paterson: I think at the beginning the app was basically being offered around to see where the money can be made. Now there's an outrage about using the app for private purposes. Once this story breaks, this journalist stories breaks, the app is kicked off a lot of app stores. Eventually it's found out that it's being sold to private companies for surveillance and security purposes.

Jeannie Paterson: There's a big outrage about that. So the private companies cancel their contracts, but law enforcement remains the one place where the invasion of personal privacy might just be justified, because it can be used to catch criminals. And so there's a clever marketing trick: it's offered to police forces for free, and then they can sign up and pay for it.

Jeannie Paterson: But police forces don't have this level of technology. They don't have sophisticated facial recognition technology that works as well as Clearview AI.

Melissa Castan: My mind keeps going back to those procedural crime shows where you see them, you know, plugging some data into the computer and then, like, two minutes later, it goes, beep, we've identified Jeannie Paterson.

Melissa Castan: And this is where she lives. And you know, this is when she last went to the airport or whatever. Is that the kind of facility that's in this platform? 

Jeannie Paterson: So police and security forces have tried to use facial recognition for some time to identify terrorists or criminals in a crowd.

Jeannie Paterson: And it's harder than you think. So we use facial recognition technology every time we travel. Our passports are matched to our faces, but that's quite straightforward technology. It's called one to one facial recognition. It's my face standing still looking at a camera matched to the face that's in my passport.

Jeannie Paterson: That's pretty easy. But if you're a law enforcement agency, what you're trying to do is match a face in a crowd, CCTV footage perhaps, to your own database of people who you know are suspected of crime, or engaged in criminal activities, or even terrorists. And that's not straightforward.

Jeannie Paterson: Images in a crowd are often poor quality, and moreover, the databases that the police hold are limited, because the only database of faces that police have are people who've already come into contact with the police, who've already committed crimes, been arrested, and so on. So the technology is limited because the ability to decode a face in a crowd is quite difficult.

Jeannie Paterson: The lighting's not good, people are on the move, and the database to compare those faces in the crowd against is limited as well. So the breakthrough of Clearview AI was this vast reservoir of faces, and moreover faces with metadata attached to them: information about who they are, where they were, who they were with.

Jeannie Paterson: So all of a sudden the police using facial recognition technology had access, more or less, to the whole world to identify people of interest that they wanted to catch.
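
[Producer's note: a toy illustration of the distinction Jeannie draws between one-to-one verification (the passport gate) and one-to-many identification (a face in a crowd), and why database size matters. The false-match rate below is an assumed number for illustration, not a measured figure for any real system.]

```python
def verify(similarity: float, threshold: float = 0.90) -> bool:
    """One-to-one: a single, well-lit, cooperative comparison of your face
    against the one face stored in your passport."""
    return similarity >= threshold

def expected_false_matches(false_match_rate: float, database_size: int) -> float:
    """One-to-many: the probe is compared against every entry, so even a tiny
    per-comparison false-match rate adds up (roughly rate x N, assuming
    independent comparisons)."""
    return false_match_rate * database_size

# With an assumed one-in-a-million false-match rate per comparison:
print(expected_false_matches(1e-6, 1))              # passport gate: ~0.000001
print(expected_false_matches(1e-6, 3_000_000_000))  # 3 billion faces: ~3,000 wrong "hits"
```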

James Pattison: You can see the value proposition for them being presented by the company. I mean, it is really staggering. These images that you've mentioned, these billions and billions of images being, you're using the term, scraped from social media and public websites.

James Pattison: Whereabouts were these coming from? Are we talking Instagram, Facebook, TikTok? Like, you know, the company that I'm employed by, their website? Do we know which ones?

Jeannie Paterson: Well, we don't really know because we don't know what's in this database of images, but it's Facebook, it's TikTok, it's Instagram.

Jeannie Paterson: And it's LinkedIn. And actually in the early days of Clearview AI, I remember this story broke in January 2020, and in that year, 2020, you know, outrage about Clearview AI was at its peak. LinkedIn and some of the other social media companies went to Clearview and said: your scraping of images from our social media platforms is contrary to our terms of use. Stop it now. Setting a pattern that continues in all the litigation against Clearview AI, Clearview AI says, we don't believe you. These images are public. We can take them. You can't stop us.

James Pattison: Wow. It's not just one case that we're talking about. This is global. There's a series of cases that have kicked off around the world. Where and when?

Jeannie Paterson: Well, what you have to remember is that privacy protection in the U.S. is very low and privacy protection in the EU is very high. However, in the U.S., one state, Illinois, has a Biometric Information Privacy Act, BIPA. So privacy protection in the U.S. is uneven.

Jeannie Paterson: Actually, California has good privacy protection law. Illinois has some privacy protection law. Other states don't. It's pretty uneven. But in Illinois, BIPA says that you can't collect biometric information without consent. And that's the most stringent biometric privacy protection law in the U.S. So after this groundbreaking exposé in the New York Times, a number of class actions are brought against Clearview AI and other companies using facial recognition technology, on the basis that what they're doing is a breach of BIPA. Now Clearview AI would go, nah, we don't need consent. You put those images out there.

Jeannie Paterson: We're not infringing anybody's privacy. We're not infringing the legislation, because we're only collecting images that have been publicly displayed. Now that litigation starts to run, and litigation runs quite slowly, and in the meantime other countries start to wake up to Clearview AI and start to investigate what it's doing.

James Pattison: Who brings a case like this, like in the United States? Is that something that's brought by an individual who feels that their images have been stolen, for example, or is it brought by the regulator? Or, you know, who actually is it?

Jeannie Paterson: So it can be brought by the regulator. So in Australia, we don't have private rights of action under the Privacy Act.

Jeannie Paterson: That may be coming. Hopefully we're going to get that under reforms to the Privacy Act that are being considered now. So the action has to be brought by the regulator, or individuals need to find another way to frame their complaint. In the U.S., the action against Clearview AI was actually brought by the American Civil Liberties Union,

Jeannie Paterson: which can bring a representative action under BIPA on behalf of people who are affected. But some actions were also brought by regulators. Now, we need to pause here to think about privacy again. The problem with a lot of privacy protection law is that it's framed as a prohibition.

Jeannie Paterson: Don't infringe people's privacy. Don't collect biometric information without consent. Don't share information unless you have consent. Don't take more information than you actually need to run your business. There's a whole lot of rules, but there's not necessarily any right to compensation for individuals if those rules are breached.

Jeannie Paterson: Usually with privacy breaches, the response is to impose a fine or an injunction not to engage in the conduct, but there's not necessarily any right of compensation for the people whose privacy has been infringed. And one of the reasons why that compensation question is so problematic is because if somebody steals your face,

Jeannie Paterson: which is effectively what happened here, it's not clear that you as an individual have suffered any economic loss, and that's what the law normally compensates. You might be cross, you might be upset. If you're somebody who has a reason for not wanting your image to be public, you might feel afraid. And there are concerns that Clearview AI in its original form could be used for stalking and harassment.

Jeannie Paterson: But it's not necessarily economic loss, so it's hard for the law to think about how to compensate people whose privacy rights have been breached, which means there's not a big incentive to bring the actions. Litigating is expensive, and normally you want to litigate in a class action to get the force of a group, and then you need a litigation funder, and if there's no loss, there's no compensation. It's hard to have the momentum to bring the right kinds of action. So that's why civil society groups might bring the action, or you need your regulator.

James Pattison: Let's just say, for example, that there's a company that has significant financial backers and they hear about what you're saying, that there's no incentive for individuals to bring an action like this because it's expensive.

James Pattison: If you were sort of business-minded, would you see a potential opportunity to offer some sort of incentive to individuals to, say, drop the cases against you?

Jeannie Paterson: Oh, absolutely. Absolutely. Now I think we need to again go back a step. So Clearview AI has been very good at aggressively defending any action against it.

Jeannie Paterson: So in the EU, and in the UK, and in Australia, it said to the privacy regulators that have tried to bring action against it: we haven't breached your law, because consent is conferred by the fact that people have shared the images publicly. And anyway, you don't have any jurisdiction over us because we don't operate in your country.

Jeannie Paterson: You can't get us. Now, that doesn't work in the U.S., because Clearview AI is based in the U.S. So that action I mentioned under Illinois's Biometric Information Privacy Act, BIPA, that class action was the one that was always the most problematic for Clearview AI, because it's being brought by the Civil Liberties Union.

Jeannie Paterson: So they have the commitment to bring the case. They don't care if there's no compensation, because they're bringing it as a matter of principle and right. That case is the one that's problematic, and similar class actions are being brought. So while Clearview AI has aggressively defended actions against it in other jurisdictions, very recently, in June 2024, we hear that as part of the settlement of class actions filed against Clearview AI in Illinois, the company has offered shares in the company instead of compensation.

Melissa Castan: That's a massive mind shift, to think that the people who have been negatively affected by their images being shared might want to profit out of the company that's done that sharing.

James Pattison: I mean, what are the legalities of offering an inducement like that to somebody to drop their case?

Jeannie Paterson: There's no rule about that, because there's been no decision in the case, so there's no order to pay any damages. The parties to that class action, and there's a variety of class actions filed, to be fair, are probably not going to get anything if there's a decision in court.

Jeannie Paterson: It's still actually uncertain; there are uncertain legal arguments. It's unclear whether they've suffered any economic loss. Maybe they'll get something for anxiety or pain and suffering or infringement of their rights, but that's a pittance. So the actions are probably being brought as a matter of principle, but at some point the parties to those class actions have to decide if they want to proceed, and here they've been offered something.

James Pattison: The company would say that's just good business, right? 

Jeannie Paterson: The company would say that's good business. And here's the thing as well. Remember Melissa spoke earlier about the value proposition of Clearview AI. The problem here is Clearview AI has a compelling case for why it's not a bad company. Sure, it's taken faces from social media. But we put our faces there. We put the information there. We included information about our friends and our colleagues and where we are. So, from Clearview AI's perspective, we're complicit in its using our faces. And then how is it using that information?

Jeannie Paterson: It's using this information for law enforcement. Recently, Crikey ran an article saying that Clearview AI is still kind of being used in Australia, because the federal police have been part of an international action: Australian police have been collaborating with international law enforcement, and the international law enforcement agency is using Clearview AI. So Crikey was making the point that although Clearview AI has infringed our privacy law by using Australian faces, the Australian Federal Police is still complicit in the use of Clearview AI. But what's that law enforcement action?

Jeannie Paterson: It's catching children who have been sex trafficked. That's what the international action is. So let's weigh it up: we've infringed some privacy of individuals, but we're using this amazing tool that has not before been available to police and law enforcement and security agencies to really make an impact on child sex trafficking.

Jeannie Paterson: It's not a 

Melissa Castan: straightforward moral argument. But the lack of clarity about what Clearview AI is doing, where it's, Where it's landed, who runs it, how it operates, who's responsible, that bit seems really hard to reconcile. It should be from a matter of corporate law and, you know, the responsibilities of directors and all those things that we're kind of quite familiar with, that this company is not stacking up on those grounds.

Jeannie Paterson: Well, I'd actually go further than that. I would say that the concern about Clearview AI is that these arguments, about the rights of individuals to information privacy, about the importance of using technology to stamp out what are clearly offensive, terrible crimes, those debates aren't being publicly aired.

Jeannie Paterson: And why aren't they being publicly aired? Because Clearview AI is a private company. It's a private company that's providing a public law enforcement service. And we've got this melding of public and private. The police and law enforcement agencies are really important in societies, but they're also subject to a whole lot of checks and balances to make sure they're working in ways that are consistent with human rights, that are consistent with, you know, administrative law, that are transparent, that are accountable.

Jeannie Paterson: Now, none of those protections that sit with public agencies like law enforcement apply to Clearview AI, because, you say, it's a private company. So provided it's complying with, you know, the company law that applies, it's fine. There's no scrutiny over it. And the problem to me is that it's providing a public service, a service to public agencies, without those checks and balances.

Jeannie Paterson: So we might think that in some ways facial recognition technology is justified for some law enforcement purposes, but we'd also say there needs to be a whole lot of protections built in. We'd want the use of the technology to be transparent. We'd want to know there's really robust testing of the technology to make sure it's accurate.

Jeannie Paterson: We don't know how accurate this is. We don't know how many false positives it's giving. And we'd want what's called an accountability framework around it. We'd want a human or a group of humans to be overseeing the use of the technology, to be auditing its outputs, to be recognizing when there are complaints about the technology.

Jeannie Paterson: None of that necessarily exists, because the technology is being fed into law enforcement agencies by this murky private company. And to me, that's the real complaint.

James Pattison: So even if you're contracting with a government agency, there's no level of oversight and accountability that we'd expect from a public entity?

Jeannie Paterson: No. The public law stops at the point where the government has contracted in a service. Now, that procurement contract is really important, and a lot of people who are interested in civil liberties and the impact of technology are looking really closely at procurement to say, well, if you're in a public agency and you're buying in technology, you still need to be responsible for that technology.

Jeannie Paterson: And to do that, you need to put something in the contract, because otherwise, and what we see again and again, is that the technology that's being provided to the public agency may or may not work fine, but there's no mechanism for the public agency, the police in this case, to trace back into the technology to see how accurate it is, to see what data it's been trained on, to ask the whole lot of questions that we'd expect them to have the answers for.

Jeannie Paterson: It's kind of this shifting of responsibility. The black box is not the technology. We sometimes hear about AI being a black box. The black box is not the technology. The black box is the fact that it's being run by a private company. 

James Pattison: You mentioned before the way that it was being used to make major breakthroughs to protect children.

James Pattison: It makes you go, wow, I didn't think about it that way. This is not all doom and gloom; this gives them some social license and increases their social capital, which clearly is very good for them. But also, you know, hearing that immediately changes my view, and I go, maybe I should give a little bit of credit to them and be a bit less concerned about my privacy, because it will protect children.

James Pattison: Something similar happened when I read on the Clearview website that on the 13th of April, 2023, an award was presented to the CEO of Clearview. He's standing side by side with the general of the Main Directorate of Intelligence at the Ministry of Defense of Ukraine. He's been given this award of gratitude on behalf of the Ukrainian people for the use of Clearview AI in their fight against Russia.

James Pattison: Something like that. First of all, do you know about the way it's being used in that war?

Jeannie Paterson: A little bit. So in 2023, there were reports that facial recognition technology, probably Clearview AI, because when people go, we're using facial recognition technology, often it's Clearview AI, because they have this huge database, was being used to identify Russian saboteurs within the Ukrainian population.

Jeannie Paterson: And that sounds good until you realise that lots of Russians live in Ukraine and may go back and visit their families in Russia, but don't necessarily consider themselves Russian sympathisers, might consider themselves Ukrainian, and yet through this technology may be identified as potential saboteurs.

Jeannie Paterson: So the use of the technology in that context is probably inappropriate, because it's making a judgment about people's origins and loyalties, something the technology was never set up to do.

James Pattison: And sort of, I guess, expanding that point about being identified as a potential saboteur in a war zone, what could that mean for an individual?

Jeannie Paterson: Well, we can only imagine what that could mean for an individual. Now, I mean, I think that part of the aim of the use of the technology was also to identify, you know, Russian soldiers who may have tried to defect or have been in disguise and so on.

Jeannie Paterson: So you'd go, well, if we can identify the Russian saboteurs, it protects the Ukrainian people. But equally we could be condemning potentially innocent people, or people who consider themselves very much aligned to the Ukrainian cause, to all sorts of punishment and danger. The technology just matches a face to an image.

Jeannie Paterson: It doesn't tell us anything about what people are thinking or feeling, or even, essentially, their national origins. It's simply based on social media posts. Take a photo of me with a Russian? Does that mean I'm a Russian sympathizer? I don't think so. But that's kind of the danger. And, you know, a lot of the concerns about the technology also flared up during the Black Lives Matter movement, because there was concern that the police were using either Clearview AI or similar technologies to identify people who were active in the protests, which again, think about the civil liberties of that.

James Pattison: Okay. So there's so many cases involved, let's see if we can sort of zoom around the globe and see where we're at. We've got the UK and Australia; you've mentioned where those cases are currently at, which is TBC. We have the US, where people have been offered equity in the company.

James Pattison: What's happening there? Is that case still going? Do we know if they've accepted it? 

Jeannie Paterson: We don't know. We're waiting to find out what's happening in that case. So that's the first time Clearview AI has given any concession to anybody challenging its mode of business.

James Pattison: Wow. Okay. And Europe, I know there are cases in Italy, France.

James Pattison: Where are they at? 

Jeannie Paterson: So, as this story broke in 2020, the privacy commissioners across Europe, and Europe has probably the strongest privacy protection in the world, the GDPR, the privacy commissioners, or privacy regulators, issued findings that Clearview AI had breached the privacy rights of European citizens, or their own citizens, and imposed colossal fines on Clearview AI, including penalties for not paying up.

Jeannie Paterson: And I'm talking colossal fines. So, for example, France imposed a 20 million euro penalty in 2022, and other jurisdictions in Europe have done the same thing. Clearview AI said, we don't operate in Europe; we're not paying. Now, Europe has recently passed the AI Act, and the AI Act specifically bans the business model of Clearview AI. It specifically says scraping internet images to create facial recognition technology is banned in Europe.

Melissa Castan: What do you think the outcomes of some of these legal challenges are going to be, and how are they going to influence how we develop our privacy protections, corporate protections, and the use of these kinds of materials in enforcement areas?

Jeannie Paterson: Such a good question. So the state of play in Australia and in the UK is this. In Australia, the Privacy Commissioner said that Clearview AI had breached privacy law by taking images of Australians without consent. Clearview AI appealed that decision and lost. It said, well, we don't operate in Australia, so we can't be subject to Australian law.

Jeannie Paterson: The Tribunal it had appealed to disagreed and said, come on, you're taking images of Australians from Australian websites; you operate here. In the UK, the Privacy Commissioner imposed a fine on Clearview AI for the same reason. That was appealed, and the Tribunal in the UK accepted Clearview's argument that it was outside the jurisdiction, and that case still has to go to court.

Jeannie Paterson: But what those two decisions tell us is that the legal system, to Melissa's point, is poor at responding to this problem of data protection. It doesn't quite work. It's premised on a world where our concern about privacy was some sort of stalker taking photos through our bedroom window or standing in our garden.

Jeannie Paterson: Not this sort of free flow of data and images about us across the internet that may impact on our freedoms, but more broadly may affect freedoms in societies generally. At the moment, Australia's reforming its privacy law, and some of the things the reforms will do is give greater enforcement powers to the Commissioner, but also make provision for private rights of action under the Privacy Act, which would make class actions to enforce privacy claims possible.

Jeannie Paterson: We still have the problem of compensation. That's still an issue that needs to be dealt with. What they do in some countries is have a token amount: if there's been a breach of your privacy rights, you get a token payment that just recognises your rights have been breached.

Jeannie Paterson: And that sounds trivial, except that it allows those actions to be brought, right? And sometimes we need private rights of action to enforce important protections that, for one reason or another, the regulator's not dealing with. So I really think we need these changes to the Privacy Act. And the other thing I'd say is that I don't think putting your image on Facebook gives consent for anything to be done with that image.

Jeannie Paterson: And I think the reforms to the Privacy Act are going to make that clear as well. 

James Pattison: As we draw this to a close, do you think that there is a reframe that we need as individuals? We can ask government to make these changes to the privacy legislation that we have in Australia, but is there a reframe that you think we as individuals need to have about our privacy and what it actually means?

Jeannie Paterson: I think we need to be clear that we value privacy and, as I've said, have clear, open, informed debates about this balance between privacy and law enforcement. And I think that reminds us that once your stuff is on the internet, it's there. You can't take it back. It's always there and other people can use it.

Jeannie Paterson: I know lots of parents post images of their kids, and I've even heard parents say, well, this is a sign that I love and respect my kids, because I'm celebrating their individuality on social media. Wow. I used to post photos of my kids. My kids, politely or not, asked me not to post images of them without asking.

Jeannie Paterson: And, you know, at that time, because I'm a contract lawyer, I was like, well, okay, they've asked me nicely, I should respect their agreement. I don't know, that's probably not a contract, they reminded me; you can't make a contract with your own family. Anyway, I kind of went, okay, I'll respect their decision that they wanted to control their images.

Jeannie Paterson: But now I kind of go, you know what? They were right. They were right to say they wanted to preserve their right to privacy online. Remember that if you're posting images of your children, you're denying them the right to choose whether they're anonymous. They might want to be, they might not, but if their faces are scattered across the internet, something like Clearview AI can harvest those faces, collect the metadata, store the geometry of the faces, and they're forever tagged.

Jeannie Paterson: I think that's probably not a good thing. So we need to keep remembering that we're not against catching criminals, but there's a due process in how we engage in catching criminals or terrorists. And there's also a point about where we deploy resources. So if we're going to deploy resources in technologies to reduce crime, we want to know that those technologies work, and that they're not impacting most heavily on the people least able to defend themselves.

Jeannie Paterson: And so those questions are really important. And those are precisely the questions we can't answer with this technology. All we know is that Clearview AI aggressively defends court actions, refuses to pay fines, and, when it's at the wall, offers people shares in its own company.

Melissa Castan: Professor Jeannie Paterson. Thanks for speaking with Case in Point. 

Jeannie Paterson: Thank you for having me.