
Is Clearview Making You Nervous?

Technology is exciting. It not only opens doors to a range of new medical advances and economic opportunities; it can also simply make life more convenient.

But technological advances come with costs, and one clear cost is to our privacy. That is partly a result of the huge amounts of data we give off online and through our smart devices. But the threat also comes from growing opportunities for surveillance, and from a lack of legal restrictions on who can collect and use our data.

Clearview.ai

One recent example of a potential threat to privacy comes from Clearview. Clearview, according to the company’s website, is a research tool that helps law enforcement track down criminals. Law enforcement officers upload a picture of a suspect, and Clearview uses its facial recognition technology to find matching pictures in its database. The service returns a list of matching pictures along with the website where each image was found. Law enforcement can then use those websites to find more information about the suspect, information that can eventually lead to an arrest.
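To make the mechanics concrete, here is a minimal sketch of the general technique behind a reverse face search of this kind: compute a numerical embedding of the face in the query photo, then return the stored photos (and the URLs they came from) whose embeddings are closest. This is not Clearview’s actual code; it uses the open-source face_recognition library, and the tiny in-memory index of (embedding, URL) pairs is purely hypothetical.

```python
# Illustrative sketch of a reverse face search: embed faces, then find
# the nearest stored embeddings. NOT Clearview's code; the in-memory
# index is a stand-in for a real, crawler-built database.
import face_recognition  # open-source face-embedding library

index = []  # hypothetical index of (128-d face embedding, source URL) pairs

def add_to_index(image_path, source_url):
    """Embed every face found in an image and remember where it came from."""
    image = face_recognition.load_image_file(image_path)
    for encoding in face_recognition.face_encodings(image):
        index.append((encoding, source_url))

def search(query_path, max_distance=0.6):
    """Return source URLs whose stored faces are close to the query face."""
    query = face_recognition.load_image_file(query_path)
    encodings = face_recognition.face_encodings(query)
    if not encodings:
        return []  # no face detected in the query photo
    distances = face_recognition.face_distance(
        [embedding for embedding, _ in index], encodings[0]
    )
    return [url for (_, url), d in zip(index, distances) if d <= max_distance]
```

At Clearview’s reported scale, the index would hold billions of embeddings behind a fast approximate-nearest-neighbor search rather than a simple list, but the principle is the same.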

Clearview and others say that this app has real benefits for the justice system. It allows police officers to catch criminals who might otherwise be very difficult to catch, perhaps because they have no previous criminal record or because they are located in another jurisdiction.

Clearview highlights the special edge it gives law enforcement in catching criminals like pedophiles, terrorists, and sex traffickers who may not otherwise appear in law enforcement’s existing databases. The company also notes that the service can help exonerate those who have been wrongly convicted.

Clearview says that its service is not a surveillance system. It does not actively keep tabs on your behavior or movements. Instead, it simply crawls the internet and collects images of you that are already publicly available because they are posted on public sites.

What are the privacy concerns?

Hooray for justice for victims of sex trafficking! But some privacy experts are nervous about the threat that Clearview, and other successful facial recognition services, pose to our privacy. The main issues concern:

1. how Clearview gets its database;

2. who can use it;

3. whether it is legal; and

4. the threat it poses to anonymity.

How does Clearview get its database?

One worry among privacy advocates is the way that Clearview gets its data: it scrapes public sites for its images. A paraphrase of the company’s position is something like, “You posted it and now it’s in the public domain, so it’s not a problem that we’ve collected it in a database.”
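To give a sense of how low the technical barrier to this kind of collection is, here is a minimal, purely illustrative sketch of pulling the image URLs off a single public page, using the requests and BeautifulSoup libraries. The URL is hypothetical, and (as discussed below) scraping like this typically violates sites’ terms of service.

```python
# Illustrative only: collect the image URLs on one public web page.
# A real crawler would operate at vastly larger scale, and this kind
# of scraping is typically forbidden by sites' terms of service.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def collect_image_urls(page_url):
    """Return absolute URLs for every <img> tag on a public page."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [urljoin(page_url, img["src"])
            for img in soup.find_all("img")
            if img.get("src")]

# Hypothetical public page:
for url in collect_image_urls("https://example.com/public-profile"):
    print(url)
```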

It’s true that they’ve used publicly accessible websites. But that doesn’t necessarily put them in the clear. Privacy experts point out that there is a question about what kind of consent we are entitled to over how our data is collected and used. We may have consented to a photo of us being seen by friends or friends of friends on Facebook, but that’s not the same as giving consent for it to be used as part of a law enforcement exercise.

Senator Ed Markey of Massachusetts has sent a letter emphasizing that virtually all of the images in the database were collected without their subjects’ permission.

Google and Facebook have also both sent cease and desist letters to Clearview. Their position is that Clearview’s use of these images is a breach of their terms of service.

Most of us did not give consent for Clearview to use our data. The privacy question is: should Clearview be able to use my data and sell it to law enforcement?

Who is using it?

Another concern is about who can use this software. In interviews, Clearview founder Hoan Ton-That talks about the use of the platform for law enforcement. He emphasizes that law enforcement agencies in the US and in Canada are the main sales audience. While he does acknowledge that some of the clients are private banks doing fraud investigation, he emphasizes that all of the customers using the service are trained investigators.

In a blog post, the company has made it clear that this is not a public consumer application; it is for “law enforcement agencies to identify the perpetrators and victims of crimes.”

But you may find yourself wondering: how can we know who is using it? Clearview is a business, is it not? Shouldn’t we expect it to try to maximize its earnings? Why wouldn’t the company lease its software to anyone willing to pay for the service? And how would we know who is on that list?

For instance, despite the claims that its customers are all in law enforcement, BuzzFeed News obtained a leaked client list that included Macy’s as a paying customer, along with Best Buy, the NBA, and at least one sovereign wealth fund in the United Arab Emirates.

And while Hoan Ton-That suggests that the company only sells the service to organizations in Canada and the US, organizations in 26 countries around the world have in fact used it. That apparently includes the Saudi Arabian government, which has a record of repeatedly disregarding human rights.

The use of our data for law enforcement may be one thing; its use by foreign governments with a history of using violence to suppress dissent is another.

Not only that, but the service could also be exploited by hackers. Last month, TechCrunch reported that Clearview had a misconfigured server that allowed anyone to gain access to the platform’s source code, all of its files, and even security tokens. That’s not super reassuring.

Taken together, the uncertainty about whether the platform is being used only by law enforcement agencies under democratic governments leaves open the question of how far we can trust that it will be used only in ways that contribute to justice and the rule of law.

Is Clearview legal?

Earlier, I explored the question of whether we should have control over our data and how it is used. But in some places, the legal framework turns this into a different question: whether we do, in fact, have control over the images of ourselves on the internet.

This is a legal question, and the answer will depend, in part, on where you live. Californians, because of recent privacy legislation (the CCPA), do have the legal right to decide how their images are used. Europe, too, has comprehensive data privacy legislation (the GDPR) that quite clearly gives its residents ownership over their data. Those who live in other jurisdictions may not have the same privacy coverage.

Given the privacy concerns and the varying mosaic of privacy laws, some have asked whether Clearview’s service is actually legal.

There are several states with biometric privacy laws that Clearview may be violating. Facebook, for example, found itself at the center of a legal dispute in 2015 over the way facial recognition was used in its photo tagging system. That system was probably contrary to Illinois’s biometric privacy law, although Facebook settled out of court. The use of Clearview in Illinois, or its collection of photos of people from Illinois, could similarly violate that law.

In Vermont, the state Attorney General has filed a lawsuit against Clearview alleging violations of Vermont’s Consumer Protection Act as well as its Data Broker Law. We’ll have to wait and see whether the court finds that Clearview actually does breach these laws.

The service may also contravene the terms of service of most major social media platforms. YouTube, Facebook (along with Instagram), Twitter, and LinkedIn have all asked Clearview to stop taking photos from their websites. Some of these sites have said that Clearview’s use of photos violates their terms of service, which explicitly forbid scraping.

Taken together, there seem to be real questions about whether the collection of public photos, or the use of this platform by law enforcement agencies, is legal.

The erosion of anonymity

Perhaps the biggest issue, though, is that this technology opens the door to the erosion of public anonymity. Sure, I like Avatar, and I’m not embarrassed to admit that online to a group of other geeky people who may share my taste in children’s animated TV series. But I may also wish for that information not to reach potential employers. Or potential partners (until at least, like, you know, the third date).

It’s not that I want to hide it, but I also may not want anyone walking down the street with access to Clearview to know that about me.

We constantly engage in image management, trying to present ourselves in certain ways to certain people. I can post images on Instagram that I think my friends will like. I might not want a potential employer to see them, not because I’m doing something bad or illegal, but simply because that’s not what I want them to know about me.

This may not seem like a big deal to those of us living in places where we’re protected against discrimination. But those protections are relatively new.

My grandma was born into an Irish Catholic family on the west coast of Canada, just north of Seattle. When she was in her late teens and early twenties, companies would not hire you if you were Catholic. So when she was looking for work, she didn’t mention her Catholic family, or she would lie outright and say she was Protestant. Lying to manage her image is how she got work and kept it in the late 1940s and ’50s.

The problem with the erosion of anonymity is that you lose this ability to manage your image so that you are not discriminated against.

Our technology has gotten really good at inferring a lot about us from the data we post. Machines can now estimate where you lean politically and what your sexual orientation is based on the content you post about yourself. In fact, if you’re a man, AI can reportedly determine your sexuality with 81% accuracy from a single picture of you. That number increases to 91% with five pictures.

At the moment, there are still 28 states in the US where you can be fired for being gay. And society changes—it’s not clear to what extent people with particular political leanings, sexual orientations, or any other feature will be accepted in the future.

Anonymity, and the ability to keep things about yourself to yourself, can be protective. Clearview, like other services that collect and analyze our data, threatens that anonymity.

Is Clearview the only one with successful facial recognition?

Several facial recognition applications have been built before. Clearview is special in part because of how powerful it is.

But it isn’t even that special. Facebook reportedly built a similar app previously. Here’s the thing, though: it did not release it, because of concerns about how it could be used.

Google, too, has refrained from releasing a similar facial recognition tracking system because it could be used “in a very bad way.”

It appears that while other tech companies are certainly capable of creating a system like Clearview, they have refrained from doing so precisely because of the many privacy concerns. What makes Clearview special is that they’ve actually released the system commercially.

What can you do if you’re nervous about Clearview?

Finding a way to balance privacy with the new technologies that collect and analyze our data will continue to be a challenge our societies face. There aren’t any easy answers.

But there are some choices. Californians have succeeded in enacting what is seen as the most comprehensive data protection law in the U.S. Europe also has particularly strong privacy laws. These laws give residents of California and of Europe some choice about whether they want to participate in Clearview: if you live in either of these jurisdictions, you can actually ask Clearview to delete its file on you.

Want to do more? Others have provided lists of ways you can get involved politically to work towards enhancing privacy. You can even do this at the local level; some cities, like San Francisco, have banned government use of facial recognition software.

Clearview isn’t necessarily bad. There’s a lot of potential for platforms like it to help us create a just society. But there are also really good reasons to try to ensure we maintain our privacy as we do it.
