By: Brady Turner

Imagine walking to your local coffee shop. On the ten-minute walk you cover six blocks and pick up your coffee. You pass several police officers and fail to notice the cameras mounted on telephone poles. The government is using surveillance to monitor the route you walked. This type of surveillance, while not new, is developing rapidly, raising privacy concerns in an increasingly modernized world.

            In China, recent surveillance developments include “emotion recognition technology.”[1] This type of technology can be used to monitor human feelings by tracking facial expressions and tone of voice.[2] The technology is not limited to law enforcement; it has also been used to diagnose depression by listening to the voice of a patient in a psychiatric hospital.[3]

            In the United States, related technology is being used for facial recognition, with Clearview AI being particularly notable. The company has developed a facial recognition application: when a user uploads a photograph of an individual, the application returns public photos of that individual.[4] It is also not clear whether these uploaded photos, which by their very nature are sensitive, are stored on servers that adequately protect them.[5] What is concerning is that federal and state law enforcement officers have “only a limited knowledge of how Clearview works,” yet have used the application to solve a variety of cases involving shoplifting, murder, and child exploitation.[6]

            As of this blog post, more than 600 law enforcement agencies had begun using Clearview between approximately 2019 and 2020.[7] Clearview is not limiting the application to law enforcement, however; it has also licensed the application to some companies for “security purposes.”[8] Clearview is now being used by over 2,400 United States law enforcement agencies, representing rapid growth in use.[9] Despite United States law enforcement agencies’ embrace of the technology, Canada has recently declared the application a tool of mass surveillance and its use illegal.[10] Many law enforcement agencies and organizations in Canada used the application prior to the statement released by Canada’s privacy commissioner, Daniel Therrien.[11] Clearview is currently at a standstill with the Canadian government: it ceased operations in Canada in July of 2020, but has made no plans to delete Canadians from its database.[12]

            Turning back to the United States, Clearview’s application may be heading to the Supreme Court for review. Illinois has passed a law called the Biometric Information Privacy Act (BIPA).[13] BIPA applies to Illinois residents and broadly limits what companies are allowed to do with data from a person’s body, such as face scans and fingerprints.[14] Generally, companies cannot use biometric details without a person’s knowledge or consent, and the law allows both individuals and the state to sue companies that violate it.[15]

            Clearview is currently engaged in a class action lawsuit based on alleged violations of BIPA.[16] In an unusual turn of events, the plaintiffs in the case argue that they do not have standing, and it is Clearview that is fighting for the plaintiffs’ right to sue, as it was the defendant who removed the case to federal court.[17] In a recent decision, the Seventh Circuit ruled that the plaintiffs do not have Article III standing to pursue their case in federal court.[18] The Court concluded that the plaintiffs alleged “only a general, regulatory violation, not something that is particularized to them and concrete.”[19] The case has been remanded to state court for further litigation.[20] However, Clearview has asked the Seventh Circuit to stay its mandate because Clearview plans to file a petition to the Supreme Court.[21] Clearview appears poised to argue that the alleged violation is a concrete and individualized injury.[22] While the motivations behind the petition may be multifaceted, this may be an attempt by Clearview to receive guidance from the Supreme Court for future litigation. If the petition is granted, a Supreme Court ruling on Article III standing would give Clearview a better understanding of the legal landscape as its technology is used in various states.

            The use of surveillance technologies such as emotion recognition and facial recognition is a reality that many people are unaware of. The mass gathering of data by these technologies, however, raises real privacy concerns about how that data is collected and stored. In the coming years it seems inevitable that companies like Clearview will be embroiled in legal challenges over their rights to collect, store, and use the data obtained through their software.


[1] Jackie Salo, China Using ‘Emotion Recognition Technology’ for Surveillance, New York Post (Mar. 4, 2021), https://nypost.com/2021/03/04/china-using-emotion-recognition-technology-in-surveillance/.

[2] Id.

[3] Huang Lanlan and Lin Xiaoyi, China Leads in Emotion Recognition Tech, Reinforces Privacy Rules to Tackle Abuse, Global Times (Mar. 3, 2021), https://www.globaltimes.cn/page/202103/1217212.shtml.

[4] Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, New York Times (Jan. 18, 2020), https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html.

[5] Id.

[6] Id.

[7] Id.

[8] Id.

[9] Kashmir Hill, Clearview AI’s Facial Recognition App Called Illegal in Canada, New York Times.

[10] Id.

[11] Id.

[12] Id.

[13] Shira Ovide, The Best Law You’ve Never Heard Of, New York Times (Feb. 23, 2021).

[14] Id.

[15] Id.

[16] Alison Frankel, The ‘Bizarre’ Twist in Clearview AI’s Promised SCOTUS Petition in Biometric Privacy Case, Reuters (Feb. 25, 2021), https://www.reuters.com/article/legal-us-otc-clearview-idUSKBN2AP2PW.

[17] Thornley v. Clearview AI, Inc., 984 F.3d 1241, 1242 (7th Cir. 2021).

[18] Thornley, 984 F.3d at 1248.

[19] Id.

[20] Id.

[21] Frankel, supra note 16.

[22] Id.