
Cameras

In his novel 1984, George Orwell envisioned a dystopian future in which two-way telescreens would be able to watch people constantly. His dystopian vision wasn’t far off from the realities of modern life, in which cameras are seemingly everywhere: in retail stores, on streets, and on devices within the home. These cameras represent a significant surveillance threat, one that is likely to worsen with the development of artificial intelligence algorithms that can process video data more efficiently.


Public Cameras

Closed-Circuit Television (CCTV) cameras are ubiquitous in public places. These cameras have long been used for loss prevention in retail settings, both providing evidence of retail theft and acting as a deterrent to shoplifters. A recent trend is that these camera systems are becoming more sophisticated. Instead of merely recording activities in case a crime is committed, camera technology is increasingly being used to analyze personal behavior and draw inferences.1

Of course, CCTV systems are not limited to retail stores. They can be deployed by an owner of any type of private property. Such systems are also widely deployed by government agencies to monitor public property, including roads, parks, and other public spaces. Government-owned cameras can even be attached to light poles by police departments in order to conduct 24-hour warrantless surveillance of a person on his or her own property, at least to the extent of what is visible from the public street.2

Facial Recognition

Sophisticated algorithms can process video data in real time and identify human faces in the video. By measuring metrics that are unique to each person’s face – such as the shape and position of the eyes, nose, and mouth, among other characteristics – these algorithms can differentiate one person’s face from another. Moreover, these algorithms can utilize databases containing facial features to match a person’s face to their identity, allowing a person to be recognized in public.3 Use of these systems raises significant privacy concerns that are not addressed by current privacy laws or regulations.4 Another concern is that facial recognition algorithms are markedly less accurate for significant segments of the population, including women and Black people, raising the risk of misidentification.5
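The measurement step above typically reduces a face to a numeric feature vector (an “embedding”), and two faces are judged to belong to the same person when their vectors are close together. The sketch below illustrates this comparison only; the 4-dimensional vectors, the names, and the 0.6 threshold are all made up for illustration (real systems use vectors with 128 or more dimensions and tune the threshold against labeled data).

```python
import math

def embedding_distance(a, b):
    """Euclidean distance between two face-embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_same_person(a, b, threshold=0.6):
    """Declare a match when two embeddings are closer than the threshold.

    The threshold is illustrative; tightening it reduces false matches
    at the cost of more missed matches, and vice versa.
    """
    return embedding_distance(a, b) < threshold

# Hypothetical embeddings; a real model would compute these from images.
alice_enrolled = [0.11, 0.52, 0.33, 0.80]
alice_probe    = [0.13, 0.50, 0.35, 0.78]
bob_probe      = [0.90, 0.10, 0.72, 0.15]

print(is_same_person(alice_enrolled, alice_probe))  # True
print(is_same_person(alice_enrolled, bob_probe))    # False
```

This 1:1 comparison (verification) is the building block; the surveillance concern arises when it is repeated against a large database of enrolled identities.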

While police and government use of facial recognition carries its own set of concerns, the private sector is equally able to misuse this technology. For example, image processing technology can be used to detect whether a retail employee and a customer seem too “friendly” with each other, allowing stores to limit discounts given at an employee’s discretion.6 Private companies can also use facial recognition to deny services to customers. In 2022, Radio City Music Hall denied entry to a mother and her daughter because the mother’s law firm was suing its parent company over an unrelated matter.7

Both public sector (including police) and private sector use of facial recognition requires an extensive database to map facial features to individuals. Private sector companies like Clearview AI mine social media and other Internet sites to find photos of people matched to their identities.8 As is customary for “AI” companies, Clearview AI steals intellectual property from other sites, generates model data from the stolen information, and profits by selling the resulting database.9 Although Clearview AI claims to screen its customers carefully, company documents indicate a willingness to give access to right-wing political organizations and investors.10
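Once such a database exists, recognizing a stranger in public becomes a 1:N lookup: compare the captured face against every enrolled identity and report the closest one, if it is close enough. A minimal sketch, again with made-up names, vectors, and threshold:

```python
import math

def distance(a, b):
    """Euclidean distance between two face-embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, gallery, threshold=0.6):
    """Return the closest enrolled identity, or None if nobody is close.

    `gallery` maps names to embeddings harvested elsewhere; everything
    here is hypothetical. Real systems search millions of entries with
    approximate nearest-neighbor indexes rather than a linear scan.
    """
    best_name, best_dist = None, float("inf")
    for name, embedding in gallery.items():
        d = distance(probe, embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist < threshold else None

gallery = {
    "alice": [0.11, 0.52, 0.33, 0.80],
    "bob":   [0.90, 0.10, 0.72, 0.15],
}
print(identify([0.13, 0.50, 0.35, 0.78], gallery))  # alice
print(identify([0.50, 0.00, 0.00, 0.00], gallery))  # None
```

The privacy implication falls out of the design: the larger the scraped gallery, the more strangers the lookup can name.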

Device Cameras

Although Clearview AI might be mining the Internet for photos that people post willingly, another source of video surveillance comes from the mobile devices that we carry around on a day-to-day basis. Our cell phones all have cameras, and these cameras can be remotely activated by malware running on the phone. Targeted malware, such as Pegasus from the Israeli company NSO Group, can deliver specialized rootkits that infect the Android and iOS operating systems at a low level and enable access to the camera.11 Governments can purchase this malware and deploy it against specific individuals using a variety of methods. The French government has even gone so far as to allow its police to remotely activate phone cameras and microphones during routine criminal investigations.12

For the average non-targeted individual, unintentional malware infections are the more common way that a phone’s camera can be accessed. Blackshades is but one example of malware that can activate a phone camera remotely, and it has been used in the wild for sextortion.13 Malware finds its way onto phones with relative ease: it has been found in both the Google Play Store14 and the Apple App Store.15 The malware doesn’t even have to be running on a person’s own phone to compromise that person’s privacy. If a person is on a video call in which the other party has an infected device, the malware can steal camera information by capturing the other party’s screen.16

Mitigation

It is difficult to mitigate tracking and facial recognition threats from cameras installed in public places, since wearing an effective disguise is generally impractical, may trigger social stigmas, and might even be illegal in some jurisdictions. For example, in South Carolina, it is illegal for a person over the age of 16 to wear a “mask or other device which conceals his identity” on a public street.18 This law dates back at least to 1951, well before the invention of modern surveillance technology.

Although concealing one’s identity in public might be problematic, avoiding cameras at home is another matter. One should never install a security camera indoors or positioned so that it can see inside the house, since these devices can be compromised, allowing arbitrary people on the Internet to watch the interior.17 Security cameras should instead be installed outdoors, facing away from the house. Devices with cameras, such as laptops, should have either a physical disconnect switch for the camera or some kind of cover to put over the lens when the camera isn’t intentionally in use. Tape works well on devices whose cameras will never be used. Phones can be kept in a pocket or otherwise placed so as to block the main camera and prevent the front-facing camera from seeing anything useful.

To prevent law enforcement or other surveillance cameras from being used to view the inside of the home without a warrant, blinds and/or curtains should be installed on all windows. They should be kept closed at night, when interior lighting makes the home visible from outside and no daylight reflections block the view. During the day, extra protection is possible while still admitting light by tilting the blind slats open instead of raising the blinds. Alternatively, a layer of sheer curtains can be used during the day on windows without blinds.

Finally, you must always assume that any photo taken with your phone could be transmitted either by the phone itself or by some installed app to a cloud-based service. If this service further shares the photo, or if this service has a data breach, your photos will become publicly available on the Internet. Thus, you should never use your phone’s camera to take any nude or otherwise compromising photos, even if these photos do not show your face, as a number of celebrities learned the hard way.19 Even apps that supposedly auto-delete photos, like Snapchat, are not immune to compromise.20 It is probably best to avoid taking and sharing compromising photos entirely, but if you must take them, use a dedicated camera that does not have an Internet connection.

Notes and References


  1. Bill Zalud. “9 Ongoing Trends for Surveillance Analytics.” Security Magazine. February 1, 2013. 

  2. Nathan Freed Wessler. “Warrantless Pole-Camera Surveillance by Police is Dangerous. The Supreme Court Can Stop It.” American Civil Liberties Union. November 23, 2022. 

  3. Center for Democracy and Technology. Seeing Is ID’ing: Facial Recognition and Privacy. January 20, 2012. 

  4. U.S. Government Accountability Office. Facial Recognition Technology: Commercial Uses, Privacy Issues, and Applicable Federal Law. GAO-15-621. July 30, 2015. 

  5. Tom Simonite. “Photo Algorithms ID White Men Fine—Black Women, Not So Much.” Wired. February 6, 2018. 

  6. Todd Feathers. “Facial Recognition That Tracks Suspicious Friendliness Is Coming to a Store Near You.” Gizmodo. November 1, 2024. 

  7. Sarah Wallace. “Face Recognition Tech Gets Girl Scout Mom Booted From Rockettes Show — Due to Where She Works.” NBC New York. December 19, 2022. 

  8. Courtney Linder. “This App Is a Dangerous Invasion of Your Privacy—and the FBI Uses It.” Popular Mechanics. January 22, 2020. 

  9. Katherine Tangalakis-Lippert. “Clearview AI scraped 30 billion images from Facebook and other social media sites and gave them to cops: it puts everyone into a ‘perpetual police line-up’.” Business Insider. April 3, 2023. 

  10. Ryan Mac, Caroline Haskins, and Logan McDonald. “Secret Users Of Clearview AI’s Facial Recognition Dragnet Included A Former Trump Staffer, A Troll, And Conservative Think Tanks.” BuzzFeed. March 25, 2020. 

  11. JD Rudie, Zach Katz, Sam Kuhbander, and Suman Bhunia. “Technical Analysis of the NSO Group’s Pegasus Spyware.” 8th Annual Conference on Computational Science and Computational Intelligence (CSCI‘21). Las Vegas, NV, December 15-17, 2021. 

  12. Jon Fingas. “French Assembly passes bill allowing police to remotely activate phone cameras and microphones for surveillance.” Engadget. July 6, 2023. 

  13. Chris Baraniuk. “Webcam hacker spied on sex acts with BlackShades malware.” BBC News. October 8, 2015. 

  14. Bill Toulas. “Over 200 malicious apps on Google Play downloaded millions of times.” BleepingComputer. October 15, 2024. 

  15. Brendan Hesse. “Great, Now the Apple App Store Has Malware Too.” Lifehacker. August 8, 2022. 

  16. Amber Bouman. “Malicious iPhone apps are spreading screenshot-reading malware on the Apple App Store — how to stay safe.” Tom’s Guide. February 6, 2025. 

  17. Shodan Search Engine. 

  18. Section 16-7-110. South Carolina Code of Laws. 

  19. “What is The Fappening? Understanding the Controversial Event in Celebrity Culture.” The Maine Chronicle. October 14, 2024. 

  20. Kim Lachance Shandrow. “‘The Snappening’ Really Happened: 100,000 Snapchat Photos and Videos Leak Online.” Entrepreneur. October 13, 2014. 

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.