Browser Extensions

While some browsers have native capabilities for blocking trackers and other privacy threats, the major browsers normally require extensions or add-ons to reduce the amount of data that a website might collect. At the same time, installing too many extensions can create privacy issues by increasing the entropy of the browser fingerprint.

Video Lecture


Watch at Internet Archive

Nature of Extensions

A browser extension is a piece of add-on software that extends the browser’s functionality or changes its behavior or appearance in some way. Extensions permit the community to add features to a browser without having to wait for the browser vendor to add them (or fork the code for the whole browser to make a custom derivative). Although all mainstream browsers support some kind of extension mechanism, most extensions are developed by third parties unrelated to the browser maker.

To make it easy for users to find and install extensions, each mainstream browser has some type of extension repository. Mozilla has the Mozilla Add-ons repository for Firefox and Thunderbird.1 Google has the Chrome Web Store.2 Microsoft maintains a repository site for Edge Add-ons.3 Each of these browser vendors acts as a gatekeeper for its repository. On the positive side, a vendor can remove extensions that are found to be malicious or harmful to users. However, it can also censor any extensions with which it disagrees.

Extension APIs

Each browser provides an Application Programming Interface (API) that extension authors use to control the browser, modify its functionality, or change its appearance. While browser makers tend to support similar APIs to make porting extensions between browsers easier, there can be differences from one browser to another. In any case, extension capabilities are fundamentally restricted by what each browser API supports. If a browser maker decides to discontinue an API, extension authors must either adapt their extensions to whatever other APIs are available or discontinue their extensions for that particular browser.
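
As a concrete, deliberately minimal illustration of what these APIs look like, consider the sketch below. It uses the chrome.* extension API found in Chromium-based browsers; the extension itself and its behavior are hypothetical, and a real extension would also ship a manifest file declaring this script and the permissions it needs.

    // background.ts -- hypothetical background script for a minimal extension.
    // Assumes the manifest declares this file as the background service worker
    // and requests the "tabs" permission; uses the @types/chrome definitions.

    // Log the URL of whichever tab the user switches to. chrome.tabs is one of
    // many extension APIs; Firefox exposes a largely compatible browser.tabs.
    chrome.tabs.onActivated.addListener(async (activeInfo) => {
      const tab = await chrome.tabs.get(activeInfo.tabId);
      console.log(`Active tab is now: ${tab.url ?? "(unknown)"}`);
    });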

As of early 2023, Google is trying to change an important API that a number of Chrome extensions use; this change is part of an update known as Manifest Version 3 (Manifest V3).4 With this update, Google intends to remove the blocking form of the general-purpose webRequest event that ad blockers and other privacy extensions utilize.5 In its place, a new mechanism called declarativeNetRequest is being added, which enforces an arbitrary (and relatively small) limit on the number of blocking rules, and therefore URL patterns, that an extension can register.6 While Google claims that declarativeNetRequest will improve user privacy by preventing extensions from monitoring network traffic as easily, the Electronic Frontier Foundation (EFF) views this change as an attack on privacy by deliberately reducing ad blocker effectiveness.7
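
The sketch below contrasts the two mechanisms under stated assumptions (a made-up tracker domain; TypeScript with the chrome.* API typings). Under Manifest V2, an extension can run arbitrary code for every request and cancel it; under Manifest V3, it must declare rules up front, and the browser applies them on the extension’s behalf.

    // Manifest V2 style: a blocking webRequest listener can inspect each
    // request at run time and decide programmatically whether to cancel it.
    chrome.webRequest.onBeforeRequest.addListener(
      (details) => {
        // Arbitrary logic is possible here: list lookups, heuristics, etc.
        return { cancel: details.url.includes("tracker.example") };
      },
      { urls: ["<all_urls>"] },
      ["blocking"]
    );

    // Manifest V3 style: the extension ships declarative rules (typically in a
    // rules.json file referenced from the manifest). The browser evaluates the
    // rules itself, and the total number of rules an extension may register is
    // capped. The object below shows the shape of one such rule.
    const exampleRule = {
      id: 1,
      priority: 1,
      action: { type: "block" },
      condition: { urlFilter: "||tracker.example^", resourceTypes: ["script"] },
    };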

Due to backlash from extension developers and others in the community, Google has delayed the switch to Manifest V3 repeatedly.8 However, the company still insists that the change will be made, and there appears to be some merit to the EFF’s criticism, since Google is an advertising technology company at its core. Google’s own marketing blog discusses methods for discouraging the use of ad blockers, for example.9 From a technical perspective, the proposed extension API change could potentially affect the entire browser engine (Blink+V8), depending on the level at which it is ultimately implemented.

Fingerprinting Extensions

It is not advisable to install more extensions than are strictly needed to improve privacy beyond the baseline state provided by the browser. Each additional extension potentially increases the amount of entropy available for browser fingerprinting. Proof-of-concept code exists to enumerate installed extensions, at least for the Chrome browser.10 Since each user typically selects their own extensions from a large catalog of available choices, the amount of entropy available from knowing which extensions are installed can be significant.
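
To make the enumeration idea concrete, here is a hedged sketch of the general technique such proof-of-concept code relies on (the extension ID and resource path below are placeholders, not taken from the cited project): a page can probe for a specific extension by attempting to fetch one of the resources the extension exposes to web pages and observing whether the request succeeds.

    // Illustrative only -- not the code from the cited proof of concept.
    // Real fingerprinting scripts iterate over a large catalog of
    // (extension ID, web-accessible resource path) pairs for popular extensions.
    async function extensionPresent(id: string, resource: string): Promise<boolean> {
      try {
        // Web-accessible resources of an installed extension can be fetched
        // from ordinary page JavaScript via a chrome-extension:// URL.
        const response = await fetch(`chrome-extension://${id}/${resource}`);
        return response.ok;
      } catch {
        return false; // The fetch fails when the extension is not installed.
      }
    }

    // Placeholder ID and path; a real probe would use values taken from the
    // extension's published package.
    extensionPresent("abcdefghijklmnopabcdefghijklmnop", "icons/icon.png")
      .then((found) => console.log("extension installed:", found));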

Although installing a large number of extensions is a bad idea from a fingerprinting perspective, there are privacy implications to forgoing extensions altogether. Even with the limited tracking protection features built into some modern browsers, quite a bit of tracking and privacy-invasive code will still be able to run. Some type of mitigation for this tracking code is needed to improve user privacy, and this mitigation is generally delivered in the form of a browser extension. It is therefore advisable to install either a content-blocking extension or an extension that can perform a barrage attack on the tracking companies.

Content Blocking

A content blocker stops certain resources from loading, running, or remaining visible on a Web page. Content blocking is a generalization of advertisement (ad) blocking that can be extended to block material other than advertising banners and images. Over the years, a number of different ad blocking extensions have been developed, several of which have become popular. Some of these extensions have been implicated in browser performance issues, others have been found to be ineffective, and still others have been mired in controversy.

One extension that has been consistently available and has performed reliably is uBlock Origin.11 This open source extension is available for several different browsers and has avoided the “acceptable ads” program that some other extensions support and the controversy surrounding it. In testing, uBlock Origin does not appear to have a significant performance impact, and it can operate both with ad blocking lists and in an advanced mode where the user enables each site feature and capability manually. While tedious, the latter approach enforces the principle of least privilege on websites, allowing the minimum amount of code to run. It is interesting to note that the LibreWolf browser, which is a privacy-focused rebuild of Mozilla Firefox, includes uBlock Origin by default.
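
For readers unfamiliar with what an ad blocking list contains, the snippet below shows a few rules in the filter-list syntax that uBlock Origin and similar blockers consume, wrapped here as string literals; the domains are made up.

    // A few illustrative rules in filter-list syntax (hypothetical domains).
    const exampleFilters: string[] = [
      "||ads.example.com^",          // block network requests to this host
      "##.sponsored-banner",         // cosmetic filter: hide matching elements
      "@@||example.org/player.js^",  // exception: allow a resource a site needs
    ];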

As one might expect with any technology that affects someone’s profit margins, there is organized opposition to content blocking. The advertising industry argues that ad and content blockers cost it revenue.12 Those making this argument are quick to present hypothetical dollar figures “lost” to content blocking, which is a dubious exercise at best, since it assumes that users who block ads would choose to click on those ads if they were not blocked. Advertisers also argue that much of the “free” content on the Web is ad-supported, and they assert that much of this content will become paywalled if ad revenue continues to drop due to the proliferation of content blocking. On the flip side, there is a lot of low-quality, search-engine-optimized, advertiser-supported content on the Web, which makes it questionable how much revenue the producers of this poor content would actually earn from subscriptions if they had to compete in a market setting instead of relying on an advertising network for revenue.

Perhaps the greatest problem with the “lost revenue” argument is that the advertising technology industry created this mess in the first place. Tracking users for the purpose of targeting ads is an invasion of privacy. Even the idea of “targeting” implies psychological manipulation, since it is designed to produce an action in the targeted person (chiefly, spending money). In addition, the targeted ads have a tendency to be both intrusive and obnoxious, marking a significant departure from the relatively tasteful, unobtrusive, and one-way newspaper and magazine advertising to which online ads are sometimes compared. Furthermore, marketers in this industry have demonstrated a sense of entitlement by assuming that unsuspecting people should be tracked by default and have to opt out of targeted advertising. If the benefit of all this tracking and targeting is really for the users, let them opt in after giving informed consent. Content blocking is simply one way that users can fight back against decisions that are being made for them without their informed consent.

Barrage Attacks

Content blocking is one approach to improving privacy: it works by preventing the download or execution of code that might reduce the user’s privacy. However, a completely opposite approach is available that would have much the same effect if widely deployed. This approach is called a barrage attack and is an example of an anti-forensic technique. In a barrage attack, the objective is to pollute the adversary’s gathered information with useless noise, raising the cost of finding the signal (the desired information) in that noise to a level that is unprofitable or impractical.

In the case of online tracking content, an effective type of barrage attack is to allow the tracking code to run but to automate interaction with it so that the tracking profiles fill with meaningless data. One way to accomplish this task is to use a browser extension that automatically clicks on every ad on every visited website, making the user appear to be interested in everything. The copious amounts of collected data thereby become harmful to the advertising technology companies: there are costs associated with collecting, storing, and mining the data, and the results of extracting information from intentionally corrupted data will not be commercially useful. In many respects, this approach is more detrimental to the tracking and advertising companies than content blocking, since it imposes actual costs on those companies in addition to lost revenue.
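
As a deliberately simplified sketch of the idea (this is not how AdNauseam is actually implemented, and the ad-detection selectors are placeholders), a content script could locate likely ad links and silently request their destinations, so that a click is registered with the ad network without disturbing the user.

    // content-script.ts -- illustrative barrage-attack sketch only.
    // A real extension would identify ad elements using curated filter lists;
    // the selectors below are crude placeholders.
    const likelyAds = document.querySelectorAll<HTMLAnchorElement>(
      'a[href*="adclick"], a[href*="sponsored"]'
    );

    likelyAds.forEach((ad) => {
      // Request the ad's destination in the background so the "click" is
      // recorded by the ad network while the user never leaves the page.
      fetch(ad.href, { mode: "no-cors", credentials: "include" }).catch(() => {
        // Ignore network errors; the only goal is to generate noise.
      });
    });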

One browser extension that implements this type of barrage attack is AdNauseam, which suppresses the visible display of advertisements but clicks on each one of them in the background.13 Unsurprisingly, Google has censored this extension from the Chrome Web Store and has even taken steps to prevent it from being sideloaded into the Chrome browser.14 While it makes commercial sense for Google to take this action, considering that their advertising business would have to pay sites for each of the phony ad clicks, the censorship of this extension is a good example of the inherent conflict of interest that occurs when an advertising technology company is also the primary developer of a popular Web browser.

References and Further Reading


  1. Mozilla Add-ons

  2. Chrome Web Store

  3. Microsoft Edge Add-ons

  4. Chrome Developers. “Welcome to Manifest V3.” September 28, 2022. 

  5. Chrome Developers. “chrome.webRequest.” 

  6. Chrome Developers. “chrome.declarativeNetRequest.” 

  7. Alexei Miagkov and Bennett Cyphers. “Google’s Manifest V3 Still Hurts Privacy, Security, and Innovation.” Electronic Frontier Foundation. December 14, 2021. 

  8. Thomas Claburn. “Google halts purge of legacy ad blockers and other Chrome Extensions, again.” The Register. April 1, 2023. 

  9. Varun Chirravuri. “Helping publishers recover lost revenue from ad blocking.” The Keyword (Google Blog). April 16, 2018. 

  10. Extension Fingerprints

  11. uBlock Origin

  12. “Countering the revenue loss caused by ad blockers.” Digital Content Next. August 12, 2020.

  13. AdNauseam

  14. AdNauseam. “AdNauseam banned from the Google Web Store.” January 5, 2017. 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.