International Workshop
on Obfuscation:
Science, Technology, and Theory
April 7-8, 2017  •  New York University

OBFUSCATION WORKSHOP REPORT

Obfuscation and the Threat of Centralized Distribution

Daniel C. Howe, School of Creative Media, City University of Hong Kong

Earlier this year, Helen Nissenbaum, Mushon Zer-Aviv, and I released an updated version of AdNauseam with a number of new features. AdNauseam is the adblocker that clicks every ad in an effort to obfuscate tracking profiles and inject doubt into the economics driving advertising-based surveillance. Soon after the release, we learned that Google had banned AdNauseam from its store, where it had been available for the past year. We’ve since learned that Google was disallowing users from manually installing or updating AdNauseam on their Chrome browser.

The fact that the distribution of open-source extensions is now largely in the hands of a few multinational corporations, operating with little oversight, highlights the threat of recent moves toward centralized distribution. Whether or not you agree with AdNauseam’s approach, it is chilling to realize that Google can quietly make one’s extensions and data disappear at its whim. Today it is a privacy tool that is disabled; tomorrow it could be your photo app, chat program, or even password manager. And you don’t simply lose the app, you lose your stored data as well: photos, chat transcripts, passwords. For developers, who incidentally must pay a fee to post items in the Chrome store, this should give pause. Not only can your software be banned and removed without warning, but all ratings, reviews, and statistics are deleted as well.

When we wrote Google to ask the reason for the removal, they initially responded that AdNauseam had breached the Web Store’s Terms of Service, stating that “An extension should have a single purpose that is clear to users.” Only months later did Google admit the actual reason for the block: that AdNauseam was interfering with their ad networks. In Google’s final official response, Senior Policy Specialist Dr. Michael Falgoust confirmed that: “[AdNauseam] appears to simulate user behavior by sending automated clicks in such a way that may result in financial harm to third party systems such as advertising networks.” As one could also claim economic harms from adblockers (which are, as yet, not blocked in the Chrome store), we are left to speculate whether there might be other reasons behind the takedown. Our guess is that part of Google’s antipathy toward AdNauseam can be traced to a new feature: specifically, our built-in support for the EFF’s Do Not Track mechanism [1].

For anyone unfamiliar, this is not the ill-fated DNT of yore, but a new, machine-verifiable (and potentially legally binding) assertion on the part of websites that commit to not violating the privacy of users who send the Do-Not-Track header. A new generation of blockers, including the EFF’s Privacy Badger and AdNauseam, has support for this mechanism enabled by default, which means they don’t block ads and other resources from DNT sites and, in the case of AdNauseam, don’t simulate clicks on these ads.
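The mechanism described above has two sides: the user’s browser sends the `DNT: 1` request header, and a committing site publishes the EFF’s fixed policy text at the well-known URL `/.well-known/dnt-policy.txt`. The following is a minimal sketch of that logic in Python; the function names (`dnt_policy_url`, `should_exempt`, etc.) are illustrative, not AdNauseam’s actual API, and real clients would fetch and verify the policy text over the network.

```python
# Sketch of the EFF Do Not Track handshake described above.
# Assumption flagged: helper names are hypothetical; only the header
# name ("DNT") and the well-known policy path come from the EFF spec.

DNT_POLICY_PATH = "/.well-known/dnt-policy.txt"  # fixed location per EFF's DNT policy


def dnt_policy_url(domain: str) -> str:
    """URL where a committing site hosts the machine-verifiable policy."""
    return f"https://{domain}{DNT_POLICY_PATH}"


def request_headers(user_opts_out: bool) -> dict:
    """Headers a DNT-aware client attaches to each request."""
    headers = {"User-Agent": "example-client/0.1"}
    if user_opts_out:
        headers["DNT"] = "1"  # the user's machine-readable opt-out signal
    return headers


def should_exempt(site_posts_policy: bool, user_opts_out: bool) -> bool:
    """A blocker exempts a site's ads (no blocking, no simulated clicks)
    only when the user sends DNT *and* the site has posted the policy."""
    return user_opts_out and site_posts_policy
```

In this sketch, a blocker would call `should_exempt(...)` per site: a site that has not committed to the policy is treated as before, while a committing site is left alone, which is the adoption incentive discussed below.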

So why is this so threatening to Google? Perhaps because it could represent a real means for users, advertisers, and content-providers to move away from surveillance-based advertising. If enough sites commit to Do Not Track, there will be significant financial incentive for advertisers to place ads on those sites, and these too will be bound by DNT, as the mechanism also applies to a site’s third-party partners. And this could possibly set off a chain reaction of adoption that would leave Google, which has committed to surveillance as its core business model, out in the cold.

But wait, you may be thinking, why did the EFF develop this new DNT mechanism when there is AdBlock Plus’ “Acceptable Ads” program, which Google and other major ad networks already participate in? That’s because there are crucial differences between the two. For one, “Acceptable Ads” is pay-to-play; large ad networks pay Eyeo, the company behind Adblock Plus, to whitelist their sites. But the more important reason is that the program is all about aesthetics—so-called “annoying” or “intrusive” ads—which the ad industry would like us to believe is the only problem with the current system. An entity like Google is fine with “Acceptable Ads” because it has more than enough resources to pay for whitelisting [2]. Further, it is quite willing to make its ads more aesthetically acceptable to users (after all, an annoyed user is unlikely to click) [3]. What it refuses to change—though we hope we’re wrong about this—is its commitment to surreptitious tracking on a scale never before seen. And this, of course, is what we, the EFF, and a growing number of users find truly “unacceptable” about the current advertising landscape.

Notes

Note: a version of this argument was published on the “Freedom to Tinker” blog as “AdNauseam, Google, and the Myth of the ‘Acceptable Ad’”.

[1] This is indeed speculation. However, as noted above, the stated reason for Google’s ban of AdNauseam does not hold up to scrutiny.

[2] In September of this year, Eyeo announced that it would partner with a UK-based ad-tech startup called ComboTag to launch the “Acceptable Ads Platform,” through which it would also act as an ad exchange, selling placements for “Acceptable Ad” slots. Google, as might be expected, reacted negatively, stating that it would no longer do business with ComboTag. Some assumed that this might signal an end to Google’s participation in “Acceptable Ads” as well. However, this does not appear to be the case. Google still comprises a significant portion of the exception list on which “Acceptable Ads” is based and, as one ad industry observer put it, “Google is likely Adblock Plus’ largest, most lucrative customer.”

[3] Google is also a member of the “Coalition for Better Ads,” an industry-wide effort which, like “Acceptable Ads,” focuses exclusively on issues of aesthetics and user experience, as opposed to surveillance and data profiling.


Sponsored by:

NYU Steinhardt

International Program and Organizing Committee:

Paul Ashley, Anonyome Labs
Benoît Baudry, INRIA, France
Finn Brunton, New York University
Saumya Debray, University of Arizona
Cynthia Dwork, Harvard University
Rachel Greenstadt, Drexel University
Seda Gürses, Princeton University
Anna Lysyanskaya, Brown University
Helen Nissenbaum, Cornell Tech & New York University
Alexander Pretschner, Technische Universität München
Reza Shokri, Cornell Tech