
How "Device Fingerprinting" Tracks You Without Cookies, Your Knowledge, or Consent

And not even Tor can help you.
Uh, not exactly. Image via CJ Isherwood/Flickr

The more closely you examine online privacy, the more it becomes clear that you can’t get online without giving something away.

Even if there are measures you can take, like enabling "Do Not Track" in your browser or switching to Tor, and even with legal protections in place, like those restricting cookies, loading a webpage is a give-and-take, and what your browser gives away through JavaScript or Flash can be enough to identify and track you.


A study out this week from KU Leuven-iMinds researchers confirms this reality: tracking users without their knowledge or consent via hidden scripts that uncover the users' "device fingerprint," they found, is far more widespread than previously thought.

Web applications need information about the device they're running on so they can present content correctly: the right dimensions, compatible media, a font you actually have, and so on.

"Web-based device fingerprinting" is the process of collecting enough of that information through the browser to perform stateless, which is to say cookie-free, device identification that is, for practical purposes, unique. With the right information, private companies can collect these fingerprints, store them, and use them to track the device across the web.

In 2010, the Electronic Frontier Foundation's Peter Eckersley demonstrated that the "benign characteristics of a browser's environment" it transmits upon a website's request (things like the browser's version, its screen dimensions, its list of plugins, and its list of installed fonts) are enough to create a unique, device-specific fingerprint. Of the roughly half-million users with Java or Flash who visited panopticlick.eff.org, 94.2 percent could be identified and tracked without the need for browser or Flash cookies.
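Conceptually, a fingerprinting script needs only a few lines of client-side code. The TypeScript sketch below is illustrative rather than any particular tracker's implementation: it reads a handful of attributes the browser hands over anyway (user agent, screen dimensions, language, timezone, plugin names) and hashes them into a stable, cookie-free identifier. Each attribute is mundane on its own; it's the combination that tends to be unique.

```typescript
// A minimal sketch of stateless fingerprinting in the browser: collect a few
// "benign" attributes the page can read anyway, then hash them into a stable ID.
// The attribute list and hashing step are illustrative, not the study's script.
async function deviceFingerprint(): Promise<string> {
  const attributes = [
    navigator.userAgent,                                      // browser and version
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // screen dimensions
    navigator.language,                                       // preferred language
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // timezone
    Array.from(navigator.plugins, p => p.name).join(","),     // installed plugins
  ].join("||");

  // Hash the concatenated attributes so the tracker stores a compact, stable
  // token instead of the raw values; no cookie is ever written.
  // (crypto.subtle requires a secure context, i.e. an https page.)
  const bytes = new TextEncoder().encode(attributes);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest), b => b.toString(16).padStart(2, "0")).join("");
}

deviceFingerprint().then(id => console.log("fingerprint:", id));
```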

Because it leaves no cookies behind, the tracking is hard to detect and even harder to opt out of. The researchers found that setting your browser to private mode doesn't make a difference either; fingerprinting remains just as easy. And because regulations in the US and Europe limit cookies, advertisers can use fingerprinting as a legal loophole.


Even services that aspire to give you privacy have their limitations and loopholes. Because relatively few people use the privacy-friendly Tor network, even a partial fingerprint can be enough to uniquely identify a user. Tor's developers know this and have tried to build browsers that are indistinguishable from one another. Fonts, however, are operating-system dependent and therefore a fingerprinting vulnerability, so Tor also caps the number of fonts a page can request.
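The font signal Tor guards against can be probed without any font-listing API at all. One common approach, sketched below in TypeScript under the assumption of a canvas-capable browser, is to render the same string with and without a candidate font and compare the measured widths; the candidate list here is just an example.

```typescript
// A sketch of JavaScript-based font detection, one of the OS-dependent signals
// the article mentions: measure a string in "Candidate, monospace" and compare
// it against plain monospace. A different width means the candidate is installed.
function detectFonts(candidates: string[]): string[] {
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");
  if (!ctx) return [];

  const sample = "mmmmmmmmmwwwwwwlli"; // mixed glyph widths exaggerate differences
  const measure = (font: string): number => {
    ctx.font = `72px ${font}`;
    return ctx.measureText(sample).width;
  };

  const baseline = measure("monospace");
  return candidates.filter(f => measure(`"${f}", monospace`) !== baseline);
}

console.log(detectFonts(["Arial", "Helvetica Neue", "Segoe UI", "Ubuntu"]));
```

Run against a long enough candidate list, a check like this yields an OS- and configuration-specific set of installed fonts, which is exactly why Tor limits how many fonts a page may request.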

But the KU Leuven-iMinds study found a way around that cap using certain CSS features. The study's authors note that they informed Tor, and that the next update will patch the hole. Update: A researcher emailed to tell us the hole has been fixed for a few months now and is not an issue with the current release of Tor. Apologies for mixing up the dates.

Fingerprinting is done for security purposes as well as marketing ones, and the companies that run fingerprinting scripts for websites usually specify which kind of service they offer. According to the study:

The spectrum of use cases include fraud detection, protection against account hijacking, anti-bot and anti-scraping services, enterprise security management, protection against DDOS attacks, real-time targeted marketing, campaign measurement, reaching customers across devices, and limiting number of access to services.


And these companies argue that what they do isn’t a violation of your privacy, since they aren’t collecting “Personally Identifiable Information.” Comforting, right? Just like the NSA collecting “only metadata,” it’s still enough.

The study mentions the company MaxMind, which offers online retailers a service to check their customers' fraud scores based on 31 "non-PII" attributes, including "IP address, shipping address, non-salted hashes of email addresses and passwords, and credit card BIN number."
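It's worth spelling out why an unsalted hash of an email address is still identifying even though it isn't, strictly speaking, the address. With no salt, every party that hashes the same address gets the same token, so records can be joined across sites. A small TypeScript sketch (the address and the retailer names are made up for illustration):

```typescript
// Unsalted hashes as identifiers: the same input always yields the same digest,
// so two services that each hash a customer's email end up with a matchable token.
async function sha256Hex(text: string): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", new TextEncoder().encode(text));
  return Array.from(new Uint8Array(digest), b => b.toString(16).padStart(2, "0")).join("");
}

(async () => {
  const email = "alice@example.com";            // hypothetical customer
  const seenByRetailerA = await sha256Hex(email); // computed independently by one site
  const seenByRetailerB = await sha256Hex(email); // and by another
  // Identical tokens, so a shared fraud-scoring service can link the two records
  // without ever handling the raw address.
  console.log(seenByRetailerA === seenByRetailerB); // true
})();
```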

So it's mostly legal, easy to do, and hard to stop. It's also more common than the researchers expected: looking at the Internet's top 10,000 websites, they discovered that 145 of them (almost 1.5 percent) use Flash-based fingerprinting, and of the top 1 million sites, 404 were using JavaScript-based fingerprinting.

These might not seem like huge numbers, but most sites can simply rely on third-party cookies, which are easier and still more accurate. If increasingly security-savvy web users ever opt out of cookies en masse, though, the backup plan is ready to go.

The researchers will present their findings at the 20th ACM Conference on Computer and Communications Security this November in Berlin, and say they are developing a framework, to be released to the public as a browser plug-in, that will be used in future privacy studies.

The ongoing NSA revelations have been a wake-up call for many and have pushed the idea of internet privacy into the mainstream. Studies like this one show users just how much they're giving up when they load a webpage, even if only through implied consent. Demanding anything like privacy on the web means first figuring out the extent to which you're already exposed, how much of that exposure is essential to browsing, and how much happens only because no one thought to ask otherwise.