Google's Latest Tracking Nightmare For Chrome Comes In Two Parts

August 09, 2023

A worrying new update from Google that hasn’t yet made headlines has put Chrome’s 2.6 billion users at risk. If you’re one of those users, this just gave you a reason to quit.

Chrome has serious issues when it comes to protecting your security and your privacy. The world’s leading browser has issued one urgent fix after another this year, as high-risk exploits have been found in the wild; and just a few weeks ago, Google finally admitted it had “accidentally” allowed millions of users to be secretly tracked.

Google says it wants to change, to put your privacy first, that web tracking is now out of control and has resulted in “an erosion of trust.” But as DuckDuckGo warns, “it’s all noise until Google actually agrees to collect less data and do less behavioral targeting.”

The latest tracking nightmare for Chrome users comes in two parts. First, Google has ignored security warnings and launched a new Chrome API to detect and report when you’re “idle,” i.e., not actively using your device. Apple warns “this is an obvious privacy concern,” and Mozilla that it’s “too tempting an opportunity for surveillance.”

Google, though, isn’t listening, reinforcing its fairly narrow use case while staying silent on these warnings. “This feature,” Google told me, “which we only expect to be used by a small fraction of sites, requires the site to ask for the user’s permission to access this data. It was built with privacy in mind, and helps messaging applications deliver notifications to only the device the user is currently using.”
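For context, here is roughly what a site using the feature would do, based on Chrome’s publicly documented Idle Detection interface. This is a minimal sketch rather than production code, and the declaration block is included only because the API is not yet part of TypeScript’s standard DOM typings.

```ts
// Minimal sketch of Chrome's Idle Detection API, as publicly documented.
// The declaration below only exists to make the sketch self-contained.
declare class IdleDetector extends EventTarget {
  static requestPermission(): Promise<"granted" | "denied" | "prompt">;
  readonly userState: "active" | "idle" | null;
  readonly screenState: "locked" | "unlocked" | null;
  start(options?: { threshold?: number; signal?: AbortSignal }): Promise<void>;
}

async function watchIdle(): Promise<void> {
  // The page must ask for permission first (and must do so from a user gesture).
  if ((await IdleDetector.requestPermission()) !== "granted") return;

  const detector = new IdleDetector();
  detector.addEventListener("change", () => {
    // userState: is the user currently interacting with the device?
    // screenState: is the screen locked or unlocked?
    console.log(`user: ${detector.userState}, screen: ${detector.screenState}`);
  });

  // threshold: how long (in ms) the user must be inactive before "idle" is
  // reported. Chrome enforces a minimum of 60 seconds.
  await detector.start({ threshold: 60_000 });
}
```

The permission prompt is the safeguard Google points to; the browser vendors’ objection is to what a site can infer about your behavior once that permission has been granted.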

According to Brave, “allowing websites to learn when users are active on sites, or have their screen locked, or similar, allows sites to learn sensitive information... Signals like this would be very useful to a malicious site (or script) that wanted to learn patterns.”

Vivaldi agrees, telling me: “We are not happy with the privacy implications of this API (since it can be abused for behavioral tracking), or the fact that it can be abused to know about when you might not notice if something is using your CPU... There are privacy implications that a user cannot be expected to realize.”

If this release of a controversial Chrome tracking technology despite industry warnings sounds familiar, that’s because we saw the same with FLoC earlier this year: Google was warned that its attempt to anonymize users while still serving the needs of advertisers was a surveillance disaster in the making. Google rejected any such claims and secretly enrolled millions of users into a trial, before quietly admitting later that those warnings had come true and that it had made the risks of tracking worse.

DuckDuckGo warns that Idle Detection “is another example of Google adding an API that has poor privacy properties to the web without consensus—and in this case in the face of active dissent—from other browser vendors. The Idle Detection API has a very narrow motivating use case but exposes new data about a user's behavior to the entire web—data that will ultimately be abused for user surveillance and advertising. The utility this API provides is outweighed by the privacy concerns it introduces.”

“Google has been professing their intent to figure out how to place ads in a privacy-preserving way with plans like the Privacy Sandbox,” Mozilla told me, “but those plans keep being delayed, and all the while they build functionalities like this one that tracks people and enables new ad use cases.”

Google’s Idle Detection API is worrying enough, but there’s worse to come. In the aftermath of FLoC’s awkward failure, Google is now touting a new idea to serve the needs of its customers—advertisers—while talking up privacy. The issue is that this is an impossible contortion. It just doesn’t work. And Apple has suddenly shown its 1.5 billion users just how exposed Chrome’s surveillance business model has now become.

Although Apple versus Facebook steals the privacy headlines, arguably it is Google that Apple really has in its sights. And while Firefox, DuckDuckGo and Brave most vocally push the browser privacy agenda, it’s really Safari that has done the best job of exposing Chrome’s avaricious data harvesting machine at scale.

Apple’s campaign against Chrome has been long running. Safari’s war on third-party cookies has shown up Chrome’s unwillingness to do the same—Google’s promise to banish those hidden trackers has been postponed. Mozilla has publicly warned that Chrome is now “the only major browser that does not offer meaningful protection against cross-site tracking... and will continue to leave users unprotected.”

Then came Apple’s privacy labels, exposing Chrome as an outlier against all other leading browsers. It collects too much of your data and links everything to your identity. None of the others do that.

Now, Apple has just gone much, much further. It may not have generated as much PR as iPhone 13 and iOS 15’s glitzy new features, but from a security and privacy perspective the most significant update that Apple has just introduced is a genuine game-changer for the way the internet works and your online privacy.

Safari already blocks by default the third-party tracking cookies that follow you around the internet, and other leading browsers do the same to some extent. But not Chrome. The risk here is fingerprinting: web trackers gather information about you as you browse, adding anything that can help identify you, such as your IP address and browser and device details, to the profiles held on you.
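To make that concrete, the sketch below (illustrative only, not any specific tracker’s code) shows the kind of signals an ordinary script can read with no permission prompt at all. No single value identifies you, but combined with the IP address a server sees, a handful of them often does, even with cookies blocked.

```ts
// Illustrative only: examples of the fingerprinting-capable surface an ordinary
// script can read without any permission prompt.
function collectFingerprintSignals(): Record<string, string | number> {
  return {
    userAgent: navigator.userAgent,                                   // browser and OS details
    language: navigator.language,
    cores: navigator.hardwareConcurrency,                             // CPU core count
    screen: `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display characteristics
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    touchPoints: navigator.maxTouchPoints,
  };
}
// A tracker combines these client-side signals with the IP address it sees
// server-side to build or match a profile of you.
```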

I think it’s fair to say that Apple has long waged a war on fingerprinting, and now it has introduced its biggest weapon yet—Private Relay. Put simply, this breaks the identity chain between you, the websites you visit and the ISP through which you access the internet. “The opportunities for fingerprinting,” Apple says, “have been removed.”

Private Relay has been described as a VPN—but it isn’t: it works differently and has a different purpose. A VPN creates a private, secure tunnel between you and the sites and servers you visit, masking your identity and IP address, even spoofing your location by routing your traffic through a different country to the one you’re in.

A VPN transfers your risk from the public internet and the various routings between you and the sites you visit to the VPN vendor. You need to trust a VPN provider: they can see everything you do, and they know where you are. Unlike Private Relay, VPNs safeguard all the traffic to and from your device, which is why you should always use a VPN when accessing public WiFi in hotels, restaurants, airports and other public access points. VPNs also present their proxy servers to web servers as if they were genuine locations, enabling users to defeat web restrictions in places like China.

If you travel and use WiFi overseas, or if you use public internet access points, you should install a VPN. There are three golden rules when doing so. First, avoid free VPNs. Second, only install VPNs from reputable western vendors; avoid anything from obscure developers, especially in China. And third, check the reviews: an app with numerous short five-star reviews sharing similar keywords is a red flag.

Private Relay has a different purpose, one that exposes Chrome’s systemic failings on the privacy front. What Apple has done is stop ISPs/WiFi operators harvesting your Safari web queries, while preventing websites from capturing your identity. Both risk you being fingerprinted. “It is critical to note,” Apple says, “that no one in this chain—not even Apple—can see both the client IP address and what the user is accessing.”

Private Relay doesn’t let you spoof your location, although it does regularly change your public-facing IP address. It doesn’t hide that you’re using a proxy server, so some websites will not work correctly. It’s a fundamental change in how the internet works, and as such there are teething issues; that’s why it remains in beta for now.

Put very simply, Private Relay blocks the exact type of web tracking and fingerprinting for which Chrome is lambasted. And this is the crux. Chrome could never deploy something similar, because to block the combination of identifiers and web queries from even Chrome itself would require technology that would fundamentally break the digital ad ecosystem, with Google at its center.

Google is trying to square this circle with its Privacy Sandbox, to find a way to serve advertisers while preserving user privacy. The issue is that this contradiction is an impossible problem to solve. Google’s first solution was FLoC, a plan to collate users in “anonymised,” likeminded groups. I warned at the time that this would not work, that once out of the lab, the system would be compromised by the wider tracking ecosystem. And so it proved. Google has now headed back to the drawing board.

Google’s latest gambit isn’t yet generating headlines, but it will. Rather than take Apple’s approach, that your privacy should be sacrosanct, Google wants to “budget” how invasive data harvesting can be. Rather than simply stopping web trackers from collecting your data, Google plans to introduce a “privacy budget,” whereby it will police just how much data they can take—so much and no more.

The theory is understandable. Websites are limited in what they can take from the privacy bank, and that currency is obviously your data. Once their budget is fully drawn down, the privacy bank shuts and they can’t withdraw any more for a time. But just like FLoC, isolated theories don’t survive long on the real web. As Mozilla explains, “the underlying problem here is the large amount of fingerprinting-capable surface that is exposed to the web—there does not appear to be a shortcut around addressing that.”
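As a rough mental model only, a budget mechanism might look something like the sketch below. The names, costs and cap are invented for illustration; this is not Chrome’s actual design or API.

```ts
// Hypothetical sketch of the "privacy budget" idea. All values are invented for
// illustration; this is not Chrome's implementation.
const BUDGET_BITS = 10; // invented cap on "identifying bits" per origin
const spentByOrigin = new Map<string, number>();

// Returns true if the origin may still read the surface, false once over budget.
function chargeSurface(origin: string, surface: string, costBits: number): boolean {
  const spent = spentByOrigin.get(origin) ?? 0;
  if (spent + costBits > BUDGET_BITS) {
    // Over budget: the browser would hand back a generic or noised value
    // instead of the real, identifying one.
    return false;
  }
  spentByOrigin.set(origin, spent + costBits);
  return true;
}

// e.g. chargeSurface("https://tracker.example", "screen resolution", 3);
```

Mozilla’s objection, quoted above, is essentially that the fingerprinting-capable surface of the real web is far too large for this kind of metering to hold up.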

Google is caught in a self-made trap. Unlike Mozilla, Brave, Microsoft, DuckDuckGo and Apple, of course, the company needs to play both sides of the fence. It may talk about safeguarding your privacy, but compromising that privacy to serve the needs of advertisers—its customers—is literally its business model. Just follow the money.

“Google has shown time and time again they care more about the perception of privacy than actually respecting it,” DuckDuckGo told me. “In the case of FLoC, for example, Google used privacy washing tactics to make it seem like this new approach would reduce tracking, while in the same breath stated that FLoC was at least 95% as effective as third-party cookie tracking, and would continue the ability to target people based on age, gender, ethnicity, income, and many other factors.”

Mozilla agrees, telling me that “browser fingerprinting is a major threat to user privacy; unfortunately, while we appreciate Google’s exploration of solutions to this problem, we don’t believe that the Privacy Budget is viable in practice.”

“We appreciate Mozilla and other browsers' engagement throughout this process,” Google told me, “as we all work to build a more private web without third party cookies and other forms of invasive tracking. This is our collaborative process working as intended.”

Google initially assured me that FLoC was not the threat it was being painted, that it would reduce the risk of fingerprinting despite all the concerns. But it turned out to be every bit as bad as feared and Google backtracked. And so, here we are again.

“As we have previously stated,” Google assured me for this story, “Privacy Budget is an early-stage proposal and we fully expect to make improvements as we iterate based on feedback. Our ultimate goal is to build a solution that restricts fingerprinting effectively without compromising key website functionality or introducing new forms of tracking. We have publicly committed to not self-preference and are working with regulatory bodies and industry groups to reinforce this outcome.”

But according to Brave, “approaches that attempt to maintain an ‘acceptable’ amount of identification and tracking online, however well-meaning, are antithetical to the goal of a truly privacy-respecting web. We expect that ‘budget’-based approaches to web privacy will not be effective privacy protections.”

The reality is that Google can’t back down; it must meet the needs of advertisers or its machine stops feeding. But there are no good solutions: the fundamental premise of a “privacy-centric” web that’s built around trackers and data brokers is a nonsense.

“If the experience with FLoC/removal of third-party cookies tells us anything,” warns DuckDuckGo, “it's that we should take Google's proposals and privacy claims with a huge grain of salt until they're proven to work.”

Is it dramatic to suggest you quit Chrome? That depends on the value you place on your own privacy. If Google’s hidden budgeting and monetizing of your data isn’t a reason to quit, what about adding Idle Detection in such a way that you need to change your settings to avoid the intrusion? Just as with FLoC, this is not okay. If new tracking is added, it should be communicated upfront with an opt-in or opt-out. Users should not have to delve into settings to disable new tracking they have been told nothing about.

Apple has raised the bar here with App Tracking Transparency and Privacy Labels; Google, it seems, is doing the opposite. Yes, there are always settings to disable its more nefarious technologies, but we all know that the vast majority of users either can’t or won’t make any changes. Conversely, we have seen the vast majority of Apple users opting for privacy when offered clear and simple choices upfront.

Google emphasized to me that Apple’s solutions are not a cure-all. We know that apps have been caught “snooping” on users even when asked not to track. But this is a double-edged sword for Google. The lesson from FLoC is that the ad industry is crafty and will find workarounds. Apple has committed to enhancing its technologies to shut down abuses. The abusers in the case of Chrome are Google’s advertising customers.

“Fingerprinting is real and we’re seeing it happen,” Google says. “We’d like to stop this highly pervasive tracking of users across the web.” Well, maybe it’s time for a wake-up call. If you control the world’s leading browser with 2.6 billion users, if you own the intersection between users and advertisers and websites, if you control search and most back-end web trackers, then stopping that “highly pervasive tracking” is totally within your control. But Google can’t do that, of course. Follow the money.

Google also says that “72% of users feel that almost all of what they do online is being tracked... and 81% say the potential risks from data collection outweigh the benefits,” which is why change is needed. Google says a lot of things. But until Chrome’s 2.6 billion users make privacy choices, Google will continue to say more than it does.
