General security

FLoC delayed: what does this mean for security and privacy?

October 4, 2021 by Susan Morrow

The specter of invasive tech has reached peak annoyance in recent years. A report from the Internet Society found that almost two-thirds of respondents felt that smart devices that collected data on interests and behavior were “creepy.” As such, the consumer appetite for targeted marketing via third-party cookies is losing ground. Google has reacted by developing the concept of Federated Learning of Cohorts (FLoC). But is FLoC truly a privacy-enhanced method to meet advertisers’ needs while maintaining consumer privacy?

Is FLoC building or breaking down walls?

FLoC aims to partition everything in the browser: socket pools, cookies, caches and so on. This partitioning acts to isolate each website (the registered domain). The goal of FLoC is to facilitate interest-based advertising while making user tracking more difficult. FLoC came about from the realization that third-party cookies, used to track people as they navigated from site to site on the internet, were privacy-invasive. In using cookies to track people, the collated data and metadata could also potentially be used to determine an individual’s identity.

In 2020, Google announced that it would be phasing out third-party cookies. The Google Privacy Sandbox, an idea going back to 2019, was utilized as a part of this initiative to develop privacy standards while giving advertisers the data they need to market effectively.

Google continues to court the tech industry with its Privacy Sandbox, including enlisting the W3C to help fine-tune the initiative. In January 2021, a year after announcing the Privacy Sandbox initiative, Google pared back the proposals to five candidates, one of which is FLoC.

FLoC uses a kind of “herd privacy” to (in theory) improve the privacy of individuals browsing the internet. Instead of tracking individuals using third-party cookies in each user’s browser, the interests of groups of people (cohorts) replace individual identifiers. Each cohort is assigned a FLoC ID, grouping individuals together by interest area. The thinking behind the FLoC ethos is that identifying information will be hidden within the “crowd” of users that make up a specific cohort. On-device processing is used to keep web browsing history private: Google employs machine learning on the device to create a cohort of interests based on the sites that an individual visits, and the principles of k-anonymity are used to pseudonymize cohort members.
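In the Chrome origin trial, the cohort ID was derived on-device from a SimHash (a locality-sensitive hash) of the domains a user visited. A minimal sketch of the idea follows; the hash function, bit width and domain names are illustrative assumptions, not Google’s actual parameters:

```python
import hashlib

def simhash(domains, bits=8):
    """Toy SimHash: hash each visited domain, then take the sign of the
    per-bit vote across all domains. Similar browsing histories tend to
    produce the same hash, i.e. land in the same cohort."""
    votes = [0] * bits
    for domain in domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    # Cohort ID: one bit per position, set where the vote is positive.
    return sum((1 << i) for i, v in enumerate(votes) if v > 0)

# Two users with largely overlapping interests often share a cohort ID.
alice = simhash(["news.example", "cars.example", "autos.example"])
bob = simhash(["news.example", "cars.example", "motors.example"])
```

Because SimHash is locality-sensitive, users with similar browsing histories tend to collide into the same cohort; in the real design, a cohort ID would only be served once the cohort cleared a k-anonymity threshold of thousands of users.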

Google has stated this about FLoC:

“Our tests of FLoC to reach in-market and affinity Google Audiences show that advertisers can expect to see at least 95% of the conversions per dollar spent when compared to cookie-based advertising.”

However, this holy grail of balanced privacy vs. marketing outreach is not as clear-cut as it may seem.

How does FLoC affect privacy and data security?

Mozilla carried out a deep analysis of the impact of FLoC on an individual’s privacy. The main concern of the Mozilla researchers was that FLoC still risked allowing user behavior to be linked across sites.

The privacy issues identified by Mozilla came down to four key areas.

Browser fingerprinting

Variations in browser type, browser version and the operating system on which the browser runs allow a cohort member to be narrowed down. Mozilla believes that a small amount of such data can be used to potentially identify an individual within a typical cohort of a few thousand users.
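The arithmetic behind that concern is straightforward: each independent bit of fingerprint entropy roughly halves the anonymity set, so a cohort of a few thousand users needs only about 12 bits of fingerprint data to be narrowed to an individual. The cohort size and bit count below are illustrative:

```python
import math

cohort_size = 3000  # a "typical" cohort of a few thousand, per Mozilla
bits_needed = math.log2(cohort_size)  # bits of entropy to single someone out

# Each independent bit of fingerprint entropy roughly halves the anonymity set.
fingerprint_bits = 12  # browser family/version + OS can easily exceed this
survivors = cohort_size / 2 ** fingerprint_bits

print(f"~{bits_needed:.1f} bits needed; "
      f"~{survivors:.2f} users left after {fingerprint_bits} fingerprint bits")
```

Since fewer than one user “survives” on average, the cohort ID plus an ordinary fingerprint is, in expectation, a unique identifier.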

Multiple visits

FLoC uses “FLoC IDs” to identify cohort groups. A FLoC ID is currently re-calculated approximately every seven days. Mozilla is concerned that these IDs can be used to identify a user who makes multiple visits to a site by tracking the sequence of associated FLoC IDs over time. Firefox’s Total Cookie Protection (TCP) does not prevent this: FLoC restores a form of cross-site tracking even when TCP is enabled.
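A toy simulation makes the multiple-visit risk concrete. Even under the deliberately unrealistic assumption that cohorts are assigned at random each week (real cohorts are interest-based and stickier, which only makes linking easier), a handful of weekly cohort IDs observed for a returning visitor acts like a high-entropy identifier. All names and numbers here are illustrative:

```python
import random

random.seed(0)
N_USERS, N_COHORTS, WEEKS = 100_000, 1_000, 3

# Each user gets a (random, in this toy model) cohort ID every week.
history = [[random.randrange(N_COHORTS) for _ in range(WEEKS)]
           for _ in range(N_USERS)]

# A site observes one returning visitor's weekly cohort IDs (user 0 here)...
observed = history[0]

# ...and asks: which users match every observed weekly ID?
candidates = [u for u in range(N_USERS) if history[u] == observed]

# Three IDs drawn from 1,000 cohorts carry ~30 bits of entropy, so the
# visitor is almost certainly unique among the 100,000 users.
print(len(candidates))
```

A first-party cookie recorded alongside each weekly FLoC ID would make this linkage exact rather than probabilistic.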

FLoC leaks information

FLoC also undermines restrictive cookie policies that would otherwise prevent user tracking, such as a user declining marketing cookies. Individual FLoC cohorts can be queried in an increasingly granular manner, and when the ID queries are combined with fingerprinting data, identifying information can potentially be revealed. Mozilla demonstrates this using the example: “Do people who live in France have Macs, run Firefox and who have this ID like cars?” This conflation of individual data points is potentially revealing in Mozilla’s view.

Insufficient FLoC countermeasures

Allowing a website to opt out of using FLoC is one potential way to avoid FLoC privacy issues. However, Mozilla believes that any site that uses advertising is more likely to be included by default, and that the opt-out is unlikely to be actioned in most situations.

Addressing the FLoC privacy issues

Google is attempting to address the privacy-advertising balance by recognizing “sensitive websites” and providing a list of sensitive categories that will return an empty cohort ID. What constitutes “sensitive,” however, is open to interpretation and the list may be incomplete. Google gives the example of a cohort that could reveal someone is searching for a “rare disease.”

What does the future of FLoC look like?

The best thing said about FLoC is that it attempts to minimize privacy invasion but does not remove privacy issues altogether. Some privacy advocates are also worried that unscrupulous advertisers may turn to first-party cookies to obtain more granular data than before FLoC. It is also worth noting that Firefox and Safari have already removed the use of third-party cookies. 

The worst-case scenario, regarding FLoC and privacy, is summed up by private browser company Vivaldi:

“A dictatorship may be able to work out that dissenters often seem to have one of the same five FLoC IDs. Now anyone who visits a nationally controlled website with that ID could be at risk. A country that outlaws certain religions or sexualities could do the same.” 

The work to establish a FLoC architecture continues, with an expected delivery date of phase 1 in 2022 and a full rollout expected in 2023. Hopefully, with guidance from the privacy industry and industry stalwarts such as the W3C, Google will deliver technology that moves the privacy dial towards good and stand by its commitment to doing no evil.

Chrome version 89 included a trial version of FLoC that affects around 0.5% of users. To check out if your Chrome browser is part of that test, you can use EFF’s FLoC check tool, “Am I FLoCed.”

 

Susan Morrow

Susan Morrow is a cybersecurity and digital identity expert with over 20 years of experience. Before moving into the tech sector, she was an analytical chemist working in environmental and pharmaceutical analysis. Currently, Susan is Head of R&D at UK-based Avoco Secure. Susan’s expertise includes usability, accessibility and data privacy within a consumer digital transaction context. She was named a 2020 Most Influential Women in UK Tech by Computer Weekly and shortlisted by WeAreTechWomen as a Top 100 Women in Tech. Susan is on the advisory board of Surfshark and Think Digital Partners, and regularly writes on identity and security for CSO Online and Infosec Resources. Her mantra is to ensure human beings control technology, not the other way around.
