New research reveals how Facebook could be discriminating against users and acting in violation of equality laws and data protection rules in France and the Netherlands.

Imagine that you and your colleague are on a lunch break. You are both scrolling through your Facebook feeds at your desks. You work in the same company and have similar roles, responsibilities, and experience. Your colleague sees an ad for an interesting, well-paid job that you’d love the chance to apply for, but you haven’t seen it. He mentions it to you; you refresh your feed and scan the ads again, but it still doesn’t appear.

Why didn’t you see the ad?  

As uncomfortable as it sounds, perhaps one of the reasons you didn’t see the ad was your gender.  

In March and April we paid Facebook to post adverts containing links to real job vacancies for a range of trades and professions in the UK, the Netherlands, France, India, Ireland, and South Africa, including jobs for airline pilots, couriers, hairdressers, psychologists, and IT workers.

Across all of these countries, 91% of the people who were shown our ad for mechanic vacancies were male. Of those who were shown our ad for pre-school teacher vacancies, 79% were female.

These findings, which reveal how Facebook shows adverts for job vacancies overwhelmingly to one gender relative to another, were replicated across nearly all the adverts and in every country where we posted them.

Working with feminist organisations, we took a closer look at France and the Netherlands: in the Netherlands, 97% of people who were shown our ad for receptionist jobs were women, and 79% of those who saw our ad for chef jobs were men. In France, 89% of those who were shown our ad for psychologist jobs were women, and 74% of those shown our ad for pilot jobs were men.

All the ads that we posted were shown, via one of Facebook’s mandatory ad campaign objectives, to the users Facebook thought were most likely to click on the website the ad linked to. And for all the ads, we specified only the following: that they must be shown to adults who lived in, or had recently been in, the country in question.
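For readers who want to see how little we constrained delivery, the sketch below expresses that setup in the form of a Facebook Marketing API-style targeting specification. It is an illustrative reconstruction under our stated constraints, not our actual campaign configuration: the field names follow Facebook’s public Marketing API documentation, and the country code and objective value shown are assumptions for the example.

```python
# Illustrative sketch only: roughly how the constraints described above
# could be expressed as a Facebook Marketing API targeting specification.
# NOT our actual campaign configuration; field names follow the public
# Marketing API documentation, and the values are assumptions.
targeting_spec = {
    "geo_locations": {
        "countries": ["NL"],                   # one campaign per country
        "location_types": ["home", "recent"],  # lived in, or recently in
    },
    "age_min": 18,  # adults only
    # Deliberately absent: no gender, interest, or behavioural targeting.
}

# Campaign-level objective: delivery optimised for link clicks, leaving
# the choice of which users see the ad entirely to Facebook's algorithm.
campaign_objective = "LINK_CLICKS"

print(campaign_objective, targeting_spec)
```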

Which Facebook users were shown our job ads, and therefore whom Facebook judged the ads to be relevant and of interest to, was decided entirely by the company’s algorithm.

We are concerned that, in showing job ads predominantly to one gender, the company’s ad delivery algorithm is not just replicating but exacerbating the biases we see in society, narrowing opportunities for users, and frustrating progress towards equity in the workplace and society at large.

The right not to be discriminated against on the basis of sex was hard won, fought for and secured in law by the women’s rights movement. Such discrimination is expressly prohibited in the European Convention on Human Rights and is further enshrined in EU law and in both the French and Dutch constitutions.

But it is clearly not enough to leave these laws on the statute book. These are rights which we need to keep applying and defending even in this modern era of Big Tech algorithms, automated decision-making and artificial intelligence.  

In the US, the Justice Department has sued Meta over allegations that the way its ad delivery algorithm distributed housing adverts discriminated against US Facebook users based on characteristics including race, sex, and disability. Meta settled that case under an agreement to develop a new system, subject to court oversight, to address the disparities caused by its algorithms with respect to housing ads. The problem, however, is that the settlement applies only to housing ads, and only to users in the US.

In 2021, when we first noticed this discriminatory effect in the UK, we asked Facebook to explain the results. They didn’t. We then submitted complaints to regulators asking them to investigate our suspicion that the platform’s system for advertising jobs was discriminating on the basis of sex. 

Now, with these new findings, we are joining forces with women’s rights organisations in the Netherlands and France to ask that the Netherlands Institute for Human Rights and the French Défenseur des droits investigate Meta’s compliance with equality legislation and intervene should the company be found to be in violation of these important laws.

We’re also requesting that the Data Protection Authorities in both countries review the company’s compliance with rules which state that the company must process data transparently, legally, and fairly, in accordance with fundamental rights. 

Algorithms fed by Meta’s assumptions about us dictate the content we see on our Facebook feeds and affect billions of people’s lives every day. We’ve uncovered how Facebook is profiting from ads which are delivered to users in a discriminatory way, and in a way that neither users nor advertisers have an opportunity to understand, let alone prevent.  

Regulators must crack open the black box at the heart of Meta, investigate, and enforce our rights for a fairer society.  

When approached for comment, a spokesperson from Meta said: 

“We have applied targeting restrictions to advertisers when setting up campaigns for employment, as well as housing and credit ads, and we offer transparency about these ads in our Ad Library.  

“We do not allow advertisers to target these ads based on gender. We continue to work with stakeholders and experts across academia, human rights groups and other disciplines on the best ways to study and address algorithmic fairness.”


Notes:  

  • We also submitted job ads in India, Ireland, South Africa and the United Kingdom; the full results are available on request.  
  • The complaints to the Défenseur des droits, CNIL, the Dutch Data Protection Authority and the Netherlands Institute for Human Rights are available on request.