NGOs file complaints in the Netherlands and France against Meta

12th June 2023, London - A group of NGOs have today filed complaints against Meta with the human rights commissions in both the Netherlands and France, as new research indicates that gender discrimination is inherent in Facebook’s job adverts function.

Global Witness posted a range of real-life job adverts on Facebook in France, Ireland, the Netherlands, India, South Africa, and the UK. Across these six countries, Facebook’s algorithm resulted in 90.9% of the ads for a mechanic being shown to men, whilst 78.6% of the ads for preschool teachers were shown to women.

In the Netherlands, 97% of those who were shown an ad for a receptionist were women, and 79% of those who were targeted with an ad for a role as a chef were men. In France, only 12% of those targeted with a psychologist job ad were men, and just a quarter of those shown a pilot job ad were women.

In response to these results, Global Witness and women’s rights organisations in France and the Netherlands have today filed complaints with the French Défenseur des droits and the Dutch Institute of Human Rights.

The complaints accuse Facebook of gender discrimination in the way its algorithm automatically targets certain job adverts at particular users, seemingly on the basis of their gender. Global Witness, Bureau Clara Wichmann in the Netherlands, and Fondation des Femmes in France, argue that the platform’s apparent algorithmic bias is reinforcing regressive societal stereotypes and is in breach of fundamental rights.

In both cases the commissions are called on to investigate Facebook’s compliance with national equality legislation and to intervene if the company is found to be in breach. 

Naomi Hirst, Digital Threats Campaign Leader at Global Witness, said:

“Big tech platforms would like us to believe that they represent progress and are building the future. Instead, we find that they are entrenching historical and regressive gender stereotypes by running an algorithm that is denying women the opportunity to know about job opportunities.

“In almost every case the ads we posted were overwhelmingly shown to one gender, making it crystal clear that the algorithms at the heart of the platform’s business model are deeply problematic. It’s time for Facebook’s operations to be dragged from the stone age to the modern age big tech supposedly stands for.

“The need to apply our rights in this age of algorithms, automated decision-making and AI is more pressing than ever. We call on the regulators in France and the Netherlands to properly investigate these complaints and enforce the equality legislation that people fought for over many years. They must put an end to the exceptionalism enjoyed by social media platforms that allows them to ride roughshod over our hard-won right to equality.”

Rajae Azaroual, legal advisor at Bureau Clara Wichmann, said:

“If social media algorithms are based on gender, we will maintain inequality in the labour market between men and women. The results of the study are alarming.”

Floriane Volt, Director of Public and Legal Affairs of the Fondation des Femmes, said:

"Social network algorithms reproduce and accentuate gender stereotypes and inequalities. Test results show an urgent need to end automatic gender discrimination in access to employment".

Alongside the urgent action required from the human rights commissions, Global Witness is also calling on the Data Protection Authorities in both France and the Netherlands to review Facebook’s compliance with rules that require the company to process data transparently, lawfully and fairly.

It follows a similar case in the US last year, when the Justice Department sued Facebook over allegations that housing ads on the platform discriminated on the basis of race, sex and disability. In that case Facebook agreed to a settlement that required the company to develop a new system to address the discrimination in its algorithm for delivering housing ads to users.

When approached for comment, a spokesperson from Meta said:

“We have applied targeting restrictions to advertisers when setting up campaigns for employment, as well as housing and credit ads, and we offer transparency about these ads in our Ad Library.

“We do not allow advertisers to target these ads based on gender. We continue to work with stakeholders and experts across academia, human rights groups and other disciplines on the best ways to study and address algorithmic fairness.”