Facebook has announced it will tighten limits on discriminatory ad targeting. It will remove specific ad targeting options so advertisers cannot exclude audiences unfairly.
Facebook COO Sheryl Sandberg said the changes relate to three categories: housing, employment and credit ads.
- Anyone who wants to run housing, employment or credit ads will no longer be allowed to target by age, gender or zip code.
- Advertisers offering housing, employment and credit opportunities will have a much smaller set of targeting categories to use in their campaigns overall. Multicultural affinity targeting will continue to be unavailable for these ads. Additionally, any detailed targeting option describing or appearing to relate to protected classes will also be unavailable.
- We’re building a tool so you can search for and view all current housing ads in the US targeted to different places across the country, regardless of whether the ads are shown to you.
The tightened restrictions respond to demands from the National Fair Housing Alliance, the American Civil Liberties Union, and the Communication Workers of America.
The organizations found that the social network’s ad targeting options were open to discriminatory use. Facebook complied, shifting its focus and implementing these updates.
ProPublica discovered this weak spot in 2016, finding that Facebook advertisers had an option to exclude Black, Hispanic, and other ethnic groups from seeing ads.
The social network addressed this in 2017, but advertisers could still work around the fix through Facebook’s ad targeting system.
After the Cambridge Analytica scandal, Facebook removed more than 5,000 ad targeting options to avoid discrimination.
It also rolled out an opt-in agreement meant to ensure that businesses used the remaining filters legally.
These measures proved insufficient: businesses continued to target or exclude those audiences, legally or otherwise.
Facebook assures that the latest update prevents businesses from applying these discriminatory exclusions.
Given the complexity of ad targeting, however, Facebook will have a hard time enforcing these restrictions.
“It’s within the realm of possibility, depending on how the algorithm is constructed, that you could end up serving ads, inadvertently, to biased audiences.”
– Pauline Kim, professor at Washington University in St. Louis, told The New York Times
Facebook has kept mum on how its ad targeting algorithms work. But algorithms carry an inherent problem: one that trains on biased habits and user behaviors may enable discriminatory ad targeting by default.
Because algorithms learn from data generated by actual usage, they tend to reproduce existing biases for or against the audiences in that data.
If users in one category show more interest in a certain thing, the algorithm will logically target them more heavily. The result is bias rooted in the sample.
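The feedback loop described above can be sketched with a hypothetical toy example (all groups, rates, and data here are invented for illustration): an ad-serving rule "trained" on historical click logs will simply reproduce whatever skew those logs already contain.

```python
import random

random.seed(0)

# Synthetic click logs: group "A" historically saw the ad far more often
# (e.g. because past campaigns already targeted it), so it has more
# recorded clicks regardless of any real difference in interest.
logs = [("A", random.random() < 0.30) for _ in range(1000)] + \
       [("B", random.random() < 0.10) for _ in range(200)]

# "Training": estimate a click rate per group from the biased logs.
rates = {}
for group in ("A", "B"):
    clicks = [clicked for g, clicked in logs if g == group]
    rates[group] = sum(clicks) / len(clicks)

# "Serving": show the ad to whichever group scores higher -- which will
# be the group the historical data already favored.
target = max(rates, key=rates.get)
print(rates, "-> targeting group", target)
```

Nothing in this rule mentions a protected class, yet it concentrates ads on the historically favored group, which is why opaque targeting algorithms can discriminate even without explicit filters.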
How tech companies can prevent their algorithms from doing this remains a hot topic among machine learning experts.
Still, anything Facebook can do to limit discriminatory ad targeting is a step forward.