How exactly Facebook decides who sees what is one of the great pieces of forbidden knowledge in the information age, hidden away behind nondisclosure agreements, trade secrecy law, and a general culture of opacity. New research from experts at Northeastern University, the University of Southern California, and the public-interest advocacy group Upturn doesn’t reveal how Facebook’s targeting algorithms work, but does show an alarming outcome: They appear to deliver certain ads, including for housing and employment, in a way that aligns with race and gender stereotypes — even when advertisers ask for the ads to be exposed to a broad, inclusive audience.
There are two basic steps to advertising on Facebook. The first is taken by advertisers when they choose certain segments of the Facebook population to target: Canadian women who enjoy badminton and Weezer, lacrosse dads over 40 with an interest in white genocide, and so forth. The second is taken by Facebook, when it makes an ad show up on certain people’s screens, reconciling the advertiser’s targeting preferences with the flow of people through Facebook’s apps and webpages in a given period of time. Advertisers can see which audiences ended up viewing the ad, but are never permitted to know the underlying logic of how those precise audiences were selected.
The new research focuses on the second step of advertising on Facebook, the process of ad delivery, rather than on ad targeting. Essentially, the researchers created ads without any demographic target at all and watched where Facebook placed them. The results, said the researchers, were disturbing:
Critically, we observe significant skew in delivery along gender and racial lines for “real” ads for employment and housing opportunities despite neutral targeting parameters. Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive.
Rather than targeting a demographic niche, the researchers requested only that their ads reach Facebook users in the United States, leaving matters of ethnicity and gender entirely up to Facebook’s black box. As Facebook itself tells potential advertisers, “We try to show people the ads that are most pertinent to them.” What exactly does the company’s ad-targeting black box, left to its own devices, consider pertinent? Are Facebook’s ad-serving algorithms as prone to bias as so many others? The answer will not surprise you.
Read more: Facebook Ad Algorithm Is a Race and Gender Stereotyping Machine, New Study Suggests