Automated discrimination may not be a familiar term, but it describes something concrete: the way online platforms optimise ad delivery in discriminatory ways. The topic has gained traction in recent years, because excluding protected groups from seeing an ad can break the law. But to what extent are these firms excluding people based on race and gender?
Online platforms are well known for the refinement of their advertising engines, which allow advertisers to target specific audiences with their content. At first this seemed fairly harmless: Facebook, for instance, lets you target people who hold certain interests. But journalists uncovered a far more damning use, in which advertisers could exclude people of a given race from seeing an ad by excluding groups with particular 'ethnic affinities', as Facebook termed it. Research by both ProPublica and AlgorithmWatch illustrates how Facebook allows advertisers to create ads that exclude people based on race, gender and other sensitive factors, even though US federal law prohibits such discrimination in housing and employment.
In its investigation, ProPublica sought to purchase an ad in Facebook's housing categories via the company's advertising portal. It was allowed to place an ad that targeted Facebook members who were house hunting while excluding anyone with an 'affinity' for African-American, Asian-American or Hispanic people. ProPublica then showed these findings to prominent civil rights lawyer John Relman, who remarked: 'This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.'
To explore the topic further, AlgorithmWatch advertised the following jobs on both Facebook and Google: machine learning developer, truck driver, hairdresser, child care worker, legal counsel, and nurse. The experiment, conducted in Germany, Poland, France, Spain and Switzerland, found that Google to some extent, and Facebook especially, targeted the ads by gender without being asked to. The ad for truck drivers was shown on Facebook to 4,864 men but only 386 women, whereas the ad for child care workers, which ran at exactly the same time, was shown to 6,456 women but only 258 men. Shown these findings, Northeastern University researcher Piotr Sapiezynski explained: 'Facebook immediately decides whom to show the ad to, as soon as the ad is published. This is done by using past data to make predictions on who might click on an ad based on how users reacted to similar ads in the past, and hence in doing so, it predicts the past will repeat itself.'
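The feedback loop Sapiezynski describes can be illustrated with a minimal sketch. All data, categories and function names below are hypothetical, invented for illustration; the point is simply that a delivery system which favours the group with the higher historical click-through rate will reproduce whatever skew its history contains.

```python
# Toy sketch (entirely hypothetical data) of click-rate-driven ad delivery:
# the system estimates click probability from past clicks, so historical
# gender skew is carried forward into future delivery decisions.

# Historical click log: (ad_category, user_gender, clicked)
history = [
    ("truck_driver", "man", 1), ("truck_driver", "man", 1),
    ("truck_driver", "man", 0), ("truck_driver", "woman", 0),
    ("child_care", "woman", 1), ("child_care", "woman", 1),
    ("child_care", "woman", 0), ("child_care", "man", 0),
]

def estimated_ctr(category, gender):
    """Past click-through rate for this (category, gender) pair."""
    clicks = [c for cat, g, c in history if cat == category and g == gender]
    return sum(clicks) / len(clicks) if clicks else 0.0

def delivery_choice(category):
    """Deliver the ad to whichever group has the higher predicted CTR."""
    return max(["man", "woman"], key=lambda g: estimated_ctr(category, g))

print(delivery_choice("truck_driver"))  # -> man
print(delivery_choice("child_care"))    # -> woman
```

Note that no one told this system to discriminate: the skew emerges purely from optimising clicks against biased historical data, which is why the resulting delivery pattern is so hard for any individual user to detect.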
On many levels, automated targeting could be advantageous for advertisers across multiple industries. It could ensure that job offers are shown only to licensed professionals who hold the required degree, or that property sales are highlighted only to people within a certain geographical area. As currently employed, however, it uses historical data to shape the future, and in doing so deprives people of opportunities through blatant discrimination. Although European law forbids discrimination on criteria such as race and gender, affected citizens may find such discrimination impossible to prove, because current legislation puts the burden of proof on complainants. Facebook users have no way of knowing which job offers they were not shown, and advertisers have no way of ensuring that their ads are delivered without illegal discrimination.
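The legitimate uses mentioned above could in principle be expressed as explicit, auditable rules rather than opaque learned delivery. The sketch below is purely illustrative, with invented licence codes, regions and user records; it shows targeting that filters on job-relevant criteria instead of protected attributes.

```python
# Hypothetical sketch of rule-based ad eligibility: filter on job-relevant
# criteria (a required licence, a delivery area), not on race or gender.

REQUIRED_LICENCE = "RN"                  # assumed requirement for a nursing ad
DELIVERY_REGIONS = {"Berlin", "Hamburg"}  # assumed service area

def eligible_for_ad(user):
    """Show the ad only to users who meet the job-relevant criteria."""
    return (REQUIRED_LICENCE in user.get("licences", [])
            and user.get("region") in DELIVERY_REGIONS)

users = [
    {"name": "A", "licences": ["RN"], "region": "Berlin"},
    {"name": "B", "licences": [],     "region": "Berlin"},
    {"name": "C", "licences": ["RN"], "region": "Munich"},
]
print([u["name"] for u in users if eligible_for_ad(u)])  # -> ['A']
```

Because every rule is explicit, both the advertiser and a regulator can inspect exactly why a user was or was not shown the ad, which is precisely what the current click-optimised systems make impossible.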
Therefore, with Facebook's privacy and public policy manager, Steve Satterfield, naively stating that Facebook offered the 'ethnic affinity' categories only as part of a 'multicultural advertising' effort, and a class action lawsuit against Facebook's ad delivery filed by housing rights activist Neuhtah Opiotennione considered unlikely to go forward, this prominent problem doesn't seem to be going away anytime soon.