Fairness: The view from abroad

Research in algorithmic fairness is inextricably linked to the legal system. Certain approaches that might seem algorithmically sound are illegal, and other approaches rely on specific legal definitions of bias.

This means that it’s hard to do research that crosses national boundaries. Our work on disparate impact is limited to the US. In fact, the very idea of disparate impact appears to be US-centric.

Across the ocean, in France, things are different, and more complicated. I was at the Paris ML meetup organized by the indefatigable Igor Carron, and heard a fascinating presentation by Pierre Saurel.

I should say ‘read’ instead of ‘heard’. His slides were in English, but the presentation itself was in French. It was about the ethics of algorithms, as seen by the French judicial system, and was centered around a case where Google was sued for defamation as a result of the autocomplete suggestions generated during a partial search.

Google initially lost the case, but the ruling was eventually overturned by the French Cour de Cassation, the final court of appeals. In its judgement, the court argued that algorithms are by definition neutral and cannot exhibit intent, and that Google therefore can’t be held responsible for the results of automatic, algorithm-driven suggestions.

This is a fine example of defining the problem away: if an algorithm is neutral by definition, then it cannot demonstrate bias. Notice how the idea of disparate impact gets around this by thinking about outcomes rather than intent.

But a consequence of this ruling is that bringing cases of algorithmic bias in French courts will now be much more difficult.

The jury is still out on this issue across the world. In Australia, Google was held liable for search results that pointed to defamatory content: there, too, an algorithm produced the results, but the company was nonetheless viewed as responsible.

4 thoughts on “Fairness: The view from abroad”

  1. Suresh,

    Thanks for being at the #MLParis meetup and for your presentation. Yes, it was a discovery for me as well that the highest court in the land seems to have made a somewhat uninformed determination of what constitutes an algorithm. It’s a real shame.

    Igor.

