The Toronto Declaration

In Toronto on May 16th, a coalition of human rights and technology groups released a new declaration on machine learning standards, calling on governments and tech companies alike to ensure that algorithms respect basic principles of equality and non-discrimination. Called the Toronto Declaration (Ref 1), the document focuses on the obligation to prevent machine learning systems from discriminating and, in some cases, from violating existing human rights law. The declaration was announced as part of the RightsCon conference, an annual gathering of digital and human rights groups.

“We must keep our focus on how these technologies will affect individual human beings and human rights,” the preamble reads. “In a world of machine learning systems, who will bear accountability for harming human rights?”

The declaration has already been signed by Amnesty International, Access Now, Human Rights Watch, and the Wikimedia Foundation. More signatories are expected in the weeks to come.


While not legally binding, the declaration is meant to serve as a guiding light for governments and tech companies dealing with these issues, similar to the Necessary and Proportionate principles on surveillance (Ref 2). It’s unclear how the principles would translate into specific development practices, although more specific recommendations on data sets and inputs may be developed in the future.

Beyond general non-discrimination practices, the declaration focuses on the individual right to remedy when algorithmic discrimination does occur. “This may include, for example, creating clear, independent, and visible processes for redress following adverse individual or societal effects,” the declaration suggests, “[and making decisions] subject to accessible and effective appeal and judicial review.”

In practice, that will also mean significantly more visibility into how popular algorithms work. “Transparency is integrally related to accountability. It is not simply about making users comfortable with products,” said Dinah PoKempner, general counsel at Human Rights Watch. “It is also about ensuring that AI is a mechanism that works for the good of human dignity.”

Many governments are already moving along similar lines. Speaking at RightsCon’s opening plenary session, Canadian heritage minister Mélanie Joly said algorithmic transparency efforts were crucial for the broader exchange of information online. “We believe in a democratic internet,” said Joly. “So for us, transparency of algorithms is really important. We don’t need to know the recipe, but we want to know the ingredients.”


(1) Toronto Declaration

(2) Necessary and Proportionate Principles on Surveillance

Gerhard Schimpf, the recipient of the ACM Presidential Award 2016, has a degree in Physics from the University of Karlsruhe. As a former IBM development manager and self-employed consultant for international companies, he has been active in ACM for over four decades. He was a leading supporter of ACM Europe, serving on the first ACM Europe Council in 2009. He was also instrumental in coordinating ACM’s role as one of the founding organizations of the Heidelberg Laureate Forum, an annual meeting that brings laureates in computer science and mathematics together with students. Gerhard Schimpf is a member of the German Chapter of the ACM (Chair 2008 – 2011) and of the Gesellschaft für Informatik.
