Algorithmic transparency

Machine learning algorithms are used across a wide variety of domains to produce useful outputs in the form of predictions, classifications, and clusterings. However, while many organisations can benefit from machine learning algorithms, these algorithms also raise ethical issues, which can make organisations reluctant to adopt them.

To make these algorithms more socially acceptable, and thus enable their benefits to be experienced more widely, many organisations are exploring algorithmic transparency tools.

In domains where the outputs of a machine learning model can have real-world consequences, for example, predictive policing, financial lending, or family screening, algorithmic transparency can provide explanations of how those outputs were reached. By tailoring the form and complexity of these explanations to the audience, practitioners can effectively audit, verify and contextualise a model’s outputs – building public trust in the tools and the institutions that use them.

Our Approach

Trilateral Research evaluates machine learning algorithms using a diverse set of algorithmic transparency techniques in order to:

  • make their decision-making processes more transparent
  • identify and analyse situations where they perform poorly
  • probe for biases

Our interdisciplinary team of experts uses techniques such as Shapley values, Anchors, and counterfactual explanations to produce solutions that enable ethical, human-focused, machine-assisted decision-making, as opposed to prescriptive autonomous decision-making.
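To give a flavour of one of these techniques, the sketch below computes exact Shapley values for a toy additive scoring model by enumerating all feature coalitions. The model, its feature names, and the baseline values are illustrative assumptions, not Trilateral Research's actual tooling; in practice libraries such as SHAP approximate these values for models with many features, where exact enumeration is infeasible.

```python
from itertools import combinations
from math import factorial

# Hypothetical toy model: a simple additive scoring function.
# The feature names and weights here are illustrative assumptions.
def model(features):
    return 2.0 * features["income"] + 1.0 * features["tenure"] - 3.0 * features["debt"]

def shapley_values(model, instance, baseline):
    """Exact Shapley values by enumerating all feature coalitions.

    Features outside a coalition take their baseline value; features
    inside it take the instance's value. Exact enumeration is only
    feasible for a handful of features (2^n coalitions).
    """
    names = list(instance)
    n = len(names)
    phi = {}
    for f in names:
        others = [g for g in names if g != f]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                # Standard Shapley weight for a coalition of size k
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                with_f = {g: instance[g] if (g in coalition or g == f) else baseline[g]
                          for g in names}
                without_f = {g: instance[g] if g in coalition else baseline[g]
                             for g in names}
                # Marginal contribution of f to this coalition
                total += weight * (model(with_f) - model(without_f))
        phi[f] = total
    return phi

instance = {"income": 5.0, "tenure": 2.0, "debt": 1.0}
baseline = {"income": 0.0, "tenure": 0.0, "debt": 0.0}
phi = shapley_values(model, instance, baseline)
# For an additive model, each feature's Shapley value is simply
# weight * (value - baseline), and the values sum to
# model(instance) - model(baseline).
```

Attributions like these let a practitioner see which features drove a particular output, which is the starting point for the auditing and bias-probing work described above.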

Why Trilateral?

Trilateral Research has a wealth of experience in algorithmic transparency. We take a socio-technical approach and place strong emphasis on the ethical implications of integrating machine learning algorithms into decision-making processes, including in our own applications.
