As the use of predictive analytics expands beyond niches like suicide prevention and road safety into
more contentious areas like child protection and parole and sentencing decisions, a growing number of
voices are being raised against the opaque algorithms that govern such software.
The policy forum The Mandarin tackled the issue and posed several questions:
“A host of emerging issues include machine bias, exaggerated predictive accuracy, ethics and the
infiltration of programmers’ values into algorithm design. Rather than slowing down the use of
predictive analytics in regulation, however, these techniques are now being extended to areas as diverse
as pollution control, access to alcohol and cycling rules. The bottom line is the need for effective
strategies to build trust and legitimacy if these tools are to be used to generate public good.[…]
“Do data-driven techniques for making decisions threaten to supplant regulators’ traditional approaches
to problem-solving using their expertise and experience? What issues does the use of big data and
predictive analytics present for the consumers and citizens affected by these regulatory
processes? Talking about transparency is fine, but how do you explain the algorithm to a family having a
child removed from their care?”