The UK government has quietly quashed a plan that would have required government departments to create a process by which ordinary citizens could question policy decisions made with the aid of “black box” algorithms.

The internal processes of these algorithms are opaque, and some members of parliament (MPs) wanted a “right of explanation” afforded to citizens who are directly affected. The MPs also believe that transparency and accountability in government could only be enhanced if concerned parties could challenge both the basis for and the results of particular algorithms.

Beckie Smith filed this report on Civil Service World:

The government acknowledged the need for the civil service to be transparent about how it uses algorithms. However, it did not commit to producing and maintaining a list of where algorithms “with significant impacts” are used, as recommended by the committee.

It also failed to address the committee’s call to add the use of algorithms to a ministerial brief. MPs had urged the government to appoint a named minister to oversee and coordinate its departments’ use and development of algorithms and their partnerships with industry.

And it appeared to reject the committee’s recommendation for the Crown Commercial Service to commission the Alan Turing Institute or another expert body to conduct a review establishing an appropriate procurement model for algorithms. The response said the CCS regularly reviewed areas where there might be a need for commercial procurement agreements, but added that it would work with the Turing Institute and others to inform the categories for review.