
‘Uncritical reliance’ on AI in criminal justice could lead to ‘wrong decisions’, says Law Society

Warning contained in new commission report

“An uncritical reliance on tech” in the justice system is raising alarm bells for the Law Society.

In a report published this week, Chancery Lane highlights a lack of accountability and transparency, along with potential human rights challenges, posed by algorithms being developed by police, prisons and border forces, such as facial recognition, predictive crime mapping and mobile phone data extraction.

There are increasing concerns about police forces piloting facial recognition technology that can, for instance, cross-reference someone at a particular public event against crime data, and about algorithms that predict an individual's risk of committing further crimes over a given time period.
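For illustration only, here is a minimal sketch in Python of the kind of calculation a risk-prediction tool might perform: a simple logistic scoring function. The feature names, weights and threshold are entirely invented for this example; real tools piloted by police forces use different data, models and safeguards.

```python
import math

# Hypothetical feature weights -- invented for illustration, not drawn
# from any real policing tool.
WEIGHTS = {"prior_offences": 0.6, "age_under_25": 0.4, "months_since_release": -0.05}
BIAS = -1.5

def reoffending_risk(features: dict) -> float:
    """Return a score between 0 and 1 from a simple logistic model."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

# Example: two prior offences, under 25, released six months ago.
print(reoffending_risk({"prior_offences": 2, "age_under_25": 1, "months_since_release": 6}))
```

Even this toy version shows why oversight matters: the weights encode judgements about which factors should count and by how much, and those judgements are invisible to anyone who only sees the final score.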

Christina Blacklaws, president of the Law Society, said:

“Complex algorithms are crunching data to help officials make judgement calls about all sorts of things … [and] … while there are obvious efficiency wins, there is a worrying lack of oversight or framework to mitigate some hefty risks … that may be unwittingly built in by an operator.”

The 80-page report, authored by a commission set up by the Law Society last year, sets out the challenges that algorithms raise, such as bias and discrimination.


Because algorithms "encode assumptions and systematic patterns", they can reinforce and then embed discrimination. The report reads: "If, as is commonly known, the justice system does under-serve certain populations or over-police others, these biases will be reflected in the data, meaning it will be a biased measurement of the phenomena of interest, such as criminal activity."
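The commission's point about biased measurement can be made concrete with a toy simulation (all numbers hypothetical). Suppose two groups offend at exactly the same rate, but one is policed twice as intensively, so its offences are detected more often: the recorded crime data then differ between the groups, and any algorithm trained on those records learns the policing pattern rather than the underlying behaviour.

```python
import random

random.seed(0)

OFFENDING_RATE = 0.10              # identical true offending rate for both groups
DETECTION = {"A": 0.3, "B": 0.6}   # hypothetical: group B is policed twice as heavily

def recorded_crime_rate(group, population=100_000):
    """Fraction of the group with a *recorded* offence, not a true one."""
    recorded = sum(
        random.random() < OFFENDING_RATE and random.random() < DETECTION[group]
        for _ in range(population)
    )
    return recorded / population

print(recorded_crime_rate("A"))  # roughly 0.03
print(recorded_crime_rate("B"))  # roughly 0.06 -- the data differ, the behaviour does not
```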

There is also a concern that different government agencies are not talking to each other. As Blacklaws puts it: "Police, prisons and border forces are innovating in silos to help them manage and use the vast quantities of data they hold about people, places and events", yet there is an "absence of … centralised coordination or systematic knowledge-sharing between public bodies."

Chancery Lane makes a number of recommendations as a result of the research findings, including ensuring that public bodies, rather than tech companies, take ownership of the software involved, and setting up a National Register of Algorithmic Systems as an "initial scaffold for further openness, cross-sector learning and scrutiny."

The commission also mapped all the known algorithms currently being deployed or developed by police in England and Wales.

The commission included members of the Law Society and academics, as well as Andrea Coomber of the all-party law reform and human rights organisation, Justice.
