Artificial Intelligence on Trial

Francis Toye

At Unilink we are always scanning the market for new technologies that can help make prisons and probation work better.

Artificial Intelligence (AI) is defined as “the simulation of human intelligence processes by machines, especially computer systems. The processes include machine learning, reasoning and self-correction”, and it is spreading within the Justice system internationally, with applications ranging from security to offender management.

AI has achieved notoriety in some areas… Recall Microsoft’s chatbot “Tay”, which, having been created to converse naturally with humans, was spewing out bile within 24 hours.

In the Justice sector, a program developed by UCL, Sheffield and Pennsylvania universities to analyse court documents gave the same verdict as humans 79% of the time, which serves to show the limits of AI applications.

What is the “right” answer? Is repeating the results of human operatives what is needed? Is it repeating human bias based upon human observations?

Screenshot of the vulnerability prediction tool beta version.

Caution is required when applying AI in an area as contentious as, say, sentencing. “It has been suggested that machine learning algorithms may help in curbing problems concerning inter-judge sentencing disparity.

It is argued that, insofar as the unfairness of sentencing disparity is held to reflect a retributivist view of proportionality, it is not necessarily the case that increasing inter-judge uniformity in sentencing is desirable.

More generally, it is shown that the idea of introducing machine learning algorithms, that produce sentencing predictions on the ground of a dataset that is built of previous sentencing decisions, faces serious problems if there exists a discrepancy between actual sentencing practice and the sentences that are ideally desirable.” (Ryberg, J. Sentencing Disparity and Artificial Intelligence. J Value Inquiry (2021)).

But there are less contentious areas than the courtroom. What about, for example, cell allocation? It is important to allocate people together who can be expected to live amicably; get this wrong and the consequences can be serious. AI can in principle take account of far more variables than the human operatives making the selection.

Moreover, an AI can go far beyond a human operator in difficult cases of over-crowding, re-allocating multiple cells at once to avoid future problems. This of course happens today, with risk assessments and other factors taken into account, but what is our measure of success, and can AI perhaps assist?
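As a purely illustrative sketch (not Unilink’s or any prison service’s actual method), the kind of calculation involved can be shown as a simple compatibility score over candidate cell pairings; every field name, weight and rule below is hypothetical.

```python
from itertools import combinations

# Hypothetical records only; the attributes and weights are invented
# for illustration and do not reflect any real allocation system.
PRISONERS = {
    "A": {"risk": 2, "affiliation": None, "smoker": True,  "conflicts": {"C"}},
    "B": {"risk": 1, "affiliation": None, "smoker": True,  "conflicts": set()},
    "C": {"risk": 4, "affiliation": "x",  "smoker": False, "conflicts": {"A"}},
    "D": {"risk": 3, "affiliation": "x",  "smoker": False, "conflicts": set()},
}

def pair_score(p1, p2):
    """Lower is better: penalise known conflicts, shared affiliations,
    combined risk and lifestyle mismatches."""
    a, b = PRISONERS[p1], PRISONERS[p2]
    score = a["risk"] + b["risk"]
    if p2 in a["conflicts"] or p1 in b["conflicts"]:
        score += 100   # known conflict: effectively rules the pairing out
    if a["affiliation"] and a["affiliation"] == b["affiliation"]:
        score += 10    # shared affiliation flagged for staff review
    if a["smoker"] != b["smoker"]:
        score += 2     # minor lifestyle mismatch
    return score

# Rank every possible pairing; staff would see the ranked list and decide.
# The program only advises.
for p1, p2 in sorted(combinations(PRISONERS, 2), key=lambda p: pair_score(*p)):
    print(p1, p2, pair_score(p1, p2))
```

The point of such a score is that it can incorporate dozens of variables and be recomputed across a whole wing at once, which is exactly where a human operator runs out of time.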

The graphic shows the evolution of a prisoner’s overall vulnerability level over several months, considering visits, work, phone, meals, gym and activities.

Unilink is seeking to combine observations of behaviour with measures of self-isolation for people in prison, determined from their engagement with the prison regime: interactions with friends and family, the number of applications submitted, and so on. On this basis Unilink is developing, with Serco, a “Vulnerability Predictor”.

This would be difficult in a prison running on paper-based systems, but it is relatively straightforward where massive amounts of operational data are generated by residents through the digitisation of operational activities, such as that provided by Unilink’s self-service.

Machine Learning can be applied to the data to measure the impact of various parameters on a resident’s self-isolation and to see whether that correlates with the likelihood of self-harm.
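As a minimal sketch of what such an analysis could look like, the following uses scikit-learn on entirely synthetic data; the feature names, labels and model choice are assumptions for illustration and are not the Vulnerability Predictor itself.

```python
# Sketch only: synthetic engagement features and a synthetic concern flag,
# not Unilink's actual data or model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500

# Hypothetical weekly engagement measures per resident.
X = np.column_stack([
    rng.poisson(2, n),   # visits booked
    rng.poisson(5, n),   # phone calls made
    rng.poisson(3, n),   # applications submitted
    rng.poisson(4, n),   # gym / activity sessions attended
])

# Synthetic stand-in for a recorded self-harm concern; low engagement is
# made (artificially) more likely to be flagged, just to give structure.
probability = 1 / (1 + np.exp(X.sum(axis=1) - 10))
y = rng.random(n) < probability

# Cross-validated discrimination tells us whether the engagement measures
# carry any signal at all before anyone acts on them.
model = LogisticRegression(max_iter=1000)
print("cross-validated AUC:",
      cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())

# The fitted coefficients indicate which measures are most strongly
# associated with the flag: advisory insight, not an automatic decision.
model.fit(X, y)
print("coefficients:", model.coef_[0])
```

The output of any such model would only ever inform staff judgement, in line with the advisory role described below.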

This is an example of “AI playing a more modest role using machine learning techniques to help us improve our understanding about what works, learn from our mistakes, and play only an advisory role in decision making processes” (Artificial Intelligence in Prisons in 2030; Pia Puolakka, Criminal Sanctions Agency, Finland & Steven Van De Steene, Enterprise Architect and Technology Consultant, ACJ, November 2021). Ultimately there are real gains to be made by the intelligent application of these new technologies, provided the approach is well considered.

Francis Toye

Francis Toye is Unilink’s Founder and CEO.
Helping Prisons and Probation Work: For Offenders, for Staff, for Society.
Unilink specialises in innovative solutions for criminal justice sectors around the world. Unilink’s reputation has been built on over twenty years’ experience in case management systems for probation, custodial management systems, biometric applications, offender self-service and communications. All Unilink’s solutions are created with the direct input of industry professionals and learnings from the 200+ establishments that use a Unilink product.
Independent research from the University of York shows that Unilink’s self-service software contributes significantly towards rehabilitation and running prisons efficiently.
The system is well proven and tested; offenders have carried out over two billion transactions. Unilink’s rich portfolio of proven solutions underpins digital transformation in prison and probation services across the UK, Norway, Austria, the Netherlands, Australia and New Zealand.
Unilink is a multi-award-winning company, having won the Queen’s Award for Enterprise in Innovation, “Best Citizen App” and the overall UK Digital Leader award in recent years.
