The irrepressible Minister of Justice Denis Malyuska has started talking about using artificial intelligence when choosing pre-trial restraint measures. Opinions in the professional community were divided. Some believe the minister catastrophically lacks experience and an understanding of the specifics of criminal procedure. Others think Malyuska is simply a talkative fool.
What counts as progress abroad does harm in Ukraine.
Be that as it may, we are talking about a whole minister. That is, he may move from words to action. So I decided to find out how things stand abroad with the use of artificial intelligence. To study, so to speak, the foreign experience.
So, in December 2018, the European Commission for the Efficiency of Justice adopted the Charter on Ethical Principles of the Use of Artificial Intelligence in Judicial Systems. It specifically noted that any programs and algorithms are only a support tool and cannot make decisions in place of a competent official.
And here, of course, the experience of England is particularly interesting. In the city of Durham, an interesting experiment has been under way for several years: to assess risks, the police use a software product called HART (Harm Assessment Risk Tool), developed by the University of Cambridge.
It has access to more than a hundred thousand police case files accumulated over the past five years. Using a sophisticated algorithm to evaluate these data, HART's artificial intelligence produces a forecast for each new detainee. The competent authorities then decide whether the detainee can be released on bail or whether it is safer to keep him or her in custody.
At first, the system analyzed only sex, age, home address, the place and time of the offense, and whether it was a repeat offense. Later, the local press raised a wave of discontent: it turned out the algorithm discriminated against people living in poor neighborhoods.
The police removed residence-registration data from the database but introduced additional parameters. Now the "machine" takes into account credit history, health data, school performance, social-media activity, and much more.
A similar system called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) has been used for twenty years in some American states. It should be noted that these and similar systems are used largely in test mode, and the practice of their use is highly controversial.
And what do we have? To begin with, it is important to understand that such programs are absolutely useless without vast and incredibly expensive databases containing information about you and me. In that same Durham, for example, the police bought a database on fifty million UK adults from Experian, the country's largest consumer credit agency. So we would first have to digitize and systematize all the accumulated information about our citizens.
That is a long, complicated, and insanely expensive process. But suppose we did it. Given the level of corruption and the absence of real protection for citizens' personal data, it is easy to guess that the very next day this database would fall into the hands of criminals, debt collectors, advertisers, and who knows who else. Anyone with access would be able to use this information as they pleased.
Yes, artificial intelligence in criminal proceedings could become a kind of restraint that would rule out harsh repressive measures against a person when there are no real grounds for them. At present, besides criminals, people are jailed simply to put pressure on them: that is how a bribe or a confession is "extracted". Meanwhile, real murderers walk free just because their detention "could cause social unrest".
However, the experience of using automated systems in Ukraine shows that all of them are vulnerable to interference by their operators. If you know how the algorithm works, you can get the result you want.
Clever court-office staff, for example, long ago learned to trick the automated system for distributing cases among judges by manipulating its settings. Play with the settings, and the right case lands with the "right" judge.
I have no doubt that law-enforcement officers and judges will quickly learn to feed the "right" conclusion to the artificial intelligence. Even now it works very simply. To a suspicion of bribe-taking, a few more episodes are carefully added so that there are signs of repeat offending. The amount of the bribe miraculously grows to the required size. The legal classification changes, and now the crime is serious enough to justify detention. All it takes is the desire.
So it turns out that blind copying of progressive mechanisms used abroad does nothing but harm in our realities. That is why I hope the "good intentions" of overreaching ministers will remain just that: intentions. A good idea implemented with crooked hands and dull minds always has terrible consequences.
Maxim Mogilnitsky, Ukraine