The Spanish government announced this week a significant overhaul of a police system that uses an algorithm to identify potential repeat victims of domestic violence, after authorities faced questions about the system's effectiveness.
The system, VioGén, requires police officers to ask a victim a series of questions. The answers are entered into a software program that produces a score — from no risk to extreme risk — intended to flag the women most vulnerable to repeat abuse. The score helps determine what police protection and other services a woman can receive.
An investigation by The New York Times last year found that the police rely heavily on the technology and almost always accept the decisions made by the VioGén software. Some women whom the algorithm classified as facing no risk or low risk of further harm later suffered abuse again, including dozens who were killed, the Times found.
Spanish authorities say the changes announced this week are part of a long-planned update to the system, which was introduced in 2007. They said the tool has helped police departments with limited resources protect vulnerable women and reduce the number of repeat attacks.
In the updated system, VioGén 2, the software will no longer be able to classify women as facing no risk. Police must also enter more information about a victim, which, according to the authorities, will lead to more accurate predictions.
Other changes are intended to improve collaboration among the government agencies involved in cases of violence against women, including by making it easier to share information. In some cases, victims will receive personalized protection plans.
“Machismo is knocking at our doors and doing so with a violence we have not seen in a long time,” Ana Redondo, the minister of equality, said at a news conference on Wednesday. “It is not the moment to take a step back. It is the moment to take a leap forward.”
Spain's use of an algorithm to guide the treatment of gender violence is a far-reaching example of how governments are relying on algorithms to make important societal decisions, a trend expected to grow with the use of artificial intelligence. The system has been studied as a potential model by other governments seeking to combat violence against women.
VioGén was created with the belief that an algorithm based on a mathematical model can serve as an impartial tool to help the police find and protect women who might otherwise be overlooked. The yes-or-no questions include: Was a weapon used? Does the aggressor have economic problems? Has the aggressor shown controlling behaviors?
Victims classified as higher risk received more protection, including regular patrols by their home, access to a shelter and police monitoring of their abuser's movements. Those with lower scores received less support.
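The mechanism described above — yes-or-no answers combined into a score that maps to a risk tier — can be sketched in code. To be clear, VioGén's actual questions, weights and thresholds are not public; every indicator, weight and cutoff below is invented purely to illustrate the general shape of such a system.

```python
# Hypothetical sketch of a questionnaire-based risk classifier in the
# style the article describes. All weights and thresholds are invented
# for illustration; they are NOT VioGén's real parameters.

# Each yes/no indicator carries an assumed weight.
INDICATOR_WEIGHTS = {
    "weapon_used": 3,
    "aggressor_economic_problems": 1,
    "controlling_behavior": 2,
}

# Assumed score cutoffs mapping to the tiers mentioned in the article,
# from no risk up to extreme risk.
RISK_TIERS = [
    (0, "no risk"),
    (1, "low"),
    (3, "medium"),
    (5, "high"),
    (6, "extreme"),
]

def risk_level(answers: dict) -> str:
    """Sum the weights of indicators answered 'yes', then map the
    total to the highest tier whose cutoff it reaches."""
    score = sum(w for key, w in INDICATOR_WEIGHTS.items() if answers.get(key))
    level = RISK_TIERS[0][1]
    for cutoff, name in RISK_TIERS:
        if score >= cutoff:
            level = name
    return level

# Example: a weapon plus controlling behavior scores 3 + 2 = 5.
print(risk_level({"weapon_used": True, "controlling_behavior": True}))  # high
```

In a real deployment the score would feed into decisions about patrols, shelter access and monitoring, with officers able to override the output — the human-oversight step the article returns to below.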
As of last year, Spain had more than 100,000 active cases of women who had been evaluated by VioGén, and about 85 percent of the victims were classified as facing little risk of being hurt by their abuser again. Police officers in Spain are trained to overrule VioGén's recommendations when justified, but the Times found that the risk scores were accepted roughly 95 percent of the time.
Victoria Rosell, a judge in Spain and a former government delegate against gender violence, said a period of “self-criticism” was needed for the government to improve VioGén. She said the system could be more accurate if it pulled information from other government databases, including health care and education systems.
Natalia Morlas, president of Somos Más, a victims' rights group, said she welcomed the changes, which she hoped would lead to better risk assessments of victims by the police.
“Calibrating the victim's risk well is so important that it can save lives,” Ms. Morlas said. She added that it was essential to keep human oversight of the system because a victim “has to be treated by people, not by machines.”
Adam Satariano is a technology correspondent for The Times, based in London.