Spain has become reliant on an algorithm to score how likely a domestic violence victim is to be abused again, and to decide what protection to provide — sometimes with fatal consequences.
Having worked in making software for almost 3 decades, including in Finance both before and after the 2008 Crash, this blind reliance on algorithms for law enforcement and victim protection scares the hell out of me.
An algorithm is just an encoding of whatever the people who made it think will happen: it’s like consulting those actual people directly, only worse, because by necessity an algorithm has a fixed set of input parameters and can’t ask more questions when something “smells fishy”, the way a person would.
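To make the point concrete, here is a deliberately toy sketch of what such a risk scorer amounts to. All field names and weights are invented for illustration; the real Spanish system (VioGén) is not public in this form. The point is that everything the designers didn’t think to ask about simply doesn’t exist for the algorithm:

```python
# Hypothetical sketch: a risk-scoring "algorithm" is its authors'
# assumptions frozen into code. Fields and weights are made up.

def risk_score(prior_incidents: int,
               weapon_mentioned: bool,
               victim_fears_escalation: bool) -> int:
    """Score risk from a FIXED set of inputs. Anything not on this
    list -- tone of voice, a story that doesn't add up, a victim
    visibly downplaying things -- is invisible to the score."""
    score = 0
    score += 2 * prior_incidents           # weight chosen by the designers
    score += 5 if weapon_mentioned else 0  # binary: nuance is lost
    score += 3 if victim_fears_escalation else 0
    return score

# A case where everything that "smells fishy" falls outside the form:
print(risk_score(prior_incidents=0,
                 weapon_mentioned=False,
                 victim_fears_escalation=False))  # -> 0, i.e. "low risk"
```

A human interviewer in the same situation could probe further; this function, by construction, cannot.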
Also, making judgements by “entering something in a form” tends to close people’s thinking. Instead of pondering the case and using their intuition — for example, noticing from the way people talk that they’re understating the gravity of the situation — people filling in forms tend to do it mindlessly, like a box-ticking exercise. And that’s not even getting into the whole “as long as I fill in the form, my ass is covered” effect: once responsibility is delegated to the algorithm, people play it safe and don’t dispute its results even when their instincts say otherwise.
For anybody with experience in modelling, in embedding computer algorithms within human processes, and in how users actually treat such things (the “computer says no” effect), this shit really is scary at many levels.