Code and colonialism: how risk algorithms keep failing Māori

Two risk-scoring tools have shaped sentencing and parole decisions in New Zealand for over 25 years. Their designers promised neutrality. The data tells a different story.

Summary of research by Emily Silby



Māori make up around 17% of New Zealand's population. They represent more than half of its prison population. That gap has persisted for decades — and, this article argues, has been quietly entrenched by algorithmic tools that were supposed to make the justice system fairer.

Since 1999, two tools have been at the centre of criminal justice decision-making in New Zealand: RoC*RoI (Risk of Reconviction × Risk of Reimprisonment) and YORST (Youth Offending Risk Screening Tool). Both were introduced with good intentions. Both have been repeatedly shown to produce outcomes that disadvantage Māori — and both remain in use.



THE TOOLS

RoC*RoI was built from the criminal histories of 133,000 individuals dating back to 1983. Race was an explicit predictor variable from the start. The model scores individuals between 0 and 1 — anything above 0.7 is "very high risk" — and those scores feed into pre-sentence reports, sentence planning, parole board decisions, and programme eligibility.

The problem is structural. The model runs on static data: things that cannot change, like age at first conviction, or that only move in one direction, like the number of previous convictions. If the historical data encodes decades of over-policing in Māori communities, the algorithm doesn't correct for that — it learns from it and reproduces it.

Higher scores lead to harsher sentences, longer criminal records, and even higher scores at the next point of contact. The Department of Corrections' own 2009 report acknowledged this could have "potentially pernicious amplifying effects on certain sub-populations of offenders (such as Māori)."
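The feedback loop described above can be made concrete with a toy model. This is purely an illustrative sketch: the function, weights, and threshold below are hypothetical and bear no relation to the actual RoC*RoI formula, which is not public in this form. It shows only the structural point the article makes, that a score built from static history variables can ratchet upward but never down.

```python
# Toy illustration of a static-feature risk score and its feedback loop.
# All weights and variable choices here are hypothetical, NOT the real
# RoC*RoI model; they exist only to show the one-way ratchet effect.

def risk_score(age_at_first_conviction: int, prior_convictions: int) -> float:
    """Toy score in [0, 1] built only from static history variables."""
    # Younger age at first conviction pushes the score up; this value
    # is fixed for life and can never improve.
    youth_factor = max(0.0, (25 - age_at_first_conviction) / 25)
    # Prior convictions only ever accumulate, so this term only rises.
    history_factor = min(1.0, prior_convictions / 10)
    return round(0.5 * youth_factor + 0.5 * history_factor, 2)

# Someone first convicted at 15 with three priors:
score = risk_score(15, 3)
print(score)  # 0.35 under these toy weights

# A harsher outcome adds a conviction, which raises the next score.
# Nothing in the inputs can ever move the score back down.
next_score = risk_score(15, 4)
print(next_score > score)  # True: the loop only ratchets upward
```

The point of the sketch is the asymmetry: because every input is static or monotonically increasing, any bias in who accumulates convictions in the first place is locked in and amplified at each subsequent assessment.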

YORST was introduced in 2007 for youth, and applies to children from age 10. Because juvenile criminal records are thin, it drew on proxies: the decile of a child's primary school, their peer group, substance use, family history. These are not behaviours — they are circumstances. A child cannot choose the school their neighbourhood can afford.

WHAT THE WAITANGI TRIBUNAL FOUND

In 2002, Department of Corrections probation officer Tom Hemopo brought an urgent claim to the Waitangi Tribunal, arguing that RoC*RoI discriminated against Māori. The Department acknowledged that two people with identical criminal histories would receive different scores depending on their ethnicity. Its response was to zero-rate the ethnicity variable. Hemopo then re-ran assessments with the revised tool, and found that Māori offenders still received higher scores.

An internal email produced during the hearing showed that staff had been looking for a variable that was "highly correlated with ethnicity" without changing the model's stated accuracy. In other words: a proxy.

If police institutional bias against Māori has contributed to higher prosecution and conviction rates for Māori, the data set will simply reflect the consequences of such bias without highlighting, let alone correcting, its distorted effect.
— Waitangi Tribunal, Offender Assessment Policies Report (2005)

The claim was ultimately unsuccessful — ethnicity had been formally zero-rated, and Māori offending rates had slightly declined at the time. But Hemopo returned to the Tribunal in 2015, thirteen years later, with the same concern. By then, 80.9% of sentenced Māori prisoners were being reconvicted within five years.



BY THE NUMBERS

  • Māori share of New Zealand's general population (2018 census): around 17%

  • Māori share of the prison population: more than half

  • Sentenced Māori prisoners reconvicted within five years: 80.9%

  • Same figure for non-Māori



THE DEEPER ARGUMENT

Silby's article traces the problem back further than the algorithms themselves. The data these tools were trained on reflects a legal history that criminalised Māori culture outright — from the 1863 acts that allowed the Crown to seize land from anyone defending it, to the 1907 Tohunga Suppression Act that made traditional healing practices illegal. The criminal justice system was built on that foundation. The algorithms learned from it.

An algorithm will take that historic data and locate a pattern, necessarily continuing that bias. This is what ‘success’ looks like for a programme and nothing more.
— Emily Silby

The article also raises the question of data sovereignty — the right of Māori to govern how data about their communities is collected and used. Te Mana Raraunga, the Māori Data Sovereignty Network (founded 2018), now provides some oversight, but the tools that feed on that data predate it by nearly two decades.

The 2019 NZ Law Foundation report on government AI reached similar conclusions: YORST's variables function as proxies for ethnicity, and zero-rating race in RoC*RoI does not make the tool free of discriminatory effect. It recommended independent oversight and meaningful consultation — neither of which had occurred when the tools were first introduced.



WHERE THINGS STAND

The government's responses since have been real but indirect: a redesigned Tauranga courthouse built in consultation with iwi, a new Criminal Cases Review Commission, restored legal aid funding. In 2018, the Department of Corrections' RoC*RoI was listed in the Algorithm Assessment Report as its sole operational decision-making algorithm — a formal acknowledgement, at least, that it exists.

New Zealand has also signed an Algorithm Charter committing signatories to transparency and consultation. The Department of Corrections is a signatory. The tools remain in use.

If RoC*RoI and YORST are more ‘successful’ for New Zealand European than Māori, then perhaps utilise these tools just for New Zealand European offenders.
— Emily Silby

That provocation sits at the heart of the article. Not as a serious policy proposal, but as a way of making the logic plain: if a tool performs differently across ethnic groups, then its results are not neutral. Calling them neutral doesn't make them so.
