It’s hard to generate good evidence on which to base policy and programs. It’s even harder when that evidence exists, is publicly available, and is blatantly ignored.
Let’s talk about generating evidence first…
Sometimes, when we talk about comparing a group receiving services with a group not receiving them, it is argued that it is unethical to deny anyone a service (this objection always comes up when planning social impact bonds and other pay-for-success programs). The argument rests on an unstated assumption that all services benefit their recipients. Results from the Justice Data Lab in the UK show that services are not always beneficial: some programs intended to reduce reoffending actually increased it.
The Justice Data Lab was launched on a trial basis in April 2013. For organisations delivering services to reduce reoffending, it provided the first opportunity to have an effect size calculated against a matched national comparison group, at no financial cost. A key condition of the Justice Data Lab was that the Ministry of Justice would publish all results.
For more information on how the Lab works, see the brief summary I wrote last year.
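To make the idea of a matched comparison group concrete, here is a minimal sketch of one-to-one propensity score matching on invented data. The covariates, sample sizes and matching rule are all assumptions for illustration; the Lab matches on real administrative records and its actual procedure differs in detail.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented covariates standing in for what a real match would use
# (age, offence history, etc.) -- illustration only.
n_pool, n_treated = 5000, 60
pool = rng.normal(size=(n_pool, 3))                 # national comparison pool
treated = rng.normal(loc=0.3, size=(n_treated, 3))  # program participants

# Propensity model: estimated probability of being a participant,
# given the covariates.
X = np.vstack([treated, pool])
y = np.concatenate([np.ones(n_treated), np.zeros(n_pool)])
scores = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
p_treated, p_pool = scores[:n_treated], scores[n_treated:]

# One-to-one nearest-neighbour match on the propensity score.
matches = [int(np.argmin(np.abs(p_pool - p))) for p in p_treated]
print(f"{len(set(matches))} distinct comparators matched to {n_treated} participants")

# With real data, the headline result is the difference in one-year
# reoffending rates between participants and their matched comparators.
```

Calipers, matching with replacement and balance checks are details a real analysis would add; the point here is only the shape of the procedure.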
Critics of the Justice Data Lab point out that organisations can choose which names they submit, and so could bias the results in their favour. Even so, not all results have been positive.
Up to October 2014, 93 programs had applied. Only 30 of these produced statistically significant results: 25 were shown to reduce reoffending, and five to increase it.
[Technical note: A non-statistically-significant result can stem from several features in combination, including a small effect size (the difference in reoffending between those receiving the service and similar ex-offenders not receiving it), a small sample size (how many people were in the program) and a low matching rate. The Justice Data Lab requires that at least 60 people’s names be submitted for matching against similar ex-offenders, but it is not always able to match them all. If only 30 offenders could be matched, the program would need an effect size of at least 15 percentage points for the result to be statistically significant at the 95% confidence level. That is very high: only one of the programs so far has produced a difference greater than 15 percentage points. (Significance at the 95% level means that, if the program actually had no effect, a difference as large as the one observed would arise by chance less than 5% of the time.)]
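As a rough check on that figure, here is a back-of-envelope sketch of the minimum detectable effect. It assumes a baseline one-year reoffending rate of about 40% and treats the large matched comparison group’s rate as known exactly; both are simplifying assumptions, not the Lab’s actual method.

```python
import math
from scipy import stats

def min_detectable_effect(n_treated, base_rate=0.40, confidence=0.95):
    """Smallest difference in reoffending rate that would reach
    statistical significance, treating the comparison group's rate as
    exact (it is drawn from a large national pool)."""
    z = stats.norm.ppf(1 - (1 - confidence) / 2)  # two-sided critical value, ~1.96
    se = math.sqrt(base_rate * (1 - base_rate) / n_treated)
    return z * se

for n in (30, 60, 200):
    print(f"n = {n:3d}: difference of ~{min_detectable_effect(n):.1%} needed")
# n =  30: ~17.5% -- in the same ballpark as the 15 points quoted above
# n =  60: ~12.4%
# n = 200: ~6.8%
```

Doubling the matched sample from 30 to 60 shrinks the detectable difference by a factor of about √2, which is one reason the Lab’s 60-name minimum matters.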
The UK is currently undergoing a huge policy reform, Transforming Rehabilitation. What role the Justice Data Lab and its results will play in this process is unknown. Sometimes the hardest part of the evidence cycle is making decisions that reflect the evidence.
Disney’s anti-evidence programming
Perhaps the most notorious of the programs that consistently increase reoffending is Scared Straight. Scared Straight involves taking young people into prisons, where they speak to incarcerated offenders and ‘experience’ being locked up. The idea is that they will be so shocked by what they see that they will never offend and risk a life behind bars. Unfortunately, for the young people participating in these programs, the likelihood of ending up in prison increases.
Scared Straight programs spread across the US after a documentary of the same name won an Academy Award in 1979. Many of these programs were never evaluated, but two studies published only a few years later, in 1982 and 1983, reviewed seven evaluations and found that the program never reduced reoffending and, overall, increased it. These analyses have been repeated several times, with the same results (Petrosino (2013) Scared Straight Update).
Despite this evidence being publicly available and fairly well known, in January 2011 the TV channel A&E (part-owned by Disney) began broadcasting its new series Beyond Scared Straight. The program follows “at-risk teens” and is “aimed at deterring them from a life of crime”. Despite outcry, public condemnation and petitions, the channel refuses to cancel the series, which is about to enter its eighth season.
The Washington State Institute for Public Policy estimates that each dollar spent on Scared Straight programs incurs $166.88 in costs, making it the only juvenile justice program on their list with a negative benefit-to-cost ratio (see their summary below).
For the young people enticed into the program, their prize is not only a terrifying experience, but a greater likelihood of a stint in one of the unhappiest places on Earth.
Useful references
The Cochrane Collaboration – systematic reviews of evaluations in health.
The Campbell Collaboration – the sister organisation of Cochrane for other policy areas, and the publisher of the Scared Straight update.
Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials – from the UK Cabinet Office; clearly sets out how public programs can use randomised controlled trials to develop evidence-based programs. Rather than ‘denying’ service, the authors encourage randomising the rollout of a new program, for example, as a cost-neutral way of enabling better data collection and learning.
Creating a ‘Data Lab’ – from NPC, on the proposal that initiated the Justice Data Lab and the continuing work to seed similar programs in other service areas.
Transforming Rehabilitation: a summary of evidence on reducing reoffending (second edition) – Ministry of Justice, 2014.
What Works to Reduce Reoffending: A Summary of the Evidence – Scottish Government, 2011.