Developing a counterfactual for a social impact bond (SIB)

The following was taken from a presentation by Sally Cowling, Director of Research, Innovation and Advocacy for UnitingCare Children, Young People and Families. The presentation was to the Social Impact Measurement Network of Australia (SIMNA) New South Wales chapter on March 11 2015. Sally was discussing the measurement aspects of the Newpin Social Benefit Bond, which is referred to as a social impact bond in this article for an international audience.

The social impact bond (called a Social Benefit Bond in New South Wales) was something very new for us. The Newpin (New Parent and Infant Network) program had been running for a decade supported by our own investment funding, and our staff were deeply committed to it. When our late CEO, Jane Woodruff, appointed me to our SIB team she said my role was to ‘make sure this fancy financial thing doesn’t bugger Newpin up’.

One of the important steps in developing a social impact bond is to develop a counterfactual: an estimate of what would have happened to the families and children involved in Newpin without the program, the ‘business as usual’ scenario. This was the hardest part of the SIB. The Newpin program works with families to help them become strong enough for their children to be restored to them from care. But the administrative data didn’t enable us to compare groups of potential Newpin families based on risk profiles to determine the probability that children in care would be restored to their families. We needed this to estimate the difference the program could make for families, and to assess the extent to which Newpin would realise government savings.

Experimenting with randomised control trials

NSW Family and Community Services (FACS) were keen to randomly allocate families to Newpin as an efficient means to compare family restoration and preservation outcomes for those who were in our program and those who weren’t. A randomised controlled trial is generally considered the ‘gold standard’ in the measurement of effect, so that’s where we started.

One of my key lessons from my Newpin practice colleagues was the importance of their relationships and conversations with government child protection (FACS) staff when determining which families were ready for Newpin and had a genuine probability (much lower than 100%) of restoration. When random allocation was first flagged I thought ‘this will bugger stuff up’.

To the credit of FACS, they were willing to run an experiment involving local Newpin Coordinators and their colleagues in child protection services. We created some basic Newpin eligibility criteria, and FACS ran a list from their administrative data and randomly selected 40 families (all of whom were de-identified) for both sets of practitioners to consider. A key part of the experiment was for the FACS officer with access to the richer data in case files to add notes. Through these notes and conversations it quickly became clear that a lot of mothers and fathers on the list weren’t ready for Newpin because:

  • One was living in South America
  • A couple had moved interstate
  • One was in prison
  • One had subsequent children who had been placed into care
  • One was co-resident with a violent and abusive partner – a circumstance that needed to be addressed before they could commence Newpin

From memory, somewhere between 15 and 20 percent of our automated would-be referrals would have been a good fit for the program. It was enlightening to be one of the non-practitioners in the room listening to specialists exchange informed, thoughtful views about who Newpin could have a serious chance of working for. This experiment was a ‘light bulb moment’ for all of us. For both the government and our SIB teams, randomisation was off the table. Not only was the data not fit for that purpose, we all recognised the importance of maintaining professional relationships.

In hindsight, I think the ‘experiment’ was also important to building the trust of our Newpin staff in our negotiating team. They saw an economist and an accountant listening to their views and engaging in a process of testing. They saw that we weren’t prepared to trade off the fidelity and integrity of the Newpin program to ‘get’ a SIB, and that we were thinking ethically through all aspects of the program. We were a team and all members knew where they did and didn’t have expertise.

Ultimately Newpin is about relationships. Not just the relationships between our staff and the families they work with, but the relationship between our staff and government child protection workers.

But we still had the ‘counterfactual problem’! The joint development phase of the SIB – in which we had access to unpublished and de-identified government data under strict confidentiality provisions – convinced me that we didn’t have the administrative data needed to come up with what I had come to call the ‘frigging counterfactual’ (in my head the adjective was a bit sharper!). FACS suggested I come up with a way to ‘solve’ the problem and they would do their best to get me good proxy data. As the deadline was closing in, I remember a teary, pathetic midnight moment willing that US-style admin data had found a home in Australia.

Using historical data from case files

Eventually you have to stop moping and start working. I decided to go back to the last three years of case files for the Newpin program. Foster care research is clear that the best predictor of whether a child in the care system will be restored to their family is duration in care. We profiled all the children we had worked with, their duration in care prior to entry to Newpin, and the length of the intervention. FACS provided restoration and reversal rates in a matrix structure, and matching allowed us to estimate that if we worked with the same group of families (that is, the same duration-of-care profiles) under the SIB as we had in the previous three years, then the counterfactual (the percentage of children who would be restored without a Newpin intervention) would be 25%.
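The logic of that estimate can be sketched in a few lines of Python. Every figure below is invented for illustration – the real rates came from confidential FACS administrative data, and the real matching was more involved:

    # Hypothetical sketch of a duration-matched counterfactual estimate.
    # The restoration rates by duration-in-care band are invented; the real
    # figures came from confidential FACS administrative data.
    baseline_restoration_rate = {   # P(restored without Newpin) by time in care
        "<1 year": 0.40,
        "1-2 years": 0.25,
        "2+ years": 0.10,
    }
    newpin_cohort_profile = {       # children Newpin worked with over 3 years
        "<1 year": 30,
        "1-2 years": 50,
        "2+ years": 40,
    }

    total_children = sum(newpin_cohort_profile.values())
    counterfactual = sum(
        baseline_restoration_rate[band] * n
        for band, n in newpin_cohort_profile.items()
    ) / total_children
    print(f"Estimated counterfactual restoration rate: {counterfactual:.0%}")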

As we negotiated the Newpin Social Benefit Bond contract with the NSW Government, we did need to acknowledge that a SIB issue had never been put to the Australian investment market and we needed to provide some protection for investors. We negotiated a fixed counterfactual of 25% for the first three years of the SIB. That means the Newpin social impact bond is valued and paid on the restoration rate we can achieve above 25%. Thus far, our estimates have been remarkably accurate. To the government’s immense credit, they are building a live control group that will act as the counterfactual after the first three years. This is very resource intensive, but the government was determined to make the pilot process as robust as possible.

In terms of practice culture, I can’t emphasise enough the importance of thinking ethically. We had to keep asking ourselves, ‘Does this financial structure create perverse incentives for our practice?’ The matched control group and tightly defined eligibility criteria remove incentives for ‘cherry picking’ (choosing easier cases). The restoration decisions that are central to the effectiveness of the program are made independently by the NSW Children’s Court, and we need to be confident that children can remain safely at home. If a restoration breaks down within 12 months, our performance payment for that result is returned to the government. For all of us involved in the Newpin Social Benefit Bond project, behaving thoughtfully and ethically and protecting the integrity of the Newpin program has been our raison d’être. That under the bond the program is achieving better results for a much higher-risk group of families, and spawning practice innovation, is a source of joy which is true to our social justice ethos.

Delivering the Promise of Social Outcomes: The Role of the Performance Analyst

I’ve wanted to write about performance management systems for a long time. I knew there were people drawing insights from data to improve social programs and I wanted to know more about them. I wanted to know all about their work and highlight the importance of these quiet, back-office champions. But they weren’t easy to find, or find time with.

Dan Miodovnik, Social Finance

I worked at Social Finance in London for three months in late 2013, a fair chunk of that time spent skulking around behind Dan Miodovnik’s desk. I’d peer over his shoulders at his computer screen as he worked flat out, trying to catch a glimpse of these magic performance management ‘systems’ he’d developed. At the end of my time at Social Finance, I understood how valuable the performance management role was to their social impact bonds (SIBs), but I still had no idea of what it actually entailed.

Antonio Miguel, The Social Investment Lab

Then in early 2014 Antonio Miguel and I took a two-hour bullet train ride through Japan while on a SIB speaking tour. On this train journey I asked Antonio to open his computer and show me the performance management systems he’d worked on with Social Finance. Two hours later, I understood the essential components of a performance management system, but I didn’t fully grasp the detail of how these components worked together.

So I proposed to Dan that we join Antonio on the beaches of Cascais in Portugal in August 2014. My cunning research plan was to catch them at their most relaxed and pick their brains over beach time and beers. Around this time I saw a blog written by Jenny North from Impetus-PEF that mentioned performance management. A call with her confirmed that they were as enthused about performance management as I was. So I drafted a clean, six-step ‘how to’ guide for constructing a performance management system. I hoped that with a quick edit from Dan and Antonio and a couple of quotes, I’d be done.

Interviewing Dan and Antonio blew me away. Only when I heard them talk freely about their work did I realise the magic wasn’t in their computer systems, it was in their attitudes. It was their attitude to forming relationships with everyone who needed to use their data. It was their attitude to their role – as the couriers, rather than the policemen, of data.

They told me that there were plenty of ‘how to’ guides for setting up systems like theirs, but that the difficult thing was getting people to read and implement them.

Isaac Castillo, DC Promise Neighbourhood Initiative

They suggested I throw out my draft and interview more people: people who were delivering services, and their investors. I didn’t just need to understand the system itself, I needed to understand what it meant for the people who delivered and funded services. I gathered many of these people at San Francisco’s Social Capital Markets (SOCAP) conference and several more from recommendations. One of these recommendations was Isaac Castillo, who works with the DC Promise Neighbourhood Initiative’s collective impact project. He is now managing not only his team of performance analysts, but the service delivery team too. It’s revolutionary, but it makes complete sense.

Interviewing these people has been a most humbling experience. It has revealed to me the extent of their dedication, innovation and intelligence. It has also revealed to me how little I knew, and in turn, how little we, as a sector, know about these people and their work. I am honoured to share their stories with you – please read them at deliveringthepromise.org.


This research is published by The Social Investment Lab (Portugal), Impetus-PEF (UK) and Think Impact (Australia).


Evidence-based justice – or NOT!

It’s hard to generate good evidence on which to base policy and programs. It’s even harder when that evidence exists, is publicly available and blatantly ignored.

Let’s talk about generating evidence first…

Sometimes when we talk about comparing a group receiving services to a group not receiving services, it is argued that it is unethical to deny anyone a service (this always comes up when planning social impact bonds and other pay for success programs). There is an underlying assumption in this argument that all services are beneficial to their recipients. Results from the Justice Data Lab in the UK show that services are not always beneficial, as some programs that intended to reduce reoffending actually increased reoffending.

(Figure: Justice Data Lab results to October 2014)

The Justice Data Lab was launched on a trial basis in April 2013. For organisations delivering services to reduce reoffending, it provided the first opportunity to have an effect size calculated against a matched national comparison group, for no financial cost. A key condition of the Justice Data Lab was that the Ministry of Justice would publish all results.

For more information on how the Lab works, see the brief summary I wrote last year.

Critics of the Justice Data Lab point out that organisations are able to choose which names they submit, so are able to bias the results. Despite this, not all results have been positive.

Up to October 2014, 93 programs had applied. Only 30 of these had statistically significant results: 25 were shown to reduce reoffending and five to increase it.

(Figure: breakdown of Justice Data Lab results to October 2014)

[Technical note: Non-statistically significant results could be due to a number of features in combination, including small effect size (the difference between those receiving the service and similar ex-offenders not receiving it), small sample size (how many people were in the program) and a low matching rate. The Justice Data Lab requires that at least 60 people’s names be submitted for matching with similar ex-offenders, but is not always able to match them all. If only 30 offenders were able to be matched, the program would have to have an effect size of at least 15 percentage points for the result to be statistically significant with 95% confidence. That is very high – only one of the programs so far has produced a difference of greater than 15 percentage points. (A 95% confidence level means that if there were truly no effect, a difference at least this large would be observed by chance in fewer than 5 out of 100 repetitions of the analysis.)]
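To see why small samples demand large effects, here is a minimal sketch using a two-sided two-proportion z-test. The 40% baseline reoffending rate, the equal group sizes and the choice of test are illustrative assumptions – the Justice Data Lab’s actual matched methodology differs, so the exact threshold it reports will too:

    # Illustrative only: why small matched samples need large effects.
    # Assumes a two-sided two-proportion z-test, a 40% baseline reoffending
    # rate and equal group sizes - not the Justice Data Lab's actual method.
    from math import sqrt
    from scipy.stats import norm

    def is_significant(p_control, effect_pp, n, alpha=0.05):
        """Test a reduction of effect_pp percentage points with n per group."""
        p_treat = p_control - effect_pp / 100
        p_pool = (p_control + p_treat) / 2
        se = sqrt(2 * p_pool * (1 - p_pool) / n)
        z = (p_control - p_treat) / se
        return 2 * (1 - norm.cdf(abs(z))) < alpha

    # Find the smallest detectable reduction for 30 matched offenders per group
    for effect in range(1, 40):
        if is_significant(0.40, effect, 30):
            print(f"Smallest significant reduction at n=30: ~{effect} percentage points")
            break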

The UK is currently undergoing a huge policy reform, Transforming Rehabilitation. What role the Justice Data Lab and its results will play in this process is unknown. Sometimes the hardest part of the evidence cycle is making decisions that reflect the evidence.

Disney’s anti-evidence programming: Beyond Scared Straight

Perhaps the most notorious of programs that consistently increase reoffending is Scared Straight. Scared Straight involves taking young people into prisons, where they speak to incarcerated offenders and ‘experience’ being locked up. The idea is that they’re so shocked by what they see that they will never offend and risk a life behind bars. Unfortunately, for the young people participating in these programs, the likelihood of imprisonment increases.

Scared Straight programs spread across the US after a documentary of the same name won an Academy Award in 1979. The effect of many of these programs was not evaluated, but there were two studies published only a few years later, in 1982 and 1983, showing that out of seven evaluations, not once did the program reduce reoffending, and that overall the program increased reoffending. These analyses have been repeated several times, but the results remain the same (Petrosino (2013) Scared Straight Update).


Despite this evidence being publicly available and fairly well known, in January 2011, the Disney-owned TV channel A&E began to broadcast their new series Beyond Scared Straight. The program follows “at-risk teens” and is “aimed at deterring them from a life of crime”. Despite outcry, public condemnation and petitions, the channel refuses to cancel the series, which is about to enter its eighth season.

The Washington State Institute for Public Policy estimates that each dollar spent on Scared Straight programs incurs costs of $166.88, making it the only juvenile justice program in their list with a negative cost:benefit ratio (see their summary below).

(Figure: WSIPP cost-benefit summary for Scared Straight)

For the young people enticed into the program, their prize is not only a terrifying experience, but a greater likelihood of a stint in one of the unhappiest places on Earth.

Useful references

The Cochrane Collaboration – systematic reviews of evaluations in health.

The Campbell Collaboration – the sister of Cochrane for other policy areas; it is where the Scared Straight update is published.

Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials – from the UK Cabinet Office clearly sets out how public programs can use randomised controlled trials to develop evidence-based programs. Rather than ‘denying’ service, the authors encourage randomisation of rollout of a new program, for example, as a cost-neutral way of enabling better data collection and learning.

Creating a ‘Data Lab’ – from NPC, about submitting the proposal that initiated the Justice Data Lab and continuing work to seed similar programs in other service areas.

Transforming Rehabilitation: a summary of evidence on reducing reoffending (second edition) – 2014 – published by the Ministry of Justice

What Works to Reduce Reoffending: A Summary of the Evidence – 2011 – published by the Scottish Government

Social Impact Bonds and Pay for Success – are they synonyms?

On a recent trip to the US, I noticed that the discussions around ‘Pay for Success’ were a little different to those I’d been having on ‘Social Impact Bonds (SIBs)’ in other countries. Particularly in the measurement community, there was an idea that Pay for Success took measurement of social programs to a new level: that ‘Pay for Success’ meant paying for an effect size (by comparison to a control group), rather than ‘Pay for Performance’, which paid for the number of times something occurred.

Using SROI for a Social Impact Bond

Social Return on Investment (SROI) and Social Impact Bonds (SIBs) are two ideas that are increasingly mentioned in the same breath. SROI is a measurement and accounting framework and SIBs are a way to contract and finance a service. Both require three common ingredients:

  • the quantification of one or more social outcomes for beneficiaries,
  • a valuation of these outcomes, and
  • an estimation of the cost of delivering these outcomes.

While not a necessary ingredient, SROI can contribute to the design, operation and evaluation of SIBs.

*NB the word ‘outcome’ is used here to represent a change in someone’s life – some readers (particularly in the US) may use the word ‘impact’ to mean the same thing.
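As a rough illustration of how the three ingredients combine, here is a toy SROI-style calculation. Every figure is hypothetical, and real SROI analyses adjust for deadweight, attribution and discounting, all omitted here:

    # Toy illustration of the three shared ingredients (all figures invented).
    outcomes = {"children_restored": 20}                # 1. quantified outcomes
    value_per_outcome = {"children_restored": 80_000}   # 2. valuation per outcome
    delivery_cost = 1_000_000                           # 3. cost of delivery

    total_value = sum(n * value_per_outcome[k] for k, n in outcomes.items())
    sroi_ratio = total_value / delivery_cost
    print(f"Social return per dollar invested: {sroi_ratio:.2f}")  # 1.60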


The Justice Data Lab – an overview

What is the Justice Data Lab?

The Justice Data Lab allows non-government organisations to compare the reoffending of the participants in their programmes with the reoffending of other similar ex-offenders. It “will allow them to understand their specific impact in reducing re-offending… providing easy access to high-quality re-offending information” (Ministry of Justice, Justice Data Lab User Journey p.10). There is no charge to organisations that use the Justice Data Lab.

The Justice Data Lab is a pilot run by the Ministry of Justice. The pilot began in April 2013. Each month, summaries of results and data are published, including forest plots of all results so far.

Who might use it?

The Justice Data Lab can be used by “organisations that genuinely work with offenders” (Justice Data Lab User Journey p.11). One request will provide evidence of a programme’s effect on its service users’ reoffending. Several requests could compare services within an organisation or over time to answer more sophisticated questions about what is more effective.

This information could be used by non-government organisations for internal programme improvements, to report impact to stakeholders or to bid for contracts. It was set up at the time the Ministry of Justice’s Transforming Rehabilitation programme was encouraging bids from voluntary and community sector organisations to deliver services to reduce reoffending.

What are the inputs?

Input data are required to identify the service users from a specific programme and match them with a comparison group. Information on at least 60 service users is required, and the organisation must have worked with the offenders between 2002 and 2010.

Essential:

  • Surname
  • Forename
  • Date of Birth
  • Gender

At least one of the following:

  • Index Date
  • Conviction Date
  • Intervention Start Date [note: feedback from applicants is that this is required]
  • Intervention End Date [note: feedback from applicants is that this is required]

Highly Desirable: PNC ID and/or Prison Number

Optional: User Reference Fields

What are the outputs?

The one-year proven reoffending rate is defined as the proportion of offenders in a cohort who commit an offence in a one-year follow-up period that receives a court conviction, caution, reprimand or warning either during that year or in a further six-month waiting period. The one-year follow-up period begins when offenders leave custody or start their probation sentence. A fictional example of the output provided by the Ministry of Justice is quoted below:

The analysis assessed the impact of the Granville Literacy Project (GLP) on re-offending. The one year proven re-offending rate for 72 offenders on the GLP was 35%, compared with 41% for a matched control group of similar offenders. The best estimate for the reduction in re-offending is 6 percentage points, and we can be confident that the reduction in re-offending is between 2 and 10 percentage points.
What you can say: The evidence indicates that the GLP reduced re‐offending by between 2 and 10 percentage points.

Publication
Applicants should note the following requirement: “an organisation requesting data through the Justice Data Lab must publish the final report, in full, on the organisation’s website within four months of receiving the final report.”

I’d be very interested in the opinions of applicants on this requirement. Is it an issue? Does it create perverse incentives?

What are the implications?

The implications are huge. Prior to the Justice Data Lab it was very difficult for non-government organisations to establish a comparison group against which to measure their effect. Evaluations of effect are expensive and thus prohibitive, particularly for smaller organisations. In addition, the differences in their methods and definitions meant that evidence was more difficult to interpret and compare.

This is exactly the type of evidence that developers of social impact bonds find so difficult to establish, and it will be essential to constructing social impact bonds to deliver Transforming Rehabilitation services. It is a measure of outcome, which is desirable, but often more difficult to quantify than input (e.g. how much money went into the programme), activity (e.g. what services were delivered) or output (e.g. how many people completed the programme).

New Philanthropy Capital (NPC) were involved in designing the Justice Data Lab and their Data for Impact Manager, Tracey Gyateng, is specifically thinking about applications to other policy areas.

How is it going?

See my November 2014 post on information coming out of the Justice Data Lab.

Also note the announcement of an Employment Data Lab by NPC and the Department for Work and Pensions.

More information

Information on the Justice Data Lab home page includes links to a series of useful documents:

  • User journey document – information on what the Justice Data Lab is and how to use its services.
  • Data upload template – use this template to supply data to the Justice Data Lab. Further descriptions of the variables requested are given, and there are key areas which must be filled in on the specific activities of the organisation in relation to offenders.
  • Methodology paper – details of the specific methodology used by the Justice Data Lab to generate the analysis.
  • Privacy impact assessment – a detailed analysis of how an organisation’s data will be protected at all stages of a request to the Justice Data Lab.
  • Example report template – two examples of a standard report, completed for two fictional organisations, showing what will be provided.

Criminal justice service providers might also benefit from getting involved in the Improving Your Evidence project, a partnership between Clinks, NPC and Project Oracle. The project will produce resources and support, so follow the link and let them know what would be of most use. The page also links to an introduction to the Justice Data Lab – a useful explanation of the service.

The bulk of this post has been copied directly from the Ministry of Justice documents listed above. It is intended to act as a summary of these documents for quick digestion by potential users of the Justice Data Lab. The author is not affiliated with the Ministry of Justice and does not claim to represent them.

Mapping the needs of a community

Australian update: Community Insight Australia is working to shape and translate the Community Insight tool for Australia. Please get in touch if you share our vision and would like to take the journey with us.


Policy-makers know that social programmes are more effective if they are provided in the areas of greatest need. But, historically, it has been resource-intensive to identify either areas of need or the range of needs of a particular community. This task would involve either weeks of combing through the latest data from all reliable sources or painstakingly interviewing a large enough sample to draw conclusions about the population. As a public servant, I’ve spent hours on the computer painting maps of social disadvantage, a new map for each indicator.

This problem is also faced by housing associations and other social housing landlords, who provide homes to over 4.5m households across England and operate in an environment in which accurate data about the communities they work within has become increasingly important to policy and delivery decisions. Their ability to access relevant data has been limited by poor-quality data systems and a reliance on a limited pool of research analysts to interpret the data that was available.

In response to this, new housing think-do tank HACT and social policy data experts OCSI recently launched Community Insight, a web-based tool which allows non-expert staff at all levels to explore social indicators of need geographically, quickly and easily for the first time, using constantly updated, reliable data.

I was lucky enough to have Matt Leach, from HACT, take me through Community Insight and I couldn’t have been more impressed. The key to the tool is its simplicity.

  1. You choose the geographic area you are interested in on a map of England and Wales.
  2. The tool will give you the demographic and social indicators of the area.

(Screenshot of Community Insight showing a selected area: specific houses by housing type, social indicator categories on the left, and colours on the map showing the density of the chosen indicator, mental health issues.)


Features:

  • the social indicators are also presented in comparison to the national average
  • information about a geographical area can be interacted with online or exported in seconds as a detailed report
  • you can drill down by area or statistical collection for more information
  • the statistical collections behind the tool are automatically updated as their sources are updated
  • geographical areas can be defined specifically by a spreadsheet of housing stock or drawn with your finger or a mouse onto a map as a suburb, county or region.

Some of the ways housing providers are using Community Insight could transfer to policy makers and programme designers:

  • comparing between different areas in order to target community investment programmes to areas of greatest ‘need’
  • assessing change over time in different areas, as a starting point for evaluation of programme impact
  • combining with more detailed data from administrative data sets, to develop ‘at-risk models’ to identify areas and properties (and indeed individuals) that might be at risk e.g. of rent arrears

The tool is notable for a number of reasons:

  • it is one of the first large scale commercial approaches to accessing and interpreting open data launched by a UK-based social enterprise in a major public service area
  • it was designed from the bottom up as a tool for practitioners (one of the design principles that drove the team was “democratising data”)
  • it has had instant, mainstream success, with over 60 landlords with a total stock in management of nearly 1m households subscribing to the service within 4 months of its launch
  • it’s incredibly easy to use and the data produced is fit for purpose.

Easy to use

The tool was developed with its users involved at every step. Rather than start with the data sets and try to make them interactive, the development of Community Insight was driven by the needs and intentions of the user. The intended users are housing providers – they can upload their housing stock and ascertain the social characteristics of the people they house. However, even a quick play with the tool suggests that a much wider range of unintended users – policy-makers and programme designers across government and other public service areas – might be beneficiaries. A number of local authorities, for example, facing significant cuts to their in-house capacity to collect and analyse data, have expressed interest in embedding Community Insight in order to retain the ability to access information on the communities they work within.

Business model

Community Insight is sold on a subscription basis, with subscribing organisations having unlimited staff access to the tool across their business. They can quickly produce comparable reports on different geographical areas as the need arises. OCSI and HACT ensure the data is constantly updated and will continue to develop and improve the resource over time. Subscribers report an immediate reduction in the costs of community profiling consultancies (for some housing associations, paying back the annual subscription in a matter of weeks), little to no installation or maintenance overhead (as all data is updated centrally) and minimal training requirements for new users.

Selection of headline indicators from the Community Insight report on Emmaville (a fictitious village).


Statistics for each selected geographical area

  • population by number, age, gender, dependency ratio, population size over the last 10 years, ethnicity and country of birth, migration statistics, household composition, religion
  • housing by type (e.g. flats), local median price of each type, renting and ownership proportions, trends in house prices over the last 6 years, central heating, overcrowding and dwelling size, local communal residential establishments
  • vulnerable groups by types of benefits claimed and number of claimants
  • crime by type recorded and 10 year trend
  • health by life expectancy and long-term illnesses, healthy eating, smoking and binge drinking
  • education by qualifications, pupil scores at key stage tests
  • economy by income, employment status and sector, job vacancies, local businesses, index of multiple deprivation, child wellbeing index
  • transport by car ownership, distance to key services
  • community by classification of type, feeling of neighbourhood satisfaction, active charities, air pollution

Potential uses

Following their roll-out in the housing sector, HACT and OCSI are considering where Community Insight might be applicable or adaptable to other sectors. After my brief trial of the tool, my immediate thoughts for additional applications by potential non-housing users are:

  • designers of social impact bonds and other payment by results programmes might use the Community Insight tool to select an intervention cohort of appropriate size and need
  • researchers might use the tool to scan areas where they might focus their on-the-ground investigations
  • journalists might use the tool to describe the community a particular event has taken place in
  • local authorities might use the tool to educate their staff about the diversity and differences within their communities
  • social investors might use the tool to explore areas for place-based investing

What might you use it for?

Fewer criminals or less crime? Frequency v binary measures in criminal justice

The June 2013 interim results released by the Ministry of Justice gave us a chance to examine the relationship between the number of criminals and the number of crimes they commit. The number of criminals is referred to as a binary measure, since offenders can be in only one of two categories: those who reoffend and those who don’t. The number of crimes is referred to as a frequency measure, as it focuses on how many crimes a reoffender commits.

The payments for the Peterborough SIB are based on the frequency measure. Please note that the interim results are not calculated in precisely the same way as the payments for the SIB will be made. [update: the results from the first cohort of the Peterborough SIB were released in August 2014 showing a reduction in offending of 8.4% compared to the matched national comparison group.]

In the period in which the Peterborough SIB delivered services to the first cohort (9 September 2010 – 1 July 2012), the frequency of crimes committed over the six months following each prisoner’s release reduced by 6.9% and the proportion of criminals by 5.8%. In the same period, there was a national increase of 5.4% in continuing criminals, but an even larger increase of 14.5% in the number of crimes they committed. The current burning issue is not that there are more reoffenders; it is that those who reoffend are reoffending more frequently.

Criminals (binary measure) in this instance are defined as the “Proportion of offenders who commit one or more proven reoffences”. A proven reoffence means one “proven by conviction at court or a caution either in those 12 months or in a further 6 months”, rather than simply an arrest or charge.

Crime (frequency measure) in this instance is defined as “Any re-conviction event (sentencing occasion) relating to offences committed in the 12 months following release from prison, and resulting in conviction at court either in those 12 months or in a further 6 months (Note: excludes cautions).”
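The distinction is easy to see in code. The reconviction counts below are invented purely to show how the two measures are computed from the same cohort:

    # Illustrative only: binary vs frequency measures computed from a
    # hypothetical list of reconviction counts, one per released prisoner.
    reconvictions = [0, 2, 0, 1, 0, 0, 5, 0, 1, 0]

    binary_rate = sum(1 for r in reconvictions if r > 0) / len(reconvictions)
    frequency_rate = sum(reconvictions) / len(reconvictions)

    print(f"Binary measure (proportion who reoffend): {binary_rate:.0%}")       # 30%
    print(f"Frequency measure (reconvictions per prisoner): {frequency_rate}")  # 0.9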

The two measures are related – you would generally expect more criminals to commit more crimes. But the way reoffending results are measured creates incentives for service providers. If our purpose is to reduce crime and really help those who impose the greatest costs on our society and justice system, we would choose a frequency measure of the number of crimes. If our purpose is to help those who might commit one or two more crimes to abstain from committing any at all, then we would choose a binary measure.

(Figure source: NSW Bureau of Crime Statistics and Research)

The effect of the binary measure in practice: Doncaster Prison

A Payment by Results (PbR) pilot was launched in October 2011 at Doncaster Prison to test the impact of a PbR model on reducing reconvictions. The pilot is being delivered by Serco and Catch22 (‘the Alliance’). The impact of the pilot is being assessed using a binary outcome measure, which is the proportion of prison leavers who are convicted of one or more offences in the 12 months following their release. The Alliance chose to withdraw community support for offenders who are reconvicted within the 12 month period post-release as they feel that this does not represent the best use of their resources. Some delivery staff reported frustration that support is withdrawn, undermining the interventions previously undertaken. (Ministry of Justice, Process Evaluation of the HMP Doncaster Payment by Results Pilot: Phase 2 findings.)

I have heard politicians and policy makers argue that the public are more interested in reducing or ‘fixing’ criminals than helping them offend less, and thus the success of our programmes needs to be based on a binary measure. I don’t think it’s that hard to make a case for reducing crime. People can relate to a reduction in aggravated burglaries. Let’s get intentional with the measures we use.

What do the Peterborough SIB interim results tell us?

Update: actual results are now out.

First cohort results from the Peterborough SIB were released August 7 2014. The Social Finance UK press release on the results has lots of great information, with a quote below.

“Results for the first group (cohort) of 1000 prisoners on the Peterborough Social Impact Bond (SIB) were announced today, demonstrating an 8.4% reduction in reconviction events relative to the comparable national baseline. The project is on course to receive outcome payments in 2016. Based on the trend in performance demonstrated in the first cohort, investors can look forward to a positive return, including the return of capital, on the funds they have invested.”

Outdated information below:

On 13 June 2013, the Ministry of Justice released interim results from the Peterborough pilot SIB. The results were seen as very encouraging, although Social Finance stressed that the results “do not measure reoffending behaviour over as long a period as the Social Impact Bond will be judged and are not compiled on precisely the same basis as will be used by the Independent Assessor during the course of 2014 to determine whether a payment is due.”

What the results do tell us

The results tell us that reoffending events have reduced in the Peterborough cohort while the national average has worsened. We can be fairly confident that the reduction in reoffending is due to the Peterborough SIB, or more specifically the One Service. This in itself is quite an achievement for government policy.

These results are also an excellent demonstration of the need for a contemporary comparison, in this case the Police National Computer control group, rather than a historical baseline. If the historical re-conviction rates at Peterborough had been used as the only comparison, it would appear that a 6% reduction had been produced. Using the national comparison group shows that the programme also counters an increasing trend in re-convictions, producing a relative reduction of 23%. The inability of historical baselines to represent the fluctuating environment affecting reoffending is further illustrated when we look at results leading up to the 2010 launch of the One Service. The graph below shows that reoffending by both Peterborough inmates and prisoners across the nation was increasing until SIB services began in 2010, at which point Peterborough reversed relative to the national trend.
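The arithmetic behind those two figures can be sketched as follows. The index values are hypothetical – the national increase is back-solved from the quoted 23% relative reduction rather than taken from the published data:

    # Hypothetical index values: why a contemporary comparison differs from
    # a historical baseline. The national increase is back-solved from the
    # quoted 23% relative reduction, not taken from the published data.
    peterborough_before, peterborough_after = 100, 94   # 6% fall vs own history
    national_before, national_after = 100, 122          # national trend worsened

    historical = 1 - peterborough_after / peterborough_before
    relative = 1 - (peterborough_after / peterborough_before) / (national_after / national_before)
    print(f"Reduction against own history:    {historical:.0%}")  # 6%
    print(f"Reduction against national trend: {relative:.0%}")    # ~23%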

(Figure: re-conviction trends for Peterborough and the national comparison, before and after the 2010 launch of the One Service)

What the results don’t tell us

The results do not tease out for us which aspects of the One Service might be more or less responsible for success. So, the results do not tell us if the reduction in reoffending is due to the:

  • use of volunteer mentors
  • voluntary, rather than mandatory, participation
  • long-term nature of the SIB funding
  • flexibility of funding
  • ability to innovate programme delivery to optimise outcomes
  • focus on a single outcome
  • extraordinary skills of the people involved in managing and delivering services
  • continuous evaluation and improvement of the One Service
  • discipline of reporting to external investors
  • alignment of financial and social returns.

The April 2014 evaluation commissioned by the Ministry of Justice from RAND Europe sheds some light on the perceived benefits of the SIB model, including the way the flexibility of funding allows the service to improve in response to performance management data.

Update April 2014

See 24 April results and press release from the Ministry of Justice stating “Before the pilot, for every 100 prisoners released from Peterborough there were 159 reconviction events annually. Under the scheme this figure has fallen to 141 — a fall of 11 per cent. Nationally that figure has risen by 10 per cent over the same period.”

Toby Eccles’ blog analyses how well the Peterborough SIB achieved its objectives to:

  • Enable innovation
  • Enable flexibility and focus on outcomes
  • Bring rigour to prevention
  • Better alignment
  • Investment in social change.

And a good analysis of the Peterborough journey and what was learnt is given by the Social Spider here. I have a slightly broader view of SIBs in the context of policy reform, but I like the discussion.

References

Ministry of Justice, Statistical Notice: Interim re-conviction figures for the Peterborough and Doncaster Payment by Results pilots, 12 June 2013.

Social Finance, Interim Re-Conviction Figures for Peterborough Social Impact Bond Pilot, press release 13 June 2013.

Ministry of Justice, Mentoring Scheme Reduces Reoffending, press release 13 June 2013.

Vibeka Mair, Peterborough social impact bond has slashed reoffending rates says MoJ, Civil Society Finance, 13 June 2013.

Alan Travis, Pilot schemes to cut reoffending show mixed results, The Guardian, 13 June 2013.

BBC, Prison payment-by-results schemes see reoffending cut, 13 June 2013.

Nicholls, A. & Tomkinson, E., Case Study: The Peterborough Pilot Social Impact Bond, October 2013.

NSW Newpin social benefit bond – returns to investors


(Social Ventures Australia: Newpin Social Benefit Bond)

The NSW social benefit bond (social impact bond) for UnitingCare Burnside’s Newpin programme is attracting interest from a range of investors, including NGS Super – the first time we’ve seen a pension/superannuation fund sign up to a social impact bond. The Newpin programme is for families with children aged 0-5 and results are measured on the proportion of children in out-of-home care (which includes foster care, institutional care and placement with extended family) that are returned to their families by the courts. This is called the restoration rate.

Eureka Report’s A high-yield bond with social benefits recently revealed the social and financial returns over the seven-year social benefit bond as follows:

Restoration rate (r)     Return to investor (IRR)
r ≥ 70%                  15%
65% ≤ r < 70%            12%
60% ≤ r < 65%            7.5%
55% ≤ r < 60%            3%
r < 55%                  no fixed IRR; instead:
  • minimum 5% yield over the first three years
  • no minimum yield after three years
  • 75% of capital returned if the bond is redeemed at four years
  • 50% of capital returned if redeemed after four years
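Expressed as a simple lookup (a sketch of the schedule above, not an official calculator – below 55% the capital-protection terms apply instead of a fixed IRR):

    # Hypothetical sketch of the published return schedule (not an official
    # calculator). Below 55%, capital-protection terms apply instead of an IRR.
    def newpin_irr(restoration_rate):
        """Investor IRR for a given restoration rate, per the schedule above."""
        if restoration_rate >= 0.70:
            return 0.15
        if restoration_rate >= 0.65:
            return 0.12
        if restoration_rate >= 0.60:
            return 0.075
        if restoration_rate >= 0.55:
            return 0.03
        return None  # capital-protection terms apply

    print(newpin_irr(0.745))  # last year's restoration rate -> 0.15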

The Newpin restoration rate was 74.5% last year. The approach taken for this social benefit bond – measuring results for a similar cohort in the year prior to the bond – gives social investors information with which to judge the social and financial risk of the investment. Providing this information attracts investors beyond the die-hard philanthropists who have backed the programme from the start.

Funds were raised from 59 investors by Social Ventures Australia (SVA), with a minimum investment of AU$50,000. The SIB was oversubscribed. Below is a breakdown of investment by investor category, as presented by Ian Learmonth, Executive Director, Social Ventures Australia, at the 2013 Social Finance Forum in Sydney, Australia.

(Figure: Newpin SIB investment by investor category)