Mapping the needs of a community

Australian update: Community Insight Australia is working to shape and translate the Community Insight tool for Australia. Please get in touch if you share our vision and would like to take the journey with us.

Policy-makers know that social programmes are more effective if they are provided in the areas of greatest need. But, historically, it has been resource intensive to identify either areas of need or the range of needs of a particular community. This task would either involve weeks of combing through the latest data from all reliable sources, or painstakingly interviewing a sample large enough to make assumptions about the population. As a public servant, I’ve spent hours on the computer painting maps of social disadvantage, a new map for each indicator.

This problem is also faced by Housing Associations and other social housing landlords, who provide homes to over 4.5m households across England, and operate in an environment in which accurate data about the communities they work within has become increasingly important to policy and delivery decisions. Their ability to access relevant data has been limited by poor-quality data systems and a reliance on a limited pool of research analysts to interpret the data that was available.

In response to this, new housing think-do tank HACT and social policy data experts OCSI recently launched Community Insight, a web-based tool which allows non-expert staff at all levels to explore social indicators of need geographically, quickly and easily for the first time, using constantly updated, reliable data.

I was lucky enough to have Matt Leach, from HACT, take me through Community Insight and I couldn’t have been more impressed. The key to the tool is its simplicity.

  1. You choose the geographic area you are interested in on a map of England and Wales.
  2. The tool will give you the demographic and social indicators of the area.

Screenshot of Community Insight showing a selected area, specific houses by housing type, social indicator categories on the left and colours on the map showing the density of the chosen indicator, mental health issues.



  • the social indicators are also presented in comparison to the national average
  • information about a geographical area can be interacted with online or exported in seconds as a detailed report
  • you can drill down by area or statistical collection for more information
  • the statistical collections behind the tool are automatically updated as their sources are updated
  • geographical areas can be defined specifically by a spreadsheet of housing stock or drawn with your finger or a mouse onto a map as a suburb, county or region.
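The comparison to the national average in the first bullet is simple arithmetic, which a minimal sketch can make concrete (this is purely illustrative and not Community Insight’s actual code or API; the figures are invented):

```python
# Toy sketch: express a local indicator as a ratio of the national average,
# which is how tools like this typically present "above/below average" shading.

def relative_to_national(local_value: float, national_value: float) -> float:
    """Return the local rate as a multiple of the national average (1.0 = average)."""
    if national_value == 0:
        raise ValueError("national average must be non-zero")
    return local_value / national_value

# Hypothetical figures: 12% of local households claim a benefit vs 9% nationally.
ratio = relative_to_national(12.0, 9.0)
print(f"Local rate is {ratio:.2f}x the national average")
```

A ratio above 1.0 would shade the area as higher-than-average need for that indicator.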

Some of the ways housing providers are using Community Insight could transfer to policy makers and programme designers:

  • comparing between different areas in order to target community investment programmes to areas of greatest ‘need’
  • assessing change over time in different areas, as a starting point for evaluation of programme impact
  • combining with more detailed data from administrative data sets, to develop ‘at-risk models’ to identify areas and properties (and indeed individuals) that might be at risk e.g. of rent arrears
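The third use above, building ‘at-risk models’, could look something like the toy sketch below. The indicator names, weights and threshold are all invented for illustration; a real model would be fitted and calibrated against actual arrears outcomes:

```python
# Toy 'at-risk' score: combine open-data area indicators with administrative
# data into a single weighted score, then flag properties above a threshold.

def arrears_risk_score(indicators, weights):
    """Weighted sum of normalised indicators; higher means greater assumed risk."""
    return sum(weights[k] * indicators.get(k, 0.0) for k in weights)

# Hypothetical weights and figures for one property/area.
weights = {"unemployment_rate": 0.5, "benefit_claimant_rate": 0.3, "prior_arrears": 0.2}
area = {"unemployment_rate": 0.11, "benefit_claimant_rate": 0.18, "prior_arrears": 1.0}

score = arrears_risk_score(area, weights)
flagged = score > 0.25  # threshold would need calibrating against real outcomes
```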

The tool is notable for a number of reasons:

  • it is one of the first large scale commercial approaches to accessing and interpreting open data launched by a UK-based social enterprise in a major public service area
  • it was designed from the bottom up as a tool for practitioners (one of the design principles that drove the team was “democratising data”)
  • it has had instant, mainstream success, with over 60 landlords with a total stock in management of nearly 1m households subscribing to the service within 4 months of its launch
  • it’s incredibly easy to use and the data produced is fit for purpose.

Easy to use

The tool was developed with its users involved at every step of the way. Rather than start with the data sets and try to make them interactive, the development of Community Insight was driven by the needs and intentions of the user. The intended users are housing providers – they can upload their housing stock and ascertain the social characteristics of the people they house. However, even a quick play with the tool suggests that a much wider range of unintended users – policy-makers and programme designers across government and other public service areas – might be beneficiaries. A number of local authorities, for example, facing significant cuts to their in-house capacity to collect and analyse data, have expressed interest in embedding Community Insight in order to retain the ability to access information on the communities they work within.

Business model

Community Insight is sold on a subscription basis, with subscribing organisations having unlimited staff access to the tool across their business. They can quickly produce comparable reports on different geographical areas as the need arises. OCSI and HACT ensure the data is constantly updated and will continue to develop and improve the resource over time. Subscribers report an immediate reduction in the costs of community profiling consultancies (for some housing associations, enough to pay back the annual subscription in a matter of weeks), little to no installation or maintenance overhead (as all data is updated centrally) and minimal training requirements for new users.

Selection of headline indicators from the Community Insight Report on Emmaville (a fictitious village).


Statistics for each selected geographical area

  • population by number, age, gender, dependency ratio, population size over the last 10 years, ethnicity and country of birth, migration statistics, household composition, religion
housing by type (e.g. flats) with the local median price of each, renting and ownership proportions, house-price trends over the last 6 years, central heating, overcrowding and dwelling size, local communal residential establishments
  • vulnerable groups by types of benefits claimed and number of claimants
  • crime by type recorded and 10 year trend
  • health by life expectancy and long-term illnesses, healthy eating, smoking and binge drinking
  • education by qualifications, pupil scores at key stage tests
  • economy by income, employment status and sector, job vacancies, local businesses, index of multiple deprivation, child wellbeing index
  • transport by car ownership, distance to key services
  • community by classification of type, feeling of neighbourhood satisfaction, active charities, air pollution

Potential uses

Following their roll-out in the housing sector, HACT and OCSI are considering where Community Insight might be applicable or adaptable to other sectors. After my brief trial of the tool, my immediate thoughts for additional applications by potential non-housing provider users are:

  • designers of social impact bonds and other payment by results programmes might use the Community Insight tool to select an intervention cohort of appropriate size and need
  • researchers might use the tool to scan areas where they might focus their on-the-ground investigations
  • journalists might use the tool to describe the community a particular event has taken place in
  • local authorities might use the tool to educate their staff about the diversity and differences within their communities
  • social investors might use the tool to inform place-based investment decisions

What might you use it for?

Randomised controlled trials (RCTs) in public policy


The basic design of a randomised controlled trial (RCT), illustrated with a test of a new ʻback to workʼ programme (Haynes et al., 2012, p.4).

In 2012, Laura Haynes, Owain Service, Ben Goldacre & David Torgerson wrote the fantastic paper Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials. They begin the paper by making the case for RCTs with the following four points.

1. We don’t necessarily know ‘what works’ – “confident predictions about policy made by experts often turn out to be incorrect. RCTs have demonstrated that interventions which were designed to be effective were in fact not”

2. RCTs don’t have to cost a lot of money – “The costs of an RCT depend on how it is designed: with planning, they can be cheaper than other forms of evaluation.”

3. There are ethical advantages to using RCTs – “Sometimes people object to RCTs in public policy on the grounds that it is unethical to withhold a new intervention from people who could benefit from it.” “If anything, a phased introduction in the context of an RCT is more ethical, because it generates new high quality information that may help to demonstrate that an intervention is cost effective.”

4. RCTs do not have to be complicated or difficult to run – “It is much more efficient to put a smaller amount of effort [than a post-intervention impact evaluation] into the design of an RCT before a policy is implemented.”
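The core design the four points rest on can be sketched in a few lines: randomly assign a cohort to treatment and control, then compare outcome rates between the groups. This is my own illustrative simulation, not code from Haynes et al.; the cohort size, base rate and effect size are invented:

```python
# Minimal RCT sketch: random assignment of a cohort, then a simple
# difference in outcome rates between treatment and control groups.
import random

random.seed(42)  # fixed seed so the simulation is repeatable

cohort = list(range(1000))             # e.g. 1,000 jobseekers
random.shuffle(cohort)                 # randomisation step
treatment, control = cohort[:500], cohort[500:]

def back_to_work(in_treatment):
    """Simulated outcome: treated people find work slightly more often."""
    base = 0.30 + (0.08 if in_treatment else 0.0)  # invented rates
    return random.random() < base

treated_rate = sum(back_to_work(True) for _ in treatment) / len(treatment)
control_rate = sum(back_to_work(False) for _ in control) / len(control)
effect = treated_rate - control_rate   # the estimated programme effect
```

Because assignment is random, any systematic difference between the groups’ outcomes can be attributed to the programme rather than to who was selected for it, which is exactly the point the paper makes.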

Laura and her team are making a huge difference to the way the UK Government perceives and implements RCTs.

The World Bank has also published some fantastic guidance in their Impact Evaluation Overview. This includes information about their Development Impact Evaluation (DIME) initiative, which has the following objectives:

  • “To increase the number of Bank projects with impact evaluation components;
  • To increase staff capacity to design and carry out such evaluations;
  • To build a process of systematic learning based on effective development interventions with lessons learned from completed evaluations.”

I’ve popped both these resources on the Social Impact Bond Knowledge Box page Comparisons and the counterfactual, but thought they were so valuable it was worth expanding on them here.

Start a mistakes log

“No one is exempt from the rule that learning occurs through recognition of error.” – Alexander Lowen, Bioenergetics

There are too many lessons we’re missing out on because of our tendency to publish only good results. It’s perfectly understandable to want to promote wins, but publishing mistakes and what’s been learned from them may be even more valuable.

Ben Goldacre is crusading against publication bias in evidence based medicine. He is one of the forces behind an online petition to get all medical trials registered and subsequently all results reported. This is important stuff.

But apart from medicine, those of us involved in designing and delivering social programmes continue to repeat the mistakes of the past, because we simply don’t know enough about what has happened. I’m a strong believer in evidence-based policy, but evidence of policy history and why things failed is rarely captured and shared. Might it be possible for us to value mistakes enough to create incentives for their publication?

Curt Rosengren writes in his blog, the genius of mistakes:

You might even try keeping a mistake genius journal. Not a place for you to berate yourself for how many mistakes you make, but a place for you to actively learn from what has happened. Explore the mistake, explore what insights you’ve gained as a result, and summarize those insights into key points.

One organisation that’s created a ‘mistakes genius journal’ is GiveWell in the US, with a section on their website, Our Shortcomings, logging their mistakes and what they’ve done in response. My opinion of the organisation was heightened by this discovery: this honest recognition and promotion of continuous improvement may have had the opposite effect to what most would expect from publishing their mistakes. Yes, we’re all worried about tabloid headlines, but wouldn’t a story be a little less exciting when it’s not a secret ‘uncovered’, but a quote taken straight off the organisation’s public website? Imagine how wonderful it would be if governments and service providers kept similar logs!

As we try to design new services and financial products to address entrenched problems in this emerging social investment market, it would be really valuable to know what didn’t work out for others and most importantly, what they changed in response.

Allia recently showed an exemplary commitment to learning following the closure of their Future for Children Bond, which was the first opportunity for retail investors to invest a proportion of funds in a social impact bond, but failed to raise sufficient capital.

As a first pilot product, the Future for Children Bond has nevertheless been hugely valuable in assessing the retail market for social investment and generating learning about the steps needed to enable it to grow. These lessons will be used to inform the development of future Allia products and will be shared with the sector, together with policy recommendations, in a report by NPC to be published in May.

So here’s to seeing a whole lot more mistakes logs and lessons learned appearing in the public domain – great PR and enhanced social impact – what is there not to like?

Did you get what you came for? Realising social impact bond objectives

In a previous blog, Do SIBs Work?, I listed 22 objectives that have been given for pursuing SIBs as they have been announced around the world. Now I want to examine the achievability (is that even a word?) of each one. Some objectives are achieved just by signing the contract to deliver a SIB, but some objectives rest on longer-term outcomes and these are the more interesting. But the conditions of achieving outcomes need to be built into SIBs from the start. If we’re after transparency, then we need to publish terms and results. If we want to shift funding to prevention, there must be a plan to continue funding the SIB preventative service after the SIB is over.

Better programs and better results for the people who participate in them
1. Improving results for beneficiaries by focusing on outcomes rather than outputs

I’d say this is the objective most likely to be achieved. Peterborough doesn’t have its results in, but word on the ground is that providers feel the focus on outcomes has made them deliver better services and participants enjoy the holistic and seamless approach. This may be an objective more easily achieved where payments are based on a single outcome, like Peterborough or Essex, than multiple outcomes like the DWP Innovation Pilots.

2. Improve the likelihood of delivering real and sustainable solutions to important social challenges

For this to happen, funding needs to be sustained – there hasn’t been a whole lot of talk so far about what happens after the SIB, but if the solutions are to be sustained, they’re going to need some continuity of funding. I think the services we’re seeing are real solutions, but we can’t think that solving a problem means it goes away. Solving social problems means they go away for some of the people, some or all of the time – bettering a situation is noble – thinking you’re going to eradicate it is naive.

3. Making effective interventions available to far more people in need than the number that can be reached through traditional state contracts and philanthropy

This will be achieved as long as SIBs are delivering additional services to those who would otherwise have received little to nothing.

4. Harnessing the innovation capacity of both investors and service providers for publicly funded services

We’re unlikely to get any innovation if the same providers are delivering the same proven programmes – the objective will be achieved if there is room to improve and real changes are made on the ground. Peterborough is certainly delivering on this – not in any of the components of service for participants, but in the consistent and collaborative way they are delivered, so that the impact on their lives is real and sustained.

5. Adding discipline to measuring outcomes for government programs because there is an upfront agreement on how to measure success

This depends on how much discipline you had about measuring outcomes before, and how much political will there is to improve measurement in government – this may be more applicable to an Australian context, where increased use of evidence-based policy is very much on the agenda. Peterborough has certainly introduced a measurement system that was then used in a number of other pilots, but it looks like the new Ministry of Justice probation reforms will revert to the less informative binary measure, as it’s perceived as more favourable to Ministers and the public.

6. Improving the evidence base for social services, by mandating measurement and publication of outcomes

Only achieved if the results of the SIB are published along with full evaluations of the programmes that achieved them, ideally including follow-up measurement after the SIB is over. 

7. Accelerating the adoption and implementation of promising programs

Dependent on the programme chosen – was it promising? In all SIBs so far I would say yes.

8. Accelerating the expansion of evidence-based programs delivered by effective nonprofits

This is very much the objective that seems to have been focussed on in the US. You’d have to look at the expansion of programmes like Multi-Systemic Therapy (Essex SIB) and Moral Reconation Therapy (New York SIB) to see how the inclusion of the programmes in these SIBs impacted their expansion rates.

A social finance market
9. Unlocking funds to tackle social issues

We’ve certainly seen new funds committed by Government, and the involvement of an investor like Goldman Sachs suggests new funds (unless it comes from their corporate social responsibility budget).

10. Growing the social finance and social business sector

SIBs have certainly grown interest in the social finance sector and have become a bit of a poster-child. This has occurred at the same time as rapid growth in the social finance and social business sector, so I think it’s fair to assume an element of causal relationship.

11. Providing new financial instruments to harness private investment for the benefit of the community

This is achieved the minute the contract is signed. But in all honesty, if investors don’t receive their capital back, there won’t be any more and the instrument will no longer be viable.

12. Enabling investors to achieve financial returns and social impact

SIBs will either achieve both financial returns and social impact or neither, so this objective is pretty achievable. Some might argue that the measurement of impact is a little on the weak side for some SIBs, but if the measurement system is assumed to be a good indicator of social impact, then this is highly achievable.

13. Increasing funding for prevention and early intervention programs in a sustainable manner

This objective will only be achieved if funding is sustained, but we should all be asking the question of commissioners “What happens when your SIB is finished?”

14. Improving accountability and transparency for publicly funded services

Depends on publication of terms and results – some current SIBs are achieving this better than others. If it’s an objective stated at the start, the public should expect and demand publication of information.

15. Allowing governments to accept and measure new ideas from external providers, only paying for the ones that deliver

This is difficult and unlikely to be achieved in the current market. Government’s ability to accept new ideas from the market is often limited by procurement rules, but the DWP innovation SIBs certainly allow 10 new ideas to be tested and compared. Only paying for the ones that deliver? This occurs in the UK, but not so much in Australia where investors stand to receive between 50% and 75% of their funds if services deliver no results at all.

16. Saving Government money

There is no doubt that there is a multitude of benefits to government from being involved in a SIB, but money in the hand is one of the least achievable, unless the services that will experience reduced demand are spot-purchased (i.e. purchased as units of service in direct response to demand). Steve Goldberg defines this as an accounting problem, rather than a savings problem, but governments are constrained within their accounting systems and are thus unlikely to cut anyone’s budget to pay for a SIB. The only SIB we’ve seen where savings resulting from a SIB intervention are used to repay investors is the Essex SIB, although these savings will not cover the cost of the SIB.

17. Lowering risk for government

Financial risk may be lower, but due to media attention, the reputational risk of programme failure is extraordinarily high. As a result, the due diligence undertaken by governments mitigates most of the risks of SIBs before they enter.

18. Savings can be recaptured and reinvested into a permanent funding stream for the program

Highly unlikely – see ‘Saving Government money’ above.

Service providers
19. Increasing accessibility of payment by results contracts

This is a UK objective – in the US many not-for-profits are on these kinds of contracts and in Australia there is a dominance of not-for-profits delivering government contracts for social services, despite there being few payment by results contracts. In the UK, it seems that this objective has been achieved for the SIBs introduced, but the test of whether it opens up broader payment by results contracts will be the upcoming rehabilitation reforms.

20. Giving nonprofit providers a committed, long-term funding stream not subject to budget cuts

This is achieved when contracts are signed and seems a significant contribution of SIBs for providers.

21. Aligning the interests of beneficiaries, nonprofit service providers, private investors, and governments

The SIB contract development process certainly achieves this. The only risk is that some voices around the table are louder than others and some interests are therefore sidelined, but that consideration would hold over the development of any contracting arrangement.

22. Facilitated coordination with organisations working on overlapping problems

We’ve seen this demonstrated very well in Peterborough, but it will depend on the delivery structure. In New South Wales, Australia, the government has contracted for results directly with UnitingCare Burnside, a service provider, so they are not mandated to work with other delivery organisations.

Measuring the effect of interventions that strengthen families? Start here!

Children of Parents with a Mental Illness (COPMI) is an Adelaide-based organisation with a website rich in resources, both for families living with mental illness and those that support them. I was particularly impressed with the research section of the site – it’s easy to navigate, up-to-date and provides a wealth of information for evaluators of family-based interventions. They list several measures of parental self-efficacy and competence, summarising their reliability and validity, as well as an easy-to-read overview of evaluation. Their research information on young people includes lists of measures of stress and coping, self-esteem, connectedness, knowledge of mental health, strengths and difficulties, and resilience.

Better Evidence Network article published in Public Administration Today

Claudine Lyons and I have published an article, Building a case for better evidence, in the Institute of Public Administration Australia’s Public Administration Today (vol 31, July 2012). I’m hoping for a digital copy soon, but here’s the PDF till then.

The article tells the story of the problem of being unable to find technical expertise within the silos of Government, and how I established an informal network of experts across the sector in response.