
Complexity Settings to the Rescue: A New Lease of Life for Evidence-Based Policy?

Aug 28, 2017 | Blog

CECAN Fellow Sara Giorgi shares her perspective on some of the key insights from her research.

It would be naive and, potentially, ill-advised to have evaluation solely drive policy direction. Good, open, evidence-backed policy, however, does need to be informed by evaluation results and insights. My CECAN Fellowship provided me with a rare opportunity to investigate how evaluation is applied in real life within a government department – in this case Defra – and how it can be used to plan for future policymaking.

My case study was the evaluation of Defra’s Reward and Recognition Fund – an initiative that funded 31 pilot schemes designed to improve recycling in local areas. As part of my fellowship, I was able to interview practitioners delivering the schemes and central Government policymakers, including policy advisors, researchers, evidence managers and strategy leads. All policy stakeholder interviewees stated that evaluation was, and ought to be, key to good and open policymaking, though the daily practicalities of making this happen are often difficult. Evaluation outcomes do not drive policy, according to interviewees, but they do, indirectly and in certain circumstances, inform and influence policy. The extent and frequency of this influence depends on a whole host of preconditions and constraints, including: policy design (has the policy been designed to be evaluated?); timing; resources, capacity and skills; governance structures and working cultures; and buy-in.

Timing is key and was one of the main issues mentioned by respondents. Policy and evaluation work at different speeds. Policy tends to be fast-paced, dynamic and demands a quick turnaround while evaluation is thought to require longer timeframes, given its data issues, attention to detail and analytical nature. This mismatch often means that the feedback loops between evaluation and policy do not work as intended.

There are challenges, but it doesn’t have to be all doom and gloom. With an open acknowledgement from both sides, this timing mismatch can be addressed. External evaluators, researchers and/or civil service analysts, on the one side, need to be confident in sharing emerging findings and insights in real time, through outlets such as learning sessions. Emerging findings may be ‘good enough’ to tell the narrative of the ‘impact at this point in time’ and to feed into policy development decisions. Policymakers, on the other hand, need to feel comfortable with these caveats and with the risk that final results and conclusions may differ.

Now add the complication of complexity to the mix and, perhaps counterintuitively, things become clearer. Let me explain.

When policies take place in an interrelated system with a range of intervening factors, the two sides of evaluation (an external, independent assessment for accountability purposes and an internal, reflective analysis for learning purposes) become conflated. This conflation of assessment and learning in complex settings calls for leaner, co-produced, adaptive or, effectively, ‘smarter’ ways of working and methodologies, which are better suited to ensuring that evaluation outcomes inform and influence policy direction.

Thus, we shouldn’t be put off by the label ‘complexity’, and it shouldn’t deter action on important policy issues or dissuade us from evaluation. On the contrary, recognising complexity explicitly is an opportunity: it can better equip policy stakeholders and practitioners with the ‘smart’ (i.e. fit-for-purpose, lean, appropriate, robust) evaluation approaches that better inform policymaking.

Read Sara Giorgi’s full report from her CECAN fellowship.

 
