In May and June 2015, I had the pleasure of taking part in the online course “Online training to strengthen Monitoring & Evaluation & Learning (MEL) on policy influence in Central and Eastern Europe”, devised and delivered by the Politics & Ideas (P&I) team. Looking back, I can say my time, spread over the course’s six weeks, was well spent.
To put my experience into perspective: I come from an organization that grew out of election observation and subsequently ventured into monitoring legislation and policies in the field of democratic governance, transparency and openness of public bodies. Our work is conducted through two main programs – one devoted to democratic institutions and processes, the other to democratic political culture and civic education. Since our monitoring work had until then revolved mostly around “sunlight” laws and legislation promoting good-governance principles, the former program seemed the obvious place to apply the course’s tools for improving our ongoing MEL practices.
Prior to the course, I had somehow figured our work was about as self-explanatory as it gets. Ideally, it consisted of legislative monitoring of the national Freedom of Information Act (FoIA), monitoring the implementation of the Act and its associated bylaws, and reporting on the progress of the overall national access-to-information policy. All of this seemed part of a straightforward whole that hardly required further elaboration.
However, as it turned out, my perception was just that – an incomplete, rather unreflective image of what it meant to link policy monitoring, evaluation of past rounds of monitoring, and carrying newly gained knowledge forward. I soon realized that what our day-to-day monitoring work lacked was a systematic approach: many of the elements vital to understanding the policy area regulated by the FoIA had never been made explicit, not even during the term of the current Government, which has taken pioneering steps to establish a freedom of information infrastructure in Croatia. We knew what we were after, but we never paid heed to structuring it in a way that would help us understand the starting point, midpoints and end state of our legislative monitoring, let alone the specific areas prescribed by the FoIA’s numerous bylaws. All of these elements come together in a wonderfully intuitive structuring tool presented by the P&I team, dubbed the MEL matrix.
And yes, the MEL matrix is just what you might expect – a tool that structures your monitoring effort into several compartments: the policy influence goals to be achieved, outcomes in both the long-term and mid-term perspectives, and the outputs of your monitoring itself. And, crucial to knowing whether advances have been made at each of these levels – indicators, which I came to realize are incredibly easy to overlook. The MEL matrix promises to be a highly functional and useful framing tool for looking at what your organization is supposed to be doing, how it should go about it, and what measures of impact it should aim for. For those who want the theory and underlying propositions behind such a tool before trusting it to structure their work, that is there too, in the form of richly referenced chapters accompanying every module of the course. Put together, these make up an invaluable handbook that any organization should come back to now and then. Used properly, they can be the building blocks of a story of your think tank’s impact on the areas you care about.
Simple and intuitive does not equal easy, however. Except when done during the P&I course, constructing a MEL matrix is not a mere intellectual exercise; it is not a formality to be blown off before getting down to “the real thing”. Quite the opposite: a well-thought-out MEL matrix covers the things you really need to be certain of at all times – it is your roadmap to future rounds of monitoring, a set of criteria for evaluating policy progress, and a foundation for organizational preparations between monitoring projects. In fact, the more real, reflective effort put into making the matrix, the better the final product and the more useful your map to the often opaque landscape of policy monitoring and evaluation. The biggest takeaway of the course for me, then, has to be the necessity of introducing more clarity and structure into thinking about M&E if it is to be done systematically and used effectively as an advocacy tool later on. The learning component follows as a close second, since each round of monitoring should serve as the baseline for the next if we are to stay relevant. Using the MEL matrix is just one means to this end, but it may well be the central one. Certainly, it is very valuable to have such a tool handed to you.
Finally, Vanesa, Dena and Kristie (and Leandro) will truly commit to guiding you – the course attendee – through the steps necessary to advance your MEL activities, but they will not give you any undeserved pats on the back. Instead, their task is to challenge your assumptions about the everyday work of you and your organization, to point out what else needs to be systematized in your heads, and to teach you to reflect on “the way things have always been done”. In most cases, you stand to profit greatly.
Just be sure to catch your “aha” moments when they appear (and they will, at almost every step of the way) and take from them whatever you can.