Month: June 2017

Awesome (Free) Visual Idea Tools

Ok, so the ‘awesome’ bit is subjective! I have just returned from a predominantly practitioner-based conference where, as an academic, I was impressed with the powerful visuals in most of the presentations. Gone, it seems, are the days when ‘visuals’ meant a vaguely relevant photograph adorning a crammed PowerPoint slide. The infographics, process diagrams, and cartoons at the conference were impactful and conveyed some really complex information, yet they were not produced by multi-million pound organisations with teams of graphics experts – many came from charities and government departments.

This prompted me to explore the visual-idea domain a little further. The post doesn’t intend to exhaust all the options, or ‘taste test’ them all to find the best in class. What it does do is give an overview of some of the options out there, to encourage a bit of creative toe-dipping.

1. Canva

Anyone who knows me will know how impressed I have been with Canva and their mission of ’empowering the world to design’ – my students are using it for posters, my sister is using it for her wedding invites, my partner is using it for social media marketing – I am a major fan! Canva have many templates (termed ‘layouts’) and plenty of free ones to get started with. There are many preloaded images, and users can upload their own too (paid images are only $1). It is very easy to use and has apps (I use the iPhone and iPad apps – useful on the go), but it needs an internet connection.



2. Storyboard That 

Storyboard That allows users to create comic strips. There are lots of other storyboard creators out there too, some of which perhaps look a bit more polished than what I have managed to produce on Storyboard That, but it is relatively intuitive, which is a big thing for me.


‘Storyboard That’ Example

Storyboards are appealing for explaining ideas or adding humour, and these tools seem to be marketing heavily to educators. Storyboard That refer to their offering as ‘digital storytelling’. The example adjacent was my first ever dabble with this tool and took about 10 minutes to create. However, as with many of the tools out there, the number of projects and features is restricted unless you pay for premium access.

3. Piktochart

Piktochart is similar to Canva and relatively easy to use. For me, their infographics and A4 reports (in ‘printables’) are the best of their current offering. I prefer Piktochart over Canva for presenting numbers, since you can add data into tables and maps. At the time of writing there are limited colour choices on some of the templates (but a ‘coming soon’ message suggests they are onto this).

Of course, the focus here has been on free tools only. Upgrading to premium membership might be worthwhile if you are going to use such tools regularly (Venngage is great for infographics, but at $19 a month I can’t justify it).

What do you use?

How do you like to convey complex information?

Is there a place for more visual means?








XII International Evaluation Conference 

It was a great pleasure to speak at the opening of the XII International Evaluation Conference yesterday (21st June 2017), by invitation of the Polish Ministry of Economic Development and the Polish Agency for Enterprise Development.

I had been asked to speak about the future of evaluation and evidence-based policy, and a summary of what I discussed is outlined below.

Challenges facing evaluators at the current time

The Death of Experts. During the U.K. ‘Brexit’ referendum campaigning there were several examples of expertise being undermined by both the general public and some politicians (e.g. Michael Gove: “people in this country have had enough of experts”). At the same time we are seeing evidence, knowledge and expertise shared via internet blogs and websites, and arguably being utilised at a greater rate than more formal evidence mechanisms (evaluation reports, academic publications) – despite no guarantee of their quality. Evaluators have long struggled for legitimacy, and this erosion of the expert role complicates that challenge further.

What does the erosion of expertise mean for evaluation evidence and its future use?
How should evaluators respond to the erosion of expertise and further challenges to their legitimacy?

Changing consumption of data. Big data is trending right now and evaluation is following suit (Lou Davina-Stouffs of Nesta UK shared an example from Wales, UK, where data is being scraped for indications of SME development/improvement). Evaluators and policy makers may need to approach such data with caution. At the same time, we are seeing strength in storytelling – rich qualitative methodologies may help capture this. This is not about polarising qualitative (ethnography, interviews) and quantitative (big data) approaches – but evaluators need to move with these changing methodologies and embrace technology to do so. Audio-visual means of presentation are proving a valuable way to disseminate evaluation findings, with (anecdotally) better consumption than written evaluation reports. This Twitter post demonstrates the point:

Source: @evaluationmaven


Mariana Hristcheva (Director-General of Urban and Regional Policy, European Commission) referred to this point later in the conference, and noted that evaluation studies of European Union funded cohesion policy initiatives are now beginning to be produced in video format.

Anyone can evaluate. Many evaluation practitioners around the world are engaged in evaluation societies, networks and professional development, debating how to advance evaluation practice; yet many are not (no data are available to show just how many evaluators are engaged or disengaged with such practice development, so the proportion cannot be known). In raising this I am preaching to the converted: it is those who are disengaged and absent from practice development (professional or methodological) whom we ought to worry about. We know the quality of evaluation varies vastly, and this lack of oversight/governance is unhelpful.

Evaluation societies such as the UK Evaluation Society (UKES), American Evaluation Association (AEA) and European Evaluation Society (EES) are engaging in work to professionalise evaluation, and we can recognise capability frameworks, guidelines for practice and, more recently, voluntary peer review scheme pilots (EES, UKES). The European Commission are also taking a more proactive role in enhancing their evaluation studies, offering summer schools in evaluation practice.

Wolfgang Meyer, later at the conference, highlighted the absence of renowned European evaluation scholars and theorists. US scholars from the 1980s still dominate our academic contributions and this is unhelpful to the education of evaluators and evaluation capacity-building in Europe.

Providing a consistent evaluation experience across the industry remains a challenge.

Should we abandon evidence based policy?

The evidence-based policy approach continues to be challenged across numerous sectors. No, I do not think it should be abandoned – to do so would be to abandon the very notion of evidence and knowledge, and its transformative potential in policy development. However, the very fact that it is being challenged might prompt us to craft a new narrative to sit behind it, and to explore further the issues we face in practising it.

Recent work by Newman, Cherney and Head (2017) detailed the results of a study of over 2,000 Australian public servants: almost 60% used e-databases to search for academic abstracts, articles and reports, and just over 60% had used academic work in reports over the preceding 12 months. This suggests that the evidence base is being consulted. Worryingly, though, 75% of the same respondents did not feel they had the expertise to apply the results – one possible avenue for improving the use of evidence.

Redefining EBP is unlikely to help us overcome the barriers to its effectiveness; pushing a clear agenda to engage all parties in systematically addressing those barriers might. Our struggles as evaluators to transform policy need closer inspection. We are unlikely to prevent evidence being manipulated for political gain, but we can probably address the cultural differences between evaluators and the consumers of evidence, seen for instance in long-standing complaints about the presentation of evaluation findings. Evaluators alone cannot solve this; it needs a systems approach involving policy makers, public servants, funders where relevant, and evaluators.

To abandon EBP would be to abandon faith in knowledge, learning and improvement (Wond, 2017)

But we can’t afford to wait for data, can we?

This timeliness challenge is never going to cease: policy cycles and evaluation timescales will always struggle to synchronise, but the tension can be mitigated to some extent. I can’t imagine a time when society will stop to ponder the evaluation reports of the previous initiative before proceeding to the next, and I do wonder whether evaluation has to step back and consider a longer-term role for itself instead. Seeing evaluation as a longer-term game may actually be helpful; for instance, we can reflect on the way evaluation is funded (short-term funding does not support us to establish longer-term impact) and whether this fits.