How familiar are you with the approaches to evaluation?
The use of Poll Everywhere in audience engagement – 2015 AES Conference

Matt Healey¹ and Kate Roberts²

¹ First Person Consulting
² Roberts Evaluation

Contact: Matt Healey, First Person Consulting. E: [email protected] P: 03 9600 1778

Prepared for the Australasian Evaluation Society Conference 2015

1 Introduction

The purpose of this paper is to provide a summary of the ‘How familiar are you with the approaches to evaluation?’ workshop from the 2015 Australasian Evaluation Society Conference and to highlight the use of Poll Everywhere¹ in the workshop. The workshop explored a sample (11) of the 22 approaches to evaluation from the http://betterevaluation.org/ website. Its objectives were to:

• Provide an introduction to the different approaches to evaluation
• Provide an introduction to Poll Everywhere.

This was done through:

• Development of posters / handouts (Appendix 1) summarising the approaches
• Categorising the approaches into four ‘styles’ based on their features:
  o Logic
  o Continuous improvement
  o Empowerment
  o Goal free
• The audience reading through each of the posters or their handout, while considering the following:
  o did they agree with how the approaches were grouped?
  o how do the approaches relate to audience members’ own work?
• Audience engagement, discussion and data collection using Poll Everywhere.

The rest of this paper summarises the method behind using Poll Everywhere and the results of the data collection process. We decided to use Poll Everywhere in order to trial this form of event-based data collection, while simultaneously engaging the audience in a conversation around the theory of approaches in evaluation. This paper is the first of many highlighting the work of First Person Consulting in trialling different methods and publishing the results. To stay up to date on any new publications or opportunities, we encourage you to sign up to our mailing list: http://www.fpconsulting.com.au/mailing-list.html.

You may use any part of this paper and its contents provided that you cite the source: Healey, M and Roberts, K 2015, How familiar are you with the approaches to evaluation? The use of Poll Everywhere in audience engagement – 2015 AES Conference, First Person Consulting, Melbourne.

¹ https://www.polleverywhere.com

2 Poll Everywhere methodology

Poll Everywhere is a web-based application that enables audience members to engage with presenters and their peers using text messaging, Twitter or a web browser. Polls can be downloaded and embedded into PowerPoint (though this requires a PowerPoint add-in to also be downloaded) and incorporated into your regular presentation. These polls enable questions to be posed to the audience, who respond by phone, computer or Twitter. The added benefit is that responses are then displayed in real time through the presentation, allowing for anonymity compared to more traditional Q&A scenarios. Polls can be either open ended or closed, the latter through the use of scaled questions. Section 3 contains a summary of these from the workshop. The questions posed through Poll Everywhere were:

1. To what extent were you already familiar with all of these approaches? (closed question)
2. To what extent do you agree with how the different approaches have been grouped? (closed question)
3. Who controls the evaluation during ‘goal free’ approaches? (closed question)
4. What are three words you would use to describe your own approach to evaluation? (open question)
5. How do the approaches relate to your own work? (open question)

Closed question responses are represented through column or bar graphs, while open question responses can either be shown verbatim or grouped. Examples of these, based on the above questions, are over the page. Various settings enable you to customise the process (e.g. only allowing respondents to answer once), as well as visual effects to ensure the polls align with style guides or colour schemes.
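To make the closed/open distinction concrete, here is a minimal Python sketch that models the two question types and the letter-key tallying described above. The class and field names are our own illustration; they do not reflect Poll Everywhere’s internal data model or API.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ClosedPoll:
    """A scaled (closed) question: responses are letter keys (A, B, C...)."""
    question: str
    options: dict  # letter key -> option text, e.g. {"A": "Not at all"}
    tally: Counter = field(default_factory=Counter)

    def record(self, letter: str) -> None:
        # Ignore letters that don't correspond to a defined option.
        if letter.upper() in self.options:
            self.tally[letter.upper()] += 1

@dataclass
class OpenPoll:
    """An open-ended question: responses are kept as free text."""
    question: str
    responses: list = field(default_factory=list)

    def record(self, text: str) -> None:
        self.responses.append(text.strip())

# Example: question 1 from the workshop, with invented replies.
q1 = ClosedPoll(
    question="To what extent were you already familiar with all of these approaches?",
    options={"A": "Not at all", "B": "Somewhat", "C": "Mostly", "D": "Completely"},
)
for reply in ["A", "B", "B", "C", "b", "Z"]:  # "Z" is discarded
    q1.record(reply)
print(q1.tally)  # Counter({'B': 3, 'A': 1, 'C': 1})
```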

3 Results

As can be seen from the following three graphs, closed questions are represented through simple bar or column graphs (the latter not used here). You can have any number of options, and all participants have to do is text message, tweet or respond online using the relevant letter key (i.e. A, B, C, D, etc.) (see Figure 1).

Figure 1: Audience members’ familiarity with the approaches
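As a minimal sketch of how such a graph can be recreated outside Poll Everywhere (e.g. when re-analysing exported data), the snippet below plots letter-keyed tallies with matplotlib. The counts are invented for illustration and are not the workshop’s actual results.

```python
import matplotlib.pyplot as plt

# Hypothetical tallies for "To what extent were you already familiar
# with all of these approaches?" -- not the actual workshop data.
options = ["A: Not at all", "B: Somewhat", "C: Mostly", "D: Completely"]
counts = [4, 12, 7, 2]

fig, ax = plt.subplots()
ax.bar(options, counts)
ax.set_ylabel("Responses")
ax.set_title("Familiarity with the approaches (illustrative)")
plt.tight_layout()
plt.show()
```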

These responses appear in real time, allowing the audience to see where they fit compared to the rest of the group. This allowed for discussion between questions (e.g. those who nominated ‘strongly disagree’ in Figure 2 could, if they felt like it, explain why they felt that way).

Figure 2: Audience members’ level of agreement with the grouping of approaches

Other uses include collecting data on your audience members. For example, Figure 3 would suggest that most prefer the ‘logic’ style of evaluation; however, there were many generalists in the audience as well. Additionally, seven people were not sure what they preferred. Again, this allows for further discussion among the groups.

Figure 3: Preferred evaluation style of audience members

The total number of responses is also provided so that you are able to quickly see what proportion of your audience has responded (bottom right in Figure 4).

Figure 4: Owner of the evaluation during 'goal free'

What workshop attendees found most engaging, though, were the qualitative (open) questions. Figure 5 is a ‘word cloud’ style question. Participants provided three words that described their own approach to evaluation; the more often the same word was used, the bigger it appears. Similar to the previous questions, this allows audience members to obtain an understanding of what their peers are thinking relative to themselves.


Figure 5: Three words that describe audience members’ approaches to evaluation
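If you want to reproduce a Figure 5-style display from your own exported responses, the sketch below uses the open-source Python wordcloud package (a third-party library, not part of Poll Everywhere). The three-word answers shown are invented placeholders.

```python
from wordcloud import WordCloud  # pip install wordcloud
import matplotlib.pyplot as plt

# Invented placeholder answers to "What are three words you would use
# to describe your own approach to evaluation?"
answers = [
    "participatory pragmatic collaborative",
    "pragmatic rigorous participatory",
    "flexible pragmatic utilisation-focused",
]

# More frequent words are drawn larger, as in Figure 5.
cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate(" ".join(answers))

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```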

Finally, responses to open questions can also be presented verbatim on the page (see Figure 6). Although not shown on this particular slide, attendees also submitted questions this way. This means that in larger forums you could have participants text message questions in during the talk and then address them during question time at the end.

Figure 6: How the approaches relate to audience members’ work

4 Discussion

There were many benefits to the use of Poll Everywhere in the workshop session. These are briefly discussed below.

A simple to use audience engagement tool

Audience members were generally quite engaged through the use of Poll Everywhere. The real-time viewing of information allowed attendees to gain an instant view of what their peers thought, which facilitated discussion. Responses were anonymous (though there is a function that allows people to sign up to Poll Everywhere, which can turn this off), which could be useful when working with populations who are sensitive to this. Alternatively, you could simply ask an identifying question first, as this would then be the first entry for each row (a simple workaround compared to getting each participant to sign up to Poll Everywhere).

Easy data collection

The data that was collected can be downloaded into Excel, which allows for forms of analysis other than those shown in the figures above. Additionally, once participants commence responding, they each receive their own row in Excel. As such, if you were collecting pre- and post-workshop data, you could use Poll Everywhere to keep it organised for you. This also removes the need for data entry of hardcopy feedback forms.

A cost-effective tool

The workshop used the cheapest paid plan ($19 a month²), which allows for 50 responses per poll³. You can have as many different polls as you like, and there is a useful ‘grouping’ feature which allows the questions to be grouped into sub-folders. Given the low cost, it is not an unreasonable platform to add onto a project’s budget – particularly given you save the time of having to enter hardcopy data from feedback forms.
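As a minimal sketch of what that downstream analysis might look like, the snippet below loads an exported file with pandas and tallies a pre/post question. The file name and column names are assumptions for illustration only – the actual export layout depends on your polls.

```python
import pandas as pd

# Hypothetical export: one row per participant, one column per poll.
# The file name and column names below are assumptions, not Poll
# Everywhere's actual export schema.
df = pd.read_csv("poll_everywhere_export.csv")

# Tally a closed question asked before and after the workshop.
print(df["familiarity_pre"].value_counts())
print(df["familiarity_post"].value_counts())

# Simple cross-tab showing how answers moved from pre to post.
print(pd.crosstab(df["familiarity_pre"], df["familiarity_post"]))
```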

Conclusion

This was our first time using Poll Everywhere in a workshop setting, and it was quite well received. It does require the computer presenting the PowerPoint to have an internet connection, so there are limitations in that sense; however, it is a tool that First Person Consulting will be using moving forward.

² There are no single-event plans – their argument being that you need to set up the poll, test it and perhaps return to it later.
³ This is how they break up their pricing; for example, the next plan level allows for 200 responses per poll.


Appendix 1

Sample of the approaches to evaluation – grouped by ‘style’

Outcome Mapping (Logic)

Outcome mapping is a methodology for planning, monitoring and evaluating initiatives. At the planning stage, the process of outcome mapping helps a project team or program be specific about who it intends to target, the changes it hopes to see and the strategies appropriate to achieve these.

• Outcome mapping involves 12 steps in three stages: intentional design; outcome and performance monitoring; and evaluation planning.
• Its process is participatory. This process is very effective at illuminating different perspectives of the initiative and the underlying values of different stakeholders.
• It is important to collect data not just about an intervention but also about the internal performance of the initiative, in particular any learning.
• Data are used to reconstruct pathways of change. Likewise, a retrospective assessment based on the approach will generate alternative and complementary explanations.
• Outcome Mapping provides a process and guidelines for continuous reflection among key actors involved in the initiative. By building in participation from the start, Outcome Mapping maximises the chances that findings will result in actual changes on the ground.

Randomised control trials (Logic)

Randomised controlled trials (RCTs), or randomised impact evaluations, are a type of impact evaluation which uses randomised access to social programs as a means of limiting bias and generating an internally valid impact estimate.

• The steps of a randomised control trial involve developing a logic model, a baseline survey, a randomised sample from which a treatment group and a control group are drawn, a follow-up survey, and a comparison of results between the treatment and control groups.
• RCTs by definition must specify key evaluation questions.
• A deep understanding of the sample is essential: who is the target population?
• Given the importance of establishing causality, it is useful to highlight the role of the control group as counterfactual (illustrated in the sketch after this list).
• By design, RCTs cannot determine impacts of currently existing projects – that is, of programs that have already launched and did not, by chance, randomly deliver their services.
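To illustrate the counterfactual logic noted above, here is a toy Python sketch (not part of the original handout) that randomly assigns a simulated sample and estimates impact as the difference in mean outcomes between treatment and control. All numbers are simulated.

```python
import random

random.seed(1)  # reproducible toy example

# Simulated baseline sample of 200 people, randomly split in two.
sample = list(range(200))
random.shuffle(sample)
treatment, control = sample[:100], sample[100:]

def outcome(treated: bool) -> float:
    # Baseline of 50, noise, plus an assumed true effect of 2 for
    # treated units -- purely invented for this illustration.
    return 50 + random.gauss(0, 5) + (2 if treated else 0)

y_t = [outcome(True) for _ in treatment]
y_c = [outcome(False) for _ in control]

# Because assignment was random, the control group approximates the
# counterfactual, so the difference in means estimates the impact.
impact = sum(y_t) / len(y_t) - sum(y_c) / len(y_c)
print(f"Estimated impact: {impact:.2f}")
```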


Realist evaluation (Logic)

Realist philosophy considers that an intervention works (or not) because actors make particular decisions in response to the intervention. The ‘reasoning’ of the actors in response to the resources or opportunities provided by the intervention is what causes the outcomes.

• The purpose of a realist evaluation is as much to test and refine the programme theory as it is to determine whether and how the programme worked.
• The choice of data collection and analysis methods and tools is guided by the types of data that are needed to test the initial programme theory in all its dimensions.
• Realist evaluation explains the change brought about by an intervention by referring to the people who act and change a situation under specific conditions and under the influence of external events.
• Once patterns of outcomes are identified, the mechanisms generating those outcomes can be analysed, provided the right kinds of data are available. The contexts in which particular mechanisms did or did not ‘fire’ can then be determined. This is called a context-mechanism-outcome configuration.
• The last phase of the analysis consists of determining which configuration(s) offers the most robust and plausible explanation of the observed pattern of outcomes. This resulting configuration is then compared with the initial programme theory, which is modified (or not) in light of the evaluation findings.

Innovation History (Logic)

People who have been involved in the innovation jointly construct a detailed written account (sometimes referred to as a ‘learning history’) based on their recollections and on available documents. Innovation histories are underpinned by two sets of concepts that guide data gathering and analysis. The first is called the ‘learning selection’ model and the second comes from social network analysis.

• The process of preparing this history stimulates discussion, reflection and learning among stakeholders.
• Subsequent planning drawing on the innovation history can (i) build on the lessons learned, (ii) inform a shared vision, (iii) act as a catalyst for change and (iv) improve future performance.
• Based on the initial detailed account of the innovation process, more concise information can be prepared that summarises the innovation process for internal use.
• Information designed for wider dissemination can help external parties build upon and expand their knowledge and understanding about how innovations are brought about.
• These concepts help participants to appreciate innovation as an evolutionary process driven by experiential learning cycles.


Participatory evaluation (Continuous improvement)

Participatory evaluation is an approach that involves the stakeholders of a programme or policy in the evaluation process. This involvement can occur at any stage of the evaluation process, from the evaluation design to the data collection and analysis and reporting.

• A participatory approach can be taken with any evaluation design, and with quantitative and qualitative data.
• The first step in a participatory approach is to clarify what value this approach will add, both to the evaluation and to the stakeholders who would be involved.
• There are a number of reasons to use this approach:
  o Involving stakeholders in the process of an evaluation can lead to "better data, better understanding of the data, more appropriate recommendations, [and] better uptake of findings".
  o It is ethical to include the people to be affected by a programme or policy in the process to inform relevant decisions.

Developmental evaluation (Continuous improvement)

Developmental Evaluation guides the implementation of innovation by using a process that allows for close monitoring so that adaptations can be made quickly and the innovation improved. Patterns of change emerge from rapid, real time interactions that generate learning, evolution, and development.

• This approach works well in complex environments for social interventions and innovations where solutions are uncertain and key stakeholders are in conflict about how to proceed.
• The originators of this approach liken it to the role of research and development in the private sector product development process, because it facilitates real-time, or close to real-time, feedback to program staff, thus facilitating a continuous development loop.
• It is an approach that is responsive to context.
• It is particularly suited to innovation, radical program re-design, replication, complex issues and crises.
• Developmental evaluation helps by framing concepts, testing quick iterations, tracking developments and surfacing issues.


Appreciative Inquiry (Empowerment)

Appreciative Inquiry is an approach to organisational change which focuses on strengths rather than on weaknesses. While Appreciative Inquiry has always had an evaluative focus (working out what is working well and seeking to improve performance and conditions), in recent years there have been explicit efforts to embed its principles and processes in formal evaluation. Appreciative Inquiry inquires into, identifies and further develops the best of “what is” in order to create a better future. Appreciative Inquiry is often presented in terms of a four-step process around an affirmative topic choice:

1. DISCOVER: What gives life? What is the best? Appreciating and identifying processes that work well.
2. DREAM: What might be? What is the world calling for? Envisioning results, and how things might work well in the future.
3. DESIGN: What should be – the ideal? Co-constructing – planning and prioritising processes that would work well.
4. DESTINY (or DELIVER): How to empower, learn and adjust/improvise? Sustaining the change.

Democratic Evaluation (Empowerment)

Democratic Evaluation is an approach where the aim of the evaluation is to serve the whole community. This allows people to be informed of what others are doing, and sees the evaluator as someone who brokers the process.

• It generally focuses on inclusive practices which foster participation and collaboration.
• However, it is also used as a means of ensuring public accountability and transparency.
• The basic value is an informed citizenry.
• The evaluator acts as a broker in exchanges of information between groups who want knowledge of each other.
• The key concepts of democratic evaluation are “confidentiality,” “negotiation,” and “accessibility”.
• The role of Democratic Evaluation is challenging monopolies of various kinds – of problem definition, of issue formulation, of data control, of information utilisation.


Beneficiary assessment (Empowerment)

Beneficiary assessment is a tool for managers who wish to improve the quality of development operations. This is an approach to information gathering which assesses the value of an activity as it is perceived by its principal users.

• The approach is qualitative in that it attempts to derive understanding from shared experience.
• Beneficiary assessment is a systematic inquiry into people’s values and behaviour in relation to a planned or ongoing intervention.
• The ultimate goal of beneficiary assessment is to reveal the meaning people give to particular aspects of their lives, so that development activities may better enhance people’s ability to improve their own living conditions, as they see fit.
• While early assessments were largely for project design, most of the ongoing and planned beneficiary assessments are iterative learning processes undertaken periodically throughout the lifetime of a project.
• While the focus of beneficiary assessment is clearly on the beneficiaries of a development policy or activity, the target is the decision making of the managers responsible for that activity.
• The final report should first be presented to management orally, and subsequently in rough draft. The perspective of management should be incorporated into the final text.

Positive Deviance (Goal free)

Positive Deviance refers to an approach which is premised on the observation that in any context, certain individuals will employ uncommon but successful behaviours or strategies which enable them to find better solutions than their peers. Through the study of these individuals (referred to as “positive deviants”), their innovative solutions are identified and refined for more general benefit. In applying this approach, an investigator must first obtain an invitation from the community in question.

• It is a strength-based approach based around five core principles:
  o first, that communities possess the solutions and expertise to best address their own problems;
  o second, that these communities are self-organising entities with sufficient human resources and assets to derive solutions to communal problems;
  o third, that communities possess a ‘collective intelligence’, equally distributed through the community, which the approach seeks to foster and draw out;
  o fourth, that the foundation of any approach rests on sustainability and the act of enabling a community to discover solutions to their own problems through the study of local “positive deviants”;
  o fifth, that behaviour change is best achieved through practice and the act of “doing”.


Outcome Harvesting (Goal free)

Outcome Harvesting collects evidence of what has changed and, then, working backwards, determines whether and how an intervention has contributed to these changes. Outcome Harvesting has proven to be especially useful in complex situations when it is not possible to define concretely most of what an intervention aims to achieve, or even what specific actions will be taken.

• Outcome Harvesting does not measure progress towards predetermined objectives or outcomes, but rather collects evidence of what has changed and, then, working backwards, determines whether and how an intervention contributed to these changes.
• Depending on the situation, either an external or internal person (“the harvester”) is designated to lead the Outcome Harvesting process.
• From reports, previous evaluations, press releases and other documentation, harvesters identify potential outcomes (i.e. changes in individuals, groups, communities, organisations or institutions) and what the intervention did to contribute to them.
• The harvester needs to engage informants who are knowledgeable about what the intervention has achieved. Field staff who are positioned “closest to the action” tend to be the best informants.
• Harvesters engage directly with informants to review the outcome descriptions based on the document review, and to identify and formulate additional outcomes. Informants will often consult with others inside or outside their organisation who are knowledgeable about outcomes to which they have contributed.
• Harvest users and harvesters review the final outcomes and select those to be verified in order to increase the accuracy and credibility of the findings. The harvesters obtain the views of one or more individuals who are independent of the intervention (third party) but knowledgeable about one or more of the outcomes and the change agent’s contribution.
