Learning Gain and Confidence Gain: design, practice, evaluation Dr Fabio R. Aricò @FabioArico L&T Festival Medway – Sep 2017
YOUR PRESENTER Fabio Aricò National Teaching Fellow Senior Lecturer in Macroeconomics School of Economics – University of East Anglia, UK Research fields • Higher Education policy and practice (widening access, satisfaction) • Technology Enhanced Learning • Self-Assessment and Academic Self-Efficacy
Twitter: @FabioArico
ACKNOWLEDGEMENTS HEA – Teaching Development Grant Scheme
HEFCE Piloting and Evaluating Measures of Learning Gain UEA Students, Alumni, and Research Assistants
AIMS & OBJECTIVES My talk will be all about pedagogies which I have designed, used, and evaluated. They are now being rolled out across different disciplines.
• Sharing my practice: take home at least one idea!
• Inspiring research: from research-led teaching to teaching-led research
• Supporting developments in the HE sector
• Supporting personal development: in the UK, the TEF is coming (learning gain), promoting and evidencing excellence.
OUTLINE
1. Core Concepts: Peer-instruction; Self-assessment & Self-efficacy; Learning Gain
2. Peer-Instruction & Self-Assessment in an Active Learning Environment
3. Evaluating the Pedagogy: Learning Gain & Confidence Gain
4. Students’ Appraisal of the Pedagogy & Closing the Feedback Loop
ETHICAL REMARK
You will be presented with data collected during teaching sessions. Students involved have given informed consent for me to analyse their responses and present the results of this analysis.
I am happy to answer ethical queries as well; please ask me.
1. Core Concepts Peer-instruction Self-assessment & Self-efficacy Learning Gain
FLIPPED CLASS and PEER-INSTRUCTION
• Flipped classroom & Peer-Instruction: pre-reading + student interaction. Mazur (1997); Henderson and Dancy (2009). Well-developed research in Physics and STEM.
• Learning analytics for Peer-Instruction. Learning gains: Mazur Group; Bates & Galloway (2012). Student satisfaction: Hernandez Nanclares & Cerezo Menendez (2014).
• There is not much literature on the links with self-assessment skills: an open field with many unanswered questions, e.g. the role of demographics, language, and previous background. Pedagogically, self-assessment blends with flipping and Peer-Instruction.
SELF-EFFICACY and SELF-ASSESSMENT Academic Self-Efficacy = confidence in performing academic tasks and/or attaining academic goals (Bandura, 1977). Four sources of self-efficacy:
1. Mastery of experiences
2. Vicarious experiences
3. Verbal persuasion
4. Environment and settings
See also: Pajares (1996) and Ritchie (2015).
Idea: students should develop their self-efficacy to master their learning experience. Measure learning gain along with increased self-efficacy: ‘confidence gain’.
LEARNING GAIN
Policy-driven research to assess student learning: Arum & Roksa (2010), ‘Academically Adrift’ (US data).
OECD approach: assessment of learning outcomes; AHELO Project (OECD, 2011 and 2014).
US & UK approach: learning gain (‘distance run’ over time); McGrath et al. (2015).
HEFCE commissioned research on piloting measures of learning gain; outputs to feed into the Teaching Excellence Framework.
2. Peer-Instruction & Self-Assessment in an Active Learning Environment
ACTIVE LEARNING ENVIRONMENT Introductory Macroeconomics (from 2013 to 2017)
• year-long module (compulsory 1st year)
• 250 students (started with 140 in 2013; 250 in the past 2 years)
• 22 lectures (2 hrs per week)
• 8 seminars (every second week, even weeks)
• 8 workshops (every second week, odd weeks)
Students are endowed with individual Audience Response Systems (clickers); continuous data collection is facilitated by technology; comprehensive ethical approval was obtained beforehand.
WORKSHOPS – teaching algorithm
1. Round 1: formative question, 4 choices; no information shared, no answer given.
2. Self-Assessment 1: confidence question, 4-level Likert scale; information shared.
3. Peer-Instruction: students talk, compare answers, explain to each other.
4. Round 2: formative question identical to Round 1; information shared, correct answer given.
5. Self-Assessment 2: confidence question, 4-level Likert scale; information shared.
Holding everything else constant in the world economy: If inflation in a country increases, its…
A. exports decrease and imports stay the same;
B. exports decrease and imports increase;
C. exports increase and imports decrease;
D. imports increase.
“I think I have the skills/knowledge to answer the previous question correctly”
A. strongly agree;
B. agree;
C. disagree;
D. strongly disagree.
Holding everything else constant in the world economy: If inflation in a country increases, its…
A. exports decrease and imports stay the same;
B. exports decrease and imports increase;
C. exports increase and imports decrease;
D. imports increase.
“NOW I think I have the skills/knowledge to answer the previous question correctly”
A. strongly agree;
B. agree;
C. disagree;
D. strongly disagree.
WORKSHOPS – asking the right questions
Bloom’s Taxonomy:
• Aim to climb up the pyramid
• Do not trivialise MCQs
See work by Simon Lancaster.
3. Evaluating the Pedagogy: Learning Gain & Confidence Gain
DATASETS AND CODING

Student |  Q1   Q2   Q3   …
   1    |   0    1    1   …
   2    |   1    0    0   …
   3    |   1    1    …   …
   …    |   …    …    …   …

Formative questions: 1 = correct, 0 = incorrect.
Confidence questions: 1 = strongly agree / agree, 0 = disagree / strongly disagree.
Row averages give performance (or confidence) per student; column averages give performance (or confidence) per question.
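Under this coding, each dataset is a binary student-by-question matrix, so the per-student and per-question aggregates are just row and column averages. A minimal sketch in plain Python (the data values are hypothetical):

```python
# Hypothetical coded responses: rows = students, columns = questions (Q1..Q3).
# For formative questions: 1 = correct, 0 = incorrect.
# (The same layout works for confidence questions: 1 = agree, 0 = disagree.)
performance = [
    [0, 1, 1],  # student 1
    [1, 0, 0],  # student 2
    [1, 1, 1],  # student 3
]

def row_means(matrix):
    """Average over questions: performance (or confidence) per student."""
    return [sum(row) / len(row) for row in matrix]

def col_means(matrix):
    """Average over students: performance (or confidence) per question."""
    n = len(matrix)
    return [sum(row[j] for row in matrix) / n for j in range(len(matrix[0]))]

per_student = row_means(performance)
per_question = col_means(performance)
```

The same two helpers serve both datasets, since formative and confidence responses are coded identically.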
OPERATIONALISING TWO GAINS
For each formative assessment question, comparing 1st (R1) and 2nd (R2) responses:
Normalised Learning Gain (NLG) = (% correct R2 − % correct R1) / (100% − % correct R1)
For each self-assessment question:
Normalised Confidence Gain (NCG) = (% confident R2 − % confident R1) / (100% − % confident R1)
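The two gains share one formula, applied to % correct (NLG) or % confident (NCG). A minimal sketch, with a function name of my own choosing:

```python
def normalised_gain(pct_r1, pct_r2):
    """Normalised gain: (R2 - R1) / (100 - R1), percentages in [0, 100].

    Applied to % correct it yields the Normalised Learning Gain (NLG);
    applied to % confident it yields the Normalised Confidence Gain (NCG).
    """
    if pct_r1 >= 100:
        return 0.0  # no headroom left: nothing to gain
    return (pct_r2 - pct_r1) / (100 - pct_r1)

# Example: a question answered correctly by 40% in Round 1 and 70% in Round 2
nlg = normalised_gain(40, 70)  # (70 - 40) / (100 - 40) = 0.5
```

Note that the gain is negative when Round 2 responses are worse than Round 1, which is why the NCG axis in the later plots extends below zero.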
PEDAGOGY EVALUATION STRATEGY
1. Is the pedagogy developing good self-assessment skills? Are students self-assessing correctly in Rounds 1 and 2?
2. Is peer-instruction able to generate learning/confidence gain? How does learning/confidence gain relate to initial knowledge/confidence levels (Round 1)?
3. Is learning gain associated with confidence gain? Does the structure of the algorithm affect this relationship?
2016: Vicarious Experience Scenario (VES) only.
2017: Mastery of Experience Scenario (MES) contrasted with VES (4 sessions each).
WORKSHOPS – contrasting the two teaching algorithms: VES vs MES
Both scenarios build on the same sequence:
1. Round 1: formative question, 4 choices; no information shared, no answer given.
2. Self-Assessment 1: confidence question, 4-level Likert scale; information shared.
3. Peer-Instruction: students talk, compare answers, explain to each other.
4. Round 2: formative question identical to Round 1; information shared, correct answer given.
5. Self-Assessment 2: confidence question, 4-level Likert scale; information shared.
RESULT – Self-assessment Skills, Round 1
[Scatter plot: % confident in Round 1 against % correct in Round 1; axis markers 0.44 and 0.33]
2016 VES: β = 0.429***, R² = 0.56
2017 VES: β = 0.367***, R² = 0.59
2017 MES: β = 0.09***
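The slides report slopes (β) and R² from fitting % confident against % correct. A sketch of how such a fit can be computed with NumPy, on hypothetical per-question data:

```python
import numpy as np

# Hypothetical Round 1 data per question: fraction answering correctly
# and fraction reporting confidence (agree / strongly agree).
pct_correct   = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
pct_confident = np.array([0.30, 0.38, 0.45, 0.55, 0.62])

# OLS fit: % confident = alpha + beta * % correct
beta, alpha = np.polyfit(pct_correct, pct_confident, 1)

# R^2 from the fitted values
fitted = alpha + beta * pct_correct
ss_res = np.sum((pct_confident - fitted) ** 2)
ss_tot = np.sum((pct_confident - pct_confident.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

A slope near 1 would indicate confidence tracking performance one-for-one; the flatter slopes on the slide indicate more cautious self-assessment.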
RESULT – Self-assessment Skills, Round 2
[Scatter plot: % confident in Round 2 against % correct in Round 2; axis markers 0.60 and 0.57]
2016 VES: β = 0.282***, R² = 0.26
2017 VES: β = 0.322***, R² = 0.64
2017 MES: β = 1.44***
RESULT – Normalised Gains in 2016
[Plot: NLG and NCG against % correct/confident in Round 1; axis markers 0.85, 0.52, 0.37, 0.48]
NLG: R² = 0.39
NCG: R² = 0.19
RESULT – Normalised Gains in 2017
[Plot: NLG and NCG against % correct/confident in Round 1; axis markers 0.67, 0.40, 0.42]
NLG: R² = 0.21
NCG: R² = 0.42, β = 0.132**
RESULT – Learning Gain & Confidence Gain
[Scatter plot: NCG against NLG, with the NLG axis extending to −1; axis markers 0.31 and 0.28]
2016: R² = 0.16
2017 VES: β = 0.19***, R² = 0.39
2017 MES: β = 0.125**
4. Students’ Appraisal of the Pedagogies
Closing the Feedback Loop
WHAT DO STUDENTS THINK?
• The literature evaluating TEL and Peer-Instruction is far too focused on whether students ‘enjoy’ their experience (student satisfaction), as is typical of the academic practice literature.
• I want to give more focus to the perception of learning:
  - 1st lecture: introduced the concept of Peer-Instruction and asked the students to share what they think of it.
  - each workshop: asked students to share their view on the session and whether they felt they learnt from each other.
  - informal end-of-module feedback: what was the most effective component of the blended learning environment mix within the module?
1st lecture: “‘Peer-instruction’ sessions (students teaching each other) are more effective than lectures (teacher teaching students)”
[Bar chart: share of respondents answering strongly agree / agree / disagree / strongly disagree; vertical axis 0–45%; No. of respondents = 82]
Workshop feedback statement: “I have learnt more Economics by discussing answers with my classmates”
[Bar chart: share of respondents per workshop answering strongly agree / agree / disagree / strongly disagree; vertical axis 0–80%]
Respondents: Wk 4: 89; Wk 6: 38; Wk 8: 72; Wk 14: 69; Wk 16: 52; Wk 18: 39; Wk 20: 40
Comparing student opinion about Peer-Instruction as an effective pedagogy, before and after exposure
[Bar chart: strongly agree / agree / disagree / strongly disagree, for 1st Lecture (N = 82) vs average workshop (N = 57); vertical axis 0–70%]
End-of-module Feedback: What is the component of the Macro module which had the strongest impact on your learning?
[Pie chart over: Lectures, Seminars, Workshops, Support, VLE, NA; segment shares: 51%, 20%, 13%, 6%, 6%, 4%]
FEEDBACK to STUDENTS
• During polling sessions: show results, talk to the students, encourage them; use comparative links; show them their progress; explain what you are doing and why you are doing it.
• After polling sessions: share session reports on your VLE and comment on them; use mail-merge to send individual reports.
• Go beyond polling: craft your own perfect pedagogical blend; be aware of different student needs.
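As an illustration of the individual reports, a minimal mail-merge-style sketch in Python (student IDs, field names, and wording are all hypothetical; the actual reports could be produced with any mail-merge tool):

```python
# Hypothetical per-student results keyed by student ID.
results = {
    "s001": {"correct_r1": 0.40, "correct_r2": 0.75, "confident_r2": 0.80},
    "s002": {"correct_r1": 0.55, "correct_r2": 0.60, "confident_r2": 0.50},
}

# Report template with merge fields filled per student.
TEMPLATE = (
    "Dear student {sid},\n"
    "In this workshop you answered {r1:.0%} of questions correctly in Round 1 "
    "and {r2:.0%} in Round 2, and reported confidence on {conf:.0%} of questions.\n"
)

def build_reports(results):
    """Render one personalised report per student, mail-merge style."""
    return {
        sid: TEMPLATE.format(sid=sid, r1=r["correct_r1"],
                             r2=r["correct_r2"], conf=r["confident_r2"])
        for sid, r in results.items()
    }

reports = build_reports(results)
```

The rendered texts can then be attached to individual emails or VLE messages through whatever mail-merge facility the institution provides.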
FINAL REMARKS
• It took me 4 years to develop this teaching approach and its evaluation. Think big… but start small, and build from there.
• It’s not about the technology; it’s about the pedagogy and, most importantly, the students.
• Be concerned about ethics, but do not be discouraged or scared (students are not: JISC, 2016).
• Choose your demonstrators carefully: your ‘average Fabio’ might be more convincing than a pedagogy expert or a techno-hyper-enthusiast.
STAY IN TOUCH!
[email protected]
@FabioArico “Promoting Active Learning Through Peer-Instruction and SelfAssessment: A Toolkit to Design, Support and Evaluate Teaching”, Educational Developments, SEDA, 17.1, 15-18.
Learning Gain and Confidence Gain: design, practice, evaluation Dr Fabio R. Aricò @FabioArico L&T Festival Chatham – Sep 2017