5 Key Ideas Underlying Effective Formative Assessment

The practice of formative – or “informal” – classroom assessment has been around for years, but it was probably the research of Paul Black and Dylan Wiliam (1998) that made educators stop and think more deeply about what makes an assessment formative – more specifically, the purpose and potential impact that the use of assessment evidence can have on future learning. While there are a variety of definitions of formative assessment in the literature, most agree that formative assessment is assessment FOR learning – assessment that provides information about what students know now in relation to where they are going, and that is used to help them get to the intended learning target. Black and Wiliam’s research highlighted that students whose teachers use assessment formatively achieve significantly better than matched control groups receiving more traditional approaches to teaching and testing. Several key ideas emerge from the last two decades of research on effective uses of formative assessment.

Key Idea #1: Authentic assessment is continuous. Formative assessment is both integral to the cycle of learning and part of a balanced assessment system. The success of formative assessment use within a local assessment system (formative assessment + interim assessment + summative assessment) is highly related to the quality of student involvement and how effectively teachers plan for and use assessment data to adjust instruction.

Key Idea #2: Formative assessment may take different forms, but should always inform instruction and learning. Feedback from formative assessment is based on different sources of observable evidence (written, oral, visual, kinesthetic, etc.) and is used to guide next steps in instruction and learning. Formative assessment is constantly occurring. It may be (a) “in-the-moment” (e.g., quick checks for understanding, probing questions during instruction based on what was just heard or observed), (b) designed with a specific purpose and learning target in mind (exit card, pre-assessment, conferencing, planned formative “probe”), or (c) curriculum embedded, such as formatively using interim assessments (mini summative assessments, such as performance tasks) to monitor student progress across the school year.

Key Idea #3: Feedback is multi-faceted and used to gauge how close a student is to the intended learning target. A balance of feedback coming from three key sources - from teachers, from peers (e.g., peer tutoring, peer editing, peer conferencing), and from self-assessment tools (e.g., Hess’ ‘what I need to do & what I did’ rubrics) - has been shown to enhance the effectiveness of formative assessment use.

Assessment evidence can be based on a variety of observable artifacts (e.g., portfolios, works in progress, systematic observations of individual or group activities, classroom discourse, performance tasks coupled with a scoring guide or rubric based on intended learning targets and success criteria). Feedback to the student is primarily descriptive: it emphasizes strengths, identifies challenges, and points to possible next steps in learning based on the intended learning targets.

Key Idea #4: Students are actively involved in formative assessment. Active involvement means students use assessment evidence to set and monitor progress towards learning goals, reflect on themselves as learners, and evaluate the quality of their performance. Valuing both one’s struggles and successes in accomplishing smaller learning targets over time has been proven to have a profound influence on deepening motivation, developing independence as a learner, and building what we have come to know as “a growth mindset.”

Key Idea #5: All high-quality assessment utilizes three key components – understanding how one learns, how one demonstrates what was learned, and how we interpret/measure the evidence observed. The concept of the Assessment Triangle, first presented by Pellegrino, Chudowsky, and Glaser in Knowing What Students Know/KWSK (NRC, 2001), is shown below. “The assessment triangle explicates three key elements underlying assessment: ‘a model of student cognition and learning in the domain, a set of beliefs about the kinds of observation that will provide evidence of students’ competencies, and an interpretation process for making sense of the evidence’ (NRC, 2001, p. 44). KWSK uses the heuristic of an ‘assessment triangle’ to illustrate the relationships among learning models, assessment methods, and inferences one can draw from the observations made about what students truly know and can do” (Hess, Burdge, & Clayton, 2011, p. 184). Assessment design (formative-interim-summative) and planning should consider all three.

The Assessment Triangle (NRC, 2001, p. 44) connects three elements:

Cognition: Beliefs about how humans represent information and develop competence in a particular academic domain.

Observation: A set of specifications for assessment tasks that will elicit illuminating responses from students.

Interpretation: The methods and analytic tools used to make sense of and reason from the assessment observations/evidence.

(Note from the figure: learning progressions research focuses on how competence develops over time.)

Learning progressions offer a coherent starting point for thinking about how students develop competence in an academic domain over time and how to observe and interpret the learning as it unfolds (Hess, 2010; Hess, 2011). Progress indicators (PIs) in a learning progression describe typical observable evidence of learning along the learning continuum for each larger learning objective (e.g., Students will apply organizational strategies and multiple reference sources to analyze, integrate, and communicate fact-based information on topics, concepts, and events for authentic and varied audiences; Students will apply reasoning using properties of two- and three-dimensional shapes to analyze, represent, and model geometric relationships). Teachers can utilize the descriptions in a learning progression to plan instruction and assessment tasks, as well as to interpret where students are on their learning pathways.


A conceptual view of learning progressions is one of overlapping learning “zones” along a learning continuum (Hess, 2008). At the lower end of the progression are “Novice” learners (at any grade level), who may (or may not) demonstrate the necessary prerequisite skills and concepts that can be built upon over time. A starting point for learning can be established, perhaps with a short pre-assessment or formative diagnostic assessment. Guided, targeted, and scaffolded practice can be employed to develop subsets of understanding (skills/concepts broken into smaller, manageable, and meaningful learning chunks). Later during the instructional cycle, instruction and assessment target students’ ability to develop schemas to organize and connect new learning and to work more independently - what “Expert” performers consistently do. (The Zone of Proximal Development is the range of potential each person has for learning at any given time; Vygotsky, 1978.)


Table 1: Sample Formative-Interim-Summative Assessment Planning along a Learning Continuum

Step 1 – Determine the unit outcome.
Step 2 – Describe the unit summative assessment and evidence that will be used to determine proficiency (content + process + product) and possible unit learning extensions.
Step 3 – Break the unit into a progression of lesson-based learning targets.
Step 4 – For each lesson, consider what evidence can be used formatively to monitor progress and make instructional decisions.

Grade 3 ELA-SS Unit of Study: Opinion Writing – Students will be able to use text evidence to develop and support an opinion on a topic/issue/event studied in social studies.

Continuum of lesson-based learning targets, with pre-requisite or mid-assessment evidence:

1. Generate ideas for writing; understand unique features of opinion writing. Evidence: Pre-assessment – write a personal opinion about…(topic) with use of supporting evidence.
2. Develop an understanding of a topic/text; locate evidence; organize information relating to opposing sides of an issue.
3. Gather and organize text evidence in a T-chart; analyze opposing ideas.
4. Frame an introduction; select relevant facts, details, etc. to support the stated opinion.
5. Distinguish fact from opinion; relevant from non-relevant facts. Evidence: Mid-assessment – use two short texts to develop an opinion with supporting evidence.
6. Elaborate on each reason; use transitions to connect ideas; link focus to conclusion. Evidence: Mid-assessment – plan or analyze opinion pieces using a graphic organizer (e.g., Hess’ Anatomy of an Opinion); Mid-assessment – peers edit and revise opinion pieces.
7. Revise and edit for clarity of message, word choice, etc.

Summative Assessment Evidence (based on the unit objective):
Content – Choose a prompt (topic/event) from the list provided by the teacher.
Process/DOK – Gather evidence from multiple informational sources to support your response (DOK 4).
Product – Develop an opinion essay or presentation, responding to the prompt and using supporting evidence from (at least) two sources.

Possible Learning Extensions: Select two or more sources presenting the same event/issue/story (historical fiction, news story, biography, etc.). Analyze the varying perspectives of the same event.

“… consider where the lesson resides in the larger learning trajectory… The right learning target for today’s lesson builds upon the learning targets from previous lessons in the unit and connects to learning targets in future lessons to advance student understanding of important skills and concepts” (Moss & Brookhart, Learning Targets, 2012, p. 2).


References & Formative Assessment Resources

Black, P. & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. London: Granada Learning.

Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning: Putting it into practice. Berkshire, England: Open University Press.

Greenstein, L. (2010). What teachers really need to know about formative assessment. Alexandria, VA: ASCD.

Hess, K. (2008). Developing and using learning progressions as a schema for measuring progress. Paper presented at the 2008 CCSSO Student Assessment Conference, Orlando, FL. http://www.nciea.org/publications/CCSSO2_KH08.pdf

Hess, K. K. (Ed. & Principal author) (2010). Learning progressions frameworks designed for use with the common core state standards in mathematics K-12. National Alternate Assessment Center at the University of Kentucky and the National Center for the Improvement of Educational Assessment. http://www.nciea.org/publications/Math_LPF_KH11.pdf

Hess, K. K. (Ed. & Principal author) (2011). Learning progressions frameworks designed for use with the common core state standards in English language arts & literacy K-12. National Alternate Assessment Center at the University of Kentucky and the National Center for the Improvement of Educational Assessment. http://www.nciea.org/publications/ELA_LPF_12%202011_final.pdf

Hess, K., Burdge, M., & Clayton, J. (2011). Challenges to developing alternate assessments. In M. Russell (Ed.), Assessing students in the margins: Challenges, strategies, and techniques. Information Age Publishing.

Hess, K. & Gong, B. (2014). Ready for college and careers? Achieving the Common Core Standards and beyond through deeper, student-centered learning. Quincy, MA: Nellie Mae Education Foundation. http://www.nmefoundation.org/resources/scl-2/ready-for-college-and-career

Keeley, P. (2008). Science formative assessment: 75 practical strategies for linking assessment, instruction, and learning. Joint publication: Corwin Press & NSTA Press.

Moss, C. & Brookhart, S. (2012). Learning targets: Helping students aim for understanding in today's lesson. Alexandria, VA: ASCD.

National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.), Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

Vygotsky, L. S. (1978). Mind and society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wiliam, D. & Leahy, S. (2015). Embedded formative assessment: Practical techniques for K-12 classrooms. West Palm Beach, FL: Learning Sciences International.


Beyond Quick Checks for Understanding: Strategic Analysis of Formative Assessment Use

A few years ago, I was walking the hallways of an elementary school with bulletin board displays of data showing rising quarterly test scores of their students. It was clear to me that they had been working hard and were proud of their results. When I asked several teachers how they were using the assessment evidence from the popular, off-the-shelf “formative” assessment they had purchased, they looked a bit puzzled. This online resource provided quick student results, using test questions aligned to the Common Core State Standards; however, most teachers were not sure whether the specific skills or concepts being tested at any given time linked directly to what they had recently been teaching. Most were unsure of what depth of understanding of content was being tested. When I asked a few students what they had to do to get better test scores, the typical answer was “I just have to work harder!” I think they probably were already working very hard. I don’t think they were sure WHAT they should be working hard on.

Knowing how important the interpretation of formative assessment evidence is to informing the next steps in teaching and learning, I designed a tool to help teachers analyze both the purpose and use of the formative assessment tasks they were embedding in their lessons. Hess’ Tool #10 (in Module 3 of Linking Research with Practice: A Local Assessment Toolkit to Guide School Leaders) takes only a few minutes to complete: first examine what content and reasoning skills are being tested, then describe what possible correct – or confused – responses might be elicited, and finally determine what next steps might be taken no matter the results. After much piloting and refining of the tool, I’ve created an interactive version that you can download.

Download Hess’ Tool #10 – Analyzing Formative Assessments Strategic Planning Tool – now! http://media.wix.com/ugd/5e86bd_aeb8c4992a2c43178732201ac8f71448.pdf


© Karin Hess (2015) Educational Research in Action, LLC