
From Good to Great, Initial to Ideal: A Way to Improve Exhibitions and Other Performance Assessments

CompetencyWorks Blog

Author(s): Adam Watson

Issue(s): Issues in Practice, Lead Change and Innovation, Create Balanced Systems of Assessments, Learn Lessons from the Field


Several things brought me back to this post by Adam Watson about bringing performance assessments from good to great and from initial to ideal. NGLC’s new Portrait of a Graduate in Practice series explores how five schools and districts use their graduate portrait to shape teaching and learning. This EdWeek piece, Fighting Senioritis? This New Requirement Kept a Graduating Class Engaged, tells the story of one school’s journey with a 1-credit mastery-based senior project to meet Connecticut’s new graduation requirement for the class of 2023. Finally, I recently spent a day as a community member juror for senior presentations at the Parker Charter Essential School (more to follow on this; if you are interested, check out this post about Parker’s 8th and 10th-grade gateway presentations).

When the post was originally published on Wednesday, January 26, 2022, on Adam Watson’s Edtech Elixirs, the author was the Digital Learning Coordinator for Shelby County Public Schools in Kentucky. Adam is now a Deeper Learning Design Specialist for the Ohio Valley Educational Cooperative (OVEC), also in Kentucky.

– Laurie, CompetencyWorks Director

Photo by Next Generation Learning Challenges (NGLC)

 

When our district began its journey into competency-based education (CBE), one of the significant early steps was the publication of our Profile of a Graduate (PoG) in 2017. The Profile was crucial in developing our current Strategic Leadership Plan for 2018-2022, where Personalized Learning is the top strand. “Embedded Performance Assessments” – demonstrated applications of student knowledge intentionally timed at critical points of a student’s learning path – are a feature mentioned throughout that strand, and one of our frequent examples of such an assessment is the student exhibition: a culminating, celebratory event where students can “show what they know,” often with parents and community members attending as an audience.

From a certain perspective, student exhibitions are not new. Indeed, if the phrase conjures up images of a junior high science fair with trifold presentation boards, homemade dioramas, and solar system mobiles, or perhaps cute elementary students dressed up as famous people for a living wax museum, you would be forgiven. But there is a difference when defining exhibitions in the context of CBE or as a high-quality performance assessment. Features of such an exhibition would include the instructional intentionality with which teachers design and assign them, the agency of the students behind them, and the depth of learning mastery the exhibition can demonstrate. Furthermore, if your objectives for student exhibitions include students producing rich artifacts of learning and having practice runs leading up to a rigorous Defense of Learning summative performance assessment – more on both of these later – it is imperative that students exhibit, and exhibit often.

For a further exploration and explanation of the power of student exhibitions, watch this short video (2:14) from High Tech High:

Our district was not naive about what it would take to achieve these richer and deeper exhibition experiences. Teachers needed to learn strategies and see good models for student-centered learning; students could not become empowered, confident, and self-directed overnight. However, one of the wisest moves from our leadership was to encourage our teachers, and by extension students, to just try. We knew that some of our earliest exhibitions would be rough and wouldn’t look much different from those old-fashioned science expos with the trifold presentation boards, but we had to start somewhere so that we could establish a baseline, learn, and improve. While we are still (and should always be!) “becoming” and improving, we can use the integration of our Profile of a Graduate as an example of how exhibitions have deepened and gotten better.

At first, students may have been asked to reflect on a Profile competency or two at the very end of the process, if at all, perhaps as an afterthought (“Reflect on which PoG competency you think this exhibition best exemplifies”). But then we developed single-point rubrics to measure whether the student met the competency… teachers began developing exhibitions with an intentional focus on a PoG competency made clear to the student from the start… and we are in the process of launching four-point mastery scales on the PoG competencies to better assess the student’s “performance.”

Image from High Tech High Exhibition video

The launch of the Profile of a Graduate five years ago certainly gave us a new “why” for student exhibitions and a way to deepen and better measure student learning. But looking back and looking forward, what if we had a tool or a more insightful lens for examining performance assessment?

This is where an excellent book enters stage left: Deeper Competency-Based Learning: Making Equitable, Student-Centered, Sustainable Shifts by Karin Hess, Rose Colby, and Daniel Joseph (Corwin, 2020). I’m halfway through reading it as an anchor text for my chosen SCPS PD strand this year, and in full disclosure (and to my delight), Rose Colby also joins Shelby County periodically to virtually lead workshop sessions on CBE for me and other CBE strand learners.** Deeper Competency-Based Learning is dense in the best sense of the term, full of CBE shift strategies, rubrics, and metrics to measure where you are in competency-based education and where you want to be.

In their chapter “Making Shifts in Teaching and Learning Structures,” the authors share a table of criteria types to plan performance assessments (Table 3.6, page 94). The criterion types, example questions typically answered by each type, and the Depth of Knowledge (DOK) one would usually expect for each type are as follows:

  • Process: “Will the student follow particular processes (e.g. procedures for a science investigation; data collection; validating credibility of sources)?” [DOK 2]
  • Form: “Are there formats or rules to be applied (e.g. correct citation format, organize parts of a lab report; use camera shots/visuals; edit for grammar and usage)?” [DOK 1]
  • Accuracy of Content: “List essential domain-specific terms, calculations, concepts, or principles to be applied.” [DOK 1 or 2]
  • Construction of New Knowledge: “How will the student go beyond the accurate solution and correct processes to gain new insights and raise new questions? Are there any personal success skills that might also be employed?” [DOK 3 or 4]
  • Impact: “How will the final product achieve its intended purpose (e.g. solve a complex problem; persuade the audience; synthesize information to create a new product/performance)?” [DOK 3 or 4]

Image from High Tech High Exhibition video

The authors point out that “[a]ll rubric criteria do not need to be included for every summative assessment, but they should [all] be considered during the design phase,” with a particular emphasis on Construction of New Knowledge and Impact for performance assessments that are to truly assess deeper learning (94). Furthermore, these criteria are an excellent first step for either designing a new performance assessment or analyzing an existing one. Additional steps include considering an authentic context for application, identifying a proper format for such a demonstration, determining how much students will have voice and choice in the process or product, and last but not least, identifying and aligning success criteria to the task (94-95).

All of this brings us back to student exhibitions. Even as many exhibition examples in our district had a “good” quality to them, I struggled to articulate feedback on what could make them “great.” One thing I realized early on was the need for students to present an original idea, rather than just a regurgitation of what others have said or done, no matter how well it is packaged or communicated. I put it another way when discussing project-based learning in a previous blog entry: “It is not transformative learning if the end result…is merely a well-presented aggregation of researched bullet points instead of a new idea or creation that shows real reflection and growth of student thinking.” (Note also that the word choice of the criterion Impact, as well as its definition, strongly suggests how a well-designed PBL can end in a high-DOK performance assessment that “impacts” the community outside of the four walls of a school.)

It was these insightful criteria from Deeper Competency-Based Learning that improved how I could evaluate and give feedback on performance assessment. Firstly, as the authors point out, there is nothing inherently wrong with a performance assessment that only remains at the level of Process, Form, and/or Accuracy of Content (and by extension, at DOK 1 or 2). I will label these as Initial performance assessments. Depending on the intentionality of the lesson/unit design, this might very well be instructionally sound – for example, perhaps the competency is only being introduced and full mastery would not be expected.

Image from High Tech High Exhibition video

The issue is when a teacher either stays at the Initial level for all such assessments for the entire course or, worse, misinterprets a performance assessment best designed only for these lower levels as a demonstration of mastery. For a performance assessment to truly apply a student’s knowledge with depth, the student should “gain new insights,” “raise new questions,” “employ personal success skills” (like the ones from our Profile of a Graduate!), “solve a complex problem,” “persuade an audience,” and/or “synthesize information to create a new product or performance.” I would label these Construction of New Knowledge and Impact performance assessments Ideal, especially if a student’s mastery of learning (with a task at DOK 3 or 4) is your goal.

This concept of Initial and Ideal can be further applied to other semantic aspects of teaching and assessment as we transition from the traditional to the transformational. For example, an Initial “folder” might contain artifacts selected by the student, but the wording shows a limited goal; it may be more concerned with being a container of student work than with the depth of the work contained, and it may emphasize an end product rather than an ongoing process.

An Ideal “portfolio,” on the other hand, can suggest a living collection that is ever changing, expanding, and improving over time; a perpetual curation of artifacts that should spur a student to continually reflect on their growth, academically and personally. While this portfolio process may eventually culminate in a Defense of Learning, a type of performance assessment in which the student draws on their body of work to present as evidence of mastery (say, once a student is ready to complete their senior year), the day of the defense, when a student stands in front of an audience, should “Ideally” not overshadow the months and years leading up to the event. Conversely, if the day of defense is a performance assessment that consists only of a student compliantly marching through a showcase of past work – in the parlance of the Process/Form/Accuracy of Content criteria, the student is merely following a process, applying formats and rules, or listing definitions – this “Initial” kind of performance could silently be replaced by a single bulleted slide or a one-page table of contents without skipping a beat.

From Initial to Ideal (whether for exhibitions specifically or performance assessments in general) and from good to great is not a straightforward journey but a non-linear learning path that may circle back and then turn forward again over time. The secret, as with many things in competency-based learning, is to make the journey with as much intentionality, reflection, and metacognition as possible, with the aspiration to always go deeper.

** For further reading on Shelby County’s personalized learning plan for professional development, please read Eliot Levine’s CompetencyWorks 11/4/21 blog entry.


Photo of Adam Watson

Adam Watson has been an educator in Kentucky since 2005. He began as a high school English teacher, eventually becoming Teacher of the Year at South Oldham High School in 2009 and earning National Board Certification in 2013. Adam was the Digital Learning Coordinator for Shelby County Public Schools from 2014 to 2022. In 2019, Adam was awarded KySTE Outstanding Leader of the Year. In August 2022, he joined Ohio Valley Educational Cooperative (OVEC) as a Deeper Learning Design Specialist. He is a frequent professional development presenter and session leader at conferences and institutions in Kentucky and beyond and enjoys blogging from his site Edtech Elixirs.

Follow @watsonedtech