5 Things I Learned While Scoring Micro-credentials
This post originally appeared at the Center for Collaborative Education on October 18, 2017.
How can we create learning experiences that respect adults as learners and support teacher-driven professional development? That was the question educators in Rhode Island set out to answer as part of the Assessment for Learning Project. At the heart of the project was a set of performance assessment micro-credentials designed by teachers. I had the task of reviewing many of the 100 submissions we received, and two things are now clear to me. First, writing a performance assessment micro-credential is a performance assessment in and of itself. Second, adults are not much different from younger students when it comes to assessment. I found being a reviewer fascinating! Here are some of the things I learned while scoring micro-credentials.
1. Essays are the security blanket of education. That's not to say the ability to write an essay isn't an important life skill, but in designing the micro-credentials, we aimed to provide choice in how a learner could demonstrate understanding. Participants could provide reflections in whatever medium they wanted—book, comic, drawing, video, photo journal—as long as the rigorous learning was captured. I was surprised by how many people chose the tried-and-true essay. The essay is an easy way to ensure all prompts have been addressed, or to check off specific asks with a two- or three-sentence response. Sometimes, however, this approach masked the evidence of deep learning and reflection. With many essays, I wasn't sure whether the participant really understood the topic or would make changes to their practice. It makes me wonder what we could change in the framing or wording of the micro-credentials to encourage a little risk taking and meaningful reflection, giving us a better indication of personal growth.
2. Rubrics have an impact! A strong rubric is an essential component of a micro-credential. As a reviewer, I sometimes struggled with the rigidity of the rubrics provided. The intention behind the list of required documents and response criteria made sense to me, but in practice they didn't make clear why the artifacts were needed or what multiple pathways were available for showing understanding. For both participant and reviewer, the rubric was simultaneously a cumbersome checklist and an unclear guide to submission requirements. Many participants expressed frustration with the lack of clarity in the rubrics, which led to general uncertainty throughout the completion process.
3. In a similar vein, writing micro-credential instructions is HARD. Much like the feedback on the rubrics, the instructions were both too prescriptive and too vague at the same time (the eternal conundrum!). Many participants found parts of the micro-credential redundant, simply copying their answer from the first section to the last. Why should a learner bother to answer the same question again and again? It's disrespectful of the participant's time and intelligence. Why did this happen? Probably because writing a good question is HARD! And sometimes you don't know how good a question is until you start getting answers. I understood the nuanced differences between the prompts in different sections because I'm neck-deep in performance assessment work. The questions, however, hadn't been broken down into the learner-friendly language that allows for rich responses exploring multiple facets of the topic.
4. Feedback works. I was always nervous when denying a micro-credential submission. Often it was because a key document or reflection was missing, but sometimes it was because the spirit of the micro-credential and the desired learning was missing. Rejecting the work of experienced classroom teachers is an unnerving task, and I didn't want to discourage anyone. What if I was denying the work of someone who had been in the classroom for one year, or fifteen? What if they decided micro-credentials were bogus because of the rejection? Nine out of ten participants resubmitted, and the resulting reflections were better. Yes, feedback works! It was nice to know that an anonymous reviewer miles away could help deepen an educator's reflection on their practice — true evidence that even adults are continuous learners who will rise to the opportunity to keep learning when it is done well.
5. Without going through a field test like the one we conducted in Rhode Island, there is no way of knowing the quality of your micro-credentials. Our micro-credentials were far from perfect! In fact, they were very not perfect. Our first task in phase 2 will be to revise them according to the abundant feedback we received from pilot teachers. The challenge with micro-credentials mirrors the challenge of designing student-driven performance assessments. How do you remove the scaffolds while still clearly articulating the learning you are looking for? And moreover, how "micro" does a micro-credential have to be? What is the grain size of questioning and activity that can accurately capture deeper understanding?
Teachers should have meaningful voice and choice in their professional development, and the people who create professional learning experiences need to start thinking about how to do that well. I say start by asking the teachers themselves — they'll let you know.
Christina is a Program Associate, Quality Performance Assessment, at the Center for Collaborative Education. She helps develop sustainable operational processes for the QPA team, handles the logistics of projects in different school districts, curates QPA resources, and strategizes best ways to share student and teacher stories about performance assessment tasks.