
Co-Developing a Student Survey with Washington Stakeholders



We know that measurement tools are better when they are designed together with the people who will ultimately be using them. So when we began thinking about how to design a survey for middle school and high school students across Washington, my colleague Jilliam and I realized we couldn’t do it alone. Although we have some expertise in instrument development, survey methods, and psychometrics, we’re not from Washington, we’re definitely not in middle school anymore, and we don’t have expertise in what it’s like to be a student in the Mastery-Based Learning Collaborative (MBLC). To properly design a survey for those students, it was imperative for us to talk to some of them – and their teachers – to better understand their experiences. In this post, I’ll describe the process we went through to co-develop our upcoming statewide student survey.

Project Background

Aurora has been supporting Washington’s State Board of Education (SBE) as an external evaluator of the MBLC, an initiative designed to support schools in implementing mastery-based and culturally responsive educational practices. As evaluators, we had been studying implementation by two cohorts of schools, learning what was actually happening on the ground and how it was going. (Last year’s report can be found here.) Now, a few years in, it was finally time to start looking at preliminary student outcomes! Schools in the first cohort had been working on these practices for a few years, so we were hoping that by now we would start to see some changes in students’ experiences at school. We had been talking to small groups of students in focus groups since the beginning of the study, but to measure these experiences more comprehensively across the MBLC network, we decided to develop a student survey.


The student survey would cover multiple topics: students’ experiences of mastery-based learning practices and culturally responsive educational practices, students’ engagement in school, and students’ habits of success, as defined by Washington’s Profile of a Graduate. We also wanted to collect some student demographic information so that we could see if there were differences in students’ experiences at MBLC schools based on their backgrounds or identities.

Our Approach to Survey Co-Design

There are lots of ways to meaningfully implement co-design practices and incorporate student voice. I think of it as a spectrum. On one end is the fullest version of co-design: partnering with end users from the beginning of the project and at every step of the way, with the users themselves really leading the work. On the other end of the spectrum is simple feedback: presenting a mostly-finished product and asking people for input. Different constraints – time, budget, ease of access to end users, and input from other stakeholders – impact where on that spectrum you land. For this project, we were definitely closer to the “feedback” end of the spectrum. 

[Image: Excerpt of Washington Student Survey – an infographic describing meaningful assessment]

We started by articulating clearly what concepts we were trying to measure in the survey. Then, in an effort not to reinvent the wheel, we researched existing instruments that measured those concepts with adolescent audiences. We found a few instruments that were a good fit for our purposes, but in some areas we needed to write our own questions. Once we had drafted the content, we realized our instrument was way too long. We started to get input on that early draft, asking people what felt core, what felt redundant, and what was simply confusing. We got input from a lot of different folks: our own team, the State Board of Education, and other educators who work with adolescents. Eventually, we had a draft of the instrument that we thought was ready to start sharing with Washington students and teachers.

Learning from Teachers

We invited teachers to join us on Zoom for one-on-one conversations where we asked some general questions about vocabulary: how they talk about mastery-based learning and culturally responsive educational principles at their school, how they hear students talking about those ideas, and what words they thought would resonate or not. Then we showed them our draft and asked them to think aloud while they clicked through it so that we could hear their honest reactions and feedback.

It was not great. The survey was boring. It was too long. It was ugly on the page, and it still felt somewhat repetitive. 

[Image: Excerpt of Washington Student Survey]

This was amazing feedback! It was exactly what we needed. One of the teachers we talked to shared her expertise in working with students with learning differences. She made a lot of really helpful, concrete suggestions for how we could reformat questions, such as using bold text to call out key words for students who struggle with reading, or breaking a paragraph of instructions into a series of short questions and answers. Another teacher made helpful suggestions for how we could reorganize the flow of the survey to make it more engaging and logical for students. After each call, we went back and made edits, slowly developing a better version of the instrument.

Learning from Students

After we had this better draft, it was time to talk to students. To do so, we relied on the SBE’s student advisory board, a group of students who attend MBLC schools and provide a student perspective on the work of the Mastery-Based Learning Collaborative. The students represent a wide range of ages and grade levels, schools, and viewpoints. We attended one of their regular meetings to get their opinions on the survey. Specifically, we wanted input on the aesthetics and on ways to keep students engaged. We even asked them to drop in links to GIFs that they thought would be relevant in each section of the survey, just to keep it fun.

[Image: Excerpt of Washington Student Survey – a question asking students what their mood is, with pictures of dogs]

Most importantly, we asked them about their understanding of a couple of the “buzzwords” and core topics we were using in the survey. For example, we were using the language of “competencies” throughout, but most of these students didn’t know that word. Instead, they were familiar with the language of “learning outcomes” or “learning goals.” To keep the survey relevant and easy for students to understand, we updated this kind of language throughout.

Overall, talking to the students confirmed that many of the edits we had been making were headed in the right direction. We took this feedback and went back to make additional tweaks to the survey, finally landing at an instrument that we are pretty proud of. 

Instrument Co-Development Improves Research

As researchers, we find it easy to overthink things. We love to develop nuanced, multifaceted tools that perfectly capture a concept and all of its component parts. But in my opinion, an instrument that is engaging and relatable to students will usually get you better data than an incredibly precise tool that is so long, convoluted, or boring that students don’t take it seriously. These are tradeoffs that have to be carefully considered – of course, there are times when precision really matters – but it’s important to keep the end goals in mind. For us, that end goal is to authentically hear from as many students as possible about their experiences in MBLC schools and how those experiences impact their learning.

Collecting data from students is really important. We’re hopeful that this student survey will let us hear from a huge group of students across Washington about the impact that the MBLC initiative has had on their education. We look forward to sharing the results of this survey more broadly so we can all learn from it – and to continuing to refine the instrument itself, together with our partners in Washington.


Kelly Organ joined the Aurora Institute as Research Manager in August 2024. She previously served as Director of Research Partnerships at a national nonprofit focused on youth well-being and social-emotional learning. Her career spans roles in service-learning, social entrepreneurship, action civics, and youth leadership programs, with a focus on developing culturally responsive and adaptable practices that support diverse learners.