Parent Surveys Done Right
You need to know what your parents want for their kids' education and what they think about your schools. Can't you just send a survey home or whip something up in Survey Monkey? Who needs the professionals?
Recently AEM and 5-Star Students helped the Murrieta Valley Unified School District administer a survey to parents of children from its four middle schools. The experience showed several ways in which those trained in survey methodology can help educators get more helpful information at a reasonable cost.
Murrieta came to us with questions that each of the four schools had submitted for inclusion. This is a great practice that helps ensure the survey will provide information each school thinks it needs. However, in the initial design, each of the four schools had a completely different survey with little overlap in questions.
We helped Murrieta understand the need for some questions that parents from all four schools would answer. More often than not, survey results cannot tell you much unless you have a basis for comparison. Parent satisfaction with schools is a prototypical example of this phenomenon. Parents almost always rate the school their children attend higher than they rate the American public school system as a whole, even when their children attend under-performing schools. After all, what parent wants to believe their child goes to a bad school? Understanding differences in parent satisfaction thus becomes a question of distinguishing among schools that all have high approval ratings.
Which is exactly the position in which Murrieta found itself. With our prompting, Murrieta asked a question on overall parent satisfaction to parents at each school, and all four had high approval ratings. Anyone without specialized knowledge would look at these results, nod in approval, and go about their day. However, when I looked at the numbers, what struck me was that one school lagged behind the others. In three schools, 95 percent of parents said they were satisfied or very satisfied. In the fourth, "only" 85 percent gave the same answers.
After I brought Murrieta's attention to the difference, I stressed that all four schools were doing well and that the outlier school deserved credit more than blame. The first three schools hit home runs. The fourth school hit a triple, and you would win a ton of baseball games if you got a triple every time you stepped to the plate. The question is not whether the fourth school did a good job with parent engagement but whether lessons or practices from the other three can help the fourth go from good to great.
We also need to understand if any fundamental differences exist between the fourth school and the other three that might explain the results. Is the fourth school's student population significantly different from the other three? Do their kids have a longer commute to school? Are they in an older building? All of these could explain the survey results, and all of them are largely out of the fourth school's control.
One of the lessons to take from this example is that good data analysis often raises as many questions as it answers, but the new questions (hopefully) are more informed and start to nudge you down a path that will lead to your goals. In Murrieta's case, surveys can't tell us why the first three schools have higher parent satisfaction than the fourth, but they can and did bring Murrieta's attention to the difference. Without our help, Murrieta might not have used enough common questions to tease out this difference or known that a difference between 85 and 95 percent parent satisfaction could be meaningful. Now, they can ask themselves whether the first three schools can help the fourth improve the parent experience.
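For readers who want to see how an analyst might check whether a gap like 85 versus 95 percent is more than noise, here is a minimal sketch of a standard two-proportion z-test. The sample sizes below (300 respondents per school) are hypothetical, chosen only for illustration; the actual Murrieta response counts were not reported here.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test: is the gap between two rates
    larger than we'd expect from sampling noise alone?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts: 300 respondents per school,
# 95% satisfied (285/300) vs. 85% satisfied (255/300)
z, p = two_proportion_z(285, 300, 255, 300)
print(f"z = {z:.2f}, p = {p:.4g}")
```

With samples of this size, a ten-point gap comes out highly significant (z well above the conventional 1.96 cutoff), which is the statistical sense in which the difference "could be meaningful." With much smaller samples, the same percentages could easily be noise, which is one reason professional judgment matters in reading survey results.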
Murrieta's example shows how even the simplest survey is far more complicated than you'd think. If you've got a survey you think is important, it's at least worth asking whether you'd benefit from the involvement of professionals with strong training and experience.