Over 740 PK-12 schools have taken one of our surveys, many for multiple years. That's 108,000+ respondents to date.
Here are four sound reasons why your own home-grown survey will likely not give you the wisdom you need to improve your school strategically.
Problem #1: What’s a
good score? (AKA, No normed data.)
One of the most important findings from more than a decade of Christian school surveys is how very good these schools are – how very satisfying they are.
PK-12 Christian
schools are among the most satisfying organizations in the country, really in
the world.
What that means to you
is that survey scores that seem good to you are often not that great in
comparison to other Christian PK-12 schools.
It’s the worst of all
possible worlds: The world of false
positives, the world that believes everything is OK, when really it is
not.
Here’s a concrete
example of what I mean:
Ask yourself, with 10
being high, is an overall satisfaction average of 7.75 a good score or a bad
score? How about a solid 3.9 average score on Principal leadership, where
5 is high? Is that good or bad?
Answer: Both of these scores sit at the 16th percentile in our normed data. 84% of our schools scored higher than that on both questions.
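If you did have access to a normed set of peer-school averages, the percentile arithmetic itself is simple. Here is a minimal Python sketch – with entirely made-up peer averages, not our actual norms – of how a raw 7.75 turns into a percentile rank:

```python
import numpy as np

# Hypothetical normed data: overall-satisfaction averages (10-point scale)
# from peer PK-12 schools. These numbers are invented for illustration only.
peer_school_averages = np.array([7.7, 7.9, 8.0, 8.1, 8.2, 8.3, 8.4, 8.5, 8.6, 8.8])

your_school_average = 7.75

# Percentile rank = share of peer schools scoring at or below your average.
percentile = (peer_school_averages <= your_school_average).mean() * 100
print(f"A 7.75 average lands at roughly the {percentile:.0f}th percentile of peers.")
```

The hard part, of course, is not the math. It's having 740+ schools' worth of data to compare against.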
Problem #2: “I don’t
care so much … “ (AKA, Effectiveness scores should match relative importance.)
On that same 1-5
effectiveness scale, with 5 being high, is an average score of 4.23 out of 5 a
good score for (1) Teachers are Christian role models, or (2) Use of technology
in instruction?
Here's the answer. A 4.23 on use of technology in instruction is a great score for Christian PK-12 schools – 80th percentile. Only 20% of schools will score better on this item.
However, an average
effectiveness of 4.23 out of 5 for teachers as Christian role models is a terrible
score, just at the 20th percentile. About 80% of Christian schools
will score better on this program element.
On a home-grown survey, we interpret these two scores exactly the same. We assume the items are equally important to parents, and they clearly are not.
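To see the difference concretely, here is an illustrative sketch. The importance numbers are invented, and this is not GraceWorks' actual scoring method; the point is simply what pairing each item's effectiveness average with its importance average lets you see:

```python
# Hypothetical importance/effectiveness pairs on a 1-5 scale; the importance
# values are made up to show why the same 4.23 can mean very different things.
items = {
    "Teachers are Christian role models": {"importance": 4.9, "effectiveness": 4.23},
    "Use of technology in instruction":   {"importance": 3.8, "effectiveness": 4.23},
}

for name, scores in items.items():
    gap = scores["importance"] - scores["effectiveness"]
    verdict = "falls short of what parents expect" if gap > 0.3 else "meets expectations"
    print(f"{name}: effectiveness {scores['effectiveness']:.2f}, "
          f"importance {scores['importance']:.2f}, gap {gap:+.2f} -> {verdict}")
```

The 0.3 threshold is arbitrary; the same effectiveness score simply reads differently once importance is in the picture.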
Gene Frost, in his take on Good to Great for Christian Schools, makes a big deal of this, and rightly so. That's why he recommends our survey in his book: we ask both the importance and the effectiveness of each program element.
In other words, it is virtually impossible, on a home-grown survey, to know whether the scores we receive are good or bad. Worse, we typically interpret our scores as good when in fact they are just average or worse.
I call this the Pollyanna
Effect – who wants to change anything when we are doing just
fine?
The classic instance
of the Pollyanna effect was a school in the Northwest, where the accreditation
team thought the teachers were outstanding. And said so, in their final
report.
The Administrator did
not believe it, and our survey, with its normed data, confirmed her concerns.
Imagine how hard change would have been without GraceWorks’ survey!
That’s why we do
surveys for accreditations – it’s hard to argue with the comparison data of
106,000+ Christian school constituents.
Problem #3: “It
Matters to Me - or Not.” (AKA Some issues impact satisfaction more than
others.)
Let’s pretend we’re on
Jeopardy, and I’ll give you the answers first: Much worse, Somewhat
worse, About the same, Somewhat better, and Much better.
Ok, I’ll even give you
the questions:
(1) How do you compare the Christian character of students at our school to students in public schools in our area?
(2) How do you compare
the academic quality of our school to public schools in our area?
So I’ve given you the
questions with the same answers for both.
Now comes the crucial
question. Which of the answers are good and which are bad for each
question?
We can all agree that
the first three answers - Much worse, Somewhat worse, and About the same - will
hurt us in overall satisfaction, and by the numbers, they do.
Certainly “Much
better” must help us with overall satisfaction, and by the numbers, it does.
So that leaves
“Somewhat better.” Are respondents who feel Christian character
and Academic quality are somewhat better than public schools
less satisfied with, and thus less willing to refer to, your school?
From over 700 Christian schools, the answer is both yes and no.
Yes - parents are much less satisfied if Christian character is only somewhat better than at public schools.
No - parents are typically no less satisfied if academic quality is only somewhat better than at public schools.
If you think
that’s a big deal, you are right.
Translation: When it comes to Christian character at PK-12 schools, "Somewhat better" is just not good enough.
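If you have the raw respondent data, checking this for your own school is a straightforward group-and-compare exercise. Here is a small pandas sketch with invented responses; the column names and satisfaction values are placeholders, not our survey's actual fields:

```python
import pandas as pd

# Toy respondent data; values are invented purely to show the shape of the analysis.
responses = pd.DataFrame({
    "christian_character_vs_public": ["Much better", "Somewhat better", "Much better",
                                      "Somewhat better", "About the same", "Much better"],
    "academic_quality_vs_public":    ["Somewhat better", "Much better", "Much better",
                                      "Somewhat better", "Somewhat better", "About the same"],
    "overall_satisfaction":          [9.4, 7.8, 9.1, 8.0, 6.9, 9.0],
})

# Mean overall satisfaction by answer to each comparison question.
for question in ["christian_character_vs_public", "academic_quality_vs_public"]:
    print(responses.groupby(question)["overall_satisfaction"].mean().round(2), "\n")
```

The telling row is "Somewhat better" for each question: for Christian character it tends to sit well below "Much better"; for academic quality it usually does not.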
Christian character is
job #1. In fact, when
we go to the trouble of regressing the whole thing, the Christian character
question is more predictive of overall satisfaction than any other single
question on our survey.
And if you don’t
believe that for your school, you can
find out for as little as $995 and 7.5 hours of staff time.
You can certainly ask importance and effectiveness on your own surveys, and you should, but you will never be able to determine – outside of factor analysis and regression – how much any particular program aspect impacts overall satisfaction and willingness to refer.
(It took me three
days to figure out a way to do that automatically, and that was after a year in
the most research-intensive Ph.D. program in education in the state of
Colorado.)
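For the statistically curious, the core of what I'm describing is a regression of overall satisfaction on the individual item scores; the item with the largest weight is the one driving satisfaction the most. Below is a toy sketch in Python on synthetic data. The weights are baked into the example as assumptions, not taken from our survey results:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic survey data: each row is a respondent, each column a 1-5 item score.
rng = np.random.default_rng(0)
n = 200
christian_character = rng.uniform(3, 5, n)
academics = rng.uniform(3, 5, n)
technology = rng.uniform(3, 5, n)

# For the sketch, assume satisfaction leans hardest on Christian character.
overall_satisfaction = (
    0.9 * christian_character + 0.4 * academics + 0.1 * technology
    + rng.normal(0, 0.3, n)
)

X = np.column_stack([christian_character, academics, technology])
model = LinearRegression().fit(X, overall_satisfaction)

for name, coef in zip(["Christian character", "Academics", "Technology"], model.coef_):
    print(f"{name}: weight {coef:.2f} on overall satisfaction")
```

In real survey data the items are correlated with one another, which is part of why the factor analysis step matters before you trust any single weight.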
Which brings us to the
final problem.
Problem #4: Now what
do we do? (AKA How do we prioritize what to “fix” based on the survey?)
Here’s the real beauty
of your own home-grown survey. Because of all the problems above, you
can interpret it any way you want!
You can dedicate time and money to various pet projects and
someone's gut feeling about what parents want. An ambiguous survey can
back you up!
These interpretation dynamics get particularly interesting when the interpreting is done as a group, especially with boards. (Just thinking about that process makes my head hurt.)
There are only three limits to this do-it-yourself approach: Time, Money, and Reality.
For my money, I’d
rather put my time and energy into projects and problem fixes that for sure,
hands-down, no question, will result in your overall program getting
better.
GraceWorks Survey – the Parent Satisfaction and Referral Survey – solves all these problems (and many more).
We norm everything – everything! We do ask how effective and how important each of your program elements is.
We do tell you
which strengths and weaknesses are helping
you and hurting you the most. We do make it
clear what you need to work on, in priority order.
Plus: a two-page summary report for your board. Splits by division if you need them. Custom questions. Satisfaction and willingness to refer by demographic.
And we help you present the results to teachers, parents, and boards. I do that personally, and I've been to this rodeo over 700 times in the last 12 years!
In addition to that,
our survey provides all of the following:
(1) Actual
leads of potential families, with a contact.
(2) Volunteers willing
to help with marketing and fundraising tasks.
(3) Enrollment status of families that are not returning or have not enrolled all of their children – where else they are going and why.
(4) A research-based answer to "Will they pay" and "Can they pay" - by income level - for tuition increases.
(5) Barna-like alumni
outcomes data.
(6) Promoters - dozens willing to spread the word about your school (with a month-by-month calendar of how to work with them).
(7) Detailed comments on why your constituents love your school (or not so much), broken out by demographics.
(Such as, what your 3rd grade
parents think, what people making over $150,000 a year think, what your
Millennial parents think.)
(8) Parent
testimonials - often ready to go with minimal editing. All you need to
do is ask permission to use them.
The survey will pay for itself many times over: through the students you keep, because you know what the real problems are, and through the new students you gain from actual leads now and more leads later as you work with your newly found Promoters.