'We (don't really) Want to Know What You Think'
The School Board's 2004 Parent/Guardian Survey ...
Commentary by PETER G. ZIMMER*
HALIFAX (5 April 2005) -- I'VE BEEN URGED by readers of an earlier draft to give a wider circulation to my critique of the recently administered "Planning for Improvement" Parents'/Guardians' Survey issued by the Halifax Regional School Board (HRSB).
My criticism is informed by (and to some degree inflamed by) the observations and opinions I heard again and again during my recent run for a seat on the school board, as I talked to parents, teachers, retired teachers, community leaders, businessmen, taxpayers, non-teaching employees of the School Board and the NS Department of Education, educators at universities, and the other candidates:
o I heard praise of many front-line educators who are dealing with our children day-to-day, doing the best they can under sometimes difficult conditions (and the "best they can" is sometimes an outstanding effort by any measure);
o I heard serious dissatisfaction with what our public system is offering to many of our children and to our communities and our city;
o I heard dismay at the trends and changes occurring at the local schools and anger at some of the system-wide changes seen in recent years;
o I heard real concern for the future of our public school system and how it will serve our children, our city and our province in years to come, and concern about the appropriateness and effectiveness of the whole "Planning for Improvement" exercise;
o I heard the serious mistrust of individuals and attitudes in the upper levels of our public education system - in the administration at HRSB, in Nova Scotia's Government, and in the administration of the NS Department of Education. Again, I did hear praise for individuals working in all these organizations, but usually offered as exceptions to the general state of things.
The HRSB Planning for Improvement Parent/Guardian survey can capture only upbeat impressions bearing on the first of those points. This expensive survey exercise, headlined "WE WANT TO KNOW WHAT YOU THINK", requires parents and guardians to stand mute on the other HRSB issues that have concerned us for some years now.
I am sending this directly to all the new and returning members of the Halifax Regional School Board and will circulate it widely to as many others concerned with the state and future of our public education system.
I first wrote the following on 13 November 2004. I've edited and added to this since.
'We (don't really) Want to Know What You Think'
I HAVE JUST FILLED OUT the 2004 "We Want to Know What You Think" HRSB Parent/Guardian Survey on-line. It is part of the multi-year "Planning for Improvements" exercise underway now in HRSB.
My immediate reaction was disappointment and anger (but not surprise) at what was asked and how it was asked.
The survey's 39 multiple-choice questions gave me no way to convey anything about what I think are the important issues facing HRSB's elected members and senior administrators - if they really want to plan for needed improvement. The survey asked no open-ended questions, made no provision for parents' comments, and allowed for no ambiguity.
Its omission of any mention of the French Immersion curriculum or French Second Language instruction or physical education or ... are non-trivial holes in this survey, but they are only minor design failures among many more fundamental defects in this profoundly flawed and very expensive way to ask questions that are for the most part useless, unclear and/or wrong-headed.
Almost all of the 39 multiple-choice items were framed as "motherhood" touchy-feely "I like..." statements: the "right" answer was clearly "agree" or "agree strongly" .... It will surprise no-one at all that these questions will produce an overall response skewed to a "We're all right, Jack" set of graphs that purport to represent how HRSB's parents feel and what they think of our school system. As with last year's survey (which acknowledged that "certain useful statistical tests cannot be conducted" because the responses were excessively positive, and then ignored what flowed from that observation), we'll discover next spring that we've spent a lot for very little of real value.
Some Real Problems that I think about...
My school- and HRSB-related dissatisfactions are not mainly about my thoughts or feelings this year concerning my child, her school accomplishments, or her interactions with her teachers or her principal.
Rather, I am dissatisfied with HRSB senior management and how they've dealt with substantive, objective issues that affect, have affected, and will affect the education and academic accomplishments of my daughter, her classmates, and thousands of children for years to come, as well as the reputation of HRSB's "product", i.e., our children and what they've learned in our schools. These are what I think the HRSB needs to understand. This dissatisfaction is why I ran for the school board, why I will continue to contribute to the public discussion on public education, and why I wrote this critique.
To mention a few substantial dissatisfactions:
o I am unhappy with an administration that, as of November 15th and halfway through the first semester and the whole course, has not yet provided textbooks for my daughter's Grade 12 biology class: lack of sufficient and up-to-date textbooks is a persistent and system-wide problem;
o I am unhappy with an administration that in 2003 stripped 350 high school students' curriculum of a large fraction of previously available course options, and severely reduced their scheduling options at the affected schools last year and this year, and told us this was done to "improve" French language education throughout HRSB and to "offer teenagers more choices": this was a decision that was never "researched" in any meaningful way and certainly was not "data-driven";
o I am unhappy with an administration that places my child's principal and teachers and guidance staff in the position of either lying to me, or dodging my questions, or sitting mute when I make a statement of fact, or, if they do talk frankly, risking their jobs should I betray their trust and put their words "on the record", should they disagree with Administration-mandated positions on semestering or specifications for new schools or instrumental music or French Second language instruction or .... I want every parent (or taxpayer or journalist or school board member) to be allowed to hear first hand these front-line educators' informed professional concerns and conclusions, un-muzzled by a self-serving criticism-averse management's command-and-control "check with head office first, or be punished" attitude;
o I am unhappy with an administration that has done a poor job planning and negotiating for the new peninsula high school ... alienating HRM Council and staff so that they are reluctant to help build what the province won't; planning an undersized new "super" [sic] school lacking an auditorium and a needed second gym; designing for fewer students than HRM city planners and Statistics Canada projections would lead us to expect in a very few years, ...
o I am unhappy with an administration that has caved in to a provincially-promoted educational fad of high school semestering without critically examining the research and literature, which on balance show that semester / block scheduling is associated with poorer academic achievement by students, reduced coverage of the course curricula, and subsequent grade inflation by teachers to disguise the resulting decline in educational results. In years to come, we can look forward to college admission officers applying a discount to our kids' grade transcripts versus transcripts from schools still using the full-year course format (this has happened elsewhere in Canada and the States).
I am ... don't get me started... I might rave and rant ...
About the Survey
The idea that anyone would find the responses to and analysis of this latest questionnaire in any way useful for planning needed changes in our city's school system for the next few years is downright scary. I've read the report on last year's survey, and considered this year's survey in light of what HRSB has published: I am not filled with confidence in the creators, analysts and users of the effort.
I ask: What use, if any, could possibly be made of the survey's "analysis"? Is it likely that there will be any surprises or novel insights arising from the data or the analysis of it? Will our schools and our children's education actually be improved by any changes resulting from insights obtainable only from the analysis of this project's data? What value are we getting for our money? Where does the money come from?
If we didn't spend the money on this, but invested the amounts in immediately evident needs, would our children's educational outcomes be better this year, next year, five years from now?
That fraction of parents who bother to respond (it was about 50 per cent last year) is far from a random sample of the parent population, so we know HRSB can't project anything much about the opinions of the whole population from their self-selected unscientific sample, even if the survey had asked the right questions. In all likelihood very satisfied parents will see a rosier picture and readily respond, the usual squeaky wheels will squeak (though the survey doesn't provide much welcome for that), and the usually-disenfranchised will act as expected and not bother.
A professionally designed, properly tested and professionally conducted survey of a scientific sample of fewer than a thousand parents would have been far more revealing, reliable, repeatable and useful in making generalizations and predictions.
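The claim that a sample of fewer than a thousand can beat a hundred thousand self-selected forms is standard sampling arithmetic, not rhetoric. As an illustrative sketch (assuming simple random sampling and the worst-case 50/50 split of opinion; the function name is mine, not from any HRSB document):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    at the worst-case proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# A properly drawn random sample of fewer than a thousand parents:
print(f"n=1000: +/- {margin_of_error(1000):.1%}")  # about 3.1 points
print(f"n=600:  +/- {margin_of_error(600):.1%}")   # about 4.0 points
```

By contrast, no margin of error can be computed at all for a self-selected return, however large, because the non-respondents may differ systematically from the respondents.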
Flawed time series
HRSB has no way of knowing whether the parent responding this year had filled out the form last year. Comparing two highly skewed, unreliable aggregate responses to this sort of questionnaire provides near-zero utility or statistical validity. Improvements demonstrated by the first two years' data? A base-line for future comparisons? A fiction told in statistical guise, at best.
Flawed "demographics" of respondents: HRSB has no way of correlating parents' responses with significant demographic variables, with any objective measure of how their child is doing, or with their family educational objectives and expectations, or with their socio-economic status, or education, or race or ethnicity, or local community .... which are factors that most certainly do have a bearing, for good or ill, on how a parent perceives the school and their child together, what they expect from the school, how the teachers and staff perceive them and their child, and how their child is doing.
The survey's questions are relentlessly present-tense and "me-and-mine"-centered: it assumes respondents bring no historical context, no parental memory of years past, no community-linked context, and no parental planning for a future five or ten years away. It gave us no way to offer our assessments of HRSB's changes over the past few years, no way to speak to conditions observed at our children's schools and in the system as a whole, no way to share the knowledge, hopes and expectations we bring to our children's educational community from the larger world outside the walls of HRSB.
Not a single one of the thirty-nine questions asks us to report what we think or feel about:
o comparisons between what we know "now" and what we know of / remember from last year or earlier, or
o comparisons between what we know about our child and what we know of our child's classmates' experiences and expectations, or
o comparisons between what we know about our child in her class in her school this year and what we know about other children in other classes and schools elsewhere and/or elsewhen.
The survey assumes responding parents look neither to the past nor the future nor to the side. Should parents entertain these sorts of thoughts, HRSB doesn't care what they think. Such thoughts won't matter when it comes to "Planning for Improvements".
But the wider comparisons DO matter. The parents I talk with know this. Teachers and principals know it. Many of the new and returning school board members know it. University admission officers know it.
The questions used for this year's survey are very similar to those of last year's survey. The answers to the questions asked of parents, students and teachers last year were skewed overwhelmingly to the positive. As will be the responses from this year's edition. As the author(s) of the Report on the 2003 survey noted on page 3 in the "Limitations... ", "Items overall were rated very highly, making it so that the response distributions did not follow the famous 'bell-shaped' curve." "... certain useful statistical tests cannot be conducted."
Professional opinion researchers developing and testing items for a survey would routinely discard such questions: there is little point in asking them.
Simply put, the Survey's questions were so poorly worded, so poorly chosen, that any slightly sophisticated previewer would have predicted the fatally skewed responses, and no serious statistical analyst would agree to draw any conclusions from the responses to the questions HRSB presented. Suggestions by the authors of last year's report that a little tweaking of wording of the questions, or the instructions given respondents, or of who administered the questionnaires would address these problems are not correct: the fundamental flaw is asking more than a hundred thousand individuals to answer forty or so poorly constructed survey questions that were not properly pre-tested on samples of respondents.
If we ask only questions for which we can predict the answers, and if we don't ask any useful questions, we cannot make any useful conclusions ... no matter how many hundreds of thousands of "answers" we collect.
What is the bill?
The whole exercise is a bad job, for which HRSB paid a lot of money and spent a lot of teachers' and other staff members' time.
The mechanical / clerical task alone was massive:
o printing and distributing, in round numbers, some 120,000 multi-page perforated forms: one for each student (56,400 or so), one for each student's parent/guardian (56,400, in envelopes with a return envelope), and one for each teacher (2,226+), plus some thousands of extras;
o creating and administering a website version;
o collecting, opening, scanning, filing the hundred thousand or so forms expected back;
o separating (filing and/or shredding?) any non-machine-readable forms and attachments;
What is the bill for this part of the job this year, and last year? I'd be surprised if this cost as little as a dollar fifty per form sent out: $200,000, give or take (one must include the teachers' and school administrators' time at cost). Multiply that by five for the 2003 - 2007 editions, and we're up to a minimum of a million dollars, just to handle the paper.
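The back-of-envelope arithmetic above can be checked in a few lines, using only the round figures already quoted; the per-form cost is my own low-end assumption, not an HRSB number:

```python
forms_sent = 120_000      # round-number estimate of forms printed per year
cost_per_form = 1.50      # assumed low-end handling cost per form, dollars

annual = forms_sent * cost_per_form
five_year = annual * 5    # the 2003 - 2007 editions

print(f"one edition, paper handling alone: ${annual:,.0f}")   # $180,000
print(f"five editions:                     ${five_year:,.0f}")  # $900,000
```

Even this deliberately low figure, before a cent of staff time is costed in, lands near the million-dollar mark over five editions.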
Then let us consider the costs of creating the surveys and analysing the data and reporting the results (writing and creating the graphics for the Board-level report, and the individual school reports, and the administrators' PowerPoint presentations for various audiences, ... and then performing the dog-and-pony show to get the good word out). We must include both the contracted-out costs and the real cost of HRSB staff time spent on this endeavor, last year, this year, next year.
I'd like to see a statement of time spent and the hourly costs (salary plus overhead costs) for head office administrators and in-school administrative staff and the classroom teachers' time used to administer the student version (which, I'd note, is time lost from actually teaching) plus the costs of any contracted services.
Superintendent Olsen, __?_ hours at $70/hr + overhead costs @$___?__/hr = $____?____.
+ Mr. Buck, __?_ hours at .....
+ about 1 hour per classroom teacher: __?__ hours at ..... = ____?____.
_____________________________________________________________ HRSB staff sub-total: __?__ hours, $___?___ (salary and overheads)
paid to: _________________
for: ______________, $____________
paid to: _________ ...
Parents/Guardians Survey 2004 Grand total $_______ ? _____,
But these are not the only costs that matter: we'd also like to know the opportunity costs, the value of important things not done or postponed because staff time was sunk into this survey and other "Planning for Improvement" activities. What would we save by not doing this, and by reducing administration staffing?
Who is in charge here?
I would ask HRSB to publish the relevant qualifications of those responsible for the creation and analysis and reporting of this survey and last year's survey, that is, the non-clerical inputs.
Presuming the whole thing was done properly, we could expect to see
o the research design,
o the survey design brief / RFP prepared by HRSB and the names and survey design qualifications and experience of those managers at HRSB who prepared that design brief;
Presuming the whole thing was not created entirely in-house, HRSB should make public:
o the process used to identify and select suitable outside consultants / firms;
o the names and qualifications of all consultants / firms invited to bid on the contract(s);
o the names and qualifications of all consultants / firms who actually bid on the contract;
o the professional qualifications of the people / firm(s) that won the competition(s) and created and/or administered and/or analyzed and/or reported on this questionnaire, in whole or part, and what was paid for these services.
If it was an in-house effort
o survey design qualifications of those at HRSB creating, overseeing and approving the final survey design;
And we want to see
o the professional (survey-related) qualifications of those analyzing the data;
o names and qualifications of the writers and editors who created (2003 survey) / will create (2004) the written public report, and their supervisors within HRSB who signed off / will sign off on it prior to publication.
Checking the Work
Based on what I have found reviewing a number of other HRSB reports, I ask for public release of the data sets and the analytic tools (spreadsheets, etc.) for the 2003 and 2004 surveys, making the raw data and the methodology available for detailed outside review and possible re-analysis.
We may not be able to make a silk purse from this, but we may get a bit of salvaged ear-leather from the sunk cost, and a clearer knowledge of what not to do next year.
- - - - - - - - - - - - -
If the Spring 2004 report is any indication, elected HRSB members and the general (taxpaying / parental) public will be presented with a document next spring that would not earn a pass in a second-year university quantitative sociology class, much less merit consideration for any peer-reviewed professional journal: the graphs and tables will confuse what little real information might hide in the data, and the supporting written analysis (offering insights about the meaning of the graphs and tables) will be close to non-existent.
See for yourself: the whole Board-level Results for Students, Teachers, and Parent/Guardians, Spring 2004 report is available at www.hrsb.ns.ca/downloads/pdf/improve/survey/board-level-survey-results.pdf.
E-mail me if you'd like a copy of my forthcoming annotated critique of this document.
Planning for Improvement? Really?
I am concerned that this survey is only one small part of the larger "Planning for Improvements" exercise that might result in HRSB increasing the numeric values of a suite of similarly irrelevant, ill-conceived "measurements" in pursuit of bragging rights as "the most improved school system", in Superintendent Olsen's words. As far as I can determine, she has entered HRSB in a "competition", at great ongoing cost, that has only one competitor: HRSB itself, alone.
"Follow the money", the cynic suggested. Who in the long run benefits from carrying out the Parent/Guardian Survey or any of the other parts of the "Planning for Improvements" exercise?
Will it benefit the careers of:
o my daughter or her HRSB classmates in Grade 11?
o children now in Grade 5?
o our front-line teachers trying to find time for both "P-f-I"-mandated tasks and real teaching?
o present and future "educational consultants"? or perhaps
o the Tory politicians who currently run the Government of Nova Scotia, who have a not-often-mentioned financial / political interest in running down the delivered quality of public education? [Each child pulled by frustrated parents from an HRSB classroom and sent to a private school or home-schooled puts another six grand a year on the "right" (tax-cutting) side of the government's ledgers ... not to mention contributing to a reduction in the number of public classrooms they need to build... ]
Could the money and time now used for this Parent/Guardian Survey be better used for more instruction, for textbooks, teachers' aides, library materials, photocopy paper, ... all proven ways to improve / restore our children's educational quality, and all in increasingly short supply today? I certainly think so.
I remain, yours truly, Not a Happy HRSB Camper, still working for real changes and an improved public education system for Halifax.
-- -- -- -- --
Peter G. Zimmer, BS MFA
Product Design & Development
6133 Willow Street, Halifax, NS B3K 1M1 Canada
Comments to : email@example.com
Copyright © 2005 New Media Publications. The views expressed herein are the writer's own and do not necessarily reflect those of shunpiking magazine or New Media Publications. You may not alter or remove any trademark, copyright or other notice from copies of the content. Copyright of written and photographic and art work remains with the creators.