I would guess most teachers saved the survey for the class most likely to yield positive results. I would have preferred to give it to my ninth graders, but I had three days in which to administer the survey and they were busy with presentations. By the time I gave it to my very vocal but truly likable tenth graders, they informed me they had already completed it seven or eight times.
So, what's the nature of the survey? Students were asked about the behavior and participation of their peers and whether the lessons were interesting. Did their teacher seem to care? Did the teacher explain things well? Did students learn a lot? Did the teacher assess student understanding? Did the teacher have high expectations? Were lessons student-centered? Did teachers give feedback and make corrections?
Many of the questions intentionally overlapped with one another. I would have preferred some different questions, though, such as:
What is the size of your class? Do you think it negatively impacts your teacher's ability to help individual students? Do you enjoy test prep? Do you enjoy taking tests? Do you think a single, high-stakes test score adequately measures the abilities you exhibited to your teacher throughout the year? Did the curriculum actively promote your awareness of current issues and encourage you to work toward a better future? And, in all honesty, did you study for more than an hour outside of class each week?
So much for my questions...
As with all data, I am sure the Tripod Survey results must be taken with a few grains of salt. If student behavior is poor, is it necessarily the fault of the teacher? If students do poorly on tests, is it necessarily the fault of the teacher?
And, if the lessons are not interesting, is it ultimately the fault of the teacher or is it the fault of an overemphasis on the importance of standardized testing?