In last week’s blog post, I wrote about the process of getting input from students to set some assessment schedules for the end of the school year. Once the schedule was in place, we needed to come back to the students to get their feedback. One thing I was particularly interested in was whether the students felt it was valuable to review the 2018 MCAS test in small groups with the teacher, and whether that practice helped them do their best work on the 2019 State exam. I may not be able to do much about schedules and assessments mandated by the State and district, but I can ensure that I retain practices that are valuable to the students, if I know what they are!
To get that feedback, we asked a variety of questions, such as the following:
As is always the case (thank you, variability!), we got a wide range of responses across the survey results. With that said, one area that was consistently a three or four out of four was that it was helpful to work with a teacher on an actual released MCAS test. That was very useful feedback for me: now that I have data that tells me this practice worked for the students, I have a reason to ensure that I engage in that practice in the future, rather than just guessing.
Even within the consistency of the responses there were still differences, ranging from “I didn’t think it was a complete waste of time and the most awful thing in the world, but it was a bit stressful to be looking right at what we were facing, so it intimidated me a little bit. But other than that, it was nice to see what we should prepare for in MCAS.” to “It was effective because it showed us what to expect on the test and how to solve what was on the test.”
It was rather upsetting to see how many students commented on the fact that “it was effectice but there was maby a few things that were not effecive like some of the quistions were not similler.” At this point, we only have two years of released problems from MCAS to pull from, so hearing that the practice problems were markedly different from what the students saw on the test, at least in terms of formatting, was depressing. It’s also not something that we, as teachers, can change because we have a pool of released items that is simply too small at this time.
We asked some other questions and ended with an open-ended response option:
Again, I was struck by the sheer variability, with answers all over the place:
- The preparation was perfect change nothing.
- Nothing could have been better it was great!
- I think you should keep it the same it worked very well for me
- maybe you can give the students one more week to study important topics or topics the students this year got wrong
- I don’t think it could be much better, but I found it to be a tiny bit chaotic near the last few days or practicing. I lost track of what work I had done, and hadn’t done.
- This preparation could have more material to practice with. (More worksheets, self-assessments, etc.)
- maybe just start a little earlier so it is not so crammed.
- This could be done better for next year by going over stuff that we were stuck on and give worksheets on it.
- We could have 3 days to prepare for the YE assessment instead of 5 days plus homework, because I finished all of the links and extensions by the fourth day and then I was just working on work that I didn’t need for my comprehension.
- I think that the links on google classroom could have been organised better into reference sheets and work.
- I think that we cold begin the preparation for the year end a couple days later, because eventually I did all of the work and felt confident, but we still were reviewing for 2 more days, so I got bored of doing the same things over and over again even though I was already a pro on them.
Sadly, I didn’t have a chance to follow up with students like I usually do, due to scheduling and a change of focus to some in-class projects. Not being able to follow up with kids made me realize how much I enjoy the looping aspect, how much more powerful it is for me to be able to ask follow-up questions. For example, when a kid writes “I think that maybe we could dig deeper into the subject because I feel like we just scratched the surface,” I want to know more about why they thought we would be “digging deeper” when the whole exercise was designed to be a cumulative review! How did my intentions as a teacher end up so far removed from the experience of this particular student in my class?
Or why did a student say “Less time practicing to use the tools in MCAS” when we only used the tools in context for each problem/content? Again, how did my intention end up so distant from the experience? The kids did put their names on these responses, so I am not guessing who they are, but I am guessing about why they said what they said, especially when it seems so contrary to what we designed. The biggest takeaway for me in this experience is the reminder, yet again, of the power and value of asking for feedback, since it’s clear that my best intentions do not create a consistent experience that mirrors my plan.