“Let me tell you about what I think she’s really learning…”
As a young teacher, I always used my schools’ standard report cards the way I was expected to, but I also always found myself apologizing during parent conferences:
“What’s not clear from this report card is that we’re really teaching the whole child.”
“I want you to understand that there’s much more important growth happening than what this report suggests.”
“Actually, what you see on this report card belies a few significant concerns I want to address with you.”
When I worked with adolescents in various Montessori environments and in Japanese public schools, student assessment, summary reporting, and parent conferences were easily the most problematic aspects of my job. I understood, of course, the need for schools and teachers to be accountable to parents. But I couldn’t find a way to demonstrate intentionality about the kinds of development I was convinced are actually most important for children.
Though I left the classroom for a time to adopt and raise two children, I never stopped thinking about how parents and teachers have to work together to agree on how they will recognize and encourage growth. When my own kids were old enough, I toured and observed far and wide for a school whose values I could fully embrace. In addition to observing physical environments and classroom interactions, I found myself increasingly drawn to schools’ reporting tools.
Far beyond what I’d expected, school report cards offered insights into the fundamental values of the learning communities I was comparing.
I felt like I’d found the place where the rubber meets the road: the school’s explicit assertions, for parents, about what had been accomplished while students were at school. Studying school report cards became a hobby for me.
I had my first real assessment AHA! moment in 2009, when I got my hands on a multi-page rubric developed by Pat Baker, one of the founders of The Bixby School – a private, progressive Pre-K through 5th grade learning community in Boulder, Colorado. I was so excited by The Bixby School’s assessment rubric (along with other aspects of the program) that I immediately registered Kofi to attend 1st grade.
Ironically, by the time I stumbled on the rubric, two or three years into its use, the Bixby faculty had come to find it cumbersome, and parents were pushing for quantitative reporting consistent with local trends in education. The school discontinued use of the rubric soon after. In both my private and my public school experience, parents and administrators have overwhelmingly favored a scorekeeping paradigm of school accountability. This reveals nothing so much as a need for educators to craft (and apply deliberately) a more complex and nuanced narrative about the contribution they make to young people’s development.
Bixby moved on, but I was hooked on key elements of the reporting format. Specifically, I was enamored with the idea of developing a collection of non-academic attributes, grouped into several domains, each characterized by a narrative continuum of behavioral descriptors. I set about adapting and modifying the instrument right away, and had been tinkering with it for several years before I found myself back in an institutional setting where I had the latitude to pilot its use.
While I immediately envisioned possibilities for creating rigor and consistency in observation practices across entire school communities, at no point did I hope to arrive at the tool to end all tools. I wanted always to protect users’ flexibility to customize, individualize, and create local, contextual relevance.
When I was hired to lead a team of adolescent educators as an embedded Program Director, enhancing the quality of our reporting became a guiding objective.
At first, we amplified our commitment to the narrative sections of our report cards, deliberately and explicitly endowing qualitative data with greater institutional value.
It did not take long for us to feel that our reporting was gaining integrity.
Over time, though, the lack of consistency was disappointing – not only across the faculty, but also across successive reports written by any given teacher. Despite rich observations, thoughtful individual instruction, and purposeful, well-designed, and highly personalized support processes, there was little rigor or coherence (and therefore, little credibility) in our reporting narratives. One teacher’s effusive commentaries would make another teacher’s less effusive reflections in the adjoining section appear to be lacking. There were very few threads of qualitative reporting consistency to be found across the student population in the adolescent program, and almost none across the school’s Pre-K – 12th grade age span. As so often happens in a world where what counts is what can be counted, it was a state of affairs that implicitly communicated a lack of value and intention around the immeasurables.
In fact, there was no clarity about what we were observing for, or why. The entire qualitative reporting structure was built on the intuition of a handful of talented and well-meaning adults, and each of them expressed their own set of values in the little kingdom behind their classroom doors. There was little explicit agreement among us about the attributes we were cultivating, and nobody on the outside could tell how we spelled success. We had no idea whether we were trying to nurture the same traits in all the young people, or focus entirely on the individual. We had no explicit collective understanding of whether, as a faculty, we were focused on filling gaps, “correcting weaknesses,” fostering growth in areas of strength, or all of the above.
I began to consult with various thinkers for guidance in defining dimensions of observable growth, along with a set of representative attributes within each dimension, whose acquisition could be observed and described.
My intention, from the outset, was to use a recursive cycle of design, implementation, reflection, and modification to develop an accessible and editable tool that could be adapted to diverse environments. At the same time, various users were helping me to articulate better rationales for using this tool.