Friends, I hope this finds you well – for many of you, today was a snow day, so I hope you were able to make good use of it. I am sorry that circumstances continue to be really challenging for educators of all stripes, and I am grateful to all the people who wrote in response to the last Coaching Letter, about learning from the science of happiness. I’m glad it was useful. At the same time, I made reference to the fact that I had been planning to write about the book Street Data, by Shane Safir and Jamila Dugan, and I heard from a few people that they were hoping I’d get back to that (some of my readers can be a little demanding), so this is that letter. I actually started it before Thanksgiving, and while I am confident that I am not as stretched as anyone working in a school district, I’ve been a little busy. And I should also like to note that I have benefitted enormously from discussion about Street Data with my colleague, Rydell Harrison; he and I work together on a couple of projects and I am all the better for it.
Street Data’s title comes from the metaphor the authors use to describe three kinds of data available to educators:
Satellite data: large-grain-size data that illuminate patterns in, for example, achievement, equity, and teacher quality and retention, and that point us in a general direction for further investigation.
Map data: medium-grain-size data that help us to identify student progress in reading, math, etc. at a programmatic level (e.g., decoding, fluency, fractions), and that point us in a slightly more focused direction.
Street data: data that help us to understand student, staff, and parent feelings and attitudes, as well as specific conceptions and mindsets; they help us to monitor students' internalization of important knowledge, skills, and dispositions.
The best description of street data is on page 67 of the book:
“Street data offers a new grammar for educational equity. It humanizes the process of gathering data. Rather than positioning students and teachers as objects whose value can be quantified, street data teaches us to engage with people as subjects and agents in an ever-shifting landscape – human beings whose experiences are worthy of careful study and deep listening. It teaches us to be ethnographers rather than statisticians. And the process itself builds trust and relational capital.”
Street data are different from satellite and map data in dimensions other than scale. Primarily, they are qualitative rather than quantitative; they are a reflection of experience, and in this respect the book is very similar to other works we’ve consulted on design thinking, including but not limited to Jon Kolko’s Well-Designed and Liedtka and Ogilvie’s Designing for Growth.
Street Data takes issue with the data practices of top-down, accountability-driven education reform based on policies such as No Child Left Behind and Race to the Top. Instead, it takes as its premise that in order to effect truly radical, equity-centric reform, we have to re-build the system from the student up; that in order to do that, we have to be willing to understand the system from the students’ point of view; and that in order to do that, we need street data. So in its emphasis on centering equity, it sits apart from other books on continuous improvement, such as Learning to Improve; not because the methods of continuous improvement or improvement science are antithetical to equity – far from it – but because those methods are equity-neutral. But I don’t want to be disingenuous – there’s no reason why you couldn’t use the methods in Street Data in an equity-neutral way, either.
The collection of street data requires focused listening and observation (have I mentioned lately that education would be in better shape if everyone had coaching training?); but beyond that, you have to go looking for street data. Satellite data, especially, are unavoidable – they tend to be the high-stakes data that we are forced to confront all the time. But we are not very good at getting underneath the high-stakes data to figure out what’s really going on – not just getting at root causes from our own perspective, but getting at them from the perspective of people in the system whose voices are often absent. The collection of street data also requires humility – we may not like what we hear, and accepting feedback that you don’t agree with takes skill. And once you have collected the data, you have to have a plan to do something with them – I was at the UCEA conference last November, and a theme I heard was that data collection events like listening tours and equity audits are frequently performative: they are done for the sake of being able to say that they were done, and nothing changes as a result, which actively undermines social capital and relational trust.
Speaking of coaching: the work of coaching consistently involves asking people how they know something: how do you know that’s what’s really going on? what makes you so sure that that’s the problem? how will you know your proposed solution will make a difference? how do you know? Frequently there’s an easy answer, like, well, I could ask; but just as frequently, clients will start creating more elaborate ways of answering that question, and I’m always curious what it is about education that makes us think we should set up fancy data collection systems when there are people who could just tell us what’s really going on. For example, I love many aspects of The Classroom Experiment, a two-part documentary about Dylan Wiliam’s work with a few teachers in an English secondary school (on YouTube: Part One and Part Two), including the segment on asking students for their feedback on teachers’ instruction – my, my.
It strikes me that there is, in general, a parallel between street data as an organizational practice and formative assessment as an instructional practice (and, just because, here is the most recent list of Resources on Formative Assessment). In both cases, we are trying to collect data as close to the consumer of our services as possible, and to do that as frequently as possible so that it can inform our practices in as close to real-time as possible. (I use the line all the time that if we are really interested in using assessment to inform instruction, then even an exit slip is too late.) And just as with classroom formative assessment we are seeking to make students’ thinking visible, with street data we are trying to make their experience visible.
There’s a story in Jordan Ellenberg’s book, How Not To Be Wrong, about an analysis of fighter planes that had been shot at during World War II battles – it showed that these planes were much more likely to be hit on the wings and tail, which is not surprising, given that these are the broadest parts of the plane, and plans were made to reinforce those areas. But Abraham Wald, a mathematician, pointed out that perhaps this reasoning was backwards: the reason these planes had bullet holes in those areas and not others was that planes that had been shot in more vulnerable places didn’t make it back. The reinforcement, therefore, should go on the parts of the plane where there weren’t any bullet holes. This story is often told to make the point that the stories behind the data are more powerful than the data themselves, or even that the missing data have a more powerful lesson to offer than the data we do have. Street Data reminds us that what is measurable is not the same as what is valuable.
So this is the frame that we can use to think about data – or the lack thereof. What’s the story behind the data that are easy to get at? Whose voices are missing from the data? Our basic assumption is that the data won’t give us all the answers; they will only tell us what questions to ask next.
Even if you don’t agree with the position that education should be re-invented and reconstructed to raise student achievement and erase achievement gaps, I can still make the argument that we should be more focused on street data. Talking with Rydell made me think about how consistently we undervalue qualitative data, and I have my own story about what we can gain from hearing people’s stories that we can’t get any other way. I wrote my dissertation on principals’ knowledge of good instruction: how they developed the mental models that they carry about what effective teaching and learning look like. I interviewed a bunch of principals, and found that there was, unsurprisingly, considerable variation in what they valued about instruction. What they all had in common, however, was that they developed their mental models very early in their careers – in fact, not during their careers at all. All of them spoke at great length about the role of their own experience as students in forming the core of their vision for high-quality instruction – I could go on and on about this. My point is that merely surveying principals about their views on instruction would not have led me to that key insight, which has influenced a great deal of my thinking and work since then.
Street Data is connected to many of the projects I’m involved in right now, so it’s been cool to apply these ideas. I hope to write more about those another time…
As always, I would love to hear from you about your thinking and application of these ideas. And please continue to pass on the Coaching Letter to others whom you think might be interested – and welcome to those who are reading the Coaching Letter for the first time! Best, Isobel
Isobel Stevenson, PhD PCC
istevenson@partnersforel.org
860-576-9410
If someone forwarded you this email, click here to subscribe to The Coaching Letter
Please follow me on Twitter: @IsobelTX
Author with Jennie Weiner of
The Strategy Playbook for Educational Leaders: Principles and Practices