Avoid Three Common Pitfalls When Using Data
An example from the education system
By Elena Harman
I’m sure you’re familiar with the phrase “too much of a good thing.” Take data, for example. (You knew I would.) The business sector loves statistics, and lots of them. And the social sector is quickly following suit. As a collective, we’ve even given it a name, Big Data, because volumes of data must mean something larger than life.
Don’t get me wrong: as an evaluation consultant, I’m a fan of measurement and information, but only when it’s gathered through a thoughtful, intentional approach that engages those most directly involved and draws on a balance of data sources. I recently read an article by Simon Rodberg, who was the founding principal of the District of Columbia’s International School, a public charter middle and high school. Rodberg found himself on the wrong end of too much information, so much so that he’s become an advocate for reforming data-driven approaches.
Long story short, Rodberg inherited the field of education’s 30-year belief system that more student testing will get us the results we need to improve performance. Under that belief system, teachers set aside “Data Days” for analyzing end-of-year and mid-year exams, interim assessments, teacher-created and computer-adaptive tests, surveys, attendance, and behavior notes. Today, principals who subscribe to that philosophy have gotten lost in the abundance of numbers.
Rodberg’s call for reform is a sound one. Statistics and numbers are not inherently bad, but the education system has lost sight of the best way to use them. Somewhere along the way, the system has arrived at a more-is-better philosophy. Unfortunately, more statistics mean less intentionality, which often results in accountability as the focal point. When accountability becomes the center of attention, you stray from opportunities to learn from data.
Here are three issues the education system is experiencing with data. To be clear, these are not unique to education; I see these same three issues again and again across the public and social sectors as they try to use more data:
Looking outside the immediate context to determine questions: Evaluation works best not when it’s mandated by policy makers and administrators, but when teachers on the ground are enlisted to identify the questions that, if answered, would help them do their jobs better.
Using only one data source: When we think of data narrowly as only what can be quantified, we’re missing half the story. Dashboards and other quick analyses that focus exclusively on easily captured figures leave out qualitative information and other equally important answers that inform growth.
Over-emphasis on accountability: In education reform, a misguided approach to and relationship with data has failed kids. Data is only a powerful tool when embedded in a system of ongoing improvement. No Child Left Behind focused on using data as a tool for accountability without giving schools the skills and capacity to learn from data. With that one-sided approach, we are left admiring the problems in education instead of learning how to fix them.
Like Rodberg, I'm not ready to give up on data in schools. So let’s consider downsizing the volume of metrics and refocusing our efforts on letting teachers identify where they genuinely need support. Remember that measurement is only as useful as the extent to which it informs strategy.
And a sound strategy is formed by effectively addressing three priorities: (1) Empowering those directly involved to form the questions that genuinely drive change, (2) Applying a mix of data sources that fit your key questions, and (3) Embedding data in a strategy aimed at nurturing growth rather than blame.
If you can keep an eye on these priorities, you’ll avoid drowning in a sea of data and, instead, find yourself buoyed by answers to key questions that inform good decision making.
Elena Harman, PhD | CEO
Elena takes the big-picture view of how Vantage’s work transforms how evaluation is used and perceived. She pushes everyone around her to think bigger about what evaluation can be, and how it can help improve our communities. With an encyclopedic knowledge of research and evaluation methods, Elena supports and advises the evaluation team on all projects. She connects the dots between data sources and projects. Elena has dedicated her life to Colorado and evaluation as a means to improve the lives of state residents. She brings a deep expertise of systems, nonprofits, and foundations in Colorado, as well as how to engage diverse audiences in a productive conversation about evaluation.