Make Your Data Delicious


Data Sandwiches

By Elena Harman

Every few months I encounter a nonprofit that has spent tens of thousands of dollars on evaluation and ended up with a very expensive paperweight. The evaluation process does not end when the data is in. Data alone can’t decide what actions to take, and it certainly can’t implement those decisions. That’s where you come in: it is the nonprofit’s leaders who take the data, identify what action, if any, it calls for, and then act to improve the nonprofit’s services.

Once you’ve taken the time to focus the evaluation on the things you care most about, and dedicated the resources to carry the evaluation through, the last step in the process is to reflect and learn. One of the first steps you can take is to make your written evaluation results as impactful as possible.

No-Nos and Data Sandwiches

A no-no I see constantly in nonprofit evaluation reports is what I call “data vomit”: taking a jumble of data and evaluation findings and scattering them throughout a report without any context or interpretation. I see this approach most often in bulleted lists in grant reports, and it looks something like this:

Data Vomit

Because of your attendance at the event, which of the following do you plan to do? (n = 181)

  • Use resources or tools received at the event: 150 (83 percent)

  • Follow up with new contacts: 71 (39 percent)

  • Reconnect with colleagues: 56 (31 percent)

  • Share stories: 48 (27 percent)

This technique puts the burden on the reader to figure out why those numbers matter, and most of the time, especially when you are working with folks who aren’t naturally inclined toward evaluation, the answer is that they don’t.

A better strategy is what I call a “data sandwich.” A data sandwich has three parts: a conclusion, the supporting data, and a pretty picture.

[Image: the data sandwich diagram]

First, share one sentence about what you took away from the data: the conclusion. The second sentence should support that conclusion with data. The last element is a pretty picture related to your finding, so readers who are visual learners, or who want to see the full supporting data, can take it in easily. If your data is quantitative, you will most likely have a chart or graph. If your data is qualitative, you’ll have to be more creative about presenting key themes visually, or you can start with a simple quote box.

Data Sandwich

Attendees were most likely to use resources or tools as a result of the event. In fact, 83 percent of attendees reported that, as a result of the event, they planned to use resources or tools they received there. The next most commonly reported action, following up with new contacts, was selected by only 39 percent of attendees.

[Image: example chart for the data sandwich]

Compare the data sandwich to the original data vomit: the data sandwich highlights the message and puts the legwork in the hands of the writer instead of the reader.
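If you assemble sandwiches like this for many survey questions, the percentage arithmetic and the first two layers of text can be scripted. Here is a minimal Python sketch of that idea; the function names and wording templates are illustrative assumptions, not from the article:

```python
# Minimal sketch: turn raw survey counts into the first two layers of a
# data sandwich (conclusion sentence + supporting-data sentence).
# Function names and phrasing templates are illustrative, not from the article.

def percent(count, n):
    """Share of respondents, rounded to a whole percent."""
    return round(100 * count / n)

def data_sandwich_text(n, responses):
    """responses: list of (label, count) pairs for one survey question."""
    ranked = sorted(responses, key=lambda r: r[1], reverse=True)
    top_label, top_count = ranked[0]
    runner_label, runner_count = ranked[1]
    conclusion = (f"Attendees were most likely to {top_label.lower()} "
                  f"as a result of the event.")
    support = (f"In fact, {percent(top_count, n)} percent of attendees planned "
               f"to {top_label.lower()}; the next most common action, "
               f"{runner_label.lower()}, was selected by only "
               f"{percent(runner_count, n)} percent.")
    return conclusion + " " + support

responses = [
    ("Use resources or tools received at the event", 150),
    ("Follow up with new contacts", 71),
    ("Reconnect with colleagues", 56),
    ("Share stories", 48),
]
print(data_sandwich_text(181, responses))
```

The third layer, the pretty picture, still calls for human judgment about chart type and labeling, so it is left out of the sketch.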

Learn more about data sandwiches and how to digest your findings in my new book: “The Great Nonprofit Evaluation Reboot: A New Approach Every Staff Member Can Understand,” available on


Elena Harman, PhD | CEO

Elena takes the big-picture view of how Vantage’s work transforms how evaluation is used and perceived. She pushes everyone around her to think bigger about what evaluation can be, and how it can help improve our communities. With an encyclopedic knowledge of research and evaluation methods, Elena supports and advises the evaluation team on all projects. She connects the dots between data sources and projects. Elena has dedicated her life to Colorado and evaluation as a means to improve the lives of state residents. She brings a deep expertise of systems, nonprofits, and foundations in Colorado, as well as how to engage diverse audiences in a productive conversation about evaluation.
