Evaluation in the Wild: The Public Service Loan Forgiveness Program

The importance of being clear about failure

By Morgan Valley

You may have read recent news coverage of GAO’s report describing how unsuccessful the Temporary Expanded Public Service Loan Forgiveness (TEPSLF) program has been. TEPSLF was an extension of the Public Service Loan Forgiveness program, created to forgive federal student loans for people who worked in public service for at least 10 years – a reward for doing valuable work that often comes with lower wages. But it has a 99% denial rate, the same denial rate as the larger and similarly problematic Public Service Loan Forgiveness program.

The place I’ve heard most about the student loan forgiveness program is over the dinner table. After serving 10 years as an educator and advisor at our local university, my partner recently joined the 99% of public servants who applied for loan forgiveness and were denied. Like many others, he was denied partly because of confusion over the process, but also because he received misinformation and mixed messages from those administering it.

My partner read GAO’s report and felt some validation learning that almost all of his peers found the process confusing. He also felt frustrated that the report didn’t quite capture what he experienced – the program that promised to aid him and many of his colleagues didn’t work.

As an evaluator, I read the report with an additional perspective. I appreciated the methods used to reach the findings – they analyzed loan data, reviewed rejection and complaint documents, and conducted interviews with a wide range of stakeholders, including officials, supporting organizations, and borrowers. Those are some seriously robust mixed methods!

I appreciated the report’s clear and concise writing (somewhat forgiving the excessive use of jargon because it comes with the territory). The report summarized an incredible amount of information into a digestible size and provided a one-page actionable executive summary.

But when I finished the report, I felt dissatisfied – and not just because of the financial implications a loan forgiveness denial carries for my family. Like my partner, I felt the report was missing something. I wished the authors had included one more critical piece – a clear and powerful insight statement.

Yes, they provided a summary statement in the header: Improving the Temporary Expanded Process Could Help Reduce Borrower Confusion. 

But this “so what” statement felt mild and dismissive of the true insight behind the findings and the experience of those impacted by the program’s confusing processes. Maybe it will help policymakers and others working in this space interpret and act on the findings, but it could just as easily help them point to borrower confusion as the primary issue.

Meanwhile, many news outlets got the insight right in their headlines. For example, here’s a selection of the headlines I saw about the report: 

The journalists made the insight clear – the program failed the people it promised to serve. As evaluators, we can learn a lot from journalists about crafting an insight statement that both summarizes our findings and presents their implications – without the hedging.



Morgan Valley, PhD | Evaluator

Morgan is an academic researcher turned evaluator, skilled in bringing mixed methodologies and advanced data synthesis to clients in ways they can understand and use.

