28 Comments
Aaron Lemon-Strauss

Thanks to you, Jen, for literally writing the book (and the article, and the post, and everything else) that allowed this work to happen.

Jennifer Pahlka

Writing is a WHOLE lot easier than the turnaround you have accomplished, Aaron!

Kevin

It’s a small point here, but it jumped out at me that it was much easier to use an off-the-shelf analytics package. That’s something I have run into several times with contracts, specifically for analytics. It’s so easy to specify exactly what data you want to track in every detail and accidentally define the requirements so that it’s 100x more expensive to build. Tiny details of data processing pipelines can be unintuitively expensive because they mean you have to build something from scratch rather than using standard technologies.

Jennifer Pahlka

Byrne's law: If you'll take 85% of the features, you can have it for 10% of the cost.

Aaron Lemon-Strauss

Yeah, one thing I'm learning in my transition to government is that the basic product discovery process looks different when vendors do the primary engineering lift. Throw out a random idea and someone at the vendor will record it as a "requirement" and then build to it, adding complexity and cost, without pausing to evaluate the architectural fit, ROI, etc. Moving to buying capacity helps a bit, better DevOps helps a bit, but fundamentally we've needed to get the vendor leadership in a room and get on the same page about operating as an integrated outcomes-focused team. Lot of phenomenal people working at vendors, to be clear. But it's hard to get around the core business incentives.

Micah Saul

Huge +1. Being able to act as one outcome oriented team is critical for effective vendor engagement. Of course, there are significant institutional and procurement barriers in the way of that “badgeless culture” so kudos to y’all specifically for finding a way through that morass, and generally for the whole thing!

David Garten

But this highlights another issue/challenge for GAO, which is considered the investigative arm of Congress. GAO typically starts working on reports like this after receiving a request from Congress, usually from a committee. Because GAO is so risk-averse, they are very laborious in their fact-finding and careful with the language they include in their report. As a result, a typical engagement from start to finish can take nine months to a year, or even longer. But this incident really highlights where GAO’s model fails. Government doesn’t typically move fast, but in this case, and we should see more of this with better technology, they did move quickly to address the issue. GAO’s model is one that fails to adequately pick this up. GAO is great and more people should read their reports, but this incident highlights that they also probably need to change with the times.

MHarden

This is an excellent read, and congrats to the team for righting the ship, correcting the record, and serving the public.

As someone who had a ringside seat for Healthcare.gov, at the time aligned with the large SI that took the brunt of the blame for that situation and to this day does not push back on it, I find a great many parallels to that situation. And I'll note that the folks who came to the table to participate in that collective fix are still praising themselves for their individual smart work, ignoring that so many of the fundamental problems were exactly the structural issues that we see at play in this ED/FAFSA environment.

Again, congrats to all involved for righting the ship.

Karen Shields

Thank you for recognizing the Department of Education teams that work to solve this problem for the country. Thank you for also recognizing the leadership that championed their ability to get this done. Many government leaders are forced to be the “guardians at the gate” while these teams work and do the “unconventional” things that are needed to save these programs. It is true that the oversight bodies often find a mismatch between their audit grid and the way recoveries are enacted. But they are learning and deserve grace. We will all keep learning together. Because what you very accurately point out is that we are only as agile, smart and product-based as the most unpracticed part of our ecosystem. When we bring each other up, we all rise higher.

Joshua Miller

It's worth pointing out that GAO and ED agree that the initial process sucked. GAO's advice is usually pretty obvious in retrospect, yet often forgotten in the moment: Make a plan. When things go wrong, amend the plan. Use and enforce contracts to do both, and hold contractors accountable when they don't comply with the agreements they've negotiated and modified multiple times during the engagement.

It's also notable that GAO responded to the "spicy" ED letter in their report, on page 42. Do you object to their work on how agencies can implement APOM? Here's the text of their response, since people may not want to wade through the document to find it:

"In its comments, FSA also stated that it believes our analysis teaches the wrong lessons and reinforces the exact practices that led to the FAFSA’s initial challenges. However, FSA did not explain how our work reinforces the practices that led to FAFSA’s initial challenges. As our report notes, FSA was not appropriately overseeing the work of its contractor and did not adequately ensure rigorous testing of the system. By not doing so, FSA put the FAFSA modernization effort at risk of failure, which their letter points out."

"FSA further stated that it disagrees with several key parts of the draft report, especially where, in the agency’s view, the report applies a more traditional and somewhat outdated project-based model that does not support modern technology development for scaled systems like FAFSA. The agency identified a product operating model by the Niskanen Center that it strongly supports because the model advocates for modern technology development within government. The agency stated that this more modern model is at odds with our focus on rigid compliance with specific outputs listed in contracts signed years before a feature is delivered to the public."

"We disagree with these statements. The product operating model identified by the Niskanen Center, as the center points out, is a structure to operationalize Agile principles. This is also known as APOM. We have long supported effective application of Agile principles. In September 2020, we issued a guide for adoption and implementation of Agile and updated this guidance in November 2023. [https://www.gao.gov/products/gao-24-105506] We have also conducted many reviews of agencies’ implementation of Agile."

"To guide this report, we used current federal and agency guidelines and requirements, as well as established best practices and the FPS contract. In addition, FSA could work with its contractor to modify the contract to update requirements and has, many times since the original 2022 contract was signed. The requirements in the contract, whether the agency agrees with them or not, lay out the agreed upon work and deliverables that the agency can use to hold its contractor accountable for delivering what was promised to the federal government. It is also a key tool in ensuring federal tax dollars are spent appropriately."

WRDinDC

GAO purports to evaluate DOE against IEEE's guidance ("Software and systems engineering—Software testing—Part 3: Test documentation, IEEE/ISO/IEC std. 29119-3:2021"), fn9.

Is GAO's whole premise of evaluating DOE against this standard inappropriate?

Ann Lewis

Agree this is a great question, and it gets at some key methodological issues with auditing frameworks. IEEE 29119-3:2021 provides templates and examples for test documentation outputs tied to various software test processes, like test plans, test procedures/specs, test status reports, incident reports, etc. It's meant to be comprehensive, but it is not a spec to comply with. In 29119-3:2021, there’s a strong emphasis on “tailored conformance,” which I believe is intended to mean that you don’t have to use everything, but you do need to justify what you use and don't use.

My take: saying that IEEE 29119-3:2021 is an authoritative spec to judge a software implementation against is like saying that a physics textbook is a spec to judge the construction of a new train system against. Yes, building trains involves some amount of knowledge of physics as it applies to civil and mechanical engineering, but that doesn't mean you have to check every principle of physics against your transportation plan to call it done (as opposed to checking higher-priority and more meaningful things like track alignment, soil stability, and civil works integration). Checking physics principles against a running train system is not a helpful audit, but there are many safety, regulatory, technical, and financial processes that are helpful, domain-specific audits.

Sam

Doesn't the report say they use only seven practices that are considered baseline requirements from this guidance document, though? Not the whole thing?

Jennifer Pahlka

Excellent question. Let's take a look at it.

Daniel Honker

Fantastic writeup, and kudos to you Jen for giving kudos to the agency leadership -- not easy, especially in this time.

Auditors in organizations like GAO usually rely on “bright line” tests—clear rubrics and standards they can point to as objectively met or unmet (a policy does or doesn’t exist, requirements did or did not exist, milestones met or unmet, etc.). You’re right to note that many of those standards are badly out of date. I also wonder if the whole idea of a bright line test is mismatched to modern technology. Auditing digital practices like iterative, human-centered approaches isn’t a simple yes/no question—it’s far more nuanced and subjective. For oversight to be meaningful today, it’s not just the rubrics that need to modernize, but also the underlying mental models. Otherwise, audits risk missing the point: driving better results and smarter use of resources for the public.

Ed Knight

Oh, God. You are so, so right. In my experience (w/ NASA and DOD), the oversight groups are driven by "we don't want to be embarrassed" and so what matters is that the team they're overseeing checked all the boxes, even if they're the wrong (and often stupid) boxes.

As someone who will start filling out the FAFSA *next week*, I am thrilled to read that the team is actually, you know, putting their mission before their bureaucracy. Unfortunately, the cynic in me thinks they'll still get squashed for defiance--either directly or by having more compliant management installed.

Ann Lewis

👏👏👏

JA

I’ve never liked the GAO or their processes, but it’s very fascinating to me that in an ecosystem where violence is prevalent, you all will choose words like “schooled” and “punches back.” Why is winning only defined in the context of these words? There are much more constructive ways of highlighting technology wins within the government without having to resort to rhetoric that, at most, I’d expect from tech bros.

It’s also worth researching how the cost of implementing the FAFSA has doubled since contract award (now around $100 million), and that’s only for one system. USASpending shows it all. A lot of winning, yes. Thoughtfulness, maybe less so. I wonder where taxpayer $ are being saved here.

Shawster

Those terms are how normal people talk, and they are useful for communicating ideas to a wide audience.

Raghav Vajjhala

I have a long history of frustration with both GAO and OIGs, as I have been the named action officer on dozens of recommendations, and my teams have closed on the order of hundreds more recommendations from ATO-related audits.

GAO's methodology is a good predictor of a report's findings. In this case, GAO's methodology focused on the original rollout's shortcomings, leaving little room for recognizing the recovery. Reading between the lines of the DOE/FAFSA response, there is (justifiable) frustration that GAO chose (likely at the request of a House or Senate member) to focus on past actions overcome by more recent events.

My primary critique - both GAO's report and DOE/FAFSA's response continue the federal practice of not directly criticizing the procurement practices that materially contribute to failed IT rollouts. In this case, it would have been helpful for DOE/FAFSA to mention the role played by the College Board and discuss whether the original procurement allowed for consideration of those alternative methods of contract management. On GAO's part, by design they focus on agency management discretion on policy, which omits scrutiny of procurement actions.

The DOE/FAFSA team could have accepted the finding and responded with their own action to close it, as GAO generally defers to agencies on how to close recommendations (or at least much more so than OIGs do). It's unlikely GAO would ever write the specific recs suggested by DOE/FAFSA; the best GAO would ever do is recommend an agency update its policies. This allows the agency tremendous discretion in how to close the recommendation, and there is nothing stopping DOE/FAFSA from doing so of their own accord.

From my read, both GAO's methodology and DOE/FAFSA's response missed the opportunity to scrutinize the role procurement (which can limit market research into commercial efforts like those at the College Board) plays in modernization efforts.

mathew

"My primary critique - both GAO's report and DOE/FAFSA's response continue the federal practice of not directly criticizing the procurement practices that materially contribute to failed IT rollouts."

nailed it

eliza

Love this! Go FAFSA!

Sam

Did you read GAO’s rebuttal to the Department of Education’s rebuttal? I’d be interested to know what you think. They essentially say that 1) Education should change their policies if they want to be evaluated against different criteria, and 2) none of their recommendations preclude an Agile approach, and in fact, they encourage it.

Aaron Lemon-Strauss

Some thoughts from me, responding to that summary of GAO's notes on our letter:

1) Changing ED/FSA policies is hugely important, and GAO is absolutely right to point there as the right move/recommendation. That's part of what we leaned into in our suggested edits to GAO's recommendations. We need to overhaul (scrap?) the current Lifecycle Management Methodology at FSA, which requires 30-page PPTs anytime you want to do a release. So I'm totally onboard with changing our policies as the way forward. One quick side note: when you ask people internally why there's so much process/documentation required for stuff like this, the first thing they bring up is that most of the process was created to comply with years of GAO/OIG audits.

2) I'll be honest that I'm a little less open to this as a response from them. :) I totally believe that they have the intent to support Agile development practices. But in my interactions over the course of now several audits, they always come back to the need to document everything and create firm processes for everything. In May I invited them to embed within our team for a week (or however long they want!) and observe how the team works, how we manage vendors, etc. They declined and asked for documentation. It's part of why I footnoted the Agile manifesto, which obviously isn't the Holy Grail, but does seem prophetic twenty-five years on:

Individuals and interactions over processes and tools

Working software over comprehensive documentation

Customer collaboration over contract negotiation

Responding to change over following a plan

Kaitlin Devine

There is a key point in the rebuttal that I don’t think they internalized, or at least didn’t respond to — the team could have completed every documented requirement and still fallen far short of the desired outcome. That is why it is necessary to empower the team to make or change decisions in the framework of the overall goal. I guess you could change the contract or redraft a bunch of plans every time that happens, but if it’s happening frequently enough, that doesn’t seem realistic.

I am familiar with their agile guide, and it focuses heavily on very prescriptive “agile” processes rather than on how to continuously reorient around a target outcome. Highly functional agile teams know how to reinvent their internal norms and processes to support high-value delivery. They change over time as conditions change or the shape of the team changes. The guide also goes very heavy on estimation as a control, which IME is not a good means of ensuring compliance or a valuable use of time.

Kudos on the work and the well written rebuttal!

Wes

Hell yeah!

Thomas L. Hutcheson

This reminds me of an incident while I was temporarily working at USAID. The agency had only recently started working with the country again after it got a new President and did a bunch of very good stuff. Among them was funding an economic advisor to the new VP. [I say "good" in principle; how much of the advice the VP took is another story.] Time went by, we got a new Mission Chief, and when the advisor's contract was up, the Mission Chief made us advertise locally to renew it! Obviously we would select him again, but the advertising itself was just insulting to the advisor and counterproductive in calling attention to the VP having a foreign advisor.
