The Initiative Graveyard

You can feel it in the staffroom before the slide deck even opens. Another launch. Fresh name. New poster set. The energy is earnest; the intention is good. And yet, somewhere between Week 6 and the end of term, the rhythm fades. Handouts disappear into drawers. Lesson resources sit half-finished. The initiative joins a familiar landscape of good ideas that never became lived practice.

I hear the same line in schools across the country: “We launch new programs… and then quietly bury them months later.”

The result isn’t just wasted effort—it’s something deeper. People start to expect that the next thing won’t last either. Engagement becomes polite. Momentum becomes episodic. Culture becomes cautious.

This article is about breaking that cycle—turning change from an event into a habit leaders can see, measure, and sustain.

What’s Really Going On (and Why It Keeps Happening)

Implementation science has been clear for two decades: outcomes depend as much on how we implement as what we choose (Fixsen et al., 2005; Durlak & DuPre, 2008). Schools rarely lack ideas; they lack shared routines that make new practice observable, supported, and reviewable in short cycles.

Two traps in particular keep filling the graveyard:

  • Front-loading and forgetting. Leaders invest heavily in the launch—big PD days, slick slide decks, ambitious goals. Then the routines that would sustain the work between meetings are left implicit. No action log. No check-in cadence. No feedback loops. The work drifts.
  • Too much, too fast. In an attempt to make progress everywhere, leaders spread effort too thin. As Fullan (2011) warns, coherence collapses when priorities outnumber capacity. Spread is not scale.

Research Spotlight: The ACU Principal Health & Wellbeing Survey (2023; 2024) identifies role conflict—too many demands, unclear priorities—as the top stressor for school leaders. Coherent, short implementation cycles reduce that conflict by making progress visible and workload shared.

This isn’t a dead end—it’s a signal to redesign for embedded impact.

For a companion piece on how rules and red tape choke follow-through, revisit The Invisible Wall of Compliance.

The Barrier: Staff Complacency

In the Culture of Excellence framework, the Initiative Graveyard sits under Staff Complacency. Not because staff don’t care, but because repeated, abandoned pushes teach people that effort is temporary and feedback won’t change anything. When change becomes theatre, the wisest response is caution.

You’ll know this barrier is shifting when staff begin asking for the next review checkpoint, when action items are claimed publicly without prompting, and when teams talk more about evidence moved than activities completed.

Trust underpins this shift. For how trust accelerates adoption, see The Trust Dilemma.

The Corrective: Continuous Learning

Within the Culture of Excellence framework, Continuous Learning is the guiding Aspect here. Practically, that means shortening cycles, making wins visible, and tying every action to an observable learner outcome.

Continuous learning isn’t “infinite change.” It’s disciplined iteration—the habit of improving something small, quickly, together.

“Measure the change, not the launch.”

Signal of progress: staff begin to speak in evidence. “By Week 3, the entry routine shaved two minutes off transitions—10 extra minutes of learning per day.”

The Practical Playbook: Five Shifts to Keep Initiatives Alive

1. If you don’t define the finish line, staff won’t run the race

I often ask staff: “What does success look like for this initiative in six weeks?” The room usually falls quiet. Without a finish line, everyone is running in different directions.

Strong leaders declare the finish line before the start line. One clear learner outcome. Two visible practice shifts. Three ways to measure within six weeks. When success is pictured this concretely, teachers lean in.

Research backs this: Rogers (2003) showed adoption depends on compatibility, trialability, and observability. Hattie (2016) highlights clarity of success criteria as a driver of collective efficacy. When people can see what success looks like, they believe they can achieve it.

“If you can’t picture the finish line, your staff can’t pace the race.”

2. Momentum beats magnitude

Schools love big plans. But big plans collapse under their own weight. I’ve watched year-long rollouts suffocate under paperwork and fatigue before a single cycle finishes.

The alternative? Six-week loops: design → do → decide. Small enough to feel achievable, tight enough to keep energy alive. Timperley (2011) showed short feedback loops drive adaptive practice. Durlak & DuPre (2008) found implementation quality—not program choice—was the biggest predictor of outcomes.

Leaders who keep cycles short see staff energy rise, not fade. Teachers arrive at check-ins with data in hand, eager to show what shifted.

3. Goodwill collapses without design

One principal put it bluntly: “If I don’t do it myself, it won’t get done.” That’s the Collaboration Illusion all over again.

Distributed leadership works only when accountability is explicit. Leithwood, Sun and Schumacker (2020) showed this: without role clarity, responsibility collapses back onto one person.

That’s why role maps and action logs matter. Not as bureaucracy, but as signals of fairness and trust (Edmondson, 1999; Bryk & Schneider, 2002). When everyone can see who owns what, goodwill becomes shared responsibility.

“If everyone owns it, no one owns it.”

4. Retire before you replace

Here’s the hygiene rule every school needs: nothing new launches until the last initiative has been closed properly. Not shelved quietly. Closed.

That means presenting what changed, what sustains, and what stops. It’s short, sharp, and public. And it matters more than most leaders think.

Fullan (2011) and Senge (2006) both warn that coherence relies as much on stopping as starting. Without closure, staff conclude nothing really sticks. And then every new launch starts with cynicism.

5. Capacity is the engine; the initiative is just the road

Too often, professional development becomes a scattershot of workshops, disconnected from the habits initiatives require. I’ve seen schools pour hours into theory sessions, only to watch staff flounder when practice begins.

The smarter move: invest capacity exactly where the initiative lives. If the focus is classroom routines, rehearse them. If it’s feedback, practice it live. Hargreaves & Fullan (2012) describe professional learning as the deepest cultural investment a school makes. When it’s scattershot, practice stays scattershot.

One short rehearsal with feedback beats a full day of theory. That’s how intention becomes habit.

Sustaining Without Overload

Sustaining isn’t about heroics. It’s about rhythm. Protect a standing implementation window in your meeting cycle. Keep the action log alive. Run a light “scale check” before expanding beyond early adopters. Rogers (2003) showed spread only succeeds when it’s paced.

Signal of progress: leaders report fewer ad-hoc “rescue” tasks; staff begin asking when the next cycle starts because wins are visible and workload is shared.

For system-level design choices that reduce red tape while protecting authority, revisit The Invisible Wall of Compliance. For how trust makes adoption safer and faster, see The Trust Dilemma.

Reframe and Next Step

The graveyard isn’t inevitable. It’s a design problem we can solve. When you define the finish line, shorten the cycle, make contribution visible, close loops, and build capacity deliberately, initiatives stop being events. They become culture.

Don’t launch harder. Embed smaller and sooner.

Where to start this term: pick one existing initiative that drifted. Run a single six-week cycle with a role map, action log, and a visible finish line. Publish the closure note. Notice what changes—then scale.


References

Australian Catholic University. (2023). The Australian Principal Occupational Health, Safety and Wellbeing Survey 2023: Data report.

Australian Catholic University. (2024). The Australian Principal Occupational Health, Safety and Wellbeing Survey 2024: Data report.

Bryk, A. S., & Schneider, B. (2002). Trust in schools: A core resource for improvement. Russell Sage Foundation.

Bryk, A. S., Gomez, L. M., Grunow, A., & LeMahieu, P. (2015). Learning to improve: How America’s schools can get better at getting better. Harvard Education Press.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350.

Edmondson, A. (1999). Psychological safety and learning behaviour in work teams. Administrative Science Quarterly, 44(2), 350–383.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R., & Wallace, F. (2005). Implementation research: A synthesis of the literature. University of South Florida.

Fullan, M. (2011). Choosing the wrong drivers for whole system reform. Centre for Strategic Education.

Hargreaves, A., & Fullan, M. (2012). Professional capital: Transforming teaching in every school. Teachers College Press.

Hattie, J. (2016). Visible learning: A synthesis of over 1,200 meta-analyses relating to achievement. Routledge.

Leithwood, K., Sun, J., & Schumacker, R. (2020). How school leadership influences student learning: A test of “The four paths model”. Educational Administration Quarterly, 56(4), 570–599.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.

Senge, P. (2006). The fifth discipline: The art and practice of the learning organisation (Rev. ed.). Doubleday/Currency.

Timperley, H. (2011). Realising the power of professional learning. Open University Press.