Tim Lockie's Blog

Stop calling it 'AI adoption'—it's culture capacity

Written by Tim Lockie | Apr 29, 2026 1:20:23 PM


Why nonprofits keep misdiagnosing their AI problem, and how leaders can fix it by focusing on collective learning instead of tool adoption.

If you're leading a nonprofit and feel like you're "behind" on AI, you're not alone. But the problem isn't that your organization hasn't adopted AI yet—it's that you're trying to solve a cultural problem with a technological solution.


The real problem isn't AI adoption—it's culture capacity

Most failed AI initiatives didn't fail because of the tools—they failed because the system couldn't absorb the change.

"AI adoption" makes it sound like the job is straightforward: pick the right tool, train people, and then the organization gets better. That framing locates success in choosing well and executing training. But that's not what creates lasting change, and it's not what happens in practice.

Here's what actually happens: a few people sprint ahead, most people stall out, and the whole thing becomes another change initiative that dies quietly in a shared folder. The pattern repeats because organizations see individual mastery as the win, when the real challenge is whether the system can learn together.

Most organizations weren't built to learn collectively at the pace technology now demands. They were built for stability, for role clarity, for expertise that compounds over years. Those structures made sense when tools changed slowly. They don't make sense anymore.

The better question isn't "How do we get people to adopt AI?" The better question is: Does our culture have the capacity to change as fast as our tools are changing?

That's a leadership question, not a tooling question.


Adoption is private lessons for the tuba player

Imagine your organization as an orchestra: shiny new instruments won't matter if the musicians don't trust the conductor or play in sync.

If one person gets great at a tool, that's cool. But adoption isn't the win. Organizations win when the system can play together. Individual mastery doesn't equal organizational capability. An organization filled with individually skilled people who can't coordinate produces expensive chaos, not better outcomes.

Technology is the instrument. Data is the sheet music. Your people are the musicians. Leadership is the conductor.

If you don't like the way your organization sounds, you can buy shinier instruments. You can upgrade the tech stack. But if the musicians don't know how to play together—or they don't trust the conductor—the sound doesn't improve. It just gets louder.

The metaphor works because it exposes what "adoption" language obscures: the system's capacity to coordinate determines whether new tools create value or just create noise. You can have the finest instruments in the world, but if the orchestra can't stay in tempo, can't read the same sheet music, or doesn't trust the conductor's cues, you get cacophony.

Upgrading tools without upgrading coordination doesn't improve the sound. It amplifies whatever dysfunction already exists.


Why nonprofits keep repeating the same mistake

We keep buying better tools without building better teams—and then wonder why adoption fails again.

For years, nonprofits have overinvested in instruments. New CRM. New marketing automation. New donor analytics. New ticketing system. New project management tool. Now: new AI tools. Each wave promised to solve the coordination problem through better technology. None of them did.

And at the same time, we've underinvested in musicianship: shared language, trust, clarity of roles and decisions, and the ability to experiment without punishment. Those capabilities don't come from tools. They come from how leadership shapes culture, how teams build trust, and how the organization treats people who try something new and fail.

So AI lands on top of a culture already saturated with change fatigue. The system can't metabolize another layer of complexity. People are running hot, bandwidth is thin, and the implicit message from leadership—despite what's said explicitly—is often "make this work without slowing down what you're already doing."

This is the point: adoption keeps failing not because AI isn't useful, but because the system can't metabolize change. When culture lags behind technology, every new tool makes the organization more fragile, not more capable. Each new layer of complexity increases the coordination cost without increasing the coordination capacity.

The mistake nonprofits keep repeating is diagnosing a culture problem as a technology problem, then solving for the wrong variable.


AI isn't just a tool—it's an identity challenge

Under every AI conversation is a quiet identity crisis—and leaders who ignore it will never see real adoption.

AI stirs deep questions about value, belonging, and expertise. Under the surface, a lot of people are asking: "If AI can do the work I was hired to do... what's my value now?" "Am I training my replacement?" "Do I still belong here if I'm not the expert?"

These aren't irrational fears. They're reasonable responses to a technology that genuinely does shift what humans need to be good at. And they're identity-level questions, not skill-level questions.

Leaders try to solve this with training, policies, prompt libraries, and tool rules. Those interventions address the surface, not the fear. Policies and training can't resolve identity-level fears; only belonging and psychological safety can.

If people feel threatened, they won't experiment. If they won't experiment, they won't learn. If they won't learn, the organization can't adapt. The causal chain is direct: psychological safety enables experimentation, experimentation enables learning, learning enables adaptation. Break the first link and the whole chain fails.

So if you're serious about AI, you have to be serious about belonging. Because belonging is the base code of humans. People will protect themselves before they'll optimize the system. That's not a character flaw—it's how humans work.

The organizations that treat AI purely as a technical challenge will keep running into the same wall: people who say yes in meetings and then quietly resist in practice. The organizations that treat it as an identity challenge—those that build belonging, that create safe space to be bad at new things before being good at them—will actually see their people lean in.


How to make culture capacity visible and actionable

You can't manage what you can't measure—and culture capacity is measurable.

One of the most helpful moments for a leadership team is when the gap between culture and technology becomes visible. Not vibes. Not "we're trying." Visible. You can look at your organization and answer questions like: Is our culture ahead of our technology, or is our technology ahead of our culture? Do we have capacity for change, or are we running hot? Are people experimenting, or protecting themselves?

Those questions have observable answers. You can measure experimentation rates—how many people are trying new approaches in low-stakes contexts. You can measure trust signals—whether people surface problems early or hide them until they're crises. You can measure bandwidth—whether teams have slack to learn or whether every hour is already allocated.

When you can measure the gap between technology ambition and culture capacity, you stop arguing about tools and start making better decisions about pace, investment, and leadership priorities. The conversation shifts from "Should we use AI?" to "Do we have the conditions that would let AI be useful here?"

Here's the uncomfortable truth: if your culture is behind your technology, every new tool makes you more fragile. Each tool adds complexity the system can't absorb, which increases coordination costs, which burns people out, which reduces trust, which makes the next tool even harder to integrate. It's a vicious cycle.

Stop relying on vibes. Measure your culture's readiness for change with clear signals of experimentation, trust, and bandwidth. Then make strategic decisions based on what you find.

Start with outcomes, not tools. Pick one workflow end-to-end where AI could genuinely reduce friction or improve quality. Create a "practice lane"—not a performance lane—where teams can be bad at this before they're good at it. Give them permission to fail, time to learn, and leadership attention that rewards experimentation, not just results.

Invest in shared language so the team knows what "good" looks like in your context. "Good" doesn't mean "what the vendor demo showed." It means "what actually works here, given our constraints, our people, and our mission." Build that definition together.

Then measure culture capacity as seriously as you measure technology adoption. Track whether trust is increasing. Track whether people feel safe experimenting. Track whether teams are learning together or fragmenting into individuals who hoard knowledge.

Culture capacity isn't soft. It's the hardest variable to change and the most predictive of whether any transformation effort will stick.


The point: adoption is a tactic, culture capacity is the strategy

If your culture can't learn together, no tool will save you.

Adoption is a tactic. Culture capacity is the strategy. If your organization can learn together, you'll make AI useful—and you'll keep making it useful as the tools change again next month, and again next quarter. Organizations that can learn collectively are resilient to technological change because their capability isn't locked into specific tools. It's embedded in how they work together.

Those that can't will keep chasing the next shiny thing—and burning out in the process. They'll adopt and abandon, adopt and abandon, each time wondering why this tool didn't solve the problem either.

The real leadership challenge is not how fast you can adopt, but how well your organization can play together.

Think back to the orchestra. Harmony, not hardware, is what creates real progress. You can keep upgrading instruments, but if the musicians can't hear each other, can't trust the conductor, can't recover when someone makes a mistake, the music doesn't improve.

So shift your focus from tools to tempo. From adoption to capacity. Ask whether your team has the trust, clarity, and psychological safety needed to learn together. Ask whether your culture can absorb change at the speed technology demands. Ask whether people feel like they belong in the version of the organization that's emerging.

If the answer is no, that's not a failure. It's clarity. And clarity about the real problem is the first step toward solving it.

Audit your organization's culture capacity. Have the conversation with your leadership team. Share what you find—in the comments, with your board, with the people doing the work. Because the organizations that will thrive in the next decade aren't the ones with the best tools. They're the ones where people can play together.