AI signal and noise: The first wave in learning and enablement

AI's first wave is making the work we already do faster. The next wave will rewrite the work itself.

AI is already becoming a coworker for people in the enablement space. At Learning Technologies 2026, nearly every participant in each AI session agreed on one thing: you cannot be a viable partner to your stakeholders without using AI.

Fosway data backs the sentiment: nearly half of L&D work could be done by AI within the next few years, and 80% of teams plan to use AI to resource projects this year. But what does that mean?

The question isn’t whether AI will be adopted, but whether what we’re using it for is solving the right problems and what new problems it could solve next.

AI started with content, but that’s not the end

To answer that, let’s take a look at a familiar content creation workflow. Most enablement and L&D teams operate inside a process that hasn’t fundamentally changed in decades, but with the introduction of AI content tools, the workflow is changing.

A typical learning content lifecycle workflow for enablement

Let’s break this down.

  1. Knowledge lives in scattered sources: SharePoint, Confluence, Notion, Google Drive, Seismic, the heads of SMEs, half a dozen Slack channels
  2. When a business need surfaces, an instructional designer is asked to build something (usually a course or other content asset). The trigger could be a product launch, a sales or skill gap, a product or strategy update, etc.
  3. They compile knowledge from those sources, design the experience, build it in Rise or another authoring tool, and hand it to an admin
  4. The admin uploads it to the LMS as a SCORM package

But the content lifecycle is not done…

  • From there, learners engage (or don’t)
  • Feedback comes back through course evaluations
  • Analytics report pass/fail
  • Completions are used to track compliance (notably not outcomes)
  • When the source content changes, an update loop fires (eventually)
  • Updates are flagged for work in the backlog
  • The cycle restarts

The process is designed for maximum reach, but it targets no one’s specific needs. It’s not personalized. It takes weeks to build. It is hard to keep up-to-date. Feedback gets lost, forgotten, or buried under the next request. Analytics aren’t about business outcomes; instead they are stuck at pass/fail.

By the time learning reaches the learner, the moment of need has often passed, and forget about recall. No one is revisiting this content once it’s marked as complete.

Is AI here to save us?

Now look at where AI showed up first. In the middle of that diagram.

A typical learning content lifecycle workflow for enablement, now with AI

The first wave brought AI authoring tools, avatar generators, course assemblers, AI-generated assessments, AI writing assistants embedded in every authoring platform, and AI coaching tools like Yoodli that let learners practice and get real-time feedback before high-stakes moments. My team at Snowflake uses Yoodli, and I’m a fan of what they’re building.

There are three reasons why AI tools targeted content creation first:

  1. AI is good at generating content and matching patterns without full context. It can generate a first draft from a brief. It can turn a transcript into a course outline. It can produce passable assessments. None of this requires AI to understand your full knowledge base, your sales process, or your analytics stack. And that’s good, because it doesn’t have access to most of those anyway. Not yet.
  2. Content acceleration is the most obvious productivity win. Enablement and L&D leaders have been promised faster content production for decades, and ATD’s research is clear on why. Two-thirds of learning teams cite limited resources as their top barrier to effective learning, roughly 40% cite scope creep, and another 37% can’t get enough SME time to keep their materials accurate. These are exactly the bottlenecks AI content tools were built to dissolve.
  3. It fits inside the existing workflow. AI in content creation doesn’t require rewiring the process. The instructional designer’s role gets faster, not different. The LMS still distributes the content. Analytics still report pass/fail. Nothing else has to change.

ATD top barriers to L&D output, aligned with expected impact from AI

AI has sped up the reaction, but the real problem remains: the cycle is still reactive. Still generalized. Still pass/fail.

AI made the same shape of work move faster, but the shape itself was the constraint. As Fosway flagged, most AI activity is concentrated on the production side of learning rather than on outcomes.

If you’re measuring AI by content speed, you’re measuring fit to the old workflow, not outcomes.

The first wave is real. The next wave is the work.

This is the first wave of AI in enablement: making the existing shape of work faster. Real wins. Real savings. Not yet transformation.

The next wave is rewiring the shape of work itself. A connective membrane between knowledge, skills, and signals. The central brain that turns AI from accelerant into the operating layer. I’ll cover how this is starting to take shape in the next post.

This is part two of a series on what I took away from Learning Technologies 2026. Part three: the central brain. What the connective layer between knowledge, skills, and outcomes actually looks like, with the architecture that makes it real.