
AI is advancing. Every month brings a fresh set of tools, updates, and integrations aimed at transforming industries both large and small. The education sector, especially digital learning, has welcomed this shift with open arms. Automated content generation, adaptive learning paths, and data-driven assessments are now part of the standard eLearning toolkit. What was once manual is now increasingly machine-led.
Within Learning Experience Design (LXD), these technologies are changing how courses are built and delivered. But something vital risks being lost in the process. While AI excels at identifying patterns and making predictions, it struggles to truly understand the human nuances within your audience (their demographics, skill levels, specializations, cultural blind spots, etc.). And since any good course should center on the learner, that is a limitation we need to be highly aware of.
Who You Design For
Designing for learning is not only about content. It’s about people. An effective course meets learners where they are, not just in terms of knowledge gaps but in terms of emotions, motivations, and lived experience. Human-centered LXD listens before it speaks. It adapts not only to what a learner knows but to how a learner feels. This is where AI stumbles.
Empathy is difficult to program. While some AI models can detect sentiment or tone, their understanding remains shallow. A frustrated learner might be flagged for needing easier content. But what if the frustration comes from life outside the course? What if it stems from cultural dissonance or internalized anxiety about learning itself?
These are not problems AI is equipped to solve. They require sensitivity, observation, and context. These are things that live in the subtle space between words and behaviors.
The Voice of Your Training
There is also the question of voice. Many AI-powered digital learning tools default to a tone that feels artificial or impersonal. The phrasing might be technically correct, but something feels off, and learners tend to pick up on it.
When the voice behind digital learning sounds robotic or overly optimized, distance can develop between the learner and their training. People do not respond to information alone. The tone, style, and humanity of a piece can be just as critical as the content itself.
Considerations of Bias in Digital Learning
Bias presents another difficulty. AI reflects the data that it’s trained on. If a model has only been exposed to a narrow subset of learners, its outputs will echo that limitation. This can lead to course material that unintentionally excludes or misrepresents diverse learners.
While humans are also prone to bias, they have the ability and responsibility to question and adjust for it. AI does not. It repeats what it knows, regardless of context.
AI Is Not Inherently Good or Bad
This is not an argument against AI in digital learning. When used wisely, these tools can bring enormous value. They can personalize pacing, automate tasks, and offer insights that improve instructional design. But caution is required.
The LXD community must remember what cannot be automated: curiosity, lived experience, empathy. Only a person can connect one learner's story to a broader learning journey. These elements must be protected, even as technology evolves.
It’s tempting to rush. Tools that promise faster development or increased efficiency are attractive, especially under tight timelines. But meaningful learning takes time. The best courses are often the ones that feel less like systems and more like conversations. AI can assist in building those conversations, but it cannot yet, and perhaps should not ever, lead them.
Designers and thought leaders play a crucial role in maintaining this balance. They must ask hard questions, not only about what AI can do but about what it should do. They must listen to feedback from learners and remain attentive to moments where something feels off.
When used without reflection, even the best AI tools can create experiences that are hollow at their core.
Takeaways
The ultimate goal of training should never be speed. It shouldn’t be scale for the sake of scale. At its best, training aims to help people grow and facilitate behavior change. That purpose must remain central, even in an age of automation. The tools may change, but the heart of learning stays the same.
AI can support and augment. AI can sometimes even surprise. But it cannot yet care. And caring is where truly memorable learning begins. Contact us to co-create truly impactful digital learning for your organization.