I want to start with an honest observation.

Most L&D teams I speak with know they should be doing more with data. Many have invested in analytics tools, built dashboards, and hired people with data skills. And yet, when I ask the question that really matters, "does your analytics work actually change decisions?", the answer is almost always some version of: not really.

Completion rates go up and down. Satisfaction scores hover around 4.2 out of 5. Hours of learning consumed tick upward every quarter. Reports get sent. Presentations get made. And then... not much changes.

And while most people in L&D treat learning analytics as a data and technology challenge, I see it differently: it is a mindset challenge. And it starts with a question that most analytics initiatives never properly answer:

What is learning analytics actually for?

The wrong answer, and why we keep giving it

The most common implicit answer to that question is: learning analytics exists to show that L&D is doing things.

That answer produces vanity metrics. Metrics that document activity without illuminating anything. Metrics that measure what is easy to count rather than what matters. Metrics that satisfy a reporting requirement without informing a single meaningful decision.

To be fair, the pressure to produce these metrics is real. L&D has always struggled to demonstrate its value in language that business leaders find compelling. When meaningful proof is hard to find, the temptation is to fill the gap with numbers that at least look credible. Completion rates. Learner satisfaction. Seat time. Catalogue size.

Example of an L&D Dashboard with mostly vanity metrics

The problem is not that these numbers are wrong or useless. Many of them are in fact essential. The problem is that they are disconnected from anything a decision-maker can act on. And over time, they quietly teach the organization that learning analytics is a reporting exercise, not a strategic pillar.

That reputation is very hard to shake once it has formed. And it is why so many L&D teams find themselves in a frustrating position: investing in analytics capability while sensing, somewhere, that it is not quite landing the way it should.

The right answer: analytics exists to support decisions

Let me be direct about what I believe learning analytics is for.

Learning analytics exists to support better decisions.

Full stop.

Not to report. Not to demonstrate activity. Not to justify existence. To support the decisions that L&D teams and the organization need to make about people, about skills, about performance, about where to invest and where to stop.

This sounds obvious when stated plainly. But it has radical implications for how analytics work is designed, what gets measured, and who the real audience for that work actually is.

The moment you make decisions the starting point, the entire logic of analytics reverses. Instead of asking "what data do we have and what can we show with it?", you start by asking "what decisions need to be made, by whom, and what information would genuinely help them make those decisions better?"

That is a completely different conversation. And it is one that immediately forces L&D out of the data room and into a strategic dialogue with the business.

What decision intelligence means

The discipline that formalizes this idea is called decision intelligence.

Decision intelligence is the systematic use of data, analytics, AI and contextual insight to design, support, and improve decision-making, so that better, faster, and more consistent decisions are made across the organization.

It was popularized by Cassie Kozyrkov, Google's first Chief Decision Scientist, whose central argument is deceptively simple: organizations invest enormous resources in data and AI, but almost nothing in the quality of the decisions those tools are supposed to support. The technology is only as valuable as the decisions it informs.

That argument should land very directly for anyone working in L&D.

Because if you look closely at what decision intelligence is trying to do (equip people with the right information, in the right context, at the right moment, to make the best possible choice), it sounds remarkably like what Learning and Development has always been trying to do. We have just been doing it at the level of knowledge and skills, not at the level of decision support infrastructure.

The opportunity is to connect those two things. To recognize that L&D's expertise in understanding how people learn, how performance is built, and what organizations need to develop is exactly the foundation required to build genuine decision intelligence for the business.

And if we do not make that connection ourselves, someone else will.

Decisions are not made in a vacuum

One thing I want to be precise about, because it matters for everything that follows, is that decision intelligence is not a buzzword for "more dashboards." It takes seriously something that most analytics tools quietly ignore: decisions are genuinely hard.

Think about what a manager faces when making a people decision. Incomplete data. Competing priorities. Time pressure. Personal biases they may not even be aware of. Organizational politics. And the uncomfortable reality that the data available to them is usually a filtered, delayed, and imperfect reflection of what actually happened.

These are not edge cases. They are the normal conditions under which most consequential decisions in organizations get made. Decision intelligence does not pretend otherwise. It asks: given this reality, how do we design the information, the context, and the analytical support that helps people decide as well as possible?

That question has no purely technical answer. It requires understanding how people think, how organizations work, and how learning and capability connect to outcomes. Which is, again, exactly what L&D is supposed to be good at.

Three questions that reframe everything

If you want to start shifting your analytics practice toward decision intelligence, the entry point is not a new tool or a new data source. It is a different set of questions.

The first question is: who are the decision-makers we are trying to support? Not the stakeholders who receive our reports, but the people who actually make choices based on what we give them. That could be an HR leadership team deciding where to invest in skills. It could be line managers deciding how to support a team member's development. It could be a business leader deciding whether to build or buy a capability. Who are they, and what do they actually need?

The second question is: what decisions do these people currently make without reliable evidence? Where are they relying on gut feel, seniority, or convenience data because something better does not exist? These are the highest-value gaps for learning analytics to fill. Not because the data would be easy to produce, but because the decision actually matters.

The third question is: what would change if they had better information? If the answer is "probably not much," the decision is not the right target. If the answer is "they would invest differently, act faster, or avoid a costly mistake," you have found something worth building toward.

These three questions do not require any data. They require clarity of purpose. And without that clarity, even the most sophisticated analytics infrastructure will produce the same result: dashboards that get opened on Monday and forgotten by Thursday.

Why AI makes this urgent

I want to close with a thought that I will develop much more fully later in this series, because it points to why decision intelligence matters beyond the analytics conversation itself.

Organizations are increasingly making decisions with AI. Not dramatically or all at once, but incrementally and continuously, in ways that are already reshaping how work gets done and how choices get made. That shift has profound implications for L&D, and for the role that decision intelligence plays in an AI-driven organization.

I will come back to this in the fourth newsletter in this series, because it deserves its own space. For now, the important thing to register is this: building genuine decision intelligence is not just about making L&D analytics more useful today. It is about positioning L&D for a future in which the quality of decisions, and the quality of what informs them, becomes one of the most strategically important capabilities an organization can have.

This connects directly to an argument I made in the vision series on reinventing L&D: one of the most important and underappreciated roles that L&D can claim in an AI-driven organization is as the guardian of decision intelligence. Not as a technical watchdog, but as the function with the deepest understanding of how people learn, how AI learns, and what it means for both to be learning from the right things.

Read the series on reinventing L&D to understand how L&D, Performance, Decision Intelligence and AI all link together

In-house large language models learn from company data: policies, procedures, knowledge bases, performance records. The quality of what they learn from determines the quality of the decisions they support. That is a learning problem. And it is squarely in L&D's territory.

What comes next

This newsletter is the first in a series of four on decision intelligence for Learning and Development.

In the next edition, I will explore something that often surprises L&D teams: most organizations already have far more decision-relevant data than they realize. The challenge is not starting from scratch. It is making an honest inventory of what exists, connecting it to the decisions that matter, and being clear-eyed about the gaps that remain.

The third edition introduces the concept of designing for data, the idea that technology, processes, learning programs, and the way we structure roles should also be designed based on the data we want them to generate.

And the fourth edition brings it all together by exploring what happens when AI enters the picture fully and why L&D teams that have built genuine decision intelligence are the ones best positioned to lead in an AI-driven organization, not just adapt to it.

Taken together, these four newsletters build a practical framework for L&D teams who want analytics that genuinely delivers value, not because it looks impressive on a slide, but because it changes the quality of decisions the organization makes.

That is what analytics is for. And that is what decision intelligence, done well, makes possible.

Peter Meerman SLT Consulting — Learning Analytics Made Easy
