A Key Skill Everyone in L&D Should Have: Asking the Right Question

April Edition of the "Learning Analytics Made Easy" Newsletter

Hi,

Welcome to the 4th edition of the "Learning Analytics Made Easy" newsletter. The one newsletter you need to start and build your Learning Analytics skills. Glad you're with us!

The inspiration for this newsletter came from the never-ending stream of updates and news on AI. In many of these, AI is presented as the solution to answer all your questions, including all your questions on learning analytics. Some even claim that with AI, you no longer need to do analytics or build any analytics skills yourself. AI can do all of that for you.

Not surprisingly, I do not agree. Not because I'm a fan of Learning Analytics, but because we work a lot with AI and know its potential and limitations very well. And there are serious limitations in using AI for analytics. We're actually running an experiment on this with several major AI solutions and will report on it in an upcoming newsletter. But this month I want to share my thoughts and ideas on one skill that is essential even with the best AI money can buy:

The art and skill of asking the right question!

So this newsletter is all about helping you to ask the right questions.

Here we go!

Peter

Why asking a good question is key! 

The reason why asking a good question is essential in learning analytics is simple:

Messy questions lead to messy analytics outcomes and meaningless insights…

Imagine a business leader coming to you asking: “Could you please let me know how much time my team has spent on training so far this year?”

You might be tempted to go ahead, pull data from your HR and LMS systems, and get going on the calculations and insights. Or, if you have a dashboard, pull the metrics from there.

Time spent on learning is, after all, a fairly simple metric to calculate.

But there might be a catch, or even several:

  • First of all, there is a lack of clarity: what does the questioner mean by 'time spent'? And which trainings should be considered?

  • Also, what does 'so far this year' mean? Is it the calendar year or the fiscal year?

  • And who is considered to be in 'my team'? Only direct reports or the whole downstream organization? Should you include contractors?

  • Next there is a lack of context: what is the goal of this data request? Is it compliance, or costs, or skills?

  • Lastly, the question is not actionable. What action or decision is behind the question?

So what seems to be a fairly straightforward question might not be so straightforward after all.
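To make this concrete, here is a minimal sketch in Python (the LMS export, the column names and the numbers are all my own illustrative assumptions) showing how two perfectly reasonable readings of the same question produce very different answers:

```python
import pandas as pd

# Hypothetical LMS export: one row per completed learning activity.
records = pd.DataFrame({
    "learner":          ["Ana", "Ben", "Caro", "Dev"],
    "is_direct_report": [True, True, False, False],  # Caro and Dev sit deeper in the org
    "completed_on":     pd.to_datetime(["2024-01-15", "2024-03-02",
                                        "2024-02-10", "2023-11-20"]),
    "duration_hours":   [2.0, 1.5, 3.0, 4.0],
})

# Reading 1: calendar year to date, direct reports only.
in_calendar_year = records["completed_on"] >= "2024-01-01"
reading_1 = records.loc[in_calendar_year & records["is_direct_report"], "duration_hours"].sum()

# Reading 2: fiscal year starting 1 November, the whole downstream organization.
in_fiscal_year = records["completed_on"] >= "2023-11-01"
reading_2 = records.loc[in_fiscal_year, "duration_hours"].sum()

print(reading_1, reading_2)  # 3.5 vs 10.5 hours: same data, same question, 3x apart
```

Neither reading is wrong; they simply answer different questions. Only the requestor can tell you which one they meant.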

Doing learning analytics based on a poorly phrased question is, in essence, the same as building a learning program without a thorough learning needs analysis…

You will be…

Going around in circles…

When you start doing analytics based on a poorly phrased question, the chance is high that you come up with an answer your stakeholder is not looking for. The answer could be completely wrong, or it could cover only part of what they are looking for.

Your stakeholder or customer might have a completely different definition in mind, might be looking at a different slice of the data or a different timeframe, or might simply be after an actionable insight, thinking 'total learning hours' will provide one, while it actually does not.

And imagine you run and analyze the data, then come back to present your results, only to find out it's not what was required. You then have to go back and start all over again. In a worst-case scenario, you might need to go back several times!

This all leads to time being wasted by you and everybody else involved. It also increases dissatisfaction and frustration in the person asking the question, because it looks like you are not really sure what you are talking about! And if you want to be taken seriously as an L&D professional, and want your L&D team to be a strategic partner, the art and skill of asking the right question is vital.

Examples of poorly phrased questions

Unfortunately, it's much easier to ask poor questions than really good ones. Here are some examples you might recognize:

Is our training good?

This question is too vague and subjective. It does not specify what aspect of the training is being questioned (e.g., content, delivery, engagement, impact?) or what 'good' looks like.

How many people completed the training?

A classic one! And arguably the most-asked question in the history of L&D. Training completions by themselves mean very little; sometimes they mean nothing at all. Still, it's the one I see most on slides presenting learning results: "we have 312 completions!". OK, nice. But what does it mean? And what action can I take with that information? Even if you have a specific completion target in mind, this question can be phrased much better ("how many completions do we have against the target, and will we achieve the target in the time we have set out?" is a good improvement!). But seriously…

Is our training expensive?

This question is subjective and relative. "Expensive" is not quantifiable without a point of reference, such as cost per learning hour or a comparison to industry standards. And it depends on the return on the investment in L&D… training programs with high returns might be allowed to cost more, but does that make them expensive?

Do people use the LXP?

This one gets asked a lot at companies that recently implemented a new learning platform. And it could just as easily be rephrased as "Do people use our new AI recommendation engine?" The issues with both questions are the same. Even if only one person ever logged in, the answer is "yes". But more importantly, what does "use" really mean? Does it refer to actual learning? Or just browsing around? And what qualifies as use? A one-off visit or completion, or a visit every week? In addition, the question is not actionable. It suggests action is desired when usage is lower than expected or planned, but it's much better to make this explicit in the question rather than assume it.

This brings us to one of the most common pitfalls in any question asked, not just in L&D, but in general: there are too many hidden assumptions in each question. And if you follow my writing, you know how I feel about assumptions!

People asking for completions assume completions = learning

People asking for costs assume expensive = bad, cheap = good

People asking for engagement data assume high engagement = better learning transfer

and so on.

But poorly phrased questions have more bad traits. They lack specificity; they include (too much) ambiguity in terms of training scope, audience and time; they almost never include sufficient context to tell you why the information is important and what people want to do with it; and they are not actionable.

That is why we have made ‘question analysis’ the first step of our learning analytics process: it’s the vital first step that enables you to make sure your question is solid as a rock.

And that is why we have introduced a framework for asking solid (analytics) questions: SAMCA!

A framework for good questions: SAMCA 

Good analytics questions are Specific, Aligned, Measurable, Clear and Actionable: SAMCA!

Specific: The question should have focus and sufficient detail, and should avoid all ambiguity. This means that the analytics question should clearly define the scope, the context and, when possible, the variables that are required. "How can we improve our training?" is not a very specific question. A better alternative is: "What specific changes can we make to our onboarding training program to reduce new hire turnover within the first six months?"

Aligned: The question should align with the strategic goals and objectives of the organization. This ensures that the insights derived from the analysis will be relevant and supportive of decision-making processes that drive the organization's priorities. “How many employees have completed the AI training program?” is an interesting question to ask.

But the company's strategic goal is not AI training completion, it's driving business value through AI! So a better alternative would be: "How does the AI skills training program support our strategic objective of integrating artificial intelligence into 50% of our business processes by the end of next year?"

Measurable: The question should involve elements that can be quantified or assessed through data. This implies having access to reliable data sources and metrics that can be used to evaluate the outcomes. "Are our learners sufficiently engaged?" is again an interesting question, but really, really hard to measure! Because what is engagement, really? And how do you measure it?

A better question would be something like this: "Given that we agree time spent on voluntary learning is a good proxy for learning engagement (this is an assumption…), can we correlate learning engagement with business performance to see whether there is a positive correlation, and how strong it is across the organization?" (A small code sketch of this correlation follows below, after the last criterion.)

Clear: The question should be easy to understand and free of jargon or complex language. Clarity ensures that everyone involved in the analysis and decision-making process has a shared understanding of the question and its implications. "What is the delta in performance metrics post-implementation of our AI-enhanced, adaptive learning paradigms vis-à-vis the legacy pedagogical frameworks within the enterprise knowledge ecosystem?" is a great example that ChatGPT brought up, and I love it! But it contains so much jargon that even I do not really know what the question is about!

Actionable: The question should lead to insights that can directly inform decision-making and prompt specific actions. It implies that the results of the analysis will provide clear recommendations or steps that can be taken. The question "what should we do about low engagement in training?" is not even that bad. It suggests action. But we can get much better results if we bring in a real target, as in this example: "What specific strategies can we implement to increase employee participation in the quarterly professional development seminars by 20% in the next six months?"

A crucial element of making a question actionable is to link it very explicitly to one or more specific targets or objectives.
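A quick aside on the Measurable criterion above: once engagement is pinned down to an agreed proxy, computing the correlation is the easy part. Here is a minimal sketch, assuming hypothetical per-team aggregates of voluntary learning hours and a business performance score:

```python
from scipy.stats import pearsonr

# Hypothetical per-team aggregates: voluntary learning hours and a business KPI.
voluntary_learning_hours = [12.0, 5.5, 20.0, 8.0, 15.5, 3.0]
performance_scores       = [78, 64, 91, 70, 85, 60]

# Pearson's r measures the strength and direction of the linear association.
r, p_value = pearsonr(voluntary_learning_hours, performance_scores)
print(f"r = {r:.2f}, p = {p_value:.3f}")

# Caveat: even a strong positive r is correlation, not causation. It justifies
# a deeper look (e.g. controlling for team size or tenure), not a victory lap.
```

The hard part was never the computation; it was agreeing on the proxy, which is exactly what the improved question does.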

A special type of poorly phrased analytics question can be grouped under the label "Vanity Metrics".

Vanity metrics are metrics that appear impressive ("we have trained a thousand employees"), but they do not provide any information that can be used to improve your strategy or make informed decisions, they are not connected to your core objectives, and they can be easily manipulated, giving you a false sense of achievement.

Famous examples of L&D vanity metrics are:

Course Enrolments

Completions

Learning Hours

Happy Sheets

Course Ratings

Now, I am not saying that these metrics present no value, just that their value is limited, especially without context. Sometimes they are all we have. And they are for sure a good place to start with learning analytics. But we should at some point move beyond them… way beyond them…

What Vanity Metrics do you use most?

In L&D we all use vanity metrics, sometimes simply because they are the only ones available. As long as we realize they are vanity metrics, it's ok to use them. What vanity metrics do you use most?


How to formulate a good question?

A challenge I hear very often is that our customers and stakeholders do not always know what data and insights they want and need. How, then, are we supposed to arrive at a solid analytics question?

The answer is actually similar to the answer for doing a solid learning needs analysis. We all know that a poorly executed learning needs analysis seldom leads to learning transfer, let alone business impact. Without a thorough understanding of the business context, its challenges, the audience, and evidence-based learning design, you might still end up with a learning program, and employees might still participate, but you leave transfer and impact very much to chance.

With learning analytics it's not very different. Without spending a bit of time finding out what exactly is being asked and why, you could still produce data and insights, but you simply won't be sure they are helpful.

Here’s how you turn a poorly phrased question into a solid one!

Use the question clarification process

The question clarification process is a 4- or 5-step process consisting of the following activities:

1. Review the question: Make sure that you fully understand the question that is asked, and already consider areas of unclarity or ambiguity. This seems a logical and somewhat superfluous step, but too often it is skipped.

2. Identify the intended action behind the question: Try to figure out what the requestor is trying to do with the data and insights. What is the intended action or decision? This immediately helps you with two things. First, you can push back when there is no real action behind the question and you run the risk of spending time on vanity metrics. Secondly, more often than you might realize, people ask the wrong question for the action they have in mind. The sooner you realize this, the earlier you can redirect the requestor to the right question!

3. Ask clarifying questions: Depending on how familiar you are with the person asking the question and its context, and how well you have been able to uncover the intended action behind the question, you can now start to eliminate all unclarities and ambiguities by asking clarifying questions on things like (a) definitions, (b) context, (c) audience (for whom is the answer intended?), (d) timelines, (e) recurrence, (f) metrics and variables (if and when available), and (g) targets or benchmarks against which to compare the actuals. (A reusable template for this step is sketched after this list.)

4. Rephrase and validate the question: The final step is to rephrase the question and align the rephrased question with your stakeholder. For more complicated questions involving multiple stakeholders, you might want to split this into two separate steps. It's crucial that whoever is requesting data and insights also understands and agrees with the new and improved question you have formulated.
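If you run this process often, it can help to keep the clarifying dimensions from step 3 in a reusable intake template. Here is a minimal sketch in Python; the structure and wording are just one possible way to encode the dimensions listed above:

```python
# Clarifying dimensions from step 3, encoded as reusable intake prompts.
CLARIFYING_DIMENSIONS = {
    "definitions": "What exactly do key terms like 'time spent' or 'use' mean?",
    "context":     "What is the business situation behind this request?",
    "audience":    "For whom is the answer intended, and who is in scope?",
    "timelines":   "Which period: calendar year, fiscal year, a rolling window?",
    "recurrence":  "Is this a one-off answer or a recurring report?",
    "metrics":     "Which metrics and variables are available and acceptable?",
    "targets":     "Which target or benchmark should the actuals be compared against?",
}

def intake_checklist(question: str) -> str:
    """Render a clarification checklist to walk through with the requestor."""
    lines = [f"Question under review: {question}", ""]
    lines += [f"- {dim}: {prompt}" for dim, prompt in CLARIFYING_DIMENSIONS.items()]
    return "\n".join(lines)

print(intake_checklist("How much time has my team spent on training so far this year?"))
```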

Use proven questioning techniques

A lot of research has been done on questioning techniques, and libraries are filled with books on the topic. And while I do want to mention that the art of getting to the root of a question, problem or challenge is a critical skill every L&D professional should have, this newsletter is way too short to discuss all the techniques out there.

I do want to highlight two of my favorites:

  • 3x why (or the five Why’s and a How)

  • Socratic Questioning

The 3x why method is the seemingly simple method of asking probing questions starting with why. The five Why's and a How is a more elaborate version of this method. The key intent is that with every question asked, you get more specific and closer to the root of the issue.

Initial Question: "Did our leadership training have an impact?"

1st Why: Why do you want to know if the leadership training had an impact?
Because we need to ensure it improved the way leaders manage their teams.

2nd Why: Why is it important to improve how leaders manage their teams?
Because effective leadership directly influences team performance, engagement, and decision-making.

3rd Why: Why is that critical for our organization right now?
Because we’re growing quickly and can’t afford inconsistency in how teams are led.

4th Why: Why does inconsistent leadership create problems during growth?
Because it leads to slower project execution, misalignment across teams, and higher employee turnover.

5th Why: Why do we need to address those problems urgently?
Because they are already impacting our ability to scale operations and meet strategic targets.

And the How: How can we measure whether the leadership training helped reduce these business problems?
By tracking changes in team performance metrics (like project delivery time), comparing turnover rates in teams with trained vs. untrained leaders, and collecting post-training feedback from team members.
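That answer already hints at the analysis. Here is a minimal sketch of the second element, comparing turnover in teams with trained vs. untrained leaders (the table, the flag and the numbers are all hypothetical):

```python
import pandas as pd

# Hypothetical per-team data: did the team lead attend the program, and team turnover.
teams = pd.DataFrame({
    "leader_trained": [True, True, True, False, False, False],
    "turnover_rate":  [0.08, 0.11, 0.09, 0.16, 0.14, 0.18],  # annualized fraction
})

# Average turnover for teams led by participants vs. non-participants.
print(teams.groupby("leader_trained")["turnover_rate"].mean())

# A gap here is a signal, not proof: teams were not randomly assigned to
# training, so confounders (team size, tenure, manager seniority) still apply.
```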

Resulting Refined Question

"How has the leadership development program influenced key business outcomes—such as project delivery timelines, team retention, and leadership effectiveness ratings—in teams led by participants compared to those led by non-participants?"

Socratic Questioning

Socratic Questioning is a different approach that gives you a bit more control over the process and helps identify assumptions and gaps in reasoning. Applied to the same starting point, it moves through six stages:

Initial question: "Did our leadership training have an impact?"

  1. Clarifying concepts

  2. Probing assumptions

  3. Exploring reasoning and evidence

  4. Considering alternatives

  5. Examining implications

  6. Encouraging reflection

No matter which method you choose… investing the time and effort to go through the question analysis will pay for itself many times over!

A good thing then that you read my newsletter!

How AI can help

While I do not agree that AI can do all of your learning analytics for you, I do think AI can be very helpful. AI can certainly help you formulate solid learning analytics questions.

Here’s how AI can help:

  1. Identifying areas of ambiguity and missing critical elements in a question, so that you can focus your efforts on getting clarity on those elements.

  2. Supporting you in applying clarification frameworks like the 3x why, the five Why's and a How, and the Socratic Questioning technique. You can even simulate these with AI, exploring possible scenarios and different directions (see the sketch after this list).

  3. Exploring possible actions behind the question. Especially when you provide sufficient context, AI can help you consider all the actions that could be behind the question being asked. This will help you in your conversations with your stakeholders, especially when AI brings up possible actions in a business context you are less familiar with.

  4. Providing several options for refined questions that focus on SAMCA, and especially on actionability. This enables you to choose the best options and validate them with your stakeholder.

  5. Identifying possible metrics and KPIs to explore that represent the question and can be used for data storytelling. This is extremely useful in preparing the next step of the process: data collection. And you might sometimes find that the KPI that works best (business improvement!) is not realistic due to a lack of data, in which case AI can help find alternatives.
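To make points 2 and 4 a little more tangible, here is a minimal sketch using the OpenAI Python SDK (the model name, the prompt wording and the draft question are my own assumptions; any capable LLM and client library will do):

```python
from openai import OpenAI  # assumes the official `openai` package and an API key in the environment

client = OpenAI()
draft_question = "Do people use the LXP?"  # hypothetical draft question

prompt = (
    "You are helping an L&D analyst refine an analytics question.\n"
    f"Draft question: {draft_question}\n\n"
    "1. List the ambiguities and hidden assumptions in the draft.\n"
    "2. Suggest the likely actions or decisions behind the question.\n"
    "3. Propose three refined versions that are Specific, Aligned, "
    "Measurable, Clear and Actionable (SAMCA)."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: substitute whatever model your account offers
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

Treat the output as raw material for your stakeholder conversation, not as the final question; the validation step still belongs to you.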

Thanks for joining me on this fascinating journey into L&D Data, Analytics and AI! Drop me a note if you have any specific questions or ideas for a next edition, or if you want to have a more in-depth conversation on analytics.

Best,

Peter Meerman