
Compliance elearning that works — Data-driven design

Blog posts | 19.10.2021

Compliance elearning

Helping you find your North Star

How do you avoid making the same mistake twice? By making it once and learning from it. Data analytics can provide valuable insights into the direction you should be taking your compliance-based learning.

Article no. 9 of 10 - The Kineo view on: Compliance elearning


You may have heard the term ‘data-driven’ in relation to learning design. A data-driven approach helps you to understand how people are engaging with your content, and the insights from this help you to build on what’s working and change what isn’t. To look at it in navigational terms, it can tell you:

  • Where you’ve been
  • Where you are now
  • Where to go next

Not only that, but gathering analytics data also allows you to optimize a specific learning experience for your audience(s), or a particular design approach or technology. Crucially, in the compliance space it allows you to gather evidence that your audience will change their behavior as a result of completing the learning.

In this article, we’ll explore what data-driven design is and how you can use it to make evidence-based decisions in your drive for change. 

 

So, what is it?

Data-driven design considers how you can use ongoing digital experiences to evidence engagement and attitude change, and consequently predict a probable change in behavior. Given the vast amount of money that organizations spend on compliance-based learning each year, collecting and analyzing data is a critical part of ensuring that the learning is doing its job.

And we’re talking big bucks. Organizations spend millions of dollars a year on compliance. Certain highly regulated sectors, such as financial services, spend tens or even hundreds of millions annually. But how do they know that it’s worth it?

 

Measuring effectiveness

Many organizations fall into the trap of providing training that feels cumbersome and mundane. This is frustrating for learners, who can feel like it’s all a tick-box exercise, and it’s hard on leaders, who don’t know whether it’s working or worth the investment.

Even the regulators are unsure of its effectiveness. They are increasingly looking for assurance that an initiative is working, beyond lists of completion rates, pass marks, self-attestations or the number of hours employees have spent on the training.

Effective measurement tools can give you much greater confidence than this. You may have a general idea of the direction you want to take your compliance-based learning, and a sense of what your end destination looks like, but what is the evidence for this? By analyzing data, you can directly measure things like usage trends (e.g. average time spent on learning) and changes in pre- and post-course responses. But you can also use data to infer other critical information, such as knowledge acquisition. Provided you have a well-defined set of business criteria and access to business data, you can also use it to help calculate ROI.
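
At its simplest, that ROI calculation looks like the minimal TypeScript sketch below. The figures are entirely hypothetical placeholders; your own business criteria and cost data would drive the real numbers.

```typescript
// Classic ROI formula: net benefit as a percentage of cost.
// All figures below are hypothetical placeholders.
function trainingRoi(monetizedBenefit: number, totalCost: number): number {
  return ((monetizedBenefit - totalCost) / totalCost) * 100;
}

// e.g. a compliance program costing $200k that avoids an estimated
// $350k in incident and remediation costs:
console.log(trainingRoi(350_000, 200_000)); // 75 (i.e. a 75% return)
```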


Digital learning measurement has evolved

For many years, the way organizations measure their digital learning has been dictated by the constraints of learning management systems and the SCORM specification. SCORM is useful in that it allows you to capture an important collection of information (think: completion status, seat time, course pass mark). But this data is limited; it will tell you whether an individual has completed the learning, but almost nothing else.
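
To make that constraint concrete, here is a sketch (in TypeScript) of roughly everything a SCORM 1.2 course reports back through the cmi.core data model. The API object is supplied by the LMS at runtime:

```typescript
// The SCORM 1.2 runtime API, provided by the LMS to the course window.
declare const API: {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
};

API.LMSInitialize("");
API.LMSSetValue("cmi.core.lesson_status", "completed"); // completion status
API.LMSSetValue("cmi.core.score.raw", "85");            // course score / pass mark
API.LMSSetValue("cmi.core.session_time", "0000:23:10"); // seat time (hhhh:mm:ss)
API.LMSCommit("");
API.LMSFinish("");
```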

xAPI was devised as a more modern and flexible successor to SCORM, but alongside its high cost of entry (many LMSs still do not support xAPI, leaving organizations having to acquire an additional LRS to integrate with their existing learning system), it suffers from the same limitation: the data is focused on individual attainments. The flexibility of xAPI allows you to gather many more data points, but because statements are structured around individual attainments for individual learners, they are difficult to aggregate.
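
For comparison, a representative xAPI statement might look like the sketch below (the URLs are placeholders). Notice how everything hangs off one named actor, which is exactly why audience-level aggregation takes extra work:

```typescript
// A single xAPI statement: actor + verb + object (+ optional result).
// It would be POSTed to a Learning Record Store (LRS).
const statement = {
  actor: { name: "A Learner", mbox: "mailto:learner@example.com" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed",
    display: { "en-US": "completed" },
  },
  object: {
    id: "https://example.com/courses/anti-bribery",
    definition: { name: { "en-US": "Anti-bribery essentials" } },
  },
  result: { score: { scaled: 0.85 }, completion: true, duration: "PT23M10S" },
};
```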

Data services and analytics tools (such as Kineo’s Didactics and Google Analytics) capture a different collection of data that allows you to do a deeper analysis: not of the individual, but of the audience(s) as a whole.

Didactics gathers anonymized usage data that wouldn’t be easily achievable via SCORM, if at all. You can use this to gather insights that will help you to design better learning experiences. Here are some examples of what you can monitor:

 

The total number of people accessing the content

What’s resonating with your audience? What’s gaining traction? This can give you the confidence to design similar resources in the future. Or, if it’s been consistently skipped, give you justification to find out why: is it because it isn’t relevant or accessible, or perhaps the complexity or duration is wrong?  

 

The number of attempts

How many attempts is your audience typically taking to complete a resource? If it’s a quiz or activity, this can help you gauge how complex it is. It might also imply your resource is a popular one to return to, or conversely, that it is too long to complete in one sitting. 

 

Length of time spent on completion

How long do people typically spend completing a resource? This data can be used as a guide for the length of content you develop in the future for optimal engagement levels, as well as help you to identify the ‘sweet spot’ between time spent and score achieved.

How many times is a resource being completed across multiple visits? Is it being used as a just-in-time performance support aid or not?

You can also sense-check that the suggested duration times you give learners are accurate.
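
Pulling the last three metrics together (unique users, attempts and time to complete), here is a minimal TypeScript sketch of how they might be derived from anonymized usage events. The record shape is an assumption for illustration, not Didactics’ actual schema:

```typescript
// Hypothetical anonymized usage event (for illustration only).
interface UsageEvent {
  sessionId: string;                // anonymous session identifier
  resourceId: string;
  attempts: number;
  minutesToComplete: number | null; // null if never completed
}

function summarizeUsage(events: UsageEvent[], resourceId: string) {
  const forResource = events.filter((e) => e.resourceId === resourceId);
  const completed = forResource.filter((e) => e.minutesToComplete !== null);
  const avg = (xs: number[]) =>
    xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : 0;
  return {
    uniqueSessions: new Set(forResource.map((e) => e.sessionId)).size,
    avgAttempts: avg(forResource.map((e) => e.attempts)),
    avgMinutesToComplete: avg(completed.map((e) => e.minutesToComplete!)),
  };
}
```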

 

Pre vs. post assessment scores

What is the difference between the pre and post assessment scores – is your learning making a difference? What is the likelihood of behavior change based on pre and post questioning?

There are several strategies our designers employ to assess a learner’s current attitudes and behavior against each specific learning objective, and to identify whether these change as a result of completing the course content. Our evaluation tool records the responses and, in addition to showing visualizations for each question set, collates all responses within a single course and converts them into one overall impact rating for perception change.
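
As a simplified illustration of the idea (not the evaluation tool’s actual algorithm), pre/post pairs on a numeric attitude scale can be collapsed into a single rating like this:

```typescript
// Hypothetical pre/post response pair on a 1–5 attitude scale.
interface QuestionResponse {
  questionId: string;
  pre: number;  // response before the course
  post: number; // response after the course
}

// Average shift across all questions, normalized to the scale width,
// giving one overall perception-change rating in the range -1..1.
function impactRating(responses: QuestionResponse[], scaleMax = 5): number {
  if (responses.length === 0) return 0;
  const totalShift = responses.reduce((sum, r) => sum + (r.post - r.pre), 0);
  return totalShift / responses.length / (scaleMax - 1);
}
```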

 

Questions which are being repeatedly failed

Knowing which questions are repeatedly being failed may suggest they are poorly phrased or too difficult. It may also imply that the associated learning content isn’t detailed enough, or doesn’t impart the information effectively. More worryingly, it could also point to a deeper, endemic problem within your organization.
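
Spotting those questions is a straightforward aggregation. A sketch, again with a hypothetical record shape:

```typescript
// Hypothetical per-question attempt record.
interface QuestionAttempt {
  questionId: string;
  passed: boolean;
}

// Flag questions failed more often than a threshold across the audience:
// candidates for rewording, or signs of a deeper knowledge gap.
function problemQuestions(
  attempts: QuestionAttempt[],
  failThreshold = 0.5
): string[] {
  const tally = new Map<string, { fails: number; total: number }>();
  for (const a of attempts) {
    const t = tally.get(a.questionId) ?? { fails: 0, total: 0 };
    t.total += 1;
    if (!a.passed) t.fails += 1;
    tally.set(a.questionId, t);
  }
  return [...tally]
    .filter(([, t]) => t.fails / t.total > failThreshold)
    .map(([questionId]) => questionId);
}
```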

 

More food for thought

As well as human factors, there are also technical and environmental insights which data analytics can help you to identify. For example, you can discover:

  • The time of day a resource is being accessed – is it during typical working hours, or on the morning or evening commute? Likely it will be a mix, but you may spot trends that influence how you design learning in the future.
  • How a learning program is being used, such as which devices learners access the content on (so you can decide whether or not to take a mobile-first approach). Even if a course is predominantly accessed on desktop, analytics will reveal the screen resolutions most commonly in use, so the design can be optimized for the technologies your learners actually use (a sketch of this kind of analysis follows the list).
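
A minimal sketch of that environmental analysis, with an assumed event shape for illustration:

```typescript
// Hypothetical launch event with client context.
interface LaunchEvent {
  startedAt: Date;
  device: "desktop" | "tablet" | "mobile";
  screenWidth: number; // pixels
}

// Count launches per device and per hour of day to spot usage patterns.
function launchPatterns(events: LaunchEvent[]) {
  const byDevice: Record<string, number> = {};
  const byHour: number[] = new Array(24).fill(0);
  for (const e of events) {
    byDevice[e.device] = (byDevice[e.device] ?? 0) + 1;
    byHour[e.startedAt.getHours()] += 1;
  }
  return { byDevice, byHour };
}
```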
Infographic showing various learner statistics

Reporting dashboard shows learner statistics – 76 unique users, 86 course launches, 63 course completions, 36 average minutes to complete, 38% average pre-assessment score, 87% average final score

The examples shown in these dashboards represent single experiences. However, data reporting with greater nuance allows you to consider what you may want to measure over time. How can you use multiple experiences throughout the year to collect data on attitude change? How can you compare training year-on-year, especially when training the same subjects on an annual basis? What do your managers and stakeholders want to know about your learning initiatives?

In short, data can help you to make objective decisions about what to keep doing, what to stop doing and what to adapt. Duplication and redundancy can be eliminated, and opportunities to improve can be realized.

As you can see, you can begin to design and implement data strategies that go beyond just tracking completion. They enable you to build an understanding of your learners, and of the impact of your learning, over months and years. Download our measurement guide for a simple plan for moving forward with your data and measurement strategy. Your knowledge of what works and what doesn’t will build up over time, and you can adapt accordingly.



Check out the next article in this series, Seven steps to success, to continue exploring The Kineo view on: Compliance elearning.

If you want to know more about how we create better learning experiences for compliance-based learning, drop us a line to book a free consultation with one of our learning experts.



In this suite of articles on compliance elearning, we take a helicopter view of the compliance landscape, talk you through our latest thinking and research, and share our recommendations for transforming your mandatory training – helping you find your North Star. We hope you find it useful and we welcome your feedback. Please get in touch if you'd like to discuss your own compliance training challenges with one of our team.