
Step outside your data comfort zone

Podcasts and Audio | 27.06.2018

The team looks at data sources beyond the comfort of your LMS, including Learning Record Stores, Google Analytics and heatmapping tools. But before you start gathering data, the big issue is: do you know which questions to ask to determine whether your L&D is having a real impact?

Paul Westlake  0:00  

Welcome to Kineo Stream of Thought, a monthly podcast that features informal chat from the Kineo team about all things learning. I'm Paul Westlake, a Solutions Consultant at Kineo, and today we're encouraging you to step outside the data comfort zone. This month, I'm pleased to say that I'm joined around the table by:

Laura Parsons  0:23  

Laura Parsons, Marketing Analyst, 

Pete Smith  0:26  

Peter Smith, Technical Team Lead,

Jez Anderson  0:28  

Jez Anderson, Head of Consulting.

Paul Westlake  0:30  

Thanks all. So by way of an intro to the show this month, we've produced a report on data in L&D in conjunction with Towards Maturity, HT2 Labs and Filtered. Specifically, the findings show that there are big differences between how the top deck, so that's the highest performing businesses if you like, use data and how everyone else is using it. And without stealing the whole report's thunder, which I'm sure you'll want to download and read (we'll tell you about that later), here are some of the headlines to get started. So 99% of participants want to demonstrate the impact of learning, and yet only 38% think they actually do so. 98% want to improve learning design, yet only 56% think that they use data to help them do that. And finally, a big one for me: a huge 96% want to win the hearts and minds of stakeholders, yet less than 50% feel that they actually do. So maybe we should start with what data we should be looking at and what we should be measuring. Because if we're talking about L&D data, surely all we're talking about is completions on the LMS. Right?

Pete Smith  1:39  

That's certainly the traditional view. As everyone in the industry knows, we've had the very lovely SCORM API, which has held sway for the last 20 years or so, and it's very much focused on gathering an individual's completion records: whether they've actually managed to get to the end of a piece of elearning and whether they've fulfilled all the requirements of it. And that's really been the main source of reporting data which L&D professionals have had, who's completed what course. Whereas the kind of data that you're talking about there, Paul, is a quite different data set. It's not about the individual, it's about a much broader spread of interactions, and really we do need a different toolset to record that sort of data.

Jez Anderson  2:25  

I think the fact that people want to use data more is great, and as you know it's an absolute key finding for us. It shows that there is hunger and passion there for understanding the impact that learning and development is having, both individually and organisationally. So that's one thing, and that will is a good way forward. The key, in my mind, will be the questions that organisations and learning and development want to ask of themselves and their organisation, to know what data they need to process. Traditionally, data from an L&D point of view has been, as Pete says, quite low level, quite functional. It's how many people, it's, you know, bums on seats, it's what did people think about a particular approach to the learning, versus actually starting to understand how that learning has an impact on their performance. And that's always been difficult. I think that new technologies are starting to make that easier to do, but, and the big but is, learning and development have got to catch up in terms of how to use that technology to do that.

Paul Westlake  3:36  

Do you think they're comfortable doing that, Jez?

Jez Anderson  3:39  

No, I don't. I think traditionally, L&D have, for whatever reason, always struggled a little bit with, for those people familiar with Kirkpatrick, levels three and four, which is really starting to get into the deeper levels of how learning has impacted organisations and individuals within organisations. And I think it's hard. It's been hard to really understand what questions you need to ask to get the right information from that. So I think the will's there, as I said; it's actually about skill. And I think this is probably where we come in: we can start to help people develop the skills to do that, but also give them the technology which they can start to utilise better.

Pete Smith  4:22  

Actually, you're absolutely right, Jez, I completely agree. But before we get onto the slightly deeper waters of Kirkpatrick levels three and four, there are actually a lot of quick wins open to people within L&D without having to go down the whole route of evaluation. There are more reasons for wanting to gather data than just demonstrating ROI or the effectiveness of your training. One really good example: simply understanding how people are using the learning content that we produce changes our approach to actually producing that content. So if you plug Google Analytics into your LMS or your elearning, then we can find out how many people are actually accessing the courses on smartphones. You can see whether there's a significant drop-off among people who are using smartphones compared with people who are learning on the desktop. You can have a look at whether there's a difference in the length of time spent on courses for people who are based somewhere else globally, and therefore whether you should actually translate your content, even though in theory everyone is in an English-speaking office. Those are the sorts of bits of information that you can get with very, very little technical overhead.
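
As an illustration of the kind of breakdown Pete describes: once Google Analytics is collecting data from your courses, the device split can be pulled programmatically as well as read off the dashboard. This is only a minimal sketch, assuming a GA4 property and the @google-analytics/data Node client; the property ID is a placeholder.

```typescript
// Minimal sketch: sessions and average session duration by device category
// from a GA4 property, assuming a service account with read access.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const analytics = new BetaAnalyticsDataClient();

async function deviceBreakdown(): Promise<void> {
  const [response] = await analytics.runReport({
    property: 'properties/123456789',          // hypothetical GA4 property ID
    dateRanges: [{ startDate: '90daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'deviceCategory' }],  // desktop / mobile / tablet
    metrics: [{ name: 'sessions' }, { name: 'averageSessionDuration' }],
  });

  for (const row of response.rows ?? []) {
    const device = row.dimensionValues?.[0].value;
    const sessions = row.metricValues?.[0].value;
    const avgDuration = row.metricValues?.[1].value;
    console.log(`${device}: ${sessions} sessions, avg ${avgDuration}s`);
  }
}

deviceBreakdown().catch(console.error);
```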

Paul Westlake  5:34  

Now, I think that's a really good point. When we've spoken to clients in the past, they've said, oh, everyone does this on mobile, so it's really important it's updated for iOS 12 when that arrives. And I know we've pulled the data and said, well, actually, you know, less than 3% of your users are accessing that stuff that way, because you've actually got the data to back it up rather than that assumption.

Pete Smith  5:53  

Exactly. And it's very often actually a case of taking particular technologies off the list and not investing the time and effort needed to build courses to work in those technologies, test against those technologies and possibly compromise your content, based, as you say, on actual proper data of what people are really using out in the real world.

Paul Westlake  6:13  

So you mentioned Google Analytics there. Google Analytics, I think, is something that if anyone's made a website or whatever, they've probably got it in there, so they can know where people are accessing the content from, etc. So you're saying that it's relatively straightforward to do exactly the same with an LMS?

Pete Smith  6:30  

Yes, it depends on the LMS, obviously, but if you're using, say, the Moodle LMS, then it's simply a config option: you just need to generate your Google Analytics tracking code, drop it into a field on Moodle and click Save.
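
For anyone who hasn't seen it, the tracking code Pete mentions is essentially a small script that loads Google's tag library and registers your measurement ID; in Moodle you would normally paste Google's own snippet into the relevant setting rather than write it yourself. A rough TypeScript sketch of what that snippet does (the measurement ID is a placeholder):

```typescript
// Sketch of what the Google Analytics tracking snippet does: load gtag.js,
// then register the site's measurement ID so page views are sent.
const MEASUREMENT_ID = 'G-XXXXXXXX'; // placeholder measurement ID

// Load the gtag.js library asynchronously.
const script = document.createElement('script');
script.async = true;
script.src = `https://www.googletagmanager.com/gtag/js?id=${MEASUREMENT_ID}`;
document.head.appendChild(script);

// Standard gtag bootstrap: commands queue on dataLayer until the library
// loads and processes them.
const w = window as unknown as { dataLayer: IArguments[] };
w.dataLayer = w.dataLayer || [];
function gtag(..._args: unknown[]): void {
  // gtag.js expects the Arguments object, matching Google's own snippet.
  // eslint-disable-next-line prefer-rest-params
  w.dataLayer.push(arguments);
}

gtag('js', new Date());
gtag('config', MEASUREMENT_ID); // records a page_view for the current page
```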

Paul Westlake  6:45  

And what sort of things can people see from that? Amount of time on pages? Or, you know, which topics they're going directly to?

Laura Parsons  6:53  

Yeah, absolutely. How long people are spending on each page, how many pages they're viewing after that, where they're from city-wise, what browser they're on, which page they landed on, which page they exited the LMS from, all sorts.

Pete Smith  7:09  

The key difference is that this isn't a person in the SCORM sense. This is a user agent; it's a person, but you're looking at it as an aggregated mass rather than tracking individual learners through course content or an LMS.

Laura Parsons  7:25  

Well, I think it's quite interesting. The report showed that only 13% of LMSs actually have Google Analytics implemented on them. But if you applied that to websites in general, that would be madness. That's kind of a basic, fundamental thing you would add...

Paul Westlake  7:41  

Why would you use it? Why would you add it to a website? What sort of things could you do with that data?

Laura Parsons  7:46  

It's fundamental to really understanding how your marketing is working. If you know that people are coming to you through the adverts that you've been running, then you'll know to do more of that. And those are the kinds of things you can apply to L&D. If you have tracking set up, you can know whether your internal campaign for your latest content has worked. If people aren't coming through your tracking for that, then you know that you need to revise it and perhaps think about the ways to market your new content.

Paul Westlake  8:13  

So in summary, what you're saying is that people will use that data to point them in a direction, to then make choices around what to update, what to amend, what to change, what to leave, what to remove, etc. And then, bearing in mind what we said at the start of the show about the report that says 98% of businesses want to improve their learning design, in theory we could argue that, well, if we've got data showing what people are accessing and what they're not accessing, it's going to point us in the right direction to know what we need to update.

Laura Parsons  8:46  

Absolutely. And you can have a look at the bounce rate and the average time on page for each of your courses. And if you've got one that's really fundamental but it's not getting much engagement, then you can have a look and see: is that just because all of our team are using a browser that can't access that content? Or is there a problem that it's just not engaging enough? And you could use that data to prioritise the content for your internal teams.

Paul Westlake  9:09  

Okay, so we've got Google Analytics, and we've got standard completion data, if you like. You mentioned things like heat maps. So before we move on to what we're going to do with the data, how else can people record data?

Laura Parsons  9:26  

Heat mapping is another favourite of mine, just because you can actually see, in a very visual way, what people are doing on your website or your platform, so you don't have to be an analyst to really get stuck into the data. Again, these can be really easy things to add to your platform; it's just a bit of code. You would set up your heat map on certain pages, and you would see where there are red marks on a page: that's where people are spending a lot of time, whereas a cooler colour, like blue, means people aren't spending as much time there. And that can then show you, if you've put content that you need people to do, perhaps compliance, in an area people aren't looking at, that perhaps you should move it to an area people are focusing on. That can just help you prioritise where your essential bits should be. But you can also see whether people are actually reaching the content you want, how far down they're scrolling on a page. If they're not seeing your calls to action, then perhaps these need to be bumped up.
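
Commercial heat mapping tools handle the collection for you, but as a rough sketch of what that "bit of code" is doing under the hood, the snippet below records click positions and maximum scroll depth and posts them to a collector. The /collect endpoint is made up, and a real tool would sample, batch and anonymise this data.

```typescript
// Rough sketch of what a heat mapping snippet does under the hood: record
// where users click and how far they scroll, then send that to a collector.
interface ClickPoint { x: number; y: number; page: string; }

const clicks: ClickPoint[] = [];
let maxScrollDepth = 0;

document.addEventListener('click', (e: MouseEvent) => {
  clicks.push({ x: e.pageX, y: e.pageY, page: location.pathname });
});

window.addEventListener('scroll', () => {
  const depth = (window.scrollY + window.innerHeight) /
    document.documentElement.scrollHeight;
  maxScrollDepth = Math.max(maxScrollDepth, depth);
});

// Flush on page hide so partial sessions are still captured.
window.addEventListener('pagehide', () => {
  navigator.sendBeacon('/collect', JSON.stringify({   // hypothetical endpoint
    page: location.pathname,
    clicks,
    maxScrollDepth: Math.round(maxScrollDepth * 100),  // percent of page seen
  }));
});
```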

Paul Westlake  10:20  

So Pete, can I pick up on what Laura's just said there? Correct me if I'm wrong, and I'm sure you will: we're talking about websites, etc. If I take that a step further, if we're talking about a piece of Adapt content, which is HTML5, which I think I'm right in saying is what most websites are written in these days anyway, in theory could we do this, or do a similar sort of thing, where we heat map a piece of learning content to see where people are looking around the page, and therefore help us to design off that?

Pete Smith  10:50  

Yes, absolutely. It's going to be both useful and not useful. It'll be really handy from the technology perspective, particularly looking at the UI and the UX of all the standard elements on an Adapt page. So, is having the resources button up at the top right a useful thing? Does anyone ever actually click on the page-level progress, which is actually a second level of navigation that we've got on Adapt courses? From the feedback we've had, I think most learners actually aren't aware that they can click and navigate through it, in which case we could redesign it, and again look back at the heatmap, see how people interact with it, see if the redesign works, maybe try two or three different versions of it before settling on one. That's a good approach. So in that respect, it would be a really useful thing to add into Adapt courses and useful data to analyse. On the flip side, I do think that we have a slightly more complicated task with learning and development than maybe marketers do, particularly if the marketer is looking after an e-commerce website, where everything is self-contained to the one site. There it's really easy to come up with a clear objective, because the objective is to get your user from point A to point B quickly, get them through the sales process, get them to buy as much as possible, and complete the checkout process. That's what you want someone to do. With learning, it's a little bit more subtle. So if you think about the concept of engagement: from a marketing perspective, it's all about the length of time that someone spends on the page, and more time is better in almost all circumstances. With learning, it's not quite as clear cut as that. If someone's spending a long time on a page, it might be because you've got really good rich content and they're fascinated. Equally, it might be that you've built an overly complex course, you've used loads of jargon, and people are really struggling to get through it. Or you've got an asset which is a 10MB graphic that somehow snuck in and is just taking ages to load. It could be all sorts of things going on with learning. And the final objective isn't quite so clear cut, because you want someone to understand your content, you want someone to internalise your content, and ultimately you want someone to change their behaviour based on that content. And most of that happens well away from the elearning course that you're tracking.
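
If you wanted to answer Pete's specific questions, such as whether anyone uses the resources button or the page-level progress control, one approach is to send a named analytics event when those controls are clicked. The CSS selectors below are hypothetical placeholders rather than real Adapt class names, and gtag is assumed to already be loaded on the page, as in the earlier snippet.

```typescript
// Sketch: count interactions with specific course UI controls by sending a
// named Google Analytics event per click. Selectors are hypothetical and
// would need to match the actual Adapt theme in use.
declare function gtag(...args: unknown[]): void;

const trackedControls: Record<string, string> = {
  '.drawer__toggle': 'resources_button',        // hypothetical selector
  '.nav__progress-btn': 'page_level_progress',  // hypothetical selector
};

for (const [selector, label] of Object.entries(trackedControls)) {
  document.querySelectorAll(selector).forEach((el) => {
    el.addEventListener('click', () => {
      gtag('event', 'ui_interaction', {
        control: label,
        page: location.pathname,
      });
    });
  });
}
```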

Paul Westlake  13:23  

Excellent. So the answer was, yes, we can do it.

Pete Smith  13:30  

We can, yes, we absolutely should do, but it won't necessarily solve the whole question of evaluation. Well, in fact, it definitely wouldn't solve the whole question of evaluation. 

Paul Westlake  13:39  

Cool, thank you. So, thinking about this data that we're talking about, I guess we should look at what data we should be looking at. I'm sure in a lot of cases there's an awful lot of data in businesses that comes from numerous different sources. So for example, and I'm going to use my old world again, we've got till data, and we had ordering data, and then we had training data, and then we had waste percentage data. There's a huge amount of data there, and you could, in theory, say that if we did a piece of learning around X, then it could affect one or more of those results. The problem there was always that they were in loads of different places. As an example, I remember writing a piece of health and safety training for an HSE department, and when I asked about the success of that, or how we were going to measure the success of that, they said, well, obviously it's the number of people who have done that piece of learning. And I said, well, surely we're going to see a reduction in accidents? And, you know, they said, well, yes, that would be lovely, but we can't possibly say that the reduction in accidents has anything to do with the fact that people have done a piece of learning; that's too much of a leap of faith. So it always seems a bit weird to me that people are almost resistant to claiming the effect that their learning's had. But the key question is this: how do we pull all of those different bits of information and all those bits of data from all over the place together into one, and then produce a report or something that reflects whether it has actually made a difference?

Laura Parsons  15:16  

Not to be a Google fangirl, but they've got this wonderful reporting tool called Google Data Studio, which is fantastic. It's free for everyone, it can handle big data, and it pulls in data from multiple sources. It's got connections with your Google Analytics, which you're obviously all getting set up now. But if you can put something in a Google Sheet, it will pull into Google Data Studio, and you can have all of your different data sources together and create charts and funnels and just show the correlation between those datasets.

Pete Smith  15:48  

And as well as this, a lot of our clients will have something similar, something proprietary, already set up. Certainly all of our big clients will have a business warehouse set up, which will have several systems feeding into it, and a lot of them will also have a feed straight from a learning management system, so they will be able to do some of the kind of data analysis that you're talking about. But obviously, different organisations have different levels of integration, and they will all put a slightly different level of importance on that training data going in. So if the core business is based around getting good data from sales and driving the revenues up, they will probably be doing a lot more interpretation of those sorts of records than they will around anything L&D and training related, at a guess.

Paul Westlake  16:41  

So Pete, I know on previous shows we've talked about LRSs and how we can bring different types of data together. So would that be a good fit for what we're talking about here?

Pete Smith  16:51  

Yeah, it definitely is; it's certainly a route in. xAPI has obviously got two big benefits to it. One of them is that you get much more flexible data going into it: you've got a lot more control over what sorts of statements you can send to your learning record store. And if you go for a learning record store like Learning Locker, then you have a lot of opportunity there to carry out data integrations. So you could either use your LRS as your one central point of all information, and throw your sales data, HR data and whatever else you might want into it and query that. Or you could use one of the business intelligence type systems that I just mentioned and use that as your central data store. It all really depends on how much data you have and how many data integration points you want to deal with.
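
For anyone who hasn't seen one, an xAPI statement is just a small JSON document (actor, verb, object) posted to the LRS's statements endpoint over HTTP. A minimal sketch, assuming an LRS at a placeholder URL with Basic auth credentials:

```typescript
// Minimal sketch of sending an xAPI statement to an LRS. The endpoint and
// credentials are placeholders; the actor/verb/object structure and the
// X-Experience-API-Version header come from the xAPI specification.
const LRS_ENDPOINT = 'https://lrs.example.com/data/xAPI'; // hypothetical
const LRS_AUTH = 'Basic ' + Buffer.from('key:secret').toString('base64');

const statement = {
  actor: {
    objectType: 'Agent',
    name: 'Jane Doe',
    mbox: 'mailto:jane.doe@example.com',
  },
  verb: {
    id: 'http://adlnet.gov/expapi/verbs/completed',
    display: { 'en-GB': 'completed' },
  },
  object: {
    objectType: 'Activity',
    id: 'https://example.com/courses/health-and-safety', // hypothetical course ID
    definition: { name: { 'en-GB': 'Health and Safety Essentials' } },
  },
};

async function sendStatement(): Promise<void> {
  const res = await fetch(`${LRS_ENDPOINT}/statements`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Experience-API-Version': '1.0.3',
      Authorization: LRS_AUTH,
    },
    body: JSON.stringify(statement),
  });
  if (!res.ok) throw new Error(`LRS rejected statement: ${res.status}`);
}

sendStatement().catch(console.error);
```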

Paul Westlake  17:44  

Okay, so we've got a way of capturing all of this data; we know how we're going to do that. But still, traditionally, all we've really done is produce a number of reports on the back of that data. So if we start putting it all into one place, surely we can now analyse it rather than just report on it?

Jez Anderson  18:02  

We can. I mean, and I sort of fly against the wind a little bit here, but I think that ultimately we do need to almost pull back a little bit from just looking at how you pull the data together and what processes you use, to the fundamental thing, which is: what questions do you want the data to answer? What is the problem that you're trying to understand more of? So your example, Paul, of the accidents: if that's what you're trying to solve, what data is going to best give you that? That might be learning data, or it might be other data; it might be about staff turnover, it might be about performance review data. It's looking at how we can pick up on the things that already exist within an organisation and pull them together to answer these questions. That, for me, is fundamental. I don't think L&D always naturally have the skill set to do that, and I think that's something we can help develop and grow: the data capability, but also how to utilise the data to its best.

Paul Westlake  19:11  

Ironically, when I asked the question about why wouldn't we want to see a reduction in accidents, as in surely that's why we're doing the learning, the message that came back was, well, we just need to prove that everyone's done it, so when there are accidents, you know, we can prove that it wasn't our fault. Which is dreadful, really...

Jez Anderson  19:29  

And that sums up the issue exactly. Again, it's about actually changing the nature of learning and development from being quite parental in some ways. So, you know, learning and development owns the process and owns the content and owns the fact that people have to do it, and that's what they're charged with doing by the business. What we're saying, and what we're starting to move towards, is a shift in that, by saying that actually L&D can provide a real insight into how organisations can improve through their people, through the effectiveness of their L&D strategies. So it's not just about, you know, the virtual bums on seats; it is much more about the impact that learning is having on the day-to-day performance of individuals and the organisations that they're part of, if you ask the right questions to begin with and seek out the data which is going to help answer those questions.

Pete Smith  20:21  

And I do think that that's a really big 'if', though; those are the key bits. You need to think about the data before you even start to think about designing the course; it's got to be the data that comes first. And you need to be really clear about your set of objectives. Even then, demonstrating cause and effect is hard. If we stick with the health and safety example, there could be any number of factors which cause a spike in the number of incidents. So it's partly the quality of data going in, but you also need to have high quality analysis and a good range of data so that you can close off as many of those other factors as possible, or you need good data design, so you might want A/B testing or all those sorts of techniques. It's not impossible, but it's very difficult.

Jez Anderson  21:07  

Yeah, and sorry for interrupting, but it's that thing for me about actually applying some thinking to the process that you're undertaking and not just chucking the tools in and hoping it will work. Because the reality of it is, it won't.

Paul Westlake  21:22  

Surely that's the question for the stakeholders, isn't it? If someone's come to me and said, I need you to create a piece of learning for X, you would assume... we know the danger of that! But I would assume that they've already done some sort of analysis to say, this is the issue, this is what we're going to try and solve, this is how we're going to measure it. But in a lot of cases, in my previous experience, you've rarely asked the stakeholder that question: what does success look like?

Jez Anderson  21:51  

Yeah, I agree. I think it's a bit like an elephant in the room, really... what are you trying to improve? As soon as you commission a piece of training, the next step is, well, I've got to get that piece of training out there; I've said X number of people will experience that training within a six-month period, and that's what I'm interested in. And then very quickly you move on to the next project, the next piece of training. And this is why, in my experience, evaluation always slips down to the bottom, because there's always something else coming through the door in terms of implementation. So when we start to look at how to process data, and the richness of data and the improved ability to access data, it'll actually help speed that process up. The idea would be that you haven't really got to think about anything after the event; all the work is done up front. It's like painting a wall: you're going to rub it down, you're going to make sure it's filled properly. If you do all that up front, then the wall looks great and you haven't got to worry about what it looks like afterwards. It's that whole process, for me, of actually putting some effort in up front to understand what that measurable difference is going to be, what it is you're actually trying to achieve.

Paul Westlake  23:04  

Sure. And if we take that step back and do that upfront, then surely subsequent pieces are going to be even easier?

Jez Anderson  23:09  

Yeah, it should be. I mean, there's lots of evidence of great evaluation studies out there, lots of evidence of people who have done it well, you know, running control groups, etc. So there's lots of good practice in that. But I think the reality of it is that we just have to work with organisations to help them utilise this technology to their best advantage.

Pete Smith  23:32  

And also make it really easy to get the reports, because, as you say, it's quite easy to forget these things and never return to them. Whereas it's evaluation over time: elearning degrades in exactly the same way that everything else does, and the impact of it, and people's recollection of it, also dips over time. So you need to automate the process of reporting on it, you need to automate the process of nudging people to repeat their learning, all of those sorts of things. And again, learning systems can be set up to do that.
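
As a sketch of the kind of automation Pete means, the snippet below queries wherever completion data lives for records older than a refresher interval and flags those learners for a nudge. The endpoint and record shape are hypothetical stand-ins for whatever your LMS or LRS actually exposes.

```typescript
// Sketch of an automated refresher nudge: find completions older than a set
// interval and flag those learners. The /api/completions endpoint and record
// shape are hypothetical placeholders.
interface CompletionRecord {
  learnerEmail: string;
  courseId: string;
  completedAt: string; // ISO 8601 date
}

const REFRESH_AFTER_MONTHS = 12;

async function findLearnersToNudge(): Promise<CompletionRecord[]> {
  const res = await fetch('https://lms.example.com/api/completions'); // hypothetical
  const completions: CompletionRecord[] = await res.json();

  const cutoff = new Date();
  cutoff.setMonth(cutoff.getMonth() - REFRESH_AFTER_MONTHS);

  // Anyone whose completion predates the cutoff is due a refresher.
  return completions.filter((c) => new Date(c.completedAt) < cutoff);
}

findLearnersToNudge()
  .then((due) => due.forEach((c) =>
    console.log(`Nudge ${c.learnerEmail} to revisit ${c.courseId}`)))
  .catch(console.error);
```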

Paul Westlake  24:05  

...and potentially nudge people, or nudge L&D teams, to go back and revisit a piece of learning they put out two years ago. Does it need updating? Has, you know, the process changed? Is there an easier way of doing this? Is there a different way of explaining it that's going to have a better result in a shorter space of time? So yeah, it's using that data for sort of a win-win for everyone really, from the L&D team who are creating it in the first place to the end user who's consuming it. So obviously there's a desire to get started. So Laura, how straightforward is it to do that? How would someone get started?

Laura Parsons  24:37  

I think the easiest way would be to start with Google Analytics; it's a really easy bit of code to add to your platform. Then have a look at tools like Google Data Studio and start pulling in some really basic reports from there. You can get these automated so they can be sent to your stakeholders. See the kind of thing that they engage with, and what it is that they're asking you questions about based on the reports, and look to adapt your reports from there. See what insights you get, and then just extend that and carry on.

Pete Smith  25:05  

And it is actually so quick and easy to drop Google Analytics code into a course that, certainly for any Adapt course that we build, we'd be happy to drop that in for no charge whatsoever.

Paul Westlake  25:24  

If you want to continue the conversation, you can catch up with us as usual on Twitter, where we're @Kineo, or via our website, which is kineo.com. And if you're interested in picking up a copy of the full report on data in L&D, be sure to check in regularly with us on social and we'll let you know when that's available.


Your speakers are


Jez was Head of Consulting at Kineo until 2020.
Laura deep dives into the customer journey by monitoring user engagement and making sense of our marketing data. Laura has 10 years of experience in marketing and has a background in search engine optimisation.
Paul was previously a Solutions Consultant at Kineo.
As a Technical Team Lead, Pete manages our team of Senior Technical Consultants and Front End Developers as well as taking accountability for the technical robustness and suitability of Kineo’s elearning and learning content. Pete also helps drive forward technical innovation working with our Technical Director and Head of Innovation to identify new opportunities for Kineo to branch into. Pete has a key role in the development of our Adapt framework and technical roadmaps for our proprietary tools and development.