LMS Distress

Last week, I posted the following on LinkedIn: “What I really, really want, is an LMS that tracks more than just completions. I want stats like how long a learner spent on a page, what activities they skipped, or what they went back to review. Basically, how they really behave in a piece of digital content”. I also included the hashtag #nerd, because I figured not too many people would be interested in my musings.

I was surprisingly wrong. The post generated a great deal of discussion, which is something I love. It also created a flood of connection requests. To date, over 30 individual LMS companies have reached out to me to talk about their product. The sad truth is, I'm currently a consultant with literally bupkis for budget. It was completely my error not to be transparent in my original post, and for that I am sorry. Likewise, I am not complaining about the high volume of salespeople reaching out – I get that LinkedIn is a big tool for lead prospecting; we all gotta eat.

What concerned me (and I do not use that term lightly) is the number of companies operating in this space. Yes, you all have good products, but they are often remarkably similar. This is harsh to say, but after looking at dozens of LMS company websites (almost all with a blue or green palette, featuring an apple or an owl, with some bar graphs for good measure), they start to blur together. Some are more innovative than others – a handful are using their data to personalise content, which is great. Others are still tracking completions, but with fancier data dashboards. Sure, many of you will argue that you are unique, but consider this an individual observation from the consumer side.

Getting back to my concern: is this proliferation of learning tracking solutions fragmenting us? Does it prevent our industry from seeing the insights behind all of this data we are collecting as we operate in silos?

I asked my original question because I believe that we still do not fully understand how our learners consume digital content. Personally, I think we have a lot of mythology floating around out there. For example, whenever I have had the opportunity to watch someone engage with an eLearning module (one that is not lock-stepped), completion of interactivities wanes after the first one or two. Yet we continue to build them, like adding salt and pepper to a stew, because it is how we think eLearning should be built. Likewise, I have not seen millennials flocking to mobile learning in a corporate setting. When I probed on this topic, the response was overwhelmingly that their phones are personal and they will go to Instagram on the train home rather than an LMS. Still, mobile learning is considered one of the biggest trends.

All of this is anecdotal and based on my individual experiences. It still leads me back to my original quest – how do we decode learner digital body language?

Many of the comments were extremely helpful in describing ways to use SCORM and xAPI (thank you!). Unfortunately, I am not well-versed in the technical nuances of xAPI and, quite frankly, I am not sure I could ever grow my skill set enough to be proficient (read: I'm thick). That said, it excites me to know that the concept of tracking learner behaviour at a micro level is out there.
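
For fellow non-experts, my rough understanding is that an xAPI statement is just a small record of "someone did something to some piece of content", which is exactly the kind of micro behaviour I am after. Here is a minimal sketch (in Python) of what a "learner skipped a video" statement might look like – the verb and activity IDs are placeholders of my own, not from any particular vendor or registry.

```python
import json

# Illustrative xAPI statement recording that a learner skipped a video inside
# a module. The verb and activity IDs below are placeholders, not a standard.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Jane Learner",
        "mbox": "mailto:jane.learner@example.com",
    },
    "verb": {
        "id": "https://example.com/verbs/skipped",   # hypothetical verb ID
        "display": {"en-US": "skipped"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/courses/safety-101/video-3",
        "definition": {"name": {"en-US": "Safety 101 - Video 3"}},
    },
    "result": {
        "duration": "PT4S"   # ISO 8601 duration: only four seconds watched
    },
    "timestamp": "2019-06-12T09:30:00Z",
}

# Statements like this are sent to a Learning Record Store (LRS), which is
# where the micro-level behavioural data accumulates.
print(json.dumps(statement, indent=2))
```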

However, I do not think building this type of tracking at an individual organisation level alone is going to propel L&D forward into maximising digital. To do that, I believe we need to begin to aggregate and share the insights from our data. For a start, we could create a set of learning-specific metrics that are common across the industry – metrics that go way beyond completions and focus on behaviours such as interactivities ignored, content revisited, and videos skipped. I am thinking along the lines of the micro-metrics a platform like YouTube collects to analyse digital content performance.
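
To make that concrete, here is a rough sketch of what one shared, behaviour-level record could look like. The field names and definitions are my own invention for illustration – a real industry standard would need to agree on both.

```python
from dataclasses import dataclass

# A sketch of a shared, behaviour-level engagement record. Field names and
# definitions are invented for illustration only.
@dataclass
class EngagementRecord:
    content_id: str                 # which module / page / video
    content_type: str               # e.g. "video", "scenario", "quiz"
    industry: str                   # segmentation fields, no personal data
    region: str
    time_on_page_seconds: float     # dwell time, not just "opened"
    interactivities_presented: int
    interactivities_completed: int  # actually attempted, not just displayed
    sections_revisited: int         # what the learner went back to review
    videos_skipped: int
    completed: bool                 # the one metric most LMSs already track

    @property
    def interactivity_completion_rate(self) -> float:
        """Share of interactivities the learner actually engaged with."""
        if self.interactivities_presented == 0:
            return 0.0
        return self.interactivities_completed / self.interactivities_presented
```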

Once we have a standard set of metrics, this data can be shared, compared, and analysed across industries and geographies to gain insights that go beyond an individual company view. You would then have the power to slice and dice the data to better understand what is likely to resonate with your audience. Building learning for the mining industry in South America? Segment that population out of the data and use the insights to make more informed design decisions, rather than guessing what will work. This means smarter and more efficient design, as well as more satisfied audiences.
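
As a toy illustration of the slice-and-dice idea, assuming records like the sketch above were pooled into one shared dataset (the file and column names here are assumptions of mine, not a real data source):

```python
import pandas as pd

# Toy illustration of segmenting a pooled dataset of engagement records.
df = pd.read_csv("pooled_engagement_records.csv")

# Segment: mining industry, South America.
segment = df[(df["industry"] == "mining") & (df["region"] == "South America")]

# Derive a behaviour metric: how many interactivities were actually attempted.
segment = segment.assign(
    interactivity_completion_rate=segment["interactivities_completed"]
    / segment["interactivities_presented"].clip(lower=1)
)

# Which content formats hold this audience's attention?
summary = segment.groupby("content_type").agg(
    median_time_on_page=("time_on_page_seconds", "median"),
    mean_interactivity_completion=("interactivity_completion_rate", "mean"),
    mean_videos_skipped=("videos_skipped", "mean"),
)
print(summary.sort_values("median_time_on_page", ascending=False))
```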

As a seasoned L&D professional, I am questioning more and more of what we consider good design practice in digital. I am losing confidence that what we serve up to our learners is what they really want. If it were, the default search for content would be on an LMS instead of on Google. I do not see that changing any time soon.

I truly believe that data is the only way L&D can crack digital content. There is, however, a reason why the term Big Data is used: the numbers need to be on a broad scale. Operating as individual entities means each of us only ever sees one picture at a time. This is akin to looking in the rear-view mirror as you drive. What I want is the view from the windscreen. Data might be the way forward, if we can work collectively.

For another perspective on the state of the LMS, consider the article "Why do LMSs Fail" by the very smart Steve Dineen.

If this post resonated with you, consider visiting my blog. You can download a free copy of my eBook "Data Driven Learning Design" there.
