LMS Reporting Challenges and Solutions

Jun 26, 2019

Learning Management Systems have evolved from simple tools for managing, deploying, and tracking learning activities into the “nerve center” of an organization’s overall talent management strategy. Advanced reporting capabilities enable organizations to capture and quantify a wide array of learning activities and program results.

Many learning organizations struggle to find ways to access and interpret the virtual gold mine of data contained within their own systems. In this post, we’ll take a look at four typical LMS reporting challenges and recommend actions you can take to resolve them.

1. Who did what?

The typical LMS setup does a good job of capturing basic assignment and course completion data. While completion results are important for documenting activities such as compliance training, onboarding, and other legally required topics, they are less critical for other learning programs.

Learning administrators often point out that reporting on course incompletes is rarely an LMS default option. However, incompletion reports can provide critical, actionable details on who is not finishing a course, and can also identify the page where an individual exits the course. Leaders can use this information to coach learners, manage deadlines, and identify potential issues with content deployment.

Solution: Customized reports

You can obtain noncompletions by filtering a standard Completion Report to exclude both those who successfully completed the module and those who never opened it. If your focus is on finding noncompletes or stopping points, it makes sense to customize a report to generate that data. The parameters for determining course incompletes can be set as follows:

[Overall training audience] – [Completed] – [Not yet begun]

Your new report will include all persons who were assigned the content and opened the module but did not register a completion benchmark within the system.
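
To make the set arithmetic concrete, here is a minimal sketch in Python, assuming your LMS can export the three audience lists as learner IDs (the names and data are purely illustrative):

    # [Overall training audience] - [Completed] - [Not yet begun]
    assigned = {"avery", "blake", "casey", "devon", "emery"}  # overall training audience
    completed = {"avery", "blake"}                            # registered a completion
    not_yet_begun = {"emery"}                                 # never opened the module

    incomplete = assigned - completed - not_yet_begun
    print(sorted(incomplete))  # ['casey', 'devon'] -- opened the module but never finished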

If possible, be sure to include the last page accessed by each learner prior to exiting the course. Knowing where your learners left a course gives you important information about content deployment and learner engagement. For example, if the vast majority of incompletes stopped at page 17, you might check that point for technical issues that impact progress, such as a page advancement error or bad hyperlink.
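
If your export includes that last-page field, a few lines of aggregation will surface the most common stopping point. This sketch assumes a simple (learner, last page) export; the data is made up:

    from collections import Counter

    # Hypothetical export: one (learner_id, last_page_accessed) pair per incomplete attempt.
    exit_points = [("casey", 17), ("devon", 17), ("finley", 17), ("gray", 4), ("harper", 17)]

    # Count how many learners stopped at each page; a spike at one page
    # suggests a technical issue such as a page advancement error.
    drop_offs = Counter(page for _, page in exit_points)
    page, count = drop_offs.most_common(1)[0]
    print(f"Most common exit point: page {page} ({count} learners)")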

Be sure to save any custom reports you create so that you can access them as needed. You may want to set up the report to run automatically on a schedule. For assistance creating any kind of custom reporting function, reach out to your LMS administrator or Learning Technology team.

2. Pinpointing knowledge issues

Most assessment-level reports generated by an LMS indicate only whether learners hit an established benchmark, such as an 80% score to pass a course. Question-level reporting on each assessment item, rather than a simple pass/fail indication of the overall result, yields detailed response patterns at both the individual and audience level that are of interest to both operational managers and learning teams.

People leaders and mentors can use question-level reporting details to focus their coaching efforts on each learner’s exact point of need. If question-level results indicate that learners are more likely to miss a question than to answer it correctly, the learning team can use this data to determine whether (a) the learning content is effectively written and presented; (b) the question aligns to the content and learning objectives; or (c) the answer tagged as correct is actually incorrect.
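
As a rough illustration, that triage can start with a per-question miss rate computed from a response-level export. The data shape here is hypothetical:

    from collections import defaultdict

    # Hypothetical question-level export: one row per learner response.
    responses = [
        {"question": "Q1", "correct": True},
        {"question": "Q1", "correct": True},
        {"question": "Q2", "correct": False},
        {"question": "Q2", "correct": False},
        {"question": "Q2", "correct": True},
    ]

    totals = defaultdict(lambda: {"asked": 0, "missed": 0})
    for r in responses:
        totals[r["question"]]["asked"] += 1
        if not r["correct"]:
            totals[r["question"]]["missed"] += 1

    # Flag questions missed more often than answered correctly.
    for q, t in sorted(totals.items()):
        miss_rate = t["missed"] / t["asked"]
        flag = "  <-- review content, alignment, and answer key" if miss_rate > 0.5 else ""
        print(f"{q}: {miss_rate:.0%} missed{flag}")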

Solution: Bridging the gaps

Learning gap reports can be tricky to configure and require significant effort. Typically, learning team members cannot get this type of data unless the LMS is correctly configured to capture the information. Question-level reporting also requires the e-learning module to be packaged in a format that records interaction data, such as xAPI or SCORM 2004.
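
For reference, here is the general shape of the question-level record an xAPI-packaged module can emit, expressed as a Python dictionary. The learner details and activity IDs are illustrative; the verb and activity type are standard ADL identifiers:

    # One xAPI statement per question response (illustrative values).
    statement = {
        "actor": {"mbox": "mailto:learner@example.com", "name": "Casey Example"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/answered",
            "display": {"en-US": "answered"},
        },
        "object": {
            "id": "http://example.com/courses/safety-101/quiz/question-7",
            "definition": {"type": "http://adlnet.gov/expapi/activities/cmi.interaction"},
        },
        "result": {"success": False, "response": "b"},
    }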

Your LMS administrator should know the basic parameters of your LMS configuration. If your system cannot deliver the type of information you need from a learning gap report, you may need to work with your IT team, the LMS provider, or an outside vendor to update your capabilities. Once your system is able to interpret this data, you can design the appropriate report type.

3. Mapping the Learning Journey

Many learning leaders like to present learning results in a dashboard format that aligns skills and competencies according to job-specific themes or outcomes. Displaying relevant data in a logical manner allows leaders to follow the big-picture learning journey rather than focusing on each individual data point. One popular strategy is to configure this output as an infographic, which is visually appealing and easy to understand.

A learning portal built to display learning results can be customized to show different views to different audiences. Senior executives can see program level learning and performance results, while people leaders and mentors receive each individual’s learning results along with a self-reported “temperature check” measuring each learner’s overall comfort level with the learning materials they are consuming. At the learner level, the portal view will track the individual’s progress against the overall curriculum.

Solution: Learning Dashboards

To start building a dashboard, identify your relevant training themes and topics, then:

  • Specify the knowledge, skills, and abilities (KSAs) that align to each topic;
  • Select the training activities that build knowledge of each topic; and
  • Select the training activities that prove or support the transfer of knowledge to the actual job.

Once you’ve identified the relevant KSAs and the supporting learning materials, you can focus on proof of competency data points such as:

  • A passing grade on an assessment can indicate proof of competency for knowledge-based materials.
  • The successful completion of scenario-based learning, executing the task in a work environment, leading a “teach back” demonstration, or a supervised apprenticeship can show the successful transfer of knowledge to real-world job needs.

Additional data points for your dashboard could include job specific metrics such as Net Promoter Score, sales data, customer satisfaction data, or other measures.

Qualitative data points such as satisfaction results, confidence measurement, and mentor or leadership feedback can give you insight about the level of comfort your learners feel with each topic and their levels of engagement.

You can show a learning journey by configuring your data to display as a series of layers with click-through options for viewing, or as a series of analog-style gauges similar to the speedometer and tachometer on your car’s dashboard. Recommended layers (or gauges) include:

  • Training themes or topics
  • Completion Status
  • Assessment scores
  • Mentoring or hands-on activities
  • Self-reported data
  • Informal self-led training (e.g., articles read, videos watched, conference attendance)
  • Job-related metrics
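
One way to ground those layers is a simple per-theme data model that a dashboard can render as gauges or click-through views. This sketch is illustrative only; the field names and figures are invented:

    # Minimal per-theme data model for the layers above (illustrative values).
    journey = {
        "Customer Service Fundamentals": {
            "ksas": ["active listening", "de-escalation"],
            "completion": {"assigned": 40, "completed": 34},
            "assessment_avg": 0.86,           # mean assessment score
            "mentoring_done": 28,             # learners with hands-on sign-off
            "self_reported_confidence": 3.9,  # e.g., 1-5 scale
            "informal_items_logged": 52,      # articles, videos, conferences
            "job_metric": {"name": "NPS", "value": 41},
        },
    }

    for theme, layers in journey.items():
        done = layers["completion"]["completed"]
        assigned = layers["completion"]["assigned"]
        print(f"{theme}: {done / assigned:.0%} complete, "
              f"avg score {layers['assessment_avg']:.0%}, "
              f"{layers['job_metric']['name']} {layers['job_metric']['value']}")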

Your internal technology team or an IT vendor who knows your LMS is your best bet to design and build the dashboard you need. Since business needs change frequently, we recommend planning for annual updates to ensure that your dashboard reports always support current priorities.

4. Measuring the cost/benefit of a learning program

Learning leaders and instructional designers typically focus on training quality. Business partners, who are responsible for results, are more inclined to focus on the impact of learning. Designing, delivering, and maintaining learning programs requires ongoing time, effort, and funding. When training programs fail to deliver the expected results, revenue targets can be missed. If learning costs run over budget, entire programs can be placed at risk of cancellation, further limiting an organization’s potential to grow.

Solution: Calculate the ROI of learning programs

ROI, or return on investment, measures the benefit gained from a planned expenditure relative to its cost. In the past, calculating a learning ROI may have seemed daunting because of the number of variables involved in creating a single unit of learning. However, today’s successful organizations have become adept at identifying and capturing the relevant quantitative data points.

A simple ROI

A basic learning ROI divides the net benefit that can be directly attributed to a learning program by the total cost of the program. The net benefit – or return – can be expressed as:

  • A financial measurement, such as increased profits; or
  • A performance-based ROI.
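
In code, the financial version reduces to a single line. The dollar figures below are purely illustrative:

    def learning_roi(total_benefit: float, total_cost: float) -> float:
        """Net benefit attributable to the program, divided by its total cost."""
        return (total_benefit - total_cost) / total_cost

    # Illustrative: a program costing $50,000 that drives $120,000 in attributable profit.
    print(f"ROI: {learning_roi(120_000, 50_000):.0%}")  # ROI: 140%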

To calculate a performance-based return on your learning investment, compare the increase in customers attributable to a learning program against annual goals. If the number of new customers gained exceeds your goal amount, you have a positive ROI; any number less than your goal is a negative ROI.

You can take the performance-based ROI a step further and divide the net profit generated by new customers by the number of new customers to see the average profit per new client.
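
A quick sketch of both calculations, with invented numbers:

    # Performance-based view (illustrative figures).
    new_customers = 130            # gained during the program period
    goal = 100                     # annual goal
    net_profit_from_new = 260_000  # attributable to the new customers

    surplus = new_customers - goal
    print("Positive ROI" if surplus > 0 else "Negative ROI", f"({surplus:+d} vs. goal)")

    # Average profit per new client: profit divided by customer count.
    print(f"Average profit per new client: ${net_profit_from_new / new_customers:,.2f}")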

Learning program costs

To see the true cost/benefit of learning, you’ll need to measure the cost of all of the inputs needed to design, deliver, and maintain a learning program. Learning program costs are generally a combination of labor, technology, and fees, and typically include the following (a simple roll-up sketch follows the list):

  • The number of hours of work contributed by learning team members to create each learning deliverable. Work hours include not only the actual time spent writing and editing the learning materials, but also project meetings, research, communication, content uploading, testing, and revisions.
  • The total number of hours of work contributed by Subject Matter Experts (SMEs), Stakeholders, and business partners for project meetings, content reviews, and content approvals.
  • A percentage of the direct costs of learning technologies (e.g., software programs, licensing fees, subscriptions, file storage) required to create, deploy, and manage the learning program.
  • Labor costs per hour for all learning team members and business partners involved in the creation or approval of learning materials.
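
Here is a minimal roll-up of those inputs; every hour count, rate, and percentage below is an invented placeholder:

    # Illustrative cost roll-up for a single learning program.
    learning_team_hours = 320     # writing, editing, meetings, research, testing, revisions
    sme_hours = 60                # SME, stakeholder, and business partner time
    learning_rate = 55.0          # blended hourly labor cost, learning team
    sme_rate = 80.0               # blended hourly labor cost, SMEs and partners

    annual_tech_costs = 24_000.0  # software, licenses, subscriptions, storage
    program_share = 0.10          # share of technology costs attributed to this program

    total_cost = (learning_team_hours * learning_rate
                  + sme_hours * sme_rate
                  + annual_tech_costs * program_share)
    print(f"Total program cost: ${total_cost:,.2f}")  # $24,800.00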

Learning program benefits

Learning programs can benefit an organization by providing increases in both revenues and productivity. Calculating revenue increases is a fairly straightforward task. Your Finance Department can often provide direct revenue figures for specific programs.

Productivity increases may not show up immediately as a revenue gain; however, as your employees grow their overall skills and competencies, you may notice improvements such as revenue growth, better client retention, higher client satisfaction, and a shorter sales cycle. Be sure to include the revenue increases or cost savings attached to each of these data points as a learning program benefit.

Seeing your results

In most cases, you can work with your LMS administrator to design a simple ROI spreadsheet to calculate basic costs or productivity measures. You can set this type of report to run on a monthly basis, then track the cost/benefit data over time.
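
If a spreadsheet export isn’t available, even a small script can keep the monthly log. This sketch appends one row per month to a CSV file; the figures and file name are illustrative:

    import csv
    import os

    # One illustrative month of cost/benefit data.
    row = {"month": "2019-06", "cost": 4_100.0, "benefit": 9_500.0}
    row["net"] = row["benefit"] - row["cost"]

    path = "learning_roi_tracker.csv"
    write_header = not os.path.exists(path)  # new file: write the header once

    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["month", "cost", "benefit", "net"])
        if write_header:
            writer.writeheader()
        writer.writerow(row)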

Your internal technology team should be able to work with you to gather internal data for a detailed program-level ROI. A vendor partner who is knowledgeable about your LMS can work with you to design and build a learning portal or dashboard to display your learning program ROI.

A Final Thought on LMS Challenges

In 2013, a new protocol for tracking learning activities was introduced. The Experience API (commonly known as xAPI or Tin Can) allows organizations to track learning activities that happen in any context, both inside and outside of the LMS. With xAPI, an organization can measure not only formal, LMS-generated learning activity, but also the informal learning events estimated to make up about 70% of all workplace learning.

Some learning leaders consider xAPI to be the proverbial elephant in the room. Most “compliant” legacy LMSs are configured to gather only SCORM data. Taking one or two extra steps to post-process information from end users will pay off down the line by giving you a more complete picture of your overall organizational learning.

A key benefit

xAPI takes reporting capabilities to a new level. With xAPI, you can receive detailed test results, including question-level reporting. The protocol enables you to track learning events completed offline, as well as non-traditional learning activities, games, simulations, informal events, and real-world performance results. xAPI learning results can be reported in the form of a dashboard, fed into learning analytics, or recognized with an open badge.
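
Once statements are flowing into a Learning Record Store (LRS), pulling them back out is a single authenticated HTTP call. The endpoint and credentials below are placeholders, but the statements resource, the version header, and the "verb" and "limit" parameters come from the xAPI specification:

    import requests  # third-party: pip install requests

    LRS = "https://lrs.example.com/xapi"  # illustrative LRS endpoint

    resp = requests.get(
        f"{LRS}/statements",
        params={"verb": "http://adlnet.gov/expapi/verbs/answered", "limit": 50},
        headers={"X-Experience-API-Version": "1.0.3"},
        auth=("lrs_user", "lrs_password"),  # placeholder credentials
    )
    resp.raise_for_status()

    # The statements resource returns a StatementResult object with a "statements" array.
    for stmt in resp.json()["statements"]:
        result = stmt.get("result", {})
        print(stmt["actor"].get("name", "unknown"), result.get("success"))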

Organizations should address xAPI needs during their initial LMS setup in order to take advantage of all of the benefits the protocol can bring to a workplace. If your LMS is already in place, reach out to your Account Executive to learn how to reconfigure or upgrade your current capabilities.

To see how advanced reporting can really save you time, money and frustration, request a demo today.