The work described in this case study was performed by LEO Learning prior to becoming part of GP Strategies.
The Defence Academy, alongside the majority of the UK’s Ministry of Defence (MoD), makes extensive use of the Defence Learning Environment (DLE). The DLE is a shared environment used by a variety of MoD departments, providing learning to over 300,000 people within the UK’s military and civil service.
The DLE consists of Moodle (the learning interface), a Content Management System (CMS), a Learning Records System (LRS), and a learner content store.
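For context, a Learning Records System of this kind typically stores learning activity as xAPI statements. The sketch below (in Python, using the requests library) shows how a single completion event might be posted to an LRS; the endpoint, credentials, and course identifiers are illustrative assumptions rather than details of the DLE itself.

```python
import requests

# Hypothetical LRS endpoint and credentials, for illustration only.
LRS_ENDPOINT = "https://lrs.example.mod.uk/xapi/statements"
AUTH = ("client_key", "client_secret")

# A minimal xAPI statement: who did what to which activity.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.mod.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://dle.example.mod.uk/course/health-and-safety",
        "definition": {"name": {"en-GB": "Health and Safety Awareness"}},
    },
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
    timeout=10,
)
response.raise_for_status()
# A successful POST returns a list containing the stored statement ID.
print("Statement stored with ID:", response.json()[0])
```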
Challenge
The DLE is a widely used resource for learning in the MoD, so it’s essential that it is continuously developed and enhanced to keep pace with the latest learning innovations, as well as with learner needs and the requirements of course owners and instructors.
A number of challenges arose across the organization’s learning environments that needed attention, but meeting requirements and ambitions from across the MoD was a complex process. On top of this, many of the learning and administrative systems didn’t communicate effectively with each other.
The five key challenges the MoD faced were:
- Learner feedback was gathered inconsistently
- Quality assurance for training throughout the DLE needed to be unified
- As the MoD’s learning technology suite expanded, greater access to measurement, analytics, and data was needed
- Following work on The Bridge—an area of the DLE to help bring previously gated digital content to other staff who may benefit from it—learners needed easier access to an even wider range of training on their systems
- Scheduling mandatory training, both face-to-face and digital, was managed using old technology no longer supported by MoD systems
Solution
The Ministry of Defence’s DLE is the platform responsible for delivering Professional Military Education (PME). The way this content was delivered and measured, and in some cases the content itself, needed to be updated to improve the learning experience. Technological solutions were therefore provided for each of the challenges listed above.
Standardizing Learner Feedback
At the MoD, measuring the effectiveness of learning is a requirement. However, this was historically managed inconsistently, so standardized questionnaires were created that can now be added to any course, with all learner responses collected via the DLE.
These include mandatory and optional questions, as well as the ability to create custom questions specific to any one course. This has made data collection significantly more efficient and provided the MoD with greater access to data for comprehensive analysis and understanding of each course’s effectiveness.
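As an illustration only, and not the DLE’s actual implementation, the sketch below models a course questionnaire built from a mandatory core plus course-specific custom questions, with responses collected centrally:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List


class QuestionType(Enum):
    MANDATORY = "mandatory"   # asked on every course
    OPTIONAL = "optional"     # standard question a course owner may include
    CUSTOM = "custom"         # written for one specific course


@dataclass
class Question:
    text: str
    qtype: QuestionType


@dataclass
class CourseQuestionnaire:
    course_id: str
    questions: List[Question] = field(default_factory=list)
    responses: List[Dict[str, str]] = field(default_factory=list)

    def add_custom_question(self, text: str) -> None:
        self.questions.append(Question(text, QuestionType.CUSTOM))

    def record_response(self, learner_id: str, answers: Dict[str, str]) -> None:
        # Every submission is stored centrally so feedback can be
        # aggregated and compared across courses.
        self.responses.append({"learner": learner_id, **answers})


# A shared core of mandatory questions applied to every course (hypothetical wording).
CORE_QUESTIONS = [
    Question("How relevant was this course to your role?", QuestionType.MANDATORY),
    Question("How would you rate the overall quality?", QuestionType.MANDATORY),
]

survey = CourseQuestionnaire("cyber-awareness-101", questions=list(CORE_QUESTIONS))
survey.add_custom_question("Was the phishing simulation realistic?")
survey.record_response("learner-042", {"q1": "Very relevant", "q2": "4/5"})
```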
Improving Quality Assurance (QA) Processes
Prompted by a need to standardize Quality Assurance (QA) processes for all course materials, the Quality Rubric was created. It provides prompts across a variety of categories, including accessibility, content quality, evaluation, course navigation, and interactivity, and captures data once courses have been created. This ensures all course creators make the required decisions and provides an audit trail for QA. The rubric has been automated as an app and ensures compliance with MoD quality requirements.
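A minimal sketch of how such a rubric and its audit trail could be represented is shown below; the category names follow the case study, but the data structures and field names are assumptions for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List

# Rubric categories named in the case study.
RUBRIC_CATEGORIES = [
    "accessibility",
    "content_quality",
    "evaluation",
    "course_navigation",
    "interactivity",
]


@dataclass
class RubricEntry:
    category: str
    decision: str      # what the course creator decided to do
    reviewer: str
    timestamp: str


def record_rubric_decision(audit_trail: List[RubricEntry],
                           category: str, decision: str, reviewer: str) -> None:
    """Append a timestamped, attributable QA decision to the audit trail."""
    if category not in RUBRIC_CATEGORIES:
        raise ValueError(f"Unknown rubric category: {category}")
    audit_trail.append(
        RubricEntry(category, decision, reviewer,
                    datetime.now(timezone.utc).isoformat())
    )


trail: List[RubricEntry] = []
record_rubric_decision(trail, "accessibility",
                       "All videos captioned; colour contrast checked", "j.smith")
record_rubric_decision(trail, "interactivity",
                       "Added branching scenario in module 2", "j.smith")
```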
Broadening Analysis and Measurement
Now that the MoD has expanded its learning programs to include social learning, virtual reality, games in learning, and cross-device learning access, there’s a greater need for accurate analysis and broader measurement across the DLE.
Learning designers now use a Method and Media Analysis Tool to evaluate best practices and analyze the content created. The output can then be used for cost-based decisions on training as well as to inform future content design.
Making Content Searchable and Available at Point-of-Need
The DLE has been enhanced to allow courses and content to be categorized using metadata and tags. This means that learners can easily search for content, creating a more effective learning experience and allowing for point-of-need learning.
Having training that is searchable by topic and content can greatly increase engagement with both the content and the learning environment it sits in. This also allows learners to make the most of all of the new content available on the platform following work on The Bridge program earlier this year.
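To illustrate the idea, the sketch below shows a simple tag-based search over a course catalogue; the tags, course titles, and ranking rule are hypothetical and not taken from the DLE:

```python
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class CourseItem:
    title: str
    tags: Set[str] = field(default_factory=set)


def search_by_tags(catalogue: List[CourseItem], query_tags: Set[str]) -> List[CourseItem]:
    """Return items whose tags overlap the query, most matches first."""
    matches = [(len(item.tags & query_tags), item)
               for item in catalogue if item.tags & query_tags]
    return [item for _, item in sorted(matches, key=lambda m: m[0], reverse=True)]


catalogue = [
    CourseItem("Radio Procedures Refresher", {"communications", "refresher"}),
    CourseItem("Cyber Hygiene Basics", {"cyber", "security", "mandatory"}),
    CourseItem("Introduction to Cyber Threat Analysis", {"cyber", "analysis"}),
]

for item in search_by_tags(catalogue, {"cyber"}):
    print(item.title)
```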
Managing Mandatory Training
The previous approach was inefficient and out of date, so a new system was introduced for mandatory training that stores all of the data in one place, allowing for automated updates, reporting, and tracking of completion rates. All of this greatly eases the administrative burden and allows the L&D team to focus their efforts elsewhere.
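As a rough illustration of the kind of reporting this enables, the sketch below computes a completion rate for a mandatory course from centrally stored records; the record structure and course names are assumptions, not the new system’s actual schema:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TrainingRecord:
    learner_id: str
    course_id: str
    completed: bool


def completion_rate(records: List[TrainingRecord], course_id: str) -> float:
    """Percentage of assigned learners who have completed a mandatory course."""
    assigned = [r for r in records if r.course_id == course_id]
    if not assigned:
        return 0.0
    completed = sum(1 for r in assigned if r.completed)
    return 100.0 * completed / len(assigned)


records = [
    TrainingRecord("l-001", "fire-safety", True),
    TrainingRecord("l-002", "fire-safety", False),
    TrainingRecord("l-003", "fire-safety", True),
]
print(f"fire-safety completion: {completion_rate(records, 'fire-safety'):.0f}%")
```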
The LEO team engaged a wide range of stakeholders and brought a fresh and interesting approach to the Academy. Their in-depth knowledge of learning delivery and management, and their excellent facilitation and engagement styles, helped us to understand our Measurement Information (MI) requirements and opportunities.
John Owens, DLE Service Owner
Results
- Improved learning analytics – analytics have been enhanced by content metadata, consistent learner feedback, and audit trails throughout QA and content creation.
- Reduced waste – these enhancements have helped to identify duplicated content, increase the flexibility of learning consumption, and direct instructors toward the most urgent requirements.
- Increased quality – improved measurement and reporting on content consumption has led to increased content quality and the adoption of new learner experiences.
- Cost-saving and process efficiencies – duplicate content has been identified and removed, and replacing manual processes with automation has saved time and money.
- Improved decision making – the tools implemented have embedded best practice into learning design, encouraging better decision making in both the design and auditing processes.
As part of the project, they created a proof of concept solution and also created a range of very useful eLearning content, which we have already used across a range of Defence courses. The project was aimed at the Academy, but has subsequently proven valuable for a much wider Defence audience.
John Owens, DLE Service Owner