Björn Rohles rohles.net

The Human-centered Training Management Cycle
Training digital skills with the human in mind

Last update: Reading time: 7 minutes Tags: Human-centered design, Training Management, Digital skills

While training digital skills is vital for organisations and society, education still ultimately serves a deeply human purpose. The human-centered training management cycle ensures that human training needs are always kept in focus.

Based on the human-centered design approach, adapted from the CPUX-F curriculum, this article outlines a suggestion for a human-centered training management cycle. The figure below visualises the cycle, and this article walks through its parts and phases step by step.

The human-centered training management cycle. At the beginning, there is a strategy, outlining vision, objectives and principles of training management. It is connected to the cycle itself, which could start at any point. One phase is exploration and analysis, where the audience and their needs are understood. The next is defining requirements, which includes content, activities and learning objectives. Based on this, course materials are created. Finally, the course is evaluated against the requirements, in terms of both experience and outcomes. From this evaluation, arrows lead back to earlier phases, or alternatively towards repeating the course, always incorporating feedback in a loop. It is also possible to sunset the course, meaning it is stopped.
Human-centered Training Management Cycle

Vision and strategy: the guiding north star for training management

The basis for training management is an appropriate strategy. For digital skills, such a strategy necessarily has to address technology, but it must also make sure that the technology delivers real value for people. It should also include an inspiring vision, motivating us to work continuously towards an outlined goal that is worth achieving. Strategy is the guiding light that impacts all training management, like a north star that is always visible and provides orientation in the night sky.

Flexible start, iterative improvements

Let’s transition to the human-centered training management cycle itself, and first things first: the cycle is not a linear process, but a highly iterative one. This means not only that the course content is continuously adapted based on evaluation results, but also that the process can start at different points.

In the image above, the grey circular arrows surrounding the process visualise this: speaking metaphorically, you could rotate the entire cycle around its central axis. You might then, for example, start with an existing course and do an evaluation. Based on this feedback, you might need to do more user research to better understand your audience, or work on enhanced content.
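To make this flexibility more tangible, here is a minimal sketch in Python. The phase names and the transition table are my own shorthand for the figure above, not part of any official model; the point is simply that the cycle has a default order but can be entered at any phase.

```python
from enum import Enum

class Phase(Enum):
    """Core phases of the cycle, as shown in the figure above."""
    EXPLORE_AND_ANALYSE = "understand the audience and their needs"
    DEFINE_REQUIREMENTS = "define content, activities and learning objectives"
    CREATE_MATERIALS = "create the course materials"
    EVALUATE = "evaluate experience and outcomes against the requirements"

# Default order of the cycle; evaluation loops back to the start,
# although in practice it can also jump back to any earlier phase.
NEXT_PHASE = {
    Phase.EXPLORE_AND_ANALYSE: Phase.DEFINE_REQUIREMENTS,
    Phase.DEFINE_REQUIREMENTS: Phase.CREATE_MATERIALS,
    Phase.CREATE_MATERIALS: Phase.EVALUATE,
    Phase.EVALUATE: Phase.EXPLORE_AND_ANALYSE,
}

def walk_cycle(start: Phase, steps: int = 4) -> None:
    """Walk through the cycle, starting from any phase."""
    phase = start
    for _ in range(steps):
        print(f"{phase.name}: {phase.value}")
        phase = NEXT_PHASE[phase]

# An existing course can enter the cycle at the evaluation phase:
walk_cycle(start=Phase.EVALUATE)
```

Starting with the evaluation phase corresponds to the example above: evaluating an existing course first, and letting the results drive further research or content work.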

Iterative phases of human-centered training management

Within the cycle, four core phases outline the key activities of human-centered training management. In practice, they sometimes run in parallel, but let’s walk through them step by step.

Deep understanding of the audience as a base

As mentioned in an earlier article, I am convinced that we cannot really be human-centered without a solid understanding of our audience. Such an understanding is necessarily based on data. For training management, this means that there needs to be user research about the training needs and training preferences of our learners and trainers. It also means that it is useful for a training manager to be close to these people, for example by attending courses or even teaching courses oneself. In the end, we cannot create meaningful courses by limiting ourselves to sitting in an office. We need to be where the training is happening, and we need to know the people for whom it is intended.

Cards with the human needs “autonomy” and “competence” are visible
User research can identify needs related to training, for example with the human needs cards depicted here

Solid requirements based on learning objectives

Based on data about training needs, we can define the concept of a course. This includes the content and activities (such as discussions, peer evaluations, and lectures), but also the learning objectives. Writing learning objectives is not easy, but here are some key characteristics of good learning objectives:

  • Writing learning objectives typically starts with a phrase like “Upon completion of the course, the learner should be able to…”.
  • Learning objectives should use an active verb. These verbs should cover a range of different mental activities to keep the content engaging. A good approach is to use Bloom's taxonomy for describing learning objectives.
  • Learning objectives can apply to the course or session level.
  • There should not be too many learning objectives. A typical rule of thumb is to have 4 to 6 learning objectives for each meaningful content block.
  • Learning objectives should be formulated based on the needs of learners. Many trainers write learning objectives like “understand this particular content”, but this is often not the true purpose of learning something. As a learner, I typically do not wake up in the morning saying “I want to understand Python”, but understanding Python is a requirement for doing something, like “I want to have a feedback dashboard which automatically updates when new survey responses arrive”.

A good learning objective is connected to an evaluation. For example, when my learning objective is to “understand advantages and disadvantages of digital technology”, a good assessment would be to suggest appropriate technologies for a given case study. This assessment allows us to check whether a learner has understood digital technologies and is able to make an informed suggestion based on the context.
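As an illustration, a learning objective could be captured together with its assessment in a simple record. This is only a sketch with hypothetical field names, not a prescribed format; it just makes the pairing of objective and evaluation explicit.

```python
from dataclasses import dataclass

@dataclass
class LearningObjective:
    """A learning objective paired with the assessment that verifies it."""
    statement: str    # "Upon completion of the course, the learner should be able to ..."
    bloom_level: str  # active verb category from Bloom's taxonomy, e.g. "apply", "evaluate"
    scope: str        # "course" or "session"
    assessment: str   # how we check whether the objective has been reached

objective = LearningObjective(
    statement=("Upon completion of the course, the learner should be able to "
               "suggest appropriate digital technologies for a given case study."),
    bloom_level="evaluate",
    scope="course",
    assessment="Case study: justify a technology suggestion for a concrete scenario.",
)
print(objective.statement)
```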

Collaborative course content creation

The requirements are the foundation for creating the course content and delivering the course. Learning objectives help a lot in doing this: when we know a specific learning objective, we know that there have to be activities related to it. Content creators incorporate good practices and insights from educational fields, such as instructional design or assessment. There is also a multitude of open educational resources (OERs) available that could be included in a course. As training managers, our role is often to collaborate with the content creators, making sure that their creations adequately address the learners’ needs.

Constant evaluation

A key mantra of being human-centered is to rely on data from humans to verify whether requirements are fulfilled, which makes evaluation a vital part of the process. For human-centered training management, evaluation covers two aspects:

First, the learning experience of trainers and learners, meaning whether they had positive experiences throughout the course. Often, this learning experience overlaps with other types of experience. For example, when course participants have issues accessing course materials, this might be a classic case of user experience (UX), but it also impacts their experience as learners.

Second, the learning objectives, meaning whether learners are able to understand or do what they wanted to achieve by signing up for the training. On the one hand, this covers the learners themselves, but it could also matter to other stakeholders, for example managers who send their employees to a training. In addition, this has multiple temporal dimensions: some learning objectives refer to longer-term impacts, such as being able to realise a project.

Based on the evaluation, we might see four different kinds of results, each leading to a different next step (see the sketch after this list):

  • It could be that some requirements are not fulfilled. We would then need to go back to the content creation phase and adapt our materials. The adapted materials will then be tested again.
  • It could also be that some requirements turn out to be invalid, or that our requirements are incomplete. We would use these insights to update the requirements, for example by reformulating learning objectives or adding missing contents.
  • Sometimes, our tests might reveal that our understanding of the audience is not good enough. For example, we might have recruited a group of university students as test participants, assuming that they have similar training needs. However, in our tests, the results from some of these students turn out to be very different from the others. This could mean that there is not one learner group “university students”, but that we need another categorisation to meaningfully represent our audience. Such a result requires additional user research for verification.
  • Alternatively, we might find that a course is “good enough”. At this point, we would move on to the next phase of a course lifecycle.
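These four outcomes can be read as a simple routing decision back into the cycle. The following sketch uses hypothetical outcome labels of my own; it merely spells out which phase each result leads back to.

```python
# Hypothetical outcome labels; each maps to the phase the cycle returns to.
EVALUATION_ROUTES = {
    "requirements_not_fulfilled": "create materials (adapt and re-test)",
    "requirements_invalid_or_incomplete": "define requirements (update objectives and content)",
    "audience_not_understood": "exploration and analysis (additional user research)",
    "good_enough": "transition into production (reschedule, keep the feedback loop)",
}

def next_step(outcome: str) -> str:
    """Return the phase that an evaluation outcome leads back to."""
    return EVALUATION_ROUTES[outcome]

print(next_step("audience_not_understood"))
```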

Transition into production, but always with close feedback

The human-centered training management cycle has a strong conceptual and creative focus, so it mostly applies to the early stages in the lifecycle of a course. At some point, when the requirements are met (both in terms of experience and outcomes), we might simply schedule the course again. However, even in this phase, there should be a close feedback loop with constant evaluation. As soon as the need arises, the human-centered training management cycle starts again. And when the feedback reveals that the original training need no longer exists, the course is faded out.

Conclusion

Human-centered design is a flexible process for creating products and services while continuously involving the audience. We can apply this concept to training management, resulting in the human-centered training management cycle outlined here. This approach ensures that the human audience is constantly kept at the center of course creation, even when the topic is all about digital technology.