Distance Learning: Continuous Improvement Part 2
Implementing Continuous Improvement in Familiarization Training
Continuous Improvement (CI) is the critical process of continually analyzing the performance of some aspect of operations, and then applying changes intended to improve that performance. This is the second in a series of articles intended to introduce practical tips for using CI to improve training effectiveness and efficiency in maritime organizations. Implementing a modest CI process for your in-house training is neither expensive nor difficult, and even a small program can produce a tremendous ROI in safety, efficiency, trainer engagement and trainee satisfaction.
This topic is especially timely right now. Vessels, equipment and job routines in the maritime industry continue to become more and more complicated and sophisticated. As a result, deeper knowledge and more specialized skills are required to operate safely. A program of continuous improvement for operational training is a necessary tool in the effort to close this gap.
In the first article of this series, published in the January edition of Maritime Reporter & Engineering News (http://digitalmagazines.marinelink.com/nwm/MaritimeReporter/201601/), we introduced the application of CI to your organization’s maritime job and familiarization training. In this article, we continue this series by talking about specific key performance indicators which can be used to measure training performance in your maritime organization.
Key Performance Indicators
Key Performance Indicators (KPIs) are the foundation of any CI process. KPIs are measurements used to evaluate effectiveness and efficiency. This is critical; as is often said, "if we can't measure it, we can't manage it". And while some people may be reluctant to gather training metrics for fear of what they may tell us, the only thing scarier than discovering unsafe training metrics is not collecting those KPIs in the first place. KPIs should minimally satisfy the following requirements:
● They must be aligned with corporate goals.
● They should react reasonably quickly to changes in training.
● They must track something you have some control over.
KPIs for Maritime Training
There are many KPIs which we can track to evaluate changes we make to familiarization training as part of our CI process. These fall generally into four categories:
1. Ask the trainee
2. Ask the trainer
3. Evaluate the trainee
4. Record performance
Let’s look at some KPIs in each of these categories and discuss how they can help us improve training performance.
1. Ask the Trainee
Our students can be an excellent source of information. The signal-to-noise ratio can sometimes be quite low, but there will be good data there.
Probably the most common and obvious technique is to have the trainees fill out an evaluation of the trainer and the familiarization process. This evaluation can be used:
● by the trainer to help improve his or her technique.
● by training administrators for hiring and advancement, for optimizing train-the-trainer programs and to improve training practices and resources.
This is typically done at the end of training. But as a university faculty member many years ago, I found it useful to hand out my own evaluation form halfway through the term. This benefited me, as I received feedback on a more regular basis. It also benefited the students (trainees), because the changes I made in response helped the very students who provided the feedback, not only the next year's group of students.
Another useful, though rarely applied, technique is to reevaluate the training 6 or 12 months after the completion of the training event. Once a seafarer has gained some on-the-job experience and had some time to put the knowledge and skills into practice, he or she may be able to provide a more informed and therefore more useful evaluation of the training. With the benefit of that experience they will be more able to comment on elements of training they feel were missing or were unnecessary. This can be very useful information.
2. Ask the Trainer
While many training organizations do collect some form of trainee feedback, it is much less common to gather trainer feedback in a structured and tracked way.
Trainers can be asked to comment on a variety of useful metrics including the preparedness of the trainees, the appropriateness of the training duration, the quality of the materials used to support the trainees during familiarization, the quality and appropriateness of the assessment techniques, and so on. This last question regarding assessments is a very important one. At BC Ferries, where I am involved as the architect of the learning management system which supports their familiarization training, we find we receive a great deal of feedback on the individual questions which are used to evaluate trainees. This feedback is critical in the process of continually refining the assessments to ensure that trainees know what they need to know, at the level required.
Another useful technique is to ask the trainers to perform self-evaluations. Ask if they feel they performed well during that training experience, ask what they could have done better, what new techniques they tried and found to be particularly successful, and what support they require to deliver a better experience next time. The additional subtle, but extremely important, benefit of this self-evaluation is that it causes the trainer to reflect on their training process and performance. It makes him or her a partner in training improvement. Otherwise, you are wasting a valuable thought resource because the trainer simply "works IN training", and is not as likely to "work ON training" - a key distinction.
3. Evaluate the Trainee
Virtually all organizations have a process in place to evaluate candidates at the end of training. The results of these evaluations can be used as a KPI.
A few points on this. First, it is a good idea (and a common technique) to perform assessments not only at the end of training to evaluate the candidate, but also at the beginning of training to evaluate and improve the effectiveness of the training itself. There are several reasons for this pre-assessment. To begin with, an evaluation conducted at the start of training tells us what the candidate "does not know" in addition to what they do know. By comparing that to the results of the evaluation conducted at the end of training, we can determine how successful the training was, even with candidates of varying degrees of knowledge and aptitude. Secondly, a pre-assessment also helps improve hiring practices and pre-familiarization self-study materials. Finally, pre-assessment results can be used by the trainer to tune his or her program to the candidates at hand, and to ensure that critical gaps in knowledge are covered.
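One common way to summarize this pre/post comparison is the normalized learning gain: the fraction of the possible improvement each candidate actually achieved, which lets cohorts with very different starting knowledge be compared fairly. The sketch below is illustrative Python only; the score pairs and the 0-100 scale are assumptions, not data from any real program.

```python
# Illustrative sketch: comparing pre- and post-training assessment
# scores to estimate training effectiveness independent of the
# candidate's starting knowledge. All scores here are hypothetical.

def normalized_gain(pre, post, max_score=100):
    """Fraction of the possible improvement actually achieved.
    1.0 means the candidate closed the entire knowledge gap;
    0.0 means no measurable change."""
    room = max_score - pre
    if room == 0:
        return 1.0  # already at the ceiling before training
    return (post - pre) / room

# Example cohort: (pre-assessment, post-assessment) score pairs
cohort = [(40, 85), (70, 90), (55, 80)]

gains = [normalized_gain(pre, post) for pre, post in cohort]
average_gain = sum(gains) / len(gains)
print(f"Average normalized gain: {average_gain:.2f}")  # prints 0.66
```

Tracked over successive familiarization cohorts, a drifting average gain is a far earlier warning than waiting for operational incidents.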
Another excellent technique is to re-evaluate the candidate after some passage of time since the training event. This provides information on how well knowledge is retained over time. If critical knowledge is lost, then the training program can be altered to reinforce the maintenance of this knowledge and to recommend further training to individual candidates.
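One simple way to turn such a delayed re-evaluation into a trackable KPI is a retention ratio: the follow-up score as a fraction of the end-of-training score. The Python sketch below is a minimal illustration; the names, scores, and the 85% refresher threshold are all hypothetical assumptions.

```python
# Illustrative sketch: a knowledge-retention KPI comparing a
# follow-up assessment (6-12 months later) against the
# end-of-training result. All values are hypothetical.

def retention_ratio(end_score, followup_score):
    """Fraction of end-of-training knowledge still demonstrated later."""
    return followup_score / end_score

# candidate -> (end-of-training score, follow-up score)
candidates = {"Candidate A": (90, 72), "Candidate B": (85, 80)}

for name, (end, later) in candidates.items():
    ratio = retention_ratio(end, later)
    flag = "  <- recommend refresher training" if ratio < 0.85 else ""
    print(f"{name}: retained {ratio:.0%}{flag}")
```

If the same material is consistently flagged across candidates, that points at the training program; if only individuals are flagged, it points at targeted refresher training.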
Finally, although all of these metrics are very useful, use caution in applying them without analysis and thought. Assessment outcomes can vary significantly with the method of assessment, with differences in the assessors, with differences in the candidates, and with the age of the assessment itself, just to name a few. So always think carefully about whether (or how much) a KPI is being influenced by actual effectiveness, or by the method of gathering the KPI itself.
4. Record Performance
Every organization has a system in place for measuring key aspects of operational performance. These can be mined for information that is directly related to your training performance. Any KPI that can be tied to a training event can be used.
One simple example is a KPI which measures the duration of familiarization training. Some organizations do not have fixed training agendas and vary the number of days of familiarization according to the candidate. This can be mined for very useful data. For example - has the number of days crept up over the years? If so – has performance improved alongside? Does duration vary according to the trainer? According to the number of trainees? According to the vessel? These are all important questions whose answers can provide insightful clues on how to improve outcomes.
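Questions like these require very little tooling to answer once duration records are tied to trainer, vessel, and year. The following Python sketch is illustrative only; the record fields and values are hypothetical, standing in for whatever your training records actually capture.

```python
# Illustrative sketch: mining familiarization-duration records for
# trends by trainer, vessel, or year. The records are hypothetical.
from collections import defaultdict

records = [
    {"year": 2013, "trainer": "A", "vessel": "Ferry 1", "days": 5},
    {"year": 2014, "trainer": "A", "vessel": "Ferry 1", "days": 6},
    {"year": 2014, "trainer": "B", "vessel": "Ferry 2", "days": 8},
    {"year": 2015, "trainer": "B", "vessel": "Ferry 1", "days": 7},
]

def average_days_by(records, key):
    """Average familiarization duration grouped by an arbitrary field."""
    totals = defaultdict(lambda: [0, 0])  # group -> [sum of days, count]
    for r in records:
        totals[r[key]][0] += r["days"]
        totals[r[key]][1] += 1
    return {group: s / n for group, (s, n) in totals.items()}

print(average_days_by(records, "trainer"))  # does duration vary by trainer?
print(average_days_by(records, "year"))     # has it crept up over the years?
```

The same grouping can be run against the "vessel" field, or cross-referenced with post-training assessment results to ask whether longer familiarization actually produces better outcomes.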
Another example is the mining of loss or safety-related incidents. Statistics on these should be visited regularly in order to uncover common and recurring issues which can be addressed through a modification in training. Likewise for other operational metrics such as ferry loading and turnaround times, customer service complaints, and so on. There is a wealth of information that is already being collected (or can easily be collected) that can help improve training, and therefore safety and operational efficiency.
LMS-enabled KPIs for Maritime Familiarization Training
All of the KPIs discussed above can be collected and analyzed manually. In addition, organizations that use a learning management system (LMS) to support training have access to a variety of deeper KPIs which can provide insight that is simply not otherwise available.
In the March issue of Maritime Reporter & Engineering News, the final article in this series will discuss examples of LMS-enabled KPIs and leading indicators which can provide incredible insight into training effectiveness.
Murray Goldberg is CEO of Marine Learning Systems (www.MarineLS.com). An eLearning researcher and developer, his software has been used by 14 million people worldwide.
(As published in the February 2016 edition of Maritime Reporter & Engineering News - http://magazines.marinelink.com/Magazines/MaritimeReporter)