The main purpose of evaluating a training program is to determine whether it has achieved its objectives. Analysing the training event with appropriate evaluation tools can considerably improve the outcome of future trainings. Although evaluation is essential, it must always fit within the available framework of time and cost. Defining the appropriate questions is the key starting point of every evaluation.
Evaluation is an important part of any training event: it allows trainers to reflect on, analyse and improve its effectiveness and efficiency. Evaluation can be defined as follows:
"Evaluation is the collection, analysis and interpretation of information about any aspect of a program of education or training as part of a recognised process of judging its effectiveness, its efficiency and any other outcomes it may have" (ELLINGTON et al. 1993).
Neglecting to make any attempt at evaluation reflects disinterest and a lack of professionalism. Evaluation is a must and therefore an integral part of effective training (FAO 1998) (see also planning a training). The effort put into the design of any evaluation will pay rich dividends, but defining the right questions is always the key starting point. Questions can be defined with varying degrees of precision, but they should always be measurable and answerable within the time and cost frame you actually have (adapted from CROMPTON 1999).
The following key questions should be covered within the evaluation process (adapted from THOMAS 1999):
Evaluation is often considered as taking place at four different levels (the "Kirkpatrick levels"), which are listed below (KIRKPATRICK 1998). The higher the level reached, the more valid the evaluation.
Even though reaching level 4 is the most desired result from an evaluation process, it is usually the most difficult to accomplish. Evaluating effectiveness often involves the use of key performance measures.
(Adapted from GREENAWAY 1999)
Be selective! Do not hand learners a huge list of questions. Work out what you really want to know and the best way of finding it out.
Be realistic! Form-filling is never fun, so do not expect people to conscientiously work their way through a long and complex evaluation form.
Be creative! Why not evaluate with an activity that is itself engaging and enjoyable? Create evaluative processes that engage participants and at the same time provide you with valid feedback.
Be balanced! You may develop a standardised evaluation process in order to monitor results over time. However, by asking the same questions, you are always looking at courses from the same perspective. Try to combine a standardised element that allows you to make comparisons over time, with a random/changing element which shows you a new perspective.
Be holistic! After a course in which people have gained a whole range of experiences, it is not realistic to expect anyone to express their true evaluation of it on a piece of paper. Paper exercises can be very useful, but they should be seen as part of a much wider evaluation process that includes dimensions of learning that are less easy to capture on paper.
Training should always incorporate an evaluation process in order to analyse and learn which elements have successfully achieved their objectives and which have failed their purpose. When time and costs do not allow a comprehensive evaluation (i.e. one covering all four levels), the process can cover only the first level. Regarding applicability, consider which techniques and methods are most appropriate for the intended purpose, and keep in mind the advantages and challenges of the chosen tools before applying them in the evaluation process.
ELLINGTON, H.; PERCIVAL, F.; RACE, P. (1993): Handbook of Educational Technology. London: Kogan Page.
FAO (Editor) (1998): Food Quality and Safety Systems. A Training Manual on Food Hygiene and the Hazard Analysis and Critical Control Point (HACCP) System. FAO Agricultural Policy and Economic Development 4. URL [Accessed: 14.03.2011].
KIRKPATRICK, D.L. (1998): Another look at evaluating training programs. Alexandria, VA: American Society for Training & Development.
THOMAS, M. (1999): Evaluation of Training Courses. In: Associate Publication of Asia Pacific Disability Rehabilitation Journal 2, 1.
This guide aims to give an overview of the principles of evaluation and discusses the advantages and disadvantages of specific methods.
This paper is a practical guide for lecturers interested in evaluating materials for their effectiveness in achieving specific learning objectives.
This paper provides the reader with a short overview of the four levels of evaluation published by Kirkpatrick in 1994.
This article highlights the importance of using different approaches for different stakeholders in the evaluation process.
ABUBAKARI, Z.; KUNIMOTO, S.; NEILL, R.; SUTCLIFFE, A.; ZETEK, U. (2013): Approaches and Practices in Monitoring and Evaluation of Capacity Building within the WASH Sector. Group Project Report. Cranfield: Cranfield University and Center for Affordable Water and Sanitation Technology (CAWST). PDF
This report of a group project by the University of Cranfield assesses the effectiveness of Key Performance Indicators (KPIs) in measuring progress along the results chain. The assessment is conducted by analysing different Monitoring and Evaluation (M&E) approaches in the WASH sector and leads to recommendations for CAWST to incorporate into its M&E approach.
This training evaluation form, developed by ISPCAN within their international training programme ITPI, can easily be adapted for your own trainings and allows you to conduct a comprehensive evaluation, helping trainers improve their trainings continuously.
This comprehensive evaluation questionnaire can be used after the end of a training course in order to rate the training course. It allows trainers to collect feedback and improve their trainings.
http://www.docstoc.com/ [Accessed: 14.04.2010]
This is a very basic sample evaluation form and not specified to a particular topic.
http://www.fao.org/ [Accessed: 14.03.2011]
This website contains a training manual on food hygiene and the Hazard Analysis and Critical Control Point (HACCP) system. The first section of this manual explains the basic elements for effective preparation, implementation and evaluation of training programs.
http://www.go2itech.org/ [Accessed: 14.03.2011]
These resources are sample evaluation forms and guides to adapt for your own use. Course summary evaluations, focus group questions, and expert observation tools are included. There is a trainer’s competency checklist and trainer attributes competency self-assessment. These forms can encourage trainers to strengthen their training and communication skills and strive for improvement.
http://www.ifets.info/ [Accessed: 14.03.2011]
The article on this website, written by D. Eseryel in 2002, describes current approaches to the evaluation of training, both in theory and in practice.