The Bank's Learning Agenda
An Evaluation of the Bank Staff Training Program, FY99

Sukai Prom-Jackson
Ray Rist
Sean Biko Sankofa Martin
Tena Malone
Chunnong Zhou
Chika Hayashi

WBI Evaluation Studies Number ES99-43
World Bank Institute
The World Bank
Washington, D.C.

Copyright © 1999 The International Bank for Reconstruction and Development/The World Bank, 1818 H Street, N.W., Washington, D.C. 20433, U.S.A. The World Bank enjoys copyright under protocol 2 of the Universal Copyright Convention. This material may nonetheless be copied for research, educational, or scholarly purposes only in the member countries of The World Bank. Material in this series is subject to revision. The findings, interpretations, and conclusions expressed in this document are entirely those of the author(s) and should not be attributed in any manner to The World Bank, to its affiliated organizations, or to the members of its Board of Directors or the countries they represent. If this document is reproduced or translated, WBI would appreciate a copy.

This report was prepared by a team under the direction of Ray Rist, led by Sukai Prom-Jackson, who together with Sean Biko Sankofa Martin and Tena Malone served as the full-time evaluation team. Part-time team members were Chunnong Zhou and Chika Hayashi. The production staff included Mary Cusick and Janie Ree Stokes. The report incorporates the comments and recommendations made by various stakeholders and beneficiaries of the training/learning function of the Bank; we would like to thank them for their input.

CONTENTS

Executive Summary .......... i
I. Introduction and Background ..........
1
The New Evaluation System .......... 2
FY99 PT and ACS Training Activities .......... 3
II. The Evaluation .......... 5
The Scope .......... 5
The Design .......... 5
Data Sources, Instrumentation, and Performance Standards .......... 6
III. Main Findings .......... 9
Section 1: Are We Doing the Right Things? .......... 9
a. Linkage with the content of Bank performance and business requirements .......... 10
b. Relevance of staff learning needs for job performance and professional growth .......... 13
c. The application of appropriate principles for adult learning and performance improvement .......... 14
Section 2: Are We Doing Things Right? .......... 18
a. Delivery efficiency .......... 18
b. Learning effectiveness .......... 20
c.
Participant satisfaction with learning gains and other benefits from the training .......... 27
Concluding remarks and recommendations .......... 28
Section 3: Are We Ensuring Sustainability? .......... 29
Relationship between cost and quality .......... 30
Cost and quality: summary remarks and questions raised .......... 33
IV. Interpretation and Summary of Recommendations for Improvement .......... 35
Variations in Network Performance .......... 35
a. The stage of program development and the expertise of the training team on design, development and delivery .......... 35
b. The knowledge structure of the training courses and the appropriateness of the formal training delivery modality .......... 36
Inadequacies of a Content Driven Curriculum .......... 37
Planning and curriculum development .......... 37
Instructional technology .......... 37
Performance technology and sustainability .......... 38
ANNEX A: Questions vs. Score Ranges with Old and New Instruments .......... 39
ANNEX B: Quantitative Results Based on Old Questionnaire for Networks and Non-Networks ..........
40
ANNEX C: Bank Corporate Performance Criteria .......... 41
ANNEX D: Content Analysis: Participants as a Learning Community .......... 42
ANNEX E: Type of Follow-up Support Staff Would Like to Have After Training .......... 43

DEFINITIONS

ACS: Administrative and Client Support, or "Level 11-17" staff. The ACS is a network, but is reported on separately throughout the report.
Bank: The World Bank
DEC: Development Economics (Non-Network)
ESSD: Environment and Socially Sustainable Development Network
FPS: Financial Products and Services (Non-Network)
FPSI: Finance, Private Sector and Infrastructure Network
HD: Human Development Network
HR: Human Resources (Non-Network)
HUB: Region-based training of field staff
IT: Information Technology
KLC: Knowledge Learning Council, an institution-wide governance body
Legal or LEG: Legal Department (Non-Network)
Level 1 Evaluation: Portion of this evaluation assessing participant reaction
Level 2 Evaluation: Portion of this evaluation assessing learning gains among participants
MOU: Memorandum of Understanding
Networks or NW: PREM, ESSD, HD, and FPSI
Non-Networks or NN: FPS, Legal, HR, and DEC
OCS: Operational Core Services
Operational Networks: Networks concerned with core Bank business
PREM: Poverty Reduction and Economic Management Network
PT: Professional Technical Training Program, comprising PREM, ESSD, HD and FPSI as networks under the Network and Sector Training Program, and the OCS and Non-Networks under the Core Business Skills Training Program
WBI: The World Bank Institute
WBIES: World Bank Institute Evaluation Unit
WBISD: World Bank Institute Skills Development Group

EXECUTIVE SUMMARY

Introduction

Learning and professional excellence are of great importance for the institutional sustainability of the World Bank in the current era of globalization.
The Bank has thus invested considerable resources in staff learning and professional excellence. This investment has to be carefully planned and evaluated to ensure that the learning programs are relevant, appropriate, effective, and likely to have a sustainable impact on the performance and productivity of Bank staff.

Addressing this goal resulted in the merger of the LLC and EDI to form the WBI. The WBI works in partnership with the Networks and the non-network operational and policy support training groups to plan and carry out the learning agenda. It does this under the governance of the Knowledge Learning Council (KLC). One of the functions of the KLC is the evaluation of the learning programs; it has delegated this task to the WBI Evaluation Unit, WBIES.

WBIES was charged and financed in February 1999 to build a comprehensive system to evaluate Bank staff learning and to begin to operationalize this system by (a) conducting Level 1 evaluations to determine the quality of formal classroom training; and (b) laying the foundation for Level 2 evaluations to assess learning gains based on objective methods.

The training programs selected for the FY99 evaluation were: (i) the Professional and Technical (PT) training programs of the Networks (PREM, ESSD, FPSI and HD), of the OCS, and of the Non-Networks (DEC, Legal, OED, FPS, and HR); (ii) the ACS (or Level 11-17) training program; and (iii) the New Staff Orientation Program.¹

Evaluation Purpose and Strategy

After a period of study and consultation, WBIES established both formative and summative evaluation strategies that are responsive to the information needs of the various stakeholders. It provided course-level evaluation summaries of training quality to training task managers. It synthesized evaluation results across courses to provide program-level information as well as information on Bank-wide performance. This evaluation report is based on this synthesis.
The purpose is to provide information to guide Bank managers in making informed decisions to improve the quality of the Bank staff training program.

¹ A separate evaluation report is provided by WBIES for the New Staff Orientation Program.

This evaluation seeks to determine whether:

1. the training programs are relevant and appropriate and offer the greatest potential value for the Bank (Doing the Right Things);
2. the processes, outputs, and outcomes are technically optimal and are likely to produce high impact on performance (Doing Things Right); and
3. the conditions are in place to ensure the sustainability of these results (Ensuring Sustainability).

The analysis of these three areas is based on two data sources. The first is participant responses to a new end-of-course questionnaire. The questionnaire uses a five-point scale to assess six key dimensions of quality: relevance, effectiveness, efficiency, potential impact, overall usefulness, and participant satisfaction. It also assesses the quality of course design and delivery on 14 variables covering the training content, structure, instructional methods, materials, and participant behavior. The questionnaire was launched in mid-April and is being used Bank-wide. This evaluation is based on the responses of 1,015, or about 60 percent, of the participants who attended formal classroom training between April and June of FY99. Data are analyzed using percentages, the chi-square test, and logit regression analysis.

One innovation in the evaluation of Bank staff training is a benchmark for judging quality. Consistent with the WBI benchmark, it was established that for a course to be considered high quality, 85 percent of the respondents to the questionnaire must award a score of 4 or 5 on the five-point scale on the course quality criteria.

The second data source is the performance of participants on objective instruments measuring learning gains.
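The two quantitative rules used in this evaluation — the 85 percent Level 1 benchmark and the Level 2 learning-gain measure — can be sketched in a few lines. This is only an illustration; the function names and sample ratings below are invented, not part of the Bank's actual evaluation system:

```python
def meets_benchmark(ratings, threshold=0.85):
    """Level 1 rule: at least `threshold` of respondents rate the course 4 or 5."""
    return bool(ratings) and sum(1 for r in ratings if r >= 4) / len(ratings) >= threshold

def percent_gain(pre_score, post_score):
    """Level 2 measure: percentage increase of the post-test score over the pre-test score."""
    return (post_score - pre_score) / pre_score * 100

course_ratings = [5, 4, 4, 5, 3, 4, 5, 4, 4, 2]   # invented ratings: 8 of 10 are 4 or 5
print(meets_benchmark(course_ratings))             # False: 80% falls below the 85% benchmark
print(percent_gain(pre_score=50, post_score=75))   # 50.0: a 50 percent learning gain
```

The 50 percent gain threshold used to classify "high gain" courses later in the report is then a simple comparison against `percent_gain`.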
This was done for 26 courses, or 14 percent of the 190 courses delivered in FY99. The instruments included multiple-choice tests, tests requiring situation analysis, and objective-based self-assessment instruments. The analysis of performance used a pre/post-test design for 21 courses and a post-test-only design for five courses. In both cases, courses were categorized as "high gain" or "low gain" courses. High gain courses are (a) those with at least a 50 percent increase in post-test score relative to the pre-test score, or (b) those with only a post-test result that meets the acceptable level of mastery defined by the course training team. The evaluation of the effectiveness of the Bank staff training programs in increasing learning is based on the benchmark that 85 percent of the courses selected for Level 2 will be high gain courses.

Main Findings

The findings are presented for the various network and non-network groups, and across all groups for a Bank-wide summary. The synthesis provided in this executive summary is based on the prime factors that define Doing the Right Things, Doing Things Right, and Ensuring Sustainability. The specific variables that define these factors are described in greater depth in the main report.

On a Bank-wide basis, the relevance and appropriateness of the programs are average. The efficiency, effectiveness, and level of satisfaction are average. Performance on the key drivers of quality is below average. Performance on cost relative to quality shows poor performance to be associated with high cost.

There are major variations among the Networks, the Non-Networks, the OCS, and the ACS groups. The ACS shows consistently excellent performance. The mode of performance for the other groups is as follows: OCS performance is average; HD performance is slightly below average; and PREM, FPSI, and ESSD performance is poor, ranging from 51 to 68 percent.
For the Non-Networks, the HR performance is at the same level as PREM, FPSI, and ESSD. The mode of performance of Legal, DEC, and FPS is good.

1. Doing the Right Things: analysis based on Level 1 evaluation data

The factors of the training programs analyzed in assessing relevance and appropriateness were: (a) linkage with the content of the Bank's corporate performance and business requirements; (b) relevance to staff learning needs for job performance and professional growth; and (c) the application of appropriate principles for adult learning and performance improvement.

The results for these three factors are summarized in Table A below. They show that on a Bank-wide basis, 75 percent of the respondents rated the programs as good or excellent in their relevance or appropriateness, ten percentage points below the benchmark of 85 percent. The results also show wide variations among the groups: the ACS had excellent performance; DEC, FPS, OCS and HD had average performance; while HR, PREM, FPSI and ESSD had poor performance. This pattern is consistent across all three factors of assessment, with ratings being lowest for the application of appropriate principles for adult learning.

2. Doing Things Right: analysis based on Level 1 evaluation data

The factors analyzed were: (a) delivery efficiency; (b) learning effectiveness; and (c) participant satisfaction. Bank-wide performance is below the set standard of 85 percent, averaging 77 percent across the three factors. Performance among the groups is quite variable, with two groups showing excellent performance, three showing good performance, and four showing average performance. The PREM, ESSD, and FPSI Networks show the poorest performance, ranging from 60 to 70 percent. This level of performance is consistently low across all three factors. (See Table B.)

Inspection of the data shows that FPS, PREM, HD and ESSD have considerably higher ratings for "satisfaction" with the overall training.
Accounting for this response is not only what respondents learned but also other gains derived from networking, sharing knowledge and exchanging ideas, and developing new approaches, new perceptions, and new insights into Bank work during training. This result highlights the importance of learning-to-learn attributes, in addition to the learning of specific content, for these groups.

Table A
Percentage of Respondents Providing Ratings of 4 or 5 on the Three Factors Assessing "Doing the Right Things"
(ACS = Level 11-17; DEC, FPS, HR, LEG = Non-Networks; PREM, HD, FPSI, ESSD = Networks)

| | All Bank | ACS | DEC | FPS | HR | LEG | OCS | PREM | HD | FPSI | ESSD |
| # of Respondents | 1015 | 135 | 12 | 57 | 43 | 52 | 335 | 102 | 43 | 77 | 159 |
| Linkage with Content of Bank Business & Performance | 77 | 93 | 83 | 81 | 65 | 89 | 80 | 66 | 77 | 67 | 63 |
| Linkage with Staff Learning Needs for Performance | 77 | 91 | 70 | 80 | 77 | 81 | 81 | 65 | 72 | 69 | 65 |
| Application of Principles of Adult Learning | 70 | 91 | 85 | 73 | 49 | 80 | 73 | 51 | 72 | 56 | 54 |
| TOTAL | 75 | 92 | 79 | 78 | 64 | 83 | 78 | 61 | 74 | 64 | 61 |
| Description (relative to benchmark of 85%) | Below Average | Excellent | Average | Average | Poor | Good | Average | Poor | Average | Poor | Poor |

Table B
Percentage of Respondents Providing Ratings of 4 or 5 on the Three Factors Assessing "Doing Things Right"
(ACS = Level 11-17; DEC, FPS, HR, LEG = Non-Networks; PREM, HD, FPSI, ESSD = Networks)

| | All Bank | ACS | DEC | FPS | HR | LEG | OCS | PREM | HD | FPSI | ESSD |
| # of Respondents | 1015 | 135 | 12 | 57 | 43 | 52 | 335 | 102 | 43 | 77 | 159 |
| Delivery Efficiency | 77 | 94 | 82 | 84 | 66 | 91 | 79 | 67 | 81 | 68 | 63 |
| Learning Effectiveness | 75 | 93 | 83 | 74 | 72 | 92 | 82 | 66 | 65 | 66 | 52 |
| Participant Satisfaction | 80 | 93 | 83 | 87 | 71 | 92 | 82 | 70 | 73 | 63 | 65 |
| TOTAL | 77 | 93 | 83 | 82 | 72 | 92 | 81 | 68 | 73 | 66 | 60 |
| Description | Below Average | Excellent | Good | Good | Average | Excellent | Good | Poor | Average | Poor | Poor |

3.
Doing Things Right: Level 2 evaluation of learning effectiveness

A more in-depth analysis of learning effectiveness, conducted via objective assessment, shows evidence that is more conservative but nevertheless consistent with the Level 1 analysis of respondent ratings of learning effectiveness. An analysis of the average performance increase on pre- and post-tests across 21 courses reveals a Bank-wide average of 40 percent. This aggregate does not meet the established quality standard of 50 percent. There are nevertheless some major variations among the groups: DEC, ACS, OCS and ESSD show performance above the 50 percent benchmark. Figure 1 below shows the variations among the training groups, with the Networks (FPSI, HD, PREM, ESSD) performing at the lower end of the performance scale. Given the small number of courses involved in the analysis by network, caution must be exercised in generalizing the findings.

[Figure 1: Average percentage increase in course learning gains, by group — HD (N=2), PREM (N=6), ESSD (N=1), FPSI (N=3), HR (N=1), DEC (N=1), OCS (N=4), ACS (N=3). Gains range from 15 to 115 percent.]

4. Key drivers of learning effectiveness: analysis based on Level 2 and Level 1 evaluation

While the above evidence on the level of learning effectiveness is important in suggesting performance trends, a more significant contribution of the Level 2 evaluation is determining the differences between the "high gain" and the "low gain" courses and how this information can be used to guide program improvements. An analysis was conducted to determine the differences between high gain and low gain courses on 14 of the Level 1 variables assessing design and delivery. A chi-square analysis shows significant differences between high gain and low gain courses for 10 of the 14 variables. (Table 7 of Main Report.)
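A comparison of this kind — whether high gain and low gain courses differ in the share of favorable ratings on a design or delivery variable — amounts to a chi-square test on a 2x2 contingency table. A minimal sketch follows; the counts are invented for illustration, and in practice a statistics package (e.g. `scipy.stats.chi2_contingency`) would be used rather than a hand-rolled function:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]],
    e.g. rows = high gain / low gain courses,
    columns = favorable / unfavorable ratings on one design variable."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, observed in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Invented counts: favorable vs. unfavorable ratings on "pacing",
# split by high gain vs. low gain courses
stat = chi_square_2x2([[80, 20], [45, 55]])
print(round(stat, 2))  # compare against the chi-square critical value at 1 df (3.84 at p = .05)
```

A statistic above the critical value indicates the rating pattern differs significantly between the two groups of courses.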
It was recognized that addressing all 10 factors at once might not be feasible for program managers interested in program improvement, especially for those Networks with very poor ratings on all 10 factors. Thus, to support decision making, a logit regression analysis was conducted to establish the key drivers of quality in the high gain courses. The analysis shows five variables to be significant drivers: the pacing of instruction across the various sessions; the application of concepts and principles to Bank work; the materials for use during training; the balance among lecture, discussion, and application; and the balance between theoretical and practical information.

Analysis of the performance of the various networks against these drivers shows some major variations. (See Table C below.) The ACS, DEC, and LEG groups have good to excellent performance. The FPS, OCS, and HD groups have average performance. The performance of HR, PREM, FPSI and ESSD ranges from 55 to 59 percent. Participant comments and recommendations highlighted in the main report provide some concrete steps that the networks could take to improve quality on these training design and delivery dimensions.

Table C
Percentage of Respondents Providing Ratings of 4 or 5 on the Aggregate of the Key Drivers of Quality
(ACS = Level 11-17; DEC, FPS, HR, LEG = Non-Networks; PREM, HD, FPSI, ESSD = Networks)

| | All Bank | ACS | DEC | FPS | HR | LEG | OCS | PREM | HD | FPSI | ESSD |
| # of Respondents | 1015 | 135 | 12 | 57 | 43 | 52 | 335 | 102 | 43 | 77 | 159 |
| Aggregate Ratings on Key Drivers of Quality | 72 | 91 | 87 | 75 | 58 | 81 | 75 | 63 | 75 | 59 | 55 |
| Description | Below Average | Excellent | Very Good | Average | Poor | Good | Average | Poor | Average | Poor | Poor |

5.
Ensuring Sustainability: Level 1 evaluation and unit cost information from the WBI Budget Office

The evaluation considered the main sustainability factors to be: (i) management involvement in enhancing the transfer of learning and performance improvement; (ii) performance enhancement systems such as help-desks or hotlines, clinics, learning groups, access to web-sites, and access to expert and resource persons for mentoring or consultation; and (iii) efficiency in the use of financial inputs.

The first factor is indirectly assessed via participant responses on the types of follow-up support they would like to have. Respondents' specific requests of their managers are for time to practice, master, and sustain new knowledge, skills and techniques, and for moral support in the form of trust, encouragement, patience and incentives to carry out new ideas.

Efficiency in the use of financial resources is assessed via an analysis of the relationship between unit cost per training day and quality (i.e., relevance, effectiveness, and delivery efficiency). In the absence of norms for judging this relationship in the Bank staff training programs, the information presented is, at best, descriptive. The trend of the results shows an inverse relationship between unit cost and quality (r = -.4, p < .08). Figure 2 below, which is based on the Level 2 data, shows that groups with high quality (i.e., ACS, OCS, Non-Network) have very low cost. The Networks HD, PREM, ESSD, and FPSI have higher costs, with quality slightly above or below the set benchmark of 50 percent for learning effectiveness. (See also Figures 4 and 5 in the main report for the analysis with Level 1 data.) This preliminary evidence could be used as a starting point for an informed dialogue on the cost-effectiveness and cost efficiency of the Bank staff training program.
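The reported cost-quality correlation is a Pearson r computed over group-level pairs of unit cost and quality scores. A minimal sketch of the calculation follows; the cost and quality numbers below are invented for illustration (the actual figures by group appear in Figure 2 and the main report):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented unit costs per training day vs. quality scores by group
costs = [2000, 1400, 600, 450, 200, 150, 90]
quality = [40, 45, 55, 50, 65, 70, 68]
print(round(pearson_r(costs, quality), 2))  # a negative r indicates higher cost, lower quality
```

With only a handful of group-level observations, as here, such a correlation is suggestive at best — consistent with the report's caution that the evidence is descriptive.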
[Figure 2: Relationship between unit cost per training day and Level 2 learning gains, by group (HD, PREM, ESSD, FPSI, NN, OCS, ACS). Unit costs range from $88 to $2,016 per training day; learning gains range from 36 to 69 percent.]

I. INTRODUCTION AND BACKGROUND

The Bank is investing considerable resources in structured training and various forms of learning activities as one basis for improving the value of its human capital and for promoting the professional excellence of staff. Because professional excellence is of strategic importance in the current era of globalization, this investment has to be carefully planned and evaluated to ensure that: the outcomes and impact of the activities are relevant to the Bank's mission; the conditions and outputs are right for achieving the desired outcomes and impact; and the lessons, good practices and principles are developed to advance the objective of the Bank as a learning organization.

In order to respond effectively to the agenda of the Strategic Compact (1997) and the HR Policy Reform (1998) on continuous learning and effective knowledge management, the two key training arms of the World Bank, the Economic Development Institute (EDI) and the Learning and Leadership Center (LLC), merged in July 1998 to form the World Bank Institute (WBI). The mandate of the WBI is to work with the Networks, regions, and stakeholders worldwide to develop, integrate and share knowledge on issues crucial to poverty alleviation and sustainable development. For its work with the Networks in developing a shared approach to the training of Bank staff, Memoranda of Understanding (MOUs) were drawn up outlining the objectives, scope, partnership terms, financial arrangements, and management and delivery responsibilities for the training activities.
It was also established that an institution-wide governance body, the Knowledge and Learning Council (KLC), supported by the HR function, the new WBI, the Networks and other partners, would be created to integrate the learning agenda, rationalize resource allocations, connect client/staff learning and knowledge management, create incentives for knowledge management and learning, and strengthen monitoring and evaluation of all Bank staff training.

The measures for promoting continuous learning and knowledge management included: mainstreaming knowledge management and learning efforts into the Bank; strengthening and supporting the thematic groups as vehicles for learning and knowledge transfer; implementing the incentives, partnerships and culture/behavior change needed for effective learning; and measuring and demonstrating the value for money invested in knowledge management and learning through monitoring and evaluation.

The evaluation unit within the new WBI was expanded and mandated by the KLC to undertake in FY99 the following evaluation activities related to the training of Bank staff:

· Build a new evaluation system that is comprehensive, integrated, and timely in providing information that is valuable for strategic planning, program development, quality enhancement, and knowledge generation.

· Develop a Level 1 evaluation system that assesses the quality of all formal training and makes recommendations for program improvement and systemic changes.

· Begin to lay the foundations for a Level 2 evaluation system that assesses learning gains and that defines the systemic and technical requirements for this type of evaluation.
The New Evaluation System

The KLC authorized $300,000 in January 1999 for the monitoring and evaluation of three areas of learning activities during the remainder of FY99: the Professional and Technical (PT) training program, the 11-17² or Administrative and Client Support group (ACS) training program, and the new Bank Staff Orientation Program.

The new evaluation system was designed based on:

1. analyzing the strengths and weaknesses of the previous approach to the evaluation of the Bank staff training programs;
2. defining the system components and requirements with major stakeholders;
3. pilot testing various instruments and analytical approaches in consultation with key stakeholders and with internal and external experts and resource persons; and
4. defining benchmarks or performance indicators for evaluating quality and building consensus on them among the various groups.

One of the products of the new system is a questionnaire for Level 1 evaluation (focusing on participant feedback) with six core questions covering relevance, effectiveness, efficiency, potential impact, overall usefulness, and satisfaction, along with items on the quality of training design and delivery. The questionnaire was pilot tested in February and launched Bank-wide in mid-April 1999. Since then, it has been used in almost all Bank staff training, including the Professional Technical (PT), Administrative and Client Support (ACS), Communications, Languages, and Information Technology training programs. It has also been adapted for use in various other types of Bank learning activities, including sector-wide Learning Fora, the SAP training, the Regional training of field staff, and pilot courses led by the Regions.
In introducing Level 2 evaluations (focusing on participant learning), consultations and work sessions were held with the Networks and various training groups, leading to the selection of a targeted set of courses for the Level 2 evaluations, the definition of the types of assessment devices best suited to the courses selected, the validation of the tests administered, and the development of procedures for the administration and objective scoring of test results. The KLC determined that a Level 2 evaluation would be conducted on 10 percent of the PT and ACS training programs. This process began in February 1999, and 26 courses have been included in the FY99 Level 2 evaluation. This number represents 14 percent of the 190 formal classroom-based courses delivered in FY99 in the PT and ACS programs.

² "11-17" refers to the Bank's internal grading system for staff.

As noted above, for FY99 the Level 1 and Level 2 evaluations were to address formal classroom training of the PT, the ACS, and the new Staff Orientation programs. This does not include the Communications, Languages, or Information Technology programs, the region-based (HUB) Field Staff Training, or the Regional Training programs in Washington. A separate report on the Orientation program for FY99 is available from the WBI Evaluation Unit (WBIES).

FY99 PT and ACS Training Activities

In FY99, the PT and ACS training programs designed and offered 190 formal classroom-based training courses and seminars. There were 347 deliveries of training based on these 190 courses. The average duration of the courses was two days. The estimated number of training days³ for FY99 was 25,777. Table 1 below provides a breakdown of this information on formal training for the various PT and ACS training groups.
Table 1
Summary of Bank Staff FY99 Learning Activities* Addressed in This Evaluation

| FY99 Program Types | # of formal classroom-based training courses | Total # of deliveries | Total # of training days** |
| Professional Technical Training Program | | | |
| · Network and Sector Training Program | | | |
| PREM | 29 | 42 | 2,469 |
| ESSD | 26 | 46 | 3,909 |
| HD | 41 | 47 | 5,174 |
| FPSI | 42 | 55 | 4,785 |
| · Core Business Skills Training Program | | | |
| OCS Network Training Program | 24 | 84 | 6,710 |
| Non-Network Training Program | 21 | 37 | 1,681 |
| Total PT | 183 | 311 | 24,728 |
| Level 11-17: Administrative and Client Support Group | | | |
| · Core Social, Cognitive and Behavioral Skills | 7 | 36 | 1,049 |
| Total ACS | 7 | 36 | 1,049 |
| TOTAL | 190 | 347 | 25,777 |

*Data source: PeopleSoft, 6/30/99.
**Training days are estimated as the number of days per course multiplied by the number of participants.

³ The number of training days is estimated based on the number of days per course multiplied by the number of participants in the course.

Besides the formal training, PT also offered seven Sector Weeks or Learning Forums designed primarily for networking, sharing experiences, and developing knowledge of innovations and cutting-edge technology. Also included in the PT program are individualized or alternative learning opportunities, which include Professional Development Grants Programs, Mentoring Programs, Apprenticeship Programs, clinics, hotlines, help-desks, and desktop and just-in-time computerized learning modules. It is important to note that these learning activities are complemented by knowledge management initiatives such as the development of web-sites and self-paced instructional materials. The new learning agenda of the PT and ACS programs also seeks to encourage everyone to build "learning communities" that would, on a continuous basis, "learn, build, and share their skills and knowledge" so that the Bank can better fulfill its mission of fighting poverty.
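The training-day estimate used in Table 1 (days per course multiplied by the number of participants, summed over deliveries) can be reproduced with simple arithmetic. The delivery records below are invented for illustration:

```python
def total_training_days(deliveries):
    """Estimate training days as days-per-course x participants, summed over deliveries."""
    return sum(days * participants for days, participants in deliveries)

# Invented deliveries: (course length in days, number of participants)
deliveries = [(2, 25), (3, 18), (1, 40), (2, 30)]
print(total_training_days(deliveries))  # 2*25 + 3*18 + 1*40 + 2*30 = 204
```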
These alternative learning activities are fully recognized as a significant part of the learning agenda of the Bank. They are, however, not covered in this evaluation.

II. THE EVALUATION

The Scope

The expanded training and learning agenda of the Bank described above needs to be evaluated to determine its value and returns to the institution. This evaluation supports that goal, with a prime focus on the formal classroom training programs of the PT and the ACS. It provides Level 1 and Level 2 evaluations which include: (a) all the Network programs (HD, ESSD, FPSI, PREM); (b) the OCS program; (c) the Non-Network programs (FPS, HR, Legal, DEC); and (d) the newly developed ACS program on social and behavioral skills development.⁴

The Design

A training program has value when it addresses the right things and selects a portfolio of programs and activities offering the greatest potential value; when it does things the right way, optimizing processes and outputs to provide high-impact programs and activities; and when it seeks to ensure the sustainability of its programs and activities by addressing the factors that pose risks and constraints to success. In this regard, the key management questions that are the focus of this evaluation in FY99 are the following:

(1) Are we doing the right things? Are courses aligned with the Bank's performance and business requirements? Are they relevant to staff learning needs for performance improvement and professional growth? Are they based on the principles of effective learning and performance improvement among adults?

(2) Are we doing things right? Is delivery efficient? How do respondents rate the following factors of delivery efficiency: the match between the announced objectives and those delivered; the logic of the sequence of course sessions and activities; and the pacing for learning?
Is the training effective in increasing learning or the attainment of desired mastery levels? Are the respondents satisfied with their learning gains and other benefits derived from the training courses?

(3) Are we ensuring sustainability? Is the training cost-efficient? What is the relationship between cost and quality? What factors about the training function, or the level of development of the various Network programs, affect cost and quality?

The evaluation will address the factors that affect the observed results, and the improvements needed to maximize the value of the training programs.

Data Sources, Instrumentation, and Performance Standards

The data used in this evaluation come from two primary sources:

· Participant feedback on an end-of-course questionnaire (Level 1 Evaluation)
· The performance of respondents on objective learning assessment instruments (Level 2 Evaluation)

The Level 1 evaluation

The Level 1 evaluation is conducted via a questionnaire completed by respondents at the end of a course or learning activity. This instrument assesses, on a five-point scale:

· course outcomes: relevance, effectiveness, efficiency, potential impact, overall usefulness, and satisfaction; and
· course outputs: design and delivery factors for content, structure, instructional approach, instructors, and participants as a learning community.

The Level 1 questionnaire was launched Bank-wide for all Bank staff learning activities covered in this evaluation as of mid-April 1999. It replaced the older LLC questionnaire that had been used for several years.

4 HD: Human Development; ESSD: Environment and Socially Sustainable Development; FPSI: Finance and Private Sector and Infrastructure; PREM: Poverty Reduction and Economic Management; OCS: Operational Core Services; FPS: Financial Products and Services; Legal: Legal Department; HR: Human Resources; DEC: Development Economics.
The latter had technical weaknesses which restricted the range of responses provided, limited its validity, and made it difficult to interpret and use the resulting findings. The scattergrams in Annex A show the different discrimination powers of the old and the new instruments, in favor of the new instrument. The tables in Annex B show the quantitative results based on the old questionnaire administered between July 1998 and mid-April 1999 for participants from the Networks and Non-Networks. For this 10-month period, there were 1,428 respondents, representing approximately 20 percent of the participants who attended training courses. Inspection of the data indicates, consistent with the evidence from the scattergrams, very few differences in scores among programs and courses.

Because of the general lack of confidence in the results from this old instrument, it was decided that this evaluation report would be limited to the data generated with the new instrument as of mid-April 1999. This decision implies enhancing internal validity at the expense of external validity, or the generalizability of the findings to the whole of the FY99 training program. Thus the findings reported in this evaluation are mostly representative of the training carried out in the last quarter of the year, between April and June 1999. The number of respondents involved during this period is 1,015, a 62 percent response rate for the 79 courses included in this evaluation. This contrasts considerably with the 20 percent rate obtained with the now-discarded prior Level 1 evaluation questionnaire, and reflects progress in the establishment of a new evaluation system for Bank staff training.
While this number is quite significant on a Bank-wide basis, the large variability in response rates among Networks, and the small number of courses included for some of the Networks, require caution in the interpretation of the findings and comparisons made in the evaluation. The results in such cases are not conclusive but indicative of trends in performance. The quantitative data generated from the new questionnaire are analyzed in percentages; the comments and suggestions made by respondents are content-analyzed and the frequencies of responses reported.

One of the innovations in the evaluation of the Bank Staff Training Program is the establishment of a performance indicator, or benchmark, for judging quality and value. It was established that 85 percent of the respondents who submit ratings of their learning experience should award a score of 4 or 5 on a five-point scale on key course quality criteria. This is consistent with the WBI system for client training, and it compares favorably with benchmarks set by well-known education and training institutions such as Motorola University and Linkage Incorporated.5 The adoption of this benchmark was preceded by extensive consultations and consensus building with the various stakeholders of the training function in the Bank.

The Level 2 evaluation

While the Level 1 Evaluation assesses training effectiveness based on self-reported participant ratings, the Level 2 Evaluation provides a more objective assessment of these learning outcomes. Fourteen percent (N=26) of the formal classroom-based training courses offered in FY99 were evaluated using a Level 2 approach. The assessment tools for the Level 2 Evaluation are diverse, ranging from multiple-choice tests to essays, certification tests, and self-assessment instruments.
The choice of the assessment device for any one course is dictated by the knowledge structure of the course objectives, concerns for the "construct" and "ecological" validity of the measurement instruments, and logistic and feasibility issues in the administration and scoring of tests.

The analysis of the Level 2 data is based on two methods: (a) an analysis of the percentage increase in learning based on pre- and post-test results; and (b) an analysis of mastery level based on a defined criterion. The former method is consistent with what WBI has been doing in evaluating its Core Courses. The latter is applied in cases where only post-test data are available.

Another innovation in this overall evaluation effort is the establishment of a performance indicator, or benchmark, for judging the extent of learning as a further criterion for quality and value. This evaluation used a 50 percent gain of the post-test over the pre-test as an acceptable index of quality. This benchmark is based on an analysis of the average increase across all the FY99 Level 2 courses with pre- and post-test data. The derived average was 54 percent; 50 percent was selected by WBIES to accommodate errors likely to arise from the small number of cases involved in the analysis. In cases where no pre-test data exist for the courses selected for Level 2 (N=7), a mastery level of performance was agreed upon with the instructors and used as a basis for judging effectiveness.

5 With the rise of the Total Quality Management movement and consumer-driven models of product/service delivery, private sector firms set very high standards (e.g., Six Sigma systems) for client satisfaction. Ratings of 4 or 5 from 90 percent or more of respondents mark the best-managed organizations.

III.
MAIN FINDINGS

The findings of this evaluation are reported in three sections, based on the evaluation questions raised above (II, "The Design"):

Section 1: Are We Doing The Right Things?
Section 2: Are We Doing Things Right?
Section 3: Are We Ensuring Sustainability?

This report provides evaluation data on Bank-wide performance across all courses delivered between April and June, as well as on the performance of the Network training programs (HD, ESSD, FPSI, PREM), the Non-Network groups (Legal, OED, DEC, HR), and the OCS. The study also reports on the performance of the newly created ACS Network for support staff (levels 11-17).6

The analysis of the questionnaire data reveals six levels of performance among the programs, based on the percentage of respondents providing ratings of 4 or 5. These levels are used to highlight the differences among the various Networks and training groups included in the evaluation. The levels of performance are:

Excellent: 90-100 percent of the respondents provide ratings of 4 or 5
Very Good: 85-89 percent of the respondents provide ratings of 4 or 5 (see footnote 7)
Good: 80-84 percent of the respondents provide ratings of 4 or 5
Average: 75-79 percent of the respondents provide ratings of 4 or 5 (see footnote 8)
Below Average: 70-74 percent of the respondents provide ratings of 4 or 5
Poor: 69 percent and below of the respondents provide ratings of 4 or 5

Section 1: Are We Doing The Right Things?

This section reports on the degree to which the learning programs were relevant and appropriate. The aspects of the learning agenda considered in this regard are:

a. linkage with the content of the Bank's corporate performance and business requirements;

6 The total numbers of respondents included are: Bank-wide = 1,015 (79 courses); HD = 43 (4 courses); PREM = 102 (10 courses); ESSD = 159 (6 courses); FPSI = 77 (8 courses); DEC = 12 (1 course); FPS = 57 (6 courses); HR = 43 (4 courses); Legal = 52 (3 courses); OCS = 335 (24 courses); and ACS = 135 (13 courses).
See also the section above on "Data Sources..." for caution in interpreting comparisons among Networks due to the small number of courses involved for some of the Networks.

7 85 percent is the WBI benchmark for quality.

8 The Bank-wide average is generally between 75 percent and 79 percent.

b. relevance to the learning needs of staff for job performance and professional growth; and
c. appropriateness for adult learning and performance improvement.

a. Linkage with the content of Bank performance and business requirements

One of the added values of having Networks and Sector Boards identify training programs is the potential for greater alignment with the core business of the Bank and with the technical requirements of Networks and Sectors. Annex C outlines the areas of corporate performance important for the Bank. Such considerations, together with the technical requirements of the Networks, guided the planning of the Network training programs for FY99 and the continued efforts at program modification throughout the year. This evaluation investigated the value-added effects of such efforts through participant ratings of:

· the quality of the choice of content covered in the courses;
· the depth of coverage of such content; and
· the application of course concepts and principles to Bank work.

The analysis across all the courses of the various programs (Table 2 below) indicates a Bank-wide mean performance score of 77 percent across the three variables. This is eight percentage points below the established standard of 85 percent for quality. Comparison of the Network and Non-Network programs shows some interesting variations. Only the ACS9 and Legal programs perform above the target, with 92 percent and 89 percent, respectively. However, DEC (83 percent)10, the FPS (81 percent), and the OCS (80 percent) come very close to meeting the target.
Performing well below the target are the ESSD (63 percent), PREM (66 percent), FPSI (67 percent), and HD (77 percent) programs. (See the summary below Table 2.)

The analysis for each of the three variables of the questionnaire considered separately shows some interesting variations which ought to be noted in making improvements to address strategic alignment. The highest ratings are for the question on the "choice of content covered," with a Bank-wide average of 86 percent. Ratings are above the set standard for the HD (91 percent), FPS (93 percent), Legal (96 percent), OCS (87 percent), and ACS (95 percent) programs. The DEC (83 percent), PREM (82 percent), and FPSI (81 percent) programs all come very close to meeting this target. Performing below the standard are ESSD (76 percent) and HR (78 percent).

9 The courses of the ACS program are all focussed on the social and behavioral dimensions of work: team building, conflict resolution, and meeting the challenges of the 21st century. It is conjectured that this characteristic accounts for the consistently high ratings for the ACS program.

10 It is important to keep in mind that the data for DEC include only one course with 12 respondents. The results for this course are very positive, but given the small sample size, caution must be exercised in interpreting the DEC results presented in this report.
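The program comparisons throughout this report rest on two computations: the share of respondents awarding a rating of 4 or 5, and the six-band scale defined under "Main Findings." A minimal sketch, with illustrative ratings rather than actual Level 1 data:

```python
# Performance banding used throughout this report: the percentage of
# respondents rating a criterion 4 or 5 on the five-point scale, mapped
# to the six levels defined in the Main Findings. Ratings below are
# illustrative, not actual Level 1 questionnaire data.
def share_favorable(ratings):
    """Percentage of ratings that are 4 or 5."""
    return 100.0 * sum(1 for r in ratings if r >= 4) / len(ratings)

def performance_level(pct):
    if pct >= 90: return "Excellent"
    if pct >= 85: return "Very Good"     # 85 percent is the WBI quality benchmark
    if pct >= 80: return "Good"
    if pct >= 75: return "Average"
    if pct >= 70: return "Below Average"
    return "Poor"

ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]            # 8 of 10 favorable
print(performance_level(share_favorable(ratings)))  # 80.0 percent -> Good
```

A course meets the Bank standard only when its favorable share reaches the 85 percent benchmark, i.e. the "Very Good" band or above.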
Table 2
Percentage of Respondents Providing Ratings of 4 or 5 on the Level 1 Variables Assessing Linkage with the Content of Bank Performance and Business Requirements

                                            All    ACS      Non-Network             OCS    Networks
                                            Bank   (11-17)  DEC   FPS   HR    LEG          PREM   HD    FPSI  ESSD
# of Respondents                            1015   135      12    57    43    52     335   102    43    77    159
The choice of content covered               86     95       83    93    78    96     87    82     91    81    76
The depth of coverage of the sessions       68     90       75    66    56    86     70    54     71    57    49
Application to Bank work                    77     92       92    84    60    84     82    62     70    64    63
TOTAL (average, percent)                    77     92       83    81    65    89     80    66     77    67    63

[Figure: Summary levels of performance for Network, Non-Network, and ACS programs -- a bar chart of each program's average percentage of 4 or 5 ratings, banded as Poor (69% and below), Average (75-79%), Good (80-84%), Very Good (85-89%), and Excellent (90-100%).]

The judgments of the respondents on the question of "the application of principles and concepts to the Bank's work" had a mean score of 77 percent. This does not meet the set standard of 85 percent. The scores for the Networks (HD, ESSD, PREM, and FPSI) are moderate, with ratings ranging from 62 percent to 70 percent. The rating is weak in the HR program (60 percent). The ratings are stronger, but still below the benchmark, for the FPS (84 percent), Legal (84 percent), and OCS (82 percent) programs. Performance is most exemplary in the cases of the DEC (92 percent) and ACS (92 percent) programs.

The greatest weakness reported by respondents is in the depth of coverage of the various sessions, with a Bank-wide average of 68 percent. The evidence shows that the quality noted in the choice of content, or subject matter covered, is not complemented by depth of coverage. Only the Legal group and the ACS program meet the Bank standard for quality.
All the other programs operate at more than 10 percentage points below the target. The lowest performer is the ESSD, with a rating of 49 percent. The overall evidence at this point suggests that all Networks and training groups included in this evaluation need to work on improving the depth of coverage of their courses. The ESSD, PREM, FPSI, and HR programs need to intensify their overall efforts at strategic planning and curriculum development in order to increase relevance.

The analysis of the comments by the respondents shows that the problems with the depth of coverage and the application of concepts and principles are associated with the following:

· The absence of a sound assessment of learning needs and their linkage with clearly defined competencies and performance requirements of the job. Most training coordinators and task managers indicated limited focus on this "pedagogic" task in FY99 due to the focus on planning and budgeting. While some efforts were made in defining learning goals, limited efforts were directed at defining learning objectives in an observable or measurable fashion. Much less effort was directed towards defining the performance outcomes associated with learning gains. With these shortcomings, courses remained content-driven and were not responsive either to the depth of learning or to the higher-order cognitive skills and strategies that are significant for staff performance.

· Inconsistency between stated learning objectives and what is actually delivered. In cases where the objectives stated the development of analytical and application skills, the delivery process did not adequately cover these levels of learning. The short duration of courses is a limiting factor. It could be offset by several measures, including the following: (a) Very focussed training. The delivery of such focussed sessions appears to be affected by the oftentimes significant heterogeneity in knowledge levels among respondents.
It is also affected by the massive interruptions in the learning process caused by absenteeism, as well as the constant entry and exit of participants in training rooms during delivery. (b) The development of modularized courses integrated in ways that address different learning levels.

b. Relevance to staff learning needs for job performance and professional growth

The Level 1 data indicate that staff attend training primarily (a) to enhance their performance in a current assignment and (b) for professional growth. Other complementary reasons noted are to network and to share knowledge and experiences with other staff and with the resource persons delivering the training. The following three variables from the Level 1 questionnaire address the degree to which the training responded to staff learning needs to meet performance requirements, and for professional growth and excellence:

(1) The relevance of the training to staff learning needs for job performance and/or professional growth
(2) The likelihood among staff to use the knowledge and skills acquired from the training
(3) The overall usefulness of the course in meeting staff learning needs

In examining overall Bank performance, the analysis across all courses shows the average performance of the Bank to be 77 percent. The evidence indicates four predominant levels of performance among the Network and Non-Network groups, all below the 85 percent benchmark, as outlined below.

Table 3
Percentage of Respondents Providing Ratings of 4 or 5 on the Level 1 Variables Assessing Linkage with Staff Learning Needs for Performance Improvement and/or Professional Growth

                                            All    ACS      Non-Network             OCS    Networks
                                            Bank   (11-17)  DEC   FPS   HR    LEG          PREM   HD    FPSI  ESSD
# of Respondents                            1015   135      12    57    43    52     335   102    43    77    159
1. Relevance to staff learning needs
   for job performance                      77     92       67    84    68    90     83    63     71    71    58
2. Likelihood to use knowledge/skills       75     92       75    72    84    78     78    60     71    66    68
3. Overall usefulness of the course         79     90       67    83    78    76     83    72     75    70    70
TOTAL (average, percent)                    77     91       70    80    77    81     81    65     72    69    65

[Figure: Summary levels of performance for Network, Non-Network, and ACS programs -- a bar chart of each program's average percentage of 4 or 5 ratings, banded as Poor (69% and below), Below Average (70-74%), Average (75-79%), Good (80-84%), and Excellent (90-100%).]

With the exception of the ACS, OCS, Legal, and FPS programs, the results are consistently low for each of the three variables considered separately. The same issues raised above with respect to the depth of coverage and the application of content to Bank work remain pertinent explanations for the findings observed here. The performance of the ACS group is excellent in its responsiveness to staff learning needs (91 percent). The OCS, Legal, and FPS programs are quite good, with ratings of 81 percent. The ESSD and PREM programs have poor ratings of 65 percent.

Respondents' comments suggest that the factors that enhance responsiveness to staff learning needs are: clear learning objectives, depth of content covered, application of concepts and principles to Bank work, and quality of the materials for continuous learning. Recommendations for improving the programs' responsiveness would need to consider how best to enhance these factors.

c. The application of appropriate principles for adult learning and performance improvement

Complementary to the question raised above on the relevance of the training to staff learning needs is the question of how such learning needs are being addressed. Specifically, what principles of adult learning are being applied to enhance learning and performance improvement among staff? The following are some of the key principles of effective adult learning.
· Learning is functional; it must address a need for performance improvement, and training must provide an appropriate balance between theory and practice.

· Adults learn best, and are most likely to use information, when learning is experiential and social. This requires a participatory or interactive process which allows them to use prior knowledge and to engage in a process of knowledge construction.

· Learning is continuous and is closely integrated with performance. This requires the provision of materials and other systems to support continuous learning and the application of acquired knowledge to solve real-life problems.

Table 4 highlights the training design and delivery factors from the Level 1 questionnaire which are significant for adult learning. The results are moderate to weak across the board, except for the DEC and ACS programs. The DEC program is particularly exemplary on all factors assessed except the use of case studies and small-group exercises for interactive learning. This is also the lowest-rated variable of the entire Level 1 questionnaire.

The low ratings on the quality of the use of case studies and other forms of interactive learning are troublesome given the commonly held assumptions about the significance of this factor for learning and performance improvement. Participant comments also indicate a keen interest in the use of case studies. In over 90 percent of the courses, the respondents included in this study make recommendations about (a) increasing the use of case studies to enhance interactive learning, and (b) making improvements in the case studies being used.
Suggested improvements include:

· using case studies that provide practical, hands-on analysis;
· examining the case studies and making them more complex;
· using updated cases that are more representative of new projects or new directions in the Bank;
· providing time for more intensive analysis of the cases being studied;
· using case studies based on situations encountered in Bank-financed projects or activities; and
· limiting the heterogeneity in knowledge levels among participants in order to facilitate the processing of case studies.

In summary, staff request that the training programs use actual case studies, as opposed to the usual practice of using "case examples" in which task managers describe the work they have done.

Table 4
Percentage of Respondents Providing Ratings of 4 or 5 on the Level 1 Variables Assessing Linkage with Principles of Effective Adult Learning

                                            All    ACS      Non-Network             OCS    Networks
                                            Bank   (11-17)  DEC   FPS   HR    LEG          PREM   HD    FPSI  ESSD
# of Respondents                            1015   135      12    57    43    52     335   102    43    77    159
A. Course Design and Delivery Factors
- Balance of theory and practice            70     89       92    66    56    75     69    66     74    64    55
- Balance of lecture and discussion         69     92       92    76    46    72     72    56     75    50    51
- Application to Bank work                  77     92       92    84    60    84     82    62     70    64    63
- Case studies for interaction and
  knowledge construction                    65     94       67    69    32    74     64    38     72    48    58
TOTAL                                       70     92       86    74    49    76     71    56     73    57    57
B. Continuous Learning and Performance Improvement Factors
- Follow-up materials for follow-up
  learning or job application               69     87       83    71    51    96     76    33     69    52    45
TOTAL (mean average)                        70     91       85    73    49    80     73    51     72    56    54

It has been empirically demonstrated that immediately after training, the knowledge acquired is not permanent and not wholly integrated, especially given the short duration of most courses. It needs to be refined.
This is the most critical stage of the learning process; the learner is very likely to revert to old ways (Davis and Dean, 1993). It takes a committed learner to continue the learning process and to prevent loss. The existence of materials for continuous learning and of systems for performance enhancement is also critical to move the learning process to the next stage, when knowledge becomes automatic and routine (Hall and Loucks, 1975). In the case of "knowledge workers" such as Bank staff, it is at this stage that they routinely and automatically use professional-level skills and resource materials for problem-solving and decision-making. The availability of such materials after training is therefore imperative.

Bank staff reported on the poor quality of the materials provided for continuous learning and performance improvement. With the exception of the ACS (87 percent), Legal (96 percent), and DEC (83 percent) programs, all of the other groups perform below 80 percent, with a range between 33 percent and 76 percent. The rating for the follow-up materials of the Legal program is very high (96 percent); an examination of the nature of the Legal group's materials might elucidate some lessons for good practice in material development.

In addition to their recommendation to improve the materials for continuous learning, the respondents also suggested various types of support they would like to have to continue learning and performance improvement after training. Most significant are (a) management support and commitment; (b) access to information, experts, resource persons, and learning groups; and (c) follow-up training. Annex E provides details on the types of support noted by staff as significant for continuous learning and performance improvement.

Are We Doing the Right Things?
Concluding Remarks and Recommendations

The content or subject matter of the courses is relevant to the business priorities of the Bank, but the courses do not adequately meet learner needs. A "learner-driven curriculum" needs to complement the current "content-driven approach" to enhance the quality and value of the courses.

The Network, Non-Network, ACS, and OCS training groups have all given some good thought to the strategic linkage of their courses with the Bank's business priorities or Network requirements. Some of the groups are engaged in ongoing efforts to review their existing curricula and "revamp" or adjust them to increase their relevance. The evidence from the analysis of the Level 1 data is overwhelming in showing that the strongest and most positive ratings from respondents are for the quality of the courses in selecting the right content or subject matter.

This extensive focus on content is not complemented by a focus on the types of instructional and performance technologies that enhance learning and performance improvement. The courses fall short in addressing staff interest in a greater depth of coverage of content, the application of principles to Bank work, the use of interactive learning methods, and the provision of follow-up materials for continuous learning and the transfer of learning on the job. These inadequacies signal a need on the part of the various training groups to pay particular attention to the learner, with a focus on (a) the level of learning needed in addressing the content selected, and (b) the types of instructional and performance technology appropriate for Bank staff as adult learners. Consideration of these factors in curriculum development efforts will no doubt benefit individual staff learning as well as performance improvement.
In moving away from a content-driven toward a "learner-centered" approach, the following recommendations are significant, based primarily on the analysis of participant comments:

· Use a competency-based, performance-improvement model in curriculum development and course design. This requires a clear identification of the performance requirements of the jobs or tasks and the definition of the competencies and levels of knowledge and skills that should be the focus of training courses.
· Carry out a more thorough and technical assessment of learning needs for different target groups.
· Define measurable learning objectives; doing so greatly clarifies the methodology and activities that add value to the training.
· Develop modularized and integrated training courses of short duration that are responsive to different learning levels and allow staff to progressively develop expertise over time in an efficient manner.

To maximize the use of the short course duration, which is a preferred mode of delivery among Bank staff, the recommendations are to:

· Limit gross heterogeneity in knowledge levels among participants by defining very clearly the target group for each course and the prerequisite knowledge requirements. When necessary, pre-test course nominees for placement and selection purposes.
· Have managers set expectations of full-time participation in courses for which staff are registered. Doing so should help to limit the interruptions to the learning process caused by the constant "entry" and "exit" of training rooms as participants try to use the training opportunities while also addressing other work priorities.

Section 2. Are We Doing Things Right?

Maximizing the value of the Bank staff training program requires not only selecting a portfolio of relevant and appropriate training courses but also optimizing the processes, outputs, and outcomes.
This section evaluates the degree to which the Bank staff training program:

a. promotes efficient delivery processes;
b. develops effective learning outcomes; and
c. enhances participant satisfaction with the learning gains and other benefits from the training.

The data analyzed in this section come from both the Level 1 and the Level 2 assessments. The next section of this report (Section 3) expands on the information provided in this section to analyze the relationships between delivery efficiency, effectiveness, and cost. Included in this section is an analysis of the key drivers of quality in training and the performance of the various Networks on these drivers.

a. Delivery efficiency

Bank staff training courses are of very short duration: the average is two days, with a range of one-half day to five days. Bank staff members prefer short courses as well as extensive and in-depth coverage of content. Meeting this demand requires the efficient delivery of training. Delivery efficiency occurs when the following criteria are met:

· the learning objectives are clearly defined, and what is announced is what is delivered;
· the activities of the various sessions are well sequenced to reinforce learning;
· the pacing of the various sessions is appropriate for the stated learning objectives;
· the instructional materials are well organized and easy to use; and
· the instructional media are easy to use and reliable.

The analysis of these dimensions based on respondent ratings shows that delivery efficiency on a Bank-wide basis is average: 77 percent of all respondents provided ratings of 4 or 5. This level does not, however, meet the set standard of 85 percent. (See Table 5 below.) The analysis of differences among the various training groups follows the same pattern as in the earlier analyses in Section 1.
The PREM (67 percent), FPSI (68 percent), ESSD (63 percent), and HR (66 percent) training programs show the lowest ratings. (See the data summary below Table 5.)

In determining where the strengths and weaknesses in delivery efficiency lie, the analysis shows that the greatest strength is in the "match of the announced objectives with those delivered." The greatest weakness is in the "pacing of instruction for learning." It appears that, in an attempt by instructors to cover a generally over-ambitious training agenda, pacing is accelerated. In the end, more content is covered at the expense of depth of content. The problem is compounded when, as noted in Section 1, there is great heterogeneity in the level of knowledge among respondents. Balancing these facets of the training (pacing, depth of coverage, heterogeneous groups) is a major challenge for instructors.

Table 5
Percentage of Respondents Providing Ratings of 4 or 5 on the Level 1 Variables Assessing Delivery Efficiency

                                            All    ACS      Non-Network             OCS    Networks
                                            Bank   (11-17)  DEC   FPS   HR    LEG          PREM   HD    FPSI  ESSD
# of Respondents                            1015   135      12    57    43    52     335   102    43    77    159
Match of announced objectives
with those delivered                        87     96       83    93    70    100    87    79     85    83    77
Logic in sequence                           80     96       100   91    68    92     80    72     81    71    66
Pacing for learning                         69     90       75    72    56    84     71    57     84    50    52
Materials used during training              74     93       92    79    71    88     76    58     72    66    56
TOTAL (mean percent)                        77     94       88    84    66    91     80    67     81    68    63

[Figure: Summary levels of performance for Network, Non-Network, and ACS programs -- a bar chart of each program's average percentage of 4 or 5 ratings, banded as Poor (69% and below), Average (75-79%), Good (80-84%), Very Good (85-89%), and Excellent (90-100%).]

b. Learning effectiveness

Level 2 evaluation

The move to an objective assessment of learning outcomes (Level 2 Evaluation) is one of the key areas of the FY99 evaluation effort.
WBIES held an extensive dialogue with the various Networks on this new requirement of the training function. The Networks want to know in a more objective fashion what staff get from the courses. WBIES and the Networks have worked together on selecting a targeted set of courses of great strategic importance to each sector; developing or validating learning achievement tests; establishing test administration and scoring procedures; and analyzing test results. Several obstacles have been encountered in the introduction of Level 2 evaluation in the Bank, ranging from technical constraints to the logistic, contractual, and budget constraints associated with introducing Level 2 evaluation mid-stream in the implementation of the FY99 Bank staff training program.

Database and Instruments: WBIES has been able to develop a substantial database for Level 2 analysis while laying a good foundation for more work in FY00. Of the total formal classroom-based training courses delivered in FY99 for PT, ACS, and Bank Staff Orientation training,[11] 26 courses (14 percent of the total) have been assessed for their Level 2 performance. This sample size does not permit generalization of the findings; the evidence from the sample is, at best, suggestive of the trend of performance of the Bank staff training program. The assessment devices used for the courses included in this report are multiple choice tests, case exercises for problem solving and decision-making, and self-assessment ratings on clearly defined learning objectives of the course.

[11] A separate report on Orientation training for FY99 is available from the Evaluation Unit.

Evaluation Questions: The two questions which guide the evaluation of the effectiveness of the training are:

1. Is the Bank staff training program effective in increasing participant learning to an acceptable level?
2.
What are the key drivers of training effectiveness?

Standard for Evaluating the Effectiveness of the Bank Staff Training Program: This evaluation establishes that the Bank Staff Training Program is effective when 85 percent of the courses included in the Level 2 evaluation are "high gain courses". The section immediately below explains the methods used to categorize courses as "high gain" or "low gain" in the context of this particular evaluation.

Analysis and Standard for Evaluating Course-Level Performance: Two methods were used to analyze and evaluate the learning outcomes of the courses and to categorize courses as "high gain" or "low gain". These are outlined in Table 6 below. The first method is based on an analysis of the gain score, that is, the percentage increase in test performance on the post-test relative to the pre-test. The basis for judging high or low performance is normative; it uses the average percentage increase for the 21 courses with pre- and post-test data to guide the decision on benchmarking courses as "high gain" versus "low gain". The average percentage increase for these 21 courses is 54 percent. To accommodate the errors likely to be associated with the small sample size, the percentage increase used to determine "high gain courses" in this evaluation is 50 percent. The second method is criterion-referenced. It is based on a level of mastery which the course training teams have determined to be the critical level that either predicts success on the job or is deemed acceptable given the entry-level qualifications or knowledge levels of the respondents. For example, in the OCS training on Project Cycle Management, the criterion for judging the effectiveness of the course is that "the average performance of the participants who take the post-test is 70 percent."
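As an aside on method, the two classification rules just described can be sketched in a few lines of code. This is a minimal illustration, not the evaluation team's actual procedure: the course names and test scores below are invented, and only the thresholds (a 50 percent gain for Method 1, and a 70 percent mastery level as in the Project Cycle Management example for Method 2) come from the text.

```python
# Minimal sketch of the two course-classification rules described above.
# All course data here are hypothetical; only the thresholds come from the report.

def gain_percent(pre_mean, post_mean):
    """Method 1 input: percentage increase of the post-test mean over the pre-test mean."""
    return (post_mean - pre_mean) / pre_mean * 100

def classify_by_gain(pre_mean, post_mean, threshold=50.0):
    """Method 1 (normative): 'high gain' when the gain meets or exceeds the threshold."""
    return "high gain" if gain_percent(pre_mean, post_mean) >= threshold else "low gain"

def classify_by_criterion(post_mean, mastery=70.0):
    """Method 2 (criterion-referenced): the post-test mean must reach the mastery level."""
    return "meets criterion" if post_mean >= mastery else "below criterion"

# Hypothetical courses: (pre-test mean %, post-test mean %)
courses = {"Course A": (40, 68), "Course B": (55, 66)}
for name, (pre, post) in courses.items():
    print(name, round(gain_percent(pre, post)), classify_by_gain(pre, post))
# Course A gains 70 percent (high gain) but, at a post-test mean of 68,
# would still fall short of a 70 percent mastery criterion.
```

Note that the two rules can disagree, which is consistent with the report applying Method 2 only where a training team had determined an explicit mastery level for the course.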
Table 6
Methods Used in the Analysis and Evaluation of Courses with Level 2 Test Results

Method 1: Gain Score Analysis (based on pre- and post-testing on multiple choice and self-assessment instruments)
  Number of courses assessed: 21
  Number of courses with a 50 percent or higher increase in learning: 10

Method 2: Criterion-Referenced Analysis (based on objective scoring of post-course performance on multiple choice tests, case studies, simulations, and exercises)
  Number of courses assessed: 5
  Number meeting criterion: 5

Summary
  Total number of courses assessed: 26
  Courses meeting the established criterion of effectiveness: 15 (58 percent)

Based on these methods, and guided by questions 1 and 2 raised above, the evaluation analyzed the following areas:
a) the percentage of courses with high learning gains and those with low gains;
b) the differences among the various Network programs on their learning gains;
c) the difference between the high gain and the low gain courses on the 14 design and delivery variables assessed via the Level 1 questionnaire, and the course design and delivery factors associated with high gain; and
d) the main factors that are drivers of such high gains.
The analysis below presents the findings for each of these areas.

a) The percentage of courses with high learning gains and those with low gains: Of the 26 courses analyzed, 15 (58 percent) meet the established course-level criterion of effectiveness. (See Table 6 above.) This is 27 percentage points below the set standard (85 percent) for Bank-wide performance on training effectiveness based on the Level 2 assessment.

b) Differences among Networks: An analysis of the average percentage increase, limited to courses with pre- and post-test data,[12] shows the following performance for the various training programs.
The Networks in the aggregate have a 40 percent gain in learning and do not meet the 50 percent target that defines courses as high gain. The OCS has a 59 percent increase, the ACS a 69 percent increase, and the Non-Networks in the aggregate a 65 percent learning gain.

[12] The analysis of criterion-referenced performance is based entirely on the 4 OCS courses and 1 Non-Network course, and the evidence is positive in all cases.

Disaggregated information in Figure 1 shows the variations among the various training groups. This information has to be interpreted with caution, given that groups such as the ESSD, FPSI, DEC, and HR had only one course included in the evaluation while others, such as PREM, had six. Figure 1 shows that the FPSI, HD, PREM, and HR have an increase of less than 50 percent, while the ESSD has an increase of 54 percent. The average performance of the OCS, Non-Network, and ACS groups is above 50 percent.

[Figure 1: Average percentage increase of course learning gains by group: HD (n=2), PREM (n=6), ESSD (n=1), FPSI (n=3), HR (n=1), DEC (n=1), OCS (n=4), ACS (n=3). DEC shows the largest gain (115 percent); ESSD 54 percent, OCS 59 percent, ACS 69 percent; HD, PREM, FPSI, and HR are each below 50 percent.]

The pattern noted for the various programs is consistent with that observed from the Level 1 analysis, where respondents were asked to rate the degree to which the training increased their knowledge and skills. The two figures below show the consistency in the pattern of performance of the various programs based on the Level 1 data (Figure 2) and the Level 2 data (Figure 3).
[Figure 2: Percentage of 4 or 5 ratings for the effectiveness variable (Level 1): NW 62, OCS 80, NN 80, ACS 93. Figure 3: Average percent gain in learning for the various programs (Level 2): NW 42, OCS 59, NN 65, ACS 69. Legend: NW = Network; NN = Non-Network.]

c) Differences between "high gain" and "low gain" courses on design and delivery variables: To guide efforts at program improvement, this study analyzed the difference between the "high gain" and the "low gain" courses on the 14 design and delivery variables included in the evaluation (see Table 7 for these variables). The operating assumption is that variables that significantly discriminate between the two sets of courses in favor of the high gain courses should be useful in program improvement. A chi-square test shows significant differences between the two sets of courses for 10 of the 14 variables (these significant variables and their values are noted in Table 7). It is interesting to note that two of the four variables which are not significant at the .05 level, i.e. the quality of the choice of content covered and the use of case studies, are also the variables with the highest and the lowest ratings, respectively, in the Level 1 analysis, a finding consistent across all Network programs. A third variable, the materials provided for follow-up learning or job application, is not significantly different partly because it is more closely associated with performance improvement than with learning effectiveness. For the fourth, the balance between theory and practice, no explanation could be substantiated at this time.

d) The main factors that are drivers of high gains: Addressing all 10 significant variables at once might not be feasible in making program improvements. Thus, to help decide where to begin making improvements on the 10 variables, a logit regression analysis was conducted to establish the key drivers of the high gain courses.
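The chi-square screening step just described can be sketched as follows. This is an illustration only: the counts are invented, the helper computes the standard Pearson chi-square statistic for a 2x2 table (without continuity correction) so that the sketch stays dependency-free where a library routine such as scipy.stats.chi2_contingency would normally be used, and the subsequent logit regression step is not reproduced here.

```python
# Illustrative 2x2 chi-square screen for one Level 1 variable.
# Rows: high-gain vs. low-gain courses; columns: favorable (4-5) vs. other ratings.
# Counts are hypothetical; the statistic is the standard Pearson chi-square
# for a 2x2 table, without Yates' continuity correction.

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts for "pacing of the various sessions":
# 80 of 100 high-gain respondents rated it 4-5, vs. 62 of 100 low-gain respondents.
stat = chi2_2x2(80, 20, 62, 38)
print(round(stat, 2))   # → 7.87
print(stat > 3.84)      # exceeds the .05 critical value at 1 df → True
```

A statistic above 3.84 (the chi-square critical value at one degree of freedom) corresponds to p < .05, the significance cutoff used in Table 7.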
This analysis shows the following five significant drivers:

(1) Pacing of the various sessions for learning
(2) The application of concepts and principles to the Bank's work
(3) The materials used during the course
(4) The balance between theoretical and practical content
(5) The balance among lectures, discussions, and application of concepts

This information on the five key drivers of high gain could be used to guide priorities in program improvement, with due regard to the differences in performance among the various Networks. Table 8 highlights Network performance on the key drivers. The evidence shows that on a Bank-wide basis, only 72 percent of respondents rated the drivers with a 4 or 5 on the five-point scale in the Level 1 questionnaire. Only the ACS and DEC have excellent performance. The performance for Legal is good (81 percent), while the OCS, FPS, and HD have average performance. PREM, the FPSI, the ESSD, and the HR perform poorly on all the drivers of quality; the results indicate a need for intensive improvements in each of these groups in order to increase learning gains.

Table 7
Difference Between High and Low Gain Level 2 Courses on the Course Design and Delivery Variables
(N = 201; percentages are of respondents providing ratings of 4 or 5)

| Level 1 Variables on Course Design and Delivery | High Gain Courses | Low Gain Courses | Difference | Chi-square p |

1. DOING THE RIGHT THINGS

Linkage with Bank Performance and Business Requirements
| The choice of content or subject matter covered | 88 | 78 | 10 | .07 |
| The depth of coverage of the various sessions | 78 | 61 | 17 | .01 |
| Application of concepts and principles to Bank work | 89 | 56 | 33 | .00 |
| Total | 85 | 65 | 20 | |

Linkage with Staff Learning Needs
| The extent to which the objectives delivered met your learning needs | 82 | 64 | 18 | .01 |
| The likelihood to use new knowledge and skills on the job | 80 | 62 | 18 | .00 |
| The overall usefulness of the courses | 83 | 69 | 14 | .01 |
| Total | 82 | 65 | 17 | |

Linkage with Adult Learning Principles
| Balance between theoretical and practical content | 78 | 70 | 8 | .26 |
| Balance among lectures, discussions and application of concepts | 77 | 65 | 12 | .00 |
| The application of the principles and concepts to World Bank work | 89 | 56 | 33 | .00 |
| The use of case studies and/or group problem solving exercises | 66 | 67 | -1 | .09 |
| The material provided for follow-up learning or job application | 76 | 69 | 7 | .31 |
| Total | 77 | 65 | 10 | |

2. DOING THINGS RIGHT

Delivery Efficiency
| Degree of match between the announced course objectives and those delivered | 92 | 77 | 15 | .00 |
| The logic in the sequence of content, modules, or sessions | 87 | 78 | 9 | .03 |
| The pacing of the various sessions | 80 | 62 | 18 | .01 |
| The materials used during the course | 83 | 64 | 19 | .00 |
| Total | 86 | 70 | 16 | |

| Learning Effectiveness | 84 | 64 | 20 | .01 |
| Participant Satisfaction | 87 | 64 | 23 | .00 |
| Instructor Quality | 93 | 76 | 17 | .00 |

Table 8
Percentage of Respondents Providing Ratings of 4 or 5 on the Five Key Drivers of High Gain Learning Activities
(DEC, FPS, HR, and LEG are Non-Network groups; PREM, HD, FPSI, and ESSD are Networks)

| Driver | All Bank | ACS | DEC | FPS | HR | LEG | OCS | PREM | HD | FPSI | ESSD |
| # of Respondents | 1015 | 135 | 12 | 57 | 43 | 52 | 335 | 102 | 43 | 77 | 159 |
| Pacing of the various sessions for learning | 69 | 90 | 75 | 72 | 56 | 84 | 71 | 57 | 84 | 50 | 52 |
| The application of concepts and principles to the Bank's work | 77 | 92 | 92 | 84 | 60 | 84 | 83 | 62 | 70 | 64 | 63 |
| The materials used during the course | 74 | 93 | 92 | 79 | 71 | 88 | 76 | 58 | 72 | 66 | 56 |
| The balance of theory and practice | 70 | 89 | 92 | 66 | 56 | 75 | 70 | 66 | 74 | 64 | 55 |
| The balance of lecture, discussion, and applications | 69 | 92 | 92 | 76 | 46 | 72 | 74 | 56 | 75 | 50 | 51 |
| TOTAL (Mean Average) | 72 | 91 | 87 | 75 | 58 | 81 | 75 | 63 | 75 | 59 | 55 |

c. Participant satisfaction with learning gains and other benefits from the training

While the learning gains from the Level 2 assessment are not as high as the target set for the Bank's training program, respondents noted several other types of gains or benefits, the most significant of which are listed below:

· networking;
· sharing knowledge and exchanging ideas;
· developing new approaches, new perceptions, and new insights;
· collegiality and teamwork in learning; and
· appreciation of the challenges, the nature of the business, and the future directions of the Bank.

(See Annex D for details from the content analysis of respondent comments.)

These benefits, coupled with learning gains, help to enhance the respondents' level of satisfaction with the courses. The respondent level of satisfaction is good (80 percent), even though it does not meet the set target. (See Table 9 below.) The analysis shows that the respondents in the Network training programs (PREM, HD, FPSI, and ESSD) are the least satisfied with their overall gains from the courses.

Table 9
Percentage of Respondents Providing Ratings of 4 or 5 on Level of Participant Satisfaction

| | All Bank | ACS | DEC | FPS | HR | LEG | OCS | PREM | HD | FPSI | ESSD |
| # of Respondents | 1015 | 135 | 12 | 57 | 43 | 52 | 335 | 102 | 43 | 77 | 159 |
| How would you rate your overall satisfaction with what you got from the training? | 80 | 93 | 83 | 87 | 71 | 92 | 82 | 70 | 75 | 63 | 65 |

Are we doing things right? Concluding remarks and recommendations

The right delivery processes for the effective use of time, a high degree of learning outcomes, and a high level of satisfaction among the respondents: these would guarantee that things are being done right and would therefore add value.
Bank performance is average and below the set standard. The overall average across efficiency (77 percent), effectiveness (75 percent), and satisfaction (80 percent) in the Level 1 analysis is 77 percent, an average rating. The evidence from the Level 2 assessment, a more objective analysis of effectiveness, shows that 58 percent of the courses perform above the Bank-wide average. This level of performance is fair when evaluated against the benchmark of 85 percent. These results look considerably better when complemented with the more favorable ratings on respondent satisfaction with the other reported benefits of the training. The evidence clearly indicates that, in addition to the learning gains, respondents perceived several other kinds of benefits from training, the most significant of which are the opportunities for networking, sharing information with staff, hearing diverse perspectives, and reflecting on new plans and approaches (see Annex D). Satisfaction levels could increase further if more time were provided in the training agenda for dialogue and the exchange of viewpoints, particularly if this dialogue were managed more effectively to prevent irrelevant and personalized discussions and if staff were challenged to reflect on how they will use new knowledge and approaches.

There are some major program differences. The PREM, ESSD, FPSI, and HR (from the Non-Network program) programs generally perform below the Bank-wide average, and their overall level of satisfaction with training is low. They also perform lower on all five of the variables identified as key drivers of effectiveness. The following recommendations cover these key drivers of effectiveness and are based on the content analysis of the comments and recommendations made by the respondents.

Improve the pacing of instruction.
This could be done by limiting the scope of coverage of the courses, aligning course objectives more appropriately with the allotted time, and developing integrated course modules of short duration.

Increase the application of concepts and principles to Bank work. Poor depth of coverage in training has already been highlighted above. Learning gains could benefit from the types of recommendations already noted for resolving the problems associated with depth of coverage and application to Bank work.

Develop better materials for use during training. Participant comments indicate major inconsistencies between the materials contained in their instructional binders and the materials used by the instructors; poor organization and visibility in the materials provided; and unsynthesized, bulky case materials that are too difficult to digest in the short duration of the courses. Addressing these areas of weakness could help improve learning.

Apply the right balance of theory and practice, and of lecture, discussion, and application. This balance is most significant for effective adult learning, and it appears to be a challenge for instructors, who are nevertheless rated very highly on the quality of their knowledge. Making effective decisions on the appropriate trade-offs among these variables in the classroom is a strength of the expert instructor, and such expertise develops with experience. The WBI should consider how to develop this expertise among instructors for the Networks as part of the train-the-trainer program being developed by WBI's Skills Development group (WBISD).

Section 3: Are We Ensuring Sustainability?

The new learning agenda of the Bank[13] has defined a set of guiding principles and processes which are significant for ensuring the sustainability of the Bank staff training program. They include: (a) enhancing client capacity; (b) exploiting the synergy between staff and client learning; (c) taking advantage of strong internal partners (i.e.
managers) and external partnerships; and (d) using resources efficiently.

The WBI has an extensive training program which addresses the first point (a), and extensive evaluations of it are ongoing under the Core Course programs. On the second point (b), a considerable number of initiatives are underway to enhance the joint training of Bank staff and clients in ways that would foster the development of learning communities and partnerships in poverty alleviation. This study does not include an evaluation of such joint Bank/client training, owing to limitations in accessing viable data in a timely fashion for sound analysis. Nor does the evaluation address the issue of partnership with Bank managers in the training function (c). This issue is significant for reinforcing learning and the use of knowledge and skills for performance and productivity. To an extent, the subject is touched on in describing the types of support Bank staff would like to receive from their managers. (See Annex E.) The degree to which the training technology would sustain learning and the use of learning is covered above under the section entitled "The application of appropriate principles for adult learning and performance improvement" and is therefore not repeated here.

[13] The Learning Agenda of the Bank, presented to the Personnel Committee of the Board, 1999.

This section of the evaluation addresses (d), the efficient use of financial resources. It presents information that could enrich the dialogue on cost-effectiveness and cost-efficiency. The focus is on analyzing the relationship between unit cost per training day and training quality. The quality factors considered are relevance, learning effectiveness, and delivery efficiency. The data come from both the Level 1 and the Level 2 assessments. The information on unit cost per training day is from the WBI (Budget Office, June 10, 1999).
The WBI database provides the unit cost per training day for each of the 79 courses included in the study. In the absence of a valid basis or norm on which to make appropriate judgements about cost-efficiency or cost-effectiveness, this report is purely descriptive. The results are also tentative, given the small number of cases involved in the analysis. In spite of these shortcomings, the evidence presents some interesting relationships that should compel the training function to begin to define a basis for assessing and addressing cost-effectiveness and cost-efficiency, with a view to ensuring the financial sustainability of the Bank's learning agenda.

Relationship between cost and quality

Program Comparisons: Figures 4, 5a, and 5b below present information based on Level 1 data. Figure 6 provides information based on the Level 2 data on learning outcomes. Figure 4 shows that the higher performing programs (the ACS, OCS, and Non-Network) have very low cost relative to quality (i.e. relevance, efficiency, and effectiveness). The Networks, relative to the Non-Network, OCS, and ACS programs, have the highest unit cost; they also received the lowest respondent assessments of quality.

[Figure 4: Relationship between unit cost per training day and program quality, based on Level 1 data. The Bank-wide unit cost per training day, based on 79 courses, is $466. Unit costs: Networks $955, Non-Network $313, OCS $185, ACS $155. Quality ratings (% of 4 or 5) range from 70 percent for the Networks to 94 percent for the ACS.]
[Figure 5a: Relationship between unit cost per training day and delivery efficiency, relevance, and effectiveness for the HD, PREM, ESSD, FPSI, Non-Network ($313), OCS ($185), and ACS ($155) groups, based on Level 1 data. Network unit costs range from $357 to $1,626, with the ESSD and FPSI the highest. See Figure 5b for the breakdown of the Non-Network groups.]

[Figure 5b: Relationship between unit cost per training day and delivery efficiency, relevance, and effectiveness for the Non-Network groups (DEC, FPS, HR, LEG), based on Level 1 data. Unit costs range from $46 to $455, with the HR the highest.]

Network Comparisons: Disaggregated information by Network and by each dimension of quality in Figures 5a and 5b shows major variations among the various Networks and groups. The Non-Network, OCS, and ACS programs maintain a pattern of high quality relative to cost; their quality ratings are 82 percent, 80 percent, and 94 percent, respectively. This pattern is constant for all three dimensions of quality (relevance, efficiency, and effectiveness) assessed via Level 1. Disaggregated information for the Non-Networks in Figure 5b shows, however, that this pattern does not hold for the HR group, which shows an inverse relationship between cost and quality. For the Networks, HD and PREM show a cost-quality pattern identical to that of the Non-Networks, OCS, and ACS (i.e. high quality relative to cost), though the results are less dramatic: costs are comparatively higher, and quality ratings are below average or poor, at 72 percent and 65 percent for HD and PREM, respectively. The ESSD and FPSI programs show the inverse relationship, with cost being higher than quality; their quality ratings are very poor, at 58 percent and 68 percent, respectively.

Analysis Based on Level 2 Data: The evidence based on the Level 1 data is corroborated by the analysis using the Level 2 data on course effectiveness based on learning gains.
Correlation analysis shows an inverse relationship between unit cost and learning effectiveness (r = -.4, p < .08). Figure 6 shows that the Non-Network, OCS, and ACS programs have low costs relative to learning effectiveness, and all three meet the established standard for learning effectiveness. The unit costs for the HD and PREM are average and slightly above average, respectively, but their levels of effectiveness are below the set standard. The ESSD has high costs and average learning effectiveness.

[Figure 6: Relationship between unit cost per training day and Level 2 learning gains for the various Networks and training groups (HD, PREM, ESSD, FPSI, Non-Network, OCS, ACS). Unit costs range from $88 to $2,016, with the ESSD and FPSI the highest; learning gains range from 36 to 69 percent, with the ACS (69 percent), Non-Network (65 percent), and OCS (59 percent) groups at the top.]

Cost and quality: Summary remarks and questions raised

What can we make of the relationships just outlined? Given the small number of cases involved in the Level 2 analysis, the results are not conclusive, but they signal a trend in the overall Bank staff training in the last quarter of FY99. The variations observed among Networks raise questions about the types of expenditures that are driving costs and about the need to better understand how such expenditures are related to program development and quality. The evidence, while preliminary, is very consistent in showing the Non-Network, OCS, and ACS programs operating in a pattern that suggests cost-effectiveness and cost-efficiency. The evidence is not clear on what accounts for the low costs; future analyses might investigate this factor in relation to the stage of development of the programs. The factors that account for high quality among this group are partially noted in the section below; they include sound curriculum development and maintenance, and the use of technical experts in the training function from the WBISD and the private sector.
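The correlation check used in this section can be reproduced in outline with a plain Pearson coefficient. The sketch below uses invented (cost, gain) pairs in the general range of the figures, not the report's underlying course data, so the resulting coefficient is illustrative only.

```python
# Pearson correlation between unit cost per training day and learning gain.
# The pairs below are hypothetical stand-ins for the report's course data.
from math import sqrt

def pearson_r(xs, ys):
    """Plain Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

costs = [88, 155, 203, 456, 597, 1373, 2016]   # hypothetical $/training day
gains = [65, 69, 59, 46, 43, 36, 54]           # hypothetical % gain in learning
print(round(pearson_r(costs, gains), 2))       # negative, i.e. an inverse relationship
```

With only a handful of course-level observations, as here and in the report, such a coefficient is suggestive rather than conclusive, which is consistent with the marginal significance level (p < .08) reported above.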
The PREM and HD show a healthier trend, which could perhaps be sustained by enhancing quality on the factors that drive relevance and effectiveness. The evidence for the ESSD and FPSI raises questions about their stage of program development and how it plays a role in the observed relationship between cost and quality.

IV. INTERPRETATION AND SUMMARY OF RECOMMENDATIONS FOR IMPROVEMENT

The interpretation and summary recommendations presented here are based on: (i) the analyses of the variables that define the main areas of investigation (doing the right things, doing things right, and ensuring sustainability); (ii) the content analysis of the qualitative comments and recommendations made by participants; (iii) information from informal discussions and unsolicited feedback submitted by various groups; and (iv) non-structured interviews with various training teams.

Variations in Network Performance

One of the most striking findings of the study is the difference between (i) the Networks and (ii) the operational core services (OCS, addressing core Bank business) and the Non-Networks. The differences between these two groups appear to be associated to some degree with the two factors described under (a) and (b) below.

a) The stage of program development and the expertise of the training team in design, development, and delivery. The PREM, FPSI, and ESSD Networks are among the newer groups taking on the training function in its entirety (from needs assessment to delivery and evaluation), and their function was described in interviews as largely mechanical. They appear to be less focused on the learner: their major preoccupation at this stage is with identifying and defining the content of their programs, with planning and budgeting, and with the general administration of course delivery.
Furthermore, the coordinators of such programs are not experts in the training function. The Bank operational courses, on the other hand, have had the benefit of extensive curriculum development and revision, following the Wapenhans recommendations in 1994 for the development of an "Integrated Curriculum on Bank Operations and Portfolio Management". Some of these courses are now going through a revamping exercise to enhance their quality with the help of training experts. The managers of such courses also appreciate and insist on world-class training standards for their courses.

How to move the Network training programs to a stage where training teams and coordinators begin to refine the programs and focus on achieving greater impact for the learner is a challenge for FY00. Actions to this effect could include:

· The establishment of performance standards for training courses. Models and guidelines of this type are available from the WBISD and could be further developed and disseminated for use by quality assurance teams.
· The establishment of quality assurance teams to ensure the "quality-at-entry" of training courses.
· The use of training experts and professionals as part of the management, design, development, and delivery training team.
· The implementation of a train-the-trainer program addressing all elements of the training cycle, including design, delivery, and management.
· The analysis of exemplary courses to highlight good practices, significant lessons, and models for use by learning communities involved in the training profession.

b) The knowledge structure of the training courses and the appropriateness of the formal training delivery modality. It has been postulated by some, in reviewing the results of the evaluation, that the technical courses of the Networks are more difficult to teach and are thus less likely to yield high quality assessments and measured learning gains.
The argument is that such courses have less structured knowledge domains, are subject to constant change as new knowledge emerges, and face more diversified staff needs and interests. Because of these factors, it is conjectured that the formal training structure is perhaps not an appropriate modality for the technical Networks. Bank core operational courses, on the other hand, have more structured knowledge which is easier to teach, and the knowledge, skill, and competency requirements for operational work are more definite and uniform.

The validity of this comparison has been vehemently questioned by other Bank staff members who have been involved in both types of training program, operational and technical. Their experience in working with both types of program clearly indicates that both have structured and unstructured knowledge domains; both call for the development of skills, strategies, and learning-to-learn approaches; both are subject to constant change and the consequent updating of information and materials; and both use different consultants or Bank staff as trainers. It is further argued that the explanation offered to account for the postulated differences does not help us understand the higher performance of the Non-Network courses which, like the technical Network courses, are built on technical professional subject matter.

It is interesting to note that the trend of the data in this study on the factors that enhance "satisfaction with the courses" suggests that what staff value from the training courses is not only the learning they acquire, but also the networking, the sharing of knowledge and exchange of ideas, and the development of new approaches, perceptions, and insights into Bank work during training. This evidence appears to be very significant in the case of the PREM, the ESSD, and the HD. Meeting such needs raises questions about the appropriateness of the design being applied to the training courses of the Networks.
The evidence suggests that the technical courses are generally filled with presentations and leave little room for the learning-to-learn activities and networking necessary for a learning community. Below are recommendations for improvement.
· Networks should re-assess staff learning needs relative to both staff knowledge acquisition and learning-to-learn interests.
· Technical Networks should re-visit the learning objectives, cognitive and non-cognitive, of their courses and identify the training modality or design that would best serve such objectives.
· To focus training, one recommendation is to use "intact team training", a highly efficacious method that produces high-impact learning. (Ref: the LLC Pilot in the Africa Region using the Linkage Model, and the Action Learning Workshops on the Logframe.)
· Networks should distinguish very clearly between "learning-to-learn" activities and knowledge and skill development activities, and develop the appropriate delivery methodology for each.

Inadequacies of a Content-Driven Curriculum

The major strength of the programs appears to lie in the alignment between the subject matter of the courses and Bank business and performance requirements. This alignment has an overall rating of 86 percent and is uniformly high across all group analyses. The favorable rating for subject matter is, however, complemented neither by a satisfactory depth of coverage of that subject matter nor by the application of appropriate principles of adult learning and of the transfer of learning to the job. It appears that, in the interest of covering large amounts of subject matter, the depth of learning is compromised.
Major weaknesses are associated with planning, and with instructional and performance technologies or methods: the absence of a sound assessment of learning needs and of their linkage with clearly defined competency and performance requirements of the job (evidence from content analysis and interviews); inadequate or poorly designed case studies for the analysis, evaluation, synthesis and application of knowledge to Bank work (65 percent rating 4 or 5); poor balance in the distribution of time among lecture, discussion, and the application of concepts to Bank work (69 percent); inadequacies in the types of materials provided to enhance follow-up learning and job application (69 percent); and inconsistency between stated learning objectives and what is actually delivered (evidence from content analysis and interviews). (See Tables 2, 3, and 4.) Some of these problems are closely associated with the short duration and fast pacing of the courses (69 percent); the average duration of courses is two days. The recommendations for addressing these weaknesses in needs analysis and in instructional and performance technology include the following:

Planning and curriculum development
· Conduct a better learning needs assessment, closely aligned with performance outcomes.
· Develop a modularized and integrated curriculum. Staff recommend not increasing the training days per course, but developing short course modules that are "integrated" in addressing diverse learning needs. This integrated curriculum could provide modules with manageable subject matter and use different and appropriate delivery modalities. This approach would help limit the heterogeneity in knowledge levels (as distinct from diversity in experience) among participants -- a factor noted by participants as an impediment to learning.

Instructional technology
· Improve case studies. Staff recommend the use of solid cases that allow for analysis, synthesis, evaluation and application.
This should counteract the current typical practice of presenting "case examples". Cases that are practical, are aligned with Bank work, and provide an opportunity for application are most desirable to staff.
· Maintain an appropriate balance between theory and practice, and among lecture, discussion and exercises for application. Maintaining the right balance among these instructional methods is the forte of an "expert instructor". While this evaluation study indicates a very high rating for instructor quality (86 percent of respondents rating a 4 or 5), the level of instructors' expertise in subject matter does not appear to be matched by their knowledge of how to manage the instructional process, including how to manage staff behaviors not conducive to learning. Improvements in FY00 would require considering how to work with instructors to balance the learning process, how to develop valid case studies, and how to manage classroom behavior.

Performance technology and sustainability
· To enhance the transfer of learning to the job, staff recommend better follow-up materials and enhanced access to quality web-sites. Staff also suggest various ways by which managers could enhance continuous learning and performance improvement: providing time to practice, master and sustain new knowledge and skills; providing incentives to carry out new ideas and challenges; and exercising quality control over staff implementation of new ideas.
· The preliminary analysis of cost relative to quality raises questions about the types of expenditures that are driving costs and how these relate to quality.
ANNEX A

[Figure: histogram of Questions vs Score Range with the Old LLC Instrument; horizontal axis: Score (1=Low, 6=High); vertical axis: Questions]

[Figure: histogram of Questions vs Score Range with the New Instrument; horizontal axis: Score (1=Low, 5=High); vertical axis: Questions]

ANNEX B

Summary of LLC Data - Level 1 Evaluation by Network, July-April FY99
Quantitative Results Based on Old Questionnaire for Networks and Non-Networks
(Mean scores reported)

Column order: NETWORKS -- FIPSI (N=137), PREM (N=28), HD (N=184), ESSD (N=193), OCS (N=626), TOTAL (N=1168); NON-NETWORKS -- DEC (N=236), COMM (N=895), OED (N=24), HUB (N=678), IT (N=2653), TOTAL (N=4502).

A. Course Outcome: Relevance, Effectiveness and Potential Impact
   4.89  4.63  4.95  4.81  5.10  4.87  |  4.94   5.45  4.97  4.91  5.24  5.10
1. What I Learned In This Course Will Help Me Improve My Performance
   4.87  4.25  4.88  4.67  5.15  4.76  |  4.917  5.48  4.92  4.84  5.25  5.08
2. Material And Issues Were Current And Worthwhile
   4.95  5.08  5.06  4.98  4.98  5.01  |  4.935  5.38  5.00  4.97  5.20  5.10
3. The Course Was Relevant To My Needs
   4.84  4.55  4.90  4.77  5.16  4.84  |  4.966  5.48  5.00  4.91  5.25  5.12
B. Quality of Course Design, Delivery and Instruction
   4.78  4.86  4.83  4.73  4.84  4.81  |  4.82   5.28  4.60  4.81  5.03  4.91
1. The Structure And Instructional Modes Of The Course Encouraged Learning
   4.94  4.58  4.94  4.76  5.00  4.91  |  4.99   5.39  4.75  4.95  5.15  5.05
2. The Course Objectives Were Fully Addressed
   4.73  4.50  4.85  4.77  4.96  4.76  |  4.94   5.38  5.09  4.85  5.13  5.08
3. The Course Actively And Effectively Engaged Me Throughout
   4.82  5.17  4.77  4.78  4.85  4.88  |  4.78   5.38  4.42  4.83  5.10  4.90
4. The Duration Of The Course Was Just Right
   4.61  4.92  4.75  4.60  4.54  4.69  |  4.58   4.98  4.13  4.63  4.72  4.61
C. Overall Quality, Usefulness, and Satisfaction
   4.83  4.76  4.78  4.79  4.87  4.80  |  4.90   5.35  4.56  4.85  4.98  4.93
1. Overall This Was A High Quality Course
   4.98  5.08  5.02  4.85  5.00  4.98  |  5.01   5.46  4.96  5.00  5.12  5.11
2. Overall Quality Relative To Similar Bank Training
   4.69  4.43  4.54  4.72  4.74  4.63  |  4.79   5.23  4.15  4.71  4.84  4.74
D. Quality of Instructor/s (each row in this section shows only eleven values in the source; one column value is missing and its position cannot be determined)
   5.12  5.46  5.28  5.26  5.28  5.30  5.70  5.25  5.29  5.39  5.39
1. Instructors Encouraged And Responded Well To Questions
   5.07  5.59  5.31  5.22  5.30  5.30  5.72  5.17  5.30  5.40  5.38
2. Instructor Is Knowledgeable In The Course Content
   5.24  5.60  5.50  5.45  5.45  5.41  5.73  5.39  5.43  5.44  5.48
3. Instructor Treated Participants With Respect
   5.31  5.68  5.48  5.36  5.46  5.46  5.82  5.44  5.46  5.50  5.54
4. Instructor Was Well Prepared And Organized
   5.12  5.57  5.17  5.31  5.29  5.28  5.76  5.28  5.29  5.48  5.42
5. The Pace Of Instruction Was Just Right
   4.83  4.83  4.94  4.96  4.89  5.05  5.49  4.99  4.97  5.13  5.13

ANNEX C

Bank Corporate Performance Criteria
1. Professional Quality of Services: Advisory and analytical services; Quality at entry; Portfolio management
2. Participation and Partnership: Quality of country dialogue; Aid coordination and co-financing; Participation with stakeholders
3. Selectivity: Focus on borrower ownership and capacity; Adequacy of instruments; Use of Bank's comparative advantage
4. Efficiency and Innovation: Budget efficiency; Business standards; Adaptability and responsiveness
5. Prudence and Probity: Risk management; Financial management; Monitoring and reporting
6. Knowledge Management and Dissemination
7. Professional excellence, honesty, integrity, and commitment among Bank staff
8.
Healthy work environment: Vibrant, stimulating, caring and sharing

(Sources: The Corporate Scorecard by Anil Sood; Mission, Principles and Values of the World Bank; and the Report on the Personnel Committee of the Board on "The Bank's Learning Agenda", January 29, 1999)

Annex D
Content Analysis: Participants as a Learning Community
Frequencies of Responses for 32 Courses
(Frequencies of responses given as PT / ACS; "-" indicates none recorded)

Other gains or benefits acquired from the courses: 117 / 35
  Networking: 30 / 9
  Sharing knowledge or experiences or exchanging ideas: 28 / 1
  New knowledge/new approaches, new perceptions, new insights into things: 27 / 5
  Course-specific knowledge: 13 / 6
  Appreciation of work: Bank challenges, roles, nature of business, future directions: 12 / 2
  Appreciation of team work, group work: 3 / 1
  Develop new plans and thoughts: 2 / 2
  Others (e.g. personal reasons): 2 / 9

Factors about participants which enhanced learning: 265 / 86
  Diversity, differing viewpoints, expertise, and experiences: 80 / 23
  Sharing information and experiences: 62 / 22
  Collegiality (and team work in learning): 32 / 11
  Networking: 26 / 6
  Frank and open discussions/open-minded about experiences: 25 / 5
  Respect, honesty, caring, supportive, cordial, cooperative: 15 / 11
  Good insightful questions and analysis: 12 / 1
  Similar background/experiences: 2 / -
  Enthusiasm, friendliness, bright, motivated, amiable, cooperative, humor: 7 / 3
  Cultural diversity: 2 / 2
  Punctuality: 2 / -
  Meeting colleagues from other countries: 1 / -
  Empathy and understanding: 1 / -

Factors about participants which impeded learning: 26 / 5
  Different levels of knowledge, different backgrounds: 8 / 2
  Attendance (inconsistent attendance, late arrivals, early leaving, constant entry and exit of classroom): 6 / 1
  Personal style
  Dominant group (economists)
  Group is too theoretical: 2 / -
  Too much focus on personal issues and experiences: 1 / -
  Few incidents of old Bank mentality
  Protracted comments
  Too many questions interrupting presentations: 3 / -
  Monopolized discussions: 2 / -
  Non-applicable experiences shared: - / -
  Participant accents: - / -
  Interruptions from "private instructions" among participants: 2 / 1
  Group too large: 2 / -
  Others: 1 / -

Annex E
Type of Follow-Up Support Staff Would Like to Have After Training
Frequencies of Responses for 51 Courses
(Frequencies of responses given as PT / ACS; "-" indicates none recorded)

Management Support and Commitment: 57 / 25
  Support and commitment from managers and supervisors (type not specified): 20 / 18
  Time to practice and to master and sustain new knowledge and skills and techniques: 13 / 1
  Moral support: trust, recognition, encouragement, patience, and incentives to carry out new ideas: 9 / 6
  A better job, opportunity for an assignment to more challenging tasks, or involvement in tasks: 9 / -
  Quality control, feedback, and technical support or mentoring on innovations until full confidence is built: 4 / -
  Strong manager: 2 / -

Peer Support: - / 5
  Support from colleagues: - / 5

Follow-Up Training: 33 / 15
  Follow-up training, longer or more in-depth training to upgrade or update skills: 25 / 10
  Other complementary or on-the-job training: 7 / 4
  Less advanced training: 1 / 1

Access to Experts, Resource Persons, Web-sites, Other Sources: 134 / 2
  c) Access to well-defined information on experts and resource persons: 42 / -
     Operational support and mentoring from legal, procurement: 10 / -
     Experts or senior professional staff who led the training: 5 / -
     Senior sector staff with expertise and experience for supervision and advice: 7 / -
     Coaching and mentoring and technical support by lead specialists: 7 / -
  a) Access to web-sites and other bases with simple and clear information on subject, good practices, reading materials: 35 / -
  d) Hotline, helpdesk, tools, manuals and resource guides for technical support: 12 / -
  b) Access to database on short cases, good practices, difficulties and challenges: 9 / -
  e) Development of "learning group" or creation of thematic group with other participants working on similar issues for effective implementation of ideas and for team support: 7 / 2

Others: 14 / -
  Client related: - / -
    Simple instructions for borrowers: - / -
    Help to support clients: - / -
    Hotline on resources to support/educate clients: - / -
  Support for computer technology: 4 / -
  Resource support: - / -
    Budgetary resources: 6 / -
    More staff: 4 / -
  Address external constraints: - / -
    Price regulations and external markets: - / -
    Policy support from Bank management: - / -
  More self-initiative: - / -