31298 The Effectiveness and Use in FY03 of WBI FY01-02 Activities: A Baseline Assessment in Five Countries Jaime B. Quizon Cristina Ling Chard Marlaine E. Lockheed WBI Evaluation Studies No. EG04-86 The World Bank Institute The World Bank Washington, D.C. May 2004

ACKNOWLEDGMENT This evaluation report was prepared for WBI under the overall guidance of Marlaine E. Lockheed, Manager, World Bank Institute Evaluation Group (IEG). Jaime B. Quizon, Cristina Ling Chard, and Marlaine E. Lockheed prepared this report. Ryuko Hirano contributed extensively to the task manager survey, and Humberto S. Diaz (IEG) helped with the formatting of this document. WBI Evaluation Studies are produced by the WBI Evaluation Group (IEG) to report evaluation results for staff, client, and joint learning events. An objective of the studies is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors and do not necessarily represent the view of the World Bank Group. WBI Evaluation Studies are available online at: http://info.worldbank.org/etools/WBIEG/publications/index.cfm?pg=getPubs&category=Publications&Intro=yes&instructions=no&showDetails=no&ID= Vice President, World Bank Institute Ms. Frannie Léautier Manager, Institute Evaluation Group Ms. Marlaine E. Lockheed Task Team Leaders Mr. Jaime B. Quizon Ms. Cristina Ling Chard ii

ACRONYMS 2SLS Two-Stage Least Squares AAA Analytical and Advisory Services CAS Country Assistance Strategy CE Capacity Enhancement CRS Client Recording System DL Distance Learning FGD Focus Group Discussion FY01-02 Fiscal Years 2001 to 2002 GNI Gross National Income IEG Institute Evaluation Group K&S Knowledge and Skills LICUS Low-Income Country Under Stress MDGs Millennium Development Goals OLS Ordinary Least Squares PPP Purchasing Power Parity PRSP Poverty Reduction Strategy Paper RCET Regional Capacity Enhancement Team TM Task Manager TTL Task Team Leader WBI World Bank Institute iii

TABLE OF CONTENTS
ACKNOWLEDGMENT .............................. ii
ACRONYMS .............................. iii
TABLE OF CONTENTS .............................. iv
EXECUTIVE SUMMARY .............................. v
WBI'S NEW COUNTRY FOCUS .............................. 1
Evaluation in Country Context .............................. 1
Evaluation Objectives .............................. 2
EVALUATION METHODS .............................. 3
WBI Task Team Leaders' Views of the Learning Events .............................. 3
Participant Survey Sample .............................. 4
Participant Characteristics .............................. 5
ACTIVITY RELEVANCE, EFFECTIVENESS AND USE .............................. 5
Activity Relevance .............................. 5
Activity Effectiveness .............................. 6
Activity Use .............................. 8
FEATURES OF ACTIVITIES .............................. 9
Activity Objectives .............................. 9
Activity Delivery Features .............................. 12
Activity Follow-up with Participants .............................. 13
DETERMINANTS OF EFFECTIVENESS AND USE OF WBI LEARNING EVENTS .............................. 14
Regression Results: Activity Effectiveness .............................. 15
Regression Results: Activity Use .............................. 17
SUMMARY AND LESSONS FOR WBI .............................. 20
ANNEXES
Annex A: WBI Focus and Priority Countries .............................. 22
Annex B: Impact Assessment of WBI activities .............................. 23
Annex C: Variable list, definition, source, means and standard deviations .............................. 28
REFERENCES .............................. 31
iv

EXECUTIVE SUMMARY

This paper analyses the level and determinants of WBI activity effectiveness and use by participants in five WBI focus countries: Brazil, Egypt, Russia, Thailand and Sri Lanka. It uses data drawn from surveys of 3,091 participants from these countries who participated in FY01-02 WBI activities; in all, a total of 793 participants from about 131 activities responded. Participant responses are supplemented by a survey of WBI Task Team Leaders (TTLs) responsible for these activities, who were asked a number of questions regarding the activity objective, features of its design and delivery, and follow-up.

Activity effectiveness varied widely across the five countries, with the highest ratings given for effectiveness in building individual knowledge and skills and by participants from Brazil. Participants used the knowledge and skills only modestly, and principally for awareness raising, teaching and research. These results are fully consistent with the Institute's directions in FY01-02.

The most important determinant of activity use was its effectiveness, as rated by participants. Several design and delivery features of activities helped boost effectiveness:
· Activities designed with partners and designed to respond to country needs were 30 percent more effective than those that were not.
· Activities delivered with an action plan, as part of a series, and to teams of participants from the same institution were 20 percent more effective than other activities.
These findings suggest that a country focus in activity design and delivery will boost effectiveness by more than 50 percent.
Net of activity effectiveness, few activity features affected use. In particular, the objective of the activity was unrelated to its use. Use was higher for activities delivered in English and delivered as part of a series of events; it was higher when there was activity follow-up. Finally, use was higher for participants in higher positions, particularly for use in public policy. v

WBI'S NEW COUNTRY FOCUS

1. In FY02, the World Bank Institute (WBI) embarked on a new, country-focused, capacity-building strategy. The Institute established a country pillar and the Regional Capacity Enhancement Team (RCET) to give new prominence to country needs as key determinants for the design and delivery of WBI sector/thematic products and services. The new strategy calls for WBI learning and other activities to link up with clearly identified capacity-building outcomes and processes (i.e., Country Assistance Strategy, Poverty Reduction Strategy Paper, Community Driven Development, Comprehensive Development Framework) in specific priority countries requiring assistance. In response to this new focus, the Institute Evaluation Group (IEG) initiated "country focus" evaluations to establish baselines against which to judge the success of this change. This evaluation report draws on the first five country focus evaluations to address two questions: (a) how effective were past WBI activities, when examined from a country perspective, and (b) what activity features are associated with relatively more effective activities?

Evaluation in Country Context

2. In accordance with the new strategy and in consultation with the regions, WBI identified 12 countries as targets for intensive, multi-sectoral, capacity-building activities. Countries were selected on the basis of their readiness as clients; WBI's relationship with clients; and the "rapid results" collaboration mode (or the overall discipline in translating intent and learning into action). The selected focus countries are listed in Table 1.1

Table 1. WBI Focus Countries, FY03
Region | Low-income | Lower- and middle-income
LCR | Guatemala | Brazil
EAP | Laos | Thailand
AFR | Nigeria | Burkina Faso
MENA | Yemen | Egypt
ECA | Tajikistan | Russia
SAR | Afghanistan | Sri Lanka

3. The challenge for the WBI country pillar is to show actual on-the-ground capacity development impacts in these focus countries over the next three years. These impacts are to happen in clearly defined, country-specific, CAS priority thematic areas that WBI takes upon itself to develop.

1 In February 2004, WBI officially raised the number of its priority countries to 30. These countries, which include the 12 identified in this table, are listed in Annex A.

4. This report builds on recently completed, individual IEG retrospective evaluations in five WBI focus countries: Brazil, Egypt, Russia, Sri Lanka, and Thailand.2 IEG selected these countries not only because they are WBI focus countries from different regions, but also because they count among those with the largest numbers of WBI learning event participants in FY01-02.3

5. The five country studies (a) appraise the relevance of past WBI activities to country needs, (b) assess the perceived effectiveness and impacts of WBI activities, (c) examine factors related to effectiveness and impact, and (d) establish baseline data for subsequent prospective country-focused evaluations. Methodologically, these country-specific assessments build on earlier IEG evaluations of key WBI thematic programs (see Khattri, Quizon, et al., 2002).
They form part of a continuing IEG attempt to broaden the spotlight of its evaluations from the reactions and learning gains of participants (Levels 1 and 2 evaluations) to the behavioral and institutional (Levels 3 and 4) impacts that the WBI learning events are supposed to engender.4

6. Presently, WBI has the Bank-wide mandate to promote the Capacity Enhancement (CE) agenda in client countries. Hence, the number and share of WBI CE activities in the WBI focus countries can be expected to grow in the coming years. In building up these CE investments, an understanding of WBI's past learning events and other activities in its focus countries is crucial. To move forward with a successful program, it is necessary to learn what was successful in the past at the country level, and how these successes might be scaled up.

Evaluation Objectives

7. This evaluation brings together two separate perspectives on the same WBI learning events: that of the WBI participant and that of the responsible WBI task team leader (TTL). We gathered these viewpoints separately, using the evaluation methods described below. We asked participants, inter alia, to rate the relevance and effectiveness of the WBI learning event and their use of the knowledge and skills they acquired from it. We asked WBI TTLs to describe the objectives, design and delivery of the same WBI learning event, including related activities following the event. The main analysis relates these two standpoints in an attempt to answer the key research question underlying this study: what elements and characteristics of WBI learning events result in greater effectiveness and desired CE impacts at the individual, institutional and policy/country levels?

2 These reports are authored by IEG staff and consultants: Eckert, Sousa and Gunnarsson (2003) for Brazil; Zia (2003) for Egypt; Bardini, Manjieva, Narozhnaya and Gunnarsson (2003) for Russia; Khattri (2003) for Sri Lanka; and Quizon and Chard (2003) for Thailand.
3 Currently, similar retrospective country evaluations are ongoing in five other WBI focus countries (Burkina Faso, Guatemala, Nigeria, Tajikistan and Yemen) and two priority countries (Indonesia and Kenya).
4 Because thematic program activities often happen in different regions and involve many countries that are in various stages of development, it has hitherto been difficult to provide convincing evidence of cost-effective capacity enhancement and/or of any significant on-the-ground development impacts for any WBI thematic program. WBI believes that a country focus approach will resolve this.

8. In answering this main evaluation question, we first offer two summaries. We summarize the key results from survey and other data gathered exclusively from WBI learning event graduates; these findings draw from the five country-focused retrospective evaluations. We also run through the main evaluation results from the TTLs' perspective; these draw on new survey data that we collected for this study and that are discussed at length below. The principal focus of these summaries is on WBI's current understanding of its own CE activities (or the elements that result in CE) in these five countries. Do these elements determine participants' higher ratings of WBI's overall effectiveness and impacts?

EVALUATION METHODS

9. This evaluation is a meta-analysis of data from five country studies of WBI activities, supplemented with original data collected from the WBI staff involved in delivering the activities.
This section describes these data collection methods.

Participants' Views of WBI Learning Events

10. All five of the original studies used three methods of data collection: follow-up surveys of activity participants using a common questionnaire (translated for and pre-tested in each country), focus group discussions with past participants held in-country, and interviews with World Bank country operations staff. In addition, one or more country studies used participant interviews and a survey of WB country staff. Full descriptions of the methods used are presented in each of the country-specific studies.

11. This report makes use of all these sources of information from the country-specific studies in summarizing, from the participants' viewpoint, (a) the overall relevance of WBI's past efforts to address countries' needs; (b) the effectiveness and impact of WBI's activities; (c) the general nature and extent of WBI interventions in the focus countries; and (d) WBI partnerships and follow-up activities.

WBI Task Team Leaders' Views of the Learning Events

12. Adding a new dimension, this report uses data from a survey of WBI task team leaders (TTLs) directly involved in the delivery of the specific activities attended by the sampled respondents in the participant surveys.5 This TTL survey asks about: (a) the Capacity Enhancement (CE) objectives of the specific thematic learning events that they task managed, particularly as these relate to individual, institutional and overall country level goals; (b) the design and delivery of the thematic learning events they provided; (c) the follow-up activities that happened after the event; and (d) other matters that relate to their learning events. TTL survey results are analyzed separately and also in conjunction with data already obtained for the five WBI priority countries to determine whether WBI's understanding of CE activities, and/or the elements that result in CE, are really what determine participants' higher ratings of WBI's overall effectiveness and impacts.

5 A copy of the TTL survey is attached as Annex B of this report.

Participant Survey Sample

13. The design and implementation of the participant surveys in each of the five countries were very similar and involved either the full population or a random sample of participants in each country from among FY01-02 WBI learning event "alumni" who had contact information (e-mail address, home/office telephone number, fax number, or home/office address) in WBI's Client Recording System (CRS) database. The distribution of participants and respondents in the WBI participant surveys is shown in Table 2.

14. The first column lists the total number of WBI participants as entered in the CRS by the activity team.6 The second column shows the initial count of those participants with names, which is the number of potential respondents in each country as listed in the CRS; for two countries, a random sample from this population was selected. The third column records the number of WBI participants in each country having contact information from one of four sources: e-mail address, fax number, telephone number or mailing address in the selected country.

15. Seventy-nine percent of the 1,268 (sampled) participants were eligible to be surveyed, based on having contact information. Eligible participants were contacted, with repeat attempts made where necessary; response rates varied from 52 to 91 percent, with most non-responses due to inaccurate or outdated CRS contact information.
Correct contact information will be extremely important in the context of WBI's country-focus strategy, where enhancing capacity will entail sustained relationships with WBI alumni and their institutions.

Table 2. Sampling of Participants, by Country
Country | Total participants* | Participants (Ps) with names | Ps with contact information | Response rate
Brazil | 1,826 | 300 | 231 | 52%
Egypt | 882 | 248 | 203 | 84%
Russia | 3,046 | 1,628 (300 sampled) | 217 | 91%
Sri Lanka | 258 | 184 | 169 | 80%
Thailand | 1,536 | 768 (236 sampled) | 186 | 86%
Total | 7,548 | 1,268 sampled | 1,006 | 79%
* These totals exclude one-day and World Links activities.
6 These totals exclude one-day and World Links activities.

Participant Characteristics

16. The five countries are relatively equally represented in the overall respondent group: Brazil (15 percent), Egypt (22 percent), Thailand (20 percent), Russia (27 percent), and Sri Lanka (15 percent). Men represent a slight majority of this sample (58 percent). The age of respondents ranges from 20 years to 83 years, with an average of 43 years.

17. Respondents' work affiliations include: national, regional, and local governments (30 percent, seven percent, and seven percent, respectively); the private sector (21 percent); universities and research institutions (19 percent); not-for-profit organizations or donor agencies (seven percent); and other organizations (eight percent).

18. Most respondents held high-level positions. Over half reported that they were senior level (49 percent) or belonged to the top level (seven percent) of their work organization. One quarter of respondents (25 percent) described themselves as middle level. Relatively few held entry or junior level positions (12 percent).7

19. Participants were relatively proficient in the language of instruction and in the technical terminology used in the course. The participant surveys asked respondents to rate their competence in both. Their average ratings are essentially the same for the two categories: 5.86 for language and 5.72 for technical terminology, on a scale from one (not proficient at all) to seven (highly proficient).

ACTIVITY RELEVANCE, EFFECTIVENESS AND USE

20. This section compares the five countries with respect to participants' ratings of activity relevance, effectiveness and use.

Activity Relevance

21. Countries varied in the degree to which activities were "country focused." We use three indicators of country focus: (a) the share of participants from the country in the activity, (b) participants' perception that the activity was designed for them, and (c) participants' reports of activity relevance to country needs. Figure 1 illustrates these three separate indicators (or measures) of "country focus" and relevance of WBI activities, and the variation across countries in each measure.

22. Participants from most of the countries did not attend a WBI activity with many others from the same country. The first column shows the average share of country participants in the activity. For Russia, about two-thirds of activity participants were from the country, but for all other countries the average share of country participants was below 40 percent. Sri Lankans, as a share of activity participants, averaged well below 20 percent.

7 Nine percent fell in the "Other" category.
Figure 1: Perceived Country Focus of Activities (percentage of participants), shown by country (Brazil, Egypt, Russia, Sri Lanka, Thailand) for three measures: the share of participants in the activity from the country, the share reporting the activity was designed for participants, and the share reporting high relevance to country needs.

23. Many participants reported that the activity had been designed for them. The second column shows the share of participants who indicated that the activity was designed for participants in their country. More than half of participants from Brazil, Russia, and Sri Lanka reported that the activity was designed specifically for them.

24. Finally, activities were viewed as relevant to country needs. The third column shows the share of participants who indicated that the activity topics were highly relevant to their country (ratings of 6-7 on a 7-point scale). The majority of participants in Brazil (74 percent), Egypt (57 percent) and Sri Lanka (60 percent) reported that activities were highly relevant to their country. On the other hand, fewer than half reported high relevance for Russia (29 percent) and Thailand (32 percent).

Activity Effectiveness

25. On the whole, participants' ratings of effectiveness are positive, particularly for individual benefits. Figure 2 illustrates average effectiveness scores, on a scale of one to seven, from the full sample of participants surveyed. Respondents rated WBI activities above average in all five areas of effectiveness: offering networking opportunities (5.17), raising awareness of development needs (5.10), increasing knowledge and skills (5.08), providing strategies for development needs (4.8), and giving approaches for addressing the needs of participants' organization (4.6).

Figure 2. Perceived Activity Effectiveness in Five Areas (participant mean ratings, 1 = not effective at all to 7 = extremely effective): networking 5.17, awareness of development issues 5.1, knowledge and skills 5.08, strategies for country 4.8, strategies for organization 4.6.

26. Ratings varied widely across countries, however, with participants in Brazil, Sri Lanka and Egypt rating activities more highly than participants in Thailand and Russia. Figure 3 illustrates the proportion of respondents rating activities 6-7 (on a 7-point scale) for effectiveness in providing (a) knowledge and skills, (b) organizational strategies and approaches, and (c) country strategies and approaches to address country development needs. Only in Brazil did a majority of participants rate WBI's activities as effective or extremely effective in all three areas (68, 53, and 57 percent, respectively). Following Brazil, participants from Sri Lanka (50, 39 and 47 percent) and Egypt (43, 29, and 36 percent) rated activities most effective, although even these shares typically fall short of half of participants. Fewer than a quarter of participants from Russia (23, 23, and 20 percent) and Thailand (24, 23, and 22 percent) viewed WBI's activities as highly effective in these main areas.

Figure 3. Participants Giving High Ratings of Effectiveness, by Country (percentage of respondents rating 6-7) in three areas: knowledge, organizational strategies, and country strategies.

Activity Use

27. WBI activities were not often used by participants, particularly for non-academic purposes. Figure 4 illustrates the average frequency of use in "academic" activities and in "operational" activities, the latter defined as developing country strategies, taking community initiatives, changing regulations or implementing country strategies.
Participants' mean ratings of academic use, again on a 7-point scale, were modest to slightly above average: raising awareness of development issues (4.6); conducting research (4.1); and teaching (4.2). Mean scores for operational use were about the mid-point or slightly below: organizing community initiatives (3.75); implementing new work practices (4.31); developing country strategies (3.93); influencing legislation and regulation (3.69); and implementing country strategies (3.6).

Figure 4: Reported Use of WBI-Acquired K&S (mean ratings, 1 = not at all to 7 = very often): raising awareness 4.6, implementing work practices 4.31, teaching 4.2, research 4.1, developing country strategies 3.93, community initiatives 3.75, changing regulations 3.69, implementing country strategies 3.6.

28. In all but Sri Lanka, fewer than half the participants reported that they frequently used their WBI-acquired knowledge and skills (K&S). Figure 5 illustrates the proportion of respondents, by country, who rated their use of what they learned as "often" or "very often" (i.e., a 6-7 on a 7-point scale) in various areas: developing country strategies; legislation; teaching; and organizing communities. Of those who indicated that WBI-acquired knowledge and skills could be applied to their work, fewer than half actually used what they learned, except in Sri Lanka. Fifty-five percent of Sri Lankan respondents reported using their new skills for teaching; 51 percent used what they learned to develop country strategies; and 50 percent applied their WBI K&S to organizing communities. On the whole, WBI's lessons were most frequently applied to teaching, but again, this was not very often. Overall, these results for activity use suggest that the impacts of WBI learning events --- as with activity effectiveness --- are not extraordinary and vary significantly by country. There is certainly considerable room for improving on WBI's performance in specific country settings.

Figure 5: Participants Reporting High Use (percentage of respondents rating 6-7), by country, for developing country strategies, legislation, teaching, and organizing communities.

FEATURES OF ACTIVITIES

29. We conducted a formal survey of task team leaders (TTLs) for the WBI learning activities attended by the respondents to the participant surveys. The questionnaire asked TTLs closed-ended questions relating to the objectives, design, and delivery of the learning event, including later follow-up activities with participants (see Annex B). In order to maximize efficiency and reduce the burden on TTLs, we requested each TTL to complete survey questionnaires for up to two learning events, even when they might have managed more activities. It was unreasonable to expect TTLs to answer questions for every activity they managed during this time period, as some TTLs were in charge of as many as 25 activities. We therefore selected the events with the highest numbers of respondents to the participant surveys. In all, we had the potential of matching TTL responses with 536 respondents (out of 793 total respondents) from the participant survey.

30. Between January 23 and March 30, 2004, IEG contacted 42 TTLs of 62 activities through several waves of e-mails and with the generous support of WBI higher management; non-respondents were telephoned directly. We obtained completed questionnaires from 36 task managers of 52 learning activities, for a response rate of 83 percent.
The activities for which we were unable to interview the TTL did not differ from the others in terms of sector, and all programs were represented in our sample. We were able to match TTL data with 405 respondents to the participant survey (52 percent of the full respondent population and 40 percent of the activity population).

Activity Objectives

31. The survey asked TTLs to rate the objectives of the specific learning activity that they managed. The survey first identified three levels of development objectives --- individual, institutional and policy/country --- and then broke each of these objectives down into several key elements. On a scale from one (not an objective) to seven (primary course objective), the survey asked TTLs to rate the importance of these individual, institutional, and policy development objectives to their learning events.

32. Individual Development. For individual development, the most important objective was enhancing participants' knowledge and skills. TTLs rated the four key elements in the following order: enhancing participants' knowledge and skills (K&S); raising participants' awareness; acquiring additional knowledge through networking; and training participants to share and disseminate knowledge (e.g., training of trainers). (See the first section of Figure 6.)

Figure 6: Activity Objectives (TTL mean ratings, 1 = not an objective to 7 = primary objective), for the elements of human development (awareness, K&S, training of trainers, networking), institutional development (rules and regulations, enforcement mechanisms, organizational effectiveness, relations with partners, long-term relationships with local institutes) and policy development (formulating policies, building consensus, implementation, monitoring).

33. Institutional Development was less likely to be a primary course objective, with average scores substantially lower than individual development scores; they ranged from 3 to 4.2 (second section of Figure 6). TTLs rated elements of institutional development in the following order: developing and improving rules and regulations; improving the structure or effectiveness of participants' work organizations; improving relations with partner institutions through delivery of learning events together; improving enforcement mechanisms, such as making sure new policies are followed; and establishing long-term relationships with local institutes.

34. Country Development. Finally, with regard to the elements of policy/country development objectives, TTLs rated them between individual and institutional development objectives, as follows: formulating new policies; building consensus towards common action; implementing new policies or undertaking common action; and monitoring new policies or changes.

35. In all, these surveys of WBI TTLs indicate that while the key aims of WBI learning events are at the individual development level, the loftier goal of policy development remains an important activity objective. Institutional development, a key aspect of capacity enhancement, was much less important in the view of these TTLs. It is difficult to know if WBI learning events have had their intended impacts on institutional development and policies, given the often long string of effects and/or of events that are necessary to establish such attribution. It is easier to ascertain whether WBI learning events have had actual effects on participants directly (or at the individual development level).
The relevant question then is whether WBI learning events, and the follow-up activities they engendered, have had any non-trivial impacts on participants' knowledge and skills and behaviors. We will attempt to answer this question in a later section.

Activity Design Features

36. We asked TTLs about three dimensions of activity design: (a) partnership, (b) tailoring, and (c) the source of the request for the activity. Each of these can be considered a dimension of "country focus." TTLs rated the importance of eight design features on a scale of 1 (not important) to 7 (extremely important). Three dimensions of design were sharply more important than others (Figure 7).

37. Partnership, particularly with World Bank regional and country staff, was rated as important to the activity design. The lowest ratings, however, were given to partnership with local partners, with the importance of international partners falling between the two (first three columns of Figure 7).

Figure 7: Task Manager Ratings of Activity Design (1 = not important to the design; 7 = extremely important to the design), covering design with Bank regional/country staff, with international partners, and with local partners; tailoring for the region and for the country; and responses to Bank Operations requests, needs assessments, and government requests.

38. Regionally Tailored. TTLs did not tailor activities offered in FY01-02 specifically for country needs; rather, activities were tailored for participants from a region. Designing region-wide WBI activities was more important than designing country-specific learning offerings. We would not necessarily have expected these activities to be designed for participants from a particular country, because these FY01-02 learning events took place before WBI launched its country-focus CE strategy. However, we include this measure in our baseline documentation because we anticipate that, in the future, TTLs will report that participants from the country are highly important for their activity designs.

39. Operational Requests. Finally, TTLs reported that designing activities to meet requests by Bank Operations in support of their work (e.g., CAS, PRSP, AAA) was more important than responding to formal needs assessments by WBI or to specific requests from governments. This is another area in which we would anticipate change as the new CE strategies roll out.

Activity Delivery Features

40. Most information about activity delivery features comes from the TTL survey: (a) whether the activity was part of a series, (b) mode of delivery, (c) fee structure, (d) language of instruction, (e) development of an action plan, and (f) delivery with a local partner. Another delivery feature was computed: participants attending with institutional colleagues.

41. Part of a Series. TTLs reported that the vast majority of learning activities (82 percent) were delivered as part of a series of events addressing the same issue. Task managers were asked whether or not the activity was "part of a series of linked WBI events aiming to address the same issue in this country or region." This finding is important, particularly in the context of local capacity enhancement, where there is a need to review and upgrade participants' awareness, knowledge and skills on a regular basis.

42. Face-to-Face. Most of the courses were conducted face-to-face (65 percent); some 35 percent were distance learning (DL) or included a blend of face-to-face and videoconference deliveries (20 percent blended; 15 percent DL only).
These numbers differ from those in the CRS for these years (44 percent face-to-face, and 55 percent DL or blended).

43. Fee. Fewer than half of the activities charged a course fee: 39 percent asked participants to pay, versus 61 percent that did not.

44. The Language of Instruction in about two-thirds of the activities was English (64 percent), or English with local translation (14 percent). Only fourteen percent of learning events were delivered in the local (or another) language. This is comparable to the distribution of FY01-02 activities recorded in the CRS.

45. Action Plans. As a design feature, the use of action plans in activities has previously been found to be important to overall activity effectiveness, although they have been used relatively rarely: about 39 percent of FY00-01 activities used action plans (Khattri, Quizon, et al., 2002). Nearly two-thirds of TTLs in this survey, however, reported using action plans (62 percent).8 This number differs sharply from the 34 percent of participants reporting the use of action planning in the courses they attended.9 We were not able to investigate this discrepancy directly, but suggest that it may be attributed to differences in perceptions. What some participants might view as straightforward learning assignments may be viewed as action plans by TTLs. This might also indicate a recall effect, whereby WBI event participants remember only the most prominent or immediately useful action planning exercises.

8 The task manager survey asked TTLs: "Did the participants develop an action plan (e.g. work plans, strategy papers, policy documents, assessment of country needs) in the course?" Similarly, the participant surveys asked respondents: "During the learning activity, did you develop an action plan (e.g. work plans, strategy papers, policy documents, assessment of country needs)?"
9 This percentage adjusts for differences in the number of participants per activity. Hence, the differences in the responses to the questions cannot be explained by an unequal distribution of participants across WBI events "with action learning" vs. "without action learning."

46. Partners. Most activities were delivered with both non-Bank country- or region-based partners and "partner" Bank staff in the country or region. TTLs rated country- and region-based partners as more important to the delivery of the activity (mean rating = 6.2) than Bank staff, who were rated as only somewhat important to the delivery (mean rating = 4.8). This finding supports results from interviews with resident Bank staff, who complain that they are at times bypassed by WBI TTLs and that WBI's local partners appear to know more about WBI activities than key Bank staff in the country missions.

47. Participant Teams. Finally, 103 activities (79 percent) were delivered to groups of participants including teams from the same institution. Thirty-four percent of activities were attended with pairs of colleagues; 31 percent were attended with an average of four colleagues; 12 percent were attended with eight colleagues; and 4 percent were attended by teams of ten or more colleagues.

Activity Follow-up with Participants

48. Activity follow-up has also been previously identified as an important factor in overall activity effectiveness. TTLs reported that follow-up with participants took place after the completion of WBI activities by: WBI (for 63 percent of WBI events); WBI's partner (73 percent); the WB Country Team (51 percent); or the participants themselves (48 percent).10 (See Figure 8.)
However, participants themselves report significantly lower levels of follow-up contact by WBI, with only 27 percent of participants reporting such contact.

Figure 8: Percentage of TMs Indicating Responsibility for Follow-up and Actual Contacts with Participants, shown for WBI, WBI's partner, the country team, and other participants.

10 Ninety-four percent of TTLs report having provided participant contact information to other participants.

49. While WBI's partners were the most likely to contact participants in some follow-up activity, a majority of WBI task managers (65 percent) considered it their responsibility to engage participants in related activities after the learning event. Some 54 percent of TTLs said that follow-up was the responsibility of WBI's partners as well. About half of TTLs believed follow-up was the responsibility of the WB Country Team (50 percent), and far fewer saw it as the responsibility of participants (19 percent). The latter contrasts with the earlier figure, in which approximately 48 percent of TTLs reported that participants had actually initiated or taken part in follow-up activities on their own.

50. We asked task managers to rate, on a scale from one (not at all) to seven (frequently), how often they or their partners used various media (electronic, mail/fax, or face-to-face) to follow up with participants. Electronic mail was the most frequently used, mainly for announcements or updates on course topics (mean = 4.5) and for responding to participants' specific requests (mean = 4.4). Face-to-face approaches were used less often (mean = 3.5). TTLs rarely used mail or fax as a means of communicating with participants (mean = 2.1).

DETERMINANTS OF EFFECTIVENESS AND USE OF WBI LEARNING EVENTS

51. How do WBI learning activity objectives, designs, delivery features and related follow-up activities, as described by TTLs, relate to or influence participants' ratings of effectiveness and use (or impacts) of the same events? To answer this question, we combine and jointly analyze the data from the participant and the TTL surveys for the same WBI learning activity. The next section defines the dependent and independent variables used in this analysis; following this, we report the results. The analysis uses a two-stage instrumental regression model, where the first stage explains activity effectiveness and the second stage relates these estimates of activity effectiveness, together with activity features, to participants' use of the learning activity.

Dependent Variables

52. We define activity effectiveness as a composite variable, with values recalibrated to range from zero to one. This dependent variable is measured as the participant's average rating (on a 7-point scale) of the effectiveness of the learning event in five areas: enhancing their understanding and awareness of development issues; increasing their knowledge and skills (K&S); providing them with strategies for addressing their country's development needs; offering approaches for addressing the needs of their organization; and introducing them to others who are interested in the theme of the learning activity.11

53. We define activity impact as use, which is also a composite variable ranging from zero to one. We have three measures of this dependent variable. All three are average scores for how frequently participants used the knowledge and skills (K&S) they acquired from the WBI learning event. Academic use is the average self-rating of the frequency of use in academic and similar pursuits, i.e., conducting research, teaching, and raising public awareness of development issues. Operational use is the average self-rating of how frequently they used their acquired K&S in four action-oriented endeavors, i.e., implementing new practices within their work organization, organizing collective initiatives, influencing legislation and regulation, and implementing country development strategies. Overall use is the average self-rating of how frequently they used their acquired K&S in both academic and operational undertakings.
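To make the construction of these composite measures concrete, the sketch below shows one way they could be computed from the participant survey data. It is illustrative only: the column names are hypothetical, and because the report does not spell out the exact recalibration formula, a simple linear mapping of the 1-7 scale onto the zero-one range is assumed.

```python
import pandas as pd

def rescale_1_7_to_unit(x):
    """Map a 1-7 Likert rating onto the zero-one range (assumed linear recalibration)."""
    return (x - 1) / 6.0

# Hypothetical column names for the five effectiveness ratings and the
# use-frequency ratings described in paragraphs 52-53.
EFFECTIVENESS_ITEMS = [
    "eff_awareness", "eff_knowledge_skills", "eff_country_strategies",
    "eff_org_strategies", "eff_networking",
]
ACADEMIC_USE_ITEMS = ["use_research", "use_teaching", "use_raising_awareness"]
OPERATIONAL_USE_ITEMS = [
    "use_new_work_practices", "use_collective_initiatives",
    "use_legislation_regulation", "use_country_strategies",
]

def add_composites(df: pd.DataFrame) -> pd.DataFrame:
    """Add the composite dependent variables: effectiveness and the three use measures."""
    out = df.copy()
    out["effectiveness"] = rescale_1_7_to_unit(out[EFFECTIVENESS_ITEMS].mean(axis=1))
    out["academic_use"] = rescale_1_7_to_unit(out[ACADEMIC_USE_ITEMS].mean(axis=1))
    out["operational_use"] = rescale_1_7_to_unit(out[OPERATIONAL_USE_ITEMS].mean(axis=1))
    out["overall_use"] = rescale_1_7_to_unit(
        out[ACADEMIC_USE_ITEMS + OPERATIONAL_USE_ITEMS].mean(axis=1)
    )
    return out
```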
Independent Variables

54. Annex C defines the main variables (from the participant and TTL surveys) that might explain participants' ratings of effectiveness and use of WBI learning events. These independent variables can be organized into six groups: activity objectives, activity design, activity delivery features, activity follow-up, participant characteristics, and other. Under each of these groups, Annex C also lists the more disaggregated (or more clearly defined) elements or variables that constitute the group. In general, the activity-level variables are from the TTL survey, although there may be equivalents of the same activity-level variable available from the participant surveys; some of these are also shown in Annex C.

Regression Results: Activity Effectiveness

55. We tested several factors and models that might explain our composite measure of activity effectiveness, including activity-related variables (i.e., activity objectives, activity design features, activity delivery measures, and activity follow-up aspects), participant-related characteristics and exogenous factors (i.e., country). Table 3 presents the results of an OLS regression model of activity effectiveness. Among all the OLS models that we tested, this regression estimate has about the highest explanatory power (R2 = .28 and adjusted R2 = .26).12 It also includes all those explanatory variables that appear to be the most consistently significant and meaningful in the other test regressions.

56. Activity effectiveness is largely determined by activity design and delivery features; exogenous participant characteristics (gender and age) are unrelated to effectiveness. There is a large country effect that is not explained by other variables. Interestingly, activity objectives were unrelated to effectiveness.13

11 We also used composites of these five areas to construct alternative decomposed effectiveness variables. In particular, we created: (a) an "academic effectiveness" variable, which is the average participant rating for the first two areas (enhancing understanding and awareness of development issues and increasing K&S), and (b) an "operational effectiveness" variable, which is the average of the last three areas of effectiveness.
12 We also ran regressions with decomposed measures of activity effectiveness (see footnote 11) as the dependent variable. These alternative estimates did not show any significant improvements over the result reported in Table 3.
13 We recalibrated all variables to between zero and one in order for the beta coefficients to illustrate the maximum effects.
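As an illustration of the kind of OLS specification summarized in Table 3, a minimal sketch follows. The variable names are hypothetical, `matched` stands for the merged participant-TTL dataset (with the composites added as in the earlier sketch), all regressors are assumed to be recalibrated to the zero-one range (footnote 13), and Russia is the omitted country category; this sketches the modeling approach rather than reproducing the authors' estimation code.

```python
import statsmodels.formula.api as smf

# `matched` is assumed to be a pandas DataFrame that merges the participant
# and TTL surveys at the activity level and contains the composite measures.
# Regressors mirror the rows of Table 3 (hypothetical names).
effectiveness_formula = (
    "effectiveness ~ designed_with_partner + designed_for_country"
    " + designed_for_region + action_plan + duration + I(duration ** 2)"
    " + part_of_series + n_colleagues + female + age"
    " + C(country, Treatment(reference='Russia'))"
)

# Fit the first-stage (effectiveness) model on the matched data set.
first_stage = smf.ols(effectiveness_formula, data=matched).fit()
print(first_stage.summary())
```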
Interestingly, activity objectives were unrelated to effectiveness .13 Table 3: Effectiveness as a Function of Activity and Participant Characteristics (OLS Coefficients) Variable Estimate Activity design TM designed with partner .209** TM designed for needs of country .096*** TM design for needs of region -.076*** Activity delivery Action plan .111*** Duration .004 Duration2 -.000 Part of series .073*** # colleagues attended .006* Participant characteristics Female .029 Age .001 Country Sri Lanka .128*** Brazil .118** Egypt .063*** Thailand -.006 constant .257*** N=405 R2=.28, R2=.26 ***=p<.01,**=p<.05, *=p<.10 57. Design features. Two design features were important: designing the activity with a partner and tailoring it to country needs. Activities with these design features were 30 percent more effective than activities lacking these two features. Involving partners in the design phase of activity planning is a positive step toward enhancing activity effectiveness; it can also contribute to institutional capacity enhancement. 58. A focus on country versus regional needs in the design of the same activities has opposing effects. Whereas activities tailored specifically for the needs of the country received significantly higher ratings of effectiveness (ß=.10, p<.01), activities tailored to meet the needs of the region were significantly less effective (ß= -.08). These results suggest that WBI's country-focus approach is correct; regional approaches are not as effective and may even be detrimental to overall effectiveness. Interviews with WBI task managers support this general finding. WBI TTLs generally note that regional 13We recalibrated all variables to between zero and one in order for the beta coefficients to illustrate the maximum effects. 16 approaches are often too broad and that tailoring activities to the needs of the country is necessary for them to be more relevant and effective. In short, a country focus must be present already at the initial stage of learning activity design. 59. Delivery features. Three activity delivery features were important. Activities with action plans were 11 percent more effective and those that were part of a series were seven percent more effective. Activities in which teams of colleagues attended were six percent more effective than those in which individuals attended with fewer than ten colleagues.14 One activity delivery feature that was, unlike previous studies, unrelated to effectiveness was the duration of the activity. The number of days an activity lasted was not related to participants' assessments of activity effectiveness. Our findings confirm results from previous IEG studies (Quizon and Chard, 2003; Khattri, Quizon, et al., 2002). 60. Country differences. Country-level factors account for a large share of the variance in participants' ratings of effectiveness. The country variables (Brazil, Egypt, Sri Lanka and Thailand) capture all those exogenous country-specific attributes and conditions affecting participants' ratings of WBI effectiveness but which we cannot capture with our current instruments. That these country-based factors matter in significantly explaining large differences in ratings of effectiveness only underscores the importance of a country-focus strategy in raising overall performance ratings of WBI activities. The five countries can be ranked according to the relative levels in WBI's effectiveness that exist among them. 
There are significant differences between Russia, the baseline category, and Sri Lanka (ß= .13, p<.01), Brazil (ß= .12, p<.01), and Egypt (ß= .06, p<.05). There is no significant differences between Russia and Thailand. With the possible exception of Brazil, these relative rankings of effectiveness mirror the relative levels of economic development of these countries as measured, for example, by GNI per capita (PPP adjusted).15 This finding suggests that WBI learning events are more effective in lower-middle income than higher-middle income countries. IEG participant surveys in LICUS and other lower income countries are still ongoing. These surveys, however, should shed light on the relative effectiveness of WBI learning activities in these least developed economies. Regression Results: Activity Use 61. Activity use is mainly determined by the activity's effectiveness as perceived by participants. This supports previous findings (Quizon and Chard, 2003) that the activity must be effective first in order for the knowledge and skills acquired at the learning event to be applied. Table 4 presents the 2SLS results for three models of activity use. Again these Table 4 second-stage regressions are among the most robust estimates from among several regressions tested. The three models of activity use correspond to the three 14Ratings of effectiveness increased by 5% for attending with eight colleagues; 4% for six colleagues; 2% for four colleagues; and 1% for two colleagues. 15Brazil might not really be an exception because much of the WBI CE efforts there are focused in its poorest regions (e.g., Fortaleza). 17 composite measures for this variable described earlier: (a) academic use (raising awareness, teaching and conducting research), (b) operational use (developing and implementing country and work strategies, influencing regulation and legislation, and organizing collective community initiatives) and (c) overall use (the average of all use items). Table 4: Use as a Function of Activity and Participant Characteristics (OLS Coefficients) DV= Use DV= Academic Use DV= Operational Use Variable Estimate Estimate Estimate Participant variables Predicted value of effectiveness .877*** .832*** .836*** Position level .013** .095 .159** Technical terminology .052 .106* -.005 Activity Objectives Awareness/ K&S .045 .012 .034 Training of trainers -.027 .002 -.042 Networking .006 -.067 .048 Developing rules/ regulations .071 .073 .065 Twinning with local Institutes -.027 -.003 -.024 Formulating new policies .101 .037 .101 Monitoring policies -.001 .011 .036 Activity Delivery Pay Fee .004 .010 .011 Deliver with Partner -.052 -.091 -.133 English as language of instruction .111** .086* .064 Part of series of events .072* .049 .090** Activity Follow-up Electronic .011* .119** .088 Mail/fax -.151 -.100 -.142 Face to face -.150 -.120 -.165 Request by phone .087 .063 .056 Country Brazil .013 .026 .028 Egypt -.009 .009 .011 Thailand -.047 -.043 -.002 Sri Lanka .101* .110* .101* constant -.257* -.131 -.260 N=405 N=405 N=405 ***=p<.01,**=p<.05, *=p<.10 R2=.25 R2=.20 R2=.22 R2=.17 R2=.24 R2=.20 62. Participant Characteristics. Several participant characteristics were associated with higher use. The participant's position level is significant in predicting overall use and operational use, but not academic use. This suggests that participants in higher level positions appear to apply more of what they learn from WBI. 
But this might be more because middle-level WBI graduates are less likely to implement new ideas without the agreement of their superiors. This latter contention is supported by results from several 18 focus group discussions (FGDs) with participants holding middle-level positions, particularly in government. 63. Participants' familiarity with technical terminology used in the learning event was an important factor explaining higher ratings of academic use but not for operational use or overall use. The more technically sophisticated participants are, the better able are they to apply what they learn towards raising awareness of others , teaching, and academic research. Indeed, familiarity with technical terminology is less important for customary operational work. 64. Delivery Features. An activity delivery feature with significant impact is the language of instruction of the learning event. Holding the activity in English significantly increases the reported academic and overall use of the event, but not its operational use. Participants also use more of what they acquire from the learning event when the activity is part of a series of events. This is true for overall and operational use, but not for academic use. Perhaps the reason for this distinction with academic use is because it is not necessary for courses to be part of series for it to have a significant impact on scholarly activities. For instance, a course on econometric modeling can be applied immediately without a series of courses on the subject. 65. Follow-up. Activity follow-up variables indicate that electronic follow-up is the most effective means for raising the use of a learning activity. Contacting participants via electronic means significantly increases both participants' ratings of his/her overall use and academic use. This is because in developing countries, there is generally easier access to emails, websites and similar electronic media in academic circles particularly universities and research institutes. Electronic follow-up is an efficient and cost effective tool to keep in contact with participants after the activity. 66. Objectives. The most striking results of this Table 4 model are the null findings for activity objectives. Our tests show that activity objectives do not influence participants' use of what they acquire from WBI leaning events. The broad (and often lofty) goals that task managers set for their activities do not significantly determine participants' actual use of whatever they gained from the WBI learning event. In other words, activity objectives as defined by WBI TTLs are not significant determinants of participants' actual use of what they acquired from the learning event. Certain activity design and delivery features are significantly more vital in raising the use of a learning event. 19 SUMMARY AND LESSONS FOR WBI 67. This study establishes some baseline measures in five countries, from which future country-focused impact evaluations may be anchored, particularly with respect to priorities aligned to the country CAS strategy, World Bank corporate priorities and the Millennium Development Goals (MDGs). This analysis also provides guidance for improving the overall effectiveness and utility of WBI activities. 68. In FY01-02, activities were focused on improving individual participant's knowledge and skills, awareness of development issues and networking. And participants rated the activities positively for achieving these objectives. 
These earlier WBI activities were not viewed as particularly effective in helping participants design or implement strategies for improving their own organization or their countries. And they were not designed with these objectives in mind. 69. Nevertheless, the effectiveness of WBI activities ranged substantially across the five countries. Participants from Brazil, Egypt and Sri Lanka perceived WBI activities as 10 percent more effective in comparison with participants from Thailand and Russia. We were unable to explain these differences in perceived effectiveness. 70. Two design features of effective activities are consistent with features of a "country focused" approach to design and delivery: partnership and a focus on country needs. Three delivery features of effective activities also support the concept of sustained capacity enhancement: the use of action plans, delivery as part of a series of activities, and delivery to groups of participants from the same institution. 71. The effectiveness of the activity, as perceived nine to 31 months later, is strongly related to reports of its academic or operational use. These results are consistent with previous studies that stress the importance of improving the effectiveness of activities as the most important means of obtaining use on the ground. 72. In addition, activity use in these five countries was facilitated by the participant's position (seniority level in one's workplace environment), particularly for operational use. Language facility of participants and language of instruction were related to academic use. And sustained delivery of the activity combined with activity follow-up boosted use by about 8 percent overall and by about 12 percent for academic use. 73. Current WBI strategy stresses the importance of many of the above features of effective activities. Greater attention to activity effectiveness through partnership, country-focused activity designs, use of action plans and targeting activities to teams of participants from the same organization should improve effectiveness generally. In addition, sustained engagement in the country through repeated, series of activities and follow up should increase the degree to which the content of the WBI activity is applied on the ground. 20 74. Finally, we note that the archival data in the CRS is less complete than it should be, with systems and procedures in place that do not enable continuous updating of participant contact information. For programs to follow-up with their participants, it is essential that these data be available, and to ensure business continuity, it is important that the data reside in a central repository. 21 ANNEX A: WBI FOCUS AND PRIORITY COUNTRIES Focus Countries Africa EAP ECA LAC MENA SAR Burkina Faso Laos Russia Brazil Egypt Afghanistan Nigeria Thailand Tajikistan/Kyrgyz Guatemala Yemen Sri Lanka Priority Countries Africa EAP ECA LAC MENA SAR Chad China Turkey Mexico Iran Bangladesh Ethiopia Indonesia Ukraine Bolivia Morocco India Kenya Vietnam Serbia Iraq* Madagascar Bosnia Senegal Tanzania Ghana * A June 2, 2004 email from Michael Sarris to Ruben Lamdany and WBI refers to Iraq as "unofficially the 31st priority country." 22 ANNEX B: IMPACT ASSESSMENT OF WBI ACTIVITIES Questionnaire for Activity TTLs ACTIVITY FEATURES 1. What was the delivery mode? (check one) · F2F · DL · Both (F2F and DL) 2. Did the participants develop an action plan (e.g. work plans, strategy papers, policy documents, assessment of country needs) in the course? · Yes · No 3. 
3. Were any participants charged a course fee?
· Yes
· No

4. What proportion of participants paid or were funded by their employer (i.e., not through the Bank or the Bank's partners in delivering the activity)? (please choose one)
· 0%
· 1-10%
· 11-20%
· 21-30%
· 31-40%
· 41-50%
· 51-60%
· 61-70%
· 71-80%
· 81-90%
· 91-100%

5. What was the language of instruction? (check one)
· English
· English with simultaneous translation to local language
· Local language
· Other

6. Was this activity part of a series of linked WBI events aiming to address the same issue in this country or region?
· Yes
· No

ACTIVITY OBJECTIVES

First, we would like to ask you about the objectives of your course.

7. Thinking about your course in terms of the objectives listed below, how would you rank your course on the following list of objectives? (Scale: 1 = not an objective, 7 = primary course objective. Please fill out your answer for each item.)

I. Human development
a. raising participants' awareness of key development issues
b. enhancing participants' knowledge and skills towards addressing these issues
c. training participants to share and disseminate new knowledge (e.g., "training of trainers")
d. helping participants to acquire additional knowledge through networking with people interested in the subject matter of the learning event

II. Institutional development at the local, regional, or national levels:
a. developing/improving rules and regulations
b. improving enforcement mechanisms (making sure new policies are followed)
c. improving participants' own organizational structures or the effectiveness of their work unit/organization
d. improving relations with partner local institutions through delivering learning events together
e. "twinning" or establishing long-term relationships with local institutes

III. Policy development at the local, regional, or national levels: provide participants with strategies and approaches towards
a. formulating new policies
b. building consensus towards common action
c. implementing new policies or undertaking common action
d. monitoring new policies or changes

ACTIVITY DESIGN

8. Thinking back to when you designed the activity, on a scale from 1 to 7, where 1 is not important to the design and 7 is extremely important to the design, please rate the extent to which the activity was designed: (Please fill out your answer for each item.)
a. as a response to specific requests from Bank Operations in support of their work (as defined in CASs, PRSP/Cs, AAAs, etc.)?
b. as a response to (a) specific government request(s)?
c. as a product of a more formal needs assessment done by WBI or Bank Operations?
d. in conjunction with a local partner(s)?
e. in conjunction with international partners or donors?
f. in conjunction with Bank regional/country staff?
g. particularly for the identified needs of a specific audience?
h. specifically for participants from one particular country?
i. specifically for participants from one particular region?

LEARNING ACTIVITY DELIVERY

Now, we would like to ask you questions focusing on the actual delivery of training.
9. Thinking back to when you delivered the activity, on a scale from 1 to 7, where 1 is not important to the delivery and 7 is extremely important to the delivery, please rate the extent to which the activity was delivered: (Please fill out your answer for each item.)
a. with the participation of country- or region-based partners or experts?
b. with the Bank country (field office) or regional staff's involvement in participant selection or course delivery?

10. Was WBI uniquely positioned to deliver this learning program?
· Yes
· No
· Don't Know

a) If yes, is this because: (check all that apply)
· WBI provides the international expertise that participants could not obtain otherwise?
· No other organizations are able to fund this learning program?
· Other

FOLLOW-UP ACTIVITIES

We would like to ask you about events since the activity delivery.

11. Since the delivery of your activity, which of the following have contacted participants about the course topic in some follow-up activity? (Yes / No / Don't Know for each)
a. WBI
b. WBI's partner
c. WB country team
d. Other participants

12. In your opinion, who is responsible for follow-up activities for this learning event? (check all that apply)
· Bank Operations
· WBI
· WBI's partner
· WBI country team
· Other participants
· Other, specify:
· No one

13. Were participants provided with contact information of other participants (e.g., e-mail addresses, phone)?
· Yes
· No

14. There are a number of methods for contacting or providing participants with opportunities to continue their learning after the activity ended. Please indicate the frequency with which you or your partner have used these to follow up with participants after the activity (1 = not at all, 7 = frequently):
a. Electronic newsletters
b. Mailed/faxed newsletters
c. Electronic discussion forum (e.g., listserv)
d. Face-to-face discussions
e. E-mail announcements and/or updates on the course topic
f. Mailed or faxed announcements or updates on the course topic
g. E-mail responses to participants' specific requests
h. Mailed or faxed responses to participants' specific requests
i. Telephone responses to participants' specific requests
j. Subsequent learning events
k. Other
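The questionnaire items above feed directly into the activity-level variables summarized in Annex C. As a purely illustrative sketch of how such responses might be coded for analysis, the snippet below converts a few yes/no and 1-7 items into numeric variables and reports their means and standard deviations; the record layout and field names are hypothetical and do not reflect WBI's actual data systems.

import pandas as pd

# Hypothetical TTL survey records; values and field names are illustrative only.
raw = pd.DataFrame([
    {"activity_id": "A001", "action_plan": "Yes", "language": "English",
     "part_of_series": "Yes", "design_local_partner": 6},
    {"activity_id": "A002", "action_plan": "No", "language": "Local language",
     "part_of_series": "No", "design_local_partner": 3},
])

coded = pd.DataFrame({
    "activity_id": raw["activity_id"],
    # Yes/No items become 0/1 indicators (cf. Annex C, "1=yes, 0=no").
    "action_plan": (raw["action_plan"] == "Yes").astype(int),
    "part_of_series": (raw["part_of_series"] == "Yes").astype(int),
    "english": (raw["language"] == "English").astype(int),
    # 1-7 importance ratings are kept as reported (cf. question 8d).
    "design_local_partner": raw["design_local_partner"],
})

print(coded.describe().loc[["mean", "std"]])  # means and SDs, as in Annex C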
ANNEX C: VARIABLE LIST, DEFINITION, SOURCE, MEANS AND STANDARD DEVIATIONS

Entries report means (standard deviations) for three groups: (1) activities from the TM survey, N=52; (2) matched participant respondents, N=405; (3) all participant respondents, N=793. N/A indicates that the variable is not available for that group.

Activity Objectives (Source: TM Survey; 1 = not an objective, 7 = primary objective)
Individual CE:
Awareness: 5.84 (1.82); 5.85 (1.57); N/A
K&S increase: 6.27 (1.22); 6.40 (0.95); N/A
Training of trainers: 4.25 (2.02); 4.75 (1.93); N/A
Networking: 5.71 (1.54); 5.93 (1.36); N/A
Institutional CE:
Developing rules/regulations: 4.17 (2.57); 3.94 (2.47); N/A
Improve enforcement mechanisms: 3.25 (2.13); 3.24 (1.91); N/A
Improve organizational structure: 3.90 (2.33); 3.97 (2.14); N/A
Improve relations with partners: 3.78 (2.39); 3.93 (2.93); N/A
Twinning with local institutes: 3.04 (2.41); 3.24 (2.39); N/A
Policy CE:
Formulating new policies: 5.36 (2.06); 5.09 (2.12); N/A
Build consensus toward action: 4.76 (2.28); 4.87 (2.12); N/A
Implement new policy: 4.44 (2.02); 4.33 (2.02); N/A
Monitoring policies: 3.85 (2.36); 3.68 (2.08); N/A

Activity Design (Source: TM Survey; 1 = not important to design, 7 = extremely important to design)
Request by:
Bank Operations request: 5.04 (2.10); 5.13 (1.88); N/A
Government request: 3.80 (2.31); 4.04 (2.14); N/A
WBI request: 4.09 (2.24); 4.65 (2.09); N/A
In conjunction with:
Local partner: 4.57 (2.70); 5.23 (2.28); N/A
International partner: 4.94 (2.57); 4.19 (2.53); N/A
Bank field office staff: 5.31 (1.98); 5.60 (1.78); N/A
Designed for:
Specific audience: 5.77 (1.96); 5.50 (2.14); N/A
Country partners: 2.91 (2.31); 3.90 (2.32); N/A
Regional needs: 5.12 (2.54); 4.66 (2.69); N/A

Activity Delivery (Source: TM Survey)
Pay (1=yes, 0=no): 0.39 (0.49); 0.47 (0.50); N/A
Deliver w/local partner (1 = not important to delivery, 7 = extremely important to delivery): 5.25 (1.49); 6.38 (1.12); N/A
Deliver w/Bank staff (same scale): 3.83 (2.30); 5.09 (2.19); N/A
English (1=yes, 0=no): 0.62 (0.49); 0.51 (0.50); N/A
Part of series (1=yes, 0=no): 0.82 (0.38); 0.77 (0.40); N/A

Activity follow-up, by whom (Source: TM Survey; 1=yes, 0=no)
WBI: 0.58 (0.50); 0.54 (0.50); N/A
WBI's partner: 0.63 (0.49); 0.61 (0.49); N/A
Bank country team: 1 (0); 1 (0); N/A
Participants: 0.37 (0.49); 0.32 (0.47); N/A

Activity follow-up, medium (Source: TM Survey; 1 = not at all, 7 = frequently)
Electronic: 3.63 (2.76); 3.63 (2.54); N/A
Mail/fax: 1.46 (1.06); 1.63 (0.98); N/A
Face to face: 3.49 (2.36); 3.17 (2.30); N/A
Phone: 2.54 (2.17); 2.14 (1.74); N/A

Activity features (Source: Participant Survey or CRS)
Action plan (1=yes, 0=no): 0.61 (0.49); 0.34 (0.47); 0.33 (0.47)
DL* (1=yes, 0=no): 0.35 (0.48); 0.30 (0.46); 0.38 (0.49)
Duration in days* (# of course days): 6.91 (8.04); 6.60 (4.16); 6.68 (5.87)
Total participants* (# of course participants): 67.35 (61.78); 70.12 (64.0); 64.85 (62.3)
Proportion attended (% of course attended): 0.93 (0.14); 0.91 (0.20); 0.91 (0.20)
# colleagues attended (# of colleagues with whom attended): N/A; 3.14 (2.85); 3.32 (3.05)
Participant Characteristics (Source: Participant Survey)
Female (1=yes, 0=no): N/A; 0.42 (0.50); 0.43 (0.50)
Age in years: N/A; 42.9 (11.24); 42.5 (10.8)
Proficiency in technical terminology (1 = not proficient, 7 = highly proficient): N/A; 5.62 (1.46); 5.73 (1.44)
English proficiency (1 = not proficient, 7 = highly proficient): N/A; 5.78 (1.42); 5.86 (1.38)
Position level (0 = junior, 0.5 = medium, 1 = senior): N/A; 0.63 (0.20); 0.63 (0.20)

Job type (Source: Participant Survey; 1=yes, 0=no)
Management: N/A; 0.03 (0.18); 0.03 (0.18)
Policy making: N/A; 0.02 (0.14); 0.02 (0.13)
Research/teaching: N/A; 0.03 (0.18); 0.03 (0.18)
Service provider: N/A; 0.06 (0.24); 0.13 (0.33)

Organization affiliation (Source: Participant Survey; 1=yes, 0=no)
Donor: N/A; 0.06 (0.24); 0.06 (0.25)
Private: N/A; 0.12 (0.33); 0.21 (0.40)
Government: N/A; 0.34 (0.47); 0.29 (0.45)
University: N/A; 0.19 (0.39); 0.19 (0.39)

Country (Source: Participant Survey; 1=yes, 0=no)
Brazil*: 0.19 (0.40); 0.24 (0.43); 0.15 (0.36)
Egypt*: 0.23 (0.43); 0.29 (0.45); 0.22 (0.41)
Thailand*: 0.13 (0.34); 0.14 (0.35); 0.20 (0.40)
Russia*: 0.29 (0.46); 0.20 (0.40); 0.28 (0.45)
Sri Lanka*: 0.15 (0.36); 0.12 (0.33); 0.15 (0.36)

* The data source is CRS.

REFERENCES

Bardini, M., V. Gunnarsson, E. Manjieva, and Y. Narozhnaya. "The Impact of WBI Activities, FY01-02, on Participants from Russia: A Baseline Assessment." WBI Evaluation Studies No. EG04-81, December 2003.

Eckert, W., G. Souza, and V. Gunnarsson. "The Impact of WBI Activities, FY01-02, on Participants from Brazil: A Baseline Assessment." WBI Evaluation Studies No. EG04-82, January 2004.

Khattri, N., P. Bachrach, and T. Jiang. "The Impact of WBI Activities, FY01-02, on Participants from Sri Lanka: A Baseline Assessment." WBI Evaluation Studies No. EG04-73, October 2003.

Khattri, N., J. Quizon, W. Eckert, S. Prom-Jackson, H. Zia, M. W. Meiers, Z. Shi, M. B. Palmisano, R. Novojilov, S. Jha, and D. Nikitin. "Impact Evaluation of WBI Client Programs, FY00-01." WBI Evaluation Studies No. EG03-63, November 2002.

Quizon, J. B., and C. L. Chard. "The Impact of WBI Activities, FY01-02, on Participants from Thailand: A Baseline Assessment." WBI Evaluation Studies No. EG04-77, October 2003.

Zia, H. S., M. K. Al-Sayyid, S. Tawila, and V. Gunnarsson. "The Impact of WBI Activities, FY01-02, on Participants from Egypt: A Baseline Assessment." WBI Evaluation Studies No. EG04-78, October 2004.