Document of The World Bank
Report No: ICR00001263

IMPLEMENTATION COMPLETION AND RESULTS REPORT (IDA-H1190, TF-54730)
ON A GRANT IN THE AMOUNT OF SDR 24.00 MILLION (US$35 MILLION EQUIVALENT)
TO THE ISLAMIC REPUBLIC OF AFGHANISTAN
FOR AN EDUCATION QUALITY IMPROVEMENT PROJECT

September 22, 2009
Human Development Unit
South Asia Region

CURRENCY EQUIVALENTS
(Exchange Rate Effective March 31, 2009)
Currency Unit = Afghani
US$1.00 = 48.50 Afghani

FISCAL YEAR
March 21 - March 20

ABBREVIATIONS AND ACRONYMS
AFMIS    Afghanistan Financial Management Information System
ARCS     Audit Report Compliance System
ARTF     Afghanistan Reconstruction Trust Fund
BESST    Building Education Support Systems for Teachers Program
BRAC     Bangladesh Rural Advancement Committee
CARE     CARE International - Afghanistan Country Programs
CCFO     Counterpart Chief Financial Officer
CDC      Community Development Council
CQ       Consultant Qualifications
DAB      Da Afghanistan Bank
DBER     Development Budget and External Relations
EQUIP    Education Quality Improvement Project
EERDP    Emergency Education Rehabilitation and Development Project
FMR      Financial Management Report
GMU      Grant Management Unit
HR       Human Resources
IBRD     International Bank for Reconstruction and Development
IDA      International Development Association
IIEP     International Institute for Educational Planning
IRA      Islamic Republic of Afghanistan
INSET 1  In-Service Teacher Training (Pedagogy content)
INSET 2  In-Service Teacher Training (Curriculum content)
JICA     Japan International Cooperation Agency
MACA     Mine Action Center for Afghanistan
MOE      Ministry of Education
MOHE     Ministry of Higher Education
MOF      Ministry of Finance
NCB      National Competitive Bidding
NICR     National Implementation Completion Report
NGO      Non-Governmental Organization
NSP      National Solidarity Program
NESP     National Education Strategic Plan
NERAP    National Emergency Rural Roads Program
PCU      Project Coordination Unit
PED      Provincial Education Department
PFM      Public Financial Management System
PRR      Priority Reform and Restructuring
PTA      Parent Teacher Association
QEG      Quality Enhancement Grants for Schools
QCBS     Quality and Cost Based Selection
SBD      Standard Bidding Documents
SA       Special Account
SDA      Special Dollar Account
SDR      Special Drawing Rights
SDU      Special Disbursement Unit
SIP      School Improvement Plan
SMC      School Management Committee
SOE      Statement of Expenditures
TA       Technical Assistance
TED      Teacher Education Department
TEP      Multi-stakeholder Teacher Education Project
TRC      Teacher Resource Center
TSS      Transitional Support Strategy
UNESCO   United Nations Educational, Scientific and Cultural Organization
UNICEF   United Nations Children's Fund
USAID    United States Agency for International Development

Vice President: Isabel M. Guerrero
Country Director: Nicholas J. Krafft
Sector Manager: Amit Dar
Project Team Leader: Joel E. Reyes
ICR Team Leaders: Darlyn Meza / Teresa Campos

Afghanistan Education Quality Improvement Project (EQUIP)

CONTENTS
Data Sheet
A. Basic Information
B. Key Dates
C. Ratings Summary
D. Sector and Theme Codes
E. Bank Staff
F. Results Framework Analysis
G. Ratings of Project Performance in ISRs
H. Restructuring
I. Disbursement Graph
1. Project Context, Development Objectives and Design
2. Key Factors Affecting Implementation and Outcomes
3. Assessment of Outcomes
4. Assessment of Risk to Development Outcome
5. Assessment of Bank and Borrower Performance
6. Lessons Learned
7. Comments on Issues Raised by Borrower/Implementing Agencies/Partners
Annex 1. Project Costs and Financing
Annex 2. Outputs by Component
Annex 3. Economic and Financial Analysis
Annex 4. Bank Lending and Implementation Support/Supervision Processes
Annex 5. Beneficiary Survey Results
Annex 6. Stakeholder Workshop Report and Results
Annex 7. Summary of Borrower's ICR and/or Comments on Draft ICR
Annex 8. Comments of Cofinanciers and Other Partners/Stakeholders
Annex 9. List of Supporting Documents
MAP

A. Basic Information
Country: Afghanistan
Project Name: Education Quality Improvement Program
Project ID: P083964
L/C/TF Number(s): IDA-H1190, TF-54730
ICR Date: 09/23/2009
ICR Type: Core ICR
Lending Instrument: ERL
Borrower: ISLAMIC REPUBLIC OF AFGHANISTAN
Original Total Commitment: USD 35.0M
Disbursed Amount: USD 35.0M
Revised Amount: USD 35.0M
Environmental Category: B
Implementing Agencies: Ministry of Education
Cofinanciers and Other External Partners: ARTF

B. Key Dates
Process            Original Date   Revised / Actual Date(s)
Concept Review:    12/09/2003
Appraisal:         05/03/2004
Approval:          07/29/2004
Effectiveness:     08/05/2004      08/05/2004
Restructuring(s):  -
Mid-term Review:   04/15/2007      04/30/2007
Closing:           03/31/2009      03/31/2009

C. Ratings Summary
C.1 Performance Rating by ICR
Outcomes: Moderately Satisfactory
Risk to Development Outcome: High
Bank Performance: Moderately Satisfactory
Borrower Performance: Moderately Satisfactory

C.2 Detailed Ratings of Bank and Borrower Performance (by ICR)
Bank Ratings:
  Quality at Entry: Satisfactory
  Quality of Supervision: Moderately Satisfactory
  Overall Bank Performance: Moderately Satisfactory
Borrower Ratings:
  Government: Not Applicable
  Implementing Agency/Agencies: Not Applicable
  Overall Borrower Performance: Moderately Satisfactory

C.3 Quality at Entry and Implementation Performance Indicators
Potential Problem Project at any time (Yes/No): No
Problem Project at any time (Yes/No): Yes
DO rating before Closing/Inactive status: Moderately Satisfactory
QAG Assessments (if any): Quality at Entry (QEA): None; Quality of Supervision (QSA): None

D. Sector and Theme Codes
Sector Code (as % of total Bank financing)                 Original   Actual
  Central government administration                        10         10
  Primary education                                        50         65
  Secondary education                                      25         10
  Sub-national government administration                   15         15
Theme Code (as % of total Bank financing)                  Original   Actual
  Conflict prevention and post-conflict reconstruction     17         17
  Decentralization                                         16         16
  Education for all                                        33         33
  Gender                                                   17         17
  Participation and civic engagement                       17         17

E. Bank Staff
Positions               At ICR                At Approval
Vice President:         Isabel M. Guerrero    Praful C. Patel
Country Director:       Nicholas J. Krafft    Alastair J. McKechnie
Sector Manager:         Amit Dar              Michelle Riboud
Project Team Leader:    Joel E. Reyes         Keiko Miwa
ICR Team Leader:        Darlyn Xiomara Meza Lara
ICR Primary Authors:    Joel E. Reyes, Hasib Karimzada, Teresa D. Campos, Veronica Milagros Minaya Lazarte, Mostaeen Jouya

F. Results Framework Analysis

Project Development Objectives (from Project Appraisal Document)
The program aims to improve the quality of educational inputs and processes as the foundation for a long-term strategy to enhance the quality of educational outcomes. This will be achieved through: (a) a focus on schools and communities to strengthen their capacity to better manage teaching-learning activities; (b) investment in human resources (teachers, principals and educational administration personnel) and physical facilities; and (c) institutional development of schools, District Education Departments, Provincial Education Departments and the Ministry of Education. The program also aims to promote education for girls by putting a priority on female teachers and students within each component activity.

Revised Project Development Objectives (as approved by original approving authority): N/A

(a) PDO Indicator(s)

Indicator 1: Student learning outcome: Grade 3 level reading assessment.
  Baseline Value: - (09/01/2004)
  Target Value: Design and apply reading assessment; pilot test of Grade 3 student assessment in Badakhshan, Logar and Parwan provinces (03/31/2009)
  Comments: The student assessment pilot was not implemented and has been rated unsatisfactory. The MOE is committed to implementing it under EQUIP II.

Indicator 2: Utilization of Quality Enhancement Grants.
  Baseline Value: - (06/30/2006)
  Original Target Value: 80% of grants are used for intended purposes (03/31/2009)
  Actual Value Achieved at Completion: 98% (03/31/2009)
  Comments: 3,963 Quality Enhancement Grants have been provided and, based on MOE monitoring data, almost all recipient schools (89%) have implemented them without any major obstacles.

Indicator 3: Enrollment.
  Baseline Value: 4 million (34% girls) (09/01/2004)
  Original Target Value: 7 million (45% girls) (03/31/2009)
  Actual Value Achieved at Completion: 6.11 million (35.7% girls) (03/31/2009)
  Comments: National enrollments increased in Afghanistan during project implementation; however, a direct causal relation to the project cannot be assessed, as existing and new enrollments before and after project interventions were not monitored.

Indicator 4: Number of female teachers.
  Baseline Value: 3,053 (14%) (09/30/2003)
  Original Target Value: 20,000 (03/31/2009)
  Actual Value Achieved at Completion: 48,473 (41%) (03/31/2009)
  Comments: Female teachers nationwide increased in Afghanistan. No direct causal relation between the project and the increase could be assessed, as the only project intervention related to female teachers was the training of 10,448 female teachers already in the system.

Indicator 5: Completed National Education Strategy being implemented.
  Baseline Value: - (11/04/2004)
  Original Target Value: Completed (03/31/2009)
  Actual Value Achieved at Completion: Strategy being implemented as national program in education (03/31/2009)
  Comments: The National Education Strategic Plan was completed and has supported the strategic guidance of education sector investments, as well as donor harmonization and alignment.

Indicator 6: Evaluation of EQUIP.
  Baseline Value: - (11/04/2004)
  Original Target Value: Evaluation of key interventions (03/31/2009)
  Actual Value Achieved at Completion: Not Fully Completed (03/31/2009)
  Comments: EQUIP tracked inputs of its different components; however, an integrated Monitoring and Evaluation system was not completed as expected.

(b) Intermediate Outcome Indicator(s)

Indicator 1: Number of schools having received Quality Enhancement Grants.
  Baseline Value: 1,213 (schools under Emergency Project) (06/30/2006)
  Actual Value Achieved at Completion: 2,735 schools (03/31/2009)
  Comments: Additional resources from the ARTF helped finance a larger number of Quality Enhancement Grants; no formal revision was undertaken under OP 8.50, as the original indicators and values were not defined in approval documents.

Indicator 2: Number of schools having received Infrastructure Development Grants.
  Baseline Value: 1,213 (schools under Emergency Project) (06/30/2006)
  Actual Value Achieved at Completion: 250 additional schools (1,463) (03/31/2009)
  Comments: Additional resources from the ARTF helped finance a larger number of Infrastructure Grants; no formal revision was undertaken under OP 8.50, as the original indicators and values were not defined in approval documents.

Indicator 3: Number of teachers trained in INSET I (in-service teacher training module 1).
  Baseline Value: - (11/04/2004)
  Target Value: 140,000 teachers (03/31/2009)
  Comments: Originally the project expected to train 20,000 teachers in pedagogical skills (INSET I); this target was later increased to 140,000 (all teachers), but the target was not achieved. No original targets were included in approval documents under OP 8.50.

Indicator 4: Number of teachers trained in INSET II (in-service teacher training module 2).
  Baseline Value: - (11/04/2004)
  Target Value: 140,000 (03/31/2009)
  Comments: The project did not include curricular knowledge training (INSET II); this value and target were later included for 140,000 (all teachers), but the target was not achieved. No formal revision was undertaken in OP 8.50-processed approval documents.

Indicator 5: Continued development of the Education Management Information System (EMIS).
  Baseline Value: (11/04/2004)
  Target Value: Functioning EMIS and its gradual use in policy making (12/31/2009)
  Actual Value Achieved at Completion: EMIS under development
  Comments: School surveys were conducted nationwide, and school, student and teacher information is now available for planning, program design and implementation. A fully integrated EMIS system is still needed.

G. Ratings of Project Performance in ISRs
ISR No.   Date Archived   DO                          IP                          Actual Disbursements (USD millions)
1         09/21/2004      Satisfactory                Satisfactory                0.00
2         03/16/2005      Moderately Satisfactory     Moderately Satisfactory     2.60
3         09/13/2005      Moderately Satisfactory     Moderately Satisfactory     2.60
4         03/10/2006      Moderately Unsatisfactory   Moderately Unsatisfactory   2.60
5         06/29/2006      Moderately Satisfactory     Moderately Satisfactory     3.63
6         12/20/2006      Moderately Satisfactory     Moderately Satisfactory     9.39
7         06/11/2007      Satisfactory                Moderately Satisfactory     16.90
8         12/07/2007      Satisfactory                Moderately Satisfactory     26.87
9         06/27/2008      Satisfactory                Moderately Satisfactory     29.24
10        12/19/2008      Moderately Satisfactory     Moderately Satisfactory     34.98
11        06/28/2009      Moderately Satisfactory     Moderately Satisfactory     34.98

H. Restructuring (if any)
Not Applicable

I. Disbursement Profile

1. Project Context, Development Objectives and Design

1.1 Context at Appraisal
In 2004, after 23 years of conflict and two years of reconstruction, Afghanistan was eager to transition from emergency recovery to longer-term nation building. Politically, the country had a transitional national government, a national assembly (the Loya Jirga), and a newly approved constitution. Economically, Afghanistan had a single stable currency; fiscal policy remained conservative; and it had developed a national budgetary system. Strategically, a national development framework (2004-2015) was to guide government investments. These were signs of recovery, despite chronic poverty, still fragile institutions and insurgency threats. In this context, education quality was critical for poverty alleviation and socio-economic development.

Demand for education after the ousting of the Taliban regime increased immediately: in 2002 more than 3 million students enrolled in Grades 1-12, up from less than a million a year before. By 2003, an additional 1.4 million had enrolled, for a total pre-EQUIP enrollment of 4.3 million (34% female and 91% in primary education). However, enrollment disparities across provinces and gender existed; teachers received no training and their salaries were low and not paid on time; and there were no basic quality education inputs--such as curricula, quality textbooks and sufficient school facilities. It was a priority to strengthen the role of schools, communities, and the MOE--the latter shifting from direct service provider to policy formulator, regulating and monitoring service delivery by other providers such as NGOs, the private sector, and provincial and district education departments. The Ministry of Education was committed to providing quality education, and sought broad-based participation, delegated decision making and spending authority, and accountability. Donors supported capacity building, for example through the Priority Reform and Restructuring (PRR) program, which aimed at reforming the overall structure of the MOE and strengthening its fiduciary, planning and technical departments.

The World Bank was an important ally in supporting Afghanistan's transition from recovery assistance to long-term reconstruction and development. The World Bank Transitional Support Strategy (TSS) Framework (2003) consisted of four strategic areas: (a) improving livelihoods; (b) fiscal strategy, institutions and management; (c) governance and public administration reform; and (d) enabling private sector development. EQUIP was prepared consistent with the GoA's and the TSS's objectives of strengthening communities and providing basic social services.
1.2 Original Project Development Objectives (PDO) and Key Indicators
The program aims to improve the quality of educational inputs and processes as a foundation for a long-term strategy to enhance the quality of educational outcomes, through: (a) strengthening schools and communities to better manage teaching-learning activities; (b) investment in human resources (teachers, principals and educational administration personnel) and physical facilities; and (c) institutional development of schools, District Education Departments, Provincial Education Departments and the Ministry of Education. Education for girls and female teacher training was a project priority.

As an Emergency Assistance Operation under World Bank Operational Policy (OP) 8.50, PDO indicators were not included in the original Technical Annex and Development Grant Agreement; however, the following PDO indicators were defined and fine-tuned during implementation:

Indicator              Means of Verification                   Baseline       End-of-Project Target
Student Learning       Grade 3 Reading Assessment              Non-existent   Piloted
Quality Grants Used    Number of Grants Utilized               0              2,000 grants
National Access        Grades 1-12 enrollment                  4 million      7 million
Girls' Access          % of total enrollment                   34% girls      45% girls
Female Teachers        No. and percentage of teaching force    3,053 (14%)    20,000 (no % target)
Education Policy       National Education Strategy             Non-existent   Implemented
Evaluation of EQUIP    Key Interventions                       0              Completed Evaluation

1.3 Revised PDO: N/A

1.4 Main Beneficiaries
EQUIP I was to benefit 10 provinces (later expanded to 26 out of 34 provinces) of Afghanistan. Additional targets were later set: 2,000 schools with quality grants; 250 schools with infrastructure grants; 25,000 teachers trained (target later expanded to 140,000 teachers nationwide); and 3,000 principals trained. The MOE was a direct beneficiary through project support of a medium-term policy framework (NESP), a national development budget, and an education management information system (EMIS).

1.5 Original Components
Component 1: School Grants (US$21 million): School grants to finance teaching and learning inputs to create effective school environments (Component 1.1, School Grants for Quality Enhancement, US$10 million), and to build and rehabilitate basic school facilities (Component 1.2, School Grants for Infrastructure Development, US$11 million). Priority was given to schools seeking to increase girls' enrollments.

Component 2: Support to Schools through Institutional and Human Resource Development (US$11.5 million): Support for training teachers (Component 2.1, Teacher Training, US$7 million) and school principals (Component 2.2, Training of Principals, US$1.5 million) through different modalities, and for grants to provincial and district education departments to review and revise their organizational structures and enhance school support efficiency and effectiveness (Component 2.3, Capacity Building of District and Provincial Education Departments, US$3 million).

Component 3: Policy Development and Monitoring and Evaluation (US$2.5 million): Support to the MOE to: (i) implement a medium-term policy framework; (ii) develop an education management information system (EMIS); and (iii) prepare a sound National Development Budget each year (Component 3.1, Policy Development, US$1 million); and financing of an M&E system for physical, financial, process and post-implementation monitoring, as well as financial audits and outcome evaluations (Component 3.2, Monitoring and Evaluation, US$1.5 million).
1.6 Revised Components: N/A

1.7 Other Significant Changes
As objective indicators and targets were not defined during project preparation (flexibility allowed under OP 8.50), they were later agreed and fine-tuned during implementation (see Table 1, Annex 2). After the Mid-Term Review, project targets were increased from 10 to 26 provinces; from 25,000 teachers to the overall teaching force of 140,000; from 250 schools to more than 800; and from 2,000 quality enhancement grants to 4,000. Additional ARTF financing of US$44 million supported the EQUIP program and permitted increasing the project targets, especially for school quality enhancement grants and infrastructure.

2. Key Factors Affecting Implementation and Outcomes

2.1 Project Preparation, Design and Quality at Entry
EQUIP I incorporated recommendations from the Emergency Education Rehabilitation and Development Project (EERDP) and the World Bank's sector analysis, "Investing in Afghanistan's Future: A Strategy Note on the Education System in Afghanistan" (a commendable piece of in-depth analytical work for the immediate post-Taliban period). Recommendations included supporting community participation and school-based management, working with NGOs, professionalizing the teaching force, delegating school support functions to provinces and districts, and improving education sector data relevance and accuracy. In retrospect, making the following elements explicit in the design could have improved the pertinence of the interventions:

Cultural Setting: EQUIP recommended building schools for girls and training female teachers (which are valid objectives); however, a full understanding of how to increase girls' education in an Islamic--and, until recently, very conservative--society with marked gender divisions was not fully developed (for example, culturally sensitive requirements for girls' school designs, such as boundary walls; female teachers' mobility constraints; and building alliances with religious leaders).

Psycho-Social Needs: Project design was silent on the psycho-social needs of students, teachers and education service providers in a conflict-afflicted context (more than 600 schools were attacked during project implementation).

Targeting and Differentiated Interventions: The highly heterogeneous context of Afghanistan (culturally, ethnically, geographically, and with different levels of education attainment, gender gaps and insecurity) would have benefited from differentiated education delivery interventions, especially for hard-to-reach rural communities vs. urban centers, large student-teacher ratios vs. sparsely populated districts, and unsecure vs. secure provinces.

Assessment of Risks: Three risks were included in the Memorandum of the President: (i) lack of security, (ii) delays in disbursing school grants, and (iii) weak public financial management (PFM) systems. Risk mitigation was not effective, as implementation data show a correlation between unsecure provinces and lagging project progress (see Annex 3), long delays in the grant cycles (mostly in the liquidation process), and on-going complex PFM procedures at the provincial level.
Other institutional risks, which materialized later during project implementation, were not identified: (i) lack of coordination and integrated planning for financial management and procurement, and among MOE departments implementing the project; (ii) difficulty in attracting experienced technical assistance to Afghanistan and unsustainably high fees; and (iii) lack of a full set of project management tools to support the transition from emergency reconstruction to longer-term institution building for social services delivery.

2.2 Implementation
In 2005, the implementation of EQUIP I was minimal due to frequent MOE leadership changes and a lack of qualified project staff. In 2006-2008, a more stable MOE leadership and proactive management accelerated the transfer of grants to communities, school infrastructure, and the contracting of technical assistance (TA) for the central MOE and PED levels. The teacher training component also started, but focused mainly on Kabul city and, to a lesser extent, on six other provinces. Another important gain was the development of the National Education Strategic Plan (NESP) and the 2006 school survey, which improved the existing education sector information and was the basis for an Education Management Information System (EMIS).

However, the welcome acceleration of project implementation also generated some worrisome externalities. There was no evidence of strategic and capacity-building planning by most MOE departments requesting large amounts of TA. Consultant salaries increased rapidly and became unsustainable. A Bank review in 2009 found that the MOE and EQUIP paid, on average, some of the highest TA salaries in the GoA (see Component 3.1, Annex 2). Also, evidence suggests that the acceleration of investments was "input-led" (grants, infrastructure and consultants), without integrated annual planning tools linking procurement activities to expected annual results indicators and targets, to systematic planning of activities and implementation time, and to approved Ministry of Finance (MOF) budgets. As a result, there were financial over-commitments in some components (e.g., infrastructure grants), while others were left with limited budgets (e.g., teacher and principal training). These outcomes could have been avoided with simple integrated management tools to track overall project investment against annual targets for all components (not just grants and infrastructure).

Mid-Term Review (MTR). In 2007, the Bank's MTR identified the achievements of school grants, infrastructure, and community mobilization. A satisfactory rating was given to the advances on the NESP, the EMIS and M&E. It was also noted that teacher training was mostly provided in Kabul city, and that principal training was not implemented. However, the MTR did not examine other EQUIP-financed MOE technical assistance, and the stated M&E progress, in retrospect, was not in line with the standards of a functional system for physical, financial, process, and outcome monitoring and evaluation. Nonetheless, it was decided to scale up the project nationwide, without a detailed assessment of capacity, budgets and equity criteria.

2.3 Monitoring and Evaluation (M&E) Design, Implementation and Utilization
M&E Implementation.
A number of critical M&E activities were to be carried out under the project to provide timely and adequate information to stakeholders on implementation, performance, process, outputs, outcomes and lessons learned on school improvement grants, rehabilitation and construction of schools, teacher in-service training, and policy development. The project design called for a start-up baseline survey and annual updates, with a follow-up survey at the end of year two. M&E was also to support the MOE in assessing learning achievement and piloting a student assessment mechanism. However, many of the above M&E activities were not carried out.

M&E Utilization. To strengthen project monitoring, TA was provided to the PCU to monitor the allocation of grants, infrastructure and consultants. However, no integrated and automated M&E system was developed to guide the process in an institutionalized manner. As a result, the project suffered from: (i) lack of information to evaluate outcome indicators; (ii) insufficient credible and integrated information to monitor and evaluate access to quality education and teacher training activities; (iii) limited data integration and exchange within EQUIP and within the MOE; and (iv) limited reporting, dissemination and sharing of information. (See Annex 2, sub-component 3.2, for further details.)

2.4 Safeguard and Fiduciary Compliance
Specific social, environmental and fiduciary safeguards were not defined ex ante to project approval, as EQUIP was prepared under OP 8.50. It was agreed that the Environmental and Social Safeguard Framework for emergency projects in Afghanistan would be applied. Implementation performance is as follows:

Social Benefits. Project social objectives included increasing access to education services for girls and closing the education access gap across provinces. However, documented evidence (see Annex 3) shows that approximately 30% of completed schools that received infrastructure grants did not comply with the gender equity criteria. Hard-to-access and conflict-afflicted provinces also experienced more implementation obstacles.

Environment. The project's environmental category "B" provided for construction guidelines and the development of 84 different school designs to address size, local terrain and weather differences. However, culturally sensitive girls' school designs (with boundary walls, or separate toilets for boys and girls in mixed schools) were not considered. School construction delays were also caused by the absence of mechanisms to facilitate land ownership, transfer, or settlement of disputes. Even when School Management Committees (SMCs) secured land donations, land titles could not be transferred.

Procurement. Low initial procurement planning and implementation capacity resulted in delays in designs, bidding documents, and bid evaluations. Procurement management, initially supported by an NGO, was later mainstreamed under the MOE procurement department (with the support of two international procurement consultants). Training was provided by Bank staff, and the quality of procurement management improved. The Bank procurement post review (PPR) in FY08 identified deficiencies in the procurement process for civil works and consultants, which had improved by the FY09 PPR. Goods contracts were not presented on time by the MOE for review. For security reasons, Bank staff could not make regular site supervision visits to verify assets.
Considering the emergency situation, the volatile environment, the security conditions and the capacity of the grantee, project procurement is rated moderately satisfactory.

Financial Management (FM). After three decades of conflict, the GoA is facing a daunting challenge to rebuild its public administration to effectively manage and utilize public resources, including the large inflow of development aid. Since the Bank Group's re-engagement in Afghanistan, the Bank has supported the increased capacity and accountability of the State for provision of basic services, including the establishment of the basic legal framework underlying the public financial management (PFM) and public procurement systems. Under the direction of the Ministry of Finance (MoF), significant progress has been made. The government-wide PFM systems are used for all IDA and ARTF projects. Project accounting, payments, financial statements and responsibility for submission of audited financial statements are centralized within the MoF. In 2005 and 2007, the Public Expenditure and Financial Accountability (PEFA) Public Financial Management performance measurement framework found the PFM system's accounting, recording and reporting to be better than the average of low-income countries.

Records of the project's funds receipts and disbursements from inception until closure were maintained at the Ministry of Finance, Special Disbursement Unit (SDU). The MOE maintained records of the project's disbursements. Financial transaction records under EQUIP, as for other emergency projects in Afghanistan, were maintained manually at the SDU from inception until March 2005, when the SDU started using the Afghanistan Financial Management Information System (AFMIS). Supporting documents were maintained satisfactorily for all payments, except for some transactions, and these were reported in the Bank's FM supervision mission reports (see disbursement section below). Funds flow to all implementing agencies was controlled centrally by the SDU; assessments have shown that it was appropriately managed. Financial reporting was poor initially: monthly and quarterly financial reports--including the mandatory financial monitoring reports to be submitted quarterly to the Bank--were not prepared regularly and did not contain all required information (for example, expenditures by components and sub-components). However, financial reporting improved over time. Annual audited financial statements were submitted regularly, though after the deadline. The audit opinion of the Control and Audit Office of Afghanistan was unqualified (clean) for SY1383, qualified (unclean) for SY1384 and SY1385, and unqualified (clean) for SY1386. Given that there were no financial management systems in place when the project became effective, one could conclude that FM capacity has been built steadily over the life of the project. The FM rating is moderately satisfactory.

Disbursement. The estimated project cost at appraisal was USD 35.0 million, with an actual total cost of USD 79.0 million: ARTF grant TF54730 (which closes on March 31, 2010) provided USD 44.0 million for the project, mostly financing additional quality enhancement and infrastructure grants (the graph in Section I presents combined IDA and ARTF actual disbursements to date). About 99% of the ARTF grant amount had been disbursed as of September 2009. As of March 2009, IDA grant H1190 was 100% disbursed, with a grace period of six months to liquidate its accounts.
The January 2009 Bank FM review identified unresolved issues to be accounted for: (i) ineligible expenditures of US$1,798, and (ii) unsubstantiated expenditures of US$3,043,050. The last financial management review--which includes a transaction review of the project's expenditures up to the closing date (March 31, 2009)--is currently being conducted.

Project Management Capacity. A Project Implementation Manual (PIM) and a Grant Management Unit (GMU) supported procurement, disbursement and financial management procedures. Later, as part of the MOE's institutionalization process, project fiduciary management was mainstreamed to the MOE's respective line departments. A project coordination unit (PCU) was created to supervise overall project implementation. MOE staff received basic FM and procurement training, but hands-on technical assistance would have been helpful to ensure efficient and timely project implementation. Although great efforts were made to ensure that financial and procurement records were kept in place for post-review and management purposes, some basic management tools--such as strategic and annual implementation plans (linking disbursements to result targets and subcomponents)--were not in place for most of the EQUIP I implementation period. Most World Bank procurement and financial management guidelines were provided only in English and not in the local languages (Dari and Pashtu).

2.5 Post-completion Operation/Next Phase
On January 31, 2008, IDA approved a grant (H354) of SDR 18.9 million (US$30 million equivalent) for the Second Education Quality Improvement Project (EQUIP II). On April 14, 2009, an ARTF grant of US$35 million was also signed. EQUIP II includes the same components as EQUIP I, as well as a scholarship program for prospective female teachers to complete their secondary and teacher training college education. EQUIP II will implement the teacher and principal in-service training and the monitoring and evaluation system not completed within EQUIP I. The GoA has also requested World Bank support for the design of a new education sector operation. On-going and future operations need to strongly promote and support MOE education management capacity through results-based management; further devolution of management functions to schools, communities, PEDs and DEDs; and the closing of education gaps across provinces and population groups (e.g., gender disparities).

3. Assessment of Outcomes

3.1 Relevance of Objectives, Design and Implementation
The relevance of the objectives and design of EQUIP is rated substantial, as the project's proposed focus on community participation, school inputs (grants and infrastructure), improved human resources, policy direction, and monitoring and evaluation was in line with the development priorities of the GoA and the MOE at the time of design and today. The objective--to support inputs and processes as a preamble to (rather than to support fully) a medium-term quality education strategy--was also relevant given the still fragile education institutions and unsatisfied basic school needs in Afghanistan. The implementation lessons learned on psycho-social needs (especially due to increased insurgency and conflict in certain areas of Afghanistan) and on differentiated education delivery mechanisms (given the country's heterogeneity) are already being considered by the MOE.
Today, the GoA, the MOE and education donors maintain their commitment to increase access to quality education, recognizing the positive foundations of community support, school-based management and well-targeted support for teachers, all under a clear national education development policy. EQUIP objectives and contributions continue to be in line with the education development priorities and circumstances prevailing in Afghanistan.

However, the relevance of implementation is rated as modest. The implementation of EQUIP had mixed results. Positive project results were strengthening community participation, providing incentives for school-based management, and supporting the development of the NESP and school surveys for the EMIS. These positive outcomes have given the Afghanistan education sector an important foundation for on-going reform. Weak results are related to project management capacity, internal MOE coordination, and decentralization of functions and decision making to the provincial and district levels. Thus, the present MOE administration is making efforts to improve organizational cohesiveness, integrated planning and results monitoring, TA rationalization and civil servant capacity building, and the institutionalization of education management led by the MOE, PEDs, DEDs, schools and communities.

3.2 Achievement of Project Development Objectives (output details are reflected in Annex 2)
The overall efficacy of the project is rated moderate, combining: (i) substantial improvement in community participation, school-based management and use of quality grants and infrastructure, as well as the development of a medium-term policy framework to guide education investments (NESP); and (ii) modest gains in setting processes with minimum standards to guide education management, implementation, accountability, and monitoring and evaluation of results.

PDO 1: Improved Student Learning Outcome (Grade 3 level reading assessment): This PDO is rated Unsatisfactory. The student assessment framework and tools required for monitoring causal linkages between project investments and learning were not developed. The MOE did not pilot the Grade 3 student reading assessment in Badakhshan, Logar and Parwan provinces--even though another program (PACE-A, managed by a consortium of NGOs) had developed a simple reading and numeracy assessment and was willing to share the experience with the MOE for scale-up. This major shortfall needs to be corrected as soon as possible.

PDO 2: Utilization of Quality Enhancement Grants: This PDO is rated Satisfactory. The MOE has reported almost 4,000 grants (US$12 million) disbursed to schools to support their school improvement plans (SIPs). Laboratories, libraries and computer equipment have been the most commonly selected school-based investments to improve teaching and learning quality. Communities have improved their organization and decision making and have provided additional in-kind and financial school support.

PDO 3: Increased Grade 1-12 Enrollments and % of Girl Students: This PDO is rated Satisfactory. During the period of EQUIP implementation, 2005-2008, the MOE reports a 25% increase in nationwide school access, or 1,217,738 new students, and a 29% increase in female enrollment, an additional 499,459 new female students (see subcomponent 1.2, Annex 2). However, it is not possible to establish a direct causal linkage between national school access and EQUIP-financed infrastructure, since new vs. existing enrollments in project communities were not tracked.
At the provincial level, there is an overall positive degree of association between EQUIP-supported provinces and the national enrollment increments and improved gender ratio (see Figure 1.7, Annex 3).

PDO 4: Number and Percentage of Female Teachers: This PDO cannot be rated. Although the number of female teachers increased from 34,108 in 2004/05 to 48,473 in 2008/09, a 30% increase (see Table 3, Annex 2), no direct project intervention could be identified as causally linked to the recorded national increase. EQUIP financed only some in-service teacher training for 10,448 female teachers, most of whom were already in the system. Other MOE policies and programs have contributed to this PDO.

PDO 5: Completed and Implemented National Education Strategy: This PDO is rated Satisfactory. EQUIP provided TA funds to the MOE planning department, in charge of preparing the National Education Strategic Plan (NESP) 2006-2010. The NESP has supported policy development guidance and donor harmonization and alignment. UNESCO, IIEP (with funding from Norway), USAID, DANIDA and other donors supported different phases and specialized areas of the preparation of the strategic plan.

PDO 6: Evaluation of EQUIP: This PDO is rated Moderately Unsatisfactory. EQUIP developed neither a proper evaluation design nor the monitoring and evaluation system required to achieve this institutional objective. The MOE did hire monitoring staff to collect basic data on program inputs (quality grants, infrastructure, and the title and number of consultants) financed by EQUIP. See Annex 2, sub-component 3.2, for further details.

3.3 Efficiency
Given data limitations and the focus of the project on inputs and processes (rather than on medium-term outcomes), no cost-benefit analysis was conducted. However, equity targeting was assessed, looking at how well EQUIP maintained its focus on increasing female access. The data show that some EQUIP-benefited schools did not report girls' enrollments. Also, the preliminary assessment suggests that there were more execution bottlenecks during the second phase of EQUIP (when management was institutionalized within the MOE, after the pilot managed by international NGOs) and also in unsecure provinces (see Annex 3).

3.4 Justification of Overall Outcome Rating: Moderately Satisfactory
EQUIP I had mixed results. Project design relevance was significant, as key basic inputs and some institutional processes were targeted. Nonetheless, implementation experienced moderate to significant shortcomings, which impacted the project development objectives. Grants and community-based school management were satisfactory, with only minor shortcomings related to delays in the grant disbursement cycle and central over-commitments in infrastructure planning. The development and use of the NESP to guide education investments was satisfactory, but the monitoring and evaluation system experienced more significant shortcomings when compared to original project design expectations. Teacher training was limited and principal training was not implemented. Shortcomings related to institutional processes (such as results-based management, integration of MOE departments and staff, accountability, and basic project management tools) also impacted project efficiency and contributed to lower ratings.

3.5 Overarching Themes, Other Outcomes and Impacts
Poverty Impacts, Gender Aspects, and Social Development. EQUIP sought to provide targeted financing to schools seeking increased girls' enrollment.
While in general EQUIP invested in provinces where the MOE's EMIS reported that female enrollment increased, some provinces seem not to have maintained the gender targeting (see Annex 3). As a national program, EQUIP supported provinces with high poverty and lower social development. The community participation strategies, in particular, allowed EQUIP to deliver some education services in unsecure or highly rural provinces, albeit with more management and monitoring difficulties.

Institutional Change/Strengthening. EQUIP contributed to the establishment of important structures for education quality in Afghanistan by setting up and supporting entities at the sub-national (e.g., PEDs and DEDs) and community levels (PTAs and SMCs). A National Education Strategic Plan (NESP) was prepared and a new 5-year plan is underway; a primary education curriculum was completed; and the school grants model has responded to some unsatisfied basic school needs. These institutional changes, especially the community participation and school-based management strategies, are a key contribution to the organizational setting of the education sector of Afghanistan.

Some institutional needs are still to be addressed. Organizationally, the MOE is fragmented between a large number of staff with low skills and low salaries and another group of consultants and advisors, whose high salaries have become unsustainable. Also, although the government-wide PFM systems are functioning, the roll-out of PFM reforms and capacity building to the line ministries and to the provincial arms of the Treasury has been slow, which affects education investment programs such as EQUIP. Finally, procurement of inputs (infrastructure, goods and consultants) has not been fully grounded in strategic planning and results monitoring each fiscal year. Although TA was provided during project implementation, systematic management and administration strengthening is still needed for sustainable MOE institutional capacity. As a program aiming to support the transition from emergency delivery of education services to longer-term development, EQUIP's implementation lessons have created awareness of the need for: (i) stricter guidelines for results-based strategic and operational planning; (ii) integrated work teams among MOE departments; and (iii) explicit management tools to connect inputs and procurement to annual targets of sector objectives and available financing--all with timely monitoring to correct specific and systemic bottlenecks.

Other Unintended Outcomes and Impacts. Donor harmonization and alignment with regard to some project-supported activities (e.g., community participation and school-based management) was a positive unexpected impact. In addition, while the in-service training system was not implemented, EQUIP provided TA for curricular reform and the design of textbooks as a needed foundation for a quality teacher training system. Negative unintended impacts included the MOE's rapid increase in consultant fees (among the highest in the GoA) and limited opportunity for permanent MOE staff to increase their project management capacity, as the project was managed mostly by contracted advisors and consultants.

3.6 Summary of Findings of Beneficiary Survey and/or Stakeholder Workshops
Annexes 5 and 6, respectively, detail the results of a beneficiary assessment and three workshops.
The M&E beneficiary assessment survey was conducted in two provinces of Afghanistan and collected the perceptions of parents, students, teachers, community leaders and PED and DED staff on community participation, grants, infrastructure and other local needs. The M&E Workshop identified that a fully functional M&E system for the MOE and EQUIP was not in place, but important building blocks were underway: (i) fine-tuned indicators and targets within the updated NESP for 2010-2014; (ii) some indicators collected by EQUIP on school-based management and community participation (although some data was not channeled to the MOE); and (iii) the availability of qualified staff with prior experience in the Ministry of Communications, who are supporting the MOE in defining a web-based integrated EMIS system. The Social Mobilization Workshop identified strengths and challenges of community participation, and the need for institutionalization. The salient points of the MOE Integration for Education Quality Workshop, with MOE directors and advisors, were the need for: (i) an MOE shared concept of education quality; (ii) innovative and flexible models of teacher formation and training; and (iii) further development of the community participation strategies to support education quality. It was recommended to better understand how children can learn in conflict-afflicted contexts, with special attention to their psycho-social needs. The MOE conducted its own stakeholder workshop to validate the national ICR (NICR) report (see Annex 7) and provided a systemic review of the strengths, weaknesses, challenges and opportunities of EQUIP.

4. Assessment of Risk to Development Outcome
Rating: High. The same three risks identified at the onset of the project--insecurity, grant disbursement delays, and weak financial management execution, especially at the sub-national level--are still high today. Other risks that remain high and can affect the sustainability of project results are the limited organizational integration of the MOE (between staff and consultants, and between different line departments); limited differentiated strategies to provide education services in a highly heterogeneous country context; and weak planning and monitoring. The institutional risks are the major obstacle to EQUIP's providing a foundation for long-term education quality reform; the MOE is committed to addressing them.

5. Assessment of Bank and Borrower Performance

5.1 Bank Performance
Bank Performance in Ensuring Quality at Entry
Rating: Satisfactory
The Bank supported the MOE in tapping the energy and enthusiasm of community participation, which yielded the project's main results and satisfied basic school needs (infrastructure, classroom materials, laboratories, etc.). The sector analysis "Investing in Afghanistan's Future" provided guidance for the design of the project. The Bank provided a fast response under OP 8.50 to channel resources to the reconstruction of the education sector of Afghanistan.

Quality of Supervision
Rating: Moderately Satisfactory
A Bank team of national and international education specialists was present at the Bank's office in Kabul. Deteriorating security significantly limited the Bank staff's movement across the country to conduct on-site project supervision. Field visits were carried out only in the safer provinces where air travel was possible and permitted (mainly Herat, Bamyan and Balkh). Mid-term, portfolio and ex-post fiduciary (procurement and FMR) reviews were conducted.
The 11 Bank also became a lead donor supporting harmonization and alignment of international aid in the education sector. Justified by the emergency response, the main tool for investment planning was the procurement plan. However, procurement plans are not effective to monitor output and outcome level indicators, annual investment activities tied to approved budgets, and balanced implementation across components. Given the project's stated emergency-to-institutionalization transition objective, the Bank should have requested the MOE to adopt a basic set of strategic, operational and administrative tools (such as annual operation plans, monitoring of cash flows and approved fiscal year budgets, and a reliable monitoring and evaluation system). The Bank made repeated recommendations (such as subcomponent investment tracking and monitoring planned and actual targets), but these were not implemented by the MOE in the absence of Bank's taking remedies. Justification for Rating of Overall Bank Performance Rating: Moderately Satisfactory The transitional context of EQUIP (from an emergency program to a longer term, sustainable service delivery) required a strategic focus and greater implementation support to government counterparts. The Bank's task team was not adequately staffed to effectively address a myriad of education and organizational needs in a challenging context. The country security situation and the need for extended assignments without being accompanied by family members have posed a challenge to attract and retain seasoned TTLs and senior operational staff in Afghanistan. 5.2 Borrower Performance Government Performance Rating: Moderately Satisfactory The GoA and MoF are fully committed to the education sector (and EQUIP). The GoA provided annual budgets for EQUIP and rallied support from ARTF for EQUIP financing. The MoF was committed to solve any problems encountered, and allocated specific staff for the education sector. Although EQUIP had a relatively simple checklist of PFM procedures, in practice at the provincial level additional non-formal processes were added (e.g., processing of grants to communities required many signatures for approval--including from the Provincial Governor). Decentralized EQUIP investments were not fully understood by Mustoufiats (provincial MoF offices); and administrative problems were solved on a one-to-one basis, rather than systemically. The timing of forward budgets, annual fiscal budgets and allotments, as well as late sub-national liquidation of grants, impacted project cash flows. Implementing Agency or Agencies Performance Rating: Moderately Satisfactory The MOE has made considerable efforts to improve the management of education services. Some stricter guidelines on basic strategic planning, integrated monitoring, and results-based management are needed. The move from emergency assistance to longer-term reconstruction and State building would have benefited from the MOE's taking prompt actions to improve the skills and responsibilities of its permanent staff and defining sustainable strategies to gradually reduce direct management of education services by consultants. 12 Rating Justification for Overall Borrower Performance Rating: Moderately Satisfactory The rating is based on mixed efforts: on the positive, a decentralized community participation and school-based management model was developed. Also the design of a medium-term policy framework to guide investments (the NESP) is commendable. 
Collecting and disseminating education sector data has been an important input for future education reforms. However, commitment and effort are needed to strengthen the MOE's still fragile management, improve MOE departments' coordination and information sharing, and review the formal and non-formal public financial management execution processes, especially at the sub-national level.

6. Lessons Learned

6.1 General
Systematic sector analysis in fragile and post-conflict country contexts, even with limited data available, contributes to better project design. The "Securing Afghanistan's Future" report exemplifies the positive contributions of a basic systematic sector analysis to project design and quality at entry.

Sector development programs in fragile and post-conflict states require an operational alignment with central reform and reconstruction programs. EQUIP had difficulties coordinating with the PRR program to improve the MOE's organization, financial management, pay and grade, and other human resources reforms. Sector projects need the tools and explicit processes to link with or influence the broader State reform programs.

Flexibility and fast-response policies and approaches need to be balanced with minimum standards for program planning, decision making and management. Bank OP 8.50 allows fast and flexible preparation and approval of financing for emergency situations. However, the lesson learned is that even during an emergency, planning and management processes--albeit with simpler tools--are needed. Recently, the MOE and the Bank have increased their efforts to introduce stricter planning and management criteria and tools for education investments in Afghanistan. Emergency response programs need to coordinate with other post-conflict programs aiming at State reforms and capacity building.

Targeting and differentiated strategies are needed in highly heterogeneous post-conflict countries. Simple emergency designs do not have to be homogeneous and could include, for example, targeted and differentiated interventions for rural and urban communities, secure and unsecure areas, and secular and Islamic messages (the latter in the case of Afghanistan). EQUIP's design did not establish differentiated interventions, especially for an ethnically and geographically heterogeneous country with an incipient peace process and a sensitive cultural and religious makeup.

Aid channeled through Government should contribute to identifying practical ways to continue to improve public financial management (PFM) execution and solve bottlenecks for social services delivery. EQUIP funding is channeled through the Afghanistan PFM system, facilitated by IDA's fiduciary guidelines. Core budget sector programs, such as EQUIP, can help identify practical ways to improve the country's PFM and procurement systems--rather than only solving their administrative bottlenecks through short-term donor (IDA and ARTF) procedures: larger advances to special accounts, retroactive payments, overdrawn disbursement categories, "special case" no objections, etc.

Fragile and post-conflict countries are vulnerable to costly and unsustainable technical assistance (TA). Early and proactive planning needs to be in place to reduce dependency on TA, including concerted agreements among all donor agencies on consultants' salary scales and capacity building strategies. A gradual shift to permanent staff management can prevent a collapse of intellectual and operational support after the high point of aid flows.
The MOE in Afghanistan became one of the largest employers of highly paid consultants in the GoA, distorting the education sector HR market in the country. 6.2 Project Specific Community participation and school grants can provide demand and stimulus for State and public management reforms. EQUIP School Grants can become a catalyst for MOE management reforms focused on teaching and learning quality. Thus, community participation needs to become a cross-cutting objective in all MOE technical and fiduciary departments. Community mobilization teams need to be formally part of the MOE structure to ensure sustainability. EQUIP could have benefitted from better integrated physical, financial and results-based planning. The EQUIP coordination unit has begun to prepare annual integrated plans (with results indicators, targets, activity programming, finance available and procurement). However, these instruments are not used by all MOE departments involved in EQUIP, and there is the risk that strategic and operational reports become only a tool for donor reporting rather than an internal MOE mechanism for communication, management and informed decision making. MOE should monitor and provide strategies to close equity gaps, especially in national investment programs. School infrastructure data shows that up to 30% of EQUIP supported schools did not follow the equity criteria to increase girls enrollment (MOE data reflects only male students in these schools). Also, the lack of differentiated service delivery strategies may have negatively affected unsecure and hard-to-access rural areas (See Annex 3). Cultural and Psycho-social interventions could have complemented EQUIP investments. EQUIP and other programs in Afghanistan should not lose sight of the "social, human and psychological burden" of conflict in the education sector, which requires explicit psycho-social support, along academic programs, for an integral and holistic approach for sustainable reforms. 7. Comments on Issues Raised by Borrower/Implementing Agencies/Partners The following is a summary of comments received from the Borrower, co-financiers, NGOs and other education partners. Their comments are noted for the implementation of EQUIP II. 14 7.1 Borrower/Implementing Agencies (See Annex 7 for details): The Ministry of Education of the Islamic Republic of Afghanistan thanks the World Bank for preparing the Implementation Completion and Results report of EQUIP I, which was reviewed by the MOE departments with profound interest. The report was prepared in very close cooperation and consultation with the concerned departments of the MOE. The MOE will seriously look at the issues, lessons learned and recommendations of the ICR and will assign a Task Force--with representation of all concerned departments and stakeholders--to address the issues raised and implement recommendations. To present, validate and finalize the draft of the MOE's NICR report for EQUIP I, a one-day workshop, with technical support from the World Bank's ICR mission, was conducted at the Ministry of Education in August 2009. The working groups with representation of all MOE concerned departments discussed: (i) the background, implementation and results of the different components of the EQUIP; (ii) current status of the MoE programs/departments supported by different components of EQUIP; and (iii) filled information gaps and committed to apply lessons learned in future education programs. 7.2 Co-financiers (See Annex 8 for details) Not achieved. 
EQUIP I outcomes and outputs should be strongly addressed under EQUIP II, especially as ARTF education sector contributions have increased considerably. Special attention should be given to: (i) strategic and sustainable use of Technical Assistance (especially as the MOE has begun to tackle this issue through its Technical Assistance Evaluation Committee in April 2009); (ii) the lack of differentiated interventions for service delivery in unsecure areas, which is still not addressed in EQUIP II; (iii) improved annual integrated implementation plans and monitoring, not only for infrastructure but also for other key education quality investments, such as teacher training; and (iv) student learning assessments.

7.3 Other Partners, NGOs and Stakeholders (See Annex 8 for details)

The World Bank and the Government of Afghanistan are to be applauded for their joint effort under EQUIP I to jump-start reform in the midst of great uncertainty and vulnerability. The ICR highlights the successes and shortcomings of an ambitious EQUIP I attempting to respond to immediate emergency needs in education while creating conditions for more lasting reform at a future stage. Three elements stand out as worthy of comment here: (i) sectoral reform projects sometimes make the mistake of targeting only the macro or the micro realm, but EQUIP I boldly (and correctly) tackled both--local and national approaches--simultaneously. EQUIP I was more successful at the school and community levels than at the national level. In EQUIP II the priority clearly is to bring advances at the national level into synch with those already registered at the local level to achieve lasting reform; (ii) the absence of skilled national manpower is noted as a bottleneck to implementing the myriad activities critical to project success, which led to an inordinate reliance on foreign and national consultants and a spiraling of salaries. The second phase of the project should contemplate an immediate and urgent program of training and incentives to build the capacity and motivation of a cadre of educational professionals; and (iii) for reasons not explained in the ICR, the student learning assessment tool was not developed. Early mastery in grades 1 through 3 of the basic skills of reading and numeracy is critical for student success in later years. There are experiences to be shared with the MOE and EQUIP II on student learning assessment, such as the Rapid Reading and Numeracy Test for grades 1-6 under the USAID-funded Partnership in Advancing Community Education in Afghanistan (PACE-A).

Annex 1. Project Costs and Financing

Project Cost by Component (in USD million equivalent)

Components | Appraisal Estimate (USD millions) | Actual/Latest Estimate (USD millions)* | Percentage of Appraisal
School Grants (Infrastructure & Quality) | 21.00 | 29.25 | 140%
Human Resource and Institutional Development*** | 11.50 | 4.32 | 38%
Policy Development, Monitoring & Evaluation**** | 2.50 | 3.91 | 157%
Total Baseline Cost | 35.00 | 37.48** | 107%
Physical Contingencies | 0.00 | 0.00 | 0.00
Price Contingencies | 0.00 | 0.00 | 0.00
Total Project Costs | 35.00 | 37.48 | 107%
Front-end fee PPF | 0.00 | 0.00 | .00
Front-end fee IBRD | 0.00 | 0.00 | .00
Total Financing Required | 35.00 | 37.48 |

* Latest estimates by component were reconstructed by the ICR team, as EQUIP tracked expenditures only by Grant Agreement disbursement categories.
** Based on the present exchange rate of 1 XDR = 1.56 USD
*** Component 2 includes expenditures related to institutional development grants, training and workshops, and the cost of TED and EQUIP's provincial and district teams (approximately 32% of EQUIP TA costs).
**** Component 3 includes TA (not related to TED or EQUIP provincial teams) and incremental operating costs.

Financing

Source of Funds | Type of Co-financing | Appraisal Estimate (USD millions) | Actual/Latest Estimate (USD millions) | Percentage of Appraisal
Borrower | | 0.00 | 0.00 | .00
IDA Grant for Post-Conflict | | 35.00 | 37.48* | .00

* Given changes in SDR exchange rates, USD expenditures at project-end are higher than at appraisal (at present SDR exchange rates the Grant amount equals USD 37.48 million).

Annex 2. Outputs by Component

This annex presents detailed information on physical outputs and implementation lessons learned (which include administrative processes and monitoring and evaluation) for each subcomponent. As a preamble, Section I of the Annex, below, provides some additional information on the national level PDOs included in Section 3 of the ICR.

I. Overall Context of EQUIP I's Objectives and Indicators

As an "Emergency Recovery Assistance" operation (OP 8.50), EQUIP focused on rebuilding physical education assets or inputs such as: (i) infrastructure improvement and development; (ii) classroom furniture, consumables and learning materials; and (iii) adequate human resources for teaching and education administration. Although EQUIP did not yet attempt to systematically address a long-term education quality reform, it did aim to improve some key processes such as community participation, provincial and district level education management, and MOE policy and monitoring and evaluation, as a foundation for future education reforms.

Caveat on National Level Indicators

Table 1, below, presents the original indicators defined during the first year of project implementation, and their fine-tuning after the Mid-Term Review (see Section F and Section 3 for baselines, targets and other detailed information on PDOs and IPs). There was no formal revision, as indicators and target values were not included in the approval documents (Technical Annex and Grant Agreement). It is important to note that the PDO level indicators for student enrollment and size of the teaching force are national level indicators, and EQUIP had no tools to monitor the direct causal relationship between project investments and changes in national PDOs.

Table 1: EQUIP Development and Implementation Objectives: Original and Fine-Tuned

Original PDO Indicators | Revised After the Mid-Term Review
Net Enrollment for Boys | Total Enrollments Boys and Girls
Net Enrollment for Girls | % of Girls in Total Enrollments
Student Learning Outcomes | Grade 3 Reading Assessment
# of Female Teachers and % of Teaching Force |
Education Policy Developed (NESP) | Evaluation of EQUIP Interventions

Original Intermediate Outputs | Revised After the Mid-Term Review
Number of Teachers Trained | Number of Teachers Trained in Pedagogy
Number of Principals Trained | Number of Teachers Trained in Curriculum Content
# of Schools Implementing SIPs | Number of Schools Receiving Quality Grants
Number of Schools Rehabilitated | # of Schools Receiving Infrastructure Grants
EMIS Annual Update and Use | Development of EMIS

EQUIP sought to contribute to national indicators related to increased basic education access and increased availability of female teachers.
To increase enrollments, and especially girls' access, EQUIP financed school buildings in 26 provinces of Afghanistan. However, the MOE did not monitor existing vs. new enrollments in communities receiving EQUIP infrastructure. Thus, a direct project contribution to the 2005-2008 national enrollment increments, detailed in Table 2 below, cannot be assessed. Also, EQUIP I did not include specific interventions to increase female teachers (such as special pre-service female teacher training strategies, special incentives to attract females to the teaching workforce, or salary support); thus the changes detailed in Table 3, below, on the increased numbers of female teachers cannot be directly attributed to EQUIP I. Nonetheless, the tracking of these indicators is important to see the general growth of the education sector in Afghanistan during the period of project implementation.

Table 2: Summary of National Indicators Included in EQUIP: Student Enrollments in Afghanistan 2005-2008

Year | Boys | Girls | Total
1384 (2005) | 3,211,794 | 1,682,921 | 4,894,715
1385 (2006) | 3,564,432 | 1,929,739 | 5,494,172
1386 (2007) | n.a. | n.a. | 5,675,951
1387 (2008) | 3,930,073 | 2,182,380 | 6,112,453

Source: Ministry of Education of Afghanistan, Planning Department

Table 3: Summary of National Indicators Included in EQUIP: Availability of Teachers in Afghanistan 1383-1388 (2004-2009)

Year | Total | % Growth from Previous Year | Male | Female | Female as % of Male | Female as % of Overall Teaching Force
1383 (2004) | 122,910 | (baseline) | 88,802 | 34,108 | 38.41% | 27.75%
1384 (2005) | 128,400 | 4.47% | 92,260 | 36,140 | 39.17% | 28.15%
1385 (2006) | 136,503 | 6.31% | 98,083 | 38,420 | 39.17% | 28.15%
1386 (2007) | 142,508 | 4.40% | 103,047 | 39,461 | 38.29% | 27.69%
1387 (2008) | 152,281 | 6.86% | 107,748 | 44,533 | 41.33% | 29.24%
1388 (2009) | 164,771 | 8.2% | 116,298 | 48,473 | 41.68% | 29.42%

II. Subcomponent Level Outputs

Component 1: School Grants

Subcomponent 1.1: School Grants for Quality Enhancement

Physical Outputs: This sub-component has been implemented under the initial parameters defined in the original project document, and has surpassed its target of 2,000 school improvement plans (SIPs) financed.

Table 4: Output Indicators for the School Grants for Quality Enhancement (QEG)

Indicator | Original Target* | End of Project Achieved
Number of School Management Committees (SMCs) formed | 2,000 (later increased to 5,827) | 5,827
Number of Schools with Quality Grants | 2,000 (later increased to 5,827) | 3,963
Quality Grants Fully Utilized | 2,000 (later increased to 5,827) | 3,963

* The original target was defined based on the original IDA approval of US$35 million; the target was increased upon additional financing of US$44 million provided by ARTF.

The analysis of EQUIP-supported SIPs shows that almost 70% of the US$11.5 million of Quality Enhancement Grants (QEG) financed libraries, laboratories, computers and other school equipment (Table 5).
Table 5: Type of School Expenditures Financed by QEG in EQUIP I

Type of Expenditure | % of Overall QEG Financing* | Total Financing in US$
School Equipment | 23% | 2,671,325
Library | 22% | 2,594,166
Computers | 16% | 1,928,354
Laboratories | 14% | 1,639,788
School Repairs | 10% | 1,122,929
Training | 6% | 694,337
Sporting Materials | 5% | 593,084
Water Supply | 0% | 59,331
Fares/Transport | 0% | 22,389
Tents | 0% | 15,910
Miscellaneous | 1.6% | 186,253
TOTAL | | US$11,527,864

* May not add up to 100% due to rounding

Implementation Process Lessons Learned: The education quality grants subcomponent provided resources to satisfy basic needs of schools, directly managed by School Management Committees (SMCs). Implementation of this sub-component generated the following important lessons learned:

An important legal foundation exists for on-going education reform with community participation and school-based management. Article 46 of the Education Law reinforces EQUIP's vision of an active role for community and parents in improving education access and quality. It clearly states that school management committees in private or public schools can assume the following roles: i) contributing to solutions to education problems and to improving education quality; ii) managing and monitoring learning, values and behavior of students and school employees; and iii) safeguarding the education community against violence and any other detrimental actions within and outside the education institution. However, the MOE needs to take further advantage of the opportunity that the education law presents for PTA and SMC involvement in education quality processes, to support teachers, and especially to support students in their holistic learning objectives. Teachers are informally organized (not officially) in teacher committees; however, teachers need to be officially part of the SMCs to allow them a more active role in education quality planning in their schools.

The MOE is building community mobilization capacity and financial transfers to schools. As a systematic process, the Quality Enhancement Grants model has made positive use of decentralization and community participation. Social mobilizers (originally provided by NGOs and later hired directly by the MOE) orient technical district teams, school principals and SMCs on basic planning processes for the preparation of their School Improvement Plans (SIPs). The MOE is attempting to decentralize the management of social mobilizers to the Provincial Education Departments (PEDs), with the guidance of a community mobilization central team within EQUIP's PCU in Kabul. Also, significant progress has been achieved in systematizing school grant transfers to PTAs/SMCs. School-based financing needs to be strengthened, but it already provides the basis for consideration of school budgets and community participation in Afghan schools.

Community participation is an important element in conflict-afflicted regions. A Bank-financed study identified that during the project implementation period there were more than 785 school attacks (by anti-Government elements, regional militias and criminal activity). International experience demonstrates the important role that communities and their leaders (including religious leaders) can play in the safety of their schools, students and teachers in the midst of on-going conflict. A systematic assessment of this role was not included in EQUIP, although the MOE is now developing strategies to improve the role of community and religious leaders in safeguarding schools.
Monitoring and Evaluation (including Mid-Term Review. The EQUIP PCU and the Community Mobilizers monitored the following indicators: number of SMCs organized, SIPs prepared, and type of school inputs financed (see table 5, above). The two NGOs, CARE and BRAC, hired to implement the subcomponent initially identified that the MOE enrollment data of schools to receive grants reflected some over-reporting of students, which needed field verification. During the Mid-Term Review, this subcomponent was rated satisfactory, highlighting the success of social mobilization and community participation to establish the SMCs and PTAs in education administration. The field visits of the MTR and the M&E assessment revealed a strong commitment of the community and their empowerment in the performance of their duties. 21 Administrative and Management Experience. The transferring of grants to schools has been an important mechanism to provide decentralized resources for education investments directly to communities and schools. The setting up of such a school-based management system during the emergency period in Afghanistan is commendable. During implementation long processes and delays in the transferring of grants became apparent, and still need to be corrected. As the MOE took over the grants component, the transferring of funds became part of the regular public financial management procedures, which are channeled through many different steps between the MOE, MOF, local MOF provincial offices (mostufias), and provincial and district education departments. Review of SIP and Grant documentation at the school level, noted many approval signatures for school grants, which evidence longer than expected bureaucratic financial and administrative steps (not detailed in the Technical Annex and Operational Manual of EQUIP). Longer non-formal processes may pose higher governance vulnerability risks (e.g., pay-offs for approving signatures or for expediting procedures). Sub-component 1.2.: School Grants for Infrastructure Development Physical Outputs: To date, a total of 755 schools have received infrastructure grants, and 355 have been completed. As the ARTF grant that supports EQUIP I will not close until March 31st, 2010, 400 schools are still being built and will be funded from ARTF funds. Table 6: Output Indicators for the School Infrastructure Grants Indicator Original Target* End of Project Achieved Number of Schools with 200 (later increased to 822) 755 Infrastructure Grants Number of Schools 200 (later increased to 822) 355 completed (400 in Completed (Rehabilitated) progress with ARTF financing) *Original target was defined based on the original IDA approval of US$35 million; the target was increased upon additional financing of US$44 million provided by ARTF. Implementation Process Lessons Learned: The infrastructure subcomponent was the largest financial investments in EQUIP I, which generated important lessons learned: MOE not able to provide causal link between EQUIP infrastructure and increased access. 
Causal link between school infrastructure financed by EQUIP and increased nationwide and provincial enrollments cannot be measured due to: (i) schools built by EQUIP provided student places for existing students (who did not have a physical school building) or new enrollments; (ii) net and gross enrollments which would account for the school age population outside of school could not be quantified, as a reliable population census still does not exist in Afghanistan; (iii) the MOE had no planning and monitoring instruments to measure enrolments before and after construction in beneficiary communities; and (iv) there were weak linkages between the EQUIP PCU and the Procurement Department, which managed the infrastructure program, and the Planning Department, which monitored and projected increased enrollments. No special design for girls' schools and deteriorated girls' enrollment targeting. The MOE has prepared over 80 standard designs for school construction, based on school size, local construction material, and terrain conditions of the community. However, it was not evident that 22 a specific design for girls' schools was considered, taking into consideration cultural requirements for additional privacy such as inclusion of a boundary wall or separate restroom facilities for boys and girls. In surveys and interviews with local communities, such lack of differentiated designs for girls' schools was considered an impediment to increased girls' enrollment. Also infrastructure data analysis shows that the original targeting of girls access for EQUIP I infrastructure deteriorated as the program scale up to a nationwide program (see Annex 3). The analysis also shows wide provincial differences in effectiveness regarding female enrollment, which may call for differentiated strategies to continue to support this important objective (given security, cultural context, appropriate school designs or other special support for girls' education). Further clarity of community participation in school construction by firms is needed. The community contracted schools follow a full decentralized approach, by which the SMC received the financial funds and managed the construction of their school. However, for larger schools, the MOE conducted national competitive bidding processes (NCBs). However, for NCBs it was not clear the role the SMC played in evaluating and supervising school construction by firms (including school specifications, quality monitoring, information on sub-contracted builders, etc.). Large gaps in school infrastructure still exist. Even with the large support of EQUIP on infrastructure, the MOE data shows that to provide appropriate schools for the almost 7 million students enrolled in grades 1-12, the country requires 73,200 classrooms, of which only 42,000 have been provided (or 57%). Thus, there is still a 40% deficit for required infrastructure, without considering even new enrollments but just existing student population. In addition, at the same time that the international cooperation is supporting the construction of schools, about 650 schools have been damaged and/or destroyed as a result of the on-going armed conflict in Afghanistan. Given the still large infrastructure needs, the GoA, the MOE and the donor community need to define new differentiated and effective strategies to provide appropriate learning environments--incorporating lessons learned (positive and negative) from previous investments such as EQUIP and others. Monitoring and Evaluation (including Mid-Term Review). 
During the Mid-Term Review (MTR), this subcomponent was rated satisfactory. The MTR considered satisfactory the level of progress achieved in 12 provinces, highlighting that the original goal of 250 schools in 10 provinces had been surpassed. The MTR noted the development of cost-effective standards in accordance with the individual needs of each school, which accounted for use of local materials and precautionary measures for earthquakes. Administrative and Management: To accelerate construction of schools in 2008, the MOE initiated the construction program for EQUIP II (two) without having the approved MOF budgets. This decision also crowded out the remaining infrastructure investments in EQUIP I, with the result of still 400 uncompleted schools. The MOE now is expected to prioritize its construction program, for which the EQUIP I schools to be financed by ARTF contributions are a high priority (the EQUIP I ARTF grant will close on March 31st, 2010). Improved and stricter communication between the MOE's financial management and the procurement departments are in place to guarantee that activities to be implemented each fiscal year have the approved MOF 23 budgets and allotments. The EQUIP PCU needs to facilitate the preparation and day-to-day use of annual implementation plans--linking activities to be managed by MOE units to project objectives targets (to be achieved each fiscal year) and to approved budgets and, then, authorization to initiate procurement processes. There is still some resistance to such a coordinated approach among MOE departments, as communication and integrated planning is limited among them. The MOE, the Bank and education sector donors need to continue to support integrated planning and integrated education services delivery. Component 2: Support to Schools through Institutional and Human Resource Development Subcomponent 2.1: Teacher Training Physical Outputs: From 2006-2007, this sub-component financed the training of 32,467 teachers (21% of the teaching force at the time) under the first "In-service Teacher Training Program", INSET I (see Table 7, below). The original teacher training target for IDA financing were only 20,000 teachers. With increased ARTF financing, and after the Mid-Term Review, the target was increased to 140,000 teachers (the overall teaching force at the time). Nationwide teacher training did not take place given that the modality for training changed (from MOE execution to proposed NGO delivery) and the MOE decided to finance the in-service training program during the second phase of EQUIP. The project, thus, supported curriculum development and teaching standards--to provide the content and skills to be developed within the pre-service and in-service teacher training programs of Afghanistan. Table 7: Teachers Trained During Implementation of EQUIP Year Province Training Date Total Trainees by Total Gender Trained Male Female 2006 Paktia July 06 to Nov 06 1,796 120 1,916 2006 Kapisa July 06 to Dec 06 1,387 309 1,687 2006 Logar July 06 to Dec 06 1,504 323 1,827 2006/2007 Parwan Aug 06 to Jan 07 3,060 507 3,567 2006/2007 Kabul (Kabul Sept 06 to Dec 07 5,543 8,774 14,317 City included) 2007/2008 Laghman Dec 07 to Dec 08 2,981 0 2,981 2007/2008 Nangarhar Dec 07 to Dec 08 5,747 425 6,172 Totals 7 Provinces 2005 to 2008 22,009 10,458 32,467 During the implementation of EQUIP I, an additional 13,000 teachers received random training, as well as 516 Teacher Educators and 34 Headmasters in 7 provinces of Afghanistan. 
Implementation Process Lessons Learned: The teacher in-service training subcomponent did not complete its increased target of training the entire Afghanistan teaching force; however, important lessons can be drawn:

The MOE developed curricular and standard teacher competencies as a foundation for pre-service and in-service teacher training. EQUIP I TA contributed to the development of a national unified teacher training curriculum comprised of 8 teaching standards (pedagogy, active learning, questioning skills, group work, child development, lesson planning, evaluation methods, and diversity). These standards are now part of the Framework for Teachers of Afghanistan. Curricular reform and textbook revisions were also tangible products supported by EQUIP-financed TA, which have become an integral part of an effective teacher education and development system.

Innovative teacher training strategies are being developed by the MOE. The MOE has defined innovative models for pre-service and in-service training. The decentralized training strategy (called District Teacher Training Teams, DT3) was initiated by USAID's BESST program in 11 provinces and will be implemented nationwide by the MOE during the second phase of EQUIP. The Teacher Education Department and BESST have also improved the methodology and materials of the INSET 1 (pedagogical) and INSET 2 (curricular content) training programs. For pre-service training the MOE is piloting a satellite Teacher Training Colleges model, especially targeting prospective female teachers.

Need to monitor NGO-provided training and avoid training fragmentation. Although NGOs can increase the effectiveness and efficiency of teacher training, the MOE will require an appropriate system for accreditation, quality control and performance monitoring of NGOs. It is important, as well, to guarantee that teacher training is seen as an integrated service provided by the MOE, albeit supported by NGOs. Teacher training workshops can easily become isolated and fragmented training if not tied to other curricular reforms (such as curricular standards, subject content, teaching materials and textbooks). Guaranteeing an integrated system requires, first, that different departments of the MOE plan and manage integrated services, and that specific guidelines be provided to third-party providers of teacher training, such as NGOs.

Monitoring and Evaluation (including Mid-Term Review): There was limited monitoring data for the teacher training component, and no evaluation design. However, the only source of systematic information on in-service teacher training in Afghanistan was collected by BESST, a USAID-funded program which has started the implementation of INSET I and II in 11 provinces. The BESST monitoring system provides a foundation for accelerating a proper monitoring system for teacher in-service training within the MOE's Teacher Education Department (TED) (see Figure 1).

Figure 1: BESST Information Flow (Team Member in Training Site > Team Leader in Province > Provincial Manager in Province > NGO Partner's Designated Person in Kabul > BESST M&E Department in Kabul)
1. The Team Member reviews the completed trainee forms at the training site, puts them in envelopes and hands them over to the designated Team Leader in the province.
2. The Team Leader first conducts a crosscheck prior to submitting the data, seals the envelopes and submits them to the Provincial Manager.
3. The Provincial Manager counts the number of envelopes submitted to him/her, writes a short report and then delivers the data to his/her Kabul office (most probably to the designated focal point).
4. The focal point, or designated NGO partner's person in Kabul, receives the data and then writes a data delivery report covering the number of forms or envelopes submitted to the BESST CAII M&E Department in Kabul.
Source: Creative Associates

Administration and Management: Teacher training is a complex activity which requires integrated management by various MOE departments and other education service providers. For example, the planning department needs to provide reliable information on teachers; the curriculum department needs to provide content orientation and seek support from teacher training activities for appropriate use of textbooks; PEDs and DEDs need to support teacher mobilization for training and support classroom follow-up; and SMCs and SIPs need to define strategies to support teachers and student learning. Thus, reducing silos between MOE departments should be a focus of on-going and future MOE institutional development programs.

Sub-component 2.2: Training of Principals

Physical Progress: The 3,000 principals were not trained, as expected, by EQUIP, due to: (i) late preparation of programs and training modules; (ii) a focus on getting the teacher training modules prepared, to the detriment of the training design for principals; and (iii) delays in the implementation of the PRR program, which was to develop the selection criteria for school principals and to define their upgraded terms of reference. Table 8 shows the limited targets achieved by project completion.

Table 8: Principal Training Key Progress Indicators and Targets

Indicator | End of Program Target | Target Achieved
Develop training program and module for school principals | Training Program Developed & Initiated in 10 provinces | Training Program Developed but Not Implemented
Actual # of Principals Trained under EQUIP I | 3,000 | 168

Implementation Process Lessons Learned

Principals are the most important managers of the education process. EQUIP's oversight of principal training is a major flaw. Principals are the key actors not only in the proposed school-based management goals of the Afghan education system, but also a key factor in the process of teaching and learning in schools. Prioritizing principal training is imperative.

Principal HR reforms need to keep pace with expected education reforms. It is important to speed up the selection process and training of school principals. Principals lead school management processes and are facilitators of community participation together with PTAs and SMCs.

Monitoring and Evaluation (including Mid-Term Review): In April 2007 the Mid-Term Review mission reported that "the merit based hiring of school principals for Kabul Province has been completed providing an opening for this sub component to begin albeit in a smaller scale." However, no further progress on the subcomponent was documented, and there was no monitoring information regarding follow-up to the proposed principal training beyond the limited 168 principals trained in Kabul (out of 3,000 expected within EQUIP I).
Sub-component 2.3: Capacity Building of District and Provincial Education Departments

Physical Progress: Although the specific strategies to increase the capacity of PEDs and DEDs were not fully detailed in the project design, available information shows that more than 150 national consultants--in social mobilization, engineering and financial management--were hired to support the 34 provinces of Afghanistan (see Table 9). Some additional infrastructure and equipment were provided to some PEDs.

Table 9: Provincial and District EQUIP Teams

Specialization | # of Provinces | # of TA | Annual TA Cost (2008)
EQUIP Coordinators, Monitoring Specialists, Social Mobilizers, Engineers, and Financial Managers | 34 | 181 | US$2,656,272

Implementation Process Lessons Learned

An increased PED and DED role in supporting teaching and learning quality is needed. PED and DED roles include support for the training of school teachers, community mobilization, and some basic administration and management functions more effectively provided at the sub-national level. The MOE has a new organizational structure that clearly defines the central, provincial, district and school levels; however, more strategic and systemic capacity building--beyond TA to fill in capacity gaps--is needed. The District Education Departments (DEDs), moreover, are the MOE institutions closest to the schools and require strengthening to provide effective school management and pedagogical support.

Integration of Community Mobilization Teams within PEDs and DEDs. A recent decision of the MOE is to integrate the EQUIP teams with the PEDs and DEDs, and formally make the community participation and school-based management functions part of these sub-national MOE departments. Also, at the central MOE a more appropriate institutional home needs to be found for social mobilizers and, in general, for the community participation and school-based management strategies and operations.

Administration and Management: The DEDs have limitations in carrying out their supporting roles to the schools. Therefore, it will be important that their capacity is strengthened in order to carry out supervision, planning and school evaluation processes, in addition to the processes encompassed in the community participation strategy.

Monitoring and Evaluation (including Mid-Term Review): There has been limited monitoring and evaluation of the PED and DED capacity building strategies and action plans.

Component 3: Policy Development and Monitoring and Evaluation

Subcomponent 3.1: Policy Development

Physical Outputs: Table 10 presents the major outputs of the Policy Development Component.

Table 10: MOE Sector Management Capacity Building

Indicator | Progress | Explanation
Education Management Information System (EMIS) | EMIS has collected education sector-wide data, which is updated every year. | EMIS requires a hardware and software information technology base to improve data collection, analysis and management use at all levels of the system.
National Education Strategic Plan (NESP) | The first five-year plan of the NESP was completed and guides MOE actions and investments. The NESP was also revised and aligned with ANDS. | The NESP is being updated to align with the completed Afghanistan National Development Strategy and the Result-Based Budget of the Ministry of Finance.
MOE Restructuring | A new "tashkil" for the MOE was approved and operationalized. | The MOE continues to strengthen its policy, programming and management capacity, including a focus at the provincial and district levels.
Technical Assistance Financed From Component 3: In addition to the TA provided to the NESP and EMIS, Component 3 also provided support to other MOE departments and programs related to the three EQUIP I components (see Table 11 below).

Table 11: MOE Programs Supported by EQUIP I

Components | Programs
School Grants | Basic Education Access Construction Program; Community Mobilization and School-Based Management
School Support Through Human Resource Development | Teacher Education and Development (teachers and principals); Curriculum Development; Strengthening of PEDs and DEDs
Policy Development, Monitoring and Evaluation | Education Management; Project Management (GMU, EQUIP coordination); Fiduciary Management (FM and Procurement)

Implementation Process Lessons Learned

The NESP is key for policy guidance of education reforms. The preparation of the National Education Strategic Plan (NESP) was one of the major outcomes of the MOE. The NESP is the main tool that allows the MOE to guide education sector investments and to improve alignment and harmonization. The second 5-year plan for the NESP, 2010-2014, is being finalized by the MOE to guide sector investments and track results.

Harmonization and Alignment of Aid: The preparation of the NESP was supported by different donors, including UNESCO, IIEP (with funding by Norway), USAID, DANIDA and EQUIP, through resources to hire technical assistance in the Planning Department. The Planning Department exemplifies ways to improve aid effectiveness by channeling resources from various donors to MOE and Planning Department objectives.

The MOE has ownership of the sector strategic planning process. The Planning Department has also shown increased capacity, having prepared the NESP 2010-2014 with less TA support and through a fully participatory approach. Also, the NESP 2010-2014 was prepared entirely in local languages, as opposed to the NESP 2006-2010, which was prepared initially in English and then translated into Dari and Pashtu.

Unsustainable increases in technical assistance and consultant salaries must be monitored. An unexpected outcome of EQUIP support for TA in various MOE departments was a rapid increment in salaries, and a consultant-led management of the MOE. Tables 12 and 13, below, show the number of MOE staff and their salary expenditure, in comparison to the estimated costs of the almost 1,250 national consultants hired (EQUIP financed approximately 40% of these consultancies).

Table 12: National Consultants Hired by EQUIP

Project | No. of National Consultants | Salary Scale
EQUIP | 450 (out of 1,261 total in MoE) | $100 - $6,000

Table 13: MOE National Consultants Relative to Civil Service Staff and Costs

| No. of Civil Service Staff | 1387 Salary Exp. | No. of National Consultants | Estimated Cost/Yr
Ministry of Education | 40,000 (incl. 30,000 agirs) | US$12.9m | 1,261 | US$15m

Monitoring and Evaluation (including Mid-Term Review): The MOE has documented the systematic process by which the NESP has been prepared, which has provided lessons learned to other ministries (for example, for the on-going preparation of the Higher Education Strategic Plan). MOE departments have not fully monitored the capacity building outcomes and performance of EQUIP-financed TA. Recently, the MOE undertook an exercise to rationalize TA, define capacity building plans in each MOE department, and introduce TA performance monitoring mechanisms. The results of this new TA rationalization and monitoring system will need to be assessed.
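To give a sense of the salary distortion described above, a rough per-person comparison can be derived from Tables 12 and 13. This is a back-of-the-envelope illustration only: it assumes the figures in Table 13 are annual totals and ignores differences in roles, grades and contract types.

\[ \frac{15{,}000{,}000}{1{,}261} \approx 11{,}900 \ \text{USD per national consultant per year} \]
\[ \frac{12{,}900{,}000}{40{,}000} \approx 320 \ \text{USD per civil service staff member per year} \]

On these assumptions, the average national consultant costs on the order of 35-40 times more per person than the average MOE civil servant, which is consistent with the report's concern about consultant-led management of the MOE and a distorted education sector HR market.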
Capacity and Support for Fiduciary Departments of the MOE: In 2005 a Project Implementation Manual (PIM) was prepared to support achievement of project objectives, as well as procurement, disbursement and financial management aspects. As there were considerable numbers of community contract packages of civil works in the project, the operations manual elaborates on the procedure for community contracting. A Grant Management Unit (GMU) was established. Later, as part of the MOE's institutionalization process, project procurement and financial management were mainstreamed to the respective MOE line departments. A project coordination unit (PCU) was created to supervise overall project implementation. The MOE hired a large number of individuals under the project, in various positions, to facilitate implementation of the project. EQUIP financed the technical assistance needed to ensure project fiduciary implementation; however, at times the qualifications of the consultants selected by the MOE did not match the functions to be performed, and/or their terms of reference were not clearly defined or monitored.

Sub-component 3.2: Monitoring and Evaluation

Physical Outputs: Figure 2 presents the basic data collection system that EQUIP utilized to monitor SMCs organized, SIPs prepared, the number of grants disbursed, and schools built (by community contracting and national competitive bidding).

Figure 2: EQUIP Monitoring System for School Quality and Infrastructure Grants (schematic: schools submit forms and reports to the DED; the DED reports to the EQUIP Provincial Office at the PED; the PED forwards forms and reports to the MOE/EQUIP Coordination Unit at the central level, which approves and reports to stakeholders)

Table 14 presents the evaluation reports on EQUIP I; however, there was no impact evaluation design for the project.

Table 14: Formal Progress Third-Party Evaluation of EQUIP

Name | Evaluating Agency | Date
EQUIP Mid-Term Review | World Bank and Government of Norway | April 15-30, 2007
EQUIP Mid-Term Report | CAF-SHDP | 2007
Assessment of Education Activities (including EQUIP) | Norwegian Refugee Council in Afghanistan | June 2007
EQUIP Completion Report | CARE | April 7, 2008
Accomplishment Report on EQUIP | BRAC Afghanistan | September 2008
Education Quality and Community Processes Report | The World Bank | August-September 2008
Curriculum Development and Learning Materials Implementation and Capacity | Ministry of Education of Afghanistan | November 2008

Implementation Process Lessons Learned

Contemplated full M&E design not completed. Critical M&E activities were contemplated under the Project to provide timely and adequate information on the implementation, performance, processes, outputs, outcomes and lessons learned of the following programs: teacher development, school improvement grants, rehabilitation and construction of schools, and policy development. A baseline survey at the start of the program, follow-up surveys at the end of year two, and annual surveys were to be conducted. The contemplated and ambitious M&E activities did not take place; however, it is important to note that the MOE advanced on some important building blocks for an M&E system: (i) the EMIS 2006-2008 student and school level data; (ii) fine-tuned indicators and targets in the updated NESP for 2010-2014; and (iii) EQUIP indicators on school-based management and community participation.

Learning Assessment not completed. M&E was to support the MOE in carrying out an assessment of learning achievement to build an appropriately piloted student assessment mechanism. However, many of the above M&E activities were not carried out.
Data collection on grants, infrastructure, and the number and titles of consultants. The PCU monitored data on grants, infrastructure and consultants hired. These data allowed EQUIP to provide information to donors on the progress of Component 1 at the provincial and district levels. Improvements in the collection of these data provided key information for ministerial decision making and for tracking the main project expenditures.

Administration and Management. The M&E investments in EQUIP I focused mainly on some basic data requirements of the PCU and did not finance a fully integrated MOE M&E system. As a result, the project suffered from: i) lack of information to evaluate outcome indicators; ii) insufficient credible and integrated information to monitor and evaluate access to quality education and teacher training activities; iii) limited data integration and exchange within EQUIP and within the MOE; and iv) limited reporting, dissemination and sharing of information. While TA was provided throughout implementation to strengthen project monitoring, the integrated M&E system was not developed. The present availability of qualified staff (including those with prior experience in strategic planning and M&E, and prior staff from the Ministry of Communications) augurs positive future results if integrated planning and operational work plans can be developed linking the NESP, M&E, EMIS and ICT needs.

Annex 3. Economic and Financial Analysis

The Annex is divided into three sections. In Section I, a targeting analysis identifies how the gender criteria were applied in EQUIP and attempts to find factors that correlate with girls' enrollment in schools, such as secured and unsecured provinces, roads, and school infrastructure access. Section II compares the available information on the cost of school infrastructure grants relative to enrollment between the two phases of EQUIP I (when EQUIP scaled up implementation from 10 to 26 provinces)--as an approximation of a cost-effectiveness analysis and a before-and-after-the-Program estimation. Section III presents conclusions and recommendations for future interventions.

I. Benefit Analysis of Investment in Targeting Gender Parity

This section assesses in particular how effective EQUIP was in applying the gender criteria oriented toward improving gender equality. EQUIP I priority was given to: (a) girls' schools; (b) schools with both boys' and girls' sections/shifts; and (c) boys' schools which plan to open up sections/shifts for girls. However, these priority criteria have not been strictly followed by the Program. Up to May 2009, EQUIP school infrastructure grants were mainly given to schools with both boys and girls (53 percent of the total number of schools covered by the Program), followed by boys' schools that do not report girls' enrollments (29 percent) and girls' schools (18 percent). As a consequence, the schools benefited by infrastructure grants have a combined gender parity index of 0.781. The parity index indicates that different proportions of females and males are enrolled in basic education. The combined index masks different trends across provinces. While there are proportionately more boys than girls enrolled at schools benefited by infrastructure grants in most provinces (i.e. Baghlan, Hilmand, Khost, Laghman, Paktika, Parwan, Samangan, and Takhar), this pattern shifts in others, where girls are more likely than boys to attend schools built by EQUIP (i.e. Kabul, Kandahar, Logar, Faryab and Takhar).
Moreover, there are schools that have completed their projects without female students as beneficiaries of the Program (leakage [2]), and there are girls of school age who are not registered in EQUIP schools (under-coverage [3]).

[1] Total number of females enrolled in schools benefited by school infrastructure grants, divided by its value for males.
[2] Leakage: also defined as Type I or inclusion error, it is measured as the percentage of EQUIP beneficiaries who are not girls. A high leakage indicator implies that educational gender gaps are not improving because boys' enrollment is proportionally higher than that for girls. The leakage could be due to project administration (MOE, PED or community decisions), to the household level (parents not sending girls to school), or both. To close or narrow educational gender gaps, priority resources should be given to schools with girls' enrollments.
[3] Under-coverage: also defined as Type II or exclusion error, it is measured as the percentage of girls who are not beneficiaries of EQUIP. If the under-coverage indicator is high, an important share of girls of school age are not attending school, and therefore the proposed gender parity of EQUIP is not being improved.

Given the lack of updated population census data [4], the study cannot assess the ratio of school-age girls out of school not attended by EQUIP (under-coverage rates); thus the targeting study focused only on "targeting leakage." Out of 190 boys' schools (without female students) that have received at least the first school infrastructure grant installment, 73 schools have completed their projects without having reported any progress on opening up sections/shifts for girls [5]. The latter estimate represents 25 percent of the total number of schools with completed projects (297 schools). Since the number of students reported corresponds to the current school year and all these schools were supposed to incorporate female students this school year (the latest), 25 percent would be the leakage rate. As observed in Figure 1.5, the percentage of schools that have completed their projects without incorporating female students varies significantly across provinces.

[Figure 1.5: Percentage of Schools without Female Students by Province. Bar chart by province; vertical axis 0-100 percent. Source: EQUIP Database]

On the supply side, the gender parity index and the above leakage rate suggest high levels of inefficiency in the resource allocation of the Program and/or administrative problems in applying and monitoring the gender targeting, which could partially explain a low impact in closing the educational gender gap. Therefore, it is necessary to explore why these schools were not able to incorporate female students, especially in those provinces in which the leakage rates are very high. From the program management perspective, and in order to achieve an effective improvement in gender parity, it is crucial to monitor and report the progress of these boys' schools in closing the gender gap so as to enforce the commitment of school shuras to improve girls' education.

[4] And the unavailability of NRVA datasets to extrapolate the student population.
[5] Most of these projects were completed in 2007 and 2008.

Closing the Gender Gap in Basic Education

Comparing the targeting of infrastructure during the first and second (nationwide scale-up) phases of EQUIP, less targeting of girls' enrollment can be perceived.
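Before turning to how targeting changed with scale-up, the two targeting measures used in this section can be restated as a minimal worked example. Only the published ratio and counts are used; the underlying female and male enrollment totals behind the combined index are not reported in the annex.

\[ \text{Gender parity index} = \frac{\text{female enrollment in EQUIP-supported schools}}{\text{male enrollment in EQUIP-supported schools}} = 0.781, \quad\text{i.e. roughly 78 girls per 100 boys.} \]
\[ \text{Leakage rate} = \frac{\text{completed boys' schools with no reported girls}}{\text{all schools with completed projects}} = \frac{73}{297} \approx 0.25 \ (25\%). \]

Both are supply-side measures; as noted above, under-coverage (the share of school-age girls left out of EQUIP) could not be computed without updated census data.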
When scaling up programs, it is important to monitor the original equity targeting to avoid increasing educational attainment gaps, even during the reconstruction phase. Figure 1.6 and Figure 1.7 show how the slope of the positive correlation between infrastructure grants and the number of female students turns less steep when scaling up the Program.

[Figure 1.6: EQUIP Infrastructure Grant and Number of Female Students - Phase I. Scatter plot of grant amounts (vertical axis, 0-150,000) against the number of female students (horizontal axis, 0-5,000). Source: EQUIP Dataset]

[Figure 1.7: EQUIP Infrastructure Grant and Number of Female Students - Phase II (National Scale Up). Scatter plot of grant amounts (vertical axis, 0-150,000) against the number of female students (horizontal axis, 0-2,000). Source: EQUIP Dataset]

It is worth analyzing the role of EQUIP as an MOE Program in closing the educational gender gap by comparing the MOE gender parity ratio drawn from the EMIS with the EQUIP gender ratio. Figure 1.8 suggests that EQUIP is covering proportionally more girls relative to boys than the MOE in most of the provinces. This indicates that EQUIP as a Program is positively supporting the MOE gender strategy.

[Figure 1.8: National Female Enrollment and EQUIP Gender Ratio. Scatter plot by province of the EMIS gender ratio (vertical axis, 0-1.5) against the EQUIP gender ratio (horizontal axis, 0-1.5). Source: EMIS (2008/2009) and EQUIP Dataset (2009)]

To translate the national trends into specific changes in gender gaps in EQUIP-benefited schools, a special effort is needed to assess and address the reasons why boys' schools had difficulties incorporating female students. For those boys' schools that are still receiving the benefits of school infrastructure grants, it is crucial to monitor their progress in improving girls' enrollment once the projects are completed. The following subsections present correlations that can be useful for further exploring new strategies to improve girls' enrollment. Data constraints restricted this analysis to security risk, access to education, and access to drivable roads. These correlations do not imply causality; they only identify variables that are significantly associated with each other.

Girls' Enrollment in Unsecure Provinces

Female access targeting was limited in unsecured provinces (see Figure 1.9). Security risk is not only an issue that influences the demand (families sending their children to schools), but also the supply (public and private institutions providing services) in education projects. Therefore, different intervention strategies should be taken into account to alleviate the negative effect of security threats.

[Figure 1.9: Number of Female Students (IG) and Security Risk. Scatter plot by province of the number of female students in schools receiving infrastructure grants (vertical axis) against security risk (horizontal axis). Source: EQUIP Database and CSO. Note: (1) security risk is measured by the percentage of unsafe districts in each province; (2) variables are in logs; (3) P-value = 0.078]

Figure 1.10 reports the negative correlation between high-risk provinces and EQUIP infrastructure investments, which were more limited in unsafe provinces. Security risks can be associated with the cost-effectiveness ratio: increasing access to education for girls in unsecured regions can be more difficult and, therefore, more costly.
[Figure 1.10: EQUIP Infrastructure Grant and Security Risk. Scatter plot by province of EQUIP infrastructure grant amounts (vertical axis) against security risk (horizontal axis). Source: EQUIP Database and CSO. Note: security risk is measured by the percentage of unsafe districts in each province; variables are in logs; P-value = 0.008]

In general, it is important to explore the relevance of designing a specific gender strategy for unsecure areas across Afghanistan, as opposed to implementing a standard national strategy for girls' targeting (for example, with specially designed schools for increased female privacy; or targeting only girls' schools, rather than mixed schools; or negotiating more forcefully with parties involved in the conflict to safeguard schools; or forming alliances with religious leaders, etc.).

Girls' Enrollment and Accessibility

Hard-to-reach communities and schools may also be at a disadvantage both in terms of supply (more difficult to build schools and provide teachers) and demand (parental unwillingness, or difficulty for girls and young children to walk to faraway schools). Accessible and drivable roads can be used as an indicator of ease or difficulty of access within a province. As shown in Table 1.1, communities that have better access to education are associated with more students and a higher level of girls' enrolment.

Table 1.1: Access to Education by Levels (Primary, Secondary and High School)
Each column is a separate regression; the dependent variable is the number of students (girls, or girls and boys, as indicated). Robust t statistics in parentheses; * significant at 5%; ** significant at 1%. Source: NERAP Community Baseline Survey.

(1) # Students (Girls): Access to Female Primary School 130.57 (6.65)**; Distance to the nearest drivable road -0.60032 (2.56)*; Constant 7.92 (4.38)**; Observations 540; R-squared 0.34
(2) # Students (Girls and Boys): Access to Mixed Primary School 147.85 (8.87)**; Distance to the nearest drivable road -0.24284 (0.45); Constant 13.6 (4.87)**; Observations 551; R-squared 0.31
(3) # Students (Girls and Boys): Access to Mixed Secondary School 274.19 (6.92)**; Distance to the nearest drivable road -0.49473 (2.05)*; Constant 7.95 (4.94)**; Observations 546; R-squared 0.29
(4) # Students (Girls): Access to Female Secondary School 124.46 (4.92)**; Distance to the nearest drivable road -0.51233 (0.43); Constant 20.1 (5.28)**; Observations 549; R-squared 0.35
(5) # Students (Girls): Access to Female High School 309.53 (4.25)**; Distance to the nearest drivable road -1.62006 (3.32)**; Constant 19.4 (5.48)**; Observations 524; R-squared 0.28
(6) # Students (Girls and Boys): Access to Mixed High School 286.94 (4.67)**; Distance to the nearest drivable road -0.66119 (1.6); Constant 12.2 (5.45)**; Observations 536; R-squared 0.37

According to the NERAP dataset, over 50 percent of boys and girls of school age who live in villages along NERAP roads attend school, spending an average of half an hour a day walking to school. Table 1.1 displays the correlations between enrollment in primary, secondary and high school for female and mixed schools, the availability of school infrastructure in the village, and the distance to the nearest drivable road from the village. Having school infrastructure in the village is significantly and positively correlated with the number of girls attending these schools. Also, the distance to a drivable road is negatively associated with girls' enrollment, especially in high school (no matter whether the school is only for girls or mixed). Although the data used to correlate girls' enrollment and accessibility are not representative at the national level, the hypotheses drawn from the baseline study of the National Emergency Rural Access Program (NERAP) suggest the need to take access to school sites and to drivable roads into account when planning future education service interventions.
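The annex does not spell out the estimating equation behind Table 1.1. A plausible reading, consistent with the regressors listed, the reported constant and the robust t statistics, is a simple village-level linear model estimated separately for each school type; this is an interpretive sketch, not a specification confirmed by the source:

\[ \text{Students}_{v} = \beta_{0} + \beta_{1}\,\text{Access}_{v} + \beta_{2}\,\text{Distance}_{v} + \varepsilon_{v} \]

where, for village v, Students is the number of enrolled students (girls, or girls and boys, depending on the column), Access indicates the availability of the given type of school in the village, and Distance is the distance to the nearest drivable road. Under this reading, the positive and significant estimates of \(\beta_{1}\) and the negative estimates of \(\beta_{2}\) reproduce the pattern described in the surrounding text.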
Girls' Enrollment and Other Factors

Other factors may also explain girls' enrollment from a demand perspective, as displayed in Table 1.2. In the NERAP baseline survey, the female head of the household was asked why girls did not attend school. 21.36 percent of the respondents said that they did not send their girls to school because there was no school for girls, 18 percent indicated that girls take too much time walking to the nearest school, 16 percent reported that girls were not allowed to go to school, and 9 percent responded that there are no female teachers in the schools. Although further investigation would be needed to improve the targeting of girls' enrollment, it seems that an integrated approach is required and that "single solutions"--notwithstanding the need for simplicity in a fragile and post-conflict context such as Afghanistan--may not alone yield the expected results.

Table 1.2: Reasons Why Girls Do Not Attend School
Main Reason Why Girls Do Not Attend School / Freq. / Percent
Father or Male Family Members Do Not Allow Them: 244 / 15.84
No Female Teacher in School: 144 / 9.35
There Is No School for Girls: 329 / 21.36
Girls Are Married: 25 / 1.62
Prefer to Send Child to Madrassa: 128 / 8.31
Costs Too Much to Travel to Nearest School: 7 / 0.45
Takes Too Much Time to Travel to Nearest School: 273 / 17.73
Children Required for Work (Housework): 196 / 12.73
Cost of School Fees, Materials: 6 / 0.39
Quality of Education Is Poor: 13 / 0.84
Teachers Do Not Come to School: 1 / 0.06
Lack of Security: 16 / 1.04
Children Do Not Learn Useful Things at School: 7 / 0.45
Children Do Not Like School: 32 / 2.08
Better for Them to Go to Madrassa: 11 / 0.71
Feud or Dispute Prevents Children from Attending: 3 / 0.19
Other: 105 / 6.82
Total: 1,540 / 100
Source: NERAP Household Baseline Survey

II. Cost-Effectiveness Analysis of School Infrastructure Grants

In this section we focus on the economic evaluation of the school infrastructure grants of EQUIP I. Any economic evaluation of a public-sector program needs to address the underlying motivation for the government intervention: in the case of EQUIP, it was to improve the quality of education inputs and processes as the foundation for a long-term strategy to enhance the quality of educational outcomes. Appropriate teaching and learning infrastructure, as well as improved education management processes involving community participation, NGOs and the private sector, were considered important underpinnings of reform. As explained below, a simple cost-effectiveness approximation analyzing EQUIP spending per capita (comparing school construction cost and students enrolled today)--across provinces and at two points in time (the first and second phases of school construction in EQUIP I)--is conducted. This analysis is in line with the Government of Afghanistan's priority to provide appropriate spaces for teaching and learning for the large number of students who enrolled in basic education in the years immediately after the end of the Taliban regime (the MOE reports that there is still a 40 percent gap in school infrastructure for existing enrollments).

Assumptions and Caveats

There were major design, data and context limitations for the present economic evaluation; nonetheless, in a fragile and post-conflict context the ICR team proposed that even basic--but systematic--analysis can improve project design, implementation adjustments, lessons learned and decisions for future operations.
A cost-benefit analysis would have been the ideal approach to evaluating the economic impact of EQUIP, but it was not possible with the data available.6 Because monetary values could not be attached to EQUIP impacts, we undertook a cost-effectiveness analysis, which itself had limitations and caveats that are detailed below. A cost-effectiveness analysis compares the costs of achieving the program impacts with those of alternative interventions that could achieve the same impacts. Unfortunately, a predefined cost-effectiveness evaluation for EQUIP was not prepared to evaluate project alternatives such as: (i) the impact of different EQUIP components, or combinations of them, at the school level7; and (ii) the impact of EQUIP in comparison with any other MOE project/program or NGO intervention oriented to improving access to quality education. The lack of school IDs to identify what kind of interventions schools received would also have impeded any comparative analysis.

6 The application of cost-benefit analysis requires one to: identify the impacts; identify the inputs required to bring about these impacts; attach monetary values to these impacts and inputs to obtain benefits and costs, respectively; and finally compare costs to benefits.

The measurement of "effectiveness" in terms of education outcomes was also difficult. Effectiveness criteria in education ideally consider learning as the overall goal. For infrastructure programs, increased new access to education (from the out-of-school population) can also be considered a key effectiveness criterion. Given the lack of student assessment tools and of a population census in Afghanistan, it was not possible to link EQUIP components (such as infrastructure grants) to learning or to net or gross enrollment rates (which account for changes in the out-of-school population).8

Given the above constraints, and still committed to a systematic economic assessment, it was decided to analyze EQUIP spending per capita (comparing school construction cost and students enrolled today) to assess the different levels of effectiveness on school enrollments across provinces. Various studies have documented the relation of appropriate school and learning contexts--including adequate facilities9--with education outcomes such as enrollment, permanence and quality (see, for example, Heneveld and Craig, the World Bank, 1996).

The ex-post cost-effectiveness evaluation compares two phases of the EQUIP I construction program. In the first phase, the community mobilization, school grants and infrastructure program were supported by two international NGOs. In the second phase, EQUIP activities were mainstreamed within the MOE (managed directly) and the EQUIP program was scaled up (or experienced "densification") from 10 to 26 provinces. The cost-effectiveness analysis identifies any differences in the school construction program and its impact on enrollments at these two points in time: before and after densification, and with different management strategies (with and without NGO participation).

Finally, we present results of a before/after comparison of EQUIP using the MOE enrollment data of 200310 as baseline data, and assuming that the proportion of students benefited by the Program today (out of national enrollment in basic education) was the same as in 2003. A sensitivity analysis of this proportion was also performed to support this exercise.

7 For example, classifying schools as those that received: (i) one kind of school grant (i.e.
infrastructure grant, IG, or quality enhancement grant, QEG); (ii) two kinds of school grants (i.e. IG and QEG); and (iii) schools that received the teacher training component and none or at least one of the school grants. The analysis would then compare the impacts of these three combinations.

8 EQUIP I does not have a baseline database; information on teacher training activities is very limited and constrained to what is reported in the last EQUIP Progress Report (2009) and to the information provided by the TED; Afghanistan lacks a population census to measure net and gross enrollment rates.

9 Among other supporting inputs and enabling factors, such as strong parent and community support, school climate, and teaching and learning processes and materials.

10 2003 is utilized as an ex-post defined baseline, since the Program did not collect its own baseline data.

School Construction, Costs and Enrollments

The total funding for EQUIP was US$79 million, of which, up to May 2009, approximately US$31.1 million funded school grants.11 Of these, US$21.5 million went to school grants for infrastructure development (69 percent) and US$9.6 million funded school quality enhancement (31 percent). Within EQUIP I, an additional US$7 million from the ARTF was committed to quality enhancement grants (QEG) but had not yet been disbursed at the time of the ICR (the ARTF closes on March 31, 2010); thus, QEGs were not included in the cost-effectiveness analysis. IDA and ARTF financing within EQUIP supported the construction of 794 schools, benefiting 548,659 students.

The indicator of effectiveness in improving access to education is the total number of students registered in schools that were totally or partially supported by the school infrastructure grant.12 EQUIP supported enrollment of both boys and girls in the present reconstruction period; thus, for the cost-effectiveness analysis, total enrollments (of boys and girls) are considered. It should be noted that EQUIP financed school construction in communities where children were already receiving education services without a physical school (e.g., in tents or community buildings). Given that it is not possible to differentiate existing from new enrollment (since EQUIP did not collect this baseline data prior to grant provision), the total number of students registered is considered the "effectiveness" indicator (be that new enrollment or existing enrollment from a previously precarious school environment).

Given that school grants for infrastructure development comprised the largest component of EQUIP I, with 55 percent of the allocated budget spent, the cost-effectiveness analysis below is based on infrastructure grants. The cost-effectiveness of school infrastructure grants is measured based on the amount of money paid to schools for infrastructure development and basic education13 enrollments in EQUIP-benefited schools.

There were two construction phases. The first phase was implemented in 13 provinces of Afghanistan, building 206 schools and benefiting 368,788 students. The second phase covered 13 additional provinces; 595 schools were expected to be built, of which only 499 have received their infrastructure grants, covering 179,871 students. The project output indicators show that at least 704 schools have received infrastructure grants and that new or renovated infrastructure is being used by 718,072 students.
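A minimal sketch of the cost-effectiveness ratio used in this annex (infrastructure grant dollars paid per student registered in benefited schools), computed overall and by province from a hypothetical school-level extract of the EQUIP database; the file and column names are illustrative assumptions:

    import pandas as pd

    # Hypothetical school-level extract of the EQUIP infrastructure grant records;
    # column names are illustrative only.
    schools = pd.read_csv("equip_ig_schools.csv")
    # columns: province, phase, amount_paid_usd, students_enrolled

    # Overall cost-effectiveness ratio: dollars paid per registered student.
    overall = schools["amount_paid_usd"].sum() / schools["students_enrolled"].sum()
    print(f"Overall cost per enrolled student: US$ {overall:.1f}")

    # The same ratio by province, as plotted in Figure 1.11.
    by_province = (
        schools.groupby("province")[["amount_paid_usd", "students_enrolled"]].sum()
    )
    by_province["cost_per_student"] = (
        by_province["amount_paid_usd"] / by_province["students_enrolled"]
    )
    print(by_province["cost_per_student"].sort_values())

As discussed in the results below, this ratio varies widely across provinces.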
The present cost-effectiveness analysis seeks to identify any variations by province--either higher costs or lower enrollments--and by phase among completed school projects. We assume that the initiation of the school construction work generated enough incentives for parents to send their children to school.

11 This amount corresponds to the allocated budget spent up to May 2009.
12 Schools partially benefited by the grant are those that have received at least the first installment.
13 Basic education is traditionally understood as grades 1-9; however, EQUIP grants could support schools with grades 1-12.

Cost-Effectiveness Analysis Results

Overall, the cost-effectiveness analysis suggests that for EQUIP the cost required to obtain an additional student registered in school was US$39.2. An important message from the cost-effectiveness analysis is the broad variation in cost-effectiveness across the provinces benefited by school infrastructure. Figure 1.11 shows cost-effectiveness ratios across provinces per student, independent of the student's gender.

Figure 1.11: EQUIP Cost-Effectiveness Analysis (Total Number of Students) (bar chart of cost-effectiveness ratios by province and for the Program total). Source: EQUIP Database

The differences in the cost-effectiveness ratios across provinces can be attributed to supply and/or demand factors. On the supply side, differences in the cost of school construction materials, labor, transaction costs, etc., affect the final cost-effectiveness ratios differently across provinces. Differences in central and provincial management and in the institutional capacity to deliver the program at the provincial level may also influence these ratios, as may differences in geographic, security and economic conditions across provinces. With respect to the demand side, cultural, security, demographic (e.g. the number of girls of school age in the village) and socio-economic conditions influence parents' decisions to send their girls (and boys) to school. Girls may also not have attended because the schools built by EQUIP were only for boys (against the agreed gender-based criteria of the program).

Moreover, it is worth noting that certain provinces sit at the highest or lowest levels of the cost-effectiveness curves. For example, provinces like Kandahar, Kabul City, Saripul and Takhar are very cost-effective, whereas in Bamian, Urzgan, Kabul Province and Paktika the cost-effectiveness ratios dramatically exceed the overall ratio of the Program. Kandahar is an interesting case, given that it is a province in a highly insecure region, yet its investments seem to be more cost-effective. Although the cost-effectiveness analysis could not explain the reasons for the differences across provinces, it does provide the basis for further study. Differentiated education delivery strategies may be required between secure and insecure regions, easy versus hard-to-access terrain and road infrastructure, and/or different levels of institutional capacity and governance vulnerability.

Comparison of Pre- and Post-Densification (Scale-Up) of EQUIP

The pre- and post-densification of EQUIP I was marked by two phases with two different intervention strategies.
During phase I, NGOs were engaged in building and rehabilitating schools, and some of them also provided external support to provincial and district education departments for construction. Phase II, which is still under completion, was led by the MOE through its procurement department and the EQUIP offices at the provincial level. Under phase one of the program, the 13 provinces of Badakhshan, Bamyan, Helmand, Kabul City and Kabul Province, Kandahar, Kapisa, Khost, Logar, Nooristan, Paktika, Parwan and Zabul were covered. Phase two activities were extended to another 13 provinces of the country: Baghlan, Daikundi, Farah, Faryab, Kunduz, Kunar, Laghman, Nimroz, Nangarhar, Samangan, Sar-i-Pul, Takhar, and Uruzgan.

An additional caveat to account for when comparing the first and second construction phases of EQUIP I is the difference in school completion between the phases. Most of the projects initiated in phase I are completed (82 percent of those that have received at least the first installment). In contrast, only 11 percent of the schools being constructed or rehabilitated under phase II had been completed by May 2009. Thus, to account for this difference in implementation completion across the two phases, the final cost-effectiveness analysis factors in the levels of completion (100 percent, 75 percent and 50 percent). As an example, if the completion difference were not taken into account, the first construction phase of EQUIP would appear only slightly more cost-effective than phase II, by US$3.4 (see Figures 1.12 and 1.13). This comparison would not be conclusive because it ignores the additional cost and effectiveness to be added after the completion of phase II, unless we assume that both the cost and the number of students would increase proportionally (these schools may expect an additional increase in the number of students upon completion, as well as possible cost increments relative to the earlier construction period).

Figure 1.12: EQUIP Phase I - Cost-Effectiveness Analysis (Total Number of Students) (bar chart of cost-effectiveness ratios by province). Source: EQUIP Database

Figure 1.13: EQUIP I Phase II - Cost-Effectiveness Analysis (Total Number of Students) (bar chart of cost-effectiveness ratios by province). Source: EQUIP Database

Taking into account the delays in school project completion, Tables 1.3 and 1.4 display cost-effectiveness ratios (or cost per capita, for the purposes of this study) for both phases, differentiating between levels of completion (100 percent, 75 percent and 50 percent). These tables suggest that, at 100 percent project completion in both phases, phase II is less cost-effective because of the higher cost attributed to a marginal increase in the number of students enrolled. In conclusion, it seems that so far phase II has been less cost-effective than phase I: the construction cost per enrolled student is US$35 in phase I and US$50.6 in phase II, a difference of US$15.6.
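A minimal sketch of how the completion-differentiated ratios in Tables 1.3 and 1.4 below could be derived from the same hypothetical school-level extract used above; the column names are illustrative, and completion_pct is an assumed field recording each school project's physical completion:

    import pandas as pd

    # Hypothetical school-level extract; column names are illustrative only.
    schools = pd.read_csv("equip_ig_schools.csv")
    # columns: phase (1 or 2), amount_paid_usd, students_enrolled, completion_pct

    def cost_per_capita(df: pd.DataFrame, min_completion: float) -> float:
        """Grant dollars paid per registered student, restricted to school
        projects at or above the given completion threshold."""
        done = df[df["completion_pct"] >= min_completion]
        return done["amount_paid_usd"].sum() / done["students_enrolled"].sum()

    # The 100/75/50 percent thresholds mirror the completion columns of Tables 1.3 and 1.4.
    for phase in (1, 2):
        subset = schools[schools["phase"] == phase]
        for threshold in (100, 75, 50):
            ratio = cost_per_capita(subset, threshold)
            print(f"Phase {phase}, completion >= {threshold}%: US$ {ratio:.1f} per student")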
Table 1.3: Pre-Densification (Phase I)
                         100%          75%           50%
Number of Students       353,074       371,376       386,806
Number of Girls          163,961       168,642       177,712
Number of Boys           189,113       202,734       209,094
Gender Ratio             0.87          0.83          0.85
Number of Schools        242           278           291
Amount Paid (US$)        12,344,143    13,410,772    13,887,382
Cost per capita (US$)    35.0          36.1          35.9
Source: EQUIP Database

Table 1.4: Post-Densification (Phase II)
                         100%         75%          50%
Number of Students       43,162       175,930      213,366
Number of Girls          26,895       79,678       91,129
Number of Boys           16,267       96,252       122,237
Gender Ratio             1.65         0.83         0.75
Number of Schools        55           184          240
Amount Paid (US$)        2,184,619    3,506,919    4,663,238
Cost per capita (US$)    50.6         19.9         21.9
Source: EQUIP Database

The exact reasons for the different per-enrolled-student infrastructure costs between phase I and phase II need to be studied further. One differentiating factor, as noted before, is the different management strategies: NGO-supported versus directly managed by the MOE. However, further analysis is needed to see whether management differences account for the change. Other possible reasons are differences in school demand, increased construction costs, or even differences in governance and other vulnerability issues between the provinces that participated in the two construction phases. All of these need to be studied further, especially for those provinces where the differential dramatically exceeds the average cost-effectiveness of the program.

Comparison of Before and After EQUIP

Ideally, one would have estimated the effect of the Program by asking whether students benefited by EQUIP are currently in a better situation than they would have been had they not been covered by the Program--specifically, whether without the EQUIP construction program there would be fewer students enrolled in benefited schools today. Nonetheless, the lack of baseline data and of a control group (a comparison of enrollment increments in similar schools not benefited by EQUIP infrastructure) does not allow a counterfactual comparison to determine the impact of EQUIP. Thus, only an approximation of a before/after analysis is carried out to assess the effectiveness of the Program in comparison with the pre-EQUIP situation in the intervened communities and schools. The underlying assumption of this analysis is that the situation of the treatment group of students did not otherwise change over time. The before/after approximation attempts to measure the different levels of access to education for the treatment group (student enrollment in EQUIP-benefited schools/communities). However, it is subject to bias with respect to observable and unobservable factors that may have affected the treatment group apart from EQUIP. So, whereas it can measure, for example, whether the treatment group is in a better situation than it was before, it cannot determine what benefit was introduced by the Program independently of other factors (increased enrollment even without improved infrastructure, other education support programs in the communities, etc.).

To approximate the before/after estimator without baseline data, we used the MICS 2003 enrollment statistics and assumed that the proportion of students currently benefited by EQUIP was the same as in 2003. Figure 1.14 reports the percentage of students enrolled in basic education who were benefited by EQUIP across provinces. Out of 5,477,362 students enrolled in basic education in 2008/2009, 10 percent were benefited by EQUIP.
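A minimal sketch of this before/after approximation and of the sensitivity exercise reported in Table 1.5 below, under the stated assumption that EQUIP's share of basic-education enrollment in 2003 equaled an assumed value; the 2003 enrollment total used here is an illustrative placeholder, while the two "after" figures are the Program's reported counts:

    # Before/after approximation for EQUIP-benefited enrollment.
    AFTER_FIRST_INSTALLMENT = 548_659   # students in schools with at least one grant installment
    AFTER_COMPLETED = 396_236           # students in schools with completed projects
    BASIC_ENROLLMENT_2003 = 1_656_000   # illustrative 2003 basic-education enrollment (from MICS 2003)

    # "Before" assumes EQUIP's 2003 share of enrollment equaled an assumed proportion.
    for share in (0.01, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.33):
        before = share * BASIC_ENROLLMENT_2003
        print(
            f"share={share:.2f}: before={before:,.0f}, "
            f"gain (first installment)={AFTER_FIRST_INSTALLMENT - before:,.0f}, "
            f"gain (completed)={AFTER_COMPLETED - before:,.0f}"
        )

Table 1.5 reports the Program's corresponding figures for each assumed share, including the negative differences that appear for completed projects once the assumed share reaches 25 percent.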
Figure 1.14: Proportion of Students Enrolled in Basic Education Benefited by EQUIP in 2009 (bar chart of the EQUIP-benefited share of basic education enrollment, by province and for the Program total). Source: EQUIP Database (2009) and MICS (2003)

Assuming that this 10 percent represented the size of EQUIP in terms of enrollment in 2003, everything else being equal, there was an improvement of 382,736 students in basic education enrollment after EQUIP. Although we cannot attribute this increment to the Program, the approximation suggests that access to basic education is in a better situation than before EQUIP.

Table 1.5 shows a sensitivity analysis of the main assumption above, used to estimate different scenarios for the "before-the-Program" estimate. Furthermore, the before/after estimates distinguish between schools that received the first installment of the Program and those that have completed their projects. For completed projects, if the size of EQUIP had been 25 percent (or more) before its implementation, a decrease in enrollment would be perceived.

Table 1.5: Sensitivity Analysis of the Before/After Estimates
Assumed 2003 share   Before     After (first installment)   After (completed projects)   After - Before (first installment)   After - Before (completed projects)
0.01                 16,564     548,659                     396,236                      532,095                              379,672
0.05                 82,822     548,659                     396,236                      465,837                              313,414
0.10                 165,923    548,659                     396,236                      382,736                              230,313
0.15                 248,466    548,659                     396,236                      300,193                              147,770
0.20                 331,287    548,659                     396,236                      217,372                              64,949
0.25                 414,109    548,659                     396,236                      134,550                              -17,873
0.30                 496,931    548,659                     396,236                      51,728                               -100,695
0.33                 548,659    548,659                     396,236                      0                                    -152,423
Source: EQUIP Database (2009) and MICS (2003)

III. Conclusions and Recommendations

Despite data availability constraints, this annex has attempted to evaluate the benefit of investing in gender parity targeting and to assess the cost-effectiveness of the Program. Our results suggest the following:

Strengthened compliance with the gender-equity criteria of investment projects. Under EQUIP I, priority was to be given to: (a) girls' schools; (b) schools with both boys' and girls' sections/shifts; and (c) boys' schools that plan to open sections/shifts for girls. However, these priority criteria have not been strictly followed by the Program. Given the high priority the GoA attaches to closing gender education gaps (as stated in the ANDS and NESP), monitoring adherence to the gender targeting criteria is highly recommended.

National girls' enrollment increments hide differences in gender targeting among provinces. Although the national numbers on increments in female enrollment are positive, further analysis is needed at the provincial level to identify those provinces where closing the gender gap has become more difficult, and differentiated interventions addressing regional or province-specific obstacles need to be identified and implemented. Comparing the targeting of infrastructure during the first and second (nationwide scale-up) phases of EQUIP, less targeting of girls' enrollment was observed. Furthermore, a leakage rate of 25 percent of completed schools suggests that resources were not effectively allocated to close the gender gap.

Need to address supply and demand obstacles to increase girls' enrollments.
Significant progress on closing the gender gap could be attained by addressing the current leakage, which suggests that boys' schools have difficulties incorporating female students or have not complied with their agreement to incorporate girls. To complement infrastructure investments, EQUIP II and other donor- and GoA-financed programs need to support additional interventions to overcome obstacles to girls' school enrollment (some of these obstacles are stated in Table 1.2 of this Annex).

Supply factors creating obstacles to girls' enrollment, related to the heterogeneity of Afghanistan, should be further explored. The correlation analysis of insecurity risk, access to drivable roads and school infrastructure, and other factors suggests that supply factors and exogenous variables are more closely associated with increasing girls' enrollment than demand factors. This will require further investigation and data collection to improve the current supply strategy, taking into consideration the differences among provinces in terms of security and access to public infrastructure.

Variability of cost-effectiveness, including management, community participation and even the reliability of the data provided, should be studied. The variability of cost-effectiveness ratios across provinces requires further investigation to identify why provinces like Kandahar, Kabul City, Saripul and Takhar are very cost-effective, whereas in Bamian, Urzgan, Kabul Province and Paktika the cost-effectiveness ratios dramatically exceed the overall ratio of the Program. A further study analyzing EQUIP's investment achievements, differentiated by province, would help to differentiate delivery strategies.

The NGO-supported phase I appears to have been more cost-effective; the role of NGOs, as well as other education sector management processes, should be further explored. It seems that so far phase II has been less cost-effective than phase I. It would be advisable to improve the effectiveness of the program, in terms of enrollment, for those school projects still under implementation. Although the role of the different management modalities of the two construction phases of EQUIP I (NGO-supported and MOE-managed) has not yet been fully studied, institutional, governance and management issues at the provincial level should be explored, including financial and procurement management. An expenditure tracking analysis is also recommended.

Tools to show the relation of education programs to improvements in national education indicators (causality linkages) must be strengthened. Assuming that 10 percent represented the size of EQUIP in terms of basic education enrollment in 2003, everything else being equal, there was an improvement of 382,736 students in basic education enrollment after EQUIP. Although we cannot attribute this increment to the Program, the approximation suggests that access to basic education is in a better situation than before EQUIP.

Monitoring and Evaluation improvements are imperative. It is necessary to improve the current M&E system of the Program in order to better track the implementation and effectiveness of the program on a yearly basis. Schools need a unique ID for both school grants (the school infrastructure grant and the school quality enhancement grant) in order to assess potential synergies between the components of the program and to identify the most cost-effective component or group of components.
Also, a baseline study and an identification of a control group for EQUIP II will ensure the accuracy of the cost-effective analysis, especially in terms of causality 50 or impacts attributed to the Program. Finally, a learning achievement test needs to be conducted to assess the effectiveness of the program in terms of quality of education. 51 Annex 4. Bank Lending and Implementation Support/Supervision Processes (a) Task Team members Responsibility/ Names Title Unit Specialty Keiko Miwa Senior Education Economist SASHD Team Leader (I) Habibullah Wajdi Education Specialist SASHD Education Sheila Braka Musiime Senior Legal Counsel Legal Manoj Agrawal Consultant SARFM Financial Management Irajen Appasamy Senior Operations Officer SASHD Operations Mahmoda Sonia Eqbal Consultant SASHD Monitoring Evaluation Deepal Fernando Senior Procurement Specialist SARPS Procurement Julie-Anne M. Graitge Program Assistant SASHD Administration/Operations Shawkat M.Q. Hasan Senior Procurement Specialist AFTPC Procurement Nadia Hashimi Consultant SASDI Education Mostaeen Jouya Education Specialist SASHD Education Hasib Karimzada Program Assistant SACAF Administration/Operations Scherezad Joya Monami Latif Senior Education Specialist ECSH2 Team Leader(II) Pema Lhazom Operations Officer SASED Operations Veronica Minaya Lazarte Consultant SASHD Monitoring & Evaluation Hena G. Mukherjee Consultant SASHD Higher Education Senior Financial Management Kenneth O. Okpara SARFM Financial Management Specialist Senior Institutional Development Joel E. Reyes SASHD Team Leader(III) Specialist Grant G. Sinclair Consultant SASHD Operations Shobhana Sosale Senior Operations Officer AFTED Operations Venkatesh Sundararaman Senior Economist SASHD Economist S. R. Tiwari Consultant SASHD School Construction Rahimullah Wardak Procurement Specialist SARPS Procurement Md. Mokhlur Rahman Consultant SASHD Operations David Freese Senior Disbursement Officer LOAC Disbursement M. Wali Ahmadzia Disbursement Analyst LOADM Disbursement Patricia Tibbetts Consultant SASHD Education Darlyn Meza Consultant SASHD Project Management Teresa Campos Consultant SASHD Operations Tina Lagdamen Consultant SASHD Teacher Education Palwasha Andisha Team Assistant SACAF Procurement 52 (b) Staff Time and Cost Staff Time and Cost (Bank Budget Only) Stage of Project Cycle USD Thousands (including No. of staff weeks travel and consultant costs) Lending FY04 8 121.99 FY05 3.29 FY06 0.00 FY07 0.00 FY08 0.00 Total: 8 125.28 Supervision/ICR FY04 0.00 FY05 25 106.54 FY06 40 138.97 FY07 58 235.98 FY08 19 73.21 FY09 27 0.00 Total: 169 554.70 53 Annex 5. Beneficiary Survey Results Background A beneficiary assessment was conducted as part of the needs assessment of the M&E/MIS system of the EQUIP program. This stakeholder consultation, supported by the Ministry of Education of Afghanistan and the Bank's Education Sector Unit based in Kabul, was based on an extensive number of interviews conducted at the central and sub-national level. Since the nature of this assessment was qualitative so for data collection in-depth interviews were used as data collection methodology. At the central level, the EQUIP Coordination Unit, the Department of Planning, the Department of Finance, the Department of Procurement, the Teacher Education Department, and implementation partners (i.e. BESST Program) participated in the initial process of identifying and defining the information sources from which data regarding the measurement of indicators is retrieved. 
Also, as part of our institutional assessment, two provinces were selected for the study, and within each province two districts, and within each district two schools. From May 31 to June 13, 2009, two field missions to Bamyan and Herat were organized, in which we interviewed the head of the Education Department and EQUIP officers at the provincial level; the head of the DED and supervisors/inspectors at the district level; and school principals, teachers, school shura members and class representatives at the school level. Moreover, parents and community shuras also participated at this stage of the study, so that we covered all the key stakeholders who are (or should be) involved in the education process.

The total number of interviews for this study was 93. In total, 24 group interviews and 69 one-to-one interviews were conducted in the 2 provinces of Herat and Bamyan. In Kabul, at the MOE level, 14 interviews were administered before organizing the field work in these 2 provinces. At the school level, 12 group interviews and 8 one-to-one interviews were conducted with school shura members, teachers and class representatives. At the community level, 8 group interviews, 18 one-to-one female interviews and 31 one-to-one male interviews were conducted with community shura members and with female and male heads of households, respectively. At the district level, 8 one-to-one interviews were conducted with the heads of the District Education Departments and school inspectors/supervisors. At the provincial level, 4 group interviews and 4 one-to-one interviews were conducted with EQUIP social mobilizers and engineers, and with the EQUIP officer and the head of the Provincial Education Department.

The selection of schools was based on pre-defined distance variations from the District Education Department (DED), i.e. one school at the district center and one around 10 km away from the center. At the school level, class representatives and teachers were selected by the EQUIP social mobilizers and the school principal, and at the community level parents were randomly selected by enumerators and EQUIP social mobilizers.

Conclusions

The community-managed school approach has encouraged participation from all sectors of the beneficiary communities. Parents evaluated the schools positively, and children are happy to have access to an education closer to their homes. Education is valued by parents, teachers, children and community members, and the idea of education as being useful, good and for the benefit of future generations prevails. Quality Enhancement Grants motivate the community to raise counterpart funds for school improvement plans; contributions range between 20% and 30%. Based on their own experience, PTA and SMC members considered that closer coordination is needed to ensure that schools receive MOE support through the PEDs and DEDs to guide them in this new approach. School shuras work effectively with the communities: "Shuras work as a bridge between the people and the school," many community members remarked. Although parents are optimistic about the economic value of education for their children, teachers are less optimistic about the economic outcomes of education, due in part to the disparity they see between their own education levels, their access to hands-on training and their salaries.
Parents consider that "new teaching to put the students in the center of education is good". Although they stress the importance of "bringing in more professional teachers and to build the capacity of teachers already in the schools". There is a consensus among School Principals, Teachers and Shuras that Training on Psychology would be beneficial to learn given the distress in which the communities live. Children of different ages (overage) in the same grade, is considered normal. No one considers age differences as an obstacle to learn but would prefer to have different grades and teachers for different ages. The following tables present samples of opinions expressed by sub-national, community and school beneficiaries: What is Good and What is Missing From Your School? - Responses by Students Good " Education is good, the teachers are present, motivating and try their best to make us understand" "If we have an opinion to express, the student representative will go and talk to the principal to share". They also expressed that "Our teachers teach us well and explain more". Overaged students's feelings on the education they received: "The good grasp of what is taught to us gives us the notification that we receive good education". Missing "A well for drinking water, books, furniture and labs are needed, and also internet". "More technical material to schools is needed. It is not enough to write that H2O is water, we need labs to see the formula". "We need more classrooms, more teachers, more books, some furniture and stationary to learn better". 55 Comments from School Teachers According to teachers, "Lesson Plans need to be improved and new methodologies introduced to better educate our students". Given the weather constraints, teachers suggest that "winter courses for 2 months need to be introduced given that it gets very cold here, and children can only study 6 months a year". Regarding cultural approaches and attitudes, some School Shuras expressed that "most of the people are illiterate in the community and this affects our economy, security and awareness behavior. There is a huge difference between someone who is literate and another one who is not". Parental and Teachers Opinions on Gender and Education Parents and teachers value basic education for both boys and girls. When asked better that boys go to school than girls, the overall response is that "Boys should study up to 18 years and more ­ this country can only be built by educated people. We want the same for girls. We want them to become doctors and midwives to treat our women. Girls also have the right to get education". When asked about the subjects students should learn, in general the responses echo that "We expect that they should learn grammar, languages, computer, math and science". PEDS and DEDs Staff Comments PED and DED's highlighted the accomplishments of EQUIP's contribution to schools, the communities and the students as the final beneficiaries, and suggested improvements in coordinating efforts between MOE and the PED's and DED's to work closely with Shuras. "Security issues are affecting our monitoring process, the tribal and ethnic conflict between shuras need to be considered when setting up PTAs and SMCs as well". In terms of supervision, PEDs and DED's mentioned that "long distances, lack of proper facilities and weather conditions affect engagement with the communities. 
We have computers but no electricity, and the motorcycles do not work so we have no means of transportation to reach the communities as often as they would need us present". In terms of the remuneration received, it is suggested that "the education budget be increased to put more professional people in the schools, and to increase teachers' salary. In terms of overall management, PED's and DEDs emphasized the importance of "proper and on-time allocation of funds to ensure implementation success". Training of PED's and DED's was also mentioned as needed. Social Mobilizers Comments on EQUIP and Community Participation When asked about their work, Social Mobilizers expressed their enthusiasm to contribute in encouraging community participation by setting the foundations for community organization. "Thanks to EQUIP students have school buildings ­ not in tents anymore- they have labs, libraries, and in some places they have computers, which promote the quality of education they are receiving". However, some constraints they have faced are related to working mechanisms that require fine- tuning. "Staff transportation is a problem" some expressed. "When the money is not given on time, we face problems with the community. For our own office, the money does not come on time to cover our expenses. Sometimes we do not get our salary on-time either, and all of this causes problems". "The construction of schools should be based on a needs assessment of the school, and professional teachers should be sent, along with books and salaries delivered on time" they recommended. "There is no specific consideration based on different needs of different provinces" they responded when asked about Grants allocation monies. Also, the number of classrooms per students is not realistic". When asked for their recommendations on improving the planning and overall mobilization approach, their suggestion is that "School materials need to be provided, and salaries paid on-time. Also, long-term workshops need to be considered for teachers and the security risks need to be taken into account". 56 Annex 6. Stakeholder Workshop Report and Results Three separate workshops were held in Kabul with stakeholders of the Ministry of Education. One focused on EQUIP M&E, the second one on social mobilization and community participation, and the third one on MOE alignment and integration for community participation and education quality. The fourth workshop was conducted by the MOE, and supported by the ICR team, for the validation with MOE stakeholders of the National Implementation Completion Report (for the summary of the NICR see Annex 7). 
In summary, the main messages of the first three workshops on social mobilization and community participation and the institutionalization of community participation and education equality are: Salient Points of M&E Beneficiary Assessment Workshop, July 2009 The main objectives of the workshop "EQUIP II M&E for better Basic Education Policy, Planning, and Results" were: (i) validating the set of indicators currently used to measure inputs, outputs, and outcomes of EQUIP; (ii) identifying the need of measuring more indicators and improving data collection methods; (iii) identifying and validating the current information flow and data collection, reporting and dissemination processes from schools to EQUIP/MOE; (iv) preparing M&E Plans to track results of access to education, community participation and teacher training; (v) preparing a vertical and horizontal integration strategy embedded in a integrated information platform; and (vi) preparing an Action Plan to institutionalize the EQUIP II M&E system. The stakeholders were selected based on their significant role in planning, strategic management and policy making: (i) stakeholders working on promoting access to basic education under EQUIP; (ii) stakeholders working on community mobilization activities involving school shuras to participate in EQUIP school grant activities; (iii) stakeholders working on strategic planning and information system at the MOE level; (iv) stakeholders working on teacher training under EQUIP at the policy, planning and management level; and (v) stakeholders with vast experience in improving teacher performance. Thus, the EQUIP Coordination Unit and the World Bank organize this workshop for representatives of the Teacher Education Department, the Planning Department and the EQUIP Coordination Unit at the MOE, and Creative Associates representatives as the implementation partner of the teacher training component. Summary of the workshop conclusions and recommendations follows: Community Participation. There is a lot of information on community participation and social mobilization activities collected and available at the school and provincial level that does not reach the central level (EQUIP/MOE). Output and outcome indicators of community participation, school shuras and grants indicators were identified and discussed. The EQUIP coordination team with Bank's support agreed on holding a workshop on community participation. Teacher training. Given that the implementation partner (USAID/BESST) has an M&E system in place, the forms used to collect teacher training information as well as specific indicators were discussed in order to initiate the process of building an integrated M&E system between the Teacher Education Department (TED) and its implementation partner. TED and BESST agreed on conducting a review workshop to have a final agreement on indicators and collection/reporting/analysis tools that to be used for the DT3. 57 Access to Quality of Education. It was identified the need of improving the outcome indicators to measure quality of education. New indicators and new data collection methods were discussed to measure net enrolment, drop-out and repetition rates and student learning achievements. The importance of going further on developing the student registration database was also discussed as an option to improve the credibility of enrolment rates. Data Integration. Action plans to improve information and data integration are needed. 
The main gaps on the ideal integration situation between EQUIP and EMIS and TED and EMIS were discussed and identified. On the issues of EMIS integration, it was agreed on exploring new web technologies to support the web-based integration among the stakeholders. Issues of M&E organizational development and capacity building were also covered in these discussions. Salient Points of Social Mobilization Workshop August 2009 The main objectives of the workshop "Social Mobilization in Afghanistan" were to: (i) validate the lessons learned on community participation, school based management and school grants during the implementation of EQUIP I; and (ii) identified the next steps to strengthen the social mobilization teams in the education sector of Afghanistan. Workshop participants were selected from the Community Mobilization team based in Kabul in the MOE offices, and representatives from near-by provinces that could travel to Kabul safely to participate in the workshop. The key conclusions of the workshop were: With the Program a national team of 78 professionals has been created, and these are organized and distributed at the central and provincial level, and attributed the tasks of creating awareness and orient on the community participation process at the district level and with the school principals. Although their role has been crucial throughout the implementation, it is important that their roles and responsibilities be reviewed and adjusted so that these professionals can engage in monitoring and follow-up activities of the School Improvement Plans. Given that the functions of social mobilization are strategically important for the MOE within the community participation context, it is necessary that these efforts be given a specific placement within the structure of the MOE to institutionalize the activities carried out within the context of social mobilization. Although the quality enhancement grants have contributed positively in terms of encouraging the participation of communities in their decision-making process to improve the school facilities, it is important to define a strategy that will ensure continuity of technical and financial support by providing resources that will guarantee the sustainability of the grants so that the efforts are not perceived as a specific intervention but are rather perceived as a part of a systematic and permanent process. Salient Points of MOE Integration for Education Quality Workshop and Community Participation August 2009 The objectives of the workshop "MOE Integration for Education Quality and Community Participation" were: (i) to discuss the linkages between community participation and education quality in the education sector of Afghanistan; (ii) find synergies between the different MOT departments which guide and provide services to improve the access to quality education; (iii) generate a common vision of education quality and community participation. The stakeholders were selected based on their significant role in planning, strategic management and policy making related to teaching and learning quality, as well as direct community and school support. 
58 Salient Points of MOE Integration for Education Quality Workshop and Community Participation August 2009 Thus, workshop participants included the Primary and Secondary Education Departments, the Teacher Education Department, the Planning Department, the Curriculum Department, the EQUIP Coordination Unit at the MOE, the central team of Community Mobilizers, and representatives of Provincial and District level community mobilization specialists. The key conclusions of the workshop were: MOE is carrying out various reform processes in terms of curriculum, teacher development in pre- service and in-service for basic and higher education. These reform processes need to align themselves within each of these departments at the central, provincial and district level so that they can reach the schools in an integrated manner, be more effective and be supported by the community participation efforts. The concept of education quality has to be shared by the MOE staff members leading the reform processes because it is evident that there are different visions of it, but most importantly is that all the processes of education quality reform acknowledge that at the center of all these reform processes the main beneficiaries are the students. In the emergency and reconstruction context of Afghanistan, it will be necessary to find innovative and flexible models of teacher formation and training, especially for female teachers in provinces where the enrollment rates of girls are the lowest. This could have a two-fold benefit, first to generate parents' trust to send their daughters to school, and secondly, to provide a learning space for girls in those provinces. In general, community participation has been valued as crucial support in the expansion of education coverage. However, in Afghanistan the experience has gone one step further whereby the participatory processes were created to support education quality, therefore it is important that these efforts be strengthened through the provision of trainings focused on parents with informational content that can be easy to understand and applied about education quality in the schools. In some provinces of Afghanistan the reality of the conflict is latent; therefore, it will be important that the MOE invests committed efforts to learn how best the children in those provinces learn within the context of conflict so that MOE can develop teaching and learning materials specifically designed within such context. 59 Annex 7: Summary of Borrower's ICR and/or Comments on Draft ICR This Annex presents a summary of the MOE's NICR, detailing the overall achievements of the Education Quality Improvement Program (EQUIP I) from August 2004 ­ 30 July 2009; its US$79 million were financed by IDA and ARTF. EQUIP I started in 2004 with initial funding of US$ 35 million by IDA while the ARTF funds were provided in three tranches: US$ 5 million in June 2005, US$ 27 Million in June 2007 and US$ 12 Million in April 2008. EQUIP I is being implemented in two phases - Phase one and Phase two. EQUIP I activities were implemented in 26 provinces of the country. The Ministry of Education official letter confirming its review of the Bank's ICR for EQUIP I and proposing follow up steps and a Task Force to review issues raised and recommendations is included at the end of this annex. Project Objectives and Targets EQUIP aims to improve the quality of educational inputs and processes as the foundation for a long-term strategy to enhance the quality of educational outcomes. 
This will be achieved through: (a) a focus on schools and communities to strengthen their capacity to better manage teaching-learning activities; (b) investment in human resources (teachers, principals and educational administration personnel) and physical facilities; and (c) institutional development of schools, DEDs, PEDs and the MOE. The program also aims to promote education for girls by giving priority to female teachers and students within each component activity. As a national program, EQUIP sought to influence sector-wide indicators related to access and student learning. Thus, general country-wide data were monitored for national trends in education indicators during the implementation of EQUIP I, with 2004 as a baseline.

Sector-Wide Education Indicators within EQUIP I
Objective: Increased Basic Education Access
  Indicators: Enrollment growth of male and female students (note: given the lack of a population census to measure net and gross enrollment rates, only overall enrollment growth each year was monitored); Number of female teachers
Objective: Improved Quality of Teaching and Learning
  Indicator: 3rd Grade Reading and Numeracy Assessment (note: national learning assessments for basic education are yet to be developed)
Objective: Education Sector Capacity Building
  Indicators: Functioning Education Management Information System; Education Strategic Plan guiding the system; general restructuring and strengthening of the MOE

Access Indicators

Enrollment Growth in Afghanistan 1383-1388 (2004-2009)
Year           Total       Overall Growth from Previous Year   Male        Female      Female as % of Male   Female as % of Overall Enrollments
1383 (2004)    3,974,704   (baseline)                          2,652,751   1,321,953   50                    33
1384 (2005)    4,880,634   23%                                 3,200,764   1,679,879   52                    34
1385 (2006)    5,435,075   11%                                 3,515,661   1,919,414   54                    35.3
1386 (2007)    5,675,951   4%                                  3,667,862   2,008,089   55                    35.3
1387 (2008)    6,200,000   9%                                  3,982,573   2,217,427   56                    35.7
1388 (2009)    6,955,845   12%                                 4,439,683   2,516,162   58                    36.1
Source: MOE Planning Department

Availability of Teachers in Afghanistan 1383-1388 (2004-2009)
Year           Total     Overall Growth from Previous Year   Male      Female    Female as % of Male   Female as % of Overall Teaching Force
1383 (2004)    122,910   (baseline)                          88,802    34,108    38.41%                27.75%
1384 (2005)    128,400   4.47%                               92,260    36,140    39.17%                28.15%
1385 (2006)    136,503   6.31%                               98,083    38,420    39.17%                28.15%
1386 (2007)    142,508   4.40%                               103,047   39,461    38.29%                27.69%
1387 (2008)    152,281   6.86%                               107,748   44,533    41.33%                29.24%
1388 (2009)    164,771   8.2%                                116,298   48,473    41.68%                29.42%

Learning Quality Indicator Progress

Basic Reading and Numeracy Assessment of Students, Grade 3
Year: 1387 (2008). Indicator: Not Available. Explanation: The design and application of the National Reading and Numeracy Assessment for Grades 1-3 has not been completed; it is expected to be completed by the end of 2009.
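The derived columns in the two tables above (year-on-year growth, female enrollment as a share of male enrollment, and female enrollment as a share of the total) follow directly from the published counts; a minimal sketch of that arithmetic, using the enrollment rows as input:

    # Recompute the derived columns of the enrollment table from the published counts.
    rows = [
        # (year, total, male, female)
        ("1383 (2004)", 3_974_704, 2_652_751, 1_321_953),
        ("1384 (2005)", 4_880_634, 3_200_764, 1_679_879),
        ("1385 (2006)", 5_435_075, 3_515_661, 1_919_414),
        ("1386 (2007)", 5_675_951, 3_667_862, 2_008_089),
        ("1387 (2008)", 6_200_000, 3_982_573, 2_217_427),
        ("1388 (2009)", 6_955_845, 4_439_683, 2_516_162),
    ]

    previous_total = None
    for year, total, male, female in rows:
        growth = "(baseline)" if previous_total is None else f"{total / previous_total - 1:.0%}"
        print(
            f"{year}: growth={growth}, "
            f"female as % of male={female / male:.1%}, "
            f"female as % of overall={female / total:.1%}"
        )
        previous_total = total

The same arithmetic applies to the teacher-availability table.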
The MOE continues to strengthen its policy, A new "tashkil" for the programming and management capacity, MOE Restructuring MOE was approved and including focus at the provincial and district operationalized levels. Implementation of Pay and Grade on- going PROGRAM COMPONENTS DESCRIPTION AND IMPLEMENTATION PROGRESS Component 1: School Grants Grants to schools are designed to achieve two complementary objectives through a greater emphasis on school- and community-based decision making. First, school grants will support efforts to improve teaching and learning and create effective school environments. Second, school grants will support the improvement of basic school facilities in existing government registered primary, middle and secondary schools with teachers on payroll. The agreed indicators, targets and progress to-date for this component are included in the table below: Component 1.1: School Grants for Quality Enhancement Summary of Implementation Progress and Results In general, the implementation of this sub-component of the program is progressing very well. Despite several constraints i.e. difficulties in funds flow, security, etc. the Ministry of Education had remarkable success in mobilizing communities to establish School Management Committees (SMCs) and Parent Teacher Associations (PTAs). Field observations, in general, indicate strong commitment and ownership by the functioning SMCs/PTAs. The EQUIP provincial and central social mobilization teams made extraordinary efforts to carry out all key activities such as community mobilization and awareness raising programs for a considerable number of schools, training of stakeholders and orientation to the Ministry of Finance (Mostofiates) officials on fund disbursement. By end of June 2009, a total of 5,827 SMCs were established in 26 target provinces. These SMCs have actively been involved in implementing the quality enhancement grants for their respective schools. Following some preparatory activities (school survey, awareness raising, community mobilization and training) 5,827 School Improvement Plans (SIPs) in consultation with SMCs were developed and approved by District Education Departments and Provincial Education Departments. A total of 3,936 schools received Quality Enhancement Grant. (US$ 17,360,085 spent under this category ­ US$ 11,527,864 was directly 62 disbursed to schools and the remaining amount for operation cost of NGOs and trainings). For the purpose of sustainability, all activities related to community mobilizations, preparation of SIPs and implementation of quality enhancement grants have been jointly implemented with DEDs and PEDs. In short, the quality enhancement grants reached 4,112,923 students (1,477,944 girls and 2,634,979 boys) in 26 provinces. Component 1.2: School Grants for Infrastructure Development School Grants for infrastructure development will be provided, based on priority criteria, for the repair and rehabilitation of school facilities. These grants would also finance limited new construction of school buildings that are already registered with MOE. The communities in difficult access areas may follow traditional approaches to school design and construction methods. The program will assist MOE in reviewing technical approaches and developing guidelines on appropriate school building options. Support to communities for use of an improved building model will be provided by the Department of Construction in a limited number of selected provinces and by NGOs in other provinces. 
Summary of Implementation Progress and Results

In collaboration with the Construction, Procurement and Finance departments of the Ministry of Education, EQUIP made outstanding progress in the implementation of this sub-component. Standard cost-effective designs with upgraded local technology and earthquake-resistant features have been developed with support from EQUIP and are being used. Overall, construction of 822 school buildings, including 24 Provincial Education Department (PED) buildings, was financed under this sub-component. By the end of June 2009, 325 construction activities had been completed and the remainder were under construction. It is planned that all EQUIP I school construction activities will be completed by the end of 1388. The total cost of these projects reached US$ 43,465,438. As a result of this intervention, 718,072 students (405,282 boys and 312,790 girls) were provided with a safe and conducive learning environment, thereby contributing to the enhancement of education quality in these schools.

Component 2: Support to Schools through Institutional and Human Resource Development

This component will provide technical and financial support for training key personnel, to enable them to carry out their respective functions more effectively and to support school-based management. The training of teachers, school principals and staff of PEDs and DEDs will be the primary activity in this component, supplemented by grants provided to PEDs and DEDs for their institutional development (including infrastructure).

Component 2.1: Teacher Training

This component of EQUIP will help to finance the Teacher Education Program, which is jointly led by the Ministry of Education, the Ministry of Higher Education, and the Academic Council of Education and supported by development partners including Denmark, Japan, UNICEF, USAID, and the World Bank. The Ministry of Education is currently developing a national unified teacher training curriculum that is competency-based and modularized. The training is being designed to widen teachers' knowledge of subject content and develop "modern" student-centered pedagogical skills. To ensure the quality of teacher training, this component will provide systemic support for the upgrading and updating of teacher trainers.

Summary of Implementation Progress and Results

Under this sub-component, two modules (INSET 1 and INSET 2) consisting of 52 sessions of 90 minutes each are used for the training of teachers. To date, 32,467 teachers (10,467 female and 22,009 male) have completed INSET 1 under EQUIP I (against a target of 140,000). Another 8,000 (all female) teachers have been trained with support from UNICEF and 3,723 by other agencies. BESST has also trained 50,600 teachers (1,530 female and 5,570 male). INSET 2 has just started, with the training of core trainers from NGOs and Master Trainers from Kabul City, Kabul Province, Kapisa, Logar, Paktika and Parwan.

Component 2.2: Training of Principals

To enhance the capacity of school principals as good managers as well as effective instructional leaders, this component will support the training of school principals. First, it will support MOE in reforming the selection criteria for school principals. It will also help to define the responsibility and accountability structure of school principals, and will develop training modules to match the new functions.
International good practice will be applied in redefining the functions and selection criteria, and in developing training modules. Master trainers will be trained, and trainers from NGOs will also be used to provide training at the province and district levels and to facilitate cross-fertilization, for example through organizing school principals' workshops and study visits to other provinces.

Summary of Implementation Progress and Results

There has been little progress in this sub-component. Activities under this sub-component were tied to the PRR, which was delayed. Contracts with NGOs to implement this component have recently been finalized and signed. Full implementation of this sub-component will start under EQUIP II.

Component 2.3: Capacity Building of District and Provincial Education Departments

The objective of this sub-component is to streamline the functions of the PEDs and DEDs and enhance their capacity to perform these functions effectively. The activities were designed to build the capacity of the Provincial and District Education Departments using a three-pronged approach: (a) strengthening the PEDs through greater devolution of both managerial and financial authority, and organizing PEDs and DEDs in such a way that their effectiveness is enhanced; (b) providing intensive, regular in-country training in areas such as planning, management and administration, recruitment of personnel, budget and financial management, accounting, educational development, and monitoring and supervision of teaching; and (c) providing funding for institutional capacity development and the upgrading of physical facilities.

Summary of Implementation Progress and Results

Little progress was made under this sub-component due to its linkages with the implementation of the PRR. The MOE has begun to implement major public administration and management reform (as outlined in the Five-Year Strategic Plan). A new organizational structure for the entire ministry (center, province, district and school levels) has been developed following an extensive consultation process across the Ministry. Procedures and guidelines have been developed for merit-based appointments, and the recruitment process for PEDs has been completed. The recruitment process for Provincial Deputy Directors and District Managers has also recently been completed. EQUIP technical assistance at the provincial level has been formalized by integrating the EQUIP provincial team into the formal structure (Tashkil) of the provincial education department. EQUIP II will strengthen its technical assistance at the district level by involving district-level supervisors in community mobilization activities and monitoring and evaluation. Efforts will also be made to create synergies between the different components of the program in favor of community empowerment and capacity building at the district level (e.g., District Teacher Training Teams, DT3).

Component 3: Policy Development and Monitoring and Evaluation

The component supported the overall capacity strengthening of the Ministry of Education to improve the delivery of education services nationwide. Given the sector-wide approach to institutional capacity building in the education sector, the detailed strategies received support from various donors and programs, including EQUIP I. The agreed indicators, targets and progress to-date for this component are included in the table below.
MOE Sector Management Capacity Building
- Objective: Policy Development. Indicator: linkage between policy development, program implementation and budget. Progress to-date: the National Education Strategic Plan was prepared through extensive consultation and capacity building within the MOE/Planning Department; the NESP is being updated to align with the completed Afghanistan National Development Strategy and the Result-Based Budget of the Ministry of Finance; EQUIP I (toward its last phase of implementation) has developed tools to align its annual implementation plans to FY financing, approved budgets and results.
- Objective: Monitoring and Evaluation. Indicator: establish and implement a practical monitoring and evaluation system for EQUIP and MOE. Progress to-date: EMIS has collected education sector-wide data, which is updated every year; EMIS reports are being used for planning and monitoring; EMIS requires a hardware and software information technology base to improve data collection, analysis and management use at all levels of the system.
- Objective: Monitoring and Evaluation. Indicator: assess status of teachers. Progress to-date: teacher registration is 85% completed, including the progressive opening of bank accounts for teacher payment; competency assessment preparation is in progress.
- Objective: Monitoring and Evaluation. Indicator: pilot assessment of student learning. Progress to-date: student learning assessment not yet developed; a pilot is expected by the end of 2009.

Component 3.1: Policy Development

Continuing the technical assistance provided under the ongoing education project, this component will support the Ministry of Education to implement a medium-term policy framework and to prepare a sound National Development Budget each year. The component will also help build the capacity of MOE officials to use the Education Management Information System, which the Ministry is currently developing. It will support the annual data collection from schools in order to promote the use of data in planning and decision making. It will also support MOE in undertaking a household survey to assess the status of schooling of children, the direct and indirect costs of schooling, and any other opportunities and constraints concerning education.

Summary of Implementation Progress and Results

National Education Strategic Plan (NESP 1385-1389): the medium-term policy framework developed under the IDA-funded EERDP was used as background for the development of the MOE's Five-Year Education Strategy, prepared with the technical support of UNESCO and IIEP. The strategy has been endorsed by all stakeholders in the sector and outlines eight priority programs for the education sector: (i) general education, (ii) teacher education and working conditions, (iii) education infrastructure rehabilitation and development, (iv) curriculum development, (v) Islamic education, (vi) technical and vocational education and training, (vii) literacy and non-formal education, and (viii) education administration reform and development. With some support from EQUIP, the Ministry of Education develops annual implementation, budget and operational plans each year.

Education Management Information System

In March 2007, EQUIP supported the roll-out of the national survey for 1386. This survey package was created with the full collaboration of the Planning and IT Departments (and technical support from BESST), as well as through wide consultations with line departments and stakeholders who will be using the survey data for policy and implementation purposes.
In mid-2007, the Ministry of Education conducted the school survey, which has now led to the establishment of an EMIS that may be used for establishing baseline data and for future monitoring. A total of 196 survey teams were formed to visit 9,500 schools over a period of 70 days. Each survey team consisted of one Kabul ministry staff member and one monitoring staff member from the Provincial Education Department. Thirty-four supervisory teams were formed, comprising one person from the Planning Department and one from each provincial department. A total of 400 district education staff members participated as facilitators and guides in their respective locations. The surveyors and supervisors were then pooled to form 34 provincial survey teams, one for each province. The number of surveyors for each province was proportionate to the number of schools in the province. The survey report was made available in January 2008.

Component 3.2: Monitoring and Evaluation

The Monitoring and Evaluation (M&E) component, as an integral part of EQUIP, will provide timely and adequate information to stakeholders on implementation, performance, process, outputs and outcomes. It aims to provide timely feedback so that lessons can be learned in time to make mid-course adjustments where necessary. It will put in place a system to monitor and evaluate the implementation of the main interventions: teacher development, school improvement grants, rehabilitation and construction of schools, and policy development. The M&E system would have four components: (a) physical and financial monitoring of implementation; (b) process monitoring; (c) post-implementation monitoring, including monitoring of the sustainability of school grant activities and financial audit; and (d) outcome evaluation. Monitoring and evaluation will be distinct processes. The interventions and achievement of program objectives will be evaluated by a third-party organization, which will submit an annual evaluation report to the Grant Management Unit of MOE and the Ministry of Finance. School grant activities will be monitored by SMCs/PTAs, and supervised by District Education Departments and by NGOs where these work as facilitation agencies. Progress in implementing school improvement plans and the use of grants will be publicly posted at schools and/or locations for community gatherings. Where they exist, Community Development Councils of the National Solidarity Program will play an important role in social audits. All project data and information in the prescribed formats will be transmitted to planning officers in PEDs and the Ministry of Finance (MOF). Planning officers will work closely with District Education Officers to prepare statistical and monthly reports for the GMU and other relevant departments of MOE and MOF. A baseline survey will be conducted at the start of the program, on the basis of which progress will be monitored and assessed against agreed targets. The data from the survey will assist in setting realistic targets to be achieved. Follow-up surveys at the end of year two and annual surveys will be conducted to monitor the cost, impact and implementation of the school-based funding scheme, plus annual updating of the school mapping database and evaluation of construction progress. The program will support the Ministry of Education in carrying out the verification of teachers: their numbers, attendance, and levels of qualifications and skills.
Technical and financial support will be provided to MOE to undertake an expenditure tracking survey to assess the flow of funds, including salary payments, from central ministries to schools. Finally, EQUIP will support the Ministry of Education in carrying out an assessment of learning achievement. The component will support the development of a core team, skilled in student assessment strategies, who will impart these strategies to teachers and principals as part of their training programs. The focus will be on formative assessment to enable teachers to ascertain students' learning progress and constraints on a continuing basis. The core team will also be trained in summative assessment strategies using standardized testing. An appropriate student assessment mechanism will be developed through pilot activities.

Summary of Implementation Progress and Results

Program Monitoring: Satisfactory progress has been made in monitoring and evaluation by the EQUIP team. A Monitoring and Evaluation Section has been established, under the responsibility of the Team Leader, Social Mobilization. This Section consists of four monitoring officers and one support staff member. Since January 2007, this Section has: (a) developed a monitoring checklist, which it has field-tested and revised; (b) carried out initial monitoring in five provinces (Badakhshan, Khost, Logar, Kandahar, and Parwan), covering 435 schools (411 Quality Grants and 24 Infrastructure Development Grants); and (c) prepared monitoring reports for the field visits in these five provinces (findings are incorporated under Sub-component 1.1). With support from EQUIP, a new monitoring system for infrastructure work has also been established and is being implemented during 1387. The new monitoring system utilizes all the engineering manpower of MOE to monitor all education infrastructure work, irrespective of funding sources and implementing partners. Daily, weekly, quarterly and ad hoc reports are generated as a result of this monitoring mechanism and submitted to donors, parliament and others, and are also used for internal ministry purposes.

Evaluation: An external evaluation of the program is planned under EQUIP II, and the financial needs have been reflected in the EQUIP II proposal.

Teacher Registration and Bank Accounts: With support from EQUIP, the Ministry of Education, through its Education Management Information System team, conducted the teacher registration process. By 30 June 2009, 203,725 ministry employees, out of 216,475 teaching and non-teaching staff, had been officially registered as a result of this exercise. A brief summary of achievements is presented below:
- All of the teachers who registered and took the literacy assessment received the 1,300 Afs/month salary increase, effective 1st Jawza 1387 (May 2008);
- Approximately 5,200 MOE employees (4,500 teachers and 700 non-teaching staff) who did not appear for registration were removed from the system; and
- Bank accounts were opened for a total of 175,575 MOE employees and 22,000 teachers from Kabul city, who are now receiving their salaries through direct deposit into their accounts.

Teacher Literacy and Numeracy Assessment: The Ministry of Education, through the Teacher Education Department, conducted a literacy and numeracy assessment of teachers. Literacy and numeracy score data for around 100,000 teachers have been entered into the EMIS database. Most of the assessment data for teachers from the remote districts of Badakhshan have just arrived and are being scored and entered into the database.
For teachers who could not be reached or did not come to register for security reasons, alternative measures are being explored, and an action plan for completion of registration and assessment will be developed jointly with relevant stakeholders. Assessment sheets for around 5,000 teachers who scored the lowest are being considered as a sampling frame for a more comprehensive assessment. As part of the prior commitment, a face-to-face comprehensive literacy and numeracy assessment will be given to a randomly selected sample drawn from this sub-group. Results of this assessment will provide information on the validity of the screening tool and guide the scope and content of future training for targeted groups of teachers. Based on these results, the positions of teachers who prove to be totally illiterate will be considered for opening to competitive hiring. However, if those who fail the comprehensive literacy assessment are female teachers for whom no better replacement is available, the decision on whether to dismiss them or provide immediate and intensive literacy training will be made jointly by MOE and its development partners.

Component 3: Support to Institutional Arrangements and Program Implementation

Component 3 also supported increasing the capacity of the Ministry of Education to implement EQUIP I and, in turn, to support the implementation of other key programs of the Ministry. The Ministry of Education, through its central, provincial and district departments, was responsible for the overall execution of the program. In coordination with line departments, the Grant Management Unit (GMU) was responsible for program monitoring, reporting and financial management. MOE was to form a Program Steering Committee to set operational policy guidelines, monitor progress, and address issues. Guided by the Steering Committee, the GMU also coordinated technical assistance, evaluation, and policy-relevant research studies. Provincial Education Departments were responsible for the overall management of the School Grants (Component 1). The Department of Construction provided overall technical guidance to school construction and rehabilitation carried out through School Grants in Component 1. It was expected that close coordination was to be maintained with the National Solidarity Program, which will be strengthened in the second phase of EQUIP. MOE's Teacher Training Department was responsible for the technical aspects of human resource development, and led the Teacher Education Program (Component 2) reorganization and its decentralized, district-based design. The MOE contracted technical support (e.g., community mobilization) to strengthen the capacity of participating PEDs and DEDs to deliver the school-based program. MOE was directly responsible for the procurement of goods, works and services, with assistance from the central government procurement agent in the Ministry of Reconstruction.

Summary of Implementation Progress and Results

Since implementation began in August 2004, the program has undergone several project management changes. Only since June 2006 has there been a greater emphasis on EQUIP implementation and capacity building of MOE. As a result, with a strengthened and restructured GMU, program implementation has accelerated considerably. The responsibility of the GMU was to coordinate and monitor all donor-funded projects, human resource management, procurement, accounting and financial reporting, and information technology.
The project is being managed under the following revised structure: the EQUIP Coordination Unit is led by an EQUIP Coordinator, with a team of 12 staff (compared with three staff at the beginning of implementation) with expertise in social mobilization, engineering, and monitoring and evaluation. The GMU complemented the EQUIP team by providing the capacity needed for procurement, financial management and overall administration-related matters. The MOE has been working towards enhanced coordination of the teacher training activities under TTD and their linkages with the rest of the program, supported by the Provincial and District Education Departments. In addition, EQUIP has placed relevant staff at provincial education departments to boost their capacity (an EQUIP Officer, two social mobilizers and two engineers).

Technical Assistance: as in other post-conflict settings, in Afghanistan there is an absence of strong institutional capacity in the line ministry, and it is difficult to implement development programs without external assistance. Given the situation, the program had to increase capacity through the recruitment of contract staff (including technical advisors and specialists). As part of this objective, EQUIP provided support, in addition to the EQUIP core team, in the recruitment of technical assistance to the departments of curriculum development, finance, construction, procurement, IT and planning. A total of 436 contracts were supported. It should, however, be noted that the largest share of technical assistance was provided to the Curriculum Department to assist in the development of the new curriculum and textbooks for grades 1-12 of general education. The Ministry of Education is currently reviewing the effectiveness of the technical assistance as part of its TA institutionalization process. A taskforce headed by the Deputy Minister of Education has been assigned to review the effectiveness of all TAs. The initial feedback indicates that there will be a considerable reduction in the number of TAs under EQUIP II in 2009. The following tables briefly present the outcomes of the technical assistance provided by EQUIP to different departments within the Ministry.

Policy Development and M&E Component: Technical Assistance Hired

Minister's Office Support
- Expected outcome: support the Office of the Minister of Education for effective sector management and strategic guidance.
- TA hired: 7 (administrative and logistical, 3; policy and security, 3; administration support, 1).
- Progress to-date: support provided on major reforms for complex organizational management and strategy development issues; strategic guidance provided, leading MOE policy dialogues on strategic issues related to education; administrative support provided to the new leadership of the Ministry.

Procurement
- TA hired: 9.
- Progress to-date: all EQUIP procurement of goods, services and works carried out by the technical assistance, and on-the-job training provided to civil servants.

Technical and Vocational Education
- TA hired: 11.
- Progress to-date: technical assistance provided to guide the strategic development of the TVET Department.

Support to Key MOE Departments
- Expected outcome: strengthening of key line departments of the MOE providing strategic and technical guidance to EQUIP components.
- Teacher Education Department (2 TA): modality for a national in-service program established; a common framework for coordination established; common, reform-based curriculum and training materials finalized.
- Infrastructure Department (2 TA): standard cost-effective designs with upgraded local technology and earthquake-resistant features developed; effective monitoring capacity established.
- RIMU (1 TA): policy, programming and management capacity strengthened, including a focus at the provincial, district and school levels.

Technical Assistance Hired (continued)

Curriculum and Textbook Development
- Expected outcome: design basic education curriculum and instructional materials.
- TA hired: 189 (subject specialists, graphic designers, computer operators).
- Progress to-date: 200 new textbook titles and teacher guides developed for lower secondary (grades 7-9) in Pashtu and Dari languages; seminars and workshops organized as part of capacity building; 100 primary textbook titles (grades 1-6) in Pushto and Dari revised/edited for reprinting; 147 titles of old secondary textbooks edited for reprinting.

EQUIP Coordination, Regional and Provincial Monitoring
- Expected outcome: support the coordination and integrated management of EQUIP I, and increased capacity in 34 Provincial Education Departments.
- TA hired: 181 (EQUIP Coordinators, Monitoring Specialists, Social Mobilizers, Engineers, and Financial Managers).
- Progress to-date: 1 EQUIP Officer, 2 Engineers and 2 Social Mobilizers deployed to all provinces; together with PEDs and DEDs, EQUIP provincial teams implemented and monitored the program; PED and DED capacity developed in program implementation and community mobilization; EQUIP teams embedded into the formal structure of PEDs as part of the sustainability strategy of the program.

Planning and Monitoring
- Expected outcome: support the development of the National Education Management Information System and of the NESP.
- TA hired: EMIS, 37; ICT, 10; GMU, 2.
- Progress to-date: 9,500 schools surveyed; EMIS system established; EMIS survey package created with the full collaboration of the Planning and IT Departments; a well-functioning GMU established within the MOE with the capacity to carry out all financial and procurement management activities.

Fiduciary Strengthening and Support
- Expected outcome: improve fiduciary management and financial accountability of MOE resources.
- TA hired: 28 (financial management).
- Progress to-date: all financial management of EQUIP at the center and in the provinces was carried out, and on-the-job training provided to civil servants.

Letter from MOE on ICR Report

Annex 8. Comments of Co-financiers and Other Partners/Stakeholders

Co-Financiers Comments

The ICR provides detailed information on what was lacking or not achieved under EQUIP I, and these issues should be strongly addressed under EQUIP II, especially considering that the financial envelope has increased substantially (both IDA and ARTF contributions). Some of the key themes that require attention include:

Technical Assistance: The Report highlights (pg. 14) that the MoE was one of the largest employers of TA with high salaries across the GoA. The Ministry has begun to tackle this issue: it established a Technical Assistance Evaluation Committee in April 2009, and a Summary Report was prepared. It was agreed with the World Bank that all concerned departments utilizing the services of TAs would develop a "TA Strategic Plan," also considering an evaluation of TAs at the provincial level.

Differentiated Strategies in insecure areas: The lack of differentiated strategies is highlighted as a shortfall in project design. This issue is not addressed in EQUIP II.
There may be lessons to be learned from the National Solidarity Program in addressing the delivery of social services in hard-to-reach areas.

Annual Integrated/Implementation Plans for EQUIP: The report highlights in various sections that project management tools were not in place to monitor the pace and challenges of project implementation. It is clear from the report what some of the infrastructure-related challenges were and how they have affected this component of EQUIP. The Ministry seems to have focused mainly on monitoring and collecting information with regard to infrastructure. The Ministry has also made efforts to develop financing plans for EQUIP, and regular substantive reports outlining project progress and challenges need to continue. The focused effort on the infrastructure component, a substantial portion of EQUIP I and II, is commendable; however, monitoring reports related to other components, such as teacher training, are also needed (the ICR points clearly to the fact that teacher training investments require intensive work to ensure targets are met).

Student Assessment of Learning: Donors agree that student learning assessment is a high priority, and forward planning in EQUIP II is needed.

NGOs and Education Partners Comments

Given the state of education in Afghanistan at the conclusion of the Taliban era, the entire system was in dire and urgent need of profound reform. But the conditions for laying the foundations for sustainable educational reform during this period were (and still are) considerably less than ideal. The World Bank and the Government of Afghanistan are to be applauded for their joint effort under EQUIP I to jump-start reform in the midst of great uncertainty and vulnerability. The Bank's "Implementation Completion and Results Report", issued as the project ends its current phase, highlights the successes and shortcomings of the ambitious EQUIP I as it has wended its way through five troubled years, attempting in the process to respond to immediate emergency needs in education while creating conditions for more lasting reform at a future stage. While the report provocatively recounts in some detail the results of the project in its three focus areas and numerous indicators, three elements stand out that are worthy of comment here:

1. Sectoral reform projects sometimes make the mistake of targeting only the macro or the micro realm, but EQUIP I boldly (and correctly) tackled both simultaneously. Some may say that was reckless given the instability of the operating environment, but the sad lesson learned from failed projects the world over is that reform, to be effective, must be simultaneously horizontal and vertical. Local and national approaches are necessary and complementary. It is somewhat ironic, but at the same time predictable, that EQUIP I was more successful at the school and community levels than at the national level. Within that realm it was, furthermore, successful in the interventions that count (given current conditions): in community mobilization and in strengthening educational support and infrastructure. It was less fortunate at the national level, particularly in the critical areas of financial and operational management, inter- and intra-institutional cooperation, and monitoring and evaluation. In phase II the priority clearly is to bring advances at the national level into sync with those already registered at the local level, which is the best way to achieve lasting reform.

2.
The "Report" highlights what was apparently an unexpected bottleneck in the form of the absence of skilled national manpower to implement the myriad activities critical to project success, a circumstance that led to an inordinate reliance on foreign and national consultants and a concomitant upward and unsustainable spiraling of salaries. In both the intermediate and the long term, the existence of a cadre of trained educational professionals is indispensable for ensuring the enduring success of any reform effort. Planning for the second phase of the project, therefore, should contemplate an immediate and urgent program of training and incentives to build the capacity and motivation of a generation of education professionals that can lead the way to translate today's hopes into tomorrow's reality for Afghan youth. 3. The project contemplated the design and implementation of a student learning assessment model in order to measure the success of actions undertaken by the project to ensure quality improvement. Yet for reasons not explained in the "Report" the assessment tool was not developed. Early mastery in grades 1 through 3 of the basic skills of reading and numeracy are critical for student success in later years. Students who delay the mastery of those skills often are the ones who drop out of the system or who underachieve in later grades and ultimately lose motivation to continue their education beyond the primary or basic education level. Standardized student assessment is essential for determining the efficacy of the learning environment, and the results of uniform, periodic testing aid education planners and technicians in correcting inappropriate pedagogical and curricular practices and materials or in improving those interventions. In sum, student achievement is the ultimate indicator of educational quality. The USAID funded Partnership in Advancing Community Education in Afghanistan (PACE-A) project has developed a Rapid Reading and Numeracy Test for grades 1-6, which it has given to a sample of its CBE students for three years running, beginning in 2007. The results of the test have been useful in diagnosing areas of weakness and strength in the project's educational 75 interventions, enabling the technical team and the implementing partners to improve programs in both teacher training and classroom support. PACE-A is willing to share its experiences in student assessment with the MOE and EQUIP II, and it stands ready to cooperate to the fullest to assist in any way possible in the adoption or adaptation by EQUIP II of this viable and proven learning assessment tool. Finally technical education partners and donors agree that the NESP has been a key input for the education sector in Afghanistan. NESP was launched in January 2007. At the Education Development Forum, the first joint review meeting of NESP implementation that gathered the whole education development community in Afghanistan (February 2008), it was unanimously referred to as the central education document in Afghanistan. NESP is the basis for MoE fundraising, donor harmonization and alignment and also for yearly operational planning (including at provincial level). The development of NESP has been strengthened by additional components critical for strategic planning and capacity development, such as systems development, yearly and medium-term (five year) planning, monitoring and reporting and policy formulation; and capacity development of MoE staff. 
The NESP has also provided a national strategic framework to further develop the EMIS. Following the Paris Conference in June 2008, donor coordination became a priority for the government and the international community to move forward with the implementation of the ANDS. In general, donor coordination has improved, especially through the Education Development Board (EDB). Development partners, donors and related ministries in the education sector made good progress and commitments through the organization of the Education Development Forum in February 2008 and the establishment of the Education Development Board. MOE also initiated a donor mapping process to strengthen harmonization and alignment to the NESP. In addition, a working group on TVET has been established at donors' request to improve coordination in this sub-sector. Education programs such as EQUIP II should continue to benefit from improved harmonization and alignment of development assistance in Afghanistan.

Annex 9. List of Supporting Documents

Islamic Republic of Afghanistan
Islamic Republic of Afghanistan, Public Administration and Capacity Building Project, Public Procurement Reform Support Documents:
- Proposal for Selection and Placement of Procurement Capacity Building Officers in the Line Ministries (2008)
- Post-training Evaluation Report (2008)
- 7th Quarterly Progress Report (October-December 2008)
Ministry of Finance, Supplementary Budget Allocation for the Afghan Fiscal Year 1388 (2009), approved by Parliament, May 2009.

Ministry of Education of Afghanistan Documents:
- NESP Programs: General Education, Teacher Education, Curriculum Development, Education Infrastructure and Education Administration
- Results Framework and Monitoring: Education Quality Improvement (2006-2008), October 2008
- Policy Statement on National Teacher Competency Assessment, March 10, 2009
- EQUIP I Progress Report (2004-December 2008), dated April 20, 2009
- EMIS Report (2008-2009), dated July 2009
- Procurement Plan for 1388 (FY2010)
- EQUIP Program Implementation Manual, December 2005
- Staff Performance Appraisal Samples (Old and New Formats)

World Bank
Investing in Afghanistan's Future, Report No. 31563-AF, by Keiko Miwa, February 2005.
Memorandum and Recommendation of the President, Report No. P7626-AF, dated July 1, 2004.
Technical Annex, Report No. T7626-AF.
Development Grant Agreement for Education Quality Improvement Program, August 2, 2004, Grant Number H119-AF.
Afghanistan Reconstruction Trust Fund Agreement (Relating to the Education Quality Improvement Program), June 1, 2005, Grant Number TF054730.
Afghanistan Reconstruction Trust Fund:
- Report to Donors, Second Quarter of the Afghan Fiscal Year 1387 (June 21, 2008 to September 21, 2008)
- Progress Report to ARTF Management Committee for Additional Funds for SY 1386, MC Meeting Date: February 12, 2008
- Progress Report to ARTF Management Committee for Additional Funds for SY 1387, MC Meeting: April 14, 2008
- FM Supervision Mission Report for Education Quality Improvement Projects I and II (EQUIP) - IDA H119, ARTF 54730 and IDA H354, dated January 31, 2009
- Progress Report on Teacher Registration and Assessment and Proposed Benchmarks for ARTF, January 15, 2009
EQUIP I Aide Memoires
- August 18 - September 4, 2004
- January 30 - February 27, 2005
- June 21-30, 2005
- November 8, 2005 to February 8, 2006
- July 4-24, 2006
- December 11-23, 2006
- September 25 - October 10, 2007

EQUIP I Implementation Status and Results Reports:
- September 2004
- March 2005
- September 2005
- March 2006
- June 2006
- December 2006
- June 2007
- December 2007
- June 2008
- December 2008
- February 2009

Management Letters for Grant Numbers 54730 and H119 for 2006/07 and 2007/08.
Audit Reports for 2006/07 and 2007/08.
Financial Statements and Reports for 2006/07 and 2007/08.
EQUIP I Mid-Term Review, April 2007.
ICR for the Emergency Education Rehabilitation and Development Project, Report No. ICR000006, March 23, 2007.

Other
BRAC Afghanistan Education Program, Completion Report. The Education Quality Improvement Program (EQUIP I) for Contract Agreement number MOE/371-B (April 1, 2006 - March 31, 2008).
Building Education Support Systems for Teachers, USAID Document from Contract Number GS-10F-466P.
CARE Afghanistan:
- Education Program, Completion Report for Education Quality Improvement Program (EQUIP I), April 2006 to April 2008
- Report on the EQUIP I Appreciative Inquiry Workshop, May 2008
CGAC Country Team Meeting, Governance Vulnerabilities in Community-Driven Programs: Preliminary Findings, February 15, 2009.
Draft Discussion Paper: Addressing the Problems of the Second Civil Service in Afghanistan.
Heneveld, W. and Craig, H., "Schools Count: World Bank Project Designs and the Quality of Primary Education in Sub-Saharan Africa", World Bank Technical Paper No. 303, Washington DC, 1996.
Meza, Darlyn, "Community Participation: A Look at the International Experience - Lessons Learned", Afghanistan, August 2009.
Meza, Darlyn, "Key Elements to Institutionalize Community Participation", Afghanistan, August 2009.
Reyes, Joel, (Unpublished Paper) "Principios y Aplicacion de Consejeria Escolar en Afganistan" (Principles and Application of School Counseling in Afghanistan), Universidad del Valle de Guatemala, January 2009.
USAID/AFGHANISTAN, Office of Social Sector Development, Education Portfolio Pamphlet, June 2009.

MAP