Cabo Verde SABER Country Report
STUDENT ASSESSMENT 2017

Key Policy Areas for Student Assessment: Status

1. Classroom Assessment
In Cabo Verde, classroom assessment is supported by formal, system-level documents that provide information on how classroom assessment should be carried out by teachers and how it should be used to evaluate student performance. The use of classroom assessment is well established in the country, and teachers are engaged in positive assessment practices in the classroom. At the same time, there are no mechanisms in place to ensure alignment between classroom assessment activities and the national curriculum, and there is a need to improve the quality of, and access to, resources and training opportunities that support teachers' classroom assessment practices.

2. Examinations
The Provas Concelhias (Regional Tests) and the Provas Gerais Nacionais, PGN (General National Tests) are administered to students in grades 2, 4, 6, and 8 (in the case of the Provas Concelhias) and in grade 12 (in the case of the PGN). In combination with other classroom assessment activities, the Provas Concelhias determine whether a student can move to the next grade, while the PGN also certifies completion of secondary education. Although there is no official unit within the Ministry of Education in charge of examinations, an individual from the Ministry of Education leads a team to organize and coordinate each of the examination programs every year. Although both examination programs are intended to be based on the official curriculum, in practice these examinations are more aligned with what teachers taught during the school year. There is no mechanism in place to ensure the quality of the examinations or to monitor their impact on the education system and students.

3. National Large-Scale Assessment (NLSA)
Cabo Verde's national large-scale assessment exercise, the Prova de Aferida (Aferida), was administered in 2010 (to grade 6 students) and in 2014 (to students in grades 2, 4, and 6).
The Aferida assessed students in Portuguese and math. For these administrations of the Aferida, individuals from the Ministry of Education and the teaching staff were brought together to carry out assessment activities; there is no permanent unit responsible for running the national assessment program. Although the individuals responsible for the Aferida had most of the appropriate resources to carry out the assessment, there were issues with how key assessment activities were implemented, and only limited quality assurance procedures were in place to ensure the quality of the Aferida.

4. International Large-Scale Assessment (ILSA)
Cabo Verde has not yet participated in an ILSA, and there are no concrete plans in place to participate in a future ILSA.

THE WORLD BANK

Table of Contents
Introduction ..... 3
What is SABER-Student Assessment? ..... 3
Education in Cabo Verde ..... 5
Classroom Assessment in Cabo Verde ..... 7
Examinations in Cabo Verde ..... 10
National Large-Scale Assessment in Cabo Verde ..... 14
International Large-Scale Assessment in Cabo Verde ..... 16
Appendix 1: Assessment Types and Their Key Differences ..... 17
Appendix 2: Summary of the Development Levels for Each Assessment Type ..... 18
Appendix 3: Methodology for Assigning Development Levels ..... 19
Appendix 4: SABER-Student Assessment Rubrics for Cabo Verde ..... 21
Acknowledgements ..... 71
References ..... 71

CABO VERDE ǀ SABER-STUDENT ASSESSMENT ǀ SABER COUNTRY REPORT ǀ 2017

Introduction

Cabo Verde has focused on increasing student learning outcomes by improving the quality of education in the country. An effective student assessment system is an important component of efforts to improve education quality and learning outcomes because it provides the necessary information to meet stakeholders' decision-making needs. In order to gain a better understanding of the strengths and weaknesses of its existing assessment system, Cabo Verde decided to benchmark this system using standardized tools developed under the World Bank's Systems Approach for Better Education Results (SABER) program. SABER is an evidence-based program to help countries systematically examine and strengthen the performance of different aspects of their education systems.

What is SABER-Student Assessment?

SABER-Student Assessment is a component of the SABER program that focuses specifically on benchmarking student assessment policies and systems. The goal of SABER-Student Assessment is to promote stronger assessment systems that contribute to improved education quality and learning for all.

National governments and international agencies are increasingly recognizing the key role that assessment of student learning plays in an effective education system. The importance of assessment is linked to its role in:
(i) providing information on levels of student learning and achievement in the system;
(ii) monitoring trends in education quality over time;
(iii) supporting educators and students with real-time information to improve teaching and learning; and
(iv) holding stakeholders accountable for results.

SABER-Student Assessment methodology

The SABER-Student Assessment framework is built on the available evidence base for what an effective assessment system looks like. The framework provides guidance on how countries can build more effective student assessment systems. It is structured around two main dimensions of assessment systems: the types/purposes of assessment activities and the quality of those activities.

Assessment types and purposes

Assessment systems tend to be comprised of three main types of assessment activities, each of which serves a different purpose and addresses different information needs. These three main types are: classroom assessment, examinations, and large-scale, system-level assessments.

Classroom assessment provides real-time information to support ongoing teaching and learning in individual classrooms. Classroom assessments use a variety of formats, including observation, questioning, and paper-and-pencil tests, to evaluate student learning, generally on a daily basis.

Examinations provide a basis for selecting or certifying students as they move from one level of the education system to the next (or into the workforce). All eligible students are tested on an annual basis (or more often if the system allows for repeat testing). Examinations cover the main subject areas in the curriculum and usually involve essays and multiple-choice questions.

Large-scale, system-level assessments provide feedback on the overall performance of the education system at particular grades or age levels. These assessments typically cover a few subjects on a regular basis (such as every 3 to 5 years), are often sample-based, and use multiple-choice and short-answer formats. They may be national or international in scope. Appendix 1 summarizes the key features of these main types of assessment activities.

Quality drivers of an assessment system

The key considerations when evaluating a student assessment system are the individual and combined quality of assessment activities in terms of the adequacy of the information generated to support decision making. There are three main drivers of information quality in an assessment system: enabling context, system alignment, and assessment quality.

Enabling context refers to the broader context in which the assessment activity takes place and the extent to which that context is conducive to, or supportive of, the assessment. It covers such issues as the legislative or policy framework for assessment activities; institutional and organizational structures for designing, carrying out, or using results from the assessment; the availability of sufficient and stable sources of funding; and the presence of trained assessment staff.

System alignment refers to the extent to which the assessment is aligned with the rest of the education system. This includes the degree of congruence between assessment activities and system learning goals, standards, curriculum, and pre- and in-service teacher training.

Assessment quality refers to the psychometric quality of the instruments, processes, and procedures for the assessment activity. It covers such issues as the design and implementation of assessment activities, the analysis and interpretation of student responses to those activities, and the appropriateness of how assessment results are reported and used.

Crossing the quality drivers with the different assessment types/purposes provides the framework and broad indicator areas shown in Table 1. This framework is a starting point for identifying indicators that can be used to review assessment systems and plan for their improvement.

Table 1: Framework for building an effective assessment system, with indicator areas

The indicators are identified based on a combination of criteria, including:
• professional standards for assessment;
• empirical research on the characteristics of effective assessment systems, including analysis of the characteristics that differentiate between the assessment systems of low- versus high-performing nations; and
• theory, that is, general consensus among experts that it contributes to effective assessment.

Levels of development

The World Bank has developed a set of standardized questionnaires and rubrics for collecting and evaluating data on the three assessment types and related quality drivers.

The questionnaires are used to collect data on the characteristics of the assessment system in a particular country. The information from the questionnaires is then applied to the rubrics in order to judge the development level of the country's assessment system in different areas.

Rubrics are used to evaluate data collected using the standardized questionnaires. The goal of the rubrics is to provide a country with some sense of the development level of its assessment activities compared to best or recommended practice. For each indicator, the rubric displays four development levels: Latent, Emerging, Established, and Advanced. These levels are artificially constructed categories chosen to represent key stages on the underlying continuum for each indicator. Each level is accompanied by a description of what performance on the indicator looks like at that level.

• Latent is the lowest level of performance; it represents absence of the desired attribute.
• Emerging is the next level; it represents partial presence of the attribute.
• Established represents the acceptable minimum standard.
• Advanced represents the ideal or current best practice.

A summary of the development levels for each assessment type is presented in Appendix 2.

In reality, assessment systems are likely to be at different levels of development in different areas. For example, a system may be Established in the area of examinations but Emerging in the area of large-scale, system-level assessment, and vice versa. While intuition suggests that it is probably better to be further along in as many areas as possible, the evidence is unclear as to whether it is necessary to be functioning at Advanced levels in all areas. Therefore, one might view the Established level as a desirable minimum outcome to achieve in all areas, but only aspire beyond that in those areas that most contribute to the national vision or priorities for education. In line with these considerations, the ratings generated by the rubrics are not meant to be additive across assessment types (that is, they are not meant to be added to create an overall rating for an assessment system; they are only meant to produce an overall rating for each assessment type). The methodology for assigning development levels is summarized in Appendix 3.

Education in Cabo Verde

Cabo Verde is a lower-middle-income country consisting of ten islands located off the coast of West Africa. Approximately 88% of its population (0.5 million) lives on four of the ten islands.

Cabo Verde has made substantial development progress and is currently the richest country in West Africa and the 9th richest in Sub-Saharan Africa (SSA). In 2016, Cabo Verde's gross national income (GNI) per capita was US$2,970, almost six times what it was in 1982. The country has also witnessed a sustained decline in the number of poor, with the incidence of poverty falling from 58 percent in 2001 to 35 percent in 2015, and extreme poverty falling from 30 percent to 10 percent during this same period. Anchored in stable political institutions, the country's economic performance is attributable to significant investment in infrastructure linked to the promotion of the country as a tourist destination.

Cabo Verde has made significant progress in expanding access to education and has achieved nearly universal access to basic education over the past decade. Although preschool education is not compulsory, 85% of children in Cabo Verde are enrolled in preschool programs. Basic education is mandatory and free. The Net Enrollment Rate (NER) at the primary and secondary levels improved from 91.7% and 59.2%, respectively, in 2006 to 97% and 70.3% in 2015. Cabo Verde is in the process of expanding its basic education system beyond grade 6, to grades 7 and 8, with the goal of achieving a 100% net enrollment rate from preschool to grade 8 by 2021.

After completing basic education, students can continue their secondary education in institutions of general secondary education, institutions of technical education, and institutions of professional-vocational education. Students who continue through general secondary education and complete grade 12 must pass the Prova Geral Nacional (National General Test) to receive a certificate of general secondary school completion.

Despite having made major progress, the education system in Cabo Verde faces several challenges. Results from the 2010 Aferida show that only 27% of 6th graders achieved a satisfactory score on the assessment, and that half performed at a level that is considered to be concerning. The island of residency and household living conditions are also significant variables affecting student learning outcomes at the primary level. At the secondary level, low internal efficiency is a major concern and is considered the weak link of the education system: it is estimated that 87.5% of young people access secondary education, but only 45% complete it. Although student retention is high at the primary level (93%), it declines in secondary education (85%). At the same time, repetition rates are moderate at the primary level (9%) and considerably higher at the secondary level (22%). Despite many young people leaving the secondary level without a qualification and facing difficulties in the job market, Cabo Verde's Technical and Vocational Education and Training (TVET) system accounts for only 5% of enrollment at the secondary level. The current TVET system is characterized by a dispersed offering of professional and technical trainings, but it does not constitute a true system that is well articulated with student flows from the secondary level and the demands of the job market.

In response to these and other challenges, Cabo Verde is in the process of developing its 2017-2021 Education Strategic Plan, which is built around the following three main priorities: i) gradually increasing universal access to preschool, basic, and secondary school; ii) improving the quality and relevance of education services; and iii) improving the efficiency and management of the education sector. Within basic education, the focus is on curriculum reform and the revised education structure (to include grades 1 through 8). At the secondary level, the main objective is to increase access to relevant secondary education aligned with the economic development of the country. As part of the process to improve the quality of education, Cabo Verde is interested in improving its assessment system to ensure that it has better data on student performance that can inform further education reforms and policies.

Detailed information was collected on Cabo Verde's student assessment system using the SABER-Student Assessment questionnaires and rubrics in order to benchmark it against best practices. Specifically, a local consultant with in-depth knowledge of, and experience with, the education system in Cabo Verde oversaw the completion of the four SABER-Student Assessment questionnaires: one questionnaire each for Classroom Assessment, Examinations, National Large-Scale Assessment, and International Large-Scale Assessment. The data to complete these questionnaires were obtained through interviews and focus groups with key stakeholders and a review of existing official and technical documents. The information in the completed questionnaires was then applied to the SABER-Student Assessment rubrics (one rubric for each assessment type), and the conclusions of this report were determined on the basis of this analysis. It is important to remember that these tools primarily focus on benchmarking a country's policies and arrangements for assessment activities at the system or macro level. Additional data would need to be collected to determine actual, on-the-ground practices in Cabo Verde, particularly by teachers and students in schools.

The following sections discuss the findings by each assessment type, accompanied by suggested policy options. The suggested policy options were determined in collaboration with key local stakeholders based on Cabo Verde's immediate interests and needs. Detailed, completed rubrics for each assessment type are provided in Appendix 4.

Classroom Assessment in Cabo Verde

The school year in Cabo Verde is divided into three trimesters.
At the end of each trimester, students receive a trimester score that is intended to combine information from summative tests ("testes sumativas"), which are developed by a group of teachers from the same grade and subject level in a conselho (regional level) and administered twice per trimester (collectively, these tests make up approximately 80% of the trimester grade), with a mix of "other elements of assessment" (OEA), which consist of attendance, group work, individual work, and class participation. Based on trimester scores, students receive an annual score, the Classificação Anual: the first and second trimester scores each contribute approximately 30 percent to the annual score, and the third trimester score contributes approximately 40 percent. The third trimester score is weighted more heavily because it includes the marks from the final summative assessment (Prova Concelhia, Prova Geral Interna - PGI, or Prova Geral Nacional - PGN) that takes place during this trimester (for more information, see the examination sections below). The final annual score is calculated out of 20 points.

Level of development: ESTABLISHED

Several formal, official system-level documents provide guidelines for classroom assessment in Cabo Verde, including the "Programas," which outline what students are expected to learn at each subject and grade level. However, these "Programas" have not been updated since 1999, and it is not clear how they are aligned with the curriculum (which, at the time of data collection, was undergoing significant reform). The "Programas" also do not specify the desired performance level for the subjects and grade levels, and it is not clear if they are used by teachers to help guide the development of classroom assessment activities.

Other official and publicly available documents provide more specific guidance for classroom assessment activities. For example, the "Sistema de Avaliacao - Ensino Basico, 2003" and the "Sistema de Avaliacao - Ensino Secundario, 2003" provide general information on the use of classroom assessment for primary education and secondary education, respectively. In addition, each year teachers receive a document, "Orientacoes para a Organizacao das Provas de Avaliacao Final," with information on what should be assessed, how the information should be used, guidelines for the format of assessment questions, how the assessments should be administered, the subjects to be tested, how assessments should be corrected, and other logistics of carrying out assessment activities. At the time of data collection, the assessment system in Cabo Verde was undergoing a reconfiguration: a new assessment policy document, "Decreto-lei 71, O Novo Sistema Nacional da Avaliacao das Aprendizagens, 2015," had been authorized, introducing significant changes to the assessment system at all levels of education. However, at the time of data collection, this new system had not yet been implemented, and it was unclear how or when the changes reflected in the new policy document would take place. Therefore, the data collected for the SABER-Student Assessment questionnaire reflect the current status of the student assessment system, as authorized by the Sistema de Avaliacao - Ensino Basico/Ensino Secundario in 2003.

In practice, however, very few teachers use or apply these OEAs, and there are no specific standards or guidelines concerning how to incorporate the data from these assessments into trimester scores. Consequently, trimester and annual scores are largely determined by the summative classroom assessment information.

Officially, teachers are required to use classroom assessment information to diagnose student learning issues, provide continuous feedback to students as part of instruction, evaluate student performance, and plan next steps in instruction. However, in practice this does not always seem to occur, and teachers report insufficient training on how to use available classroom assessment information.

Also under this scoring system, classroom assessment activities help to determine whether students have successfully completed certain grade levels. Specifically, at the end of the 2nd, 4th, 6th, and 8th grades, before transitioning to the next grade level, students must have an overall annual score of at least 10 out of 20 points. At the end of secondary school (the 12th grade), students must have an overall annual score of at least 10 out of 20 to graduate. Classroom assessments are not, however, used as selection mechanisms for entering higher levels of education.

All schools are required to report on individual student performance in a manner that is accessible to the student and the parents. Grades from the classroom assessments (i.e., testes sumativas) are posted on the walls of the classrooms, where students and parents can view the results. In addition, after each teste sumativa, teachers return the tests to the students and are required to go through the test questions and correct answers with the students. While students' classroom assessment results are not sent to the Ministry of Education, their annual grades (which are based largely on classroom assessment) are sent to the Ministry of Education's National Directorate for Education.

There are two formal mechanisms at the system level to support developing teachers' competencies in classroom assessment: pre-service teacher training and in-service teacher training. Pre-service training is mandatory and is available annually, albeit perceived to be of low quality. In-service training is ad hoc and also perceived to be of low quality. There are some resources available to teachers for use in their classroom assessment activities, including the guidelines ("Orientacoes") mentioned above and the "Programas" specifying what students are expected to learn. Teachers also receive examples of previous classroom assessments/testes sumativas and general scoring criteria. However, teachers report insufficient training and resources on how to use and follow up on classroom assessment information, particularly with regard to OEA.

Summative assessments are developed at the regional level (the conselho) in coordination with a group of teachers from the same subject/grade level. As such, the same summative assessments are given to all students within each grade/subject area in the same conselho. The teachers meet regularly to determine their planning schedule (including the teaching schedule and the assessment schedule, as teachers in the same conselho follow the same schedules), to develop the summative assessments, and to establish the grading and scoring criteria used within the conselho. However, it was noted that teachers do not receive sufficient guidelines to support the development of these summative assessments and to ensure they are aligned with the curriculum. Although teachers are regularly supervised in their use and application of classroom assessment/testes sumativas, there is no mechanism in place to ensure that the testes sumativas being used to evaluate student learning are actually aligned with the curriculum.

Suggested policy options:

1. As part of the ongoing curriculum revision/update, introduce an official, system-level document that:
a. Provides information on what students are expected to learn (particularly in language and math) at all different age and grade levels; and
b. Specifies the desired performance levels (that is, how well students should learn the specified content or skills).
Make this document available to all teachers through their schools as well as via pre-service and in-service teacher training and professional development programs.

2. Ensure alignment between the curriculum, the document outlining desired student learning outcomes, and the development of classroom assessments. Ensure that there is alignment between how classroom assessment is intended to be used by teachers and how it is actually used, through improved teacher monitoring and teacher performance evaluation.

3. Provide high-quality training and resources to teachers to help them follow appropriate procedures in developing and implementing classroom assessment activities by:
a. Reviewing current pre-service and in-service teacher training options (via live, distance learning, or pre-recorded courses delivered via computer) focused on building teacher competencies in classroom assessment, to ensure they are of high quality and available on an annual basis (especially with the release of the new curriculum);
b. Providing clear, official guidance and guidelines to teachers on using assessment results and information (both summative and formative) at the classroom level to improve teaching and student learning outcomes; and
c. Improving teacher monitoring and performance evaluation with a specific focus on how classroom assessment is intended to be used by teachers and how it is actually used at the classroom level.

Examinations in Cabo Verde - Provas Concelhias

Level of development: EMERGING

The Provas Concelhias is an examination offered at the end of each two-year education cycle, in the 2nd, 4th, 6th, and 8th grades. Although the Provas Concelhias are intended to be based on the official curriculum, in practice they are more aligned with what teachers actually taught during the school year. Thus, the Provas Concelhias differ among schools and conselhos depending on whether what was taught is aligned with the official curriculum. There are no official reviews in place to ensure alignment with the curriculum, and while the Directorate of National Education is supposed to validate the Provas Concelhias, it is unclear what exactly this validation entails.
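Because examination results enter the third trimester score, they feed directly into the annual-score arithmetic described in the classroom assessment section (trimesters 1 and 2 at roughly 30 percent each, trimester 3 at roughly 40 percent, on a 20-point scale with a 10-point pass mark). As a reading aid only, that arithmetic can be sketched as follows; the function names are ours, and the weights are the approximate figures reported above, not an official formula.

```python
# Illustrative sketch of the annual-score arithmetic described in this
# report. The 30/30/40 weights, 20-point scale, and 10-point pass mark
# are the approximate figures reported above; names are hypothetical.

def annual_score(t1: float, t2: float, t3: float) -> float:
    """Combine three trimester scores (each out of 20) into the
    Classificacao Anual: trimesters 1 and 2 count ~30% each, and
    trimester 3 (which includes the final summative exam) ~40%."""
    for score in (t1, t2, t3):
        if not 0 <= score <= 20:
            raise ValueError("trimester scores are out of 20")
    return 0.30 * t1 + 0.30 * t2 + 0.40 * t3

def passes(annual: float) -> bool:
    """Students advance with an overall annual score of at least 10/20."""
    return annual >= 10

example = annual_score(11, 9, 12)  # approx. 10.8, above the 10-point mark
```

For example, a student scoring 11, 9, and 12 across the three trimesters would receive an annual score of about 10.8 and meet the 10-out-of-20 threshold.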
In each cycle, students take the validate the Provas Concelhias, it is unclear what Provas Concelhias in language, math, and science. At exactly this validation entails. the end of 6th grade, students also take the examination in social sciences. At the end of 8th grade, The Provas Concelhias are well aligned with classroom French, English, physics, chemistry, and the history assessment activities in terms of the content and skills and geography of Cabo Verde are part of the testing being measured. The Provas Concelhias follows the requirements. format and structure of other trimester and end-of- year summative assessments given throughout the The “Sistema de Avaliacao- Ensino Basico” (2003) is entire education system. Students' Provas Concelhias the formal, publicly availably policy document that results are incorporated into their third trimester authorizes the Provas Concelhias. Guidance on the score, which is combined with their first two trimester Provas Concelhias, including who should be assessed, scores to calculate their overall annual score. In what should be assessed, the frequency, date, and grades 2, 4, 6, and 8, students who receive an annual length of time to administer the examination, the score of at least 10 out of 20 may advance to the next number of test questions, the structure of the grade level. Students who achieve an annual score examination, the breakdown of multiple-choice versus below 7 must retake the subjects in which they scored open-ended questions, and how examination results below a 7. Students who score above 7, but below 10 should be communicated are provided through out of 20 may take a “second chance” test, called the “Orientacoes para a Organizacao das Provas de Provas de Recurso. If these students do not pass the Avaliacao Final.” Provas de Recurso, they must repeat the subject or grade. It is important to note that the Provas There is no official unit within the Ministry of Concelhias are not considered high-stakes. 
Rather the Education in charge of examinations, or student results of the Provas Concelhias are used in part to assessments in general. Within the Nucelo de Ensino make a decision regarding the future selection and Basico (the Basic Education Unit) there is one person promotion of students through the education system. designated to organize and coordinate the Provas Concelhias. This person works with members of other Suggested policy options: departments within the Ministry of Education, particularly the Nucelo de Gestao Escolar e Orientacao (the Unit for School Management and Guidance). This 1. Document the methods and procedures used inter-unit team has primary responsibility for running during the examination. Specifically, the Provas Concelhias, it is not exclusively dedicated a. Document the processes used to to the examination activities. construct, analyze, and score questions, items, or tasks, and to set cutoff scores. The Provas Concelhias are developed, organized, b. Clarify the validation process used by the administered, corrected, and the results published at Ministry of Education. Outline the criteria the regional level by “Delegacoes Concelhias.” used by the Ministry to validate the “Delegacoes Concelhias” are regional-level examination. Document the objectives of delegations consisting of teachers and school the validation process to promote directors. Provas Concelhias are implemented on a regional level and thus the examinations differ from transparency and enhance stakeholder each other depending on which “Concelho” or confidence in the credibility of the exam regional grouping the school belongs to. Although the results. 2. Expand upon quality assurance procedures. 10 CABO VERDE ǀ SABER-STUDENT ASSESSMENT SABER COUNTRY REPORT |2017 a. Implement an official review of the examination, perhaps by ensuring that the Ministry of Education’s validation entails orienting the exam to address the official curriculum. b. 
Introduce additional quality assurance procedures, such as ensuring that proctors/administrators are trained according to a standard protocol; questions, items, or tasks are piloted before the official examination is administered; and scorers are trained to ensure high inter-rater reliability. Such procedures will help better ensure the validity, reliability, and comparability of the examination results.

3. Institute mechanisms to monitor the impact of the examination on education quality, equity, and student outcomes, such as introducing oversight committees, expert review groups, funding for independent research and studies (for example, predictive validity studies), and focus groups or surveys of key stakeholders. Findings from these activities could be used to enhance the design, analysis, or use of the examination to maximize its positive impact on students and the education system.

4. Create a semi-autonomous unit/department within the Ministry of Education dedicated to student assessment:
   a. Ensure that this unit is accountable to an external body.
   b. Clearly outline the roles and responsibilities of this unit/group with regard to the key types of student assessment activities (classroom, examinations, large-scale assessment).
   c. Provide training and other learning opportunities to ensure that this unit can adequately lead work in further developing the student assessment system.

Examinations in Cabo Verde - Prova Geral Nacional

Level of development: ESTABLISHED

The Prova Geral Nacional (the General National Test) is an examination that students take at the end of the 12th grade. It was first administered in 2003 and has been administered annually thereafter. The purpose of the PGN is to certify students' completion of secondary education. Students in both public and private schools take the examination.

The "Sistema de Avaliacao - Ensino Secundario" (2003) is the formal, publicly available policy document that authorizes the PGN. Guidance on the PGN, including who should be assessed, what should be assessed, the frequency of the examination, the date of the examination administration, the length of time to administer the PGN, and how examination results should be communicated, is provided in the "Orientacoes para a Organizacao das Provas de Avaliacao Final."

There is no official unit within the Ministry of Education in charge of examinations, or of student assessment in general. Within the Nucelo de Ensino Secundario Geral e Tecnico (the Technical and General Secondary Education Unit), one person is designated to organize and coordinate the PGN. This person works with members of other departments within the Ministry of Education, particularly the Nucelo de Gestao Escolar e Orientacao (the Unit for School Management and Guidance). Although this inter-unit team has primary responsibility for running the PGN, it is not exclusively dedicated to the PGN.

Although the PGN is intended to be based on the official curriculum, in practice it is more aligned with what teachers actually taught during the school year. Secondary school teachers propose inputs, such as examination content, to the Ministry of Education's Technical and General Secondary Education Unit, which then uses these inputs to develop a common exam to be administered nationally by all secondary schools. Once the examination has been developed and shared with the secondary schools, teachers are responsible for administration, scoring, and reporting the results to the National Directorate of Education within the Ministry of Education. It has been noted that secondary schools sometimes adapt the PGN to more closely reflect what was taught during that year. These adaptations are often not shared with the Ministry of Education and, as such, it is not clear in practice how well the PGN is aligned with the curriculum.

For both staff within the Ministry of Education and teachers who administer the PGN, there are very limited opportunities to learn about the PGN or to build capacity in student assessment in general. Some workshops are offered to Ministry of Education staff on an ad hoc basis, and peer-to-peer learning among teachers exists, as the PGN is a well-established part of the education system (although this peer-to-peer learning is not systematic). However, there are no university or non-university courses, programs, or workshops available regarding student assessment or topics relevant to the PGN.

The PGN is well aligned with classroom assessment activities in terms of the content and skills being measured. The PGN follows the format and structure of other trimester and end-of-year summative assessments given throughout the education system. Along with the scores from other trimester tests, the PGN results factor into students' "cycle scores" (11th grade final score + 12th grade final score), which determine whether they may graduate. The PGN accounts for 22 percent of the cycle score. Students who score above 14 out of 20 on the other components of their cycle score may graduate without taking the PGN; all other students must take the PGN. After taking the PGN, students achieving a cycle score of at least 10 out of 20 may graduate. Students achieving a cycle score between 7 and 10 are eligible to take a "second chance" test called the "Provas de Recurso." Students achieving a cycle score below 7 must retake the subject(s) in which they scored below a 7.

Suggested policy options:

1. When designing the PGN, the team might place more emphasis on ensuring that the examination's content is closely aligned with, and adequately reflects, the official curriculum. (If the official curriculum does not reflect what teachers actually teach, consider revisiting the curriculum and/or what is being taught in the classroom.)

2. Document existing examination methods and procedures, including the construction of test specifications, questions, items, or tasks; test assembly; scoring of open-ended questions, items, tasks, or essays; and the setting of cutoff scores.

3. Introduce additional quality assurance procedures, such as training examination administrators according to a standard protocol; piloting questions, items, or tasks before the official examination is administered; and training scorers to ensure high inter-rater reliability. Such procedures will help better ensure the validity and reliability of the examination results.

4. Institute mechanisms to monitor the impact of the examination on education quality, equity, and student outcomes, such as introducing oversight committees, expert review groups, funding for independent research and studies (for example, predictive validity studies), and focus groups or surveys of key stakeholders. Findings from these activities could be used to enhance the design, analysis, or use of the examination to maximize its positive impact on students and the overall system.

5. Create a semi-autonomous unit/department within the Ministry of Education dedicated to student assessment:
   a. Ensure that this unit is accountable to an external body.
   b. Clearly outline the roles and responsibilities of this unit/group in regard to all types of student assessment activities (classroom, NLSA, etc.).
   c. Provide training and learning opportunities to ensure that this unit can adequately lead work in further developing the student assessment system.

6. In line with the updated/revised curriculum, develop an official document that outlines desired student learning outcomes and performance in all subject areas at all grade levels, and ensure alignment between these documents and all forms of student assessment being used in Cabo Verde (testes sumativas, Aferida, PGN, etc.).

7. Ensure that high-quality training opportunities on the use of student assessment (particularly classroom assessment) are available on a regular basis (both via pre-service and in-service teacher training), with particular emphasis on the upcoming changes related to the curriculum and the new assessment system. For example:
   a. Review pre-service teacher training options for building competencies in classroom assessment to ensure they are of high quality and available on an annual basis.
   b. Institute annual in-service teacher training opportunities on classroom assessment (including via live and prerecorded courses delivered via computer) to ensure that teachers are able to hone their classroom assessment knowledge and skills every year as needed.
   c. Ensure that distance learning options are available to all teachers.

National Large-Scale Assessment (NLSA) in Cabo Verde

Level of development: EMERGING

Cabo Verde's National Large-Scale Assessment exercise, known as the Prova de Aferida, was administered in 2010 and 2014. The 2010 round was administered to a representative sample of public school students in sixth grade, and the 2014 round was administered to a sample of public school students in second, fourth, and sixth grades. Both rounds assessed Portuguese and math. Assessment results were used to monitor education quality, inform policy, and inform pedagogy, although there is no system in place to ensure or monitor this impact.

The "Sistema de Avaliacao do Ensino Basico Decreto, 2003" provides an overall definition of a national large-scale assessment and a brief description of how it can be used to inform policy. However, the document does not authorize the Aferida, and it does not provide any guidelines, such as on the frequency of the assessment, the grade levels to be assessed, or the use of results.

The Aferida aims to assess overall levels of student achievement in relation to the national curriculum; however, there is no regular review system in place to ensure that the Aferida is aligned with the curriculum. The format and structure of the Aferida are considered similar to those of the Provas Concelhias (the end-of-year summative assessments). However, because the Provas Concelhias are developed at the local/regional level, they are more aligned with what teachers actually taught in class, whereas the Aferida is developed at the national level and is more closely aligned with the curriculum. In theory, if teachers are following the curriculum, all students in Cabo Verde should be exposed to the content and skills measured by the Aferida via regular course instruction.

In both 2010 and 2014, schools in the sample were provided with a framework document that gave general information about the Aferida, such as when it would take place, the materials to be used, the duration of the assessment, and basic instructions for the teachers on how to administer it. However, preparatory information on the Aferida (e.g., sample questions, examples of scoring criteria, and information on how to teach the content to be assessed by the Aferida) was not provided to schools ahead of the test administration.

There is no permanent unit or group responsible for the Aferida. In 2010 and 2014, a temporary group was formed within the Ministry of Education, consisting of full-time staff from various departments within the Ministry, who helped to design, lead, and implement the Aferida. In general, staff working on the Aferida have few relevant qualifications in NLSA, and there are no formal trainings associated with the Aferida available. The 2010 round benefited from significant financial and technical support from UNESCO-Dakar, the Brazilian Cooperation, and the University of Beira Interior in Portugal. For the 2010 Aferida, UNESCO-Dakar organized a series of capacity-building exercises for Ministry of Education staff; however, there are no ongoing opportunities within the system to learn about the NLSA/Aferida (such as university courses, workshops, or other programs).

Although there were some issues related to errors in item development, administration (such as fewer students attending the second day of the assessment administration than the first day), and delays in data processing, the 2010 round was considered successfully implemented, and stakeholders perceived the results as credible. The 2010 results were published in the 2011 RESEN (Report of the State of the National Education System in Cabo Verde) by performance levels at the national level (i.e., the percentage of students who performed high, medium, or low), which was made available to the general public. The report did not include scores or performance levels at the individual, school, or regional levels. A report documenting the methods and procedures used for the 2010 round was drafted with support from UNESCO-Dakar; however, it was never finalized or published. The 2014 round was administered fully in-house. This round faced many issues, and its results have not been released.

Suggested policy options:

1. Prepare an official, system-level policy document authorizing the Prova de Aferida and establishing its objectives. This could entail updating the Sistema de Avaliacao do Ensino Basico Decreto Lei N. 43 to include a formal authorization of the NLSA, including the grade levels and topics to be assessed. In addition, ensure that this document is made available to the general public (for example, by disseminating hard copies and publishing online).

2. Ensure that there is a permanent unit with primary responsibility for the NLSA, and that it is accountable to an external body. Clearly outline the roles and responsibilities of this unit/group as well as of all individuals responsible for key NLSA activities.

3. Provide key stakeholders, including staff and teachers, with training and learning opportunities to ensure that they can adequately implement NLSA-related tasks, including administration of the instruments.

4. Establish regular reviews to ensure that the instruments used for the NLSA are adequately aligned with the school curriculum.

5. Introduce procedures to monitor the quality, credibility, and impact of the NLSA program in general and the NLSA administration in particular. This could involve bringing in external observers, soliciting feedback from stakeholders and experts, and introducing quality assurance mechanisms such as numbering booklets and introducing double data scoring.

International Large-Scale Assessment (ILSA) in Cabo Verde

Level of development: LATENT

Cabo Verde has not yet participated in an ILSA. The Ministry of Education has expressed interest in participating in one in the future. The 2017-2021 Education Strategic Plan broadly mentions "developing [an] external assessment model"; however, it does not explicitly mention any plans for the country to participate in an ILSA program. There is no unit within the Ministry of Education directly responsible for student assessment.

Suggested policy options:

1.
Create an enabling context for Cabo Verde to successfully carry out ILSA activities in the mid- to long-term:
   a. Prepare a formal policy document that authorizes ILSA activity and explains its role in supporting improved education quality and learning in Cabo Verde, or update the 2017-2021 Education Strategic Plan to specifically authorize and justify participation in an ILSA.
   b. Develop a medium- to long-term funding plan for ILSA activities and ensure that sufficient funding is available to carry out all ILSA activities, particularly those essential to the technical integrity of the assessment and the utility of the results.
   c. Conduct a needs assessment of the organizational (e.g., computers, software, storage facilities, building security, communication tools) and human (e.g., specialists, translators) resources that will be needed to carry out ILSA activities.

Appendix 1: Assessment Types and Their Key Differences

Purpose
Classroom assessment: To provide immediate feedback to inform classroom instruction.
National large-scale survey: To provide feedback on the overall health of the system at particular grade/age level(s), and to monitor trends in learning.
International large-scale survey: To provide feedback on the comparative performance of the education system at particular grade/age level(s).
Exit examination: To certify students as they move from one level of the education system to the next (or into the workforce).
Entrance examination: To select students for further educational opportunities.

Frequency
Classroom assessment: Daily.
National large-scale survey: For individual subjects offered on a regular basis (such as every 3-5 years).
International large-scale survey: For individual subjects offered on a regular basis (such as every 3-5 years).
Exit examination: Annually, and more often where the system allows for repeats.
Entrance examination: Annually, and more often where the system allows for repeats.

Who is tested?
Classroom assessment: All students.
National large-scale survey: Sample or census of students at a particular grade or age level(s).
International large-scale survey: A sample of students at a particular grade or age level(s).
Exit examination: All eligible students.
Entrance examination: All eligible students.

Format
Classroom assessment: Varies from observation to questioning to paper-and-pencil tests to student performances.
National large-scale survey: Usually multiple choice and short answer.
International large-scale survey: Usually multiple choice and short answer.
Exit examination: Usually essay and multiple choice.
Entrance examination: Usually essay and multiple choice.

Coverage of curriculum
Classroom assessment: All subject areas.
National large-scale survey: Generally confined to a few subjects.
International large-scale survey: Generally confined to one or two subjects.
Exit examination: Covers main subject areas.
Entrance examination: Covers main subject areas.

Additional information collected from students?
Classroom assessment: Yes, as part of the teaching process.
National large-scale survey: Frequently.
International large-scale survey: Yes.
Exit examination: Seldom.
Entrance examination: Seldom.

Scoring
Classroom assessment: Usually informal and simple.
National large-scale survey: Varies from simple to more statistically sophisticated techniques.
International large-scale survey: Usually involves statistically sophisticated techniques.
Exit examination: Varies from simple to more statistically sophisticated techniques.
Entrance examination: Varies from simple to more statistically sophisticated techniques.

Appendix 2: Summary of the Development Levels for Each Assessment Type

Levels: LATENT (absence of, or deviation from, the attribute); EMERGING (on way to meeting minimum standard); ESTABLISHED (acceptable minimum standard); ADVANCED (best practice).

CLASSROOM ASSESSMENT
Latent: There is no system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
Emerging: There is weak system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
Established: There is sufficient system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
Advanced: There is strong system-wide institutional capacity to support and ensure the quality of classroom assessment practices.

EXAMINATIONS
Latent: There is no standardized examination in place for key decisions.
Emerging: There is a partially stable standardized examination in place, and a need to develop institutional capacity to run the examination. The examination typically is of poor quality and is perceived as unfair or corrupt.
Established: There is a stable standardized examination in place. There is institutional capacity and some limited mechanisms to monitor it. The examination is of acceptable quality and is perceived as fair for most students and free from corruption.
Advanced: There is a stable standardized examination in place, and institutional capacity and strong mechanisms to monitor it. The examination is of high quality and is perceived as fair and free from corruption.

NATIONAL (OR SYSTEM-LEVEL) LARGE-SCALE ASSESSMENT
Latent: There is no NLSA in place.
Emerging: There is an unstable NLSA in place and a need to develop institutional capacity to run the NLSA. Assessment quality and impact are weak.
Established: There is a stable NLSA in place. There is institutional capacity and some limited mechanisms to monitor it. The NLSA is of moderate quality and its information is disseminated, but not always used in effective ways.
Advanced: There is a stable NLSA in place, and institutional capacity and strong mechanisms to monitor it. The NLSA is of high quality and its information is effectively used to improve education.

INTERNATIONAL LARGE-SCALE ASSESSMENT
Latent: There is no history of participation in an ILSA nor plans to participate in one.
Emerging: Participation in an ILSA has been initiated, but there still is a need to develop institutional capacity to carry out the ILSA.
Established: There is more or less stable participation in an ILSA. There is institutional capacity to carry out the ILSA. The information from the ILSA is disseminated, but not always used in effective ways.
Advanced: There is stable participation in an ILSA and institutional capacity to run the ILSA. The information from the ILSA is effectively used to improve education.
Appendix 3: Methodology for Assigning Development Levels

1. The country team or consultant collects information about the assessment system in the country.

2. Based on the collected information, a level of development and score is assigned to each dimension in the rubrics:
   Latent = 1 score point
   Emerging = 2 score points
   Established = 3 score points
   Advanced = 4 score points

3. The score for each quality driver is computed by aggregating the scores for each of its constituent dimensions. For example: the quality driver "Enabling Context," in the case of ILSA, has 3 dimensions on which a hypothetical country receives the following scores: Dimension A = 2 points; Dimension B = 2 points; Dimension C = 3 points. The hypothetical country's overall score for this quality driver would be (2+2+3)/3 = 2.33.

4. A preliminary level of development is assigned to each quality driver.

5. The preliminary development level is validated using expert judgment in cooperation with the country team and the World Bank Task Team Leader. For scores that allow a margin of discretion (i.e., a choice between two levels of development), a final decision is made based on expert judgment. For example, the aforementioned hypothetical country has an "Enabling Context" score of 2.33, corresponding to a preliminary level of development of "Emerging or Established." Based on qualitative information not captured in the rubric, along with expert judgment, the country team chooses "Emerging" as the most appropriate level.

6. Scores for certain key dimensions under "Enabling Context" (in the case of EXAM, NLSA, and ILSA) and under "System Alignment" (in the case of CLASS) were set as ceiling scores; that is, the overall mean score for the particular assessment type cannot be greater than the score for these key dimensions. These key dimensions include formal policy, regular funding, having a permanent assessment unit, and the quality of assessment practices.

SYSTEMS APPROACH FOR BETTER EDUCATION RESULTS

Appendix 4: SABER-Student Assessment Rubrics for Cabo Verde

This appendix provides the completed SABER-Student Assessment rubrics for each type of assessment activity in Cabo Verde. In each row of the rubric, the relevant selection is indicated by a shaded cell. The selection may include a superscript number that refers to the justification or explanation for the selection. The justification text is located in the "Development-level rating justifications" section at the end of each rubric. If a row includes a superscript but no shading, insufficient information was available to determine the relevant selection in the row. In the plain-text rubric below, each indicator's selection is marked with its justification number in brackets.

Cabo Verde Classroom Assessment

Rubric for judging development level of Classroom Assessment (2014). Levels: LATENT (absence of, or deviation from, the attribute); EMERGING (on way to meeting minimum standard); ESTABLISHED (acceptable minimum standard); ADVANCED (best practice).

Curriculum/Standards
Latent: There was no official document at the system level that outlined what students were expected to learn.
Emerging: There was an official document at the system level, but it provided limited and insufficient information on what students were expected to learn. [1]
Established: There was an official document at the system level that provided sufficient, but not extensive, information on what students were expected to learn.
Advanced: There was an official document at the system level that provided extensive and comprehensive information on what students were expected to learn.
Policy Document
Latent: There was no document at the system level that provided guidelines for classroom assessment.
Emerging: There was a document at the system level that provided guidelines for classroom assessment, but it was either unofficial, not publicly available, or provided limited guidance.
Established: There was an official and publicly available document at the system level that provided sufficient, but not extensive, guidelines for classroom assessment.
Advanced: There was an official and publicly available document at the system level that provided extensive and comprehensive guidelines for classroom assessment. [2]

Resources
Latent: There were no resources available to teachers in the system for their use in classroom assessment activities.
Emerging: There were resources available to teachers in the system for their use in classroom assessment activities, but these resources were not of high quality and were limited in number or availability. [3]
Established: There were sufficient high-quality resources available to all or almost all teachers in the system for their use in classroom assessment activities.
Advanced: There was an extensive number of high-quality resources available to all or almost all teachers in the system for their use in classroom assessment activities.

Teacher Development
Latent: There were no formal mechanisms at the system level that supported the development of teachers' competencies in classroom assessment.
Emerging: There was a minimum number of formal mechanisms at the system level that supported the development of teachers' competencies in classroom assessment, or else the available formal mechanisms were not of high quality or were limited in their availability. [4]
Established: There were sufficient formal mechanisms of high quality at the system level that supported the development of teachers' competencies in classroom assessment.
Advanced: There was an extensive number of high-quality, formal mechanisms at the system level that supported the development of teachers' competencies in classroom assessment.

Quality Monitoring
Latent: There were no formal mechanisms at the system level to monitor the quality of classroom assessment practices.
Emerging: There was a minimum number of formal mechanisms at the system level to monitor the quality of classroom assessment practices.
Established: There were sufficient formal mechanisms at the system level to monitor the quality of classroom assessment practices, including inspection/supervision. [5]
Advanced: There were extensive formal mechanisms at the system level to monitor the quality of classroom assessment practices, including inspection/supervision.
Report to Stakeholders
Latent: Schools were not required to report individual student performance on classroom assessments to any stakeholders.
Emerging: Schools had minimum requirements to report individual student performance on classroom assessments.
Established: Schools were required to report individual student performance on classroom assessments to the student and their parents. [6]
Advanced: Schools were required to report individual student performance on classroom assessments to a variety of relevant stakeholders.

Report Content
Latent: Schools were not required to report individual student performance on classroom assessments in particular subject areas.
Emerging: Schools were required to report individual student performance on classroom assessments in one or two subject areas.
Established: Schools were required to report individual student performance on classroom assessments in more than two subject areas. [7]
Advanced: This option does not apply to this indicator.

Report Format
Latent: There were no requirements for schools to use specific formats for reporting individual student performance on classroom assessments.
Emerging: There were requirements for schools to use specific formats for reporting individual student performance on classroom assessments to the students and their parents, but the formats specified did not include written reports and teacher/school meetings.
Established: There were requirements for schools to use specific formats for reporting individual student performance on classroom assessments to the students and their parents, including written reports and teacher/school meetings. There also were requirements for schools to report this information to the school district, Ministry of Education, or equivalent, although the format for reporting to these entities was not specified. [8]
Advanced: There were requirements for schools to use specific formats for reporting individual student performance on classroom assessments to students, parents, and the school district, Ministry of Education, or equivalent. These included written reports and (in the case of students and parents) teacher/school meetings.

Required Uses
Latent: There were no system-level requirements for teachers to use classroom assessment information.
Emerging: Teachers were required to use classroom assessment information, albeit in a minimal number of ways.
Established: Teachers were required to use classroom assessment information in a sufficient number of ways. [9]
Advanced: Teachers were required to use classroom assessment information in extensive ways.

Selection and Certification
Latent: At the secondary level, classroom assessment information was not required as an input for certification decisions or for selection to the next level of the education system.
Emerging: This option does not apply to this indicator.
Established: At the secondary level, classroom assessment information was required to be used as an input for certification decisions or for selection to the next level of the education system. [10]
Advanced: This option does not apply to this indicator.

Positive Uses
Latent: Classroom assessment information was used in positive ways by a marginal number of teachers.
Emerging: Classroom assessment information was used in positive ways by some teachers.
Established: Classroom assessment information was used in positive ways by most teachers. [11]
Advanced: Classroom assessment information was used in positive ways by all or almost all teachers.
Poor Practices
Latent: All or almost all teachers engaged in poor classroom assessment practices.
Emerging: Many teachers engaged in poor classroom assessment practices.
Established: Only some teachers engaged in poor classroom assessment practices. [12]
Advanced: A marginal number of teachers or no teachers engaged in poor classroom assessment practices.

Classroom Assessment: Development-level rating justifications

1. There are official documents at the system level outlining what students are expected to learn in each subject and at each grade level: "Programa de Disciplina de Quimica" (1999/2000, Ministry of Education), "Programa de Disciplina de Biologia" (1999/2000, Ministry of Education), and "Programa de Disciplina de Historia" (1999/2000, Ministry of Education). These documents do not, however, specify the desired performance level for these subjects and grade levels.

2. Formal, publicly available documents provide guidelines for classroom assessment, including what should be assessed, how the information should be used, guidelines for the format of assessment questions, how the assessments should be administered, the subjects to be tested, how assessments should be corrected, and other logistics of test administration. These formal, publicly available documents are the following: Orientacoes para a Organizacao das Provas de Avaliacao Final (Ministry of Education, 2016); Sistema de Avaliacao - Ensino Secundario (Ministry of Education, 2003); and Decreto-lei 71 - O Novo Sistema Nacional da Avaliacao das Aprendizagens (Republic of Cabo Verde, 2015).

3. There are high- and medium-quality resources available to teachers for use in their classroom assessment activities. These resources are available to all or almost all teachers.
They include a document outlining what students are expected to learn in different subject or topic areas at different grade and/or age levels (high-quality, part of the national curriculum); teacher guides (medium-quality, provided each year to teachers); and scoring criteria for grading student work (high-quality, which the Ministry of Education's National Directorate of Education sends annually to teachers for the grading of final tests). Most teachers also have access to student textbooks that provide support for classroom assessment (high-quality; most students are provided with textbooks and workbooks that follow the curriculum). For more information, please see "PGN Economia 1a Chamada" and "Grelha PGN Economia 1a Chamada" (for secondary schools) and "Exemplo Matriz Prova 2o ano" and "Exemplo Criterias Classificacao Mat 2º ano" (for primary schools). Teachers do not have access to the following resources: document(s) outlining the performance level(s) that students are expected to reach in different subject or topic areas at different grade and/or age levels; item banks with examples of questions and tasks to be used for classroom assessment activities; and computer-based testing with instant reports on student performance.

4. There are two formal mechanisms at the system level to support the development of teachers' competencies in classroom assessment: pre-service teacher training and in-service teacher training. Pre-service training is mandatory and available annually, albeit perceived to be of low quality. In-service training is ad hoc and also perceived to be of low quality.
The following are not available as formal system-level mechanisms to support the development of teachers' competencies: online resources on classroom assessment (for example, modules, rubrics, criteria for scoring); opportunities to participate in conferences and workshops on classroom assessment; and opportunities to participate in the development or scoring of test questions for large-scale assessments or examinations.

5. Classroom assessment is a required component of school inspections, teacher supervision, and teachers' performance evaluations. An external moderation system is also in place. At least twice per trimester, summative assessments are administered to all students. Final summative assessments are also administered to all students at the end of the school year. These summative assessments are developed at the regional level (known as the "conselho") in coordination with a group of teachers of the same subject/grade level from that conselho. The teachers meet regularly to determine their planning schedule, develop the summative and final assessments, and establish the grading and scoring criteria used within the conselho. Teachers are regularly supervised and evaluated as a requirement of their day-to-day work and, as such, they also undergo supervision while administering and grading/scoring summative assessments. Specifically, the school director conducts classroom observations under the direction of the General Inspectorate of Education.

6. All or almost all schools are required to report on individual student performance in a manner that is accessible to the student and the parent. Cabo Verde accomplishes this by posting grades from classroom assessments on classroom walls, where students and parents can view the results.
In addition, after each of these assessments, teachers return the tests to the students and are required to go through the test, review all questions, and correct answers with the students. Schools are not required to report results to the school district or the Ministry of Education.

7. Schools are required to report individual student performance on classroom assessment in all subjects, including language and math. Schools are required to report to the student and the student's parent or guardian.

8. Schools are required to provide written reports on individual student performance to both students and their parents/guardians. Students' classroom assessment results are not sent to the Ministry of Education; however, students' final annual grades are sent to the Ministry of Education's National Directorate of Education.

9. Teachers are required to use classroom assessment information to diagnose student learning issues, provide continuous feedback to students as a part of instruction, and evaluate students' performance. Teachers are not required to use classroom assessment information to plan the next steps in instruction.

10. Classroom assessments are key inputs for student grades. Each year, students earn an annual score based upon their scores for each trimester and their final exams. The third trimester score is weighted somewhat higher (around 40 percent) to account for the final summative assessments, which take place during the third trimester. The trimester scores are based largely on classroom assessments. Officially, trimester scores are supposed to be based upon classroom assessments (about 80 percent of the trimester grade) and other elements of assessment (about 20 percent of the trimester grade), which include attendance, participation, group work, and individual work.
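As a purely illustrative sketch of these official weighting rules, the calculation can be written out as follows. The student scores below are hypothetical, and the equal 30/30 split between the first two trimesters is an assumption; the source specifies only that the third trimester carries roughly 40 percent of the annual score.

```latex
% Trimester score: ~80% classroom assessments (CA), ~20% other elements (OE)
T_i = 0.8\,CA_i + 0.2\,OE_i

% Annual score: third trimester weighted ~40% (30/30/40 split assumed)
A = 0.3\,T_1 + 0.3\,T_2 + 0.4\,T_3

% Hypothetical student: T_1 = 12,\; T_2 = 14,\; CA_3 = 13,\; OE_3 = 15
T_3 = 0.8(13) + 0.2(15) = 13.4
A = 0.3(12) + 0.3(14) + 0.4(13.4) = 13.16 \geq 10 \;\Rightarrow\; \text{student passes}
```

Since, as noted below, the "other elements" are rarely applied in practice, the annual score in this sketch would effectively reduce to a weighted average of classroom assessment results alone.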
In practice, however, very few teachers use or apply these other elements of assessment, and there are no specific standards or guidelines concerning how to incorporate them into trimester scores. Consequently, trimester scores (and thus annual scores) are largely determined by classroom assessments. Under this scoring system, classroom assessments help measure whether students have successfully completed primary and secondary school. At the end of the 8th grade, before transitioning to secondary school, students must have an annual score of at least 10 out of 20, which, as described above, is largely based upon classroom assessment results. At the end of secondary school (the 12th grade), students must have an annual score of at least 10 out of 20 to graduate. Classroom assessments are not, however, used as selection mechanisms for entering higher levels of education.

11. All or almost all teachers use classroom assessment to evaluate student performance, most teachers use classroom assessment to diagnose student learning issues, and some teachers use classroom assessment to provide continuous feedback to students as part of instruction and to plan the next steps in instruction.

12. Most teachers use and apply the multiple trimester summative assessments correctly. Teachers receive sufficient guidelines to support the development of assessments and to ensure the assessments are aligned with the curriculum. (Summative assessments are developed by teachers at the regional level; teachers from the same region convene to develop, implement, and grade the assessments.) There are no problems with teachers administering the summative tests throughout the year as required. Teachers also receive sufficient guidelines to support the development of other assessments throughout the school year, ensuring that the assessments a teacher develops for his or her class are aligned with the curriculum. Two areas merit attention.
First, classroom assessments are intended to include "other elements of assessment," including individual work, group work, attendance, and participation. In practice, however, few teachers apply or use these other elements of assessment, in part because there are no standards in place for how to measure them or include them in the overall calculation of classroom assessments. Second, major discrepancies between the 2010 6th grade Aferida test results and the 2010 6th grade classroom assessments have called into question the accuracy of both, especially the 2010 6th grade classroom assessment results. Students passed the classroom assessments at greater rates than the Aferida, but there is little evidence as to why this happened. It is important to understand why the two tests produced different results, because they are directly linked to 6th grade graduation rates.

CABO VERDE Examinations - Provas Concelhias

Development levels: LATENT = absence of, or deviation from, the attribute; EMERGING = on the way to meeting the minimum standard; ESTABLISHED = acceptable minimum standard; ADVANCED = best practice.

Program Stability
- Latent: No examination program existed at the system level.
- Emerging: An examination program existed at the system level, but it was not sufficiently stable.1
- Established: A stable examination program had been in place for several years.
- Advanced: A stable examination program had been in place for 10 years or more.

Clarity of Purpose
- Latent: There were no policy-mandated purposes of the examination.
- Emerging: The examination had clear policy-mandated purposes, but these did not include student certification or selection.2
- Established: The examination had clear policy-mandated purposes that included student certification, selection, or both.
- Advanced: This option does not apply to this indicator.
Policy Document
- Latent: No policy document authorized the examination program.
- Emerging: An informal/draft policy document authorized the examination program.
- Established: A formal/official policy document authorized the examination program, but the document was not available to the general public.
- Advanced: A formal/official policy document authorized the examination program and was available to the general public.3

Program Guidelines
- Latent: No official document provided guidelines for the examination program.
- Emerging: An official document provided guidelines for the examination program, but it was missing some key guidelines.4
- Established: An official document provided key guidelines for the examination program.
- Advanced: This option does not apply to this indicator.

Stability of Organization
- Latent: There was no unit with primary responsibility for running the examination program.
- Emerging: There was a unit(s) with primary responsibility for running the examination program, but the unit(s) was temporary or had been in place for less than 5 years.5
- Established: There was a permanent unit(s) with primary responsibility for running the examination program that had been in place for 5 or more years.
- Advanced: This option does not apply to this indicator.
Accountability of Organization
- Latent: There was no unit with primary responsibility for running the examination program, or else the unit responsible was not accountable to a clearly recognized body.
- Emerging: The unit(s) with primary responsibility for running the examination program was accountable to a clearly recognized body within the examination unit.6
- Established: The unit(s) with primary responsibility for running the examination program was accountable to a clearly recognized body within the same institution as the examination unit.
- Advanced: The unit(s) with primary responsibility for running the examination program was accountable to a clearly recognized external body.

Organization Resources
- Latent: The examination unit did not have the appropriate resources.
- Emerging: The examination unit had some of the appropriate resources.
- Established: The examination unit had most of the appropriate resources.
- Advanced: The examination unit had all of the appropriate resources.7

Qualifications of Staff
- Latent: There were no individuals responsible for completing key examination activities.
- Emerging: Some of the individuals responsible for completing key examination activities had the relevant qualifications.
- Established: Most of the individuals responsible for completing key examination activities had the relevant qualifications.
- Advanced: All or almost all of the individuals responsible for completing key examination activities had the relevant qualifications.8

Effectiveness of Staff
- Latent: There were no individuals responsible for completing key examination activities.
- Emerging: The responsible individuals completed key examination activities, but there were significant issues in how these activities were completed.
- Established: The responsible individuals completed key examination activities, with only some issues in how these activities were completed.
- Advanced: The responsible individuals completed key examination activities and there were no issues in how these activities were completed.9

Source of Funding
- Latent: There was no funding available for examination activities.
- Emerging: The source of funding for the majority of examination activities was loans, credits, grants, or equivalent.10
- Established: The source of funding for the majority of examination activities was the government's internal funding sources or student fees.
- Advanced: This option does not apply to this indicator.

Activities Funded
- Latent: There was no funding available for examination activities.
- Emerging: Funding was not sufficient to cover all core examination activities.11
- Established: Funding was sufficient to cover all core examination activities.
- Advanced: This option does not apply to this indicator.

Staff/Teacher Opportunity to Learn
- Latent: There were no opportunities to learn about the examination.
- Emerging: Opportunities to learn about the examination were minimal, or not of high quality, or did not benefit all key stakeholder groups.12
- Established: There were sufficient high-quality opportunities to learn about the examination that were available to key stakeholder groups.
- Advanced: Opportunities to learn about the examination were extensive, of high quality, and benefited key stakeholder groups.

Teacher Participation
- Latent: Teachers did not perform examination-related tasks.
- Emerging: Teachers performed a minimal number of examination-related tasks.
- Established: Teachers performed a sufficient number of examination-related tasks.13
- Advanced: Teachers performed an extensive number of examination-related tasks.
Measuring What is Intended
- Latent: It was not clear what the examination was intended to measure.
- Emerging: There was weak alignment between the examination and what it was meant to measure, or there was no regular review process in place to verify that alignment existed.14
- Established: The examination measured official learning standards or curriculum, and officially-mandated reviews to verify this alignment took place during most examination rounds.
- Advanced: The examination measured official learning standards or curriculum, and officially-mandated reviews to verify this alignment took place during all or almost all examination rounds.

Alignment with Other Assessments
- Latent: The examination was poorly aligned with other types of assessment activities in the system.
- Emerging: The examination was somewhat aligned with other types of assessment activities in the system.
- Established: The examination was very aligned with other types of assessment activities in the system.15
- Advanced: This option does not apply to this indicator.

Availability of Preparation Materials
- Latent: There were no materials available to students to prepare for the examination.16
- Emerging: Materials to prepare for the examination were available to some or a marginal number of students.
- Established: Materials to prepare for the examination were available to most students.
- Advanced: Materials to prepare for the examination were available to all or almost all students.

Quality of Preparation Materials
- Latent: There were no materials available to students to prepare for the examination.17
- Emerging: Minimal material was available to students to prepare for the examination, or the material available was not of high quality.
- Established: Sufficient and high-quality material was available to students to prepare for the examination.
- Advanced: Extensive and high-quality material was available to students to prepare for the examination.

Reasons for Not Taking the Examination
- Latent: All or almost all individuals could not take the examination due to one or more non-examination-relevant reason(s).
- Emerging: Most or some individuals could not take the examination due to one or more non-examination-relevant reason(s).
- Established: There were no non-examination-relevant reasons that prevented individuals from taking the examination.18
- Advanced: This option does not apply to this indicator.

Quality Assurance
- Latent: No formal procedures were in place to ensure the quality of the examination.
- Emerging: Formal procedures to ensure the quality of the examination were minimal in nature or not required.19
- Established: Formal procedures to ensure the quality of the examination were sufficient in nature and required.
- Advanced: Formal procedures to ensure the quality of the examination were extensive in nature and required.

Standardization
- Latent: The examination was not standardized at the system level.20
- Emerging: The examination was partially standardized at the system level, or minimal or no procedures were in place to ensure standardization.
- Established: The examination was fully standardized at the system level, and sufficient procedures were in place to ensure standardization.
- Advanced: The examination was fully standardized at the system level, and extensive procedures were in place to ensure standardization.

Quality Processes
- Latent: Many errors or delays in activities took place that affected the examination to a great extent.21
- Emerging: Errors or delays in activities affected the examination to a significant level.
- Established: Any errors or delays in activities had only a minimal effect on the examination.
- Advanced: Errors or delays in activities did not affect the examination.
Inappropriate Behavior
- Latent: Inappropriate behavior compromised the credibility of the examination to a great extent.
- Emerging: Inappropriate behavior took place and compromised the credibility of the examination somewhat.
- Established: Inappropriate behavior was low and did not compromise the credibility of the examination.
- Advanced: Inappropriate behavior, if any, was marginal, and did not compromise the credibility of the examination.22

Credibility of Results
- Latent: The results of the examination were perceived as credible by very few stakeholder groups.
- Emerging: The results of the examination were perceived as credible by some stakeholder groups.
- Established: The results of the examination were perceived as credible by most stakeholder groups.
- Advanced: The results of the examination were perceived as credible by all or almost all stakeholder groups.23

Confidentiality of Results
- Latent: There was no official policy to keep student results confidential, and student results were not kept confidential in practice.24
- Emerging: Confidentiality of student results was partially accomplished.
- Established: There was an official policy to keep student results confidential, and student results were kept confidential in practice.
- Advanced: This option does not apply to this indicator.

Official Recognition of Results
- Latent: Examination results were not officially recognized by educational institutions or employers in other countries.25
- Emerging: This option does not apply to this indicator.
- Established: Examination results were officially recognized by educational institutions or employers in other countries.
- Advanced: This option does not apply to this indicator.
Post-Examination Options for Students
- Latent: No options were available to students after they had taken the examination.
- Emerging: Minimal options were available to students after they had taken the examination.
- Established: Sufficient options were available to students after they had taken the examination.26
- Advanced: Extensive options were available to students after they had taken the examination.

Methods and Procedures Documentation
- Latent: There was no documentation on the methods and procedures used during the examination.27
- Emerging: There was minimal documentation on the methods and procedures used during the examination, or the documentation that existed was not public.
- Established: There was sufficient and public documentation on the methods and procedures used during the examination.
- Advanced: There was extensive and public documentation on the methods and procedures used during the examination.

Impact Monitoring
- Latent: No mechanisms were in place to monitor the impact of the examination.28
- Emerging: Minimal mechanisms were in place to monitor the consequences of the examination, or the mechanisms took place only some or a few examination rounds.
- Established: Sufficient mechanisms were in place to monitor the impact of the examination, and the mechanisms took place all or almost all examination rounds.
- Advanced: Extensive mechanisms were in place to monitor the impact of the examination, and the mechanisms took place all or almost all examination rounds.

Readiness to Start an Examination Program
- Latent: The system was weakly prepared to start an examination program in the future.
- Emerging: The system was somewhat prepared to start an examination program in the future.
- Established: The system was well prepared to start an examination program in the future.
- Advanced: This option does not apply to this indicator.29

Examinations: Development-level rating justifications

1. The Provas Concelhias is an examination administered once per year at the end of each education cycle, in the 2nd, 4th, 6th, and 8th grades. (In higher grades, there is an examination called the Prova Geral Interna, which is administered at the end of each year of secondary education.) Students take the Provas Concelhias in each key subject area, including language, math, and science. Additional areas tested vary across education cycles: in 6th grade, the Provas Concelhias also tests social sciences, and in 8th grade, it also tests French, English, history and geography of Cabo Verde, physics, and chemistry. Students' Provas Concelhias results are incorporated into their third trimester score, which is combined with the first two trimester scores to calculate students' final scores for the school year, their "annual scores." Each year, the Provas Concelhias is offered in two "waves." The first wave is mandatory for all students, but if students miss the first wave, they have the opportunity to take the test during the second wave. Students who do not take the Provas Concelhias (that is, who miss both waves) receive a zero on the exam, which is incorporated into their final grade for the school year. The Provas Concelhias is developed, organized, administered, and corrected, and its results published, at the regional level by "Delegacoes Concelhias," which are groups of schools arranged by region and proximity. (Each school in Cabo Verde belongs to a Delegacao Concelhia.) To accomplish these tasks, each Delegacao Concelhia leader convenes teams made up of teachers and school directors within the Delegacao Concelhia. Together, they develop, organize, administer, and correct the examination and publish its results.
The dates for the exam are set at the national level by the Ministry of Education's Directorate of National Education. The Directorate of National Education is also responsible for developing the general orientation of the examination (such as establishing the examination date, duration, and the materials permitted to be used on the examination), general information on the structure of the examination (including the subjects to be tested in each grade level, the number of test questions/items, and the breakdown of multiple-choice versus open-ended questions), and validation of the examination. (It is not clear, however, what this validation entails.) Information on when this examination was administered for the first time is not available.

2. The policy-mandated purpose of the Provas Concelhias is to serve as an input for third trimester grades, which factor into students' annual grades. Students' annual grades, awarded at the end of the school year, determine whether students may pass to the next grade level. Policy-mandated purposes of the exam do not include the following: certifying student completion of primary or secondary education; selecting students into tracks as they pursue further education (for example, academic or vocational); selecting students into tertiary education; monitoring education quality; holding the government, schools, or teachers accountable; evaluating interventions aimed at improving student learning; and informing policy and pedagogy.

3. The Sistema de Avaliacao - Ensino Basico is a system-level policy document that authorized the Provas Concelhias. This document was authorized by the Ministry of Education in 2003. It is a formal/official document and is available to the general public. The document is currently undergoing changes in accordance with the 2015 Regulation of the New Assessment System.

4.
An official document, Orientacoes para a Organizacao das Provas de Avaliacao Final, provided guidelines for the examination program. These guidelines addressed the governance of the examination, what should be assessed, who should be assessed, the frequency of the examination administration, and how results should be communicated to stakeholders. The document did not offer guidelines on how assessment results should be used.

5. There was no official unit with primary responsibility for running the examination program; however, people from different departments work together to run the examination. In the Nucleo de Ensino Basico (the Primary School Department within the Ministry of Education), one person serves as a focal point who leads the work and organization of the Provas Concelhias within the Ministry of Education. This person works with other departments within the Ministry of Education, particularly the Nucleo de Gestao Escolar e Orientacao (School Management Department), to provide general administrative guidelines, overarching guidelines on test structure, and validation of the tests. (It is not clear, however, what this validation entails.) At the local level, in the Delegacoes Concelhias, teachers and school directors develop the Provas Concelhias examinations to align with what has been taught during the year. The Delegacoes Concelhias teams are in charge of elaborating the exam, sending the exams to the Ministry of Education for validation, organizing the exam's administration, correcting the exam, publishing the exam's results, and treating/cleaning the exam's final results.

6. There is no official examination unit; however, individuals within the Ministry of Education and at the local level work together on the examination.

7.
Individuals responsible for examination activities had all of the appropriate resources, including computers for all technical staff, software, building security, storage facilities, computer servers, and communication tools.

8. Individuals responsible for the examination consisted of permanent staff from different units, all or almost all of whom had relevant qualifications, and teachers completing examination activities as part of their job responsibilities, most of whom had relevant qualifications.

9. There were individuals responsible for completing key examination activities, specifically permanent staff within the Ministry of Education (which does not have an official examination unit) and teachers completing examination activities as part of their job responsibilities. There were no significant issues with their effectiveness, including in the organization, design, administration, or grading of the examination, or in disseminating the results.

10. There was funding available for examination activities; however, no information is available about the source of funding, the amount of funding, what the funding covers, or the regularity of funding.

11. There was funding available for examination activities; however, no information is available about the source of funding, the amount of funding, what the funding covers, or the regularity of funding.

12. There were opportunities to learn about the examination in pre-service training. Teachers receive general training in assessment, which addresses the Provas Concelhias. In addition, internal workshops on assessment are occasionally organized within the Ministry of Education.

13.
Teachers perform a sufficient number of examination-related tasks, including selecting and/or creating the examination questions, items, or tasks; selecting and/or creating examination scoring guides; administering the examination; scoring the examination; and supervising the examination process. Teachers do not, however, act as judges (for example, in orals) or resolve inconsistencies between examination scores and school grades (moderation).

14. The examination is intended to measure the official curriculum. There are no official or officially-mandated reviews in place. In practice, whether the examination is aligned with the official curriculum can vary, given that it is aligned with what teachers taught during the school year, and what they taught may or may not have aligned with the official curriculum. Although the Ministry of Education is responsible for validating the examination, this validation does not seem to entail ensuring that the examination is aligned with the official curriculum. (In fact, there is little clarity about what exactly the validation process entails.)

15. The Provas Concelhias are very aligned with classroom assessments, specifically the trimester summative assessments given throughout the education system.

16. There were no materials available to students to prepare for the examination; Cabo Verde considers the Provas Concelhias to be a regular part of its classroom assessment system and does not provide special materials for students to prepare for this exam.

17. There were no materials available to students to prepare for the examination; Cabo Verde considers the Provas Concelhias to be a regular part of its classroom assessment system and does not provide special materials for students to prepare for this exam.

18. There were no non-examination-relevant reasons that prevented individuals from taking the examination; all students can participate in the Provas Concelhias.

19.
A standardized manual for examination administrators, called the Orientations for the End of the Year Assessment, was the only formal quality assurance procedure in place to ensure the quality of the examination. Quality assurance procedures that were not in place include: proctors/administrators trained according to protocol; questions, items, or tasks piloted before the official examination was administered; all booklets numbered; double data scoring; scorers trained to ensure high inter-rater reliability; double processing of data; and external or internal observers.

20. The examination was not standardized at the system level. Assessment design, administration, and scoring varied across students in the same examination round. This was because the examination was standardized at the regional level rather than at the national level. Thus, students within the same region experienced similar conditions, but students across regions did not necessarily do so. The following procedures were not in place to ensure the standardization of the examination:
- Examination papers and questions, items, or tasks were not the same or equivalent for all students;
- Examination administrators were not trained to ensure that all students took the examination under the same conditions;
- Quality control monitors and/or observers were not used to ensure the same administration conditions in all locations where the examination was administered;
- The same scoring criteria were not used to correct the examination questions, items, or tasks; and
- Examination results were not computed using the same procedures for all students.

21. There is not enough information to rate this indicator, because the Provas Concelhias are implemented at the regional level and differ across regions.
As mentioned above, the Ministry of Education (and, specifically, the Directorate of National Education) is responsible for developing administrative guidelines for the examination, general guidelines for the examination structure, and validation of the examinations. (It is unclear, however, what exactly this validation entails.) At the regional level, the Delegacoes Concelhias are in charge of exam elaboration, validation, organization, publication of results, and the processing/cleaning of final results.
22. Inappropriate behavior did not take place during the examination round.
23. The results of the examination were perceived to be credible by all or almost all stakeholder groups.
24. Student names and results were supposed to be (and were, in practice) publicly accessible. Students' grades were posted in the school lobby. Keeping student results confidential is neither an official policy nor implemented in practice.
25. Examination results were not officially recognized by educational institutions or employers in other countries.
26. If students did not pass the examination, they were not required to leave the education system. Instead, students could attend remedial education and then take a "second chance" test called the Provas de Recurso, which is available to students whose overall, final score for the school year is lower than 10 but greater than 7. (A score of 10 is needed to progress to the next grade, and a score below 7 means that the student must retake the subject in which they scored below a 7.) If a student is eligible for the Provas de Recurso, the second chance test, they also have the right to take two weeks of remedial classes, organized by the school, to prepare for the Provas de Recurso. After the student takes the Prova de Recurso, their end-of-year score is recalculated using the score from the Prova de Recurso.
Students could also repeat the grade. The following options were not available to students after they took the examination and received their results:
- Students could not apply to tertiary education institutions;
- Students could not apply to secondary education institutions;
- Students could not retake the exact same examination; and
- Students could not take courses in preparation for the Provas Concelhias.
27. There was no documentation that specified the methods and procedures used during the examination round. Specifically, the following information on methods and procedures was not documented:
- Test specifications;
- Construction of questions, items, or tasks;
- Pilot testing of questions, items, or tasks;
- Analysis of piloted questions, items, or tasks;
- Test assembly;
- Marking or scoring of open-ended questions, items, tasks, or essays;
- Scoring of examination questions, items, or tasks;
- Reliability;
- Scaling; and
- Setting cutoff scores.
28. There are no mechanisms in place to monitor the impact of the examination. Specifically, the following mechanisms to monitor the impact of the examination were not in place:
- Oversight committee(s);
- Expert review groups;
- Funding for independent research on the examination;
- Studies (for example, predictive validity) on the examination; and
- Focus groups or surveys of key stakeholders.
29. This indicator does not apply to this rubric.
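The progression and second-chance rules described in justification 26 above can be sketched as a small worked example. This is an illustrative sketch only, not an official algorithm: the function and variable names are mine, scores are on the 0-20 scale used in Cabo Verde, and the 40%/60% recalculation weighting is the one the report describes for the Prova de Recurso.

```python
# Illustrative sketch of the progression rules described in the report
# (not an official algorithm; names and the boundary handling at exactly
# 7 are my assumptions, since the report only states "greater than 7"
# and "below 7").

def progression_outcome(annual_score: float) -> str:
    """Classify a student's end-of-year outcome for a subject."""
    if annual_score >= 10:
        return "advance"            # 10 or more: progress to the next grade
    elif annual_score > 7:
        return "provas_de_recurso"  # between 7 and 10: eligible for the second-chance test
    else:
        return "retake_subject"     # below 7: must retake the subject

def recalculated_score(annual_score: float, recurso_score: float) -> float:
    """Recalculate the annual grade after the Prova de Recurso, using the
    40%/60% weighting the report describes."""
    return 0.4 * annual_score + 0.6 * recurso_score
```

For example, a student with an annual score of 9 who then scores 12 on the Prova de Recurso would end the year with 0.4 x 9 + 0.6 x 12 = 10.8, enough to progress.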
CABO VERDE Examinations - Prova Geral Nacional

The rubric below rates each indicator at one of four development levels: LATENT (absence of, or deviation from, the attribute), EMERGING (on the way to meeting the minimum standard), ESTABLISHED (acceptable minimum standard), and ADVANCED (best practice). Bracketed numbers refer to the rating justifications that follow.

Program Stability
- Latent: No examination program existed at the system level.
- Emerging: An examination program existed at the system level, but it was not sufficiently stable.
- Established: A stable examination program had been in place for several years.
- Advanced: A stable examination program had been in place for 10 years or more. [1]

Clarity of Purpose
- Latent: There were no policy-mandated purposes of the examination.
- Emerging: The examination had clear policy-mandated purposes, but these did not include student certification or selection.
- Established: The examination had clear policy-mandated purposes that included student certification, selection, or both. [2]
- Advanced: This option does not apply to this indicator.

Policy Document
- Latent: No policy document authorized the examination program.
- Emerging: An informal/draft policy document authorized the examination program.
- Established: A formal/official policy document authorized the examination program, but the document was not available to the general public.
- Advanced: A formal/official policy document authorized the examination program and was available to the general public. [3]

Program Guidelines
- Latent: No official document provided guidelines for the examination program.
- Emerging: An official document provided guidelines for the examination program, but it was missing some key guidelines. [4]
- Established: An official document provided key guidelines for the examination program.
- Advanced: This option does not apply to this indicator.

Stability of Organization
- Latent: There was no unit with primary responsibility for running the examination program.
- Emerging: There was a unit(s) with primary responsibility for running the examination program, but the unit(s) was temporary or had been in place for less than 5 years.
- Established: There was a permanent unit(s) with primary responsibility for running the examination program that had been in place for 5 or more years. [5]
- Advanced: This option does not apply to this indicator.

Accountability of Organization
- Latent: There was no unit with primary responsibility for running the examination program, or else the unit responsible was not accountable to a clearly recognized body.
- Emerging: The unit(s) with primary responsibility for running the examination program was accountable to a clearly recognized body within the examination unit.
- Established: The unit(s) with primary responsibility for running the examination program was accountable to a clearly recognized body within the same institution as the examination unit. [6]
- Advanced: The unit(s) with primary responsibility for running the examination program was accountable to a clearly recognized external body.

Organization Resources
- Latent: The examination unit did not have the appropriate resources.
- Emerging: The examination unit had some of the appropriate resources.
- Established: The examination unit had most of the appropriate resources.
- Advanced: The examination unit had all of the appropriate resources. [7]

Qualifications of Staff
- Latent: There were no individuals responsible for completing key examination activities.
- Emerging: Some of the individuals responsible for completing key examination activities had the relevant qualifications.
- Established: Most of the individuals responsible for completing key examination activities had the relevant qualifications.
- Advanced: All or almost all of the individuals responsible for completing key examination activities had the relevant qualifications. [8]

Effectiveness of Staff
- Latent: There were no individuals responsible for completing key examination activities.
- Emerging: The responsible individuals completed key examination activities, but there were significant issues in how these activities were completed.
- Established: The responsible individuals completed key examination activities, with only some issues in how these activities were completed.
- Advanced: The responsible individuals completed key examination activities and there were no issues in how these activities were completed. [9]

Source of Funding
- Latent: There was no funding available for examination activities.
- Emerging: The source of funding for the majority of examination activities was loans, credits, grants or equivalent. [10]
- Established: The source of funding for the majority of examination activities was the government's internal funding sources or student fees.
- Advanced: This option does not apply to this indicator.

Activities Funded
- Latent: There was no funding available for examination activities.
- Emerging: Funding was not sufficient to cover all core examination activities. [11]
- Established: Funding was sufficient to cover all core examination activities.
- Advanced: This option does not apply to this indicator.

Staff/Teacher Opportunity to Learn
- Latent: There were no opportunities to learn about the examination.
- Emerging: Opportunities to learn about the examination were minimal, or not of high quality, or did not benefit all key stakeholder groups. [12]
- Established: There were sufficient high-quality opportunities to learn about the examination that were available to key stakeholder groups.
- Advanced: Opportunities to learn about the examination were extensive, of high quality, and benefited key stakeholder groups.

Teacher Participation
- Latent: Teachers did not perform examination-related tasks.
- Emerging: Teachers performed a minimal number of examination-related tasks.
- Established: Teachers performed a sufficient number of examination-related tasks. [13]
- Advanced: Teachers performed an extensive number of examination-related tasks.

Measuring What is Intended
- Latent: It was not clear what the examination was intended to measure.
- Emerging: There was weak alignment between the examination and what it was meant to measure, or there was no regular review process in place to verify that alignment existed. [14]
- Established: The examination measured official learning standards or curriculum, and officially-mandated reviews to verify this alignment took place during most examination rounds.
- Advanced: The examination measured official learning standards or curriculum, and officially-mandated reviews to verify this alignment took place during all or almost all examination rounds.

Alignment with Other Assessments
- Latent: The examination was poorly aligned with other types of assessment activities in the system.
- Emerging: The examination was somewhat aligned with other types of assessment activities in the system.
- Established: The examination was very aligned with other types of assessment activities in the system. [15]
- Advanced: This option does not apply to this indicator.

Availability of Preparation Materials
- Latent: There were no materials available to students to prepare for the examination.
- Emerging: Materials to prepare for the examination were available to some or a marginal number of students.
- Established: Materials to prepare for the examination were available to most students.
- Advanced: Materials to prepare for the examination were available to all or almost all students. [16]

Quality of Preparation Materials
- Latent: There were no materials available to students to prepare for the examination.
- Emerging: Minimal material was available to students to prepare for the examination, or the material available was not of high quality. [17]
- Established: Sufficient and high-quality material was available to students to prepare for the examination.
- Advanced: Extensive and high-quality material was available to students to prepare for the examination.

Reasons for Not Taking the Examination
- Latent: All or almost all individuals could not take the examination due to one or more non-examination-relevant reason(s).
- Emerging: Most or some individuals could not take the examination due to one or more non-examination-relevant reason(s).
- Established: There were no non-examination-relevant reasons that prevented individuals from taking the examination. [18]
- Advanced: This option does not apply to this indicator.

Quality Assurance
- Latent: No formal procedures were in place to ensure the quality of the examination.
- Emerging: Formal procedures to ensure the quality of the examination were minimal in nature or not required. [19]
- Established: Formal procedures to ensure the quality of the examination were sufficient in nature and required.
- Advanced: Formal procedures to ensure the quality of the examination were extensive in nature and required.

Standardization
- Latent: The examination was not standardized at the system level.
- Emerging: The examination was partially standardized at the system level, or minimal or no procedures were in place to ensure standardization.
- Established: The examination was fully standardized at the system level, and sufficient procedures were in place to ensure standardization. [20]
- Advanced: The examination was fully standardized at the system level, and extensive procedures were in place to ensure standardization.

Quality Processes
- Latent: Many errors or delays in activities took place that affected the examination to a great extent.
- Emerging: Errors or delays in activities affected the examination to a significant level.
- Established: Any errors or delays in activities had only a minimal effect on the examination. [21]
- Advanced: Errors or delays in activities did not affect the examination.

Inappropriate Behavior
- Latent: Inappropriate behavior compromised the credibility of the examination to a great extent.
- Emerging: Inappropriate behavior took place and compromised the credibility of the examination somewhat.
- Established: Inappropriate behavior was low and did not compromise the credibility of the examination.
- Advanced: Inappropriate behavior, if any, was marginal, and did not compromise the credibility of the examination. [22]

Credibility of Results
- Latent: The results of the examination were perceived as credible by very few stakeholder groups.
- Emerging: The results of the examination were perceived as credible by some stakeholder groups.
- Established: The results of the examination were perceived as credible by most stakeholder groups.
- Advanced: The results of the examination were perceived as credible by all or almost all stakeholder groups. [23]

Confidentiality of Results
- Latent: There was no official policy to keep student results confidential, and student results were not kept confidential in practice. [24]
- Emerging: Confidentiality of student results was partially accomplished.
- Established: There was an official policy to keep student results confidential, and student results were kept confidential in practice.
- Advanced: This option does not apply to this indicator.

Official Recognition of Results
- Latent: Examination results were not officially recognized by educational institutions or employers in other countries. [25]
- Emerging: This option does not apply to this indicator.
- Established: Examination results were officially recognized by educational institutions or employers in other countries.
- Advanced: This option does not apply to this indicator.

Post-Examination Options for Students
- Latent: No options were available to students after they had taken the examination.
- Emerging: Minimal options were available to students after they had taken the examination.
- Established: Sufficient options were available to students after they had taken the examination. [26]
- Advanced: Extensive options were available to students after they had taken the examination.

Methods and Procedures Documentation
- Latent: There was no documentation on the methods and procedures used during the examination. [27]
- Emerging: There was minimal documentation on the methods and procedures used during the examination, or the documentation that existed was not public.
- Established: There was sufficient and public documentation on the methods and procedures used during the examination.
- Advanced: There was extensive and public documentation on the methods and procedures used during the examination.

Impact Monitoring
- Latent: No mechanisms were in place to monitor the impact of the examination. [28]
- Emerging: Minimal mechanisms were in place to monitor the consequences of the examination, or the mechanisms took place only some or a few examination rounds.
- Established: Sufficient mechanisms were in place to monitor the impact of the examination and the mechanisms took place all or almost all examination rounds.
- Advanced: Extensive mechanisms were in place to monitor the impact of the examination and the mechanisms took place all or almost all examination rounds.

Readiness to Start an Examination Program
- Latent: The system was weakly prepared to start an examination program in the future.
- Emerging: The system was somewhat prepared to start an examination program in the future.
- Established: The system was well prepared to start an examination program in the future.
- Advanced: This option does not apply to this indicator. [29]

Examinations: Development-level rating justifications
1. The Prova Geral Nacional examination (PGN, the General National Test) was first administered in 2003 and has been administered once per year ever since.
2. The policy-mandated purposes of the examination are to (1) certify student completion of secondary education, and (2) make pass/fail decisions in transitioning students from one grade to another.
3. Sistema de Avaliacao-Ensino Secundario is a system-level policy document authorized by the Ministry of Education in 2003. This document is formal/official and is available to the general public.
4. An official document--Orientacoes para a Organizacao das Provas de Avaliacao Final--provides guidelines for the examination, including guidelines on the governance of the examination, what should be assessed, who should be assessed, the frequency of the examination administration, and how results should be communicated to stakeholders.
It does not provide guidelines on how results should be used, on funding for the examination, on confidentiality of results, or on how results can be accessed by stakeholders.
5. Within the Ministry of Education's Nucleo de Ensino Secundario Geral e Tecnico (the Technical and General Secondary Education Unit), there is one person who leads a team to organize and coordinate the PGN. This person works with members of other departments within the Ministry of Education, particularly the Nucleo de Gestao Escolar e Orientacao (the unit for School Management and Guidance). This inter-unit team has primary responsibility for running the PGN examination program. Although this team is not exclusively dedicated to the PGN, it is permanent. The team has overseen all examination rounds since 2003.
6. There is an inter-unit team led by an individual in the Nucleo de Ensino Secundario Geral e Tecnico (the Technical and General Secondary Education Unit, part of the Ministry of Education) which holds primary responsibility for running the PGN. This team is accountable to the Ministry of Education. It is not accountable to an external board or committee that is institutionally separate from the team in charge of the examination.
7. The examination unit had all of the appropriate resources: computers for all technical staff, software (such as statistical packages), building security, storage facilities, computer servers, and communications tools (phone, internet, email).
8. Two types of individuals were responsible for completing key examination activities: (1) permanent staff from different units within the Ministry of Education (including the Nucleo de Ensino Secundario Geral e Tecnico--the Technical and General Secondary Education Unit--and the Nucleo de Gestao Escolar e Orientacoes--the unit for School Management and Guidance), and (2) secondary school teachers completing examination activities as part of their job responsibilities.
Almost all permanent staff of the different Ministry of Education units were qualified, and most of the teachers had relevant qualifications as well. Secondary school teachers are responsible for proposing exam content to the Ministry of Education on an annual basis. The Ministry of Education is responsible for developing the final PGN and setting the exam times and dates. Once the exam has been developed, the teachers are responsible for administering the exam, correcting the exam, and reporting exam results.
9. For both groups of individuals responsible for completing key examination activities (permanent staff in the Ministry of Education and teachers completing examination activities as part of their job responsibilities), there were no significant issues with their effectiveness, nor issues that affected the quality of specific examination activities or the overall quality of the examination.
10. There was funding available for examination activities; however, there is insufficient evidence to describe the funding's source or allocation. (It is likely that the source of funding for the majority of examination activities is the government's internal funding sources and/or student fees.)
11. There was funding available for examination activities; however, there is insufficient evidence to describe the funding's allocation or sufficiency.
12. Opportunities to learn about the PGN were available through organized internal workshops for Ministry of Education staff. These workshops, which were of medium quality, discussed the content and skills measured by the examination. In addition, there is peer-to-peer learning among teachers (addressing administering the PGN, grading, and interpretation of scores), because the PGN is a well-established part of the education system. (This peer-to-peer learning, however, is not systematic.)
The following opportunities were not available for learning about the PGN:
- University graduate programs (master's or doctoral level) on student assessment that include topics relevant to the examination (for example, test design, administration);
- University courses and/or workshops on the content and skills measured by the examination (for example, courses on curriculum);
- University courses and/or workshops on examination topics other than the content and skills measured by the examination (for example, test design, reporting);
- Non-university courses and/or workshops on examination topics other than the content and skills measured by the examination (for example, test design, reporting);
- Funding for attending international programs, courses, and workshops on student assessment that cover topics relevant to the examination;
- Internships and/or short-term employment in the unit running the examination; and
- Presentations about the examination (for example, presentations on test design, administration).
In addition to including more of the opportunities listed above, it is recommended that these opportunities extend to sufficient beneficiaries, including full-time staff on the Ministry of Education's examination team, primary school teachers, and secondary school teachers.
13. Teachers performed a sufficient number of examination-related tasks, including: creating examination questions, items, or tasks; administering the examination; and scoring the examination. Specifically, teachers send proposals to the Ministry of Education's Directorate of Education outlining items to include in each subject area. (The Directorate of Education is in charge of primary and secondary education, including assessment.) The Ministry of Education's Directorate of Education determines and creates the final PGN and its scoring guide.
Teachers are responsible for administering the PGN and scoring the results under the supervision of the Ministry of Education's General Inspectorate department, which is in charge of evaluating, controlling, and supervising the overall functioning of the education system. After teachers score the PGN, students' results are incorporated into their trimester and annual scores. The actual test is not returned to the students, but an answer guide is posted in the school lobby. Examination tasks that are not performed by teachers include: acting as judges (for example, in orals); supervising examination procedures; and resolving inconsistencies between examination scores and school grades (moderation).
14. The examination is intended to measure the official curriculum. In practice, however, the PGN is only somewhat aligned with the official curriculum, because the examination is based on the content that teachers propose and not necessarily on the national curriculum. Secondary schools submit proposals to the Ministry of Education's Directorate of Education, outlining items they would like to include on the PGN. Subsequently, the Directorate of National Education drafts the exam. There are officially-mandated, external, regular reviews of the PGN, which have taken place during almost all examination rounds. The Ministry of Education's Directorate of Education hires an external consultant to conduct a final review and verification of the PGN. The external consultant conducts this review after the Directorate of Education receives proposals from the secondary schools and develops a common PGN. After the consultant reviews and verifies the draft PGN, the Directorate of National Education prepares the final PGN and sends it to the secondary schools to administer. This external review of the exam is systematic and takes place every year.
15. Classroom assessment activities are very aligned with the PGN.
Specifically, the PGN is similar to the type of trimester and end-of-year summative assessments given throughout the entire education system.
16. Materials available to prepare for the examination were: (1) the official framework document, which explained what was measured on the examination; and (2) examples of the types of questions on the examination, provided by the examination unit and/or authority overseeing the examination. These materials were available in school to all or almost all students. Specifically, each April, all schools publicly display the "reference matrix," which outlines the content and objectives for each subject to be covered by the PGN, when the PGN will take place, and the materials required for the PGN. In addition, teachers use past PGNs to help students prepare for the examination.
17. Materials available to prepare for the examination were: (1) the official framework document, which was of medium quality; and (2) examples of the types of questions on the examination, which were of high quality.
18. All students can participate in the PGN. There are no non-examination-relevant reasons that prevent individuals from taking the examination. However, students who scored above a grade of 14 out of 20 on their cycle score do not have to take the PGN. (The cycle score is the total 11th grade score plus the total 12th grade score.)
19. There were minimal quality assurance procedures in place to ensure the quality of the examination: there was a standardized manual for examination administrators, required for every examination round, and there were internal observers present at some schools. The internal observation was conducted by the General Inspectorate of Education, which goes to some schools during the PGN to observe the administration of the assessment.
Although observers are not present in every classroom, PGN protocol requires that each classroom in which the PGN is administered be supervised by two teachers. Quality assurance procedures that were not in place include: proctors/administrators trained according to protocol; questions, items, or tasks piloted before the official examination was administered; numbered booklets; double data scoring; training of scorers to ensure high inter-rater reliability; and double processing of data.
20. The examination was fully standardized at the system level. The PGN is designed by the Ministry of Education's Directorate of Education, with input from secondary school teachers. The Directorate of Education provides the PGN to all schools. All students take the PGN on the same day and at the same time. All schools use the same formula to calculate the students' score on the exam. Moreover, sufficient procedures were in place to ensure the standardization of the examination: (1) examination papers and questions, items, and tasks were the same for all students; (2) the same scoring criteria were used to correct the examination questions, items, and tasks; (3) examination results were computed using the same procedures for all students; and (4) examination results were reported to all students in the same way.
21. Errors in the printing of test booklets have somewhat affected the examination. Specifically, there are sometimes problems with the printing or formatting of the questions on the PGN such that material cannot be read easily. As a result, the PGN is re-sent to the national level for correction. This has sometimes caused administrative delays but never affected the exam timing; the corrections were all made in advance of the PGN administration and never caused schools to postpone the administration of the PGN to a different day. All schools administered the tests on the same day and using the same examination booklets.
22.
Inappropriate behavior did not take place during the examination round.
23. The results of the PGN examination were perceived to be credible by almost all stakeholder groups.
24. There was no official policy to keep student results confidential and, in practice, student results were not kept confidential. Student names and results were supposed to be and, in practice, were publicly accessible. Specifically, students' grades on the PGN are posted in the school lobby.
25. Examination results were not officially recognized by educational institutions or employers in other countries.
26. There were sufficient options for students after they had taken the examination and received their results: students could ask for remedial education, or students could repeat the grade. Students who failed the PGN or did not reach a minimum level or score did not have to leave the education system. The specific guidelines are as follows:
- Students with an annual score of at least a 10 (out of 20) at the end of the year may graduate from whatever grade they are in. If students have an annual score of above a 10 at the end of 12th grade, they may graduate from secondary school (9th-12th grade) and apply to higher education institutions.
- Secondary school (9th-12th grade) students whose annual scores are between 7 and 10 are eligible to take a "second chance" test called the "Provas de Recurso."
- Students whose annual scores are below 7 must retake the subject(s) in which they scored below a 7.
- Students do not have the option to retake the PGN examination itself.
- The Prova de Recurso for 9th-11th grade levels is developed at the school level. The Provas de Recurso for 12th grade are developed by the Directorate of Education, and students can only take the Prova de Recurso in three subjects at a time.
All students have the right to take Remedial classes for two weeks to prepare for the Provas de Recurso. Remedial classes are organized by the school. After students take the Prova de Recurso, the students' annual grade in that subject is calculated as 40% of the student’s overall grade in that subject and 60% of the student's grade in the Prova de Recurso. 27. There was no document that specified the methods and procedures used during the examination round. 28. There are no mechanisms in place to monitor the impact of the examination. 29. This indicator does not apply to this rubric. 52 SYSTEMS APPROACH FOR BETTER EDUCATION RESULTS CABO VERDE ǀ SABER-STUDENT ASSESSMENT SABER COUNTRY REPORT |2017 CABO VERDE National (or System-Level) Large-Scale Assessment (NLSA) 53 SYSTEMS APPROACH FOR BETTER EDUCATION RESULTS CABO VERDE ǀ SABER-STUDENT ASSESSMENT SABER COUNTRY REPORT |2017 LATENT EMERGING ESTABLISHED ADVANCED Indicator Absence of, or deviation On way to meeting minimum Acceptable minimum Best practice from, the attribute standard standard Program Stability No NLSA program existed at the An NLSA program existed at the A stable NLSA program had been A stable NLSA program had been system level. system level, but it was not in place for several years. in place for 10 years or more. sufficiently stable.1 Clarity of Purpose There were no policy-mandated The NLSA had clear policy- The NLSA had clear policy- This option does not apply to this purposes of the NLSA. mandated purposes, but these mandated purposes that indicator. did not include informing policy included informing policy or or pedagogy. pedagogy.2 Policy Document No policy document authorized An informal/draft policy A formal/official policy document A formal/official policy document 3 the NLSA program. document authorized the NLSA authorized the NLSA program, authorized the NLSA program program. but the document was not and was available to the general available to the general public. public. 
Program Guidelines
  LATENT: No official document provided guidelines for the NLSA program.
  EMERGING: An official document provided guidelines for the NLSA program, but it was missing some key guidelines. [4]
  ESTABLISHED: An official document provided key guidelines for the NLSA program.
  ADVANCED: This option does not apply to this indicator.

Stability of Organization
  LATENT: There was no unit with primary responsibility for running the NLSA program.
  EMERGING: There was a unit(s) with primary responsibility for running the NLSA program, but the unit(s) was temporary or had been in place for less than 5 years. [5]
  ESTABLISHED: There was a permanent unit(s) with primary responsibility for running the NLSA program that had been in place for 5 or more years.
  ADVANCED: This option does not apply to this indicator.

Accountability of Organization
  LATENT: There was no unit with primary responsibility for running the NLSA program, or else the unit responsible was not accountable to a clearly recognized body. [6]
  EMERGING: The unit(s) with primary responsibility for running the NLSA program was accountable to a clearly recognized body within the NLSA unit.
  ESTABLISHED: The unit(s) with primary responsibility for running the NLSA program was accountable to a clearly recognized body within the same institution as the NLSA unit.
  ADVANCED: The unit(s) with primary responsibility for running the NLSA program was accountable to a clearly recognized external body.

Source of Funding
  LATENT: There was no funding available for NLSA activities.
  EMERGING: The source of funding for the majority of NLSA activities was loans, credits, grants or equivalent.
  ESTABLISHED: The source of funding for the majority of NLSA activities was the government's internal funding sources. [7]
  ADVANCED: This option does not apply to this indicator.

Activities Funded
  LATENT: There was no funding available for NLSA activities.
  EMERGING: Funding was not sufficient to cover all core NLSA activities. [8]
  ESTABLISHED: Funding was sufficient to cover all core NLSA activities.
  ADVANCED: This option does not apply to this indicator.

Organization Resources
  LATENT: The NLSA unit did not have the appropriate resources.
  EMERGING: The NLSA unit had some of the appropriate resources.
  ESTABLISHED: The NLSA unit had most of the appropriate resources. [9]
  ADVANCED: The NLSA unit had all of the appropriate resources.

Qualifications of Staff
  LATENT: There were no individuals responsible for completing key NLSA activities.
  EMERGING: Some of the individuals responsible for completing key NLSA activities had the relevant qualifications. [10]
  ESTABLISHED: Most of the individuals responsible for completing key NLSA activities had the relevant qualifications.
  ADVANCED: All or almost all of the individuals responsible for completing key NLSA activities had the relevant qualifications.

Effectiveness of Staff
  LATENT: There were no individuals responsible for completing key NLSA activities.
  EMERGING: The responsible individuals completed key NLSA activities, but there were significant issues in how these activities were completed. [11]
  ESTABLISHED: The responsible individuals completed key NLSA activities, with only some issues in how these activities were completed.
  ADVANCED: The responsible individuals completed key NLSA activities and there were no issues in how these activities were completed.

Staff/Teacher Opportunity to Learn
  LATENT: There were no opportunities to learn about the NLSA. [12]
  EMERGING: Opportunities to learn about the NLSA were minimal, or not of high quality, or did not benefit all key stakeholder groups.
  ESTABLISHED: There were sufficient high-quality opportunities to learn about the NLSA that were available to key stakeholder groups.
  ADVANCED: Opportunities to learn about the NLSA were extensive, of high quality, and benefited key stakeholder groups.

Measuring What is Intended
  LATENT: It was not clear what the NLSA was intended to measure.
  EMERGING: There was weak alignment between the NLSA and what it was meant to measure, or there was no regular review process in place to verify that alignment existed. [13]
  ESTABLISHED: The NLSA measured official learning standards or curriculum, and officially-mandated reviews to verify this alignment took place during most NLSA rounds.
  ADVANCED: The NLSA measured official learning standards or curriculum, and officially-mandated reviews to verify this alignment took place during all or almost all NLSA rounds.

Alignment with Other Assessments
  LATENT: The NLSA was poorly aligned with other types of assessment activities in the system.
  EMERGING: The NLSA was somewhat aligned with other types of assessment activities in the system.
  ESTABLISHED: The NLSA was very aligned with other types of assessment activities in the system. [14]
  ADVANCED: This option does not apply to this indicator.

Opportunities for Students to be Exposed to Content and Skills
  LATENT: Students did not have opportunities to be exposed to the content and skills measured by the NLSA.
  EMERGING: Students had limited opportunities to be exposed to the content and skills measured by the NLSA.
  ESTABLISHED: Students had sufficient opportunities to be exposed to the content and skills measured by the NLSA.
  ADVANCED: Students had many opportunities to be exposed to the content and skills measured by the NLSA. [15]

Preparatory Information for Schools
  LATENT: Official information on the NLSA was not made available to schools in the system.
  EMERGING: A minimal amount of official information on the NLSA was made available to schools in the system, although not necessarily all schools.
  ESTABLISHED: A sufficient amount of official information on the NLSA was made available to most or almost all schools in the system. [16]
  ADVANCED: An extensive amount of official information on the NLSA was made available to all or almost all schools in the system.

Quality Assurance
  LATENT: No formal procedures were in place to ensure the quality of the NLSA.
  EMERGING: Formal procedures to ensure the quality of the NLSA were minimal in nature or not required. [17]
  ESTABLISHED: Formal procedures to ensure the quality of the NLSA were sufficient in nature and required.
  ADVANCED: Formal procedures to ensure the quality of the NLSA were extensive in nature and required.

Standardization
  LATENT: The NLSA was not standardized at the system level.
  EMERGING: The NLSA was partially standardized at the system level, or minimal or no procedures were in place to ensure standardization. [18]
  ESTABLISHED: The NLSA was fully standardized at the system level, and sufficient procedures were in place to ensure standardization.
  ADVANCED: The NLSA was fully standardized at the system level, and extensive procedures were in place to ensure standardization.

Representativeness
  LATENT: A non-random sample or a convenience sample of students participated in the NLSA.
  EMERGING: A random sample of students that was not representative at the country level participated in the NLSA.
  ESTABLISHED: All students in public schools, or a representative sample of students in public schools, participated in the NLSA. [19]
  ADVANCED: All students in public and private schools, or a representative sample of students in public and private schools, participated in the NLSA.

Reasons for Not Taking the NLSA
  LATENT: All or almost all individuals could not take the NLSA due to one or more non-assessment-relevant reason(s).
  EMERGING: Most or some individuals could not take the NLSA due to one or more non-assessment-relevant reason(s).
  ESTABLISHED: There were no non-assessment-relevant reasons that prevented individuals from taking the NLSA. [20]
  ADVANCED: This option does not apply to this indicator.

Quality Processes
  LATENT: Many errors or delays in activities took place that affected the NLSA to a great extent. [21]
  EMERGING: Errors or delays in activities affected the NLSA to a significant level.
  ESTABLISHED: Any errors or delays in activities had only a minimal effect on the NLSA.
  ADVANCED: Errors or delays in activities did not affect the NLSA.

Inappropriate Behavior
  LATENT: Inappropriate behavior compromised the credibility of the NLSA to a great extent.
  EMERGING: Inappropriate behavior took place and compromised the credibility of the NLSA somewhat.
  ESTABLISHED: Inappropriate behavior was low and did not compromise the credibility of the NLSA.
  ADVANCED: Inappropriate behavior, if any, was marginal, and did not compromise the credibility of the NLSA. [22]

Methods and Procedures Documentation
  LATENT: There was no documentation on the methods and procedures used during the NLSA. [23]
  EMERGING: There was minimal documentation on the methods and procedures used during the NLSA, or the documentation that existed was not public.
  ESTABLISHED: There was sufficient and public documentation on the methods and procedures used during the NLSA.
  ADVANCED: There was extensive and public documentation on the methods and procedures used during the NLSA.

Publication of Results
  LATENT: NLSA results were not published. [24]
  EMERGING: Limited information on the NLSA results was published, or the results were published using a minimum number of dissemination mechanisms.
  ESTABLISHED: Sufficient information on the NLSA results was published using an array of dissemination mechanisms.
  ADVANCED: Comprehensive information on the NLSA results was published using an array of dissemination mechanisms.

Credibility of Results
  LATENT: The results of the NLSA were perceived as credible by very few stakeholder groups. [25]
  EMERGING: The results of the NLSA were perceived as credible by some stakeholder groups.
  ESTABLISHED: The results of the NLSA were perceived as credible by most stakeholder groups.
  ADVANCED: The results of the NLSA were perceived as credible by all or almost all stakeholder groups.

Impact Monitoring
  LATENT: No mechanisms were in place to monitor the impact of the NLSA. [26]
  EMERGING: Minimal mechanisms were in place to monitor the consequences of the NLSA, or the mechanisms took place only some or a few NLSA rounds.
  ESTABLISHED: Sufficient mechanisms were in place to monitor the impact of the NLSA and the mechanisms took place all or almost all NLSA rounds.
  ADVANCED: Extensive mechanisms were in place to monitor the impact of the NLSA and the mechanisms took place all or almost all NLSA rounds.

Readiness to Start an NLSA Program [27]
  LATENT: The system was weakly prepared to start an NLSA program in the future.
  EMERGING: The system was somewhat prepared to start an NLSA program in the future.
  ESTABLISHED: The system was well prepared to start an NLSA program in the future.
  ADVANCED: This option does not apply to this indicator.
National (or System-Level) Large-Scale Assessment (NLSA): Development-level rating justifications

1. Cabo Verde has a system-level NLSA program, the Prova de Aferida, which was administered in 2010 and 2014. Both rounds assessed Portuguese and math. The 2010 NLSA was administered to sixth graders, and the 2014 NLSA was administered to second, fourth, and sixth graders.
2. The policy-mandated purposes of the NLSA are monitoring education quality, informing policy, and informing pedagogy. The following are not policy-mandated purposes: holding the government, schools, teachers, or students accountable; and evaluating interventions aimed at improving student learning.
3. There was no system-level policy document authorizing the NLSA program; however, there is a document that generally describes the purpose of an NLSA without authorizing any specific program. This document, Sistema de Avaliacao do Ensino Basico Decreto Lei N. 43, was authorized in 2003 by the Republic of Cabo Verde and is available to the general public.
4. "General Information Guides" and "Application Guidelines" were circulated to the schools participating in the NLSA sample. These documents provided guidelines on governance of the NLSA, what should be assessed, and who should be assessed. The guidelines did not address how assessment results should be used.
5. In 2014, a temporary "Nucleo" was formed with technicians from various departments within the Ministry of Education to help lead and implement the 2014 Aferida. This Nucleo had primary responsibility for running the NLSA program and completed one round. The unit was not held accountable to a clearly recognized body. As part of the Nucleo, a team of teachers and Ministry of Education technical staff worked together on the development of the assessment and the planning of its implementation. The Ministry of Education staff involved included employees from the Directorate of National Education, the General Inspectorate of Education, and the Department of Statistics. The Ministry of Education implemented the assessment and was responsible for scoring it. The Department of Statistics within the Ministry of Education was officially responsible for data processing. The Nucleo no longer exists.
6. The Ministry of Education had primary responsibility for running the NLSA, but it was not held accountable to a clearly recognized body.
7. Internal funding was available; however, no information is available on the amount or the source of the funding.
8. Internal funding was available; however, it is unclear which NLSA activities were funded.
9. All Ministry of Education staff responsible for NLSA activities were properly equipped with the needed resources, including appropriate computers for all technical staff, appropriate software, building security, storage facilities, computer servers, and communication tools.
10. Individuals responsible for completing key NLSA activities were: a) permanent staff from the Ministry of Education (including staff from the Directorate of National Education, the General Inspectorate of Education, and the Department of Statistics), 10-50% of whom were qualified, and b) teachers completing NLSA activities as part of their job responsibilities, of whom less than 10% were qualified.
11. As mentioned above, the individuals responsible for completing key NLSA activities were permanent staff from other Ministry of Education units, 10-50% of whom were qualified, and teachers completing NLSA activities as part of their job responsibilities, of whom less than 10% were qualified.
There were significant issues with these individuals' effectiveness, which affected the quality of specific NLSA activities and compromised the overall quality of the NLSA. Specifically, there were delays in reporting information and processing results; some remote schools that were intended to participate in the NLSA did not participate because of their locations; and some teachers did not administer the exam correctly due to confusion about the instructions they received.
12. There were no opportunities available in the system to learn about the NLSA. Specifically, there were: no university graduate programs (master's or doctoral level) on student assessment that include topics relevant to the NLSA (for example, test design, reporting); no university or non-university courses and/or workshops on the content and skills measured by the NLSA (for example, courses on curriculum); no university or non-university courses and/or workshops on NLSA topics other than the content and skills measured by the NLSA (for example, test design, reporting); no funding for attending international programs, courses, and workshops on student assessment that cover topics relevant to the NLSA; no internships and/or short-term employment in the unit running the NLSA; and no presentations about the NLSA (for example, test design, administration).
13. The NLSA is intended to measure the curriculum, and is very aligned with what it intends to measure; the NLSA is based on the official curriculum ("Programas") for each grade. In 2015, an internal review took place. However, there are no officially mandated reviews, whether internal, external, or otherwise.
14. The NLSA is very aligned with classroom assessment.
15. School educators from both public and private schools provided students with the opportunity to be exposed to the content and skills measured by the NLSA. Students were exposed to this material as part of regular course instruction at school. In both public and private schools, all or almost all students were exposed to the skills and content assessed by the NLSA.
16. Only the "General Information Guide" was available to almost all schools. This unofficial framework document provided information on the grade levels to be assessed, when the assessment would take place, the structure of the assessment and the types of questions on the NLSA, and how the NLSA would be administered at the school level. The General Information Guide did not include information on how to teach the content and skills assessed by the NLSA or examples of the criteria used for scoring open-ended and essay questions.
17. The two formal quality assurance procedures in place to ensure the quality of the NLSA were (1) a standardized manual for NLSA administrators and (2) internal observers. The standardized manual included information on the logistics of test administration, such as how to set up the classroom, the number of supervisors per classroom, how to read instructions to students, and how students should fill out the cover form (a sheet for recording student identification information such as class, school, and region). The internal observers were Ministry of Education staff, who observed the administration of the assessment to ensure that schools followed the protocols. Quality assurance procedures did not include the following: ensuring that all proctors or administrators were trained according to a protocol; piloting questions, items, or tasks before the official NLSA was administered; numbering all booklets; double scoring of data; training scorers to ensure high inter-rater reliability; double processing of data; and bringing in external observers (such as representatives of the community observing at administration sites).
18. The NLSA was partially standardized at the system level. Assessment design was consistent and overseen by the Ministry of Education, and NLSA papers and questions, items, and tasks were the same for all students. However, although all teachers received the same information and manuals, weak training led to differences in NLSA administration across the country. Quality control monitors were used to ensure consistent administration conditions, but they were not present in all locations. The following measures were not used to ensure the standardization of the NLSA: NLSA administrators were not trained to ensure that all students took the NLSA under the same conditions; the same scoring criteria were not used to correct NLSA questions, items, or tasks; NLSA results were not computed using the same procedures for all students; and NLSA results were not reported to all students in the same way.
19. The 2014 NLSA was implemented in a random sample of public schools, representative of the overall target population at the system level for the assessed grades. There was difficulty, however, in accessing some of the schools in the sample, and some schools were excluded from the Aferida at the last minute. The Aferida was not administered in private schools.
20. There were no non-NLSA-relevant reasons that prevented individuals from taking the NLSA, with the exception of the school cancellations mentioned above.
21.
The following factors affected the NLSA round to a great extent: errors in administering the NLSA, poor training of NLSA administrators, delays in scoring student responses, delays in data processing, delays in reporting results, and failure to report results. Specific examples included problems with administration (some schools in the initial sample were excluded from participation for logistical reasons) and slow processing of results (as of this report, NLSA results have not been processed and there are no formal plans to publish them).
22. Inappropriate behavior did not take place during the 2014 NLSA.
23. There was no document that specified the methods and procedures used during the 2014 NLSA.
24. NLSA results were not published.
25. The 2014 NLSA results have not been released.
26. There are no mechanisms in place to monitor the impact of the NLSA.
27. This indicator does not apply to this rubric.

CABO VERDE International Large-Scale Assessment (ILSA)

Development levels are defined as follows: LATENT (absence of, or deviation from, the attribute), EMERGING (on the way to meeting the minimum standard), ESTABLISHED (acceptable minimum standard), and ADVANCED (best practice). Bracketed numbers mark the level assigned to Cabo Verde and refer to the rating justifications that follow the rubric.

Stability of Participation
  LATENT: The system did not participate in an ILSA round in the last 10 years. [1]
  EMERGING: The system participated in an ILSA round in the last 10 years, but did not complete it.
  ESTABLISHED: The system completed one ILSA round in the last 10 years.
  ADVANCED: The system completed two or more ILSA rounds in the last 10 years.

Policy Document
  LATENT: No policy document authorized the ILSA program. [2]
  EMERGING: An informal/draft policy document authorized the ILSA program.
  ESTABLISHED: A formal/official policy document authorized the ILSA program, but the document was not available to the general public.
  ADVANCED: A formal/official policy document authorized the ILSA program and was available to the general public.

Stability of Organization
  LATENT: There was no unit with primary responsibility for running the ILSA program. [3]
  EMERGING: There was a unit(s) with primary responsibility for running the ILSA program, but the unit(s) was temporary or had been in place for less than 5 years.
  ESTABLISHED: There was a permanent unit(s) with primary responsibility for running the ILSA program that had been in place for 5 or more years.
  ADVANCED: This option does not apply to this indicator.

Accountability of Organization
  LATENT: There was no unit with primary responsibility for running the ILSA program, or else the unit responsible was not accountable to a clearly recognized body. [4]
  EMERGING: The unit(s) with primary responsibility for running the ILSA program was accountable to a clearly recognized body within the ILSA unit.
  ESTABLISHED: The unit(s) with primary responsibility for running the ILSA program was accountable to a clearly recognized body within the same institution as the ILSA unit.
  ADVANCED: The unit(s) with primary responsibility for running the ILSA program was accountable to a clearly recognized external body.

Source of Funding
  LATENT: There was no funding available for ILSA activities. [5]
  EMERGING: The source of funding for the majority of ILSA activities was loans, credits, grants or equivalent.
  ESTABLISHED: The source of funding for the majority of ILSA activities was the government's internal funding sources.
  ADVANCED: This option does not apply to this indicator.
Activities Funded
  LATENT: There was no funding available for ILSA activities. [6]
  EMERGING: Funding was not sufficient to cover all core ILSA activities.
  ESTABLISHED: Funding was sufficient to cover all core ILSA activities.
  ADVANCED: This option does not apply to this indicator.

Organization Resources
  LATENT: The ILSA unit did not have the appropriate resources. [7]
  EMERGING: The ILSA unit had some of the appropriate resources.
  ESTABLISHED: The ILSA unit had most of the appropriate resources.
  ADVANCED: The ILSA unit had all of the appropriate resources.

Qualifications of Staff
  LATENT: There were no individuals responsible for completing key ILSA activities. [8]
  EMERGING: Some of the individuals responsible for completing key ILSA activities had the relevant qualifications.
  ESTABLISHED: Most of the individuals responsible for completing key ILSA activities had the relevant qualifications.
  ADVANCED: All or almost all of the individuals responsible for completing key ILSA activities had the relevant qualifications.

Effectiveness of Staff
  LATENT: There were no individuals responsible for completing key ILSA activities. [9]
  EMERGING: The responsible individuals completed key ILSA activities, but there were significant issues in how these activities were completed.
  ESTABLISHED: The responsible individuals completed key ILSA activities, with only some issues in how these activities were completed.
  ADVANCED: The responsible individuals completed key ILSA activities and there were no issues in how these activities were completed.

Staff/Teacher Opportunity to Learn
  LATENT: There were no opportunities to learn about the ILSA. [10]
  EMERGING: Opportunities to learn about the ILSA were minimal, or not of high quality, or did not benefit all key stakeholder groups.
  ESTABLISHED: There were sufficient high-quality opportunities to learn about the ILSA that were available to key stakeholder groups.
  ADVANCED: Opportunities to learn about the ILSA were extensive, of high quality, and benefited key stakeholder groups.

Alignment with Other Assessments
  LATENT: The ILSA was poorly aligned with other types of assessment activities in the system. [11]
  EMERGING: The ILSA was somewhat aligned with other types of assessment activities in the system.
  ESTABLISHED: The ILSA was very aligned with other types of assessment activities in the system.
  ADVANCED: This option does not apply to this indicator.

Opportunities for Students to be Exposed to Content and Skills
  LATENT: Students did not have opportunities to be exposed to the content and skills measured by the ILSA. [12]
  EMERGING: Students had limited opportunities to be exposed to the content and skills measured by the ILSA.
  ESTABLISHED: Students had sufficient opportunities to be exposed to the content and skills measured by the ILSA.
  ADVANCED: Students had many opportunities to be exposed to the content and skills measured by the ILSA.

Quality Processes
  LATENT: Many errors or delays in activities took place that affected the ILSA to a great extent. [13]
  EMERGING: Errors or delays in activities affected the ILSA to a significant level.
  ESTABLISHED: Any errors or delays in activities had only a minimal effect on the ILSA.
  ADVANCED: Errors or delays in activities did not affect the ILSA.

Inappropriate Behavior
  LATENT: Inappropriate behavior compromised the credibility of the ILSA to a great extent. [14]
  EMERGING: Inappropriate behavior took place and compromised the credibility of the ILSA somewhat.
  ESTABLISHED: Inappropriate behavior was low and did not compromise the credibility of the ILSA.
  ADVANCED: Inappropriate behavior, if any, was marginal, and did not compromise the credibility of the ILSA.

Meeting Standards for Publication
  LATENT: ILSA results for the system did not meet the standards required for publication in the international report. [15]
  EMERGING: ILSA results for the system met sufficient standards to be presented beneath the main displays in the international report.
  ESTABLISHED: ILSA results for the system met all of the standards required to be presented in the main displays of the international report.
  ADVANCED: This option does not apply to this indicator.

Publication of Results
  LATENT: ILSA results were not published in the system. [16]
  EMERGING: Limited information on the ILSA results was published in the system, or the results were published using a minimum number of mechanisms.
  ESTABLISHED: Sufficient information on the ILSA results was published in the system using an array of mechanisms.
  ADVANCED: Comprehensive information on the ILSA results was published in the system using an array of mechanisms.

Credibility of Results
  LATENT: The results of the ILSA were perceived as credible by very few stakeholder groups. [17]
  EMERGING: The results of the ILSA were perceived as credible by some stakeholder groups.
  ESTABLISHED: The results of the ILSA were perceived as credible by most stakeholder groups.
  ADVANCED: The results of the ILSA were perceived as credible by all or almost all stakeholder groups.

Use of Results
  LATENT: ILSA results were not used by stakeholders in the system. [18]
  EMERGING: ILSA results were used in minimal ways by stakeholders in the system.
  ESTABLISHED: ILSA results were used in sufficient ways by stakeholders in the system.
  ADVANCED: ILSA results were used in extensive ways by stakeholders in the system.

Readiness to Participate in an ILSA
  LATENT: The system was weakly prepared to participate in an ILSA program in the future. [19]
  EMERGING: The system was somewhat prepared to participate in an ILSA program in the future.
  ESTABLISHED: The system was well prepared to participate in an ILSA program in the future.
  ADVANCED: This option does not apply to this indicator.

International Large-Scale Assessment (ILSA): Development-level rating justifications

1. The system has not participated in an ILSA in the last 10 years.
2. There is no system-level policy document that authorized participation in an ILSA program.
3. There is no unit responsible for running an ILSA program.
4. There is no unit responsible for running an ILSA program.
5. There is no funding allocated to ILSA activities.
6. There is no funding allocated to ILSA activities.
7. There is no ILSA unit.
8. N/A because there has not been an ILSA exam.
9. N/A because there has not been an ILSA exam.
10. N/A because there has not been an ILSA exam.
11. N/A because there has not been an ILSA exam.
12. N/A because there has not been an ILSA exam.
13. N/A because there has not been an ILSA exam.
14. N/A because there has not been an ILSA exam.
15. N/A because there has not been an ILSA exam.
16. N/A because there has not been an ILSA exam.
17. N/A because there has not been an ILSA exam.
18. N/A because there has not been an ILSA exam.
19. Cabo Verde has not participated in an ILSA in the past. However, the Ministry of Education has discussed the possibility of participating in an ILSA in the future. The current draft of the 2017-2021 Education Strategy, "Plano Estrategico 25 abril," broadly mentions "developing [an] external assessment model"; however, there is no official mention of participating in any specific ILSA.
While there is no unit within the Ministry of Education directly responsible for student assessment, temporary units and individuals from various departments within the Ministry have supported other assessment activities in the past. No specific opportunities are available for ILSA training; however, if Cabo Verde were to decide to move forward with an ILSA, funding could be made available for capacity building.

Acknowledgements

This report was prepared by the World Bank Cabo Verde Country Task Team (Kamel Braham, World Bank Lead Education Specialist, and Jem Heinzel Nelson, World Bank Education Consultant) and the SABER-Student Assessment Team (Marguerite Clarke, World Bank Senior Education Specialist; Julia Liberman, World Bank Operations Officer; and Tara Danica Siegel, World Bank Education Consultant). The Team is grateful for the feedback and support from the Cabo Verde Ministry of Education, in particular the Direção Nacional de Educação (National Directorate of Education), the Direção Geral de Planeamento, Orçamento e Gestão (Directorate-General for Planning, Budgeting and Management), and the Inspeção Geral da Educação (IGE) (General Inspectorate of Education).

www.worldbank.org/education/saber

The Systems Approach for Better Education Results (SABER) initiative produces comparative data and knowledge on education policies and institutions, with the aim of helping countries systematically strengthen their education systems. SABER evaluates the quality of education policies against evidence-based global standards, using new diagnostic tools and detailed policy data.
The SABER country reports give all parties with a stake in educational results (from administrators, teachers, and parents to policymakers and business people) an accessible, objective snapshot showing how well the policies of their country's education system are oriented toward ensuring that all children and youth learn. This report focuses specifically on policies in the area of student assessment.

This work is a product of the staff of The World Bank with external contributions. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.