Primary Education Survey Evaluation Report

UNICEF USSC Somalia Support Centre

Primary Education Survey Evaluation Report Somalia

Supported by the Strategic Partnership for Recovery and Development in Education in Somalia UNICEF-DFID-UNESCO

August 2008 Dr Edward Redden, Consultant


ACKNOWLEDGEMENTS

The nature of this assignment required assistance from many people involved in the education sectors of Somalia. This assistance was always willingly given and I am grateful for that support.

Minister Hassan Haji Mohmoud of Somaliland and Minister Mohamud Bile Dubbe of Puntland both gave freely of their time and provided the support of their senior officers. I would also like to express gratitude to the Directors General of the two northern Zones and the various directors whom I have had the pleasure of interviewing and who shared their experiences of the Primary Education Survey.

The regional education officers and enumerators who attended a workshop in support of this evaluation provided wonderful insight into the education processes in their regions and demonstrated a clear understanding of the challenges ahead. They were able to provide a number of very useful recommendations that have been embedded in this report.

The UNICEF officers in both Hargeisa and Garowe provided access to a range of officials within the Ministries and also provided valuable descriptions of their field experiences during implementation of the survey project. I offer my sincere thanks to these staff members. The UNICEF officers in the central south zone provided information via email due to the insecurity in their areas. Their promptness in replying to my inquiries was always much appreciated.

Members of NGOs, UN agencies and donors gave of their time both to contribute data to the evaluation process and to comment on the draft report. Their contributions were significant and valued.

Finally, I am indebted to the staff of UNICEF Somalia in Nairobi, who provided the necessary infrastructure support and technical and cultural guidance. Woki Munyui, Maulid Warfa and Catherine Remmelzwaal also provided critical comment on early drafts of the Report that contributed significantly to the development of the evaluation.

Dr Edward Redden 25/8/08



ACRONYMS

AIR      Apparent Intake Rate
ECD      Early Childhood Development
ECCE     Early Childhood Care and Education
EFA      Education for All
EMIS     Education Management Information System
EU       European Union
FTI      Fast Track Initiative
GER      Gross Enrolment Ratio
G1ECD    Percentage of Grade 1 students who have attended Early Childhood Education
MDG      Millennium Development Goals
MoE      Ministry of Education
NER      Net Enrolment Ratio
NIR      Net Intake Rate
NGO      Non-Governmental Organisation
PAE      Primary Alternative Education
PES      Primary Education Survey
REO      Regional Education Officer
SPSS     Statistical Package for the Social Sciences
UNESCO   United Nations Educational, Scientific and Cultural Organisation
UNICEF   United Nations Children’s Fund



CONTENTS

EXECUTIVE SUMMARY
THE PRIMARY EDUCATION SURVEY
EVALUATION OBJECTIVES
CONCEPTUAL FRAMEWORK
RESEARCH QUESTIONS AND METHODOLOGY
RESULTS
   A. EFFICIENCY
      1 Validity
      2 Training
      3 Enumerators and Supervisors
      4 Timing
      5 Data Duplication
   B. EFFECTIVENESS
      1 Utility - Current and Potential
      2 EMIS Tools
      3 Ownership
   C. RELEVANCE AND APPROPRIATENESS
      1 Data Required for an EMIS
      2 Reporting
   D. IMPACT
   E. SUSTAINABILITY
   F. COVERAGE
   G. COORDINATION
   H. CONCLUSION: FUTURE STEPS TOWARDS AN EMIS
REFERENCES
APPENDICES
   Appendix 1 Summary of Recommendations
   Appendix 2 Terms of Reference
   Appendix 3 EMIS Conceptual Framework
   Appendix 4 Research Questions and Data Collection Planner
   Appendix 5 List of Interviewees
   Appendix 6 EMIS Development Planning Matrix
   Appendix 7 Evidence Based Planning
   Appendix 8 Glossary



EXECUTIVE SUMMARY

1. The Primary Education Survey represents a significant achievement for the education sector of Somalia. The cooperation between the three Ministries of Education, UNICEF, DFID and UNESCO demonstrated in the instigation and continued development of the Primary Education Survey (PES) process reflects a commitment and dedication to the children of Somalia.

2. The PES reports are well respected within the education sector and the data is widely used in a range of ways by agencies active in the sector. The MoEs have expressed interest in taking greater control over the process and in expanding and improving planning processes using evidence-based approaches.

3. The census approach to the Survey should be continued and developed over time into a full EMIS. The EMIS that is aspired to is "an organized group of information and documentation services that collects, stores, processes, analyses and disseminates information for educational planning and management." As such it is more than a data storage system. Rather, it integrates system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and adaptability, and efficiency are maximized.

4. Views received from the field indicate that the validity of the data is quite sound, although some recommendations have been made to improve data collection processes. These are based upon developing an audit trail so that greater accountability can be achieved.

5. Enumerators and supervisors should be assessed during training, and only those demonstrating competency in the required skills and appropriate ethical values should be employed. Additionally, an appraisal of field performance should be undertaken following the data collection and cleaning process.

6. The training for enumerators and supervisors should be expanded to include a number of new training modules and methodologies. The training should allow the demonstration of competence and include a greater emphasis on understanding the technical skills being developed.

7. Enumerators and supervisors should be subject to a transparent selection system conducted jointly by implementing organizations. Rotation of staff supervision arrangements across regions would improve supervision objectivity.

8. When an effective quality assurance process is in place, responsibility for data collection should be shifted to REOs and head teachers following adequate training.

9. The timing of the data collection should be shifted to the earliest practical time in the school year; late September or early October should be considered. This will allow some data reports to be available to the MoEs, REOs and school communities during December, and will help avoid the current need for data duplication. As the system develops there will be a need for supplementary data to be gathered on a term or semester basis.

10. EMIS tools are only marginally useful under current practice, and they do not seem to be linked to the PES data collection process. They should be developed as documentary evidence for the data being collected, and hence contribute to the validity of the system by allowing systematic checking of summary data. They will also contribute to the supplementary data collection process when it is developed. More training in and supervision of the tools' use, and simplification of the tools, are recommended.

11. The plan in 1997 was to develop the PES processes and transfer responsibility for implementation to the Ministries. Apart from the data collection process, this has not been achieved.


12. Capacity building needs to be seen as more than training. There are at least three components required: training, experience and ongoing support. A model needs to be built to ensure these three components are in place as MoEs accept responsibility for additional components of the PES process.

13. The current PES reporting process does not make full use of the flexibility offered by the data being stored in an Access database. It also targets a very small range of stakeholders, which results in REOs and school communities being unable to use the data it provides.

14. At a minimum, the following set of reports should be developed and distributed:
a. A school report that should be provided within 3 months of data collection.
b. District reports that reflect education conditions within the district generally and allow schools and CECs to benchmark themselves, and district officers to determine priorities.
c. Regional reports that reflect education conditions across regions and allow for comparison of progress between regions.
d. Zone and Somalia compact reports that provide a summary of major EFA indicators and trends disaggregated by region, gender and rural/urban.
e. A set of thematic reports that would change from year to year and would provide deeper analysis of EMIS data on issues of specific relevance.
f. The capacity to interrogate the database using the query module of Access, which should exist at Zone level and would be available for generating specialized reports for stakeholders.
g. A software-based front end that will allow exploration of the database by non-technical people.
Some suggestions for detailed reporting are included in Appendix 7 of the report.

15. A set of steps toward the development of a full EMIS is outlined, with a time frame of at least 5 years indicated. The development of the EMIS is seen as a transition process and is based on using current PES processes as a starting point. Some key elements of the plan are listed here:
a. An EMIS unit with adequate staffing levels, including technical assistance, needs to be established. The MoEs must demonstrate commitment to the process by allocating some resources to the unit, with a plan to increase this allocation as resources become available. The salary and conditions of employment must be such that staff are retained as their capacity is developed.
b. A set of nine skill areas required for EMIS implementation is identified that provides a basis for a capacity development plan.
c. A set of EMIS components is outlined that would be developed within each Ministry sequentially. An immediate start with data cleaning and entry by MoEs is recommended.
d. With the ongoing support of technical assistance, a rolling plan should be developed for EMIS development, subject to an annual review that will focus on redefining objectives following identification of the previous year's progress.
e. The expansion of data is recommended in line with EFA and MDG reporting requirements and in regard to local contextual needs. A three-stage data expansion model is suggested. This is reproduced below under the headings of Immediate, Next phase and Aspirational:



Immediate
• Enrolment by age, grade and gender
• Drop-out and repetition
• Secondary school data
• Condition of buildings / rehabilitation needs

Next phase
• Background ECD of Grade 1 enrolments
• Non-formal education
• Qur’anic schools / early childhood
• Textbook numbers
• Emergency preparedness
• Outcome / student achievement data

Aspirational
• Child-friendliness of schools
• Teaching and learning resources

f. Data from the EMIS should systematically be integrated into mapping software. Such a synthesis will also facilitate cross-cutting views by allowing assessments to include education conditions alongside emergency, WASH and health conditions. The DEVINFO format of reporting can also be supported. (A minimal export sketch is given after this list.)

g. A number of possible threats to sustainability have been listed. Careful planning will be required to reduce the perceived risk of these threats.
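As a concrete illustration of point 15(f), the sketch below converts exported PES school records into a GeoJSON point layer that most mapping packages can read. It is an illustration only, assuming the school table has been exported to CSV; the file name and column names (school_code, latitude, longitude, region, enrolment_total) are invented for the example and are not the actual PES export format.

```python
import csv
import json

def schools_to_geojson(csv_path, geojson_path):
    """Convert exported PES school records into a GeoJSON point layer for mapping software."""
    features = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                lon, lat = float(row["longitude"]), float(row["latitude"])
            except (KeyError, ValueError):
                continue  # skip schools without usable GPS coordinates
            features.append({
                "type": "Feature",
                "geometry": {"type": "Point", "coordinates": [lon, lat]},
                "properties": {
                    "school_code": row.get("school_code", ""),
                    "region": row.get("region", ""),
                    "enrolment_total": row.get("enrolment_total", ""),
                },
            })
    with open(geojson_path, "w", encoding="utf-8") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)

# Example use (file names are illustrative):
# schools_to_geojson("pes_schools.csv", "pes_schools.geojson")
```

A layer produced in this way can then be overlaid with emergency, WASH or health layers in the same mapping tool, supporting the cross-cutting views described above.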



THE PRIMARY EDUCATION SURVEY

The Primary Education Survey (PES) has been conducted almost every year for the last nine years. However, no formal evaluation of the Survey has ever been carried out. The original purpose of the survey was to fill a critical data gap in the education sector and to meet the information needs of the Ministries of Education (MoEs), national and international development stakeholders and donors, with an independent, reliable and routine source of data for trend analysis and programme planning. The PES has evolved over the years, with additional data elements being added to meet the growing needs of education stakeholders.

From the beginning, UNICEF clearly stated its intention to build the capacity of MoE officials so that the MoEs could take full responsibility for the design and management of the PES. The PES was in many ways only meant to be an interim step on the road to establishing a fully operational Education Management Information System (EMIS). (PES Evaluation TOR 2008) The ultimate goal of building such an effective EMIS is the establishment of a reputable system and expertly trained education officials at central, regional and district levels to ensure the efficient flow of data, in order to facilitate the monitoring of service provision as well as providing critical information to educational planners and policy-makers.

After nine years, the Primary Education Survey remains the most up-to-date and reliable source of data on the status of education in Somalia. However, for the PES to maintain credibility and to ensure the validity of the data collected, an evaluation of the current and past processes involved in this important routine education survey, as well as an assessment of its utility levels, is considered necessary. The evaluation also examines the relationship between the need for a sustainable EMIS and the future role of the PES.

The Strategic Partnership in Education (DFID-UNICEF-UNESCO) intends to fund the next annual education survey. However, prior to doing so, it is deemed important to evaluate the current annual process of data collection and data dissemination. Such an evaluation is fully justified in view of the fact that without a fully functional and efficient management information system, it will be difficult to bring about any reform in basic education in Somalia. To date, the PES has played a crucial role in maintaining the availability of a reliable data source. How the effectiveness of this role can be strengthened, and its relationship to the establishment of a broader EMIS, will depend on the outcomes and recommendations of this independent evaluation.

EVALUATION OBJECTIVES

The scope of the proposed evaluation covers the two routine annual surveys of primary education in Somalia. These are known as the Primary Formal Education (PFE) and the Primary Alternative Education (PAE) surveys. Together they are referred to as the Primary Education Survey (PES). The evaluation focuses on the main aspects of the surveys, i.e. data collection and management as well as information dissemination and utility. In addition, it looks at the performance of the agency in supporting this recurring activity and the collaboration with its partners. The overall objectives of the evaluation are:

• To assess the current and past design, planning and management of the key processes and methods that have been employed to conduct the PES since its inception
• To assess the degree to which the PES meets the real data needs/analysis of the range of education stakeholders in Somalia
• To assess the role of the PES in relation to the nascent EMIS and review the current linkages
• To assess the need for expanding the breadth and depth of the information gathered and, in particular, to review the idea of integrating the collection of secondary school data
• To assess the mechanisms for ensuring internal validity and reliability of the data presented in the PES


• To assess the degree to which training of MoE officials in data collection and management has been successfully and appropriately implemented
• To assess the utility level of the PES in Somalia, in particular the benefits of its results and outputs to the intended users
• To identify best practice, innovative interventions and shortcomings in the current process of planning and implementing the annual survey
• To make recommendations and suggestions on possible improvements related to all aspects of the routine survey, including, if necessary, ways of increasing levels of participation in, ownership of and utility of the survey/data by the MoE authorities and other stakeholders
• To make recommendations with respect to the evolving relationship between the current/future PES and the nascent EMIS

CONCEPTUAL FRAMEWORK

The PES as currently conceived has many of the classic attributes of a survey, in that it is a set of data collected by independent enumerators at a single point in time. The data is aggregated centrally, subjected to some analysis and placed in a single report for distribution. The implementation of the survey on an annual basis has allowed trend analysis to be undertaken. However, these annual implementations are a set of independent events, not an annual updating of a central database. Importantly, the PES has not used the common strategy of sampling from the population; rather, it is a census that collects data from all schools in the targeted population. This element, along with the centralised aggregation of data, is a feature that the PES has in common with an EMIS.

However, an EMIS has some important features that distinguish it from a survey. A full description of an EMIS is set out in Appendix 3. Here, a brief extract from that appendix is presented to define what is understood to be an EMIS in this report.

To clarify what an EMIS is, and how it can assist in developing an education infrastructure, the definition and conceptual framework of an EMIS have been adapted from the work undertaken by UNESCO (1998, Bangkok). This conceptualization defines an EMIS as "an organized group of information and documentation services that collects, stores, processes, analyses and disseminates information for educational planning and management" (p. 2). As such it is more than a data storage system. Rather, it integrates system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and adaptability, and efficiency are maximized. The elements that need to be integrated are the:

• Needs of data producers and users
• Data
• Information handling
• Storage of data
• Retrieval of data
• Data analysis
• Computer and manual procedures
• Networking among EMIS centers (UNESCO 1998, p. 4)

An EMIS also has a dynamic process orientation that focuses on the two-way flow of information, so that all levels of the organization can make effective use of the data and of the reports generated from it. This flow of information needs to be efficient, avoiding duplication of effort; accurate, incorporating validation procedures and adequate training of personnel at all levels; and responsive to the needs of all stakeholders in the system. This complex interaction of components is represented in Figure 1, which has been adapted from the UNESCO (1998) report to reflect some aspects of the Somalia context. These ideas are more fully developed in Appendix 3.



[Figure 1: Information Flow (adapted from UNESCO 1998). The diagram links three elements: EMIS data (students, curriculum, personnel, finance, facilities, other); levels of decision making (national, zones, regions, districts, and villages/schools, covering early childhood, primary, high school, PAE, NFE, and nomadic and Qur’anic schools); and information products (reports, plans, products, data/statistics bulletins, directories, physical plant and facilities design, budget estimates, and others).]

RESEARCH QUESTIONS AND METHODOLOGY

These objectives will be addressed by considering a set of 40 research questions that have been organised in seven areas. The detailed questions are set out in Appendix 4. The evaluation will explore the following areas in relation to the survey:

• Efficiency
• Effectiveness
• Relevance / Appropriateness
• Impact
• Sustainability
• Coverage
• Coordination

These are explored by using the detailed research questions to structure interview schedules for a wide variety of interviewees.


Study Components

The principal source of data will be interviews with respective stakeholders. Respondents will include MoE officials at central, regional and district levels, NGO officials, head teachers, teachers, survey enumerators, supervisors, and UNICEF/UNESCO staff at Nairobi and zonal level, as well as local and international NGOs. The methods followed would be largely qualitative and include key informant semi-structured interviews, community level interviews and focus group discussions. Questionnaires for guiding the interviews will be developed for review in the inception phase. (PES Evaluation TOR 2008)

The task of this methodology section is to coordinate these process components with the seven areas identified above. The seven areas are further clarified and elaborated in Appendix 4 through the 40 research questions mapped onto them. A research matrix was developed that linked relevant data sources (stakeholders) with the research questions; this appendix is referred to as the Data Collection Planner. The matrix was used to ensure that each research question had an associated data source to inform it, to identify people who could reasonably be expected to have relevant knowledge and experience to inform each research question, and to guide the development of the interview guides that would assist the researcher in generating conversation in the semi-structured interview settings. These processes were validated through consultation with a range of stakeholders who have personal and detailed knowledge of the context of the research.

The Data Collection Planner was used to develop seven interview guides. These guides targeted UNICEF officers, Directors General of Education, planning officers in each Ministry, Regional and District Education Officers, school and community leaders, survey team members, and NGOs/international organisations. In addition, questions that could be informed by the desk review were identified. Appendix 5 indicates the set of interviews and focus groups conducted in the data collection process.

The semi-structured interviews with individuals and focus groups were recorded and salient information mapped onto the research questions. These data were synthesized and interpreted by the consultant prior to drawing conclusions and making recommendations.

To enhance the validity of the study a number of strategies were used that were drawn from the qualitative research literature (LeCompte and Preissle, 1993). These involve enhancing the "trustworthiness" of the research. The strategies included triangulation; building a comfortable, non-threatening relationship with interviewees; creating an audit trail of data using a digital recorder in interviews; seeking the permission of interviewees to record the interview, with an offer to turn the recorder off if requested; seeking specific examples in support of claims being made; asking to see documentary evidence of claims and comments; and providing a summary of conversations by listing major points and seeking concurrence from interviewees.

A few further words on triangulation are warranted. This is a process in which the researcher seeks multiple sources of evidence to assist in drawing conclusions. To this end, as the Data Collection Planner (Appendix 4) shows, each research question is investigated with several stakeholders. Additionally, when claims are made, attempts are made to triangulate the claim with documentary or physical evidence. In general, matters of fact should be agreed upon by all data sources. Some variability can be expected in views or attitudes of a more subjective nature; often this variability is a function of varying positions in the system. For example, a Director General might see data as a valuable aid to decision making and resource allocation, whereas teachers might see the collection of the same data as an imposition that reduces the time they have for lesson preparation.

There were a number of threats and limitations to the study that should be enumerated. During the period of the study (July/August) schools were closed for the summer break and hence were not available for observation in a natural setting. Attempts were made to contact some local principals and school representatives for interview.


Only two zones (Somaliland and Puntland) were able to be visited, as Central South was considered unstable by security authorities. While findings will be generalised from the two zones visited, these findings need qualification when applied to Central South Zone, owing to its particular difficulties of security, accessibility arising from climatic factors, and the general level of development and competence of the Ministry of Education in the Zone.

The evaluation took two months, as per the time-line (Appendix 1).

RESULTS

The results of the evaluation are reported in the seven areas indicated in the evaluation design, namely: efficiency, effectiveness, relevance/appropriateness, impact, sustainability, coverage, and coordination. The first three areas are divided into a number of sections to enhance clarity. Recommendations are gathered at the end of each section for ease of access and to provide a summary of the section. All recommendations are collected and numbered in Appendix 1.

A. EFFICIENCY

In this section the PES data collection process is briefly described and then some issues associated with that process are explored. The issues explored are: data validation strategies, timing of data collection, training and recruitment of enumerators and supervisors, and data duplication. A brief outline of the PES data collection strategy, as explained by a number of UNICEF and Ministry staff, is presented before the major components of the section.

1. The process begins with a consultant working in Nairobi to design the questionnaire.
2. A PES planning meeting involving UNICEF staff from the three zones is conducted. The quality of questions is discussed to eliminate ambiguity and to ensure that the questions will achieve what they are designed to achieve. In addition, a detailed training strategy is designed.
3. UNICEF staff go back to the Zones, where the enumerators and supervisors are identified. This identification differs across the three Zones: in one Zone the MoE does it independently of UNICEF, in the second Zone the MoE and UNICEF cooperate in the process of identification and appointment, and in the third Zone UNICEF takes full control of the process.
4. UNICEF staff undertake training of enumerators and supervisors. (Field testing happened in two zones but not in the third.)
5. Training includes a module of 'micro-planning', during which enumerators and supervisors are allocated to a region and a set of schools within the region. Additionally, UNICEF staff are allocated to regions in a supervisory capacity.
6. Enumerators then visit their schools over a two-week period to collect data. Supervisors are expected to conduct follow-up visits to schools to ensure they have been visited. If a school has not in fact been visited, the enumerator is sent to the school again.
7. UNICEF staff and supervisors also visit a sample of schools to check the data that has been collected and, on occasion, to collect missing data.
8. Enumerators, supervisors, REOs and UNICEF staff come together to review the data forms one by one to check and clean the data.
9. Forms are then photocopied and the originals sent to Nairobi for data entry. This process involves double entry and comparison, along with internal consistency checks in the database entry forms. (A minimal comparison sketch follows this list.)
10. Finally, a report is prepared and distributed to MoEs and NGOs.
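To make step 9 concrete, the sketch below shows the kind of double-entry comparison logic described there. It is a minimal illustration only, assuming the two independent entry passes can be exported as CSV files keyed by a school code; the file names and column names are invented for the example and are not the actual PES export format.

```python
import csv

def load_entries(path, key="school_code"):
    """Load one data-entry pass into a dict keyed by school code."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def compare_passes(first_pass, second_pass):
    """Report every field where the two independent entry passes disagree."""
    mismatches = []
    for code, row1 in first_pass.items():
        row2 = second_pass.get(code)
        if row2 is None:
            mismatches.append((code, "<record missing in second pass>", None, None))
            continue
        for field, value in row1.items():
            if row2.get(field, "") != value:
                mismatches.append((code, field, value, row2.get(field, "")))
    return mismatches

if __name__ == "__main__":
    # Illustrative file names; the real export format may differ.
    first = load_entries("pes_entry_pass1.csv")
    second = load_entries("pes_entry_pass2.csv")
    for code, field, v1, v2 in compare_passes(first, second):
        print(f"School {code}: field '{field}' differs ({v1!r} vs {v2!r})")
```

Any row flagged by such a comparison would be checked against the original paper form before the record is accepted into the database.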


1 Validity

1.1 Findings and Analysis

Attempts were made to assess data quality. Unfortunately, this had to be done without a full field assessment in which a sample of schools would be selected and the data recollected for comparison. Such a validation exercise was impossible because schools were closed for the summer holidays, and it was also outside the terms of reference for this study. Instead, a range of participants in the process were interviewed, and an assessment was made of documentary evidence, of processes in the project design, and of stakeholders' perceptions based on field experience.

There has been some questioning of the validity of some of the data that appears on the data forms and in the draft report of the 2006/7 survey. Comment has been made on the draft report in the reporting section of this evaluation. There it is argued that the inaccuracies and inconsistencies identified seem to be a function of the subsequent analysis of the data and careless editing, rather than a consequence of a lack of validity in the original data set. That is not to imply that there are no limitations on the validity of the data in the database. The issue is to assess the size and nature of the limitations and to recommend strategies to address the threats to validity. As one Director General interviewed observed: "All measurement has an error. But it is important to improve results."

In attempting to reduce the size of the error, the issue of diminishing returns on effort is faced. It is relatively cheap to move from 90% accurate to 91% accurate, but very expensive to move from 98% accurate to 99% accurate, with 100% accuracy being unattainable. This is especially so in a context of political instability, capacity constraints, rainy seasons, droughts and a mobile population.

There are a number of reported threats to the validity of the data, some of which are in the nature of hunches and some of which are a result of direct observation. An interview with a Director General identified these issues: "A number of schools were not visited and some documents lost, but these matters have been discussed and follow-up data collection conducted to remedy the situation. The recording of GPS coordinates is a problem and I suggest that additional training on the use of a GPS is necessary. The process of transferring raw data to UNICEF for data entry caused some problems, and to ensure supervisors and enumerators completed forms correctly we need close monitoring." (He was unable to tell us about spot checking.)

In one zone UNICEF officers make up to 150 check visits to schools (out of approximately 600 schools). One reported that only one of his 30 schools had not been visited. In another zone the UNICEF officer said that in four years he had found only one school that had not been visited.

The Zone that had trouble with the use of the GPS and the recording of coordinates was also the Zone that did not conduct field testing during the training of enumerators. In another zone that had conducted field testing: "Only one enumerator had trouble with GPS. The problem was identified and resurveyed." Further: "We had a problem with school mapping data last year. The coordinates collected did not match the ten-year-old village location coordinates. We recollected 10 Bossaso school coordinates and they still did not match. Was the problem with us or the village maps?"


While UNICEF staff are actively engaged in 'spot checks' of schools and the associated data, such consistent and diligent behaviour among supervisors and REOs was difficult to verify. A meeting of REOs indicated that there is no follow-up by REOs in the field due to lack of transport, that no systematic data verification procedures are in place at the field level, that the so-called spot checks are haphazard at best and non-existent at worst, and, further, that REOs have no records of the data collected.

The process of checking by REOs and supervisors seems to be inconsistently applied across zones and regions. Some REOs described in detail how the checking process is undertaken and how they use independently collected data to verify the data forms. Two REOs were vague about the verification process and implied that it was left to supervisors and UNICEF staff.

One enumerator indicated that "we always find 1 to 3 schools that have never been recorded", suggesting that there is a continuous process of seeking a complete data set. All stakeholders interviewed with experience in the field estimated that over 95% of schools were visited by enumerators. This is consistent with UNICEF reports of follow-up visits. Further, one UNICEF staff member said that in his region staff consistently take the report with them into the field and check the data; it is always very consistent with field observation. A donor who pays special attention to gender issues in the field follows a similar practice; the PES report is always an accurate reflection of the data he finds during his visits. In a focus group session with approximately eight NGOs who regularly use the data from the PES, no one indicated that they had found significant inaccuracies.

So while constraints to validity have been identified, they do not seem to pose very great challenges to the quality of the data. Nevertheless, the reasons behind the shortcomings that were identified were sought, and strategies for improving the data collection process are suggested. There appear to be three sets of reasons for schools not being visited:
• The first group are obstacles that cannot be overcome in the time frame of the survey. These include floods (especially during the rainy season) and security concerns.
• The second group are obstacles, such as distance and transport, that need careful planning to overcome.
• The third set are those reflecting a lack of ethical commitment to the work, with an underlying attitude of 'they won't know' or 'I won't be caught.'

Common responses to these constraints include the enumerator:
• Contacting the Head Teacher at his/her home to conduct the interview, with the result that some details are estimates rather than validated information from source documents.
• Completing the forms in the REO's office using data collected by the REO, or last year's data.
• Making up data from local knowledge.
• Recording GPS coordinates from the location of the interview rather than the location of the Head Teacher's office.

These, together with some limited problems in using a GPS to collect school location data, misrepresentation by school head teachers, and carelessness in recording data on the forms, encompass the set of constraints to validity found in this evaluation. However, it is emphasised that, while these issues exist, they do not appear to be widespread, and while they will cause distortions at the micro level, they will have little impact on macro indicators.


To address these issues a number of recommendations are made. Some are described here in full; others, related to issues of training, are mentioned here and developed further in the timing, training and enumerator sections below.

REOs, supervisors and UNICEF staff are encouraged to develop a 'culture of support' for enumerators. In such an environment enumerators are encouraged to report difficulties in visiting schools and to record limitations to the data, so that an accurate picture of data quality can be gained. Such a culture can be contrasted with a culture of punishment and blame when enumerators have not been able to visit a school, which will result in attempts to hide constraints to data quality. In one zone the supportive approach was reflected in comments by a UNICEF officer: "Enumerators are told that if they cannot find or get to a school they are to report it. It is not a problem. A solution will be found. The only punishment is if you lie to us about claiming you have been there and you have not." Such a culture should be encouraged by including a module on ethical behaviour in the training exercises for enumerators and supervisors.

Enumerators should be evaluated both during training and during the field work. If they do not demonstrate adequate competency they should be replaced or not re-employed. An effective monitoring system needs to be established that will prevent nepotism from influencing performance quality. GPS training should be supported by field work and mapping exercises to encourage understanding of the implications and importance of the school mapping exercise. The timing of the major data collection exercise should be adjusted to avoid rainy seasons and end-of-school-year closures.

An audit trail needs to be established to facilitate effective tracking of school visits and data forms. Evidence of visits, checking, recovery of missing data and reasons for non-visits was almost impossible to find in the field. There was a case of missing forms, requiring a recollection of the data, which resulted in intense debate as to the cause of the problem; this was a result of the lack of effective tracking strategies. Such a system should be transparent and available for all participants to understand, monitor and use. Performance indicators can be derived from such systems and used to encourage performance improvement and an understanding of ethical approaches to the work being undertaken. As a component of the audit trail, enumerators should be required to record school locations as 'way points' in their GPS. These way points should be shared with supervisors/REOs on return from the field and checked against a list of schools and previously recorded locations. Such a system will provide validating evidence of the school visit process.

Enumerators should be required to validate data in the school from source documents where possible. This is already a component of the data collection process for the assessment of the physical resources of the school, where the enumerator is required to conduct an inspection of the school and its surroundings to gather the necessary data. It should also be encouraged, where possible, in the earlier part of the data collection process. Effective use of the EMIS tools would assist here, so that school enrolment, for example, could be checked. A common practice elsewhere is to choose a few data elements each year that will be subject to a validation check at the school level.
These data elements are commonly rotated each year.

Data entry has thus far taken place in Nairobi. While this has been efficient, it has some drawbacks in terms of capacity development and the efficient interpretation of Somali language issues. This matter is discussed extensively elsewhere in this report, in the section on ownership. The data is entered into an Access database. The mechanisms used in the data entry process are very impressive and reflect state-of-the-art practice.


These include validity checks at the point of data entry, the double entry of data and subsequent comparison, and data cleaning. All this should ensure that the data entered into the database is an accurate reflection of what is recorded on the survey forms. The designers of these processes should be congratulated.

A common practice in such exercises is, following data entry, to produce a report of the data for each school and to have the school check the accuracy of the data. This is not currently a component of the PES. This validation of school data is an important and valuable part of the EMIS process; additionally, it provides feedback to schools and communities regarding the value and importance of accurate data, and it is consistent with the EMIS conceptual framework. With the data held in an Access database, the production of such a school report becomes an easy process that could be done at Zone level at minimal cost. The challenge is not the report itself but the distribution of the reports to schools and the receipt of feedback from the schools. This is a challenge because of a lack of well-established and efficient communications systems in many places, and a lack of sophisticated monitoring systems within the ministries/zones. There are, however, a number of emerging strategies that reflect improved contact and communication with schools. These include the mentoring and mobilising systems established in rural areas, the regular data collection at regional level that duplicates the PES in some regards, and the initiatives of examination and curriculum directorates in one region that collect reports as frequently as monthly. Such resources need to be integrated and harnessed into an effective unit that can facilitate regular and economical contact with schools. The establishment of effective quality assurance mechanisms that would establish regular contact with schools is a necessary prerequisite for this validation process.

An option that should be considered for future data collection is for the annual data census to be viewed as an update of data rather than, as at present, a collection of all data every year. Now that the data is entered into a database, much of it will remain constant or relatively constant from year to year. Hence the annual data collection could be treated as an update of data that changes regularly, such as student enrolments, and a validation exercise for data that does not change regularly, such as the number of classrooms and the provision of water and sanitation services. To facilitate this approach, the enumerator would take to the field a report for each school that would provide for checks on current data that has not changed and for the updating of data that has changed. An added advantage of this process would be a reduction in the volume of data entry undertaken annually and, hence, a faster reporting time frame.

A useful addition here would be a missing data report to allow follow-up in the field. Such a report can easily be generated from the database, although it is recognised that missing data should be minimal, given the data checking and cleaning exercises carried out before the data forms are delivered from the regions.
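As an illustration of how such a missing data report could be produced, the sketch below scans an export of the school table and lists, for each school, the required fields that are still empty. It is a sketch only: it assumes the Access table has been exported to CSV, and the field names used are illustrative rather than the actual PES column names.

```python
import csv

# Fields every school record is expected to carry; names are illustrative only.
REQUIRED_FIELDS = ["school_name", "district", "enrolment_boys", "enrolment_girls",
                   "latitude", "longitude", "classrooms"]

def missing_data_report(path):
    """List, per school, the required fields that are empty in the exported PES table."""
    report = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            gaps = [field for field in REQUIRED_FIELDS if not (row.get(field) or "").strip()]
            if gaps:
                report[row.get("school_code", "?")] = gaps
    return report

if __name__ == "__main__":
    # Illustrative file name; the real export may differ.
    for school, gaps in missing_data_report("pes_schools.csv").items():
        print(f"{school}: missing {', '.join(gaps)}")
```

A report of this kind, produced immediately after data entry, would give REOs a concrete list of items to follow up in the field.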

1.2 Recommendations

a. That REOs, supervisors and UNICEF staff be encouraged to develop a 'culture of support' for enumerators, to encourage transparent and open work practices.
b. That such a culture be encouraged by including a module on ethical behaviour in the training exercises.
c. That enumerators be evaluated both during training and during the field work, and that employment in the PES process be dependent on satisfactory appraisal.
d. That GPS training be supported by field work and mapping exercises to ensure an understanding of the implications of incorrect location data.
e. That an audit trail be established to facilitate effective tracking of school visits and data forms.
f. That enumerators be required to record school locations as 'way points' in their GPS as part of the audit trail (a minimal checking sketch is given after this list).
g. That enumerators be required to validate data in the school from source documents for selected indicators.
h. That effective quality assurance mechanisms be developed to establish regular contact with schools; these are seen as a necessary prerequisite for the validation process.
i. That in future data collection processes the data census become an 'update' of data rather than the current collection of all data on an annual basis.
j. That a missing data report be provided at the completion of data entry to allow follow-up in the field.
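The following sketch illustrates the kind of way point check envisaged in recommendation (f): comparing the coordinates an enumerator records against the master school list and flagging schools whose visit is not evidenced. It is a minimal, assumed implementation; the file names, column names and the 1 km tolerance are illustrative choices, not part of any PES specification.

```python
import csv
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_waypoints(master_path, waypoint_path, tolerance_km=1.0):
    """Compare enumerator way points against previously recorded school locations."""
    with open(master_path, newline="", encoding="utf-8") as f:
        master = {row["school_code"]: (float(row["latitude"]), float(row["longitude"]))
                  for row in csv.DictReader(f)}
    flags, visited = [], set()
    with open(waypoint_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            code = row["school_code"]
            visited.add(code)
            if code not in master:
                flags.append((code, "not on the master school list"))
                continue
            lat, lon = master[code]
            d = haversine_km(lat, lon, float(row["latitude"]), float(row["longitude"]))
            if d > tolerance_km:
                flags.append((code, f"way point is {d:.1f} km from the recorded location"))
    for code in master:
        if code not in visited:
            flags.append((code, "no way point recorded; visit not evidenced"))
    return flags
```

A supervisor or REO could run such a check on return from the field and follow up any flagged school before the forms are sent for data entry.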


2 Training

2.1 Findings and Analysis

A number of issues identified in this evaluation point to the importance of effective training of enumerators and supervisors, and to the consequences when training has not been undertaken with rigour and dedication. All of the issues discussed here have been mentioned in other sections of the report and are readdressed for the sake of thoroughness and completeness. It is observed in the ownership section that 'class training' in itself is not an adequate preparation for undertaking complex skills; experience and ongoing support are also necessary. It is therefore emphasised that practical components, undertaken with appropriate guidance and with the opportunity to reflect on the experience, must be central to the training process.

There are two major components to this discussion of training processes. The first is a certification process for enumerators and supervisors; the second is a set of additions to the current training programme, based on the experiences of people who have been engaged in the current process. Readers may react to some of these issues with the view that they are already included in the training. What has become clear is that the training is not delivered using exactly the same process in all zones; consequently, some procedures are being emphasised for inclusion.

Reports were received during the evaluation that some enumerators and supervisors did not take the training seriously. This is in spite of the recent initiative to develop a TOR for the employment of these staff members. The lack of commitment was reflected in constantly absenting themselves from training sessions, constantly using mobile phones during training, and failing to take an active part in class sessions and activities. One MoE reported that: "Some people are only here for the money. We need to be more careful in their selection."

It is clear that mere attendance at the training should not be deemed adequate preparation; an evaluation of performance leading to a certification process should also be embedded to ensure commitment and competence. A set of competencies should be designed and shared with trainees prior to the training workshop. Certification would be granted only when trainees achieve all competencies. The competencies might include that trainees must:
• Achieve satisfactory attendance
• Actively engage in class activities
• Be able to use a GPS and identify way points
• Be able to use mapping skills to plot location coordinates
• Demonstrate an ability to analyse data forms for inconsistencies
• Be able to use audit trail instruments
• Be able to discuss the implications of inaccurate data
• Demonstrate commitment to the project and understand its importance
• Be able to make ethical judgements appropriately

To assist participants to achieve these competencies it will be necessary to adjust course content to provide opportunities to develop the necessary skills and attitudes. The following modules are suggestions derived from the study:



• Ethics/Values module. This would aim to develop an appreciation of the importance of accurate data collection and of the opportunity to be a participant in the process. This might be achieved through a set of small case studies that reflect common dilemmas faced in the field, using class or group discussion to resolve the dilemmas. An issue that might be included here is how to deal with obviously exaggerated data, or with major changes in data from previous years.
• Field Testing and Modelling of Monitoring Process module.


• Data Checking and Cleaning module. A module of training for data sheet checking, monitoring and supervision should be developed, and all REOs, supervisors and enumerators encouraged to reflect on the impact of poor data pick-up procedures.
• Use of audit trail records and way points to track activity.
• GPS and Mapping Exercise module. Use of GPS, and consideration of the implications of school coordinates missing village locations. This should include the use of way points to monitor school visits.
• Analysis of Forms module. Forms containing common problems and errors are analysed, with consideration of causes and correction strategies.
• Addressing Common Problems module. Issues such as the spelling of village and school names, changing school names, new schools emerging and old schools disappearing need discussion, and strategies need to be developed for avoiding confusion.

These additions have two obvious consequences for the training course. One is that the course will have to be lengthened to include the extra content. The second is that it will not be enough to merely present the course material: participants will have to be given the opportunity to demonstrate the competencies. This has implications for teaching methodology and for the need for revision and retesting opportunities.


2.2 Recommendations

a. That a certification process for enumerators and supervisors be used to judge competence for the work tasks.
b. That the certification process be based on explicit competencies that must be demonstrated by enumerators and supervisors.
c. That a set of additional modules be included in the current training course.
d. That the course be lengthened to include the extra content.
e. That participants be given repeated opportunities to learn and demonstrate the competencies.

3 Enumerators and Supervisors

3.1 Findings and Analysis

Enumerators and supervisors are at the centre of the current data collection system for the PES. Their quality and dedication are important to the validity of the data collected. It is clear from the above discussion that there are areas of challenge in their selection, training, supervision and commitment. Some doubts have also been expressed about the ethical practice of some of the enumerators. It must be emphasised that inadequate performance was not found to be widespread, and most of the enumerators spoken to as part of this evaluation showed competency and commitment to the tasks at hand. Once again there was variability across zones, with the zone that seemed to have the most transparent recruitment practice, and that had addressed some supervision issues, having fewer difficulties in this regard.

It has been noted that a TOR has been developed for enumerators and supervisors. This is a useful beginning. One zone seemed to make the recruitment process a 'private' affair, with a lack of clarity about candidates' demonstrated potential to commit adequately to the task. One senior ministry official observed: "We have been recruiting old people as enumerators and we need to start recruiting younger people with greater enthusiasm and commitment." A more transparent process of recruitment, using a variety of stakeholders, would assist in improving the quality of recruited staff.

In the section on training above it has been recommended that enumerators and supervisors demonstrate competency and commitment during the training process, and that those who fail to demonstrate the competencies be replaced. It would be advisable to include more than the required number of candidates in the training process and to include only the best in the data collection phase.


In addition, the performance of enumerators and supervisors should be evaluated during the data collection phase by independent assessors, with a report given to the staff member and used as a basis for re-employment in the process the following year.

One zone moved data collecting teams across regions so that they were supervised by a different REO each year. In this way a more independent system of monitoring was created, since over-familiarity and old friendships could not influence work practice. Such a system could be considered by other zones.

In implementing the above recommendation it is necessary to remember the earlier recommendation that data collection be conducted in a supportive environment. These people are working in an environment where capacity is low and a major focus of the work is to develop that capacity. Hence, in implementing these proposals it is necessary to ensure that all staff are supported in developing the necessary skills and attitudes, are given support when needed, and are encouraged to seek advice and support without fear of dismissal. It is imperative that a fine balance be maintained between commitment to quality and commitment to capacity development.

A note is also warranted on the future use of enumerators and supervisors as currently conceived. The more standard process for collecting data in an EMIS is for head teachers to take responsibility for completing data forms and returning them to the district or regional office. Currently this would be difficult to achieve owing to the poor communication and transport infrastructure in many areas of Somalia. As noted in an earlier section, a quality assurance and systematic monitoring structure has to be established before such data collection strategies can be used effectively. As capacity is developed and infrastructure built, it is suggested that a more economical data collection system be developed. This could be achieved by abolishing the enumerator/supervisor structure and replacing it with an EMIS officer located in each regional office, with head teachers empowered to complete the updating of data forms along with subsequent data collection in the form of term or semester reports.

4

Recommendations a. That the enumerators/supervisor position be competitively attained. It would be advisable to include more than the required number in the training process and only the best included in the data collection phase. b. That the performance of enumerators and supervisors be evaluated during the data collection phase by independent assessors using predefined competencies. c. That REOs be rotated across data collecting teams and thus create a more independent system of monitoring. d. That a fine balance is recognised between commitment to quality and commitment to capacity development. e. That as capacity is developed and infrastructure built the enumerator/supervisor structure be abolished and replaced with an EMIS officer located in each regional office and head teachers be empowered to complete update of data forms along with subsequent data collection processes in the form of term or semester reports.

Timing

4.1 Findings and Analysis

In commenting on the timing of data collection, two areas need to be considered. One is the impact of timing on the data collection process itself; the second is the suitability of the timing for providing reports promptly, so that maximum use can be made of the data and the need for data duplication is reduced. By data duplication is meant the need for people to collect, through a parallel system, data that is also collected in the PES. While it might not be possible to avoid all data duplication, a well designed EMIS process will attempt to minimise the need for it.

Firstly, the impact of timing on the data collection process is considered. The current data collection is undertaken in late April and May. The plan is for data to be collected over a two week period, but there are often delays and disruptions to the process. This results in data collection coinciding with end of year activities such as exams and exam marking, making it difficult to find head teachers in their offices. A more significant constraint is that the current data collection period falls partly within the rainy season. The influence of the rainy season has been identified above as a threat to the validity of some data, due to enumerators being unable to reach some schools. The Vice Minister of one zone observed:

We are collecting data once a year. It is very rushed and not a stable process close to the end of the school year.

Another consequence of the data being collected so late in the school year is that any report is always an historical document; it can never be a report about the current year. Such a reporting structure serves agencies developing proposals, not the monitoring of school performance and improvement of planning, or system activities such as recording year 8 exam results and attendance data. The current practice has led to a number of data duplication practices that have become necessary due to the lack of timeliness, in addition to an inflexible reporting process.

A common practice in EMIS data collection is to conduct a school census (similar to the current PES) early in the school year, with subsequent data pick ups on a term or semester basis. The subsequent data pick ups commonly include drop out, exam, attendance and finance data. With careful planning and coordination with other ministry directorates, reports can be made available on a timely basis for other MoE activities. Such practice has the advantage of spreading EMIS activity across the year, thus ensuring ongoing activity for specialist data entry staff and data base managers.

Both of the above issues can be addressed by moving the school census to early in the school year. If data were collected in late September or early October (or as soon as possible in the school year), it would be possible to avoid the rainy season and the disruption to school activity caused by end of year examinations. A report has been received indicating some variability in the beginning of the school year for some schools, and some late enrolments, that may constrain the start of the data pick up; such matters would require the advice of REOs to ensure the most suitable time is chosen. Timing at the beginning of the school year would favourably affect the usability of the data by regions, districts and school communities, since it would be current data. If data is collected in early October and entered into the data base in November, preliminary reports should begin to flow to stakeholders, in particular REOs and schools, in December. A full set of reports for the year would be available before the end of the school year in May/June.

4.2 Recommendations
a. That data be collected in late September or early October (or as early in the school year as can be achieved).
b. That data be entered into the data base in November.
c. That preliminary reports begin to flow to stakeholders, in particular REOs and schools, in December.



5 Data Duplication

5.1 Findings and Analysis

In the effectiveness section of this evaluation report the view is expressed that an EMIS unit should be established with the vision of being a service unit for other directorates within the ministry and for other stakeholders that support the education of children. This data, and the associated reports, need to be provided on a timely basis. A result of not providing reports on a timely basis is the duplication of the data collection process: if data collection needs are not well coordinated, parallel systems are at times set up to gather additional data. Evidence was found of quite extensive data collection and data entry that is not part of the PES process. Some examples are listed below.

In one zone UNICEF entered a subset of PES data into an Excel spreadsheet due to the tardiness of the PES report being provided. The data entered allowed them to assess progress and distribute resources on an equitable basis.

In another Zone, REOs reported that they all had their own data collection systems. One set of documents provided indicated that school enrolment by grade/gender/district/school, drop outs, school furniture and equipment, health hazards, the general condition of the school, the number of text books per student, extra-curricular activities and other data are systematically collected. Some of this is collated into a regional report. While all REOs indicated that they collect data independently of the PES, it is not clear just how extensive each set of data is.

In Puntland the curriculum division collects reports on a monthly basis from rural schools and conducts some analysis of the data, using mentors as collection agents. These reports include student enrolment figures and observations of the mentoring process. Some may have reservations about this process, as mentors are intended to have a professional development role rather than an enumerator role. If such a process were to continue, it would have to be made explicit how the data collected relate to the mentors' professional development function.

In this same zone a set of data for Grade 8 examination results is collected and collated. This data consists of full details (including date of birth) of every student undertaking the examination, to which the exam results are added when available. This is an example of data being required to supplement that of the PES, which does not gather student names and details as records. As EMIS units develop it will be necessary to integrate these functions into their annual work plans.

Much of the content of this section supports the observations and recommendations made in the previous section on the timeliness of the PES, and indicates how the current structure and operational mechanisms are resulting in duplication of effort and a consequent waste of resources. However, there are some positive aspects to these observations. There is an indication of an emerging capacity to communicate with schools and to transfer forms within a rudimentary monitoring system. There is also an indication of an emerging understanding of the important role of data in planning, and an emergent capacity to process and analyse data. These efforts need to be recognised and encouraged to facilitate the growth of formal quality assurance procedures to replace the relatively ad hoc systems currently being used.

5.2 Recommendation
a. That careful planning take place to provide data on a timely basis for the variety of MoE directorates and stakeholders.



B. EFFECTIVENESS

This section discusses effectiveness in the context of the use made of the current survey data, the use and potential of the documents provided to schools and referred to as 'EMIS tools', ownership of the PES, and the progress made towards the MoEs accepting responsibility for the implementation of the PES.

1 Utility-Current and Potential

1.1 Findings and Analysis

Inquiries as to how the PES was used drew a range of responses. One Vice Minister said:

We don't make use of the PES.

One Minister was more positive, implying that it is used to provide data for supporting organisations:

It is very important. We need it to submit to other supporting organisations. It is the only way to know what is going on.

A Director of Planning was more aspirational:

We are committed to build the capacity of a Department of Planning, Research and Coordination to have accurate data and proper data recording. We want more capacity building in planning. We want: to provide a yardstick for assessing and evaluating development in primary education; to provide the latest data on key aspects of primary education for use by organisations involved in provision and delivery of education; to take further the development of an EMIS for our Zone by collecting and analyzing data on a questionnaire that has continued to improve since 1997.

He also indicated that he used the data to check submissions from REOs when they requested support. However, he was unable to provide an example of such activity, and his data entry clerks were unable to use the query function in Access. The conclusion is that these views are more aspirational than real.

There was some criticism of the distribution of the PES report that implied a constraint on its possible use. A REO/enumerator workshop reported:

Reports not received for last two years. Last report received 2004/05. Schools never receive reports. MOE should ensure wide distribution of the report. UNICEF ensures adequate copies of the report reaches the MOE including electronic format. Majority of REOs have computers in their offices. Even when received, the report is never properly used by schools, due to lack of trained staff for the use and planning. Capacity development for the use of such a report is needed.

In a different zone one REO indicated:

I make regular use of the report for planning purposes. I look at enrolment increases and decreases, physical condition of schools, how many are operating.

This seemed to be the exception rather than the rule.


While it is clear that the survey reports have had very little impact on the MoEs, they remain of major importance to the various agencies working in the education sector. Staff from Save the Children indicated that they:

Use data for base line data, enrolment figures, number of teachers, school environment, fees of students, community contributions and gender numbers. Data collected targets ministry, but they (the Ministries) don't use it.

A UNESCO representative supported this position and added that the PES constitutes the only significant data for the sector. A wide range of uses of the PES reports was identified by NGOs. These included:

• Data used in proposal preparation.
• Establishing base lines for activities and monitoring progress to assess the impact of activities.
• Identifying and prioritising schools in need of water and sanitation support.
• Planning for the distribution of supplies.
• Planning for teacher training (with a special focus on women, due to the very small number of women teachers identified in the survey).
• Planning for text book production and distribution.
• Use as a sampling frame.
• Prediction of the numbers of students progressing to secondary school.
• Comparison of regions for patterns of enrolment across the country.
• Identifying schools for rehabilitation interventions.

From the above it can be concluded that the PES data is playing an important role in the development of the sector from an agency perspective. The MoEs seem to be making very limited use of the data at the present time but have indicated a desire to do so. To facilitate 'evidence based planning' there are a number of issues that need to be addressed; these include timeliness, capacity to use the data, the format in which data is reported, and the development of a systematised and institutionalised planning process. This institutionalisation needs to take place at central, regional and school/community levels. These issues are more fully discussed later in this report.

1.2 Recommendations
a. That the timeliness of data reporting be improved to suit institutional needs.
b. That capacity to use the data be developed among stakeholders.
c. That alternative formats for reporting the data be developed to differentiate between stakeholders.
d. That a systematised and institutionalised planning process be developed, with institutionalisation taking place at central, regional and school/community levels.

2 EMIS Tools

2.1 Findings and Analysis

This evaluation includes an exploration of the use schools make of a set of tools intended to assist in managing the schools. These tools are inappropriately referred to as 'EMIS tools' and consist of a class register, a school register, a student record card and a set of detailed instructions on their use. Unfortunately, the timing of the visit to the Zones coincided with school holidays and hence discussions on their use with teachers or principals were not possible. Their use was discussed with REOs and Ministry staff, who estimated that class registers are used by 80% of schools and school registers by 20% of schools. The 80% may be optimistic, since the 2006/7 draft report indicated that only 50% of schools had received the tools. The pupil record cards are very infrequently used, with one REO saying:

They are only used when a student changes schools.

One Zone indicated that these tools were to be a major monitoring focus in the new school year:

The next school year is to be 'the year of using EMIS tools properly.' We will evaluate mentors and teachers on this basis. All Ministry staff will visit schools to monitor the EMIS tools (this includes the Minister).

Reports were received indicating that the school register was very difficult for head teachers to use. This difficulty related to the perceived complexity of the tools and to the capacity of the head teachers to undertake such clerical duties. Further investigation of these issues is needed, with adjustments then made to the tools, to the training associated with them, or both. An appropriate strategy would be to conduct a workshop on the evaluation and use of the tools with a random sample of head teachers, to evaluate the issue more fully and to develop remedial strategies.

What remains unclear is how these tools are to be used. Is it expected that they will provide summary data on school enrolments, external exam candidature, absenteeism, drop out rates, school resources, etc.? These would be desirable supplementary data sets that should be routinely collected in a full EMIS process; however, no evidence could be found that this is in fact the case. The use of these tools needs clarification, and appropriate mentoring should be provided for head teachers on their use and value. The tools are potentially valuable aids for providing evidence for the PES data collection process. Unless these important functions are understood, the use of the tools will be seen by head teachers as 'busy work' without value or impact.

2.2 Recommendations
a. That the use of the 'EMIS tools' be clarified and appropriate mentoring provided for head teachers on their use and value.
b. That a review of the design and use of the tools be undertaken.
c. That plans be developed to integrate this data with the EMIS, in the form of supplementary data collection and as a contributor to data verification.
d. That further distribution of the EMIS tools be supported by developing understanding of their importance and use.

3 Ownership

3.1 Findings and Analysis

There is considerable commitment to the importance of the survey:

It is very important. We need it to submit to other supporting organisations. It is the only way to know what is going on. It is important to conduct it annually. (A Director General)

However, the MoEs seem only to be really involved in the data collection process, with the design of forms, data entry, analysis and reporting all being the domain of UNICEF, supported by some consultative processes and the technical assistance it provides.

The operation of the project is done by MoE by filling in the forms. Funds are transferred to Ministry by UNICEF to allow this process. Calculations are done in Nairobi and results sent back to Zones … (A Director General)

Even in this data collection process, UNICEF staff are central to a validation process of spot checking schools to confirm data accuracy. To broaden the 'ownership' of the project from its present narrow base, strategies need to be developed to shift responsibility for other major components of the PES to the MoEs. As this shift takes place it should also be conceived as a shift from a survey to an EMIS. In addition to data collection, the major components could be considered to include:

• Validating data collection and data cleaning;
• Data entry processes;
• Data analysis;
• Report generation;
• The design of future data needs; and
• The integration of the various data collection strategies currently undertaken.

These components should be addressed as capacity and resources are developed; it would not be appropriate for capacity in all these areas to be developed simultaneously. An immediate focus should be placed on improving data collection processes to ensure greater accuracy, and on shifting data entry responsibility to the Zones over a two to three year period.

The issue of data entry is clearly the next phase in developing ownership of the PES, and it drew much comment from Ministry officials. One observed:

The process of data entry has to be owned by ministry. The computers are available, but human resources are the problem. People are available but need some support. UNICEF will have to pay for the EMIS counterpart.

Further:

Those who enter data are not Somali and therefore they do not know local language and conditions. Need to involve local people in data entry. Can they enter once and the Nairobi people enter once. Then compare. Do this to build experience. Even take Somali data entry people to Nairobi for the process.

It was mentioned above that data collecting, processing and analysis skills are emerging in Puntland; however, this is not consistent across the three Zones. Some capacity building in the area of data entry and data access has been attempted: computers have been supplied and staff to undertake the tasks have been identified. However, there are constraints in the system that are preventing the transition of responsibility from UNICEF to the MoEs. Both the education literature and the change literature indicate that a brief workshop on a complex set of skills is not enough to build capacity for people to operate independently in a new and challenging area. In addition to the training component, the skills need to be practised in a supportive environment. While experience and support have been provided in the area of data collection, they have not been provided in the other component areas of EMIS development. This is essential if capacity is to be developed.

With the current double entry data system, the opportunity exists to develop capacity in data entry without substantial risk to the project. The MoE data entry staff could enter the data once and the Nairobi data entry staff enter it the second time, followed by the usual comparison check and corrections. The precise location of the data entry function would need to be decided, but the process would provide data entry staff with experience in a supportive environment. As capacity is demonstrated, responsibility for both data entry passes should be shifted to the Zones.
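The double entry comparison described here is conceptually simple, and the check could be supported by a small script that flags disagreements between the two passes for follow-up against the paper forms. The sketch below is a minimal illustration only: the CSV file names, the school_code key and the field layout are assumptions for this example, not features of the existing Access based process.

    # Minimal sketch: compare two independent data entry passes and list
    # the fields that disagree so they can be checked against the paper forms.
    # File names, the key field and the layout are illustrative assumptions.
    import csv

    def load(path, key="school_code"):
        with open(path, newline="", encoding="utf-8") as f:
            return {row[key]: row for row in csv.DictReader(f)}

    ministry = load("ministry_entry.csv")   # first pass, entered by MoE staff
    nairobi = load("nairobi_entry.csv")     # second pass, entered in Nairobi

    for code in sorted(set(ministry) | set(nairobi)):
        a, b = ministry.get(code), nairobi.get(code)
        if a is None or b is None:
            print(f"{code}: present in only one data set")
            continue
        for field in a:
            if (a[field] or "").strip() != (b.get(field) or "").strip():
                print(f"{code}: '{field}' differs ({a[field]!r} vs {b.get(field)!r})")

A listing of this kind could be produced by either office after each pass, so that the correction step becomes a shared, visible activity rather than one performed only in Nairobi.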

3.2 Recommendations
a. That responsibility for the EMIS components be transferred to MoEs one at a time, as capacity and resources are developed. In this way a smooth transition from the centrally managed survey to a decentralised EMIS can be achieved.
b. That responsibility for data entry be shifted to Zones over a two to three year period.
c. That initially MoE data entry staff be used to enter the data once, with the Nairobi data entry staff entering it the second time.


C. RELEVANCE AND APPROPRIATENESS

Relevance and appropriateness are discussed under the subheadings of the data required for an EMIS and the suitability of the reporting processes that follow data collection.

1 Data Required for an EMIS

1.1 Findings and Analysis

The data collected in the PES has been modified over the 10 years of the survey's operation. The major component of these modifications has been the PAE survey, introduced to capture this large (some 69000 students) section of the education sector. This component is of particular importance for girls, who are often required to perform duties in the home prior to attending school. Street children are also targeted within this system. In addition, rural/urban data and GPS coordinates of schools have recently been added to the system. There has been a continual evolution of the survey forms over the ten years, based on experience and evaluations.

The range of data collected is relatively extensive and includes educational components as well as infrastructure data that should assist in planning activities, and some limited school management data such as staff meetings, CECs and associated training. It also allows for disaggregation of the data by gender, district, region and Zone and, in the future, by rural/urban location.

There is still a need for some expansion of the data collected if the aspiration is to provide a full set of EFA indicators from the data. This would be consistent with the conceptual framework outlined (Appendix 3) and would be necessary for Fast Track Initiative (FTI) involvement and Millennium Development Goal (MDG) reporting. The indicators that are not currently reported are listed in Table 1. The additional data that needs to be collected includes grade enrolment by age for the calculation of net enrolment ratios, ECD/Qur'anic enrolment (the G1ECD indicator calculates the percentage of grade 1 enrolments that have previously had early childhood education experience), repetition and drop out data, and student achievement data including student literacy rates. In addition, more effective use needs to be made of population data for the calculation of standard gender parity indices. (Currently the percentage of girls in the school population is reported, not the percentage of the female population that attends school; interpreting the current statistic requires the assumption that the numbers of boys and girls in the school age population are equal.)

Much of this data would be available from an enhanced use of the 'EMIS tools'. Such a use would help to establish the importance, value and relevance of the tools. It is recognised that student age information may be difficult to acquire initially. It was noted that the grade 8 exam project was collecting student dates of birth; these may or may not have been accurate. However, at least some approximation of student age at the time of school enrolment is a necessary component if EFA indicators are to be calculated. This data is provided for in the EMIS tools which, as indicated above, are not used systematically. This is further evidence of the importance of the review of EMIS tools recommended above.
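For reference, the standard definitions behind these points can be stated explicitly. The notation below is generic rather than taken from the PES forms, and follows the usual EFA conventions:

    \mathrm{GER} = \frac{E}{P_a} \times 100, \qquad
    \mathrm{NER} = \frac{E_a}{P_a} \times 100, \qquad
    \mathrm{GPI} = \frac{\mathrm{NER}_{f}}{\mathrm{NER}_{m}}

where E is total primary enrolment regardless of age, E_a is the enrolment of children of official primary school age, P_a is the population of official primary school age, and the subscripts f and m denote the female and male values of the indicator (the GPI may equally be formed from the female and male GERs). The contrast with the current proxy is then clear: the percentage of girls in total enrolment compares girls with boys who are already in school, whereas the GPI compares each group against its own school age population, which is why age cohort enrolment and population data are prerequisites.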


Table 1: EFA Indicators not Reported

EFA No.   Area          Indicator
1         ECD           Gross enrolment ratio in Early Childhood Development (ECD) [1]
2         G1ECD         Percentage of new entrants to Grade 1 with ECD experience
3, 4      AIR-NIR       Apparent Intake Rate (AIR) and Net Intake Rate (NIR) in primary education [2]
6         NER           Net Enrolment Ratio (NER) in primary education
7, 8      EXP           a) Public current expenditure on primary education as a % of GNP [3]; b) public current expenditure per pupil on primary education as a % of GNP per capita; and c) public current expenditure on primary education as a % of total public current expenditure on education
12        REPETITION    Repetition rates by grade in primary education
13, 14    SURVIVAL      Survival rate to grade 5 [4]; coefficient of efficiency at grade 5 and at the final grade [5]
15        ACHIEVEMENT   % of pupils who master basic learning competencies
16, 17    LITERACY      Adult literacy rates, 15-24 years and 15 years and over
18        LITERACY      Literacy Gender Parity Index (GPI)

Note. Current gender indicators are proxies for the standard gender parity index due to the lack of age cohort data.

With the exception of the proxy for the survival rate, no outcome data is reported that would allow judgements to be made about the quality of the education process. One Zone shared its examination system and the data collection system associated with it. The system has the potential to provide information on the '% of pupils who master basic learning competencies', provided suitable forms of examination data analysis were undertaken. Unfortunately, this examination data was not linked to the PES; in fact a parallel data base was being established, reflecting a duplication of effort and an inefficient use of scarce resources.

Systematic recording of school outcome data should be undertaken to facilitate judgements about school quality. Such analysis is particularly important in the rapidly expanding context of Somali primary education; there is considerable international evidence that such circumstances lead to declining education standards. A challenge will be to identify useful and meaningful indicators of student achievement. The common norm referenced approach of reporting a mean score for a school is not very useful, as mean scores cannot be meaningfully compared across time. A more useful approach is a criterion referenced one that records the percentage of students who have achieved a set of defined outcomes. The existing examination projects being undertaken should be encouraged to provide such information from the analysis and recording of exam data.

The 'inclusion' of Qur'anic and nomadic schools in the PES is considered here. Qur'anic schools seem to operate in two different ways. It is reported that the majority operate by exposing children to the Qur'an prior to their progression to primary school. In this sense they are providing some education experience for some children prior to entry into the formal schooling system. It would thus be useful eventually to include Qur'anic institutions in a systematic data collection process, in a similar manner to the way early childhood institutions are commonly monitored and recorded. However, advice has been received that the Qur'anic schools are too numerous to record and that there would be resistance to 'government interference' in their operation.

Notes to Table 1:
[1] This could be defined to include attendance at Qur'anic school prior to primary school, or Qur'anic school experience could be collected as separate data.
[2] Due to not collecting age/grade enrolment.
[3] Not necessarily a function of EMIS.
[4] The current estimate is a proxy due to lack of repetition data.
[5] Due to lack of repetition data.
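Returning to the criterion referenced approach suggested above, the underlying calculation is straightforward once examination results can be matched to defined competency thresholds. The sketch below is purely illustrative: the results file, the subject field names and the pass mark of 50 are assumptions for this example, not features of the existing Grade 8 examination data.

    # Minimal sketch of criterion referenced reporting: the share of candidates
    # reaching a defined competency threshold, rather than a mean score.
    # The input file, field names and threshold are illustrative assumptions.
    import csv

    THRESHOLD = 50  # illustrative mastery cut-off for each subject

    counts = {}  # subject -> (candidates reaching threshold, candidates with a result)
    with open("grade8_results.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for subject in ("somali", "mathematics", "science"):
                score = (row.get(subject) or "").strip()
                if not score:
                    continue  # missing result: excluded rather than counted as a failure
                reached, total = counts.get(subject, (0, 0))
                counts[subject] = (reached + (float(score) >= THRESHOLD), total + 1)

    for subject, (reached, total) in counts.items():
        print(f"{subject}: {100 * reached / total:.1f}% of {total} candidates reached the threshold")

Because the result is a percentage against a fixed standard rather than a mean, it can be compared across years and across schools even as enrolment expands.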


It would be useful to record the number of grade 1 enrolments who have attended Qur'anic school, in a way similar to the G1ECD indicator. Such data would allow investigation of the impact of the Qur'anic school experience on drop out, survival and performance in Grade 1. International experience indicates that ECCE has a very positive impact on these indicators, and it would be interesting to make similar investigations into the impact of Qur'anic experience. This may lead to discussions as to how Qur'anic school experiences could be adjusted to increase their positive impact on these indicators.

The second way Qur'anic schools operate is that some have started to provide experience of the formal primary school curriculum. As such they are indigenous schools with a hybrid curriculum, providing alternative pathways to formal education. Two DGs indicated that such schools are already included in the PES. This is a satisfactory situation, except that the current PES form does not allow the recording of Qur'anic status. Such school status should be recorded. It might be provided for in question 1 under 'school type', in question 9a under school 'owner', or in question 9b under school 'manager' of the current PES data collection form.

There seem to be no nomadic schools included in the current survey; if there are, they are not recorded as such. Nomadic children are likely to be among the most vulnerable and marginalised children, and therefore need particular attention. However, if such children are not in school, they will not be included in the survey. A more precise analysis is needed to investigate the underlying issues affecting out of school children, and the variability of the distribution of these children needs to be understood to allow strategies to be developed to address their needs. Such a study is, however, considered to be outside the normal ambit of an EMIS, unless such children can be identified by a disaggregation of the GER. It needs to be remembered that the GER uses population data as its denominator, and such data will not in itself identify nomadic children as distinct from other out of school children.

In the extensive range of interviews undertaken there were many other suggestions for additional data which people argued would assist in their delivery of support programmes, in the allocation of resources and in general school management. The sources of these suggestions included NGOs and REOs. Nearly all the suggestions have value; however, some would not normally be included in an EMIS. As an example, consider the rehabilitation needs of school buildings. A limiting factor is the ability of a wide range of people to make a consistent and reliable evaluation of the condition of a school. Head teachers and enumerators could be asked to make a global judgement about the building condition, such as 'new, adequate, dilapidated, unsafe', but will the data recorders be qualified to make such a judgement, and could adequate descriptors be developed for each word on the scale so that it is used consistently across a large number of schools? This data remains important for planners, and in the absence of a formal school building survey by qualified experts it should be collected, with the caveat that it will need to be interpreted with caution.

A number of contributors to the study asked for data about the 'child friendliness' of a school, which would include child centred learning, violence, child participation, clubs, participation of girls in forums, protection, quality, access, etc. Measuring all of these issues reliably presents a challenge that may be beyond the current state of development of the PES/EMIS. It may be possible to use proxy indicators that reflect the degree of child friendliness, such as the types of lessons used in a week or the use of corporal punishment in the last month.

The suggestion of recording learning materials also has merit, but may create difficulties in terms of the volume of information. Short of conducting a complete stock take of the school, full coverage of such information may be impractical. The PES already records the receipt of text books, but not the adequacy or otherwise of supply. It may be practical to record some key resources that are commonly available and for which there is money or a donor ready to provide.


A request was made for issues relating to education management to be recorded in the PES. Some aspects of this are already included, such as staff meetings, the functioning of CECs, use of the EMIS tools, and in-service training experiences.

Two areas that have been recommended for inclusion, and that should receive support, are non-formal education and secondary education. Both are significant contributors to development and should be systematically recorded. UNESCO is currently undertaking development of a secondary school EMIS. These issues are further developed below in the sections on Coverage and Coordination. The PAE survey currently records data from non formal institutions, but only in association with the formal curriculum that some of these institutions provide. It would be a sensible development to expand this component of data collection to include a broader definition of non-formal curriculum components. This issue is further developed in the coverage section below.

A final suggestion that has been received is the inclusion of hazards and risks that might lead to an emergency situation for education. Given the current context this would be of great benefit to planners and experts in the area. A programme to develop an emergency preparedness plan is currently emerging, and consideration should be given to including relevant data and indicators in the EMIS.

In this section a large number of suggestions have been made with regard to possible additional data that could be useful in assessing the current state of primary education and in planning for its future development. However, it would not be practical to try to expand the system to include all of these possibilities immediately. Further, the priority for expansion of the system will depend more on the aspirations of the MoEs than on the views and values of 'outsiders'. It should also be recognised that the ability to expand the range of data collected will depend in part on the success or otherwise of attempts to shift further functions to the control of the MoEs. With these caveats, the suggested data expansion has been organised into three groups implying a priority: Immediate, Next Phase and Aspirational, as set out in Table 2.

Table 2: Suggested Development of Targeted Data

Immediate
• Enrolment by age, grade and gender
• Drop out and repetition
• Secondary school data
• Condition of buildings / rehabilitation needs

Next phase
• Background ECD of grade 1 enrolments
• Expanded non-formal education data
• Qur'anic schools / early childhood
• Text book numbers
• Emergency preparedness
• Outcome / student achievement data

Aspirational
• Child friendliness of schools
• Teaching and learning resources

The enthusiasm and encouragement with which NGOs and other agencies have supported this review process, and the range of suggestions provided for additional data to be included in the PES, should be formally recognised, and systems put in place to provide regular opportunities to contribute to the data design process.


1.2 Recommendations
a. That expansion of the data collected take place to provide for a full set of EFA and MDG indicators.
b. That systematic recording of school outcome data be undertaken to facilitate judgements about school quality.
c. That a criterion referenced approach to school outcomes be adopted that allows the recording of the percentage of students who have achieved a set of defined outcomes.
d. That the number of grade 1 enrolments who have attended Qur'anic school and ECD be recorded.
e. That the Qur'anic status of Qur'anic schools offering an integrated hybrid curriculum be recorded.
f. That a more precise analysis of out of school children be undertaken to investigate the underlying issues affecting them. This should be a study separate from the EMIS.
g. That school condition data be collected, with the caveat that it will need to be interpreted with caution.
h. That proxy indicators reflecting the degree of child friendliness, such as lesson types used regularly or the use of corporal punishment in the last month, be considered.
i. That some key resources that are commonly available, and for which there is money or a donor ready to provide, be recorded.
j. That the data collected include an expanded view of non-formal curriculum components.
k. That data on emergency risk and emergency preparedness be considered.
l. That the priority list for data expansion be adopted.
m. That NGOs and other agencies be given regular opportunities to contribute to the data design process.

2 Reporting

2.1 Findings and Analysis

In this section comment is made on two separate aspects of reporting. The first concerns the draft report from the 2006/2007 survey, which has not yet been published. The second draws upon findings with regard to the use of the data and the EMIS conceptual framework, which indicated that an EMIS requires movement of information both up and down within the system.

A significant development this year has been the use of an Access data base to store the data. This will improve flexibility and accessibility for investigating emerging issues and challenges. More comment on the potential of this data base is made below, but first the draft 2006/7 report is considered.

The nature of reporting from the survey has been explored. The draft PES report that has been analysed seems opaque and poorly constructed, and contains many inaccuracies. These inaccuracies have occurred in the extraction of data from the data base and in the subsequent analysis. A few of the apparent errors are listed here to illustrate the point:

• Pagination is not consistent with the contents table.
• Headings of unnumbered tables on pages 19 to 23 refer to the wrong year.
• The total enrolment figure for the North West region on page 19 is 124117; it should be 115999. The figure of 115999 is used elsewhere in the report and is consistent with the data in the Access data base.
• The difference above leads to inconsistent reporting of total enrolment figures across the report (392101 versus 383983).
• Annex 7 uses a total enrolment of 318159, when a check reveals it should be 383983. (An earlier version of Annex 7, which had some major inaccuracies, has fortunately been extensively modified in the current draft of the report.)
• Percentage change figures are sometimes incorrect or do not use a negative sign to differentiate between directions of change.
• The Gross Intake Ratio is inappropriately defined.
• Survival rates are not consistent with EFA definitions.


This list is not claimed to be exhaustive, but it does create a sense of unease in the reader about the confidence that can be placed in the data of the report, especially when there are glaring errors in the opening tables. It would seem that the errors do not arise from the data collection or data entry processes, but rather from the subsequent analysis and documentation process. More careful compilation, and more attention to checking the report, would have been useful.

The question arises of what to do with the draft 2006/7 report. It should not be published in its current form; however, some report needs to be made available. The options suggested here are:

a. Ask the writer to carefully review the draft and remove the errors.
b. Ask a new analyst to review the current report with a view to eliminating all identifiable errors, but not attempting to change structure or readability (say 3 weeks).
c. Ask a new analyst to start a new report preparation process that would lead to a user friendly (or at least improved) product with the same scope as the current report (say 2 months). It would be feasible to ask the analyst to extract a small set of summary data for publication in a small 'Somalia Education Facts' book presenting key indicators and some time series data disaggregated by gender and Zone. This summary data could then be given to a designer to be prepared for publication.
d. If all the above seem out of the question, a simpler option would be to prepare only a 'Somalia Education Facts' book and distribute it with a disc copy of the Access data base with a set of queries embedded, to facilitate more detailed inquiry when necessary (say 1 month).

It was said earlier that the report seems opaque. By this it is meant that it is difficult to navigate and to find data in, and that it lacks a coherent design and organisational layout. A conceptual framework guiding the construction of the report would be a useful addition. Part of this problem is due to the vast amount of data being reported, and that data should be reported. It is clear that this report is intended for relatively few people in the development sector who require baseline data for the development of proposals and submissions. It has been noted in this evaluation that the MoEs make very little use of the data in the current report format; this is in part due to a lack of systematic planning and monitoring processes. NGOs, however, have reported making extensive use of the report in its current format.

If there is to be a shift from a 'survey' to an 'EMIS', a more effective way of sharing the data will have to be found. The conceptual framework for an EMIS outlined the need for a system that allows for dissemination of material in a flexible way, and emphasised the need for a two way flow of information 'so that all levels of the organisation can make effective use of the data and reports generated from the data.' This can never be achieved with a single report format such as the PES currently uses. It is necessary to clearly identify target groups, consider their needs and capacities, and design report formats to suit. It is axiomatic that UNICEF/UNESCO's data needs are different from those of a news agency, a regional office or a local school/community.
The current model of reporting is one in which information flows one way, preventing the lower levels of the MoE organisation, which were so fundamentally important in providing the data, from benefiting from the data collection and analysis process. A set of target groups needs to be identified and reports designed specifically for each group. This process should also be supported by a front end module built for the Access data base that can be flexibly used by a variety of users who do not have the technical capacity to use the query module of Access. A front end module is software specially developed to allow data users to access information by making selections from a set of parameters of interest. An example of this has already been developed for the Health Management Information System for Somalia.

At a minimum, the following set of reports should be developed and distributed:





• A school report, provided within 3 months of data collection. This report can also be used as a data validation check; in addition it can be used by schools and CECs for planning processes and the development of school improvement plans.
• District reports that reflect education conditions within the district generally and allow schools and CECs to benchmark themselves and district officers to determine priorities.
• Regional reports that reflect education conditions across regions and allow for comparison of progress between regions.
• Zone and Somalia reports that provide a summary of major EFA indicators and trends disaggregated by region, gender and rural/urban. The tables in the current report would be available through the front end module to the data base.
• A set of thematic reports that would change from year to year and provide deeper analysis of EMIS data on issues of specific relevance. These might include gender, vulnerable and marginalised children, water and sanitation, the condition of buildings, or the impact of specific programmes. It is expected that specialist technical assistance, supported by a variety of interested agencies, would be required for this work.

The capacity to interrogate the data base using the query module of Access would be developed at Zone level and would be available for generating specialised reports for stakeholders.
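The kind of facility such a front end module would offer can be illustrated with a parameterised query. The sketch below uses SQLite purely as a stand-in for the Access data base, and the table and column names (schools, region, survey_year, enrolment_girls, enrolment_boys) are assumptions made for illustration; a real front end would present these parameters through a simple form rather than through code.

    # Illustrative sketch of the parameterised query a front end module could
    # expose to non-technical users. SQLite is used here only as a stand-in for
    # the Access data base; table and column names are assumptions.
    import sqlite3

    def enrolment_summary(db_path, region, year):
        """Return enrolment by district for one region and one survey year."""
        query = """
            SELECT district,
                   SUM(enrolment_girls) AS girls,
                   SUM(enrolment_boys)  AS boys
            FROM schools
            WHERE region = ? AND survey_year = ?
            GROUP BY district
            ORDER BY district
        """
        with sqlite3.connect(db_path) as conn:
            return conn.execute(query, (region, year)).fetchall()

    # Example use: a regional summary for a hypothetical region and year.
    for district, girls, boys in enrolment_summary("pes.db", "Togdheer", 2007):
        print(district, girls, boys, girls + boys)

Behind a simple form, a query of this kind is all that is needed to generate a district or regional report on demand, which is why the front end can be developed well before full query skills exist at Zone level.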

In preparing these reports specific attention should be paid to the technical capacity of the target group to interpret and use them. It has been suggested that a glossary of terms be included in each report to enhance readability. To make the thinking behind the need for alternative reporting formats more explicit, a set of indicative data needs, and how they might be used, has been developed and included as Appendix 7. They address a range of specific school, district and regional issues that are common in emerging education systems, and provide some direction as to how evidence based planning might begin.

An emerging form of reporting is the map. The EU has funded the Food and Agriculture Organisation to develop a Somalia Dynamic Atlas. Data from the EMIS should be systematically integrated into a comparable atlas via GIS software. This format provides a useful means of collating data sets and providing easily accessed information in map form, with associated attachments and tables. The GPS coordinates that have been collected in the PES will facilitate this synthesis, which is discussed further in the coordination section of this report.

The DEVINFO format of reporting should also be considered. DEVINFO is a data base format commonly used in developing countries to provide access to key data and indicators across a wide range of areas, including education, health, population, WASH, etc. It is not country specific, and hence the data of many countries can be accessed using this format.

It has become evident that there is little capacity in some zones in the areas of evidence based planning, monitoring of progress at various levels of the system, or school improvement planning. It will not be adequate merely to produce reports. It will therefore be necessary for support and training to be provided at many levels of the MoE on how to use the reports to improve education outcomes for the children of Somalia.

There already exist Community Education Committees (CECs) that provide support for education at the local level. Additionally, UNICEF is supporting a Community Development Project that aims to empower local communities to harness resources through identifying local needs and priorities, and through proposal writing. These structures can be used to enhance education ownership at the local level, and by linking to Community Development Committees (CDCs) the process can be used to mobilise resources. The process could be strengthened in the field of education by providing school/community reports that focus on local conditions and facilities together with relevant comparisons. Such reports could be used to underpin local proposal development.
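As an illustration of how the GPS coordinates already collected could feed the atlas and GIS integration discussed above, the sketch below writes school records out in GeoJSON, a standard format that most GIS packages can load directly. The record shown and its field names are illustrative assumptions rather than the actual PES fields.

    # Minimal sketch: export school records with GPS coordinates to GeoJSON so
    # they can be layered onto an atlas in standard GIS software.
    # The input records and field names are illustrative assumptions.
    import json

    schools = [
        {"name": "Example Primary", "region": "Awdal", "lat": 10.43, "lon": 43.32, "enrolment": 310},
        # ... further records extracted from the EMIS data base
    ]

    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [s["lon"], s["lat"]]},
            "properties": {k: v for k, v in s.items() if k not in ("lat", "lon")},
        }
        for s in schools
    ]

    with open("schools.geojson", "w", encoding="utf-8") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)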


2.2 Recommendations
a. That the draft 2006/2007 report not be published in its current form.
b. That a set of target groups be identified and reports designed specifically for each group. This process should also be supported by a front end module built for the Access data base that can be flexibly used by a variety of users without the technical capacity to use the query module of Access.
c. That reports include a school/CEC report, district reports, regional reports, compact Zone and Somalia reports providing a summary of major EFA indicators and trends disaggregated by region, gender and rural/urban, and a set of thematic reports that would change from year to year and provide deeper analysis of EMIS data on issues of specific relevance.
d. That the capacity to extract information from the Access data base using its query module be developed at Zone level among those responsible for data base management.
e. That data from the EMIS be systematically integrated into an atlas via GIS software.
f. That the necessary support and training be provided at the various levels of the MoE on how to use and interpret the reports.

D. IMPACT

Findings and Analysis

The impact of a survey instrument such as the PES is difficult to measure in objective terms. This study has attempted to assess impact by making inquiries with regard to the use and status of the PES report. Use of the PES by NGO and UN agencies has been reported earlier in this report, and indicates extensive use of the data and indicators by all agencies contacted. In particular it has allowed participation in EFA reporting and the setting of appropriate Millennium Development Goals. The general view is that the PES is the only significant information source in the sector for Somalia. These positive responses indicate a perception that the PES reports are a valued and reliable instrument for planning in the sector.

The extent to which the PES has an impact on, or 'contribute[s] to increased enrolment rates, greater opportunities for girls, increased numbers of teachers and a larger number of quality schools' (Appendix 2) can only be measured through the success of the programmes that are underpinned by the survey. The way in which the survey data underpinned proposals and programme development, described by agencies earlier, is evidence of this critical link. It is therefore necessary to look at the trend studies to quantify the impact of the efforts in the sector. One example is provided here to make the point. The number of children in schools has increased from approximately 150000 in 1999 to just under 400000 in 2007. This trend has been consistent over the last ten years and applies to all three zones, with the exception of a decline in enrolments in the CSZ in the last year of measurement. The Gross Enrolment Ratio (GER) has also increased across all three zones in the last ten years. The juxtaposition of the PES and this increase in enrolment and GER does not establish cause and effect directly, but is indicative of some success in the sector when viewed holistically.

While these judgements of impact can be made, this report has identified a number of EFA indicators that cannot be calculated because the data are not collected; examples are net enrolment ratios, and survival and efficiency indicators. The impact of the PES would be enhanced by the inclusion of these data and indicators.

While there is evidence of impact at the macro level of agency planning, there is little evidence of a significant impact of the survey at ministry, regional, school or community level. This is probably a result of capacity constraints and the lack of a wide ranging and flexible reporting structure. Strategies have been identified above to improve performance and impact in these areas, including the decentralisation of survey strategies and data use to enhance ownership of the processes involved. The emergence of evidence driven planning is one of the processes that will be associated with improved impact.


Common practice in agencies and NGOs is currently to sustain separateness between sectors such as education, health, water and sanitation, and child protection. While attempts are made to take a more integrated approach to programme planning, through constructs such as focusing on common regions or identifying 'cross cutting' issues in order to achieve synergies of programming effort, such attempts are not often reflected in systematic programme infrastructure and often have to be achieved through an additional layer of arrangements internal to an organisation. One approach to building a more systematic infrastructure for cross sector cooperation is the DevInfo data base, which stores key indices for all sectors across many countries to facilitate more holistic analyses; DevInfo, however, tends to focus on macro level data with some disaggregation. There does appear to be an opportunity to further enhance impact through the programme infrastructure by linking the EMIS to the Health Management Information System at district and community level, allowing systematic cross sector investigations in support of holistic programming activities. This, together with a mapping process facilitated by the GPS coordinates being collected, will allow for detailed and holistic micro level reporting that will be particularly valuable in emergency situations, thereby enhancing the impact of the existing data collection and storage processes across sectors.

Recommendation
a. That the impact of the PES be improved by implementation of the recommendations of this report in the areas of diversified reporting, capacity building on data use, broadening the scope of data collected, decentralisation of PES processes and linking the PES to other data systems.

E. SUSTAINABILITY

Findings and Analysis

If sustainability of the process means that the MoEs can undertake the PES independently of outside technical assistance and funding support, then it is abundantly evident that the process is not sustainable now. Ministries are dependent on UNICEF for funding, organisation, conceptual overview, monitoring and much of the motivation for undertaking the annual survey. One Director General, while understanding the notion of and need for sustainability, sees UNICEF as part of that sustainability:

UNICEF is not like a small NGO, it will be here in the future.

Clearly the DG has a different concept of sustainability from that suggested above.

The need for carefully designed structures and capacity building procedures at ministry level has already been discussed. These are emerging, and the UNICEF contributions of planning and EMIS technical assistance, together with the establishment of an EMIS unit for each of the three ministries, will be important contributions to sustainability. The extent to which they are successful will obviously affect sustainability.

To the extent that a lack of quality staff is seen as a threat to sustainability, it may be necessary to look for arrangements other than the traditional employment of staff to permanent positions within the MoEs to meet the capacity demands of the EMIS. One possible alternative is to issue tenders to outsource the work to suitable institutions that can demonstrate the appropriate management and technical skills. Such an institution might be a local institute of higher learning, which could be expected to have suitable research, data collection and IT skills available. Such 'outsourcing' strategies would have the advantage of providing funding support for students and could assist in developing genuine research interest in the data, linking with the concept of generating specialist reports on particular areas of the education sector.

In addition to these general concepts there are a number of perceived threats to sustainability specific to the local contexts. These are listed here.

1. The EMIS units will have to be staffed with capable and professional local people. It was often said that trained local staff are hard to retain due to the demand for trained staff from other organisations. This is a threat to sustainability and reflects the need to provide competitive salaries and incentives for new recruits.
2. There will be a need for assistance until such time as government financial support for education is adequate. In the short term, MoEs should provide evidence of their commitment to the EMIS process by accepting at least partial responsibility for staff salaries, and have a plan to accept full responsibility in the foreseeable future.
3. The development of clear ethical standards and effective systems for monitoring staff will be required. These may include defined work hours and conditions, commitment to achieving targets in a timely fashion, performance appraisal, and adequate reward for effort.
4. Transparent staff selection procedures are needed to ensure that nepotism is avoided and that appropriate skill levels and potential are reflected in employees.
5. Coordination of EMIS across the three ministries, especially as capacity grows within zones, will need to be carefully managed. Along with decentralisation, it is to be expected that greater autonomy of action will result, and hence differentiated EMIS practice may follow.
6. Establishment of communication and travel capacity is necessary for transfer of the PES to more usual structures of EMIS operation. While these are beginning to emerge, they need to be consolidated and encouraged. Regional level monitoring and quality assurance systems will be reflected in these developments.
7. In Puntland the restriction of movement for technical assistance staff and the current restriction to compounds will impact adversely on capacity building plans. In the central south the current instability threatens sustainability, and the technical assistance will probably need to be located in Nairobi. This will create challenges for the development of capacity in local staff, without which sustainability in the longer term becomes difficult.
8. Maintenance of computing facilities. Access to the internet will be necessary for virus control, together with a regular maintenance contract and an updating plan for computing equipment.

Recommendations
a. That employment conditions be such as to ensure the retention of professional staff
b. That outsourcing some EMIS activities be considered in areas of staff shortage
c. That merit based employment procedures be defined and implemented
d. That effective work practices and standards be identified and made explicit
e. That a mechanism to coordinate EMIS activities across the three zones be established
f. That an effective quality assurance system be established that includes communication and travel arrangements
g. That systems of EMIS implementation be differentiated across regions to reflect the differences in local conditions
h. That a computer maintenance and replacement policy be developed and implemented

F. COVERAGE

Findings and Analysis

Since the PES is a census, there can be no threats to validity derived from sampling bias. There have been some questions as to the suitability of conducting the survey on a sample basis rather than as the current census. A sample approach would not be consistent with the conceptual framework for an EMIS outlined here, and would severely limit the potential use of the EMIS as it grows into a complete system.

While the priority for development in the recent past has been primary education, there have been several requests for the PES/EMIS to be expanded to include secondary education. While the secondary education system is currently relatively small, it still needs to be assessed and planned for, hence the necessity for a systematic data collection procedure to be implemented. UNESCO is currently beginning that implementation. Further comment on this issue is offered under the coordination section.

One planning officer indicated that nomadic schools are included in the current PES, while another in a different zone indicated that there were no nomadic schools. The Save the Children NGO has established a number of models to target nomadic children in a special project involving the establishment of 44 nomadic schools and 5,300 learners. It is important that these can be identified in the PES/EMIS process to allow the impact of the project to be monitored over time. Nomadic children should clearly be included in any data collection process because of their perceived vulnerability and marginalisation. Currently there is no system for identifying nomadic school status in the PES. This should be addressed, and REOs asked to identify the locations of such schools where they exist.

The PES covers primary schools that target children of normal school age who study the formal primary curriculum. In addition to primary schools there are Primary Alternative Education (PAE) centres that target youth who have either not attended school or whose schooling has been disrupted by social instability. In addition to the youth, a significant number of primary school aged children attend these centres by accompanying older siblings. Both of these groups at PAEs have a significant enrolment of girls. As a result of these differentiated groups, two curricula are offered: the standard formal primary curriculum, offered to children of normal primary school age, and the 'Non Formal Curriculum', which consists of the first five years of the formal primary curriculum compressed into a four year programme to better suit the older youth it targets. The PES currently records data on students undertaking the formal primary curriculum in alternative education institutions, in addition to the number of students undertaking the non formal curriculum. Unfortunately, it does not record the ages of these students, and age would seem to be a critical issue when considering the diversity among the groups catered for by these alternative centres.

There is no provision for other forms of non formal education to be included in the survey. These might include intensive adult literacy classes, traditional crafts, peace education, child rearing practices and the like. Some requests were received for these to be included in the PAE form of the PES. This would be suitable if they were supervised and monitored by the MoEs. It is very difficult to adequately define some of these activities as they are often transient. If they were to be included in the PES/EMIS they would need to address a policy position in which the MoEs or agencies were interested. For example, such a policy position might be improving adult literacy; if this were the case it would be relevant to include non-formal adult literacy classes in the survey.

Recommendations
a. That data collection systems be expanded to include all areas that have policy implications for MoEs. This would include ECCE, secondary education and non formal education.
b. That nomadic schools be clearly identified in the PES

G. COORDINATION

Findings and Analysis

UNICEF seems to work very effectively with other stakeholders. There is an annual consultation process, and the work of UNICEF in coordinating the PES process is well respected and valued. The PES is well known in the education sector and, as detailed earlier in this report, the data it provides is widely used by NGOs, UN agencies and Ministry authorities. Of all the agencies that contributed to this evaluation, only one indicated that it did not have a copy of the reports made available by UNICEF.

Further issues that will require effort in the future to ensure adequate coordination are the development of the secondary school EMIS by UNESCO, and coordination across the three zones of Somalia as responsibility shifts to the MoEs with growing capacity and ownership.

The UNESCO secondary survey has some elements of its design in common with the PES, in that enumerators collect data and it is entered into computers in Nairobi. However, the data is entered into SPSS rather than into a relational data base. The data is entered twice as a validity check, and the two entries are then compared. The current reporting system is very rudimentary and is based on the 'tables' capacity of SPSS.
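The double-entry check mentioned above can be illustrated in outline. The following is a minimal sketch, not the actual UNICEF or UNESCO routine; the file names, the school_code key and the CSV layout are assumptions made purely for the example.

import csv

def load(path, key="school_code"):
    # Read one data-entry pass into a dictionary keyed by school code.
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def compare_entries(first_pass, second_pass):
    # Report records and fields where the two passes disagree.
    discrepancies = []
    for code, row1 in first_pass.items():
        row2 = second_pass.get(code)
        if row2 is None:
            discrepancies.append((code, "missing in second pass", None, None))
            continue
        for field, value1 in row1.items():
            if value1.strip() != row2.get(field, "").strip():
                discrepancies.append((code, field, value1, row2.get(field)))
    return discrepancies

if __name__ == "__main__":
    # Hypothetical file names for the two data-entry passes of one batch.
    first = load("pes_entry_pass1.csv")
    second = load("pes_entry_pass2.csv")
    for code, field, v1, v2 in compare_entries(first, second):
        print(f"School {code}: {field!r} differs ({v1!r} vs {v2!r})")

A listing of this kind gives the data manager a short follow-up list of records to re-check against the paper forms, which is the essential value of double entry.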


The secondary survey does not use the same codes as the PES for zone, region, district and settlement, and it lacks location coordinates; this makes it impossible to bring the two data systems together. Bringing them together would be necessary if the Ministry wanted to investigate available secondary capacity as compared with the demand for secondary school places from graduating primary students. It was also noted that data entry training within zones is being duplicated and that data entry computers are being provided by both UNICEF and UNESCO.

Discussions with UNESCO staff indicate that they have a philosophy for an EMIS similar to the conceptual framework presented here and aspire to the transfer of responsibility for the survey and the emerging EMIS to the MoEs. The UNESCO spokesman spoke passionately about the need to harmonise the primary and secondary systems but agreed that this is not happening yet. A structure needs to be found that will bring the two systems together in a rational way. One suggestion is that both the PES and the secondary survey be run through the EMIS unit when it is established, and that the EMIS coordinator and the associated technical assistant be responsible for harmonising the two systems.

Exciting opportunities for cross cutting appraisal of conditions exist if the links to WASH, health and emergency data, via a school mapping exercise, are well coordinated. This can be done by using location coordinates and existing zone, regional and district boundaries, in addition to accurate settlement locations and up-to-date population data. An earlier consultant's report on school mapping recommended that UNICEF engage a GIS specialist to facilitate this work across many areas of UNICEF activity. This would be a useful way of developing mapping processes for education and would facilitate the integration of several areas of UNICEF activity.

In addition to the PES there have been other surveys conducted for a variety of purposes and by a variety of agencies. A recent example is the CARE Baseline Survey of the Integrated Support for Primary and Alternative Basic Education (Ispabe) Programme in Central-South Somalia. This study is restricted to just three regions in one zone, and covers many of the areas studied in the PES. However, a significant additional contribution is that it adds a qualitative dimension to the data: instead of just asking what or how many, it asks why. In this way it supplements the PES material. Additionally, the two studies could be used to validate the data they have in common.

Another study of some significance is the Multiple Indicator Cluster Survey (MICS) undertaken by UNICEF in 2005. This survey is a model used in many developing countries and is based on a standard 'core' with adaptations to investigate special contextual issues in Somalia. It used a sample approach to explore a range of indicators at zonal level and allows for disaggregation by gender and rural/urban location. It provides a range of indicators that are not currently available in the PES, such as net enrolment ratios and survival to grade five indices. It also covers a range of areas outside of education such as nutrition, child health, environment, child protection and HIV/AIDS. It is thus a much more broad brush survey that provides a wide range of data on a national scale, but it would not be suitable for many, indeed most, of the functions expected of an EMIS.

There are other studies that have been conducted in Somalia, such as From Perception to Reality: A Study on Child Protection in Somalia, that do not directly record EMIS-type data. However, they should be taken into account in programme development in the education context, since they include data on issues such as child violence, children with special needs and other significant matters that impact directly on formal school activity.

Recommendations
a. That the PES and secondary school data systems be coordinated to ensure consistency between them
b. That mapping tools be used as a coordinating device across sectors
c. That survey and study design be undertaken to complement the PES/EMIS


H. CONCLUSION: FUTURE STEPS TOWARDS AN EMIS

It has been noted that UNICEF is currently committed to providing technical support in the areas of Planning and EMIS. These resources need to be integrated closely with the experience and support mechanisms being recommended here. It is suggested that an EMIS unit be established within a Directorate of Planning and Statistics of each Zone Ministry. The EMIS unit would be staffed by:
• an EMIS Coordinator
• an EMIS data base manager
• five data entry and retrieval staff
• two secretarial and administrative assistants

The unit would have the assistance of technical support responsible for assisting the EMIS coordinator and data base manager in planning the development of the unit and the capacity development of its staff. The capacity development would have to take place in the areas of:
• Relational data base design and construction
• Data form design in Access
• Data entry, cleaning and validation
• Design and use of Access queries
• Design and generation of Access reports
• Use of front end software to generate flexible reports and provision of specialist information
• Use of mapping software
• Design, management and implementation of the EMIS process
• Linking of the EMIS unit with the planning arms of the Directorate
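To make the 'Access queries' item in this list concrete, the sketch below shows the kind of aggregation query that EMIS staff would learn to build. It uses SQLite rather than Access only so that it is self-contained and runnable; the table names, column names and data are invented for illustration and are not the actual PES data base structure.

import sqlite3

# Hypothetical miniature of a PES-style relational structure.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE schools   (school_id TEXT PRIMARY KEY, name TEXT, region TEXT, district TEXT);
CREATE TABLE enrolment (school_id TEXT, grade INTEGER, sex TEXT, pupils INTEGER,
                        FOREIGN KEY (school_id) REFERENCES schools (school_id));
""")
conn.executemany("INSERT INTO schools VALUES (?, ?, ?, ?)", [
    ("S001", "Example Primary A", "Awdal", "Borama"),
    ("S002", "Example Primary B", "Togdheer", "Burao"),
])
conn.executemany("INSERT INTO enrolment VALUES (?, ?, ?, ?)", [
    ("S001", 1, "F", 40), ("S001", 1, "M", 55),
    ("S002", 1, "F", 30), ("S002", 1, "M", 42),
])

# Grade 1 enrolment by region and sex -- the sort of summary an Access
# query or report would produce for planners.
query = """
SELECT s.region, e.sex, SUM(e.pupils) AS pupils
FROM enrolment AS e JOIN schools AS s ON s.school_id = e.school_id
WHERE e.grade = 1
GROUP BY s.region, e.sex
ORDER BY s.region, e.sex;
"""
for region, sex, pupils in conn.execute(query):
    print(region, sex, pupils)

An equivalent query built in the Access query designer, or exposed through a front end reporting module, would give planners the same regional, gender-disaggregated summaries.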

From this list it can be seen that the technical assistance will need to be in place for a considerable length of time. While accurate estimates of this time frame are difficult, a minimum of five years should be planned for.

While it has been recommended that capacity building structures be improved to include, along with training, the opportunity to gain experience in complex tasks within a supportive environment, there is a need to emphasise that the Ministries have some additional responsibilities. It is essential that they identify suitably qualified focal points and the necessary support staff to undertake these duties. Some of the current practices of people working without salary will threaten sustainability. In addition, the appointed staff must be adequately supervised to ensure the achievement of set goals and targets and a professional work ethic in terms of hours worked and commitment to the organisation. Without these qualities in the EMIS staff, no capacity building exercise can be expected to impact significantly on the organisation.

The approach being assumed is that the PES is an early step towards the development of an EMIS supervised and administered by the MoEs. This was the understanding that emerged in 1997 and there have been some tentative steps towards this autonomy since, but a comprehensive transition plan does not appear to have been developed. The tentative steps include the provision of computing hardware, training of enumerators, use of the Access data base and so on. More recently, plans have emerged for technical assistance to be provided for further capacity development, together with the provision of buildings and other infrastructure. The challenge will be to adequately assess the rate of development of the capacity of the MoEs to ensure the sustainability of the desired autonomous EMIS.

There needs to be some agreement as to precisely what the aspirations are for the development of an EMIS. The language of EMIS is being used in a number of different ways that seem at odds with international standard use of the term. To assist in addressing this issue, a conceptual framework (see Appendix 3) has been developed that assists in defining what is meant by EMIS and, if agreed to, will assist in designing a pathway of development. It is an aim of this review to present some options for moving from where the PES is now to where stakeholders aspire to be in (say) five years time. The conceptual framework has been used extensively in developing countries in Africa and Asia. It is accepted here as an aspirational model for an EMIS for the component Zones of Somalia. The current PES and its associated Access Data Base can be referred to as the initial component parts of an EMIS, and the current 'EMIS tools' should be referred to as 'School Records'.

Earlier in this report a set of components was identified that are required to be developed as part of the transition from a survey to an EMIS. Central to this transition is the notion that the PES will not be replaced, but rather will be further developed to form an EMIS consistent with the conceptual framework. Throughout this report are sets of recommendations relating to each of the components in the list below. A brief summary of these recommendations is provided here to clarify the development directions being recommended.

• Identification of data to be collected and design of data forms. A prioritised set of additional data is listed under the relevance and appropriateness section. This reflects the EFA model included in the conceptual framework (Appendix 3), together with some context specific data recommendations gleaned from field observations.
• Data collection. This is currently undertaken by enumerators. Suggestions are made about developing conditions that would enable head teachers to provide completed data sheets to REOs.
• Validating data collection and data cleaning. A set of recommendations is provided in the efficiency section. Central to this is the development of an audit trail for more efficient monitoring of data collection, the improvement of training leading to a certification process, the transparency of employment processes and the identification of a set of ethical considerations.
• Data entry processes. This should be the next component to be transferred to the MoEs. Suggestions are made in the ownership section about how this might be done and how the training component of capacity building needs to be developed.
• Data analysis and report generation. In the reporting section a set of recommendations is made that will expand and enhance the use of the data. It is envisaged that many of these reports will be produced by a 'front end' software component to the existing Access data base. The emphasis here is on flexibility and a variety of report formats designed to suit a variety of stakeholders. Central to this concept is reporting to regions, districts and schools and their communities to facilitate planning and school improvement.
• The integration of the various data collection strategies currently undertaken. In the section on data duplication, the implications of some current planning and timing issues are discussed and suggestions made for reducing the impact on resource use caused by duplication of data collection by different sections of the Ministries.

The EMIS focal point and the EMIS technical assistance should develop a 'rolling' transition plan for the development of the components of the EMIS and the shift of responsibility to the MoEs. An EMIS Development Planning Matrix is provided as Appendix 6 that might provide a useful starting point for the rolling plan. Such a plan should include a 'vision' for the EMIS unit as a service unit to other units within the Ministry, such as regional offices, schools and their communities, planning, examinations, curriculum and teacher development. A precise time line is not provided here as it will be subject to the successful implementation of the capacity building in earlier components; hence the notion of a 'rolling' plan that will be updated annually subject to evaluation of previous targets.

It is clear that in two of the zones the development of an EMIS/Planning Directorate would take place within the current structures of the MoE. These two ministries expressed enthusiasm and commitment to undertaking this capacity building process and have staff identified to begin the task. However, there are challenges to be overcome in the Central South zone, where there is no functioning Ministry. Currently in Central South there are umbrella organisations and a community care centre emerging that are acting as coordinating agencies across 5 of the 10 regions. These umbrella organisations have formed a coordination body to assist education development. It might be possible to form a subgroup of this coordination body that would act as an EMIS committee to supervise the development of an EMIS/Planning unit in lieu of a functioning ministry. This unit would eventually fit into the Ministry when it emerges. The obvious threat to sustainability with this model is the willingness of the new ministry to accept the unit members as employees. Alternatively, there may exist a local NGO that is capable of managing the process and of overseeing the development of the EMIS/Planning unit. This approach would carry a higher risk of failure when the time came to hand over the unit to a ministry structure, since the employees would see themselves as employees of the NGO and not of the ministry.

In addition to these challenges is the issue of the location of the technical assistance. Currently international staff are forbidden to enter the Central South Zone due to the lack of security in the area. While this situation persists, it will be impossible to develop the EMIS/Planning unit as in the other two zones. In that case it would be preferable to use the current PES structures and procedures until the security situation improves: it is difficult to develop capacity in a ministry if there is no functioning Ministry. Advice from UNICEF officers in CSZ supports this option of continuing with current PES procedures until such time as greater stability returns to the Zone and a functioning Ministry emerges.

An issue that will emerge as a possible constraint without careful planning is coordination across the three zones as ownership is developed within the zones and local issues are addressed. There is a likelihood that the data collection forms of the three zones will begin to diverge as each zone begins to exercise greater control over the process. If this happens it may become difficult to bring the three data bases together into a single Somalia data base. It will be necessary to use procedures such as those that currently exist to ensure the future harmonisation of the three data bases.

Recommendations
a. That the UNICEF Planning and EMIS technical assistance be integrated closely with an EMIS and Planning Directorate
b. That MoEs identify suitably qualified focal points and the necessary support staff to undertake the duties associated with EMIS development
c. That the appointed staff be adequately supervised to ensure the achievement of set goals and targets
d. That the conceptual framework is accepted as an aspirational model for an EMIS for the component Zones of Somalia
e. That the current PES and its associated Access Data Base is referred to as the initial component parts of an EMIS, and that the current 'EMIS tools' be referred to as School Records
f. That the EMIS focal point and the EMIS technical assistance develop a 'rolling' transition plan for the development of components of the EMIS and the shift of responsibility to the MoEs
g. That the rolling transition plan incorporates the changes in data collection, data cleaning and validation, and the changes in reporting recommended in this evaluation


REFERENCES

Anon. (2007). Summary Field Report – Somalia, 27–31 March 2007: Training of Trainers, Primary Education Survey Micro Planning Workshop, Bossaso.
Anon. (no date). End of Consultancy Report: Primary Education Survey.
Anon. (no date). Data Analysis and Data Verification Process.
Anon. (1999). Notes on the Use of the Class Register. UNICEF.
Anon. (October 2002). A Trip Report on the Education Management System (EMIS): Software Training.
Boveington, J. T. (2007). UNICEF Assessment – Capacity Development CA-CD Assignment. Ministry of Education, Somalia.
Brentall, C., Peart, E., Carr-Hill, R., and Cox, A. (2000). Thematic Education For All Study: Funding Agency Contributions to Education For All. Overseas Development Institute, London.
Lecompte, M. D., & Preissle, J. (1993). Ethnography and Qualitative Design in Educational Research (2nd ed.). San Diego: Academic Press.
May, C. and Vine, K. (1997). Indicators of Educational Efficiency and Effectiveness: A Data Dictionary. UNESCO PROAP, Bangkok.
Mendelsohn, J. (2007). Further Development of the School Mapping Programme in Somalia. UNICEF.
Ministry of Education. (no date). EMIS Registers Instruction Manual. UNICEF.
Ministry of Education. (no date). School Register. UNICEF.
Ministry of Education. (no date). Class Register. UNICEF.
Ministry of Education. (no date). Pupil Information Card. UNICEF.
Mulinge, L. (2007). EMIS: Version 1. User Guide. UNICEF, Nairobi.
UNICEF (2000). Primary School Education Survey Report 1999/2000. UNICEF.
UNICEF (2001). Primary School Education Survey Report 2000/2001. UNICEF.
UNICEF (2002). Primary School Education Survey Report 2001/2002. UNICEF.
UNICEF (2003). Primary School Education Survey Report 2002/2003. UNICEF.
UNICEF (2004). Primary School Education Survey Report 2003/2004. UNICEF.
UNICEF (2005). Primary School Education Survey Report 2004/2005. UNICEF.
UNICEF (2006). Primary School Education Survey Report 2005/2006. UNICEF.
UNICEF (2007). Draft Primary School Education Survey Report 2006/2007. UNICEF.
UNESCO (1998). Training Package: Educational Management Information System. Principal Regional Office for Asia and the Pacific, Bangkok.
UNESCO (2000). Support for Education for All 2000 Assessment. UNESCO Principal Regional Office, Bangkok. (CD)
UNESCO (2000). http://portal.unesco.org/education/en/ev.phpURL_ID=43385&URL_DO=DO_TOPIC&URL_SECTION=201.html
Windham (1988). Improving the Efficiency of Educational Systems: Indicators of Educational Efficiency and Effectiveness.


APPENDICES

Appendix 1: Summary of Recommendations

A. Efficiency

Validity
1. That REOs, Supervisors and UNICEF staff are encouraged to develop a 'culture of support' for enumerators to encourage transparent and open work practices.
2. That such a culture should be encouraged by a module on ethical behaviour being included in the training exercises.
3. That enumerators should be evaluated both during training and during the field work, and employment in the PES process be dependent on satisfactory appraisal.
4. That GPS training be supported by field work and mapping exercises to ensure an understanding of the implications of incorrect location data.
5. That an audit trail be established to facilitate effective tracking of school visits and data forms.
6. That enumerators be required to record school locations as 'way points' in their GPS as part of the audit trail.
7. That enumerators be required to validate data in the school from source documents on selected indicators.
8. That effective quality assurance mechanisms be developed that would establish regular contact with schools. These are seen as a necessary prerequisite for the validation process.
9. That in future data collection processes, the data census become an 'update' of data rather than the current collection of all data on an annual basis.
10. That a missing data report be provided at the completion of data entry to allow follow up in the field.

Training
11. That a certification process for enumerators and supervisors be used to judge competence for the work tasks.
12. That the certification process be based on explicit competencies that must be demonstrated by enumerators and supervisors.
13. That a set of additional modules to the current training be included in the training course.
14. That the course be lengthened to include the extra content.
15. That participants be able to revisit opportunities to learn and demonstrate the competencies.

Enumerators and Supervisors
16. That the enumerator/supervisor positions be competitively attained. It would be advisable to include more than the required number in the training process and only the best included in the data collection phase.
17. That the performance of enumerators and supervisors be evaluated during the data collection phase by independent assessors using predefined competencies.
18. That REOs be rotated across data collecting teams and thus create a more independent system of monitoring.
19. That a fine balance is recognised between commitment to quality and commitment to capacity development.
20. That as capacity is developed and infrastructure built, the enumerator/supervisor structure be abolished and replaced with an EMIS officer located in each regional office, and head teachers be empowered to complete the update of data forms along with subsequent data collection processes in the form of term or semester reports.

Timing
21. That data be collected in late September or early October (or as early in the school year as can be achieved).
22. That data be entered into the data base in November.
23. That preliminary reports begin to flow to stakeholders, in particular REOs and schools, in December.

Data Duplication
24. That careful planning take place to provide data on a timely basis for the variety of MoE directorates and stakeholders.

B. Effectiveness

Utility – Current and Potential
25. That the timeliness of data reporting be improved to suit institutional needs.
26. That the capacity to use the data be developed among stakeholders.
27. That alternative formats of reporting of data be developed to differentiate between stakeholders.
28. That a systematised and institutionalised planning process be developed. This institutionalisation needs to take place at central, regional and school/community levels.

EMIS Tools
29. That the use of the 'EMIS tools' is clarified and appropriate mentoring provided for head teachers on the use and value of these tools.
30. That a review of the design and use of the tools be undertaken.
31. That plans be developed to integrate this data with the EMIS in the form of supplementary data collection and as a contributor to data verification.
32. That further distribution of the EMIS tools be supported by developing understanding of their importance and use.

Ownership
33. That the responsibility for transferring EMIS components to MoEs be considered one component at a time as capacity and resources are developed. In this way a smooth transition from the centrally managed survey to a decentralised EMIS can be achieved.
34. That responsibility for data entry be shifted to Zones over a two to three year period.
35. That initially MoE data entry staff be used to enter the data once and that the Nairobi data entry staff enter it the second time.

C. Relevance / Appropriateness

Data Required for an EMIS
36. That expansion of the data collected take place to provide for a full set of EFA and MDG indicators.
37. That systematic recording of school outcome data should be undertaken to facilitate judgments about school quality.
38. That a criterion referenced approach to school outcomes be adopted that allows the recording of the percentage of students who have achieved a set of defined outcomes.
39. That the number of grade 1 enrolments who have attended Qur'anic school and ECD be recorded.
40. That the status of Qur'anic schools offering an integrated hybrid curriculum be recorded.
41. That a more precise analysis of out of school children be undertaken to investigate the underlying issues of out of school children. This should be a study separate from the EMIS.
42. That school condition data be collected, with the caveat that it will need to be interpreted with caution.
43. That proxy indicators that reflect the degree of child friendliness be considered, such as lesson types used regularly, or use of corporal punishment in the last month.
44. That some key resources that are commonly available, and for which there is money or a donor ready to provide, be recorded.
45. That the data collected include an expanded view of non-formal curriculum components.
46. That data on emergency risk and emergency preparedness be considered.
47. That the priority list for data expansion be adopted.

D. Reporting
48. That the draft 2006/2007 report should not be published in its current form.
49. That a set of target groups are identified and reports designed specifically for each group. This process should also be supported by a front end module built for the Access data base that can be flexibly accessed by a variety of users without the technical capacity to use the query module of Access.
50. That reports include a school/CEC report, District reports, Regional reports, Zone and Somalia compact reports that provide a summary of major EFA indicators and trends disaggregated by Region, gender and Rural/Urban, and a set of thematic reports that would change from year to year and would provide deeper analysis of EMIS data on issues of specific relevance.
51. That the capacity to extract information from the Access data base using the Query module of Access be developed at Zone level among those responsible for data base management.
52. That data from the EMIS be systematically integrated into an Atlas via GIS software.
53. That the necessary support and training be provided for many levels of the MoE on how to use and interpret the data.

Impact
54. That the impact of the PES be improved by implementation of the recommendations of this report in the areas of diversified reporting, capacity building on data use, broadening the scope of data collected, decentralisation of PES processes and linking PES to other data systems.

E. Sustainability
55. That employment conditions be such as to ensure the retention of professional staff.
56. That outsourcing some EMIS activities be considered in areas of staff shortage.
57. That merit based employment procedures be defined and implemented.
58. That effective work practices and standards be identified and made explicit.
59. That a mechanism to coordinate EMIS activities across the three zones be established.
60. That an effective quality assurance system be established that includes communication and travel arrangements.
61. That systems of EMIS implementation be differentiated across regions to reflect the differences in local conditions.
62. That a computer maintenance and replacement policy be developed and implemented.

F. Coverage
63. That data collection systems be expanded to include all areas that have policy implications for MoEs. This would include ECD, secondary education and non formal education.
64. That nomadic schools be clearly identified in the PES.

G. Coordination
65. That the PES and secondary school data systems be coordinated to ensure consistency between them.
66. That mapping tools are used as a coordinating device across sectors.
67. That survey and study design be undertaken to complement the PES/EMIS.

H. Conclusion: Future Steps Towards an EMIS
68. That the UNICEF Planning and EMIS technical assistance be integrated closely with an EMIS and Planning Directorate.
69. That MoEs identify suitably qualified focal points and the necessary support staff to undertake the duties associated with EMIS development.
70. That the appointed staff be adequately supervised to ensure the achievement of set goals and targets.
71. That the conceptual framework is accepted as an aspirational model for an EMIS for the component Zones of Somalia.
72. That the current PES and its associated Access Data Base is referred to as the initial component parts of an EMIS, and that the current 'EMIS tools' be referred to as School Records.
73. That the EMIS focal point and the EMIS technical assistance develop a 'rolling' transition plan for the development of components of the EMIS and the shift of responsibility to the MoEs.
74. That the rolling transition plan incorporates the changes in data collection, data cleaning and validation, and the changes in reporting recommended in this evaluation.


Appendix 2: Terms of Reference

Terms of Reference for the Engagement of an Evaluation Expert
Evaluation of the Annual Primary Education Survey in Somalia

1. BACKGROUND

1.1 Country Background
With an overall Human Development ranking of 161 out of 163 countries on the human development index, available data on Somalia points towards enormous gaps in meeting the needs of women and children in the social sector. The country has not had a de facto central government authority, or any other feature associated with an established independent state, for the last 15 years. Functioning regional governments have been formed in the Northwest (Somaliland) and Northeast (Puntland), and more recently a Transitional Federal Government was formed in 2004, but with very limited de facto authority.

1.2 UNICEF Country Programme
UNICEF has had a large field and programme presence in Somalia over the last 15 years, with three zonal offices. A new country programme, within the framework of the United Nations Transition Plan, will aim to accelerate progress towards targets 3-8 of the Millennium Development Goals by further increasing access to basic services for accelerated child survival and development through humanitarian assistance, strengthening the institutional capacity of government as a duty bearer and further enabling children and women to claim their rights.

1.3 UNICEF Education Programme
Notwithstanding the work done by international development and relief organisations, access to basic education and gender equity in education in Somalia remain a real concern. With a Gross Enrolment Ratio of 30 per cent, overall primary school enrolment ranks among the lowest in the world. In terms of gender equity, the Somali girl child has even less chance of accessing basic education. After 17 years of an emergency-driven response in Somalia, the absence of a viable education system continues to hinder Somalia's progress towards meeting the Millennium Development Goals (MDGs) and Education For All (EFA) goals. In this context, the new UNICEF "Go to School" programme will aim to improve access to and quality of basic education, with an emphasis on developing mechanisms to increase girls' enrolment. In addition, an institutional development component will provide financial support and expertise to the existing ministries of education for the development and implementation of gender-sensitive education policies and standards, including the development of an effective Education Management Information System.

1.4 The Primary Education Survey
The Primary Education Survey (PES) has been conducted almost every year for the last nine years. However, no formal evaluation of the Survey has ever been carried out. The original purpose of the survey was to fill a critical data gap in the education sector and to meet the information needs of the Ministries of Education (MoEs), national and international development stakeholders as well as donors, with an independent, reliable and routine source of data for trend analysis and programme planning. The PES has evolved over the years, with additional data elements being added to meet the growing needs of education stakeholders. From the beginning, UNICEF clearly stated its intention to build the capacity of MoE officials in order for the MoEs to take full responsibility for the design and management of the PES. The PES was in many ways only meant to be an interim step on the road to establishing a fully operational Education Management Information System (EMIS). The ultimate goal of building such an EMIS was the establishment of a reputable system and expertly trained education officials at central, regional and district levels to ensure the efficient flow of data, in order to facilitate the monitoring of service provision as well as providing critical information to educational planners and policy-makers. In reality, this goal remains far from being realized. While some EMIS tools were developed and distributed, they have not been institutionalized. This is partly due to the absence of an overall EMIS framework. Efforts have been made, with limited success, to train MoE officials in basic data management using the data collected during the PES. Attempts to involve MoE officials in data collection for the PES have also had mixed and sometimes questionable results. Furthermore, there is little evidence that any of the data from the PES is being translated into information that is used for educational planning and policy-making purposes.

2. EVALUATION PURPOSE

2.1 Reasons for this evaluation
Availability of reliable and up-to-date educational statistics and data is essential not only for the formulation of policies and educational planning, but also for making information-based decisions on problematic issues in the day-to-day administration of the educational system. After nine years, the Primary Education Survey remains the most up-to-date and reliable source of data on the status of education in Somalia. However, for the PES to maintain its credibility and to ensure the validity of the data collected, an evaluation of the current and past processes involved in this important routine education survey, as well as an assessment of its utility levels, is considered necessary. The evaluation will also examine the relationship between the need for a sustainable EMIS and the future role of the PES.

The Strategic Partnership in Education (DFID-UNICEF-UNESCO) intends to fund the next annual education survey. However, prior to doing so, it is deemed important to evaluate the current annual process of data collection and data dissemination. Such an evaluation is fully justified in view of the fact that, without a fully functional and efficient management information system, it will be difficult to bring about any reform in basic education in Somalia. To date, the PES has played a crucial role in maintaining the availability of a reliable data source. How the effectiveness of this role can be strengthened, and its relationship to the establishment of a broader EMIS, will depend on the outcomes and recommendations of an independent evaluation.

2.2 Target audience and proposed use of results
The results, conclusions and recommendations emanating from this evaluation will be primarily of use to the Ministries of Education, the UNICEF Somalia country office and UNESCO. The secondary target audience will be UNICEF/UNESCO field offices, INGOs and LNGOs.

2.3 Timing of the Evaluation
The final report from this evaluation is expected by XX


3. SCOPE AND FOCUS

3.1 Evaluation Objectives
The scope of the proposed evaluation will cover the routine annual surveys of primary education in Somalia. The evaluation will focus on the main aspects of the surveys, i.e. data collection and management as well as information dissemination and utility. In addition, it will look at the performance of the agency in supporting this recurring activity and the collaboration with its partners. The overall objectives of the evaluation are:
• To assess the current and past design, planning and management of the key processes and methods that have been employed to conduct the PES since its inception
• To assess the degree to which the PES meets the real data needs/analysis of the range of education stakeholders in Somalia
• To assess the role of the PES in relation to the nascent EMIS and review the current linkages
• To assess the need for expanding the breadth and depth of the information gathered and, in particular, to review the idea of integrating the collection of secondary school data
• To assess the mechanisms for ensuring internal validity and reliability of the data presented in the PES
• To assess the degree to which training of MoE officials in data collection and management has been successfully and appropriately implemented
• To assess the utility level of the PES in Somalia, in particular the benefits of its results and outputs to the intended users
• To identify best practice, innovative interventions and shortcomings in the current process of planning and implementing the annual survey
• To make recommendations and suggestions on possible improvements related to all aspects of the routine survey, including, if necessary, ways of increasing levels of participation, ownership and utility of the survey/data by the MoE authorities and other stakeholders
• To make recommendations with respect to the evolving relationship between the current/future PES and the nascent EMIS system

3.2 Major Questions
The evaluation will address questions which have been organized according to generally applicable evaluation criteria:
• Effectiveness
• Relevance / Appropriateness
• Efficiency
• Impact
• Sustainability
• Coverage
• Coordination

3.3 Performance Standards
Prevailing norms (for cost, time and service quality) for the UNICEF programme in Somalia shall be used as benchmarks for addressing the above listed evaluation questions.


4. EVALUATION PROCESS AND METHODS

4.1 Evaluation Stages
The evaluation shall span the following main stages:
i. Initial desk review of all relevant documents: annual PES documents, databases, country programme documents, consultancy reports, project documents, progress reports, training programme reports and minutes of meetings, among others
ii. Submission of an Inception Report mapping out the evaluation plan and questionnaires and highlighting key initial findings
iii. Consultations with individuals and organisations concerned, including field visits (see details below)
iv. Preparation of draft report and circulation among individuals and organisations concerned
v. Submission of the draft report to UNICEF USSC, highlighting the main conclusions and lessons learned from the evaluation
vi. Production and submission of final report

4.2 Evaluation Data Collection Methods
The principal source of data will be interviews with the respective stakeholders. Respondents will include MoE officials at central, regional and district levels, NGO officials, head teachers, teachers, survey enumerators, supervisors, and UNICEF/UNESCO staff at Nairobi and zonal level, as well as local and international NGOs. The methods followed would be largely qualitative and include key informant semi-structured interviews, community level interviews and focus group discussions. Questionnaires for guiding the interviews will be developed for review in the inception phase. Should it be necessary, the expert will hire local translators.

5. Accountabilities
The evaluation expert shall be professionally independent in carrying out a critical assessment, to enable him or her to come up with candid answers to the questions posed above. The expert shall, for administrative purposes, be managed by UNICEF's USSC Education Section - XX, who will also provide technical advice for the task. The evaluation expert shall be responsible for efficient and timely performance and the quality of the reports. The expert will be bound by normal UNICEF rules of confidentiality.

6. Evaluation Expert
The evaluation expert shall be responsible for scrutiny of data sources, design of questionnaires, conducting interviews, data analysis and report writing. The ideal candidate would have:
• A Master's degree in the social sciences, education and/or development studies
• Extensive experience (8-10 years minimum) in the evaluation of social sector programmes, particularly education programmes
• Experience in design, management and evaluation of education information systems
• Expertise in qualitative and quantitative methods of evaluation
• Expertise in ICT, including spreadsheets and database management
• Excellent knowledge of evaluation norms, standards and approaches
• Proven communication, facilitation and writing skills
• Excellent knowledge of English (oral and in writing)
• Familiarity with the Somali context
• Ability to work independently

A candidate with the following would be preferred:
• Field experience in post-conflict situations
• Knowledge of national/local language(s)
• Expertise in evaluating social sector projects

The evaluation expert shall arrange his or her own office equipment (such as a laptop) and interview material (such as tape recorders). While no specific work schedule is being suggested, the expert would be required to deliver the agreed products within the specified time.

7. Procedures and Logistics
The evaluation is expected to take two months, as per the following timeline, after the engagement of the expert in early January 2008.

Table 3: Timeline for Evaluation

Activity (location; duration in working days)
• Briefing at UNICEF USSC (USSC, Nairobi; 2)
• Desk study (USSC, Nairobi; 5)
• Interviews with stakeholders in Nairobi (Nairobi; 5)
• Preparation of an Inception Report, listing methodology, means of data collection and questionnaires (2)
• Presentation of Inception Report, feedback and finalization of methods (USSC, Nairobi; 1)
• Visit to Somaliland: interview UNICEF/UNESCO staff, consult records and meet/interview MoE authorities at all levels (Somaliland; 5)
• Visit to Puntland: interview UNICEF/UNESCO staff, consult records and meet/interview MoE authorities at all levels (Puntland; 5)
• Visit to Central Southern Zone: interview UNICEF/UNESCO staff, consult records and meet/interview MoE authorities at all levels (Central South; 5)
• Preparation and presentation of preliminary findings and recommendations (USSC, Nairobi; 5)
• Preparation of draft report, presentation and feedback (USSC, Nairobi; 7)
• Final report submission (USSC, Nairobi; 3)

Total working days: 45


Appendix 3: EMIS Conceptual Framework

To clarify the understanding about what an EMIS is, and how it can be of assistance in developing an education infrastructure, the definition and conceptual framework of an EMIS has been adopted from the work undertaken by UNESCO (1998) Bangkok. This conceptualisation defines an EMIS to be 'an organised group of information and documentation services that collects, stores, processes, analyses and disseminates information for educational planning and management' (p.2). As such it is more than a data storage system. Rather, it integrates system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and adaptability, and efficiency are maximised. The elements that need to be integrated are the:
• Needs of data producers and users
• Data
• Information handling
• Storage of data
• Retrieval of data
• Data analysis
• Computer and manual procedures
• Networking among EMIS centres (UNESCO 1998, p. 4)

It also has a dynamic process orientation that focuses on the two-way flow of information, so that all levels of the organisation can make effective use of the data and the reports generated from the data. This flow of information needs to be made efficient by avoiding duplication of effort, made accurate by incorporating validation procedures and adequate training of personnel at all levels, and made to meet the needs of all stakeholders in the system. This complex interaction of components is represented in Figure 1, which has been adapted from the UNESCO (1998) report to reflect some aspects of the Somalia context. The main goal is to develop approaches which contribute to the systematic incorporation of performance data into the policy design and implementation cycle, not the enhancement of data on an individual agency to support intervention (Brental, Peart, Carr-Hill, and Cox 2000). More specifically, the UNESCO (1998) report argues that a well designed EMIS has the ability:
• To improve capacities in data processing, storage, analysis and supply of educational management information so that educational planners and administrators can avail themselves of reliable and timely data.
• To co-ordinate and further improve dispersed efforts in the acquisition, processing, storage, transmission, analysis, repackaging, dissemination and use of educational management information.
• To facilitate and promote the use of relevant information by various agencies and individuals at all levels for more effective educational planning, implementation and management.
• To streamline the flow of information for decision-making by reducing and eliminating duplications as well as filling information gaps.
• To provide information for policy dialogue and scenarios for development of the education system. (UNESCO 1998, p.3)



[Figure 1: Information Flow (adapted from UNESCO 1998). The figure shows EMIS data (students, curriculum, personnel, finance, facilities, other) for all levels of schooling (early childhood, primary, high school, PAE, NFE, nomadic and Qur'anic schools) flowing between the levels of decision making (national, zones, regions, districts, villages/schools) and the information products generated (reports, plans, data/statistics bulletins, directories, physical plant and facilities design, budget estimates, others).]

The importance of an EMIS is supported by the increasing tendency of donor agencies to call for in-country capacity for evaluation. This is reflected in donor agencies shifting their focus from specific projects to system level indicators (Brental et al. 2000). This demand can best be clarified by taking a systems approach to the education system. May and Vine (1997) identified such an approach that, it was argued, identified the 'major factors in an educational production system.' The model has been used extensively elsewhere (Windham, 1988) and is predicated on a systems approach that involves inputs, processes and outputs as its major components (Table 4).

Table 4: Major Factors in the Education Production System

Determinants
Inputs: Student characteristics; Teacher characteristics; School characteristics; Instructional materials and equipment characteristics; Facilities characteristics
Process: Forms of instructional organisation; Classroom technologies; Teacher and student time allocation

Effects
Outputs: Cognitive achievement; Improved manual skills; Attitudinal changes; Behavioral changes
Outcomes: Employment; Earnings; Status; Attitudes; Behaviors


The significance of the model is that it can provide guidance as to how to monitor and evaluate the efficiency and effectiveness of the education system. To achieve this it has been suggested that a range of indicators can be calculated that reflect the status of each component of the system (May and Vine 1997). Such a model has been extensively developed in the Education for All (EFA) project established by UNESCO following the Jomtien agreement of 1990 (UNESCO 2000). This model defined 18 indicators that have three significant functions. First, they can be used to describe the health of an education system. Secondly, by using trend data they can be used to evaluate progress towards a set of key educational goals. Finally, by disaggregating the data across gender or regional groups, they can be used to assess the equity of the system. The abbreviated names and definitions of the 18 core indicators are presented in Table 5.

Table 5: The 18 Core 'Education For All' Indicators

ECD (indicator 1): Gross enrolment ratio in Early Childhood Development (ECD).
G1ECD (indicator 2): Percentage of new entrants to grade 1 with ECD experience.
AIR-NIR (indicators 3, 4): Intake rates (AIR and NIR) in primary education.
GER-NER (indicators 5, 6): Enrolment ratios (GER and NER) in primary education.
EXP (indicators 7, 8): Public current expenditure on primary education a) as a % of GNP; b) per pupil as a % of GNP per capita; and as a % of total public current expenditure on education.
TEACHERS (indicators 9, 10): % of primary teachers with required academic qualifications; % of primary teachers who are certified to teach.
P-T RATIO (indicator 11): Pupil-teacher ratio in primary education.
REPETITION (indicator 12): Repetition rates by grade in primary education.
SURVIVAL (indicators 13, 14): Survival rate to grade 5; coefficient of efficiency at grade 5 and at the final grade.
ACHIEVEMENT (indicator 15): % of pupils who master basic learning competencies.
LITERACY (indicators 16, 17, 18): Adult literacy rates, 15-24 years and 15 years and over; Literacy Gender Parity Index (GPI).

From these indicators, the data required for their calculation can be deduced. For trend analysis to be undertaken, a longitudinal set of data needs to be compiled, with the implication that the data collection needs to be updated annually. Finally, if full use of the data is to be made, the variability of the data needs to be investigated. Possible sources of this variability are gender, Zone, Region, District, Village, socioeconomic profiles, school type and so on; hence the need to be able to disaggregate the data across these variables. While these indicators are useful for providing general information about the efficiency and effectiveness of the education system, it is also useful to think about what data can inform us about many other aspects of the system. If systems developed elsewhere for developing countries are examined (e.g. the UNESCO model, Timor Leste, Myanmar), there appears to be considerable uniformity in the general nature of EMIS data, given that they always need to be adapted for local structures, conditions and contexts.
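To make the enrolment ratio indicators concrete, the sketch below shows how GER and NER would be computed from enrolment and population counts, disaggregated by sex. It is a minimal illustration using invented figures, not PES or Somalia data, and it assumes that enrolment and population counts for the official primary age range have already been compiled.

def gross_enrolment_ratio(enrolled_all_ages, population_official_age):
    # GER: total primary enrolment (any age) over the population of
    # official primary school age, expressed as a percentage.
    return 100.0 * enrolled_all_ages / population_official_age

def net_enrolment_ratio(enrolled_official_age, population_official_age):
    # NER: primary enrolment of children of official primary age only,
    # over the population of that age group, as a percentage.
    return 100.0 * enrolled_official_age / population_official_age

# Illustrative figures only, disaggregated by sex so that the gender
# dimension of the indicator can also be inspected.
population = {"F": 48_000, "M": 52_000}          # children of official primary age
enrolled_any_age = {"F": 11_500, "M": 18_200}    # all primary pupils, any age
enrolled_official = {"F": 9_800, "M": 15_400}    # pupils of official primary age

for sex in ("F", "M"):
    ger = gross_enrolment_ratio(enrolled_any_age[sex], population[sex])
    ner = net_enrolment_ratio(enrolled_official[sex], population[sex])
    print(f"{sex}: GER = {ger:.1f}%, NER = {ner:.1f}%")

The same pattern, substituting the relevant numerator and denominator, covers most of the other EFA rate and ratio indicators; the hard part in practice is assembling reliable population denominators, which is why the framework below treats population as one of the foundation data domains.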


We can think of this data in the form of a database containing a number of foundation tables in a relational database. They are tables concerning the following domains of data:
• School Infrastructure
• School Management
• School Pedagogy
• School Curriculum
• School Budget
• School Inspection
• Student Admissions
• Student Grade/Class Assignment
• Student Attendance
• Student Learning Achievement
• Teachers Employed
• Teacher Grade/Class Assignment
• Teacher Attendance
• Administrative Staff
• General Staff
• Population

The EMIS would also contain similar domains of data for the multiple levels of education institutions, including early childhood centres, primary, secondary, non formal education institutions and tertiary institutions. It is a common model to move gradually to this complete system by initially building the technical capacity required for effective EMIS implementation in one sector and then expanding to other sectors. Each domain of data may be represented by more than one table. For example, school infrastructure would have a separate table for each of the following aspects of infrastructure:
• Buildings and Classrooms
• Furniture
• Teaching and Learning Resources
• Water and Sanitation
• Utilities (e.g. mains electricity supply)
• Communications

The school management domain would have separate tables for:
• School Management Committee
• School CECs
• Special school development projects
• Catchment Area
• Affiliated Schools

Similarly, the School Curriculum domain would have separate tables for:
• Primary School Subjects
• Middle School Subjects
• High School Streams and Subjects

All of the data referred to above originates from within the education system, with the exception of population data. This needs to be derived from census data and embedded in the EMIS at village level by single year of age and sex. Typically these data are derived from school level and aggregated at district, regional and national levels. The precise organisation for this aggregation depends upon many factors, including personnel technical capacity, hardware availability, remoteness, and financial resources. A highly developed system would generally provide for data entry to be undertaken at a decentralised level and aggregated as it is sent forward to regional and national levels.
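A minimal sketch of how a few of these foundation tables might be expressed in a relational data base is given below. SQLite is used purely so that the example is self-contained; the table and column names are hypothetical and do not reproduce the schema of the PES Access data base.

import sqlite3

# A tiny, hypothetical slice of the foundation tables described above:
# one table per domain, linked by settlement and school identifiers.
schema = """
CREATE TABLE settlement (
    settlement_id TEXT PRIMARY KEY,
    zone TEXT, region TEXT, district TEXT, name TEXT,
    latitude REAL, longitude REAL
);
CREATE TABLE population (               -- census-derived, by single year of age and sex
    settlement_id TEXT REFERENCES settlement (settlement_id),
    age INTEGER, sex TEXT, persons INTEGER
);
CREATE TABLE school (
    school_id TEXT PRIMARY KEY,
    settlement_id TEXT REFERENCES settlement (settlement_id),
    name TEXT, school_type TEXT         -- e.g. primary, PAE, nomadic
);
CREATE TABLE student_admission (        -- one row per school, year, grade and sex
    school_id TEXT REFERENCES school (school_id),
    year INTEGER, grade INTEGER, sex TEXT, pupils INTEGER
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
print("Tables created:",
      [row[0] for row in conn.execute(
          "SELECT name FROM sqlite_master WHERE type = 'table'")])

The point of the shared settlement and school identifiers is that enrolment, population and location data can then be joined, which is exactly what the indicator calculations and the mapping work described elsewhere in this report require.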


At each level of the EMIS there would be an associated Decision Support System (DSS) tool set appropriate for that level. For example, regional/district data could be interfaced with UNESCO's decentralised education planning models, and with Geographic Information Systems (GIS) software for mapping school catchment areas. Similarly, higher levels would include in their DSS tools such as SPSS (a statistical analysis package) and data-transfer programmes for exporting database tables to it. Further, the national level could be interfaced with UNESCO's national education planning model and the UN's DevInfo software for thematic mapping, on a national scale, of crucial education indicators such as those that measure progress against the Education for All (EFA) goals and the Millennium Development Goals (MDGs). EMIS has the potential to lift management knowledge and skills across the Ministry and thus increase the effectiveness of education management across the country. Importantly, the process involves a two-way flow of information: when data is aggregated centrally, reports need to be provided back to stakeholders at all levels.

More broadly, the village/district component of EMIS could be conceptualised as a social-sector local management and planning information system by ensuring some compatibility between the health and education information systems. As a minimum, that would ensure that education and health managers and planners use the same basic population data. Such arrangements would greatly assist responses to emergency situations in particular, and would support integrated programming by development agencies. This broader integrated approach would also have substantial resource-sharing benefits. For example, there would need to be only one set of hardware and one management structure in each township covering both EMIS and HMIS, one set of DSS software tools, and one GIS with education and health data at village level. Given the scarcity of trained personnel, this would have obvious advantages. These conceptualisations will be used as a guide to the evaluation of the information systems in use in Somalia.
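As a small illustration of the kind of data-transfer step described above, the sketch below uses invented figures and a hypothetical output file name; it is not an actual PES/EMIS, SPSS or DevInfo interface. It computes a regional indicator and writes it to a flat CSV file of the sort that statistical and thematic-mapping packages can import.

import csv

# Illustrative sketch only: enrolment and population figures are invented,
# and "emis_indicators.csv" is a hypothetical export file name.
enrolment_by_region = {"Region 1": {"F": 9800, "M": 12400},
                       "Region 2": {"F": 7100, "M": 9400}}
# Population of the official primary-school age group, by region and sex.
school_age_population = {"Region 1": {"F": 30500, "M": 31200},
                         "Region 2": {"F": 27800, "M": 28300}}

rows = []
for region, enrol in enrolment_by_region.items():
    pop = school_age_population[region]
    ger_f = 100.0 * enrol["F"] / pop["F"]   # Gross Enrolment Ratio, girls
    ger_m = 100.0 * enrol["M"] / pop["M"]   # Gross Enrolment Ratio, boys
    gpi = ger_f / ger_m                     # Gender Parity Index of the GER
    rows.append({"region": region,
                 "ger_female": round(ger_f, 1),
                 "ger_male": round(ger_m, 1),
                 "gpi": round(gpi, 2)})

# Flat CSV output, a format that tools such as SPSS or DevInfo can import
# for further analysis or thematic mapping.
with open("emis_indicators.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["region", "ger_female",
                                           "ger_male", "gpi"])
    writer.writeheader()
    writer.writerows(rows)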


Appendix 4 Research Questions and Data Collection Planner

Effectiveness
1. To what extent does the data/information generated from the survey meet the needs of the primary stakeholders: MoE officials at central, regional and district level?
Data source: Director General, Planning Officer; regional office, district office, schools.

2. Have adequate efforts been made to strengthen and empower the educational authorities at all levels in terms of increasing their capacity to take full ownership of the process of planning and implementing the annual survey?
Data source: UNICEF, survey team, DG, Planning; annual reports.

3. Is the annual survey embedded/institutionalised in the Educational Policies and Strategies of the MoEs?
Data source: DG, Planning, regional, district, school; UNICEF, UNESCO, EFA FTI. Link to 1. How is data used? Specific examples.

4. What are the lessons learned as a result of the annual survey, in terms of building capacities among district authorities, communities, local government and local educational institutions?
Data source: UNICEF, survey team, Planning, regional, district, school; annual reports.

5. In terms of planning and implementing, to what degree have the annual surveys provided appropriate monitoring systems to help education stakeholders (UNICEF and UNESCO included) measure impact and effectiveness, and improve accountability?
Data source: UNICEF, UNESCO, EFA, FTI.

6. How are the "EMIS documents" of school register, student register, pupil record card and instructions for use, used?
Data source: UNICEF, DG, Planning, region, district, school. How long have they been in use? Is data centrally collected? Is this done electronically?

7. What other data are systematically collected, how, when and by whom?
Data source: DG, Planning, region, district, schools. Is there duplication?

Further prompts: Ask for examples; map to common EMIS models. Who would do it? What resources are needed; how can responsibility be shifted? Link to 2. Link to 1, examples.

Relevance / Appropriateness
8. In designing the survey, was an assessment of the needs of the educational institutions, district authorities and MoEs ever taken into consideration?
Data source: Survey design team, UNICEF, DG, Planning, regions, districts; annual reports.

9. Are the questionnaires well designed and appropriate?
Data source: Survey team, consultant, schools. Problems with completion?

10. Are the questionnaires really gender sensitive?
Data source: Consultant. Disaggregation where appropriate?

11. Do they provide appropriate data to highlight the needs of vulnerable and marginalised children?
Data source: UNICEF, consultant.

Further prompt: Has the Ministry ever suggested additions to the surveys? Links to 1 and 5.

12. Are the survey results used for MoE planning and management purposes and, if so, to what extent?
Data source: DG, Planning, region, district.

13. Are the survey results documented and presented fully in line with the requirements and wishes of the MoE authorities?
Data source: DG, Planning, region, district, schools. What reports do you get? How are the reports presented (paper, book, electronically)? What would your preference be?

14. Is feedback given to the end-users; in other words, are districts and educational institutions provided with the documented and user-friendly survey results?
Data source: DG, Planning, region, district, schools.

15. Are educational institutions and districts motivated to carry out and collaborate with the annual surveys? Do the results from the survey motivate districts and educational institutions to improve their performance?
Data source: Regions, districts, schools, survey teams.

16. Is the planning and implementation of the survey guided by cross-cutting principles of gender sensitivity, partnerships, national ownership, capacity development, a bottom-up approach to programming, and participation of children and young people?
Data source: UNICEF.

17. Does the data collected properly match the data requirements of the EMIS information system?
Data source: Consultant, UNESCO documents, EFA indicators. Link to 1.

Further prompts: Link to 13. Is there resistance to the process? If so, why? Are the surveys perceived to be valuable? How significant is this?

Efficiency
18. Are the annual surveys carried out in a timely manner, taking into account climatic, logistic and security constraints?
Data source: UNICEF, annual reports, survey team.

19. To what extent are the communities and educational institutions involved in planning, timing and carrying out the routine surveys?
Data source: UNICEF, survey teams.

20. Is the total enumeration approach employed the preferred method for carrying out the survey, or would a stratified sampling method be feasible?
Data source: Consultant.

21. Is the timing/content and duration of enumerator training appropriate?
Data source: UNICEF, survey team, evaluation reports. What training?

22. To what extent is pilot testing used to improve skills and monitor training outcomes?
Data source: UNICEF, survey teams, design reports.

23. Does the payment of incentives skew outcomes or promote rent-seeking attitudes?
Data source: UNICEF, Planning, annual reports, survey team. Evidence or opinion?

24. In collecting and collating the data, do adequate validity checks take place at the point of data collection, data cleaning and data entry into the information system?
Data source: UNICEF, annual reports, survey teams.

Further prompt: Seek general reply from DG and Planning.

25. Are data integrity and validity checks carried out, in terms of random visits to educational institutions?
Data source: DG, Planning, region, district, UNICEF, annual reports.

26. Has the cost of carrying out the survey been reasonable in relation to other parts of the country programme?
Data source: UNICEF, annual work plan, budget.

27. Can recommendations be made as to cost-saving elements in terms of funding the routine annual survey?
Data source: Consultant, UNICEF. Discuss in relation to question 3 and the development of EMIS.

Impact
28. Do the annual survey and the published results contribute to increased enrolment rates, greater opportunities for girls, increased numbers of teachers and a larger number of quality schools?
Data source: UNICEF, FTI, DG, Planning. Link to effectiveness.

29. Have there been unintended positive or negative consequences as a result of the annual survey taking place?

Sustainability
30. Is the recurring process of conducting the PES likely to continue if/when UNICEF withdraws its funding support?
Data source: DG, Planning. Link to 2, 3.

31. Can recommendations be made about shifting the responsibility for data collection to the districts and educational facilities, as opposed to a donor-driven and donor-coordinated method of data collection, while maintaining its validity?
Data source: DG, Planning, region, district.
Note: If an EMIS system were established, owned by the MoE, how would it be structured and decentralised? What resources would be needed? Where would you start?

Coverage
32. Does the PES equally cover urban and rural areas?
Data source: UNICEF, annual reports.

33. Did UNICEF's assistance in any way influence the existing inequities between urban and rural educational institutions, provided these exist?
Note: Is this relevant here?

34. Does the PES provide sufficient data on alternative education pathways such as Nomadic and Qur'anic Schools?
Data source: UNICEF, annual reports.

Coordination
35. Does UNICEF effectively work with other stakeholders (local, district, regional authorities, NGOs, UN agencies) during all stages of the survey process?
Data source: UNESCO, relevant NGOs, district, regions, Planning, DG.

36. Are all international development partners and NGOs fully aware of the routine survey of primary education funded by UNICEF, and are these partners routinely supplied with the survey results?
Data source: Relevant NGOs, UNESCO, UNDP…????

37. Does any duplication of activities take place, in terms of surveys of primary education in Somalia?
Data source: Schools, district, regions, Planning. What other data is collected? Link to 6, 7.

38. How can coordination be improved for better efficiency and sustainability?
Data source: Consultant, recommendations.

Coherence
39. Does the type of data collected match the reporting requirements of national and global frameworks: the Reconstruction and Development Framework, the Joint Needs Assessment (JNA), the UNTP IMEP, the MDG targets and the EFA goals?
Data source: DG, Planning, EFA, FTI, Reconstruction and Development Framework.

40. How consistent has the planning and implementation of the PES been with a human-rights based and gender-sensitive approach to programming?
Data source: UNICEF, consultant, annual reports, annual work plans. Link to 1 and 5.

Appendix 5 List of Interviewees

Interviewees and focus groups, with organisational status and date of interview:

Maurice Robson, UNICEF Education Officer (1/7/08)
Edith Mururu, Education Specialist, UNICEF (3/7/08)
Lawrence Mulinge, Computer Programming Consultant (4/7/08)
Woki Munyui, Education Specialist, UNICEF (7/7/08)
Education Sector Meeting, Nairobi (8/7/08)
Catherine Remmelzwaal, Education Specialist, UNICEF (8/7/08)
Maulid Warfa, Education Specialist, UNICEF (13/7/08)
Safia Jibril Abdi, Project Officer, UNICEF (12/7/08)
Rashid Hassan Muse, Project Officer, UNICEF (14/7/08)
Hassan Haji Mohmoud, Minister of Education, Somaliland (12/7/08)
Ali Abdi Odowa, Director General of Education, Somaliland (12/7/08)
Abdi Abdulahi, Director of Planning, Somaliland (14/7/08)
James Wamwangi, Field Coordinator, UNESCO (15/7/08)
Mohamud Bile Dubbe, Minister of Education, Puntland (19/7/08)
Abdi Mohamed Gobbe, Vice Minister of Education, Puntland (19/7/08)
Mohamed Jama, Director of Examination, Puntland (19/7/08)
Said Farah Mohd, Director of Curriculum and Teacher Training, Puntland (19/7/08)
Khalif Yusuf Muse, Regional Education Officer, Nugal (19/7/08)
Abdulkadir Mohamed, Enumerator (19/7/08)
Said Mohamed Hassan, UNICEF Programme Assistant (19/7/08)
Abdirashid Ismail, OIC, Save the Children, Puntland (19/7/08)
Ahmed Ali Shire, SCOTT Coordinator, Save the Children, Puntland (19/7/08)
Ahmed Abbas Ali, Basic Education Manager, Save the Children, Puntland (19/7/08)
Representatives of 11 NGOs: AET, SCUK, Care Somalia, SC-Denmark, CfBT Education Trust, Caritas Somaliland, UNESCO, Trocaire, Diaz, WFL (25/7/08)
Christophe Mononye, UNESCO Program Specialist (13/8/08)
Kimanzi Muthengi, Secondary Survey Manager, UNESCO (13/8/08)

18/8/08

Appendix 6 EMIS Development Planning Matrix

The matrix covers proposed activities over a five-year timeline (Years 1 to 5), the assumptions underlying them, the threats to their achievement, and the staff whose capacity is to be developed.

Activities (planned across Years 1 to 5):
• Vision and rationale for EMIS accepted
• Relational database design and construction
• Form design: use current model with added data; update of previous year's report; increase data set with Phase 1 additions (EFA data and school types), Phase 2 additions (ECD background, NFE, etc.) and Phase 3 additions (qualitative aspects)
• Data collection, cleaning and validation: audit trail; GPS waypoints; replace enumerators with head teachers; conduct independent validation survey; check data from EMIS tools; select different key data elements for validation each year
• Timing: conduct census in February; conduct census in September
• Data entry, pre-coding and cleaning: one entry in Zone, one in Nairobi; both entries in Zone
• Enumerators and supervisors: employment; training
• Reporting and analysis: school report; district reports; regional reports; Zone and Somalia reports; thematic reports; interrogate the Access database; front-end reporting software developed; indicators transported to DevInfo
• School mapping: school location data validated; data transported to appropriate mapping software; education data linked with health and emergency data
• Establish EMIS Units: Somaliland; Puntland; South Central Zone; maintenance and update plan for computing equipment; technical assistance needed in each zone; PES international consultant available; UNICEF ongoing support (provides overall coordination)

Assumptions:
• A clear vision of the role of the PES and EMIS is enunciated by senior staff
• The current database design is used and updated as necessary
• Current forms are updated to include the new data; school reports are generated and used as the basis for the following year's data collection
• Schools are given adequate warning on preparation of age/grade data; resources are available
• Demand warrants this data; properties can be quantified objectively
• Documentary evidence is collected of school visits and data checking; GPS units are available; a quality assurance system is developed to allow regular contact with schools; there is still a need to confirm that data collection processes are accurate; schools understand the need to provide documentary evidence for data provided; a physical check of some data elements each year will improve validity
• Planning time is not available to commence in September (implies two data collections in 2009)
• Adequate training and support is available to data entry staff; successful data entry in Year 1
• Sound and transparent processes are in place; performance reviews take place
• Greater emphasis on understanding; competencies demonstrated to qualify; new modules developed
• Training of head teachers and CECs on use of reports; training on use for planning and monitoring
• Current model of report reduced to key tables, indicators and trends
• Special interests are identified and technical assistance employed
• Undertaken at zone level
• Develops over time; specialist programmer employed
• UNICEF staff responsible for this task
• All GPS location data collected
• Specialist mapping expertise employed
• A useful model for linking data sets is developed
• Staff of an EMIS coordinator, an EMIS database manager, five data entry and retrieval staff, and two secretarial and administrative assistants
• The Ministries' financial commitment grows over time
• Internet and anti-virus software are available
• UNICEF ongoing support provides overall coordination

Threats:
• Lack of commitment to evidence-based planning; secondary and non-formal databases not consistent
• Consistency across zones as EMIS units develop independence; reporting procedures not developed
• EMIS tools not being used
• Data becomes too complex for collection
• Too much reliance on UNICEF staff
• Technical competence not developed; isolated schools missed; nomadic schools not found; cost
• Ethical issues of false data are not understood
• Deliberate misrepresentation of the physical reality can still take place
• Early rains, continued insecurity in some areas
• Need to communicate between the two data entry teams
• Professional qualities of staff
• Information is not known
• Lack of professional commitment; lack of effective performance monitoring; inadequate numbers trained to allow rejection of 'failures'
• Proposals developed are ignored
• Quality assurance procedures not in place
• MoE does not embrace evidence-based planning
• Capacity is not available in the EMIS Unit
• MoE won't release data
• Understanding of mapping processes
• Poor match between school location and settlement location; a perceived irrelevance of the cross-cutting approach
• Quality staff not available; staff leave after capacity is built; Ministry does not make a financial commitment; continued instability; lack of a functioning ministry; IT maintenance contract not issued; equipment not maintained; availability of resources
• Does not coordinate across zones and EMIS Units

Capacity development (staff concerned):
• EMIS coordinator, EMIS manager
• EMIS staff
• EMIS coordinator, EMIS manager, area specialists
• REOs, EMIS coordinator
• Enumerators, REOs, head teachers, EMIS coordinator
• REOs, head teachers, enumerators
• REOs, head teachers
• EMIS unit data entry staff
• Database manager
• Senior MoE officials
• Head teachers, enumerators, heads of relevant sections of the Ministry
• MoE senior staff
• Head teachers, CECs
• Enumerators
• DEOs
• REOs
• Ministry senior staff
• NGO staff
• Enumerators, EMIS Unit staff
• All stakeholders
• All EMIS staff; EMIS manager and data entry staff
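One of the activities above is validating school GPS waypoints and passing them to mapping software. The snippet below is a purely illustrative sketch with invented waypoint records and a hypothetical output file name, not the actual PES tools; it checks that recorded coordinates fall within a rough bounding box and writes the valid points to a CSV file that most GIS packages can import as a point layer.

import csv

# Illustrative sketch only: invented waypoint records and a hypothetical
# output file name; not the actual PES/EMIS school-mapping tools.
waypoints = [
    {"school_id": 1, "name": "School A", "lat": 9.56, "lon": 44.07},
    {"school_id": 2, "name": "School B", "lat": 8.40, "lon": 48.48},
    {"school_id": 3, "name": "School C", "lat": 0.0,  "lon": 0.0},   # missing GPS fix
]

# Rough bounding box used as a sanity check on enumerator-recorded readings.
def plausible(point):
    return -2.0 <= point["lat"] <= 12.5 and 40.5 <= point["lon"] <= 52.0

valid = [p for p in waypoints if plausible(p)]
rejected = [p for p in waypoints if not plausible(p)]

# Point layer for GIS software: one row per school with its coordinates.
with open("school_locations.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["school_id", "name", "lat", "lon"])
    writer.writeheader()
    writer.writerows(valid)

print(f"{len(valid)} valid waypoints written, {len(rejected)} flagged for re-survey")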

Appendix 7 Evidence Based Planning

A major thrust of this evaluation has been to suggest the development of systems and reporting techniques that will facilitate evidence-based planning processes in the education sector of Somalia. It has been seen in the text that while agencies and NGOs have been making systematic use of the PES report, MoEs, regions, districts and school communities have not been making use of the data. It has also been argued that this is in large part due to the lack of suitable reporting strategies for making the necessary data available to these stakeholders.

There is considerable evidence in support of the proposition that decentralising decision making and responsibility can lead to enhanced education opportunities for children. This process is explained succinctly in a UNICEF paper studied in the desk review for this work. It said in part:

The overall goal is to bring interested people together, to encourage them to identify and prioritize key improvements that need to be made, and then to plan how the improvements can be made. In a sense, this would amount to a process of diagnosis of challenges faced by school-aged children in the areas, from which should emerge an interest and commitment to improve schooling. Much of this would be built upon the processes that have been established by Community Education Committees (CECs), the assumption being that the most effective agents of change are likely to be parents and those that represent their vested interests. (Mendelsohn 2007)

Schools can be provided with simple forms of data that allow them to compare their own school's status with that of other comparable schools. This data can be in the form of infrastructure quality (inputs) or of student achievement (outputs). The existence of such data allows and encourages discussion and debate about the key issues; more importantly, it provides objective evidence for that debate. However, this evidence usually supplies only the 'what', and local input is needed to explain the 'why'. As an example, if a school finds that it has a low gross enrolment ratio in comparison with other schools in its district, it could conduct its own investigation into why that is the case and then identify possible solutions that would work in that specific context. In this way generic overall policy, which may lack application in some areas, is replaced with differentiated strategies that are designed around local issues and therefore gain greater ownership and support.

It has also been suggested that school communities will need specialist assistance in developing submissions and proposals to address the key issues when outside assistance is required. This process has begun through mobilisation agents and community development initiatives. When districts and regions receive requests for assistance they will need to be able to prioritise the requests effectively. Hence they will need district and regional overviews of school status and performance to allow the most rational decisions to be made. Such reports are not complex to generate once the data are held in an Access database and the requisite reports have been designed in Access's reporting environment.

The need for an effective quality assurance system has been discussed in several places in this report. Such a system can be supported in part by effective monitoring of region, district and school performance. Supervisors are provided with objective data, which can form the basis of mentoring discussions and the formation of school improvement processes.

Mendelsohn (2007) provided a matrix of possible ideas for mobilising school improvement activity that was mainly oriented around a school mapping process. In the matrix below these ideas have been adjusted and added to so that they better reflect the EMIS context. They are presented as examples of possible activities that can be generated from reports, and thus as indicators of the possible nature of those reports. The list is indicative rather than exhaustive; a brief worked example of the kind of school-versus-district comparison it implies follows the matrix.


School enrolment
Indicators: GER, NER; local knowledge of the location of out-of-school children
Goals and challenges: Identify groups of people and areas where few children are in school; plan interventions based on information collected locally

School infrastructure (need for new schools)
Indicator: Average distance to the nearest school
Goals and challenges: Identify construction sources

School infrastructure (need for additional classrooms)
Indicator: Number of pupils per room
Goals and challenges: Identify construction sources

School infrastructure (condition of classrooms)
Indicators: Construction materials; general condition descriptor; expert assessment of condition
Goals and challenges: Plan maintenance programme

School infrastructure (latrines)
Indicator: Number of pupils per latrine
Goals and challenges: Priority to schools with no latrines

School infrastructure (water)
Indicator: Quantity/cost per pupil
Goals and challenges: Priority to schools with no or inadequate water supplies

Drop-out
Indicators: Drop-out rate (and reasons); survival rate to Grade 5; comparison with other schools, districts and regions
Goals and challenges: Identify reasons for high drop-out rates when they occur; how is a high drop-out rate at the end of lower primary addressed?

Teaching resources
Indicators: Number of textbooks per pupil; comparison with other schools
Goals and challenges: Need to identify a basic list of resources that all schools should have

Student achievement
Indicators: Mean scores on external exams; student literacy rate; percentage of students mastering a set of defined basic skills
Goals and challenges: Need to be careful to benchmark schools against suitable comparable schools

Student attendance
Indicators: Attendance rate; comparison with other comparable schools
Goals and challenges: Need good records using EMIS tools

Teacher attendance
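The comparisons suggested above are simple to compute once the survey data are in electronic form. The sketch below is illustrative only, with invented figures and hypothetical field names; it compares each school's pupils-per-classroom and textbooks-per-pupil figures against its district average, which is the kind of one-page report a head teacher or CEC could use to open the 'why' discussion described in this appendix.

# Illustrative sketch only: invented school records and hypothetical field
# names, not actual PES data.
schools = [
    {"name": "School A", "district": "District X", "pupils": 420, "classrooms": 6, "textbooks": 310},
    {"name": "School B", "district": "District X", "pupils": 260, "classrooms": 5, "textbooks": 390},
    {"name": "School C", "district": "District X", "pupils": 515, "classrooms": 7, "textbooks": 260},
]

# District averages, the benchmark each school is compared against.
total_pupils = sum(s["pupils"] for s in schools)
total_rooms = sum(s["classrooms"] for s in schools)
total_books = sum(s["textbooks"] for s in schools)
district_ppc = total_pupils / total_rooms      # pupils per classroom
district_tpp = total_books / total_pupils      # textbooks per pupil

print(f"District X average: {district_ppc:.1f} pupils/classroom, "
      f"{district_tpp:.2f} textbooks/pupil")

for s in schools:
    ppc = s["pupils"] / s["classrooms"]
    tpp = s["textbooks"] / s["pupils"]
    print(f'{s["name"]}: {ppc:.1f} pupils/classroom '
          f'({"above" if ppc > district_ppc else "at or below"} district average), '
          f'{tpp:.2f} textbooks/pupil '
          f'({"below" if tpp < district_tpp else "at or above"} district average)')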


Appendix 8 Glossary

Achievement. Performance on standardized tests or examinations that measure knowledge or competence in a specific subject area. The term is sometimes used as an indication of education quality in an education system or when comparing a group of schools.

Adult literacy rate. Number of literate persons aged 15 and above, expressed as a percentage of the total population in that age group. Different ways of defining and assessing literacy yield different results regarding the number of persons designated as literate.

Apparent Intake Rate. The total number of new entrants in the first grade of primary school regardless of age, expressed as a percentage of the population of the official primary school entrance age.

Basic education. The whole range of educational activities, taking place in various settings (formal, non-formal and informal), that aim to meet basic learning needs. It has considerable overlap with the earlier concept 'fundamental education'.

Basic skills. Usually refers to some minimum competence in reading, writing and calculating (using numbers). The term is synonymous in many uses with basic learning needs.

Constant prices. A way of expressing values in real terms, enabling comparisons across a period of years. To measure changes in real national income or product, economists value total production in each year at constant prices, using a set of prices that applied in a chosen base year.

Continuing (or further) education. A general term referring to a wide range of educational activities designed to meet the basic learning needs of adults. See also Adult education and Lifelong learning.

Criterion Referenced Assessment. Assessment is based on observing previously defined competencies and is reported as a set of competencies (see Norm Referenced Assessment).

Curriculum. A course of study pursued in educational institutions. It consists of select bodies of knowledge, organized into a planned sequence, that are conveyed by educational institutions, primarily schools, to facilitate the interaction of educators and learners. When applied to adult, non-formal and literacy programmes, the term often implies a less formalized organization of learning materials and methods than in schools and tertiary institutions. Indeed, in programmes aimed at individual empowerment and social transformation, the curriculum may be developed as a dialogue with and between learners.

DevInfo. DevInfo is a powerful database system that is used to compile and disseminate data on human development. The software package has evolved from a decade of innovations in database systems that support informed decision making and promote the use of data to advocate for human development. The DevInfo project is an interagency initiative managed by UNICEF on behalf of the United Nations (UN) System.

Drop-out rate by grade. Percentage of pupils or students who drop out from a given grade in a given school year. It is the difference between 100% and the sum of the promotion and repetition rates.

Early childhood care and education (ECCE). Programmes that, in addition to providing children with care, offer a structured and purposeful set of learning activities either in a formal institution or as part of a non-formal child development programme. ECCE programmes are normally designed for children from age 3 and include organized learning activities that constitute, on average, the equivalent of at least 2 hours per day and 100 days per year.
Education for All Development Index (EDI). Composite index aimed at measuring overall progress towards EFA. At present, the EDI incorporates four of the most easily quantifiable EFA goals: universal primary education as measured by the net enrolment ratio, adult literacy as measured by the adult literacy rate, gender as measured by the gender-specific EFA index, and quality of education as measured by the survival rate to Grade 5. Its value is the arithmetical mean of the observed values of these four indicators.

Education Management Information System (EMIS). An EMIS is an organized group of information and documentation services that collects, stores, processes, analyses and disseminates information for educational planning and management. As such it is more than a data storage system. Rather, it integrates system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and adaptability, and efficiency are maximized.

Enrolment. Number of pupils or students enrolled at a given level of education, regardless of age. See also gross enrolment ratio and net enrolment ratio.

Entrance age (official). Age at which pupils or students would enter a given programme or level of education assuming they had started at the official entrance age for the lowest level, studied full-time throughout and progressed through the system without repeating or skipping a grade. The theoretical entrance age to a given programme or level may be very different from the actual or even the most common entrance age.

Fast Track Initiative (FTI). The Education for All-Fast-Track Initiative, launched by the World Bank, is a global partnership between developed and developing countries to promote free, universal basic education by 2015. The initiative seeks to ensure that no country that has demonstrated its commitment to education will fail to meet this goal for lack of resources or technical capacity. In addition to mobilizing funds, the initiative supports the design of comprehensive sector-wide education plans and fills gaps in policy, capacity and data.

Functional literacy/illiteracy. A person is functionally literate/illiterate who can/cannot engage in all those activities in which literacy is required for effective functioning of his or her group and community and also for enabling him or her to continue to use reading, writing and calculation for his or her own and the community's development.

Gender parity index (GPI). Ratio of female to male values (or male to female, in certain cases) of a given indicator. A GPI of 1 indicates parity between sexes; a GPI above or below 1 indicates a disparity in favour of one sex over the other.

Gender-specific EFA index (GEI). Composite index measuring relative achievement in gender parity in total participation in primary and secondary education as well as gender parity in adult literacy. The GEI is calculated as an arithmetical mean of the gender parity indices of the primary and secondary gross enrolment ratios and of the adult literacy rate.

Grade. Stage of instruction usually equivalent to one complete school year.

Graduate. A person who has successfully completed the final year of a level or sublevel of education. In some countries completion occurs as a result of passing an examination or a series of examinations. In other countries it occurs after a requisite number of course hours have been accumulated. Sometimes both types of completion occur within a country.

Gross enrolment ratio (GER). Total enrolment in a specific level of education, regardless of age, expressed as a percentage of the population in the official age group corresponding to this level of education. The GER can exceed 100% due to early or late entry and/or grade repetition.

Gross intake rate (GIR). Total number of new entrants in the first grade of primary education, regardless of age, expressed as a percentage of the population at the official primary-school entrance age.

Gross domestic product (GDP). Sum of gross value added by all resident producers in the economy, including distributive trades and transport, plus any product taxes and minus any subsidies not included in the value of the products.

Gross national product (GNP). Gross domestic product plus net receipts of income from abroad. As these receipts may be positive or negative, GNP may be greater or smaller than GDP.

Gross national product per capita. GNP divided by the total population.

Language (or medium) of instruction. Language(s) used to convey a specified curriculum in a formal or non-formal educational setting.

Literacy. According to UNESCO's 1958 definition, it is the ability of an individual to read and write with understanding a simple short statement related to his/her everyday life. The concept of literacy has since evolved to embrace multiple skill domains, each conceived on a scale of different mastery levels and serving different purposes.

Literacy projects/programmes. Limited-duration initiatives designed to impart initial or ongoing basic reading, writing and/or numeracy skills.

Literate/Illiterate. As used in the statistical annex, the term refers to a person who can/cannot read and write with understanding a simple statement related to her/his everyday life.

Millennium Development Goals (MDG). In September 2000, 189 countries signed the United Nations Millennium Declaration, committing themselves to eradicating extreme poverty in all its forms by 2015. To help track progress toward these commitments, a set of time-bound and quantified goals and targets, called the Millennium Development Goals, was developed for combating poverty in its many dimensions, including reducing income poverty, hunger, disease, environmental degradation and gender discrimination.

Multiple Indicator Cluster Survey (MICS). The MICS programme developed by UNICEF assists countries in filling data gaps for monitoring the situation of children and women through statistically sound, internationally comparable estimates of socioeconomic and health indicators. The household survey programme is the largest source of statistical information on children.

Net attendance rate (NAR). Number of pupils in the official age group for a given level of education who attend school in that level, expressed as a percentage of the population in that age group.

Net enrolment ratio (NER). Enrolment of the official age group for a given level of education, expressed as a percentage of the population in that age group.

Net intake rate (NIR). New entrants to the first grade of primary education who are of the official primary school entrance age, expressed as a percentage of the population of that age.

New entrants. Pupils entering a given level of education for the first time; the difference between enrolment and repeaters in the first grade of the level.

Non-formal education. Learning activities typically organized outside the formal education system. The term is generally contrasted with formal and informal education. In different contexts, non-formal education covers educational activities aimed at imparting adult literacy, basic education for out-of-school children and youth, life skills, work skills, and general culture. Such activities usually have clear learning objectives, but vary in duration, in conferring certification for acquired learning, and in organizational structure.

Norm Referenced Assessment. A form of assessment that ranks and compares students. Commonly reported as a percentage score (see Criterion Referenced Assessment).
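Stated as formulas (an illustrative restatement of the definitions above, not an additional source), the gender-specific EFA index and the EFA Development Index are:

\[ \mathrm{GEI} = \frac{\mathrm{GPI}_{\text{primary GER}} + \mathrm{GPI}_{\text{secondary GER}} + \mathrm{GPI}_{\text{adult literacy}}}{3} \]
\[ \mathrm{EDI} = \frac{\mathrm{NER}_{\text{primary}} + \text{adult literacy rate} + \mathrm{GEI} + \text{survival rate to Grade 5}}{4} \]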


Out-of-primary-school children. Children in the official primary school age range who are not enrolled in primary school.

Percentage of new entrants to the first grade of primary education with ECCE experience. Number of new entrants to the first grade of primary school who have attended the equivalent of at least 200 hours of organized ECCE programmes, expressed as a percentage of the total number of new entrants to the first grade.

Percentage of repeaters. Number of pupils enrolled in the same grade or level as the previous year, expressed as a percentage of the total enrolment in that grade or level.

Primary Education. Programmes normally designed on a unit or project basis to give pupils a sound basic education in reading, writing and mathematics and an elementary understanding of subjects such as history, geography, natural sciences, social sciences, art and music. Religious instruction may also be featured. These subjects serve to develop pupils' ability to obtain and use information they need about their home, community, country, etc. Also known as elementary education.

Public current expenditure on education as percentage of total public expenditure on education. Recurrent public expenditure on education expressed as a percentage of total public expenditure on education (current and capital). It covers public expenditure for both public and private institutions. Current expenditure includes expenditure for goods and services that are consumed within a given year and have to be renewed the following year, such as staff salaries and benefits; contracted or purchased services; other resources, including books and teaching materials; welfare services; and items such as furniture and equipment, minor repairs, fuel, telecommunications, travel, insurance and rent. Capital expenditure includes expenditure for construction, renovation and major repairs of buildings and the purchase of heavy equipment or vehicles.

Public expenditure on education. Total public finance devoted to education by local, regional and national governments, including municipalities. Household contributions are excluded. Includes both current and capital expenditure.

Public expenditure on education as percentage of total government expenditure. Total current and capital expenditure on education at every level of administration, i.e. central, regional and local authorities, expressed as a percentage of total government expenditure (on health, education, social services, etc.).

Pupil. A child enrolled in pre-primary or primary education. Youth and adults enrolled at more advanced levels are often referred to as students.

Pupil/teacher ratio (PTR). Average number of pupils per teacher at a specific level of education, based on headcounts for both pupils and teachers.

Repetition rate by grade. Number of repeaters in a given grade in a given school year, expressed as a percentage of enrolment in that grade the previous school year.

School life expectancy (SLE). Number of years a child of school entrance age is expected to spend at school, including years spent on repetition. It is the sum of the age-specific enrolment ratios for primary, secondary, post-secondary non-tertiary and tertiary education.

School-age population. Population of the age group officially corresponding to a given level of education, whether enrolled in school or not.

Secondary education. Lower secondary education is generally designed to continue the basic programmes of the primary level, but the teaching is typically more subject-focused, requiring more specialized teachers for each subject area. The end of this level often coincides with the end of compulsory education. In upper secondary education, the final stage of secondary education in most countries, instruction is often organized even more along subject lines, and teachers typically need a higher or more subject-specific qualification than at the lower secondary level.

Survival rate by grade. Percentage of a cohort of pupils or students who are enrolled in the first grade of an education cycle in a given school year and are expected to reach a specified grade, regardless of repetition.

Teachers or teaching staff. Number of persons employed full time or part time in an official capacity to guide and direct the learning experience of pupils and students, irrespective of their qualifications or the delivery mechanism, i.e. face-to-face and/or at a distance. Excludes educational personnel who have no active teaching duties (e.g. headmasters, headmistresses or principals who do not teach) and persons who work occasionally or in a voluntary capacity.

Trained teacher. Teacher who has received the minimum organized teacher training (pre-service or in-service) normally required for teaching at the relevant level in a given country.

Trainer. In the context of adult education, someone who trains literacy educators, providing pre-service or in-service training in adult literacy teaching methods.

Transition rate to secondary education. New entrants to the first grade of secondary education in a given year, expressed as a percentage of the number of pupils enrolled in the final grade of primary education in the previous year.

Triangulation. A concept borrowed from surveying. In a research context it refers to the process of using two or more data sources to confirm an observation or the acceptance of a hypothesis.

Definitions based on those from http://portal.unesco.org/education/en/ev.phpURL_ID=43385&URL_DO=DO_TOPIC&URL_SECTION=201.html
