DQAF mission Botswana 2011
Latest revision as of 12:40, 15 October 2012
FINDINGS OF THE DQAF
In this section we will identify processes, systems and technologies that are (a) examples or evidence of good practice, and (b) opportunities to improve or strengthen current practices for optimal efficiency. The major themes extrapolated from the situational analysis have been summarised and are presented below:
Ministry Strengths
Government awareness: The situational analysis reveals that the government and the MoESD recognise the importance of quality data for evidence-based decision making, policy formulation and improving education delivery and outcomes. There is a strong awareness within the Ministry of data quality issues – mainly timeliness and reliability – as well as of the lack of functional planning.
It seems that there is political will to improve the situation. In this vein, the MoESD has recognised the important role that education statistics can play in the education sector. Recommendation 124 of the Revised National Policy on Education (1994) recommended the establishment of the Division of Planning, Statistics and Research as soon as possible, with adequate resources to coordinate and commission research. The policy further emphasised that a priority for the Division should be the development of the Education Management Information System (EMIS). The Organisation and Methods (2006) final report on the restructuring of the MoESD, REC 3.6.2.1.G, takes this policy recommendation a step further by proposing that the Division of Planning, Statistics and Research be elevated to department level.
Sound legislation: The mandates and functions of the departments in the Ministry are governed and guided by relevant legislation and related policies. The Ministry of Education is mandated by the Education Act (CAP 58:01) Section 3 (1) to provide for the proper development of education and for matters incidental to or connected with education from primary to post-primary level (Organisation and Methods, 2006). However, it must be said that the mandate of the Ministry in the Education Act does not reflect the important role of education statistics in education, nor does it include any reference to data collection processes. Through the Tertiary Education Act of 1999, Section 5 (2) (a)-(f), the Tertiary Education Council (TEC) should play an important regulatory role in tertiary education. The Teaching Service Act (Cap 62:01) has mandated Teaching Service Management (TSM) to take over, amongst others, the function of personnel statistics. Furthermore, the Revised National Policy on Education (RNPE) White Paper No. 2 of April 1994 identified seven key strategy issues for the future development of education in Botswana.
Personnel commitment: Another positive point identified during our mission is that education personnel are committed and work hard to meet the requirements of their mandates. We encountered high levels of commitment throughout the education system, at the institutional, regional and national levels.
Planning
Strategic document: We could not find a strategic document that drives the data management agenda and sets the targets for the education sector in the coming years. Such a document should be based on an economic analysis of the education system following a results-oriented approach. There seems to be an attempt within BOTA to analyse labour market needs in order to orient the provision of training by VET providers. This type of socio-economic approach should also be used to analyse the sector as a whole to help in decision making.
No planning unit: The staff members currently in the planning unit of the MoESD are seconded from the Ministry of Finance and Development Planning. Their function is focused more on development projects, and their core mandate relates to infrastructure. They do not fulfil the traditional planning function in the Ministry. There is no planning unit with a monitoring and evaluation (M&E) function that can act as champion for indicators to drive the data demand initiative. In the absence of a proper planning unit there seems to be a lack of leadership around the M&E function. The result is that some departments with no planning mandate are doing some planning themselves because of the need for it, and they often conduct their own data collection, which creates a serious response burden at the school level because of overlapping requests for the same information from different departments and even ministries (see Data collection processes further on for more detail). Furthermore, although several staff members have received high-level technical training in education planning, they have no opportunity to apply their competencies effectively.
No budget vote: The Division of Planning, Statistics and Research (DPSR) does not have a vote on the budget. This has an impact on planning and, together with the financial procedures that are time-consuming, slows down the implementation of planned activities regarding data collection.
Data collection processes
Ad hoc and uncoordinated data collection processes: The EMIS Unit has been implemented without a clear definition of its mandate. The ambiguity concerning who has responsibility for data collection, processing, analysis and dissemination has created a data leadership vacuum within the Ministry. Ministries and departments need quality data that is complete, accurate and timely and that can be used by managers as a basis for making appropriate decisions that contribute to the quality, expansion and sustainability of their programmes and to fulfilling their mandates. Unfortunately, such data is either not provided by the main data-producing agencies (DPSR in the MoESD and the Central Statistics Office (CSO)), or the data these units receive is unreliable, incomplete, untimely and rarely pertinent to the mandates they have to fulfil. The lack of leadership to address these data quality issues causes each of these units to develop their own “silo-based” and parallel data collection processes, mostly under pressure to perform their tasks. The data collection processes are therefore rarely the result of a coordinated effort to address the information needs of education planners, policy makers and managers. The result is ad hoc, fragmented and uncoordinated data collection throughout the entire education system.
Weak linkages between strategic partners: The links between ministries and departments with the same data requirements seem to be very weak. The result is a lack of consensus and of data sharing between these data producers and data users, at each level of the education system, regarding the information needed. The findings below illustrate the duplication of data collection and the absence of data sharing amongst strategic partners.
Lack of norms and standards
Although CSO has overall responsibility for certifying all the data that Government publishes, and has its own statisticians supporting the Ministries, there do not appear to be standards for data quality that the Ministries should adhere to. There is not yet a National Strategy for the Development of Statistics (NSDS) that would coordinate data collection across Ministries. In the absence of standards and coordination, departments resort to making up their own rules concerning the collection and management of their data.
Training: Education officers receive little if any training in data collection methods, and rarely have standardised instructions or manuals on how to collect the data.
Feedback: Since education managers at the regional and local (institutional) level rarely receive feedback on the data reported to higher levels, they have little incentive to ensure the quality of the collected data and to comply with reporting requirements. In fact, regional officers reported that their staff members regard the data collection and verification function as another burden and the inspectorate feels that it impacts on their core pedagogical function.
Data analysis and usage – Data is collected for reporting purposes only, and it appears that very little, if any, data analysis takes place within the MoESD.
Human resources
There is only one staff member in EMIS; the others are seconded from the CSO (Ministry of Finance and Development Planning). There is also a need for a clear delineation of roles between the Education Statistics Unit (ESU) and the EMIS Unit. All the staff members in the Education Statistics Unit have been seconded from the Central Statistics Office (CSO). This has limited the Ministry of Education in terms of building capacity and developing its own staff. There is no EMIS presence at the regional level, either in terms of functions or education officers.
Decentralisation
The Organisation and Methods (2006) document emphasises that the Ministry of Education realised the need to decentralise its functions from the centre to regional level as a result of the rapid expansion of the education system. However, there is no EMIS focal point or function at the regional level. We want to illustrate the importance of coordination at regional level with a success story. The first term data collection exercise is conducted successfully and within an acceptable timeframe essentially because it is closely coordinated with the regions and the inspectorates (refer to paragraph: First term data collection process for primary schools for more details on this process). However, it must be added that this assignment is seen by the sub-national level as an additional task and it is not included in the job description of the staff involved in it.
On the other hand, the annual report is seriously delayed (last published in 2007) because the national level has to link directly with the schools, without any involvement by education officers at the regional level.
While 300 IT specialists have been deployed in the country (mainly in secondary schools), no significant effort has been made to improve planning at the regional level. These IT staff members have been deployed without a clear job description.
The regions are doing their own planning without an information system and do not feel concerned by the statistics collected by headquarters. They have their own “10-day grace period form” that informs them of the situation at the school level.
Information systems
There was general concern about the extent to which information management functionality (i.e. support for the information activities of “creation”, “retrieval” and “storage”) has been implemented in government information systems. The units that collect data also develop information systems with minimal input from the central IT unit and the DPSR. The result is that unstructured database systems that do not adhere to the norms and standards required by the IT Department are found throughout the Ministry of Education. Every unit we visited either had a system, was in the process of developing one, or was planning to develop one without any collaboration with key strategic partners or relevant stakeholders.
RECOMMENDATIONS
Based on the above situational analysis and the accompanying findings, recommendations are put forward for the following categories:
Institutional arrangements and coordination among concerned structures
- Implement the recommendations of the O&M (2006) in terms of the organisational restructuring of the Monitoring and Evaluation function and provide more autonomy to DPSR in controlling its own funds.
- Staff the EMIS Unit with at least one additional officer at Headquarters and place at least one person (focal point) with an EMIS function at the regional level (there are no EMIS staff members in the regions, while most of the other Ministries and Departments have a presence at the regional level).
- Reinforce and institutionalise the relations between CSO and MoE and clarify the roles of EMIS and CSO in terms of questionnaire design, questionnaire dissemination, data collection processes, data compilation, data analysis, reporting and publications.
- It is recommended that CSO strengthen the ongoing effort for the promotion of the National Strategy for the Development of Statistics (NSDS).
- Establish a Quality group that associates EMIS and CSO staff and that should meet regularly according to a predefined schedule.
- It is recommended that the Education Act be revised to include statistical responsibilities and that the mandate of the MoESD be clearly identified in terms of the scope and periodicity of the production of education statistics.
Data collection processes
- Develop, in collaboration with the Central Statistics Office (CSO), an EMIS Policy that will define data collection processes and data dissemination procedures. This policy could create a framework that allows for the coordinated and sustainable development of education information systems. It could also create a framework for establishing and maintaining effective and sustainable standards governing education statistics, data and information systems in Botswana.
- Improve forms with regard to information on respondents’ rights and responsibilities, and make sure manuals are available.
- Establish a survey registration system to ensure there is control over the different data collection requests that go to the schools. This should help to eliminate unnecessary duplication of effort and could ensure that there is only one entry point into the schools, reducing the response burden at school level.
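The survey registration idea above can be illustrated with a minimal sketch. All names here (`SurveyRequest`, `SurveyRegistry`, the data items) are hypothetical, not part of any existing Ministry system; the point is simply that a central register can compare each new request against those already approved for the same school year and flag overlapping items, so a unit can reuse existing data instead of re-surveying schools.

```python
from dataclasses import dataclass


@dataclass
class SurveyRequest:
    # Illustrative fields only: who is asking, for which school year,
    # and which data items the schools would have to report.
    requesting_unit: str
    school_year: str
    data_items: frozenset


class SurveyRegistry:
    """Central register of data collection requests sent to schools."""

    def __init__(self):
        self.approved = []

    def register(self, request):
        # Compare the new request against approved requests for the
        # same school year and collect any overlapping data items.
        overlaps = {}
        for existing in self.approved:
            if existing.school_year != request.school_year:
                continue
            shared = existing.data_items & request.data_items
            if shared:
                overlaps[existing.requesting_unit] = sorted(shared)
        self.approved.append(request)
        return overlaps  # an empty dict means no duplication detected


registry = SurveyRegistry()
registry.register(SurveyRequest("DPSR", "2011", frozenset({"enrolment", "teachers"})))
dup = registry.register(SurveyRequest("TSM", "2011", frozenset({"teachers", "salaries"})))
print(dup)  # {'DPSR': ['teachers']}
```

In this sketch the registry only flags duplication; in practice the flagged unit would be referred to the data already held by DPSR rather than being allowed to survey schools again.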
- All data collection processes should be harmonised through the control and management of a national unit (DPSR), taking into account the data needs at regional level.
- Implement a data collection and dissemination schedule with clearly defined deadlines for all the steps involved in the process. This schedule should be widely distributed within and beyond the education sector.
- Investigate the possibility of having only one data collection exercise for Headquarters during the year.
Information systems
- Develop terms of reference (ToR) for an integrated information system that provides the functionality required within the MoESD and at sub-national levels, with a medium-term perspective of decentralisation down to the institution level if required.
- Organise a consultative structure in the MoESD with the objective of finalising the strategy for the development or acquisition of a new information system.
- Develop norms and standards as requirements for the development of any system in the future in order to enhance and promote capacity building and harmonisation in the different departments of the MoESD.
Data quality issues
- Prioritise support for and management of the regular Statsbrief publication.
- Update and maintain the master list of all educational institutions in Botswana. The School Registration Unit in the DPSR performs such a function, but this is currently a manual process which should be automated and linked to the information system through a national unique school ID.
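A minimal sketch can show what "a national unique school ID" buys. The class and ID format below (`SchoolMasterList`, `BW-000001`) are purely illustrative assumptions, not an existing Ministry scheme: the essential property is that an ID, once assigned, never changes, so records from different systems and different years remain linked even when a school is renamed.

```python
class SchoolMasterList:
    """Illustrative master list keyed by a stable, national unique school ID."""

    def __init__(self):
        self._schools = {}
        self._next = 1

    def register(self, name, region):
        # Assign the next sequential ID; the BW- prefix is invented here.
        school_id = f"BW-{self._next:06d}"
        self._next += 1
        self._schools[school_id] = {"name": name, "region": region, "active": True}
        return school_id

    def rename(self, school_id, new_name):
        # Renaming never changes the ID, so historical data stays linked.
        self._schools[school_id]["name"] = new_name

    def lookup(self, school_id):
        return self._schools.get(school_id)


masterlist = SchoolMasterList()
sid = masterlist.register("Gaborone Primary", "South East")
masterlist.rename(sid, "Gaborone Primary School")
print(sid, masterlist.lookup(sid)["name"])  # BW-000001 Gaborone Primary School
```

With such a key in place, every questionnaire, examination record and staffing return can carry the same school ID, which is what makes automated linkage to the information system possible.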
- Conduct regular revision studies and methodology assessments through data quality meetings.
- Develop procedures and activities to ensure that annual publications meet the necessary quality standards (completeness, timeliness and reliability). To that effect, the treatment of missing data should be improved and generalised.
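One possible shape for a generalised missing data treatment is sketched below. The function and the two fallback rules (carry last year's figure forward, otherwise use a regional mean) are assumptions for illustration only, not the Ministry's procedure; what matters is that every imputed value is flagged, so published tables can footnote which figures were reported and which were estimated.

```python
def impute_enrolment(current, previous, regional_means):
    """Fill missing enrolment figures before publication (illustrative).

    current: {school_id: {"region": str, "enrolment": int | None}}
    previous: {school_id: last year's enrolment}
    regional_means: {region: mean enrolment}
    Returns {school_id: (value, source_flag)}.
    """
    completed = {}
    for school_id, record in current.items():
        value = record.get("enrolment")
        if value is not None:
            completed[school_id] = (value, "reported")
        elif previous.get(school_id) is not None:
            # Fallback 1: carry forward last year's reported figure.
            completed[school_id] = (previous[school_id], "carried_forward")
        else:
            # Fallback 2: use the regional mean, clearly flagged.
            completed[school_id] = (regional_means[record["region"]], "regional_mean")
    return completed


current = {
    "S1": {"region": "North", "enrolment": 420},
    "S2": {"region": "North", "enrolment": None},
    "S3": {"region": "South", "enrolment": None},
}
result = impute_enrolment(current, previous={"S2": 310}, regional_means={"North": 400, "South": 350})
print(result["S2"], result["S3"])  # (310, 'carried_forward') (350, 'regional_mean')
```

The source flags also support the revision studies recommended above: once late returns arrive, imputed values can be located and replaced, and the size of the revisions can be measured.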
- Assistance to users should be structured, and data requests should be kept on record to further improve data collection processes.
Capacity building
- Conduct a more detailed skills inventory and design a capacity-building strategy grounded in ministry activities and the hands-on application of acquired competencies.
- Conduct training and capacity-building based on the national needs and objectives. A practical exercise would be the production of a Country Status Report that would provide a clear picture of the education sector performance following a socio-economic approach.
- Adequate training should be conducted for head teachers (school registers, data collection, education indicators), inspectors (data quality) and regional staff (EMIS, planning).
Next step: preparation of Action Plan
- There is a need for the development of an Action Plan that will identify and prioritise a set of actions needed to address the weaknesses identified by the diagnostic study.
- It will also identify a coherent framework for capacity building in education statistics, and will identify the costs associated with implementation of the plan. It will therefore serve as the basis for discussion, and for mobilising additional resources as required to cover the in-country costs associated with implementing the plan.
- The plan should include:
- The development and implementation of an EMIS Policy and identification of an EMIS expert to lead such a process.
- The development of a “hands-on learning by doing” capacity-building strategy for education planning and EMIS.
- A sound assessment of potential technical solutions that will guide the choice of a proper information system.
- Prior to the development of the plan, and to ensure its successful implementation, we recall that the implementation of the O&M recommendations should be prioritised, in particular:
- The establishment of a dedicated planning unit (structure)
- The staffing of the central EMIS Unit
- The allocation of regional EMIS officers