DQAF mission Namibia 2011

Latest revision as of 13:28, 15 October 2012

Preliminary findings and recommendations presented to the Education Review meeting
(ETSIP Annual Review Meeting, Heja Lodge, Windhoek, 19–21 October 2011)



Preliminary findings

Data quality and Data Use: a vicious cycle

When data users (e.g., policy makers, analysts) lose faith in data, they often withdraw support for maintaining and strengthening the EMIS. In turn, this jeopardizes the EMIS's ability to produce timely and reliable data.

In Namibia, the mission found data of quite good quality, yet low demand for them:

  • Data quality
    • UIS's main area of expertise
    • The DQAF's main purpose
  • Use of data for decision making
    • Other partners' expertise
    • Strong expectations from stakeholders
    • “Impressions” – additional investigation is needed (institutional audit, job descriptions, …)

Good practices

  • Recognition of EMIS as the unique source of data
    • Central level:
      • Pre-filling of the Data Collection Instrument
      • Maintenance of the master list of schools
      • Systematic control of erroneous and missing data
      • Systematic tracking of follow-up with schools
      • A list of schools with high enrolment gains/losses is presented in management meetings
    • Sub-national level:
      • Standard registers and guidelines for data quality control
      • Self-explanatory Data Collection Instrument
      • Proximity support conducted by circuit teams
      • Training provided by EMIS
  • High level of professionalism / good-quality data produced by DNEA on learner performance
    • Examinations
    • National standard tests
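The enrolment gain/loss lists mentioned above can be produced with a simple year-on-year comparison. The sketch below is purely illustrative (not the ministry's actual procedure): the function name, threshold, and data layout are assumptions chosen for the example.

```python
# Illustrative sketch of an EMIS-style check: flag schools whose enrolment
# changed sharply between two collection rounds, for follow-up and for
# presentation in management meetings. Threshold and layout are assumed.

def flag_enrolment_changes(previous, current, threshold=0.20):
    """Return (school, relative change) pairs for schools whose enrolment
    changed by more than `threshold` between two rounds. Schools missing
    from the current round are flagged with None for follow-up."""
    flagged = []
    for school, prev_count in previous.items():
        if school not in current:
            flagged.append((school, None))  # no return this round: follow up
            continue
        if prev_count == 0:
            continue  # avoid division by zero; new schools handled separately
        change = (current[school] - prev_count) / prev_count
        if abs(change) > threshold:
            flagged.append((school, round(change, 2)))
    return flagged

# Example: one school gained 50%, one lost 30%, one is stable, one is missing.
prev = {"A": 200, "B": 100, "C": 300, "D": 150}
curr = {"A": 300, "B": 70, "C": 310}
print(flag_enrolment_changes(prev, curr))
```

In practice such a check would run against the master list of schools after the systematic control of erroneous and missing data, so that missing returns and outliers are caught in the same pass.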

Room for improvement

  • Scope:
    • Gaps for some sub-sectors (VET, HE, ECD): progress is noticeable in the development of their information systems, but collaboration with EMIS and the CBS remains minimal, and data quality issues seem to be rarely discussed
    • Use of secondary data sources from surveys and studies is not promoted (understanding drop-out, population mobility, research on OVC, …). DHS 2003/4 and 2006/7 and inter-census data are not used for comparison.
    • Availability of data on expenditure: budget data are not published, although they are submitted to the UIS
    • Efforts to answer information needs (a registry of needs) do not seem to be given priority.
  • Serviceability:
    • A user-friendly tool to query raw data is missing
    • Population data are unreliable (not updated) and not available at a sufficient level of detail (regional disparities are difficult to assess)
    • Statistical publications need to be improved: the current EMIS Booklet should be made more user-friendly, and an additional publication with analytical content is required
  • Process:
    • The relevance of regrouping the different data collection exercises should be discussed (to improve timeliness and reduce burden and duplication)
    • Duplication of data collection between EMIS and HR/payroll creates confusion between data sources (triangulation is required)
    • Decentralization of the information system is required and expected by the regions
    • Motivation at school level is reduced because reporting has little visible impact on improving working conditions
    • Communication between EMIS and the IT department is minimal
  • Interest in data: insufficient engagement and interest in EMIS from data users
    • Minimal feedback from users on the draft EMIS Booklet produced after data cleaning
    • Few data requests from users
  • Data use
    • Central level:
      • The M&E function should be given more importance (its hierarchical positioning is not adequate); an institutional audit with definition of job profiles is required
      • The need to improve capacities in using evidence for policy making is recognized
      • Staffing is not adequate
      • A global education sector performance analysis / CSR is needed, based on existing data and additional studies/surveys; it could also serve as a “learning by doing” capacity-building exercise
      • It was suggested to involve external expertise to provide an independent analysis
    • Regional level:
      • Tools for data management are expected
      • Capacity development for planning and budgeting is required
    • Political level:
      • A culture of numbers needs to be developed
      • Advocacy for evidence-based decision making is needed (simple figures on physical facilities, costing of classroom rehabilitation and textbooks, etc.)

The way forward

  • Participative review of the report: senior management and key players (data users, EMIS, …) should give their input and reach a consensus
  • Official endorsement and commitment of senior management
  • Definition of an action plan fully owned by the MoE
  • Advocacy for partner support
  • Identification of technical partners
  • Implementation / evaluation