Main evolutions in the DQAF Methodology


The UNESCO Institute for Statistics (UIS) is mandated to develop and implement evaluation methodologies that assess the quality of data produced by national statistical systems within the UNESCO domain of competence. The Institute has been active in using such standards and tools for assessing the education data quality produced by Member States. The UIS Data Quality Assessment Framework (DQAF) is an instrument whose current form is based on one initially developed by the International Monetary Fund in 2002 to assess the quality of economic data. In 2004, the World Bank and the UIS modified it for use in the evaluation of education data.

For several years, UIS has engaged in diagnoses of national education statistics systems using the DQAF methodology. Such ‘DQAFs’ were implemented in Latin America and in Sub-Saharan Africa (SSA) between 2005 and 2006.

Between 2009 and 2010, within the UNESCO BREDA support to the African Union Second Decade of Education and the SADC education programme, a second round of eleven diagnoses was conducted in SSA, with some modifications to the DQAF methodology itself and to its implementation.

When conducting these exercises, several inconsistencies in the structure of the instrument and in the relevance of some items were identified, along with the need for standard scoring guidelines and other methodological issues. In addition, questions about the diagnosis process and the validation procedures were raised.


This page provides a general overview of the Data Quality Assessment Framework (DQAF) evolution process between 2009 and mid-2012 by:

  • Briefly overviewing the evolution of DQAF tools
  • Understanding why the DQAF tool has evolved
  • Reviewing how the DQAF tool has evolved, the main proposed changes and how changes have been implemented
  • Discussing countries' contributions and their role in the evolution of the DQAF

so as to discuss and agree on possible next steps from mid-2012 onward.

Overview

| FORMER MATRIX     |                            | REVISED MATRIX |          |
| 2008              | 2009                       | 2010           | 2011     |
| Madagascar        | Cape Verde                 | Congo          | Botswana |
| Lesotho           | Kenya                      | CAR            |          |
| Mozambique        | Tanzania Mainland (update) | Chad           |          |
| South Africa      | Zimbabwe                   | Malawi         |          |
| Swaziland         | Mauritius                  |                |          |
| Tanzania Mainland | Namibia                    |                |          |
| Zambia            |                            |                |          |

The Education DQAF has evolved following the completion of seven DQAF assessments piloted in SADC countries and the synthesis discussed at the data quality assessment meeting held in Maputo (2009), which highlighted the need to revise the DQAF tool.

A synthesis of the seven pilot DQAFs was published in March 2010.

Stakeholders were convened in Cape Town in December 2010:

  • Member States: Representatives from Ministries of Education (Cameroon, Cape Verde, Kenya, Botswana, South Africa)
  • Representatives from international and regional bodies: PARIS21, Pôle de Dakar, World Bank, Statistics South Africa, Afristat, AfDB, IMF and UIS.
  • Recommendations on improving the DQAF tool were put forth.
  • Next steps were evaluated by stakeholders.

Progress since 2010

  • An external consultant was engaged by UIS in December 2010 to collaborate on improving the DQAF tool.
  • The DQAF assessment was expanded to the remaining SADC countries, excluding Seychelles. For the DRC and Angola, the assessment was postponed because of ongoing EMIS project implementation.
  • The six generic DQAF dimensions are still assessed following the DQAF revision.
  • The structure of the matrix was improved, and additional DQAF reports were produced with the revised tool.
  • Some original DQAF reports were revised where sub-sector analysis was undertaken in 2010.
  • Development and implementation of action plans: progress was made in some countries, but the development of an action plan is still pending in others.
  • Validation of reports was completed by some countries during 2010 and 2011.

However, the implementation of certain recommendations to improve the DQAF tool has still to be determined.

General suggestions to improve the DQAF tool

MAIN SUGGESTIONS
1. Improve the structure and organisation of the DQAF matrix and report
  • Develop a more user-friendly instrument (reduced matrix)
  • Improve understanding of terms, definitions and concepts across dimensions
  • Limit redundancies across different dimensions
  • Focus on "official" statistics
2. Clarify the nature of legal frameworks (related to education statistics)
  • Consider other acts or legislation, not limited to Statistics Acts
3. Improve the relevance of the matrix (assess context in country)
  • Assess the relevance of statistical practices in countries and exclude irrelevant practices from the matrix
4. Expand the scope of analysis of multiple data sets/providers in the country
  • Assess sub-sector education statistics data producers (not EMIS alone)
  • Adjust DQAF matrix according to country context to assess data producers
5. Review overall scoring methodology
  • Score at the dimension level, with emphasis on the outcomes of the dimension rather than the score
  • Score the sub-sectors of education
  • Exclude scoring at the aggregate level
6. Engage regional and national experts
  • Engage external editors on reports
  • Develop a capacity development programme with regional expertise
  • Develop training materials - manuals and guides on DQAF, norms and standards

Improvements to the structure of the reports since 2010


A synthesis of the seven pilot country assessments was completed and a report was published in March 2010 as Assessing Education Data Quality in the Southern African Development Community (SADC) [1].

The suggestions from the 2010 Cape Town meeting have been adopted in reports since 2010.

  • Enhancing and highlighting positive aspects of data quality - best practices in the countries.
  • Summarising recommendations and areas for improvement at the onset of all reports – clearer narrative.
  • Analysing sub-sector data sets/data producers (where feasible), including EMIS, TVET and higher education; for example, the scope of analysis of the Tanzania DQAF was expanded from the original version.
  • Reports are more “user-friendly”, with the context of the system elaborated for each sector, e.g. presenting data on the education system in the country.
  • Adopting a “consensus approach” to scoring to ease report writing.
  • Working closely with regional and other expert partners to draft and edit reports, e.g. the Botswana, Malawi and Mauritius reports were drafted in conjunction with consultants (e.g. Stellenbosch University, PARIS21) in 2011.
  • Promoting south–south cooperation, e.g. Stellenbosch University and UNESCO Harare focal points were involved in the DQAF (Botswana).
  • Validation process for the DQAF reports: feedback from countries is intended to lead to action plans, e.g. Malawi and Mauritius validated their reports in 2012.


There remains a need to engage external editors to work on the DQAF reports, as suggested in 2010, and the modalities for publishing country reports (if necessary, as suggested in 2010) are still to be determined.

Improvements to the content of the matrix

  • Less emphasis on assessing multiple elements related to the production process, e.g. no longer assessing the availability of certain information (on education expenditure, teacher and school characteristics) that may not be readily available in several countries.
  • Less emphasis on assessing statistical practices that may not be applicable or feasible in several countries, e.g. “revision policies”.
  • Reducing or eliminating redundancies within the DQAF matrix, e.g. moving sub-dimensions from one dimension to another or excluding them from the matrix.
  • Adopting clearer, more transparent (user-friendly) definitions and refining the concepts of DQAF dimensions and sub-dimensions to better convey the objective of each dimension.
Revised content
SUB-DIMENSIONS REVISED MATRIX (July 2011)
0- Pre-Requisites of Quality
0.1 Legal and institutional environment
  • Complementary legislation (not limited to Statistics Acts in the country)
0.2 Resources
  • Skills, experience and qualifications of staff;
  • Physical facilities to perform tasks
0.3 Quality Awareness
  • "Monitoring Processes" to assist managers in quality
  • Tradeoffs among dimensions of quality
1- Integrity
1.1 Professionalism
  • No major changes
1.2 Transparency
  • No major changes
1.3 Ethical Standards
  • Guidelines - Ethics and staff
  • Agencies' management as role models
2- Methodological Soundness
2.1 Concepts and definitions
  • Documentation on national concepts and definitions
  • Deviations of concepts from national definitions are checked
2.2 Scope
  • Assessing data overlaps - redundancies
2.3 Classification
  • National classification of programmes and its application
  • UIS ISCED mappings
  • Reporting data according to ISCED classifications
2.4 Basis for recording
  • Database analysis undertaken in Dimension 3
3- Accuracy and Reliability
3.1 Source data
  • Less focus on data sets not usually collected in countries in the region
3.2 Assessment and validation of source data
  • Focus on measures to improve the accuracy of data, e.g. field visits
  • Use of registers to monitor school response rates
3.3 Statistical techniques
  • Computation of education statistics indicators in accordance with Dimension 2
3.4 Assessment of Intermediate Results
  • No major changes
3.5 Revision studies
  • No major changes
3.6 Archiving of source data and statistical results
4- Serviceability
4.1 Relevance
  • Sub-dimension 4.1 excluded from the matrix
4.2 Timeliness and periodicity
  • Learning achievement surveys related to country monitoring needs
  • Publication of Finance statistics related to financial year
4.3 Consistency
  • Emphasis on "final statistics" and consistency with secondary data
4.4 Revision policy
  • Sub-dimension 4.4 excluded from the matrix
5- Accessibility
5.1 Data Accessibility
  • Public awareness around data dissemination products
  • Use of electronic databases validated by data producing agencies
  • Simultaneous release of data related to pre-announced schedule
5.2 Metadata accessibility
  • Uses of metadata and effects on data quality
5.3 Assistance to users
  • Schedule for data requests known to EMIS users

Sub-sector scoring

  • A sector-wide scoring process has been adopted to assess the relevant institutions responsible for the production of statistics (where feasible) for all sub-dimensions, e.g. Mauritius, 2011.
  • It was suggested in 2010 that scoring should be limited to situations where a minimum level of information is available and assessed, e.g. TVET.
  • In most countries in the region, it is more practical to combine the scoring of pre-primary, primary and secondary education as General Education.
  • Private and public institutions are not scored separately in the matrix.

Simplifying scoring

Seven Pilot Countries

  • Focus on highlighting the deviation of each sub-dimension from the international norms.
  • Emphasis on an overall score based on scoring of each dimension.
e.g. Dimension 2: Methodological Soundness – country example Tanzania

Reports completed in 2011

  • Focus on highlighting the deviation of each dimension by education sector from the international norms.
  • Generic assessment of each dimension (based on the assessment of sub-dimensions), excluding an overall score; see the sketch after the example table below.
Example (dimension-level scores by education sub-sector):

| Dimension                 | General education | TVET | HE   |
| Pre-requisites of quality | 1.99              | 1.93 | 1.94 |
| Integrity                 | 2.44              | 2.44 | 2.36 |
| Methodological soundness  | 3.31              | 2.96 | 3.04 |
| Accuracy and reliability  | 2.58              | 2.46 | 2.57 |
| Serviceability            | 2.88              | 2.71 | 2.71 |
| Accessibility             | 1.96              | 1.41 | 1.73 |
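
The table above is simply a set of per-dimension figures for each sub-sector, with no aggregate across dimensions. As a minimal sketch of that arithmetic, the Python below averages sub-dimension scores within each dimension for one sub-sector. The sub-dimension codes, the score values and the use of a plain mean are illustrative assumptions, not the documented UIS scoring procedure.

 # Minimal sketch: dimension-level scores for one education sub-sector.
 # Assumption (hypothetical, not the documented UIS method): each
 # sub-dimension has a numeric score, and a dimension score is the
 # plain mean of its sub-dimension scores.
 from statistics import mean
 
 # Hypothetical sub-dimension scores for one sub-sector (e.g. TVET),
 # keyed by the "dimension.sub-dimension" codes of the revised matrix.
 tvet_scores = {
     "0.1": 2.0, "0.2": 1.5, "0.3": 2.3,   # 0: Pre-requisites of quality
     "1.1": 2.5, "1.2": 2.4, "1.3": 2.4,   # 1: Integrity
     "5.1": 1.5, "5.2": 1.2, "5.3": 1.5,   # 5: Accessibility
 }
 
 def dimension_scores(sub_scores):
     """Average sub-dimension scores within each dimension; deliberately
     no further aggregation into a single overall score."""
     by_dimension = {}
     for code, score in sub_scores.items():
         dim = code.split(".")[0]
         by_dimension.setdefault(dim, []).append(score)
     return {d: round(mean(v), 2) for d, v in by_dimension.items()}
 
 print(dimension_scores(tvet_scores))   # {'0': 1.93, '1': 2.43, '5': 1.4}

Stopping the computation at the dimension level mirrors the post-2010 decision to exclude scoring at the aggregate level.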


References

  1. See A Synthesis of Seven Country Assessments