The fact finding mission


The diagnosis itself is done during a mission to the country (two weeks on average). This page presents some guidance for conducting this type of mission.


What should be diagnosed?

A DQAF diagnostic addresses, in practice, all education data produced by a given country. It focuses mainly on the quality of data produced by routine administrative data systems, but it also verifies whether other types of data have been produced. These can include education data produced by the National Bureau of Statistics (via population censuses or household surveys), by the Ministry of Education or by another body (for data on learning outcomes, for instance).

In summary, the DQAF is an assessment of data sets produced and/or used by existing Education Management Information Systems (see EMIS) or Education Statistical Information Systems (SIS), to be used as public statistics and for education policy formulation and planning purposes. It can address one sub-sector or a group of sub-sectors.

Unless a specific objective is defined in advance of the mission, the approach should be sector-wide, taking into account education data on all sub-sectors of education. The methodology has recently been improved in that sense: in particular, the scoring is done separately for each identified producer of data sets (See: Main evolutions in the DQAF Methodology).

When a sector-wide approach is adopted, the data producers are identified by the expert team leading the mission together with the national respondents. The choice is guided by (1) the availability of data sets, (2) the identification of which entities are officially responsible for these data sets in terms of collection, production and dissemination, and (3) whether these entities are operational.

A responsible entity may be identified while no process is yet in place for the data set; in that case, the assessment of this data set is not carried out. For instance, in Namibia, Vocational Education and Training and Higher Education are two data sets under the responsibility of the National Training Authority and the National Council for Higher Education respectively. At the time of the assessment, these bodies were setting up information systems (VETMIS and HEMIS) that were not yet fully operational. It was decided to assess only the Ministry of Education MIS, in charge of the data sets for pre-primary to secondary education.

Preparing the mission

The mission needs to be prepared in advance according to the following points:

  • Identifying the experts
The choice of experts is key to the success of the operation. It guarantees the independence of the assessment and the professionalism of the recommendations provided.
The team must be composed of experts selected according to their competences in domains related to EMIS (public statistics, education, information systems, organisation, quality management).
This manual must be shared in advance with all the participating experts.
  • Identifying national respondents
The national respondents are the ones who will facilitate the investigation. They need to be informed of what is expected from them: mainly, to help identify the structures to be interviewed and to look for supporting documents and other types of evidence.
They also need to understand what the DQAF is about. Documentation on the DQAF methodology will be sent to them, and remote support will be provided via e-mail if needed. This Wiki can be used for this purpose, and sample reports can also be shared.
There should be one main national respondent who will help organise the mission, make the necessary appointments, and organise the review of the report on the country side.
Depending on the context of the intervention and the organisation of the national education system, one respondent will be identified in each of the bodies responsible for the different sub-sectors and in the umbrella agency responsible for the National Statistical System (CSO or NBS).
When the whole education sector is targeted, the main respondent will most of the time be from the EMIS structure in the Ministry in charge of Basic Education. In some cases (Mauritius, Kenya and Botswana, for instance), the main respondent is a staff member seconded from the CSO, which of course facilitates the organisation of the mission.
When the diagnosis gives priority to a specific sub-sector, the main respondent will be chosen within the Ministry responsible for it. It will however be necessary to identify in advance respondents in the other bodies in charge of delivering training in this sub-sector.
  • Identifying and collecting supporting documents
Supporting documents and other descriptive information can be gathered directly by the experts through national websites and with the support of the national respondents.
A list of these documents and pieces of information can be drawn from the Model of a country page.
  • Identifying main structures to be interviewed
The structures to be interviewed are:
  • The main users / consumers
The national data users are mainly the different departments of the Ministries of Education (which can sometimes also be producers), the development partners and other stakeholders when constituted as official entities.
  • The structures involved in the collection, production and dissemination of data
These are mainly training institutions (schools, colleges, universities, etc.), decentralised and central units under the Ministries in charge of Education, the National Bureau of Statistics and other units responsible for data production.
When possible, a minimum of two institutions per sub-sector and two structures at decentralised levels are to be investigated, mainly to corroborate whether the instructions given by the central level (in terms of data collection, verification and archiving) are well received and observed.
  • Communication with the National entities, UIS, UNESCO and other partners
The mission will be announced well in advance to the main official authorities and partners through an official letter, sent either by the UIS Regional Advisor or by the Director of the UNESCO Office covering the country concerned. An example of a Letter announcing DQAF fact finding mission is given here[1].
The letter is to be sent through the UNESCO National Commission.
  • Meetings schedule
A meeting schedule is prepared and shared in advance with all the people identified for interview. In practice, because of the availability of interviewees, this schedule is often rearranged while the mission is conducted. However, a briefing meeting and a debriefing meeting need to be held at the beginning and at the end of the mission respectively.
These two meetings are to be organised in the presence of the most senior representatives of the main entities investigated (the Minister or his/her direct collaborator).
  • The briefing meeting aims at:
  • Presenting the context of the mission and its overall objective and expected results
  • Sharing the main important points of the methodology
  • Validating the schedule and confirming the responsibilities of the national respondents
  • Seeking support from the authorities and officially launching the mission
  • The debriefing meeting is intended to:
  • Present the mission's main findings and recommendations
  • Agree on the next steps concerning the validation of the report
  • Discuss possible collaboration in the development and implementation of an action plan for the improvement of data quality
The composition of these two meetings is at the discretion of the national authorities: they can be restricted to Ministry officials or opened to partners. In the case of Namibia, for instance, the mission's main findings and recommendations were presented during the education sector review meeting.
  • Preparation of interviews
An interview guide is given here[2] as an example of the questions to be asked when conducting the interviews.
A list of instruments to be filled in (before, during or after the interview) is also given here[3]; they collect descriptive information on the situation of the structures involved in the production of data. It is recommended to send them in advance to the national respondents.
  • Preparing the matrix
Once the producers of data sets to be assessed are identified, the matrix must be contextualised by duplicating the scoring columns for each producer (See: ??), as sketched below.
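
As a rough illustration of this step, here is a minimal sketch in Python (pandas), assuming the matrix is handled as a table. The practice codes, column names and producer names are hypothetical; the real matrix is a spreadsheet in which the same duplication is done by copying columns.

    import pandas as pd

    # Illustrative skeleton of the matrix: one row per DQAF practice.
    matrix = pd.DataFrame({
        "Practice": ["0.1.1", "0.1.2", "0.2.1"],
        "Question to": ["EMIS unit", "EMIS unit", "NBS"],
        "Means of verification": ["", "", ""],
    })

    # Contextualisation: one block of scoring columns per identified producer.
    producers = ["MoE EMIS", "VETMIS", "HEMIS"]  # hypothetical producer names
    for producer in producers:
        matrix[f"Score ({producer})"] = pd.NA        # 1-4, or NA when not applicable
        matrix[f"Observations ({producer})"] = ""    # arguments justifying the score
        matrix[f"Recommendation ({producer})"] = ""  # how to improve the score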
  • Capacity building for the national respondents
This exercise builds on the materials sent in advance (see Identifying national respondents), which the national respondents should have read. Necessary clarifications can be made, as well as verification of the supporting documents gathered as evidence.
  • Using interview guides and data collection tools
The interview guide[2] is intended to structure the sequence of questions. Of course, it needs to be adapted to the function of the person interviewed (statistician, school head, department head, etc.).
In practice, the interview will be conducted by one of the experts, another will take notes and (depending on the composition of the team) a third will check whether all the information needed to answer all items (practices) of the matrix has been gathered.
The data collection tools[3] sent in advance should be reviewed with the national respondents and duplicated as required. When possible, it is useful to distribute them in advance so that interviewees can fill them in prior to the interview; they will then be discussed during the meetings.
  • Role of experts
Experts will seek evidence that will ultimately help in filling in the matrix and scoring (See: Scoring guidelines) each of the items. Although the task is not easy, the assessment is intended to be as impartial as possible: questions should not orient the answers, and all decisions on scoring levels should be made in the light of evidence and in relation to the Fundamental Principles of Official Statistics.
In practice, it is recommended that a debriefing meeting is held every day by the expert team to:
  • Put together the notes taken during the past interviews,
  • Verify whether evidence has been sufficiently gathered to score the matrix,
  • Identify items that have not been assessed,
  • Prepare the next meetings.
By the end of the mission, the expert team should have fully filled in the matrix, and the main findings and recommendations should be written down for presentation to the national authorities (See: Meetings schedule).

Scoring guidelines

The main purpose of conducting a DQAF exercise is to review the quality of data and to highlight the areas that should be improved to meet internationally accepted standards. Scoring the different elements (practices) of quality helps in formulating recommendations that data producing units can use in their endeavour to improve data quality. The scoring and the recommendations make it possible to identify priorities and to set up action plans to be implemented by Ministries of Education and other bodies in charge of education statistics.

Scoring will also highlight good practices that can be taken as examples by other countries.

Finally, although the scoring, like most assessments, remains subjective, it can be used to compare data quality among countries. This last point is nevertheless a very sensitive aspect of the DQAF, because each exercise is conducted in a different context. In particular, the DQAF may address one or several data producing units in charge of data sets in the overall education sector. In most countries the education sector is under the responsibility of several Ministries, which have often developed their own EMIS units. The decision to assess one or several data producing units greatly affects comparability across countries.

The scoring is done for each data producing unit included in the exercise and for each of the 140 practices of the framework, on a scale from 1 to 4, following the guidelines below:

  • Practice not observed (scored 1)
  • Practice largely not observed: Significant departures are observed, and the authorities will need to take significant action to achieve observance (scored 2)
  • Practice largely observed: Some departures are observed, but they are not seen as sufficient to raise doubts about the authorities' ability to observe the DQAF practices (scored 3)
  • Practice observed: Current practices generally meet the objectives of the DQAF's internationally accepted statistical practices without any significant deficiencies (scored 4)
  • Not applicable: Used only exceptionally when statistical practices do not apply because of specific circumstances (not scored – will not be included in the calculation of the average)
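
To make the last rule concrete, here is a minimal sketch (Python, with hypothetical scores) of an average computed so that practices marked not applicable are simply excluded:

    # Scores for one group of practices: 1 to 4, or None for "not applicable".
    scores = [4, 3, None, 2, 4]

    def average_score(scores):
        """Average of the 1-4 scores; not-applicable practices (None) are excluded."""
        scored = [s for s in scores if s is not None]
        return sum(scored) / len(scored) if scored else None

    print(average_score(scores))  # 3.25 -- the None entry is not counted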


During the fact-finding mission, the group of experts will discuss each of the 140 practices of the DQAF and will score it using the dedicated Matrix (See Annex? – Examples of Matrix).

This Matrix is organised following the different dimensions, sub-dimensions and practices of the UIS Education DQAF, with the following columns:

  • Question to: The unit / person(s) with whom the practice is to be discussed or has been discussed
  • Means of verification: Any document, database, URL, etc. that can be used as a reference to verify the practice and its scoring
  • Score: The score that the group of experts attributes to the practice. There may be one score for each of the data producing units addressed in the study (See: choice of data producing unit)
  • Average: This field is automatically calculated from the scores, to obtain the average score of the sub-dimensions and dimensions for each data producing unit
  • Discussions / Observations: The arguments that justify the scoring, as discussed by the group of experts
  • Recommendation: Possible recommendation(s) that, if applied, would help improve the scoring


These different elements document the justification of the scoring and support the formulation of recommendations. Discussions with the national team can be organised around these points in order to carry out a more in-depth analysis.

The Matrix is organised so that averages are calculated by groups of practices and sub-dimensions and so that the final graph presenting the overall scoring is automatically generated:

[Figure Diag1.jpg: final graph presenting the overall scoring, generated automatically by the Matrix]
See: Main evolutions in the DQAF Methodology
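
As a rough illustration of this roll-up (the real matrix does it with spreadsheet formulas, and the exact aggregation details may differ), the sketch below computes sub-dimension and dimension averages from hypothetical practice scores; the per-dimension averages are what the overall scoring graph presents:

    import pandas as pd

    # Hypothetical scored matrix: one row per practice, one score column per producer.
    df = pd.DataFrame({
        "Dimension":     ["0. Prerequisites", "0. Prerequisites", "1. Integrity"],
        "Sub-dimension": ["0.1", "0.2", "1.1"],
        "Score (MoE EMIS)": [4, 3, 2],
    })

    # Averages by groups of practices: first by sub-dimension, then by dimension.
    by_sub = df.groupby(["Dimension", "Sub-dimension"])["Score (MoE EMIS)"].mean()
    by_dim = df.groupby("Dimension")["Score (MoE EMIS)"].mean()
    print(by_sub)
    print(by_dim)  # these per-dimension averages feed the overall scoring graph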

The matrix, thus filled, represents the raw material for writing the report.

References

  1. See Letter announcing DQAF fact finding mission
  2. See Interview guide
  3. See List of instruments
