School Education Quality Index: A Few Observations by Prof Arun C Mehta

Background

Based exclusively on U-DISE data, the National Institute of Educational Planning and Administration (NIEPA) began computing an Educational Development Index (EDI) from a set of 24 parameters in 2005-06, a practice that continued up to 2014-15. The EDI was computed annually for the Primary and Upper Primary levels separately, along with a composite index for the entire Elementary level of education. The 24 indicators used in computing the EDI were grouped into four sub-groups, namely Access, Infrastructure, Teachers and Outcome indicators.

Principal Component Analysis (PCA) was applied to derive the factor loadings and weights. For a few variables, policy-based best values were used instead of the observed values.
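
As a rough illustration of how PCA-derived weights work, the short Python sketch below builds a composite score from the loadings of the first principal component of a set of indicators. The indicator values, the use of a single component and the normalization of the loadings are hypothetical assumptions for illustration only; this is not the exact EDI methodology used by NIEPA.

```python
# Minimal, illustrative sketch of PCA-based indicator weights
# (hypothetical data; not the exact EDI methodology).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = states/districts, columns = indicators (e.g. access, infrastructure,
# teachers, outcomes). Values here are made up for illustration.
indicators = np.array([
    [92.0, 85.0, 28.0, 74.0],
    [88.0, 79.0, 35.0, 69.0],
    [95.0, 90.0, 24.0, 81.0],
    [80.0, 70.0, 42.0, 60.0],
    [97.0, 93.0, 22.0, 85.0],
])

# Standardize so that indicators on different scales contribute comparably.
z = StandardScaler().fit_transform(indicators)

# Fit PCA and take the loadings of the first principal component.
pca = PCA(n_components=1)
pca.fit(z)
loadings = pca.components_[0]

# Convert absolute loadings into weights that sum to 1.
weights = np.abs(loadings) / np.abs(loadings).sum()

# Composite index for each unit: weighted sum of standardized indicator values.
edi_scores = z @ weights
print("weights:", np.round(weights, 3))
print("scores:", np.round(edi_scores, 3))
```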

School Education Quality Index: NITI Aayog, 2019


The EDI in its new avatar, the School Education Quality Index (SEQI), has been computed with 2015-16 as the base year. The latest document, SEQI: The Success of Our Schools, released by NITI Aayog on 30th September 2019, is based on 2016-17 data (the reference year), most of which was collected as on 30th September 2016. SEQI is an improved version of the EDI: it is more comprehensive, draws on a larger set of indicators and, unlike the EDI, is not confined to U-DISE data, although U-DISE remains its main source.

In addition to U-DISE data, SEQI makes extensive use of learning-outcome data from the National Achievement Survey (NAS) conducted by the NCERT on 13th November 2017, apart from a few other data sets provided by the States & UTs.

While the set of indicators and sectors used in SEQI is comprehensive, a few crucial indicators have not been considered, such as the retention rate, the ratio of primary to upper primary and of upper primary to secondary schools/sections, the percentage of schools with female teachers, and the average annual drop-out rate at the primary level of education. Their omission has significant implications for the country's goal of universal school education.

It is also important to note that enrolment in school education in India declined by about 9 million between 2015-16 and 2016-17, of which 6.8 million (5.32 million at the Primary and 1.51 million at the Upper Primary level) was at the elementary level alone, i.e. Classes I to VIII. This decline has serious implications for achieving universal elementary education, yet declining enrolment has not been considered as one of the indicators in computing SEQI.

It was perhaps for the first time that enrolment at the Upper Primary level (Classes VI to VIII) declined in 2016-17 from its 2015-16 level. Individually too, enrolment in Classes I, V, VI and VII as well as in Classes X, XI and XII declined in 2016-17, which has serious implications for enrolment growth at higher levels of education in the years that follow. At the very least, the Net Apparent Entry Rate, which is considered crucial for achieving universal enrolment, should have been used. Needless to mention, even Class I enrolment declined to 25.29 million in 2016-17 from 27.17 million in 2015-16.

As many as 30 indicators have been used in computing the 2016-17 SEQI, classified under two categories, namely Outcomes and Governance Processes Aiding Outcomes. The Outcomes category is further divided into four domains, namely Learning, Access, Infrastructure and Equity outcomes, and covers 16 indicators. The remaining 14 indicators, under Governance Processes Aiding Outcomes, include student and teacher attendance, teacher availability, training, accountability and transparency, none of which is part of regular administrative data collection; these are provided by the states, are not available in the public domain, and the validity of such data sets is not easy to examine.

Limited information is provided on how the data reported by the States & UTs was validated. As many as 10 indicators are drawn from NAS, compared with 9 from U-DISE. The remaining indicators are either obtained from the GoI portal ShaGun or reported by the States & UTs. Depending on the nature of an indicator, some cover all schools, including Private Aided and Unaided managements, while others cover only Government and Government-aided schools. A few of the indicators used in computing the 2016-17 SEQI are worth describing.
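
To make the aggregation of such indicators concrete, the Python sketch below shows one common way of combining normalized indicator scores with assigned weights into an overall index. The indicator names, benchmark values and weights are hypothetical and are not the actual SEQI indicators or weights.

```python
# Illustrative aggregation of a composite index from weighted, normalized indicators.
# Indicator names, benchmarks and weights are hypothetical, not the actual SEQI values.

def min_max_normalize(value, worst, best):
    """Rescale an indicator to a 0-100 score, where 100 is the best benchmark value."""
    return 100.0 * (value - worst) / (best - worst)

# indicator -> (raw value, worst benchmark, best benchmark, weight)
indicators = {
    "learning_outcome_nas_math":      (62.0, 0.0, 100.0, 20),
    "access_adjusted_ner_elementary": (91.0, 0.0, 100.0, 15),
    "infrastructure_cwsn_toilets":    (78.0, 0.0, 100.0, 10),
    "equity_gender_parity":           (0.96, 0.0, 1.0, 10),
    "governance_teacher_attendance":  (84.0, 0.0, 100.0, 15),
}

weighted_sum = 0.0
total_weight = 0.0
for name, (value, worst, best, weight) in indicators.items():
    score = min_max_normalize(value, worst, best)
    weighted_sum += weight * score
    total_weight += weight

overall_score = weighted_sum / total_weight  # overall index on a 0-100 scale
print(f"Overall score: {overall_score:.1f}")
```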

Over the years it has been observed that data on the percentage of out-of-school children identified and mainstreamed has always been incomplete, which is why it was never reported in U-DISE publications; under SEQI it is now used, but it is reported by the States & UTs and is not easy to validate. This indicator might have been avoided, as it is already captured indirectly in the Adjusted-NER at the Elementary and Secondary levels of education used in computing SEQI.

Another important indicator used in SEQI is the percentage of children whose unique ID is seeded in the SDMIS. The document mentions that “States and UTs are encouraged to track their students through the SDMIS as a way to inform UDISE. UDISE is meant to serve as a longitudinal database for tracking the schooling status of students to provide a foundation for evidence-based policy responses”.

It is heartening to see the SEQI document mention “that all States and UTs have successfully migrated from their existing Management Information Systems (MIS) to the SDMIS”. It is unfortunate, however, that the SDMIS, launched in sync with U-DISE during the 2016-17 data collection, was discontinued in the following year for unknown reasons. Through it, detailed individual records on 35 parameters had been recorded for 210 million students, the majority of whom also had Unique IDs. Had it continued, it would eventually have helped improve the enrolment statistics generated through U-DISE and would have led India towards a child-tracking system; in view of its discontinuation, this indicator would have to be dropped if the next SEQI is computed.

Another indicator that was planned but dropped from the final calculation is the GER of CWSN children (age group 6 to 18 years), because of the unavailability of published data; by its very definition it should not have been included in the initial list of indicators in the first place. Where would one get the CWSN population of age group 6 to 18 years for 2016-17, when even a reliable child population in the school age groups is not available?

The average daily attendance of teachers recorded in the electronic attendance system is another indicator that has been used, instead of indicators that focus more on what teachers actually do in school. Similarly, instead of the 10 RTE facility indicators, only the percentage of schools meeting teacher norms as per the RTE Act has been used.

Instead of using the percentage of teachers provided with the sanctioned number of days of training/in-service training, emphasis should have been given to indicators that capture whether the training provided meets teachers' requirements and is need-based. It is common practice across the country that DIETs receive the themes of capacity-building programmes (along with the number of programmes to be conducted, programme days, number of participants in each programme, etc.) identified by the SCERT, which are generally common to all districts across the state.

It is interesting to observe that many states have reported the percentage of schools that have prepared school development plans as a hundred, which is contrary to the situation at the grassroots level across the country. Instead, the percentage of blocks and districts that used school development plans in formulating district elementary/secondary education plans, as envisaged in SSA, should have been used. Another interesting indicator used in computing SEQI is the average number of days taken by a State/UT to release the Central/State share to State Societies, but the index is silent on the number of months by which Central agencies delay releasing funds to the states. Moreover, in the case of 9 UTs, there is no provision to release a state share.

A total of 20 weightage points has been assigned to states that recruited new teachers through an online system, but SEQI is totally silent on the percentage of para/contractual teachers to total teachers, which has grown many-fold in the recent past, as is evident from the percentage of contractual teachers disseminated through U-DISE. In fact, many states have stopped recruiting regular teachers and instead recruit only para-teachers.

It has also been observed that indicators showing little variation across States & UTs, such as the percentage of schools with girls' toilets, should not have been used in computing SEQI, as all states, whether small, medium or large, have reported this percentage to be 100. Seeding of UIDs in the SDMIS in 2016-17 is another such indicator that showed no variation, as did a few others.

Another important observation is that SEQI is computed for school education as a single entity, whereas in 2016-17 SSA and RMSA were two separate programmes (Samagra came into the picture only in 2018-19). In view of this, there should have been two separate indices: one for the elementary level (and also for the primary and upper primary levels separately) and another for the secondary and higher secondary levels of education.

The School Education Quality Index is based on U-DISE 2016-17 data collected as on 30th September 2016, which is now almost three years old; the data used in computing SEQI may therefore be termed outdated. The process of data entry of SEQI indicators and submission by the States & UTs began in April 2018 and ended in December 2018, during which period the unpublished 2017-18 U-DISE data was also available with the States & UTs, yet this most recent data was not used in computing SEQI.

It is hoped that the next SEQI will be based on the latest data, i.e. 2019-20 (with 30th September 2019 as the date of reference), being collected through U-DISE+, which is supposed to provide real-time data but is still being collected. It is also hoped that the next SEQI data will be obtained directly from the U-DISE+ portal, so that states are not required to upload data on ShaGun or another portal, as was the case with the 2016-17 SEQI.

It may be observed that the Educational Development Index computed by MHRD and NIEPA during 2005-06 to 2014-15 was more scientific, as the weight of each indicator was assigned through Principal Component Analysis and no human element was involved in assigning the weights. In SEQI, by contrast, weights have been assigned manually in consultation with MHRD, sector experts and even stakeholders, namely the States and UTs; a different set of experts might assign different weights, which could dramatically change the SEQI index.

The SEQI document mentions that weights could not be assigned statistically because of the lack of time-series information, but the EDI, which was likewise based on cross-sectional data, assigned weights in a manner more scientific than the procedure adopted in computing SEQI. One of the important indicators used is the Adjusted-NER, which describes the participation of children of an age group at the corresponding level of education and is based on enrolment and the age-specific population.

Though enrolment is available from U-DISE, the same is not true for the corresponding child population; projected population has therefore been used, but projections based on the 2011 Census are not available, so all enrolment-based indicators such as the Adjusted-NER may be treated as provisional and may change once more recent projected child populations become available. It is hoped that NITI Aayog will obtain state- and district-specific age-specific child population figures immediately after the 2021 Census is released.
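
For clarity, the sketch below (Python, with hypothetical figures) illustrates how an Adjusted-NER of the kind described above is computed from age-appropriate enrolment and a projected population, and why the indicator remains provisional until the population projection in the denominator is firmed up.

```python
# Illustrative computation of Adjusted-NER (hypothetical figures, in millions).
# Adjusted-NER = enrolment of children of the official age group at the corresponding
# level or a higher level, divided by the projected population of that age group.

enrolled_in_level_official_age = 118.0        # e.g. children aged 6-13 enrolled in Classes I-VIII
enrolled_in_higher_level_official_age = 4.5   # children aged 6-13 already in Class IX or above
projected_population_6_13 = 131.0             # projected child population of the age group

adjusted_ner = 100.0 * (
    enrolled_in_level_official_age + enrolled_in_higher_level_official_age
) / projected_population_6_13
print(f"Adjusted-NER (elementary): {adjusted_ner:.1f}%")

# If the projected population is revised (e.g. after a new Census), the indicator changes
# even though enrolment from U-DISE is unchanged -- hence its provisional nature.
revised_population_6_13 = 127.0
revised_adjusted_ner = 100.0 * (
    enrolled_in_level_official_age + enrolled_in_higher_level_official_age
) / revised_population_6_13
print(f"Adjusted-NER with revised population: {revised_adjusted_ner:.1f}%")
```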

It has rightly been said that SEQI has been developed to provide insights and data-based feedback on the performance of school education in India, which will help India achieve the SDG by 2030; but that goal cannot be achieved unless SEQI is also computed district-wise and, within districts, block-wise. A state may have a high SEQI even though all its districts are not at par: a few may take more years to achieve the goals of school education, while others may be in a position to achieve them in the near future.

From the document, it is not clear whether there is any plan to bring out district-wise and, within districts, block-specific SEQIs. It may be recalled that many states attempted computing the EDI at the state level and identified districts and blocks needing more attention while formulating district education plans. However, because of the lack of expertise at the state and district levels, this could not be attempted across the country or sustained.

District-level Planning & MIS officials must be oriented so that SEQI is computed block-wise at the district level by district officers. They also need to be trained to feed the outcomes of district-specific SEQIs into district plans. Computing SEQI itself may not be an issue, since every piece of information used in computing it should be available on an online interactive portal at all disaggregated levels: school, cluster, block, district, state and national. In fact, this should have been taken up along with the computation of the state-specific SEQI.

The index attempts to provide a platform for promoting evidence-based policymaking and highlights possible course corrections in the education sector. The document mentions that SEQI will be used in the formulation of education policy, but from the SEQI document one cannot tell whether it was shared with the Kasturirangan Committee, whose report is now available in the public domain as the Draft National Education Policy 2019.

Needless to mention, SEQI is largely based on published data provided by national institutions such as NCERT (NAS) and NIEPA (U-DISE), yet they did not even receive an acknowledgement or a mention in the SEQI document. From the document, one gets the impression that these institutions played no role and were not engaged in the process of computing SEQI, except at the initial stage of identifying indicators.

A close look at the roles and responsibilities reveals that the major role was played by the development partner and private parties, while the apex national institutions, which have in-house expertise and were engaged in similar exercises in the form of computing the EDI (published by MHRD through Elementary Education in India: Flash Statistics), were not engaged. Data provided by NCERT and NIEPA was validated by a private agency, and the World Bank was the lead agency.

However, limited information is provided on how the data was validated and what the process of validation was. One fails to understand the importance given to these parties rather than the government's own institutions, especially when the expertise to undertake such exercises is available in-house. It would have been better if the apex MHRD institutions had played a leading role, but for unknown reasons they were not involved in the exercise. It is hoped that the apex institutions will be given a bigger role and that SEQI will be institutionalized in the years that follow.

Even the interactive SEQI portal was developed by a private developer. While launching U-DISE+, MHRD in its booklet (April 2019) raised serious concerns about the quality and validity of U-DISE data, mentioning that “there was a big question mark on the quality and reliability of the data, especially on enrolment and infrastructure”; if true, the entire effort of computing SEQI may be treated as futile, as SEQI is largely based on U-DISE data. Perhaps it is because of these limitations that U-DISE was shifted from NIEPA to the NIC/MHRD from the year 2018-19.

Even though the 2015-16 SEQI is not directly comparable with the 2016-17 SEQI, as about 10 indicators were merged, dropped or modified in 2016-17 because of data issues, the results still reveal an interesting picture. On the one hand, a few states have shown improvement in overall percentage points; on the other hand, a few others have shown declines over the previous year, i.e. 2015-16.

Haryana's improvement, in both percentage points and rank, is impressive and needs further analysis to understand what the state did in the short span of a year, so that other States & UTs may learn from its experience. Haryana's rank among major states rose from 8th (51%) to 3rd (69.5%) in 2016-17. An increase of 18.5 percentage points is difficult to explain, as the information available in the public domain does not point to any major development in the state between 2015-16 and 2016-17.

On the other hand, Karnataka has declined in both percentage points and rank, falling from a high rank of 5th (56.6%) in 2015-16 to 13th (52.9%) in 2016-17, which also needs further explanation.

Needless to mention, Karnataka is considered one of the advanced states of the country and had initiated many state-specific programmes towards achieving the goal of school education, including enhancing the quality of education through a host of technology-related interventions. In addition to Haryana, states may also benefit from the experience of the two top-ranked states, namely Kerala (1st, 82.2%) and Tamil Nadu (2nd, 73.4%), though even these, despite being at the top of the list, are still not perfect on the indicators used in computing SEQI.

Needless to mention, until the bottom-ranked states, such as Bihar, Chhattisgarh and Jharkhand, improve, India cannot achieve the goal of universal school education; here, district-specific SEQIs and, within districts, block-specific SEQIs may reveal an interesting picture of the status of universalization and the likely target years for realizing the goals.