Senior High School English National Examination and Thinking Skills

Ummu Lathifah Ahmad


Since the English National Examination (hereafter ENE) is designed as a norm-referenced test for instructional purposes, namely to evaluate the outcomes of the national curriculum, it is essential to conduct test item evaluation, as it gives a clear picture of the quality of individual items and of the test as a whole. The purpose of this study was to analyze which levels of the Barrett taxonomy were most strongly reflected in the ENE items of the 2013/2014 academic year and whether the proportions of items assessing students’ Lower Order Thinking Skills (LOTS) and Higher Order Thinking Skills (HOTS) were consistent across the twenty test packages. The researcher adopted a qualitative descriptive approach, using a content analysis card to codify the ENE items. To ensure the reliability of the study, three raters independently analyzed a sample of the test packages. The results indicated that questions tapping LOTS still prevailed in the ENE. Across the twenty test packages, items at the literal level accounted for about 68.6% of all questions, reorganization items for 20.8%, and inferential items for only 10.3%. The tests were also insufficiently enriched with evaluation comprehension, which made up a mere 0.3%, and the results showed the complete absence of “Appreciation”, the highest level of thinking in the taxonomy. There is thus a clear shortage of items targeting students’ HOTS, and these skills are not well treated in the exam. Accordingly, this finding reveals that there is still much room for the ENE to become a driving force in making learners critical thinkers. In light of these data, the study recommends revising the English National Examination to include more items that assess HOTS.
