Interrater Reliability Study of the Diploma in Basic Education Examination in English, Mathematics and Integrated Science conducted by the Institute of Education, UCC in Ghana
DOI:
https://doi.org/10.47963/jedp.v5i.997

Keywords:
interrater reliability, scorer agreement, total score, error scores

Abstract
The study was conducted to determine the interrater reliability (rater agreement) of the Diploma in Basic Education (DBE) examination conducted by the Institute of Education of UCC in Ghana. The population consisted of 13,352 first-year students admitted for the 2015/2016 academic year to pursue the DBE programme, who offered English, Core Mathematics and Integrated Science. Using the stratified random sampling technique, 600 scripts per course were sampled from twelve Colleges of Education for the study. The Pearson Product Moment Correlation Coefficient and the paired samples t-test were used for the analyses. The results showed high interrater reliability in all three courses: English (r = 0.819 at α = 0.05), Mathematics (r = 0.878 at α = 0.05) and Integrated Science (r = 0.867 at α = 0.05). In addition, the hypothesis testing revealed that the differences between raters in Mathematics and Integrated Science were not significant (p > 0.05), indicating that the differences had no impact on total scores. In English, however, the differences were found to be significant at α = 0.05. It was recommended that the Institute of Education intensify its coordination sessions with examiners, with special emphasis on the English examiners. It was further suggested that team leaders be more vigilant in vetting scripts.
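The two statistics used in the study can be sketched in plain Python: Pearson's r measures agreement between two raters' scores, and the paired-samples t statistic tests whether one rater systematically scores higher than the other. The rater scores below are illustrative placeholders, not data from the study.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def paired_t(x, y):
    """Paired-samples t statistic and degrees of freedom for score differences."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    md = sum(d) / n                                   # mean difference
    sd = math.sqrt(sum((v - md) ** 2 for v in d) / (n - 1))  # sample SD
    return md / (sd / math.sqrt(n)), n - 1

# Hypothetical marks awarded by two examiners to the same five scripts
rater1 = [10, 12, 14, 16, 18]
rater2 = [11, 13, 13, 17, 19]

r = pearson_r(rater1, rater2)       # high r -> strong rater agreement
t, df = paired_t(rater1, rater2)    # |t| below the critical value -> no
print(r, t, df)                     # significant systematic difference
```

A high r with a non-significant t, as the study reports for Mathematics and Integrated Science, means the two examiners rank scripts consistently and neither marks systematically higher; a significant t, as in English, signals a systematic scoring difference even when agreement is high.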