A Comparative Analysis of Pen-and-Paper Exam and Computer-based Exam Results for A Year 3 Undergraduate Course

CHEW Lup Wai1* and Stephen En Rong TAY2

1,2 Department of the Built Environment, College of Design and Engineering

*lupwai@nus.edu.sg

 

Chew, L. W., & Tay, S. E. R. (2024). A Comparative Analysis of Pen-and-Paper Exam and Computer-based Exam Results for A Year 3 Undergraduate Course [Lightning Talk]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-chew-tay

SUB-THEME

Opportunities from Wellbeing

 

KEYWORDS

Exam format, Pen-and-paper exam, Computer-based exam

 

CATEGORY

Lightning Talk

 

CONTEXT

The COVID-19 pandemic motivated studies on stress associated with online learning and assessment. One study reported that students were stressed by electronic exams owing to concerns about exam duration, navigation mode, and technical problems (Elsalem et al., 2020). Another acknowledged exam and technology anxiety arising from invigilation options, technical issues, and access to suitable devices (Aristeidou et al., 2024). The mode of assessment has also been reported to affect stress levels (Ali et al., 2015). However, these findings contradict student feedback received for the course PF3105 “Research Methods”, which revealed that students were more stressed taking pen-and-paper exams, citing their greater familiarity with computer-based exams. Students also worried about poor handwriting and time constraints, claiming they could type faster than they could write. This study therefore presents a comparative analysis of the final exam results from a pen-and-paper exam and a computer-based exam for PF3105 “Research Methods”, a core course for all students in the B.Sc. Project and Facilities Management Programme.

Specifically, this lightning talk aims to answer the following research question: do students perform differently in pen-and-paper and computer-based exams?

METHODOLOGY

While the course assessment consists of group projects (40%) and a final exam (60%), this study considers only the final exams, as the mode of exam is what differs between cohorts. In AY2022/23, 115 students took a pen-and-paper exam; in AY2023/24, 96 students took a computer-based exam. The pen-and-paper exam consisted of short-answer questions, while the computer-based exam included multiple-choice questions in addition to short-answer questions. Both exams were marked out of 60, and the same marker graded both cohorts.

RESULTS AND DISCUSSION

Table 1 summarises the exam results and Figure 1 shows the score distributions. The pen-and-paper exam cohort scored an average of 42.74, with a median of 43.50, whereas the computer-based exam cohort scored an average of 37.16, with a median of 38.38. On average, therefore, students who took the pen-and-paper exam achieved higher scores than those who took the computer-based exam, and the difference in average scores is statistically significant (p < 0.001, two-tailed test), indicating a variation in student performance between the two assessment methods. This difference is not attributable to the multiple-choice questions in the computer-based exam, as repeating the analysis with the multiple-choice questions removed (i.e., comparing only the short-answer questions in both exams) yields the same conclusion. The standard deviations further highlight the dispersion of scores within each exam format. However, the similar minimums (19.50 for the pen-and-paper exam and 21.25 for the computer-based exam) and maximums (54.00 and 51.00, respectively) suggest that the weakest and strongest students in each cohort were not affected by the exam format. If the difference lies in exam stress, students may have been more stressed in the computer-based exam and thus performed worse, but further studies are needed before this can be firmly concluded.

Table 1
Descriptive statistical results for pen-and-paper exam and computer-based exam.

Statistic    Pen-and-paper exam (AY2022/23, n = 115)    Computer-based exam (AY2023/24, n = 96)
Mean         42.74                                      37.16
Median       43.50                                      38.38
Minimum      19.50                                      21.25
Maximum      54.00                                      51.00

 


Figure 1. Score distributions for the pen-and-paper exam (AY2022/23, 115 students) and the computer-based exam (AY2023/24, 96 students). Both exams were marked out of 60.
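For readers who wish to run this kind of comparison on their own cohort data, the sketch below shows one way the descriptive statistics and the significance test could be computed. It is a minimal illustration under stated assumptions: the abstract does not name the specific two-tailed test used, so Welch's two-sample t-test is assumed here, and the score arrays are synthetic placeholders rather than the actual PF3105 exam data.

```python
# Minimal sketch of the cohort comparison (assumption: Welch's two-sample
# t-test as the two-tailed test; the abstract does not specify the test).
import numpy as np
from scipy import stats

# Synthetic placeholder scores out of 60 -- NOT the actual PF3105 data.
# The real cohorts had 115 pen-and-paper scores (AY2022/23) and
# 96 computer-based scores (AY2023/24).
rng = np.random.default_rng(42)
paper = rng.normal(loc=42.74, scale=7.0, size=115).clip(0, 60)
computer = rng.normal(loc=37.16, scale=7.0, size=96).clip(0, 60)

# Descriptive statistics corresponding to Table 1.
for name, scores in [("Pen-and-paper", paper), ("Computer-based", computer)]:
    print(f"{name}: mean={scores.mean():.2f}, median={np.median(scores):.2f}, "
          f"sd={scores.std(ddof=1):.2f}, min={scores.min():.2f}, "
          f"max={scores.max():.2f}")

# Welch's t-test (two-tailed) does not assume equal cohort variances.
t_stat, p_value = stats.ttest_ind(paper, computer, equal_var=False)
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.4g}")
```

Welch's variant is a common default when the two cohorts differ in size and possibly in variance, as is the case here (115 vs. 96 students).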

CONCLUSION AND SIGNIFICANCE

The study found that the pen-and-paper exam cohort performed significantly better than the computer-based exam cohort, even after accounting for differences in question type. As more exams move from pen-and-paper to computer-based formats to accommodate large class sizes, further research is recommended into the underlying causes of exam stress, and its implications for student wellbeing, to identify strategies for making computer-based exams in higher education both effective and fair.

 

REFERENCES

Ali, M., Asim, H., Edhi, A. I., Hashmi, M. D., Khan, M. S., Naz, F., Qaiser, K. N., Qureshi, S. M., Zahid, M. F., & Jehan, I. (2015). Does academic assessment system type affect levels of academic stress in medical students? A cross-sectional study from Pakistan. Medical Education Online, 20(1), Article 27706. https://doi.org/10.3402/meo.v20.27706

Aristeidou, M., Cross, S., Rossade, K.-D., Wood, C., Rees, T., & Paci, P. (2024). Online exams in higher education: Exploring distance learning students’ acceptance and satisfaction. Journal of Computer Assisted Learning, 40(1), 342–359. https://doi.org/10.1111/jcal.12888

Elsalem, L., Al-Azzam, N., Jum’ah, A. A., Obeidat, N., Sindiani, A. M., & Kheirallah, K. A. (2020). Stress and behavioral changes with remote E-exams during the Covid-19 pandemic: A cross-sectional study among undergraduates of medical sciences. Annals of Medicine and Surgery, 60, 271–279. https://doi.org/10.1016/j.amsu.2020.10.058
