Amanda Huee-Ping WONG1,*, WONG Lik Wei1, HOOI Shing Chuan1, and LEE Shuh Shing2
1Department of Physiology, Yong Loo Lin School of Medicine (YLLSOM)
2Centre for Medical Education (CENMED), YLLSOM
Wong, A., Wong, L. W., Hooi, S. C., & Lee, S. S. (2024). Student-generated questions: A novel approach for encouraging cognitive engagement [Paper presentation]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-ahpwong-et-al-2/
SUB-THEME
Opportunities from Wellbeing
KEYWORDS
Students’ questions, student-generated questions, Bloom’s taxonomy, cognitive engagement, supportive learning environment.
CATEGORY
Paper Presentation
INTRODUCTION
Question-asking is a crucial process in fostering critical and reflective thinking across different education levels (Aflalo, 2021; Cuccio-Schirripa & Steiner, 2000), yet its role in medical education is often overlooked. Encouraging students to generate their own questions can help them uncover gaps in their understanding, stimulate their curiosity, and engage more deeply with the material (Schmidt, 1993). This practice also provides teachers with valuable insights into students’ learning processes and difficulties. While previous studies have largely focused on using student-generated multiple-choice questions for self-assessment (Gooi & Sommerfeld, 2015; Lahti et al., 2023), this study offers an alternative method utilising students’ questions, specifically queries that students submitted based on self-directed learning materials. Systematic categorisation of these questions according to topic and cognitive level allows educators not only to identify problem areas and explore cognitive engagement with course content, but also to tailor their educational strategies to learner needs. Hosting these questions on an anonymous platform offers students a safe environment that encourages reflection and peer learning, and has the potential to enhance cognitive engagement, which has been shown to positively influence student achievement and wellbeing (Ng et al., 2022; Pietarinen et al., 2014).
METHODS
This study used a content analysis approach to evaluate the questions submitted anonymously by first-year undergraduate medical students within the Cardiovascular Physiology blended learning series. The shared question and answer (Q&A) document, integrated into the self-directed learning segment, was accessible alongside other educational resources such as online lecture videos, eBooks, and quizzes. During this segment, the teaching team monitored the document and provided timely feedback on the submitted questions. Students subsequently attended in-person case-based discussions to reinforce knowledge and enhance interactive learning. The questions were categorised by two independent raters according to the revised Bloom’s taxonomy to assess cognitive engagement (Anderson et al., 2001; Chin & Osborne, 2008), specifically into the following cognitive levels: Remember, Understand, Apply, and Analyse. Inter-rater reliability was measured to ensure consistency in the classification process.
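The inter-rater reliability reported below is a simple percent agreement between the two raters. As an illustration only, the check can be sketched as follows; the function name and the sample classifications are invented for this sketch and are not from the study.

```python
# Hypothetical sketch of a percent-agreement check between two raters
# assigning revised Bloom's taxonomy levels to student questions.
BLOOM_LEVELS = ["Remember", "Understand", "Apply", "Analyse"]

def percent_agreement(rater_a, rater_b):
    """Share (%) of questions on which both raters assigned the same level."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must classify the same set of questions")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Toy example: two raters classifying five questions; they disagree on one.
a = ["Understand", "Apply", "Understand", "Remember", "Analyse"]
b = ["Understand", "Apply", "Apply", "Remember", "Analyse"]
print(f"{percent_agreement(a, b):.1f}%")  # 4 of 5 match -> 80.0%
```

Disagreements flagged this way would then be resolved by discussion between raters before the final counts are tallied.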
RESULTS
A total of 298 questions were collected and analysed over four academic years. The distribution of these questions, categorised according to Bloom’s taxonomy, revealed that the majority were classified as ‘Understand’ (56%) and ‘Apply’ (29%), followed by the ‘Analyse’ (11%) and ‘Remember’ (4%) categories (Figure 1). When examined by topic, the highest frequency of questions pertained to the ‘Electrical Basis of Electrocardiogram’ and ‘Cardiac Output and Cardiac Failure’ chapters. A detailed analysis demonstrated that student questions were predominantly within the ‘Understand’ and ‘Apply’ categories across most chapters. Notably, the ‘Cardiac Contraction and Cardiac Cycle’ chapter was unique in that it had a higher number of ‘Apply’ questions than ‘Understand’ questions. The overall inter-rater reliability for categorising the questions was 83.2%, underscoring the robustness of the classification process.
Figure 1. Overall students’ questions according to cognitive levels.
CONCLUSION
This study demonstrates the utility of student-generated questions in promoting cognitive engagement with course content and providing learners with a safe environment to express uncertainties and receive timely feedback from the teaching team. The predominance of questions in the ‘Understand’ and ‘Apply’ categories aligns with educational goals that prioritise comprehension and practical application in foundational medical education. This approach not only offers educators insights to refine teaching strategies and better address cohort-specific needs but also offers another opportunity to foster a supportive learning environment. By creating a psychologically safe platform for students to engage and reflect, this approach could enhance their overall educational experience. Integrating such practices can contribute to improved academic achievement and student wellbeing, supporting the ongoing advancement of pedagogical practices in medical education.
REFERENCES
Aflalo, E. (2021). Students generating questions as a way of learning. Active Learning in Higher Education, 22(1), 63-75. https://doi.org/10.1177/1469787418769120
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives (Complete ed.). Longman.
Chin, C., & Osborne, J. (2008). Students’ questions: a potential resource for teaching and learning science. Studies in Science Education, 44(1), 1-39. https://doi.org/10.1080/03057260701828101
Cuccio-Schirripa, S., & Steiner, H. E. (2000). Enhancement and analysis of science question level for middle school students. Journal of Research in Science Teaching, 37(2), 210-224. https://doi.org/10.1002/(SICI)1098-2736(200002)37:2<210::AID-TEA7>3.0.CO;2-I
Gooi, A. C. C., & Sommerfeld, C. S. (2015). Medical school 2.0: How we developed a student-generated question bank using small group learning. Medical Teacher, 37(10), 892-896. https://doi.org/10.3109/0142159X.2014.970624
Lahti, J., Salamon, M., Farhat, J., & Varkey, T. (2023). Multiple choice question writing and medical students: A systematic literature review. MedEdPublish.
Ng, B. J. M., Han, J. Y., Kim, Y., Togo, K. A., Chew, J. Y., Lam, Y., & Fung, F. M. (2022). Supporting Social and Learning Presence in the Revised Community of Inquiry Framework for Hybrid Learning. Journal of Chemical Education, 99(2), 708-714. https://doi.org/10.1021/acs.jchemed.1c00842
Pietarinen, J., Soini, T., & Pyhältö, K. (2014). Students’ emotional and cognitive engagement as the determinants of well-being and achievement in school. International Journal of Educational Research, 67, 40-51. https://doi.org/10.1016/j.ijer.2014.05.001
Schmidt, H. G. (1993). Foundations of problem-based learning: Some explanatory notes. Medical Education, 27(5), 422-432. https://doi.org/10.1111/j.1365-2923.1993.tb00296.x