Student-generated Questions: A Novel Approach for Encouraging Cognitive Engagement

Amanda Huee-Ping WONG1,*, WONG Lik Wei1, HOOI Shing Chuan1, and LEE Shuh Shing2

1Department of Physiology, Yong Loo Lin School of Medicine (YLLSOM)
2Centre for Medical Education (CENMED), YLLSOM

*phswhpa@nus.edu.sg 

Wong, A., Wong, L. W., Hooi, S. C., & Lee, S. S. (2024). Student-generated questions: A novel approach for encouraging cognitive engagement [Paper presentation]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-ahpwong-et-al-2/

SUB-THEME

Opportunities from Wellbeing 

KEYWORDS

Students’ questions, student-generated questions, Bloom’s taxonomy, cognitive engagement, supportive learning environment. 

CATEGORY

Paper Presentation 

 

INTRODUCTION 

Question-asking is a crucial process in fostering critical and reflective thinking across different education levels (Aflalo, 2021; Cuccio-Schirripa & Steiner, 2000), yet its role in medical education is often overlooked. Encouraging students to generate their own questions can help them uncover gaps in their understanding, stimulate their curiosity, and engage more deeply with the material (Schmidt, 1993). This practice also provides teachers with valuable insights into students’ learning processes and difficulties. While previous studies have largely focused on using student-generated multiple-choice questions for self-assessment (Gooi & Sommerfeld, 2015; Lahti et al., 2023), this study offers an alternative method of utilising students’ questions, specifically queries that students submitted based on self-directed learning materials. Systematic categorisation of these questions by topic and cognitive level allows educators not only to identify problem areas and explore cognitive engagement with course content, but also to tailor their educational strategies to learner needs. Collecting questions through an anonymous platform offers students a safe environment that encourages reflection and peer learning, and has the potential to enhance cognitive engagement, which has been shown to positively influence student achievement and wellbeing (Ng et al., 2022; Pietarinen et al., 2014). 

 

METHODS 

This study utilises a content analysis approach to evaluate the questions submitted anonymously by first-year undergraduate medical students within the Cardiovascular Physiology blended learning series. The shared question and answer (Q&A) document, integrated into the self-directed learning segment, was accessible alongside other educational resources such as online lecture videos, eBooks, and quizzes. During this segment, the teaching team monitored the document and provided timely feedback on the submitted questions. Students subsequently attended in-person case-based discussions to reinforce knowledge and enhance interactive learning. The questions were categorised by two independent raters according to the revised Bloom’s taxonomy to assess cognitive engagement (Anderson et al., 2001; Chin & Osborne, 2008), specifically into the following cognitive levels: Remember, Understand, Apply, and Analyse. Inter-rater reliability was measured to ensure consistency in the classification process.
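To make the classification step concrete, the minimal sketch below shows how agreement between two raters could be computed in Python. The rating lists are hypothetical stand-ins for the study data, and the use of scikit-learn’s cohen_kappa_score is an illustrative assumption: the abstract reports only a percentage agreement, so Cohen’s kappa is included simply as an optional chance-corrected check.

```python
# Minimal sketch (hypothetical data): agreement between two raters who
# each assigned a revised-Bloom's level to every student question.
from sklearn.metrics import cohen_kappa_score

LEVELS = ["Remember", "Understand", "Apply", "Analyse"]

# Hypothetical ratings; in the study, each list would hold 298 entries.
rater_1 = ["Understand", "Apply", "Understand", "Analyse", "Remember"]
rater_2 = ["Understand", "Apply", "Apply", "Analyse", "Remember"]

# Simple percent agreement (the statistic reported in the Results).
agreement = sum(a == b for a, b in zip(rater_1, rater_2)) / len(rater_1)
print(f"Percent agreement: {agreement:.1%}")

# Cohen's kappa corrects the same comparison for chance agreement.
print(f"Cohen's kappa: {cohen_kappa_score(rater_1, rater_2, labels=LEVELS):.2f}")
```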

 

RESULTS

A total of 298 questions were collected and analysed over four academic years. The distribution of these questions, categorised according to Bloom’s taxonomy, revealed that the majority were classified as ‘Understand’ (56%) and ‘Apply’ (29%), followed by the ‘Analyse’ (11%) and ‘Remember’ (4%) categories (Figure 1). When examined by topic, the highest frequency of questions pertained to the ‘Electrical Basis of Electrocardiogram’ and ‘Cardiac Output and Cardiac Failure’ chapters. A detailed analysis demonstrated that student questions were predominantly within the ‘Understand’ and ‘Apply’ categories across most chapters. Notably, the ‘Cardiac Contraction and Cardiac Cycle’ chapter was unique in that it had a higher number of ‘Apply’ questions than ‘Understand’ questions. The overall inter-rater reliability for categorising the questions was 83.2%, underscoring the robustness of the classification process. 

Figure 1. Overall distribution of students’ questions according to cognitive levels. 
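As an illustration of how the per-chapter breakdown described above could be produced, the following sketch tabulates (chapter, level) records with pandas. The records shown are hypothetical examples, not the study data, and the use of pandas is an assumption rather than the study’s actual tooling.

```python
# Minimal sketch (hypothetical records): tallying questions by chapter
# and cognitive level, as summarised in the Results.
import pandas as pd

questions = pd.DataFrame(
    {
        "chapter": [
            "Electrical Basis of Electrocardiogram",
            "Cardiac Output and Cardiac Failure",
            "Cardiac Contraction and Cardiac Cycle",
            "Cardiac Contraction and Cardiac Cycle",
        ],
        "level": ["Understand", "Apply", "Apply", "Understand"],
    }
)

# Counts of each cognitive level within each chapter.
print(pd.crosstab(questions["chapter"], questions["level"]))

# Overall distribution as percentages (cf. Figure 1).
print(questions["level"].value_counts(normalize=True).mul(100).round(1))
```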

 

CONCLUSION

This study demonstrates the utility of student-generated questions in promoting cognitive engagement with course content and in providing learners with a safe environment to express uncertainties and receive timely feedback from the teaching team. The predominance of questions in the ‘Understand’ and ‘Apply’ categories aligns with educational goals that prioritise comprehension and practical application in foundational medical education. This approach not only gives educators insights with which to refine teaching strategies and better address cohort-specific needs, but also offers another opportunity to foster a supportive learning environment. By creating a psychologically safe platform for students to engage and reflect, this approach could enhance their overall educational experience. Integrating such practices can contribute to improved academic achievement and student wellbeing, supporting the ongoing advancement of pedagogical practices in medical education. 

 

REFERENCES

Aflalo, E. (2021). Students generating questions as a way of learning. Active Learning in Higher Education, 22(1), 63-75. https://doi.org/10.1177/1469787418769120  

Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives (Complete ed.). Longman.  

Chin, C., & Osborne, J. (2008). Students’ questions: a potential resource for teaching and learning science. Studies in Science Education, 44(1), 1-39. https://doi.org/10.1080/03057260701828101  

Cuccio-Schirripa, S., & Steiner, H. E. (2000). Enhancement and analysis of science question level for middle school students. Journal of Research in Science Teaching, 37(2), 210-224. https://doi.org/10.1002/(SICI)1098-2736(200002)37:2<210::AID-TEA7>3.0.CO;2-I  

Gooi, A. C. C., & Sommerfeld, C. S. (2015). Medical school 2.0: How we developed a student-generated question bank using small group learning. Medical Teacher, 37(10), 892-896. https://doi.org/10.3109/0142159X.2014.970624  

Lahti, J., Salamon, M., Farhat, J., & Varkey, T. (2023). Multiple choice question writing and medical students: a systematic literature review. MedEdPublish. 

Ng, B. J. M., Han, J. Y., Kim, Y., Togo, K. A., Chew, J. Y., Lam, Y., & Fung, F. M. (2022). Supporting Social and Learning Presence in the Revised Community of Inquiry Framework for Hybrid Learning. Journal of Chemical Education, 99(2), 708-714. https://doi.org/10.1021/acs.jchemed.1c00842  

Pietarinen, J., Soini, T., & Pyhältö, K. (2014). Students’ emotional and cognitive engagement as the determinants of well-being and achievement in school. International Journal of Educational Research, 67, 40-51. https://doi.org/10.1016/j.ijer.2014.05.001  

Schmidt, H. G. (1993). Foundations of problem-based learning: some explanatory notes. Medical Education, 27(5), 422-432. https://doi.org/10.1111/j.1365-2923.1993.tb00296.x  

Does Grading an Assignment Matter for Student Engagement – A Case Study in an Interdisciplinary Course with Science and Humanities

LIU Mei Hui1 and Stephen En Rong TAY2

1Department of Food Science and Technology, College of Humanities and Sciences, NUS
2Department of the Built Environment, College of Design and Engineering (CDE), NUS

fstlmh@nus.edu.sg; stephen.tay@nus.edu.sg

 

Liu, M. H., & Tay, S. E. R. (2024). Does grading an assignment matter for student engagement: A case study in an interdisciplinary course with science and humanities [Paper presentation]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-liu-tay/

SUB-THEME

Opportunities from Engaging Communities

 

KEYWORDS

Interdisciplinarity, peer learning, student-generated questions, assessment, feedback

 

CATEGORY

Paper Presentation 

 

INTRODUCTION

The Scientific Inquiry II (SI2) course, HSI2007 “Deconstructing Food”, previously employed scenario-based student-generated questions and answers (sb-SGQA) to encourage interdisciplinary learning (Tay & Liu, 2023). In the activity, students were tasked with developing questions and answers based on learning objectives contextualised to community examples beyond the classroom. This contextualisation to a scenario helps develop authentic assessment (Wiggins, 1990). To further increase student engagement with the activity, the sb-SGQA was converted into a graded assignment in AY2023/24 Semester 1. This change was motivated by literature reporting that a graded assignment motivates students in their learning, specifically as an extrinsic motivator in which students are incentivised to work towards a reward (i.e., a good grade) (Docan, 2006; Harlen et al., 2002; Schinske & Tanner, 2014). Hence, this study aims to answer the following questions:

  1. Does the graded sb-SGQA improve student performance, evidenced through a comparison of the continual assessment marks between the graded and ungraded cohorts?
  2. What are students’ perceptions of the sb-SGQA approach from both the graded and ungraded cohorts?

METHODOLOGY

The graded sb-SGQA (20% weightage) was adopted in AY2023/24 Semester 1, and results were compared with data from AY2022/23 Semester 2, when the sb-SGQA was not graded. Two continual assessment (CA) components, an MCQ Quiz (20% weightage) and an Individual Essay (20% weightage), were analysed, as these components were present in both cohorts. Numerical data were analysed with JASP, an open-source statistical package (Love et al., 2019).
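The abstract does not state which test JASP performed; assuming an independent-samples comparison of cohort scores, an equivalent check could be scripted in Python with SciPy as sketched below. The simulated score arrays and their means are placeholders, not the study data, and Welch’s t-test is an assumed choice rather than the study’s confirmed method.

```python
# Minimal sketch (simulated scores): comparing one CA component between
# the graded (n=102) and ungraded (n=184) cohorts. The study used JASP;
# this reproduces an assumed, equivalent independent-samples comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)
graded_mcq = rng.normal(loc=15.0, scale=2.5, size=102)
ungraded_mcq = rng.normal(loc=14.0, scale=2.5, size=184)

# Welch's variant does not assume equal variances across cohorts.
result = stats.ttest_ind(graded_mcq, ungraded_mcq, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```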

RESULTS

Figure 1 shows how students analysed and discussed differences between meals served to students in the East and the West, and Figure 2 demonstrates how students employed content materials from an online community for a case study. Through these questions, students demonstrated understanding of concepts in nutrition, food microbiology (e.g., fermented foods), and health-related information.


Figure 1. Example of students’ work analysing meals in other communities.

 


Figure 2. Student work in question-and-answer generation through engaging the digital community.


 

When the CA scores were analysed, a statistically significant difference was observed for the MCQ Quiz but not for the Individual Essay (refer to Table 1). This could be attributed to the open-ended nature of the Individual Essay, which requires competencies in articulating ideas and positioning one’s views and may therefore have masked the effect of grading.

Table 1
Score comparisons for MCQ Quiz, Individual Essay, and CA across the graded (n=102) and ungraded (n=184) cohorts


 

Table 2 presents student feedback on the sb-SGQA approach. The majority of students in both the graded and ungraded cohorts shared that the sb-SGQA had helped with their learning. Though the activity was challenging, students enjoyed it and recommended it for future courses. The qualitative feedback (refer to Table 3) revealed that Humanities and Sciences students appreciated how their diverse views could be incorporated through the sb-SGQA (Humanities 1, Humanities 3, Science 3). The sb-SGQA also pushed students to reflect more deeply on the course materials in order to develop meaningful questions and answers, thus aiding their learning (Humanities 2, Science 1). Students appreciated the contextualisation of the learning objectives to community examples (Humanities 4, Science 2), and some used the approach to integrate topics taught across the entire course, allowing them to appreciate the course as a whole (Science 4). Similar themes emerged in the ungraded cohort.

 

Table 2
Student feedback from the graded (left) and ungraded (right) cohorts, separated by “/”. Responses are represented as percentages and were obtained from 102 respondents in the graded cohort and 120 respondents in the ungraded cohort. The modes are bolded for emphasis.


 

Table 3
Qualitative feedback from Humanities and Science students in the graded cohort


CONCLUSION AND SIGNIFICANCE

The change to a graded assignment increased students’ performance in the MCQ Quiz segment but not the Individual Essay segment. Student perceptions of the approach were generally positive across both the graded and ungraded cohorts. The results suggest that students’ perceived value of a learning activity may not depend solely on whether the activity is graded. The significance of this study lies in how sb-SGQA can support community engagement in the creation of case studies without incurring software or hardware costs.

REFERENCES

Docan, T. N. (2006). Positive and negative incentives in the classroom: An analysis of grading systems and student motivation. Journal of the Scholarship of Teaching and Learning, 6, 21-40. https://scholarworks.iu.edu/journals/index.php/josotl/article/view/1668/1666

Harlen, W., Crick, R. D., Broadfoot, P., Daugherty, R., Gardner, J., James, M., & Stobart, G. (2002). A systematic review of the impact of summative assessment and tests on students’ motivation for learning. https://dspace.stir.ac.uk/bitstream/1893/19607/1/SysRevImpSummativeAssessment2002.pdf

Love, J., Selker, R., Marsman, M., Jamil, T., Dropmann, D., Verhagen, J., Ly, A., Gronau, Q. F., Šmíra, M., Epskamp, S., Matzke, D., Wild, A., Knight, P., Rouder, J. N., Morey, R. D., & Wagenmakers, E.-J. (2019). JASP: Graphical statistical software for common statistical designs. Journal of Statistical Software, 88(2), 1-17. https://doi.org/10.18637/jss.v088.i02

Schinske, J., & Tanner, K. (2014). Teaching more by grading less (or differently). CBE Life Sciences Education, 13(2), 159-166. https://doi.org/10.1187/cbe.cbe-14-03-0054

Tay, E. R. S., & Liu, M. H. (2023, December 7). Exploratory implementation of scenario-based student-generated questions for students from the humanities and sciences in a scientific inquiry course. Higher Education Campus Conference (HECC) 2023, Singapore. https://blog.nus.edu.sg/hecc2023proceedings/exploratory-implementation-of-scenario-based-student-generated-questions-for-students-from-the-humanities-and-sciences-in-a-scientific-inquiry-course/

Wiggins, G. (1990). The case for authentic assessment. Practical Assessment, Research and Evaluation, 2, 1-3. https://doi.org/10.7275/ffb1-mm19
