Integrating Design Competitions In Civil Engineering Education: Case Studies From Singapore And China

DU Hongjian1,* and LIANG Yan2 

1Department of Civil and Environmental Engineering, NUS
2School of Civil Engineering, Zhengzhou University, China

*ceedhj@nus.edu.sg

Du, H., & Liang, Y. (2024). Integrating design competitions in civil engineering education: Case studies from Singapore and China [Paper presentation]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-hjdu-lyan/

SUB-THEME

Opportunities from Engaging Communities 

KEYWORDS

Design competition, professional engineers, assessment, sustainability, real-life problem 

CATEGORY

Paper Presentation 

 

INTRODUCTION

Previous literature has shown that design competitions can have tremendous educational value in developing desired skills and competencies in students, provided ideal conditions are identified and maintained (Buchal, 2004). The Royal Academy of Engineering’s report Educating Engineers for the 21st Century concluded that engineering courses must align better with the evolving needs of business and industry (Royal Academy of Engineering, 2006). More high-quality project work is needed, centred around real-life problems and ideally delivered in collaboration with industry (Davies, 2013). In addition to technical skills, enabling skills are crucial, allowing engineers to operate effectively in a commercial environment (Gadola & Chindamo, 2019).  

 

Despite these recognised benefits, there is limited research on the use of design competitions in civil engineering education. This paper explores how a design competition in structural concrete design impacts student learning at both the National University of Singapore (NUS) and Zhengzhou University (ZZU), China. The framework of this innovative teaching method is illustrated in Figure 1, showcasing its application in diverse educational contexts.  

Figure 1. Proposed teaching frameworks based on design competition. 

 

METHODOLOGY

The method was initially implemented in CE3165 “Structural Concrete Design”, a core course in the Civil Engineering Programme at NUS, in AY2023/24. In the past, conventional design projects within CE3165 failed to evoke significant interest among students, who often found them labour-intensive with minimal returns. Recognising the need for a paradigm shift, I sought to reimagine the design project as a dynamic and competitive endeavour. The design competition was introduced in collaboration with the Institution of Structural Engineers Singapore Regional Group, challenging teams to design the structural frame for Singapore’s first net-zero building. A marking rubric with clear assessment guidelines facilitated an objective and transparent evaluation process, allowing judges to assess the merits of each design comprehensively (Table 1). The competition concluded with presentations evaluated by professionals from the construction industry (Figure 2).

The design competition method was subsequently introduced in a similar course in the School of Civil Engineering, Zhengzhou University. During a visit to NUS between 2022 and 2023, the lecturer (co-author of this paper) identified similar challenges in his course: a lack of student motivation and a disconnect between theory and real-life design. After observing the implementation of the design competition at NUS, he decided to adopt it at his home university.  

 

I was involved in planning the course and was invited to serve as an external judge for the 2024 design competition. The same format and marking rubrics were used. At Zhengzhou University, course requirements meant that students participated in the design competition individually, with a total of 15 students. I attended the presentations online (refer to Figure 3). An anonymous student survey was conducted to gather feedback on the design competition.  
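
As a simple illustration of how the 1-5 Likert responses summarised in Figures 4 and 5 might be tallied, the sketch below counts ratings per question and cohort. It is a hypothetical example with placeholder data, not the actual survey-processing pipeline used in either course.

```python
# Minimal sketch (hypothetical, not the actual survey-processing script):
# tallying 1 (Not at all) to 5 (Very much) Likert responses per question
# and cohort, as summarised in Figures 4 and 5.
from collections import Counter

# Placeholder responses for illustration only
responses = {
    ("NUS", "structural design"): [5, 4, 5, 4, 3, 5, 4, 5],
    ("ZZU", "structural design"): [4, 5, 4, 3, 5, 4, 5, 4, 5, 4, 3, 5, 4, 5, 4],
    ("NUS", "sustainability"):    [4, 5, 3, 4, 5, 4, 4, 5],
    ("ZZU", "sustainability"):    [5, 4, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 4],
}

for (cohort, question), ratings in responses.items():
    counts = Counter(ratings)
    mean_rating = sum(ratings) / len(ratings)
    # Breakdown of how many respondents chose each rating from 1 to 5
    breakdown = {score: counts.get(score, 0) for score in range(1, 6)}
    print(f"{cohort} / {question}: n={len(ratings)}, mean={mean_rating:.2f}, counts={breakdown}")
```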

Table 1
Marking rubrics of reports and presentations of the design competition

 

Figure 2. Judge commenting on the design solution at NUS. 

 

Figure 3. Judge commenting on the design solution at ZZU. 

 

RESULTS

The design competition has yielded tangible evidence of its effectiveness in enhancing student learning outcomes and fostering a deeper understanding of sustainability in structural engineering. Quantitative scores from student evaluations corroborate this effectiveness, with high ratings indicating satisfaction with both their learning of structural design (Figure 4) and their thinking about sustainability (Figure 5). Qualitative feedback from students highlights the positive impact of the design competition on their learning experience, with many expressing increased motivation, engagement, and enthusiasm for structural engineering and sustainability (Table 2).  

Figure 4. Feedback from NUS and ZZU students on the question “Do you think the design competition has helped your learning of structural concrete design?” (1 represents “Not at all”, 5 represents “Very much”). 

 

Figure 5. Feedback from NUS and ZZU students on the question “Do you think the design competition has motivated your thinking and learning of sustainability?” (1 represents “Not at all”, 5 represents “Very much”). 

 

Table 2
Qualitative comments from students on the design competition 

 

CONCLUSIONS

This study compares the effectiveness of design competitions in similar courses at two universities. Results consistently demonstrated that design competitions lead to higher student learning motivation and a deeper understanding of structural design. The positive outcomes indicate the potential for broader adoption of this teaching method in engineering curricula, paving the way for more engaged and practically skilled engineering graduates.  

 

REFERENCES

Buchal, R. O. (2004). The educational value of student design competitions. In Proceedings of the inaugural CDEN design conference, Montreal, Canada. 

Davies, H. C. (2013). Integrating a multi-university design competition into a mechanical engineering design curriculum using modern design pedagogy. Journal of Engineering Design, 24(5), 383-396. https://doi.org/10.1080/09544828.2012.761679  

Gadola, M., & Chindamo, D. (2019). Experiential learning in engineering education: The role of student design competitions and a case study. International Journal of Mechanical Engineering Education, 47(1), 3-22. https://doi.org/10.1177/0306419017749580 

Royal Academy of Engineering. (2006). Educating engineers for the 21st century: The industry view. A commentary on a study carried out by Henley Management College for the Royal Academy of Engineering. London, UK.  

Leveraging Adult Learners’ Professional Experience Through Scenario-based Student-generated Questions And Answers In Engineering Mechanics

DU Hongjian1 and Stephen En Rong TAY2 

1Department of Civil and Environmental Engineering, College of Design and Engineering (CDE), NUS
2Department of the Built Environment, CDE, NUS

ceedhj@nus.edu.sg; stephen.tay@nus.edu.sg 

Du, H., & Tay, S. E. R. (2024). Leveraging adult learners’ professional experience through scenario-based student-generated questions and answers in engineering mechanics [Paper presentation]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-hdu-sertay/

SUB-THEME

Opportunities from Engaging Communities 

KEYWORDS

Engineering education, adult learners, relevance, student-generated questions and answers, assessment

CATEGORY

Paper Presentation

 

INTRODUCTION 

Adult learning is crucial for workforce development, ensuring that professionals can adapt to change and thrive in their careers. The Singapore government has therefore implemented various initiatives, including the SkillsFuture movement for lifelong learning, to support this goal. The National University of Singapore (NUS) contributes to these efforts through the Bachelor of Technology (BTech) Programmes designed for polytechnic graduates working in industry.  

 

Specifically, TCE2155 “Structural Mechanics and Materials”, a core course for BTech (Civil Engineering), received feedback from a control cohort expressing the need to evaluate real-life structures to better understand the course content. This observation agrees with the literature that adult learners are often more motivated by practical and relevant content that directly applies to their personal and professional lives (Merriam & Bierema, 2014). Hence, scenario-based student-generated questions and answers (sb-SGQA) were adopted, as the approach allows students to propose scenarios based on their professional experience. In brief, the sb-SGQA approach provides learners the opportunity to develop questions and answers for specific learning objectives within the course (Tay & Tay, 2021). This draws on the learner’s experience, one of the six principles of adult education proposed by Knowles (1992). Therefore, there is potential for sb-SGQA to allow the adult learner community to utilise their professional experience for learning. In addition, past experience with implementing sb-SGQA provided confidence and familiarity with the approach (Du & Tay, 2022).  

 

This paper therefore aims to answer two key questions:  

  1. Does sb-SGQA help adult learners link their professional experiences with course content? 
  2. How does sb-SGQA impact adult learners’ performance?

 

METHODOLOGY 

TCE2155 is offered to first-year BTech (Civil Engineering) undergraduates, who must be at least 21 years old and have two years of full-time work experience. The sb-SGQA approach was introduced in TCE2155, with student feedback compared across three runs: the initial run without sb-SGQA (control, AY2020/21) and two subsequent runs with sb-SGQA (intervention, AY2022/23 and AY2023/24). Data collected included student assignments, final exam grades, feedback, and module scores. The detailed methodology of the sb-SGQA implementation follows previous work by the authors (Du & Tay, 2022). In the initial run without sb-SGQA, a conventional teaching approach was employed: students were given a pre-defined structural analysis question and were required to calculate the forces and stresses in the structure. This approach focused on the application of formulae and calculations, without involving real-life scenarios or encouraging students to generate their own questions and solutions.
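
As a rough sketch of how such a cohort comparison could be tabulated, the snippet below bins final exam scores into grade bands similar to those shown later in Figure 2 and reports each run’s mean. The scores and band edges are hypothetical placeholders, not the authors’ data or analysis script.

```python
# Minimal sketch, not the authors' actual analysis: tabulating final exam
# grade distributions per cohort and comparing cohort means.
import numpy as np

# Hypothetical final exam scores (0-100) for each run, for illustration only
cohorts = {
    "AY2020/21 (control)":      np.array([32, 48, 55, 61, 64, 70, 73, 58, 45, 66]),
    "AY2022/23 (intervention)": np.array([58, 63, 70, 74, 77, 81, 85, 69, 72, 92]),
    "AY2023/24 (intervention)": np.array([55, 60, 68, 71, 79, 83, 88, 75, 66, 95]),
}

# Hypothetical grade bands; the end bands mirror the 0-15 and 90-100 ranges
# mentioned in the Results section
bands = [0, 15, 30, 45, 60, 75, 90, 100]

for label, scores in cohorts.items():
    counts, _ = np.histogram(scores, bins=bands)  # number of students per band
    print(f"{label}: mean = {scores.mean():.1f}, distribution = {counts.tolist()}")
```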

 

RESULTS AND DISCUSSION 

The numbers of students enrolled in TCE2155 and of those who responded to the survey are as follows:  

AY2020/21 (control cohort): 33 enrolled and 17 responded 

AY2022/23 (intervention cohort): 28 enrolled and 16 responded 

AY2023/24 (intervention cohort): 29 enrolled and 8 responded  

 

As displayed in Figure 1, the feedback scores for the course and the teacher improved in the intervention runs, with higher ratings also reported in areas such as “The teacher has enhanced my thinking ability” and “The teacher has increased my interest in the subject.” Qualitative feedback included comments such as “This module is very interesting and can relate to my working life” and “Able to apply it to daily work”, indicating the practical benefits of sb-SGQA. One limitation is the small sample size of fewer than 40 students per cohort; additional control and intervention cohorts in subsequent academic years would help to further validate these promising results. For example, the dip in the scores for “Course” and “Thinking ability” could be attributed to differences in the academic abilities of the intervention cohorts. Nevertheless, it is interesting that despite this plausible difference, the scores for “Teacher” and “Interest” remained high. 

Figure 1. Teaching score from students regarding the course, teacher, increased interest in the subject, and thinking ability in control (AY2020/21) and intervention (AY2022/23 and AY2023/24) cohorts. 

 

Figure 2 shows the final exam grade distributions for TCE2155 across the three runs. Note that no students in the intervention cohorts scored 0-15 and no students in the control cohort scored 90-100. This suggests that sb-SGQA can encourage all adult learners, especially the weaker students, to perform better in the final exam. Furthermore, an analysis of the assignments submitted in the intervention cohorts highlighted how many students were able to draw on their professional experience to design the questions and answers (refer to Figure 3). In the control cohort, adult learners did not have the opportunity to draw upon their professional experience to contextualise the learning objectives of the course.  

Figure 2. Final exam grade distributions in control (AY2020/21) and intervention (AY2022/23 and AY2023/24) cohorts.  

 

Figure 3. Sample of submitted assignment from AY2023/24 (intervention cohort). 

 

CONCLUSION 

The sb-SGQA approach was implemented in TCE2155 within the BTech (Civil Engineering) programme. As a result, adult learners were able to link their professional experience with the course content, and this was reflected in their performance in the submitted assignments. With no additional hardware or software required, the sb-SGQA approach presents itself as a cost-effective method for improving engineering education for adult learners. 

 

REFERENCES

Chin, C. C., & Brown, D. E. (2002). Student-generated questions: A meaningful aspect of learning in science. International Journal of Science Education, 24(5), 521-549. https://doi.org/10.1080/09500690110095249   

Du, H. J., & Tay, S. E. R. (2022). Using scenario-based student-generated questions to improve the learning of engineering mechanics: A case study in civil engineering [Paper presentation]. In Higher Education Campus Conference (HECC) 2022, 7-8 December, National University of Singapore. https://ctlt.nus.edu.sg/wp-content/uploads/2024/10/ebooklet-i.pdf  

Knowles, M. S. (1992). Applying principles of adult learning in conference presentations. Adult Learning, 4(1), 11-14. https://doi.org/10.1177/104515959200400105

Merriam, S. B., & Bierema, L. L. (2014). Adult learning: Linking theory and practice [eBook]. Jossey-Bass. 

Tay, M. X. Y., & Tay, S. E. R. (2021). Scenario-Based Student-generated Questions for Students to Develop and Attempt for Authentic Assessments [Workshop]. In International Society for the Scholarship of Teaching and Learning, 27th October 2021. 

Does Grading an Assignment Matter for Student Engagement – A Case Study in an Interdisciplinary Course with Science and Humanities

LIU Mei Hui1 and Stephen En Rong TAY2

1Department of Food Science and Technology, College of Humanities and Sciences, NUS
2Department of the Built Environment, College of Design and Engineering (CDE), NUS

fstlmh@nus.edu.sg; stephen.tay@nus.edu.sg

 

Liu, M. H., & Tay, S. E. R. (2024). Does grading an assignment matter for student engagement: A case study in an interdisciplinary course with science and humanities [Paper presentation]. In Higher Education Conference in Singapore (HECS) 2024, 3 December, National University of Singapore. https://blog.nus.edu.sg/hecs/hecs2024-liu-tay/

SUB-THEME

Opportunities from Engaging Communities

 

KEYWORDS

Interdisciplinarity, peer learning, student-generated questions, assessment, feedback

 

CATEGORY

Paper Presentation 

 

INTRODUCTION

The Scientific Inquiry II (SI2) course – HSI2007 “Deconstructing Food” – previously employed scenario-based student-generated questions and answers (sb-SGQA) to encourage interdisciplinary learning (Tay & Liu, 2023). In the activity, students were tasked with developing questions and answers based on the learning objectives, contextualised to community examples beyond the classroom. This contextualisation to a scenario helps develop authentic assessment (Wiggins, 1990). To further increase student engagement with the activity, sb-SGQA was changed to a graded assignment in AY2023/24 Semester 1. This was motivated by literature reporting that a graded assignment motivates students in their learning, specifically as an extrinsic motivator in which students are incentivised to work towards a reward (i.e., a good grade) (Docan, 2006; Harlen et al., 2002; Schinske & Tanner, 2014). Hence, this study aims to answer the following questions:

  1. Does the graded sb-SGQA improve student performance, evidenced through a comparison of the continual assessment marks between the graded and ungraded cohorts?
  2. What are students’ perceptions of the sb-SGQA approach from both the graded and ungraded cohorts?

METHODOLOGY

The graded sb-SGQA (20% weightage) was adopted in AY2023/24 Semester 1, and results were compared with data from AY2022/23 Semester 2, when the sb-SGQA was not graded. Across both cohorts, two continual assessment (CA) components, an MCQ Quiz (20% weightage) and an Individual Essay (20% weightage), were analysed as these two components were present in both cohorts. Numerical data were analysed with JASP, an open-source statistical package (Love et al., 2019).
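
For readers who prefer a scripted workflow, the sketch below shows one way a comparable analysis could be run outside JASP, assuming an independent-samples (Welch’s) t-test on each CA component. The simulated scores are placeholders; this is not the authors’ JASP analysis.

```python
# Minimal sketch, assuming an independent-samples comparison of CA component
# scores between the graded (n=102) and ungraded (n=184) cohorts.
# The scores are simulated placeholders, not the actual HSI2007 data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated component scores (out of 100): (graded cohort, ungraded cohort)
components = {
    "MCQ Quiz":         (rng.normal(76, 8, 102),  rng.normal(71, 8, 184)),
    "Individual Essay": (rng.normal(68, 10, 102), rng.normal(67, 10, 184)),
}

for name, (graded, ungraded) in components.items():
    # Welch's t-test: does not assume equal variances across the two cohorts
    result = stats.ttest_ind(graded, ungraded, equal_var=False)
    print(f"{name}: graded mean = {graded.mean():.1f}, "
          f"ungraded mean = {ungraded.mean():.1f}, "
          f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```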

RESULTS

Figure 1 shows students analysing and discussing how meals served to students in the East and the West differ, while Figure 2 demonstrates how students employed content from an online community for a case study. Through these questions, students demonstrated their understanding of concepts in nutrition, food microbiology (e.g., fermented foods), and health-related information.


Figure 1. Example of students’ work analysing meals in other communities

 


Figure 2. Student work in question-and-answer generation through engaging the digital community.


 

When the CA scores were analysed, a statistically significant difference was observed for the MCQ Quiz but not for the Individual Essay (refer to Table 1). This could be attributed to the open-ended nature of the Individual Essay, which requires competencies in articulating ideas and positioning one’s views; these demands may have masked the effect of grading.

Table 1
Score comparisons for MCQ Quiz, Individual Essay, and CA across the graded (n=102) and ungraded (n=184) cohorts


 

Table 2 presents student feedback on the sb-SGQA approach. The majority of students in both the graded and ungraded cohorts shared that sb-SGQA helped with their learning. Though the activity was challenging, students enjoyed it and recommended it for future courses. The qualitative feedback (refer to Table 3) revealed that Humanities and Science students appreciated how their diverse views could be incorporated through sb-SGQA (Humanities 1, Humanities 3, Science 3). The sb-SGQA also pushes students to reflect more deeply on the course materials to develop meaningful questions and answers, thus aiding their learning (Humanities 2, Science 1). Students appreciated the contextualisation of the learning objectives to community examples (Humanities 4, Science 2). The approach was also used by students to integrate topics taught across the entire course, allowing them to appreciate the course as a whole (Science 4). The themes were similar in the ungraded cohort.

 

Table 2
Student feedback from the graded (left) and ungraded (right) cohorts, separated by “/”. Responses are presented as percentages and were obtained from 102 respondents in the graded cohort and 120 respondents in the ungraded cohort. The modes are bolded for emphasis


 

Table 3
Qualitative feedback from Humanities and Science students in the graded cohort


CONCLUSION AND SIGNIFICANCE

The change to a graded assignment improved students’ performance in the MCQ Quiz segment but not in the Individual Essay segment. Student perceptions of the approach were generally positive across both the graded and ungraded cohorts. The results suggest that students’ perceived value of a learning activity may not depend solely on whether the activity is graded. The significance of this study lies in how sb-SGQA could aid community engagement in the creation of case studies, without additional software or hardware costs.

REFERENCES

Docan, T. N. (2006). Positive and negative incentives in the classroom: An analysis of grading systems and student motivation. Journal of the Scholarship of Teaching and Learning, 6, 21-40. https://scholarworks.iu.edu/journals/index.php/josotl/article/view/1668/1666

Harlen, W., Crick, R. D., Broadfoot, P., Daugherty, R., Gardner, J., James, M., & Stobart, G. (2002). A systematic review of the impact of summative assessment and tests on students’ motivation for learning. https://dspace.stir.ac.uk/bitstream/1893/19607/1/SysRevImpSummativeAssessment2002.pdf

Love, J., Selker, R., Marsman, M., Jamil, T., Dropmann, D., Verhagen, J., Ly, A., Gronau, Q. F., Šmíra, M., Epskamp, S., Matzke, D., Wild, A., Knight, P., Rouder, J. N., Morey, R. D., & Wagenmakers, E.-J. (2019). JASP: Graphical Statistical Software for Common Statistical Designs. Journal of Statistical Software, 88(2), 1 – 17. https://doi.org/10.18637/jss.v088.i02

Schinske, J., & Tanner, K. (2014). Teaching more by grading less (or differently). CBE Life Sci Educ, 13(2), 159-166. https://doi.org/10.1187/cbe.cbe-14-03-0054

Tay, S. E. R., & Liu, M. H. (2023, December 7). Exploratory implementation of scenario-based student-generated questions for students from the humanities and sciences in a scientific inquiry course [Paper presentation]. Higher Education Campus Conference (HECC) 2023, Singapore. https://blog.nus.edu.sg/hecc2023proceedings/exploratory-implementation-of-scenario-based-student-generated-questions-for-students-from-the-humanities-and-sciences-in-a-scientific-inquiry-course/

Wiggins, G. (1990). The case for authentic assessment. Practical Assessment, Research and Evaluation, 2, 1-3. https://doi.org/10.7275/ffb1-mm19
