Session (6 Aug) Reflection

Evaluation of Course and Teaching Impact: Planning

Participants: Park Mihi (facilitator), Saito Yukiko, Lee Li Neng, Yuzuru Hamasaki, Yannick Appriou, Baranska Malwina, Nina Laurel Powell, Matthew Lepori, Dunya Deniz Lepori, Chin Chuan Fei, Joshua Kurz, Stuart Derbyshire

Description

The session was initiated to learn and discuss how to evaluate a course, and the impact of teaching, throughout the semester. Mihi, the facilitator, opened with a presentation intended to bring all participants onto the same page about course evaluation and potential methodologies. Based on the pre-session survey, student surveys and peer review were the methods in which most participants had expressed an interest. Mihi shared information on student surveys and on portfolios and peer review drawn from the literature, which was followed by an active discussion among all participants. At the end, we held a small-group discussion aimed at sharing thoughts on current practice, evaluation methods, and plans for evaluating a course in the coming semester.

Points and ideas from the session

1. Clear course goal shared with the students

Based on the literature, we understood the importance of setting a clear course goal and sharing it with students. In the NUS setting, however, colleagues found that this effort is not always welcomed by students. One colleague received feedback from a student complaining that the lecturer spent too long explaining why they were doing particular academic activities.

To address this issue, we discussed the possibility of the university giving clear guidance to all NUS students that they are responsible for their choices in higher education. More specifically, NUS students could be asked to take greater responsibility for the modules they choose to read by participating actively and doing the work needed to make learning happen. Educators also need to remind students constantly why learning matters and why attaining the knowledge matters to them (e.g. how this knowledge and learning experience helps them in subsequent modules and in life), in order to convince learners to be more engaged.

2. Design of the student survey

Currently, NUS uses the four questions below for students to evaluate a module.

  • The teacher has enhanced my thinking ability.
  • The teacher provided timely and useful feedback.
  • The teacher has increased my interest in the subject.
  • Overall, the teacher is effective.

In addition, each department is encouraged to create three further questions that serve department-specific evaluation purposes. During the discussion, however, we found that not all departments have created these additional questions. Many participants also agreed that the questions currently in use do little to help educators improve the quality of teaching. One reason may be that students rarely have the chance to be trained to give pedagogically constructive feedback on teaching (e.g. they do not know pedagogical terms or what to look at). Another is that the questions do not set a clear focus on what we are looking for. Consequently, we would like more specific questions about intellectual content, e.g. course materials, the intellectual coherence of course content, the articulation of intellectual goals for learners and the congruence of those goals with course content and mission, the value or relevance of the ideas, knowledge and skills covered by the course, assessment types, and so on, so that we can actually use the information gained from student feedback to improve teaching. However, we understand that if we plan our own student surveys, these should not conflict with the formal feedback exercise, and we need to avoid student fatigue.

Another topic was whether numerical scoring truly reflects learners' feedback. Many participants wished to explore essay-type feedback to evaluate the course, because the average score on the current four questions was 4.2 out of 5, which is so high that it resembles a ceiling effect and may not give much information about how students really felt.

3. Freedom in structuring and scheduling the course

Many colleagues shared the challenge of achieving their course aims given limited contact hours, scheduling constraints, and fixed assessment types. We agreed that longer or more frequent contact with learners, especially longer or more frequent tutorial hours, would support better learning. Scheduling is also a very important factor in learners' satisfaction with, and motivation for, a module.

We discussed more precise ways to evaluate how well we are improving students' critical thinking:

(a) By assessing whether students are better able to raise new and more challenging questions about course content and disciplinary assumptions.

(b) By challenging students to critique articles that they read – maybe sometimes from a more inter-disciplinary perspective, drawing on readings that they do in other fields.

(c) By asking students in each group to set an assessment question.

The proposals above were made with the aim of enhancing critical thinking, learner ownership, learner involvement, good articulation, and collective learning.

4. Supporting diversity in class

A concern was shared about student dynamics in the classroom; for instance, introverted students are reluctant to speak up and share their thoughts in class. Nina shared https://perusall.com/ with the group as a way to engage diverse students in a module. Explaining norms for discussion, e.g. the 'step up/step back' rule for respectful discussion, can also help to create more balance between introverted and extroverted students.

Finally, we agreed that sharing our experiences across disciplines in the LC is highly useful: brainstorming new strategies, learning from others' experiences, and clarifying the constraints faced by our different disciplines.

 

References

Bernstein, D. (2006). Making teaching and learning visible: Course portfolios and the peer review of teaching. Bolton, MA: Anker Publishing.

Berk, R. A. (2005). Survey of 12 strategies to measure teaching effectiveness. International Journal of Teaching and Learning in Higher Education, 17(1), 48–62.

 

Prepared by Mihi and Chuan Fei

Session 2 in August

Hi, I am planning to open a session in early August to discuss how to evaluate your course, and the impact of your teaching, throughout the semester.

During the session, I would like to discuss the following topics:

  1. how to design the evaluation
  2. potential methodology
  3. how to interpret the findings
  4. how to reflect on the outcomes

in the hope of implementing the findings in the coming semester. A follow-up session should take place in December to discuss the findings.

We will also discuss potential opportunities for write-ups, internally at NUS first.

I will upload more information when the details are ready.

Thank you.

Mihi