My new reflection on online teaching during the March–April 2020 pandemic period, titled ‘Who Needs To Get Prepared for Online Courses?’, has been published here.
Many more useful posts can be found in ‘Teaching Connections’ by NUS CDTL.
There is a new blog for faculty on the Educator Track (ET) at NUS.
We run:
Career Advancement for FASS Educators
Please visit the blog for more info!
Inspiring reading for understanding the importance of faculty mentoring. In particular, point ‘4. Adding more non-tenure-track faculty also increases the need for mentoring programs’ describes the gap we are trying to fill with the Committee on Career Advancement for FASS Educators (CAFE).
Why (And How) We Need to Improve Faculty Mentoring in Higher Education
Due to the world-affecting COVID-19 situation, I moved my lectures online, as many of our colleagues did. Since language learning is believed to work best face-to-face, I tried to adapt our online learning experience to maximise its particular benefits (e.g. using multimedia and easier sharing).
Here I share my reflection with the learning community, in the hope of helping others avoid the mistakes I made, or get through the first stage more smoothly.
Feel free to contact me if you have any enquiries.
Dear colleagues,
Happy New Year!
Hope you are having a great holiday season with your beloved ones.
We had a great session with Nina before the Christmas holiday (please check out Nina’s presentation here, https://blog.nus.edu.sg/mihi/), and I am happy to have another colleague, Li Neng, for the first session of 2020. This informative session will give us some food for thought, timely at the beginning of a new semester.
Title: Using brief questioning exercises to encourage inquiry-based learning, by Lee Li Neng (Psychology)
Date and Time: 15 Jan 2020 (Wed), 10AM
The abstract will be shared soon.
Please sign up here: https://doodle.com/poll/5grwsvtftv6x24fw
Nina has kindly shared her slides with the public. If you are interested in her research or in further research collaboration, feel free to contact Nina at nina.powell@nus.edu.sg.
I am really excited to share that a new session by Nina from the Psychology Department is coming soon.
Date and time: 19 November, 2019 (Tuesday), 2-4PM
Title: Cultivating a culture of uncertainty in the classroom
Abstract: Certainty, specifically needs for epistemic certainty, often result in the use of heuristics in judgment and decision-making, and an overall shallow processing of information. This has been well documented in social cognition, and applies to judgments of people and actions in many domains (e.g., moral decisions, planning, stereotyping, attitudes, etc.). Generally, people are very happy to make judgments on the basis of very little information, and can quickly rely on shallow and emotional processing of information in order to reach a conclusion. A desire for certainty is very powerful, and often motivates us to reach conclusions quickly, maximising our efforts and conserving our resources. Certainty also has important consequences for our self-image, helping to increase self-esteem. I wish to extend this understanding in psychology to the classroom – how can a need and desire for certainty – a natural consequence of contact with knowledge and information – adversely affect students’ understanding of a topic, and their desire to pursue more information? I will explore the theoretical basis for my assertion that we should be cautious of students’ certainty, and cultivate a learning environment that favours uncertainty – that the purpose of education is not to know, but to consider.
I am sure we will learn something new! See you there.
An exam should not be just an exam!
How can we help students recap the experience of taking an exam in a developmental way? Please read this article. I also attach my own exam wrapper, used for LAK4201 (Korean 5), for your reference.
Article:
LAK4201 Exam Wrapper:
Abstract:
We compared students’ self-reported perception of learning with their actual learning under controlled conditions in large-enrollment introductory college physics courses taught using (1) active instruction (following best practices in the discipline) and (2) passive instruction (lectures by experienced and highly rated instructors). Both groups received identical class content and handouts, students were randomly assigned, and the instructor made no effort to persuade students of the benefit of either method.
Students in active classrooms learned more (as would be expected based on prior research), but their perception of learning, while positive, was lower than that of their peers in passive environments. This suggests that attempts to evaluate instruction based on students’ perceptions of learning could inadvertently promote inferior (passive) pedagogical methods. For instance, a superstar lecturer could create such a positive feeling of learning that students would choose those lectures over active learning. Most importantly, these results suggest that when students experience the increased cognitive effort associated with active learning, they initially take that effort to signify poorer learning. That disconnect may have a detrimental effect on students’ motivation, engagement, and ability to self-regulate their own learning. Although students can, on their own, discover the increased value of being actively engaged during a semester-long course, their learning may be impaired during the initial part of the course. We discuss strategies that instructors can use, early in the semester, to improve students’ response to being actively engaged in the classroom.
Evaluation of Course and Teaching Impact: Planning
Participants: Park Mihi (facilitator), Saito Yukiko, Lee Li Neng, Yuzuru Hamasaki, Yannick Appriou, Baranska Malwina, Nina Laurel Powell, Matthew Lepori, Dunya Deniz Lepori, Chin Chuan Fei, Joshua Kurz, Stuart Derbyshire
Description
The session was initiated to learn and discuss how to evaluate a course, and the impact of teaching, throughout the semester. Mihi, the facilitator, opened with a presentation to bring all participants to a shared understanding of course evaluation and potential methodologies. Based on the pre-session survey, student surveys and peer review were the methods many participants showed interest in. Mihi shared information on student surveys and on portfolios and peer review drawn from the literature, followed by an active discussion with all participants. At the end, we held small-group discussions aimed at sharing thoughts on current practice, evaluation methods, and plans for evaluating a course in the coming semester.
Points and ideas from the session
1. Clear course goal shared with the students
Based on previous literature, we understood the importance of setting a clear course goal and sharing it with the students. In the NUS setting, however, colleagues found that this effort is not always welcomed by students. One colleague received feedback from a student complaining that the lecturer spent too long explaining why they were doing particular academic activities.
To address this issue, we discussed the possibility of the university giving clear instruction to all NUS students that they are responsible for their choice of higher education. More specifically, NUS students could be asked to take more responsibility for the modules they choose to read, by participating actively and doing the work necessary to make learning happen. Educators also need to constantly remind students of why learning matters, and why attaining such knowledge matters to them (e.g. how this knowledge and learning experience helps them in subsequent modules and in life), to convince learners to be more engaged.
2. Design of Student survey
Currently, NUS employs four questions for students to evaluate a module, as below.
In addition, each department is encouraged to create three further questions that serve department-specific evaluation purposes. During the discussion, however, we found that not all departments have created these three additional questions. Many participants also agreed that the current questions do little to help educators improve the quality of teaching. One reason could be that students rarely have a chance to be trained to give pedagogically constructive feedback on teaching (e.g. not knowing pedagogical terms, or what to look for). Another is that the questions do not set a clear focus on what we are looking for. Consequently, we would like more specific questions about intellectual content (e.g. course materials, the intellectual coherence of the course content, the articulation of intellectual goals for learners and the congruence of those goals with the course content and mission, the value or relevance of the ideas, knowledge, and skills covered by the course, and assessment types), so that we can actually use the information gained from student feedback to improve teaching. We understand, however, that any student surveys we plan ourselves must not conflict with the formal feedback exercise and must avoid student fatigue.
Another topic was whether numerical scoring truly reflects learners’ feedback. Many participants wished to explore essay-type feedback for evaluating a course: the average score across the current four questions was 4.2 out of 5, high enough to suggest a ceiling effect, and it may not reveal much about how students really felt.
3. Freedom in structuring and scheduling the course
Many colleagues shared the challenge of achieving a course aim given limited contact hours, scheduling constraints, and fixed assessment types. We agreed that longer or more frequent contact with learners, especially longer or more frequent tutorial hours, would support better learning. Scheduling is also a very important factor in learners’ satisfaction with, and motivation for, a module.
We discussed more precise ways to evaluate how well we are improving the students’ critical thinking.
(a) By assessing whether students are better able to raise new and more challenging questions about course content and disciplinary assumptions.
(b) By challenging students to critique articles that they read – maybe sometimes from a more inter-disciplinary perspective, drawing on readings that they do in other fields.
(c) By asking students in each group to set an assessment question.
The proposals above were based on the aims of enhancing critical thinking, learner ownership, learner involvement, good articulation, and collective learning.
4. Support diversity in class
There was a concern about student dynamics in the classroom; for instance, introverted students are reluctant to speak up and share their thoughts in class. Nina shared https://perusall.com/ with the group as a way to engage diverse students in a module. Explaining norms for discussion can also help create more balance between introverted and extroverted students, e.g. the ‘step up/step back’ rule for respectful discussion.
Finally, we agreed that sharing our experiences across disciplines in the LC is highly useful: brainstorming new strategies, learning from others’ experiences, and clarifying the constraints faced by our different disciplines.
References
Bernstein, D. (2006). Making teaching and learning visible: Course portfolios and the peer review of teaching. Bolton, Mass: Anker Pub. Co.
Berk, R. A. (2005). Survey of 12 strategies to measure teaching effectiveness. International Journal of Teaching and Learning in Higher Education, 17(1), 48–62.
Prepared by Mihi and Chuan Fei