Adopting a Paperless Laboratory Report Management System: A Feasibility Study

XU Hairuo
Department of Chemistry

This article summarises my Professional Development Programme (Teaching) (PDP-T) practicum on implementing a paperless laboratory report management system using the National University of Singapore’s Integrated Virtual Learning Environment (IVLE) as the platform.

Recommended Citation:
Xu, H. (2015). Adopting a paperless laboratory report management system: A feasibility study. CDTL Brief, 18(1), pp. 14-18.

Background and Objective
The laboratory report is an important component of student assessment in practical modules. In the Department of Chemistry, students are currently required to submit both hard and soft copy versions of their laboratory reports. The soft copy version is an electronic document containing the laboratory report typed by the student; it is uploaded to the “Student Submission” folder in the IVLE Workbin for the lecturer to grade and conduct a plagiarism check. The hard copy version includes, besides the laboratory report, a handwritten datasheet, a spectrum printout or other documents related to the practical session. Students submit the hard copies to the laboratory staff, who sort the reports according to the class roster and hand them to the lecturer for grading. After grading, the lecturer returns the reports to the laboratory and informs students to collect their reports so that they can view the grades and comments.

Benefits of going paperless
One of the logistical challenges of this process is the effort needed to sort and transport these hard copies. In addition, every semester a number of reports are left uncollected in the laboratory, taking up storage space. As such, it would be timely to implement a paperless laboratory report management system by which students submit only soft copies of their reports and grading is done on these electronic documents.

Adopting a paperless approach is beneficial for several reasons. For one thing, most students nowadays use computers to type their reports. This method would ensure that tasks such as report submission (by students) and the downloading and grading of these reports (by the lecturer) can be accomplished without either party having to be physically on campus. This is an advantage if the lecturer plans to conduct online group discussions or even a virtual laboratory in future. In addition, the lecturer would be able to easily conduct online plagiarism checks on soft copies of the reports, which could deter students from copying other people’s work and, in the long run, also foster independent thinking. McDowall (2004) and Giles (2012) have highlighted the benefits of this approach in their studies of similar but more comprehensive paperless systems. These include significant reductions in cost (e.g. costs incurred by paper waste and logistics) and paperwork, as well as increased efficiency (Giles, 2012).

The objective of this practicum was thus to test the feasibility of putting in place a paperless system of managing and grading laboratory reports using IVLE as the platform. We compared the lecturer’s input of time and effort when using the conventional method (i.e. grading hard copies) against the paperless method. Students’ experiences with the two methods were also surveyed and compared.

As the lecturer of the module CM2192 “Experiments in Chemistry 3” in Semester 2 of AY2013/14, I chose two experiments in this module (Experiment 8 “Rovibrational Spectrum of Hydrogen Chloride” and Experiment 9 “The Electronic Absorption Spectrum of Iodine”) for this practicum. Both experiments are about molecular spectroscopy, and are similar in terms of level of difficulty and the page limit for the respective laboratory reports. A class of 98 students was divided into two groups (Groups 1 and 2). For each group, submission and grading were conducted using the conventional method for one experiment and the paperless method for the other (see Table 1).

Table 1. Methods (conventional or paperless) used for
laboratory report management.


Following the paperless method, students were instructed to upload electronic versions of their laboratory reports to the “Student Submission” folder in the IVLE Workbin. Since each report also included a spectrum printed in the laboratory, students were given the option of either uploading the scanned or photographed files of the spectrum printout as an attachment, or simply submitting the spectrum printouts in hard copy to the laboratory if they had no access to a scanner or smartphone. The latter option was given to minimise the extra cost students might incur if they had to purchase additional equipment for the soft copy submission.

As the lecturer, I first collected the spectrum printouts submitted to the laboratory in hard copy (only a few copies). I then downloaded all the reports to my computer before starting to grade them. When grading, I read each report on the computer screen and inserted my comments into the electronic document using either the “Comment” tools in Microsoft Office Word or the “Comment and Markup” tools in Adobe Acrobat Pro. When it came to grading the spectrum printouts, I referred to either the electronic attachments or the hard copies handed in. After grading, I keyed in the grade on the “Comment” page of the student’s file in the IVLE Workbin and uploaded the annotated file as an attachment (see Figure 1). The “Student Submission” folder was set up in such a way that each student could only view his/her own grade and file. IVLE offers another tool, “Annotate”, which also allows the lecturer to add comments to a student’s document. However, this tool was not used in this practicum because it required online operation and was less convenient than editing the document offline.
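For a large class, the download step can produce a folder of nearly a hundred files to organise. A small script can help; the sketch below is a hypothetical helper and not an IVLE feature — it assumes all submissions have been exported into one folder, and that filenames follow an invented convention (`ExptNo_StudentID_Report.ext`) by which they can be sorted into per-experiment subfolders.

```python
# Hypothetical helper for organising downloaded submissions.
# Assumes an invented filename convention: ExptNo_StudentID_Report.ext
from pathlib import Path
import shutil

def sort_submissions(download_dir: Path) -> int:
    """Move each report into a subfolder named after its experiment."""
    moved = 0
    for report in sorted(download_dir.glob("*_*_Report.*")):
        expt = report.name.split("_")[0]          # e.g. "Expt8"
        dest = download_dir / expt
        dest.mkdir(exist_ok=True)
        shutil.move(str(report), str(dest / report.name))
        moved += 1
    return moved

# Example: create two dummy submissions, then sort them.
demo = Path("ivle_demo")
demo.mkdir(exist_ok=True)
(demo / "Expt8_A0123456X_Report.docx").touch()
(demo / "Expt9_A0234567Y_Report.pdf").touch()
print(sort_submissions(demo))   # prints 2 (both files moved)
```

Keeping the annotated copies in the same per-experiment folders also makes the subsequent batch upload of graded files easier to track.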

Figure 1. The “Comment” page on IVLE in which the lecturer keys in the student’s grade and includes the annotated file as an attachment.


Results and Feedback
Table 2 compares the two methods and their impact on the learning environment as well as the parties involved in the laboratory session, namely the instructional team (comprising the lecturer and laboratory staff) and the two groups of students.

Table 2. Comparison between the conventional and the paperless methods.



Feedback from the lecturer and laboratory staff
On average, it took me (the lecturer) six minutes to grade a report in hard copy (timings were recorded with a stopwatch), which included reading it, writing brief comments in the margins and inserting a grade on the front page. This increased to eight minutes with the paperless method, because the time required to download and upload soft copies of the report had to be taken into account, even though less time was spent writing comments, since I type faster than I write by hand. To save more time, I made a document with a list of common comments from which the relevant comment could be copied into a student’s report when needed.
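To put those per-report averages in perspective, a quick back-of-envelope calculation scales them to the full class of 98 reports (assuming, purely for illustration, that every report in an experiment is graded by a single method):

```python
# Scale the measured per-report grading times to the whole class.
reports = 98
conventional_min = reports * 6   # 6 min per hard-copy report -> 588 min
paperless_min = reports * 8      # 8 min per paperless report -> 784 min

print(f"conventional: {conventional_min / 60:.1f} h")                    # 9.8 h
print(f"paperless:    {paperless_min / 60:.1f} h")                       # 13.1 h
print(f"extra:        {(paperless_min - conventional_min) / 60:.1f} h")  # 3.3 h
```

The extra three hours or so per experiment is the lecturer’s side of the trade-off against the sorting, transport and storage savings described earlier.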

As for the laboratory staff, they indicated through informal conversations with the lecturer that they preferred the paperless method because they no longer needed to sort and distribute hard copies of the reports.

Feedback from the students
A survey was also conducted to evaluate students’ experiences with the two methods (shown in Table 2). The survey results are reflected in Tables 3 and 4.

According to the results collated in Table 3, a larger proportion of students responded positively to the change, either preferring the paperless method (43.3%) or feeling neutral about it (33.3%). The responses shown in Table 4 also indicate this, with the majority of respondents agreeing that the paperless method saved them time (86.6%) and money (66.7%). In the survey, some students also commented that the paperless method saved them the trouble of going to the laboratory to submit and collect their reports, and that they were able to review the comments on their reports anywhere with their smartphones. Some also considered the paperless method “an environmentally-friendly way of marking the reports”. In terms of negative responses, only a small percentage of respondents considered scanning or taking a photograph of the spectrum printout to be time-consuming (30.0%) and costly (3.3%).

Table 3. Students’ responses on whether they prefer the conventional
or paperless method. (Response size: 30)


*Note: Students had to answer the following multiple-choice question:

Overall, which of the two methods do you prefer?

  • The conventional method (submitting/collecting hard copy report + spectrum in the lab, and uploading the soft copy of the main report to IVLE for plagiarism check), or
  • The paperless method (uploading soft copies of both the report and spectrum to IVLE, and downloading reports with grade and comments).

Table 4. Students’ responses on their experiences with the two methods. (Response size: 30)


Concluding Reflections
The paperless method of managing laboratory reports has been tested and found to be feasible. According to the qualitative and quantitative feedback, this method saved students and laboratory staff time and reduced paper wastage, even though it required slightly more work from the lecturer. However, we found limitations in the methodology which can be addressed when we implement this system for future laboratory sessions. For instance, the datasheet and spectrum printouts could be digitised to facilitate easy uploading to IVLE. The lecturer could also recommend that students submit their reports in a particular file type based on his/her preference. For example, I prefer students to submit their reports as Word documents rather than as PDFs, because it is easier and faster to insert comments into the former. In terms of evaluating the effectiveness of future runs of this method, the qualitative and quantitative feedback could be substantiated by other sources of data (e.g. focus group sessions with students). To further refine this approach, one could also look into harnessing the benefits of going paperless by conducting online group discussions or a virtual laboratory for subsequent sessions.

Giles, J. (2012, January 26). Going paperless: The digital lab. Nature, 481, 430-431. Retrieved from

McDowall, R.D. (2004). Designing a paperless laboratory. Scientific Computing & Instrumentation, 21(12), L8-L14.
Retrieved from



About the Author
Dr Xu Hairuo teaches physical chemistry and practical modules in the Department of Chemistry. She has a keen interest in enhancing her students’ learning experiences and continuously seeks new and more effective ways of managing teaching and assessment in her modules.


Question-based Learning (QBL): An Innovative Approach to Teaching Clinical Anatomy in Medical Education


O. Iravani
Department of Anatomy

Recommended Citation:
Iravani, O. (2015). Question-based learning: An innovative approach to teaching clinical anatomy in medical education. CDTL Brief, 18(1), pp. 7-13.

Anatomy is one of the most important subjects in medical education. In many schools, learning anatomy is mandatory for medical students (Older, 2004). Students gain basic knowledge of the human structure by learning anatomy and correlating it with various clinical conditions (Bay & Ling, 2007). At the National University of Singapore (NUS), the anatomical sciences, comprising gross anatomy, histology and embryology, are taught at the Yong Loo Lin School of Medicine’s Department of Anatomy. The medical students there benefit from a variety of teaching methods such as lectures, practical sessions and tutorial classes.

Overview of the Anatomy Curriculum
Currently, the School teaches anatomy to first-year medical students through a three-step process. First, students learn the basics of human anatomy during a two-hour lecture, where they become acquainted with each anatomical region and the basic theoretical concepts which focus on the structure of the body for that specific region. The lectures help them gain a quick yet comprehensive introduction to the subject by providing a “guideline map” of the human body.

The second step involves students receiving hands-on training during the practical sessions, during which they have to identify the human structures on prosected (ready to examine) cadavers. The sessions are valuable opportunities for students to apply the knowledge they have acquired during the lectures on human specimens (also known as “silent mentors”). Working on a real-time simulation of the human body is beneficial to students and enhances their knowledge of anatomy. During the practical sessions, students are also trained in chest and abdomen ultrasonography by the clinical anatomist, skills they need to cultivate in order to correlate the internal structures with surface landmarks.

The final step involves reinforcing the knowledge acquired. This is done during the tutorials where students discuss what they have learnt with their peers and tutors. At this point, they may still be unclear about certain structures, relationships and their clinical significance. The tutorials are where students get to fill in any gaps in their knowledge and clarify ambiguities they might have about the course content through active interaction with each other and their tutors. More importantly, these sessions help them integrate basic anatomical knowledge with clinical applications, and serve as a guide in helping them determine what they need to know when it comes to applied anatomy and clinical scenarios. For example, when students were taught about the lymphatic drainage of the tongue, they were first shown a few clinical figures illustrating patients with metastasis of tongue carcinomas. A discussion about potential lymphatic drainage routes would follow. Such discussions give students a better sense of how they can apply basic anatomical concepts (in this case, lymphatic drainage of the tongue) to a clinical scenario.

Current Approaches to Teaching Anatomy
According to the literature, several methods have been used to teach anatomy, ranging from traditional lectures to interactive methods such as case-based, problem-based, and team-based learning (Turney, 2007; Ganguly, 2010). Each method has its advantages and disadvantages. For example, while lectures are effective when it comes to disseminating information to a large number of students within a limited time, students tend to become passive recipients of the course content. Meanwhile, more interactive methods such as problem-based learning are effective in engaging small groups of students, even if they can be more time-consuming compared to conducting lectures.

What then is the best method to teach anatomy? It would seem that there is no single ideal method and a combination of various interactive models comprising active directed discussions and demonstrations using teaching aids might be preferable (Davis et al., 2014).

Applying Question-based Learning (QBL) to the Tutorials
To fulfil the curriculum’s learning outcomes more effectively, we introduced question-based learning (QBL) into the tutorials. In QBL, the lecturer designs objective-oriented questions which students discuss in a systematic manner during their respective tutorial sessions. Such questions help students navigate the vast amounts of anatomical information in the curriculum. QBL is based on aspects of inquiry-based learning, which Virginia S. Lee and her colleagues define as “a range of strategies used to promote learning through students’ active and increasingly independent investigation of questions, problems and issues, often for which there is no single answer” (Lee et al., 2004, p. 6). According to Feletti (1993), as cited by Lauren M. Anstey and her colleagues, it is an interdisciplinary approach to learning that “fosters problem-solving and critical thinking, and requires that students assume a greater degree of responsibility as they guide and manage their own learning” (Anstey et al., 2014, p. 64).

Questioning is a typical form of formative assessment (“Assessment for Learning”), and is “one of the most common methods of checking learner understanding” (Jones, 2005, p. 10). A set of well-constructed questions can help students organise their thoughts and highlight parts of the curriculum content that they do not know or might require further clarification. For the lecturer and tutors, such questions are not only useful in helping to evaluate their students’ progress and levels of understanding, but are also useful for giving immediate feedback to students about their learning. In fact, well-designed questions and guided discussions can help the lecturer and tutors create “an inquiry-based learning environment in which students are confident about approaching their inquiry, that they can find things out for themselves through the use of appropriate questioning and provision of support materials to discover their own path” (McKinney, 2010, p. 23). Such an approach would help first-year medical students feel more confident about deciding what is clinically essential as they sift through the anatomical information presented in the curriculum.

QBL was developed for the first-year medical students’ tutorial groups in Academic Year 2014/15. During this period, three tutorial groups were taught using QBL, with 18 to 21 students in each group. The QBL process was set up as follows:

Step 1: Designing objective-oriented questions
This was one of the most important steps in QBL, in which the lecturer had to design questions based on the tutorial objectives provided under the School’s curriculum. Students were first presented with a fascinating clinical condition in the form of a simple figure of a normal or abnormal manifestation, an X-ray or a sound. The students were not involved in the diagnosis of any disease. They were then asked if they could see any abnormality in the figure, X-ray, etc. Once the students’ attention was focused on the clinical scenario, the lecturer asked them about the relevant normal structures which might be affected by the particular disease condition. The author coined the term “clinification” to describe this process of teaching basic science under the shadow of clinical conditions.

Step 2: Pre-tutorial test
Students were given a list of 10 statements in a pre-tutorial test sheet called “Assessment for Learning”. The statements were designed based on the principles of good feedback practice outlined by Nicol and Macfarlane-Dick (2006). The students had to indicate whether each pre-tutorial statement was “True” or “False” on the test sheet. As an example, the students were given the following statement regarding the motor innervation of the tongue: “In unilateral damage of the hypoglossal nerve, the tongue deviates toward the lesion.”

Getting students to attempt the test sheet before the tutorial was beneficial to their learning in a few ways. Firstly, the statements provided high quality objective-oriented information which helped students to proactively consider the learning point and discuss it with their peers. Secondly, attempting these test sheets before the tutorial (and repeating the test post-tutorial) helped them assess their own understanding of the course content and provided a form of “self-feedback” on their learning. They could track their own progress and clarify any ambiguities they might have had about important learning points. For the lecturer and tutors, the results of the pre- and post-tutorial tests provided a valuable source of feedback to evaluate the students’ learning and the teaching atmosphere.

Step 3: Brainstorming amongst the class
This was followed by a session of brainstorming for the entire class. Each student had to present a simple term or structure learnt from the lectures and practical sessions they attended at the beginning of the week. This part of the tutorial was a good opportunity for students to share their knowledge and clarify any doubts they had about the topic covered.

Step 4: Students form discussion groups
Following the brainstorming session, the class was divided into discussion groups. Students were randomly assigned to groups according to the name list (2-3 students per group). The groups were given pre-designed topics and allowed to discuss their respective topics for 5-10 minutes. The pre-designed topics consisted of objective-oriented questions which helped guide the students from general to more specific concepts. For instance, when the students were supposed to learn about circulation and reabsorption of the cerebrospinal fluid (CSF), they were shown CT¹ images of a normal and a hydrocephalus² brain. First, they had to indicate which CT image was the normal brain. Next, they had to pinpoint the observed abnormality (with or without assistance). Once they understood that one or more brain ventricles were abnormally dilated, they had to point out the potential causes of the hydrocephalus. At this juncture, the groups had to discuss the routes of CSF flow in the central nervous system.

Step 5: Group presentations
Based on the pre-designed sets of questions, each group had to present their answers within a limited time. The other groups were able to openly interact with the presenting group and ask questions. At the end of each presentation, the presenting group gave a summary of their own topic. In case the presenting group was not able to answer the questions, the members would interact with the other groups and discuss the questions as a class. The lecturer and tutors were on hand to facilitate these discussions and address any ambiguities about the topic.

Step 6: Post-tutorial test and feedback
In this step, students had to re-read the statements they received at the start of the tutorial and indicate on the post-tutorial test sheets if these statements were “True” or “False”. Here, the students were expected to internalise the important learning points from the statements. After that, they had to complete an anonymous questionnaire in which they gave feedback about QBL and whether the teaching methods implemented made an impact on their learning of anatomy (see Table 1).

Table 1. Sample of the questionnaire that students had to complete.


They were also asked to write down any questions or doubts they might have about any of the topics covered. Once this was done, the tutor collected the questionnaires for analysis.

Step 7: Concluding discussions for the tutorial
The tutorial session concluded with a discussion about the correct answers for the pre- and post-tutorial statements.

Step 8: Debrief about the questionnaire results
This was done at the beginning of the next tutorial, where the lecturer discussed the results of the questionnaire with the tutorial group. This included addressing the feedback students gave (both negative and positive) and any clarifications they needed regarding the tutorial content.

Statistical Analysis
A paired t-test was used to compare the pre- and post-tutorial test results with the correct answers. We also performed a correlation analysis to assess the correlation between the pre- and post-tutorial test results and the correct answers. A p-value below 0.05 was considered statistically significant.

An analysis of the results showed that students’ learning improved significantly after the QBL tutorial sessions. The paired t-test indicated a statistically significant difference between the pre-tutorial test results and the correct answers (p < 0.039), and correlation analysis revealed only a weak correlation between the two (p < 0.001, correlation coefficient (CC) = 0.502). On the other hand, there was no significant difference between the post-tutorial test results and the correct answers (p = 0.059), and a high correlation between the two (p < 0.001, CC = 0.920).
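As an outline of the kind of analysis described, the sketch below computes a paired t-statistic and a Pearson correlation coefficient from first principles. The scores are invented for illustration and are not the study’s data, and the pairing here is simply pre- versus post-tutorial scores rather than scores versus correct answers.

```python
# Sketch of a paired t-test statistic and Pearson correlation on
# hypothetical pre- and post-tutorial scores (out of 10) for six
# students -- illustrative numbers only, not the study's data.
import math
from statistics import mean, stdev

pre  = [5, 6, 4, 7, 5, 6]
post = [8, 9, 7, 9, 8, 10]

# Paired t-statistic: mean per-student difference divided by the
# standard error of the differences.
diffs = [b - a for a, b in zip(pre, post)]
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Pearson correlation coefficient between the paired scores.
mx, my = mean(pre), mean(post)
cov = sum((x - mx) * (y - my) for x, y in zip(pre, post))
r = cov / math.sqrt(sum((x - mx) ** 2 for x in pre)
                    * sum((y - my) ** 2 for y in post))

print(f"t = {t_stat:.2f}, r = {r:.2f}")   # t = 11.62, r = 0.82
```

The p-values reported in the study would then be obtained by comparing the t-statistic against the t-distribution with n − 1 degrees of freedom, which statistical packages do automatically.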

An analysis of the questionnaire results revealed that students responded positively to the introduction of QBL during tutorials. The feedback indicated that they had fun during the tutorials and that QBL led to improvements in their classroom interactions and overall understanding of anatomy. All the students (100%) believed that they learnt “much” or “very much” from the tutorial sessions. 88.9% of the students declared that they really enjoyed QBL, 5.6% enjoyed the sessions, while 5.6% were neutral. Meanwhile, 22.2% of the students indicated that they always used the tutorial notes, 22.2% often used them, while 44.4% and 5.6% indicated that they used the notes “sometimes” and “rarely” respectively. In terms of their level of class interaction, 77.8% of the students evaluated themselves as being “very interactive” or “interactive”, 16.7% as “moderately interactive”, and 5.6% confessed to being “weak” when it came to class interactions. In addition, 50% of the students “strongly agreed” that QBL made them more interactive than before, and the rest “agreed” that QBL had a positive effect on them. For “Assessment for Learning”, all the students (100%) found it to be “useful” or “very useful”. Finally, 66.7% of the students preferred to receive the tutorial slides (notes) after the class, while the rest preferred to receive them before class.

An engaging teaching method can connect students to areas of learning in such a way that they may easily solve any questions that arise. In this case, QBL was used effectively to teach anatomy to first-year medical students during their tutorial classes. In these sessions, sets of objective-oriented questions were continuously used to trigger the students’ curiosity about the subject. Through these questions and the various QBL activities, they learnt how to apply basic anatomical concepts to clinical conditions. Tackling these questions made the subject interesting and relevant to them. In addition, activities such as the pre- and post-tutorial tests (“Assessment for Learning”) ensured that the students’ attention was more attuned towards learning important clinical anatomy topics. These activities were also useful in providing the lecturer with invaluable information about their learning. The students also provided feedback which revealed gaps in their knowledge, queries they might harbour about the topic and their level of satisfaction towards the teaching strategies employed in QBL.

So far, this article has highlighted how QBL can make positive contributions to students’ learning. However, there are some concerns which should be considered for future iterations of QBL. For example, educators need to consider an optimal strategy when it comes to designing analytical questions which serve as effective guides in helping students navigate the massive amounts of information they have to contend with in the anatomy curriculum. Secondly, due to time constraints, this method may be more suitable for small classes (fewer than 30 students). Lastly, running a successful QBL session requires good time management, since there are several learning activities the class would need to complete during the tutorial. In conclusion, QBL can be easily applied to other subjects in the medical curriculum, although further research is needed to evaluate its effectiveness.

1. A computed tomography (CT) scan refers to an imaging method that uses X-rays to create pictures of cross-sections of the body.

2. Hydrocephalus refers to the build-up of too much cerebrospinal fluid in the brain.

The author would like to thank Professor P Gopalakrishnakone (Department of Anatomy, Yong Loo Lin School of Medicine) and Professor Bay Boon Huat (Head of Department of Anatomy, Yong Loo Lin School of Medicine) for their invaluable suggestions and comments. The author also would like to express his gratitude to Mrs. Somayeh Keshani (Computer engineer and IT analyst) for her valuable consultations during the development and analysis of the study.

Anstey, L.M., Michels, A., Szymus, J., Law, W., Ho, E.M.-H., Qu, F., Yeung, R.T.T. & Chow, N. (2014). Reflections as near-peer facilitators of an inquiry project for undergraduate anatomy: Successes and challenges from a term of trial-and-error. Anatomical Sciences Education, 7, 64-70.

Bay B.H. & Ling E.A. (2007). Teaching of anatomy in the new millennium. Singapore Medical Journal Editorial, 48(3), 182-183.

Davis, C.R., Bates, A.S., Ellis, H. & Roberts, A.M. (2014). Human anatomy: Let the students tell us how to teach. Anatomical Sciences Education, 7(4), 262-272.

Feletti, G. (1993). Inquiry-based and problem-based learning: How similar are these approaches to nursing and medical education? Higher Education Research & Development, 12(2), 143-156.

Ganguly, P.K. (2010). Teaching and learning of anatomy in the 21st century: Direction and the strategies. The Open Medical Education Journal, 3, 5-10.

Jones, C.A. (2005). Assessment for Learning. Learning and Skills Development Agency. Retrieved from

Lee, V.S. (2012). What is inquiry-guided learning? In V.S. Lee (Ed.), Inquiry-Guided Learning. New Directions for Teaching and Learning, no. 129, (pp. 5-14). San Francisco, CA: Jossey-Bass.

McKinney, P. (2010). Inquiry-based learning and information literacy: A meta-analytical study. CILASS (Centre for Inquiry-based Learning in the Arts and Social Sciences), University of Sheffield. Retrieved from!/file/IL_meta-analysis_PM-FINAL.pdf.

Nicol, D.J. & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Older, J. (2004). Anatomy: A must for teaching the next generation. Surgeon, 2(2), 79-90.

Turney, B. (2007). Anatomy in a modern medical curriculum. Annals of the Royal College of Surgeons of England, 89(2), 104-107.



About the Author
Dr. Iravani is a medical doctor and cancer biologist who teaches clinical anatomy, histology and embryology in the Department of Anatomy. He subscribes to the philosophy that education is transformative for both students and teachers when they believe in “learning to teach and teaching to learn”. He also believes that if the students can assess their own learning, they will be able to regulate their studies independently and proactively.

Empowering Psychology Teaching Assistants Through a Symposium

Li Neng LEE*, Sarah Shi Hui WONG* and Stephen Wee Hun LIM
Department of Psychology

Recommended Citation:
Lee, L.N., Wong, S.S.H., & Lim, S.W.H. (2015). Empowering psychology teaching assistants through a symposium. CDTL Brief, 18(1), pp. 19-29.

The demand for tertiary education has been increasing worldwide, with predictions that it will continue to do so, from 97 million students in 2000 to a projected total of 262 million students by 2025 (Bjarnason et al., 2009). There has been a similar trajectory of increasing demand for tertiary education in Singapore, indicated by a marked increase in the proportion of the population with tertiary education; the percentage of university graduates in the Singapore population increased from 14.7% to 25.7% between 2002 and 2012 (Teo, 2013). This rising demand for higher education, coupled with a worldwide economic slowdown, poses a unique and challenging question for higher education institutions: how can higher institutions of learning maintain a high quality of teaching when faced with rapidly growing student numbers and rising student-to-staff ratios (Muzaka, 2009)?

Employing graduate teaching assistants (GTAs) to assist in undergraduate teaching has been considered a viable solution to this conundrum. It costs relatively less to employ GTAs, and it might be easier for faculty to engage their help due to the flexibility of their schedules. With an increase in the number of GTAs taking up significant undergraduate teaching assignments in Australia (Bell & Mladenovic, 2008), North America (Park, 2004), the UK (Muzaka, 2009), and Singapore, it is pertinent for educators to understand how to better prepare and empower GTAs for their teaching responsibilities.

Some research has focused on the training of GTAs (DeChenne, 2010). The National Postgraduate Committee (NPC) of the National Union of Students in the UK promulgated guidelines for the employment of postgraduate students as teachers (NPC, 1993), which included adequate training as a requirement. Yet, according to Goddard (1998), although it has been recognised that GTAs contribute significantly to teaching in higher education institutions, their training has not been entirely adequate. With the exception of North America, which has well-established tutor training programmes (TTPs), the issue was, until fairly recently, taken less seriously in Australia and the UK (Herbert et al., 2002a, b). It is therefore imperative to have in place a framework for the training of GTAs, designed to accommodate the wide variation in their roles and responsibilities (Sharpe, 2000).

The Centre for Development of Teaching and Learning (CDTL), which promotes best practices in teaching and learning at the National University of Singapore (NUS), recognised the need for GTA training and developed a comprehensive two-day training programme, the Teaching Assistants Programme (TAP), to prepare new GTAs for their roles in undergraduate teaching. The TAP effectively empowers GTAs with a range of broad skills that are relevant for teaching and learning in general. At the same time, we recognised that teachers across various disciplines may apply specific pedagogies when engaging undergraduate students, and may require specialised skill sets and paradigms. Generally, higher education in the first decade of the 21st century has become characterised by learner-centred instruction aimed at promoting active learning (see, for example, Huba & Freed, 2000), although there are exceptions. For example, Lindblom-Ylänne and colleagues (2006) found that instructors from science, technology, engineering and mathematics (STEM) related disciplines tend to teach in a more teacher-centric, information-transmission manner than liberal arts instructors.

On top of differing pedagogical skills and paradigms, each academic department provides a unique teaching and learning environment that GTAs must learn to navigate in order to become effective teachers. More pertinent to psychology, Prieto and Meyers (1999) asserted that many psychology departments around the world appear to offer GTAs formal training and supervision in an inconsistent manner. We reasoned that discipline-specific training for psychology GTAs would complement the broad university-wide training that they already receive from CDTL, and further promote effective teaching within the Department of Psychology. To this end, we launched, for the first time, a teaching workshop for our psychology GTAs, which we called the Psychology Teaching Assistants Symposium (PTAS). This Symposium, conducted as a pilot study, aimed to provide a platform for discussing and sharing best practices and ideas in order to empower our GTAs on how best to teach psychology students. Our research question examined the extent to which the PTAS impacted our attendees in that regard.

Psychology Teaching Assistant Symposium (PTAS)
The inaugural PTAS was held on 25 February 2014 and a total of 22 participants from the Department of Psychology attended the event, including GTAs, full-time TAs, and faculty members. At the end of the programme, participants would be able to:

  • Apply appropriate strategies to address pedagogical issues during tutorials;
  • Apply appropriate skills on how to relate to module lecturers and students;
  • Apply strategies on how to balance coursework and research with teaching.

These learning outcomes were mapped by the PTAS Subcommittee¹ onto a half-day programme consisting of four interactive sessions, during which attendees would learn teaching skill sets and, at the same time, have opportunities to ask questions as well as interact and share experiences with the PTAS Chair and Co-Chair:

(A) Session 1: Developing pedagogical and presentation skills,
(B) Session 2: Engaging and connecting with students: A panel discussion,
(C) Session 3: Relating to module lecturers and students, and
(D) Session 4: Balancing coursework and research with teaching

(A) Session 1: Developing pedagogical and presentation skills
This segment of the symposium focused on discussing pedagogical and presentation techniques that can be applied to specific tutorial formats that GTAs may frequently encounter across the various modules offered by the Department. The chief impetus behind such an approach was the conviction that the teaching challenges GTAs face depend largely on the type and format of modules they teach. It is hoped that framing teaching strategies within specific teaching contexts would allow GTAs the opportunity to develop and apply problem-specific solutions more effectively.

Participants were randomly divided into five discussion groups at the beginning of the session, consisting of three to four members in each group. Within each group, participants were asked to share their experiences, teaching roles, challenges and difficulties faced vis-à-vis the modules they taught. Each group then summarised their key discussion points on a whiteboard. We further categorised participants’ responses into two broad types of tutorials: open tutorials (in which the tutorial format unfolds unpredictably in real-time), and closed tutorials (in which the tutorial format is pre-determined and highly structured). See Table 1 below for a summary.

Table 1. Challenges (as identified by participants) that psychology teaching assistants face across various tutorial formats

Strategies to address challenges in closed tutorials
As several challenges participants raised could potentially be resolved with good presentation skills, we conducted a live demonstration of teaching a psychological statistical concept (Levene’s Test for equality of variances) within a closed tutorial format. Through this demonstration, we highlighted how GTAs can convey complex theoretical concepts in a simple and memorable manner via strategies such as pacing their speech, repeating key points for emphasis, and injecting personally relevant examples to aid understanding. Of particular interest was the concern that adhering to a lesson template and repeatedly delivering the same structured lesson can become dull and unchallenging for GTAs after some time. Methods to deliver the standard class materials in new, diverse ways were discussed. First, GTAs were encouraged to keep themselves abreast of the latest literature and draw on relevant portions to show how the foundational knowledge being taught is being advanced in the research field today. Second, they were encouraged to constantly illuminate how the psychological concepts actually played out in one’s own daily experiences, in order to promote active and durable learning among students (see Lim & Gan, 2013). Third, GTAs were challenged to explore the use of technology in teaching standard class materials which may potentially create positive student learning experiences, although they were also cautioned against potential pitfalls (see Lim & Yong, 2013, for a detailed discussion).

Strategies to address challenges in open tutorials
Next, we provided a live demonstration of an open tutorial. A recurring concern among participants was the challenge of encouraging active student participation in tutorials. We raised two plausible reasons to account for students’ tendency to be passive in class: a perceived “power gap” between GTAs and students, where the latter sees GTAs as unapproachable authority figures, as well as a deep fear of failure. To overcome these problems, we proposed that GTAs aim to build connections between students and themselves, as well as among students themselves.

Building connections with students
Since students’ perceptions of GTAs as unapproachable authority figures may discourage them from actively sharing their viewpoints, GTAs can seek to break down such barriers in the classroom both psychologically and physically. For instance, GTAs can utilise appropriate examples from their personal lives to illustrate concepts in class. This strategy will not only enable GTAs to explain difficult theories in a more relatable way, but also allow students to see GTAs as more personable, down-to-earth and approachable, thus bridging the perceived power gap for more effective open dialogue. At the same time, GTAs can set up appropriate classroom arrangements that reduce the physical distance between themselves and their students (e.g., physically moving nearer to the students when speaking; removing unnecessary equipment), thereby creating a conducive learning environment that effectively supports classroom engagement.

Creating a conducive learning environment to build students’ confidence
To help students overcome their fear of failure, GTAs can create a safe learning culture by providing opportunities for students to first gather their thoughts and engage in small-group discussions before sharing their viewpoints, rather than expecting an immediate response to a question, which can put students on the spot and impose stressful time pressures. At the same time, responding effectively to students’ answers can promote sustained classroom engagement in the long run. For instance, GTAs can and should use appropriate positive reinforcement such as praise and affirmation in response to students’ contributions, thus creating a positive classroom climate that boosts students’ self-confidence and encourages greater classroom participation. Also, by elaborating and building on students’ good answers, GTAs can create a valuable learning opportunity for the entire class to gain a deeper understanding of the subject matter and of what makes for an insightful response. When students supply incorrect answers to a question, it is vital that GTAs continue to respond in a positive and encouraging manner, patiently guiding students towards finding a right or better answer. When GTAs demonstrate that they are receptive to their students’ ideas, they build rapport and provide much-needed assurance for students to actively voice their opinions in the classroom.

(B) Session 2: Engaging and connecting with students: A panel discussion
Leading seamlessly from the first session, we chaired a panel discussion of some pertinent concerns participants had highlighted pertaining to their own pedagogical experiences and the challenges they faced during and outside of classroom teaching. These discussion themes had been gathered from an online pre-symposium survey, in which we invited participants to submit their input in order to define the scope of the discussion; this also allowed participants to recognise, at a glance, the challenges that educators commonly face in the classroom. To encourage accurate and honest opinions, participants were assured complete anonymity and confidentiality of their responses. Based on the data collected from 11 survey respondents, we identified “engaging and connecting with students” as a recurring theme that warranted open discussion. The panellists also fielded questions from participants. Table 2 provides a summary of the session’s proceedings.

Table 2. Participants’ questions and panellists’ responses on
engaging and connecting with students

To boost participants’ sense of self-efficacy in applying the pedagogical strategies discussed, we concluded the panel discussion with an open sharing session, during which participants engaged in self-assessment to identify one of their strengths and how they can capitalise on this strength to connect with their students for more effective teaching.

(C) Session 3: Relating to module lecturers and students
During this segment of the symposium, we facilitated a forum with the primary objective of providing a platform for participants to voice potentially sensitive concerns related to their experiences working alongside lecturers and students. To this end, faculty members attending the symposium were invited to leave the venue for a tea break, while the GTAs and full-time TAs took part in the forum. Participants were reminded to avoid revealing any identifying information when sharing their feedback, which we subsequently noted on a whiteboard. Faculty members were then invited back to the venue, and served as panellists who addressed the issues that had been raised.

Some pertinent issues were addressed, including clarifying the role of psychology GTAs, the distribution of teaching and grading responsibilities between lecturers and GTAs, and the appropriate handling of pedagogical conflicts between lecturers and GTAs. Importantly, faculty members also proposed the development of department-level support networks with two goals: to provide GTAs with the guidance and mentorship necessary for managing teaching demands, and to enhance tutor welfare by making available a formal outlet for GTAs to air grievances and seek help with problematic situations.

(D) Session 4: Balancing coursework and research with teaching
In view of the high number of GTAs in the Department, this session was intended as an avenue for GTAs to discuss optimal ways in which they can manage their coursework and exams while fulfilling their teaching and grading responsibilities to the best of their abilities. Based on their personal experiences, participants exchanged good practices in effective grading. We also offered recommendations on how GTAs can integrate their various duties. These included making long-term plans that work around tightly clustered grading deadlines and implementing a firm schedule for their coursework and research.

To close the PTAS, we facilitated a reflection session during which participants were invited to share their personal encounters with students whom they had positively impacted through their teaching. Drawing on these recollections, we highlighted that despite its challenges, teaching is truly a rewarding profession and a great privilege for educators, and we encouraged participants to continue building on their strengths towards becoming yet more effective teachers of psychology.

Attendee Feedback on the PTAS
At the end of the PTAS, participants were invited to fill out a voluntary and anonymous questionnaire to evaluate the symposium. Participants could complete either a printed or an online version of the questionnaire, which consisted of ten quantitative items (α = .92) and four qualitative items. A sample of the questionnaire is shown below:

Table 3. Sample of the questionnaire to evaluate PTAS 2014

Quantitative evaluation of the PTAS
The quantitative items involved rating the effectiveness of the symposium’s four individual sessions; whether the overall symposium was relevant, useful, informative, and effective; the extent to which the symposium had motivated participants to pursue better teaching practices; and the extent to which participants would be able to apply the knowledge gained from the symposium in their teaching. All ratings were made on a 5-point Likert-type scale, with higher scores indicating higher levels of the variable measured (mean scores and standard deviations are presented in Table 4). Participants’ ratings across the six items related to their evaluation of the overall symposium were collapsed to yield a mean overall PTAS effectiveness score of 4.50 (S.D. = 0.47).
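For readers unfamiliar with these computations, the short Python sketch below illustrates how Likert items can be collapsed into a mean overall score and checked for internal consistency with Cronbach’s alpha. The ratings used here are hypothetical and purely for illustration; they are not the actual PTAS responses.

```python
# Illustrative sketch (hypothetical data): collapsing Likert items into a
# mean overall score and computing Cronbach's alpha for internal consistency.
from statistics import mean, pvariance

# Each row = one respondent's ratings on six overall-evaluation items (1-5).
ratings = [
    [5, 5, 4, 5, 4, 5],
    [4, 4, 4, 5, 4, 4],
    [5, 4, 5, 5, 5, 4],
    [3, 4, 4, 4, 3, 4],
]

# Collapse the items into one overall score per respondent, then average.
overall_scores = [mean(row) for row in ratings]
overall_mean = mean(overall_scores)

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
k = len(ratings[0])
item_variances = [pvariance(col) for col in zip(*ratings)]
total_variance = pvariance([sum(row) for row in ratings])
alpha = k / (k - 1) * (1 - sum(item_variances) / total_variance)

print(round(overall_mean, 2))  # mean overall effectiveness score
print(round(alpha, 2))         # internal-consistency estimate
```

An alpha of .70 or above is conventionally taken to indicate acceptable internal consistency, so the .92 reported for the ten quantitative items suggests the questionnaire measured a coherent construct.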

Table 4. Mean scores and standard deviations of participants’ quantitative evaluations of PTAS

Qualitative evaluation of the PTAS
The questionnaire’s qualitative items asked participants to identify some strengths of the PTAS, how they would apply what they had learnt from the symposium in their teaching, their recommendations for ways in which the PTAS could be improved, and their suggestions for activities and initiatives for future runs of the PTAS. We consolidated participants’ feedback and coded it according to common underlying threads in Table 5.

Table 5. Participants’ qualitative evaluations of PTAS

Based on the data gathered above, the PTAS positively impacted the GTAs who attended it. In general, attendees evaluated the PTAS 2014 positively, and thought that it was “personally relevant” and “highly engaging”. They liked the fact that the symposium promoted “exchanges of problems [sic] and ideas between panel members and attendees”, created “[a] chance to connect with other TAs”, “didn’t merely impart teaching skills but also reminded [them] to reflect and leverage [on their] strengths”, contained “inspiring testimonials from the facilitators”, and provided “a refreshing reminder [that they too] can potentially impact and inspire students”. A participant “[felt] recharged to continue to teach passionately”.

The single drawback observed in this inaugural PTAS was that there were potentially many more topics and areas that could have been covered (although the present scope was commensurate with our goal for this session to be a pilot run). Indeed, this was corroborated by many participants, who suggested a full-day programme for future runs of the symposium. We believe that a variety of disciplines beyond psychology can organise similar symposia to empower their teaching assistants. Moving ahead (two semesters after the GTAs have attended the PTAS), we intend to conduct focus group interviews with selected GTAs to track how they have applied the skills and insights gained from the PTAS, and how these have enhanced their teaching. These data will allow us to further assess the impact of the PTAS on GTAs’ teaching journeys. We are excited about having these conversations with our GTAs in the immediate future.

1. The PTAS subcommittee comprises Assoc Prof Melvin Yap (Psychology Graduate Programme Coordinator and PTAS Advisor), Dr. Stephen Lim (PTAS Chair), Mr. Lee Li Neng (PTAS Co-Chair), and Ms. Sarah Wong (PTAS Secretariat).

Bell, A., & Mladenovic, R. (2008). The benefits of peer observation of teaching for tutor development. Higher Education, 55, 735–752.

Bjarnason, S., Cheng, K.-M., Fielden, J., Lemaitre, M.-J., Levy, D. C., & Varghese, N. V. (2009). A new dynamic: Private higher education. 2009 World Conference on Higher Education (pp. 1–124).

DeChenne, S. E. (2010, April 29). Learning to teach effectively: Science, technology, engineering, and mathematics graduate teaching assistants’ teaching self-efficacy. PhD dissertation, Oregon State University.

Goddard, A. (1998, December 11). Postgrads get into class training. Times Higher Education Supplement. Retrieved from

Herbert, D.M.B., Chalmers, D. & Hannam, R. (2002a). Enhancing the training, support and management of sessional teaching staff. Paper presented at the Australian Association for Research in Education Conference, University of Queensland, Brisbane. Abstract retrieved from

Herbert, D., Masser, B. & Gauci, P. (2002b). A comprehensive tutor training program: Collaboration between academic developers and teaching staff. Paper presented at the Australian Association for Research in Education Conference, University of Queensland, Brisbane. Abstract retrieved from

Huba, M. E., & Freed, J. E. (2000). Learner-centered assessment on college campuses: shifting the focus from teaching to learning. Community College Journal of Research and Practice, 24, 759–766.

Kim, M. K., Lim, S. M., Khera, O., & Getman, J. (2014). The experience of three flipped classrooms in an urban university: an exploration of design principles. Internet and Higher Education, 22, 37–50.

Lim, S. W. H., & Gan, D. Z. Q. (2013). Using an innovative assessment method to promote lifelong learning among psychology undergraduate students. CDTL Brief, First Look (Nov/Dec 2013).

Lim, S. W. H., & Yong, P. Z. (2013). Student perceptions of the use of technology in teaching: Towards a positive learning experience. CDTL Brief, 16(2), 12–18.

Lindblom-Ylänne, S., Trigwell, K., Nevgi, A., & Ashwin, P. (2006). How approaches to teaching are affected by discipline and teaching context. Studies in Higher Education, 31(3), 285–298. doi:10.1080/03075070600680539

Muzaka, V. (2009). The niche of Graduate Teaching Assistants (GTAs): Perceptions and reflections. Teaching in Higher Education, 14(1), 1–12. doi:10.1080/13562510802602400

Park, C. (2004). The graduate teaching assistant (GTA): Lessons from North American experience. Teaching in Higher Education, 9(3), 349–361. doi:10.1080/1356251042000216660

Prieto, L. R., & Meyers, S. A. (1999). Effects of Training and Supervision on the Self-Efficacy of Psychology Graduate Teaching Assistants. Teaching of Psychology, 26(4), 264–266. doi:10.1207/S15328023TOP260404

Sharpe, R. (2000). A framework for training graduate teaching assistants. Teacher Development, 4(1), 131–143. doi:10.1080/13664530000200106

Teo, Z. (2013). Educational profile of Singapore resident non-students, 2002–2012. Statistics Singapore Newsletter. Retrieved from


*The first two authors contributed equally to this work.

Correspondence should be addressed to Stephen Wee Hun LIM (Email: ), Department of Psychology, Faculty of Arts and Social Sciences, National University of Singapore, Block AS4, 9 Arts Link, Singapore 117570.

About the Authors:
Mr. Lee Li Neng is currently a Teaching Assistant and is pursuing his Ph.D. in the Department of Psychology. He has won the FASS Graduate Students’ Teaching Award several times and is currently on the Honour Roll. He is intrigued by education, how technology can interact with and contribute to it, and especially the art of teaching. Specifically, he is interested in the question of how teaching can be mastered and passed on to others.


Ms. Sarah Wong is a Teaching Assistant in the Department of Psychology. For her, teaching involves challenging and being challenged by students in a deeply enjoyable process of mutual learning. She continually seeks to inspire growth in her students, not just in the classroom but also in the bigger picture of life. Her postgraduate research focuses on developing pedagogical approaches and cognitive strategies towards enhancing young children’s musical creativity.


Dr. Stephen Lim hails from the Department of Psychology and strives to continue bringing life-transforming educational experiences to his students. As Assistant Dean (External Relations and Student Life) in the FASS and Executive Council Member of the NUS Teaching Academy, he contributes meaningfully towards student mentorship and development in and beyond the classroom. He is also the Founding Director of the Cognition Lab in the Department, and continues to spearhead educational research projects.

