Collecting and Using Multiple Sources of Assessment Evidence to Document Your Impact on Students’ Learning

Mark GAN
Centre for Development of Teaching and Learning (CDTL)

Mark shares five sources of assessment evidence related to student learning outcomes that can be used to measure teaching impact and improve practice.


Gan, M. J. S. (2022, July 1). Collecting and using multiple sources of assessment evidence to document your impact on students’ learning. Teaching Connections. 

Before you delve into preparing or writing your teaching portfolio, it is important to think and reflect about your impact on students’ learning. Essentially, you want to address the following question: “What do your students know, and what are they able to do, as a result of your teaching?” To do so, you need to collect, analyse, interpret and report (or communicate) the evidence derived from multiple data sources.

Here are five sources of assessment evidence related to students’ learning outcomes that you can use to document your impact and improve your practice (other than students’ feedback on teaching effectiveness):

Authentic assessment tasks

E.g. Capstone projects, model-building, poster presentations

Think of how you would like your students to demonstrate their knowledge, skills and attitudes in a purposeful and meaningful way. Consider designing authentic assessment tasks that require students to inquire, investigate, interpret, and integrate data and information to solve real-world problems.

Diagnostic tests

E.g. Concept quizzes, pre-topic tests, drawing concept maps

These are usually short higher-order MCQs or structured questions that test students’ application of key concepts, principles and theories. They allow the teacher to quickly diagnose or gather immediate feedback on students’ (prior) understanding, and often help to uncover misconceptions, as well as identify students’ strengths and areas of need.

Rubrics

E.g. Adapting the AAC&U VALUE Rubrics

Strictly speaking, a rubric is not evidence but rather a marking or grading tool that articulates criteria for successive levels of performance. Not only can you share rubrics with your students, but you can also co-create parts of a rubric with them, get students to co-develop their own rubrics, or use a rubric in peer evaluation or feedback.

Student-generated questions

E.g. Try PeerWise, an online tool for students to create, share, evaluate and discuss MCQs

Involving students in the assessment process can reap benefits ranging from a more holistic understanding of the success criteria to increased confidence and self-regulation in their own learning. One good way of doing this is to create opportunities for students to generate or formulate their own assessment questions. Coupled with self- or peer assessment, student-generated questions can be a powerful approach to active and deep learning.

Surveys and questionnaires

E.g. Pre-post questionnaires, existing inventories [validated scales such as the Motivated Strategies for Learning Questionnaire (MSLQ)]

Besides mid- and end-of-module student feedback surveys, there are many opportunities to gather information about students’ learning progress. Not only can you gain a better insight into your students’ strengths and weaknesses (i.e. learner profiling), but you can also gather immediate feedback on how well your lesson went and how best to address areas of learning concern.

You can also use your Learning Management System’s (LMS) online polling tool (or other polling tools such as Poll Everywhere and Mentimeter) and online quizzes.



Mark GAN is an Associate Director of the Centre for Development of Teaching and Learning (CDTL) in NUS. He has been involved in a wide variety of higher educational initiatives and programmes to enhance professional development of staff, such as courses for developing a Teaching Portfolio and writing of teaching inquiry grants. His research interests include feedback and assessment, and the impact of academic development work on teaching and learning. Mark has a PhD in Education from the University of Auckland, supervised by Professor John Hattie.

Mark can be reached at

