How shall we know them? Learning through assessment

Technology in Pedagogy, No. 19, March 2014
Written by Kiruthika Ragupathi

The vast majority of us learn [1]; the question, of course, is what we learn. Simon Watts, an associate professor in the Department of Chemistry, Faculty of Science, National University of Singapore, believes that all aspects of what we do as teachers should facilitate student achievement of learning outcomes; this includes both ‘teaching’ and ‘assessment’ activities. Of course, achieving a learning outcome only by reviewing a final exam one has just failed is administratively a little inconvenient, but in learning terms the assessment would still have achieved one of its primary objectives.

Having spent time working on learning and personal support systems in higher education in both Oxford and New Zealand, A/P Watts spoke about his abiding fascination with the way people learn and think [2], and how they communicate and work in groups. It thus came as no surprise when he began the session by elaborating on his pedagogic research, which centres on culture [3,4] – the NUS culture, and particularly the student-staff culture. The paradigm for the work is behaviours, attitudes and customs, though he is sure that other external drivers may reinforce these. Without understanding the prevailing culture, it is very difficult to study the learning processes, says A/P Watts, particularly when the cultural paradigm here in Singapore is so interesting.

A learning process characterized by memorizing and rote learning, a “low-level learning culture”, does not give a deep grasp of the subject matter, and affects the ability to process the full breadth and implications of the material concerned [5]. It has been said that this culture is common amongst many Singaporean students. However, A/P Watts proposes that the learning process be treated like a tango (a dance in which one partner leads and the other follows), and that we, as learning facilitators, have a duty to lead this dance. His hypothesis is that the current culture is a result of this tango.

In this session, A/P Watts discussed the initial development of two applications that facilitate student learning:

  • pre-laboratory tests
  • secure Modified Essay Questions (MEQs)

The IVLE assessment tool was used to administer both tests. The development of these applications explores how IVLE can be used to help students achieve the planned modular learning outcomes, and also how staff can facilitate higher-order student learning.

Pre-laboratory tests

The pre-laboratory tests were designed for CM2192, a compulsory module in Chemistry. As facilitators, we often face the challenge of teaching students who either do not read at all or do not know how to read the textbook or laboratory scripts, reading instead as if they need to memorize everything. This is a huge problem when teaching in the laboratory, where time for any kind of theory discussion is limited. Hence the use of pre-tests, which, it is hoped, help students focus their reading and prepare for and understand the most important parts of the material before performing the experiments.

The desired learning outcomes of the pre-lab tests designed by A/P Watts and the Chemistry Teaching Team were therefore safety, familiarity with the script, and subject familiarity: not an in-depth knowledge of the subject, but the basic knowledge needed for the laboratory exercise. The laboratory exercises were designed such that half were based on analytical chemistry and the other half on physical chemistry. The focused reading of the scripts in preparation for the pre-tests forces students to think about and understand what they are about to do, and also provides a framework for studying the key concepts behind the laboratory exercises. Each test acts as a “door warden” with an outcome of pass or fail: any student unable to pass the test within two attempts is not allowed to take the practical, subtly informing students that preparation is needed.

The teaching team drafted a total of 15 unique questions per practical, reflecting the desired learning outcomes. Although each member of the team chose the form of their own questions, there were effectively two categories: yes/no or true/false questions (i.e. a 50% chance of being right in a blind guess), and more difficult questions such as “fill in the blanks” or “which of the following is FALSE or TRUE? (1 of 4)”. Each pre-lab test has 6 questions, of which students need to get 5 correct to pass. The IVLE Assessment tool was used to build the pre-lab tests, with every student taking a pre-lab test before every practical exercise. Each test is open for 60 minutes, and is available and unique only to the batch of students taking that particular laboratory exercise.
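
In code terms, the mechanics described above can be captured in a minimal sketch (Python, purely illustrative; the data structures and function names are assumptions, not the actual IVLE implementation):

    import random

    TEST_LENGTH = 6   # questions presented per attempt, from a pool of 15
    PASS_MARK = 5     # correct answers needed to pass
    MAX_ATTEMPTS = 2  # failing both attempts bars the practical

    def build_test(pool):
        """Draw a fresh test from the practical's question pool."""
        return random.sample(pool, TEST_LENGTH)

    def grade(test, answers):
        """The 'door warden': pass only with at least 5 of 6 correct."""
        correct = sum(q["answer"] == answers.get(q["id"]) for q in test)
        return correct >= PASS_MARK

    def may_attend_practical(pool, answer_fn):
        """Allow up to two attempts; answer_fn maps a test to a student's answers."""
        for _ in range(MAX_ATTEMPTS):
            test = build_test(pool)
            if grade(test, answer_fn(test)):
                return True
        return False  # failed twice: alert the academic running the session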

The questions that A/P Watts asked himself were:

  • Is this technique, in this application, effective for student learning, and an effective use of staff time?
  • Do question types used influence pre-lab scores?
  • Are there marked differences between the analytical and physical chemistry halves of the course?
  • Do students “learn”? (not what they learn, but whether the questions or assessment methods themselves produce learning)
  • Is there a link between pre-lab tests and final assessments?

From this experiment, it was fairly clear that staff time was reduced considerably: from 35 hours per semester (based on individual tests, run twice for each practical each week) to about 20 hours per semester (of which 70% is manually downloading marks and 20% is re-setting tests). With more IVLE applications, it is estimated that this might take only 5 hours per semester.
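
To make that breakdown concrete: 70% of 20 hours is about 14 hours of downloading marks and 20% is about 4 hours of re-setting tests, leaving roughly 2 hours of other administration. Presumably it is largely the mark-downloading that further IVLE automation would remove, which is broadly consistent with the 5-hour estimate.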

It also became apparent that students had more difficulty with the analytical than with the physical chemistry questions. However, it was also noted that the physical chemistry section had more of the “50% chance” T/F questions. Some participants proposed that negative marking would address this issue even when the T/F question type is used; others suggested having students key in the rationale for a particular choice, or justify their choice over the alternatives.
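
The arithmetic behind the negative-marking suggestion: if a true/false item scores +1 for a correct answer and 0 for a wrong one, a blind guess is worth 0.5 × 1 + 0.5 × 0 = 0.5 marks on average; with −1 for a wrong answer, the expected value of a blind guess drops to 0.5 × 1 + 0.5 × (−1) = 0, removing the benefit of guessing.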

A/P Watts acknowledged that it was difficult to gauge the effectiveness of student learning, though students have reported that the pre-lab tests helped them better understand and focus on the experiment.

Pedagogical advantages that pre-tests offer

A pre-test measures the learning gained from pre-lectures, readings or scripts, and thereby establishes students’ prior knowledge before they participate in an activity. It should be noted that such pre-tests can be used not only in the laboratory setting but also for any activity that requires prior preparation from students.

Reasons for having pre-tests with focused readings/video lectures are:

  • They help to quantify the knowledge attained in the class and whether the desired learning outcomes were achieved by students with diverse learning styles and varied preparation. More specifically, the tests indicate how well students are prepared for the learning activities and how they are learning.
  • The focused readings / video lectures force students to think about and understand what they are about to do, and also provide them with a framework for studying key concepts for tests.
  • They should also promote the curiosity and imagination needed to predict the outcome of experiments or activities, while also promoting good reading/listening comprehension strategies such as previewing, re-reading, making connections, and self-questioning before, during and after reading the scripts.
  • It is hoped that they also improve students’ participation and engagement during learning activities/laboratory exercises.
  • The data collected from the tests may enable facilitators to target students requiring extra help and will also help in identifying teaching and learning methods that need to be changed or developed.

Modified Essay Question (MEQ)

Modified Essay Questions (MEQs) are often included in assessments to test higher-order cognitive skills, as the more commonly used multiple-choice questions (MCQs) are generally regarded as testing only knowledge recall. An MEQ is thus a compromise between an MCQ and an essay, sitting between these two instruments in its ability to test higher cognitive skills and in the ease of marking to a consistent standard.

A/P Watts used MEQs as part of the final examination (50% of the assessment) for the module CM4282 – Energy Resources. The desired learning outcomes were: subject familiarity; the ability to quantify abstract problems; the ability to use simple models; and lateral and logical reasoning.

For developing the MEQs, he again used the IVLE Assessment Tool, but now in a secure environment. Students took the test in secure mode while still being able to access specific documents (a Singapore statistical and information pack, and a Singapore government policy paper, the Singapore National Climate Change Strategy).

The MEQs were unique in that they employed a “no turning back” design. To illustrate, take an MEQ with a four-part question: students move from Q1a → Q1b → Q1c → Q1d. A student needs the answer to (for example) Q1a to answer Q1b, and so on; a student who gets Q1a wrong will therefore be disadvantaged in the questions that follow, which depend on the Q1a answer. Since it would not be fair to penalize the student in Q1b, Q1c and Q1d for an error in Q1a, students are given the correct answer to Q1a after submitting theirs. They cannot go back and correct their original answer and can only proceed to the next question, but they now have a good Q1a answer with which to proceed. A sample of an MEQ used is given below:

[Figure: Sample MEQ]
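
The flow can also be captured in a minimal sketch (again illustrative Python, not the IVLE secure-assessment implementation): each part is presented once, the answer is locked in, and the model answer is then revealed before the next part.

    def run_meq(parts, ask):
        """Administer a multi-part MEQ with 'no turning back'.

        parts is an ordered list of dicts holding a 'prompt' and a
        'model_answer'; ask collects the student's free-text response.
        """
        transcript = []
        for part in parts:  # strictly forward: Q1a, then Q1b, Q1c, Q1d
            response = ask(part["prompt"])
            transcript.append((part["prompt"], response))  # locked in; no revision
            # Reveal the model answer so an error here does not penalize
            # the later parts that build on this one.
            print("Model answer:", part["model_answer"])
        return transcript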

The research questions that A/P Watts asked himself were:

  • Is this technique, in this application, effective for student learning, and an effective use of staff time?
  • How does using a computer affect student performance [6]?
  • How does “no turning back” affect student performance?
  • Do students “learn” from the exam?

The MEQ allowed staff to mark the final exam in 4 hours, mainly due to the structure of the IVLE assessment, thereby reducing staff time. Students indicated that it was the hardest module they had done, but the most enjoyable, and they had suggestions for improvement. However, students also felt that the MEQ “no turning back” structure did not let them see the full set of questions, and hence did not allow them to plan at the beginning of the exam before attempting the questions. Taking this feedback into consideration, A/P Watts has decided to allow students access to a soft copy of the paper during the exam. He also felt that, until IVLE has “drawing” input capabilities, he would use the tool more for course-work tests rather than only for final examinations.

A/P Watts ended his presentation and opened up the discussion, posing questions and seeking feedback and ideas from participants. Among the questions posed were:

  • How do we test absolute student learning and how do we know what they are learning?
  • Participants’ thoughts on the “no turning back” structure

This stimulated a lively discussion with participants sharing their experiences in the laboratories and in the use of assessments.

Q: Most of the time in laboratory classes, 2% of the unprepared students take up 98% of staff time. So, with the use of the pre-tests, does the students’ preparation match their pre-test scores?

SW: More students were prepared, but unfortunately students were still focused on getting the results they need, and generally do not come with probing questions to discuss during the lab. Often we see students start to think about and prepare for the labs only when they start writing their reports; the pre-lab questionnaire is a way to get students to start thinking about them beforehand.

Comment from participants: You mentioned that students’ analytical scores improve over the term – would this be because they are involved in the lab work before they learn a concept, and hence by the end of term their understanding of the concepts is better?


Q: Is it a lot of work to set up the pre-laboratory tests?

SW: Combining and changing the questions for each set of pre-lab tests is automatic, and opening up tests to different groups of students does not take much time. The time needed is for preparing the questions, which is a team effort; and, as mentioned earlier, time is saved in marking them and in having students better prepared for the laboratory exercises.
Q: You only allow students two attempts at the pre-lab test. Will this disadvantage the students?

SW: Yes, only two attempts are allowed, as we do not have a big pool of questions to recycle. However, when students fail both attempts, this alerts the academic who runs the practical session to assist if there is a genuine problem. But students who do not make the effort to attempt the questions will not be allowed to do the practical.
Q: Wouldn’t changing numerical values allow you to create a large pool of questions?

SW: Yes, that is entirely possible, and something I badly want to do. I have not found an easy way to do this within the IVLE system, but I will be exploring ways to make it happen with the IVLE team.
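
Outside of IVLE, such numerical randomization is straightforward to sketch. The template format, value ranges and dilution example below are hypothetical, purely to illustrate how one drafted question could yield a large pool of variants:

    import random

    def make_variant(template, ranges):
        """Instantiate a question by drawing random values for its placeholders."""
        values = {name: random.uniform(lo, hi) for name, (lo, hi) in ranges.items()}
        text = template.format(**{k: "{:.2f}".format(v) for k, v in values.items()})
        return text, values

    # Hypothetical example: a dilution calculation with randomized numbers.
    question, v = make_variant(
        "What volume of {c1} M stock is needed to prepare "
        "{v2} mL of a {c2} M solution?",
        {"c1": (0.5, 2.0), "v2": (50, 250), "c2": (0.05, 0.2)},
    )
    expected_mL = v["c2"] * v["v2"] / v["c1"]  # marking key for this variant
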
Q: Are the students asking more qualitative questions during the lab after the pre-lab tests? Many students just give you answers that they got from their seniors’ reports.

SW: Not necessarily. I asked students questions about their results during the lab and did not really get better qualitative answers. Some will just give you answers from their seniors’ reports.

The discussion then moved on to improving the laboratory learning experience and making it engaging and fun. One participant suggested that at least one first-year practical be a deliberately flawed practical, as a way of training students in the lab as a practical skill; others felt that practical work is progressive, and that it is important to address the average NUS student: it is not that their attitudes are bad, but they need some encouragement to progress over the years. Participants also noted other factors at play: curriculum design, validation by external bodies, and the breadth of coverage in the first and second years. They also felt the need to design for the common student without dumbing things down: get the structure right and move students a little out of their comfort zone, while keeping colleagues on the teaching team on board. There were suggestions to supplement the current pre-lab video lectures with virtual practicals using Java applets. As teachers we aim to foster an inquisitive nature in our students, although often one asks observational questions first.

Others highlighted that if we consider culture to be a problem (i.e. if seniors are passing lab materials and reports to juniors), then we as facilitators should be riding on that and using it to our advantage. While grappling with many things, we teach skills, but we also need to teach attitudes. Having talked to many students: they want to learn, but not at the expense of their grades, so the aim is to make the system such that we encourage them along the way.

Another issue discussed was the training of Graduate Teaching Assistants (GTAs): sometimes lab technicians and GTAs give away the correct results, or simply get students to repeat the exercise, rather than helping them probe further.

Finally, the discussion moved to MEQs, particularly their use in course-work tests rather than only in final exams. It was largely agreed that neither essay-type questions for large classes nor resorting to MCQs was the way to enhance student learning. MEQs would thus be a good option to consider, particularly if easier questions could be designed for the first few tests, with more difficult questions as the semester proceeds. Students should then show an improvement across the assessments, owing to their growing familiarity with both the content and the process over the semester.


References

  1. Basic cognitive theory, for example Piaget, J. [1951] The Psychology of Intelligence. Routledge & Kegan Paul, London.
  2. Scheffer, B.K., Rubenfeld, M.G. [2001] Critical thinking: what is it and how do we teach it? In Current Issues in Nursing, 6e, J.M. Dochterman and H.K. Grace (eds). Mosby, St Louis.
  3. Baker, D., Taylor, P.C.S. [1995] The effect of culture on the learning of science in non‐western countries: the results of an integrated research review. Int. J. Sci. Educ. 17, 695-704.
  4. Bishop, R., Berryman, M. [2006] Culture Speaks: Cultural Relationships and Classroom Learning. Huia, Wellington, NZ.
  5. Biggs, J.B. [2003] Teaching for Quality Learning at University, 2e. Open University Press, Buckingham.
  6. Sloan & McGinnis (1978); Russell & Wei (2004); Powers et al. (1994); Mogey (2010); Burke & Cizek (2006); Russell & Haney (1997); Mogey et al. (2010), etc.