Tag Archives: Assessment

Conducting assessments online: How can you support your students?

In a previous post, I focused on the options that teachers may consider for conducting assessments online. While those options are important, it is far more important to think about students and their well-being. The COVID-19 outbreak is a difficult time for our students too, and many may be dealing with a great deal of anxiety and stress. While we need to ensure the quality of our courses and assessments, we also need to give our students the best possible opportunity to complete their courses.

Therefore, when designing your assessments, it may be good to think about:

  • How you can minimise additional anxiety for students in these difficult times.
    Recognise that students may be facing challenging personal circumstances while working from home. Particularly when entire families are working and learning from home during this period, students may struggle even to find a quiet spot to take an exam or attend your live lectures. Students may also have periods of illness during reading weeks, revision and/or examination periods.
  • What you can do to offer flexibility in your assessments while still maintaining accountability. For example, you could:
    • Be flexible with your deadlines.
    • Extend the duration or provide more time to complete your assessments.
    • Offer multiple attempts for students to complete an assessment. You could then grade the best attempt, or average the best two attempts.
    • Give choice in the topic, method, criteria, weighting or timing of assessments.
    • Give freedom and autonomy on the assessment format (essay/paper, poster, presentation, video, infographic).
    • Provide a second chance for students. Have students take a time-constrained closed-book online exam, then follow it up with a second copy of the same exam as an automated take-home online assessment. Students may use class notes, the text, or any other available reference materials, but may not check with their peers or others. A composite grade is then calculated as follows: the in-class exam score, plus half the points missed on the in-class exam multiplied by the percentage score on the take-home exam. (Adapted from "Better testing for better learning" by Murray, 1990, in College Teaching.)
  • Group work and/or projects. When using group projects as your assessment, consider whether you can offer students the flexibility to work in a group or alone. Working in groups during this period may be stressful, particularly when students' grades also depend on their peers' work.
  • How students may experience varying levels of digital literacy and internet connectivity while completing their assessment tasks.
  • How you can provide a robust assessment method while still accommodating students with diverse needs, including reasonable adjustments.
  • Finally, and most importantly, how your assessments can offer an equitable solution for both you and your students.
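The composite grading in the second-chance scheme above is simple arithmetic, but easy to get wrong in a spreadsheet. Here is a minimal sketch in Python (the function name and the 100-point scale are my own assumptions, not from Murray's paper):

```python
def composite_grade(in_class_score, take_home_pct, max_points=100):
    """Composite grade for the two-stage exam: the in-class score, plus
    half the points missed in class, scaled by the take-home percentage."""
    points_missed = max_points - in_class_score
    return in_class_score + 0.5 * points_missed * (take_home_pct / 100)

# A student who scores 70/100 in class and 90% on the take-home exam
# recovers 0.5 * 30 * 0.9 = 13.5 points, for a composite of 83.5.
print(composite_grade(70, 90))
```

Note that even a perfect take-home score recovers only half the missed points, so the closed-book attempt still dominates the grade.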
[This post is a reproduction from my other blog on Towards Open Learning]

Conducting assessments online: What options do you have?

The COVID-19 pandemic has thrown many faculty members into limbo, particularly where assessments are concerned. While most have managed online teaching to some extent, either by recording their lectures or conducting live webinars, many are still grappling with how to conduct assessments online. In this post, I put together some options that you may wish to consider, be it using your campus learning management system (LMS) [examples include: Moodle, Canvas, Blackboard, LumiNUS] or other open platforms that may be available to you.

When developing your online assessments, some important aspects for you to consider include:

  • Establishing the purpose of your assessment, and what you hope to achieve
  • Aligning your assessments with the learning outcomes of your course
  • Communicating your expectations and requirements clearly to your students

More importantly, you will need to remember that online examinations are quite different from your usual face-to-face examinations. Simply converting your current face-to-face examinations into an online format can be a recipe for failure, as online exams do not take place in a confined, invigilated environment. It is therefore important to ask yourself the following questions when planning to conduct online assessments:

  • How do you make the assessment process meaningful to both you and your students?
  • Should the assessment be synchronous (real-time) or asynchronous (delayed), group or individual?
  • Is securing the assessment necessary? If so, at what level?
  • How do you plan to provide high quality feedback and opportunities for your students to act on feedback?

In this post, I list some online assessment options that you can use. I have also included some examples and available tools that you can consider.

Traditional assignments online
Traditional assignments can take the form of essays, case studies, article reviews, proposal writing or report writing, which students submit to you for review. Most LMSs have either an Assignments tool or a Files Submission tool with which students can submit these assignments online. If you are using Microsoft Teams or Google Classroom, you can similarly use their Assignments features to upload files. Additionally, you can use the essay-type (short answer) question within the Quiz tool for students to submit essays.
Whatever the tool, be transparent about the marking criteria so that students know what is expected of them, and be specific about the allocated marks and word limits. Finally, be sure to offer individualised, relevant feedback along with a summary of common mistakes for the entire class.

Quizzes (automated online assessment)
Online Quizzes that contain multiple-choice questions, multiple response questions, fill-in-the-blanks, true/false, and matching questions can be developed to assess students’ learning or to assess their prior knowledge. Such automated quizzes can also be embedded within video lectures or micro-lectures to test students’ learning, and these quizzes are generally referred to as in-video quizzes.

Timed online assessment
If you are considering conducting your mid-semester and final exams in an online format, you can design time-constrained assessments. To minimise cheating in such exams, you should also consider:

  • randomising the questions and options in your MCQs
  • personalising questions (e.g., numerical values are personalised for individual students; questions are selected from a large pool of questions of the same difficulty level)
  • structuring the questions so that students cannot return to a previous question or section (sequencing your questions)
  • requiring students to provide a short justification (rationale) for each MCQ (sometimes referred to as two-tier MCQs).
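As an illustration of the personalisation idea above, numerical values can be seeded from each student's ID so that every student sees a different but equally difficult variant. This is a hypothetical sketch (the question template and the density example are my own, not a feature of any particular LMS):

```python
import random

def personalised_question(student_id: str):
    """Generate a numeric question variant deterministically seeded
    by the student ID, so a retake shows the same values."""
    rng = random.Random(student_id)   # deterministic per student
    mass = rng.randint(2, 20)         # grams
    volume = rng.randint(5, 50)       # millilitres
    question = (f"A sample of mass {mass} g occupies {volume} mL. "
                f"What is its density in g/mL?")
    answer = round(mass / volume, 3)
    return question, answer
```

Because the generator is deterministic, the correct answer can be recomputed at marking time without storing each student's variant.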

Online interaction
The Forum or Discussion tool, blogs and wikis facilitate asynchronous interaction. You can use these tools to monitor student learning through students' contributions to online forums, chats, blogs and wikis. These contributions can take the form of reading summaries, or of collaborative learning in which students work in small groups to provide critical peer review of each other's work.

Group assessments online 
Students can work in groups to create online presentations and project artefacts, and upload them for review by you, by their peers, or both.

Critical reflection and meta-cognition  
You can consider using electronic portfolios, online journals, logs, diaries, blogs, wikis, or embedded reflective activities. For any of these, peer and self-assessment can be used to evaluate the critical reflection pieces.

Online oral examinations
One-on-one or small-group oral examinations can be conducted via any video conferencing tool such as Skype, Microsoft Teams, Zoom or Google Duo. Additionally, you can have students role-play or participate in debates via these platforms to assess their learning. You can use the recording function in these tools if you would like to review the sessions at a later date.

Getting students to submit assessment questions
You can get students to create and submit assessment questions online for each topic or course. An online quiz with a two-part short answer/essay question can be used to have students (1) create and input their assessment question and (2) explain what is being assessed, why it is important for student learning, and how it relates to the learning outcome(s) of the topic or course. Alternatively, you can have students submit their assessment questions on the forum and discuss their strengths and weaknesses with peers and instructors there.

Take-home quizzes on reading assignments
Set a take-home quiz for every reading assignment, with roughly one question per page of the text and the order of the questions following the order of the pages. The question stems should be "objective" in a fairly literal sense, and the answer options quite specific; to answer a question, students should need to do little more than find the right sentence or paragraph and read it with a certain degree of understanding.
(Extracted from "Tests that also teach" by Williams, 1988, in American Philosophical Association Newsletter on Teaching Philosophy.)

Group multiple-choice test
When taking a multiple-choice test online, students may consult their peers if they wish, but each student must ultimately complete the assessment and is graded individually.
(Adapted from "Better testing for better learning" by Murray, 1990, in College Teaching.)

Paired testing
This assessment consists of a series of thirty-question exams, each in two parts. The first set of fifteen questions is taken individually by each student (this can be a timed quiz), while the second set of fifteen questions is assigned to student teams of two. Teams may discuss each test item but are not required to. Finally, each student turns in individual answers to the quiz.
(Extracted from "Peer-mediated testing: The effects of an alternative testing procedure in higher education" by Hendrickson, J. M., Brady, M. P., and Algozzine, B., 1987, in Educational and Psychological Research.)

This blogpost is adapted from a resource guide on designing effective online assessments that I developed some years back and use for conducting workshops on designing online assessments on my campus.

[This post is a reproduction from my other blog on Towards Open Learning]

How shall we know them? Learning through assessment

Technology in Pedagogy, No. 19, March 2014
Written by Kiruthika Ragupathi

The vast majority of us learn [1]; the question, of course, is what we learn. Simon Watts, an associate professor in the Department of Chemistry, Faculty of Science, National University of Singapore, believes that every aspect of what we do as teachers should facilitate student achievement of learning outcomes; this includes both 'teaching' and 'assessment' activities. Of course, achieving learning outcomes by reviewing a final exam one has just failed is a little inconvenient, but in learning terms the assessment would have achieved one of its primary objectives.

Having spent time working on learning and personal support systems in higher education in both Oxford and New Zealand, A/P Watts talked about his abiding fascination with the way people learn and think [2], and how they communicate and work in groups. Thus it came as no surprise when he started the session by elaborating on his pedagogic research, which is centered around culture [3,4]: the NUS culture, particularly the student-staff culture. The paradigm for the work is behaviours, attitudes and customs, though he is sure there are other external drivers that may reinforce these. Without understanding the prevailing culture, it is very difficult to study the learning processes, says A/P Watts, particularly when the cultural paradigm here in Singapore is so interesting.

A learning process characterized by memorizing and rote learning, a "low level learning culture", does not give a deep grasp of the subject matter, and affects the ability to process the full breadth and implications of the material concerned [5]. It has been said that this culture is common amongst many Singaporean students. However, A/P Watts proposes that the learning process be treated like a Tango (a dance where one partner leads and the other follows), and that we, as learning facilitators, have a duty to lead this dance. His hypothesis is that the current culture is a result of this Tango.

In this session, A/P Watts discussed the initial development of two applications that facilitate student learning:

  • pre-laboratory tests
  • secure Modified Essay Question (MEQ)

The IVLE Assessment tool was used to administer both these tests. The development of these applications examines how IVLE can be used to help students achieve planned modular learning outcomes, and how staff can facilitate higher-order student learning.

Pre-laboratory tests

The pre-laboratory tests were designed for CM2192, a compulsory module in Chemistry. As facilitators, we often face the challenge of teaching students who simply do not read, or who do not know how to read the textbook or scripts; those who do read often feel they need to memorize everything. This is a huge problem when teaching in the laboratory, since time for any kind of theory discussion is limited. Hence the use of pre-tests, which it is hoped will help students focus their reading and help them prepare for and understand the most important parts of the material before performing the experiments.

Therefore, the desired learning outcomes of the pre-lab tests designed by A/P Watts and the Chemistry Teaching Team were safety, familiarity with the script, and subject familiarity: not necessarily in-depth knowledge of the subject, but the basic knowledge needed for the laboratory exercise. The laboratory exercises were designed such that half were based on analytical chemistry and the other half on physical chemistry. The focused reading of the scripts in preparation for the pre-tests forces students to think about and understand what they are about to do, and also provides a framework for studying key concepts for the laboratory exercises. These tests act as "door wardens" with an outcome of pass or fail. Any student unable to pass the test within two attempts is not allowed to take the practical, subtly informing students that preparation is needed.

The teaching team drafted a total of 15 unique questions per practical, reflecting the desired learning outcomes. Although each member of the team chose the form of their questions, there were effectively two categories: questions that were yes/no or true/false (i.e. a 50% chance of being right with a blind guess), and more difficult questions, such as "fill in the blanks" or "Which of the following is FALSE or TRUE? (1 of 4)". Each pre-lab test has six questions, of which students need to answer five correctly to pass. The IVLE Assessment tool was used to build the pre-lab tests, with every student taking a pre-lab test before every practical exercise. Each test is open for 60 minutes, and is available, and unique, only to the batch of students taking that particular laboratory exercise.
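As an aside on the pass rule above: with six questions and a pass mark of five, the format of the questions matters a great deal, because 50/50 true/false items leave a non-trivial chance of passing by guessing alone. A quick binomial calculation (my own illustration, assuming independent questions) shows why:

```python
from math import comb

def blind_pass_probability(n_questions=6, pass_mark=5, p_correct=0.5):
    """Probability of passing by blind guessing alone, modelled as a
    binomial: P(at least `pass_mark` correct out of `n_questions`)."""
    return sum(
        comb(n_questions, k) * p_correct**k * (1 - p_correct)**(n_questions - k)
        for k in range(pass_mark, n_questions + 1)
    )

p = blind_pass_probability()                    # all-T/F test: 7/64 ≈ 0.109
p_two_tries = 1 - (1 - p)**2                    # with two attempts: ≈ 0.207
p_mcq = blind_pass_probability(p_correct=0.25)  # four-option MCQs: ≈ 0.005
```

Under this model, a student guessing blindly passes an all-T/F test roughly once in five tries over two attempts, whereas four-option questions make blind passing negligible, which bears on the later observation that the T/F-heavy physical chemistry section behaved differently.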

The questions that A/P Watts asked himself were:

  • Is this technique in this application effective for student learning and in the effective use of staff time?
  • Do question types used influence pre-lab scores?
  • Are there marked differences between the analytical and physical chemistry halves of the course?
  • Do students “learn”? (not what do they learn: questions or assessment methods?)
  • Is there a link between pre-lab tests and final assessments?

From this experiment, it was fairly clear that staff time was reduced considerably, from 35 hours per semester (based on individual tests twice for each practical each week) to about 20 hours per semester (of which 70% is manually downloading marks and 20% re-setting tests). With further IVLE support, it is estimated that this might take only 5 hours per semester.

It also became apparent that students had more difficulty with analytical than with physical chemistry questions. However, it was also noted that the physical chemistry section had more of the "50% chance" T/F questions. Some participants proposed that negative marking on the T/F question type could take care of this issue, while others suggested getting students to key in the rationale for a particular choice, or to justify their choice over the other options.

A/P Watts recognized that it was difficult to gauge the effectiveness of student learning, though students have reported that the pre-lab tests helped them better understand and focus on the experiment.

Pedagogical advantages that pre-tests offer

A pre-test measures the learning gained through pre-lectures, readings or scripts, and gauges students' prior knowledge before they participate in an activity. It should be noted that such pre-tests can be used not only in the laboratory setting but also for any activity that requires students to prepare before participating.

Reasons for having pre-tests with focused readings or video lectures are:

  • They help to quantify the knowledge attained in the class and whether the desired learning outcomes were achieved by students with diverse learning styles and varied preparation. More specifically, the tests indicate how well students are prepared for the learning activities and how they are learning.
  • The focused readings / video lectures force students to think about and understand what they are about to do, and also provide them with a framework for studying key concepts for tests.
  • They should also promote the curiosity and imagination to predict the outcomes of experiments or activities, while also promoting good reading/listening comprehension strategies such as previewing, re-reading, making connections, and self-questioning before, during and after reading the scripts.
  • It is hoped that they also improve students’ participation and engagement during learning activities/laboratory exercises.
  • The data collected from the tests may enable facilitators to target students requiring extra help and will also help in identifying teaching and learning methods that need to be changed or developed.

Modified Essay Question (MEQ)

Modified Essay Questions (MEQs) are often included as assessments to test higher-order cognitive skills, as the more commonly used multiple-choice questions (MCQs) are generally regarded as testing knowledge recall only. An MEQ is thus a compromise between the MCQ and the essay, sitting between these two test instruments both in its ability to test higher cognitive skills and in the ease of marking to a consistent standard.

A/P Watts used MEQs as part of the final examination (50% of the assessment balance) for the module CM4282 – Energy Resources. The desired learning outcomes were subject familiarity; the ability to quantify abstract problems; the ability to use simple models; and lateral and logical reasoning.

For the MEQs, he again used the IVLE Assessment tool, but now in a secure environment. Students took the test in a secure mode while still being able to access specific documents (a Singapore statistical and information pack and a Singapore government policy paper, the Singapore National Climate Change Strategy).

The MEQs were unique in that they employed a "no turning back" design. To illustrate, take an MEQ with a four-part question, where students move from Q1a → Q1b → Q1c → Q1d. A student needs the answer to (for example) Q1a to answer Q1b, and so on. Hence, if a student gets Q1a wrong, he or she will be disadvantaged in the questions that follow, which depend on the Q1a answer; it would not be fair to penalize the student in Q1b, Q1c and Q1d for the error in Q1a. Therefore, after a student answers Q1a, they are given the correct answer to Q1a; they cannot go back and correct their original answer, but can proceed to the next question with a good Q1a answer to build on. A sample MEQ is given below:

MEQ - Sample
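The control flow of the "no turning back" design can be sketched in a few lines. This is an illustrative mock-up of the sequencing logic, not the actual IVLE implementation:

```python
def run_meq(parts, get_answer):
    """Run a 'no turning back' MEQ: each answer is locked once submitted,
    and the model answer for each part is revealed before the next part,
    so an early mistake does not cascade through the later parts.

    `parts` is a list of (question, model_answer) pairs; `get_answer`
    collects the student's response to one question."""
    responses = []
    for question, model_answer in parts:
        responses.append((question, get_answer(question)))  # locked in
        print(f"Model answer to carry forward: {model_answer}")
    return responses

# Hypothetical four-part question in the Q1a -> Q1d sequence:
parts = [("Q1a", "answer a"), ("Q1b", "answer b"),
         ("Q1c", "answer c"), ("Q1d", "answer d")]
run_meq(parts, lambda q: "student response")
```

The essential point is that `responses` is append-only: the loop never revisits an earlier index, which is the software equivalent of sequencing questions in a quiz tool.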

The research questions that A/P Watts asked himself were:

  • Is this technique in this application effective for student learning and in the effective use of staff time?
  • How does using a computer affect student performance [6]?
  • How does “no turning back” affect student performance?
  • Do students “learn” from the exam?

The MEQ allowed staff to mark the final exam in 4 hours, mainly due to the structure of the IVLE assessment, thereby reducing staff time. Students indicated that it was the hardest module they had done, but the most enjoyable, and they had suggestions for improvement. However, students also felt that the "no turning back" structure prevented them from seeing the full set of questions, and hence from planning at the beginning of the exam. Taking this feedback into consideration, A/P Watts has decided to allow students access to a soft copy of the paper during the exam. He also felt that, until IVLE has "drawing" input capabilities, he would use the tool more for course-work exams rather than only for final examinations.

A/P Watts ended his presentation by opening up the discussion, posing questions and seeking feedback and ideas from participants. The questions posed included:

  • How do we test absolute student learning and how do we know what they are learning?
  • Participants’ thoughts on the “no turning back” structure

This stimulated a lively discussion with participants sharing their experiences in the laboratories and in the use of assessments.

Q: Most of the time in laboratory classes, 2% of the unprepared students take up 98% of staff time. With the use of the pre-tests, does students' preparation match their pre-test scores?
SW: More students were prepared, but unfortunately students were still focused on getting the results they need, and generally do not come with probing questions to discuss during the lab. Often, students only start to think about and prepare for the labs when they begin writing their reports, so the pre-lab questionnaire is a way to get students thinking about the lab beforehand.

Comment from a participant: You mentioned that students' analytical scores improve over the term. Could this be because they are involved in the lab work before they learn a concept, so that by the end of term their understanding of the concepts is better?

Q: Is it a lot of work to prepare the pre-laboratory tests?
SW: Combining and changing questions for each set of pre-lab tests is automatic, and opening up tests to different groups of students does not require much time. The time needed is for the preparation of the questions, which is a team effort; as mentioned earlier, time is saved in marking them and in getting students better prepared for the laboratory exercises.

Q: You only allow students two attempts at the pre-lab test. Will this disadvantage students?
SW: Yes, only two attempts are allowed, as we do not have a big pool of questions to recycle. However, when students fail both attempts, this alerts the academic who runs the practical session to assist if there is a genuine problem. Students who make no effort to attempt the questions will not be allowed to do the practical.

Q: Wouldn't changing numerical values allow you to create a large pool of questions?
SW: Yes, that is entirely possible, and something I badly want to do. I have not found an easy way to do this within the IVLE system, but I will be exploring ways to make it happen with the IVLE team.

Q: Are students asking more qualitative questions during the lab after the pre-lab? Many students just give you answers they got from their seniors' reports.
SW: Not necessarily. I asked students questions about their results during the lab and did not really get better qualitative answers. Some will just give you answers from their seniors' reports.

The discussion then moved on to improving the laboratory learning experience and making it engaging and fun. One participant suggested making at least one first-year practical a deliberately flawed practical, as a way of training students in the lab as a practical skill. Others felt that practical work is progressive and that it is important to address the average NUS student: it is not that their attitudes are bad, but they need some encouragement to progress over the years. Participants also noted other factors at play: curriculum design, validation by external bodies, and breadth of coverage in the first and second years. They also felt the need to design for the common student without dumbing things down: get the structure right and move students a little out of their comfort zone, while keeping teaching-team colleagues on board. There were suggestions to supplement the current pre-lab video lectures with virtual practicals using Java applets. As teachers, we aim to foster an inquisitive nature in our students, although often one asks observational questions first.

Others highlighted that if we consider culture to be a problem (i.e. if seniors are passing lab materials and reports to juniors), why are we as facilitators not riding on that and using it to our advantage? We teach skills, but we also need to teach attitudes. Having talked to a lot of students, it is clear they want to learn, but not at the expense of their grades; we should try to design the system so that we encourage them along the way.

Another issue discussed was the training of Graduate Teaching Assistants (GTAs), as lab technicians and GTAs might sometimes give away the correct results or have students repeat the exercise, rather than helping them probe further.

Finally, the discussion moved to MEQs, particularly their use for course-work tests rather than only for final exams. It was largely agreed that using essay-type questions for large classes, or resorting to MCQs, was not the way to enhance student learning. Hence, MEQs would be a good option to consider, particularly if easier questions were designed for the first few tests, with more difficult questions introduced as the semester proceeds. Any improvement in assessment performance could then be due to students' growing familiarity with both the content and the process over the semester.

 

References

  1. Basic cognitive theory (e.g. Piaget); for example, Piaget, J. [1951] The Psychology of Intelligence. Routledge & Kegan Paul, London.
  2. Scheffer, B. K., Rubenfeld, M. G. [2001] Critical thinking: What is it and how do we teach it? In Current Issues in Nursing, 6e, J. M. Dochterman and H. K. Grace (eds.), Mosby, St Louis.
  3. Baker, D. Taylor, P.C.S. [1995] The effect of culture on the learning of science in non‐western countries: the results of an integrated research review. Int. J. Sci. Educ. 17, 695-704.
  4. Bishop, R., Berryman, M. [2006] Culture speaks: cultural relationships and classroom learning. Huia. Wellington, NZ.
  5. Biggs, J. B. [2003] Teaching for Quality Learning at University, 2e. Open University Press, Buckingham.
  6. Sloan & McGinnis (1978); Russell & Wei (2004); Powers et al. (1994); Mogey (2010); Burke & Cizek (2006); Russell & Haney (1997); Mogey et al. (2010), etc.

Using Multimodal Communications for Critical Thinking Assignments

Technology in Pedagogy, No. 10, August 2012
Written by Kiruthika Ragupathi

Recent pedagogical movements have demonstrated the value of mastering multiple literacies, asking students to become knowledgeable not only in their analysis of the written word, but also in other forms of visual media ranging from advertisements to photojournalism to cinema. However, while approaches to literacy have become increasingly “multimodal”, student outputs have remained largely “unimodal”, with the written word being privileged for its ability to convey a level of complexity supposedly outside the purview of other communication forms.

Research indicates that students who incorporate multimodal forms and approaches in their learning engage with the content better than those who employ traditional approaches, thereby enhancing their thinking and learning process. It is possible for students to convey critically engaged ideas through multimodal forms, says Dr Jasmine Nadua Trice, a lecturer in the Ideas and Exposition programme, a multidisciplinary critical thinking and writing programme at the National University of Singapore. Her background in Film and Media studies, her PhD in Communication and Culture, and her interest in teaching film studies, public speaking and film production led her to try out multimodal communications in her modules.

In this session, Dr Trice shared her experience teaching a General Education Module (GEM) that employs multimodal communications, focusing not on technology but on the content (Emergent Media) and, more importantly, on the multimodal forms that the assignments took. Using her class as a case study, she examined the potential usefulness of multimodal communications for undergraduate-level criticism, asking what kinds of critical pedagogies such an approach to student inquiry might enable.

Multimodal communications: An overview

Dr Trice provided a brief overview of multimodal forms of communication and highlighted some examples of scholarly work that inspired the proposal of a new course.

Multimodal communication uses a combination of written, audio and visual forms to convey an idea, and works in tandem with media literacy movements. Gunther R. Kress, a Professor of Semiotics & Education at the University of London, points out that "in this 'new media age' the screen has replaced the book as the dominant medium of communication and this dramatic change has made image, rather than writing, the center of communication." Multimodal literacy is therefore an established field, and it is apparent that critical ideas and academic analysis can be understood through multimodal forms in an undergraduate classroom.

Multimodal scholarship

In recent years, multimodal scholarship has been strongest in the fields of media studies and the digital humanities. Such scholarship sometimes takes the form of web-based or interactive texts, and at other times uses video essays or screencasts.

Dr Trice showcased some examples of scholarly work that were of particular interest to her:

  • Vectors, Journal of Culture and Technology in a Dynamic Vernacular, USC: “Vectors is realized in multimedia, melding form and content to enact a second-order examination of the mediation of everyday life.”
  • International Journal of Learning and Media, MIT: “Rich media contributions representing key research findings that exceed the boundaries of the printed page.”
  • Kairos,  A Journal of Rhetoric, Technology, & Pedagogy: “publish scholarship that examines digital and multimodal composing practices, promoting work that enacts its scholarly argument through rhetorical and innovative uses of new media.”
  • Alliance for Networking Visual Culture: “creates scholarly contexts for the use of digital media in film, media and visual studies.”

She then highlighted the examples she would use in the module, which her students would need to read or watch. These examples helped students visualize complex concepts and emphasized that creativity is the most important factor in using multimodal communications effectively.

  •  Alexandra Juhasz, Learning from YouTube (MIT Press, 2010):
    Learning from YouTube, the first video-book to be published, investigates its questions through a series of more than 200 texts and videos, also known as “texteos.” This video-book, an example of the web-based or interactive text mode, integrates news clips from Dr Juhasz’s interviews with CNN, her book, and the assignments created by her students when she taught the module on “Learning from YouTube”. Students in her class used YouTube as the medium for their assignments.

Dr Trice highlighted that when she tried using this as an example in her module, the students were disconcerted by the format, and a steep learning curve was required for them to use such interactive text. Hence, she said, this reading was avoided in the current semester.

  • Richard Langley, “American Un-Frontiers: Universality and Apocalypse Blockbusters”: 
    This example showcases the use of visual elements and text in a video essay to underscore the idea the author is getting across. It integrates icons, text and archival footage in interesting ways, using a screencast method that presents the video essay in a linear fashion. (http://vimeo.com/32288942)
  • David Gauntlett, “Making is Connecting” (www.artlab.org.uk / www.theory.org.uk)
    An example of what one can do with basic screencast software (http://www.youtube.com/watch?v=nF4OBfVQmCI&feature=relmfu)
    This example highlights: 

    • the casual tone used when the oral voice-over is added, showing that a casual tone does not make the video any less professional and, more importantly, that the tone has to match the medium;
    • the use of basic information design to present ideas;
    • the provision of context for what is being discussed;
    • a literature review that cites the secondary sources used; and
    • the visualization of quotes from others.
  • Images from graphic design books, e.g., on visualizing content: Corriette Schoenaerts’ fashion spread on countries and borders (“Europe”), in Robert Klanten et al. (Eds.), Data Flow: Visualizing Information in Graphic Design (Berlin: Gestalten, 2010), p. 189; Christoph Niemann, “Sleep Agony Chart”, in the same volume, p. 107; and C.P.G. Grey, “The True Cost of the Royal Family Explained”.

Dr Trice emphasized to her class that the content, conceptual and practical skills learned in the module would be integrated in producing the assignments. She reassured her students that they did not need high levels of technical skill to do well in the module; what mattered most was CREATIVITY.

Proposal for a new module using multimodal communications

A workshop at CDTL that introduced her to the screencasting options available at NUS (Camtasia Relay and Ink2Go), together with her background in film and media studies and her interest in teaching film studies, public speaking and film production, inspired her to propose the new module “Emergent Media and Multimodal Communications”. This enabled her to combine all her interests and explore a more productive approach to teaching. The module was developed to provide students with a broad understanding of transitional media and culture, not only through engagement with module content, but also through developing written, oral, and visual communication strategies.

To get her students to understand and better appreciate the use of multimodality in the module, she introduced the idea of multimodality to her students by probing them on:

  • what the idea of “modality” entails,
  • what the idea of “multimodality” entails,
  • how different is multimodal from unimodal communications, and
  • how the written word is still the dominant mode employed in most university assignments.

She briefed her class on how things would be different in this module, where they would produce assignments that employ different forms of modality. On the first day of class, students were also encouraged to contemplate what they would gain and/or lose when moving from the written mode to a multimodal approach to critical ideas, and to reflect on whether it was possible for them to convey critical ideas and academic analysis through multimodal forms. She emphasized critical thinking, asking whether it was possible to convey ideas that are critically engaged and analytically rigorous using images, audio and the written or spoken word.

The module had three units, each culminating in an assignment that required students to use one or more of the written, oral, or visual communicative modes. The assignment tasks were designed to cultivate a practical comprehension of media by having students convey ideas about class content using multiple forms of communication, both residual and emergent. The tasks enabled her students to:

  • combine video, still images, audio, and text to convey complex, academic investigations in a clear and creative manner, and
  • convey critical ideas in an unconventional form.

However, she also emphasized that the main focus of the assignments was on thinking about the ideas conveyed in the screencast video essays, and not on the technology itself.

The first assignment was a multimodal essay, posted on the class Facebook page with peers providing reviews and comments. The second assignment involved screencast videos, and the final assignment was a group oral presentation.

Assignment 1 – Multimodal Essay:

The multimodal essay assignment did not test multimedia skills; it focused on how students used the visual elements of the essay. Students were advised against using pre-made or readily available templates, as it was important to create something original that is visually and aesthetically compelling. The output was a 575–600 word multimodal essay. Three-quarters of the grade was for content analysis and how students visualized theoretical concepts (75% for analysis, 15% for multimodal aspects and 10% for writing style and structure). All students were required to post their assignments on a class Facebook page (assignments were uploaded to SCRIBD, an online PDF environment), accompanied by an explanation of why they took a certain approach. This allowed students to justify the visual process or approach taken. Their classmates were then required to comment on and critique their peers’ work. Dr Trice felt that this was extremely helpful for understanding the students’ thought processes, particularly when the execution was difficult to interpret.

A common approach was to depict an evolution (e.g., from the book to the iPad). One student used a newspaper format and provided a wider context with the use of news splashes. Dr Trice then showed some samples of her students’ work.

Assignment 2 – Screencast/ Video essay

Students created screencast videos or video essays, each a clip of about six minutes. Again, the grading focused on analysis, with 75% of the marks assigned for content and 25% for the multimodal aspects. Dr Trice briefed students and showed samples of how their video essays should address multimodal scholarship and information design: the use of video, moving and still images, and slides; editing and juxtaposition; voice-over narration; on-screen text and symbols; and the use of music.

Students produced a variety of video essays: videos with no voice-over and hence text-heavy slides, videos with interesting use of on-screen text, good visualization of core ideas, and visuals inspired by the RSAnimate series.

Assignment 3: Group oral presentation

The focus of the oral presentations was on: visual aids employed in the presentation; audiences and informative strategies; the vocal and physical modes of delivery; and on preparing for questions.

Assessment/Grading criteria

Overall, the assignments were assessed based on the following grading criteria:

Analysis (75%)

  • Demonstrates a clear understanding of class readings
  • Assesses and applies these ideas to other authors or to the student’s own thoughts and examples
  • Clearly organized, with an introduction, transitions, and a conclusion
  • Flows smoothly, building the analysis with each section

Multimodal aspects (25%): composition, visual components, editing and transitions, voice

  • Demonstrates an understanding of the multimodal principles studied in class
  • Uses these principles in creative and compelling ways to support the overall analysis
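The weighting above amounts to a simple weighted average of the two component scores. As a minimal sketch (not a tool Dr Trice described; the function name and the example scores are hypothetical), a composite mark under this 75/25 scheme could be computed as:

```python
def weighted_grade(analysis: float, multimodal: float) -> float:
    """Combine component scores (each out of 100) using the 75/25 weighting."""
    weights = {"analysis": 0.75, "multimodal": 0.25}
    return weights["analysis"] * analysis + weights["multimodal"] * multimodal

# Example: strong analysis (85/100) with a weaker multimodal execution (70/100)
print(weighted_grade(85, 70))  # → 81.25
```

Because analysis carries three times the weight of the multimodal aspects, a student with strong content but modest production values still scores well, which reflects the stated emphasis on ideas over technology.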

Pedagogical potentials of multimodal communications

  1. Enrich and empower student learning. Providing learners an opportunity to create a shared representation of language (in textual, visual and auditory forms) proves to be cognitively and pedagogically valuable. Using multimodal communication in their assignments helps students transfer ideas from writing into multiple ways of communicating, offering them greater opportunities for meaning-making. It helps them convey their ideas in critically engaged and analytically rigorous ways. With communication practices changing, the use of multimodality in assignments also enables students to enter the workplace confident of their own potential.
  2. Engage peers and promote reflection. The multimodal components provide a greater opportunity for students to engage with their peers, as they allow students to present their arguments in multiple ways through written, spoken, and visual texts. When students view, comment on and critique the work of their peers, it aids reflection after the assignment task and promotes overall learning. These activities appeal to students’ interests and motivate them to be engaged learners.
  3. Enhance writing and communication skills. Creating the multimodal essays, video essays and screencasts helped students hone their writing and communication skills.

Reflections and future directions

Dr Trice reflected on the planning of the assignments and indicated that she would change the way she conducted the oral presentation assignment, and would consider other criteria for assessing multimodal forms based on the work of Ball (2012). Ball identifies items to consider when assessing such multimodal assignments, which students could also use when developing their own assignments and when peer reviewing others’ work. These include: (i) the project’s structural or formal elements must serve its conceptual core; (ii) the design decisions made must be deliberative, controlled, and defensible; (iii) the project should have distinguishable and significant goals that differ from what could be achieved on paper; (iv) the design should enact the argument; and (v) importantly, students should have thought of a visual metaphor for the argument.

Summary of Feedback/ Suggestions from the Discussion

Dr Trice welcomed ideas and examples of how participants had employed multimodality in their own classrooms. A lively discussion followed, in which participants discussed:

  • To what degree is it possible for undergraduates to convey critical analysis through multimodal forms?
  • How to develop class content and/or assignments that would allow students to employ a multimodal approach?
  • How should the grading criteria be designed to effectively assess such multimodal forms of assignments?

Other participants had used such multimodal forms of assignments in their modules. They agreed with Dr Trice that students’ technical skill was never a problem, as it was “super easy” to edit and create movies with tools such as Final Cut Pro, Windows Movie Maker and Adobe Premiere. They also pointed out that when students start working in groups, they tend to help each other. One participant felt that once a student’s work is uploaded and a high bar is set, all the other students try to outdo each other, and in the process also teach each other.

Another participant indicated that in his module, students were free to choose their own platform based on what they were comfortable with: YouTube, video, a multimodal form or a written essay. In his experience, students submitting written essays tended to go deeper in their analysis. He used an assessment scheme that awarded 60% for content and 40% for presenting ideas, with two different sets of criteria for the multimodal form and the written essay. However, he found it difficult to apply two standards and felt that this might not be fair.

Q & A Session

Listed below are some questions from the subsequent Q & A session:

Q:  Did you have lessons that taught students the necessary technical skills for creating such assignments?
JT: The drag-and-drop editing needed to create these videos does not require any background in technology. I created a screencast tutorial; options include Camtasia, Ink2Go and iMovie. Students could also meet me for consultations if they needed help. Only the most enthusiastic students used the consultation sessions, and usually the focus was largely on content.
Q:  Do students with technical skills/technology background have an edge/advantage over the others?
JT: This was something I put a considerable amount of thought into, and it was the reason the grading criteria place a greater emphasis on content rather than on technology. I also had students share their preliminary sketches and discuss them with their peers before submitting the assignment. Having them work in groups and discuss what they were planning to do also helped, I think.
Q:  How do you measure if this new method is more effective than your old method?
JT: I don’t think it is very different from writing an essay; it is pretty similar structurally and in terms of the ideas students get across, particularly if they are doing a voice-over. It is interesting for collaboration, and students find it easier to watch a peer’s video than to read a peer’s essay. It is also interesting for public dissemination, making the materials available beyond the classroom, so it might be good to have a website in addition to the Facebook page. In terms of whether it is better for critical thinking, I think a lot of it is the same as writing essays, but this form is more novel, and students like it for its novelty. I also felt that students were seeing each other’s work and benefitting from it. I required them to comment on at least two of their peers’ works, but students generally went beyond that and commented on more, since the assignments appeared in their social space, on their Facebook timelines. I also discussed with them how to provide constructive critique and how they could improve their comments.
Q:  Do you spend time to talk to students about copyright and plagiarism (fair use of information)?
JT: During my first lesson, I talk to them about the fair use of information. I told them to keep the links to their essays private, as I was not sure whether all the references made were appropriate. I did not spend too much time on that aspect, as the presentations were not made public, and since this was mainly for educational purposes, I expect it falls under fair use. I also informed them that whatever they used could not be pre-fabricated and that the components they made had to be original. I gave them open-source and Creative Commons sites from which they could get images, photos and music. Students also needed to provide a works-cited page with the references and links. Personally, I would spend more time on this when I teach the course again.
Q:  If we want to incorporate this type of teaching, as a teacher what skills do I need to have?
JT: CDTL’s workshops on tools like Screencast, Breeze, Ink2Go and Movie Maker would be a good starting point. I also researched and explored online what one can do with screencasts. It is also important to gather a lot of examples, showcase them to the students and engage them in discussion during class. Most of these applications are intuitive.
Q:  Have you wondered how different a traditional essay by the same student would be? What implicit and explicit assumptions are more pronounced in multimodal forms versus a traditional essay? Are students prone to making assumptions when making a video, given the addition of music, tonality, songs, etc.? How do you contrast the two?
JT: That’s right; rather than explicitly spelling out exactly what they are trying to get across, they present you with some kind of multimodal image-and-sound combination and expect those to do the work, so the meaning is more ambiguous. This is something I have not looked at very rigorously and should. One of the things I try to do is include plenty of opportunities for students to discuss their design elements, and they need to include justifications for their decisions in terms of the multimodal form. Since for most of them this was a first attempt, most were reading scripts, probably from an essay format. There are probably ways to study whether there is a difference between the traditional essay and the video essay, and this is definitely something I would like to work on in the future.

 References

Ball, C. E. (2012). Assessing scholarly multimedia: A rhetorical genre studies approach. Technical Communication Quarterly, 21(1), 61–77.

Assess students’ prior knowledge to identify misconceptions

Activity type:  Feedback, Student Learning 

Tool(s) used:  LumiNUS Quiz, Readings, Weblinks

Description:
Prior knowledge is necessary for learning, so it is beneficial for faculty to assess their students’ prior knowledge of a subject before introducing it in class.

Using a simple quiz (LumiNUS Quiz) with multiple choice questions, faculty can quickly gauge their students’ knowledge level. For instance, you could have a short quiz at least two days before the lecture asking students to identify new concepts or distinguish between various new concepts in the assigned readings.

Provide support for assignments

Activity type: Assessment, Communication, Collaboration

Tool(s) used: LumiNUS Files, LumiNUS Forum

Description:
Week 1:
During the first week, the facilitator provides the class with an introduction and overview of the assignment, essay or proposal that is to be prepared by students.

Week 2: The students take this time to think through their proposals, while the facilitator helps them with their queries.

Week 3: Students submit their draft proposals through the discussion forum (LumiNUS Forum). Each student peer reviews at least three draft proposals submitted by their classmates. The facilitator also participates in the discussion and gives feedback, but allows time for other students to comment before intervening.

Week 7: All students submit sections 1, 2 and 3 through the Student Submissions area of LumiNUS Files. Students are paired up to review their partner’s work, and each student submits their peer review back into the Student Submissions section. The facilitator then reviews both the draft proposal and the peer feedback, and gives his own feedback, allowing students to further improve their work.

Week 13: Students submit their final assignment (a 1,000–2,000 word essay) through the Student Submissions section of LumiNUS Files.

In this way, students benefit greatly and are able to critically evaluate and improve their assignments effectively.