The Potential Harm of Student-generated Feedback Surveys on Teaching

Rebekah WANIC and Nina POWELL
Department of Psychology, Faculty of Arts and Social Sciences (FASS)

Rebekah and Nina take us through the reasons why educators should adopt a cautionary approach towards student-generated surveys that are constructed outside of instructor-mediated guidance.

Wanic, R., & Powell, N. (2022, April 25). The potential harm of student-generated feedback surveys on teaching. Teaching Connections. https://blog.nus.edu.sg/teachingconnections/2022/04/25/the-potential-harm-of-student-generated-feedback-surveys-on-teaching/

 

Some of you may have received emails in which a student, or group of students, opens with a pleasant greeting and then shares collated feedback about student perceptions of the module or the broader curriculum. You may also have noticed that such emails are arriving earlier in the semester.

In most cases, these students write without ill intent. Rather, they are communicating their feelings and perceptions of the module, or of more specific aspects such as instruction, assigned readings, or assignments.

Well intentioned or not, such student-generated feedback may be inappropriate and harmful both to students themselves and to larger educational aims. Below, we outline several reasons why it is problematic for students to create their own satisfaction surveys and module feedback questionnaires, and why educators should take a cautionary approach to student-led feedback processes.

 

Should Students Have Control of the Classroom?

First, educators, not students, are in charge of module design, content, evaluative criteria, and the structure of class sessions. As such, they hold the authority to determine how the course is best structured and delivered, and which assignments, readings, and experiences are best suited to meeting the module’s educational objectives. A student feedback survey on course content is at most a self-report or indirect measure of perceived teaching quality. Each individual student has their own preferences: the teaching style they believe they respond to best, the content they would like to cover, and the types of assignments they prefer to complete. This makes it impossible to satisfy all students simultaneously, as a choice of one method or mode over another will necessarily leave some individuals dissatisfied.

Furthermore, students lack the discipline-specific knowledge and pedagogical training needed to determine module content and structure. While we agree that sharing end-of-semester feedback1 can be useful for educators considering future module-related modifications, research demonstrates that student preference (Daniel & Woody, 2010) and perception (Baron, 2021) are not always aligned with what actually facilitates learning. Educators versed in the pedagogical evidence should therefore decide on appropriate course adjustments only after careful consideration, and not based solely on students’ perceptions.

 

Students Have No Training in Survey Construction

Student-constructed informal feedback surveys, typically covering teaching, module content, and assignments, invariably emphasise feelings and perceptions and demonstrate little knowledge of survey construction. It is not for nothing that most universities offer undergraduate and graduate courses in research and survey design. Furthermore, educators who design student feedback surveys do so to evaluate their larger educational goals and principles; satisfaction and feelings are not the main focus, as these are typically unhelpful in determining whether educational aims are being achieved.

As an illustrative example, one student-generated survey asked about workload with response options framed as “coping easily”, feeling “overwhelmed”, or being “totally lost”, while another solicited open-ended responses on how “the subject”, but not the module, could be better. Analysis of the question-and-answer options within these surveys highlights their amateurish elements: lack of clarity about precisely what is being assessed, double-barrelled wording, and lack of relevance to any educational aim. The interpretation and usefulness of such feedback are thus compromised.

Additionally evident is the problematic use of mental health language. A university education, with its curricula and modules, should be challenging, and this challenge should be expected and appreciated. Constantly couching the struggles of a demanding curriculum or difficult coursework in the language of mental health leads students to misinterpret ordinary stress as a sign of pathology and to become preoccupied with self-care and stress management. While self-care and stress management are useful skills to cultivate, students who are reinforced in the belief, through self- and university-generated surveys, that their satisfaction, enjoyment, and stress reduction are paramount concerns will be hampered in developing resilience and undermined in their quest to achieve the very things they desire.

Finally, feedback surveys are typically distributed anonymously. Without proper care in the choice and wording of questions, they can generate inappropriate comments and allow students to disguise accountability-free disrespect as feedback. Administrative changes, a general move toward increasing consumerism in education, and anonymous forums that allow for disembodied communication have all contributed to a growing sense of student entitlement coupled with a paucity of liability for improper conduct. A student-generated survey lacking careful construction opens the door to comments and insults that would not otherwise be shared in person, and that provide no useful information beyond, perhaps, evidence of students’ immaturity.

 

Conclusion

Many think that when students take initiative, such as constructing their own feedback survey, they should be rewarded and attended to. Here, however, we have outlined why this particular form of initiative is problematic and should be mediated by proper coaching. Student-generated surveys constructed without instructor-mediated guidance lack a focus on important educational objectives, place inordinate emphasis on feelings and short-termism, and show no evidence of quality design. It is imperative that educators with the relevant knowledge and experience do not position themselves as allies by promoting attention to students’ preferences and emotional enjoyment at the expense of providing the intellectual challenge and guidance that students signed up for. We must work harder to prevent students from engaging in destructive behaviour, and to structure feedback such that it promotes rather than detracts from educational aims and fosters rather than impedes the development of a resilience-enhancing mindset.

 


Rebekah WANIC is a Senior Lecturer in the NUS Department of Psychology, joining NUS in December 2020. She regularly teaches the core module PL3105 “Social Psychology” and the upper-level seminar PL4225 “Psychology of Gender” and previously taught over 15 different modules during her time in the US. Dr. Wanic is passionate about critically evaluating educational practice and co-facilitated a learning community with Dr. Powell to explore how pedagogical techniques and faculty behaviour can undermine or support student resilience.

Rebekah can be reached at psyraw@nus.edu.sg.


Nina POWELL is a Senior Lecturer in the NUS Department of Psychology and has been teaching in NUS since 2013. Her research focuses on judgment, certainty, and decision-making in children and adults, both in the context of moral decision-making and education. She teaches introductory psychology modules (PL1101E “Introduction to Psychology” and PL3234 “Developmental Psychology”) and upper-level seminar modules on Moral Psychology and Historical Controversies in Psychology. She is also Deputy Director of Undergraduate Studies in Psychology, a member of CAFÉ (Career Advancement for FASS Educators), and the co-founder of MADE in Psych (Mentoring and Demystifying Education in Psychology).

Nina can be reached at nina.powell@nus.edu.sg.

 

Endnote

  1. The feedback can include what they would have liked to learn more about, what topics were the most challenging, and/or what techniques or activities they felt were beneficial to their learning.

 

References

Baron, N. S. (2021, May 3). Why we remember more by reading – especially print – than from audio or video. The Conversation. https://theconversation.com/why-we-remember-more-by-reading-especially-print-than-from-audio-or-video-159522

Daniel, D. B., & Woody, W. D. (2010). They hear, but do not listen: Retention for podcasted material in a classroom context. Teaching of Psychology, 37(3), 199-203. https://doi.org/10.1080/00986283.2010.488542

 
