Students’ Peer Review Procedure on an English Proficiency Course: Reflecting to Move Forward

Timothy Luke GROOMBRIDGE and Deborah Ann CHOO
Centre for English Language Communication (CELC)

Timothy and Deborah reflect on the benefits and limitations of applying student peer review (PR) in their English for Academic Purposes proficiency course. In particular, they discuss how PR is applied in the course and examine the extent to which it adheres to principles offered in the literature.


Image by pch.vector on Freepik.

Groombridge, T. L., & Choo, D. A. (2024, March 26). Students’ peer review procedure on an English proficiency course: Reflecting to move forward. Teaching Connections.


While the overall efficacy of student peer review (PR), both from the perspective of the reviewer and the reviewee, is not in doubt (Liu & Hansen Edwards, 2018; Papadopoulos et al., 2017), the limitations of the procedure currently employed on an English for Academic Purposes proficiency course at NUS are worth exploring. In this context, peer review refers to students evaluating each other’s assignments and suggesting modifications. The aim of this brief reflection is to outline the procedure, to analyse the extent to which it adheres to principles offered in the literature, and to provide the rationale for further study of this key learning component.


Time, or the lack of it, is perhaps the main obstacle to improving the PR process. Course time has recently been reduced, which resulted in the axing of the in-person meetings that students often seemed to favour, and whose importance Liu and Hansen Edwards (2018) stress, as such meetings crucially allow clarification of written comments. However, this loss was somewhat offset by the fact that students, working in groups of three, had more time to read their allotted scripts—the first half of a 1000-word Problem-Solution essay—and to make suggestions following a set of “Yes”/“No” questions aligned with the grading rubric’s categories of “Content”, “Organisation”, and “Language”. Feedback was delivered asynchronously via e-mail, although some students reported that they had also shared their texts with each other via WhatsApp. While all students followed the procedure and reported that it had been a positive experience, a rigorous analysis is required to establish what sort of edits are being recommended and to what extent quality uptake is occurring, that is, to what extent student feedback literacy is being developed (Carless & Boud, 2018).


Several key issues are of note. Firstly, there is limited time for PR training for both students and teachers, although such training is regarded as essential in the research (Chang, 2016; Hyland & Hyland, 2019). Currently, teachers can give brief instructions in class before the PR tasks are undertaken, and students are meant to access instructional notes and videos independently. However, it is unlikely that everyone does this, and anecdotal comments from students suggest as much. Secondly, there is the matter of the checklist questions: the current list comprises 30 “Yes”/“No” questions, encouraging both global and local foci. It might be more effective, however, to reduce the number of questions (Chang, 2016), to make them predominantly open-ended, and to emphasise textual organisation over surface features (Liu & Hansen Edwards, 2018). Baker (2016) suggests that it is often local errors that less experienced students focus on, and such feedback is perhaps more relevant for fine-tuning writing just prior to final submission, which is not the case in the current context. Carless and Boud (2018) also recommend that teachers model effective PR, thus providing learners with much-needed examples of how organisational changes can be suggested and under what circumstances. (For an example of PR modelling, see Appendix.)


The overall impression of the PR process, from both the teachers’ and students’ perspectives, would appear to be mixed. Although students reported enjoying the procedure, and provisionally it would appear that some of the comments and suggestions made have had a beneficial effect on re-drafts, there seems little doubt that a more in-depth study is warranted to assess:

  1. the type of comments that are given and whether uptake is taking place.
  2. the quality and type of the uptake.
  3. the efficacy, or otherwise, of the check questions.
  4. the type and quantity of training that both learners and teachers receive/require.
  5. whether the grading rubric should be adjusted to reflect the process more accurately.


The main aim of any further study would be to improve the PR literacy of students and to move towards a more optimal model of PR (Crossman & Kite, 2012; Van den Berg et al., 2006). It could also go some way towards determining whether the PR that occurs on an initial proficiency course has any beneficial washback on the other courses that learners undertake at NUS. In addition, the concept of grading PR as a separate course component might be considered (Baker, 2016): several students reported that they would have spent considerably more time on the process had their efforts been more fully rewarded.


The preliminary findings outlined in this reflection warrant further examination and we are currently seeking Institutional Review Board (IRB) approval to conduct a study. Such approval would allow for a thorough analysis of the comments and scripts to gain a far clearer understanding of the strengths and weaknesses of PR, and would also make it possible to formally survey the learners to gain insights into their perceptions of the process.


Appendix. An Example of Effective Modelling of Peer Review.



Baker, K. M. (2016). Peer review as a strategy for improving students’ writing process. Active Learning in Higher Education, 17(3), 179–192.

Carless, D., & Boud, D. (2018). The development of student feedback literacy: Enabling uptake of feedback. Assessment & Evaluation in Higher Education, 43(8), 1315–1325.

Chang, C. Y. H. (2016). Two decades of research in L2 peer review. Journal of Writing Research, 8(1), 81-117.

Crossman, J. M., & Kite, S. L. (2012). Facilitating improved writing among students through directed peer review. Active Learning in Higher Education, 13(3), 219–229.

Hyland, K., & Hyland, F. (2019). Contexts and issues in feedback on L2 writing. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (pp. 1–22). Cambridge University Press.

Liu, J., & Hansen Edwards, J. G. (2018). Peer response in second language writing classrooms. University of Michigan Press.

Papadopoulos, P. M., Lagkas, T. D., & Demetriadis, S. N. (2017). Technology-enhanced peer review: Benefits and implications of providing multiple reviews. Journal of Educational Technology & Society, 20(3), 69-81.

Van den Berg, I., Admiraal, W., & Pilot, A. (2006). Peer assessment in university teaching: Evaluating seven course designs. Assessment & Evaluation in Higher Education, 31(1), 19–36.



Tim GROOMBRIDGE is a Lecturer with the Centre for English Language Communication (CELC). His research interests include the use of English in other subjects, particularly maths, and he has also explored summary writing skills in hybrid reading-to-write tasks.

Tim can be reached at


Deborah CHOO is an Instructor with CELC, and specialises in teaching the proficiency levels of English Academic Writing. Deborah has explored the use of Socratic questioning as a coaching tool for self-evaluation of students’ writing. For many years, she has served in the CELC’s English Assist and Writing Communication Hub programmes.

Deborah can be reached at

