DIGITAL TOOLS FOR PROMOTING SOCIAL READING by Tamara TATE & Mark WARSCHAUER

Abstract

We know that students benefit from social learning: collaboration can help students process and understand new information, see different perspectives, and create a community of learners. However, reading in undergraduate classes is often a solitary activity. Annotation has long been known as a way to support reflective reading. Recently available social e-reading tools now allow a whole class of students to collectively annotate the same document. During social annotation, students simultaneously access and mark digital texts and leave comments and questions for each other, creating an anchored context for conversation. Social annotation improves learning by supporting self-regulation, increasing engagement, providing scaffolding for improved reading comprehension, and promoting deep thinking. While useful for all students, social annotation may be especially helpful for English learners and other students not studying in their primary language, allowing them to access the same materials as peers with the provided scaffolding. These positive outcomes occur when social annotation is well integrated into instruction. When it is too complicated, confusing, or buggy to access and use, it can distract students and actually harm their language and literacy development. Research on the impact of social annotation on learning processes and outcomes will be synthesized, including implications for use of these tools in English language teaching. We will then discuss a case study of the use of social annotation throughout a graduate school course, along with suggestions for implementation.

Keywords: Reading, Social Annotation, English learners, Second language learning

INTRODUCTION

We know that students benefit from social learning. Collaboration can help students process and understand new information, see different perspectives, and create a community of learners (Tate & Warschauer, 2022; Mendenhall & Johnson, 2010). However, reading in undergraduate classes is often a solitary activity, done (or perhaps more often not done) prior to class. If students have not actively engaged with the reading materials prior to class time, the ensuing synchronous class time discussion can be unproductive, with the instructor forced to lecture on material that should have been learned prior to class or skip the planned content. What can instructors do to encourage and support student reading prior to class? Can students prepare asynchronously and digitally, reading in a reflective manner that is visible to the instructor?

One way readers interact with texts is through annotation. Using a pencil, pen, or perhaps post-it note, students have long used annotations to note important information, raise questions, and reflect on what they are reading. Unfortunately, these handwritten notes are clunky and hard to navigate, group, or otherwise consolidate. In addition, the notes are inaccessible to their classmates and instructors. No one is there to answer the question the student may have noted.  The inaccessibility of handwritten annotations has fostered an interest in digital annotations, which allow students and teachers to more easily engage in activities like chunking or commenting on text. But as we know, learning – and especially language learning – is more powerful when it is social. And that is the principle behind a recent set of tools for social annotation, which allow groups of learners and teachers to annotate texts together. There are a number of software products for social annotation, with the two best known being Perusall and Hypothes.is. We have used Perusall successfully in our own undergraduate and graduate courses, and also reviewed the literature on its use, particularly in second language classrooms. We will outline the current state of research on social annotation and present a case study of our use in a graduate course to inform others who may be interested in trying social annotation in their courses.

During social annotation, students simultaneously access and mark digital texts and leave comments and questions for each other (Morales et al., 2022; Figure 1). Annotating directly on a text creates what scholars call an anchored context for conversation (Brown & Croft, 2020). Students can ask about specific language issues or share personal responses right next to the portions of the text that they highlight. The instructor can also prime the discussion, either by raising questions for students to consider or by responding to student comments.

Figure 1. Example of annotations in Perusall

Below is an example of how we have used social annotation in one of our own graduate classes on literacy and technology. A student posted a comment on the text, and other students and the instructor then responded (Figure 2).

Figure 2.  Annotations in conversation with each other


Social annotation software also comes with a wide range of tools for instructors. Perusall, for example, has an automated student scoring mechanism that takes into account the quantity and quality of student postings, and also provides student confusion reports and student activity reports to faculty based on overall class comments. More broadly, social annotation software makes students’ thoughts and questions visible during the reading process, which can help instructors better understand and support their learning trajectories.
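
To make this concrete, the short Python sketch below illustrates the kind of weighted score such a mechanism might compute. It is a minimal, hypothetical sketch of our own, not Perusall’s actual algorithm; the factor names, weights, and targets are assumptions for illustration only.

# Hypothetical illustration only -- NOT Perusall's actual scoring algorithm.
# Factor names, weights, and targets below are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class StudentActivity:
    num_annotations: int    # comments/questions the student posted
    avg_quality: float      # rubric-style quality rating, 0.0-1.0
    minutes_reading: float  # reading time logged on the platform

def annotation_score(activity, target_annotations=7, target_minutes=60.0,
                     weights=(0.4, 0.4, 0.2)):
    """Blend quantity, quality, and reading time into a 0-100 score."""
    quantity = min(activity.num_annotations / target_annotations, 1.0)
    quality = max(0.0, min(activity.avg_quality, 1.0))
    time_on_text = min(activity.minutes_reading / target_minutes, 1.0)
    w_qty, w_qual, w_time = weights
    return 100 * (w_qty * quantity + w_qual * quality + w_time * time_on_text)

# Example: 5 annotations of middling quality and 45 minutes of logged reading.
print(round(annotation_score(StudentActivity(5, 0.7, 45.0)), 1))  # ~71.6

An instructor using a platform like this would still want to review scores by hand, since, as we discuss below, automated quality judgments are imperfect.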

RESEARCH BASIS

Social annotation has been shown to improve student learning (Cadiz et al., 2000; Nokelainen et al., 2003; Marshall & Brush, 2004; Ahren, 2005; Gupta et al., 2008; Robert, 2009; Su et al., 2010). Social annotation improves learning by supporting self-regulation, increasing engagement, providing scaffolding for improved reading comprehension, and promoting deep thinking. While useful for all students, social annotation may be especially helpful for English learners and other students not studying in their primary language, allowing them to access the same materials as peers with the provided scaffolding. We discuss each of these briefly, with a deeper discussion of scaffolding for reading comprehension in order to highlight ways in which social annotation may be of particular benefit to English learners or students reading in their non-primary language.

Self-Regulation

One way social annotation improves student learning is by increasing accountability for students’ reading, raising completion of assigned readings from reported baselines in which 60-80% of students do not read assigned texts to 90-95% completion prior to class (Miller et al., 2018). Social annotation tools may also increase the amount of time students spend reading (Miller et al., 2018), though we caution that this finding does not reflect offline reading time nor account for the quality of the time spent online. Online social annotation also has an advantage over discussion forums, which are often used to discuss class readings: the annotations are anchored in the text rather than drifting off topic as discussion threads often do. This may help students stay more grounded in the text and the task at hand (Sun & Gao, 2017).

Engagement

Instructors report substantial student motivation and engagement when using social annotation (d’Entremont & Eyking, 2021). Social annotation has even been seen to promote affective as well as cognitive engagement with texts (Traester et al., 2021). Social annotation makes the learning experience more interactive (Chen et al., 2014; Su et al., 2010) and creates a sense of community (Solmaz, 2020; Allred et al., 2020) and a relaxed pedagogical setting suitable for educational risk taking (Solmaz, 2021). One study found that “more than 60% of students said annotating helped them feel connected to their peers, and almost 70% said it increased the amount of interaction they had with others in the class” (Novak et al., 2012).

Scaffolding for Reading Comprehension

Reading comprehension abilities improve with the use of social annotations (Sun & Gao, 2017; Chang & Hsu, 2011; Chen et al., 2010; Chen et al., 2014) for both first and second language students (Sun & Gao, 2017). Social annotation distributes second language learners’ cognitive load (Blyth, 2014) and offers the opportunity for individualized reading support (Tseng et al., 2015).

Online annotations are often characterized as improving three levels of comprehension: surface-based, text-based, and situation-based (Tseng et al., 2015). Surface-based comprehension means students have sufficient grammatical knowledge and vocabulary to decode the meaning of the text. Text-based comprehension goes a layer deeper and occurs when students can not only decode the text but also reproduce its essential information. Finally, at the situation-based level, students can situate the textual information within other knowledge and integrate it in a coherent manner (Tseng et al., 2015). An instructor can design annotation tasks that support each of these levels, and both instructors and peers can provide scaffolding through annotation of the text under study. For example, instructors can prompt students to summarize what they have read at various points in the text, students can add definitions for words they had to look up, and both instructors and peers can add background information that makes the text more understandable or suggest linkages to other class content (d’Entremont & Eyking, 2021).

Social annotation not only provides an interactive reading context for students with support for comprehension of the specific passage under study, but also opportunities to model and practice effective reading strategies (Bahari et al., 2021). Students using social annotation have improved their use of reading strategies (Chen & Chen, 2014). Collaborative annotation also provides students with the opportunity to draw others’ attention to specific content; organize, index, and discuss new information; and correct misunderstandings (d’Entremont & Eyking, 2021; Razon et al., 2012).

In one example, Solmaz (2021) carried out a study of social annotation in an English class of college students in Turkey. By analyzing students’ annotations, he found that digital social reading practices facilitated contextual affordances such as asking comprehension questions, expanding content through questions, integrating knowledge from other sources, exploring additional information, activating background knowledge, and providing additional contextual information. Text-to-text connections are a powerful practice for developing contextual knowledge (Adams & Wilson, 2020). The linguistic affordances of digital collaborative reading practices supported skills in areas such as reading, vocabulary, grammar, and writing (Solmaz, 2021). Specifically, students could emphasize the function of a grammatical structure, share grammar-related notes, ask structure-centered questions, provide vocabulary-related explanations, ask questions about lexical context, and add multimedia representing lexical items.

Promoting Deep Thinking

Social annotation promotes deeper thinking by encouraging students to actively reflect on the text as they annotate. Social annotation supports reinforcement of existing knowledge and the contextual integration of new knowledge (Glover et al., 2007). It helps students build knowledge in new domains by making and drawing on connections within texts together with their classmates and instructor. The act of annotation prompts readers to weigh the relevance and merit of their thoughts before sharing them with their instructor and peers (Glover et al., 2007). It also allows students time and space to consider rhetorical choices, reflect, think, and gather evidence prior to engaging in a discussion of the text (Chen & Chiu, 2008). In addition, the social affordances of digital collaborative practices include the recognition of multiple perspectives (Solmaz, 2021), which in turn scaffolds more critical thinking. Students report deeper learning than in typical class discussions of reading (d’Entremont & Eyking, 2021).

And a Word of Warning

All of these positive affordances have been reported multiple times, and we have noted them in our own instruction. However, a word of warning is in order. These positive outcomes occur when social annotation is well integrated into instruction. When it is too complicated, confusing, or buggy to access and use, it can distract students and actually harm their language and literacy development (Archibald, 2010). It is important for instructors to be intentional about which digital tools they implement in courses and when, since each tool has a learning curve. Thus, “we recommend purposeful implementation based on the accessibility of the technology, how effectively it addresses specific learning goals, and how well its intended purposes fit the needs of the students” (Allred et al., 2020, p. 238). Indeed, students report that the annotations take a great deal of time (d’Entremont & Eyking, 2021), and significant instructor time is required for integration. A final caveat: social annotation may only work when it is used in a collaborative mode rather than individually, so the context in which it is employed is of critical importance (Johnson et al., 2010; Hwang et al., 2007).

CASE STUDY

We used social annotation at a large research university in the southwestern United States.  Our setting was a graduate course in literacy and technology taught by the second author with learning assistant support from the first author. The course was presented in hyflex mode, with students having the option of participating in person or online, due to the ongoing pandemic. Each week students had three to four readings on the week’s topic, generally research articles.  The class used the Canvas learning management system to host all course content.

The authors were interested in using social annotation for the course and investigated multiple platforms in the months prior to its start. Perusall was selected because it was integrated with the Canvas learning platform, had been successfully used by other faculty members at the institution who offered suggestions and support, was free, and allowed annotation on uploaded PDFs of articles, so that all readings could be made available on the platform without charge to students.

The authors explained in the first course announcement that they would be using social annotation and uploaded a screencast explaining how students would access and use the platform. The use of social annotation was positioned as an experiment, and student feedback was encouraged. Indeed, use of the tool was iteratively revised over the course of the quarter based on that feedback. For example, after the first week, the instructor moved the reading deadline to the night before class so that the annotations could be reviewed prior to class, because they had proven so useful for creating engaging, deep class discussions. Students were also no longer split into smaller groups for commenting; they preferred to have the entire class in the same group so they could see one another’s comments. Although we had read that groups of more than five commenting on readings could become chaotic, we did not find this to be a problem in our class of approximately 10 people.

The instructor took on various roles during the quarter. Especially in the early weeks, one of the authors would post comments, suggest questions, and make linkages between readings for students to consider. See Figure 3 for an example of an instructor asking a specific question and students replying.

Figure 3.  Instructor and student discussion of text


In later weeks, less prompting by the instructors was needed, and students were independently annotating with comments like “This reminds me of a conversation we had in class about …,” in which they connected the text to an earlier class discussion. They also became confident in clarifying and refining each other’s annotations as time went on and in stating, “I’m a little confused by this…” Posts became more conversational over time, for example, “I was wondering the same thing!” One of our students even regularly posted memes as part of their annotations, adding some multimodality (and humor) to the readings.

One of the authors would also check in on the ongoing status of annotations during the week (generally in the 48 hours before readings were due) to answer questions, correct misconceptions, and become aware of topics of interest that could be addressed in the upcoming synchronous class. Prior to class, the instructor would find annotations that could serve as useful discussion starters, often copying the text of an annotation into the week’s slides and letting the student know that their annotation would be part of the class discussion. Figure 4 shows an example in which the instructor reviewed students’ comments and incorporated notes on them into her slide presentation so she could call on the students to elaborate further during class.

Figure 4. Class slide with excerpts from reading annotations to prompt student discussion in class

This process was time intensive, but it led to very rich discussions in which all students could participate. Even students who might prefer not to speak in class were more likely to expand on thoughts they had been able to put down in the privacy of their reading, without time constraints. We think the ability to think through their responses prior to class, coupled with the foreknowledge that they would be called on to elaborate, allowed students (including those who were multilingual or had learning disabilities) to participate fully in ways that cold-calling on them would not.

We do note that there are students for whom the social annotation process hampers their reading for a variety of reasons, and for such situations, alternative arrangements may be appropriate (e.g., a 1-page reading response discussing the week’s readings). For some students with reading disabilities, the additional textual input from their peers may be too distracting for them to process. A student with social anxiety may be uncomfortable showing their thinking to their peers in the annotations. In these situations, the instructor can consult with the student, and possibly the disability resources on campus, to determine how to best structure the students’ learning. We have not seen research at this point that looks at the use of social annotation for neurodiverse populations, so instructors will need to creatively and compassionately address these issues as they arise.

Grading of the social annotations each week was awkward. Perusall automatically scores annotations based on the quantity of annotations, their perceived quality, the amount of time recorded reading on the Perusall site, and other factors. These factors can be adjusted by the instructor, and the settings used by the authors were based on suggestions from other faculty members. However, since students were not required to read on Perusall (they could read offline and simply add annotations afterward) and the automated scoring judged quality imperfectly, the authors repeatedly assured students that reading grades would be adjusted at the end of the course based on student feedback. To gather this feedback, an assignment was created on Canvas giving each student the opportunity to submit a self-assessment of their reading and reflection in the course, assigning themselves a grade from 0 to 100. The instructor took these ratings into consideration and revised the automatic scores as deemed appropriate. Thirty percent of the students chose to provide feedback: one noted that a score for one week had not been corrected on Canvas but that Perusall was “good overall,” while others explained why they deserved higher grades and expressed appreciation for the opportunity to provide feedback. We recommend carefully considering how to give appropriate credit for the work involved in online social annotation and taking care not to demotivate students with excessively narrow grading requirements. As noted in a recent opinion piece,

An engaging technology-aided activity guided by an instructor might feel like an invasion of privacy for students hesitant to make private thoughts public. An instructor encouraging open-mindedness can make students feel like they’re being watched. An over-reliance on technology assessing participation can make students feel like homework compliance is valued more highly than comprehension and the substance of their responses. (Cohn & Kalir, 2022)

CONCLUSION

If deep reading of a select set of texts is necessary in a course (as compared to skimming a large number of texts), instructors should consider the use of social annotation. Social annotation can help level the playing field for students who will be reading in a language other than their primary language. It may also support students with learning disabilities or provide important interactivity and social connection in an online course. We found that active use of social annotation by instructors and students greatly enhanced the quality and breadth of in-class discussions. However, instructors should select their platform with care to ensure that it is easy to use and well integrated with their course platform. They need to provide clear instructions and ensure that all students are comfortably able to interact with the social annotation tool. Course readings should be reduced in number to account for the additional time and effort required to read a limited number of texts more thoroughly. The instructor should plan to use the annotations actively, so that students are not shouting into a void but rather have their voices amplified and discussed in class, or seen and discussed by the instructor in the annotations. Students should receive appropriate credit in the grading scheme for their work reading the texts, and the grading rubric should be transparent.

In summary, social annotation is similar to many other educational technologies we have researched (see, e.g., Grimes & Warschauer, 2010; Tate et al., 2019; Warschauer, 1996; Warschauer, 2011) – not a magic bullet to transform learning, but a valuable tool if effectively integrated into instruction. Educators who design learning experiences that build on both the affordances of the technology and the context of their classes can expect corresponding benefits for their students in both engagement and learning.

Acknowledgment: This paper draws on the keynote presentation made by the second author on social reading at the 6th CELC (e)Symposium, May 30, 2022.

REFERENCES

Adams, B., & Wilson, N. S. (2020). Building community in asynchronous online higher education courses through collaborative annotation. Journal of Educational Technology Systems, 49(2).

Ahren, T. C. (2005). Using online annotation software to provide timely feedback in an introductory programming course. Paper Presented at the 35th ASEE/IEEE Frontiers in Education Conference, Indianapolis, IN. Available at: http://www.icee.usm.edu/icee/conferences/FIEC2005/papers/1696.pdf

Allred, J., Hochstetler, S., & Goering, C. (2020). “I love this insight, Mary Kate!”: Social annotation across two ELA methods classes. Contemporary Issues in Technology and Teacher Education, 20(2), 230-241.

Archibald, T. N. (2010). The effect of the integration of social annotation technology, first principles of instruction, and team-based learning on students’ reading comprehension, critical thinking, and meta-cognitive skills. The Florida State University.

Bahari, A., Zhang, X., & Ardasheva, Y. (2021). Establishing a computer-assisted interactive reading model. Computers & Education, 172, 104261.

Blyth, C. (2014). Exploring the affordances of digital social reading for L2 literacy: The case of eComma. Digital Literacies in Foreign and Second Language Education, 12, 201-226.

Brown, M., & Croft, B. (2020). Social annotation and an inclusive praxis for open pedagogy in the college classroom. Journal of Interactive Media in Education, 2020(1), 8. DOI: http://doi.org/10.5334/jime.561

Cadiz, J. J., Gupta, A., & Grudin, J. (2000). Using web annotations for asynchronous collaboration around documents, in Proceedings of CSCW 00: The 2000 ACM Conference on Computer Supported Cooperative Work (Philadelphia, PA: ACM), 309–318.

Chang, C. K., & Hsu, C. K. (2011). A mobile-assisted synchronously collaborative translation–annotation system for English as a foreign language (EFL) reading comprehension. Computer Assisted Language Learning, 24(2), 155-180.

Chen, C. M., & Chen, F. Y. (2014). Enhancing digital reading performance with a collaborative reading annotation system. Computers & Education, 77, 67-81.

Chen, C. M., Wang, J. Y., & Chen, Y. C. (2014). Facilitating English-language reading performance by a digital reading annotation system with self-regulated learning mechanisms. Journal of Educational Technology & Society, 17(1), 102-114.

Chen, G., & Chiu, M. M. (2008). Online discussion processes: Effects of earlier messages’ evaluations, knowledge content, social cues and personal information on later messages. Computers & Education, 50(3), 678-692.

Chen, J. M., Chen, M. C., & Sun, Y. S. (2010). A novel approach for enhancing student reading comprehension and assisting teacher assessment of literacy. Computers & Education, 55(3), 1367-1382.

Cohn, J., & Kalir, R. (2022, April 11). Opinion:  Why we need a socially responsible approach to “social reading.” The Hechinger Report. Retrieved from https://hechingerreport.org/opinion-why-we-need-a-socially-responsible-approach-to-social-reading/

d’Entremont, A. G., & Eyking, A. (2021). Student and instructor experience using collaborative annotation via Perusall in upper year and graduate courses. Proceedings of the Canadian Engineering Education Association (CEEA-ACEG) Conference.

Glover, I., Xu, Z., & Hardaker, G. (2007). Online annotation–Research and practices. Computers & Education, 49(4), 1308-1320.

Grimes, D., & Warschauer, M. (2010). Utility in a fallible tool: A multi-site case study of automated writing evaluation. Journal of Technology, Learning, and Assessment, 8(6), 1-43.

Hwang, W. Y., Wang, C. Y., & Sharples, M. (2007). A study of multimedia annotation of Web-based materials. Computers & Education, 48(4), 680-699.

Johnson, T. E., Archibald, T. N., & Tenenbaum, G. (2010). Individual and team annotation effects on students’ reading comprehension, critical thinking, and meta-cognitive skills. Computers in Human Behavior, 26(6), 1496-1507.

Marshall, C. C., & Brush, A. J. B. (2004). Exploring the relationship between personal and public annotations. Proceedings of JCDL’04: The 2004 ACE/IEEE Conference on Digital Libraries (Tucson, AZ), 349–357.

Mendenhall, A., & Johnson, T. E. (2010). Fostering the development of critical thinking skills, and reading comprehension of undergraduates using a Web 2.0 tool coupled with a learning system. Interactive Learning Environments, 18(3), 263-276.

Miller, K., Lukoff, B., King, G., & Mazur, E. (2018). Use of a social annotation platform for pre-class reading assignments in a flipped introductory physics class. Frontiers in Education, 2018(3). doi: 10.3389/feduc.2018.00008

Morales, E., Kalir, J. H., Fleerackers, A., & Alperin, J. P. (2022). Using social annotation to construct knowledge with others: A case study across undergraduate courses [version 2; peer review: 2 approved]. F1000Research, 11:235. https://doi.org/10.12688/f1000research.109525.2

Nokelainen, P., Kurhila, J., Miettinen, M., Floreen, P., & Tirri, H. (2003). Evaluating the role of a shared document-based annotation tool in learner-centered collaborative learning. Proceedings of ICALT’03: the 3rd IEEE International Conference on Advanced Learning Technologies (Athens), 200–203.

Novak, E., Razzouk, R., & Johnson, T. E. (2012). The educational use of social annotation tools in higher education: A literature review. The Internet and Higher Education, 15(1), 39-49.

Razon, S., Turner, J., Johnson, T. E., Arsal, G., & Tenenbaum, G. (2012). Effects of a collaborative annotation method on students’ learning and learning-related motivation and affect. Computers in Human Behavior, 28(2), 350-359.

Robert, C. A. (2009). Annotation for knowledge sharing in a collaborative environment. Journal of Knowledge Management, 13, 111–119. doi:10.1108/13673270910931206

Solmaz, O. (2020). The nature and potential of digital collaborative reading practices for developing English as a foreign language. International Online Journal of Education and Teaching (IOJET), 7(4), 1283-1298. http://iojet.org/index.php/IOJET/article/view/763

Solmaz, O. (2021). The affordances of digital social reading for EFL learners: An ecological perspective. International Journal of Mobile and Blended Learning, 13(2).

Su, A. Y. S., Yang, S. H., Hwang, W. Y., & Zhang, J. (2010). A web 2.0-based collaborative annotation system for enhancing knowledge sharing in collaborative learning environments. Computer Education, 55, 752–766. doi:10.1016/j.compedu.2010.03.008

Sun, Y., & Gao, F. (2017). Comparing the use of a social annotation tool and a threaded discussion forum to support online discussions. The Internet and Higher Education, 32, 72-79.

Tate, T., & Warschauer, M. (2022). Equity in online learning. Educational Psychologist. DOI: 10.1080/00461520.2022.2062597

Tate, T. P., Collins, P., Xu, Y., Yau, J. C., Krishnan, J., Prado, Y., Farkas, G., & Warschauer, M. (2019). Visual-syntactic text format: Improving adolescent literacy. Scientific Studies of Reading, 23(4), 287-304.

Traester, M., Kervina, C., & Brathwaite, N. H. (2021). Pedagogy to disrupt the echo chamber: Digital annotation as critical community to promote active reading. Pedagogy, 21(2), 329-349.

Tseng, S. S., Yeh, H. C., & Yang, S. H. (2015). Promoting different reading comprehension levels through online annotations. Computer Assisted Language Learning, 28(1), 41-57.

Warschauer, M. (1996).  Motivational aspects of using computers for writing and communication.  In Warschauer, M. (Ed.), Telecollaboration in foreign language learning (pp. 29-46). Honolulu, HI: University of Hawai’i Second Language Teaching and Curriculum Center.

Warschauer, M. (2011). Learning in the cloud: How (and why) to transform schools with technology.  New York: Teachers College Press.

Tamara TATE is a Project Scientist at the University of California, Irvine, and Assistant Director of the Digital Learning Lab. She leads the Lab’s work on Investigating Digital Equity and Achievement (IDEA), partnering with school districts, universities, nonprofit organizations, media and tech developers, and others in iterative development and evaluation of digital and online tools to support teaching and learning. She received her B.A. in English and her Ph.D. in Education at U.C. Irvine and her J.D. at U.C. Berkeley.
Mark WARSCHAUER is Professor of Education at the University of California, Irvine, where he directs the Digital Learning Lab. Professor Warschauer has made foundational contributions to language learning through his pioneering research on computer-mediated communication, online learning, technology and literacy, laptop classrooms, the digital divide, automated writing evaluation, visual-syntactic text formatting, and, most recently, conversational agents for learning. His dozen books and more than 200 papers have been cited more than 40,000 times, making him one of the most influential researchers in the world in the area of digital learning. He is a fellow of the American Educational Research Association and a member of the National Academy of Education.
