Read-Only Participants Summary


Description

Read "Read-Only Participants: A Case for Student Communication in Online Classes" by Nagel, Blignaut, and Cronje, which is located in the e-Library Resource section of the Topic 1 materials.

After reading the Nagel, Blignaut, and Cronje article, write a 250-500-word summary of it.



Interactive Learning Environments, Vol. 17, No. 1, March 2009, 37-51
ISSN: 1049-4820 (Print) 1744-5191 (Online)
Journal homepage: https://www.tandfonline.com/loi/nile20

Read-only participants: a case for student communication in online classes

L. Nagel (University of Pretoria, South Africa), A. S. Blignaut (North-West University, South Africa) and J. C. Cronjé (Cape Peninsula University of Technology, South Africa)

To cite this article: L. Nagel, A. S. Blignaut & J. C. Cronjé (2009) Read-only participants: a case for student communication in online classes, Interactive Learning Environments, 17:1, 37-51, DOI: 10.1080/10494820701501028
Published online: 13 Mar 2009.

(Received 5 April 2007; final version received 25 May 2007)

The establishment of an online community is widely held as the most important prerequisite for successful course completion and depends on an interaction between a peer group and a facilitator. Beaudoin reasoned that online students sometimes engage and learn even when not taking part in online discussions. The context of this study was an online course on web-based education for a Masters degree in computer-integrated education at the University of Pretoria. We used a mixed methodology approach to investigate how online activity and discussion postings relate to learning and course completion. We also investigated how student collaborative behaviour and integration into the community related to success.
Although the quantitative indices measured showed highly significant differences between the stratifications of student performance, there were notable exceptions unexplained by the trends. The class harboured a well-functioning online learning community. We also uncovered the discontent students in the learning community felt for invisible students who were absent without reason from group assignments or who made shallow and insufficient contributions. Student online visibility and participation can take many forms, like read-only participants who skim over or deliberately harvest others' discussions. Other students can be highly visible without contributing. Students who anticipate limited access due to poor connectivity, high costs or other reasons can manage their log-in time effectively and gain maximum benefit. Absent and seldom contributing students risk forsaking the benefits of the virtual learning community. High quality contributions rather than quantity build trust among mature students. We suggest how to avoid read-only participation: communicate the required number of online classroom postings; encourage submission of high quality, thoughtful postings; grade discussions and give formative feedback; award individual grades for group projects and rotate members of groups; augment facilitator communication with Internet-independent media to convey important information. Read-only participants disrupt the formation of a virtual community of learners and compromise learning.

Keywords: higher education; web-based learning; participation; lurkers; virtual community of learners

*Corresponding author. Email: lynette.nagel@up.ac.za

Background

As more formal education courses are available online, quality and non-completion remain problems:

    While online course enrolments continue to climb, retention and success rates in such courses and programs are frequently reported as typically lower than those delivered in a traditional classroom format; those of us in roles that support online students have a role in reversing that trend! (Schreck, 2006)

Researchers often measure the success of online learning as students' perception of learning and course throughput rates. Drop-out rates for online courses range from 20 to 50%, often 10-20% higher than for equivalent contact courses (Bernard, Brauer, Abrami, & Surkes, 2004). Searching for a model to predict student success in online learning, Bernard et al. (2004) found that students' frame of mind can predict readiness for learning and affect course outcomes, while "prior achievement is still the best predictor of future achievement" (Bernard et al., 2004, p. 44).

Research shows that online participation is necessary to ensure successful course completion (Klemm, 1998; Rovai & Barnum, 2003; Swan, Shea, Fredericksen, Pickett, & Pelz, 2000). Clark and Feldon (2005) concluded that a facilitator who participates and interacts with students prevents them from abandoning their course. Better cognitive outcomes occur when students engage and form a virtual community of learners. The development of a community depends on students' online interaction with their peers and the facilitator. Learner satisfaction, perseverance, and cognitive outcomes characterize the formation of a virtual learning community.

Some contest participation as a prerequisite to learning, claiming students learn sufficiently by observation (Beaudoin, 2002; Sutton, 2001), and lobby for leniency towards lurking or read-only participation. This article responds to Beaudoin's (2002) article "Learning or lurking?
Tracking the 'invisible' online student." He reasoned that students sometimes engage and learn even when not taking part in online discussions with faculty and other students and showed that low profile students:

    spend a significant amount of time in learning-related tasks, including logging on, even when not visibly participating, and they feel they are still learning and benefiting from this low-profile approach to their online studies. (p. 147)

We investigated the importance of student online "visibility" apparent in the quantity and quality of participation. We explored as a case study the successful completion of a postgraduate online course by asking the following research questions.

(1) How did online participation relate to learning and successful course completion?
(2) How did participation influence the learning community?

Literature

The debate on online participation

Taking part in discussions

A learning management system (LMS) tracks progress and performance and reveals students who do not log in to their online classroom or who log in without participating. Klemm (1998) blamed classroom-based teaching, where students expect entertainment, for conditioning them to passive learning. Therefore, they seldom realize the benefits of participating actively in online discussions, naturally lurking. Well-facilitated online discussions can be more inclusive than classroom discussions by including introverted students and enabling better quality interaction (Cox, Carr, & Hall, 2004; Prammanee, 2003). Rovai and Barnum (2003) claimed that passive online learning through "listening" without participation produces no measurable increase in knowledge, as they could predict perceived learning through the number of messages posted.
Others have also reported that distributed students who participate in dynamic discussions had better course completion rates and that failing students interacted less frequently (Davies & Graff, 2005; Swan et al., 2000). Active online participation also benefits learning.

Improved learning

Deep cognitive learning (Prammanee, 2003) and high levels of interactivity are possible in online discussions, as students can prepare well-considered contributions (Kettner-Polley, 2005). According to Carr, Cox, Eden, and Hanslo (2004), students who focused on building knowledge and collaborative interactions had a superior average performance, as challenging online interactions promote understanding. Interactive learning provides an instructor with insight into student misconceptions, difficulties, conceptual problems, and verbal pitfalls. Asking leading questions elicits insights into what students understand, more than simply telling them the answer. Immediate feedback from their peers and instructors and social interaction built into the online discussions contribute to learning (Collins, Brown, & Holum, 1991). Collaborative learning activities contribute to deep learning, critical thinking skills, a shared understanding, and long-term retention (Garrison, Anderson, & Archer, 2001).

    Consistency in course design, interaction with course instructors, and active discussion have been consistently shown to significantly influence the success of online courses. It is posited that the reason for these findings relates to the importance of building community in online courses. (Swan et al., 2000, p. 513)

Community of learners

Interaction is conducive to the emergence of a community of practice (Collins et al., 1991) and a virtual community of learners (Collison, Elbaum, Haavind, & Tinker, 2000). Learning from your peers in a structured way can ameliorate the social isolation online students often experience (Boud, Cohen, & Sampson, 1999).
Collaborative learning groups solve problems while sharing and clarifying ideas (Cox et al., 2004). In a collaborative learning environment students develop critical thinking skills and a shared understanding and deep learning, while retaining learning over the long term (Garrison et al., 2001). In a community of practice novices learn from experts by observing authentic tasks and executing progressively more advanced tasks themselves under an expert eye (Johnson, C. S., 2001). Complex tasks can be learnt in a community of practice wherein "participants actively communicate about and engage in the skills involved in expertise" (Collins et al., 1991, p. 16).

Frequent, meaningful, valued, and dynamic discussions in an online course lead to the formation of a virtual learning community where students interact and support each other. According to Collison et al. (2000), members of a healthy online community of learners post regularly and collaborate with other participants, as well as teach and moderate the online discussions spontaneously. Group cohesion, trust, respect, and belonging further characterize a community of learning (Kreijns, Kirschner, & Jochems, 2003). The formation of a community cannot be taken for granted. Some students do not participate fully.

The case for read-only participation

Legitimate non-participation

Non-participation may initially be legitimate, as peripheral online learners make limited entrances into the community, remaining on the outskirts, observing the activities of more advanced participants and learning from them (Collins et al., 1991). Sutton (2001, p. 223) also reasons that "direct interaction is not necessary for all students and that those who observe and actively process interactions between others will benefit through the process of vicarious interaction." As students increase their expertise, they move from the periphery to the centre (Carr et al., 2004), with increasing visibility.
Beaudoin (2002) found that invisible students sometimes "spend a significant amount of time in learning-related tasks, including logging on, even when not visibly participating, and they feel they are still learning and benefiting from this low-profile approach to their online studies" (p. 147). Williams (2004) advocated using the term read-only participants (ROP) rather than the derogatory lurker for non-participatory students and vicarious interactors. He cautioned that while the ROPing students may be satisfied that their learning needs are met, they do not contribute to the larger community.

Inadvertent non-participation

Students do not actively participate in online discussions because they procrastinate, they feel isolated, or they're unfamiliar with the technology. They may also miss the course structure or control of discussions and therefore remain unconvinced of the course's benefits (Miller, Rainer, & Corley, 2003). Patterns of online participation and interaction can vary across cultural groups. In many developing countries the digital divide is increasing, due to an inadequate infrastructure and few Internet subscriptions (Roycroft & Anantho, 2003). The exclusive use of English in non-English speaking cultures, economic development, and available bandwidth also affect student success.

Facilitator participation

Student interaction is not the only factor influencing collaboration, learning, and successful course completion. Students become more involved in an online conference when the facilitator participates as guide, providing extensive critique, feedback, and encouragement (Collison et al., 2000). An effective learning community requires an instructor with integrated social, cognitive, and teaching presence (Cox et al., 2004). Facilitators should teach critical thinking, effective communication, and problem-solving skills (Shavelson & Huang, 2003).
The current vogue to embrace a constructivist pedagogy where the instructor withdraws from the online learning environment, allegedly to promote discovery and experimental learning activities, is unsubstantiated (Kirschner, Sweller, & Clark, 2006). Automated e-learning or a lurking instructor presents an even greater impediment to learning than do lurking students.

Context of this study

We presented an 8-week course on web-based distance learning to Masters students on a computer-integrated education course at the University of Pretoria. This was an elective course in a programme usually presented in blended contact and online mode. We delivered this course entirely online using the WebCT™ Campus Edition as the LMS. The delivery mode enabled enrolment of a diverse cohort of 22 geographically distributed students with ages ranging from nearly 30 to nearly 50. The student ages represent baby boomers and generation X (Oblinger, 2003).

The course followed a constructivist approach and consisted equally of theoretical and practical applications structured around eight salient online learning topics. Each week the students had to research online scholarly literature on the topic and post their contribution to the LMS discussions area, where they also posted peer reviews. Concurrently, students had to create web-based artefacts applying the theory. We provided formative feedback during the course and assessed students using integrated assessment of authentic tasks, focusing on outcomes. In the latter half of the course students also created two rounds of group assignments in teams of five to seven, as experience of collaborative online work was a course outcome. One of these was a rubric to score online collaborative behaviour, strongly taking into account their contributions to group assignments. Participating in discussions, replying to pleas for help and offering tips and advice completed the tally.
Students used this rubric to allocate a collaboration score for each student that contributed 10% to their year mark. The other 90% derived from research postings, web artefacts, peer review, and collaborative assessment. The final course grade also included their reflective examination essays, depicting their writing skills. Unlike Davies and Graff (2005), we did not use the final course grade as an indication of success. Instead, we used the ongoing year mark that reflected a wider spectrum of mastery and application.

We observed students' experiences with online learning through multiple windows. These consisted of their private blogs (only shared with the facilitator) for reflection and self-assessment, open paragraph questions included in an online quiz, a reflective essay, and feedback questions e-mailed to the students about one month after completion of the course. The facilitator also documented observations in a diary.

Methodology

The course presenters simultaneously conducted research, using a mixed methodology (Sharp & Fretchling, 1997). A qualitative methodology allowed us to probe the context of the non-participating students and the class's perceptions and reactions. We conducted content analysis using ATLAS.ti™ software on the following primary documents: students' blog postings, 1615 discussion posts, an online quiz, and examination essays. Representative quotes from student postings are in their original form, reflecting their use of English as a second language. We validated the findings against the facilitator's field notes and used multiple documents and perspectives. The researchers also facilitated the online course and, as participant observers, ensured the reliability of the findings. The student tracking tool in the LMS provided a quantitative view of student activity in the course, including the numbers of original postings and replies.
The WebCT Campus Edition student tracking tool maintains a record of the number of times a student accesses the various course areas. The term "hits" is defined in the WebCT help pages as "the number of times the student accessed the Homepage, a tool [including the items read or posted in discussions], or a content module page." We calculated each student's reply ratio by dividing the number of replies to others by their own original posts.

Table 1 ranks students according to their year mark and shows the students' numbers of hits in the LMS and discussion messages posted, their reply ratio, collaboration score, and whether they returned the voluntary post-course feedback. Unlike the rest, the collaboration score is a qualitative measurement obtained by using a rubric to assess each student's collaborative behaviour.

We represent student online activities using the assumptions of Davies and Graff (2005), who categorized students according to final course grades. Our grade categories reflected the assessment stratification used in South African Higher Education. One student abandoned the course very early, and we did not include this data. We stratified the rest of the class into three grade group categories: a Fail group for students who did not complete the entire course or achieved less than 50%; a Pass group of students who aggregated between 50% and 74%; and a Distinction group of those with 75% or more. One student (subject 6) changed categories after the final essay and passed the course. We used this stratification for all statistics.

Table 1. Summary of individual student grades and participation profile.

Subject no.   Year mark   Hits   Messages posted   Reply ratio   Collaboration score
 1            (a)          424     24              0.8            0
 2            (a)          244     14              1.8            0
 3            (a)         1161     30              0.4            2
 4            (a)         1706     50              0.9            6
 5            38.8         871     50              1.4            3
 6            48           223     10              0.1            0
 7            53          1406     68              1.5            3
 8            60.2         966     54              1.3            7
 9            60.9         776     30              1.0            8
10            61.6         844     36              1.3            5
11            63.1        1503     73              1.1            8
12            64          1758     58              1.5            3
13            66          1093     37              1.5            9
14            66.3        1487    104              2.7            9
15            70.2        1675     53              2.3            8
16            80          1810    126              3.2           10
17            80.3         963     43              1.0            8
18            80.9        1165     68              1.8            9
19            83.8        1226     68              2.0            9
20            85.4        1853    148              2.7           10
21            88.5        2980    112              1.7            9

(a) Student voluntarily abandoned the course before submitting the final examination essay. A further column, "Feedback submitted", marked the ten students who returned the voluntary post-course feedback.

Table 2. Average number of hits, posts and follow-up posts per student in grade groups.

Grade group    N   Hits       Posts      Reply ratio   Collaboration   Feedback (%)
Fail           6    771.5      30         1.06          2.2             17
Pass           9   1278.7      57         1.43          6               44
Distinction    6   1666.2      94         2.06          9.2             83
H value / χ²       H = 26.3   H = 34.5   H = 24.7      H = 52.8        χ² = 47
Significance       <.001      <.001      <.001         <.001           <.01

Figure 1. Average dimensions for each grade group. [Figure not reproduced in this preview.]

Like Davies and Graff (2005), we used the Kruskal-Wallis non-parametric test to investigate the significance of differences in online activities among these grade groups. We also calculated the significance of the difference in return rates of voluntary questions using χ² with two degrees of freedom, as shown in Table 2. Figure 1 shows a graphical representation of the values given in Table 2: the average value for each criterion for each of the grade groups.

Discussion

Student online visibility and learning success

Like Beaudoin (2002), we did some tracking of our "invisible" students, trying to pinpoint reasons for their invisibility, as well as its effects. We compared their online participation profiles and indicators of their integration into the virtual community with their success in completing the course.
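The group averages in Table 2 can be reproduced from the per-student values in Table 1, and the Kruskal-Wallis test the authors borrow from Davies and Graff (2005) is simple enough to compute without a statistics package. The sketch below is illustrative only: the subject-to-group assignment follows the year-mark ranking described above, and the H statistic it produces will differ from the published values, which were presumably computed on the authors' original, more detailed data.

```python
# Kruskal-Wallis H for the per-group "hits" column of Table 1 (illustrative sketch).
# Group assignment (Fail = subjects 1-6, Pass = 7-15, Distinction = 16-21)
# follows the year-mark ranking described in the article.

def kruskal_wallis(*groups):
    """Kruskal-Wallis H statistic; ties receive the average (mid) rank."""
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    rank = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1                     # j is one past the tie run
        rank[pooled[i]] = (i + 1 + j) / 2  # average of 1-based ranks i+1..j
        i = j
    return 12 / (n * (n + 1)) * sum(
        sum(rank[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)

fail_hits = [424, 244, 1161, 1706, 871, 223]
pass_hits = [1406, 966, 776, 844, 1503, 1758, 1093, 1487, 1675]
dist_hits = [1810, 963, 1165, 1226, 1853, 2980]

for name, g in [("Fail", fail_hits), ("Pass", pass_hits), ("Distinction", dist_hits)]:
    print(f"{name}: mean hits = {sum(g) / len(g):.1f}")   # matches Table 2

print(f"H = {kruskal_wallis(fail_hits, pass_hits, dist_hits):.2f}")
```

The same function applied to the posts, reply-ratio, and collaboration columns of Table 1 gives the remaining comparisons summarized in Table 2.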
Interested in improving course completion rates, we first identified the unsuccessful students, to see if their participation differed from the others.

Student LMS hits

One can approximate students' participation in the online classroom quantitatively by the number of times they open pages, read discussions, or post, as shown in Figure 1a. It shows that the student group that aggregated a failing grade or did not complete the course opened significantly fewer pages than the successful students. Their average of less than 800 implied that they saw only about half the online material in the course. Students who achieved distinctions read even more than did the average students.

Learning success depends on the interaction with reliable technology (Swan, 2003). The digital divide running through the infrastructure and economic and cultural dimensions (Roycroft & Anantho, 2003) influences connectivity and participation. Students whose infrequent log-ins rendered them invisible compromised their success. The blogs revealed that students employed in the e-learning industry had practically unlimited bandwidth with state of the art computers and software. Others made do with much less and singled it out in the quiz as their biggest challenge:

    Costly and demanding financially, time consuming, stressful . . . . Not for the poor people, under privileged students can be dropouts. (Q)

Students experienced other technical problems that compounded their infrequent Internet connectivity:

    Sometimes my (dial-up) connection was not reliable. (Q)

    There are moments during this module where in my area I experienced a number of electricity cuts and this kept me anxious and waiting to get started with work. (E)

Some students showed resilience in coping with poor infrastructure, regular electricity cuts, and poor connectivity; they managed successfully without compromising their studies. For others, technological problems were overwhelming.
    . . . first three weeks of the course I couldn't work productively because of constant trouble with my PC (wrong Internet Explorer program, needed Java program to read and send information and finally got the Blaster virus). This made me very aware of the high-dependency on technology in the e-learning world. No Computer - No learning - No success. (E)

It is not always clear why some students persist against enormous odds while others give up. Motivation possibly played a role for the last two students, as the student with the electricity problems required the credits to graduate. Students perceived connectivity as the reason for erratic peer contributions, as they did not "see" the lurkers, but noticed that some withheld contributions:

    When some of the peers are struggling for access, their level of contribution is hampered.

    We are a nice bunch enrolled for this course. Some learners easily share and are spontaneous, while others hold back.

Even opening numerous online pages (Table 1) does not always indicate participation. Rovai and Barnum (2003) cautioned that attending courses without participation produces no measurable increase in knowledge and students who wish to pass just through attendance do not succeed. Learning requires interaction not only with the content, but also with co-learners (Swan, 2003).

The number of discussion posts

The majority of discussion posts were compulsory and provided a view on peer group contributions. Figure 1b depicts the extent of student participation. Like hits, there was a significant difference between the numbers of postings from the students in different grade groups. Students who failed or abandoned the course posted on average significantly fewer discussions than their successful counterparts, confirming Davies and Graff's (2005) results. We also observed a significant difference between average and excellent students, a trend Davies and Graff could not indicate.
On average, the high performing students were also most active in the discussions. There were also average performing students (subjects 7 and 14) who posted a proliferation of messages, constituting "noise" in the discussions (Williams, 2004), and an excellent performer (subject 17) who posted few (Table 1), reminiscent of vicarious or read-only participation. The number of posts, therefore, does not reflect student involvement.

Ratio of replies to original posts per student grade group

This metric indicated a student's style of participation, whether peer-focused or self-focused, and is independent of participation quantities. From Figure 1c it is evident that the more successful students more readily interacted with their peers. Successful students replied two or three times more often to other posts than they initiated original posts. The less successful students replied less often than they originally posted. The difference between all groups is highly significant (Table 2). These observations confirm that, after a minimum interaction establishing the necessary support, the quality and dynamics of interaction further influenced online learning (Davies & Graff, 2005).

This metric still does not indicate the real quality of contributions. To encourage rational discourse, Klemm (1998) urged facilitators to grade on the quality of the postings and not to settle for mere opinions. Absent students and those who contribute little of value or virtually "nod" their approval in threaded discussions do not deceive their peers (Collison et al., 2000).

Quality participation

Klemm (1998) proposed using peer groups to grade the value of each person's contribution. Therefore, we designed one team assignment to develop a rubric for scoring online collaborative behaviour. The collaboration score (Figure 1d) is an average of assessments by two peers and the facilitator using this rubric. While rudimentary, it indicates how students rated others' participation.
Like all previously discussed quantitative measurements of student activity in the online classroom, the collaboration score showed highly significant differences among the three stratifications of students, as unsuccessful students had low collaboration scores and the highly successful ones scored highest. Interpretation of the scores is problematic, as again there are notable exceptions. Subjects 6 and 12 (Table 1) logged in often, but they did not score high on collaboration and presented themselves as classic read-only participants.

We also used peer review extensively as a mechanism to improve interaction and learn collaboratively (Boud et al., 1999). The transparent learning gave students insight into each other's work. Most students were positive about the peer assessment process and realized the advantages:

    With traditional learning, nobody really has access to your assignments, except if you want them to. To me e learning proofed to be a very transparent way of learning. For the first time in my life I had freely access to everybody else's assignments. I were able to position myself, to compare my own writing and most important learn from others. I was intrigued by the differing viewpoints from which the assignments were approached.

    Peer assessment sharpens a student's responses - the student knows he cannot "get away" with lazy work.

While the non-contributing students may be satisfied that their learning needs are met (Beaudoin, 2002), they do not contribute to the benefit of the community. We contend that the quality of a student's contributions to the course reflects integration into the community.

Group participation

Cooperative group assignments encourage students to participate online. As previous teamwork in this programme resulted in much unresolved conflict, we scheduled group assignments in the latter half of the course and allocated a small portion of the grades to these activities.
The rationale for using group work was teaching students the challenges of working in distributed online groups. Despite online support in the form of dedicated discussion groups and synchronous chat rooms to ease the management of their assignments, some students participated insufficiently and created discontent. Prodded in the quiz, numerous students indicated teamwork as the biggest challenge in the course:

    Collaborative work via the Internet (was) very difficult.

    Team work - the response from people, ways of communicating within the group and I "think" the ability for people to "ignore" the postings in the hope that other people in the group would do it.

The challenge of online teamwork also emerged as a prominent theme in students' reflective essays at the end of the course:

    The chat rooms were functioning well and the teams worked together beautifully. Unluckily not all team members could participate here.

    I really HATE working in a group. My attitude is not to depend on others, and to make sure that I don't need to rely on others. I trust myself and my own work most of all. This all in all makes me a VERY bad team player!

    As I expected, only three team members were actively involved during the group work assignment. We were supposed to be seven in the team.

    It was once again not a very satisfactory experience, because only a few group members participated.

    Team work, this proved to be a challenge. As the nominated team captain, I learnt a few lessons; these being people are demanding, they wanted to know I was online and on track. There were those people who tried to participate but when the chips were down and timelines tight they were nowhere to be found. Then there were those people whom I knew I could rely on, it seemed a bit of performance punishment, but they just got more work to do, because I knew they would cope. Working in a team online, there are still those who just don't get the meaning of the word team.

Group membership rotated.
In constructivist fashion, students self-organized their groups and appointed their own leaders. Organization and leadership in online teams exhibit distinct dynamics: "In contrast to face-to-face teams, the leadership role of virtual teams is shared among team members" (Johnson, S. D., Suriya, Yoon, Berrett, & La Fleur, 2002, p. 379). When team members did not share responsibility, problems arose. Some contributions to group assignments were late and unusable, reflecting low quality planning discussions, consisting of little more than affective messages. These students were very enthusiastic spectators, cheering from the sidelines and afterwards congratulating the team on good work, even if they did not expend much effort. Scott Johnson and his group (Johnson, S. D., et al., 2002) suggested: "Problems in the virtual teams came from a lack of willingness to participate, lack of planning, conflicting schedules, or individual disagreements. Most of these are social interaction issues" (p. 391).

Not all our students were averse to online group work:

    When the peers are encouraged to work together, they better realise their collective potential.

    Creating a rubric as a team was quite fascinating. I created my rubric and I felt good to see my work joined with the work of others. (E)

    We worked so hard with my teammates . . . . I call this team the A team because of the outstanding work we did. (E)

Significantly, some of the accolades came from the very students that others complained about and in their reflective essays accused of withholding contributions. Many low performing students had poor metacognition of their contribution.

Non-English speaking students can find it challenging to participate in fast-paced synchronous chats (Carr et al., 2004). Some students participated erratically in synchronous chats and some never mastered the tool, in spite of clear online instructions.
Some managed to log in but did not respond when other participants repeatedly encouraged them to contribute. This adversely affected other students, as they suspected those students might be spying. The learning needs of some of the read-only participants were met, even if they contributed minimally. Some thought that affective participation was sufficient. Diverse students understood their responsibility to the online community differently.

Virtual community

Voluntary participation

After exploring many factors that influence successful course outcomes, we investigated the role of the virtual community in learning and the effect of non-participation on the community. According to Collison et al. (2000), students in a healthy online community support their community; their concern becomes evident when they contribute without expecting rewards. After concluding the course we e-mailed a request for feedback to clarify some outstanding issues. We assumed that voluntary responses would indicate prolonged involvement in the community. Figure 1e displays the results of the replies. As expected, the students who did not successfully complete the course nearly unanimously ignored the request. The difference between the average students and the distinction candidates was both interesting and highly significant. Figure 1 shows that the successful students were not only the most active online, but also the most involved in the virtual community: they contributed more posts, replied to a larger percentage of fellow students' posts, displayed collaborative behaviour, and readily provided voluntary feedback.

An integrated community

A core of students represented a high-functioning, healthy online community (Collison et al., 2000). The ethnography showed the concern and support that existed in this community, with students informing their peers of imminent absence from discussions.
Reasons for absence were often work related: teachers attending conferences or school tours, for example. Students were also willing to be vulnerable (Barab, Thomas, & Merrill, 2001) and shared personal circumstances, like serious illness, road accidents, and death among close associates. By extending support, close affective bonds and a camaraderie developed. This resembled Barab et al.'s (2001, p. 105) community, where "students readily shared their feelings, critically examined course issues, extended their support in helping peers." High-quality (useful and timely) contributions granted membership to the community. The community in turn helped students to improve the quality of their contributions in a positive-feedback fashion. Our community was not inclusive: at its core students participated often, while at the periphery individuals participated less.

Facilitator support

Some of the less connected students communicated with the facilitator by e-mail, telephone, and short text messages. The distributed rural student with the intermittent electricity supply reported these interruptions by short text message or telephone and thus negotiated deadlines. A few communicated to the facilitator personal circumstances that precluded class participation. We accommodated them by allowing them to work separately. Their interaction with the facilitator possibly contributed to their success (King, 2002). Other low-participation students used ordinary e-mail to communicate with the facilitator or to submit assignments, thereby indicating that their lack of communication and participation was caused not by poor connectivity but by poor LMS attendance. E-mails consisted mostly of excuses for missing deadlines, but we were often unable to respond due to overflowing mailboxes. The reasons for this poor participation remain obscure, as these students did not return telephone calls or e-mails, nor did they reply to discussion postings.
These poorly connected students seldom made valuable contributions, as they frequently missed important instructions. Online support went unheeded. They did not improve their work and did not integrate into the online community of learners. Many of these invisible students had poor completion rates and grades. No amount of online coaching will improve the learning experience for unconnected students. They illustrate Bernard et al.'s (2004) finding that frame of mind and previous performance are the best indicators of online learning success.

Conclusions

We present evidence that in a predominantly participative class the number of times students access the course, the number of contributions to discussions, the ratio of replies to others' posts, and integration into the learning community all significantly relate to successful course completion. These metrics, however, have poor individual predictive value because the great diversity of students in the cohort included numerous exceptions. Low online visibility and participation can take many forms, with students assuming different roles:

- read-only participants, merely skimming or deliberately harvesting much of value from others' discussions;
- students highly visible without contributing much of value;
- students poorly visible due to poor connectivity or high costs, although some manage their log-in time effectively and gain maximum benefit;
- students absent for other reasons, but interacting with the facilitator and staying on track;
- absent, non-reading, non-participating students with undisclosed reasons, who do not share the benefits of the virtual learning community.

Only students who contributed to the class or interacted with the facilitator completed the course successfully. Our calculations confirm that students who are at risk of not completing a course contribute less, and their contributions are of poorer quality, reflecting less interaction with fellow students and the facilitator.
Because of infrequent log-ins, these students miss out on crucial support needed for success (Davies & Graff, 2005). People also lurk on professional list servers, using content or ideas from a discussion but contributing nothing in return (Klemm, 1998). While read-only participants learn from others without visibly participating or adding value to the discussions (Beaudoin, 2002), the dynamics in an online community of learners depend strongly on diverse contributions from all its members. Unlike Net-generation students (Oblinger, 2003), mature students often resent dependence on others, sentiments that may conflict with the necessity to post often and care for the community (Collison et al., 2000). We found that the quality of contributions, rather than their quantity, builds trust among mature students. In an online community students spontaneously moderate the discussions and give cognitive feedback, allowing novice members to grow into full participation. Non-participating students relinquish coaching, feedback, and support from the facilitator and their peers, as the affective dynamics of the community exclude non-participating members. We caution against Beaudoin's permissiveness towards lurkers. It is not in the interests of the community if a large proportion of the class are read-only participants. This also deters the isolated student. To avoid read-only participation we endorse Klemm's (1998) suggestions, and further suggest that a facilitator should:

- communicate the required log-in frequency clearly;
- encourage the submission of high-quality, thoughtful postings and grade them accordingly;
- grade all discussions initially and give formative feedback, in private if necessary;
- grade individual contributions to group projects (peer- or self-generated) and not give the same grade to all;
- rotate members of groups so that students are not stuck with non-participating members or, when feasible, allow students to choose groups;
- foster collaboration dependent on content-related interactions;
- structure group assignments so that students work in parallel rather than serially, so that inadequate contributions do not impair others in their peer group;
- convey important information via Internet-independent media, such as mobile phone technology.

The problem of poorly performing online students, and of those who abandon their courses, is complex. Students who did not contribute did not become part of the community and did not benefit from facilitation, tutoring, or peer feedback. The other students reacted to this behaviour. We foresee that a large number of lurking students in an online class can prevent the formation of a virtual community of learners and compromise everyone's education.

Notes on contributors

Lynette Nagel recently completed her Ph.D. in Computer-Integrated Education. Her research interests include the dynamics of student interaction in online classes. She is an instructional designer at the University of Pretoria. Address: Department for Education Innovation, University of Pretoria, Pretoria, South Africa.

Seugnet Blignaut is a Research Professor in Education at the North-West University. She obtained a Ph.D. in Computer-Assisted Learning from the University of Pretoria, South Africa. Her research interests include the role of the online instructor, gender issues in online learning, and integrating computers in learning at school. Address: Education Sciences, North-West University, Potchefstroom Campus, Private Bag X6001, Potchefstroom, 2520, South Africa.

Johannes Cronjé is the Dean of the Faculty of Informatics and Design at the Cape Peninsula University of Technology. He has been working in computers and education since 1994 and has supervised more than 60 Masters and 27 Ph.D. students. His research interests include communication patterns in online learning communities and the functioning of virtual learning communities in multicultural contexts.
Address: Cape Peninsula University of Technology, PO Box 652, Cape Town, 8000, South Africa.

References

Barab, S.A., Thomas, M.K., & Merrill, H. (2001). Online learning: From information dissemination to fostering collaboration. Journal of Interactive Learning Research, 12(1), 105–143.

Beaudoin, M.F. (2002). Learning or lurking? Tracking the "invisible" online student. Internet and Higher Education, 5, 147–155.

Bernard, R.M., Brauer, A., Abrami, P.C., & Surkes, M. (2004). The development of a questionnaire for predicting online learning achievement. Distance Education, 25(1), 31–47.

Boud, D., Cohen, R., & Sampson, J. (1999). Peer learning and assessment. Assessment & Evaluation in Higher Education, 24(4), 413–426.

Carr, T., Cox, G., Eden, A., & Hanslo, M. (2004). From peripheral to full participation in a blended trade bargaining simulation. British Journal of Educational Technology, 35(2), 15.

Clark, R.E., & Feldon, D.F. (2005). Five common but questionable principles of multimedia learning. In R.E. Mayer (Ed.), Cambridge handbook of multimedia learning. Cambridge: Cambridge University Press.

Collins, A., Brown, J.S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 6–11.

Collison, G., Elbaum, B., Haavind, S., & Tinker, R. (2000). Facilitating online learning: Effective strategies for moderators. Madison, WI: Atwood Publishing.

Cox, G., Carr, T., & Hall, M. (2004). Evaluating the use of synchronous communication in two blended courses. Journal of Computer Assisted Learning, 20, 183–193.

Davies, J., & Graff, M. (2005). Performance in e-learning: Online participation and student grades. British Journal of Educational Technology, 36(4), 657–663.

Garrison, D.R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. The American Journal of Distance Education, 15(1), 7–23.

Johnson, C.S. (2001). A survey of current research on online communities of practice. Internet and Higher Education, 4, 45–60.

Johnson, S.D., Suriya, C., Yoon, S.W., Berrett, J.V., & La Fleur, J. (2002). Team development and group processes of virtual learning teams. Computers & Education, 39, 379–393.

Kettner-Polley, R.B. (2005). Virtual professor + virtual student = real education. Retrieved January 18, 2005, from http://iiswinprd03.petersons.com/distancelearning/code/articles/distancelearnprof10.asp

King, F.B. (2002). A virtual student: Not an ordinary Joe. Internet and Higher Education, 5, 157–166.

Kirschner, P.A., Sweller, J., & Clark, R.E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86.

Klemm, W.R. (1998). Eight ways to get students more engaged in online conferences. Technological Horizons in Education Journal, 26(1), 62–64.

Kreijns, K., Kirschner, P.A., & Jochems, W. (2003). Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: A review of the research. Computers in Human Behavior, 19, 335–353.

Miller, M.D., Rainer, R.K., & Corley, J.K. (2003). Predictors of engagement and participation in an on-line course. Online Journal of Distance Learning Administration, 6(1), 13.

Oblinger, D. (2003). Boomers, gen-Xers & millennials: Understanding the new students. Educause, 4, 37–47.

Prammanee, N. (2003). Understanding participation in online courses: A case study of perceptions of online interaction. ITFORUM, 68, 16.

Rovai, A.P., & Barnum, K.T. (2003). On-line course effectiveness: An analysis of student interactions and perceptions of learning. Journal of Distance Education, 18(1), 57–73.

Roycroft, T.R., & Anantho, S. (2003). Internet subscription in Africa: Policy for a dual digital divide. Telecommunications Policy, 27, 61–74.

Schreck, V. (2006). It takes a virtual village: Practical strategies for improving online learning retention rates. Retrieved January 6, 2007, from www.innovativeeducators.org/product_p/38.htm

Sharp, L., & Fretchling, J. (1997). User-friendly handbook for mixed method evaluations. Retrieved December 22, 2006, from www.ehr.nsf.gov/EHR/REC/pubs/NSF97-153/START.htm

Shavelson, R.J., & Huang, L. (2003). Responding responsibly to the frenzy to assess learning in higher education. Change, 35(1), 10–19.

Sutton, L.A. (2001). The principle of vicarious interaction in computer-mediated communications. International Journal of Educational Telecommunications, 7(3), 223–242.

Swan, K. (2003). Learning effectiveness online: What the research tells us. In J. Bourne & J.C. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13–45). Needham, MA: Sloan Center for Online Education.

Swan, K., Shea, P.J., Fredericksen, E.E., Pickett, A.M., & Pelz, W.E. (2000). Course design factors influencing the success of online learning. Paper presented at the WebNet 2000 World Conference on the World Wide Web and Internet, San Antonio. Chesapeake, VA: AACE.

Williams, B. (2004). Participation in on-line courses – how essential is it? Educational Technology & Society, 7(2), 1–8.
Running head: ARTICLE SUMMARY; “READ ONLY PARTICIPANTS”

“Read-Only Participants” Summary
Name
Institution Affiliation
Date


The article "Read-only participants: A case for student communication in online classes" by Nagel, Blignaut, and Cronjé, published in 2009, examines the impact of read-only participants on virtual community contributions and on rates of course completion in online learning. The authors hold that active participation in online learning influences both a student's likelihood of completing the course and the student's own learning. Students with minimal active engagement have low rates of course completion and contribute little to the effectiveness of the c...

