Initial Trends in Enrolment and Completion of
Massive Open Online Courses
Katy Jordan
The Open University, UK
Abstract
The past two years have seen rapid development of massive open online courses
(MOOCs) with the rise of a number of MOOC platforms. The scale of enrolment and
participation in the earliest mainstream MOOC courses has garnered a good deal of
media attention. However, data about how the enrolment and completion figures have
changed since the early courses is not consistently released. This paper seeks to draw
together the data that has found its way into the public domain in order to explore
factors affecting enrolment and completion. The average MOOC course is found to
enroll around 43,000 students, 6.5% of whom complete the course. Enrolment numbers
are decreasing over time and are positively correlated with course length. Completion
rates are consistent across time, university rank, and total enrolment, but negatively
correlated with course length. This study provides a more detailed view of trends in
enrolment and completion than was available previously, and a more accurate view of
how the MOOC field is developing.
Keywords: MOOCs; higher education; massive open online courses; online education;
distance learning
Introduction
In the past two years, massive open online courses (MOOCs) have entered the
mainstream via the establishment of several high-profile MOOC platforms (primarily
Coursera, EdX, and Udacity), offering free courses from a range of elite universities and
receiving a great deal of media attention (Daniel, 2012). 2012 has been referred to as
‘the year of the MOOC’ (Pappano, 2012; Siemens, 2012), and some herald this as a
significant event in shaping the future of higher education, envisioning a future where
MOOCs offer full degrees and ‘bricks and mortar’ institutions decline (Thrun, cited in
Leckart, 2012).
There are clearly great potential individual and societal benefits to providing university-level education free of some of the traditional barriers to participation in elite education,
such as cost and academic background. However, it is not clear the extent to which
MOOCs provide these benefits in practice. MOOCs may favour those who are already
educationally privileged; Daphne Koller of Coursera has stated that the majority of their
students are already educated to at least undergraduate degree level: 42.8% hold a
bachelor's degree, a further 36.7% a master's degree, and 5.4% a doctorate (Koller &
Ng, 2013). A further study of Coursera students enrolled in
courses provided by the University of Pennsylvania indicates a greater dominance of
highly educated students, 83.0% of respondents being graduates and 44.2% being
educated at the postgraduate level (Emanuel, 2012). The author concludes that MOOCs
are failing in their goal to reach disadvantaged students who would not ordinarily have
access to educational opportunities (Emanuel, 2013). In order to succeed in a MOOC
environment, higher digital literacy may be required of students (Yuan & Powell, 2013),
potentially exacerbating pre-existing digital divides. In theory MOOCs remove
geographical location as a boundary to access, although a lack of internet access may
prevent this from being realized in practice (Guzdial, 2013).
Although smaller-scale, connectivist MOOCs have existed for several years, the
development of larger-scale MOOCs offered by elite institutions has propelled MOOCs
into the mainstream. The earliest and perhaps most highly cited example is the Stanford
AI class, which attracted 160,000 students (20,000 of whom completed the course)
when it ran in autumn 2011 (Rodriguez, 2012). However, while this example is often
used, it is unlikely to be representative of how the field is developing. A survey
undertaken by The Chronicle of Higher Education in February 2013 suggested that the
average MOOC enrolment is 33,000 students, with an average of 7.5% completing the
course (Kolowich, 2013). Detailed studies of particular courses have emphasized that
those who enroll upon courses have a wide variety of motivations for doing so (Breslow
et al., 2013; Koller, Ng, Do, & Chen, 2013); however, motivation does not predict
whether a student will complete a course (Breslow et al., 2013). In examining
completion and engagement with courses, studies have focused upon characterizing
types of learners (Kizilcec, Piech, & Schneider, 2013; Koller et al., 2013). Limitations of
these studies are that they focus upon a small number of early MOOCs, and ascribe
course completion primarily to student choice and motivation. There is a gap in the
research literature here about what could be learnt about characteristics of courses
themselves and their effect upon enrolment and completion, which this study sought to
explore.
Six-figure enrolment statistics have generated a good deal of interest in MOOCs in the
higher education sector, and are frequently conflated with active participation or
completion. However, the earliest courses are the most frequently cited examples and
may not be representative of how the phenomenon is developing, and the extent to
which enrolment numbers are indicative of completion has not been explored
comprehensively. These issues are obscured to an extent by a lack of consistent data
being made open to those outside of the MOOC platforms. For example, the Coursera
data export policy gives individual institutions control over the data that is released
about courses (Coursera, 2012), and in practice the extent of data sharing is highly
variable and ad hoc.
Now, over 18 months on from the advent of the large MOOC platforms, this paper seeks
to synthesise the data that has found its way into the public domain in order to address
some of the very basic questions associated with MOOCs. How massive is ‘massive’ in
this context? Completion rates are reputedly low, but how low? From the available data,
can we learn anything about factors which might affect enrolment numbers and
completion rates?
Methods
The approach taken here drew together a variety of different publicly available sources
of data online to aggregate information about enrolment and completion for as many
MOOCs as possible. Information about enrolment numbers and completion rates were
gathered from publicly available sources on the Internet. Given the media attention
which MOOCs have garnered, and their ‘massive’ nature, there is a good deal of publicly
available information to be found online, including news stories, university reports,
conference presentations, and MOOC student bloggers. Issues of reliability associated
with using this data are addressed below.
The list of completed MOOCs maintained at Class Central [1] was used as a starting point
for the inquiry. Completed courses from Coursera, EdX, and Udacity were identified for
inclusion in the study, while other individual MOOCs and platforms were excluded. These
criteria were used because (i) Coursera, EdX, and Udacity are the platforms which have
received the greatest media focus and have fuelled the global interest in MOOCs, (ii) the
platforms account for the vast majority of MOOCs to date, and (iii) the platforms reflect
the higher education sector more broadly, offering courses presented from ‘bricks and
[1] http://www.class-central.com/#pastlist
mortar’ institutions through the platforms. At the time of writing (22nd July 2013), this
list comprised 279 courses (including courses which have run multiple times).
Enrolment and completion figures were selected as the data to be collected for the
courses, as these are the metrics which are most commonly available. Completion in this
sense was defined as the percentage of students who satisfied a course's criteria
in order to gain a certificate. The exact activities required to achieve this vary according
to course. Where possible, data was also recorded about the number of ‘active users’ in
courses. Information about the number of active users was available for 33 courses,
although some did not provide any definition of the term. Courses that did define
active users characterized them as students who actively engaged with the course
material to some extent (as opposed to those who enrolled but did not use the course at
all). For example, this includes having logged in to a course, attempted a quiz, or viewed
at least one video. Data was also collected about the date a course began, the course
length in weeks, and university ranking (using the Times Higher Education World
Rankings; THE, 2013) in order to explore whether these factors affect enrolment and
completion.
The enrolment and completion data was collected in two ways: via internet searches, and
by crowdsourcing information from students who had participated in courses through
appeals on social media. Students contributed data which had been shared with them by the course
instructor to the author’s blog (Jordan, 2013). This yielded information about
enrolment numbers for a total of 91 courses (32.6% of total potential sample), and
completion for 42 courses (15.1% of total). For transparency, the sources used for all
data items are included here. Details of courses for which only enrolment data was
available are shown in Table 1; details of courses for which completion data was found
are shown in Table 2.
Table 1: Data Drawn from Online Sources for Courses for which Enrolment Numbers Only were Available

| Course | Institution | Enrolled | Start date | Length (weeks) | Platform | Source |
|---|---|---|---|---|---|---|
| Introduction to Databases | Stanford University | 60000 | 2011-10-01 | 9 | Coursera | Widom, 2012 |
| Human-Computer Interaction | Stanford University | 29105 | 2012-05-28 | 5 | Coursera | Lugton, 2012 |
| Introduction to Sociology | Princeton University | 40000 | 2012-06-11 | 7 | Coursera | Lewin, 2012a |
| Introduction to Finance | University of Michigan | 125000 | 2012-07-23 | 15 | Coursera | Masolova, 2013 |
| Algorithms, Part I | Princeton University | 65000 | 2012-08-12 | 6 | Coursera | Princeton University, 2012 |
| Introduction to Sustainability | University of Illinois at Urbana-Champaign | 32000 | 2012-08-27 | 8 | Coursera | Rushakoff, 2012 |
| Securing Digital Democracy | University of Michigan | 14000 | 2012-09-03 | 5 | Coursera | University of Michigan, 2012 |
| Statistics One | Princeton University | 96000 | 2012-09-03 | 12 | Coursera | Bialik, 2013 |
| Modern & Contemporary American Poetry | University of Pennsylvania | 36000 | 2012-09-10 | 10 | Coursera | Unger, 2013 |
| Introduction to Mathematical Thinking | Stanford University | 57592 | 2012-09-17 | 10 | Coursera | Devlin, 2012 |
| A History of the World since 1300 | Princeton University | 83000 | 2012-09-17 | 12 | Coursera | Cervini, 2012 |
| Organizational Analysis | Stanford University | 81000 | 2012-09-24 | 10 | Coursera | Hawkins, 2013 |
| An Introduction to Interactive Programming in Python | Rice University | 54000 | 2012-10-15 | n/a | Coursera | Weinzimmer, 2012 |
| The Modern World: Global History since 1760 | University of Virginia | 40000 | 2013-01-14 | 15 | Coursera | Kapsidelis, 2013 |
| Microeconomics for Managers | University of California, Irvine | 37000 | 2013-01-21 | 10 | Coursera | Heussner, 2013 |
| Fundamentals of Human Nutrition | University of Florida | 45000 | 2013-01-22 | n/a | Coursera | Nelson, 2013 |
| Data Analysis | Johns Hopkins University | 102000 | 2013-01-22 | 8 | Coursera | Jordan, 2013 |
| Principles of Public Health | University of California, Irvine | 15000 | 2013-01-28 | 5 | Coursera | Florida Public Health Training Center, 2013 |
| Introduction to Digital Sound Design | Emory University | 45000 | 2013-01-28 | 4 | Coursera | Williams, 2013 |
| Nutrition for Health Promotion and Disease Prevention | University of California, San Francisco | 50000 | 2013-01-28 | 6 | Coursera | Ferraro, 2013 |
| Grow to Greatness: Smart Growth for Private Businesses, Part I | University of Virginia | 71000 | 2013-01-28 | 5 | Coursera | University of Virginia, 2013 |
| Developing Innovative Ideas for New Companies | University of Maryland, College Park | 85000 | 2013-01-28 | 6 | Coursera | Welsh & Dragusin, 2013 |
| The Modern and the Postmodern | Wesleyan University | 30000 | 2013-02-04 | 14 | Coursera | Roth, 2013 |
| Clinical Problem Solving | University of California, San Francisco | 28000 | 2013-02-11 | 6 | Coursera | Harder, 2013 |
| Aboriginal Worldviews and Education | University of Toronto | 23000 | 2013-02-25 | 4 | Coursera | Stauffer, 2013 |
| Introduction to Music Production | Berklee College of Music | 50000 | 2013-03-01 | 6 | Coursera | Clark, 2013 |
| Songwriting | Berklee College of Music | 65590 | 2013-03-01 | 6 | Coursera | Pattison, 2013 |
| Sustainable Agricultural Land Management | University of Florida | 13000 | 2013-03-04 | 9 | Coursera | Nelson, 2013 |
| How Things Work 1 | University of Virginia | 20000 | 2013-03-04 | n/a | Coursera | Burnette, 2012 |
| Leading Strategic Innovation in Organizations | Vanderbilt University | 33000 | 2013-03-05 | 8 | Coursera | Furman University, 2013 |
| Economic issues, Food & You | University of Florida | 16000 | 2013-03-18 | 10 | Coursera | Nelson, 2013 |
| Global sustainable energy: past, present and future | University of Florida | 18000 | 2013-03-24 | 15 | Coursera | Nelson, 2013 |
| Science, Technology, and Society in China I: Basic Concepts | The Hong Kong University of Science and Technology | 17000 | 2013-04-04 | 3 | Coursera | Sharma, 2013 |
| Introduction to Improvisation | Berklee College of Music | 39000 | 2013-04-29 | 5 | Coursera | Burton, 2013 |
| Grow to Greatness: Smart Growth for Private Businesses, Part II | University of Virginia | 71000 | 2013-04-29 | 4 | Coursera | University of Virginia, 2013 |
| TechniCity | Ohio State University | 16000 | 2013-05-04 | 4 | Coursera | Campbell, 2013 |
| Nutrition, Health, and Lifestyle: Issues and Insights | Vanderbilt University | 66000 | 2013-05-06 | 6 | Coursera | Moran, 2013 |
| History of Rock, Part One | University of Rochester | 30000 | 2013-05-13 | 7 | Coursera | Rivard, 2013 |
| First-Year Composition 2.0 | Georgia Institute of Technology | 17000 | 2013-05-27 | 8 | Coursera | Head, 2013 |
| Creative Programming for Digital Media & Mobile Apps | University of London International Programmes | 70000 | 2013-06-03 | 6 | Coursera | Gillies, 2013 |
| Growing Old Around the Globe | University of Pennsylvania | 4500 | 2013-06-10 | 6 | Coursera | Posey, 2013 |

Note: n/a indicates that a figure was not reported by the source.
Table 2: Data Drawn from Online Sources in Relation to MOOC Enrolment, Number of Active Users, and Completion Rates

| Course | Institution | Enrolled | Active | Completed | Start date | Length (weeks) | Platform | Source |
|---|---|---|---|---|---|---|---|---|
| Introduction to Machine Learning | Stanford University | 104000 | 41600 | 13000 | 2011-10-01 | 10 | Coursera | McKenna, 2012 |
| Introduction to Artificial Intelligence | Stanford University | 160000 | 80000 | 20000 | 2011-10-01 | 10 | Udacity | Schmoller, 2012 |
| 6.002x: Circuits and Electronics | Massachusetts Institute of Technology | 154763 | n/a | 7157 | 2012-03-05 | 14 | MITx | Lewin, 2012b |
| Software Engineering for SaaS | University of California, Berkeley | 50000 | n/a | 3500 | 2012-05-18 | 5 | Coursera | Meyer, 2012 |
| Listening to World Music | University of Pennsylvania | 36295 | 22018 | 2191 | 2012-07-23 | 7 | Coursera | Jordan, 2013 |
| Internet History, Technology, and Security | University of Michigan | 46000 | 11640 | 4595 | 2012-07-23 | 13 | Coursera | Severance, 2012 |
| Gamification | University of Pennsylvania | 81600 | 49776 | 8280 | 2012-08-27 | 6 | Coursera | Werbach, 2012 |
| 6.002x: Circuits and Electronics | Massachusetts Institute of Technology | 46000 | 6000 | 3008 | 2012-09-05 | 14 | EdX | Chu, 2013 |
| Functional Programming Principles in Scala | École Polytechnique Fédérale de Lausanne | 50000 | n/a | 9593 | 2012-09-18 | 7 | Coursera | Miller & Odersky, 2012 |
| Social Network Analysis | University of Michigan | 61285 | 25151 | 1410 | 2012-09-24 | 8 | Coursera | Jordan, 2012 |
| Bioelectricity: A Quantitative Approach | Duke University | 12000 | 7761 | 313 | 2012-09-24 | 9 | Coursera | Belanger & Thornton, 2013 |
| Greek and Roman Mythology | University of Pennsylvania | 55000 | n/a | 2500 | 2012-09-24 | 10 | Coursera | Jordan, 2013 |
| An Introduction to Operations Management | University of Pennsylvania | 87000 | 58000 | 4000 | 2012-09-24 | 8 | Coursera | Barber, 2013 |
| Mathematical Biostatistics Bootcamp | Johns Hopkins University | 15930 | 8380 | 740 | 2012-09-24 | 7 | Coursera | Anderson, 2012 |
| Computing for Data Analysis | Johns Hopkins University | 50899 | 27900 | n/a | 2012-09-24 | 4 | Coursera | Simply Statistics, 2012 |
| Learn to Program: The Fundamentals | University of Toronto | 38502 | n/a | 8243 | 2012-09-24 | 7 | Coursera | St. Petersburg College, 2013 |
| Introduction to Genetics and Evolution | Duke University | 33000 | 14000 | 1705 | 2012-10-10 | 12 | Coursera | Duke Today, 2012 |
| CS50x: Introduction to Computer Science I | Harvard University | 150349 | 100953 | 1388 | 2012-10-15 | 24 | EdX | Malan, 2013 |
| 3.091x: Introduction to Solid State Chemistry | Massachusetts Institute of Technology | 28512 | 6000 | 2082 | 2012-10-15 | 12 | EdX | Chu, 2013 |
| Computational Investing, Part I | Georgia Institute of Technology | 53205 | 28199 | 2554 | 2012-10-22 | 9 | Coursera | Balch, 2013a |
| Think Again: How to Reason and Argue | Duke University | 226652 | 132000 | 5322 | 2012-11-26 | 12 | Coursera | Riddle, 2013a |
| Introduction to Astronomy | Duke University | 60000 | 40000 | 2141 | 2012-11-27 | 8 | Coursera | Belanger, 2013 |
| Drugs and the Brain | California Institute of Technology | 66800 | 10426 | 4400 | 2012-12-01 | 5 | Coursera | Lesiewicz, 2013 |
| Calculus: Single Variable | University of Pennsylvania | 47000 | n/a | 7000 | 2013-01-07 | 13 | Coursera | Unger, 2013 |
| Calculus One | Ohio State University | 35579 | 24385 | n/a | 2013-01-07 | 15 | Coursera | Evans, 2013 |
| Image and video processing: From Mars to Hollywood with a stop at the hospital | Duke University | 40000 | 23000 | 4069 | 2013-01-14 | 9 | Coursera | Riddle, 2013b |
| Artificial Intelligence Planning | University of Edinburgh | 29894 | 15546 | 654 | 2013-01-28 | 5 | Coursera | University of Edinburgh, 2013 |
| E-learning and Digital Cultures | University of Edinburgh | 42844 | 21862 | 1719 | 2013-01-28 | 5 | Coursera | University of Edinburgh, 2013 |
| Critical Thinking in Global Challenges | University of Edinburgh | 75844 | 35084 | 6909 | 2013-01-28 | 5 | Coursera | University of Edinburgh, 2013 |
| Introduction to Philosophy | University of Edinburgh | 98128 | 53255 | 9445 | 2013-01-28 | 7 | Coursera | University of Edinburgh, 2013 |
| Astrobiology and the Search for Extraterrestrial Life | University of Edinburgh | 39556 | 20413 | 7707 | 2013-01-28 | 5 | Coursera | University of Edinburgh, 2013 |
| Equine Nutrition | University of Edinburgh | 23322 | 18998 | 8416 | 2013-01-28 | 5 | Coursera | University of Edinburgh, 2013 |
| Introductory Organic Chemistry - Part 1 | University of Illinois at Urbana-Champaign | 17400 | 9000 | n/a | 2013-01-28 | 8 | Coursera | Arnaud, 2013 |
| Stat2.1x: Introduction to Statistics: Descriptive Statistics | University of California, Berkeley | 52661 | n/a | 8181 | 2013-02-20 | 5 | EdX | Adhikari, 2013 |
| Computational Investing, Part I | Georgia Institute of Technology | 25589 | 15688 | 1165 | 2013-02-23 | 8 | Coursera | Balch, 2013b |
| AIDS | Emory University | 18600 | 10601 | n/a | 2013-02-25 | 9 | Coursera | Williams, 2013 |
| Introductory Human Physiology | Duke University | 33675 | n/a | 1036 | 2013-02-25 | 12 | Coursera | Zhou, 2013 |
| Pattern-Oriented Software Architectures for Concurrent and Networked Software | Vanderbilt University | 30979 | 20180 | 1643 | 2013-03-04 | 8 | Coursera | Jordan, 2013 |
| Introduction to Mathematical Thinking | Stanford University | 27930 | n/a | 1950 | 2013-03-04 | 10 | Coursera | Schmoller, 2013 |
| A Beginner's Guide to Irrational Behavior | Duke University | 142839 | 82008 | 3892 | 2013-03-25 | 8 | Coursera | Jordan, 2013 |
| Gamification | University of Pennsylvania | 66438 | 34548 | 5592 | 2013-04-01 | 6 | Coursera | Werbach, 2013 |
| Medical Neuroscience | Duke University | 44980 | 18433 | 756 | 2013-04-08 | 12 | Coursera | Novicki, 2013 |
| Healthcare Innovation and Entrepreneurship | Duke University | 15596 | n/a | 1520 | 2013-04-15 | 6 | Coursera | Kenyon, 2013 |
| Mathematical Biostatistics Bootcamp | Johns Hopkins University | 21916 | n/a | 2087 | 2013-04-16 | 7 | Coursera | Jordan, 2013 |
| Generating the Wealth of Nations | University of Melbourne | 28922 | 12197 | 500 | 2013-04-29 | 10 | Coursera | Signsofchaos blog, 2013 |
| Sports and Society | Duke University | 19281 | 6918 | 1626 | 2013-04-30 | 7 | Coursera | Anderson, 2013 |
| Introduction to International Criminal Law | Case Western Reserve University | 21000 | n/a | 1432 | 2013-05-01 | 8 | Coursera | Farkas, 2013 |
| Inspiring Leadership through Emotional Intelligence | Case Western Reserve University | 90000 | 58000 | n/a | 2013-05-01 | 8 | Coursera | Farkas, 2013 |
| Statistical Molecular Thermodynamics | University of Minnesota | 10000 | 5000 | n/a | 2013-05-20 | 8 | Coursera | Friedrich, 2013 |
| Introduction to Systems Biology | Icahn School of Medicine at Mount Sinai | 26915 | 15392 | n/a | 2013-06-03 | 6 | Coursera | Course site at Coursera |

Note: n/a indicates that a figure was not reported by the source.
Data analysis was conducted using linear regression carried out with Minitab statistical
software. Linear regression was chosen as the approach to analysis because at this stage
the aim of the research was exploratory, to identify potential trends rather than being
explanatory and seeking to fit a model. This would be a valuable goal for follow-up
research particularly if more consistent data became available for MOOCs more broadly.
Linear regression analyses were carried out individually according to different factors of
interest rather than as a single multiple regression due to issues of data consistency and
availability; that is, data is not available for every field in Tables 1 and 2 for every course,
so n varies according to different tests (see Results and Analysis section). Rather than
discarding courses for which the full spectrum of data was not available, a series of
individual regression analyses was carried out in order to gain the greatest insight
possible into the different factors.
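The single-factor approach described above can be sketched in a few lines. This is not the author's Minitab procedure, only an illustrative reconstruction: since the reported equations all take the form ln(y) = a + b*x, the Box-Cox transformation evidently reduces to a log transform (lambda = 0), so each fit is an ordinary least-squares regression on a log-transformed response. The function name is mine, and the data is a small subset of Table 1, so the fitted values will not match the paper's.

```python
import numpy as np

def fit_log_linear(x, y):
    """Fit ln(y) = a + b*x by ordinary least squares.

    Returns (a, b, r_squared): a single-factor regression on a
    log-transformed response (a Box-Cox transform with lambda = 0).
    """
    x = np.asarray(x, dtype=float)
    log_y = np.log(np.asarray(y, dtype=float))
    b, a = np.polyfit(x, log_y, 1)          # slope, intercept
    pred = a + b * x
    ss_res = np.sum((log_y - pred) ** 2)
    ss_tot = np.sum((log_y - log_y.mean()) ** 2)
    return a, b, 1 - ss_res / ss_tot

# A small subset of (length in weeks, enrolled) pairs from Table 1:
lengths = [9, 5, 7, 15, 6, 8, 5, 12, 10]
enrolled = [60000, 29105, 40000, 125000, 65000, 32000, 14000, 96000, 36000]
a, b, r2 = fit_log_linear(lengths, enrolled)
```

Running one such fit per factor of interest, rather than a single multiple regression, is what allows every course with data for that factor to contribute to that test.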
Limitations
There are a number of limitations which must be borne in mind with the approach
taken by this study, including issues of validity of data and reliability of the research
instruments used.
In terms of validity, it should be noted that the accuracy of figures varies according to
source, with some institutions releasing precise figures while others (particularly
when releasing enrolment data through the press) provide rounded figures. This reflects the
fact that MOOC courses do not consistently release this information into the public
domain, and most of the courses that would have been eligible for inclusion (67.4%)
have not released any data. Of the institutions or instructors choosing to make data
available, bias may be introduced according to their motivations for publicizing this
information, which are unknown. There is also a degree of trust involved in the
information provided by student informants via the blog.
It should be emphasized that the study sought to be exploratory in nature, identifying
trends of interest in the data as a starting point for further research but not seeking to
explain or model the phenomenon. Reliability of the approach is less contentious, as the
data have been collected via several rounds of internet searches during the data
collection period (February 13th to July 22nd 2013) and are shown in full in Tables 1 and 2
should others wish to reproduce the tests or carry out alternative analyses. Collating
data ‘in the open’ at the author’s blog (Jordan, 2013) also offered a platform for others
(including course leaders) to scrutinize the data and provide more accurate figures in
some cases.
Results and Analysis
Trends in Total Enrolment Figures
The analysis of total enrolment numbers draws upon the data in both Tables 1 and 2,
comprising a total of 91 courses (excluding three courses for which total enrolment figures are missing).
Total enrolment figures range from 4,500 to 226,652 students, with a median value of
42,844. The data does not exhibit a normal distribution (Figure 1); six-figure
enrolments are not representative of the ‘typical’ MOOC. Total enrolments are shown
plotted against the date each course began in Figure 2. This demonstrates a negative
correlation, with enrolment numbers decreasing over time.
Figure 1. Histogram of total enrolment numbers for the sampled courses (n = 91).
Figure 2. Scatterplot of total enrolment numbers plotted against course start date for
the sampled courses (n = 91).
A regression analysis was carried out, prior to which the data was subject to a Box-Cox
transformation as the residuals do not follow a normal distribution. Regression analysis
showed that date significantly predicted total enrolment figures at the 95% significance
level by the following formula: ln(Enrolled) = 104.249 - 0.00226915*StartDate (R2 =
0.1719, p < 0.001). The relationship is a negative correlation, indicating that as time has
progressed, enrolment figures have decreased. The relationship is relatively weak (time
as a factor accounts for 17.2% of the variance observed, as R2 is a measure of the fraction
of variance explained by the model; Grafen & Hails, 2002), although the sample is
sufficiently large that this is statistically significant (critical R2 values decrease according
to sample size, with an n of 91 being relatively large; Siegel, 2011). This highlights that a
focus upon figures from early courses is misleading and not representative of how the
field is developing.
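Because the model is linear in ln(Enrolled), the fitted equation implies a constant multiplicative decline in expected enrolment per day. A minimal sketch of reading off that decline (the day-number date encoding is my assumption, based on Minitab-style serial dates counted from 1 January 1900; the choice of origin only shifts the intercept, not the slope):

```python
from datetime import date
import math

# Assumption: StartDate is a day count (here, days since 1 January 1900),
# as used by Minitab/Excel-style serial dates.
ORIGIN = date(1900, 1, 1)

def predicted_enrolment(start):
    """Expected enrolment under the fitted model
    ln(Enrolled) = 104.249 - 0.00226915 * StartDate."""
    day_number = (start - ORIGIN).days
    return math.exp(104.249 - 0.00226915 * day_number)

early = predicted_enrolment(date(2012, 3, 1))
late = predicted_enrolment(date(2013, 6, 1))

# The slope translates into a constant daily shrink factor:
daily_factor = math.exp(-0.00226915)        # roughly 0.23% fewer enrolments per day
yearly_factor = math.exp(-0.00226915 * 365)  # a decline of over half per year, if the trend held
```

This is purely a reading of the reported coefficients; given R2 = 0.1719, the trend explains only a modest share of the variation between courses.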
The relationship between course length and total enrolments was also considered, and
found to demonstrate a positive correlation between course length and total enrolment
(Figure 3).
Figure 3. Scatterplot of total enrolment numbers plotted against course length for the
sampled courses (n = 87).
Following a Box-Cox transformation, regression analysis showed that course length
significantly predicted (at the 95% significance level) total enrolment figures by the
following formula: ln(Enrolled) = 10.2248 + 0.0491206*Length (R2 = 0.0545, p =
0.029). The correlation between the variables is positive, indicating courses that are
longer attract a greater number of enrolments. The relationship is relatively weak,
accounting for 5.5% of the variance observed, although the sample size is sufficiently
large for this to be a statistically significant relationship. This positive correlation may
suggest that prospective MOOC students prefer more substantial courses (however, see
also the relationship between course length and completion rates).
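The length coefficient can be interpreted the same multiplicative way: each additional week of course length multiplies the expected enrolment by exp(0.0491206), about a 5% increase per week. A brief illustration (reading off the reported coefficients only; variable names are mine):

```python
import math

# From the fitted model ln(Enrolled) = 10.2248 + 0.0491206 * Length,
# each extra week multiplies expected enrolment by a constant factor.
per_week_factor = math.exp(0.0491206)   # about 1.05, i.e. ~5% more enrolments per week

def expected_enrolment(length_weeks):
    return math.exp(10.2248 + 0.0491206 * length_weeks)

six_week = expected_enrolment(6)     # roughly 37,000 under the fitted trend
twelve_week = expected_enrolment(12)  # roughly 50,000 under the fitted trend
```

With R2 = 0.0545 the effect is weak, so these point predictions describe the trend line, not any individual course.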
In addition, the relationship between university ranking and enrolment figures was
considered, although it was not found to be significant at the 95% level.
Trends in Completion Rates
Completion rates were calculated as the percentage of students (out of the total
enrolment for each course) who satisfied the criteria to gain a certificate for the course.
This information was available for 39 courses in the sample. Completion rates range
from 0.9% to 36.1%, with a median value of 6.5% (Figure 4). The data is skewed, so the
higher completion rates are not representative, with completion rates of 5% being
typical.
Figure 4. Histogram of completion rates for the sampled courses (n = 39).
As the residuals were not normally distributed, a Box-Cox transformation was again
carried out before conducting regression analysis. No significant relationships were
found between completion rate and date, university ranking, or the total number of
students enrolled. Completion rates remained consistent across these factors. A
significant negative correlation was found however between completion rate and course
length, shown in Figure 5. Regression analysis showed that course length significantly
predicted completion rate by the following formula: ln(PercentTotalCompleted) =
2.64802 - 0.100461*CourseLength (R2 = 0.2373, p = 0.002). The correlation in this case
is negative, indicating that a lower proportion of students complete longer courses.
Course length accounts for 23.7% of the variance observed, and the correlation is
significant at the 95% significance level.
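As with the enrolment models, the log-linear form means each extra week multiplies the expected completion rate by a constant factor, exp(-0.100461), roughly a 10% relative drop per week. A short reading of the reported coefficients (function and variable names are mine):

```python
import math

# From ln(PercentTotalCompleted) = 2.64802 - 0.100461 * CourseLength,
# each extra week multiplies the expected completion rate by a constant factor.
per_week_factor = math.exp(-0.100461)   # about 0.904: ~10% relative drop per week

def expected_completion_pct(length_weeks):
    return math.exp(2.64802 - 0.100461 * length_weeks)

five_week = expected_completion_pct(5)   # roughly 8.5% under the fitted trend
ten_week = expected_completion_pct(10)   # roughly 5.2% under the fitted trend
```

Again, these are trend-line values: with R2 = 0.2373, most of the course-to-course variation in completion remains unexplained by length alone.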
Figure 5. Scatterplot of completion rate plotted against course length for the sampled
courses (n = 39).
While considering completion rate as the percentage of the total enrolment that
complete the course is the type of data that is most readily available, a criticism of this
characterization is that many students may enroll without even starting the course, and
that completion rates would be better characterized as the proportion of active students
who complete. This level of information is available for a subset of the sampled courses
(39 courses with a number of active students and total enrolment; 33 courses with data
about the proportion of active students who complete).
The number of active students is remarkably consistent as a proportion of the total
enrolment of the course (with approximately 50% of the total enrolment becoming
active students). This is shown graphically in Figure 6. Regression analysis showed that
total enrolment significantly predicted the number of active students by the following
formula: Active = 0.543336*Enrolled (R2 = 0.9556, p < 0.001). The correlation is strong
(accounting for 95.6% of the variance) and positive, showing a consistent relationship
between total enrolment and the percentage who become active students (being
approximately 54% of those who enroll).
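The fitted model Active = 0.543336 * Enrolled has no intercept, so the relevant calculation is a least-squares regression through the origin, whose slope is simply sum(x*y) / sum(x^2). A small sketch (an illustrative reconstruction, not the author's Minitab output, using six of the enrolled/active pairs from Table 2, so the slope will only approximate the paper's 0.543):

```python
import numpy as np

def through_origin_slope(enrolled, active):
    """Least-squares slope b for Active = b * Enrolled with no intercept:
    b = sum(x*y) / sum(x^2)."""
    x = np.asarray(enrolled, dtype=float)
    y = np.asarray(active, dtype=float)
    return float(np.sum(x * y) / np.sum(x * x))

# Six (enrolled, active) pairs from Table 2:
pairs = [(81600, 49776), (226652, 132000), (98128, 53255),
         (142839, 82008), (75844, 35084), (61285, 25151)]
b = through_origin_slope([p[0] for p in pairs], [p[1] for p in pairs])
# b lands in the region of the paper's fitted 0.543 for these six courses
```

Forcing the line through the origin reflects the natural constraint that a course with no enrolments can have no active students.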
Figure 6. Scatterplot of number of active students plotted against total enrolment for
the sampled courses (n = 39).
When calculating completion rate as the percentage of active students who complete the
course, completion rates range from 1.4% to 50.1%, with a median value of 9.8% (Figure
7). While completion rates as a percentage of active students span a wider range than
completion rates as a percentage of total enrolments, there remains a strong skew
towards lower values. The differences here would be worthwhile to examine in further
detail, to explore the features of course design that may account for the wider variation
observed.
Figure 7. Histogram of completion rates as a proportion of active students for the
sampled courses (n = 39).
No significant relationships were found between completion rate as a proportion of
active users and date, university ranking, total enrolment, or (in contrast to completion
rate as a percentage of total enrolment) course length. This may suggest that enrolled
students are put off starting longer courses, but that course length is less of an issue for
those who do become actively engaged in the course.
Conclusions
The findings here demonstrate changes in the field since the concept of MOOCs entered
the mainstream and the inception of the major MOOC platforms. It is misleading to
invoke early enrolment and completion figures as representative of the phenomenon;
six-figure enrolments are atypical, with the median average enrolment being 42,844
students, and decreasing over time as the number of courses available continues to
increase. Although this is lower than in the earliest examples, such enrolments are still
far larger than conventional classes, emphasizing that it is inappropriate to compare
completion rates of MOOCs to those of traditional bricks-and-mortar institution-based
courses.
The majority of courses have been found to have completion rates of less than 10% of
those who enroll, with a median average of 6.5%. The definition of completion rate used
here is the percentage of enrolled students who satisfied the courses’ criteria in order to
earn a certificate, and this definition was used because it is the type of information that
is most frequently available. There are potentially many ways in which MOOC students
may participate in and benefit from courses without completing the assessments. The
wider range of completion rates (while still remaining quite low overall, with a median
of 10%) observed when defining completion as a percentage of active learners in courses
is interesting and warrants further work to better understand the reasons why those
who become engaged initially do or do not complete courses.
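The two definitions of completion rate discussed here differ only in the denominator: everyone who enrolled, or only those who became active. A minimal sketch, using hypothetical course counts rather than any figures from the paper:

```python
# Hypothetical per-course counts (not drawn from the paper's sample),
# illustrating the two completion-rate definitions discussed above.
courses = [
    {"enrolled": 50000, "active": 27000, "certified": 3200},
    {"enrolled": 120000, "active": 65000, "certified": 7800},
    {"enrolled": 15000, "active": 8000, "certified": 1100},
]

for c in courses:
    # Definition 1: certificate earners as a share of everyone who enrolled.
    rate_enrolled = 100 * c["certified"] / c["enrolled"]
    # Definition 2: certificate earners as a share of active students only.
    rate_active = 100 * c["certified"] / c["active"]
    print(f"{rate_enrolled:.1f}% of enrolled, {rate_active:.1f}% of active")
```

Because the active count is always smaller than total enrolment, the second definition necessarily yields the higher rate, which is why the active-student completion rates span a wider and higher range.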
This is not to say, however, that completion rates should be ignored entirely. Looking at
completion rates is a starting point for better understanding the reasons behind them,
and how courses could be improved for both students and course leaders. For example,
the relationship between enrolments, completion, and course length is an interesting
issue for MOOC course design, balancing the higher enrolments with the lower
completion rates of longer courses. Figures about how many students achieved
certificates obscure how many students attempted to gain a certificate but did not meet
the criteria. Given that MOOCs are offered free of educational prerequisites, striving to
improve teaching on courses so that students who wish to complete are assisted in doing
so is an important pedagogical issue. The extent of understanding that can be gained
outside of running a MOOC will continue to be constrained however as long as the
release of detailed data about courses is limited.
This study has only considered relationships between enrolment and completion and a
small number of general factors for which data is available publicly; various other
factors would be worthwhile to explore. For example, it would be useful to examine the
underlying pedagogy, and whether differences emerge between transmissive (so-called
‘xMOOCs’) and connectivist (‘cMOOCs’) courses. The impact
of different assessment types, being necessarily linked to the criteria for achieving a
certificate of completion, would also be a worthwhile area to consider in further detail.
Along with the studies discussed in the introduction which focus upon links between
student demographics or behaviours and completion (Breslow et al., 2013; Kizilcec et
al., 2013; Koller et al., 2013), a limitation of the approach used here is that the data
neglects the student voice. While these approaches can identify trends and patterns,
they are unable to explore in detail the reasons behind the trends observed.
Acknowledgments
The author would like to thank Professor Martin Weller and the two anonymous peer
reviewers for their comments on drafts of this paper. Special thanks to all of the MOOC
students, instructors, and other commentators who contributed data and thoughtful
comments about MOOC completion rates to the author’s blog.
Vol 15 | No 1
Feb/14
151
References
Adhikari, A. (2013). Completion. Stat2x, Spring 2013 blog. Retrieved from
http://stat2x.blogspot.co.uk/2013/04/completion.html
Anderson, N. (2012). Grades are in for a pioneering free Johns Hopkins online class.
The Washington Post. Retrieved from
http://www.washingtonpost.com/blogs/college-inc/post/grades-are-in-for-apioneering-free-johns-hopkins-online-class/2012/11/14/1bd60194-2e6b-11e289d4-040c9330702a_blog.html
Anderson, S. (2013). Duke Sports and Society MOOC wraps up. Duke Center for
Instructional Technology blog: http://cit.duke.edu/blog/2013/07/duke-sportsand-society-mooc-wraps-up/
Arnaud, C. H. (2013). Flipping chemistry classrooms. Chemical & Engineering News.
Retrieved from http://cen.acs.org/articles/91/i12/Flipping-ChemistryClassrooms.html
Balch, T. (2013a). About MOOC completion rates: The importance of student
investment. The Augmented Trader blog:
http://augmentedtrader.wordpress.com/2013/01/06/about-mooc-completionrates-the-importance-of-investment/
Balch, T. (2013b). MOOC student demographics. The Augmented Trader blog:
http://augmentedtrader.wordpress.com/2013/01/27/mooc-studentdemographics/
Barber, M. (2013). Comment posted on the Introduction to Operations Management
page. Coursetalk.org: http://coursetalk.org/coursera/an-introduction-tooperations-management
Belanger, Y. (2013). IntroAstro: An intense experience. Retrieved from
http://hdl.handle.net/10161/6679
Belanger, Y., & Thornton, J. (2013). Bioelectricity: A quantitative approach. Duke
University’s First MOOC:
http://dukespace.lib.duke.edu/dspace/bitstream/handle/10161/6216/Duke_Bi
oelectricity_MOOC_Fall2012.pdf
Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D., & Seaton, D. T. (2013).
Studying learning in the worldwide classroom: Research into edX’s first MOOC.
Research and Practice in Assessment, 8, 13-25.
Burnette, D. (2012). The way of the future. The University of Virginia Magazine.
Retrieved from
http://uvamagazine.org/features/article/the_way_of_the_future#.UdrX_1Pp6
ic
Burton, G. (2013). Did they just say, “39,000 students enrolled in my Improvisation
course?” OMG!. Garyburton.com news/opinion:
http://www.garyburton.com/opinion/did-they-just-say-30000-studentsenrolled-in-my-improvisation-course-omg/
Campbell, G. (2013). The technicity story, part 2. The Technicity Story blog:
http://blogs.lt.vt.edu/technicitystory/2013/04/24/the-technicity-story-part-2/
Cervini, E. (2012) Mass revolution or mass con? Universities and open courses. Crikey.
At http://www.crikey.com.au/2012/12/18/mass-revolution-or-mass-conuniversities-and-open-courses/?wpmp_switcher=mobile
Chu, J. (2013). Duflo, Lander, Lewin to lead spring-semester MITx courses. MIT News:
http://web.mit.edu/newsoffice/2013/mitx-spring-offerings-0131.html
Clark, S. (2013). Coursera – Introduction to music production by Loudon Stearns.
Bytes and Banter blog:
http://bytesandbanter.blogspot.co.uk/2013/06/coursera-introduction-tomusic.html
Coursera. (2012). (DRAFT) Data export procedures. Retrieved from
https://docs.google.com/viewer?a=v&pid=forums&srcid=MDMyNTg5NzM4O
TAxMTY2NDg5NzEBMDEwNDAzNzI4ODgxODU0NTkwODQBLTkwOXZQa2h
uODRKATQBAXYy
Daniel, J. S. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox and
possibility. Journal of Interactive Media in Education. Retrieved from
http://www-jime.open.ac.uk/jime/article/view/2012-18
Devlin, K. (2012a). Liftoff: MOOC planning – part 7. Devlin’s Angle blog:
http://mooctalk.org/2012/09/21/mooc-planning-part-7/
Duke Today. (2012). Introduction to genetics and evolution, a preliminary report. Duke
Today: http://today.duke.edu/node/93914
Emanuel, E. J. (2013). Online education: MOOCs taken by educated few. Nature,
503(342). Retrieved from http://dx.doi.org/10.1038/503342a
Evans, T. (2013). Here’s the scoop on Ohio State MOOCs. Digital Union, Ohio State
University: http://digitalunion.osu.edu/2013/04/01/osu-coursera-moocs/
Farkas, K. (2013). Case Western Reserve University’s free online courses exceeded
expectations. Cleveland.com:
http://www.cleveland.com/metro/index.ssf/2013/07/case_western_reserve_u
niversit_9.html
Ferraro, K. (2013). Nutrition consulting. Ingrain Health:
http://www.ingrainhealth.com/nutrition-consulting.html
Florida Public Health Training Center. (2013). A public health refresher course. Florida
Public Health Training Center Online Mentor Program blog:
http://phmentorships.wordpress.com/2013/02/01/a-public-health-refreshercourse/
Friedrich, A. (2013). UMN faculty: MOOCs have made us rethink learning. On Campus:
http://blogs.mprnews.org/oncampus/2013/07/umn-faculty-moocs-havemade-us-rethink-learning/
Furman University. (2013). TEDx FurmanU 2013 Redesigning Education Cast.
Tedxfurmanu.com website: http://www.tedxfurmanu.com/#!2013/c1g5h
Gillies, M. (2013). Creative programming for digital media & mobile apps. Marco Gillies
webpage at Goldsmiths, University of London:
http://www.doc.gold.ac.uk/~mas02mg/MarcoGillies/creative-programmingfor-digital-media-mobile-apps/
Grafen, A., & Hails, R. (2002). Modern statistics for the life sciences. Oxford: Oxford
University Press.
Guzdial, M. (2013). Slides from “The revolution will be televised” MOOCopalpse panel.
Computing Education blog:
http://computinged.wordpress.com/2013/03/09/slides-from-the-revolutionwill-be-televised-moocopalypse-panel/
Harder, B. (2013). Are MOOCs the future of medical education? BMJ Careers:
http://careers.bmj.com/careers/advice/view-article.html?id=20012502
Hawkins, D. (2013). Massive open online courses (MOOCs): The Thursday plenary
session. Against the Grain Blog: http://www.against-thegrain.com/2013/06/massive-open-online-courses-moocs-the-thursdayplenary-session/
Head, K. (2013). Inside a MOOC in progress. The Chronicle of Higher Education.
Retrieved from http://chronicle.com/blogs/wiredcampus/inside-a-mooc-inprogress/44397
Heussner, K. M. (2013). More growing pains for Coursera: In another slip-up, professor
departs mid-course. Gigaom: http://gigaom.com/2013/02/19/more-growingpains-for-coursera-in-another-slip-up-professor-drops-out-mid-course/
Jordan, K. (2012). Networked life, social network analysis, & a new appreciation for
feedback. MoocMoocher blog:
http://moocmoocher.wordpress.com/2012/12/21/networked-life-socialnetwork-analysis-a-new-appreciation-for-feedback/
Jordan, K. (2013). Synthesising MOOC completion rates. MoocMoocher blog:
http://moocmoocher.wordpress.com/2013/02/13/synthesising-mooccompletion-rates?
Kapsidelis, K. (2013). U. Va. set to launch global classrooms. Times Dispatch. Retrieved
from http://www.timesdispatch.com/news/local/education/college/u-va-setto-launch-global-classrooms/article_53fbd2b8-8bb1-58ff-89281eaca612a103.html
Kenyon, A. (2013). Healthcare Innovation and Entrepreneurship final comments. Duke
Center for Instructional Technology blog:
http://cit.duke.edu/blog/2013/07/healthcare-innovation-andentrepreneurship-final-comments/
Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement:
Analyzing learner subpopulations in massive open online courses. Third
International Conference on Learning Analytics and Knowledge, LAK ’13
Leuven, Belgium.
Koller, D., & Ng, A. (2013). The online revolution: Education for everyone. Seminar
presentation at the Said Business School, Oxford University, 28th January 2013.
Retrieved from http://www.youtube.com/watch?v=mQ-KsOW4fU&feature=youtu.be
Koller, D., Ng, A., Do, C., & Chen, Z. (2013). Retention and intention in massive open
online courses: In depth. Educause Review. Retrieved from
http://www.educause.edu/ero/article/retention-and-intention-massive-openonline-courses-depth-0
Kolowich, S. (2013, March 21). The professors who make the MOOCs. The Chronicle of
Higher Education. Retrieved from http://chronicle.com/article/TheProfessors-Behind-the-MOOC/137905/#id=overview
Leckart, S. (2012). The Stanford education experiment could change higher education
forever. Wired Magazine. Retrieved from
http://www.wired.com/wiredscience/2012/03/ff_aiclass/3/
Lesiewicz, A. (2013). Drugs and the brain. ATA Science & Technology Division blog:
http://ata-sci-tech.blogspot.co.uk/2013/02/drugs-and-brain.html
Lewin, T. (2012a). College of future could be come one, come all. The New York Times.
Retrieved from http://www.nytimes.com/2012/11/20/education/colleges-turnto-crowd-sourcing-courses.html
Lewin, T. (2012b). One course, 150,000 students. The New York Times. Retrieved from
http://www.nytimes.com/2012/07/20/education/edlife/anant-agarwaldiscusses-free-online-courses-offered-by-a-harvard-mitpartnership.html?ref=education
Lugton, M. (2012). Review of the Coursera Human Computer Interaction Course blog:
http://reflectionsandcontemplations.wordpress.com/2012/07/14/review-ofthe-coursera-human-computer-interaction-course/
Malan, D. J. (2013). This was CS50x. CS50 blog: https://blog.cs50.net/2013/05/01/0/
Masolova, E. (2013). Interview with Daphne Koller, CEO of COURSERA. Eduson blog:
https://www.eduson.tv/blog/coursera
McKenna, L. (2012). The big idea that can revolutionize higher education: ‘MOOC’. The
Atlantic. Retrieved from
http://www.theatlantic.com/business/archive/2012/05/the-big-idea-that-canrevolutionize-higher-education-mooc/256926/
Meyer, R. (2012). What it’s like to teach a MOOC (and what the heck’s a MOOC?). The
Atlantic. Retrieved from
http://www.theatlantic.com/technology/archive/2012/07/what-its-like-toteach-a-mooc-and-what-the-hecks-a-mooc/260000/
Miller, H., & Odersky, M. (2012). Functional programming principles in Scala:
Impressions and statistics. Scala Documentation website: http://docs.scalalang.org/news/functional-programming-principles-in-scala-impressions-andstatistics.html
Moran, M. (2013). Free online nutrition course kicks off May 6th. Vanderbilt News.
Retrieved from http://news.vanderbilt.edu/2013/05/coursera-nutrition/
Nelson, B. (2013). UF offers massive online learning for free. 1565today.com:
http://1565today.com/uf-offers-massive-learning-online-for-free/
Novicki, A. (2013). Medical Neuroscience in Coursera has just finished. Duke Center for
Instructional Technology blog: http://cit.duke.edu/blog/2013/07/courseramedical-neuroscience-week-3/
Pappano, L. (2012). The year of the MOOC. The New York Times.
http://www.nytimes.com/2012/11/04/education/edlife/massive-open-onlinecourses-are-multiplying-at-a-rapid-pace.html?pagewanted=1
Pattison, P. (2013). Coursera songwriting course starts July 19th. Patpattison.com:
http://www.patpattison.com/news/entry?id=16
Posey, J. (2013). Free Penn online course offers lessons on growing old. Penn News.
Retrieved from http://www.upenn.edu/pennnews/news/free-penn-onlinecourse-offers-lessons-growing-old
Princeton University. (2012). Office of Information Technology administrative report,
September 07, 2012. Retrieved from http://www.princeton.edu/oit/about/oitadministrative-report/PDFs/Admin_09-12.pdf
Riddle, R. (2013a). Preliminary results on Duke’s third Coursera effort, “Think Again”.
Duke Center for Instructional Technology blog:
http://cit.duke.edu/blog/2013/06/preliminary-results-on-dukes-thirdcoursera-effort-think-again/
Riddle, R. (2013b). Duke MOOCs: Looking back on “Image and Video Processing”. Duke
Center for Instructional Technology blog:
http://cit.duke.edu/blog/2013/06/looking-back-on-image-and-videoprocessing/
Rivard, R. (2013). Three out of 2U. Inside Higher Ed. Retrieved from
http://www.insidehighered.com/news/2013/05/17/three-universities-backaway-plan-pool-courses-online
Rodriguez, C. O. (2012). MOOCs and the AI-Stanford like Courses: Two successful and
distinct course formats for massive open online courses. European Journal of
Open, Distance, and E-Learning. Retrieved from
http://www.eurodl.org/index.php?article=516
Roth, M. S. (2013). My modern experience teaching a MOOC. The Chronicle of Higher
Education. Retrieved from http://chronicle.com/article/My-Modern-MOOCExperience/138781
Rushakoff, H. (2012). Free to learn: Geology, chemistry, and microeconomics are
among U of I’s first free online courses on Coursera. University of Illinois at
Urbana-Champaign College of Liberal Arts & Sciences News. Retrieved from
http://www.las.illinois.edu/news/2012/coursera/
Schmoller, S. (2012). Peter Norvig’s TED talk reflecting on creating and running the
online AI course. Schmoller.net: http://fm.schmoller.net/2012/07/peternorvigs-ted-talk-about-the-ai-course.html#more
Schmoller, S. (2013). Second report from Keith Devlin’s and Coursera’s Introduction to
Mathematical Thinking MOOC. Schmoller.net:
http://fm.schmoller.net/2013/06/second-report-from-keith-devlins-itmtcourse.html
Severance, C. (2012). Internet history, technology and security (IHTS) grand finale
lecture slides. Retrieved from
http://www.slideshare.net/fullscreen/csev/internet-history-technology-andsecurity-grand-finale-lecture-20121001/7
Sharma, Y. (2013). Hong Kong MOOC draws students from around the world. The
Chronicle of Higher Education. Retrieved from
http://chronicle.com/article/Hong-Kong-MOOC-DrawsStudents/138723/?cid=wc&utm_source=wc&utm_medium=en
Siegel, A. F. (2011). Practical business statistics (6th ed.). Oxford: Academic Press.
Siemens, G. (2012). MOOCs are really a platform. Elearnspace blog:
http://www.elearnspace.org/blog/2012/07/25/moocs-are-really-a-platform/
Signsofchaos blog. (2013). An assessment of a MOOC. Signsofchaos blog:
http://signsofchaos.blogspot.co.uk/2013/07/an-assessment-of-mooc.html
Simply Statistics. (2012). Computing for data analysis (Simply statistics edition). Simply
Statistics blog: http://simplystatistics.org/2012/10/29/computing-for-dataanalysis-simply-statistics-edition/
St. Petersburg College. (2013). Alex Sharpe successfully completes University of Toronto
online course via Coursera. The CCIT Bulletin, St. Petersburg College. Retrieved
from http://www.spcollege.edu/ccit-bulletin/?p=1012
Stauffer, J. (2013). Connected Arctic educators discussion thread.
https://plus.google.com/114587962656605254648/posts/fmLmhDE9cSk
Times Higher Education. (2013). World university rankings 2012-2013. Retrieved from
http://www.timeshighereducation.co.uk/world-university-rankings/201213/world-ranking
Unger, M. (2013). Eye on the future: Coursera. Penn Current. Retrieved from
http://www.upenn.edu/pennnews/current/2013-02-21/eye-future/eye-futurecoursera
University of Edinburgh. (2013). MOOCs @ Edinburgh 2013 – Report #1. University of
Edinburgh:
http://www.era.lib.ed.ac.uk/bitstream/1842/6683/1/Edinburgh%20MOOCs%
20Report%202013%20%231.pdf
University of Michigan. (2012). Halderman’s “Securing Digital Democracy” opens on
Coursera. Department of Electrical Engineering and Computer Science:
http://www.eecs.umich.edu/eecs/about/articles/2012/Halderman_Coursera_l
aunch.html
University of Virginia. (2013). U. Va. Darden School’s first Coursera class reaches
71,000 registrants. University of Virginia Darden School of Business news.
Retrieved from http://www.darden.virginia.edu/web/Media/Darden-NewsArticles/2013/Dardens-First-Coursera-Class-Reaches-71000-Registrants/
Weinzimmer, S. (2012). Rice’s first Coursera class enrolls 54,000. The Rice Thresher.
Retrieved from http://www.ricethresher.org/rice-s-first-coursera-class-enrolls54-000-1.2932146#.UcsMlJw1DTo
Welsh, D. H. B., & Dragusin, M. (2013). The new generation of massive open online
courses (MOOCs) and entrepreneurship education. Small Business Institute
Journal, 9(1), 51-65.
Wesleyan University. (2013). Passion driven statistics. Wesleyan University
Quantitative Analysis Center: http://www.wesleyan.edu/qac/studentprofile/homepage_slideshow_coursera_information.html
Werbach, K. (2012). Gamification course wrap-up. PennOpenLearning YouTube
channel: http://www.youtube.com/watch?v=NrFmiqhBep4
Werbach, K. (2013). Gamification Spring 2013 statistics. Coursera Gamification
YouTube channel:
http://www.youtube.com/watch?v=E8_3dNEMukQ&feature=youtu.be
Williams, K. (2013). Emory and Coursera: Benefits beyond the numbers. Emory news
center:
http://news.emory.edu/stories/2013/05/er_coursera_update/campus.html
Widom, J. (2012). From 100 students to 100,000. ACM SigMod Blog:
http://wp.sigmod.org/?p=165
Yuan, L., & Powell, S. (2013). MOOCs and open education: Implications for higher
education (JISC CETIS white paper). Retrieved from
http://publications.cetis.ac.uk/2013/667
Zhou, H. (2013). Duke University completes its first “Introductory Human Physiology”
MOOC! Duke Center for Instructional Technology blog:
http://cit.duke.edu/blog/2013/06/reflection-physio/
Deakin Research Online
This is the authors’ final peer-reviewed (post-print) version of
the item published as:
Palmer, Stuart, Holt, Dale and Bray, Sharyn 2008, Does the discussion help? The impact of a
formally assessed online discussion on final student results, British journal of educational
technology, vol. 39, no. 5, pp. 847-858.
Available from Deakin Research Online:
http://hdl.handle.net/10536/DRO/DU:30017812
Reproduced with the kind permission of the copyright owner.
Copyright : 2008, Wiley-Blackwell
Does the discussion help? The impact of a formally assessed online
discussion on final student results
Stuart Palmer, Dale Holt and Sharyn Bray
Drs Stuart Palmer and Dale Holt are Senior Lecturers in the Institute of Teaching and Learning at
Deakin University in Australia. Ms Sharyn Bray is a Research Assistant in the School of Engineering
and Information Technology at Deakin University in Australia. Address for correspondence: Dr
Stuart Palmer, Institute of Teaching and Learning, Deakin University, Geelong, Victoria, 3217,
Australia. Tel: +613 5227 8143; fax: +613 5227 8129; email: spalm@deakin.edu.au
Abstract
While there is agreement that participation in online asynchronous discussions can enhance
student learning, it has also been identified that there is a need to investigate the impact on
student course performance of participation in online discussions. This paper presents a case
study based on an undergraduate engineering management unit employing a formally
assessed online discussion area. It was observed that while many students read a significant
number of discussion postings, generally, the posting of new and reply messages occurred at
the minimum level required to qualify for the assignment marks. Based on correlation and
multiple regression analysis, it was observed that two variables were significantly related to a
student’s final unit mark – prior academic ability and the number of new postings made to the
online discussion. Each new posting contributed three times as much to the final unit mark as
its nominal assessment value, suggesting that the work in preparing their new discussion
postings assisted students in the completion of a range of assessable tasks for the unit. The
number of postings read was not significantly correlated with the final unit mark, suggesting
that passive lurking in this online discussion did not significantly contribute to student
learning outcomes.
Introduction
Dialogue is considered to be an essential element of human learning, particularly for distance
education (Gorsky & Caspi, 2005). It includes interactions between students and teachers,
exchanges between students, interactions between students and others not directly involved in
their learning processes and dialogue with oneself in the form of reflective thought (Webb,
Jones, Barker & van Schaik, 2004). With the advent of online technologies in teaching and
learning, particularly in distance education, the use of online discussion forums is now a
widespread medium for learning dialogue. Online discussion can be synchronous through the
use of real-time chat tools, but many examples of online discussions documented in the
literature present the use of asynchronous discussion. That is, where students post new and
follow-up messages to an electronic bulletin-board at the times that suit them, and not
necessarily at the same time that other students are accessing the discussion system. The
claimed benefits of online asynchronous discussion forums include:
the time between postings for reflective thought that might lead to more considered
responses than those possible in face-to-face situations (Garrison, Anderson & Archer,
1999);
for off-campus students, two-way communication can be enhanced, reducing student
isolation and making possible dialogue with other students (Kirkwood & Price, 2005);
the convenience of choice of place and time to learners (Cotton & Yorke, 2006);
the creation of a sense of community (Davies & Graff, 2005);
the development of skills for working in virtual teams (Conaway, Easton & Schmidt,
2005);
increased student completion rates from increased peer interaction and support (Wozniak,
2005); and
increased student control, ability for students to express their own ideas without
interruption, the possibility to learn from the collectively created content, the creation of a
permanent record of one’s thoughts, the creation of a reusable instructional tool that
models expected answers and discussion use, and they create a valuable archive of
material for investigation and research (Hara, Bonk & Angeli, 2000).
While there is wide agreement that participation in online asynchronous discussions can
enhance student learning, and significant work has been done characterising, and theorising
on the nature of student communications in online discussions, it has also been identified that
there is a need to investigate the impact on student course performance of participation in
online discussions (Hara et al., 2000). In a combined quantitative and qualitative analysis of
the online discussion postings of education students studying by distance education in
Australia it was found that those students achieving the highest final unit grade also had the
highest frequency of posting, and that lower achieving students were less active online;
though the authors do not claim these findings as conclusive evidence of the effect of online
participation on learning outcomes (as measured by marked assessment activities) (Stacey &
Rice, 2002). In a quantitative analysis of two online discussions in the UK involving 543
computing students, it was found that both the number of student accesses of the system and
the number of student postings to the system were significant predictors of variance in final
mark (in one case) and variance in final grade (in the other case) (Webb et al., 2004). In a
quantitative analysis of online discussion usage involving 122 UK business students based on
what percentage of all online system accesses related to usage of the online communication
system, it was found that students achieving high or medium passing grades were
significantly more active in the discussion area than students achieving a low passing grade,
and in turn, students achieving a low passing grade were significantly more active than
students who failed (Davies & Graff, 2005).
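The correlational analyses summarised above rest on the Pearson product-moment coefficient between participation measures and final marks. The sketch below computes it from first principles on invented posting counts and marks (not data from any of the cited studies):

```python
import math

# Invented example: number of postings made and final unit mark for ten
# students (illustration only; not data from any of the studies cited above).
postings = [0, 1, 2, 2, 3, 4, 5, 6, 8, 10]
marks = [48, 52, 55, 60, 58, 65, 70, 68, 75, 82]

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(postings, marks)
print(f"r = {r:.3f}")
```

A high positive r on data like this mirrors the reported association between posting activity and final grades, though, as the literature above cautions, correlation alone cannot establish that participation causes the better marks.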
It is noted that while the literature suggests a correlation between increased interaction and
increased learning, there is limited research to understand the impact of different types of
postings on learning outcomes (as measured by unit final grade) (Conaway et al., 2005).
Simply encouraging students to get more involved in online discussions may not necessarily
lead to better learning outcomes – there is a need to understand what are the ‘salient factors’
in online interaction that might enhance learning (Davies & Graff, 2005). One debated factor
is whether student participation in online discussions should be optional or mandatory. It has
been noted that some learning theories suggest that user motives largely determine how
students engage with learning activities; intrinsically motivated learners will invest high
levels of cognitive effort regardless of any associated rewards, whereas extrinsically
motivated learners may be enticed to participate by gaining unit marks, but their engagement
may be instrumental and shallow (Kuk, 2003). While there is evidence that online discussion
interaction carried out on a voluntary basis may lead to better learning outcomes (as measured
by unit final grade) (Weisskirch & Milburn, 2003), a pragmatic approach suggests that
discussion contribution is likely to be low unless there is some compulsion to participate
(Graham & Scarborough, 2001). Students have many competing demands on their time, and
if their use of online learning tools is optional, the perceived benefits of participation will
need to outweigh the perceived efforts of using the system. In this case, for some students,
there may be benefits in providing extrinsic motivators for students to learn and use the
system (Garland & Noyes, 2004).
Another form of optional engagement with online discussion forums is ‘lurking’, where
students enrolled in a discussion do not make postings, rather they simply read the postings of
others. These lurkers may not be detected by some online systems, and the question remains,
are these lurkers learning or not? (Hara et al., 2000) There is some evidence that both active
participation (posting) and passive participation (lurking) may be beneficial to online
discussion users (Webb et al., 2004). A final, but important question about student learning
and participation in online discussions relates to the presumption that the often observed
correlation between student participation (number of postings, assessed quality of posting,
etc) and learning outcomes (student final unit mark/grade, etc) is causative, and not simply the
result of more able and/or motivated students engaging more deeply with the online
discussion than less able students (Cotton & Yorke, 2006). Is it possible that the students
with the best results in a unit would have done well in the unit, regardless of whether an
online discussion was employed or not?
Context
The School of Engineering and Information Technology at Deakin University in Australia
offers a three-year Bachelor of Technology (BTech) and a four-year Bachelor of
Engineering (BE) at undergraduate level. These programs are delivered in both
on-campus and off-campus modes. These programs include the second-year engineering
management / professional practice study unit SEB221 Managing Industrial
Organisations. This unit consists of four modules:
1. Systems Concepts for Engineers and Technologists;
2. Managing People in Organisations;
3. Manufacturing and the Environment; and
4. Occupational Health and Safety.
Prior to 2005, this unit was delivered in both on-campus (weekly classroom lectures) and off-campus (printed study guides) modes, with on-campus students generally purchasing the
printed study guides as well, and all students having access to an online area providing basic
resources, including an optional asynchronous discussion forum and the capacity for
academic staff to post ‘announcements’ to all class members. The unit assessment regime
consisted of two assignments each worth 25% of the unit marks and an end-of-semester
examination worth 50% of the unit marks. In 2005, this unit was converted to ‘wholly
online’ delivery mode, where all teaching of the unit occurred online (Holt & Challis, 2007).
The printed study guides were replaced by a CD-ROM version of the study materials,
enhanced with interactive/animated diagrams and video material. Up to this time, the first
author had academic responsibility for the Managing People in Organisations module, and
was not responsible for the unit overall. The assessment regime was not changed for wholly
online delivery.
At the end of 2005, due to staffing changes, the first author assumed full responsibility for the
entirety of SEB221, and a review of the wholly online delivery strategy for the unit was
undertaken. Deakin University’s policy and procedure for ‘Online Technologies in Courses
and Units’ requires that wholly online units be, “… designed to help students to develop their
skills in communicating and collaborating in an online environment…” (Holt & Challis,
2007). While the inclusion of an optional general online discussion area may have met the
‘letter of the law’ for the wholly online unit policy, it was considered inadequate as a means
for genuinely developing student online communication and collaboration skills. For 2006,
ten % of the unit marks were taken from the final examination and dedicated to a formally
assessed assignment activity based around the online discussion area. The other unit
assessment items were retained. A summary of the assignment instructions given to students
is provided below. ‘DSO’ refers to Deakin Studies Online, the local name of the Blackboard
course management system (CMS) used at Deakin University.
This assignment requires you to both reflect on your studies and to constructively engage with the
wholly online environment used in this unit. You are required to post reflections on the course
material and to comment on the postings made by other students during the semester. You have
two types of task in this assignment.
Task 1 – Reflect on the course material you have studied in the current week. Identify what you
think is the most important topic, access the DSO system for this unit, open the Assignment 1
forum area for the appropriate week, select ‘Compose Message’ and post a few paragraphs on your
selected topic that explain why you think it is important.
Task 2 – Review some of the Assignment 1 posts made by other students and select one to
comment on. With that message open select ‘Reply’ and post a follow-up to the original message.
You may add your own additional thoughts/reasons for why that topic is important, you may wish
to contribute an example related to that topic from your own experience, or something else.
You need to make at least five postings for each type of task given above, ie, at least ten postings in
total, five of type one and five of type two. You should make only one of each type of posting in a
given week. Only the best posting for either task type in a given week will be marked. If your
postings demonstrate constructive and thoughtful reflection, you will be awarded up to 1 mark
per posting, up to a maximum of 10 marks in total for the assignment. You can make more than
five postings for each type of task to maximise your mark for Assignment 1. Please use your own
thought/words, do not simply reproduce the course notes. Please note that the forum areas will
not remain open for posting all semester, ie, it will not be possible to complete all your postings
late in the semester.
In summary, students were asked to make at least five ‘new’ postings reflecting on the course
material, with up to one mark awarded for each of the five ‘best’ new posts, and, to make at
least five ‘follow-up’ postings reflecting on the prior posts of their peers, with up to one mark
awarded for each of the five ‘best’ follow-up posts. Student participation in the online
discussion was made ‘mandatory’ in the sense that marks were assigned to participation. As
noted previously, the literature suggests that some form of extrinsic motivation is required to
ensure a high level of student discussion participation. A weighting of ten % was chosen for
discussion participation – this figure is noted in case studies elsewhere in the literature
(Graham & Scarborough, 2001; Hara et al., 2000). It was felt that this weighting would
provide incentive for most students to participate, while at the same time not compromising
the unit assessment regime should there be unforeseen implementation issues with this initial
trial of the asynchronous discussion assignment.
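The marking rule summarised above (only the best posting of each task type in a given week counts, up to one mark per posting, capped at ten marks overall) can be sketched as follows. The dictionary layout of per-posting scores is a hypothetical illustration, not the actual CMS export format.

```python
def assignment_mark(postings):
    """Compute the discussion assignment mark (out of 10).

    `postings` maps (week, task_type) -> list of per-posting scores in
    [0, 1]; only the best posting for a task type in a given week counts.
    """
    best = {}
    for (week, task), scores in postings.items():
        best[(week, task)] = max(scores, default=0.0)
    # Cap at the assignment maximum of 10 marks.
    return min(sum(best.values()), 10.0)

# A student who posts twice in week 1 for the 'new' task gets only the
# better of the two scores for that week.
marks = assignment_mark({
    (1, "new"): [0.6, 0.9],   # best counts: 0.9
    (1, "reply"): [0.8],
    (2, "new"): [1.0],
})
```

This also shows why extra postings beyond five per task type could only ever replace a weaker week, never raise the total above the cap.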
Strategies to promote a high level of
participation in online discussions include requiring a specific number of postings per
assignment and/or per week (Conaway et al., 2005). In this case, both these strategies were
combined. It has been found that a key element in the effective use of computer conferencing
is ‘intentional design’ of the online environment (Harasim, 1991). Intentional design includes
designating conferences (online discussion areas) according to the nature of the task (formal or
informal), the duration of the task (one week, whole semester, etc), size of the group (plenary,
small group etc), etc. Separate weekly discussion spaces were created to structure the formal
student assignment postings. This permitted newer discussion areas to be progressively
revealed, and older areas to be progressively set as read-only as the semester progressed. A
separate informal area was maintained for general unit discussion and questions. As noted,
the assignment-related discussion areas did not remain open all semester, to encourage
students to engage with the unit material in a timely manner across the semester. Due to the
nature of the assignment task, all of the discussion areas were open to all students – no
separate small-group discussions were employed.
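The progressive reveal and read-only locking of the weekly areas can be sketched as below. The length of the posting window is an assumption for illustration; the paper does not state how long each weekly area remained open.

```python
def forum_state(forum_week, current_week, open_weeks=2):
    """Return the state of a weekly discussion area.

    Areas for future weeks are hidden, the most recent `open_weeks`
    areas accept postings, and older areas become read-only.
    (The `open_weeks` window is an assumed parameter.)
    """
    if forum_week > current_week:
        return "hidden"
    if current_week - forum_week < open_weeks:
        return "open"
    return "read-only"

# In week 5: weeks 4-5 accept posts, weeks 1-3 are read-only, 6+ hidden.
states = {w: forum_state(w, current_week=5) for w in range(1, 13)}
```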
As this was the first time a formally assessed online discussion task was employed in this
unit, it was decided to undertake a quantitative investigation to explore the forms of student
engagement with the online discussion, the impact of participation on the students’ final unit
result, whether passive participation/lurking had any benefit, and whether any impact/benefit
was separable from the students’ prior general academic performance in their studies.
Method
Student participation in online discussions can be analysed in quantitative terms (number of
postings, length of postings, number of messages read, etc), qualitative terms (does the
posting exhibit cognitive/social/teaching presence?, does the posting exhibit
knowledge/comprehension/application/analysis?, is the posting on task/off task?, etc) or some
combination of quantitative and qualitative. Quantitative analysis can be performed quickly
using system data, but may not yield a complete picture of student engagement in the
discussion (Hara et al., 2000). However, qualitative analysis requires the examination of
every student posting to classify the content, consuming significant time and leaving results
open to variation in message content classification between different assessors (Cotton & Yorke, 2006).
At the commencement of the semester, an initial model posting of the type expected was
made to seed the discussion and provide an exemplar to students. During the semester,
student postings were assessed on an on-going basis according to the published criteria. In
both initial and follow-up postings, students were asked to discuss unit content, hence the
postings were primarily assessed based on the quality of cognitive presence.
Following the completion of the semester, data on the student usage of the online discussion
area was compiled from the following sources:
student age (whole years at the end of semester);
student gender (male or female);
student normal mode of study (on-campus or off-campus);
student course of study (BTech, BE or other);
student prior general academic performance (measured at Deakin by the Weighted
Average Mark (WAM));
the total number of discussion messages read (or at least opened) by the student;
the total number of new/initial discussion postings made by the student;
the total number of follow-up/reply discussion postings made by the student; and
the final unit mark obtained by the student for SEB221.
The collected data were analysed and the following information was compiled:
descriptive statistics on the use of the discussion areas;
visualisation of the patterns of usage of the discussion areas;
investigation of correlation (Pearson’s linear correlation coefficient) between data variable
pairs; and
multivariate linear regression to find the significant independent variables contributing to
the dependent variable ‘final unit mark’.
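The paired-variable correlation step of the analysis can be sketched with NumPy. The per-student arrays below are invented toy data, not the study's dataset.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's linear correlation coefficient between two variables."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

# Toy data: number of new postings and final unit mark for six students.
new_posts = [0, 1, 2, 3, 4, 5]
final_mark = [45, 52, 58, 61, 70, 74]
r = pearson_r(new_posts, final_mark)  # strongly positive on this toy data
```

In the study itself, each correlation would be tested for significance (e.g. with scipy.stats.pearsonr, which also returns a p-value) before being reported.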
Results and discussion
Descriptive statistics
The number of students completing the unit (still officially enrolled at the end of the semester)
was 86. The total number of assessable messages posted was 645. The average number of
words per posting was 290. Figure 1 shows the distribution of assessable student postings
across the semester.
[Bar chart omitted: 'No. of messages' (0 to 120) by 'Week' (1 to 12).]
Figure 1: Distribution of assessable student postings across the semester
There was a general downward trend in discussion posting until week 8, after which the
number of remaining weeks in the semester equalled the number of posts required from a
student to maximise their possible mark, and after which the general trend picked up again
slightly, perhaps indicating a belated effort by those students who hadn’t actively engaged
with the discussion assignment task previously. Figure 2 shows the ranked distribution of
total new/initial postings made by students.
[Bar chart omitted: 'No. new posts' (0 to 10) by 'Student' (ranked).]
Figure 2: Ranked distribution of total new/initial postings made by students
The mean number of new postings was 3.8, with a standard deviation of 2.8. The median and
modal number was 5, and the range was 0 to 9. Figure 3 shows the ranked distribution of
total follow-up/reply postings made by students.
[Bar chart omitted: 'No. replies posted' (0 to 50) by 'Student' (ranked).]
Figure 3: Ranked distribution of total follow-up/reply postings made by students
The mean number of follow-up postings was 3.7, with a standard deviation of 5.4. The
median number was 3.5, the modal number was 0, and the range was 0 to 47. It is well
known that students take a strategic approach to study, and the learning activities they engage
most fully with are those most clearly associated with what will be assessed (James, McInnis
& Devlin, 2002).
Even though marks were attached to students’ contribution to the online
discussion as an overt indicator that participation was considered important, and disregarding
students with a final mark of zero for the unit, 16.7 % of students made no new/initial
postings and 11.9 % of students made no follow-up/reply postings. A similar rate of students
foregoing assessment worth ten % based on participation in an online asynchronous
discussion task is noted in the literature (Graham & Scarborough, 2001). Figures 2 and 3
suggest that even those students who did engage with the assignment task only tended to do
the minimum required (one new post and one reply post per week, up to a maximum of ten
combined) to qualify for the assignment marks on offer. This type of minimum student
engagement in an assessable online discussion activity is reported elsewhere (Hara et al.,
2000), and reinforces the idea that students are busy, and extrinsic motivation is likely to be
necessary to encourage even a basic level of participation in online discussion activities.
Figure 4 shows the ranked distribution of total number of messages read by students –
technically, the CMS records the number of messages ‘opened’ by students, but this was
taken as a proxy measure of number of messages ‘read’ by students.
[Chart omitted: 'No. messages read' (0 to 800) by 'Student' (ranked).]
Figure 4: Ranked distribution of total number of messages read by students
The mean number of messages read was 149.6, with a standard deviation of 201.7. The
median number was 63.5, the modal number was 669, and the range was 0 to 669. Note that
the figure of 669 is higher than the figure of 645 assessable messages given above, as it
includes some messages posted by students who did not complete the unit, but that were
nevertheless read by the completing students. Interestingly, the modal number of messages
read was also the maximum number, indicating that a significant proportion of students read
every single discussion posting.
Visualisation of patterns of usage
A method for visualising the message posting profile of all students together as a group was
devised. A ranking factor was computed for each student, based on weighting postings early
in the semester higher, and postings later in the semester lower. This factor was used to rank
order all students from highest to lowest.
Figure 5 shows the rank ordered profile of
new/initial postings made by students across the semester.
Figure 5: Rank ordered profile of new postings by students across the semester
Four relatively distinct new-posting profiles, with approximately equal proportions
of students in each, can be observed. Students 1-21 (21 students, 24.4 %) made their required
five (or so) posts, commencing at week one, and then generally left the discussion space.
Students 22-44 (23 students, 26.7 %) commenced their posts in week one and then had a
range of posting profiles, typically not continuous, re-entering the discussion space at various
points over the twelve weeks. Students 45-69 (25 students, 29.1 %) commenced their posts
some time after week one and then had a range of posting profiles, typically not continuous,
with students who commenced their posting late in the twelve week period exhibiting more
intense posting in an attempt to meet the assignment criteria of making five new posts in total.
Students 70-86 (17 students, 19.8 %) made no postings at all during the twelve week period.
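The ranking factor used to order students in Figure 5 is described only in outline. A minimal sketch, assuming a simple linearly decreasing weekly weight (the actual weighting used in the study is not given):

```python
def posting_rank_factor(weeks_posted, n_weeks=12):
    """Rank-ordering factor: earlier postings weighted higher.

    `weeks_posted` lists the week number (1..n_weeks) of each posting;
    the linear weight (n_weeks + 1 - week) is an assumed choice.
    """
    return sum(n_weeks + 1 - w for w in weeks_posted)

# A student posting in weeks 1-5 ranks above one posting in weeks 8-12.
early = posting_rank_factor([1, 2, 3, 4, 5])    # 12+11+10+9+8 = 50
late = posting_rank_factor([8, 9, 10, 11, 12])  # 5+4+3+2+1 = 15
students = sorted([("early", early), ("late", late)], key=lambda s: -s[1])
```

Any monotonically decreasing weight would produce a similar ordering; the point of the factor is only to group students with similar posting profiles adjacently in the visualisation.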
Data variable paired value correlations
Two significant correlations were observed: final unit mark and WAM (r = +0.43, p < 4×10⁻⁵),
and final unit mark and total number of new postings (r = +0.49, p < 2×10⁻⁶). Inspection of
variable pair scatter plots revealed that the relationship between final unit mark and number of
new postings plateaued after five new postings. After the data range for the number of new
postings was limited to five or less, the correlation was r = +0.59 (p < 4×10⁻⁹). As might be
expected, a correlation was observed between previous academic performance (as measured
by the student’s WAM), and final unit result in SEB221. The observed correlation between
total number of new postings and final unit mark was strongest for number of new posts
between zero and five. This is not surprising as, while students were allowed to make
multiple new postings per week, only the single ‘best’ new posting result was taken as the
mark for that week. While both WAM and number of new posts appear to have a positive
correlation with final unit mark, they do not have a significant correlation with each other (r =
+0.23, p > 0.033), suggesting that they are not significantly multicollinear with the final unit
result, and that both contribute independently and positively to the final unit mark.
Multivariate regression
Following removal of three data items with an unknown (not BE or BTech) course of study
and four data items for students with a final unit mark of zero (did not complete the unit but
did not officially withdraw their enrolment), multivariate linear regression analysis was conducted
with final unit mark as the dependent variable. All other known variables were initially
introduced as independent variables, and step-wise regression was performed until all
remaining variables were significant. Table 1 shows the coefficients of the regression model
and their significance.
Table 1: Multivariate linear regression model for dependent variable 'final unit mark'

Variable              Coefficient   Standard error   Beta   Significance
No. new posts (≤ 5)   3.05          0.47             0.50   p < 1×10⁻⁸
WAM                   0.51          0.08             0.48   p < 3×10⁻⁸
Constant              28.17         5.50             -      p < 3×10⁻⁶
An Analysis of Variance (ANOVA) test suggests that the regression model is significant
(F78 = 47.29, p < 5×10⁻¹⁴), though the model predicts only 55.4 % of the variation in final unit
mark (R2 = 0.554). The regression residuals were approximately normally distributed. The
model explains only just over half of the variation observed in the final unit mark, hence there
exist other factors with a significant influence on final unit mark that were not available in the
data collected for this analysis. The results of the regression analysis support the results of
the data pair correlation analysis that both the number of new postings and WAM contribute
significantly and independently to final unit mark. Based on the marking scheme of ‘up to 1
mark per posting’, it would be expected, all other things being equal, that posting one new
message would add approximately one mark to the final unit result. Instead, the regression
analysis indicates that there was a significant benefit (up to three marks per new posting)
beyond the notionally allocated marks for new postings. This suggests that the work that
students completed in preparing their new discussion postings engaged them with the unit
material and assisted them in the completion of other assessable tasks for the unit.
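The fitted model in Table 1 implies a simple linear prediction rule. A minimal sketch using the published coefficients; the example student below is hypothetical.

```python
def predicted_final_mark(new_posts, wam):
    """Final unit mark predicted by the fitted regression model
    (coefficients from Table 1; new postings are capped at 5)."""
    return 28.17 + 3.05 * min(new_posts, 5) + 0.51 * wam

# A hypothetical student with a WAM of 70 who made the full five new posts:
mark = predicted_final_mark(new_posts=5, wam=70)  # 28.17 + 15.25 + 35.70
```

Note the cap at five new posts mirrors the plateau observed in the scatter plots; posting more than five new messages adds nothing further under this model.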
None of the student demographic characteristics (age, gender, mode of study and course of
study) were found to be significantly correlated with levels of participation in the discussion
(messages read, new postings and reply postings), suggesting that all students were able to
participate in the online discussion exercise on a generally equal basis. It has been proposed
that the ways in which students engage with online asynchronous discussions will influence
the learning outcomes achieved (Cotton & Yorke, 2006).
The four types of student
engagement with the discussion space identified in Figure 5 were used as a grouping variable
and entered into the multiple regression analysis, but it was not found to be a significant
contributor to final unit result.
Conclusion
A formally assessed online discussion task was introduced into an engineering management
unit delivered in wholly online mode, as a response to a perceived need to better develop
student online communication skills.
While it was qualitatively observed that student participation in unit online discussions
increased significantly compared to previous unit offerings following the introduction of the
formally assessed discussion task, a quantitative examination was undertaken to investigate
the impact of the students' participation in the online discussion on their final unit results.
It was observed that while
many students read a significant number of discussion postings, generally, the posting of new
and reply messages occurred at the minimum level required to qualify for the assignment
marks. Based on new postings to the online discussion, four distinct patterns of posting were
observed. Based on correlation and multiple regression analysis, it was observed that two
measured variables were significantly related to a student’s final unit mark – their weighted
average mark (used as a proxy measure for general prior academic ability) and the number of
new postings that they made to the online discussion. In addition, these two variables were
not significantly correlated with each other, and were both significant in the regression model
obtained, suggesting that both contribute independently to the final unit mark. The regression
model explained more than half of the observed variation in final unit mark, and while it
shouldn’t be interpreted literally as the ‘formula’ that determines a student’s final unit mark, it
does suggest that active participation in the online discussion assignment, through the posting
of reflective contributions based on the course material, made about the same contribution to a
student's final unit mark as their general prior academic ability.
Further, the regression model indicated that each new posting contributed three times as much
to the final unit mark as its nominal assessment value of ‘up to 1 mark per posting’ would
otherwise indicate, suggesting that the work in preparing their new discussion postings
engaged students with the unit material and assisted them in the completion of a range of
assessable tasks for the unit. However, while active contribution to the online discussion in
the form of new posts was a significant factor in the final unit mark, simply reading the posts
of other students was not. The number of postings read was not significantly correlated with
the final unit mark, suggesting that passive ‘lurking’ in this online discussion did not
significantly contribute to student learning outcomes (as measured by final unit mark).
References
Conaway, R. N., Easton, S. S. & Schmidt, W. V. (2005). Strategies for Enhancing Student
Interaction and Immediacy in Online Courses. Business Communication Quarterly, 68(1),
23-35.
Cotton, D. & Yorke, J. (2006, 3-6 December). Analysing online discussions: What are
students learning? Paper presented at the 23rd annual ascilite conference: Who’s learning?
Whose technology?, Sydney University Press, Sydney.
Davies, J. & Graff, M. (2005). Performance in e-learning: online participation and student
grades. British Journal of Educational Technology, 36(4), 657-663.
Garland, K. & Noyes, J. (2004). The effects of mandatory and optional use on students’
ratings of a computer-based learning package. British Journal of Educational Technology,
35(3), 263-273.
Garrison, D. R., Anderson, T. & Archer, W. (1999). Critical Inquiry in a Text-Based
Environment: Computer Conferencing in Higher Education. The Internet and Higher
Education, 2(2/3), 87-105.
Gorsky, P. & Caspi, A. (2005). Dialogue: a theoretical framework for distance education
instructional systems. British Journal of Educational Technology, 36(2), 137-144.
Graham, M. & Scarborough, H. (2001). Enhancing the learning environment for distance
education students. Distance Education, 22(2), 232-244.
Hara, N., Bonk, C. J. & Angeli, C. (2000). Content analysis of online discussion in an applied
educational psychology course. Instructional Science, 28(2), 115-152.
Harasim, L. (1991, 8-11 January). Designs & tools to augment collaborative learning in
computerized conferencing systems. Paper presented at the Twenty-Fourth Annual Hawaii
International Conference on System Sciences, Institute of Electrical and Electronics
Engineers, Kauai, Hawaii.
Holt, D. & Challis, D. (2007). From policy to practice: one university's experience of
implementing strategic change through wholly online teaching and learning. Australasian
Journal of Educational Technology, 23(1), 110-131.
James, R., McInnis, C. & Devlin, M. (2002). Assessing Learning in Australian Universities.
Melbourne, Australia: Centre for the Study of Higher Education and The Australian
Universities Teaching Committee.
Kirkwood, A. & Price, L. (2005). Learners and learning in the twenty-first century: what do
we know about students’ attitudes towards and experiences of information and
communication technologies that will help us design courses? Studies in Higher Education,
30(3), 257-274.
Kuk, G. (2003). E-Learning Hubs: Affordance, Motivation and Learning Outcomes [WWW
document]. Retrieved 10 February, 2007, URL:
http://www.business.heacademy.ac.uk/resources/reflect/conf/2003/kuk/kuk.pdf
Stacey, E. & Rice, M. (2002). Evaluating an online learning environment. Australian Journal
of Educational Technology, 18(3), 323-340.
Webb, E., Jones, A., Barker, P. & van Schaik, P. (2004). Using e-learning dialogues in higher
education. Innovations in Education and Teaching International, 41(1), 93-103.
Weisskirch, R. S. & Milburn, S. S. (2003). Virtual discussion: Understanding college
students’ electronic bulletin board use. Internet and Higher Education, 6(3), 215-225.
Wozniak, H. (2005). Online discussions: Improving the quality of the student experience
[WWW document]. Retrieved 10 February, 2007, URL:
http://www.odlaa.org/events/2005conf/ref/ODLAA2005Wozniak.pdf