Improving performance on a Marketing module through the use of Formative Computer-Assisted Assessment
Miguel Martins, Senior Lecturer – Marketing Department
eLearning Celebration
22 July 2007
This paper was first presented at the 2nd European Conference on The First Year Experience, Gothenburg University, 9th-11th May 2007.
Work in progress
Introduction
The delivery mode of MK1004 – Marketing Principles follows very much the traditional lecture model.
Students have two in-class examinations.
The first one is made up of two parts: an MCQ (Multiple Choice Question) test consisting of 30 questions, and five open questions.
The MCQ test accounts for 25% of the student's final grade for this module.
Introduction
Two main reasons prompted the author to conduct this research.
First, the University of Wolverhampton is moving towards a more virtual, self-directed learning environment through the use of WOLF.
Second, the week before the first examination students are given an in-class formative MCQ test.
Literature review
There are a number of ways in which a VLE can be used to assess students. The most widespread is computer-assisted assessment (CAA).
One of the tools supplied by a VLE is self-assessment, through the use of a databank of questions linked to automated marking and instant feedback; another frequent tool is the tracking of students' use of materials within the VLE. Both facilities are present in WOLF.
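As a rough illustration of how such a self-assessment facility can work, the minimal sketch below draws questions from a small bank, marks the answers automatically and returns instant feedback. The question bank, field names and feedback text are invented for illustration and are not taken from WOLF.

```python
import random

# Invented question bank for illustration; in a VLE such as WOLF the bank
# would be stored and delivered server-side.
QUESTION_BANK = [
    {"question": "Which of the 4 Ps refers to distribution?",
     "options": ["Product", "Price", "Place", "Promotion"],
     "answer": "Place",
     "feedback": "Place covers the channels used to get the product to customers."},
    {"question": "A SWOT analysis examines strengths, weaknesses, opportunities and...?",
     "options": ["Threats", "Trends", "Targets", "Tactics"],
     "answer": "Threats",
     "feedback": "The external half of SWOT covers opportunities and threats."},
]

def run_formative_test(n_questions=2):
    """Ask randomly drawn MCQs, mark each answer and give instant feedback."""
    score = 0
    for item in random.sample(QUESTION_BANK, n_questions):
        print(item["question"])
        for number, option in enumerate(item["options"], start=1):
            print(f"  {number}. {option}")
        choice = int(input("Your answer (number): "))
        if item["options"][choice - 1] == item["answer"]:
            score += 1
            print("Correct.")
        else:
            print(f"Incorrect. {item['feedback']}")
    print(f"Score: {score}/{n_questions}")

if __name__ == "__main__":
    run_formative_test()
```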
Literature review
The correlation between the use of formative CAA and exam results has been researched by several authors, and a positive link has been found.
However, according to Clarke et al. (2004), research into the use of online formative MCQs remains scarce and more studies are needed.
Evaluation Methodology
Reeves (1993) suggests that a combination of quantitative and qualitative approaches is more suitable for assessing learning supported by a VLE.
In order to improve the validity of the study, a number of sources were used: access logs, a survey (with closed and open-ended questions), automated tracking, and analysis of in-class test results.
Evaluation Methodology
Access logs give us the opportunity to examine each student's access patterns: the date, hour and number of times an MCQ test was accessed, and the number of times it was completed.
Automated tracking of results allowed the researcher to find out the marks gained on each test.
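A minimal sketch of the kind of access-pattern summary this makes possible is given below, assuming a hypothetical CSV export of the log with one row per access (student ID, timestamp, test ID, completed flag); the file name and column names are assumptions, not WOLF's actual export format.

```python
import pandas as pd

# Hypothetical export of the VLE access log: one row per access to an online MCQ test.
# Assumed columns: student_id, timestamp, test_id, completed (True/False).
logs = pd.read_csv("mcq_access_log.csv", parse_dates=["timestamp"])

# Per-student access pattern: how often tests were opened and completed, and when.
per_student = logs.groupby("student_id").agg(
    accesses=("test_id", "count"),
    completions=("completed", "sum"),
    first_access=("timestamp", "min"),
    last_access=("timestamp", "max"),
)
per_student["distinct_tests"] = logs.groupby("student_id")["test_id"].nunique()

# Cohort-wide pattern by hour of day.
accesses_by_hour = logs["timestamp"].dt.hour.value_counts().sort_index()

print(per_student.head())
print(accesses_by_hour)
```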
Table 4 – Tracking for first and last day of the online MCQ tests
Total of 3512 page views and 1014 interactions
Evaluation Methodology
Analysis of in-class test results allows for historical comparisons with the last three cohorts of students (2005-06 Semesters 1 & 2 and 2006-07 Semester 1), as well as comparisons within this specific cohort: students who took the online MCQ tests against those who did not, and, within the group that took the tests, formative grades against summative ones.
Evaluation Methodology
Finally, surveys were chosen as a way to gather students' opinions on the usefulness of the online MCQ tasks for their learning and final results.
Questionnaire:
- 38 closed questions and 5 open questions
- paper-based and completed in class after the test
- distributed to 50% of the students enrolled on the module
- a total of 114 questionnaires were received
Qualitative analysis
The overall conclusion is that the online MCQ tests were a valuable learning resource for many students, helping them prepare for the summative in-class test.
“the online MCQ tests motivated me to learn and to keep trying” (86%)
“the online MCQ tests forced me to study more” (81%)
“the marks helped me to assess how I was doing in my learning process” (80%)
“as a method of learning I enjoyed doing the online MCQ tests” (76%)
“the online MCQ tests were useful in revising the content of the module” (76%)
Qualitative analysis
Question 3 asked the students whether they had enjoyed the online MCQ tests as a method of learning. Two students who did not access the programme agreed with the question, and one student who accessed it just once and spent only 4:04 minutes strongly agreed.
One may ask why he did not spend more time on such an enjoyable activity!
Qualitative analysis
Question 4 asked the students whether the online MCQ tests were well organised and structured. One student who accessed the task only once, and for just 15 seconds, agreed.
We are delighted to see that in just 15 seconds the student was still able to appreciate the organisation and structure of the task.
Relying on a questionnaire alone seems, from these findings, to be highly problematic.
Quantitative analysis
The last three cohorts had average grades of 16.58 (sd 3.75), 18.49 (sd 4.1) and 18.04 (sd 4.53).
The current cohort had an average of 16.94 (sd 3.83)*.
* sample of 64 students
But...
Quantitative analysis
If we take into consideration only the students who fully used the online tool (all 12 tests), the average goes up to 19.67 (sd 4.5).
Tests taken      Average   SD     Students (%)
HIGH (9-12)      19.31     3.99   20.3%
MEDIUM (5-8)     17.14     4.19   21.9%
LOW (1-4)        16.25     2.95   37.5%
NIL (0)          15.62     4.05   20.3%
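The banding used in the table above could be reproduced roughly as in the sketch below, assuming a hypothetical per-student file that records the number of online tests taken and the summative grade; the file name and column names are assumptions for illustration.

```python
import pandas as pd

# Hypothetical per-student data; assumed columns: student_id, tests_taken, summative_grade.
students = pd.read_csv("students.csv")

# Band students by how many of the 12 online MCQ tests they took.
bands = pd.cut(students["tests_taken"],
               bins=[-1, 0, 4, 8, 12],
               labels=["NIL (0)", "LOW (1-4)", "MEDIUM (5-8)", "HIGH (9-12)"])

summary = students.groupby(bands, observed=True)["summative_grade"].agg(["mean", "std", "count"])
summary["students_pct"] = 100 * summary["count"] / len(students)
print(summary.round(2))
```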
Figure – Final result on the summative test plotted against the average result on the formative tests (236 observations; R² linear = 0.304).
Model
Figure – Path diagram relating number of page views, number of tests taken, average time spent per test, total time spent on tests and CAA results to the summative result. Reported values include R² = .002 (β = .046*), R² = .066 (β = .257), R² = .103 (β = .321), R² = .002 (β = .041*), R² = .109 (β = .330), R² = .097 (β = .312), R² = .304 (β = .551), R² = .021 (β = .144*) and R² = .291 (β = .539).
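Values of this kind can be obtained from simple one-predictor regressions of the summative result on each usage measure; in a simple regression the standardized β equals the Pearson correlation between predictor and outcome, and R² is its square. The sketch below illustrates this under assumed file and column names, as a rough illustration rather than the study's actual procedure.

```python
import pandas as pd

# Hypothetical per-student dataset; column names are assumed for illustration.
data = pd.read_csv("usage_and_results.csv")
predictors = ["page_views", "tests_taken", "avg_time_per_test",
              "total_time_on_tests", "caa_result"]

for name in predictors:
    # In a one-predictor regression the standardized beta equals the Pearson
    # correlation with the outcome, and R-squared is its square.
    r = data[name].corr(data["summative_result"])
    print(f"{name}: R2 = {r ** 2:.3f}, beta = {r:.3f}")
```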
Conclusion and further research
Work in progress
If you would like to receive a copy of the final report, please email me at:
[email protected]
Thank you