Evaluating the usability of the BlueJ environment.
Marilyn Barallon
Supervisor: Dr. Linda McIver
Outline.
What is BlueJ?
What is usability?
How we measure usability.
What has been done in the past.
Studying BlueJ.
What we found: a ‘picture’ of what students are doing with BlueJ.
Conclusion
Further Work
What is BlueJ?
A visual introductory programming environment to help
teach JAVA and OO.
Based on the Blue system developed at the University of Sydney and Monash University.
Currently being maintained by Deakin University,
Australia and the University of Kent, UK.
Currently being used by 412 institutions worldwide.
Encourages an ‘objects first’ approach to teaching Object
Orientation.
What is BlueJ? (cont’d).
Designed to be simple to use:
A simple main window:
UML picture of the program
‘Workbench’ area that displays objects that can be interacted
with.
A simple debugger:
Stack window
Instance, local and static variable windows
State of the program window (running, finished)
A simple syntax-directed editor
What is usability?
Subjective
There are three main views from which usability can be observed and assessed (Bevan and Kirakowski et al.):
Product-oriented view: features of the product.
User-oriented view: mental effort and attitude of the user.
User-performance view: how the user interacts with the product.
Our research focuses on an assessment of
BlueJ’s usability based on the last two views.
Measuring Usability?
Evaluation frameworks (examples):
The Price et al. framework includes 6 broad, general categories:
Scope
Content
Form
Method
Interaction
Effectiveness
The Phillips and Mehandjiska et al. framework includes 5 categories, each of which is further broken down into specific subcategories:
Learnability
Efficiency
Flexibility
Robustness
Feedback
What has been done?
Only formal evaluations so far: surveys and data
loggers.
Problems with survey data:
Requires students to reflect on past interactions
Requires students to rank thoughts and/or feelings according to their own judgement of what is ‘very useful’.
Students have never really used anything else, so how would they know? (‘How well did it help you learn JAVA?’)
Problems with program logger data:
Fails to tell us what was happening between compilations, or what the ‘substantial edits’ were.
Understanding the user.
Usability assessment requires understanding of
users.
Surveys and data loggers do not provide a
mechanism for understanding users.
Video-taped ‘think-out-loud’ observational
experiments
Directly observe and analyse ‘what’ students are doing (their natural behaviour).
And ask ‘why’ at the time.
‘Discuss’ problems they encounter at the time (JAVA or interface problems).
Create an overall ‘picture’ of BlueJ user behaviour.
Problems with Observational Experiments.
Researcher Bias
Reactive behaviour
Limited degree to which we can generalise
behaviour to other people.
To mitigate these risks, long-term observational studies and training could be conducted.
Studying BlueJ
Observations of students using BlueJ.
Better understanding of users.
A first step towards a comprehensive
usability study.
Detailed picture of student understanding
of BlueJ and JAVA.
Studying BlueJ (cont’d)
Observations involved debugging a program
using BlueJ.
Two-hour video-taped ‘think-out-loud’ sessions, including a short questionnaire.
Conducted 4 pilot study experiments with final year students (1 from software engineering, 2 from digital systems, 1 from computer science).
Conducted 5 final experiments with 2nd years (Computing degrees) who have completed cse1203 - Programming with JAVA 2.
The Program.
The program draws pictures using graphical output, making errors easy to ‘see’.
Students were first shown what the correct
output of the program should look like.
The program contained 5 semantic bugs (one possible example is sketched below).
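The actual study program is not reproduced here; the following is a minimal sketch, assuming a small Swing drawing class (the StairsPanel name and the staircase picture are invented for illustration), of what one seeded semantic bug of this kind could look like. The code compiles and runs either way, but with the buggy loop bound the drawn picture is visibly wrong.

import javax.swing.JFrame;
import javax.swing.JPanel;
import java.awt.Graphics;

// Hypothetical sketch only: a small drawing class of the kind used in the
// study, with one possible seeded semantic bug shown in a comment.
public class StairsPanel extends JPanel {
    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        // Correct version draws 5 steps; the seeded bug 'step <= 5' would
        // still compile and run, but the picture gains an extra, misplaced step.
        for (int step = 0; step < 5; step++) {
            int x = 20 + step * 30;
            int y = 170 - step * 30;
            g.drawLine(x, y, x + 30, y);           // horizontal tread
            g.drawLine(x + 30, y, x + 30, y - 30); // vertical riser
        }
    }

    public static void main(String[] args) {
        JFrame frame = new JFrame("Staircase");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(new StairsPanel());
        frame.setSize(240, 240);
        frame.setVisible(true);
    }
}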
Problem Solving Abilities.
Students did not use BlueJ to explore the program.
Minimal exploration of the program via the workbench.
Jumped straight into the code.
Lacked independent exploration and testing
abilities.
Sought guidance whenever they could.
As a result, poor establishment and refinement of hypotheses about the behaviour of the errors.
Can students use the debugger?
Only one student could use the debugger.
Common misconceptions and difficulties:
Setting breakpoints on non-executable lines of code (see the sketch below).
Instantiation of the debugger window (which method to call).
Did not understand how the debugger traces the program.
Should I set one on each line?
Will it go from here to there?
Can I set a breakpoint on each line in this class and use the
debugger to step through this class?
Can I move backwards?
These problems were seen in both final year and 2nd year students.
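As a hedged illustration of the breakpoint misconception, the Counter class below is an invented example, not code from the study program. In BlueJ, as in most Java debuggers, a line breakpoint only takes effect on a line containing an executable statement, and stepping moves forward through the flow of execution, never backwards.

// Hypothetical example of where a breakpoint can and cannot usefully be set.
public class Counter {
    private int count;              // declaration only: nothing executes on this
                                    // line, so a breakpoint set here is never reached

    // a comment line is also non-executable, so a breakpoint here has no effect

    public void increment() {
        count = count + 1;          // executable statement: a breakpoint here pauses
                                    // the program each time increment() is called
        System.out.println(count);  // another executable line; stepping moves forward
                                    // from one executable statement to the next
    }

    public static void main(String[] args) {
        Counter c = new Counter();
        c.increment();              // stepping into this call visits each executable
                                    // line of increment() in order
    }
}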
Common Debugging Strategies.
All students used code comparison at least once.
Students appeared to edit ‘suspicious’ statements for no apparent reason.
We hypothesise that students were selecting particular statements to change based on the unfamiliar ways in which they were being used (see the sketch below):
For example, in a loop condition, changing ‘<’ to ‘<=’.
For example, in a loop structure, changing count += 1 to count += 2.
Interestingly, most students expressed print statements
as their preferred testing strategy.
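A minimal sketch of these behaviours, using an invented loop rather than code from the study program (the LoopTrace name and the values are illustrative assumptions): the comments mark the kinds of ‘suspicious statement’ edits students tried, and the println calls show the print-statement tracing most students said they preferred over the debugger.

public class LoopTrace {
    public static void main(String[] args) {
        int total = 0;
        // Edits students tried without a clear hypothesis:
        //   changing 'count < 5' to 'count <= 5', or 'count++' to 'count += 2'
        for (int count = 0; count < 5; count++) {
            total += count;
            // Print-statement tracing: the testing strategy most students preferred
            System.out.println("count = " + count + ", total = " + total);
        }
        System.out.println("final total = " + total);
    }
}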
Problems with BlueJ.
Compiler and JAVA exception error messages.
Failed to take notice of them.
Failed to understand what they were telling them.
Status bar:
Students failed to use it to distinguish between when the machine is busy and when it is free.
‘Remove’ option on the workbench:
A problem for ‘new’ users learning OO, language syntax and BlueJ.
Conclusion.
Final year and 2nd year students do not possess independent debugging and problem-solving ability.
Students cannot use the debugger.
This was despite students being:
Shown the correct workings of the program beforehand.
Given graphical output that makes it easy to understand ‘what’ is happening.
Able to ask questions at any time.
Conclusion (cont’d)
Lack of understanding of JAVA and OO.
Find ways of teaching and developing debugging and problem-solving ability.
Enforcing use of the debugger.
Need to make students aware of how it facilitates exploration.
The way BlueJ handles its error messages, displays its status bar and presents the object workbench needs to be investigated further for possible usability issues and redesign.
Further Work.
We can now move forward and construct a usability
framework specific to BlueJ.
Redesign the way error messages are displayed:
Relocate the error message area to another section of the screen.
Remove the error message area and use pop-up windows.
Keep the error message area and better direct the user’s attention to it.
Further investigation to determine what ‘new’ BlueJ
users think the workbench ‘remove’ option does.
Relocate the status bar so that it encourages students
to use it to determine when they should perform their
next action.