Using Technology to Assess Students’
Web Expertise
Davina C. D. Klein
CRESST/UCLA
Louise Yarnall
Center for Technology in Learning, SRI International
Christina Glaubke
CRESST/UCLA
Paper presented at AERA
New Orleans—April 2000
Technology Focus: WWW

• Over 1.5 billion Web pages currently available, increasing at rate of 1.9 million pages per day
• At least 40% of U.S. classrooms linked to Internet
  - With 9 million children using Internet at school
• Most common classroom use for WWW is research
  - 3rd most common use of computers at school
  - Has surpassed drill-and-practice software
• Nearly 90% of teachers perceive classroom WWW access as valuable or essential for their teaching
Research Approach

• Study experienced Web users
• Identify set of measures to be used to assess students' fluency with the World Wide Web
• Web Expertise Assessment (WEA)
  - Web-based, authentic performance assessment
  - Tasks require students to navigate through large information space searching for relevant information and bookmarking relevant findings
  - All measures logged and coded (see the sketch below)
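The logging and coding step is the technical core of the WEA. Below is a minimal sketch of how such a click-by-click log might be tallied into the measures named on later slides; the event format, field names, and tallying are assumptions for illustration, not the actual WEA instrumentation.

```python
# Hypothetical sketch of a WEA-style navigation log and the counts derived from it.
# The event format and field names are assumptions, not the actual WEA instrumentation.
from collections import Counter

# One logged event per student action (invented, illustrative data).
log = [
    {"action": "search",   "query": "inaugural address free education"},
    {"action": "visit",    "page": "speeches/hayes_1877.html"},
    {"action": "bookmark", "page": "speeches/hayes_1877.html"},
    {"action": "back"},
    {"action": "visit",    "page": "speeches/index.html"},
]

counts = Counter(event["action"] for event in log)
measures = {
    "steps_in_search":  len(log),        # every logged action counted as a step
    "back_uses":        counts["back"],  # "number of times back used"
    "searches_run":     counts["search"],
    "pages_bookmarked": counts["bookmark"],
}
print(measures)
```

Bookmarked pages would then be hand-coded for relevance, which is where the 0-3 bookmark scores reported on later slides come from.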
WEA Search Task

• Imagine you are learning about the U.S. presidents in your history class. Your teacher has asked you to write a report about what presidents said during their speeches when first elected to office. She has asked you to find out which presidents spoke of the importance of an educational system available to all without charge.
  - Use WEA to find this information for your report.
  - Find as many useful pages as you can.
    - Bookmark pages by clicking on the Add Bookmark button near the top of your screen.
    - You may bookmark as many useful pages as you think necessary.
WEA Background Questionnaire

• Used to evaluate students' WWW background knowledge
  - Paper-and-pencil survey
  - Students rated statements on a scale of 1 ("I really don't agree") to 5 ("I really agree")
    - e.g., "The information on the World Wide Web is not very useful"
Participants

• 120 middle and high school students
  - Students had strong technology background
  - Students had access to WWW in class
  - Students were familiar with navigating the World Wide Web
Methods

• Students completed WWW background questionnaire
• Students trained on WEA
• Students given 20 minutes to complete WEA search task
  - Reminded to bookmark relevant pages
• Data coded with high interrater reliabilities (ranging from .97 to .99; see the sketch below)
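The slides give interrater reliabilities of .97 to .99 but not the statistic behind them. A minimal sketch follows, assuming reliability here is a simple correlation between two coders' scores; the scores themselves are invented.

```python
# Hypothetical interrater check: correlate two coders' 0-3 bookmark scores.
# Whether WEA used correlations, percent agreement, or kappa is not stated;
# the correlation below is an illustrative assumption, and the data are invented.
import numpy as np

coder_a = np.array([3, 2, 2, 0, 1, 3, 2, 1])
coder_b = np.array([3, 2, 1, 0, 1, 3, 2, 1])

r = np.corrcoef(coder_a, coder_b)[0, 1]
print(f"interrater correlation: {r:.2f}")
```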
Evaluating WEA Performance

Item                            | F1   | F2   | F3   | F4
--------------------------------|------|------|------|------
Number of times back used       | .97  |      |      |
Number of steps in search       | .92  |      |      |
Number of revisited info pages  | .76  |      |      |
Average bookmark score          |      | .92  |      |
Quality of bookmark set         |      | .87  |      |
Efficiency of search            |      | .85  |      |
Web helpful with search         |      |      | .81  |
Web info useful                 |      |      | .80  |
Web info correct                |      |      | .73  |
Web info detailed               |      |      | .69  |
Number of good searches         |      |      |      | .91
Quality of keyword search set   |      |      |      | .77
Number of redirected searches   |      |      |      | .66
Eigenvalues                     | 2.54 | 2.37 | 2.34 | 2.00
Percent variance explained      | 19.5%| 18.2%| 18.0%| 15.4%

Note. Only factor loadings with absolute values of .30 and higher shown.
F1 = navigational strategies (α = .88); F2 = finding ability (α = .86); F3 = background Web knowledge (α = .76); F4 = searching expertise (α = .71).
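For readers who want the shape of the analysis behind this table, here is a minimal sketch of extracting a four-factor solution from the 13 measures. The use of scikit-learn with varimax rotation is an assumption, not the procedure reported in the paper, and the data matrix is a random stand-in for the 120 students' scores.

```python
# Hypothetical re-creation of the factor-analysis step: four factors from 13 measures.
# scikit-learn with varimax rotation is an assumed tool choice; X is a random
# stand-in for the real 120-student x 13-measure data matrix.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 13))   # placeholder data: 120 students x 13 measures

X_std = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=4, rotation="varimax").fit(X_std)

loadings = fa.components_.T      # shape (13 measures, 4 factors)
print(np.round(loadings, 2))     # a report would show only loadings >= .30, as above
```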
Creating WEA Scales

• Navigational strategies (α = .88; see the reliability sketch below)
  - Number of times back button used
  - Number of steps in search
  - Number of revisited information pages
• In general, students navigated well
  - Students used back often (M = 17) for orientation
  - Students revisited over one third of info pages visited, orienting themselves in Web space (M = 4)
  - Students completed many steps (M = 93)
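The scale reliabilities quoted on these slides (.88, .86, .76, .71) read like Cronbach's α. Assuming that is the statistic, here is a minimal sketch of the calculation; the student data are invented, and standardizing the items first is an added assumption since the three navigational measures are on very different scales.

```python
# Cronbach's alpha for a k-item scale:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(item total))
# Assumes the WEA scales were checked this way; the data below are invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: array of shape (n_students, k_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# e.g., the three navigational-strategies measures for five invented students
nav = np.array([[12,  80, 3],
                [20, 100, 5],
                [17,  95, 4],
                [25, 110, 6],
                [ 9,  70, 2]], dtype=float)
nav_std = (nav - nav.mean(axis=0)) / nav.std(axis=0, ddof=1)   # put items on one scale
print(round(cronbach_alpha(nav_std), 2))
```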
Creating WEA Scales (cont.)

• Finding ability (α = .86)
  - Average bookmark score
  - Quality of bookmark set
  - Efficiency of search
• In general, students able to find information
  - Average bookmark peripherally relevant to task (M = 1.8 on 0-3 scale)
  - Quality of bookmark response set good (M = 1.8 on 0-3 scale)
  - About one fifth of pages bookmarked appropriately (efficiency M = .21)
Creating WEA Scales (cont.)

• Background Web knowledge (α = .76; see the scoring sketch below)
  - WWW helpful in finding information
  - Information on WWW not very useful (reverse-coded)
  - Information on WWW is accurate/correct
  - Not a lot of detailed information on WWW (reverse-coded)
• In general, students familiar with Web (mean ratings on the 1-to-5 scale)
  - Students agreed that WWW is helpful in finding information (4.2)
  - Students disagreed that information on WWW is not useful (2.1)
  - Students were neutral/agreed that information on WWW is accurate (3.1)
  - Students disagreed that there is not a lot of detailed or in-depth information on WWW (2.3)
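Two of the questionnaire items above are negatively worded and reverse-coded. A common convention is to flip such items on the 1-to-5 range before combining them; the snippet below sketches that convention, and treating the scale as a simple mean of the four items is an assumption for illustration, not the paper's documented scoring.

```python
# Reverse-code negatively worded items on a 1-5 Likert scale (score -> 6 - score),
# then average the four items into a background-Web-knowledge score.
# The simple-mean scoring is an assumption for illustration.
def background_knowledge_score(helpful, not_useful, accurate, not_detailed):
    reverse = lambda x: 6 - x          # flips 1 <-> 5 and 2 <-> 4 on a 1-5 scale
    items = [helpful, reverse(not_useful), accurate, reverse(not_detailed)]
    return sum(items) / len(items)

# Using the mean ratings reported on the slide (4.2, 2.1, 3.1, 2.3):
print(background_knowledge_score(4.2, 2.1, 3.1, 2.3))   # 3.725
```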
Creating WEA Scales (cont.)

• Searching expertise (α = .71)
  - Number of good searches
  - Quality of keyword search set
  - Number of redirected searches
• In general, students had difficulty searching (consistent with literature)
  - Number of good searches low (M = 2.0)
  - Quality of keyword search set rather poor (M = 1.7 on 0-3 scale)
  - Students redirected searches, browsing search output before selection (M = 2.3)
Results

• Work explored constructs underlying students' Web fluency
• Students demonstrated their abilities to search, navigate, and find information
• Analyses identified important individual measures, and we coded these reliably
• Composite indicators from factor analysis make sense theoretically
Next Steps

• Future research will focus on validity of assessment
  - Can we distinguish between expert and novice Web users with WEA?
  - Is WEA sensitive to instruction?
• Once validity established, use WEA to examine effects of Internet usage
• Link assessment and instruction
  - Create guidelines for teaching the Web effectively
For More Information

• Visit our Web site at:
  - http://www.cse.ucla.edu/CRESST/pages/aera00.htm
• Available:
  - Overheads of this presentation
  - Full paper
  - And much, much more...