
Automated Benchmarking Of
Local Authority Web Sites
Brian Kelly
UK Web Focus
UKOLN
University of Bath
Bath, BA2 7AY
Email: [email protected]
URL: <http://www.ukoln.ac.uk/>
Contents
• Background
• WebWatch Project
• Current Approach
• Pilot UK Local Authority Survey
• Other Approaches
• Discussion
• Conclusions and Recommendations
The Problem
Background
• Local and central government organisations are
developing Web-based services
• There is a need to audit the services in order to
measure compliance with standards and guidelines,
coverage, usability, etc.
Aim Of This Talk
• This talk describes experiences in the use of Web-based auditing services and summarises the benefits and limitations of this approach
NOTE
• The talk does not provide detailed results of a survey of UK local government Web sites
• The talk does not cover manual evaluation of Web sites
Web Site Benchmarking
Why benchmark Web sites?
• To monitor compliance with standards and
community guidelines
• To inform the Web community (e.g. W3C) on the
uptake of Web standards and protocols
• To monitor developments across a community
• To allow funders to observe developments
• To allow members of a community to see how the
community is developing and how they compare
with the community
Benchmarking Examples
Examples:
 Do local government Web sites comply with W3C
WAI guidelines?
 How large are the entry points to local government
Web sites?
 Do the Web sites comply with HTML, CSS, XML,
etc. standards?
 Does it appear, for example, that the importance of accessibility and standards compliance has been accepted, or does compliance seem too difficult to achieve?
WebWatch Project
WebWatch project:
• Funded for one year by British Library
• Started in 1997
• Software developer recruited
• Development and use of robot software to monitor Web sites across communities
• Several surveys carried out:
 UK Public Library Web sites
 UK University Web sites
 …
• See <http://www.ukoln.ac.uk/web-focus/webwatch/reports/>
WebWatch Mark II
By 1999:
• Funding had finished
• Software developer left
• Realisation that:
 Development of in-house software was expensive
 Web site auditing tools were becoming available
 Web site auditing Web services were becoming
available
• Since 1999:
 Use of (mainly) freely available Web services to
benchmark various public sector Web communities
 Regular columns in Ariadne e-journal
<http://www.ariadne.ac.uk/>
 Experience gained in issues of Web site
benchmarking
Benchmarking Web Sites
Bobby is an example of a Web-based benchmarking service which provides information on compliance with W3C WAI guidelines.
<http://www.cast.org/bobby/>
Use Of The Services
The benchmarking Web sites are normally designed for interactive (manual) use.
However, the input to the Web sites can be managed automatically, which speeds up the submission process (a sketch is given below).
It would be possible to automate processing of the results, but this hasn't (yet) been done:
• Lack of software developer resources
• Quality of output needs to be determined
• It should be the responsibility of the service provider to provide output in a reusable format
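As an illustration, a minimal sketch of automating the submission step is given below (Python, standard library only). The checker address and its "url" query parameter are placeholders: each real service (Bobby, NetMechanic, etc.) has its own interface and terms of use, and the returned HTML reports are simply saved for later manual inspection.

```python
# Minimal sketch of automating submission to a Web-based checking service.
# The checker address and its "url" query parameter are placeholders --
# each real service (Bobby, NetMechanic, ...) has its own interface and
# terms of use. The returned HTML reports are saved for later (manual)
# analysis; processing of the results is not automated here.
import urllib.parse
import urllib.request

CHECKER = "http://checker.example.org/check"   # hypothetical service
ENTRY_POINTS = [
    "http://www.ukonline.gov.uk/",
    "http://www.bathnes.gov.uk/",              # hypothetical survey list
]

for i, site in enumerate(ENTRY_POINTS):
    query = urllib.parse.urlencode({"url": site})
    with urllib.request.urlopen(f"{CHECKER}?{query}") as resp:
        report = resp.read()
    with open(f"report-{i:03d}.html", "wb") as out:
        out.write(report)
    print(f"{site}: report of {len(report)} bytes saved")
```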
Displaying Results
The input to the benchmarking Web services and a summary of the results are provided as a Web resource.
This provides:
• Openness of methodology
• Ability to compare your Web sites with those published
Use of Bobby
Bobby analysis of <http://www.ukonline.gov.uk/>:
Analysis of UKOnline appears to show a compliant site, 0.5 K in size.
Examination shows that this is an analysis of a redirect page.
Analysis of the destination shows lack of compliance with WAI guidelines and a size of 1.17 K.
Further examination shows that this is an analysis of a frames page. Analysis of the individual frames shows:
• A file size of 24.8 K for one frame
• The other frame could not be analysed due to lack of support for cookies in Bobby
Benchmarking Services (2)
NetMechanic is another example of a Web-based Web site testing service.
It can check:
• Links
• HTML and browser compatibility
• File sizes
• …
<http://www.netmechanic.com/>
Some Issues
Bobby and NetMechanic may give different results for the same site (see the sketch after the list below).
This may be due to:
This may be due to:
• Analysis vs following redirects
• Analysis of frameset page but not individual frame
pages
• Not analysing images due to Robot Exclusion
Protocol
• Differences in covering external resources such as
JavaScript files, CSS, etc.
• Splash screens
• …
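A rough sketch of how such differences arise is given below (Python, standard library only; the frame handling is deliberately naive). A tool that analyses only the URL it is given reports on the redirect or frameset page itself, whereas one that follows the redirect and descends into the frames reports on the real content, as in the UKOnline example earlier.

```python
# Sketch: the page a checker actually analyses depends on whether it
# follows redirects and descends into framesets. urllib follows HTTP
# redirects by default; the frame extraction below is a naive regular
# expression, with no handling of unusual markup.
import re
import urllib.request
from urllib.parse import urljoin

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.geturl(), resp.read().decode("latin-1", "replace")

entry = "http://www.ukonline.gov.uk/"          # example from the talk
final_url, html = fetch(entry)
print(f"requested {entry}, analysed {final_url}, {len(html)} bytes")

# If the destination is a frameset, the meaningful content is in the
# individual frames, which must be fetched and analysed separately.
for src in re.findall(r'<frame[^>]+src=["\']?([^"\' >]+)', html, re.I):
    frame_url = urljoin(final_url, src)
    _, frame_html = fetch(frame_url)
    print(f"  frame {frame_url}: {len(frame_html)} bytes")
```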
Benchmarking Sites
It is possible to benchmark entire Web sites and not just individual pages such as entry points:
• Nos. of links to the Web site
• Nos. of pages indexed
• Relationships with other Web sites
• …
You can also measure server availability and uptime (e.g. using Netcraft); a crude availability check is sketched below.
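Netcraft derives uptime from lower-level server data; a much cruder check, run on a schedule, can be scripted directly, as in the sketch below (Python, standard library only; the target URL is illustrative).

```python
# Very crude availability probe: record whether an entry point responds
# and how long the response takes. This is not how Netcraft measures
# uptime; it is only a scheduled "is the server answering?" check whose
# log can be summarised later.
import time
import urllib.request

def probe(url, timeout=10):
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, time.time() - start
    except OSError as exc:                     # covers URLError and timeouts
        return f"error: {exc}", time.time() - start

status, elapsed = probe("http://www.bathnes.gov.uk/")   # illustrative target
print(f"{status}\t{elapsed:.2f}s")
```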
Standard Files
It is also possible to analyse a number of
standard Web sites files:
• The robots.txt file
 Has one been created (to stop robots from indexing, say, pre-release information)?
 Is it valid?
• The 404 error page
 Has a tailored 404 page been created or is the
server default one used?
 Is it rich in functionality (search facility, links to
appropriate help information, etc.)?
Note that manual observation of the functionality of these files is currently needed; the fetching side of the checks can be automated, as sketched below.
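The sketch below (Python, standard library; the site and the "missing" URL are illustrative) reads robots.txt and requests a page that is assumed not to exist, leaving the judgement of the 404 page's functionality to the human reader.

```python
# Sketch of automating the two checks above: fetch robots.txt and see
# what it allows, then request a URL that (we assume) does not exist and
# see whether the server returns a 404. Judging the *quality* of a
# tailored 404 page (search box, help links) still needs manual review.
import urllib.error
import urllib.request
import urllib.robotparser

site = "http://www.bathnes.gov.uk"             # illustrative target site

rp = urllib.robotparser.RobotFileParser(site + "/robots.txt")
rp.read()                                      # empty/missing file => everything allowed
print("robots.txt allows indexing of /:", rp.can_fetch("*", site + "/"))

probe = site + "/this-page-should-not-exist-xyz"
try:
    urllib.request.urlopen(probe)
    print("no 404 returned -- possible 'soft 404' page")
except urllib.error.HTTPError as err:
    body = err.read().decode("latin-1", "replace")
    print(f"status {err.code}, error page is {len(body)} bytes")
```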
Market For Benchmarking
There is increasing interest in Web site benchmarking:
• Industry (search for “Web auditing”)
• Consortia, e.g. see the SOCITM “Will you be Better Connected in 2001?” service at <http://www.socitm.gov.uk/mapp/mapdf/Web_inner.pdf>:
 Visual impairment rating
 12-page report about your site
 Recommendations for improving site
 £495 (subscribers) or £950 for survey
Who Does The Work And Why?
Who should benchmark?
• Community itself (e.g. national association)
 But how self-critical can it be?
• The funders
 But will they take on-board the complexities?
• Neutral body
 But is there an obvious body to do the work?
What is the purpose of the benchmarking?
• Is it linked to funding, with penalty clauses for noncompliance?
• Is it to support the development of the community,
by highlighting best practices?
Technical Issues
Web Services
• There is a need to move on from interactive Web sites to services designed for machine use
• There may be a role for a “Web Service” approach in
which a rich set of input can be provided (e.g. using
SOAP).
EARL
• There is a need for a neutral and reusable output
format from benchmarking services
• W3C’s EARL (Evaluation and Reporting Language)
may have a role to play
• As EARL is based on RDF, it should be capable of describing the benchmarking environment in a rich and machine-understandable way
• See <http://www.w3.org/WAI/ER/IG/earl.html>; a sketch of an EARL-style record follows
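As a sketch of the kind of machine-readable output EARL makes possible, the fragment below uses the rdflib library and term names taken from a later W3C EARL draft (earl:Assertion, earl:subject, earl:test, earl:result, earl:outcome); the vocabulary in the draft current at the time of this talk differed, so the names here are illustrative only, and the test URI is a placeholder.

```python
# Sketch of an EARL-style assertion built with rdflib. The namespace and
# property names follow a later W3C EARL draft (http://www.w3.org/ns/earl#);
# the 2001 draft used different identifiers, so treat this vocabulary as
# illustrative. The test URI is a placeholder, not a real checkpoint URI.
from rdflib import BNode, Graph, Namespace, RDF, URIRef

EARL = Namespace("http://www.w3.org/ns/earl#")
g = Graph()
g.bind("earl", EARL)

assertion, result = BNode(), BNode()
g.add((assertion, RDF.type, EARL.Assertion))
g.add((assertion, EARL.assertedBy, URIRef("http://www.ukoln.ac.uk/")))
g.add((assertion, EARL.subject, URIRef("http://www.ukonline.gov.uk/")))
g.add((assertion, EARL.test, URIRef("http://example.org/tests/wai-checkpoint-1.1")))
g.add((assertion, EARL.result, result))
g.add((result, RDF.type, EARL.TestResult))
g.add((result, EARL.outcome, EARL.failed))

print(g.serialize(format="turtle"))            # reusable, machine-readable report
```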
Recommendations (1)
Standards Bodies (e.g. W3C & Equivalent)
• There is a clear need for rigorous definitions to assist in Web auditing, in order to ensure that valid comparisons can be made across auditing services
• Examples:
 Definitions of a “page”
 Files which should be analysed
 How to handle robot exclusion protocol
 User-agent view
Recommendations (2)
Applications Developers
• There is a need to ensure that Web-based benchmarking services can be tailored and that their output can be reused (a sketch of such control follows this list)
• Benchmarking services should be capable of
emulating a range of user agents
• Benchmarking services should provide user control
over compliance with the Robot Exclusion Protocol
• Benchmarking services should provide user control
over definitions of files to be analysed
• Benchmarking services should provide user control
over the definition of a page (e.g. include redirected
pages, sum results of original and redirected page,
etc.)
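As an illustration of the kind of user control being recommended (Python standard library; function and parameter names are illustrative), the sketch below makes the User-Agent string and compliance with the Robot Exclusion Protocol explicit parameters of the fetch rather than fixed behaviour of the tool.

```python
# Illustrative sketch of the user control recommended above: the
# User-Agent string presented to the server and whether robots.txt is
# honoured are parameters of the fetch, not fixed behaviour of the tool.
import urllib.request
import urllib.robotparser
from urllib.parse import urljoin, urlsplit

def fetch_for_audit(url, user_agent="Mozilla/4.0 (emulated)", obey_robots=True):
    if obey_robots:
        root = "{0.scheme}://{0.netloc}".format(urlsplit(url))
        rp = urllib.robotparser.RobotFileParser(urljoin(root, "/robots.txt"))
        rp.read()
        if not rp.can_fetch(user_agent, url):
            return None                        # excluded by robots.txt
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# The same page can then be audited as different user agents would see it.
page = fetch_for_audit("http://www.bathnes.gov.uk/", user_agent="Lynx/2.8")
```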
Recommendations (3)
There are benefits to communities in monitoring trends and sharing the best practices identified in benchmarking work
• Let’s share the results and issues across our related
communities
• Let’s share the approaches to benchmarking across
bodies involved in benchmarking
Questions
Any questions?