
Approaches To Automated
Benchmarking Of Public
Sector Web Sites
Brian Kelly
UK Web Focus
UKOLN
University of Bath
Bath, BA2 7AY
UKOLN is supported by:
Email
[email protected]
URL
http://www.ukoln.ac.uk/
Contents
• Background
• WebWatch Project
• Current Approach
• Pilot UK Local Authority Survey
• Other Approaches
• Discussion
• Conclusions and Recommendations
The Problem
Background
• Local and central government organisations are
developing Web-based services
• There is a need to audit the services in order to
measure compliance with standards and guidelines,
coverage, usability, etc.
Aim Of This Talk
• This talk describes experiences in the use of Web-based auditing services and summarises the benefits and limitations of this approach
NOTE
• The talk does not provide detailed results of a survey of UK
public sector Web sites although a summary of a pilot is given
• The talk does not cover manual evaluation of Web sites
Web Site Benchmarking
Why benchmark Web sites?
• To monitor compliance with standards & guidelines
• To monitor trends and developments across a
community
• To allow funders to observe developments
• To allow members of a community to see how the community is developing and how they compare with it
• To inform the Web community on the uptake of Web
standards and protocols e.g.
- inform W3C on extent of compliance with WAI guidelines across large communities
- inform e-Government on take-up of e-GIF standards
Benchmarking Examples
Examples:
- Do local government Web sites comply with W3C WAI guidelines?
- How large are the entry points to local government Web sites?
- Do the Web sites comply with HTML, CSS, XML, etc. standards?
- Do the Web sites work?
- Does it appear, for example, that awareness of the importance of accessibility and standards compliance has been accepted, or does it seem too difficult to achieve compliance?
WebWatch Project
WebWatch project:
• Funded for one year by the British Library
• Started in 1997
• Software developer recruited
• Development and use of robot software to monitor Web sites across communities
• Several surveys carried out:
- UK Public Library Web sites
- UK University Web sites
- …
• See <http://www.ukoln.ac.uk/web-focus/webwatch/reports/>
WebWatch Mark II
By 1999:
• Funding had finished
• Software developer left
• Realisation that:
- Development of in-house software was expensive
- Web site auditing tools were becoming available
- Web site auditing Web services were becoming available
• Since 1999:
- Use of (mainly) freely available Web services to benchmark various public sector Web communities
- Regular columns in Ariadne e-journal <http://www.ariadne.ac.uk/> (list at <http://www.ukoln.ac.uk/web-focus/webwatch/reports#latest/>)
- Experience gained in issues of Web site benchmarking
Benchmarking Web Sites
Bobby is an example of a Web-based benchmarking service which provides information on compliance with W3C WAI guidelines
http://www.cast.org/bobby/
Use Of The Services
The benchmarking Web sites are normally designed for
interactive (manual) use
However, the input to the Web sites can be managed automatically, which speeds up the submission process
It would be possible to automate processing of the
results, but this hasn’t (yet) been done:
• Lack of software developer resources
• Quality of output needs to be determined
• It should be the responsibility of the service provider
to provide output in reusable format
Displaying Results
The input to the benchmarking Web services and a summary of the results are provided as a Web resource.
This provides:
• Openness of methodology
• Ability to compare your Web sites with those published
Technique Used
• Use the Web service on a site
• Copy the results URL into a template
• Determine the URL structure
• Use as the basis for use with other URLs, as scripted below
http://bobby.cast.org/bobby/bobbyServlet?URL=http%3A%2F%2Fwww2.brent.gov.uk%2F&output=Submit&gl=wcag1-aaa
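The URL structure above lends itself to scripting. A minimal sketch using Python's standard library; the Bobby query string is taken from the slide, while the list of target sites is illustrative:

```python
from urllib.parse import quote

# Query string structure taken from the Bobby results URL on the slide
BOBBY_TEMPLATE = ("http://bobby.cast.org/bobby/bobbyServlet"
                  "?URL={url}&output=Submit&gl=wcag1-aaa")

# Illustrative list of sites to audit
sites = [
    "http://www2.brent.gov.uk/",
    "http://www.dundeecity.gov.uk/",
]

for site in sites:
    # URL-encode the address before substituting it into the template
    print(BOBBY_TEMPLATE.format(url=quote(site, safe="")))
```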
Use of Bobby
Bobby analysis of <http://www.ukonline.gov.uk/>:
Analysis of UKOnline appears to show a compliant site, 0.5K in size. Examination shows that this is an analysis of a Redirect page.
Analysis of the destination shows lack of compliance with WAI guidelines and a size of 1.17K. Further examination shows that this is an analysis of a Frames page. Analysis of the individual frames shows:
• A file size of 24.8K for one frame
• The other frame could not be analysed due to lack of support for cookies in Bobby
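The redirect and frameset pitfalls described above can be detected before a page is submitted for analysis. A minimal sketch using only Python's standard library; this is an illustration, not part of the original survey tooling:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class FrameFinder(HTMLParser):
    """Collect the src attributes of any <frame> elements."""
    def __init__(self):
        super().__init__()
        self.frames = []

    def handle_starttag(self, tag, attrs):
        if tag == "frame":
            self.frames.extend(value for name, value in attrs if name == "src")

url = "http://www.ukonline.gov.uk/"   # the site analysed on this slide
response = urlopen(url)               # urlopen follows HTTP redirects
if response.geturl() != url:
    print("Redirect page: final URL is", response.geturl())

finder = FrameFinder()
finder.feed(response.read().decode("utf-8", errors="replace"))
for frame in finder.frames:
    print("Frame to analyse separately:", frame)
```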
Benchmarking Services (2)
NetMechanic is another example of a Web-based Web site testing service.
It can check:
• Links
• HTML and browser compatibility
• File sizes
• …
http://www.netmechanic.com/
Benchmarking Sites
It is possible to benchmark entire Web sites and not just individual pages, such as entry points:
• Nos. of links to the Web site
• Nos. of pages indexed
• Relationships with other Web sites
• …
You can also measure the server availability
and uptime (e.g. using Netcraft)
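As a concrete illustration, the sketch below probes a server over HTTP and reports whether it responded. This is an assumed stand-in for the kind of check an availability service performs, not Netcraft's actual method:

```python
from urllib.request import urlopen

def is_available(url, timeout=10):
    """Return True if the server answers an HTTP request within the timeout."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except OSError:       # URLError, HTTPError and timeouts are all OSErrors
        return False

print(is_available("http://www.ukoln.ac.uk/"))
```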
Standard Files
It is also possible to analyse a number of standard Web site files:
• The robots.txt file
- Has one been created (to stop robots from indexing, say, pre-release information)?
- Is it valid?
• The 404 error page
- Has a tailored 404 page been created or is the server default one used?
- Is it rich in functionality (search facility, links to appropriate help information, etc.)?
• Search Engine page
- Is a search facility provided, and, if so, what type?
Note: manual observation of the functionality of these files is currently needed, although basic existence checks can be automated, as sketched below
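A minimal sketch of the automatable checks, using Python's standard library; the site address is illustrative, and the "short 404 body" heuristic is an assumption, not a definitive test:

```python
from urllib.error import HTTPError
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

site = "http://www.ukoln.ac.uk/"  # illustrative site

# robots.txt: has one been created, and does it parse?
robots = RobotFileParser(site + "robots.txt")
robots.read()
print("Robots may fetch /:", robots.can_fetch("*", site))

# 404 page: request a path that should not exist; a very short response
# body is a crude hint that the bare server default page is in use
try:
    urlopen(site + "page-that-should-not-exist")
except HTTPError as err:
    print("Status:", err.code, "- body length:", len(err.read()), "bytes")
```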
Pilot Benchmarking
Short-listed candidates for the SPIN 2001-SOCITM Web
site Awards were used in a pilot benchmarking exercise:
• Benchmarking initially carried out in July 2001 (for a
paper at the EuroWeb 2001 conference)
• Repeated in April 2002 (allowed trends to be spotted)
• Web sites analysed were 5 English, 5 Scottish, 4 Welsh and 4 Northern Ireland:
- English: L B Brent, L B Richmond, Wokingham Council, L B Camden, Tameside MBC
- Scottish: Dundee City Council, Moray Council, Dumfries & Galloway Council, East Renfrewshire Council, West Lothian Council
- Welsh: Cardiff CC, Isle of Anglesey CC, Ceredigion CC, Wrexham CBC
- Northern Ireland: Antrim BC, Belfast City Council, Armagh DC, Newtownabbey BC
Findings at <http://www.ukoln.ac.uk/web-focus/events/conferences/spin-2002/>
Pilot Benchmarking Findings (1)
Accessibility (using Bobby)
• In first survey 8 (44%) sites had no WAI P1 errors on
home page
• In second survey only 1 site had no P1 errors
Comments
• Accessibility is an important issue and awareness of this is growing, but the most visible page on these Web sites tends not to be accessible, and this is getting worse
Discussion
• Bobby changed its service between the two surveys.
It no longer reports on the file size. Has it changed
its algorithm for measuring accessibility?
Pilot Benchmarking Findings (2)
HTML Quality, etc. (using NetMechanic)
• One home page was reported as having a broken link in both surveys, but this appears to be a false positive
• All home pages have HTML errors, and in some
cases this is getting worse (from 4 errors to 101
errors in one case)
Comments
• Compliance with HTML standards is needed in order to (a) avoid browser dependencies, (b) facilitate access by specialist browsers and (c) facilitate repurposing
• The Web sites do not appear to be addressing this
• Many errors could be easily fixed, e.g. by adding an HTML DOCTYPE declaration at the top of the file
Pilot Benchmarking Findings (3)
Web Server Software (using Netcraft)
• 12 Web sites (66.7%) use an MS Windows platform,
5 (27.8%) a Unix platform and 1 (5.6%) an unknown
platform.
• Proportions had not changed in second survey
• Will proportions change in light of MS licensing fees?
Link Popularity (using LinkPopularity)
• In the initial survey the most linked-to site was Dundee City Council (896 links according to AltaVista) or L B Brent (626 links according to Google).
• In the second survey the most linked-to site was Brent (883 links according to AltaVista) or Camden (1,810 links according to Google).
Pilot Benchmarking Findings (4)
404 Page
• 12 Web sites (67%) still had the server default 404
page
• Proportions had not changed in second survey
Search Engine Page
• 6 Web sites (33%) do not appear to have a search
facility
Some Issues (1)
When using Bobby and NetMechanic, different results may be obtained.
This may be due to:
• Analysis vs following redirects
• Analysis of frameset page but not individual frame
pages
• Not analysing images due to Robot Exclusion
Protocol
• Differences in covering external resources such as
JavaScript files, CSS, etc.
• Splash screens
• …
Some Issues (2)
Bobby changed its interface, URL, functionality and
licensing conditions between the two surveys:
• URL change meant previous live survey wouldn’t
work
• Bobby no longer provides information on browser
compatibility errors or file sizes
• The downloadable version of Bobby is no longer
free (not an issue for this work)
This illustrates some of the dangers of this approach
It is not known whether Bobby's algorithms for measuring WAI compliance were changed, which could affect comparisons between the two surveys
Market For Benchmarking
There is increasing interest in Web site benchmarking:
• Consortia, e.g. see the SOCITM "Will you be Better Connected in 2001?" service at <http://www.socitm.gov.uk/mapp/mapdf/Web_inner.pdf>:
- Visual impairment rating
- 12 page report about your site
- Recommendations for improving site
- £495 (subscribers) or £950 for survey
• Industry, e.g. companies such as Business2www
- Published factual audit of Local Government sites
- See <http://www.business2www.com/>
• Or Google search for “Web auditing”
Who Does The Work And Why?
Who should benchmark?
• Community itself (e.g. national association)
- But how self-critical can it be?
• The funders
- But will they take on board the complexities?
• Neutral body
- But is there an obvious body to do the work?
What is the purpose of the benchmarking?
• Is it linked to funding, with penalty clauses for non-compliance?
• Is it to support the development of the community,
by highlighting best practices?
Technical Issues
Web Services
• There is a need to move from the use of interactive Web sites to services designed for machine use
• There may be a role for a “Web Service” approach in
which a rich set of input can be provided (e.g. using
SOAP).
EARL
• There is a need for a neutral and reusable output
format from benchmarking services
• W3C’s EARL (Evaluation and Reporting Language)
may have a role to play
• As EARL is based on RDF it should be capable of describing the benchmarking environment in a rich and machine-understandable way
• See <http://www.w3.org/WAI/ER/IG/earl.html>
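To make the idea concrete, here is a minimal sketch of emitting an EARL-style assertion as RDF. It assumes the third-party rdflib library; the namespace URI and property names are illustrative assumptions and should be checked against the EARL specification:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Assumed namespace; check the EARL specification for the current one
EARL = Namespace("http://www.w3.org/2001/03/earl/0.95#")

g = Graph()
g.bind("earl", EARL)

assertion = URIRef("http://example.org/results/1")  # hypothetical result URI
g.add((assertion, RDF.type, EARL.Assertion))
g.add((assertion, EARL.subject, URIRef("http://www.ukonline.gov.uk/")))
g.add((assertion, EARL.testCase, URIRef("http://www.w3.org/TR/WCAG10/")))
g.add((assertion, EARL.result, Literal("fail")))

print(g.serialize(format="xml"))
```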
Recommendations (1)
Standards Bodies (e.g. W3C & Equivalent)
• There is a clear need for rigorous definitions to assist in Web auditing in order to ensure that valid comparisons can be made across auditing services
• It would be useful to provide test case Web sites in
order to compare different audits
• Examples:
- Definitions of a "page"
- Files which should be analysed
- How to handle the Robot Exclusion Protocol
- User-agent view
Recommendations (2)
Applications Developers
• There is a need to ensure that Web-based benchmarking services can be tailored and that their output can be reused, as sketched after this list
• Benchmarking services should be capable of
emulating a range of user agents
• Benchmarking services should provide user control
over compliance with the Robot Exclusion Protocol
• Benchmarking services should provide user control
over definitions of files to be analysed
• Benchmarking services should provide user control
over the definition of a page (e.g. include redirected
pages, sum results of original and redirected page,
etc.)
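A minimal sketch of what such user controls might look like in practice, using Python's standard library; the function name, defaults and site address are illustrative assumptions, not an existing service's API:

```python
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.robotparser import RobotFileParser

def fetch(url, user_agent="WebWatch-sketch/0.1", obey_robots=True):
    """Fetch url, with user agent and robots.txt compliance as user options."""
    if obey_robots:
        robots = RobotFileParser(urljoin(url, "/robots.txt"))
        robots.read()
        if not robots.can_fetch(user_agent, url):
            raise PermissionError("robots.txt disallows fetching " + url)
    request = Request(url, headers={"User-Agent": user_agent})
    return urlopen(request).read()

# Emulate a text-only browser while honouring the Robot Exclusion Protocol
page = fetch("http://www.ukoln.ac.uk/", user_agent="Lynx/2.8")
```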
Recommendations (3)
There are benefits to communities in monitoring trends and sharing the best practices spotted in benchmarking work
• Let’s share the results and issues across our related
communities
• Let’s share the approaches to benchmarking across
bodies involved in benchmarking
Questions
Any questions?