
The Annual Structural Business Survey:
Developing and Testing an Electronic Form
Ger Snijkers
Evrim Onat
Rachel Visschers
Statistics Netherlands
Division of Business Statistics
Tuesday, June 19, 2007
ICES3, June 18-21, 2007, Montreal
1. Dutch Annual Structural Business Survey

• Annual survey of economic activity
• Mandatory
• 75,000 businesses each year:
  • Sample of small firms; bigger firms are included every year
• Until now: paper questionnaire
  • Length may differ; 20 pages is typical
• Three parts:
  • revenues and costs
  • summary of business accounts: profits and losses
  • industry-specific specifications
2. The paper questionnaire
Characteristics

• A4 booklet
  • Right page: items
  • Left page: help texts, long and voluminous
• Items are grouped in sections
  • Long sections
• Completion process: complicated and hard
  • Large amount of detailed information
  • Broad range of business information:
    several departments, several respondents
  • Same labels, but no match with the definitions used:
    CBS definitions vs. business definitions
  • Lay-out: misinterpretations and errors
Renewed house style: 2. Survey on the PS (Productiestatistiek)
[Image: the old paper questionnaire ("Q oud")]
3. The web questionnaire
Project goal

• Develop a web questionnaire
  • same contents
  • mixed-mode design: paper and web
  • support completion process:
    - motivate respondents to use this mode
  • into the field: March 2006
• Start of project: June 2004
• In the field: Spring 2006
3. The web questionnaire
Developing and testing

In five stages:
1. Testing the prototype (31-1-’05)
   • pre-tests to test usability: 3 waves
2. Revision of questionnaire (1-9-’05)
   • expert reviews
3. Testing of revised questionnaire (1-1-’06)
   • additional usability tests
4. Implementation of field pilot (1-3-’06)
5. Implementation of survey (1-3-’07)
3.1. The web Q: The prototype
3.1. The prototype: 3 test waves

Test 1
• Period: Aug. 2004
• Form: Blaise IS
• On/off-line: on-line (browser)
• Number of interviews: 15 (CBS employees)
• Tested by: CBS employees, external designer
• Results: experiences from CBS employees

Test 2
• Period: Oct. 2004
• Form: Blaise EDR
• On/off-line: off-line (CD-rom)
• Number of interviews: 37 (businesses)
• Tested by: 6 business interviewers
• Results: experiences from interviewers

Test 3
• Period: Nov./Dec. 2004
• Form: Blaise CBSquest
• On/off-line: off-line (downloadable via internet)
• Number of interviews: 6 (businesses)
• Tested by: 2 cognitive lab interviewers + 2 business interviewers
• Results: experiences with respondents
3.1. The prototype
Research issues

1. How does the e-form work in practice?
   • Completing the questionnaire
   • Question-and-answer process
2. What features should be included to make it easy to use?
   • Respondent friendly: ‘Computer-assisted’ tools
   • User wishes
3. How should the web Q be designed in relation to the paper Q?
   • The same or a different design
   • ‘Look-and-feel’ of paper and e-form
3.1. The prototype
Research issues

1. How does the Q work?
• Laborious and complex process
  • Long, complex questionnaire (≥ 25 items)
  • Complex completion process:
    - several sessions, several informants
    - kick-and-rush behaviour
  • Imagine ... a respondent sitting behind his/her computer ...
• Respondents got lost in the questionnaire
3.1. The prototype
Research issues

2. Features to make it easy to use?
• What am I supposed to do (next)?
  • Easy to download, install, complete, send data back
  • It is one process: downloading – sending data back
  • Clear instructions and explanations (but not read)
• How is the questionnaire built up?
  • Show how the questionnaire is structured: overview
  • Help to find the way in the questionnaire
  • No hidden rules, no unexpected functionalities
• Where am I? What did I do so far?
  • Provide overview of the completion process
  • Clear navigation, no scrolling
  • Printing function
3.1. The prototype
Research issues

2. Features to make it easy to use? (cont'd)
• Printing
  • Passing on sections of the Q to other departments
  • Checking the data before transmitting
  • Getting authorisation to release the data
• Calculations
  • Well accepted, ... even expected
• Entry-search
  • ‘Google’-like search: “Where to put these data?” (see the sketch below)
• Navigation and overview
  • Choose a set-up people are familiar with:
    - set-up of the tax office, Windows Explorer
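The slides do not describe how the entry-search feature was implemented. As a hedged illustration only, the Python sketch below shows the basic idea: match a respondent's keyword against item labels and report the sections where the figure could be entered. The item catalogue and the matching rule are invented for this example; they are not the actual SBS item list or search logic.

# Hypothetical sketch of the "entry-search" idea: given a keyword, suggest
# the questionnaire items (and their sections) where that figure belongs.
# The catalogue below is invented for illustration only.
QUESTIONNAIRE_ITEMS = {
    "turnover_goods":    {"section": "Revenues and costs", "label": "Net turnover from sales of goods"},
    "turnover_services": {"section": "Revenues and costs", "label": "Net turnover from services rendered"},
    "personnel_costs":   {"section": "Revenues and costs", "label": "Wages, salaries and social charges"},
    "depreciation":      {"section": "Profits and losses", "label": "Depreciation of fixed assets"},
}

def entry_search(query: str):
    """Return (item_id, section, label) for every item whose label or section mentions the query."""
    q = query.lower()
    return [
        (item_id, meta["section"], meta["label"])
        for item_id, meta in QUESTIONNAIRE_ITEMS.items()
        if q in meta["label"].lower() or q in meta["section"].lower()
    ]

if __name__ == "__main__":
    # "Where to put these data?" -- a respondent searching for 'turnover'
    for item_id, section, label in entry_search("turnover"):
        print(f"{section} > {label}  (item: {item_id})")

In the real questionnaire such a search would run against the full item list; the point of the sketch is only to make the feature concrete.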
3.1. The prototype
Research issues

3. Design of paper and web Q?
• The computer is different from paper
• The web Q reacts to the respondent
• Reading from the screen is different
• Navigating and getting an overview work differently
• Kick-and-rush behaviour, even stronger than on paper
• The use of computer-assisted functionalities:
  the respondent expects the computer to react
3.1. The prototype
Conclusions of pre-test waves

• Visual design
  - Clear and logical
  - Simple, transparent, consistent
  - No hidden or unexpected functionalities
• Support the completion process
  - Other mode, other features, other visual design
  - Different from the paper form, same ‘look-and-feel’
• Tailor to kick-and-rush behaviour
  - Small sections, small tasks
  - Short and clear explanations
3.2. The revised questionnaire

Based on:
• Pre-test results
• Expert reviews
• Iterative process with
  - professional designer
  - questionnaire designers
  - methodologists
• A user-friendly design was put first, not the IT tool
• New prototypes designed in PowerPoint
3.3. The revised questionnaire
Conclusions of additional pre-tests

• 10 concurrent in-depth interviews
• Usability and user friendliness have been improved
  - respondents enjoyed working with the questionnaire
  - they could handle the task
  - even though ... the task had not changed

→ Web questionnaire design is communication design
3.4. Field pilot

Set-up:
• March-July 2006
• 7,200 businesses, 5 industries
• Advance letter with
  • internet address: www.cbs.nl/productiestatistiek
  • user name and password
  • leaflet to introduce the web questionnaire
    and explain why the survey is conducted
  • paper questionnaire not mentioned
[Image: the leaflet]
3.4. Field pilot

Goals:
• Implementation of the web questionnaire
  • paper and web flows
• Test the whole process
  • downloading – completing – sending in data
  • investigate the completion process in the field:
    - debriefing interviews with respondents
    - audit trails (see the sketch below)
• Response rates
• Data quality
  • data editing, mode effects
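The slides do not spell out what the audit trails record. As a hedged illustration only, an audit trail can be thought of as a time-stamped log of navigation and data-entry events per respondent; the event names and file layout in the Python sketch below are invented for this example and do not reflect the Blaise audit-trail format.

# Hypothetical sketch of an audit trail: time-stamped events per respondent,
# written to a CSV file. Event names and fields are invented for illustration.
import csv
from datetime import datetime, timezone

FIELDS = ["timestamp", "respondent", "event", "detail"]

def log_event(writer, respondent_id, event, detail=""):
    """Append one time-stamped event for a respondent to the audit-trail file."""
    writer.writerow({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "respondent": respondent_id,
        "event": event,    # e.g. open_section, enter_value, save_and_exit
        "detail": detail,  # e.g. section or item name
    })

if __name__ == "__main__":
    with open("audit_trail.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        log_event(writer, "R001", "open_section", "Revenues and costs")
        log_event(writer, "R001", "enter_value", "personnel_costs")
        log_event(writer, "R001", "save_and_exit")  # session break: respondent returns later

From such a log, sessions, the order in which sections are visited, and break-off points can be reconstructed afterwards, which is the kind of analysis the pilot's goal of investigating the completion process in the field suggests.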
3.5. Survey

Now running:
• About 75,000 businesses received this questionnaire
Web questionnaire design is communication design