
The Response Process Model as a Tool for
Evaluating Business Surveys
Deirdre Giesen
Statistics Netherlands
Montreal, June 20th 2007
ICES III
Outline
– Questionnaire testing at Statistics Netherlands
– Collecting data on the response process
– Reviewing field visits to reflect on the response
process model
– Data used
– Preliminary results
Questionnaire testing
– Recently more attention to establishment
data collection (efficiency, response burden
and quality)
– One of the strategies: improving
questionnaires
– Focus of Question Lab: Response Burden
and Data Quality
– Preferably: multi-method evaluation
– Favorite method: company visits to collect
data on the response process
Response process model for
business surveys
1. Encoding of information in company records or
memory
2. Selection and identification of the respondent(s)
3. Assessment of priority
4. Comprehension of the data request
5. Retrieval of relevant information from records or
memory
6. Judgment of the adequacy of the response
7. Communication of the response
8. Release of the data
Sudman, S., Willimack, D.K., Nichols, E. & Mesenbourg, T. (2000);
Willimack, D.K. & Nichols, E. (2001)
Collecting data on the response
process
– on site
– methodologist and field officer
– mix of observing and reconstructing
– if necessary: general interview
– standard protocol with adaptations
– detailed visit reports
– videotaping
Pilot study by Hak & Van Sebille (2002)
Standard protocol
1. Introduction
2. General questions
3. Observation or reconstruction of the response
process
4. Evaluation
5. Correcting data and answering questions
Review of field visits:
1. Which problems did we find that caused
data error and/or response burden?
2. How are these problems linked to the
different steps of the model?
3. To what extent were the steps of the model
useful for describing and understanding the
process of responding to a business
survey?
Evaluation studies reviewed
Name                 Mode        # reports reviewed
SBS2003              Paper       10
Transport            Electronic  3
Producer Prices      Electronic  5
International trade  Electronic  7
Sourcing             Electronic  5
Respondents and visits reviewed
– retrospective interviews (6), observations
(15) and general interviews about the response
process (9)
– respondents from size classes 0 to 9
– respondents from retail, wholesale, service,
manufacturing, building, transport and external
accountants
Encoding
Problems found
– Lack of available information is an important
source of response burden and data error
– Important to distinguish between lack of
information and (ease of) accessibility of information
Recommendations
– Change information request if possible
– Assist respondent with data collection
Selection and Identification of
Respondents
Problems found
– electronic forms introduce extra difficulties:
– distribution from SN to the firm
– characteristics of the respondent
– distribution within the firm
– change of respondents
Selection and Identification of
Respondents
Recommendations
– information on who to contact
– instrument design should allow for easy
forwarding of (parts of) the exact questionnaire
– the data request and specific arrangements with
the firm should be documented in a way that is
understandable for a new respondent
Assessment of priorities
Problems found
– Timely and correct completion is generally
not a high priority
– Most respondents see hardly any reward or
benefit for their effort
– Some respondents may deliberately provide
wrong data to prevent response burden
Assessment of priorities
Recommendations
– design questionnaires for quick readers and
clickers
– adapt data collection strategies: reminders,
quality control, incentives and penalties
– improve general communication to stress the
importance of contributing to national statistics
Comprehension
Many problems found at several levels
– General design and goal of the study
– Overall design of the instrument
– Specific questions
Recommendations
– Improve communication 'around' the questionnaire
– Develop tailored questionnaires for small businesses
in lay language
– Many suggestions to improve the wording, order and
layout of the total instrument and of specific questions
Retrieval
Problems found
– Using the wrong sources
– Lack of access to or cooperation from
sources
– Lack of knowledge of sources
– Compilation errors (even when automated)
– Excessive response burden for certain tasks
– Retrieval strategies vary
Retrieval
Recommendations
– ask for less detailed information if possible
– explicitly allow estimates for known difficult
variables
– design materials to make internal data
collection easier and more accurate
– stress more clearly which unit
respondents should be reporting on
Judgment
Often difficult to distinguish retrieval from
judgment problems
Problems found
– Checking the questionnaire can cause high
response burden
– Motivation to do so is lacking
– Instrument design can hinder easy checking
and editing
– A few confidentiality issues
Judgment
Recommendations
– Stimulate respondents to check their
answers through automated checks in electronic
instruments and through quality control combined
with feedback after submission of the data
– Design instruments to facilitate checking and
editing of the data by the respondent
Communication
Problems found
– many usability issues in e-forms that made
communicating a specific answer difficult
– data errors caused by obligatory fields
– electronic submission of the questionnaire is an
important source of response burden and data error
Communication
Recommendations
– many specific technical and usability issues
that need to be addressed
– allow empty fields for known difficult
variables
– ideas for electronic questionnaire
functionalities that will make communication of
answers easier
Release of the data
No response burden or data error problems
were found in this step
Problems 'outside' the model
– Filing of report for internal use
– Lack of feedback after data has been
submitted
Conclusions
– the response process model is a helpful
framework for finding problems with data collection
– some problems can only be discovered by
studying the actual response process in detail
– the response process does not end with the release
of the data; the model might be extended to take
this into account