Transcript Slide 1
Effects of Design
in Web Surveys
Vera Toepoel
Tilburg University
The Netherlands
7/21/2015
CentERdata: Two Online Panels
1. CentERpanel
• Has existed for 17 years
• 2,000 households
• Respondents fill out questionnaires every week
Online interviews as method, but:
Probability sample drawn from the address sampling frame of Statistics Netherlands
Recruitment of new panel members is address-based
Includes households without internet access (less than 20%): equipment provided
CentERdata: Two Online Panels
2. LISS Panel
• Grant from The Netherlands Organisation for Scientific Research
• 5,000 households
• Established in 2007 (we fielded the 1st questionnaire!)
• Respondents fill out questionnaires every month
Online interviews as method, but:
Probability sample drawn from the address sampling frame of Statistics Netherlands
Contacted by letter, telephone, or visit
Includes households without internet access (less than 20%): equipment provided
1 item per screen
4 items per screen
10 items per screen
Answer categories
Open-ended
Vertical: positive to negative
Horizontal
Numbers 1 to 5
Numbers 5 to 1
Numbers 2 to -2
Trained Respondents:
Panel conditioning
• Content (knowledge on topics)
– Prepare for future surveys
– Develop attitudes
• Procedure (question-answering process)
– Learn how to interpret questions
– Answer strategically
– Speed through the survey
Procedure (answer process)
• Differences between trained and fresh respondents with regard to web survey design choices
– Items per screen
– Response category effects
– Question layout
Overall:
The mean duration of the entire survey differed between panels: 436 seconds for the trained panel versus 576 seconds for the fresh panel.
Experiment 1: Items per screen
• Social Desirability Scale
• 10 items
• 3 different formats:
– 1 item per screen
– 5 items per screen
– 10 items per screen
Experiment 1: Items per screen
• Trained respondents had higher inter-item correlations for multiple-items-per-screen formats.
• No significant difference in item nonresponse.
• The mean score of the Social Desirability Scale showed no evidence of social desirability bias.
• The mean duration to complete the ten social desirability items did not differ significantly between panels.
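The inter-item correlations mentioned above are typically computed as the average Pearson correlation over all pairs of scale items. A minimal pure-Python sketch, using invented response data (the function names and numbers are illustrative, not from the study):

```python
from itertools import combinations

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def avg_inter_item_correlation(items):
    """Mean Pearson r over all distinct pairs of item-response columns."""
    pairs = list(combinations(items, 2))
    return sum(pearson(a, b) for a, b in pairs) / len(pairs)

# Invented 5-point responses from four respondents to three scale items.
item_responses = [
    [1, 2, 4, 5],  # item 1
    [2, 2, 5, 4],  # item 2
    [1, 3, 4, 4],  # item 3
]
print(round(avg_inter_item_correlation(item_responses), 2))
```

A higher average under the multiple-items-per-screen format would be read, as on the slide, as a hint of nondifferentiation rather than of better measurement.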
Experiment 2: Answer Categories
• Category effect found
• No difference in category effect between
trained and fresh respondents
Experiment 3: Question Layout
Question: Overall, how would you rate the quality of education in the Netherlands?
Answer: 5-point scale
Six formats:
1. Reference format (decremental)
2. Reverse scale: incremental
3. Horizontal layout
4. Add numbers 1 to 5 to verbal labels
5. Add numbers 5 to 1 to verbal labels
6. Add numbers 2 to -2 to verbal labels
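To compare answers across the numeric formats (4 to 6), responses are usually recoded back to a common 1-to-5 scale, where 1 stands for the first (topmost or leftmost) verbal label. A hedged sketch; the format tags and function name are invented for illustration:

```python
def to_common_scale(value, fmt):
    """Map a response given under one numeric labeling back to the
    common 1..5 scale (1 = first verbal label in the reference format)."""
    if fmt == "1to5":        # format 4: labels numbered 1, 2, 3, 4, 5
        return value
    if fmt == "5to1":        # format 5: labels numbered 5, 4, 3, 2, 1
        return 6 - value
    if fmt == "2tominus2":   # format 6: labels numbered 2, 1, 0, -1, -2
        return 3 - value
    raise ValueError(f"unknown format: {fmt}")
```

For example, a respondent who picks the first label answers "1", "5", or "2" depending on the format, and all three recode to position 1.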
Experiment 3: Question Layout
Significant differences by comparison (T = significant differences in the trained panel, F = significant differences in the fresh panel):
1. Decremental vs. incremental: T + F
2. Vertical vs. horizontal layout: none
3. No numbers vs. numbers 1 to 5: none
4. Numbers 1 to 5 vs. numbers 5 to 1: T + F
5. Numbers 5 to 1 vs. numbers 2 to -2: T + F
• Trained respondents were more likely to select one of the first options.
Design Effects in Web Surveys:
Comparing Trained and Fresh
Respondents
• Overall, few differences between trained and fresh respondents
• Trained respondents are somewhat
more sensitive to satisficing:
– Shorter completion times
– Higher inter-item correlations for
multiple-items-per-screen formats
– Select first response options more often
Current and Future Research
• It has been little more than a decade since systematic research began on visual design effects in web surveys.
• In the last decade dozens of studies have been conducted.
• It is now important that we begin to understand the relative importance of each visual effect.
• Can we reduce visual effects by effective
question writing?!
Effective Question Writing
Tourangeau, Couper, and Conrad (POQ 2007) suggest there may be a hierarchy of features that respondents attend to:
Verbal language > numbers > visual cues
Question: Can the effects of visual layout
be diminished through greater use of
verbal language and numbers?
Experiment 1: Visual Heuristics
(joint with Don Dillman)
Tourangeau, Couper, and Conrad (POQ 2004; 2007):
1. Middle means typical: respondents will see the middle option as the most typical
2. Left and top means first: the leftmost or top option will be seen as the 'first' in a conceptual sense
3. Near means related: options that are physically near each other are expected to be related conceptually
4. Up means good: the top option will be seen as the most desirable
5. Like means close: visually similar options will be seen as closer conceptually
Experimental conditions:
• Polar point or fully labeled scale
• With or without numbers (1 to 5)
Middle Means Typical
Fully labeled: even spacing
Fully labeled: uneven spacing
Left and Top Means First
Fully labeled with color: consistent ordering
Fully labeled with color: inconsistent ordering
Near Means Related
Polar point with numbers: separate screens
Polar point with numbers: single screen
Up Means Good
Polar point with numbers: incremental
Polar point with numbers: decremental
Like Means Close
Polar point
Polar point with color
Like Means Close
Polar point with numbers (1 to 5)
Polar point with different numbers (-2 to 2)
Labels, numbers, and visual heuristics: is there a hierarchy?

Heuristic                     Effect of heuristic?      Numbers reduced effect?   Effect in fully labeled scales?
1. Middle means typical       no                        –                         no
2. Left and top means first   no                        –                         no
3. Near means related         yes                       yes                       –
4. Up means good              yes                       yes                       –
5. Like means close           Color: yes / Dif. #: no   yes                       –
Experiment 2: Pictures in Web Surveys
(joint with Mick Couper)
Replicates the study by Couper, Tourangeau, and Kenyon (POQ 2004):
– 1. No Picture
– 2. Low frequency picture
– 3. High frequency picture
Add verbal instructions
– A. No verbal instruction
– B. Instruction to include both high and
low frequency instances
– C. Instruction to include only low
frequency instances
Low and High frequency picture
Can verbal instructions reduce the effects of pictures?
MANOVA:
– main effect of instructions: lambda = .597, p < .0001
– main effect of pictures: lambda = .964, p < .0001
– interaction instructions × pictures: lambda = .9691, p < .0001
– This suggests that, while both main effects and the interaction are significant, instructions explain more of the variation in the answers than pictures (a smaller lambda indicates a stronger effect).
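The lambda values reported above are presumably Wilks' lambda, which compares pooled within-group variation to total variation across several outcome variables at once; values near 0 indicate a strong effect, which is why instructions (.597) dominate pictures (.964). A minimal two-outcome-variable sketch with invented data (not the survey's actual responses):

```python
def mean2(rows):
    """Column means of a list of (x, y) rows."""
    n = len(rows)
    return (sum(r[0] for r in rows) / n, sum(r[1] for r in rows) / n)

def sscp(rows, center):
    """2x2 sum-of-squares-and-cross-products matrix about a given center."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x, y in rows:
        dx, dy = x - center[0], y - center[1]
        s[0][0] += dx * dx
        s[0][1] += dx * dy
        s[1][0] += dy * dx
        s[1][1] += dy * dy
    return s

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def wilks_lambda(groups):
    """Lambda = det(E) / det(T): E pools within-group SSCPs, T is the total SSCP."""
    all_rows = [r for g in groups for r in g]
    e = [[0.0, 0.0], [0.0, 0.0]]
    for g in groups:
        s = sscp(g, mean2(g))
        for i in range(2):
            for j in range(2):
                e[i][j] += s[i][j]
    t = sscp(all_rows, mean2(all_rows))
    return det2(e) / det2(t)

# Two invented respondent groups with clearly different outcome means,
# so the statistic comes out close to 0 (a strong group effect).
g1 = [(1.0, 2.0), (1.5, 2.2), (0.5, 1.6), (1.2, 2.6)]
g2 = [(4.0, 5.1), (4.5, 5.6), (3.5, 4.4), (4.2, 5.3)]
print(wilks_lambda([g1, g2]))
```

The significance tests on the slide additionally convert lambda into an approximate F statistic, which this sketch omits.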
Future Research
How to reduce visual design effects in web
surveys
LISS data
Every researcher (irrespective of nationality) who wants to collect data for scientific, policy, or societally relevant research can do so via the LISS panel at no cost.
Proposals can be submitted through www.lissdata.nl.
Existing data are freely available for academic use:
• longitudinal core studies
• proposed studies
Disseminated through www.lissdata.nl.