
Introduction to Web Surveys
Timothy Johnson
Survey Research Laboratory
University of Illinois at Chicago
March 2011
Web Surveys
 First reported use in the early 1990s
 Dramatic increase in use over the past decade
 Numerous web survey software packages now available
Basic Advantages of Web Surveys
 Speed
 Cost
 Convenient (self-administered)
 Multi-media delivery (sound, video)
 Power of computer-assisted programming
 Unique, hi-tech
 Similar arguments were made regarding CATI (in the
1970s) and CAPI (in the 1980s) technologies
Designing Web Questionnaires
Basic Design Approaches
• Static web questionnaire
• Survey in single HTML document
• Respondents can scroll through document
• Data sent to server once when survey is completed
• Interactive web questionnaire
• Questions are delivered one at a time or in modules
• Data is sent to server after each screen is completed
• Conducive to use of skip patterns, consistency checks, range
checks, etc.
Static Web Questionnaires
• Very similar to mail and other self-administered
questionnaires
• Can minimize download time
• Respondents can skip questions, but the process is not
usually automated
• Hypertext links can be used to facilitate skips
• All information is lost if respondent quits before
finishing
• More advantageous for short questionnaires
Interactive Web Questionnaires
• This approach permits the use of all computer-assisted
programming devices
• May increase length of survey due to additional
download time
• Partial data is captured for respondents who quit before
finishing questionnaire
• More advantageous for longer and more complex
questionnaires
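
To make the contrast with the static approach concrete, here is a minimal TypeScript sketch of the screen-by-screen pattern described above: each answer is posted to the server as soon as its screen is completed, and a simple skip rule selects the next screen. The endpoint /api/responses, the question ids, and the skip rule are placeholders for illustration, not part of the original presentation.

```typescript
// Sketch of an interactive (screen-by-screen) questionnaire.
// The endpoint URL, question ids, and skip rule are illustrative only.

interface Screen {
  id: string;
  question: string;
  next: (answer: string) => string | null; // id of the next screen (simple skip logic)
}

const screens: Screen[] = [
  {
    id: "q1",
    question: "Have you completed a web survey before? (yes/no)",
    next: (a) => (a === "yes" ? "q2" : "q3"), // "no" skips q2
  },
  { id: "q2", question: "How many web surveys have you completed in the past year?", next: () => "q3" },
  { id: "q3", question: "Overall, how easy was this survey to complete?", next: () => null },
];

// Each screen's answer is sent to the server immediately, so partial data
// are preserved if the respondent breaks off before finishing.
async function saveAnswer(screenId: string, answer: string): Promise<void> {
  await fetch("/api/responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ screenId, answer, timestamp: Date.now() }),
  });
}

async function runSurvey(getAnswer: (question: string) => Promise<string>): Promise<void> {
  let currentId: string | null = screens[0].id;
  while (currentId !== null) {
    const screen = screens.find((s) => s.id === currentId)!;
    const answer = await getAnswer(screen.question); // render the screen and wait for input
    await saveAnswer(screen.id, answer);              // per-screen save
    currentId = screen.next(answer);                  // apply the skip pattern
  }
}
```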
Progress Indicators
• The purpose is to motivate respondents to complete the questionnaire
in the absence of an interviewer
• Couper et al. (2001): 89.9% completed survey with progress indicator vs.
86.4% completing survey without one
• Very useful in interactive questionnaires, where respondent does not
know how long the questionnaire is
• Not necessary in static questionnaires where respondents can
determine the length by scrolling through it
• May add to survey length if they increase download time
• There is some concern of increased break-offs
• Transition sentences are an alternative
• Empirical evidence regarding effectiveness not clear
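
A small sketch of how a progress indicator might be updated in an interactive questionnaire. The element id "progress" and the screen counts passed in are assumptions for illustration.

```typescript
// Sketch: update a simple progress indicator after each completed screen.
// The element id "progress" and the counts passed in are assumptions.

function updateProgress(completedScreens: number, totalScreens: number): void {
  const percent = Math.round((completedScreens / totalScreens) * 100);
  const indicator = document.getElementById("progress");
  if (indicator) {
    indicator.textContent = `${percent}% complete`;
    indicator.style.width = `${percent}%`; // doubles as the width of a visual bar
  }
}

// Example: called after the 4th of 20 screens has been submitted.
updateProgress(4, 20); // displays "20% complete"
```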
General Screen Design
• Do not use background color or images
• Background colors can create contrast & reading problems
• “visual noise”
General Screen Design #2
• Be aware that images may bias responses
• Witte et al. (2004) – National Geographic Survey
• Images increased support for species protection
• Couper et al. (2007) – healthy vs. sick person image
• When shown the image of a fit person, respondents consistently rated their own health lower than when shown the image of a sick person
• Use upper right corner for contact information
• Privacy/IRB information can be clickable from there
• If top of screen format is consistent:
• Respondents will tend to ignore that section across pages
• “Banner-Blindness”
General Screen Design #3
• Access to other relevant information can also be provided:
• Answers to commonly asked questions about the survey
• pdf versions of the full questionnaire
Effect of Color on Web Survey Completion
• Do not overuse color but use it consistently
• Use red only for emergency messages
• Red-green distinctions a problem with persons who are color-blind
• 10% of males are color blind
• 99% of color blind persons cannot distinguish green & red
• White or off-white backgrounds seem to work best
• Some evidence that Rs view black-on-white web pages as more ‘professional’ than white-on-black web pages
• Couper (2008) prefers light blue backgrounds
Color, continued
• For maximum readability, should be high contrast between
text color and background color
• Bright colors are easier to see than pastels
• Colored backgrounds often used by spammers and may
reduce response rates
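
The high-contrast advice above can be checked numerically. This sketch uses the WCAG relative-luminance formula to compute the contrast ratio between a text color and a background color; the specific colors tested are placeholders, and the 4.5:1 figure is the common WCAG threshold for normal-size text rather than something stated in the slides.

```typescript
// Sketch: check text/background contrast using the WCAG relative-luminance formula.
// The colors tested below are placeholders.

function relativeLuminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255; // convert 0-255 channel to 0-1, then linearize sRGB
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black text on a white background: ratio of 21, well above the common 4.5:1 guideline.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
```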
Color Associations for Adults
Color | Positive Associations | Negative Associations
Red | Power, love, fire, passion, intimacy, courage | Danger, aggression, blood, hot, stop
Green | Money, freshness, envy, nature, growth | Inexperience, misfortune
Purple | Royalty, luxury | (none listed)
Pink | Female, cure, soft, gentle | (none listed)
Blue | Male, sky, water, peace, truth, calm | Sadness, depression
Orange | Autumn, Halloween, creative | Caution
Yellow | Happiness, sunshine, optimism, summer | Illness, hazard
Brown | Earth, nature | Bland
Gray | Maturity, dignity | Gloomy, conservative, boring
White | Winter, virginity, clean, innocent, truth, peace, snow | Cold, sterility, clinical
Black | Formality, style, power, depth | Death, evil, mourning, night, mystery, fear
Text
• Always avoid small font sizes (use 10-12 point)
• Appears to be some preference for Arial over Times Roman
font
• Do not overuse bold, underline, italics and other forms of
emphasis
Question Presentation
• Avoid requiring R to horizontally scroll
• Avoiding any scrolling may be best
• No agreement about inclusion of question numbers
• Excluding them may avoid skip logic confusion
• Likert questions (fully-labeled) should be displayed vertically
Question Presentation #2
• Respondents less likely to skip words when lines are kept
short
• Provide computer-operating instructions at the precise point where a R may need to use that information
• When the number of responses cannot fit on a single screen:
• Double- or triple-banking may be best approach
• Place a box around the categories in order to ‘group’ them as
being relevant to the question
Question Presentation #3
• Visibility principle
• Options that are visible are more likely to be selected than those
that are not visible until the R takes some action to display them
• Response models
• Serial processing model
• Search options for pre-existing judgment
• Deadline processing model
• Spend certain amount of time and select best answer found before
cognitive deadline (a form of satisficing)
Common Types of Response Options
for Web Surveys
1. Radio buttons or boxes
2. Drop-down boxes
3. Check boxes
4. Slider bars
5. Text boxes
6. Open-ended questions
Radio Buttons
• Options are typically mutually exclusive
• Be careful not to use long grids that lose column headings
• Boxes instead of buttons
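
A minimal sketch of a vertically displayed, fully labeled radio-button item of the kind discussed above. The question wording, option labels, and field name are placeholders.

```typescript
// Sketch: build a vertically displayed radio-button question.
// Question wording and option labels are placeholders.

function radioQuestion(name: string, question: string, options: string[]): HTMLElement {
  const fieldset = document.createElement("fieldset");
  const legend = document.createElement("legend");
  legend.textContent = question;
  fieldset.appendChild(legend);

  options.forEach((label, i) => {
    const row = document.createElement("label");  // one option per line (vertical layout)
    row.style.display = "block";
    const input = document.createElement("input");
    input.type = "radio";
    input.name = name;                            // same name => mutually exclusive
    input.value = String(i + 1);
    row.appendChild(input);
    row.appendChild(document.createTextNode(" " + label));
    fieldset.appendChild(row);
  });
  return fieldset;
}

document.body.appendChild(
  radioQuestion("q_health", "In general, how would you rate your health?",
    ["Excellent", "Very good", "Good", "Fair", "Poor"])
);
```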
Drop-Down Boxes
• Useful only for closed lists of response options
• Can be designed to allow for single or multiple choices
• Options provided must be exhaustive
• Drop-down boxes are more difficult to use than radio buttons
Beware of Scroll Mice
 Healy (2007)
 Drop-downs (compared to radio buttons) led to higher item
nonresponse and longer response times
 Respondents using scroll mice to complete the survey were
prone to accidentally changing an answer if presented with
drop-down questions
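
One possible way to guard against the accidental scroll-wheel changes Healy (2007) describes is to ignore wheel events while the pointer is over a drop-down. This is a sketch of that assumption, not a fix documented in the presentation.

```typescript
// Sketch: keep the mouse scroll wheel from accidentally changing a <select> value
// by blocking wheel events while the pointer is over the drop-down.
// Note: this also stops the page from scrolling while the pointer is over the select.

function guardDropdown(select: HTMLSelectElement): void {
  select.addEventListener("wheel", (event) => {
    event.preventDefault(); // ignore the scroll so the selection is not changed
  }, { passive: false });
}

// Apply to every drop-down in the questionnaire.
document.querySelectorAll<HTMLSelectElement>("select").forEach(guardDropdown);
```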
Check Boxes
• Unlike radio buttons, check boxes allow multiple choices to be selected
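
A small sketch of a "check all that apply" item built from check boxes, which, unlike radio buttons, accept multiple selections. The option labels and field names are placeholders.

```typescript
// Sketch: a "check all that apply" item built from check boxes.
// Labels and field names are placeholders.

function checkAllThatApply(name: string, options: string[]): HTMLElement {
  const fieldset = document.createElement("fieldset");
  options.forEach((label, i) => {
    const row = document.createElement("label");
    row.style.display = "block";
    const box = document.createElement("input");
    box.type = "checkbox";
    box.name = `${name}_${i + 1}`;   // each box records its own yes/no
    row.appendChild(box);
    row.appendChild(document.createTextNode(" " + label));
    fieldset.appendChild(row);
  });
  return fieldset;
}
```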
Radio Button/Check Box Hybrid
Slider Bars (a.k.a. visual analog
scale, graphic rating scales, “sliders”)
Slider Bars - Research
• Random experiment by Bayer & Thomas (2004) of
Harris Interactive
• Slider bars took about twice as long to complete as any other scale type (including semantic differential, Likert, etc.)
• Answering 2 slider bar questions averaged 42.3 seconds, compared to 21.3 seconds for semantic differential questions
• Couper (2008) says results using slider bars are “quite
similar” to what is obtained from a scale that uses radio
buttons
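
A minimal sketch of a slider (visual analog scale) item built from an HTML range input. The 0-100 range, the midpoint start, and the "touched" flag (to separate a deliberate midpoint answer from a slider that was never moved) are choices made for this illustration, not recommendations from the slides.

```typescript
// Sketch: a slider (visual analog scale) item built from an HTML range input.
// Endpoints, range, and the "touched" flag are illustrative choices.

interface SliderItem {
  element: HTMLElement;
  getAnswer: () => number | null; // null if the respondent never moved the slider
}

function sliderQuestion(question: string): SliderItem {
  const wrapper = document.createElement("div");
  wrapper.appendChild(document.createTextNode(question));

  const slider = document.createElement("input");
  slider.type = "range";
  slider.min = "0";
  slider.max = "100";
  slider.value = "50";            // starts at the midpoint
  wrapper.appendChild(slider);

  let touched = false;            // distinguishes a deliberate midpoint from "never moved"
  slider.addEventListener("input", () => { touched = true; });

  return { element: wrapper, getAnswer: () => (touched ? Number(slider.value) : null) };
}
```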
Open-ended Questions
• Providing more space encourages respondents to give longer answers
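
A tiny sketch of the point above: give open-ended items a generously sized answer box. The row and column counts are arbitrary.

```typescript
// Sketch: an open-ended item with a generously sized answer box,
// since larger boxes tend to elicit longer answers. Sizes are arbitrary.

const openEnded = document.createElement("textarea");
openEnded.rows = 8;   // taller box
openEnded.cols = 60;  // wider box
openEnded.placeholder = "Please describe your experience in as much detail as you like.";
document.body.appendChild(openEnded);
```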
Present Single or Multiple Items per
Screen?
• For interactive questionnaires, multiple items per screen:
• Are completed more quickly by respondents
• May provide more context
• Intercorrelations among items are consistently higher when grouped
together on one screen (Couper et al. 2001).
• Also, multiple item screen versions:
• take less time to complete
• produce less missing data
Survey Navigation
• A consistent format should be followed
• Use action buttons that are different from any response
input elements such as radio buttons
• “next screen” or “next question” buttons should be on all pages
• Crawford et al. (2005) recommend putting them in the lower left corner
• “previous screen” or “previous question” buttons should be in the
bottom right corner
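
A sketch of navigation buttons that are visually distinct from response inputs, with "Next" at the lower left and "Previous" at the lower right as recommended above. The flexbox styling is one possible way to get that placement, assumed here for illustration.

```typescript
// Sketch: navigation buttons kept separate from response inputs, with
// "Next" at the lower left and "Previous" at the lower right.

function makeNavBar(onNext: () => void, onPrevious: () => void): HTMLElement {
  const bar = document.createElement("div");
  bar.style.display = "flex";
  bar.style.justifyContent = "space-between";
  bar.style.marginTop = "2em";

  const next = document.createElement("button");
  next.type = "button";
  next.textContent = "Next question";
  next.addEventListener("click", onNext);

  const previous = document.createElement("button");
  previous.type = "button";
  previous.textContent = "Previous question";
  previous.addEventListener("click", onPrevious);

  bar.appendChild(next);      // first flex child => lower left
  bar.appendChild(previous);  // last flex child  => lower right
  return bar;
}
```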
Key Point
• Never force respondents to answer a question
• Adds to frustration
• IRB implications
• No other questionnaire formats ‘force’ answers
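
One way to honor this principle while still flagging skipped items is a "soft prompt": remind the respondent once that an item was left blank, then accept the missing answer. This sketch assumes that approach; it is not prescribed in the slides.

```typescript
// Sketch: soft validation that never forces an answer. If a question is left
// blank, the respondent is prompted once; a second attempt to continue is accepted.

const promptedOnce = new Set<string>(); // question ids already prompted

function canLeaveScreen(questionId: string, answer: string | null): boolean {
  if (answer !== null && answer !== "") return true; // answered: move on
  if (promptedOnce.has(questionId)) return true;     // already reminded: allow the skip
  promptedOnce.add(questionId);
  alert("You left this question blank. You may answer it or continue without answering.");
  return false;                                      // stay on the screen this one time
}
```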
Key Questionnaire Design Principles
Summary
• Minimize respondent burden and frustration
• The fewer ‘clicks,’ the better
• The less scrolling, the better
• The fewer distractions, the better
• The fewer problems knowing how to navigate the questionnaire,
the better
• The less download time required, the better
• Never force respondents to answer questions
Some other design recommendations
to consider (from Couper 2008):
• Remove unneeded content and clutter
• Minimize the number of different colors and fonts being
used
• Use consistent design formats through the entire
instrument
• Avoid putting too much material on any page
Summary
 Web surveys vary greatly in their goals, design, execution,
analysis, etc.
 Evaluation must be done in the context of the type of survey
being conducted
 Cannot say that all web surveys are good or bad
 Methodological research is being done on a moving target
Thank You
[email protected]