Transcript Chapter 2

Chapter 2: Research Methods in Social Psychology
Chapter Outline

- Characteristics of Empirical Research
- Research Methods
- Research in Diverse Populations
- Ethical Issues in Social Psychological Research
Methodology

A set of procedures that guide the collection and analysis of data. In a typical study:

1. Develop a research design.
2. Go into a laboratory or field setting and collect data.
3. Code and analyze the data to test hypotheses and arrive at conclusions about the behaviors or events under investigation.
Objectives of Research

- Describe reality.
- Identify correlations between variables.
- Test causal hypotheses.
- Develop and test theories.
Research Hypotheses

- A hypothesis is a conjectural statement of the relation between two or more variables.
- Many social psychological studies begin with one or more hypotheses.
- Noncausal hypotheses make statements about observed relations between variables.
- Causal hypotheses assert that one variable causes another: "X causes Y" or "Higher levels of X produce lower levels of Y."
Causal Hypotheses

- Always include an independent variable and a dependent variable.
- An independent variable is any variable considered to cause or have an effect on some other variable(s).
- A dependent variable is any variable caused by some other variable.
Extraneous Variable

- Any variable that is not expressly included in the hypothesis but has a causal impact on the dependent variable.
- Extraneous variables are widespread in social psychology because most dependent variables of interest have more than one cause.
Internal Validity

- The extent to which a study's findings are free from contamination by extraneous variables.
- Internal validity is a matter of degree; findings may have high or low internal validity.
- Without internal validity, a study cannot provide clear, interpretable results.
External Validity

- The extent to which a causal relationship, identified in a particular setting with a particular population, can be generalized to other populations, settings, or time periods.
- External validity is important because the results of a study often have practical importance only if they generalize beyond the particular setting in which they appeared.
4 Main Research Methods

- Surveys
- Naturalistic observation
- Archival research based on content analysis
- Experiments
Surveys

Collecting information by asking members of some population a set of questions and recording their responses.

- Useful for identifying the average response to a question, the distribution of responses within the population, and how groups of respondents differ from one another.
Two Types of Surveys

- In an interview survey, a person serves as an interviewer and records the answers from the respondents.
- In a questionnaire survey, the questions appear on paper, and the respondents read and answer them at their own pace.
Response Rate

The percentage of people contacted who complete the survey.

- An interview study can obtain response rates of 75 to 80% or more.
- Mailed questionnaires rarely attain more than a 50% response rate.
Reliability

The extent to which an instrument produces the same results each time it is employed to measure a particular construct under given conditions.
Assessing Reliability: Test-Retest Method

The investigator applies the measuring instrument to the same respondents on two different occasions and compares the responses.

- If the correlation between the responses is high, the instrument has high reliability.
- If the correlation is low, the instrument has low reliability.
Assessing Reliability: Split-Half Method

Example:

- A scale of 20 questions to measure psychological well-being is administered to respondents.
- Randomly divide the questions into two groups of 10, calculate a score for each respondent on each group of 10, and compute a correlation between the scores.
- A high correlation provides confirmation that the scale is reliable.
Validity

- An instrument has face validity if its content is similar to the behavior or process of interest.
- An instrument has criterion validity if we can use it to predict respondents' standing on some other variable of theoretical interest.
- An instrument has construct validity if it provides a good measure of the theoretical concept being investigated by the research.
Survey Questions: Guidelines

1. The more precise and focused a question, the greater its reliability and validity.
2. Avoid jargon or specialized terminology unless you are interviewing specialists.
3. Questions of moderate length elicit more complete answers than very short ones.
4. Threatening questions requiring quantified answers are better asked by presenting a range of answers (1-5) than by asking a question requiring an exact number.
Single Items

A single-item scale consists of a direct positive or negative statement; the respondent indicates whether they agree, disagree, or are unsure.

- Takes a minimum of time and space to present and is easy to score.
- Single-item scales are general and detect only gross differences in attitude.
Likert Scales

A technique based on summated ratings that provides information about how each person feels about the object of interest and how each respondent's attitude compares with the attitudes of others.
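A summated-ratings score can be computed as below. The items, their wording, and the responses are hypothetical; note that negatively worded items must be reverse-scored before summing, a standard step in Likert scaling.

```python
# Hypothetical 4-item Likert scale (1 = strongly disagree ... 5 = strongly
# agree). Items marked reverse=True are negatively worded and are flipped
# before summing.
items = [
    {"text": "I enjoy my job.",              "reverse": False},
    {"text": "My work feels meaningful.",    "reverse": False},
    {"text": "I often dread going to work.", "reverse": True},
    {"text": "My job is a waste of time.",   "reverse": True},
]

def likert_score(responses, items, points=5):
    """Summated-ratings score: reverse-score flagged items, then sum."""
    total = 0
    for r, item in zip(responses, items):
        total += (points + 1 - r) if item["reverse"] else r
    return total

# Higher totals indicate a more positive attitude, so respondents can be
# compared with one another on the same scale.
print(likert_score([5, 4, 2, 1], items))  # positive attitude -> 18
print(likert_score([2, 2, 4, 5], items))  # negative attitude -> 7
```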
The Sample

A subset of the population (the set of all people whose attitudes are of interest to the researcher) selected for study.

- Simple random sample: the researcher selects units from the population so that each unit has an equal probability of being included.
- Stratified sample: the researcher divides the population into groups according to characteristics, selects a random sample of groups, and draws a sample of individuals within each group.
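The two sampling designs can be sketched as follows. The population, the region labels, and all sizes are hypothetical, and the stratified version shown here draws proportionally from every stratum, which is one common variant of the approach described above.

```python
import random

random.seed(0)

# Hypothetical population of 1,000 people, each tagged with a region.
population = [{"id": i, "region": random.choice(["north", "south", "east", "west"])}
              for i in range(1000)]

# Simple random sample: every unit has an equal probability of inclusion.
simple = random.sample(population, 100)

def stratified_sample(pop, key, n):
    """Divide the population into strata by `key`, then draw a random
    sample within each stratum, proportional to stratum size."""
    strata = {}
    for person in pop:
        strata.setdefault(person[key], []).append(person)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, round(n * len(members) / len(pop))))
    return sample

stratified = stratified_sample(population, "region", 100)
print(len(simple), len(stratified))
```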
Field Study

- Involves making observations about behavior as it occurs naturally in everyday settings.
- Data are collected by one or more researchers who directly observe the activity of people and record information about it.
- This method has been used to investigate many forms of social behavior in natural settings.
Strengths and Weaknesses of Field Studies

Strength: allows researchers to study social activity in real-world settings.

Weaknesses:

- Sensitivity to the recording methods used.
- Observations recorded after the fact are less reliable than those recorded on the spot or those based on audio- or videotaping.
- Validity may depend on the identities the investigators project while making observations.
Archival Research

- Acquisition and analysis (or re-analysis) of information collected previously by others.
- Archival research usually costs less than alternative methods.
Content Analysis

Involves undertaking a systematic scrutiny of documents or messages to identify specific characteristics and make inferences based on their occurrence.
Steps in Content Analysis

1. Identify the unit to be studied: is it the word, the sentence, the paragraph, or the article?
2. Define the categories into which the units will be sorted.
3. Code the units in each document into the categories.
4. Look for relations within the categorized data.
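The four steps can be sketched in a minimal word-count example. The documents and the coding categories are hypothetical; a real coding scheme would need to be shown reliable and valid before use.

```python
# Step 1: the unit of analysis here is the word.
documents = [
    "economy jobs growth economy trade",
    "election jobs candidate election vote",
]

# Step 2: define the categories the units will be sorted into
# (a hypothetical two-category coding scheme).
categories = {
    "economic":  {"economy", "jobs", "growth", "trade"},
    "political": {"election", "candidate", "vote"},
}

# Step 3: code each unit in each document into the categories.
counts = [{name: 0 for name in categories} for _ in documents]
for doc_id, doc in enumerate(documents):
    for word in doc.split():
        for name, vocab in categories.items():
            if word in vocab:
                counts[doc_id][name] += 1

# Step 4: look for relations within the categorized data, e.g. which
# document emphasizes which theme.
for doc_id, c in enumerate(counts):
    print(doc_id, c)
```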
Strengths of Archival Research

- Comparatively low cost.
- By using existing information, an investigator may complete a study more quickly.
- The investigator can test hypotheses about phenomena that occur over extended periods of time.
Weaknesses of Archival Research

- Lack of control over the type and quality of information.
- Creating a reliable and valid content analysis scheme for use with records can be difficult.
Experiments

For a study to be a true experiment, it must have these characteristics:

1. The researcher must manipulate one or more of the independent variables that are hypothesized to have a causal impact on the dependent variable(s) of concern.
2. The researcher must assign participants randomly to the different levels of each of the independent variables.
Experiments: Controlling Factors

Experiments must control factors affecting the dependent variable by:

1. Randomly assigning participants to treatments.
2. Holding constant known extraneous variables.
3. Incorporating extraneous variables in the design.
4. Measuring extraneous variables and including them in the data analysis as covariates of independent variables.
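Random assignment, the first control listed above, can be sketched as follows; the participant labels and condition names are hypothetical.

```python
import random

random.seed(7)

# Hypothetical pool of 40 participants.
participants = [f"P{i:02d}" for i in range(40)]

def random_assign(pool, conditions):
    """Shuffle the pool, then split it into equal-sized groups, so every
    participant has the same chance of landing in any condition."""
    shuffled = pool[:]
    random.shuffle(shuffled)
    size = len(shuffled) // len(conditions)
    return {cond: shuffled[i * size:(i + 1) * size]
            for i, cond in enumerate(conditions)}

groups = random_assign(participants, ["treatment", "control"])
print({cond: len(members) for cond, members in groups.items()})
```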
Strengths and Weaknesses of Research Methods

Method                | Internal Validity | External Validity | Investigator Control
Survey                | Moderate          | Moderate          | Moderate
Observational Study   | Low               | Moderate          | Moderate
Archival Research     | Low               | Moderate          | Low
Laboratory Experiment | High              | Moderate          | High
Field Experiment      | Moderate          | High              | Moderate
Strengths and Weaknesses of Research Methods (continued)

Method                | Intrusiveness of Measures | Difficulty | Ethical Problems
Survey                | Moderate                  | Moderate   | Few
Observational Study   | Moderate                  | Moderate   | Many
Archival Research     | Low                       | Low        | Few
Laboratory Experiment | Moderate                  | Moderate   | Some
Field Experiment      | Low                       | High       | Some
Meta-analysis

A statistical technique that allows the researcher to combine results from previous studies to determine what, collectively, they say.
Steps in Conducting a Meta-analysis

1. The researcher locates all previous studies on the question.
2. For each study, the investigator computes a statistic (d) that measures:
   - How large the difference was between those who did and did not interact with members of the group.
   - What the direction of the difference was (whether those who had contact were more or less prejudiced).
Steps in Conducting a Meta-analysis (continued)

3. The researcher averages all the values of d over all the studies that were located.
   - This value tells what the direction of the difference is in attitudes between those who do and do not have contact with the group, and how large the difference is for all the studies combined.
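Step 3 amounts to averaging effect sizes. The d values below are invented for illustration, and a simple unweighted mean is used; published meta-analyses typically weight each study's d by its sample size.

```python
from statistics import mean

# Hypothetical effect sizes (d) from the located studies; positive values
# mean those who had contact were less prejudiced. Illustrative data only.
effect_sizes = [0.42, 0.31, 0.55, -0.05, 0.38, 0.47]

# The sign of the mean gives the direction of the difference; its
# magnitude gives how large the difference is across all studies combined.
mean_d = mean(effect_sizes)
print(f"mean d = {mean_d:.2f}")
```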
Common Ethical Questions in Research Studies

1. Is it possible that participants might be harmed physically?
2. Does the study use any form of deception?
3. Does the study induce participants to engage in behavior that might threaten their self-respect?
Common Ethical Questions in Research Studies (continued)

4. If the investigators make audio- or videotapes of the participants, will they obtain permission to use the tapes as data?
5. How will investigators preserve the confidentiality of participants?
6. Will the investigators tell potential participants in advance about the risks that their participation may entail?
Common Ethical Questions in Research Studies (continued)

7. Will participants have a chance to ask questions about the study before they consent?
8. Will the investigators inform the participants that they have the right to terminate their participation at any time?
9. At the end of the study, will investigators debrief the participants and tell them about the nature of the study and its procedures?
Informed Consent: Essential Elements

- The researcher should explain the purposes of the research and describe the procedures to be employed.
- Investigators should inform participants about any foreseeable risks of participation.
- Researchers should provide a description of any benefits to the participant or others.
Informed Consent: Essential Elements (continued)

- Investigators should provide information about which medical or psychological resources are available if participants are adversely affected by participation.
- Researchers should offer to answer questions about the study whenever possible.
- Researchers should inform potential participants that they have the right to terminate their participation at any time.