Intro to Research


Sociology 282
Searching for [causal] relationships
Cause
• Layperson's definition: when one thing affects another
• An explanation involving the belief that variation in the independent variable will be followed by variation in the dependent variable, when all other things are equal (ceteris paribus)
Hypothesis
• A statement of a presumed causal relationship [between 2 or more variables]
• Schutt: a tentative statement about empirical reality involving a relationship between two or more variables
Nomothetic
• Usually where the quantitative folks are
• Often is probabilistic
Example
• The more of a kid's play that takes place in the street, the more injuries will occur
• The more often a woman goes to her car alone from the mall, the more likely her purse is to be stolen
• As you increase your partying before exams, your grades go down
Probabilistic?
• Note that it probably will happen, but not to everyone… or not every time (see the sketch below)
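A minimal sketch of what "probabilistic" means here, using the partying example with entirely made-up numbers (the rates, hours, and variable names are illustrative assumptions, not data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hours spent partying before the exam (invented range)
party_hours = rng.uniform(0, 20, n)

# Assumed relationship: more partying raises the *chance* of a low grade,
# but the probability never reaches 1 -- it doesn't happen to everyone.
p_low_grade = 0.10 + 0.02 * party_hours
low_grade = rng.random(n) < p_low_grade

light = party_hours < 5      # light partiers
heavy = party_hours > 15     # heavy partiers
print("low-grade rate, light partiers:", round(low_grade[light].mean(), 2))  # ~0.15
print("low-grade rate, heavy partiers:", round(low_grade[heavy].mean(), 2))  # ~0.45
# The rate roughly triples, yet more than half of the heavy partiers
# still avoid a low grade: probabilistic, not deterministic.
```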
Idiographic
• Usually the place the qualitative folks are located
• The finding that a series of events following an initial set of conditions leads in a progressive manner to a particular event or outcome
• Narrative reasoning
• High concern for context
• Holistic explanations
• Deterministic
Example
• Stephen King's broken leg was caused by his walking on a narrow shoulder of the road, on a slope where there was poor visibility, when a speeding, drunken driver swerved off the road
Cause
• When changes in the independent variable X [the causal thing] lead to changes in the dependent variable Y [the effect]
Example
• Study [X] → Performance [Y]
• The more you study, the better you'll do in this class
Example 2
• Fire fighters → Fire damage
Spurious relationship?
• Size of fire → fire fighters
• Size of fire → damage
Spurious relationship
• Schutt: a relationship that appears to be connected but is not
• An apparent relationship is really caused by a prior variable Z, which only makes X and Y look as if they're related (see the simulation sketch below)
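A minimal simulation of the fire fighters example, with invented numbers: the prior variable Z (size of fire) drives both X (fire fighters sent) and Y (damage), so X and Y look strongly related overall, yet among fires of the same size they are not. The coefficients and scales here are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Z: size of the fire (0 = small, 1 = large) -- the prior variable
z = rng.integers(0, 2, n)

# Large fires get more fire fighters AND cause more damage; within a
# size class, fire fighters and damage are just independent noise.
x = 5 + 10 * z + rng.normal(0, 1, n)     # fire fighters sent
y = 20 + 80 * z + rng.normal(0, 5, n)    # damage (invented units)

print("overall corr(X, Y):", round(np.corrcoef(x, y)[0, 1], 2))                      # strong, looks causal
print("corr among small fires:", round(np.corrcoef(x[z == 0], y[z == 0])[0, 1], 2))  # near 0
print("corr among large fires:", round(np.corrcoef(x[z == 1], y[z == 1])[0, 1], 2))  # near 0
# Holding Z constant makes the X-Y relationship disappear: it was spurious.
```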
Major Problem:
• What looks like a causal relationship may be spurious
• This is a major issue for research design/data collection and will be considered below in the "Internal validity" section
Types* of social science research
• *Note: I think that Schutt's calling these "types" of research rather than "goals" is confusing
Types* of social science research
• Descriptive Research
• Exploratory Research
• Predictive Research [not mentioned in
Schutt]
• Explanatory Research
• Evaluation
Descriptive Research
• Research that defines and describes
social phenomena (e.g., National
Geographic “Survey 2000” that described
Internet users around the world and
identified differences between countries)
• Research that is not searching for causes or reasons why things happen… stuff like taking a census, or poll-taking for political purposes
Exploratory Research
• investigation of social phenomena without
expectations (e.g., electronic diabetes
newsgroups were found to also be support
and information networks, a place where
information could be assimilated to inform
choices)
• Perhaps research that looks for causes
when we know very little about our topic of
study
Predictive Research
• Research that establishes relationships that will let us guess how folks will do in the future (e.g., using information like SATs or ACTs to predict future college performance when we don't presume that performance on these tests causes later college performance); see the sketch below
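A minimal sketch of prediction without causation, using fabricated SAT and GPA numbers (the coefficients, scores, and the assumed linear relationship are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated data: SAT scores and later college GPAs for 200 students
sat = rng.uniform(900, 1500, 200)
gpa = np.clip(1.0 + 0.0015 * sat + rng.normal(0, 0.3, 200), 0, 4)

# Fit a prediction line: useful for guessing future performance,
# with no claim that taking the SAT *causes* that performance.
slope, intercept = np.polyfit(sat, gpa, 1)
print("predicted GPA for an SAT of 1300:", round(slope * 1300 + intercept, 2))
```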
Explanatory Research
• Research that identifies causes and effects of social phenomena (e.g., research that suggests that Internet use hurts or helps other forms of social interaction)
Evaluation
• Research that determines the effects of a social program or other type of intervention (e.g., in the Toronto, Ont., suburb that was wired for the Internet, universal Internet access increased relations between residents)
• Research that assesses how well you do your
job
• …how well a program works for your kids
• Whether your job will be cut
• Research that will affect YOU!
Evaluation
• Inputs: Programs and goals; Equipment
and facilities
• Processes: Program and its delivery
• Products: Short-run, immediate accomplishments
• Outputs: Program goals accomplished
• Outcomes: Long-run results of program goals
Quantitative and Qualitative Orientation
• These days, there are two major philosophical
orientations towards data collection and
interpretation. One is “soft” and the other is
“hard.”
• One advocates that “science will save us.” …and
the other has advocates criticizing science as
the subjective perspective imposed on us by
powerful, Eurocentric, white Protestant males.
• One advocates “objective truth” and sees the
other as “any subjectivity goes.”
• Be prepared to consider these perspectives
throughout the semester.
Orientations?
• Note that I see these as extremes on a continuum, and I see all the issues and criteria relevant for one situation as relevant for the others.
• In particular, we’ll study experiments,
surveys and participant observation [or
qualitative research]
• I think that what applies to one applies to
all.
– Quantitative methods: data collection methods such as surveys and experiments that record variation in social life in terms of categories that vary in amount
• Data are numbers OR attributes that can be ordered in terms of magnitude
• Most often used for explanation, description, and evaluation
– Qualitative methods: data collection methods such as participant observation, intensive interviewing, and focus groups that are designed to capture social life as participants experience it rather than in categories predetermined by the researcher
• Data are mostly written or spoken words or observations
• Data do not have a direct numerical interpretation
• Exploration is the most common motive for using qualitative methods
Goals of Social Research
• Note: this is Schutt’s title
• I prefer "types of validity" or "four reasons science must be tentative"
Goals
• Measurement validity [later called validity]
• External validity
• Causal [internal] validity
• Authenticity
• Measurement validity: exists when a measure measures what we think it measures
• The operational definition matches the conceptual definition
• Schutt says "Generalizability: exists when a conclusion holds true for the population, group, setting, or event that we say it does, given the conditions that we specify."
• External validity involves generalizing from a study about: (1) types of subjects, (2) settings, (3) measures employed, and (4) the timing of delivering the causal variable and the timing of when the effect is observed
External validity
• Some causal influences work only: 1) under some conditions [winter or summer and SAD], 2) for certain types of people [children vs. adults], 3) with specific types of causes in the same category [e.g., therapy], or with different timing [studying for this class all at once or in small doses]
• Generalization is based on representation.
• If you haven't looked at it, you can't talk about how your results apply to it.
Statistical interaction
• If the X-Y relationship works differently under some conditions, etc., than others, this is referred to in the social sciences as statistical interaction.
Example
• [Figure: performance (low to high) plotted for a "no audience" and an "audience" condition; red dots are for experienced performers. A sketch with made-up numbers follows.]
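A minimal sketch of a statistical interaction, with hypothetical group means; the assumption that an audience helps experienced performers and hurts inexperienced ones is one illustrative pattern, not a finding from the slide's data:

```python
# Made-up mean performance scores (0-100) by experience and audience condition
mean_performance = {
    ("experienced", "no audience"): 70,
    ("experienced", "audience"): 85,      # audience helps the experienced...
    ("inexperienced", "no audience"): 70,
    ("inexperienced", "audience"): 55,    # ...but hurts the inexperienced
}

for group in ("experienced", "inexperienced"):
    effect = (mean_performance[(group, "audience")]
              - mean_performance[(group, "no audience")])
    print(f"{group}: audience effect on performance = {effect:+d}")

# The X-Y effect is +15 in one group and -15 in the other: the
# audience-performance relationship depends on experience (an interaction).
```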
• Causal (internal) validity: exists when a conclusion that A leads to or results in B is correct
• Causal validity… more later… means that we tentatively trust that the cause is responsible for the effect… and that we're not being fooled by other things going on
• Internal validity is concerned with eliminating spuriousness and certain types of statistical interaction
• Authenticity: when the understanding of a social process or social setting is one that reflects fairly the various perspectives of participants in that setting (i.e., a resolution of whether an objective social reality exists independent of actors' interpretations)
• Note that I don't see a resolution here. I see that colleagues want to ensure that what we as researchers think is going on is cool with participants.
However
• Sometimes the critics of "objective research" feel that the "scientist" is speaking for the participant and that this leads to false research… yet such critics have no qualms about becoming "the voices of the disenfranchised," even to the point of figuring out what people feel for them… saying what the participant is saying even when the participant isn't aware of it… aren't we back to square one?
What we’re about:
• Major emphasis on validating cause-effect
relationships
• …in experiments, surveys, and so-called qualitative studies
• Examining measurement, causal & generalizing validity, and authenticity
• Using and learning research tools, and
• Learning new vocabulary!
• Becoming an informed consumer and critic of research