Marketing Research
Aaker, Kumar, Day and Leone
Tenth Edition
Instructor’s Presentation Slides
Chapter Nineteen
Correlation Analysis and
Regression Analysis
Definitions
• Correlation analysis
▫ Measures strength of the
relationship between two
variables
• Correlation coefficient
▫ Provides a measure of the
degree to which there is an
association between two
variables (X and Y)
Regression Analysis
• Statistical technique that is used to relate two or more
variables
• Objective is to build a regression model or a prediction
equation relating the dependent variable to one or
more independent variables
• The model can then be used to describe, predict, and
control the variable of interest on the basis of the
independent variables
• Multiple regression analysis - Regression analysis that
involves more than one independent variable
Correlation Analysis
• Pearson correlation coefficient
▫ Measures the degree to which there is a linear association between
two interval-scaled variables
▫ A positive correlation reflects a tendency for a high value in one
variable to be associated with a high value in the second
▫ A negative correlation reflects an association between a high value
in one variable and a low value in the second variable
Correlation Analysis (Contd.)
• Population correlation (ρ) - If the database includes
an entire population
• Sample correlation (r) - If measure is based on a
sample
r lies between -1 and +1 (-1 ≤ r ≤ +1)
r = 0 → absence of linear association
Scatter Plots
Scatter Plots (Contd.)
Correlation Coefficient
Simple Correlation Coefficient
Cov(X, Y) = Σ(Xi - X̄)(Yi - Ȳ) / (n - 1)
Pearson Product-Moment Correlation Coefficient
rxy = [1 / (n - 1)] Σ [(Xi - X̄) / Sx] * [(Yi - Ȳ) / Sy] = Covxy / (Sx * Sy)
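A minimal Python sketch of these two equivalent computations, using small illustrative data (not the textbook's example):

```python
import numpy as np

# Illustrative data, not from the text
x = np.array([2.0, 4.0, 5.0, 7.0, 8.0, 10.0])
y = np.array([3.0, 5.0, 4.0, 8.0, 9.0, 11.0])

n = len(x)
x_bar, y_bar = x.mean(), y.mean()
s_x, s_y = x.std(ddof=1), y.std(ddof=1)        # sample standard deviations (n - 1)

# Sample covariance: sum of cross-products of deviations over (n - 1)
cov_xy = np.sum((x - x_bar) * (y - y_bar)) / (n - 1)

# Pearson product-moment correlation, two equivalent forms
r_from_cov = cov_xy / (s_x * s_y)
r_from_z = np.sum(((x - x_bar) / s_x) * ((y - y_bar) / s_y)) / (n - 1)

print(round(r_from_cov, 4), round(r_from_z, 4))   # both match np.corrcoef(x, y)[0, 1]
```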
Determining Sample Correlation Coefficient
Testing the Significance of the
Correlation Coefficient
• Null hypothesis: H0: ρ = 0
• Alternative hypothesis: Ha: ρ ≠ 0
• Test statistic: t = r √(n - 2) / √(1 - r²)
Example: n = 6 and r = .70
t = .70 * √(6 - 2) / √(1 - .70²) = 1.96
At α = .05, with n - 2 = 4 degrees of freedom, the critical value of t = 2.78
Since 1.96 < 2.78, we fail to reject the null hypothesis.
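A short Python check of the slide's numbers, assuming scipy is available for the critical value:

```python
import numpy as np
from scipy import stats

n, r = 6, 0.70
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)    # ≈ 1.96
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 2)       # two-tailed, alpha = .05, 4 df ≈ 2.78

# |t| < critical value, so fail to reject H0: rho = 0
print(round(t_stat, 2), round(t_crit, 2))
```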
Partial Correlation Coefficient
Measure of association between two variables after controlling for the
effects of one or more additional variables
rXY,Z = (rXY - rXZ * rYZ) / √[(1 - r²XZ)(1 - r²YZ)]
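A small Python helper implementing this formula; the three pairwise correlations below are hypothetical values chosen only for illustration:

```python
import numpy as np

def partial_corr(r_xy, r_xz, r_yz):
    """Correlation between X and Y after controlling for Z,
    computed from the three pairwise correlations."""
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical pairwise correlations
print(round(partial_corr(r_xy=0.60, r_xz=0.50, r_yz=0.40), 3))   # ≈ 0.504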
Regression Analysis
Simple Linear Regression Model
Yi = βo + β1xi + εi
Where
▫ Y = Dependent variable
▫ X = Independent variable
▫ βo = Model parameter that represents the mean value of the dependent variable (Y) when the independent variable (X) is zero
▫ β1 = Model parameter that represents the slope, which measures the change in the mean value of the dependent variable associated with a one-unit increase in the independent variable
▫ εi = Error term that describes the effects on Yi of all factors other than the value of Xi
Simple Linear Regression Model
Simple Linear Regression Model –
A Graphical Illustration
Assumptions of the Simple
Linear Regression Model
• Error term is normally distributed (normality assumption)
• Mean of error term is zero [E(εi) = 0]
• Variance of error term is a constant and is independent of the values
of X (constant variance assumption)
• Error terms are independent of each other (independence assumption)
• Values of the independent variable X are fixed (non-stochastic X)
Estimating the Model Parameters
• Calculate point estimates bo and b1 of the unknown parameters βo and β1
• Obtain a random sample and use the sample information to estimate βo and β1
• Obtain a line of best "fit" for sample data points - least squares line
Predicted value of Yi: Ŷi = bo + b1Xi
where b1 = Σ(Xi - X̄)(Yi - Ȳ) / Σ(Xi - X̄)² and bo = Ȳ - b1X̄
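A minimal Python sketch of these least-squares estimates on illustrative data (not a textbook example):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # illustrative independent variable
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])   # illustrative dependent variable

x_bar, y_bar = x.mean(), y.mean()

# Least-squares slope and intercept
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

y_hat = b0 + b1 * x                            # predicted values on the least-squares line
print(round(b0, 3), round(b1, 3))
```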
Residual Value
• Difference between the actual and predicted values
• Estimate of the error in the population
ei = yi - ŷi
   = yi - (bo + b1xi)
• bo and b1 minimize the residual or error sum of squares (SSE)
SSE = Σei² = Σ(yi - ŷi)²
    = Σ[yi - (bo + b1xi)]²
Standard Error
• Mean Square Error
• Standard Error of b1
• Standard Error of b0
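The slide names these quantities without showing the expressions; a sketch using the standard simple-regression formulas (MSE = SSE / (n - 2), with the standard errors of b1 and b0 derived from it), on the same illustrative data as above:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = len(x)

x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

sse = np.sum((y - (b0 + b1 * x)) ** 2)          # residual sum of squares
mse = sse / (n - 2)                             # mean square error (two estimated parameters)
s = np.sqrt(mse)                                # standard error of the estimate

se_b1 = s / np.sqrt(np.sum((x - x_bar) ** 2))                        # standard error of b1
se_b0 = s * np.sqrt(1 / n + x_bar**2 / np.sum((x - x_bar) ** 2))     # standard error of b0

print(round(mse, 4), round(se_b1, 4), round(se_b0, 4))
```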
Testing the Significance of Independent Variables
• Null Hypothesis
▫ There is no linear relationship between the independent &
dependent variables
H0: β1 = 0
• Alternative Hypothesis
▫ There is a linear relationship between the independent &
dependent variables
H a : β1 ≠ 0
Testing the Significance of Independent Variables
(Contd.)
• Test Statistic
t = (b1 - β1) / sb1
• Degrees of Freedom: v = n - 2
• Testing for a Type II Error
Ho: β1 = 0
Ha: β1 ≠ 0
• Decision Rule
Reject Ho: β1 = 0 if p-value < α
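A sketch of this t-test under H0: β1 = 0, continuing the illustrative data used earlier (scipy assumed for the p-value):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = len(x)

x_bar = x.mean()
b1 = np.sum((x - x_bar) * (y - y.mean())) / np.sum((x - x_bar) ** 2)
b0 = y.mean() - b1 * x_bar

sse = np.sum((y - (b0 + b1 * x)) ** 2)
se_b1 = np.sqrt(sse / (n - 2)) / np.sqrt(np.sum((x - x_bar) ** 2))

t_stat = (b1 - 0) / se_b1                          # H0: beta1 = 0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)    # two-tailed p-value

alpha = 0.05
print("reject H0" if p_value < alpha else "fail to reject H0")
```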
Sum of Squares
• SST: Sum of squared prediction error that would be obtained if we do not use x to predict y
• SSE: Sum of squared prediction error that is obtained when we use x to predict y
• SSM: Reduction in sum of squared prediction error that has been accomplished by using x in predicting y
Predicting the Dependent Variable
• Predicted value of the dependent variable: ŷi = bo + b1xi
• Error of prediction: yi - ŷi
• Total variation (SST) = Explained variation (SSM) + Unexplained variation (SSE)
Σ(Yi - Ȳ)² = Σ(Ŷi - Ȳ)² + Σ(Yi - Ŷi)²
Coefficient of Determination (r²)
• Measure of the regression model's ability to predict
r² = (SST - SSE) / SST
   = SSM / SST
   = Explained Variation / Total Variation
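A short Python sketch of this decomposition and of r², again on illustrative data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

b1, b0 = np.polyfit(x, y, 1)             # least-squares slope and intercept
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)        # total variation
ssm = np.sum((y_hat - y.mean()) ** 2)    # explained variation
sse = np.sum((y - y_hat) ** 2)           # unexplained variation

r2 = ssm / sst                           # equivalently (sst - sse) / sst
print(round(sst, 3), round(ssm + sse, 3), round(r2, 4))   # sst ≈ ssm + sse
```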
Multiple Regression
• A linear combination of predictor factors is used to predict the
outcome or response factors
• The general form of the multiple regression model is
Y = α + β1X1 + β2X2 + . . . + βkXk + ε
where
β1 , β2, . . . , βk are regression coefficients associated with the
independent variables X1, X2, . . . , Xk and
ε is the error or residual.
Multiple Regression (Contd.)
• The prediction equation in multiple regression
analysis is
Ŷ = α + b1X1 + b2X2 + . . . + bkXk
where
Ŷ is the predicted Y score and
b1 . . . , bk are the partial regression coefficients.
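A minimal sketch of fitting this prediction equation by least squares with NumPy; the two predictors and response values are illustrative only:

```python
import numpy as np

# Illustrative data: two predictors (columns) and a response
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0],
              [6.0, 5.0]])
y = np.array([3.1, 3.0, 6.2, 6.0, 9.1, 8.9])

# Prepend a column of ones for the intercept (alpha), then solve least squares
X_design = np.column_stack([np.ones(len(y)), X])
alpha, b1, b2 = np.linalg.lstsq(X_design, y, rcond=None)[0]

y_hat = alpha + b1 * X[:, 0] + b2 * X[:, 1]       # predicted Y scores
print(round(alpha, 3), round(b1, 3), round(b2, 3))
```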
Partial Regression Coefficients
Y = α + b1X1 + b2X2 + error
• b1 is the expected change in Y when X1 is changed by one unit, keeping X2 constant or controlling for its effects.
• b2 is the expected change in Y for a unit change in X2, when X1 is held constant.
• If X1 and X2 are each changed by one unit, the expected change in Y will be (b1 + b2).
Evaluating the Importance of Independent Variables
• Consider t-value for βi's
• Use beta coefficients when independent variables are in
different units of measurement
Standardized βi = bi * (standard deviation of Xi / standard deviation of Y), as sketched below
• Check for multicollinearity
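A sketch of this beta-weight rescaling, reusing the illustrative two-predictor data from the earlier multiple-regression example:

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0],
              [6.0, 5.0]])
y = np.array([3.1, 3.0, 6.2, 6.0, 9.1, 8.9])

X_design = np.column_stack([np.ones(len(y)), X])
b = np.linalg.lstsq(X_design, y, rcond=None)[0][1:]     # partial regression coefficients

# Beta weights: rescale each b_i by sd(X_i) / sd(Y) so predictors become comparable
beta = b * X.std(axis=0, ddof=1) / y.std(ddof=1)
print(np.round(beta, 3))
```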
Stepwise Regression
• Predictor variables enter or are removed from the
regression equation one at a time
• Forward Addition
▫ Start with no predictor variables in regression
equation
i.e. y = βo + ε
▫ Add variables if they meet certain criteria in terms of
F-ratio
Stepwise Regression (Contd.)
• Backward Elimination
▫ Start with full regression equation
i.e. y = βo + β1x1 + β2x2 + . . . + βrxr + ε
▫ Remove predictors based on F-ratio
• Stepwise Method
▫ Forward addition method is combined with removal of
predictors that no longer meet specified criteria at
each step
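A rough sketch of the forward-addition step, using a partial F-ratio with an assumed F-to-enter threshold of 4.0 (a common default, not a value from the text):

```python
import numpy as np

def sse(cols, y):
    """Residual sum of squares for an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + cols)
    coefs = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.sum((y - X @ coefs) ** 2)

def forward_selection(X, y, f_to_enter=4.0):
    n, k = X.shape
    chosen, remaining = [], list(range(k))
    while remaining:
        current_sse = sse([X[:, j] for j in chosen], y)
        best_j, best_f = None, 0.0
        for j in remaining:                      # try each candidate predictor
            new_sse = sse([X[:, i] for i in chosen + [j]], y)
            df_resid = n - (len(chosen) + 2)     # intercept + chosen predictors + candidate
            f = (current_sse - new_sse) / (new_sse / df_resid)
            if f > best_f:
                best_j, best_f = j, f
        if best_j is None or best_f < f_to_enter:
            break                                # no candidate meets the F-to-enter criterion
        chosen.append(best_j)
        remaining.remove(best_j)
    return chosen

# Usage on synthetic data: only the first column truly drives y
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = 2.0 + 1.5 * X[:, 0] + rng.normal(scale=0.5, size=30)
print(forward_selection(X, y))                   # typically [0]
```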
Residual Plots
(Four illustrative residual patterns: random distribution of residuals, nonlinear pattern of residuals, heteroskedasticity, autocorrelation)
Predictive Validity
• Examines whether a model estimated with one set of data continues to hold on comparable data not used in the estimation.
• Estimation Methods
1. The data are split into the estimation sample (with more than half of the total
sample) and the validation sample, and the coefficients from the two samples are
compared.
2. The coefficients from the estimated model are applied to the data in the validation
sample to predict the values of the dependent variable Yi in the validation sample,
and then the model fit is assessed.
3. The sample is split into halves – estimation sample and validation sample for
conducting cross-validation. The roles of the estimation and validation halves are
then reversed, and the cross-validation is repeated.
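A sketch of the split-half procedure in method 3 on synthetic data; the split, fit, and role reversal are the point, while the data and the resulting R² values are purely illustrative:

```python
import numpy as np

def fit_ols(X, y):
    """Least-squares coefficients, intercept first."""
    Xd = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Xd, y, rcond=None)[0]

def r_squared(coefs, X, y):
    y_hat = np.column_stack([np.ones(len(y)), X]) @ coefs
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(42)
X = rng.normal(size=(40, 2))
y = 1.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=40)

# Estimation half and validation half; fit on one, assess fit on the other, then reverse
half = len(y) // 2
for est, val in [(slice(0, half), slice(half, None)),
                 (slice(half, None), slice(0, half))]:
    coefs = fit_ols(X[est], y[est])
    print(np.round(coefs, 2), round(r_squared(coefs, X[val], y[val]), 3))
```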
Regression with Dummy Variables
Yi = a + b1D1 + b2D2 + b3D3 + error
• For the rational buyer, Ŷi = a
• For brand-loyal consumers, Ŷi = a + b1
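A hypothetical illustration of the dummy-variable setup with NumPy; the segment names, response values, and use of only two dummies (three groups) are made up for the sketch:

```python
import numpy as np

# Hypothetical buyer segments; "rational" is the baseline (all dummies = 0)
segments = np.array(["rational", "brand_loyal", "deal_prone", "rational",
                     "brand_loyal", "deal_prone", "rational", "brand_loyal"])
y = np.array([5.0, 8.1, 6.4, 5.2, 7.9, 6.6, 4.8, 8.0])   # e.g., purchase intent scores

d1 = (segments == "brand_loyal").astype(float)    # D1
d2 = (segments == "deal_prone").astype(float)     # D2

X = np.column_stack([np.ones(len(y)), d1, d2])
a, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

print(round(a, 2))        # predicted Y for the rational (baseline) buyer: Ŷ = a
print(round(a + b1, 2))   # predicted Y for brand-loyal consumers: Ŷ = a + b1
```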