
MARK2038
Data Base Marketing Strategies II
Week 11
Instructor: Santo Ligotti
Email: [email protected]
Testing, Metrics, and Post Analysis
This week
Testing, metrics, and post analysis
In-class assignment #5
Structure/content of final test (July 18th, 2006)
Learning Objectives
In this session you will learn:
why testing of DBM programs is important;
the 4 steps you can take to test DBM programs;
how to analyze the effectiveness of direct response campaigns, including response rate, ROI and cost per response.
Campaign Management Process
1. Planning
(Process: Planning → List Compilation → Implementation → Measurement)
List
Budget
Offer/call to action
Fulfillment
Creative format
Messages and copy
Response device
Testing process
Response tracking
Financial success measures
Campaign Management Process
2. List Compilation
(Process: Planning → List Compilation → Implementation → Measurement)
Purchase response lists/compiled lists
Ensure any last-minute field edits are complete
Select list members
Forward records to agency/suppliers
Flag records for inclusion in CRM system
Campaign Management Process
3. Implementation
(Process: Planning → List Compilation → Implementation → Measurement)
Campaign is activated
Customer inquiries and orders are acted upon
Information is received from selected media channels
4. Measurement
Monitor the results of the campaign for effectiveness
Input recommendations to direct marketing planning
Time to Market
Marketing campaigns require an average of 2.5
months to implement.
Reducing Time to Market
The longer the campaign lead time, the less likely the message will be relevant to its audience … and the less likely it will be “highly effective.”
Getting the right mix requires internal partnerships
A partnership between Marketing and Analytics will maximize campaign results
Involve the data analytics team at the beginning of the campaign to establish key business objectives, pre-analysis, targeting and key metrics/tracking
Continually integrate the data analytics team’s tracking and key insights into future campaigns to maximize ROI of all marketing initiatives
The Business Challenge
With increasing pressure from shareholders/analysts to continually improve financial results, marketers need to be able to show that their campaigns are delivering strong results
To ensure marketing dollars are maximized, data analytics needs to become a key partner in the ongoing measurement and tracking of campaigns
A number of marketers are still struggling to demonstrate that their campaigns deliver quantifiable results
So how do we as marketers achieve this?
Data Analytics is key to the CRM Process
(CRM cycle: Know the Customer → Identify Potential Customer Actions → Create Appropriate Message → Deliver Message → Listen → Action → Test and Learn)
Knowing Your Customer starts with Data Analytics
Analyze customer behaviour to determine key drivers of value
Know recent key events and interaction with your company
Utilizing Data Analytics allows you to Identify Potential Customer Actions
Continuously target and tailor offerings based on testing and learning
Marketing and Data Analytics allow you to Create the Appropriate Message
Explicitly manage the flow and sequence of marketing communications to each customer
Marketing Delivers the Message to the Customer
Create a dynamic and consistent messaging and response capability at all customer touch (communication) points
Data Analytics allows you to Listen to the Customer’s Response
Capture and remember relevant customer conversations
Data Analytics allows you to Track Customer Responses and gain Insights
Customer responds to the message
Key learnings are integrated into future programs by marketing
Establishing a Test & Learn Partnership between marketing & data analytics will maximize results
Conduct sophisticated tests, share learning widely, and implement fast read and re-launch capability
The Concept of Testing
Why Test?
Good economics:
Use a sample to learn what works and what doesn’t work before rolling out to the entire database
Continuous improvement:
Learn how to improve marketing programs to ensure they are as effective as possible
Testing Multiple Variables
Test all or some variables
Why?
 Learning Loop: Generates constant feedback on
how to improve effectiveness of communications
Commonly tested variables:
 Lists
 Offers
 Creative execution
 Channel
 Content
Testing an Idea: Four Steps
1. Plan Test
Define objectives
Set up test and control groups
2. Execute Test
3. Track Results
4. Analyze Results
Response rate
ROI
Cost per response
LTV
Example - Department Store
Assumptions
Store has a house credit card tied to a customer database containing 400,000 men and women
Store credit card allows capture of information about purchases
Store has a new line of designer clothes for women, being promoted through print ads
Store would like to increase sales of the new clothing line
Store decides to test a direct mail program with a small group of women customers before rolling out to the entire database
Offer: customers who buy a new suit by May 30 and present this offer receive a free piece of costume jewelry worth $20
Step 1: Plan Test
i) Define Marketing Objectives
What are you trying to accomplish?
Objectives should be measurable and time-bound.
Department Store Example:
To increase sales to existing customers by 4% within 1 year.
To achieve sales of the new clothing line of $4.2 million.
To increase LTV per customer from $80 to $125 over the next 12 months.
Step 1: Plan Test
ii) Set up test and control groups
(Total customers are split into a Test Group, which gets the offer, and a Control Group, which does NOT get the offer)
Why use a Control Group?
• Allows you to measure the effect of running the promotion versus not running it
• No offer or promotional piece is sent to the control group
• Can be larger or smaller than the test group
Step 1: Plan Test
Set up test and control groups
Query the database to determine how many women
have credit cards in their name
Example - Department store
200,000 women with department store credit card in
their name
Must select 2 groups from this 200,000:
 Women who get the direct mail offer (Test)
 Women who do not get the DM offer (Control)
Test & Control Groups: How large?
Cost considerations: make as small as possible
Statistical validity: make as large as possible
Rule of Thumb:
Each group must be big enough so that you receive at
least 500 responses from the promoted group
Example
If you anticipate a response rate of 2%:
Test group needs to be 500 / 2% = 25,000
Test & Control Groups: How large?
Example: Department Store
Anticipate response rate of 2.5%
200K women in database
Test group size = 500/.025 = 20,000
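A minimal sketch of the rule-of-thumb sizing calculation from the two slides above; the function name and the default 500-response floor parameter are illustrative, not part of the deck.

```python
def test_group_size(anticipated_response_rate, min_responses=500):
    """Rule-of-thumb sizing: mail enough names that the promoted group
    is expected to return at least `min_responses` responses."""
    return round(min_responses / anticipated_response_rate)

# Earlier example: anticipated response rate of 2%   -> 25,000
print(test_group_size(0.02))
# Department store example: anticipated rate of 2.5% -> 20,000
print(test_group_size(0.025))
```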
Step 1: Plan Test
Set up test and control groups
Construct the Test Group using the ‘Nth’ method (per RFM)
YOUR CONTROL WOULD BE THE SAME FOR THE ENTIRE MAILING UNIVERSE, REGARDLESS OF HOW MANY CELLS
Nth = Total customers in database ÷ Test group quantity
Example: Department store
• Test group of 20K
• Add another 20K for the control group … total = 40K
• Nth = 200,000 / 40,000 = 5
• Select every 5th customer from the master database, i.e. customer record #5, #10, #15 ...
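A short sketch of the ‘Nth’ select using the department store numbers. The function name is illustrative; the deck does not spell out how the 40K selection is split back into test and control cells, so alternating records is just one simple, representative choice.

```python
def nth_select(customers, sample_size):
    """'Nth' select: N = total customers / sample size, then take every Nth
    record (#N, #2N, #3N, ...) so the sample mirrors the master database."""
    n = len(customers) // sample_size
    return customers[n - 1::n]

# Department store example: 200,000 customers, 20K test + 20K control = 40K needed
master = list(range(1, 200_001))          # stand-in for the master database
selected = nth_select(master, 40_000)     # every 5th customer: #5, #10, #15, ...

# One illustrative way to split the 40K into two equal, representative cells
test_group, control_group = selected[0::2], selected[1::2]
print(len(selected), len(test_group), len(control_group))   # 40000 20000 20000
```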
Why use ‘Nth’ select?
Test and Control groups must be exact statistical
replicas of the master database
Must mirror the master database - will have the
same percentage of people with similar
characteristics:
 Same postal code
 Same income
 Same # of children
 Same lifestyle
 Same purchase behaviour etc.
Step 2: Execute Test
Execute the program among the test group, interacting normally with the control group
(200K women customers split into: Test Group → Mailed Offer; Control Group → No Mailed Offer)
Step 3: Track Results
Assign a source code
A “source code” is assigned to each test variable to
facilitate measurement and analysis
A source code is a series of letters or numbers used to
identify a particular offer
Rule: different source code for each new variable
Example:
 Women who got offer: OFFERMAY03
 Women who did not get offer: NOOFFERMAY03
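A small sketch of how source codes might be attached to records so that every response can be attributed back to the right cell; the dictionary and function names are illustrative, only the two source codes come from the slide.

```python
# One source code per test variable, as on this slide (structure is illustrative)
source_codes = {
    "test":    "OFFERMAY03",     # women who got the offer
    "control": "NOOFFERMAY03",   # women who did not get the offer
}

def tag_records(records, group):
    """Attach the group's source code to each record so responses can be
    attributed back to the right cell at analysis time."""
    return [{**record, "source_code": source_codes[group]} for record in records]

tagged = tag_records([{"customer_id": 5}, {"customer_id": 10}], "test")
print(tagged[0])   # {'customer_id': 5, 'source_code': 'OFFERMAY03'}
```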
Step 4: Analyze Results
What is the key learning?
(Chart: Test Group response rate 15% (3,000 responses) vs. Control Group response rate 10% (2,000 responses))
What is a response?
A response can be ...
Phoning a 1-800 number
Providing information (e.g. survey answers)
Entering a contest
Purchasing a product (our example)
Signing up for a service
Step 4: Analyze Results
Evaluate success using a number of factors:
How did the program perform relative to
objectives?
Did the promotion come in on budget?
Metrics used to analyze performance:
Response Rates Analysis (RR%)
Cost per Response (CPR)
Return on Investment (ROI)
LTV
Response Rate Analysis
Step 1
Calculate response rate
for Test Group
Step 2
Calculate response rate for
control group
Step 3
Calculate incremental lift
between test and control
groups
Response Rate Analysis
First, calculate the response rate for the Test group
Department Store Example
Direct mail offer: get a free piece of costume jewelry if you buy a suit by May 30
20,000 mailed, 3,000 responded
Test RR% = (Responder Quantity / Test Quantity) × 100 = (3,000 / 20,000) × 100 = 15%
Response Rate Analysis
Then calculate the response rate for the Control group
Department Store Example
20,000 in the Control Group do not receive the direct mail offer
Still, 2,000 people respond to the print advertising and buy a suit by May 30
Control RR% = (Responder Quantity / Control Quantity) × 100
Response Rate Analysis
Third, calculate the % Lift between groups
% Lift = ((Test RR% – Control RR%) / Control RR%) × 100
Evaluation
The higher the lift, the better
Positive % Lift = Test performed better than Control
Negative % Lift = Control performed better than Test
Based on the Department Store example, what is the incremental lift percentage?
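A minimal sketch that works the department store numbers through the three steps above (test response rate, control response rate, % lift); the function names are illustrative.

```python
def response_rate(responders, quantity):
    """RR% = (responder quantity / promoted quantity) x 100"""
    return responders / quantity * 100

def pct_lift(test_rr, control_rr):
    """% Lift = ((Test RR% - Control RR%) / Control RR%) x 100"""
    return (test_rr - control_rr) / control_rr * 100

test_rr    = response_rate(3_000, 20_000)   # 15.0
control_rr = response_rate(2_000, 20_000)   # 10.0
print(pct_lift(test_rr, control_rr))        # 50.0 -> a 50% incremental lift
```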
Cost per Response Analysis
Campaign Costs / Budget Include:
Planning & Campaign Development
 Agency Costs (e.g. Fees, Creative Development)
 List Development (e.g. data work)
Campaign Execution
 Printing, Laser/Lettershop, Postage
Response Costs
 The marketing cost associated with response to a
database marketing campaign
 BRC postage, data entry, offer fulfillment, call
centre
Cost per Response
Cost per response = Total cost of program ÷ # of responses
Department Store Example
Total program costs = $210,000 (includes campaign development, execution, response costs)
Cost/response = $210,000 / 3,000 = $70
Evaluation: the lower the cost, the better
Return on Investment (ROI) Analysis
ROI = what you earn on a campaign relative to what you spent on the campaign
Evaluation: the higher, the better
Objective: to determine if you made money from your database marketing investment
Return on Investment Analysis
ROI = ((Revenue – Program Costs) / Program Costs) × 100
Department Store Example
Total program costs = $210,000
3,000 responses to the program
Sales revenue = $450/suit × 3,000 suits
What is the ROI?
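A short sketch combining the cost-per-response and ROI formulas with the department store figures stated above (revenue of $450 per suit on 3,000 responses); the function names are illustrative.

```python
def cost_per_response(total_cost, responses):
    """Cost per response = total program cost / number of responses"""
    return total_cost / responses

def roi_pct(revenue, program_costs):
    """ROI% = ((Revenue - Program Costs) / Program Costs) x 100"""
    return (revenue - program_costs) / program_costs * 100

program_costs = 210_000
responses     = 3_000
revenue       = 450 * responses                       # $450/suit x 3,000 suits

print(cost_per_response(program_costs, responses))    # 70.0
print(roi_pct(revenue, program_costs))                # ~542.9
```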
Lifetime Value
Next step:
Determine the promotion’s effect on lifetime value
Increased lifetime value, rather than immediate short-term payout, should be the real goal of database marketing
Test the effectiveness of alternative ways of increasing LTV
Testing an Idea: Four Steps
1. Plan Test
Define objectives
Set up test and control groups
2. Execute Test
3. Track Results
4. Analyze Results
Response rate
ROI
Cost per response
LTV
Metrics Example: CIBC Direct Mail Creative Execution Test
Example: CIBC Creative Test
• 3 different direct mail pieces created for the launch of the CIBC Aventura Gold Visa card
• Packages all the same except the outer envelope:
» Cell A: High-end envelope & CIBC logo
» Cell B: High-end envelope & Aventura logo
» Cell C: High-end envelope & CIBC logo & Aventura logo
Example: CIBC Creative Test
Calculate the % lift, cost per response and ROI
for each cell
Which envelope creative would you roll out to
the entire database of customers?
Example: CIBC Creative Test

                           Cell A      Cell B      Cell C
Cell Quantities – Mail     50,000      50,000      50,000
Cell Quantities – Control  10,000      10,000      10,000
Responders – Mail           5,000       3,500       7,500
Responders – Control          500         500         500
Program Costs            $100,000    $100,000    $100,000
Net Profit               $500,000    $300,000    $500,000
Response Rate – Mail
Response Rate – Control
Lift
Cost/Response
ROI
Example: CIBC Creative Test

                           Cell A      Cell B      Cell C
Cell Quantities – Mail     50,000      50,000      50,000
Cell Quantities – Control  10,000      10,000      10,000
Responders – Mail           5,000       3,500       7,500
Responders – Control          500         500         500
Program Costs            $100,000    $100,000    $100,000
Response Rate – Mail          10%          7%         15%
Response Rate – Control        5%          5%          5%
Lift                         100%         40%        200%
Cost/Response              $20.00      $28.57      $13.33
Example: CIBC Creative Test
ROI% = ((Revenue – Program Costs) / Program Costs) × 100

                           Cell A      Cell B      Cell C
% Lift vs. Control           100%         40%        200%
Net Profit (Revenue)     $500,000    $300,000    $500,000
Program Costs            $100,000    $100,000    $100,000
ROI
Example: CIBC Creative Test
ROI = ((Revenue – Program Costs) / Program Costs) × 100

                           Cell A      Cell B      Cell C
% Lift vs. Control           100%         40%        200%
Net Profit (Revenue)     $500,000    $300,000    $500,000
Program Costs            $100,000    $100,000    $100,000
ROI                          400%        200%        400%
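A sketch that recomputes the metrics in the tables above from the raw cell figures; the dictionary layout is illustrative, and net profit is used as the revenue figure in the ROI formula, as on these slides.

```python
# Cell figures as shown in the CIBC test tables above
cells = {
    "A": {"mailed": 50_000, "ctrl_qty": 10_000, "mail_resp": 5_000, "ctrl_resp": 500,
          "costs": 100_000, "net_profit": 500_000},
    "B": {"mailed": 50_000, "ctrl_qty": 10_000, "mail_resp": 3_500, "ctrl_resp": 500,
          "costs": 100_000, "net_profit": 300_000},
    "C": {"mailed": 50_000, "ctrl_qty": 10_000, "mail_resp": 7_500, "ctrl_resp": 500,
          "costs": 100_000, "net_profit": 500_000},
}

for name, c in cells.items():
    mail_rr = c["mail_resp"] / c["mailed"] * 100
    ctrl_rr = c["ctrl_resp"] / c["ctrl_qty"] * 100
    lift    = (mail_rr - ctrl_rr) / ctrl_rr * 100
    cpr     = c["costs"] / c["mail_resp"]
    roi     = (c["net_profit"] - c["costs"]) / c["costs"] * 100   # net profit as revenue
    print(f"Cell {name}: RR {mail_rr:.0f}% vs {ctrl_rr:.0f}%, lift {lift:.0f}%, "
          f"CPR ${cpr:.2f}, ROI {roi:.0f}%")

# Cell A: RR 10% vs 5%, lift 100%, CPR $20.00, ROI 400%
# Cell B: RR 7% vs 5%, lift 40%, CPR $28.57, ROI 200%
# Cell C: RR 15% vs 5%, lift 200%, CPR $13.33, ROI 400%
```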
Example: CIBC Creative Test
Based on the results, which envelope creative would you roll out to all customers?
» Cell A: High-end envelope & CIBC logo
» Cell B: High-end envelope & Aventura logo
» Cell C: High-end envelope & CIBC logo & Aventura logo
In-class Exercise (Worth 10%) - Part 1
Read “Luring ‘em back to school,” Strategy Magazine, November 2003
Write 2 measurable, time-bound objectives for the integrated marketing programs executed by CMC.
What was the CMC strategy?
What direct marketing tactics were used?
How would you measure campaign success?
In-class Exercise - Part II: Luring ‘em back to school
Complete the following table comparing the differences between the direct mail and e-mail catalogue mailings. Which program appears to be more successful? Why?

                                          DM          EM
catalogues sent                       60,000      78,000
response rate                           2.5%        1.5%
total responses
mktg cost/catalogue                    $3.00       $0.25
total catalogue mktg costs
other mktg costs (agency fees etc.) $300,000    $300,000
total mktg costs
avg profit/course                       $700        $700
total profit
ROI
cost/response
Statistical Significance
Statistical certainty is impossible
We normally talk of a level of confidence in statistical predictions
In DM this is often 95% (19 out of 20 times) or 90% (18 out of 20 times) confidence that results will be repeated within an acceptable margin of error
The confidence level set normally depends on the financial risk
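The slide does not prescribe a particular test; one common way to check whether a test-versus-control difference clears a chosen confidence level is a two-proportion z-test, sketched below with an illustrative function name and the department store figures.

```python
import math

def lift_is_significant(test_resp, test_n, ctrl_resp, ctrl_n, z_crit=1.96):
    """Two-proportion z-test on response rates.
    z_crit = 1.96 corresponds to ~95% confidence, 1.645 to ~90%."""
    p_test, p_ctrl = test_resp / test_n, ctrl_resp / ctrl_n
    pooled = (test_resp + ctrl_resp) / (test_n + ctrl_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / test_n + 1 / ctrl_n))
    z = (p_test - p_ctrl) / se
    return z, abs(z) > z_crit

# Department store example: 15% test response vs. 10% control response
z, significant = lift_is_significant(3_000, 20_000, 2_000, 20_000)
print(round(z, 1), significant)   # ~15.1 True -> well beyond the 95% level
```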
Where Are the Other 95%? The Direct Marketer’s Non-respondents
Research evidence suggests that it is all due to poor timing!
Not ready or unable to transact because of:
lack of funds
not knowing how the product or service will perform
domestic upheaval (e.g. moving house)
Is this the reason why repeat mailings and follow-ups are often successful?
Also, is this the reason behind the possible discrepancy between test results and roll-out?
Selecting Response Channels
How do you want them to respond?
The 3 main channels are:
Mail
Phone
Internet
Additional channels include:
Mobile devices
Response Channel Specific Metrics
Direct Mail
Response rate versus no-mail group
Creative tests - different letter versions
Offer tests - different offer types
Response mechanisms (call/in-person)
Telemarketing
Response rate versus no-call group
Percentage right party connect
Wrap code analysis
Cross-sells and up-sells
Creative testing - scripts
Internet
Response rate versus no-contact group
View rate
Abandon rate
Accept rate
Click-through rate
Re-visit rate
Creative tests - different content pages
Push versus pull tactics
Channel Combinations
Response rate versus single channel
Measurement
It’s not enough to count responses.
Response does not indicate the level of customer commitment.
Measuring response doesn’t tell us WHY consumers behave the way they do.
Response builds only limited knowledge of customer behaviour.
Beyond Response
What kind of people are responding?
What other market segments are there?
What offers trigger different groups to
respond?
How many ways can we present a message?
Where are the overlaps in media used?
What messages are appropriate for various
media?
Performance Measurement
Historical data can be useful in
evaluating the performance of similar
marketing campaigns.
Performance Measurement
MEASURE: OPERATIONALIZATION
Response rate: Percentage of prospects contacted who replied
Number of inquiries: Number of fulfillments
Number of qualified leads: Number of leads who expressed interest that were converted into sales or opportunities
New customers acquired: Number of purchasers who had not purchased before
Customer lifetime value: Net present value of a customer over a specified period of time
Customer acquisition cost: Total marketing costs divided by number of new customers
TEST
TEST
TEST
Testing Variables
1. Products/Services
2. Media (e.g. lists, print, Internet)
3. The Offer
4. Formats/Layouts
5. Timing Schedules
Common Experimental Designs
Split-run experiment
Compare responses of campaign A to campaign B using the same list (split in two)
Before-and-after experiment
Compare the outcomes of campaign A recipients to a control group that did not receive it.
Good Pre-test Design
A good experiment will measure the effect of ONE variable on another (response rate).
Compare, on a limited audience:
1. (Offer A) vs. (Offer B) vs. (Offer C)
2. (Creative A) vs. (Creative B)
3. (Segment A) vs. (Segment B)
Bad Pre-test Design
Marketer attempts to:
alter more than 1 variable per test cell in the same experiment
compare results in one medium to another
test different response channels
split the list into test cells that are too small (n < 30 responses)
Next Week: Test Structure (Worth 25%)
Class Test: July 18th, 2006
2 hours
Final Exam
Responsible for everything covered in class, including handouts
Covers material from Week 1 to Week 10
Structure
Multiple choice
Short answer
Metrics problem
Case study