Recently, DSA was asked to come up with 10 great database marketing ideas. We couldn’t think of ten great ideas, database related or otherwise, but it was easy to think of 40 or so things that database marketers shouldn’t do. Here then is the first installment of our Top 40 list of things not to do. (Not in order of importance.)
1) Never enhance your entire database just to profile it.
If you want a customer profile, a small random sample will do. However, if, after building a model based on a sample of responders and non-responders, you discover that one or more overlay variables improve the model, then enhance the entire file with just those variables. This is a very cost-effective approach because you only purchase data that has proven to be significant.
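The sample-first approach above can be sketched in a few lines. This is a minimal illustration, assuming a simple list of customer records with a hypothetical `responded` flag; the field names and sample size are illustrative assumptions, not a prescription.

```python
import random

random.seed(42)

# Hypothetical customer file; "responded" is an illustrative field name.
customers = [{"id": i, "responded": random.random() < 0.05}
             for i in range(100_000)]

# Profile from a small random sample instead of enhancing the whole file.
sample = random.sample(customers, 1_000)
response_rate = sum(c["responded"] for c in sample) / len(sample)

# Only after an overlay variable proves significant on this sample
# would you pay to enhance the full file with that variable.
```

The point of the sketch: all profiling work happens on the 1,000-record sample; the 100,000-record file is touched only once a variable has earned its keep.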
2) Never use a segmentation model without testing segments expected to perform below average.
Predictive models don't last forever, so they need to be tested. If you promote only the top segments and response falls below expectations, you won't know whether the model stopped working, the result was a seasonal aberration, or a scoring error was made, unless you have also promoted segments that were not expected to do well.
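One common way to follow this advice is to mail the top model segments at full volume while keeping a small random test cell from every lower segment. The sketch below assumes a scored file with hypothetical deciles numbered 1 (best) through 10 (worst); the cell size of 500 is an illustrative assumption.

```python
import random

random.seed(0)

# Hypothetical scored file: each customer carries a model decile, 1 = best.
scored = [{"id": i, "decile": random.randint(1, 10)} for i in range(50_000)]

# Promote the top three deciles at full volume...
mailing = [c for c in scored if c["decile"] <= 3]

# ...but also mail a small random test cell from every remaining decile,
# so a drop in response can be diagnosed (model drift vs. seasonality).
for d in range(4, 11):
    cell = [c for c in scored if c["decile"] == d]
    mailing += random.sample(cell, min(500, len(cell)))

deciles_mailed = {c["decile"] for c in mailing}
```

Because every decile appears in the mailing, actual-versus-expected response can be read across the whole model, not just its top.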
3) Never commission a customer segmentation study or a predictive model without an implementation plan in mind.
Not every segmentation scheme or model can be easily implemented, and gearing up for implementation can take longer than the model building process itself. Therefore, for models to be useful (and cost effective), their implementation needs to be planned well in advance.
4) Never assume that multi-product promotions will work better than single product promotions even if each product is targeted against the same promotion group.
No one is sure why this is true, but more often than not it has proven to be: maybe it has to do with offering too many choices, maybe it's something else. Be careful when considering this apparently rational marketing strategy.
5) Never decide on a database platform without testing it at roll-out volumes.
The proof is in the pudding. It's next to impossible for a marketing person, or even a data processing person, to evaluate the claims of competing vendors. A live test of your data against your requirements is the only satisfactory answer.
6) Never believe a modeling result that doesn’t conform with your experience or intuition.
Models quantify expectations. If a result seems wrong, it's probably a data processing error, not a revelation.
7) Never assume a testable contact strategy doesn’t need to be tested.
A new contact strategy is just like a new creative: it may seem obviously superior to the current control, but you won't know for sure until you've tested it.
8) Never assume the data you are working with is correct.
Even though our quantitative analysis tools have improved enormously, the data processing systems that deliver data can still turn out misleading or misinterpreted data. Always assume the worst and trust your instincts.
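In practice, "assume the worst" usually means running basic sanity checks on every file before analyzing it. Here is a minimal sketch of that habit; the function name, key field, and the checks themselves are illustrative assumptions, not a standard.

```python
def sanity_check(records, key="customer_id"):
    """Return a list of problems found in a delivered file (illustrative)."""
    issues = []
    keys = [r.get(key) for r in records]
    if any(k is None for k in keys):
        issues.append("missing keys")
    if len(set(keys)) != len(keys):
        issues.append("duplicate keys")
    if not records:
        issues.append("empty file")
    return issues

# A tiny delivered file with a duplicated key slips through silently
# unless you look for it.
rows = [{"customer_id": 1}, {"customer_id": 2}, {"customer_id": 2}]
problems = sanity_check(rows)
```

A check like this takes minutes to write and can catch the kind of misleading data the warning above describes before it reaches a model or a mailing.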
9) Never try to compare open-ended responses to an RFP.
Vendors all have their own ways of providing estimates, and it's next to impossible to be sure that any two responses are comparable. The solution: lay out a set of very specific activities (file sizes, number and types of updates, number of mailings, etc.) and ask each vendor to tell you how much it will cost (in total dollars, not rates) to execute the plan. And never sign off on a major application development project whose costs will be estimated after the start of the project. Guess what's likely to happen.
10) Never in the presence of real data processing professionals say anything technical like: relational database, SQL, schema, normalized, etc.
You will sound silly.
Check back for the next 10 items on our list of the Top 40 things not to do.