A few weeks back one of RKG's sharp analysts provided insight into the vexing problems associated with thin data. Given typical conversion rates in retail PPC, it becomes very difficult to separate signal from noise when the number of click-throughs is relatively small. This has important consequences for paid search management and for how we think about and react to data.

The most important metric to measure and predict in paid search is the value per click associated with a given ad. Value could be leads, sales dollars, margin dollars, or some other metric that corresponds even more closely to "goodness" for the advertiser. Tying the cost per click paid to this value per click for each ad maximizes value within an advertiser's ROI constraints.

However, at the ad level, we often don't have enough data to separate the signal from the noise. Analyzing statistical noise is a waste of time; reacting to statistical noise can be a fatal error. The solution is to add data until the signal is reasonably clear.

Please forgive a photography metaphor. If thin data is analogous to insufficient light, there are essentially two ways to generate a clear picture:
- by widening the aperture; or
- by lengthening the time of exposure (the third option of a flash would be akin to adding artificial data -- not a good idea).
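To make the thin-data problem concrete, here is a minimal sketch of why small click counts are so noisy. It assumes a simple binomial model of conversions (an illustration, not RKG's actual methodology) and uses a Wilson score interval to show how wide the plausible range around an observed conversion rate is at different click volumes. Both "widening the aperture" (pooling related ads) and "lengthening the exposure" (a longer time window) work the same way here: they raise the click count, which narrows the interval.

```python
import math

def wilson_interval(conversions, clicks, z=1.96):
    """95% Wilson score confidence interval for a conversion rate.

    Illustrative only: treats each click as an independent binomial
    trial, which ignores real-world effects like seasonality.
    """
    if clicks == 0:
        return (0.0, 1.0)  # no data: anything is possible
    p = conversions / clicks
    denom = 1 + z**2 / clicks
    center = (p + z**2 / (2 * clicks)) / denom
    half = (z / denom) * math.sqrt(
        p * (1 - p) / clicks + z**2 / (4 * clicks**2)
    )
    return (max(0.0, center - half), min(1.0, center + half))

# The same 2% observed conversion rate at increasing click volumes:
for clicks in (50, 500, 5000):
    conversions = round(clicks * 0.02)
    lo, hi = wilson_interval(conversions, clicks)
    print(f"{clicks:5d} clicks: observed {conversions / clicks:.1%}, "
          f"95% CI [{lo:.1%}, {hi:.1%}]")
```

At 50 clicks, the true rate behind an observed 2% could plausibly be anywhere from well under 1% to over 10%; at 5,000 clicks the interval tightens enough to act on. That is the gap between analyzing noise and analyzing signal.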