
Holiday 2006 PPC Clicks: High Traffic, High Conversion, High Profitability

An earlier post described how our clients fared in sales-per-click (SPC) through Holiday 2006. SPC is a measure of site conversion. (Indeed, SPC is just the conversion rate multiplied by average order value.) While SPC provides some insight into the quality of traffic coming to client sites from the engines, it says nothing about the cost or profitability of that traffic. High-converting clicks are wonderful, but they can be wildly unprofitable if you overpay for them. Having asked "how was the conversion quality of click traffic during the holiday?", we next ask "how profitable was the click traffic during the holiday?"

One handy measure of profitability is the advertising-to-sales ratio, aka the "A/S" ratio.

A digression on the A/S ratio: the fundamental law of direct marketing states that when a direct marketer can generate revenue without spending too much on advertising, she'll be profitable. How much she can spend on advertising is a function of her margin, her other variable selling costs (credit card discount, warehouse, call center, etc.), her overhead (buildings, staff), and her earnings target. An old small-catalog rule of thumb is that ad expense plus cost of goods must total below 80% of sales -- that leaves 10% for variable costs, 5% to cover overhead, and 5% toward EBIT. Marketers seeking richer earnings need that sum further below 80%; marketers seeking growth may go above 80%. For a gifts cataloger with 50% COGS, that means ad expense, all in, must come in at 30% of sales or less. End digression.

Now, some of our clients instruct us to run their campaigns to a specific A/S target. Others choose to steer their campaigns by advertising-to-margin ratio ("A/M"), by return-on-ad-spend ratio ("ROAS"), by cost-per-order ("CPO"), or by some other favored cost-vs.-benefit metric.
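The rule-of-thumb arithmetic in the digression above can be sketched in a few lines. This is a hypothetical illustration, not our tooling; the function name and default percentages are assumptions taken from the 10% / 5% / 5% split described above:

```python
def max_ad_to_sales(cogs_pct, variable_pct=0.10, overhead_pct=0.05, ebit_pct=0.05):
    """Maximum allowable A/S ratio under the old catalog rule of thumb:
    ad expense plus COGS must stay below sales minus variable costs,
    overhead, and the EBIT target (80% of sales with the defaults)."""
    ceiling = 1.0 - (variable_pct + overhead_pct + ebit_pct)  # 0.80 by default
    return ceiling - cogs_pct

# A gifts cataloger at 50% COGS: ad spend, all in, must be 30% of sales or less.
print(round(max_ad_to_sales(0.50), 2))  # 0.3
```

Dial the EBIT target up or down and the ad-spend ceiling moves dollar for dollar, which is exactly why margin structure drives the A/S target a client chooses.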
And some of our clients instruct us to buy search on a cost-per-impression basis (CPM), seeking to drive brand awareness and traffic to their retail stores cost-effectively.

While our various clients collectively steer by a gaggle of different metrics -- indeed, many clients employ multiple metrics within different portions of their campaigns to reflect different objectives or margin structures -- at the end of the day, most of these metrics boil down to wanting to buy as many clicks as possible which convert well. So, while not every client steers by A/S, aggregate A/S is a reasonable measure of this "buy media smartly" concept.

For the following graph, we computed an aggregate A/S by dividing total ad spend by total resulting sales, by day. Note that unlike the previous graph, this effectively weights larger advertisers more heavily than smaller advertisers. To protect our clients' privacy, we then indexed this measure by dividing each day's ratio by the October 1, 2006 value. This indexed aggregate A/S is graphed below by the pink line.

You see this line basically hovers near 1.0. This means that our clients, in aggregate, pretty much maintained the same advertising efficiency across the holiday period. This isn't surprising, as our bid management technology uses sophisticated statistics and modeling to achieve this very goal. (Digression: we're currently collaborating with statisticians at UVA, Rochester, and Santa Clara on pay-per-click optimization models -- more on that work after the academic papers are published.) Had this pink line climbed sharply, that would indicate rising A/S -- that is, decreasing efficiency -- which would show our algorithms weren't working. Happily, that didn't happen. Our algorithms are grounded in statistical theory, and they work.

The blue line on the graph is total daily sales, again indexed by dividing each value by the October 1st value.
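The indexing just described is simple to sketch. The numbers below are entirely made up for illustration (our actual client data stays private, which is the whole point of indexing):

```python
# Hypothetical daily totals aggregated across all clients; the first
# entry stands in for the October 1, 2006 baseline day.
daily = [
    {"spend": 1000.0, "sales": 5000.0},   # Oct 1 baseline
    {"spend": 1500.0, "sales": 7600.0},   # a mid-fall day
    {"spend": 4000.0, "sales": 20400.0},  # a big December day
]

# Aggregate A/S per day: total ad spend divided by total resulting sales.
as_ratio = [d["spend"] / d["sales"] for d in daily]

# Index each day's value against the baseline day, masking absolute dollars.
indexed_as = [r / as_ratio[0] for r in as_ratio]
indexed_sales = [d["sales"] / daily[0]["sales"] for d in daily]

print([round(x, 3) for x in indexed_as])     # pink line: hovers near 1.0
print([round(x, 2) for x in indexed_sales])  # blue line: sales climb ~4x
```

Because both series are divided by their own October 1 value, each line shows only relative movement -- which is also why the vertical distance between the two lines carries no meaning.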
Note how the blue line soars in December, indicating daily sales up to four times higher than those on October 1, all the while maintaining the efficiency target. (Because we indexed both A/S and sales, the relative position of these lines to one another doesn't reflect anything.)

Enough words, already. Finally, the graph.

[Graph: RKG sales vs. efficiency results, Holiday 2006]

What's the take-away message from all this? At constant efficiency, our client base in aggregate enjoyed strong sales increases during the holiday. This demonstrates not only that click conversion increased, but that click quality increased far faster than click cost, leading to higher click profitability.

While we're proud of our strong bid management technology, we're not taking credit for the holiday surge. The surge reflects Christmas, Chanukah, New Years, etc. (Of course, a less sophisticated bidding approach might have overspent or underspent during this critical fast-paced season...) We offer this graph to suggest that, based on our experience across our client base, Holiday 2006 on average provided high-volume clicks, high-converting clicks, and high-profitability clicks.