Thursday, September 28, 2006

An Asterisk that Cost 2 Million Dollars

This just in: a colleague of mine pointed out an analysis showing a sudden spike in the number of new trial subscriber signups for one of our clients. They had introduced a new product version in early 2005, and through mid-2006 they were averaging around 3,000 new trial subscribers each week -- less than half of what they were getting with the previous product version.

What had happened?

Turns out this client listed several product feature descriptions during the trial sign-up process. One of these descriptions happened to have an asterisk next to it, explained in a note at the bottom of the page: that feature required a credit card number upfront. To a casual observer, it wasn't clear which product feature the asterisk actually belonged to. During a content review session, someone caught this and said -- wow, people are looking at our new product, they think they need a credit card to subscribe (which they really don't), and they get very hesitant.

So, in July, a quick content change was made. The asterisk was removed.

Since we track all the transactions, within 7 days of this change we started seeing a sudden spike in new trial subscriptions, which has leveled off above 7,000 new trial signups a week (for now) -- more than double what was happening prior to the removal of the infamous asterisk.

My colleague did a quick calculation of the revenue impact, and it translated to roughly $2,000,000 in additional revenue over the coming 12 months because of the asterisk removal.
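As a back-of-envelope sanity check on that figure, here's the arithmetic using only the numbers above. The per-signup revenue is inferred from the $2M estimate, not a number the client shared:

```python
# Back-of-envelope check on the revenue impact of removing the asterisk.
weekly_before = 3_000   # avg new trial signups/week before the change
weekly_after = 7_000    # avg new trial signups/week after the change

# Incremental trial signups over the next 12 months (52 weeks)
extra_per_year = (weekly_after - weekly_before) * 52

revenue_impact = 2_000_000  # colleague's 12-month revenue estimate
implied_revenue_per_signup = revenue_impact / extra_per_year

print(extra_per_year)                          # 208000 incremental signups
print(round(implied_revenue_per_signup, 2))    # ~9.62 dollars per incremental signup
```

In other words, the $2M figure implies each incremental trial signup is worth roughly ten dollars in revenue -- a plausible blended value once trial-to-paid conversion is factored in.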

I hope you don't have a similarly expensive asterisk ANYWHERE on your website.

Friday, September 22, 2006

Can Analytics Influence Direct Marketing Creative Process?

The Direct Marketing Association's (DMA) Northern California chapter hosted its first independent meeting yesterday (Thursday, Sep 21st) at Intuit's campus in Sunnyvale. This summer, the national DMA "abandoned" formal support for all its local chapters, asking them to go it on their own. It's good to see that the Northern California DMA has managed to make this independent start; hopefully it will get adequate local support to thrive.

The main speaker was Bill Mirbach, VP of Direct Marketing and Direct Sales at Intuit, presenting a talk titled "Owner's Manual for the Creative Process". Now, I'll be the first to admit that professionally, direct marketing creative is the last thing we deal with -- at work, we leverage direct marketing data for predictive analytics, so while we can measure and predict whether creative version A will perform better than version B for a target audience, that is very different from the "creative process" itself. So, I was intrigued by the topic.

Bill has been around the Silicon Valley high-tech industry for a while. The highlight of the talk for me was a story he shared going back to 1984, when he helped the founder of a fledgling software company called Intuit with their direct ads. The talk was mostly about how companies should choose vendors (and vice versa) for the creative process, not the artistic aspect itself -- so, during Q&A, I asked Bill: how does knowledge of your audience impact the direct marketing creative process? Can one leverage direct marketing data, knowledge of customer segments, etc. to make the creative process more effective? What has his experience been?

Here's what I heard:

Yes, a better knowledge of one's audience and their likes and dislikes about an organization's products and services certainly helps the creative person craft their message more effectively. However, while this sounds logical, it is not what normally happens. The creative process is more the product of the discipline and idiosyncrasies of the "creator" than it is driven by data-derived intelligence.

Bill mentioned a particular test where they wanted to measure the performance of a scare-tactic message ("if you don't use our product, you'll be sorry") versus a benefits-focused message ("and you can get X, Y, and Z at the click of a button"). The creative person just didn't believe in scare tactics and came up with a very tepid "scary" message, which obviously didn't perform well. Bill then used a different creative person who specialized in scare-tactic messages (scary thought, pardon the pun) -- and that version turned out to be very effective.

What does this tell us?

I don't know whether Bill's story is the norm or the exception, but he has certainly been around direct marketing creative people a lot more than I have -- so I must respect his POV. Still, it seems that rather than asking creative people to use marketing-data-driven intelligence to fine-tune their message, it's probably better the other way around -- i.e., leverage the analytics to find out what type of messages you want to send out to different segments, THEN find the right creative team to craft those messages.

What do you think?

Tuesday, September 19, 2006

Passion for Data Visualization

I found this on Christopher Ahlberg's blog, and I completely agree that this is a true display of "passion for information visualization".

The software used for the presentation is also available as a Google tool. It appears to be "bundled" with the global economic data; I'm not sure if there is an open, decoupled version that one can point at their own data and play around with. It seems Flash-based and pulls from static data sources (it didn't seem like an RDBMS, but I could be wrong) -- but this would be a great way to visualize OLAP data.

What's interesting is the concept of a "play" button for the time dimension, which makes great use of animation to show how different quantities (measures) change over time. It also manages screen real estate well, letting you put different dimensions on the X- or Y-axis. But most of all, this truly exemplifies what data visualization is all about -- it goes beyond the realm of charts and graphs that take a while to decipher, and instead tells a very clear, compelling, visual story. Very impressive!
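The "play button" idea can be sketched in plain code: treat time as just another dimension, and have the player step through it in order, emitting one frame of points per time slice for the UI to redraw. This is a minimal illustration of the concept only -- the records, entities, and field names below are made up, not taken from the actual tool:

```python
# Minimal sketch of a "play" button over a time dimension:
# each tick yields one frame of (entity, x, y, size) points to redraw.

# Toy OLAP-style records: (year, entity, x_measure, y_measure, size_measure)
records = [
    (2000, "A", 10, 50, 5), (2000, "B", 20, 40, 8),
    (2001, "A", 12, 55, 6), (2001, "B", 22, 42, 9),
    (2002, "A", 15, 60, 7), (2002, "B", 25, 45, 10),
]

def play(records):
    """Yield (year, frame) pairs in time order; a UI would animate these."""
    frames = {}
    for year, entity, x, y, size in records:
        frames.setdefault(year, []).append((entity, x, y, size))
    for year in sorted(frames):
        yield year, frames[year]

for year, frame in play(records):
    print(year, frame)
```

Swapping which measures map to X, Y, and bubble size is then just a matter of reordering the tuple fields -- which is essentially what the tool's dimension pickers do.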

Sunday, September 10, 2006

We've got charts and graphs to back us up, so f@#$ off!

Recently I had a potential partnership discussion to evaluate whether our predictive analytics technology could provide key insights from a potential partner's marketing database. Here's how my conversation went with Mr. X, an exec at the partner (a marketing technology company I'll call company XYZ):

Me: "So, what are the key business pain points for your clients that we can analyze?"
Mr. X: "Well, you know, the usual stuff -- marketing ROI, cut costs, increase sales, etc."
Me: "Ah.. yes, but can we delve a bit deeper? Where exactly do your clients' marketing programs need help?"
Mr. X: "what do you mean?"
Me: "Well, are they more worried about increasing acquisition volume, or about predicting high-LTV customers, or about retention? I'm trying to get a sense of their #1 issue."
Mr. X: "they don't know.. it's probably all of that stuff"
Me: "It's important that we get a sense of priority, because otherwise we are talking about applying analytics without really knowing what we are trying to optimize"
Mr. X: "well, you are the analytics expert -- you need to tell them what to analyze. They don't think like you, worrying about success metrics, etc. They ask us to run marketing programs, and now we'd like to sell them some analytics. I can tell you what data we have on their marketing programs, now you tell me what kind of analytics you can provide me that I can sell."
Me: "But we do need to understand their business objectives before determining what analytics is relevant enough that they'll pay for it"
Mr. X: "What I need from you is some screenshots --- some charts and graphs that show what kind of analytics you can do -- I'll be more than happy to review that and tell you if we can work together"

Needless to say, I didn't sense a true spirit of partnership here, but I did sense an attitude I encounter more often than I'd like: that analytics is all about producing charts and graphs that the user will somehow find useful.

Which, IMHO, is total BS!

But I can't blame Mr. X too much, because this is a pretty common perception of analytics in the marketplace. Recently I talked to a marketing exec who said -- "every time I meet with the analytics guys from our agency, they basically have this big ream of a PowerPoint deck filled with one chart after another -- and I don't want to see all that stuff. All I want them to do is tell me what relevant insight(s) they found and what course of marketing action they would recommend, and it's like pulling teeth to get them to move beyond charts and graphs and talk about action."

Look at the website of any business intelligence or analytical software provider, and I guarantee you will see a bunch of fancy charts and graphs, and dashboards with enough dials and speedometers to make you dizzy. Somewhere along the line, maybe we have forgotten that the purpose of analytics is to equip us with insights that enable better decision making.

So, first off, the type of analysis being done has to be informed by exactly what type of decisions we expect to improve; and second, the result of the analysis needs to be presented in a fashion that is "integrated" into the decision making. Maybe you list your recommended actions next to your charts and graphs, or maybe you highlight the figures and trends that demand attention. My point: don't leave it up to your user to figure out the action from fancy charts and graphs. Find out what decisions users are trying to make, and provide information that fills the gap between analytics and actionable insight.