
Monday, November 24, 2008

BI 2.0, Next Generation BI, and Everything New and Improved

My fellow blogger Bhupendra Khanal has an interesting post that mentions the challenges associated with BI 2.0/Information 2.0 (he also plugs OpenI, my open source project, which is much appreciated -- Bhupendra, may the OpenI karma come back to you a thousand-fold :-)

The software industry, not unlike any other, contains a lot of hype -- probably even more so with all this 2.0 buzz, which may seem cool to industry insiders, but is definitely confusing to the market.

Take BI 2.0 (or Information 2.0) for example -- what in the world does it mean? Well, it turns out that, at the end of the day, to most BI vendors it means more fancy charts and graphs and dashboards, except this time they'll have rounded corners, larger fonts with brighter colors, and maybe a bit of Flash and/or Ajax thrown in for good measure to demonstrate live interactivity.

All this is well and good, but it is also pure BS if you are not helping your users make better decisions, or informing them of something new.

If BI 2.0 or Information 2.0 is to be seen as the "next generation" (it seems you can't escape these cliches), then it needs to go beyond the charts/graphs/dashboards paradigm. BI applications and tools need to be rooted in the knowledge worker's workflow -- and should be cognizant of the types of decisions that need supporting. BI needs to be aware of the domain context -- i.e. which industry are you supporting? Which area -- marketing, finance, operations, research? Because without this, the best BI can do is provide nice visuals and hope and pray that the user knows how to translate them into intelligence and action.

But software can be better than that if it stops being lazy. And that's my hope with our work in OpenI. We certainly started in the charts/graphs/dashboards paradigm, so we are as guilty as anybody. But as they say in any 12-step plan, "acceptance" is the first step -- and now, we are moving towards a future of BI software that caters to the root need for intelligence -- i.e. not only do you see your data clearly, you also see it in your specific business context, and get an immediate option to act upon it.

For example, a marketing analytics BI application -- once it incorporates the data about customers, marketing campaigns, and resulting purchases -- should not wait for a user to define dashboards and reports, but rather already provide a suite of analyses that answer the most typical marketer's questions -- i.e. how effective are my campaigns, who are my best customers/prospects, and what tactics work best for individual customer segments? And don't stop there, btw -- if you have identified some new and interesting customer segments, it should integrate with an online campaign manager tool to immediately launch new campaigns; or publish the list of your most valuable customers to your e-commerce engine or call center platform, which know how to treat them in a special way; etc. etc.

Similar scenarios can be applied to other industries and domains too. We as BI software developers just need to imagine differently. And that's what 2.0, new version, or next generation is all about - next level of imagination.

Wednesday, October 29, 2008

What is the Cost of Different Phases of Outbound Marketing?

A colleague recently sent me an email asking:
I'm trying to find out the cost/spend associated with different phases of outbound marketing campaigns. At a high level, I'm trying to understand the process as,
  • Idea generation / message theme discussions (e.g. what is the campaign all about)
  • Associated content generation (web site promotion, hard print material, email content generation -- in summary, creative + message generation)
  • Outbound execution: actual delivery and publishing of the message
Can you provide guidance as to,

1. If I missed any major step(s)
2. What % of total cost will be allocated to each of the above steps? Here, if you can add the vertical (retail, hi-tech software, hi-tech manufacturing, etc.), it would help me more.
Not that I'm an expert, but my response was as follows -- see if you agree or, better yet, can add your 2 cents:

I think you have identified the key themes. I tend to think about outbound marketing in the following categories:

Target
  • Who will you contact? Who is your audience? What is your access to that market? If you want to go direct (email, direct mail, telemarketing), how are you going to obtain contact information -- homegrown lists, purchased lists?
  • Is there a segmentation strategy applicable? If so, what are your costs/efforts to define/implement it?
Message
  • Once you know who you will reach out to, you need to craft your message. This involves figuring out the creative for each medium (email layout, direct mail layout, video or radio ad, etc.) and producing it
  • Different elements of the message - creative content, offer, promotion, etc.
Execution
  • How will the message get out? What are the different media channels? Are you going to work with an agency that can manage all channels, or do it yourself?
  • How will you co-ordinate the different channels? E.g. if someone who got an email offer ends up calling your telemarketing center, are they all in sync?
  • How well are you able to monitor your campaigns in progress, and how quickly can you respond to feedback?
Optimization

This is more around analytics, but a critical part (of course I'm biased :-) -- which is to look at the operational metrics of all campaigns and optimize for two main things: determining the most profitable/relevant segments, and for each segment, figuring out the optimal contact strategy.

Cost-wise, execution will be the biggest chunk, probably 50-60% of the overall cost, closely followed by "target" (acquisition of contact information or markets). The rest is probably evenly divided.
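If it helps to make that concrete, here's a quick Python sketch of the rough split -- the percentages are my ballpark guesses from above, not hard industry benchmarks:

```python
# Rough budget split for an outbound marketing campaign.
# These shares are ballpark assumptions from the discussion above,
# not industry-standard benchmarks.
ALLOCATION = {
    "execution": 0.55,    # biggest chunk, ~50-60%
    "target": 0.25,       # acquisition of contact information or markets
    "message": 0.10,      # creative + offer production
    "optimization": 0.10, # analytics
}

def split_budget(total):
    """Return the dollar amount allocated to each phase."""
    return {phase: round(total * share, 2) for phase, share in ALLOCATION.items()}

print(split_budget(100_000))
```

Obviously the right shares vary by vertical; the point is just to sanity-check that the allocation sums to the whole budget.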

Wednesday, December 12, 2007

Holy Trinity of Marketing Automation Systems?

An old friend and fellow blogger Michael Fassnacht has an interesting blog post -- he advocates a concept of combining marketing resource management (MRM) systems with marketing analytics, the goal being optimization of marketing resource usage, in a fashion very similar to financial asset optimization.

While on the surface this may seem like a straight-up combination of MRM systems and marketing analytics systems, a hidden yet crucial system to implement this concept is the one that does the marketing program execution. This is the system that uses a different subset of marketing assets for each program and its various campaigns, and hopefully tracks the effectiveness of those assets through the full cycle, i.e. all the way to conversion (or non-conversion). In a typical marketing shop today, these are usually 3 different systems/vendors, with little or no integration between them.

Now, if you have a marketing program execution platform that is (a) in tune with all your assets in all different channels, and (b) tracks asset usage and effectiveness in different programs, and also publishes it to an analytics platform -- you are close to implementing the concept that Michael is advocating.

Is there a platform out there that does all this? Stay tuned :-) And I'll be also curious to hear your feedback on marketing automation platforms out there that come close to implementing this concept.

Tuesday, June 19, 2007

Email Marketing is about Engagement First, Revenue Second

For the last couple of months, I have been trying to get my head around the analytical nuances of email-centric marketing (as a part of my new gig with Responsys). At first glance, there seems to be a plethora of metrics – some around deliverability, then open rates, click rates, conversion rates, etc. Which of these are the true measures of success? To some extent, they all are – but I can’t help but look for a top-down hierarchy which looks at the bottom line first, and then delves into the various supporting factors.

The success of your email (or any other customer contact for that matter) depends on whether the email was successful in influencing a desired action on the recipient’s part -- Did they actually open the email? Did they actually click on any of the external links in the email? Did they actually perform the action that was the goal of the email – like making a purchase or signing up for a program? Measuring each one of these events and understanding how they are influenced by external factors are crucial to optimizing the success of your email campaigns.

A common misperception about emails is that they are more-or-less “free”. Compared to the cost of a direct mail piece or a telemarketing call, the physical cost of sending an email is definitely lower by several orders of magnitude. As a result, what is also common is that organizations get somewhat sloppy about measuring the cost and impact of their email campaigns, and they tend to measure email campaign effectiveness on a more general level.

Particularly important are the hidden costs of email campaigns. Since it is easier and cheaper to send several emails to a recipient, "list fatigue" often sets in much sooner in the email channel than in offline channels. This means people on the email lists opt out more quickly and the response rates decrease more sharply, indicating a more negative experience for the email recipients.

This breakdown in relationship manifests itself in the following email engagement metrics:

  • Decreasing click-through rates and conversion rates
  • Higher opt-out and spam-out rates

However, a common trap most email marketers fall into is to address the above issues by increasing the volume of emails. The thought pattern is something like this:

I am sending out 100,000 emails today, and I only get a 0.1% conversion rate. Since my target is to drive 10,000 new web sales by email this quarter, I must send out 10,000 / 0.1% = 10,000,000 emails this quarter. And because it costs me only a penny per email, I am basically spending $100,000. If the gross margin per sale is higher than $10, then I even have positive ROI.
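For the record, the arithmetic itself checks out -- here's the same calculation as a quick Python sketch (all numbers come from the hypothetical scenario above):

```python
# The volume-driven email ROI calculation from the scenario above --
# "mathematically correct", but it ignores what mass volume does to
# the quality of customer engagement.
target_sales = 10_000
conversion_rate = 0.001       # 0.1%
cost_per_email_cents = 1      # a penny per email

emails_needed = round(target_sales / conversion_rate)
total_cost = emails_needed * cost_per_email_cents / 100  # in dollars
breakeven_margin = total_cost / target_sales             # gross margin per sale to break even

print(emails_needed)      # 10,000,000 emails
print(total_cost)         # $100,000
print(breakeven_margin)   # $10 per sale
```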

While this calculation may be mathematically correct, the major flaw in this approach is that it evaluates email purely on short-term revenue potential, and does not consider the impact of email volume on the overall quality of customer engagement. In a way, it is very similar to the bad image telemarketing has gotten over the years because telemarketers have been entirely focused on making a large volume of calls without caring much about how many people get infuriated in the process. The difference is that in the case of email, people can easily unsubscribe with a click, tag you as a spammer, or simply hit delete any time they see an email from you. They can do a lot more damage to you in a much shorter amount of time.

So, it is essential to think of email as an engagement tool first, and revenue generator second. The primary goal of your email campaign should be engagement, which in turn drives value. It does not work the other way around.

In the coming days, I will post more of my thoughts around measuring engagement and its influencers. I would also like to hear from you – what are your thoughts around measuring engagement driven by email, and/or do you see the problem space differently? What should be the role of analytics in driving the success of email campaigns?

The more I think about this, the more it seems that analyzing the success of email campaigns is not just about looking at one particular success outcome, or even at individual metrics like open rates or click rates -- the task is more about developing some sort of "Email Engagement Index" (EEI -- do we really need another acronym?) that incorporates all the different manifestations of engagement that were influenced by the email (open, click, purchase, forum posts, calls to the call center, web site visits, etc.) with their appropriate "weights". Then we need to track the impact of our email campaigns with respect to this index. Otherwise we run the danger of looking at this multi-dimensional issue in a one-dimensional manner (such as looking only at revenue or open rate, one at a time), and thus not realizing the actual impact of the email campaign.
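To illustrate what such an index might look like, here's a minimal Python sketch -- the event names and weights are purely my illustrative assumptions, not any kind of standard:

```python
# A sketch of an "Email Engagement Index" (EEI): a weighted sum of the
# engagement events influenced by a campaign, normalized per 1,000 emails.
# The event names and weights below are illustrative assumptions only.
WEIGHTS = {
    "open": 1.0,
    "click": 3.0,
    "site_visit": 2.0,
    "purchase": 10.0,
    "forum_post": 4.0,
    "opt_out": -8.0,      # disengagement counts against the index
    "spam_report": -15.0,
}

def engagement_index(event_counts, emails_sent):
    """Weighted engagement score per 1,000 emails delivered."""
    score = sum(WEIGHTS.get(event, 0.0) * count
                for event, count in event_counts.items())
    return 1000.0 * score / emails_sent

# A hypothetical campaign: 1,000 emails sent
campaign = {"open": 200, "click": 40, "purchase": 5, "opt_out": 10}
print(engagement_index(campaign, emails_sent=1000))
```

The useful part is not any one number, but tracking the index over time across campaigns -- a volume-driven campaign that boosts short-term purchases while driving up opt-outs would show a declining index.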

I'll have more on this soon, but thought I should at least get the dialog started.

Friday, October 20, 2006

"All marketers are liars". Seth Godin is a marketer. Hence?

Sorry for the cheeky title, but Seth sort-of ticked me off today with his post "Nobody Knows Anything".

First off, as someone who makes his living providing marketing analytics, I get a bit, let's say annoyed, when someone like Seth starts a blog post with "There are two kinds of marketing analysis, both pretty useless".

I calmed down a little bit when later in the post he conceded "Here’s the really good news: in addition to analysis, marketing today offers something that actually works: a process".

But his post in general has an attitude that says "Marketing is not science", and that "most marketing breakthroughs come down, sooner or later, to luck".

Well, I am not a guru like Seth (or as "lucky" in terms of having a couple of bestsellers under my belt), and he definitely knows a lot of things I don't -- but this is my blog :-) -- so, I will say that marketing is just as much of a science as social science is. It may not have equivalents to Newton's law of gravity or Einstein's theory of relativity, but marketing analytics does have some tried and tested ways to leverage data to make smart predictions about future behavior of customers and prospects.

Now, once equipped with the intelligence that marketing analytics provides, it is completely up to the marketers how successfully they can change their strategy and tactics to yield results (so, marketing is only partially scientific) -- but marketing analytics will almost always put a marketer somewhere above pure dumb luck.

Of course, you may agree with Seth, or not -- but I just had to get this off my chest.

Thursday, October 12, 2006

Marketing Analytics for Salesforce.com

This week and last, we've had two good conferences in San Francisco, both very relevant to marketing analytics. Last week it was DreamForce'06 from salesforce.com, and earlier this week we had DMA'06.

Let me talk about DreamForce first, because it has become a fad these days (especially for anyone working in BI or SaaS) to integrate with salesforce.com using AppExchange. Since early this year, I have been eagerly trying to find an angle between our business and salesforce.com, so DreamForce'06 was obviously very relevant.

A good friend of mine from college, John Barnes, was at DreamForce'06. John is the VP of Technology at Model Metrics, a Chicago-based firm that specializes in customization and integration of salesforce.com with legacy systems. I pinged John to find out more about what salesforce.com is actually doing about marketing automation and analytics, which was very insightful (thanks John).

The salesforce.com website describes marketing automation as a key component of their platform, which also includes marketing analytics. Turns out that while it is possible to define and manage campaigns using salesforce.com, the marketing analytics bit only provides some very basic reports.

So my idea is fairly simple. If people are actually running high-volume direct marketing campaigns using salesforce.com, then we will write an AppExchange component to suck in all the campaign and customer data into our platform and then provide much more sophisticated predictive analytics and other fun stuff on our platform -- make a lot of money, retire early, speak at next DreamForce (sorry I get carried away with this vision thing).

Anyway, about 10-20% of salesforce.com customers are using the marketing functionality today. SFA is the main use, but marketing use is growing. Still, 10% of 22,000+ customers is a good market to go after. And the marketing analytics (and reporting) on salesforce.com is lacking. The main obstacle to making it better is that their API does not have a join capability, so the only way to do better reporting today is to keep a local copy of the SFDC data and use replication software to keep it up to date (e.g. DBAmp or Relational Junction on the AppExchange).

Sounds like a good opportunity.

So, overall I am happy with DreamForce'06. I had a chance to shoot the breeze with an old college buddy, and learn about a very feasible way for us to get into the AppExchange game. Now I need someone who will go in on this with me as a "design partner" :-)

Monday, October 09, 2006

How to win $1 Million from Netflix?

Fellow blogger Michael Fassnacht noted rightly that Netflix has caused quite a stir for marketing data geeks with their recent $1 million prize offer for "substantially improving" their existing Cinematch algorithm to make more accurate predictions of "how much someone is going to love a movie based on their movie preferences".

Call it "crowdsourcing", or harnessing "group smarts" -- the approach is intriguing, and one of a kind. Being a curious soul myself, I decided to register a team from our company to check this out (who knows what may happen? we can be smart sometimes with enough luck :-)

A few interesting facts:
  • The contest is actually slated to run another 5 years, until 2011, with the bar being raised each year to improve over the previous year's winner
  • A fine but important distinction: the algorithm needs to predict how someone will rate a movie, NOT what movie someone will rent
  • At first glance, the data provided by Netflix seems pretty "skimpy" in terms of richness. Basically you get:
    • List of movies
    • List of ratings assigned for each movie by an extensive list of Netflix members
  • My first reaction was that having extra information on the movies themselves might help. There's a bunch of stuff available from IMDB. However, apparently there are license restrictions, and also Netflix doesn't really consider extra data to be valuable in improving their algorithm (see the discussion thread)
The "enjoy the journey, not the destination" mantra may be apt for this contest. As you can see on the Netflix discussion forum, this process has invited all sorts of interesting conversation on the validity of approaches, whether Netflix has provided enough data, why one should even bother, etc. -- a dream peer review IMHO, albeit a bit too noisy. So, Netflix should be getting a lot more than their money's worth via this process -- not just better algorithms and the PR buzz, but also an almost open-source-type process that involves an external community in their internal R&D.

At the moment, I agree with Michael's assessment that trying to solve this with ratings data alone might not be the best way to go. There seem to be so many other interesting dimensions that should influence someone's movie rating: movie characteristics like the cast, director, etc., reviews from critics, local media reviews, and geo/demographic information about the Netflix member, among others. None of these are considered in the current algorithm. I can understand Netflix's hesitancy to interface with 3rd party resources, but perhaps they should make all the datapoints within Netflix's movie database available for this contest -- and second, encourage contestants to add their own qualitative datapoints. If the goal is to approach this as a pure data mining problem, then increasing the depth of the data should help.
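To show what you can squeeze out of the "skimpy" ratings-only data, here's a minimal baseline predictor sketch in Python: global mean plus per-movie and per-user offsets. This is a common starting point for this kind of problem, not Netflix's Cinematch algorithm, and the tiny dataset is made up for illustration:

```python
# A minimal baseline rating predictor for ratings-only data:
# predicted rating = global mean + movie offset + user offset.
# A common starting point for this kind of problem -- NOT Cinematch.
from collections import defaultdict

def fit_baseline(ratings):
    """ratings: list of (user, movie, rating) tuples."""
    mu = sum(r for _, _, r in ratings) / len(ratings)  # global mean

    # Movie offset: how far this movie's ratings sit from the global mean
    movie_devs = defaultdict(list)
    for _, m, r in ratings:
        movie_devs[m].append(r - mu)
    movie_bias = {m: sum(d) / len(d) for m, d in movie_devs.items()}

    # User offset: how far this user rates relative to mean + movie offset
    user_devs = defaultdict(list)
    for u, m, r in ratings:
        user_devs[u].append(r - mu - movie_bias[m])
    user_bias = {u: sum(d) / len(d) for u, d in user_devs.items()}

    def predict(user, movie):
        p = mu + movie_bias.get(movie, 0.0) + user_bias.get(user, 0.0)
        return min(5.0, max(1.0, p))  # clamp to the 1-5 star scale
    return predict

# Made-up toy data: (user, movie, rating)
data = [("ann", "m1", 5), ("ann", "m2", 3), ("bob", "m1", 4), ("bob", "m2", 2)]
predict = fit_baseline(data)
print(predict("ann", "m1"))  # 5.0
print(predict("bob", "m2"))  # 2.0
```

Beating a baseline like this by a "substantial" margin is exactly where the hard part -- and presumably the extra data dimensions -- comes in.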

I'll keep you all posted how far we get on this. Being a small company, we will do this in the copious amount of spare time left over after working on existing client work that pays the bills. Still, it should be a lot of fun.

Thursday, September 28, 2006

An Asterisk that Cost 2 Million Dollars

This just in: a colleague of mine pointed out an analysis showing a sudden spike in the number of new trial subscriber signups for one of our clients. They had introduced a new product version in early 2005, and through mid 2006 they were averaging around 3,000 new trial subscribers each week -- less than half of what they were getting with the previous product version.

What had happened?

Turns out this client had several product feature descriptions listed during the trial sign-up process. One of these product descriptions happened to have an asterisk next to it, which was explained at the bottom of the page as requiring a credit card number upfront. For a casual observer, it wasn't clear which product feature the asterisk actually belonged to. So, during a content review session, someone caught this and said -- wow, people are looking at our new product and they think they need a credit card to subscribe (which they really don't), and they get very hesitant.

So, in July, a quick content change was made. The asterisk was removed.

Since we track all the transactions, within 7 days of this change we started seeing a sudden spike in new trial subscriptions, which has leveled off above 7,000 new trial signups a week (for now) -- more than double what was happening prior to the removal of the infamous asterisk.

My colleague did a quick calculation of the revenue impact, and it translated to roughly $2,000,000 in additional revenue over the coming 12 months from the asterisk removal.
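For the curious, here's the back-of-envelope version of that estimate in Python -- the implied value per extra trial signup is derived from the numbers above, not something my colleague stated:

```python
# Back-of-envelope version of the revenue-impact estimate.
# The per-trial value is derived from the figures in the post,
# not a number from the actual analysis.
before, after = 3_000, 7_000   # weekly trial signups, before vs. after the fix
extra_per_week = after - before
extra_per_year = extra_per_week * 52

revenue_impact = 2_000_000     # colleague's rough 12-month estimate, in dollars
value_per_trial = revenue_impact / extra_per_year

print(extra_per_year)             # 208,000 extra trials in a year
print(round(value_per_trial, 2))  # implied ~$9.62 of revenue per extra trial
```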

I hope you don't have a similarly expensive asterisk ANYWHERE on your website.