Thursday, September 18, 2008

Brand Awareness is a Metric, NOT a Goal

OK, just a quick one. Here it is, 1:00 in the morning, and I have to write this because it's on my mind. I'll regret it when the alarm goes off at 6:something.

I read a whitepaper recently, and it started off by making the critical point that in order to design metrics, you must be clear about your objectives. OK, we all agree.

But then this same document goes on to outline brand awareness as an objective.

Now at first, this didn't jump out at me.

But has brand awareness ever been a business objective? Do people go into business to maximize brand awareness? Or is brand awareness, once quantified, a measure of your effectiveness at reaching your target audience and disposing them to consume your product?

Monday, August 18, 2008

The Acid Test for Custom Reports

OK, I'm speaking to the people who work in small entrepreneurial companies here. Those of us who choose to work in a dynamic entrepreneurial setting do it for the rush of immediacy. The ability to move quickly, stay light on our feet, and make real change.

Lots of times, in performance meetings, someone at the table (OK, it's usually me) will pipe up and say "How hard would it be to just find out ..." It's a theoretical discussion only.

Someone has a new report request.

Maybe it's something they've thought through that is going to revolutionize the way we do our business.

Maybe it's just idle curiosity.

The tech people at the table immediately tend to jump on the "How do we do this?" bandwagon, totally bypassing the "Is this something we should be allocating valuable resources to?" train.

And usually, at first blush, the answer looks and feels like ...pretty quickly.

But how often is that true?

Sometimes, what happens at that point, is someone pipes up and "authorizes" the report, based on the assumption that someone else can whip it up over sandwiches today at lunch.

Then that someone else spends the whole afternoon on it, because they ran into a few unforeseen stumbling blocks. Oh, and by the way, they have a couple of questions about how the reports should be formatted...so now followup meetings are being scheduled, and the person who first requested the report, is now piling on additional feature requests, unchecked. And the dominoes start to topple....

The project has outgrown the petri dish, and is limping hideously through the corridors of your place of work, wreaking havoc. People have forgotten its original, innocuous status as a "theoretical discussion". The hours invested in it have infused the project with the value of human sweat.

Let's take a peek a few weeks down the road and ask how much time the new report is saving. A couple of possibilities (terribly overgeneralized, of course):

1. The person who first requested the report has a new spring in her step, and has lost the haunted look that comes from too many hours manually calculating stuff that is better done by a computer.

This is the result you're going for. Congratulations.

That young genius is probably going to start poring over the reports and come up with a great recommendation that will materially alter how you do business...save you a gazillion dollars, and pay for the implementation 40 times over before next week's meeting.

2. The person who requested the report has relegated it to the pile of "stuff I don't need to babysit anymore." Ask her how it's progressing, and she'll pull a report for you while you wait...and oftentimes discover that the data is being pulled incorrectly, or the report is garbled, or not in a format that is useful to anyone. ***By virtue of having been authorized in the first place, the project has been elevated to the status of "stuff worth doing properly".*** At this point countless additional hours may be sunk into the pursuit, before any kind of cost:benefit analysis happens.

3. The person who first requested the report is completely buried. The time required to analyze the implications of the new report is consuming them, and they no longer seem to have time to stop and think about how much benefit the new information affords them (factoring in all of the data exceptions, annotating the performance anomalies that affect the data output, and closing the knowledge gap between the data and the information that the data stands proxy for).

The Three Things That Should ALWAYS Happen Before Anyone Builds a Custom Report

1. Write a SPEC. Even for a little thing. This process is great for shining a light on the holes that are so easy to gloss over in discussion. We sometimes want to rush past this step. We know exactly what we want. We think we have expressed it clearly. We think there is no room for error.

I'm married to a system architect. I've tried asking him to just whip me up a report (I am a self-confessed data junkie). Without a written spec, he refuses, even when I assure him it's quick and easy. Even when I say pretty please or bat my eyelashes.

Try this. If it is a simple report, it will only take you a few minutes to write out the functional spec. Describe all the inputs, the data sources, and all the possible outputs, depending on what inputs the report receives. This will bring you a lot closer to a shared understanding with the people who are building the report.

If having the report is not worth the time it takes to properly specify how it works, you probably don't need it.
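To make that concrete, here is the sort of half-page spec I have in mind. The report name, fields and rules below are purely hypothetical; yours will differ:

```
Report: Refunds by Campaign (hypothetical example)
Purpose: weekly view of refund counts and refund rate per marketing campaign
Inputs: date range (default: last 7 days); campaign ID (optional, default: all)
Data sources: orders table (order id, campaign id, order date, amount);
              refunds table (order id, refund date, amount)
Outputs: one row per campaign with orders, refunds, refund rate
         (refunds / orders), and refunded dollars; campaigns with zero
         orders are omitted
Edge cases: a refund counts in the week it was issued, even if the order
            falls outside the date range; partial refunds count once
Follow-up: requester reviews the output weekly for the first month
```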

2. Remember that data is only a proxy for information. It is imperfect, and prone to misinterpretation. It can act as a red herring, or mask important trends. Be sure that everyone involved understands how the data is being calculated. Call the data what it is, not what it is supposed to represent.

Create a data Glossary of Terms.

3. Things automated are easily forgotten. Ensure you build in a mechanism for following up on the results of the report.

Final Notes

There is a lot of room for assumptions when people toss an idea about over the boardroom table. But programming a custom report is an exact science. Many a programmer has misinterpreted the requirement, and built a report that does not meet the need. And often the report that DOES meet the need is a LOT harder to build.

Thursday, June 19, 2008

Multivariate Test Conversion Page

Thanks so much for participating in my multivariate test....

Or, if you came straight here from somewhere else, and are willing to take 2 seconds to participate, I'd appreciate it. Just click the link.

Multivariate Testing, A Work in Progress

I am very excited about Google's Website Optimizer, which makes multivariate testing easy and manageable. I have run various A/B tests for the purposes of website optimization in the past, but always with unwieldy manual systems. The variables very quickly spun out of control, and the results had to be manually processed and interpreted.

I have struggled for the last few days to set up a multivariate test on a dynamically generated site, with limited success (server problems have meant limited FTP access to try out the quick/easy WordPress plugin that I found after a little searching. It's called Google Website Optimizer from Content.Robot.)

So, I thought I would set up the experiment here instead. Of course, I'll have to pay all of my friends and loved ones to come and visit the site to achieve statistically significant volumes, but it will at least give me practice at setting up the process.

So, first I have to identify what action I want people to take. I'll start simple and just ask them to

click this link to participate in my multivariate test.

The variables I'll test will include:

1. placing the link in the middle of the page versus the top of the page
2. including an image or not
3. two alternate headlines

And I'll create a "conversion page", to use the Google Website Optimizer lingo, thanking site visitors for playing.
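For my own bookkeeping, here is a rough sketch, in Python, of how I think about the test cells and their conversion rates. This is not anything Website Optimizer produces; the visit and click counts below are placeholders:

```python
from itertools import product

# The three factors from the list above: link position, image, headline.
factors = {
    "link_position": ["middle", "top"],
    "image": ["with_image", "no_image"],
    "headline": ["headline_a", "headline_b"],
}

# Every cell the test has to fill with traffic: 2 x 2 x 2 = 8 combinations.
combinations = list(product(*factors.values()))
print(len(combinations), "combinations")

# Hypothetical results per cell: (visits, clicks on the participation link).
results = {combo: (120, 7) for combo in combinations}  # placeholder numbers

for combo, (visits, conversions) in results.items():
    rate = conversions / visits if visits else 0.0
    print(combo, f"{rate:.1%}")
```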

P.S. Clearly I should have posted my conversion page first, so the posts would appear in order.

Wednesday, May 7, 2008

Emerging Keyword Trends With Hittail and Blogpulse

How do you identify appropriate keyword phrases when they are only hours or days old?

Take the phrase "subprime crisis" for instance, or "global food crisis". These are subjects that have only recently been thrust into the public consciousness. Language is constantly evolving, but SEO experts need to be ahead of the curve.

Keyword research tools like Wordtracker or Trellian are excellent for understanding the "voice of the customer" - the words that prospects use to describe the product or service that you offer - as long as the phrases are well established. But what about keyword phrases that are evolving or being coined today? By the time an SEO expert has found them, he is already behind in the race for search engine position.

That is where tools like Hittail and Blogpulse come in.

I've written about Hittail a couple of times, but if you haven't downloaded it, you can get it here free: www.hittail.com.

Blogpulse is a tool from Nielsen BuzzMetrics. You can check it out at www.blogpulse.com.

Hittail gives you a realtime stream of keywords, as well as the ability to easily repeat the search that brought your visitor. It allows you to identify emerging keyword combinations quickly.

What it doesn't do is help to expand your keyword list to related terms.

If the phrase is very new, the keyword research tools won't help you here.

So how can you identify related keywords to the longtail phrases Hittail shows you from your site traffic?

For this I have recently started to use Blogpulse. Results are fresh, timely, and framed in the language of the people who are most interested: niche bloggers and the site visitors who post their comments. I have to comb through the results, looking for useful terms and phrases. It is not neatly packaged like a keyword tool, and volumes are anyone's guess. But the wording in use by the bloggers and commenters is a good starting place for building out a list.

The site also displays helpful charts showing incidences of the phrase over time.
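To make the hand-combing concrete, here is a hedged sketch of the kind of thing I otherwise do manually: pull the search phrases out of referrer URLs and count which other words show up alongside a seed term. The referrer URLs and the seed term below are invented:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical referrer URLs of the kind a log file (or Hittail) shows you.
referrers = [
    "http://www.google.com/search?q=subprime+crisis+explained",
    "http://www.google.com/search?q=subprime+mortgage+meltdown",
    "http://search.yahoo.com/search?p=what+caused+the+subprime+crisis",
]

seed = "subprime"
related = Counter()

for url in referrers:
    params = parse_qs(urlparse(url).query)
    # Google puts the phrase in q=, Yahoo in p=; take whichever is present.
    phrase = (params.get("q") or params.get("p") or [""])[0].lower()
    words = phrase.split()
    if seed in words:
        related.update(w for w in words if w != seed)

print(related.most_common(5))
```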

I am on the hunt for a more immediate keyword research tool. Can anyone suggest one for me?

Thursday, April 24, 2008

I'm Trying Out Hittail - Part 2

I wrote a couple of weeks ago that I had installed Hittail on this blog, in order to get a better sense of what subjects my readers are interested in. I wanted to see which topics the Hittail engine would recommend that I focus on.

Well, after a week, I took the next step and installed it on several higher traffic blogs.

So far, the results are encouraging....and somewhat addictive for the compulsive web-analyst. (Did I mention I'm thinking of starting a support group for marketers who feel compelled to while away their nights studying emerging user trends?)

At the office, we've taken a couple of (I think) exciting steps to align our editorial and marketing efforts, and the Hittail application is playing a role in that process. It is helping us to determine which stories to feature, allowing us to craft more targeted headlines (both to improve SEO and the user experience), and increasing our awareness of which keyword phrases we should be monitoring position and optimizing for.

A Word on the Hittail Suggestions Tool

Hittail provides suggestions as to which keywords your website should focus on optimizing for. As a self-professed metrics junkie, I've wasted a few thought cycles pondering the algorithm that selects the words.

Hittail is about the longtail strategy, so the recommendations are not based on keyword volume. Some of the words it has selected are pretty obscure, so it doesn't seem to be any kind of keyword clustering. From what I can hypothesize, it seems to select words for which your site appears fairly low in the rankings, but for which you still managed to attract a visitor (implying that those who ranked above you provided unsatisfactory results for the motivated searcher). Presumably, I guess, if you could improve your rankings on these phrases, you would attract much more of this type of motivated, previously unsatisfied traffic to your site.
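Just to pin my guess down, here is a toy version of that selection rule in Python. To be clear, this is my hypothesis, not Hittail's confirmed algorithm, and the phrases, positions and visit counts are made up:

```python
# My guess at the rule, not Hittail's actual algorithm: favour phrases where
# the site ranks poorly but still received at least one visit.
keywords = [
    # (phrase, current ranking position, visits received) -- all made up
    ("long tail keyword research", 34, 3),
    ("web analytics", 3, 40),
    ("hittail wordpress plugin", 51, 1),
]

def suggestion_score(position, visits):
    # No visits means no evidence of motivated searchers reaching the page.
    if visits == 0:
        return 0.0
    # A worse position combined with any visits at all suggests upside.
    return position * min(visits, 5)

for phrase, pos, visits in sorted(
        keywords, key=lambda k: suggestion_score(k[1], k[2]), reverse=True):
    print(f"{phrase:30s} position={pos:3d} visits={visits}")
```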

I have no real idea if this is how it works. I'll be posting to the forums at hittail.com to see if I can ferret out an answer. Meanwhile, I can recommend the tool, and report that it is proving useful for me on numerous levels. You can download it at www.hittail.com.



Anyone else out there using Hittail? How are you implementing its recommendations?

Thursday, April 17, 2008

Predictive Versus Descriptive Modeling with Analytics

I wanted to post about approaches to using web analytics data that recognize its limitations, and its power.

Many organizations use historical analytics data as a basis for forecasting future growth, and establishing performance goals and budgets. This application for analytics data can blur the distinction between predictive and descriptive data. Understanding this difference is critical to an effective analytics program. It generally falls to the analytics professional to ensure that the difference is clearly understood within the organization.

I'm going to start out with a couple of definitions. What do I mean when I say predictive versus descriptive modeling?

Predictive modeling refers to a mathematical model that can accurately predict future outcomes. For instance, I know that if I apply sufficient heat to water, the water will reach 100 degrees Celsius and begin to boil (barring slight variations for altitude, which are also predictable). The rate at which this happens and the amount of energy required can be mathematically described.


Descriptive modeling refers to a mathematical model that describes historical events, and the presumed or real relationship between the elements that created them. For instance, yesterday when I went to the store to buy milk, it cost me $1.00 a litre; last month it was 95 cents; last year it was 80 cents. Based on historical events, I assume it will cost me roughly $1.05 to buy a litre of milk next month.

Web analytics falls into the latter category. It is a set of descriptive, historical statistics.
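To put a number on the milk example: the "model" is nothing more than carrying the observed change forward a step, and it only holds if the world keeps behaving the way it did. A minimal sketch:

```python
# Observed prices per litre -- the description of what already happened.
history = {"a year ago": 0.80, "last month": 0.95, "yesterday": 1.00}

# The "forecast" just carries the recent month-over-month change forward;
# nothing guarantees the dairy market will keep behaving this way.
recent_change = history["yesterday"] - history["last month"]   # +0.05
forecast = history["yesterday"] + recent_change

print(f"assumed price next month: ${forecast:.2f}")   # roughly $1.05
```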

Past Versus Future Performance

I direct marketing activities for a division of a financial publisher. The company has an outstanding track record for identifying and recommending market-beating stocks. Historically, they've made recommendations that represented a lot of money for a lot of investors worldwide.

But at the end of each financial report that we send out (and those of virtually any financial advisory service or market report) is a warning to readers, something like:

Past performance may not be indicative of future performance.


***Web Analytics reports should carry the same user warnings***


Have any of you ever sat in a management meeting in which company representatives have said something like, "We have the data to demonstrate the relationship between ad spending and revenues, so if we want to grow sales by 20%, we just need to ratchet up our ad spending accordingly."?

This kind of thinking fails to recognize the rate of change in markets, technologies and the competitive landscape, and fails to factor in the concepts of resource scarcity and the law of diminishing returns, as well as the potential from economies of scale. That's without even considering the inherent flaws in any data gathering and processing system.
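As an illustration of why that reasoning breaks, here is a toy model in Python. The square-root response curve is entirely made up; the point is only that when each additional ad dollar buys a little less than the last, the linear extrapolation overstates what the extra budget delivers:

```python
import math

# Entirely made-up response curve: revenue grows with the square root of ad
# spend, so each additional dollar buys a little less than the last one did.
def revenue(spend):
    return 10_000 * math.sqrt(spend)

current_spend = 50_000
target = revenue(current_spend) * 1.20        # "grow sales by 20%"

naive_spend = current_spend * 1.20            # the boardroom assumption
required_spend = current_spend * 1.20 ** 2    # what this curve actually needs

print(f"naive plan: spend {naive_spend:,.0f} -> revenue {revenue(naive_spend):,.0f}")
print(f"to hit {target:,.0f} you would need to spend {required_spend:,.0f}")
```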

So, if the operation of complex markets cannot be accurately predicted by simplified analytics models, what are the more tangible uses of historical analytics data?

1. To identify broken systems. A significant change in performance data can often indicate a technical problem, overloaded systems, broken links, faulty logic etc.

2. To select between alternatives. Analytics is particularly apt for testing market responses to different offers, creative, or sales processes with A/B or multivariate testing (a minimal sketch follows this list). It can also provide a guide as to which channels or markets tend to be most lucrative or most cost effective among existing channels. (The challenge, then, is often to find ways to expand the more lucrative channel.)

3. To flag new market opportunities. A careful study of web analytics data can reveal new opportunities for cost savings, revenue generation or operational improvements.

4. To establish a meaningful dialogue with existing and potential customers. Web analytics data can help us to learn about customer needs, desires and propensities. It can teach us the language that the customer uses to articulate their needs, so that we can respond meaningfully. It can also give us parameters to personalize the user experience to better meet their needs and create loyalty, trust, and ultimately customer satisfaction.
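Here is the sketch promised in point 2: a minimal two-proportion comparison for choosing between two alternatives. The visitor and conversion counts are invented, and the 1.96 cutoff is just the usual 95% rule of thumb:

```python
import math

# Hypothetical test results: (visitors, conversions) for each alternative.
a_visits, a_conv = 4_000, 120   # offer A: 3.0%
b_visits, b_conv = 4_000, 152   # offer B: 3.8%

p_a, p_b = a_conv / a_visits, b_conv / b_visits
p_pool = (a_conv + b_conv) / (a_visits + b_visits)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / a_visits + 1 / b_visits))
z = (p_b - p_a) / se

print(f"A: {p_a:.1%}   B: {p_b:.1%}   z = {z:.2f}")
# Rule of thumb: |z| above ~1.96 means the difference is unlikely to be
# chance at the 95% level, so pick B; otherwise keep the test running.
```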

One final VERY POPULAR use for historical analytics data

And hey... IF your company persists in falling into the infinite-resources analytics model trap, IF they continue to hold fast to the belief that past performance predicts future performance, you can always use your historical analytics data as a roadmap to all of the external factors that caused your company to veer off its charted course.


Wednesday, April 16, 2008

My Other Blog is Spam?

A few days ago, as I found myself writing here, and trailing off into other subjects, I decided to set up another blog on a favourite subject of mine: offer strategy.

I went and set up the URL http://offerstrategy.blogspot.com, and over the next few days wrote a total of three articles on the subject of creating sales premiums that make sense - designing offers that help to close sales.

The blog has a total of three external links:
1 to this blog.
1 between two articles
1 to an article on my company's website about the mortgage crash in the US (which is topical to the article, since I think it can be argued that the crash is the result of flawed offer strategy)

This morning I received mail from Blogger informing me that my blog has been flagged as a potential "spam blog".

I went and read about the definitions of a spam blog, and it cited pirated content and volumes of nonsensical links as the general offenders. I am a little stunned to discover that my three little articles and three little links could somehow have set off alarm bells someplace. I am curious to know how it could possibly fit the criteria, and if so, whether the criteria make sense.

Anyways, I have put the blog up for review by the humans at Google. At least someone will be reading it now, as I hadn't gotten around to publishing it in any way...I was going to wait until there was a little more content.

Has anyone else had their blog unjustifiably locked as potential spam? Please share your story with me. Meanwhile, I'll climb back down off my soapbox now. Sorry for having strayed from the subject matter. To get back to analytics...I am currently working on an article for the WAA Blog about User Generated Content. Hope to share that next week.

Friday, April 11, 2008

I'm Trying out Hittail

A couple of days ago I installed the Hittail real-time search hits application on this blog. You can register and get the code for your own site at http://www.hittail.com. I am running the freebie version of the application, for now.

I learned about Hittail on the Web Analytics Association site blog. If you are not a member, and you are interested in Web Analytics, I suggest you join the Association.

  • the community is very helpful and supportive
  • they organize educational events
  • the forums are very active, and provide a good mix of practical information along with insights on where the industry is heading
Anyways, I installed Hittail here on this blog, and then, when everything seemed to go well, I also installed it via the WordPress plugin on http://www.moneyweekes.com. So far...no data on that site. Now, I am not panicking yet, because it took a couple of days for the stats to register for this blog. But when they appeared, all the historical data was there.

As a surprise bonus, in addition to reporting keyword traffic, the system informed me that my blog had been stumbled (I had been meaning to submit this blog to stumbleupon, but I'm delighted that someone else found the site and gave me the thumbs up), and received linkedin visits and traffic from the WAA today.

While all of those are great to know, I'm not sure how they will play into the software's stated objective to help me identify which longtail keywords I should be optimizing for.

Ambar Shrivastava from Hittail described the primary benefit as follows:

"The idea is to provide actionable data that helps webmasters and bloggers better connect with their existing readers as well as attract new readers interested in similar topics to what you already have on your website. This differs from other analytics tools that mainly describe what is happening on your site with little information on how to improve it."

I ran into a couple of snags during the installation process (what can I say, I'm tired...it's easy, but I was firing on half cylinders). And I can tell you that the team over at Hittail fairly jumped at the chance to help me out. They get double thumbs up for customer service.

I have shared the application around the office, so I'll have more to report back shortly, I expect. We are all anticipating good things. I work for a publisher, so our sites are very rich in original content, which seems like a good fit for Hittail. However, the moneyweekes.com site is in Spanish. Not sure how the application will handle other languages.

One of our programmers looked unimpressed and muttered "um, you know you can get that data from your log files, right?"... Yeah, I know. But Hittail does digest the information for me and highlight key points...and theoretically it provides optimization suggestions based on my keywords. I say theoretically, because my search volumes don't merit any suggestions as yet. And most exciting of all....it does it in almost realtime. So I can start getting a sense of when people are visiting the sites...what is the best time of day to publish my newsletter? How often should I be posting here? I read on their website that I should have about 100 pages minimum.

So if you found me through a search engine (or maybe even if you didn't), chances are tomorrow I'll be thinking about your keyword search, and how I might better address the issues you seem to be searching for online.

Anyone else out there installed the Hittail code on their site? How is it working for you?

Tuesday, March 18, 2008

Knowing When NOT to Maximize Your ROI

I was just reading a fascinating discussion over on Jim Novo's blog about the effectiveness of social media sites versus chatrooms as channels for advertising.

One commentator made a suggestion that Jim could "do a bad job" with his CPC campaign...as in designing ads to attract fewer, but more targeted, clicks, as a means to better optimize his ROI. Of course, it makes perfect sense. It is like designing incentives that are only of interest to the audience that would buy your product.

Don't give away a Palm Pilot - everyone wants one of those. Give away a demo version of your product, or a discount on purchase, or a whitepaper that compares and contrasts the features of your product versus the competition - something that removes barriers to purchase.

The advertiser is happier. The respondent is happier - because you are speaking to their specific needs. Can anyone tell me which stakeholder loses out in this equation? The publisher.

In my opinion, this is a relationship that must be considered when measuring ROI, yet it hardly ever makes it into anyone's calculations.

The supply of quality publishers depends on their ability to generate revenues from ads. I have had the experience at times in the past of negotiating so hard with a publisher that they refused to renew the contract when it had run its course. So, OK...I got the client the best possible deal in the short term, but what were the long term effects? I had "used up" available channels to access my target audience.

A client runs SEM campaigns on Google and also a very successful Affiliate program.

They pay their affiliates 5% of revenues.

SEM, on a cost-per-click basis, works out to roughly 20% of revenues.

Each represents approximately one third of sales.

I suggest to the client that they could afford to pay their affiliates 8%.

The first response is, of course,

Why pay 8% when their affiliates were doing it for 5%?

The most important step in the process is to recognize affiliates as an important stakeholder in the sales process - and to recognize that one can "create" more lower-cost inventory by taking care of a particular stakeholder group, or encouraging its development.

So then the questions:
1. How many more affiliates can be attracted at 8%?
2. How much can SEM campaigns be reduced in response, and what is the expected cost per acquisition at the new, presumably optimized (and therefore more cost-effective) volume?
3. What is the associated cost of marketing to this stakeholder group to bring them on board in sufficient volumes?

It is a step - or rather a leap of faith - that requires careful calculating, careful planning, and total commitment, and it stems from recognizing the multiple stakeholders that make your organization work.
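Purely to illustrate the careful calculating, here is the back-of-the-envelope arithmetic in Python. The revenue figure and the assumption that half the SEM-driven volume could shift to affiliates are invented; the 5%, 8% and 20% rates come from the scenario above:

```python
# Hypothetical figures: each channel currently drives about a third of sales.
revenue_per_channel = 100_000
affiliate_rate_now, affiliate_rate_new = 0.05, 0.08
sem_rate = 0.20   # SEM cost works out to roughly 20% of the revenue it drives

cost_now = (revenue_per_channel * affiliate_rate_now
            + revenue_per_channel * sem_rate)            # $25,000

# Assumption for illustration only: the richer payout attracts enough new
# affiliates to shift half of the SEM-driven revenue over to affiliates.
shifted = revenue_per_channel * 0.5
cost_after = ((revenue_per_channel + shifted) * affiliate_rate_new
              + (revenue_per_channel - shifted) * sem_rate)  # $22,000

print(f"acquisition cost now:   ${cost_now:,.0f}")
print(f"acquisition cost after: ${cost_after:,.0f}")
```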

Can you share stories of when it made sense for your business to pay MORE for customer acquisition?

Tuesday, March 4, 2008

SCRUM and Freedom from Preformatted Reports

A little girl watches her mother prepare a roast for Sunday dinner. Her mother lays out the piece of meat, rubs spices across the surface, and then carefully cuts the two ends off with a sharp knife, and sets them aside.

"How come you cut the ends off?" asks the little girl

The woman smiles knowingly, and explains that her mother taught her the process when she was a little girl, and it makes the tenderest, most flavourful roast.

The woman later asks her own mom why cutting the ends off the roast makes it taste so much more tender, and her mom responds that she is unsure of the science involved, but that she was taught the technique as a young woman by her own mother.

At which point, the mystery having taken on more significant proportions, the woman makes a call to her grandmother and asks "Gran, why does cutting the ends of the roast make it come out more tender?"

The grandmother, who is approaching ninety, seems disoriented by the question, so the woman explains further: "You taught mom, and she taught me, to prepare a roast by rubbing salt and spices into the surface, and then cutting off both ends before putting it into the pan."

The grandmother, when she hears this, begins to chuckle: "No dear, I cut the ends off so the roast would fit in the useless little roasting pan my mother-in-law gave me for a wedding present."

I am often reminded of this cautionary tale when I examine the institutionalized reporting structures in place in many organizations. With the competitive pressures of business and the volumes of data available, organizations often develop "institutional blindness" around their analytics. Reports become entrenched, and we forget the urgent needs that fostered them in the first place.

I recently stumbled across Marianina Manning's blog Web Analytics Princess when reading an article about measuring User Generated Content by Judah Phillips of Web Analytics Demystified. On Marianina's site, there is an article about Virtual Worlds London 2007 in which she relates comments from the CEO of Habbo Hotel, Timo Soininen. He describes his implementation of SCRUM project management techniques in developing the virtual worlds in response to user metrics and player feedback. For those of you who are not rugby fans, the SCRUM refers to the team huddling together and moving as a unit in a tightly knit formation, allowing the collective to jointly determine speed and direction.

And I started thinking about applying SCRUM Project Management techniques to web analytics. Not just the design of systems, but actual report generation too.

I was struck by a sudden vision of a future of Web Analytics unfettered by pre-formatted (canned) reports. Today, Application Developers shape our focus and our perception of our websites by pre-supposing what kinds of reports we will need. Google's conversion tracking system recognizes only four possible types of conversions: sale, lead, signup and pageview. Management measures performance success primarily by focusing on comparisons against historical data. Many web analysts have made the point that historical data is rendered more and more irrelevant the faster things change on the web.

Almost a decade ago, I worked on a project to develop an analytics tool that essentially stored all the data in its raw format, and allowed the analyst to specify all of the input fields to customize the report. The system had ZERO preformatted reports. It was totally flexible...(and also often crashed the database and the site, because that was before we'd caught on to batch processing, staging the data, or protecting against system overload).
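That system is long gone, but the core idea is easy to sketch today: keep the raw events and let the analyst name the dimensions and the measure at query time. A toy version in Python, with invented field names, not a reconstruction of what we actually built:

```python
from collections import defaultdict

# Raw, unaggregated events -- nothing pre-summarised, no canned report.
events = [
    {"date": "2008-03-01", "source": "google", "page": "/home",  "converted": 1},
    {"date": "2008-03-01", "source": "email",  "page": "/offer", "converted": 0},
    {"date": "2008-03-02", "source": "google", "page": "/offer", "converted": 1},
]

def report(events, dimensions, measure):
    """Group the raw events by whatever dimensions the analyst names."""
    totals = defaultdict(int)
    for event in events:
        key = tuple(event[d] for d in dimensions)
        totals[key] += event[measure]
    return dict(totals)

# Two "reports" nobody had to pre-build:
print(report(events, ["source"], "converted"))
print(report(events, ["date", "page"], "converted"))
```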

The project was shelved. People couldn't wrap their heads around the idea of total report freedom. It required a certain intimacy with the data and facility with the stats that proved a barrier to adoption... at least that's what I think happened :)

...but I think that, in the fast-changing world of Web 2.0, where user audiences are SO fluid, and their usage patterns are SO dynamic, a free-format reporting system, combined with a SCRUM approach to analytics that seeks to identify the emerging patterns without presupposing the answers, may be the path to effective optimization.

I know this is a little out in left field. But the neural-network geniuses of the world are creating these capabilities....we just need to create the management practices and reporting/visualization tools to follow suit. What do you think? Is this a possible future direction for the next generation of WA? Or am I totally out to lunch?

Thursday, February 21, 2008

Why is Adwords King?

Why is Google Adwords so popular as an advertising channel? I have had many people tell me that it is because it is so effective at targeting, and generates a higher return on investment.

It stands to reason...people are out searching for what you're offering at the very moment that you reach them. I could certainly make that argument to a client with confidence.

But then again, where else could the searcher find so many of your competitors all in one place? Does this swim against the stream of relationship marketing at all?

I've had clients tell me that they focus on Google Adwords because it's scalable. Someone told me the other day that he could quadruple his registrations overnight with Adwords, if he wanted to.

Really? He could come up with four times as many targeted, traffic-generating keywords and maintain the return on investment he has achieved up until now? Does that mean he is currently missing out on optimization opportunities? Or maybe he operates in a world without the competitive pressures of supply and demand.

Here's my theory:

We all hunger for the instant gratification and total control that Adwords gives us.

We chronically require the soothing gratification of being able to control our actions and rectify our mistakes...because the capabilities of Web Analytics hold our bad Media Placement decisions up for scrutiny.

No more worrying about having to ride out a long-term advertising contract with a third-party supplier that is underperforming.

No dabbling in the mysteries of SEO and the hoped-for results that MIGHT materialize at some time in the future. Even if there is compelling evidence that organic traffic is cheaper and more loyal.

Has anyone out there been doing this long enough to remember Flycast?

Well, in the early days of the Flycast ad network, I could manage my own placements site by site, and control the CPM I paid on each site and the volume of impressions I was willing to accept (yes, we still paid CPM back then). All through my own handy management application. That was back before Google even took ads.

I loved it. I pitched it to all my clients. Here was a place that I could measure and optimize performance, and demonstrate the results. I could easily demonstrate to my clients how we were saving money. Making them money. Improving their performance all the time.

In those days, keyword advertising was mostly sold on a CPM basis. You signed long-term contracts, and someone else was in control of the level of optimization. Paid search represented a relatively small segment of the market.

OK, so if it's all about controlling your ROI in real time, why don't affiliate programs rule the ad revenue universe?

Again, it's the risk of exposure. When you get 40 or 50 thousand affiliates signing up for your program, it can be difficult to police their advertising practices. And cleaning up the messes created by overzealous affiliates can be a lengthy process. Not that I recommend avoiding affiliate programs. I love them.

Nor would I recommend avoiding Adwords. Now don't get me wrong. I like Adwords. I love Adwords. Google repeatedly makes me a hero to my clients because I can exercise tight control over advertising, test effectively and scale spending up and down easily in response to changes in the business climate.

They have done an excellent job of developing tools that are valuable and functional, and their success reflects that.

But I think it is important not to have blinders on about the specific value of Adwords, and SEM in general and its place in the Internet Marketing spectrum. Throwing all of your eggs into the Adwords basket can mean passing up on more lucrative advertising channels in favour of more immediate control.

The powerful analytics applications available on the market increasingly allow us to take the "lifetime value of a customer" view in measuring ROI, and to develop and refine a variety of channels to optimize our returns.

Thursday, February 14, 2008

Thoughts on Measuring User Generated Content

I have been reading a lot lately about measuring user generated content, in preparation to take on the role of Moderator for the UGC group of the Web Analytics Association. I keep coming across discussions about measuring audience engagement: people publishing alternative formulas for quantifying audience engagement using bounce rate, recency and frequency as proxies for different aspects of engagement. They are all informative discussions of the alternative means of establishing a metric. They introduce potentially useful new paradigms. And yet, I find myself continually asking - why?

I have to admit that the "audience engagement" metric smells to me like one of those marketing-championed, artificial, qualitative-concept-as-performance-statistic labels like "brand awareness"... Sorry, I am a direct marketer at heart, and I'm all about the conversion ratio - it's why I love analytics. Brand awareness is mostly a happy side-effect of maximizing sales, or some other desired "transaction", in my world.

When I first enrolled in a graduate business program, I was majoring in nonprofit administration, with a focus on the administration of arts organizations. I remember during my first year, I was taking a Policy course in which the prof stood up and asked everyone, "What is the manager's primary responsibility?" I figured this was one of those broad discussion-topic questions, and people around the classroom were piping up with all kinds of answers about efficiency, facilitation, training of their reports, clarifying vision and mission, and so on.

The professor just stood looking grim and dismissive, subtly shaking his head. It began to dawn on me that this was not a topic for discussion. There was a right answer, according to him. Only one, and here it is:

The primary responsibility of management is to maximize shareholder value.

This event particularly stands out in my memory because, as a young and idealistic nonprofit-administration major, I had to wade into the fray, and start arguing about whether the point held true when there was no profit motive. No profit? Inconceivable. My classmates largely decided I was idealistic and naive.

But when I think about it, the statement does hold true for me. Where there is no profit motive, the constituents are the shareholders. The responsibility of a government is to maximize value for its citizens. My responsibility as a Marketing Manager is to maximize the value for the shareholders in the company where I am employed.

Add to this two other things that I have come to believe about designing analytics systems.

1. For an interaction between two parties to be useful and measurable online, there must be a "transaction" - an exchange of items of value, whether it is an email address in exchange for a whitepaper, or dollars in exchange for a box of chocolates (more on this in an upcoming article).

2. The objective of a performance metric is to track expected or predicted progress towards a specific, measurable goal, based on historical performance.

And so, with this in mind, I come back to the question of measuring User Generated Content, and the audience engagement metric. And I just want to present the idea in the following framework.

I am a Marketing Manager. My responsibility is to maximize shareholder value. In that context, how does measuring audience engagement serve my objectives, constitute a transaction, and track predicted progress towards a measurable goal? The answer to these questions will, I believe, serve as a framework to lead me to the appropriate metric for audience engagement.

Of course, it is obvious that the development and contribution of user generated content constitutes a significant and valuable transaction on a website (in most cases, excluding spam and other content that subtracts, rather than adds, value). I don't think any marketer would argue this point in the world of Web 2.0. But the measurement of this contribution must be placed in the context of a specific, measurable goal that serves the objective of "maximizing shareholder value", in order to constitute an applicable metric.

What do you think? Should measuring user generated content be subjected to a more rigorous framework, or is there perhaps value in an organic approach, to identify possible future goals, metrics or trends opportunistically? I propose that such activities are usually a distraction for upper management, and breed misunderstanding, wrong assumptions and distrust of the analytics process. Do you agree or disagree?

Wednesday, February 6, 2008

Social Networking, User Generated Content and Designing Analytics

I have, for several years, been a subscriber to Christopher Knight's free newsletter at www.ezinearticles.com. This morning, I received an invitation from him to connect via:

Linkedin
Myspace
Facebook
Twitter

I accepted his invitation to connect on LinkedIn, because of their handy notification system which tells me when he adds new connections...and as a metrics junkie, I am really curious to know what kind of conversion rates he gets. I wrote in my invitation to connect and asked him to let me know how many invitations went out, so I could get some kind of ballpark idea of the response rates. Of course, I have no idea where I fall in the response curve, so whatever I can observe is suspect as an indicator of what Chris actually experiences in terms of response.

Hopefully he'll write and tell me what kind of response he got. Now that we're LinkedIn buddies.

I recently joined the Web Analytics Association, and volunteered to help coordinate an online special interest group on "measuring user generated content". So, of course, faced with the pressure of writing some stuff to contribute, I've been looking at everything through the UGC lens the last little while.

I have to admit, UGC and Social Networking spill over into each other's swimming pools a lot for me. I have no idea where to draw the lines. I mean, here is some UGC spawned by Chris's mailing this morning (look at me, calling him Chris now).

So, like always, when puzzling with this kind of fuzzy area, I go back to what for me is "first principles".

"What am I trying to acheive?" defines how I approach measuring. How well does the UGC serve my objectives? and what proxies can I construct from within the available data to represent the milestones in moving toward my objectives?

So...connecting to someone I have never met or worked with on linkedin.com presents all kinds of interesting questions.

Have I in some way established a closer relationship with Chris Knight by connecting to him on LinkedIn? He surely will get a better idea of who his readers are. I checked, and he has 500+ connections (we were already third-level connected via 7 or 8 other people). Now the UGC of his contacts - their profiles, their relationships - has been brought into my field of vision. I wonder how the Web Analytics guys at LinkedIn interpret that data.

How do they measure UGC?

What defines success for them? If each of their members has a higher average number of connections, can they take that stat as indicative of business success? Or could it possibly be indicative of a dilution of the primary objective of LinkedIn, which for me, as a user, is at least partially to establish some level of prequalification or trust among new contacts? If everyone is a contact, how do I differentiate?

How do other people decide who to connect to via various social networking channels, and how do those decisions to connect define their perception of the channel, its usefulness, and their propensity to generate UGC for the channel? And most importantly for the analytics junkie...how can we extract and interpret data that will accurately inform us about these things...and serve our objectives?

Wednesday, January 23, 2008

Competition for Google Analytics

Just a quickie post.

I have signed up this week for beta access to 2 new analytics tools.

1. Microsoft's Project Gatineau, which you can sign up for here. You'll need an adCenter account to participate, and you must be a US customer....which I sort of am, at a stretch, since the parent company is out of Baltimore...at least the site is hosted in the States.

2. A realtime analytics tool: Reinvigorate's Snoop. Click on the beta test link in the upper right to sign up.

...I'm still waiting for my invites. I am so impatient I can't even wait for GA to update my stats every day; how do they expect me to wait for this?

Can anyone put me on to a free, realtime analytics tool?

Tuesday, January 22, 2008

Measuring RSS Feeds and the Search for Realtime, Free, Opensource Analytics Tools

Once upon a time, I had a team of programmers build me custom analytics tools to answer all of my tracking questions. I wanted to be able to follow site visitors from offsite, through specific banner ads, to onsite registration to attempted download, re-registration, application based activity and related email newsletter and promotion receipt and response, plus attribute the activity to various and potentially competing affiliates who worked with me.

But the tracking universe keeps expanding. Today I want to know how many people are reading my RSS feed. Whether the people who are visiting my site are already subscribed. Are they forwarding the newsletter to other people?

Trying to force-fit this kind of granularity out of Google Analytics, which is my current tool, is pretty challenging. I want to be able to break it down to the user level and see it in real time. Did person X, from campaign Y, ever come back to the website after they signed up? Are people who come from keyword searches more likely to forward articles on to associates?
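Those questions are easy to write down as queries over a raw, user-level event log; the hard part is getting that log out of a tag-based tool. A hypothetical sketch of the first question, with invented events:

```python
# Hypothetical user-level event log: (visitor id, campaign, event, day).
events = [
    ("visitor_x", "campaign_y", "visit",  1),
    ("visitor_x", "campaign_y", "signup", 1),
    ("visitor_x", "direct",     "visit",  9),
    ("visitor_z", "campaign_y", "visit",  2),
]

def came_back_after_signup(events, visitor, campaign):
    """Did this visitor, acquired via this campaign, ever visit again?"""
    signup_days = [day for v, c, e, day in events
                   if v == visitor and c == campaign and e == "signup"]
    if not signup_days:
        return False
    first_signup = min(signup_days)
    return any(v == visitor and e == "visit" and day > first_signup
               for v, c, e, day in events)

print(came_back_after_signup(events, "visitor_x", "campaign_y"))  # True
print(came_back_after_signup(events, "visitor_z", "campaign_y"))  # False
```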

So, I have been doing lots of online reading about various tools. Anyone know of a free tool that allows me to see reports in real time? One that allows me to slice and dice at will, ideally.

Friday, January 18, 2008

Tracking Return Visits by Registered Users With Google Analytics

As a marketer, I am hungry to reach out and make contact with my site visitors. To understand what drives them. What do they like about the site? How do they use it?

Today's challenge is setting up my GA tracking so that I can differentiate visits from people who are signed up to receive my newsletter from people who haven't. Sure, unique versus return visitors is a good, wide approximation, but I want to know specifically the behaviour patterns of the people who have given me permission to reach out and communicate with them every morning with our newsletters.

I am thinking that I can do this by using the "user defined" visitor report. Basically, this report is designed to allow us to sort site visitors based on the content of their responses in a submitted form.

An example. Let's say you have a form in which site visitors can vote for their favorite Beatle. You have 4 custom segments defined by including the utm_setvar function in your website code - John, Paul, George and Ringo. Now you can sort your visitors according to their preference in your GA reports.

To read about how to do this, you can check out the following forum posting at Nuhit.com. In this case, they are talking about segmenting visitors by vBulletin variables.

I, of course, am not a programmer, so I'll be passing the job into the capable hands of one of the programmers here on our team :)

In my case, I want to figure out whether I can set up two user-defined segments....completed the form or didn't. I am thinking that what has to happen is that the form has to be, essentially, pre-populated with a default "notsubscribed" segment....but then I need to figure out how to lump all of the other possible responses into a single category. I am going to talk to my programmers about whether we can set the segments as "contains @" or "does not contain @", because, hey...if there's no @, the form's no good anyways, right?
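Just to spell out the logic I'm after (in plain Python, not in GA's own syntax - that translation is exactly what I'll hand to the programmers):

```python
def segment_for(form_value: str) -> str:
    """Collapse every possible form response into one of two segments."""
    # Anything containing an '@' is treated as a real subscription address;
    # everything else (blanks, junk, abandoned forms) is lumped together.
    return "subscribed" if "@" in form_value else "notsubscribed"

for value in ["reader@example.com", "no thanks", ""]:
    print(repr(value), "->", segment_for(value))
```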

I'll let you know how it works out later in the day or week, depending.

Wednesday, January 16, 2008

SEO for a Good Cause with Bruce Clay

I was excited to read in my copy of Bruce Clay's January SEO Newsletter that the company has organized an SEO Charity Contest. If you're interested, entries are due by February 11, so act fast.

Unfortunately for me, I'm way down in Buenos Aires, so it would be impractical for me to try to participate, since I can't attend the events. Maybe next year they'll include some remote training prizes for participants from further afield.

What a great opportunity for individuals who are anxious to have their cause heard. I can see so many positives:

the contest winner gets an opportunity to benefit from the great course materials and Search Engine Strategies Conference,

the charity gets the help of an individual to promote their cause, PLUS they have a great PR piece to spin out, as a result of winning,

Bruce Clay has the opportunity to raise awareness around a bunch of great causes, among the powerful communicators of the SEO community! Maybe some other pros in the community might find a worthwhile charity or two to adopt and promote in the process.

I'm going to spread the word in my little community to see what kind of causes people would like to see supported. I'd be interested, if anyone feels like sharing, to hear what kinds of causes people think would be ideal for this contest. What great charity do you think could really leverage this kind of opportunity to make a real difference?

Monday, January 14, 2008

Accepting the Inaccuracies of Analytics Data?

Hurol Inan, on his website Hurolinan.com, posts an article entitled Best Practice KPI Reporting for the Online Channel.

In it, he outlines the four factors to improve decision-making.

Factor number one: The data the reports are based on must be accurate, and free of bias. KPI reports that rely on inaccurate data sets are not only invalid but also dangerous, as they may cause the wrong decisions to be made.

While I totally agree with what Hurol Inan has to say on the subject, I have a couple of concerns.

1. While improving accuracy is important, it is, I think, also important to understand the limits of the technology, and the resultant possibility of inaccuracies that are virtually unavoidable. This is the natural result of a distributed tracking system that relies on users accepting our tracking cookies, following prescribed paths within expected timelines, and not confusing our tracking systems by returning to a site multiple times through multiple channels. For today, the best we can do is to work towards ensuring that report users understand specifically what the data represents, and that they understand the limitations and potential for error.

2. Accuracy also depends on the company's definitions and practices. For instance, if you are running multiple ad campaigns, and a user responds to numerous advertisements before making a purchase, or returns through online ads once already a customer, how do you properly attribute the return on investment for advertising spend? Having clearly defined and published policies on the hierarchy of tracking events is critical both for measuring behaviour and for communicating to stakeholders - both internal staff and external affiliates.

3. In my experience, too often there is no common understanding of the specific definitions of the data being tracked, particularly among upper management, advisory boards or investor groups. A plan for educating all users on the meanings, definitions and limitations of dashboard reports and KPIs is critical for establishing the practice of trusting and acting on the data, and for achieving upper management support for allowing KPIs to shape strategy.

Friday, January 11, 2008

Dangers of Dashboard Reports

I'm not about to go around telling you that businesses don't need dashboard reports, only that they can provide a false sense of security - a dangerously warm-fuzzy feeling of control.

I have often witnessed cases where team members get so used to seeing similar patterns repeated that they stop really looking at reports. Oftentimes these reports hold the key to important operational changes, but the data is so dense that the changes pass undetected until reaching crisis proportions. Meanwhile, management, secure in the knowledge that they have dashboard reports keeping the pulse of the company's operations, continue their future planning based on flawed assumptions.

Some of the pitfalls I have seen:

  • differences between booked sales and fulfilled sales in evaluating the effectiveness of marketing campaigns
  • failure to properly train all stakeholders in the proper interpretation of data. A team member who has not had the meaning of a report, and its contained data, explained thoroughly is apt to assume that the data is completely reliable, needs no interpretation, and is all properly attributed. Anyone who has spent time working with online tracking knows this not to be the case.
  • data tunnel vision - allowing the historical content of dashboard reports to limit your marketing team's imagination in fostering and developing new revenue generation or customer acquisition channels
  • upper management or the accounting team assuming that historical results are scalable to any size and that historical performance patterns will hold steady
  • failure to consider internal labour costs when evaluating online campaign effectiveness in reports
  • slowing site operation in preference to internal report generation, data processing and synchronization
  • delaying report generation beyond usefulness
  • storing aggregated data and failing to foresee future requirements for reporting calculations

Can Someone fill me in on Starware.com?

I'm always interested when I come across a website that seems to be anomalous. Yesterday, doing a little brainstorming for a client, I was on ranking.com, reviewing their list of top sites.

I was perusing the list - the usual suspects, Google, MSN, AOL and so on - and my eye stopped at number 8: Starware. I thought to myself, "I've never heard of this website," and wondered what kind of a service they must be offering. Then I glanced over to the links in, thinking it must be a social networking site with tons of inbound links - something that has achieved success through grass-roots marketing.

But the site has SIGNIFICANTLY fewer inbound links than all of the other sites on the page.

AOL, in the number 7 spot, reports over 53 million links.

eBay, in the number 9 spot, over 5 million.

Starware.com? Just over 70 thousand.

So next, I link over to the site, to see if I can find some clue there. OK, it's a search engine. Must be a brand owned by one of the Internet behemoths. So I read a little more on their site, and look at Alexa.com to see how they are ranked (205, with only 115 links in - still pretty impressive, though not top 10).

Next I start thinking about the differences in the reporting from Alexa and Ranking.com. I can think of a couple of possible explanations.

1. Ranking.com and Starware are affiliated in some way, and users of the ranking.com and starware.com toolbars are integrated somehow.

2. Starware.com is particularly popular among people who have downloaded the ranking.com toolbar. These are people who are more inclined to download toolbars - newer on the 'Net than Alexa users, who are, perhaps, more the old guard?

Finally, I had a look at the traffic history on Alexa.com. These guys have been around a few years. Traffic starts to grow in 2004, ramps up dramatically in '06, and then roughly doubles over '07.

I need to close here by saying I'm in Buenos Aires, and so there are also lots of major movie releases, hot new tv shows and stuff that I totally miss down here. Can somebody fill me in?

Thanks.

Lu