Do Metrics and Measurement Matter?

August 1st, 2007 by Bob Bly

The cover story in this month’s issue of Training & Development magazine is “Metrics and Measurement: Do They Matter?”

The article argues in favor of measuring success in sales training and performance … vs. (I would guess) NOT measuring it.

The fact that the headline is phrased as a question implies that there are people who are AGAINST measuring the results generated through sales training.

Sounds unbelievable, doesn’t it?

Yet, there are people who don’t measure the results generated from their marketing programs.

And there are those who never test marketing ideas against each other in a simple A/B split.

They argue passionately about whether concept A or B … or headline A or B … is better, when they could quickly and easily test the two concepts or headlines online at minimal cost.
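
To see how little is involved, here is a minimal sketch in Python of the arithmetic behind a simple A/B split (the response counts are made up for illustration): tally the responses each version pulls, then run a two-proportion z-test to check whether the difference is bigger than chance alone would explain.

    import math

    def ab_significance(responses_a, size_a, responses_b, size_b):
        """Two-proportion z-test for an A/B split; returns (z, two-sided p-value)."""
        p_a = responses_a / size_a
        p_b = responses_b / size_b
        pooled = (responses_a + responses_b) / (size_a + size_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / size_a + 1 / size_b))
        z = (p_b - p_a) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
        return z, p_value

    # Hypothetical split: headline A pulls 120 orders from 5,000 emails,
    # headline B pulls 155 orders from the other 5,000.
    z, p = ab_significance(120, 5000, 155, 5000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p under 0.05 suggests a real difference

If the p-value comes in under 0.05, the winner is probably not a fluke; if it doesn’t, mail more names before declaring a champion.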

Marketers give lip service to testing, but except for the bigger direct marketers, most companies do little measurement and even less split testing.

I think the top reasons for lack of testing and measurement are:

A. Lack of knowledge of how to conduct a test.
B. The belief that testing is too much work and hassle.
C. Uncertainty about what to do with the results.

Any other reasons you can think of why so many marketers measure their results minimally if at all?



6 responses about “Do Metrics and Measurement Matter?”

  1. Riel Langlois, comic book writer said:

    The marketing department of an organization would look for a jump in sales to evaluate if a program was particularly successful.

    My experience with marketing managers is that they watch the profit numbers. For example, say a newspaper made $4,000 in ad profits in July 2006. In June 2007 the ad reps try a new marketing seminar, and July 2007 shows a $5,000 profit, up $1,000 from the previous year.

    If there weren’t other obvious influencing factors, the manager would assume that the seminar was successful. I haven’t seen a lot of scientific method involved in management in marketing, but I think their “gut” evaluation methods have merit.

  2. Jim Logan said:

    There are a number of B2B marketing departments I’ve worked with that don’t test and measure their results because the company doesn’t grade their performance on the response their work generates or on its ability to produce revenue.

    Marketing is often measured on product management and corporate marketing activities – positioning of the product, organization and timing of the product release, and how spectacular the booth looks at the big tradeshow.

    If more marketing departments were measured on things like quality of leads, prospective customer response, etc., I imagine you’d find more direct marketing and testing.

    It would be interesting to survey companies and see whether there’s a correlation between the use of direct marketing tactics and testing, on the one hand, and Marketing being graded in significant part on revenue performance, on the other.

  3. David Fideler said:

    You should always measure RESULTS, but it doesn’t always make sense to TEST . . .

    Marketing program results, regardless of size, should always be measured and evaluated for effectiveness, including return on investment. Not to do so is just inept.

    As far as A/B testing goes, that really depends upon the actual marketing program and the size of the mailing.

    In a B2C mailing of 100,000, you would certainly want to do an A/B test before doing the actual full-scale mailing.

    However, in very highly targeted B2B mailings, the actual target audience can be quite small . . . 1,000, 2,000, 3,000, etc. In cases like that, even if you were to do a very small test, the results obtained might not even be statistically valid. Understandably, it might not be meaningful to test very tiny mailings — but the results should always be evaluated, regardless of size or scale.
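
    To put a rough number on that point, here is a back-of-the-envelope sketch in Python; the 2% baseline response rate and one-point lift below are assumed figures, not anyone’s actual results.

        # Approximate names needed per test cell to tell a 2% response rate
        # from a 3% response rate (alpha = 0.05 two-sided, 80% power).
        Z_ALPHA = 1.96    # critical z for 5% two-sided significance
        Z_BETA = 0.8416   # z for 80% power

        def names_per_cell(p1, p2):
            """Approximate names needed in each cell to distinguish p1 from p2."""
            variance = p1 * (1 - p1) + p2 * (1 - p2)
            return (Z_ALPHA + Z_BETA) ** 2 * variance / (p1 - p2) ** 2

        print(round(names_per_cell(0.02, 0.03)))  # roughly 3,800 names per cell

    At roughly 3,800 names per cell under those assumptions, a 1,000 to 3,000 name list simply cannot support a meaningful split, which is exactly why evaluating the results still matters even when formal testing does not.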

    – David Fideler
    Concord Communications and Design
    http://www.concordcd.com/

  4. Ted Grigg said:

    I think there is another reason marketers do not test as they should: lack of resources. Not money, but the staff and management energy it takes to deal with testing.

    For several of my direct marketing consulting clients, testing is the straw that breaks the camel’s back. Many marketing groups are stretched to the breaking point. The reason that happens so much is a topic for another discussion.

    Just getting out the work takes everything they’ve got.

    In such environments, the first thing to go is strategy. It is like walking on one leg. I would call that limping, not walking.

    The title of an article I wrote, “It Works Compared to What?”, quotes my initial response to marketers who tell me an execution works but do not validate it through ongoing testing. They may think they are walking, but they are really hobbling when they could be running.

    You can download the PDF from the articles section of my website at http://www.dmcgresults.com.

  5. Mike said:

    It’s very beautiful.

  6. Lou Smith said:

    Speaking of metrics, I found a visit from http://www.bly.com/blog/ in my access logs. Puzzled: yes, flattered: a bit, thanks for checking http://www.safelists.us.
