A marketing tale

Marketing is a tricky thing. One lesson that I have begun to learn over time is that simplicity and consistency always seem to triumph over a more comprehensive, but more complex, story. Take the case of Tivo in the UK. A couple of my friends bought Tivo when it first appeared in Britain and started to have that kind of scary, glazed expression normally associated with religious fanatics or users of interesting pharmaceutical products. I then saw a cinema ad for Tivo and it seemed great: it would find TV programs for you without you having to know when they were scheduled – how cool was that?! It would learn which programs you liked and record them speculatively for you; you then ranked how much you liked or disliked them and it would get better and better at finding things you enjoyed. You could turn the whole TV experience from a passive broadcast experience into one where you effectively had your own TV channel, just with all your favorite programs. Oh, and it looked like you could skip past adverts, though of course the Tivo commercial politely glossed over that.

Well, I bought one and I was like a kid in some kind of store. I soon acquired the same crazed look in my eyes as my fellow Tivo owners, and waited, smug in the knowledge that I was at the crest of a wave that would revolutionize broadcasting. My friend at the BBC confirmed that every single engineer there was a Tivo fanatic. And then: nothing happened. Those BBC engineers, a few others and I constituted the entire UK Tivo market – just 30,000 boxes were sold in the UK. Eventually Tivo gave up and, although Tivo is still (just about) supported in the UK, you can’t buy a Tivo 2, or even a new Tivo 1 except on eBay.

What happened? The message was too complex. Years later Sky caught on to the DVR concept and brought out the functionally far inferior Sky+. How did they advertise it? They just showed a few exciting clips with the viewer freezing and then replaying: “you can replay live TV” was all that was said. This was a fairly minor option on a Tivo, one that the Tivo commercial barely mentioned, yet it was simple to understand. Sky+ sales took off, and some BBC sound engineers and I are left with our beloved Tivos, praying that they don’t go wrong. It is another Betamax v VHS story, but this time the issue was a marketing one. Tivo still limps on in the US, still growing slowly in subscriber numbers through sheer product brilliance (helped by being boosted on “Sex and the City”), but has clearly not fulfilled its potential.

What this little parable should teach us is that a key to successful marketing is simplicity, stripping everything down to the core thing that represents value to the customer, and then shutting up. With a simple message people can describe the product to their friends or colleagues, and so spread the word. With a complex, multi-part message they get bogged down and so cannot clearly articulate what the product does at its heart. It is so tempting to describe the many things that your product does well, but it is probably a mistake to do so. Find the one core thing that matters to customers, explain this as simply as possible, and repeat as often and as loudly as you can.

Size still isn’t everything

Madan Sheina, who is one of the smarter analysts out there, has written an excellent piece in Computer Business Review on an old hobby horse of mine: data warehouses that are unnecessarily large. I won’t rehash the arguments made in the article (in which Madan is kind enough to quote me) as you can read it for yourself, but you can be sure that bigger is not necessarily better when it comes to making sense of your business performance; indeed, the opposite is usually true.

Giant data warehouses certainly benefit storage vendors, hardware vendors, the consultants who build and tune them, and DBAs, who love to discuss their largest database as if it were a proxy for their, er, masculinity (apologies to those female DBAs out there, but you know what I mean; it is good for your resume to have worked on very large databases). The trouble is that high volumes of data make it harder to analyse data quickly in a meaningful way, and in most cases this sort of data warehouse elephantiasis can be avoided by careful consideration of the use cases, probably saving a lot of money to boot. Of course, that would involve IT people actually talking to the business users, so I won’t be holding my breath for this more thoughtful approach to take off as a trend. Well done Madan for another thoughtful article.

Revenue recognition blues

Cognos shares have slid nearly 20% in recent weeks as an SEC probe into their accounting continues. The questions raised are in the notoriously tricky area of US GAAP rules, specifically on “VSOE” (vendor-specific objective evidence), which determines how much revenue can be credited for a deal in current figures, and what amount should be deferred. The post-Enron climate has ushered in a much harsher review of software industry practices than was normal in the past, and such esoteric-sounding accounting rules can seriously impact a company, as Cognos is now seeing.
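To make the VSOE issue concrete, here is a deliberately simplified sketch of the “residual method” that the rules allow when fair values can be established for the undelivered elements of a deal. The function and all the figures are invented for illustration; real VSOE determinations are far more involved than this.

```python
# Toy illustration of revenue splitting under US GAAP software rules
# (SOP 97-2). All names and figures are invented; this is a sketch of
# the idea, not an implementation of the actual accounting standard.

def split_revenue(total_fee, vsoe_of_undelivered):
    """Split a bundled deal into revenue recognized now vs deferred.

    vsoe_of_undelivered: list of fair values (per VSOE) of the elements
    not yet delivered, e.g. a year of maintenance. If VSOE cannot be
    established, the cautious treatment is to defer the entire fee.
    """
    if vsoe_of_undelivered is None:
        return 0.0, total_fee          # no VSOE: defer everything
    deferred = sum(vsoe_of_undelivered)
    return total_fee - deferred, deferred

# A $500,000 deal bundling a license with maintenance whose VSOE fair
# value is $100,000: recognize $400,000 now, defer $100,000.
now, later = split_revenue(500_000, [100_000])
print(f"recognize now: ${now:,.0f}, defer: ${later:,.0f}")
```

The arguments, of course, are all about what counts as acceptable evidence for those fair values in the first place.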

Word in the market is that the underlying business is actually quite robust at present, so hopefully this will be a blip for the company rather than anything more serious. Cognos 8 means that there is quite a lot of potential for Cognos to gain revenue as customers upgrade to the new software, which features much better integration between ReportNet and PowerPlay, and a complete revamp of Metrics Manager, which is retitled Metrics Studio. These improvements should see Cognos customers steadily upgrading, and so have a positive impact on the company’s already pretty healthy finances. However, perhaps some more conservative interpretation of US GAAP on their part would be wise.

MDM: comply

James Kobielus makes an important point regarding master data management – its role in compliance. We know that large companies today generally have many different versions of master data scattered around their organizations: 11 different definitions of “product” on average, for example, according to one survey from research firm Tower Group. This of course makes any business performance management question hard to answer: “how much of product X was sold last week?” is tricky to discover if there are eleven systems that each think they are the master source of product information. However, it may be worse than that: if you have to produce a report for reasons of regulatory compliance, then such ambiguity may have serious consequences.
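As a toy illustration of the ambiguity (the systems, keys and figures are all invented), consider three systems that each believe they hold the product master:

```python
# Three systems, each convinced it is the master for "product". The
# same product is keyed and scoped differently in each, so the same
# question yields three answers. Everything here is invented.

last_week_sales = {
    "ERP": {"PROD-X": 1200},            # units shipped
    "CRM": {"Product X": 950},          # units on closed orders
    "DWH": {"X (incl. X-Lite)": 1480},  # rolls a variant into "X"
}

for system, sales in last_week_sales.items():
    (key, units), = sales.items()
    print(f"{system}: keyed as {key!r}, sales last week = {units}")

# Which figure goes in the compliance report? Without one agreed
# master, every answer is defensible -- and none is authoritative.
```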

In the article James says that “without an unimpeachable official system of records, your lawyers will have to work twice as hard to prove your organization is complying with the letter of the law”. Of course the lawyers won’t be too troubled about that (all those juicy billable hours) but business executives certainly need to consider the possible compliance implications of poor master data, as well as its consequences elsewhere.

The patter of tiny pitfalls

There are some sensible tips from Jane Griffin on MDM pitfalls in a recent article. As she points out, improving your master data is a journey, not a destination, so it makes sense to avoid trying to boil the ocean and instead concentrate on a few high-priority areas, perhaps in one or two business units. It would make sense to me to start by identifying the areas where MDM problems are causing the most operational difficulties, e.g. misplaced orders. By starting where there is a real problem you will have less difficulty in getting business buy-in to the initiative. Be clear that there are lots of different types of master data: we are involved with a project at BP, for example, which manages 350 different master data types, and clearly some of these will be a more pressing issue than others.

I have seen some articles where people are struggling to justify an MDM initiative, yet really such initiatives should be much easier to justify than many IT projects. For a start, IT people can put the issues in business terms. Master data problems cause very real, practical issues that cost money. For example, poor or duplicated customer data can increase failed deliveries and cause issues with invoicing. Poor product data can result in duplicated marketing costs, and in some cases can even cause issues with health and safety. Problems with chart of accounts data can delay the time needed to close the books. These are all things that have a cost, and so can be assigned a dollar value to fix.
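As a sketch of how such a business case might be framed, here is a back-of-the-envelope calculation. Every figure in it is hypothetical; the point is only that each symptom can be costed, summed, and compared with the price of fixing it.

```python
# Hypothetical annual cost of poor master data. All inputs are made up
# for illustration; plug in your own company's numbers.

failed_deliveries   = 2_400     # per year, from duplicate/poor customer data
cost_per_failure    = 75.0      # re-delivery, support call, credit note

duplicate_mailings  = 50_000    # per year, from duplicated customer records
cost_per_mailing    = 1.2

extra_close_days    = 3         # per monthly close, from messy chart of accounts
cost_per_close_day  = 20_000.0  # finance staff time per extra day
closes_per_year     = 12

annual_cost = (failed_deliveries * cost_per_failure
               + duplicate_mailings * cost_per_mailing
               + extra_close_days * cost_per_close_day * closes_per_year)

print(f"annual cost of the status quo: ${annual_cost:,.0f}")
# Compare with the cost of the MDM initiative to get a payback period.
```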

Successful MDM projects will be heavily business-led, driven by the need to improve operational performance. IT staff need to educate business people that there is now an emerging set of solutions that can help, and get those business people involved in owning the data. It is the lack of data governance in many companies that contributed to the poor state of master data in the first place.

How healthy is your data warehouse?

Not all data warehouses are created equal. Indeed, both custom-built warehouses and some packaged data warehouse products can have surprising limitations in terms of their functionality. Just as I referred recently to Dave Waddington’s excellent checklist of things that would indicate a master data management problem, I would like to propose a series of questions that could be used to assess the depth of functionality of your data warehouse, whether it is custom built or packaged. For this list I am indebted to Dr Steve Lerner (until recently IS Director, Global Finance Applications and Integration at pharmaceutical firm Merial), who was kind enough to set out a series of symptoms that he had found indicated a problem with a data warehouse application. What I like about these is that they are all real business problems, and not a series of features defined by a software vendor or database designer. They are as follows.

1. Do you have difficulty conducting what-if analysis for a variety of business or product or geographical hierarchies?

2. Would it be hard for your current system to determine the impact of a business organization change on Operating Income?

3. Would it be hard for your current system to determine the impact of realigning geographical associations on regional profitability estimates?

4. Do you have difficulty restating historical data?

5. Do you have difficulty viewing historic data on both a time-of-transaction basis and a current basis?

6. Do you have difficulty restating historical data using new account structures?

7. Do you have difficulty viewing composites of data from sources with different granularities along key dimensions (e.g., comparing daily sales for a month to forecast sales done monthly, to your annual profit plan, and to a five-year long-range projection)?

8. Do you have difficulty with “bad data” getting into your current data warehouse?

9. Do you have difficulty maintaining the accuracy of your reference data?

10. Do you have difficulty with traceability from source to report?

So, how did your data warehouse application score? If it did not do well (i.e. it failed on several of these ten points) then you should be concerned, because business users are likely to want to do exactly these types of things with the data in the warehouse, if not today then at some point. When they struggle, they will come looking for you.

A potential application of this checklist would be to identify the best and worst data warehouses in your company. This type of “health check” could be useful in prioritising future investment, e.g. it may highlight that some systems are in urgent need of overhaul or replacement. If you work in an IT department then going to business users with this kind of health check could be seen as very proactive and enhance the IT department’s credibility with the business. If you are a systems integrator then creating a process for measuring the health of a data warehouse along these lines could be a useful tool that could be sold as a consulting engagement for clients.
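As a sketch of how such a health check might be mechanised, here is a small scorer over the ten symptoms. The questions are Dr Lerner’s; the pass/fail scoring and the “healthy” threshold are my own invention, purely for illustration.

```python
# Score a data warehouse against the ten symptoms above. The pass/fail
# scoring and the "healthy" threshold are invented for illustration.

SYMPTOMS = [
    "What-if analysis across business/product/geographic hierarchies",
    "Impact of an organizational change on operating income",
    "Impact of realigning geographies on regional profitability",
    "Restating historical data",
    "Viewing history on both time-of-transaction and current bases",
    "Restating history against new account structures",
    "Combining sources of different granularity",
    "Keeping bad data out of the warehouse",
    "Keeping reference data accurate",
    "Traceability from source to report",
]

def health_check(handles_well):
    """handles_well: list of ten booleans, True where the warehouse
    copes with the corresponding symptom without difficulty."""
    failures = [s for s, ok in zip(SYMPTOMS, handles_well) if not ok]
    verdict = "healthy" if len(failures) <= 2 else "needs attention"
    return 10 - len(failures), verdict, failures

# Example: a warehouse that struggles with restatement and traceability.
score, verdict, failures = health_check(
    [True, True, True, False, True, False, True, True, True, False]
)
print(f"score: {score}/10 ({verdict})")
for f in failures:
    print(" -", f)
```

Run across every warehouse in the estate, the resulting league table is exactly the kind of prioritisation evidence described above.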

Casting spells

I write this column using some software called Blogger, which is fairly simple to use but is rather limited in some ways, so I am probably going to switch to a more flexible blog editor soon. However one cause of constant entertainment is the Blogger spell check function, which almost makes the thing worthwhile. My typing is erratic at best, so I frequently encounter the Blogger spell checker. At first I found its eccentric suggestions annoying, or even inept, but now I find that they have a certain charm of their own. It takes me back to the early days of word processing, when spell checkers were crude, and their alternative suggestions for one’s typographical errors were sometimes wildly inappropriate. Blogger’s spell checker recalls that era, as it presents sometimes surreal suggestions for what to a human eye is a pretty easy mistake to spot. For example, if you misspell:

“management” as “managemnet”

then you are presented with two alternatives. Its best guess is “mincemeat”, which is somehow appropriate in a couple of cases of managers I can recall, but not really a very likely error. Its only other attempt is “mankind”. This is not an isolated case. If you write about federated databases then it is endearing to see the typo:

“federaion” greeted with the two alternatives “bedroom” or “veteran”

proposed by the beastie. “Bedroom”? I would love to understand the algorithm that came up with that one. I was also impressed by:

“peformance”

Instead of the pretty obvious “performance” it rather sweetly suggests “peppermints”.
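For what it is worth, the textbook way to generate suggestions is to rank candidate words by edit distance (insertions, deletions and substitutions), which makes Blogger’s offerings all the more baffling, as this little sketch shows:

```python
# Classic Levenshtein edit distance -- the textbook yardstick a spell
# checker uses to rank candidate corrections.

def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete from a
                            curr[j - 1] + 1,             # insert into a
                            prev[j - 1] + (ca != cb)))   # substitute
        prev = curr
    return prev[-1]

for candidate in ("federation", "veteran", "bedroom"):
    print(candidate, levenshtein("federaion", candidate))
# -> federation 1, veteran 4, bedroom 5: the obvious fix wins easily,
#    yet Blogger proposes only the two distant ones.
```

Whatever Blogger is doing under the hood, it is evidently not this.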

However, my favorite is that if you type “Blogger” then it simply does not recognize it. The term “blog” is also, sadly, a complete mystery to it, which might seem an omission given that it is intended as a spell checker for, er, blogs, or perhaps “blocs” as the spell checker so helpfully proposes. For “blogger” it then suggests the wonderfully ironic:

“blocker”

How true, how true. I would be interested to hear of your worst spell check horror, or indeed of a spell checker whose ineptness rivals Blogger’s. Any offers?