Appliances are proving popular

There is a useful overview of the growing appliance market in Computer Business Review:

The appliance market is nothing if not growing, with no fewer than ten appliance vendors now identified by analyst Madan Sheina (who, by the way, is one of the smarter analysts out there). Of course, apart from Teradata, many of these are small or very new. Teradata accounts for about USD 1 billion in total revenue (the accounts will become much clearer once it separates from NCR), though this includes services and support, not just licences. The next largest vendor is Netezza, which does not publish its revenue (though I would estimate it at over USD 50M). Kognitio used to be around USD 5M in revenue, though they seemed perky when I last spoke to them, so may be a little bigger now. DATAllegro will certainly be smaller than Netezza, as will the other new players. It is too early to say how well HP’s Neoview appliance will do, though clearly HP has plenty of brand and channel clout, especially now that it has acquired Knightsbridge.

Still, so many entrants to a market certainly tell you that plenty of people feel that money can be made. So far Teradata and Netezza have had the field pretty much to themselves, but the entrance of HP and the various newer vendors will create greater competition, which ultimately can only be of benefit to customers.

An unlikely source of BI ideas

I fully agree with an article by Steve Miller about how the Harvard Business Review is a surprisingly useful resource for people working in business intelligence. One of the recurring themes I have noticed over the years with projects going wrong is that the root cause of problems is more often communication between people than technology. Of course, as technologists we are inevitably drawn to the technical issues around the latest technology – performance, how buggy the software is, and so on – but few pieces of commercial software are so poor that they will directly cause a project to fail (I exempt Commerce One from this generalisation; it was that bad). The useful thing about the Harvard Business Review is that it gives some insight into the kind of issues confronting senior management, or at least the kind of issues they are reading about.

However, the HBR is rather hard work. There are rarely articles about technology directly (an exception was the November 2006 “Mastering the Three Worlds of Information Technology”), but technology often crops up within other articles, as Steve Miller points out. What I would add is that HBR can be a rather ponderous read. Its articles tend to be long and in-depth rather than bright and breezy, and there is a politically correct element to the HR pieces which can seem quite sanctimonious. But for every painfully worded article about the joys of diversity training there are several useful ones about current management trends and hot topics.

Speaking the same language as senior management is a stepping stone on the road to better understanding and communication, and that in turn will help improve the prospects of success for a BI project.

A different kind of conference

I spent the last two days at the eWorld Purchasing and Supply conference in London, where I had a couple of speaking slots (you can’t escape your past, and a long time ago I worked in Shell’s technology planning area, which involved software procurement). I was pleasantly surprised by the scale of the conference. These days most IT conferences are struggling to get decent attendance, as more and more people seek out information online. Yet this specialist conference managed over 300 attendees on both days, and from the conversations I had these were mostly “real” delegates, i.e. people with actual projects and problems to solve, rather than the tyre-kickers whom salespeople dread and who often seem to constitute much of the attendance at some IT conferences.

For another perspective on the conference see the following blog:

Credit where credit is due to the organisers, Revolution Events, who did an excellent job with administration and organisation (only let down a bit by the caterers on day 1, who were of the “oh, we didn’t think this many people would be coming” variety). The exhibits area had a decent flow of traffic and the speaking slots stayed well on time, a particular bugbear of mine. Perhaps these more specialist conferences, concentrating on a particular vertical or, in this case, functional niche, are the way to go.

When is an appliance not an appliance?

What we call things is important. The recent rise of data warehouse “appliances”, pioneered by Netezza (and arguably Teradata before that), is an interesting case in point. For years the relational database vendors spent their energy making sure that transaction systems ran quickly and reliably. Business intelligence applications were not a major focus, and this led to a number of approaches to dealing with very large data warehouse applications. Certain types of index scheme work very well for read-only BI queries, for example, and Red Brick was an early example of a database optimised along these lines. Later Teradata did a superb job of carving out a high-end niche by using parallel processing hardware and specialist database software designed to take proper advantage of it. They did such a good job that after a while Teradata became almost synonymous with large data warehouses of the type typically encountered in retail banks, supermarket chains, telcos and the like. Oracle and the others made some half-hearted attempts to fight back with features like star joins, but by then it was too late: the specialist data warehouse device, in the form of Teradata, had become established. Of course such projects were still large and complex. Most data warehouse project costs are associated with people, not hardware or software, and this does not change whether you are using SQL Server or Teradata as your database.

However, marketing can at times (not often, but sometimes) be a clever and subtle thing. When Netezza brought out what was essentially a device like Teradata, but quicker and cheaper, the label “appliance” was used, and a very clever label it is. In normal English usage an appliance is something that we just plug in, like a toaster or a coffee maker. Without making any such overt claims, the “appliance” label carries a comforting implication that your data warehouse project will have that toaster-like installation quality previously lacking with pesky traditional databases. Given that a DW appliance is just some clever hardware and an optimised database, your project issues are in fact identical to those of any other DW project. Analysis, user requirements, data quality, sourcing, design and reporting all still have to be done, although the appliance may well be able to handle large volumes of data at a much better price point than a traditional hardware/database combination. Since the hardware and software on a project may typically account for less than 20% of the project costs, this is an undeniably useful thing, but hardly takes us into toaster territory.
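A back-of-envelope sketch makes the point concrete. The 20% hardware/software share is from the observation above; the project size and the appliance discount are purely illustrative assumptions of mine, not figures from any real project:

```python
# Illustrative arithmetic (assumed figures): if hardware and software are ~20%
# of a data warehouse project's cost, even a dramatic appliance discount moves
# the total far less than the headline price difference suggests.

def total_project_cost(people_cost, hw_sw_cost):
    """Total cost = people-intensive work plus hardware/software."""
    return people_cost + hw_sw_cost

# Assume a USD 2M project where hardware/software is 20% of the total.
people = 1_600_000   # analysis, design, sourcing, testing, reporting
hw_sw = 400_000      # traditional hardware/database combination

baseline = total_project_cost(people, hw_sw)

# Suppose an appliance halves the hardware/software line.
with_appliance = total_project_cost(people, hw_sw * 0.5)

saving_pct = 100 * (baseline - with_appliance) / baseline
print(f"Overall project saving: {saving_pct:.0f}%")  # prints "Overall project saving: 10%"
```

A 50% hardware discount shrinks the overall project bill by only about a tenth on these assumptions: worthwhile, but nothing like plugging in a toaster.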

Yet the label matters. In a rather breathless blog yesterday:

Mike Stevens, whom I don’t know personally but who appears to have a background in PR rather than hands-on data warehouse implementation, claims that appliances spell “trouble for traditional data warehouse vendors” since an appliance may cost just USD 150k whereas “conventional solutions cost millions”. He falls into the language trap of the appliance. Your data warehouse still has to deal with all those people-intensive things (data sourcing, reporting, testing) whether you use a conventional SQL database and a regular server, or a specialist DW appliance. The issues are all identical, except that with an appliance you have some additional cost, since less familiar skills will need to be brought to bear (there are more Oracle skills out there than Netezza ones). The savings on hardware from using an appliance may be very significant, and comfortably justified on a large data warehouse, but such a project is not going to cost USD 150k and a quick plug into the wall socket.

If this kind of misconception is so easily repeated by journalists (or at least bloggers), then I wonder how widespread this view is amongst IT managers, and how much it has helped data warehouse “appliances” catch on. Would Netezza have done quite so well if their product had been labelled something less reassuring, like a “data warehouse turbo toolkit”? It was said that HP was so bad at marketing that, if it sold sushi, it would describe it as “cold dead fish”. The “appliance” vendors show that smart marketing can still be done in hi-tech.

Microstrategy Joins the Party

To go with Business Objects’ excellent Q4 results, Microstrategy also reported good figures, suggesting that the BI industry is in generally good shape. Revenue was USD 92.6M, up 20% on last year. Just USD 36.6M of this was license revenue, but that was up 17% on last year.

There was a very healthy operating income of USD 32M, which equates to an operating margin of 34%. Other measures were also healthy, e.g. days sales outstanding of just 54, and cash flow from operations of USD 26M. Admittedly the latter is down a little, as expenses have risen by 22% year over year, but all the same this is a healthy business.

These results are all the more significant because Microstrategy has been in rather a flat phase for some time, with licence revenue barely growing since early 2005. This perky set of results will be all the more welcome to its staff and shareholders given that background.

No objections to Business Objects results

Business Objects delivered a very solid Q4, with revenue of USD 371M, up 22% from a year previously. The good news is that the increase was mainly due to license revenue, at USD 180M, up 16%. The operating margin of 23.6% was the best the company has ever achieved.

The only small cloud on the horizon was that the core BI business was actually in decline in license terms. Of the license revenue, EIM contributed USD 23M (well done to the First Logic boys and girls), enterprise performance management USD 30M, and the core BI products USD 127M (down 4% year over year). This apparently paradoxical result is in line with my long-standing thesis about the saturation of the BI tools market in enterprises.
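As a quick sanity check, the segment figures quoted above do reconcile with the reported license total (a trivial sketch; figures in USD millions, as quoted in the results):

```python
# Checking that the license revenue breakdown adds up to the quarterly total
# (figures in USD millions, as quoted in the Business Objects Q4 results).
eim = 23        # EIM (the First Logic business)
epm = 30        # enterprise performance management
core_bi = 127   # core BI products, down 4% year over year

license_total = eim + epm + core_bi
print(license_total)  # prints 180, matching the USD 180M license revenue figure
```

So the 16% overall license growth is carried entirely by the acquired EIM and EPM lines, while the core USD 127M shrinks.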

There were 13 deals in excess of USD 1M (up from nine last quarter), and overall in 2006 there were 35 deals of this size, which is actually down from 46 last year. Growth in Q4 was geographically well spread, with Europe up 25%, Asia Pacific 26% and the Americas 19%. The results reflect some wise acquisitions, since without the EIM/First Logic revenue and the EPM revenue (based around the acquired SRC software and, to a lesser extent, the ALG acquisition) things would not look so rosy. Still, this vindicates the strategy of moving up the food chain in BI beyond core reporting, so credit where credit is due.

How fast can BI get?

FAST is the latest enterprise search company to dip its toe into the waters of business intelligence, following Autonomy’s recent announcements. In the case of FAST, which is arguably the leader in enterprise search (and must surely be the leading Norwegian software company), they have done so through acquisition. They bought Corporate Radar, a small BI vendor with some quite clever reporting technology (based around the Microsoft platform) that was flexible and browser-based. On one project I encountered at Kalido in the US, it was particularly good at building specialist financial reports, e.g. gross margin “waterfall” analysis.

What is less clear to me is how “FAST Radar”, as it is now known, really integrates with the FAST search engine. Superficially it is appealing to apply “search” to BI: after all, if Google can scan the whole internet in seconds, why can I never find my monthly sales figures? However, the problem in dealing with structured data is the ambiguity of metadata within corporate organisations (“which sales figures do you mean, exactly?”), a problem that search technologies, clever though they are, barely scratch the surface of. Putting a very efficient index on a keyword is great for text searching, but it is less obvious how useful it is in resolving ambiguous or inconsistent metadata. Hence I wonder whether the “integration” of these technologies goes much more than skin deep.
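A toy example illustrates the point. The metric names and descriptions below are entirely made up, but they show how a keyword index answers “find everything mentioning sales” instantly while leaving the real question – which sales figure? – untouched:

```python
# Toy illustration (hypothetical metrics): a keyword index happily returns every
# metric matching "sales", but cannot tell you which one the user actually means.
from collections import defaultdict

metrics = {
    "gross_sales_eur": "Invoiced sales, EUR, excluding VAT",
    "net_sales_usd":   "Sales net of returns, USD",
    "sales_forecast":  "Forecast sales from the planning system",
}

# Build a simple inverted index over the metric descriptions.
index = defaultdict(set)
for name, description in metrics.items():
    for word in description.lower().replace(",", "").split():
        index[word].add(name)

print(sorted(index["sales"]))
# prints ['gross_sales_eur', 'net_sales_usd', 'sales_forecast'] -- all three
# metrics match, so the ambiguity remains for a human to resolve.
```

The lookup is fast and complete, which is exactly what search engines are good at; choosing between three plausible “sales” metrics is a metadata and governance problem that the index does nothing to solve.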

If anyone out there has practical experience of a project that uses one of these search engines combined with a BI tool or data warehouse, then please make a comment on this blog, as I am sure that your experiences will be of considerable interest to others.