Contrasting fortunes

For quite some time now I have been banging on (blogging on?) about the gradual but steady progress that Microsoft has been making with its business intelligence offerings, and how these pose a long-term threat to the pure-play BI vendors that seems to me underestimated by most.  Of course opinions are one thing but hard data is another, so it is nice to be able to put some numbers behind this thesis (“smug mode”, as they would say on “Red Dwarf”).  In the last year Microsoft’s BI offerings grew in revenue by a very healthy 36%.  Compare and contrast this progress with that of the pure-play vendors.  Cognos just announced its quarterly results, which showed license revenue down 1% on a year-on-year quarterly basis, and down 7% overall (the modest overall growth came from its services business). Cognos is still generating healthy profits at a 12.8% operating margin, which although down from 15.5% last year is certainly very respectable. But the underlying difficulty in selling more licenses in what is supposed to be a fast-growing BI market shows that something is not right.  Business Objects has managed slightly better sales growth, but at the cost of halving its profitability over the last year, while Hyperion has managed 8.9% revenue growth with just a small slip in profitability.  Of these, Hyperion, with its financial consolidation speciality, is the least threatened by the fairly low-end Microsoft BI tools.

Microsoft may not be particularly agile, but it is chipping away steadily at the business of the pure-play BI players, and I can only see this trend continuing.                                                                     

 

When you are in a hole, stop digging

SAP continues to score own goals around its troubled MDM offering.  At a Las Vegas conference the usually well-informed and ever-urbane Shai Agassi explained to attendees how MDM was critical to the success of SOA.  All true, but this runs up against the unfortunate problem that SAP’s own MDM offering is, to put it politely, not one of the current stars of their software firmament.  As noted in the article, despite SAP’s huge customer base and famed account control, they are hard pressed to rustle up even a few customer references.  It is actually worse than that.  I know for a fact that an SAP customer was, just weeks ago, told that the SAP MDM offering would not be ready for effective production use for at least 18 months.  The customer was not told that by one of SAP’s competitors, but by SAP themselves at a private briefing in Walldorf. It seems that this news has yet to fully spread even within SAP.  The SAP MDM product marketing manager is quoted as saying the customer base is “quietly increasing”, which in the usually ultra-spun lingo of product managers translates as something less than wild enthusiasm and delight.  IBM Global Services make a stack of money from implementing SAP, so when they say “it’s very difficult to find the customers using the software” then they probably mean it.

I found it pretty revealing when SAP didn’t even show up at either of the CDI/MDM conferences run earlier this year by Aaron Zornes (one in San Francisco, one in London). This at least seemed a prudent approach.  Let’s face it, all software companies have some things go better than others, so if you are struggling with a particular product then it seems sensible to keep your head down, avoid publicity and get on with fixing the issues rather than calling attention to the problem.  This is what makes this high profile “MDM is key” speech seem either very bold indeed, or a bit of a gaffe (or, as they say at Real Madrid, a bit of a Woodgate). 

 

 

Deja Vu

Cordys has just released the latest version of its Composite Application Framework, but rather oddly labels part of the offering a master data management framework. Cordys is certainly an interesting company, set up by the founder of Baan with the idea of providing an application-neutral, true SOA middleware stack.  What is striking is the sheer scale of the investment, with USD 135 million sunk so far (and counting) and 550 employees.

Cordys looks to have developed from the ground up a very scalable middleware platform, but from reading their material quite carefully I cannot really see anything that looks to me like a true master data management application.  There does not appear to be any real semantic integration capability within Cordys, and indeed the otherwise positive Ovum report that their web site points to notes that data integration is the one limited area of their software.  I suspect that they have simply jumped on the MDM bandwagon, as so many other vendors have done, feeling they need to appear to have an offering in a trendy space.  Certainly master data is important to how composite applications are going to work, and the enterprise service bus that Cordys provides will need to be fed with consistent master data from somewhere in order to be effective, yet their software does not appear to have a lot to offer in this area so far.

The major challenge for Cordys is the sheer ambition of what they have produced.  A complete middleware stack is not something which enterprises will lightly contemplate introducing, especially as it would seem that Cordys’ productivity message relies on using their stack wholesale.  Yet large companies already have huge investments in middleware from IBM, SAP, Oracle and Microsoft, as well as from independents such as Tibco, and are going to be very reluctant indeed to swap these out, however appealing a new framework is.  It is noticeable that Cordys’ customers so far are virtually all in the Netherlands, and are generally quite small companies.  I couldn’t see a really large enterprise in the customer list on their web site.  This does not surprise me, since the prospect of swapping out infrastructure on this scale is going to be a daunting one for any large company.  Mr Baan’s pockets are presumably very deep, but I can’t believe that 550 employees can be fed and watered on a diet of Dutch SMEs, so it will be critical for Cordys to break out of the local market and start landing some major enterprises at some point, just as Baan originally did with Boeing.

There is a worrying parallel here with Asera, a start-up of the internet boom set up in 1998 by ex-Oracle staff with massive (USD 175 million) venture capital backing, whose aims were rather similar in ambition to those of Cordys.  That company similarly hit the brick wall of entrenched middleware in large companies, and duly died an unpleasant death in 2003. This is one investment that you will not hear Kleiner Perkins boasting about.  It is to be hoped that Cordys fares better, but it has an uphill battle, in my view.

Did data quality just get more interesting?

The data quality market has been a strange beast.  The problem is huge, with just about every company in the world having issues regarding data quality.  Duplicate customer information is the area that is most familiar, but the problem can be a lot more serious than a few bits of duplicated junk mail.  More than a decade ago a Shell exploration drill neatly drilled into part of an existing drilling operation because the co-ordinates of the existing pipe were incorrect in a database.  Fortunately there was no oil flowing through the pipe at the time or a major spillage would have occurred, but even so the rig was off-line for some time, at a cost of about half a million dollars a day even then. I can certainly testify that a large chunk of every data warehouse project is spent dealing with data quality issues, even when the data comes from supposedly authoritative sources like ERP systems. 

Yet despite the real problem and the real dollars attached to it, the data quality software market is something of a minnow.  Forrester estimates it at USD 1 billion, and that includes consulting; the software part of the market is maybe USD 200 million.  Over recent years pure-play vendors have been bought up and incorporated into broader offerings, e.g. Vality by Ascential (now IBM), FirstLogic by Business Objects, Similarity Systems by Informatica, Trillium by Harte Hanks.  The price of FirstLogic, USD 69 million for a business with USD 50 million in revenue, was hardly something to make investors salivate (to be fair, Similarity’s price tag of USD 49 million was at a much better multiple, and there were some peculiarities around FirstLogic).  Why should this be?  Part of the problem is that data quality remains a resolutely human problem, which typically means that when validation rules are defined in a data quality tool, only part of the problem surfaces.  Business processes around who actually owns the data and who can fix problems are as big an issue as finding errors in the data itself.

It is encouraging to see a couple of start-ups take a new approach to this old chestnut.  Though they differ somewhat in the technologies used, both try to discover relationships within existing data by analysing the datasets themselves, rather than relying on building up a top-down rules profile. Exeros just secured a USD 12 million series B funding round, and has high-quality venture backing from Globespan Capital Partners and Bay Partners. A rival company with a similar approach is Zoomix.  Based on technology developed in the Israeli military, Zoomix uses data mining techniques to seek out the relationships amongst existing data, presenting its first attempt to a business analyst and then learning from the responses so as to improve the algorithms in future iterations. They also have an interesting new product which can apply these rules in real time to an existing transaction system, called up as an SOA service.  This effectively acts like a firewall, but for data quality. Zoomix has customers in Israel and Europe, and has set up a European HQ in London.  These newcomers present fresh competition to the largest vendor (Trillium) as well as to other start-ups such as Datanomic (based in Cambridge – the original one rather than the Massachusetts one) and to more specialist quality tools such as Silver Creek, who have taken a tightly targeted approach, in their case dealing with complex product data.
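To make the “firewall for data quality” idea a little more concrete, here is a minimal sketch of the general concept: incoming records are checked against a handful of rules before they are allowed through to the transaction system, with the second rule being the sort of cross-field relationship that could be inferred from existing data rather than written by hand.  This is purely illustrative Python; the field names and rules are invented and bear no relation to how Zoomix or Exeros actually implement their products.

```python
# A minimal, purely illustrative "data quality firewall": records are checked
# against simple rules before being allowed through to the transaction system.
import re

RULES = [
    ("postcode looks like a UK postcode",
     lambda r: bool(re.match(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$", r.get("postcode", "")))),
    # The kind of relationship a discovery tool might infer from the data itself:
    ("UK customers should be billed in GBP",
     lambda r: not (r.get("country") == "UK" and r.get("currency") != "GBP")),
]

def check_record(record):
    """Return the descriptions of any rules the record violates."""
    return [desc for desc, ok in RULES if not ok(record)]

def firewall(record, commit):
    """Commit the record only if it passes every rule; otherwise report it."""
    problems = check_record(record)
    if problems:
        print(f"rejected {record['id']}: {problems}")
    else:
        commit(record)

if __name__ == "__main__":
    accepted = []
    firewall({"id": 1, "postcode": "SW1A 1AA", "country": "UK", "currency": "GBP"}, accepted.append)
    firewall({"id": 2, "postcode": "12345", "country": "UK", "currency": "USD"}, accepted.append)
    print("committed:", accepted)
```

In a real deployment the check would of course sit behind a service call in front of the transaction system rather than a Python function, but the principle of vetting data before it lands is the same.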

Investors in these companies have clearly seen that, while data quality may be more of a component in a broader solution (BI, MDM, whatever), there is enough of a real problem here to allow room for innovative approaches.  The exits for these companies may well be acquisition by larger companies, just as with Similarity Systems, but it is good to see a fresh approach being taken to this very real, if not exactly sexy, business problem.

 

Back from the dead

Those of you with long memories will recall that the first three real ETL vendors were Prism, Carleton and ETI.  The other two were acquired, but ETI survived as an independent company, though with an ever-diminishing profile.  Early this year they were apparently down to just one salesman, but having the US Department of Defense as a customer does wonders for maintenance revenue.  In recent years the company had been pared down to a minimum, and I had assumed that, like an old soldier, it might just fade away.  However, in the summer the original investors were bought out and new capital was injected in a USD 6.5 million round from Appian Ventures of Denver, Access Venture Partners and Osprey Ventures, with a USD 5 million line of credit negotiated with Comerica bank.  Consequently the company is effectively born again, with new money, owners and management, but with established technology.

ETI’s software used to be strong at extracting data from esoteric sources, generating code against things like COBOL copybooks and assorted mainframe file systems, as well as having the usual transformation capabilities.  It suffered from being rather complex to use and from some weak marketing.

So, the interesting question is whether this old warhorse can be dusted off, repainted and revitalised.  Judging by the seven vice presidents that have appeared in the management ranks, the new board is not afraid to spend some of that new money.  They have also licensed in some data quality offerings from a couple of small British companies, which is a logical step to broaden the product range beyond ETL.  This is important because ETL on its own is a tough market, as ETI has discovered.  More and more ETL functionality is being thrown into the database (Microsoft with SSIS – previously DTS – and Oracle with the poorly named Warehouse Builder tool, which is really an ETL tool), which makes it hard work to persuade a customer to buy your technology.  Only Sunopsis has really made much headway here in recent times, with a clever pitch built around using rather than competing with the database capabilities.  Other pure plays like Sagent have withered and died.  Informatica is really the only ETL player of size left standing, and they have broadened their appeal by going for a wider integration message.  So what has changed that will allow ETI to flourish now when it clearly has not done so for some time?  Perhaps the new investors have noticed a flurry of companies being bought out in the data quality space recently and so can see a fairly quick exit, perhaps there is just too much venture capital around, or maybe they have more ambitious plans for the company.
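For readers who have not come across the “use the database rather than compete with it” pitch, here is a minimal sketch of the idea, often called ELT: the tool expresses the transformation as set-based SQL that runs inside the target database, instead of pulling rows out and processing them in a separate engine.  The tables and SQL are invented for illustration, and SQLite is used purely because it is self-contained; this is not how Sunopsis, SSIS or Warehouse Builder are actually built.

```python
# A hypothetical sketch of ELT: the transformation runs as SQL inside the
# target database rather than row-by-row in an external ETL server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, cust TEXT, amount_pence INTEGER);
    INSERT INTO stg_orders VALUES (1, ' ACME ', 1250), (2, 'Widgets Ltd', 990);
    CREATE TABLE dw_orders (order_id INTEGER, customer TEXT, amount_gbp REAL);
""")

# The "transformation" is a single set-based statement executed by the database.
conn.execute("""
    INSERT INTO dw_orders (order_id, customer, amount_gbp)
    SELECT order_id, TRIM(UPPER(cust)), amount_pence / 100.0
    FROM stg_orders
""")

for row in conn.execute("SELECT * FROM dw_orders"):
    print(row)
```

The commercial appeal of this approach is that the customer’s existing database engine does the heavy lifting, so the vendor is selling design-time productivity rather than yet another runtime server.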

ETI certainly has some well-proven technology, and its foray into data quality looks logical.  Good luck to them.  Yet relaunching a company is hard work, and it will take some impressive sales and marketing execution to breathe new life into this particular body.

Cognos treads water

I have written before about how the pure-play vendors face considerable pressure in what is a more saturated market than they believe.  Further evidence of this theory being borne out comes with the latest results from Cognos, which although on the surface looked OK with an 8% revenue increase, in fact hid a year-on-year decline in license revenue. Ten $1 million deals in the quarter was also down from the previous quarter. The company announced that it was laying off 210 staff, though it continues to invest in its sales force.  That sales force managed to sell $210k of license per rep in the last quarter, which is not exactly stellar given Cognos’ well-known brand and its position as the number 2 player in the BI space.

Profit margins are a still-healthy 13%, but this compares to a peer group average of 16.3%, and full-year profit is actually down by nearly 30% compared to the previous year.  Though the stock price is off the lows it hit when the SEC investigation was announced, it can be seen from the chart below that this has not been a great year for investors in Cognos stock, who would have been much better off with an index tracker.

[Chart: Cognos share price over the past year]

In case you are wondering, Business Objects’ share price has not exactly been lighting up the night sky either.  The company has annual sales growth of 12%, but profits have shrunk 46% year on year:

[Chart: Business Objects share price over the past year]

I am sticking to my theory that all the pure-play BI vendors will continue to face a difficult environment for the foreseeable future. 

 

Question master

Jane Griffin poses “five questions to ask your MDM vendor” in a DM Review article this week. I like the idea of the article, but I think one of the questions is somewhat flawed, and there are some other questions that seem just as pertinent to me.

Firstly, remember that MDM is a market that did not really exist two years ago.  There were some CDI and PIM vendors around before that, but in terms of general-purpose MDM solutions you had what exactly?  Kalido MDM and SAP MDM (which unfortunately had the small drawback that it didn’t work).  Yet now there are maybe 60 vendors who claim they have an MDM solution.  Well, either they are all quick learners, or some of these solutions will perform best on PowerPoint rather than any other operating system.  This raises the question:

Q6.  How many customer references do you have?  How many are in production with your software, and can I talk to three of them?  Trust me, this is the question that will send shivers down the spine of an MDM software vendor.  The pioneering vendors went through plenty of issues with their early releases and some emerged better than others.  Newer vendors will have exactly the same problems, so you should get a good sense of how other people have been getting on with the software.

Q7.  How well does the MDM product support local-to-global mapping of data, i.e. how well does it deal with semantic differences?  In the search for “one version of the truth” you will encounter the pesky reality that in a large company there are international coding structures but also local coding structures, e.g. for local products specific to a market.  Both structures are “right”, but they both need to be managed, e.g. when a new version of the international coding structure comes out, will it overwrite local amendments made to the last version?  This brings us on to…

Q8.  How good is the product’s version control?  Master data is rarely simple and unique, and will most certainly change over time.  Can the MDM solution deal with multiple draft copies of master data that are in review, or can it only deal with “golden copy” data?  The latter would be a significant limitation on how much of your data governance you are able to automate.  (There is a small sketch after question 10 illustrating this and the previous point.)

Q9.  How good is the product’s hierarchy management?  In particular, can it deal with varying levels and associations, how easy is it to make bulk changes, e.g. re-parenting, how intuitive is it to map hierarchies together, and how flexible is it in setting validation rules (in batch, on-line, both)?  The nitty-gritty of your MDM solution will likely involve significant playing around with business hierarchies, and some tools are a lot smarter than others.

Q10.  Can you easily track the lineage of any changes, i.e. can you trace who did what to the product code structure three versions ago?  Some products are much smarter at time-variant analysis than others, which is increasingly relevant in this age of greater compliance scrutiny.
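To make questions 7, 8 and 10 a little more concrete, here is a minimal and entirely hypothetical sketch of versioned master data: a global code set released in versions, local amendments that must survive a new global release, and a change history you can interrogate later.  It is not modelled on any particular MDM product; the class, codes and markets are invented for illustration.

```python
# An invented sketch of the issues behind Q7, Q8 and Q10: versioned global
# code sets, local overrides that survive a new global release, and a
# change history (lineage) that can be queried afterwards.
from dataclasses import dataclass, field

@dataclass
class CodeSet:
    version: int
    codes: dict                                            # global code -> description
    local_overrides: dict = field(default_factory=dict)    # market -> {code: description}
    history: list = field(default_factory=list)            # lineage of changes

    def effective(self, market):
        """Global codes with any local amendments layered on top."""
        merged = dict(self.codes)
        merged.update(self.local_overrides.get(market, {}))
        return merged

    def amend_locally(self, market, code, description):
        self.local_overrides.setdefault(market, {})[code] = description
        self.history.append(f"v{self.version}: {market} amended {code} -> {description}")

    def new_global_version(self, codes):
        """A new international release must NOT wipe out local amendments."""
        overrides = {m: dict(v) for m, v in self.local_overrides.items()}
        return CodeSet(self.version + 1, codes, overrides,
                       self.history + [f"v{self.version + 1}: new global release"])

if __name__ == "__main__":
    v1 = CodeSet(1, {"P100": "Lubricant, generic"})
    v1.amend_locally("DE", "P100", "Schmierstoff, Standard")
    v2 = v1.new_global_version({"P100": "Lubricant, industrial", "P200": "Lubricant, marine"})
    print(v2.effective("DE"))   # the local German wording survives the new global version
    print(v2.history)           # lineage: what changed, in which version
```

Trivial as it is, even this toy shows why “golden copy only” products struggle: the moment you have local amendments and successive global releases, you need versions, overrides and lineage all at once.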

There are many more I am tempted to add, but I like Jane’s idea of coming up with a manageable number.  Of Jane’s original questions,

Q1.  Proven methodology?
Q2.  Business benefits?
Q3.  Integration with EDM strategy?
Q4.  Data governance?
Q5.  Is the product extensible?

Q1 is sensible, though a true software vendor will often work with systems integrators, so the question about methodology and experience is as much directed to the SI as to the MDM vendor.

Q2 I disagree with.  The vendor should be able to come up with a good indication of the likely costs of the MDM project based on the information that you, the customer, can give it.  An experienced MDM vendor should even have an estimating tool built from past projects, ready to plug in your project characteristics and spit out an estimate.  But the vendor cannot articulate the business benefits for your business.  They may have examples, and have seen some interesting cases elsewhere, but the specifics of the benefits you will see will depend on the specifics of your company, its current issues and the areas that the project will tackle.  No vendor (or even SI) can tell you that as well as you can, or to be more precise, as well as your business sponsor can.  I am all for business cases, as readers of this blog know, but your benefits are the one thing a vendor cannot really help you with.

Q3.  Assuming this means “enterprise data management” (rather than “document management”; so many acronyms…) then this is a fair question. Clearly you want the approach taken to fit into your broader goals.

Q4.  Good question.  Data governance is key to an MDM project, but the question to the vendor should be around their support for processes, e.g. do they support different user profiles, and do they allow for automation of master data workflow? (Some vendors do, most do not; there is a small sketch of the idea after Q5.)  But it is also a question for you.  How well articulated is your current process for updating master data, and who owns it?  If you have no idea here, then a clever tool is going to be of limited use.

Q5.  Very good question.  In particular, how well does the MDM tool support all kinds of master data, not just “customer” and “product”?  I know I keep banging on about this, but if you start picking a CDI solution here and a PIM solution there then you will quickly have a new set of data silos to go with your old ones. You must consider the broader perspective of master data, where there are literally hundreds of types of data that need to be managed.  You may start with high-value ones like customer, but can the solution extend itself to others?
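As promised under Q4, here is a minimal, invented sketch of what automated master data workflow with user profiles can look like: a proposed change moves from draft to review to approved, and only a user with the right role can move it on.  The states and role names are hypothetical and not taken from any vendor’s product.

```python
# A minimal, invented sketch of master data workflow automation: proposed
# changes move draft -> in_review -> approved, gated by user roles.
ALLOWED = {
    ("draft", "in_review"): "data_steward",
    ("in_review", "approved"): "data_owner",
    ("in_review", "rejected"): "data_owner",
}

def advance(change, new_state, user_role):
    """Move a proposed master data change to a new state if the role permits."""
    required = ALLOWED.get((change["state"], new_state))
    if required is None:
        raise ValueError(f"illegal transition {change['state']} -> {new_state}")
    if user_role != required:
        raise PermissionError(f"moving to {new_state!r} requires role {required!r}")
    return {**change, "state": new_state}

if __name__ == "__main__":
    change = {"code": "P200", "field": "description", "value": "Lubricant, marine", "state": "draft"}
    change = advance(change, "in_review", "data_steward")
    change = advance(change, "approved", "data_owner")
    print(change)
```

The tooling is the easy part; as noted above, it is of limited use unless someone in the business actually owns each state in that little diagram.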

Incidentally, remember to ask your consultancy or systems integrator a few questions too.  One question will do to start with: how many MDM projects has this particular set of consultants (rather than your firm in general) done, and can I speak to the last customer they each worked with?  That should put the cat amongst the pigeons….

 

 

 

Managing software risk

An interesting, common-sense debate was featured on Silicon.com last week. A panel of CIOs was asked whether they felt comfortable buying from small suppliers or whether they preferred dealing with the big players. There was a surprising degree of consensus that, in general, CIOs felt OK about small suppliers: indeed in some cases they actively preferred them. Perhaps this is a sign of a steady recovery in economic confidence, with CIOs preparing to come out of the shells into which they retreated in 2001 and 2002. As I have written about before, buying software from small suppliers carries risks, but this is true of big suppliers also. Just because a giant company may not go bust does not stop it dropping products for any number of reasons, as I can testify from personal experience.

The one element in the article that did make me smile was the assumption that code escrow is a form of insurance against a small vendor folding. Indeed, code escrow arrangements have become quite standard in contracts, and generate modest fees for those organisations that provide the service. I hate to disillusion those CIOs, but code escrow is not the panacea it may seem. Sure, you get the source code, but then what? Firstly, you have to hope that the vendor has been diligent about keeping the escrow up to date with the version of software that you are actually using. But more to the point, the raw code itself is of limited use without the design specifications that go along with it (at least assuming you actually want to continue developing it). Even if you are looking at basic support only, how well documented is the code? I had the misfortune to try to execute an escrow contract once when I was working at Esso. The tape of source code duly turned up, and it was 3 million lines of undocumented assembler code. While my colleague (an expert in assembler) got a misty gleam in his eye as he could see a job for life coming up, we concluded that we simply couldn’t justify taking this on, and opted for a complete replacement instead. So, if you are insisting on source code escrow from your vendor, be aware of the pitfalls and ask some searching questions about documentation.

On software and relationships

An article in Intelligent Enterprise asks “why can’t vendors and customers just get along?” after explaining the many issues over which they are usually at loggerheads. Having been both a customer and a software vendor, I think Joshua Greenbaum puts his finger on one key point in his article: honesty. As a customer I found that software vendors frequently made patently absurd claims about their software (“this tool will speed up application development by 1000%” being one memorable example). Release dates for software were another issue: vendors fail to grasp that businesses have to plan for change all the time, so a release date slipping by three months is rarely an issue provided that you are told about it in good time. However, if you have lined up a load of resources to do an upgrade, the release simply not turning up on the appointed day does cause real cost and annoyance.

Another bugbear is testing, a tricky subject since it is impossible to fully test all but the most trivial software (see the excellent book “The Art of Software Testing”). However, vendors vary dramatically in the degree of effort they put in. At Kalido we have extensive automated test routines which run on every build of the software, which at least means that quite a lot of bugs get picked up automatically, though some of course still get through. Yet according to someone who used to work at Oracle, it was policy there to do minimal testing of software, with the testing strategy described as “compile and ship”. This certainly avoids the need for lots of expensive testers, but is hardly what customers expect.

However, customers can be unreasonable too. Their legal departments insist on using sometimes surreal contract templates that were often designed for buying bits of building equipment rather than software, resulting in needless delays in contract negotiation (but in-house corporate lawyers don’t care about this; indeed it helps keep them busy). Customers can also make absurd demands: we recently lost a contract after refusing to certify that we would fix ANY bug in the software within eight hours, something which patently cannot be guaranteed by anyone, however responsive. The small vendor that won the bid signed up to this demand, and so will presumably be in breach of contract pretty much every day. Quite what the customer thinks it has gained from this is unclear. Perhaps such customers feel like exacting revenge for previous bad experiences with vendors, or maybe some corporate cultures simply value aggressive negotiating.

From my experience on both sides of the fence, the best relationships occur when both parties understand that buying enterprise software is a long-term thing, in which it is important that both sides feel they are getting value. Vendors are more inclined to go the extra mile to fix a customer problem if that customer has been doing lots of reference calls for them, and actively participates in beta test programs, for example. As with many things in life, there needs to be a spirit of mutual respect and co-operation between customers and vendors if both are to get the best out of their relationship.

Do we really all need Business Intelligence tools?

An article I read in DM Review today highlights Forrester Research saying that between “25 percent and 40 percent of all enterprise users” would eventually use BI software. I’m not quite sure what they are smoking over at Forrester, but this seems to me like another of those lessons in the danger of extrapolating. You know the kind of thing: “if this product growth of x% continues, within ten years everyone on the planet will have an iPod/Skype connection/blog/whatever.” The flaw with such thinking is that there is a natural limit to the number of people that will ever want a particular thing. In the case of enterprise software that number is, I would suggest, much lower than commonly supposed, for the simple reason that most people are not paid to do ad hoc analysis of data in their jobs. Sure, some finance and marketing analysts spend their days doing this, but how many powerful BI tools does the average sales rep/bank clerk/shelf stacker really need? I’m thinking none at all, since their job is to sell things or do whatever it is that bank clerks do, not to muse on the performance of their company’s products or its customer segmentation.

In my experience of implementing large data warehouse systems at large corporations, there are remarkably few people who need anything more than a canned report, or just possibly a regular Excel pivot table. A salesman needs to work out his commission, a support manager needs to track the calls coming in that week, but these are for the most part regular events, needing a regular report. In a large data warehouse application that has 1,000 end users of the reports produced from it, the number of people setting up those reports and doing ad hoc analysis may well be just 10, i.e. around 1% of the total. Once you get past management and the people paid to answer management’s questions, there are just not that many people whose job it is to ponder interesting trends or explore large data sets for a living. For this reason a lot of companies end up procuring far more “seats” of BI software than they really need. In one case I am intimately familiar with, even after five years of rolling out a leading BI product, the penetration rate was always much lower than I had expected, never going as high as 5%, and much of that was not genuine “BI” usage anyway.

Of course this is not what the salesmen of BI vendors want to hear, but it is something that IT and procurement departments should be aware of.