Andy on Enterprise Software

The dark side of a marketing announcement

October 31, 2007

Business Objects announced a witches’ brew tie-up with Big Blue, bundling DB2 and its data warehouse capabilities together with Business Objects as a business intelligence “offering”. Given that Business Objects already works happily enough with DB2, it is rather unclear whether this is anything more than a ghostly smoke-and-mirrors marketing tie-up rather than something deeper, but it certainly makes some sense for both companies. It does, however, hint at Business Objects moving away from pure database platform independence, which takes on a new significance given the takeover (sorry: “merger” – much background cackling) of Business Objects by SAP. Is this really a subtle move to try and irritate Oracle, the other main DBMS vendor, given the highly competitive situation between SAP and Oracle, who are locked in a nightmare struggle for world domination? In that case, is SAP manipulating Business Objects like a possessed puppet, pulling the strings behind the scenes, or was this just a hangover from the pre-takeover days, the Business Objects marketing machine rolling on like a zombie that stumbles onward without realising it no longer has any independent life, clinging to some deep-held memory of its old existence? SAP has a more tense relationship with IBM itself these days: IBM sells cauldrons of consulting work around SAP implementations, but found a knife in its back when SAP started promoting its NetWeaver middleware in direct competition with IBM’s WebSphere.

Announcements from Business Objects from now on all need to be looked at through the distorting mirror of the relationship with its new parent, as there may be meaning lurking that would not have existed a month ago. Everything needs to be parsed for implications about the titanic Oracle v SAP struggle, as Business Objects should strive as far as possible to appear utterly neutral to applications and databases in order not to spook its customers. Arch rivals Cognos, MicroStrategy and SAS will take advantage of any hint that the giant behind Business Objects is pulling its strings.

Happy Halloween everyone!

Common sense starts to prevail

October 30, 2007

Regular readers of this blog are probably tired of hearing me advocate that MDM vendors need to move beyond single-domain solutions (CDI, PIM) into solutions that can cater for a wide range of master data types. I have spoken at a number of the very useful CDI/MDM Institute (previously CDI Institute) conferences organised by Aaron Zornes, which are pretty much the only MDM conferences out there, and initially (as indicated by its earlier name) Aaron seemed fairly sceptical about this message. It is therefore encouraging to see him starting to lean this way in an article in DM Review, where he bases this view on multiple conversations with people responsible for MDM at large enterprises.

This is quite right; perhaps I had this view initially because I used to be a technology strategist at Shell and so was trained to think this way, but it has always seemed blindingly obvious to me that single-domain solutions are at best a sticking plaster when it comes to MDM. There are simply too many classes of master data to contemplate fragmenting MDM solutions by domain, each to a potentially different vendor. Large companies don’t like dealing with more vendors than they have to, and common sense tells you that it is easier to get economies of scale in terms of skill sets, never mind software licenses, by using technology that is capable of dealing with all kinds of master data in the same way. Personally I would be cautious about vendors who bolt wider domain capability onto existing technologies that were initially hard-coded around a specific domain such as customer or product. It is never easy to re-architect software to do something that its original designers never intended. It will be easier for the pure-play generic MDM vendors to add better performance etc. than it will be for a CDI vendor to become genuinely able to deal with multiple domains consistently.
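To make the contrast concrete, here is a minimal sketch (the entity and attribute names are purely my own illustration, not any vendor’s design) of what “dealing with all kinds of master data in the same way” can look like: the domain becomes configuration rather than schema, so a new class of master data needs no re-architecting.

```python
from dataclasses import dataclass, field

# Generic MDM sketch: the domain (customer, product, asset...) is data,
# not hard-coded schema, so the same machinery serves every domain.
@dataclass
class EntityType:
    name: str                    # e.g. "customer", "product"
    attributes: dict[str, type]  # attribute name -> expected type

@dataclass
class MasterRecord:
    entity_type: EntityType
    values: dict[str, object] = field(default_factory=dict)

    def validate(self) -> list[str]:
        # One validation routine covers all domains alike.
        return [attr for attr, expected in self.entity_type.attributes.items()
                if not isinstance(self.values.get(attr), expected)]

customer = EntityType("customer", {"name": str, "country": str})
product = EntityType("product", {"sku": str, "unit_weight_kg": float})

good = MasterRecord(customer, {"name": "Acme Ltd", "country": "UK"})
bad = MasterRecord(product, {"sku": "A-100", "unit_weight_kg": "heavy"})
print(good.validate())  # []
print(bad.validate())   # ['unit_weight_kg'] - caught without domain-specific code
```

A CDI tool with customer tables baked into its schema has no such escape hatch, which is precisely why retrofitting extra domains is so hard.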

Having already changed the name from “CDI Institute” to “CDI/MDM Institute”, it’s only three letters away from ending up as the “MDM Institute”.

All in the timing

October 24, 2007

Business Objects’ Q3 results were rather soft, showing license revenue (USD 139M) just 2% up year over year. There were eight deals over USD 1 million, broadly similar to recent quarters. The business line called “information discovery and delivery”, i.e. the classic reporting tools, did least well, while enterprise performance management was somewhat healthier.

However, overall this is rather feeble growth (by contrast Informatica had a fine quarter, so the excuses offered by management about weak markets seem pretty lame). Perhaps there have been too many acquisitions to digest, and of course now the swallower has itself been gulped up by the much bigger fish of SAP. The price tag SAP paid looks like a fairly high premium to the underlying business of Business Objects, as reflected in the dip in SAP’s share price on the announcement, but these results suggest that Business Objects shareholders at least can be very satisfied indeed with the price they got.

Data mis-governance

October 22, 2007

I spent this morning at a data governance seminar sponsored by DataFlux, at which Jill Dyche of Baseline Consulting spoke about her experiences of data governance best practice in client organisations, and Philip Howard of Bloor gave his perspective. Data governance seems to be something very much in its infancy despite the long-established issues it addresses, with only a tiny proportion of organisations having made much progress (according to an IBM Global Services five-point data governance maturity scale, no company is further along than stage three, and only a handful of companies even manage that). There seems little in the way of a silver bullet here, just missionary work to convince the business that data ownership needs to be taken seriously. Sometimes a “burning platform” can stimulate interest. Recently Nationwide Building Society was fined GBP 1 million due to the theft of a laptop on which customer data was stored (albeit in encrypted form). Interestingly, the fine was not directly for the loss of the data but for the fact that Nationwide had no processes in place to determine that there was actually customer data on the laptop. Such cases illustrate the risks, at least in regulated industries, of having poor data governance policies.

Another aspect of data governance often overlooked is the proliferation of data in corporate spreadsheets. Apparently Allied Irish Bank have a stunning 185 TB of storage devoted to spreadsheets alone, and who knows how much of this is duplicated. With studies showing that in a spreadsheet with over 200 rows there is a 90% chance of an error, the potential for problems is self-evident. When I was at Shell there was a whole group on the corridor opposite me who built spreadsheet models and audited existing ones, some of which were highly important (e.g. financial models used for capital-intensive projects). This group paid its way many times over by uncovering flaws in existing operational models. Yet I suspect they only scratched the surface, and how common are such initiatives? This should be a promising area for companies such as Compassoft, which do spreadsheet “discovery and control”. Indeed there is no shortage of scandals related to manipulation of spreadsheets, including the USD 700M one at Allied Irish. And you thought you had enough data quality problems in your corporate systems….
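The arithmetic behind that 90% figure is plausible if you assume a modest, independent per-row error rate; the rate used below (just over 1% per row) is my own illustration, not a number from the studies cited.

```python
# Chance of at least one error in a spreadsheet, assuming each of n rows
# independently contains an error with probability p (illustrative only).
def prob_any_error(p: float, rows: int) -> float:
    return 1 - (1 - p) ** rows

print(f"{prob_any_error(0.011, 200):.0%}")  # ~89%: near-certain at 200 rows
```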

Informatica marches on

October 20, 2007

Informatica had a very solid quarter indeed, with revenue up 22% at USD 96 million, of which USD 41 million was licence revenue (also up 22%). Maintenance is now a handy USD 38.3 million and consulting/services USD 16.7 million. These are very healthy ratios for a software company, as recurring maintenance is the best revenue of all; consulting at less than 20% of revenues means that the company is still a proper software company and not a consulting firm in disguise. Interestingly, growth in the Americas was 15%, but growth elsewhere was 36%.
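For those who like to check the arithmetic, the reported lines sum neatly to the total and the mix falls out as below (figures in USD millions, taken from the quarter just described).

```python
licence, maintenance, services = 41.0, 38.3, 16.7
total = licence + maintenance + services  # 96.0, matching reported revenue

print(f"services share: {services / total:.1%}")        # ~17.4%, under the 20% mark
print(f"maintenance share: {maintenance / total:.1%}")  # ~39.9% recurring revenue
```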

The company revealed that 43% of its use cases were in the context of data warehousing, and its largest verticals were financial services, hi-tech and public sector.

Informatica is an interesting example of how a company can prosper when its main competitor (Ascential) is taken out of the market by a behemoth (IBM). Commentators often assume that being taken over by a behemoth means greater muscle, yet often the behemoth is distracted, bureaucratic and annoys the key staff of the company it has taken over. This can leave an independent competitor in an almost unchallenged position. This effect is amplified when the vendor in question is competing in a market where platform neutrality is important, as data integration and business intelligence are.

One small storm cloud in all the blue sky that Informatica is seeing is that SAP’s purchase of Business Objects will presumably have some effect on the OEM deal that SAP has with Informatica (since Business Objects owns rival ETL technology from Acta). However, based on the history of Ascential’s own OEM deal with SAP, which preceded this one, I doubt that this would have much financial significance even in the worst case of SAP dropping the deal entirely, which is by no means certain (my sources told me that Ascential never made much money off that deal).

Babies and bathwater

October 15, 2007

I read a rather odd article in Enterprise Systems proclaiming that “the data warehouse might be dead”. The thrust of the article was the old idea of piecing together reports for a particular user or department by accessing source systems directly rather than bothering with a pesky data warehouse, in this case advocated by a senior business user from Bank of America. I understand the frustration that business people have with most corporate data warehouses today. They are typically inflexible, and so unable to keep up with the pace of business change. Indeed this thought was echoed recently by Gartner analyst Don Feinberg, who said that data warehouses more than five years old should be rewritten. To a person with an urgent information need, an IT department saying that he cannot have that information for weeks or months is understandably irritating.

Yet the apparent solution of accessing source systems directly is flawed; the problems it causes are, after all, why people invented data warehouses in the first place. Yes, you can patch together data from source systems, and yes, that one report you want may appear less complicated (and quicker) to get than going through a change request in corporate IT. Overall, though, the economics of point-to-point spaghetti versus a central warehouse are easy to see. The organisation as a whole will spend much more money in this manner than by having a central warehouse where the data can be relied upon; the more complex the organisation, the larger this gap will be, as the sketch below illustrates.
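Here is a minimal sketch of that economics, under the simplifying assumption that every consumer needs data from every source (real organisations are messier, but the shape of the curve is the point):

```python
# Interfaces to build and maintain: bespoke point-to-point feeds
# versus routing everything through one central warehouse.
def point_to_point(sources: int, consumers: int) -> int:
    return sources * consumers  # one feed per source/consumer pair

def via_warehouse(sources: int, consumers: int) -> int:
    return sources + consumers  # each party connects to the hub once

for n_src, n_con in [(5, 5), (20, 10), (50, 30)]:
    print(n_src, n_con, point_to_point(n_src, n_con), via_warehouse(n_src, n_con))
# 25 vs 10, 200 vs 30, 1500 vs 80: the gap widens with organisational complexity.
```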

Probably worse, the business user gets some numbers out, but are they the right numbers? Anyone who has worked on data warehouse projects will be familiar with the frequently dismal quality of data even in supposedly trusted corporate source systems such as ERP. It is often only by looking at sources together that problems with the data show up. Errors in, say, regional systems can cancel out or be obscured, and the true picture only emerges when data is added up at a collective level: if one region overstates a figure by roughly as much as another understates it, the consolidated total can look plausible while both regional numbers are wrong.

In these days of increasing anxiety about banking scandals and greater regulation, companies can ill afford to subject themselves to unnecessary risk by making decisions based on data of questionable quality. The issue is that most data warehouses today are based on inflexible approaches and designs, causing lengthy and costly delays in updating the warehouse when the business, inevitably, changes. It does not have to be this way. You can construct data warehouses in a more flexible manner, and in a way in which business users are engaged with the process. By running parallel master data management initiatives and setting up an MDM repository, the data warehouse can be relieved of at least some of the burden of sorting out corporate reference data, giving it more of a chance.

It is incumbent on IT departments to embrace modern data warehouse technologies and complementary MDM best practice in order to avoid driving their customers to “skunk works” desperation to meet their needs. IT organisations that fail to do this not only risk being marginalised, but also indirectly drive up costs and risks for their companies.

Last man standing

October 11, 2007

So, what does the acquisition of Business Objects by SAP mean for the BI industry? In some ways this is a curious move by SAP. Business Objects is a very successful company, yet the overlap between SAP customers and Business Objects customers is much less than generally thought. Of course really large companies tend to have lots of software, so many companies will have SAP and also Business Objects, but this does not mean that they are being used in concert. I spoke to an executive of Business Objects this week who told me that the overlap was “almost zero” because for Business Objects to access the SAP environment was awkward (“nearly impossible” were his exact words). Typically Business Objects reports will run against a separate data warehouse fed by SAP and other systems, rather than Business Objects directly reading, say, a BW warehouse or (far less likely) directly accessing SAP systems. Business Objects is much more common in Oracle shops, which makes me wonder whether the purchase is more a defensive one, i.e. to take Business Objects out of the market before Oracle gets its hands on it. Oracle + BOBJ would be a much more natural fit, and so I am sure that this move will displease Oracle, which had bought Hyperion, yet Hyperion’s Brio was only a distant third in the reporting space behind Business Objects and Cognos.

For Business Objects customers the news is not necessarily positive. SAP tends to be fiercely proprietary about its environment, preferring to fight it out with Oracle for footprint within customers rather than opening everything up. Hence SAP BW, for example, while it can certainly load data from non-SAP systems, is best suited to an SAP environment, and it would be bizarre indeed to imagine a customer without SAP buying SAP BW. Business intelligence is not SAP’s core competence, and there is considerable danger of dilution of attention, as well as future product directions potentially being pulled in awkward ways, e.g. just how open will SAP want Business Objects to remain to other sources such as Oracle? Of course for now it is all public harmony, but think down the road a year or two and consider whether someone in SAP might at least consider the option of making it harder for Business Objects to work with Oracle’s applications in order to encourage customers to switch from Oracle to SAP applications. Do you think this would never cross their minds, not even a little bit? Moreover SAP has a distinct culture, and has no track record of an acquisition of this size (indeed, until this week its executives had scorned the very idea). I suspect that just about every person in Business Objects is at least updating their resume right now, and even if things work out fine it is going to be at best unsettling for staff. As Woody Allen said, the lamb may lie down with the lion, but the lamb will not get much sleep.

I think Cognos is the winner here, as it can now stand as the clear leader in business intelligence as a truly independent vendor. Companies with genuinely mixed environments will surely edge towards Cognos now in preference to Business Objects. Even if it turns out that Business Objects truly will be run as an independent business, there will always be that nagging doubt (reinforced by Cognos sales people). Informatica has demonstrated that you can prosper perfectly well when your main competitor is bought by a behemoth (in their case Ascential by IBM) and Cognos is now positioned to follow suit.

Creating a burning data quality platform

October 1, 2007

I read a blog post from Forrester today that rang true. The point being made is that data quality is a hard sell unless some crisis happens. This is evidently true, since the data quality market is small and yet the problems are large. I have encountered several shocking data quality failures in my time that were costing millions of dollars. In one case an undetected pricing error in a regional SAP system meant that a well known brand was being sold at zero profit margin. In another case, a data quality error in a co-ordinate system caused an exploration bore to be drilled into an existing oil well, which luckily was not in production at the time and so “only” cost a few million dollars. In more general terms, every dollar spent on data quality should save you four. Yet the examples I mentioned (and there are plenty more) actually showed up not in data quality projects but in data warehouse or master data projects, which in principle were supposed to be taking “pure” data from the master transaction systems. That hardly inspires confidence in the state of data in the corporate systems that make no pretence of being “clean”.

I am not sure why this sorry state of affairs exists, other than to note that in most companies data quality is regarded as an IT problem, when in actual fact the IT folk are the last people to be in a position to judge data quality. Responsibility lies firmly in the business camp. Moreover, as I have mentioned, justifying a data quality project should not be hard: it has real dollar savings, quite apart from other benefits such as a reduction in reputational risk. I suspect that part of the problem is that it is embarrassing (“no problems with our company data, no sirree”) and, let’s face it, pretty dull. Would you rather work on some new product launch, or be buried away reviewing endless reports to check whether data is what it should be?

For people toiling away in corporate IT, the right way to get attention might be to use a modern data quality tool, find a sympathetic business analyst and poke around some corporate systems. These days the tools do a lot of self-discovery, so finding anomalies is not as manually intensive as it used to be. If you turn over a few stones in corporate systems you will be surprised at what turns up. Chances are that at least one of the issues you encounter will turn out to be expensive, and this may raise the profile of the work, winning sponsorship to dig around in other areas.
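As a concrete illustration, here is the sort of first-pass profiling I have in mind, sketched in Python with pandas (my choice of tool, against a hypothetical customer extract with invented column names; the modern data quality tools do this kind of discovery for you).

```python
import pandas as pd

# Hypothetical extract; file and column names are illustrative only.
df = pd.read_csv("customer_extract.csv")

# Cheap profile: null rates and distinct counts per column.
for col in df.columns:
    print(f"{col}: {df[col].isna().mean():.1%} null, {df[col].nunique()} distinct")

# The kind of stones worth turning over: impossible prices and
# duplicate customer identifiers.
print(df[df["unit_price"] <= 0])
print(df[df.duplicated(subset="customer_id", keep=False)])
```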