Andy on Enterprise Software

MDM and risk

May 31, 2007

It is not often that I even bother to read articles written by vendors, but there were some good points made in an article by a practice manager at Siperian regarding MDM and regulation. The point being made was that increased regulation, both in the US with Sarbanes-Oxley and the Patriot Act, and elsewhere with the likes of Basel II in financial services, should be a significant external “push” for MDM to complement the internal “pull” from corporations. In order to measure the overall risk levels at a bank you need to know the total aggregate positions taken with counterparties, and be able to see whether there are any high exposures to particular clients (the case of Enron springs to mind). In order to do this you need to know exactly who you are doing business with, including subsidiaries of that company, and yet how well do companies really know this?

Many MDM projects set out to get a better understanding of the total picture of either customers or suppliers, since multiple source systems, each with its own classifications, make it very hard to get a single consistent picture. Certainly many years ago Shell realised that it had no idea how much business it did with, say, Ford or Unilever, since quite apart from internal classification overlap, it was not clear exactly what “Ford” or “Unilever” actually consisted of. This was a key reason why it invested heavily in an enterprise data warehouse project. Multinational companies have so many subsidiaries, often with different trading names (for example Shell owns companies like Bharat Petroleum, while Unilever is known as “Hindustan Lever” in India), that it is unlikely that individual operating units have carefully checked the Dun & Bradstreet numbers of all these companies and classified them correctly.
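To make this concrete, here is a minimal sketch in Python of the roll-up involved; every name, identifier and figure below is hypothetical, and in real life the hard part is of course populating the alias and hierarchy mappings in the first place.

```python
# Sketch of counterparty exposure roll-up across a legal-entity hierarchy.
# All names, identifiers and figures are hypothetical.

# Map each source system's local counterparty name to a canonical entity id.
# In practice this resolution (via D&B numbers, fuzzy matching etc.) is the
# hard part of any MDM effort.
alias_to_entity = {
    "Ford Motor Co":       "FORD",
    "Ford Credit Europe":  "FORD_CREDIT",
    "Hindustan Lever Ltd": "HLL",
}

# Each entity's parent in the corporate hierarchy (None = ultimate parent).
parent_of = {
    "FORD": None,
    "FORD_CREDIT": "FORD",
    "HLL": "UNILEVER",
    "UNILEVER": None,
}

def ultimate_parent(entity):
    """Walk up the hierarchy to the ultimate parent."""
    while parent_of.get(entity) is not None:
        entity = parent_of[entity]
    return entity

# Raw positions as reported by individual source systems (in millions).
positions = [
    ("Ford Motor Co", 120.0),
    ("Ford Credit Europe", 75.0),
    ("Hindustan Lever Ltd", 40.0),
]

exposure = {}
for name, amount in positions:
    group = ultimate_parent(alias_to_entity[name])
    exposure[group] = exposure.get(group, 0.0) + amount

print(exposure)  # {'FORD': 195.0, 'UNILEVER': 40.0}
```

The aggregation itself is trivial; it is resolving all the local names to one consistent hierarchy that defeats most organisations.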

This is important enough when dealing with a global account, but can be critical when dealing with financial trades. I know of one MDM initiative at a financial services organisation that started off as a direct result of Enron: the organisation thought it knew how much exposure it had to Enron, but rapidly discovered that it did not when Enron collapsed. I certainly know of one famous financial institution where a former VP admitted to me that the bank had “no clue” how much business it did with a large, complex beast like Deutsche Bank, for all the usual MDM reasons.

The thing I find curious is that all these regulations are pretty much in place now, and although companies have spent a great deal of money on compliance, it is clear from these two cases that the problems are far from solved. The next time an Enron-like event happens (and it will) companies will not only be nursing losses from their exposed positions, but may also have regulatory problems if it turns out that they did not truly know the extent of their exposure. Given the state of data quality and master data in most large organisations, I wonder whether companies are being complacent, or whether regulators are simply being sleepy in checking the effectiveness of the systems at the companies they oversee. Having a report that tells you your exposure level is all very well, but how reliable are the numbers that make it up? My experience of working with data warehouse and MDM applications tells me that they are likely to be a lot less reliable than many people think.

If you find all this talk of banks rather abstract, consider this: the average hospital has 25 systems that record patient information. If you are one of those patients, how confident are you that these will all tie up?
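For the curious, here is a toy illustration in Python of why they often do not: the same patient recorded slightly differently in three hypothetical systems fails an exact name match, but can be linked by combining date of birth with a fuzzy name comparison (the 0.6 threshold is an arbitrary illustrative choice).

```python
from difflib import SequenceMatcher

# The same patient as recorded in three hypothetical hospital systems.
records = [
    {"system": "admissions", "name": "Jonathan Smith", "dob": "1962-03-14"},
    {"system": "pharmacy",   "name": "J. Smith",       "dob": "1962-03-14"},
    {"system": "radiology",  "name": "Jonathon Smyth", "dob": "1962-03-14"},
]

def name_similarity(a, b):
    """Crude string similarity in [0, 1]; real matching engines do far more."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# An exact match on name fails across the three systems...
print("exact name match:", len({r["name"] for r in records}) == 1)  # False

# ...whereas date of birth plus a name-similarity threshold links all three.
base = records[0]
for r in records[1:]:
    linked = (r["dob"] == base["dob"]
              and name_similarity(r["name"], base["name"]) > 0.6)
    print(r["system"], "linked:", linked)  # both True
```

Multiply that by 25 systems, add missing dates of birth and transposed digits, and you can see why patient records so rarely tie up.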

BI for everyone?

May 29, 2007

As usual, Philip Howard has some thoughtful comments on the subject of enterprise data warehousing. The recent plethora of data warehouse appliances, pioneered by Netezza but now popping up from companies ranging from start-ups to HP, certainly has the potential to change the data warehouse landscape. However, as Philip points out, it is less clear that data warehouse appliances need be connected with “ubiquitous BI”. I have written previously at some length on my view that there is really no obvious reason for the “democratisation of data”, i.e. everyone in the company having unfettered access to corporate data using whizzy reporting tools. Quite apart from whether the tools are really cuddly enough (doubtful), the question rarely asked is why this vision would be necessary or even appropriate. There are certainly people in a company whose job it is to analyse data: they would be, er, analysts. Everyone else pretty much needs a limited set of data to get on with their jobs, and certainly I would be nervous if every factory worker and truck driver in a company decided to spend an hour or two a day investigating corporate data warehouses. A salesman needs a limited set of numbers in a year: “here is your quota”. I struggle to see why people outside finance or marketing (and only a subset of those) really need to be spending their time wrestling with data at all.

To be sure, one class of people benefits from a “BI tool on every desktop”: vendors, both BI vendors and those selling the associated databases and hardware. I have yet to read an article in Harvard Business Review from a CEO complaining that profits would be higher if only every employee in the company had a BI tool. BI ubiquity seems to me a solution in search of a problem.

How do I love thee – let me count the ways

May 26, 2007

Those in the industry know that there is a dance that goes on between vendors, who crave analyst endorsement, and analyst firms, who profess vendor independence to end-user firms while happily trousering the large fees from vendors that actually constitute most of their income. The Cranky PM has written entertainingly on this corrupt relationship before, but usually the analyst firm at least makes a pretence of playing hard to get. However, a recent puff piece for Informatica by Ventana analyst David Stodder raises the bar on sycophantic behaviour by analysts towards vendors who have taken out paid contracts with them. David gushes about Informatica’s support for MDM and how all right-minded “companies starting out with MDM look at what Informatica has to offer”. How about an alternative view:

“Informatica does not make a MDM application”

Strong stuff. What kind of vicious competitive slur can this be? Perhaps it is from a jealous competitor, or maybe some cynical spurned journalist or analyst that Informatica was less generous towards with its cheque book? Er, no, the source of this counter-statement is actually on the official Informatica web site. Last time I looked, having an ETL tool and a purchased data quality product, however good, does not equate to an MDM application, and all credit to Informatica for not pretending otherwise.

What amused me was that this piece was entitled “analyst insight”. Yep, dazzling insight there all right.
(with apologies to Elizabeth Barrett Browning).

Business Objects Discovers Text

May 22, 2007

Business Objects continues to broaden its offerings, in this case into the area of text search by buying Inxight, a company that was founded in 1997 based on research at Xerox PARC, and steadily built up an impressive customer list, partly through OEM arrangements. Its technology competes with Autonomy, ClearForest and Stratify, and is strong in the area of multi-language support. The company had struggled somewhat in terms of market momentum, especially given the very hefty venture capital financing it received (it was up to a $22M Series D round by 2002, following $29M of funding in 2001). Although its numbers are not public (and the purchase price is unclear at this point), it seems unlikely that it was yet profitable. Business Objects’ strong balance sheet provides a sensible home for the technology, and Business Objects’ very capable sales and marketing operation is a good match for a company with strong technology that has not executed that well in these areas. Text search is certainly an important and growing area in these days of increased regulation, and this adds a useful additional arrow to the Business Objects technology quiver.

A new twist to appliances

May 18, 2007

I wondered what Foster Hinshaw would get up to after he left Netezza, and now we know. He has set up the rather awkwardly named Dataupia, a data warehouse appliance with a difference. It is an important difference: his appliance works with Oracle rather than a proprietary database like Netezza’s, and will also work with DB2 or SQL Server, for that matter. You just plug in MPP-capable hardware to take advantage of the appliance. This is important, since a proprietary database brings with it not only a certain amount of cost and a need for new skills, but also makes conservative corporate buyers nervous. If you are a telco with really vast amounts of transaction data then this trade-off may be worthwhile, as indeed can be seen in Netezza’s considerable success; but if you could get much of the benefit (and this is unclear, since at this stage there are no comparative performance figures) while still running on your existing mainstream database, it would soothe the nerves of corporate CIO types who might otherwise try to block the introduction of a new database. Just as importantly, it allows existing data warehouse applications to claim appliance-like performance boosts. While the vast bulk of data warehouses today are custom built, this ought to be of interest to true data warehouse applications such as Kalido, which could presumably run on top of Dataupia’s appliance quite easily.

I think this is a very interesting development, assuming that the new product delivers on its promise. The market for an appliance capable of running under a mainstream database platform ought to be much broader than the set of applications currently addressed by hardware appliances (or even software-based ones with their own database, like Kognitio).

The missing link

May 14, 2007

I thought that Connie Moore made a good point in an article regarding BI and BPM: BI vendors are missing out on the “process” end of things. I would go broader than that and say that MDM vendors are similarly missing a trick, and that in MDM it matters more. If you are building some reports then process may certainly have relevance, but when it comes to master data it is central. How does master data get created, read, updated and deleted? For example, a marketing manager may want to introduce a new consumer type (as an aside, I discovered to my general mortification this week via garlik.com that I am classified in marketing terms as a “contented grey”, which I suppose is better than some of the painful-sounding alternatives like “constrained solo”). This is a new type of master data, and you can be sure that a major new type will have impacts on several systems, so it will quite probably require a review or two before it goes upstairs for sign-off. That process of creation, review, revision and sign-off currently probably happens by email, yet it should really be managed properly by a workflow tool. This is exactly what MDM should be all about, and yet most of the vendors I saw at the recent trade show in London looked as blank as a Woolworths shop assistant when I asked them about workflow and process within their tools.
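To sketch what such support might look like (the states, actions and roles below are my own illustrative choices, not any vendor’s model), the create/review/sign-off cycle is essentially a small state machine with an audit trail:

```python
# Toy state machine for a master data change request; illustrative only.

# Allowed transitions: (current state, action) -> next state.
TRANSITIONS = {
    ("draft",     "submit"):  "in_review",
    ("in_review", "revise"):  "draft",
    ("in_review", "approve"): "signed_off",
    ("in_review", "reject"):  "rejected",
}

class ChangeRequest:
    """A proposed change to master data, e.g. a new consumer type."""

    def __init__(self, description):
        self.description = description
        self.state = "draft"
        self.history = []  # audit trail of (action, actor) pairs

    def apply(self, action, actor):
        key = (self.state, action)
        if key not in TRANSITIONS:
            raise ValueError("cannot %r while in state %r" % (action, self.state))
        self.history.append((action, actor))
        self.state = TRANSITIONS[key]

# The marketing manager's new consumer type goes through review and sign-off.
req = ChangeRequest("add consumer type: contented grey")
req.apply("submit", "marketing_manager")
req.apply("revise", "data_steward")      # sent back for a tweak
req.apply("submit", "marketing_manager")
req.apply("approve", "head_of_data")
print(req.state, req.history)
```

Even something this simple captures what email does not: at any moment you can see exactly where a change stands and who has touched it.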

Several of the successful MDM projects I have seen make process quite central to the project. A few MDM products have support for workflow, but most are missing out and will need to work with other vendors to provide this, which is a less appealing proposition to customers than having an integrated approach.

Tibco buys Spotfire

May 3, 2007

Tibco has made a serious foray into the business intelligence area via its announced acquisition of Spotfire this week. Spotfire has built up an excellent position in the visualisation space, starting in the pharmaceutical industry and then expanding into areas like upstream oil services. Founded in Sweden, it has successfully penetrated the US market, and its CEO Christopher Ahlberg has always impressed me. In a sea of failed visualisation vendors which never seem to catch on properly, Spotfire has been a rare commercial success.

Spotfire’s revenues are not public, but are likely to be around USD 50M, so the USD 195M cash price paid for the company is a healthy one, reflecting Spotfire’s excellent momentum and differentiated technology. What will be interesting is to see whether Tibco is really the company best placed to exploit Spotfire’s technology. Tibco’s strong position in financial services will give Spotfire access to an attractive market, and while the companies’ technologies are broadly complementary, visualisation is a different animal from EAI, so it is less clear to me that the Tibco salesforce will be ideally placed to understand and successfully explain the benefits of Spotfire. The stock market seemed a little unconvinced as well, with Tibco’s share price dipping on the announcement. Sensibly, Tibco is retaining the Spotfire brand and keeping the company as a separate division under Christopher Ahlberg.

Given the scale of the acquisition, this signals a serious move by Tibco into the broader BI market.

Master Data comes to London – day 2

May 2, 2007

The CDI/MDM Institute conference has just wrapped up in Kensington, and I felt it was very successful. Organisers IRM ran an efficient conference, and attendance on day 2 held up pretty well. There was a fine keynote speech given by a certain dashing ex-Kalido founder, and the rest of the day was in a series of tracks with customer case studies and some vendor and consultant presentations. The second day felt a bit light on customer case studies compared to day one, but overall this conference did well at getting a reasonable number of real customers talking about projects, rather than just being a string of subtle or not-so-subtle sales pitches from vendors.

Aaron Zornes gave a reasonably balanced overview of the main thirteen MDM vendors (D&B, Dataflux, Data Foundations, Hyperion, I2, IBM, Initiate, Kalido, Oracle Purisma, SAP, Siperian, Teradata, Visionware), with a recurring theme that few have really got to grips with providing support for the data governance/collaboration element of MDM. Certainly the workflow around how master data is created and maintained is a key part of most MDM projects, but it looks as if most of the projects out there today are handling this via home-grown efforts, e.g. one UK company I spoke to had done this using BizTalk as a framework. As more and more MDM projects go beyond pilots into production, the maintenance of master data will become more pressing, so I would expect to see customers increasingly demanding this. Indeed they should really be insisting that workflow support for MDM processes is one of the key evaluation criteria for software selection.

Overall, it seems that MDM is moving into the mainstream in the UK.

Master Data comes to London – day 1

May 1, 2007

This week is the CDI/MDM Institute London conference. It is a useful bellwether of MDM progress in the UK, and based on the attendance today it looks like MDM interest is indeed picking up here. There are just over 300 attendees and 22 vendors exhibiting. Compared to last year’s event there are encouragingly more customer case studies (last year the speakers were mostly vendors presenting), for example from Panasonic, BT, Harrods, Allied Bakeries, M&S and the Co-Op.

It is noticeable that the CDI v MDM debate continues to favour the broader view that customer is just one (important) type of master data, with the MDM acronym now being used by most of the vendors. The Panasonic case study was a good example, starting out as a product master data initiative and now spreading to customer, and then on to market information. The speaker was able to share some real business benefits from the initiative (enabling new products to be launched two weeks quicker, as well as data quality savings), measured in millions of pounds. IBM claimed to be integrating its various acquired technologies, which is an improvement from the conversation I had with them a year ago, when it was claimed that there was nothing wrong with having a clutch of separate, incompatible repositories, one for customer, one for product etc. When I asked how many different repositories would be needed to cope with all the different types of master data in an enterprise I received the mystical answer “seven”, at which point I gave up, as the conversation seemed to have moved into the metaphysical realm. We shall see where the integration efforts lead.

Aaron Zornes gave a useful high level split of the MDM market into the groupings of:

- operational e.g. CDI hubs like Siperian, Oracle
- analytical e.g. Hyperion Razza, Data Foundations
- collaborative i.e. workflow (e.g. Kalido does a lot of this)

which seems to me a useful split. Certainly no one vendor does everything, so understanding where the strengths of the vendors are, even in this simplistic way, at least helps customers narrow down which vendors are most likely to match their particular problem.

IBM, Initiate, SAS and Kalido are the main sponsors of the event, and once again SAP chose not to attend (to be fair, SAP did speak at two of the US MDM conferences). Nimesh Mehta assures me that SAP MDM is making steady market progress, but with no numbers he is willing to share I cannot verify this. However, the buzz at the conference suggests that most customers here are using products from specialist vendors. One repeated theme in talking to SAP MDM early adopters is its apparent inability to deal well with customer data, perhaps not surprising given the A2i heritage of the product. No doubt SAP has plenty of resources to throw at this problem, but at present it is not obvious that it is getting much in the way of production deployments. Clearly SAP’s dominant market position should get it on to every MDM shortlist, but how many real broad deployments there are in production is much less clear.

There were a couple of entertaining exhibit conversations. One Dilbert-esque exchange was with a salesperson. I asked the following question: “what does your product do – is it a repository, or a data quality tool, or something else?” The salesperson took a sudden physical step back like a scalded cat and said “oh, a technical question; I’ll need to find someone else to answer that.” Now maybe I’m old-fashioned, but “what does your product do?” seems to me a question that even a software salesman should be able to hazard a guess at. What kind of questions do you think this salesperson is likely to be able to field? I’m guessing anything beyond “where is the bar?” or “where do I sign the order form?” is going to prove challenging.

I was amused to see Ab Initio had a stand. Ab Initio is famous in the industry for its secretive nature, e.g. customers have to sign an NDA in order to see a product demo. This is driven by its eccentric founder Sheryl Handler, and makes life hard for its sales and marketing staff. There was indeed no printed brochure or material of any kind, and the (very charming) salesperson I spoke to was unable to confirm much about the company other than that there was pretty certainly a UK office. Ab Initio’s technology has the reputation of being the fastest-performing data transformation tool around, and in the UK it has most of the really heavy-duty data users (BT, Vodafone, Tesco, Sainsbury’s etc.) as customers. It must certainly be interesting trying to sell the thing, but perhaps the aura of mystery paradoxically helps; after all, this is not a company that anyone could accuse of aggressive marketing.