Andy on Enterprise Software

Are you an MDM sinner?

April 28, 2006

There is a lot of common sense in the article about the “seven deadly sins” of MDM according to Knightsbridge. In particular:

“Believing that complete enterprise consensus is possible. Master data will exist in a constant state of evolution. It must be accepted that there will never be a point in time at which all business units are in agreement with and have completely implemented master data for all key subject areas.”

This may seem obvious, but it is a critical point which businesses time and again fail to grasp, waiting for some mega central project to deliver this nirvana, which will never happen. Master data will NEVER be standardized across an entire corporation. You can reduce the number of different versions floating around, but fully harmonized master data is always an over-the-horizon goal. For a start, businesses change too rapidly to expect anything else. Even if by magic you could unify all your master data tomorrow, within a short time there would be a major change, e.g. the acquisition of another company, which would upset that pretty picture; the new company’s master data will clearly not be the same as yours, and the operational systems at least cannot just be switched over without a protracted systems integration project. Hence you will always have to deal with the situation where master data lives in multiple places. What you need to do is to accept that, and to put in place processes and workflow that manage the situation, rather than assuming it will one day go away.

Knightsbridge are a firm with more MDM experience than most, and Faisal Shah is a smart guy: their whitepaper is well worth reading.

MDM in the real world

April 26, 2006

Master data management has become remarkably trendy now, with software vendors lining up to re-badge their offerings as relating to master data in some way. Some of these offerings run best on the overhead projector, and it is relatively hard to find customers who have actually finished an MDM project and been live for some time, rather than those who have completed pilot projects or are “investigating” the issue (i.e. attending conferences in nice sunny locations).

Irrespective of your technology choice, if you want to look at MDM at the sharp end then you could do worse than read a Ventana report which documents several KALIDO MDM customer case studies based on in-depth interviews of these customers by Ventana analysts. The case studies show a number of important points whatever technology you are using, e.g. the sheer breadth of master data that is out there just waiting to be managed. One bank implemented an MDM system to manage the various versions of statutory accounts that are submitted by its numerous subsidiaries, and which have to be carefully consolidated from a compliance viewpoint. As you can imagine, there is a lot of complexity in the master data associated with a chart of accounts for a massive international bank. This is a long way from the customer hub stereotype that many people have of MDM projects.

Another customer manages 350 different types of master data, showing that “customer” and “product” are just a small part of the problem. An issue discussed in the report is the need to set the MDM project in the context of a data governance initiative, for example defining roles for data “customers”, “administrators” and “stewards”, each of whom has different responsibilities. Any MDM project will have to address organizational issues and roles of this type, and the issues encountered will be similar, even if the Kalido-specific elements of the report are not of interest to you.

And you thought Sarbanes Oxley was bad

April 24, 2006

John McCormick raises a spectre more suited to Halloween than spring: the possibility that CIOs in public companies could be required by the US government to certify that the information they supply to government agencies is correct. As the article points out, given that much of the information stored in corporate systems is wrong or incomplete (the article quotes a Gartner guesstimate of 25% being in this category) this could result in a few worry lines on CIO foreheads.

I think that, at least for now, this is scaremongering. The US government is aware of the considerable backlash against Sarbanes Oxley from business, and indeed some minor softening may occur in due course. The notion that they would compound this massively by asking for certification of all data in a company, something that is manifestly impossible anyway, seems far-fetched. Even if the government were “selective”, as the article suggests, then presumably the areas of interest would tend to be things which are already carefully regulated, e.g. FDA documents in the pharma industry, or information in defence companies. Even the current administration would hardly try to put in place regulation that would allow government agencies to go on fishing trips through corporate data, and then demand certification that what they found was right.

Or would they?

Hyperion goes shopping

In Hollywood there is something of a herd mentality. If one studio brings out a gladiator film (say) then the others feel obliged to do likewise. Software is something of a fashion business, and so some similar behavior occurs. It seems the indispensable fashion accessory at the moment is a data quality vendor: one simply can’t be seen out without one, darling. Hyperion showed off its latest purchase this week: Upstream Software, a small data quality vendor from Michigan. As in fashion, the price is so much more tantalizing if it is only available to those in the know, and so the terms of the deal were not announced. I believe that Upstream had revenues of about USD 8M and about 30 employees. At some point Hyperion will presumably have to come clean, since it is a public company. Recently data quality vendors have been snapped up at bargain-basement prices, and certainly Hyperion has the sort of bank balance that it could pay for Upstream out of loose change even if it paid a full price. However its most recent results show a certain amount of stumbling: profit margins were still strong at 16%, but license revenues (the key to long-term health for a software vendor) actually fell by 6% on a year-over-year basis. Nonetheless, Hyperion has an enviable franchise as the undisputed king of financial consolidation, and is highly profitable.

The deal actually makes good sense to me: financial consolidation is Hyperion’s core business, and those of us who work in the industry know just how flaky the quality of data can be in those superficially shiny ERP finance systems. Hence a data quality spin makes good sense for Hyperion’s message to nervous CFOs. Unfortunately data quality is a very intractable problem, involving as it does human nature, and even the cleverest software can only assist with fixing issues of this nature. The data quality software vendor market may be shrinking, but the underlying problem of data quality itself is not.

Psst, wanna buy a BI vendor?

April 20, 2006

In Enterprise Systems, Stephen Swoyer concurs with my view that Oracle’s recent BI announcement was a big yawn. Moreover, he raises the interesting possibility of IBM buying Cognos, a rumor that has been circulating in the industry for several months. IBM has deliberately stayed out of the applications business, but that would not stop it buying a pure-play BI vendor. Clearly it has the financial resources to do what it wants, and I feel that Cognos would be a slightly more natural fit for IBM than Business Objects, as Cognos plays a little higher up the corporate food chain, e.g. with its Adaytum budgeting acquisition. Of course IBM did buy Alphablox, but this was a small vendor that had some technology useful to IBM, so that did not really alter the big picture. Cognos is getting back on its feet after a stumble late in 2005, and has generally strong technology.

If you consider the broad landscape, the industry has its “stack wars”, with IBM, SAP (NetWeaver), Oracle (Fusion) and Microsoft. IBM is, in my view, the best placed here since it is the only one of these that does not have application turf to defend (even Microsoft has its applications to worry about, and like errant teenagers to its parents, a big worry they are). IBM has planted its feet firmly in the master data management landscape, and to me it would seem entirely complementary and consistent for it to make a big play into the business intelligence world. Acquiring a company the scale of Cognos would certainly be the kind of bold step that would demonstrate that IBM is very serious about this area. It would firm up its “stack” offering in a quite natural way. Of course I am not privy to the internal musings of IBM, but this rumor makes more sense than many.

Lies, Damned Lies and Surveys

April 10, 2006

A survey sponsored by Oracle (http://www.computing.co.uk/itweek/news/2153695/surveys-show-bi-failing-s) hits a new low in terms of insight. The classic line is: “In Oracle’s survey of 200 UK and Irish IT managers, over half of organisations said they did not have any BI systems, though 69 percent of respondents said BI was important to help senior managers run their business.” Apart from the apparent conclusion that over 20% of the respondents seem to struggle to keep two ideas in their head for more than five minutes (if over half have no BI system yet 69 percent say BI is important, the two groups must overlap by roughly a fifth of respondents), the notion that half of the UK’s companies lack a single BI tool is pretty absurd. We have had well over a decade of Business Objects, Cognos and others pushing BI tools at every IT department, and even before that there were tools like Focus and Nomad. As a UK IT manager, you would have to be recently returned from the moon not to have encountered a BI software salesman.

I do wonder sometimes about the accuracy of some of these surveys. I recall years ago at a Gartner conference being handed a thick survey, which demanded all kinds of detail in terms of IT budget breakdown, future spending trends by area etc. You needed to return the completed survey in order to get a chance of winning a prize, and I remember saying to a guy next to me who had just finished filling his in: “how on earth do you remember all of that budget info for your organisation?” The reply was: “are you kidding, I just made it up, but I really want that prize”. Many surveys use incentives to get people to fill them in, and I wonder just how accurate the data really is in many of them as a result.

Separately, a more plausible insight comes from a different survey: “Meanwhile, a survey of 1,000 UK business managers at companies with over 250 staff, published by ICS, indicates a widespread need for better BI systems. The study found that over three quarters of respondents were forced to make decisions “blind” due to late or insufficient business information”. By contrast, this is entirely believable, though not for the reason that the article gave. The critical issue is that you can have as many pretty reporting tools and dashboards as you like, but you need accurate and timely information feeding those systems from a data warehouse (unless you are one of the few brave souls using EII). The problem is that most data warehouses are entirely unable to keep up with the pace of business change (reorganisations, acquisitions etc.) and so are constantly out of date. Consider a data warehouse with just ten source systems. A major change in one of its sources will impact the warehouse schema, and it may take three months to fix the schema, the load routines and the reports affected by the change (a pretty typical figure in my experience at Shell).

A major change of this type does not happen every day, but it is almost certain to happen once a year to each of these source systems, maybe twice. That means at least ten separate sets of changes to the warehouse every year, each taking three months of work. Even assuming that the changes are neatly spread over the year and that you have enough programming resources to work on several in parallel, you still have something like 30 months’ worth of change (ten changes at three months each) to absorb in every 12-month period; basically the warehouse can never catch up. You may well have more than ten sources for your data warehouse, so the problem could be even worse than this. This is indeed what happens in reality: the data warehouse is usually out of date, so armies of Excel jockeys in finance get the answers via email and have to manually number-crunch anything really critical, while the warehouse lumbers on with out-of-date information. This situation is not the fault of the BI tools – it is the fault of the data warehouses that feed them. Until companies admit that the status quo is failing and start abandoning custom-built warehouses this problem will persist. It is like treating alcoholism: the first step is admitting that there is a problem.
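By way of illustration, here is a toy back-of-the-envelope version of that arithmetic. The figures (one major change per source per year, three months of work per change, a team able to absorb twelve months of change work annually) are assumptions for the sketch, not data from the post or from Shell; adjust them to your own environment.

```python
# Back-of-the-envelope check: can a warehouse team absorb source-system change?
# All figures below are illustrative assumptions.

SOURCE_SYSTEMS = 10            # feeds into the warehouse
CHANGES_PER_SOURCE_PER_YEAR = 1  # major reorganisations/acquisitions etc. (could easily be 2)
MONTHS_PER_CHANGE = 3          # schema + load routines + report fixes
TEAM_CAPACITY_MONTHS = 12      # change work the team can absorb in a year

effort_arriving = SOURCE_SYSTEMS * CHANGES_PER_SOURCE_PER_YEAR * MONTHS_PER_CHANGE
backlog_growth = effort_arriving - TEAM_CAPACITY_MONTHS

print(f"Change effort arriving per year : {effort_arriving} months")
print(f"Capacity to absorb it           : {TEAM_CAPACITY_MONTHS} months")
print(f"Backlog added every year        : {backlog_growth} months")
# With these assumptions the warehouse falls ~18 months further behind each year,
# which is why it is permanently out of date.
```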

Honey I shrunk the attendees

April 5, 2006

I was in Silicon Valley this week speaking at Software 2006. This was pitched as having 2,500 attendees, and the organizers claimed 1,700 on the day itself, yet at any plenary session I could only count about 400 or so. Indeed on the first morning the main hall was so awkwardly empty that the number of chairs was dramatically reduced for future sessions, presumably to make it seem fuller. This is getting silly, rather like the perennial numbers game between police (“10,000 protestors”) and demonstrators (“100,000 protestors”) played out in countries the world over. As noted previously, the trade show seems to be in secular decline, even here in the heart of hi-tech country. The show itself had good speakers and excellent conference admin, yet the partly deserted exhibit hall spoke volumes. Even the bikini-clad girl handing out free gifts (note to the marketing manager at Aztec who hired the model: you may want to consider trying a gimmick that does not look quite so tacky; even in the 1980s this seemed a bit dubious) was unable to lift the atmosphere. The exhibit sponsors did not seem best pleased (sample comment: “four out of five people who came to the booth were trying to sell us something rather than the other way around”) and the supposed legion of CIOs attending were either cunningly disguised or further optimism on the organisers’ part.

Ironically the conference hotel perhaps held the clue as to why attendance at trade shows seems to be so hard to drum up these days. If you go to the Hyatt Regency Santa Clara concierge desk you are greeted not by a person but by a video screen. The concierge herself (the helpful Anna) sits 80 miles away and chats to you, even able to print out directions on the printer at the concierge desk. If even the hotel concierge can’t be bothered to travel to work any more and can do her job quite adequately by video link, is it any wonder that busy executives spend less time at trade shows and more time on webinars?

Integration or management?

I enjoyed a thoughtful article by Colin White, long-time database expert and general guru, about data integration and MDM. In it he defines an architecture for data integration that separates the technologies from the different techniques and supporting applications. He also points out a couple of important things: one is that CDI is not a very useful term, since it implies that it is only about integrating customer data. This is very true, and applies to MDM in general. It is not the case that CDI just means taking half a dozen separate customer data sources and somehow banging them together. In reality it will not be practical to end up with one master source for customer (or pretty much any other master) data, so what is important is to be able to catalog what is out there, to map the differences in definitions so that sense can be made of them, and to define a process so that changes can be propagated in a controlled way to the various sources. This is much more than just synchronizing updates between SAP and Siebel. Beyond simple things like names and addresses, you also need to consider how customer data is to be classified, e.g. into different market segments or demographic groups. Changing this classification is a non-trivial business process that will require input from various people within (and possibly beyond) marketing, and will likely involve multiple versions that need to be discussed, tested and modified before being published, and then propagated into the various operational systems. As Colin says, “management” is a much more appropriate term here than “integration”: this may seem an esoteric point, but names matter (if you doubt this, ask Vauxhall, whose “Nova” car means “no go” in Spanish).
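To make the “manage, don’t merge” distinction concrete, here is a minimal sketch of the idea: a catalog of the customer data sources that exist, a mapping of their local segment codes to a shared definition, and a reclassification that is only propagated once reviewed. The source names, segment codes and the two-steward approval rule are invented for illustration, not a description of Colin’s architecture or of any product.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerSource:
    name: str                      # e.g. an SAP or Siebel instance (illustrative)
    segment_map: dict[str, str]    # local segment code -> corporate segment

@dataclass
class SegmentChange:
    description: str
    new_segments: dict[str, str]   # corporate segment -> definition
    approvers: list[str] = field(default_factory=list)
    published: bool = False

    def approve(self, steward: str) -> None:
        self.approvers.append(steward)

    def publish(self, catalog: list[CustomerSource], required: int = 2) -> None:
        # controlled propagation: nothing reaches the operational systems
        # until enough data stewards have signed off
        if len(self.approvers) < required:
            raise RuntimeError("change not yet approved by enough data stewards")
        self.published = True
        for source in catalog:
            # in reality this step becomes a workflow task against each system
            print(f"propagate '{self.description}' to {source.name}")

catalog = [
    CustomerSource("SAP ECC (Europe)", {"A1": "Enterprise", "B2": "SMB"}),
    CustomerSource("Siebel CRM (US)", {"ENT": "Enterprise", "SMB": "SMB"}),
]

change = SegmentChange(
    "split SMB into Mid-Market and Small Business",
    {"Mid-Market": ">250 staff", "Small Business": "<250 staff"},
)
change.approve("marketing data steward")
change.approve("regional data administrator")
change.publish(catalog)
```

The point of the sketch is simply that the customer data stays where it is; what is mastered centrally is the catalog of sources, the mapping of definitions, and the approval workflow for changes.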

Another point well made by Colin is how the term “real time” is regularly abused. Since most business intelligence requires some form of analysis, it is rare indeed that it needs to be truly “real time”, e.g. looking at the buying patterns in a retail store by branch may usefully show all sorts of things (which items are moving, which promotions are working etc.) yet this information has no more meaning if you get it at 14:15 than if you get it at 14:05, or indeed at 11:32. Having it a few minutes more “real time” adds no insight, yet costs dramatically more in terms of IT complexity. I would argue that there are very few things indeed in business intelligence that truly require real-time data feeds. Certain operational queries may need this, e.g. checking a customer’s credit rating, or looking at overall trading exposure before placing a trade, but these are a small subset of what is usually termed business intelligence.

True love

April 4, 2006

After a long courtship, Microsoft today announced its engagement to its long-time lover, ProClarity. ProClarity had long been in a monogamous relationship with Microsoft as a partner, and had clearly been trying to win Microsoft’s heart ever since they first met. This is an excellent example of power dating in the software industry, as ProClarity had seemingly tied its fate to Microsoft as a conscious strategy for a long time. ProClarity had revenues of USD 15.1M in 2005, but had barely grown over 2004 (just 2% growth) and at 135 employees was probably not profitable. Growth of 2% while losing money is not a healthy position for ProClarity or any other software vendor, so its new position as a blushing bride is a sound move for the company. Financial terms of the Microsoft dowry were not obvious from the announcement.

Microsoft gains a strong reporting technology and moves ever further into direct competition with the mainstream BI vendors Business Objects and Cognos, a trend noted previously in this blog.