I permitted myself a wry smile when I first heard the hype about “real time” business intelligence, which is being promoted again this week. The vision sounds appealing enough: as soon as someone in Brazil types in a new sales order, the ultra-swish business intelligence system at head office knows and reacts immediately. Those who have worked in large corporations will be entertained by the naivety of this, since most large companies would be grateful just to know who their most profitable global accounts are.
The mismatch between fantasy and reality is driven by two factors. The first is that business rules and structures (general ledgers, product classifications, asset hierarchies, etc.) are not in fact uniform, but are spread across many disparate transaction system implementations – one survey found that the average Global 2000 company has 38 different sources of product master data alone. Yes, this is after all that money spent on ERP. Large companies typically have dozens or even hundreds of separate ERP implementations, each with a subtly different set of business structures from the next (plus the few hundred other systems they still have around). The second problem is that the landscape of business structures is itself in constant flux, as groups reorganize, subsidiaries are sold and new companies are acquired.
Today’s business intelligence and data warehouse products try to sweep this reality under the carpet, producing tools that convert the source data into a consistent, lowest-common-denominator set that can be loaded into a central data warehouse. This simplification is understandable, but it means that local variations are lost and many types of analysis become impossible. Worse, if the business structures change in the source systems, then the data warehouses and reports built on top of them are undermined, and changes to the structure of the data warehouse typically take months to bring about. In those intervening months, what happens to the “real time” business intelligence?
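To see what that lowest-common-denominator conversion amounts to, here is a minimal sketch in Python; the source systems, field names and codes are all invented for illustration. Any attribute that only one source system carries has no slot in the common model, so it is simply discarded on the way into the warehouse.

```python
# Minimal sketch of a lowest-common-denominator ETL mapping step.
# The systems, fields and codes below are hypothetical.

# Each source system classifies the same product its own way.
source_records = [
    {"system": "ERP-Brazil", "sku": "A-100",
     "category": "Pharma/Vet/Bovine", "local_tax_class": "ICMS-7"},
    {"system": "ERP-Germany", "sku": "A-100",
     "category": "Tiergesundheit", "local_discount_band": "B2"},
]

# The warehouse keeps only what every system can agree on.
COMMON_CATEGORY = {
    "Pharma/Vet/Bovine": "ANIMAL_HEALTH",
    "Tiergesundheit": "ANIMAL_HEALTH",
}

def to_warehouse(record):
    # Fields with no common equivalent (the tax class, the discount
    # band) are simply dropped: the local variation is lost on the way in.
    return {
        "sku": record["sku"],
        "category": COMMON_CATEGORY[record["category"]],
    }

print([to_warehouse(r) for r in source_records])
# [{'sku': 'A-100', 'category': 'ANIMAL_HEALTH'},
#  {'sku': 'A-100', 'category': 'ANIMAL_HEALTH'}]
```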
The problem comes down to a fundamental truth: databases do not like having their structure changed. Adding data is fine, but anything that affects the structure of a database (a major reorganization will usually do the trick) will cause pain. If you doubt this, ask a CFO how long it will take him or her to integrate an acquisition just enough to be able to run the management accounts as one combined entity. For some companies acquisitions are a way of life, with several undertaken each year. Such companies are always chasing their tails in trying to get a complete picture of their business performance. This is not just inconvenient but also costly: one company spent USD 4 million building a large and well-used conventional data warehouse. When they properly accounted for all aspects of maintenance, including business user time (which few companies do), they found it was costing USD 3.7 million per year to maintain. There was nothing wrong with the warehouse design; the company was simply operating in a volatile business environment, and 80% of the maintenance cost was caused by dealing with business change.
What is needed, and what the industry has generally failed to deliver, are technology solutions that are comfortable dealing with business change: “smarter” software. Today few IT systems can cope with a change in the structure of the data coming into the system without significant rework. The reason lies at the heart of the way databases are designed. They are usually implemented to reflect how the business is structured today, with relatively little regard for how to deal with future, possibly unpredictable, change. Introductory courses on data modeling show “department” and “employee” with a “one-many” relationship between them, i.e. a department can have many employees, but an employee can be in only one department (and must be in one department). This is easy to understand and typical of the way data models are built up, yet even this most basic model is flawed. I have myself been between departments for a time, and at another time was briefly part of two departments simultaneously. The simple model works most of the time, but not all of the time: it is not resilient to exceptional cases, and IT systems built on it will break and need maintenance when such cases arise. This is a trivial example, but it illustrates the way in which systems, both custom-built and packaged, are generally designed today. Of course it is hard (and expensive) to cater for future and hence somewhat unknown change, but without greater “software IQ” we will be forever patching our systems and discovering that each package upgrade is a surprisingly costly process. If you are the CFO of a large company, and you know that it takes years to integrate the IT systems of an acquired company, yet you are making several acquisitions each year, then getting a complete view of your corporation’s business performance requires teams of analysts with Excel spreadsheets, the modern equivalent of slaughtering a goat and gazing at its entrails for hidden meaning.
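To make the department and employee example concrete, here is a minimal sketch using SQLite from Python; the table names, columns and dates are invented for illustration. The first schema is the textbook design; the second treats department membership as a dated fact in its own right, so the exceptional cases become ordinary rows rather than schema changes.

```python
# Sketch of the textbook data model versus a more change-resilient one.
import sqlite3

con = sqlite3.connect(":memory:")

# Textbook model: "one department per employee" is welded into the schema.
con.executescript("""
CREATE TABLE department (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE employee (
    id INTEGER PRIMARY KEY,
    name TEXT,
    department_id INTEGER NOT NULL REFERENCES department(id) -- exactly one
);
""")
# An employee between departments, or briefly in two at once, simply
# cannot be stored here; when the case arises, the schema must change.

# A more resilient design drops department_id from employee and records
# membership as a dated fact of its own:
con.executescript("""
CREATE TABLE assignment (
    employee_id   INTEGER NOT NULL,
    department_id INTEGER NOT NULL REFERENCES department(id),
    valid_from    TEXT NOT NULL,
    valid_to      TEXT               -- NULL means "still current"
);
""")

# Zero, one or several concurrent departments are now just rows, and a
# reorganization becomes new data rather than a structural change:
con.execute("INSERT INTO department VALUES (1, 'Research')")
con.execute("INSERT INTO department VALUES (2, 'Marketing')")
con.execute("INSERT INTO assignment VALUES (7, 1, '2005-01-01', NULL)")
con.execute("INSERT INTO assignment VALUES (7, 2, '2005-03-01', NULL)")  # two at once
```

The resilient version costs more joins and more careful queries, which is one reason the rigid design remains the default in practice.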
Some software techniques are emerging that tackle the problem in a more future-oriented way, but these are the exception today. Unfortunately the vendor community finds it easier to sell appealing dreams than to build software that actually delivers them. “Real-time business intelligence” comes from the same stable as those who brought you the paperless office and executive information systems (remember those?), where the chief executive just touches a screen and the company instantly reacts. Back in reality, where it takes months to reflect a reorganization in the IT systems, and many months more just to upgrade a core ERP system to a new version, “real time” business intelligence remains a pipe dream. As long as people design data models and databases the traditional way, you can forget about true “real-time” business intelligence across an enterprise: the real world gets in the way. It is telling that the only actual customer quoted in the Techworld article, Dr Steve Lerner of Merial, had concluded that weekly data was plenty: “The consensus among the business users was that there was no way they were prepared to make business decisions based on sales other than on a weekly basis”.