A new survey from Gartner has some interesting findings. Business intelligence in its broadest sense has moved from #10 in CIO priorities to #2, quite a jump. Spending in this area is set to rise sharply, with companies on average spending 5-10% of their overall software budgets on BI, but with some sectors such as finance spending 16% of their software budgets on business intelligence (more can be found in research note G00129236 for those of you who are Gartner clients). This is obviously good news for vendors in the space, but it seems to me that CIOs have been very slow to grasp that providing business insight is surely a high priority for their customers. Once the Y2K madness was over and everyone had rushed in their shiny new ERP, CRM and supply chain systems, what else should CIOs have been doing other than exploiting the wealth of information now being captured by these systems? CIOs are increasingly under pressure to deliver value to their companies, and what better way to do this than by providing improved insight into the performance of the company’s operations? Surely there is more bang for the buck for a CIO here than in fiddling with new middleware or upgrading the network, activities in which most business people have little interest and regard as “business as usual”. Anyway, the penny now seems to be dropping, so now that it is finally on their radar, CIOs should also consider what will stop them delivering this value, with its inherent career-enhancing kudos.
The main barriers to adoption of business intelligence, in Gartner’s view, are:
- a lack of skills
- difficulty in getting business intelligence out of enterprise applications (e.g. ERP)
- perceived high cost of ownership
- resistance due to some managers viewing enterprise-wide transparency as a threat to their personal power
I recently wrote on this last point. Let’s consider the others.
The skills one is the easiest: “organizations lack the analytical skills, causing them to have difficulty using the tools.” The answer is that most people in a business simply do not need a sophisticated analytical tool. It is not their job to be creating whizzy charts or mining through data – most business managers just need a regular report telling them their key performance information e.g. production throughput, sales figures etc. This requires at most Excel, and probably not even that. As I have argued elsewhere, the answer to BI vendors selling you thousands of copies of their software is simple: just say no, to quote Nancy Reagan. In my experience perhaps 5%-10% of end users of data warehouse applications actually need true ad hoc analytic capability – the rest need a report or at most an Excel pivot table. Putting a complex, powerful but unfamiliar tool on business people’s desks and then wondering why usage rates are low is a self-inflicted problem.
The second barrier is the difficulty in getting BI linked to enterprise applications. This is a real issue, with the big application vendors either providing weak capability or, where they do provide it, tying it too heavily to their own data structures. While there is a place for operational reporting, enterprise-wide performance management requires information from a wide range of sources, some of it held in spreadsheets and external data. Obsessed with trying to broaden their own footprint, application vendors seem unable to let go and realize that customers have a diverse set of applications and are not going to simply ditch everything they have from everyone else. The answer here is to adopt an enterprise data warehouse approach that is separate from the application vendors and neutral to the applications. Leave the operational reporting to the app vendors if you must, but at the enterprise level you need something that is truly application-neutral. Cutting this link, and using app vendors’ analytic tools for what they are good at, rather than trying to shoe-horn them into roles they were never designed for, will save you a lot of pain and suffering here.
The third issue is cost of ownership, and here too the issue is very real. Recent work by The Data Warehousing Institute (TDWI) shows that data warehouses have very high costs of maintenance and support. Indeed, according to a major TDWI survey, the annual support costs are 72% of the implementation costs, on average. For people used to traditional “15% of build” costs this may seem outlandish, but it is not. The reason that maintenance costs are often around 15% of build costs for transaction systems is that this is roughly the amount of code that is impacted by change each year. Most transaction systems are operating in a specific area of the business that does not change radically every week, so much of the application code is stable. By contrast, a data warehouse (by definition) takes as sources many different transaction systems, and every one of these is subject to change. So if you had a data warehouse with six sources, each of which had a 15% degree of change per year, then your warehouse is subject to a 6 * 15% = 90% level of change stress. Of course this is too simplistic, but you can see the general idea: with many sources, each undergoing some change, the warehouse encounters more change issues than any specific transaction system. Hence custom-built data warehouses do indeed have these very high levels of maintenance costs. There is not a lot to be done about this unless you use a data warehouse design specifically built to address this issue, but unfortunately the mainstream approaches (3NF, star schema, snowflake schema) do not.
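The back-of-envelope arithmetic above can be sketched in a few lines of code. This is purely illustrative: the 15% annual change rate and six sources are the figures from the text, and the second calculation is just one alternative (and still simplistic) way of modelling the same idea, treating each source as changing independently.

```python
# Illustrative "change stress" calculation for a data warehouse,
# assuming the figures from the text: six source systems, each with
# roughly 15% of its code impacted by change per year.
ANNUAL_CHANGE_RATE = 0.15
NUM_SOURCES = 6

# Naive additive model, as in the text: stress accumulates per source.
additive_stress = NUM_SOURCES * ANNUAL_CHANGE_RATE
print(f"Additive change stress: {additive_stress:.0%}")  # 90%

# A slightly less crude view: the chance that at least one source
# system changes in a given year, if each changes independently.
p_any_change = 1 - (1 - ANNUAL_CHANGE_RATE) ** NUM_SOURCES
print(f"P(at least one source changed): {p_any_change:.0%}")  # 62%
```

Either way you cut the numbers, the warehouse sits downstream of far more change than any single transaction system does, which is what drives the maintenance bill.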
So to summarize, you can take steps to circumvent at least three of Gartner’s four barriers. The fourth one involves tackling human nature, which is a more difficult problem than software design or architecture.