Andy on Enterprise Software

Microstrategy catches a cold

July 31, 2006

The latest financial results from Microstrategy show it joining the lengthening list of BI vendors that are struggling.  Cognos has at least shaken off its SEC troubles, but it has had mixed results at best recently, while Business Objects' recent share price chart has that 1930s Great Depression look about it (see below).  Microstrategy showed a 5% decline in licence revenue, both year over year and on a trailing 12 month basis.  While total revenue increased due to a spurt in services revenue, no software company can honestly say that it is happy if its lifeblood, software licence revenue, is in decline.  The company's profits were down but operating margins were still a very healthy 32%, albeit again declining from a year ago.

I believe that this latest addition to the casualty list is continued supporting evidence for my long-standing thesis that pure BI players are going to struggle in the medium term due to the steady encroachment of Microsoft with its cheap and cheerful competing BI technologies, and also to the largely unrecognised saturation of the high end enterprise market for BI tools. 

It can be seen below that Microstrategy’s share price has held up better than its main rivals recently, but this latest set of results has triggered a sell-off.

(source for graphs: Yahoo finance web site)

[Charts: share price trends for Microstrategy, Cognos and Business Objects]


What’s in a name?

July 27, 2006

In a b-eye network article Bill Inmon reminds us of the importance of patents, especially for small start-up companies.  The current case against Blackberry is indeed a salutary reminder that a patent infringement can bring very serious consequences even to a market-leading company.  I would go further than Bill in suggesting that start-ups need to seriously consider all their intellectual property, which, as well as patent applications, extends to trademarks too.

Patents can take a long time to sort out – Kalido's core design patent was applied for in 1998 yet only granted in 2003 in the UK, and in 2005 in the US.  Certainly the US patent office is a curious beast which has many quirks, but being fast moving is not one of them.  However once you have a patent application in you can claim “patent pending” and are pretty much protected (unless of course, the patent application is rejected).  Start-ups need to consult with a patent lawyer early on if they are to avoid trouble.  For example, if you market something, which can include demonstrating a beta version of your product to a prospect, then you have 12 months to register for a patent on it or you may invalidate any future patent application.  It would be easy to see how someone could trip over this kind of legal tripwire.

The name of your company or product is also worth protecting via trademark in the countries that you expect to be marketing in.  This is less troublesome than it used to be thanks to some international reciprocal arrangements that mean you no longer have to file a trademark in every country individually.  However some countries (including the US) are not yet signatories to this accord.  There are several unfortunate cases of products being marketed and then suddenly discovering that they violated a trademark in a key market.  The costs of either withdrawing the launch, rebranding or fighting a court case can be very high, especially compared to the relatively modest costs of registering trademarks in the first place. 

You would hope by now that people would have figured out that registering web addresses is a free-for-all, yet you still see cases in the newspapers of well-known companies finding that the natural internet address of their latest product has just been hijacked by some guy in Oklahoma who would like a large sum of money for it, thank you very much.

All these aspects, web names, trademarks and patents, need to be considered carefully even by the smallest start-up.  There are costs to be incurred, but the alternative can be disastrous, and patents in particular can be a genuine asset down the line.


Wine prices and BI

July 26, 2006

There is an interesting piece in Advertising Age by Jack Neff discussing the increasing desire of companies to move to premium pricing, even at the cost of losing some customers.  It raises the critical question – how good is the information that companies have on the performance of their brands?  For a multi-national company in particular, even tracking the profit margin of a particular product or brand can be a thorny issue – do the French and the Italian subsidiaries allocate all their costs in the same way as the Brazilians or the Japanese?  Chances are they do not, and plenty of internal meeting time is spent debating whose country's operations are more effective.  Yet pricing is a powerful weapon.  I recall when I worked on a project at Shell Retail that we had just put in a data warehouse that tracked every single (non-fuel) transaction at all Shell's shops in Germany.  For the first time the category managers were able to see comprehensive information about sales by store, by SKU etc. each morning.  In this way they could track the behaviour of promotions, for example, and rapidly see the effect of other changes they made.  For example, in one cluster of stores they experimented by raising the price of wine to see what effect it had on the volume sold.  Interestingly, it had no effect at all, so they kept nudging up the price until an impact could be seen.  This single decision, when multiplied across all the stores in Germany, had a significant impact on profitability.
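To make the arithmetic concrete, here is a minimal sketch in Python with entirely hypothetical numbers (the actual Shell figures were never published): when a price rise leaves volume untouched, the extra margin flows straight through to profit.

```python
# Hypothetical numbers only -- illustrating why a price nudge with no
# volume impact drops straight to the bottom line.

stores = 1200                       # assumed number of shops
bottles_per_store = 50              # assumed weekly wine volume per store
old_price, new_price = 5.00, 5.40   # a 40-cent nudge
unit_cost = 3.50

def weekly_profit(price: float) -> float:
    """Total weekly wine profit across the whole estate at a given price."""
    return stores * bottles_per_store * (price - unit_cost)

uplift = weekly_profit(new_price) - weekly_profit(old_price)
print(f"Extra profit per week: {uplift:,.0f}")   # 1200 * 50 * 0.40 = 24,000
print(f"Extra profit per year: {uplift * 52:,.0f}")
```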

Pricing is a complex animal, and its effects are hard to predict in advance.  Hence it is critical that you have in place an information infrastructure that allows you to see the true operating margins of products and brands, and also the margin by customer and channel.  This may require a major investment, since if the cost allocation rules are not uniform in a corporation then you will need a system that can cope, and be able to continue coping when changes occur.  You may not need “real time” information, but as in the above example you at least want to be able to see the effect of decisions the next day, perhaps within hours.

The article mentions Unilever as a case study amongst others, and indeed its comprehensive network of linked data warehouses around the world gives it precisely the ability to see true gross margin across all its brands.  This enables it to decide which brands to invest in further, and which to pare back, as well as enabling it to track more operational aspects of business performance.

This ability, sometimes called “3D marketing”, i.e. being able to track performance across multiple business dimensions like customer, channel and product, is a powerful enabler of more subtle and profitable pricing decisions.  In some projects I am aware of, serious pricing discrepancies caused very real gross margin issues, yet were buried away in a maze of confusing systems and so went unnoticed, in some cases for years.  Clever pricing can be highly profitable, just as inadvertently poor pricing can cost you money.  The common denominator is having the ability to track profitability at the detailed level, in any business dimension, across the globe.  Companies like Unilever have shown that this can be done.
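For the curious, here is a toy sketch of what such multi-dimensional margin tracking looks like, using Python and pandas with made-up data; the point is simply that one consistent fact table can be rolled up along any business dimension.

```python
import pandas as pd

# Made-up transaction data; in practice this would come from a global
# data warehouse with uniform cost-allocation rules already applied.
sales = pd.DataFrame({
    "product":  ["wine", "wine", "snacks", "snacks"],
    "channel":  ["retail", "wholesale", "retail", "retail"],
    "customer": ["DE-001", "DE-002", "DE-001", "FR-003"],
    "revenue":  [5400.0, 4100.0, 2200.0, 1900.0],
    "cost":     [3500.0, 3300.0, 1400.0, 1600.0],
})
sales["margin"] = sales["revenue"] - sales["cost"]

# The "3D" view: the same facts rolled up by product, channel or customer.
for dim in ("product", "channel", "customer"):
    print(f"\nGross margin by {dim}:")
    print(sales.groupby(dim)["margin"].sum())
```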

Pandora’s box and hope for new CFOs

July 25, 2006

Recent management change at RadioShack reported in CFO.com shows just how important it is for CFOs to be able to produce accounts that they can confidently sign off in today's stricter regulatory environment. An incoming CFO needs to feel absolutely certain that the books are in pristine shape, and may have to produce historical financial information from systems that he or she did not implement and is unfamiliar with. This is particularly true when, as in the case of RadioShack, the CFO has to deliver reports having been in the job for less than half the fiscal year. Often, a new CFO is faced with a Pandora's box when they do peek inside the finance systems they've just inherited.

How confident can they be, given the serious consequences if something turns out to be awry? What about acquisitions, which need to be speedily assimilated into the corporate structure, yet where in reality it can take years to convert or replace the acquired company's incompatible IT systems? When a new CFO opens up the lid on the financial systems on which they rely, what do they uncover? We are all familiar with those scenes in horror films where the victim opens the forbidden door, or the lid on the long-shut chest, and as the audience we think “oh no, don't open that”. How confident can an incoming CFO be that they are not about to re-enact such a scene?

The unpleasant reality in most companies is that financial data resides in multiple systems e.g. through a series of subsidiaries, or is in transition in the case of acquired companies. As I have written about elsewhere, getting a single and reliable view of corporate performance information can be a thorny problem. If you have to go back over time, as when changes occur to the structure of a chart of accounts or when major reorganisations happen, it is difficult to compare like with like. Moreover, CFOs need to understand the origin of the data on which they rely, with a full audit trail. This means that finance teams need to be active in defining and checking the business rules and processes from which data is derived in their company, and how these processes are updated when changes occur. Relying on the ERP system to do all this is insufficient since many of the business rules reside outside of these systems. This is why modern data warehousing and master data management software can help deliver clearer visibility. Ideally a CFO should be able to gather financial data together and view it from any perspective and at any level of detail – without having to standardise operational systems and business processes. The most intelligent software uses a model of the business, not the data or the supporting IT architecture, as its blueprint. Such a business model-driven approach insulates the company from change, since the reporting formats change immediately in response to changes in the business model. Using such intelligent software means that business change – such as re-organisations, new products, consolidation programs, and de-mergers – should no longer be feared.
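To illustrate the model-driven idea in miniature (a deliberately simplified sketch, not a description of any particular product): if transactions carry stable account codes and the organisational structure lives in a separate business model, then a reorganisation means editing only the model, and every report immediately rolls up to the new structure.

```python
from collections import defaultdict

# Fact data never changes: transactions carry stable account codes.
transactions = [
    {"account": "4100", "amount": 250.0},
    {"account": "4200", "amount": 180.0},
    {"account": "5100", "amount": -90.0},
]

# The business model maps accounts to reporting units, and is versioned.
model_v1 = {"4100": "EMEA", "4200": "EMEA", "5100": "Americas"}
# After a reorganisation only the model changes, not the transactions.
model_v2 = {"4100": "EMEA", "4200": "Asia-Pacific", "5100": "Americas"}

def report(model: dict) -> dict:
    """Roll transactions up to reporting units under a given model."""
    totals = defaultdict(float)
    for t in transactions:
        totals[model[t["account"]]] += t["amount"]
    return dict(totals)

print("Before reorg:", report(model_v1))
print("After reorg: ", report(model_v2))
```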

Leading the organisation throughout change provides a real opportunity for the data-driven CFO to make his or her mark. By using the most modern technology available they can do this safely and without becoming a compliance victim. Good luck to all newly appointed CFOs!

So many databases…

July 24, 2006

In his ZD Net blog discussing systems management David Berlind raises a useful point: is it possible for a company to truly standardise on a DBMS?  While every CIO would like to be able to say “We only have xxxx” (substitute Oracle/DB2/SQL Server etc as desired) in reality it is virtually impossible to standardise on a single DBMS.  For a start, the sheer weight of existing systems means that conversion of a system using one DBMS to another is a far from routine activity.  The DBMS systems might all support one of the SQL standard variants, but we all know that in reality much of the code to access a DBMS uses the various proprietary extensions of the database.  Even if you could somehow wave a magic wand and achieve a single DBMS nirvana for your in-house systems, many packages only support particular databases and so you may be forced into a separate DBMS by a package vendor – few CIO departments have the clout to prevent the business choosing a package that is non-standard if the business wants it enough.  Moreover any attempt at standardisation will be overtaken by acquisitions made by the company, since you certainly can't assume that every company that you acquire has a single DBMS or that, even if they did, their choice is the same as yours.
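A trivial example makes the point about proprietary extensions: even something as routine as fetching the first ten rows of a table is written differently on each major platform, so application code quietly accumulates vendor-specific SQL (shown here as plain strings).

```python
# The same "first ten customers" query as it is typically written on
# each platform -- a small example of dialect lock-in.
queries = {
    "Oracle":     "SELECT * FROM customers WHERE ROWNUM <= 10",
    "SQL Server": "SELECT TOP 10 * FROM customers",
    "DB2":        "SELECT * FROM customers FETCH FIRST 10 ROWS ONLY",
    "MySQL":      "SELECT * FROM customers LIMIT 10",
}

for dbms, sql in queries.items():
    print(f"{dbms:11}: {sql}")
```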

The difficulty in actually porting an application from one DBMS to another actually gives a powerful lock-in to DBMS vendors.  If you doubt this, check out the margins of Oracle, Microsoft and IBM’s software business (if you can find it) – net profit margins of 30% or more, which is what Oracle and Microsoft have, are achieved by very few companies in the world.  The trouble companies have is that, while they may be unhappy with a DBMS vendor, they can huff and they can puff at contract negotiation time, but in reality there is very little they can do: the DBMS vendor knows that the cost of migrating off of their platform would dwarf any savings in license costs that could be obtained.

In reality it is the same with applications and the data within them.  Migrating from a major application package is a massive task, so in reality you will have to learn to live with a heterogeneous world of databases, applications and data, like it or not.  Yet companies spend huge amounts of money trying to “standardise”, which is essentially only a little more attainable as a goal than the holy grail.  Better to manage within that reality and concentrate on strategies that will allow you to make the most of the situation.  Certainly, you don't want to allow unnecessary diversity, but you need to look carefully at the true cost/benefit case of trying to eliminate it.

CDI (or is it MDM?) Summit

July 20, 2006

I recently presented at the CDI MDM summit in London, the first one in the UK and indeed only the second held at all (I also presented at the inaugural one in San Francisco in March this year). These conferences are the vehicle of the CDI Institute, set up by ex-META analyst Aaron Zornes. He seems in two minds as to whether to re-title them from CDI to MDM, and so this one's title stayed on the fence as “CDI MDM”. At some point Aaron will acknowledge that CDI is just a subset of MDM, but for now the fence-straddling continues. Anyway, the title is less critical than the content, and the conference went quite well. There were supposedly 170 registrations, though this figure included exhibiting vendors; still, there did seem to be up to 100 genuine customers at the sessions. The first day was a workshop where Aaron gave his personal views on the space and the leading vendors, while the second day was a mix of case studies from customers and vendor pitches of varying degrees of subtlety. It is remarkable how many vendors haven't grasped that conference attendees do not want to hear direct sales pitches when they have paid to attend; if they wanted a sales pitch they would invite the vendor to come to their office. SAP once again stayed away from the conference, which given the state of its current MDM offering was probably prudent.

The exhibits were pretty well attended and the case studies showed that there are some pioneers in the MDM area who are beginning to get some value from their projects, but equally clearly the space is very immature, with various vendors clambering onto the MDM bandwagon despite seemingly never having heard of the term a year or two ago. As well as the pure MDM plays like Kalido, Siperian and Purisma, there were several data quality vendors like Initiate, Trillium and Datanomic, and the expected presence from IBM, Oracle and Hyperion. A number of systems integrators also turned up, but I have the impression that there are not yet enough major projects out there to have excited the big SIs. It seemed clear that the “more than just customer” message was winning, with more than one speaker highlighting the danger of developing silos of master data if one hub is set up for customer, another for product, etc. It would indeed seem odd to go from one form of silo (by organisational unit) to another (by data type) when this is entirely unnecessary.

The conference organisers did a good job, with sessions staying to time, and the exhibit hall was laid out sensibly. With the general decline of conferences it was good to see one that clearly had captured some interest and was well run.

Metamorphosis

July 19, 2006

It is a big day for me today, as I have decided to move from Kalido to pursue other interests. Kalido has come a long way since I encountered some original generic modeling research at Shell in 1996 that I could see had massive potential to provide Shell with integrated information from across the world throughout business change. After success at Shell with the software, I set up a business unit to commercialize Kalido, resulting in Kalido being set up as an independent company in 2001, and, with the backing of major venture capitalists, subsequently spun off from Shell in 2003. By this time it was clear that the next phase of growth for the company was to become successful in the US market, the largest in the world, and as I am based in the UK, I handed over the reins of CEO, and assumed the role of customer champion, company spokesperson and chief strategist. There has been no shortage of projects to work on, and I have thoroughly enjoyed continuing to raise the public profile of Kalido, but now that a new CEO – Bill Hewitt – has come on board to take the company to its next level of growth, I felt it was the right time for me to move on. Bill Hewitt has exactly the right background in enterprise software to take the company to the great commercial success that it deserves.

I have immensely enjoyed building Kalido up from an idea to a company with tremendous potential, and I look forward to seeing its continuing success. It has been an exhilarating experience for me, above all because I have had the privilege of working with a group of highly talented and committed individuals. It has been an immense pleasure to see so many examples of real business benefit in customer projects that have deployed Kalido in over 100 countries. The success that the company has enjoyed so far has been based on a passion for customer success and the high quality of its people, and is something I am extremely proud to have been associated with.

I intend to initially do some independent consulting and do a little writing. This blog, of course, will live on!

Some rare common sense

July 18, 2006

Ad Stam and Mark Allin have written an excellent piece in DM Review this month covering data stewardship and master data management. They correctly point out, with regard to business intelligence systems, that “change will occur, and time to react will decrease”, and lay out a sensible architecture for dealing with this issue. I very much like the way they put emphasis on the need for a business unit to deal with data governance as a key building block. In the article they explain the key requirements of such a group and make the interesting analogy with logistics, which these days is usually outsourced to a separate unit or even a separate company. Similarly, they believe that master data should be managed by a non-IT business unit.

The article also correctly distinguishes between “golden copy” data held in the data warehouse and a master data management repository, which in addition will hold master data in all its stages. The master data repository should be linked to the data warehouse, but the two are not the same physical entity, since the master data repository has to handle “unclean” data whereas the data warehouse should have only fully validated data stored in it.
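A minimal sketch of this distinction (the stage names here are purely illustrative): the master data repository tracks records through their whole lifecycle, and only the fully validated “golden” ones are published on to the warehouse.

```python
from enum import Enum

class Stage(Enum):
    CANDIDATE = "candidate"   # newly captured, possibly unclean
    IN_REVIEW = "in_review"   # being validated by data stewards
    GOLDEN = "golden"         # fully validated master record

# The repository holds master data in all its stages...
repository = [
    {"id": "CUST-001", "name": "Acme GmbH", "stage": Stage.GOLDEN},
    {"id": "CUST-002", "name": "acme gmbh?", "stage": Stage.CANDIDATE},
    {"id": "CUST-003", "name": "Philips NV", "stage": Stage.IN_REVIEW},
]

def publish_to_warehouse(repo: list) -> list:
    """...but only golden-copy records flow on to the data warehouse."""
    return [r for r in repo if r["stage"] is Stage.GOLDEN]

print(publish_to_warehouse(repository))
```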

It is a pleasant change to read such a sensible article on best practice in data management, but this is because Ad and Mark are real practitioners in serious enterprise-wide projects through their work at Atos Origin e.g. at customers like Philips. They are not people who spend their lives giving slick PowerPoint presentations at conferences but are close to the action in real-world implementations. I worry that there are too many people on the conference circuit who are eloquent speakers but haven't actually seen a real live project for a long time. I have known Ad Stam for many years and can testify that his team at Atos are an extremely competent and experienced set of practitioners who focus on customer delivery rather than self-publicity. If you have a data warehouse or MDM project then you could do a lot worse than use Ad's team.

Vive la France

July 17, 2006

For some time I have been involved with an EU project that wrapped up last week in Brussels. With the unpromising name of Sun&Sup, the project tried to identify the issues that hold back hi-tech start-ups in Europe, and to make recommendations that could improve the current situation. The project invited periodic input from selected hi-tech start-up companies across the EU (along with various service providers to start-ups) and I represented the UK on this project.

Make no mistake that there is a problem: once you get beyond SAP, Business Objects and Sage you will be hard pressed to name a large European software company. Israel has done a better job than the combined resources of Europe, with companies like Check Point Software, Amdocs, Mercury Interactive and many others. Israel has the second highest ranking for VC investment, and even in absolute terms has the second highest number of start-ups after the USA, yet it has a population of just over 6 million. There are many reasons for Europe's hi-tech malaise, and few easy answers. The Sun&Sup project tried to deliver some very low-key, pragmatic services in pilot form, such as a self-help network of companies wishing to expand across borders, an expert system to help companies assess their business plans, and a mentoring program to provide non-executive directors for start-ups, amongst others. Its most ambitious recommendation was to lobby to replicate the US system of government procurement, which sets aside USD 100 billion of government spending for small companies. European government procurement favours large companies: 50% of economic activity in Europe comes from SMEs, yet only 30% of government spending goes to SMEs. Of course opening up more government business to SMEs would not be a panacea, but it would help, as the successful federal Small Business Act has demonstrated for many years.

The highlight of the wrap-up session of the project in Brussels was to hear the French Trade minister Christine Lagarde making an eloquent case for the need for change in public procurement. It was indeed refreshing to an Anglo-Saxon ear to hear a small business initiative being championed by a French minister. Ms Lagarde was an extremely impressive speaker, yet clearly faced entrenched opposition from the Commission and indeed from several member countries in trying to open up public procurement. Indeed, from the way that several of the modest Sun&Sup initiatives ended up being buried or transferred to other EU projects, it seemed clear that the lack of high-tech competitiveness in Europe is something that will remain the subject of much hand-wringing for a long time to come.

SeeWhy real-time event monitoring makes sense…

July 13, 2006

Given the consolidation in the business intelligence sector, and the recent share price dips even of leaders Cognos and Business Objects, you might wonder why anyone would bring out a new BI product. Certainly there is no shortage of reporting tools, data mining has yet to break out of its statistician niche, while visualisation tools have again failed to become a mass market. However, one area that does make sense for a new entrant to focus on is real-time event monitoring, which today is typically addressed (poorly) by the vendors of major applications.

SeeWhy Software is a UK start-up which has managed to get over the key hurdle of signing up initial high-class customers such as Diageo. It is run by Charles Nichols, previously an executive of Business Objects. Charles is a smart guy who understands the space well. The software pulls data out of real-time message queues, enabling alerts to be generated e.g. for supply chain data in the case of Diageo. The company should continue to focus on this niche in my view, and avoid trying to be “all things to all men”. For example, it would be natural to extend its capability to data mining in order to spot anomalies or trends, but it would be wise to partner with existing data mining tools in order to do this. Similarly, if SeeWhy starts to build up repository capabilities and to look at trends in its customers' data, it should avoid trying to compete with general purpose data warehouse technology, or it risks undermining its message of “real time” analysis. I have written elsewhere about how EII vendors struggle when they try to position themselves as general purpose business intelligence tools, since fundamental issues like data quality get in the way if you do not have a persistent store of data such as a data warehouse. This has led to pioneer EII vendor Metamatrix stalling in the market, with virtually no growth in revenues last year. By concentrating on drawing data from real-time message queues, marketing to that niche, and partnering selectively in other areas, SeeWhy should be able to prosper in an apparently crowded market.
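The underlying pattern is simple enough to sketch (purely illustrative, and not SeeWhy's actual implementation): consume events from a message queue as they arrive and raise an alert the moment a threshold is breached, rather than waiting for a batch load into a warehouse.

```python
import queue

# Simulated inbound supply-chain messages on a queue.
events = queue.Queue()
for stock in (480, 350, 190, 520):
    events.put({"sku": "SKU-42", "stock_level": stock})

REORDER_THRESHOLD = 200   # hypothetical business rule

while not events.empty():
    event = events.get()
    if event["stock_level"] < REORDER_THRESHOLD:
        # In a real deployment this would notify someone immediately.
        print(f"ALERT: {event['sku']} stock down to {event['stock_level']}")
```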