Andy on Enterprise Software

The supply chain gang

November 30, 2005

There is a thoughtful article today by Colin Snow of Ventana in Intelligent Enterprise. In it he points out some of the current limitations in trying to analyze a supply chain. At first sight this may seem odd, since there are well established supply chain vendors like Manugistics and I2, as well as the capabilities of the large ERP vendors like SAP and Oracle. However, just as with ERP, there are inherent limitations with the built-in analytic capabilities of the supply chain vendors. They may do a reasonable job of very operational reporting ("where is my delivery?") but struggle when it comes to analyzing data from a broader perspective ("what are my fully loaded distribution costs by delivery type?"). In particular he hits the nail on the head as to one key barrier: "Reconciling disparate data definitions". This is a problem even within the supply chain vendors' software, some of which has grown through acquisition and so does not have a unified technology platform or single data model underneath the marketing veneer. We have one client who uses Kalido just to make sense of the data within I2's many modules, for example.

More broadly, in order to make sense of data across a complete supply chain you need to reconcile information about suppliers with that in your in-house systems. These will rarely have consistent master data definitions, i.e. what is a "packed product" in your supply chain system may not be exactly the same as a "packed product" in your ERP system, or within your marketing database. The packaged application vendors don't control every data definition within an enterprise, and the picture worsens if the customer needs to work more closely with external suppliers, e.g. some supermarkets have their inventory restocked by their suppliers when stocks fall below certain levels. Even if your own master data is in pristine condition, you can be sure that your particular classification structure is not the same as any of your suppliers'. Hence making sense of the high-level picture becomes complex, since it involves reconciling separate business models. Application vendors assume that their own model is the only one that makes sense, while BI vendors assume that such reconciliation is somehow done for them in a corporate data warehouse. What is needed is an application-neutral data warehouse in which the multiple business models can be reconciled and managed, preferably in a way that allows analysis over time, e.g. as business structures change. Only with this robust infrastructure in place will BI tools be able to exploit the full value of the information.
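
To make the reconciliation problem concrete, here is a minimal sketch in Python, with entirely hypothetical record keys and field names, of the kind of cross-reference an application-neutral warehouse has to maintain: each source system keeps its own definition of "packed product", and the mapping to an enterprise-level classification lives outside any one application.

```python
# Hypothetical illustration: two source systems classify the "same" products
# differently, and an explicit cross-reference reconciles them at the
# enterprise level without forcing either source to change its own model.

erp_products = {
    "PP-1001": {"desc": "Lubricant 1L bottle", "class": "PACKED"},
    "PP-1002": {"desc": "Lubricant 20L drum",  "class": "BULK"},
}
scm_products = {
    "SKU-77": {"desc": "Lube 1 litre bottle", "category": "FINISHED_PACK"},
    "SKU-78": {"desc": "Lube 20 litre drum",  "category": "FINISHED_PACK"},
}

# The enterprise-level mapping, maintained independently of both applications.
conformed_xref = {
    ("ERP", "PP-1001"): "PACKED PRODUCT",
    ("ERP", "PP-1002"): "BULK PRODUCT",
    ("SCM", "SKU-77"):  "PACKED PRODUCT",
    ("SCM", "SKU-78"):  "PACKED PRODUCT",   # note: the two sources disagree on the 20L drum
}

def conformed_class(source: str, key: str) -> str:
    """Return the enterprise classification for a record from a given source."""
    return conformed_xref.get((source, key), "UNMAPPED")

for (source, key) in sorted(conformed_xref):
    print(f"{source} {key} -> {conformed_class(source, key)}")
```

In a real warehouse these mappings would be tables rather than dictionaries, and would be versioned so that analysis over time survives reclassification, but the principle is the same.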

The shrinking band of ETL players

November 29, 2005

Informatica has had a couple of good quarters and is about the last independent ETL player left standing, now that Ascential has disappeared into the IBM Websphere maw. The only other player of note is the quirky Ab Initio, entrenched in the high-volume niche but hamstrung by surreal business practices (customers must sign an NDA even to get a demo), and, er, well that's about it. Sunopsis has taken a smart approach by using the capabilities within the database engines, but who else is left? Sagent had the last rites read before being bought by Group 1, while early pioneer ETI seems to have shrunk almost to oblivion, at least in terms of market presence. The trouble is that the features that ETL tools provide are increasingly being built into the underlying database engines, as Microsoft has done with DTS (which it is about to revamp substantially in SQL Server 2005). This makes it harder for companies to justify a high price tag for more functional tools. Informatica has correctly abandoned its mad strategy of "analytics" and has broadened into EAI, where there is a larger market. Even this market is pretty competitive though, with Tibco battling it out with IBM Websphere, and with third-placed Web Methods actually shrinking in revenue in 2004. The remaining ETL vendors can take comfort from the fact that a lot of companies still do it themselves, so there are plenty of sales opportunities and the market is a long way from being saturated. However, the pricing pressure that the "free" offerings in the database engines put on vendors makes it a struggle to earn a good living, and this pressure will only increase as those capabilities become more functional and scalable.

Open up your email – it’s the feds!

November 28, 2005

I recall being sent an email a few days ago that was so transparently a virus that I thought "who on earth would click on such an obviously dodgy looking attachment?" It was an email purportedly from the FBI (yeah, right, email being the FBI's most likely form of communication to me) saying that "your IP address had been logged looking at illegal web sites" and then inviting you to click on an attachment of unknown file type, asking you to "fill in a form about this activity". I'm guessing that if the FBI were troubled about my web browsing they would be more likely to burst through the front door than send an email. At the time I chuckled to myself and deleted it, but apparently millions of people actually did decide to fill in the form about their dubious web browsing, immediately infecting their PCs with the Sober virus.

Against this depressing demonstration of human gullibility was at least the entertainment value of the deadpan statement from the real FBI, which read: “Recipients of this or similar solicitations should know that the FBI does not engage in the practice of sending unsolicited e-mails to the public in this manner.” Quite so.

Wiping the slate CleAn?

When a company or organization changes its name, you know that its troubles are big. A few days ago Computer Associates made the radical name change to "CA" (I congratulate the brand naming people for their imagination; I imagine their fee was suitably modest). This follows a series of accounting scandals that has accounted for most of the executive team, with its previous CFO pleading guilty to criminal charges and previous CEO Sanjay Kumar leaving the company. The name change is in a fine tradition of hoping to cover up the past: in the UK we had a nuclear power station called Windscale which had an unfortunate tendency to leak a bit, and a safety reputation that would have troubled Homer Simpson. A swift bit of PR damage control and voila – Sellafield was born. We all felt much safer that day, I can tell you. Of course some name changes are just good sense, e.g. Nintendo was previously called Marafuku (I kid you not).

However the big question for CA is whether its rebirth will be superficial or deep-rooted. CA was the body-snatcher of the IT industry, picking up ailing companies that had decent technology and maintenance revenues for a song, stripping out most of the costs and milking the maintenance revenue stream. It does have some strong systems management technologies like Unicenter, and is a very large company, with USD 942 million in revenues last quarter. However its famously antagonistic relationships with its customers have not helped as it has had to weather a series of scandals and management changes. John Swainson, recruited from IBM, is the latest person to take on the task of turning the company around in his new role as CEO. I wish him luck, as a company with troubles that deep-seated will not be fixed by a PR blitz and a name change.
I hope he has better fortune than PWC did with “Monday”.

Overcoming (some of) the BI barriers

November 25, 2005

A new survey from Gartner has some interesting findings. Business intelligence in its broadest sense has moved from #10 in CIO priorities to #2, quite a jump. Spending in this area is set to rise sharply, with companies on average spending 5-10% of their overall software budgets on BI, and some sectors such as finance spending 16% of their software budgets on business intelligence (more can be found in research note G00129236 for those of you who are Gartner clients). This is obviously good news for vendors in the space, but it seems to me that CIOs have been very slow to grasp that providing business insight is surely a high priority for their customers. Once the Y2K madness was over and everyone had rushed in their shiny new ERP, CRM and supply chain systems, just what else was it that CIOs should be doing rather than exploiting the wealth of information now being captured by these systems? CIOs are increasingly under pressure to deliver value to their companies, and what better way to do this than by providing improved insight into the performance of the company's operations? Surely there is more bang for the buck for a CIO here than in fiddling with new middleware or upgrading the network, activities in which most business people have little interest and which they regard as "business as usual". Anyway, the penny now seems to be dropping, so given that it is finally on their radar, CIOs should also consider what will stop them delivering this value, with its inherent career-enhancing kudos.

The main barriers to adoption of business intelligence, in Gartner’s view, are:

  • a lack of skills
  • difficulty in getting business intelligence out of enterprise applications (e.g. ERP)
  • perceived high cost of ownership
  • resistance due to some managers viewing enterprise-wide transparency as a threat to their personal power

I recently wrote on this last point. Let’s consider the others.

The skills one is the easiest: "organizations lack the analytical skills, causing them to have difficulty using the tools." The answer is that most people in a business simply do not need a sophisticated analytical tool. It is not their job to be creating whizzy charts or mining through data – most business managers just need a regular report telling them their key performance information, e.g. production throughput, sales figures etc. This requires at most Excel, and probably not even that. As I have argued elsewhere, the answer to BI vendors selling you thousands of copies of their software is simple: just say no, to quote Nancy Reagan. In my experience perhaps 5%-10% of end users of data warehouse applications actually need true ad hoc analytic capability – the rest need a report or at most an Excel pivot table. Putting a complex, powerful but unfamiliar tool on business people's desks and then wondering why usage rates are low is a self-inflicted problem.

The second barrier is the difficulty in getting BI linked to enterprise applications. This is a real issue, with the big application vendors either providing weak capability or, where they do provide it, tying it too heavily to their own data structures. While there is a place for operational reporting, enterprise-wide performance management requires information from a wide range of sources, including spreadsheets and external data. Obsessed with trying to broaden their own footprint, application vendors seem unable to let go and realize that customers have a diverse set of applications and are not going to simply ditch everything they have from everyone else. The answer here is to adopt an enterprise data warehouse that is separate from the application vendors and neutral to the applications. Leave the operational reporting to the app vendors if you must, but at the enterprise level you need something that is truly application-neutral. Cutting this link, and using the app vendors' analytic tools for what they are good at rather than trying to shoe-horn them into roles they were never designed for, will save you a lot of pain and suffering here.

The third issue is cost of ownership, and here too the issue is very real. Recent work by The Data Warehousing Institute (TDWI) shows that data warehouses have very high costs of maintenance and support. Indeed, according to a major TDWI survey, the annual support costs are 72% of the implementation costs, on average. For people used to traditional "15% of build" costs this may seem outlandish, but it is not. The reason that maintenance costs are often around 15% of build costs for transaction systems is that this is roughly the amount of code that is impacted by change each year. Most transaction systems operate in a specific area of the business that does not change radically every week, so much of the application code is stable. By contrast, a data warehouse (by definition) takes as sources many different transaction systems, and every one of these is subject to change. So if you had a data warehouse with six sources, each of which had a 15% degree of change per year, then your warehouse is subject to a 6 * 15% = 90% level of change stress. Of course this is too simplistic, but you can see the general idea: with many sources, each undergoing some change, the warehouse encounters more change issues than any single transaction system. Hence custom-built data warehouses do indeed have these very high levels of maintenance costs. There is not a lot to be done about this unless you use a data warehouse design specifically built to address the issue, but unfortunately the mainstream approaches (3NF, star schema, snowflake schema) do not.
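
As a back-of-envelope check on that arithmetic, here is a tiny Python sketch (the six sources and the 15% change rate are simply the assumed figures from the paragraph above, not survey data). It contrasts the simplistic additive view with the slightly less crude question of how likely it is that at least one source changes in a given year.

```python
# Toy model of change exposure for a warehouse fed by several source systems.
# Assumption (from the text above): six sources, each with ~15% change per year.
source_change_rates = [0.15] * 6

# The simplistic additive view used in the argument above: 6 * 15% = 90%.
naive_exposure = sum(source_change_rates)

# A slightly less crude view: the chance that at least one source changes
# something in a given year, assuming the sources change independently.
p_untouched = 1.0
for rate in source_change_rates:
    p_untouched *= (1.0 - rate)
p_at_least_one_change = 1.0 - p_untouched

print(f"naive additive exposure:       {naive_exposure:.0%}")         # 90%
print(f"chance of at least one change: {p_at_least_one_change:.0%}")  # ~62%
```

Either way you cut it, the warehouse sees far more change than any one of the transaction systems feeding it, which is the point.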

So, to summarize, you can take steps to circumvent at least three of Gartner's four barriers. The fourth one involves tackling human nature, which is a more difficult problem than software design or architecture.

Pure as the driven data

November 24, 2005

In a recent conference speech, IDC analyst Robert Blumstein had some interesting observations about linking business intelligence applications to corporate profitability. Noting how many business decisions are still made based on spurious, incomplete or entirely absent data, he observes that "It's easier to shoot from the hip, in many ways". I found this comment intriguing because it echoes similar ones I have heard before in my corporate career. I remember one of my managers saying that many corporate managers didn't seek data to support their decisions because they felt that using their "judgment" and "instincts" was mainly what they were being paid for. This syndrome was summarized elegantly by historian James Harvey Robinson, who said: "Most of our so-called reasoning consists in finding arguments for going on believing as we already do."

I personally believe that there are very, very few managers who are so gifted that their instincts are always right. The world was always a complex place, but it is ever more so now, with a greater pace of change in so many ways. Hence I believe that being "data driven" is not only a more rational way of responding to a complex world, but one that will lead to greater success in most cases. As the economist John Maynard Keynes said on being questioned over a change of his opinion: "When the facts change, I change my mind — what do you do, sir?". The most impressive managers I have observed are prepared to modify their decisions in the face of compelling new information, even if it contradicts their "experience", which was often built up many years ago in quite different situations.

Making business decisions is hard, all the more so in large organizations where there are many moving parts. There are many insights that good quality data can give that contradict "experience". One customer of ours discovered that some of their transactions were actually unprofitable, which had never come to light since the true costs of manufacturing and distribution were opaque prior to their implementing a modern data warehouse system. All the people involved were experienced, but they were unable to see their way through the data jungle. At another customer, what should have been the most profitable product line in one country was being sold at a loss through one channel, but again the true gross margin by channel was opaque prior to their new data warehouse system; in this case the problem was a poorly designed commission plan that rewarded salesmen on volume rather than profitability. "Data driven" managers will seek to root out such business anomalies through the analysis of hard data: fact rather than opinion.

It is often noted that data warehouse projects have a high failure rate. Of course there are many reasons for this, such as the difficulty most have in keeping up with business change, and the vagaries that beset any IT project. Yet could part of the problem be that, at least in some cases, the people for whom the systems are supposed to provide information would simply prefer to wing it?

Why are most job interviews so bad?

November 22, 2005

We have all been to job interviews, but has it struck you how remarkably random the whole process is? Some companies do put effort in, but I recall interviews where the person interviewing me was clearly bored, had been asked to fill in for someone else, didn't really know what the job was about, etc. If it is a big company they might have a session about the company, e.g. I recall as a graduate going to an interview at Plessey and hearing about the pension plan for an hour; just what every 21 year old is dying to listen to. On the other side of the fence, most of us will have interviewed some surreal people. I had one guy who was clearly on drugs, and one CFO candidate who considered that answering my accounting questions was "beneath him". A colleague had one candidate for a technical author role who, when asked about his prior work experience, jumped up, grabbed and opened the large suitcase he was carrying, revealing a cloud of dust, a horrible musty smell and two maintenance manuals for the ejector seat of an aircraft, which he proceeded to read out loud.

Does it really have to be this way?

I have been studying this in some depth recently, and was pleased to find that there is at least some science around. If you look at various selection techniques, it turns out that "unstructured interviews", i.e. the ones most of us are used to, are actually a pretty dismal way to select people. A major 2001 study looked into various selection techniques and tracked performance at selection against later performance in the role, e.g. how good a candidate's interview was versus how well that candidate was doing in the job a few years later. It turns out that unstructured interviews manage just a 0.15 correlation between interview results and job success, i.e. only a bit better than random (a correlation of 1 is perfect, zero is random, while -1 is perfect inverse correlation). Highly structured interviews, based on job competencies and evidence-based questions (which can be trained), manage a 0.45 correlation. Ability tests on their own manage a correlation of 0.4, and if combined with structured interviews take the number up to 0.65, which although still not perfect was the best score that had been achieved.
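
For readers who want to see what such a correlation figure actually measures, here is a minimal Python sketch with entirely made-up numbers (not data from the study): it computes the Pearson correlation between scores awarded at selection and a later measure of job performance.

```python
import math

def pearson(xs, ys):
    """Pearson correlation: 1 = perfect, 0 = no relationship, -1 = perfect inverse."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Made-up illustrative scores for eight hires: selection score at interview,
# and a performance rating a couple of years into the job.
interview_scores = [3, 7, 5, 8, 4, 6, 2, 9]
job_performance  = [5, 6, 4, 7, 6, 5, 4, 8]

print(f"correlation: {pearson(interview_scores, job_performance):.2f}")
```

Run over a real candidate pool, a figure near 0.15 means the interview told you almost nothing about who would actually perform; the 0.65 achieved by structured interviews plus ability tests is a very different proposition.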

Ability tests take various forms. The best of all are ones that are directly related to the job in hand, e.g. actually getting someone to do a sales pitch if they are a salesman, or to write some code if they are a programmer. For more on one of these see an earlier blog. The most creative of these I heard about was an interviewer for a sales position who asked each candidate to sell him the glass of water in front of him on his desk. One candidate went to the waste-paper bin by the desk, pulled out a match, set fire to the paper inside and then said "how much do you want for the water now?". Generally less creative approaches are adequate, and at Kalido we use a software design test for all our software developers, which enables us to screen out a lot of less gifted candidates, saving time both for us and for the candidates.

General intelligence tests also score well since, all other things being equal, bright people do better in a job than those less bright; studies show that this applies across all job disciplines (yes, yes, you can always think of some individual exception you know, but we are talking averages here). The 0.4 correlation with job success that these tests provide is a lot better than the 0.15 which most interviewing manages. Personality profiles can be used to supplement these, as for some types of job research shows that certain personality types will find the work more comfortable than others. For example a salesman who hated rejection, didn't enjoy negotiating, disliked working on his own and was pessimistic might still be a good salesman, but would probably not be a very happy one. You don't have to invent such profiles and tests: there are several commercially available ones, such as those we use at Kalido from SHL.

The cost/benefit case for employing proper interview training and such tests is an easy one to make: the cost of a bad job hire is huge just in terms of recruitment fees, never mind the cost of management time in sorting it out, the opportunity costs of the wasted time etc. Yet still most software companies don’t employ these inexpensive techniques. Perhaps we all like to think our judgment of people is so great that other tools are irrelevant, yet remember that 0.15 correlation score. There may be a few great interviewers out there, but most people are not, and by supplementing interviews by other tools like job-related tests and good interview training we can improve the odds of hiring the best people. I used to work at Shell, who did a superb job of structured interview training, and I recall being trained for several days, including video playback of test interviews, on how to do a half-hour graduate interview. This may sound like a lot of effort, but it is trivial compared to the cost of a bad hire.

Many software companies seem to be missing a trick here. When I applied for jobs as a graduate I recall virtually every large multi-national had an extensive selection process including ability tests, yet in the software industry, where almost all the assets are its people, such things seem rare. I was amused to hear a recruitment agency whining at me for our use of screening tests at Kalido: “but only software companies like Microsoft and Google do tests like that”. I rest my case.

A bit rich

November 21, 2005

You may have seen SAP's latest advertising campaign, which breathlessly claims that "A recent study of companies listed on NASDAQ and NYSE found that companies that run SAP are 32% more profitable than those that don't*. Based on a 2005 Stratascope Inc. Analysis". There are at least three interesting features in this advert. The first is that little asterisk at the end. If you read the (small font) footnote you will see that the study excludes all financial services companies. A little odd, until you realize that financial services companies these days have two particular characteristics: they make a lot of money, and they rarely use SAP. Could their inclusion have, perhaps, changed the results somewhat? Of course one could ask Stratascope, the market research firm whose chairman and president, Juergen Kuebler, is a nine-year SAP veteran, and I'm sure they would give an unbiased and objective opinion. I am going to take a wild guess and say that including financial services companies would not make the figure look better.

However, by far the most interesting aspect of this advert is its sheer chutzpah, with its implication that if you use SAP then you will be more profitable: 32% more, in fact. Lest the subtleties of statistics have escaped the denizens of Walldorf, I would like to remind them that just because two datasets have a positive correlation, it does not follow that one causes the other. For example, I observe that my increasing age is well correlated with the steady rise in global temperatures. As far as I know, there is no direct link. Similarly, one could observe: "the stork population has gone up, as has the human population. Hence storks must create human babies". Likewise, I can tell you that four of the UK's five most admired companies last year were Kalido customers (true), yet to say that one implies the other is absurd.
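
The age-versus-temperature example is easy to reproduce. Here is a tiny Python sketch (the numbers are invented purely for illustration) showing that any two series that merely trend upward over time will correlate almost perfectly, with no causation anywhere in sight.

```python
from statistics import correlation  # available in Python 3.10+

years = list(range(1990, 2005))
my_age = [y - 1960 for y in years]                        # rises by one each year
temp_anomaly = [0.20 + 0.02 * (y - 1990) for y in years]  # made-up smooth warming trend

# Two unrelated quantities, both trending upward, correlate perfectly.
print(f"correlation: {correlation(my_age, temp_anomaly):.2f}")  # prints 1.00
```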

It is particularly implausible to make such bold claims in the case of IT systems of any kind, which may well have distinct and real benefits in individual cases and projects, but whose influence on overall productivity has generally eluded economists entirely. Such studies as exist are controversial, e.g. the 2002 McKinsey study which found that, outside a few sectors (retail, securities, telco, wholesale, semiconductors, IT), there had been no productivity growth whatever between 1995 and 2000 in the US despite heavy IT investment. And that study was looking at all IT investment, of which ERP is only a small fraction.

So overall, SAP's advert excludes a key industry sector to selectively improve its results, and in any case makes a claim that is logically spurious and has no supporting evidence. Other than that, excellent. All par for the course in software industry marketing.

Uncomfortable bedfellows

November 18, 2005

It is rare to find the words "ethical" and "software company" in the same sentence. The industry has managed to become a byword for snake oil, aggressive pricing and sneaky contract terms. Years ago when working at Exxon I recall one vendor who sold Esso UK some software, rebadged the product as two separate products and then tried to charge Esso for the "other" product, which of course they had already bought. Needless to say I was having none of that, but the very notion that they would try this spoke volumes about their contempt for the customer.

The prize (so far) goes to one of my colleagues, who used to work for a software company that once sold a financial package to a customer on the basis that it had a particular module. The only problem was that the module did not exist. He was asked to set up a "demo" of the software for the customer which sounds like something out of "The Office". In one room sat the customer at a screen, typing in data and requesting a report from an (entirely fictitious) pick list of reports that the vendor was supposed to have built but had not. In the next room sat a programmer. When the customer pressed "enter" the data would appear in a table, and the programmer quickly hand-edited a report format using the customer's data, which was then "off to the printer". A couple of minutes later the report was brought in to the customer, who could then see the new reporting module in action. The slow response time was explained away by an "old server". Lest you think this was some fly-by-night operation, this major provider of financial software had over USD 100 million in revenue back in the early 1990s, which is when this particular scam was perpetrated. And yes, they closed the deal.

As if to prove that enterprise software companies are still amateurs when it comes to dubious behavior, Sony has just made all the wrong headlines by placing what is essentially a clever virus on its CDs, purportedly to prevent digital copyright violations. The software installs itself surreptitiously on your PC and, quite apart from preventing unauthorized copying of music, also broadcasts back to Sony what music you have been listening to. Apparently millions of PCs may have been infected, and only after several refusals has Sony now agreed to stop producing the spyware. Just which corporate manager at Sony thought this was a really bright idea that no one would figure out is yet to emerge. However it is safe to say that Sony's PR agency are not having a quiet run-up to Christmas right now.

I’d be interested to hear about any reader’s experiences of outrageous software company behavior.

Less is more when it comes to innovation

A survey by the Economist Intelligence Unit (sponsored by PWC) just released today has a very interesting finding that backs up something I have written about before: when it comes to innovation, don’t look for it in large companies.

In answer to the question:

“Small or start up competitors are more likely than large, established companies to create breakthrough products or business models”, no less than 70% of senior executives “agreed” or “strongly agreed”, with only 10% disagreeing. Given the vastly greater resources and R&D budgets available to large companies, why the dearth of innovation there?

It is easy to argue that bureaucracy is the cause, but I think there is another reason that I have not seen written about. I had some dealings with Oracle in the 1990s when they were concerned about the emergence of object databases, and they wanted customer input as to whether this was a real threat to them. What struck me in several meetings in Redwood City, as I met a range of senior Oracle technologists, was that the most impressive people were the ones working on the database kernel, the core of the Oracle product. Less impressive were those working on the applications, and least impressive of all were some of those working on the tools layer above. This makes sense: if you are a top developer and join Oracle then you probably want to work on the crown jewels. Similarly, in my dealings with my favorite Walldorf-based ERP vendor I have found the best people to have worked on the Basis layer, the next best on the modules, and the least impressive on the peripheral tools. Again, the key to SAP's success has been its integrated ERP system, so it is hardly surprising that the top people gravitate there. Moreover the area which made the company initially successful is probably the one where the greatest understanding of customer issues resides. The farther you move away from this core, the less likely it is that the best people will be working there, and also the less likely it is that the senior executives (who built the company in the first place around a core technology) will grasp the opportunity and back innovation. Hence the ideas leak out of the company as those passionate about them leave to set up start-ups.