Marketing blues

My prize for the most creative marketing jargon of the week goes to IBM, who announced that they now consider their offerings to be a “third generation” of business intelligence.  Come again?  In this view of the world, first generation BI was mainframe batch reporting, while the second generation was data warehousing and associated BI tools like Cognos, Business Objects etc.  So, as you wait with bated breath for the other shoe to drop, what is the “new generation”?  Well, it would seem that this should include three things:

(a) pre-packaged applications

(b) focus on the access and delivery of business information to end users, and support both information providers and information consumers

(c) support access to all sorts of information, not just that in a data warehouse.

On (a), this is certainly a handy definition, since IBM just happens to provide a series of pre-built data models (e.g. their banking data model) and so (surprise) would satisfy the first of these criteria.  It is in fact by no means clear how useful such packages are outside of a few specific sectors that lend themselves to standardisation.  Once you take a pre-existing data model and modify it even a little (as you will need to) then you immediately create a major issue for how you support the next vendor upgrade.  This is indeed a major challenge that customers of the IBM banking model face.  Nothing in this paper talks about any new way of delivering these models, e.g. any new semantic integration and versioning capability.

Criterion (b) is essentially meaningless, since any self-respecting BI tool could reasonably claim to focus on information consumers.  After all, the “universe” of Business Objects was a great example of putting user-defined terminology in front of the customer rather than just presenting tables and columns.  Almost any existing data warehouse with a decent reporting tool could claim to satisfy this criterion.

On (c) there is perhaps a kernel of relevance, since there is no denying that some of the information businesses need is not kept in a typical data warehouse, e.g. unstructured data.  Yet IBM itself does not appear to have any new technology here, but is merely claiming that DB2 Data Joiner allows links to non-DB2 sources. All well and good, but this is not new. They haven’t even done something like OEM an unstructured query product like Autonomy, which would make sense.

Indeed all that this “3rd generation” appears to be is a flashy marketing label for IBM’s catalog of existing BI-related products.  They have Visual Warehouse, which is a glorified data dictionary (now rather oddly split into two separate physical stores) and scheduling tool, just as they always have.  They talk about ETI Extract as an ETL tool partner, which is rather odd given their acquisition of Ascential, which was after all one of the two pre-eminent ETL tools, and given ETI’s near-disappearance in the market over recent years.  They have DB2, which is a good database with support for datatypes other than numbers (just like other databases).  They also have some other assorted tools like Vality for data quality.

All well and good, but this is no more and no less than they had before. Moreover it could well be argued that this list of tools actually misses several capabilities that could be regarded as important in a “next generation” data warehouse architecture.  The paper is oddly silent on the connection between this and master data management, which is peculiar given IBM’s buying spree in this area and its direct relevance to data warehousing and data quality.  There is nothing about time-variance capabilities and versioning, which are increasingly important.  What about the ability to handle a federation of data warehouses and synchronise these?  What about truly business model-based data warehouse generation and maintenance?  How about the ability to be embedded into transactional systems via SOA?  What about “self discovery” data quality capabilities, which are starting to appear in some start-ups?

Indeed IBM’s marketing group would do well to examine Bill Inmon’s DW 2.0 material, which while not perfect at least has a decent go at setting out some of the capabilities which one might expect from a next generation business intelligence system.

There is no denying that IBM has a lot of technology related to business intelligence and data warehousing (indeed, its buying spree has meant that it has a very broad range indeed).  Yet there is not a single thing in this whitepaper that constitutes a true step forward in technology or design.  It is simply a self-serving definition of a “3rd generation” that has nothing to do with limitations in current technology or new features that might actually be useful.  Instead it just sets out a definition which conveniently fits the menagerie of tools that IBM has developed and acquired in this area. To put together a whitepaper that articulates how a series of acquired technologies fits together is valid, and in truth this is what this paper is.  To claim that it represents some sort of generational breakthrough in an industry is just hubris, and destroys credibility in the eyes of any objective observer.  This is by no means unique in the software industry, but is precisely why software marketing has a bad name amongst customers, who are constantly promised the moon but delivered something a lot more down to earth.

I suppose when presented with the choice of developing new capabilities and product features that people might find useful, or just relabelling what you have lying around already as “next generation”, the latter is a great deal easier.  It is not, however, of any use to anyone outside a software sales and marketing team.



Contrasting fortunes

For quite some time now I have been banging on (blogging on?) about the gradual but steady progress that Microsoft has been making with its business intelligence offerings, and how these pose a long term threat to the pure-play BI vendors that seems to me underestimated by most.  Of course opinions are one thing but hard data is another, so it is nice to be able to put some numbers behind this thesis (“smug mode”, as they would say on “Red Dwarf”).  In the last year Microsoft’s BI offerings grew in revenue by a very healthy 36%.  Compare and contrast this progress with that of the pure play vendors.  Cognos just announced its quarterly results, which showed license revenue down 1% on a year on year quarterly basis, and down 7% overall (modest growth overall came from its services business). Cognos is still generating healthy profits at a 12.8% operating margin, which although down from 15.5% last year is certainly very respectable. But the underlying difficulty in selling more licenses in what is supposed to be a fast growing BI market shows that something is not right.  Business Objects has managed slightly better sales growth but at the cost of halving its profitability over the last year, while Hyperion has managed 8.9% growth in revenue with just a small slip in profitability.  Of these, Hyperion with its financial consolidation speciality is the least threatened by the fairly low-end Microsoft BI tools.

Microsoft may not be particularly agile, but it is chipping away steadily at the business of the pure-play BI players, and I can only see this trend continuing.                                                                     


When you are in a hole, stop digging

SAP continues to score own goals around its troubled MDM offering.  At a Las Vegas conference the usually well-informed and ever-urbane Shai Agassi explained to attendees how MDM was critical to the success of SOA.  All true, but this runs up against the unfortunate problem that SAP’s own MDM offering is, to put it politely, not one of the current stars of their software firmament.  As noted in the article, despite SAP’s huge customer base and famed account control, they are hard pressed to rustle up a few customer references.  It is actually worse than that.  I know for a fact that an SAP customer was, just weeks ago, told that the SAP MDM offering would not be ready for effective production use for at least 18 months.  The customer was not told that by one of SAP’s competitors, but by SAP themselves at a private briefing in Walldorf. It seems that this news has yet to fully spread even amongst SAP.  The SAP MDM product marketing manager is quoted as saying the customer base is “quietly increasing”, which in the usually ultra-spun lingo of product managers translates as less than wild enthusiasm and delight.  IBM Global Services make a stack of money from implementing SAP, so when they say “it’s very difficult to find the customers using the software” then they probably mean it. 

I found it pretty revealing when SAP didn’t even show up at either of the CDI/MDM conferences run earlier this year by Aaron Zornes (one in San Francisco, one in London). This at least seemed a prudent approach.  Let’s face it, all software companies have some things go better than others, so if you are struggling with a particular product then it seems sensible to keep your head down, avoid publicity and get on with fixing the issues rather than calling attention to the problem.  This is what makes this high profile “MDM is key” speech seem either very bold indeed, or a bit of a gaffe (or, as they say at Real Madrid, a bit of a Woodgate). 



Deja Vu

Cordys has just released the latest version of its Composite Application Framework, but rather oddly labels part of the offering a master data management framework. Cordys is certainly an interesting company, set up by the founder of Baan with the idea of providing an application-neutral, true SOA middleware stack.  What is striking is the sheer scale of the investment, with USD 135 million sunk so far (and counting) and 550 employees.

Cordys looks to have developed a very scalable middleware platform from the ground up, but from reading their material quite carefully I cannot really see anything that looks to me like a true master data management application.  There does not appear to be any real semantic integration capability within Cordys, and indeed the otherwise positive Ovum report that their web site points to notes that data integration is the one limited area of their software.  I suspect that they have just hooked onto the MDM bandwagon as so many other vendors have done, feeling they need to appear to have an offering in a trendy space.  Certainly master data is important to how composite applications are going to work, and the enterprise service bus that Cordys provides will need to be fed with consistent master data from somewhere in order to be effective, yet their software does not appear to have a lot to offer in this area so far.

The major challenge to Cordys is the sheer ambition of what they have produced.  A complete middleware stack is not something which enterprises will lightly contemplate introducing, especially as it would seem that Cordys’ productivity message relies on using their stack wholesale.  Yet large companies already have huge investments in middleware from IBM, SAP, Oracle and Microsoft, as well as with independents like Tibco and the like, and are going to be very reluctant indeed to swap these out, however appealing a new framework is.  It is noticeable that Cordys’ customers so far are virtually all in the Netherlands, and are generally quite small companies.  I couldn’t see a really large enterprise in the customer list on their web site.  This does not surprise me, since the prospect of swapping out infrastructure is going to be a daunting one for any large company.  Mr Baan’s pockets are presumably very deep, but I can’t believe that 550 employees can be fed and watered on a diet of Dutch SMEs, so it will be critical for Cordys to break out of the local market and start landing some major enterprises at some point, just as Baan originally did with Boeing.

There is a worrying parallel here with Asera, a start-up of the internet boom set up in 1998 by ex-Oracle staff with massive (USD 175 million) venture capital backing, whose aims were rather similar to Cordys in ambition.  This company similarly hit the brick wall of entrenched middleware in large companies, and duly died an unpleasant death in 2003. This is one investment that you will not hear Kleiner Perkins boasting about.  It is to be hoped that Cordys fares better, but it has an uphill battle, in my view.

Agent Smith goes to London

On the 13th and 14th of September there was a business intelligence forum in London run by a fairly new organisation called “Obis Omni” (no, I don’t know what it means either). I was a speaker at the event, which was quite well attended given that it is a rather new conference.  The customer attendees seemed to be as billed, i.e. people really involved in BI projects (some events seem to struggle to keep focus), and the conference seemed generally very successful based on the conversations that I had with assorted attendees.

What was rather endearing and a little scary was the sheer efficiency of the conference administration. The event ran very tightly to time, and there seemed to be armies of helpers to guide you around, all rather disturbingly fitted out with communication devices in their ears just like those of Agent Smith in the Matrix. Indeed the only criticism would be that they were a little over-enthusiastic at times.  After my talk I was speaking to a delegate in the corridor, when one of the Agent Smith types came up, interrupted our conversation and said “you are due to attend session X now Ms Jones, please come along”.  This wasn’t a speaker who was late, you understand, just a delegate.  God forbid that an unauthorised corridor conversation should take place during session time.  The delegate looked as stunned as I was and was led meekly away to her session without putting up a fight.

I can never see this kind of thing catching on in Italy or Spain.  At the ETRE conference the tragically mistitled “organisers” struggle to keep sessions within half a day of schedule, and generally mooch around in a resigned if amiable state of chaos.  Here an eerie calm was the order of the day (come to think of it, I never did see that delegate again…)

The pre-conference administration and exhibit set-up was as spookily efficient as everything else, with briefings just after dawn for exhibitors and, it has to be said, nicely set out booths with careful traffic flow.  I even had a new experience of having my slides lightly censored.  My crime was using a (fully credited) Bloor slide to show an overview of the market, and a chart which listed several relational databases in one of the bullets.  Apparently this violated rule 438 subparagraph (c) in the conference rulebook about vendor promotion.  Well, I have nothing to do with Oracle, IBM etc, and just put them in a list as examples, so I am a little hazy as to how exactly this was “promotion” (DB2 is a database, shock horror), but the offending slides were duly excised from the presentation, and probably ceremonially burnt as well.  I actually think it is quite admirable that they would go through the slides and try to ensure there were no blatant vendor sales pitches, but this did seem just a tad over-zealous.

Anyway, enough teasing.  I would much rather that an event ran with military efficiency than collapsed in a shambolic heap, so congratulations to the organisers for arranging such a slick conference.  If only they could be persuaded to take over the British railway system….




Did data quality just get more interesting?

The data quality market has been a strange beast.  The problem is huge, with just about every company in the world having issues regarding data quality.  Duplicate customer information is the area that is most familiar, but the problem can be a lot more serious than a few bits of duplicated junk mail.  More than a decade ago a Shell exploration drill neatly drilled into part of an existing drilling operation because the co-ordinates of the existing pipe were incorrect in a database.  Fortunately there was no oil flowing through the pipe at the time or a major spillage would have occurred, but even so the rig was off-line for some time, at a cost of about half a million dollars a day even then. I can certainly testify that a large chunk of every data warehouse project is spent dealing with data quality issues, even when the data comes from supposedly authoritative sources like ERP systems. 

Yet despite a real problem and real dollars attached to it, the data quality software market is something of a minnow.  Forrester estimates it at USD 1 billion, and that includes consulting. The software part of the market is maybe USD 200 million.  Over recent years pure play vendors have been bought up and incorporated into broader offerings, e.g. Vality by Ascential (now IBM), FirstLogic by Business Objects, Similarity Systems by Informatica, Trillium by Harte Hanks.  The price of FirstLogic, USD 69 million for a business with USD 50 million in revenue, was hardly something to make investors salivate (to be fair, Similarity’s price tag of USD 49 million was at a much better multiple, and there were some peculiarities around FirstLogic).  Why should this be?  Part of the problem is that data quality remains a resolutely human problem, which typically means that when validation rules are defined in a data quality tool, only part of the problem surfaces.  Business processes around who actually owns the data and who can fix problems are as big an issue as actually finding errors in the data itself.

It is encouraging to see a couple of start-ups take a new approach to this old chestnut.  Though differing somewhat in the technologies used, both try to discover relationships between existing data by analysing existing datasets rather than relying on building up a top-down rules profile. Exeros just secured a USD 12 million series B funding round, and has high quality venture backing from Globespan Capital Partners and Bay Partners. A rival company with a similar approach is Zoomix.  Based on technology developed in the Israeli military, Zoomix uses data mining techniques to seek out the relationships amongst existing data, presenting its first attempt to a business analyst and then learning from the responses so as to improve the algorithms in future iterations.  They also have an interesting new product which can apply these rules in real time to an existing transaction system, called up as an SOA service.  This effectively acts like a firewall, but for data quality. Zoomix has customers in Israel and Europe, and has set up a European HQ in London.  These newcomers present fresh competition to the largest vendor (Trillium) as well as to other start-ups such as Datanomic (based in Cambridge – the original one rather than the Massachusetts one) and more specialist quality tools such as Silver Creek, who have taken a tightly targeted approach, in their case dealing with complex product data.
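For the curious, the bottom-up idea is easy to sketch.  Rather than an analyst writing validation rules up front, the tool infers the dominant patterns from the data itself and flags whatever does not conform; the analyst then merely confirms or rejects the suspects.  The snippet below is purely my own illustration of the principle – the function names and the 90% coverage threshold are invented for the example, and bear no relation to the actual Exeros or Zoomix implementations:

```python
from collections import Counter

def infer_pattern(value: str) -> str:
    """Collapse a value into a shape: runs of digits become '9',
    runs of letters become 'A', punctuation is kept as-is."""
    out = []
    for ch in value:
        cls = "9" if ch.isdigit() else ("A" if ch.isalpha() else ch)
        if not out or out[-1] != cls:
            out.append(cls)
    return "".join(out)

def profile_column(values, threshold=0.9):
    """Learn the dominant pattern(s) from the data itself, with no
    hand-written rules, then flag values that do not conform."""
    patterns = Counter(infer_pattern(v) for v in values)
    accepted, covered = set(), 0
    # Accept the most common shapes until they cover `threshold` of rows;
    # everything outside the accepted shapes is a quality suspect.
    for pattern, count in patterns.most_common():
        accepted.add(pattern)
        covered += count
        if covered / len(values) >= threshold:
            break
    suspects = [v for v in values if infer_pattern(v) not in accepted]
    return accepted, suspects
```

Run against a column of, say, phone numbers, this learns that “9-9-9” (digits-dash-digits-dash-digits) is the norm and surfaces the odd unformatted entry for review.  A real product layers cross-column relationship discovery and analyst feedback on top of this sort of profiling, but the inversion – rules derived from data, not data checked against rules – is the essence of the approach.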

Investors in these companies have clearly seen that, while data quality may be more of a component in a broader solution (BI, MDM, whatever) there is enough in the way of a real problem here to allow room for innovative approaches.  The exits for these companies may well be acquisition by larger companies, just as with Similarity Systems, but it is good to see a fresh approach being taken to this very real, if not exactly sexy, business problem.   


Phishing with no bait

I’m sure you have received by now many plausible looking emails from assorted banks, many of which you do not have an account with, saying something like: “we need to update some information due to a possible security breach: please click here and give us your details so we can suck all the money out of your account you dimwit”.  Sometimes the phrasing is a little different, especially at the end, but you get the general idea.  I must admit the first time I saw one of these I thought for several seconds before hitting the delete key, but now they are ten a penny and we all ignore them in the same way we ignore unsolicited emails saying “I love you; please click on this attachment to find out more, sucker”.

With this in mind I have been wondering when the next enterprising criminal would raise the bar on phishing emails of this type and manage to construct something original and plausible, tempting yet authoritative.  E-criminals and the authorities are a little like cheetahs and gazelles, locked in a never-ending battle of wits, so what is the next turn of speed that phishing can offer?

I am pleased to declare that it was not the email I just received, purportedly from “Citibank”.  The first clue was the title: “Citibank Account Informations” (sic).  Banks have their flaws, but they usually manage to master basic spelling in the title of their communications – “informations”?  The next clue that this could possibly be less than legitimate was that instead of taking the minimal trouble of copying something like a Citibank logo from their web page, which let’s face it takes about five seconds, these jokers managed no logo at all and an email with an entirely yellow background, rather than the Citibank corporate blue and white.

The text itself declares that “Citibank account is about to expire”.  Not quite English either, but also guys: expire?  Bank accounts may do many things, have terms and conditions changed, interest rates updated etc, but one thing that they never, ever do is expire.  I can just see the bank advertising campaign for one of these now: “open an account with us, give us your money, but don’t wait too long before accessing it as it will expire; sorry”.  Even if they said it really quickly as they do at the end of radio adverts I think this would be hard to pull off.  

The email concludes with two lines of text which contain a further two grammatical errors and a couple of capital letters used incorrectly.

Whatever happened to criminal ingenuity?  It makes me positively misty eyed about Nigerian 419 scam letters, where at least you don’t expect the English to be perfect.  Maybe they have started to outsource these scam emails but are having teething troubles with quality control?




Back from the dead

Those of you with long memories will recall that the first three real ETL vendors were Prism, Carleton and ETI.  The others were acquired, but ETI survived as an independent company, though with an ever-diminishing profile.  Early this year they were apparently down to just one salesman, but having the US Department of Defense as a customer does wonders for maintenance revenue.  In recent years the company had been pared down to a minimum, and I had assumed that, like an old soldier, they might just fade away.  However in the summer the original investors were bought out and new capital injected, via a USD 6.5 million round from investors Appian Ventures of Denver, Access Venture Partners and Osprey Ventures, and a $5 million line of credit negotiated with Comerica bank.  Consequently the company is effectively born again, with new money, owners and management, but with established technology.

ETI’s software used to be strong at extracting data from esoteric sources, generating code against things like COBOL copybooks and assorted mainframe file systems, as well as having the usual transformation capabilities.  It suffered from being rather complex to use and from some weak marketing.

So, the interesting question is whether this old warhorse can be dusted off, repainted and revitalised.  Judging by the seven vice presidents that have appeared in the management ranks, the new board is not afraid to spend some of that new money.  They have also licensed in some data quality offerings from a couple of small British companies, which is a logical step to broaden the product range from just ETL.  This is important because ETL on its own is a tough market, as ETI has discovered.  More and more ETL functionality is being thrown into the database (Microsoft with SSIS – previously DTS – and Oracle with the poorly named Warehouse Builder tool, which is really an ETL tool), which makes it hard work to persuade a customer to buy your technology.  Only Sunopsis has really made much headway here in recent times, with a clever pitch built around using rather than competing with the database capabilities.  Other pure plays like Sagent have withered and died.  Informatica is really the only ETL player of size left standing, and they have broadened their appeal by going for a wider integration message.  So what has changed that will allow ETI to flourish now when it clearly has not done so for some time?  Perhaps the new investors have noticed a flurry of companies being bought out in the data quality space recently and so can see a fairly quick exit, perhaps there is just too much venture capital around, or maybe they have more ambitious plans for the company.

ETI certainly has some well proven technology, and its foray into data quality looks logical.  Good luck to them.  Yet relaunching a company is hard work, and it will take some impressive sales and marketing execution to breathe new life into this particular body.

Cognos treads water

I have written before about how the pure play vendors face considerable pressure in what is a more saturated market than they believe.  Further evidence of this theory being borne out comes with the latest results from Cognos, which although on the surface looked OK with an 8% revenue increase, in fact hid a year on year decline in license revenue. Ten deals of $1 million in the quarter is also down from last quarter. The company announced that it was laying off 210 staff, though it continues to invest in its sales force.  The sales force managed to sell $210k of license per rep in the last quarter, which is not exactly stellar given Cognos’s well-known brand and its position as the number 2 player in the BI space.

Profit margins are a still healthy 13%, but this compares to a peer group average of 16.3%, and full year profit is actually down by nearly 30% compared to the previous year.  Though the stock price is off the lows it hit when the SEC investigation was announced, it can be seen that this has not been a great year for investors in Cognos stock, who would have been much better off with an index tracker.

[Chart: Cognos share price]

In case you are wondering, Business Objects’ share price has not exactly been lighting up the night sky either.  They have annual sales growth of 12%, but profits have shrunk 46% year on year:

[Chart: Business Objects share price]

I am sticking to my theory that all the pure-play BI vendors will continue to face a difficult environment for the foreseeable future. 


Keep it cranky

I came across a really inspired blog the other day which I would highly recommend that you read.  The Cranky Product Manager is written by an anonymous American product manager at a software company.  There are several fine aspects to the blog, not least of which is that it is well written (let’s face it, too many blogs out there look like they were written by a dyslexic 12 year old).  However the best aspect is that her anonymity enables her to be delightfully rude about many aspects of the merry-go-round that is the software industry.  Her post “Streetwalkers in Disguise” is a delightful example of this.

Those who have worked for some time in the software industry will have many a wry smile at the trials and tribulations of the Cranky PM, whose writing clearly reflects the very realities of product management, rather than some ultra-spun anodyne story that is so often fed to eager journalists as an “insider” story but is just clever PR.  

As someone who wades through more blogs than I care to admit to, I wish more blogs were like this one.