Andy on Enterprise Software

Leaving Las Vegas

May 16, 2015

The Informatica World 2015 event in Las Vegas was held as the company was in the process of being taken off the stock market and into private ownership by the private equity firm Permira and a Canadian pension fund. The company was still in its quiet period, so was unable to offer any real detail about this. However, my perception is that one key reason for the change may be that the company executives can see a growing industry momentum towards cloud computing. This is a challenge to all major vendors with large installed bases, because the subscription pricing model associated with the cloud presents a considerable challenge as to how vendors will actually make money compared to their current on-premise business model. A quick look at the finances of publicly held cloud-only companies suggests that even these specialists have yet to really figure this out, with a sea of red ink in the accounts of most. If Informatica is to embrace this change then its profitability is likely to suffer, and private investors may offer a more patient perspective than Wall Street, which is notoriously focused on short-term earnings. It seems to me unlikely that there will be any real change of emphasis around MDM from Informatica, given that MDM appears to be its fastest growing business line.

On the specifics of the conference, there were announcements from the company around its major products, including its recent foray into data security. The most intriguing was the prospect of a yet-to-be-delivered product called “live data map”. The idea is to allow semantic discovery within corporate data, and to let end-users vote on how reliable particular corporate data elements are, rather as consumers rate movies on IMDb or rate each other on eBay. This approach may be particularly useful as companies have to deal with “data lakes”, where data will have little or none of the validation that would (in theory) be applied in current corporate systems. The idea is tantalising, but this was a statement of direction rather than a product ready for market.
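To make the voting idea concrete, a crowd-sourced reliability score for a data element might work along the lines of the minimal sketch below. This illustrates the general concept only; the product was unreleased, so nothing here reflects its actual design.

```python
from collections import defaultdict

class DataElementRatings:
    """Illustrative sketch of crowd-sourced reliability scores for
    corporate data elements, IMDb-style. Purely hypothetical; not
    based on any actual "live data map" design."""

    def __init__(self):
        self._votes = defaultdict(list)  # element name -> votes on a 1-5 scale

    def vote(self, element, score):
        if not 1 <= score <= 5:
            raise ValueError("score must be between 1 and 5")
        self._votes[element].append(score)

    def reliability(self, element):
        # Average vote, or None if nobody has rated the element yet
        votes = self._votes.get(element)
        return sum(votes) / len(votes) if votes else None

ratings = DataElementRatings()
ratings.vote("customer_address", 4)
ratings.vote("customer_address", 2)
print(ratings.reliability("customer_address"))  # 3.0
```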

The thing that I found most useful was the array of customer presentations, over a hundred in all. BP gave an interesting talk about data quality in the upstream oil industry, which has typically not been a big focus for data quality vendors (there is no name and address validation in the upstream). Data governance was a common theme in several presentations, clearly key to the success of both master data and data quality projects. There was a particularly impressive presentation by GE Aviation about their master data project, which had to deal with very complex aeroplane engine data.

Overall, Informatica’s going private should not have any negative impact on customers, at least not unless its executives take their eye off the ball amid the inevitable distractions associated with new ownership.

The Private Side of Informatica

April 8, 2015

Yesterday Informatica announced that it was being bought, not by a software firm but by the private equity company Permira. At $5.3 billion, this values the data integration vendor at over five times the billion dollars of revenue that Informatica saw in 2014, compared to a recent industry average of 4.4. This piece of financial engineering will not change the operational strategy for Informatica. Rather, it is a reflection of a time when capital is plentiful and private equity firms are feeling bullish about the software sector; Tibco and Dell have followed a similar route. Company managers will no longer have to worry about quarterly earnings briefings to pesky financial analysts, and will instead be accountable only to their new owners. However, private equity firms seek a return on their investment, usually leverage plenty of debt into such deals (debt is tax-efficient compared to equity), and can be demanding of their acquisitions. From a customer viewpoint there is little to be concerned about. One exit for the investors will be a future trade sale or a return to the stock market, so this deal does not in itself change the picture for Informatica in terms of a possible acquisition by a bigger software company one day.

SAS Update

June 7, 2014

At a conference in Lausanne in June 2014 SAS shared their current business performance and strategy. The privately held company (with just two individual shareholders) had revenues of just over $3 billion, with 5% growth. Their subscription-only license model has meant that SAS has been profitable and growing for 38 years in a row. 47% of revenue comes from the Americas, 41% from Europe and 12% from Asia Pacific. They sell to a broad range of industries, but the largest in terms of revenue are banking at 25% and government at 14%. SAS is an unusually software-oriented company, with just 15% of revenue coming from services. Last year SAS was voted the second best company globally to work for (behind Google), and staff attrition is an unusually low 3.5%.

In terms of growth, fraud and security intelligence was the fastest growing area, followed by supply chain, business intelligence/visualisation and cloud-based software. Data management software revenue grew at just 7%, one of the lowest rates of growth in the product portfolio. Cloud deployment is still relatively small compared to on-premise but is growing rapidly, and is expected to exceed $100 million in revenue this year.

SAS has a large number of products (over 250), but gave some general information on broad product direction. Its LASR product, introduced last year, provides in-memory analytics. They do not use an in-memory database, as they do not want to be bound to SQL. One customer example given was a retailer with 2,500 stores and 100,000 SKUs that needed to decide what merchandise to stock its stores with, and how to price locally. It used to analyse this in an eight-hour window at an aggregate level, but can now do the analysis in one hour at an individual store level, allowing more targeted store planning. The source data can come from traditional sources or from Hadoop. SAS have been working with a university to improve the user interface, starting from the UI design and working backwards from it, rather than producing a software product and then bolting on a user interface as an afterthought.

In Hadoop, there are multiple initiatives from both major and minor suppliers to apply assorted versions of SQL to Hadoop. This is driven by the mass of SQL skills in the market compared to the relatively tiny number of people who can fluently program using MapReduce. Workload management remains a major challenge in the Hadoop environment, so a lot of activity has gone into integrating the SAS environment with Hadoop. Connection is possible via HiveQL. Moreover, SAS processing is being pushed down to Hadoop via MapReduce rather than extracting the data; a SAS engine is placed on each cluster to achieve this. This includes data quality routines such as address validation, directly applicable to Hadoop data with no need to export it from Hadoop. A demo was shown using the SAS Studio product to take some JSON files, do some cleansing, and then use Visual Analytics and In-Memory Statistics to analyse a block of 60,000 Yelp recommendations, blending this with another recommendation data set.
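SAS’s own connectors aside, the general SQL-on-Hadoop pattern described here is easy to illustrate. Below is a minimal sketch in Python using the open-source PyHive client; the host, credentials and yelp_reviews table are assumptions for illustration, not details from the demo.

```python
# Minimal sketch of querying Hadoop data via HiveQL with PyHive.
# Host, port, username and table name are illustrative assumptions.
from pyhive import hive

conn = hive.Connection(host="hive.example.com", port=10000,
                       username="analyst", database="default")
cursor = conn.cursor()

# Aggregate inside Hadoop rather than extracting the raw data,
# mirroring the push-down approach described above.
cursor.execute("""
    SELECT business_id, AVG(stars) AS avg_stars, COUNT(*) AS n_reviews
    FROM yelp_reviews
    GROUP BY business_id
    HAVING COUNT(*) >= 10
""")
for business_id, avg_stars, n_reviews in cursor.fetchall():
    print(business_id, round(avg_stars, 2), n_reviews)

cursor.close()
conn.close()
```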

Informatica Update

May 22, 2014

At Informatica World in Las Vegas recently the company made a number of announcements. The label “Intelligent Data Platform” was used to encompass the existing technology plus some new elements around metadata. The key new element was software that helps end-users to provision data directly, in an attempt to unblock the traditional bottleneck between IT and business users. The software allows a business user to search for available data using business terms, and then presents the best matches, including showing the source systems. The system can capture the actions the user takes in selecting data as workflow steps, which can later be automated by IT if appropriate. This certainly demonstrated well, and seems to have had some happy early adopters.
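The underlying pattern of capturing ad-hoc user actions as replayable workflow steps is a general one. The sketch below is purely illustrative of that pattern, with hypothetical names throughout; it is not Informatica’s API.

```python
from dataclasses import dataclass, field

@dataclass
class ProvisioningWorkflow:
    """Illustrative sketch: record a business user's ad-hoc data
    selections as steps that IT could later review and automate.
    Hypothetical; not based on Informatica's actual interfaces."""
    steps: list = field(default_factory=list)

    def record(self, action, **params):
        # Each user action is captured as a named step with its parameters
        self.steps.append({"action": action, **params})

    def replay(self):
        # A real system would invoke the integration layer per step;
        # here we simply print the captured steps.
        for step in self.steps:
            print(step)

wf = ProvisioningWorkflow()
wf.record("search", term="customer churn")          # business-term search
wf.record("select_source", system="CRM", table="accounts")
wf.record("filter", column="region", value="EMEA")
wf.replay()
```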

Separately, Informatica announced a data security product, Secure@Source, which will be an early application of the Intelligent Data Platform. A demo included an attractive-looking “heat map” showing data sensitivity, proliferation and levels of risk, based on the data integration mappings between sources and targets already defined in PowerCenter. The obvious issue here is whether Informatica’s sales force understands the specialised security market, and whether customers will see Informatica as a natural brand in an area with which many do not currently associate it, although to be fair it already has offerings in persistent data masking, dynamic data masking and test data management.

There was plenty of discussion around Big Data at the conference, with partners such as Cloudera speaking on the subject as well as staff from the company. Certainly the scale of data now can be vast, with Facebook apparently having 500 petabytes to manage. The company has several initiatives in this area linking to Hadoop. All that data will have to be managed somehow, so companies with core strengths in integration and data quality ought in principle to be able to carve out a place in the Big Data world, which at present still seems formative and immature in general. The Vibe Data Stream product is clearly aimed at this new type of data, such as that generated by sensors.

Financially, Informatica seems back on track after the issues of 2012, and appears to have had a good quarter. One intriguing thing is just how significant MDM now is to Informatica: a whole day of the conference was devoted to MDM, and although the company does not break out software sales by product line, it was clear that MDM is both its fastest growing segment and now a significant chunk of its new license revenues. The $130 million acquisition of Siperian may in retrospect seem money well spent. Version 10 of the MDM product is the next major release, due out in late 2014.

The company is clearly investing heavily at present, with 17% of its spend currently going into R&D. As the company seeks to maintain growth against a backdrop where the core integration market is maturing, the main challenge would seem to me to be whether its sales staff, used to selling integration software to IT folks, can adapt to selling the new products effectively, some of which are aimed at business people.

Informatica goes shopping

October 9, 2012

Informatica has made an offer to buy the German PIM vendor Heiler. The deal has not gone through yet and German securities laws are complex, but it appears to be a “friendly” takeover. There are a few interesting aspects to this. Firstly, it sets a useful valuation benchmark. Heiler did 17.4 million euros in revenue in its last financial year, and the offer is 80.8 million euros, a price-to-sales ratio of 4.6: a healthy though not extreme valuation (Heiler also has 15.8 million euros of cash and is modestly profitable, with profits in the last financial year of 1.4 million euros). It has been around in the MDM market for 12 years, and so is quite a mature product/company, as shown in the split of its revenue: nearly half in services and a fifth in maintenance, with several hundred customers.

The deal makes sense for Heiler, as Informatica has a far more powerful sales channel. From Informatica’s perspective it gains a solid piece of technology with a proven footprint in the product data domain; Informatica, for all its multi-domain marketing, has been primarily used to manage customer data. It also gains a slice of the European MDM market, reducing its heavy US revenue preponderance. Moreover, assuming the deal goes ahead, Informatica gains several hundred new customers to which it can up-sell its other software, e.g. its integration and data quality offerings.

The deal also shows that the M&A market is still active for MDM software, which is positive news for the shareholders of other independent MDM vendors out there.

Low Hanging Fruit

July 8, 2010

EMC has entered the data warehouse appliance market via the purchase of Greenplum, which made good progress in the space with its massively parallel database (based on PostgreSQL). Greenplum had impressive reference customers and some of the highest-end references out there in terms of sheer scale of data. It had also been one of the few vendors (Aster is another) that went early into embracing MapReduce, a framework designed for parallelism and suitable for certain types of complex processing.

Data warehouse appliances can be a tough sell because conservative buyers are nervous of making major purchases from smaller companies, but the EMC brand will remove that concern. Also, EMC has a vast existing customer base and a sales channel that can exploit it. Seems like a sensible move to me.

No Data Utopia

August 11, 2009

The data warehouse appliance market has become very crowded in the last couple of years in the wake of the success of Netezza, which has drawn plenty of venture money into new entrants. The awkwardly named Dataupia had been struggling for some time, with large-scale redundancies early in 2009, but now appears to have pretty much given up the ghost, with its assets being put up for sale by the investors.

If nothing else, this demonstrates that you need a clearly differentiated position in such a crowded market, and clearly in this case the sales and marketing execution could not match the promise of the technology. However, it would be a mistake to think that all is doom and gloom for appliance vendors, as the continuing commercial success of Vertica demonstrates.

To me, something that vendors should focus on is how to simplify migration off an existing relational platform. If you have an under-performing or costly data warehouse, then an appliance (which implies “plug and play”) sounds appealing. However, although appliance vendors support standard SQL, it is another thing to migrate a real-life database application, which may have masses of proprietary application logic locked up in stored procedures, triggers and the like. This would seem to me the thing most likely to hold back buyers, yet many vendors focus entirely on their price/performance characteristics in their messaging. It does not matter if a new appliance has ten times better price/performance (let’s say, saving you half a million dollars a year) if it costs several times that to actually migrate the application. Of course there are always green-field applications, but if someone could devise a way of dramatically easing the migration effort from an existing relational platform then it seems to me that they would have cracked the code on how to sell to end-users in large numbers. Ironically, this was just the kind of claim that Dataupia made, which suggests that there was a gap between its claims and its ability to convince the market that it was really that easy, despite accumulating a number of named customer testimonials on its website.
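To put rough numbers on that argument, a simple break-even calculation (using the hypothetical half-million-dollar saving from the text and an assumed migration cost) shows how quickly migration expense can swallow a price/performance advantage:

```python
# Break-even arithmetic for an appliance migration. Figures are
# illustrative: the annual saving is the hypothetical one from the
# text; the migration cost is an assumed "several times that".
annual_savings = 500_000      # dollars saved per year on the new appliance
migration_cost = 2_000_000    # assumed one-off cost to migrate the application

payback_years = migration_cost / annual_savings
print(f"Payback period: {payback_years:.1f} years")  # 4.0 years
```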

Even having the founder of Netezza (Foster Hinshaw) at the helm did not translate into commercial viability, despite the company attracting plenty of venture capital money. The company had no shortage of marketing collateral; indeed, a number of industry experts who authored glowing white papers on the Dataupia website may be feeling a little sheepish right now. Sales execution appears to have been a tougher nut to crack. I never saw the technology in action, but history tells us that plenty of good technology can fail in the market (proud owners of Betamax video recorders can testify to that).

If anyone knows more about the inside story here then feel free to contact me privately or post a comment.

Running Against the Tide

January 9, 2009

We recently completed the Q4 “Market Landscape” for MDM. As part of this we looked at all the vendors in the market and obtained figures for the software revenues and growth of each. One interesting aspect is that the MDM software market so far appears to be holding up well. Indeed, it is currently growing at an annualised rate of 30% according to our research, a healthy clip. It should be noted that the market size figures we use exclude systems integrator revenues associated with MDM; these are estimated at around three times the size of the software market. As an aside, it is these kinds of assumptions that can lead to seemingly wide discrepancies between market size estimates from different firms; typically you see a figure quoted in the press, but what does it include and exclude? Our figures for MDM software also exclude data quality vendors, which are handled in a separate twice-yearly update.
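A quick worked example shows how much those inclusion assumptions matter. With illustrative figures, two firms could each quote “the MDM market” and differ by a factor of four:

```python
# Illustrative only: how inclusion rules swing a "market size" headline.
software_revenue = 1.0                 # MDM software revenue, $bn (hypothetical)
si_revenue = 3 * software_revenue      # SI revenues ~3x the software market (per text)

print(f"Software-only estimate: ${software_revenue:.1f}bn")              # $1.0bn
print(f"Including SI services:  ${software_revenue + si_revenue:.1f}bn")  # $4.0bn
```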

These figures, which are admittedly retrospective, confirm our November market research study looking at the effect of the financial crisis on MDM spending. This found that about as many companies were planning to accelerate their MDM spend as were planning to slow it down or defer projects (admittedly nearly a third of respondents were undecided).

So far at least, then, both actual Q4 spend as seen by vendors and spending intentions in our survey are telling the same story: MDM software revenues are holding up well. We will continue to track the market closely, with a Q2 2009 Market Update to be published in July.

Recession?

November 19, 2008

Usually when economic times are tough there is a series of things that the bean counters do to rein in costs. First they ban business travel except for customer-facing situations, followed by freezing training budgets and recruitment, before sharpening their knives more seriously. The first two of these mean that conferences are usually at the sharp end of corporate spending cuts. I have just returned from speaking at a BI/MDM conference in Amsterdam, and was pleasantly surprised to see a healthy attendance of paying customers, a large number of whom registered only in the last few days. This must have been a considerable relief to the conference organisers, and it was certainly nice to be speaking to a packed room rather than one with rows of empty seats.

It is certainly hard to be sure just how deep the recession is likely to be, with seemingly contradictory data all around. A scary figure is the cost of shipping (the Baltic Dry Index), which is a reasonable predictor of the flow of trade. This has dropped a little matter of 95% since its peak in June, with shipping companies cancelling orders and talking of mothballing ships. This kind of broader economic data would seem to suggest that a fairly sharp recession is on the cards, and this must feed through into lower IT expenditure eventually, just as night follows day. Of course some areas will be hit harder than others, but I am always suspicious of claims that a certain area is “strategic” and so will be unaffected. Usually this is whistling in the dark by vendors. We shall see.

The economy and Wile E. Coyote

October 27, 2008

Like many of us, I am curious as to what extent the meltdown in the banks will affect the rest of the economy and, in particular, enterprise software spending. Random conversations with vendors over the last few weeks have been varied, with only those heavily exposed to financial services seemingly seeing a real decline in spend (one company had Lehman Brothers on its Q4 sales forecast). Certainly some sectors may barely be affected, e.g. the public sector, or perhaps pharmaceuticals (people still get ill and will need to pay for their pills) and maybe law (imagine all the fun the lawyers will have as banking positions unwind and contracts cannot be fulfilled, never mind the shareholder class action suits). However, there are only so many of these, and I wonder whether we are in the situation you get in cartoons, where Wile E. Coyote or Bugs Bunny runs off a cliff and happily progresses forward until he actually looks down and notices there is no ground any more.

A troubling sign of this is in a sector a long way from the world of credit default swaps: trucks. Volvo is one of the largest suppliers of commercial lorries and trucks to continental Europe, and sold 41,970 trucks in the third quarter of 2007, when admittedly things were booming. They have just announced their Q3 2008 results. How many trucks do you reckon they sold? Fewer than 41,970 for sure, but what sort of reduction might you expect? Maybe a 10% drop in sales, perhaps 20%, or even 30% if things had become really bad? Well, they actually sold 115 trucks in the last three months. That is not a typo. It is a 99.7% reduction in sales.

Now that is a scary number.

The Information Difference will shortly be conducting a survey of enterprise software buyers, looking specifically at their spending plans for master data management and data quality. I’ll keep you updated when we have some results (in a few weeks’ time). If your company would like to sponsor this survey, you have four days left to do so (just contact me and I’ll send you details).