IBM’s Smart Analytics System: More Than An Appliance?

When is an appliance not an appliance? When it’s more. On July 28, IBM’s Software Group and Systems and Technology Group (i.e., the hardware folks) hosted an analyst event to introduce the Smart Analytics System. The discussion began with a series of conversations about the value of “workload optimization,” or the effective tuning of processors, storage, memory and network components with the software used for information management. Not controversial, but hardly news. IBM claims to be raising the bar, though, with the promise of a system that is already tuned, and attuned to the needs of its purchaser, at a level far beyond the appliances other vendors have delivered: appliances, if you will, not only predesigned for specific use cases, but customized for specific instances of those use cases. It’s no accident that IBM never called the Smart Analytics System an “appliance.” Extending the Smart brand here is a powerful move, and IBM appears poised to make good on its promise.

Curiously, the speakers provided no specifics about the offering(s) until the final 10 minutes of the scheduled time, when a direct question from Doug Henschen during Q&A at last elicited some detail from Arvind Krishna, VP Enterprise Information Products. (Krishna was on the podium because Ambuj Goyal, the GM of Information Management Software, who was supposed to be there, was conducting a session on IBM’s acquisition of SPSS. M&A is tricky – you go with it when you go with it – and the timing was an unfortunate bit of luck, as coverage of the acquisition has overshadowed the event we were there for; spending $1.2B tends to get people’s attention.)

Krishna said that IBM will release the Analytic Server(s) for immediate availability in September. Although the specifications are not in the press materials, the presentations given out, or the first level or two of content on IBM’s web site, the details appear to be as follows:

  • Hardware – apparently a Power System 550 running AIX 6.1, with an unspecified amount of memory and IBM DS5300 storage arrays (capacity likewise unspecified). Steve Mills, head of IBM SW Group, offered that the Analytic Server, available in 6 sizes, is “non-decomposable,” which we take to mean that in-place upgrades of memory or storage are not possible unless the customer moves to a different size (model); it’s not clear whether it’s possible to add to the largest size.
  • Software – presumably DB2, InfoSphere Warehouse, and Cognos, with database compression (likely the compression already shipping in DB2 9.7), analysis, dashboards and reporting. Cubing Services and OLAP were mentioned, as were Data Mining and Unstructured Information Analytics, some of which are options in the underlying product families; it’s not clear which are options here. ETL was not mentioned, so it’s not clear whether it is part of the package. Also discussed was Advanced Workload Management, mentioned in the context of consolidating warehouses and marts, though it’s not clear whether there is a facility for rearchitecting a new, larger warehouse from existing ones or for federating those in place through some sort of metadata layer. Application modules for verticals will also be available, once again leveraging the IP-as-an-asset model IBM SWG has been sharpening with its partners in the Services and Research organizations.
  • Services – these were unspecified, except to say that they will create an “analytics ready” delivery and take the labor out of integration, optimization, and data restructuring for performance. Beyond appliance-like installation and pre-integration, this presumably involves customized work for each buyer to assess and optimize their data; neither the time required nor the number of people involved was specified. It’s not clear whether this will be completely variable and priced accordingly or whether some fixed number of person-hours comes with each size.
  • Network clustering from an unnamed partner was also mentioned; at this time, we have no details regarding this component.

Support for everything is included, as well as periodic “health checks” to ensure that software and hardware optimization remain up to date with changes in data composition, usage and distribution over time. More than an appliance? Unquestionably.

“This is not just bundling,” Steve Mills said. “It’s top to bottom understanding and optimization.”

Fair enough; IBM has drawn a powerful line in the silicon. But as an announcement, the story falls short – model numbers, capacities, specific software components, and go-to-market details are conspicuously absent. Most important, prices and how they will vary with size were not discussed, and although capacity addition was touted, it’s not clear how it will work.

With Netezza now promising less than $20K per terabyte of user data, and other new ADBMS vendors aggressively entering the market, IBM must be careful to make its value proposition more explicit. The claims of a 3x speed increase and a 50% reduction in storage are commonplace in the ADBMS marketplace today. Mills’s point that “Labor cost and time are the limiting factors” opens a new dimension in the discussion. The promise of cutting those elements post-install by at least half is a powerful value proposition. The tough part for IBM will be proving it, and convincing prospects that the total cost is thus in line with the market – or better.

IBM also announced the Smart Analytics Optimizer, an add-on solution that will attach to a System z to boost analytic performance. It will exploit in-memory techniques (in response to a question, SSD was acknowledged as at least part of this), vector processing, and parallel evaluation of query predicates. Again, details were difficult to come by, but mainframe users would be well served by such an offering.

What does this mean for users? A faster time to value, assurance that their system will perform near the peak of its theoretical capability, and ongoing management to ensure that the system does not drift away from excellence as inevitable scope creep, data creep, and inertia take their toll. IBM had some impressive user stories to tell, but of course those stories were about the pieces that will go into the new offering. The promised synergies were not yet referenceable, and will not be until some buyers take the Smart Analytics Systems for a spin and find out just how smart they are.

