Amazon Redshift Disrupts DW Economics – But Nothing Comes Without Costs

At its first re:Invent conference in late November, Amazon announced Redshift, a new managed service for data warehousing. Amazon also offered details and customer examples that made AWS's steady inroads toward mainstream enterprise acceptance highly visible.

Redshift is offered as MPP clusters of 2TB (XL) or 16TB (8XL) nodes, running ParAccel's high-performance columnar, compressed DBMS and scaling to 100 8XL nodes, or 1.6PB of compressed data. XL nodes have 2 virtual cores and 15GB of memory; 8XL nodes have 16 virtual cores and 120GB of memory, and operate on 10 Gigabit Ethernet.

Reserved pricing (the more likely scenario, involving a commitment of one or three years) is set at "under $1000 per TB per year" for a three-year commitment, combining upfront and hourly charges. Continuous, automated backup for up to 100% of the provisioned storage is free, and Amazon does not charge for data transfer into or out of the data clusters. Network connections, of course, are not free; see Doug Henschen's InformationWeek story for details.
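Amazon's "under $1000 per TB per year" figure blends an upfront reservation fee with hourly charges over the commitment term. A minimal back-of-the-envelope sketch of that blending follows; the upfront and hourly rates used here are illustrative assumptions, not Amazon's published prices:

```python
# Back-of-the-envelope model of Redshift reserved pricing.
# NOTE: the upfront and hourly figures below are illustrative
# assumptions for the sketch, not Amazon's published rates.

HOURS_PER_YEAR = 24 * 365

def effective_cost_per_tb_year(upfront, hourly, node_tb, years):
    """Blend an upfront charge with hourly charges into an
    effective $/TB/year figure, as Amazon's quoted "under
    $1000 per TB per year" number does."""
    total = upfront + hourly * HOURS_PER_YEAR * years
    return total / (node_tb * years)

# Hypothetical 3-year reservation on a single 16TB 8XL node:
cost = effective_cost_per_tb_year(upfront=12000, hourly=1.00,
                                  node_tb=16, years=3)

# Capacity side of the announcement: 100 8XL nodes of 16TB each.
cluster_tb = 100 * 16  # 1600 TB, i.e. 1.6PB of compressed data
```

With these assumed inputs the blended figure works out to under $800 per TB per year; the real point is simply that the headline number can only be judged by rolling the upfront fee and three years of hourly charges into one denominator of provisioned terabytes.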

This is a dramatic pricing thrust, but it is not without tradeoffs.


Cloudera-Informatica Deal Opens Broader Horizons for Both

Cloudera's continuing focus on the implications of explosive data growth has led it to another key partnership, this time with Informatica. Connecting to the dominant player in data integration and data quality expands Cloudera's opportunity dramatically; it enables the de facto commercial Hadoop leader to find new ways to empower the "silent majority" of data. Most data lives outside: not just outside enterprise data warehouses, but outside RDBMS instances entirely. Why? Because it doesn't need all the management features database management software provides; it doesn't get updated regularly, for example. In fact, it may not be used very often at all, though it does need to be persisted for a variety of reasons. I recently mentioned Cloudera's success of late; the company will be challenged by some big players in 2011, notably IBM, whose recent focus on Hadoop has been remarkably nimble. So these deals matter. A lot. The Data Management function is being refactored before our eyes, and both these vendors will play in its future.

Living in the Present is SO Yesterday

It's an occupational hazard of living in the future that analysts can begin to ignore the present, unless we make it a practice to seek it out. Here in the Valley, that can be difficult, when being a week behind the latest version of something the rest of the world hasn't heard of yet equates to being a Luddite. That can lead to AADD (analyst attention deficit disorder).

How the Cloud Will Lead Us to Industrial Computing

From Judith Hurwitz, president, Hurwitz & Associates

I spent the other week at a new conference called Cloud Connect. Being able to spend four days immersed in an industry discussion about cloud computing really allows you to step back and think about where we are with this emerging industry. While it would be possible to write endlessly about all the meetings and conversations I had, you probably wouldn't have enough time to read it all. So I'll spare you and give you the top four things I learned at Cloud Connect. I recommend that you also take a look at Brenda Michelson's blogs from the event for a lot more detail, and I would also refer you to Joe McKendrick's blog from the event.

Judith Hurwitz Comments on Cloud Impact on HW Biz

My longtime friend and colleague Judith Hurwitz and I have decided to cross-post on one another's blogs, and I'm delighted to have her here. For me, this is another step in the continuing evolution of the loosely coupled independent analyst collaborations I find myself participating in more and more, and a very exciting development. Welcome, Judith!


I am thrilled to be contributing my "cloudy" observations to your blog. I have been an analyst and consultant focusing on distributed software, looking at everything from service-oriented architecture and service management to information management. My philosophy is that cloud computing, in all its iterations, is the future of a significant portion of enterprise software. - Judith Hurwitz, President, Hurwitz & Associates

I thought I would provide my thoughts on the future of hardware in the context of where software is headed.

It is easy to assume that the excitement around cloud computing would put a damper on the hardware market. But I have news for you: I am predicting that over the next few years hardware will be front and center. Why would I make such a wild prediction? I have three reasons.

Informatica Passes Half-Billion Mark, Buys Siperian, Targets Cloud

Informatica has announced another long-rumored acquisition, Siperian, continuing a steady march toward a comprehensive portfolio play. In 2009, its strong growth path made it the clear independent leader in data integration. With Release 9, its vision of a data integration platform grew to provide a comprehensive approach to everything from data discovery services to data quality. While growth slowed during a tough year for the economy overall, Informatica grew revenue in every quarter, made key acquisitions in three successive quarters (Applimation, AddressDoctor and Agent Logic), and began to make significant moves into the cloud via partnerships with Amazon and others. Agent Logic added event detection and processing to support real-time alerting and response. As 2010 begins, this latest move is synergistic from the outset; Rob Karel points out in his excellent blog post that "Siperian MDM technology…already is deeply integrated with Informatica's identity resolution and postal address technology. In addition…Siperian MDM customers [are] using Informatica for data integration and data quality, meaning there is a lot of existing experience and know-how on integrating Informatica's portfolio with Siperian."

Xkoto’s Database Virtualization Expands Cloud Opportunities

Xkoto, the database virtualization pioneer, has generated substantial interest since its first deployments in 2006. Still privately held and in investment mode, Xkoto sees profitability on the horizon but offers no target date, and appears in no hurry. Its progress has been steady: in early 2008, a B round of financing led by GrandBanks Capital allowed a step up to 50 employees as the company crossed the 50-customer mark. 2008 also saw Xkoto add support for Microsoft SQL Server to its IBM DB2 base. Charlie Ungashick, VP of marketing for Xkoto, says that 2009 has been going well and that the third quarter was quite strong. At the end of September 2009, Xkoto announced GRIDSCALE version 5.1, which adds new cluster management capabilities to its active-active configuration model, as well as Amazon EC2 availability.

IBM Showcases Software Vision and Hadoop Research

At IBM's 8th annual Connect meeting with analysts, Steve Mills, Senior VP and Group Executive, had much to crow about. Software is the engine driving IBM's profitability, anchoring its customer relationships, and enabling the vaulting ambition to drive the company's Smarter Planet theme into the boardroom. Mills' assets are formidable: 36 labs worldwide have more than 100 software developers each, plus 49 more with over 20 each, for 25,000 developers in all. Mills showcased all this in a matter-of-fact, businesslike fashion with minimal hype and little competitor bashing. A research project aimed at extending Hadoop usage to a broader audience was among the highlights.

Teradata Transition On Course in Steady Quarter, With Exciting New Offerings Ahead

How good was Teradata's Q3? Not bad, but no improvement over a so-far lackluster year, which has nonetheless seen the stock price rise steadily. In 2008, the striking rise in Teradata's Linux revenue growth was matched only by the corresponding drop in its Unix revenue, and that "steady as she goes" performance continues through its still unevenly applied OS transition. In Q3, revenues were down slightly (3%) year over year, and margin was flat (down 0.6%). YTD product revenues are down 11%. Service revenues were up 5% for the quarter but only 2% YTD. Still, net income rose 5%, in part because of strong expense controls. Since early 2008, Teradata has lost a little momentum through a difficult economy compared to its rivals Oracle and IBM. Its next transition, after independence from NCR and the OS shift, is a product portfolio change catalyzed by the growth of appliance competitors like Netezza. So far, Teradata has managed to drive the product changes into the market well, claiming that 65% of its appliance sales are new names. The hot new all-SSD Extreme Performance Appliance is now coming on-stream, and will create a new category advantage if, as Teradata believes, there are customers willing to pay for its spectacular performance.

Informatica, Strong Through Tough Times, Looks Ahead

Not everyone in the software industry is suffering. Informatica's Q2 revenues were $117.3 million, up 3% year over year, and license revenues for the second quarter were $48.7 million, relatively flat. That makes 19 quarters in a row, a very impressive run. Informatica added 65 customers in the quarter and now claims nearly 3,800, with wins in multiple geographies.

