Wednesday, February 24, 2010

3Tera brings powerful, simple way for customers to move apps to clouds -- and help reshape CA?

Today’s announcement of CA’s definitive agreement to acquire 3Tera has a couple of interesting wrinkles. It’s definitely of interest for those of us who have followed 3Tera and the many companies in the cloud computing space for a while. But it also has the possibility of shaking up the admittedly stodgy image of CA – and what CA can deliver for customers thinking about cloud computing.

But first, for those who haven’t been living and breathing this stuff, here’s the quick take:

3Tera is one of the early pioneers in cloud computing. Even better, they are one of the early pioneers with customers, which tells you that they are onto something.

The 3Tera product, AppLogic, has a pretty slick graphical user interface that customers can use to configure and deploy composite applications to either public or private clouds. It takes the many, very manual tasks required to put an application into a cloud environment and lets you take care of them with an elegant series of clicks, drags, and drops. (For a high-speed tour of how this works, they have a 4-minute demo you can watch here.)

Yes, as you’d expect, there’s “some assembly required” up front to make this all possible, but one 3Tera customer noted that their IT folks would not allow them to go back to doing app configuration and deployment the old, manual way now that they were using AppLogic. We considered that a good sign.

And, because 3Tera creates a layer of abstraction for the applications that you’re encapsulating, you have a bunch of deployment options, both now and at any point in the future. That could come in very handy as you think about both public and private cloud possibilities.

Both service providers & enterprises can use 3Tera for cloud enablement

3Tera’s most aggressive customers have been the managed service providers – companies that are scrambling to find compelling, differentiated ways to offer cloud-based services to their customers. Enterprises have also started to see 3Tera’s possibilities for deploying composite apps into a private cloud environment. However, enterprises have been a little more cautious, wanting to make sure they know what they are getting into before making the leap to cloud.

This is probably one of the areas that CA can help improve by backing the 3Tera innovations with significant resources: enterprises need to feel comfortable moving applications to the cloud. A 3Tera/CA combination will give enterprises a significant partner that’s providing a technology to make those steps possible – and much more doable.

And luckily, we’re expecting the whole 3Tera team to come over to CA: there’s still a bit of evangelizing left to do.

Even so, 3Tera customer successes have been noted by an analyst or two. In his report on private clouds last year (“Deliver Cloud Benefits Inside Your Walls,” April 13, 2009), Forrester analyst James Staten called 3Tera’s AppLogic “the leading cloud infrastructure software offering available today” and noted that it was “the foundation of many public clouds.” That’s a great foundation to build from.

One of the reasons 3Tera has probably had good success with service providers is that it uses Xen to help abstract the applications from the infrastructure. Xen is usually good news given the margin pressures that service providers are under. In the enterprise, however, VMware is a much more common and acceptable choice. VMware support was already on 3Tera’s roadmap, but here’s another bit of good news coming with this deal: CA plans to extend AppLogic to use both ESX and Microsoft’s Hyper-V.

Note to self: This is a big shift for CA

The 3Tera deal is certainly a very public acknowledgement by CA that cloud computing is front and center in what’s changing in IT. And the deal drops another piece into place in CA’s rapidly filling-out strategy to address those changes.

Customers are likely going to have a number of important needs as cloud computing becomes more central. Organizations want to know how their systems are performing using business metrics (Oblicore can help with this). They need to decide what components to take to the cloud – and what components not to. 3Tera can help them cloud-enable and deploy the apps they want to move now. And, as customers look to optimize this process and their environment, that’s where the Cassatt expertise can be useful.

Alongside these moves, the existing CA portfolio continues to have a strong, immediate role to play. The 3Tera deal may seem like quite a shift from CA’s traditional assurance, automation, and security businesses, but in fact it is a complementary piece. Customers need to have the ability to manage their environments end-to-end, even if the cloud is involved, and they can (and are) working on that using CA’s existing solutions today. There are lots of opportunities for more linkage going forward as well.

Which leads me back to comments like those in Derrick Harris’s GigaOM article (complete with some prognosticating about CA’s next moves) from a week or so ago: “Let’s be honest, systems management vendor CA doesn’t exactly inspire visions of innovation.”

Hopefully (as Derrick mentions), we’re in the process of changing that. But we’ll leave the verdict on that up to customers.

3Tera: Since the “good old days” of cloud computing…before it was cloud computing

On a more personal note, I’ve been watching 3Tera since my early days at Cassatt, as we and they were all wrestling with how to describe our respective offerings, how to get market traction, and what moves would pay off. Like Cassatt, these guys were cloud before cloud was cool. A lot of the way they describe themselves predates the cloud computing phraseology. And that’s OK. As I noted earlier, they focused on the problem of how to encapsulate apps and make it possible to deploy them in lots of ways and locations, including the cloud.

An interesting connection point: after Cassatt landed at CA, our former CEO Bill Coleman took on a consulting role with 3Tera that I wrote about back in August. Some of his comments then look interesting now, in light of the CA deal with 3Tera.

More details will follow here as I have a chance to dig into some of the interesting aspects of this deal, its implications, and other questions that will come up. Until then, the press release about the acquisition is posted here.

Thursday, February 11, 2010

From private clouds to solar panels: more control and uniqueness, but are they worth it?

Andi Mann of EMA wrote recently that failures are endemic to public clouds. And, by the way, that’s OK. In fact, says Andi, failures are a normal part of what your IT infrastructure needs to be able to deal with these days.

Even if you take it as a given that we’ll hear about cloud service failure after failure in the news from now on (a daunting prospect in and of itself), public clouds surprisingly still set a pretty high bar for internal IT. Andi’s figures put some public cloud uptime numbers at 3 to 3.5 “nines,” as in 99.9 or 99.95% uptime – that's 5-10 minutes of downtime each week.
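If you want to check that arithmetic yourself, the conversion from “nines” to downtime is straightforward (a quick illustrative sketch, not from Andi’s post – the function name is mine):

```python
# Convert an availability percentage into allowed downtime per week.
MINUTES_PER_WEEK = 7 * 24 * 60  # 10,080 minutes in a week

def downtime_minutes_per_week(availability_pct: float) -> float:
    """Minutes of downtime per week implied by a given availability %."""
    return MINUTES_PER_WEEK * (1 - availability_pct / 100)

for pct in (99.9, 99.95):
    print(f"{pct}% uptime -> {downtime_minutes_per_week(pct):.1f} min/week of downtime")
```

Running that confirms the back-of-the-envelope numbers: 99.9% works out to roughly 10 minutes of downtime a week, and 99.95% to roughly 5.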

Now, if you’re hoping to get a lot of the public cloud computing benefits but do so on-premise by creating a private cloud infrastructure, there's a serious amount of work and investment required to match public cloud availability for all of your applications. Andi pins “normal” cloud computing outages at 5-10 times less likely than those in internal data centers.

“CIOs who are planning to build their own private cloud have a surprisingly high bar to reach,” blogs Andi.

Sounds like it may not be worth the effort for private clouds then, eh?

Actually, think again. It just might be, but for other reasons than you might think.

Private clouds hold what’s most unique about your organization
Mike Manos recently had some interesting observations from his time at Microsoft and his recently ended stint at Digital Realty Trust (sounds like he’s heading to greener pastures at Nokia). In response to James Hamilton at Amazon (thanks to Dave O’Hara for pointing out the discussion), Mike postulated that what makes private clouds quite interesting, despite the high bar, is the way they encapsulate the unique components of an organization.

In other words, the most tailored and specific things about your IT environment are the best argument for a private cloud.

That rings true for me. Cloud computing is a way to pay for only what you need, and a way to have compute, storage, and other resources appear and disappear to support your demands, as those demands appear and disappear. There are components of what IT does for your company that are not unique. Those sound perfect to move to external, public cloud computing services at some point. They are commodities, and should be handled as such. Maybe not now, but eventually.

The more specific, complex, special pieces of IT seem logical to be the ones you keep inside as you get started down the cloud path. Those take the kid gloves and your special expertise, at least for now.

The push to get the most from those important, unique pieces of IT is giving enterprises a strong incentive to pursue a cloud-style architecture in-house. To again quote Andi Mann’s EMA research, private clouds are the preferred approach to cloud computing for 75% of the organizations they polled, far ahead of the interest in public clouds.

Are private clouds a temporary fix or a permanent fixture?

With all of that as a background, how permanent are private clouds? Here's a quick detour to help answer that:
Chris Hoff of Cisco collected commentary at his Rational Survivability blog on the topic of private clouds recently by weighing in on the appropriateness of the IT-as-electricity analogy that Nick Carr brought mainstream with The Big Switch. His quick take was that private clouds might be like batteries (though he didn’t go too far explaining his concept, beyond labeling it an “incomplete thought” to get conversation going). However, a couple of his commenters had an analogy I liked better: that of a solar power generator.

So, is a solar power generator a good analogy for a private cloud? You’re generating “IT power” for your own use, using your own resources. Unlike what the “battery” analogy implies, a private cloud implementation is not what you’d call temporary. In fact, as Manos was thinking about private clouds in the blog noted above, one of his comments was that “there’s no such thing as a temporary data center.” Or a temporary private cloud infrastructure, I’d add. Like it or not, most IT projects, even if they are done for an ostensibly short period of time, end up living long beyond their intended sunset.

Private clouds will be no different. Many (like Gartner) see private clouds as a stepping stone or a temporary requirement until the public cloud addresses all of the roadblocks people keep complaining about. But once that infrastructure to add/subtract, build/collapse things in your IT environment is in place, you should be able to get a lot of use from it. And it will live on. This is similar, in fact, to the situation if you had taken the time and effort to get that solar installation up and running.

As things progress, I predict this “either/or” kind of language (as in “public clouds” or “private clouds”) that we’ve been seeing will fall by the wayside. I think Manos is right: “and” will be the way of the future. We’ll aim for use of the public cloud where it makes sense. And we’ll keep using private clouds – leveraging their reflection of organizational uniqueness, coupled with an unintentional permanence because, well, they work. We’ll find a way to take advantage of both.

Paving the way for hybrid clouds

This scenario, by the way, makes hybrid clouds the end state, a situation that Hoff sees as “the way these things always go.” Scott Hammond, CEO of newScale, uses an example that reiterates this: “The data center looks like my dad’s basement.” In other words, IT continues to be a strange mishmash of the new, combined with all that’s come before. That’s reality.

So a conversation that started by questioning whether public cloud computing service outages are endemic or even a problem, which then shifted to how private clouds can hold unique value for organizations, ends up connecting the two.

Of course, hybrid clouds will require an additional level of thinking, management, and control. That’s a topic that will have to get a unique post of its own one of these days.

In the meantime, I’ll leave you to ponder what other cloud computing metaphors we might be able to unearth in Scott’s dad’s basement. It just might be worth it. Especially if we find something to help those solar panels pay off.

Wednesday, February 3, 2010

'Forbes' rebuttal: don’t abolish cloud computing, just swap jargon for what customers really want

I’m guessing that Lee Gomes’ article on Forbes this week was intended to provoke a strong reaction. The title said it all: “Abolish cloud computing!” If so, he got me.

As I read the article, I couldn’t help but feel that he was missing the point. So, I crafted a quick response and dashed it off to him in an e-mail. To Lee’s credit, he was interested in the dialog, and thought his readers would be, too.

So, at Lee’s request, I polished up that e-mail a bit and two days after I saw a Forbes headline saying “Abolish cloud computing,” they ran my rebuttal to “save cloud computing.” Everyone loves a good debate, right?

Definitely read Lee’s original article and my response, but I’ll summarize it for you here since the main points were pretty straightforward. The industry has struggled mightily with the definition of cloud computing over the past 12 months (or longer). And grand pronouncements have inevitably followed. Public clouds are the future. Or not. Private clouds don’t exist. Or are really important for large companies.

This back and forth has led to a lot of confusion and some folks have even thrown in the towel. But I don’t think we should.

Lee’s analysis assumed that cloud computing was essentially the same concept as simply running something “over the Internet.” If that’s the case, then I agree with Lee – cloud computing is not worth all the chatter.

However, I don’t think that’s the case. As I mentioned in my rebuttal, “I think the aspects that make ‘cloud computing’ something IT shops should consider in the first place make it worth having its own term.”

Some of the special characteristics I point to are ones I’ve talked about here on this blog before (and they line up with the definitions from pundits, analysts, and even NIST): elasticity from pooled resources, pay-as-you-go, and automation to enable on-demand self-service.

Big difference from just doing things "over the Internet" in my book.

So, take a look at the articles and comment at Forbes (or right here). I especially like the comment at Forbes emphasizing the need to be very clear about what you’re talking about when parading around new terms like cloud computing.

And thanks to Lee and Forbes for the chance to make my point.

(Update 2/4: David Linthicum weighed in with his thoughts on Lee's article in his Infoworld cloud computing column today, asking "Is 'cloud computing' hurting cloud computing?" Also worth a read. In an industry arguing about definitions, his headline says it all.)